Moving data from SQL Server to MongoDB/CouchDB/Couchbase

I must have rewritten the first sentence of this post 20 times. The reason is that I realize the audience for it must be incredibly small. You have to work somewhere that uses SQL Server. You also have to work somewhere that uses MongoDB or a Couch product (CouchDB, Couchbase).

Beyond that, you have to want to move data from SQL Server into a document database that stores its data as JSON. But who knows, maybe someone will find this approach useful. =)

The goal is to move data from SQL Server (relational) to MongoDB (document). I’m assuming that since you are running SQL Server, you have some .NET languages around the office.

To start, make a new Visual Studio solution and create a Visual C# SQL CLR Database Project. You can find it under the Database grouping of project types.

The basic idea is that you write code, deploy it to SQL Server, and then run it as a stored procedure that produces JSON.

Here is an example:

using System.Collections.Generic;
using System.Data;
using Microsoft.SqlServer.Server;
using POC;
using POC.PageElements;


public partial class StoredProcedures
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static void StoredProcedure1()
    {
        // Build the top-level article list, then hydrate the full object
        // graph with one query per relationship.
        Builder builder = new Builder();
        var articles = builder.Build<Article>("exec usp_get_articles");

        foreach (var article in articles)
        {
            // Note: concatenating ids into SQL like this is only sensible for
            // a trusted, internal export job; parameters would be safer.
            article.Contributors = builder.Build<Contributor>("exec usp_get_article_contributors " + article.ArticleId);
            foreach (var contributor in article.Contributors)
            {
                contributor.SocialLinks = builder.Build<SocialLink>("select * from contributor_social_link where contributor_id = " + contributor.Id);
            }

            article.Categories = builder.Build<Category>("exec usp_get_article_categories " + article.ArticleId);

            article.Pages = builder.Build<Page>("exec usp_get_article_pages " + article.ArticleId);
            foreach (var page in article.Pages)
            {
                var paragraphs = builder.Build<Paragraph>("exec usp_get_paragraphs " + page.Id);
                foreach (var paragraph in paragraphs)
                {
                    var images = builder.Build<Image>("exec usp_get_paragraph_image " + paragraph.Id);
                    paragraph.Image = (images.Count > 0) ? images[0] : null;
                    page.PageElements.Add(paragraph);
                }


                var tables = builder.Build<Table>("exec usp_get_tables " + page.Id);
                foreach (var table in tables)
                {
                    page.PageElements.Add(table);
                }


                var imageGroups = builder.Build<ImageGroup>("exec usp_get_image_groups " + page.Id);
                foreach (var imageGroup in imageGroups)
                {
                    var images = builder.Build<Image>("exec usp_get_image_group_images " + imageGroup.Id);
                    imageGroup.Images = (images.Count > 0) ? images : new List<Image>();
                    page.PageElements.Add(imageGroup);
                }


                var inlineImages = builder.Build<Image>("exec usp_get_inline_images " + page.Id);
                foreach (var inlineImage in inlineImages)
                {
                    page.PageElements.Add(inlineImage);
                }


                var videos = builder.Build<Video>("exec usp_get_videos " + page.Id);
                foreach (var video in videos)
                {
                    page.PageElements.Add(video);
                }


                var blockQuotes = builder.Build<BlockQuote>("exec usp_get_block_quotes " + page.Id);
                foreach (var blockQuote in blockQuotes)
                {
                    page.PageElements.Add(blockQuote);
                }
            }
        }



        // Stream each article back to the caller as a row of (id, json).
        SqlPipe sp = SqlContext.Pipe;

        var dataRecord = new SqlDataRecord(
            new SqlMetaData("id", SqlDbType.Int),
            new SqlMetaData("json", SqlDbType.Text));
        sp.SendResultsStart(dataRecord);

        foreach (var article in articles)
        {
            dataRecord.SetInt32(0, article.ArticleId);
            dataRecord.SetString(1, article.ToString());
            sp.SendResultsRow(dataRecord);
        }

        sp.SendResultsEnd();
    }
};

Some of the code above calls functions that I haven’t included, but you can imagine what they do: each one simply runs a query and populates a list of objects. We keep building up a very rich Article object by making one database call after another.
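
As a rough guess at what Builder might look like, here is a minimal sketch. This is an assumption about its shape, not the real class: it presumes each POC class has writable properties whose names match the query’s column names, and it uses the in-process "context connection", which is what lets CLR code query the host database without a network hop:

using System.Collections.Generic;
using System.Data.SqlClient;

public class Builder
{
    // Hypothetical sketch: run a query and map each row onto a new T
    // by matching column names to property names via reflection.
    public List<T> Build<T>(string commandText) where T : new()
    {
        var results = new List<T>();

        // "context connection=true" reuses the calling session's connection,
        // so the queries never leave the SQL Server process.
        using (var connection = new SqlConnection("context connection=true"))
        {
            connection.Open();
            using (var command = new SqlCommand(commandText, connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    var item = new T();
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        var property = typeof(T).GetProperty(reader.GetName(i));
                        if (property != null && !reader.IsDBNull(i))
                            property.SetValue(item, reader.GetValue(i), null);
                    }
                    results.Add(item);
                }
            }
        }

        return results;
    }
}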

Once we’re done, we have a list of articles fully populated from the database. Remember, the code above is actually deployed as executable code inside SQL Server. In Management Studio you would navigate to your database and then Programmability -> Assemblies. If deployment went well, you’ll see your assembly there.

At the end of the code above we call article.ToString(). Here are the contents of that method:

        public override string ToString()
        {
            return JsonConvert.SerializeObject(this);
        }

As you can see, we’re simply serializing the article to JSON using Json.NET, the Newtonsoft JSON library for C#.

I will say that deploying code to SQL Server (especially third-party libraries) is not easy. It’s possible, but it will take some Googling and investigation.
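
To give you a flavor, the T-SQL side of deployment looks roughly like this. The paths, assembly names and procedure name below are made up for illustration; the key points are that a third-party DLL like Json.NET has to be registered as its own assembly, and that serialization libraries usually need a relaxed permission set:

EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;

-- Third-party DLLs get registered as their own assemblies. UNSAFE (or at
-- least EXTERNAL_ACCESS) is often required, and the database may need
-- TRUSTWORTHY ON for that to be allowed.
CREATE ASSEMBLY [Newtonsoft.Json]
FROM 'C:\deploy\Newtonsoft.Json.dll'
WITH PERMISSION_SET = UNSAFE;

CREATE ASSEMBLY [ArticleExport]
FROM 'C:\deploy\ArticleExport.dll'
WITH PERMISSION_SET = UNSAFE;

-- Expose the CLR method as a normal stored procedure
CREATE PROCEDURE dbo.usp_export_articles_as_json
AS EXTERNAL NAME [ArticleExport].[StoredProcedures].[StoredProcedure1];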

The benefit of being able to produce JSON from SQL Server is massive, however. You can quickly change the C# objects that get serialized to JSON, which helps development velocity. In my use case each article runs roughly 13 queries, which means the total for 1,000 articles would be 13,000 queries!
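
None of the POC classes are shown here, but as an illustration, Article might be shaped something like this (the property names are guesses based on the calls above, plus the ToString override shown earlier). Renaming a property or adding a Json.NET attribute here immediately changes the JSON that comes out the other side:

using System.Collections.Generic;
using Newtonsoft.Json;

// Hypothetical sketch of the root object that gets serialized.
public class Article
{
    // Map the SQL Server id onto Mongo's _id field in the JSON
    [JsonProperty("_id")]
    public int ArticleId { get; set; }

    public string Title { get; set; }

    public List<Contributor> Contributors { get; set; }
    public List<Category> Categories { get; set; }
    public List<Page> Pages { get; set; }
}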

Even so, in my testing I was able to produce around 100,000 articles in less than 10 minutes. The secret sauce is that the code runs on the database machine itself, so we never pay the price of network latency between all those queries.
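
On the receiving end, the client that moves those rows into MongoDB can be tiny. Here is a hedged sketch using the official MongoDB .NET driver; the connection strings are placeholders, and the procedure name is the hypothetical one from the deployment script above:

using System.Data.SqlClient;
using MongoDB.Bson;
using MongoDB.Driver;

class Loader
{
    static void Main()
    {
        var mongo = new MongoClient("mongodb://localhost:27017");
        var articles = mongo.GetDatabase("cms")
                            .GetCollection<BsonDocument>("articles");

        using (var connection = new SqlConnection("Server=.;Database=cms;Integrated Security=true"))
        using (var command = new SqlCommand("exec dbo.usp_export_articles_as_json", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Column 1 holds the JSON blob produced by the CLR procedure
                    var document = BsonDocument.Parse(reader.GetString(1));
                    articles.InsertOne(document);
                }
            }
        }
    }
}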

Hopefully this approach is helpful for sending richly structured data from SQL Server to MongoDB!


Cross Database Joins with Groovy/Gremlin

It’s funny how things connect to each other. The other day I was watching a video about a graph database, which led me to the Titan project, which led me to this blog post: http://thinkaurelius.com/2013/02/04/polyglot-persistence-and-query-with-gremlin/

Our database environment at work is definitely polyglot. It’s a mixture of SQL Server, Postgres, CouchDB, MongoDB, MySQL, Couchbase and a few others. Most of the time these databases do not need to talk to each other directly; each one is usually a silo and fairly self-contained.

However, there are a few strong relationships between databases. MySQL talks to Mongo, and SQL Server talks to Mongo. They have a lot of data in common.

The punchline of the post is that Gremlin (https://github.com/tinkerpop/gremlin/wiki) ships with a Groovy-based REPL that makes it easy to talk to heterogeneous databases.

Obviously we could write pure Java to do this, but Groovy with Gremlin makes it a joy, not a chore.

For example, here is some code to connect to SQL Server (assuming you have the SQL Server JDBC jar on your classpath):

import com.microsoft.sqlserver.jdbc.SQLServerDriver
import groovy.sql.Sql

def getArticles = {
        def sql = Sql.newInstance("jdbc:sqlserver://127.0.0.1;databaseName=cms", "login", "password", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

        def map = [:]

        // Build a map of article_id -> title
        sql.eachRow("select article_id, title from article") {
                map[it.article_id] = it.title
        }

        return map
}

Very simple. Here is some code in the same script to connect to Mongo and get some data:

import com.gmongo.GMongo

def getMongoDatabase = {
        def mongo = new GMongo("127.0.0.1")
        return mongo.getDB("articles")
}

def getExistingMongoArticleIds = { db ->
        // Ask Mongo for just the _id field of every article
        def docs = db.articles.find([:], [_id : 1]).toArray()
        def article_ids = []

        docs.each { item ->
                article_ids.add((int)item._id)
        }

        return article_ids
}

The beauty of this is that as our company moves toward more and more JSON (we use it in our services, our code and our databases), Groovy gives us a concise, friendly syntax for working with all of it.

Granted, all the data has to come back to the client to do cross-system joins, but with some creativity the performance impact can be minimized.
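
To make that concrete, here is roughly how the two sides stitch together using the functions above, finding the articles that exist in SQL Server but haven’t made it into Mongo yet:

def sqlArticles = getArticles()                     // article_id -> title
def mongoIds = getExistingMongoArticleIds(getMongoDatabase())

// Client-side anti-join: keep the SQL Server articles Mongo doesn't have
def missing = sqlArticles.findAll { id, title -> !mongoIds.contains(id) }

missing.each { id, title ->
        println "Not yet in Mongo: ${id} - ${title}"
}

Everything comes back to the client, but for data sets like ours a few lines of Groovy make the join almost free to write.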
