#Mongodb #mlab bulk import #JSON

So, I had a few MBs of records in JSON that I wanted to insert into a MongoDB database hosted on mLab, a database-as-a-service provider. I'm going to share how I did it and how I solved a parsing error along the way.

1)
On mLab, I found these instructions, which led me to the following list of commands. If you are an mLab user, you can follow the same steps to get the commands for your own deployment.

Binary

Import database

mongorestore -h ds131687.mlab.com:31687 -d parking -u <user> -p <password> <input db directory>

Export database

mongodump -h ds131687.mlab.com:31687 -d parking -u <user> -p <password> -o <output directory>

Import collection

mongorestore -h ds131687.mlab.com:31687 -d <dbname> -u <user> -p <password> <input .bson file>

Export collection

mongodump -h ds131687.mlab.com:31687 -d <dbname> -c <collection> -u <user> -p <password> -o <output directory>

JSON

Import collection

mongoimport -h ds131687.mlab.com:31687 -d <dbname> -c <collection> -u <user> -p <password> --file <input file>

Export collection

mongoexport -h ds131687.mlab.com:31687 -d <dbname> -c <collection> -u <user> -p <password> -o <output file>

CSV

Import collection

mongoimport -h ds131687.mlab.com:31687 -d <dbname> -c <collection> -u <user> -p <password> --file <input .csv file> --type csv --headerline

Export collection

mongoexport -h ds131687.mlab.com:31687 -d <dbname> -c <collection> -u <user> -p <password> -o <output .csv file> --csv -f <comma-separated list of field names>

2)
There is also this import section on mLab that provides the following examples:
% mongoimport -h ds012345.mlab.com:56789 -d dbname -c collectionname -u dbuser -p dbpassword --file filename.json
% mongoimport -h ds012345.mlab.com:56789 -d dbname -c collectionname -u dbuser -p dbpassword --file filename.csv --type csv --headerline 

3)
However, I got this error when I tried the import with the above command:
mongoimport -h ds131687.mlab.com:31687 -d <dbname> -c <collection> -u <user> -p <pwd> --file data.json
2017-12-31T18:03:13.346-0800 connected to: ds131687.mlab.com:31687
2017-12-31T18:03:13.435-0800 Failed: error processing document #2: invalid character ',' looking for beginning of value
2017-12-31T18:03:13.435-0800 imported 1 document

Thanks to this blog, I realized that the reason for this error was that my file was a JSON array, so the objects in it were comma-delimited. To import JSON objects in the following format, we need the --jsonArray flag.
[
   {"id": 1, "data": "mongo"},
   {"id": 2, "data": "mongo"},
   {"id": 3, "data": "mongo"}
]

The following array of JSON objects has a comma after the last object; that trailing comma will cause an error too.
[
   {"id": 1, "data": "mongo"},
   {"id": 2, "data": "mongo"},
   {"id": 3, "data": "mongo"},
]

The import would have worked without any extra flags if the file had one JSON object per line, with no delimiters between them:
{"id": 1, "data": "mongo"}
{"id": 2, "data": "mongo"}
{"id": 3, "data": "mongo"}
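
If you want to convert an array file into this line-delimited format, one way to do it (this is just a sketch, assuming you have jq installed; data.json and data_lines.json are placeholder filenames, not from the mLab docs) is to unwrap the array with jq:
# Emit each array element as one compact JSON object per line
jq -c '.[]' data.json > data_lines.json
The resulting data_lines.json can then be imported with plain mongoimport, without --jsonArray.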

Instead of reformatting the file, we can keep the array as-is and add the --jsonArray flag to the command.
mongoimport -h ds131687.mlab.com:31687 -d <dbname> -c <collection> -u <user> -p <password> --file <input file> --jsonArray

4)
We can also use the --jsonArray flag with mongoexport to get the JSON data back as a single array instead of one document per line.
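
For example, reusing the placeholders from the commands above (the host and port belong to my own deployment), the export would look something like this:
mongoexport -h ds131687.mlab.com:31687 -d <dbname> -c <collection> -u <user> -p <password> --jsonArray -o <output file>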

Thank you for reading!

Jun