How to write UTF-8 characters using bulk insert in SQL Server?

I came here looking for a solution for bulk inserting special characters and didn't like the workaround with UTF-16 (that would double the size of the csv file). I found out that you definitely CAN, and it's very easy: you don't need a format file. This answer is for other people who are looking for the … Read more
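The usual form of that answer is a plain BULK INSERT with the code page set to UTF-8, which SQL Server 2016 and later accept. A minimal sketch that builds such a statement — the table and file names are placeholders, not from the original:

```python
# Sketch: build a BULK INSERT statement that reads a UTF-8 CSV directly.
# CODEPAGE = '65001' (UTF-8) is accepted by SQL Server 2016 and later;
# the table and file names below are hypothetical placeholders.
def utf8_bulk_insert(table, csv_path):
    return (
        f"BULK INSERT {table} "
        f"FROM '{csv_path}' "
        "WITH (CODEPAGE = '65001', FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n');"
    )

stmt = utf8_bulk_insert("dbo.MyTable", "C:\\data\\input.csv")
```

The point is that no UTF-16 re-encoding (and no size doubling) is needed; the file stays UTF-8 on disk.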

SqlBulkCopy from a List

With FastMember, you can do this without ever needing to go via DataTable (which, in my tests, more than doubles the performance): using(var bcp = new SqlBulkCopy(connection)) using(var reader = ObjectReader.Create(data, "Id", "Name", "Description")) { bcp.DestinationTableName = "SomeTable"; bcp.WriteToServer(reader); } Note that ObjectReader can also work with non-generic sources, and it is not necessary to specify the … Read more

Bulk Insert to Oracle using .NET

I'm loading 50,000 records in 15 or so seconds using Array Binding in ODP.NET. It works by repeatedly invoking a stored procedure you specify (and in which you can do updates/inserts/deletes), but it passes the multiple parameter values from .NET to the database in bulk. Instead of specifying a single value for each parameter to … Read more
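The core idea of array binding is that you send one array per bind parameter instead of one value per row. A minimal sketch of that reshaping in Python — the parameter names are illustrative; in ODP.NET each array would become an OracleParameter.Value, with ArrayBindCount set to the row count:

```python
# Sketch: array binding reshapes row-wise records into one array per
# bind parameter. In ODP.NET you would assign each array below to an
# OracleParameter.Value and set cmd.ArrayBindCount = len(rows);
# the parameter names here are hypothetical.
rows = [
    (1, "alice", 9.5),
    (2, "bob",   7.2),
    (3, "carol", 8.8),
]

def to_bind_arrays(rows):
    """Transpose row tuples into column arrays, one per parameter."""
    ids, names, scores = (list(col) for col in zip(*rows))
    return {"p_id": ids, "p_name": names, "p_score": scores}

arrays = to_bind_arrays(rows)
```

The stored procedure still sees one row per invocation; only the round trips are batched, which is where the speed-up comes from.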

Bulk Insert Partially Quoted CSV File in SQL Server

Unfortunately SQL Server interprets the quoted comma as a delimiter. This applies to both BCP and BULK INSERT. From http://msdn.microsoft.com/en-us/library/ms191485%28v=sql.100%29.aspx : If a terminator character occurs within the data, it is interpreted as a terminator, not as data, and the data after that character is interpreted as belonging to the next field or record. Therefore, … Read more
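A common workaround is to pre-process the file with a CSV-aware parser and re-write it using a delimiter that never occurs in the data, so BULK INSERT never sees a quoted comma. A sketch with Python's csv module, assuming '|' is safe as the new delimiter:

```python
import csv
import io

# Sketch: strip CSV quoting before the file reaches BULK INSERT by
# re-writing it with a delimiter assumed never to occur in the data
# (here '|'). The csv module parses the quoted commas correctly.
raw = 'id,name,notes\n1,"Smith, John",ok\n2,Jones,fine\n'

reader = csv.reader(io.StringIO(raw))
out = io.StringIO()
writer = csv.writer(out, delimiter="|", quoting=csv.QUOTE_NONE)
for row in reader:
    writer.writerow(row)

converted = out.getvalue()
```

The converted file can then be loaded with FIELDTERMINATOR = '|'. (Newer SQL Server versions also support FORMAT = 'CSV' in BULK INSERT, which handles quoting natively.)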

mongodb: insert if not exists

Sounds like you want to do an "upsert". MongoDB has built-in support for this. Pass an extra parameter to your update() call: {upsert: true}. For example: key = {'key': 'value'} data = {'key2': 'value2', 'key3': 'value3'} coll.update(key, data, upsert=True) # in Python, upsert must be passed as a keyword argument. This replaces your if-find-else-update block entirely. It will insert if … Read more
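The semantics can be illustrated without a database: update the record if the key matches, insert it otherwise. A sketch against an in-memory dict — note that in current PyMongo the legacy update() is gone, and the equivalent call is coll.update_one(filter, {"$set": data}, upsert=True):

```python
# Sketch: upsert semantics modeled on a plain dict keyed by the filter
# value. With current PyMongo the equivalent call would be
#   coll.update_one({"key": "value"}, {"$set": data}, upsert=True)
def upsert(store, key, data):
    """Update the record if key exists, otherwise insert it."""
    record = store.setdefault(key, {})
    record.update(data)
    return store

db = {}
upsert(db, "value", {"key2": "value2"})   # no match: inserts a new record
upsert(db, "value", {"key3": "value3"})   # match: updates in place
```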

How can I insert 10 million records in the shortest time possible?

Please do not create a DataTable to load via BulkCopy. That is an ok solution for smaller sets of data, but there is absolutely no reason to load all 10 million rows into memory before calling the database. Your best bet (outside of BCP / BULK INSERT / OPENROWSET(BULK…)) is to stream the contents from … Read more
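The streaming idea translates directly to Python: a generator yields one row at a time, so the full set never sits in memory — the same reason the answer prefers handing SqlBulkCopy an IDataReader over a pre-built DataTable:

```python
# Sketch: stream rows instead of materializing them. A generator
# produces one row at a time, so 10 million rows never exist in
# memory at once -- analogous to feeding SqlBulkCopy.WriteToServer
# an IDataReader rather than a fully loaded DataTable.
def row_stream(n):
    for i in range(n):
        yield (i, f"name-{i}")

stream = row_stream(10_000_000)
first = next(stream)   # only one row has been produced so far
```

Peak memory stays constant regardless of row count; the consumer pulls rows as fast as it can write them.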

How can I Insert many rows into a MySQL table and return the new IDs?

Old thread, but I just looked into this, so here goes: if you are using InnoDB on a recent version of MySQL, you can get the list of IDs using LAST_INSERT_ID() and ROW_COUNT(). InnoDB guarantees sequential numbers for AUTO_INCREMENT when doing bulk inserts, provided innodb_autoinc_lock_mode is set to 0 (traditional) or 1 (consecutive). Consequently you … Read more
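Since LAST_INSERT_ID() returns the first AUTO_INCREMENT value of the batch and ROW_COUNT() the number of rows inserted, the full list is just a consecutive range. A sketch of that arithmetic (the helper name is mine; the two inputs come from SELECT LAST_INSERT_ID(), ROW_COUNT() run right after the insert):

```python
# Sketch: with innodb_autoinc_lock_mode 0 or 1, a bulk insert receives
# a consecutive block of AUTO_INCREMENT values. LAST_INSERT_ID() gives
# the FIRST id of the batch and ROW_COUNT() the number of rows, so the
# inserted ids are simply first .. first + count - 1.
def inserted_ids(last_insert_id, row_count):
    return list(range(last_insert_id, last_insert_id + row_count))

ids = inserted_ids(42, 3)   # e.g. first id 42, three rows inserted
```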