Mass Insert Into SQL Server
I am attempting to insert a mass of records into SQL Server 2005 from VB.Net. Although the insertion is working fine, I am doing my best to make it as fast as possible. Currently, it takes ~11 minutes for 100,000 records. What would be the suggested approach for inserting a large number of records into SQL Server from an application?
My current approach is basically opening the connection, iterating through my list of information and firing off individual SQL insert statements, and then closing the connection. Does anyone have a better suggestion on how to do this?
Current Function:
Public Sub BatchInsert(ByVal ParamCollections As List(Of SqlParameter()))
    Dim Conn As SqlConnection = New SqlConnection(DBHelper.DatabaseConnection)
    Using scope As TransactionScope = New TransactionScope()
        Using Conn
            Dim cmd As SqlCommand = New SqlCommand("sproc_name", Conn)
            Conn.Open()
            cmd.CommandType = CommandType.StoredProcedure

            ' One round trip to the server per record.
            For i = 0 To ParamCollections.Count - 1
                cmd.Parameters.Clear()
                cmd.Parameters.AddRange(ParamCollections(i))
                cmd.ExecuteNonQuery()
            Next

            Conn.Close()
            scope.Complete()
        End Using
    End Using
End Sub
Use the SqlBulkCopy class; it will run through those 100K rows much faster than individual inserts.
Oh, and if you can, I would urge you to implement an IDataReader-capable class to feed the SqlBulkCopy.WriteToServer(IDataReader) method; this will allow you to produce data sequentially, one row at a time. If you are importing from a text file, for example, building some IEnumerable<T> methods that use yield return and converting the result to an IDataReader object will allow you to feed data to the server very naturally.
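To make the yield-return half of that idea concrete, here is a minimal VB.Net sketch (Iterator/Yield require VB 11 or later; the file path, comma delimiter, and two-column layout are all assumptions). Wrapping the resulting sequence in a small IDataReader adapter for WriteToServer is left out for brevity.

Imports System.Collections.Generic
Imports System.IO

Module StreamingRows
    ' Streams one row at a time, so the whole file never has to be
    ' held in memory before the bulk copy starts.
    Public Iterator Function ReadRows(ByVal path As String) As IEnumerable(Of Object())
        For Each textLine As String In File.ReadLines(path)
            ' Assumed format: comma-separated "Id,Name".
            Dim parts() As String = textLine.Split(","c)
            Yield New Object() {Integer.Parse(parts(0)), parts(1)}
        Next
    End Function
End Module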
To counter the loss of rollback ability with BCP, you can transfer the data into a temporary table first and then execute normal INSERT INTO statements on the server afterwards, bulk-transferring the data from the temporary table into the production table. This allows you to use a transaction for the last transfer step, and it will still run a lot faster than your original individual insert statements.
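A hedged sketch of that staging-table pattern (the table and column names "StagingTable", "ProductionTable", "Id", and "Name" are assumptions; substitute your own schema):

Imports System.Data
Imports System.Data.SqlClient

Module StagingLoad
    Public Sub BulkLoadViaStaging(ByVal connString As String, ByVal dt As DataTable)
        Using conn As New SqlConnection(connString)
            conn.Open()

            ' Step 1: bulk-load into the staging table (fast, no per-row round trips).
            Using bulk As New SqlBulkCopy(conn)
                bulk.DestinationTableName = "StagingTable"
                bulk.WriteToServer(dt)
            End Using

            ' Step 2: move the rows to production inside a transaction, so
            ' the final transfer succeeds or rolls back as a single unit.
            Using tran As SqlTransaction = conn.BeginTransaction()
                Using cmd As New SqlCommand(
                        "INSERT INTO ProductionTable (Id, Name) " &
                        "SELECT Id, Name FROM StagingTable", conn, tran)
                    cmd.ExecuteNonQuery()
                End Using
                tran.Commit()
            End Using
        End Using
    End Sub
End Module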
EDIT: Here's an example (in C#, but it should be easy to convert to VB.Net) of how to use the bulk load API.
Thanks to everyone's help, I was able to complete my task. SqlBulkCopy fit my needs perfectly (although there were some other excellent suggestions). Using SqlBulkCopy, the time went from 11 minutes to 45 seconds. I can't believe the difference!
For future reference, here are a few bits of information:
Basic Implementation code:
Public Sub PerformBulkCopy(ByVal dt As DataTable)
    Using Conn As SqlConnection = New SqlConnection(DBHelper.DatabaseConnection)
        Conn.Open()
        Using s As SqlBulkCopy = New SqlBulkCopy(Conn)
            s.DestinationTableName = "TableName"
            s.WriteToServer(dt)
            s.Close()
        End Using
        Conn.Close()
    End Using
End Sub
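In case it helps anyone else, SqlBulkCopy also exposes a few tuning knobs (BatchSize, BulkCopyTimeout, and ColumnMappings) that are worth knowing about. A sketch, where "SourceCol" and "DestCol" are hypothetical names:

Imports System.Data
Imports System.Data.SqlClient

Module BulkCopyTuning
    Public Sub PerformTunedBulkCopy(ByVal connString As String, ByVal dt As DataTable)
        Using conn As New SqlConnection(connString)
            conn.Open()
            Using s As New SqlBulkCopy(conn)
                s.DestinationTableName = "TableName"
                s.BatchSize = 5000            ' send rows to the server in batches of 5,000
                s.BulkCopyTimeout = 120       ' seconds before the copy operation times out
                ' Map columns explicitly when the DataTable's column names
                ' differ from the destination table's.
                s.ColumnMappings.Add("SourceCol", "DestCol")
                s.WriteToServer(dt)
            End Using
        End Using
    End Sub
End Module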
A very informative link that I found: Using Sql Bulk Copy
Thanks to all for the help! I sincerely appreciate it.
Put the data you want to import into a CSV file and run the bcp utility on it. You can't get any faster with sequential calls inserting single rows at a time; you certainly need a bulk utility if you want performance.
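A typical bcp invocation looks something like this (the database, table, file, and server names are placeholders; -c means character-mode data, -t sets the field terminator, and -T uses Windows authentication):

bcp MyDatabase.dbo.MyTable in data.csv -c -t, -S myserver -T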
The SqlBulkCopy class will let you transmit all the data in a collection so the server can process everything at once, eliminating the back and forth. So if you want to avoid creating temporary files (which I would), look to that class.
Just having the connection remain open is a good start, but you still have the overhead of sending a row, having SQL Server store it, and getting a result back before you can move on to the next row.