Posted by: Colin Smith
SQL Admin, SQL Administration, MS SQL, SQL Server
I recently had a need to BCP a very large data set in and out of a database. Getting the data out is no issue.
bcp [databasename].[schema].[table] out path_to_output_file.txt -Sservername -c -T
is the format that I was using and it worked fantastically. I was able to BCP out all 934 million of my rows.
Now I had to push it back into a different table. I built the table and started my BCP.
bcp [databasename].[schema].[table] in path_to_output_file.txt -Sservername -c -T
Well, that got me about 800 million records before I blew up my log file. Ouch… So I had to change the BCP command and tell it to commit after so many rows had been inserted. This is an important step so that you do not blow up your transaction log. It is as simple as this.
bcp [databasename].[schema].[table] in path_to_output_file.txt -Sservername -c -T -b2000000
That will push the data in, and every 2 million rows inserted it will commit, which keeps your log from filling up.
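Putting it all together, here is a sketch of the full round trip. The database, table, server, and file names are placeholders I made up for illustration, and the commands are assembled into variables and printed rather than executed, so substitute your own values before running them for real.

```shell
#!/bin/sh
# Hypothetical placeholder values -- substitute your own database,
# table, server name, and file path.
DB="MyDB"
TABLE="dbo.BigTable"
FILE="bigtable.txt"
SERVER="MYSERVER"
BATCH=2000000

# Export: -c = character (text) mode, -T = trusted (Windows) authentication.
EXPORT_CMD="bcp ${DB}.${TABLE} out ${FILE} -S${SERVER} -c -T"

# Import: -b commits after every $BATCH rows instead of running one huge
# transaction, which is what keeps the log from blowing up.
IMPORT_CMD="bcp ${DB}.${TABLE} in ${FILE} -S${SERVER} -c -T -b${BATCH}"

# Print the assembled commands (dry run).
echo "$EXPORT_CMD"
echo "$IMPORT_CMD"
```

Because the commands are only echoed here, you can eyeball the switches before pointing them at a real server.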
Have fun using BCP from the command line.