
I need to generate either backups or data dumps for about 30 tables — a file (or files) containing the data for these tables that I can transfer to a different system. Each table has 3.8-4.5 million rows. A large number of these rows also have text columns containing newline characters, which need to be handled correctly.

I'm using Sql Server 2012 and have SQL Server Management Studio.

Due to security and other particular reasons I cannot simply use SQL Server Management Studio's Import/Export Wizard to export tables from one database to another. I am required to produce a dump of the data to transfer to a remote system.

I have tried SQL Server Management Studio's Generate Scripts capability, which does appear to handle the newline characters; however, there is another issue. In some fields the text ends with a backslash, similar to "something blah \". That backslash causes a problem in the generated script: the Generate Scripts process wraps the text in single quotes, and the resulting \' is then treated as an escaped single quote, which corrupts the remaining data.

I have also tried generating flat files via SQL Server Management Studio's Import/Export Wizard, but this does not appear to handle the newline characters correctly, again corrupting the remaining data.
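To illustrate the failure mode with a tiny hypothetical example (not my actual data): once a text field contains an embedded newline, physical lines in a flat file no longer correspond to rows, so any line-oriented loader miscounts records:

```shell
# Hypothetical export: 2 data rows, one with a note field containing a newline.
# The file ends up with 4 physical lines but only 3 logical records
# (header + 2 rows), which is exactly what breaks line-based parsing.
printf 'id,note\n1,"line one\nline two"\n2,plain\n' > demo.csv
wc -l < demo.csv   # reports 4 lines, though there are only 3 records
```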

Any help would be greatly appreciated, and I can clarify any missing information.

shadonar
  • @John from what I understood, bcp uses newlines as delimiters for new rows, and wouldn't correctly handle the newlines found within the text of the content in my tables. If I'm wrong about this please let me know. – shadonar Apr 03 '15 at 21:12
  • bcp can handle the requirements stated above. Books Online and the other links on the accepted answer (for the suggested duplicate) contain more information. Once you understand the bcp options, test them. If you find a specific issue as a result of that testing, either edit this question with more details (ideally with a sample table structure and data file), or ask a new question, as appropriate. – Paul White Apr 04 '15 at 01:37
  • @shadonar as Paul mentioned, you can use BCP OUT on the source server and BULK INSERT to insert the data on the destination server. This handy script will help you – check option 2. The script uses BCP's -n native format, so it will be compact and secure (not human-readable) as well. – Kin Shah Apr 04 '15 at 01:58
  • I looked closer at BCP, and though the options aren't exactly easy to understand, I was able to generate exactly what I needed. I was also able to use BCP IN to bring the data into a test table correctly. Thank you all so much for the help. – shadonar Apr 08 '15 at 16:47
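For anyone landing here later, the bcp approach suggested in the comments can be sketched roughly as below. Server, database, and table names are placeholders (assumptions, not a real environment), and the commands need a live SQL Server to run, so treat this as a command template rather than a tested recipe:

```shell
# Export one table in bcp's native format (-n). Native format stores
# field lengths in binary prefixes instead of using text delimiters,
# so embedded newlines and trailing backslashes survive intact.
# MYSERVER, REMOTESERVER, MyDb, and dbo.MyTable are placeholder names.
bcp MyDb.dbo.MyTable out MyTable.dat -n -S MYSERVER -T

# Import on the destination server using the same native format:
bcp MyDb.dbo.MyTable in MyTable.dat -n -S REMOTESERVER -T
```

The -T flag uses a trusted (Windows) connection; swap in -U/-P for SQL authentication if needed.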

1 Answer


Just take a full backup (with compression, if possible), restore it on the other server, and then you can SELECT INTO or INSERT ... SELECT without having to worry about what happens to carriage returns, line feeds, and other characters that cause problems for files. You want to move (parts of) a database, not files.
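A sketch of the commands, with placeholder server names and paths (adjust for your environment; a real restore may also need WITH MOVE for the file locations):

```shell
# Placeholder names throughout. Back up with compression on the source:
sqlcmd -S SOURCESERVER -Q "BACKUP DATABASE MyDb TO DISK = N'C:\backups\MyDb.bak' WITH COMPRESSION"

# Restore on the destination, then copy only the tables you need:
sqlcmd -S DESTSERVER -Q "RESTORE DATABASE MyDb FROM DISK = N'C:\backups\MyDb.bak'"
```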

Trying to just extract some 120 million rows from a database sounds like more trouble than it's worth. You're making this harder than it should be, for what? To save a few bytes in data transfer?

Aaron Bertrand
  • Amen, for the simple, non-complicated approach that is guaranteed to not lose anything in the process. – Hannah Vernon Apr 04 '15 at 02:56
  • sadly, the people receiving the data are technically only allowed to view the specific tables, and no other content, and performing a full backup and basically trusting them to only do the specified tables is not an option. Otherwise I believe this is exactly what I would have done. – shadonar Apr 08 '15 at 16:44