Discussion:
What approach to importing a large text file?
i***@yahoo.com
2010-01-06 22:15:46 UTC
Here are the details:

- Source is a 5 GB text file. It is in a "ragged right" format with
NO character delimiters.
- Destination is 20 tables.

What is the most efficient approach to importing this large text
file? Should I attempt a bulk insert task? Will that work
considering that I'm going from one source file to 20 destination
tables? If I use the bulk insert task, how do I construct the format
file?
tbradshaw via SQLMonster.com
2010-01-07 17:36:38 UTC
Imani,

You have a few choices here, but one thing is certain: you cannot import
directly into 20 destination tables. You will either
(1) import it all into a single load table, then use SQL tasks to disperse it
out to the 20 destination tables (see the sketch after this list), or
(2) clean your file ahead of import and perhaps divide it into 20 files
suitable for import.
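
If you go with option (1), a rough T-SQL sketch would look like the
following. All of the table, column, and file names are invented for
illustration, and the byte positions are placeholders until you know
your real layout:

CREATE TABLE dbo.LoadStage (RawLine varchar(8000) NOT NULL);

-- Pull the whole file into the one-column staging table.
-- This assumes the data contains no tab characters (the default
-- field terminator), so each line lands in RawLine intact.
BULK INSERT dbo.LoadStage
FROM 'D:\Import\bigfile.txt'
WITH (ROWTERMINATOR = '\r\n', TABLOCK, BATCHSIZE = 100000);

-- Disperse to each destination by slicing the fixed-width line.
-- Repeat a statement like this once per destination table.
INSERT INTO dbo.Customers (CustomerId, CustomerName)
SELECT SUBSTRING(RawLine, 3, 10),
       SUBSTRING(RawLine, 13, 40)
FROM dbo.LoadStage
WHERE LEFT(RawLine, 2) = '01';  -- hypothetical record-type code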

I personally use option (2) the most (a C#-based cleaning application). This
gives me a chance to scrub, validate, and divide the data prior to import. It
also cuts down on the amount of bad data. Why import bad data when you're
going to throw it away anyway?

Tip: make sure you load the files from a local drive, not over a network.

Finally, in my experience, bulk insert is considerably faster than a Data
Pump, especially when there aren't any transformations involved.
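
On your format-file question: for a fixed-width, ragged-right layout you
would use a non-XML format file in which every field has prefix length 0, a
fixed length, and an empty terminator, and only the last field is terminated
by the line break (its length then acts as a maximum). A sketch with made-up
widths and column names:

9.0
3
1   SQLCHAR   0   2    ""       1   RecordType     SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   10   ""       2   CustomerId     SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   40   "\r\n"   3   CustomerName   SQL_Latin1_General_CP1_CI_AS

You would then point BULK INSERT at it, targeting a staging table whose
three columns match the format file:

BULK INSERT dbo.CustomerStage
FROM 'D:\Import\bigfile.txt'
WITH (FORMATFILE = 'D:\Import\bigfile.fmt', TABLOCK);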

Let us know how you make out.

Best Regards,
Tom

Thomas Bradshaw
Data Integration Services
MyWebGrocer Inc
--
Message posted via SQLMonster.com
http://www.sqlmonster.com/Uwe/Forums.aspx/sql-server-dts/201001/1
i***@yahoo.com
2010-01-07 22:02:37 UTC
Is there an interim "holding place" similar to a load table within
SSIS where I can place the data, without using a SQL Server database
table?
Todd C
2010-01-08 14:00:01 UTC
You may want to go with a Conditional Split transformation in a Data Flow to
direct rows to the proper destination.

Can you give us a sample of your raw data? How would one determine that a
row belongs to one particular table?
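
For example, if each row carried a record-type code in its first two
characters (purely a guess until we see your data), the Conditional Split
conditions could be SSIS expressions along these lines:

SUBSTRING(RawLine, 1, 2) == "01"   (rows for the Customers table)
SUBSTRING(RawLine, 1, 2) == "02"   (rows for the Orders table)
SUBSTRING(RawLine, 1, 2) == "03"   (rows for the Items table)

Here RawLine is a single wide column coming from the Flat File Source. This
also speaks to your earlier question: the split happens inside the SSIS
pipeline, so no intermediate database load table is needed.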
--
Todd C
MCTS SQL Server 2005