Not sure what your exact problem is, but here is a general solution:
You can use the BULK INSERT SQL statement to insert large volumes of records from a text file. To call the BULK INSERT statement from ADO.NET, you can use a SqlCommand object to execute the statement.
This method really isn't too fast for inserting a large number of records.
The following comes from the article you cite:
"It should be noted that these statements are not performed as a batch
process; each row is updated individually."
Another possible way of doing batch updates is to batch them yourself. In
other words, you can parse the file to create INSERT statements, concatenate
many (hundreds or possibly thousands) of these statements together by
placing a semicolon between each statement, and then execute the
concatenated statement. This is a really fast way of inserting records in a
batch, since it greatly reduces the number of round trips to the DB. If an
error occurs, though, it's almost impossible to isolate which record caused
the problem.
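A minimal sketch of that approach; the table and column names (MyTable, Col1, Col2) are hypothetical, and note that values embedded this way must be escaped by hand:

```csharp
using System.Text;
using System.Data.SqlClient;

// Concatenate many INSERT statements and send them in one round trip.
static void BatchInsertByConcatenation(string connectionString, string[][] rows)
{
    var sql = new StringBuilder();
    foreach (string[] row in rows)
    {
        // Doubling single quotes is the minimum escaping needed here.
        sql.Append("INSERT INTO MyTable (Col1, Col2) VALUES ('")
           .Append(row[0].Replace("'", "''")).Append("', '")
           .Append(row[1].Replace("'", "''")).Append("');");
    }

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql.ToString(), conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery(); // one trip for the whole batch
    }
}
```

As noted above, if one statement in the batch fails you get a single error back, with no easy way to tell which row caused it.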
Good luck.
Scott
> Also you can use .Update method of dataAdapter object to load datatable to
> a db table.
>
> http://msdn.microsoft.com/library/de...pdatetopic.asp
Probably won't help you today, but in ADO.Net 2.0 (Whidbey), they are
supposed to be adding bulk operations. (so you would just be able to use
the data adapter, but set a batch size property). I saw a sample once of
the actual SQL that was generated. It was very similar to what you
describe, Lenn. I think that they separated each insert with a GO
statement.
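For reference, that feature did ship as the UpdateBatchSize property on SqlDataAdapter. A minimal sketch, assuming a hypothetical MyTable; note that batched commands must have UpdatedRowSource set to None:

```csharp
using System.Data;
using System.Data.SqlClient;

// Let the adapter group its generated commands instead of sending one per row.
static void BatchedUpdate(string connectionString, DataTable table)
{
    using (var adapter = new SqlDataAdapter(
        "SELECT Col1, Col2 FROM MyTable", connectionString))
    {
        var builder = new SqlCommandBuilder(adapter);
        adapter.InsertCommand = builder.GetInsertCommand();
        adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.None;
        adapter.UpdateBatchSize = 100; // up to 100 statements per round trip
        adapter.Update(table);
    }
}
```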
"Scott" <me@me.com> wrote in message
news:udGdogrXEH A.3292@TK2MSFTN GP09.phx.gbl...[color=blue]
> Lenn,
>
> This method really isn't too fast for inserting a large number of records.
> The following comes from the article you cite:
>
> "It should be noted that these statements are not performed as a batch
> process; each row is updated individually."
>
> Another possible way of doing batch updates is to batch them yourself.[/color]
IOW,[color=blue]
> you can create parse the file to create insert statements. You can
> concatenate many (100's or possibly 1000's) of these statements together[/color]
by[color=blue]
> placing a semicolon in between each statement and then execute the
> concatenated statement. This is a really fast way of inserting records in[/color]
a[color=blue]
> batch since it can greatly reduce the number of trips to the DB. If an[/color]
error[color=blue]
> occurs though, it's almost impossible to isolate which record caused the
> problem.
>
> Good luck.
>
> Scott
>[color=green]
> > Also you can use .Update method of dataAdapter object to load datatable[/color][/color]
to[color=blue]
> a db table.[color=green]
> >
> >[/color]
>[/color] http://msdn.microsoft.com/library/de...pdatetopic.asp[color=blue]
>
>[/color]
Quick correction: It was Scott who described the SQL. Sorry.
"J.Marsch" <jeremy@ctcdeve loper.com> wrote in message
news:ePpb63rXEH A.3564@TK2MSFTN GP11.phx.gbl...[color=blue]
> Probably won't help you today, but in ADO.Net 2.0 (Whidbey), they are
> supposed to be adding bulk operations. (so you would just be able to use
> the data adapter, but set a batch size property). I saw a sample once of
> the actual SQL that was generated. It was very similar to what you
> describe, Lenn. I think that they separated each insert with a GO
> statement.
>
>
> "Scott" <me@me.com> wrote in message
> news:udGdogrXEH A.3292@TK2MSFTN GP09.phx.gbl...[color=green]
> > Lenn,
> >
> > This method really isn't too fast for inserting a large number of[/color][/color]
records.[color=blue][color=green]
> > The following comes from the article you cite:
> >
> > "It should be noted that these statements are not performed as a batch
> > process; each row is updated individually."
> >
> > Another possible way of doing batch updates is to batch them yourself.[/color]
> IOW,[color=green]
> > you can create parse the file to create insert statements. You can
> > concatenate many (100's or possibly 1000's) of these statements together[/color]
> by[color=green]
> > placing a semicolon in between each statement and then execute the
> > concatenated statement. This is a really fast way of inserting records[/color][/color]
in[color=blue]
> a[color=green]
> > batch since it can greatly reduce the number of trips to the DB. If an[/color]
> error[color=green]
> > occurs though, it's almost impossible to isolate which record caused the
> > problem.
> >
> > Good luck.
> >
> > Scott
> >[color=darkred]
> > > Also you can use .Update method of dataAdapter object to load[/color][/color][/color]
datatable[color=blue]
> to[color=green]
> > a db table.[color=darkred]
> > >
> > >[/color]
> >[/color]
>[/color] http://msdn.microsoft.com/library/de...pdatetopic.asp[color=blue][color=green]
> >
> >[/color]
>
>[/color]
I thought about that but from what I know the BULK INSERT needs to have the
file on the SQL server or being able to access it on a network share. This
cannot happen in my environment.
"SoKool" <SoKool@discuss ions.microsoft. com> wrote in message
news:05200A9A-C4AC-4BE3-8D6C-BDD97F0D6A0B@mi crosoft.com...[color=blue]
> Not sure what your exact problem is, however here is a general solution:
>
> You can use the BULK INSERT sql statement to insert large volumes of[/color]
records from a text file. To call the BULK INSERT statement from ADO .NET,
you can use a SqlCommand object to execute a DDL statment.[color=blue]
>
> Example:
> ----------
> private void Test()
> {
> SqlConnection conn;
> SqlCommand command;
>
> try
> {
> conn = new SqlConnection(" Data Source=MYSERVER ;Initial[/color]
Catalog=Northwi nd;Integrated_S ecurity=SSPI");[color=blue]
>
> command = new SqlCommand();
>
> conn.Open();
>
> command.Connect ion = conn;
> command.Command Text = "BULK INSERT Northwind.dbo.[Order Details]" +
> @"FROM 'f:\orders\line item.tbl'" +
> "WITH" +
> "(" +
> "FIELDTERMINATO R = '|'," +
> "ROWTERMINA TOR = ':\n'," +
> "FIRE_TRIGG ERS" +
> ")";
>
> command.Execute NonQuery();
> }
> catch (Exception e)
> {
> MessageBox.Show (e.Message);
> }
>
> }
>
> "Daniel P." wrote:
>[color=green]
> > How can I use ADO.NET to insert lots of records in a very fast way?
> >
> > Thanks!
> >
> >
> >[/color][/color]
I decided to use an INSERT statement prepared with @parameter names inside.
Then I set the params and call ExecuteNonQuery for each record. It is not as
fast as the previous implementation in C++ and DbLib, but it works fine for
the time being.
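Roughly like this; the table and column names (MyTable, Col1, Col2) are hypothetical:

```csharp
using System.Data;
using System.Data.SqlClient;

// Prepare one parameterized INSERT and reuse it for every record.
static void InsertRows(string connectionString, string[][] rows)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO MyTable (Col1, Col2) VALUES (@p1, @p2)", conn))
    {
        cmd.Parameters.Add("@p1", SqlDbType.VarChar, 50);
        cmd.Parameters.Add("@p2", SqlDbType.VarChar, 50);

        conn.Open();
        cmd.Prepare(); // plan is compiled once and reused

        foreach (string[] row in rows)
        {
            cmd.Parameters["@p1"].Value = row[0];
            cmd.Parameters["@p2"].Value = row[1];
            cmd.ExecuteNonQuery(); // still one round trip per record
        }
    }
}
```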
Thanks!
"Scott" <me@me.com> wrote in message
news:udGdogrXEH A.3292@TK2MSFTN GP09.phx.gbl...[color=blue]
> Lenn,
>
> This method really isn't too fast for inserting a large number of records.
> The following comes from the article you cite:
>
> "It should be noted that these statements are not performed as a batch
> process; each row is updated individually."
>
> Another possible way of doing batch updates is to batch them yourself.[/color]
IOW,[color=blue]
> you can create parse the file to create insert statements. You can
> concatenate many (100's or possibly 1000's) of these statements together[/color]
by[color=blue]
> placing a semicolon in between each statement and then execute the
> concatenated statement. This is a really fast way of inserting records in[/color]
a[color=blue]
> batch since it can greatly reduce the number of trips to the DB. If an[/color]
error[color=blue]
> occurs though, it's almost impossible to isolate which record caused the
> problem.
>
> Good luck.
>
> Scott
>[color=green]
> > Also you can use .Update method of dataAdapter object to load datatable[/color][/color]
to[color=blue]
> a db table.[color=green]
> >
> >[/color]
>[/color] http://msdn.microsoft.com/library/de...pdatetopic.asp[color=blue]
>
>[/color]
Might be too late to be of help... and I'm not sure if you're using SQL
Server... but:
In the past, I have inserted many records into one-to-many tables with a
single call by creating a sproc capable of processing an XML input
parameter (text or ntext data type).
I found this technique useful, too, when having to insert to or update a
table with many fields of data. Instead of having to deal with all of the
fields, I could use a loop in the C# form to gather up all the data
field names and values, build an XML document and feed it to a sproc.
If done right, you don't have to update the sproc each time you add or
change fields in the table.
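A sketch of the calling side; the sproc name, table, and columns are all hypothetical, and the server-side shredding (shown in the comment) could be done with OPENXML:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Text;

// Build one XML document of rows and hand it to a stored procedure in a
// single call. A hypothetical sproc might shred it like this:
//
//   CREATE PROC InsertOrders @doc ntext AS
//   DECLARE @h int
//   EXEC sp_xml_preparedocument @h OUTPUT, @doc
//   INSERT INTO Orders (CustomerId, Amount)
//     SELECT CustomerId, Amount
//     FROM OPENXML(@h, '/rows/row', 1)
//          WITH (CustomerId int, Amount money)
//   EXEC sp_xml_removedocument @h
static void InsertViaXml(string connectionString, int[] customerIds, decimal[] amounts)
{
    var xml = new StringBuilder("<rows>");
    for (int i = 0; i < customerIds.Length; i++)
        xml.AppendFormat("<row CustomerId=\"{0}\" Amount=\"{1}\"/>",
                         customerIds[i], amounts[i]);
    xml.Append("</rows>");

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("InsertOrders", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@doc", SqlDbType.NText).Value = xml.ToString();
        conn.Open();
        cmd.ExecuteNonQuery(); // one call, many rows
    }
}
```

The nice property, as described above, is that adding a column only changes the XML and the sproc, not the number of parameters on the call.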
*** Sent via Devdex http://www.devdex.com ***
Don't just participate in USENET...get rewarded for it!
Scott

> In the past, I have inserted many records into one-to-many tables with a
> single call by creating a sproc capable of processing an XML input
> parameter (text or ntext data type).
"Kenneth Courville" <krcourville@-nospam-msn.com> wrote in message
news:ePuNXDvXEH A.3044@TK2MSFTN GP09.phx.gbl...[color=blue]
> Might be too late to be of help... and I'm not sure if you're using SQL
> Server...but..
>
> In the past, I have inserted many records into one to many tables with a
> single call by creating a sproc capable of processing a XML input
> parameter (text or ntext data type).
>
> I found this technique useful, too, when having to insert to or update a
> table with many fields of data. Instead of having deal with all of the
> fields, I could use a loop in the C# form to gather up all the data
> field names and values, build an XML document and feed it to a sproc.
>
> If done right, you don't have to update the sproc each time you add or
> change fields in the table.
>
>
>
> *** Sent via Devdex http://www.devdex.com ***
> Don't just participate in USENET...get rewarded for it![/color]