Could someone please point me in the right direction on how to read in
a large query with .NET.
I am trying to emulate a legacy database system, so I don't know the
upper bound of the SQL query. An example query would be something
like:
Select * from invoices where year = 1995
The query must be updatable and only return, say, 10 to 100 rows at a
time.
It should also be forward-only and discard rows no longer in use to
save memory.
And if at all possible I would like to lock one row at a time as the
row is read in.|||(troy@.makaro.com) writes:
Quote:
Originally Posted by
Could someone please point me in the right direction on how to read in
a large query with .net.
>
I am trying to emulate a legacy database system so I don't know the
upper bounds of the sql query. An example query would be something
like:
>
Select * from invoices where year = 1995
>
the query must be updatable and only return say 10 to 100 rows at a
time.
It should also be forward only and discard rows no longer in use to
save memory.
It sounds like you should use ExecuteReader and loop through the rows.
That is, do not use DataAdapter.Fill. Furthermore, to make it possible
to update the rows as you have read them in, you need to enable MARS,
Multiple Active Result Sets, which I believe you do in the connection
string.
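The pattern Erland describes (a forward-only reader pulling a small batch at a time, with updates issued through a second command while the reader is still open) can be sketched like this. A live SQL Server and ADO.NET aren't available here, so this is a Python/SQLite stand-in, and the invoices table and its columns are made up for illustration; in ADO.NET the rough equivalent would be a SqlDataReader plus a second SqlCommand on a connection with MultipleActiveResultSets=True.

```python
import sqlite3

# Stand-in data; in the real system this would be the SQL Server table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, year INT, total REAL)")
conn.executemany("INSERT INTO invoices (year, total) VALUES (?, ?)",
                 [(1995, 10.0), (1995, 20.0), (1996, 5.0)])

read_cur = conn.cursor()    # plays the role of the forward-only reader
write_cur = conn.cursor()   # second active statement on the same connection,
                            # roughly what MARS permits in ADO.NET
read_cur.execute("SELECT id, total FROM invoices WHERE year = 1995")
while True:
    batch = read_cur.fetchmany(10)   # pull 10 rows at a time, never the whole set
    if not batch:
        break
    for inv_id, total in batch:
        # update each row as it streams past; already-processed rows
        # can be discarded to keep memory flat
        write_cur.execute("UPDATE invoices SET total = ? WHERE id = ?",
                          (total * 2, inv_id))
conn.commit()
```

Only one batch is ever held in memory, which is the property the original poster is after.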
However, if your plan is to read one row at a time and update back,
I wonder where you get the information to update from. It's much more
efficient to perform the update in the database on all rows in one
go.
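The set-based alternative Erland recommends, a single UPDATE that touches every matching row in one statement, sketched the same way (Python/SQLite stand-in, hypothetical table):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, year INT, total REAL)")
conn.executemany("INSERT INTO invoices (year, total) VALUES (?, ?)",
                 [(1995, 10.0), (1995, 20.0), (1996, 5.0)])

# One round trip, no client-side loop; the database updates all matching rows.
cur = conn.execute("UPDATE invoices SET total = total * 2 WHERE year = 1995")
conn.commit()
print(cur.rowcount)  # number of rows the single statement touched: 2 here
```

One statement instead of one round trip per row is usually the difference between seconds and minutes on large tables.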
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||Sounds like you need to take a look at this URL:
http://msdn2.microsoft.com/en-us/li...594(VS.80).aspx though
I doubt a newb will understand it much. This is just in case there are
other "MVPs" like myself who are in need of a good reference for a
task like large query input.
HTH,
Carl Tegeder
Master MS-SQL Administrator
MS-SQL MVP
On Feb 26, 12:36 pm, t...@.makaro.com wrote:
Quote:
Originally Posted by
Could someone please point me in the right direction on how to read in
a large query with .net.
>
I am trying to emulate a legacy database system so I don't know the
upper bounds of the sql query. An example query would be something
like:
>
Select * from invoices where year = 1995
>
the query must be updatable and only return say 10 to 100 rows at a
time.
It should also be forward only and discard rows no longer in use to
save memory.
>
And if at all possible I would like to lock one row at a time as the
row is read in.
Great information on MARS and the ExecuteReader. What I am trying to
do is to emulate a legacy product's data access method. The reason I
am doing this is there is just way too much code to convert into
proper SQL. I'm talking at least 1 million lines of code. I have
already written a conversion program to convert the code to VB.NET and
now I must write a DLL assembly to emulate the legacy data access.
Here is an example of what I have to emulate:
get #3, key #1 GE "20060101"
while invoiceDate < "20070101"
! The current row is now locked!
! make changes
update #3 ! updates the currently locked row and now unlocked.
get #3 ! read the next row in
next
The converted code looks something like:
' note: 3 = the registered table
SQL.getGreaterEqual(3, "20060101") ' notice no upper bounds
while invoiceDate < "20070101"
' The current row is now locked!
' make changes
SQL.update(3) ' updates the currently locked row and now unlocked.
SQL.getNext(3) ' read the next row in
next
My question now is:
How do I lock one row at a time?
One thought I had was to make all connections go through my own
service. That service could keep track of locks. The problem with that
is that future products will want to use sql properly which would
bypass the locking.|||(troy@.makaro.com) writes:
Quote:
Originally Posted by
Great information on MARS and the ExecuteReader. What I am trying to
do is to emulate a legacy product's data access method. The reason I
am doing this is there is just way too much code to convert into
proper sql. I'm talking at least 1 million lines of code. I have
already written a conversion program to convert the code to VB.NET and
now I must write a DLL assembly to emulate the legacy data access.
I can't escape asking: what's the point? You get the legacy product
converted to .Net, but it will still have the architecture of the
old product, and the risk is that you get a compromise with the worst
of both.
Quote:
Originally Posted by
Here is one an example of what I have to emulate:
>
get #3, key #1 GE "20060101"
while invoiceDate < "20070101"
! The current row is now locked!
! make changes
update #3 ! updates the currently locked row and now unlocked.
get #3 ! read the next row in
next
>
The converted code looks something like:
>
' note: 3 = the registered table
SQL.getGreaterEqual(3, "20060101") ' notice no upper bounds
while invoiceDate < "20070101"
' The current row is now locked!
' make changes
SQL.update(3) ' updates the currently locked row and now unlocked.
SQL.getNext(3) ' read the next row in
next
>
My question now is:
How do I lock one row at a time?
I take it that the other product was using another data store than
SQL Server?
There are a couple of ways to do this, but it is important to understand
that locking a row is not something you do actively in SQL Server;
that is left to the lock manager.
And it's even less possible in ADO .Net, since ADO .Net uses client-
side cursors only. That is, data is read from SQL Server and buffered.
Something like ExecuteReader may not read all million rows at once,
but it will not fetch one row at a time either.
One way is to wrap the entire reader in a transaction with the isolation
level REPEATABLE READ. But then rows will remain locked until you
commit.
However, the only reasonable approach is optimistic locking. That is,
don't lock, but check for concurrent updates when you update. This
can be done in two ways:
1) Add a timestamp column: a timestamp column is automatically updated
when the row is updated. If you include the timestamp column in the
WHERE clause, and you see that @@rowcount is 0, then you know that the
row was changed since you last read it.
2) Without a timestamp column, just add all columns to the WHERE clause.
I believe that the UPDATE commands that come with the CommandBuilder
include this.
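A minimal sketch of the version-in-the-WHERE-clause idea, using a plain integer ver column in SQLite in place of SQL Server's automatically maintained timestamp/rowversion column (table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, total REAL, ver INT)")
conn.execute("INSERT INTO invoices VALUES (1, 10.0, 1)")

def update_invoice(conn, inv_id, new_total, ver_seen):
    cur = conn.execute(
        "UPDATE invoices SET total = ?, ver = ver + 1 "
        "WHERE id = ? AND ver = ?",        # WHERE includes the version we read
        (new_total, inv_id, ver_seen))
    # rowcount 0 plays the role of @@rowcount = 0: someone changed the row
    # between our read and our write
    return cur.rowcount == 1

inv_id, total, ver = conn.execute("SELECT id, total, ver FROM invoices").fetchone()
assert update_invoice(conn, inv_id, 99.0, ver)      # first writer succeeds
assert not update_invoice(conn, inv_id, 50.0, ver)  # stale version is rejected
```

No locks are held between read and write; the conflict is simply detected at update time, which is exactly what "optimistic" means here.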
I can think of a third way: first read all keys into local array. Then
iterate over the array, and read one row at a time as 1) Start transaction
with REPEATABLE READ, 2) read row 3) update and 4) commit. But this
will be slow as I don't know what.
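That third approach, collect the keys first and then run one short lock-read-update-commit transaction per row, might look like the following sketch; SQLite's BEGIN IMMEDIATE stands in for the REPEATABLE READ transaction, since it holds a write lock for the duration of each short transaction, and the table is again hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # manual transactions
conn.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO invoices VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

# First read all keys into a local array.
keys = [r[0] for r in conn.execute("SELECT id FROM invoices ORDER BY id")]

for inv_id in keys:
    conn.execute("BEGIN IMMEDIATE")                   # 1) start transaction (locks)
    (total,) = conn.execute("SELECT total FROM invoices WHERE id = ?",
                            (inv_id,)).fetchone()     # 2) read the row
    conn.execute("UPDATE invoices SET total = ? WHERE id = ?",
                 (total * 2, inv_id))                 # 3) update it
    conn.execute("COMMIT")                            # 4) commit, row unlocked
```

One full transaction per row is why Erland warns this will be slow; it trades throughput for the lock-per-row behaviour the legacy system expects.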
All in all, I think you are fighting an uphill battle.
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se|||Thanks for the response. See inline comments:
Quote:
Originally Posted by
>
I take it that the other product was using another data store than
SQL Server?
Yes, unfortunately :(
Quote:
Originally Posted by
However, the only reasonable approach is optimistic locking.
Ya, this is the best method for sure; it's just that I need to emulate
the old system as accurately as possible. The legacy software does not
expect concurrency errors on the updates.
Quote:
Originally Posted by
I can think of a third way: first read all keys into local array. Then
iterate over the array, and read one row at a time as 1) Start transaction
with REPEATABLE READ, 2) read row 3) update and 4) commit. But this
will be slow as I don't know what.
Fortunately the resultsets that require a lock on each row will
probably be fairly small. Larger resultsets are typically reports,
which don't require locks. I can see I am going to have to write a
performance test to see the actual speed of reading, say, 500 rows one
at a time with a lock on them.