Jared's techno blog

Thursday, December 23, 2004

optimize large data access

Posted: 12-22-2004 01:17 PM
I've been having headache for 3 days on this problem of retrieving 65,000 rows of DataTable from Oracle database. Customer needs to download the entire bill data. I tried a couple of ways of getting data, by filling DataSet or executing a DataReader, and generate a .csv file and zip it up on the server for download. But it either takes too long or gives an error of 'Page Cannot be displayed'. What is a good approach to this issue? Anybody can advise?



Re: optimize large data access
Posted: 12-22-2004 06:13 PM
You most likely need to use a FileStream to write out the data; depending on how much data there is, you might not be able (or want) to keep all of it in memory.

I suggest using the DataReader because:

You can use the ADO.NET DataReader to retrieve a read-only, forward-only stream of data from a database. Results are returned as the query executes, and are stored in the network buffer on the client until you request them using the Read method of the DataReader. Using the DataReader can increase application performance both by retrieving data as soon as it is available, rather than waiting for the entire results of the query to be returned, and (by default) storing only one row at a time in memory, reducing system overhead.

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/cpguide/html/cpcontheadonetdatareader.asp
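The forward-only read loop described in that excerpt looks roughly like this. As a sketch only, a DataTable-backed reader stands in for the Oracle result set so the code runs without a database; against the real database the reader would come from OracleCommand.ExecuteReader() on an open connection, and the column names here are made up:

```csharp
using System;
using System.Data;

static class ReaderDemo
{
    // Read every row through the forward-only IDataReader interface and
    // return the row count. DataTable.CreateDataReader() stands in for
    // the Oracle query; with a real database the reader would come from
    // OracleCommand.ExecuteReader() instead.
    public static int CountRows(DataTable table)
    {
        int count = 0;
        using (IDataReader reader = table.CreateDataReader())
        {
            while (reader.Read())   // only the current row is in memory
            {
                string acct = reader.GetString(0);      // hypothetical columns
                decimal amount = reader.GetDecimal(1);
                count++;
            }
        }
        return count;
    }

    static void Main()
    {
        var table = new DataTable();
        table.Columns.Add("acct", typeof(string));
        table.Columns.Add("amount", typeof(decimal));
        table.Rows.Add("A-1001", 42.50m);
        table.Rows.Add("A-1002", 17.25m);
        Console.WriteLine(CountRows(table)); // prints 2
    }
}
```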

and then write to a file as you get more data. Consider writing every 100 rows or some other batch size, because per-row file I/O is just too expensive.
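The batching idea above can be sketched as follows. The row content is dummy data standing in for the bill columns; in the real export each line would be built from the DataReader's current row inside a while (reader.Read()) loop:

```csharp
using System;
using System.IO;
using System.Text;

static class CsvExport
{
    const int BatchSize = 100; // flush to disk every 100 rows, not per row

    // Write totalRows CSV lines to path, buffering BatchSize rows in a
    // StringBuilder between file writes so the I/O cost is amortized.
    public static int Export(string path, int totalRows)
    {
        var buffer = new StringBuilder();
        int inBuffer = 0, written = 0;
        using (var writer = new StreamWriter(path))
        {
            // With the database, this loop is: while (reader.Read()) { ... }
            for (int row = 0; row < totalRows; row++)
            {
                buffer.Append(row).Append(",sample").AppendLine();
                written++;
                if (++inBuffer == BatchSize)
                {
                    writer.Write(buffer.ToString());
                    buffer.Length = 0;  // clear the buffer for the next batch
                    inBuffer = 0;
                }
            }
            if (buffer.Length > 0)          // flush the final partial batch
                writer.Write(buffer.ToString());
        }
        return written;
    }

    static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(), "bill.csv");
        Console.WriteLine(Export(path, 65000)); // prints 65000
    }
}
```

Because only one batch of rows is ever held in the buffer, memory stays flat no matter how many rows the query returns, which is the point of pairing the DataReader with a stream rather than filling a DataSet.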
xpcoder.NET
csharpalmanac.NET
