Thread: Loading large dataset



Replies: 8 - Last post: Sep 17, 2014 10:31 PM by Omer Eisenberg
Omer Eisenberg

Posts: 2
Registered: 4/13/00
Loading large dataset
  Posted: Sep 16, 2014 11:30 AM
Hi All,
I have the following code working properly against an ADO database.

query->SQL->Add("select * from table_name");
query->Open();
query->First();
while (!query->Eof) {
    item1 = query->FieldByName("fld1")->AsString;
    item2 = query->FieldByName("fld2")->AsString;
    query->Next();
}

For a large MySQL database it takes about 20 seconds to load 250,000 records.
Any ideas how to improve performance?

Thanks a lot in advance,
Omer.
Jeff Overcash (...

Posts: 1,529
Registered: 9/23/99
Re: Loading large dataset
  Posted: Sep 16, 2014 11:47 AM   in response to: Omer Eisenberg
Omer Eisenberg wrote:
Hi All,
I have the following code working properly against an ADO database.

query->SQL->Add("select * from table_name");
query->Open();
query->First();
while (!query->Eof) {
    item1 = query->FieldByName("fld1")->AsString;
    item2 = query->FieldByName("fld2")->AsString;
    query->Next();
}

For a large MySQL database it takes about 20 seconds to load 250,000 records.
Any ideas how to improve performance?

Thanks a lot in advance,
Omer.

Try switching the CursorLocation from client to server. Does that improve
things? If so, you are paying the time needed to fully buffer the data locally.

You can also time FetchAll() by itself after calling Open(); that is the time
it takes to fetch all the data locally.

A lot of databases also do what is called packet stuffing: they put as many
rows as will fit into a single network packet to reduce network round trips.
The database client DLL then does not go back to the server for more data as
you read rows until it has fully consumed the last packet it received.

With that in mind, stay away from select * on large result sets; list only the
columns you are interested in. That means fewer network packets (if your
database packet stuffs) and less client-side memory needed to hold the whole
result.
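
Something along these lines works as a rough test (a minimal sketch only,
assuming a connected dbGo TADOQuery named query; the unit name and the
GetTickCount timing are illustrative and may differ in your RAD Studio version):

// Sketch: compare clUseServer vs clUseClient and time the full fetch.
// Assumes "query" is a TADOQuery already attached to a live connection.
#include <vcl.h>
#include <Data.Win.ADODB.hpp>   // named ADODB.hpp in older versions

void __fastcall TimeFetch(TADOQuery *query)
{
    query->Close();
    query->SQL->Text = "select fld1, fld2 from table_name"; // only the columns you need
    query->CursorLocation = clUseServer;  // flip to clUseClient to compare
    query->DisableControls();             // no grid repaints while looping
    DWORD start = GetTickCount();
    query->Open();
    while (!query->Eof) {
        System::String item1 = query->FieldByName("fld1")->AsString;
        System::String item2 = query->FieldByName("fld2")->AsString;
        query->Next();
    }
    DWORD elapsedMs = GetTickCount() - start;  // total open + fetch time, in ms
    query->EnableControls();
}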

--
Jeff Overcash (TeamB)
(Please do not email me directly unless asked. Thank You)
And so I patrol in the valley of the shadow of the tricolor
I must fear evil. For I am but mortal and mortals can only die.
Asking questions, pleading answers from the nameless
faceless watchers that stalk the carpeted corridors of Whitehall.
(Fish)
Alain Bastien

Posts: 153
Registered: 11/12/01
Re: Loading large dataset
  Posted: Sep 17, 2014 4:21 AM   in response to: Jeff Overcash (...
May I?

Use a stored procedure. It's a bit more complex, but both efficient and
effective.

http://www.diffen.com/difference/Effectiveness_vs_Efficiency
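
For instance (a hypothetical sketch only: the stored procedure get_items and
the field name are made up, and it assumes an existing TADOConnection):

// Hypothetical sketch: call an assumed stored procedure "get_items" through dbGo.
#include <vcl.h>
#include <Data.Win.ADODB.hpp>

void __fastcall ReadViaProcedure(TADOConnection *conn)
{
    TADOStoredProc *sp = new TADOStoredProc(NULL);
    try {
        sp->Connection = conn;
        sp->ProcedureName = "get_items";   // assumed to exist on the server
        sp->Open();                        // assumed to return a result set
        while (!sp->Eof) {
            System::String item1 = sp->FieldByName("fld1")->AsString;
            sp->Next();
        }
        sp->Close();
    }
    __finally {
        delete sp;
    }
}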
Jeff Overcash (...

Posts: 1,529
Registered: 9/23/99
Re: Loading large dataset
  Posted: Sep 17, 2014 9:28 AM   in response to: Alain Bastien
Alain Bastien wrote:
May I?

Use a stored procedure. It's a bit more complex, but both efficient and
effective.

http://www.diffen.com/difference/Effectiveness_vs_Efficiency

It won't change a thing. He wants to load a large number of records locally
(perhaps for a report or something), and a stored procedure will not affect
that at all.

Stored procedures tend to be slightly slower than straight SQL (hard to notice
outside a tight loop iterated many times). Also, some database engines packet
stuff results from plain SQL but cannot packet stuff results from stored
procedures.

--
Jeff Overcash (TeamB)
(Please do not email me directly unless asked. Thank You)
Alain Bastien

Posts: 153
Registered: 11/12/01
Re: Loading large dataset
  Posted: Sep 17, 2014 1:15 PM   in response to: Jeff Overcash (...
But one would need eyes like a bee to read 250,000 records displayed on
the screen!!
Jeff Overcash (...

Posts: 1,529
Registered: 9/23/99
Re: Loading large dataset
  Posted: Sep 17, 2014 1:31 PM   in response to: Alain Bastien
Alain Bastien wrote:
But one would need eyes like a bee to read 250,000 records displayed on
the screen!!

Reports often need that. More than likely a where clause would work just as
well as a stored procedure in his case. Turning the query into a stored
procedure only helps when it is so complex that the optimizer does a really
bad job, yet you can decompose it into an outer loop/inner loop structure to
improve the final optimized query.

For his question as asked, a stored procedure will have no impact at all.
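
As a sketch of that (the column fld3, the parameter name and the value 42 are
placeholders only):

// Sketch: restrict rows with a parameterized where clause rather than a
// stored procedure; adjust the column and value to the real criteria.
query->Close();
query->SQL->Text = "select fld1, fld2 from table_name where fld3 = :target";
query->Parameters->ParamByName("target")->Value = 42;
query->Open();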

--
Jeff Overcash (TeamB)
(Please do not email me directly unless asked. Thank You)
Alain Bastien

Posts: 153
Registered: 11/12/01
Re: Loading large dataset
  Posted: Sep 17, 2014 10:13 PM   in response to: Jeff Overcash (...
Yet still, using a loop such as

while (!eof) {
    check criteria
    next()
}

over a table of 1 million records is an archaic, obsolete method. I used to
do this in the '80s in Ashton-Tate dBase.

Now, with SQL combined with QReport or FastReport, you get the answer in one
click and a few milliseconds.
Omer Eisenberg

Posts: 2
Registered: 4/13/00
Re: Loading large dataset
  Posted: Sep 17, 2014 10:31 PM   in response to: Alain Bastien
Alain Bastien wrote:
Yet still, using a loop such as

while (!eof) {
    check criteria
    next()
}

over a table of 1 million records is an archaic, obsolete method. I used to
do this in the '80s in Ashton-Tate dBase.

Now, with SQL combined with QReport or FastReport, you get the answer in one
click and a few milliseconds.

Thanks a lot for all the answers.
I'm actually loading the data after running an SQL statement: about 10^5
records out of a database of 10^6 records or even larger.
I need the data for computation, not for reports.
Running it on a server won't change much, since the server is local.
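
One small tweak that helps when the rows feed a computation: look the fields up
once and read them through the typed accessors instead of AsString, roughly
like this (a sketch only; it assumes fld1/fld2 are numeric columns and the
query is already open):

// Sketch: typed field access for computation. Assumes fld1 is an integer
// column and fld2 a floating-point one; adjust to the real types.
query->DisableControls();                // no UI updates while looping
TField *f1 = query->FieldByName("fld1"); // look the fields up once, not per row
TField *f2 = query->FieldByName("fld2");
while (!query->Eof) {
    int    v1 = f1->AsInteger;
    double v2 = f2->AsFloat;
    // ... computation on v1 and v2 ...
    query->Next();
}
query->EnableControls();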
Thanks again,
Omer.