How many rows can be fetched using DBD::Informix?

On 10.01.2006 17:58:46 by ehughes

Hello All,

I have run into a problem I have not seen before. I need to select
all of the rows in a table and I am getting the following error:

# upd_wia_test.pl
Today's date: 1/10/2006
Running 'execute' .... done
Running 'fetch' .... Out of memory!
Issuing rollback() for database handle being DESTROY'd without
explicit disconnect().

I am running perl 5.8.7 on a Sun Ultra 60 with 2 GB of RAM under
Solaris 8. I am using DBI-1.50 and DBD-Informix-2005-02. Did I forget
anything?

It looks like the system is running out of memory during
fetchall_arrayref. A check of the table shows there are 35359 rows in
it. What parameters are in play here? Is there something in my
onconfig that I can modify? Any thoughts?

Thank you for your time,

Elton


=========================================================
Elton Hughes (IT), NOVA
505 W. Olive Ave., Suite 550
Sunnyvale, CA 94086
Phone: 408-730-7235  Fax: 408-730-7643
---------------------------------------------------------

Re: How many rows can be fetched using DBD::Informix?

On 10.01.2006 18:30:44 by jonathan.leffler

On 1/10/06, Elton Hughes wrote:

> I have run into a problem I have not seen before. I need to select
> all of the rows in a table and I am getting the following error:
>
> # upd_wia_test.pl
> Today's date: 1/10/2006
> Running 'execute' .... done
> Running 'fetch' .... Out of memory!
> Issuing rollback() for database handle being DESTROY'd without
> explicit disconnect().
>
> I am running perl 5.8.7 on a Sun Ultra 60 with 2 GB of RAM under
> Solaris 8. I am using DBI-1.50 and DBD-Informix-2005-02. Did I forget
> anything?
>
> It looks like the system is running out of memory during
> fetchall_arrayref. A check of the table shows there are 35359 rows in
> it. What parameters are in play here? Is there something in my
> onconfig that I can modify? Any thoughts?
>


Dear Elton,

The obvious primary factors are:
1. Size of each row of data.
2. If you've got blobs in the mix, you need to know how big they are;
DBD::Informix fetches them into memory.
3. Is this a 32-bit or a 64-bit version of Perl (ClientSDK - 2.90.UCx vs
2.90.FCx)?
4. On 32-bit systems, the number of shared libraries and their load
addresses can affect the total space available to the heap.
5. Are you using shared memory connections to IDS? (In sqlhosts, are you
using olipcshm, olipcstr or oltlitcp? See the fragment after this list.)
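
For reference, a hypothetical sqlhosts fragment (the file normally lives at
$INFORMIXDIR/etc/sqlhosts; the server and host names here are made up)
showing a shared-memory entry next to a TCP one:

# dbservername   nettype    hostname   servicename
ids_shm          olipcshm   myhost     dummy       # shared-memory connection
ids_tcp          oltlitcp   myhost     sqlexec     # TCP connection over TLI

A shared-memory connection attaches the communication segment inside the
client process, so on a 32-bit client it also eats into the address space
available to the heap.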


There could be a memory consumption bug - I've not seen it reported before.

Can you give a simple reproduction? I'm using the same Perl and module
versions on the same o/s.

FWIW: I loaded 40960 rows into a table:
CREATE TABLE load_test(i SERIAL NOT NULL PRIMARY KEY, s VARCHAR(64) NOT NULL);
and then used a script like this to select all 40K rows without problem:

#!/bin/perl -w
use strict;
use DBI;
my $dbh = DBI->connect('dbi:Informix:stores', '', '', {RaiseError => 1});
my $sth = $dbh->prepare('select * from load_test');
$sth->execute;
# Fetch every row at once, as a reference to an array of hash refs.
my $a_ref = $sth->fetchall_arrayref({});
foreach my $h_ref (@$a_ref)
{
    my ($i) = $h_ref->{i};
    my ($s) = $h_ref->{s};
    print "$i => $s\n";
}
$dbh->disconnect;
exit 0;

The actual rows probably averaged about 25 characters in the 's' column.
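
(As a rough sanity check, assuming typical Perl overheads: 40960 rows x ~25
bytes is only about 1 MB of raw string data. Even allowing a few hundred
bytes per row for the hash and scalars that fetchall_arrayref builds, the
whole result set should fit in a few tens of MB, nowhere near 2 GB.)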

--
Jonathan Leffler #include <disclaimer.h>
Guardian of DBD::Informix - v2005.02 - http://dbi.perl.org
"I don't suffer from insanity - I enjoy every minute of it."

Re: How many rows can be fetched using DBD::Informix?

On 10.01.2006 18:42:58 by scoles

I do not think there is any maximum number of rows, as I have used queries
with it well over 35k rows. It would be dependent on the size of the fields
in each row and the amount of memory you have.

In this case I think your query is simply running out of memory.

Try using a cursor in your SQL, or use fetchrow_arrayref, which might be
faster anyway than fetchall_arrayref for a large table like that.
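
A minimal sketch of that row-at-a-time approach, reusing the stores database
and load_test table from Jonathan's example above (the print stands in for
whatever per-row processing you need):

#!/bin/perl -w
use strict;
use DBI;
my $dbh = DBI->connect('dbi:Informix:stores', '', '', {RaiseError => 1});
my $sth = $dbh->prepare('select * from load_test');
$sth->execute;
# Fetch and process one row at a time; only one row is held in memory.
while (my $h_ref = $sth->fetchrow_hashref)
{
    print "$h_ref->{i} => $h_ref->{s}\n";
}
$dbh->disconnect;
exit 0;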

Re: How many rows can be fetched using DBD::Informix?

On 10.01.2006 19:14:35 by jseger

I can attest that there is no max number of rows set, or at least if there
is, it's way higher than 35k. I have pulled hundreds of thousands of rows
from an Informix database using DBI with no limit found yet. However, what
you are doing with those rows matters greatly. If you are throwing them all
into an array and then processing them (often more efficient for small
datasets), it's easy to bump up against your memory limits.

On large datasets, you may have better luck with what a friend of mine calls
"sip and spit", meaning read a row, process a row, read a row, process a
row.

On 1/10/06, John Scoles wrote:
>
> I do not think there is any maximum number of rows, as I have used queries
> with it well over 35k rows. It would be dependent on the size of the
> fields in each row and the amount of memory you have.
>
> In this case I think your query is simply running out of memory.
>
> Try using a cursor in your SQL, or use fetchrow_arrayref, which might be
> faster anyway than fetchall_arrayref for a large table like that.


--
It all comes down to a choice really.
You can get busy living, or get busy dying.
--Stephen King

Re: How many rows can be fetched using DBD::Informix?

On 10.01.2006 21:06:46 by Tim.Bunce

On Tue, Jan 10, 2006 at 01:14:35PM -0500, Jeffrey Seger wrote:
> I can attest that there is no max number of rows set, or at least if there
> is, it's way higher than 35k. I have pulled hundreds of thousands of rows
> from an Informix database using DBI with no limit found yet. However, what
> you are doing with those rows matters greatly. If you are throwing them all
> into an array and then processing them (often more efficient for small
> datasets), it's easy to bump up against your memory limits.
>
> On large datasets, you may have better luck with what a friend of mine calls
> "sip and spit", meaning read a row, process a row, read a row, process a
> row.

Yeap: http://search.cpan.org/src/TIMB/DBI_AdvancedTalk_2004/sld017.htm
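
That slide covers DBI's batched-fetch idiom; roughly, and again borrowing
the load_test table from earlier in the thread (the batch size of 1000 is
an arbitrary choice):

#!/bin/perl -w
use strict;
use DBI;
my $dbh = DBI->connect('dbi:Informix:stores', '', '', {RaiseError => 1});
my $sth = $dbh->prepare('select * from load_test');
$sth->execute;
# Fetch in batches of at most 1000 rows to bound peak memory use.
while (my $batch = $sth->fetchall_arrayref(undef, 1000))
{
    foreach my $row (@$batch)
    {
        print "$row->[0] => $row->[1]\n";
    }
    last if @$batch < 1000;  # a short batch means the result set is done
}
$dbh->disconnect;
exit 0;

Each call returns at most 1000 rows, so peak memory is bounded by the batch
size rather than the table size.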

Tim.
