C# w/ ODBC, 2.1 million list select gives empty DataSet

On 12.01.2010 at 00:34:25, by Laurent Chouinard


Hi everyone,

I've been using the ODBC driver for Windows along with my C# application
(.NET Framework 2.0) for a few years now, and it's been working very nicely.

Until now.

One of my customers' databases has grown to a considerable size over time,
and in particular, one report he attempts to run in my software comes out
empty.

Upon investigation, I determined that the SELECT I send to PostgreSQL gets
processed properly and probably hands back about 2.1 million rows as the
response. After that, I use the "Fill" method, as instructed in the "HOW TO"
section of the ODBC documentation, to load the data into a DataSet object.

After a considerable amount of work (about 110 seconds, almost the same as
when I run that same query in pgAdmin III), the DataSet comes back empty,
with zero tables in it. No errors, no exceptions. Everything moves along as
if nothing happened.

I just upgraded to the latest ODBC drivers (dated December 2009), and that
hasn't changed anything.

Am I hitting a limit of some sort? I played with the "ConnectionTimeout"
values of OdbcCommand and OdbcDataAdapter, but that didn't seem to make a
difference.

Here's a selected sample of the code I use:



OdbcCommand odcQuery = new OdbcCommand(sSQLQuery);

using (OdbcConnection dbcPostGresConnection =
    new OdbcConnection(_dpDBSettings.GetConnectionString()))
{
    try
    {
        odcQuery.Connection = dbcPostGresConnection;
        dbcPostGresConnection.Open();

        switch (emCurrentType)
        {
            case emSqlQueryType.CREATE:
            case emSqlQueryType.DELETE:
            case emSqlQueryType.UPDATE:
            case emSqlQueryType.INSERT:
            case emSqlQueryType.ALTER:
            case emSqlQueryType.DROP:
            {
                _kNumberOfAffectedRecordsFromLastQuery = odcQuery.ExecuteNonQuery();
                break;
            }

            case emSqlQueryType.SELECT:
            {
                OdbcDataAdapter oddaDatasetFiller = new OdbcDataAdapter(odcQuery);
                oddaDatasetFiller.Fill(_dsLastResults);

                if (_dsLastResults.Tables.Count > 0)
                {
                    _kNumberOfAffectedRecordsFromLastQuery =
                        _dsLastResults.Tables[0].Rows.Count;
                }

                break;
            }
        }

        _bLastQuerySucceeded = true;
    }
    catch
    {
        // ... etc., all possible catches ahead.
    }
}
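[Editorial sketch, not part of the original message.] A result set this
large could in principle be streamed with ExecuteReader instead of
materialized into a DataSet, so only one row is held in managed memory at a
time. A rough sketch, reusing the odcQuery command from the code above:

```csharp
// Hypothetical sketch: stream rows one at a time instead of
// buffering all 2.1 million of them in a DataSet.
int kRowCount = 0;
using (OdbcDataReader oddrReader = odcQuery.ExecuteReader())
{
    while (oddrReader.Read())
    {
        // Process one row here, e.g. write it straight into the report.
        kRowCount++;
    }
}
_kNumberOfAffectedRecordsFromLastQuery = kRowCount;
```

Note this only helps on the .NET side; whether the ODBC driver itself still
buffers the whole result set depends on its settings.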



Does anyone have suggestions or ideas? I've searched the web, mailing lists,
and documentation as best I could; nothing seems to be said about this
particular issue.

One interesting thing I might add: I was previously using a very old ODBC
driver, from 2006. That one raised an exception on the .Fill() call:
"ERROR [HY000] Out of memory while reading tuples". Now I'm assuming that
since I upgraded to the new driver, the absence of this exception means I'm
no longer running out of memory, right?

Even more details: with the original 2006 drivers, I could watch my
application (the .EXE) climb to 700MB of private bytes/virtual size in Task
Manager until it eventually crashed, and then not release that memory. So
initially, I was investigating a memory leak.

With the new drivers, it still climbs to 700MB or so (which makes sense; 2
million rows is quite a bit), and then, after the "0 rows" come back, the
memory is cleared immediately. That makes sense as well: ODBC isn't carrying
the data anymore, the dataset is almost empty, and the garbage collector is
having a field day.

Thanks for any suggestions or input from anyone.

Regards,

Laurent Chouinard



Laurent Chouinard
laurent.chouinard@sem.ca
SEM inc.
3610 Valiquette, St-Laurent, QC, H4S 1X8
Tel: 514-334-7569 / 1-888-334-7569
Fax: 514-334-5922





Re: C# w/ ODBC, 2.1 million list select gives empty DataSet

On 13.01.2010 at 05:41:47, by Craig Ringer

On 12/01/2010 7:34 AM, Laurent Chouinard wrote:

> With the new drivers, it still climbs to 700MB or so (makes sense, 2
> million rows is quite a bit), and then after the "0 rows" returned,
> memory is cleared immediately. Makes sense as well, ODBC doesn't carry
> data anymore, dataset is almost null, garbage collector is having a
> field day.

I don't use ODBC much personally and don't work on the driver - but it
sounds to me a lot like the new driver may be running out of memory too,
but is eating the failure (or reporting it via some side-channel) rather
than throwing. If it is, that'd be pretty bad behavior in my personal
opinion, but then I don't know ODBC well. Someone who does may be able
to shed some more light.

My suggestion, though: Even if you resolve this now, as things continue
to grow you're still going to have OOM issues. Perhaps it'd be a good
idea to use a cursor for this?
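[Editorial sketch, not part of Craig's message.] With PostgreSQL over ODBC,
a server-side cursor approach might look roughly like this; the cursor name
and batch size are arbitrary placeholders, sSQLQuery and the connection come
from the original code, and a transaction is required because PostgreSQL
cursors only live inside one:

```csharp
// Hypothetical sketch: fetch a huge result set in batches through a
// server-side cursor instead of one giant buffered SELECT.
using (OdbcTransaction trx = dbcPostGresConnection.BeginTransaction())
{
    OdbcCommand cmd = dbcPostGresConnection.CreateCommand();
    cmd.Transaction = trx;

    cmd.CommandText = "DECLARE report_cur CURSOR FOR " + sSQLQuery;
    cmd.ExecuteNonQuery();

    int nRead;
    do
    {
        nRead = 0;
        cmd.CommandText = "FETCH 10000 FROM report_cur";
        using (OdbcDataReader rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                // Process one row here.
                nRead++;
            }
        }
    } while (nRead > 0); // an empty FETCH means the cursor is exhausted

    cmd.CommandText = "CLOSE report_cur";
    cmd.ExecuteNonQuery();
    trx.Commit();
}
```

This keeps peak memory proportional to the batch size rather than to the
total row count.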

--
Craig Ringer

--
Sent via pgsql-odbc mailing list (pgsql-odbc@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-odbc

Re: C# w/ ODBC, 2.1 million list select gives empty DataSet

On 13.01.2010 at 06:35:34, by Hiroshi Inoue

Laurent Chouinard wrote:
> Hi everyone,
>
> I've been using the ODBC driver for Windows along with my C# (Framework
> 2.0) for a few years now and it's been working very nicely.
>
> Until now.
>
> One of my customer's database has grown to considerable size over time,
> and in particular, one report he attempts to do in my software gives an
> empty report.
>
> Upon investigation, I determined that the SELECT I send to the
> PostgreSQL gets processed properly and it probably hands over about 2.1
> million rows as the response. After that, I use the "Fill" method as
> instructed in the "HOW TO" section in the ODBC documentation to fill the
> data into a DataSet object.
>
> After considerable amount of work (almost the same as when I do that
> same query in PgAdmin III, about 110 seconds), the DataSet gets
> initialized empty, with zero tables in it. No errors, no exceptions.
> Everything moves along as if nothing happened.
>
> I just upgraded to the latest ODBC drivers (dated december 2009) and
> it's not changing my issue.

Hmmm, 2.1 million rows is a lot. Possibly the "Out of memory while reading
tuples" error occurs, but the driver forgets to handle it in some cases.
Please try the test drivers for 8.4.0201 at
http://www.geocities.jp/inocchichichi/psqlodbc/index.html .
You may be able to improve the memory usage by checking the "Use
Declare/Fetch" option.
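[Editorial note, not from Hiroshi's message.] Besides the DSN checkbox, the
same behavior can be enabled in a DSN-less connection string via the
psqlODBC UseDeclareFetch and Fetch keywords; the server, database, and
credentials below are placeholder values:

```
Driver={PostgreSQL Unicode};Server=localhost;Port=5432;
Database=mydb;Uid=myuser;Pwd=mypass;UseDeclareFetch=1;Fetch=10000
```

With UseDeclareFetch=1 the driver runs the query through a server-side
cursor and retrieves rows in blocks of Fetch rows, instead of buffering the
whole result set in client memory.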

regards,
Hiroshi Inoue



Re: C# w/ ODBC, 2.1 million list select gives empty DataSet

On 13.01.2010 at 21:34:07, by Laurent Chouinard

On 2010-01-12 23:41, Craig Ringer wrote:
> On 12/01/2010 7:34 AM, Laurent Chouinard wrote:
>
>> With the new drivers, it still climbs to 700MB or so (makes sense, 2
>> million rows is quite a bit), and then after the "0 rows" returned,
>> memory is cleared immediately. Makes sense as well, ODBC doesn't carry
>> data anymore, dataset is almost null, garbage collector is having a
>> field day.
>
> I don't use ODBC much personally and don't work on the driver - but it
> sounds to me a lot like the new driver may be running out of memory
> too, but is eating the failure (or reporting it via some side-channel)
> rather than throwing. If it is, that'd be pretty bad behavior in my
> personal opinion, but then I don't know ODBC well. Someone who does
> may be able to shed some more light.
>
> My suggestion, though: Even if you resolve this now, as things
> continue to grow you're still going to have OOM issues. Perhaps it'd
> be a good idea to use a cursor for this?
>

You're right. For some reason, it completely escaped me that the new driver
could be failing just like the previous one, only in a less acceptable way.

Indeed, using a cursor seems logical when dealing with very large quantities
of data, especially considering that 2.1 million rows represents only a year
of use; my final solution must be future-proof to some extent. I've tried to
keep my code as database-agnostic as possible so that I can swap from
PostgreSQL to Microsoft SQL Server easily, depending on customer
requirements. Using a cursor means I need a PG-specific path, but then
again, I could also find out how to do the same on the MS side and call it a
day. A long one, however!
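[Editorial sketch, not part of the original reply.] One portable alternative
to driver-specific cursors is keyset pagination, which uses only standard
SQL (the `OFFSET ... FETCH FIRST` clause, supported by PostgreSQL 8.4+ and
SQL Server 2012+). A rough sketch, assuming a hypothetical report_table with
a monotonically increasing id column:

```csharp
// Hypothetical sketch: page through the table by primary key, so each
// round trip returns at most one batch. "report_table", "id" and
// "payload" are placeholder names.
long lastId = 0;
int nRead;
do
{
    nRead = 0;
    OdbcCommand cmd = new OdbcCommand(
        "SELECT id, payload FROM report_table WHERE id > ? " +
        "ORDER BY id OFFSET 0 ROWS FETCH FIRST 10000 ROWS ONLY",
        dbcPostGresConnection);
    cmd.Parameters.AddWithValue("@p1", lastId);
    using (OdbcDataReader rdr = cmd.ExecuteReader())
    {
        while (rdr.Read())
        {
            lastId = rdr.GetInt64(0); // remember the last key seen
            // Process one row here.
            nRead++;
        }
    }
} while (nRead > 0);
```

Unlike OFFSET-based paging, restarting each batch from the last seen key
keeps every round trip an index range scan, so batches stay fast even deep
into the table.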

Thanks all for your inputs.

Also, thanks Hiroshi for your suggestion for the other driver, but I
will not venture into non-official releases at the moment because my
product is already in use by customers. I can't rely on experimental or
beta versions of code.

Regards,
Laurent Chouinard
