MySQL C API vector binding support?
on 25.09.2006 14:42:36 by dragomir.stanchev
Hello,
I have a "slight" problem with the mySQL C API. I want to bind vectors
for output/input prepared statements. However, the C API does NOT
support such data binding. Is there ANY other way to do a
bulk-INSERT/DELETE/UPDATE or bulk-SELECT in mySQL C API? I mean, one
can do it with a loop, but obiously the goal is to have a higher
performance.I tried it with a plain-old loop, but it is still slow.
Oracle supports such feature and it is considerably faster than the
usual loop-Insert (appr. 100 times faster for big queries) ! ! !
Please, advise.
Thanx.
Regards,
Dragomir Stanchev
Re: MySQL C API vector binding support?
on 17.10.2006 16:07:14 by dragomir.stanchev
The|Godfather wrote:
> Hello,
> I have a "slight" problem with the mySQL C API. I want to bind vectors
> for output/input prepared statements. However, the C API does NOT
> support such data binding. Is there ANY other way to do a
> bulk-INSERT/DELETE/UPDATE or bulk-SELECT in mySQL C API? I mean, one
> can do it with a loop, but obiously the goal is to have a higher
> performance.I tried it with a plain-old loop, but it is still slow.
> Oracle supports such feature and it is considerably faster than the
> usual loop-Insert (appr. 100 times faster for big queries) ! ! !
> Please, advise.
>
> Thanks.
>
> Regards,
>
> Dragomir Stanchev
Hi,
I seem to have found a partial solution to my problem. However, it works
only for INSERT.
When you want to add many vectors of elements, or other array-style data,
to your MySQL database using prepared queries under the MySQL C API, you
can do it in two ways:
--->
The following tests were conducted using an array of 3000 vectors, each
holding 100 elements of type int (300 000 integers in total).
--->
1. Insert the elements one by one. The obvious solution, but very
slow: for 300 000 elements (integers) it takes a staggering ~38
minutes of INSERTs against a remote (not local) server.
2. The solution I used: build the prepared query yourself, and instead
of
"INSERT INTO tableName VALUES (...)"
use
"INSERT INTO tableName VALUES (...),(...),(...), ... ,(...)"
with n value groups, where n is vector<>.size() or whatever batch size
you choose. Note that a batch of about 100 rows is usually advised,
because it tends to be the most efficient. (A minimal C sketch of this
approach follows after the list.)
The efficiency gain: ~15 seconds total for the same 300 000 elements
against the same remote MySQL server.
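For reference, here is a minimal sketch of how the batched INSERT could look
with the MySQL C prepared-statement API. The table name (tableName), the
assumption of a single INT column, the connection credentials, and the data
being inserted are all placeholders of mine, not from the original setup;
setting BATCH to 1 degenerates to the one-by-one approach from point 1.

/*
 * Sketch: multi-row prepared INSERT, assuming a table tableName with
 * a single INT column and placeholder connection credentials.
 */
#include <mysql.h>
#include <stdio.h>
#include <string.h>

#define BATCH 100          /* rows per multi-row INSERT, as suggested above */
#define TOTAL 300000       /* total number of integers to insert            */

int main(void)
{
    MYSQL *conn = mysql_init(NULL);
    if (!mysql_real_connect(conn, "host", "user", "password", "db", 0, NULL, 0)) {
        fprintf(stderr, "connect: %s\n", mysql_error(conn));
        return 1;
    }

    /* Build "INSERT INTO tableName VALUES (?),(?),...,(?)" with BATCH groups. */
    char query[16 * BATCH + 64];
    strcpy(query, "INSERT INTO tableName VALUES ");
    for (int i = 0; i < BATCH; i++)
        strcat(query, i ? ",(?)" : "(?)");

    MYSQL_STMT *stmt = mysql_stmt_init(conn);
    if (mysql_stmt_prepare(stmt, query, strlen(query))) {
        fprintf(stderr, "prepare: %s\n", mysql_stmt_error(stmt));
        return 1;
    }

    /* One bind entry per placeholder, all pointing into a reusable buffer. */
    int values[BATCH];
    MYSQL_BIND bind[BATCH];
    memset(bind, 0, sizeof(bind));
    for (int i = 0; i < BATCH; i++) {
        bind[i].buffer_type = MYSQL_TYPE_LONG;   /* C int -> INT column */
        bind[i].buffer      = &values[i];
    }
    if (mysql_stmt_bind_param(stmt, bind)) {
        fprintf(stderr, "bind: %s\n", mysql_stmt_error(stmt));
        return 1;
    }

    /* Refill the buffer and execute once per batch of BATCH values. */
    for (int row = 0; row < TOTAL; row += BATCH) {
        for (int i = 0; i < BATCH; i++)
            values[i] = row + i;                 /* dummy data */
        if (mysql_stmt_execute(stmt)) {
            fprintf(stderr, "execute: %s\n", mysql_stmt_error(stmt));
            return 1;
        }
    }

    mysql_stmt_close(stmt);
    mysql_close(conn);
    return 0;
}

The statement is prepared and bound only once; each mysql_stmt_execute() call
then sends one multi-row INSERT of BATCH values, which is where the speedup
over executing 300 000 single-row statements comes from.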
Just out of curiosity, I tested Oracle's integrated OCI array-insertion
mechanism, and it turned out that Oracle was slower: ~19 seconds for
300 000 elements, with 100 elements per bulk insert.
Note: I tried to optimize UPDATE and/or DELETE the same way, but with no
success so far. Any suggestions would be welcome.
Cheers,
Dragomir Stanchev