UPDATE with string > 16M characters corrupts MyISAM table
on 23.10.2002 22:19:37 by Craig Cummings
>Description:
I'm trying to insert long strings into the LONGTEXT column 'sequence' of
MyISAM table 'chromosomes'. The details of this table are given at the
bottom of this section. I'm using Perl/DBI to interface with MySQL.
This works fine as long as the strings are less than 16 million
characters. However, when I attempt to update a row by adding a string
longer than 16M characters, I run into problems.
* Attempt #1 (inserting the whole string):
If I attempt to update the row by setting the column value to equal the
full-length string, e.g.
UPDATE chromosomes SET sequence = "string" WHERE acc = "acc";
I get the following error:
DBD::mysql::db do failed: MySQL server has gone away at
../load_seqs_into_db.pl line 39, chunk 9.
"CHECK TABLE chromosomes" shows that the table is OK.
From my past experience this results from the packet size exceeding
MAX_ALLOWED_PACKET. However, this variable is currently set to 67107840,
far higher than the string length (20700699 characters). (As an aside,
even though I specify MAX_ALLOWED_PACKET as 2GB in my.cnf, the server
always reports the 67MB value. Any idea why this happens?)
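(For reference, this is how I check the value the server actually reports, as a minimal Perl/DBI sketch; the database name, host, and credentials are placeholders:)

use strict;
use DBI;

# Ask the server directly for the effective max_allowed_packet,
# rather than trusting what my.cnf says.
my $dbh = DBI->connect("DBI:mysql:database=bigtest;host=localhost",
                       "user", "password", { RaiseError => 1 });
my ($name, $value) = $dbh->selectrow_array(
    "SHOW VARIABLES LIKE 'max_allowed_packet'");
print "$name = $value bytes\n";    # prints: max_allowed_packet = 67107840
$dbh->disconnect;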
* Attempt #2 (breaking up the string):
I tried to accomplish the same task by breaking the string into smaller
substrings then setting the column value by iterative concatenation
operations, e.g.
foreach consecutive subsequence {
UPDATE chromosomes
SET sequence = CONCAT(sequence, "subsequence")
WHERE acc = "acc"
}
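In Perl/DBI the loop looks roughly like the following sketch (the toy data, chunk size, and connection details are placeholders, not my actual loader):

use strict;
use DBI;

# Toy data; the real strings are chromosome sequences > 16M characters.
my ($acc, $sequence) = ("AC000001", "ACGT" x 5_000_000);
my $chunk = 1_000_000;    # characters appended per UPDATE

my $dbh = DBI->connect("DBI:mysql:database=bigtest", "user", "password",
                       { RaiseError => 1 });

# Start from an empty string so CONCAT() never sees NULL
# (CONCAT(NULL, x) yields NULL).
$dbh->do("UPDATE chromosomes SET sequence = '' WHERE acc = ?",
         undef, $acc);
for (my $i = 0; $i < length($sequence); $i += $chunk) {
    # CONCAT(sequence, ?) appends the next piece to the stored column.
    $dbh->do("UPDATE chromosomes SET sequence = CONCAT(sequence, ?)"
             . " WHERE acc = ?",
             undef, substr($sequence, $i, $chunk), $acc);
}
$dbh->disconnect;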
This works fine for strings shorter than 17M characters. However, the
problematic string causes an error on the update that should bring the
stored string over the 17M character mark. Specifically, I get an error
127 from the table handler. After this error, SELECT statements from that
table no longer return all of the expected rows. "CHECK TABLE
chromosomes" reveals that the table is corrupted. " REPAIR TABLE
chromosomes" restores the normal SELECT behavior but deletes the row
corresponding to the problematic sequence.
*Table details*
mysql> show table status like "chromosomes"\G
*************************** 1. row ***************************
           Name: chromosomes
           Type: MyISAM
     Row_format: Dynamic
           Rows: 36
 Avg_row_length: 321955
    Data_length: 11590388
Max_data_length: 4294967295
   Index_length: 2048
      Data_free: 0
 Auto_increment: NULL
    Create_time: 2002-10-11 08:11:45
    Update_time: 2002-10-23 11:53:27
     Check_time: 2002-10-23 12:00:56
 Create_options:
        Comment:
mysql> describe chromosomes;
+----------+-------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+----------+-------------+------+-----+---------+-------+
| chr | varchar(12) | | | | |
| acc | varchar(12) | | PRI | | |
| sequence | longtext | YES | | NULL | |
+----------+-------------+------+-----+---------+-------+
>How-to-repeat:
These bugs are repeatable with any string longer than about 16M
characters, even after repairing the table.
>Fix:
I have not found one
>Originator: Craig Cummings
>MySQL support: none
>Synopsis: UPDATEs that insert strings > 16M chars cause errors and corrupt the table
>Severity: serious
>Priority: high
>Category: mysql
>Class: sw-bug
>Release: mysql-4.0.3-beta (Source distribution)
>Server: /tools/local/mysql/bin/mysqladmin Ver 8.37 Distrib 4.0.3-beta,
for dec-osf4.0f on alphaev6
Server version 4.0.3-beta-log
Protocol version 10
Connection Localhost via UNIX socket
UNIX socket /tmp/mysql.sock
Uptime: 12 days 19 min 6 sec
System: OSF1 ricotta V4.0 1229 alpha
Machine: alpha
Some paths: /tools/perl/current/bin/perl /usr/ucb/make
/tools/gnu/bin/gmake /tools/gnu/bin/gcc /usr/ucb/cc
GCC: Reading specs from
/tools/gnu/lib/gcc-lib/alphaev6-dec-osf4.0f/2.95.2/specs
gcc version 2.95.2 19991024 (release)
Compilation info: CC='gcc' CFLAGS='-O3' CXX='gcc' CXXFLAGS='-O3
-felide-constructors -fno-exceptions -fno-rtti' LDFLAGS=''
LIBC:
lrwxr-xr-x 1 root system 17 Mar 13 2001 /lib/libc.a ->
.../ccs/lib/libc.a
lrwxr-xr-x 1 root system 17 Mar 13 2001 /usr/lib/libc.a ->
.../ccs/lib/libc.a
Configure command: ./configure --prefix=/tools/local/mysql-4.0.3/
--localstatedir=/usr3/data3/mysql CFLAGS=-O3 'CXXFLAGS=-O3 -felide-constructors
-fno-exceptions -fno-rtti' CXX=gcc
Perl: This is perl, version 5.004_04 built for alpha-dec_osf
Thanks in advance for any help you can provide.
Best regards,
Craig Cummings, Ph.D.
Relman Laboratory
Stanford University School of Medicine
Department of Microbiology and Immunology
e-mail: cummings@cmgm.stanford.edu
phone: 650-498-5998
fax: 650-852-3291
Re: UPDATE with string > 16M characters corrupts MyISAM table
on 24.10.2002 17:07:57 by Sinisa Milivojevic
Craig Cummings writes:
> >Description:
>
> I'm trying to insert long strings into the LONGTEXT column 'sequence' of
> MyISAM table 'chromosomes'. The details of this table are given at the
> bottom of this section. I'm using Perl/DBI to interface with MySQL.
> This works fine as long as the strings are less than 16 million
> characters. However, when I attempt to update a row by adding a string
> longer than 16M characters, I run into problems.
>
> * Attempt #1 (inserting the whole string):
> If I attempt to update the row by setting the column value to equal the
> full-length string, e.g.
>
> UPDATE chromosomes SET sequence = "string" WHERE acc = "acc";
>
> I get the following error:
>
> DBD::mysql::db do failed: MySQL server has gone away at
> ./load_seqs_into_db.pl line 39, chunk 9.
>
> Thanks in advance for any help you can provide.
>
> Best regards,
>
> Craig Cummings, Ph.D.
>
> Relman Laboratory
> Stanford University School of Medicine
> Department of Microbiology and Immunology
>
> e-mail: cummings@cmgm.stanford.edu
> phone: 650-498-5998
> fax: 650-852-3291
>
>
Hi!
I have tested your case and it works just fine for me.
In order to have the above functional, you must have done the
following:
* set max_allowed_packet at server side (cannot be larger than 1 GB)
* linked your DBD against the 4.0 MySQL C API and not against 3.23
* increased max_allowed_packet on the client side too
Even if you do all of the above you might still experience problems on
some Unices due to ulimit and other system-wide limitations.
--
Regards,
__ ___ ___ ____ __
/ |/ /_ __/ __/ __ \/ / Mr. Sinisa Milivojevic
/ /|_/ / // /\ \/ /_/ / /__ MySQL AB, Fulltime Developer
/_/ /_/\_, /___/\___\_\___/ Larnaca, Cyprus
<___/ www.mysql.com
Re: UPDATE with string > 16M characters corrupts MyISAM table
on 01.11.2002 07:42:48 by Craig Cummings
Dear Mr. Milivojevic,
Following your recommendations, we rebuilt Msql-Mysql-modules against the
4.0.3 version of the libraries. As before, max_allowed_packet is set at
67107840 on the server side. I do not know how to increase the
max_allowed_packet size in the context of Perl DBI.
Relinking of the DBD modules solved the problem for insertion of a
20700699 character string. However, attempting to insert a 30291480
character string with the same code:
$sql = "UPDATE chromosomes
SET sequence = \"a_very_long_string\"
WHERE acc = \"accession\"
";
eval {
$dbh->do($sql);
};
results in the following error:
DBD::mysql::db do failed: Out of memory (Needed 30291504 bytes) at
./load_seqs_into_db.pl line 39, chunk 33.
I don't think this is a UNIX limitation. Here's a listing of my available
resources:
ricotta 142 > ulimit -a
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 1048576
stack(kbytes) 2048
memory(kbytes) 2051936
coredump(blocks) 1
nofiles(descriptors) 4096
vmemory(kbytes) 1048576
I read somewhere in the documentation that the C API limits packet size to
16 Mbytes. Because the Perl DBD module utilizes the C library, this could
be a potential limitation. But could this be causing the error if I am
able to insert a 20 Mbyte string?
Any further sage advice would be greatly appreciated.
Best regards,
Craig Cummings
On Thu, 24 Oct 2002, Sinisa Milivojevic wrote:
> Craig Cummings writes:
> > >Description:
> >
> > I'm trying to insert long strings into the LONGTEXT column 'sequence' of
> > MyISAM table 'chromosomes'. The details of this table are given at the
> > bottom of this section. I'm using Perl/DBI to interface with MySQL.
> > This works fine as long as the strings are less than 16 million
> > characters. However, when I attempt to update a row by adding a string
> > longer than 16M characters, I run into problems.
> >
> > * Attempt #1 (inserting the whole string):
> > If I attempt to update the row by setting the column value to equal the
> > full-length string, e.g.
> >
> > UPDATE chromosomes SET sequence = "string" WHERE acc = "acc";
> >
> > I get the following error:
> >
> > DBD::mysql::db do failed: MySQL server has gone away at
> > ./load_seqs_into_db.pl line 39, chunk 9.
> >
> > Thanks in advance for any help you can provide.
> >
> > Best regards,
> >
> > Craig Cummings, Ph.D.
> >
> > Relman Laboratory
> > Stanford University School of Medicine
> > Department of Microbiology and Immunology
> >
> > e-mail: cummings@cmgm.stanford.edu
> > phone: 650-498-5998
> > fax: 650-852-3291
> >
> >
>
> Hi!
>
> I have tested your case and it works just fine for me.
>
> In order to have the above functional, you must have done the
> following:
>
> * set max_allowed_packet at server side (cannot be larger than 1 GB)
> * linked your DBD against the 4.0 MySQL C API and not against 3.23
> * increased max_allowed_packet on the client side too
>
> Even if you do all of the above you might still experience problems on
> some Unices due to ulimit and other system-wide limitations.
>
> --
> Regards,
> __ ___ ___ ____ __
> / |/ /_ __/ __/ __ \/ / Mr. Sinisa Milivojevic
> / /|_/ / // /\ \/ /_/ / /__ MySQL AB, Fulltime Developer
> /_/ /_/\_, /___/\___\_\___/ Larnaca, Cyprus
> <___/ www.mysql.com
>
>
Craig Cummings, Ph.D.
Relman Laboratory
Stanford University School of Medicine
Department of Microbiology and Immunology
e-mail: cummings@cmgm.stanford.edu
phone: 650-498-5998
fax: 650-852-3291
Re: UPDATE with string > 16M characters corrupts MyISAM table
on 01.11.2002 13:12:53 by Sinisa Milivojevic
Craig Cummings writes:
> Dear Mr. Milivojevic,
>
> Following your recommendations, we rebuilt Msql-Mysql-modules against the
> 4.0.3 version of the libraries. As before, max_allowed_packet is set at
> 67107840 on the server side. I do not know how to increase the
> max_allowed_packet size in the context of Perl DBI.
>
> Relinking of the DBD modules solved the problem for insertion of a
> 20700699 character string. However, attempting to insert a 30291480
> character string with the same code:
>
> $sql = "UPDATE chromosomes
> SET sequence = \"a_very_long_string\"
> WHERE acc = \"accession\"
> ";
> eval {
> $dbh->do($sql);
> };
>
> results in the following error:
>
> DBD::mysql::db do failed: Out of memory (Needed 30291504 bytes) at
> ./load_seqs_into_db.pl line 39, chunk 33.
>
> I don't think this is a UNIX limitation. Here's a listing of my available
> resources:
>
> ricotta 142 > ulimit -a
> time(seconds) unlimited
> file(blocks) unlimited
> data(kbytes) 1048576
> stack(kbytes) 2048
> memory(kbytes) 2051936
> coredump(blocks) 1
> nofiles(descriptors) 4096
> vmemory(kbytes) 1048576
>
> I read somewhere in the documentation that the C API limits packet size to
> 16 Mbytes. Because the Perl DBD module utilizes the C library, this could
> be a potential limitation. But could this be causing the error if I am
> able to insert a 20 Mbyte string?
>
> Any further sage advice would be greatly appreciated.
>
> Best regards,
>
> Craig Cummings
>
>
Hi!
First of all, I am glad that my recommendations worked.
Second, the 4.0.* C API that you use limits the packet size to 1 GB, so
that cannot be the cause of any problems.
The problem that you describe above is caused by how DBD was
designed. I am not much of an expert on it, but you could try the DBI
interface.
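One thing you could try on the client side: DBD::mysql can hand an
option file to the C library through the mysql_read_default_file
attribute in the DSN (assuming your DBD::mysql build supports it); the
client library should then pick up max_allowed_packet from the [client]
group. A rough sketch, with an illustrative option file path:

use strict;
use DBI;

# ~/.my.cnf (path illustrative) would contain:
#   [client]
#   max_allowed_packet=64M
my $dsn = "DBI:mysql:database=bigtest;host=localhost"
        . ";mysql_read_default_file=$ENV{HOME}/.my.cnf"
        . ";mysql_read_default_group=client";
my $dbh = DBI->connect($dsn, "user", "password", { RaiseError => 1 });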
--
Regards,
__ ___ ___ ____ __
/ |/ /_ __/ __/ __ \/ / Mr. Sinisa Milivojevic
/ /|_/ / // /\ \/ /_/ / /__ MySQL AB, Fulltime Developer
/_/ /_/\_, /___/\___\_\___/ Larnaca, Cyprus
<___/ www.mysql.com
UPDATE with string > 16M characters corrupts MyISAM table
on 11.11.2002 21:15:42 by Michael Widenius
Hi!
>>>>> "Craig" == Craig Cummings writes:
>> From my past experience this results from the packet size exceeding
Craig> MAX_ALLOWED_PACKET. However, this variable is currently set to 67107840,
Craig> far higher than the string length (20700699 characters). (As an aside,
Craig> even though I specify MAX_ALLOWED_PACKET as 2GB in my.cnf, the server
Craig> always reports the 67MB value. Any idea why this happens?)
I tested this with the upcoming MySQL 4.0.5 version:
(/my/mysql-4.0/sql) ./mysqld --max_allowed_packet=1G --help | grep max_allowed
--max_allowed_packet=#
max_allowed_packet 1073740800
(/my/mysql-4.0/sql) ./mysqld --max_allowed_packet=1G &
....
(/my/mysql-4.0/sql) mysqladmin var | grep max_allowed
| max_allowed_packet | 1073740800
In other words, it works for me.
I know that I fixed something with max_allowed_packet in 4.0.4, so an
upgrade may help fix the 64M limit problem.
>> How-to-repeat:
Craig> These bugs are repeatable with any string longer than about 16M
Craig> characters, even after repairing the table.
I have tested records > 16M with the tests/myisam-big-rows.tst SQL
script that can be found in the MySQL source tree.
(I have attached a copy of it to this email.)
Could you please try to modify this to get MyISAM to crash?
If you succeed then just email the test program to bugs@ and we will
take a look at this.
Regards,
Monty
--
For technical support contracts, goto https://order.mysql.com/
__ ___ ___ ____ __
/ |/ /_ __/ __/ __ \/ / Mr. Michael Widenius
/ /|_/ / // /\ \/ /_/ / /__ MySQL AB, CTO
/_/ /_/\_, /___/\___\_\___/ Helsinki, Finland
<___/ www.mysql.com
Re: UPDATE with string > 16M characters corrupts MyISAM table
on 12.02.2003 16:49:17 by Craig Cummings
Hi there,
It's been a while since I originally posted this bug but hopefully
it's not too late to follow up.
First of all, my latest server is version 4.0.5-beta. The
max_allowed_packet variable is now correctly reported by the server as 1
GB. However, I'm still encountering problems (losing the server) when I
try to insert a big string into a longtext field of a MyISAM table,
although now the limit is around 30 MB rather than 16 MB.
I've managed to make myisam-big-rows.tst crash with a little modification.
I have appended three files for your review:
------------------------------------------------------------------------
1) myisam-big-rows.out (The output of your original script)
Logging to file 'myisam-big-rows.out'
mysql> source /usr2/data2/cummings/Traser/myisam-big-rows.tst
Query OK, 0 rows affected (0.00 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.00 sec)
Query OK, 1 row affected (0.00 sec)
Query OK, 1 row affected (0.00 sec)
Query OK, 1 row affected (0.75 sec)
Rows matched: 1 Changed: 1 Warnings: 0
+------------+-------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+-------+----------+----------+
| bigtest.t1 | check | status | OK |
+------------+-------+----------+----------+
1 row in set (0.40 sec)
Query OK, 1 row affected (0.68 sec)
Rows matched: 1 Changed: 1 Warnings: 0
Query OK, 1 row affected (0.00 sec)
Rows matched: 1 Changed: 1 Warnings: 0
Query OK, 1 row affected (0.98 sec)
Rows matched: 1 Changed: 1 Warnings: 0
Query OK, 1 row affected (0.70 sec)
Rows matched: 1 Changed: 1 Warnings: 0
+---+-----------+
| a | length(b) |
+---+-----------+
| 1 | 16778000 |
| 2 | 16777000 |
| 3 | 16778000 |
+---+-----------+
3 rows in set (0.22 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.70 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.72 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.70 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.72 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.72 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.68 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.72 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.70 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.70 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.72 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.90 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.70 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.70 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.68 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.70 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.70 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.70 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.90 sec)
Query OK, 1 row affected (0.28 sec)
Rows matched: 1 Changed: 1 Warnings: 0
Query OK, 1 row affected (0.28 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.65 sec)
Query OK, 1 row affected (1.12 sec)
Rows matched: 1 Changed: 1 Warnings: 0
+----+------------+-----------+
| a | mid(b,1,5) | length(b) |
+----+------------+-----------+
| 1 | GGGGG | 16778000 |
| 2 | EEEEE | 16777000 |
| 3 | HHHHH | 16778000 |
| 4 | JJJJJ | 16777201 |
| 5 | A | 1 |
| 6 | LLLLL | 16777203 |
| 22 | ]]]]] | 17000019 |
| 8 | NNNNN | 16777205 |
| 9 | OOOOO | 16777206 |
| 10 | PPPPP | 16777207 |
| 11 | QQQQQ | 16777208 |
| 12 | RRRRR | 16777209 |
| 13 | SSSSS | 16777210 |
| 14 | TTTTT | 16777211 |
| 15 | UUUUU | 16777212 |
| 16 | VVVVV | 16777213 |
| 17 | WWWWW | 16777214 |
| 18 | XXXXX | 16777215 |
| 19 | YYYYY | 16777216 |
| 20 | ZZZZZ | 16777217 |
| 21 | [[[[[ | 16777218 |
+----+------------+-----------+
21 rows in set (1.18 sec)
+------------+-------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+-------+----------+----------+
| bigtest.t1 | check | status | OK |
+------------+-------+----------+----------+
1 row in set (5.15 sec)
+------------+--------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+--------+----------+----------+
| bigtest.t1 | repair | status | OK |
+------------+--------+----------+----------+
1 row in set (15.42 sec)
+------------+-------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+-------+----------+----------+
| bigtest.t1 | check | status | OK |
+------------+-------+----------+----------+
1 row in set (5.35 sec)
ERROR 1064: You have an error in your SQL syntax. Check the manual that corresponds to your MySQL server version for the right syntax to use near 'table where b<>repeat(mid(b,1,1),length(b))' at line 1
Query OK, 10 rows affected (3.10 sec)
ERROR 1064: You have an error in your SQL syntax. Check the manual that corresponds to your MySQL server version for the right syntax to use near 'table where b<>repeat(mid(b,1,1),length(b))' at line 1
+------------+-------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+-------+----------+----------+
| bigtest.t1 | check | status | OK |
+------------+-------+----------+----------+
1 row in set (3.02 sec)
+------------+--------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+--------+----------+----------+
| bigtest.t1 | repair | status | OK |
+------------+--------+----------+----------+
1 row in set (8.52 sec)
+------------+-------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+-------+----------+----------+
| bigtest.t1 | check | status | OK |
+------------+-------+----------+----------+
1 row in set (2.90 sec)
Query OK, 0 rows affected (0.12 sec)
------------------------------------------------------------------------
2) big-rows-mod.tst (My modified script. I've increased the size of
several strings in 4 MB increments)
#
# Test rows with length above > 16M
# Note that for this to work, you should start mysqld with
# -O max_allowed_packet=32M
#
drop table if exists t1;
create table t1 (a tinyint not null auto_increment, b longblob not null, primary key (a)) checksum=1;
insert into t1 (b) values(repeat(char(65),10));
insert into t1 (b) values(repeat(char(66),10));
insert into t1 (b) values(repeat(char(67),10));
update t1 set b=repeat(char(68),16777216) where a=1;
check table t1;
update t1 set b=repeat(char(69),16777000) where a=2;
update t1 set b=repeat(char(70),167) where a=3;
update t1 set b=repeat(char(71),16778000) where a=1;
update t1 set b=repeat(char(72),32778000) where a=3;
select a,length(b) from t1;
set @a=1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),20777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),24777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),28777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),32777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),36777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),40777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),44777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),48777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),52777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
update t1 set b=('A') where a=5;
delete from t1 where a=7;
set @a=@a+1;
insert into t1 (b) values (repeat(char(73+@a),16777200+@a));
update t1 set b=repeat(char(73+@a+1),17000000+@a) where a=last_insert_id();
select a,mid(b,1,5),length(b) from t1;
check table t1;
repair table t1;
check table t1;
select a from table where b<>repeat(mid(b,1,1),length(b));
delete from t1 where (a & 1);
select a from table where b<>repeat(mid(b,1,1),length(b));
check table t1;
repair table t1;
check table t1;
drop table t1;
------------------------------------------------------------------------
3) big-rows-mod.out (The output from the modified script. Note that
everything goes OK until I attempt to insert a 40 MB string. This throws
an error 12 (the operating system's "Cannot allocate memory") and
everything goes downhill from there.)
Logging to file 'big-rows-modified.out'
mysql> source /usr2/data2/cummings/Traser/big-rows-mod.tst
Query OK, 0 rows affected (0.00 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.00 sec)
Query OK, 1 row affected (0.00 sec)
Query OK, 1 row affected (0.00 sec)
Query OK, 1 row affected (0.72 sec)
Rows matched: 1 Changed: 1 Warnings: 0
+------------+-------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+-------+----------+----------+
| bigtest.t1 | check | status | OK |
+------------+-------+----------+----------+
1 row in set (0.42 sec)
Query OK, 1 row affected (0.68 sec)
Rows matched: 1 Changed: 1 Warnings: 0
Query OK, 1 row affected (0.00 sec)
Rows matched: 1 Changed: 1 Warnings: 0
Query OK, 1 row affected (0.98 sec)
Rows matched: 1 Changed: 1 Warnings: 0
Query OK, 1 row affected (1.60 sec)
Rows matched: 1 Changed: 1 Warnings: 0
+---+-----------+
| a | length(b) |
+---+-----------+
| 1 | 16778000 |
| 2 | 16777000 |
| 3 | 32778000 |
+---+-----------+
3 rows in set (0.42 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (0.77 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (1.08 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (1.27 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (1.50 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (1.75 sec)
Query OK, 0 rows affected (0.00 sec)
Query OK, 1 row affected (1.97 sec)
Query OK, 0 rows affected (0.00 sec)
ERROR 1030: Got error 12 from table handler
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 1 row affected (0.33 sec)
Rows matched: 1 Changed: 1 Warnings: 0
Query OK, 1 row affected (0.45 sec)
Query OK, 0 rows affected (0.00 sec)
ERROR 1062: Duplicate entry '10' for key 1
Query OK, 0 rows affected (0.00 sec)
Rows matched: 0 Changed: 0 Warnings: 0
+---+------------+-----------+
| a | mid(b,1,5) | length(b) |
+---+------------+-----------+
| 1 | GGGGG | 16778000 |
| 2 | EEEEE | 16777000 |
| 3 | HHHHH | 32778000 |
| 4 | JJJJJ | 16777201 |
| 5 | A | 1 |
| 6 | LLLLL | 24777203 |
| 8 | NNNNN | 32777205 |
| 9 | OOOOO | 36777206 |
+---+------------+-----------+
8 rows in set (0.68 sec)
+------------+-------+----------+----------------------------------------------------------------+
| Table      | Op    | Msg_type | Msg_text                                                       |
+------------+-------+----------+----------------------------------------------------------------+
| bigtest.t1 | check | warning  | Table is marked as crashed                                     |
| bigtest.t1 | check | error    | Found key at page 1024 that points to record outside datafile  |
| bigtest.t1 | check | error    | Corrupt                                                        |
+------------+-------+----------+----------------------------------------------------------------+
3 rows in set (0.00 sec)
+------------+--------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+--------+----------+----------+
| bigtest.t1 | repair | status | OK |
+------------+--------+----------+----------+
1 row in set (7.90 sec)
+------------+-------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+-------+----------+----------+
| bigtest.t1 | check | status | OK |
+------------+-------+----------+----------+
1 row in set (2.88 sec)
ERROR 1064: You have an error in your SQL syntax. Check the manual that corresponds to your MySQL server version for the right syntax to use near 'table where b<>repeat(mid(b,1,1),length(b))' at line 1
Query OK, 4 rows affected (1.70 sec)
ERROR 1064: You have an error in your SQL syntax. Check the manual that corresponds to your MySQL server version for the right syntax to use near 'table where b<>repeat(mid(b,1,1),length(b))' at line 1
+------------+-------+----------+----------+
| Table | Op | Msg_type | Msg_text |
+------------+-------+----------+----------+
| bigtest.t1 | check | status | OK |
+------------+-------+----------+----------+
1 row in set (1.45 sec)
+------------+--------+----------+----------------------+
| Table | Op | Msg_type | Msg_text |
+------------+--------+----------+----------------------+
| bigtest.t1 | repair | error | 12 when fixing table |
| bigtest.t1 | repair | status | Operation failed |
+------------+--------+----------+----------------------+
2 rows in set (3.52 sec)
+------------+-------+----------+----------------------------------------------------+
| Table      | Op    | Msg_type | Msg_text                                           |
+------------+-------+----------+----------------------------------------------------+
| bigtest.t1 | check | warning  | Table is marked as crashed and last repair failed  |
| bigtest.t1 | check | error    | Found 3 keys of 2                                  |
| bigtest.t1 | check | error    | Corrupt                                            |
+------------+-------+----------+----------------------------------------------------+
3 rows in set (0.02 sec)
Query OK, 0 rows affected (0.12 sec)
------------------------------------------------------------------------
Hopefully, this pinpoints the source of the problem. I appreciate any
advice you may have.
Best regards,
Craig
On Mon, 11 Nov 2002, Michael Widenius wrote:
>
> Hi!
>
> >>>>> "Craig" == Craig Cummings writes:
>
>
>
>
> >> From my past experience this results from the packet size exceeding
> Craig> MAX_ALLOWED_PACKET. However, this variable is currently set to 67107840,
> Craig> far higher than the string length (20700699 characters). (As an aside,
> Craig> even though I specify MAX_ALLOWED_PACKET as 2GB in my.cnf, the server
> Craig> always reports the 67MB value. Any idea why this happens?)
>
> I tested this with the upcoming MySQL 4.0.5 version:
>
> (/my/mysql-4.0/sql) ./mysqld --max_allowed_packet=1G --help | grep max_allowed
> --max_allowed_packet=#
> max_allowed_packet 1073740800
> (/my/mysql-4.0/sql) ./mysqld --max_allowed_packet=1G &
> ...
> (/my/mysql-4.0/sql) mysqladmin var | grep max_allowed
> | max_allowed_packet | 1073740800
>
> In other words, it works for me.
>
> I know that I fixed something with max_allowed_packet in 4.0.4, so an
> upgrade may help fix the 64M limit problem.
>
>
>
> >> How-to-repeat:
> Craig> These bugs are repeatable with any string longer than about 16M
> Craig> characters, even after repairing the table.
>
> I have tested records > 16M with the tests/myisam-big-rows.tst SQL
> script that can be found in the MySQL source tree.
> (I have attached a copy of it to this email.)
>
> Could you please try to modify this to get MyISAM to crash?
> If you succeed then just email the test program to bugs@ and we will
> take a look at this.
>
> Regards,
> Monty
>
> --
> For technical support contracts, goto https://order.mysql.com/
> __ ___ ___ ____ __
> / |/ /_ __/ __/ __ \/ / Mr. Michael Widenius
> / /|_/ / // /\ \/ /_/ / /__ MySQL AB, CTO
> /_/ /_/\_, /___/\___\_\___/ Helsinki, Finland
> <___/ www.mysql.com
>
>
>
Craig Cummings, Ph.D.
Relman Laboratory
Stanford University School of Medicine
Department of Microbiology and Immunology
e-mail: cummings@cmgm.stanford.edu
phone: 650-498-5998
fax: 650-852-3291
Re: UPDATE with string > 16M characters corrupts MyISAM table
on 12.02.2003 17:27:52 by Alexander Keremidarski
Craig,
Craig Cummings wrote:
> Hi there,
>
> It's been a while since I originally posted this bug but hopefully
> it's not too late to follow up.
>
> First of all, my latest server is version 4.0.5-beta. The
> max_allowed_packet variable is now correctly reported by the server as 1
> GB. However, I'm still encountering problems (losing the server) when I
> try to insert a big string into a longtext field of a MyISAM table,
> although now the limit is around 30 MB rather than 16 MB.
Please upgrade to 4.0.10.
There is a long list of bugs fixed between 4.0.5 and 4.0.10, including
fixes related to BLOB/TEXT values > 16MB.
For the full list of changes, refer to
http://www.mysql.com/documentation/mysql/bychapter/manual_News.html#News-4.0.10
> I've managed to make myisam-big-rows.tst crash with a little modification.
> I have appended three files for your review:
> 2) big-rows-mod.tst (My modified script. I've increased the size of
> several strings in 4 MB increments)
I have just tested your modified script with the latest 4.0 build
(post-4.0.10), after a couple of small changes (s/from table/from t1/),
and got a different result, as follows:
Table Op Msg_type Msg_text
bugs.t1 check status OK
a length(b)
1 16778000
2 16777000
3 32778000
a mid(b,1,5) length(b)
1 GGGGG 16778000
2 EEEEE 16777000
3 HHHHH 32778000
4 JJJJJ 16777201
5 A 1
6 LLLLL 24777203
22 ]]]]] 17000019
8 NNNNN 32777205
9 OOOOO 36777206
10 PPPPP 40777207
11 QQQQQ 44777208
12 RRRRR 48777209
13 SSSSS 52777210
14 TTTTT 16777211
15 UUUUU 16777212
16 VVVVV 16777213
17 WWWWW 16777214
18 XXXXX 16777215
19 YYYYY 16777216
20 ZZZZZ 16777217
21 [[[[[ 16777218
Table Op Msg_type Msg_text
bugs.t1 check status OK
Table Op Msg_type Msg_text
bugs.t1 repair status OK
Table Op Msg_type Msg_text
bugs.t1 check status OK
Table Op Msg_type Msg_text
bugs.t1 check status OK
Table Op Msg_type Msg_text
bugs.t1 repair status OK
Table Op Msg_type Msg_text
bugs.t1 check status OK
Best regards
--
MySQL Conference and Expo 2003 http://www.mysql.com/events/uc2003/index.html
For technical support contracts, visit https://order.mysql.com/?ref=msal
__ ___ ___ ____ __
/ |/ /_ __/ __/ __ \/ / Mr. Alexander Keremidarski
/ /|_/ / // /\ \/ /_/ / /__ MySQL AB, Full-Time Developer
/_/ /_/\_, /___/\___\_\___/ Sofia, Bulgaria
<___/ www.mysql.com
Re: UPDATE with string > 16M characters corrupts MyISAM table
on 12.02.2003 17:47:51 by Sinisa Milivojevic
Craig Cummings writes:
>
> Hi there,
>
> It's been a while since I originally posted this bug but hopefully
> it's not too late to follow up.
>
> First of all, my latest server is version 4.0.5-beta. The
> max_allowed_packet variable is now correctly reported by the server as 1
> GB. However, I'm still encountering problems (losing the server) when I
> try to insert a big string into a longtext field of a MyISAM table,
> although now the limit is around 30 MB rather than 16 MB.
>
> I've managed to make myisam-big-rows.tst crash with a little modification.
> I have appended three files for your review:
>
[skip]
> Craig Cummings, Ph.D.
>
> Relman Laboratory
> Stanford University School of Medicine
> Department of Microbiology and Immunology
>
> e-mail: cummings@cmgm.stanford.edu
> phone: 650-498-5998
> fax: 650-852-3291
>
Hi!
You were simply hitting one of a series of bugs that caused problems
with insertion of large BLOB/TEXT columns; these were fixed in versions
4.0.8, 4.0.9 and 4.0.10.
This is the output from my 4.0.10:
t1 check status OK
a length(b)
1 16778000
2 16777000
3 32778000
a mid(b,1,5) length(b)
1 GGGGG 16778000
2 EEEEE 16777000
3 HHHHH 32778000
4 JJJJJ 16777201
5 A 1
6 LLLLL 24777203
22 ]]]]] 17000019
8 NNNNN 32777205
9 OOOOO 36777206
10 PPPPP 40777207
11 QQQQQ 44777208
12 RRRRR 48777209
13 SSSSS 52777210
14 TTTTT 16777211
15 UUUUU 16777212
16 VVVVV 16777213
17 WWWWW 16777214
18 XXXXX 16777215
19 YYYYY 16777216
20 ZZZZZ 16777217
21 [[[[[ 16777218
Table Op Msg_type Msg_text
bug.t1 check status OK
I could not go beyond the last point, as I do not have enough memory
on my PC to run it.
--
__ ___ ___ ____ __
/ |/ /_ __/ __/ __ \/ / Mr. Sinisa Milivojevic
/ /|_/ / // /\ \/ /_/ / /__ MySQL AB, Fulltime Developer
/_/ /_/\_, /___/\___\_\___/ Larnaca, Cyprus
<___/ www.mysql.com
Join MySQL Users Conference and Expo:
http://www.mysql.com/events/uc2003/
Re: UPDATE with string > 16M characters corrupts MyISAM table
on 12.02.2003 18:03:13 by Craig Cummings
Hi Alexander and Sinisa,
Thanks for your advice and heroic bug fixing effort over the last
two months. I'm glad to see I'm not the only one running into these
problems. FYI, the huge strings that are causing all these problems are
human chromosomal sequences. More and more very large sequences become
available every day and there's a community of bioinformatics people out
there who want to get them into a database. MySQL is very popular among
this community, so I'm glad you are making the effort to accommodate this
difficult data.
I'll upgrade to 4.0.10 and see how it goes. Incidentally, I
noticed in the 4.0.11 changelog the following entry: "Fixed bug in
updating BLOB columns with long strings". Should I wait until this
version is released? If so, when will it be available?
Best regards,
Craig Cummings
On Wed, 12 Feb 2003, Alexander Keremidarski wrote:
>
> Please upgrade to 4.0.10.
> There is a long list of bugs fixed between 4.0.5 and 4.0.10, including
> fixes related to BLOB/TEXT values > 16MB.
>
>
> For the full list of changes, refer to
> http://www.mysql.com/documentation/mysql/bychapter/manual_News.html#News-4.0.10
Re: UPDATE with string > 16M characters corrupts MyISAM table
on 12.02.2003 19:26:53 by Sinisa Milivojevic
Craig Cummings writes:
> Hi Alexander and Sinisa,
>
> Thanks for your advice and heroic bug fixing effort over the last
> two months. I'm glad to see I'm not the only one running into these
> problems. FYI, the huge strings that are causing all these problems are
> human chromosomal sequences. More and more very large sequences become
> available every day and there's a community of bioinformatics people out
> there who want to get them into a database. MySQL is very popular among
> this community so I'm glad you are making the effort to accomodate this
> difficult data.
>
> I'll upgrade to 4.0.10 and see how it goes. Incidentally, I
> noticed in the 4.0.11 changelog the following entry: "Fixed bug in
> updating BLOB columns with long strings". Should I wait until this
> version is released? If so, when will it be available?
>
> Best regards,
>
> Craig Cummings
Yes, you are right!
That is the last bug fix that can affect huge BLOBs, but only when they
are very close to a size divisible by 16M.
4.0.11 should be out in two weeks.
--
__ ___ ___ ____ __
/ |/ /_ __/ __/ __ \/ / Mr. Sinisa Milivojevic
/ /|_/ / // /\ \/ /_/ / /__ MySQL AB, Fulltime Developer
/_/ /_/\_, /___/\___\_\___/ Larnaca, Cyprus
<___/ www.mysql.com
Join MySQL Users Conference and Expo:
http://www.mysql.com/events/uc2003/