backup subroutine

on 11.09.2003 20:48:09 by case510

Hello,

(Sorry if this message is duplicated)
 
Please, what is the most painless way to create a
MySQL table backup?

I used to do it as described below, until my ISP
disabled the script because it ate all system
resources when I created a backup of an 80MB table:

1. prepare and execute "SELECT * FROM `$table`"
2. fetch rows one by one and append "INSERT
INTO $table VALUES..." statements to a variable
3. when the loop is finished, print this variable
into a file

It works fine on relatively small tables, but the 80MB
one made both mysql and Apache go away.

Any suggestion?

Thanks,
Alex G.


__________________________________
Do you Yahoo!?
Yahoo! SiteBuilder - Free, easy-to-use web site design software
http://sitebuilder.yahoo.com

--
MySQL Perl Mailing List
For list archives: http://lists.mysql.com/perl
To unsubscribe: http://lists.mysql.com/perl?unsub=gcdmp-msql-mysql-modules@m.gmane.org

Re: backup subroutine

on 11.09.2003 21:09:36 by Rudy Lippan

On Thu, 11 Sep 2003, case510@yahoo.com wrote:

> Date: Thu, 11 Sep 2003 11:48:09 -0700 (PDT)
> From: "case510@yahoo.com"
> To: perl@lists.mysql.com
> Subject: backup subroutine
>
> Hello,
>
> (Sorry if this message is duplicated)
>  
> Please, which way to create mysql table backup is the
> most painless?
>


The mysqldump command will probably do what you want.

Rudy




Re: backup subroutine

on 11.09.2003 21:10:33 by Greg Meckes

Do you have access to mysqldump?

If so, maybe you can dump the data and the INSERT statements into one
file, as in:

mysqldump --user=user -p -t -e -c databasename tablename >> file.txt

Then you can restore all the data if need be (note that the mysql
client takes the database name, not the table name):
mysql --user=user -p databasename < file.txt

Greg


--- "case510@yahoo.com" wrote:
> Hello,
>
> (Sorry if this message is duplicated)
>
> Please, which way to create mysql table backup is the
> most painless?
>
> I used to do that as described below until my ISP
> disabled the script because it
> ate all system resource when I managed to create a
> backup of 80MB table:
>
> 1. prepare and execute "SELECT * FROM `$table`"
> 2. fetch rows one by one and store queries "INSERT
> INTO $table VALUES..." into a
> variable.
> 3. When the cycle is finished, print this variable
> into a file
>
> It works fine on relatively small tables. But 80MB one
> caused both mysql and
> Apache have gone away.
>
> Any suggestion?
>
> Thanks,
> Alex G.





RE: backup subroutine

on 11.09.2003 21:37:25 by case510

Sorry, I probably forgot to say that this is a
subroutine in a CGI script. I will try running
`mysqldump ...`.
Do you know how it handles InnoDB tables with
foreign keys? In my case I could analyse the data and
generate the table dump so that the tables with
foreign keys are restored without problems.

Alex G.
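[Editor's note: the backtick call mentioned above might look roughly like this in a Perl CGI subroutine. This is only a sketch; the database name, credentials, and output path are placeholders, not values from the thread, and a real script should avoid passing the password on the command line.]

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical example: shell out to mysqldump instead of building the
# dump in Perl. All arguments here are placeholders.
sub dump_table {
    my ($db, $table, $user, $pass, $outfile) = @_;

    # -t -e -c match the options suggested earlier in the thread;
    # redirecting to a file keeps the data out of the CGI process.
    my $cmd = "mysqldump --user=$user --password=$pass -t -e -c "
            . "$db $table > $outfile 2>&1";

    system($cmd) == 0
        or die "mysqldump failed with status $?";
}
```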


--- "Kangara, Madhu" wrote:
>
> use mysqldump to create a backup of table/database
> to a flat file
>
> $mysqldump --help to get all help details
> Madhu
> -----Original Message-----
> From: case510@yahoo.com [mailto:case510@yahoo.com]
> Sent: Thursday, September 11, 2003 11:48 AM
> To: perl@lists.mysql.com
> Subject: backup subroutine
>
>
> Hello,
>
> (Sorry if this message is duplicated)
>  
> Please, which way to create mysql table backup is
> the
> most painless?
>
> I used to do that as described below until my ISP
> disabled the script because it
> ate all system resource when I managed to create a
> backup of 80MB table:
>
> 1. prepare and execute "SELECT * FROM `$table`"
> 2. fetch rows one by one and store queries "INSERT
> INTO $table VALUES..." into a
> variable.
> 3. When the cycle is finished, print this variable
> into a file
>
> It works fine on relatively small tables. But 80MB
> one
> caused both mysql and
> Apache have gone away.
>
> Any suggestion?
>
> Thanks,
> Alex G.
>
>




Re: backup subroutine

on 11.09.2003 21:52:58 by Ulrich Borchers

Why store _all_ queries in a variable first?
Why not open a file first, then write the
queries to it one by one, line by line, and
then close it? You don't need that temporary
variable at all.
This would at least eliminate your memory
consumption problem; whether it explains apache
and mysqld going away, I am not certain.

Since you are running this from a CGI script,
you might of course also run into a timeout
trap (Timeout in httpd.conf). The default value
for Apache is 300 seconds, I think.

Uli
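[Editor's note: the suggestion above, writing each row out as it is fetched instead of accumulating a variable, could be sketched like this. Connection parameters, table and file names are made up for illustration.]

```perl
use strict;
use warnings;
use DBI;

# Sketch only: database name, credentials, and paths are placeholders.
my $dbh = DBI->connect("DBI:mysql:database=mydb", "user", "pass",
                       { RaiseError => 1 });

my $table = "mytable";
open my $fh, '>', 'backup.sql' or die "open: $!";

my $sth = $dbh->prepare("SELECT * FROM `$table`");
$sth->execute;

# Write one INSERT per row as soon as it arrives. Nothing accumulates
# in a Perl variable, so client memory stays flat for any table size.
while (my @row = $sth->fetchrow_array) {
    my $values = join ',', map { $dbh->quote($_) } @row;
    print $fh "INSERT INTO `$table` VALUES ($values);\n";
}

close $fh or die "close: $!";
$dbh->disconnect;
```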


On 11 Sep 2003 at 12:10, Greg Meckes wrote:

> Do you have access to mysqldump?
>
> If so, maybe you can dump the data, and the insert statements into one
> file as in:
>
> mysqldump --user=user -p -t -e -c databasename tablename >> file.txt
>
> Then you can restore all the data if need be:
> mysql --user=user -p tablename < file.txt
>
> Greg
>
>
> --- "case510@yahoo.com" wrote:
> > Hello,
> >
> > (Sorry if this message is duplicated)
> >
> > Please, which way to create mysql table backup is the
> > most painless?
> >
> > I used to do that as described below until my ISP
> > disabled the script because it
> > ate all system resource when I managed to create a
> > backup of 80MB table:
> >
> > 1. prepare and execute "SELECT * FROM `$table`"
> > 2. fetch rows one by one and store queries "INSERT
> > INTO $table VALUES..." into a
> > variable.
> > 3. When the cycle is finished, print this variable
> > into a file
> >
> > It works fine on relatively small tables. But 80MB one
> > caused both mysql and
> > Apache have gone away.
> >
> > Any suggestion?
> >
> > Thanks,
> > Alex G.


--
Dipl.-Inf. Ulrich Borchers
MEGABIT Informationstechnik GmbH
Karstr. 25, 41068 Mönchengladbach
Tel. +49 2161 30898-0





Re: backup subroutine

on 11.09.2003 22:15:37 by case510

Yes, this is better, I think.
But the problem with memory consumption persists when
I "SELECT * FROM $table": the result is stored
somewhere by DBI or DBD::mysql until it's fetched from
there.

Another solution may be to count the rows first and,
if there are millions of them, add LIMIT n,m and
execute the SELECT a few times. It sounds quite
complicated.
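
[Editor's note: the LIMIT idea could be sketched roughly as below. The chunk size is arbitrary, and $dbh, $table, and $fh are assumed from the surrounding discussion. Without an ORDER BY, LIMIT windows are not guaranteed to be stable while the table changes.]

```perl
# Sketch of the LIMIT n,m approach: fetch the table in fixed-size
# chunks so only one chunk is ever held on the client side.
my $chunk  = 10_000;   # arbitrary chunk size
my $offset = 0;

while (1) {
    my $rows = $dbh->selectall_arrayref(
        "SELECT * FROM `$table` LIMIT $offset, $chunk");
    last unless @$rows;   # no rows left: done

    for my $row (@$rows) {
        my $values = join ',', map { $dbh->quote($_) } @$row;
        print $fh "INSERT INTO `$table` VALUES ($values);\n";
    }
    $offset += $chunk;
}
```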

--- Ulrich Borchers wrote:
> why store _all_ queries in a variable first ?
> why not opening a file first, then writing
> query by query / line by line to the file and
> then close it ? you don't need that temporary
> variable at all.
> this would at least eliminate your memory
> consumption problem. but if that make apache
> and mysqld go away, I am not certain.
>
> Since you are running this from a CGI-script,
> you might of course also run into a Timeout
> trap (httpd.conf). The default value for
> Apache ist 300 seconds I think.
>
> Uli
>
>
> On 11 Sep 2003 at 12:10, Greg Meckes wrote:
>
> > Do you have access to mysqldump?
> >
> > If so, maybe you can dump the data, and the insert
> statements into one
> > file as in:
> >
> > mysqldump --user=user -p -t -e -c databasename
> tablename >> file.txt
> >
> > Then you can restore all the data if need be:
> > mysql --user=user -p tablename < file.txt
> >
> > Greg
> >
> >
> > --- "case510@yahoo.com" wrote:
> > > Hello,
> > >
> > > (Sorry if this message is duplicated)
> > >
> > > Please, which way to create mysql table backup
> is the
> > > most painless?
> > >
> > > I used to do that as described below until my
> ISP
> > > disabled the script because it
> > > ate all system resource when I managed to create
> a
> > > backup of 80MB table:
> > >
> > > 1. prepare and execute "SELECT * FROM `$table`"
> > > 2. fetch rows one by one and store queries
> "INSERT
> > > INTO $table VALUES..." into a
> > > variable.
> > > 3. When the cycle is finished, print this
> variable
> > > into a file
> > >
> > > It works fine on relatively small tables. But
> 80MB one
> > > caused both mysql and
> > > Apache have gone away.
> > >
> > > Any suggestion?
> > >
> > > Thanks,
> > > Alex G.




Re: backup subroutine

on 11.09.2003 22:29:47 by Rudy Lippan

On Thu, 11 Sep 2003, case510@yahoo.com wrote:

> Date: Thu, 11 Sep 2003 13:15:37 -0700 (PDT)
> From: "case510@yahoo.com"
> To: perl@lists.mysql.com
> Subject: Re: backup subroutine
>
> Yes, this is better I think.
> But the problem with memory consumption persists when
> I "SELECT * FROM $table". The result is stored
> somewhere by DBI or DBI::mysql until it's fetched from
> there.
>
> Other solution may be to count rows first, then
> estimate if there are millions of rows add LIMIT n,m
> and execute SELECT few times. It sounds quite
> complicated.

Or you can use mysql_use_result so that DBD::mysql does not
store the results of the query client side.

Rudy
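
[Editor's note: mysql_use_result is a per-statement attribute in DBD::mysql; a minimal sketch, assuming $dbh and $table as in the thread.]

```perl
# With mysql_use_result enabled, rows are streamed from the server as
# you fetch them, instead of the whole result set being buffered on
# the client first (the default, mysql_store_result behaviour).
my $sth = $dbh->prepare("SELECT * FROM `$table`",
                        { mysql_use_result => 1 });
$sth->execute;

while (my @row = $sth->fetchrow_array) {
    # process each row; the full result set never sits in client memory
}
```

One caveat worth knowing: while such a statement handle is active, the rows remain pending on the server, so the dump should fetch to completion promptly.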





Re: backup subroutine

on 11.09.2003 23:01:59 by Ulrich Borchers

If you prefer the mysqldump solution (if you are allowed to execute it ;-) then
don't forget "-q":

`mysqldump -u user --password=pass -q database table > file`

"-q" disables output buffering; otherwise mysqldump will consume
a lot of memory too!

Uli


On 11 Sep 2003 at 13:15, case510@yahoo.com wrote:

> Yes, this is better I think.
> But the problem with memory consumption persists when
> I "SELECT * FROM $table". The result is stored
> somewhere by DBI or DBI::mysql until it's fetched from
> there.
>
> Other solution may be to count rows first, then
> estimate if there are millions of rows add LIMIT n,m
> and execute SELECT few times. It sounds quite
> complicated.
>
> --- Ulrich Borchers wrote:
> > why store _all_ queries in a variable first ?
> > why not opening a file first, then writing
> > query by query / line by line to the file and
> > then close it ? you don't need that temporary
> > variable at all.
> > this would at least eliminate your memory
> > consumption problem. but if that make apache
> > and mysqld go away, I am not certain.
> >
> > Since you are running this from a CGI-script,
> > you might of course also run into a Timeout
> > trap (httpd.conf). The default value for
> > Apache ist 300 seconds I think.
> >
> > Uli
> >
> >
> > On 11 Sep 2003 at 12:10, Greg Meckes wrote:
> >
> > > Do you have access to mysqldump?
> > >
> > > If so, maybe you can dump the data, and the insert
> > statements into one
> > > file as in:
> > >
> > > mysqldump --user=user -p -t -e -c databasename
> > tablename >> file.txt
> > >
> > > Then you can restore all the data if need be:
> > > mysql --user=user -p tablename < file.txt
> > >
> > > Greg
> > >
> > >
> > > --- "case510@yahoo.com" wrote:
> > > > Hello,
> > > >
> > > > (Sorry if this message is duplicated)
> > > >
> > > > Please, which way to create mysql table backup
> > is the
> > > > most painless?
> > > >
> > > > I used to do that as described below until my
> > ISP
> > > > disabled the script because it
> > > > ate all system resource when I managed to create
> > a
> > > > backup of 80MB table:
> > > >
> > > > 1. prepare and execute "SELECT * FROM `$table`"
> > > > 2. fetch rows one by one and store queries
> > "INSERT
> > > > INTO $table VALUES..." into a
> > > > variable.
> > > > 3. When the cycle is finished, print this
> > variable
> > > > into a file
> > > >
> > > > It works fine on relatively small tables. But
> > 80MB one
> > > > caused both mysql and
> > > > Apache have gone away.
> > > >
> > > > Any suggestion?
> > > >
> > > > Thanks,
> > > > Alex G.
> > > >
>







Re: backup subroutine

am 11.09.2003 23:06:00 von case510

If I understand correctly, when mysql_use_result is 1, the
result is not sent to the client until it is fetched.
Where is it stored then?


--- Rudy Lippan wrote:
> On Thu, 11 Sep 2003, case510@yahoo.com wrote:
>
> > Date: Thu, 11 Sep 2003 13:15:37 -0700 (PDT)
> > From: "case510@yahoo.com"
> > To: perl@lists.mysql.com
> > Subject: Re: backup subroutine
> >
> > Yes, this is better I think.
> > But the problem with memory consumption persists when
> > I "SELECT * FROM $table". The result is stored
> > somewhere by DBI or DBD::mysql until it's fetched from
> > there.
> >
> > Another solution may be to count the rows first, and
> > if there are millions of rows, add LIMIT n,m
> > and execute the SELECT a few times. It sounds quite
> > complicated.
>
> Or you can use mysql_use_result so that DBD::mysql
> does not
> store the results of the query client side.
>
> Rudy
>
>
>
> --
> MySQL Perl Mailing List
> For list archives: http://lists.mysql.com/perl
> To unsubscribe:
> http://lists.mysql.com/perl?unsub=case510@yahoo.com
>



--
MySQL Perl Mailing List
For list archives: http://lists.mysql.com/perl
To unsubscribe: http://lists.mysql.com/perl?unsub=gcdmp-msql-mysql-modules@m.gmane.org
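On the "where is it stored" question: broadly speaking, with mysql_use_result the result set is never materialized on the client side; the server streams rows over the connection as the client fetches them, so client memory use stays bounded (the server still has to execute the query and hold the result open until all rows are read). The mysql command-line client exposes the same behavior through its --quick option. A minimal sketch that only assembles such a command; the database and table names are placeholders:

```shell
#!/bin/sh
# --quick makes the mysql client use mysql_use_result instead of
# mysql_store_result: rows are streamed off the connection as they
# are fetched instead of being buffered client-side first.
DB="mydb"        # placeholder database name
TABLE="mytable"  # placeholder table name

build_dump_cmd() {
    # -B (batch mode) prints tab-separated rows suitable for redirection
    echo "mysql --quick -B -e 'SELECT * FROM $TABLE' $DB"
}

build_dump_cmd
```

Printing the command rather than executing it keeps the sketch self-contained; in a real backup script you would run it and redirect stdout to a file.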

Re: backup subroutine

am 11.09.2003 23:16:15 von case510

As I read in its --help output, mysqldump may pick up
parameters stored in my.ini, so a CGI script that shells
out to mysqldump is tied to a particular database.

By the way, how does it dump the data of InnoDB tables
with foreign keys?

--- Ulrich Borchers wrote:
> If you prefer the mysqldump solution (if nobody
> may execute it ;-) then
> don't forget "-q"
>
> `mysqldump -u user --password=pass -q database table
> > file`
>
> "-q" disables output buffering. otherwise mysqldump
> will consume
> a lot of memory too !
>
> Uli
>
>
> On 11 Sep 2003 at 13:15, case510@yahoo.com wrote:
>
> > Yes, this is better I think.
> > But the problem with memory consumption persists when
> > I "SELECT * FROM $table". The result is stored
> > somewhere by DBI or DBD::mysql until it's fetched from
> > there.
> >
> > Another solution may be to count the rows first, and
> > if there are millions of rows, add LIMIT n,m
> > and execute the SELECT a few times. It sounds quite
> > complicated.
> >
> > --- Ulrich Borchers wrote:
> > > why store _all_ queries in a variable first ?
> > > why not open a file first, then write
> > > query by query / line by line to the file and
> > > then close it? you don't need that temporary
> > > variable at all.
> > > this would at least eliminate your memory
> > > consumption problem, but whether that keeps apache
> > > and mysqld from going away, I am not certain.
> > >
> > > Since you are running this from a CGI-script,
> > > you might of course also run into a Timeout
> > > trap (httpd.conf). The default value for
> > > Apache is 300 seconds, I think.
> > >
> > > Uli
> > >
> > >
> > > On 11 Sep 2003 at 12:10, Greg Meckes wrote:
> > >
> > > > Do you have access to mysqldump?
> > > >
> > > > If so, maybe you can dump the data, and the
> insert
> > > statements into one
> > > > file as in:
> > > >
> > > > mysqldump --user=user -p -t -e -c databasename
> > > tablename >> file.txt
> > > >
> > > > Then you can restore all the data if need be:
> > > > mysql --user=user -p tablename < file.txt
> > > >
> > > > Greg
> > > >
> > > >
> > > > --- "case510@yahoo.com"
> wrote:
> > > > > Hello,
> > > > >
> > > > > (Sorry if this message is duplicated)
> > > > >
> > > > > Please, which way to create mysql table
> backup
> > > is the
> > > > > most painless?
> > > > >
> > > > > I used to do that as described below until
> my
> > > ISP
> > > > > disabled the script because it
> > > > > ate all system resource when I managed to
> create
> > > a
> > > > > backup of 80MB table:
> > > > >
> > > > > 1. prepare and execute "SELECT * FROM
> `$table`"
> > > > > 2. fetch rows one by one and store queries
> > > "INSERT
> > > > > INTO $table VALUES..." into a
> > > > > variable.
> > > > > 3. When the cycle is finished, print this
> > > variable
> > > > > into a file
> > > > >
> > > > > It works fine on relatively small tables.
> But
> > > 80MB one
> > > > > caused both mysql and
> > > > > Apache have gone away.
> > > > >
> > > > > Any suggestion?
> > > > >
> > > > > Thanks,
> > > > > Alex G.
> > > > >
> > > > >
> > > > > __________________________________
> > > > > Do you Yahoo!?
> > > > > Yahoo! SiteBuilder - Free, easy-to-use web
> site
> > > design software
> > > > > http://sitebuilder.yahoo.com
> > > > >
> > > > > --
> > > > > MySQL Perl Mailing List
> > > > > For list archives:
> http://lists.mysql.com/perl
> > > > > To unsubscribe:
> > > >
> > >
> >
>
http://lists.mysql.com/perl?unsub=gregmeckes@yahoo.com
> > > > >
> > > >
> > > >
> > > >
> > > > __________________________________
> > > > Do you Yahoo!?
> > > > Yahoo! SiteBuilder - Free, easy-to-use web
> site
> > > design software
> > > > http://sitebuilder.yahoo.com
> > > >
> > > > --
> > > > MySQL Perl Mailing List
> > > > For list archives: http://lists.mysql.com/perl
> > > > To unsubscribe:
> > > http://lists.mysql.com/perl?unsub=ubo1@gmx.de
> > > >
> > >
> > >
> > > --
> > > Dipl.-Inf. Ulrich Borchers
> > > MEGABIT Informationstechnik GmbH
> > > Karstr. 25, 41068 Mönchengladbach
> > > Tel. +49 2161 30898-0
> > >
> > >
> > >
> > > --
> > > MySQL Perl Mailing List
> > > For list archives: http://lists.mysql.com/perl
> > > To unsubscribe:
> > >
> http://lists.mysql.com/perl?unsub=case510@yahoo.com
> > >
> >
> >
> > __________________________________
> > Do you Yahoo!?
> > Yahoo! SiteBuilder - Free, easy-to-use web site
> design software
> > http://sitebuilder.yahoo.com
> >
> > --
> > MySQL Perl Mailing List
> > For list archives: http://lists.mysql.com/perl
> > To unsubscribe:
> http://lists.mysql.com/perl?unsub=ubo1@gmx.de
> >
>
>
> --
> Dipl.-Inf. Ulrich Borchers
> MEGABIT Informationstechnik GmbH
> Karstr. 25, 41068 Mönchengladbach
> Tel. +49 2161 30898-0
>
>



--
MySQL Perl Mailing List
For list archives: http://lists.mysql.com/perl
To unsubscribe: http://lists.mysql.com/perl?unsub=gcdmp-msql-mysql-modules@m.gmane.org
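The my.ini coupling can be sidestepped: like the other MySQL clients, mysqldump accepts --no-defaults (which must be the first option) to skip option files entirely, so every connection parameter is given explicitly. Combined with -q this keeps the dump both explicit and memory-friendly. A sketch that only assembles the command; host, user, password, database and table are placeholder values:

```shell
#!/bin/sh
# --no-defaults (must come first) stops mysqldump from reading my.ini
# or other option files; -q streams rows instead of buffering whole
# tables in memory.
HOST="db.example.com"   # placeholder host
USER="backup"           # placeholder user name
PASS="secret"           # placeholder; keep real credentials out of CGI code
DB="mydb"               # placeholder database
TABLE="mytable"         # placeholder table

build_backup_cmd() {
    echo "mysqldump --no-defaults -q -h $HOST -u $USER --password=$PASS $DB $TABLE"
}

build_backup_cmd
```

In practice you would run the assembled command and redirect its output to a file, e.g. appending "> mytable.sql", as in the quoted examples.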
