Looking for ways to prevent timeout

on 05.11.2007 03:11:48 by Jon Westcot

Hi all:

    I'm hoping to find a solution to the problem I'm having with my
script timing out while inserting records into a table.

    Overall, the process is pretty fast, which is impressive, but when
it gets to the 22,000 to 23,000 record mark, it seems to time out. I've
had it get up over 26,000 so far, but nothing better than that. And I
only need to process around 30,000 right now.

    I've tried setting max_execution_time to 1800; no improvement. The
value for max_input_time is -1, which, if I understood it correctly, is
the same as saying no limit. And I've tried calling set_time_limit()
with both 0 and with 360, none of which seemed to help.

    Is there ANY WAY to increase the amount of time I can use when
running a script that will work? I've tried everything I can find in
the PHP manual.

Any help you can provide will be greatly appreciated!

Jon

Re: Looking for ways to prevent timeout

on 05.11.2007 03:28:32 by Jochem Maas

Jon Westcot wrote:
> Hi all:
>
> I'm hoping to find a solution to the problem I'm having with my script timing out while inserting records into a table.
>
> Overall, the process is pretty fast, which is impressive, but when it gets to the 22,000 to 23,000 record mark, it seems to time out. I've had it get up over 26,000 so far, but nothing better than that. And I only need to process around 30,000 right now.
>
> I've tried setting max_execution_time to 1800; no improvement. The value for max_input_time is -1, which, if I understood it correctly, is the same as saying no limit. And I've tried calling set_time_limit() with both 0 and with 360, none of which seemed to help.
>
> Is there ANY WAY to increase the amount of time I can use when running a script that will work? I've tried everything I can find in the PHP manual.
>
> Any help you can provide will be greatly appreciated!

http://php.net/ignore_user_abort will help, but nothing will stop you hitting a max execution time.
but my guess is you're not hitting the max but rather the browser is killing the connection because it's
had no response from your script, and as a result apache is killing your script as it thinks it's no longer
needed (i.e. the browser no longer wants the response).

>
> Jon
>

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

Re: Looking for ways to prevent timeout

on 05.11.2007 03:53:19 by Nathan Nobbe

On 11/4/07, Jon Westcot wrote:
>
> Hi all:
>
> I'm hoping to find a solution to the problem I'm having with my script
> timing out while inserting records into a table.
>
> Overall, the process is pretty fast, which is impressive, but when it
> gets to the 22,000 to 23,000 record mark, it seems to time out. I've had it
> get up over 26,000 so far, but nothing better than that. And I only need to
> process around 30,000 right now.
>
> I've tried setting max_execution_time to 1800; no improvement. The
> value for max_input_time is -1, which, if I understood it correctly, is the
> same as saying no limit. And I've tried calling set_time_limit() with both
> 0 and with 360, none of which seemed to help.
>
> Is there ANY WAY to increase the amount of time I can use when running
> a script that will work? I've tried everything I can find in the PHP
> manual.
>
> Any help you can provide will be greatly appreciated!


are you familiar with ajax?
i would build some client side tool that would split the job into a series
of requests. say you input 20,000; then the script fires off 5 requests, to
do 4,000 inserts apiece. you could easily implement a status bar, and it
would be guaranteed to work. also, it would scale just about as high as you
can imagine.

-nathan
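
Nathan's suggestion boils down to one server-side need: a deterministic way to turn one big job into per-request chunks. A minimal sketch of that piece in PHP; `batch_ranges()` and the `import.php` endpoint mentioned in the comments are invented for illustration, not part of any existing code:

```php
<?php
// Sketch of the batching idea: split one big import into several
// smaller HTTP requests so no single request nears the timeout.

// Compute the [offset, length] pair for each request.
function batch_ranges($total, $batchSize) {
    $ranges = array();
    for ($offset = 0; $offset < $total; $offset += $batchSize) {
        $ranges[] = array($offset, min($batchSize, $total - $offset));
    }
    return $ranges;
}

// Each AJAX call would then hit something like:
//   import.php?offset=8000&limit=4000
// and that request would only INSERT rows $offset .. $offset+$limit-1.
```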

Re: Looking for ways to prevent timeout

on 05.11.2007 08:46:55 by Jon Westcot

Hi Nathan:

    No, I'm not familiar with Ajax. Where can I read up on it? More
importantly, how can I find out if Ajax is implemented on the server? Or is
it something I can add myself?

Thanks again,

Jon

----- Original Message -----
From: "Nathan Nobbe"
To: "Jon Westcot"
Cc: "PHP General"
Sent: Sunday, November 04, 2007 7:53 PM
Subject: Re: [PHP] Looking for ways to prevent timeout


> On 11/4/07, Jon Westcot wrote:
> > [snip]
>
>
> are you familiar with ajax ?
> i would build some client side tool that would split the job into a series
> of requests.
> say, you input 20,000; then the script fires off 5 requests, to do 4000
> inserts a piece.
> you could easily implement a status bar and it would be guaranteed to
work.
> also, it would scale just about as high as you can imagine.
>
> -nathan
>

Re: Looking for ways to prevent timeout

on 05.11.2007 08:52:13 by Jon Westcot

Hi Jochem:

Thanks for the suggestion. Not to sound more dense than I already seem,
but how do I do this? How do I tell the browser that something is still
running? I'm issuing a flush() after every 1000 records along with echoing
a textual status update. Should I do it more frequently, say every 100
records?
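
    The keep-alive pattern described here could look something like this
minimal sketch; `$rows` and the commented-out `do_insert()` stand in for
the real import logic:

```php
<?php
// Emit a short status line and flush it every N rows so the browser
// (and any buffering layer in between) keeps seeing traffic.
function import_with_progress(array $rows, $every = 100) {
    $count = 0;
    foreach ($rows as $row) {
        // do_insert($row);  // the real INSERT would go here
        $count++;
        if ($count % $every === 0) {
            echo "Processed $count rows...\n";
            @ob_flush();  // flush PHP's output buffer, if any
            flush();      // then the web server's buffer
        }
    }
    return $count;
}
```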

I'm really struggling with this concept here, and I appreciate the help
that everyone is giving me!

Jon

----- Original Message -----
From: "Jochem Maas"
To: "Jon Westcot"
Cc: "PHP General"
Sent: Sunday, November 04, 2007 7:28 PM
Subject: Re: [PHP] Looking for ways to prevent timeout


> Jon Westcot wrote:
> > [snip]
>
> http://php.net/ignore_user_abort will help, but nothing will stop you
> hitting a max execution time.
> but my guess is you're not hitting the max but rather the browser is
> killing the connection because it's had no response from your script, and
> as a result apache is killing your script as it thinks it's no longer
> needed (i.e. the browser no longer wants the response).
>
> >
> > Jon
> >
>
>

Re: Looking for ways to prevent timeout

on 05.11.2007 10:41:57 by Jochem Maas

Jon Westcot wrote:
> Hi Jochem:
>
> Thanks for the suggestion. Not to sound more dense than I already seem,
> but how do I do this?

by calling the function somewhere near the top of your script?

ignore_user_abort();
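
combined with the timeout settings discussed earlier in the thread, the top
of the script would look roughly like this (a sketch; note it does nothing
about timeouts imposed by apache or a proxy in front of it):

```php
<?php
// Keep running even if the client disconnects, and lift PHP's own
// execution-time cap. Web-server-level timeouts are unaffected.
ignore_user_abort(true);  // don't die when the browser aborts
set_time_limit(0);        // 0 = no PHP execution time limit
                          // (this sets max_execution_time to 0)
```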

> How do I tell the browser that something is still
> running? I'm issuing a flush() after every 1000 records along with echoing
> a textual status update. Should I do it more frequently, say every 100
> records?

I have never trusted that method of reassuring the browser that the
response is still forthcoming, but it's better than nothing.

>
> I'm really struggling with this concept here, and I appreciate the help
> that everyone is giving me!

don't forget to read the manual AND the user comments on the pages relevant to
the functions you are using to tackle the problem

>
> Jon
>
> [snip]

Re: Looking for ways to prevent timeout

on 05.11.2007 15:46:39 by Nathan Nobbe

On 11/5/07, Jon Westcot wrote:
>
> Hi Nathan:
>
> No, I'm not familiar with Ajax. Where can I read up on it? More
> important, how can I find out if Ajax is implemented on the server? Or is
> it something I can add myself?
>


if you aren't familiar w/ ajax, no worries; the main concept in my
suggestion is sending a number of requests rather than a single request.
that way you can execute a fraction of the queries on each request. this
will ensure that you don't hit the maximum execution time.

a purely php approach would be using the header() function. so, on each
request where the queries are not yet complete, your script sends a header
to the browser which will immediately invoke another request back on the
same script. once all the queries have been executed, don't invoke the
header() function; that would essentially be the results page.
the reason i prefer ajax over this approach is that the page will be
blanking out a lot, basically on every request. but it would definitely
work.

also, ajax is mainly a client side technology, where http requests are
sent to the server without incurring a full page refresh. you need nothing
extra on the server. i've been using prototype, a javascript toolkit which
has some nice support for ajax. if you want to check it out, here's an
article on ajax using prototype:
http://www.prototypejs.org/learn/introduction-to-ajax
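
the header() variant boils down to computing the next offset and
redirecting until there isn't one. a sketch, with import.php,
insert_chunk(), and the totals all invented for illustration:

```php
<?php
// Decide whether another redirect round-trip is needed.
// Returns the next offset, or null when the import is finished.
function next_offset($offset, $chunkSize, $total) {
    $next = $offset + $chunkSize;
    return ($next < $total) ? $next : null;
}

// Web glue for a hypothetical import.php:
// $offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
// insert_chunk($offset, 1000);               // run this chunk's INSERTs
// $next = next_offset($offset, 1000, 30000);
// if ($next !== null) {
//     header('Location: import.php?offset=' . $next);
//     exit;
// }
// echo 'Import complete.';                   // the results page
```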

-nathan

Re: Looking for ways to prevent timeout

on 05.11.2007 16:03:37 by parasane

On 11/5/07, Jochem Maas wrote:
> Jon Westcot wrote:
> > Hi Jochem:
> >
> > Thanks for the suggestion. Not to sound more dense than I already seem,
> > but how do I do this?
>
> by calling the function somewhere near the top of your script?
>
> ignore_user_abort();
>
> > How do I tell the browser that something is still
> > running? I'm issuing a flush() after every 1000 records along with echoing
> > a textual status update. Should I do it more frequently, say every 100
> > records?
>
> I have never trusted that method of reassuring the browser that the
> response is still forthcoming, but it's better than nothing.
>
> >
> > I'm really struggling with this concept here, and I appreciate the help
> > that everyone is giving me!
>
> don't forget to read the manual AND the user comments on the pages relevant to
> the functions you are using to tackle the problem
>
> [snip]
>

Does it absolutely need to be run via the browser? Can it be run
from the CLI instead? Maybe even something like this:

######### PART 1 #########
<?php
// script1.php
/* Do whatever you need
   this script to do, including
   passing variables, et cetera. */

// For example:
$today = date("m/d/Y");

$fruit[] = "apple";
$fruit[] = "banana";
$fruit[] = "cherry";

function arr2str($arr, $delimiter = ',') {
    $str = '';
    for ($i = 0; $i < count($arr); $i++) {
        if ($i > 0) {
            $str .= $delimiter;
        }
        $str .= $arr[$i];
    }
    return $str;
}

// This would have to be in an exact order if you
// want to name these variables in the next script.
exec('`which php` script2.php ' . $today . ' ' . arr2str($fruit), $ret);
?>


######### PART 2 #########
<?php
// script2.php

// Disable any attempt to stop the script,
// short of a ps kill or equivalent.
ignore_user_abort();

// Remember, if you want to name the variables,
// they have to either be distinguished via the
// CLI or kept in an exact, expected order.

for ($i = 1; $i < $argc; $i++) { // Start with 1 because 0 is script name
    echo $argv[$i] . "\n";       // Output to stdio to demo
}
?>


--
Daniel P. Brown
[office] (570) 587-7080 Ext. 272
[mobile] (570) 766-8107

Give a man a fish, he'll eat for a day. Then you'll find out he was
allergic and is hospitalized. See? No good deed goes unpunished....

Re: Looking for ways to prevent timeout

on 05.11.2007 17:40:27 by Jochem Maas

Nathan Nobbe wrote:
> On 11/5/07, Jon Westcot wrote:
>> Hi Nathan:
>>
>> No, I'm not familiar with Ajax. Where can I read up on it? More
>> important, how can I find out if Ajax is implemented on the server? Or is
>> it something I can add myself?
>>
>

although it might sound cool - using ajax to issue multiple requests is NOT the
correct solution. you're merely moving the goalposts (what happens when the user
moves off the page just as the third ajax request is made?)

you're looking to run a series of inserts whilst guaranteeing that they are not
interrupted.

Dan Brown (the nice guy on this list, not the twat that wrote the 'Da Vinci
Code') suggests a *much* better way to go - namely using a CLI script. the fun
part is getting a button push on an admin page to somehow initiate the CLI
script.

one way of doing this could be to have a 'job' table in your database to which
'jobs' are inserted (e.g. 'do my 30000 record import') and have your [CLI]
script check the database to see if it should start a 'job', and just exit if
it does not need to do so
.... lastly, in order to have the [CLI] script regularly check if it needs to do
something, you can use cron to schedule the script to run at regular intervals
(e.g. every 15 minutes)

many ways to skin this cat - my guess is all the decent ways of doing it will
involve a CLI script, probably in conjunction with a cronjob.
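
the job-table idea could be sketched like this. table and column names are
assumptions, and it is shown against an in-memory SQLite database purely so
the sketch is self-contained; the real worker would point PDO at the MySQL
database:

```php
<?php
// Cron-driven CLI worker: claim one pending job, run it, mark it done.
// crontab entry (every 15 minutes):
//   */15 * * * * /usr/bin/php /path/to/worker.php

function claim_job(PDO $pdo) {
    $job = $pdo->query("SELECT id, payload FROM jobs
                        WHERE status = 'pending'
                        ORDER BY id LIMIT 1")->fetch(PDO::FETCH_ASSOC);
    if ($job === false) {
        return null;  // nothing to do; the worker just exits quietly
    }
    // Claim it; the status check guards against a concurrent worker.
    $stmt = $pdo->prepare("UPDATE jobs SET status = 'running'
                           WHERE id = ? AND status = 'pending'");
    $stmt->execute(array($job['id']));
    return $stmt->rowCount() === 1 ? $job : null;
}

function finish_job(PDO $pdo, $id) {
    $stmt = $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?");
    $stmt->execute(array($id));
}
```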

>
> [snip]
>

Re: Looking for ways to prevent timeout

on 05.11.2007 18:29:13 by Nathan Nobbe

On 11/5/07, Jochem Maas wrote:
>
> Nathan Nobbe wrote:
> > On 11/5/07, Jon Westcot wrote:
> >> Hi Nathan:
> >>
> >> No, I'm not familiar with Ajax. Where can I read up on it? More
> >> important, how can I find out if Ajax is implemented on the server? Or
> is
> >> it something I can add myself?
> >>
> >
>
> although it might sound cool - using ajax to issue multiple requests is
> NOT the correct solution.


correct implies there is only a single solution to the problem. i certainly
did not suggest it because it *sounds* cool. the reason i suggested it is
because it is the best solution for giving feedback on the user interface
while the queries are run.

> you're merely moving the goalposts (what happens when the user moves off
> the page just as the third ajax request is made?)

well, if you wanted to support that scenario, a resume feature would be
pretty easy to implement.

> you're looking to run a series of inserts whilst guaranteeing that they
> are not interrupted.

i don't recall seeing that requirement.

> Dan Brown (the nice guy on this list, not the twat that wrote the 'Da
> Vinci Code') suggests a *much* better way to go - namely using a CLI
> script.

an *alternative* solution, with the trade-off that notification via the
u.i. is not an option.

> the fun part is getting a button push on an admin page to somehow
> initiate the CLI script.

that is nice. it's also nice that the solution i suggested provides the
same sort of button and updated feedback on the u.i.

using a cli script is a great solution, however the only sort of
notification mechanism is one that is sent after the queries have
finished, via an email most likely.

yes, there are many ways to skin the cat; evaluate them based on the
requirements and the constraints and choose the best one for the problem.

-nathan

Re: Looking for ways to prevent timeout

on 05.11.2007 19:45:14 by Satyam

----- Original Message -----
From: "Jochem Maas"

> Dan Brown (the nice guy on this list, not the twat that wrote the 'daVinci
> Code') suggests
> a *much* better way to go- namely using a CLI script. the fun part is
> getting a button
> push on an admin page to somehow initiate the CLI script.
>
> one way of doing this could be to have a 'job' table in your database to
> which 'jobs'
> are inserted (e.g. 'do my 30000 record import') and that your [CLI] script
> checks the
> database to see if it should start a 'job' and just exit if it does not
> need to do so
> ... lastly in order to have the [CLI] script regularly check if it needs
> to do something you
> can use cron to schedule that the script runs at regular intervals (e.g.
> every 15 minutes)
>
> many ways to skin this cat - my guess is all the decent ways of doing it
> will involve a
> CLI script probably in conjunction with a cronjob.
>
>>
>> if you arent familiar w/ ajax, no worries; the main concept in my
>> suggestion
>> is
>> sending a number of requests rather than a single request. that way you
>> can
>> execute a fraction of the queries on each request. this will ensure that
>> you dont
>> hit the maximum execution time.
>>

Just to comment on an alternative way of breaking up the job. It is just
something that happened to me once and might be useful.

My particular job could be naturally broken into several stages. Actually, it
had to be. Though it could be solved with a huge complex SQL query with
several joins and subqueries (which were not available), it could also be
performed with the help of a few intermediate auxiliary tables. So I had a
query (which would have been the sub-query) inserting records into a flat
table with no indexes. In a second step I added the index; then there was
another join between this auxiliary table and another table (and this one
was pretty complex and, at that time, with no stored procedures, it required
some processing with PHP), and then the final step produced the result. At
that time, before AJAX was popular, I showed the progress in an iframe whose
"src" attribute I changed for each successive step (poor man's AJAX), but it
could also be done via AJAX or by reloading the whole page instead of an
iframe within it.

Some time later I tried to redo one other such process as a single query
with subqueries, and I found that using the auxiliary tables was faster. I
admit that my attempt was half-hearted; I wanted to either see a big
improvement or ignore the whole business. The improvement wasn't that big,
so I dropped it and assumed the other processes would not show any big
improvement either. After all, I knew the data and optimized it as much as
possible, so I can't assume the SQL optimizer could do much better than I
had.
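
The staged approach might look like the following sketch. The tables
(orders, customers, aux_totals) are invented for illustration, and SQLite
stands in for whatever database is actually in use; each stage could be one
iframe/AJAX step:

```php
<?php
// Replace one huge query with a few cheap stages through an unindexed
// scratch table, reporting progress after each stage.
function run_stages(PDO $pdo) {
    // Stage 1: dump the sub-query result into a flat, index-less table.
    $pdo->exec("CREATE TABLE aux_totals AS
                SELECT customer_id, SUM(amount) AS total
                FROM orders GROUP BY customer_id");
    // Stage 2: only now add the index (bulk insert first, index later).
    $pdo->exec("CREATE INDEX idx_aux_customer ON aux_totals (customer_id)");
    // Stage 3: the final join is now a simple indexed lookup.
    return $pdo->query("SELECT c.name, a.total
                        FROM customers c
                        JOIN aux_totals a ON a.customer_id = c.id
                        ORDER BY c.name")->fetchAll(PDO::FETCH_ASSOC);
}
```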

Satyam

Re: Looking for ways to prevent timeout

on 05.11.2007 23:19:50 by Jochem Maas

Nathan Nobbe wrote:
> On 11/5/07, Jochem Maas wrote:

....

yes yes yes to all that, except for one thing. I'm personally of
the opinion such an 'import' operation should involve no human interaction
and should be guaranteed to complete (e.g. auto resume), save for possibly
initializing the process. the way I see it you want to try to guarantee the
import will be atomic in such cases.

there is nothing to stop you having a fancy UI that polls the server
to check the job table (as exampled earlier in this thread) for the status
of a job (and subsequently, possibly, retrieve some processed output).

the two processes ('init & review' and 'run job') should be independent; any
form of control (e.g. a cancellation) should happen via some sort of IPC
mechanism and should be optional, apart from possibly initialization (depending
on business requirements). successful import completion should not have to
rely on the user having to press 'continue', or even stay on the page, or
anything of that nature.

also consider that with regard to such tasks it's not efficient to
have someone staring at a progress bar. and it's annoying, after a few
seconds, for the user, regardless of how pretty.
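
a polling UI only needs a trivially cheap status endpoint over that same
job table. a sketch (status.php and the processed/total columns are
assumptions, shown on SQLite so it is self-contained):

```php
<?php
// Return the current state of one job, or null if it doesn't exist.
function job_status(PDO $pdo, $id) {
    $stmt = $pdo->prepare("SELECT status, processed, total
                           FROM jobs WHERE id = ?");
    $stmt->execute(array($id));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row === false ? null : $row;
}

// Hypothetical status.php the UI would poll every few seconds:
// echo json_encode(job_status($pdo, (int)$_GET['id']));
```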

>
> -nathan
>
