
#1: Memory problem - Urgent - please help.

Posted on 2007-07-31 00:15:54 by news - ntua

cpu = intel xeon 64bit
os = Redhat AS 4 update 5, 64bit
physical memory = 8 GB
perl v5.8.5 (the one of Redhat)

I have a great problem. At first I thought everything was OK because the
program was running, but I noticed it was getting slower and slower.
I did a "watch cat /proc/meminfo" and noticed that, from 8 GB of free memory,
after some time I ended up with 14 KB.
I started searching and found that the problem was at this line
('gzip...') and at a `cp ...` some lines earlier.
....
foreach (grep $_ =~ $filter_original_files, readdir DIR)
{
    ....
    `gzip --force --decompress "$temp_directory/$_"`;
    ....
}
....
The files I parse number in the hundreds, and their uncompressed size is over
100 GB, but this should not be a problem. At every iteration I notice that I
lose as much memory as the size of the uncompressed file.
This should not happen. I also tried:

system("command ...")
open(SHELL, "command -|")
eval { ... }
do { .... }
qx/command &/;
fork
and also forking in the shell with `( command )`;

Nothing helped. Whenever I run a shell command, its memory is added to the
Perl process and does not get released after the shell finishes its job.
Please help (fast), because I won't have my head on my shoulders for much
longer (not joking).


#2: Re: Memory problem - Urgent - please help.

Posted on 2007-07-31 04:37:25 by Paul Lalli

On Jul 30, 6:15 pm, "news.ntua" <r...@localhost.com> wrote:
> I have a great problem. At first I thought everything was OK because the
> program was running, but I noticed it was getting slower and slower.
> I did a "watch cat /proc/meminfo" and noticed that, from 8 GB of free memory,
> after some time I ended up with 14 KB.
> I started searching and found that the problem was at this line
> ('gzip...') and at a `cp ...` some lines earlier.
> ...
> foreach (grep $_=~$filter_original_files, readdir DIR)

This line reads the entire file list into memory, for no reason at
all. Change to:

while (my $file = readdir DIR) {
    if ($file =~ /$filter_original_files/) {

If $filter_original_files doesn't change at all, then you should also
stick an 'o' at the end of the regexp, like so:
if ($file =~ /$filter_original_files/o) {
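A minimal, self-contained sketch of that streaming loop (the directory is a throwaway created for illustration, and the filter pattern is hypothetical; `qr//` precompiles the pattern, which makes the `/o` modifier unnecessary):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# A throwaway directory with two files stands in for the poster's setup.
my $temp_directory = tempdir(CLEANUP => 1);
for my $name ('a.gz', 'b.txt') {
    open my $fh, '>', "$temp_directory/$name" or die "Cannot create $name: $!";
    close $fh;
}

my $filter_original_files = qr/\.gz\z/;   # precompiled filter (hypothetical)

# readdir in a while loop yields one entry at a time, so the full
# directory listing never sits in memory the way grep+readdir does.
opendir my $dh, $temp_directory or die "Cannot open $temp_directory: $!";
my @matched;
while (my $file = readdir $dh) {
    next unless $file =~ $filter_original_files;
    push @matched, $file;   # stand-in for per-file processing
}
closedir $dh;
```

With lexical handles (`my $dh`) the directory handle is also closed automatically when it goes out of scope.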

If $filter_original_files doesn't contain any regexp characters, and
is just a normal string, you should use the index() function rather
than regular expressions, like so:
if (index($file, $filter_original_files) != -1) {
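A small illustration of the index() variant, with a hypothetical literal filter and file names:

```perl
use strict;
use warnings;

# index() returns the position of the substring, or -1 if it is absent,
# so a literal-string filter needs no regular expression at all.
my $filter_original_files = '.log';   # hypothetical literal filter
for my $file ('app.log', 'notes.txt') {
    if (index($file, $filter_original_files) != -1) {
        print "$file matches\n";      # prints "app.log matches"
    }
}
```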

> ...
> `gzip --force --decompress "$temp_directory/$_"`;

This line forks a new process, execs a shell, runs the gzip program in
that shell, and captures the output into memory, only to do nothing
with that output. Change to:

system('gzip', '--force', '--decompress', "$temp_directory/$file");

so that you are running gzip directly without starting a shell, and
not bothering to capture output.
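A runnable sketch of that list-form call, with a throwaway gzip file built on the spot (the directory and file name are hypothetical) and basic error checking added:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);
use IO::Compress::Gzip qw(gzip $GzipError);

# Build a small gzip file to decompress; the paths are throwaways.
my $temp_directory = tempdir(CLEANUP => 1);
my $file           = 'data.gz';   # hypothetical file name
gzip \"hello\n" => "$temp_directory/$file"
    or die "could not create test file: $GzipError";

# List-form system() runs gzip directly (no shell) and discards its
# output instead of slurping it into the Perl process like backticks.
my $rc = system('gzip', '--force', '--decompress', "$temp_directory/$file");
die "gzip exited with status " . ($rc >> 8) . "\n" if $rc != 0;
```

Checking the return value of system() (or `$?`) also surfaces failures that the original backtick version silently ignored.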

> Please help (fast), because I won't have my head on my shoulders for much
> longer (not joking).

This sort of request is rude, and is much more likely to get you less
help than more. Your failure to plan is not an emergency for anyone
but you.

Paul Lalli


#3: Re: Memory problem - Urgent - please help.

Posted on 2007-07-31 05:14:19 by paduille.4061.mumia.w+nospam

On 07/30/2007 05:15 PM, news.ntua wrote:
>
> cpu = intel xeon 64bit
> os = Redhat AS 4 update 5, 64bit
> physical memory = 8 GB
> perl v5.8.5 (the one of Redhat)
>
> I have a great problem. At first I thought everything was OK because the
> program was running, but I noticed it was getting slower and slower.
> I did a "watch cat /proc/meminfo" and noticed that, from 8 GB of free memory,
> after some time I ended up with 14 KB.
> I started searching and found that the problem was at this line
> ('gzip...') and at a `cp ...` some lines earlier.
> ....
> foreach (grep $_ =~ $filter_original_files, readdir DIR)
> {
>     ....
>     `gzip --force --decompress "$temp_directory/$_"`;
>     ....
> }
> ....
> The files I parse number in the hundreds, and their uncompressed size is over
> 100 GB, but this should not be a problem. At every iteration I notice that I
> lose as much memory as the size of the uncompressed file.
> This should not happen. I also tried:
>
> system("command ...")
> open(SHELL, "command -|")
> eval { ... }
> do { .... }
> qx/command &/;
> fork
> also fork at shell with `( command )`;
>
> Nothing helped. Whenever I run a shell command, its memory is added to the
> Perl process and does not get released after the shell finishes its job.
> Please help (fast), because I won't have my head on my shoulders for much
> longer (not joking).
>

You might consider asking in comp.lang.perl.misc also.

Since you didn't post a short-but-runnable script, there's no way anyone
can debug this problem for you.

If you do decide to post into comp.lang.perl.misc, read the posting
guidelines first:
http://www.augustmail.com/~tadmc/clpmisc/clpmisc_guidelines.html
