
#1: speeding up reading opendir...

Posted on 2008-01-09 09:34:29 by jodleren

Hi!

I have a system which reads in the entire tree - all files. As of
now, there are 1312 folders to read.

The 1st time it takes 53-60 seconds to read. The data is somehow
cached, so the 2nd time it takes 2-3 seconds.

Is there a way to "cache" data beforehand? Like "preparing" the
directory....?
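For context, a recursive read of the kind described here is usually a
loop of opendir()/readdir() calls, one per folder. A minimal sketch
(the starting path is an assumption for illustration):

```php
<?php
// Minimal recursive walk with opendir()/readdir(): collect every file
// path under $dir. Each subfolder costs one opendir() call, which is
// why a cold (uncached) run over many folders is slow.
function listTree($dir, array &$files = array()) {
    $dh = opendir($dir);
    if ($dh === false) {
        return $files;          // unreadable directory: skip it
    }
    while (($entry = readdir($dh)) !== false) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $path = $dir . DIRECTORY_SEPARATOR . $entry;
        if (is_dir($path)) {
            listTree($path, $files);   // descend into subfolder
        } else {
            $files[] = $path;
        }
    }
    closedir($dh);
    return $files;
}
```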

WBR
Sonnich


#2: Re: speeding up reading opendir...

Posted on 2008-01-09 10:01:00 by Courtney

jodleren wrote:
> Hi!
>
> I have a system, which reads in the entire tree - all files. As of
> now, there are 1312 folders to read.
>
> The 1st time it takes 53-60 seconds to read. The data is somehow
> cached, so the 2nd time it takes 2-3 seconds.
>
> Is there a way to "cache" data beforehand? Like "preparing" the
> directory....?
>

Before what? Before you read them?

Bit of a logical impasse there ;-)

If it's *nix, you might execute a cron script every few minutes that
reads the whole directory structure, which will bring it into the disk
file cache.

Of course, under heavy I/O load that cache may get flushed again.

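One hedged sketch of that idea (path and interval are assumptions, not
from the thread): a crontab entry that walks the tree periodically and
discards the output, so the kernel keeps the directory metadata warm.

```shell
# Hypothetical crontab entry: every 5 minutes, stat everything under
# /var/www/data (illustrative path) and throw the listing away. The
# side effect is that the dentry/inode caches stay populated between
# real reads, so the application's pass runs at "second time" speed.
*/5 * * * * find /var/www/data > /dev/null 2>&1
```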
> WBR
> Sonnich


#3: Re: speeding up reading opendir...

Posted on 2008-01-09 13:56:36 by Csaba Gabor

On Jan 9, 9:34 am, jodleren <sonn...@hot.ee> wrote:
> I have a system, which reads in the entire tree - all files. As of
> now, there are 1312 folders to read.
>
> The 1st time it takes 53-60 seconds to read. The data is somehow
> cached, so the 2nd time it takes 2-3 seconds.
>
> Is there a way to "cache" data beforehand? Like "preparing" the
> directory....?
>
> WBR
> Sonnich

You might simply direct the output of the dir
command into a file (or a string or array, depending
on which exec-type command you use) and then
parse that yourself. It should be FAR faster.
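A sketch of that idea in PHP (the directory is hypothetical; on a real
run $listing would come from the shell_exec() call shown in the
comment, so a sample of the bare-format output stands in for it here):

```php
<?php
// Sketch: capture the whole tree in one shell call and parse the text
// yourself, instead of an opendir()/readdir() pass per folder.
// On Windows that call would be (path is an assumption):
//   $listing = shell_exec('dir /s /b "C:\\data"');
// 'dir /s /b' prints one full path per line, e.g.:
$listing = "C:\\data\\a.txt\r\nC:\\data\\sub\\b.txt\r\n";

// Split on any newline convention and drop the trailing empty line.
$paths = preg_split('/\R/', trim($listing));
foreach ($paths as $path) {
    // ... handle each file path ...
}
```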

For reference, it took my Win XP system about 315
seconds to run:
C:\>dir /s > delme.dir
on the entire C: drive - about 140,000 files
totaling about 44 gigabytes in 32,000 directories.
The resultant file was about 10 megabytes.

Csaba Gabor from Vienna


#4: Re: speeding up reading opendir...

Posted on 2008-01-09 15:59:00 by Rob

On Jan 9, 12:56 pm, Csaba Gabor <dans...@gmail.com> wrote:
> On Jan 9, 9:34 am, jodleren <sonn...@hot.ee> wrote:
>
> > I have a system, which reads in the entire tree - all files. As of
> > now, there are 1312 folders to read.
>
> > The 1st time it takes 53-60 seconds to read. The data is somehow
> > cached, so the 2nd time it takes 2-3 seconds.
>
> > Is there a way to "cache" data beforehand? Like "preparing" the
> > directory....?
>
> > WBR
> > Sonnich
>
> You might simply direct the output of the dir
> command into a file (or string or array depending
> on which exec type of command you use) and then
> parse that yourself. It should be FAR faster.
>
> For reference,
> It took my Win XP system about 315 seconds to do:
> C:\>dir /s > delme.dir
> with the entire c: drive, about 140000 files
> totaling about 44 gigabytes in 32000 directories.
> The resultant file was about 10 megabytes
>
> Csaba Gabor from Vienna

If you're running this on Windows, you're probably seeing the Windows
file cache coming into play, which is why it runs faster the second
time.

As previously suggested, try using exec() to dump a directory listing
to a file, and then parse that instead.
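A sketch of the exec()-to-file variant on *nix (on Windows the command
would be 'dir /s /b' instead of find; the temp-file location and target
directory are assumptions for illustration):

```php
<?php
// Dump a directory listing to a file in one exec() call, then read the
// file back as an array of paths. exec() runs the command through the
// shell, so the '>' redirection works as written.
$listFile = sys_get_temp_dir() . '/delme.dir';
exec('find ' . escapeshellarg(sys_get_temp_dir()) . ' -maxdepth 1 > '
     . escapeshellarg($listFile));

// One path per line; drop newlines and any blank lines.
$paths = file($listFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
```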


Rob.
