speeding up reading opendir...

on 09.01.2008 09:34:29 by jodleren

Hi!

I have a system which reads in the entire tree - all files. As of
now, there are 1312 folders to read.

The 1st time it takes 53-60 seconds to read. The data is somehow cached,
so the 2nd time it takes only 2-3 seconds.

Is there a way to "cache" the data beforehand? Like "preparing" the
directory...?

WBR
Sonnich

Re: speeding up reading opendir...

on 09.01.2008 10:01:00 by Courtney

jodleren wrote:
> Hi!
>
> I have a system, which reads in the entire tree - all files. As of
> now, there are 1312 folders to read.
>
> The 1st time it takes 53-60 seconds to read. The data is somehow cached,
> so the 2nd time it takes only 2-3 seconds.
>
> Is there a way to "cache" data beforehand? Like "preparing" the
> directory....?
>

Before what? Before you read them?

Bit of a logical impasse there ;-)

If it's *nix, you might run a cron script every few minutes that reads
the whole directory structure, which will bring it into the disk file cache.

Of course, under heavy I/O load that cached data may get flushed again.
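
A minimal sketch of such a warming pass (the tree root here is a
made-up demo path that the script creates itself; point it at your
real data directory instead):

```shell
#!/bin/sh
# Cache-warming pass: walking the tree forces every directory's
# entries into the kernel's disk cache, so a later opendir()/
# readdir() sweep by the application finds them already cached.
# ROOT is a demo path created below; substitute your real tree.
ROOT=/tmp/warm-demo

# Build a tiny sample tree so the sketch runs on its own.
mkdir -p "$ROOT/a/b" "$ROOT/c"
touch "$ROOT/a/f1" "$ROOT/a/b/f2" "$ROOT/c/f3"

# The actual warming step: list every directory, discard output.
find "$ROOT" -type d -exec ls -l {} \; > /dev/null

# Report how many directories were warmed.
find "$ROOT" -type d | wc -l
```

A crontab entry such as `*/5 * * * * find /your/tree -type d -exec ls -l {} \; > /dev/null` (path invented) would re-warm the cache every five minutes, as suggested above.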






> WBR
> Sonnich

Re: speeding up reading opendir...

on 09.01.2008 13:56:36 by Csaba Gabor

On Jan 9, 9:34 am, jodleren wrote:
> I have a system, which reads in the entire tree - all files. As of
> now, there are 1312 folders to read.
>
> The 1st time it takes 53-60 seconds to read. The data is somehow cached,
> so the 2nd time it takes only 2-3 seconds.
>
> Is there a way to "cache" data beforehand? Like "preparing" the
> directory....?
>
> WBR
> Sonnich

You might simply redirect the output of the dir
command into a file (or a string or array, depending
on which exec-type command you use) and then
parse that yourself. It should be FAR faster.
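
On Unix the same idea works with find instead of dir; a hedged
sketch (demo paths are invented, and the script builds its own
sample tree so it runs as-is):

```shell
#!/bin/sh
# One-pass listing: spawn a single process that walks the whole
# tree, dump its output to a file, then parse that flat file
# instead of calling opendir() once per folder.
ROOT=/tmp/listing-demo

# Small sample tree so the sketch is self-contained.
mkdir -p "$ROOT/docs" "$ROOT/img"
touch "$ROOT/docs/a.txt" "$ROOT/docs/b.txt" "$ROOT/img/c.png"

# Capture the listing in one pass (Unix analogue of "dir /s").
find "$ROOT" -type f > /tmp/listing-demo.dir

# Parse the file instead of re-walking: e.g. count .txt entries.
grep -c '\.txt$' /tmp/listing-demo.dir
```

From PHP you would get the same effect with `exec('find /your/tree -type f', $lines)` and then parse `$lines` yourself (path invented); exec() fills its second argument with one array element per output line.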

For reference,
it took my Win XP system about 315 seconds to do:
C:\>dir /s > delme.dir
on the entire C: drive - about 140,000 files
totaling about 44 gigabytes in 32,000 directories.
The resultant file was about 10 megabytes.

Csaba Gabor from Vienna

Re: speeding up reading opendir...

on 09.01.2008 15:59:00 by Rob

On Jan 9, 12:56 pm, Csaba Gabor wrote:
> On Jan 9, 9:34 am, jodleren wrote:
>
> > I have a system, which reads in the entire tree - all files. As of
> > now, there are 1312 folders to read.
>
> > The 1st time it takes 53-60 seconds to read. The data is somehow cached,
> > so the 2nd time it takes only 2-3 seconds.
>
> > Is there a way to "cache" data beforehand? Like "preparing" the
> > directory....?
>
> > WBR
> > Sonnich
>
> You might simply direct the output of the dir
> command into a file (or string or array depending
> on which exec type of command you use) and then
> parse that yourself. It should be FAR faster.
>
> For reference,
> It took my Win XP system about 315 seconds to do:
> C:\>dir /s > delme.dir
> with the entire c: drive, about 140000 files
> totaling about 44 gigabytes in 32000 directories.
> The resultant file was about 10 megabytes
>
> Csaba Gabor from Vienna

If you're running this on Windows, you're probably seeing the Windows
file cache coming into play, which is why it runs faster the second time.

As previously suggested, try using exec() to dump the directory listing
to a file, and then parse that instead.


Rob.