Reading remote files
on 01.09.2009 19:34:35 by Grace Shibley
Is there a way to read large (possibly 500 MB) remote files without loading
the whole file into memory?
We are trying to write a function that will return chunks of binary data
from a file on our server given a file location, specified offset and data
size.
But, we have not been able to get around loading the whole file into memory
first. Is there a way to do this??
Re: Reading remote files
on 01.09.2009 19:36:57 by Ashley Sheridan
On Tue, 2009-09-01 at 10:34 -0700, Grace Shibley wrote:
> Is there a way to read large (possibly 500 MB) remote files without loading
> the whole file into memory?
> We are trying to write a function that will return chunks of binary data
> from a file on our server given a file location, specified offset and data
> size.
>
> But, we have not been able to get around loading the whole file into memory
> first. Is there a way to do this??
What sort of remote file is it, i.e. how are you remotely connecting to
it? FTP, HTTP, SSH?
Thanks,
Ash
http://www.ashleysheridan.co.uk
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
RE: Reading remote files
on 01.09.2009 19:38:33 by Bob McConnell
From: Grace Shibley
> Is there a way to read large (possibly 500 MB) remote files without loading
> the whole file into memory?
> We are trying to write a function that will return chunks of binary data
> from a file on our server given a file location, specified offset and data
> size.
>
> But, we have not been able to get around loading the whole file into memory
> first. Is there a way to do this??
Are you actually having a problem with memory, or simply that you have
to transfer it over a network first? Depending on the protocol used, you
may be able to read it in chunks, but those chunks will still have to be
copied to the computer that is reading it before it can be processed.
The other option is to run a process in the computer where the file
resides and only send the results over the network.
Bob McConnell
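Bob's second option, running a process on the machine where the file lives and shipping only the requested bytes, can be sketched in PHP. The function and parameter names below are illustrative, not part of any existing API; the key point is that fread() returns only the number of bytes asked for, so the 500 MB file never enters memory:

```php
<?php
// Hypothetical server-side helper: return $size bytes of $path
// starting at $offset, without loading the whole file.
function read_chunk($path, $offset, $size) {
    $fh = fopen($path, 'rb');      // open for binary reading
    if ($fh === false) {
        return false;
    }
    fseek($fh, $offset);           // jump straight to the offset
    $data = fread($fh, $size);     // read only $size bytes
    fclose($fh);
    return $data;
}

// Illustrative use as an endpoint (script and parameter names are made up):
// header('Content-Type: application/octet-stream');
// echo read_chunk('/data/big.bin', (int)$_GET['offset'], (int)$_GET['size']);
```

A standalone client would then call that endpoint once per chunk and append each response to the local file.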
Re: Reading remote files
on 01.09.2009 19:43:09 by Grace Shibley
HTTP
On Tue, Sep 1, 2009 at 10:36 AM, Ashley Sheridan
wrote:
> On Tue, 2009-09-01 at 10:34 -0700, Grace Shibley wrote:
> > Is there a way to read large (possibly 500 MB) remote files without
> loading
> > the whole file into memory?
> > We are trying to write a function that will return chunks of binary data
> > from a file on our server given a file location, specified offset and
> data
> > size.
> >
> > But, we have not been able to get around loading the whole file into
> memory
> > first. Is there a way to do this??
>
> What sort of remote file is it, i.e. how are you remotely connecting to
> it? FTP, HTTP, SSH?
>
> Thanks,
> Ash
> http://www.ashleysheridan.co.uk
Re: Reading remote files
on 01.09.2009 19:46:24 by Ashley Sheridan
On Tue, 2009-09-01 at 10:43 -0700, Grace Shibley wrote:
> HTTP
>
> On Tue, Sep 1, 2009 at 10:36 AM, Ashley Sheridan
> wrote:
>
> > On Tue, 2009-09-01 at 10:34 -0700, Grace Shibley wrote:
> > > Is there a way to read large (possibly 500 MB) remote files without loading
> > > the whole file into memory?
> > > We are trying to write a function that will return chunks of binary data
> > > from a file on our server given a file location, specified offset and data
> > > size.
> > >
> > > But, we have not been able to get around loading the whole file into memory
> > > first. Is there a way to do this??
> >
> > What sort of remote file is it, i.e. how are you remotely connecting to
> > it? FTP, HTTP, SSH?
> >
> > Thanks,
> > Ash
> > http://www.ashleysheridan.co.uk
As far as I know then, HTTP doesn't support entering files at points
specified by a remote user. A request is made for a file, and the server
determines how to break it up in order to send.
Thanks,
Ash
http://www.ashleysheridan.co.uk
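For what it's worth, HTTP/1.1 does define partial transfers: a client can send a "Range: bytes=first-last" header, and a server that supports it answers 206 Partial Content with only those bytes. A minimal sketch with PHP's cURL extension follows; the URL is a placeholder, and whether it works depends on the server honouring Range:

```php
<?php
// Build the byte-range specifier for one chunk. cURL's CURLOPT_RANGE
// wants "first-last" (inclusive, zero-based), without the "bytes=" prefix.
function byte_range($offset, $size) {
    return $offset . '-' . ($offset + $size - 1);
}

// Illustrative fetch of a 64 KB chunk at the 1 MB mark
// (http://example.com/big.bin is a placeholder):
// $ch = curl_init('http://example.com/big.bin');
// curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// curl_setopt($ch, CURLOPT_RANGE, byte_range(1048576, 65536));
// $chunk = curl_exec($ch);
// curl_close($ch);
```

If the server ignores Range it simply sends the whole body with a 200 status, so a client should check the response code before trusting the chunk size.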
Re: Reading remote files
on 01.09.2009 19:55:15 by Grace Shibley
> Are you actually having a problem with memory, or simply that you have
> to transfer it over a network first? Depending on the protocol used, you
> may be able to read it in chunks, but those chunks will still have to be
> copied to the computer that is reading it before it can be processed.
> The other option is to run a process in the computer where the file
> resides and only send the results over the network.
>
> Bob McConnell
We haven't actually had a problem yet, but we don't want to run a risk of a
server crash. We want to be able to call this PHP function from a
standalone application that will get that particular chunk of data specified
and save it to the local drive.
But, so far, we have been told that any function we use (fopen/fread,
file_get_contents) will first load the entire file into memory.
> As far as I know then, HTTP doesn't support entering files at points
> specified by a remote user. A request is made for a file, and the server
> determines how to break it up in order to send.
Apparently, with file_get_contents, you can specify an offset and a
datasize, but it still loads the whole file first. Is this true?
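On the file_get_contents() question: for local files, the documented $offset and $length parameters do limit what is read; PHP seeks to the offset and reads at most $length bytes, so the whole file is not pulled into memory. Over the http:// wrapper the stream is not truly seekable, so the offset behaviour there is less predictable. A small local sketch:

```php
<?php
// Read a 4-byte window at offset 2 with file_get_contents' offset/length
// parameters -- only the window is returned, not the whole file.
$tmp = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($tmp, '0123456789');

$chunk = file_get_contents($tmp, false, null, 2, 4);
var_dump($chunk);   // string(4) "2345"

// The equivalent explicit version, likewise reading only 4 bytes:
$fh = fopen($tmp, 'rb');
fseek($fh, 2);
$same = fread($fh, 4);
fclose($fh);
unlink($tmp);
```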
On Tue, Sep 1, 2009 at 10:46 AM, Ashley Sheridan
wrote:
> On Tue, 2009-09-01 at 10:43 -0700, Grace Shibley wrote:
> > HTTP
> >
> > On Tue, Sep 1, 2009 at 10:36 AM, Ashley Sheridan
> > wrote:
> >
> > > On Tue, 2009-09-01 at 10:34 -0700, Grace Shibley wrote:
> > > > Is there a way to read large (possibly 500 MB) remote files without loading
> > > > the whole file into memory?
> > > > We are trying to write a function that will return chunks of binary data
> > > > from a file on our server given a file location, specified offset and data
> > > > size.
> > > >
> > > > But, we have not been able to get around loading the whole file into memory
> > > > first. Is there a way to do this??
> > >
> > > What sort of remote file is it, i.e. how are you remotely connecting to
> > > it? FTP, HTTP, SSH?
> > >
> > > Thanks,
> > > Ash
> > > http://www.ashleysheridan.co.uk
> As far as I know then, HTTP doesn't support entering files at points
> specified by a remote user. A request is made for a file, and the server
> determines how to break it up in order to send.
>
> Thanks,
> Ash
> http://www.ashleysheridan.co.uk
Re: Reading remote files
on 01.09.2009 21:17:15 by Ralph Deffke
I think this also depends on the operating system. I would say that any
development team would avoid loading file-type data into fast memory; these
problems come up all over applications. From the PHP point of view, it could
mean that file data have to be read into memory, but that does not
necessarily mean the data must sit in a physical memory chip. As smartly as
operating systems, Apache and PHP are designed, I would expect the
developers to provide some disk-caching mechanism for large blocks of data.
So if you have not had any problem yet, define a test with the average
traffic you are expecting and see what happens. I see a pretty good chance
that it will not be much of a problem.
ralph_deffke@yahoo.de
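Ralph's advice to measure rather than guess can be done directly with memory_get_peak_usage(): write a file far bigger than any sane per-request buffer, read one small chunk, and check that peak memory stayed well below the file size. File and chunk sizes below are arbitrary test values:

```php
<?php
// Write an 8 MB file in 64 KB pieces (so the write itself stays cheap),
// then read one 8 KB chunk from the middle and report peak memory.
$tmp = tempnam(sys_get_temp_dir(), 'big');
$fh = fopen($tmp, 'wb');
for ($i = 0; $i < 128; $i++) {
    fwrite($fh, str_repeat('x', 65536));
}
fclose($fh);

$fh = fopen($tmp, 'rb');
fseek($fh, 4 * 1048576);      // jump to the 4 MB mark
$chunk = fread($fh, 8192);    // read an 8 KB chunk
fclose($fh);

printf("file: %d bytes, peak memory: %d bytes\n",
       filesize($tmp), memory_get_peak_usage(true));
unlink($tmp);
```

If the peak stays near the interpreter's baseline rather than near 8 MB, the chunked read never loaded the whole file.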
"Grace Shibley" wrote in message
news:a4d1d5260909011055o55689189n4e42af2e319fb24@mail.gmail.com...
> Are you actually having a problem with memory, or simply that you have
> to transfer it over a network first? Depending on the protocol used, you
> may be able to read it in chunks, but those chunks will still have to be
> copied to the computer that is reading it before it can be processed.
>
> The other option is to run a process in the computer where the file
> resides and only send the results over the network.
>
> Bob McConnell
>
>
> We haven't actually had a problem yet, but we don't want to run a risk of a
> server crash. We want to be able to call this PHP function from a
> standalone application that will get that particular chunk of data specified
> and save it to the local drive.
> But, so far, we have been told that any function we use (fopen/fread,
> file_get_contents) will first load the entire file into memory.
>
> As far as I know then, HTTP doesn't support entering files at points
> specified by a remote user. A request is made for a file, and the server
> determines how to break it up in order to send.
>
> Apparently, with file_get_contents, you can specify an offset and a
> datasize, but it still loads the whole file first. Is this true?
>
>
> On Tue, Sep 1, 2009 at 10:46 AM, Ashley Sheridan
> wrote:
>
> > On Tue, 2009-09-01 at 10:43 -0700, Grace Shibley wrote:
> > > HTTP
> > >
> > > On Tue, Sep 1, 2009 at 10:36 AM, Ashley Sheridan
> > > wrote:
> > >
> > > > On Tue, 2009-09-01 at 10:34 -0700, Grace Shibley wrote:
> > > > > Is there a way to read large (possibly 500 MB) remote files without loading
> > > > > the whole file into memory?
> > > > > We are trying to write a function that will return chunks of binary data
> > > > > from a file on our server given a file location, specified offset and data
> > > > > size.
> > > > >
> > > > > But, we have not been able to get around loading the whole file into memory
> > > > > first. Is there a way to do this??
> > > >
> > > > What sort of remote file is it, i.e. how are you remotely connecting to
> > > > it? FTP, HTTP, SSH?
> > > >
> > > > Thanks,
> > > > Ash
> > > > http://www.ashleysheridan.co.uk
> > As far as I know then, HTTP doesn't support entering files at points
> > specified by a remote user. A request is made for a file, and the server
> > determines how to break it up in order to send.
> >
> > Thanks,
> > Ash
> > http://www.ashleysheridan.co.uk