how to provide download of files not in documentroot
on 28.03.2010 21:30:11 by ebhakt
Hi,
I am writing a web application in PHP. This webapp primarily focuses on file
uploads and downloads. The uploaded files will be saved in a folder which is
not in the document root, and my query is: how will I be able to provide
downloads of such files, not located in the document root, via PHP?
--
Bhaskar Tiwari
GTSE Generalist
Directory Services
Microsoft
____________________________________________________________
All we have to decide is what to do with the time that has been given to us
http://www.ebhakt.com/
http://fytclub.net/
http://ebhakt.info/
Re: how to provide download of files not in documentroot
on 29.03.2010 08:13:42 by Devendra Jadhav
Hey..
Try creating a soft link to the destination folder from the doc root.
I haven't tried it, but give it a try...
On Mon, Mar 29, 2010 at 1:00 AM, ebhakt wrote:
> Hi
> i am writing a web application in php
> this webapp primarily focuses on file uploads and downloads
> the uploaded files will be saved in a folder which is not in document root
> and my query is how will i be able to provide download to such files not
> located in document root via php
>
>
> --
> Bhaskar Tiwari
> GTSE Generalist
> Directory Services
> Microsoft
>
> ____________________________________________________________
> All we have to decide is what to do with the time that has been given to us
>
>
> http://www.ebhakt.com/
> http://fytclub.net/
> http://ebhakt.info/
>
--
Devendra Jadhav
देवेन्द्र जाधव
Re: how to provide download of files not in documentroot
on 29.03.2010 08:19:21 by ebhakt
No, I don't want to create any soft links; that primarily rejects all the
benefits of putting a file outside of the document root.
I want some solution similar to the private file downloads provided by Drupal,
so that PHP provides the download in realtime, and not Apache.
On Mon, Mar 29, 2010 at 11:43 AM, Devendra Jadhav wrote:
> Hey..
>
> Try creating soft link to the destination folder from doc root.
> I haven't tried it but give it a try...
>
>
> On Mon, Mar 29, 2010 at 1:00 AM, ebhakt wrote:
>
>> Hi
>> i am writing a web application in php
>> this webapp primarily focuses on file uploads and downloads
>> the uploaded files will be saved in a folder which is not in document root
>> and my query is how will i be able to provide download to such files not
>> located in document root via php
>>
>>
>> --
>> Bhaskar Tiwari
>> GTSE Generalist
>> Directory Services
>> Microsoft
>>
>> ____________________________________________________________
>> All we have to decide is what to do with the time that has been given to
>> us
>>
>>
>> http://www.ebhakt.com/
>> http://fytclub.net/
>> http://ebhakt.info/
>>
>
>
>
> --
> Devendra Jadhav
> देवेन्द्र जाधव
>
--
Bhaskar Tiwari
GTSE Generalist
Directory Services
Microsoft
____________________________________________________________
All we have to decide is what to do with the time that has been given to us
http://www.ebhakt.com/
http://fytclub.net/
http://ebhakt.info/
Re: how to provide download of files not in documentroot
on 29.03.2010 08:22:01 by Devendra Jadhav
Then you can do file_get_contents() within PHP, or any other file handling
mechanism.
On Mon, Mar 29, 2010 at 11:49 AM, ebhakt wrote:
> No i don't want to create any soft links
> that primarily rejects all the benefits of putting a file outside of
> document root
>
> i want some solution similar to private file downloads provided by drupal
>
> so that the php webserver provides the download and not apache
> in realtime
>
> [snip]
Re: how to provide download of files not in documentroot
on 29.03.2010 14:13:10 by ro0ot.w00t
Top posting sucks, so I'll answer the post somewhere down there.
2010/3/29 Devendra Jadhav
> Then you can do file_get_contents within PHP. or any file handling
> mechanism.
> [snip]
Try something like that:
<?php
$content = file_get_contents($filename);
$etag = md5($content);
header('Last-Modified: '.gmdate('D, d M Y H:i:s',
filemtime($filename)).' GMT');
header('ETag: "'.$etag.'"');
header('Accept-Ranges: bytes');
header('Content-Length: '.strlen($content));
header('Cache-Control: '.$cache_value); // you decide
header('Content-Type: '.$should_be_set);
echo $content;
exit;
?>
Depending on the file size, you should use something other than
file_get_contents() (for example fopen()/fread()). file_get_contents() on a
huge file will exhaust your webserver's RAM.
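The fopen()/fread() alternative can be sketched roughly like this (a minimal illustration; the function name and the 8 KB chunk size are my own choices, not code from the thread):

```php
<?php
// Stream a file to the client in small chunks so the whole file is
// never held in RAM, unlike file_get_contents().
function stream_file($filename, $chunk_size = 8192)
{
    $fp = fopen($filename, 'rb');
    if ($fp === false) {
        return false;
    }
    while (!feof($fp)) {
        echo fread($fp, $chunk_size);
        // under a web SAPI you might flush() here to keep memory flat
    }
    fclose($fp);
    return true;
}
```

Memory stays bounded by the chunk size, whatever the file size is.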
Regards
Re: how to provide download of files not in documentroot
on 29.03.2010 14:40:23 by Nathan Rixham
Jan G.B. wrote:
> [snip]
>
> Depending on the $filesize, you should use something else than
> file_get_contents() (for example fopen/fread). file_get_contents on a huge
> file will exhaust your webservers RAM.
Yup, so you can map the directory in the web server config; then
"allow from" only localhost + yourdomain. This means you can then
request it like a URL and do a HEAD request to get the ETag etc., then
return a 304 Not Modified if you received a matching ETag / Last-Modified
etc. (thus meaning you only file_get_contents() when really, really needed).
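That conditional-request idea can be sketched as follows (an illustration only; the helper names and deriving the ETag from mtime + size are my assumptions, not code from the thread):

```php
<?php
// ETag derived from the file's mtime and size (cheap: no file read needed).
function file_etag($filename)
{
    return '"' . md5(filemtime($filename) . ':' . filesize($filename)) . '"';
}

// Return 304 when the client's If-None-Match value still matches, else 200;
// only in the 200 case would the caller read and send the file body.
function conditional_status($filename, $client_etag)
{
    return ($client_etag === file_etag($filename)) ? 304 : 200;
}
```

In a handler you would pass `$_SERVER['HTTP_IF_NONE_MATCH']` as the second argument and skip the body on 304.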
I'd advise against saying Accept-Ranges: bytes if you don't accept
byte ranges (i.e. you aren't going to send little bits of the file).
If you need the downloads to be secure only, then you could easily
negate PHP altogether and simply expose the directory via a location
so that it is web accessible, and set it up to ask for auth using
htpasswd, a custom script, LDAP or whatever.
And if you don't need security, then why have PHP involved at all? Simply
symlink to the directory, or expose it via HTTP, and be done with the
problem in a minute or two.
Regards!
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: how to provide download of files not in documentroot
on 29.03.2010 14:53:42 by ro0ot.w00t
2010/3/29 Nathan Rixham
> [snip]
>
> And if you don't need security then why have php involved at all? simply
> symlink to the directory or expose it via http and be done with the
> problem in a minute or two.
>
> Regards!
In my opinion, serving user content straight off a production server is wicked
sick. You don't want your visitors to upload malicious files that may trigger
modules such as mod_php in Apache. So it makes sense to store user uploads
outside of the docroot, with no symlink or whatsoever.
One more thing: your RAM will be exhausted even if you open that 600 MB
file just once.
Apache's memory handling is a bit weird: if *one* Apache process is using
200 MB of RAM on *one* impression because your application uses that much, then
that process will not release the memory while it's serving another 1000
requests for `clear.gif`, which is maybe 850 bytes in size.
So better forget file_get_contents() when the file size can be huge. :-)
Regards
Re: how to provide download of files not in documentroot
on 29.03.2010 14:53:43 by Anshul Agrawal
On Mon, Mar 29, 2010 at 6:10 PM, Nathan Rixham wrote:
> [snip]
Also look at readfile() and fpassthru() if dealing with large files.
Moreover, if you have control over the webserver, then you can use PHP only
for authenticating the getFile request and offload the file delivery
operation to your webserver (Apache, nginx, lighttpd) using an "X-SendFile"
header in the response.
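A sketch of that offload (hypothetical: the exact header differs per server — X-Sendfile for Apache's mod_xsendfile, X-Accel-Redirect for nginx — and the helper name and paths here are illustrations, not thread code):

```php
<?php
// Build the response headers that hand the actual file delivery off to
// the webserver: PHP does only the auth check, the server streams the file.
function xsendfile_headers($realpath, $download_name)
{
    return array(
        'Content-Type: application/octet-stream',
        'Content-Disposition: attachment; filename="' . $download_name . '"',
        // Apache mod_xsendfile; nginx would use X-Accel-Redirect with an
        // internal location instead of a filesystem path.
        'X-Sendfile: ' . $realpath,
    );
}
```

In the request handler you would call header() on each line after the authentication check passes, and send no body yourself.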
Best,
Anshul
Re: how to provide download of files not in documentroot
on 29.03.2010 22:41:48 by Nathan Rixham
Jan G.B. wrote:
> [snip]
>
> In my opinion, serving user-content on a production server is wicked sick.
> You don't want your visitors to upload malicious files that may trigger some
> modules such as mod_php in apache. So it makes sense to store user-uploads
> outside of a docroot and with no symlink or whatsoever.
even the simplest of server configurations will ensure safety. Just use
.htaccess to SetHandler default-handler, which treats everything as
static content and serves it right up.
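The one-liner Nathan means would sit in an .htaccess file inside the upload directory, roughly like this (assumes Apache with AllowOverride permitting SetHandler there):

```apache
# Treat every file in this directory as static content, so an uploaded
# evil.php is downloaded as plain bytes instead of being run by mod_php.
SetHandler default-handler
```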
> One more thing added: your RAM will be exhausted even if you open that 600mb
> file just once.
> Apaches memory handling is a bit weird: if *one* apache process is using
> 200mb RAM on *one* impression because your application uses that much, then
> that process will not release the memory while it's serving another 1000
> requests for `clear.gif` which is maybe 850b in size.
again, everything depends on how you have your server configured; you can
easily tell Apache to kill each child after one run, or a whole host of
other configs; but ultimately, if you can avoid opening up that file in
PHP, then do; serving it statically as above is the cleanest, quickest way to
do it (other than using S3 or similar).
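"Kill each child after one run" maps to the prefork MPM directive below (the value 1 matches Nathan's extreme example; real sites would pick something far higher, if they set it at all):

```apache
# Each Apache child exits after serving this many requests, returning any
# memory spike from a big download back to the OS (prefork MPM directive).
MaxRequestsPerChild 1
```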
regards!
Re: how to provide download of files not in documentroot
on 30.03.2010 16:54:08 by ro0ot.w00t
2010/3/29 Nathan Rixham
> [snip]
>
> even the simplest of server configurations will ensure safety. just use
> .htaccess to SetHandler default-handler which treats everything as
> static content and serves it right up.
Yes. But the average persons posting here aren't server-config gods, I
believe.
Also, you cannot implement permissions on these files.
The discussion was about serving files from a place outside any docroot!
Guess there is a reason for that.
> [snip]
>
> again everything depends on how you have your server configured; you can
> easily tell apache to kill each child after one run or a whole host of
> other configs; but ultimately if you can avoid opening up that file in
> php then do; serving statically as above is the cleanest quickest way to
> do it (other than using s3 or similar).
>
> regards!
Sure, you could configure your Apache like that, provided you have little
traffic on your site, because the time-intensive thing for Apache is to
spawn new processes. So it's just not a good idea to do that, nor to serve
big files via file_get_contents().
Regards
Re: how to provide download of files not in documentroot
on 30.03.2010 16:58:18 by Nathan Rixham
Jan G.B. wrote:
> [snip]
>
> Sure, you could configure your apache like that. Unless you have some
> traffic on your site, because the time intensive thing for apache is to
> spawn new processes. So it's just not a good idea to do that, nor to serve
> big files via file_get_contents.
I was only addressing an issue you pointed out.. anyways.. so what do you
propose, exactly? Don't serve via Apache, don't use file_get_contents;
instead do..?
PS: you do realise that virtually every "huge" file on the net is served
via a web server w/o problems, yeah?
Re: how to provide download of files not in documentroot
on 30.03.2010 17:11:52 by ro0ot.w00t
2010/3/30 Nathan Rixham :
> Jan G.B. wrote:
>> 2010/3/29 Nathan Rixham
>>
>>> Jan G.B. wrote:
>>>> 2010/3/29 Nathan Rixham
>>>>
>>>>> Jan G.B. wrote:
>>>>>> Top posting sucks, so I'll answer the post somewhere down there.
>>>>>>
>>>>>>
>>>>>> 2010/3/29 Devendra Jadhav
>>>>>>
>>>>>>> Then you can do file_get_contents within PHP. or any file handling
>>>>>>> mechanism.
>>>>>>>>> On Mon, Mar 29, 2010 at 1:00 AM, ebhakt wrote:
>>>>>>>>>> Hi
>>>>>>>>>> i am writing a web application in php
>>>>>>>>>> this webapp primarily focuses on file uploads and downloads
>>>>>>>>>> the uploaded files will be saved in a folder which is not in
>>> document
>>>>>>>>>> root
>>>>>>>>>> and my query is how will i be able to provide download to such
>>> files
>>>>>>> not
>>>>>>>>>> located in document root via php
>>>>>>>>>>
>>>>>> Try something like that
>>>>>>
>>>>>> =A0 =A0 =A0 =A0 $content =3D file_get_contents($filename);
>>>>>> =A0 =A0 =A0 =A0 $etag =3D md5($content);
>>>>>> =A0 =A0 =A0 =A0 header('Last-Modified: '.gmdate('D, d M Y H:i:s',
>>>>>> filemtime($filename)).' GMT');
>>>>>> =A0 =A0 =A0 =A0 header('ETag: '.$etag);
>>>>>> =A0 =A0 =A0 =A0 header('Accept-Ranges: bytes');
>>>>>> =A0 =A0 =A0 =A0 header('Content-Length: '.strlen($content));
>>>>>> =A0 =A0 =A0 =A0 header('Cache-Control: '.$cache_value); // you decid=
e
>>>>>> =A0 =A0 =A0 =A0 header('Content-type: '.$should_be_set);
>>>>>> =A0 =A0 =A0 =A0 echo $content;
>>>>>> =A0 =A0 =A0 =A0 exit;
>>>>>> ?>
>>>>>>
>>>>>> Depending on the $filesize, you should use something else than
>>>>>> file_get_contents() (for example fopen/fread). file_get_contents on =
a
>>>>> huge
>>>>>> file will exhaust your webservers RAM.
>>>>> Yup, so you can map the in web server config; th=
en
>>>>> "allow from" only from localhost + yourdomain. This means you can the=
n
>>>>> request it like an url and do a head request to get the etag etc then
>>>>> return a 304 not modified if you received a matching etag Last-Modifi=
ed
>>>>> etc; (thus meaning you only file_get_contents when really really
>>> needed).
>>>>> I'd advise against saying you Accept-Ranges bytes if you don't accept
>>>>> byte ranges (ie you aren't going to send little bits of the file).
>>>>>
>>>>> If you need the downloads to be secure only; then you could easily
>>>>> negate php all together and simply expose the directory via a locatio=
n
>>>>> so that it is web accessible and set it up to ask for "auth" using
>>>>> htpasswd; a custom script, ldap or whatever.
>>>>>
>>>>> And if you don't need security then why have php involved at all? sim=
ply
>>>>> symlink to the directory or expose it via http and be done with the
>>>>> problem in a minute or two.
>>>>>
>>>>> Regards!
>>>>>
>>>> In my opinion, serving user-content on a productive server is wicked
>>> sick.
>>>> You don't want your visitors to upload malicous files that may trigger
>>> some
>>>> modules as mod_php in apache. So it makes sense to store user-uploads
>>>> outside of a docroot and with no symlink or whatsover.
>>> even the simplest of server configurations will ensure safety. just use
>>> .htaccess to SetHandler default-handler which treats everything as
>>> static content and serves it right up.
>>>
>>
>> Yes. But the average persons posting here aren't server config gods, I
>> believe.
>> Also, you can not implement permissions on these files.
>> The discussion was about serving files from a place outside any docroot!
>> Guess there is a reason for that.
>>
>>
>>>> One more thing added: your RAM will be exhausted even if you open that
>>> 600mb
>>>> file just once.
>>>> Apaches memory handling is a bit weird: if *one* apache process is usi=
ng
>>>> 200mb RAM on *one* impression because your application uses that much,
>>> then
>>>> that process will not release the memory while it's serving another 10=
00
>>>> requests for `clear.gif` which is maybe 850b in size.
>>> again everything depends on how you have your server configured; you ca=
n
>>> easily tell apache to kill each child after one run or a whole host of
>>> other configs; but ultimately if you can avoid opening up that file in
>>> php then do; serving statically as above is the cleanest quickest way to
>>> do it (other than using s3 or similar).
>>>
>>> regards!
>>>
>>
>> Sure, you could configure your apache like that. Unless you have some
>> traffic on your site, because the time intensive thing for apache is to
>> spawn new processes. So it's just not a good idea to do that, nor to serve
>> big files via file_get_contents.
>
> was only addressing an issue you pointed out.. anyways.. so you propose
> what exactly? don't serve via apache, don't use file_get_contents
> instead do..?
>
> ps you do realise that virtually every "huge" file on the net is served
> via a web server w/o problems yeah?
>
>
I was recommending other file methods such as fopen() combinations,
fpassthru() and, best of all, readfile(). None of them buffers the
whole file in memory.
http://php.net/readfile
http://php.net/fpassthru
Regards
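
A minimal sketch of the readfile() approach, assuming uploads live in a
hypothetical /var/uploads directory outside the docroot and the request
carries a ?file= parameter (both names are illustrative, not from the
thread):

```php
<?php
// Stream a file stored outside the document root without buffering it in RAM.
// The directory and the ?file= parameter are assumptions for illustration.
$storageDir = '/var/uploads';
$name = isset($_GET['file']) ? basename($_GET['file']) : ''; // drop any path parts
$path = $storageDir . '/' . $name;

if ($name === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . $name . '"');

readfile($path); // streams the file in chunks rather than loading it whole
exit;
```

basename() keeps a request from escaping the upload directory via ../
sequences; readfile() is what keeps memory flat even for huge files.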
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: how to provide download of files mow in documentroot
am 30.03.2010 17:27:48 von Nathan Rixham
Jan G.B. wrote:
> I was recommending other file methods like fopen() combinations,
> fpassthru() and at best readfile(). All of them do not buffer the
> whole file in memory.
>
> http://php.net/readfile
> http://php.net/fpassthru
ahh so you were; completely missed that, apologies - readfile's the one,
and good advice.
still keen to point out that if you don't need any other features from
php, then why use php at all when the webserver will do the job perfectly
well - the primary reason for mentioning this is to take advantage of
cache-control / etag / last-modified handling (most php scripts just
return 200 OK repeatedly instead of a 304 Not Modified)
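
The cache-revalidation point can be sketched like this (the path, MIME type,
and the ETag scheme are assumptions; the ETag is derived from path + mtime so
the file never has to be read just to answer a conditional request):

```php
<?php
// Answer conditional requests with 304 Not Modified where possible.
$path  = '/var/uploads/report.pdf';            // example path (assumption)
$mtime = filemtime($path);
$etag  = '"' . md5($path . $mtime) . '"';      // cheap: no file contents read

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('ETag: ' . $etag);

$noneMatch = isset($_SERVER['HTTP_IF_NONE_MATCH'])
    ? trim($_SERVER['HTTP_IF_NONE_MATCH']) : null;
$modSince  = isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    ? strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) : false;

if ($noneMatch === $etag || ($modSince !== false && $modSince >= $mtime)) {
    header('HTTP/1.1 304 Not Modified');
    exit;                                      // client reuses its cached copy
}

header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($path));
readfile($path);
```

The early exit on 304 is where the saving comes from: repeat visitors cost a
stat() and a header exchange, not a file transfer.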
regards
Re: how to provide download of files mow in documentroot
am 30.03.2010 21:37:12 von Anshul Agrawal
On Tue, Mar 30, 2010 at 8:41 PM, Jan G.B. wrote:
> [snip: earlier quoted thread trimmed]
> I was recommending other file methods like fopen() combinations,
> fpassthru() and at best readfile(). All of them do not buffer the
> whole file in memory.
>
> http://php.net/readfile
> http://php.net/fpassthru
>
> Regards
I wanted to see the difference in memory usage between the following three
methods in PHP:
1. readfile
2. fopen followed by fpassthru, and
3. file_get_contents
Using an xdebug trace, all three of them gave the same number. With
memory_get_peak_usage(true), file_get_contents took double the space. (The
file being tested was a mere 4mb in size.)
I can't decide on the best way to profile such methods. Can anybody
suggest one?
Thanks,
Anshul
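
One way to avoid skewing the numbers is to probe each method in a fresh CLI
run and discard the output at the shell; this is a sketch under that
assumption (the probe.php name and usage are hypothetical, not what was
measured above):

```php
<?php
// probe.php -- usage: php probe.php <readfile|fpassthru|file_get_contents> <file> > /dev/null
// Each method runs in its own process so memory_get_peak_usage() is not
// polluted by an earlier test's allocations.
list(, $method, $file) = $argv;

switch ($method) {
    case 'readfile':
        readfile($file);               // streams in fixed-size chunks
        break;
    case 'fpassthru':
        $fp = fopen($file, 'rb');
        fpassthru($fp);                // also streams; one extra fopen/fclose
        fclose($fp);
        break;
    case 'file_get_contents':
        echo file_get_contents($file); // materialises the whole file as a string
        break;
}

fwrite(STDERR, sprintf("%s peak: %.1f MB\n",
    $method, memory_get_peak_usage(true) / 1048576));
```

The peak figure goes to STDERR, so redirecting STDOUT to /dev/null discards
the file contents without hiding the measurement.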
Re: how to provide download of files mow in documentroot
am 30.03.2010 21:42:42 von Nathan Rixham
Anshul Agrawal wrote:
> [snip: earlier quoted thread trimmed]
> I wanted to see the diff between the memory usage of following three methods
> in PHP.
> 1. readfile
> 2. fopen followed by fpassthru, and
> 3. file_get_contents
>
> Using xdebug trace, all three of them gave same number. With
> memory_get_peak_usage(true) file_get_contents took double the space. (file
> being tested was mere 4mb in size)
>
> Unable to decide what is the best way to profile such methods. Can anybody
> suggest?
do it with a huge file and watch top or suchlike; you'll note that
readfile doesn't affect memory whereas file_get_contents does;
fpassthru has an extra couple of calls (fopen/fclose) but that's a
marginal hit.
regards!
Re: how to provide download of files mow in documentroot
am 31.03.2010 09:43:27 von Anshul Agrawal
On Wed, Mar 31, 2010 at 1:12 AM, Nathan Rixham wrote:
> [snip: earlier quoted thread trimmed]
>
>
> do it with a huge file and watch top or suchlike; you'll note that
> readfile doesn't affect memory whereas file_get_contents does;
> fpassthrough has an extra couple of commands (fopen close) but that's a
> marginal hit.
>
> regards!
>
Somehow the max memory usage reported by the system for readfile and
fpassthru is double the file size, which in turn is more than the memory
limit allowed in PHP.
For file_get_contents, PHP throws a memory-limit-exceeded error.
It seems that when the file data is handed over to Apache, Apache's buffer
is what is eating up the memory space reflected in the process list. (I am
using Windows, by the way.)
Thanks for help,
Anshul
Re: how to provide download of files mow in documentroot
am 31.03.2010 11:15:06 von Tommy Pham
On Wed, Mar 31, 2010 at 12:43 AM, Anshul Agrawal wrote:
> [snip: earlier quoted thread trimmed]
>
> Somehow the max memory usage reported by the system for readfile and
> fpassthru is double the file size, which in turn is more than the memory
> limit allowed in PHP.
> For file_get_contents, php throws an memory out of limit exception.
>
> It seems when the file data is handed over to Apache, apache buffer is what
> is eating up the memory space and reflected in process list. (I am using
> Windows by the way)
>
> Thanks for help,
> Anshul
>
Have you read this? http://httpd.apache.org/docs/2.1/caching.html
NOTE: Link implies version 2.1 but doc is for version 2.2.