Failing to download due to file size too big

on 07.12.2007 20:40:00 by Chris

Hi Everyone,

I have been working on an ASP.NET page (using C#) on Windows Server 2003
with IIS 6.0. The site lets the user collect the files they want to download,
and when they have selected everything they want, it zips them up (using the
SharpZipLib library) and sends the archive to the user via response headers.
This works perfectly until the zipped file grows beyond about 60 MB. I created
a log file, and from it I have concluded that once the zipped file reaches
that size, Page_Load gets called again. It does this twice (for a total of
three Page_Load calls), after which it gives up and shows a "Page cannot be
displayed" page (no visible error is thrown). I have also tested downloading a
large file (about 80 MB) without any zipping, and it still fails, which tells
me the zipping mechanism itself is working fine. Something in the download
path gets upset once the file is over roughly 60 MB, zipped or not. Also, when
I run the same site through the Visual Studio 2005 debugger, it works
perfectly for every size I have tried so far (up to probably 150 to 200 MB,
which is as big as a user's cart will likely get on this site). Does anyone
have suggestions, advice, or comments? I need guidance!
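
For reference, the zip step itself is nothing fancy; it is roughly along these
lines (a simplified sketch rather than the exact page code -- BuildZip,
selectedFiles, and zipPath are just placeholder names):

// Sketch: build the archive on disk with SharpZipLib, then hand the
// resulting path to the routine that writes the response headers.
using System.IO;
using ICSharpCode.SharpZipLib.Zip;

private string BuildZip(string[] selectedFiles, string zipPath)
{
    using (ZipOutputStream zipStream = new ZipOutputStream(File.Create(zipPath)))
    {
        zipStream.SetLevel(6);   // compression level, 0 (store) to 9 (best)
        byte[] buffer = new byte[4096];

        foreach (string path in selectedFiles)
        {
            zipStream.PutNextEntry(new ZipEntry(Path.GetFileName(path)));

            using (FileStream source = File.OpenRead(path))
            {
                int read;
                while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                {
                    zipStream.Write(buffer, 0, read);
                }
            }
        }
        zipStream.Finish();      // write the central directory
    }
    return zipPath;              // this path is what gets downloaded afterwards
}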

Thank you,
Chris

Re: Failing to download due to file size too big

on 20.12.2007 19:21:30 by webwizz_in_arab

I did all of these things and it still gives me the problem. I have messed
with the MetaBase.xml file before; I have changed everything that could
possibly affect the file size, and it doesn't fix it. Here is what I use to
download the file -- it's inside a function that takes a string (theFile):

if (!String.IsNullOrEmpty(theFile))
{
    FileInfo finfo = new FileInfo(theFile);

    if (finfo.Exists)
    {
        Response.Clear();
        Response.AddHeader("Content-Disposition", "attachment; filename=" + finfo.Name);
        Response.AddHeader("Content-Length", finfo.Length.ToString());
        Response.ContentType = "application/octet-stream";
        Response.End();
        Response.Clear();
        finfo = null;
    }
    else
    {
        Response.Write("This file does not exist.");
    }
}
else
{
    Response.Write("You must specify a file. Link not working.");
}

Do you see anything wrong with this? Is it still something in the
MetaBase.xml file? Maybe the web.config or machine.config files? I have
played with all of these files and still can't seem to get it to work.

Any more advice?
Or pointers to a site that could possibly answer my question?
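
For what it's worth, the web.config side of what I have been changing is the
httpRuntime timeout, roughly like the snippet below (the values are just
examples, not recommendations). As I understand it, executionTimeout is only
honored when compilation debug="false", which may also be why things behave
differently under the debugger.

<!-- Sketch only: raise the per-request execution timeout (in seconds). -->
<system.web>
  <compilation debug="false" />
  <httpRuntime executionTimeout="3600" />
</system.web>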

On Dec 17, 1:11 pm, THKS wrote:
> These are the exact metabase parameters: "AspMaxRequestEntityAllowed"
> and "AspBufferingLimit".
> Also try to increase the ASP script timeout in the IIS configuration tab to
> avoid timeouts before the download completes.

Re: Failing to download due to file size too big

on 21.12.2007 06:14:59 by David Wang

On Dec 20, 10:21 am, webwizz_in_a...@hotmail.com wrote:
> I did all of these things and it still gives me the problem. I have messed
> with the MetaBase.xml file before; I have changed everything that could
> possibly affect the file size, and it doesn't fix it. Here is what I use to
> download the file -- it's inside a function that takes a string (theFile):
>
> if (!String.IsNullOrEmpty(theFile))
> {
>     FileInfo finfo = new FileInfo(theFile);
>
>     if (finfo.Exists)
>     {
>         Response.Clear();
>         Response.AddHeader("Content-Disposition", "attachment; filename=" + finfo.Name);
>         Response.AddHeader("Content-Length", finfo.Length.ToString());
>         Response.ContentType = "application/octet-stream";
>         Response.End();
>         Response.Clear();
>         finfo = null;
>     }
>     else
>     {
>         Response.Write("This file does not exist.");
>     }
> }
> else
> {
>     Response.Write("You must specify a file. Link not working.");
> }
>
> Do you see anything wrong with this? Is it still something in the
> MetaBase.xml file? Maybe the web.config or machine.config files? I have
> played with all of these files and still can't seem to get it to work.
>
> Any more advice?
> Or pointers to a site that could possibly answer my question?


Perhaps it is because with debug binaries the connection does not time
out, so you can send large amounts of data, while in retail/release code
connections have a shorter timeout (to prevent certain security attacks) --
and you are trying to blast the whole file out sequentially in one go,
which can take a while.

Otherwise, you really want to look at ASP.NET connectivity behavior.
IIS and ASP.NET have no problem sending files multiple gigabytes in
size.
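
For a file that large, one common pattern is to stream it in chunks,
flushing as you go and stopping if the client has disconnected -- roughly
like the sketch below (not your code; it assumes it runs inside a page or
handler, and the method name, buffer size, and variable names are arbitrary):

// Sketch: send the file in chunks instead of buffering the whole response,
// flushing each chunk and bailing out if the client has gone away.
private void StreamFile(string path)
{
    System.IO.FileInfo info = new System.IO.FileInfo(path);

    Response.Clear();
    Response.BufferOutput = false;               // don't hold the whole response in memory
    Response.ContentType = "application/octet-stream";
    Response.AddHeader("Content-Disposition", "attachment; filename=" + info.Name);
    Response.AddHeader("Content-Length", info.Length.ToString());

    byte[] buffer = new byte[64 * 1024];

    using (System.IO.FileStream fs = System.IO.File.OpenRead(path))
    {
        int read;
        while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            if (!Response.IsClientConnected)     // client gave up; stop sending
            {
                break;
            }
            Response.OutputStream.Write(buffer, 0, read);
            Response.Flush();                    // push this chunk out to the client
        }
    }
    Response.End();
}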


//David
http://w3-4u.blogspot.com
http://blogs.msdn.com/David.Wang
//

Re: Failing to download due to file size too big

on 21.12.2007 22:27:00 by Chris

Thanks everyone for your help on this issue!! I have found the solution. In
the code that downloads the file to the user through response headers, I
forgot to flush before writing the file to the user's computer. I had to do this:

Response.AppendHeader("Content-Type:", "application/octet-stream");
Response.AppendHeader("Content-Disposition", "attachment; filename=" +
FInfo.name); // The FInfo variable is an object under the System.IO.FileInfo
Class
Reponse.AppendHeader("Content-Length", FInfo.Length.ToString());
Response.Flush(); // THIS IS WHAT WAS NEEDED FOR IT TO WORK
Response.WriteFile(FInfo.FullName);
Response.End();
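
One follow-up note for anyone who lands on this thread later: since WriteFile
buffers the file through server memory, for really big archives it may be
worth trying Response.TransmitFile instead (available in ASP.NET 2.0), which
streams the file without buffering the whole thing first. A sketch of that
variant, using the same FInfo object:

// Sketch: same headers, but TransmitFile streams the file to the client
// without buffering it all in server memory first.
Response.ContentType = "application/octet-stream";
Response.AppendHeader("Content-Disposition", "attachment; filename=" + FInfo.Name);
Response.AppendHeader("Content-Length", FInfo.Length.ToString());
Response.TransmitFile(FInfo.FullName);
Response.End();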

"David Wang" wrote:

> Perhaps it is because with debug binaries the connection does not time
> out, so you can send large amounts of data, while in retail/release code
> connections have a shorter timeout (to prevent certain security attacks) --
> and you are trying to blast the whole file out sequentially in one go,
> which can take a while.
>
> Otherwise, you really want to look at ASP.NET connectivity behavior.
> IIS and ASP.NET have no problem sending files multiple gigabytes in
> size.
>
>
> //David
> http://w3-4u.blogspot.com
> http://blogs.msdn.com/David.Wang
> //
>