Multiple large file downloads in bulk

Safin

A friend of mine is asking me for advice on this topic. This is a commercial website, so it cannot be an experimental test :)

Here is the scenario.
There is a repository of files (images) in a folder, and the files are mapped to a DB. A user can select files from the frontend (think of it as an add-to-cart) and, once he has completed his purchase, click "download all". Now you can create an on-the-fly zip of the images and make the zip file available for download (with a timeout to expire the zip after some time). The zip can also be created in a non-web-accessible location, with PHP enforcing the download so that a random user can't fetch the files with a direct link to the zip.
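Roughly this is what I have in mind, just as a sketch (the session key, paths and file names below are placeholders, not the real setup):

<?php
// download.php -- sketch of the on-the-fly zip + protected download idea
session_start();

// Files the user "purchased" in this session (absolute paths outside the web root)
$files   = $_SESSION['purchased_files'];          // placeholder key
$zipDir  = '/data/zips';                          // non-web-accessible folder (placeholder)
$zipPath = $zipDir . '/' . session_id() . '.zip';

// Build the zip on the fly
$zip = new ZipArchive();
if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('Could not create the archive');
}
foreach ($files as $file) {
    $zip->addFile($file, basename($file));
}
$zip->close();

// Serve it through PHP so the real location is never exposed
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="images.zip"');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);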

The files are rather big, with individual sizes going right up to 1.5 MB, so PHP timeouts are an issue, especially since there seems to be a problem with raising the max_execution_time limit.

Can anyone think of a better solution?
 
Can you put it in an FTP folder and provide the user a time-bound FTP account? Let the user download the files within, say, 3 days, and expire the account afterwards.

You can host the FTP server somewhere else as well, reducing the transfer load on the web server.
 
I don't know if this will work in your case, but I have something like this planned for a website.

Instead of relying on the PHP timeout method, why don't you add a field to the database and append it to the URL? When the user accesses the link, check against the database to see whether the code is still valid, based on when it was created and how many hours/days it is valid for. If it is still valid, serve the person the file.
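In rough code, something like this (sketch only; the table and column names are made up):

<?php
// get.php?code=XYZ -- sketch; "downloads", "code", "file_path", "valid_hours" are invented names
$pdo  = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$code = isset($_GET['code']) ? $_GET['code'] : '';

// Look up the download code and make sure it hasn't expired yet
$stmt = $pdo->prepare(
    'SELECT file_path FROM downloads
     WHERE code = ? AND DATE_ADD(created_at, INTERVAL valid_hours HOUR) > NOW()'
);
$stmt->execute(array($code));
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$row) {
    header('HTTP/1.1 410 Gone');
    exit('This download link has expired.');
}

// Still valid: hand over the file
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="' . basename($row['file_path']) . '"');
header('Content-Length: ' . filesize($row['file_path']));
readfile($row['file_path']);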
 
^^ Without a cron deleting all the redundant files, that will lead to massive redundancy in a short time, as every user may pick a different combination for their "basket". Your solution would be fine for a single-file download; here we are talking about baskets of multiple files which vary from user to user, and getting that to work is a hassle. Simply put, packaging the files will invariably lead to a timeout. If it were just one file being served all the time, I agree the issue would be a non-issue.
 
Did you say 1.5 MB? That's quite small, actually. Have you tried it on another server? A capable one, I mean...

A client of mine runs a stock animation website with similar features. The downloads vary between 15 MB and 100 MB (max). The lightbox is zipped and kept in a directory with a hashed file name.

Once the zip is generated, the link is active for 24 hours, after which it expires. The deletion is done every 3 days, sometimes sooner depending on the directory size.
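The hashed name is nothing fancy; something along these lines (just an illustration, not the client's actual code):

<?php
// Sketch: give the generated zip an unguessable file name ($orderId is a placeholder)
$zipName = hash('sha256', $orderId . microtime(true) . mt_rand()) . '.zip';
$zipPath = '/data/zips/' . $zipName;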
 
OK, can you clarify this a bit? Where does the PHP timeout come from? Are you saying that zipping the files takes too long and, as a result, the PHP code times out?
 
Did you say 1.5 MB? That's quite small, actually. Have you tried it on another server? A capable one, I mean...

The server obviously isn't capable enough. Otherwise I would just raise the max execution time and let the script zip the files up over a longer period.

Btw this is a stock photo download site too.
 
OK, so this zipping that you do - are you concerned about size/bandwidth? Or are you just really concerned with consolidation, i.e. the ability to download a _single_ file as opposed to 10 separate downloads?
 
See, the problem is two-fold.

The major one is the zip timing out, or the file getting corrupted, at the script level. I can avoid that by restricting the number of images zipped in one batch and splitting them over multiple zip files.
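For example (sketch; the 20-images-per-zip cap and the paths are arbitrary):

<?php
// Sketch: split the basket into batches so no single zip runs long enough to time out
$files   = $_SESSION['basket'];                    // placeholder: full paths of the selected images
$batches = array_chunk($files, 20);                // 20 images per archive -- arbitrary cap

foreach ($batches as $i => $batch) {
    $zip = new ZipArchive();
    $zip->open('/data/zips/basket_part' . ($i + 1) . '.zip',
               ZipArchive::CREATE | ZipArchive::OVERWRITE);
    foreach ($batch as $file) {
        $zip->addFile($file, basename($file));
    }
    $zip->close();
}
// The user then gets links to part1..partN instead of one big download.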

The other obvious issue is that there is no "cache" as such that I can serve to the end user, since everyone's basket is different.

I am going to advise him to move to a better dedicated server. I can't think of a better solution than the one mentioned in the first post anyway.
 
1) You know the link IDs of the images in the database

2) With the user's custom selection, make a folder named after the session

3) Copy the files to the session folder

4) Convert it to a zip and create a downloadable link which lasts only for that session

5) Delete the custom folder once the session is killed; this way you don't have redundancy (see the sketch below)
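Roughly like this (just a sketch; the directories and session keys are examples only):

<?php
// Sketch of steps 2-5; '/data/tmp', '/data/zips' and $_SESSION['selected_files'] are placeholders
session_start();

$sessionDir = '/data/tmp/' . session_id();         // step 2: per-session folder
if (!is_dir($sessionDir)) {
    mkdir($sessionDir, 0700, true);
}

foreach ($_SESSION['selected_files'] as $file) {   // step 3: copy the user's selection
    copy($file, $sessionDir . '/' . basename($file));
}

$zipPath = '/data/zips/' . session_id() . '.zip';  // step 4: zip it up
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach (glob($sessionDir . '/*') as $file) {
    $zip->addFile($file, basename($file));
}
$zip->close();

// Step 5 would go in a logout / session-destroy handler:
// array_map('unlink', glob($sessionDir . '/*')); rmdir($sessionDir); unlink($zipPath);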
 
My 5 cents...
Give some thought to separating the PHP request and the ZIP process.
i.e. Let the user specify the files required and accept the request on the server; the user's PHP/HTTP session should end there.
Then start an independent process to perform the zip, and once it is complete, let the user know the location of the zip via mail/message.
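A minimal sketch of that hand-off (all the file names, paths and the mail text here are assumptions):

<?php
// request.php -- sketch: write a job file and hand the heavy work to a background process
$jobId = uniqid('job_', true);
file_put_contents('/data/queue/' . $jobId . '.json', json_encode(array(
    'files' => $_POST['files'],                   // placeholder: list of selected file paths
    'email' => $_POST['email'],
)));

// Fire the worker and return to the user immediately -- the HTTP request ends here
exec('php /var/www/scripts/zip_worker.php ' . escapeshellarg($jobId) . ' > /dev/null 2>&1 &');
echo 'Your zip is being prepared; we will mail you the download link shortly.';

<?php
// zip_worker.php -- runs on the CLI, so the web max_execution_time doesn't apply
$jobId = $argv[1];
$job   = json_decode(file_get_contents('/data/queue/' . $jobId . '.json'), true);

$zipPath = '/data/zips/' . $jobId . '.zip';
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach ($job['files'] as $file) {
    $zip->addFile($file, basename($file));
}
$zip->close();

mail($job['email'], 'Your download is ready',
     'Fetch it here: http://example.com/get.php?code=' . $jobId);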
 
Safin said:
See, the problem is two-fold.

I am going to advise him to move to a better dedicated server. I can't think of a better solution than the one mentioned in the first post anyway.

Did you try tracing how long it takes to zip 10 MB worth of files? Run it a few times and see how much time it takes. Even a good VPS should be able to handle it. You can put all of these in a queue and make the user wait until his download is available (zipped).
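The "wait" page can be as simple as this (sketch; status.php, get.php and the paths are just example names):

<?php
// status.php?job=XYZ -- sketch: the page the user sits on until the queued zip shows up
$jobId   = basename($_GET['job']);                // basename() so nobody sneaks in a path
$zipPath = '/data/zips/' . $jobId . '.zip';

if (file_exists($zipPath)) {
    echo '<a href="get.php?code=' . htmlspecialchars($jobId) . '">Your zip is ready</a>';
} else {
    header('Refresh: 10');                        // re-check every 10 seconds
    echo 'Still zipping, please wait...';
}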

shotgun said:
My 5 cents...

Give some thought to separating the PHP request and the ZIP process.

i.e. Let the user specify the files required and accept the request on the server; the user's PHP/HTTP session should end there.

Then start an independent process to perform the zip, and once it is complete, let the user know the location of the zip via mail/message.

Similar functionality is provided on my client's site too.

If the file size is less than 20 MB and the server is not loaded, the zip is generated immediately and made available for download.

If it is greater than 20 MB, it's placed in a queue, which a cron job then works through.

Once generated, files are available for 24 hours, and a cron job takes care of deleting files older than 24 hours.
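The deletion side is basically just this, run from cron (the directory is site-specific; this is only an illustration, not my client's actual script):

<?php
// cleanup.php -- run hourly from cron; deletes generated zips older than 24 hours
$zipDir = '/data/zips';                           // placeholder: where the generated zips live
$cutoff = time() - 24 * 3600;

foreach (glob($zipDir . '/*.zip') as $file) {
    if (filemtime($file) < $cutoff) {
        unlink($file);
    }
}

With a crontab entry along the lines of "0 * * * * php /var/www/scripts/cleanup.php".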
 