A friend of mine is asking me for advice on this topic. It is a commercial website, so it cannot be an experimental test.
Here is the scenario.
There is a repository of image files in a folder, and the files are mapped to a database. A user selects files from the front end (think of it as an add-to-cart flow), and once the purchase is complete, he clicks "download all". At that point you can create a zip of the images on the fly and make it available for download (with a timeout so the zip expires after some time). The zip can also be stored in a non-web-accessible location, with a PHP script enforcing access control on the download so that a random user cannot fetch the files just by knowing the link to the zip.
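The token-gated, expiring download described above could look roughly like this. This is only a sketch under assumptions: the `downloads` table, its column names, the `ZIP_DIR` path, and the one-hour TTL are all hypothetical, and the actual schema would come from the existing file-to-DB mapping.

```php
<?php
// Sketch: gate access to a zip stored outside the web root with a
// per-purchase token that expires. The `downloads` table and its
// columns are hypothetical names for this example.

define('ZIP_DIR', '/var/data/zips');  // not web accessible (hypothetical path)
define('DOWNLOAD_TTL', 3600);         // link valid for one hour (assumption)

// Look up a purchase by its download token; returns null if unknown.
function fetchDownload(PDO $db, string $token): ?array
{
    $stmt = $db->prepare(
        'SELECT zip_name, created_at FROM downloads WHERE token = ?'
    );
    $stmt->execute([$token]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row ?: null;
}

// A link is expired once its creation time is older than the TTL.
function isExpired(array $row): bool
{
    return time() - (int) $row['created_at'] > DOWNLOAD_TTL;
}
```

A front controller would then call `fetchDownload()`, reject missing or expired tokens with a 404/410, and only then stream the zip from `ZIP_DIR`; since the folder is outside the document root, there is no direct URL to guess.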
The files are rather big, with individual sizes running right up to 1.5 MB, so PHP timeouts are an issue, especially since raising the max_execution_time limit does not seem to be possible on this host.
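One way around a hard max_execution_time cap is to build the zip once, then stream it to the client in small chunks, calling `set_time_limit()` on each chunk so the timer restarts before it can fire. A minimal sketch, assuming the standard ZipArchive extension is available; `$purchasedFiles` and `$zipPath` are hypothetical names:

```php
<?php
// Sketch: create the zip in a non-web-accessible directory, then stream
// it in chunks. set_time_limit() inside the loop resets the execution
// clock, so a capped max_execution_time only bounds each chunk, not the
// whole transfer.

function buildZip(array $purchasedFiles, string $zipPath): bool
{
    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        return false;
    }
    foreach ($purchasedFiles as $file) {
        $zip->addFile($file, basename($file)); // store under its base name
    }
    return $zip->close();
}

function streamFile(string $path): void
{
    header('Content-Type: application/zip');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="download.zip"');

    $fh = fopen($path, 'rb');
    while (!feof($fh)) {
        echo fread($fh, 8192); // 8 KB chunks
        flush();
        set_time_limit(30);    // restart the clock on every chunk
    }
    fclose($fh);
}
```

Note that `set_time_limit()` is disabled when PHP runs in safe mode, and some shared hosts block it; if that is the case here, offloading the transfer to the web server (e.g. `X-Sendfile`/`X-Accel-Redirect`) is the usual fallback.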
Can anyone think of a better solution?
