Can we download a whole website?

HTTrack - offline browser & downloader

HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility.

It allows you to download a World Wide Web site from the Internet to a local directory,
recursively building all directories and getting HTML, images, and other files from the server
to your computer. HTTrack preserves the original site's relative link structure.

Simply open a page of the "mirrored" website in your browser, and you can browse the site
from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored
site, and resume interrupted downloads.
HTTrack is fully configurable and has an integrated help system.
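The core of what a mirroring tool like HTTrack does is: fetch a page, collect its links, and resolve them so they can be fetched and rewritten locally. A toy sketch of the link-collection step, using only Python's standard library (this is an illustration of the technique, not HTTrack's actual code; the example.com URLs are placeholders):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collect every href/src on a page, resolved to absolute URLs."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                # Resolve relative links against the page's own URL --
                # the same bookkeeping a mirroring tool must do before
                # it can fetch each linked file and rewrite the link
                # to point at the local copy.
                self.links.append(urljoin(self.base_url, value))

page = '<a href="/about.html">About</a> <img src="logo.png">'
collector = LinkCollector("https://example.com/index.html")
collector.feed(page)
print(collector.links)
# → ['https://example.com/about.html', 'https://example.com/logo.png']
```

A real mirror would repeat this for every collected link, saving each file under a matching local directory.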
 
There are plenty of spider programs available via a Google search that will crawl a website and download everything... and I do mean everything. You have to be careful, because they will also download a lot of stuff you don't need, including ads and such.
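One common way crawlers avoid pulling in ads and other junk is to keep only links on the same host as the site being mirrored. A minimal sketch of that filter using Python's `urllib.parse` (the host names here are made-up placeholders):

```python
from urllib.parse import urlparse

def same_site(url, start_url="https://example.com/"):
    """True if url lives on the same host as the site being mirrored,
    so third-party hosts (ad servers, trackers) get skipped."""
    return urlparse(url).netloc == urlparse(start_url).netloc

links = [
    "https://example.com/page2.html",   # keep: same host
    "https://ads.tracker.net/banner",   # skip: third-party ad server
]
print([u for u in links if same_site(u)])
# → ['https://example.com/page2.html']
```

HTTrack exposes the same idea through its scan-rule filters, which let you include or exclude URLs by pattern.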
 
Aman27deep said:
I'm wondering: if someone decided to download Google, how many YB (yottabytes) worth of data might that be?
:lol:

BTW, yes, HTTrack really works. I once downloaded the whole of W3Schools for offline reference.

And don't be misled: it won't be able to download anything from password-protected areas, so if you are thinking of grabbing just any website, forget it now.

And yes, copying websites is possible; Google grew up doing exactly this :p
 
Most of these programs let you choose how many directory levels to crawl, so you can end up with way too many files if you aren't careful :) And no, you can't download protected areas of sites, and I am pretty sure you won't be able to go to the iTunes store and download music. Downloading Google is a bit doubtful too ;)
 