There are many times when you want to download a complete website, whether it is an educational site or an informational blog. On a Linux-based OS you can do that with the help of a pre-installed command-line application called “wget”.
If you’ve watched the movie “The Social Network”, which is based on Mark Zuckerberg, you might have noticed Mark saying this word while he was busy building “Facemash”. It is the same “wget”.
According to the “man-page” of “wget”:
GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies. Wget is non-interactive, meaning that it can work in the background, while the user is not logged on. This allows you to start a retrieval and disconnect from the system, letting Wget finish the work. By contrast, most of the Web browsers require constant user's presence, which can be a great hindrance when transferring a lot of data.
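Because it is non-interactive, you can start a download, disconnect, and check on it later. As a quick sketch (the URL is a placeholder), the -b flag sends wget to the background and, by default, appends its progress to a file named wget-log in the current directory:

wget -b www.example.com
tail -f wget-log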
You can certainly download a single web page with wget, which is what it does by default, but our focus here is downloading the many web pages linked from that page.
To do this, open your Terminal, move to the directory where you want to save the result, and type the following command:

wget -r www.example.com
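The -r flag tells wget to follow links recursively. In practice a few extra options make the local copy more usable. The following is only a sketch (the URL is a placeholder, and all flags shown are standard wget options):

wget -r --level=5 --no-parent --convert-links --page-requisites --wait=1 www.example.com

Here --level=5 caps the recursion depth (5 is wget’s default), --no-parent keeps the crawl from climbing above the starting directory, --convert-links rewrites links so the saved copy browses properly offline, --page-requisites also fetches the images, CSS, and JavaScript each page needs, and --wait=1 pauses a second between requests so the server isn’t hammered.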