Wget

GNU Wget (or just Wget, formerly Geturl) is a program that retrieves content from web servers and is part of the GNU Project. Its name is derived from the words World Wide Web and get. It supports downloads over the HTTP, HTTPS, and FTP protocols.

Why should I use Wget?

Using the Wget program over SSH at the UNIX command line prompt is a great shortcut for downloading files from a remote server to your DreamHost server.

Using Wget helps you avoid a sometimes painful and slow download process, because it downloads files directly to your DreamHost server. Otherwise, you'd need to download them to your computer and then upload them to your server with an FTP client such as FileZilla, which takes considerably longer.

Wget is a powerful tool, with a lot of options, but even the basics are useful.

rsync may be a better (faster, less complicated) option for users migrating between two rsync-enabled servers (such as moving from DreamHost Shared hosting to DreamHost VPS hosting).
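
For example, a minimal rsync invocation between two servers might look like the following (the hostname, username, and paths are placeholders; both servers must allow SSH access):

[server]$ rsync -avz -e ssh username@old-server.example.com:~/example.com/ ~/example.com/

Here -a preserves permissions and timestamps, -v prints progress, and -z compresses data in transit.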

Basic usage

To use Wget:

  1. Create a shell user in your panel.
  2. Log into your server via SSH.
  3. Type in ‘wget’ followed by the full URL of the file you wish to download. For example, run the following command to download the .tar.gz file for Python version 2.7.7:
    [server]$ wget http://www.python.org/ftp/python/2.7.7/Python-2.7.7.tgz
    • This downloads the .tgz file to the directory you ran the command in.
    • Wget is often used to download compressed files.
  4. If the file you downloaded is compressed, decompress it using gunzip, unzip, or tar to expand and unpack the download, as shown in the example after this list.
  5. If you need to pass variables to a script, enclose the URL in single quotes, which prevents the shell from interpreting the ampersand as a control operator (without the quotes, wget would run in the background and lose everything after the &):
    [server]$ wget 'http://www.example.com/myscript.php?var1=foo&var2=bar'
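
For example, the Python-2.7.7.tgz archive downloaded in step 3 is a gzip-compressed tar file, so a single tar command unpacks it:

[server]$ tar -xzf Python-2.7.7.tgz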

Advanced usage

To create a mirror image of a folder on a different server (with the same structure as the original one), you can use Wget's FTP support to log into that server and transfer the folder recursively:

[server]$ wget -r  ftp://username:password@example.com/folder/*

This command downloads 'folder/' and everything within it while keeping its directory structure. This saves you a lot of time compared to running Wget on each file individually.

To save space, you can simply zip the folder using:

[server]$ zip -r  folder.zip folder

and then clean up by deleting the copy:

[server]$ rm -rf folder

It's a great way to back up your entire website at once, and it's also very helpful when moving large sites between hosts.

For example, use the following command to download the entire contents of example.com (-r turns on recursive retrieval, and -l 0 removes the default five-level recursion depth limit):

[server]$ wget -r -l 0 http://www.example.com/
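
If you only need a specific directory rather than the whole site, Wget's --no-parent option keeps the recursion from climbing above your starting point (the /blog/ path here is just a placeholder):

[server]$ wget -r -l 0 --no-parent http://www.example.com/blog/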

Man page info

To view the manual page for Wget, run the following in your terminal:

[server]$ man wget 
