Install & Use wget Command in Linux

Wget is a free network downloader used to download files from a server. Because it is non-interactive, it keeps working in the background even after the user logs out, without interrupting the download in progress. To learn everything about the wget command in Linux, stay with us until the end and review some practical examples of using wget to download files.

Using the wget command, you can start a retrieval, disconnect from the system, and let wget finish the job. Most web browsers, by contrast, demand the user's constant presence, which can be a major barrier when downloading a lot of data.

The wget package is pre-installed on most Linux distributions, so all you need is a Linux machine of your own, such as a Linux VPS, to start taking advantage of the wget command.

All You Need to Know About the Wget Command in Linux

Wget is a software program produced by the GNU Project. GNU Wget is a command-line tool for downloading files from the web, and it can fetch data and files from many different web servers. The name is a mashup of the words "World Wide Web" and "get".

Wget supports downloads over the HTTP, HTTPS, FTP, and FTPS protocols. It is written in portable C and can be used on any Unix system, as well as on AmigaOS, Microsoft Windows, macOS, and other widely used operating systems. If a download fails because of a network issue, wget keeps retrying until the entire file has been retrieved. Wget is designed to be robust over slow or unreliable network connections: if the server supports resuming, wget will ask it to pick up the download where it left off.

Wget can make local copies of remote websites, fully reproducing the directory structure of the original site. It can follow links in HTML and XHTML pages, and it can convert the links in downloaded HTML files so they point to the local copies for offline viewing.

Generally, the basic syntax of Wget is:

$ wget [option] [URL]

For example, to download a webpage, you just need to run:

$ wget http://example.com/sample.php

Wget Command Features

The following features of the wget command make downloading files in Linux easy:

  • Downloads large files and multiple files simultaneously.
  • Limits download bandwidth and speed.
  • Downloads files through proxies (see the example after this list).
  • Supports SSL/TLS for encrypted downloads using the OpenSSL or GnuTLS library.
  • Retries failed downloads.
  • Recursively mirrors directories.
  • Supports downloads over both IPv4 and IPv6.
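
For instance, one common way to route a download through a proxy is to set the standard proxy environment variables that wget honors; the proxy address below is only a placeholder:

$ https_proxy=http://proxy.example.com:3128 wget https://example.com/file.tar.gz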

How to Install Wget in Linux?

While the wget package is pre-installed on most Linux distributions, you can check whether the wget utility is already installed on your Linux box. Just run:

$ rpm -q wget         [On RHEL/CentOS/Fedora and Rocky Linux/AlmaLinux]
$ dpkg -l | grep wget [On Debian, Ubuntu and Mint]

If wget is not installed, use one of the commands below to install it with the package manager that matches your distribution.

$ sudo apt install wget -y      [On Debian, Ubuntu and Mint]
$ sudo yum install wget -y      [On RHEL/CentOS/Fedora and Rocky Linux/AlmaLinux]
$ sudo emerge -a net-misc/wget  [On Gentoo Linux]
$ sudo pacman -Sy wget           [On Arch Linux]
$ sudo zypper install wget      [On OpenSUSE]

As you may know, the -y option is used to avoid asking for confirmation before installing any program.

Most Used Wget Commands in Linux [With Examples]

After a successful wget installation, you are ready to review 19 wget command examples that cover everyday tasks.

1. Download a File with wget

One of the simplest uses of the wget command is to download a single file and place it in the directory where you are currently working.

For example, you can use the following command to get the latest version of WordPress. While downloading, you can view the download progress, size, date, and time.

$ wget https://wordpress.org/latest.zip

As you would expect, running the command above downloads a file named 'latest.zip' into your current working directory.

2. Download Multiple Files with Wget Command

As mentioned in the Wget Command Features section, wget is capable of downloading multiple files at once. To do this, create a text file and place the download URLs in it, one URL per line. The -i option then tells wget to fetch every file listed in that text file.

For example, to download the Arch Linux, Debian, and Fedora ISO files, run:

$ wget -i linux-distros.txt

where linux-distros.txt contains:

http://mirrors.edge.kernel.org/archlinux/iso/2018.06.01/archlinux-2018.06.01-x86_64.iso
https://cdimage.debian.org/debian-cd/current/amd64/iso-cd/debian-9.4.0-amd64-netinst.iso
https://download.fedoraproject.org/pub/fedora/linux/releases/28/Server/x86_64/iso/Fedora-Server-dvd-x86_64-28-1.1.iso

3. Download Multiple Files with HTTP and FTP Protocol Using Wget

To download multiple files over HTTP and FTP with a single wget command, simply list the URLs one after another, as in the following example:

$ wget http://ftp.gnu.org/gnu/wget/wget2-2.0.0.tar.gz ftp://ftp.gnu.org/gnu/wget/wget2-2.0.0.tar.gz.sig

4. Download & Save Files Under Different Names Using Wget

With the help of the -O option, you can save the downloaded file under a different name. Let's take the WordPress example again and save the downloaded archive as wordpress-install.zip instead of its original name:

$ wget -O wordpress-install.zip https://wordpress.org/latest.zip

5. Download Files in the Background Using Wget

Using the wget command in Linux, you can also download files in the background. The -b option is handy for really large files: it lets the download continue in the background while you keep working. Here is its syntax:

$ wget -b http://www.example.com/samplepage.php

Once the download starts, it is sent to the background and its log is written to a file named wget-log in your current working directory.

You can use this file to check the download progress and the latest status.
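
For example, a simple way to watch the progress live is to follow that log file with tail:

$ tail -f wget-log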

6. Resume Download with Wget

This is helpful if your connection drops while you are downloading a large file, because you can continue the previous download rather than starting over. With the -c option, you can resume an incomplete download.

The syntax is:

$ wget -c [URL]

For example, to download the Ubuntu 22.04 ISO file:

$ wget -c https://releases.ubuntu.com/22.04/ubuntu-22.04-desktop-amd64.iso

7. Mirror the Entire Website Using Wget

Here is another use of the wget command in Linux. The following command creates a local copy of a website along with all of its assets, allowing you to download, mirror, or copy a full website for offline viewing.

$ wget --recursive --page-requisites --adjust-extension --span-hosts --convert-links --restrict-file-names=windows --domains yoursite.com --no-parent yoursite.com

To see what each part of the command above does, look at this breakdown:

     --recursive                     Download the whole site.
     --page-requisites               Get all assets/elements (CSS/JS/images).
     --adjust-extension              Save files with the .html extension.
     --span-hosts                    Include necessary assets from other hosts as well.
     --convert-links                 Update links so they still work in the local copy.
     --restrict-file-names=windows   Modify filenames so they also work on Windows.
     --domains yoursite.com          Do not follow links outside this domain.
     --no-parent                     Do not follow links above the directory you pass in.
     yoursite.com/whatever/path      The URL to download.
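
As a side note, a shorter form of roughly the same mirroring recipe uses wget's single-letter options; this is only a sketch and assumes you want everything below example.com:

$ wget -mkEpnp https://example.com/

Here -m enables mirroring (recursion with timestamping), -k converts links, -E adjusts extensions, -p downloads page requisites, and -np keeps wget from ascending to the parent directory.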

8. Limit the Download Speed with Wget

This option is helpful when you don't want wget to eat up all the available bandwidth. Speed is measured in bytes per second unless you add a suffix such as k for kilobytes. The command below caps the download speed at 100 KB/s with --limit-rate=100k, runs in the background with -b, and writes the log to wget.log with -o wget.log:

$ wget -c --limit-rate=100k -b -o wget.log https://releases.ubuntu.com/22.04/ubuntu-22.04-desktop-amd64.iso
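
The rate can also be given in megabytes with the m suffix. For example, to cap the same download at 1 MB/s in the foreground:

$ wget --limit-rate=1m https://releases.ubuntu.com/22.04/ubuntu-22.04-desktop-amd64.iso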

9. Download via FTP with Wget

The same approach also works with FTP. To download a file via FTP, run the command below, specifying the FTP username and password:

$ wget --ftp-user=FTP_USERNAME --ftp-password=FTP_PASSWORD ftp://ftp.example.com/filename.tar.gz

Also, it is possible to download a file from a password-protected HTTP server. To do this, use the options --http-user=username and --http-password=password as shown below.

$ wget --http-user=narad --http-password=password http://http.example.com/filename.tar.gz
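
If you would rather not leave the password in your shell history, wget can prompt for it interactively instead; here is a quick sketch with the same example host:

$ wget --http-user=narad --ask-password http://http.example.com/filename.tar.gz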

10. Using Wget to Skip SSL Certificate Check

The wget command in Linux can also skip SSL certificate checks. Use the --no-check-certificate option if you want to download a file over HTTPS from a host that has an invalid SSL certificate:

$ wget --no-check-certificate https://domain-with-invalid-ss.com

11. Display Version & Help Using Wget

To view the installed version of wget and all of its available command-line options, use --version and --help:

$ wget --version
$ wget --help

12. Downloading to the Standard Output with Wget

You can also have wget write the downloaded file to standard output. For example, the following command downloads the latest version of WordPress and pipes it straight into tar, which extracts it to /var/www:

$ wget -q -O - "http://wordpress.org/latest.tar.gz" | tar -xzf - -C /var/www
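
The same -O - trick is also handy for quickly inspecting a page right in the terminal, for example:

$ wget -qO- http://example.com/ | head -n 20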

13. Using Wget to Change the Wget User-Agent

The remote server may occasionally block the default wget User-Agent when you try to download a file. In such cases, use the -U (--user-agent) option to emulate a different browser.

For example, to emulate Firefox 60 requesting the page from wget-forbidden.com, run:

$ wget --user-agent="Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0" http://wget-forbidden.com/

14. Downloading a File to a Specific Directory with Wget

Wget saves downloaded files by default in the current working directory. You can use the -P option to save the file to a particular location.

For example, to tell wget to save the CentOS 7 iso file to the /mnt/iso directory, type:

$ wget -P /mnt/iso http://mirrors.mit.edu/centos/7/isos/x86_64/CentOS-7-x86_64-Minimal-1804.iso

15. Set Retry Attempts with Wget

Issues with your internet connection may interrupt your download. Use the --tries option to increase the number of retry attempts and work around this problem:

$ wget --tries=100 https://wordpress.org/latest.zip
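
It often makes sense to combine the retries with -c so that each new attempt resumes the partial file instead of starting over:

$ wget -c --tries=100 https://wordpress.org/latest.zip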

16. Download Numbered Files with Wget

You can quickly download a whole set of sequentially numbered files or images by letting the shell expand the number range for you:

$ wget http://example.com/images/{1..50}.jpg
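
The {1..50} part is bash brace expansion rather than a wget feature, so zero-padded sequences work the same way, assuming the remote files are actually named that way:

$ wget http://example.com/images/{01..50}.jpg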

17. Extract Multiple URLs with Wget

To download all the URLs listed in a urls.txt file, put one URL per line:

https://example.com/1
https://example.com/2
https://example.com/3

Then run:

$ wget -i urls.txt

18. Overwrite the Log Using Wget

The -o option writes wget's log messages to the file you specify, overwriting any existing content of that file:

$ wget http://www.example.com/filename.txt -o /path/filename.txt
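
If you would rather keep the logs from earlier runs, the -a option appends to the log file instead of overwriting it; a sketch with the same example URL:

$ wget http://www.example.com/filename.txt -a /path/filename.txt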

19. Find Broken Links Using Wget

The wget program can be used to find all the broken URLs that return 404 errors on a particular website. The syntax is as follows:

$ wget -o wget-log -r -l 5 --spider http://example.com

The above command uses these options:

  • -o: Writes the output to a file (wget-log) for later use.
  • -l: Specifies the level of recursion (5 here).
  • -r: Makes the download recursive.
  • --spider: Puts wget in spider mode, so it checks pages without saving them.

Now look inside the wget-log file to find the list of broken links. Here is a command that extracts them:

$ grep -B 2 '404' wget-log | grep "http" | cut -d " " -f 4 | sort -u

You have reviewed the most commonly used wget command examples. To wrap up, here they all are at a glance.

Wget Command Usage and Syntax

  • Download a File: $ wget https://wordpress.org/latest.zip
  • Download Multiple Files: $ wget -i linux-distros.txt
  • Download Multiple Files with HTTP and FTP Protocol: $ wget http://ftp.gnu.org/gnu/wget/wget2-2.0.0.tar.gz ftp://ftp.gnu.org/gnu/wget/wget2-2.0.0.tar.gz.sig
  • Download & Save Files Under Different Names: $ wget -O wordpress-install.zip https://wordpress.org/latest.zip
  • Download Files in the Background: $ wget -b http://www.example.com/samplepage.php
  • Resume Download: $ wget -c [URL]
  • Mirror the Entire Website: $ wget --recursive --page-requisites --adjust-extension --span-hosts --convert-links --restrict-file-names=windows --domains yoursite.com --no-parent yoursite.com
  • Limit the Download Speed: $ wget -c --limit-rate=100k -b -o wget.log https://releases.ubuntu.com/22.04/ubuntu-22.04-desktop-amd64.iso
  • Download via FTP: $ wget --ftp-user=FTP_USERNAME --ftp-password=FTP_PASSWORD ftp://ftp.example.com/filename.tar.gz
  • Skip SSL Certificate Check: $ wget --no-check-certificate https://domain-with-invalid-ss.com
  • Display Version & Help: $ wget --version and $ wget --help
  • Downloading to the Standard Output: $ wget -q -O - "http://wordpress.org/latest.tar.gz" | tar -xzf - -C /var/www
  • Change the Wget User-Agent: $ wget --user-agent="Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0" http://wget-forbidden.com/
  • Downloading a File to a Specific Directory: $ wget -P /mnt/iso http://mirrors.mit.edu/centos/7/isos/x86_64/CentOS-7-x86_64-Minimal-1804.iso
  • Set Retry Attempts: $ wget --tries=100 https://wordpress.org/latest.zip
  • Download Numbered Files: $ wget http://example.com/images/{1..50}.jpg
  • Extract Multiple URLs: $ wget -i urls.txt
  • Overwrite the Log: $ wget http://www.example.com/filename.txt -o /path/filename.txt
  • Find Broken Links: $ wget -o wget-log -r -l 5 --spider http://example.com

FAQ

What happens if the server does not support resuming a download?

Sometimes the server cannot resume a download. In that case, wget starts the download from the beginning and overwrites the existing file.

What is wget's advantage over curl?

Wget's advantage over curl is its capacity for recursive downloading: it downloads one document and then follows the links in it to download the others.

How can I download a series of numbered files?

To download a series of files, for example Linux kernel versions 5.1.1 through 5.1.15, type:

$ wget https://mirrors.edge.kernel.org/pub/linux/kernel/v5.x/linux-5.1.{1..15}.tar.gz

How do I download a website recursively?

The -r option makes wget recursively follow the links found at the URL given on the command line:

$ wget -r [URL]

How can I make wget send requests less frequently?

Use the --wait option (-w) to pause between retrievals, for example for 15 seconds:

$ wget -w 15 [URL]

Conclusion

In this article, you learned all about the wget command in Linux. We covered wget commands and options for routine administration tasks, with 19 examples showing how to mirror websites, resume interrupted downloads, download several files at once, and combine wget options to suit your needs.

You can also call any of the commands in this guide from scripts and cron jobs. If you encounter any problems, please do not hesitate to contact us; our technical support team will do their best to solve them.
