Chapter 37: Bash Downloader (wqet)
First things first — I think you meant wget (pronounced “double-u get”), not “wqet”. It’s a very common typo because the keys are close on the keyboard (q next to w, e next to g).
In almost every Bash/Linux tutorial series (including the ones we’ve been doing), people call it “Bash Downloader (wget)” because wget is the classic non-interactive network downloader — the tool you use when you want to grab files from the internet directly in the terminal, especially for scripts, automation, servers, or when your browser is too slow/heavy.
wget = “World Wide Web get”. It’s a free GNU tool (like curl, but with different strengths) designed mainly for downloading files over HTTP, HTTPS, and FTP, and it does this non-interactively: no clicking “OK” or typing a password for every file after the first setup.
Think of wget as your reliable robot assistant that can:
- Download one file
- Resume broken downloads
- Mirror entire websites
- Run in background
- Work even if your SSH session closes
- Handle slow/unstable internet better than many browsers
curl (which we learned last time) is more flexible for APIs/POST/headers, but wget is often better/faster for plain file downloads and recursive website grabbing.
Step 1: Check if You Have wget (Most Linux Systems Do)
Open terminal and type:
```bash
wget --version
```
You should see something like:
```
GNU Wget 1.21.4 built on linux-gnu.
...
```
If not installed (rare on Ubuntu/Fedora):
```bash
sudo apt update && sudo apt install wget   # Ubuntu/Debian
# or
sudo dnf install wget                      # Fedora
# or
brew install wget                          # macOS with Homebrew
```
Basic Syntax
```bash
wget [options] [URL]
```
No options → simple download to current folder with original filename.
1. Super Basic – Download One File
```bash
wget https://example.com/file.zip
```
→ Downloads file.zip to your current directory.
See progress bar, speed, ETA — very friendly.
2. Save with Custom Name (-O)
```bash
wget -O my_backup.zip https://example.com/backup_123.zip
```
-O = output document (your chosen name)
3. Resume Interrupted Download (-c or --continue)
This is wget’s superpower — browsers often lose partial downloads.
```bash
wget -c https://example.com/bigfile.iso
```
If connection drops → run same command again → continues from where left off!
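For truly flaky connections, you can even wrap `-c` in a retry loop so the re-run happens automatically. A minimal sketch — `download_with_resume` is a made-up helper name, not a wget feature, and the URL is a placeholder:

```shell
#!/usr/bin/env bash
# Retry a download until wget reports success, resuming each
# attempt with -c. download_with_resume is a hypothetical name.
download_with_resume() {
    local url="$1"
    until wget -c "$url"; do
        echo "Interrupted -- retrying in 5 seconds..." >&2
        sleep 5
    done
}

# Usage (commented out; substitute a real URL):
# download_with_resume "https://example.com/ubuntu.iso"
```

The `until` loop keeps calling wget until it exits successfully, so even repeated drops only cost you a short pause, never the bytes already on disk.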
4. Download in Background (-b)
Great for long downloads — close terminal/SSH, it keeps going.
```bash
wget -b https://example.com/huge_dataset.tar.gz
```
→ Creates wget-log file with progress. Check later:
```bash
tail -f wget-log
```
5. Quiet Mode (-q) + No Output
For scripts/cron jobs (no spam in logs):
```bash
wget -q https://example.com/file.txt
```
Add -O - to send output to stdout (so you can pipe it to something):
```bash
wget -qO- https://example.com/script.sh | bash   # dangerous — read first!
```
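Since piping straight into bash runs code you have never seen, a safer pattern is: download to a file, read it, then run it. A sketch — `fetch_review_run`, the temp-file handling, and the URL are all illustrative, not part of wget:

```shell
#!/usr/bin/env bash
# Download a script, open it for review, and only then execute it.
# fetch_review_run is a hypothetical helper name.
fetch_review_run() {
    local url="$1" tmp
    tmp="$(mktemp)" || return 1
    wget -qO "$tmp" "$url" || { rm -f "$tmp"; return 1; }
    "${PAGER:-less}" "$tmp"     # actually read it before running!
    bash "$tmp"
    rm -f "$tmp"
}

# fetch_review_run "https://example.com/script.sh"
```

Same result as the one-liner, but with a pause where you can spot anything suspicious before it executes.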
6. Download Entire Website or Folder (Recursive -r)
Mirror a site (great for offline reading/docs):
```bash
wget -r -np -k https://docs.python.org/3/tutorial/
```
Common useful flags:
- -r → recursive (follow links)
- -np → no parent (don’t go up directories)
- -k → convert links to local (so offline browsing works)
- -l 5 → limit recursion depth to 5 levels
- -p → download prerequisites (images, css, etc.)
Full offline mirror example:
```bash
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://example.com/docs/
```
7. Limit Speed / Bandwidth (--limit-rate)
Don’t kill your internet:
```bash
wget --limit-rate=200k https://example.com/bigvideo.mp4
```
200k = 200 KB/s
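The rate cap combines nicely with -c for downloads that run quietly in the background of a workday. A small sketch — `polite_get` is a hypothetical wrapper name, and the suffixes `k`/`m` mean KB/s and MB/s:

```shell
#!/usr/bin/env bash
# Capped-speed, resumable download that won't hog the connection.
# polite_get is a hypothetical helper name.
polite_get() {
    local rate="$1" url="$2"
    wget -c --limit-rate="$rate" "$url"
}

# polite_get 200k "https://example.com/bigvideo.mp4"    # cap at 200 KB/s
# polite_get 2m   "https://example.com/dataset.tar.gz"  # cap at 2 MB/s
```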
8. Other Handy Options Table
| What you want | Command Example | Why it’s useful in Hyderabad life |
|---|---|---|
| Simple download | wget https://…/file.pdf | Quick grab PDF/notes |
| Resume big download | wget -c https://…/ubuntu.iso | Slow Jio/Airtel → no worry |
| Background long download | wget -b https://…/movie.mkv | Sleep/shutdown PC, still downloads |
| Mirror website for offline | wget -r -k -l inf -p https://tutorial.com | Study without net (power cut season!) |
| Quiet in script/cron | wget -q -O /tmp/update.zip https://… | Auto-backup script |
| Output to stdout (pipe) | wget -qO- https://api.example/data.json | Feed to jq or another tool |
| Retry many times | wget --tries=20 --waitretry=5 https://… | Unstable connection |
| Skip certificate check (insecure!) | wget --no-check-certificate https://selfsigned.com | Old servers (use only if you trust them) |
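Several rows of this table combine naturally into one cron-friendly command: quiet, resumable, with patient retries. A sketch under stated assumptions — `cron_fetch`, the script path, and the URL are all hypothetical:

```shell
#!/usr/bin/env bash
# Quiet, resumable fetch with retries -- suited to cron, where any
# output would end up in mail/logs. cron_fetch is a made-up name.
cron_fetch() {
    local url="$1" dest="$2"
    wget -q -c --tries=20 --waitretry=5 -O "$dest" "$url"
}

# Example crontab entry (nightly at 02:00; script path is hypothetical):
# 0 2 * * * /usr/local/bin/cron_fetch.sh https://example.com/backup.zip /tmp/backup.zip
```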
9. Practice Right Now (Safe Examples)
```bash
# 1. Download small test file
wget https://httpbin.org/image/jpeg -O test.jpg

# 2. See what downloaded
ls -lh test.jpg

# 3. Download GitHub raw file
wget https://raw.githubusercontent.com/torvalds/linux/master/README

# 4. Try resume (simulate by pressing Ctrl+C after it starts)
wget https://speed.hetzner.de/100MB.bin
# Ctrl+C → then run again:
wget -c https://speed.hetzner.de/100MB.bin
```
wget vs curl (Quick Teacher Comparison)
| Feature | wget | curl |
|---|---|---|
| Best for | File downloads, mirroring sites | APIs, POST/PUT, headers, auth |
| Resume support | Built-in (-c) | Needs -C - (--continue-at) |
| Recursive/mirror | Yes (-r, --mirror) | No (needs extra tools) |
| Default output | Saves to file | Prints to screen |
| Background easy? | Yes (-b) | No (needs & or nohup) |
| Typical use today | Classic downloader | Modern API/script favorite |
Both are great — many people use wget for files, curl for everything else.
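To make the resume row of the table concrete, here are the two tools' equivalents side by side (the wrapper names and URL are illustrative):

```shell
#!/usr/bin/env bash
# Resuming a partial download: wget vs curl.
resume_wget() { wget -c "$1"; }
resume_curl() { curl -C - -O "$1"; }   # -C -: auto-detect resume offset

# resume_wget "https://example.com/ubuntu.iso"
# resume_curl "https://example.com/ubuntu.iso"
```

With curl the `-` after `-C` tells it to inspect the existing partial file and continue from its end, which is exactly what wget's `-c` does automatically.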
Got it now? wget is your reliable downloader robot — perfect for big files, unstable net, scripts, and offline copies.
Any confusion? Want wget in a real backup script? Or “how to download YouTube with yt-dlp” (uses similar ideas)? Or next topic like “grep” or “find”?
Tell your Hyderabad teacher — keep downloading smartly! 🐧⬇️😄
