Bash Index : W - The 'W' Bash commands : description, flags and examples




Wipe a signature from a device.

wipefs [options] device


Flag Usage
without -a or -o list all visible filesystems and the offsets of their basic signatures
-a --all erase all available signatures
-b --backup backup the signature into the file $HOME/wipefs-device-offset.bak
-f --force force erasure, even if the filesystem is mounted
this is required in order to erase a partition-table signature on a block device
it is not yet completely clear to me when this option is required; it seems to depend on the situation : primary partition / extended partitions / primary partitions used as LVM PVs / ...
-n --noheadings do not print a header line
-o offset --offset offset erase the signature found at offset bytes from the specified device



For each file passed as argument, count lines, words and bytes :

Useless uses :

You may already know the UUOC. Welcome to the UUOW :
Usage DON'T DO
Count the lines of file (UUOC + UUOW combo !!! ) cat file | wc -l wc -l file
Count the lines of file matching a regex grep -E 'regex' file | wc -l grep -Ec 'regex' file
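A quick check of the table above, using a throwaway file (the name f is arbitrary) :

```shell
# grep -Ec counts matching lines itself, no wc needed
printf 'foo\nbar\nfoo\n' > f
grep -Ec 'foo' f               # prints 2
grep -E 'foo' f | wc -l        # prints 2 too, with one extra process
rm f
```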


Flag Usage
-c --bytes print the byte counts
-m --chars print the character counts
-l --lines print the newline counts
This equals the number of lines only if every line has a trailing \n, which is not always true :
echo -n 'hello world' | wc -l
-w --words print the word counts
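The trailing-\n caveat can be verified directly (printf is used instead of echo -n for portability) :

```shell
# wc -l counts newline characters, not visual lines
printf 'hello world'   | wc -l     # 0 : no trailing newline
printf 'hello world\n' | wc -l     # 1
```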



Send a message to everybody's terminal
Every invocation of this command is logged by syslog.


Flag Usage
(none) Precedes the message with a banner. Bob running :
wall Hello world
will display
Broadcast message from bob@hostname (pts/6) (Mon Jun 15 14:50:32 2015):

Hello world
-n root only : do not display the banner. Instead display :
Remote broadcast message (Mon Jun 15 14:54:38 2015):

Hello world

To set up a chat for local users of a server (source) :

  1. Create a shell script like :
    #!/usr/bin/env bash
    # Simple chat system for local users
    nick=${1?Usage: $0 nickname (e.g. $0 bob)}
    log=/tmp/chat.log               # shared log file ; the path is an example, any world-writable location works
    touch "$log"
    chmod a+w "$log"
    echo "^D to exit chat." >&2
    tail -F "$log" & tailPid=$!
    trap 'kill "$tailPid"' 0
    while IFS='' read -er line; do echo "<$nick> $line"; done >> "$log"
  2. Users may join with : ./chat.sh bob (assuming the script was saved as chat.sh)
  3. Don't forget to purge the $log file at the end



Show who is logged on.


Flag Usage
(void) defaults to -s
-a --all same as -b -d --login -p -r -t -T -u
-b display date and time of latest boot
-d --dead print dead processes
-H --heading display column headers
-l --login list only the entries that correspond to processes via which the system is waiting for a user to login. The user name is always LOGIN.
-m display hostname + user associated with the current stdin
If who is launched with 2 arguments (whatever they are), -m is assumed. Try it :
  • who am i
  • who mom likes
  • who mommy loves
-p --process print active processes spawned by init
-u --users
  • list users logged in + session idle time + session PID
  • idle time can be :
    • hh:mm : duration in hours:minutes
    • . : active in the last minute
    • old : >24h
-r --runlevel display current runlevel
-s --short print only name, line and time
-t --time print last system clock change
-T -w --mesg add user's message status as :
  • + : allowing "write" messages
  • - : disallowing "write" messages
  • ? : can not find terminal device
Run info who for details.



whois is a utility designed to be used on the Internet to query the registrar databases. It outputs the name and contact information of the person / company who owns a domain name.

Looks like this utility is not part of the default Debian install. It can be installed via the whois package.


  • whois example.com
  • whois example.com | less
(example.com is just a placeholder : query any domain name)



GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.

Parallel downloads :

wget can only download one resource at a time. To make parallel downloads :
  • Use the quick-n-dirty "background process" method : wget url1 & wget url2 & wget url3
  • consider tools like aria2 (source)
  • hack it with xargs : xargs -n 1 -P $numberOfParallelDownloads wget -q < "$listOfUrlsToDownload" (source)
  • hack it with parallel : parallel -j $numberOfParallelDownloads "echo -n .; wget -c -q {}" < "$listOfUrlsToDownload"
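The effect of xargs -P can be demonstrated without any network access by substituting sleep for wget (a sketch : 4 one-second jobs instead of 4 downloads) :

```shell
# 4 one-second sleeps run 4 at a time finish in about 1 second, not 4
start=$(date +%s)
printf '1\n1\n1\n1\n' | xargs -n 1 -P 4 sleep
echo "elapsed: $(( $(date +%s) - start ))s"
```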

Using an HTTP proxy (source, details) :

You can store wget settings in :
  • ~/.wgetrc for user-specific settings
  • /etc/wgetrc for system-wide settings
This file may contain :
	http_proxy = http://proxyUser:proxyPassword@proxyHostname:proxyPort/
	use_proxy = on
	wait = 15

Alternate shell-based solution (details) :

export http_proxy="http://proxyUser:proxyPassword@proxyHostname:proxyPort"
Then : wget ...

If proxyPassword contains some special characters such as [SPACE], : or @, you'll have to url-encode them.

How to play with HTTP/1.0 and HTTP/1.1 ?

Use telnet.


Flag Usage
-c HTTP continue getting a partially-downloaded file.
--cache=on|off HTTP When set to off, disable server-side cache. In this case, wget will send the remote server an appropriate directive (Pragma: no-cache) to get the file from the remote service, rather than returning the cached version. Default is on.
--dns-cache=off HTTP Turn off caching of DNS lookups. Normally, wget remembers the addresses it looked up from DNS so it doesn't have to repeatedly contact the DNS server for the same (typically small) set of addresses it retrieves from. This cache exists in memory only; a new wget run will contact DNS again.
--header HTTP Send an additional HTTP header. When using extra headers, make sure the various gateways/proxies on the line don't mangle/remove them.
-E --html-extension HTTP Append a .html extension to the downloaded files whose content is application/xhtml+xml or text/html. Thus, index.php will be saved as index.php.html
--ftp-user=user --ftp-password=password FTP Authenticate as user / password when (... not tested yet)
--http-user=user --http-password=password HTTP Authenticate as user / password when prompted by a ".htaccess restriction". Looks like there is nothing to do with the "realm" (window title / prompt) here.
-i file --input-file=file HTTP Read URLs from file. With --force-html, file will be considered as an HTML file.
-k --convert-links HTTP After the download is complete, convert the links in the document to make them suitable for local viewing. This affects not only the visible hyperlinks, but any part of the document that links to external content, such as embedded images, links to style sheets, hyperlinks to non-HTML content, etc.
--local-encoding=encoding HTTP Use encoding as the default system encoding. That affects how wget converts URLs specified as arguments from locale to UTF-8 for IRI support.
-l depth --level=depth HTTP Specify recursion maximum depth level. Defaults to 5. Use inf for infinite recursion.
--max-redirect=n HTTP Follow at most n redirections for a resource (defaults to 20). (source)
--max-redirect=0 disables redirections following.
-m --mirror HTTP Equivalent to -r -N -l inf -nr, this is used to build a mirror of a web site.
-N --timestamping HTTP Turn on time-stamping.
-np --no-parent HTTP Don't try to climb into the parent directory
--no-check-certificate HTTP Skip certificate check to access a server that has an invalid SSL certificate : self-signed, expired, not from a trusted issuer, ...
-nr --dont-remove-listing FTP Don't remove the temporary .listing files generated by FTP retrievals.
-nv --no-verbose HTTP Turn off verbose output, but error messages and basic information are still displayed. To make it completely silent, consider -q.
-O file HTTP write Output to file. To display the output rather than writing it to a file, specify /dev/stdout, or its shorter synonym : -.
-P /path/to/directory HTTP (AKA "destination directory") Save downloaded content to /path/to/directory (defaults to .)
-p --pages-requisites HTTP Download all the files that are necessary to properly display a given HTML page : inlined images, sounds, and referenced stylesheets. (Somewhat overloads -r + -l used together, see man)
--post-data HTTP Send data using the POST method.
Example : --post-data=""
Data sent by POST must be URL-encoded !
-q --quiet HTTP Turn off output. This is equivalent to -O /dev/null, and faster.
-r --recursive HTTP recursive download : pages mentioned in hyperlinks will be downloaded too.
--remote-encoding=encoding HTTP Use encoding as the default remote server encoding. This can be found in "Content-Type" HTTP header and in "Content-Type http-equiv" HTML meta tag.
--restrict-file-names=unix|windows|ascii HTTP Specify which special characters found in the remote URLs will be escaped so that they comply with local file names. For instance, if using ascii, Wget will escape all non-ascii characters.
-S --server-response HTTP display the server HTTP (or FTP) headers
-T n --timeout=n HTTP Set the network timeout (time to DNS + Connect + Read) to n seconds.
-t n --tries=n HTTP Retry n times. Defaults to 20. 0 and inf will cause infinite retrying.
-U userAgentString HTTP Identify as userAgentString to the HTTP server
--user=user --password=password HTTP Specify credentials (single set of options for both protocols). Can be overridden by --http-user and --ftp-user (and the matching --XXX-password).
-w delay --wait=delay HTTP Wait delay seconds between requests to lighten the load on the web server. delay is in seconds by default, but can be suffixed with m, h or d for minutes, hours or days.
-Y on|off --proxy on|off (really works ?) HTTP Toggle proxy support on or off (default is on). This overrides the wgetrc setting.


Query a web server and display HTTP headers (see also the cURL method) :

wget -q -S

Use HTTP authentication :

wget --header="Authorization: Basic ZGJlcnJlYmlAcHJpc21hbWVkaWEuY29tOnByaXNtYTIwMTE=" ...

  • Authentication should be done so that only one request is necessary. If something's wrong, you may see an HTTP 401 followed by an HTTP 200, i.e. 2 requests instead of 1. Consider using the -S flag while developing.
  • The value after Basic is the Base64 encoding of the string login:password (source)

wget --header="Authorization: Basic "$(printf '%s' login:password | base64) ...
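Beware of echo here : it appends a trailing newline, which ends up inside the encoded credentials. Compare :

```shell
# the newline appended by echo changes the encoded value
printf '%s' login:password | base64     # bG9naW46cGFzc3dvcmQ=
echo login:password | base64            # bG9naW46cGFzc3dvcmQK : the 15th byte (\n) got encoded too
```

Hence printf '%s' (or echo -n) when building the Authorization header.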

Play with the HTTP Host header :

wget -q -S --header="Host:"

This may not work if sent through a proxy, as this proxy may block HTTP requests having custom headers, or return the result of a previously cached request made without the custom header.

Mirror a website (sources : 1, 2) :

To do so :
  • with internal links
  • without leaving the initial domain
  • without climbing into parent directories

wget -k -r -E -np -p --restrict-file-names=ascii URL_of_start_page

Get a complete web page from the command line :

wget -nv -np -p pageUrl

Download the photos from a web gallery (doesn't work with Picasa... yet) :

wget -r --level=1 pageUrl

If wget complains about the certificate :

When wget outputs :
WARNING: The certificate of '' is not trusted.
WARNING: The certificate of '' hasn't got a known issuer.
try the --no-check-certificate flag described above.



whereis locates the binary, source, and manual page files for a command.



Locate a command
which returns the pathnames of the files which would be executed in the current environment. It does this by searching the $PATH for executable files matching the names of the arguments.
which command can also be used to retrieve the actual binary of command (i.e. unaliased) if you have defined an alias named exactly after command. See example.


basic examples

which wget

which bc

Use the unaliased version of a command :

echo                               # just to split pasted commands from their output and make this example more readable
alias echo='echo "ECHO SAYS:"'     # defines an alias for echo named exactly echo, hence the need to discriminate echo (the alias) from /usr/bin/echo
echo 'hello world (with the alias)'
$(which echo) 'hello world (regular echo from which)'
unalias echo
echo 'hello world (unaliased)'
ECHO SAYS: hello world (with the alias)
hello world (regular echo from which)
hello world (unaliased)



Displays the "Short description" section of the given command's man page.


whatis whatis



watch [options] command

Execute command periodically, showing output full screen.

Keep in mind that command is passed to sh -c, so extra quoting may be necessary.


Flag Usage
-d --differences show the differences with the previous run
try it : watch -d date
-g --chgexit exit when the output of command changes
This is blocked by -d.
-n s
  • repeats command every s seconds
  • s ≥ 0.1s (supports both . and , as the decimal separator)
  • defaults to 2s


watch -d "grep -i bluecoat_fan epn_leave-msgs.log"
keeping an eye on the file epn_leave-msgs.log to see when the string bluecoat_fan will appear in it



wait stops script execution until all jobs running in the background have terminated, or until the specified PID terminates.


Proof-of-concept :

This waits 3 seconds :
date; sleep 3 & sleep 1 & wait; date
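The total wait time is that of the longest background job, not the sum of all jobs, which is easy to check :

```shell
# wait returns after ~2 seconds (the longest job), not 3 (the sum)
start=$(date +%s)
sleep 2 & sleep 1 &
wait
echo "waited $(( $(date +%s) - start ))s"
```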