```bash
for webHost in www.example.com apache.org; do
    for scheme in http https; do
        url="${scheme}://$webHost"
        echo -e "$url"
        curl -sI "$url" | grep '^HTTP/1.1'
    done
    echo
done
```
```
http://www.example.com
HTTP/1.1 200 OK

https://www.example.com
HTTP/1.1 200 Connection established

http://apache.org
HTTP/1.1 301 Moved Permanently

https://apache.org
HTTP/1.1 200 Connection established
```

You may notice:
- different reason phrases
- different responses to HTTP / HTTPS requests
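The `301 Moved Permanently` from `http://apache.org` can be followed automatically with `-L` (detailed in the table below); a quick check, showing every status line along the way:

```bash
# -L: follow the Location: header of the 301 up to the final answer
curl -sIL http://apache.org | grep '^HTTP'
```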
```bash
for url in https://www.example.com https://www.google.com https://www.wikipedia.com https://nginx.org; do
    echo -e "\n$url"
    curl -sI "$url" | grep '^HTTP' | tr ' ' '_'    # 'tr' is there to highlight [SPACE] characters
done
```
```
https://www.example.com
HTTP/1.1_200_Connection_established
HTTP/2_200_

https://www.google.com
HTTP/1.1_200_Connection_established
HTTP/2_204_

https://www.wikipedia.com
HTTP/1.1_200_Connection_established
HTTP/2_301_

https://nginx.org
HTTP/1.1_200_Connection_established
HTTP/1.1_200_OK
```

We can see that:
- every HTTPS request first receives an `HTTP/1.1 200 Connection established` line: this is the proxy answering cURL's `CONNECT` request, not the remote server's response
- the `HTTP/2` status lines end with a `_`, i.e. a [SPACE], since their reason phrases are empty strings
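Since reason phrases vary across servers and are even empty in HTTP/2, scripts should test the status code rather than the full status line; the `%{http_code}` write-out variable (also used in the metrics example further down) gives exactly that:

```bash
# Print only the 3-digit status code, regardless of protocol version or reason phrase
curl -s -o /dev/null -w '%{http_code}\n' https://www.example.com
```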
| Flag | Protocol | Usage |
|---|---|---|
| `-A userAgent`<br>`--user-agent userAgent` |  | Specify the user agent. This can also be done with: `-H "User-Agent: userAgent"` |
| `-b file`<br>`--cookie file` |  | Read saved cookies from *file* and send them to the HTTP server. |
| `-c file`<br>`--cookie-jar file` |  | Write received cookie(s) to *file*. |
| `-C offset`<br>`--continue-at offset` |  | Continue / resume a file transfer at the given position: *offset* bytes will be skipped from the beginning of the source file. Use `-C -` to let cURL find out by itself where / how to resume the transfer. |
| `--compressed` |  | Request a compressed response using one of the algorithms libcurl supports, and return the uncompressed document. This is equivalent to: `-H "Accept-Encoding: gzip, deflate"`. A server that does not support the requested compression method(s) will ignore the header and simply reply with the uncompressed contents. |
| `-D file`<br>`--dump-header file` |  | Write the headers received from the server to *file*. |
| `-d data`<br>`--data data` | HTTP | Send *data* in a POST request to the HTTP server, in a way that emulates a user filling in an HTML form and pressing the submit button. cURL sends this data verbatim and does not urlencode it: encode reserved characters yourself, or use `--data-urlencode`. |
| `-f`<br>`--fail` | HTTP | Fail silently (no output at all) on server errors, and make cURL exit with a non-zero code so that scripts can detect the failure (see the test at the bottom of this page). |
| `-F formFieldName=formFieldValue` | HTTP | Emulate a filled-in form in which a user has pressed the submit button. This causes cURL to POST data using the Content-Type `multipart/form-data` (allowing binary data upload) according to RFC 1867. Use `-F` multiple times to submit several fields. |
| `--ftp-pasv` | FTP | Use the FTP passive mode. |
| `--ftp-ssl` | FTP | Use SSL/TLS for the FTP connection. |
| `-H header`<br>`--header header` | HTTP | Send an extra header with the request, e.g.: `-H "Host: www.example.com"` |
| `-I` (capital i)<br>`--head` |  | Fetch the headers only: makes cURL issue a HEAD request. |
| `-i`<br>`--include` |  | Include the HTTP headers in the output: server name, date of the document, HTTP version, cookies and more. Using both `-i` and `-D` is pleonastic since `-i` == `-D /dev/stdout`, but may collide with further options (like `-o /dev/null`). No big deal having an extra `-i`. |
| `-k`<br>`--insecure` |  | Prevent the connection from failing when the SSL certificate check fails. |
| `--limit-rate bytesPerSecond` |  | Limit the transfer speed (both for uploads and downloads) to *bytesPerSecond* bytes per second. *bytesPerSecond* may be given a `k`/`K`, `m`/`M` or `g`/`G` suffix for kilo / mega / gigabytes per second. |
| `-L`<br>`--location` |  | Follow redirects: when the server replies that the page has moved (3XX status code + `Location:` header), re-send the request to the new location. |
| `--noproxy hosts` |  | Bypass the system proxy for the listed *hosts*. To match all hosts and actually disable the proxy, use the `*` wildcard within single quotes (otherwise it would be substituted by the shell): `--noproxy '*'` |
| `-o file`<br>`--output file` | HTTP | Write output to *file* instead of STDOUT. This allows fetching multiple files at once: `curl -o foo.html http://www.example.com/foo.html -o bar.html http://www.example.com/bar.html` |
| `-q`<br>`--disable` |  | If used as the first parameter on the command line, ignore the `~/.curlrc` config file. |
| `-O`<br>`--remote-name` | HTTP | Write output to a local file named like the remote file. This will write `index.html` locally: `curl -O http://www.example.com/index.html` |
| `--resolve host:port:ip.add.re.ss` |  | Pre-populate the DNS cache with an entry for the *host:port* pair, so that redirects and everything else operating against this pair use the provided *ip.add.re.ss* instead. This looks like a good alternative to playing with HTTP headers during tests (sources: 1, 2). |
| `-s`<br>`--silent` |  | Silent mode: don't show the progress meter or error messages. Makes cURL mute. |
| `-2` / `--sslv2`<br>`-3` / `--sslv3` | HTTP | Force cURL to use SSL version 2 (respectively: 3) when negotiating with a remote SSL server. Both are insecure and obsolete (Prohibiting SSL v2, Deprecating SSL v3) and must NOT be used anymore; consider the `--tlsv1.x` options instead. |
| `-T file`<br>`--upload-file file` |  | Upload *file* to *URL*. If no remote file name is given within *URL*, the local file name will be used. If *URL* doesn't end with a `/`, its last part will be considered as the remote file name. |
| `-u login:password`<br>`--user login:password` | HTTP FTP | Specify the login and password to use for server authentication. |
| `-U login:password`<br>`--proxy-user login:password` | HTTP FTP | Specify the login and password to use for proxy authentication. |
| `-v`<br>`--verbose` |  | Verbose mode. |
| `-x proxy`<br>`--proxy proxy` |  | Use the specified proxy. Format: `protocol://proxyHost:proxyPort` |
| `-X command`<br>`--request command` | HTTP | Specify a custom request method to use instead of the default one (e.g. `PUT`, `DELETE`). |
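A sketch of the `--resolve` trick mentioned above, reusing the sample IP and hostname from the Akamai examples below: the request goes straight to `12.34.56.78`, while the URL and the `Host:` header keep using the real hostname.

```bash
# No DNS lookup, no forced Host: header: cURL connects to the given IP directly
curl -sI --resolve www.example.com:80:12.34.56.78 http://www.example.com/
```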
For the record, here is the progress meter that `-s` silences:

```
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1270  100  1270    0     0  75107      0 --:--:-- --:--:-- --:--:-- 79375
```
Query an Akamai edge server directly, asking for its debug information via the special `Pragma:` headers, while forcing the `Host:` header:

```bash
curl -iso /dev/null -D /dev/stdout \
    -H "Pragma: akamai-x-get-cache-key, akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-get-true-cache-key, akamai-x-get-extracted-values, akamai-x-check-cacheable, akamai-x-get-request-id, akamai-x-serial-no, akamai-x-get-ssl-client-session-id" \
    -H "Host: www.example.com" http://12.34.56.78/
```

The same kind of query, shortened with `akcurl`:

```bash
akcurl -iso /dev/null -D /dev/stdout -H "Host: www.example.com" http://12.34.56.78/
akcurl -iso /dev/null -D /dev/stdout -H "Host: old.example.com" http://example.com.edgesuite.net/
```
Log into a WordPress site and make sure the admin dashboard is reachable (this one is in French, hence the `Se+connecter` submit button and the `Tableau de bord`, i.e. "Dashboard", check):

```bash
siteName='www.example.com'; login='username'; password='password'
baseUrl='http://'$siteName
resultFile='./index.html'
redirectTo='http%3A%2F%2F'$siteName'%2Fwp-admin%2F'
curl -L -iso $resultFile -D /dev/stdout \
    --data "log=$login&pwd=$password&wp-submit=Se+connecter&redirect_to=$redirectTo&testcookie=1" \
    --cookie ./cookie.txt \
    $baseUrl/wp-login \
    && grep --color=auto "Tableau de bord" $resultFile
```
Same as above, but querying the web server `srvXYZ` directly and forcing the `Host:` header (handy when the DNS doesn't point at the server to test):

```bash
serverName='srvXYZ'; siteName='www.example.com'; login='username'; password='password'
baseUrl='http://'$serverName
resultFile='./index.html'
redirectTo='http%3A%2F%2F'$siteName'%2Fwp-admin%2F'
curl -L -iso $resultFile -D /dev/stdout \
    --data "log=$login&pwd=$password&wp-submit=Connexion&redirect_to=$redirectTo&testcookie=1" \
    --cookie ./cookie.txt \
    --header "Host: $siteName" \
    $baseUrl/wp-login.php
```
Same idea again, this time as a 3-step session: load the login page (saving the cookies), POST the credentials, then fetch a protected page and grep its content:

```bash
hostnameFake='my.productionSite.com'; hostnameReal='serverName'
resultFile='./result.txt'; cookieFile='./cookie.txt'
# 1. get the login page, saving the session cookie
curl -iso $resultFile -D /dev/stdout --cookie-jar $cookieFile \
    -H "Host: $hostnameFake" http://$hostnameReal/loginPage
# 2. post the credentials
curl -iso $resultFile -D /dev/stdout --data 'login=admin&password=secret' \
    --cookie $cookieFile --cookie-jar $cookieFile \
    -H "Host: $hostnameFake" http://$hostnameReal/loginCredentialsCheck
# 3. fetch a protected page and check its content
curl -iso $resultFile -D /dev/stdout --cookie $cookieFile \
    -H "Host: $hostnameFake" http://$hostnameReal/contentPage
grep --color=auto "some text" $resultFile
```
Display request metrics thanks to `-w`:

```bash
curl -s -o /dev/null -iD /dev/stdout \
    -w "Effective URL :\t\t%{url_effective}\nContent-type :\t\t%{content_type}\nHTTP CODE :\t\t%{http_code}\nDNS lookup duration :\t%{time_namelookup} s\nConnect duration :\t%{time_connect} s\nTTFB :\t\t\t%{time_starttransfer} s\nTotal time :\t\t%{time_total} s\nDownloaded :\t\t%{size_download} bytes\n" \
    http://www.example.com
```
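Such a long format string can also be stored in a file and reused with `-w @file` (the file name below is just an example):

```bash
# Save the write-out format once (real newlines in the file are output as-is)…
cat > ./curl-timings.fmt << 'EOF'
TTFB : %{time_starttransfer} s
Total time : %{time_total} s
EOF
# …then reuse it at will
curl -s -o /dev/null -w @./curl-timings.fmt http://www.example.com
```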
Connect to an FTP server over SSL/TLS in passive mode (with no file argument, this lists the given directory):

```bash
curl --insecure --ftp-ssl --ftp-pasv --user "login:password" "ftp://ftpHost:ftpPort/path/to/upload/directory/"
```

The trailing `/` seems to be mandatory.
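To actually upload, add `-T` (see the table above). A sketch, with a hypothetical local file name:

```bash
# Upload ./localFile.txt to the remote directory; since the URL ends
# with a /, the local file name is kept on the remote side
curl --insecure --ftp-ssl --ftp-pasv --user "login:password" \
    -T ./localFile.txt "ftp://ftpHost:ftpPort/path/to/upload/directory/"
```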
Testing how `-s` and `-s -f` behave for various HTTP status codes:

```bash
fieldSeparator='|'
for curlOption in '-s' '-s -f'; do
    echo -e "\ncURL option : '$curlOption'"
    for httpStatusCode in 200 204 301 302 304 400 401 403 404 410 501 503; do
        echo -ne "$httpStatusCode$fieldSeparator\""
        curl $curlOption "https://httpstat.us/$httpStatusCode" && answer='\e[1;32mOK\e[0m' || answer='\e[1;31mKO\e[0m'
        echo -e "\"$fieldSeparator$answer"
    done | column -s "$fieldSeparator" -t
done
```
```
cURL option : '-s'
200  "200 OK"                   OK
204  ""                         OK
301  "301 Moved Permanently"    OK
302  "302 Found"                OK
304  ""                         OK
400  "400 Bad Request"          OK
401  "401 Unauthorized"         OK
403  "403 Forbidden"            OK
404  "404 Not Found"            OK
410  "410 Gone"                 OK
501  "501 Not Implemented"      OK
503  "503 Service Unavailable"  OK

cURL option : '-s -f'
200  "200 OK"                   OK
204  ""                         OK
301  "301 Moved Permanently"    OK
302  "302 Found"                OK
304  ""                         OK
400  ""                         KO
401  ""                         KO
403  ""                         KO
404  ""                         KO
410  ""                         KO
501  ""                         KO
503  ""                         KO
```
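The `-s` run shows that, without `-f`, cURL considers any well-formed HTTP reply a success. With `-f` it exits non-zero (error 22) on HTTP codes ≥ 400, which is what makes it script-friendly; a minimal sketch:

```bash
# With -f, cURL exits with code 22 on HTTP >= 400, so the || branch runs
curl -sf -o /dev/null "https://httpstat.us/503" || echo "request failed (exit code $?)"
```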