Proxy settings | Result |
---|---|
http_proxy=http://proxyHost:proxyPort https_proxy=https://proxyHost:proxyPort | 141 exit code, and the message : * additional stuff not fine transfer.c:1037: 0 0 * SSL read: error:00000000:lib(0):func(0):reason(0), errno 104 * Closing connection #0 |
http_proxy=http://proxyHost:proxyPort https_proxy=http://proxyHost:proxyPort | 141 exit code |

What does the message * additional stuff not fine transfer.c:1037: 0 0 mean ? Answer from a cURL developer (source) :
> That's a debug output I put there once to aid me debugging a transfer case I had problems with and I then left it there. It is only present in debug builds and basically, if you need to ask about it, the message is not for you... It just explains that the transfer is not deemed complete yet. You need to look at protocol and TCP details to see what's going on.

Failure with receiving network data : https://stackoverflow.com/questions/10285700/curl-error-recv-failure-connection-reset-by-peer-php-curl#10349895
https_proxy=http://login:password@proxy.company.tld:port curl
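For the assignment to reach cURL's environment, it must prefix the command on the same line (a ; in between would break it), or be exported first. A minimal sketch, with www.example.com standing in for the real URL :
```
# one-shot : the variable only applies to this single curl invocation
https_proxy=http://login:password@proxy.company.tld:port curl https://www.example.com

# or export it once for the whole shell session
export https_proxy=http://login:password@proxy.company.tld:port
curl https://www.example.com
```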
* Connection #0 to host www.example.com left intact
* Closing connection #0
* Closing connection #0
curl: (52) Empty reply from server
curl: (35) Unknown SSL protocol error in connection to 12.34.56.78:443
curl: (7) couldn't connect to host
(UNKNOWN) [12.34.56.78] 80 (http) : Connection timed out
(UNKNOWN) [12.34.56.78] 443 (https) open
Confirmed ! Empty response / no data sent from server, or something like that, with another tool such as wget or the Web Developer Firefox extension.
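The two (UNKNOWN) [12.34.56.78] lines above look like traditional netcat -v output; presumably the ports were probed with something like this (a sketch — flags vary slightly between netcat flavors) :
```
# probe both ports with a 5-second timeout
nc -v -w 5 12.34.56.78 80
nc -v -w 5 12.34.56.78 443

# cross-check the "empty reply" symptom with wget (-S prints the response headers)
wget -S -O /dev/null https://12.34.56.78/
```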
Flag | Protocol | Usage |
---|---|---|
-A userAgent --user-agent userAgent | | Specify the user agent. This can also be done with : -H "User-Agent: userAgent" |
-b file --cookie file | | Read saved cookies from file and send them to the HTTP server. |
-c file --cookie-jar file | | Write received cookie(s) to file |
-C offset --continue-at offset | | Continue / resume a file transfer at the given position : offset bytes will be skipped from the beginning of the source file. Use -C - to tell cURL to automatically find out where/how to resume the transfer. |
--compressed | | Request a compressed response using one of the algorithms libcurl supports, and return the uncompressed document. This is equivalent to : -H "Accept-Encoding: gzip, deflate". A server that does not support the requested compression method(s) will ignore the header and simply reply with the contents uncompressed. |
-D file --dump-header file | | Dump protocol headers to file |
-d data --data data | HTTP | Send data in a POST request to the HTTP server, emulating a user who has filled in an HTML form and pressed the submit button. cURL sends this data as-is : it does NOT URL-encode it for you (use --data-urlencode for that). |
-F formFieldName=formFieldValue | HTTP | Emulate a filled-in form in which a user has pressed the submit button. This causes cURL to POST data using the Content-Type multipart/form-data (which allows uploading binary data) according to RFC 1867. Use -F multiple times to submit several fields. |
--ftp-pasv | FTP | Use the FTP passive mode. |
--ftp-ssl | FTP | Use SSL/TLS for the FTP connection. |
-H header --header header | HTTP | Provide an extra header when getting a web page. When using extra headers, make sure the various gateways/proxies on the line don't mangle/remove them. |
-I (capital i) --head | HTTP | Fetch the HTTP headers only, with the HTTP command HEAD. |
-i --include | HTTP | Include the HTTP headers in the output : server name, date of the document, HTTP version, cookies and more. Using both -i and -D is pleonastic since -i == -D /dev/stdout, but may collide with further options (like -o /dev/null); no big deal having an extra -i. |
-k --insecure | | Prevent the connection from failing when the SSL certificate check fails. |
--limit-rate bytesPerSecond | | Limit the transfer speed (both for uploads and downloads) to bytesPerSecond bytes per second. bytesPerSecond can take a k/K, m/M or g/G suffix for kilo/mega/gigabytes per second. |
-L --location | HTTP | Follow HTTP redirects : when the server replies with a 3XX code and a Location header, replay the request on the new URL. |
--noproxy hosts | | Bypass the system proxy for the listed hosts. To match all hosts and actually disable the proxy, use the * wildcard within single quotes (otherwise it will be substituted by the shell) : --noproxy '*' |
-o file --output file | HTTP | Write output to file instead of stdout. This allows fetching multiple files at once : curl -o foo.html http://www.example.com/foo.html -o bar.html http://www.example.com/bar.html |
-q --disable | | If used as the first parameter on the command line, ignore the ~/.curlrc config file |
-O --remote-name | HTTP | Write output to a local file named like the remote file. This will write index.html locally : curl -O http://www.example.com/index.html |
--resolve host:port:ip.add.re.ss | | Pre-populate the DNS cache with an entry for the host:port pair, so that redirects and everything that operates against this pair will use the provided ip.add.re.ss instead. This looks like a good alternative to playing with HTTP headers during tests. (sources : 1, 2) |
-s --silent | | Silent mode. Don't show progress meter or error messages. Makes cURL mute. |
-2 --sslv2 -3 --sslv3 | HTTP | Force cURL to use SSL version 2 (respectively : 3) when negotiating with a remote SSL server. Both are insecure and obsolete (Prohibiting SSL v2, Deprecating SSL v3) and must NOT be used anymore. Consider the --tlsv1.x options instead. |
-T file --upload-file file | HTTP FTP | Upload file to URL. If no remote file name is given within URL, the local file name will be used. If URL doesn't end with a /, its last part will be considered as the remote file name. |
-u login:password --user login:password | HTTP FTP | Specify login and password to use for server authentication |
-U login:password --proxy-user login:password | HTTP FTP | Specify login and password to use for proxy authentication |
-v --verbose | | Verbose mode |
-x proxy --proxy proxy | | Use the specified proxy. Format : protocol://proxyHost:proxyPort |
-X command --request command | HTTP | Use command as the custom request method instead of the default one (e.g. -X DELETE). |
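A couple of examples combining the flags above (the URLs and file names are placeholders) :
```
# follow redirects, resume an interrupted download, cap the bandwidth,
# and name the local file after the remote one
curl -L -C - --limit-rate 100k -O http://www.example.com/big.iso

# POST two form fields (multiple -d values are joined with &)
# and dump the response headers to a file
curl -d "login=foo" -d "pwd=bar" -D headers.txt http://www.example.com/form
```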
This is the progress meter that -s silences :
```
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1270  100  1270    0     0  75107      0 --:--:-- --:--:-- --:--:-- 79375
```
curl -iso /dev/null -D /dev/stdout -H "Pragma: akamai-x-get-cache-key, akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-get-true-cache-key, akamai-x-get-extracted-values, akamai-x-check-cacheable, akamai-x-get-request-id, akamai-x-serial-no, akamai-x-get-ssl-client-session-id" -H "Host: www.example.com" http://12.34.56.78/
akcurl -iso /dev/null -D /dev/stdout -H "Host: www.example.com" http://12.34.56.78/
akcurl -iso /dev/null -D /dev/stdout -H "Host: old.example.com" http://example.com.edgesuite.net/
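akcurl is not a standard command : it is presumably a local alias wrapping the Akamai debug headers from the first command, something like (an assumption, not a documented tool) :
```
# hypothetical alias reproducing the akcurl shorthand used above
alias akcurl='curl -H "Pragma: akamai-x-get-cache-key, akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-get-true-cache-key, akamai-x-get-extracted-values, akamai-x-check-cacheable, akamai-x-get-request-id, akamai-x-serial-no, akamai-x-get-ssl-client-session-id"'
```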
Log into a WordPress site and check that the dashboard loads ("Se connecter" and "Tableau de bord" are the French WordPress strings for "Log in" and "Dashboard") :
siteName='www.example.com'; login='username'; password='password'; baseUrl='http://'$siteName; resultFile='./index.html'; redirectTo='http%3A%2F%2F'$siteName'%2Fwp-admin%2F'; curl -L -iso $resultFile -D /dev/stdout --data "log=$login&pwd=$password&wp-submit=Se+connecter&redirect_to=$redirectTo&testcookie=1" --cookie ./cookie.txt $baseUrl/wp-login && grep --color=auto "Tableau de bord" $resultFile
Same as above, but sending the request directly to a given server (srvXYZ) while forcing the Host header :
serverName='srvXYZ'; siteName='www.example.com'; login='username'; password='password'; baseUrl='http://'$serverName; resultFile='./index.html'; redirectTo='http%3A%2F%2F'$siteName'%2Fwp-admin%2F'; curl -L -iso $resultFile -D /dev/stdout --data "log=$login&pwd=$password&wp-submit=Connexion&redirect_to=$redirectTo&testcookie=1" --cookie ./cookie.txt --header "Host: $siteName" $baseUrl/wp-login.php
Same as above, but also : get a session cookie from the login page first, then post the credentials, then request a content page, faking the Host header throughout :
hostnameFake='my.productionSite.com'; hostnameReal='serverName'; resultFile='./result.txt'; cookieFile='./cookie.txt'; curl -iso $resultFile -D /dev/stdout --cookie-jar $cookieFile -H "Host: $hostnameFake" http://$hostnameReal/loginPage; curl -iso $resultFile -D /dev/stdout --data 'login=admin&password=secret' --cookie $cookieFile --cookie-jar $cookieFile -H "Host: $hostnameFake" http://$hostnameReal/loginCredentialsCheck; curl -iso $resultFile -D /dev/stdout --cookie $cookieFile -H "Host: $hostnameFake" http://$hostnameReal/contentPage; grep --color=auto "some text" $resultFile
Display transfer metrics (timings, sizes, HTTP code) with --write-out :
curl -s -o /dev/null -iD /dev/stdout -w "Effective URL :\t\t%{url_effective}\nContent-type :\t\t%{content_type}\nHTTP CODE :\t\t%{http_code}\nDNS lookup duration :\t%{time_namelookup} s\nConnect duration :\t%{time_connect} s\nTTFB :\t\t\t%{time_starttransfer} s\nTotal time :\t\t%{time_total} s\nDownloaded :\t\t%{size_download} bytes\n" http://www.example.com
curl --insecure --ftp-ssl --ftp-pasv --user "login:password" -T fileToUpload "ftp://ftpHost:ftpPort/path/to/upload/directory/"
The trailing / is not just cosmetic : it tells cURL to keep the local file name on the server. Without it, the last part of the URL is used as the remote file name (see -T above).
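A sketch of both behaviors (report.pdf is a placeholder local file) :
```
# URL ends with / : the remote file keeps the local name (report.pdf)
curl --ftp-ssl --ftp-pasv --user "login:password" -T report.pdf "ftp://ftpHost:ftpPort/path/to/upload/directory/"

# URL names a file : the upload is stored as renamed.pdf
curl --ftp-ssl --ftp-pasv --user "login:password" -T report.pdf "ftp://ftpHost:ftpPort/path/to/upload/directory/renamed.pdf"
```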