<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Linux &#124; egrep &#187; troubleshooting</title>
	<atom:link href="https://www.linuxegrep.com/tag/troubleshooting/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.linuxegrep.com</link>
	<description>Extended search for information</description>
	<lastBuildDate>Tue, 08 Jan 2019 06:23:44 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.5.2</generator>
		<item>
		<title>HTTP / Web server troubleshooting using Wget.</title>
		<link>https://www.linuxegrep.com/archives/howto/webmaster-howto/http-web-server-troubleshooting-using-wget/</link>
		<comments>https://www.linuxegrep.com/archives/howto/webmaster-howto/http-web-server-troubleshooting-using-wget/#comments</comments>
		<pubDate>Tue, 16 Jun 2009 14:34:53 +0000</pubDate>
		<dc:creator>meharo</dc:creator>
				<category><![CDATA[Webmaster HOWTOs]]></category>
		<category><![CDATA[troubleshooting]]></category>
		<category><![CDATA[wget]]></category>

		<guid isPermaLink="false">http://www.findlinuxhelp.com/?p=73</guid>
		<description><![CDATA[There are a few useful options to the powerful wget command, a non-interactive Linux/Unix command-line downloader, that help you identify various HTTP server responses, performance-related issues, and optional feature support. For probing an HTTP server and identifying its response, we can use the --spider option. wget --spider http://www.google.com --06:24:36-- http://www.google.com/ Resolving www.google.com... 74.125.53.103, 74.125.53.99, [...]]]></description>
				<content:encoded><![CDATA[<p style="text-align: justify;">There are a few useful options to the powerful wget command, a non-interactive Linux/Unix command-line downloader, that help you identify various HTTP server responses, performance-related issues, and optional feature support.</p>
<p style="text-align: justify;">For probing an HTTP server and identifying its response, we can use the <strong><span style="font-family: 'courier new', courier;">--spider</span></strong> option.</p>
<pre lang="bash">wget --spider http://www.google.com</pre>
<p style="padding-left: 30px; text-align: justify;"><span style="color: #000000;"><span style="color: #c0c0c0;"><span style="color: #999999;"><span style="font-family: 'courier new', courier;">--06:24:36-- http://www.google.com/<br />
Resolving www.google.com... 74.125.53.103, 74.125.53.99, 74.125.53.104, ...<br />
Connecting to www.google.com|74.125.53.103|:80... connected.<br />
HTTP request sent, awaiting response... 200 OK<br />
Length: unspecified [text/html]<br />
200 OK</span></span></span></span></p>
<p style="text-align: justify;"><span id="more-73"></span></p>
<p style="text-align: justify;">It shows you the HTTP response code and returns a corresponding exit code: if the server replies with a successful response, the program exits with code 0. With the <strong><span style="font-family: 'courier new', courier;">--spider</span></strong> switch, the actual file is not downloaded locally.</p>
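<p style="text-align: justify;">As a minimal sketch of how that exit status can be used (the URL here is just an example), a simple availability check in a shell script might look like:</p>
<pre lang="bash">#!/bin/sh
# Probe a URL without downloading it; wget exits with status 0 on success.
URL="http://www.google.com"
if wget --spider -q "$URL"; then
    STATUS=0
else
    STATUS=$?
fi
if [ "$STATUS" -eq 0 ]; then
    echo "$URL is up"
else
    echo "$URL check failed with exit code $STATUS"
fi</pre>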
<p style="text-align: justify;">To analyse the HTTP server's response header, use the <strong><span style="font-family: 'courier new', courier;">-S</span></strong> (<strong><span style="font-family: 'courier new', courier;">--server-response</span></strong>) switch. It prints the headers sent by the server.</p>
<pre lang="bash">wget --spider -S http://www.google.com</pre>
<p style="padding-left: 30px; text-align: justify;"><span style="color: #000000;"><span style="color: #c0c0c0;"><span style="color: #999999;"><span style="font-family: 'courier new', courier;">--06:23:14-- http://www.google.com/<br />
Resolving www.google.com... 74.125.53.103, 74.125.53.99, 74.125.53.104, ...<br />
Connecting to www.google.com|74.125.53.103|:80... connected.<br />
HTTP request sent, awaiting response...<br />
HTTP/1.0 200 OK<br />
Cache-Control: private, max-age=0<br />
Date: Tue, 16 Jun 2009 13:23:14 GMT<br />
Expires: -1<br />
Content-Type: text/html; charset=ISO-8859-1<br />
Set-Cookie: PREF=ID=640b4463b5aaaadf:TM=1245158594:LM=1245158594:S=35L2K0_MlEo7Cka5; expires=Thu, 16-Jun-2011 13:23:14 GMT; path=/; domain=.google.com<br />
Server: gws<br />
Length: unspecified [text/html]<br />
200 OK</span></span></span></span></p>
<p style="text-align: justify;">This is useful for checking additional parameters such as the MIME type sent by the server, the charset, and the last-modified time of the file.</p>
<p style="text-align: justify;">Wget supports sending custom/altered header fields in its request header.</p>
<pre lang="bash">wget --header="Host: www.mysite.com" --spider http://192.168.0.1</pre>
<p style="text-align: justify;">This command requests the site www.mysite.com hosted on the server 192.168.0.1 using name-based virtual hosting.</p>
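<p style="text-align: justify;">Building on this (a rough sketch; the hostnames and IP below are placeholders), several name-based virtual hosts on the same server can be probed in a loop:</p>
<pre lang="bash">#!/bin/sh
# Probe each virtual host name against the same backend IP.
SERVER="192.168.0.1"
for VHOST in www.mysite.com blog.mysite.com; do
    if wget --spider -q -T 2 -t 1 --header="Host: $VHOST" "http://$SERVER"; then
        echo "$VHOST: OK"
    else
        echo "$VHOST: FAILED"
    fi
done</pre>
<p style="text-align: justify;">The -T and -t switches keep each probe short, so one dead vhost does not stall the whole loop.</p>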
<pre lang="bash">wget --spider --header="Accept-Encoding: compress, gzip" http://www.mysite.com</pre>
<p style="text-align: justify;">This request tells the server that the client accepts the <em><strong>compress</strong></em> and <em><strong>gzip</strong></em> encoding methods. If the server supports sending compressed HTTP responses, it will include a <em><strong>Content-Encoding</strong></em> field in its response header.</p>
<p style="padding-left: 30px; text-align: justify;"><span style="color: #000000;"><span style="color: #999999;"><span style="font-family: 'courier new', courier;">HTTP/1.1 200 OK<br />
Date: Tue, 16 Jun 2009 13:32:08 GMT<br />
Server: Apache/2.2.8 (Unix) mod_ssl/2.2.8 OpenSSL/0.9.8b DAV/2 mod_jk/1.2.26<br />
Vary: Accept-Encoding<br />
Content-Encoding: gzip<br />
Keep-Alive: timeout=5<br />
Connection: Keep-Alive<br />
Content-Type: text/html</span></span></span></p>
<p style="text-align: justify;">You can also try various timeout and retry values to benchmark your server's response time and performance.</p>
<pre lang="bash">wget --spider --wait=0 --waitretry=0 -T 0.1 -t 10 http://www.mysite.com</pre>
<p style="text-align: justify;">This checks your server for a fast response; if wget cannot fetch the page within the specified time, it returns a non-zero exit value.</p>
<p style="padding-left: 30px; text-align: justify;"><span style="color: #000000;"><span style="color: #999999;"><span style="font-family: 'courier new', courier;">-t number<br />
--tries=number<br />
Set the number of retries to number.</span></span></span></p>
<p style="padding-left: 30px; text-align: justify;"><span style="color: #000000;"><span style="color: #999999;"><span style="font-family: 'courier new', courier;">-T seconds<br />
--timeout=seconds<br />
Set the network timeout to seconds seconds. This is equivalent to specifying --dns-timeout, --connect-timeout, and --read-timeout, all at the same time.</span></span></span></p>
<p style="padding-left: 30px; text-align: justify;"><span style="color: #000000;"><span style="color: #999999;"><span style="font-family: 'courier new', courier;">-w seconds<br />
--wait=seconds<br />
Wait the specified number of seconds between retrievals.</span></span></span></p>
<p style="padding-left: 30px; text-align: justify;"><span style="color: #000000;"><span style="color: #999999;"><span style="font-family: 'courier new', courier;">--waitretry=seconds</span></span></span></p>
<p style="text-align: justify;">If you don&#8217;t want Wget to wait between every retrieval, but only between retries of failed downloads, you can use this option.</p>
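<p style="text-align: justify;">Putting these switches together (a sketch only; the URL is a placeholder), a crude latency check that fails fast might look like:</p>
<pre lang="bash">#!/bin/sh
# Fail if the page is not served within 2 seconds, trying at most
# 3 times with no pause between retries, and report how long it took.
URL="http://www.mysite.com"
START=$(date +%s)
if wget --spider -q -T 2 -t 3 --waitretry=0 "$URL"; then
    STATUS=0
else
    STATUS=$?
fi
END=$(date +%s)
echo "status=$STATUS elapsed=$((END - START))s"</pre>
<p style="text-align: justify;">A non-zero status here means the server could not answer within the time budget.</p>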
]]></content:encoded>
			<wfw:commentRss>https://www.linuxegrep.com/archives/howto/webmaster-howto/http-web-server-troubleshooting-using-wget/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
