<?xml version="1.0" encoding="UTF-8" ?>
<rss version="2.0">
<channel>
<title><![CDATA[沧海一粟]]></title> 
<link>http://www.dzhope.com/index.php</link> 
<description><![CDATA[Web system architecture, server operations, and PHP development]]></description> 
<language>zh-cn</language> 
<copyright><![CDATA[沧海一粟]]></copyright>
<item>
<link>http://www.dzhope.com/post//</link>
<title><![CDATA[Common commands for website troubleshooting and analysis]]></title> 
<author>jed &lt;jed521@163.com&gt;</author>
<category><![CDATA[Server technology]]></category>
<pubDate>Sat, 14 Jan 2012 17:21:10 +0000</pubDate> 
<guid>http://www.dzhope.com/post//</guid> 
<description>
<![CDATA[ 
	A roundup of small, commonly used commands for analyzing a website during troubleshooting. All of them were collected from around the web.<br/><strong>System connection states</strong><br/>1. Count TCP connections by state<br/><div class="code"><br/>netstat -nat |awk '{print $6}'|sort|uniq -c|sort -rn<br/><br/>netstat -n | awk '/^tcp/ {++S[$NF]};END {for(a in S) print a, S[a]}'<br/>or:<br/>netstat -n | awk '/^tcp/ {++state[$NF]}; END {for(key in state) print key,"\t",state[key]}'<br/>netstat -n | awk '/^tcp/ {++arr[$NF]};END {for(k in arr) print k,"\t",arr[k]}'<br/><br/>netstat -n |awk '/^tcp/ {print $NF}'|sort|uniq -c|sort -rn<br/><br/>netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c<br/><br/></div><br/>2. Find the top 20 IPs by request count (commonly used to locate the source of an attack):<br/><br/><div class="code"><br/>netstat -anlp|grep 80|grep tcp|awk '{print $5}'|awk -F: '{print $1}'|sort|uniq -c|sort -nr|head -n20<br/><br/>netstat -ant |awk '/:80/{split($5,ip,":");++A[ip[1]]}END{for(i in A) print A[i],i}' |sort -rn|head -n20<br/><br/></div><br/>3. Sniff port 80 with tcpdump to see which IPs access it the most<br/><br/><div class="code"><br/>tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr |head -20<br/><br/></div><br/>4. Find hosts with many TIME_WAIT connections<br/><div class="code"><br/>netstat -n|grep TIME_WAIT|awk '{print $5}'|sort|uniq -c|sort -rn|head -n20<br/></div><br/>5. Find hosts with many SYN connections<br/><div class="code"><br/>netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more<br/></div><br/>6. List the process behind a port<br/><div class="code"><br/>netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1<br/></div><br/><strong>Website log analysis, part 1 (Apache)</strong><br/><br/>1. Get the top 10 visiting IP addresses<br/><div class="code"><br/>cat access.log|awk '{print $1}'|sort|uniq -c|sort -nr|head -10<br/>cat access.log|awk '{counts[$(11)]+=1}; END {for(url in counts) print counts[url], url}'<br/></div><br/>2. The 20 most frequently requested files or pages<br/><div class="code"><br/>cat access.log|awk '{print $11}'|sort|uniq -c|sort -nr|head -20<br/></div><br/>3. List the largest .exe transfers (handy when analyzing a download site)<br/><div class="code"><br/>cat access.log |awk '($7~/\.exe/){print $10 " " $1 " " $4 " " $7}'|sort -nr|head -20<br/></div><br/>4. List .exe files larger than 200000 bytes (about 200KB) and how often each occurs<br/><div class="code"><br/>cat access.log |awk '($10 &gt; 200000 &amp;&amp; $7~/\.exe/){print $7}'|sort -n|uniq -c|sort -nr|head -100<br/></div><br/>5. If the last column of the log records the page transfer time, list the pages slowest to reach the client<br/><div class="code"><br/>cat access.log |awk '($7~/\.php/){print $NF " " $1 " " $4 " " $7}'|sort -nr|head -100<br/></div><br/>6. List the slowest pages (over 60 seconds) and how often each occurs<br/><div class="code"><br/>cat access.log |awk '($NF &gt; 60 &amp;&amp; $7~/\.php/){print $7}'|sort -n|uniq -c|sort -nr|head -100<br/></div><br/>7. List files whose transfer took more than 30 seconds<br/><div class="code"><br/>cat access.log |awk '($NF &gt; 30){print $7}'|sort -n|uniq -c|sort -nr|head -20<br/></div><br/>8. Total site traffic (in GB)<br/><div class="code"><br/>cat access.log |awk '{sum+=$10} END {print sum/1024/1024/1024}'<br/></div><br/>9. Count 404 requests<br/><div class="code"><br/>awk '($9 ~/404/)' access.log | awk '{print $9,$7}' | sort<br/></div><br/>10. Count HTTP status codes<br/><div class="code"><br/>cat access.log |awk '{counts[$(9)]+=1}; END {for(code in counts) print code, counts[code]}'<br/>cat access.log |awk '{print $9}'|sort|uniq -c|sort -rn<br/></div><br/>11. Spider analysis: see which spiders are crawling the site<br/><div class="code"><br/>/usr/sbin/tcpdump -i eth0 -l -s 0 -w - dst port 80 | strings | grep -i user-agent | grep -i -E 'bot|crawler|slurp|spider'<br/></div><br/><br/><strong>Website log analysis, part 2 (Squid)</strong><br/>1. Traffic by domain<br/><div class="code"><br/>zcat squid_access.log.tar.gz| awk '{print $10,$7}' |awk 'BEGIN{FS="[ /]"}{trfc[$4]+=$1}END{for(domain in trfc){printf "%s\t%d\n",domain,trfc[domain]}}'<br/></div><br/>A more efficient Perl version can be downloaded here: <a href="http://docs.linuxtone.org/soft/tools/tr.pl" target="_blank">http://docs.linuxtone.org/soft/tools/tr.pl</a><br/><br/><strong>Databases</strong><br/>1. Watch the SQL statements the database is executing<br/><div class="code"><br/>/usr/sbin/tcpdump -i eth0 -s 0 -l -w - dst port 3306 | strings | egrep -i 'SELECT|UPDATE|DELETE|INSERT|SET|COMMIT|ROLLBACK|CREATE|DROP|ALTER|CALL'<br/></div><br/><br/><strong>System debugging</strong><br/><br/>1. Trace the system calls of a running process by PID<br/><div class="code"><br/>strace -p pid<br/></div><br/>2. Attach a debugger to a running process by PID<br/><div class="code"><br/>gdb -p pid<br/></div><br/><br/><strong>Miscellaneous access-log statistics</strong><br/>1. Count unique visitors (UV) by visiting IP<br/><div class="code"><br/>awk '{print $1}' access.log|sort | uniq -c |wc -l<br/></div><br/>2. Count page views (PV) by requested URL<br/><div class="code"><br/>awk '{print $7}' access.log|wc -l<br/></div><br/>3. Find the most frequently requested URLs<br/><div class="code"><br/>awk '{print $7}' access.log|sort | uniq -c |sort -n -k 1 -r|more<br/></div><br/>4. Find the most frequent visiting IPs<br/><div class="code"><br/>awk '{print $1}' access.log|sort | uniq -c |sort -n -k 1 -r|more<br/></div><br/>5. View log entries within a time range<br/><div class="code"><br/>cat access.log| sed -n '/14\/Mar\/2015:21/,/14\/Mar\/2015:22/p'|more<br/></div><br/>Tags - <a href="http://www.dzhope.com/tags/linux%25E5%2591%25BD%25E4%25BB%25A4/" rel="tag">linux命令</a> , <a href="http://www.dzhope.com/tags/linux/" rel="tag">linux</a>
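Most of the awk one-liners above rely on the same idiom: increment an associative array keyed by some field, then print the tallies in an END block. A minimal self-contained sketch of that pattern, using fabricated netstat-style sample lines (not real server data):

```shell
#!/bin/sh
# Tally the last field (the TCP state) of netstat-style lines,
# the same counting pattern used by the one-liners above.
# The sample lines below are made up purely for illustration.
sample='tcp 0 0 10.0.0.1:80 10.0.0.2:5001 ESTABLISHED
tcp 0 0 10.0.0.1:80 10.0.0.3:5002 TIME_WAIT
tcp 0 0 10.0.0.1:80 10.0.0.4:5003 TIME_WAIT'

printf '%s\n' "$sample" |
  awk '/^tcp/ {++S[$NF]} END {for (a in S) print a, S[a]}' |
  sort -k2 -rn
# Prints each state with its count, most frequent first:
#   TIME_WAIT 2
#   ESTABLISHED 1
```

Because the array is built in one pass, this avoids the extra `sort | uniq -c` stage; which variant is faster on a huge log is worth measuring before relying on either.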
]]>
</description>
</item>
</channel>
</rss>