Linux shell commands for analyzing log files on an Apache server
1. Count running Apache processes:
ps aux | grep httpd | grep -v grep | wc -l
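The "grep -v grep" step is needed because the grep process itself matches "httpd". A common alternative is a character class: the pattern [h]ttpd still matches the string "httpd", but the grep's own command line shows the literal "[h]ttpd", which the pattern does not match. A minimal sketch on fabricated ps output:

```shell
# Fabricated ps output (hypothetical): two httpd workers plus the
# grep process itself, as ps would show it with the bracket trick.
ps_output='root     1001 /usr/sbin/httpd -k start
apache   1002 /usr/sbin/httpd -k start
user     2001 grep [h]ttpd'

# [h]ttpd matches "httpd", but the grep process's own command line
# contains "[h]ttpd", which does not match, so "grep -v grep" and the
# separate "wc -l" both become unnecessary (-c counts matching lines).
printf '%s\n' "$ps_output" | grep -c '[h]ttpd'   # → 2
```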
2. Count established TCP connections on port 80:
netstat -tan | grep "ESTABLISHED" | grep ":80" | wc -l
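Note that the pattern ":80" also matches :8080 and similar ports; anchoring with a trailing space avoids that. A small demonstration on fabricated netstat output:

```shell
# Fabricated netstat -tan output (hypothetical): one established
# connection on port 80, one on 8080, and a port-80 socket in TIME_WAIT.
netstat_sample='tcp 0 0 10.0.0.1:80   203.0.113.5:51234 ESTABLISHED
tcp 0 0 10.0.0.1:8080 203.0.113.6:51235 ESTABLISHED
tcp 0 0 10.0.0.1:80   203.0.113.7:51236 TIME_WAIT'

# ":80 " (with a trailing space) matches port 80 only, not 8080.
printf '%s\n' "$netstat_sample" | grep ESTABLISHED | grep -c ':80 '
# → 1
```

On modern distributions, ss -tan is the replacement for the deprecated netstat -tan, though its state column prints ESTAB rather than ESTABLISHED.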
3. Count the day's connections per IP from the log, with duplicates collapsed ($2 reflects the author's LogFormat; in the default combined format the client IP is $1):
cat access_log | grep "20/Oct/2008" | awk '{print $2}' | sort | uniq -c | sort -nr
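The sort | uniq -c | sort -nr chain can also be collapsed into a single awk pass. A sketch on fabricated log lines in the default combined format, where the client IP is field $1 (adjust to $2 for the LogFormat assumed above):

```shell
# Fabricated access_log lines (hypothetical) in the default
# "combined" format, where the client IP is field $1.
cat > /tmp/access_log.sample <<'EOF'
122.102.7.212 - - [20/Oct/2008:00:01:02 +0800] "GET /a HTTP/1.1" 200 100
122.102.7.212 - - [20/Oct/2008:00:01:03 +0800] "GET /b HTTP/1.1" 200 100
10.0.0.9 - - [20/Oct/2008:00:01:04 +0800] "GET /a HTTP/1.1" 200 100
EOF

# One awk pass replaces grep | awk | sort | uniq -c: count hits per
# IP for the day, then sort by count descending.
awk '/20\/Oct\/2008/ {c[$1]++} END {for (ip in c) print c[ip], ip}' \
    /tmp/access_log.sample | sort -nr
# → 2 122.102.7.212
#   1 10.0.0.9
```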
4. What was the day's busiest IP doing? (It turned out to be a spider.) The ":00" in the pattern limits the match to the midnight hour; drop it to cover the whole day. The dots in the IP are escaped so they match literally:
cat access_log | grep "20/Oct/2008:00" | grep '122\.102\.7\.212' | awk '{print $8}' | sort | uniq -c | sort -nr | head -n 10
5. Top 10 requested URLs ($8 under this LogFormat, $7 in the default combined format; again, the ":00" limits this to the midnight hour):
cat access_log | grep "20/Oct/2008:00" | awk '{print $8}' | sort | uniq -c | sort -nr | head -n 10
6. Sniff port 80 with tcpdump to see which source IP is most active:
tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr
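The awk -F"." trick works because, in tcpdump's -tnn output, the source port is appended to the address as a fifth dot-separated component; reassembling the first four fields drops the port (and keeps the "IP " prefix in front of the first octet). A sketch on one fabricated capture line:

```shell
# A fabricated tcpdump line (hypothetical) as printed with -tnn:
# no timestamp, numeric addresses, port glued onto the address.
line='IP 192.168.0.5.51234 > 10.0.0.1.80: Flags [S], seq 1, length 0'

# Splitting on "." gives $1="IP 192", $2="168", $3="0", $4="5";
# rejoining them yields the source address without the port.
printf '%s\n' "$line" | awk -F"." '{print $1"."$2"."$3"."$4}'
# → IP 192.168.0.5
```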
Then check in the access log what that IP was doing (dots escaped so they match literally):
cat access_log | grep '122\.102\.7\.212' | awk '{print $1" "$8}' | sort | uniq -c | sort -nr | less
7. Count the distinct IPs connecting in a given time window (hours 07-08 here); the final wc -l prints the number of unique IPs:
grep "2006:0[7-8]" www20060723.log | awk '{print $2}' | sort | uniq -c | sort -nr | wc -l
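The bracket expression 0[7-8] matches hours 07 and 08 only. A small check on fabricated timestamps laid out the way the pattern assumes (year:hour):

```shell
# Fabricated log lines (hypothetical) with the year:hour layout the
# grep pattern assumes; the IP is field $2 as in the original command.
cat > /tmp/www.sample <<'EOF'
x 1.2.3.4 [23/Jul/2006:07:10:11] "GET /"
x 1.2.3.4 [23/Jul/2006:08:10:12] "GET /"
x 5.6.7.8 [23/Jul/2006:09:10:13] "GET /"
EOF

# 0[7-8] keeps the 07 and 08 lines, excludes 09; count distinct IPs.
grep '2006:0[7-8]' /tmp/www.sample | awk '{print $2}' | sort -u | wc -l
# → 1
```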
Appendix: slicing a log with shell and aggregating totals
1. Log format: the sample line is missing from the original post; judging from the script below, each record ends in key=value pairs of the form id=...,source=...,filter=...,telsize=...
2. Script (egrep already implies -E, so plain grep -E is used; the "soucre" typo in the output label is fixed):
grep -E 'id=[0-9]{1,4},sour' sms_log.txt | cut -c 80- | awk -F'[=,]' '{a[$2]++;s1[$2]+=$4;s2[$2]+=$6;s3[$2]+=$8;} END {for (i in a) print i,"source="s1[i],"filter="s2[i],"telsize="s3[i]}'
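Since the original omits both the log sample and the result, here is the aggregation step run on fabricated lines in the shape the awk expects after cut -c 80- removes the leading 79 columns (the field names are an assumption read off the script):

```shell
# Fabricated payload lines (hypothetical): what remains of each
# record after cut -c 80-, i.e. id=<id>,source=<n>,filter=<n>,telsize=<n>
cat > /tmp/sms.sample <<'EOF'
id=12,source=3,filter=1,telsize=5
id=12,source=2,filter=0,telsize=4
id=34,source=7,filter=2,telsize=1
EOF

# Split on "=" and ",", group by the id value ($2), and sum the three
# numeric fields ($4, $6, $8) per id.
awk -F'[=,]' '{a[$2]++; s1[$2]+=$4; s2[$2]+=$6; s3[$2]+=$8}
  END {for (i in a) print i, "source=" s1[i], "filter=" s2[i], "telsize=" s3[i]}' \
  /tmp/sms.sample | sort -n
# → 12 source=5 filter=1 telsize=9
#   34 source=7 filter=2 telsize=1
```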
3. Result: (the sample output is missing from the original)