We have been having filtering issues at work. The only pattern I can see is that during heavy Internet use (40+ Mb) the filtering services have a hard time keeping up. The problem is that Internet usage is impossible to predict, so calling support and trying to reproduce the problem is nearly impossible. Even assuming I can get through while it is happening, by the time I have brought the support person on the other end of the phone up to speed, the filter usually starts working again.
Instead of fighting this problem further, I came up with a quick script to measure how often the filter actually fails and to record the results.
The script loops, requesting a blocked page every 'X' seconds, then parses the returned HTML to see whether the redirect/block page was served. This should show the reliability of our web filter throughout the day without hurting network performance. I whipped together a quick and dirty script from functions in my other scripts. It should do the trick.
#!/bin/bash
#######################################################
# Script created to test web filter's reliability
# It polls a webpage that is supposed to be blocked.
#
# Dan Kane
#######################################################

# Web filter server IP or unique text found in the block page html
WebServer="10.9.1"
# zero the counters
blockedcounter=0
allowedcounter=0
# Log file location
logfile="/var/log/webfilter.log"
# Blocked url
blockurl="facebook.com"
# used for my output module to show debug information
debug="1"
# used for my output module to log everything
log="1"
# sleep through the loop or go as fast as we can?
sleep="1"

#######################################################
# FUNCTIONS
#
# used for debug and logging
output(){
    if [ "$debug" -eq 1 ]; then
        /bin/echo "$(/bin/date +"%m-%d-%Y %r"): $*"
    fi
    if [ "$log" -eq 1 ]; then
        /bin/echo "$(/bin/date +"%m-%d-%Y %r"): $*" >> "$logfile"
    fi
}

# print the totals on exit
onexit(){
    echo "didnt work $allowedcounter ::: worked $blockedcounter"
    exit
}

#######################################################
# CODE
#
# catch Ctrl-C and show results
trap onexit SIGINT

# Enter our 1 second loop
while :
do
    website=$(wget -qO- "$blockurl")
    # count lines of the response that contain the block-page marker
    found=$(echo "$website" | grep -c "$WebServer")
    if [ "$found" -lt 1 ]; then
        allowedcounter=$(( allowedcounter + 1 ))
        output "Page NOT blocked --- didnt work $allowedcounter ::: worked $blockedcounter"
    else
        blockedcounter=$(( blockedcounter + 1 ))
        output "Page blocked --- didnt work $allowedcounter ::: worked $blockedcounter"
    fi
    if [ "$sleep" -eq 1 ]; then
        sleep 1
    fi
done
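Running it is just a matter of saving it somewhere, making it executable, and letting it sit in a terminal; the filename below is my own placeholder, not anything from the script itself:

# save as webfilter-test.sh (placeholder name), then:
chmod +x webfilter-test.sh
./webfilter-test.sh

# in another terminal, watch the results come in
tail -f /var/log/webfilter.log

Pressing Ctrl-C in the script's terminal fires the SIGINT trap, which prints the final worked/didn't-work totals before exiting.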
We will start recording the results and see what else we can find out. If the amount of data turns out to be too much for a flat log file, inserting the results into MySQL may be in order, but we will see what we find first.
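If it does come to that, a minimal sketch of the MySQL route might look like the following; the database name, table name, columns, and credentials are all placeholders of my own, nothing has been decided yet:

# run once to create a home for the results (all names are placeholders)
mysql -u filtertest -p -e "CREATE TABLE webfilter.results (
    checked_at DATETIME NOT NULL,
    blocked    TINYINT(1) NOT NULL
);"

# a drop-in helper the loop could call instead of (or alongside) output()
record(){
    # $1 is 1 when the block page was served, 0 when it was not
    mysql -u filtertest -p"$MYSQL_PW" webfilter \
        -e "INSERT INTO results (checked_at, blocked) VALUES (NOW(), $1);"
}

That would make it easy to pull counts and time-of-day breakdowns with plain SQL instead of grepping through the log.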