March 7-12, 2011
Instructor: Kevin Johnson
Elluminate being used for video broadcast to about 30 remote students
Sponsors:
Quote from a tee shirt: "I am a bomb technician. If you see me running, try to keep up."
Breaks:
Lunch:
Randy's Cell 250-7681
Open Source Vulnerability Database (OSVDB)
Security testing should be part of the job description
HTTP/1.1 defined in RFC 2616
original design considerations
GET http://www.google.com HTTP/1.1
Accept: */*
Accept-Language: en-us
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727) Paros/3.2.13
Host: www.google.com
Proxy-Connection: Keep-Alive
Cookie: PREF=ID=6aa36b...:LM=11198...:GM=1:S=CZy0...
Content-Length: 0
Mozilla/4.0 - This signifies that the browser is compliant with the standards set by Netscape
MSIE 7.0 - Internet Explorer 7.0 is the software type
Windows NT 5.1 - This browser is running on Windows XP
.NET CLR 1.1.4322; .NET CLR 2.0.50727 - These two versions of the .NET client are supported
Paros/3.2.13 - added by Paros
* McAfee dropped App Firewall filtering if “scanalert” was in the User-Agent field
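A minimal sketch of sending a request with a custom User-Agent from Python 2's httplib; the User-Agent value here is just a placeholder, not a course-supplied string:

#!/usr/bin/python
# Sketch: send a GET with a custom User-Agent header and print a few response details.
import httplib

headers = {"User-Agent": "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1) Paros/3.2.13"}
conn = httplib.HTTPConnection("www.sec542.org")
conn.request("GET", "/", None, headers)
resp = conn.getresponse()
print resp.status, resp.reason
print resp.getheader("Server")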
RFC 2616
Same Origin Policy: prevents a script loaded from one site from reading content belonging to another site.
HTTP methods: GET, POST, HEAD, TRACE, OPTIONS, CONNECT, PUT, DELETE
Check whether OPTIONS is enabled; it is not necessary (a quick check is sketched after the nc example below).
$ nc www.sec542.org 80
POST /form_auth/login.php HTTP/1.0
Content-Length: 42

user=testuser&pass=opensesame&button=Login
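Both checks can also be scripted; a minimal Python 2 sketch (httplib), reusing the host, path, and form fields from the nc example above:

#!/usr/bin/python
# Sketch: ask the server which methods it advertises, then replay the login POST.
import httplib

conn = httplib.HTTPConnection("www.sec542.org")
conn.request("OPTIONS", "/")
print "Allow:", conn.getresponse().getheader("Allow")

body = "user=testuser&pass=opensesame&button=Login"
headers = {"Content-Type": "application/x-www-form-urlencoded"}
conn = httplib.HTTPConnection("www.sec542.org")
conn.request("POST", "/form_auth/login.php", body, headers)
resp = conn.getresponse()
print resp.status, resp.reason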
Paros - used as a simple proxy; a newer fork is ZAP from OWASP
The examples use this for authentication
cd /usr/bin/samurai/paros
java -jar paros.jar
SSL v2, turn it off
Chris Dickerson - Just released a sample report
Inline HTML
as a script tag
<script>alert("Sec542 Rocks")</script>
as part of an HTML item
<img src="images/logo.gif" onload="javascript:alert('Loaded');">
loaded from another document
<script src="http://www.inguardians.com/malicious.js"></script>
Copy the HTML file from the JavaScript directory on the DVD to your desktop
Between <head> and </head> tags, add:
<script>alert("Hello World!");</script>
Test index.html
Change the script tag to:
<script src="attack.js"></script>
Create a file called attack.js and put in it:
alert("Hello World!");
Test index.html
change attack.js to
function formChange() {
document.forms[1].action="http://www.sec542.org/"
document.forms[1].sushi.value="Toro"
}
In the index.html file:
<body onload="formChange();">
Reload the page in the browser. Submit the form to see the URI change
Add this to the attack.js file
function createCookie() {
document.cookie = "userid=kevin;expires=Fri, 27-Feb-2009;path=/";
}
In the index.html file, add the mouseover event
<form id="login" onmouseover="alert(document.cookie);" name="login" action="#" method="post">
Python scripting
create a file beginning with
#!/usr/bin/python
chmod 755 custom.py
#!/usr/bin/python
import httplib
conn = httplib.HTTPConnection("www.sec542.org")
url = "/python/index.php"
conn.request('GET', url)
resp = conn.getresponse()
print resp.getheader("Server")
print resp.getheader("Date")
print resp.getheader("Set-Cookie")
n = 1
outfile = open("results.txt", "a")
while n <= 100:
    conn = httplib.HTTPConnection("www.sec542.org")
    url = "/python/index.php?id=" + str(n)
    conn.request('GET', url)
    resp = conn.getresponse()
    print resp.getheader("Server")
    print resp.getheader("Date")
    print resp.getheader("Set-Cookie")
    respcode = resp.status
    outfile.write(url + " " + str(respcode) + "\n")
    n += 1
outfile.close()
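With the shebang line and the chmod 755 from earlier, the finished script can be run as ./custom.py (or python custom.py); the result for each request ends up in results.txt.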
whatis whois
whois for fbcgalax.org, www2.sans.org, www3.sans.org
Looking at hosted services
nslookup [host] [DNS_server]
set debug gives more information
nslookup is deprecated on Linux in favor of dig
Fierce: a simple scanner designed to find hosts within a domain, written by Robert “RSnake” Hansen and updated to 2.0 by Joshua “Jabra” Abraham
Lookup various hosts in a domain
perl fierce.pl -dns sec542.org
Scan an IP range
perl fierce.pl -range 192.168.1.0-255 -dnsservers ns1.sec542.org
Perform reverse lookups after finding CNAMEs
perl fierce.pl -dns sec542.org -wide -output output.txt
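A rough equivalent of the reverse sweep can be sketched with the Python standard library; this uses the system resolver rather than a -dnsservers style option, and the 192.168.1.x range just mirrors the example above:

#!/usr/bin/python
# Sketch: reverse-lookup sweep of a /24, similar in spirit to fierce's -range scan.
import socket

for i in range(1, 255):
    ip = "192.168.1." + str(i)
    try:
        print ip, socket.gethostbyaddr(ip)[0]
    except socket.error:
        pass  # no PTR record for this address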
nslookup mail.sec542.org
dig sec542.org
dig sec542.org mx
perform zone transfer
dig AXFR sec542.org
host -la sec542.org
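The same zone transfer attempt can be scripted with the third-party dnspython library (assumed installed); ns1.sec542.org is assumed to be an authoritative name server, as in the fierce example:

#!/usr/bin/python
# Sketch: attempt an AXFR with dnspython and list the names found in the zone.
import socket
import dns.query
import dns.zone

ns_ip = socket.gethostbyname("ns1.sec542.org")   # xfr() wants the server's IP address
zone = dns.zone.from_xfr(dns.query.xfr(ns_ip, "sec542.org"))
for name in sorted(zone.nodes.keys()):
    print name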
Fierce
cd /usr/bin/samurai/fierce2
run fierce with a DNS name specified
fierce -dns sec542.org
run fierce on an IP range
fierce -range 192.168.1.0-255 -dnsservers 192.168.1.41
ip:72.14.204.147
site:www.sans.org
inurl:phpinfo
intitle:"Admin Login"
link:sans.org
ext:xls
Quotes for literal matches
"sans Web Application"
“-” omits pages, or pages containing specific strings, from the results
site:sans.org -site:www.sans.org
site:sans.org -handlers
author:kevin@inguardians.com insubject:"Problems with my code"
Robin Wood created gpscan.rb to search Google profiles for specific targets
./gpscan.rb Microsoft
http://www.digininja.org/projects/gpscan.php
Jason Wood created the LinkedIn user name generator Reconnoiter
./usernameGen.py InGuardians 10
sudo nmap -sV -O www.inguardians.com
printf "GET / HTTP/1.0\n\n" | nc -v www.inguardians.com 80
(This does not work on my laptop.)
Side note:
$ nc www.sec542.org 80
HEAD / HTTP/1.0
<enter><enter>

$ nc www.sec542.org 443
HEAD / HTTP/1.0
<enter><enter>
(No SSL, so it cannot connect.)
nmap -sV www.sec542.org
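The same HEAD-based banner grab can be done from Python with a raw socket; a small sketch:

#!/usr/bin/python
# Sketch: send a HEAD request over a plain socket and print the response headers,
# like the nc examples above.
import socket

s = socket.create_connection(("www.sec542.org", 80))
s.sendall("HEAD / HTTP/1.0\r\nHost: www.sec542.org\r\n\r\n")
print s.recv(4096)
s.close()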
Test for SSLv2
openssl s_client -connect www.aoe.vt.edu:443 -ssl2
Test for NULL Cipher (clear text)
openssl s_client -connect www.aoe.vt.edu:443 -cipher NULL
Verify SSLv3
openssl s_client -connect www.aoe.vt.edu:443 -ssl3
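A similar protocol check can be scripted with Python's ssl module, assuming the local Python/OpenSSL build still exposes the legacy protocol constant (many newer builds drop SSLv2/SSLv3 support):

#!/usr/bin/python
# Sketch: test whether the server completes an SSLv3 handshake.
import socket
import ssl

try:
    sock = socket.create_connection(("www.aoe.vt.edu", 443))
    tls = ssl.wrap_socket(sock, ssl_version=ssl.PROTOCOL_SSLv3)
    print "SSLv3 handshake succeeded, cipher:", tls.cipher()
    tls.close()
except ssl.SSLError, e:
    print "SSLv3 handshake failed:", e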
Script from sslthing:

ossl=/usr/bin/openssl   # path to openssl; edit if needed

if ! [ "$1" ]; then
  echo "syntax: $0 host:sslport [-v]"
  exit
fi

if ! [ -e $ossl ]; then
  echo "The path to openssl is wrong, please edit $0"
  exit
fi

echo "" > sslthing.tmp   # stdin for s_client so it exits after the handshake

## Request available ciphers from openssl and test them
for ssl in -ssl2 -tls1
do
  echo Testing `echo $ssl | cut -c2- | tr "a-z" "A-Z"`...
  $ossl ciphers $ssl -v | while read line
  do
    cipher=`echo $line | awk '{print $1}'`
    bits=`echo $line | awk '{print $5}' | cut -f2 -d\( | cut -f1 -d\)`
    if ($ossl s_client $ssl -cipher $cipher -connect $1 < sslthing.tmp 2>&1 | grep ^New > /dev/null); then
      if [ "$2" ]; then
        echo OK
      else
        echo $cipher - $bits bits
      fi
    else
      if [ "$2" ]; then
        echo Failed
      fi
    fi
  done | grep -v error
done

rm -f sslthing.tmp
Run the command using wine:
$ wine THCSSLCheck.exe www.sec542.org 443 > ~/THC_443.txt
Fixme messages from wine won't affect the program
$ wine THCSSLCheck.exe www.sec542.org 10000 > ~/THC_10000.txt
cd /usr/bin/samurai/sslthing
./sslthing www.sec542.org:443 > ~/Thing_443.txt
./sslthing www.sec542.org:10000 > ~/Thing_10000.txt
Bing
ip:<ip address>
#!/bin/bash
for method in GET POST PUT TRACE CONNECT OPTIONS; do
  printf "$method / HTTP/1.1\nHost: www.aoe.vt.edu\n\n" | nc www.aoe.vt.edu 80
done
cd /usr/bin/samurai/nikto
./nikto.pl -host www.sec542.org -Format HTM -output ~/nikto.html
robots.txt
User-agent: *
Disallow: /images/
Disallow: /css/
individual pages
These should both be used, as different clients respect each of them:
<META HTTP-EQUIV="PRAGMA" CONTENT="NO-CACHE">
<META HTTP-EQUIV="CACHE-CONTROL" CONTENT="NO-CACHE">
These are useful for controlling search engine spiders and where they go
<META NAME="ROBOTS" CONTENT="INDEX,NOFOLLOW">
<META NAME="GOOGLEBOT" CONTENT="NOARCHIVE">
Robot Exclusion Protocol - http://www.robotstxt.org
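Since the Disallow entries are often worth visiting manually, pulling them can be scripted; a short sketch against the class target:

#!/usr/bin/python
# Sketch: fetch robots.txt and print each Disallow path as a candidate to browse.
import httplib

conn = httplib.HTTPConnection("www.sec542.org")
conn.request("GET", "/robots.txt")
resp = conn.getresponse()
if resp.status == 200:
    for line in resp.read().splitlines():
        if line.lower().startswith("disallow:"):
            print line.split(":", 1)[1].strip()
else:
    print "No robots.txt (status %d)" % resp.status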
robots=off - wget option (-e robots=off) to ignore robots.txt
wget -r http://www.sec542.org --no-check-certificate
java -jar webscarab.jar
Spidering tab
Browse to the site
Click Fetch Tree repeatedly until no other pages are found
cd /usr/bin/samurai/burpsuite_v1.2
java -jar burpsuite_v1.2.jar
Check the "spider running" check box
cd /usr/bin/samurai/cewl
./cewl.rb http://www.sec542.org
./cewl.rb http://www.sec542.org -w ~/cewl_wordlist
w3m -dump index.html > index.txt
Dreamweaver keeps notes for pages, which are stored in a _notes directory with a .mno extension
directory-list-2.3-small.txt
directory-list-2.3-medium.txt
directory-list-2.3-big.txt
directory-list-lowercase-2.3-big.txt
cd /usr/bin/samurai/DirBuster-1.0-RC1
java -jar DirBuster-1.0-RC1.jar
Enter http://www.sec542.org in Target URL
Browse to select a word list
Set other options as desired
Select Start
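The core idea can be prototyped in a few lines of Python; a sketch that walks one of the word lists named above (the local path to the list is an assumption):

#!/usr/bin/python
# Sketch: simple directory/file brute force, printing anything that is not a 404.
# The wordlist path is a placeholder; point it at one of the DirBuster lists above.
import httplib

for word in open("directory-list-2.3-small.txt"):
    word = word.strip()
    if not word or word.startswith("#"):
        continue
    conn = httplib.HTTPConnection("www.sec542.org")
    conn.request("HEAD", "/" + word)
    status = conn.getresponse().status
    if status != 404:
        print status, "/" + word
    conn.close()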
cd /usr/bin/samurai
java -jar webscarab.jar
Session ID Analysis
Select session within "Previous Requests:" drop down
Click Test
On Sec542 Target VM
cd Desktop/Jetty
./runjetty.sh
jetty runs on port 8080
Select "GET http://www.sec542.org:8080/examples/servlet/SessionExample 200 OK" from "Previous Requests:"
Click Test
Set the number of samples to 1000 and click Fetch
Switch to the Analysis tab
cd /usr/bin/samurai/burpsuite_v1.2
java -jar burpsuite_v1.2.jar
Clear Private Data, including cookies, in Firefox
Switch to the Burp proxy and visit the Jetty and Webmin bookmarks
Right-click SessionExample and select "send to sequencer"
Select the sequencer tab, then select each URL in turn and click "start capture"
Click "analyze now"
Click "Stop" when finished
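What the sequencer automates can be approximated by hand: request the page repeatedly and save each Set-Cookie value for later comparison. A sketch against the Jetty SessionExample above:

#!/usr/bin/python
# Sketch: collect session cookies from repeated requests so their randomness can be reviewed.
import httplib

out = open("session_ids.txt", "w")
for i in range(100):
    conn = httplib.HTTPConnection("www.sec542.org", 8080)
    conn.request("GET", "/examples/servlet/SessionExample")
    cookie = conn.getresponse().getheader("Set-Cookie")
    if cookie:
        out.write(cookie + "\n")
    conn.close()
out.close()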
Who does Kevin Johnson follow:
Who are the up and coming people?
* two methods
cd /usr/bin/samurai/grendel-scan
./grendel.sh
Enter the target URLs and click Add (one at a time)
http://www.sec542.org
http://www.sec542.org/scanners
Output directory
/home/samurai/scanresults/grendel/
Select plug-ins
In the menu, select:
Scan -> Start Scan
Launch Firefox and select Grendel-Scan in the SwitchProxy toolbar
select the bookmark for BASE
This adds the URL to the Grendel-Scan scan
open report.html in a browser.
$ ./w3af_console
w3af>>> plugins
w3af/plugins>>> output console,textFile
w3af/plugins>>> output config textFile
$ w3af_console -s filename
cd /usr/bin/samurai/w3af
./w3af_gui
Click the profile menu and select New Profile. Give it a name such as sec542
In the target URL, enter https://www.sec542.org/scanners