====== Developer/Security 542 Web App Penetration Testing and Ethical Hacking ======

March 7-12, 2011

Instructor: Kevin Johnson
  * 5000 Facebook accounts
  * Has twin brother Keith
  * Worked with Matt Carpener
  * Graduated H.S. 1991

Elluminate being used for video broadcast to about 30 remote students.

Sponsors:
  * Aruba Networks
  * FireEye
  * IBM

Quote from a tee shirt: "I am a bomb technician. If you see me running, try to keep up."

Breaks:
  * 1030-1050
  * 1500-1520
Lunch:
  * 1200-1330

Randy's cell: 250-7681

===== Day 1 The Attacker's View of the Web =====

==== Why the Web ..1.3 ====
Open Source Vulnerability Database, OSVDB

==== Web App Pen Testing ..1.8 ====
Security testing should be part of the job description.

==== Web Site Server Architecture ..1.13 ====

==== The HTTP Protocol ..1.20 ====
HTTP/1.1 is defined in RFC 2616: http://tools.ietf.org/rfc2616
Original design considerations: http://www.w3.org/Protocols/DesignIssues.html

== Example HTTP Request ..1.22 ==
  GET http://www.google.com HTTP/1.1
  Accept: */*
  Accept-Language: en-us
  User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727) Paros/3.2.13
  Host: www.google.com
  Proxy-Connection: Keep-Alive
  Cookie: PREF=ID=6aa36b...:LM=11198...:GM=1:S=CZy0...
  Content-Length: 0

== User-Agent ..1.23 ==
  * Mozilla/4.0 - signifies that the browser is compliant with the standards set by Netscape
  * MSIE 7.0 - Internet Explorer 7.0 is the software type
  * Windows NT 5.1 - this browser is running on Windows XP
  * .NET CLR 1.1.4322; .NET CLR 2.0.50727 - these two versions of the .NET client are supported
  * Paros/3.2.13 - added by Paros

Windows version strings:
  * Windows NT 6.0 - Windows Vista; Windows Server 2008
  * Windows NT 5.2 - Windows Server 2003; Windows XP x64 Edition
  * Windows NT 5.1 - Windows XP
  * Windows NT 5.01 - Windows 2000, Service Pack 1 (SP1)
  * Windows NT 5.0 - Windows 2000
  * Windows NT 4.0 - Windows NT
  * Win 9x 4.90 - Windows Millennium Edition (Windows Me)
  * Windows 98 - Windows 98
  * Windows 95 - Windows 95
  * Windows CE - Windows CE

McAfee dropped the app firewall if "scanalert" was in the User-Agent field.

== Origin Server ..1.25 ==
RFC 2616

== Same Origin Policy ..1.26 ==
Prevents scripts loaded from one site from reading or manipulating content served by another site.

== HTTP Request Methods ..1.31 ==
GET, POST, HEAD, TRACE, OPTIONS, CONNECT, PUT, DELETE

Check whether OPTIONS is enabled - it is not necessary.

== HTTP Response Codes ..1.36 ==
  * 1xx Informational
    * 100 Continue
  * 2xx Success
    * 200 OK
  * 3xx Redirection
    * 302 Found (redirect)
    * 304 Not Modified
  * 4xx Client Error
    * 401 Unauthorized
    * 404 Not Found
  * 5xx Server Error
    * 500 Internal Server Error
    * 502 Bad Gateway

=== Exercise: Examining HTTP Requests and Responses ..1.38 ===
  $ nc www.sec542.org 80
  POST /form_auth/login.php HTTP/1.0
  Content-Length: 42

  user=testuser&pass=opensesame&button=Login

=== Client Authentication ..1.46 ===
  * Basic ..1.49
  * Digest ..1.51
  * HTTP Client Certificate Authentication ..1.54
  * Windows Integrated Authentication ..1.57
  * Forms Based ..1.60

Paros - used as a simple proxy (a newer fork is ZAP, from OWASP); the examples use this for authentication.

=== Exercise: Client Authentication ..1.64 ===
  cd /usr/bin/samurai/paros
  java -jar paros.jar
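The Basic scheme above is just a base64-encoded header, which is easy to see from a script. A minimal Python 2 sketch in the same httplib style as the Day 2 scripting exercise; the /basic_auth/ path is an assumption, not a known path on the class target:

  #!/usr/bin/python
  # Minimal sketch: send HTTP Basic credentials by hand so the base64
  # encoding is visible. The /basic_auth/ path is hypothetical; substitute
  # a real protected resource on the class target.
  import base64
  import httplib

  creds = base64.b64encode("testuser:opensesame")
  conn = httplib.HTTPConnection("www.sec542.org")
  conn.request("GET", "/basic_auth/", headers={"Authorization": "Basic " + creds})
  resp = conn.getresponse()
  print resp.status, resp.reason  # 401 means the credentials were rejected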
=== Session Tracking ..1.74 ===
  * Client Side vs. Server Side ..1.77
  * Session ID
  * Cookies ..1.78
  * URL Encoding ..1.79
  * Hidden form fields ..1.80

=== SSL ..1.82 ===
SSLv2 - turn it off.

=== Exercise: Analyzing SSL ..1.88 ===
  * Wireshark on the web server
  * Import the server key to decrypt the data

==== Penetration Testing Types and Methods ..1.94 ====
  * Black Box ..1.95
  * Crystal Box ..1.96
  * Grey Box ..1.97

==== Web App Pen Test Components ..1.102 ====
  * Preparation ..1.103
  * Managing ..1.104
  * Scope ..1.105
  * Gathering Information ..1.106
  * Rules of Engagement ..1.107
  * Identifying Tester Traffic ..1.108
  * Time ..1.109
  * Communications Planning ..1.110

==== Reporting and Presenting Findings ..1.111 ====
  - Executive Summary ..1.114
  - Introduction ..1.115
  - Methodology ..1.116
  - Findings - includes recommendations ..1.117
  - Conclusions ..1.118

Chris Nickerson just released a sample report:
  * Twitter ID: indi303
  * Exotic Liability podcast

== Data Collection Tools ..1.120 ==
  * CAL9000
    * OWASP project written in HTML and JavaScript that runs in the browser
  * Freemind
    * mind mapping
  * Wiki

==== Attack Methodology ..1.125 ====
  * Reconnaissance ..1.127
    * Begin with a zone transfer
  * Mapping ..1.130
    * Spidering
    * Burp
    * w3af
  * Discovery ..1.132
    * finding issues, not exploiting them
  * Exploitation ..1.133
    * actually attacking flaws

==== Types of Flaws ..1.135 ====
  * Information Leakage ..1.136
  * Configuration Flaws ..1.137
  * Bypass Flaws ..1.138
    * Authentication bypass
    * Authorization bypass
    * File control bypass
    * Front-end bypass
  * Injection Flaws ..1.139
    * Command injection
    * Code injection
    * SQL injection
    * Cross Site Scripting (XSS)
    * HTTP Response Splitting
    * Cross Site Request Forgery (CSRF)
      * targets the trust a site has in the user, e.g. performing bank transactions on the target's behalf

==== JavaScript for Pen Testers ..1.140 ====
JavaScript can be included:
  * inline, in a script tag
  * as part of an HTML item (an event handler)
  * loaded from another document (an external .js file)

Test index.html. Change the inline script to a script tag that loads attack.js instead.
Create a file called attack.js and put in it:
  alert("Hello World!");
Test index.html.

Change attack.js to:
  function formChange() {
    document.forms[1].action="http://www.sec542.org/";
    document.forms[1].sushi.value="Toro";
  }
In the index.html file, add an event handler that calls formChange(). Reload the page in the browser. Submit the form to see the URI change.

Add this to the attack.js file:
  function createCookie() {
    document.cookie = "userid=kevin;expires=Fri, 27-Feb-2009;path=/";
  }
In the index.html file, add the mouseover event.

Expected results:
  * The word "Toro" should be pre-filled
  * A pop-up should be displayed due to the mouseover
  * The pop-up should contain the cookie
===== Day 2 Reconnaissance and Mapping =====

Python scripting

==== Creating Custom Scripts for Penetration Testing ..2.4 ====

=== Python for Penetration Testing ..2.6 ===

=== Exercise: Python Scripting ..2.16 ===
Create a file beginning with #!/usr/bin/python and make it executable:
  chmod 755 custom.py

  #!/usr/bin/python
  import httplib

  conn = httplib.HTTPConnection("www.sec542.org")
  url = "/python/index.php"
  conn.request('GET', url)
  resp = conn.getresponse()
  print resp.getheader("Server")
  print resp.getheader("Date")
  print resp.getheader("Cookie")

Then extend it to loop over id values and log the response codes:

  #!/usr/bin/python
  import httplib

  n = 1
  outfile = open("results.txt", "a")
  while n <= 100:
      conn = httplib.HTTPConnection("www.sec542.org")
      url = "/python/index.php?id=" + str(n)
      conn.request('GET', url)
      resp = conn.getresponse()
      print resp.getheader("Server")
      print resp.getheader("Date")
      print resp.getheader("Cookie")
      respcode = resp.status
      outfile.write(url + " " + str(respcode) + "\n")
      n += 1
  outfile.close()

==== Reconnaissance ..2.23 ====

==== Target Selection ..2.26 ====

==== Whois and DNS Records ..2.29 ====
  whatis whois
whois lookups for fbcgalax.org, www2.sans.org, www3.sans.org - looking at hosted services.

== nslookup ..2.33 ==
  nslookup [host] [DNS_server]
"set debug" gives more information.
nslookup is deprecated on Linux in favor of dig.

== dig ..2.34 ==
  * -t MX
  * -t AXFR
  * -t ANY
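The dig -t AXFR zone transfer can also be scripted. A minimal sketch assuming the dnspython library is available (it is not one of the course tools), using the class DNS server 192.168.1.41 from the exercises below:

  #!/usr/bin/python
  # Minimal zone-transfer sketch using dnspython (assumed installed; not
  # part of the course toolset). 192.168.1.41 is the class DNS server.
  import dns.query
  import dns.zone

  try:
      zone = dns.zone.from_xfr(dns.query.xfr("192.168.1.41", "sec542.org"))
      for name, node in zone.nodes.items():
          print node.to_text(name)
  except Exception, e:
      print "AXFR refused or failed:", e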
== Fierce ..2.36 ==
Simple scanner designed to find hosts within a domain, by Robert "RSnake" Hansen; updated to 2.0 by Joshua "Jabra" Abraham.
  * Performs a series of tests to find hosts within a domain
  * Queries the system DNS servers to find target DNS servers
  * Attempts to retrieve SOA records
  * Proceeds to guess host names using a wordlist
  * Once an IP address is found, does a reverse lookup on the next 5 and previous 5 IP addresses
  * Uses a set of prefixes to find hosts (e.g. www2, www3)

Look up various hosts in a domain:
  perl fierce.pl -dns sec542.org
Scan an IP range:
  perl fierce.pl -range 192.168.1.0-255 -dnsservers ns1.sec542.org
Perform reverse lookups after finding CNAMEs:
  perl fierce.pl -dns sec542.org -wide -output output.txt

=== Exercise: DNS Harvesting ..2.37 ===
  nslookup mail.sec542.org
  dig sec542.org
  dig sec542.org mx
Perform a zone transfer:
  dig AXFR sec542.org
  host -la sec542.org
Fierce:
  cd /usr/bin/samurai/fierce2
Run fierce with a DNS name specified:
  fierce -dns sec542.org
Run fierce on an IP range:
  fierce -range 192.168.1.0-255 -dnsservers 192.168.1.41

==== External Information Sources ..2.45 ====
  * Facebook
  * LinkedIn
  * MySpace
  * MSN - Bing
    * ip:72.14.204.147
  * Yahoo
  * Altavista
  * Google

== Google directives ..2.48 ==
  site:www.sans.org
  inurl:phpinfo
  intitle:"Admin Login"
  link:sans.org
  ext:xls

== Google Modifiers to Focus Searches ..2.49 ==
  * Quotes for literal matches: "sans Web Application"
  * "-" omits pages, or pages with specific strings, from results
    * site:sans.org -site:www.sans.org
    * site:sans.org -handlers
  * "*" is used as a keyword wildcard
  * "+" forces Google to include ignored words ("the", "and", "or", etc.)

== Google Groups ..2.50 ==
  * groups.google.com
  * also supports author:kevin@inguardians.com and insubject:"Problems with my code"

== SensePost's Aura ..2.53 ==
  * Converts Google SOAP API requests into general searches of the Google website
  * Violates Google's terms of service

== Foundstone's SiteDigger ..2.54 ==
  * Aura is needed if you do not have a Google SOAP API key

== SensePost's Wikto ..2.55 ==
  * Aura is needed if you do not have a Google SOAP API key

== cDc's Goolag Scanner ..2.56 ==
  * Google may block the IP address for 1 to 24 hours

== Google Alerts ..2.58 ==
  * Automated search method showing new results based on any search terms

== Newsgroups ..2.60 ==

== Social Networks ..2.61 ==

== Automated Social Network Parsing ..2.62 ==
Robin Wood created gpscan.rb to search Google profiles for specific targets:
  ./gpscan.rb Microsoft
http://www.digininja.org/projects/gpscan.php

Jason Wood created the LinkedIn user name generator, Reconnoiter:
  ./usernameGen.py InGuardians 10
http://www.jwnetworksconsulting.com/blog

== Maltego ..2.65 ==
  * Mapping tool that finds the relationships between people, sites, and companies
  * Community and Professional versions available

==== Mapping ..2.68 ====

== Mapping Phase Components ..2.69 ==
  * Port scan
  * OS fingerprinting
  * SSL analysis
  * Virtual hosting & load balancer analysis
  * Software configuration analysis
  * Spidering
  * Detailed analysis of spidering results

==== Port Scanning, OS Fingerprinting & Version Detection ..2.70 ====

== Nmap Port Scanner ..2.71 ==
  sudo nmap -sV -O www.inguardians.com

== Passive OS Fingerprinting with P0f ..2.73 ==
  * Passively detects operating system types based on traffic generated by the target
  * http://lcamtuf.coredump.cx/p0f.shtml

=== Exercise: Gathering Server Info ..2.82 ===
  sudo nmap -sV -O www.inguardians.com
  printf "GET / HTTP/1.0\n\n" | nc -v www.inguardians.com 80
(This does not work on my laptop.)

== HTTPrint ..2.79 ==
  * Fingerprints web servers
  * http://www.net-square.com/httprint
Side note:
  * Apache capitalizes header responses
  * IIS switches the order of Referer

== Netcraft Detection ..2.80 ==
  * Web server analysis site, reporting server use statistics for public sites
  * Kevin does not want to reveal customer information to Netcraft

=== Exercise: Gathering Server Info ..2.82 ===
  $ nc www.sec542.org 80
  HEAD / HTTP/1.0

  $ nc www.sec542.org 443
  HEAD / HTTP/1.0
(No SSL, so it cannot connect.)
  nmap -sV www.sec542.org
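The HEAD requests above are easy to script as well. A minimal Python 2 sketch in the style of the scripting exercise, printing the headers that identify the server:

  #!/usr/bin/python
  # Minimal banner-grab sketch: issue a HEAD request and print the response
  # headers that identify the server, mirroring the netcat exercise above.
  import httplib

  conn = httplib.HTTPConnection("www.sec542.org")
  conn.request("HEAD", "/")
  resp = conn.getresponse()
  print resp.status, resp.reason
  for header, value in resp.getheaders():
      print header + ":", value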
Launching HTTPrint in the VM: **Applications**, **Wine**, **Programs**, **HTTPrint**. In Options, **uncheck** the ICMP enable item in the "ICMP (Ping before testing)" area and click "**OK**". Load, **play**.

==== Analyzing SSL Support ..2.88 ====
  * Anything below SSLv3 or TLSv1 is older and has issues
  * Tools: openssl, THCSSLCheck, SSLDigger

== Scripting OpenSSL ..2.90 ==
Test for SSLv2:
  openssl s_client -connect www.aoe.vt.edu:443 -ssl2
Test for NULL cipher (clear text):
  openssl s_client -connect www.aoe.vt.edu:443 -cipher NULL
Verify SSLv3:
  openssl s_client -connect www.aoe.vt.edu:443 -ssl3

Script from sslthing:

  #!/bin/bash
  # sslthing - enumerate the ciphers a server will accept
  ossl=/usr/bin/openssl    # path to openssl (edit if needed)

  if ! [ $1 ]; then
    echo "syntax: $0 host:sslport [-v]"
    exit
  fi
  if ! [ -e $ossl ]; then
    echo "The path to openssl is wrong, please edit $0"
    exit
  fi

  # sslthing.tmp supplies stdin to s_client; an empty file works
  touch sslthing.tmp

  ## Request available ciphers from openssl and test them
  for ssl in -ssl2 -ssl3 -tls1
  do
    echo Testing `echo $ssl | cut -c2- | tr "a-z" "A-Z"`...
    $ossl ciphers $ssl -v | while read line
    do
      cipher=`echo $line | awk '{print $1}'`
      bits=`echo $line | awk '{print $5}' | cut -f2 -d\( | cut -f1 -d\)`
      if ($ossl s_client $ssl -cipher $cipher -connect $1 < sslthing.tmp 2>&1 | grep ^New > /dev/null); then
        if [ $2 ]; then
          echo $cipher - $bits bits - OK
        else
          echo $cipher - $bits bits
        fi
      else
        if [ $2 ]; then
          echo $cipher - $bits bits - Failed
        fi
      fi
    done | grep -v error
  done

== Using THC SSL Check to Evaluate Targets ..2.91 ==
  * Windows exe, but can run in Wine

== Using SSLDigger to Evaluate Targets ..2.92 ==
  * Foundstone tool

=== Exercise: Testing SSL ..2.94 ===

== SSL Testing: Launching THC SSL Check ..2.96 ==
Run the command using wine:
  $ THCSSLCheck.exe www.sec542.org 443 > ~/THC_443.txt
The wine "fixme" messages won't affect the program.
  $ THCSSLCheck.exe www.sec542.org 10000 > ~/THC_10000.txt

== SSL Testing: Launching SSLThing ..2.97 ==
  cd /usr/bin/samurai/sslthing
  ./sslthing www.sec542.org:443 > ~/Thing_443.txt
  ./sslthing www.sec542.org:10000 > ~/Thing_10000.txt

==== Virtual Hosting and Load Balancers ..2.100 ====
Detection techniques:
  * Bing ip: searches
  * URL analysis
  * Timestamp analysis
  * Last-Modified values comparison
  * Load balancer cookie detection
  * SSL differences
  * HTML source code discrepancies

==== Analyzing Software Configuration ..2.112 ====
  #!/bin/bash
  for method in GET POST PUT TRACE CONNECT OPTIONS; do
    printf "$method / HTTP/1.1\nHost: www.aoe.vt.edu\n\n" | nc www.aoe.vt.edu 80
  done

=== Exercise: Nikto ..2.118 ===
  cd /usr/bin/samurai/nikto
  ./nikto.pl -host www.sec542.org -Format HTM -output ~/nikto.html

==== Spidering a Site ..2.124 ====

== Robot Control ..2.126 ==
robots.txt:
  User-agent: *
  Disallow: /images/
  Disallow: /css/
Individual pages:
  * Meta tags to prevent caching of the content
  * Meta tags to control search engine spiders and where they go
Both should be used, as different clients respect each of them.
Robot Exclusion Protocol: http://www.robotstxt.org

== Automated Spidering with WebScarab ..2.128 ==
  * From OWASP
  * Ignores robots.txt and robots meta tags
  * Select the Fetch Recursively check box, then click Fetch Tree

== Automated Spidering with Paros ..2.130 ==
  * www.parosproxy.org

== Automated Spidering with Burp Suite ..2.131 ==
  * Free basic version, available from portswigger.net
  * Browse to the site and select the spider check box

== Automated Spidering with wget ..2.132 ==
  * -r recurse option
  * -l [N] maximum link recursion depth (default is 5)
  * Adheres to robots.txt directives; to ignore robots.txt: -e robots=off

== Specialized Spidering Tools ==
  * CeWL from Robin "digininja" Wood: http://www.digininja.org
    * Spiders a web site and then generates a word list for use as passwords or other dictionary attacks
    * Will grab information from EXIF data
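The core loop these spiders automate can be sketched in a few lines of Python 2 — fetch a page, extract its links, queue them. This toy version only prints the links from one page; the real tools add scope control, form handling, and recursion:

  #!/usr/bin/python
  # Toy spider sketch: fetch one page and print the links a spider would
  # queue. Real tools (WebScarab, Burp, wget -r) handle scope, forms, depth.
  import httplib
  import re

  conn = httplib.HTTPConnection("www.sec542.org")
  conn.request("GET", "/")
  body = conn.getresponse().read()
  for link in re.findall(r'href="([^"]+)"', body):
      print link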
=== Exercise: Web Spidering ..2.134 ===
  wget -r http://www.sec542.org --no-check-certificate
WebScarab:
  java -jar webscarab.jar
  * Spidering tab
  * Browse to the site
  * Fetch Tree; repeat Fetch Tree until no other pages are found
Burp:
  cd /usr/bin/samurai/burpsuite_v1.2
  java -jar burpsuite_v1.2.jar
  * Check the "spider running" check box
CeWL:
  cd /usr/bin/samurai/cewl
  ./cewl.rb http://www.sec542.org
  ./cewl.rb http://www.sec542.org -w ~/cewl_wordlist

==== Analyzing Spider Results ..2.144 ====
  * Comments that reveal useful or sensitive information
  * Commented code and links
  * Disabled functionality
  * Linked servers

== Convert HTML to Text ..2.150 ==
  w3m -dump index.html > index.txt

==== Application Flow Charting ..2.151 ====
  * Visio - has the capability built in
  * Kivio for Linux
  * OmniGraffle for Mac OS X ..2.154
  * WebScarab, Burp, and w3af have graphing built in ..2.155

=== Relationship Analysis ..2.156 ===
Dreamweaver has notes for pages, which get put in _notes with a .mno extension.

=== Exercise: OWASP DirBuster ..2.165 ===
Word lists:
  * directory-list-2.3-small.txt
  * directory-list-2.3-medium.txt
  * directory-list-2.3-big.txt
  * directory-list-lowercase-2.3-big.txt
Run it:
  cd /usr/bin/samurai/DirBuster-1.0-RC1
  java -jar DirBuster-1.0-RC1.jar
  * Enter http://www.sec542.org in Target URL
  * Browse to select a word list
  * Other options as desired
  * Select Start

==== Session Analysis ..2.170 ====

== Session Token Predictability ..2.174 ==

== WebScarab Session Token Gathering and Analysis ..2.177 ==

=== Exercise: Session Analysis ..2.179 ===

== Session Gathering: Testing Webmin ..2.181 ==
  cd /usr/bin/samurai
  java -jar webscarab.jar
  * Session ID Analysis
  * Select the session within the "Previous Requests:" drop down
  * Click Test

== Session Gathering: Testing Jetty ..2.182 ==
On the Sec542 Target VM:
  cd Desktop/Jetty
  ./runjetty.sh
Jetty runs on port 8080.
  * Select "GET http://www.sec542.org:8080/examples/servlet/SessionExample 200 OK" from "Previous Requests:"
  * Click Test
  * Set the number of samples to 1000 and click Fetch
  * Switch to the Analysis tab

== Session Gathering: Using Burp Sequencer ..2.183 ==
  cd /usr/bin/samurai/burpsuite_v1.2
  java -jar burpsuite_v1.2.jar
  * Clear private data, including cookies, in Firefox
  * Switch to the Burp proxy and visit the Jetty and Webmin bookmarks
  * Right-click SessionExample and select "send to sequencer"
  * Select the sequencer tab, then select each URL in turn and click "start capture"
  * Click "analyze now"; click "Stop" when finished
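The token gathering these tools perform can be approximated in the same Python 2 style as the scripting exercise. A minimal sketch against the Jetty SessionExample servlet used above — it only checks uniqueness, where WebScarab and Burp Sequencer do real statistical analysis of randomness:

  #!/usr/bin/python
  # Minimal token-gathering sketch: request the Jetty SessionExample page
  # repeatedly and count unique session cookies. Uniqueness only -- the
  # real tools analyze the statistical randomness of the tokens.
  import httplib

  tokens = []
  for i in range(20):
      conn = httplib.HTTPConnection("www.sec542.org", 8080)
      conn.request("GET", "/examples/servlet/SessionExample")
      cookie = conn.getresponse().getheader("set-cookie")
      if cookie:
          tokens.append(cookie.split(";")[0])
      conn.close()

  print "%d cookies collected, %d unique" % (len(tokens), len(set(tokens)))
  for t in tokens[:5]:
      print t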
===== Day 3 Server-Side Discovery =====

Who does Kevin Johnson follow on Twitter?
  * Robin Wood
  * Chris John Riley
  * Jason Wood - in Utah
Who are the up and coming people?
  * Frank Majo
  * Jack Daniel

==== Vulnerability Discovery Overview ..3.4 ====
  * This part of the methodology starts the exploitation part of the test

==== Automated Web Application Vulnerability Scanners ..3.7 ====
  * Web scanners interact with the site through spidering or proxies, and this interaction actually changes the way the plug-ins send traffic.

=== Grendel-Scan ..3.10 ===
  * http://grendel-scan.com
  * Automates a large portion of the discovery step
  * Provides an interface for manual interaction during the automatic scan
  * Further work has been done but not released in a new version; some updates are in code repositories

== Mapping Mode ..3.12 ==
Two methods:
  * Typical spider, but you can provide one or more URLs for the spider to start with
  * Interception proxy, which allows the user to manually walk through the application; Grendel-Scan then uses these pages as the basis of its scan
Things requiring human interaction:
  * Form submission pages
  * Dynamic JavaScript creating links

== Discovery Plug-ins ..3.13 ==

=== Grendel-Scan Exercise ..3.14 ===
  cd /usr/bin/samurai/grendel-scan
  ./grendel.sh
Enter the target URLs and click Add (one at a time):
  http://www.sec542.org
  http://www.sec542.org/scanners
Output directory:
  /home/samurai/scanresults/grendel/
Select plug-ins - the default scan, except:
  * uncheck Website Mirror under Application Architecture
  * check Comment Lister under Information Leakage
In the menu, select Scan -> Start Scan.
Launch Firefox and select Grendel-Scan in the SwitchProxy toolbar, then select the bookmark for BASE. This adds the URL to the Grendel scan.
Open report.html in a browser.

=== w3af ..3.20 ===
  * Open source web application scanner: http://w3af.sourceforge.net
  * GUI and command line interface (similar to Metasploit)
  * w3af is designed to perform the spidering part of mapping, all of the server-side vulnerability discovery, and exploitation
  * It bundles multiple tools, such as sqlmap and BeEF, as modules
  * Run the latest development versions
  * The GUI uses profiles ..3.22
    * Enter the URL, including protocol, as the starting point for the scan
  * The w3af console ..3.23
      $ ./w3af_console
      w3af>>> plugins
      w3af/plugins>>> output console,textFile
      w3af/plugins>>> output config textFile
  * A script file can be created with the commands in it ..3.24
      $ ./w3af_console -s filename
  * Discovery plug-ins ..3.25
    * Robots Reader - reads and reports on the robots.txt file
    * Detect Transparent Proxy - uses the TRACE method to find proxies
    * Google Spider - spiders using the Google cache
  * Evasion plug-ins ..3.26
    * Modifications that bypass mod_security installations
    * Adding self-referential directories to the URL: http://www.sec542.org/./cgi-bin/./script.pl
    * Using hex encoding
    * Changing the case of letters randomly
  * w3af audit plug-ins ..3.27
    * The discovery step in this course; tries to find flaws: XSS, SQL injection, response splitting
    * Some examples of audit plug-ins:
      * sslCertificate - finds issues with SSL configurations
      * unSSL - determines if SSL content is available via HTTP (concept sketch below)
      * osCommanding - attempts to find command injection flaws
  * w3af grep plug-ins ..3.28
    * Path disclosure
    * Code being disclosed
    * AJAX code
    * E-mail addresses
    * w3af can even determine the language used in the site
  * w3af brute force plug-ins
    * Use information gathered by the other plug-ins
    * E-mails gathered can be used as usernames
    * Words from the site can be used as passwords
    * Currently brute force plug-ins are available for forms-based and HTTP Basic authentication; no Digest, no NTLM
  * Running w3af
    * Enter one URL to start the test from (one is the limit)
    * Create a profile
    * Select which plug-ins to use
    * Select output
    * Make sure that in the configuration of the spider, you limit it to the target
    * Press Start
  * w3af results
    * The response navigator tab allows for search strings
  * w3af exploitation - depending on the discovered flaw, these features can be used to:
    * get command shell access on the target web server
    * hook browsers via XSS flaws in the server
    * interact with a command shell on a database server using sqlmap
    * numerous others
  * Kevin's favorite open source tool
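As a concept sketch (this is not w3af's code), the check an audit plug-in like unSSL performs — is HTTPS content also reachable over plain HTTP? — fits in a few lines of Python 2. The path is an assumption, and the class target may not serve HTTPS at all (see the Day 2 SSL exercise):

  #!/usr/bin/python
  # Concept sketch of an unSSL-style check (not w3af code): is the same
  # resource reachable over both HTTPS and plain HTTP? The path is an
  # assumption; the class target may not serve HTTPS at all (see Day 2).
  import httplib

  host, path = "www.sec542.org", "/form_auth/login.php"
  for conn in (httplib.HTTPSConnection(host), httplib.HTTPConnection(host)):
      try:
          conn.request("GET", path)
          print conn.__class__.__name__, conn.getresponse().status
      except Exception, e:
          print conn.__class__.__name__, "failed:", e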
=== w3af Exercise ..3.33 ===
  cd /usr/bin/samurai/w3af
  ./w3af_gui
Click the profile menu and select New Profile. Give it a name such as sec542.
In the target URL, enter https://www.sec542.org/scanners
Select the following plug-ins:
  * Audit
    * osCommanding
    * sqli
    * sslCertificate
    * unSSL
  * Discovery
    * allowedMethods
    * phpEggs
    * serverHeader
    * webSpider
  * Output
    * gtkOutput
    * htmlFile
Save Configuration - button on the plug-in screen.
After the scan completes, select the Exploit tab. Right-click on osCommandingShell and select "Exploit all until successful". We should see the shell become listed in the right-hand window. Double-clicking on this listed shell opens the shell for us to use.
Use the shell:
  * Only use non-interactive commands like id, who, and uname -a
Examine the results:
  * report.html

=== The Burp Suite ..3.40 ===
  * http://portswigger.net
  * Low-level access to the HTTP protocol, but not as user friendly as WebScarab
  * Can automatically rewrite requests

== Burp Suite Components ..3.42 ==
  * Burp Proxy - intercepts HTTP/S connections
  * Burp Spider - crawls a web site
  * Burp Intruder - attack tool that contains a large number of attack methods
  * Burp Repeater - repeats interactions/attacks
  * Burp Sequencer - analyzes session tokens
  * Burp Decoder - decodes various types of encoding for textual information
  * Burp Comparer - compares two pages together, implementing a form of "diff"

== Burp Proxy ..3.43 ==
  * Includes fine-grained rules to determine which requests/responses are intercepted
  * The proxy can automatically rewrite HTML
  * Can be used to remove client-side filtering or input limits

== Burp Spider ..3.44 ==
  * Requests sent through the proxy can be used to start the spider (right-click in proxy, "send to spider")
  * By default, it starts with the most recent (check "Spider Running")
  * Handles more difficult client-side code
  * Authenticates using credentials
  * Stores predetermined answers for forms

== Burp Intruder ..3.45 ==
  * Automated attack tool
  * Send requests from other tools to Intruder
  * Limited in the free version -- the fuzzer is throttled

== Burp Repeater ..3.46 ==
  * Resubmits transactions
  * Shows requests and responses
  * Stops after 10 redirects to prevent looping
  * Used to manually modify the request before submitting it

== Burp Sequencer ..3.47 ==
  * Analyzes randomness of session tokens

== Burp Decoder ..3.48 ==
  * Encodes and decodes data
  * Commonly used to decode base64 data

== Burp Comparer ..3.49 ==
  * Looks at two different items (requests or responses) to see the difference
  * Byte level or word level; it uses whitespace to determine words

=== Exercise: Burp Suite ..3.50 ===

=== SamuraiWTF ..3.57 ===

==== Manual Verification Techniques ..3.61 ====

=== Info Leakage and Dir Browsing ..3.63 ===

=== Exercise: Directory Browsing ..3.69 ===

=== Username Harvesting ..3.73 ===
(see the scripted sketch at the end of these notes)

=== Exercise: Username Harvesting ..3.81 ===

=== Command Injection ..3.86 ===

=== Exercise: Command Injection ..3.89 ===

=== SQL Injection ..3.100 ===

=== Exercise: SQL Injection ..3.116 ===

=== Blind SQL Injection ..3.120 ===

=== Cross Site Scripting ..3.131 ===

=== Exercise: XSS Exercise ..3.140 ===

=== Cross Site Request Forgery ..3.153 ===

== PHP (php.ini) settings to turn off ==
  * register_globals
  * display_errors
  * allow_url_fopen
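The Username Harvesting technique (..3.73, headings only in these notes) comes down to comparing login responses across candidate usernames. A hedged Python 2 sketch against the form_auth login from Day 1; the candidate names are made up, and the assumption that responses differ for valid vs. invalid users is illustrative:

  #!/usr/bin/python
  # Hedged sketch of username harvesting: submit a login per candidate user
  # and compare the responses. Assumes (illustratively) that the error page
  # differs for valid vs. invalid usernames; the candidate names are made
  # up. Uses the form_auth login form from the Day 1 exercise.
  import httplib
  import urllib

  for user in ["testuser", "admin", "kevin", "nosuchuser"]:
      params = urllib.urlencode({"user": user, "pass": "wrongpass", "button": "Login"})
      headers = {"Content-Type": "application/x-www-form-urlencoded"}
      conn = httplib.HTTPConnection("www.sec542.org")
      conn.request("POST", "/form_auth/login.php", params, headers)
      resp = conn.getresponse()
      body = resp.read()
      # Differences in status or body length hint at valid vs. invalid users
      print user, resp.status, len(body)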