🔐

In-scope targets only. Read the programme scope on HackerOne or Bugcrowd before running any tool. Passive tools (crt.sh, Subfinder passive mode) are safe on any in-scope target. Active DNS brute-forcing may be restricted — check programme policy explicitly before using Amass active mode.

60-DAY BUG BOUNTY COURSE PROGRESS
Day 6 / 60 — 10%

✅ D1:Intro
✅ D2:Burp Setup
✅ D3:HTTP
✅ D4:OWASP Top 10
✅ D5:Burp Deep Dive
▶ D6:Subdomain Enum
D7:XSS Hunting
D8–60:···

🗺️

In Day 5 you went deep on Burp Suite — Repeater, Intruder, the Proxy workflow, and how to intercept and modify every request flowing through the application. You now have the tool that will confirm every bug you find for the rest of this course.

Before you start firing Burp at request parameters, Day 6 answers the most important question in bug bounty: which targets are actually worth your time? The biggest mistake beginners make is going straight to the main domain and testing the same endpoints as every other hunter. The bugs that pay are almost never on the homepage. They live on the dev.target.com nobody discovered, the staging.target.com still running PHP 7.1, the api-v2.target.com the security team forgot to harden, and the internal.target.com that appears nowhere on the public site. Day 6 teaches you to build the full map before you test a single thing.

This lesson covers the complete active recon workflow — crt.sh certificate transparency, Subfinder, Amass, httpx live host filtering, ffuf directory fuzzing, and how to organise everything into a prioritised attack surface document that feeds directly into your Burp Suite testing workflow from Day 5.


Why Subdomains Are Where Bugs Actually Live

The main domain is the most tested, most monitored, and most hardened part of any programme. Security teams audit it constantly. Automated scanners run on it around the clock. Finding an original, unreported vulnerability there requires deep expertise and patience. Subdomains are a completely different story.

A programme with *.target.com in scope can have 300+ valid subdomains. Most hunters test fewer than 10. The hunter who maps all 300, filters to live hosts, and investigates the unusual ones consistently finds bugs others walk straight past.

🎯 HIGH-VALUE PATTERNS
dev. · staging. · test.
uat. · internal. · admin.
api-v2. · beta. · legacy.
corp. · old. · backup.

✅ WHY THEY PAY
Older software with known CVEs
Debug mode enabled in production
Weak or no authentication
Default credentials not changed
Internal APIs publicly exposed
Forgotten after project ends
No automated scanner coverage


Step 0 — Read Programme Scope Before Running Anything

⚠️ Scope first, always. On HackerOne and Bugcrowd, every programme has an Assets section listing in-scope and out-of-scope targets. A wildcard (*.target.com) covers all subdomains. Many programmes explicitly exclude specific assets. Testing out-of-scope gets your report closed and may get you banned from the programme.
✅ IN SCOPE: *.target.com # wildcard = all subdomains
✅ IN SCOPE: api.target.com # specific subdomain
❌ OUT OF SCOPE: careers.target.com # third-party ATS
❌ OUT OF SCOPE: blog.target.com # hosted on external platform

# Three questions to answer before starting:
Is active DNS brute-forcing / automated scanning permitted?
Are acquired / subsidiary domains in scope?
What is the safe harbour / responsible disclosure policy?
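Scope violations are usually automation accidents. One defensive habit is filtering every generated subdomain list against an explicit out-of-scope file before any tool touches it; a minimal sketch (the filenames are my own convention, not a platform feature):

```shell
# Out-of-scope assets copied by hand from the programme's Assets section
cat > out_of_scope.txt <<'EOF'
careers.target.com
blog.target.com
EOF

# -F fixed strings, -x whole-line match, -f patterns from file, -v invert:
# keep only lines NOT listed as out of scope
printf '%s\n' 'api.target.com' 'blog.target.com' 'dev.target.com' \
  | grep -vxFf out_of_scope.txt > in_scope.txt

cat in_scope.txt
# → api.target.com
#   dev.target.com
```

Run every combined list through this filter before httpx or ffuf ever see it.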


Passive Discovery — crt.sh Certificate Transparency

Certificate transparency logs are public, append-only records of SSL/TLS certificates issued for a domain. Every new subdomain deployed over HTTPS gets its certificate logged there permanently. crt.sh indexes these logs — completely passive, no rate limits, zero traffic to the target. Run this first on every programme target.

# ─── Query crt.sh via API (command line) ────────────────────────
curl -s "https://crt.sh/?q=%.target.com&output=json" \
  | jq -r '.[].name_value' \
  | sort -u \
  | grep -v '^\*' > crt.txt # strip wildcard entries

wc -l crt.txt # count unique subdomains found

# ─── Browser alternative ────────────────────────────────────────
# Visit: https://crt.sh/?q=%.target.com
# Look for dev., staging., internal., admin. patterns immediately

💡 crt.sh reveals historical subdomains. Certificate logs include subdomains from years ago — including decommissioned assets that still respond to HTTP but receive zero monitoring. Forgotten subdomains with outdated software are a consistent source of high-value bug bounty findings. Run crt.sh before any active tool.
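One way to put that history to work: crt.sh's JSON includes a not_before date for every certificate, so you can sort discoveries oldest-first and triage forgotten assets immediately. A minimal sketch, assuming crt.sh's current field names (the sample records below are fabricated):

```shell
# Hand-made sample shaped like crt.sh JSON records (hostnames illustrative)
cat > crt_sample.json <<'EOF'
[{"name_value":"legacy.target.com","not_before":"2017-03-02T00:00:00"},
 {"name_value":"www.target.com","not_before":"2024-01-15T00:00:00"}]
EOF

# Prefix each name with its certificate year, oldest first
jq -r '.[] | "\(.not_before[:4]) \(.name_value)"' crt_sample.json | sort
# → 2017 legacy.target.com
#   2024 www.target.com
```

A 2017 certificate on a host that still answers HTTP is exactly the kind of asset the tip above describes.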

Subfinder — 40+ Passive Sources in One Command

Subfinder by ProjectDiscovery queries over 40 passive sources simultaneously — certificate transparency, DNS databases, Shodan, VirusTotal, search engine indices, and more — returning a clean deduplicated list. It is the industry-standard passive subdomain enumeration tool for bug bounty hunters and comes pre-installed on Kali Linux.

# ─── Install / verify ───────────────────────────────────────────
subfinder -version
sudo apt install subfinder -y # if not installed

# ─── Basic subdomain discovery ──────────────────────────────────
subfinder -d target.com -silent -o subfinder.txt

# ─── Verbose — see which source found each subdomain ────────────
subfinder -d target.com -v # great for learning what each source contributes

# ─── Merge with crt.sh results ──────────────────────────────────
cat crt.txt subfinder.txt | sort -u > combined.txt
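Before trusting any single tool, it is worth measuring what each source actually contributes. POSIX comm can diff the sorted lists; the filenames follow the steps above, with fabricated sample data standing in for real tool output:

```shell
# Fabricated sample data standing in for real crt.sh / Subfinder output
printf 'a.target.com\nb.target.com\n' > crt.txt
printf 'b.target.com\nc.target.com\n' > subfinder.txt

sort -u crt.txt -o crt.txt               # comm requires sorted input
sort -u subfinder.txt -o subfinder.txt

# -13 suppresses columns 1 and 3, leaving lines unique to the second file,
# i.e. subdomains only Subfinder found
comm -13 crt.txt subfinder.txt > subfinder_only.txt
cat subfinder_only.txt
# → c.target.com
```

If one source consistently contributes nothing on a programme, you can drop it from that pipeline.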


Amass — Deep Passive Recon & Org-Wide Intelligence

Amass, an OWASP project, digs deeper than Subfinder: its passive mode aggregates further data sources, and its intel module maps the organisation itself, surfacing ASNs, IP ranges, and related domains that may legitimately expand your scope.
# ─── Amass passive mode (always safe, no target contact) ────────
amass enum -passive -d target.com -o amass.txt

# ─── Amass intel — find related domains owned by same org ───────
amass intel -org "Target Company Name"
# Reveals ASN, IP ranges, and related domains = expanded scope

# ─── Active DNS brute-force (scope-permitted only!) ─────────────
amass enum -active -brute -d target.com \
            -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt
# ⚠️ Only use -active when programme explicitly permits automated scanning


httpx — Filter Hundreds of Subdomains to Live Targets Only

After enumeration you have a large raw list — many dead, parked, or no longer resolving. httpx probes each one over HTTP and HTTPS, records which respond, and returns page title, status code, and technology stack for each. Your raw list goes from noise to testable signal in one command.


Kali Linux — httpx Live Host Filtering (In-Scope Bug Bounty Target)
└─$ cat combined.txt | httpx -mc 200,301,302,403 -title -tech-detect -o live.txt
https://target.com [200] [Home | Target Inc] [nginx, React]
https://api.target.com [200] [API Gateway v2.1] [nginx, Node.js]
https://dev.target.com [200] [Dev Env — NOT FOR PUBLIC] [Apache/2.2]
https://staging.target.com [200] [Staging Server] [Apache/2.2, PHP/7.1]
https://admin.target.com [403] [Forbidden] [nginx]
https://internal.target.com [200] [Internal Dashboard] [Django 2.1]
https://cdn.target.com [200] [CDN Assets] [CloudFront]
# 347 raw subdomains → 7 live hosts with tech stacks identified
PRIORITY: dev (Apache 2.2 EOL) · staging (PHP 7.1 EOL) · internal (Django 2.1 EOL, no auth) · admin (403 bypass)

httpx filters 347 raw subdomains to 7 live hosts with tech stacks identified. Priority targets emerge instantly: dev.target.com (Apache 2.2 — end-of-life, check NVD for CVEs), staging (PHP 7.1 — no security patches since December 2019), internal (Django 2.1 EOL, unauthenticated dashboard), admin (403 Forbidden — try bypass headers). Next step: open each priority target in your Burp-proxied browser (Day 5 setup) and capture all requests into HTTP History before testing.
# ─── Full httpx command ─────────────────────────────────────────
cat combined.txt | httpx \
  -mc 200,301,302,403 \
  -title \
  -tech-detect \
  -o live.txt
# -mc: match these status codes · -title: capture page title
# -tech-detect: identify web technology stack · -o: save to file

# ─── Extract priority subdomain patterns immediately ────────────
grep -iE "dev\.|staging\.|admin\.|internal\.|test\.|beta\." live.txt
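grep treats every keyword equally, but some patterns deserve to float to the top. A small awk scorer can rank live.txt instead; the weights here are arbitrary illustrations, not an established scheme:

```shell
# Rank httpx output lines by a rough interest score (weights are made up)
score_hosts() {
  awk '{
    score = 0
    if ($0 ~ /dev\.|staging\.|internal\./) score += 3
    if ($0 ~ /admin\.|api/)                score += 2
    if ($0 ~ /\[403\]/)                    score += 1
    print score "\t" $0
  }' | sort -rn
}

# Fabricated httpx-style lines for illustration
printf '%s\n' 'https://cdn.target.com [200]' \
              'https://dev.target.com [200]' \
              'https://admin.target.com [403]' | score_hosts > ranked.txt
cat ranked.txt
```

The top of ranked.txt is your P1 list; zero-score hosts drop to the bottom.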


ffuf — Directory Fuzz Live Targets to Find Hidden Endpoints

Once you have live priority subdomains, directory fuzzing finds hidden admin panels, API paths, backup files, and forgotten endpoints not publicly linked. Pair ffuf findings with Burp Suite (Day 5 Deep Dive) — browse each discovered path through Burp Proxy to capture the full request-response cycle in HTTP History.

# ─── Install ffuf ───────────────────────────────────────────────
sudo apt install ffuf -y

# ─── Basic directory fuzz on a priority subdomain ───────────────
ffuf -u https://dev.target.com/FUZZ \
     -w /usr/share/seclists/Discovery/Web-Content/directory-list-2.3-medium.txt \
     -mc 200,301,302,403 \
     -t 40 -o dirs.json

# ─── Filter false positives (custom 404 pages) ──────────────────
ffuf -u https://dev.target.com/FUZZ \
     -w wordlist.txt -mc 200,301 -fs 1234 # filter by response size

# ─── Hunt high-value extensions alongside directories ───────────
ffuf -u https://dev.target.com/FUZZ \
     -w wordlist.txt \
     -e .php,.bak,.zip,.sql,.env,.txt \
     -mc 200,301
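The -o dirs.json output is plain JSON, so triage is scriptable with the same jq you used for crt.sh. A sketch against a hand-made sample shaped like ffuf's results array (verify the field names against your own output, as they can shift between versions):

```shell
# Hand-made sample mimicking ffuf's JSON results array
cat > dirs.json <<'EOF'
{"results":[{"url":"https://dev.target.com/admin","status":200},
            {"url":"https://dev.target.com/backup.zip","status":200}]}
EOF

# Pull out status + URL for quick triage
jq -r '.results[] | "\(.status) \(.url)"' dirs.json
# → 200 https://dev.target.com/admin
#   200 https://dev.target.com/backup.zip
```

Feed the resulting URLs straight into your Burp-proxied browser so each hit lands in HTTP History.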


The Complete 7-Command Recon Pipeline


ATTACK SURFACE MAP — BUG BOUNTY DAY 6 — securityelites.com
🔥 PRIORITY TARGETS — OPEN IN BURP (DAY 5) IMMEDIATELY
dev.target.com → Apache 2.2.8 EOL, /admin/ [200], /debug/ [200]
staging.target.com → PHP 7.1 EOL, /phpinfo.php [200] ← server config exposed
internal.target.com → Django 2.1 EOL, /dashboard/ [200] no auth required
admin.target.com → 403 Forbidden → test X-Original-URL / X-Forwarded-For bypass

STATS: 347 raw → 7 live → 3 critical priority + 1 bypass candidate

NEXT: Load into Burp Suite → HTTP History → Repeater for each priority endpoint

Attack Surface Map — the output of the full Day 6 pipeline. Three sections: priority targets with findings annotated (EOL software + unauthenticated endpoints = immediate Burp Suite testing), stats summary (raw to live ratio), and next-action note pointing directly back to the Day 5 Burp Suite Deep Dive workflow. phpinfo.php returning 200 on staging is critical — it exposes PHP version, server config, environment variables, and potentially credential paths in plain text.
# ─── Full pipeline — run in sequence ────────────────────────────

# 1. Certificate transparency (passive, zero target contact)
curl -s "https://crt.sh/?q=%.target.com&output=json" | jq -r '.[].name_value' | sort -u | grep -v '^\*' > crt.txt

# 2. Subfinder aggregation (passive, 40+ sources)
subfinder -d target.com -silent -o subfinder.txt

# 3. Amass passive
amass enum -passive -d target.com -o amass.txt

# 4. Combine and deduplicate all sources
cat crt.txt subfinder.txt amass.txt | sort -u > all_subs.txt

# 5. Filter to live hosts with tech detection
cat all_subs.txt | httpx -mc 200,301,302,403 -title -tech-detect -o live.txt

# 6. Extract and review priority subdomain patterns
grep -iE "dev\.|staging\.|admin\.|internal\.|test\." live.txt

# 7. Directory fuzz each priority target
ffuf -u https://PRIORITY_SUBDOMAIN/FUZZ -w /usr/share/seclists/Discovery/Web-Content/directory-list-2.3-medium.txt -mc 200,301,403 -t 40
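One pitfall in step 4: tools sometimes disagree on letter case and trailing dots, which quietly defeats sort -u. A tiny normaliser built from POSIX tools keeps the dedup honest (a sketch; the filenames are illustrative):

```shell
# Lowercase, strip trailing dots, dedupe — use in place of plain sort -u
normalize_subs() {
  tr 'A-Z' 'a-z' | sed 's/\.$//' | sort -u
}

# Three spellings of two hosts collapse to two clean entries
printf '%s\n' 'API.Target.com' 'api.target.com.' 'dev.target.com' \
  | normalize_subs > normalized.txt
cat normalized.txt
# → api.target.com
#   dev.target.com
```

Drop it into step 4 as `cat crt.txt subfinder.txt amass.txt | normalize_subs > all_subs.txt`.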


Prioritisation & Burp Suite Handoff

After the pipeline runs, you have a map. Here is how to prioritise it and hand it directly to your Burp Suite workflow from Day 5.

P1 — NOW
EOL software + unauthenticated panels + open APIs
Open each in Burp-proxied browser (Day 5 setup). HTTP History captures every request. Send interesting parameters to Repeater. Apache 2.2, PHP 7.1, Django 2.1 — check NVD for known CVEs before manual testing.

P2 — TODAY
403 bypass + phpinfo.php + .env + backup files
403 on /admin/ — send the request to Burp Repeater (Day 5), add an X-Original-URL: /admin/ header and retry. phpinfo.php = server config in plaintext. .env and .sql files = credentials and full database dumps.

P3 — LATER
API endpoints + authenticated areas + main domain
Most hunted, lowest unique-finding rate. Test after priority subdomains are covered. Use Burp Intruder and the request-manipulation skills from Day 5 for systematic parameter testing.

📋 Subdomain Recon Reference Card


SUBDOMAIN RECON REFERENCE — BUG BOUNTY DAY 6 — securityelites.com
PASSIVE (ALWAYS SAFE)
curl -s "https://crt.sh/?q=%.DOMAIN&output=json" \
 | jq -r '.[].name_value' | sort -u
subfinder -d DOMAIN -silent -o subs.txt
amass enum -passive -d DOMAIN

FILTER TO LIVE HOSTS
cat subs.txt | httpx \
 -mc 200,301,302,403 \
 -title -tech-detect -o live.txt
grep -iE "dev\.|staging\." live.txt

DIRECTORY FUZZ (ffuf)
ffuf -u https://TARGET/FUZZ \
-w directory-list-2.3-medium.txt \
-mc 200,301,403 -t 40
# Add -e .bak,.env,.sql,.php

→ BURP SUITE (DAY 5)
Open priority targets in browser
Burp Proxy ON → capture requests
HTTP History → send to Repeater
Systematic param testing begins

Subdomain Recon Reference Card — Day 6 Bug Bounty Course. Four panels: passive pipeline (crt.sh + Subfinder + Amass), live host filtering with httpx, directory fuzzing with ffuf, and the handoff to Burp Suite from Day 5. The Burp panel is deliberate — recon feeds directly into the Day 5 proxy workflow. After this pipeline, every priority subdomain gets opened in the Burp-proxied browser to populate HTTP History before any active testing begins.

Day 6 Complete — Attack Surface Mapped
60-Day Bug Bounty Course — All Free, All Practical

You have the map and the Burp Suite skills from Day 5. Day 7 teaches you to hunt the most consistently findable vulnerability type across those targets — XSS.

Frequently Asked Questions

What is subdomain enumeration in bug bounty?
Discovering all subdomains beyond the main website. Dev, staging, admin, and API subdomains are less hardened, run older software, expose internal functionality, and receive far less security scrutiny — making them the highest-probability targets for unique findings.
What tools are used for subdomain enumeration?
crt.sh (certificate transparency, fully passive), Subfinder (40+ passive sources), Amass (passive + optional active DNS), httpx (filter to live hosts with tech detection), ffuf/Gobuster (directory fuzzing). Always combine multiple tools — each source finds different subdomains.
Passive vs active subdomain enumeration?
Passive (crt.sh, Subfinder, Amass passive) uses public data with zero target contact — safe on any in-scope programme. Active (DNS brute-force) touches target DNS and leaves server logs. Verify programme policy before using active tools.
Why focus on dev and staging subdomains?
Older unpatched software (known CVEs), debug mode on, weak or missing authentication, exposed internal APIs, forgotten after project ends. A staging subdomain running EOL software is an instant reproducible finding.
What is httpx used for in bug bounty recon?
Filters hundreds of raw subdomains to only live responding hosts — with page title, status code, and tech stack. 500 raw subdomains becomes 80 testable targets with immediate context for each.
How do I find subdomains for a bug bounty target?
Pipeline: crt.sh → subfinder → amass passive → cat all | sort -u → httpx to filter live → ffuf to fuzz priorities. Combine all sources. Load priority targets into Burp (Day 5 setup) for systematic testing.

ME
Mr Elite
Founder, SecurityElites.com

The highest-paying bug I ever found was on legacy-api.target.com — not listed anywhere public, found via crt.sh in under 10 minutes. No authentication on a user data endpoint. Complete account takeover for any user ID. It had been live for three years. Three years of zero testing because no one looked. Do the recon other hunters skip. Then open those priority subdomains in Burp exactly as you set it up in Day 5 and work through every parameter methodically.

Up Next — Day 7
XSS Hunting — Find & Report Cross-Site Scripting Bugs
You have the map from today and Burp Suite from Day 5. Day 7 teaches you to hunt XSS across every target you just discovered.

Day 7: XSS Hunting →
