
TCM PEH Recon Module Notes


Passive Information Gathering

Location Information

Thorough location recon
Satellite images
Drone recon
Building layout (badge readers, break areas, security, fencing)

Job Information

Thorough personnel recon
Employees (names, job title, phone number, managers, etc)
Pictures (badge photos, desk photos, computer photos, etc)

Web/Host (note: most of this is active recon, not passive)

Target validation via:
WHOIS, nslookup, dnsrecon
Finding subdomains via:
Google fu, dig, nmap, Sublist3r, BuiltWith, netcat
Fingerprinting via:
nmap, Wappalyzer, WhatWeb, BuiltWith, netcat
Data breaches via:
HaveIBeenPwned, breach-parse, WeLeakInfo
Data breaches are his most-used method of gaining access these days, by far

hunter.io

A website, and realistically it's used quite a bit. This is the first stop.

  1. sign in
  2. dashboard
  3. enter domain & search
  4. lists findings and “Most common pattern:”
    Most common pattern for my target is output as {f}{last}@domain
    Also listed job titles as found
    Also lists source each email was pulled from
    CAN EXPORT TO CSV!!!
    This already looks extremely useful for osint!
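Once hunter.io gives you the pattern, you can generate candidate addresses yourself from any name list (e.g. scraped from LinkedIn). A minimal sketch of that idea; the names and domain below are made up:

```python
# Generate candidate emails using the {f}{last}@domain pattern that
# hunter.io reported for the target. All names here are hypothetical.
def make_email(full_name: str, domain: str) -> str:
    first, last = full_name.lower().split()
    return f"{first[0]}{last}@{domain}"

employees = ["Jane Doe", "John Smith"]  # e.g. scraped from LinkedIn
emails = [make_email(n, "example.com") for n in employees]
print(emails)  # ['jdoe@example.com', 'jsmith@example.com']
```

Feed the resulting list into breach-parse or a phishing campaign tool.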

Breach-parse

https://github.com/hmaverickadams/breach-parse
it’s just a crappy script he made; it requires downloading a ~44GB data dump of breaches
can possibly use this
see https://www.troyhunt.com/the-773-million-record-collection-1-data-breach/
bigger version of breach collections at https://raidforums.com/Thread-Collection-1-5-Zabagur-AntiPublic-Latest-120GB-1TB-TOTAL-Leaked-Download
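At its core, breach-parse just greps the dump for your target domain and splits out credentials. A rough Python sketch of that idea, with fabricated sample lines standing in for the 44GB dump:

```python
# Sketch of what breach-parse does: filter a dump of "email:password"
# lines down to one target domain. Sample lines are fabricated.
def parse_breach(lines, domain):
    hits = []
    for line in lines:
        email, _, password = line.strip().partition(":")
        if email.lower().endswith("@" + domain):
            hits.append((email, password))
    return hits

sample = [
    "jdoe@example.com:Winter2019!",
    "someone@other.org:hunter2",
    "jsmith@example.com:P@ssw0rd",
]
print(parse_breach(sample, "example.com"))
```

The real tool does the same thing across the downloaded dump, split across output files for emails, passwords, and combined creds.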

theharvester

Built into Kali, so it's always at hand
Some of the data sources need API keys
It's not GREAT, but it works and is always there

Web Information Gathering

Scraping the domain is not enough, need all subdomains as well

Sublist3r
apt install sublist3r
sublist3r -h to get syntax
sublist3r -d domain
Will pick up 4th level subdomains without using any recursive flags
If it’s slow then -t <n> where <n> = threads

https://crt.sh
*.domain
This will enumerate all the certs, nice for subdomain enumeration
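crt.sh can also return JSON (append &output=json to the query), which is handy for scripting. A sketch that dedupes subdomains out of that output; the sample data here is hand-made rather than fetched live:

```python
import json

# Pull unique subdomains out of crt.sh JSON output
# (https://crt.sh/?q=%25.example.com&output=json).
# sample_json is fabricated; really you'd fetch it with curl/requests.
sample_json = json.dumps([
    {"name_value": "www.example.com\ndev.example.com"},
    {"name_value": "mail.example.com"},
    {"name_value": "www.example.com"},
])

def subdomains_from_crtsh(raw: str) -> list:
    names = set()
    for entry in json.loads(raw):
        for name in entry["name_value"].splitlines():
            names.add(name.strip().lstrip("*.").lower())
    return sorted(names)

print(subdomains_from_crtsh(sample_json))
# ['dev.example.com', 'mail.example.com', 'www.example.com']
```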

OWASP Amass
Install instructions at https://github.com/OWASP/Amass
Will do a lot more than Sublist3r
It’s what bug bounty guys use

tomnomnom’s httprobe
https://github.com/tomnomnom/httprobe
Takes a list of domains (like Sublist3r output) and probes for alive/dead
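The idea behind httprobe is simple enough to sketch: for each host, try https:// then http:// and keep whatever answers. A rough Python equivalent (the actual network call is commented out so nothing gets probed by accident):

```python
import urllib.request

# Rough equivalent of httprobe: build the candidate URLs for each host,
# then check which ones respond. Hosts below are placeholders.
def candidate_urls(host: str) -> list:
    return [f"https://{host}", f"http://{host}"]

def is_alive(url: str, timeout: float = 3.0) -> bool:
    try:
        urllib.request.urlopen(url, timeout=timeout)  # real network request
        return True
    except Exception:
        return False

hosts = ["www.example.com"]  # e.g. lines from Sublist3r output
print(candidate_urls(hosts[0]))
# alive = [u for h in hosts for u in candidate_urls(h) if is_alive(u)]
```

The real tool is concurrent and much faster; this just shows the shape of it.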

What it’s built with

You never know where a vulnerability will be

https://builtwith.com
Enter domain and search

Wappalyzer
Webapp analyzer Firefox plugin
Search Firefox extensions for Wappalyzer, add
Go to domain
Click the plugin, accept
Will give you an immediate short listing of stuff
WAPPALYZER IS ACTIVE RECON

Whatweb
whatweb <url>
Will give even more info on version info
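A lot of what these fingerprinting tools key off is just response headers. A toy version of the idea, using fabricated sample headers rather than a live request:

```python
# Toy fingerprinter: pull tech hints out of HTTP response headers,
# the same signals whatweb/Wappalyzer start from. Headers are made up.
def fingerprint(headers: dict) -> dict:
    hints = {}
    if "Server" in headers:
        hints["server"] = headers["Server"]
    if "X-Powered-By" in headers:
        hints["platform"] = headers["X-Powered-By"]
    for cookie in headers.get("Set-Cookie", "").split(";"):
        if cookie.strip().startswith("PHPSESSID"):
            hints["language"] = "PHP"
    return hints

sample_headers = {"Server": "Apache/2.4.41 (Ubuntu)", "X-Powered-By": "PHP/7.4.3"}
print(fingerprint(sample_headers))
```

The real tools match hundreds of signatures (headers, cookies, HTML, JS paths); this is just the first layer.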

Burp Suite

A web proxy, you know

Initial setup

  1. Start Burp
  2. Open Firefox
  3. Go to Preferences → Network Settings and enter 127.0.0.1:8080 for the proxy (or just use the FoxyProxy extension instead)
  4. Go to http://burp
  5. Allow the cert permanently
  6. Click on CA Certificate, save
  7. Go back into Firefox Preferences → Privacy and Security → View Certificates → Import, select the CA cert, Open, check both boxes, OK
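The same proxy setting works for scripted traffic: point your tooling at 127.0.0.1:8080 and its requests show up in Burp alongside the browser's. A sketch with Python's urllib (Burp must actually be running before you uncomment the request):

```python
import urllib.request

# Route scripted requests through the same local proxy Firefox uses,
# so they land in Burp's Proxy/Target tabs. Burp is assumed to be
# listening on its default 127.0.0.1:8080.
proxies = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}
opener = urllib.request.build_opener(urllib.request.ProxyHandler(proxies))
# opener.open("http://example.com")  # would be intercepted by Burp
print(proxies["http"])
```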

Info gathering

It intercepts the web requests that are sent to and from webservers
Target tab shows intercepted traffic, including linked API traffic and web plugins
Use this info to enumerate the website by clicking through responses
Burp Suite Pro is $399 per year, btw

Google Fu

lmgtfy

Search for “google search operators”
site:<domain> to search only within that domain
-<term> to remove <term> from results
filetype:<extension> to search only for that doc type
Combine these in single strings for best results
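The combining step is just string assembly. A small helper that builds a dork from the operators above; the site and search terms are placeholders:

```python
# Assemble a Google dork from the operators above.
# Site, filetype, and terms here are placeholder values.
def build_dork(site=None, filetype=None, exclude=(), terms=()):
    parts = list(terms)
    if site:
        parts.append(f"site:{site}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    parts += [f"-{t}" for t in exclude]
    return " ".join(parts)

print(build_dork(site="example.com", filetype="pdf",
                 exclude=["marketing"], terms=["password"]))
# password site:example.com filetype:pdf -marketing
```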

Utilizing Social Media

Someone always posts employee group pics on Linkedin and Twitter
Yields badge photos for fake badge models
Yields hardware/software photos for intel

Linkedin always yields people and positions
Can scrape Linkedin Company People to generate list of emails by email format (f)(lastname)@(domain)
