About ℹ️ • Description 🔬 • Installation 🛠️ • Versions 📦
## About ℹ️

SARA (Security Assistant Researcher Analyzer) is an asynchronous web crawler with enumeration capabilities. The primary goal of the tool is to crawl target addresses, analyze source code and JavaScript files, and search for keywords, comments, and inline scripts. It also supports custom headers for flexibility.

An animation indicates that the program is running: the `I'm researching` message gains a `.` every second to show activity.
Key features:

- Crawling and Analysis: SARA scans pages, extracting valuable information such as links and JavaScript files, and analyzing their content. It can crawl multiple levels deep.
- Directory and Subdomain Enumeration: Perfect for discovering niche endpoints like `/api` or `/graphql`, as well as enumerating subdomains.
- Support for GET and POST Requests: All modes (`-c`, `--enum-d`, `--enum-s`) support both `GET` and `POST` requests for more flexible crawling and enumeration (see the example invocations after this list).
- Headless Browser to Bypass WAF (`-hl`).
- Simulated User Behavior:
  - The `User-Agent` header is randomly chosen from a preset list to mimic real browsers (e.g., Chrome, Firefox) and operating systems (e.g., Macintosh, Linux, Windows).
  - Users can specify their own `User-Agent`, overriding the default one.
  - The tool introduces a delay between requests (1–5 seconds) to simulate natural user behavior, minimizing noise and suspicion.
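For a quick illustration of the three primary modes, here are sample invocations (a sketch based on the flags documented in the Description section below; the combinations are assumed, so check `-h` for the exact syntax):

```bash
# Crawling mode against a single target
python3 sara.py -t https://example.com -c

# Directory enumeration with the default wordlist (full URL required)
python3 sara.py -t https://example.com --enum-d

# Subdomain enumeration (domain only; HTTPS by default)
python3 sara.py -t example.com --enum-s
```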
Typical use cases:

- Conducting penetration tests.
- Bug bounty programs that require deep automation with minimal detection and noise.
## Description 🔬

Here is the `--help` command output, option by option:
**`-t`**

- Accepts a single URL, a domain, or a file containing multiple targets.
- Works in conjunction with the three primary modes: `-c`, `--enum-s`, and `--enum-d`.

**`-c` (Crawling Mode)**

- Accepts either a single URL or a file containing multiple URLs.
- In crawling mode, SARA gathers key data about the target, including:
  - HTTP response codes
  - Links on the target web page
  - HTTP response headers and their analysis
  - JavaScript files and their analysis
  - Inline scripts
  - Keywords and comments
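Since `-t` also accepts a file of targets, a multi-target crawl might look like this (a sketch; `targets.txt` is a hypothetical file name, and one target per line is assumed):

```bash
# Crawl every target listed in targets.txt (hypothetical file, one URL per line assumed)
python3 sara.py -t targets.txt -c
```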
**`-d` (Deep Crawling & Controlled Exploration)**

- With `-d` or `--depth`, SARA can crawl multiple levels deep.
- Use `--depth 99` to enable interactive mode, where you decide whether to proceed to the next depth level.

**`-hl` (Headless Browser)**

- Enables crawling with a headless browser (via Playwright), which helps bypass WAF protections, JavaScript-rendered pages, and cloaked content.
- Useful for targets that return misleading or blocked responses to regular HTTP requests.

**`--enum-d` (Directory Enumeration)**

- Performs directory enumeration. Since the tool intentionally uses slow request rates (1–5 seconds per request), it is recommended for highly specific endpoint enumeration, such as APIs.
- Requires a target with a full URL, including the protocol (e.g., `https://example.com`).
- Accepts one target at a time.
- Can use a default wordlist or a custom file for directories.
- Default wordlist: `/admin`, `/login`, `/dashboard`, `/config`, `/api`, `/robots.txt`, `/sitemap.xml`, `/env`, `/private`, `/uploads`, `/tmp`, `/health`, `/metrics`, `/status`, `/graphql`, `/graphiql`

**`--enum-s` (Subdomain Enumeration)**

- Performs subdomain enumeration. Unlike `--enum-d`, this mode accepts only a domain as input (e.g., `example.com`).
- By default, it enumerates subdomains using the HTTPS protocol.
- Accepts one target at a time.
- Can use a default wordlist or a custom file for subdomains.
- Default wordlist: `dev`, `test`, `staging`, `qa`, `admin`, `dashboard`, `api`, `auth`, `mail`, `ftp`, `vpn`, `status`

**`--http`**

- Works with `--enum-s` to enable subdomain enumeration over HTTP instead of HTTPS.

**`-X`**

- HTTP method to use (default: `GET`). Use `-X POST` for POST requests.

**`-H`**

- Adds custom HTTP headers. If provided, a custom `User-Agent` header will replace the default one.
- Accepts either a string or a file containing headers.
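A few illustrative combinations of the flags above (assumed syntax; the header value is a placeholder):

```bash
# Subdomain enumeration over HTTP instead of HTTPS
python3 sara.py -t example.com --enum-s --http

# Directory enumeration using POST requests
python3 sara.py -t https://example.com --enum-d -X POST

# Crawl with a custom header (a custom User-Agent would replace the default one)
python3 sara.py -t https://example.com -c -H "Authorization: Bearer <token>"
```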
**`-o`**

- Saves the output to a user-specified file.
- Output will also be printed to the terminal.
- JSON is the recommended format for easier post-processing.

**`-kw`**

- Adds custom keywords for analysis during crawling.
- Accepts either a string or a file containing keywords.

**`-wjs`**

- Disables JavaScript file analysis during crawling.

**`-wha`**

- Disables HTTP header analysis during crawling.

**`-h` (Help)**

- Displays the manual, including command examples for easier usage.
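Putting it together, a deeper and quieter run might combine several of the options above (a sketch; `results.json` is a hypothetical file name, and the comma-separated `-kw` string is an assumed format):

```bash
# Deep crawl (2 levels) through a headless browser, with custom keywords and JSON output
python3 sara.py -t https://example.com -c -d 2 -hl -kw "apikey,secret" -o results.json

# Interactive depth control: decide at each level whether to continue
python3 sara.py -t https://example.com -c --depth 99
```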
## Installation 🛠️

Follow these steps to set up and use SARA:
- Clone the repository:

  ```bash
  git clone https://github.com/Kode-n-Rolla/sara.git
  ```

- Navigate to the source directory:

  ```bash
  cd sara/src
  ```

- Make the file executable:

  ```bash
  chmod +x sara.py
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

  Alternatively, if you encounter an error like `× This environment is externally managed`, use:

  ```bash
  pip install -r requirements.txt --break-system-packages
  ```

  Or install inside a venv:

  ```bash
  python3 -m venv sara
  source sara/bin/activate
  pip install -r requirements.txt
  deactivate
  ```

- Run the tool. Execute the script with `python3` to see the available options:

  ```bash
  python3 sara.py -h
  ```

- Optional: create a system-wide command shortcut. To make it easier to run SARA, you can create a symbolic link (recommended: save the tool in `/opt` or any of your tools directories first):

  ```bash
  sudo ln -s "$(pwd)/sara.py" /usr/local/bin/sara
  ```

  Now you can run the tool from anywhere using:
  ```bash
  sara --help
  ```

## Versions 📦

| Version | Key Features | Release Notes |
|---|---|---|
| 1.1 | | |
| 1.0 | | Initial release with core functionality including crawling, enumeration, and customizable features for penetration testing and bug bounty tasks. |