<p align="center">
<img src="https://github.com/PushpenderIndia/ragno/blob/master/img/ragno-logo.png" alt="Ragno Logo" />
</p>
<p align="center">
<a href="https://python.org">
<img src="https://img.shields.io/badge/Python-3.9-green.svg">
</a>
<a href="https://github.com/PushpenderIndia/ragno/blob/master/LICENSE">
<img src="https://img.shields.io/badge/License-MIT-lightgrey.svg">
</a>
<a href="https://github.com/PushpenderIndia/ragno/releases">
<img src="https://img.shields.io/badge/Release-1.6-blue.svg">
</a>
<a href="https://github.com/PushpenderIndia/ragno">
<img src="https://img.shields.io/badge/Open%20Source-%E2%9D%A4-brightgreen.svg">
</a>
</p>
**Ragno** is a **Passive URL Crawler** | Written in **Python3** | Fetches URLs from the **Wayback Machine**, **AlienVault's Open Threat Exchange** & **Common Crawl**
## Disclaimer
<p align="center">
  :computer: This project was created for legitimate purposes and personal use only.
</p>
THIS SOFTWARE IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND. USE IT AT YOUR OWN RISK. USE IS THE SOLE RESPONSIBILITY OF THE END USER. THE DEVELOPERS ASSUME NO LIABILITY AND ARE NOT RESPONSIBLE FOR ANY MISUSE OR DAMAGE CAUSED BY THIS PROGRAM.
## Features
- [x] Works on Windows/Linux/macOS
- [x] Passive crawler (does not interact with the target directly; see the sketch after this list)
- [x] Crawls URLs from 3 sources:
| Crawl URLs from |
| --------------- |
| Wayback Machine |
| Common Crawl |
| AlienVault's OTX (Open Threat Exchange) |
- [x] DeepCrawl feature (if enabled, Ragno tries to fetch URLs from all **74+ CommonCrawl APIs**)
- [x] Multithreading (used only when the **DeepCrawl** feature is enabled)
- [x] Subdomain results can be included or excluded via a command-line argument (i.e. **-s**)
- [x] Saves results to a TXT file
- [x] Quiet Mode
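
As a minimal illustration of what "passive" means here, each of these sources exposes a public API that can be queried directly, so the target server itself is never contacted. The one-liners below (using `example.com` as a stand-in target) are illustrative sketches, not Ragno's internal code:

```bash
# Wayback Machine CDX API: archived URLs for example.com, deduplicated by URL key.
# Only archive.org is contacted, never the target.
curl -s "https://web.archive.org/cdx/search/cdx?url=example.com/*&output=text&fl=original&collapse=urlkey"

# AlienVault OTX: URLs observed for the domain, one page at a time.
curl -s "https://otx.alienvault.com/api/v1/indicators/domain/example.com/url_list?limit=100&page=1"
```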
## How To Use in Linux
```bash
# Installing using pip
$ pip3 install Ragno
# Checking Help Menu
$ ragno --help
# Run Normal (Fast) Crawl
$ ragno -d target.com
# Run Normal (Fast) Crawl + Saving Result
$ ragno -d target.com -o result.txt
# Run Normal (Fast) Crawl + Saving Result + Quiet Mode (Without Showing URLs on screen)
$ ragno -d target.com -o result.txt -q
# Run Deep Crawl + Saving Result + Quiet Mode (Without Showing URLs on screen)
$ ragno -d target.com -o result.txt -q --deepcrawl
```
## How To Use in Windows
```bash
# First, install the latest Python 3.x from the official site:
# https://www.python.org/downloads/
# Installing ragno using pip
$ pip install Ragno
# Checking Help Menu
$ ragno --help
# Run Normal (Fast) Crawl
$ ragno -d target.com
# Run Normal (Fast) Crawl + Saving Result
$ ragno -d target.com -o result.txt
# Run Normal (Fast) Crawl + Saving Result + Quiet Mode (Without Showing URLs on screen)
$ ragno -d target.com -o result.txt -q
# Run Deep Crawl + Saving Result + Quiet Mode (Without Showing URLs on screen)
$ ragno -d target.com -o result.txt -q --deepcrawl
```
## Available Arguments
* Optional Arguments
| Short Hand | Full Hand | Description |
| ---------- | --------- | ----------- |
| -h | --help | show this help message and exit |
| -o OUTPUT | --output OUTPUT | Save results in a TXT file |
| -s | --subs | Include results from subdomains |
| -q | --quiet | Run the scan without printing URLs on screen |
| | --deepcrawl | Use all available CommonCrawl APIs for crawling URLs [takes time] |
| -t THREAD | --thread THREAD | Number of threads to use. Default=50 [used when --deepcrawl is enabled; see the sketch below] |
* Required Arguments
| Short Hand | Full Hand | Description |
| ---------- | --------- | ----------- |
| -d DOMAIN | --domain DOMAIN | Target domain name, e.g. google.com |
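
To illustrate what `--deepcrawl` with `-t` threads does conceptually, the sketch below (assuming `curl` and `jq` are installed; this is not Ragno's actual code) lists every CommonCrawl index from the public `collinfo.json` endpoint and queries them in parallel, mirroring the default of 50 threads:

```bash
# Enumerate all CommonCrawl index APIs, query each in parallel (50 jobs),
# extract the URL field from the NDJSON responses, and deduplicate.
curl -s "https://index.commoncrawl.org/collinfo.json" | jq -r '.[]."cdx-api"' \
  | xargs -P 50 -I{} curl -s "{}?url=example.com/*&output=json" \
  | jq -r '.url' 2>/dev/null | sort -u
```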
## Use Cases
> After finding URLs, you can filter them according to the attack you are hunting for and mass-hunt particular vulnerabilities such as XSS, LFI, Open Redirect, SSRF, etc.
### Example 1: One Liner for Hunting Open Redirect
- Install qsreplace:
```
sudo wget https://github.com/tomnomnom/qsreplace/releases/download/v0.0.3/qsreplace-linux-amd64-0.0.3.tgz \
  && sudo tar -xvzf qsreplace-linux-amd64-0.0.3.tgz \
  && sudo rm qsreplace-linux-amd64-0.0.3.tgz \
  && sudo mv qsreplace /usr/bin/ \
  && sudo chmod +x /usr/bin/qsreplace
```
- Run One Liner
```
ragno -d testphp.vulnweb.com -q -o ragno_urls.txt && cat ragno_urls.txt | grep -a -i "=http" | qsreplace "http://evil.com" | while read target_url; do curl -s -L -I "$target_url" | grep -q "evil.com" && echo "[+] [Vulnerable] $target_url"; done
```
* You can use the gf tool by tomnomnom to filter URLs with juicy parameters, and then test them further, for example:
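
Assuming gf and its example pattern files (shipped in the gf repo and installed under `~/.gf`) are set up, filtering Ragno's output might look like this:

```bash
# Keep only URLs whose query parameters match known-interesting patterns
cat ragno_urls.txt | gf redirect   # likely open-redirect parameters
cat ragno_urls.txt | gf xss        # parameters commonly reflected in responses
```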
## Contribute
* All contributors are welcome; this repo needs contributors who will keep improving this tool.