Name | urify
Version | 0.2.0
Summary | Dissect and filter URLs provided on stdin.
upload_time | 2023-10-23 20:36:31
home_page |
maintainer |
docs_url | None
author | Harsh Verma
requires_python | >=3.10,<4.0
license |
keywords |
VCS |
bugtrack_url |
requirements | No requirements were recorded.
Travis-CI | No Travis.
coveralls test coverage | No coveralls.
# Urify 🫧
Streamline URLs for security testing - no more duplicates, just precision
---
### 🚀 Installation
1. Installation using pip

   ```sh
   pip install urify
   ```

2. Installation using Docker

   ```sh
   docker pull urify
   ```
---
### 🛠 Commands:
Oh, you thought this was just a basic tool? Think again!
#### 🔍 Basic URL Dissections:
- `keys`: Want to know what keys are in your query string? I got you!
- `values`: Retrieve values from the query string. Yes, like treasure hunting but more nerdy.
- `params`: Get those key=value pairs, one at a time, like peeling an onion.
- `path`: Retrieve the path. The scenic route of the URL.
- `apex`: Get the apex domain. It's like climbing a mountain, but lazier.
- `fqdn`: Retrieve the fully qualified domain name. Fancy words, right?
- `json`: Encode your dissected URL into JSON. Because... JSON!
#### 🎛 Advanced Filtering:
- `filter`: Refine your URLs using a myriad of component filters:
- `-scheme`: Filter by request schemes. Yes, we're getting technical here.
- `-sub`: Look for specific subdomains. It's like Where's Waldo but for domains.
- `-domain`: Specify domains. Because we all have favorites.
- `-tld`: Filter by top-level domains. It's all about the hierarchy.
- `-ext`: Filter by file extensions. The grand finale of URLs.
- `-port`: Specify ports. No, not the wine. The number.
- `-apex`: Get URLs by apex domain. It's like VIP access.
- `-fqdn`: Filter by fully qualified domain names. The whole shebang!
- `-inverse`: Want to be rebellious? Use filters as a deny-list.
- `-strict`: Validate all filters. No half measures here!
- `-dissect`: Once filtered, dissect the URL for a specific component. Pick and choose!
---
### 📖 Examples:
#### 1. Basic URL Dissections:
```
$ urify -help
Dissect and filter URLs provided on stdin.

Usage: urify [-help] [mode] ...

Options:
  -help   show this help message

Modes:
  keys    Retrieve keys from the query string, one per line.
  values  Retrieve values from the query string, one per line.
  params  Key=value pairs from the query string (one per line)
  path    Retrieve the path (e.g., /users/me).
  apex    Retrieve the apex domain (e.g., github.com).
  fqdn    Retrieve the fully qualified domain name (e.g., api.github.com).
  json    JSON encode the dissected URL object.
  filter  Refine URLs using component filters.
```
- **keys**:
```
$ cat urls.txt | urify keys
search
id
product-id
name
```
- **values**:
```
$ cat urls.txt | urify values
query
123
29
powder
```
- **params**:
```
$ cat urls.txt | urify params
search=query
id=123
product-id=29
name=powder
```
- **apex**:
```
$ cat urls.txt | urify apex
example.com
spaghetti.com
example.org
cartel.net
```
- **fqdn**:
```
$ cat urls.txt | urify fqdn
sub.example.com
mom.spaghetti.com
blog.example.org
shop.cartel.net
```
- **json**:
```
$ echo "http://bob:secret@sub.example.com:80/file%20name.pdf?page=10" | urify json
```
```
{
  "scheme": "http",
  "username": "bob",
  "password": "secret",
  "subdomain": "sub",
  "domain": "example",
  "tld": "com",
  "port": "80",
  "path": "/file%20name.pdf",
  "raw_query": "page=10",
  "query": [
    {
      "key": "page",
      "value": "10"
    }
  ],
  "fragment": "",
  "apex": "example.com",
  "fqdn": "sub.example.com"
}
```
#### 2. Advanced Filtering:
```
$ urify filter -help

Refine URLs using component filters.
By default, the command treats provided filters as an allow-list,
evaluating just one parameter from a specified field.

To assess all components, add the `-strict` flag.
For a deny-list approach, incorporate the `-inverse` flag.
After evaluation, the default result is the URL.
For specific URL dissected object, pair the `-dissect` option with any one of:
`keys` | `values` | `params` | `apex` | `fqdn` | `json`
If `-dissect` is not provided, `filter` command aims to print
dissimilar urls using pattern matching

Usage: urify filter [-help] [-scheme SCHEME [...]] [-sub SUB [...]] [-domain DOMAIN [...]] [-tld TLD [...]] [-ext EXT [...]]
                    [-port PORT [...]] [-apex APEX [...]] [-fqdn FQDN [...]] [-inverse] [-strict] [-dissect MODE]

Options:
  -help                 show this help message
  -scheme SCHEME [...]  The request schemes (e.g. http, https)
  -sub SUB [...]        The subdomains (e.g. abc, abc.xyz)
  -domain DOMAIN [...]  The domains (e.g. github, youtube)
  -tld TLD [...]        The top level domains (e.g. in, com)
  -ext EXT [...]        The file extensions (e.g. pdf, html)
  -port PORT [...]      The ports (e.g. 22, 8080)
  -apex APEX [...]      The apex domains (e.g. github.com, youtube.com)
  -fqdn FQDN [...]      The fully qualified domain names (e.g. api.github.com, app.example.com)
  -inverse              Process filters as deny-list
  -strict               Validate all filter checks
  -dissect MODE         Dissect url and retrieve mode after filtering

Modes:
  keys|values|params|path|apex|fqdn|json
```
- **filter dissimilar urls**
```
$ cat similar_urls.txt | urify filter
http://blog.example.net/post/12
http://info.example.net/tag/14?id=133
```
- **filter by scheme**:
```
$ cat urls.txt | urify filter -scheme https
https://sub.example.com/literally%20me.jpg#top
```
- **filter by domain**:
```
$ cat urls.txt | urify filter -domain example
https://sub.example.com/literally%20me.jpg#top
http://blog.example.org:8443/new%20blog?id=123
```
- **filter and dissect**:
```
$ cat urls.txt | urify filter -ext pdf jpg -dissect apex
example.com
spaghetti.com
```
- **filters as deny-list**:
```
$ cat urls.txt | urify filter -ext html -inverse
https://sub.example.com/literally%20me.jpg#top
http://mom.spaghetti.com:8080/cookbook.pdf?search=query
http://blog.example.org:8443/new%20blog?id=123
```
- **validate all filters**:
```
$ cat urls.txt | urify filter -tld com -port 8080 -strict
http://mom.spaghetti.com:8080/cookbook.pdf?search=query
```
---
### 📝 Notes:
- By default, the `filter` command treats the provided filters as an allow-list. But if you're feeling a little naughty, use the `-inverse` flag for a deny-list approach.
- If you're the meticulous type and want to validate every single filter parameter, add the `-strict` flag to your command. No stone unturned!
- After all the filtering shenanigans, if you just want the URL, we'll give it to you. But if you're in the mood for something special, pair the `-dissect` option with any one of the listed choices.
---
## 🧠 Custom CLI Parser (Skip this part):
Ever felt that built-in command-line parsers are just... meh? I did too. That's why urify is powered by its very own custom CLI parser. Why settle for the ordinary when you can have the extraordinary?
### 🧐 Features:
- **Command-centric**: Create commands as easily as adding a decorator with `@cli.command(...)`. Seriously, it's like sprinkling magic dust on your functions.
- **Options Galore**: Add options to your commands using the `@cli.option(...)` decorator. Define data types, set multiple values, and even specify valid choices. It's like an all-you-can-eat buffet, but for command-line arguments.
### 📖 Behind the Scenes:
#### Command Class:
The `Command` class encapsulates each command. It's like a VIP pass for your functions:
- `name`: The unique identifier for your command.
- `callback`: The function to be invoked when the command is called.
- `parser`: The argument parser associated with this command.
#### CLI Class:
The heart and soul of the custom parser. The `Cli` class:
- **Initialization**: Set up your CLI with descriptions, help flags, and more. Feel like a movie director but for commands.
- **Command Registration**: Register functions as commands with the `@cli.command(...)` decorator. It's like enlisting soldiers for battle.
- **Option Handling**: Define options with the `@cli.option(...)` decorator. Like adding extra toppings to your pizza.
- **Execution**: Finally, the `cli.run()` method brings everything together by parsing the arguments and invoking the right command. Lights, camera, action!
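To make that flow concrete, here is a minimal sketch of a decorator-based CLI in the same spirit, built on plain `argparse`. The `Cli` and `Command` names come from the description above, but every signature, parameter, and the example `fqdn` command below are assumptions for illustration, not urify's actual implementation:

```python
import argparse


class Command:
    """A registered command: its name, callback, and argument parser."""
    def __init__(self, name, callback, parser):
        self.name = name
        self.callback = callback
        self.parser = parser


class Cli:
    """A tiny decorator-driven CLI sketched on top of argparse."""
    def __init__(self, description=""):
        self.parser = argparse.ArgumentParser(description=description)
        self.subparsers = self.parser.add_subparsers(dest="mode")
        self.commands = {}
        self._pending = []  # options gathered before @command runs

    def command(self, name, help=""):
        def decorator(func):
            sub = self.subparsers.add_parser(name, help=help)
            # Inner @cli.option decorators run first (bottom-up),
            # so their arguments are already queued in self._pending.
            for args, kwargs in self._pending:
                sub.add_argument(*args, **kwargs)
            self._pending = []
            self.commands[name] = Command(name, func, sub)
            return func
        return decorator

    def option(self, *args, **kwargs):
        def decorator(func):
            self._pending.append((args, kwargs))
            return func
        return decorator

    def run(self, argv=None):
        ns = self.parser.parse_args(argv)
        cmd = self.commands.get(ns.mode)
        if cmd is None:
            self.parser.print_help()
            return None
        # Pass every parsed option to the callback as a keyword argument.
        opts = {k: v for k, v in vars(ns).items() if k != "mode"}
        return cmd.callback(**opts)


cli = Cli(description="demo")

@cli.command("fqdn", help="Join a subdomain and apex into an FQDN")
@cli.option("-sub", default="www")
@cli.option("-apex", default="example.com")
def fqdn(sub, apex):
    return f"{sub}.{apex}"
```

With this sketch, `cli.run(["fqdn", "-sub", "api", "-apex", "github.com"])` returns `"api.github.com"`. Note that `argparse` accepts single-dash long options like `-sub` directly, which matches the flag style urify exposes.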
### 🤓 Technical Deep Dive:
The magic happens with the use of the `_ArgumentParser` class (a modified version of `argparse.ArgumentParser`). This allows for a more intuitive and concise definition of command-line interfaces. The beauty of this parser is that it draws inspiration from renowned libraries like `click` and `typer` while maintaining the unique aspects of `argparse`.
Overall, this custom CLI parser provides a seamless and powerful way to interact with command-line tools without the usual hassles.
---
*Disclaimer: No URLs were harmed in the making of this tool.*
---
Happy dissecting! 🎈
Raw data
{
"_id": null,
"home_page": "",
"name": "urify",
"maintainer": "",
"docs_url": null,
"requires_python": ">=3.10,<4.0",
"maintainer_email": "",
"keywords": "",
"author": "Harsh Verma",
"author_email": "synacktra.work@gmail.com",
"download_url": "https://files.pythonhosted.org/packages/39/f6/a0a589edc5c0da2edc3f7feb3d1a7b53a3f02536af7134cd332ea5be3b6a/urify-0.2.0.tar.gz",
"platform": null,
"bugtrack_url": null,
"license": "",
"summary": "Dissect and filter URLs provided on stdin.",
"version": "0.2.0",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "7493ffcc5a36046931979640e765f628e7f8ac245585bdb68228d2f0c543a045",
"md5": "6a6f6ea478b499158ae66ea882ca66dc",
"sha256": "d8969126610ab7aa836be34e0f51370baee2a143ffd6c4b052f28e4f62996286"
},
"downloads": -1,
"filename": "urify-0.2.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "6a6f6ea478b499158ae66ea882ca66dc",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.10,<4.0",
"size": 14150,
"upload_time": "2023-10-23T20:36:29",
"upload_time_iso_8601": "2023-10-23T20:36:29.844700Z",
"url": "https://files.pythonhosted.org/packages/74/93/ffcc5a36046931979640e765f628e7f8ac245585bdb68228d2f0c543a045/urify-0.2.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "39f6a0a589edc5c0da2edc3f7feb3d1a7b53a3f02536af7134cd332ea5be3b6a",
"md5": "77a6be68e0a5d29c7bb9c9afbff26c8b",
"sha256": "9de6b698814e12bbe9c0191674f3455c83c052f4c69608cc32357396cd19ec26"
},
"downloads": -1,
"filename": "urify-0.2.0.tar.gz",
"has_sig": false,
"md5_digest": "77a6be68e0a5d29c7bb9c9afbff26c8b",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.10,<4.0",
"size": 14966,
"upload_time": "2023-10-23T20:36:31",
"upload_time_iso_8601": "2023-10-23T20:36:31.291325Z",
"url": "https://files.pythonhosted.org/packages/39/f6/a0a589edc5c0da2edc3f7feb3d1a7b53a3f02536af7134cd332ea5be3b6a/urify-0.2.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2023-10-23 20:36:31",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "urify"
}