<h1 align="center">Spyder-HTML</h1>
<p align="center">
<img src="http://img.shields.io/static/v1?label=STATUS&message=EM%20DESENVOLVIMENTO&color=GREEN&style=for-the-badge"/>
<img src="http://img.shields.io/static/v1?label=VERSION&message=2.1.4&color=blue&style=for-the-badge"/>
<img src="https://img.shields.io/github/license/accessmaker/Spyder-ml?style=for-the-badge"/>
</p>
A tool made to facilitate the analysis of HTML code.
<h2>INSTALL (git clone):</h2>
git clone https://github.com/accessmaker/Spyder-ml

cd Spyder-ml

python setup.py install
<h2>INSTALL (pip):</h2>
pip install spyder-ml
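
After installing by either method, the spyderml command should be available on your PATH; running it with the standard -h/--help flag prints the help text reproduced in the USAGE section below:

spyderml -h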
<h2>USAGE:</h2>
spyderml [-h] [-t TARGET] [--tr TR] [--update]
[--tags TAGS | --comments | --attribs ATTRIBS | --getjs | --techs | --geturls | --html | --jsr]
[-o OUTPUT] [-C COOKIE] [-A AGENT] [-hf HEADERSFILE] [-S]
[-w WORKERS] [--domain DOMAIN] [--cache] [--proxy PROXY]
[-D DATA [DATA ...]]
A tool made to facilitate the analysis of html code.
options:<br>
-h, --help show this help message and exit<br>
-t TARGET, --target TARGET<br>
                        Parameter that defines the target URL
                        (e.g. <http://example.com/index.html>)<br>
 --tr TR               Type of request (POST or GET, GET by default).<br>
--update Flag responsible for updating the database.<br>
 --tags TAGS           Flag that defines which tags the program will retrieve.<br>
 --comments            Flag that retrieves the HTML comments.<br>
--attribs ATTRIBS Flag that defines which attributes the application
will look for.<br>
 --getjs               Flag that retrieves all JS files from the page.<br>
 --techs               Flag that tries to discover the technologies used by the page.<br>
 --geturls             Flag that collects all of the target's URLs and tries to
                        access them.<br>
 --html                Flag that returns the page's full HTML code.<br>
 --jsr                 Makes a request that returns JSON.<br>
-o OUTPUT, --output OUTPUT
Flag that defines in which file the command output
will be saved.<br>
-C COOKIE, --cookie COOKIE
Cookie to send with the request<br>
-A AGENT, --agent AGENT
User-Agent to send with the request<br>
-hf HEADERSFILE, --headersfile HEADERSFILE
Parameter that passes an HTTP request header file to
be scanned.<br>
 -S, --spider          Flag that runs the spider (crawler).<br>
-w WORKERS, --workers WORKERS
Defines the number of workers.<br>
--domain DOMAIN Defines the domain of the web crawler.<br>
--cache Defines whether to create cache or not (default:
false).<br>
 --proxy PROXY         Defines the proxy to be used (pass tor or burpsuite to
                        use these two preset proxies).<br>
-D DATA [DATA ...], --data DATA [DATA ...]
Data to send with the request in format key1:value1
key2:value2 key3:value3...<br>
<h2>FUNCTIONALITY:</h2>
It searches the HTML document for specific items such as tags, attributes, comments, and JS files. Illustrative example invocations are shown below.
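
<h2>EXAMPLES:</h2>
The invocations below are illustrative sketches that simply combine the flags documented above; the targets, output filename, cookie value, tag name, and data values are placeholders, and the exact value formats accepted by flags such as --tags may differ:

spyderml -t http://example.com/index.html --comments -o comments.txt

spyderml -t http://example.com/index.html --tags form

spyderml -t http://example.com/login.php --tr POST -D user:admin pass:1234 -C "PHPSESSID=abc123"

spyderml -t http://example.com/ -S --domain example.com -w 4 --cache

spyderml -t http://example.com/ --techs --proxy tor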