# Scrappier
Scrappier is a web scraper that uses Chrome in headless mode. The library provides an easy-to-read syntax for navigating through page elements and performing actions on them.
## Requirements
* Python 3.8
* ChromeDriver installed at `/usr/bin/chromedriver`
## Installation
`pip install scrappier`
## Basic usage
```python
from scrappier import Browser
browser = Browser()
browser.visit("https://example.com")  # hypothetical URL: load the page to scrape

cards = browser.where_class("card").get()

for card in cards:
    span = card.where_tag_name("span").first()
    print(span.text())
```
## Available methods for Browser
### build()
### html()
### resize(width:int, height:int)
### screen(path:str)
### select(name:str, value:str)
### url()
### visit(url:str)
### wait(seconds:int)
### webdriver()
### where_inner_text(text)
### where_xpath(xpath:str)
### where_id(id:str)
### where_name(name:str)
### where_contain_text(name:str)
### where_class_name(name:str)
### where_tag_name(name:str)
### where_attribute()
### width()
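
The sketch below shows how a few of these methods combine in practice. It is a minimal, hypothetical example: the URL, selector, and screenshot path are placeholders, and it assumes the `where_*` methods return a finder that is resolved with `first()`/`get()`, as in the basic usage above.

```python
from scrappier import Browser

browser = Browser()
browser.resize(1280, 800)              # fixed viewport size
browser.visit("https://example.com")   # hypothetical URL

# query elements by tag name and resolve the finder with first()
title = browser.where_tag_name("h1").first()
print(title.text())

browser.screen("/tmp/example.png")     # save a screenshot of the current page
print(browser.url())                   # print the current URL
```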
## Available methods for ElementFinder
### until(seconds:int)
### get()
### first()
### where_xpath(xpath:str, driver, element=None)
### where_id(id:str, driver, element=None)
### where_inner_text(text:str, driver, element=None)
### where_contain_text(text, driver, element=None)
### where_class_name(name:str, driver, element=None)
### where_tag_name(name:str, driver, element=None)
### next_sibling(name:str, driver, element)
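
An ElementFinder is what the `where_*` calls return: `get()` resolves it to a list of elements, `first()` to a single element, and `until(seconds)` waits for a match. The sketch below is only an illustration under assumptions: the URL and class names are hypothetical, and it assumes `until()` returns the finder so it can be chained before `get()`.

```python
from scrappier import Browser

browser = Browser()
browser.visit("https://example.com")  # hypothetical URL

# where_* returns an ElementFinder; until() waits up to 10 seconds for a match
items = browser.where_class_name("item").until(10).get()   # list of elements
first_item = browser.where_class_name("item").first()      # single element

# finders can also be scoped to a previously found element
link = first_item.where_tag_name("a").first()
print(link.attribute("href"))
```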
## Available methods for Element
### attributes() -> list
### enter()
### type(text:str)
### text()
### html()
### attribute(name:str)
### click()
### children()
### next_sibling()
### value(value:str = None)
Returns the value of the element when called with no argument; if a string is given, sets the input's value instead.
### where_tag_name(name:str)
### where_attribute(attribute:str, value:str)
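
To illustrate the Element API, here is a minimal sketch of filling a form field and clicking a button. The page, field name, and id are hypothetical; only the method names come from the list above.

```python
from scrappier import Browser

browser = Browser()
browser.visit("https://example.com/login")  # hypothetical form page

# fill a text input and submit it from the keyboard
field = browser.where_name("username").first()
field.type("my-user")        # type text into the input
print(field.value())         # read the current value back
field.enter()                # press Enter

# click a button and inspect its markup
button = browser.where_id("submit").first()
button.click()
print(button.attributes())   # list of the element's attributes
print(button.html())         # raw HTML of the element
```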
## Collaborators
If you are a collaborator, please follow these steps:
1. Create your new functionality
2. Add a test for the new functionality
3. Bump the package version in `setup.py`
4. Build the distribution: `python3 setup.py sdist bdist_wheel`
5. Upload the package: `source venv/bin/activate && twine upload dist/*<your-version>*`
## For developers
* Install twine: `python3 -m pip install twine`
* Install setuptools: `pip3 install setuptools`
* Build the package: `python3 setup.py sdist bdist_wheel`
* Upload your changes: `twine upload dist/*`