investing-tickets-scraper

Name: investing-tickets-scraper
Version: 0.0.2
Summary: Scrapes stock tickers from "Investing.com" using Selenium and parses them using BeautifulSoup
Upload time: 2023-01-04 16:19:44
Author: Lucas Rocha
Keywords: python, tickers, index, stocks, exchange, investing
Requirements: No requirements were recorded.
# investing-tickets-scraper
#### This package scrapes all tickers available on the "investing.com" site

## How to install
Install "investing-tickets-scraper" from PyPI (recommended):
```bash
pip install investing-tickets-scraper
```

## How to use

```python
# Import the library
from investing_tickets_scraper.scraper import Scraper
import pandas as pd

# Create a scraper object from the imported class
scraper = Scraper()

# Configure the scraper
scraper.config(chromedriver_path=r"C:\Program Files (x86)\chromedriver.exe",  # path to the ChromeDriver binary used by Selenium; see https://youtu.be/Xjv1sY630Uc if you need to install it
               country="United States")  # the country whose tickers you want to scrape; list all available countries with print(scraper.contries_available())

# Start scraping
scraper.scrap()  # Opens Google Chrome and scrapes the site; avoid using the mouse and keyboard while it runs

# Return the data as a pandas DataFrame
df = scraper.return_dataframe()
print(df)
```
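Since `return_dataframe()` hands back an ordinary pandas DataFrame, you can post-process the result with standard pandas calls. A minimal sketch, using a hypothetical DataFrame in place of real scraper output (the actual column names depend on what the scraper returns for your country):

```python
import pandas as pd

# Hypothetical stand-in for the DataFrame returned by scraper.return_dataframe();
# the real column names depend on the scraper's output.
df = pd.DataFrame({
    "name": ["Apple Inc", "Microsoft Corp", "3M Company"],
    "ticker": ["AAPL", "MSFT", "MMM"],
})

# Filter rows and export the result, as with any DataFrame
tech = df[df["ticker"].isin(["AAPL", "MSFT"])]
tech.to_csv("tickers.csv", index=False)
print(tech["ticker"].tolist())  # → ['AAPL', 'MSFT']
```

Saving to CSV (or any other pandas-supported format) lets you scrape once and reuse the ticker list without reopening the browser.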



            

Raw data

            {
    "_id": null,
    "home_page": "",
    "name": "investing-tickets-scraper",
    "maintainer": "",
    "docs_url": null,
    "requires_python": "",
    "maintainer_email": "",
    "keywords": "python,tickers,index,stocks,exchange,investing",
    "author": "Lucas Rocha",
    "author_email": "lucasrocha.png@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/fb/2c/e09164a8ab69f2c70eefda51edcd436a2d0bbca8c570d724ac9e54bc3fd7/investing_tickets_scraper-0.0.2.tar.gz",
    "platform": null,
    "description": "# investing-tickets-scraper\n#### This package scraps all tickets available from \"investing.com\" site\n\n## How to install\nInstalling \"investing-tickets-scraper\" from pypi (recomended).\n```bash\npip install investing-tickets-scraper\n```\n\n## How to use\n\n```python\n# Import the library\nfrom investing_tickets_scraper.scraper import Scraper\nimport pandas as pd\n\n# Create the object scraper using the imported class\nscraper = Scraper()\n\n# Configurates the scraper\nscraper.config(chromedriver_path=\"C:\\Program Files (x86)\\chromedriver.exe\", # Chromedriver_path = chromedriver for Selenium, if you don't know what is it, check this video \"https://youtu.be/Xjv1sY630Uc\" and install it\n                country=\"United States\")  # Country = the country you want to scrap the tickeks. To check all countries available you can use \"print(scraper.contries_available())\"\n                                                                                                      \n# Start scraping\nscraper.scrap() # It will open the Google Chrome and scrap it. Is recommended not to use the mouse and the keboard\n\n# Return the data as a pandas dataframe\ndf = scraper.return_dataframe()\nprint(df) # df\n```\n\n\n",
    "bugtrack_url": null,
    "license": "",
    "summary": "Scraps stock tickets from \"Investing.com\" using Selenium and parse using BeautifulSoup",
    "version": "0.0.2",
    "split_keywords": [
        "python",
        "tickers",
        "index",
        "stocks",
        "exchange",
        "investing"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "a6d6aa65071f9f71ae247029ed8752145cb2a42f9cd2128a8aebeb5984964fbb",
                "md5": "a1d0f4321cf1fc817ddaede5eb5326e5",
                "sha256": "840d016d40be15788ffe1b2e4bd90e4b3c2fff11aec26d0fb4dc7ccaaaaad336"
            },
            "downloads": -1,
            "filename": "investing_tickets_scraper-0.0.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "a1d0f4321cf1fc817ddaede5eb5326e5",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 8172,
            "upload_time": "2023-01-04T16:19:42",
            "upload_time_iso_8601": "2023-01-04T16:19:42.549109Z",
            "url": "https://files.pythonhosted.org/packages/a6/d6/aa65071f9f71ae247029ed8752145cb2a42f9cd2128a8aebeb5984964fbb/investing_tickets_scraper-0.0.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "fb2ce09164a8ab69f2c70eefda51edcd436a2d0bbca8c570d724ac9e54bc3fd7",
                "md5": "4c889b01210ebd61ae7266dbb611a806",
                "sha256": "9872af91064ba15daff0ae6b6c05c65682d83869c661e3e2e593ce31b40286e1"
            },
            "downloads": -1,
            "filename": "investing_tickets_scraper-0.0.2.tar.gz",
            "has_sig": false,
            "md5_digest": "4c889b01210ebd61ae7266dbb611a806",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 7469,
            "upload_time": "2023-01-04T16:19:44",
            "upload_time_iso_8601": "2023-01-04T16:19:44.199056Z",
            "url": "https://files.pythonhosted.org/packages/fb/2c/e09164a8ab69f2c70eefda51edcd436a2d0bbca8c570d724ac9e54bc3fd7/investing_tickets_scraper-0.0.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-01-04 16:19:44",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "lcname": "investing-tickets-scraper"
}
        