baskref

Name: baskref
Version: 1.0.0
Summary: baskRef is a tool to scrape basketball Data from the web.
Upload time: 2024-09-22 15:49:41
Requires Python: >=3.9
License: MIT License (Copyright (c) 2022 Dominik Zulovec Sajovic)
Keywords: basketball, web scraper, python
# BaskRef (Basketball Scraper)
BaskRef is a tool to scrape basketball data from the web.

The goal of this project is to provide a data collection utility for
NBA basketball data. The collection strategy is to scrape data from
https://www.basketball-reference.com.
The data can then be saved to a CSV file to be used by a different utility.

## About the Package

### What data are we collecting?

- games & game stats (in-depth stats of the games)
- player game stats

All datasets can be collected:
- by day (all games in one day)
- by whole season (regular + playoffs)
- by playoffs

#### Future Collections (not yet implemented)
- player metadata
- game logs


## How to Install & Run the Package?

### Install the project
```bash
pip install baskref

# optionally set the logging level (the default is INFO)
export LOG_LEVEL=DEBUG  # INFO, DEBUG, ERROR
```

### Scrape Game Data

Scrape all games for the 7th of January 2022.
```bash
baskref -t g -d 2022-01-07 -fp datasets
# python -c "from baskref import run_baskref; run_baskref()" -t g -d 2022-01-07 -fp datasets
```
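
The commented `python -c` line is the equivalent call when the console script is not installed. The same invocation can also be driven from a Python script; the sketch below assumes that `run_baskref` reads its options from `sys.argv` (as the `python -c` form implies) and that `LOG_LEVEL` is read at startup.
```python
import os
import sys

# Assumption: LOG_LEVEL is read when baskref starts, so set it before importing.
os.environ["LOG_LEVEL"] = "DEBUG"

# Assumption: run_baskref parses sys.argv, just like the python -c invocation above.
sys.argv = ["baskref", "-t", "g", "-d", "2022-01-07", "-fp", "datasets"]

from baskref import run_baskref

run_baskref()
```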

Scrape all games for the 2006 NBA season (regular season + playoffs).
```bash
baskref -t gs -y 2006 -fp datasets
# python -c "from baskref import run_baskref; run_baskref()" -t gs -y 2006 -fp datasets
```

Scrape all games for the 2006 NBA playoffs.
```bash
baskref -t gp -y 2006 -fp datasets
# if you don't install the package
# python -c "from baskref import run_baskref; run_baskref()" -t gp -y 2006 -fp datasets
```

### Scrape Game URLs only

```bash
# simply add "u" to any of the three scraping types:
# g -> gu, gs -> gsu, gp -> gpu
baskref -t gu -d 2022-01-07 -fp datasets
```

### Scrape Player Stats Data

```bash
# simply add "pl" to any of the three scraping types:
# g -> gpl, gs -> gspl, gp -> gppl
baskref -t gpl -d 2022-01-07 -fp datasets
```

### Scrape Using a Proxy
Pass a proxy URL with the `-p` option.
```bash
baskref -t g -d 2022-01-07 -fp datasets -p http://someproxy.com
```


## How to Use the Package?

Install requirements
```bash
pip install -r requirements.txt
```

### Data Collection Utility
This refers to the scraping functionality.

For any mode of collection, first import and initialize
the classes below.
```python
from baskref.data_collection import (
    BaskRefUrlScraper,
    BaskRefDataScraper,
)

url_scraper = BaskRefUrlScraper()
data_scraper = BaskRefDataScraper()

# optionally you can set a proxy
proxy_url_scraper = BaskRefUrlScraper("http://someproxy.com")
proxy_data_scraper = BaskRefDataScraper("http://someproxy.com")
```
`BaskRefDataScraper.get_games_data` returns a list of dictionaries.

Collect games for a specific day
```python
from datetime import date

game_urls = url_scraper.get_game_urls_day(date(2022,1,7))
game_data = data_scraper.get_games_data(game_urls)
```
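
Because `get_games_data` returns a plain list of dictionaries, a quick sanity check needs only standard Python (the field names depend on the scraper, so none are assumed here):
```python
# How many games were scraped, and which fields the first record exposes.
print(len(game_data))
if game_data:
    print(sorted(game_data[0].keys()))
```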

Collect games for a specific season (regular + playoffs)
```python
game_urls = url_scraper.get_game_urls_year(2006)
game_data = data_scraper.get_games_data(game_urls)
```

Collect games for a specific postseason
```python
game_urls = url_scraper.get_game_urls_playoffs(2006)
game_data = data_scraper.get_games_data(game_urls)
```

Collect player stats for a specific day
```python
from datetime import date

game_urls = url_scraper.get_game_urls_day(date(2022,1,7))
pl_stats_data = data_scraper.get_player_stats_data(game_urls)
```

### Data Saving Package
This refers to saving the data to a file.

Save a list of dictionaries to a CSV file.
```python
import os
from baskref.data_saving.file_saver import save_file_from_list

save_path = os.path.join('datasets', 'file_name.csv')
save_file_from_list(game_data, save_path)
```
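
The resulting file is a plain CSV, so it can be inspected with the standard library alone; a minimal sketch reusing `save_path` from the snippet above:
```python
import csv

# Read the saved file back and report how many rows were written.
with open(save_path, newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

print(f"{len(rows)} rows saved to {save_path}")
```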

## How to Run Tests?

Run all tests with pytest
```
pytest
```

Run all tests with coverage
```
coverage run --source=baskref -m pytest
coverage report --omit="*/test*" -m --skip-empty
```

## Code Formatting

The code base uses black for automatic formatting.
The configuration for black is stored in the pyproject.toml file.

```bash
# run black over the entire code base
black .
```

## Linting

The code base uses pylint and mypy for code linting.

### Pylint

The configuration for pylint is stored in the .pylintrc file.

```bash 
# run pylint over the entire code base
pylint --recursive=y ./
```

### MyPy

The configuration for mypy is stored in the pyproject.toml file.

```bash 
# run mypy over the entire code base
mypy baskref
```

## Bonus

### Prepare project for development

1. Create a Virtual Environment

- You might want to use a virtual environment for executing the project.
- This is an optional step (if skipping, go straight to step 2).

Create a new virtual environment
```
python -m venv venv  # The second parameter is a path to the virtual env.
```

Activate the new virtual environment
```
# Windows
.\venv\Scripts\activate

# Unix
source venv/bin/activate
```

Leave the virtual environment
```
deactivate
```

2. Install all the dev requirements

```
pip install -r requirements_dev.txt

# uninstall all packages (Windows)
pip freeze > unins && pip uninstall -y -r unins && del unins

# uninstall all packages (Linux)
pip freeze | xargs pip uninstall -y
```

3. Install the pre-commit hook
```
pre-commit install
```

### Prepare a new Version
This section describes the steps involved in preparing a new baskref version.

- empty the dist folder
```
rm -rf dist/*
```

- adjust the pyproject.toml file
    - version
    - dependencies

- install the project locally and test it
```
python -m build
pip install .
```

- install twine
```
pip install --upgrade twine
```

- publish the project to test.pypi (optional)
```
twine upload --repository testpypi dist/*
# install from test.pypi
pip install --index-url https://test.pypi.org/simple/ baskref
```

- publish a new version
```
twine upload dist/*
```


## Contributors

1. [Dominik Zulovec Sajovic](https://www.linkedin.com/in/dominik-zulovec-sajovic/)

            
