earningspy 0.1.13

- **Summary**: Python toolkit for PEAD research and earnings calendar analysis.
- **Homepage**: https://github.com/c4road/earningspy
- **Author**: Alberto Rincones <alberto.rincones@code4road.com>
- **License**: MIT
- **Requires Python**: >=3.9, <3.12
- **Keywords**: earnings, finance, ai, scraper, pead, quant
- **Uploaded**: 2025-09-14 02:12:14
- **Requirements**: aiohappyeyeballs, aiohttp, aiosignal, async-timeout, attrs, beautifulsoup4, certifi, charset-normalizer, contourpy, cssselect, frozenlist, idna, kiwisolver, lxml, multidict, numpy, pandas, propcache, python-dateutil, pytz, requests, six, soupsieve, tenacity, tqdm, typing-extensions, tzdata, urllib3, user-agent, yarl
            
| [![Run Tests](https://github.com/c4road/earningspy/actions/workflows/run-tests.yml/badge.svg)](https://github.com/c4road/earningspy/actions/workflows/run-tests.yml) | [![Bump Version on Merge](https://github.com/c4road/earningspy/actions/workflows/bump-version.yml/badge.svg)](https://github.com/c4road/earningspy/actions/workflows/bump-version.yml) | [![Publish Wheel to PyPi](https://github.com/c4road/earningspy/actions/workflows/publish-wheel.yml/badge.svg)](https://github.com/c4road/earningspy/actions/workflows/publish-wheel.yml) |
|----------|----------|----------|


# EarningsPy 📈

EarningsPy is the elegant Python alternative for studying Post Earnings Announcement Drift (PEAD) in financial markets. Designed for quant researchers, data scientists, and finance professionals, this package provides robust tools to analyze earnings calendars, automate data collection, and perform advanced event studies with ease.

## Features

- 🗓️ **Earnings Calendar Access**: Effortlessly retrieve earnings dates by sector, industry, index, or market capitalization.
- 🚀 **PEAD Analysis**: Built-in utilities to compute post-earnings drift and related statistics.
- 🏦 **Data Integration**: Seamless integration with Finviz for comprehensive earnings data and 20-minute-delayed market data.
- 🔍 **Flexible Filtering**: Filter earnings events by week, month, or custom criteria.
- 🛠️ **Quant-Friendly API**: Pandas-based workflows for easy integration into quant research pipelines.
- 📊 **Excel-Ready Data**: Generate profiled, ready-to-use datasets for calculations and modeling directly in Excel.


## Installation

```bash
pip install earningspy
```

## Usage (WIP)

### Fetch next week's earnings
```python
from earningspy.calendars.earnings import EarningSpy
EarningSpy.get_next_week_earnings()
```
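Calendars plug into ordinary pandas workflows, so the usual filtering applies. A minimal sketch on a toy frame (the `Ticker` and `Sector` column names here are illustrative, not guaranteed by the API):

```python
import pandas as pd

# Toy stand-in for a fetched earnings calendar
calendar = pd.DataFrame({
    "Ticker": ["AAPL", "XOM", "MSFT"],
    "Sector": ["Technology", "Energy", "Technology"],
})

# Standard pandas filtering, e.g. keep only one sector
tech = calendar[calendar["Sector"] == "Technology"]
print(tech["Ticker"].tolist())  # ['AAPL', 'MSFT']
```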

### Fetch earnings by ticker
```python
from earningspy.calendars.earnings import EarningSpy
EarningSpy.get_by_tickers(['AAPL', 'MSFT', 'GOOGL'])
```

### Inspect PEAD anomaly with after-earnings data

This workflow is useful for building time series, dashboards, and graphs to study historical behavior. It is not suitable for training models, because it introduces look-ahead bias: the data fetched this way contains post-earnings information, i.e. the most recently reported fundamentals.

#### First time (create your local calendar file)
```python
from earningspy.calendars.earnings import EarningSpy
from earningspy.inspectors.pead import PEADInspector
import nest_asyncio  # Only needed when running inside a notebook
nest_asyncio.apply()

# Get after-earnings data
previous_week = EarningSpy.get_this_week_earnings()
inspector = PEADInspector(
    calendar=previous_week
)

# Inspect stocks three days after their earnings date
# dry_run=True lists the stocks to be processed without inspecting them
inspector.inspect(days=3, dry_run=True, post_earnings=True)

# dry_run=False fetches time series data and calculates anomaly metrics (CAPM, CAR, BHAR, VIX, etc.)
inspector.inspect(days=3, dry_run=False, post_earnings=True)

# Check the new columns that were created
inspector.calendar

# Store it in a safe place
inspector.calendar.to_csv('post_earnings.csv')
```
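For orientation, the CAR and BHAR metrics the inspector computes have standard textbook definitions. The sketch below applies them to synthetic daily returns; it is not the package's internal implementation:

```python
import numpy as np

# Synthetic daily returns for a stock and a benchmark over a 3-day event window
stock = np.array([0.02, -0.01, 0.015])
bench = np.array([0.005, 0.0, 0.01])

# CAR: sum of abnormal (stock minus benchmark) returns over the window
car = np.sum(stock - bench)

# BHAR: buy-and-hold stock return minus buy-and-hold benchmark return
bhar = np.prod(1 + stock) - np.prod(1 + bench)

print(round(car, 4))   # 0.01
print(round(bhar, 4))  # 0.0099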

#### Second time (load local calendar and append new information) 

```python
import pandas as pd

from earningspy.calendars.earnings import EarningSpy
from earningspy.inspectors.pead import PEADInspector
import nest_asyncio  # Only needed when running inside a notebook
nest_asyncio.apply()

# Get after-earnings data
# Use .get_this_week_earnings() if the week has not ended
# Use .get_previous_week_earnings() if the week has already passed
previous_week = EarningSpy.get_this_week_earnings()
inspector = PEADInspector(
    calendar=previous_week
)

# Load the local storage created the first time
storage = pd.read_csv('post_earnings.csv', index_col=0, parse_dates=True)

# Join the new calendar with the local storage
merged = inspector.join(storage, type_='post')

# Preview the rows that would be processed for a given day window, without processing them
inspector.inspect(days=3, dry_run=True, post_earnings=True)

# Inspect calls can be chained; this calculates metrics for the 60-, 30-, and 3-day event windows
inspector = inspector.inspect(days=60, post_earnings=True) \
                     .inspect(days=30, post_earnings=True) \
                     .inspect(days=3, post_earnings=True)

# Write the updated calendar back to local storage
inspector.calendar.to_csv('post_earnings.csv')
```

You can then use the local storage with your preferred BI tool (Power BI, Tableau, Excel).
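The join step above effectively appends new calendar rows to the ones already on disk while skipping events that are already stored. A rough pandas equivalent on toy data (the date index and `Ticker` column are illustrative):

```python
import pandas as pd

# Events already on disk, indexed by earnings date
stored = pd.DataFrame(
    {"Ticker": ["AAPL", "MSFT"]},
    index=pd.to_datetime(["2025-01-02", "2025-01-03"]),
)
# Freshly fetched events; one overlaps with storage
new = pd.DataFrame(
    {"Ticker": ["MSFT", "GOOGL"]},
    index=pd.to_datetime(["2025-01-03", "2025-01-06"]),
)

# Append, then keep the first copy of any duplicate (date, ticker) pair
merged = pd.concat([stored, new])
merged = merged[~merged.reset_index().duplicated(["index", "Ticker"]).values]
print(merged["Ticker"].tolist())  # ['AAPL', 'MSFT', 'GOOGL']
```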

### Inspect PEAD anomaly with before-earnings data (WIP)

This workflow is useful for training models and performing statistical regressions. It can also be used to build dashboards and graphs, but bear in mind that the data fetched this way contains pre-earnings information: each date points to fundamentals reported 1 to 5 days before that earnings event. For example, if you plot a time series of EPS, each point shows the EPS prior to the one reported on that date.
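The reason pre-earnings data is safe for training is that each row only sees information available strictly before the event. One way to express that alignment in plain pandas (independent of this package's API) is `merge_asof` with exact matches disallowed:

```python
import pandas as pd

# Earnings events to predict (illustrative column names)
events = pd.DataFrame({
    "date": pd.to_datetime(["2025-02-10", "2025-05-12"]),
    "ticker": ["AAPL", "AAPL"],
})
# Fundamentals with their publication dates
fundamentals = pd.DataFrame({
    "published": pd.to_datetime(["2025-01-30", "2025-04-30"]),
    "ticker": ["AAPL", "AAPL"],
    "eps": [2.40, 1.65],
})

# For each event, take the last EPS published strictly before the event date
aligned = pd.merge_asof(
    events.sort_values("date"),
    fundamentals.sort_values("published"),
    left_on="date",
    right_on="published",
    by="ticker",
    allow_exact_matches=False,  # strictly before the event: no look-ahead
)
print(aligned["eps"].tolist())  # [2.4, 1.65]
```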

#### First time (create your local calendar file)

```python
from earningspy.calendars.earnings import EarningSpy
from earningspy.inspectors.pead import PEADInspector
import nest_asyncio  # Only needed when running inside a notebook
nest_asyncio.apply()

# Get before-earnings data
# Use .get_next_week_earnings() because you want information from before the earnings
next_week = EarningSpy.get_next_week_earnings()
inspector = PEADInspector(
    calendar=next_week
)

# Nothing to inspect yet because the earnings have not happened;
# just store the calendar for later
inspector.calendar.to_csv('pre_earnings.csv')
```

#### Second time (load local calendar and append new information) 

```python
import pandas as pd

from earningspy.calendars.earnings import EarningSpy
from earningspy.inspectors.pead import PEADInspector
import nest_asyncio  # Only needed when running inside a notebook
nest_asyncio.apply()

# Get before-earnings data
next_week = EarningSpy.get_next_week_earnings()
inspector = PEADInspector(
    calendar=next_week
)

# Load the local storage created the first time
storage = pd.read_csv('pre_earnings.csv', index_col=0, parse_dates=True)

# Join the new calendar with the local storage
merged = inspector.join(storage, type_='pre')

# Preview the rows that would be processed for a given day window, without processing them
inspector.inspect(days=3, dry_run=True)
inspector.inspect(days=30, dry_run=True)
inspector.inspect(days=60, dry_run=True)

# Inspect calls can be chained; this calculates metrics for the 60-, 30-, and 3-day event windows
inspector = inspector.inspect(days=60) \
                     .inspect(days=30) \
                     .inspect(days=3)

# Write the updated calendar back to local storage
inspector.calendar.to_csv('pre_earnings.csv')
```

You can then use the local storage with your preferred BI tool (Power BI, Tableau, Excel).
At this point the calendar should already contain anomaly metrics.
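Before handing the stored calendar to a BI tool, it can help to reshape it into one column per ticker. A toy sketch with illustrative column names (the real calendar's columns may differ):

```python
import pandas as pd

# Toy stand-in for a stored calendar with a per-event anomaly metric
stored = pd.DataFrame({
    "date": pd.to_datetime(["2025-01-02", "2025-01-02", "2025-04-03"]),
    "ticker": ["AAPL", "MSFT", "AAPL"],
    "car_3d": [0.012, -0.004, 0.02],
})

# One row per date, one column per ticker: convenient for plotting drift
wide = stored.pivot(index="date", columns="ticker", values="car_3d")
print(wide.shape)  # (2, 2)
```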

            
