pfeed

Name: pfeed
Version: 0.0.1.dev12
Summary: Data pipeline for algo-trading, getting and storing both real-time and historical data made easy.
Home page: https://pfund.ai
Author: Stephen Yau
License: Apache-2.0
Requires Python: <3.13,>=3.10
Keywords: trading, algo-trading, data pipeline, ETL, data lake, data warehouse, data integration, historical data, live data, data streaming
Upload time: 2024-04-19 08:53:43
            # PFeed: Data Pipeline for Algo-Trading, Getting and Storing Real-Time and Historical Data Made Easy.

![GitHub stars](https://img.shields.io/github/stars/PFund-Software-Ltd/pfeed?style=social)
![PyPI downloads](https://img.shields.io/pypi/dm/pfeed?label=downloads)
[![PyPI](https://img.shields.io/pypi/v/pfeed.svg)](https://pypi.org/project/pfeed)
![PyPI - Support Python Versions](https://img.shields.io/pypi/pyversions/pfeed)
[![Jupyter Book Badge](https://raw.githubusercontent.com/PFund-Software-Ltd/pfeed/main/docs/images/jupyterbook.svg)](https://jupyterbook.org)
[![Poetry](https://img.shields.io/endpoint?url=https://python-poetry.org/badge/v0.json)](https://python-poetry.org/)

PFeed (/piː fiːd/) is a data integration library tailored for algorithmic trading, 
serving as an ETL (Extract, Transform, Load) data pipeline between raw data sources and traders,
helping them build a **local data lake for quantitative research**.

PFeed allows traders to download historical, paper, and live data from various data sources, both free and paid,
and store it in a local data lake using [MinIO](https://min.io/).

It is designed to be used alongside [PFund](https://github.com/PFund-Software-Ltd/pfund) — a complete algo-trading framework for machine learning, TradFi, CeFi, and DeFi, supporting vectorized and event-driven backtesting as well as paper and live trading. It can also be used as a standalone package.

<details>
<summary>Table of Contents</summary>

- [Project Status](#project-status)
- [Mission](#mission)
- [Core Features](#core-features)
- [Installation](#installation)
- [Quick Start](#quick-start)
    - [Main Usage: Data Feed](#main-usage-data-feed)
    - [Download Historical Data on the Command Line Interface (CLI)](#download-historical-data-on-the-command-line-interface-cli)
    - [Download Historical Data in Python](#download-historical-data-in-python)
    - [List Current Config](#list-current-config)
    - [Run PFeed's docker-compose.yml](#run-pfeeds-docker-composeyml)
- [Supported Data Sources](#supported-data-sources)
- [Related Projects](#related-projects)

</details>


## Project Status
**_Caution: PFeed is at a VERY EARLY stage, use it at your own risk._**

PFeed is currently under active development; the framework design is being prioritized over
stability and scalability.

Please note that the available version is a *dev* version, not a *stable* one. \
You are encouraged to experiment with the *dev* version, but wait for a *stable* release before relying on it.

> For the time being, PFeed only supports [Bybit](https://bybit.com/) and Yahoo Finance, for testing purposes.

## Mission
Algo-trading has always been a complicated task due to the multitude of components and procedures involved. \
Data collection and processing is probably the most mundane and yet critical part of it, as all results and findings 
are derived from the data.

However, preparing this data for use is neither quick nor easy. For example, even when the data is publicly available (e.g. [Bybit data](https://public.bybit.com/trading/)), it often comes in raw form and requires some cleaning.

> PFeed's mission is to **_free traders from tedious data work_** by providing cleaned data in a standard, ready-to-use format, so they can get to the analysis and strategy development phase significantly faster.


## Core Features
- [x] Unified approach for interacting with various data sources and obtaining historical and real-time data
- [x] ETL data pipeline for transforming raw data and storing it in [MinIO](https://min.io/) (optional)
- [x] Utilizes [Ray](https://github.com/ray-project/ray) for parallel data downloading (see the sketch below)
- [x] Supports Pandas and [Polars](https://github.com/pola-rs/polars) as data tools
- [ ] Integrates with [Prefect](https://www.prefect.io) to control data flows
- [ ] Listens to PFund's trade engine and adds trade history to a local [TimescaleDB](https://www.timescale.com/) database (optional)
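
The Ray-based parallel downloading is handled internally by pfeed, but the general pattern looks roughly like the sketch below. This is a minimal, hypothetical illustration; the `download_one_day` function and its body are placeholders, not pfeed's actual implementation:

```python
import ray

ray.init(num_cpus=4)  # start a local Ray instance

@ray.remote
def download_one_day(product: str, date: str) -> str:
    # Placeholder: in reality this would fetch one day of raw data
    # (e.g. from Bybit's public archive), clean it, and write it to storage.
    return f"downloaded {product} for {date}"

dates = ["2024-03-01", "2024-03-02", "2024-03-03"]
# Submit one task per day; Ray executes them in parallel across CPU cores.
futures = [download_one_day.remote("BTC_USDT_PERP", d) for d in dates]
results = ray.get(futures)  # block until all tasks finish
print(results)

ray.shutdown()
```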


## Installation
### Using [Poetry](https://python-poetry.org) (Recommended)
```bash
# [RECOMMENDED]: dataframes (e.g. polars, pandas) + data storage (e.g. MinIO) + boosted performance
poetry add "pfeed[df,data,boost]"

# only for downloading data, e.g. Bybit and Yahoo Finance
poetry add pfeed

# update to the latest version:
poetry update pfeed
```

### Using Pip
```bash
pip install pfeed

# install the latest version:
pip install -U pfeed
```

### Checking your installation
```bash
$ pfeed --version
```

## Quick Start
### Main Usage: Data Feed
1. Download Bybit raw data on the fly if it is not stored locally

    ```python
    import pfeed as pe

    feed = pe.BybitFeed()

    # df is a dataframe or a lazyframe (lazily loaded dataframe)
    df = feed.get_historical_data(
        'BTC_USDT_PERP',
        resolution='raw',
        start_date='2024-03-01',
        end_date='2024-03-01',
        data_tool='polars',  # or 'pandas'
    )
    ```

    > With pfeed, you are just one line of code away from playing with data from sources such as Bybit. How convenient!

    Printing the first few rows of `df`:
    |    | ts                            | symbol   |   side |   volume |   price | tickDirection   | trdMatchID                           |   grossValue |   homeNotional |   foreignNotional |
    |---:|:------------------------------|:---------|-------:|---------:|--------:|:----------------|:-------------------------------------|-------------:|---------------:|------------------:|
    |  0 | 2024-03-01 00:00:00.097599983 | BTCUSDT  |      1 |    0.003 | 61184.1 | ZeroMinusTick   | 79ac9a21-0249-5985-b042-906ec7604794 |  1.83552e+10 |          0.003 |           183.552 |
    |  1 | 2024-03-01 00:00:00.098299980 | BTCUSDT  |      1 |    0.078 | 61184.9 | PlusTick        | 2af4e516-8ff4-5955-bb9c-38aa385b7b44 |  4.77242e+11 |          0.078 |          4772.42  |
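
    When `data_tool='polars'` is used, the returned `df` may be a Polars LazyFrame rather than an eager DataFrame. Below is a minimal sketch of handling both cases using only standard Polars behaviour (nothing pfeed-specific is assumed):

    ```python
    import polars as pl

    # Materialize the result if it was lazily loaded; otherwise use it as is.
    if isinstance(df, pl.LazyFrame):
        df = df.collect()  # triggers execution and returns a pl.DataFrame

    print(df.head())
    ```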

2. Get a dataframe at a different resolution, e.g. 1-minute data
    ```python
    import pfeed as pe

    feed = pe.BybitFeed()

    # df is a dataframe or a lazyframe (lazily loaded dataframe)
    df = feed.get_historical_data(
        'BTC_USDT_PERP',
        resolution='1minute',  # or '1tick'/'1t', '2second'/'2s', '3minute'/'3m' etc.
        start_date='2024-03-01',
        end_date='2024-03-01',
        data_tool='polars',
    )
    ```
    > If you will be interacting with the data frequently, you should consider downloading it to your local machine.

    Printing the first few rows of `df`:
    |    | ts                  | product       | resolution   |    open |    high |     low |   close |   volume |
    |---:|:--------------------|:--------------|:-------------|--------:|--------:|--------:|--------:|---------:|
    |  0 | 2024-03-01 00:00:00 | BTC_USDT_PERP | 1m           | 61184.1 | 61244.5 | 61175.8 | 61244.5 |  159.142 |
    |  1 | 2024-03-01 00:01:00 | BTC_USDT_PERP | 1m           | 61245.3 | 61276.5 | 61200.7 | 61232.2 |  227.242 |
    |  2 | 2024-03-01 00:02:00 | BTC_USDT_PERP | 1m           | 61232.2 | 61249   | 61180   | 61184.2 |   91.446 |
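
    Once you have minute bars, further aggregation is ordinary dataframe work rather than a pfeed feature. As a hedged sketch, the 1-minute bars above could be rolled up into 5-minute bars with standard Polars (assuming `df` is an eager DataFrame, e.g. after `.collect()`, that `ts` is a datetime column, and that a recent Polars version with `group_by_dynamic` is installed):

    ```python
    import polars as pl

    # Aggregate 1-minute OHLCV bars into 5-minute bars.
    df_5m = (
        df.sort("ts")
          .group_by_dynamic("ts", every="5m")
          .agg([
              pl.col("open").first(),
              pl.col("high").max(),
              pl.col("low").min(),
              pl.col("close").last(),
              pl.col("volume").sum(),
          ])
    )
    print(df_5m.head())
    ```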


3. pfeed also provides a simple wrapper around [yfinance](https://github.com/ranaroussi/yfinance)
    ```python
    import pfeed as pe

    feed = pe.YahooFinanceFeed()

    # you can still use any kwargs supported by yfinance's ticker.history(...)
    # e.g. 'prepost', 'auto_adjust' etc.
    yfinance_kwargs = {}

    df = feed.get_historical_data(
        'AAPL',
        resolution='1d',
        start_date='2024-03-01',
        end_date='2024-03-20',
        **yfinance_kwargs
    )
    ```
    > Note that YahooFinanceFeed does not support the `data_tool` kwarg (e.g. 'polars')
    
    Printing the first few rows of `df`:
    | ts                  | symbol   | resolution   |   open |   high |    low |   close |   volume |   dividends |   stock_splits |
    |:--------------------|:---------|:-------------|-------:|-------:|-------:|--------:|---------:|------------:|---------------:|
    | 2024-03-01 05:00:00 | AAPL     | 1d           | 179.55 | 180.53 | 177.38 |  179.66 | 73488000 |           0 |              0 |
    | 2024-03-04 05:00:00 | AAPL     | 1d           | 176.15 | 176.9  | 173.79 |  175.1  | 81510100 |           0 |              0 |
    | 2024-03-05 05:00:00 | AAPL     | 1d           | 170.76 | 172.04 | 169.62 |  170.12 | 95132400 |           0 |              0 |
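
    If you prefer to keep working in Polars anyway, you can convert the result yourself. A small sketch, assuming the returned `df` is a pandas DataFrame (which is what yfinance itself produces):

    ```python
    import polars as pl

    # Convert the pandas DataFrame into a Polars DataFrame for downstream Polars code.
    pl_df = pl.from_pandas(df)
    print(pl_df.head())
    ```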



### Download Historical Data on the Command Line Interface (CLI)
```bash
# download data, default data type (dtype) is 'raw' data
pfeed download -d BYBIT -p BTC_USDT_PERP --start-date 2024-03-01 --end-date 2024-03-08

# download minute data for multiple products: BTC_USDT_PERP and ETH_USDT_PERP
pfeed download -d BYBIT -p BTC_USDT_PERP -p ETH_USDT_PERP --dtype minute

# download all perpetuals data from bybit
pfeed download -d BYBIT --ptype PERP

# download all the data from bybit (CAUTION: your local machine probably won't have enough space for this!)
pfeed download -d BYBIT

# store data into MinIO (need to start MinIO by running `pfeed docker-compose up -d` first)
pfeed download -d BYBIT -p BTC_USDT_PERP --use-minio

# enable debug mode and disable Ray
pfeed download -d BYBIT -p BTC_USDT_PERP --debug --no-ray
```

### Download Historical Data in Python
```python
import pfeed as pe

# compared to the CLI approach, this is more convenient for downloading multiple products
pe.bybit.download(
    pdts=[
        'BTC_USDT_PERP',
        'ETH_USDT_PERP',
        'BCH_USDT_PERP',
    ],
    dtypes=['raw'],  # data types, e.g. 'raw', 'tick', 'second', 'minute' etc.
    start_date='2024-03-01',
    end_date='2024-03-08',
    use_minio=False,
)
```

### List Current Config
```bash
# list the current config:
pfeed config --list

# change the data storage location to your local project's 'data' folder:
pfeed config --data-path ./data

# for more commands:
pfeed --help
```

### Run PFeed's docker-compose.yml
```bash
# same as 'docker-compose', except that it points to pfeed's docker-compose.yml file
pfeed docker-compose [COMMAND]

# e.g. start services
pfeed docker-compose up -d

# e.g. stop services
pfeed docker-compose down
```
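
After `pfeed docker-compose up -d`, the MinIO service should be reachable from Python. Below is a hypothetical connectivity check using the `minio` client; the endpoint and credentials are placeholders and must match whatever pfeed's docker-compose.yml actually configures:

```python
from minio import Minio  # pip install minio

# Placeholder endpoint and credentials: adjust to your docker-compose configuration.
client = Minio(
    "localhost:9000",
    access_key="<your-access-key>",
    secret_key="<your-secret-key>",
    secure=False,  # local development without TLS
)

# List the buckets to confirm the object store is up.
for bucket in client.list_buckets():
    print(bucket.name)
```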


## Supported Data Sources
| Data Source               | Get Historical Data | Download Historical Data | Get Live/Paper Data | Stream Live/Paper Data |
| ------------------------- | ------------------- | ------------------------ | ------------------- | ---------------------- |
| Yahoo Finance             | 🟢                  | ⚪                       | ⚪                  | ⚪                     |
| Bybit                     | 🟢                  | 🟢                       | 🟡                  | 🔴                     |
| *Interactive Brokers (IB) | 🔴                  | ⚪                       | 🔴                  | 🔴                     |
| *[FirstRate Data]         | 🔴                  | 🔴                       | ⚪                  | ⚪                     |
| Binance                   | 🔴                  | 🔴                       | 🔴                  | 🔴                     |
| OKX                       | 🔴                  | 🔴                       | 🔴                  | 🔴                     |

[FirstRate Data]: https://firstratedata.com

🟢 = finished \
🟡 = in progress \
🔴 = todo \
⚪ = not applicable \
\* = paid data


## Related Projects
- [PFund](https://github.com/PFund-Software-Ltd/pfund) — a complete algo-trading framework for machine learning, TradFi, CeFi, and DeFi, supporting vectorized and event-driven backtesting as well as paper and live trading
- [PyTrade.org](https://pytrade.org) - A curated list of Python libraries and resources for algorithmic trading.
            
