# Synmax API Client
## Installation
If you just want to use the package, run:
```bash
pip install --upgrade synmax-api-python-client
```
### Requirements
Make sure you have [Python 3.10+](https://docs.python.org/3/) and [pip](https://pypi.org/project/pip/) installed.
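A quick sanity check after installing (a minimal sketch; the version assertion simply mirrors the requirement above):
```python
import sys

# The README asks for Python 3.10+; fail fast on an older interpreter.
assert sys.version_info >= (3, 10), "Python 3.10+ is required"

# Importing the client confirms the package is installed and importable.
from synmax.hyperion import HyperionApiClient
print(HyperionApiClient)
```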
## Quickstart
Hyperion API Swagger doc: https://hyperion.api.synmax.com/apidocs/#/default
### Jupyter notebook setup
Run this block of code at the beginning of your notebook, before using the Synmax API client:
```python
!pip install nest_asyncio

import nest_asyncio
nest_asyncio.apply()
```
After this, run your code as usual.
### Configuring the Synmax client
```python
import logging
from synmax.hyperion import HyperionApiClient, ApiPayload
# Enable debug logging if required.
logging.basicConfig(level=logging.DEBUG)

# Two ways to pass the access token:
# 1. Set an environment variable: os.environ['access_token'] = 'your token'
# OR
# 2. Pass it to the HyperionApiClient instance
access_token = 'your access token goes here'
hyperion_client = HyperionApiClient(access_token=access_token)
```
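If you prefer the environment-variable route mentioned in the comments above, a minimal sketch (it assumes the client falls back to the `access_token` environment variable when no token is passed explicitly):
```python
import os

from synmax.hyperion import HyperionApiClient

# Assumption: HyperionApiClient reads the 'access_token' environment
# variable when no token is passed to the constructor.
os.environ['access_token'] = 'your access token goes here'
hyperion_client = HyperionApiClient()
```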
#### Fetching data based on your subscription key (access_key)
```python
from synmax.hyperion import HyperionApiClient, ApiPayload
hyperion_client = HyperionApiClient(access_token='....')
# fetch regions
region_df = hyperion_client.fetch_regions()
print(region_df.count())
```
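Because the result is a pandas DataFrame, ordinary pandas operations apply. A hedged example for inspecting and persisting the regions result (column names depend on the API response and your subscription):
```python
# Inspect the first few rows and the available columns.
print(region_df.head())
print(region_df.columns.tolist())

# Persist the result locally if needed.
region_df.to_csv('regions.csv', index=False)
```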
#### Paginated data
```python
import logging
from synmax.hyperion import HyperionApiClient, ApiPayload
# Enable debug logging if required.
logging.basicConfig(level=logging.DEBUG)

# Two ways to pass the access token:
# 1. Set an environment variable: os.environ['access_token'] = 'your token'
# 2. Pass it to the HyperionApiClient instance
access_token = 'your access token goes here'
hyperion_client = HyperionApiClient(access_token=access_token)

# Well completions filtered by an ApiPayload.
# fetch_all=True (the default) paginates through all rows and returns the
# accumulated result of every page. Set fetch_all=False to fetch a single
# page, starting at the row index given by payload.pagination_start (default 0).
payload = ApiPayload(start_date='2022-06-01', end_date='2022-06-25', state_code='TX')

# The result is returned as a pandas.DataFrame
completions_df = hyperion_client.well_completion(payload)
print(completions_df.count())
# Example progress output:
# Querying API pages: 100%|██████████| 8/8 [00:06<00:00, 1.14it/s]
# The payload argument is optional; omitting it fetches the full dataset
result_df = hyperion_client.wells()
print(result_df.count())
# Querying API wells pages: 0%| | 4/7225 [00:16<8:51:17, 4.41s/it]
## Well data
result_df = hyperion_client.wells(payload)

## Production by county and operator
result_df = hyperion_client.production_by_county_and_operator(payload)

## Available API methods on hyperion_client
dir(hyperion_client)
# output: ['ducs_by_operator', 'fetch_regions', 'frac_crews', 'production_by_county_and_operator', 'production_by_well', 'rigs', 'well_completion', 'wells']
```
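To fetch a single page instead of accumulating every page, a minimal sketch based on the `fetch_all` and `pagination_start` options described in the comments above (treat the exact placement of `fetch_all` as a method argument as an assumption):
```python
from synmax.hyperion import HyperionApiClient, ApiPayload

hyperion_client = HyperionApiClient(access_token='your access token goes here')

payload = ApiPayload(start_date='2022-06-01', end_date='2022-06-25', state_code='TX')
payload.pagination_start = 0  # row index of the first row to return (default 0)

# Assumption: fetch_all=False returns just the single page starting at
# payload.pagination_start, as described in the comments above.
first_page_df = hyperion_client.well_completion(payload, fetch_all=False)
print(first_page_df.count())
```
The same pattern should apply to the other data methods listed by `dir(hyperion_client)`.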
## Publishing the package
```shell
pip install twine
python setup.py bdist_wheel
twine upload dist/*
python setup.py clean --all
```
## Pre-commit hooks
```bash
pip install pre-commit
pre-commit install

# Run once to fix any existing files
pre-commit run --all-files
```