# Cloud Shelve
`Cloud Shelve (cshelve)` is a Python package that provides a seamless way to store and manage data in the cloud using the familiar [Python Shelve interface](https://docs.python.org/3/library/shelve.html). It is designed for efficient and scalable storage solutions, allowing you to leverage cloud providers for persistent storage while keeping the simplicity of the `shelve` API.
## Installation
Install `cshelve` via pip:
```bash
pip install cshelve              # Base installation (local files and in-memory provider, useful for testing)
pip install cshelve[azure-blob]  # With Azure Blob Storage support
```
## Usage
The `cshelve` module provides a simple key-value interface for storing data in the cloud.
### Quick Start Example
Here is a quick example demonstrating how to store and retrieve data using `cshelve`:
```python
import cshelve
# Open a local database file
d = cshelve.open('local.db')
# Store data
d['my_key'] = 'my_data'
# Retrieve data
print(d['my_key']) # Output: my_data
# Close the database
d.close()
```
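Because `cshelve` follows the `shelve` interface, the usual dictionary-style operations should also be available. A minimal sketch, assuming standard `shelve` semantics (string keys, any picklable value):

```python
import cshelve

# The shelve interface behaves like a persistent dictionary.
with cshelve.open('local.db') as d:
    d['numbers'] = [1, 2, 3]      # any picklable object can be stored
    print('numbers' in d)         # membership test -> True
    for key in d:                 # iterate over the stored keys
        print(key, d[key])
    del d['numbers']              # remove an entry
```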
### Cloud Storage Example (e.g., AWS, Azure)
To configure remote cloud storage, you need to provide an INI file containing your cloud provider's configuration. The file should have a `.ini` extension. Remote storage also requires the installation of optional dependencies for the cloud provider you want to use.
#### Example AWS S3 Configuration
First, install the AWS S3 provider:
```bash
pip install cshelve[aws-s3]
```
Then, create an INI file with the following configuration:
```bash
$ cat aws-s3.ini
[default]
provider = aws-s3
bucket_name = cshelve
auth_type = access_key
key_id = $AWS_KEY_ID
key_secret = $AWS_KEY_SECRET
```
Next, export the environment variables:
```bash
export AWS_KEY_ID=your_access_key_id
export AWS_KEY_SECRET=your_secret_access_key
```
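Since the `key_id` and `key_secret` options name the environment variables that hold the credentials, a quick pre-flight check that those variables are actually set can save a confusing failure later. A small, optional sketch (the variable names match the configuration above):

```python
import os

# Fail fast if the environment variables referenced in aws-s3.ini are not set.
for var in ("AWS_KEY_ID", "AWS_KEY_SECRET"):
    if not os.environ.get(var):
        raise RuntimeError(f"Environment variable {var} is not set")
```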
Once the INI file is ready, you can interact with remote storage the same way as with local storage. Here's an example using AWS:
```python
import cshelve
# Open using the remote storage configuration
d = cshelve.open('aws-s3.ini')
# Store data
d['my_key'] = 'my_data'
# Retrieve data
print(d['my_key']) # Output: my_data
# Close the connection to the remote storage
d.close()
```
#### Example Azure Blob Configuration
First, install the Azure Blob Storage provider:
```bash
pip install cshelve[azure-blob]
```
Then, create an INI file with the following configuration:
```bash
$ cat azure-blob.ini
[default]
provider = azure-blob
account_url = https://myaccount.blob.core.windows.net
auth_type = passwordless
container_name = mycontainer
```
Once the INI file is ready, you can interact with remote storage the same way as with local storage. Here's an example using Azure:
```python
import cshelve
# Open using the remote storage configuration
d = cshelve.open('azure-blob.ini')
# Store data
d['my_key'] = 'my_data'
# Retrieve data
print(d['my_key']) # Output: my_data
# Close the connection to the remote storage
d.close()
```
### Advanced Scenario: Storing DataFrames in the Cloud
In this advanced example, we will demonstrate how to store and retrieve a Pandas DataFrame using `cshelve` with Azure Blob Storage.
First, install the required dependencies:
```bash
pip install cshelve[azure-blob] pandas
```
Create an INI file with the Azure Blob Storage configuration:
```bash
$ cat azure-blob.ini
[default]
provider = azure-blob
account_url = https://myaccount.blob.core.windows.net
auth_type = passwordless
container_name = mycontainer
```
Here's the code to store and retrieve a DataFrame:
```python
import cshelve
import pandas as pd
# Create a sample DataFrame
df = pd.DataFrame({
    'name': ['Alice', 'Bob', 'Charlie'],
    'age': [25, 30, 35],
    'city': ['New York', 'Los Angeles', 'Chicago']
})
# Open the remote storage using the Azure Blob configuration
with cshelve.open('azure-blob.ini') as db:
    # Store the DataFrame
    db['my_dataframe'] = df
# Retrieve the DataFrame
with cshelve.open('azure-blob.ini') as db:
    retrieved_df = db['my_dataframe']
print(retrieved_df)
```
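As with the standard `shelve` module, mutating a retrieved object in place is not automatically written back; re-assigning the key persists the change. A minimal sketch under that assumption:

```python
import cshelve

# Read, modify, and explicitly re-assign to persist the updated DataFrame.
with cshelve.open('azure-blob.ini') as db:
    df = db['my_dataframe']
    df['age'] = df['age'] + 1       # example transformation
    db['my_dataframe'] = df         # re-assignment writes the new value to the store
```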
More configuration examples for other cloud providers can be found [here](./tests/configurations/).
### Provider Configuration
#### AWS S3
Provider: `aws-s3`
Installation: `pip install cshelve[aws-s3]`
The AWS S3 provider uses an [AWS S3 Bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/Welcome.html) as remote storage.
| Option | Description | Required | Default Value |
|---------------------|-----------------------------------------------------------------------------|--------------------|---------------|
| `bucket_name` | The name of the S3 bucket. | :white_check_mark: | |
| `auth_type` | The authentication method to use: `access_key`. | :white_check_mark: | |
| `key_id` | The name of the environment variable holding the AWS access key ID. | :white_check_mark: | |
| `key_secret` | The name of the environment variable holding the AWS secret access key. | :white_check_mark: | |
Depending on the `open` flag, the permissions required by `cshelve` for S3 storage vary.
| Flag | Description | Permissions Needed |
|------|-------------|--------------------|
| `r` | Open an existing S3 bucket for reading only. | [AmazonS3ReadOnlyAccess](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonS3ReadOnlyAccess.html) |
| `w` | Open an existing S3 bucket for reading and writing. | [AmazonS3ReadAndWriteAccess](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html) |
| `c` | Open an S3 bucket for reading and writing, creating it if it doesn't exist. | [AmazonS3FullAccess](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonS3FullAccess.html) |
| `n` | Purge the S3 bucket before using it. | [AmazonS3FullAccess](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonS3FullAccess.html) |
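For example, to open the bucket read-only (so only `AmazonS3ReadOnlyAccess` is required), pass the `r` flag. A sketch, assuming `cshelve.open` accepts the same flag argument as the standard `shelve.open`:

```python
import cshelve

# Open the S3-backed store read-only; write attempts are expected to fail.
db = cshelve.open('aws-s3.ini', 'r')
print(db['my_key'])
db.close()
```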
#### Azure Blob
Provider: `azure-blob`
Installation: `pip install cshelve[azure-blob]`
The Azure provider uses [Azure Blob Storage](https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction) as remote storage.
The module treats the provided container as dedicated to the application, which can have significant consequences: for example, if the `n` flag is passed to the `open` function, the entire container is purged, in line with the [official interface](https://docs.python.org/3/library/shelve.html#shelve.open).
| Option | Description | Required | Default Value |
|----------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------|---------------|
| `account_url` | The URL of your Azure storage account. | :x: | |
| `auth_type` | The authentication method to use: `access_key`, `passwordless`, `connection_string` or `anonymous`. | :white_check_mark: | |
| `container_name` | The name of the container in your Azure storage account. | :white_check_mark: | |
Depending on the `open` flag, the permissions required by `cshelve` for blob storage vary.
| Flag | Description | Permissions Needed |
|------|-------------|--------------------|
| `r` | Open an existing blob storage container for reading only. | [Storage Blob Data Reader](https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#storage-blob-data-reader) |
| `w` | Open an existing blob storage container for reading and writing. | [Storage Blob Data Contributor](https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#storage-blob-data-contributor) |
| `c` | Open a blob storage container for reading and writing, creating it if it doesn't exist. | [Storage Blob Data Contributor](https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#storage-blob-data-contributor) |
| `n` | Purge the blob storage container before using it. | [Storage Blob Data Contributor](https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#storage-blob-data-contributor) |
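As noted above, the `n` flag purges the entire container before use, so it should be reserved for containers that the application fully owns. A cautionary sketch, again assuming the standard `shelve.open` flag argument:

```python
import cshelve

# WARNING: the 'n' flag clears every blob in the configured container before use.
with cshelve.open('azure-blob.ini', 'n') as db:
    db['fresh_key'] = 'fresh_data'   # the container now holds only this entry
```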
Supported authentication types:
| Auth Type | Description | Advantage | Disadvantage | Example Configuration |
|-------------------|-------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------|---------------------------------------|-----------------------|
| Access Key | Uses an Access Key or a Shared Access Signature for authentication. | Fast startup as no additional credential retrieval is needed. | Credentials need to be securely managed and provided. | [Example](./tests/configurations/azure-integration/access-key.ini) |
| Anonymous | No authentication for anonymous access on public blob storage. | No configuration or credentials needed. | Read-only access. | [Example](./tests/configurations/azure-integration/anonymous.ini) |
| Connection String | Uses a connection string for authentication. Credentials are provided directly in the string. | Fast startup as no additional credential retrieval is needed. | Credentials need to be securely managed and provided. | [Example](./tests/configurations/azure-integration/connection-string.ini) |
| Passwordless | Uses passwordless authentication methods such as Managed Identity. | Recommended for better security and easier credential management. | May impact startup time due to the need to retrieve authentication credentials. | [Example](./tests/configurations/azure-integration/standard.ini) |
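For public containers, the `anonymous` authentication type allows read-only access without any credentials. A hypothetical configuration and usage sketch (the account and container names are placeholders):

```python
import cshelve

# Hypothetical anonymous.ini, using only the options documented above:
#
#   [default]
#   provider = azure-blob
#   account_url = https://publicaccount.blob.core.windows.net
#   auth_type = anonymous
#   container_name = public-container
#
# Anonymous access is read-only, so open the store with the 'r' flag.
db = cshelve.open('anonymous.ini', 'r')
print(list(db.keys()))
db.close()
```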
#### In Memory
Provider: `in-memory`
Installation: No additional installation required.
The In-Memory provider uses an in-memory data structure to simulate storage. This is useful for testing and development purposes.
| Option | Description | Required | Default Value |
|----------------|------------------------------------------------------------------------------|----------|---------------|
| `persist-key` | If set, the in-memory store identified by this value is kept and reused for the duration of the program. | :x: | None |
| `exists` | If `True`, the database is assumed to already exist; otherwise, it will be created. | :x: | False |
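The in-memory provider is handy for unit tests because no cloud resources or credentials are needed. A sketch, assuming an `in-memory.ini` file like the commented configuration below (option names taken from the table above):

```python
import cshelve

# Hypothetical in-memory.ini, using only the options documented above:
#
#   [default]
#   provider = in-memory
#   persist-key = unit-tests
#
# With persist-key set, two open() calls in the same process share the same data.
db = cshelve.open('in-memory.ini')
db['fixture'] = {'answer': 42}
db.close()

db = cshelve.open('in-memory.ini')
assert db['fixture'] == {'answer': 42}
db.close()
```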
## Contributing
We welcome contributions from the community! Have a look at our [issues](https://github.com/Standard-Cloud/cshelve/issues).
## License
This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for more details.
## Contact
If you have any questions, issues, or feedback, feel free to [open an issue](https://github.com/Standard-Cloud/cshelve/issues).