# AzureStorageUtils
[![PyPI version](https://badge.fury.io/py/azure_strg_utils.svg)](https://badge.fury.io/py/azure-strg-utils/0.2.2)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Python Versions](https://img.shields.io/pypi/pyversions/azure_strg_utils.svg)](https://pypi.org/project/azure_strg_utils)
The package simplifies interaction with Azure storage accounts, enabling a wide range of container, blob, and file operations.
## Installation
Use the package manager `pip` to install azure-strg-utils.
```bash
pip install azure-strg-utils
```
## Usage
* **Create Container**: Allows the creation of containers within the storage account to organize and store data.
* **List Containers**: Provides a list of all containers present within the storage account.
* **List Blobs**: Enables the listing of blobs (files) within a container.
* **Download Operations**:
  * **Download Single File**: Retrieves a specific file from a container.
  * **Download Specific File(s) Using Regex**: Downloads files matching a specific pattern from a blob.
  * **Download Entire Blob**: Downloads all files present within a blob.
* **Upload Operations**:
  * **Upload Single File**: Adds a single file to a specified container.
  * **Upload Specific File(s) Using Regex**: Uploads files matching a specific pattern to a container.
  * **Upload All Files**: Uploads all files from a local directory to a specified container.
* **Delete Operations**:
  * **Delete Single File from Blob**: Removes a specific file from a blob in a container.
  * **Delete Specific File(s) Using Regex**: Deletes files matching a specific pattern from a blob.
  * **Delete Single or All Containers**: Deletes a particular container or all containers within the storage account.
* **Copy Operations**:
  * **Copy Single File from Blob**: Copies a single file from one container to another container.
  * **Copy Specific File(s) Using Regex**: Copies all files matching a specific pattern from a blob.
  * **Copy All Files**: Copies all files inside a blob in the source container to another blob.
* **Conditional Operations**:
  * Download, delete, or retrieve a list of files from Azure Blob Storage using date-based operators such as `less_than`, `less_than_or_equal`, `greater_than`, and `greater_than_or_equal`.
* **Upload DataFrame**:
  * **Save DataFrame in Blob**: Saves a pandas DataFrame to a blob in a specific format. Currently supported formats are XML, JSON, and CSV.
* **File Regex Operations**:
  * **Apply File Regex**: Allows operations such as download, deletion, or upload on files that match a specific pattern within a blob.

These operations let you manage containers and files (blobs): downloading, uploading, copying, and deleting specific or multiple files based on naming patterns within the Azure storage account. Use them with caution, especially the deletion operations, as they can result in permanent data loss.

Start by importing the main class:
```python
from azure_strg_utils import AzureStorageUtils
```
### Establish a connection.
Establish a connection by supplying the connection string to create an instance of AzureStorageUtils.
```python
import os

CONNECTION_STRING = os.getenv('CONNECTION_STRING')
client = AzureStorageUtils(connection_string=CONNECTION_STRING)
```
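The snippet above assumes the connection string is already set as an environment variable. One optional way (not a requirement of this package) to load it from a local `.env` file is the third-party python-dotenv library:
```python
# Optional helper: load CONNECTION_STRING from a local .env file.
# Requires python-dotenv (pip install python-dotenv); not a dependency of azure-strg-utils.
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into the process environment
```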
### Get list of containers.
The list_container() method on the AzureStorageUtils instance (client) retrieves the list of containers available in the connected Azure storage account.
```python
container=client.list_container()
print(container)
```
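The result can be treated as a regular Python list, so the usual list operations apply, for example:
```python
# Iterate over the returned container names (assumed here to be a list)
for name in container:
    print(name)
```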
### Create a container.
This code snippet uses the create_container() method to create a new container named 'test' in the connected Azure storage account.
```python
client.create_container(container_name='test')
```
### Get list of folders.
This code snippet uses the list_folders() method to retrieve and display the list of folders within the specified container ('test') in the connected Azure storage account.
```python
folder=client.list_folders(container_name='test')
print(folder)
```
### Get list of files.
This code snippet uses the list_blob() method to retrieve and display the list of files within the 'raw' blob (folder) inside the 'test' container in the Azure storage account.
```python
blob,folder_files=client.list_blob(
container_name='test',
blob_name='raw'
)
print(folder_files)
```
### Retrieves a Pandas DataFrame.
Pass `is_dataframe=True` to load a file stored inside a folder in the Azure storage account directly into a pandas DataFrame.
The snippet below uses download_file() to read the 'cars.csv' file from the 'raw' folder of the 'test' container into a DataFrame.
```python
df=client.download_file(
container_name='test',
blob_name='raw',
file_name='cars.csv',
is_dataframe=True
)
```
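Because the result is a regular pandas DataFrame, the usual pandas calls work on it, for example:
```python
# Standard pandas inspection of the downloaded data
print(df.shape)   # (rows, columns)
print(df.head())  # first five rows
```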
### Downloads a specified file.
Use the 'path' parameter to download a file from a specific folder in the Azure storage account to a local directory.
The snippet below uses download_file() to fetch 'sales_data.csv' from the 'raw' folder in the 'test' container and save it to the './test' directory on the local system.
```python
client.download_file(
container_name='test',
blob_name='raw',
file_name='sales_data.csv',
path='./test'
)
```
### Downloads files with a specific naming pattern.
This code downloads all files from the 'raw' folder in the 'test' container that match the specified naming pattern (e.g., files with names starting with 'cust' and ending with '.csv'). Modify the file_regex parameter to match your desired naming pattern for file selection.
```python
client.download_file(
container_name='test',
blob_name='raw',
path='downloaded',
all_files=True,
file_regex='cust*.csv'
)
```
### Downloads all files from a specific blob/folder.
This code downloads all files from the 'raw' folder in the 'test' container, including files in subdirectories.
```python
client.download_file(
container_name='test',
blob_name='raw',
path='downloaded',
all_files=True,
)
```
### Uploads a single file to a specific blob/folder.
This code uses the upload_file() method from the AzureStorageUtils to upload the file 'Product_data.csv' from the 'Data_Profiling/data' directory into the 'raw' blob/folder within the 'test' container.
```python
client.upload_file(
container_name='test',
blob_name='raw',
file_name='Product_data.csv',
file_path='Data_Profiling/data'
)
```
### Uploads all files from a specified path to a blob/folder.
This code utilizes the upload_file() method from the AzureStorageUtils with the all_files=True parameter. It uploads all files from the 'Data_Profiling/data' directory into the 'raw' blob/folder within the 'test' container.
```python
client.upload_file(
container_name='test',
blob_name='raw',
all_files=True,
file_path='Data_Profiling/data'
)
```
### Uploads files with a specific naming pattern from a specified path to a blob/folder.
This code utilizes the upload_file() method from the AzureStorageUtils instance (client) with the all_files=True parameter and the file_regex='cust*' parameter to upload all files matching the 'cust*' pattern from the 'data' directory into the 'multi/raw' blob/folder within the 'test' container.
```python
client.upload_file(
container_name='test',
blob_name='multi/raw',
all_files=True,
file_path='data',
file_regex='cust*'
)
```
### Deletes all files inside a blob.
This code uses the delete_file() method from the AzureStorageUtils instance (client) with the all_files=True parameter to delete all files contained within the 'raw' blob inside the 'test' container.
```python
client.delete_file(
container_name='test',
blob_name='raw',
all_files=True
)
```
### Deletes files with a specific naming pattern inside a blob.
This code utilizes the delete_file() method from the AzureStorageUtils with the all_files=True parameter and the file_regex='cust*' parameter to delete all files inside the 'raw' blob that match the specified pattern ('cust*').
```python
client.delete_file(
container_name='test',
blob_name='raw',
all_files=True,
file_regex='cust*'
)
```
### Deletes a single file inside a blob.
This code uses the delete_file() method from the AzureStorageUtils instance (client) to delete the 'Product_data.csv' file from the 'raw' blob in the 'test' container.
```python
client.delete_file(
container_name='test',
blob_name='raw',
file_name='Product_data.csv'
)
```
### Conditional operation.
Use the conditional_operation() method to perform download or delete operations on a blob based on file creation date.
Supported actions are `download` and `delete`, and the comparison can be any of `less_than`, `less_than_or_equal`, `greater_than`, and `greater_than_or_equal`.
```python
# download all files which have creation date greater than 2023-12-15
client.conditional_operation(
container_name='rawdata',
blob_name='raw',
creation_date='2023-12-15',
comparison='greater_than',
action='download',
path='data')
# delete all files which have creation date less than 2023-12-15
client.conditional_operation(
container_name='rawdata',
blob_name='raw',
creation_date='2023-12-15',
comparison='less_than',
action='delete')
# delete all files which have creation date less than 2023-12-15 and file name starts with c*
client.conditional_operation(
container_name='rawdata',
blob_name='raw',
creation_date='2023-12-15',
comparison='less_than',
action='delete',
file_regex='c*')
```
### Copy files.
Use the copy_blob() method to copy one or more files from one container to another within the same storage account.
The same date-based comparison operators and the file_regex parameter can be used here as well.
```python
# copy all files which have creation date greater than 2023-12-15 from rawdata to new-test container
client.copy_blob(
container_name='rawdata',
blob_name='raw',
destination_container='new-test',
destination_blob='raw',
creation_date='2023-12-15',
comparison='greater_than'
)
# copy all files which have creation date greater than 2023-12-15 from rawdata to new-test container and file name starts with c*
# and delete the file after copy.
client.copy_blob(
container_name='rawdata',
blob_name='raw',
destination_container='new-test',
destination_blob='raw',
creation_date='2023-12-15',
comparison='greater_than',
file_regex='c*',
delete_file=True
)
```
### Save a pandas DataFrame.
Use the upload_dataframe() method to save a DataFrame to a blob. Supported formats are XML, CSV, and JSON.
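The example below assumes `df` is an existing pandas DataFrame (for instance, the one returned by download_file() earlier); for a self-contained run you could construct one first with illustrative data:
```python
import pandas as pd

# Illustrative DataFrame to upload; any pandas DataFrame works here
df = pd.DataFrame({
    'model': ['sedan', 'suv', 'hatchback'],
    'price': [20000, 35000, 18000]
})
```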
```python
# upload the dataframe to blob test in xml format.
client.upload_dataframe(
dataframe=df,
file_name='cars.xml',
container_name='rawdata',
blob_name='test'
)
```
### Deletes a container.
This code uses the delete_container() method from the AzureStorageUtils instance (client) to delete the container named 'test' from the Azure storage account.
```python
client.delete_container(container_name='test')
```
### Deletes all containers from the storage account.
This code utilizes the delete_container() method from the AzureStorageUtils instance (client) with the all_containers=True parameter to delete all containers present in the Azure storage account. Use this cautiously as it will delete all containers and their contents.
```python
client.delete_container(all_containers=True)
```
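Because this call is irreversible, a simple confirmation guard (a sketch built on top of the documented call, not a feature of the package) can reduce the risk of accidental runs:
```python
# Illustrative safety check before an irreversible bulk delete
answer = input("Delete ALL containers? Type 'yes' to confirm: ").strip().lower()
if answer == 'yes':
    client.delete_container(all_containers=True)
else:
    print("Aborted; no containers were deleted.")
```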
## Contributing
Your contributions to this project are encouraged and valued! You have the autonomy to make modifications directly and merge them without a formal review process.
You are free to use the code in any manner you see fit. However, please be cautious: the author takes no responsibility for any deletion or modification of critical containers or data.
To contribute:
1. Fork the repository to your GitHub account.
2. Clone the forked repository to your local machine.
3. Create a new branch for your changes: `git checkout -b feature/YourFeature`
4. Implement your modifications or additions.
5. Commit your changes: `git commit -am 'Add new feature'`
6. Push the branch to your GitHub repository: `git push origin feature/YourFeature`
7. Merge your changes directly into the main branch.
Your contributions directly impact the project. Please ensure that your modifications align with the project's goals and standards. Thank you for your valuable contributions!