self-encryption

- **Name**: self-encryption
- **Version**: 0.32.43
- **Summary**: Self encrypting files (convergent encryption plus obfuscation)
- **Home page**: https://maidsafe.net
- **Author**: MaidSafe Developers <dev@maidsafe.net>
- **Upload time**: 2024-12-23 00:46:13
- **Requires Python**: >=3.7
- **License**: GPL-3.0
- **Keywords**: encryption, convergent-encryption, self-encryption, obfuscation, security
# self_encryption

Self encrypting files (convergent encryption plus obfuscation)

|Crate|Documentation|
|:---:|:-----------:|
|[![](https://img.shields.io/crates/v/self_encryption.svg)](https://crates.io/crates/self_encryption)|[![Documentation](https://docs.rs/self_encryption/badge.svg)](https://docs.rs/self_encryption)|

| [MaidSafe website](https://maidsafe.net) | [SAFE Dev Forum](https://forum.safedev.org) | [SAFE Network Forum](https://safenetforum.org) |
|:----------------------------------------:|:-------------------------------------------:|:----------------------------------------------:|

## Table of Contents
- [Overview](#overview)
- [Documentation](#documentation)
- [Features](#features)
- [Usage](#usage)
  - [Rust Usage](#rust-usage)
    - [Basic Operations](#basic-operations)
    - [Storage Backends](#storage-backends)
  - [Python Usage](#python-usage)
    - [Basic Operations](#basic-operations-1)
    - [File Operations](#file-operations)
    - [Advanced Features](#advanced-features)
- [Implementation Details](#implementation-details)
- [License](#license)
- [Contributing](#contributing)

## Overview

A version of [convergent encryption](http://en.wikipedia.org/wiki/convergent_encryption) with an additional obfuscation step. This pattern yields encrypted data that can still be [de-duplicated](http://en.wikipedia.org/wiki/Data_deduplication). This library presents an API that takes a set of bytes and returns a secret key derived from those bytes, together with a set of encrypted chunks.
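A minimal Python sketch of why convergent encryption enables de-duplication (illustrative only, not this library's API): because the key is derived from the content itself, identical inputs produce identical keys, and therefore identical ciphertexts.

```python
import hashlib

def content_key(data: bytes) -> bytes:
    """Derive an encryption key from the content itself, the core idea of
    convergent encryption: identical plaintexts yield identical keys, so
    their ciphertexts match and can be stored once."""
    return hashlib.sha3_256(data).digest()

# Two independent users encrypting the same file derive the same key...
key_a = content_key(b"shared document")
key_b = content_key(b"shared document")
assert key_a == key_b  # identical content -> identical key -> dedup possible

# ...while different content yields an unrelated key.
assert content_key(b"other document") != key_a
```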

**Important Security Note**: While this library provides very secure encryption of the data, the returned secret key **requires the same secure handling as would be necessary for any secret key**.

![image of self encryption](https://github.com/maidsafe/self_encryption/blob/master/img/self_encryption.png?raw=true)

## Documentation
- [Self Encrypting Data Whitepaper](https://docs.maidsafe.net/Whitepapers/pdf/SelfEncryptingData.pdf)
- [Process Overview Video](https://www.youtube.com/watch?v=Jnvwv4z17b4)

## Features

- Content-based chunking
- Convergent encryption
- Self-validating chunks
- Hierarchical data maps for handling large files
- Streaming encryption/decryption
- Python bindings
- Flexible storage backend support
- Custom storage backends via functors

## Usage

### Rust Usage

#### Installation

Add this to your `Cargo.toml`:
```toml
[dependencies]
self_encryption = "0.30"
bytes = "1.0"
```

#### Basic Operations

```rust
use self_encryption::{encrypt, decrypt_full_set, Result};
use bytes::Bytes;

// Basic encryption/decryption
fn basic_example() -> Result<()> {
    let data = Bytes::from("Hello, World!".repeat(1000));  // Must be at least 3072 bytes
    
    // Encrypt data
    let (data_map, encrypted_chunks) = encrypt(data.clone())?;
    
    // Decrypt data
    let decrypted = decrypt_full_set(&data_map, &encrypted_chunks)?;
    assert_eq!(data, decrypted);
    
    Ok(())
}
```

#### Storage Backends

```rust
use self_encryption::{shrink_data_map, get_root_data_map, decrypt_from_storage, Error, Result};
use bytes::Bytes;
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::{Arc, Mutex};

// Memory Storage Example
fn memory_storage_example() -> Result<()> {
    let storage = Arc::new(Mutex::new(HashMap::new()));
    
    // Store function
    let store = |hash, data| {
        storage.lock().unwrap().insert(hash, data);
        Ok(())
    };
    
    // Retrieve function
    let retrieve = |hash| {
        storage.lock().unwrap()
            .get(&hash)
            .cloned()
            .ok_or_else(|| Error::Generic("Chunk not found".into()))
    };
    
    // Use with data map operations (`data_map` comes from a prior `encrypt` call)
    let shrunk_map = shrink_data_map(data_map, store)?;
    let root_map = get_root_data_map(shrunk_map, retrieve)?;
    
    Ok(())
}

// Disk Storage Example (chunk file names are hex-encoded via the `hex` crate)
fn disk_storage_example() -> Result<()> {
    let chunk_dir = PathBuf::from("chunks");
    
    // Store function
    let store = |hash, data| {
        let path = chunk_dir.join(hex::encode(hash));
        std::fs::write(path, data)?;
        Ok(())
    };
    
    // Retrieve function
    let retrieve = |hash| {
        let path = chunk_dir.join(hex::encode(hash));
        Ok(Bytes::from(std::fs::read(path)?))
    };
    
    // Use with data map operations (`data_map` comes from a prior `encrypt` call)
    let shrunk_map = shrink_data_map(data_map, store)?;
    let root_map = get_root_data_map(shrunk_map, retrieve)?;
    
    Ok(())
}
```

### Python Usage

#### Installation

```bash
pip install self-encryption
```

#### Basic Operations

```python
from self_encryption import encrypt, decrypt

# Basic in-memory encryption/decryption
def basic_example():
    # Create test data (must be at least 3072 bytes)
    data = b"Hello, World!" * 1000
    
    # Encrypt data - returns data map and encrypted chunks
    data_map, chunks = encrypt(data)
    print(f"Data encrypted into {len(chunks)} chunks")
    print(f"Data map has child level: {data_map.child()}")
    
    # Decrypt data
    decrypted = decrypt(data_map, chunks)
    assert data == decrypted
```

#### File Operations

```python
from pathlib import Path
from self_encryption import encrypt_from_file, decrypt_from_storage, streaming_encrypt_from_file

def file_example():
    # Setup paths
    input_path = Path("large_file.dat")
    chunk_dir = Path("chunks")
    output_path = Path("decrypted_file.dat")
    
    # Ensure chunk directory exists
    chunk_dir.mkdir(exist_ok=True)
    
    # Regular file encryption - stores all chunks at once
    data_map, chunk_names = encrypt_from_file(str(input_path), str(chunk_dir))
    print(f"File encrypted into {len(chunk_names)} chunks")
    
    # Streaming encryption - memory efficient for large files
    def store_chunk(name_hex: str, content: bytes) -> None:
        chunk_path = chunk_dir / name_hex
        chunk_path.write_bytes(content)
    
    data_map = streaming_encrypt_from_file(str(input_path), store_chunk)
    print("File encrypted with streaming method")
    
    # Create chunk retrieval function
    def get_chunk(hash_hex: str) -> bytes:
        chunk_path = chunk_dir / hash_hex
        return chunk_path.read_bytes()
    
    # Decrypt file
    decrypt_from_storage(data_map, str(output_path), get_chunk)
```

#### Advanced Features

```python
from self_encryption import streaming_encrypt_from_file, shrink_data_map, get_root_data_map

def advanced_example():
    # Create custom storage backend
    chunk_store = {}
    
    def store_chunk(name_hex: str, content: bytes) -> None:
        chunk_store[name_hex] = content
    
    def get_chunk(name_hex: str) -> bytes:
        return chunk_store[name_hex]
    
    # Use streaming encryption with custom storage
    data_map = streaming_encrypt_from_file("large_file.dat", store_chunk)
    
    # Get root data map for hierarchical storage
    root_map = get_root_data_map(data_map, get_chunk)
    print(f"Root data map level: {root_map.child()}")
```

## Implementation Details

### Core Process

- Files are split into chunks of up to 1MB
- Each chunk is processed in three steps:
  1. Compression (using Brotli)
  2. Encryption (using AES-256-CBC)
  3. XOR obfuscation
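The pipeline shape can be sketched in Python; here `zlib` stands in for Brotli and the AES step is elided, so this shows only the compress → (encrypt) → XOR ordering and its reversibility, not the library's actual transform:

```python
import hashlib
import zlib

def xor_with_pad(data: bytes, pad: bytes) -> bytes:
    # XOR each byte with the (repeating) pad; XOR is its own inverse.
    return bytes(b ^ pad[i % len(pad)] for i, b in enumerate(data))

def obfuscate_chunk(chunk: bytes, pad: bytes) -> bytes:
    compressed = zlib.compress(chunk)  # stand-in for Brotli
    # In the real library, AES-CBC encryption happens here, between
    # compression and the XOR step.
    return xor_with_pad(compressed, pad)

def deobfuscate_chunk(blob: bytes, pad: bytes) -> bytes:
    # Decryption reverses the steps in the opposite order.
    return zlib.decompress(xor_with_pad(blob, pad))

pad = hashlib.sha3_256(b"neighbour hashes").digest()
chunk = b"example chunk contents" * 50
assert deobfuscate_chunk(obfuscate_chunk(chunk, pad), pad) == chunk
```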

### Key Generation and Security

- Each chunk's encryption uses keys derived from the content hashes of three chunks:

  ```
  For chunk N:
  - Uses hashes from chunks [N, N+1, N+2]
  - Combined hash = hash(N) || hash(N+1) || hash(N+2)  (three 32-byte digests)
  - Split into:
    - Pad (first X bytes)
    - Key (next 16 bytes)
    - IV  (final 16 bytes)
  ```

- This creates a chain of dependencies where each chunk's encryption depends on its neighbors
- Provides both convergent encryption and additional security through the interdependencies
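A hypothetical Python sketch of the key-material split; the 64/16/16-byte layout is an assumption chosen so that three 32-byte SHA3-256 digests (96 bytes) are consumed exactly:

```python
import hashlib

def sha3(b: bytes) -> bytes:
    return hashlib.sha3_256(b).digest()  # 32-byte digest

def key_material(h_n: bytes, h_n1: bytes, h_n2: bytes):
    """Split the 96-byte concatenation of three chunk hashes into
    pad/key/IV. Sizes (64-byte pad, 16-byte key, 16-byte IV) are
    assumptions for illustration, not taken from the library source."""
    combined = h_n + h_n1 + h_n2  # 3 x 32 = 96 bytes
    pad, key, iv = combined[:64], combined[64:80], combined[80:96]
    return pad, key, iv

pad, key, iv = key_material(sha3(b"chunk0"), sha3(b"chunk1"), sha3(b"chunk2"))
assert (len(pad), len(key), len(iv)) == (64, 16, 16)
```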

### Encryption Flow

1. Content Chunking:
   - File is split into chunks of optimal size
   - Each chunk's raw content is hashed (SHA3-256)
   - These hashes become part of the DataMap

2. Per-Chunk Processing:

   ```rust
   // For each chunk:
   1. Compress data using Brotli
   2. Generate key materials:
      - Combine three consecutive chunk hashes
      - Extract pad, key, and IV
   3. Encrypt compressed data using AES-256-CBC
   4. XOR encrypted data with pad for obfuscation
   ```

3. DataMap Creation:
   - Stores both pre-encryption (src) and post-encryption (dst) hashes
   - Maintains chunk ordering and size information
   - Required for both encryption and decryption processes
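Step 1 (chunking and hashing) can be sketched as follows; the fixed 1 MiB split is a simplification, since the real library also balances chunk sizes and enforces a minimum:

```python
import hashlib

MAX_CHUNK = 1024 * 1024  # chunks of up to 1 MiB, as described above

def chunk_and_hash(data: bytes):
    """Split data into fixed-size chunks and record each chunk's
    pre-encryption (src) hash with SHA3-256, as in step 1 above."""
    chunks = [data[i:i + MAX_CHUNK] for i in range(0, len(data), MAX_CHUNK)]
    src_hashes = [hashlib.sha3_256(c).hexdigest() for c in chunks]
    return chunks, src_hashes

data = b"x" * (2 * MAX_CHUNK + 100)  # 2 MiB plus a small tail
chunks, src_hashes = chunk_and_hash(data)
assert len(chunks) == 3 and len(src_hashes) == 3
```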

### Decryption Flow

1. Chunk Retrieval:
   - Use DataMap to identify required chunks
   - Retrieve chunks using dst_hash as identifier

2. Per-Chunk Processing:

   ```rust
   // For each chunk:
   1. Regenerate key materials using src_hashes from DataMap
   2. Remove XOR obfuscation using pad
   3. Decrypt using AES-256-CBC with key and IV
   4. Decompress using Brotli
   ```

3. Chunk Reassembly:
   - Chunks are processed in order specified by DataMap
   - Reassembled into original file

### Storage Features

- Flexible backend support through trait-based design
- Supports both memory and disk-based storage
- Streaming operations for memory efficiency
- Hierarchical data maps for large files:

  ```rust
  // DataMap shrinking for large files
  1. Serialize large DataMap
  2. Encrypt serialized map using same process
  3. Create new DataMap with fewer chunks
  4. Repeat until manageable size reached
  ```
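The shrinking loop can be sketched with a toy "data map" of hex chunk names; `CHUNK_LIMIT` and the comma-joined serialization are illustrative stand-ins, not the library's actual format:

```python
import hashlib

CHUNK_LIMIT = 3  # assumed threshold: shrink until the map is this small

def shrink(chunk_hashes, store):
    """Serialize the current 'data map' (here just a list of hex names),
    store it as a chunk of its own, and replace the map with the single
    hash of that serialization. Repeat until the map is small enough."""
    level = 0
    while len(chunk_hashes) > CHUNK_LIMIT:
        serialized = ",".join(chunk_hashes).encode()
        name = hashlib.sha3_256(serialized).hexdigest()
        store[name] = serialized   # persisted like any other chunk
        chunk_hashes = [name]      # new, smaller map
        level += 1
    return chunk_hashes, level

store = {}
big_map = [hashlib.sha3_256(bytes([i])).hexdigest() for i in range(10)]
root, levels = shrink(big_map, store)
assert len(root) <= CHUNK_LIMIT
```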

### Security Properties

- Content-based convergent encryption
- Additional security through chunk interdependencies
- Self-validating chunks through hash verification
- No single point of failure in chunk storage
- Tamper-evident through hash chains

### Performance Optimizations

- Parallel chunk processing where possible
- Streaming support for large files
- Efficient memory usage through chunking
- Optimized compression settings
- Configurable chunk sizes

This implementation provides a balance of:

- Security (through multiple encryption layers)
- Deduplication (through convergent encryption)
- Performance (through parallelization and streaming)
- Flexibility (through modular storage backends)

## License

Licensed under the GNU General Public License (GPL), version 3 ([LICENSE](LICENSE), http://www.gnu.org/licenses/gpl-3.0.en.html).

### Linking Exception

self_encryption is licensed under GPLv3 with a linking exception. This means you can link to and use the library from any program, proprietary or open source, paid or gratis. However, if you modify self_encryption, you must distribute the source of your modified version under the terms of the GPLv3.

See the LICENSE file for more details.

## Contributing

Want to contribute? Great :tada:

There are many ways to give back to the project, whether it be writing new code, fixing bugs, or just reporting errors. All forms of contributions are encouraged!

For instructions on how to contribute, see our [Guide to contributing](https://github.com/maidsafe/QA/blob/master/CONTRIBUTING.md).


            
