**tap-dbf 0.1.15**: Singer tap for DBF files

- Released: 2024-12-24
- Requires Python: >=3.9
- Keywords: ELT, dBase, dbf, singer.io
- Repository: https://github.com/edgarrmondragon/tap-dbf
# tap-dbf

Singer tap for the [dBase file format](https://en.wikipedia.org/wiki/.dbf).

## Configuration

| Setting | Required | Default | Description |
|:--------|:--------:|:-------:|:------------|
| path | True     | None    | Glob expression where the files are located. Stream names will be extracted from the file name. |
| fs_root | False    | file:// | The root of the filesystem to read from. |
| ignore_missing_memofile | False    | false   | Whether to proceed with reading the file even if the [memofile] is not present. |
| s3 | False    | None    | S3 configuration. |
| s3.key | False    | None    | The AWS key ID. |
| s3.secret | False    | None    | The AWS secret key. |
| s3.endpoint_url | False    | None    | The S3 endpoint URL. |
| gcs | False    | None    | GCS configuration. |
| gcs.token | False    | None    | OAuth 2.0 token for GCS. |
| stream_maps | False    | None    | Config object for stream maps capability. For more information check out [Stream Maps](https://sdk.meltano.com/en/latest/stream_maps.html). |
| stream_map_config | False    | None    | User-defined config values to be used within map expressions. |
| faker_config | False    | None    | Config for the [`Faker`](https://faker.readthedocs.io/en/master/) instance variable `fake` used within map expressions. Only applicable if the plugin specifies `faker` as an additional dependency (through the `singer-sdk` `faker` extra or directly). |
| faker_config.seed | False    | None    | Value to seed the Faker generator for deterministic output: https://faker.readthedocs.io/en/master/#seeding-the-generator |
| faker_config.locale | False    | None    | One or more LCID locale strings to produce localized output for: https://faker.readthedocs.io/en/master/#localization |
| flattening_enabled | False    | None    | `True` to enable schema flattening and automatically expand nested properties. |
| flattening_max_depth | False    | None    | The max depth to flatten schemas. |
| batch_config | False    | None    | Batch configuration. |
| batch_config.encoding | False    | None    | Specifies the format and compression of the batch files. |
| batch_config.encoding.format | False    | None    | Format to use for batch files. |
| batch_config.encoding.compression | False    | None    | Compression format to use for batch files. |
| batch_config.storage | False    | None    | Defines the storage layer to use when writing batch files. |
| batch_config.storage.root | False    | None    | Root path to use when writing batch files. |
| batch_config.storage.prefix | False    | None    | Prefix to use when writing batch files. |

### JSON example

```json
{
  "path": "tests/data/files/*.dbf",
  "ignore_missing_memofile": true
}
```
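
The optional singer-sdk settings from the table can be combined in the same file. Below is a minimal sketch of a stream map that adds a hashed column and drops another, assuming the standard SDK stream-map expression syntax; the `customers` stream and its `id` and `ssn` columns are hypothetical and depend on your DBF files:

```json
{
  "path": "tests/data/files/*.dbf",
  "ignore_missing_memofile": true,
  "stream_maps": {
    "customers": {
      "id_hashed": "md5(record['id'])",
      "ssn": null
    }
  }
}
```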

## Filesystems

### Local

Example configuration:

```json
{
  "path": "/files/*.dbf",
  "fs_root": "file://data",
  "ignore_missing_memofile": true
}
```

The `fs_root` key is optional and defaults to the current working directory:

```json
{
  "path": "data/files/*.dbf",
  "ignore_missing_memofile": true
}
```

### S3

You need to install the package with the `s3` extra:

```shell
pip install 'tap-dbf[s3]'
```

Example configuration:

```json
{
  "path": "/*.dbf",
  "fs_root": "s3://files",
  "ignore_missing_memofile": true,
  "s3": {
    "key": "someKey",
    "secret": "someSecret",
    "endpoint_url": "http://localhost:9000"
  }
}
```
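
If `key` and `secret` are omitted, the underlying `s3fs` filesystem generally falls back to the standard AWS credential chain (environment variables, shared credentials file, or an instance profile). A sketch under that assumption, relying entirely on ambient credentials:

```json
{
  "path": "/*.dbf",
  "fs_root": "s3://files",
  "ignore_missing_memofile": true
}
```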

### Google Cloud Storage

You need to install the package with the `gcs` extra:

```shell
pip install 'tap-dbf[gcs]'
```

Example configuration:

```json
{
  "path": "/*.dbf",
  "fs_root": "gcs://files",
  "ignore_missing_memofile": true,
  "gcs": {
    "token": "cloud"
  }
}
```

See https://gcsfs.readthedocs.io/en/latest/#credentials for more information about the `token` key.
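
Besides the `"cloud"` shorthand shown above, `gcsfs` also accepts `"anon"` for public buckets or a path to a service-account key file as the token. A sketch assuming the value is passed through to `gcsfs` unchanged; the key-file path is hypothetical:

```json
{
  "path": "/*.dbf",
  "fs_root": "gcs://files",
  "ignore_missing_memofile": true,
  "gcs": {
    "token": "/path/to/service-account.json"
  }
}
```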

## Roadmap

- Google Drive filesystem
- Dropbox filesystem

[memofile]: https://en.wikipedia.org/wiki/.dbf#Memo_fields_and_the_.DBT_file

            
