db-to-sqlite

Name: db-to-sqlite
Version: 1.5
Home page: https://datasette.io/tools/db-to-sqlite
Summary: CLI tool for exporting tables or queries from any SQL database to a SQLite file
Upload time: 2023-02-07 01:11:28
Author: Simon Willison
Requires Python: >=3.7
License: Apache License, Version 2.0
# db-to-sqlite

[![PyPI](https://img.shields.io/pypi/v/db-to-sqlite.svg)](https://pypi.python.org/pypi/db-to-sqlite)
[![Changelog](https://img.shields.io/github/v/release/simonw/db-to-sqlite?include_prereleases&label=changelog)](https://github.com/simonw/db-to-sqlite/releases)
[![Tests](https://github.com/simonw/db-to-sqlite/workflows/Test/badge.svg)](https://github.com/simonw/db-to-sqlite/actions?query=workflow%3ATest)
[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/db-to-sqlite/blob/main/LICENSE)

CLI tool for exporting tables or queries from any SQL database to a SQLite file.

## Installation

Install from PyPI like so:

    pip install db-to-sqlite

If you want to use it with MySQL, you can install the extra dependency like this:

    pip install 'db-to-sqlite[mysql]'

Installing the `mysqlclient` library on OS X can be tricky - I've found [this recipe](https://gist.github.com/simonw/90ac0afd204cd0d6d9c3135c3888d116) to work (run that before installing `db-to-sqlite`).

For PostgreSQL, use this:

    pip install 'db-to-sqlite[postgresql]'
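
If you need both database drivers, pip's comma-separated extras syntax lets you install the two extras in one step:

    pip install 'db-to-sqlite[mysql,postgresql]'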

## Usage
```
Usage: db-to-sqlite [OPTIONS] CONNECTION PATH

  Load data from any database into SQLite.

  PATH is a path to the SQLite file to create, e.g. /tmp/my_database.db

  CONNECTION is a SQLAlchemy connection string, for example:

      postgresql://localhost/my_database
      postgresql://username:passwd@localhost/my_database

      mysql://root@localhost/my_database
      mysql://username:passwd@localhost/my_database

  More: https://docs.sqlalchemy.org/en/13/core/engines.html#database-urls

Options:
  --version                     Show the version and exit.
  --all                         Detect and copy all tables
  --table TEXT                  Specific tables to copy
  --skip TEXT                   When using --all skip these tables
  --redact TEXT...              (table, column) pairs to redact with ***
  --sql TEXT                    Optional SQL query to run
  --output TEXT                 Table in which to save --sql query results
  --pk TEXT                     Optional column to use as a primary key
  --index-fks / --no-index-fks  Should foreign keys have indexes? Default on
  -p, --progress                Show progress bar
  --postgres-schema TEXT        PostgreSQL schema to use
  --help                        Show this message and exit.
```

For example, to save the content of the `blog_entry` table from a PostgreSQL database to a local file called `blog.db` you could do this:

    db-to-sqlite "postgresql://localhost/myblog" blog.db \
        --table=blog_entry

You can specify `--table` more than once.
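
For instance, to copy two tables in a single run (the second table name here is just illustrative):

    db-to-sqlite "postgresql://localhost/myblog" blog.db \
        --table=blog_entry \
        --table=blog_tag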

You can also save the data from all of your tables, effectively creating a SQLite copy of your entire database. Any foreign key relationships will be detected and added to the SQLite database. For example:

    db-to-sqlite "postgresql://localhost/myblog" blog.db \
        --all

When running `--all` you can specify tables to skip using `--skip`:

    db-to-sqlite "postgresql://localhost/myblog" blog.db \
        --all \
        --skip=django_migrations

If you want to save the results of a custom SQL query, do this:

    db-to-sqlite "postgresql://localhost/myblog" output.db \
        --output=query_results \
        --sql="select id, title, created from blog_entry" \
        --pk=id

The `--output` option specifies the table that should contain the results of the query.
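
The `--redact` option shown in the help above takes a table name and a column name and replaces that column's values with `***` in the copy. A sketch, assuming an `auth_user` table with a `password` column:

    db-to-sqlite "postgresql://localhost/myblog" blog.db \
        --all \
        --redact auth_user password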

## Using db-to-sqlite with PostgreSQL schemas

If the tables you want to copy from your PostgreSQL database aren't in the default schema, you can specify an alternate one with the `--postgres-schema` option:

    db-to-sqlite "postgresql://localhost/myblog" blog.db \
        --all \
        --postgres-schema my_schema

## Using db-to-sqlite with MS SQL

The best way to build the connection string needed for the MS SQL examples below is to use `urllib.parse.quote_plus()` from the Python standard library:

    import urllib.parse

    params = urllib.parse.quote_plus(
        "DRIVER={SQL Server Native Client 11.0};"
        "SERVER=localhost;"
        "DATABASE=my_database;"
        "Trusted_Connection=yes;"
    )

The above resolves to:

    DRIVER%3D%7BSQL+Server+Native+Client+11.0%7D%3B+SERVER%3Dlocalhost%3B+DATABASE%3Dmy_database%3B+Trusted_Connection%3Dyes

You can then use that encoded string as the `odbc_connect` parameter in a `mssql+pyodbc` connection string:

    mssql+pyodbc:///?odbc_connect=DRIVER%3D%7BSQL+Server+Native+Client+11.0%7D%3B+SERVER%3Dlocalhost%3B+DATABASE%3Dmy_database%3B+Trusted_Connection%3Dyes
    mssql+pyodbc:///?odbc_connect=DRIVER%3D%7BSQL+Server+Native+Client+11.0%7D%3B+SERVER%3Dlocalhost%3B+DATABASE%3Dmy_database%3B+UID%3Dusername%3B+PWD%3Dpasswd
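
A sketch of passing one of these connection strings to the CLI (the output filename is illustrative):

    db-to-sqlite "mssql+pyodbc:///?odbc_connect=DRIVER%3D%7BSQL+Server+Native+Client+11.0%7D%3B+SERVER%3Dlocalhost%3B+DATABASE%3Dmy_database%3B+Trusted_Connection%3Dyes" \
        my_database.db --all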

## Using db-to-sqlite with Heroku Postgres

If you run an application on [Heroku](https://www.heroku.com/) using their [Postgres database product](https://www.heroku.com/postgres), you can use the `heroku config` command to access a compatible connection string:

    $ heroku config --app myappname | grep HEROKU_POSTG
    HEROKU_POSTGRESQL_OLIVE_URL: postgres://username:password@ec2-xxx-xxx-xxx-x.compute-1.amazonaws.com:5432/dbname

You can pass this to `db-to-sqlite` to create a local SQLite database with the data from your Heroku instance.

You can even do this using a bash one-liner:

    $ db-to-sqlite $(heroku config --app myappname | grep HEROKU_POSTG | cut -d: -f 2-) \
        /tmp/heroku.db --all -p
    1/23: django_migrations
    ...
    17/23: blog_blogmark
    [####################################]  100%
    ...

## Related projects

* [Datasette](https://github.com/simonw/datasette): A tool for exploring and publishing data. Works great with SQLite files generated using `db-to-sqlite`.
* [sqlite-utils](https://github.com/simonw/sqlite-utils): Python CLI utility and library for manipulating SQLite databases.
* [csvs-to-sqlite](https://github.com/simonw/csvs-to-sqlite): Convert CSV files into a SQLite database.

## Development

To set up this tool locally, first check out the code. Then create a new virtual environment:

    cd db-to-sqlite
    python3 -m venv venv
    source venv/bin/activate

Or if you are using `pipenv`:

    pipenv shell

Now install the dependencies and test dependencies:

    pip install -e '.[test]'

To run the tests:

    pytest

This will skip tests against MySQL or PostgreSQL if you do not have their additional dependencies installed.

You can install those extra dependencies like so:

    pip install -e '.[test_mysql,test_postgresql]'

You can alternatively use `pip install psycopg2-binary` if you cannot install the `psycopg2` dependency used by the `test_postgresql` extra.

See [Running a MySQL server using Homebrew](https://til.simonwillison.net/homebrew/mysql-homebrew) for tips on running the tests against MySQL on macOS, including how to install the `mysqlclient` dependency.

By default, the PostgreSQL and MySQL tests expect to run against servers on localhost. You can use environment variables to point them at different test database servers:

- `MYSQL_TEST_DB_CONNECTION` - defaults to `mysql://root@localhost/test_db_to_sqlite`
- `POSTGRESQL_TEST_DB_CONNECTION` - defaults to `postgresql://localhost/test_db_to_sqlite`

The database you indicate in the environment variable - `test_db_to_sqlite` by default - will be deleted and recreated on every test run.
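
For example, to point the PostgreSQL tests at a different server (the hostname here is illustrative):

    POSTGRESQL_TEST_DB_CONNECTION="postgresql://postgres@db.example.com/test_db_to_sqlite" pytest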

            
