# Solr to Elasticsearch Migrator
This will migrate a Solr node to an Elasticsearch index.
## Requirements
* Python 3+
* elasticsearch
* pysolr
## Usage
```
usage: solr-to-es [-h] [--solr-query SOLR_QUERY] [--solr-fields COMMA_SEP_FIELDS]
[--rows-per-page ROWS_PER_PAGE] [--es-timeout ES_TIMEOUT]
solr_url elasticsearch_url elasticsearch_index doc_type
```
The following example will page through all documents in the local Solr collection and submit them to the local Elasticsearch server, indexing them into `<<collection_name>>` with a document type of `solr_docs`.
```bash
solr-to-es http://localhost:8983/solr/<<collection_name>> http://localhost:9200 <<collection_name>> solr_docs
```
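Under the hood, the migration boils down to paging through Solr results and bulk-indexing each page into Elasticsearch. Here is a minimal sketch of that loop using the `pysolr` and `elasticsearch` client libraries from the Requirements; the URLs, index name, and page size below are illustrative assumptions, not the tool's actual code:
```python
import pysolr
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

# Illustrative endpoints; substitute your own Solr collection and ES host.
solr = pysolr.Solr('http://localhost:8983/solr/my_collection')
es = Elasticsearch('http://localhost:9200')

rows_per_page = 500  # mirrors --rows-per-page
start = 0
while True:
    # Fetch one page of Solr results (the query mirrors --solr-query).
    results = solr.search('*:*', start=start, rows=rows_per_page)
    if not results.docs:
        break
    # Bulk-submit the page of documents to the target Elasticsearch index.
    actions = [{'_index': 'es_index', '_source': doc} for doc in results.docs]
    bulk(es, actions)
    start += rows_per_page
```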
* `solr_url` is the full URL of your Solr core or collection.
* `elasticsearch_url` is the URL of your Elasticsearch server.
* `elasticsearch_index` is the Elasticsearch index the Solr documents will be submitted to.
* `doc_type` is the document type Elasticsearch should assign to the imported documents.
* `--solr-query` defaults to `*:*`.
* `--solr-fields` defaults to empty, i.e. all fields are exported.
* `--rows-per-page` defaults to `500`.
* `--es-timeout` defaults to `60`.
* `--es-user` sets the username for Elasticsearch authentication.
* `--es-password` sets the password for Elasticsearch authentication.
* `--es-max-retries` is the maximum number of times a document will be retried when a 429 (Too Many Requests) response is received; set it to `0` to disable retries on 429.
* `--es-initial-backoff` is the number of seconds to wait before the first retry; each subsequent retry waits `initial_backoff * 2**retry_number` seconds (see the sketch after this list).
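To make the backoff schedule concrete, here is a hypothetical helper (not the tool's actual code) that computes the documented wait times:
```python
def backoff_delays(initial_backoff, max_retries):
    # The wait before retry N is initial_backoff * 2**N, as documented for
    # --es-initial-backoff; the retry count is capped by --es-max-retries.
    for retry_number in range(max_retries):
        yield initial_backoff * 2 ** retry_number

# With --es-initial-backoff 2 and --es-max-retries 4, a document that keeps
# receiving 429 responses is retried after 2, 4, 8, and 16 seconds.
print(list(backoff_delays(2, 4)))  # [2, 4, 8, 16]
```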
## Install
Run `python setup.py install` to install the script.
### Demo
Here is an example that grabs the more than 114,000 journal articles about *animals* from the Plos.org API.
```bash
docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" elasticsearch
solr-to-es --solr-query animal http://api.plos.org/search localhost:9200 es_plos solr_docs
curl http://localhost:9200/_cat/indices?v
```
_Note: you may get a 403 Forbidden error from the script because the PLOS API doesn't allow deep paging; however, you will still have documents in your ES cluster._