atom-torch


Name: atom-torch
Version: 0.0.2
Home page: https://github.com/kyegomez/atom
Summary: atom - Pytorch
Upload time: 2023-09-08 22:17:33
Docs URL: None
Author: Kye Gomez
Requires Python: >=3.6,<4.0
License: MIT
Keywords: artificial intelligence, deep learning, optimizers, prompt engineering
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
            [![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)

# Atom
Atom is a fine-tuned LLaMA for creating better LLMs through PyTorch data!




## Installation

You can install the package from source with pip:

```sh
git clone https://github.com/kyegomez/atom
cd atom
pip install -e .
```
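
As a quick sanity check after the editable install, you can confirm the package is registered and importable. This is a minimal sketch; the top-level module name `atom` is an assumption not confirmed by this README.

```sh
# Confirm pip registered the package
pip show atom-torch

# Confirm the module imports (module name `atom` is assumed)
python -c "import atom; print('atom imported OK')"
```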

### Training

To train the models, run `accelerate config` and enable DeepSpeed acceleration. `deepspeed/zero3.json` was the configuration file used for training.

```sh
# Launch training (after `accelerate config` has enabled DeepSpeed)
./train.sh
```
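
For orientation, here is a hedged sketch of the launch step, assuming `./train.sh` wraps `accelerate launch` and that `deepspeed/zero3.json` is the DeepSpeed config mentioned above. The `train.py` entry point and the exact flags are assumptions, not taken from this README.

```sh
# One-time interactive setup: select DeepSpeed and point it at deepspeed/zero3.json
accelerate config

# Hypothetical launch, roughly equivalent to what ./train.sh is assumed to do
accelerate launch \
  --use_deepspeed \
  --deepspeed_config_file deepspeed/zero3.json \
  train.py
```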

The tokenized training data is available on [Hugging Face](https://huggingface.co/datasets/emozilla/pg_books-tokenized-bos-eos-chunked-65536) and was derived from the [pg19](https://huggingface.co/datasets/emozilla/pg19) dataset.
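
As a hedged example of pulling that data locally with the Hugging Face `datasets` library (the dataset ID comes from the link above; installing `datasets` is assumed):

```sh
# Download and inspect the tokenized training data
pip install datasets
python -c "from datasets import load_dataset; print(load_dataset('emozilla/pg_books-tokenized-bos-eos-chunked-65536'))"
```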

### Evaluation

To reproduce the evaluations, install [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) with `pip install git+https://github.com/EleutherAI/lm-evaluation-harness` and then run the two provided scripts.

```sh
# Run the two provided evaluation scripts
./eval.sh
./eval-harness.sh
```
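
The scripts themselves are not reproduced here. As a hedged illustration of what a direct lm-evaluation-harness run looks like, assuming a recent harness release that exposes the `lm_eval` CLI (the model path and task list below are placeholders, not values from this README):

```sh
pip install git+https://github.com/EleutherAI/lm-evaluation-harness

# Placeholder model path and tasks; adjust to the checkpoint and benchmarks you need
lm_eval --model hf \
  --model_args pretrained=PATH_TO_FINETUNED_MODEL \
  --tasks hellaswag,arc_easy \
  --batch_size 8
```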

### Citation

```bibtex
@misc{peng2023yarn,
      title={YaRN: Efficient Context Window Extension of Large Language Models}, 
      author={Bowen Peng and Jeffrey Quesnelle and Honglu Fan and Enrico Shippole},
      year={2023},
      eprint={2309.00071},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/kyegomez/atom",
    "name": "atom-torch",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.6,<4.0",
    "maintainer_email": "",
    "keywords": "artificial intelligence,deep learning,optimizers,Prompt Engineering",
    "author": "Kye Gomez",
    "author_email": "kye@apac.ai",
    "download_url": "https://files.pythonhosted.org/packages/c3/1f/a948e5d9423c4b929246d79e49e91cc29132ca714a2cff2baeda30be09d6/atom_torch-0.0.2.tar.gz",
    "platform": null,
    "description": "[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)\n\n# Atom\nAtom is a finetuned LLAMA to create better LLMS through Pytorch Data!\n\n\n\n\n## Installation\n\nYou can install the package using pip\n\n```python\ngit clone https://github.com/jquesnelle/yarn\ncd Atom\npip install -e .\n```\n\n### Training\n\nTo train the models, run `accelerate config` and enable DeepSpeed acceleration. `deepspeed/zero3.json` was the configuration file used for training.\n\n```sh\n# ./train.sh\n```\n\nThe tokenized training data is available on [Hugging Face](https://huggingface.co/datasets/emozilla/pg_books-tokenized-bos-eos-chunked-65536) and was derived from the [pg19](https://huggingface.co/datasets/emozilla/pg19) dataset.\n\n### Evaluation\n\nTo reproduce the evaluations, install [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) with `pip install git+https://github.com/EleutherAI/lm-evaluation-harness` and then run the two provided scripts.\n\n```sh\n# ./eval.sh\n# ./eval-harness.sh\n```\n\n### Citation\n\n```\n@misc{peng2023yarn,\n      title={YaRN: Efficient Context Window Extension of Large Language Models}, \n      author={Bowen Peng and Jeffrey Quesnelle and Honglu Fan and Enrico Shippole},\n      year={2023},\n      eprint={2309.00071},\n      archivePrefix={arXiv},\n      primaryClass={cs.CL}\n}\n```",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "atom - Pytorch",
    "version": "0.0.2",
    "project_urls": {
        "Homepage": "https://github.com/kyegomez/atom",
        "Repository": "https://github.com/kyegomez/atom"
    },
    "split_keywords": [
        "artificial intelligence",
        "deep learning",
        "optimizers",
        "prompt engineering"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "80f46c4c5f1a17aa98a8040e7b40bf27b944c75eab06d3650f7113b3692607c1",
                "md5": "a5223b9049819cd45ba89e1ea5ced8f4",
                "sha256": "2ebe961645855f9a97a204bc4b0bf3ccdb1a3a643a37ec39edb4f90702c79800"
            },
            "downloads": -1,
            "filename": "atom_torch-0.0.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "a5223b9049819cd45ba89e1ea5ced8f4",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.6,<4.0",
            "size": 33631,
            "upload_time": "2023-09-08T22:17:31",
            "upload_time_iso_8601": "2023-09-08T22:17:31.908209Z",
            "url": "https://files.pythonhosted.org/packages/80/f4/6c4c5f1a17aa98a8040e7b40bf27b944c75eab06d3650f7113b3692607c1/atom_torch-0.0.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "c31fa948e5d9423c4b929246d79e49e91cc29132ca714a2cff2baeda30be09d6",
                "md5": "ab8e589b013234a4b621ac0aefa20832",
                "sha256": "6483a3a259491dbd88348fc14a657adf7c53e4f387bd3ecb7de4d21192a0ab76"
            },
            "downloads": -1,
            "filename": "atom_torch-0.0.2.tar.gz",
            "has_sig": false,
            "md5_digest": "ab8e589b013234a4b621ac0aefa20832",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.6,<4.0",
            "size": 32202,
            "upload_time": "2023-09-08T22:17:33",
            "upload_time_iso_8601": "2023-09-08T22:17:33.628548Z",
            "url": "https://files.pythonhosted.org/packages/c3/1f/a948e5d9423c4b929246d79e49e91cc29132ca714a2cff2baeda30be09d6/atom_torch-0.0.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-09-08 22:17:33",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "kyegomez",
    "github_project": "atom",
    "github_not_found": true,
    "lcname": "atom-torch"
}
        