minerva-torch

- Name: minerva-torch
- Version: 0.0.1
- Home page: https://github.com/kyegomez/Minerva
- Summary: Transformers at zeta scales
- Author: Zeta Team (kye@apac.ai)
- License: MIT
- Requires Python: >=3.8,<4.0
- Keywords: transformers, zeta scale
- Uploaded: 2023-11-25 09:34:27
- Requirements: none recorded
# Minerva: Unleashing the Secrets of Advanced Mathematics 🏛️🔢

<!-- ![Minerva Next Generation Open Source Language Model](/Minerva-banner.png) -->
Minerva is a groundbreaking language model that pushes the boundaries of mathematical understanding and problem-solving. Designed around an advanced-math theme, Minerva embodies the spirit of renowned mathematicians such as Euclid, Pythagoras, and Archimedes. By drawing on their wisdom, Minerva offers unparalleled capabilities in mathematical reasoning and exploration.

---

<div align="center">

[![GitHub issues](https://img.shields.io/github/issues/kyegomez/Minerva)](https://github.com/kyegomez/Minerva/issues) [![GitHub forks](https://img.shields.io/github/forks/kyegomez/Minerva)](https://github.com/kyegomez/Minerva/network) [![GitHub stars](https://img.shields.io/github/stars/kyegomez/Minerva)](https://github.com/kyegomez/Minerva/stargazers) [![GitHub license](https://img.shields.io/github/license/kyegomez/Minerva)](https://github.com/kyegomez/Minerva/blob/main/LICENSE)

</div>

<div align="center">

[![Share on Twitter](https://img.shields.io/twitter/url/https/twitter.com/cloudposse.svg?style=social&label=Share%20%40kyegomez/Minerva)](https://twitter.com/intent/tweet?text=Unleash%20the%20power%20of%20Minerva%20-%20the%20advanced-themed%20MATH%20LLM%20from%20Google!&url=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMinerva) [![Share on Facebook](https://img.shields.io/badge/Share-%20facebook-blue)](https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMinerva) [![Share on LinkedIn](https://img.shields.io/badge/Share-%20linkedin-blue)](https://www.linkedin.com/shareArticle?mini=true&url=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMinerva&title=&summary=&source=)

[![Share on Reddit](https://img.shields.io/badge/-Share%20on%20Reddit-orange)](https://www.reddit.com/submit?url=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMinerva&title=Unleash%20the%20power%20of%20Minerva%20-%20the%20advanced-themed%20MATH%20LLM%20from%20Google!) [![Share on Hacker News](https://img.shields.io/badge/-Share%20on%20Hacker%20News-orange)](https://news.ycombinator.com/submitlink?u=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMinerva&t=Unleash%20the%20power%20of%20Minerva%20-%20the%20advanced-themed%20MATH%20LLM%20from%20Google!) [![Share on Pinterest](https://img.shields.io/badge/-Share%20on%20Pinterest-red)](https://pinterest.com/pin/create/button/?url=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMinerva&media=https%3A%2F%2Fexample.com%2Fimage.jpg&description=Unleash%20the%20power%20of%20Minerva%20-%20the%20advanced-themed%20MATH%20LLM%20from%20Google!) [![Share on WhatsApp](https://img.shields.io/badge/-Share%20on%20WhatsApp-green)](https://api.whatsapp.com/send?text=Unleash%20the%20power%20of%20Minerva%20-%20the%20advanced-themed%20MATH%20LLM%20from%20Google!%20%23Minerva%20%23AI%0A%0Ahttps%3A%2F%2Fgithub.com%2Fkyegomez%2FMinerva)

</div>

---





# Install

```shell
pip install minerva-torch
```


# Usage
```python
import torch
from minerva import Minerva, Train

# A batch of random token IDs: batch size 1, sequence length 1024,
# drawn from a 20,000-token vocabulary
x = torch.randint(0, 20000, (1, 1024))

# Forward pass through the model
model = Minerva()
model(x)

# Or launch training
Train()
```

# Training

To train Minerva, follow these steps:

1. Configure the training settings by setting the environment variables:

   - `ENTITY_NAME`: Your wandb project name
   - `OUTPUT_DIR`: Specify the output directory for saving the weights (e.g., `./weights`)

2. Launch the training process with Hugging Face Accelerate:

```shell
accelerate config
accelerate launch train_distributed_accelerate.py
```
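For step 1, the environment variables can be exported in the shell before launching. A minimal sketch — the values below are placeholders, not project defaults:

```shell
# Placeholder values -- substitute your own wandb entity/project and weights path
export ENTITY_NAME="my-wandb-project"
export OUTPUT_DIR="./weights"

# Confirm both variables are set before launching
echo "$ENTITY_NAME -> $OUTPUT_DIR"

# Then configure and launch (requires the accelerate package):
# accelerate config
# accelerate launch train_distributed_accelerate.py
```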

## Dataset Building

To build a custom dataset for Minerva, you can preprocess the data using the `build_dataset.py` script. This script performs tasks such as pre-tokenization, data chunking, and uploading to the Hugging Face Hub. The datasets used in the paper are summarized below:

| Dataset | Description |
|---------|-------------|
| Mathematical Web Pages | Web pages containing mathematical expressions in MathJax format, cleaned to preserve math notation |
| arXiv | 2 million arXiv papers up to Feb 2021, in LaTeX format |
| General Natural Language Data | Same dataset used to pretrain PaLM models |
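The chunking step the script performs can be sketched independently of `build_dataset.py` itself. This is a generic illustration, not the project's actual implementation, and the sequence length here is arbitrary:

```python
from typing import Iterable, Iterator, List

def chunk_tokens(token_stream: Iterable[int], seq_len: int) -> Iterator[List[int]]:
    """Greedily pack a stream of token IDs into fixed-length chunks,
    dropping the trailing partial chunk (a common pre-training choice)."""
    buffer: List[int] = []
    for tok in token_stream:
        buffer.append(tok)
        if len(buffer) == seq_len:
            yield buffer
            buffer = []

# 10 tokens packed into chunks of 4: two full chunks, the last 2 tokens dropped
chunks = list(chunk_tokens(range(10), 4))
# chunks == [[0, 1, 2, 3], [4, 5, 6, 7]]
```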

The mathematical web pages and arXiv datasets supply technical and mathematical content, while the general natural language data provides broad coverage of everyday language.

According to the paper, mathematical web pages and arXiv each account for 47.5% of the total data; the remaining 5% is general natural language data, a subset of the PaLM pretraining corpus.
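That 47.5 / 47.5 / 5 split can be expressed as sampling weights when interleaving the three sources during training. A minimal illustration — the source names here are made up for the sketch:

```python
import random

# Data-mixture weights from the paper: math web pages and arXiv at 47.5%
# each, general natural language at 5% (47.5 + 47.5 + 5 = 100)
sources = ["math_web", "arxiv", "general_nl"]
weights = [0.475, 0.475, 0.05]

def sample_source(rng: random.Random) -> str:
    """Pick which dataset the next training example is drawn from."""
    return rng.choices(sources, weights=weights, k=1)[0]

rng = random.Random(0)
counts = {s: 0 for s in sources}
for _ in range(10_000):
    counts[sample_source(rng)] += 1
# Empirical proportions land close to the configured weights
```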

## Roadmap 🗺️📍

- [ ] Create a dataset of arXiv papers
            
