| Name | not-again-ai |
| Version | 0.14.0 |
| home_page | https://github.com/DaveCoDev/not-again-ai |
| Summary | Designed to once and for all collect all the little things that come up over and over again in AI projects and put them in one place. |
| upload_time | 2024-10-21 01:30:18 |
| maintainer | None |
| docs_url | None |
| author | DaveCoDev |
| requires_python | <4.0,>=3.11 |
| license | MIT |
| keywords | |
| requirements | No requirements were recorded. |
# not-again-ai
[![GitHub Actions][github-actions-badge]](https://github.com/johnthagen/python-blueprint/actions)
[![Packaged with Poetry][poetry-badge]](https://python-poetry.org/)
[![Nox][nox-badge]](https://github.com/wntrblm/nox)
[![Ruff][ruff-badge]](https://github.com/astral-sh/ruff)
[![Type checked with mypy][mypy-badge]](https://mypy-lang.org/)
[github-actions-badge]: https://github.com/johnthagen/python-blueprint/workflows/python/badge.svg
[poetry-badge]: https://img.shields.io/badge/packaging-poetry-cyan.svg
[nox-badge]: https://img.shields.io/badge/%F0%9F%A6%8A-Nox-D85E00.svg
[black-badge]: https://img.shields.io/badge/code%20style-black-000000.svg
[ruff-badge]: https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json
[mypy-badge]: https://www.mypy-lang.org/static/mypy_badge.svg
**not-again-ai** is a collection of building blocks that come up over and over again when developing AI products. The key goals of this package are simple yet flexible interfaces and minimal dependencies. You are also encouraged to either **a)** use this as a template for your own Python package, or **b)** copy and paste functions into your own projects instead of installing the package. We make this easier by limiting the number of dependencies and using an MIT license.
**Documentation** is available in the individual **[notebooks](notebooks)**, in docstrings within the source, or auto-generated at [DaveCoDev.github.io/not-again-ai/](https://DaveCoDev.github.io/not-again-ai/).
# Installation
Requires: Python 3.11 or 3.12
Install the entire package from [PyPI](https://pypi.org/project/not-again-ai/) with:
```bash
$ pip install not_again_ai[llm,local_llm,statistics,viz]
```
Note that the local LLM extra requires separate installations and will not work out of the box because it is hardware dependent. Be sure to check the [notebooks](notebooks/local_llm/) for more details.
The package is split into subpackages, so you can install only the parts you need.
### Base
1. `pip install not_again_ai`
### Data
1. `pip install not_again_ai[data]`
1. `playwright install` to download the browser binaries.
### LLM
1. `pip install not_again_ai[llm]`
1. Set up the OpenAI API:
    1. Go to https://platform.openai.com/settings/profile?tab=api-keys to get your API key.
    1. (Optional) Set the `OPENAI_API_KEY` and the `OPENAI_ORG_ID` environment variables.
1. Set up Azure OpenAI (AOAI):
    1. Using AOAI requires Entra ID authentication. See https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/managed-identity for how to set this up for your AOAI deployment.
        * This requires the correct role assigned to your user account and being signed into the Azure CLI.
    1. (Optional) Set the `AZURE_OPENAI_ENDPOINT` environment variable.
1. Set up GitHub Models:
    1. Get a Personal Access Token from https://github.com/settings/tokens and set the `GITHUB_TOKEN` environment variable. The token does not need any permissions.
    1. Check the [GitHub Marketplace](https://github.com/marketplace/models) to see which models are available.
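For example, the optional environment variables above can be set in your shell before running your code; all of the values below are placeholders to substitute with your own:

```bash
# Placeholder values -- substitute your own keys and endpoint.
export OPENAI_API_KEY="sk-your-key"
export OPENAI_ORG_ID="org-your-org"                                     # optional
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"  # optional
export GITHUB_TOKEN="github_pat_your_token"  # no permissions required for GitHub Models
```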
### Local LLM
1. `pip install not_again_ai[llm,local_llm]`
1. Some HuggingFace transformers tokenizers are gated behind access requests. If you wish to use these, request access from HuggingFace on the model card, then set the `HF_TOKEN` environment variable to your HuggingFace API token, which can be found at https://huggingface.co/settings/tokens.
1. If you wish to use Ollama:
    1. Follow the instructions at https://github.com/ollama/ollama to install Ollama for your system.
    1. (Optional) [Add Ollama as a startup service (recommended)](https://github.com/ollama/ollama/blob/main/docs/linux.md#adding-ollama-as-a-startup-service-recommended).
    1. (Optional) To make the Ollama service accessible on your local network from a Linux server, add the following to the `/etc/systemd/system/ollama.service` file, which will make Ollama available at `http://<local_address>:11434`:
        ```ini
        [Service]
        ...
        Environment="OLLAMA_HOST=0.0.0.0"
        ```
    1. It is recommended to keep Ollama on the latest version. To update Ollama, check the [docs](https://github.com/ollama/ollama/blob/main/docs/). The command for Linux is: `curl -fsSL https://ollama.com/install.sh | sh`
1. HuggingFace transformers and other requirements are hardware dependent, so for providers other than Ollama this extra installs only some generic dependencies. Check the [notebooks](notebooks/local_llm/) for more details on what is available and how to install it.
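After installing and starting Ollama, you can verify that the service is reachable (assuming the default port 11434; the `/api/tags` endpoint lists the locally installed models):

```bash
# Expect a JSON response listing installed models, e.g. {"models": [...]}
curl -s http://localhost:11434/api/tags
```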
### Statistics
1. `pip install not_again_ai[statistics]`
### Visualization
1. `pip install not_again_ai[viz]`
# Development Information
The following information is relevant if you would like to contribute to this package or use it as a template for your own projects.
This package uses [Poetry](https://python-poetry.org/) to manage dependencies and
isolated [Python virtual environments](https://docs.python.org/3/library/venv.html). To proceed, be sure to first install [pipx](https://github.com/pypa/pipx#install-pipx)
and then [install Poetry](https://python-poetry.org/docs/#installing-with-pipx).
Install Poetry Plugin: Export
```bash
$ pipx inject poetry poetry-plugin-export
```
(Optional) Configure Poetry to use an in-project virtual environment:
```bash
$ poetry config virtualenvs.in-project true
```
## Dependencies
Dependencies are defined in [`pyproject.toml`](./pyproject.toml) and specific versions are locked
into [`poetry.lock`](./poetry.lock). This allows for exact reproducible environments across
all machines that use the project, both during development and in production.
To upgrade all dependencies to the versions defined in [`pyproject.toml`](./pyproject.toml):
```bash
$ poetry update
```
To install all dependencies (with all extra dependencies) into an isolated virtual environment:
> Append `--sync` to uninstall dependencies that are no longer in use from the virtual environment.
```bash
$ poetry install --all-extras
```
To [activate](https://python-poetry.org/docs/basic-usage#activating-the-virtual-environment) the
virtual environment that is automatically created by Poetry:
```bash
$ poetry shell
```
To deactivate the environment:
```bash
(.venv) $ exit
```
## Packaging
This project is designed as a Python package, meaning that it can be bundled up and redistributed
as a single compressed file.
Packaging is configured by:
- [`pyproject.toml`](./pyproject.toml)
To package the project as both a
[source distribution](https://packaging.python.org/en/latest/flow/#the-source-distribution-sdist) and
a [wheel](https://packaging.python.org/en/latest/specifications/binary-distribution-format/):
```bash
$ poetry build
```
This will generate `dist/not-again-ai-<version>.tar.gz` and `dist/not_again_ai-<version>-py3-none-any.whl`.
Read more about the [advantages of wheels](https://pythonwheels.com/) to understand why generating
wheel distributions is important.
## Publish Distributions to PyPI
Source and wheel redistributable packages can
be [published to PyPI](https://python-poetry.org/docs/cli#publish) or installed
directly from the filesystem using `pip`.
```bash
$ poetry publish
```
# Enforcing Code Quality
Automated code quality checks are performed using
[Nox](https://nox.thea.codes/en/stable/) and
[`nox-poetry`](https://nox-poetry.readthedocs.io/en/stable/). Nox will automatically create virtual
environments and run commands based on [`noxfile.py`](./noxfile.py) for unit testing, PEP 8 style
guide checking, type checking and documentation generation.
> Note: `nox` is installed into the virtual environment automatically by the `poetry install`
> command above. Run `poetry shell` to activate the virtual environment.
To run all default sessions:
```bash
(.venv) $ nox
```
## Unit Testing
Unit testing is performed with [pytest](https://pytest.org/). pytest has become the de facto Python
unit testing framework. Some key advantages over the built-in
[unittest](https://docs.python.org/3/library/unittest.html) module are:
1. Significantly less boilerplate needed for tests.
2. PEP 8 compliant names (e.g. `pytest.raises()` instead of `self.assertRaises()`).
3. Vibrant ecosystem of plugins.
pytest automatically discovers and runs tests by recursively searching for folders and `.py`
files prefixed with `test`, and within them any functions prefixed with `test`.
The `tests` folder is created as a Python package (i.e. there is an `__init__.py` file within it)
because this helps `pytest` uniquely namespace the test files. Without this, two test files cannot
be named the same, even if they are in different subdirectories.
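A minimal test file that follows these conventions might look like this (the `add` function is illustrative, not part of this package):

```python
# tests/test_example.py -- discovered because the file name and the
# test function below are both prefixed with "test".


def add(a: int, b: int) -> int:
    return a + b


def test_add() -> None:
    # Plain assert statements are all pytest needs -- no TestCase boilerplate.
    assert add(2, 3) == 5
```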
Code coverage is provided by the [pytest-cov](https://pytest-cov.readthedocs.io/en/latest/) plugin.
When running a unit test Nox session (e.g. `nox -s test`), an HTML report is generated in
the `htmlcov` folder showing each source file and which lines were executed during unit testing.
Open `htmlcov/index.html` in a web browser to view the report. Code coverage reports help identify
areas of the project that are currently not tested.
pytest and code coverage are configured in [`pyproject.toml`](./pyproject.toml).
To run selected tests:
```bash
(.venv) $ nox -s test -- -k "test_web"
```
## Code Style Checking
[PEP 8](https://peps.python.org/pep-0008/) is the universally accepted style guide for Python
code. PEP 8 code compliance is verified using [Ruff][Ruff]. Ruff is configured in the
`[tool.ruff]` section of [`pyproject.toml`](./pyproject.toml).
[Ruff]: https://github.com/astral-sh/ruff
To lint code, run:
```bash
(.venv) $ nox -s lint
```
To automatically fix fixable lint errors, run:
```bash
(.venv) $ nox -s lint_fix
```
## Automated Code Formatting
[Ruff][Ruff] is used to automatically format code and group and sort imports.
To automatically format code, run:
```bash
(.venv) $ nox -s fmt
```
To verify code has been formatted, such as in a CI job:
```bash
(.venv) $ nox -s fmt_check
```
## Type Checking
[Type annotations](https://docs.python.org/3/library/typing.html) allow developers to add
optional static typing information to Python source code. This allows static analyzers such
as [mypy](http://mypy-lang.org/), [PyCharm](https://www.jetbrains.com/pycharm/),
or [Pyright](https://github.com/microsoft/pyright) to check that functions are used with the
correct types before runtime.
```python
def factorial(n: int) -> int:
    ...
```
mypy is configured in [`pyproject.toml`](./pyproject.toml). To type check code, run:
```bash
(.venv) $ nox -s type_check
```
See also [awesome-python-typing](https://github.com/typeddjango/awesome-python-typing).
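For instance, given an annotated `factorial` (the iterative implementation here is hypothetical), mypy flags a bad call site before the code ever runs; the error message in the comment is approximate:

```python
def factorial(n: int) -> int:
    # Iterative factorial; the annotations let mypy verify every call site.
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result


factorial(5)  # OK: int argument matches the annotation
# factorial("5")  # mypy: Argument 1 to "factorial" has incompatible type "str"; expected "int"
```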
### Distributing Type Annotations
[PEP 561](https://www.python.org/dev/peps/pep-0561/) defines how a Python package should
communicate the presence of inline type annotations to static type
checkers. [mypy's documentation](https://mypy.readthedocs.io/en/stable/installed_packages.html)
provides further examples on how to do this.
Mypy looks for the existence of a file named [`py.typed`](./src/not-again-ai/py.typed) in the root of the
installed package to indicate that inline type annotations should be checked.
## Typos
Check for typos using [typos](https://github.com/crate-ci/typos):
```bash
(.venv) $ nox -s typos
```
## Continuous Integration
Continuous integration is provided by [GitHub Actions](https://github.com/features/actions). This
runs all tests, lints, and type checking for every commit and pull request to the repository.
GitHub Actions is configured in [`.github/workflows/python.yml`](./.github/workflows/python.yml).
## [Visual Studio Code](https://code.visualstudio.com/docs/languages/python)
Install the [Python extension](https://marketplace.visualstudio.com/items?itemName=ms-python.python) for VSCode.
Install the [Ruff extension](https://marketplace.visualstudio.com/items?itemName=charliermarsh.ruff) for VSCode.
Default settings are configured in [`.vscode/settings.json`](./.vscode/settings.json) which will enable Ruff with consistent settings.
# Generating Documentation
## Generating a User Guide
[Material for MkDocs](https://squidfunk.github.io/mkdocs-material/) is a powerful static site
generator that combines easy-to-write Markdown with a number of extensions that increase
its power. This makes it a great fit for user guides and other technical documentation.
The example MkDocs project included in this project is configured to allow the built documentation
to be hosted at any URL or viewed offline from the file system.
To build the user guide, run:
```bash
(.venv) $ nox -s docs
```
and open `docs/user_guide/site/index.html` using a web browser.
To build the user guide, additionally validating external URLs, run:
```bash
(.venv) $ nox -s docs_check_urls
```
To build the user guide in a format suitable for viewing directly from the file system, run:
```bash
(.venv) $ nox -s docs_offline
```
To build and serve the user guide with automatic rebuilding as you change the contents,
run:
```bash
(.venv) $ nox -s docs_serve
```
and open <http://127.0.0.1:8000> in a browser.
Each time the `main` Git branch is updated, the
[`.github/workflows/pages.yml`](.github/workflows/pages.yml) GitHub Action will
automatically build the user guide and publish it to [GitHub Pages](https://pages.github.com/).
This is configured in the `docs_github_pages` Nox session.
## Generating API Documentation
This project uses [mkdocstrings](https://github.com/mkdocstrings/mkdocstrings) plugin for
MkDocs, which renders
[Google-style docstrings](https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html)
into an MkDocs project. Google-style docstrings provide a good mix of easy-to-read docstrings in
code as well as nicely-rendered output.
```python
"""Computes the factorial through a recursive algorithm.

Args:
    n: A positive input value.

Raises:
    InvalidFactorialError: If n is less than 0.

Returns:
    Computed factorial.
"""
```
## Misc
If you get a `Failed to create the collection: Prompt dismissed..` error when running `poetry update` on Ubuntu, try setting the following environment variable:
```bash
export PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring
```
# Attributions
[python-blueprint](https://github.com/johnthagen/python-blueprint) for the Python package skeleton.