fastai

- **Name:** fastai
- **Version:** 2.2.5
- **Home page:** https://github.com/fastai/fastai/tree/master/
- **Summary:** fastai simplifies training fast and accurate neural nets using modern best practices
- **Upload time:** 2021-01-13 19:38:48
- **Author:** Jeremy Howard, Sylvain Gugger, and contributors
- **Requires Python:** >=3.6
- **License:** Apache Software License 2.0
- **Keywords:** fastai, deep learning, machine learning
# Welcome to fastai
> fastai simplifies training fast and accurate neural nets using modern best practices


**Important**: This documentation covers fastai v2, which is a from-scratch rewrite of fastai. The v1 documentation has moved to [fastai1.fast.ai](https://fastai1.fast.ai). To stop fastai from updating to v2, run in your terminal `echo 'fastai 1.*' >> $CONDA_PREFIX/conda-meta/pinned` (if you use conda).

![CI](https://github.com/fastai/fastai/workflows/CI/badge.svg) [![PyPI](https://img.shields.io/pypi/v/fastai?color=blue&label=pypi%20version)](https://pypi.org/project/fastai/#description) [![Conda (channel only)](https://img.shields.io/conda/vn/fastai/fastai?color=seagreen&label=conda%20version)](https://anaconda.org/fastai/fastai) [![Build fastai images](https://github.com/fastai/docker-containers/workflows/Build%20fastai%20images/badge.svg)](https://github.com/fastai/docker-containers) ![docs](https://github.com/fastai/fastai/workflows/docs/badge.svg)

## Installing

You can use fastai without any installation by using [Google Colab](https://colab.research.google.com/). In fact, every page of this documentation is also available as an interactive notebook - click "Open in colab" at the top of any page to open it (be sure to change the Colab runtime to "GPU" to have it run fast!). See the fast.ai documentation on [Using Colab](https://course.fast.ai/start_colab) for more information.

You can install fastai on your own machines with conda (highly recommended). If you're using [Anaconda](https://www.anaconda.com/products/individual) then run:
```bash
conda install -c fastai -c pytorch -c anaconda fastai gh anaconda
```

...or if you're using [miniconda](https://docs.conda.io/en/latest/miniconda.html) then run:
```bash
conda install -c fastai -c pytorch fastai
```

To install with pip, use: `pip install fastai`. If you install with pip, you should install PyTorch first by following the PyTorch [installation instructions](https://pytorch.org/get-started/locally/).

If you plan to develop fastai yourself, or want to be on the cutting edge, you can use an editable install (if you do this, you should also use an editable install of [fastcore](https://github.com/fastai/fastcore) to go with it):

```bash
git clone https://github.com/fastai/fastai
pip install -e "fastai[dev]"
```

## Learning fastai

The best way to get started with fastai (and deep learning) is to read [the book](https://www.amazon.com/Deep-Learning-Coders-fastai-PyTorch/dp/1492045527), and complete [the free course](https://course.fast.ai).

To see what's possible with fastai, take a look at the [Quick Start](https://docs.fast.ai/quick_start.html), which shows how to use around 5 lines of code to build an image classifier, an image segmentation model, a text sentiment model, a recommendation system, and a tabular model. For each of the applications, the code is much the same.
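
For a flavour of what those few lines look like, here is a sketch of a cat detector in the spirit of the Quick Start (illustrative only; see the Quick Start page for the canonical version). It downloads the Oxford-IIIT Pets dataset and fine-tunes a pretrained ResNet-34:

```python
from fastai.vision.all import *

# Download the Oxford-IIIT Pets dataset and point at its images
path = untar_data(URLs.PETS)/'images'

def is_cat(x): return x[0].isupper()  # in this dataset, cat filenames start with a capital letter

# Build DataLoaders: 20% validation split, resize every image to 224px
dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2, seed=42,
    label_func=is_cat, item_tfms=Resize(224))

# Fine-tune a pretrained ResNet-34 for one epoch and track the error rate
learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)
```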

Read through the [Tutorials](https://docs.fast.ai/tutorial) to learn how to train your own models on your own datasets. Use the navigation sidebar to look through the fastai documentation. Every class, function, and method is documented here.

To learn about the design and motivation of the library, read the [peer-reviewed paper](https://www.mdpi.com/2078-2489/11/2/108/htm).

## About fastai

fastai is a deep learning library which provides practitioners with high-level components that can quickly and easily provide state-of-the-art results in standard deep learning domains, and provides researchers with low-level components that can be mixed and matched to build new approaches. It aims to do both things without substantial compromises in ease of use, flexibility, or performance. This is possible thanks to a carefully layered architecture, which expresses common underlying patterns of many deep learning and data processing techniques in terms of decoupled abstractions. These abstractions can be expressed concisely and clearly by leveraging the dynamism of the underlying Python language and the flexibility of the PyTorch library. fastai includes:

- A new type dispatch system for Python along with a semantic type hierarchy for tensors
- A GPU-optimized computer vision library which can be extended in pure Python
- An optimizer which refactors out the common functionality of modern optimizers into two basic pieces, allowing optimization algorithms to be implemented in 4–5 lines of code
- A novel 2-way callback system that can access any part of the data, model, or optimizer and change it at any point during training
- A new data block API (see the sketch just after this list)
- And much more...
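
As a flavour of the data block API, here is a minimal sketch that builds image `DataLoaders` by labelling each image with the name of its parent folder (the folder layout and the `path` variable are assumptions to adapt to your own data):

```python
from fastai.vision.all import *

# A minimal DataBlock: images labelled by their parent folder name
dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),              # inputs are images, targets are categories
    get_items=get_image_files,                       # find all image files under a path
    get_y=parent_label,                              # label = name of the parent directory
    splitter=RandomSplitter(valid_pct=0.2, seed=42), # random 80/20 train/valid split
    item_tfms=Resize(224))                           # resize each image to 224px

# dls = dblock.dataloaders(path)                     # 'path' is your dataset root (assumed layout)
```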

fastai is organized around two main design goals: to be approachable and rapidly productive, while also being deeply hackable and configurable. It is built on top of a hierarchy of lower-level APIs which provide composable building blocks. This way, a user wanting to rewrite part of the high-level API or add particular behavior to suit their needs does not have to learn how to use the lowest level.

<img alt="Layered API" src="https://raw.githubusercontent.com/fastai/fastai/master/nbs/images/layered.png" width="345" style="max-width: 345px">

## Migrating from other libraries

It's very easy to migrate from plain PyTorch, Ignite, or any other PyTorch-based library, or even to use fastai in conjunction with other libraries. Generally, you'll be able to use all your existing data processing code, but will be able to reduce the amount of code you require for training, and more easily take advantage of modern best practices. Here are migration guides from some popular libraries to help you on your way:

- [Plain PyTorch](https://docs.fast.ai/migrating_pytorch)
- [Ignite](https://docs.fast.ai/migrating_ignite)
- [Lightning](https://docs.fast.ai/migrating_lightning)
- [Catalyst](https://docs.fast.ai/migrating_catalyst)
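
As a rough sketch of what migrating from plain PyTorch looks like in practice, you can wrap existing PyTorch `DataLoader`s and a model in fastai's `Learner` and immediately use its training loop (here `train_dl`, `valid_dl`, and `model` are placeholders for objects you already have):

```python
import torch.nn.functional as F
from fastai.vision.all import *  # brings in DataLoaders, Learner, accuracy, fit_one_cycle, ...

# train_dl, valid_dl: your existing torch.utils.data.DataLoader objects (placeholders)
# model: your existing torch.nn.Module (placeholder)
dls = DataLoaders(train_dl, valid_dl)
learn = Learner(dls, model, loss_func=F.cross_entropy, metrics=accuracy)
learn.fit_one_cycle(1)  # train for one epoch with fastai's one-cycle schedule
```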

## Tests

To run the tests in parallel, launch:

`nbdev_test_nbs` or `make test`

For all the tests to pass, you'll need to install the following optional dependencies:

```bash
pip install "sentencepiece<0.1.90" wandb tensorboard albumentations pydicom opencv-python scikit-image pyarrow kornia \
    catalyst captum neptune-cli
```

Tests are written using `nbdev`; for example, see the documentation for `test_eq`.
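
For a quick flavour (fastai depends on `fastcore`, so this is already installed), `test_eq` and its companions are small assertion helpers:

```python
from fastcore.test import test_eq, test_ne

test_eq(2 + 2, 4)         # passes silently
test_ne([1, 2], [1, 3])   # passes: the two lists differ
# test_eq('a', 'b')       # would raise an AssertionError showing both values
```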

## Contributing

After you clone this repository, please run `nbdev_install_git_hooks` in your terminal. This sets up git hooks that clean the notebooks, removing extraneous metadata (e.g. which cells you ran) that would otherwise cause unnecessary merge conflicts.

Before submitting a PR, check that the local library and notebooks match. The script `nbdev_diff_nbs` can let you know if there is a difference between the local library and the notebooks.

- If you made a change to the notebooks in one of the exported cells, you can export it to the library with `nbdev_build_lib` or `make fastai`.
- If you made a change to the library, you can export it back to the notebooks with `nbdev_update_lib`.

## Docker Containers

Official docker containers for this project can be found [here](https://github.com/fastai/docker-containers#fastai).



            
