FunctionEncoder

- Name: FunctionEncoder
- Version: 0.1.1
- Uploaded: 2024-12-07 14:46:56
- Requires Python: >=3.9
- Author email: Tyler Ingebrand <tyleringebrand@utexas.edu>
- Summary: A package for learning basis functions over arbitrary function sets. This allows even high-dimensional problems to be solved via a minimal number of basis functions. This allows for zero-shot transfer within these spaces, and also a mechanism for fully informative function representation via the coefficients of the basis functions. Hilbert spaces are nifty.
# Function Encoder

A function encoder learns basis functions/vectors over arbitrary Hilbert spaces. This allows for zero-shot transfer within the learned space by using a weighted combination of the basis functions to approximate any function in the space. The coefficients can be computed quickly from a small amount of data, either via an inner product or via least squares. The basis functions are learned from data as a neural network, which allows them to scale to high-dimensional function spaces. Furthermore, since the number of basis functions is fixed, this yields a fixed-size representation of the function which can be used for downstream tasks.
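As a concrete illustration of the least-squares route (a minimal numpy sketch, not this package's API; the helper name is hypothetical):

```python
import numpy as np

# Hypothetical helper: given k basis functions evaluated at m data points,
# recover a function's coefficients by least squares.
# G is the (m, k) matrix of basis evaluations g_j(x_i); y holds f(x_i).
def least_squares_coefficients(G, y):
    # Solve min_c ||G c - y||^2; lstsq also handles rank-deficient bases.
    coeffs, *_ = np.linalg.lstsq(G, y, rcond=None)
    return coeffs

# Toy check: the basis {1, x, x^2} recovers the quadratic 2x^2 - 3x + 1.
x = np.linspace(-1.0, 1.0, 50)
G = np.stack([np.ones_like(x), x, x**2], axis=1)
y = 2 * x**2 - 3 * x + 1
c = least_squares_coefficients(G, y)
# c is (approximately) [1, -3, 2]
```

In the package itself the basis functions are neural networks rather than fixed monomials, but the coefficient computation follows the same pattern.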

See [the original paper](https://arxiv.org/abs/2401.17173) for a mathematical introduction or [the blog](https://tyler-ingebrand.github.io/2024/05/04/zero-shot-RL.html) for an intuitive explanation of function encoders. 

## Installation
For the latest stable release:
```commandline
pip install FunctionEncoder
```
For the latest version:
```commandline
pip install git+https://github.com/tyler-ingebrand/FunctionEncoder.git
```


## Examples

Each of the following examples can be found in the Examples/ directory. These examples illustrate the basic use of this repo and algorithm, but they are by no means the extent of its applications or scalability.

### Euclidean Vectors

This algorithm can be applied to any Hilbert space. To visualize what this algorithm looks like, we can apply it to Euclidean vectors. Watch as the basis vectors (black) converge to the Hilbert space being fit (blue square).

https://github.com/tyler-ingebrand/FunctionEncoder/assets/105821676/174ddf15-de2d-44dc-b7fe-6b5fad831a4b
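For intuition, the Euclidean case reduces to ordinary linear algebra (a toy sketch, not the repo's training loop): with an orthonormal basis of R^n, the coefficients are plain inner products.

```python
import numpy as np

# Toy sketch: in R^2 the "basis functions" are just vectors, and with an
# orthonormal basis the coefficients are plain inner products.
basis = np.array([[1.0, 0.0],
                  [0.0, 1.0]])   # rows are basis vectors
v = np.array([0.3, -0.7])        # a vector from the space being fit
coeffs = basis @ v               # inner product of v with each basis vector
reconstruction = coeffs @ basis  # weighted combination of the basis vectors
# reconstruction equals v exactly because the basis spans R^2
```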

### Quadratics

![A figure showing approximations over quadratics.](imgs/plot.png)

In the figure above, each panel shows a quadratic function (blue) and its function encoder approximation (orange). A small amount of data is taken from each quadratic function and used to compute a representation, by taking the Monte Carlo approximation of the inner product between the function and the basis functions. The function is then approximated as a weighted combination of basis functions. As you can see, a single set of learned basis functions is able to reproduce all nine of these quadratics accurately.
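The inner-product route described above can be sketched as follows (a hypothetical helper, not this package's API). When the basis is not orthonormal, a small Gram system turns the estimated inner products into coefficients:

```python
import numpy as np

# Hypothetical helper: Monte Carlo estimates of the L2 inner products
# <f, g_j> ~ (1/m) sum_i f(x_i) g_j(x_i), then a small Gram system
# (the pairwise <g_j, g_l>) converts them into basis coefficients.
def monte_carlo_coefficients(f_vals, basis_vals):
    m = len(f_vals)
    gram = basis_vals.T @ basis_vals / m   # estimates <g_j, g_l>
    rhs = basis_vals.T @ f_vals / m        # estimates <f, g_j>
    return np.linalg.solve(gram, rhs)

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)                           # sampled inputs
basis_vals = np.stack([np.ones_like(x), x, x**2], axis=1)  # basis {1, x, x^2}
f_vals = 0.5 * x**2 + x - 2.0                              # target quadratic
c = monte_carlo_coefficients(f_vals, basis_vals)
# c recovers [-2.0, 1.0, 0.5]
```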

The basis functions look like this:

![A figure showing the basis functions](imgs/basis.png)

### Distributions

As distributions are also Hilbert spaces, we can apply the exact same algorithm. The only difference is the definition of the inner product. The black dots below are example data points, and the red area indicates the approximated probability density function. Just like in the quadratic example, the same basis functions are able to approximate the pdfs of all of these distributions. 

![A figure showing Gaussian donuts](imgs/donuts.png)
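One way to see why sampled points suffice in the distribution setting (a hedged sketch, not this package's implementation): when the x_i are drawn from an unknown density f, the L2 inner product <f, g> = ∫ f(x) g(x) dx equals the expectation E_f[g(x)], so a sample mean estimates it without ever evaluating f:

```python
import numpy as np

# Hedged sketch: when samples x_i ~ f, the sample mean of g(x_i)
# estimates E_f[g(x)] = integral of f(x) g(x) dx = <f, g> in L2,
# so inner products with basis functions need no density values at all.
def inner_product_with_density(samples, g):
    return np.mean(g(samples))

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, size=100_000)  # x ~ N(0, 1); density unknown to us
est = inner_product_with_density(samples, lambda x: x**2)
# est is close to E[x^2] = 1 for a standard normal
```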

## Related Papers

* ["Zero-Shot Reinforcement Learning Via Function Encoders" (ICML 2024)](https://arxiv.org/abs/2401.17173)
* ["Zero-Shot Transfer of Neural ODEs" (NeurIPS 2024)](https://arxiv.org/abs/2405.08954)
* ["Basis-to-Basis Operator Learning Using Function Encoders" (CMAME)](https://arxiv.org/abs/2410.00171)


## Citation

If you use this repo for research, please cite 

```
@inproceedings{Ingebrand2024,
  author       = {Tyler Ingebrand and
                  Amy Zhang and
                  Ufuk Topcu},
  title        = {Zero-Shot Reinforcement Learning via Function Encoders},
  booktitle    = {{ICML}},
  year         = {2024},
}
```

            
