| Name | chiku |
| Version | 0.0.25 |
| Summary | Python package for efficient probabilistic polynomial approximation of arbitrary functions. |
| upload_time | 2023-07-30 23:53:50 |
| requires_python | >=3.7 |
| requirements | numdifftools==0.9.41, numpy==1.25.1, SciPy==1.11.1, sympy==1.12, tensorflow==2.13.0 |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
## chiku
### Efficient Probabilistic Polynomial Function Approximation Python Library.
#### Installation
To install run: <code>pip install chiku</code>
#### Approximation Libraries
Complex (non-linear) functions such as the Sigmoid ( $\sigma(x)$ ) and the Hyperbolic Tangent ( $\tanh{x}$ ) can be computed under Fully Homomorphic Encryption (FHE) in the encrypted domain using piecewise-linear functions (e.g., the linear approximation $\sigma(x) \approx 0.5 + 0.25x$ comes from the first two terms of its Taylor series, $\frac{1}{2} + \frac{1}{4}x$ ) or polynomial approximations such as Taylor, Pade, Chebyshev, Remez, and Fourier series. These deterministic approaches always yield the same polynomial for the same function. In contrast, we propose using an Artificial Neural Network (ANN) to derive the approximating polynomial probabilistically: its coefficients depend on the initial weights and the convergence of the ANN model. Our scheme is publicly available here as an open-source Python package.
Library | Taylor | Fourier | Pade | Chebyshev | Remez | ANN
--------|--------|---------|------|-----------|-------|-----
[numpy](https://github.com/numpy/numpy)||||✔||
[scipy](https://github.com/scipy/scipy)|✔||✔|||
[mpmath](https://github.com/mpmath/mpmath)|✔|✔|✔|✔||
[chiku](https://github.com/devharsh/chiku)|✔|✔|✔|✔|✔|✔
The table above compares our library with other popular Python packages for numerical analysis. While the $mpmath$ library provides Taylor, Pade, Fourier, and Chebyshev approximations, a user has to transform their functions to suit the $mpmath$ data types (e.g., $mpf$ for real floats and $mpc$ for complex values). In contrast, our library requires no such modifications and can approximate arbitrary functions. Additionally, we provide the Remez approximation along with the other methods supported by $mpmath$.
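To illustrate the point, the deterministic fits can be computed with $mpmath$, but only for callables written against mpmath's own types (a minimal sketch, assuming $mpmath$ is installed; a function written against NumPy arrays would first have to be rewrapped):

```python
import mpmath

# Taylor coefficients of sin(x) around 0, degrees 0..3
# (the function must be mpmath-aware, e.g. mpmath.sin, not numpy.sin)
taylor = mpmath.taylor(mpmath.sin, 0, 3)

# Degree-3 Chebyshev fit of sin(x) on [-pi, pi]; returns polynomial
# coefficients (highest degree first) and an error estimate
cheb, err = mpmath.chebyfit(mpmath.sin, [-mpmath.pi, mpmath.pi], 4, error=True)
```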
#### ANN Approximation
While ANNs are known for their universal function approximation properties, they are often treated as a black box that merely computes output values. We propose a basic 3-layer perceptron consisting of an input layer, a hidden layer, and an output layer; both the hidden and output layers have linear activations, so the network directly yields the coefficients of an approximation polynomial of a given order. In this architecture, the input layer is dynamic: its nodes correspond to the desired polynomial degrees. While a variable number of hidden layers is possible, we fix it to a single layer with a single node to minimize computation.
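Because every activation is linear, the network as a whole is linear in the monomial features, so its mean-squared-error optimum in effect coincides with an ordinary least-squares polynomial fit; gradient descent from random initial weights lands near, but not exactly on, this optimum. A quick NumPy sketch for $\sin(x)$ on $[-\pi,\pi]$, given purely for intuition and independent of the package's own training code:

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 2001)
# monomial features x, x^2, x^3, plus a constant column for the bias terms
X = np.stack([x, x**2, x**3, np.ones_like(x)], axis=1)
coef, *_ = np.linalg.lstsq(X, np.sin(x), rcond=None)
c1, c2, c3, c0 = coef  # sin(x) ~ c1*x + c2*x^2 + c3*x^3 + c0
```

By symmetry, the even terms $c_2$ and $c_0$ come out essentially zero, and $c_1$ and $c_3$ land close to the ANN-derived coefficients reported below.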

We show coefficient calculations for a third-order polynomial $d=3$ for a univariate function $f(x) = y$ for an input $x$, actual output $y$, and predicted output $y_{out}$.
The inputs to the network are the powers
$\{x, x^2, \ldots, x^d\} = \{x, x^2, x^3\}$
with input-layer weights $\{w_1, w_2, \ldots, w_d\} = \{w_1, w_2, w_3\}$, and the hidden node has bias $b_h$. Thus the output of the hidden layer is
$y_h = w_1 x + w_2 x^2 + w_3 x^3 + b_h$
The predicted output is calculated by
$y_{out} = w_{out} \cdot y_h + b_{out}$
$= w_1 w_{out} x + w_2 w_{out} x^2 + w_3 w_{out} x^3 + (b_h w_{out} + b_{out})$
where the composite weights $\{w_1 w_{out}, w_2 w_{out}, w_3 w_{out}\}$ are the coefficients of the order-3 approximating polynomial, and its constant term is $b_h w_{out} + b_{out}$.
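This identity is easy to sanity-check numerically; the weights below are arbitrary placeholders, not trained values:

```python
import numpy as np

w = np.array([0.9, -0.05, -0.1])        # input->hidden weights (placeholders)
b_h, w_out, b_out = 0.02, 0.95, -0.01   # hidden bias, output weight and bias

x = 0.7
y_h = w[0]*x + w[1]*x**2 + w[2]*x**3 + b_h  # hidden-layer output
y_out = w_out * y_h + b_out                 # network prediction

coeffs = w * w_out            # polynomial coefficients for x, x^2, x^3
const = b_h * w_out + b_out   # constant term
# the network output equals the cubic polynomial for any x
poly = coeffs[0]*x + coeffs[1]*x**2 + coeffs[2]*x**3 + const
```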
Our ANN-based polynomial approximation approach can generate polynomials of specified degrees. For example, a user can generate a complete third-order polynomial for $\sin(x)$, which yields
$-0.0931199x^3 - 0.001205849x^2 + 0.85615075x + 0.0009873845$
on the interval $[-\pi,\pi]$. Alternatively, a user may optimize the above polynomial by eliminating the $x^2$ term to reduce costly multiplications in FHE, which yields:
$-0.09340597x^3 + 0.8596622x + 0.0005142888.$
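The quoted cubic's accuracy can be checked directly (the bound in the comment is an illustrative sanity check, not a figure from the package; for a fit of this kind the error is largest near the endpoints of the interval):

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 10001)
p = -0.0931199*x**3 - 0.001205849*x**2 + 0.85615075*x + 0.0009873845
max_err = np.max(np.abs(p - np.sin(x)))  # worst-case error on [-pi, pi], well under 0.3
```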