rt-activation

Name: rt-activation
Version: 1.0.0
Summary: Rectified Tangent Activation (RTA) function for Keras/TensorFlow
Author email: Gaurav Pandey <gaurav.pandey1812@gmail.com>
Upload time: 2025-07-28 18:53:52
Requires Python: >=3.8
License: MIT
Keywords: keras, tensorflow, activation, neural-networks, deep-learning
Homepage / Repository: https://github.com/gkp1812/rt-activation
Issues: https://github.com/gkp1812/rt-activation/issues
Requirements: none recorded
# RT Activation
A custom activation function for Keras/TensorFlow implementing Rectified Tangent Activation (RTA).

## Formula
f(x) = max(x, tanh(x))

Since x >= tanh(x) exactly when x >= 0, RTA is the identity for non-negative inputs and follows tanh for negative inputs.
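
As a quick reference (a minimal sketch, not the package's internal implementation), the formula is straightforward to evaluate with NumPy:

```python
import numpy as np

def rta_reference(x):
    """Elementwise RTA: max(x, tanh(x))."""
    x = np.asarray(x, dtype=float)
    return np.maximum(x, np.tanh(x))

# Identity for x >= 0; follows tanh (bounded below by -1) for x < 0.
print(rta_reference([-3.0, -1.0, 0.0, 1.0, 3.0]))
# [-0.99505475 -0.76159416  0.          1.          3.        ]
```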

## Installation
```bash
pip install rt-activation
```
## Usage

### Simple Usage (String-based)
```python
import keras
from keras import layers
import rt_activation  # importing the package registers the "RTA" activation with Keras

input_shape = (28, 28, 1)  # example values: MNIST-sized grayscale inputs
num_classes = 10           # example values: 10 output classes

model = keras.Sequential([
    keras.Input(shape=input_shape),
    layers.Conv2D(32, kernel_size=(3, 3), activation="RTA"),
    layers.MaxPooling2D(pool_size=(2, 2)),
    layers.Conv2D(64, kernel_size=(3, 3), activation="RTA"),
    layers.MaxPooling2D(pool_size=(2, 2)),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(num_classes, activation="softmax"),
])
```
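
The model then compiles and trains like any other Keras model; a minimal continuation (assuming one-hot encoded labels, hence categorical cross-entropy):

```python
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(x_train, y_train, batch_size=128, epochs=5, validation_split=0.1)
```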

### Function-based Usage
```python
import keras
from keras import layers
from rt_activation import RTA

input_shape = (28, 28, 1)  # example values: MNIST-sized grayscale inputs
num_classes = 10           # example values: 10 output classes

model = keras.Sequential([
    keras.Input(shape=input_shape),
    layers.Conv2D(32, kernel_size=(3, 3), activation=RTA),
    layers.MaxPooling2D(pool_size=(2, 2)),
    layers.Conv2D(64, kernel_size=(3, 3), activation=RTA),
    layers.MaxPooling2D(pool_size=(2, 2)),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(num_classes, activation="softmax"),
])
```
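
Because `RTA` is exposed as a plain callable, it can also be applied directly to a tensor to inspect its behavior (assuming it accepts and returns a tensor, as standard Keras activations do):

```python
import tensorflow as tf
from rt_activation import RTA

x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])
print(RTA(x).numpy())
# tanh on the negative side, identity on the positive side:
# [-0.9640276  -0.46211717  0.          0.5         2.        ]
```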

## Properties
- **Smooth**: differentiable everywhere; the left and right derivatives both equal 1 at x = 0, so the gradient is continuous (checked numerically in the sketch below)
- **Non-saturating**: grows linearly, f(x) = x, for positive inputs
- **Bounded for negatives**: follows tanh(x) for negative inputs, so outputs never fall below -1
- **Zero-centered**: outputs can be negative
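
A quick numerical check of the differentiability claim, comparing the piecewise analytic derivative (1 - tanh(x)^2 for x < 0, 1 for x >= 0) against a central finite difference (a sketch, independent of the package):

```python
import numpy as np

def rta(x):
    return np.maximum(x, np.tanh(x))

def rta_grad(x):
    # sech^2(x) = 1 - tanh(x)^2 on the negative branch, 1 on the positive branch
    return np.where(x < 0, 1.0 - np.tanh(x) ** 2, 1.0)

xs = np.array([-2.0, -1e-6, 0.0, 1e-6, 2.0])
h = 1e-6
numeric = (rta(xs + h) - rta(xs - h)) / (2 * h)
print(np.max(np.abs(numeric - rta_grad(xs))))  # tiny (~1e-10): both branches meet with slope 1 at 0
```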

## License
MIT License

            
