Name | bert-for-tf2e |
Version | 0.14.13 |
download | |
home_page | |
Summary | A TensorFlow 2.11.0 Keras implementation of BERT. |
upload_time | 2023-01-23 21:03:39 |
maintainer | |
docs_url | None |
author | Esa Krissa |
requires_python | >=3.6 |
license | MIT |
keywords | tensorflow, keras, bert |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
BERT for TensorFlow 2.11.0
==========================
This is a modified version of the original `bert-for-tf2` created by `kpe`. I made a minor change to the code
to make it work with newer versions of TensorFlow, following a solution I found in the GitHub community,
which resolves the TypeError issue. Last checked on 1/23/2023 - it worked fine.
This repo contains a `TensorFlow 2.11.0`_ `Keras`_ implementation of `google-research/bert`_
with support for loading the original `pre-trained weights`_,
and producing activations **numerically identical** to those calculated by the original model.
`ALBERT`_ and `adapter-BERT`_ are also supported by setting the corresponding
configuration parameters (``shared_layer=True`` and ``embedding_size`` for `ALBERT`_,
and ``adapter_size`` for `adapter-BERT`_). Setting both will result in an adapter-ALBERT,
which shares the BERT parameters across all layers while adapting every layer with a layer-specific adapter.
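For illustration, here is a minimal sketch of setting these parameters, assuming the same
``params_from_pretrained_ckpt`` and ``BertModelLayer.from_params`` API as the original
``kpe/bert-for-tf2`` (the checkpoint directory is a placeholder):

::

    import bert

    model_dir = ".models/uncased_L-12_H-768_A-12"  # placeholder checkpoint directory

    bert_params = bert.params_from_pretrained_ckpt(model_dir)
    bert_params.shared_layer = True     # ALBERT: share parameters across all layers
    bert_params.embedding_size = 128    # ALBERT: factorized embedding size
    bert_params.adapter_size = 64       # adapter-BERT: add layer-specific adapters
    l_bert = bert.BertModelLayer.from_params(bert_params, name="bert")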
The implementation is built from scratch using only basic TensorFlow operations,
following the code in `google-research/bert/modeling.py`_
(but skipping dead code and applying some simplifications). It also utilizes `kpe/params-flow`_ to reduce
common Keras boilerplate code (related to passing model and layer configuration arguments).
`bert-for-tf2e`_ should work with both `TensorFlow 2.11.0`_ and `TensorFlow 1.14`_ or newer.
Install
-------
``bert-for-tf2e`` (BERT for TensorFlow 2, extended) is on the Python Package Index (PyPI):
::
pip install bert-for-tf2e
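After installation, a minimal usage sketch (assuming the same API as the original ``kpe/bert-for-tf2``;
the model directory is a placeholder for a downloaded Google BERT checkpoint):

::

    import os
    import bert
    from tensorflow import keras

    model_dir = ".models/uncased_L-12_H-768_A-12"  # placeholder: pre-trained BERT checkpoint
    max_seq_len = 128

    # build a BERT layer from the checkpoint's configuration
    bert_params = bert.params_from_pretrained_ckpt(model_dir)
    l_bert = bert.BertModelLayer.from_params(bert_params, name="bert")

    # wrap the layer in a Keras model: token ids in, contextual embeddings out
    l_input_ids = keras.layers.Input(shape=(max_seq_len,), dtype="int32")
    output = l_bert(l_input_ids)  # [batch_size, max_seq_len, hidden_size]
    model = keras.Model(inputs=l_input_ids, outputs=output)
    model.build(input_shape=(None, max_seq_len))

    # load the original pre-trained weights into the Keras layer
    bert.load_stock_weights(l_bert, os.path.join(model_dir, "bert_model.ckpt"))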
For more details, please check the original version:
----------------------------------------------------
- `SOURCE`_ - https://github.com/kpe/bert-for-tf2
Raw data
{
"_id": null,
"home_page": "",
"name": "bert-for-tf2e",
"maintainer": "",
"docs_url": null,
"requires_python": ">=3.6",
"maintainer_email": "",
"keywords": "tensorflow keras bert",
"author": "Esa Krissa",
"author_email": "esakrissa.wayan@gmail.com",
"download_url": "",
"platform": null,
"description": "BERT for TensorFlow 2.11.0\n======================\n\nThis is a modification version of the original `bert-for-tf2` created by `kpe`. I made a minor change to the code \nto make it work with never version of TensorFlow, following the solution that i found in github community.\nResolving the TypeError issue. Last time checked at 1/23/2023 - it worked fine.\n\nThis repo contains a `TensorFlow 2.11.0`_ `Keras`_ implementation of `google-research/bert`_\nwith support for loading of the original `pre-trained weights`_,\nand producing activations **numerically identical** to the one calculated by the original model.\n\n`ALBERT`_ and `adapter-BERT`_ are also supported by setting the corresponding\nconfiguration parameters (``shared_layer=True``, ``embedding_size`` for `ALBERT`_\nand ``adapter_size`` for `adapter-BERT`_). Setting both will result in an adapter-ALBERT\nby sharing the BERT parameters across all layers while adapting every layer with layer specific adapter.\n\nThe implementation is build from scratch using only basic tensorflow operations,\nfollowing the code in `google-research/bert/modeling.py`_\n(but skipping dead code and applying some simplifications). It also utilizes `kpe/params-flow`_ to reduce\ncommon Keras boilerplate code (related to passing model and layer configuration arguments).\n\n`bert-for-tf2e`_ should work with both `TensorFlow 2.11.0`_ and `TensorFlow 1.14`_ or newer.\n\nInstall\n-------\n\n``bert-for-tf2e`` bert for tensorflow 2.0 (extended) is on the Python Package Index (PyPI):\n\n::\n\n pip install bert-for-tf2e\n\n\nFor more detail please check the original version:\n---------\n- `SOURCE`_ - https://github.com/kpe/bert-for-tf2\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "A TensorFlow 2.11.0 Keras implementation of BERT.",
"version": "0.14.13",
"split_keywords": [
"tensorflow",
"keras",
"bert"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "490d88c29004e79a1727d38a8a5df6df238748f4a7b9130e37681a38c8321afc",
"md5": "f04ed1d6f9dce4b4852efc09e4dc518f",
"sha256": "50c92a69fcf0147bcb0e3f4327397000c69ebb38a50fd85b8be03c37ee419137"
},
"downloads": -1,
"filename": "bert_for_tf2e-0.14.13-py3-none-any.whl",
"has_sig": false,
"md5_digest": "f04ed1d6f9dce4b4852efc09e4dc518f",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.6",
"size": 48167,
"upload_time": "2023-01-23T21:03:39",
"upload_time_iso_8601": "2023-01-23T21:03:39.808634Z",
"url": "https://files.pythonhosted.org/packages/49/0d/88c29004e79a1727d38a8a5df6df238748f4a7b9130e37681a38c8321afc/bert_for_tf2e-0.14.13-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2023-01-23 21:03:39",
"github": false,
"gitlab": false,
"bitbucket": false,
"lcname": "bert-for-tf2e"
}