gpt3_tokenizer


Name: gpt3_tokenizer
Version: 0.1.5
Home page: https://github.com/alisonjf/gpt3-tokenizer
Summary: Encoder/Decoder and tokens counter for GPT3
Upload time: 2024-04-26 18:07:33
Maintainer: None
Docs URL: None
Author: Alison Ferrenha
Requires Python: !=3.0.*,!=3.1.*,!=3.2.*,>=2.7
License: MIT
Keywords: openai, gpt, gpt-3, gpt3, gpt4, gpt-4, tokenizer
Requirements: no requirements were recorded
gpt3_tokenizer
==============
| An `OpenAI`_ GPT-3 helper library for encoding/decoding strings and counting tokens.
| Counting tokens gives the same output as OpenAI's `tokenizer`_.
|
| Supported Python versions: **>=2.7, <3.0** or **>=3.3**

Installing
--------------
.. code-block:: bash

    pip install gpt3_tokenizer

Examples
---------------------

**Encoding/decoding a string**

.. code-block:: python

    import gpt3_tokenizer

    a_string = "That's my beautiful and sweet string"
    encoded = gpt3_tokenizer.encode(a_string) # outputs [2504, 338, 616, 4950, 290, 6029, 4731]
    decoded = gpt3_tokenizer.decode(encoded) # outputs "That's my beautiful and sweet string"
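
Because ``decode`` inverts ``encode``, a round trip should return the original
string. A minimal sanity check, using only the two functions shown above:

.. code-block:: python

    import gpt3_tokenizer

    a_string = "That's my beautiful and sweet string"
    # Encode to token IDs, then decode back; the result should equal the input.
    assert gpt3_tokenizer.decode(gpt3_tokenizer.encode(a_string)) == a_string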

**Counting tokens**

.. code-block:: python

    import gpt3_tokenizer

    a_string = "That's my beautiful and sweet string"
    tokens_count = gpt3_tokenizer.count_tokens(a_string) # outputs 7
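
Since the counts match OpenAI's `tokenizer`_, ``count_tokens`` is useful for
checking that a prompt fits a model's context window before sending it. A short
sketch; the 4096-token budget below is illustrative and not part of this library:

.. code-block:: python

    import gpt3_tokenizer

    MAX_TOKENS = 4096  # illustrative budget; use your target model's actual limit

    def fits_context(prompt, max_tokens=MAX_TOKENS):
        """Return True if the prompt's token count is within the budget."""
        return gpt3_tokenizer.count_tokens(prompt) <= max_tokens

    print(fits_context("That's my beautiful and sweet string"))  # True (7 tokens)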

.. _tokenizer: https://platform.openai.com/tokenizer
.. _OpenAI: https://openai.com/
            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/alisonjf/gpt3-tokenizer",
    "name": "gpt3_tokenizer",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7",
    "maintainer_email": null,
    "keywords": "openai, gpt, gpt-3, gpt3, gpt4, gpt-4, tokenizer",
    "author": "Alison Ferrenha",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/46/f6/33582154322e1444ba6a677a4bca1d79e1221304e3607b14881c3510bf56/gpt3_tokenizer-0.1.5.tar.gz",
    "platform": null,
    "description": "gpt3_tokenizer\n===============\n| An `OpenAI`_ GPT3 helper library for encoding/decoding strings and counting tokens.\n| Counting tokens gives the same output as OpenAI's `tokenizer`_\n|\n| Supported python versions: **>=2.7 <3.0** OR **>=3.3**\n\nInstalling\n--------------\n.. code-block:: bash\n\n    pip install gpt3_tokenizer\n\n    \nExamples\n---------------------\n\n**Encoding/decoding a string**\n\n.. code-block:: python\n\n    import gpt3_tokenizer\n\n    a_string = \"That's my beautiful and sweet string\"\n    encoded = gpt3_tokenizer.encode(a_string) # outputs [2504, 338, 616, 4950, 290, 6029, 4731]\n    decoded = gpt3_tokenizer.decode(encoded) # outputs \"That's my beautiful and sweet string\"\n\n**Counting tokens**\n\n.. code-block:: python\n\n    import gpt3_tokenizer\n\n    a_string = \"That's my beautiful and sweet string\"\n    tokens_count = gpt3_tokenizer.count_tokens(a_string) # outputs 7\n\n.. _tokenizer: https://platform.openai.com/tokenizer\n.. _OpenAI: https://openai.com/",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Encoder/Decoder and tokens counter for GPT3",
    "version": "0.1.5",
    "project_urls": {
        "Homepage": "https://github.com/alisonjf/gpt3-tokenizer",
        "Repository": "https://github.com/alisonjf/gpt3-tokenizer"
    },
    "split_keywords": [
        "openai",
        " gpt",
        " gpt-3",
        " gpt3",
        " gpt4",
        " gpt-4",
        " tokenizer"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "ebedaf04d6badad07846809ef832ab1ea294697961d82c2c330445bd081b1e47",
                "md5": "be14a8564759115669651d1c681dd8db",
                "sha256": "2d0ed9c7efa907d45ce3c338ffe2ee3bc9124ee1236248989bd883fd4eb0e5b6"
            },
            "downloads": -1,
            "filename": "gpt3_tokenizer-0.1.5-py2.py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "be14a8564759115669651d1c681dd8db",
            "packagetype": "bdist_wheel",
            "python_version": "py2.py3",
            "requires_python": "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7",
            "size": 567843,
            "upload_time": "2024-04-26T18:07:31",
            "upload_time_iso_8601": "2024-04-26T18:07:31.414749Z",
            "url": "https://files.pythonhosted.org/packages/eb/ed/af04d6badad07846809ef832ab1ea294697961d82c2c330445bd081b1e47/gpt3_tokenizer-0.1.5-py2.py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "46f633582154322e1444ba6a677a4bca1d79e1221304e3607b14881c3510bf56",
                "md5": "35e18abadfbb3caa9d2965b1fa788a22",
                "sha256": "0366a9b7000b1a9066ae9257eed7f2b665b374788d38bbeac86ab9145ab1a6c9"
            },
            "downloads": -1,
            "filename": "gpt3_tokenizer-0.1.5.tar.gz",
            "has_sig": false,
            "md5_digest": "35e18abadfbb3caa9d2965b1fa788a22",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7",
            "size": 560665,
            "upload_time": "2024-04-26T18:07:33",
            "upload_time_iso_8601": "2024-04-26T18:07:33.709739Z",
            "url": "https://files.pythonhosted.org/packages/46/f6/33582154322e1444ba6a677a4bca1d79e1221304e3607b14881c3510bf56/gpt3_tokenizer-0.1.5.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-04-26 18:07:33",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "alisonjf",
    "github_project": "gpt3-tokenizer",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "gpt3_tokenizer"
}
        