Name | LexBuilder
Version | 1.0.2
Summary | Library for automatic construction of lexers
upload_time | 2024-01-08 20:03:27
docs_url | None
author | Alexander554
requires_python | >=3.7
keywords | python, lexer
requirements | No requirements were recorded.
Travis-CI | No Travis.
coveralls test coverage | No coveralls.
# About LexBuilder:
LexBuilder is a library for automatically building a lexer in Python. In the future, the library aims to generate lexers for other major programming languages such as C++, Go, Java/Kotlin, etc.
## About Syntax:
For the library to generate the Lexer.py file, you need to pass a list of tokens to the PyBuilder class.
To declare a token, import the Token class from the PyLexer module and pass it the token name and its value. Then add all the created tokens to a list and pass that list as an argument to PyBuilder(). By default, the lexer contains the tokens:
```python
INT_NUMBER = "INT_NUMBER"
FLOAT_NUMBER = "FLOAT_NUMBER"
STRING = "STRING"
PLUS = "PLUS"
MINUS = "MINUS"
VAR = "VAR"
EOF = "EOF"
```
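Conceptually, each Token pairs a name with the text it matches. A minimal sketch of such a name/value container — this is not LexBuilder's actual implementation, just an illustration of the shape the README's `Token("DIVIDE", "/")` calls suggest:

```python
class Token:
    """Hypothetical stand-in for LexBuilder's Token: a name/value pair."""

    def __init__(self, name, value):
        self.name = name    # token type, e.g. "DIVIDE"
        self.value = value  # matched text, e.g. "/"

    def __repr__(self):
        return f"Token({self.name}, {self.value!r})"

# Mirrors the README's usage:
DIVIDE = Token("DIVIDE", "/")
```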
### Example of creating a list of tokens:
```python
from LexBuilder.PyLexer import Token
DIVIDE = Token("DIVIDE", "/")
PRINT = Token("PRINT", "print")
INPUT = Token("INPUT", "input")
tokens = [DIVIDE, PRINT, INPUT]
```
## Example:
### Generate Lexer:
```python
from LexBuilder.PyLexer import PyBuilder, Token
DIVIDE = Token("DIVIDE", "/")
PRINT = Token("PRINT", "print")
INPUT = Token("INPUT", "input")
tokens = [DIVIDE, PRINT, INPUT]
lexer = PyBuilder(tokens)
lexer.build_lexer()
```
### Use Lexer:
```python
from Lexer import *

code = 'print "Hello, world!"'
lexer = Lexer(code)

token = lexer.get_next_token()
print(token)

while token.type != EOF:
    token = lexer.get_next_token()
    print(token)
```
Output:
```
Token(PRINT, "print")
Token(STRING, "Hello, world!")
Token(EOF, None)
```
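The loop above is a standard pull-based pattern: call `get_next_token()` until an EOF token appears. A self-contained sketch of that pattern — using a hypothetical stub lexer, since the generated Lexer.py is not reproduced here:

```python
# Stub classes illustrating the get_next_token()/EOF pull pattern used above.
# These are NOT LexBuilder's generated code; the names mirror the README.
EOF = "EOF"

class Token:
    def __init__(self, type_, value):
        self.type = type_
        self.value = value

class StubLexer:
    """Yields pre-made tokens, then an EOF token, like a generated lexer."""
    def __init__(self, tokens):
        self._iter = iter(tokens)

    def get_next_token(self):
        return next(self._iter, Token(EOF, None))

def drain(lexer):
    """Pull tokens until EOF, returning them as a list (EOF included)."""
    out = []
    while True:
        tok = lexer.get_next_token()
        out.append(tok)
        if tok.type == EOF:
            return out

lexer = StubLexer([Token("PRINT", "print"), Token("STRING", "Hello, world!")])
types = [t.type for t in drain(lexer)]
# types == ["PRINT", "STRING", "EOF"]
```

Collecting into a list (rather than printing inside the loop) also makes the token stream easy to hand to a parser afterwards.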