| Field | Value |
| --- | --- |
| Name | LexBuilder |
| Version | 1.3.0 |
| home_page | None |
| Summary | Library for automatic construction of lexers |
| upload_time | 2024-05-26 08:58:27 |
| maintainer | None |
| docs_url | None |
| author | Alexander554 |
| requires_python | >=3.7 |
| license | None |
| keywords | lexbuilder, python, c++, lexer |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# About LexBuilder:
LexBuilder is a library for automatically building lexers in Python and C++. In the future, the library is planned to support other major programming languages such as Go and Java/Kotlin.
## About Syntax:
For the library to generate a Lexer.py or Lexer.h file, you pass a list of tokens to a builder class or use its .add_token() method. To declare a token, import the Token class from the Builder module and pass it the token's name and value. Then collect all the tokens you created into a list and pass that list to PyBuilder() or CppBuilder(). By default, the generated lexer contains these tokens:
```python
INT_NUMBER = "INT_NUMBER"
FLOAT_NUMBER = "FLOAT_NUMBER"
PLUS = "PLUS"
MINUS = "MINUS"
STRING = "STRING"
MULTIPLY = "MULTIPLY"
DIVIDE = "DIVIDE"
VAR = "VAR"
ASSIGN = "ASSIGN"
EOF = "EOF"
```
### Example of creating a list of tokens:
```python
from LexBuilder.Builder import Token
PRINT = Token("PRINT", "print")
INPUT = Token("INPUT", "input")
tokens = [PRINT, INPUT]
```
#
## Python Example:
### Generate Lexer:
```python
from LexBuilder.Builder import PyBuilder, Token
PRINT = Token("PRINT", "print")
tokens = [PRINT]
lexer = PyBuilder(tokens)
lexer.add_token(Token("INPUT", "input"))
lexer.add_token(Token("LPAREN", "("))
lexer.add_token(Token("RPAREN", ")"))
lexer.build()
```
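When you have many tokens, it can be convenient to build the token list from a plain dict instead of writing one `Token(...)` call per line. The helper below is an illustrative sketch, not part of LexBuilder's API; `make_tokens` and `spec` are made-up names, and the only assumption is the `Token(name, value)` constructor shown above:

```python
# Illustrative helper (not part of LexBuilder): turn a {name: value} dict
# into a list of token objects built with token_cls(name, value).
def make_tokens(spec, token_cls):
    return [token_cls(name, value) for name, value in spec.items()]
```

With the real library this would be used as `PyBuilder(make_tokens({"PRINT": "print", "INPUT": "input"}, Token))`.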
### Use Lexer:
```python
from Lexer import *
code = 'print("Hello, world!")'
lexer = Lexer(code)
token = lexer.get_next_token()
print(token)
while token.type != EOF:
    token = lexer.get_next_token()
    print(token)
```
Output:
```text
Token(PRINT, "print")
Token(LPAREN, "(")
Token(STRING, "Hello, world!")
Token(RPAREN, ")")
Token(EOF, None)
```
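The read-until-EOF loop above is a common pattern, and it can be wrapped in a small generator so the token stream is consumable with a plain `for` loop. This wrapper is a sketch, not part of LexBuilder; it only assumes the `get_next_token()` method and the token `.type` attribute shown in the example:

```python
# Illustrative wrapper (not part of LexBuilder): yield tokens from any object
# exposing get_next_token(), stopping after the EOF token has been yielded.
def iter_tokens(lexer, eof_type="EOF"):
    while True:
        token = lexer.get_next_token()
        yield token
        if token.type == eof_type:
            break
```

Usage would then read `for token in iter_tokens(Lexer(code)): print(token)`.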
#
## C++ Example:
### Generate Lexer:
```python
from LexBuilder.Builder import CppBuilder, Token
PRINT = Token("PRINT", "print")
tokens = [PRINT]
lexer = CppBuilder(tokens)
lexer.add_token(Token("INPUT", "input"))
lexer.add_token(Token("LPAREN", "("))
lexer.add_token(Token("RPAREN", ")"))
lexer.build()
```
### Use Lexer:
```cpp
#include "Lexer.h"
using namespace std;
int main() {
    string code = "5 + 5";
    Lexer lexer(code);
    Token current_token = lexer.get_next_token();
    cout << current_token << endl;
    while (current_token.type != EOF_TOKEN) {
        current_token = lexer.get_next_token();
        cout << current_token << endl;
    }
}
```
Output:
```text
Token(INT_NUMBER, 5)
Token(PLUS, +)
Token(INT_NUMBER, 5)
Token(EOF, )
```