gpttrace

Name: gpttrace
Version: 0.1.2
Summary: Generate eBPF programs and tracing with ChatGPT and natural language.
Author email: eunomia-bpf <2629757717@qq.com>
Project homepage: https://github.com/eunomia-bpf/GPTtrace
Upload time: 2023-07-09 13:13:45
Requires Python: >=3.6
Keywords: cli, gpt, openai, productivity, shell
Requirements: langchain==0.0.227, llama_index==0.7.3, marko==2.0.0, openai==0.27.8, prompt_toolkit==3.0.38, Pygments==2.15.1, Pygments==2.14.0, pygments_markdown_lexer==0.1.0.dev39
# GPTtrace 🤖

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Actions Status](https://github.com/eunomia-bpf/GPTtrace/workflows/Pylint/badge.svg)](https://github.com/eunomia-bpf/GPTtrace/actions)
[![DeepSource](https://deepsource.io/gh/eunomia-bpf/eunomia-bpf.svg/?label=active+issues&show_trend=true&token=rcSI3J1-gpwLIgZWtKZC-N6C)](https://deepsource.io/gh/eunomia-bpf/eunomia-bpf/?ref=repository-badge)
[![CodeFactor](https://www.codefactor.io/repository/github/eunomia-bpf/eunomia-bpf/badge)](https://www.codefactor.io/repository/github/eunomia-bpf/eunomia-bpf)

Generate eBPF programs and trace your Linux system with ChatGPT and natural language

## Key Features 💡

### Interact with and trace your Linux system in natural language; GPTtrace can show you how to write eBPF programs in `BCC` and `libbpf` styles.

Example: tracing with the query "Count page faults by process"

<img src="doc/result.png" alt="Image" width="600">

### Generate eBPF programs with natural language

```shell
$ ./GPTtrace.py -g "Write a program that installs a tracepoint handler which is triggered by write syscall"
```

<img src="doc/generate.png" alt="Image" width="600">


The generated eBPF program is stored in the `generate.bpf.c` file, and you can compile it with the `clang` or `ecc` toolchains.
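If you want to drive that compilation from a script, here is a minimal sketch. It is illustrative only: the file name `generate.bpf.c`, the `__TARGET_ARCH_x86` define, and any include paths (vmlinux.h, libbpf headers) depend on your system and on how GPTtrace was invoked.

```python
# Illustrative helper: compile the eBPF source emitted by GPTtrace with clang.
# The flags are the common libbpf-style invocation; adjust the target arch
# define and include paths to match your own system.
import subprocess

def compile_bpf(src: str = "generate.bpf.c", obj: str = "generate.bpf.o") -> None:
    subprocess.run(
        ["clang", "-O2", "-g", "-target", "bpf",
         "-D__TARGET_ARCH_x86", "-c", src, "-o", obj],
        check=True,
    )

if __name__ == "__main__":
    compile_bpf()
```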

For detailed documentation and tutorials on how we teach ChatGPT to write eBPF programs, please refer to [`bpf-developer-tutorial`](https://github.com/eunomia-bpf/bpf-developer-tutorial) (a libbpf tutorial used to teach ChatGPT to write eBPF programs).

### Specify a command-line tool to complete the tracing task

```console
$ ./GPTtrace.py -c memleak-bpfcc "Trace allocations and display each individual allocator function call"
 Run:  sudo memleak-bpfcc --trace 
Attaching to kernel allocators, Ctrl+C to quit.
(b'Relay(35)', 402, 6, b'd...1', 20299.252425, b'alloc exited, size = 4096, result = ffff8881009cc000')
(b'Relay(35)', 402, 6, b'd...1', 20299.252425, b'free entered, address = ffff8881009cc000, size = 4096')
(b'Relay(35)', 402, 6, b'd...1', 20299.252426, b'free entered, address = 588a6f, size = 4096')
(b'Relay(35)', 402, 6, b'd...1', 20299.252427, b'alloc entered, size = 4096')
(b'Relay(35)', 402, 6, b'd...1', 20299.252427, b'alloc exited, size = 4096, result = ffff8881009cc000')
(b'Relay(35)', 402, 6, b'd...1', 20299.252428, b'free entered, address = ffff8881009cc000, size = 4096')
(b'sudo', 6938, 10, b'd...1', 20299.252437, b'alloc entered, size = 2048')
(b'sudo', 6938, 10, b'd...1', 20299.252439, b'alloc exited, size = 2048, result = ffff88822e845800')
(b'node', 410, 18, b'd...1', 20299.252455, b'alloc entered, size = 256')
(b'node', 410, 18, b'd...1', 20299.252457, b'alloc exited, size = 256, result = ffff8882e9b66400')
(b'node', 410, 18, b'd...1', 20299.252458, b'alloc entered, size = 2048')
```

**Note that `GPTtrace` is currently only a demo project to show how the approach works: the results may not be accurate, and it is not recommended for production use. We are working to make it more stable and complete!**

## Usage and Setup 🛠

```console
$ gpttrace
usage: GPTtrace [-h] [-i] [-c CMD_NAME QUERY] [-e EXEC_QUERY] [-g GEN_QUERY] [-v] [-k OPENAI_API_KEY] [-t]

Use ChatGPT to write eBPF programs (bpftrace, etc.)

options:
  -h, --help            show this help message and exit
  -i, --info            Let ChatGPT explain what's eBPF
  -c CMD_NAME QUERY, --cmd CMD_NAME QUERY
                        Use the bcc tool to complete the trace task
  -e EXEC_QUERY, --execute EXEC_QUERY
                        Generate commands using your input with ChatGPT, and run it
  -g GEN_QUERY, --generate GEN_QUERY
                        Generate eBPF programs using your input with ChatGPT
  -v, --verbose         Show more details
  -k OPENAI_API_KEY, --key OPENAI_API_KEY
                        Openai api key, see `https://platform.openai.com/docs/quickstart/add-your-api-key` or passed through `OPENAI_API_KEY`
  -t, --train           Train ChatGPT with conversions we provided
```

### First: set up your OpenAI API key

- Visit https://platform.openai.com/docs/quickstart/add-your-api-key, then create your OpenAI API key as shown below:

  ![image-20230402163041886](doc/api-key.png)

- Save your key, then set it in the `OPENAI_API_KEY` environment variable or pass it with the `-k` option (see the sketch below for how a CLI might resolve it).
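For illustration, the key lookup in a CLI like this might boil down to something along these lines. This is a sketch only; the fallback order is an assumption, not GPTtrace's documented behaviour.

```python
# Sketch: resolve the OpenAI API key from -k/--key or the environment,
# then configure the openai client (openai==0.27.x style).
import os
import sys
from typing import Optional

import openai

def resolve_api_key(cli_key: Optional[str] = None) -> str:
    key = cli_key or os.environ.get("OPENAI_API_KEY")
    if not key:
        sys.exit("No OpenAI API key found: pass -k/--key or set OPENAI_API_KEY")
    return key

openai.api_key = resolve_api_key()
```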

### Use prompts to teach ChatGPT to write eBPF programs

```console
$ ./GPTtrace.py --train
/home/todo/intership/GPTtrace/vector_index.josn not found. Training...
INFO:llama_index.token_counter.token_counter:> [build_index_from_documents] Total LLM token usage: 0 tokens
INFO:llama_index.token_counter.token_counter:> [build_index_from_documents] Total embedding token usage: 4185 tokens
Training completed, /home/todo/intership/GPTtrace/vector_index.josn has been saved.
```

When you specify the `--train` option, GPTtrace searches the prepared documents for the most relevant information and sends it to ChatGPT as additional context, enabling ChatGPT to write eBPF programs in bpftrace, libbpf, and BCC styles. You can also do this manually by pasting the prompts into the ChatGPT website.
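Conceptually, the training step builds and persists a vector index over the tutorial documents with llama_index (as the log above shows). The snippet below is a minimal sketch of that idea using llama_index's high-level API; the directory names are hypothetical, and the exact classes and persistence format differ between llama_index releases and from GPTtrace's own code.

```python
# Sketch of the "--train" idea: embed the prepared documents once, persist the
# index, and retrieve relevant context at query time.
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Load and chunk the prepared eBPF tutorial documents (hypothetical directory).
documents = SimpleDirectoryReader("doc/tutorial").load_data()

# Embed each chunk (calls the OpenAI embedding API) and build the vector index.
index = VectorStoreIndex.from_documents(documents)

# Persist the index so later runs can skip re-embedding.
index.storage_context.persist(persist_dir="./vector_index")

# At query time, retrieve the most relevant chunks and let the LLM answer.
query_engine = index.as_query_engine()
print(query_engine.query("How do I count page faults by process with eBPF?"))
```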

### Start your tracing! 🚀

For example:

```sh
./GPTtrace.py -e "Count page faults by process"
```

If the eBPF program cannot be loaded into the kernel, the error message is sent back to ChatGPT so it can correct the program, and the result is printed to the console.
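Roughly, that correction loop looks like the following. This is a sketch under assumptions: the openai 0.27-style `ChatCompletion` API, a stand-in `run_bpf_program` helper (here it only compiles with clang rather than loading into the kernel), and a fixed retry count; GPTtrace's actual loop differs in its details.

```python
# Sketch of the correct-on-error loop: ask for a program, try to build it,
# and feed any compiler/loader error back to the model for another attempt.
import subprocess
import tempfile

import openai

def run_bpf_program(source: str):
    """Hypothetical stand-in: try to compile the program; return (ok, output)."""
    with tempfile.NamedTemporaryFile("w", suffix=".bpf.c", delete=False) as f:
        f.write(source)
        path = f.name
    proc = subprocess.run(
        ["clang", "-O2", "-g", "-target", "bpf", "-c", path, "-o", "/dev/null"],
        capture_output=True, text=True,
    )
    return proc.returncode == 0, proc.stdout + proc.stderr

def generate_and_fix(user_query: str, max_retries: int = 3) -> str:
    messages = [
        {"role": "system", "content": "You write eBPF programs."},
        {"role": "user", "content": user_query},
    ]
    for _ in range(max_retries):
        reply = openai.ChatCompletion.create(
            model="gpt-3.5-turbo", messages=messages,
        )["choices"][0]["message"]["content"]
        ok, output = run_bpf_program(reply)
        if ok:
            return reply
        # Send the error back so the model can correct its previous answer.
        messages.append({"role": "assistant", "content": reply})
        messages.append(
            {"role": "user", "content": f"Building failed with:\n{output}\nPlease fix the program."}
        )
    raise RuntimeError("Could not produce a loadable eBPF program")
```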

## How it works

Step 1: Prepare the documents, convert them to plain text, and cut them into small chunks.

Step 2: Call the text-to-vector (embedding) interface to convert each chunk into a vector and store it in the vector database.

Step 3: When a user enters a request in natural language, convert the request into a vector and search the vector database for the most relevant chunk or chunks.

Step 4: Merge the request and the retrieved chunks into a new prompt, and call the ChatGPT API to generate an eBPF program. The generated program is then executed via the shell or written to a file for compilation and execution.

Step 5: If compilation or loading fails, the error is sent back to ChatGPT to generate a corrected eBPF program or command (see the correction-loop sketch in the tracing section above).
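To make Steps 1-4 concrete, here is a condensed sketch using the raw OpenAI API. GPTtrace itself delegates the indexing and retrieval to llama_index, as the `--train` log above shows; the embedding model name, the in-memory "database", and the prompt wording below are illustrative assumptions.

```python
# Condensed sketch of Steps 1-4: embed chunks, retrieve by cosine similarity,
# merge the request with the best chunk, and ask ChatGPT for an eBPF program.
import math

import openai

EMBED_MODEL = "text-embedding-ada-002"  # assumed embedding model

def embed(texts):
    resp = openai.Embedding.create(input=texts, model=EMBED_MODEL)
    return [item["embedding"] for item in resp["data"]]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Steps 1-2: chunk the documents and embed each chunk (a list stands in for the vector DB).
chunks = ["<tutorial chunk about tracepoints>", "<tutorial chunk about page faults>"]
chunk_vectors = embed(chunks)

# Step 3: embed the user's request and pick the most relevant chunk.
request = "Count page faults by process"
request_vector = embed([request])[0]
best_chunk = max(zip(chunks, chunk_vectors),
                 key=lambda cv: cosine(request_vector, cv[1]))[0]

# Step 4: merge request + retrieved context into one prompt and call the ChatGPT API.
answer = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": f"Reference material:\n{best_chunk}\n\nWrite an eBPF program to {request.lower()}.",
    }],
)["choices"][0]["message"]["content"]
print(answer)
```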

## Room for improvement

There is still plenty of room for improvement, including:

1. Once ChatGPT can search online, the tool could fetch sample programs from the bcc/bpftrace repositories and learn from them, or consult Stack Overflow for examples of how to write eBPF programs, similar to the approach used by the new Bing search.
2. Providing more high-quality documentation and tutorials to improve the accuracy of the output and the quality of the code examples.
3. Making multiple calls to other tools to execute commands and return results. For example, GPTtrace could output a command, have bpftrace query the current kernel version and supported tracepoints, and return the output as part of the conversation.
4. Incorporating user feedback to improve the quality of the generated code and refine the natural language processing capabilities of the tool.

In addition, newer LLM models will certainly lead to more realistic and accurate language generation.

## Installation 🔧

```sh
pip install gpttrace
```

## Examples

- Files opened by process
- Syscall count by program
- Read bytes by process
- Read size distribution by process
- Show per-second syscall rates
- Trace disk size by process
- Count page faults by process
- Count LLC cache misses by process name and PID (uses PMCs)
- Profile user-level stacks at 99 Hertz, for PID 189
- Files opened, for processes in the root cgroup-v2

## LICENSE

MIT

## 🔗 Links

- Detailed documentation and tutorials on how we teach ChatGPT to write eBPF programs: https://github.com/eunomia-bpf/bpf-developer-tutorial (a CO-RE "Compile Once, Run Everywhere" libbpf-based eBPF developer tutorial: learn eBPF step by step through 20 small tools, and an attempt to teach ChatGPT to write eBPF programs)
- bpftrace: https://github.com/iovisor/bpftrace
- ChatGPT: https://chat.openai.com/
- Python API: https://github.com/mmabrouk/chatgpt-wrapper

            
