# kibernikto
Kibernikto is a framework for easily running Telegram bots connected to AI models, with additional features.
You can run Kibernikto with your own parameters or use it as the core of your own app.
Combine instances to orchestrate your Kibernikto behaviour and tool calls.
The base `OpenAiExecutor` class can easily be extended for use outside Telegram.
- ✍️ Telegram conversations with AIs, in groups or privately, via the OpenAI API spec
- ⚙️ easy configuration
- 🔉 voice message recognition
- 🧐 user message logging to a service group
- 📸 image recognition
- 🫡 easy OpenAI function tool integration: [planner](https://github.com/solovieff/kibernikto-planner), [brave search](https://github.com/solovieff/kibernikto-brave-search).
Given an image, Kibernikto will publish it to a free image hosting service and then process it as a link.
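As a rough sketch of that flow: free hosts such as imgbb accept a base64-encoded image via their upload endpoint. The helper below only builds the request payload; the function name and the payload shape shown here are illustrative, not kibernikto's actual upload code.

```python
import base64

def build_imgbb_payload(image_bytes: bytes, api_key: str) -> dict:
    # imgbb's upload endpoint (https://api.imgbb.com/1/upload) takes an API
    # `key` and a base64-encoded `image` field; the actual HTTP POST is omitted.
    return {
        "key": api_key,
        "image": base64.b64encode(image_bytes).decode("ascii"),
    }

payload = build_imgbb_payload(b"\x89PNG...", "YOUR_IMGBB_KEY")
print(sorted(payload))  # → ['image', 'key']
```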
# install from pip
`pip install kibernikto`
# how to run
- Set up your [env](/env_examples/) file
```shell
kibernikto --env_file_path=/path/to/your/env/local.env
```
**From code**

- Install the requirements:
  `pip install -r requirements.txt`
- Run `main.py` using the environment provided.
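Under the hood, `--env_file_path` loads KEY=VALUE pairs into the process environment (kibernikto depends on python-dotenv for this). A naive stdlib-only illustration of what that loading amounts to — not kibernikto's actual code:

```python
import os

def load_env_file(path: str) -> dict:
    """Naive .env parser: KEY=VALUE lines; blank lines and '#' comments
    are skipped. python-dotenv handles many more edge cases."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip().strip('"')
    # make the values visible to the rest of the process
    os.environ.update(values)
    return values
```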
# useful links
- To create and operate your bot: https://t.me/BotFather
- To obtain group/user IDs: https://t.me/getidsbot
- To obtain sticker IDs: https://t.me/idstickerbot
- To get familiar with basic OpenAI principles: https://openai.com
- To find out more about models and multi-model API details: https://vsegpt.ru/Docs/Models
- Website-to-text and other helpful tools: https://toolsyep.com
- Free image hosting: https://imgbb.com
# FAQ
- How do I run Kibernikto Telegram Instance from my code?
```python
# import the bot class
from kibernikto.bots.cybernoone import Kibernikto

# commands and service are imported for their side effects
# (they register the command and service handlers)
from kibernikto.telegram import dispatcher
from kibernikto.telegram import commands
from kibernikto.telegram import service

bot_class = Kibernikto
dispatcher.start(bot_class=bot_class)
```
You can create your own bots and dispatchers and use them in the same way.
- How do I run it without the dispatcher?
You can use `OpenAIExecutor` directly to create any AI-connected bots.
For example (change the URL to use a different AI provider):
```python
import asyncio
from kibernikto.interactors import DEFAULT_CONFIG, OpenAIExecutor
config = DEFAULT_CONFIG
config.key = "yr_key"
config.url = "https://api.openai.com/v1"
config.model = "gpt-4.1"
config.max_tokens = 760
config.who_am_i = "You are mister Kibernikto"
executor = OpenAIExecutor(unique_id="kbnkt", config=config)
response = asyncio.run(executor.request_llm(message="Good morning mister kibernikto!"))
print(response)
```
- I want to make Kibernikto use my tools!
Look at the [planner](https://github.com/solovieff/kibernikto-planner) example. It's easy.
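Such tools follow the standard OpenAI function-calling schema (`"type": "function"` with a JSON-Schema `parameters` object). A minimal, hypothetical tool definition — the tool name and parameters below are made up for illustration and are not part of kibernikto:

```python
# A hypothetical OpenAI-style function tool definition. The schema shape is
# the standard OpenAI Chat Completions format; the tool itself is invented.
search_tool = {
    "type": "function",
    "function": {
        "name": "brave_search",
        "description": "Search the web and return top results.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search query"},
            },
            "required": ["query"],
        },
    },
}

print(search_tool["function"]["name"])  # → brave_search
```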
- I want to extend kibernikto
```python
import asyncio
import logging
import traceback
from typing import Literal
from openai import NOT_GIVEN, AsyncOpenAI
from kibernikto.bots.cybernoone import Kibernikto
from kibernikto.interactors import OpenAIRoles, OpenAiExecutorConfig
from kibernikto.telegram.telegram_bot import KiberniktoChatInfo
DEFAULT_SYSTEM = """
You are a noble independent cybernetic assistant named {0}.
You have access to LLM-agents for solving various tasks via delegate_task function.
When receiving a request, you must:
1. Determine if any agents can be useful for completing the task.
2. Use these agents (for example: when discussing files, always try to refer to the document agent) to obtain necessary information or perform actions.
3. Provide the user with an accurate and helpful response to their request.
'delegate_task' function
Use 'delegate_task' function to delegate tasks to the appropriate AI agents according to user orders and your common sense.
[AGENTS DESCRIPTION GOES HERE]
"""
__GLOBAL_ASYNC_CLIENT = None
def get_client(config):
global __GLOBAL_ASYNC_CLIENT
if __GLOBAL_ASYNC_CLIENT is None:
__GLOBAL_ASYNC_CLIENT = AsyncOpenAI(base_url=config.url, api_key=config.key, max_retries=config.max_retries)
return __GLOBAL_ASYNC_CLIENT
class Kiberkalki(Kibernikto):
TOOL_SEPARATOR = "||"
def __init__(self, master_id: str, username: str, config: OpenAiExecutorConfig, key: str = NOT_GIVEN,
chat_info: KiberniktoChatInfo = None):
# better not to change from env.
config.who_am_i = DEFAULT_SYSTEM
# for running delegation tasks not to delegate new until done
self.delegate_task_info = None
# for additional notifications like payment etc
self.last_notification = None
        self.session_call_iteration = 0
openai_client = get_client(config)
        # Your expert agents: OpenAIExecutor instances like this one (your own
        # classes), called from tools via delegate_task.
self.knowledge_expert = KnowledgeExpert(unique_key=key)
self.web_expert = WebExpert(unique_key=key)
self.scheduler_expert = SchedulerExpert(unique_key=key, chat_info=chat_info)
self.conversation_expert = ConversationExpert(unique_key=key, chat_info=chat_info)
self.report_expert = ReportExpert(unique_key=key)
self.image_expert = ImageExpert(unique_key=key, chat_info=chat_info)
        # any other params you may need
self.tts_enabled = False
self.attached_file = None
super().__init__(config=config,
username=username,
master_id=master_id,
key=key,
hide_errors=False,
chat_info=chat_info,
client=openai_client)
self.load_history() # persistent history
@property
def default_headers(self):
hidden_key = "Kibernikto1"
return {
"X-Title": f"{self.full_config.app_id}.{hidden_key}",
}
async def request_llm(self, message, save_to_history=True,
response_type: Literal['text', 'json_object'] = 'text',
additional_content=None, **kwargs):
        # enhance your message as you wish
enhanced_message = f"{message} today at Hollywood"
print(f"- REQUEST with {self.model}\n{enhanced_message} \n======")
try:
response_message = await super().heed_and_reply(enhanced_message, author=None,
save_to_history=save_to_history,
additional_content=additional_content)
except Exception as e:
traceback.print_exc()
            response_message = "I failed! Woe is me! I'm sure everything will be fine soon! 😈"
return response_message
def get_cur_system_message(self):
cur_system_dict = self.about_me.copy()
cur_system_dict['content'] += "Any content to put to system prompt at every call"
return cur_system_dict
def get_config_key(self) -> int:
"""
        :return: the key used to separate DB entries for different user configs
"""
return int(self.unique_id)
def _reset(self):
super()._reset()
self.update_system_chat_data()
def save_to_history(self, this_message: dict, usage_dict: dict = None, author=NOT_GIVEN):
if usage_dict is None:
return None
log_key: int = self.get_config_key()
try:
# saving logs
persist_history_entry(this_message=this_message, tg_id=log_key, usage_dict=usage_dict)
except Exception as e:
            print(f"failed to persist the log {log_key}: {e}")
super().save_to_history(this_message, usage_dict, author=author)
def update_system_chat_data(self):
"""
        Adds the delegate agents' data to the system message.
"""
wai = self.full_config.who_am_i.format(self.full_config.name)
self.about_me = dict(role=OpenAIRoles.system.value, content=f"{wai}")
print(self.about_me)
def load_history(self):
log_key = self.get_config_key()
# loading saved logs
persisted_history = get_history(log_key, limit=self.max_messages - 4)
logging.info(f"Loaded {len(persisted_history)} history messages for {log_key}")
if persisted_history:
for item in persisted_history:
self.messages.append(item)
```
**Env Example**
```dotenv
########################
# TELEGRAM
########################
TG_BOT_KEY=XXXXXXXXXX:XXXxxxXXXxxxxXXXxxx
# can talk privately to anyone
TG_PUBLIC=true
TG_MASTER_ID=XXXXXXXXX
TG_MASTER_IDS=[XXXXXXXXX, XXXXXXXXX]
# can talk in these groups only
TG_FRIEND_GROUP_IDS=[-XXXXXXXXX, -XXXXXXXXX]
# reacts to these messages in groups
TG_REACTION_CALLS=["hello","hi"]
# sometimes sends random sticker from this list
TG_STICKER_LIST=["CAACAgIAAxkBAAELx29l_2OsQzpRWhmXTIMBM4yekypTOwACdgkAAgi3GQI1Wnpqru6xgTQE"]
# whether to send a starting message to the master
TG_SAY_HI=true
# split big answers into several messages
TG_CHUNK_SENTENCES=13
TG_FILES_LOCATION=/tmp
########################
# OPENAI CLIENT
########################
OPENAI_INSTANCE_ID=kibernikto
OPENAI_API_KEY=sk-XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
# https://api.vsegpt.ru:6070/v1 for vsegpt
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_API_MODEL=gpt-4.1
OPENAI_MAX_TOKENS=550
OPENAI_TEMPERATURE=0.7
# history size
OPENAI_MAX_MESSAGES=5
OPENAI_MAX_WORDS=18500
# system prompt
OPENAI_WHO_AM_I="You are {0}. Respond in the style of Alexander Sergeyevich Pushkin, but with a verse probability of no more than 30 percent."
# enable if you have tools
OPENAI_TOOLS_ENABLED=true
########################
# VOICE PROCESSING
########################
VOICE_OPENAI_API_KEY=sk-XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
VOICE_PROCESSOR=openai
# stt-openai/whisper-1 for vsegpt
VOICE_OPENAI_API_MODEL=whisper-1
VOICE_OPENAI_API_BASE_URL=https://api.openai.com/v1
VOICE_FILE_LOCATION=/tmp
```
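`TG_CHUNK_SENTENCES` controls how long replies are split into several Telegram messages. A naive illustration of sentence-based chunking — not kibernikto's actual implementation:

```python
import re

def chunk_sentences(text: str, sentences_per_chunk: int = 13) -> list[str]:
    # Split on sentence-ending punctuation followed by whitespace,
    # then regroup into chunks of `sentences_per_chunk` sentences.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [
        " ".join(sentences[i:i + sentences_per_chunk])
        for i in range(0, len(sentences), sentences_per_chunk)
    ]

print(chunk_sentences("One. Two. Three. Four.", sentences_per_chunk=2))
# → ['One. Two.', 'Three. Four.']
```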