| Field | Value |
| --- | --- |
| Name | neuronic |
| Version | 0.4.0 |
| home_page | None |
| Summary | Neuronic - AI-powered data transformation library |
| upload_time | 2024-12-21 12:40:02 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.8 |
| license | None |
| keywords | data, transformation, ai, openai, gpt |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# Neuronic 🧠
<p align="center">
<img src="https://raw.githubusercontent.com/level09/neuronic/main/.github/images/neuronic.png" alt="Neuronic Logo" width="440"/>
</p>
Imagine Python functions that can think, reason, and create - functions that understand natural language, analyze complex data patterns, and generate creative solutions. Welcome to Neuronic - where we transform ordinary Python code into intelligent, AI-powered modules that bring GPT's cognitive capabilities directly into your codebase, complete with enterprise-grade validation, caching, and error handling.
## ✨ Magic in Three Lines
```python
from neuronic import Neuronic

neuronic = Neuronic()

@neuronic.function(output_type="json")
def convert_to_vegetarian(recipe: str) -> dict:
    """Convert any food recipe to its vegetarian version while maintaining the flavor profile."""
    pass

# That's it! Your function is now AI-powered 🌱
veggie_recipe = convert_to_vegetarian("Classic Beef Burger with bacon")
```
## 🌟 Real-Time Streaming
Watch your AI functions generate responses in real-time:
```python
from neuronic import Neuronic

neuronic = Neuronic()

@neuronic.function(output_type="string", stream=True)
def write_story(prompt: str) -> str:
    """Write a creative story based on the given prompt."""
    pass

# See the story being written in real-time
for chunk in write_story("A curious robot discovers an ancient library"):
    print(chunk, end="")

# Or use the transform method directly
for chunk in neuronic.transform(
    data="Explain quantum computing",
    instruction="Write a clear, step-by-step explanation",
    stream=True
):
    print(chunk, end="")
```
## 🌟 What is Neuronic?
Neuronic is your gateway to building intelligent Python applications powered by GPT-4o. Create functions that can understand context, extract insights, and solve complex problems - all while maintaining the reliability and predictability of traditional programming. With built-in validation, type checking, and caching, Neuronic makes AI as dependable as any other Python module.
## 🚀 Features
- **🎯 Smart Function Decorator:** Transform any Python function into an AI-powered one with a simple decorator
- **🧠 Real-Time Streaming:** Watch AI responses generate in real-time with built-in streaming support
- **🧠 Intelligent Understanding:** Create functions that truly understand your data, extracting meaning and insights
- **🗣️ Natural Language Processing:** Process text like a human - analyze sentiment, extract key information, and understand context
- **✍️ Creative Generation:** Generate human-quality content, from documentation to test data, tailored to your specifications
- **🔍 Pattern Recognition:** Uncover hidden patterns and relationships in your data through GPT-powered analysis
- **📦 Multiple Output Types:** Get results in any format you need - strings, numbers, JSON, lists, booleans, or Python structures (see the sketch after this list)
- **🏢 Enterprise Ready:** Built-in validation, type checking, and caching ensure production-ready outputs
- **⚡ Performance Optimized:** Automatic chunking for large inputs and smart caching for repeated operations
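As a quick illustration of the output types mentioned above, here is a minimal sketch using the two `output_type` values that appear elsewhere in this README (`"string"` and `"json"`); the identifiers for the other listed types are not shown here.
```python
from neuronic import Neuronic

neuronic = Neuronic()

# Sketch only: the two output_type values used elsewhere in this README.
@neuronic.function(output_type="string")
def summarize(text: str) -> str:
    """Summarize the given text in one sentence."""
    pass

@neuronic.function(output_type="json")
def extract_entities(text: str) -> dict:
    """List the people and places mentioned in the text as structured JSON."""
    pass
```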
## 🛠️ Quick Start
### Installation
```bash
pip install neuronic
```
### Configuration
```python
# Option 1: Set an environment variable (e.g. in a .env file):
#   OPENAI_API_KEY=your-openai-api-key-here

# Option 2: Pass the API key directly
from neuronic import Neuronic

neuronic = Neuronic(api_key="your-api-key-here")
```
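For Option 1, one way to load the key is sketched below; it assumes `python-dotenv` is installed and that Neuronic falls back to the `OPENAI_API_KEY` environment variable when no key is passed.
```python
from dotenv import load_dotenv
from neuronic import Neuronic

load_dotenv()          # reads OPENAI_API_KEY from the .env file into the environment
neuronic = Neuronic()  # no api_key argument; the key is picked up from the environment
```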
## 🎮 Cool Examples
### 🪄 Smart Function Decorator
Transform any Python function into an AI-powered one:
```python
@neuronic.function(output_type="string")
def generate_mini_story(character: str, setting: str, mood: str) -> str:
    """Create a three-sentence story based on the given character, setting, and mood."""
    pass

story = generate_mini_story(
    character="a curious robot",
    setting="ancient library",
    mood="mysterious"
)
```
### 🧹 Smart Data Cleaning
Clean and standardize messy data:
```python
from typing import Dict, List

@neuronic.function(output_type="json")
def clean_user_data(users: List[Dict]) -> List[Dict]:
    """
    Clean and standardize user data:
    - Properly capitalize names
    - Validate and fix email addresses
    - Convert age to numbers
    """
    pass

clean_data = clean_user_data(messy_data)
```
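`messy_data` is not defined in the snippet; a hypothetical input (names, emails, and ages in inconsistent formats) might look like this:
```python
# Hypothetical example input; the field names are assumptions for illustration only.
messy_data = [
    {"name": "jOHN smith", "email": "JOHN@EXAMPLE..COM ", "age": "thirty-two"},
    {"name": "  jane DOE", "email": "jane@example", "age": "28"},
]
```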
### 🎨 Creative Generation
Generate emoji-based character profiles:
```python
emoji_personas = neuronic.generate(
    spec="""Create unique emoji-based character profiles with:
    - Emoji combination for appearance
    - Personality trait
    - Favorite hobby
    - Life motto""",
    n=3
)
```
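Assuming `generate()` returns a list of `n` items (suggested by the `n` parameter, though not stated explicitly here), the profiles can be inspected directly:
```python
for persona in emoji_personas:
    print(persona)
```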
### 📊 Smart Analysis
Analyze text with context and reasoning:
```python
analysis = neuronic.analyze(
    data=movie_review,
    question="What aspects of the movie did the reviewer focus on?"
)
print(f"Answer: {analysis['answer']}")
print(f"Confidence: {analysis['confidence']}")
print(f"Reasoning: {analysis['reasoning']}")
```
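`movie_review` is simply free-form text that would be defined before the call above; a hypothetical value:
```python
# Hypothetical review text, used purely for illustration.
movie_review = (
    "The cinematography was breathtaking and the score was haunting, "
    "but the pacing dragged badly in the second act."
)
```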
## 🎯 Perfect For
- **🔄 Data Processing:** Format conversion, cleaning, normalization
- **📝 Content Creation:** Documentation, test data, sample content
- **📊 Analysis:** Pattern recognition, sentiment analysis, trend detection
- **🛠️ Development:** Code transformation, API handling, validation
## 🧰 API Reference
### Core Methods
```python
# Transform data with streaming
result = neuronic.transform(
    data=input_data,                  # What to transform
    instruction="your instruction",   # How to transform it
    output_type="string",             # What format you want
    stream=True,                      # Enable streaming
    example=None,                     # Optional example
    context=None                      # Optional context
)
for chunk in result:
    print(chunk, end="")

# Analyze data
insights = neuronic.analyze(
    data=your_data,              # What to analyze
    question="your question"     # What to find out
)

# Generate data
new_data = neuronic.generate(
    spec="what to generate",     # What you want
    n=1                          # How many items
)
```
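Without `stream=True`, `transform` returns the finished result in one go; here is a small sketch (the exact parsed return value for each `output_type` is an assumption based on the feature list above):
```python
# Non-streaming usage: the call blocks and returns the parsed result directly.
contacts = neuronic.transform(
    data="john doe, JOHN@EXAMPLE.COM; jane smith, jane@example.com",
    instruction="Extract each contact as an object with 'name' and 'email' fields",
    output_type="json",
)
print(contacts)
```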
### Streaming Functions
```python
# Create streaming functions for long-form content
@neuronic.function(output_type="string", stream=True)
def write_blog_post(topic: str, style: str) -> str:
    """Write a detailed blog post in the specified style."""
    pass

# Watch the blog post being written in real-time
for chunk in write_blog_post(
    topic="The Future of AI",
    style="conversational"
):
    print(chunk, end="")

# Stream JSON responses
@neuronic.function(output_type="json", stream=True)
def analyze_text_stream(text: str) -> dict:
    """Analyze text and return structured data with streaming."""
    pass

# Note: JSON streaming ensures valid JSON output
for chunk in analyze_text_stream("Your text here"):
    print(chunk, end="")
```
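One common way to consume a streamed JSON response (an assumption about how the chunks are intended to be used, since the README does not spell it out) is to accumulate the chunks and parse the complete document once the stream ends:
```python
import json

# Accumulate the streamed chunks, then parse the full JSON at the end
# (assumes each chunk is a plain-text fragment of a single JSON document).
chunks = []
for chunk in analyze_text_stream("Your text here"):
    chunks.append(chunk)
result = json.loads("".join(chunks))
print(result)
```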
## 🎓 Best Practices
1. **🔐 Security First**
   - Keep API keys in environment variables
   - Never commit sensitive data
2. **⚡ Performance Tips**
   - Use caching for repeated operations
   - Enable streaming for long-running operations
   - Batch similar requests when possible
   - Be mindful of token limits when streaming
3. **🛡️ Error Handling**
   - Always handle exceptions gracefully
   - Validate that outputs match expected formats (see the sketch after this list)
   - Handle streaming interruptions properly
4. **🔄 Streaming Guidelines**
   - Use streaming for long-form content generation
   - Handle partial outputs appropriately
   - Consider disabling streaming for small, quick operations
   - Remember that streaming bypasses caching
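A minimal sketch of points 1 and 3; the library's exact exception types are not documented here, so a broad `except` is used purely for illustration:
```python
from neuronic import Neuronic

neuronic = Neuronic()  # key supplied via environment variable, not hard-coded

try:
    words = neuronic.transform(
        data="the quick brown fox",
        instruction="Return the individual words as a JSON array of strings",
        output_type="json",
    )
    # Validate that the output matches the expected format before using it
    if not isinstance(words, list):
        raise ValueError(f"Unexpected output format: {type(words).__name__}")
except Exception as exc:
    # Degrade gracefully instead of letting the failure propagate
    print(f"Transformation failed: {exc}")
    words = []
```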
## 📜 License
MIT License - feel free to use in your own projects!
## 🤝 Contributing
Got ideas? Found a bug? Contributions are welcome! Feel free to:
- Open an issue
- Submit a pull request
- Share your cool use cases
## 🌟 Star Us!
If you find Neuronic useful, give us a star! It helps others discover the project.
Raw data
{
"_id": null,
"home_page": null,
"name": "neuronic",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": null,
"keywords": "data, transformation, AI, OpenAI, GPT",
"author": null,
"author_email": "Nidal Alhariri <level09@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/f2/28/11f2fe9566ef638940c3d5ce15317c0430b19eba0a341740e234787338a0/neuronic-0.4.0.tar.gz",
"platform": null,
"description": "# Neuronic \ud83e\udde0\n\n<p align=\"center\">\n <img src=\"https://raw.githubusercontent.com/level09/neuronic/main/.github/images/neuronic.png\" alt=\"Neuronic Logo\" width=\"440\"/>\n</p>\n\nImagine Python functions that can think, reason, and create - functions that understand natural language, analyze complex data patterns, and generate creative solutions. Welcome to Neuronic - where we transform ordinary Python code into intelligent, AI-powered modules that bring GPT's cognitive capabilities directly into your codebase, complete with enterprise-grade validation, caching, and error handling.\n\n## \u2728 Magic in Three Lines\n\n```python\nfrom neuronic import Neuronic\n\n@neuronic.function(output_type=\"json\")\ndef convert_to_vegetarian(recipe: str) -> dict:\n \"\"\"Convert any food recipe to its vegetarian version while maintaining the flavor profile.\"\"\"\n pass\n\n# That's it! Your function is now AI-powered \ud83c\udf31\nveggie_recipe = convert_to_vegetarian(\"Classic Beef Burger with bacon\")\n```\n\n## \ud83c\udf1f Real-Time Streaming\n\nWatch your AI functions generate responses in real-time:\n\n```python\n@neuronic.function(output_type=\"string\", stream=True)\ndef write_story(prompt: str) -> str:\n \"\"\"Write a creative story based on the given prompt.\"\"\"\n pass\n\n# See the story being written in real-time\nfor chunk in write_story(\"A curious robot discovers an ancient library\"):\n print(chunk, end=\"\")\n\n# Or use the transform method directly\nneuronic = Neuronic()\nfor chunk in neuronic.transform(\n data=\"Explain quantum computing\",\n instruction=\"Write a clear, step-by-step explanation\",\n stream=True\n):\n print(chunk, end=\"\")\n```\n\n## \ud83c\udf1f What is Neuronic?\n\nNeuronic is your gateway to building intelligent Python applications powered by GPT-4o. Create functions that can understand context, extract insights, and solve complex problems - all while maintaining the reliability and predictability of traditional programming. 
With built-in validation, type checking, and caching, Neuronic makes AI as dependable as any other Python module.\n\n## \ud83d\ude80 Features\n\n- **\ud83c\udfaf Smart Function Decorator:** Transform any Python function into an AI-powered one with a simple decorator\n- **\ud83e\udde0 Real-Time Streaming:** Watch AI responses generate in real-time with built-in streaming support\n- **\ud83e\udde0 Intelligent Understanding:** Create functions that truly understand your data, extracting meaning and insights\n- **\ud83d\udde3\ufe0f Natural Language Processing:** Process text like a human - analyze sentiment, extract key information, and understand context\n- **\u270d\ufe0f Creative Generation:** Generate human-quality content, from documentation to test data, tailored to your specifications\n- **\ud83d\udd0d Pattern Recognition:** Uncover hidden patterns and relationships in your data through GPT-powered analysis\n- **\ud83d\udce6 Multiple Output Types:** Get results in any format you need - strings, numbers, JSON, lists, booleans, or Python structures\n- **\ud83c\udfe2 Enterprise Ready:** Built-in validation, type checking, and caching ensure production-ready outputs\n- **\u26a1 Performance Optimized:** Automatic chunking for large inputs and smart caching for repeated operations\n\n## \ud83d\udee0\ufe0f Quick Start\n\n### Installation\n\n```bash\npip install neuronic\n```\n\n### Configuration\n\n```python\n# Option 1: Use environment variables (.env file)\nOPENAI_API_KEY=your-openai-api-key-here\n\n# Option 2: Pass API key directly\nneuronic = Neuronic(api_key=\"your-api-key-here\")\n```\n\n## \ud83c\udfae Cool Examples\n\n### \ud83e\ude84 Smart Function Decorator\nTransform any Python function into an AI-powered one:\n\n```python\n@neuronic.function(output_type=\"string\")\ndef generate_mini_story(character: str, setting: str, mood: str) -> str:\n \"\"\"Create a three-sentence story based on the given character, setting, and mood.\"\"\"\n pass\n\nstory = generate_mini_story(\n character=\"a curious robot\",\n setting=\"ancient library\",\n mood=\"mysterious\"\n)\n```\n\n### \ud83e\uddf9 Smart Data Cleaning\nClean and standardize messy data:\n\n```python\n@neuronic.function(output_type=\"json\")\ndef clean_user_data(users: List[Dict]) -> List[Dict]:\n \"\"\"\n Clean and standardize user data:\n - Properly capitalize names\n - Validate and fix email addresses\n - Convert age to numbers\n \"\"\"\n pass\n\nclean_data = clean_user_data(messy_data)\n```\n\n### \ud83c\udfa8 Creative Generation\nGenerate emoji-based character profiles:\n\n```python\nemoji_personas = neuronic.generate(\n spec=\"\"\"Create unique emoji-based character profiles with:\n - Emoji combination for appearance\n - Personality trait\n - Favorite hobby\n - Life motto\"\"\",\n n=3\n)\n```\n\n### \ud83d\udcca Smart Analysis\nAnalyze text with context and reasoning:\n\n```python\nanalysis = neuronic.analyze(\n data=movie_review,\n question=\"What aspects of the movie did the reviewer focus on?\"\n)\nprint(f\"Answer: {analysis['answer']}\")\nprint(f\"Confidence: {analysis['confidence']}\")\nprint(f\"Reasoning: {analysis['reasoning']}\")\n```\n\n## \ud83c\udfaf Perfect For\n\n- **\ud83d\udd04 Data Processing:** Format conversion, cleaning, normalization\n- **\ud83d\udcdd Content Creation:** Documentation, test data, sample content\n- **\ud83d\udcca Analysis:** Pattern recognition, sentiment analysis, trend detection\n- **\ud83d\udee0\ufe0f Development:** Code transformation, API handling, validation\n\n## \ud83e\uddf0 API 
Reference\n\n### Core Methods\n\n```python\n# Transform data with streaming\nresult = neuronic.transform(\n data=input_data, # What to transform\n instruction=\"your instruction\", # How to transform it\n output_type=\"string\", # What format you want\n stream=True, # Enable streaming\n example=None, # Optional example\n context=None # Optional context\n)\nfor chunk in result:\n print(chunk, end=\"\")\n\n# Analyze data\ninsights = neuronic.analyze(\n data=your_data, # What to analyze\n question=\"your question\" # What to find out\n)\n\n# Generate data\nnew_data = neuronic.generate(\n spec=\"what to generate\", # What you want\n n=1 # How many items\n)\n```\n\n### Streaming Functions\n\n```python\n# Create streaming functions for long-form content\n@neuronic.function(output_type=\"string\", stream=True)\ndef write_blog_post(topic: str, style: str) -> str:\n \"\"\"Write a detailed blog post in the specified style.\"\"\"\n pass\n\n# Watch the blog post being written in real-time\nfor chunk in write_blog_post(\n topic=\"The Future of AI\",\n style=\"conversational\"\n):\n print(chunk, end=\"\")\n\n# Stream JSON responses\n@neuronic.function(output_type=\"json\", stream=True)\ndef analyze_text_stream(text: str) -> dict:\n \"\"\"Analyze text and return structured data with streaming.\"\"\"\n pass\n\n# Note: JSON streaming ensures valid JSON output\nfor chunk in analyze_text_stream(\"Your text here\"):\n print(chunk, end=\"\")\n```\n\n## \ud83c\udf93 Best Practices\n\n1. **\ud83d\udd10 Security First**\n - Keep API keys in environment variables\n - Never commit sensitive data\n\n2. **\u26a1 Performance Tips**\n - Use caching for repeated operations\n - Enable streaming for long-running operations\n - Batch similar requests when possible\n - Be mindful of token limits when streaming\n\n3. **\ud83d\udee1\ufe0f Error Handling**\n - Always handle exceptions gracefully\n - Validate outputs match expected formats\n - Handle streaming interruptions properly\n\n4. **\ud83d\udd04 Streaming Guidelines**\n - Use streaming for long-form content generation\n - Handle partial outputs appropriately\n - Consider disabling streaming for small, quick operations\n - Remember streaming bypasses caching\n\n## \ud83d\udcdc License\n\nMIT License - feel free to use in your own projects!\n\n## \ud83e\udd1d Contributing\n\nGot ideas? Found a bug? Contributions are welcome! Feel free to:\n- Open an issue\n- Submit a pull request\n- Share your cool use cases\n\n## \ud83c\udf1f Star Us!\nIf you find Neuronic useful, give us a star! It helps others discover the project.",
"bugtrack_url": null,
"license": null,
"summary": "Neuronic - AI-powered data transformation library",
"version": "0.4.0",
"project_urls": {
"Documentation": "https://github.com/level09/neuronic#readme",
"Home": "https://github.com/level09/neuronic",
"Source": "https://github.com/level09/neuronic"
},
"split_keywords": [
"data",
" transformation",
" ai",
" openai",
" gpt"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "da1e9be898ccec8bcd74a53c28469c44be075badac434941b4f3ad2fa2828b74",
"md5": "cb50eb904a3a2e56b78fa9fac6694b7b",
"sha256": "6c77cdd82d2ec71da1c98078709c28b2b3def40fabefae415b239e81a8a93d71"
},
"downloads": -1,
"filename": "neuronic-0.4.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "cb50eb904a3a2e56b78fa9fac6694b7b",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 9864,
"upload_time": "2024-12-21T12:39:57",
"upload_time_iso_8601": "2024-12-21T12:39:57.179959Z",
"url": "https://files.pythonhosted.org/packages/da/1e/9be898ccec8bcd74a53c28469c44be075badac434941b4f3ad2fa2828b74/neuronic-0.4.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "f22811f2fe9566ef638940c3d5ce15317c0430b19eba0a341740e234787338a0",
"md5": "f1cb7e924cb70bf5e29abe1c14ae704f",
"sha256": "337f8db3921daead3e75d0c2f9ad6443f5e17aef6cd37a11f8437adb0ebb79fe"
},
"downloads": -1,
"filename": "neuronic-0.4.0.tar.gz",
"has_sig": false,
"md5_digest": "f1cb7e924cb70bf5e29abe1c14ae704f",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 1635338,
"upload_time": "2024-12-21T12:40:02",
"upload_time_iso_8601": "2024-12-21T12:40:02.038613Z",
"url": "https://files.pythonhosted.org/packages/f2/28/11f2fe9566ef638940c3d5ce15317c0430b19eba0a341740e234787338a0/neuronic-0.4.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-12-21 12:40:02",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "level09",
"github_project": "neuronic#readme",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "neuronic"
}