# nstreamai

- **Name**: nstreamai
- **Version**: 0.0.14
- **Home page**: https://github.com/nstream-ai/ms-public-sdk
- **Summary**: Official SDK for nstream ai stream processing powered by Gen AI
- **Upload time**: 2024-04-18 06:19:57
- **Author**: Nstream AI
- **Requires Python**: >=3.8, <4
- **License**: MIT
- **Keywords**: nstreamai, streaming, rag, analytics, realtime
# ms-public-sdk
![Nstream SDK](src/images/mermaid-diagram.svg "Nstream SDK")

## Quick Start Guide
### Using the Nstream SDK for Recommending Promotions

### Introduction
The Nstream SDK is designed for building and managing neural network pipelines that incorporate large language models (LLMs) and stream processing capabilities. This guide will walk you through creating a simple but powerful neural network pipeline for generating insights from streaming data using the Nstream SDK.

### Installation

To begin using the Nstream SDK in your projects, install it via pip with the following command:

```bash
pip install nstreamai
```
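
To confirm the installation, you can try importing the SDK's top-level package. This is a minimal check, assuming the distribution exposes the `nstreamai_nstream_ai` package used throughout the examples below:

```python
# Quick sanity check: this should not raise ImportError if the SDK is installed.
import importlib

importlib.import_module("nstreamai_nstream_ai")
print("nstreamai SDK import OK")
```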

After installation, import the necessary modules and classes as follows:

```python
from nstreamai_nstream_ai.core.nsinit import NsInit
from nstreamai_nstream_ai.core.nsnode import (
    NsNode, NsLink, NsProvider, NsProviderType, Nstream
)
from nstreamai_nstream_ai.core.nsneuron import NsNeuron, NstreamLLM
from nstreamai_nstream_ai.core.nsgraph import NsGraph
from utils.logger import logger
import sys
```
## Core Module Imports

### NsInit
**Import**: `from nstreamai_nstream_ai.core.nsinit import NsInit`

**Description**: This import brings in the `NsInit` class responsible for initializing and configuring the connection settings for the Nstream SDK.

**Usage**: It is used to set up the initial connection with the Nstream API using credentials like API key, username, and password.
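
As a minimal sketch (the credential values are placeholders), initialization looks like this; the full configuration step appears later in this guide:

```python
from nstreamai_nstream_ai.core.nsinit import NsInit

# Placeholder credentials; replace with your own.
conn = NsInit(
    api_key="your_api_key",
    username="your_username",
    password="your_password",
).connect()  # the returned connection is passed to nodes, links, and graphs as `socket`/`conn`
```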

### NsNode, NsLink, NsProvider, NsProviderType, Nstream
**Import**: `from nstreamai_nstream_ai.core.nsnode import NsNode, NsLink, NsProvider, NsProviderType, Nstream`

**Description**: This import includes multiple classes that are fundamental to constructing nodes within an Nstream pipeline:
- **NsNode**: Represents a single node in the neural network graph.
- **NsLink**: Used to define connections between nodes and data sources or sinks.
- **NsProvider**: Specifies the data source or sink type.
- **NsProviderType**: Enumerates different types of data providers.
- **Nstream**: Provides stream-level helpers, such as `Nstream.event()`, which is used in the prompts below to reference incoming event data.

**Usage**: These are used to define and link the functional components of a neural network pipeline, such as data inputs, transformations, and outputs.
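
For example, a source-backed link can be assembled from these pieces. This is a sketch based on the pipeline code later in this guide; `conn` is the connection returned by `NsInit(...).connect()`:

```python
# A prompt link that reads from a MongoDB source and carries a prompt template.
prompt_link = NsLink(
    socket=conn,  # connection obtained from NsInit(...).connect()
    provider=NsProvider(type=NsProviderType().Source).mongodb(),
    prompt_text="Generate a general insight using a user's data",
)
```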

### NsNeuron, NstreamLLM
**Import**: `from nstreamai_nstream_ai.core.nsneuron import NsNeuron, NstreamLLM`

**Description**: This import involves classes related to the neural network computation units within the SDK:
- **NsNeuron**: Represents a neuron unit which can execute specific neural network computations or models.
- **NstreamLLM**: Pertains to specific large language model configurations that can be deployed within neurons.

**Usage**: These are used to specify and configure the large language models that perform the actual analytics and insights generation in the pipeline.
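
The guide uses two model configurations; as a sketch, a neuron is constructed by wrapping one of them:

```python
# Neurons wrap a model configuration; both variants appear in the pipeline below.
insight_neuron = NsNeuron(NstreamLLM.mistral_7b())   # used for insight generation (Node 1)
recommend_neuron = NsNeuron(NstreamLLM.llama2_7b())  # used for recommendations (Node 2)
```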

### NsGraph
**Import**: `from nstreamai_nstream_ai.core.nsgraph import NsGraph`

**Description**: Imports the `NsGraph` class, which manages the execution flow of neural network nodes defined with `NsNode`.

**Usage**: This class is crucial for defining the execution order and dependencies between nodes, as well as starting and stopping the data processing workflow.
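
Putting it together, the lifecycle used throughout this guide is: chain the configured nodes into a graph, submit it with an output sink, and terminate after a fixed run time. A sketch, where `ns_node_1`, `ns_node_2`, and `ns_graph_sink` are defined in the sections below:

```python
# Chain nodes into a graph, attach an output sink, then stop after the given run time.
ns_graph = NsGraph(conn).start(ns_node_1).end(ns_node_2).submit(ns_graph_sink)
ns_graph.terminate(run_time=6)
```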

### Configuration
Start by initializing the SDK and connecting to the Nstream service:

```python
try:
    logger.info("Starting main execution")
    conn = NsInit(
        api_key="your_api_key", 
        username="your_username", 
        password="your_password").connect()
    logger.info("Connected to NsInit")
except Exception as e:
    logger.exception(
        "Exception occurred while initializing NsInit"
        )
    print(e)
    sys.exit()
```
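
Rather than hardcoding credentials, you may prefer to read them from environment variables. This is a minimal variation on the snippet above; the environment variable names are illustrative, not part of the SDK:

```python
import os

# Illustrative variable names; set them in your shell or deployment environment.
conn = NsInit(
    api_key=os.environ["NSTREAM_API_KEY"],
    username=os.environ["NSTREAM_USERNAME"],
    password=os.environ["NSTREAM_PASSWORD"],
).connect()
```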

### Building the Pipeline
#### Node 1: User Interaction Summarization and Insight Generation
Create the first node to summarize user interactions and generate insights:

```python
prompt_event = Nstream.event()
ns_node_1_prompt_text = f"Generate a general insight using a user's data - {prompt_event}"
ns_node_1_context_transform_prompt_text = "Transform unstructured user data into a detailed JSON format..."

ns_node_1 = NsNode(
    node_name="GraphNode1",
    prompt=NsLink(
        socket=conn,
        provider=NsProvider(type=NsProviderType().Source).mongodb(),
        prompt_text=ns_node_1_prompt_text
    ),
    context=NsLink(
        socket=conn,
        provider=NsProvider(type=NsProviderType().Source).mongodb(),
        context_transform_prompt_text=ns_node_1_context_transform_prompt_text),
    neuron=NsNeuron(NstreamLLM.mistral_7b()),
    socket=conn)
logger.info("GraphNode1 configured")
```

#### Node 2: Personalized Recommendations
Set up the second node to use insights from Node 1 to generate personalized recommendations:

```python
ns_node_2_prompt_text = "Based on general insight, provide a list of recommended promotional offers"
ns_node_2_context_transform_prompt_text = "Transform unstructured promotional data into a structured JSON format..."

ns_node_2 = NsNode(
    node_name="GraphNode2",
    prompt=NsLink(
        socket=conn,
        provider=NsProvider(type=NsProviderType().Source).mongodb(),
        prompt_text=ns_node_2_prompt_text),
    context=ns_node_1.output(
        context_transform_prompt_text=ns_node_2_context_transform_prompt_text),
    neuron=NsNeuron(NstreamLLM.llama2_7b()),
    socket=conn)
logger.info("GraphNode2 configured")
```
#### Running the Pipeline
Execute the configured pipeline and handle the output:
```python
ns_graph_sink = NsLink(socket=conn, provider=NsProvider(type=NsProviderType().Sink).terminal())
logger.info("Graph sink configured")

ns_graph = NsGraph(conn).start(ns_node_1).end(ns_node_2).submit(ns_graph_sink)
logger.info("Graph execution started")

ns_graph.terminate(run_time=6)
logger.info("Graph execution terminated")

print("Execution Completed")
logger.info("Main execution completed")
```

The complete script looks like this:
```python
from nstreamai_nstream_ai.core.nsinit import NsInit
from nstreamai_nstream_ai.core.nsnode import NsNode, NsLink, NsProvider, NsProviderType, Nstream
from nstreamai_nstream_ai.core.nsneuron import NsNeuron, NstreamLLM
from nstreamai_nstream_ai.core.nsgraph import NsGraph
from utils.logger import logger
import sys

if __name__ == "__main__":
    try:
        logger.info("Starting main execution")
        conn = NsInit(
            api_key="your_api_key", 
            username="your_username", 
            password="your_password").connect()
        logger.info("Connected to NsInit")
    except Exception as e:
        logger.exception("Exception occurred while initializing NsInit")
        print(e)
        sys.exit()

    #  Node 1: User Interaction Summarization and Insight Generation
    #  Objective: Generate an insight using the raw cookie data and historical summaries of the user’s interactions with promotions on a website.
    prompt_event = Nstream.event()
    #  Prompt for insight generation
    ns_node_1_prompt_text = f"Generate a general insight using an input user's data - {prompt_event}" # Uses stored structured summary from db
    
    #  Context Transformation Prompt:
    ns_node_1_context_transform_prompt_text = "Transform unstructured user data into a detailed JSON format that captures comprehensive user interactions, particularly with promotional activities, and includes other vital user data points like geolocation data, data source, and timestamps."

    ns_node_1 = NsNode(
        node_name="GraphNode1",
        prompt=NsLink(
            socket=conn,
            provider=NsProvider(type=NsProviderType().Source).mongodb(),
            prompt_text=ns_node_1_prompt_text
        ),
        context=NsLink(
            socket=conn,
            provider=NsProvider(type=NsProviderType().Source).mongodb(),
            context_transform_prompt_text=ns_node_1_context_transform_prompt_text),
        neuron=NsNeuron(NstreamLLM.mistral_7b()),
        socket=conn)
    logger.info("GraphNode1 configured")

    #  Node 2: Personalized Recommendations
    #  Objective: Use the insights from Node1 to generate personalized product recommendations based on the user's interactions with promotions.
    
    #  Prompt for Product Listing
    ns_node_2_prompt_text = "Based on general insight, provide a list of the recommended promotional offers"
    
    #  Context Transformation Prompt
    ns_node_2_context_transform_prompt_text = "Transform unstructured promotional data into a structured JSON format for better recommendation accuracy. Include essential details like offer ID, title, discount, and audience criteria. Ensure all offers are consistently formatted to enhance analysis and comparison."

    ns_node_2 = NsNode(
        node_name="GraphNode2",
        prompt=NsLink(
            socket=conn,
            provider=NsProvider(type=NsProviderType().Source).mongodb(),
            prompt_text=ns_node_2_prompt_text,
        ),
        context=ns_node_1.output(
            context_transform_prompt_text=ns_node_2_context_transform_prompt_text),
        neuron=NsNeuron(NstreamLLM.llama2_7b()),
        socket=conn)
    logger.info("GraphNode2 configured")

    ns_graph_sink = NsLink(
        socket=conn,
        provider=NsProvider(type=NsProviderType().Sink).terminal(),
    )
    logger.info("Graph sink configured")

    ns_graph = NsGraph(conn).start(ns_node_1).end(ns_node_2).submit(ns_graph_sink)
    logger.info("Graph execution started")

    ns_graph.terminate(run_time=6)
    logger.info("Graph execution terminated")

    logger.info("Main execution completed")
```

### Using the Nstream SDK for Insight Generation: Usage Example

#### Building the Pipeline
##### Node 1: User Interaction Summarization
Create the first node to summarize user interactions:

```python
prompt_event = Nstream.event()
ns_node_1_prompt_text = f"Generate a general insight using input user's data - {prompt_event}"

# Configure the node with its necessary components
ns_node_1 = NsNode(
    node_name="GraphNode1",
    prompt=NsLink(
        socket=conn,
        provider=NsProvider(type=NsProviderType().Source).mongodb(),
        prompt_text=ns_node_1_prompt_text
    ),
    neuron=NsNeuron(NstreamLLM.mistral_7b()),
    socket=conn)
logger.info("GraphNode1 configured")
```

#### Running the Pipeline
Execute the configured pipeline and handle the output:
```python
ns_graph_sink = NsLink(
    socket=conn,
    provider=NsProvider(type=NsProviderType().Sink).terminal(),
)
logger.info("Graph sink configured")

# Initialize the graph, start execution, and submit it for processing
ns_graph = NsGraph(conn).start(ns_node_1).submit(ns_graph_sink)
logger.info("Graph execution started")

# Terminate the graph execution after a predefined runtime
ns_graph.terminate(run_time=6)
logger.info("Graph execution terminated")

# Signal completion of the main execution
logger.info("Main execution completed")
```

Final Code:
```python

# Import necessary modules from the Nstream SDK and utility libraries
from nstreamai_nstream_ai.core.nsinit import NsInit
from nstreamai_nstream_ai.core.nsnode import NsNode, NsLink, NsProvider, NsProviderType, Nstream
from nstreamai_nstream_ai.core.nsneuron import NsNeuron, NstreamLLM
from nstreamai_nstream_ai.core.nsgraph import NsGraph
from utils.logger import logger
import sys

if __name__ == "__main__":
    try:
        # Initialize logging and connect to the Nstream service
        logger.info("Starting main execution")
        conn = NsInit(
            api_key="your_api_key", 
            username="your_username", 
            password="your_password").connect()
        logger.info("Connected to NsInit")
    except Exception as e:
        # Log any exceptions during initialization and exit
        logger.exception("Exception occurred while initializing NsInit")
        print(e)
        sys.exit()

    # Define Node 1: Summarize user interactions
    # Create the first node to summarize user interactions:
    prompt_event = Nstream.event()
    ns_node_1_prompt_text = f"Generate a general insight using input user's data - {prompt_event}"

    # Configure the node with its necessary components
    ns_node_1 = NsNode(
        node_name="GraphNode1",
        prompt=NsLink(
            socket=conn,
            provider=NsProvider(type=NsProviderType().Source).mongodb(),
            prompt_text=ns_node_1_prompt_text
        ),
        neuron=NsNeuron(NstreamLLM.mistral_7b()),
        socket=conn)
    logger.info("GraphNode1 configured")

    # Configure the sink for the graph output, typically a terminal or logger
    ns_graph_sink = NsLink(
        socket=conn,
        provider=NsProvider(type=NsProviderType().Sink).terminal(),
    )
    logger.info("Graph sink configured")

    # Initialize the graph, start execution, and submit it for processing
    ns_graph = NsGraph(conn).start(ns_node_1).submit(ns_graph_sink)
    logger.info("Graph execution started")

    # Terminate the graph execution after a predefined runtime
    ns_graph.terminate(run_time=6)
    logger.info("Graph execution terminated")

    # Signal completion of the main execution
    logger.info("Main execution completed")
```

### Using the Nstream SDK for Context Transformation: Usage Example
In this section, we demonstrate a use case of the Nstream SDK for transforming user data 
into a structured format for analysis and insights.

```python
from nstreamai_nstream_ai.core.nsinit import NsInit
from nstreamai_nstream_ai.core.nsnode import NsNode, NsLink, NsProvider, NsProviderType, Nstream
from nstreamai_nstream_ai.core.nsneuron import NsNeuron, NstreamLLM
from nstreamai_nstream_ai.core.nsgraph import NsGraph
from utils.logger import logger
import sys

if __name__ == "__main__":
    try:
        logger.info("Starting main execution")
        conn = NsInit(
            api_key="your_api_key", 
            username="your_username", 
            password="your_password").connect()
        logger.info("Connected to NsInit")
    except Exception as e:
        logger.exception("Exception occurred while initializing NsInit")
        print(e)
        sys.exit()

    # Node 1: User Data Transformation
    # Context Transformation Prompt:
    ns_node_1_context_transform_prompt_text = "Transform unstructured user data into a detailed JSON format that captures comprehensive user interactions, particularly with promotional activities, and includes other vital user data points like geolocation data, data source, and timestamps."

    ns_node_1 = NsNode(
        node_name="GraphNode1",
        context=NsLink(
            socket=conn,
            provider=NsProvider(type=NsProviderType().Source).mongodb(),
            context_transform_prompt_text=ns_node_1_context_transform_prompt_text),
        neuron=NsNeuron(NstreamLLM.mistral_7b()),
        socket=conn)
    logger.info("GraphNode1 configured")

    ns_graph_sink = NsLink(
        socket=conn,
        provider=NsProvider(type=NsProviderType().Sink).terminal(),
    )
    logger.info("Graph sink configured")

    ns_graph = NsGraph(conn).start(ns_node_1).submit(ns_graph_sink)
    logger.info("Graph execution started")

    ns_graph.terminate(run_time=6)
    logger.info("Graph execution terminated")

    logger.info("Main execution completed")
```

            
