jina

- Name: jina
- Version: 3.33.0
- Home page: https://github.com/jina-ai/jina/
- Summary: Multimodal AI services & pipelines with cloud-native stack: gRPC, Kubernetes, Docker, OpenTelemetry, Prometheus, Jaeger, etc.
- Upload time: 2024-12-20 12:13:56
- Author: Jina AI
- License: Apache 2.0
- Keywords: jina, cloud-native, cross-modal, multimodal, neural-search, query, search, index, elastic, neural-network, encoding, embedding, serving, docker, container, image, video, audio, deep-learning, mlops
- Requirements: none recorded
# Jina-Serve
<a href="https://pypi.org/project/jina/"><img alt="PyPI" src="https://img.shields.io/pypi/v/jina?label=Release&style=flat-square"></a>
<a href="https://discord.jina.ai"><img src="https://img.shields.io/discord/1106542220112302130?logo=discord&logoColor=white&style=flat-square"></a>
<a href="https://pypistats.org/packages/jina"><img alt="PyPI - Downloads from official pypistats" src="https://img.shields.io/pypi/dm/jina?style=flat-square"></a>
<a href="https://github.com/jina-ai/jina/actions/workflows/cd.yml"><img alt="Github CD status" src="https://github.com/jina-ai/jina/actions/workflows/cd.yml/badge.svg"></a>

Jina-serve is a framework for building and deploying AI services that communicate via gRPC, HTTP and WebSockets. Scale your services from local development to production while focusing on your core logic.

## Key Features

- Native support for all major ML frameworks and data types
- High-performance service design with scaling, streaming, and dynamic batching
- LLM serving with streaming output
- Built-in Docker integration and Executor Hub
- One-click deployment to Jina AI Cloud
- Enterprise-ready with Kubernetes and Docker Compose support

<details>
<summary><strong>Comparison with FastAPI</strong></summary>

Key advantages over FastAPI:

- DocArray-based data handling with native gRPC support
- Built-in containerization and service orchestration
- Seamless scaling of microservices
- One-command cloud deployment
</details>

## Install 

```bash
pip install jina
```

See guides for [Apple Silicon](https://jina.ai/serve/get-started/install/apple-silicon-m1-m2/) and [Windows](https://jina.ai/serve/get-started/install/windows/).

## Core Concepts

Three main layers:
- **Data**: BaseDoc and DocList for input/output
- **Serving**: Executors process Documents, Gateway connects services
- **Orchestration**: Deployments serve Executors, Flows create pipelines
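
As a rough mental model only (plain Python, not jina's actual API), the three layers compose like this: typed data schemas flow through executor functions, and an orchestrator chains those functions into a pipeline.

```python
from dataclasses import dataclass
from typing import Callable, List


# Data layer: a typed schema for inputs/outputs (a stand-in for BaseDoc/DocList)
@dataclass
class Doc:
    text: str


# Serving layer: an Executor is, conceptually, a function over batches of Docs
def uppercase_executor(docs: List[Doc]) -> List[Doc]:
    return [Doc(text=d.text.upper()) for d in docs]


# Orchestration layer: a Flow chains such functions into a pipeline
def run_flow(docs: List[Doc], steps: List[Callable]) -> List[Doc]:
    for step in steps:
        docs = step(docs)
    return docs


print(run_flow([Doc(text='hello')], [uppercase_executor])[0].text)  # HELLO
```

In jina itself, the serving layer additionally handles protocols (gRPC/HTTP/WebSockets) and the orchestration layer handles process management and networking between services.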

## Build AI Services

Let's create a gRPC-based AI service using StableLM:

```python
from jina import Executor, requests
from docarray import DocList, BaseDoc
from transformers import pipeline


class Prompt(BaseDoc):
    text: str


class Generation(BaseDoc):
    prompt: str
    text: str


class StableLM(Executor):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.generator = pipeline(
            'text-generation', model='stabilityai/stablelm-base-alpha-3b'
        )

    @requests
    def generate(self, docs: DocList[Prompt], **kwargs) -> DocList[Generation]:
        generations = DocList[Generation]()
        prompts = docs.text
        llm_outputs = self.generator(prompts)
        for prompt, output in zip(prompts, llm_outputs):
            # each pipeline result is a list of dicts with a 'generated_text' key
            generations.append(Generation(prompt=prompt, text=output[0]['generated_text']))
        return generations
```

Deploy with Python or YAML:

```python
from jina import Deployment
from executor import StableLM

dep = Deployment(uses=StableLM, timeout_ready=-1, port=12345)

with dep:
    dep.block()
```

```yaml
jtype: Deployment
with:
 uses: StableLM
 py_modules:
   - executor.py
 timeout_ready: -1
 port: 12345
```

Use the client:

```python
from jina import Client
from docarray import DocList
from executor import Prompt, Generation

prompt = Prompt(text='suggest an interesting image generation prompt')
client = Client(port=12345)
response = client.post('/', inputs=[prompt], return_type=DocList[Generation])
```

## Build Pipelines

Chain services into a Flow:

```python
from jina import Flow

# StableLM is the Executor defined above; TextToImage is a second, analogous Executor
flow = Flow(port=12345).add(uses=StableLM).add(uses=TextToImage)

with flow:
    flow.block()
```

## Scaling and Deployment

### Local Scaling

Boost throughput with built-in features:
- Replicas for parallel processing
- Shards for data partitioning
- Dynamic batching for efficient model inference

Example scaling a Stable Diffusion deployment:

```yaml
jtype: Deployment
with:
 uses: TextToImage
 timeout_ready: -1
 py_modules:
   - text_to_image.py
 env:
  CUDA_VISIBLE_DEVICES: RR
 replicas: 2
 uses_dynamic_batching:
   /default:
     preferred_batch_size: 10
     timeout: 200
```
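
The dynamic-batching idea behind `uses_dynamic_batching` (shown here as a simplified, jina-independent sketch) is to buffer incoming requests until either the preferred batch size is reached or the oldest queued request has waited past the timeout, then serve the whole batch with one model call:

```python
from typing import Callable, List


def dynamic_batcher(
    requests: List[str],
    arrival_times: List[float],  # seconds, relative to start
    model: Callable[[List[str]], List[str]],
    preferred_batch_size: int = 10,
    timeout: float = 0.2,  # 200 ms, as in the YAML above
) -> List[str]:
    """Flush a batch when it is full or when its oldest request
    has waited longer than `timeout`."""
    results: List[str] = []
    batch: List[str] = []
    batch_start = None
    for req, t in zip(requests, arrival_times):
        if batch_start is None:
            batch_start = t
        batch.append(req)
        if len(batch) >= preferred_batch_size or (t - batch_start) >= timeout:
            results.extend(model(batch))  # one model call serves the whole batch
            batch, batch_start = [], None
    if batch:
        results.extend(model(batch))
    return results


# toy "model": uppercase each prompt in a single batched call
out = dynamic_batcher(
    ['a', 'b', 'c'], [0.0, 0.05, 0.3],
    model=lambda xs: [x.upper() for x in xs],
    preferred_batch_size=2, timeout=0.2,
)
print(out)  # ['A', 'B', 'C']
```

The trade-off is latency versus throughput: a larger `preferred_batch_size` amortizes model overhead, while the timeout bounds how long a lone request can wait.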

### Cloud Deployment

#### Containerize Services

1. Structure your Executor:
```
TextToImage/
├── executor.py
├── config.yml
├── requirements.txt
```

2. Configure:
```yaml
# config.yml
jtype: TextToImage
py_modules:
 - executor.py
metas:
 name: TextToImage
 description: Text to Image generation Executor
```

3. Push to Hub:
```bash
jina hub push TextToImage
```

#### Deploy to Kubernetes
```bash
jina export kubernetes flow.yml ./my-k8s
kubectl apply -R -f my-k8s
```

#### Use Docker Compose
```bash
jina export docker-compose flow.yml docker-compose.yml
docker-compose up
```

#### JCloud Deployment

Deploy with a single command:
```bash
jina cloud deploy jcloud-flow.yml
```

## LLM Streaming

Enable token-by-token streaming for responsive LLM applications:

1. Define schemas:
```python
from docarray import BaseDoc


class PromptDocument(BaseDoc):
    prompt: str
    max_tokens: int


class ModelOutputDocument(BaseDoc):
    token_id: int
    generated_text: str
```

2. Initialize service:
```python
import torch
from jina import Executor, requests
from transformers import GPT2Tokenizer, GPT2LMHeadModel


class TokenStreamingExecutor(Executor):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
        self.model = GPT2LMHeadModel.from_pretrained('gpt2')
```

3. Implement streaming:
```python
@requests(on='/stream')
async def task(self, doc: PromptDocument, **kwargs) -> ModelOutputDocument:
    input = self.tokenizer(doc.prompt, return_tensors='pt')
    input_len = input['input_ids'].shape[1]
    for _ in range(doc.max_tokens):
        output = self.model.generate(**input, max_new_tokens=1)
        if output[0][-1] == self.tokenizer.eos_token_id:
            break
        yield ModelOutputDocument(
            token_id=int(output[0][-1]),
            generated_text=self.tokenizer.decode(
                output[0][input_len:], skip_special_tokens=True
            ),
        )
        # feed the generated token back in for the next step
        input = {
            'input_ids': output,
            'attention_mask': torch.ones(1, len(output[0])),
        }
```

4. Serve and use:
```python
# Server
from jina import Deployment

with Deployment(uses=TokenStreamingExecutor, port=12345, protocol='grpc') as dep:
    dep.block()
```

```python
# Client (run in a separate process)
import asyncio

from jina import Client


async def main():
    client = Client(port=12345, protocol='grpc', asyncio=True)
    async for doc in client.stream_doc(
        on='/stream',
        inputs=PromptDocument(prompt='what is the capital of France?', max_tokens=10),
        return_type=ModelOutputDocument,
    ):
        print(doc.generated_text)


asyncio.run(main())
```
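
Stripped of jina and the model, the pattern in steps 3-4 is simply an async generator on the server side consumed with `async for` on the client side. A minimal stand-alone sketch:

```python
import asyncio
from typing import AsyncIterator, List


async def stream_tokens(prompt: str, max_tokens: int) -> AsyncIterator[str]:
    # stand-in for the model loop: emit one "token" at a time
    for token in prompt.split()[:max_tokens]:
        await asyncio.sleep(0)  # yield control, as real network I/O would
        yield token


async def main() -> List[str]:
    received = []
    async for tok in stream_tokens('what is the capital of France', max_tokens=3):
        received.append(tok)  # each token arrives as soon as it is produced
    return received


print(asyncio.run(main()))  # ['what', 'is', 'the']
```

This is why streaming feels responsive: the client renders each token on arrival instead of waiting for the full generation to finish.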

## Support

Jina-serve is backed by [Jina AI](https://jina.ai) and licensed under [Apache-2.0](./LICENSE).

            
