multi-model-server


Name: multi-model-server
Version: 1.1.11
Home page: https://github.com/awslabs/multi-model-server
Summary: Multi Model Server is a tool for serving neural net models for inference
Upload time: 2023-06-20 00:12:23
Author: Trinity team
License: Apache License Version 2.0
Keywords: multi model server serving deep learning inference ai
Requirements: none recorded
Project Description
===================

Multi Model Server (MMS) is a flexible and easy to use tool for
serving deep learning models exported from `MXNet <http://mxnet.io/>`__
or the Open Neural Network Exchange (`ONNX <http://onnx.ai/>`__).

Use the MMS Server CLI, or the pre-configured Docker images, to start a
service that sets up HTTP endpoints to handle model inference requests.
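As a sketch of what an inference call against such an endpoint looks like (assuming the default inference port 8080 and a hypothetical model named ``squeezenet``; the actual path segment is the name you register the model under):

```python
import urllib.request

# Hypothetical example: MMS exposes POST /predictions/<model-name> on
# port 8080 by default. The model name and payload are placeholders.
url = "http://127.0.0.1:8080/predictions/squeezenet"
payload = b"<raw image bytes would go here>"
req = urllib.request.Request(url, data=payload, method="POST")

# urllib only contacts the server when urlopen(req) is called; building
# the request here just shows the shape of an inference call.
print(req.get_method(), req.full_url)
```

Calling ``urllib.request.urlopen(req)`` against a running server would return the model's prediction as the HTTP response body.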

Detailed documentation and examples are provided in the `docs
folder <https://github.com/awslabs/multi-model-server/blob/master/docs/README.md>`__.

Prerequisites
-------------

* **Java 8**: Required. MMS uses Java to serve HTTP requests. You must install Java 8 (or later) and make sure the ``java`` executable is available on your $PATH *before* installing MMS. If you have multiple Java versions installed, set the $JAVA_HOME environment variable to control which one is used.
* **mxnet**: `mxnet` is no longer installed by default as of MMS 1.0. You must install it manually if you plan to serve MXNet models.
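If several Java versions are installed, pointing $JAVA_HOME at the one you want before installing MMS looks like this (the JDK path below is only an example; substitute your actual installation directory):

```shell
# Example only: adjust the path to your actual JDK 8+ installation.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"
java -version   # should now report the selected runtime
```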

For Ubuntu:
::

    sudo apt-get install openjdk-8-jre-headless


For CentOS:
::

    sudo yum install java-1.8.0-openjdk


For macOS:
::

    brew tap caskroom/versions
    brew update
    brew cask install java8


Install MXNet:
::

    pip install mxnet

MXNet also offers MKL pip packages that are much faster on Intel hardware.
To install the MKL package for CPU:
::

    pip install mxnet-mkl

or for a GPU instance:

::

    pip install mxnet-cu92mkl
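Since MXNet is optional, a quick way to confirm whichever variant you installed is importable (a generic Python check, not part of MMS) is:

```python
import importlib.util

def mxnet_available():
    """Return True if the `mxnet` package can be imported in this environment."""
    return importlib.util.find_spec("mxnet") is not None

print("mxnet installed:", mxnet_available())
```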


Installation
------------

::

    pip install multi-model-server
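After installing, pip should have placed a ``multi-model-server`` console script on your PATH; a small sanity check (generic Python, not MMS-specific) is:

```python
import shutil

# pip installs a `multi-model-server` console script; shutil.which
# reports its location, or None if it is not on PATH.
cli_path = shutil.which("multi-model-server")
print("multi-model-server CLI found:", cli_path is not None)
```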

Development
-----------

We welcome new contributors of all experience levels. For information on
how to install MMS for development, refer to the `MMS
docs <https://github.com/awslabs/multi-model-server/blob/master/docs/install.md>`__.

Important links
---------------

-  `Official source code
   repo <https://github.com/awslabs/multi-model-server>`__
-  `Download
   releases <https://pypi.org/project/multi-model-server/#files>`__
-  `Issue
   tracker <https://github.com/awslabs/multi-model-server/issues>`__

Source code
-----------

You can check out the latest source code as follows:

::

    git clone https://github.com/awslabs/multi-model-server.git

Testing
-------

After installation, try out the MMS quickstarts for

- `Serving a Model <https://github.com/awslabs/multi-model-server/blob/master/README.md#serve-a-model>`__
- `Creating a Model Archive <https://github.com/awslabs/multi-model-server/blob/master/README.md#model-archive>`__
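A model archive wraps a custom service module whose entry point follows the ``handle(data, context)`` shape. A minimal toy handler, assuming that contract (the echo logic below is illustrative, not MMS code):

```python
# Toy handler following the shape of an MMS custom-service entry point:
# a module-level handle(data, context) function packaged into the archive.

def handle(data, context):
    """Return one response per request in the batch; None for warm-up calls."""
    if data is None:
        # MMS may invoke the handler with no data at model-load time.
        return None
    # Each item is a dict of request fields; echo the payload size back.
    return [{"received_bytes": len(item.get("body", b""))} for item in data]

# Exercise the handler with a fake one-request batch and no context.
print(handle([{"body": b"hello"}], None))
```

In a real archive the handler would deserialize the request body, run the model, and return the serialized predictions.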

Help and Support
----------------

-  `Documentation <https://github.com/awslabs/multi-model-server/blob/master/docs/README.md>`__
-  `Forum <https://discuss.mxnet.io/latest>`__

Citation
--------

If you use MMS in a publication or project, please cite MMS:
https://github.com/awslabs/multi-model-server

            
