Backend.AI Client
=================
.. image:: https://badge.fury.io/py/backend.ai-client.svg
   :target: https://badge.fury.io/py/backend.ai-client
   :alt: PyPI version

.. image:: https://img.shields.io/pypi/pyversions/backend.ai-client.svg
   :target: https://pypi.org/project/backend.ai-client/
   :alt: Python Versions

.. image:: https://readthedocs.org/projects/backendai-client-sdk-for-python/badge/?version=latest
   :target: https://client-py.docs.backend.ai/en/latest/?badge=latest
   :alt: SDK Documentation

.. image:: https://travis-ci.com/lablup/backend.ai-client-py.svg?branch=master
   :target: https://travis-ci.com/lablup/backend.ai-client-py
   :alt: Build Status (Linux)

.. image:: https://ci.appveyor.com/api/projects/status/5h6r1cmbx2965yn1/branch/master?svg=true
   :target: https://ci.appveyor.com/project/lablup/backend.ai-client-py/branch/master
   :alt: Build Status (Windows)

.. image:: https://codecov.io/gh/lablup/backend.ai-client-py/branch/master/graph/badge.svg
   :target: https://codecov.io/gh/lablup/backend.ai-client-py
   :alt: Code Coverage
The official client SDK for `Backend.AI <https://backend.ai>`_
Usage (KeyPair mode)
--------------------
To use the API, set your access key and secret key as environment variables.
Grab your keypair from `cloud.backend.ai <https://cloud.backend.ai>`_ or your cluster
admin.
On Linux/macOS, create a shell script named ``my-backend-ai.sh`` and source it before
using the ``backend.ai`` command:

.. code-block:: sh

   export BACKEND_ACCESS_KEY=...
   export BACKEND_SECRET_KEY=...
   export BACKEND_ENDPOINT=https://my-precious-cluster
   export BACKEND_ENDPOINT_TYPE=api
On Windows, create a batch file named ``my-backend-ai.bat`` and run it before using
the ``backend.ai`` command:

.. code-block:: bat

   chcp 65001
   set PYTHONIOENCODING=UTF-8
   set BACKEND_ACCESS_KEY=...
   set BACKEND_SECRET_KEY=...
   set BACKEND_ENDPOINT=https://my-precious-cluster
   set BACKEND_ENDPOINT_TYPE=api
Note that you need to switch to the UTF-8 codepage for correct display of
special characters used in the console logs.
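
These variables are also how the Python SDK discovers your credentials at runtime.
The following is a minimal sketch of reading them using only the standard library;
it illustrates the pattern and is *not* the SDK's actual configuration loader:

.. code-block:: python

   import os

   # Illustrative sketch only: not the SDK's actual configuration loader.
   REQUIRED_VARS = ("BACKEND_ACCESS_KEY", "BACKEND_SECRET_KEY", "BACKEND_ENDPOINT")

   def load_backend_config() -> dict:
       """Collect Backend.AI client settings from the environment."""
       missing = [name for name in REQUIRED_VARS if name not in os.environ]
       if missing:
           raise RuntimeError(
               "Set these environment variables first: " + ", ".join(missing))
       return {
           "access_key": os.environ["BACKEND_ACCESS_KEY"],
           "secret_key": os.environ["BACKEND_SECRET_KEY"],
           "endpoint": os.environ["BACKEND_ENDPOINT"],
           # "api" for keypair mode, "session" for console-server logins
           "endpoint_type": os.environ.get("BACKEND_ENDPOINT_TYPE", "api"),
       }

Failing fast with a clear message when a variable is missing saves a round trip to
the server that would otherwise fail with an authentication error.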
Usage (Session mode)
--------------------
Change ``BACKEND_ENDPOINT_TYPE`` to "session" and set the endpoint to the URL of your console server.
.. code-block:: sh

   export BACKEND_ENDPOINT=https://my-precious-cluster
   export BACKEND_ENDPOINT_TYPE=session
.. code-block:: console

   $ backend.ai login
   User ID: myid@mydomain.com
   Password:
   ✔ Login succeeded!

   $ backend.ai ...  # run any command

   $ backend.ai logout
   ✔ Logout done.
The session expiration timeout is set by the console server.
Command-line Interface
----------------------
The ``backend.ai`` command is the entry point for all sub-commands.
(Alternatively, you can use the verbose form: ``python -m ai.backend.client.cli``.)
Highlight: ``run`` command
~~~~~~~~~~~~~~~~~~~~~~~~~~
The ``run`` command executes a code snippet or source files in a Backend.AI compute
session created on the fly.

To run code given directly on the command line, use the ``-c`` option to pass the
code string (as a shell does):
.. code-block:: console

   $ backend.ai run python:3.6-ubuntu18.04 -c "print('hello world')"
   ∙ Client session token: d3694dda6e5a9f1e5c718e07bba291a9
   ✔ Kernel (ID: zuF1OzMIhFknyjUl7Apbvg) is ready.
   hello world
By default, you need to specify the language image with its full version tag, such as
``python:3.6-ubuntu18.04``. Depending on the Backend.AI admin's language alias
settings, this may be shortened to just ``python``. To find out which aliases are
defined, contact the admin of your Backend.AI server.
For more complex programs, you may upload multiple files and then build and execute
them. Below is a simple example that runs `a sample C program
<https://gist.github.com/achimnol/df464c6a3fe05b21e9b06d5b80e986c5>`_.
.. code-block:: console

   $ git clone https://gist.github.com/achimnol/df464c6a3fe05b21e9b06d5b80e986c5 c-example
   Cloning into 'c-example'...
   Unpacking objects: 100% (5/5), done.
   $ cd c-example
   $ backend.ai run gcc:gcc6.4-alpine3.8 main.c mylib.c mylib.h
   ∙ Client session token: 1c352a572bc751a81d1f812186093c47
   ✔ Kernel (ID: kJ6CgWR7Tz3_v2WsDHOwLQ) is ready.
   ✔ Uploading done.
   ✔ Build finished.
   myvalue is 42
   your name? LABLUP
   hello, LABLUP!
For details, please refer to the ``--help`` manual provided by the ``run`` command.
Highlight: ``start`` and ``app`` commands
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
``backend.ai start`` is similar to the ``run`` command in that it creates a new
compute session, but it does not execute anything there.
You can subsequently call ``backend.ai run -t <sessionId> ...`` to execute code
snippets, or use the ``backend.ai app`` command to start a local proxy to a container
service, such as Jupyter, running inside the compute session.
.. code-block:: console

   $ backend.ai start -t mysess -r cpu=1 -r mem=2g lablup/python:3.6-ubuntu18.04
   ∙ Session ID mysess is created and ready.
   ∙ This session provides the following app services: ipython, jupyter, jupyterlab
   $ backend.ai app mysess jupyter
   ∙ A local proxy to the application "jupyter" provided by the session "mysess" is available at: http://127.0.0.1:8080
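
Conceptually, ``backend.ai app`` opens a listening port on localhost and forwards
traffic to the service inside the session. The toy sketch below shows the general
shape of such TCP forwarding with the standard library only; the real command
tunnels traffic through the Backend.AI manager's API rather than connecting to the
target directly, so treat this purely as an illustration:

.. code-block:: python

   import socket
   import threading

   def run_local_proxy(listen_port: int, target_host: str,
                       target_port: int) -> threading.Thread:
       """Forward one TCP connection from localhost:listen_port to the target.

       Toy illustration only; the real ``app`` command multiplexes traffic
       through the manager instead of a direct connection.
       """
       def pump(src: socket.socket, dst: socket.socket) -> None:
           # Copy bytes in one direction until the source side closes.
           try:
               while chunk := src.recv(4096):
                   dst.sendall(chunk)
           except OSError:
               pass
           finally:
               dst.close()

       def serve() -> None:
           with socket.create_server(("127.0.0.1", listen_port)) as srv:
               client, _ = srv.accept()
               upstream = socket.create_connection((target_host, target_port))
               threading.Thread(target=pump, args=(upstream, client),
                                daemon=True).start()
               pump(client, upstream)

       t = threading.Thread(target=serve, daemon=True)
       t.start()
       return t

A browser pointed at the local port then talks to the in-session service as if it
were running on your machine.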
Highlight: ``ps`` and ``rm`` commands
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
You can see the list of currently running sessions using your API keypair.
.. code-block:: console

   $ backend.ai ps
   Session ID    Lang/runtime              Tag    Created At                        Terminated At    Status    CPU Cores    CPU Used (ms)    Total Memory (MiB)    Used Memory (MiB)    GPU Cores
   ------------  ------------------------  -----  --------------------------------  ---------------  --------  -----------  ---------------  --------------------  -------------------  -----------
   88ee10a027    lablup/python:3.6-ubuntu         2018-12-11T03:53:14.802206+00:00                   RUNNING   1            16314            1024                  39.2                 0
   fce7830826    lablup/python:3.6-ubuntu         2018-12-11T03:50:10.150740+00:00                   RUNNING   1            15391            1024                  39.2                 0
If you set the ``-t`` option in the ``run`` command, it is used as the session ID, so
you can assign a human-readable, easy-to-type alias to your sessions.
A session ID can be reused after the session currently using it terminates.

To terminate a session, use the ``terminate`` or ``rm`` command.
.. code-block:: console

   $ backend.ai rm 5baafb2136029228ca9d873e1f2b4f6a
   ✔ Done.
Highlight: ``proxy`` command
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To use API development tools such as GraphiQL for the admin API, run an insecure
local API proxy. This will attach all the necessary authorization headers to your
vanilla HTTP API requests.
.. code-block:: console

   $ backend.ai proxy
   ∙ Starting an insecure API proxy at http://localhost:8084
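
The headers the proxy attaches come from keypair-based request signing: each request
carries an HMAC signature derived from your secret key. The sketch below shows the
general shape using only the standard library; the string-to-sign and header format
here are simplified assumptions for illustration, not the real Backend.AI wire
protocol:

.. code-block:: python

   import hashlib
   import hmac

   def sign_request(method: str, path: str, date: str,
                    access_key: str, secret_key: str) -> dict:
       """Build an HMAC-SHA256 authorization header for an API request.

       Illustrative only: the real signing scheme covers more fields and a
       versioned format; consult the Backend.AI API docs for the spec.
       """
       # A simplified string-to-sign; the real one includes more components.
       string_to_sign = f"{method}\n{path}\n{date}"
       signature = hmac.new(
           secret_key.encode(), string_to_sign.encode(), hashlib.sha256,
       ).hexdigest()
       return {
           "Date": date,
           "Authorization": (
               f"BackendAI signMethod=HMAC-SHA256, "
               f"credential={access_key}:{signature}"
           ),
       }

Because every request must be signed this way, tools like GraphiQL that send plain
HTTP cannot talk to the API directly; the proxy signs on their behalf.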
More commands?
~~~~~~~~~~~~~~
Please run ``backend.ai --help`` to see more commands.
Troubleshooting (FAQ)
---------------------
* There are error reports related to ``simplejson`` with Anaconda on Windows.
  This package no longer depends on ``simplejson`` since v1.0.5, so you may uninstall
  it safely; Python 3.5+ offers a nearly identical ``json`` module in the standard
  library.

  If you really need to keep the ``simplejson`` package, uninstall the existing
  package manually and reinstall it from `a pre-built binary wheel
  <https://www.lfd.uci.edu/%7Egohlke/pythonlibs/#simplejson>`_.