mobile-env-rl

* Name: mobile-env-rl
* Version: 4.0
* Home page: https://github.com/X-LANCE/Mobile-Env
* Summary: A Universal Platform for Training and Evaluation of Mobile Interaction
* Upload time: 2024-07-14 06:57:12
* Maintainer: None
* Docs URL: None
* Author: Danyang Zhang @X-Lance
* Requires Python: >=3.7
* License: Apache License 2.0
* Keywords: InfoUI interaction
* Requirements: No requirements were recorded.
* Travis-CI: No Travis.
* Coveralls test coverage: No coveralls.

## NEWS!!

* (2024-07-14 v4.0)
  * Added new ADB-based action and observation types.
  * Enabled input of common UTF-8 strings for the `TEXT` action.
  * Enabled a fuzzy match method for screen text events, with a triggering
    threshold for the fuzzy match modes (an illustrative snippet follows this
    list).
  * Migrated from `dm_env` to `android_env.interfaces` to distinguish
    successful, failed, and truncated episode ends. Updated episode end events
    so that their triggering can be controlled more conveniently in the task
    definition file.
  * Added a `cache_until` field for event slots to correctly trigger an `AND`
    node whose sub-events are expected to be triggered simultaneously. Now an
    activated event can be cached temporarily until another triggered event
    clears it.
  * Added `null_listener` for target-less event nodes.
  * Applied image compression in `RemoteSimulator`.
  * Migrated from `gym` to `gymnasium`.
  * Updated `VhIoWrapper` and `TapActionWrapper`.
  * Minor updates to the annotation tool.

See our [Change Log](Changelog) for details. The documents will be revised
soon. A new tutorial on episode event management is planned.
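
The fuzzy-match triggering threshold mentioned above can be pictured with a
plain-Python sketch. It is purely illustrative: the actual matcher is
configured in the task definition file, and `fuzzy_triggered`, the threshold
value, and the example strings are assumptions made for the example.

```python
# Illustrative only: the idea of a similarity score gated by a triggering
# threshold, using the standard-library difflib.
from difflib import SequenceMatcher


def fuzzy_triggered(expected: str, observed: str, threshold: float = 0.8) -> bool:
    """Return True when the OCR'd screen text is close enough to the target."""
    score = SequenceMatcher(None, expected.lower(), observed.lower()).ratio()
    return score >= threshold


# Tolerates a one-character OCR error:
print(fuzzy_triggered("Add to cart", "Add to carl"))  # True
```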

* (2024-04-30 v3.6)
  * Updated the function for loading a remote simulator so that the remote
    resources can be provided at a path different from that of the local task
    definition file.
  * Updated the task template toolkit; added new slot modifiers and syntaxes
    for the task config file.
  * Fixed known bugs.

* (2023-12-18 v3.5)
  * Owing to the long delay of VH and screenshot checks, we updated the
    mechanism for managing the check time, balancing the need for sufficient
    checks of the episode events against the resulting delay.
  * Added multiple rating methods to `ResponseEvent`: regex matching, fuzzy
    matching, and vector encoding matching.
  * Improved `VhIoWrapper` and `TapActionWrapper`. Added support for `SCROLL`
    and `TYPE` to `TapActionWrapper`.
  * Optimized `RemoteSimulator`. To reduce network transfer delay, enabled
    action batching to send and execute a group of actions at once, and
    enabled resizing images before and after transfer to shrink the
    transferred data.
  * Merged the annotation tool into the main branch. The original
    annotation-tool branch is deprecated.
  * Added support for `ResponseEvent` to the annotation tool.
  * Added several command-line options to the annotation tool.

For more details, please see our [Change Log](Changelog) and Documents.

* (2023-10-31 v3.0) Migrated VH node specification from the original VH path to
  Mobile-Env-customized CSS selector (me-selector) and added repeatability
  control to EventSlots. Repeatability control for EventSlots may be useful to
  prevent repetitive triggering of an `OR`-type virtual event combining
  multiple types of event sources.

Please see our [Change Log](Changelog) and
[Document](docs/task-definition-en.md).

* (2023-09-21 v2.1) Added the *REMOTE SIMULATOR* to work around the problem
  that hardware acceleration for virtualization is not enabled on many GPU
  clusters.

Please see our [Change Log](Changelog) and [Document](docs/env-usage-en.md).

* (2023-06-30 v2.0) Added a new event type, "response to human user" (RHU,
  `ResponseEvent`). The agent can now generate a response to the human user,
  from which episode signals are parsed. This enables interaction tasks like
  question answering, retrieval, *etc*.

Please see our [Change Log](Changelog), [Usage Document](docs/env-usage-en.md),
and [Task Definition Document](docs/task-definition-en.md).

# Mobile-Env: Building Qualified Evaluation Benchmarks for GUI Interaction

Mobile-Env is an interaction platform for building evaluation benchmarks for
GUI interaction and for evaluating and training GUI agents. Our paper is
available at [arXiv](https://arxiv.org/abs/2305.08144).

Mobile-Env is developed based on
[AndroidEnv](https://github.com/deepmind/android_env). The agent takes the
screenshot and the view hierarchy (disabled by default owing to its long
latency) as the observation, and touches the screen or types a token as the
action to interact with Android apps. Episode signals such as step
instructions, rewards, or the episode end are delivered at certain crucial
steps during interaction. A crucial step may be opening a target page,
scrolling to the correct area, *etc.*, and depends on the specific [task
definition](docs/task-definition-en.md).

The proposed WikiHow task set is available at the [Hugging Face
Platform](https://huggingface.co/datasets/zdy023/WikiHow-taskset).

* [Documents in Chinese (中文文档)](README-zh.md)

## Index

* [Evaluating and Training Agents on Mobile-Env](docs/env-usage-en.md)
* [Extending a New Environment (App) or a New Task Based on
  Mobile-Env](docs/task-definition-en.md)
* [Certificate Pinning Problem & Solutions](docs/dynamic-app-en.md)
* [Miscellaneous Auxiliary Tools](docs/other-tools-en.md)

## Platform Features

Mobile-Env is a flexible, adaptable, and easily extendable platform for InfoUI
interaction with the following features:

* Both the screenshot and the view hierarchy are provided as the observation,
  and touching and token typing are provided as the action. Wrappers are also
  supported to customize the observation and action spaces (a sketch follows
  this list). Thus, both visual-based and text-based agents, with either
  continuous or discrete action spaces, can be evaluated on Mobile-Env.
* New tasks can be easily extended through task definition files.
* Task events can be parsed from multiple sources in the operating system:
  screen text, screen icons, the view hierarchy, and the system log, which
  makes Mobile-Env capable of adapting to most real-world apps without
  dedicated development. (Screen text and screen icons require an external OCR
  tool and icon recognition tool. Currently, a wrapper for
  [EasyOCR](https://github.com/JaidedAI/EasyOCR) is integrated into the
  platform and can be enabled directly. An integrated icon model will be
  embedded soon as well.)
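
As a concrete picture of the wrapper mechanism, the sketch below narrows a
dict observation down to the screenshot so that a purely visual agent can be
plugged in. It is not one of Mobile-Env's shipped wrappers (such as
`VhIoWrapper` or `TapActionWrapper`); the `make_mobile_env()` constructor and
the `'pixels'` key are assumptions made for the example.

```python
# A minimal gymnasium-style observation wrapper sketch, not Mobile-Env API.
import gymnasium as gym


class PixelsOnlyWrapper(gym.ObservationWrapper):
    """Keep only the 'pixels' (screenshot) entry of a dict observation."""

    def __init__(self, env: gym.Env):
        super().__init__(env)
        # Assumes the wrapped env exposes a Dict observation space with a
        # 'pixels' entry; adjust the key to the actual observation layout.
        self.observation_space = env.observation_space["pixels"]

    def observation(self, observation):
        return observation["pixels"]


# env = PixelsOnlyWrapper(make_mobile_env())  # hypothetical constructor
```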

## Getting Started

### Installation

Install from PyPI:

```sh
pip install mobile-env-rl
```

or clone the repository and install it locally:

```sh
git clone https://github.com/X-LANCE/Mobile-Env
cd Mobile-Env
pip install .
```

Several [Docker images](https://hub.docker.com/r/zdy023/mobile-env-rl) with
well-configured Android AVDs are also available.

### Load and Run Mobile-Env for Evaluation or Training

Before loading the Mobile-Env environment, you will need to set up an [Android
Emulator](https://developer.android.com/about) device. Then you can load the
environment with some existing task definitions and start your experiments.
Detailed guidance is provided in [Evaluating and Training Agents on
Mobile-Env](docs/env-usage-en.md). Several examples with a random agent or a
human agent are also provided under `examples`; a minimal sketch is given
below.
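
The following random-agent sketch assumes the environment is exposed through a
gymnasium-compatible interface (Mobile-Env migrated from `gym` to `gymnasium`
in v4.0). How the environment object is constructed depends on your AVD, SDK
paths, and task definition files; `load_mobile_env` is a hypothetical
placeholder for the loader described in the usage document, not the actual
entry point.

```python
# A minimal random-agent rollout over a gymnasium-compatible environment.
import gymnasium as gym


def run_random_episode(env: gym.Env, max_steps: int = 500) -> float:
    """Roll out one episode with uniformly sampled actions; return the return."""
    episode_return = 0.0
    obs, info = env.reset()
    for _ in range(max_steps):
        action = env.action_space.sample()  # random agent
        obs, reward, terminated, truncated, info = env.step(action)
        episode_return += float(reward)
        if terminated or truncated:  # episode end (success/failure) or truncation
            break
    return episode_return


# env = load_mobile_env(task_path="...", avd_name="...")  # hypothetical loader
# print(run_random_episode(env))
```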

### Extend a New Environment or a New Task

To extend a new environment for Mobile-Env, the environment designer needs to
prepare the app package and ensure that the package manages to launch and run
on some versions of the Android Emulator. If the app requires varying online
data, the necessary data should be crawled, dumped, and then replayed for a
consistent evaluation. In such a case, the designer is expected to validate the
effectiveness of the [certificate unpinning plan](docs/dynamic-app-en.md) for
the package. To extend new tasks, only task definition files are required.
Detailed instructions can be found in [Extending a New Environment (App) or a
New Task Based on Mobile-Env](docs/task-definition-en.md).

Several demo task definitions are provided under `demos`. Three of them are
migrated from AndroidEnv:

* `classic_2048.m.textproto` - [Classic 2048
  game](https://github.com/google-deepmind/android_env/blob/main/docs/example_tasks.md#classic-2048).
* `accessibility_forwarder_clock_set_timer.m.textproto` - A simple task
  requiring the agent to [reset a running
  timer](https://github.com/google-deepmind/android_env/blob/main/docs/example_tasks.md#accessibility-forwarder).
* `systemui_egg_land_default.m.textproto` - [Flappy
  Droid](https://github.com/google-deepmind/android_env/blob/main/docs/example_tasks.md#flappydroid).
  An open-source implementation of the classic game Flappy Bird.

Another demo, `openmoneybox.add_billings.textproto`, is defined upon the
open-source billing app
[OpenMoneyBox](https://f-droid.org/en/packages/com.igisw.openmoneybox/).
Details can be found in the task definition files.

### Miscellaneous Auxiliary Tools

We also developed an annotation tool for human demonstrations, and a suite of
template tools to auto-generate task definitions according to templates and to
combine multiple task definitions into a multi-step task. Details can be found
in [Miscellaneous Auxiliary Tools](docs/other-tools-en.md).

### Reference Time Consumption and Memory Usage of Mobile-Env

The data are measured under the configuration below:

* OS and hardware:
  * Operating System: Manjaro 23.1.0 Vulcan
  * Kernel Version: x86\_64 Linux 6.1.64-1-MANJARO
  * CPU: Intel Core i7-10700 @ 16x 4.8GHz
  * GPU: NVIDIA GeForce RTX 3090
  * RAM: 64 GB
  * KVM acceleration enabled
* Android development tools
  * Android emulator version 32.1.14.0
  * Android platform tools 34.0.4
  * libvirt 1:9.9.0
* Python & packages
  * Python 3.8.16
  * EasyOCR 1.7.2
  * sentence-transformers 2.2.2
* Android Virtual Device
  * Device type: Pixel 2
  * API version: API 30
  * OS Variant: Google APIs
  * CPU cores: 4
  * Memory: 8 GB
  * Screen size: 1080×1920

|                           Item                          |     Avg Time     |    Time Std Dev   |
|:-------------------------------------------------------:|:----------------:|:-----------------:|
|                      `TOUCH` action                     |     410.50 µs    |      64.71 µs     |
|                      `LIFT` action                      |     412.30 µs    |      84.18 µs     |
|                      `TEXT` action                      | ~~1.30 s~~ 0.58 s | ~~0.28 s~~ 0.03 s |
|                   screenshot capturing                  |     19.94 ms     |      21.47 ms     |
| invocation of Sentence Transformer (all-MiniLM-L12-v2) |      8.51 ms     |      0.17 ms      |
|                       VH capturing                      |      2.53 s      |       1.90 s      |
|                  invocation of EasyOCR                  |      0.44 s      |       0.08 s      |

When only the [WikiHow app, version
2.9.6](https://apkcombo.com/zh/wikihow-how-to-do-anything/com.wikihow.wikihowapp/download/apk),
is running, the Android emulator occupies 6,031 MiB of virtual memory and 3,444
MiB of resident memory. (The sketch below shows one way such figures can be
read.)
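
The following sketch shows one way the virtual vs. resident memory of the
emulator process can be read with `psutil`. It assumes the emulator process
name contains "qemu" (typical for the Android Emulator, but it may differ on
your setup), and it is not necessarily how the figures above were collected.

```python
# Report virtual (vms) and resident (rss) memory of emulator-like processes.
import psutil

MIB = 1024 * 1024

for proc in psutil.process_iter(["name", "memory_info"]):
    name = proc.info["name"] or ""
    if "qemu" in name:
        mem = proc.info["memory_info"]
        print(f"{name}: virtual {mem.vms / MIB:,.0f} MiB, "
              f"resident {mem.rss / MIB:,.0f} MiB")
```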

## About

This library is developed and maintained by [SJTU
X-Lance](https://x-lance.sjtu.edu.cn/en). The corresponding paper is available
at <https://arxiv.org/abs/2305.08144>.

If you find Mobile-Env useful in your research, you can cite the project using
the following BibTeX:

```bibtex
@article{DanyangZhang2023_MobileEnv,
  title     = {{Mobile-Env}: Building Qualified Evaluation Benchmarks for LLM-GUI Interaction},
  author    = {Danyang Zhang and
               Zhennan Shen and
               Rui Xie and
               Situo Zhang and
               Tianbao Xie and
               Zihan Zhao and
               Siyuan Chen and
               Lu Chen and
               Hongshen Xu and
               Ruisheng Cao and
               Kai Yu},
  journal   = {CoRR},
  volume    = {abs/2305.08144},
  year      = {2023},
  url       = {https://arxiv.org/abs/2305.08144},
  eprinttype = {arXiv},
  eprint    = {2305.08144},
}
```

            
