luckyrobots

Name: luckyrobots
Version: 0.1.36
Home page: https://github.com/lucky-robots/lucky-robots
Summary: Robotics-AI Training in Hyperrealistic Game Environments
Upload time: 2024-10-14 22:11:10
Maintainer: None
Docs URL: None
Author: Devrim Yasar
Requires Python: >=3.8
License: None
Keywords: None
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.

# Lucky Robots
## Robotics-AI Training in Hyperrealistic Game Environments

https://github.com/user-attachments/assets/f500287b-b785-42bb-92d8-6d90b5602d6b

Lucky Robots: where robots come for boot camp, like going to a spa day! 🤖💆‍♂️ We use the fancy Unreal Engine 5.3, Open3D and YOLOv8 to create a lavish, virtual 5-star resort experience for our metal buddies so they're absolutely pumped before they meet the real world. Our training framework? More like a robotic paradise with zero robot mishaps. "Gentle" methods? We're practically robot whisperers here; you won't see humans with metal sticks around. That's why it's "Lucky Robots": every robot leaves our sessions feeling like it just won the jackpot – all without a scratch! 🎰🤣

Joking aside, whether you're an AI/ML developer, an aspiring designer of the next Roomba, or simply curious about robotics, there's no need to invest thousands of dollars in a physical robot and attempt to train it in your living space. With Lucky, you can master the art of training your robot, simulate its behavior with up to 90% accuracy, and then reach out to the robot's manufacturer to launch your new company.

Remember, no robots were emotionally or physically harmed in our ultra-luxurious training process. Thus, Lucky Robots!

Cheers to happy, and more importantly, unbruised robots! 🍀🤖🎉

(Note for the repository: Our Unreal repo got too big for GitHub (250GB+!), so we moved it to a local Perforce server. Major bummer for collab, we know. 😞 We're setting up read-only FTP access to the files. If you need access or have ideas to work around this, give me a shout. I'll hook you up with FTP access for now.)

## Getting Started

To begin using Lucky Robots:


1. (Optional) If you want to run the examples in this repository, clone it:

```
   git clone https://github.com/luckyrobots/luckyrobots.git
   cd luckyrobots/examples
```
2. (Optional) Create and activate an environment with your favorite package manager:
```
   conda create -n lr
   conda activate lr
```

3. Install the package using pip:
```
   pip install luckyrobots
```

4. Run one of the example scripts:
```
   python basic_usage.py 
   python yolo_example.py
   python yolo_mac_example.py
```

The example will download the simulation binary and run it for you.

## Event Listeners

Lucky Robots provides several event listeners to interact with the simulated robot and receive updates on its state:

1. **@lr.on("robot_output")**: Receives robot output, including RGB and depth images, and coordinates.

   Example output:
   ```python
   {
       "body_pos": {"Time": "1720752411", "rx": "-0.745724", "ry": "0.430001", "rz": "0.007442", "tx": "410.410786", "ty": "292.086556", "tz": "0.190011", "file_path": "/.../4_body_pos.txt"},
       "depth_cam1": {"file_path": "/.../4_depth_cam1.jpg"},
       "depth_cam2": {"file_path": "/.../4_depth_cam2.jpg"},
       "hand_cam": {"Time": "1720752411", "rx": "-59.724758", "ry": "-89.132507", "rz": "59.738461", "tx": "425.359645", "ty": "285.063092", "tz": "19.006545", "file_path": "/.../4_hand_cam.txt"},
       "head_cam": {"Time": "1720752411", "rx": "-0.749195", "ry": "0.433544", "rz": "0.010893", "tx": "419.352843", "ty": "292.814832", "tz": "59.460736", "file_path": "/.../4_head_cam.txt"},
       "rgb_cam1": {"file_path": "/.../4_rgb_cam1.jpg"},
       "rgb_cam2": {"file_path": "/.../4_rgb_cam2.jpg"}
   }
   ```

2. **@lr.on("message")**: Decodes messages from the robot to understand its internal state.
3. **@lr.on("start")**: Triggered when the robot starts, allowing for initialization tasks.
4. **@lr.on("tasks")**: Manages the robot's task list.
5. **@lr.on("task_complete")**: Triggered when the robot completes a task.
6. **@lr.on("batch_complete")**: Triggered when the robot completes a batch of tasks.
7. **@lr.on("hit_count")**: Tracks the robot's collisions.

## Controlling the Robot

To control the robot, send commands using the `lr.send_message()` function. For example, to make the robot's main wheels turn 10 times:

```python
commands = [["W 3600 1"]]  # This makes the main wheels turn 10 times.
```

To send multiple commands and know when a particular one ends, assign an `id` field to your command:

```python
commands = [[{"id": 1234, "code": "W 18000 1"}]]
```

If you want to send a whole set of instructions, add multiple arrays. Each array will wait until the previous array finishes. Commands inside one array are executed simultaneously, allowing smoother movements like the robot lifting its arms while moving forward or turning its head while placing an object. 

```python
commands = [["W 1800 1","a 30"],["a 0", "W 1800 1"]]
```

Commands in one list will override previous commands if they conflict. For instance, if you instruct your robot to turn its wheels 20 times, and on the 5th turn, you instruct it again to turn 3 times, the robot will travel a total of 8 revolutions and stop.

To know when a particular batch finishes, give it an ID and listen for that ID:

```python
commands = [
    ["RESET"],
    {"commands": [{"id": 123456, "code": "W 5650 1"}, {"id": 123457, "code": "a 30 1"}], "batchID": "123456"},
    ["A 0 1", "W 18000 1"]
]
lr.send_message(commands)
```
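To close the loop on the batch above, you can pair the IDs with the `task_complete` and `batch_complete` listeners. The payload shapes below are assumptions (they are not documented here), so treat this as a hedged sketch and inspect the messages you actually receive:

```python
import luckyrobots as lr  # assumed import alias

@lr.on("task_complete")
def on_task_complete(message):
    # The individual command IDs (123456, 123457) should be identifiable here;
    # print the payload to confirm its exact shape.
    print("Task done:", message)

@lr.on("batch_complete")
def on_batch_complete(message):
    # Match this against the "batchID" assigned above ("123456").
    print("Batch done:", message)
```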

## MOVING THE ROBOTS

### FORWARD - BACKWARD
- `[DIRECTION] [DISTANCE] [SPEED]` Example: `W 50 1`
  - `[DIRECTION]`: W is forward, S is backward
  - `[DISTANCE]`: Travel distance in centimeters
  - `[SPEED]`: Speed at which the motor moves, in km/h
  - Send via API: `lr.send_message([["W 50 1"]])`

### LEFT - RIGHT
- `[DIRECTION] [DEGREE]` Example: `A 30`
  - `[DIRECTION]`: A is left, D is right
  - `[DEGREE]`: Spin Rotation in degrees
  - Or: `lr.send_message([["A 30"]])`

### RESET
- `RESET`: Resets all positions and rotations to the zero pose
- Or: `lr.send_message([["RESET"]])`

### STRETCH-3 

- `[JOINT] [DISTANCE]` Example: `EX1 30`
  - `EX1 10`  (extend 1st joint 10cm outwards)
  - `EX2 -10` (extend 2nd joint 10cm inwards)
  - `EX3 10`  (extend 3rd joint 10cm outwards)
  - `EX4 10`  (extend 4th joint 10cm outwards)
  - Or: `lr.send_message([["EX1 10"]])`, `lr.send_message([["EX2 -10"]])`, etc.

- `U 10` (Up) - Or: `lr.send_message([["U 10"]])`
- `U -10` (Down) - Or: `lr.send_message([["U -10"]])`

- Gripper: `G 5` or `G -10` - Or: `lr.send_message([["G 5"]])` or `lr.send_message([["G -10"]])`

- Hand Cam Angle:
  - `R1 10` - Or: `lr.send_message([["R1 10"]])`
  - `R2 -30` (turn cam) - Or: `lr.send_message([["R2 -30"]])`

### LUCKY ROBOT-3 

- `[JOINT] [DEGREE]` Example: `EX1 30`
  - `EX1 20`  (rotate the 1st joint 20 degrees)
  - `EX2 -10` (rotate the 2nd joint -10 degrees)
  - `EX3 10`  (rotate the 3rd joint 10 degrees)
  - `EX4 10`  (rotate the 4th joint 10 degrees)
  - Or: `lr.send_message([["EX1 20"]])`, `lr.send_message([["EX2 -10"]])`, etc.

- `U 10` (Up) - Or: `lr.send_message([["U 10"]])`
- `U -10` (Down) - Or: `lr.send_message([["U -10"]])`

- Gripper: `G 5` or `G -10` - Or: `lr.send_message([["G 5"]])` or `lr.send_message([["G -10"]])`

- Hand Cam Angle: `R 10` - Or: `lr.send_message([["R 10"]])`
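
Putting a few of these codes together, the sketch below strings a short motion sequence into one `lr.send_message()` call. The distances and the gripper sign are illustrative assumptions, and `lr` is the assumed import alias from earlier:

```python
import luckyrobots as lr  # assumed import alias

lr.send_message([
    ["W 100 1"],           # drive forward 100 cm at 1 km/h
    ["U 20", "EX1 10"],    # raise the lift 20 cm while moving the 1st joint
    ["G -10"],             # move the gripper (sign convention not documented above)
    ["EX1 -10", "U -20"],  # bring the joint and the lift back
    ["RESET"],             # return everything to the zero pose
])
```

Each inner list runs concurrently; the outer lists run one after another, as described in "Controlling the Robot".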

## Starting the Robot

To start the robot simulation, use:

```python
lr.start(binary_path, sendBinaryData=False)
```
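
A minimal end-to-end script might therefore look like the sketch below. The import alias and the binary path are placeholders (per Getting Started, the bundled examples fetch the binary for you), so adapt both to your setup:

```python
import luckyrobots as lr  # assumed import alias

@lr.on("start")
def on_start():
    # Once the simulator reports it has started, drive forward 50 cm
    # at 1 km/h, then turn left 30 degrees (the second batch waits for the first).
    lr.send_message([["W 50 1"], ["A 30"]])

binary_path = "/path/to/luckyworld-binary"  # placeholder path; see basic_usage.py
lr.start(binary_path, sendBinaryData=False)
```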

### WHAT WE ARE WORKING ON NEXT

*   Releasing our first basic end-to-end model
*   Drone!!!
*   Scan your own room
*   Import URDFs
*   (your idea?)




### Brief history of the project

** UPDATE 3/19/24 **

First Lucky World Ubuntu build is complete: https://drive.google.com/drive/folders/15iYXzqFNEg1b2E6Ft1ErwynqBMaa0oOa

** UPDATE 3/6/24 **

We have designed the [Stretch 3 Robot](https://hello-robot.com/stretch-3-product) and are working on adding this robot to our world.

<img width="504" alt="image" src="https://github.com/lucky-robots/lucky-robots/assets/203507/54b1bbbc-67e0-4add-a58f-84b08d14e680">


** UPDATE 1/6/24 **

WE GOT OUR FIRST TEST LINUX BUILD (NOT THE ACTUAL WORLD, THAT'S BEING BUILT) (TESTED ON UBUNTU 22.04)

https://drive.google.com/file/d/1_OCMwn8awKZHBfCfc9op00y6TvetI18U/view?usp=sharing


** UPDATE 2/15/24 **

[Luck-e World second release is out (Windows only - we're working on Linux build next)!](https://drive.google.com/drive/folders/10sVx5eCcx7d9ZR6tn0zqeQCaOF84MIQt)


** UPDATE 2/8/24 **

We are now writing prompts against the 3d environment we have reconstructed using point clouds...


https://github.com/lucky-robots/lucky-robots/assets/203507/a93c9f19-2891-40e1-8598-717ad13efba6




** UPDATE 2/6/24 **

Lucky's first release: https://drive.google.com/file/d/1qIbkez1VGU1WcIpqk8UuXTbSTMV7VC3R/view?amp;usp=embed_facebook

Now you can run the simulation on your Windows Machine, and run your AI models against it. If you run into issues, please submit an issue.

** UPDATE 1/15/24 **

Luck-e is starting to understand the world around us and navigate accordingly!

https://github.com/lucky-robots/lucky-robots/assets/203507/4e56bbc5-92da-4754-92f4-989b9cb86b6f


** UPDATE 1/13/24 **

We are able to construct a 3D world using a single camera, thanks to @niconielsen32. (This is not a 3D room generated by a game engine; this is what we generate from what we're seeing through a camera in the game!)

https://github.com/lucky-robots/lucky-robots/assets/203507/f2fd19ee-b40a-4fef-bd30-72c56d0f9ead



** UPDATE 12/29/23 **

We are now flying! Look at these environments; can you tell they're not real?

![Screenshot_18](https://github.com/lucky-robots/lucky-robots/assets/203507/f988a18e-9dc3-484e-9d9f-eb7ad57180b2)
![Screenshot_17](https://github.com/lucky-robots/lucky-robots/assets/203507/f423d73f-d336-47b6-abf0-6f1b174bd740)
![Screenshot_19](https://github.com/lucky-robots/lucky-robots/assets/203507/7f2b9ae2-f84f-41a1-8511-959e2586b809)
![Screenshot_15](https://github.com/lucky-robots/lucky-robots/assets/203507/d65a0fb4-3a4d-4207-9181-2de0e2ce63ce)
![Screenshot_11](https://github.com/lucky-robots/lucky-robots/assets/203507/cf328e8d-fc40-4be3-81ac-a900d0505fd8)
![Screenshot_14](https://github.com/lucky-robots/lucky-robots/assets/203507/5ae9bf2d-246b-437f-ba1b-901a7f10b1fa)
![Screenshot_12](https://github.com/lucky-robots/lucky-robots/assets/203507/e2f0684e-ca18-40b0-8680-76ccec918171)
![Screenshot_8](https://github.com/lucky-robots/lucky-robots/assets/203507/26904b69-c8b8-467d-8355-595cc62ead3f)
![Screenshot_7](https://github.com/lucky-robots/lucky-robots/assets/203507/e43e25b0-b68d-4c1e-9a7d-800b9cf5312b)


** UPDATE 12/27/23 **

Lucky now has a drone - like the Mars rover! When it's activated, the camera feed switches to it automatically!


https://github.com/lucky-robots/lucky-robots/assets/203507/29103a5a-a209-4d49-acd1-adad88e5b590


** UPDATE 12/5/23 **

Completed our first depth map using the MiDaS monocular depth estimation model.


https://github.com/lucky-robots/lucky-robots/assets/203507/647a5c32-297a-4157-b72b-afeacdaae48a


https://user-images.githubusercontent.com/203507/276747207-b4db8da0-a14e-4f41-a6a0-ef3e2ea7a31c.mp4

## Table of Contents

- [Features](#features)
- [Support](#support)
- [Contributing](#contributing)
- [Join Our Team](#join-our-team)
- [License](#license)

## Features

1. **Realistic Training Environments**: Train your robots in various scenarios and terrains crafted meticulously in Unreal Engine.
2. **Python Integration**: The framework integrates seamlessly with Python 3.10, enabling developers to write training algorithms and robot control scripts in Python.
3. **Safety First**: No physical wear and tear on robots during training. Virtual training ensures that our robotic friends remain in tip-top condition.
4. **Modular Design**: Easily extend and modify the framework to suit your specific requirements or add new training environments.


## Support

For any queries, issues, or feature requests, please refer to our [issues page](https://github.com/LuckyRobots/LuckyRobotsTrainingFramework/issues).

## Contributing

We welcome contributions! Please read our [contributing guide](CONTRIBUTING.md) to learn about our development process, how to propose bugfixes and improvements, and how to build and test your changes to Lucky Robots.

## Join our team?

Absolutely! Show us a few cool things and/or contribute a few PRs -- let us know!

## License

Lucky Robots Training Framework is released under the [MIT License](LICENSE.md).

---

Happy training! Remember, be kind to robots. 🤖💚

            
