# Contact-GraspNet PyTorch
PyTorch implementation of Contact-GraspNet. This repo is based heavily on
https://github.com/alinasarmiento/pytorch_contactnet. The original TensorFlow implementation can be found at https://github.com/NVlabs/contact_graspnet
## Installation Instructions
`pip install cgn-pytorch`
## Usage
`import cgn_pytorch`
`cgn_model, optimizer, config_dict = cgn_pytorch.from_pretrained(cpu=False)`
## Run The Demo
### Clone this Repo
`git clone https://github.com/sebbyjp/cgn_pytorch.git`
### Install Dependencies
`pip3 install -r requirements.txt`
### Visualization
Visualizations are done in MeshCat. In a separate tab, start a MeshCat server with `meshcat-server`.
From here you can run `python3 eval.py`.
To visualize different confidence threshold masks on the grasps, use the `--threshold` argument, e.g. `--threshold=0.8`.
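Conceptually, the threshold simply masks out grasps whose predicted confidence falls below the cutoff. A minimal numpy sketch of that filtering (the scores and variable names here are illustrative, not the repo's actual code):

```python
import numpy as np

# Hypothetical per-grasp confidence scores (not real model output).
confidences = np.array([0.95, 0.42, 0.81, 0.67])

# Keep only grasps at or above the cutoff, mirroring --threshold=0.8.
threshold = 0.8
mask = confidences >= threshold
kept = confidences[mask]
print(mask.tolist())  # [True, False, True, False]
```

Raising the threshold shrinks the set of displayed grasps to the most confident ones.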
To visualize different cluttered scenes (8-12 tabletop objects rendered in pyrender from the ACRONYM dataset), use the `--scene` argument and pass a file name such as `--scene=002330.npz`. Your possible files are:
- 002330.npz
- 004086.npz
- 005274.npz
## Predicting grasps on your own pointclouds
The model should work on any pointcloud of shape (N, 3). For the most consistent results, put the pointcloud in the world frame and center it by subtracting its mean. Do not normalize the pointcloud to a unit sphere or unit box: "graspability" naturally depends on the size of the objects, and scaling would discard that information about the scene.
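The preprocessing described above can be sketched with numpy (the cloud here is synthetic and the variable names are illustrative, not part of the repo's API):

```python
import numpy as np

# Example pointcloud of shape (N, 3), already expressed in the world frame.
pointcloud = np.random.rand(1000, 3) * 0.5 + np.array([0.2, -0.1, 0.8])

# Center by subtracting the mean -- but do NOT rescale to a unit sphere or
# box, since graspability depends on the absolute size of the objects.
centroid = pointcloud.mean(axis=0)
centered = pointcloud - centroid
```

If needed, the saved `centroid` can later be used to translate predicted grasp poses back into the original world frame.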