pixelstitch

Name: pixelstitch
Version: 0.1.5
Home page: https://github.com/ducha-aiki/pixelstitch/tree/master/
Summary: Matplotlib-based tool for labeling the two-view correspondences
Upload time: 2024-08-06 07:06:00
Author: Dmytro Mishkin
Requires Python: >=3.8
License: Apache Software License 2.0
Keywords: local features, correspondences, RANSAC, image matching, WBS, WxBS
# pyxelstitch
> Simple, matplotlib-based tool for hand-labeling two-image correspondences


## Install

`pip install pixelstitch`

## How to use

Let's test our annotator on a sample project. It needs a list of triplets: (`path_to_img1`, `path_to_img2`, `path_to_corrs_to_save`).

```python
import os
rootdir = 'sample_project'
pairs = os.listdir(rootdir)
img_pairs_list = []
for p in pairs:
    if p == '.DS_Store':  # skip macOS metadata files
        continue
    cur_dir = os.path.join(rootdir, p)
    img_pairs_list.append((os.path.join(cur_dir, '01.png'),
                           os.path.join(cur_dir, '02.png'),
                           os.path.join(cur_dir, 'corrs.txt')))

print(img_pairs_list)
```

    [('sample_project/ministry/01.png', 'sample_project/ministry/02.png', 'sample_project/ministry/corrs.txt'), ('sample_project/petrzin/01.png', 'sample_project/petrzin/02.png', 'sample_project/petrzin/corrs.txt')]




Now we are ready to initialize the `CorrespondenceAnnotator`. Don't forget to run the magic command ```%matplotlib widget``` first.
**WITHOUT THE MAGIC COMMAND IT WILL NOT WORK.**

You should also explicitly specify whether you want to save the current correspondences automatically (possibly overwriting a previous, better annotation) when clicking the **prev** and **next** buttons to move to another pair.

```python
%matplotlib widget
from pixelstitch.core import *
CA = CorrespondenceAnnotator(img_pairs_list, save_on_next=True)
```



Now we can run the annotation. 

**Left-click** on the image to add a point.

**Right-click** to remove the point from both images.

### Matplotlib shortcuts:

- **o** for zoom 
- **p** for pan (move)

It is also recommended to set the Jupyter notebook to full page width:


```python
%matplotlib widget
from IPython.display import display, HTML
display(HTML("<style>.container { width:95% !important; }</style>"))
CA.start(figsize=(12,7))
```

![image.png](index_files/att_00008.png)

# Controls

## Selectors

- `Model`. One can select between "`F`" -- fundamental matrix -- and "`H`" -- homography. The selection influences the type of reprojection error and the visualization of the models and reprojection errors shown when clicking the `NextCorrsValPoint`, `NextCorrsValAll`, and `ShowModel` buttons (see the sketch below).
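
The practical difference between the two error types can be illustrated with a small sketch using OpenCV. This is an illustration only, not pixelstitch's internal code; the helper name `point_errors` is made up for this example.

```python
# Illustration only (not pixelstitch internals): per-correspondence error
# for the two model types.
import cv2
import numpy as np

def point_errors(pts1, pts2, model_type="F"):
    """pts1, pts2: Nx2 pixel coordinates of matched points (hypothetical helper)."""
    pts1 = np.asarray(pts1, dtype=np.float64)
    pts2 = np.asarray(pts2, dtype=np.float64)
    if model_type == "F":
        # Fundamental matrix: distance from a point in image 2 to the epipolar
        # line induced by its counterpart in image 1.
        F, _ = cv2.findFundamentalMat(pts1, pts2, cv2.FM_8POINT)  # needs >= 8 points
        lines2 = cv2.computeCorrespondEpilines(pts1.reshape(-1, 1, 2), 1, F).reshape(-1, 3)
        pts2_h = np.hstack([pts2, np.ones((len(pts2), 1))])
        return np.abs(np.sum(lines2 * pts2_h, axis=1)) / np.linalg.norm(lines2[:, :2], axis=1)
    # Homography: Euclidean distance between a point in image 2 and its
    # counterpart from image 1 warped by H.
    H, _ = cv2.findHomography(pts1, pts2, 0)  # needs >= 4 points
    proj = cv2.perspectiveTransform(pts1.reshape(-1, 1, 2), H).reshape(-1, 2)
    return np.linalg.norm(proj - pts2, axis=1)
```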

## Buttons

- `NextCorrsValPoint`. Shows the correspondence in the bottom axis. The image title shows the correspondence index and the reprojection error. If `Model` is `F`, it shows the induced epipolar line; if `H`, the position of the point reprojected from the other image. The model is estimated from all correspondences except the current one (see the leave-one-out sketch after this list).

- `NextCorrsValAll`. Shows the correspondences in the bottom axis. Similar to the `NextCorrsValPoint` button, but shows all points. The model is estimated from all correspondences except the current one, whose index is shown in the title.
![image.png](index_files/att_00005.png)


- `ShowModel`. Has different behavior depending on the selected `Model`. For `F`, it shows the correspondences with their induced epipolar lines. Unlike `NextCorrsValPoint` and `NextCorrsValAll`, all correspondences are used for model estimation.
![image.png](index_files/att_00003.png)

- For `H`, the button shows an overlay of image 1 reprojected into image 2 on top of image 2. The reprojected area is defined by the convex hull of the labelled correspondences. The next click flips the order, i.e. shows image 2 reprojected into image 1.
![image.png](index_files/att_00002.png)

- `Save points` -- Saves (overwrites) the correspondences to the text file.

- `Prev` -- Loads and shows the previous image pair to label. If the `CorrespondenceAnnotator` was initialized with `save_on_next=True`, the current pair's correspondences are saved before the switch. Navigation is not cyclic, so the button does nothing on the first image pair.

- `Next` -- Loads and shows the next image pair to label. If the `CorrespondenceAnnotator` was initialized with `save_on_next=True`, the current pair's correspondences are saved before the switch. Navigation is not cyclic, so the button does nothing on the last image pair.

- `CLAHE` -- Images are shown with contrast enhanced by the [CLAHE](http://amroamroamro.github.io/mexopencv/opencv/clahe_demo_gui.html) algorithm (see the sketch after this list).
![image.png](index_files/att_00004.png)


- `Pick 4 points for homography` -- A special mode: the user picks 4 points in one image, which define a new fronto-parallel view. This mode helps with labeling difficult views; see the example below (a code sketch follows it). The order of the points is top-left -> top-right -> bottom-right -> bottom-left. After the 4 points are picked, the mode is switched off, so the user needs to click the button again to rectify another image.

- `ResetView` -- Resets any rectifications or zoom applied to the images.
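
As an illustration of the leave-one-out idea behind `NextCorrsValPoint` and `NextCorrsValAll`, here is a minimal sketch using OpenCV, shown for the `H` model. It is not the tool's actual implementation, and the helper name `leave_one_out_h_errors` is hypothetical.

```python
# Illustration only: fit a homography on all correspondences except index i,
# then report the reprojection error of the held-out correspondence.
import cv2
import numpy as np

def leave_one_out_h_errors(pts1, pts2):
    """pts1, pts2: Nx2 arrays of matched points, N >= 5 (hypothetical helper)."""
    pts1 = np.asarray(pts1, dtype=np.float64)
    pts2 = np.asarray(pts2, dtype=np.float64)
    errors = []
    for i in range(len(pts1)):
        keep = np.arange(len(pts1)) != i
        H, _ = cv2.findHomography(pts1[keep], pts2[keep], 0)  # plain least-squares fit
        proj = cv2.perspectiveTransform(pts1[i].reshape(1, 1, 2), H).reshape(2)
        errors.append(float(np.linalg.norm(proj - pts2[i])))
    return errors
```

For the `CLAHE` button, here is a minimal sketch of what CLAHE contrast enhancement looks like in OpenCV. Again, this is only an illustration; pixelstitch's own implementation may differ, and `enhance_contrast_clahe` is a made-up name.

```python
# Illustration only: CLAHE applied to the lightness channel of a BGR image.
import cv2

def enhance_contrast_clahe(bgr_image, clip_limit=2.0, grid=(8, 8)):
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=grid)
    l_eq = clahe.apply(l)  # equalize only the lightness channel
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
```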



### Rectification picking mode example

3 points are selected
![image.png](index_files/att_00006.png)

All points are selected and image 1 is rectified to the rectangle defined by the selected points
![image.png](index_files/att_00007.png)
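
For reference, here is a minimal sketch (not the tool's internal code) of how such a fronto-parallel rectification can be computed from the four picked points with OpenCV. The helper name `rectify_from_quad` and the output size are assumptions for this example.

```python
# Illustration only: rectify an image to a fronto-parallel view from four
# picked points given in top-left -> top-right -> bottom-right -> bottom-left order.
import cv2
import numpy as np

def rectify_from_quad(image, quad, out_w=800, out_h=600):
    """image: H x W x 3 array; quad: 4x2 clicked points (TL, TR, BR, BL)."""
    src = np.asarray(quad, dtype=np.float32)
    dst = np.array([[0, 0], [out_w - 1, 0],
                    [out_w - 1, out_h - 1], [0, out_h - 1]], dtype=np.float32)
    H = cv2.getPerspectiveTransform(src, dst)  # exact homography from the 4 point pairs
    return cv2.warpPerspective(image, H, (out_w, out_h))
```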

            
