# PytorchWildlife

- **Name:** PytorchWildlife
- **Version:** 1.0.2.14 ([PyPI](https://pypi.org/project/PytorchWildlife/))
- **Home page:** https://github.com/microsoft/CameraTraps/
- **Summary:** A PyTorch Collaborative Deep Learning Framework for Conservation.
- **Upload time:** 2024-05-08 22:03:42
- **Author:** Andres Hernandez, Zhongqi Miao
- **Requires Python:** >=3.8
- **License:** MIT
- **Keywords:** pytorch_wildlife, pytorch, wildlife, megadetector, conservation, animal, detection, classification
![image](https://microsoft.github.io/CameraTraps/assets/Pytorch_Banner_transparentbk.png)

<div align="center"> 
<font size="6"> A Collaborative Deep Learning Framework for Conservation </font>
<br>
<hr>
<a href="https://pypi.org/project/PytorchWildlife"><img src="https://img.shields.io/pypi/v/PytorchWildlife?color=limegreen" /></a> 
<a href="https://pypi.org/project/PytorchWildlife"><img src="https://static.pepy.tech/badge/pytorchwildlife" /></a> 
<a href="https://pypi.org/project/PytorchWildlife"><img src="https://img.shields.io/pypi/pyversions/PytorchWildlife" /></a> 
<a href="https://huggingface.co/spaces/ai4g-biodiversity/pytorch-wildlife"><img src="https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Demo-blue" /></a>
<a href="https://colab.research.google.com/drive/1rjqHrTMzEHkMualr4vB55dQWCsCKMNXi?usp=sharing"><img src="https://img.shields.io/badge/Colab-Demo-blue?logo=GoogleColab" /></a>
<!-- <a href="https://colab.research.google.com/drive/16-OjFVQ6nopuP-gfqofYBBY00oIgbcr1?usp=sharing"><img src="https://img.shields.io/badge/Colab-Video detection-blue?logo=GoogleColab" /></a> -->
<a href="https://cameratraps.readthedocs.io/en/latest/"><img src="https://img.shields.io/badge/read-docs-yellow?logo=ReadtheDocs" /></a>
<a href="https://github.com/microsoft/CameraTraps/blob/main/LICENSE"><img src="https://img.shields.io/pypi/l/PytorchWildlife" /></a>
<a href="https://discord.gg/TeEVxzaYtm"><img src="https://img.shields.io/badge/any_text-Join_us!-blue?logo=discord&label=Discord" /></a>
<br><br>
</div>


## ✅ Update highlights (Version 1.0.2.13)
- [x] Added a file separation function. You can now automatically sort your image files into animal and non-animal folders using our `detection_folder_separation` function. Please see the [Python demo file](demo/image_separation_demo.py) and the [Jupyter demo](demo/image_separation_demo.ipynb)!
- [x] 🥳 Added Timelapse compatibility! Check the [Gradio interface](INSTALLATION.md) or [notebooks](https://github.com/microsoft/CameraTraps/blob/main/demo/image_detection_demo.ipynb).
- [x] Added Google Colab demos.
- [x] Added Snapshot Serengeti classification model into the model zoo.
- [x] Added Classification fine-tuning module.
- [x] Added a Docker Image for ease of installation.
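Conceptually, the separation step sorts each image into an `animals/` or `non_animals/` folder depending on whether the detector found an animal above a confidence threshold. Here is a minimal, framework-free sketch of that idea; the `results` format and function name are illustrative stand-ins, not the actual `detection_folder_separation` signature (see the demo files for the real API):

```python
import shutil
from pathlib import Path

def separate_by_detection(results, out_dir, threshold=0.2):
    """Copy each image into animals/ or non_animals/ based on detections.

    `results` is a list of dicts like
    {"img_path": "...", "detections": [{"category": "animal", "conf": 0.9}, ...]}
    -- a simplified stand-in for the detector's real output format.
    """
    out_dir = Path(out_dir)
    for sub in ("animals", "non_animals"):
        (out_dir / sub).mkdir(parents=True, exist_ok=True)
    for r in results:
        # An image counts as "animal" if any detection passes the threshold.
        has_animal = any(
            d["category"] == "animal" and d["conf"] >= threshold
            for d in r["detections"]
        )
        dest = out_dir / ("animals" if has_animal else "non_animals")
        shutil.copy2(r["img_path"], dest / Path(r["img_path"]).name)
```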

## 🔥 Future highlights
- [ ] MegaDetectorV6 with multiple model sizes, optimized for both high performance and low-budget edge devices such as camera systems.
- [ ] A detection model fine-tuning module to fine-tune your own detection model for Pytorch-Wildlife.
- [ ] Direct LILA connection for more training/validation data.
- [ ] More pretrained detection and classification models to expand the current model zoo.

To check the full version of the roadmap, with completed tasks and long-term goals, please click [here](roadmaps.md)!

## 🐾 Introduction

At the core of our mission is the desire to create a harmonious space where conservation scientists from all over the globe can unite to share, grow, and use datasets and deep learning architectures for wildlife conservation.
We've been inspired by the potential and capabilities of MegaDetector, and we deeply value its contributions to the community. As we forge ahead with Pytorch-Wildlife, under which MegaDetector now resides, we remain committed to supporting, maintaining, and developing MegaDetector, ensuring its continued relevance, expansion, and utility.

Pytorch-Wildlife is pip installable:
```bash
pip install PytorchWildlife
```

To use the newest version of MegaDetector with all the existing functionalities, you can use our [Hugging Face interface](https://huggingface.co/spaces/AndresHdzC/pytorch-wildlife) or simply load the model with **Pytorch-Wildlife**. The weights will be automatically downloaded:
```python
from PytorchWildlife.models import detection as pw_detection
detection_model = pw_detection.MegaDetectorV5()
```

For those interested in accessing the previous MegaDetector repository, which utilizes the same `MegaDetector v5` model weights and was primarily developed by Dan Morris during his time at Microsoft, please visit the [archive](https://github.com/microsoft/CameraTraps/blob/main/archive) directory, or you can visit this [forked repository](https://github.com/agentmorris/MegaDetector/tree/main) that Dan Morris is actively maintaining.

>[!TIP]
>If you have any questions regarding MegaDetector and Pytorch-Wildlife, please [email us](mailto:zhongqimiao@microsoft.com) or join our Discord channel: [![](https://img.shields.io/badge/any_text-Join_us!-blue?logo=discord&label=PytorchWildife)](https://discord.gg/TeEVxzaYtm)

## 👋 Welcome to Pytorch-Wildlife Version 1.0

**PyTorch-Wildlife** is a platform to create, modify, and share powerful AI conservation models. These models can be used for a variety of applications, including camera trap images, overhead images, underwater images, or bioacoustics. Your engagement with our work is greatly appreciated, and we eagerly await any feedback you may have.


The **Pytorch-Wildlife** library allows users to directly load the `MegaDetector v5` model weights for animal detection. We've fully refactored our codebase, prioritizing ease of use in model deployment and expansion. In addition to `MegaDetector v5`, **Pytorch-Wildlife** also accommodates a range of classification weights, such as those derived from the Amazon Rainforest dataset and the Opossum classification dataset. Explore the codebase and functionalities of **Pytorch-Wildlife** through our interactive [HuggingFace web app](https://huggingface.co/spaces/AndresHdzC/pytorch-wildlife) or the local [demos and notebooks](https://github.com/microsoft/CameraTraps/tree/main/demo), designed to showcase the practical applications of our enhancements; the [installation guide](https://github.com/microsoft/CameraTraps/blob/main/INSTALLATION.md) explains how to run them locally. You can find more information in our [documentation](https://cameratraps.readthedocs.io/en/latest/).

👇 Here is a brief example of how to perform detection and classification on a single image using **Pytorch-Wildlife**:
```python
import torch
from PytorchWildlife.models import detection as pw_detection
from PytorchWildlife.models import classification as pw_classification

img = torch.randn((3, 1280, 1280))

# Detection
detection_model = pw_detection.MegaDetectorV5() # Model weights are automatically downloaded.
detection_result = detection_model.single_image_detection(img)

# Classification
classification_model = pw_classification.AI4GAmazonRainforest() # Model weights are automatically downloaded.
classification_results = classification_model.single_image_classification(img)
```
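The detection-then-classification pattern above generalizes: the detector proposes bounding boxes, each box is cropped out of the image, and the classifier labels each crop. A framework-free sketch of that control flow, with stand-in `detector`/`classifier` callables (illustrative interfaces, not the actual Pytorch-Wildlife API):

```python
def detect_then_classify(img, detector, classifier, conf_threshold=0.2):
    """Run a two-stage pipeline: detect boxes, then classify each crop.

    `detector(img)`    -> list of (x1, y1, x2, y2, confidence) boxes
    `classifier(crop)` -> (label, score)
    `img` is any 2-D indexable structure (e.g. a list of pixel rows).
    These interfaces are illustrative stand-ins, not the library's API.
    """
    results = []
    for (x1, y1, x2, y2, conf) in detector(img):
        if conf < conf_threshold:
            continue  # discard low-confidence detections
        # Crop the detected region and hand it to the classifier.
        crop = [row[x1:x2] for row in img[y1:y2]]
        label, score = classifier(crop)
        results.append({"box": (x1, y1, x2, y2), "det_conf": conf,
                        "label": label, "cls_score": score})
    return results
```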

## ⚙️ Install Pytorch-Wildlife
```bash
pip install PytorchWildlife
```
Please refer to our [installation guide](https://github.com/microsoft/CameraTraps/blob/main/INSTALLATION.md) for more installation information.

## 🕵️ Explore Pytorch-Wildlife and MegaDetector with our Demo User Interface

If you want to directly try **Pytorch-Wildlife** with the AI models available, including `MegaDetector v5`, you can use our [**Gradio** interface](https://github.com/microsoft/CameraTraps/tree/main/demo). This interface allows users to directly load the `MegaDetector v5` model weights for animal detection. In addition, **Pytorch-Wildlife** also has two classification models in our initial version. One is trained from an Amazon Rainforest camera trap dataset and the other from a Galapagos opossum classification dataset (more details of these datasets will be published soon). To start, please follow the [installation instructions](https://github.com/microsoft/CameraTraps/blob/main/INSTALLATION.md) on how to run the Gradio interface! We also provide multiple [**Jupyter** notebooks](https://github.com/microsoft/CameraTraps/tree/main/demo) for demonstration.

![image](https://microsoft.github.io/CameraTraps/assets/gradio_UI.png)


## 🛠️ Core Features
What are the core components of Pytorch-Wildlife?
![Pytorch-core-diagram](https://microsoft.github.io/CameraTraps/assets/Pytorch_Wildlife_core_figure.jpg)


### 🌐 Unified Framework:
  Pytorch-Wildlife integrates **four pivotal elements:**

▪ Machine Learning Models<br>
▪ Pre-trained Weights<br>
▪ Datasets<br>
▪ Utilities<br>

### 👷 Our work:
  In the diagram above, boxes outlined in red represent elements that will be added and remain fixed, while those in blue will be part of our ongoing development.


### 🚀 Inaugural Model:
  We're kickstarting with YOLO as our first available model, complemented by pre-trained weights from `MegaDetector v5`. This is the same `MegaDetector v5` model from the previous repository.


### 📚 Expandable Repository:
  As we move forward, our platform will welcome new models and pre-trained weights for camera traps and bioacoustic analysis. We're excited to host contributions from global researchers through a dedicated submission platform.


### 📊 Datasets from LILA:
  Pytorch-Wildlife will also incorporate the vast datasets hosted on LILA, making it a treasure trove for conservation research.


### 🧰 Versatile Utilities:
  Our set of utilities spans from visualization tools to task-specific utilities, many inherited from MegaDetector.


### 💻 User Interface Flexibility:
  While we provide a foundational user interface, our platform is designed to inspire. We encourage researchers to craft and share their unique interfaces, and we'll list both existing and new UIs from other collaborators for the community's benefit.


Let's shape the future of wildlife research, together! 🙌


### 📈 Progress on core tasks

<details>
<summary> <font size="3"> ▪️ Packaging </font> </summary>

- [ ] Animal detection fine-tuning<br>
- [x] MegaDetectorV5 integration<br>
- [ ] MegaDetectorV6 integration<br>
- [x] User submitted weights<br>
- [x] Animal classification fine-tuning<br>
- [x] Amazon Rainforest classification<br>
- [x] Amazon Opossum classification<br>
- [ ] User submitted weights<br>
</details><br>

<details>
<summary><font size="3">▪️ Utility Toolkit</font></summary>

- [x] Visualization tools<br>
- [x] MegaDetector utils<br>
- [ ] User submitted utils<br>
</details><br>

<details>
<summary><font size="3">▪️ Datasets</font></summary>

- [ ] Animal Datasets<br>
- [ ] LILA datasets<br>
</details><br>

<details>
<summary><font size="3">▪️ Accessibility</font></summary>

- [x] Basic user interface for demonstration<br>
- [ ] UI Dev tools<br>
- [ ] List of available UIs<br>
</details><br>


## 🖼️ Examples

### Image detection using `MegaDetector v5`
<img src="https://microsoft.github.io/CameraTraps/assets/animal_det_1.JPG" alt="animal_det_1" width="400"/><br>
*Credits to Universidad de los Andes, Colombia.*

### Image classification with `MegaDetector v5` and `AI4GAmazonRainforest`
<img src="https://microsoft.github.io/CameraTraps/assets/animal_clas_1.png" alt="animal_clas_1" width="500"/><br>
*Credits to Universidad de los Andes, Colombia.*

### Opossum ID with `MegaDetector v5` and `AI4GOpossum`
<img src="https://microsoft.github.io/CameraTraps/assets/opossum_det.png" alt="opossum_det" width="500"/><br>
*Credits to the Agency for Regulation and Control of Biosecurity and Quarantine for Galápagos (ABG), Ecuador.*


## 🤝 Contributing
This project is open to your ideas and contributions. If you want to submit a pull request, we'll have some guidelines available soon.

We have adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [us](mailto:zhongqimiao@microsoft.com) with any additional questions or comments.

## License
This repository is licensed under the [MIT license](https://github.com/microsoft/CameraTraps/blob/main/LICENSE).


## 👥 Existing Collaborators

The extensive collaborative efforts around MegaDetector have genuinely inspired us, and we deeply value its significant contributions to the community. As we continue to advance with Pytorch-Wildlife, our commitment to delivering technical support to our existing MegaDetector partners remains the same.

Here we list a few of the organizations that have used MegaDetector. We're only listing organizations who have given us permission to refer to them here or have posted publicly about their use of MegaDetector.

<details>
<summary><font size="3">👉 Full list of organizations</font></summary>

(Newly Added) [TerrOïko](https://www.terroiko.fr/) ([OCAPI platform](https://www.terroiko.fr/ocapi))

[Arizona Department of Environmental Quality](http://azdeq.gov/)

[Blackbird Environmental](https://blackbirdenv.com/)

[Camelot](https://camelotproject.org/)

[Canadian Parks and Wilderness Society (CPAWS) Northern Alberta Chapter](https://cpawsnab.org/)

[Conservation X Labs](https://conservationxlabs.com/)

[Czech University of Life Sciences Prague](https://www.czu.cz/en)

[EcoLogic Consultants Ltd.](https://www.consult-ecologic.com/)

[Estación Biológica de Doñana](http://www.ebd.csic.es/inicio)

[Idaho Department of Fish and Game](https://idfg.idaho.gov/)

[Island Conservation](https://www.islandconservation.org/)

[Myall Lakes Dingo Project](https://carnivorecoexistence.info/myall-lakes-dingo-project/)

[Point No Point Treaty Council](https://pnptc.org/)

[Ramat Hanadiv Nature Park](https://www.ramat-hanadiv.org.il/en/)

[SPEA (Portuguese Society for the Study of Birds)](https://spea.pt/en/)

[Synthetaic](https://www.synthetaic.com/)

[Taronga Conservation Society](https://taronga.org.au/)

[The Nature Conservancy in Wyoming](https://www.nature.org/en-us/about-us/where-we-work/united-states/wyoming/)

[TrapTagger](https://wildeyeconservation.org/trap-tagger-about/)

[Upper Yellowstone Watershed Group](https://www.upperyellowstone.org/)

[Applied Conservation Macro Ecology Lab](http://www.acmelab.ca/), University of Victoria

[Banff National Park Resource Conservation](https://www.pc.gc.ca/en/pn-np/ab/banff/nature/conservation), Parks Canada

[Blumstein Lab](https://blumsteinlab.eeb.ucla.edu/), UCLA

[Borderlands Research Institute](https://bri.sulross.edu/), Sul Ross State University

[Capitol Reef National Park](https://www.nps.gov/care/index.htm) / Utah Valley University

[Center for Biodiversity and Conservation](https://www.amnh.org/research/center-for-biodiversity-conservation), American Museum of Natural History

[Centre for Ecosystem Science](https://www.unsw.edu.au/research/), UNSW Sydney

[Cross-Cultural Ecology Lab](https://crossculturalecology.net/), Macquarie University

[DC Cat Count](https://hub.dccatcount.org/), led by the Humane Rescue Alliance

[Department of Fish and Wildlife Sciences](https://www.uidaho.edu/cnr/departments/fish-and-wildlife-sciences), University of Idaho

[Department of Wildlife Ecology and Conservation](https://wec.ifas.ufl.edu/), University of Florida

[Ecology and Conservation of Amazonian Vertebrates Research Group](https://www.researchgate.net/lab/Fernanda-Michalski-Lab-4), Federal University of Amapá

[Gola Forest Programme](https://www.rspb.org.uk/our-work/conservation/projects/scientific-support-for-the-gola-forest-programme/), Royal Society for the Protection of Birds (RSPB)

[Graeme Shannon's Research Group](https://wildliferesearch.co.uk/group-1), Bangor University

[Hamaarag](https://hamaarag.org.il/), The Steinhardt Museum of Natural History, Tel Aviv University

[Institut des Sciences de la Forêt Tempérée (ISFORT)](https://isfort.uqo.ca/), Université du Québec en Outaouais

[Lab of Dr. Bilal Habib](https://bhlab.in/about), the Wildlife Institute of India

[Mammal Spatial Ecology and Conservation Lab](https://labs.wsu.edu/dthornton/), Washington State University

[McLoughlin Lab in Population Ecology](http://mcloughlinlab.ca/lab/), University of Saskatchewan

[National Wildlife Refuge System, Southwest Region](https://www.fws.gov/about/region/southwest), U.S. Fish & Wildlife Service

[Northern Great Plains Program](https://nationalzoo.si.edu/news/restoring-americas-prairie), Smithsonian

[Quantitative Ecology Lab](https://depts.washington.edu/sefsqel/), University of Washington

[Santa Monica Mountains Recreation Area](https://www.nps.gov/samo/index.htm), National Park Service

[Seattle Urban Carnivore Project](https://www.zoo.org/seattlecarnivores), Woodland Park Zoo

[Serra dos Órgãos National Park](https://www.icmbio.gov.br/parnaserradosorgaos/), ICMBio

[Snapshot USA](https://emammal.si.edu/snapshot-usa), Smithsonian

[Wildlife Coexistence Lab](https://wildlife.forestry.ubc.ca/), University of British Columbia

[Wildlife Research](https://www.dfw.state.or.us/wildlife/research/index.asp), Oregon Department of Fish and Wildlife

[Wildlife Division](https://www.michigan.gov/dnr/about/contact/wildlife), Michigan Department of Natural Resources

Department of Ecology, TU Berlin

Ghost Cat Analytics

Protected Areas Unit, Canadian Wildlife Service

[School of Natural Sciences](https://www.utas.edu.au/natural-sciences), University of Tasmania [(story)](https://www.utas.edu.au/about/news-and-stories/articles/2022/1204-innovative-camera-network-keeps-close-eye-on-tassie-wildlife)

[Kenai National Wildlife Refuge](https://www.fws.gov/refuge/kenai), U.S. Fish & Wildlife Service [(story)](https://www.peninsulaclarion.com/sports/refuge-notebook-new-technology-increases-efficiency-of-refuge-cameras/)

[Australian Wildlife Conservancy](https://www.australianwildlife.org/) [(blog](https://www.australianwildlife.org/cutting-edge-technology-delivering-efficiency-gains-in-conservation/), [blog)](https://www.australianwildlife.org/efficiency-gains-at-the-cutting-edge-of-technology/)

[Felidae Conservation Fund](https://felidaefund.org/) [(WildePod platform)](https://wildepod.org/) [(blog post)](https://abhaykashyap.com/blog/ai-powered-camera-trap-image-annotation-system/)

[Alberta Biodiversity Monitoring Institute (ABMI)](https://www.abmi.ca/home.html) [(WildTrax platform)](https://www.wildtrax.ca/) [(blog post)](https://wildcams.ca/blog/the-abmi-visits-the-zoo/)

[Shan Shui Conservation Center](http://en.shanshui.org/) [(blog post)](https://mp.weixin.qq.com/s/iOIQF3ckj0-rEG4yJgerYw?fbclid=IwAR0alwiWbe3udIcFvqqwm7y5qgr9hZpjr871FZIa-ErGUukZ7yJ3ZhgCevs) [(translated blog post)](https://mp-weixin-qq-com.translate.goog/s/iOIQF3ckj0-rEG4yJgerYw?fbclid=IwAR0alwiWbe3udIcFvqqwm7y5qgr9hZpjr871FZIa-ErGUukZ7yJ3ZhgCevs&_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en&_x_tr_pto=wapp)

[Irvine Ranch Conservancy](http://www.irconservancy.org/) [(story)](https://www.ocregister.com/2022/03/30/ai-software-is-helping-researchers-focus-on-learning-about-ocs-wild-animals/)

[Wildlife Protection Solutions](https://wildlifeprotectionsolutions.org/) [(story](https://customers.microsoft.com/en-us/story/1384184517929343083-wildlife-protection-solutions-nonprofit-ai-for-earth), [story)](https://www.enterpriseai.news/2023/02/20/ai-helps-wildlife-protection-solutions-safeguard-endangered-species/)

[Road Ecology Center](https://roadecology.ucdavis.edu/), University of California, Davis [(Wildlife Observer Network platform)](https://wildlifeobserver.net/)

[The Nature Conservancy in California](https://www.nature.org/en-us/about-us/where-we-work/united-states/california/) [(Animl platform)](https://github.com/tnc-ca-geo/animl-frontend)

[San Diego Zoo Wildlife Alliance](https://science.sandiegozoo.org/) [(Animl R package)](https://github.com/conservationtechlab/animl)

</details><br>


>[!IMPORTANT]
>If you would like to be added to this list or have any questions regarding MegaDetector and Pytorch-Wildlife, please [email us](mailto:zhongqimiao@microsoft.com) or join our Discord channel: [![](https://img.shields.io/badge/any_text-Join_us!-blue?logo=discord&label=PytorchWildife)](https://discord.gg/TeEVxzaYtm)


            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/microsoft/CameraTraps/",
    "name": "PytorchWildlife",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "pytorch_wildlife, pytorch, wildlife, megadetector, conservation, animal, detection, classification",
    "author": "Andres Hernandez, Zhongqi Miao",
    "author_email": "v-herandres@microsoft.com, zhongqimiao@microsoft.com",
    "download_url": "https://files.pythonhosted.org/packages/c5/09/ad751c3ce3b7c3cd8ba96018aea52054143dcf807e5491c94d3486e3b2b7/PytorchWildlife-1.0.2.14.tar.gz",
    "platform": null,
    "description": "![image](https://microsoft.github.io/CameraTraps/assets/Pytorch_Banner_transparentbk.png)\n\n<div align=\"center\"> \n<font size=\"6\"> A Collaborative Deep Learning Framework for Conservation </font>\n<br>\n<hr>\n<a href=\"https://pypi.org/project/PytorchWildlife\"><img src=\"https://img.shields.io/pypi/v/PytorchWildlife?color=limegreen\" /></a> \n<a href=\"https://pypi.org/project/PytorchWildlife\"><img src=\"https://static.pepy.tech/badge/pytorchwildlife\" /></a> \n<a href=\"https://pypi.org/project/PytorchWildlife\"><img src=\"https://img.shields.io/pypi/pyversions/PytorchWildlife\" /></a> \n<a href=\"https://huggingface.co/spaces/ai4g-biodiversity/pytorch-wildlife\"><img src=\"https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Demo-blue\" /></a>\n<a href=\"https://colab.research.google.com/drive/1rjqHrTMzEHkMualr4vB55dQWCsCKMNXi?usp=sharing\"><img src=\"https://img.shields.io/badge/Colab-Demo-blue?logo=GoogleColab\" /></a>\n<!-- <a href=\"https://colab.research.google.com/drive/16-OjFVQ6nopuP-gfqofYBBY00oIgbcr1?usp=sharing\"><img src=\"https://img.shields.io/badge/Colab-Video detection-blue?logo=GoogleColab\" /></a> -->\n<a href=\"https://cameratraps.readthedocs.io/en/latest/\"><img src=\"https://img.shields.io/badge/read-docs-yellow?logo=ReadtheDocs\" /></a>\n<a href=\"https://github.com/microsoft/CameraTraps/blob/main/LICENSE\"><img src=\"https://img.shields.io/pypi/l/PytorchWildlife\" /></a>\n<a href=\"https://discord.gg/TeEVxzaYtm\"><img src=\"https://img.shields.io/badge/any_text-Join_us!-blue?logo=discord&label=Discord\" /></a>\n<br><br>\n</div>\n\n\n## \u2705 Update highlights (Version 1.0.2.13)\n- [x] Added a file separation function. You can now automatically separate your files between animals and non-animals into different folders using our `detection_folder_separation` function. 
Please see the [Python demo file](demo/image_separation_demo.py) and [Jupyter demo](demo/image_separation_demo.ipynb)!\n- [x] \ud83e\udd73 Added Timelapse compatibility! Check the [Gradio interface](INSTALLATION.md) or [notebooks](https://github.com/microsoft/CameraTraps/blob/main/demo/image_detection_demo.ipynb).\n- [x] Added Google Colab demos.\n- [x] Added Snapshot Serengeti classification model into the model zoo.\n- [x] Added Classification fine-tuning module.\n- [x] Added a Docker Image for ease of installation.\n\n## \ud83d\udd25 Future highlights\n- [ ] MegaDetectorV6 with multiple model sizes for both optimized performance and low-budget devices like camera systems.\n- [ ] A detection model fine-tuning module to fine-tune your own detection model for Pytorch-Wildlife.\n- [ ] Direct LILA connection for more training/validation data.\n- [ ] More pretrained detection and classification models to expand the current model zoo.\n\nTo check the full version of the roadmap with completed tasks and long term goals, please click [here!](roadmaps.md).\n\n## \ud83d\udc3e Introduction\n\nAt the core of our mission is the desire to create a harmonious space where conservation scientists from all over the globe can unite. Where they're able to share, grow, use datasets and deep learning architectures for wildlife conservation.\nWe've been inspired by the potential and capabilities of Megadetector, and we deeply value its contributions to the community. 
As we forge ahead with Pytorch-Wildlife, under which Megadetector now resides, please know that we remain committed to supporting, maintaining, and developing Megadetector, ensuring its continued relevance, expansion, and utility.\n\nPytorch-Wildlife is pip installable:\n```\npip install PytorchWildlife\n```\n\nTo use the newest version of MegaDetector with all the existing functionalities, you can use our [Hugging Face interface](https://huggingface.co/spaces/AndresHdzC/pytorch-wildlife) or simply load the model with **Pytorch-Wildlife**. The weights will be automatically downloaded:\n```python\nfrom PytorchWildlife.models import detection as pw_detection\ndetection_model = pw_detection.MegaDetectorV5()\n```\n\nFor those interested in accessing the previous MegaDetector repository, which utilizes the same `MegaDetector v5` model weights and was primarily developed by Dan Morris during his time at Microsoft, please visit the [archive](https://github.com/microsoft/CameraTraps/blob/main/archive) directory, or you can visit this [forked repository](https://github.com/agentmorris/MegaDetector/tree/main) that Dan Morris is actively maintaining.\n\n>[!TIP]\n>If you have any questions regarding MegaDetector and Pytorch-Wildlife, please [email us](zhongqimiao@microsoft.com) or join us in our discord channel: [![](https://img.shields.io/badge/any_text-Join_us!-blue?logo=discord&label=PytorchWildife)](https://discord.gg/TeEVxzaYtm)\n\n## \ud83d\udc4b Welcome to Pytorch-Wildlife Version 1.0\n\n**PyTorch-Wildlife** is a platform to create, modify, and share powerful AI conservation models. These models can be used for a variety of applications, including camera trap images, overhead images, underwater images, or bioacoustics. Your engagement with our work is greatly appreciated, and we eagerly await any feedback you may have.\n\n\nThe **Pytorch-Wildlife** library allows users to directly load the `MegaDetector v5` model weights for animal detection. 
We've fully refactored our codebase, prioritizing ease of use in model deployment and expansion. In addition to `MegaDetector v5`, **Pytorch-Wildlife** also accommodates a range of classification weights, such as those derived from the Amazon Rainforest dataset and the Opossum classification dataset. Explore the codebase and functionalities of **Pytorch-Wildlife** through our interactive [HuggingFace web app](https://huggingface.co/spaces/AndresHdzC/pytorch-wildlife) or local [demos and notebooks](https://github.com/microsoft/CameraTraps/tree/main/demo), designed to showcase the practical applications of our enhancements at [PyTorchWildlife](https://github.com/microsoft/CameraTraps/blob/main/INSTALLATION.md). You can find more information in our [documentation](https://cameratraps.readthedocs.io/en/latest/).\n\n\ud83d\udc47 Here is a brief example on how to perform detection and classification on a single image using `PyTorch-wildlife`\n```python\nimport torch\nfrom PytorchWildlife.models import detection as pw_detection\nfrom PytorchWildlife.models import classification as pw_classification\n\nimg = torch.randn((3, 1280, 1280))\n\n# Detection\ndetection_model = pw_detection.MegaDetectorV5() # Model weights are automatically downloaded.\ndetection_result = detection_model.single_image_detection(img)\n\n#Classification\nclassification_model = pw_classification.AI4GAmazonRainforest() # Model weights are automatically downloaded.\nclassification_results = classification_model.single_image_classification(img)\n```\n\n## \u2699\ufe0f Install Pytorch-Wildlife\n```\npip install PytorchWildlife\n```\nPlease refer to our [installation guide](https://github.com/microsoft/CameraTraps/blob/main/INSTALLATION.md) for more installation information.\n\n## \ud83d\udd75\ufe0f Explore Pytorch-Wildlife and MegaDetector with our Demo User Interface\n\nIf you want to directly try **Pytorch-Wildlife** with the AI models available, including `MegaDetector v5`, you can use our [**Gradio** 
interface](https://github.com/microsoft/CameraTraps/tree/main/demo). This interface allows users to directly load the `MegaDetector v5` model weights for animal detection. In addition, **Pytorch-Wildlife** also has two classification models in our initial version. One is trained from an Amazon Rainforest camera trap dataset and the other from a Galapagos opossum classification dataset (more details of these datasets will be published soon). To start, please follow the [installation instructions](https://github.com/microsoft/CameraTraps/blob/main/INSTALLATION.md) on how to run the Gradio interface! We also provide multiple [**Jupyter** notebooks](https://github.com/microsoft/CameraTraps/tree/main/demo) for demonstration.\n\n![image](https://microsoft.github.io/CameraTraps/assets/gradio_UI.png)\n\n\n## \ud83d\udee0\ufe0f Core Features\n   What are the core components of Pytorch-Wildlife?\n![Pytorch-core-diagram](https://microsoft.github.io/CameraTraps/assets/Pytorch_Wildlife_core_figure.jpg)\n\n\n### \ud83c\udf10 Unified Framework:\n  Pytorch-Wildlife integrates **four pivotal elements:**\n\n\u25aa Machine Learning Models<br>\n\u25aa Pre-trained Weights<br>\n\u25aa Datasets<br>\n\u25aa Utilities<br>\n\n### \ud83d\udc77 Our work:\n  In the provided graph, boxes outlined in red represent elements that will be added and remained fixed, while those in blue will be part of our development.\n\n\n### \ud83d\ude80 Inaugural Model:\n  We're kickstarting with YOLO as our first available model, complemented by pre-trained weights from `MegaDetector v5`. This is the same `MegaDetector v5` model from the previous repository.\n\n\n### \ud83d\udcda Expandable Repository:\n  As we move forward, our platform will welcome new models and pre-trained weights for camera traps and bioacoustic analysis. 
We're excited to host contributions from global researchers through a dedicated submission platform.


### 📊 Datasets from LILA:
Pytorch-Wildlife will also incorporate the vast datasets hosted on LILA, making it a treasure trove for conservation research.


### 🧰 Versatile Utilities:
Our set of utilities spans from visualization tools to task-specific utilities, many inherited from MegaDetector.


### 💻 User Interface Flexibility:
While we provide a foundational user interface, our platform is designed to inspire. We encourage researchers to craft and share their unique interfaces, and we'll list both existing and new UIs from other collaborators for the community's benefit.


Let's shape the future of wildlife research, together! 🙌


### 📈 Progress on core tasks

<details>
<summary> <font size="3"> ▪️ Packaging </font> </summary>

- [ ] Animal detection fine-tuning<br>
- [x] MegaDetectorV5 integration<br>
- [ ] MegaDetectorV6 integration<br>
- [x] User submitted weights<br>
- [x] Animal classification fine-tuning<br>
- [x] Amazon Rainforest classification<br>
- [x] Galápagos Opossum classification<br>
- [ ] User submitted weights<br>
</details><br>

<details>
<summary><font size="3">▪️ Utility Toolkit</font></summary>

- [x] Visualization tools<br>
- [x] MegaDetector utils<br>
- [ ] User submitted utils<br>
</details><br>

<details>
<summary><font size="3">▪️ Datasets</font></summary>

- [ ] Animal Datasets<br>
- [ ] LILA datasets<br>
</details><br>

<details>
<summary><font size="3">▪️ Accessibility</font></summary>

- [x] Basic user interface for demonstration<br>
- [ ] UI Dev tools<br>
- [ ] List of available UIs<br>
</details><br>


## 🖼️ Examples

### Image detection using `MegaDetector v5`
<img src="https://microsoft.github.io/CameraTraps/assets/animal_det_1.JPG" alt="animal_det_1" width="400"/><br>
*Credits to Universidad de los Andes, Colombia.*

### Image classification with `MegaDetector v5` and `AI4GAmazonRainforest`
<img src="https://microsoft.github.io/CameraTraps/assets/animal_clas_1.png" alt="animal_clas_1" width="500"/><br>
*Credits to Universidad de los Andes, Colombia.*

### Opossum ID with `MegaDetector v5` and `AI4GOpossum`
<img src="https://microsoft.github.io/CameraTraps/assets/opossum_det.png" alt="opossum_det" width="500"/><br>
*Credits to the Agency for Regulation and Control of Biosecurity and Quarantine for Galápagos (ABG), Ecuador.*


## 🤝 Contributing
This project is open to your ideas and contributions. If you want to submit a pull request, we'll have some guidelines available soon.

We have adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [us](mailto:zhongqimiao@microsoft.com) with any additional questions or comments.

## License
This repository is licensed with the [MIT license](https://github.com/Microsoft/dotnet/blob/main/LICENSE).


## 👥 Existing Collaborators

The extensive collaborative efforts of MegaDetector have genuinely inspired us, and we deeply value its significant contributions to the community. As we continue to advance with Pytorch-Wildlife, our commitment to delivering technical support to our existing partners on MegaDetector remains the same.

Here we list a few of the organizations that have used MegaDetector.
We're only listing organizations that have given us permission to refer to them here or have posted publicly about their use of MegaDetector.

<details>
<summary><font size="3">👉 Full list of organizations</font></summary>

(Newly Added) [TerrOïko](https://www.terroiko.fr/) ([OCAPI platform](https://www.terroiko.fr/ocapi))

[Arizona Department of Environmental Quality](http://azdeq.gov/)

[Blackbird Environmental](https://blackbirdenv.com/)

[Camelot](https://camelotproject.org/)

[Canadian Parks and Wilderness Society (CPAWS) Northern Alberta Chapter](https://cpawsnab.org/)

[Conservation X Labs](https://conservationxlabs.com/)

[Czech University of Life Sciences Prague](https://www.czu.cz/en)

[EcoLogic Consultants Ltd.](https://www.consult-ecologic.com/)

[Estación Biológica de Doñana](http://www.ebd.csic.es/inicio)

[Idaho Department of Fish and Game](https://idfg.idaho.gov/)

[Island Conservation](https://www.islandconservation.org/)

[Myall Lakes Dingo Project](https://carnivorecoexistence.info/myall-lakes-dingo-project/)

[Point No Point Treaty Council](https://pnptc.org/)

[Ramat Hanadiv Nature Park](https://www.ramat-hanadiv.org.il/en/)

[SPEA (Portuguese Society for the Study of Birds)](https://spea.pt/en/)

[Synthetaic](https://www.synthetaic.com/)

[Taronga Conservation Society](https://taronga.org.au/)

[The Nature Conservancy in Wyoming](https://www.nature.org/en-us/about-us/where-we-work/united-states/wyoming/)

[TrapTagger](https://wildeyeconservation.org/trap-tagger-about/)

[Upper Yellowstone Watershed Group](https://www.upperyellowstone.org/)

[Applied Conservation Macro Ecology Lab](http://www.acmelab.ca/), University of Victoria

[Banff National Park Resource Conservation](https://www.pc.gc.ca/en/pn-np/ab/banff/nature/conservation), Parks Canada

[Blumstein Lab](https://blumsteinlab.eeb.ucla.edu/), UCLA

[Borderlands Research Institute](https://bri.sulross.edu/), Sul Ross State University

[Capitol Reef National Park](https://www.nps.gov/care/index.htm) / Utah Valley University

[Center for Biodiversity and Conservation](https://www.amnh.org/research/center-for-biodiversity-conservation), American Museum of Natural History

[Centre for Ecosystem Science](https://www.unsw.edu.au/research/), UNSW Sydney

[Cross-Cultural Ecology Lab](https://crossculturalecology.net/), Macquarie University

[DC Cat Count](https://hub.dccatcount.org/), led by the Humane Rescue Alliance

[Department of Fish and Wildlife Sciences](https://www.uidaho.edu/cnr/departments/fish-and-wildlife-sciences), University of Idaho

[Department of Wildlife Ecology and Conservation](https://wec.ifas.ufl.edu/), University of Florida

[Ecology and Conservation of Amazonian Vertebrates Research Group](https://www.researchgate.net/lab/Fernanda-Michalski-Lab-4), Federal University of Amapá

[Gola Forest Programme](https://www.rspb.org.uk/our-work/conservation/projects/scientific-support-for-the-gola-forest-programme/), Royal Society for the Protection of Birds (RSPB)

[Graeme Shannon's Research Group](https://wildliferesearch.co.uk/group-1), Bangor University

[Hamaarag](https://hamaarag.org.il/), The Steinhardt Museum of Natural History, Tel Aviv University

[Institut des Sciences de la Forêt Tempérée (ISFORT)](https://isfort.uqo.ca/), Université du Québec en Outaouais

[Lab of Dr. Bilal Habib](https://bhlab.in/about), the Wildlife Institute of India

[Mammal Spatial Ecology and Conservation Lab](https://labs.wsu.edu/dthornton/), Washington State University

[McLoughlin Lab in Population Ecology](http://mcloughlinlab.ca/lab/), University of Saskatchewan

[National Wildlife Refuge System, Southwest Region](https://www.fws.gov/about/region/southwest), U.S. Fish & Wildlife Service

[Northern Great Plains Program](https://nationalzoo.si.edu/news/restoring-americas-prairie), Smithsonian

[Quantitative Ecology Lab](https://depts.washington.edu/sefsqel/), University of Washington

[Santa Monica Mountains Recreation Area](https://www.nps.gov/samo/index.htm), National Park Service

[Seattle Urban Carnivore Project](https://www.zoo.org/seattlecarnivores), Woodland Park Zoo

[Serra dos Órgãos National Park](https://www.icmbio.gov.br/parnaserradosorgaos/), ICMBio

[Snapshot USA](https://emammal.si.edu/snapshot-usa), Smithsonian

[Wildlife Coexistence Lab](https://wildlife.forestry.ubc.ca/), University of British Columbia

[Wildlife Research](https://www.dfw.state.or.us/wildlife/research/index.asp), Oregon Department of Fish and Wildlife

[Wildlife Division](https://www.michigan.gov/dnr/about/contact/wildlife), Michigan Department of Natural Resources

Department of Ecology, TU Berlin

Ghost Cat Analytics

Protected Areas Unit, Canadian Wildlife Service

[School of Natural Sciences](https://www.utas.edu.au/natural-sciences), University of Tasmania [(story)](https://www.utas.edu.au/about/news-and-stories/articles/2022/1204-innovative-camera-network-keeps-close-eye-on-tassie-wildlife)

[Kenai National Wildlife Refuge](https://www.fws.gov/refuge/kenai), U.S. Fish & Wildlife Service [(story)](https://www.peninsulaclarion.com/sports/refuge-notebook-new-technology-increases-efficiency-of-refuge-cameras/)

[Australian Wildlife Conservancy](https://www.australianwildlife.org/) [(blog](https://www.australianwildlife.org/cutting-edge-technology-delivering-efficiency-gains-in-conservation/), [blog)](https://www.australianwildlife.org/efficiency-gains-at-the-cutting-edge-of-technology/)

[Felidae Conservation Fund](https://felidaefund.org/) [(WildePod platform)](https://wildepod.org/) [(blog post)](https://abhaykashyap.com/blog/ai-powered-camera-trap-image-annotation-system/)

[Alberta Biodiversity Monitoring Institute (ABMI)](https://www.abmi.ca/home.html) [(WildTrax platform)](https://www.wildtrax.ca/) [(blog post)](https://wildcams.ca/blog/the-abmi-visits-the-zoo/)

[Shan Shui Conservation Center](http://en.shanshui.org/) [(blog post)](https://mp.weixin.qq.com/s/iOIQF3ckj0-rEG4yJgerYw?fbclid=IwAR0alwiWbe3udIcFvqqwm7y5qgr9hZpjr871FZIa-ErGUukZ7yJ3ZhgCevs) [(translated blog post)](https://mp-weixin-qq-com.translate.goog/s/iOIQF3ckj0-rEG4yJgerYw?fbclid=IwAR0alwiWbe3udIcFvqqwm7y5qgr9hZpjr871FZIa-ErGUukZ7yJ3ZhgCevs&_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en&_x_tr_pto=wapp)

[Irvine Ranch Conservancy](http://www.irconservancy.org/) [(story)](https://www.ocregister.com/2022/03/30/ai-software-is-helping-researchers-focus-on-learning-about-ocs-wild-animals/)

[Wildlife Protection Solutions](https://wildlifeprotectionsolutions.org/) [(story](https://customers.microsoft.com/en-us/story/1384184517929343083-wildlife-protection-solutions-nonprofit-ai-for-earth), [story)](https://www.enterpriseai.news/2023/02/20/ai-helps-wildlife-protection-solutions-safeguard-endangered-species/)

[Road Ecology Center](https://roadecology.ucdavis.edu/), University of California, Davis [(Wildlife Observer Network platform)](https://wildlifeobserver.net/)

[The Nature Conservancy in California](https://www.nature.org/en-us/about-us/where-we-work/united-states/california/) [(Animl platform)](https://github.com/tnc-ca-geo/animl-frontend)

[San Diego Zoo Wildlife Alliance](https://science.sandiegozoo.org/) [(Animl R package)](https://github.com/conservationtechlab/animl)

</details><br>


>[!IMPORTANT]
>If you would like to be added to this list or have any questions regarding MegaDetector and Pytorch-Wildlife, please [email us](mailto:zhongqimiao@microsoft.com) or join us in our Discord channel: [![](https://img.shields.io/badge/any_text-Join_us!-blue?logo=discord&label=PytorchWildlife)](https://discord.gg/TeEVxzaYtm)
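As a companion to the detection examples above, here is a minimal, hypothetical sketch of a common post-processing step in a MegaDetector-style pipeline: filtering raw detections by category and confidence before handing crops to a classifier. MegaDetector distinguishes three categories (animal, person, vehicle), each with a confidence score; the dictionary layout and function name below are illustrative only, not the library's actual output format or API — see the demo notebooks for the real interface.

```python
# Hypothetical sketch: keep only confident detections of one category.
# The detection dicts below mimic (but are not identical to) the kind of
# per-image results a MegaDetector-style model produces: a category label,
# a confidence score, and a bounding box.

def filter_detections(detections, category="animal", threshold=0.2):
    """Return the detections of `category` scoring at or above `threshold`."""
    return [
        d for d in detections
        if d["category"] == category and d["confidence"] >= threshold
    ]

if __name__ == "__main__":
    results = [
        {"category": "animal",  "confidence": 0.94, "bbox": [10, 20, 200, 180]},
        {"category": "vehicle", "confidence": 0.88, "bbox": [0, 0, 50, 40]},
        {"category": "animal",  "confidence": 0.08, "bbox": [5, 5, 30, 30]},
    ]
    kept = filter_detections(results)
    print(len(kept))  # → 1: only the high-confidence animal detection survives
```

A threshold around 0.2 is a common starting point for MegaDetector-style detectors, but the right value depends on your tolerance for false positives versus missed animals; tune it on a labeled sample of your own camera-trap data.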
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "a PyTorch Collaborative Deep Learning Framework for Conservation.",
    "version": "1.0.2.14",
    "project_urls": {
        "Homepage": "https://github.com/microsoft/CameraTraps/"
    },
    "split_keywords": [
        "pytorch_wildlife",
        " pytorch",
        " wildlife",
        " megadetector",
        " conservation",
        " animal",
        " detection",
        " classification"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "c509ad751c3ce3b7c3cd8ba96018aea52054143dcf807e5491c94d3486e3b2b7",
                "md5": "9980b2d617d525e170bef032d2307e77",
                "sha256": "76aa2c25512a3bd213fb0cf4f9ea964d277c2f4d4861e92052312b346b15688a"
            },
            "downloads": -1,
            "filename": "PytorchWildlife-1.0.2.14.tar.gz",
            "has_sig": false,
            "md5_digest": "9980b2d617d525e170bef032d2307e77",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 32858,
            "upload_time": "2024-05-08T22:03:42",
            "upload_time_iso_8601": "2024-05-08T22:03:42.652516Z",
            "url": "https://files.pythonhosted.org/packages/c5/09/ad751c3ce3b7c3cd8ba96018aea52054143dcf807e5491c94d3486e3b2b7/PytorchWildlife-1.0.2.14.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-05-08 22:03:42",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "microsoft",
    "github_project": "CameraTraps",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "pytorchwildlife"
}
        