## Overview
`newd` analyzes images and identifies specific NSFW body parts with high accuracy. It can also optionally censor detected areas.
## Installation
```bash
pip install newd
```
## Usage
### Basic Detection
```python
from newd import detect
# Standard detection with default settings
results = detect('path/to/image.jpg')
print(results)
```
### Advanced Options
```python
# Faster detection with slightly reduced accuracy
results = detect('image.jpg', mode="fast")
# Adjust detection sensitivity
results = detect('image.jpg', min_prob=0.3) # Lower threshold catches more potential matches
# Combine options
results = detect('image.jpg', mode="fast", min_prob=0.3)
```
### Compatible Input Types
The `detect()` function accepts:
- String file paths
- Images loaded with OpenCV (`cv2`)
- Images loaded with PIL/Pillow
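
The library's internal dispatch over these types is not documented, but the three forms can be distinguished by duck typing (OpenCV images are `numpy` arrays with a `.shape`, PIL images expose `.convert()`). A hypothetical sketch of such a check, for illustration only:

```python
from pathlib import Path

def describe_input(img):
    """Hypothetical helper: classify the kinds of input detect() accepts."""
    if isinstance(img, (str, Path)):
        return "path"        # plain file path
    if hasattr(img, "shape"):
        return "opencv"      # cv2 loads images as numpy ndarrays
    if hasattr(img, "convert"):
        return "pillow"      # PIL.Image objects expose convert()
    raise TypeError("unsupported input type")

print(describe_input("photo.jpg"))  # path
```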
## Output Format
Detection results are returned as a list of dictionaries:
```python
[
    {
        'box': [x1, y1, x2, y2],     # Bounding box coordinates (top-left, bottom-right)
        'score': 0.825,              # Confidence score (0-1)
        'label': 'EXPOSED_BREAST_F'  # Classification label
    },
    # Additional detections...
]
```
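
Because the results are plain dictionaries, they can be post-processed with ordinary Python. For example, keeping only high-confidence matches of particular labels (the detections below are made-up values in the documented format):

```python
# Sample detections in the documented output format (made-up values)
results = [
    {'box': [10, 20, 110, 220], 'score': 0.825, 'label': 'EXPOSED_BREAST_F'},
    {'box': [300, 40, 380, 130], 'score': 0.41, 'label': 'EXPOSED_BELLY_F'},
]

def filter_detections(detections, min_score=0.5, labels=None):
    """Keep detections scoring at least min_score, optionally restricted to labels."""
    return [
        d for d in detections
        if d['score'] >= min_score and (labels is None or d['label'] in labels)
    ]

print(filter_detections(results, min_score=0.5))  # only the 0.825 detection
```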
## First-Time Use
The first time you import `newd`, it downloads a ~139 MB model file to `~/.newd/` in your home directory. This happens only once.
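
To check whether the model has already been cached, you can inspect that directory. The exact model filename is not documented, so this snippet simply lists whatever is there:

```python
from pathlib import Path

cache_dir = Path.home() / ".newd"
if cache_dir.exists():
    for f in cache_dir.iterdir():
        print(f.name, f.stat().st_size, "bytes")
else:
    print("model not downloaded yet; it will be fetched on first import")
```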
## Performance Notes
- Standard mode: Best accuracy, normal processing speed
- Fast mode: ~3x faster processing with slightly reduced accuracy
---
## Censoring / Redacting Detected Regions
`newd.censor()` masks detected NSFW regions with solid black rectangles. Use it when you need to create a safe-for-work version of an image.
```python
from newd import censor
# Censor all detected areas and write the result
censored_img = censor(
    'image.jpg',
    out_path='image_censored.jpg'  # file will be written to disk
)

# Only censor specific labels (e.g. exposed anus & male genitals)
selected_parts = ['EXPOSED_ANUS_F', 'EXPOSED_GENITALIA_M']
censored_img = censor(
    'image.jpg',
    out_path='image_censored.jpg',
    parts_to_blur=selected_parts
)
```
Function parameters:
| Parameter | Type | Description |
|-----------|------|-------------|
| `img_path` | str / Path | Source image or path. |
| `out_path` | str / Path, optional | Destination path; if omitted you can still obtain the result via the return value when `visualize=True`. |
| `visualize` | bool, default `False` | If `True`, the censored `numpy.ndarray` image is returned for display (`cv2.imshow`, etc.). |
| `parts_to_blur` | List[str], optional | Restrict censoring to given label names. When empty, all detected labels are censored. |
If neither `out_path` nor `visualize=True` is supplied, the function exits early because there is nowhere to deliver the censored image.
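
The selection rules above can be illustrated with a simplified sketch (this is not the library's actual implementation; `detections` stands in for the output of `detect()`):

```python
def select_regions(detections, parts_to_blur=None):
    """Return the boxes that would be censored, mirroring the documented rules:
    with parts_to_blur set, only matching labels; otherwise, every detection."""
    if parts_to_blur:
        return [d['box'] for d in detections if d['label'] in parts_to_blur]
    return [d['box'] for d in detections]

detections = [
    {'box': [10, 10, 50, 50], 'score': 0.9, 'label': 'EXPOSED_GENITALIA_M'},
    {'box': [60, 60, 90, 90], 'score': 0.7, 'label': 'EXPOSED_BREAST_F'},
]
print(select_regions(detections, parts_to_blur=['EXPOSED_GENITALIA_M']))  # first box only
```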