deepdanbooru


Name: deepdanbooru
Version: 1.0.2
Home page: https://github.com/KichangKim/DeepDanbooru
Summary: DeepDanbooru is an AI-based multi-label girl image classification system, implemented using TensorFlow.
Upload time: 2023-05-22 17:43:03
Author: Kichang Kim
Requires Python: >=3.6
License: MIT
Keywords: icq bot framework async python
Requirements: No requirements were recorded.
# DeepDanbooru
[![Python](https://img.shields.io/badge/python-3.6-green)](https://www.python.org/doc/versions/)
[![GitHub](https://img.shields.io/github/license/KichangKim/DeepDanbooru)](https://opensource.org/licenses/MIT)
[![Web](https://img.shields.io/badge/web%20demo-20200915-brightgreen)](http://kanotype.iptime.org:8003/deepdanbooru/)

**DeepDanbooru** is an anime-style girl image tag estimation system. You can estimate tags for your images on my live demo site, [DeepDanbooru Web](http://dev.kanotype.net:8003/deepdanbooru/).

## Requirements
DeepDanbooru is written for Python 3.7. The following packages need to be installed:
- tensorflow>=2.7.0
- tensorflow-io>=2.22.0
- Click>=7.0
- numpy>=1.16.2
- requests>=2.22.0
- scikit-image>=0.15.0
- six>=1.13.0

Or just use `requirements.txt`.
```
> pip install -r requirements.txt
```

Alternatively, you can install it with pip. Note that by default, `tensorflow` is not included.

To install it together with TensorFlow, add the `tensorflow` extra.

```
> # default installation
> pip install .
> # with tensorflow package
> pip install .[tensorflow]
```


## Usage
1. Prepare a dataset. If you don't have one, you can use [DanbooruDownloader](https://github.com/KichangKim/DanbooruDownloader) to download the dataset from [Danbooru](https://danbooru.donmai.us/). If you want to make your own dataset, see the [Dataset Structure](#dataset-structure) section.
2. Create a training project folder.
```
> deepdanbooru create-project [your_project_folder]
```
3. Prepare the tag list. If you want to use the latest tags, use the following command. It downloads tags from the Danbooru server. (A Danbooru account and API key are required.)
```
> deepdanbooru download-tags [your_project_folder] --username [your_danbooru_account] --api-key [your_danbooru_api_key]
```
4. (Optional) Filter the dataset. If you want to train with optional tags (rating and score), you should convert them to system tags.
```
> deepdanbooru make-training-database [your_dataset_sqlite_path] [your_filtered_sqlite_path]
```
5. Modify `project.json` in the project folder. Change the `database_path` setting to the actual path of your SQLite file (a small scripted sketch follows this list).
6. Start training.
```
> deepdanbooru train-project [your_project_folder]
```
7. Enjoy it. Evaluate images with your trained model:
```
> deepdanbooru evaluate [image_file_path or folder]... --project-path [your_project_folder] --allow-folder
```
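
Step 5 can also be scripted. Here is a minimal sketch, assuming your project folder already contains the `project.json` generated by `create-project`, and that `MyDataset/my-dataset.sqlite` is a placeholder path to your dataset:

```python
import json
from pathlib import Path

# Path to the project created by `deepdanbooru create-project`; adjust to your folder.
project_file = Path("your_project_folder") / "project.json"

# Load the generated settings, point database_path at the actual SQLite file,
# and write the file back. All other settings are left untouched.
settings = json.loads(project_file.read_text())
settings["database_path"] = str(Path("MyDataset/my-dataset.sqlite").resolve())
project_file.write_text(json.dumps(settings, indent=4))
```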

## Running on Docker

In the container, the dataset is located in the folder `/app/model`. You can always mount a volume to use a dataset on your local disk.

You'll also need to mount a volume for the folder containing your images, e.g.:

```sh
docker run --rm -it -v /home/kamuri/images/:/app/data kamuri/deepdanbooru evaluate --project-path "/app/model" "/app/data/" --allow-folder
```

If you do not want to use the dataset included with the image, you can use the image without it: `kamuri/deepdanbooru:nomodel`.
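
For example, a sketch of running the `nomodel` image against your own project folder and images (the host paths below are placeholders, not defaults):

```sh
docker run --rm -it \
  -v /path/to/your_project_folder:/app/model \
  -v /path/to/your/images:/app/data \
  kamuri/deepdanbooru:nomodel evaluate --project-path "/app/model" "/app/data/" --allow-folder
```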

## Dataset Structure
DeepDanbooru uses the following folder structure for the input dataset. The SQLite file can have any name, but it must be located in the same folder as the `images` folder. Each image file is located in a sub-folder named after the first two characters of its filename.
```
MyDataset/
├── images/
│   ├── 00/
│   │   ├── 00000000000000000000000000000000.jpg
│   │   ├── ...
│   ├── 01/
│   │   ├── 01000000000000000000000000000000.jpg
│   │   ├── ...
│   └── ff/
│       ├── ff000000000000000000000000000000.jpg
│       ├── ...
└── my-dataset.sqlite
```
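
For example, an image named `d41d8cd98f00b204e9800998ecf8427e.jpg` belongs in `images/d4/`. A minimal helper for computing that location (this function is illustrative, not part of DeepDanbooru):

```python
from pathlib import Path

def image_path(dataset_root: str, md5: str, file_ext: str) -> Path:
    """Return where an image should live inside the dataset folder."""
    # The sub-folder is the first two characters of the filename (the md5 column).
    return Path(dataset_root) / "images" / md5[:2] / f"{md5}.{file_ext}"

print(image_path("MyDataset", "d41d8cd98f00b204e9800998ecf8427e", "jpg"))
# MyDataset/images/d4/d41d8cd98f00b204e9800998ecf8427e.jpg
```
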
The core is the SQLite database file. That file must contain the following table structure.
```
posts
├── id (INTEGER)
├── md5 (TEXT)
├── file_ext (TEXT)
├── tag_string (TEXT)
└── tag_count_general (INTEGER)
```
The filename of each image must be `[md5].[file_ext]`. If you use your own images, `md5` does not have to be an actual MD5 hash value.

`tag_string` is a space-separated tag list, like `1girl ahoge long_hair`.

`tag_count_general` is used by the project setting `minimum_tag_count`. Images whose `tag_count_general` is equal to or larger than that value are used for training.
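
If you build the database yourself, here is a minimal sketch using Python's built-in `sqlite3` module. The column types mirror the table above; the sample row and file name are illustrative, and the `MyDataset` folder is assumed to already exist:

```python
import sqlite3

conn = sqlite3.connect("MyDataset/my-dataset.sqlite")
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS posts (
        id INTEGER,
        md5 TEXT,
        file_ext TEXT,
        tag_string TEXT,
        tag_count_general INTEGER
    )
    """
)
# This row corresponds to the image file images/d4/d41d8cd98f00b204e9800998ecf8427e.jpg.
conn.execute(
    "INSERT INTO posts (id, md5, file_ext, tag_string, tag_count_general) VALUES (?, ?, ?, ?, ?)",
    (1, "d41d8cd98f00b204e9800998ecf8427e", "jpg", "1girl ahoge long_hair", 3),
)
conn.commit()
conn.close()
```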

## Project Structure
A **project** is the minimal unit for training in DeepDanbooru. You can modify various training parameters in it.
```
MyProject/
├── project.json
└── tags.txt
```
`tags.txt` contains all tags used for estimation. You can make your own list or download the latest tags from the Danbooru server. It is a simple newline-separated file like this:
```
1girl
ahoge
...
```

            
