post-archiver 1.2.1

- Summary: A tool to scrape YouTube community posts
- Requires Python: >=3.7
- License: MIT License, Copyright (c) 2024 Sadad
- Keywords: youtube, scraper, community, posts, python
- Uploaded: 2024-11-12 07:03:38
- Requirements: no requirements were recorded.

# YouTube Community Scraper

A Python tool to scrape posts from YouTube community tabs.

## Features

- Scrape posts from YouTube community tabs
- Download images from posts
- Collect post comments
- Multi-browser support (Chromium, Firefox, WebKit)
- Automatic browser installation
- Proxy support (HTTP/HTTPS with auth, SOCKS5 without auth)
- Progress saving
- Configurable output directory

## Installation

Install using pip:
```bash
pip install post-archiver
```

Or install from source:
```bash
git clone https://github.com/sadadYes/post-archiver.git
cd post-archiver
pip install -e .
```

## Requirements

- Python 3.7 or higher
- No manual browser installation needed; browsers are installed automatically on first use

## Usage

```
usage: post-archiver [OPTIONS] url [amount]

YouTube Community Posts Scraper

positional arguments:
  url                   YouTube channel community URL
  amount                Amount of posts to get (default: max)

options:
  -h, --help            show this help message and exit
  -c, --get-comments    Get comments from posts (WARNING: This is slow) (default: False)
  -i, --get-images      Get images from posts (default: False)
  -d, --download-images
                        Download images (requires --get-images)
  -q IMAGE_QUALITY, --image-quality IMAGE_QUALITY
                        Image quality: sd, hd, or all (default: all)
  --proxy PROXY         Proxy file or single proxy string
  -o OUTPUT, --output OUTPUT
                        Output directory (default: current directory)
  -v, --verbose         Show basic progress information
  -t, --trace           Show detailed debug information
  --browser {chromium,firefox,webkit}
                        Browser to use (default: chromium)
  --version             show program's version number and exit
  --member-only         Only get membership-only posts (requires --cookies)
  --browser-cookies {chrome,firefox,edge,opera}
                        Get cookies from browser (requires browser-cookie3)

Proxy format:
  Single proxy: <scheme>://<username>:<password>@<host>:<port>
  Proxy file: One proxy per line using the same format
  Supported schemes: http, https
  Note: SOCKS5 proxies are supported but without authentication

Amount:
  Specify number of posts to scrape (default: max)
  Use 'max' or any number <= 0 to scrape all posts

Examples:
  post-archiver https://www.youtube.com/@channel/community
  post-archiver https://www.youtube.com/@channel/community 50
  post-archiver -c -i -d -q hd https://www.youtube.com/@channel/community max
  post-archiver --browser firefox https://www.youtube.com/@channel/community
  post-archiver --proxy proxies.txt https://www.youtube.com/@channel/community 100
  post-archiver --proxy http://username:password@host:port https://www.youtube.com/@channel/community
  post-archiver --proxy https://username:password@host:port https://www.youtube.com/@channel/community
  post-archiver --proxy socks5://host:port https://www.youtube.com/@channel/community
```
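The amount rule above ("max" or any number <= 0 means all posts) can be sketched in Python. This is an illustrative reimplementation of the documented semantics, not post-archiver's actual code; the function name `parse_amount` is hypothetical:

```python
from typing import Optional


def parse_amount(value: str) -> Optional[int]:
    """Parse the [amount] argument per the documented rule.

    Returns None to mean "scrape all posts" (for 'max' or any number <= 0),
    otherwise the positive post count.
    """
    if value.lower() == "max":
        return None
    n = int(value)
    return n if n > 0 else None
```

So `parse_amount("max")` and `parse_amount("-5")` both mean "scrape everything", while `parse_amount("50")` limits the run to 50 posts.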

## Browser Support

The scraper supports three browser engines:
- Chromium (default)
- Firefox
- WebKit

The appropriate browser will be automatically installed when first used. You can specify which browser to use with the `--browser` option.

## Proxy Support

The scraper supports the following proxy types:
- HTTP proxies with authentication
- HTTPS proxies with authentication
- SOCKS5 proxies (without authentication)

**Note:** SOCKS5 proxies with authentication are not supported due to limitations in the underlying browser automation.
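Per the proxy format described above, a proxy file is plain text with one proxy per line. A minimal example, with placeholder hosts and credentials:

```text
http://user:pass@proxy1.example.com:8080
https://user:pass@proxy2.example.com:8443
socks5://proxy3.example.com:1080
```

Pass the file with `--proxy proxies.txt`; note that SOCKS5 entries must omit credentials.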

## Logging

Two levels of logging are available:
- `--verbose (-v)`: Shows basic progress information
- `--trace (-t)`: Shows detailed debug information including browser console messages

## License

MIT License


            
