apxr

Name: apxr
Version: 1.0.0
Home page: https://github.com/zukixa/apx
Summary: scalable & good async proxyscraper
Author: zukixa
Requires Python: >=3.6
Upload time: 2024-08-20 17:28:06
# AsyncProxier: Asynchronous Free Proxy Fetcher

[![PyPI](https://img.shields.io/pypi/v/apxr)](https://pypi.org/project/apxr/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

AsyncProxier is a Python library that fetches and validates free proxies asynchronously. It scrapes proxies from multiple sources, checks their validity, and provides you with a working proxy that matches your specified criteria.

## Features

- **Asynchronous:** Uses `aiohttp` and `asyncio` for fast and efficient proxy fetching.
- **Multiple Sources:** Scrapes proxies from various websites, including sslproxies.org, us-proxy.org, free-proxy-list.net, proxyscrape.com, and proxy-list.download.
- **Filtering:** Filter proxies by country, anonymity level (elite/anonymous), Google support, and HTTPS support.
- **Validation:** Verifies that the returned proxy is working by connecting to a specified URL (defaults to google.com).
- **Easy to Use:** Simple and intuitive API for fetching and updating proxies.

## Installation

```bash
pip install apxr
```

## Usage

```python
from apxr import AsyncProxier
import asyncio
import httpx

async def main():
    # Initialize the proxier with desired settings
    proxier = AsyncProxier(country_id=['US'], https=True, anonym=True)

    # Get an initial working proxy
    proxy = await proxier.get()
    print(f"Initial working proxy: {proxy}")

    # Main request loop: retry with a fresh proxy whenever a request fails.
    for _ in range(80):
        # Perform the HTTP request through the most recent working proxy.
        async with httpx.AsyncClient(proxies=(await proxier.update())) as sesh:
            try:
                response = await sesh.get('https://www.google.com')
                if response.status_code == 200:
                    # Request succeeded; handle the response here.
                    print("Request successful!")
                else:
                    # Non-200 status: raise so a new proxy is tried on the next attempt.
                    raise Exception('Proxy Error')
            except Exception as e:
                # Request failed: log the error and force a proxy update before retrying.
                print(f"Request failed: {str(e)}")
                await proxier.update(True)  # Updating the proxy is required after an error.
                continue

if __name__ == "__main__":
    asyncio.run(main())
```

## Parameters

The `AsyncProxier` class accepts the following parameters:

- `country_id` (list, optional): A list of country codes to filter proxies by. Defaults to None (no country filter).
- `timeout` (float, optional): The timeout for proxy validation in seconds. Defaults to 0.5.
- `anonym` (bool, optional): Whether to return only anonymous or elite proxies. Defaults to False.
- `elite` (bool, optional): Whether to only return elite proxies. Defaults to False.
- `google` (bool, optional): Whether to only return proxies that support Google. Defaults to None (no Google filter).
- `https` (bool, optional): Whether to only return HTTPS proxies. Defaults to False.
- `verify_url` (str, optional): The URL to use for proxy validation. Defaults to 'www.google.com'.
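
Taken together, these parameters describe a predicate applied to each scraped proxy. The following is a minimal, self-contained sketch of how such filters might combine; the `ProxyInfo` record and its field names are hypothetical, chosen only to mirror the parameter list above, and this is not AsyncProxier's internal code.

```python
from dataclasses import dataclass

@dataclass
class ProxyInfo:
    # Hypothetical record mirroring the filterable attributes above.
    address: str
    country: str
    elite: bool
    anonymous: bool
    google: bool
    https: bool

def matches(p: ProxyInfo, country_id=None, anonym=False, elite=False,
            google=None, https=False) -> bool:
    """Return True only if the proxy satisfies every requested filter."""
    if country_id is not None and p.country not in country_id:
        return False
    if anonym and not (p.anonymous or p.elite):
        return False
    if elite and not p.elite:
        return False
    if google is not None and p.google != google:
        return False
    if https and not p.https:
        return False
    return True

candidates = [
    ProxyInfo("1.2.3.4:3128", "US", elite=True, anonymous=True, google=True, https=True),
    ProxyInfo("5.6.7.8:8080", "DE", elite=False, anonymous=False, google=False, https=False),
]
print([p.address for p in candidates
       if matches(p, country_id=["US"], https=True, anonym=True)])
# → ['1.2.3.4:3128']
```

Unset filters (`None` for `country_id` and `google`, `False` for the booleans) impose no constraint, which matches the documented defaults.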

## Inspiration and Continuation

This project is inspired by the [free-proxy](https://github.com/jundymek/free-proxy) repository, but it serves as an asynchronous continuation and expansion. The original project is unmaintained and lacks asynchronous operation and support for multiple proxy sources. AsyncProxier addresses these limitations and provides a more robust and efficient solution for fetching free proxies.

