proxies-scraper

Name: proxies-scraper
Version: 0.1.0
Summary: A Python package for proxies scripting
Upload time: 2024-08-06 18:21:29
Home page: None
Maintainer: None
Docs URL: None
Author: None
Requires Python: >=3.12
License: None
Keywords: proxies, scripting, networking
Requirements: no requirements were recorded

# Proxies Scraper

A Python package for searching for free proxies. It allows you to retrieve and filter proxy servers based on
various criteria such as:
- Country code.
- Anonymity level.
- HTTP or HTTPS type.

## Table of Contents

- [Introduction](#introduction)
- [Features](#features)
- [Installation](#installation)
- [Usage](#usage)
  - [Basic Example](#basic-example)
  - [Advanced Example](#advanced-example)
- [Contributing](#contributing)

## Introduction

Proxies Scraper is a versatile Python package designed to help developers find and filter free proxy servers. 
It can be particularly useful for tasks such as web scraping, automated testing, and browsing with privacy.

## Features

- Filter proxies by country code.
- Filter proxies by anonymity level.
- Filter proxies by HTTP/HTTPS type.

## Installation

You can install the package using pip:

```sh
pip install proxies_scraper
```

## Usage

### Basic Example

Get a list of all available proxies:

```python
from proxies_scraper.main import get_proxies

# Retrieve every proxy found by the scraper, with no filters applied.
proxies = get_proxies()

print(proxies)
```

### Advanced Example

Get a list of proxies filtered by country code, anonymity level, and HTTPS support:

```python
from proxies_scraper.main import get_proxies

# Keep only US proxies with anonymity level 2 that support HTTPS.
proxies = get_proxies(
    country_codes_filter=['US'],
    anonymity_filter=[2],
    https_filter=True
)

for proxy in proxies:
    print(proxy)
```
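
Once you have a filtered list, a proxy entry can be plugged into an ordinary HTTP client. The snippet below is a minimal sketch using the third-party `requests` library; the `ip` and `port` field names are assumptions made for illustration, since the structure of the returned entries is not documented here.

```python
import requests  # third-party HTTP client (pip install requests)

from proxies_scraper.main import get_proxies

# Fetch HTTPS-capable proxies and try the first one against a simple echo service.
proxies = get_proxies(https_filter=True)

# NOTE: "ip" and "port" are assumed field names used only for illustration;
# inspect the returned entries to confirm the actual structure.
first = proxies[0]
proxy_url = f"http://{first['ip']}:{first['port']}"

response = requests.get(
    "https://httpbin.org/ip",
    proxies={"http": proxy_url, "https": proxy_url},
    timeout=10,
)
print(response.json())
```

Free proxies are often short-lived, so expect connection errors and be prepared to retry with the next entry in the list.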

## Contributing

Contributions are welcome! Please follow these steps to contribute:

1. Fork the repository.
2. Create a new branch for your feature or bugfix.
3. Implement your changes and ensure your code passes the tests (see the workflow sketch below).
4. Commit your changes with a descriptive commit message.
5. Push your changes to your forked repository.
6. Create a pull request to the main repository.

Please make sure your code adheres to the project's coding standards and includes appropriate tests.
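
As a rough reference, a local development loop might look like the sketch below. The repository URL placeholder and the use of `pytest` are assumptions; check the project for its actual test runner and tooling.

```sh
# Hypothetical contributor workflow; adjust the branch name and test command
# to whatever the repository actually uses.
git clone https://github.com/<your-username>/proxies_scraper.git
cd proxies_scraper
git checkout -b my-feature
pip install -e .
pytest
git push origin my-feature
```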

            

Raw data

{
    "_id": null,
    "home_page": null,
    "name": "proxies-scraper",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.12",
    "maintainer_email": null,
    "keywords": "proxies, scripting, networking",
    "author": null,
    "author_email": "Carlosman1996 <cmmolinas01@gmail.com>",
    "download_url": "https://files.pythonhosted.org/packages/e4/ac/3d96b8271656349cc9aab4ebe8854923ac2fc9fc2b59614c369625702541/proxies_scraper-0.1.0.tar.gz",
    "platform": null,
    "description": "# Proxies Scraper\n\nA Python package for searching free proxies. This package allows you to retrieve and filter proxy servers based on \nvarious criteria such as:\n- Country code.\n- Anonymity level.\n- HTTP or HTTPS type.\n\n## Table of Contents\n\n- [Introduction](#introduction)\n- [Features](#features)\n- [Installation](#installation)\n- [Usage](#usage)\n  - [Basic Example](#basic-example)\n  - [Advanced Example](#advanced-example)\n- [Function Documentation](#function-documentation)\n  - [`get_proxies`](#get_proxies)\n- [Contributing](#contributing)\n- [License](#license)\n\n## Introduction\n\nProxies Scraper is a versatile Python package designed to help developers find and filter free proxy servers. \nIt can be particularly useful for tasks such as web scraping, automated testing, and browsing with privacy.\n\n## Features\n\n- Filter proxies by country code.\n- Filter proxies by anonymity level.\n- Filter proxies by HTTP/HTTPS type.\n\n## Installation\n\nYou can install the package using pip:\n\n```sh\npip install proxies_scraper\n```\n\n## Usage\n\n### Basic Example\n\nGet a list of all proxies.\n\n```python\n\nfrom proxies_scraper.main import get_proxies\n\nproxies = get_proxies()\n\nprint(proxies)\n```\n\n### Advanced Example\n\nGet a proxies list filtered by country code and HTTPS support\n\n```python\n\nfrom proxies_scraper.main import get_proxies\n\nproxies = get_proxies(\n    country_codes_filter=['US'],\n    anonymity_filter=[2],\n    https_filter=True\n)\n\nfor proxy in proxies:\n    print(proxy)\n```\n\n## Contributing\n\nContributions are welcome! Please follow these steps to contribute:\n\n1. Fork the repository.\n2. Create a new branch for your feature or bugfix.\n3. Implement your changes and ensure your code passes the tests.\n4. Commit your changes with a descriptive commit message.\n5. Push your changes to your forked repository.\n6. Create a pull request to the main repository.\n\nPlease make sure your code adheres to the project's coding standards and includes appropriate tests.\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "A Python package for proxies scripting",
    "version": "0.1.0",
    "project_urls": {
        "Homepage": "https://github.com/Carlosman1996/proxies_scraper"
    },
    "split_keywords": [
        "proxies",
        " scripting",
        " networking"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "a666fa8b76fd6ac38efb4367e43526c290650b8049d8a7539dcff08ed8b258ea",
                "md5": "3d0eb07745bf3af3cb2401c461bc5a61",
                "sha256": "6de5c08d4016db79c21dec1c7c46e148beb113c81c78a5f86a1ddd1fd194d4ac"
            },
            "downloads": -1,
            "filename": "proxies_scraper-0.1.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "3d0eb07745bf3af3cb2401c461bc5a61",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.12",
            "size": 15092,
            "upload_time": "2024-08-06T18:21:27",
            "upload_time_iso_8601": "2024-08-06T18:21:27.866053Z",
            "url": "https://files.pythonhosted.org/packages/a6/66/fa8b76fd6ac38efb4367e43526c290650b8049d8a7539dcff08ed8b258ea/proxies_scraper-0.1.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "e4ac3d96b8271656349cc9aab4ebe8854923ac2fc9fc2b59614c369625702541",
                "md5": "3b119ab8e0960fd6d4e36264f06e6e35",
                "sha256": "bba507fa03c86a8bc19c56b3dab5f2cd738efb1595bf70f1e47147946564ccd6"
            },
            "downloads": -1,
            "filename": "proxies_scraper-0.1.0.tar.gz",
            "has_sig": false,
            "md5_digest": "3b119ab8e0960fd6d4e36264f06e6e35",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.12",
            "size": 11772,
            "upload_time": "2024-08-06T18:21:29",
            "upload_time_iso_8601": "2024-08-06T18:21:29.857311Z",
            "url": "https://files.pythonhosted.org/packages/e4/ac/3d96b8271656349cc9aab4ebe8854923ac2fc9fc2b59614c369625702541/proxies_scraper-0.1.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-08-06 18:21:29",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "Carlosman1996",
    "github_project": "proxies_scraper",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "requirements": [],
    "lcname": "proxies-scraper"
}
        