Name: resoup
Version: 0.5.2
Summary: Various convenient features related to requests.
Upload time: 2023-12-26 14:23:06
Author: ilotoki0804
Requires Python: >=3.10,<4.0
License: MIT
Keywords: requests, bs4, BeautifulSoup, async, caching, cache
This description may not be for the latest version. To check the latest version, see [this GitHub link](https://github.com/ilotoki0804/resoup).
# resoup

**Various convenient features related to requests and BeautifulSoup.** (<span style="color:blue">**_re_**</span>quests + Beautiful<span style="color:blue">**_Soup_**</span>)

1. Combines the `requests` library and BeautifulSoup, merging several lines of code into one,
1. lets you use async and caching with a simple call,
1. comes with convenient defaults for web scraping,
1. and offers assorted small features such as `no_empty_result`, `attempts`, and `avoid_sslerror`.

A small but useful library that saves you three or four lines of code.

## Getting Started

1. Install Python.
1. Run the following command in a terminal.

   ```console
   pip install -U resoup
   ```

requests and bs4 are installed together, but lxml and html5lib, the optional parsers for BeautifulSoup, are not installed by default.

Therefore you should install lxml, html5lib, etc. yourself to avoid errors.
If you use one of those parsers without installing it, a `NoParserError` is raised.

## Usage

Note: most of the examples focus on `get` requests, but everything works the same for all other methods (options/head/post/put/patch/delete).

### The `resoup.requests` module

The `resoup.requests` module can be imported and used as follows.

```python
from resoup import requests  # compatible with `import requests`.
```

This library is 99% compatible with the requests library (even the type hints work just as well as with requests!), with convenient features layered on top. In other words, replacing your existing `import requests` with the code above integrates cleanly without breaking existing code.

requests' Session can be used in a similar way.

```python
from resoup import requests

with requests.Session() as session:
    ...  # every feature, such as cget and attempts, is available
```

#### Defaults

The defaults are each set to sensible values.

They are as follows and apply to requests.get/options/head/post/put/patch/delete.

```python
timeout default: 120
headers default: {
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7",
    "Accept-Encoding": "gzip, deflate, br",
    "Accept-Language": "ko-KR,ko;q=0.9",
    "Sec-Ch-Ua": '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
    "Sec-Ch-Ua-Mobile": "?0",
    "Sec-Ch-Ua-Platform": '"Windows"',
    "Sec-Fetch-Dest": "document",
    "Sec-Fetch-Mode": "navigate",
    "Sec-Fetch-Site": "none",
    "Sec-Fetch-User": "?1",
    "Upgrade-Insecure-Requests": "1",
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36",
}
attempts default: 1
avoid_sslerror default: False
```

```python
>>> from resoup import requests
>>>
>>> res = requests.get("https://httpbin.org/headers")
>>> res.json()['headers']
{'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7',
 'Accept-Encoding': 'gzip, deflate, br',
 'Accept-Language': 'ko-KR,ko;q=0.9',
 'Host': 'httpbin.org',
 'Sec-Ch-Ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
 'Sec-Ch-Ua-Mobile': '?0',
 'Sec-Ch-Ua-Platform': '"Windows"',
 'Sec-Fetch-Dest': 'document',
 'Sec-Fetch-Mode': 'navigate',
 'Sec-Fetch-Site': 'none',
 'Sec-Fetch-User': '?1',
 'Upgrade-Insecure-Requests': '1',
 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36',
 'X-Amzn-Trace-Id': ...}
```

#### Responses

The get/options/head/post/put/patch/delete functions of the `resoup.requests` module all return a ResponseProxy.

ResponseProxy is a subclass of Response that is 100% compatible with the original Response. See the `ResponseProxy` section for details.

If you don't fully understand this feature, just use the result the way you have always used Response and it will work without problems.

#### attempts

`attempts` is a parameter that sets how many times the same request is retried when a `ConnectionError` occurs for whatever reason.

If it still fails after all 10 tries, the cause of the most recent failed connection is shown.

```python
>>> from resoup import requests
>>>
>>> requests.get('https://some-not-working-website.com', attempts=10)
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
Traceback (most recent call last):
...
socket.gaierror: [Errno 11001] getaddrinfo failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
...
urllib3.exceptions.NameResolutionError: <urllib3.connection.HTTPSConnection object at ...>: Failed to resolve 'some-not-working-website.com' ([Errno 11001] getaddrinfo failed)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
...
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='some-not-working-website.com', port=443): Max retries exceeded with url: / (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at ...>: Failed to resolve 'some-not-working-website.com' ([Errno 11001] getaddrinfo failed)"))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
...
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='some-not-working-website.com', port=443): Max retries exceeded with url: / (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at ...>: Failed to resolve 'some-not-working-website.com' ([Errno 11001] getaddrinfo failed)"))

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
...
ConnectionError: Trying 10 times but failed to get data.
URL: https://some-not-working-website.com
```
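The retry mechanism can be pictured as a simple loop. The sketch below is a hypothetical illustration; the `fetch` callable and the `get_with_attempts` name are made up for this example and are not resoup's actual code:

```python
import logging

def get_with_attempts(fetch, attempts=1):
    # Retry `fetch` up to `attempts` times on ConnectionError,
    # re-raising with the most recent failure as the cause.
    last_error = None
    for _ in range(attempts):
        try:
            return fetch()
        except ConnectionError as error:
            last_error = error
            logging.warning("Retrying...")
    raise ConnectionError(
        f"Trying {attempts} times but failed to get data."
    ) from last_error
```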

#### avoid_sslerror

`avoid_sslerror` can be used on sites that fail with an `UNSAFE_LEGACY_RENEGOTIATION_DISABLED` error.

For example, without `avoid_sslerror`, the following site raises this error.

```python
>>> from resoup import requests
>>> requests.get('https://bufftoon.plaync.com')
---------------------------------------------------------------------------
SSLError                                  Traceback (most recent call last)
...
SSLError: HTTPSConnectionPool(host='bufftoon.plaync.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError(1, '[SSL: UNSAFE_LEGACY_RENEGOTIATION_DISABLED] unsafe legacy renegotiation disabled (_ssl.c:1000)')))
```

Setting `avoid_sslerror` to `True` avoids the error.

```python
>>> requests.get('https://bufftoon.plaync.com', avoid_sslerror=True)
<Response [200]>
```
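For reference, avoiding this error generally amounts to re-enabling OpenSSL's legacy server renegotiation on the SSL context. Below is a minimal stdlib-only sketch; it is hypothetical and not necessarily how resoup implements `avoid_sslerror`:

```python
import ssl

def legacy_renegotiation_context():
    # Build a default client context, then opt back in to legacy
    # (insecure) server renegotiation, which OpenSSL 3 disables.
    context = ssl.create_default_context()
    context.options |= 0x4  # ssl.OP_LEGACY_SERVER_CONNECT (named constant in Python 3.12+)
    return context
```

A context like this can then be plugged into requests via a custom `HTTPAdapter` that passes it as the `ssl_context` of the connection pool.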

#### Plain request functions

Plain requests.get/options/head/post/put/patch/delete can be used exactly the way you used them in `requests`.

Below is an example with requests.get and post. It works exactly like the `requests` module.

```python
>>> from resoup import requests
>>>
>>> requests.get('https://jsonplaceholder.typicode.com/todos/1').json()  # API that can send request in order to test. Don't execute this command unless you trust this API.
{'userId': 1, 'id': 1, 'title': 'delectus aut autem', 'completed': False}
>>> requests.post('https://jsonplaceholder.typicode.com/todos', json={
...     'title': 'foo',
...     'body': 'bar',
...     'userId': 1,
... }).json()
{'title': 'foo', 'body': 'bar', 'userId': 1, 'id': 201}  # Same with original requests library
```

#### Cached request functions

These are identical to the plain requests.get/../delete requests, but cached. The cache is shared with the `asynchronous cached request functions` described below, but it is not shared between the individual methods. Prefix the name with `c`: requests.cget/coptions/chead/cpost/cput/cpatch/cdelete.

It is best not to use these with a dynamic service that may return different responses for the same URL (changes over time are not reflected) or when the response is large (memory may be wasted).

```python
>>> # results may vary with machine specs and internet connection quality
>>> import timeit
>>>
>>> timeit.timeit('requests.get("https://python.org")', number=10, setup='from resoup import requests')
1.1833231999917189  # varies with machine and connection quality: all 10 calls send a request
>>> timeit.timeit('requests.cget("https://python.org")', number=10, setup='from resoup import requests')
0.10267569999268744  # only the first call sends a request; the rest are served from the cache
```
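Conceptually the cache is a mapping keyed by method and URL, which explains why entries are shared between the sync and async variants of one method but never across methods. A hypothetical sketch (not resoup's actual code; `fetch` stands in for the real network call):

```python
_response_cache = {}

def cached_request(method, url, fetch):
    # One cache entry per (method, URL) pair: cget and acget share
    # entries, but cget and cpost never do.
    key = (method, url)
    if key not in _response_cache:
        _response_cache[key] = fetch()  # only the first call does real work
    return _response_cache[key]
```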

#### Asynchronous request functions

These send requests asynchronously. Prefix the name with `a`: requests.aget/aoptions/ahead/apost/aput/apatch/adelete.

`run_in_executor` is enabled by default. See `Using run_in_executor` below for details.

```python
>>> import asyncio
>>> 
>>> from resoup import requests
>>>
>>> res = asyncio.run(requests.aget('https://python.org'))
>>> res
<Response [200]>
```

#### Asynchronous cached request functions

Asynchronous and cached requests. For the same method, the cache is shared with the `cached request functions`. Prefix the name with `ac`: requests.acget/acoptions/achead/acpost/acput/acpatch/acdelete.

It is best not to use these with a dynamic service that may return different responses for the same URL (changes over time are not reflected) or when the response is large (memory may be wasted).

`run_in_executor` is enabled by default. See `Using run_in_executor` below for details.

```python
>>> import asyncio
>>> import timeit
>>>
>>> timeit.timeit('asyncio.run(requests.aget("https://python.org"))', number=10, setup='from resoup import requests; import asyncio')
0.8676127000362612  # varies with machine and connection quality: all 10 calls send a request
>>> timeit.timeit('asyncio.run(requests.acget("https://python.org"))', number=10, setup='from resoup import requests; import asyncio')
0.11984489997848868  # only the first call sends a request; the rest come from the cache
```

#### Using `run_in_executor`

Asynchronous requests (the `a`-prefixed methods such as aget and acget) accept a `run_in_executor` parameter. This parameter makes the function run in another thread. It makes little difference when the program runs sequentially, but a large speedup can be expected when requests run in parallel.

Using `asyncio.gather` as below can show a large performance gain.

```python
import asyncio
import time

from resoup import requests

async def measure_coroutine_time(coroutine):
    start = time.perf_counter()
    await coroutine
    end = time.perf_counter()

    print(end - start)

async def main():
    # Sending a single request (no big difference)

    req = requests.aget('https://python.org', run_in_executor=False)
    await measure_coroutine_time(req)  # 0.07465070000034757

    req = requests.aget('https://python.org')
    await measure_coroutine_time(req)  # 0.05844969999452587

    # Sending many requests (large speedup)

    reqs = (requests.aget(f'https://python.org/{i}', run_in_executor=False) for i in range(10))  # dummy URLs
    await measure_coroutine_time(asyncio.gather(*reqs))  # without run_in_executor: slow (3.7874760999984574)

    reqs = (requests.aget(f'https://python.org/{i}') for i in range(10))  # dummy URLs
    await measure_coroutine_time(asyncio.gather(*reqs))  # with run_in_executor (default): fast (0.11582900000212248)

if __name__ == '__main__':
    asyncio.run(main())
```
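The mechanism behind `run_in_executor` can be sketched with asyncio's standard executor support. This is a hypothetical illustration (`fetch_in_thread` and `slow_fetch` are made-up names), not resoup's actual implementation:

```python
import asyncio
import functools
import time

async def fetch_in_thread(blocking_func, *args, **kwargs):
    # Run a blocking function in a worker thread so that several
    # calls can overlap, as run_in_executor=True does for requests.
    loop = asyncio.get_running_loop()
    # None selects the default ThreadPoolExecutor.
    return await loop.run_in_executor(None, functools.partial(blocking_func, *args, **kwargs))

def slow_fetch(i):
    time.sleep(0.05)  # stand-in for a blocking network call
    return i * 2

async def demo():
    # The three calls sleep concurrently in separate threads.
    return await asyncio.gather(*(fetch_in_thread(slow_fetch, i) for i in range(3)))
```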

#### Incompatibilities with the requests module

This module is compatible with the `requests` library in almost every respect, but there are a few incompatibilities.

##### dunder methods (`__dunder__`)

Some dunder methods are not loaded or are not compatible, either because they could be a source of subtle bugs or for technical reasons.

Dunder methods that are unavailable or do not match the requests library: `__builtins__`, `__cached__`, `__doc__`, `__file__`, `__loader__`, `__name__`, `__package__`, `__spec__`

Dunder methods that are available and match the requests library: `__author__`, `__author_email__`, `__build__`, `__cake__`, `__copyright__`, `__description__`, `__license__`, `__title__`, `__url__`, `__version__`

```python
>>> import requests
>>> requests.__name__
'requests'
>>> requests.__path__
['some path']
>>> requests.__cake__
'✨ 🍰 ✨'
>>>
>>> from resoup import requests
>>> requests.__name__  # incompatible dunder method
'resoup.requests_proxy'  # value differs from requests
>>> requests.__path__  # unavailable and incompatible dunder method
AttributeError: module 'resoup.requests_' has no attribute '__path__'
>>> requests.__cake__  # compatible dunder method
'✨ 🍰 ✨'
```

##### import

`resoup.requests` keeps import compatibility in almost every case, but there are a few rules regarding imports.

`resoup.requests` can only be used in the form `from resoup import requests`.

```python
# In each pair below, the first line imports requests and the second imports `resoup.requests`.

# importing the requests module
import requests
from resoup import requests  # possible
```

Therefore imports like the following are not possible with `resoup.requests`.

```python
# importing a submodule of requests
import requests.models  # possible
import resoup.requests.models  # not possible!

# importing a submodule of requests (w/ from ... import ...)
from requests import models  # possible
from resoup.requests import models  # not possible!

# importing a member of a requests submodule
from requests.models import Response  # possible
from resoup.requests.models import Response  # not possible!
```

In such cases, importing the module instead solves the problem.

For example, suppose you have the following code.

```python
from requests.models import Response  # imports a member of a submodule

def is_response(instance):
    return isinstance(instance, Response)
```

This code can be fixed in the following ways.

```python
# Switch to requests.models.Response.
# Advantage: clean and not error-prone.
from resoup import requests  # import the requests module
def is_response(instance):
    return isinstance(instance, requests.models.Response)  # changed to requests.models.Response
```

```python
# Define Response yourself.
# Advantage: no need to modify the existing code.
from resoup import requests
Response = requests.models.Response

def is_response(instance):
    return isinstance(instance, Response)
```

Use whichever style you prefer.

### ResponseProxy

`ResponseProxy` is the return value of requests.get/options/head/post/put/patch/delete in this library. It is 100% compatible with the original Response while providing six additional functions.

#### Compatibility

In this part, the explanations are given in the comments.

```python
>>> # Both modules are used at the same time, so they are renamed.
>>> import requests as original_requests
>>> from resoup import requests as utils_requests
>>>
>>> # The requests module returns a Response.
>>> response1 = original_requests.get("https://peps.python.org/pep-0020/")  # static website
>>> print(response1)
<Response [200]>
>>> print(type(response1))  # Response object
<class 'requests.models.Response'>
>>> # The resoup.requests module returns a ResponseProxy.
>>> response2 = utils_requests.get("https://peps.python.org/pep-0020/")
>>> print(response2)
<Response [200]>
>>> print(type(response2))  # ResponseProxy object
<class 'resoup.response_proxy.ResponseProxy'>
>>>
>>> # All of the following checks pass.
>>> assert response1.text == response2.text
>>> assert response1.status_code == response2.status_code
>>> assert response1.url == response2.url
>>> assert response1.content == response2.content
>>>
>>> # But ResponseProxy has these additional features.
>>> print(response2.soup())
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8"/>
...
<script src="../_static/wrap_tables.js"></script>
<script src="../_static/sticky_banner.js"></script>
</body>
</html>
>>> print(response2.soup_select('title'))
[<title>PEP 20 – The Zen of Python | peps.python.org</title>, <title>Following system colour scheme</title>, <title>Selected dark colour scheme</title>, <title>Selected light colour scheme</title>]
>>> print(response2.soup_select_one('p', no_empty_result=True).text)
Long time Pythoneer Tim Peters succinctly channels the BDFL’s guiding
principles for Python’s design into 20 aphorisms, only 19 of which
have been written down.
>>>
>>> from requests.models import Response
>>> # ResponseProxy is a subclass of Response.
>>> # Therefore it passes isinstance checks.
>>> isinstance(response2, Response)
True
>>> # Of course, since it is a subclass, a '==' type check does not pass.
>>> type(response1) == type(response2)
False
```

#### Basic structure

`ResponseProxy` provides several functions, grouped into two families.

* the soup family: `.soup()`, `.soup_select()`, `.soup_select_one()`
  The basic functions.
* the xml family: `.xml()`, `.xml_select()`, `.xml_select_one()`
  The soup family with the parser set to 'xml'.

Each family has three functions, which do the following.

* `.soup()`/`.xml()`: returns the document parsed with BeautifulSoup.
* `.soup_select()`/`.xml_select()`: similar to `.soup().select()`.
* `.soup_select_one()`/`.xml_select_one()`: similar to `.soup().select_one()`.

See below for details.

#### `.soup()`

`.soup()` takes text or a response and returns a `BeautifulSoup` object.

Both a response and response.text work as the argument, but using the response is recommended,
because you then get more detailed error messages.

```python
>>> from resoup import requests
>>>
>>> response = requests.get("https://python.org")
>>> response.soup()  # every parameter available in BeautifulSoup can be used
<!DOCTYPE html>
...
</body>
</html>
```

This function is effectively the same as passing the text through `BeautifulSoup`. The code below is almost the same as the code above.

```python
>>> import requests
>>> from bs4 import BeautifulSoup
>>>
>>> response = requests.get("https://python.org")
>>> BeautifulSoup(response.text)
<!DOCTYPE html>
...
</body>
</html>
```

When no parser is available, `BeautifulSoup` raises a `FeatureNotFound` error, while `.soup()` raises a `NoParserError`.

#### `.soup_select()`

`.soup_select()` takes text or a response and returns BeautifulSoup Tags. The `selector` parameter takes a CSS selector.

```python
>>> from resoup import requests
>>>
>>> response = requests.get("https://python.org")
>>> response.soup_select("p")
[<p><strong>Notice:</strong> While JavaScript is not essential for this website
...]
```

The code below behaves similarly to the code above.

```python
>>> import requests
>>> from bs4 import BeautifulSoup
>>>
>>> response = requests.get('https://python.org')
>>> soup = BeautifulSoup(response.text).select('p')
>>> soup
[<p><strong>Notice:</strong> While JavaScript is not essential for this website
...]
```

A distinctive point of this function is its `no_empty_result` parameter. When it is True, an `EmptyResultError` is raised if the result of .select() is an empty list.

```python
>>> from resoup import requests
>>>
>>> response = requests.get("https://python.org")
>>> response.soup_select("data-some-complex-and-error-prone-selector")
[]
>>>
>>> response = requests.get("https://python.org")
>>> response.soup_select(
...     "data-some-complex-and-error-prone-selector",
...     no_empty_result=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "...souptools.py", line 148, in soup_select
    raise EmptyResultError(
resoup.exceptions.EmptyResultError: Result of select is empty list("[]"). This error happens probably because of invalid selector or URL. Check if both selector and URL are valid. Set to False `no_empty_result` if empty list is intended. It may also because of selector is not matched with URL.
selector: data-some-complex-and-error-prone-selector, URL: https://www.python.org/
```

This function uses BroadcastList as its default return type. To learn more about BroadcastList, see the `BroadcastList` section below.

#### `.soup_select_one()`

`.soup_select_one()` takes text or a response and returns a BeautifulSoup Tag. The `selector` parameter takes a CSS selector.

```python
>>> from resoup import requests
>>>
>>> response = requests.get('https://python.org')
>>> response.soup_select_one('p strong', no_empty_result=True)
<strong>Notice:</strong>
```

The code below behaves similarly to the code above.

```python
>>> import requests
>>> from bs4 import BeautifulSoup
>>>
>>> response = requests.get('https://python.org')
>>> soup = BeautifulSoup(response.text, 'html.parser').select_one('p strong')
>>> if soup is None:  # check corresponding to no_empty_result
...     raise Exception
...
>>> soup
<strong>Notice:</strong>
```

When the `no_empty_result` parameter is True, an `EmptyResultError` is raised if the result of .select_one() is None.

This feature is useful for type hints and also helps make errors clearer.

The original BeautifulSoup annotates the return value of `.select_one()` as `Tag | None`, so static type checkers raise errors for code like `.select_one().text`.

In particular, when the result of `.select_one()` is None, the error message is `'NoneType' object has no attribute 'text'`, which makes it hard to see at a glance where the error occurred.

`no_empty_result` solves these problems.
Setting `no_empty_result` to True silences the type checkers, and should the result ever be None, it instead produces a far more detailed error message that includes possible solutions.

```python
>>> from resoup import requests
>>>
>>> response = requests.get("https://python.org")
>>> print(response.soup_select_one("data-some-complex-and-error-prone-selector"))
None  # shown only because of print; without it, the call just silently returns None.
>>>
>>> response = requests.get("https://python.org")
>>> response.soup_select_one(
...     "data-some-complex-and-error-prone-selector",
...     no_empty_result=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "...souptools.py", line 220, in soup_select_one
    raise EmptyResultError(
resoup.exceptions.EmptyResultError: Result of select_one is None. This error happens probably because of invalid selector or URL. Check if both selector and URL are valid. Set to False `no_empty_result` if empty list is intended. It may also because of selector is not matched with URL.  
selector: data-some-complex-and-error-prone-selector, URL: https://www.python.org/
```

#### xml-related functions

Replacing `soup` with `xml` in the `soup`-related functions of `ResponseProxy` gives the xml functions.

These functions differ from the soup ones only in that the parser is `'xml'`.

Example code:

```python
>>> from resoup import requests
>>>
>>> response = requests.get('https://www.w3schools.com/xml/plant_catalog.xml')
>>> selected = response.xml_select('LIGHT', no_empty_result=True)
>>> selected
[<LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Sunny</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sun</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>]
```

The code above is almost the same as the code below.

```python
>>> from resoup import requests
>>> from functools import partial
>>>
>>> response = requests.get('https://www.w3schools.com/xml/plant_catalog.xml')
>>> # corresponds to `.xml_select()`
>>> xml_select_partial = partial(response.soup_select, parser='xml')
>>> selected = xml_select_partial('LIGHT', no_empty_result=True)
>>> selected
[<LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Sunny</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sun</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>]
```

#### BroadcastList

`.soup_select()` and `.xml_select()` return a list as their value. This makes it hard to use attributes such as `.text` that you could expect from `.soup()` or `.soup_select_one()`.

This can be solved with a for loop or a list comprehension.

```python
>>> from resoup import requests
>>> tags_list = requests.get("https://python.org").soup_select("p strong")
>>> [element.text for element in tags_list]
['Notice:', 'relaunched community-run job board']
```

But you may not like that. Especially during development, you may want a quicker way to apply `.text` and the like than writing a for loop or a list comprehension, for the sake of development speed.

BroadcastList, the default return type of `.soup_select()` in this project, is a way to solve this.

With BroadcastList, you can use the attributes of a Tag directly on the list itself.

```python
>>> from resoup import requests
>>> tags_list = requests.get("https://python.org").soup_select("p strong")
>>> tags_list
[<strong>Notice:</strong>, <strong>relaunched community-run job board</strong>]
>>> type(tags_list)
<class 'resoup.broadcast_list.TagBroadcastList'>  # BroadcastList is used
>>> tags_list.text  # broadcasting
['Notice:', 'relaunched community-run job board']
>>>
>>> tags_list_with_no_broadcast_list = requests.get('https://python.org').soup_select('p', use_broadcast_list=False)
>>> type(tags_list_with_no_broadcast_list)
<class 'bs4.element.ResultSet'>  # BroadcastList is not used
>>> tags_list_with_no_broadcast_list.text
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "...element.py", line 2428, in __getattr__
    raise AttributeError(
AttributeError: ResultSet object has no attribute 'text'. You're probably treating a list of elements like a single element. Did you call find_all() when you meant to call find()?
```

BroadcastList can be turned off as follows.

```python
>>> from resoup import requests
>>>
>>> tags_list = requests.get("https://python.org").soup_select("p", use_broadcast_list=False)
>>> type(tags_list)
bs4.element.ResultSet
>>> tags_list.text  # no broadcasting
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "...element.py", line 2428, in __getattr__
    raise AttributeError(
AttributeError: ResultSet object has no attribute 'text'. You're probably treating a list of elements like a single element. Did you call find_all() when you meant to call find()?
```

#### Special list getitem behavior

BroadcastList has the following special feature.

If you index the list with an integer or a slice, it behaves like a normal list.

```python
>>> from resoup import requests
>>> # load the values
>>> tag_broadcast_list = requests.cget("https://www.python.org/community/logos/").soup_select("img")
>>> tag_broadcast_list
[<img alt="Python Software Foundation" class="psf-logo" src="/static/img/psf-logo.png"/>,
...
<img alt="Logo device only" src="https://s3.dualstack.us-east-2.amazonaws.com/pythondotorg-assets/media/community/logos/python-logo-only.png" style="height: 48px;"/>,
<img alt="/static/community_logos/python-powered-w-100x40.png" src="/static/community_logos/python-powered-w-100x40.png"/>,
<img alt="/static/community_logos/python-powered-h-50x65.png" src="/static/community_logos/python-powered-h-50x65.png"/>]
>>> # integer getitem
>>> tag_broadcast_list[0]
<img alt="Python Software Foundation" class="psf-logo" src="/static/img/psf-logo.png"/>
>>> # slicing
>>> tag_broadcast_list[3:5]
[<img alt="/static/community_logos/python-powered-w-100x40.png" src="/static/community_logos/python-powered-w-100x40.png"/>,
 <img alt="/static/community_logos/python-powered-h-50x65.png" src="/static/community_logos/python-powered-h-50x65.png"/>]
>>> # string getitem (broadcasting applies!)
>>> tag_broadcast_list["alt"]
['Python Software Foundation',
 'Combined logo',
 'Logo device only',
 '/static/community_logos/python-powered-w-100x40.png',
 '/static/community_logos/python-powered-h-50x65.png']
```
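Both behaviors, attribute broadcasting and string-key broadcasting, can be sketched with a small list subclass. `SimpleBroadcastList` below is a hypothetical illustration, not resoup's actual implementation:

```python
class SimpleBroadcastList(list):
    # Unknown attributes fan out to every element.
    def __getattr__(self, name):
        return [getattr(item, name) for item in self]

    def __getitem__(self, key):
        if isinstance(key, str):
            # String keys broadcast to each element's own getitem.
            return [item[key] for item in self]
        # Integers and slices behave like a normal list.
        return super().__getitem__(key)
```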

### CustomDefaults

With `CustomDefaults` you can set your own defaults. This effectively sets the defaults of the plain get/options/head/post/put/patch/delete functions as well as the c../a../ac.. variants.

```python
>>> from resoup import CustomDefaults
>>>
>>> requests = CustomDefaults(headers={'User-Agent': 'User Agent for Test'})
>>> requests.get('https://httpbin.org/headers').json()['headers']['User-Agent']
'User Agent for Test'
```
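The mechanism can be pictured as an object that merges its stored defaults with call-site arguments, with the call site winning. A hypothetical sketch follows; `DefaultsProxy` and the injected `send` function are made-up names, not resoup's API:

```python
class DefaultsProxy:
    def __init__(self, send, **defaults):
        self._send = send          # the underlying request function
        self._defaults = defaults  # e.g. headers=..., timeout=...

    def get(self, url, **kwargs):
        merged = {**self._defaults, **kwargs}  # call-site arguments win
        return self._send("GET", url, **merged)
```

Real calls would then pass the merged keyword arguments straight through to the underlying request function.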

## License

This program is distributed under the MIT license.

Parts of this program contain code from the [requests (Apache License 2.0)](https://github.com/psf/requests) library.

Parts of this program contain code from the [typeshed (Apache License 2.0 or MIT License)](https://github.com/python/typeshed) library.

## Release Notes

0.5.2 (2023-12-26): Timeout errors can now also trigger attempts; added variables available from the root; improved build code; code improvements

0.5.1 (2023-12-09): Bug fixes

0.5.0 (2023-12-09): Renamed to resoup; the new BroadcastList applied by default; switched to poetry; removed the old souptools module and replaced it with the souptoolsclass module; added tests

0.4.1 (2023-11-04): Urgent bug fix

0.4.0 (2023-11-04): Changed the default of raise_for_status; added souptoolsclass; added avoid_sslerror

0.3.0 (2023-10-05): Restored BroadcastList; added sessions_with_tools

0.2.3 (2023-09-19): Changed the default headers; show only one error on ConnectionError; added a message when a retry via attempts succeeds; removed url from retry; changed setup.py and related files

0.2.2 (2023-09-08): Renamed the attempt parameter to attempts; removed BroadcastList

0.2.1 (2023-08-31): Added py.typed; added freeze_dict_and_list

0.2.0 (2023-08-27): Added CustomDefaults

0.1.1 (2023-08-27): First release

            

Raw data

            {
    "_id": null,
    "home_page": "",
    "name": "resoup",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.10,<4.0",
    "maintainer_email": "",
    "keywords": "requests,bs4,BeautifulSoup,async,caching,cache",
    "author": "ilotoki0804",
    "author_email": "ilotoki0804@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/85/2c/ae9c93c304de9507980f09c477d057ff4c12735a645030ff5472d963789e/resoup-0.5.2.tar.gz",
    "platform": null,
    "description": "\uc774 \uc124\uba85\uc740 \ucd5c\uc2e0 \ubc84\uc804\uc774 \uc544\ub2d0 \uc218 \uc788\uc2b5\ub2c8\ub2e4. \ub9cc\uc57d \ucd5c\uc2e0 \ubc84\uc804\uc744 \ud655\uc778\ud558\uace0 \uc2f6\uc73c\uc2dc\ub2e4\uba74 [\uc774 \uae43\ud5c8\ube0c \ub9c1\ud06c](https://github.com/ilotoki0804/resoup)\ub97c \ucc38\uace0\ud558\uc138\uc694.\n# resoup\n\n**Various convenient features related to requests and BeautifulSoup.** (<span style=\"color:blue\">**_re_**</span>quests + Beautiful<span style=\"color:blue\">**_Soup_**</span>)\n\n1. `requests`\ub77c\uc774\ube0c\ub7ec\ub9ac\uc640 BeatifulSoup\ub97c \ud569\uccd0 \uba87 \uc904\uc758 \ucf54\ub4dc\ub97c \ud558\ub098\uc5d0 \ud569\uce60 \uc218 \uc788\uc73c\uba70,\n1. \uac04\ub2e8\ud558\uac8c async, cache\ub97c \ubd88\ub7ec\uc640 \uc0ac\uc6a9\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.\n1. \uc6f9 \uc2a4\ud06c\ub798\ud551 \uc2dc \ud3b8\ub9ac\ud55c \uae30\ubcf8\uac12\ub3c4 \uc900\ube44\ub418\uc5b4 \uc788\uace0,\n1. `no_empty_result`, `attempts`, `avoid_sslerror` \ub4f1 \ub2e4\uc591\ud558\uace0 \uc18c\uc18c\ud55c \uae30\ub2a5\ub3c4 \uc900\ube44\ub418\uc5b4 \uc788\uc2b5\ub2c8\ub2e4.\n\n\uc18c\uc18c\ud558\uc9c0\ub9cc \uc720\uc6a9\ud558\uba70, \uc11c\ub108 \uc904\uc758 \ucf54\ub4dc \uc791\uc131\ub7c9\uc744 \uc904\uc5ec\uc8fc\ub294 \ub77c\uc774\ube0c\ub7ec\ub9ac\uc785\ub2c8\ub2e4.\n\n## \uc2dc\uc791\ud558\uae30\n\n1. \ud30c\uc774\uc36c\uc744 \uc124\uce58\ud569\ub2c8\ub2e4.\n1. 
\ud130\ubbf8\ub110\uc5d0\uc11c \ub2e4\uc74c\uacfc \uac19\uc740 \uba85\ub839\uc5b4\ub97c \uc2e4\ud589\ud569\ub2c8\ub2e4.\n\n   ```console\n   pip install -U resoup\n   ```\n\nrequests\uc640 bs4\ub294 \uac19\uc774 \uc124\uce58\ub418\uc9c0\ub9cc BeatifulSoup\uc758 \ucd94\uac00\uc801\uc778 parser\uc778 lxml\uc640 html5lib\ub294 \uae30\ubcf8\uc73c\ub85c \uc81c\uacf5\ud558\uc9c0 \uc54a\uc2b5\ub2c8\ub2e4.\n\n\ub530\ub77c\uc11c lxml, html5lib \ub4f1\uc740 \uc2a4\uc2a4\ub85c \uc124\uce58\ud558\uc154\uc57c \uc624\ub958\uac00 \ub098\uc9c0 \uc54a\uc744 \uc218 \uc788\uc2b5\ub2c8\ub2e4.\n\ub9cc\uc57d \uc124\uce58\ub418\uc9c0 \uc54a\uc740 \uc0c1\ud0dc\ub85c \ud574\ub2f9 parser\ub97c \uc774\uc6a9\ud55c\ub2e4\uba74 `NoParserError`\uac00 \ub0a9\ub2c8\ub2e4.\n\n## \uc0ac\uc6a9\ubc95\n\n\ucc38\uace0: \uc608\uc2dc\ub4e4\uc758 \uacbd\uc6b0 \ub9ce\uc740 \uacbd\uc6b0 `get` \uc694\uccad\uc744 \uc704\uc8fc\ub85c \uc124\uba85\ud558\uc9c0\ub9cc, \ub2e4\ub978 \ubaa8\ub4e0 \uba54\uc18c\ub4dc(options/head/post/put/patch/delete)\uc5d0\uc11c\ub3c4 \ub3d9\uc77c\ud558\uac8c \uc791\ub3d9\ud569\ub2c8\ub2e4.\n\n### `resoup.requests` \ubaa8\ub4c8\n\n`resoup.requests` \ubaa8\ub4c8\uc740 \ub2e4\uc74c\uacfc \uac19\uc774 import\ud574 \uc0ac\uc6a9\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.\n\n```python\nfrom resoup import requests  # `import requests`\uc640 \ud638\ud658\ub428.\n```\n\n\uc774 \ub77c\uc774\ube0c\ub7ec\ub9ac\ub294 requests \ub77c\uc774\ube0c\ub7ec\ub9ac\uc640 99% \ud638\ud658\ub418\uba70 (\uc2ec\uc9c0\uc5b4 \ud0c0\uc785 \ud78c\ud2b8\ub3c4 requests \ub77c\uc774\ube0c\ub7ec\ub9ac\uc640 \ub611\uac19\uc774 \uc798 \uc791\ub3d9\ud569\ub2c8\ub2e4!), \uadf8 \uc704\uc5d0 \ud3b8\ub9ac\ud55c \uae30\ub2a5\uc744 \uc5b9\uc740 \ud615\ud0dc\uc785\ub2c8\ub2e4. 
In other words, you can swap an existing `import requests` for the code above and it will integrate smoothly without breaking existing code.

requests' Session can be used similarly.

```python
from resoup import requests

with requests.Session() as session:
    ...  # All features such as cget, attempts, etc. are available
```

#### Defaults

The defaults are each set to reasonable values.

They are as follows and apply to requests.get/options/head/post/put/patch/delete.

```python
timeout default: 120
headers default: {
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7",
    "Accept-Encoding": "gzip, deflate, br",
    "Accept-Language": "ko-KR,ko;q=0.9",
    "Sec-Ch-Ua": '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
    "Sec-Ch-Ua-Mobile": "?0",
    "Sec-Ch-Ua-Platform": '"Windows"',
    "Sec-Fetch-Dest": "document",
    "Sec-Fetch-Mode": "navigate",
    "Sec-Fetch-Site": "none",
    "Sec-Fetch-User": "?1",
    "Upgrade-Insecure-Requests": "1",
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36",
}
attempts default: 1
avoid_sslerror default: False
```

```python
>>> from resoup import requests
>>>
>>> res = requests.get("https://httpbin.org/headers")
>>> res.json()['headers']
{'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7',
 'Accept-Encoding': 'gzip, deflate, br',
 'Accept-Language': 'ko-KR,ko;q=0.9',
 'Host': 'httpbin.org',
 'Sec-Ch-Ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
 'Sec-Ch-Ua-Mobile': '?0',
 'Sec-Ch-Ua-Platform': '"Windows"',
 'Sec-Fetch-Dest': 'document',
 'Sec-Fetch-Mode': 'navigate',
 'Sec-Fetch-Site': 'none',
 'Sec-Fetch-User': '?1',
 'Upgrade-Insecure-Requests': '1',
 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36',
 'X-Amzn-Trace-Id': ...}
```

#### Responses

The get/options/head/post/put/patch/delete functions of the `resoup.requests` module all return a ResponseProxy.

ResponseProxy is a subclass of Response that is 100% compatible with the original Response. 
For details, see the `ResponseProxy` section.

If you are not familiar with these features, you can keep using responses exactly the way you used the original Response and everything will work without problems.

#### attempts

`attempts` is a parameter that sets how many more times the same request is retried when a `ConnectionError` occurs for whatever reason.

If the request still fails after, say, 10 attempts, the reason for the most recent failed connection is shown.

```python
>>> from resoup import requests
>>>
>>> requests.get('https://some-not-working-website.com', attempts=10)
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
WARNING:root:Retring...
Traceback (most recent call last):
...
socket.gaierror: [Errno 11001] getaddrinfo failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
...
urllib3.exceptions.NameResolutionError: <urllib3.connection.HTTPSConnection object at ...>: Failed to resolve 'some-not-working-website.com' ([Errno 11001] getaddrinfo failed)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
...
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='some-not-working-website.com', port=443): Max retries exceeded with url: / (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at ...>: Failed to resolve 'some-not-working-website.com' ([Errno 11001] getaddrinfo failed)"))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
...
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='some-not-working-website.com', port=443): Max retries exceeded with url: / (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at ...>: Failed to resolve 'some-not-working-website.com' ([Errno 11001] getaddrinfo failed)"))

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
...
ConnectionError: Trying 10 times but failed to get data.
URL: https://some-not-working-website.com
```

#### avoid_sslerror

`avoid_sslerror` can be used on sites that raise an error due to `UNSAFE_LEGACY_RENEGOTIATION_DISABLED`.

For example, without `avoid_sslerror` the following site raises this error:

```python
>>> from resoup import requests
>>> requests.get('https://bufftoon.plaync.com')
---------------------------------------------------------------------------
SSLError                                  Traceback (most recent call last)
...
SSLError: HTTPSConnectionPool(host='bufftoon.plaync.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError(1, '[SSL: UNSAFE_LEGACY_RENEGOTIATION_DISABLED] unsafe legacy renegotiation disabled (_ssl.c:1000)')))
```

Setting `avoid_sslerror` to `True` avoids the error.

```python
>>> requests.get('https://bufftoon.plaync.com', avoid_sslerror=True)
<Response [200]>
```

#### Plain request functions

The plain requests.get/options/head/post/put/patch/delete can be used exactly the way they are used in `requests`.

The following are examples of requests.get and post. They work the same as in the `requests` module.

```python
>>> from resoup import requests
>>>
>>> requests.get('https://jsonplaceholder.typicode.com/todos/1').json()  # A test API. Don't execute this command unless you trust this API.
{'userId': 1, 'id': 1, 'title': 'delectus aut autem', 'completed': False}
>>> requests.post('https://jsonplaceholder.typicode.com/todos', json={
...     'title': 'foo',
...     'body': 'bar',
...     'userId': 1,
... }).json()
{'title': 'foo', 'body': 'bar', 'userId': 1, 'id': 201}  # Same as the original requests library
```

#### Cached request functions

These are identical to the plain requests.get/../delete requests, but cached. The cache is shared with the `asynchronous cached request functions` described below; it is not, however, shared between the different methods. 
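Conceptually, this caching amounts to memoization keyed by the request. The following is only a rough sketch of the idea (not resoup's actual implementation), using a plain dict and a stand-in `fake_fetch` function in place of a real network call:

```python
from typing import Callable

def make_cached(fetch: Callable[[str], str]) -> Callable[[str], str]:
    """Return a cached variant of `fetch`, analogous to cget vs. get."""
    cache: dict[str, str] = {}

    def cached_fetch(url: str) -> str:
        # Only the first call for a given URL performs a real fetch;
        # later calls return the stored response.
        if url not in cache:
            cache[url] = fetch(url)
        return cache[url]

    return cached_fetch

calls = []

def fake_fetch(url: str) -> str:
    calls.append(url)  # record that a "network" request happened
    return f"response for {url}"

cached = make_cached(fake_fetch)
cached("https://example.com")
cached("https://example.com")  # served from the cache
print(len(calls))  # → 1: only one real request was sent
```

This also illustrates the caveat above: once a URL is cached, later changes in the real response are never observed, and large responses stay in memory.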
You can use them by prefixing the name with `c`: requests.cget/coptions/chead/cpost/cput/cpatch/cdelete.

It is better not to use them with dynamic services that may respond differently to the same URL (changes in the response over time are not reflected) or when responses are large (memory may be wasted).

```python
>>> # Results may vary depending on hardware and internet connection quality
>>> import timeit
>>>
>>> timeit.timeit('requests.get("https://python.org")', number=10, setup='from resoup import requests')
1.1833231999917189  # All 10 runs send a request
>>> timeit.timeit('requests.cget("https://python.org")', number=10, setup='from resoup import requests')
0.10267569999268744  # Only the first run sends a request; the rest are served from the cache
```

#### Asynchronous request functions

These send requests asynchronously. Prefix the name with `a`: requests.aget/aoptions/ahead/apost/aput/apatch/adelete.

`run_in_executer` is enabled by default. 
For details, see `Using run_in_executer` below.

```python
>>> import asyncio
>>>
>>> from resoup import requests
>>>
>>> res = asyncio.run(requests.aget('https://python.org'))
>>> res
<Response [200]>
```

#### Asynchronous cached request functions

These requests are asynchronous and cached. For a given method, the cache is shared with the `cached request functions`. Prefix the name with `ac`: requests.acget/acoptions/achead/acpost/acput/acpatch/acdelete.

It is better not to use them with dynamic services that may respond differently to the same URL (changes in the response over time are not reflected) or when responses are large (memory may be wasted).

`run_in_executer` is enabled by default. 
For details, see `Using run_in_executer` below.

```python
>>> import asyncio
>>> import timeit
>>>
>>> timeit.timeit('asyncio.run(requests.aget("https://python.org"))', number=10, setup='from resoup import requests; import asyncio')
0.8676127000362612  # Results may vary: all 10 runs send a request
>>> timeit.timeit('asyncio.run(requests.acget("https://python.org"))', number=10, setup='from resoup import requests; import asyncio')
0.11984489997848868  # Only the first run sends a request; the rest are served from the cache
```

#### Using `run_in_executer`

The asynchronous request functions (the `a`-prefixed methods such as aget and acget) accept a `run_in_executer` parameter. This parameter makes the function run in a different thread. 
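The general pattern behind this (independent of resoup) is asyncio's `run_in_executor`, which offloads a blocking call to a thread pool so the event loop stays free. A minimal generic sketch, with `blocking_fetch` as a stand-in for a blocking HTTP call:

```python
import asyncio
import time

def blocking_fetch(url: str) -> str:
    # Stand-in for a blocking HTTP call such as requests.get
    time.sleep(0.1)
    return f"response for {url}"

async def fetch_in_thread(url: str) -> str:
    loop = asyncio.get_running_loop()
    # None -> use the default ThreadPoolExecutor; the blocking call
    # runs in a worker thread while the event loop keeps running.
    return await loop.run_in_executor(None, blocking_fetch, url)

async def main() -> list[str]:
    # Ten blocking calls run concurrently in worker threads
    # instead of back-to-back on the event loop.
    return await asyncio.gather(
        *(fetch_in_thread(f"https://example.com/{i}") for i in range(10))
    )

results = asyncio.run(main())
print(results[0])  # → response for https://example.com/0
```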
When the program runs sequentially it makes little difference, but when running requests in parallel you can expect a large speedup.

Using `asyncio.gather` as below can show a significant performance improvement.

```python
import asyncio
import time

from resoup import requests

async def measure_coroutine_time(coroutine):
    start = time.perf_counter()
    await coroutine
    end = time.perf_counter()

    print(end - start)

async def main():
    # Sending a single request (little difference)

    req = requests.aget('https://python.org', run_in_executor=False)
    await measure_coroutine_time(req)  # 0.07465070000034757

    req = requests.aget('https://python.org')
    await measure_coroutine_time(req)  # 0.05844969999452587

    # Sending multiple requests (large speedup)

    reqs = (requests.aget(f'https://python.org/{i}', run_in_executor=False) for i in range(10))  # Build dummy URLs
    await measure_coroutine_time(asyncio.gather(*reqs))  # Without run_in_executor: slow (3.7874760999984574)

    reqs = (requests.aget(f'https://python.org/{i}') for i in range(10))  # Build dummy URLs
    await measure_coroutine_time(asyncio.gather(*reqs))  # With run_in_executor (the default): fast (0.11582900000212248)

if __name__ == '__main__':
    asyncio.run(main())
```

#### Parts incompatible with the requests module

This module is compatible with the `requests` library in almost every respect, but there are a few incompatibilities.

##### dunder methods (`__dunder__`)

For technical reasons, or because they could become a source of subtle bugs, some dunder methods are not imported or are not compatible.

Dunder methods that are unavailable or differ from the requests library: `__builtins__`, `__cached__`, `__doc__`, `__file__`, `__loader__`, `__name__`, `__package__`, `__spec__`

Dunder methods that are available and match the requests library: `__author__`, `__author_email__`, `__build__`, `__cake__`, `__copyright__`, `__description__`, `__license__`, `__title__`, `__url__`, `__version__`

```python
>>> import requests
>>> requests.__name__
'requests'
>>> requests.__path__
['some path']
>>> requests.__cake__
'✨ 🍰 ✨'
>>>
>>> from resoup import requests
>>> requests.__name__  # Incompatible dunder method
'resoup.requests_proxy'  # Differs from requests
>>> requests.__path__  # Unavailable, incompatible dunder method
AttributeError: module 'resoup.requests_' has no attribute '__path__'
>>> requests.__cake__  # Compatible dunder method
'✨ 🍰 ✨'
```

##### import

`resoup.requests` remains import-compatible in almost all cases. 
However, there are a few rules regarding imports.

`resoup.requests` can only be used in the form `from resoup import requests`.

```python
# In each pair, the first line imports requests and the second imports `resoup.requests`.

# Importing the requests module
import requests
from resoup import requests  # Possible
```

Therefore, the following imports are not possible with `resoup.requests`.

```python
# Importing a submodule of requests
import requests.models  # Possible
import resoup.requests.models  # Not possible!

# Importing a submodule of requests (w/ from .. import ...)
from requests import models  # Possible
from resoup.requests import models  # Not possible!

# Importing a member of a submodule of requests
from requests.models import Response  # Possible
from resoup.requests.models import Response  # Not possible!
```

In such cases, importing the module solves the problem.

For example, suppose we have the following code.

```python
from requests.models import Response  # Importing a member of a submodule

def is_response(instance):
    return isinstance(instance, Response)
```

This code can be fixed as follows.

```python
# Switch to requests.models.Response.
# Advantage: clean and less error-prone.
from resoup import requests  # Import the module

def is_response(instance):
    return isinstance(instance, requests.models.Response)  # Changed to requests.models.Response
```

```python
# Define Response.
# Advantage: no need to modify the existing code.
from resoup import requests
Response = requests.models.Response

def is_response(instance):
    return isinstance(instance, Response)
```

Use whichever style you prefer.

### ResponseProxy

`ResponseProxy` is the return value when you use requests.get/options/head/post/put/patch/delete from this library. It is 100% compatible with the original Response while providing six additional functions.

#### Compatibility

In this part, the explanations are given in the comments.

```python
>>> # Since both modules are used at the same time, they are renamed.
>>> import requests as original_requests
>>> from resoup import requests as utils_requests
>>>
>>> # The requests module returns a Response.
>>> response1 = original_requests.get("https://peps.python.org/pep-0020/")  # A static website
>>> print(response1)
<Response [200]>
>>> print(type(response1))  # A Response object
<class 'requests.models.Response'>
>>> # The resoup.requests module returns a ResponseProxy.
>>> response2 = utils_requests.get("https://peps.python.org/pep-0020/")
>>> print(response2)
<Response [200]>
>>> print(type(response2))  # A ResponseProxy object
<class 'resoup.response_proxy.ResponseProxy'>
>>>
>>> # All of the following checks pass.
>>> assert response1.text == response2.text
>>> assert response1.status_code == response2.status_code
>>> assert response1.url == response2.url
>>> assert response1.content == response2.content
>>>
>>> # But ResponseProxy has these additional features.
>>> print(response2.soup())
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8"/>
...
<script src="../_static/wrap_tables.js"></script>
<script src="../_static/sticky_banner.js"></script>
</body>
</html>
>>> print(response2.soup_select('title'))
[<title>PEP 20 – The Zen of Python | peps.python.org</title>, <title>Following system colour scheme</title>, <title>Selected dark colour scheme</title>, <title>Selected light colour scheme</title>]
>>> print(response2.soup_select_one('p', no_empty_result=True).text)
Long time Pythoneer Tim Peters succinctly channels the BDFL’s guiding
principles for Python’s design into 20 aphorisms, only 19 of which
have been written down.
>>>
>>> from requests.models import Response
>>> # ResponseProxy is a subclass of Response.
>>> # Therefore, it passes isinstance checks.
>>> isinstance(response2, Response)
True
>>> # Of course, since it is a subclass, an '==' check on the types does not pass.
>>> type(response1) == type(response2)
False
```

#### Basic structure

`ResponseProxy` provides several functions, grouped into two kinds.

* The soup family: `.soup()`, `.soup_select()`, `.soup_select_one()`
  These are the basic functions.
* The xml family: `.xml()`, `.xml_select()`, `.xml_select_one()`
  The soup family with the parser set to 'xml'.

Each kind has three functions, and each function works as follows.

* `.soup()`/`.xml()`: returns the document parsed with BeautifulSoup.
* `.soup_select()`/`.xml_select()`: similar to `.soup().select()`.
* `.soup_select_one()`/`.xml_select_one()`: similar to `.soup().select_one()`.

See below for details.

#### `.soup()`

`.soup()` takes text or a response and returns a `BeautifulSoup` object.

Both a response and response.text are accepted as the argument, but passing the response is recommended.
That way you get more detailed error messages.

```python
>>> from resoup import requests
>>>
>>> response = requests.get("https://python.org")
>>> response.soup()  # Every parameter available in BeautifulSoup can be used
<!DOCTYPE html>
...
</body>
</html>
```

This function is effectively the same as passing the text through `BeautifulSoup`. 
The code below is almost the same as the code above.

```python
>>> import requests
>>> from bs4 import BeautifulSoup
>>>
>>> response = requests.get("https://python.org")
>>> BeautifulSoup(response.text)
<!DOCTYPE html>
...
</body>
</html>
```

When no parser is available, `BeautifulSoup` raises a `FeatureNotFound` error, while `.soup()` raises a `NoParserError`.

#### `.soup_select()`

`.soup_select()` takes text or a response and returns BeautifulSoup Tags. The `selector` parameter takes a CSS selector.

```python
>>> from resoup import requests
>>>
>>> response = requests.get("https://python.org")
>>> response.soup_select("p")
[<p><strong>Notice:</strong> While JavaScript is not essential for this website
...]
```

The code below behaves similarly to the code above.

```python
>>> import requests
>>> from bs4 import BeautifulSoup
>>>
>>> response = requests.get('https://python.org')
>>> soup = BeautifulSoup(response.text).select('p')
>>> soup
[<p><strong>Notice:</strong> While JavaScript is not essential for this website
...]
```

What makes this function unique is the `no_empty_result` parameter. When it is True, an `EmptyResultError` is raised if the result of .select() is an empty list.

```python
>>> from resoup import requests
>>>
>>> response = requests.get("https://python.org")
>>> response.soup_select("data-some-complex-and-error-prone-selector")
[]
>>>
>>> response = requests.get("https://python.org")
>>> response.soup_select(
...     "data-some-complex-and-error-prone-selector",
...     no_empty_result=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "...souptools.py", line 148, in soup_select
    raise EmptyResultError(
resoup.exceptions.EmptyResultError: Result of select is empty list("[]"). This error happens probably because of invalid selector or URL. Check if both selector and URL are valid. Set to False `no_empty_result` if empty list is intended. It may also because of selector is not matched with URL.
selector: data-some-complex-and-error-prone-selector, URL: https://www.python.org/
```

By default, this function returns a BroadcastList. If you want to learn more about it, see the `BroadcastList` section below.

#### `.soup_select_one()`

`.soup_select_one()` takes text or a response and returns a BeautifulSoup Tag. The `selector` parameter takes a CSS selector.

```python
>>> from resoup import requests
>>>
>>> response = requests.get('https://python.org')
>>> response.soup_select_one('p strong', no_empty_result=True)
<strong>Notice:</strong>
```

The code below behaves similarly to the code above.

```python
>>> import requests
>>> from bs4 import BeautifulSoup
>>>
>>> response = requests.get('https://python.org')
>>> soup = BeautifulSoup(response.text, 'html.parser').select_one('p strong')
>>> if soup is None:  # Check corresponding to no_empty_result
...     raise Exception
...
>>> soup
<strong>Notice:</strong>
```

When the `no_empty_result` parameter is True, an `EmptyResultError` is raised if the result of .select_one() is None.

This feature is also useful for type hints and helps make errors clearer.

Because the original BeautifulSoup annotates the return type of `.select_one()` as `Tag | None`, code like `.select_one().text` makes static type checkers raise errors.

In particular, when the result of `.select_one()` is None, you get an error message like `'NoneType' object has no attribute 'text'`, which makes it hard to see at a glance where the error occurred.

Using `no_empty_result` solves these problems.
Setting `no_empty_result` to True keeps the type checkers quiet, and whenever the result would be None, it instead produces a much more detailed error message that includes a solution.

```python
>>> from resoup import requests
>>>
>>> response = requests.get("https://python.org")
>>> print(response.soup_select_one("data-some-complex-and-error-prone-selector"))
None  # Fails silently with None instead of raising an error.
>>>
>>> response = requests.get("https://python.org")
>>> response.soup_select_one(
...     "data-some-complex-and-error-prone-selector",
...     no_empty_result=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "...souptools.py", line 220, in soup_select_one
    raise EmptyResultError(
resoup.exceptions.EmptyResultError: Result of select_one is None. This error happens probably because of invalid selector or URL. Check if both selector and URL are valid. Set to False `no_empty_result` if empty list is intended. It may also because of selector is not matched with URL.  
selector: data-some-complex-and-error-prone-selector, URL: https://www.python.org/
```

#### xml-related functions

Replace `soup` with `xml` in the `soup`-related functions of `ResponseProxy` to get the xml functions.

Apart from the parser being `'xml'`, these functions are no different from their soup counterparts.

Example code:

```python
>>> from resoup import requests
>>>
>>> response = requests.get('https://www.w3schools.com/xml/plant_catalog.xml')
>>> selected = response.xml_select('LIGHT', no_empty_result=True)
>>> selected
[<LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Sunny</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sun</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>]
```

The code above is almost the same as the code below.

```python
>>> from resoup import requests
>>> from functools import partial
>>>
>>> response = requests.get('https://www.w3schools.com/xml/plant_catalog.xml')
>>> # Corresponds to `.xml_select()`
>>> xml_select_partial = partial(response.soup_select, parser='xml')
>>> selected = xml_select_partial('LIGHT', no_empty_result=True)
>>> selected
[<LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Sunny</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sunny</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Sun or Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Sun</LIGHT>, <LIGHT>Mostly Shady</LIGHT>, <LIGHT>Shade</LIGHT>, <LIGHT>Shade</LIGHT>]
```

#### BroadcastList

`.soup_select()` and `.xml_select()` return a list. This makes it hard to use attributes such as `.text` that you can expect from `.soup()` or `.soup_select_one()`.

This can be solved with a for loop or a list comprehension.

```python
>>> from resoup import requests
>>> tags_list = requests.get("https://python.org").soup_select("p strong")
>>> [element.text for element in tags_list]
['Notice:', 'relaunched community-run job board']
```

But you may not like this. Especially during development, you may want a faster way to apply `.text` and friends than a for loop or list comprehension, for the sake of quick _development_ speed.

BroadcastList, the default return type of `.soup_select()` in this project, is a way to solve this.

With BroadcastList, attributes that belong to a Tag can be used directly on the list.

```python
>>> from resoup import requests
>>> tags_list = requests.get("https://python.org").soup_select("p strong")
>>> tags_list
[<strong>Notice:</strong>, <strong>relaunched community-run job board</strong>]
>>> type(tags_list)
<class 'resoup.broadcast_list.TagBroadcastList'>  # BroadcastList is used
>>> tags_list.text  # Broadcasting
['Notice:', 'relaunched community-run job board']
>>>
>>> tags_list_with_no_broadcast_list = requests.get('https://python.org').soup_select('p', use_broadcast_list=False)
>>> type(tags_list_with_no_broadcast_list)
<class 'bs4.element.ResultSet'>  # BroadcastList is not used
>>> tags_list_with_no_broadcast_list.text
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "...element.py", line 2428, in __getattr__
    raise AttributeError(
AttributeError: ResultSet object has no attribute 'text'. You're probably treating a list of elements like a single element. Did you call find_all() when you meant to call find()?
```

BroadcastList can be turned off as follows.

```python
>>> from resoup import requests
>>>
>>> tags_list = requests.get("https://python.org").soup_select("p", use_broadcast_list=False)
>>> type(tags_list)
bs4.element.ResultSet
>>> tags_list.text  # No broadcasting
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "...element.py", line 2428, in __getattr__
    raise AttributeError(
AttributeError: ResultSet object has no attribute 'text'. You're probably treating a list of elements like a single element. Did you call find_all() when you meant to call find()?
```
Did you call find_all() when you meant to call find()?\n```\n\n### \ud2b9\ubcc4\ud55c \ud615\ud0dc\uc758 \ub9ac\uc2a4\ud2b8 getitem\n\nBroadCastList\uc5d0\uc11c\ub294 \ub2e4\uc74c\uacfc \uac19\uc740 \ud2b9\uc774\ud55c \uae30\ub2a5\uc774 \uc788\uc2b5\ub2c8\ub2e4.\n\n\ub9cc\uc57d \ub9ac\uc2a4\ud2b8\uc5d0 \uc815\uc218\ub098 \uc2ac\ub77c\uc774\uc2a4\ub85c getitem\uc744 \uc694\uccad\ud55c\ub2e4\uba74 \uc77c\ubc18\uc801\uc778 \ub9ac\uc2a4\ud2b8\uc758 \uc5ed\ud560\uc744 \uc218\ud589\ud569\ub2c8\ub2e4.\n\n```python\n>>> from resoup import requests\n>>> # \uac12 \ubd88\ub7ec\uc634()\n>>> tag_broadcast_list = requests.cget(\"https://www.python.org/community/logos/\").soup_select(\"img\")\n>>> tag_broadcast_list\n[<img alt=\"Python Software Foundation\" class=\"psf-logo\" src=\"/static/img/psf-logo.png\"/>,\n...\n<img alt=\"Logo device only\" src=\"https://s3.dualstack.us-east-2.amazonaws.com/pythondotorg-assets/media/community/logos/python-logo-only.png\" style=\"height: 48px;\"/>,\n<img alt=\"/static/community_logos/python-powered-w-100x40.png\" src=\"/static/community_logos/python-powered-w-100x40.png\"/>,\n<img alt=\"/static/community_logos/python-powered-h-50x65.png\" src=\"/static/community_logos/python-powered-h-50x65.png\"/>]\n>>> # \uc815\uc218 getitem\n>>> tag_broadcast_list[0]\n<img alt=\"Python Software Foundation\" class=\"psf-logo\" src=\"/static/img/psf-logo.png\"/>\n>>> # \uc2ac\ub77c\uc774\uc2f1\n>>> tag_broadcast_list[3:5]\n[<img alt=\"/static/community_logos/python-powered-w-100x40.png\" src=\"/static/community_logos/python-powered-w-100x40.png\"/>,\n <img alt=\"/static/community_logos/python-powered-h-50x65.png\" src=\"/static/community_logos/python-powered-h-50x65.png\"/>]\n>>> # \ubb38\uc790\uc5f4 getitem (\ube0c\ub85c\ub4dc\uce90\uc2a4\ud305 \uc801\uc6a9\ub428!)\n>>> tag_broadcast_list[\"alt\"]\n['Python Software Foundation',\n 'Combined logo',\n 'Logo device only',\n '/static/community_logos/python-powered-w-100x40.png',\n 
'/static/community_logos/python-powered-h-50x65.png']\n```\n\n### CustomDefaults\n\n`CustomDefaults`\ub97c \ud1b5\ud574 \uc9c1\uc811 \uae30\ubcf8\uac12\uc744 \uc124\uc815\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4. \uc774 \uac12\uc73c\ub85c \uc77c\ubc18 get/options/head/post/put/patch/delete \ubc0f c../a../ac.. \ud568\uc218\uc758 \uae30\ubcf8\uac12\uc744 \ud6a8\uacfc\uc801\uc73c\ub85c \uc124\uc815\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.\n\n```python\n>>> from resoup import CustomDefaults\n>>>\n>>> requests = CustomDefaults(headers={'User-Agent': 'User Agent for Test'})\n>>> requests.get('https://httpbin.org/headers').json()['headers']['User-Agent']\n'User Agent for Test'\n```\n\n## \ub77c\uc774\uc120\uc2a4 \uc815\ubcf4\n\n\uc774 \ud504\ub85c\uadf8\ub7a8\uc740 MIT \ub77c\uc774\uc120\uc2a4\ub85c \uacf5\uc720\ub429\ub2c8\ub2e4.\n\n\uc774 \ud504\ub85c\uadf8\ub7a8\uc758 \uc77c\ubd80\ub294 [requests(Apache License 2.0)](https://github.com/psf/requests) \ub77c\uc774\ube0c\ub7ec\ub9ac\uc5d0 \uc788\ub358 \ucf54\ub4dc\ub97c \ud3ec\ud568\ud569\ub2c8\ub2e4.\nSome part of this program contains code from [requests](https://github.com/psf/requests) library.\n\n\uc774 \ud504\ub85c\uadf8\ub7a8\uc758 \uc77c\ubd80\ub294 [typeshed(Apache License 2.0 or MIT License)](https://github.com/python/typeshed) \ub77c\uc774\ube0c\ub7ec\ub9ac\uc5d0 \uc788\ub358 \ucf54\ub4dc\ub97c \ud3ec\ud568\ud569\ub2c8\ub2e4.\nSome part of this program contains code from [typeshed](https://github.com/python/typeshed) library.\n\n## Relese Note\n\n0.5.2 (2023-12-26): Timeout \uc624\ub958\ub3c4 attempts\uc5d0 \uac78\ub9b4 \uc218 \uc788\ub3c4\ub85d \ubcc0\uacbd, root\uc5d0\uc11c \uc0ac\uc6a9\ud560 \uc218 \uc788\ub294 \ubcc0\uc218 \ucd94\uac00, \ube4c\ub4dc \ucf54\ub4dc \uac1c\uc120, \ucf54\ub4dc \uac1c\uc120\n\n0.5.1 (2023-12-9): \ubc84\uadf8 \uc218\uc815\n\n0.5.0 (2023-12-9): resoup\ub85c \uc774\ub984 \ubcc0\uacbd, \uc0c8 BroadcastList \uae30\ubcf8 \uc801\uc6a9, poetry \uc0ac\uc6a9, \uae30\uc874 souptools \ubaa8\ub4c8 
\uc81c\uac70 \ubc0f souptoolsclass \ubaa8\ub4c8\ub85c \ub300\uccb4, \ud14c\uc2a4\ud2b8 \ucd94\uac00\n\n0.4.1 (2023-11-4): \uae34\uae09 \ubc84\uadf8 \uc218\uc815\n\n0.4.0 (2023-11-4): raise_for_status \uae30\ubcf8\uac12 \ubcc0\uacbd, souptoolsclass \ucd94\uac00, avoid_sslerror \ucd94\uac00\n\n0.3.0 (2023-10-05): BroadcastList \ubcf5\uc6d0, sessions_with_tools \ucd94\uac00\n\n0.2.3 (2023-09-19): header \uae30\ubcf8\uac12 \ubcc0\uacbd, ConnectionError\uc2dc \uc5d0\ub7ec \ud55c \uac1c\ub9cc \ubcf4\uc774\ub294 \uac83\uc73c\ub85c \ubcc0\uacbd, attempts\ub85c \uc7ac\uc2dc\ub3c4\ud560 \ub54c \uc131\uacf5\ud588\uc744 \ub54c \uba54\uc2dc\uc9c0 \ucd94\uac00, retry\uc5d0\uc11c url \uc81c\uac70, setup.py\uc640 \uad00\ub828 \ud30c\uc77c \ubcc0\uacbd\n\n0.2.2 (2023-09-08): attempt parameter\ub97c attempts\ub85c \ubcc0\uacbd, BroadcastList \uc81c\uac70\n\n0.2.1 (2023-08-31): py.typed \ucd94\uac00, freeze_dict_and_list \ucd94\uac00\n\n0.2.0 (2023-08-27): CustomDefaults \ucd94\uac00\n\n0.1.1 (2023-08-27): \uccab \ub9b4\ub9ac\uc988\n",
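The attribute-broadcasting behavior described in the BroadcastList section can be illustrated with a small self-contained sketch. This is not resoup's actual implementation (the real class is `resoup.broadcast_list.TagBroadcastList` wrapping bs4 tags); the `BroadcastList` and `Tag` classes below are hypothetical stand-ins that only demonstrate the idea:

```python
# Minimal sketch of the broadcasting idea behind TagBroadcastList.
# Hypothetical illustration, not resoup's actual code.

class BroadcastList(list):
    def __getattr__(self, name):
        # Attribute access not found on the list itself is forwarded
        # to every element; results are collected into a plain list.
        return [getattr(element, name) for element in self]

    def __getitem__(self, key):
        # Integer and slice getitem behave like an ordinary list;
        # any other key (e.g. a string) is broadcast to the elements.
        if isinstance(key, (int, slice)):
            return super().__getitem__(key)
        return [element[key] for element in self]


class Tag:
    """Stand-in for bs4.element.Tag, just enough for the demo."""

    def __init__(self, text, attrs):
        self.text = text
        self.attrs = attrs

    def __getitem__(self, key):
        return self.attrs[key]


tags = BroadcastList([
    Tag("Notice:", {"class": "strong"}),
    Tag("relaunched community-run job board", {"class": "strong"}),
])
print(tags.text)      # broadcast: ['Notice:', 'relaunched community-run job board']
print(tags[0].text)   # integer getitem works like a normal list: 'Notice:'
print(tags["class"])  # string getitem broadcasts: ['strong', 'strong']
```

The design choice worth noting is that `__getattr__` is only called when normal attribute lookup fails, so list methods such as `.append()` keep working while unknown attributes fall through to the elements.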
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Various convenient features related to requests.",
    "version": "0.5.2",
    "project_urls": {
        "Changelog": "https://github.com/ilotoki0804/resoup#relese-note",
        "Documentation": "https://github.com/ilotoki0804/resoup",
        "Issues": "https://github.com/ilotoki0804/resoup/issues",
        "Repository": "https://github.com/ilotoki0804/resoup"
    },
    "split_keywords": [
        "requests",
        "bs4",
        "beautifulsoup",
        "async",
        "caching",
        "cache"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "381d6184381b138b9d475402008b863caa53eeddd1d0cff5680c28c4d0e70ffb",
                "md5": "d0cb07f3135beff9c2381d9da37fc3ec",
                "sha256": "e9cb4ab4161369be238f37d0f1c4d525b5c0eba47c423a573ec6166644d3dbe4"
            },
            "downloads": -1,
            "filename": "resoup-0.5.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "d0cb07f3135beff9c2381d9da37fc3ec",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10,<4.0",
            "size": 40686,
            "upload_time": "2023-12-26T14:22:28",
            "upload_time_iso_8601": "2023-12-26T14:22:28.145552Z",
            "url": "https://files.pythonhosted.org/packages/38/1d/6184381b138b9d475402008b863caa53eeddd1d0cff5680c28c4d0e70ffb/resoup-0.5.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "852cae9c93c304de9507980f09c477d057ff4c12735a645030ff5472d963789e",
                "md5": "1f803c675a12cbfdda9a82891e765e99",
                "sha256": "1e1c2258fe172501fae77b27557c77c8165475aa2725c94c0f73ff131c67d88a"
            },
            "downloads": -1,
            "filename": "resoup-0.5.2.tar.gz",
            "has_sig": false,
            "md5_digest": "1f803c675a12cbfdda9a82891e765e99",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10,<4.0",
            "size": 45800,
            "upload_time": "2023-12-26T14:23:06",
            "upload_time_iso_8601": "2023-12-26T14:23:06.022068Z",
            "url": "https://files.pythonhosted.org/packages/85/2c/ae9c93c304de9507980f09c477d057ff4c12735a645030ff5472d963789e/resoup-0.5.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-12-26 14:23:06",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "ilotoki0804",
    "github_project": "resoup#relese-note",
    "travis_ci": false,
    "coveralls": true,
    "github_actions": true,
    "lcname": "resoup"
}
        