What is Python Requests?
If you work with APIs or scrape web pages, Python Requests is a must-have library. It is a simple, easy-to-use library that lets you send HTTP/1.1 requests from Python. It is built on top of urllib3 and designed to be user-friendly.
Installation
To install Python Requests, you can use pip:
pip install requests
Usage
Using Python Requests, you can send GET, POST, PUT, DELETE, and other types of HTTP requests. Here's an example of sending a GET request to a website:
import requests
response = requests.get('https://www.example.com')
print(response.content)
In the above example, we are sending a GET request to 'https://www.example.com' and printing the raw body of the response. response.content returns the body as bytes; use response.text if you want it decoded to a string instead.
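The other HTTP methods follow the same pattern. As a rough sketch, here is a POST request sent to httpbin.org (a public echo service, used here purely as an example endpoint):
import requests
# Send form-encoded data in the body of a POST request
response = requests.post('https://httpbin.org/post', data={'key1': 'value1'})
# Inspect the status code and the JSON body of the response
print(response.status_code)
print(response.json())
requests.put(), requests.delete(), and requests.head() work the same way; each returns a Response object with the same attributes.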
Parameters
You can also pass query string parameters to a request using the 'params' parameter:
import requests
payload = {'key1': 'value1', 'key2': 'value2'}
response = requests.get('https://www.example.com', params=payload)
print(response.url)
In the above example, we are sending a GET request to 'https://www.example.com' with two parameters: key1 and key2. The URL of the request will be 'https://www.example.com/?key1=value1&key2=value2'.
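A useful detail, sketched below: Requests URL-encodes parameter values for you, so values containing spaces or special characters are safe to pass directly.
import requests
# Values with spaces or special characters are URL-encoded automatically
payload = {'q': 'python requests', 'path': 'a/b'}
response = requests.get('https://www.example.com', params=payload)
# The encoded query string is visible in the final URL,
# e.g. https://www.example.com/?q=python+requests&path=a%2Fb
print(response.url)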
Headers
You can also send headers with your request using the 'headers' parameter:
import requests
headers = {'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36'}
response = requests.get('https://www.example.com', headers=headers)
In the above example, we are sending a GET request to 'https://www.example.com' with a custom User-Agent header, so the request identifies itself as a regular browser rather than as the default Requests client.
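Headers flow both ways: the Response object exposes the server's headers as a case-insensitive dictionary. A minimal sketch:
import requests
response = requests.get('https://www.example.com')
# response.headers behaves like a case-insensitive dict
print(response.headers.get('Content-Type'))
print(response.headers.get('content-type'))  # same value, lookup ignores case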
Authentication
You can also authenticate your request using the 'auth' parameter:
import requests
response = requests.get('https://api.github.com/user', auth=('username', 'password'))
print(response.json())
In the above example, we are sending a GET request to 'https://api.github.com/user' with HTTP Basic authentication, passing the username and password as a tuple.
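The tuple is shorthand for HTTP Basic authentication; the same request can be written with the explicit HTTPBasicAuth helper from requests.auth. (The credentials below are placeholders; the GitHub API in particular now expects a personal access token rather than an account password.)
import requests
from requests.auth import HTTPBasicAuth
# Explicit form of auth=('username', 'password'); credentials are placeholders
response = requests.get('https://api.github.com/user',
                        auth=HTTPBasicAuth('username', 'password'))
print(response.status_code)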
Conclusion
Python Requests is a powerful library that makes working with APIs and scraping web pages much easier. With its clean syntax and wide range of features, it is well worth learning.