Python Requests Async Example
As a developer, I have often needed to make many requests to an API or website, and waiting for each request to complete sequentially can be time-consuming. This is where asynchronous programming comes in handy: Python's asyncio module, together with the aiohttp library, can be used for this purpose. In this post, I will show an example of how to make HTTP requests asynchronously in Python.
Installing Required Libraries
Before we dive into the examples, we need to install the required libraries (requests, used in the alternative approach below, ships separately from the standard library). We can do this using pip:
pip install aiohttp requests
Example Code
Let's consider a scenario where we want to fetch data from multiple URLs asynchronously. We can write a function that makes each request using the aiohttp library and the async with statement, and then use asyncio.gather() to run all the requests concurrently.
import asyncio
import aiohttp

async def fetch(session, url):
    # Issue the GET request and return the response body as text.
    async with session.get(url) as response:
        return await response.text()

async def main():
    urls = [
        'https://jsonplaceholder.typicode.com/posts/1',
        'https://jsonplaceholder.typicode.com/comments/1',
        'https://jsonplaceholder.typicode.com/albums/1',
        'https://jsonplaceholder.typicode.com/photos/1',
        'https://jsonplaceholder.typicode.com/todos/1',
        'https://jsonplaceholder.typicode.com/users/1'
    ]
    # One session is shared by all requests so connections are reused.
    async with aiohttp.ClientSession() as session:
        tasks = [asyncio.create_task(fetch(session, url)) for url in urls]
        # Run all tasks concurrently; results come back in request order.
        responses = await asyncio.gather(*tasks)
        for response in responses:
            print(response)

if __name__ == '__main__':
    asyncio.run(main())
In the code above, we define a function fetch() that takes a session and a URL, makes the request inside an async with block, and returns the response text. The main() function holds the list of URLs to fetch. We create a session with aiohttp.ClientSession() and schedule a task for each URL; asyncio.create_task() is the modern replacement for asyncio.ensure_future() here. We then use asyncio.gather() to run all the tasks concurrently and wait for them to complete. Finally, asyncio.run() (the modern replacement for the old get_event_loop()/run_until_complete() pattern, which is deprecated) starts the event loop.
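One detail worth knowing about asyncio.gather(): by default, if any task raises, gather() propagates that exception and the other results are lost. Passing return_exceptions=True returns exceptions as ordinary values in the result list instead. Here is a minimal sketch using plain coroutines in place of HTTP calls, so it runs without a network:

```python
import asyncio

async def ok(value):
    # Stands in for a request that succeeds.
    await asyncio.sleep(0)
    return value

async def fail():
    # Stands in for a request that raises.
    await asyncio.sleep(0)
    raise ValueError("boom")

async def main():
    # With return_exceptions=True, failures come back as exception
    # objects in the result list instead of aborting the whole batch.
    return await asyncio.gather(ok(1), fail(), ok(3),
                                return_exceptions=True)

results = asyncio.run(main())
print(results)  # [1, ValueError('boom'), 3]
```

This way one bad URL does not discard the responses that did succeed; each result can be checked with isinstance(result, Exception).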
Alternative Approach
Another approach is to keep using Python's synchronous requests library together with asyncio. Because requests blocks the calling thread, it must not be called directly inside a coroutine; instead, each call is handed off to a worker thread with asyncio.to_thread() (available since Python 3.9), so the event loop stays free while the requests run. Here is an example:
import asyncio
import requests

def fetch(url):
    # A plain blocking function; it will run in a worker thread.
    with requests.Session() as session:
        response = session.get(url)
        return response.text

async def main():
    urls = [
        'https://jsonplaceholder.typicode.com/posts/1',
        'https://jsonplaceholder.typicode.com/comments/1',
        'https://jsonplaceholder.typicode.com/albums/1',
        'https://jsonplaceholder.typicode.com/photos/1',
        'https://jsonplaceholder.typicode.com/todos/1',
        'https://jsonplaceholder.typicode.com/users/1'
    ]
    # Each blocking call runs in its own thread, so they overlap in time.
    tasks = [asyncio.to_thread(fetch, url) for url in urls]
    responses = await asyncio.gather(*tasks)
    for response in responses:
        print(response)

if __name__ == '__main__':
    asyncio.run(main())
In the code above, each blocking requests.Session() call is moved onto a worker thread with asyncio.to_thread(), and asyncio.gather() waits for all of them to finish. This approach can be useful if we are already familiar with the requests library or depend on it elsewhere in the codebase. Note, however, that calling requests directly inside a coroutine (without the thread hand-off) would block the event loop and serialize the requests, defeating the purpose of asyncio.
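Whichever variant you use, firing off every request at once can overwhelm the remote server or exhaust local resources. An asyncio.Semaphore caps how many fetches are in flight at a time. Below is a minimal sketch with a simulated fetch; the 0.01-second delay, the limit of 2, and the example.com URLs are arbitrary illustration values:

```python
import asyncio

LIMIT = 2  # arbitrary concurrency cap for illustration

async def fetch(semaphore, url, counters):
    # Only LIMIT coroutines may be inside this block at once.
    async with semaphore:
        counters['in_flight'] += 1
        counters['peak'] = max(counters['peak'], counters['in_flight'])
        await asyncio.sleep(0.01)  # stand-in for a real request
        counters['in_flight'] -= 1
        return url

async def main():
    urls = [f'https://example.com/{i}' for i in range(6)]  # placeholders
    semaphore = asyncio.Semaphore(LIMIT)
    counters = {'in_flight': 0, 'peak': 0}
    results = await asyncio.gather(
        *(fetch(semaphore, url, counters) for url in urls))
    return results, counters['peak']

results, peak = asyncio.run(main())
print(len(results), peak)  # 6 results; peak never exceeds LIMIT
```

In a real program you would replace the sleep with the aiohttp or to_thread() call from the examples above.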
Conclusion
Asynchronous programming is a powerful tool for making multiple requests concurrently and improving the performance of our applications. Python's asyncio module, combined with the aiohttp library, provides a natural way to implement asynchronous requests; alternatively, the blocking requests library can be dispatched to worker threads with asyncio.to_thread() to achieve a similar effect.
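The performance claim is easy to verify without any network at all, using asyncio.sleep() as a stand-in for request latency: five 0.1-second operations run concurrently finish in roughly 0.1 seconds rather than 0.5.

```python
import asyncio
import time

async def slow_op(delay):
    # Stand-in for a network request that takes `delay` seconds.
    await asyncio.sleep(delay)

async def main():
    start = time.perf_counter()
    # All five sleeps overlap instead of running back to back.
    await asyncio.gather(*(slow_op(0.1) for _ in range(5)))
    return time.perf_counter() - start

elapsed = asyncio.run(main())
print(f"{elapsed:.2f}s elapsed")  # close to 0.1s, not 0.5s
```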