Read More on Telerik Blogs
August 07, 2025 Web

Async programming in Python allows you to write code that can work on other tasks while waiting for slow operations to finish. It handles many tasks concurrently without requiring the creation of multiple threads.

Python async programming is not the same as multi-threading. Both achieve concurrency, but multi-threaded tasks run on multiple threads, whereas async functions take turns on a single thread, coordinated by the event loop.

Async programming is most useful when your program spends a lot of time waiting for input or output (I/O), like:

  • Web scraping many pages at the same time
  • Making multiple API calls at once
  • Reading or writing large files
  • Running database queries over a network

In Python, asynchronous programming is implemented using the asyncio library. A basic implementation is given below.

import asyncio

async def greet():
    print("hello")
    await asyncio.sleep(1)
    print("world")

async def main():
    await greet()

asyncio.run(main())

The greet function is asynchronous; it prints “hello” and then pauses for one second, before finally printing “world.”
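To see why this matters, here is a small sketch (the names greet_slowly, alice and bob are illustrative) in which two one-second waits overlap on the event loop, so the whole run takes about one second instead of two:

```python
import asyncio
import time

async def greet_slowly(name):
    print(f"hello, {name}")
    await asyncio.sleep(1)
    print(f"goodbye, {name}")

async def run_both():
    start = time.perf_counter()
    # Both coroutines wait concurrently on a single thread.
    await asyncio.gather(greet_slowly("alice"), greet_slowly("bob"))
    return time.perf_counter() - start

elapsed = asyncio.run(run_both())
print(f"elapsed: {elapsed:.2f}s")
```

While "alice" is paused inside asyncio.sleep(), the event loop runs "bob", which is exactly the behavior a sequential, synchronous version could not achieve.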

The asyncio library encompasses several key concepts:

  • async – to define an async function
  • await – used inside an async function to pause execution until the awaited task is complete
  • run – runs the main coroutine and manages the event loop
  • coroutine – a special function defined with async that can be paused and resumed
  • Event loop – schedules and runs asynchronous tasks and callbacks

Coroutines

A coroutine is a function that can pause and resume its execution, enabling asynchronous programming. You define a coroutine with async def and use the await keyword inside it to pause until another asynchronous operation completes.

async def coroutine_function():
    print("Coroutine function started.")
    await asyncio.sleep(1)
    print("Coroutine function finished.")
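One detail that trips up newcomers: calling a coroutine function does not run its body. It returns a coroutine object, which only executes once it is awaited or handed to asyncio.run(). A quick sketch (the return value "done" is illustrative):

```python
import asyncio

async def coroutine_function():
    print("Coroutine function started.")
    await asyncio.sleep(1)
    print("Coroutine function finished.")
    return "done"

# Calling the function only creates a coroutine object; nothing runs yet.
coro = coroutine_function()
print(type(coro).__name__)  # coroutine

# The body executes only when the event loop drives the coroutine.
result = asyncio.run(coro)
print(result)  # done
```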

One complete example of a coroutine with asynchronous programming is as follows.

import asyncio
import aiohttp

async def fetch_data(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            data = await response.text()
            print(f"Fetched data from {url}: {data[:100]}...")
            return data

async def main():
    urls = [
        "http://jsonplaceholder.typicode.com/posts/1",
        "http://jsonplaceholder.typicode.com/posts/2",
        "http://jsonplaceholder.typicode.com/posts/3"
    ]
    tasks = [fetch_data(url) for url in urls]
    results = await asyncio.gather(*tasks)
    print("Fetched all data.")

asyncio.run(main())

The code uses aiohttp and asyncio for asynchronous HTTP requests. The async fetch_data() function:

  • Takes a URL
  • Opens an async HTTP session
  • Sends a GET request
  • Awaits the response and reads it as text
  • Prints the first 100 characters of the result
  • Returns the data

The asyncio.gather() Function

In the above example, we use the asyncio.gather() function, which runs multiple async tasks at the same time. It takes several awaitable objects and schedules them to run concurrently.

Awaiting asyncio.gather() waits for all the given tasks to complete and then returns their results as a list, preserving the same order as the input tasks.

It starts many async tasks at the same time instead of waiting for one to finish before starting the next. If any task raises an exception, gather() propagates the first error to the caller as soon as it occurs, while the other tasks keep running; pass return_exceptions=True to collect errors in the results list instead.
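A small sketch of both behaviors (delayed and failing are made-up helpers): results preserve input order regardless of which task finishes first, and return_exceptions=True turns errors into list entries instead of exceptions.

```python
import asyncio

async def delayed(value, delay):
    await asyncio.sleep(delay)
    return value

async def failing():
    raise ValueError("boom")

async def main():
    # "first" sleeps longest but still comes back first in the list,
    # because gather() preserves the order of its arguments.
    results = await asyncio.gather(delayed("first", 0.2), delayed("second", 0.1))
    print(results)  # ['first', 'second']

    # With return_exceptions=True, the exception is returned as a
    # result instead of being raised to the caller.
    mixed = await asyncio.gather(delayed("ok", 0.05), failing(),
                                 return_exceptions=True)
    print(mixed)
    return results, mixed

results, mixed = asyncio.run(main())
```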

The asyncio.Queue Class

The other common scenario involves a producer and a consumer: the producer produces data, and the consumer consumes it. For that, asyncio provides the asyncio.Queue class, a first-in-first-out (FIFO) queue. It is not thread-safe, but it is safe to share between coroutines running on the same event loop, which makes it useful for fetching data in one task and processing it in another.

import asyncio
import aiohttp

async def fetch_data(url, queue):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            data = await response.text()
            print(f"Fetched data from {url}: {data[:100]}...")
            await queue.put(data)

async def process_data(queue):
    while True:
        data = await queue.get()
        if data is None:  # sentinel value signaling there is no more data
            break
        print(f"Processing data of length {len(data)}")
        queue.task_done()

async def main():
    urls = [
        "http://jsonplaceholder.typicode.com/posts/1",
        "http://jsonplaceholder.typicode.com/posts/2",
        "http://jsonplaceholder.typicode.com/posts/3",
    ]
    queue = asyncio.Queue()

    consumer = asyncio.create_task(process_data(queue))
    producers = [fetch_data(url, queue) for url in urls]
    await asyncio.gather(*producers)
    await queue.put(None)
    await consumer

asyncio.run(main())

The asyncio.Queue class works in the following ways:

  • You create a queue by calling asyncio.Queue()
  • Producers use the put() method to add items
  • Consumers use the get() method to read items

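Besides put() and get(), the queue also offers task_done() and join(), which provide an alternative to the None sentinel used above: join() blocks until every item that was put has been marked as processed. A minimal sketch (producer, consumer and the times-ten processing step are all illustrative):

```python
import asyncio

async def producer(queue):
    for i in range(3):
        await queue.put(i)
        print(f"produced {i}")

async def consumer(queue, results):
    while True:
        item = await queue.get()
        results.append(item * 10)
        print(f"consumed {item}")
        queue.task_done()  # tell join() this item is fully processed

async def main():
    queue = asyncio.Queue()
    results = []
    worker = asyncio.create_task(consumer(queue, results))
    await producer(queue)
    await queue.join()   # wait until every item has been processed
    worker.cancel()      # then stop the consumer's infinite loop
    return results

results = asyncio.run(main())
print(results)  # [0, 10, 20]
```

The trade-off between the two shutdown styles: a sentinel lets the consumer exit on its own, while join() plus cancel() keeps the consumer generic but requires the caller to stop it.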
The difference between asyncio.gather and asyncio.Queue can be summed up like this:

asyncio.gather

  • It is used when you want to run multiple coroutines concurrently.
  • You list the ones you want and it runs them all together, waiting for all of them to finish.
  • It gives you the results in a list.
  • It’s useful when you have a fixed set of tasks that you want to run at the same time.

asyncio.Queue

  • It is used when you want different coroutines to talk to each other and stay in sync.
  • You use this to implement the producer-consumer pattern.
  • It produces and consumes data when it is available.
  • It’s useful when the number of tasks and their timing are not fixed.

Summary

Async is used for operations that involve waiting, like I/O tasks, API calls or accessing external resources. It differs from multithreading because the asyncio library executes multiple coroutines on a single thread using an event loop. If you’re starting with generative AI, many SDKs follow the async pattern, so having a solid understanding of asyncio is helpful.

I hope you find this post helpful. Thanks for reading.


About the Author

Dhananjay Kumar

Dhananjay Kumar is a well-known trainer and developer evangelist. He is the founder of NomadCoder, a company that focuses on creating job-ready developers through training in technologies such as Angular, Node.js, Python, .NET, Azure, GenAI and more. He is also the founder of ng-India, one of the world’s largest Angular communities. He lives in Gurgaon, India, and is currently writing his second book on Angular. You can reach out to him for training, evangelism and consulting opportunities.
