Async Python: How asyncio Can Make Your Code Faster


What problem does asyncio solve?

When your program spends time waiting (for a network call, file, timer, etc.), the CPU sits idle. asyncio lets one task yield while it waits so another task can run, giving you concurrency in a single thread.

  • Great for: many small, waiting-heavy operations (API calls, web scraping, bots, chat servers).
  • Not for: heavy CPU math (use processes or native extensions for that).

1) Coroutines and the Event Loop

The tiniest coroutine

import asyncio

async def hello():
    await asyncio.sleep(1)      # pretend to wait on I/O
    return "Hello, asyncio!"

result = asyncio.run(hello())   # creates/runs the event loop for you
print(result)

What it’s doing:

  • async def defines a coroutine (a function that can pause/resume).
  • await yields control back to the event loop while waiting (here: 1 second).
  • asyncio.run() starts the loop, runs the coroutine, and closes the loop.

2) Why async is faster for many waits

Sync vs Async timing demo

import time, asyncio

# ----- synchronous -----
def sync_job(i):
    time.sleep(1)      # blocks entire thread
    return f"sync {i}"

def sync_main():
    start = time.time()
    results = [sync_job(i) for i in range(5)]
    print(results, f"{time.time()-start:.2f}s")

# ----- asynchronous -----
async def async_job(i):
    await asyncio.sleep(1)  # yields to the loop
    return f"async {i}"

async def async_main():
    start = time.time()
    results = await asyncio.gather(*(async_job(i) for i in range(5)))
    print(results, f"{time.time()-start:.2f}s")

if __name__ == "__main__":
    sync_main()             # ≈ 5 seconds
    asyncio.run(async_main())  # ≈ 1 second

What’s happening:

  • Sync: 5 sleeps × 1s = ~5s (they happen one after another).
  • Async: 5 sleeps overlap = ~1s (the loop interleaves them).

3) asyncio.gather and limiting concurrency

Run many tasks, but cap how many run at once

import asyncio
import random

async def fetch(i):
    # Simulate a variable I/O wait (e.g., HTTP call)
    await asyncio.sleep(random.uniform(0.2, 1.0))
    return f"item-{i}"

async def worker(i, sem):
    async with sem:          # limit concurrent operations
        result = await fetch(i)
        print(f"done: {result}")
        return result

async def main():
    sem = asyncio.Semaphore(5)     # at most 5 concurrent fetches
    tasks = [worker(i, sem) for i in range(30)]
    results = await asyncio.gather(*tasks)
    print("Total:", len(results))

asyncio.run(main())

Why this matters: Real APIs will throttle/rate-limit you. A Semaphore keeps you respectful and stable.
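
If you're on Python 3.11 or newer, the same cap combines nicely with asyncio.TaskGroup, which also cancels the remaining tasks if one of them fails. A minimal sketch, reusing the same simulated fetch as above:

import asyncio
import random

async def fetch(i):
    await asyncio.sleep(random.uniform(0.2, 1.0))   # simulate a variable I/O wait
    return f"item-{i}"

async def bounded_fetch(i, sem):
    async with sem:                                  # same cap: at most 5 at a time
        return await fetch(i)

async def main():
    sem = asyncio.Semaphore(5)
    async with asyncio.TaskGroup() as tg:            # Python 3.11+
        tasks = [tg.create_task(bounded_fetch(i, sem)) for i in range(30)]
    # leaving the block means every task finished (or the group raised)
    print("Total:", len([t.result() for t in tasks]))

asyncio.run(main())

gather collects results in submission order; a TaskGroup gives you structured cancellation on failure. Pick whichever error behaviour you want.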


4) Producer–Consumer with asyncio.Queue

Pattern for pipelines (downloader → parser → saver)

import asyncio

async def producer(q):
    for i in range(10):
        await q.put(f"task-{i}")   # enqueue work
    for _ in range(3):             # tell consumers to stop
        await q.put(None)

async def consumer(name, q):
    while True:
        item = await q.get()
        if item is None:
            q.task_done()          # mark the sentinel done too, or q.join() never returns
            break                  # shutdown signal
        await asyncio.sleep(0.3)   # simulate work
        print(f"{name} processed {item}")
        q.task_done()

async def main():
    q = asyncio.Queue()
    prod = asyncio.create_task(producer(q))
    consumers = [asyncio.create_task(consumer(f"C{i}", q)) for i in range(3)]
    await prod
    await q.join()                 # wait until every enqueued item is marked done
    for c in consumers:
        await c                    # finish (after receiving None)

asyncio.run(main())

What it’s doing:

  • Producer enqueues tasks.
  • Consumers pull tasks concurrently.
  • This scales beautifully and keeps code tidy; an alternative shutdown pattern is sketched below.
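
If you'd rather not juggle sentinel values, a common alternative is to let q.join() decide when the work is done and then cancel the idle consumers. A sketch of that variant (same simulated 0.3 s of work per item):

import asyncio

async def consumer(name, q):
    while True:
        item = await q.get()
        try:
            await asyncio.sleep(0.3)           # simulate work
            print(f"{name} processed {item}")
        finally:
            q.task_done()                      # always mark the item done, even on errors

async def main():
    q = asyncio.Queue()
    consumers = [asyncio.create_task(consumer(f"C{i}", q)) for i in range(3)]
    for i in range(10):
        await q.put(f"task-{i}")
    await q.join()                             # every item has been processed
    for c in consumers:                        # no sentinels: cancel the now-idle consumers
        c.cancel()
    await asyncio.gather(*consumers, return_exceptions=True)

asyncio.run(main())

The try/finally guarantees task_done() runs even if processing raises, so q.join() can't hang.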

5) Timeouts, cancellation, and cleanup

Don’t hang forever: use wait_for

import asyncio

async def slow():
    await asyncio.sleep(5)

async def main():
    try:
        await asyncio.wait_for(slow(), timeout=1.0)
    except asyncio.TimeoutError:
        print("Timed out!")

asyncio.run(main())
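
On Python 3.11+ there is also the asyncio.timeout() context manager, which does the same job and reads well when one deadline should cover several awaits. A quick sketch:

import asyncio

async def slow():
    await asyncio.sleep(5)

async def main():
    try:
        async with asyncio.timeout(1.0):   # Python 3.11+
            await slow()
    except TimeoutError:                   # asyncio.TimeoutError is an alias since 3.11
        print("Timed out!")

asyncio.run(main())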

Cancelling a task safely

import asyncio

async def long_running():
    try:
        while True:
            await asyncio.sleep(0.5)
            print("working...")
    except asyncio.CancelledError:
        print("cancelling: cleaning up...")
        # close files/sockets here
        raise

async def main():
    t = asyncio.create_task(long_running())
    await asyncio.sleep(2)
    t.cancel()
    try:
        await t
    except asyncio.CancelledError:
        print("task cancelled")

asyncio.run(main())

Why it matters: Real systems must handle timeouts and shutdowns gracefully.


6) Using async HTTP (with aiohttp) — real world I/O

If you can, install aiohttp (pip install aiohttp). This shows true I/O concurrency.

import asyncio, aiohttp

URLS = [
    "https://httpbin.org/delay/1",
    "https://httpbin.org/delay/2",
    "https://httpbin.org/delay/3",
]

async def fetch(session, url):
    async with session.get(url) as resp:
        return url, resp.status

async def main():
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*(fetch(session, u) for u in URLS))
        for url, status in results:
            print(status, url)

asyncio.run(main())

What it’s doing:

  • Reuses a single ClientSession (connection pooling).
  • Starts all HTTP requests together; they complete as soon as each is ready.
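
In practice you'll usually combine this with the earlier ideas: a ClientTimeout so no request hangs forever, and a Semaphore so you don't hammer the server. A sketch, again using httpbin as a stand-in endpoint:

import asyncio, aiohttp

URLS = ["https://httpbin.org/delay/1"] * 20        # stand-in workload

async def fetch(session, url, sem):
    async with sem:                                 # cap concurrent requests
        async with session.get(url) as resp:
            return url, resp.status

async def main():
    sem = asyncio.Semaphore(5)
    timeout = aiohttp.ClientTimeout(total=10)       # give up after 10 s per request
    async with aiohttp.ClientSession(timeout=timeout) as session:
        results = await asyncio.gather(*(fetch(session, u, sem) for u in URLS))
        print(len(results), "responses")

asyncio.run(main())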

7) Mixing in blocking code (the right way)

If you must call a blocking function (e.g., CPU-bound or a non-async library), offload it:

import asyncio, time

def blocking_io(n):
    time.sleep(1)         # blocks the thread
    return n * n

async def main():
    loop = asyncio.get_running_loop()
    # Offload to a default ThreadPoolExecutor
    results = await asyncio.gather(
        loop.run_in_executor(None, blocking_io, 3),
        loop.run_in_executor(None, blocking_io, 4),
    )
    print(results)

asyncio.run(main())

Rule of thumb:

  • I/O-bound blocking → threadpool via run_in_executor.
  • CPU-bound heavy work → multiprocessing (separate processes). Both options are sketched below.
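
A sketch of both rules side by side: asyncio.to_thread (Python 3.9+) is shorthand for the threadpool version above, and a ProcessPoolExecutor moves CPU work into separate processes. The cpu_heavy function here is just a stand-in for real number crunching:

import asyncio
import time
from concurrent.futures import ProcessPoolExecutor

def blocking_io(n):
    time.sleep(1)                          # e.g. a non-async library call
    return n * n

def cpu_heavy(n):
    return sum(i * i for i in range(n))    # burns CPU; would starve the loop if run inline

async def main():
    # I/O-bound blocking: run it in a thread
    io_result = await asyncio.to_thread(blocking_io, 3)

    # CPU-bound: hand it to a separate process so the GIL isn't a bottleneck
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        cpu_result = await loop.run_in_executor(pool, cpu_heavy, 1_000_000)

    print(io_result, cpu_result)

if __name__ == "__main__":                 # guard required for process pools on spawn platforms
    asyncio.run(main())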

8) Common pitfalls (and quick fixes)

  • ✗ Using time.sleep() in async code → blocks the loop.
    ✓ Use await asyncio.sleep().
  • ✗ Forgetting await on coroutines → you get a coroutine object, not a result.
    ✓ Always await or schedule with create_task (see the sketch after this list).
  • ✗ Starting multiple event loops manually.
    ✓ Use a single asyncio.run(main()) entry point.
  • ✗ Firing unlimited concurrent requests at an API.
    ✓ Use a Semaphore or a bounded queue.
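
Here's the "forgotten await" pitfall made concrete. get_data is a throwaway coroutine for the demo:

import asyncio

async def get_data():
    await asyncio.sleep(0.1)
    return 42

async def main():
    wrong = get_data()            # coroutine object; nothing has run yet
    print(type(wrong))            # <class 'coroutine'>

    right = await get_data()      # actually runs the coroutine
    print(right)                  # 42

    task = asyncio.create_task(get_data())   # or schedule it in the background
    print(await task)             # 42

    wrong.close()                 # tidy up the never-awaited coroutine

asyncio.run(main())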

Quick reference: structure your program

import asyncio

async def main():
    # 1) create tasks
    t1 = asyncio.create_task(coro1())
    t2 = asyncio.create_task(coro2())

    # 2) run other awaits while tasks progress
    result = await some_other_coro()

    # 3) gather results
    await asyncio.gather(t1, t2)

if __name__ == "__main__":
    asyncio.run(main())

Wrap-up

  • Coroutines (async def) + await let your code pause while it waits.
  • The event loop interleaves many waiting tasks, making programs much faster when I/O-bound.
  • Use gather, Semaphore, Queue, timeouts, and cancellation to make robust async apps.
