321: async fn and .await Fundamentals
Difficulty: 3 · Level: Advanced

Async functions return a Future: a description of work, not the work itself. Nothing runs until you drive it.

The Problem This Solves
You're building a web service that needs to fetch a user record and their posts from a database. Done sequentially, each operation blocks until the previous one finishes: if each takes 50ms, you wait 100ms total. With dozens of concurrent requests, you're burning threads that mostly just sleep.

The real problem is that threads are expensive. Each OS thread uses roughly 1–8 MB of stack memory and has significant scheduling overhead. When your work is I/O-bound (waiting on network, disk, or a database), most of that thread's time is spent idle. You need a way to say "start this work, come back when it's done" without dedicating an entire thread to waiting.

Without async, you either block (wasting threads) or use callbacks (callback hell, inverted control flow). Async gives you linear-looking code that doesn't block.

The Intuition
Think of `async fn` like a JavaScript `async function` or Python `async def`: it looks like normal code but runs cooperatively. The key insight: an async function doesn't run when called. It returns a Future, which is just a description of what to do. You need to `.await` it (or give it to a runtime) for anything to happen.

Normal fn: call → runs immediately → returns value
Async fn: call → returns Future (nothing runs yet) → .await → runs → returns value
In JavaScript: `fetch(url)` returns a Promise. Until you `await` it, nothing happens. Same idea in Rust, but stricter: the compiler enforces it.
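You can see this laziness with only the standard library. The sketch below hand-rolls a minimal `block_on` with a no-op waker that just busy-polls; this is an illustrative stand-in for a real runtime (which parks the thread and wakes it via the `Waker`), not how tokio works internally:

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// A waker that does nothing: sufficient when we just poll in a loop.
// A real runtime would use the waker to reschedule the future.
fn noop_waker() -> Waker {
    fn clone(_: *const ()) -> RawWaker {
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    fn noop(_: *const ()) {}
    static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
    unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) }
}

// Drive a future to completion by polling it repeatedly.
fn block_on<F: Future>(mut fut: F) -> F::Output {
    // SAFETY: `fut` is a local that is never moved after being pinned here.
    let mut fut = unsafe { Pin::new_unchecked(&mut fut) };
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);
    loop {
        if let Poll::Ready(value) = fut.as_mut().poll(&mut cx) {
            return value;
        }
    }
}

async fn double(x: u32) -> u32 {
    println!("double is running"); // only printed once the future is polled
    x * 2
}

fn main() {
    let fut = double(21); // returns a Future; `double`'s body has NOT run yet
    println!("future created, `double` has printed nothing so far");
    let result = block_on(fut); // driving the future finally runs it
    println!("result = {result}");
}
```

The first `println!` in `main` fires before anything inside `double`, which is exactly the "description of work, not work itself" property.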
This example uses `std::thread` as a synchronous analogy (no tokio needed), showing the same sequential vs. concurrent patterns you'd use with `.await` and `join!`.
How It Works in Rust
```rust
use std::thread;

// Sequential, like: let user = fetch_user(id).await; let posts = fetch_posts(id).await;
fn sequential_fetch(id: u32) -> (String, Vec<String>) {
    (fetch_user(id), fetch_posts(id)) // each call blocks until done
}

// Concurrent, like: join!(fetch_user(id), fetch_posts(id))
fn concurrent_fetch(id: u32) -> (String, Vec<String>) {
    let h1 = thread::spawn(move || fetch_user(id)); // start both
    let h2 = thread::spawn(move || fetch_posts(id)); // before waiting for either
    (h1.join().unwrap(), h2.join().unwrap()) // now wait for both
}
```
The `move` keyword transfers ownership of `id` into the closure, which is necessary because the thread might outlive the current scope. In real async code, `async move { ... }` does the same thing.
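A minimal sketch of why `move` is required (the `String` and the "db-primary" name are just illustrative values): a plain borrow can't cross into a spawned thread, because the thread may run after the borrowed variable is gone, so ownership must move in:

```rust
use std::thread;

fn main() {
    let name = String::from("db-primary");
    // Without `move`, the closure would only borrow `name`, but the spawned
    // thread may outlive this stack frame, so the compiler rejects the borrow.
    // `move` transfers ownership of `name` into the closure instead.
    let handle = thread::spawn(move || format!("connected to {name}"));
    println!("{}", handle.join().unwrap());
    // `name` is no longer usable here: it was moved into the thread.
}
```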
What This Unlocks
- I/O-bound services: Serve thousands of concurrent requests with far fewer threads than traditional blocking I/O.
- Parallel data fetching: Fetch from multiple sources simultaneously instead of sequentially.
- Responsive UIs: Keep interfaces interactive while background work runs without blocking the main thread.
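The latency win from overlapping waits can be measured directly. A sketch using the same thread-based analogy, where `slow_task` is a hypothetical stand-in for a 50ms I/O operation:

```rust
use std::thread;
use std::time::{Duration, Instant};

// Hypothetical stand-in for an I/O operation that takes `ms` milliseconds.
fn slow_task(ms: u64) -> u64 {
    thread::sleep(Duration::from_millis(ms));
    ms
}

fn main() {
    // Sequential: the waits add up, ~50 + 50 = ~100ms.
    let t = Instant::now();
    let _ = (slow_task(50), slow_task(50));
    let sequential = t.elapsed();

    // Concurrent: both waits overlap, ~50ms total.
    let t = Instant::now();
    let h1 = thread::spawn(|| slow_task(50));
    let h2 = thread::spawn(|| slow_task(50));
    let _ = (h1.join().unwrap(), h2.join().unwrap());
    let concurrent = t.elapsed();

    println!("sequential: {sequential:?}, concurrent: {concurrent:?}");
    assert!(concurrent < sequential); // overlapping the waits roughly halves the latency
}
```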
Key Differences
| Concept | OCaml | Rust |
|---|---|---|
| Async function | no native async (Lwt library) | `async fn foo() -> T` |
| Await result | `Lwt.bind p (fun x -> ...)` / `>>=` | `.await` |
| Run both concurrently | `Lwt.join [p1; p2]` | `join!(f1, f2)` (needs runtime) |
| Lazy evaluation | explicit thunks `fun () -> ...` | implicit: Future not polled until driven |
```rust
use std::thread;
use std::time::Duration;

fn fetch_user(id: u32) -> String {
    thread::sleep(Duration::from_millis(10));
    format!("User({id})")
}

fn fetch_posts(user_id: u32) -> Vec<String> {
    thread::sleep(Duration::from_millis(8));
    vec![format!("Post1 by {user_id}"), format!("Post2 by {user_id}")]
}

// Sequential (like: user = fetch_user(id).await; posts = fetch_posts(id).await)
fn sequential_fetch(id: u32) -> (String, Vec<String>) {
    (fetch_user(id), fetch_posts(id))
}

// Concurrent (like: join!(fetch_user(id), fetch_posts(id)))
fn concurrent_fetch(id: u32) -> (String, Vec<String>) {
    let h1 = thread::spawn(move || fetch_user(id));
    let h2 = thread::spawn(move || fetch_posts(id));
    (h1.join().unwrap(), h2.join().unwrap())
}

fn main() {
    let (user, posts) = sequential_fetch(42);
    println!("User: {user}");
    for p in &posts {
        println!("  {p}");
    }
    let (u2, p2) = concurrent_fetch(99);
    println!("Concurrent: {u2} with {} posts", p2.len());
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn sequential_correct() {
        let (u, _) = sequential_fetch(1);
        assert_eq!(u, "User(1)");
    }

    #[test]
    fn concurrent_same() {
        let (u1, p1) = sequential_fetch(7);
        let (u2, p2) = concurrent_fetch(7);
        assert_eq!(u1, u2);
        assert_eq!(p1, p2);
    }
}
```
```ocaml
(* OCaml: the same fetch, written sequentially with Thread.delay
   simulating I/O; concurrency would need Lwt/Async promises *)
let fetch_user id =
  Thread.delay 0.05;
  Printf.sprintf "User(%d)" id

let fetch_posts user_id =
  Thread.delay 0.03;
  [Printf.sprintf "Post1 by %d" user_id; Printf.sprintf "Post2 by %d" user_id]

let () =
  let user = fetch_user 42 in
  let posts = fetch_posts 42 in
  Printf.printf "User: %s\n" user;
  List.iter (Printf.printf "Post: %s\n") posts
```