🦀 Functional Rust

726: Memory Pool / Bump Allocator Pattern

Difficulty: 4 · Level: Expert

Allocate and deallocate objects in O(1) by managing your own memory: no `malloc`, no fragmentation.

The Problem This Solves

The general-purpose allocator (`malloc`/`free`) is powerful but has overhead: it must handle arbitrary allocation sizes, maintain free-lists for different size classes, prevent fragmentation, and be thread-safe. For workloads that create and destroy many objects of the same size (game entities, parser AST nodes, database row handles, network connection objects), this overhead adds up.

A memory pool solves this by pre-allocating a fixed block of same-sized slots and managing them with a simple free-list. Allocation pops a slot index from the free-list: O(1). Deallocation pushes the index back: O(1). There is no fragmentation, because all objects are the same size, and no thread-safety overhead for single-threaded use. The allocator never has to search for a fit; it always has one ready.

A bump allocator (arena) goes further: allocation is a single pointer increment, taking well under a nanosecond, and deallocation drops the entire arena at once. This suits workloads where objects are created together and destroyed together (a parser's working set, a per-request scratch space). Both patterns need `unsafe` internally; treat it as a tool, not a crutch, and reach for it only when safe Rust genuinely can't express the pattern.

The Intuition

Fixed-size pool: picture a hotel with 64 rooms. Check-in hands you a room number (`alloc` returns an index). Check-out puts the room back on the front desk's list (`dealloc` pushes the index back). Every room is the same size: no searching, no splitting, no coalescing.

Bump arena: picture a notepad. Writing something tears off a page (bump the pointer, hand out a slice). When the meeting is over, throw away the entire notepad (drop the arena). Individual "pages" can't be reclaimed; the whole pad goes at once.

How It Works in Rust

use std::mem::MaybeUninit;

// ── Fixed-size typed pool ───────────────────────────────────────────────────
pub struct Pool<T, const CAP: usize> {
 slots:     Box<[MaybeUninit<T>; CAP]>,
 free_head: Option<usize>,
 next_free: [usize; CAP],
 live:      usize,
}

impl<T, const CAP: usize> Pool<T, CAP> {
 pub fn alloc(&mut self, val: T) -> Option<usize> {
     let idx = self.free_head?;              // O(1): pop free-list head
     let next = self.next_free[idx];
     self.free_head = if next < CAP { Some(next) } else { None };
     self.slots[idx].write(val);             // initialise the slot
     self.live += 1;
     Some(idx)
 }

 /// # Safety
 /// `idx` must have been returned by `alloc` and not yet freed.
 pub unsafe fn dealloc(&mut self, idx: usize) {
     // SAFETY: Caller guarantees slot is live and won't be accessed again.
     unsafe { self.slots[idx].assume_init_drop(); }
     self.next_free[idx] = self.free_head.unwrap_or(CAP);
     self.free_head = Some(idx);             // O(1): push back to free-list
     self.live -= 1;
 }
}

// ── Bump allocator (arena) ──────────────────────────────────────────────────
pub struct BumpArena {
 memory: Vec<u8>,
 offset: usize,
}

impl BumpArena {
 pub fn new(capacity: usize) -> Self {
     Self { memory: vec![0u8; capacity], offset: 0 }
 }

 pub fn alloc_bytes(&mut self, size: usize, align: usize) -> Option<*mut u8> {
     let aligned = (self.offset + align - 1) & !(align - 1);  // align up
     if aligned + size > self.memory.len() { return None; }
     let ptr = unsafe { self.memory.as_mut_ptr().add(aligned) };
     self.offset = aligned + size;
     Some(ptr)
     // No individual free: the entire arena resets at once.
 }

 pub fn reset(&mut self) { self.offset = 0; }  // O(1) bulk deallocation
}

What This Unlocks
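For instance, the pool shape gives you stable integer handles with O(1) reuse, and the core idea is expressible even in safe Rust. A minimal sketch, assuming `Option<T>` for per-slot liveness instead of `MaybeUninit` (the `SafePool` name is hypothetical, not from the listing below):

```rust
/// A safe-Rust pool: `Option<T>` tracks liveness, and a `Vec` of indices is
/// the free-list stack. Costs one discriminant per slot but needs no `unsafe`.
pub struct SafePool<T> {
    slots: Vec<Option<T>>,
    free: Vec<usize>, // stack of free slot indices
}

impl<T> SafePool<T> {
    pub fn new(cap: usize) -> Self {
        Self {
            slots: (0..cap).map(|_| None).collect(),
            free: (0..cap).rev().collect(), // pop() hands out index 0 first
        }
    }

    /// O(1): pop a free index and move the value in. `None` when full.
    pub fn alloc(&mut self, val: T) -> Option<usize> {
        let idx = self.free.pop()?;
        self.slots[idx] = Some(val);
        Some(idx)
    }

    /// O(1): take the value out and push the index back. `None` if not live.
    pub fn dealloc(&mut self, idx: usize) -> Option<T> {
        let val = self.slots[idx].take();
        if val.is_some() {
            self.free.push(idx); // slot becomes reusable
        }
        val
    }
}

fn main() {
    let mut pool = SafePool::new(2);
    let a = pool.alloc("alpha").unwrap();
    let b = pool.alloc("beta").unwrap();
    assert!(pool.alloc("gamma").is_none()); // exhausted
    assert_eq!(pool.dealloc(a), Some("alpha"));
    let c = pool.alloc("gamma").unwrap();
    assert_eq!(c, a); // slot reused
    println!("live handles: b={b}, c={c}");
}
```

Compared with the `MaybeUninit` pool, this trades a little memory and a branch per access for compile-time safety; a double-free degrades into a harmless `None`.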

Key Differences

| Concept | OCaml | Rust |
|---|---|---|
| Custom allocator | Not idiomatic; GC handles allocation | `Pool<T>` / `BumpArena` as library types |
| Allocation cost | GC amortised; unpredictable latency | O(1) guaranteed: pointer bump or free-list pop |
| Deallocation | GC traces and collects | Manual (pool) or bulk-drop (arena) |
| Memory safety | GC prevents use-after-free | Manual: `dealloc` marks slot dead; caller must not reuse index |
| Fragmentation | GC compacts (in some runtimes) | Pool: none (same-size slots); arena: none (no free holes) |
// 726. Memory pool / bump allocator pattern
//
// Implements a typed pool allocator and a lifetime-safe bump arena.

use std::alloc::{alloc, dealloc, Layout};
use std::cell::Cell;
use std::ptr::NonNull;

// ── Part 1: Fixed-size typed object pool ────────────────────────────────────

/// A pool of `CAP` pre-allocated `T` slots.
/// Allocation and deallocation are O(1) via a free-list.
pub struct Pool<T, const CAP: usize> {
    slots:     Box<[std::mem::MaybeUninit<T>; CAP]>,
    free_head: Option<usize>,
    next_free: [usize; CAP], // next pointer for free-list
    live:      usize,
}

impl<T, const CAP: usize> Pool<T, CAP> {
    pub fn new() -> Self {
        // Build free list: 0 → 1 → 2 → … → CAP-1 → sentinel
        let mut next_free = [0usize; CAP];
        for i in 0..CAP { next_free[i] = i + 1; }
        Self {
            // SAFETY: Array of MaybeUninit requires no initialisation.
            slots:     Box::new(unsafe { std::mem::MaybeUninit::uninit().assume_init() }),
            free_head: Some(0),
            next_free,
            live:      0,
        }
    }

    /// Allocate a slot. Returns `None` when the pool is exhausted.
    pub fn alloc(&mut self, val: T) -> Option<usize> {
        let idx = self.free_head?;
        let next = self.next_free[idx];
        self.free_head = if next < CAP { Some(next) } else { None };
        self.slots[idx].write(val);
        self.live += 1;
        Some(idx)
    }

    /// Get a shared reference to the value at `idx`. Panics if `idx` is out
    /// of range. `idx` must be a live handle from `alloc`: the pool does not
    /// track per-slot liveness, so passing a freed index is undefined behaviour.
    pub fn get(&self, idx: usize) -> &T {
        assert!(idx < CAP);
        // SAFETY: `idx` was returned by `alloc` and not yet freed, so the slot
        // is initialised.
        unsafe { self.slots[idx].assume_init_ref() }
    }

    /// Deallocate the slot at `idx`.
    ///
    /// # Safety
    /// `idx` must have been returned by `alloc` and not yet freed.
    pub unsafe fn dealloc(&mut self, idx: usize) {
        assert!(idx < CAP);
        // SAFETY: caller guarantees the slot is live.
        unsafe { self.slots[idx].assume_init_drop(); }
        self.next_free[idx] = self.free_head.unwrap_or(CAP);
        self.free_head = Some(idx);
        self.live -= 1;
    }

    pub fn live(&self) -> usize { self.live }
}

// โ”€โ”€ Part 2: Lifetime-safe bump arena โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€

/// A bump allocator backed by a heap-allocated byte slab.
/// Objects allocated from it must not outlive it.
///
/// There is no `'arena` lifetime parameter; instead, every reference returned
/// by `alloc` borrows `&self`, so the borrow checker pins it to the arena's
/// lifetime.
pub struct Arena {
    ptr: NonNull<u8>,
    cap: usize,
    pos: Cell<usize>, // Cell for interior mutability through shared ref
}

impl Arena {
    pub fn new(capacity: usize) -> Self {
        assert!(capacity > 0, "arena capacity must be non-zero");
        let layout = Layout::from_size_align(capacity, 16).expect("valid layout");
        // SAFETY: layout has non-zero size (we require capacity > 0).
        let ptr = unsafe { alloc(layout) };
        let ptr = NonNull::new(ptr).expect("allocation failed");
        Self { ptr, cap: capacity, pos: Cell::new(0) }
    }

    /// Allocate space for one `T`, returning a mutable reference that borrows
    /// this arena. Initialises the slot with `T::default()`.
    ///
    /// Note: the arena never runs destructors, so a `T` that owns heap memory
    /// (e.g. `String`) will leak when the arena is reset or dropped.
    pub fn alloc<T: Default>(&self) -> &mut T {
        let layout = Layout::new::<T>();
        // The slab is 16-byte aligned, so stricter alignments can't be served.
        assert!(layout.align() <= 16, "alignment above 16 unsupported");
        let offset  = self.pos.get();
        let aligned = (offset + layout.align() - 1) & !(layout.align() - 1);
        let next    = aligned + layout.size();
        assert!(next <= self.cap, "Arena OOM");
        self.pos.set(next);

        // SAFETY: `aligned` is within the allocated slab and properly aligned
        // for `T`. We return a unique `&mut T` tied to `&self` (arena lifetime).
        // The arena does not move the slab, so the reference stays valid.
        let slot_ptr = unsafe { self.ptr.as_ptr().add(aligned) as *mut T };
        unsafe {
            slot_ptr.write(T::default());
            &mut *slot_ptr
        }
    }

    /// Allocate a byte slice of `len` bytes.
    pub fn alloc_slice(&self, len: usize) -> &mut [u8] {
        let aligned = self.pos.get();
        let next = aligned + len;
        assert!(next <= self.cap, "Arena OOM");
        self.pos.set(next);
        // SAFETY: `aligned..next` is within the slab, not aliased.
        unsafe {
            std::slice::from_raw_parts_mut(self.ptr.as_ptr().add(aligned), len)
        }
    }

    /// Reset the bump pointer. All previous allocations become invalid.
    ///
    /// # Safety
    /// Caller must guarantee no references into this arena are live after this call.
    pub unsafe fn reset(&self) {
        self.pos.set(0);
    }

    pub fn used(&self) -> usize { self.pos.get() }
    pub fn capacity(&self) -> usize { self.cap }
}

impl Drop for Arena {
    fn drop(&mut self) {
        let layout = Layout::from_size_align(self.cap, 16).unwrap();
        // SAFETY: `self.ptr` was allocated with this layout in `new()`.
        unsafe { dealloc(self.ptr.as_ptr(), layout); }
    }
}

// ── Part 3: Arena-allocated parse tree ──────────────────────────────────────

/// A simple expression AST node โ€” tied to arena lifetime `'a`.
#[derive(Debug)]
pub enum Expr<'a> {
    Num(i64),
    Add(&'a Expr<'a>, &'a Expr<'a>),
    Mul(&'a Expr<'a>, &'a Expr<'a>),
}

impl<'a> Default for Expr<'a> {
    fn default() -> Self {
        Expr::Num(0)
    }
}

impl<'a> Expr<'a> {
    pub fn eval(&self) -> i64 {
        match self {
            Expr::Num(n)   => *n,
            Expr::Add(l,r) => l.eval() + r.eval(),
            Expr::Mul(l,r) => l.eval() * r.eval(),
        }
    }
}

/// Build `(1 + 2) * 3` in the arena โ€” no separate heap allocations.
fn build_ast(arena: &Arena) -> &Expr<'_> {
    let one   = arena.alloc::<Expr>();
    *one = Expr::Num(1);

    let two   = arena.alloc::<Expr>();
    *two = Expr::Num(2);

    let three = arena.alloc::<Expr>();
    *three = Expr::Num(3);

    let add   = arena.alloc::<Expr>();
    *add = Expr::Add(one, two);

    let mul   = arena.alloc::<Expr>();
    *mul = Expr::Mul(add, three);
    mul
}

// ── main ────────────────────────────────────────────────────────────────────

fn main() {
    // Pool demo: O(1) alloc, O(1) dealloc, slot reuse.
    let mut pool = Pool::<u32, 4>::new();
    let h = pool.alloc(7).expect("pool has space");
    println!("pool[{}] = {}, live = {}", h, pool.get(h), pool.live());
    // SAFETY: `h` came from `alloc` and has not been freed.
    unsafe { pool.dealloc(h); }

    // Arena demo: build and evaluate (1 + 2) * 3 with no separate heap allocations.
    let arena = Arena::new(1024);
    let expr = build_ast(&arena);
    println!("(1 + 2) * 3 = {}, arena used: {} bytes", expr.eval(), arena.used());
}

// ── Tests ───────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn pool_alloc_dealloc() {
        let mut p = Pool::<i32, 4>::new();
        let h = p.alloc(42).unwrap();
        assert_eq!(*p.get(h), 42);
        assert_eq!(p.live(), 1);
        // SAFETY: h just returned from alloc, not freed.
        unsafe { p.dealloc(h); }
        assert_eq!(p.live(), 0);
    }

    #[test]
    fn pool_exhaustion() {
        let mut p = Pool::<u8, 2>::new();
        assert!(p.alloc(1).is_some());
        assert!(p.alloc(2).is_some());
        assert!(p.alloc(3).is_none()); // full
    }

    #[test]
    fn pool_reuse_slot() {
        let mut p = Pool::<u8, 2>::new();
        let h = p.alloc(1).unwrap();
        // SAFETY: h just returned from alloc.
        unsafe { p.dealloc(h); }
        let h2 = p.alloc(99).unwrap();
        assert_eq!(*p.get(h2), 99);
    }

    #[test]
    fn arena_alloc_and_reset() {
        let arena = Arena::new(1024);
        let x = arena.alloc::<u64>();
        *x = 12345;
        assert_eq!(*x, 12345);
        assert!(arena.used() > 0);
        // SAFETY: no references live after this.
        unsafe { arena.reset(); }
        assert_eq!(arena.used(), 0);
    }

    #[test]
    fn arena_ast_eval() {
        let arena = Arena::new(1024);
        let expr = build_ast(&arena);
        assert_eq!(expr.eval(), 9);  // (1+2)*3
    }

    #[test]
    #[should_panic(expected = "Arena OOM")]
    fn arena_oom() {
        let arena = Arena::new(8);
        let _ = arena.alloc_slice(9); // too big
    }
}
(* OCaml: Memory pool / arena pattern
   OCaml's GC acts as an implicit arena for the minor heap.
   We implement an explicit arena for demonstration. *)

(* --- Simple typed pool: pre-allocate N slots --- *)

type 'a pool = {
  slots   : 'a array;
  mutable free: int list;  (* indices of free slots *)
}

exception Pool_exhausted

let make_pool n default =
  { slots = Array.make n default;
    free  = List.init n (fun i -> i) }

let pool_alloc pool x =
  match pool.free with
  | []     -> raise Pool_exhausted
  | i :: t ->
    pool.free <- t;
    pool.slots.(i) <- x;
    i  (* return handle (index) *)

let pool_get pool i = pool.slots.(i)

let pool_free pool i =
  (* Return slot to free list: O(1) *)
  pool.free <- i :: pool.free

let pool_used pool =
  let total = Array.length pool.slots in
  total - List.length pool.free

(* --- Bump allocator (arena) --- *)

type arena = {
  buf  : bytes;
  mutable pos  : int;
  cap  : int;
}

let make_arena cap = { buf = Bytes.create cap; pos = 0; cap }

let arena_alloc arena size =
  if arena.pos + size > arena.cap then failwith "Arena OOM"
  else begin
    let start = arena.pos in
    arena.pos <- arena.pos + size;
    (arena.buf, start, size)   (* reference into arena, not a copy *)
  end

let arena_reset arena = arena.pos <- 0

let arena_used arena = arena.pos

(* --- Parse tree allocated in arena --- *)

type ast_node =
  | Num of int
  | Add of int * int  (* indices into arena-resident nodes *)

let parse_expression arena_buf =
  (* Simulate building a small AST by writing into the arena *)
  let (buf, off, _) = arena_alloc arena_buf 8 in
  Bytes.set_int32_be buf off (Int32.of_int 42);   (* store number *)
  (buf, off)

let () =
  (* Pool demo *)
  let pool = make_pool 4 0 in
  let h1 = pool_alloc pool 100 in
  let h2 = pool_alloc pool 200 in
  Printf.printf "pool[%d]=%d pool[%d]=%d used=%d\n"
    h1 (pool_get pool h1) h2 (pool_get pool h2) (pool_used pool);
  pool_free pool h1;
  Printf.printf "After free h1: used=%d\n" (pool_used pool);
  let h3 = pool_alloc pool 999 in
  Printf.printf "Reused slot %d = %d\n" h3 (pool_get pool h3);

  (* Arena demo *)
  let arena = make_arena 256 in
  let _ = parse_expression arena in
  Printf.printf "Arena used: %d bytes\n" (arena_used arena);
  arena_reset arena;
  Printf.printf "After reset: %d bytes\n" (arena_used arena);

  (* Allocate many small things *)
  for _ = 1 to 10 do
    let _ = arena_alloc arena 8 in ()
  done;
  Printf.printf "After 10×8B allocs: %d bytes used\n" (arena_used arena)