
Journey 07 Async Remake

Roger Johansson edited this page Jan 15, 2026 · 3 revisions

Chapter 7: Async Remake

December 12-13, 2025 - Take two on async

The Honest Assessment

CPS worked. The tests passed. I shipped it.

I also knew it was a hack.

The code was getting harder to maintain. Edge cases kept appearing. The transformed code was bloated and hard to debug. Every new feature made the transformer more complex.

Sometimes "it works" isn't good enough. This was one of those times.

The Rewrite

PR #225 landed with 8,664 insertions. Not a patch. A complete reimplementation.

The goals:

  • Cleaner async/await semantics
  • Better integration with the evolving IR system
  • Proper async generators (not just generators + CPS hacks)
  • Performance that doesn't embarrass me

New Components

AwaitScheduler

A proper event loop implementation:

public class AwaitScheduler
{
    // Two queues, so microtasks can always drain ahead of macrotasks.
    // (A single FIFO channel can't serve priority-filtered dequeues.)
    private readonly ConcurrentQueue<Action> _microtasks = new();
    private readonly ConcurrentQueue<Action> _macrotasks = new();

    public void EnqueueMicrotask(Action task)
    {
        // Promise continuations
        _microtasks.Enqueue(task);
    }

    public void EnqueueMacrotask(Action task)
    {
        // setTimeout, setInterval
        _macrotasks.Enqueue(task);
    }

    public async Task RunEventLoopAsync()
    {
        while (true)
        {
            // Drain ALL microtasks first
            while (_microtasks.TryDequeue(out var micro))
                micro();

            // Then ONE macrotask
            if (_macrotasks.TryDequeue(out var macro))
                macro();
            else
                await WaitForMoreWork();
        }
    }
}

The microtask/macrotask ordering is critical. JavaScript developers expect .then() callbacks on Promises to run before setTimeout callbacks.

Getting this wrong breaks real code.
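The required ordering is easy to pin down in plain JavaScript — this is the behavior any conforming engine has to reproduce:

```javascript
const order = [];

setTimeout(() => order.push("macrotask"), 0); // goes on the macrotask queue

Promise.resolve()
    .then(() => order.push("microtask 1"))    // microtask queue
    .then(() => order.push("microtask 2"));

order.push("sync");

// Once the event loop has run:
// order === ["sync", "microtask 1", "microtask 2", "macrotask"]
```

Synchronous code finishes first, then every queued microtask, and only then does the timer callback get its turn.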

JsValueCache

Stop allocating the same values over and over:

public class JsValueCache
{
    // Small integers are common
    private static readonly JsValue[] _smallInts = new JsValue[256];

    // These are singleton
    public static readonly JsValue True = JsValue.FromBoolean(true);
    public static readonly JsValue False = JsValue.FromBoolean(false);
    public static readonly JsValue Undefined = JsValue.Undefined;
    public static readonly JsValue Null = JsValue.Null;

    static JsValueCache()
    {
        for (int i = 0; i < 256; i++)
            _smallInts[i] = JsValue.FromNumber(i - 128);
    }

    public static JsValue FromInt32(int value)
    {
        if (value >= -128 && value < 128)
            return _smallInts[value + 128];
        return JsValue.FromNumber(value);
    }
}

Small optimization, but 0, 1, -1 show up constantly. Why allocate them every time?
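The same interning pattern, sketched in JavaScript for illustration — SmallIntCache and its boxed-value shape are hypothetical stand-ins here, not the engine's actual types:

```javascript
// Hypothetical sketch of the interning idea: pre-build boxed values for the
// small-integer range once, then hand out the shared instance on a hit.
class SmallIntCache {
    constructor(lo = -128, hi = 127) {
        this.lo = lo;
        this.cache = [];
        for (let i = lo; i <= hi; i++)
            this.cache.push({ type: "number", value: i }); // boxed once
    }

    fromInt(value) {
        if (value >= this.lo && value <= this.lo + this.cache.length - 1)
            return this.cache[value - this.lo]; // shared singleton
        return { type: "number", value };      // outside the range: allocate
    }
}

const cache = new SmallIntCache();
console.log(cache.fromInt(0) === cache.fromInt(0));       // true: same object
console.log(cache.fromInt(1000) === cache.fromInt(1000)); // false: fresh each time
```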

BaseRealmSnapshot

Creating a new realm (the JavaScript global environment) was slow. We were building all the intrinsics from scratch each time.

So, pre-build everything once, then clone:

public class BaseRealmSnapshot
{
    // Built once at startup
    private readonly JsObject _objectPrototype;
    private readonly JsObject _arrayPrototype;
    private readonly JsFunction _objectConstructor;
    // ... all intrinsics

    public Realm CreateRealm()
    {
        // Clone the pre-built state
        return new Realm(
            Clone(_objectPrototype),
            Clone(_arrayPrototype),
            Clone(_objectConstructor),
            // ...
        );
    }
}

Realm creation went from "measurable" to "instant."
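The snapshot-then-clone idea, sketched in JavaScript — buildIntrinsics and the template shape are hypothetical stand-ins for the real C# intrinsics:

```javascript
// Hypothetical sketch: pay the intrinsic-building cost once at startup,
// then stamp out realms by deep-cloning the pre-built template.
function buildIntrinsics() {
    // Stand-in for the expensive part: wiring up Object, Array, etc.
    return {
        objectPrototype: { toString: "…native…" },
        arrayPrototype: { push: "…native…" },
    };
}

const snapshot = buildIntrinsics(); // built once

function createRealm() {
    // Each realm gets independent copies — no shared mutable state
    return structuredClone(snapshot);
}

const a = createRealm();
const b = createRealm();
console.log(a !== b);                                 // true: separate realms
console.log(a.objectPrototype !== b.objectPrototype); // true: nothing shared
```

Mutating one realm's prototypes can't leak into another, which is the whole point of per-realm intrinsics.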

Async Generators For Real

The CPS approach couldn't handle async generators properly. Now we could:

async function* fetchPages() {
    for (let page = 1; ; page++) {
        const data = await fetch(`/api/page/${page}`);
        if (!data.length) return;
        yield data;
    }
}

for await (const page of fetchPages()) {
    console.log(page);
}

Each yield suspends. Each await suspends. Both work correctly, independently.

Nice, right?
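A self-contained variant of the same pattern (no fetch endpoint needed) that a correct implementation has to get right — each await and each yield is an independent suspension point:

```javascript
async function* countTo(n) {
    for (let i = 1; i <= n; i++) {
        await new Promise(resolve => setTimeout(resolve, 0)); // await suspends…
        yield i;                                              // …and so does yield
    }
}

const seen = [];
(async () => {
    for await (const value of countTo(3)) seen.push(value);
    console.log(seen); // [ 1, 2, 3 ]
})();
```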

The implementation:

[JsPrototype("AsyncGenerator")]
public partial class AsyncGeneratorPrototype
{
    [JsMethod("next")]
    public static JsValue Next(JsValue thisArg, JsValue[] args)
    {
        var generator = thisArg.GetInternalSlot<AsyncGeneratorState>();
        var value = args.Length > 0 ? args[0] : JsValue.Undefined;

        // Returns a Promise that resolves to { value, done }
        return generator.EnqueueRequest(
            AsyncGeneratorRequestKind.Next,
            value
        );
    }
}

The async generator state machine handles:

  • Queuing multiple next() calls
  • Awaiting Promises mid-execution
  • Yielding values asynchronously
  • Proper cleanup on return() and throw()

DisposableStack

While I was in there, I added using declaration support:

{
    using resource = getResource();
    // use resource
} // resource[Symbol.dispose]() called automatically

And async disposal:

{
    await using connection = await connect();
    // use connection
} // await connection[Symbol.asyncDispose]() called automatically

Modern JavaScript features. Because why not? :-)
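Under the hood, a using declaration is roughly a try/finally around the resource's dispose method. A sketch of that desugaring — with a local fallback symbol, since older runtimes don't ship Symbol.dispose:

```javascript
// Fall back to a local symbol so this sketch runs on runtimes
// that predate Symbol.dispose. The real `using` statement does
// all of this wiring for you.
const dispose = Symbol.dispose ?? Symbol("dispose");
const log = [];

function getResource() {
    return {
        use() { log.push("used"); },
        [dispose]() { log.push("disposed"); },
    };
}

// Roughly what `using resource = getResource()` desugars to:
{
    const resource = getResource();
    try {
        resource.use();
    } finally {
        resource[dispose](); // runs even if the block throws
    }
}

console.log(log); // [ 'used', 'disposed' ]
```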

The Results

  • Cleaner codebase
  • Better performance
  • Proper async generators
  • Integration points for IR

Was it 8,000+ lines of work? Yes. Was it worth it? Absolutely.

Lessons

  1. Rewrites can be worth it - When the architecture is fighting you, fix the architecture
  2. Cache what you can - Small values, intrinsics, anything repeated
  3. Event loop ordering matters - Microtasks before macrotasks, always
  4. Ship, then improve - CPS shipped, async remake shipped later. Both were valid choices at the time

Previous: Chapter 6: Source Generators
Next: Chapter 8: IR Revolution - Unifying everything into one model

//Roger
