Performance Patterns

Optimization strategies used throughout Asynkron.JsEngine.


Object Pooling

Files: ObjectPool.cs, IRentable.cs, JsEnvironmentPool.cs

Frequently allocated objects are pooled to reduce GC pressure.

flowchart LR
    subgraph Pool["Object Pool"]
        P1[/"Available"/]
        P2[/"Available"/]
        P3[/"In Use"/]
    end
    
    Code["Code"] -->|"Rent()"| P1
    P1 -->|"Activate()"| InUse((In Use))
    InUse -->|"Return()"| Reset["Reset()"]
    Reset -->|"CAS"| Pool
    
    subgraph Lifecycle["Object Lifecycle"]
        direction TB
        L1["Rent"] --> L2["Activate"]
        L2 --> L3["Use"]
        L3 --> L4["Reset"]
        L4 --> L5["Return to pool"]
    end

Pooled Types

| Type | Purpose |
| --- | --- |
| JsEnvironment | Execution scopes (created per function call, loop iteration) |
| IteratorDriverState | for-of loop state (iterator object, enumerator) |
| ForInDriverState | for-in loop state (property keys) |
| AsyncResumeCallback | Async function/generator callbacks |
| Various enumerators | JsArrayPooledEnumerator, StringPooledEnumerator, etc. |

IRentable Interface

internal interface IRentable
{
    void Activate(ILogger? logger = null);  // Called on rent
    void Reset(ILogger? logger = null);     // Called on return
}
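
Implementers provide both hooks; the important rule is that Reset clears anything that could leak between rentals. A minimal illustrative implementer (PooledScratchList is a made-up type, not an actual engine class):

internal sealed class PooledScratchList : IRentable
{
    public List<JsValue> Items { get; } = new();

    public void Activate(ILogger? logger = null)
    {
        // Nothing to prepare for this simple type; real pooled types may
        // re-link parent scopes, reset flags, etc.
    }

    public void Reset(ILogger? logger = null)
    {
        // Drop references before the instance goes back to the pool so it
        // does not keep the previous rental's data alive.
        Items.Clear();
    }
}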

ObjectPool Implementation

A lock-free, fixed-size array pool using Interlocked.CompareExchange:

internal sealed class ObjectPool<T>(int size, Func<T> factory) where T : class
{
    public T Rent(ILogger? logger = null)
    {
        // Try to find available item via CAS
        // If pool exhausted, create new via factory
    }

    public void Return(T item, ILogger? logger = null)
    {
        // Reset item, try to return via CAS
        // If pool full, item is abandoned to GC
    }
}
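
The comments above elide the CAS loop. A minimal sketch of how such a lock-free pool can work, with the pool itself invoking Activate/Reset (the class name and the IRentable constraint are choices made for this sketch; the real ObjectPool<T> may differ in detail):

internal sealed class SimpleCasPool<T>(int size, Func<T> factory) where T : class, IRentable
{
    private readonly T?[] _items = new T?[size];

    public T Rent(ILogger? logger = null)
    {
        for (var i = 0; i < _items.Length; i++)
        {
            var item = _items[i];

            // Claim the slot atomically; another thread may race us for it.
            if (item is not null &&
                Interlocked.CompareExchange(ref _items[i], null, item) == item)
            {
                item.Activate(logger);
                return item;
            }
        }

        // Pool exhausted: fall back to the factory.
        var created = factory();
        created.Activate(logger);
        return created;
    }

    public void Return(T item, ILogger? logger = null)
    {
        item.Reset(logger);

        for (var i = 0; i < _items.Length; i++)
        {
            // Put the item back into the first empty slot.
            if (Interlocked.CompareExchange(ref _items[i], item, null) is null)
                return;
        }

        // Pool is full: abandon the item and let the GC reclaim it.
    }
}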

Pooled Wrapper (RAII)

using var envHandle = JsEnvironmentPool.Rent(enclosing, isFunctionScope, isStrict);
var env = envHandle.Value;
// ... use env ...
// Automatically returned on dispose

Why Pooling Matters

In a tight loop like for (let i = 0; i < 1000000; i++):

  • Each iteration creates a new block scope (JsEnvironment)
  • Without pooling: 1M allocations, heavy GC pressure
  • With pooling: ~32 allocations (pool size), objects reused

Fast/Slow Path Split

Hot instruction handlers are split into inlined fast paths and non-inlined slow paths:

flowchart TB
    Handler["Handler Called"] --> FastCheck{Fast path<br/>conditions met?}
    
    FastCheck -->|Yes| FastPath["Fast Path<br/>(~30 lines, INLINED)"]
    FastCheck -->|No| SlowPath["Slow Path<br/>(NO_INLINING)"]
    
    FastPath --> Result((Result))
    SlowPath --> Result
    
    style FastPath fill:#9f9,stroke:#333
    style SlowPath fill:#f99,stroke:#333
[MethodImpl(MethodImplOptions.AggressiveInlining)]
private static InstructionResult HandleIncrementSlot(...)
{
    var flatSlotId = instruction.FlatSlotId;

    // Fast path: ~30 lines, handles common case (numeric loop counter)
    if (flatSlotId >= 0 && runner._flatSlots is not null)
    {
        ref var targetVar = ref runner._flatSlots![flatSlotId];
        var currentValue = targetVar.Read();

        if (currentValue.Kind == JsValueKind.Number)
        {
            var newValue = instruction.IsIncrement
                ? currentValue.NumberValue + 1.0
                : currentValue.NumberValue - 1.0;
            targetVar.Write(newValue);
            runner._programCounter = instruction.Next;
            returnValue = default;
            return InstructionResult.Continue;
        }
    }

    // Delegate to slow path
    return HandleIncrementSlotSlow(...);
}

[MethodImpl(MethodImplOptions.NoInlining)]
private static InstructionResult HandleIncrementSlotSlow(...)
{
    // Complex cases: scope lookup, type coercion, errors
}

Why: the JIT inlines the tiny fast path into the hot loop; the slow path stays separate and doesn't bloat the loop.


Dispatch Table

Handlers are stored in a delegate array indexed by InstructionKind:

private static readonly InstructionHandler[] InstructionHandlers = new InstructionHandler[40];

static ExecutionPlanRunner()
{
    InstructionHandlers[(int)InstructionKind.Statement] = HandleStatement;
    InstructionHandlers[(int)InstructionKind.IncrementSlot] = HandleIncrementSlot;
    // ...
}

Why: faster than a large switch statement when there are many cases, and it enables direct delegate invocation.

Hot Path Before Dispatch

Jump and Branch are checked before dispatch table lookup:

if (instructionKind == InstructionKind.Jump)
{
    _programCounter = instruction.TargetIndex;
    continue;  // Skip dispatch table lookup
}

if (instructionKind == InstructionKind.Branch)
{
    var result = HandleBranchFastPath(...);
    continue;
}

// All other instructions
var loopResult = InstructionHandlers[(int)instructionKind](...);

Flat Slot Variable Access

Files: ExecutionPlanRunner.FlatSlots.cs, JsEnvironment.cs

All variables across all scopes are assigned slots in a single flat array:

private JsVariable[]? _flatSlots;

// Access:
ref var value = ref _flatSlots[flatSlotId];

Benefits:

  • O(1) access regardless of scope depth
  • No scope chain traversal
  • Enables super-fast arithmetic paths when both operands live in flat slots (see the sketch below)
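
As a rough illustration of that last point, a binary numeric operation whose operands were resolved to slot ids at plan time can run entirely against the flat array. LeftSlotId, RightSlotId and ResultSlotId are illustrative names, not the engine's actual instruction fields:

// Hypothetical numeric fast path for a binary add with both operands in flat slots.
ref var left = ref _flatSlots![instruction.LeftSlotId];
ref var right = ref _flatSlots![instruction.RightSlotId];

var l = left.Read();
var r = right.Read();

if (l.Kind == JsValueKind.Number && r.Kind == JsValueKind.Number)
{
    // No scope-chain walk, no dictionary lookup: two array reads and a double add.
    _flatSlots![instruction.ResultSlotId].Write(l.NumberValue + r.NumberValue);
}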

See JsEnvironment & Slots for details.


Bounds-Check Elimination

The instruction loop uses Unsafe.Add and MemoryMarshal for bounds-check-free access to the instruction array:

var instructionsArray = ImmutableCollectionsMarshal.AsArray(instructions)!;
ref var instructionsRef = ref MemoryMarshal.GetArrayDataReference(instructionsArray);

// Bounds-check-free access
var instruction = Unsafe.Add(ref instructionsRef, _programCounter);

JsValue Caching

Static Singletons

public static readonly JsValue Undefined = new(JsValueKind.Undefined, 0.0, null);
public static readonly JsValue Null = new(JsValueKind.Null, 0.0, null);
public static readonly JsValue True = new(JsValueKind.Boolean, 1.0, null);
public static readonly JsValue False = new(JsValueKind.Boolean, 0.0, null);
public static readonly JsValue Zero = new(0.0);
public static readonly JsValue One = new(1.0);

Integer Cache (100k Values)

private static readonly JsValue[] IntegerCache = CreateIntegerCache(100000);

public static JsValue FromDouble(double value)
{
    var i = (int)value;
    if ((uint)i < (uint)IntegerCache.Length && i == value && !double.IsNegative(value))
    {
        return IntegerCache[i];
    }
    return new JsValue(value);
}

Index String Cache (10k Values)

private static readonly string[] CachedIndexStrings = new string[10000];
// "0", "1", "2", ..., "9999"

Callback Pooling (Async)

Files: AsyncFunctionInvoker.cs, AsyncGeneratorInvoker.cs

Async resume callbacks are pooled per thread via [ThreadStatic] fields:

private sealed class AsyncResumeCallback : IJsCallable
{
    [ThreadStatic] private static AsyncResumeCallback? TCachedFulfilled;
    [ThreadStatic] private static AsyncResumeCallback? TCachedRejected;

    public static (AsyncResumeCallback fulfilled, AsyncResumeCallback rejected) Rent(...)
    {
        var fulfilled = TCachedFulfilled ?? new AsyncResumeCallback();
        TCachedFulfilled = null;
        // ... setup ...
        return (fulfilled, rejected);
    }

    public JsValue Invoke(...)
    {
        try
        {
            // ... execute ...
        }
        finally
        {
            // Return to pool
            if (_isRejection)
                TCachedRejected = this;
            else
                TCachedFulfilled = this;
        }
    }
}

Completion Signals (Not Exceptions)

File: CompletionSignals.cs

Non-local control flow is modeled as typed signals:

record BreakCompletionSignal(Symbol? Label) : ICompletionSignal;
record ContinueCompletionSignal(Symbol? Label) : ICompletionSignal;
class ReturnCompletionSignal(JsValue value) : ICompletionSignal;
class ThrowFlowCompletionSignal(JsValue value) : ICompletionSignal;
class YieldCompletionSignal(JsValue value) : ICompletionSignal;

Why signals instead of exceptions:

  • Signals are faster than throwing/catching exceptions
  • They carry typed data (the return value, label, etc.)
  • Checked via field access, not try/catch
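
For example, loop handling can consume a break with plain type and field checks instead of a catch block. This fragment is illustrative; completion and currentLabel stand in for locals of the surrounding handler:

// Inside the loop's completion handling:
if (completion is BreakCompletionSignal breakSignal)
{
    if (breakSignal.Label is null || breakSignal.Label == currentLabel)
    {
        // Unlabelled break, or a break targeting this loop: stop here.
        break;
    }

    // A labelled break aimed at an outer statement: propagate the signal upward.
    return completion;
}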

Lazy State Allocation

State objects are only allocated when needed:

private sealed partial class ExecutionPlanRunner
{
    // Only allocated for try/catch functions
    private TryCatchState? _tryCatchState;

    // Only allocated for async functions
    private AsyncState AsyncStateRef;

    // Only allocated for generators
    private YieldState YieldStateRef;

    // Only allocated for loops with iterators
    private IteratorState IteratorStateRef;
}
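
The access pattern is the usual get-or-create; a sketch with an illustrative accessor name:

// The try/catch bookkeeping object is created only when the first try block
// is actually entered, so functions without try/catch pay nothing.
private TryCatchState GetOrCreateTryCatchState()
{
    return _tryCatchState ??= new TryCatchState();
}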

Method Inlining Attributes

// Force inline for hot paths
[MethodImpl(MethodImplOptions.AggressiveInlining)]
private static InstructionResult HandleJump(...) { ... }

// Prevent inline for cold paths
[MethodImpl(MethodImplOptions.NoInlining)]
private static InstructionResult HandleJumpSlow(...) { ... }

// Aggressive optimization
[MethodImpl(MethodImplOptions.AggressiveOptimization)]
private JsValue ExecuteInstructionLoop(...) { ... }

AST Cache

File: Ast/AstCache.cs

Thread-safe lazy initialization for AST metadata:

internal static TCache GetOrCreate<TCache>(ref TCache? field, Func<TCache> factory) where TCache : class
{
    var existing = Volatile.Read(ref field);
    if (existing is not null) return existing;

    var created = factory();
    var prior = Interlocked.CompareExchange(ref field, created, null);
    return prior ?? created;
}

Cached data:

  • HoistPlan - variable hoisting analysis
  • HoistableDeclarationsPlan - function declarations
  • ExecutionPlan - lowered IR
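
A hypothetical call site on an AST node, showing how one of these plans would be published through the helper (the field name and HoistAnalyzer are illustrative):

// Computed once per node, shared across threads; later calls return the
// already-published instance.
private HoistPlan? _hoistPlan;

public HoistPlan GetHoistPlan()
{
    return AstCache.GetOrCreate(ref _hoistPlan, () => HoistAnalyzer.Analyze(this));
}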

IAsJsValue Interface

Objects cache their JsValue wrapper:

public interface IAsJsValue
{
    ref readonly JsValue AsJsValue { get; }
}

// Implementation in JsObject:
private JsValue _cachedJsValue;
public ref readonly JsValue AsJsValue
{
    get
    {
        if (_cachedJsValue.Kind == JsValueKind.Undefined)
            _cachedJsValue = new JsValue(this);
        return ref _cachedJsValue;
    }
}
