The Complete JavaScript & Node.js Internals Roadmap in 2026

From Runtime Choreography to CPU Scheduling — A Deep Reference for All Levels

How to use this doc: Beginner? Follow sections in order. Intermediate? Jump to any section. Advanced? Use as a reference sheet. Every concept builds on the previous one.


Table of Contents

  1. The Big Picture — What Happens When Code Runs

  2. JavaScript Engine Internals (V8)

  3. Memory: Heap, Stack, and Garbage Collection

  4. The Call Stack

  5. The Node.js Runtime Architecture

  6. libuv — The Hidden Engine

  7. The Event Loop — Complete Mechanics

  8. Macro Tasks (Task Queue)

  9. Micro Tasks (Microtask Queue)

  10. Macro vs Micro — Priority, Order, and Scheduling

  11. CPU Scheduling & Thread Model

  12. Promises — Internals and Lifecycle

  13. async/await — What the Engine Actually Does

  14. Streams — Backpressure and Flow Control

  15. Node.js Module System (CJS vs ESM)

  16. Buffers and Binary Data

  17. Worker Threads — True Parallelism

  18. Child Processes

  19. Cluster Module — Multi-Core Scaling

  20. Event Emitter — Internal Pub/Sub

  21. Timers — Deep Mechanics

  22. Error Handling — The Complete Picture

  23. Core Node.js APIs Reference

  24. Performance & Profiling

  25. Security Fundamentals

  26. Mastery Checklist by Level


1. The Big Picture

What Actually Happens When You Run node index.js

┌─────────────────────────────────────────────────────────┐
│  Your JavaScript Source Code (index.js)                 │
└───────────────────────┬─────────────────────────────────┘
                        │  Node.js CLI reads the file
                        ▼
┌─────────────────────────────────────────────────────────┐
│  V8 Engine                                              │
│  ┌──────────┐  ┌────────────┐  ┌────────────────────┐  │
│  │  Parser  │→ │    AST     │→ │  Ignition Bytecode │  │
│  └──────────┘  └────────────┘  └────────┬───────────┘  │
│                                          │ hot paths    │
│                                   ┌──────▼──────────┐   │
│                                   │  TurboFan JIT   │   │
│                                   │  (machine code) │   │
│                                   └─────────────────┘   │
└───────────────────────┬─────────────────────────────────┘
                        │  V8 Heap + Call Stack
                        ▼
┌─────────────────────────────────────────────────────────┐
│  Node.js Runtime Bindings (C++)                         │
│  fs, net, http, crypto, zlib, dns, child_process ...    │
└───────────────────────┬─────────────────────────────────┘
                        │
                        ▼
┌─────────────────────────────────────────────────────────┐
│  libuv                                                  │
│  ┌──────────────────┐   ┌─────────────────────────────┐ │
│  │  Event Loop      │   │  Thread Pool (4 threads)    │ │
│  │  (single thread) │   │  fs, dns, crypto, zlib      │ │
│  └──────────────────┘   └─────────────────────────────┘ │
└───────────────────────┬─────────────────────────────────┘
                        │
                        ▼
┌─────────────────────────────────────────────────────────┐
│  Operating System                                       │
│  epoll (Linux) / kqueue (macOS) / IOCP (Windows)        │
└─────────────────────────────────────────────────────────┘

Key Insight for Beginners

Node.js is not a language. It is a runtime environment built on top of:

  • V8 — executes JavaScript

  • libuv — handles async I/O across platforms

  • Node.js C++ bindings — bridges JS world to OS world


2. JavaScript Engine Internals (V8)

V8 is Google's open-source JavaScript and WebAssembly engine, written in C++.

2.1 Parsing Pipeline

Source Code → Tokenizer → Tokens → Parser → AST → Compiler

Stage             | What Happens                                                 | Output
Tokenizer (Lexer) | Breaks source into tokens (keywords, identifiers, operators) | Token stream
Parser            | Validates grammar, builds tree structure                     | Abstract Syntax Tree (AST)
Scope Analyzer    | Resolves variable bindings and closures                      | Decorated AST
Ignition          | Compiles the AST to bytecode, then interprets it             | Bytecode
TurboFan          | JIT-compiles hot bytecode to machine code                    | Native machine code

2.2 Ignition (Interpreter)

  • Ignition is V8's bytecode interpreter since 2016 (replaced Full-codegen).

  • Executes code immediately — no wait for compilation.

  • Uses a register-based virtual machine (not stack-based).

  • Collects type feedback while running to inform JIT.

// Example: V8 infers that 'a' and 'b' are always Numbers
function add(a, b) { return a + b; }
add(1, 2);   // V8 records: a=Number, b=Number
add(3, 4);   // Confirms pattern → candidate for JIT

2.3 TurboFan (JIT Compiler)

  • Triggered for hot functions (called many times).

  • Uses type feedback from Ignition to generate optimized machine code.

  • Assumes the types stay the same (speculative optimization).

  • If assumption breaks → deoptimization (back to bytecode).

function add(a, b) { return a + b; }
for (let i = 0; i < 100000; i++) add(1, 2); // → TurboFan kicks in
add("hello", "world"); // ← TYPE MISMATCH → deoptimization!

2.4 Hidden Classes (Shapes)

For most objects, V8 does NOT use hash-map property lookup like you'd expect. Instead it creates hidden classes (also called "shapes" or "maps") to make property access as fast as C++ struct field access.

// ✅ Good — same hidden class
function Point(x, y) {
  this.x = x;  // V8: hidden class C0 → C1 (has x)
  this.y = y;  // V8: hidden class C1 → C2 (has x, y)
}
const p1 = new Point(1, 2); // shape: C2
const p2 = new Point(3, 4); // same shape: C2 → fast!

// ❌ Bad — different hidden class per object
const o1 = {};
o1.x = 1; // C0 → C1
const o2 = {};
o2.y = 2; // C0 → C2' (different shape!)

2.5 Inline Caches (ICs)

  • V8 caches the location of a property in memory after the first lookup.

  • On subsequent calls with the same shape, it skips the lookup entirely.

  • Monomorphic (1 shape) → fastest. Polymorphic (2-4 shapes) → slower. Megamorphic (5+ shapes) → slowest (no caching).
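The distinction can be sketched in code. The IC state itself is not observable from JavaScript, so this example only shows the shape patterns, not the speed difference; `getX` and the object literals are illustrative names.

```javascript
// The same property access site can be monomorphic or megamorphic
// depending on how many object shapes flow through it.
function getX(obj) { return obj.x; }

// Monomorphic: every call sees the same hidden class {x, y} -> fast IC.
for (let i = 0; i < 1000; i++) getX({ x: i, y: 0 });

// Megamorphic: five-plus distinct shapes hit the same call site,
// so V8 stops caching the property location here.
const shapes = [
  { x: 1 }, { x: 1, a: 0 }, { x: 1, b: 0 },
  { x: 1, c: 0 }, { x: 1, d: 0 }, { x: 1, e: 0 },
];
let sum = 0;
for (const s of shapes) sum += getX(s);
console.log(sum); // 6
```

In hot code, keeping call sites monomorphic (constructing objects with the same property order) is one of the cheapest V8 optimizations available.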

2.6 Compilation Tiers Summary

Tier | Component                     | Speed   | Optimization Level
1    | Ignition (interpreter)        | Medium  | None
2    | Sparkplug (baseline compiler) | Fast    | None (no speculation)
3    | Maglev (mid-tier JIT)         | Faster  | Medium
4    | TurboFan (top-tier JIT)       | Fastest | Maximum


3. Memory: Heap, Stack, and Garbage Collection

3.1 Memory Layout

┌────────────────────────────────────────┐
│              V8 Process Memory         │
│                                        │
│  ┌────────────────┐  ┌──────────────┐  │
│  │   Call Stack   │  │   V8 Heap    │  │
│  │  (per thread)  │  │  (shared)    │  │
│  │                │  │              │  │
│  │  stack frames  │  │  New Space   │  │
│  │  primitives    │  │  Old Space   │  │
│  │  references    │  │  Code Space  │  │
│  │                │  │  Large Obj.  │  │
│  └────────────────┘  └──────────────┘  │
└────────────────────────────────────────┘

3.2 Stack Memory

  • Fixed size, allocated at thread creation (typically 1-8 MB).

  • Stores: primitive values (number, boolean, undefined, null), function call frames, local variable references.

  • Operates as LIFO (Last In, First Out).

  • Allocation/deallocation is instantaneous (just move the stack pointer).

  • Stack overflow = too many nested function calls.

function a() {
  let x = 10; // stored ON the stack
  return b(x);
}
function b(n) {
  return n * 2; // n is on the stack
}

3.3 Heap Memory

Where objects, arrays, functions, and closures live.

Region                   | Purpose                          | GC Strategy
New Space (Young Gen)    | Newly allocated objects          | Scavenge (fast, minor GC)
Old Space (Old Gen)      | Long-lived objects               | Mark-Sweep + Mark-Compact
Code Space               | Compiled JIT machine code        | Mark-Sweep
Large Object Space       | Objects > 256 KB                 | Mark-Sweep, not moved
Map Space                | Hidden class maps                | Mark-Sweep
Cell/Property Cell Space | Global variables, property cells | Mark-Sweep

3.4 Garbage Collection — How V8 Reclaims Memory

Minor GC — Scavenge (Young Generation)

  • Uses Cheney's algorithm (semi-space copying).

  • New Space split into two semi-spaces: "from" and "to".

  • Live objects copied from "from" → "to". Dead objects abandoned.

  • Objects surviving 2 GC cycles get promoted to Old Space.

  • Very fast: typically < 1ms. Runs frequently.

New Space:
[ from-space: obj1(dead), obj2(live), obj3(live) ]
[ to-space:   (empty)                             ]

After Scavenge:
[ from-space: (empty, swapped)                   ]
[ to-space:   obj2, obj3                         ]  ← survivors
obj2, obj3 counted as 1 survival. Survive again → Old Space.

Major GC — Mark-Sweep-Compact (Old Generation)

Three phases:

1. Mark — Trace all reachable objects from GC roots (global object, stack variables). Uses tri-color marking: white (unvisited), grey (visited but children not processed), black (fully processed).

2. Sweep — Scan heap and reclaim white (unreachable) objects. Creates free lists.

3. Compact — Move live objects together to reduce fragmentation. Updates all pointers.

V8 uses Orinoco — incremental, concurrent, parallel GC to avoid "stop-the-world" pauses.

GC Technique        | What It Means                              | Benefit
Incremental marking | Mark phase split into small chunks         | Shorter pauses
Concurrent marking  | Marking runs in parallel with JS execution | Less pause time
Parallel scavenging | Multiple threads perform the scavenge      | Faster minor GC
Lazy sweeping       | Sweeping deferred until memory is needed   | Better throughput

3.5 Memory Leaks — Common Causes

Cause                         | Example                                  | Fix
Global variables              | leaked = [] (no let/const)               | Always declare variables
Closure retaining outer scope | Inner fn keeping large outer array alive | Null out unused references
Event listeners not removed   | emitter.on() without off()               | Use once() or remove listeners
Timers not cleared            | setInterval running forever              | Store the handle, call clearInterval
Circular references (old V8)  | a.ref = b; b.ref = a                     | Modern V8 handles this
Caches without size limits    | Unbounded Map/object as cache            | Use an LRU cache with a max size
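The last cause deserves a sketch. A minimal LRU-style cache with a hard size cap keeps the heap bounded by exploiting the fact that a JS `Map` iterates keys in insertion order; the class name and cap below are made up for illustration.

```javascript
// Minimal LRU-ish cache: capped size, evicts the least recently used key.
// Relies on Map preserving insertion order.
class BoundedCache {
  constructor(maxSize = 100) {
    this.maxSize = maxSize;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);      // re-insert to mark as most recently used
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // The oldest entry is the first key in iteration order.
      this.map.delete(this.map.keys().next().value);
    }
  }
}

const cache = new BoundedCache(2);
cache.set('a', 1);
cache.set('b', 2);
cache.get('a');    // touch 'a' so 'b' becomes the oldest entry
cache.set('c', 3); // evicts 'b', not 'a'
console.log([...cache.map.keys()]); // [ 'a', 'c' ]
```

In production code a battle-tested package such as `lru-cache` is the usual choice, but the principle is the same: every cache needs an eviction policy.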


4. The Call Stack

The call stack is a LIFO data structure that tracks the current execution context.

4.1 How It Works

function greet(name) {       // frame 3
  return `Hello, ${name}`;
}

function welcome(user) {     // frame 2
  return greet(user.name);
}

function main() {            // frame 1
  return welcome({ name: "Abhishek" });
}

main();

Stack state when greet is executing:

┌─────────────────────────┐  ← TOP (current execution)
│  greet("Abhishek")      │
├─────────────────────────┤
│  welcome({name:...})    │
├─────────────────────────┤
│  main()                 │
├─────────────────────────┤
│  anonymous (global)     │
└─────────────────────────┘  ← BOTTOM
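Because the stack is a fixed-size region, unbounded recursion exhausts it quickly. Node surfaces this as a catchable RangeError, which is a quick way to see the stack's limits in action:

```javascript
// Each call pushes a new frame; with no base case the stack fills up.
function recurse(depth) {
  return recurse(depth + 1);
}

try {
  recurse(0);
} catch (err) {
  console.log(err.name);                      // 'RangeError'
  console.log(err.message.includes('stack')); // true ("Maximum call stack size exceeded")
}
```

The maximum depth depends on frame size and the configured stack size, so converting deep recursion to an explicit loop or queue is the robust fix.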

4.2 Execution Context

Every function call creates an Execution Context pushed onto the stack, containing:

Component                   | Description
Variable Environment        | All var declarations, function declarations
Lexical Environment         | let, const, scope chain reference
this binding                | What this refers to in this function
Outer Environment Reference | Link to parent scope (closure chain)

4.3 Scope Chain and Closures

function outer() {
  const secret = 42;          // lives in outer's lexical env

  function inner() {
    console.log(secret);      // inner has a reference to outer's env
  }                           // This reference = CLOSURE

  return inner;
}

const fn = outer();           // outer() popped from stack
fn();                         // But 'secret' still accessible via closure

The secret variable is NOT garbage collected even after outer() returns, because inner holds a reference to outer's lexical environment object on the heap.

4.4 Hoisting

// What you write:
console.log(x);   // undefined (not ReferenceError)
var x = 5;

// What V8 sees after hoisting:
var x;            // declaration hoisted to top
console.log(x);   // undefined
x = 5;            // assignment stays in place

// let/const: hoisted but NOT initialized (Temporal Dead Zone)
console.log(y);   // ReferenceError: Cannot access 'y' before initialization
let y = 10;

5. The Node.js Runtime Architecture

5.1 Component Layers

┌────────────────────────────────────────────────────────────┐
│  JavaScript Application Code                               │
├────────────────────────────────────────────────────────────┤
│  Node.js Standard Library (JS)                             │
│  fs, path, http, net, crypto, stream, events, util, ...    │
├────────────────────────────────────────────────────────────┤
│  Node.js C++ Bindings (node_file.cc, node_http.cc, ...)    │
│  Bridges JS API to OS via libuv                            │
├────────────────────────────────────────────────────────────┤
│  V8 Engine │ libuv │ c-ares │ OpenSSL │ zlib │ http_parser │
├────────────────────────────────────────────────────────────┤
│  Operating System (Linux / macOS / Windows)                │
└────────────────────────────────────────────────────────────┘

5.2 Key C++ Dependencies

Library | Role
V8      | JavaScript execution engine
libuv   | Async I/O, event loop, thread pool, timers, sockets
c-ares  | Async DNS resolution
OpenSSL | TLS/SSL, crypto operations
zlib    | Compression (gzip, deflate, brotli)
llhttp  | HTTP request/response parsing (successor to http_parser)
nghttp2 | HTTP/2 support

5.3 Node.js Global Objects

Global          | Type     | Description
global          | Object   | Global scope (like window in the browser)
process         | Object   | Current Node.js process info and control
__filename      | String   | Absolute path of the current file (CJS only)
__dirname       | String   | Directory of the current file (CJS only)
require         | Function | Load modules (CJS only)
module          | Object   | Current module object (CJS only)
exports         | Object   | Shortcut for module.exports
Buffer          | Class    | Binary data handling
console         | Object   | Logging utility
setTimeout      | Function | Schedule a macro task
setInterval     | Function | Repeating macro task
setImmediate    | Function | Schedule after I/O in the event loop
clearTimeout    | Function | Cancel a scheduled timeout
clearInterval   | Function | Cancel a repeating interval
clearImmediate  | Function | Cancel a setImmediate
queueMicrotask  | Function | Schedule a microtask
URL             | Class    | URL parsing utility
URLSearchParams | Class    | Query string utility
TextEncoder     | Class    | Encode string → Uint8Array
TextDecoder     | Class    | Decode Uint8Array → string
AbortController | Class    | Cancel async operations
AbortSignal     | Class    | Signal used to abort operations
crypto          | Object   | Web Crypto API (Node 15+)
performance     | Object   | High-resolution timing
structuredClone | Function | Deep clone objects
fetch           | Function | Built-in HTTP client (Node 18+)

5.4 process Object — All Key Properties and Methods

Property/Method                     | Type     | Description
process.pid                         | Number   | Current process ID
process.ppid                        | Number   | Parent process ID
process.platform                    | String   | 'linux', 'darwin', 'win32'
process.arch                        | String   | 'x64', 'arm64', 'ia32'
process.version                     | String   | Node.js version string
process.versions                    | Object   | Versions of V8, libuv, OpenSSL, etc.
process.env                         | Object   | Environment variables
process.argv                        | Array    | [node, script, ...args]
process.execPath                    | String   | Path to the Node.js binary
process.cwd()                       | Function | Current working directory
process.chdir(dir)                  | Function | Change working directory
process.exit(code)                  | Function | Exit process (0 = success)
process.abort()                     | Function | Generate a core dump and crash
process.stdout                      | Stream   | Standard output stream
process.stderr                      | Stream   | Standard error stream
process.stdin                       | Stream   | Standard input stream
process.nextTick(cb)                | Function | Queue a microtask (highest priority)
process.hrtime()                    | Function | High-resolution time [seconds, nanoseconds]
process.hrtime.bigint()             | Function | High-resolution time as BigInt nanoseconds
process.memoryUsage()               | Function | Heap used, heap total, RSS, external
process.cpuUsage()                  | Function | User and system CPU time in microseconds
process.uptime()                    | Function | Seconds since the process started
process.kill(pid, signal)           | Function | Send a signal to a process
process.send(msg)                   | Function | Send a message to the parent (IPC, child processes)
process.disconnect()                | Function | Close the IPC channel
process.binding(name)               | Function | Access built-in C++ modules (internal)
process.report                      | Object   | Diagnostic report generation
process.allowedNodeEnvironmentFlags | Set      | Allowed Node.js CLI flags

process Events:

Event              | Fired When
exit               | Process is about to exit (only synchronous work in the handler)
beforeExit         | Event loop is empty (async operations allowed)
uncaughtException  | Unhandled exception propagated to the event loop
unhandledRejection | Unhandled Promise rejection
rejectionHandled   | A previously unhandled rejection got a handler
warning            | Node.js emits a warning
message            | Message from the parent via IPC
disconnect         | IPC channel closed
SIGINT             | Ctrl+C received
SIGTERM            | Termination signal
SIGUSR1            | User-defined signal 1 (starts the debugger in Node)
SIGUSR2            | User-defined signal 2


6. libuv — The Hidden Engine

libuv is a multi-platform C library that provides the event loop, async I/O, and thread pool. It is the core of what makes Node.js non-blocking.

6.1 What libuv Handles

Category           | Operations
Network I/O        | TCP sockets, UDP sockets, pipes, TTY
File System I/O    | Read, write, stat, mkdir, watch
DNS                | dns.lookup via getaddrinfo/getnameinfo (thread pool); dns.resolve via c-ares
Crypto/Compression | Offloaded to the thread pool
Timers             | setTimeout, setInterval implementation
Signal handling    | POSIX signals
Child processes    | spawn, exec
Thread pool        | Executes blocking work off the main thread

6.2 OS-Level I/O Mechanisms Used by libuv

OS      | Mechanism   | Description
Linux   | epoll       | Efficient event notification for many file descriptors
macOS   | kqueue      | BSD kernel event notification
Windows | IOCP        | I/O Completion Ports (overlapped async I/O)
Solaris | event ports | Port-based event notification
AIX     | pollset     | Multiple descriptor polling

6.3 The Thread Pool

Main Thread (Event Loop)
        │
        │ offloads blocking work
        ▼
┌───────────────────────────────┐
│  libuv Thread Pool            │
│                               │
│  Thread 1 │ Thread 2          │
│  Thread 3 │ Thread 4          │
│                               │
│  Default: 4 threads           │
│  Max: 1024 threads            │
└───────────────┬───────────────┘
                │ result/callback posted to event loop
                ▼
         Main Thread picks up callback

Tasks that use the thread pool:

API                  | Why Thread Pool
fs.* (most)          | File I/O can block on disk
dns.lookup()         | Uses getaddrinfo, which is blocking
crypto.pbkdf2()      | CPU-intensive
crypto.scrypt()      | CPU-intensive
crypto.randomBytes() | Needs entropy
zlib.* (async)       | CPU-intensive compression

Tasks that do NOT use thread pool (use OS async directly):

API                | Mechanism
net.createServer() | epoll/kqueue/IOCP
http.request()     | epoll/kqueue/IOCP
TCP sockets        | epoll/kqueue/IOCP
dns.resolve()      | c-ares (true async)

Configure thread pool size:

// Must be set before the thread pool is first used.
// Often set in the shell instead: UV_THREADPOOL_SIZE=16 node app.js
process.env.UV_THREADPOOL_SIZE = 16;

7. The Event Loop — Complete Mechanics

The event loop is the orchestration mechanism that allows Node.js to perform non-blocking I/O operations. It has 6 phases that run in a fixed order.

7.1 Event Loop Phases

   ┌───────────────────────────────────────────┐
   │           Event Loop Cycle                │
   │                                           │
   │  ┌──────────┐                             │
   │  │  timers  │ ← setTimeout, setInterval   │
   │  └────┬─────┘                             │
   │       │                                   │
   │  ┌────▼──────────────┐                    │
   │  │  pending callbacks│ ← I/O errors       │
   │  └────┬──────────────┘                    │
   │       │                                   │
   │  ┌────▼──────────┐                        │
   │  │  idle, prepare│ ← internal use only    │
   │  └────┬──────────┘                        │
   │       │                                   │
   │  ┌────▼──────┐                            │
   │  │    poll   │ ← retrieve new I/O events  │
   │  └────┬──────┘   block here if no work    │
   │       │                                   │
   │  ┌────▼──────┐                            │
   │  │   check   │ ← setImmediate callbacks   │
   │  └────┬──────┘                            │
   │       │                                   │
   │  ┌────▼────────────────┐                  │
   │  │  close callbacks    │ ← socket.close() │
   │  └────┬────────────────┘                  │
   │       │                                   │
   │       └──────────────────────────────────►│
   │       (next iteration / tick)             │
   └───────────────────────────────────────────┘

  *** After every callback, the process.nextTick queue and the Promise microtask queue are drained before the loop continues ***

7.2 Phase-by-Phase Breakdown

Phase 1: Timers

  • Executes callbacks scheduled by setTimeout() and setInterval().

  • Does NOT execute a timer the moment its delay expires.

  • Checks if the threshold (delay) has passed; if yes, runs the callback.

  • A delay of 0 is clamped to 1 ms; the callback still waits for the next timers phase rather than running immediately.

  • Actual resolution: minimum ~1ms due to OS scheduler granularity.

setTimeout(() => console.log('timer'), 0);
// This runs in Phase 1 of NEXT iteration, not immediately.

Phase 2: Pending Callbacks

  • Executes I/O callbacks deferred to next iteration.

  • Mostly TCP errors (e.g., ECONNREFUSED errors from network ops).

  • Not commonly seen in application code.

Phase 3: Idle, Prepare

  • Internal to libuv only.

  • idle handles: run every iteration.

  • prepare handles: run before the poll phase blocks.

  • Not accessible from JavaScript.

Phase 4: Poll

The most important phase — where Node.js spends most of its time.

Two functions:

  1. Calculate how long to block/wait for new I/O events.

  2. Process events in the poll queue.

Blocking logic:

if (setImmediate queue is not empty)
  → poll with timeout = 0 (don't block, check for I/O then move on)
else if (any timers are pending)
  → poll with timeout = time until next timer fires
else
  → block indefinitely until I/O arrives

When I/O completes: callbacks added to poll queue → executed immediately.

Phase 5: Check

  • Executes setImmediate() callbacks.

  • Always runs after poll phase.

  • setImmediate is always deterministically after I/O callbacks (unlike setTimeout(fn, 0) which is not guaranteed).

const fs = require('fs');
fs.readFile('./file.txt', () => {
  setTimeout(() => console.log('timeout'), 0);
  setImmediate(() => console.log('immediate'));
  // 'immediate' ALWAYS prints first in this context.
  // Outside I/O callbacks, order is non-deterministic.
});

Phase 6: Close Callbacks

  • Executes close event callbacks.

  • Examples: socket.on('close', ...), server.close().

  • Cleanup phase.

7.3 Between Every Phase — Microtask Checkpoint

After each macro task callback completes (since Node 11, between individual callbacks, not only between phases), Node.js drains:

  1. process.nextTick queue — fully drained first

  2. Promise microtask queue — fully drained next

This continues until both queues are empty before moving to the next event loop phase.

Promise.resolve().then(() => console.log('Promise 1'));
process.nextTick(() => console.log('nextTick 1'));
Promise.resolve().then(() => console.log('Promise 2'));
process.nextTick(() => console.log('nextTick 2'));

// Output:
// nextTick 1
// nextTick 2
// Promise 1
// Promise 2

8. Macro Tasks (Task Queue)

Macro tasks are scheduled by the runtime to run in future event loop iterations. Each macro task runs to completion before the next one starts.

8.1 Macro Task Sources

Source                 | Phase   | Notes
setTimeout(fn, delay)  | Timers  | Minimum delay ~1 ms, not real-time
setInterval(fn, delay) | Timers  | Repeating timer
setImmediate(fn)       | Check   | Node.js only, runs after I/O
I/O callbacks          | Poll    | Network, file system completions
Pending callbacks      | Pending | Deferred I/O errors
Close callbacks        | Close   | Socket/server close events

8.2 setTimeout Internals

setTimeout(callback, delay, ...args);
  1. Node.js passes timer to libuv.

  2. libuv tracks the timer in a min-heap sorted by expiry time.

  3. In Timers phase, libuv checks the top of the heap.

  4. If now >= expiryTime → callback queued.

  5. Multiple timers with same delay are grouped.

// Timers with same delay — order is FIFO within same delay bucket
setTimeout(() => console.log('A'), 100);
setTimeout(() => console.log('B'), 100);
// A always before B

8.3 setInterval Internals

  • Implemented using a timer that re-schedules itself after each execution.

  • The delay is from the start of the previous callback, not the end.

  • If callback takes longer than delay → next execution fires immediately after (no queuing backlog).

// If callback takes 200ms but delay is 100ms:
setInterval(() => {
  // takes 200ms
}, 100);
// Next iteration fires immediately after 200ms, NOT at 300ms.

8.4 setImmediate vs setTimeout(fn, 0)

Feature              | setImmediate            | setTimeout(fn, 0)
Phase                | Check (Phase 5)         | Timers (Phase 1)
Inside I/O callback  | Always runs first       | Always runs second
Outside I/O callback | Non-deterministic order | Non-deterministic order
Use case             | Run after current I/O   | General deferred execution
Performance          | Slightly faster         | Slightly more overhead


9. Micro Tasks (Microtask Queue)

Microtasks run as soon as the currently executing script or callback completes, before the next macro task starts. They have higher priority than macro tasks.

9.1 Microtask Sources

Source                     | Queue           | Priority | Notes
process.nextTick(fn)       | nextTick queue  | Highest  | Node.js-specific, runs before Promises
Promise.resolve().then(fn) | Promise queue   | High     | Standard microtask
queueMicrotask(fn)         | Promise queue   | High     | Explicit microtask scheduling
MutationObserver           | Microtask queue | High     | Browser only, not in Node.js

9.2 process.nextTick — How It Works

  • NOT part of the event loop phases — it's a special queue checked after every operation.

  • Always drains completely before Promises.

  • Can starve the event loop if used recursively without limit.

function recursiveNextTick() {
  process.nextTick(recursiveNextTick); // ← DANGEROUS: starves I/O!
}
// Never do this. I/O will never get processed.

Use cases for process.nextTick:

// 1. Allow event listener registration before emitting
class MyEmitter extends EventEmitter {
  constructor() {
    super();
    process.nextTick(() => this.emit('ready')); // Emit AFTER constructor returns
  }
}

const emitter = new MyEmitter();
emitter.on('ready', () => console.log('Ready!')); // Registered in time ✓

// 2. Ensure consistent async behavior
function getUser(id, callback) {
  const cached = cache.get(id);
  if (cached) {
    process.nextTick(() => callback(null, cached)); // Always async, even when cached
    return;
  }
  db.find(id, callback);
}

9.3 Promise Microtask Queue

  • Created by .then(), .catch(), .finally(), await.

  • Runs after nextTick queue is empty.

  • Also fully drains before next event loop phase.

Promise.resolve()
  .then(() => {
    console.log('then 1');
    return Promise.resolve(); // Nested promise adds another microtask
  })
  .then(() => console.log('then 2'));

process.nextTick(() => console.log('nextTick'));

// Output:
// nextTick
// then 1
// then 2

10. Macro vs Micro — Priority, Order, and Scheduling

10.1 Complete Execution Order

console.log('1: script start');

setTimeout(() => console.log('7: setTimeout'), 0);

Promise.resolve()
  .then(() => console.log('4: promise 1'))
  .then(() => console.log('5: promise 2'));

process.nextTick(() => console.log('3: nextTick'));

setImmediate(() => console.log('8: setImmediate'));

queueMicrotask(() => console.log('6: queueMicrotask'));

console.log('2: script end');

// Output:
// 1: script start
// 2: script end
// 3: nextTick          ← nextTick queue (before promises)
// 4: promise 1         ← promise microtask
// 5: promise 2         ← another promise microtask (chained)
// 6: queueMicrotask    ← also a promise microtask
// 7: setTimeout        ← macro task (timers phase)
// 8: setImmediate      ← macro task (check phase)

10.2 Priority Ladder

HIGHEST PRIORITY
      │
      │  Synchronous code (call stack)
      │
      │  process.nextTick queue (fully drained)
      │
      │  Promise microtask queue (fully drained)
      │
      │  setImmediate (check phase)
      │
      │  setTimeout / setInterval (timers phase)
      │
      │  I/O callbacks (poll phase)
      │
LOWEST PRIORITY (in terms of scheduling)

10.3 Starvation Scenarios

// Starving the event loop with nextTick:
function starve() {
  process.nextTick(starve); // I/O will never run!
}

// Starving with promises:
function starveWithPromise() {
  Promise.resolve().then(starveWithPromise); // Also blocks I/O
}

// Safe recursive async: use setImmediate to yield
function safeRecursive() {
  setImmediate(safeRecursive); // Yields between iterations, I/O can run
}

10.4 Full Comparison Table

Feature          | process.nextTick         | Promise.then   | queueMicrotask    | setImmediate        | setTimeout(fn, 0)
Type             | Microtask                | Microtask      | Microtask         | Macro task          | Macro task
Node.js only     | ✅ Yes                   | ❌ No          | ❌ No             | ✅ Yes              | ❌ No
Priority         | Highest                  | High           | High              | Medium              | Medium
Can starve I/O   | ✅ Yes                   | ✅ Yes         | ✅ Yes            | ❌ No               | ❌ No
Event loop phase | N/A (between all)        | N/A            | N/A               | Check (5)           | Timers (1)
Use case         | Ensure async, API design | Promises/async | Generic microtask | After I/O completes | General deferred


11. CPU Scheduling & Thread Model

11.1 Node.js Thread Model

Despite being "single-threaded," Node.js actually uses multiple threads internally:

┌────────────────────────────────────────────────────────────────┐
│  Node.js Process                                               │
│                                                                │
│  Main Thread                                                   │
│  ┌──────────────────────────────────────────────────────────┐  │
│  │  Your JavaScript + V8 + Event Loop (single threaded)    │  │
│  └──────────────────────────────────────────────────────────┘  │
│                                                                │
│  libuv Thread Pool (default: 4 threads)                        │
│  ┌────────────┐  ┌────────────┐  ┌────────────┐  ┌─────────┐  │
│  │  Thread 1  │  │  Thread 2  │  │  Thread 3  │  │Thread 4 │  │
│  │  (worker)  │  │  (worker)  │  │  (worker)  │  │(worker) │  │
│  └────────────┘  └────────────┘  └────────────┘  └─────────┘  │
│                                                                │
│  V8 Internal Threads (GC, JIT)                                 │
│  ┌────────────┐  ┌────────────┐                                │
│  │GC Marking  │  │  JIT       │                                │
│  │(concurrent)│  │Compilation │                                │
│  └────────────┘  └────────────┘                                │
└────────────────────────────────────────────────────────────────┘

11.2 How the OS Schedules Node.js Threads

The OS scheduler (Linux: CFS — Completely Fair Scheduler; Windows: preemptive multitasking scheduler) treats Node.js threads like any other threads.

Key OS scheduling concepts:

Concept              | Description                                     | Impact on Node.js
Time slice / quantum | Maximum CPU time per thread (typically 1-20 ms) | An event loop iteration can be preempted
Priority             | Lower-priority threads get less CPU             | Node.js runs at default priority
Context switch       | OS saves/restores thread state                  | Overhead when switching threads
CPU affinity         | Binding a thread to a specific core             | Can reduce cache misses
Preemption           | OS interrupts a running thread                  | Can happen during JS execution
Blocking             | Thread sleeps waiting for I/O                   | libuv thread pool threads block on disk etc.

11.3 What "Non-Blocking" Really Means

WITHOUT Node.js (traditional blocking server):

Request 1 → Thread 1 → [wait for DB 100ms] → response    (thread blocked during wait)
Request 2 → Thread 2 → [wait for DB 100ms] → response
Request 3 → Thread 3 → [wait for DB 100ms] → response
1000 requests → need 1000 threads → memory exhaustion!

WITH Node.js (non-blocking event loop):

Request 1 → [start DB query, register callback] → returns immediately
Request 2 → [start DB query, register callback] → returns immediately  
Request 3 → [start DB query, register callback] → returns immediately
DB result 1 arrives → event loop picks up → callback runs → response
DB result 2 arrives → event loop picks up → callback runs → response
1000 requests → 1 main thread + callbacks → low memory!

11.4 The C10K Problem and Why Node.js Solves It

The C10K problem: how to handle 10,000+ concurrent connections. Traditional multi-threaded servers fail because:

  • Each thread: ~8MB stack = 80GB RAM for 10K threads.

  • Context switching overhead becomes dominant.

Node.js solution: One thread, multiplexed via epoll/kqueue — OS tells us when data is ready, we don't waste threads waiting.

11.5 CPU-Bound Work — The Problem

// This BLOCKS the event loop — no I/O can be processed while this runs!
app.get('/compute', (req, res) => {
  let result = 0;
  for (let i = 0; i < 1e10; i++) result += i; // blocks for seconds!
  res.json({ result });
});

Solutions for CPU-intensive work:

| Solution | Mechanism | Use Case |
|---|---|---|
| worker_threads | True parallelism in same process | CPU-bound JS code |
| child_process | Separate OS process | Isolation, any language |
| cluster | Multiple Node processes on same port | Web server scaling |
| Native add-ons (N-API) | C++ in thread pool | Maximum performance |
| Offload to microservice | External process via HTTP/gRPC | Architecture-level scaling |


12. Promises — Internals and Lifecycle

12.1 Promise States

            resolve(value)
           ─────────────────►  FULFILLED  ──► .then(onFulfilled)
PENDING ──►
           ─────────────────►  REJECTED   ──► .catch(onRejected)
            reject(reason)

A Promise is always in one of three states:

  • Pending — initial state, neither fulfilled nor rejected

  • Fulfilled — operation completed successfully

  • Rejected — operation failed

Once settled (fulfilled or rejected), a Promise is immutable — it can never change state.
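A quick sketch of that immutability — once the first resolve or reject wins, later calls are silently ignored:

```javascript
const p = new Promise((resolve, reject) => {
  resolve('first');                 // settles the promise
  resolve('second');                // ignored — already fulfilled
  reject(new Error('too late'));    // also ignored
});

p.then(v => console.log(v));        // logs 'first'
```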

12.2 Promise Constructor Internals

const p = new Promise((resolve, reject) => {
  // This function (executor) runs SYNCHRONOUSLY
  console.log('executor runs now');

  // resolve/reject are special internal functions that:
  // 1. Change the Promise's internal state
  // 2. Store the value/reason
  // 3. Schedule all registered .then/.catch callbacks as microtasks
  resolve(42);
});

console.log('after new Promise');
p.then(v => console.log('then:', v));

// Output:
// executor runs now
// after new Promise
// then: 42

12.3 Promise Chaining — Internals

Each .then() returns a new Promise:

const p1 = Promise.resolve(1);
const p2 = p1.then(v => v + 1);  // new Promise
const p3 = p2.then(v => v * 2);  // another new Promise

// p1 fulfills → schedules p2's resolver as microtask
// microtask runs → p2 resolves with 2 → schedules p3's resolver
// microtask runs → p3 resolves with 4

12.4 Promise Static Methods — Complete List

| Method | Signature | Resolves When | Rejects When |
|---|---|---|---|
| Promise.resolve(v) | resolve(value) | Immediately | Never |
| Promise.reject(r) | reject(reason) | Never | Immediately |
| Promise.all(iterable) | all([p1, p2, ...]) | All fulfill | Any one rejects (fail-fast) |
| Promise.allSettled(iterable) | allSettled([p1, p2]) | All settle (any state) | Never |
| Promise.any(iterable) | any([p1, p2]) | Any one fulfills | All reject (AggregateError) |
| Promise.race(iterable) | race([p1, p2]) | First settled promise fulfills | First settled promise rejects |

// Promise.all — fails fast
await Promise.all([
  fetch('/api/users'),    // all three run concurrently
  fetch('/api/products'),
  fetch('/api/orders'),
]); // If any one fails, throws immediately

// Promise.allSettled — never throws
const results = await Promise.allSettled([p1, p2, p3]);
results.forEach(r => {
  if (r.status === 'fulfilled') console.log(r.value);
  else console.log(r.reason);
});

// Promise.any — first success wins
const fastestSource = await Promise.any([
  fetch('https://cdn1.example.com/data'),
  fetch('https://cdn2.example.com/data'),
]);

// Promise.race — first to settle (could be rejection)
const result = await Promise.race([
  fetch('/slow-api'),
  new Promise((_, reject) => setTimeout(() => reject(new Error('Timeout')), 3000)),
]);

12.5 Promise Instance Methods

| Method | Description |
|---|---|
| .then(onFulfilled, onRejected) | Handle fulfillment and/or rejection |
| .catch(onRejected) | Handle rejection only (sugar for .then(undefined, onRejected)) |
| .finally(onFinally) | Run regardless of outcome; passes the value/reason through |
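The .finally() pass-through behavior is worth seeing concretely — its return value is discarded (unless it throws), and the original value or reason continues down the chain:

```javascript
Promise.resolve(42)
  .finally(() => 'discarded')   // return value is ignored
  .then(v => console.log(v));   // logs 42, not 'discarded'

Promise.reject(new Error('boom'))
  .finally(() => console.log('cleanup runs either way'))
  .catch(err => console.log(err.message)); // 'boom' passes through
```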


13. async/await — What the Engine Actually Does

async/await is syntactic sugar over Promises. The engine transforms it using generator-like mechanics.

13.1 What async Does

async function fetchUser() {
  return 42;
}

// Is effectively equivalent to:
function fetchUser() {
  return Promise.resolve(42);
}

An async function always returns a Promise. If you return a non-Promise value, it's wrapped in Promise.resolve().

13.2 What await Does

async function getUser() {
  const result = await fetch('/api/user'); // SUSPENDS here
  return result.json();
}

await does:

  1. Evaluates the expression (fetch('/api/user')) — returns a Promise.

  2. Pauses the async function execution.

  3. The function returns a pending Promise to its caller immediately.

  4. The event loop continues with other work.

  5. When the awaited Promise resolves, schedules resumption as a microtask.

  6. On the microtask checkpoint, the function resumes from where it paused.
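The suspend/resume steps above can be observed directly — code after an await always runs as a microtask, after the caller's synchronous code has finished:

```javascript
const log = [];

async function demo() {
  log.push('before await');       // runs synchronously
  await Promise.resolve();        // suspends; control returns to the caller
  log.push('after await');        // resumes later, as a microtask
}

demo();
log.push('caller continues');

queueMicrotask(() => console.log(log));
// ['before await', 'caller continues', 'after await']
```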

13.3 Engine Transformation (Conceptual)

// What you write:
async function example() {
  console.log('start');
  const x = await somePromise;
  console.log('after await:', x);
  return x + 1;
}

// Conceptual engine transformation:
function example() {
  return new Promise((resolve, reject) => {
    console.log('start');

    somePromise.then(
      (x) => {
        console.log('after await:', x);
        resolve(x + 1);
      },
      reject
    );
  });
}

13.4 Multiple awaits and Parallelism

// ❌ Sequential — 300ms total
async function sequential() {
  const a = await fetch('/api/a'); // wait 100ms
  const b = await fetch('/api/b'); // wait 100ms
  const c = await fetch('/api/c'); // wait 100ms
  return [a, b, c];
}

// ✅ Parallel — 100ms total
async function parallel() {
  const [a, b, c] = await Promise.all([
    fetch('/api/a'),
    fetch('/api/b'),
    fetch('/api/c'),
  ]);
  return [a, b, c];
}

// ✅ Also parallel — start all, await later
async function parallelV2() {
  const pA = fetch('/api/a'); // starts immediately
  const pB = fetch('/api/b'); // starts immediately
  const pC = fetch('/api/c'); // starts immediately
  return [await pA, await pB, await pC]; // just waiting
}

13.5 Error Handling with async/await

// Method 1: try/catch
async function safe() {
  try {
    const data = await riskyOperation();
    return data;
  } catch (err) {
    console.error('Failed:', err);
    throw err; // re-throw if needed
  }
}

// Method 2: .catch on the awaited promise
async function withCatch() {
  const data = await riskyOperation().catch(err => null);
  if (!data) return defaultValue;
  return data;
}

// Method 3: Utility wrapper
async function to(promise) {
  try {
    return [null, await promise];
  } catch (err) {
    return [err, null];
  }
}

const [err, data] = await to(riskyOperation());
if (err) handleError(err);

14. Streams — Backpressure and Flow Control

Streams allow processing data piece by piece — critical for large files or network responses without loading everything into memory.

14.1 Stream Types

| Type | Class | Direction | Example |
|---|---|---|---|
| Readable | stream.Readable | Source | fs.createReadStream, http.IncomingMessage |
| Writable | stream.Writable | Sink | fs.createWriteStream, http.ServerResponse |
| Duplex | stream.Duplex | Both | net.Socket |
| Transform | stream.Transform | Both + modifies data | zlib.createGzip, crypto.createCipheriv |
| PassThrough | stream.PassThrough | Both (no change) | Logging middleware |

14.2 Readable Stream Modes

| Mode | Description | Triggered By |
|---|---|---|
| Paused (default) | Data only produced when asked | stream.read() |
| Flowing | Data produced as fast as possible | 'data' event listener, .resume(), .pipe() |

// Paused mode (manual reads):
const fs = require('fs');
const readable = fs.createReadStream('./big.txt');
readable.on('readable', () => {
  let chunk;
  while ((chunk = readable.read(1024)) !== null) {
    process(chunk);
  }
});

// Flowing mode (event-driven):
readable.on('data', (chunk) => process(chunk));
readable.on('end', () => console.log('done'));

14.3 Backpressure — The Most Important Stream Concept

Backpressure = when the writable can't keep up with the readable. Without handling it, you buffer everything in memory (memory leak).

// Without backpressure handling — BAD!
readable.on('data', (chunk) => {
  writable.write(chunk); // ignoring return value!
});

// ✅ With backpressure handling:
readable.on('data', (chunk) => {
  const canContinue = writable.write(chunk); // returns false when buffer full
  if (!canContinue) {
    readable.pause(); // Stop producing!
    writable.once('drain', () => readable.resume()); // Resume when drained
  }
});

// ✅ Best: use pipe() which handles backpressure automatically!
readable.pipe(writable);

// ✅ Even better: use pipeline() which handles errors too
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');
pipeline(
  fs.createReadStream('./input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('./output.gz'),
  (err) => { if (err) console.error('Pipeline failed:', err); }
);

14.4 Custom Streams

const { Readable, Writable, Transform } = require('stream');

// Custom Readable
class NumberStream extends Readable {
  constructor(max) {
    super({ objectMode: true }); // objectMode: emit objects, not Buffers
    this.current = 0;
    this.max = max;
  }
  _read() {
    if (this.current <= this.max) {
      this.push(this.current++);
    } else {
      this.push(null); // Signal end of stream
    }
  }
}

// Custom Transform
class DoubleTransform extends Transform {
  constructor() { super({ objectMode: true }); }
  _transform(chunk, encoding, callback) {
    this.push(chunk * 2);
    callback(); // Signal this chunk is processed
  }
}

// Custom Writable
class PrintWritable extends Writable {
  constructor() { super({ objectMode: true }); }
  _write(chunk, encoding, callback) {
    console.log(chunk);
    callback();
  }
}

// Use pipeline:
const { pipeline } = require('stream');
pipeline(
  new NumberStream(10),
  new DoubleTransform(),
  new PrintWritable(),
  (err) => console.log('Done:', err)
);

14.5 Stream Events — Complete List

| Event | Stream Types | Fired When |
|---|---|---|
| 'data' | Readable | Chunk available (flowing mode) |
| 'end' | Readable | No more data to consume |
| 'readable' | Readable | Data available to read (paused mode) |
| 'error' | All | Error occurred |
| 'close' | All | Stream and underlying resources are closed |
| 'finish' | Writable | All data flushed |
| 'drain' | Writable | Internal buffer emptied, safe to write again |
| 'pipe' | Writable | pipe() called on a readable pointing here |
| 'unpipe' | Writable | unpipe() called |


15. Node.js Module System (CJS vs ESM)

15.1 CommonJS (CJS) — How require() Works Internally

// When you write:
const fs = require('fs');

// Node.js does this internally:
// 1. Resolve the module path (search algorithm below)
// 2. Check module cache: if (Module._cache[filename]) return cache
// 3. Create a new Module object
// 4. Register it in cache (before executing! Prevents circular dep loops)
// 5. Wrap the file in a function:
(function(exports, require, module, __filename, __dirname) {
  // Your module code here
});
// 6. Execute the wrapper
// 7. Return module.exports

15.2 Module Resolution Algorithm (CJS)

require('X') from /path/to/parent/file.js  (let Y = /path/to/parent)

1. Is X a core module (fs, path, http, ...)? → return it immediately

2. Does X start with './', '../', or '/'?
   → LOAD_AS_FILE(Y + X):
      Try: Y/X, Y/X.js, Y/X.json, Y/X.node
   → LOAD_AS_DIRECTORY(Y + X):
      Try: Y/X/package.json → "main" field → load that file
      Try: Y/X/index.js, Y/X/index.json, Y/X/index.node
   → If not found: THROW "MODULE_NOT_FOUND"

3. Otherwise → LOAD_NODE_MODULES(X, Y):
   For each dir in the node_modules lookup path:
     Try: dir/X (same file/directory rules as step 2)
   Lookup path: [Y/node_modules, parent/node_modules, ..., /node_modules]

15.3 Circular Dependencies in CJS

// a.js
const b = require('./b');
console.log('a: b.done =', b.done);
exports.done = true;

// b.js
const a = require('./a');
// a.done is UNDEFINED here! (partially loaded snapshot)
console.log('b: a.done =', a.done);
exports.done = true;

Node.js handles circular deps by returning a partially-populated module.exports object. This is a common source of bugs. Solution: use lazy requires or restructure to avoid circular deps.

15.4 ES Modules (ESM) — Differences

| Feature | CJS (require) | ESM (import) |
|---|---|---|
| Loading | Synchronous | Asynchronous |
| Bindings | Copy of the value at require time | Live bindings (updates reflect) |
| Tree shaking | ❌ Not possible | ✅ Possible (static analysis) |
| Top-level await | ❌ Not supported | ✅ Supported |
| Circular deps | Partial object snapshot | Live bindings (safer) |
| __filename | ✅ Available | ❌ Use import.meta.url |
| __dirname | ✅ Available | ❌ Use path.dirname(fileURLToPath(import.meta.url)) |
| Named exports | exports.foo = bar | export const foo = bar |
| Default export | module.exports = x | export default x |
| File extension | .js, .cjs | .mjs, or .js with "type": "module" in package.json |
| Interop | Can require() other CJS (not ESM) | Can import CJS; named imports from CJS are limited |

15.5 ESM Loading Phases

  1. Construction — Fetch, parse, and build module graph (find all import statements)

  2. Instantiation — Allocate memory for all exports (not filled yet — just slots)

  3. Evaluation — Execute code and fill export slots with values

This is why ESM supports live bindings — the import is a reference to the slot, not a copied value.

// counter.mjs
export let count = 0;
export function increment() { count++; }

// main.mjs
import { count, increment } from './counter.mjs';
console.log(count); // 0
increment();
console.log(count); // 1 ← CJS would still show 0!

16. Buffers and Binary Data

Buffers represent fixed-size chunks of raw binary memory allocated outside of V8's heap.

16.1 Why Buffers Exist

  • JavaScript historically had no native byte-array type (Buffers predate TypedArray).

  • Network and file I/O produce raw bytes — need to handle them efficiently.

  • Buffers are allocated in C++ memory — not subject to V8's GC directly.

16.2 Buffer Creation Methods

| Method | Description | Use Case |
|---|---|---|
| Buffer.alloc(size) | Allocates zero-filled buffer | Safe creation (no old data) |
| Buffer.allocUnsafe(size) | Allocates without zeroing | Performance (must fill before use!) |
| Buffer.allocUnsafeSlow(size) | Like allocUnsafe but unpooled | Large buffers, avoids the pool |
| Buffer.from(array) | Create from byte array | Known byte values |
| Buffer.from(string, encoding) | Encode string to buffer | String → bytes |
| Buffer.from(buffer) | Copy a buffer | Immutable copy |
| Buffer.from(arrayBuffer, offset, length) | From ArrayBuffer | TypedArray interop |
| Buffer.concat(list, totalLength) | Merge multiple buffers | Combine chunks |
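A few of these creation methods in action (all standard Buffer API):

```javascript
const zeroed = Buffer.alloc(4);                  // <Buffer 00 00 00 00> — safe
const fast = Buffer.allocUnsafe(4).fill(0);      // must fill before trusting contents
const text = Buffer.from('héllo', 'utf8');       // 'é' takes 2 bytes → length 6
const bytes = Buffer.from([0xde, 0xad]);
const merged = Buffer.concat([bytes, Buffer.from([0xbe, 0xef])]);

console.log(merged.toString('hex')); // 'deadbeef'
```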

16.3 Buffer Encodings

| Encoding | Identifier | Use Case |
|---|---|---|
| UTF-8 | 'utf8' or 'utf-8' | Default text encoding |
| ASCII | 'ascii' | 7-bit ASCII only |
| UTF-16 LE | 'utf16le' | Windows text files |
| Base64 | 'base64' | Binary in text contexts |
| Base64 URL | 'base64url' | URL-safe base64 |
| Hex | 'hex' | Debug, checksums |
| Latin-1 | 'latin1' or 'binary' | ISO-8859-1 |

16.4 Buffer Methods

| Method | Description |
|---|---|
| buf.toString(encoding) | Decode buffer to string |
| buf.write(string, offset, encoding) | Write string into buffer |
| buf.slice(start, end) | Create view (shares memory!) |
| buf.subarray(start, end) | Same as slice |
| buf.copy(target, targetStart, sourceStart, sourceEnd) | Copy bytes |
| buf.fill(value, offset, end) | Fill with a value |
| buf.indexOf(value, byteOffset) | Find byte/string/buffer |
| buf.includes(value) | Check if value exists |
| buf.compare(other) | Lexicographic comparison |
| buf.equals(other) | Byte equality check |
| buf.readUInt8(offset) | Read unsigned 8-bit int |
| buf.readUInt16BE(offset) | Read big-endian 16-bit uint |
| buf.readUInt32LE(offset) | Read little-endian 32-bit uint |
| buf.readInt8/16/32 | Read signed integers |
| buf.readFloat/DoubleBE/LE | Read floating point |
| buf.writeUInt8(value, offset) | Write unsigned 8-bit int |
| buf.writeUInt16BE/LE | Write 16-bit uint |
| buf.writeUInt32BE/LE | Write 32-bit uint |
| buf.writeInt8/16/32 | Write signed integers |
| buf.writeFloat/DoubleBE/LE | Write floating point |
| buf.toJSON() | { type: 'Buffer', data: [...] } |
| Buffer.isBuffer(obj) | Check if value is a Buffer |
| Buffer.isEncoding(encoding) | Check if encoding is valid |
| buf.swap16() | Swap byte order (16-bit) |
| buf.swap32() | Swap byte order (32-bit) |
| buf.swap64() | Swap byte order (64-bit) |
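Two behaviors from that table are worth seeing concretely — endianness in the read/write methods, and the fact that subarray() is a view sharing memory with its parent:

```javascript
const buf = Buffer.alloc(4);
buf.writeUInt32BE(0x01020304, 0);   // big-endian: most significant byte first
console.log(buf);                   // <Buffer 01 02 03 04>
console.log(buf.readUInt16LE(2));   // bytes 03 04 read little-endian → 0x0403 (1027)

const hello = Buffer.from('hello');
const view = hello.subarray(0, 1);  // a view, NOT a copy
view[0] = 0x48;                     // write 'H' through the view
console.log(hello.toString());      // 'Hello' — the parent changed too
```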


17. Worker Threads — True Parallelism

worker_threads allows running JavaScript in separate threads with shared memory via SharedArrayBuffer.

17.1 Architecture

Main Thread
┌─────────────────────────────────┐
│  V8 Isolate                     │
│  Event Loop                     │
│  JavaScript Heap                │
│                                 │
│  workerData: { task: 'heavy' }  │
│  postMessage(result) ◄──────────┼──┐
└──────────────────────────┬──────┘  │
                           │         │
                     spawn thread    │
                           │         │
                           ▼         │
Worker Thread              │         │
┌──────────────────────────┴──────┐  │
│  V8 Isolate (SEPARATE!)         │  │
│  Event Loop (SEPARATE!)         │  │
│  JavaScript Heap (SEPARATE!)    │  │
│                                 │  │
│  workerData.task === 'heavy'    │  │
│  // do CPU work                 │  │
│  parentPort.postMessage(result)─┼──┘
└─────────────────────────────────┘

17.2 Worker Threads API

const { Worker, isMainThread, parentPort, workerData,
        receiveMessageOnPort, MessageChannel } = require('worker_threads');
// Note: SharedArrayBuffer and Atomics are JavaScript globals — no require needed

// Main thread
if (isMainThread) {
  const worker = new Worker(__filename, {
    workerData: { numbers: [1, 2, 3, 4, 5] }
  });

  worker.on('message', (result) => console.log('Result:', result));
  worker.on('error', (err) => console.error(err));
  worker.on('exit', (code) => console.log(`Worker exited with code ${code}`));

  // Terminate early if needed
  // worker.terminate();

} else {
  // Worker thread
  const numbers = workerData.numbers;
  const sum = numbers.reduce((a, b) => a + b, 0);
  parentPort.postMessage(sum);
}

17.3 Shared Memory with SharedArrayBuffer

// Shared memory — no copying!
const sharedBuffer = new SharedArrayBuffer(4); // 4 bytes
const sharedArray = new Int32Array(sharedBuffer);

const worker = new Worker('./worker.js', {
  workerData: { sharedBuffer }
});

// Both threads read/write the same memory
// Use Atomics for thread-safe operations!
Atomics.store(sharedArray, 0, 42);                // Thread-safe write
Atomics.add(sharedArray, 0, 10);                  // Thread-safe increment → 52
Atomics.load(sharedArray, 0);                     // Thread-safe read
Atomics.compareExchange(sharedArray, 0, 52, 100); // CAS: if value is 52, set it to 100
Atomics.wait(sharedArray, 0, 100);                // Blocks while value === 100 — avoid on the main thread!
Atomics.notify(sharedArray, 0, 1);                // Wake up to 1 waiting thread

17.4 Worker Thread vs Child Process

| Feature | worker_threads | child_process |
|---|---|---|
| Memory | Shareable (SharedArrayBuffer) | Separate |
| Communication | Fast (shared memory/messages) | IPC or stdio (slower) |
| Isolation | Weak (can share state) | Strong (separate process) |
| Startup cost | Low | High |
| Crash isolation | ❌ Worker crash affects process | ✅ Independent processes |
| Language | JavaScript only | Any language |
| Use case | CPU-bound JS | Any blocking work, isolation |


18. Child Processes

child_process module creates separate OS processes.

18.1 Methods

| Method | Description | Use Case |
|---|---|---|
| spawn(cmd, args, opts) | Launch process, streaming stdio | Long-running, large output |
| exec(cmd, opts, cb) | Run command in a shell, buffer stdout | Short commands |
| execFile(file, args, opts, cb) | Like exec but no shell | Scripts, binaries |
| fork(modulePath, args, opts) | Spawn Node.js child with IPC channel | Node.js workers with messaging |

const { spawn, exec, execFile, fork } = require('child_process');

// spawn — streaming
const ls = spawn('ls', ['-la', '/usr']);
ls.stdout.on('data', (data) => console.log(data.toString()));
ls.stderr.on('data', (data) => console.error(data.toString()));
ls.on('close', (code) => console.log(`Process exited: ${code}`));

// exec — buffered (don't use for large output)
exec('git log --oneline -10', (err, stdout, stderr) => {
  if (err) return console.error(err);
  console.log(stdout);
});

// fork — Node.js IPC
const child = fork('./worker.js');
child.send({ task: 'compute', data: [1, 2, 3] });
child.on('message', (result) => console.log('Result:', result));

18.2 execSync / spawnSync — Synchronous Variants

Blocks the event loop — only use in startup scripts or CLIs, never in servers.

const { execSync, spawnSync } = require('child_process');

const output = execSync('git rev-parse HEAD').toString().trim();

19. Cluster Module — Multi-Core Scaling

Cluster allows creating multiple worker processes that share the same server port.

19.1 How Clustering Works

Master Process
┌─────────────────────────┐
│  cluster.fork() × 4     │
│  (for 4 CPU cores)      │
│  Receives connections   │
│  Round-robin distributes│
└────────┬────────────────┘
         │ distributes incoming connections
    ┌────┼────────────┐
    ▼    ▼    ▼       ▼
Worker  Worker  Worker  Worker
(pid1)  (pid2)  (pid3)  (pid4)
  │       │       │       │
  └───────┴───────┴───────┘
         port 3000

const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isPrimary) {
  const numCPUs = os.cpus().length;
  console.log(`Master ${process.pid} is running`);

  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }

  cluster.on('exit', (worker, code, signal) => {
    console.log(`Worker ${worker.process.pid} died, restarting...`);
    cluster.fork(); // Auto-restart dead workers
  });

} else {
  // Workers share the TCP server
  http.createServer((req, res) => {
    res.writeHead(200);
    res.end(`Handled by worker ${process.pid}`);
  }).listen(3000);

  console.log(`Worker ${process.pid} started`);
}

19.2 Cluster Scheduling Policies

| Policy | Description | Default On |
|---|---|---|
| SCHED_RR | Round-robin (primary distributes connections) | Linux, macOS |
| SCHED_NONE | OS handles distribution | Windows |


20. Event Emitter — Internal Pub/Sub

EventEmitter is the backbone of Node.js. Almost every core module (streams, HTTP, fs watchers) extends it.

20.1 EventEmitter Methods — Complete List

| Method | Description |
|---|---|
| emitter.on(event, listener) | Add persistent listener |
| emitter.addListener(event, listener) | Alias for .on() |
| emitter.once(event, listener) | Add one-time listener (auto-removes) |
| emitter.off(event, listener) | Remove specific listener |
| emitter.removeListener(event, listener) | Alias for .off() |
| emitter.removeAllListeners(event?) | Remove all listeners for event |
| emitter.emit(event, ...args) | Emit event synchronously |
| emitter.listeners(event) | Get array of listeners |
| emitter.rawListeners(event) | Get listeners including once wrappers |
| emitter.listenerCount(event) | Count listeners for event |
| emitter.eventNames() | List all registered event names |
| emitter.getMaxListeners() | Get max listener limit |
| emitter.setMaxListeners(n) | Set max listener limit (default: 10) |
| emitter.prependListener(event, fn) | Add listener to FRONT of queue |
| emitter.prependOnceListener(event, fn) | Add one-time listener to FRONT |
| EventEmitter.listenerCount(emitter, event) | Static version (deprecated) |
| EventEmitter.defaultMaxListeners | Class-level default max |
| EventEmitter.errorMonitor | Symbol for monitoring errors |
| events.once(emitter, event) | Promise-based: wait for event once |
| events.on(emitter, event) | Async iterator over events |

20.2 How emit Works Internally

// Simplified internal implementation:
emit(event, ...args) {
  const listeners = this._events[event];
  if (!listeners) return false;

  // Listeners are called SYNCHRONOUSLY in registration order
  const list = Array.isArray(listeners) ? listeners.slice() : [listeners];
  for (const listener of list) {
    listener.apply(this, args);
  }
  return true;
}

Key point: emit is synchronous. All listeners run before emit returns.

20.3 Memory Leak Warning

Node.js warns when more than 10 listeners are added for the same event (configurable). This is usually a sign of a bug (e.g., repeatedly adding listeners inside a loop or per-request handler).

emitter.setMaxListeners(0);   // Disable warning (risky)
emitter.setMaxListeners(20);  // Increase limit

20.4 events.on — Async Iterator (Node 12+)

const { on } = require('events');

async function processEvents(emitter) {
  for await (const [event] of on(emitter, 'data')) {
    console.log('Got data:', event);
  }
}

21. Timers — Deep Mechanics

21.1 All Timer APIs

| API | Queue Type | Minimum Delay | Cancellation | Event Loop Phase |
|---|---|---|---|---|
| setTimeout(fn, ms, ...args) | Macro | ~1ms | clearTimeout(id) | Timers |
| setInterval(fn, ms, ...args) | Macro (repeating) | ~1ms | clearInterval(id) | Timers |
| setImmediate(fn, ...args) | Macro | None (after I/O poll) | clearImmediate(id) | Check |
| process.nextTick(fn, ...args) | Micro | None | ❌ Cannot cancel | Between phases |
| queueMicrotask(fn) | Micro | None | ❌ Cannot cancel | Between phases |
| Promise.resolve().then(fn) | Micro | None | ❌ Cannot cancel | Between phases |

21.2 Timer Accuracy — Why setTimeout(fn, 0) Is Not Instant

  • OS scheduler has minimum resolution (~1-15ms depending on OS and power settings).

  • Node.js timer implementation adds minimal overhead.

  • The event loop may be busy with I/O poll.

  • Timers fire only in the timers phase — if the loop is busy in another phase, the callback waits for the next iteration.

const start = Date.now();
setTimeout(() => {
  console.log(`Elapsed: ${Date.now() - start}ms`); // Could be 1-20ms, not 0!
}, 0);

21.3 ref() and unref()

By default, active timers keep the process alive. .unref() lets the process exit even if the timer hasn't fired.

const timer = setTimeout(() => {
  console.log('This may never run if process exits first');
}, 10000);

timer.unref(); // Don't keep process alive for this timer

// timer.ref() to re-attach

21.4 timers/promises API (Node 15+)

Promise-based timer APIs — work with async/await:

const { setTimeout: delay, setInterval: interval, setImmediate: immediate }
  = require('timers/promises');

// Wait 1 second
await delay(1000);
await delay(1000, 'value'); // resolves with 'value'

// Process with timeout
const result = await Promise.race([
  fetchData(),
  delay(5000, null), // timeout returns null
]);

// Async iteration with interval
for await (const _ of interval(1000)) {
  console.log('tick', Date.now());
}

22. Error Handling — The Complete Picture

22.1 Error Types in Node.js

| Type | Example | How To Catch |
|---|---|---|
| Synchronous exceptions | JSON.parse('bad') | try/catch |
| Async callback errors | fs.readFile err argument | Check err parameter |
| Promise rejections | await fetch('bad-url') | try/catch, .catch() |
| EventEmitter errors | stream.on('error', ...) | 'error' event listener |
| Uncaught exceptions | Unhandled throw | process.on('uncaughtException') |
| Unhandled rejections | .then() without .catch() | process.on('unhandledRejection') |

22.2 Error Object Properties

| Property | Type | Description |
|---|---|---|
| err.message | String | Human-readable description |
| err.name | String | Error class name ('TypeError', 'RangeError', ...) |
| err.stack | String | Stack trace string |
| err.code | String | Node.js/OS error code ('ENOENT', 'ECONNREFUSED') |
| err.errno | Number | OS error number |
| err.syscall | String | System call that failed ('read', 'connect') |
| err.path | String | File path (for fs errors) |
| err.address | String | Network address (for net errors) |
| err.port | Number | Network port (for net errors) |

22.3 Common Error Codes

| Code | Meaning |
|---|---|
| ENOENT | No such file or directory |
| EACCES | Permission denied |
| EEXIST | File already exists |
| EISDIR | Is a directory (expected a file) |
| ENOTDIR | Not a directory |
| ECONNREFUSED | Connection refused |
| ECONNRESET | Connection reset by peer |
| ETIMEDOUT | Connection timed out |
| EPIPE | Broken pipe (write to closed connection) |
| EADDRINUSE | Address already in use (port taken) |
| EMFILE | Too many open files |
| ENOTEMPTY | Directory not empty |

22.4 Domains (Deprecated) and Modern Alternatives

Domains were deprecated in Node.js v4 — do not use. Use AsyncLocalStorage instead for async context tracking.

const { AsyncLocalStorage } = require('async_hooks');
const { randomUUID } = require('crypto');
const requestStorage = new AsyncLocalStorage();

app.use((req, res, next) => {
  requestStorage.run({ requestId: randomUUID() }, next);
});

app.get('/api', async (req, res) => {
  const { requestId } = requestStorage.getStore();
  // requestId is available throughout the entire async chain!
});

22.5 Global Error Handlers

// Last-resort error handling — log and exit!
process.on('uncaughtException', (err, origin) => {
  console.error('Uncaught Exception:', err, 'Origin:', origin);
  // MUST exit — process state is unknown after uncaughtException!
  process.exit(1);
});

process.on('unhandledRejection', (reason, promise) => {
  console.error('Unhandled Rejection at:', promise, 'reason:', reason);
  // In Node 15+, this crashes the process by default
});

process.on('warning', (warning) => {
  console.warn(warning.name, warning.message, warning.stack);
});

23. Core Node.js APIs Reference

23.1 fs Module — File System

| Method | Description |
|---|---|
| fs.readFile(path, opts, cb) | Read entire file |
| fs.readFileSync(path, opts) | Synchronous read |
| fs.writeFile(path, data, opts, cb) | Write file (creates or overwrites) |
| fs.writeFileSync(path, data, opts) | Synchronous write |
| fs.appendFile(path, data, opts, cb) | Append to file |
| fs.open(path, flags, mode, cb) | Open file, get fd |
| fs.close(fd, cb) | Close file descriptor |
| fs.read(fd, buffer, offset, length, position, cb) | Read into buffer |
| fs.write(fd, buffer, offset, length, position, cb) | Write from buffer |
| fs.stat(path, cb) | Get file stats |
| fs.lstat(path, cb) | Stat (don't follow symlinks) |
| fs.fstat(fd, cb) | Stat by file descriptor |
| fs.access(path, mode, cb) | Check access permissions |
| fs.exists(path, cb) | Check existence (deprecated, use access) |
| fs.mkdir(path, opts, cb) | Create directory |
| fs.mkdirSync(path, opts) | Synchronous mkdir |
| fs.rmdir(path, opts, cb) | Remove directory |
| fs.rm(path, opts, cb) | Remove file or directory |
| fs.readdir(path, opts, cb) | List directory contents |
| fs.rename(oldPath, newPath, cb) | Move/rename file |
| fs.copyFile(src, dest, flags, cb) | Copy file |
| fs.unlink(path, cb) | Delete file |
| fs.symlink(target, path, type, cb) | Create symlink |
| fs.readlink(path, opts, cb) | Read symlink target |
| fs.realpath(path, opts, cb) | Resolve to absolute path |
| fs.chmod(path, mode, cb) | Change file permissions |
| fs.chown(path, uid, gid, cb) | Change file ownership |
| fs.truncate(path, len, cb) | Truncate file |
| fs.watch(path, opts, listener) | Watch for changes |
| fs.watchFile(path, opts, listener) | Poll for file changes |
| fs.unwatchFile(path, listener) | Stop watching file |
| fs.createReadStream(path, opts) | Create readable stream |
| fs.createWriteStream(path, opts) | Create writable stream |
| fs.promises.* | Promise-based versions of all the above |

fs.Stats object properties:

| Property | Description |
|---|---|
| stats.isFile() | Is it a regular file? |
| stats.isDirectory() | Is it a directory? |
| stats.isSymbolicLink() | Is it a symlink? |
| stats.size | File size in bytes |
| stats.mode | File permission bits |
| stats.uid / stats.gid | User/group ID |
| stats.atime | Last access time |
| stats.mtime | Last modification time |
| stats.ctime | Last status change time |
| stats.birthtime | Creation time |

23.2 path Module

| Method | Description | Example |
|---|---|---|
| path.join(...parts) | Join path parts | path.join('/home', 'user', 'file.txt') → '/home/user/file.txt' |
| path.resolve(...parts) | Resolve to absolute path | path.resolve('file.txt') → '/cwd/file.txt' |
| path.dirname(path) | Get directory | '/home/user/file.txt' → '/home/user' |
| path.basename(path, ext) | Get filename | '/home/user/file.txt' → 'file.txt' |
| path.extname(path) | Get extension | 'file.txt' → '.txt' |
| path.parse(path) | Parse into parts | { root, dir, base, ext, name } |
| path.format(obj) | Opposite of parse | |
| path.isAbsolute(path) | Check if absolute | true/false |
| path.relative(from, to) | Get relative path between two | |
| path.normalize(path) | Normalize '..' and '.' | |
| path.sep | Path separator | '/' on Unix, '\\' on Windows |
| path.delimiter | PATH separator | ':' on Unix, ';' on Windows |
| path.posix | Force POSIX behavior | |
| path.win32 | Force Windows behavior | |

23.3 http / https Module

const http = require('http');

// Create server
const server = http.createServer((req, res) => {
  // req: http.IncomingMessage (Readable stream)
  // res: http.ServerResponse (Writable stream)

  req.method;         // 'GET', 'POST', etc.
  req.url;            // '/path?query'
  req.headers;        // Object of headers
  req.httpVersion;    // '1.1'
  req.socket;         // Underlying TCP socket

  res.statusCode = 200;
  res.statusMessage = 'OK';
  res.setHeader('Content-Type', 'application/json');
  res.writeHead(200, { 'Content-Type': 'text/html' }); // Set status + headers at once
  res.write('partial data');
  res.end('final data'); // or res.end()
});

server.listen(3000, '0.0.0.0', () => console.log('Server running'));
server.close(callback); // Stop accepting connections

// Make HTTP request
const req = http.request({
  hostname: 'api.example.com',
  port: 80,
  path: '/data',
  method: 'GET',
  headers: { 'Content-Type': 'application/json' }
}, (res) => {
  let data = '';
  res.on('data', chunk => data += chunk);
  res.on('end', () => console.log(JSON.parse(data)));
});
req.on('error', console.error);
req.end();

// Simple GET
http.get('http://example.com', (res) => { /* ... */ });

23.4 net Module — TCP/IPC

const net = require('net');

// TCP Server
const server = net.createServer((socket) => {
  socket.write('Hello\n');
  socket.on('data', (data) => { /* ... */ });
  socket.on('end', () => { /* ... */ });
  socket.destroy(); // Forcefully close
  socket.end('bye'); // Graceful close
});
server.listen(8080);

// TCP Client
const client = net.createConnection({ host: '127.0.0.1', port: 8080 }, () => {
  client.write('Hello server');
});
client.on('data', (data) => console.log(data.toString()));

23.5 crypto Module

| Category | Methods |
|---|---|
| Hashing | crypto.createHash('sha256'), .update(data), .digest('hex') |
| HMAC | crypto.createHmac('sha256', secret) |
| Symmetric | crypto.createCipheriv(algo, key, iv), crypto.createDecipheriv |
| Asymmetric | crypto.generateKeyPair, crypto.publicEncrypt, crypto.privateDecrypt |
| Signing | crypto.createSign(algo), crypto.createVerify(algo) |
| Random | crypto.randomBytes(n, cb), crypto.randomInt(min, max), crypto.randomUUID() |
| Key derivation | crypto.pbkdf2(pass, salt, iter, keyLen, digest, cb), crypto.scrypt |
| Diffie-Hellman | crypto.createDiffieHellman(primeLen) |
| ECDH | crypto.createECDH(curveName) |
| Certificates | crypto.Certificate |
| Web Crypto | crypto.subtle — Web Crypto API compatible |


24. Performance & Profiling

24.1 performance API

const { performance, PerformanceObserver } = require('perf_hooks');

// High-resolution timing
const start = performance.now();
doExpensiveWork();
console.log(`Took: ${performance.now() - start}ms`);

// Mark and measure
performance.mark('start');
doWork();
performance.mark('end');
performance.measure('work duration', 'start', 'end');

const obs = new PerformanceObserver((list) => {
  const entries = list.getEntries();
  entries.forEach(e => console.log(e.name, e.duration));
});
obs.observe({ entryTypes: ['measure', 'mark', 'gc'] });

24.2 Built-in Profiling Tools

# CPU profiling
node --prof app.js          # Generate V8 profiler log
node --prof-process isolate-*.log  # Process log to text report

# Memory snapshot
node --heapsnapshot-signal=SIGUSR2 app.js
kill -USR2 <pid>            # Triggers heap snapshot file

# CPU sampling via inspector
node --inspect app.js       # Open Chrome DevTools → Profiler

# Flame graph (with 0x)
npx 0x app.js

# --trace-* flags
node --trace-sync-io app.js         # Find synchronous I/O in async contexts
node --trace-warnings app.js        # Show warning stack traces
node --trace-deprecation app.js     # Show deprecation stack traces
node --trace-uncaught app.js        # Trace uncaught exceptions
node --trace-gc app.js              # Log GC events
node --expose-gc app.js             # Enable global.gc() call

24.3 v8 Module

const v8 = require('v8');

v8.getHeapStatistics();    // { total_heap_size, used_heap_size, ... }
v8.getHeapSpaceStatistics(); // Per-space breakdown
v8.writeHeapSnapshot();    // Write heap snapshot to file
v8.setFlagsFromString('--max-old-space-size=4096'); // Set V8 flags

// Serialize/deserialize JS values (structured clone)
const buf = v8.serialize({ a: 1, b: new Map([[1, 2]]) });
const obj = v8.deserialize(buf);

24.4 Memory Limits and Tuning

# Set max old-space size (the default scales with available system
# memory on modern Node; ~1.5GB on older 64-bit builds)
node --max-old-space-size=4096 app.js  # 4GB

# Set max semi-space size (new space)
node --max-semi-space-size=64 app.js   # 64MB (default: 8-16MB)

# Expose GC for manual control (testing/debugging only!)
node --expose-gc app.js
# Then in code: global.gc()

25. Security Fundamentals

25.1 Node.js Permissions Model (Node 20+)

# Experimental permission model
node --experimental-permission --allow-fs-read=/tmp app.js
node --experimental-permission --allow-fs-write=/tmp app.js
node --experimental-permission --allow-child-process app.js
node --experimental-permission --allow-worker app.js
node --experimental-permission --allow-addons app.js

25.2 Common Vulnerabilities

| Vulnerability | Risk | Prevention |
|---|---|---|
| Prototype pollution | Attacker injects into `Object` prototype | Use `Object.create(null)`, validate input |
| ReDoS | Catastrophic backtracking in regex | Use `safe-regex`, avoid complex patterns |
| SSRF | Request forgery to internal services | Whitelist allowed hosts |
| Path traversal | `../../etc/passwd` in user input | Use `path.basename()`, validate |
| Injection | Command injection via `exec()` | Use `spawn()` with args array |
| Timing attacks | Comparing secrets with `===` | Use `crypto.timingSafeEqual()` |
| Unhandled rejections | Silent failures | Always add `.catch()` / global handler |
| `eval()` / `Function()` | Code injection | Never eval user input |
| `child_process.exec(userInput)` | Shell injection | Use `execFile` or `spawn` with args array |

// ❌ Dangerous — shell injection
exec(`ls ${userInput}`); // If userInput = "; rm -rf /"

// ✅ Safe — no shell is spawned; arguments go straight to the program
spawn('ls', [userInput]); // userInput is a literal argument, never interpreted by a shell

26. Mastery Checklist by Level

🟢 Beginner — "I understand how JS runs"

  • [ ] Explain what the call stack is and draw it for a function chain

  • [ ] Explain why console.log after a setTimeout(fn, 0) runs first

  • [ ] Understand var hoisting vs let/const Temporal Dead Zone

  • [ ] Explain what a closure is and give an example

  • [ ] Use fs.readFile with callbacks correctly

  • [ ] Understand the difference between require and import

  • [ ] Know what process.env is used for

  • [ ] Use Buffer.from() and .toString() for basic encoding

🟡 Intermediate — "I understand how Node.js is asynchronous"

  • [ ] Explain the 6 phases of the event loop and what runs in each

  • [ ] Know the difference between process.nextTick, Promises, and setImmediate

  • [ ] Identify and fix an event loop starvation bug

  • [ ] Write a custom Readable/Transform stream

  • [ ] Handle backpressure in streams using .pipe() or manual pause/resume

  • [ ] Explain how V8 hidden classes work and write code that doesn't break them

  • [ ] Use Promise.all, Promise.race, Promise.allSettled correctly

  • [ ] Implement proper error handling for async code (try/catch + process events)

  • [ ] Use EventEmitter correctly, including avoiding memory leaks

  • [ ] Explain why fs.readFile uses the thread pool but net.createServer doesn't

  • [ ] Use cluster to scale a server across CPU cores

  • [ ] Configure UV_THREADPOOL_SIZE and know when it matters

🔴 Advanced — "I understand Node.js internals deeply"

  • [ ] Read and interpret a V8 heap snapshot

  • [ ] Profile a Node.js app with --prof and identify hot paths

  • [ ] Identify and fix a memory leak using heap snapshots

  • [ ] Implement thread-safe code using worker_threads + SharedArrayBuffer + Atomics

  • [ ] Explain V8's GC: young gen scavenge, old gen mark-sweep-compact, incremental marking

  • [ ] Explain why deoptimization happens and write code that avoids it

  • [ ] Use AsyncLocalStorage for request-scoped context

  • [ ] Implement native Node.js add-ons using N-API (optional, advanced)

  • [ ] Use perf_hooks to benchmark and identify P99 latency issues

  • [ ] Explain the difference between ESM live bindings and CJS value copies

  • [ ] Tune --max-old-space-size and GC settings for production workloads

  • [ ] Explain how libuv's thread pool and epoll/kqueue interact with the event loop

  • [ ] Implement a custom Duplex stream with correct backpressure


Quick Reference: Execution Order Cheat Sheet

Script start
  │
  ├── Synchronous code (top-level)
  │
  ├── [Microtask checkpoint]
  │     ├── process.nextTick queue (drain completely)
  │     └── Promise microtask queue (drain completely)
  │
Event Loop Iteration:
  │
  ├── Phase 1: Timers
  │     └── Run expired setTimeout/setInterval callbacks
  │     └── [Microtask checkpoint after each callback]
  │
  ├── Phase 2: Pending Callbacks
  │     └── Deferred I/O error callbacks
  │     └── [Microtask checkpoint]
  │
  ├── Phase 3: Idle, Prepare (internal)
  │
  ├── Phase 4: Poll
  │     └── Get I/O events, run their callbacks
  │     └── [Microtask checkpoint after each callback]
  │     └── Block here (if no timers or setImmediate pending)
  │
  ├── Phase 5: Check
  │     └── Run setImmediate callbacks
  │     └── [Microtask checkpoint after each callback]
  │
  └── Phase 6: Close Callbacks
        └── socket.on('close', ...) etc.
        └── [Microtask checkpoint]
        │
        └── Repeat from Phase 1 (or exit if nothing pending)

Built for the indie hacker who wants to truly understand what's happening under the hood — not just how to use Node.js, but why it works the way it does.

— End of the article —
