Soundbox: pulses, edge timing, Three.js canvas
Concrete walkthrough of Soundbox transport beats, PulseFlowService, edgeTiming labels, and RenderSceneService — cite the repo, not vibes.
The graph is the score. The pulse is the conductor.
Read this post with the repo open — otherwise it is just words. Local blog: run cd marketing && npm run dev, then open http://localhost:4321/blog/soundbox-pulses-graph-threejs/. Artifacts to open first: soundbox/src/app/bootstrap.ts, soundbox/src/services/TransportService.ts, soundbox/src/services/PulseFlowService.ts, soundbox/src/services/edgeTiming.ts, soundbox/src/services/RenderSceneService.ts.
Soundbox is not a linear DAW timeline. It is a directed graph where pulses move along edges; the clock is TransportService, the graph is GraphService, and the canvas is RenderSceneService (Three.js). Below is what the files actually do.
Transport: the metronome (real loop)
TransportService.update(now) walks nextBeatAt in a while loop and fires every subscriber with (beatTime, currentBeat) — that is the hook PulseFlowService uses to inject work on each tick:
// soundbox/src/services/TransportService.ts — lines 67–78
update(now: number): void {
  if (!this.playing) {
    return;
  }
  while (now >= this.nextBeatAt) {
    const beatTime = this.nextBeatAt;
    const currentBeat = this.beatIndex;
    this.beatListeners.forEach((listener) => listener(beatTime, currentBeat));
    this.beatIndex += 1;
    this.nextBeatAt += this.getBeatDurationMs();
  }
}
getBeatDurationMs() is literally 60000 / bpm — no hidden magic.
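Because the loop runs until nextBeatAt catches up with now, a stalled frame fires every missed beat in order rather than dropping them. A minimal standalone sketch of that catch-up behaviour (simulateUpdate and its state object are illustrative stand-ins, not the repo's code):

```typescript
// 60000 / bpm, exactly as quoted above: 120 BPM -> 500 ms per beat.
const getBeatDurationMs = (bpm: number): number => 60000 / bpm;

// Replays the while loop from update(): every beat whose scheduled time has
// already passed fires, in order, before the loop yields.
const simulateUpdate = (
  now: number,
  state: { nextBeatAt: number; beatIndex: number },
  bpm: number,
): number[] => {
  const fired: number[] = [];
  while (now >= state.nextBeatAt) {
    fired.push(state.beatIndex);
    state.beatIndex += 1;
    state.nextBeatAt += getBeatDurationMs(bpm);
  }
  return fired;
};
```

At 120 BPM, calling simulateUpdate(1200, …) from a fresh state fires beats 0, 1, and 2 in one pass, which is exactly how a long frame stall catches up instead of drifting.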
Edge timing: geometry → “1/4”, “1/8”, …
computeEdgeTiming maps the grid distance between two node positions to one of the discrete note labels in EDGE_NOTE_OPTIONS (1/32 through 4 beats). That is how the UI knows what rhythmic “length” an edge represents: the value is derived from layout rather than entered by hand in a first-class music-theory editor:
// soundbox/src/services/edgeTiming.ts — lines 17–25
const EDGE_NOTE_OPTIONS = [
  { beats: 0.125, noteKind: "none" as const, label: "1/32", vertical: true, placement: "raised" as const },
  { beats: 0.25, noteKind: "none" as const, label: "1/16", vertical: true, placement: "raised" as const },
  { beats: 0.5, noteKind: "none" as const, label: "1/8", vertical: true, placement: "inline" as const },
  { beats: 1, noteKind: "quarter" as const, label: "1/4", vertical: false, placement: "inline" as const },
  { beats: 2, noteKind: "half" as const, label: "1/2", vertical: false, placement: "inline" as const },
  { beats: 4, noteKind: "whole" as const, label: "1", vertical: false, placement: "inline" as const },
  { beats: 8, noteKind: "double" as const, label: "2", vertical: false, placement: "inline" as const },
  { beats: 16, noteKind: "double" as const, label: "4", vertical: false, placement: "inline" as const },
] as const;
// soundbox/src/services/edgeTiming.ts — lines 28–36
export const computeEdgeTiming = (start: Vec2, end: Vec2): EdgeTiming => {
  const horizontalDistance = Math.abs(end.x - start.x);
  const horizontalBeats = Math.max(0.125, horizontalDistance / (GRID_SIZE * 2.4));
  const verticalSteps = Math.round(Math.abs(end.y - start.y) / GRID_SIZE);
  const beats = horizontalBeats + Math.max(0, verticalSteps - 2) * 0.5;
  const subtleStepOffset = Math.min(verticalSteps, 2);
  const best = EDGE_NOTE_OPTIONS.reduce((currentBest, current) =>
    Math.abs(current.beats - beats) < Math.abs(currentBest.beats - beats) ? current : currentBest,
  );
  // …
(The function returns best.label etc. on lines 38–47 of the same file — open it for the full EdgeTiming object.)
MEASURE_BEATS is 4 in the same file — pro-mode scheduling in PulseFlowService references it when deciding when to re-arm transport pulses.
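To see the snapping concretely, here is a worked example under the assumption that GRID_SIZE is 40 (the real constant lives in the repo; only the ratio to GRID_SIZE * 2.4 matters). labelFor is a hypothetical condensation of the distance-to-label mapping shown above:

```typescript
// Assumed grid cell size; swap in the repo's constant to match exactly.
const GRID_SIZE = 40;

// Just the (beats, label) pairs from EDGE_NOTE_OPTIONS above.
const OPTIONS = [
  { beats: 0.125, label: "1/32" }, { beats: 0.25, label: "1/16" },
  { beats: 0.5, label: "1/8" }, { beats: 1, label: "1/4" },
  { beats: 2, label: "1/2" }, { beats: 4, label: "1" },
  { beats: 8, label: "2" }, { beats: 16, label: "4" },
];

// Same arithmetic as computeEdgeTiming, reduced to the label lookup.
const labelFor = (start: { x: number; y: number }, end: { x: number; y: number }): string => {
  const horizontalBeats = Math.max(0.125, Math.abs(end.x - start.x) / (GRID_SIZE * 2.4));
  const verticalSteps = Math.round(Math.abs(end.y - start.y) / GRID_SIZE);
  const beats = horizontalBeats + Math.max(0, verticalSteps - 2) * 0.5;
  return OPTIONS.reduce((best, current) =>
    Math.abs(current.beats - beats) < Math.abs(best.beats - beats) ? current : best,
  ).label;
};
```

An edge spanning exactly 2.4 grid cells horizontally (96 px here) works out to one beat and snaps to “1/4”; stretching it four cells vertically adds 2 × 0.5 beats and snaps to “1/2”.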
PulseFlow: beats → triggers
PulseFlowService subscribes to graphService and to transportService.subscribeBeats. In pro mode it only re-seeds transport pulses on bar boundaries (beatIndex % MEASURE_BEATS === 0); in normal mode it emits transport pulses every beat. Read soundbox/src/services/PulseFlowService.ts from the constructor through update — that is the spine between clock and audio.
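The bar-boundary gate can be sketched as a single predicate (shouldReseed is a hypothetical helper for illustration; in the repo the check lives inline in PulseFlowService):

```typescript
// MEASURE_BEATS is 4 in edgeTiming.ts, as noted above.
const MEASURE_BEATS = 4;

// Pro mode: re-seed transport pulses only on bar boundaries.
// Normal mode: emit a transport pulse on every beat.
const shouldReseed = (proMode: boolean, beatIndex: number): boolean =>
  proMode ? beatIndex % MEASURE_BEATS === 0 : true;
```

Under this gate a pro-mode graph fires on beats 0, 4, 8, … while normal mode fires on every tick of the transport.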
Three.js: what you see (constructor facts)
RenderSceneService creates a WebGLRenderer({ antialias: true, alpha: true }), caps setPixelRatio(Math.min(window.devicePixelRatio, 2)), and hangs scene / camera / nodeGroup / edgeGroup / pulseGroup off the host element. Pulses use a shared CircleGeometry(9, 20) mesh — you can grep pulseGeometry in that file to see the exact mesh reuse.
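The payoff of that shared CircleGeometry is that spawning a pulse allocates only a mesh, never new vertex data. A sketch of the pattern with stand-in types (these interfaces are illustrative, not the Three.js API; the radius/segment values match the ones quoted above):

```typescript
// Stand-ins for THREE.CircleGeometry and THREE.Mesh, just to show the sharing.
interface Geometry { radius: number; segments: number }
interface Mesh { geometry: Geometry }

// Created once, up front — the analogue of pulseGeometry in RenderSceneService.
const pulseGeometry: Geometry = { radius: 9, segments: 20 };

// Every pulse mesh points at the same geometry object; nothing is cloned.
const createPulseMesh = (): Mesh => ({ geometry: pulseGeometry });
```

With real Three.js the same idea means one vertex buffer upload for all pulses, which is why grepping pulseGeometry in the file shows it passed to every pulse mesh.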
Bootstrap: where play and export meet the clock
bootstrap.ts wires the overlay’s onTogglePlay to audioEngineService.unlock() followed by transportService.toggle, and wires export to a 60-second offline render plus download-filename sanitization. Copy it from the repo if you want to verify we are not hand-waving:
// soundbox/src/app/bootstrap.ts — lines 145–155
onExportBeat: async (beat) => {
  const blob = await exportService.exportBeat(beat.project, 60);
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = `${beat.name.replace(/[^a-z0-9]/gi, "_")}.wav`;
  document.body.appendChild(a);
  a.click();
  document.body.removeChild(a);
  URL.revokeObjectURL(url);
},
Interaction + saves
InteractionService takes the root element plus GraphService and RenderSceneService; bootstrap passes () => audioEngineService.unlock() so the first gesture can resume the AudioContext. Graph saves are debounced with window.setTimeout(..., 120) in scheduleSave — open bootstrap.ts and search scheduleSave to see the exact throttle.
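The 120 ms debounce can be sketched as a closure over one pending timer id (makeScheduleSave and the injected scheduler are illustrative; the repo calls window.setTimeout and clearTimeout directly inside scheduleSave):

```typescript
// Injectable timer shape so the debounce is testable without real timers.
type Scheduler = (fn: () => void, ms: number) => number;

const makeScheduleSave = (
  save: () => void,
  schedule: Scheduler,
  cancel: (id: number) => void,
) => {
  let pending: number | null = null;
  return (): void => {
    // A new edit within 120 ms cancels the pending save and re-arms it,
    // so a burst of graph mutations collapses into a single write.
    if (pending !== null) cancel(pending);
    pending = schedule(() => {
      pending = null;
      save();
    }, 120);
  };
};
```

Calling the returned function twice in quick succession cancels the first timer and persists once, which is the throttle behaviour the scheduleSave search will show you.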
Primary sources: soundbox/src/services/TransportService.ts, soundbox/src/services/edgeTiming.ts, soundbox/src/services/PulseFlowService.ts, soundbox/src/services/RenderSceneService.ts, soundbox/src/services/InteractionService.ts, soundbox/src/app/bootstrap.ts.