Context: Triangle Shader Lab is a small Next.js + TypeScript study site deployed to GitHub Pages. It hosts two demos (Hello Triangle + Textured Cube) with annotated copy.
AI assist: ChatGPT/Copilot helped with WGSL snippets, adapter/device explanations, and code comments. I cite them directly on the site.
Status: Learning tool only. It adapts public WebGPU samples, adds notes, and keeps expectations low.

Reality snapshot

  • Built with Next.js 14, Bun, Tailwind, and MDX. No custom engine; just readable wrappers around the official WebGPU samples.
  • Goal: understand adapter requests, pipeline creation, buffers, and render loops, then share the notes publicly.
  • Limitations: Works only where WebGPU is enabled (Chrome and Edge 113+ ship it by default; other browsers may still need a flag). No WebGL fallback, no production features.

Architecture

triangle-shader-lab/
├── pages/
│   ├── index.mdx
│   ├── hello-triangle.mdx
│   └── textured-cube.mdx
├── components/
│   ├── DemoWrapper.tsx
│   └── CodeBlock.tsx
├── lib/
│   └── webgpu.ts
└── public/
    └── shaders/
  • DemoWrapper handles feature detection (navigator.gpu), adapter/device negotiation, and fallback messaging.
  • Each demo page includes: overview, code, “Reality check” block (what’s working/missing), and proof links.

Demo breakdown

Hello Triangle

  • Rebuilds the canonical sample. WGSL shaders render a single triangle with color interpolation.
  • Notes explain requestAdapter, requestDevice, canvas context configuration (context.configure(), which replaced the older explicit swap chain API), and command encoder setup.
  • Users can toggle debugging overlays to see pipeline state (bind groups, buffers) in real time.
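A minimal sketch of the shape this demo adapts from the canonical sample: WGSL with per-vertex colors that the rasterizer interpolates, plus the pipeline descriptor around it. The shader module handle and texture format are left abstract here; in the real demo they come from device.createShaderModule and navigator.gpu.getPreferredCanvasFormat().

```typescript
// WGSL for a single triangle with color interpolation, adapted from the
// canonical Hello Triangle pattern. Positions/colors are hard-coded per vertex.
const shaderCode = /* wgsl */ `
struct VSOut {
  @builtin(position) pos: vec4f,
  @location(0) color: vec4f,
};

@vertex
fn vs_main(@builtin(vertex_index) i: u32) -> VSOut {
  var positions = array<vec2f, 3>(
    vec2f(0.0, 0.5), vec2f(-0.5, -0.5), vec2f(0.5, -0.5));
  var colors = array<vec4f, 3>(
    vec4f(1, 0, 0, 1), vec4f(0, 1, 0, 1), vec4f(0, 0, 1, 1));
  var out: VSOut;
  out.pos = vec4f(positions[i], 0.0, 1.0);
  out.color = colors[i];
  return out;
}

@fragment
fn fs_main(@location(0) color: vec4f) -> @location(0) vec4f {
  return color; // interpolated across the triangle by the rasterizer
}
`;

// Pipeline descriptor builder; `module` stands in for a GPUShaderModule and
// `format` for the preferred canvas format (e.g. "bgra8unorm").
function trianglePipelineDescriptor(module: unknown, format: string) {
  return {
    layout: "auto" as const,
    vertex: { module, entryPoint: "vs_main" },
    fragment: { module, entryPoint: "fs_main", targets: [{ format }] },
    primitive: { topology: "triangle-list" as const },
  };
}
```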

Textured Cube

  • Adds vertex buffers, indices, textures, uniform buffers, and a basic rotation matrix.
  • Highlights the extra state needed (samplers, bind group layouts) and warns about missing features (lighting, depth testing).
  • Links back to the original sample so readers know I didn’t invent it.
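The rotation half of the uniform state is plain matrix math before it ever touches the GPU. A sketch of the helper this demo uses conceptually (the function name is illustrative): build a column-major 4x4 rotation about the Y axis, then upload it each frame with the real WebGPU call device.queue.writeBuffer(uniformBuffer, 0, matrix).

```typescript
// Column-major 4x4 rotation about the Y axis, matching the memory layout
// WGSL expects for a mat4x4<f32> uniform. Returned as a Float32Array so it
// can be passed straight to device.queue.writeBuffer each frame.
function rotationY(angle: number): Float32Array {
  const c = Math.cos(angle);
  const s = Math.sin(angle);
  return new Float32Array([
     c, 0, -s, 0,  // column 0
     0, 1,  0, 0,  // column 1
     s, 0,  c, 0,  // column 2
     0, 0,  0, 1,  // column 3
  ]);
}
```

Keeping the matrix math as a pure function also makes it the one piece of the demo that is trivially unit-testable outside a browser.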

Tooling

  • Next.js 14 + MDX: Lets me combine prose + code.
  • Bun: Fast dev server, zero-config TypeScript support.
  • Tailwind: Provides consistent typography/layout.
  • Netlify analytics: Tracks which sections get the most attention; I use that data to prioritize updates.
  • Honesty log: Each page includes “Reality snapshot” + AI disclosure.

Lessons learned

  • Feature detection + honest fallbacks matter. If WebGPU isn’t available, I show a clear message instead of a blank canvas.
  • Annotated copy helps more than fancy effects. Recruiters appreciated that I call it a study site, not a framework.
  • Sharing prompt logs builds trust. I include links to AI conversations so engineers know where I got help.

TODOs

  • Add WebGPU flag instructions for each browser.
  • Record Loom walkthroughs to explain the demos verbally.
  • Experiment with WebGPU compute shaders and document those notes the same way.
  • Publish the repo publicly (currently private while I scrub assets).

Links