This is a Neural Cellular Automaton (NCA) — a tiny neural network that runs independently in every cell of a grid, reading only its immediate neighbors. Despite having no global coordination, the cells collectively grow and maintain complex patterns from a simple seed.
Each cell holds 16 channels: 3 visible (RGB) and 13 hidden "thought" channels. Every step, each cell reads the 16-channel states of its immediate neighbors, feeds those values through its small MLP, and applies the resulting update to its own channels.
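The per-cell update can be sketched in NumPy. This is a minimal sketch, not the page's actual update rule: the flattened 3x3 neighborhood input, the two-layer ReLU MLP, the residual add, and the wrap-around edges are all assumptions, and the real models vary in hidden size and depth.

```python
import numpy as np

def nca_step(grid, w1, b1, w2, b2):
    """One synchronous NCA step: every cell reads its 3x3 neighborhood,
    runs the shared MLP, and adds the output to its own 16-channel state.
    (Sketch only; perception, depth, and boundary handling are assumptions.)"""
    H, W, C = grid.shape
    padded = np.pad(grid, ((1, 1), (1, 1), (0, 0)), mode="wrap")  # toroidal edges (assumption)
    new = np.empty_like(grid)
    for y in range(H):
        for x in range(W):
            # Flatten the 3x3 x 16-channel neighborhood into one input vector
            neigh = padded[y:y + 3, x:x + 3, :].reshape(-1)   # 144 values
            h = np.maximum(neigh @ w1 + b1, 0.0)              # hidden layer, ReLU
            delta = h @ w2 + b2                               # 16-channel update
            new[y, x] = grid[y, x] + delta                    # residual update
    return new
```

Every cell applies the same weights; only its local inputs differ, which is what lets the fragment shader run the rule for all cells in parallel.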
Each model's update rule was found by automated architecture search (Optuna) and trained in PyTorch to regenerate target patterns from damage. This page runs the trained weights in WebGL2 fragment shaders at 60fps, with the same math as the Python original.
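Regeneration training of this kind repeatedly perturbs grid states, runs the automaton, and penalizes the error against the target pattern. A minimal sketch of the damage step alone; the square patch shape and its size are assumptions (the original training may use different masks):

```python
import numpy as np

def damage(state, rng, size=8):
    """Zero out a random square patch of all channels -- the kind of
    perturbation the models are trained to grow back from.
    (Patch shape and size are assumptions.)"""
    H, W, _ = state.shape
    y = rng.integers(0, H - size)
    x = rng.integers(0, W - size)
    out = state.copy()
    out[y:y + size, x:x + size, :] = 0.0
    return out
```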
The 16 NCA channels are stored as four RGBA float16 textures on the GPU, double-buffered for ping-pong rendering. MLP weights are packed into an RGBA32F texture and read via texelFetch(), so the update completes in a single pass (no multi-pass slicing). The update shader is generated dynamically from each model's architecture parameters and recompiled on model switch. Models range from ~5K to ~31K learned parameters, depending on hidden size and depth.
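The weight texture can be built by flattening every weight matrix and bias into one float32 stream, padding to whole RGBA texels, and reshaping to the texture's dimensions. This is a sketch under assumptions: the flattening order and the fixed texture width are illustrative, not the page's actual layout.

```python
import numpy as np

def pack_weights(arrays, tex_width=256):
    """Pack MLP weight arrays into one RGBA32F-style texture image:
    flatten in a fixed order, pad to a whole number of RGBA texels,
    and reshape to (height, width, 4). (Layout is an assumption.)"""
    flat = np.concatenate([a.astype(np.float32).ravel() for a in arrays])
    n_texels = -(-len(flat) // 4)          # ceil division: scalars per texel = 4
    height = -(-n_texels // tex_width)     # rows needed at the chosen width
    padded = np.zeros(height * tex_width * 4, np.float32)
    padded[:len(flat)] = flat
    return padded.reshape(height, tex_width, 4)
```

With this layout, the shader recovers scalar i by fetching texel i/4 (row-major) with texelFetch and selecting component i%4, which is what makes a single-pass MLP evaluation possible.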