🧠 Neural Network Playground
Watch a neural network learn in real time
Explore how neural networks classify data. Select a preset, then hit Reset & Watch it Learn to see the network train from random weights using backpropagation. Click on the decision boundary to add your own data points (shift-click for class 0).
[Interactive demo: Activation selector (Sigmoid / ReLU / tanh); Network Architecture and Decision Boundary panels; Draw Mode toggle; Speed and learning-rate (LR) controls; live readouts for training steps, accuracy on XOR, parameter count, layer count, and training status]
How it works
A neural network is a series of layers of interconnected neurons. Each connection has a weight that determines how strongly one neuron influences another. During a forward pass, input values are multiplied by weights, summed with a bias, and passed through a sigmoid activation function to produce outputs between 0 and 1.
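The forward pass described above can be sketched for a single neuron. This is a minimal illustration, not the playground's actual implementation; the weights and bias values below are made up for the example.

```python
import math

def sigmoid(z):
    # Squash any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def forward(inputs, weights, bias):
    # One neuron: multiply inputs by weights, sum with the bias,
    # then pass through the sigmoid activation
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# Hypothetical weights and bias, chosen only for illustration
output = forward([1.0, 0.0], [2.0, -3.0], 0.5)
print(output)  # some value strictly between 0 and 1
```

A full layer just repeats this for every neuron, and a network stacks layers so each layer's outputs become the next layer's inputs.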
During training, the network learns by adjusting its weights through backpropagation. For each data point, the network computes its prediction (forward pass), then calculates how wrong it was using binary cross-entropy loss. The error signal propagates backward through the network, computing gradients — how much each weight contributed to the error. Weights are then nudged in the direction that reduces the loss using stochastic gradient descent.
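One full training step (forward pass, loss gradient, SGD update) can be sketched for a single sigmoid neuron. This is an illustrative toy, not the playground's code; it uses the convenient fact that for binary cross-entropy with a sigmoid output, the gradient of the loss with respect to the pre-activation simplifies to (prediction − target).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(x, y, w, b, lr):
    # Forward pass: prediction p in (0, 1)
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    p = sigmoid(z)
    # Backward pass: for BCE loss through a sigmoid, dL/dz = p - y,
    # so each weight's gradient is (p - y) * its input
    dz = p - y
    # SGD update: nudge weights and bias against their gradients
    w = [wi - lr * dz * xi for wi, xi in zip(w, x)]
    b = b - lr * dz
    return w, b, p

# Repeated steps on one labeled point drive the prediction toward the target
w, b = [0.0, 0.0], 0.0
for _ in range(100):
    w, b, p = train_step([1.0, 1.0], 1.0, w, b, lr=0.5)
print(p)  # close to the target of 1.0 after training
```

In a multi-layer network the same idea applies, but `dz` is propagated backward through each layer via the chain rule rather than computed in one step.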
The decision boundary visualization shows how the network partitions the 2D input space. Blue regions indicate outputs below 0.5 (class 0), while red regions indicate outputs above 0.5 (class 1). Watch how the boundary evolves from random noise into clean separation as the network trains. The learning rate controls how big each weight update is — too high and the network overshoots, too low and it learns slowly.
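The boundary image is produced by sampling the network's output over a grid of 2D points and thresholding at 0.5. The sketch below mimics that rasterization with a hypothetical stand-in predictor (a single hand-set linear unit, not a trained network) and a coarse ASCII grid.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictor standing in for the trained network;
# its boundary is the line x + y = 1
def predict(x, y):
    return sigmoid(3.0 * x + 3.0 * y - 3.0)

# Sample the input square [-1, 1] x [-1, 1] on a coarse grid and
# threshold each output at 0.5, as the boundary image does per pixel
for row in range(5):
    y = 1.0 - row * 0.5          # y runs from 1 down to -1
    cells = []
    for col in range(11):
        x = -1.0 + col * 0.2     # x runs from -1 up to 1
        cells.append("#" if predict(x, y) > 0.5 else ".")
    print("".join(cells))        # '#' = class 1 region, '.' = class 0
```

The real playground does the same thing at pixel resolution, mapping sub-0.5 outputs to blue and above-0.5 outputs to red.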
In the network diagram, green connections represent positive weights (excitatory) and red connections represent negative weights (inhibitory). Thicker lines indicate larger weight magnitudes. During training, connections glow to show the network is actively learning. Try adding your own data points by clicking on the decision boundary and watch the network adapt.