A perceptron is an artificial neuron that uses the Heaviside step function as its activation function. It is defined as

$$ f(\mathbf{x}) = H(\mathbf{w} \cdot \mathbf{x} + b) $$

where the Heaviside step function is defined as

$$ H(z) = \begin{cases} 1 & \text{if } z \ge 0 \\ 0 & \text{if } z < 0 \end{cases} $$

One interesting property of a perceptron is that the basic binary operators (AND, OR, NOT) can be captured exactly using specific values of $\mathbf{w}$ and $b$. (Smooth activation functions, usually preferred in practice for neural networks, instead express uncertainty near the decision boundary.)
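As a sketch of this, the gates above can be realized with hand-picked weights and biases (the particular values below are one choice among many; any weights on the correct side of the decision boundary work):

```python
def perceptron(x, w, b):
    # Heaviside step activation: fire (1) iff w . x + b >= 0
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0

# AND fires only when both inputs are 1
AND = lambda x: perceptron(x, w=(1, 1), b=-1.5)
# OR fires when at least one input is 1
OR = lambda x: perceptron(x, w=(1, 1), b=-0.5)
# NOT takes a single input and inverts it
NOT = lambda x: perceptron(x, w=(-1,), b=0.5)
```

Each gate is a single linear threshold: for AND, the weighted sum reaches $-1.5 + 1 + 1 = 0.5 \ge 0$ only when both inputs are 1.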

NAND is also linearly separable and so needs only a single perceptron, but XOR (and XNOR) cannot be separated by one hyperplane and requires more than one perceptron to model. Cybenko's universal approximation theorem states that a feedforward network with a single hidden layer of sigmoidal units can approximate any continuous function on a compact domain to arbitrary precision.
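A minimal sketch of how composition overcomes the single-perceptron limit: XOR can be built from a hidden layer computing OR and NAND, followed by an output perceptron computing AND (the weights below are illustrative choices, not unique):

```python
def perceptron(x, w, b):
    # Heaviside step activation: fire (1) iff w . x + b >= 0
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0

def XOR(a, b):
    # Hidden layer: OR and NAND of the two inputs
    h_or = perceptron((a, b), w=(1, 1), b=-0.5)
    h_nand = perceptron((a, b), w=(-1, -1), b=1.5)
    # Output layer: AND of the hidden units
    return perceptron((h_or, h_nand), w=(1, 1), b=-1.5)
```

XOR is true exactly when at least one input is 1 (OR) and not both are (NAND), which is why this two-layer arrangement works.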