In gradient descent, the loss landscape refers to the set of all values of the loss function as a function of the model’s tunable parameters. It is therefore an n-dimensional hypersurface, where n is the number of tunable parameters. When the loss function is differentiable (as it almost always is), this hypersurface is locally Euclidean and is therefore a manifold, although the terms “loss surface” and “loss landscape” are far more common.
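The idea can be illustrated with a minimal sketch (the model, data, and function names below are hypothetical, chosen only for illustration): for a one-parameter model y = w·x with a mean-squared-error loss, the loss landscape is the graph of the loss over the single parameter w, i.e. a 1-dimensional curve.

```python
def mse_loss(w, xs, ys):
    """Mean squared error of the toy model y = w * x on the given data."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Toy data generated by y = 2x, so the landscape bottoms out at w = 2.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]

# Sampling the landscape: with n = 1 tunable parameter, the graph
# {(w, L(w))} is a 1-dimensional hypersurface (a curve in the plane).
landscape = [(w / 10, mse_loss(w / 10, xs, ys)) for w in range(0, 41)]

# The sampled landscape's minimum sits at w = 2.0 with zero loss.
best_w, best_loss = min(landscape, key=lambda point: point[1])
print(best_w, best_loss)  # → 2.0 0.0
```

With n parameters the same construction yields an n-dimensional hypersurface, which gradient descent navigates by following the local slope.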