Mini-batching is a generalization of stochastic gradient estimation: rather than estimating the gradient from a single randomly chosen example, you take a small random subset (a "mini-batch") of the data and average the gradients over it.
In practice, everyone calls this "stochastic gradient descent," understanding that you can recover the original single-sample meaning by using a batch size of 1.
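A minimal sketch of the idea in NumPy, using a toy linear-regression objective (the data, learning rate, and batch size here are illustrative assumptions, not from the article). The same `minibatch_grad` function covers the whole spectrum: `batch_size=1` is classic single-sample SGD, and `batch_size=len(X)` is the full-batch gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + noise (hypothetical example for illustration)
X = rng.normal(size=(256, 1))
y = 3.0 * X[:, 0] + 0.1 * rng.normal(size=256)

def minibatch_grad(w, batch_size):
    """Mean-squared-error gradient estimated on a random mini-batch.

    batch_size=1 recovers single-sample stochastic gradient estimation;
    batch_size=len(X) recovers the exact full-batch gradient.
    """
    idx = rng.choice(len(X), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    residual = Xb @ w - yb
    return 2.0 * Xb.T @ residual / batch_size

# Plain gradient-descent loop driven by mini-batch estimates
w = np.zeros(1)
for _ in range(500):
    w -= 0.05 * minibatch_grad(w, batch_size=32)
```

After training, `w` should land close to the true slope of 3.0; the mini-batch estimates are noisy, but they are unbiased, so the averaging over many steps washes the noise out.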