  • 1
    $\begingroup$ Is mini-batch gradient descent the same as batch gradient descent? I'm lost here! If not, what's the difference between them? Correct me if I'm wrong: in batch mode, the whole dataset is read in batches and gradients are calculated, and only once all of them have been read are the gradients averaged and the parameters updated; in mini-batch mode, each batch is read, gradients are calculated, and the parameters are updated immediately, then the next mini-batch is read, until the epoch is over. $\endgroup$
    – Hossein
    Commented Feb 14, 2017 at 10:55
  • 1
    $\begingroup$ That's the generally given definition: update the parameters using one subset of the training data at a time. (There are also methods in which mini-batches are sampled randomly until convergence, i.e. the full dataset may not be traversed within an epoch.) See if this is helpful. $\endgroup$ Commented Feb 14, 2017 at 12:30
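
A minimal sketch of the distinction discussed in these comments, using plain NumPy and a least-squares loss. The function names, learning rate, epoch count, and batch size below are illustrative assumptions, not anything from the thread: batch gradient descent makes one parameter update per epoch from the gradient averaged over the whole dataset, while mini-batch gradient descent updates after every small batch.

```python
import numpy as np

def full_batch_gd(X, y, w, lr=0.1, epochs=100):
    """Batch gradient descent: one update per epoch, using the gradient
    averaged over the entire dataset."""
    n = len(y)
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / n   # average gradient over all n examples
        w = w - lr * grad              # single update per pass over the data
    return w

def mini_batch_gd(X, y, w, lr=0.1, epochs=100, batch_size=32):
    """Mini-batch gradient descent: the data is shuffled and split into
    small batches, and the parameters are updated after each batch."""
    n = len(y)
    for _ in range(epochs):
        perm = np.random.permutation(n)            # shuffle once per epoch
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            grad = Xb.T @ (Xb @ w - yb) / len(yb)  # gradient on this mini-batch only
            w = w - lr * grad                      # update immediately, then continue
    return w

# Example usage on synthetic linear-regression data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

print(full_batch_gd(X, y, np.zeros(3)))
print(mini_batch_gd(X, y, np.zeros(3)))
```

Both routines converge toward the same least-squares solution here; the difference is simply how often the parameters are updated per pass over the data (once per epoch versus once per mini-batch).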