
  • If N is the batch size, where is the number of training examples defined?
    – oezguensi
    Commented Nov 29, 2018 at 3:48
  • @oezguensi It is N too: there is only one batch here, with batch size 64. This example just iterates 500 times over that same batch: number_of_training_examples = num_batches * batch_size, so 1 * 64 = 64.
    – MBT
    Commented Nov 29, 2018 at 8:40
  • That seems quite useless. Why should we iterate over the same training examples again and again? Is the example wrong on purpose, for simplicity, or am I missing something?
    – oezguensi
    Commented Nov 30, 2018 at 0:07
  • @oezguensi Yes, in reality you wouldn't do that. But I guess it is just for illustration purposes, to show how the dimensions work out within a batch while keeping the example simple.
    – MBT
    Commented Dec 2, 2018 at 10:30
  • Is t the number of epochs in this example? I believe there needs to be another for loop over training_num right after for t in range(500):
    – doplano
    Commented Oct 14, 2019 at 7:57
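The nested-loop structure doplano suggests can be sketched as follows. This is only an illustration of the epoch/batch bookkeeping discussed above; the totals (num_epochs, num_examples) and variable names are assumptions, not values from the original tutorial, where there is effectively a single batch per epoch:

```python
# Hypothetical sketch of an epoch loop wrapping a batch loop, as suggested
# in the comments. All sizes here are made up for illustration.
num_examples = 640                        # assumed total number of training examples
batch_size = 64
num_batches = num_examples // batch_size  # 640 // 64 = 10 batches per epoch
num_epochs = 500                          # plays the role of `t in range(500)`

steps = 0
for epoch in range(num_epochs):       # outer loop: one full pass over the data
    for b in range(num_batches):      # inner loop: one optimization step per batch
        # here you would slice batch b (examples b*batch_size : (b+1)*batch_size)
        # and run the forward/backward pass on it
        steps += 1

# total optimization steps = num_epochs * num_batches = 500 * 10
print(steps)
```

In the tutorial's setup, num_batches is 1, so the inner loop collapses and `for t in range(500)` repeatedly revisits the same 64 examples, which is why t acts as both the epoch counter and the step counter there.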