Example where batch normalization can't work?

by quester   Last Updated March 12, 2018 19:19 PM

Let's say that we have a neural network as follows:

input (1 dim)
fully connected
batch normalization
tanh
output (whatever dim)
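
In code, that stack might look like the following sketch (PyTorch; the hidden width of 8 and the output dimension of 1 are placeholder choices of mine, since the question leaves them open):

    import torch.nn as nn

    # Sketch of the network described above: the batch-normalization
    # layer sits between the fully connected layer and the tanh.
    model = nn.Sequential(
        nn.Linear(1, 8),     # fully connected on the 1-dim input
        nn.BatchNorm1d(8),   # normalizes each feature over the mini-batch
        nn.Tanh(),
        nn.Linear(8, 1),     # output layer (dim chosen arbitrarily)
    )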

Let's say that we have two mini-batches: [-1, 0, 1] with target y0, and [100, 101, 102] with target y1. Since batch normalization shifts each batch to zero mean and scales it to unit variance, both batches would produce the same inputs to the tanh layer (both normalize to roughly [-1.22, 0, 1.22]), and then the network couldn't learn these examples properly. Or am I missing something about batch normalization?
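
To make that concrete, here is a minimal sketch of per-batch normalization in plain NumPy (the function batch_norm and the variable names are mine; the learned scale and shift of a real batch-norm layer are omitted, since they would be applied identically to both batches anyway):

    import numpy as np

    def batch_norm(x, eps=1e-5):
        # Normalize a mini-batch to zero mean and unit variance,
        # as batch normalization does at training time.
        return (x - x.mean()) / np.sqrt(x.var() + eps)

    b0 = np.array([-1.0, 0.0, 1.0])
    b1 = np.array([100.0, 101.0, 102.0])

    print(batch_norm(b0))  # approximately [-1.2247, 0, 1.2247]
    print(batch_norm(b1))  # identical output, so tanh sees the same inputs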


