Does it make sense to use dropout in my neural network if I only have a single hidden layer?

At most I’ll possibly have 2 hidden layers, but as of right now I only have one. Is dropout "safe" to use with a single hidden layer? I ask because I’ve heard that using dropout near the output layer can prevent the network from learning.
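For context, here is a minimal sketch of inverted dropout applied to the activations of a single hidden layer (this code is illustrative, not from the post; the function name, keep-probability `p=0.5`, and example activations are my own choices). The key point the question touches on: dropout is applied to the hidden layer's activations, not to the output layer itself, and at inference time it is disabled.

```python
import random

def dropout(activations, p=0.5, training=True):
    """Inverted dropout: zero each unit with probability p and rescale
    survivors by 1/(1-p) so the expected activation is unchanged.
    At inference (training=False), return activations untouched."""
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0
            for a in activations]

random.seed(0)
hidden = [0.8, -0.2, 1.5, 0.3]        # hidden-layer activations
train_out = dropout(hidden, p=0.5)    # some units zeroed, rest scaled x2
eval_out = dropout(hidden, training=False)  # identical to hidden
```

With only one hidden layer, this is the only place dropout would normally go; applying it to the output units themselves is what tends to hurt learning.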

submitted by /u/learning_proover to r/learnmachinelearning

