Neural Net Dropout

Linear Digressions

Neural networks are complex models with many parameters, and they can be prone to overfitting. There's a surprisingly simple way to guard against this: randomly drop hidden units (along with their connections) during training, a technique known as dropout.…
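The idea above can be sketched in a few lines of NumPy. This is a minimal illustration of "inverted" dropout, not code from the episode: during training, each unit is zeroed out with probability `p`, and the survivors are scaled by `1/(1-p)` so that expected activations stay the same at inference time.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout on an activation array x.

    During training, zero each unit independently with probability p and
    scale survivors by 1/(1-p); at inference, return x unchanged.
    """
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

# Training: roughly half the units are zeroed, the rest are doubled.
activations = np.ones(8)
print(dropout(activations, p=0.5, rng=np.random.default_rng(0)))

# Inference: the input passes through untouched.
print(dropout(activations, p=0.5, training=False))
```

Because a different random mask is drawn on every forward pass, the network is effectively trained as an ensemble of many thinned sub-networks, which is what makes such a simple trick an effective regularizer.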
