AI Dynamics

Global AI News Aggregator

Dropout=1.0 Issue: PyTorch Should Raise ValueError

The model is not "turned off during training". With dropout=1.0, dropout layers output all zeros at train time and, apparently, act as the identity at test time. I don't think PyTorch should have allowed dropout=1.0. It should be a ValueError; not sure I get the reasoning there.
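The behavior described above is easy to check directly. A minimal sketch (assuming a recent PyTorch, where `nn.Dropout(p=1.0)` is accepted without error):

```python
import torch
import torch.nn as nn

x = torch.ones(4)

# p=1.0 passes nn.Dropout's validation (it only rejects p < 0 or p > 1)
drop = nn.Dropout(p=1.0)

# Train mode: every element is dropped, so the output is all zeros
drop.train()
train_out = drop(x)

# Eval mode: dropout is a no-op, so the input passes through unchanged
drop.eval()
eval_out = drop(x)
```

In train mode, `train_out` is all zeros, so every layer downstream of the dropout sees no signal; in eval mode, `eval_out` equals `x`, which is the "identity at test" behavior the post refers to.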

→ View original post on X — @karpathy
