AI Dynamics

Global AI News Aggregator

Memory Efficient Neural Networks: Activation Storage Optimization

This is useful for memory-efficient neural networks: with a careful architecture choice (invertible layers), you don't have to store the activations of every layer. You can store only the final layer's activations and recompute earlier ones by working backwards. The example shows that the operation composes, so you can build deep networks this way. 5/n

→ View original post on X — @thegautamkamath
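A minimal sketch of the idea, using an additive-coupling block of the kind used in reversible architectures such as RevNets (the functions `f` and `g` and their weights here are illustrative placeholders, not from the original thread): because each block is exactly invertible, the inputs can be recomputed from the outputs during the backward pass instead of being stored.

```python
import numpy as np

rng = np.random.default_rng(0)
W_f = rng.standard_normal((4, 4))
W_g = rng.standard_normal((4, 4))

def f(h):
    # Arbitrary per-block transform; any function works, it need not be invertible itself.
    return np.tanh(h @ W_f)

def g(h):
    return np.tanh(h @ W_g)

def forward(x1, x2):
    # Additive coupling: split the input into two halves and mix them.
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def inverse(y1, y2):
    # Recover the inputs exactly from the outputs -- no stored activations needed.
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

x1, x2 = rng.standard_normal((2, 4))
y1, y2 = forward(x1, x2)
r1, r2 = inverse(y1, y2)
print(np.allclose(x1, r1), np.allclose(x2, r2))  # True True
```

Because `inverse` undoes `forward` exactly, blocks like this compose: stacking many of them gives a deep network whose intermediate activations can all be reconstructed from the final output, layer by layer, during backpropagation.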
