"Memory Caching: RNNs with Growing Memory" Google's new paper proposes a simple way to give recurrent models a memory that grows with sequence length. So instead of forcing an RNN to compress the full past into 1 fixed hidden state, it caches memory checkpoints across
Google’s Growing Memory RNNs: New Caching Approach
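To make the idea concrete, here is a minimal sketch of a checkpoint-caching recurrent cell. This is an illustration of the general technique, not the paper's actual architecture: the class name `CheckpointRNN`, the `interval` parameter, and the soft-attention `read` step are all assumptions introduced for the example.

```python
import numpy as np

class CheckpointRNN:
    """Toy RNN that keeps a fixed-size hidden state but also caches a
    snapshot of that state every `interval` steps, so the memory grows
    with sequence length. Illustrative only, not the paper's method."""

    def __init__(self, input_dim, hidden_dim, interval=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W_x = rng.normal(0, 0.1, (hidden_dim, input_dim))
        self.W_h = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
        self.interval = interval
        self.hidden_dim = hidden_dim

    def run(self, xs):
        """Process a sequence, returning the final state and the cache."""
        h = np.zeros(self.hidden_dim)
        cache = []  # grows with sequence length
        for t, x in enumerate(xs):
            h = np.tanh(self.W_x @ x + self.W_h @ h)
            if (t + 1) % self.interval == 0:
                cache.append(h.copy())  # checkpoint the hidden state
        return h, cache

    def read(self, query, cache):
        """Soft attention over cached checkpoints (illustrative read step)."""
        if not cache:
            return np.zeros(self.hidden_dim)
        M = np.stack(cache)          # (num_checkpoints, hidden_dim)
        scores = M @ query
        w = np.exp(scores - scores.max())
        w /= w.sum()
        return w @ M                 # weighted mix of checkpoints
```

A sequence of 10 steps with `interval=4` yields 2 cached checkpoints, and the `read` step lets a later query recover information the fixed hidden state may have overwritten.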