Unfortunately, StreamingLLM doesn't solve long-term memory or continual learning. It's just a (useful) technique for improving LLM inference speed. On their GitHub they state: "As emphasized earlier, we neither expand the LLMs' context window nor enhance their long-term memory."