EM-LLM: Human-Inspired Episodic Memory for Infinite Context LLMs (github.com)
26 points by jbotz 21 hours ago
mountainriver 6 hours ago
TTT, cannon layers, and Titans seem like a stronger approach IMO. Information needs to be compressed into latent space or it becomes computationally intractable.
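(For intuition on the compression argument: a toy fast-weight memory whose per-token cost stays constant no matter how long the stream gets. This is not the actual TTT or Titans update rule; the dimensions, decay factor, and function names are made up for the sketch.)

```python
# Toy sketch: folding a long token stream into a fixed-size latent memory.
# Illustrates why compression keeps cost constant per token instead of
# growing with context length -- not any specific published method.
import numpy as np

d = 64                      # hidden size (illustrative)
M = np.zeros((d, d))        # fixed-size latent memory, regardless of context length
decay = 0.99                # hypothetical forgetting factor

def write(M, k, v):
    """Fold one (key, value) pair into memory as a decayed outer-product update."""
    return decay * M + np.outer(v, k)

def read(M, q):
    """Query the memory; cost is O(d^2), independent of how many tokens were written."""
    return M @ q

rng = np.random.default_rng(0)
for _ in range(100_000):                 # arbitrarily long stream of tokens
    k, v = rng.standard_normal(d), rng.standard_normal(d)
    M = write(M, k, v)

q = rng.standard_normal(d)
print(read(M, q).shape)                  # (64,) -- memory footprint never grew
```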
MacsHeadroom 13 hours ago
So, infinite context length by making it compute-bound instead of memory-bound. Curious how much longer this takes to run and when it makes sense to use vs. RAG.
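(A rough sketch of what retrieval over cached context segments can look like, and where the extra compute goes: scoring every segment at each step. EM-LLM's actual method segments by surprisal and attends over retrieved KV blocks; this toy version just mean-pools fixed-size chunks of key vectors, and all sizes and names are illustrative.)

```python
# Rough sketch: pick the top-k most relevant context segments per query,
# so attention runs over a small retrieved window instead of the full history.
import numpy as np

d, seg_len, top_k = 64, 128, 4
rng = np.random.default_rng(0)

# Pretend these are per-token key vectors cached while reading a long document.
keys = rng.standard_normal((100_000, d))

# Offline: chunk into segments and build one summary vector per segment.
segments = keys[: len(keys) // seg_len * seg_len].reshape(-1, seg_len, d)
summaries = segments.mean(axis=1)                  # (n_segments, d)

def retrieve(query):
    """Score every segment summary against the query; return the top-k segments."""
    scores = summaries @ query                     # O(n_segments * d) compute per step
    best = np.argsort(scores)[-top_k:]
    return segments[best]                          # only these blocks get attended over

ctx = retrieve(rng.standard_normal(d))
print(ctx.shape)   # (4, 128, 64): attention now covers 512 tokens, not 100k
```

Unlike RAG, the retrieval here happens over the model's own cached representations rather than over raw text chunks, so nothing has to be re-encoded at query time, but every stored segment still has to be scored.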