Running Small Language Models Locally for RAG and Tool Use

I'm excited about the growing potential to run small models like Llama 3 8B or Phi-3 on my own laptop. These models don't "know" much, but they are capable of tool use and summarization, which makes them a good fit for tasks like local RAG.
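To make the idea concrete, here is a minimal sketch of the retrieval half of local RAG. The retriever is a toy keyword-overlap ranker (a real setup would use embeddings), and the final step of handing the prompt to a local model is left out, since that depends on whichever runtime you use (llama.cpp, Ollama, etc.):

```python
import re

# Toy local-RAG sketch: retrieve relevant chunks by keyword overlap,
# then build a prompt a small local model can summarize from.
# No model call is made here -- plug the prompt into your local runtime.

def words(text: str) -> set[str]:
    """Lowercased word set, punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank chunks by how many query words they share (toy retriever)."""
    q = words(query)
    ranked = sorted(chunks, key=lambda c: len(q & words(c)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """The small model doesn't need to 'know' the answer -- just to summarize context."""
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

chunks = [
    "Llama 3 8B runs comfortably on a laptop with 4-bit quantization.",
    "Phi-3 mini is a 3.8B parameter model from Microsoft.",
    "RAG retrieves documents and feeds them to the model as context.",
]
prompt = build_prompt("What is RAG?", retrieve("What is RAG?", chunks))
print(prompt)
```

The point of this pattern is that the heavy lifting (storing facts) moves into the retrieved chunks, so an 8B-class model only has to read and summarize, which is exactly what these small models are good at.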