I pulled together notes on all of the LLM plugins that have worked for me for Llama 3 – both for hosting locally (I've run 8B and 70B on my 64GB M2) and access via APIs (Groq is SO FAST for that)

Options for accessing Llama 3 from the terminal using LLM: https://simonwillison.net/2024/Apr/22/llama-3/

— Simon Willison (@simonw) April 22, 2024