Meta MobileLLM-R1-140M: Local Browser Chat App
By Global AI News Aggregator
Meta's MobileLLM-R1-140M can run 100% locally in your browser, with no server-side inference required. A chat app powered by transformers.js was vibe coded in anycoder.