Been playing with Gemma running locally on my Pixel phone and it feels magical. To think that years ago we could not remotely imagine such a powerful LLM running in our pockets with no connectivity whatsoever. I wonder how many scenarios for LLMs will start shifting the cost to the edge to get "free" computing with added privacy and control over your data. I can't imagine how companies that only monetize inferencing and nothing else will stay above water. Either they make/sell hardware or they must add value on top of it, otherwise they become mostly an inconvenience in the middle.
— @clementdelangue on X, 2026-04-05 18:31 UTC