this is what open-source LLM inference looks like in my head:
> pure chaos
> lousy & misplaced integrations
> everything half-broken, somehow still runs
we're so early, and there's a lot of work to do
@theahmadosman
-
Open Source LLM Inference: Current State of Chaos
-
CS Fundamentals, Local LLMs, and Inference Explained
what do you know? cs fundamentals? running llms locally? inference?
-
Infrastructure Focus in AI Development Strategy
BINGO! That's why I am focused on Infrastructure 🙂
-
Is AI a Bubble? Market Signals and Investment Outlook
a few serious words on whether AI is a bubble or not
a couple of years back, i saw both the capabilities and demand for AI about to explode, so i went all in
that still holds today
if anything, the signal has only gotten louder
the fact that some companies overpromise or flame
-
RGB lighting power consumption hidden energy costs
nobody tells you that RGB uses that much power 🙁
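for a sense of scale, here is a back-of-the-envelope sketch. the numbers are assumptions, not measurements: a typical WS2812B-style addressable LED draws roughly 60 mA at 5 V at full white, i.e. about 0.3 W per LED, and a build with lit fans plus strips can easily hold on the order of 100 LEDs.

```python
# Back-of-the-envelope RGB power estimate.
# Assumption: a WS2812B-style addressable LED draws ~60 mA at 5 V
# at full white, i.e. ~0.3 W per LED (check your strip's datasheet).
LED_FULL_WHITE_WATTS = 5.0 * 0.060

def rgb_power_watts(num_leds: int, brightness: float = 1.0) -> float:
    """Estimate total RGB draw for num_leds at a 0..1 brightness level."""
    return num_leds * LED_FULL_WHITE_WATTS * brightness

if __name__ == "__main__":
    # ~100 LEDs at full white: on the order of tens of watts
    print(rgb_power_watts(100))
    # same LEDs at half brightness
    print(rgb_power_watts(100, 0.5))
```

at full white, 100 LEDs land around 30 W, which is real money on a machine that runs 24/7.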
-
EPYC Platform Hardware Build Guide with Components and Options
I suggest you wait until I cover the EPYC platform as well. The guide will have builds with parts and where to buy, and it will have a lot more options + use cases
-
GPU Computing: The New Gold Rush Replacing SaaS Software
if you’re in software, pivot to GPUs now
take on debt, sell a kidney, whatever
but get your hands on that silicon
the SaaS gold rush is over
new money is compute
the next king isn’t code
it’s hardware
-
GPU PCIe Layout Mapping Guide for AI Builders
> this is how you map out a 4x GPU PCIe layout
> every builder needs to know this cold
> Buy a GPU is almost here
> and it’s lane-checking all your builds
—Buy a GPU, The Movement
-
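"lane-checking" a build boils down to reading each GPU's negotiated PCIe link out of `lspci -vv` (the `LnkSta:` line) and confirming it matches what the slot should give you. a minimal sketch of that mapping, run here against a hypothetical sample of `lspci` output rather than a live machine (device addresses and widths below are made up for illustration):

```python
import re

# Hypothetical excerpt of `sudo lspci -vv` output for a 4x GPU box;
# on a real machine this text would come from running lspci itself.
SAMPLE_LSPCI = """\
41:00.0 3D controller: NVIDIA Corporation Device 2204
\tLnkSta:\tSpeed 16GT/s, Width x16
61:00.0 3D controller: NVIDIA Corporation Device 2204
\tLnkSta:\tSpeed 16GT/s, Width x8
81:00.0 3D controller: NVIDIA Corporation Device 2204
\tLnkSta:\tSpeed 16GT/s, Width x16
a1:00.0 3D controller: NVIDIA Corporation Device 2204
\tLnkSta:\tSpeed 8GT/s, Width x4
"""

def map_gpu_lanes(lspci_text: str) -> dict[str, str]:
    """Map each GPU's PCIe address to its negotiated link speed and width."""
    layout = {}
    current = None
    for line in lspci_text.splitlines():
        # A device header line looks like "41:00.0 3D controller: ..."
        dev = re.match(r"^([0-9a-f]{2}:[0-9a-f]{2}\.\d) .*controller", line)
        if dev:
            current = dev.group(1)
        elif current and "LnkSta:" in line:
            sta = re.search(r"Speed ([\d.]+GT/s), Width (x\d+)", line)
            if sta:
                layout[current] = f"{sta.group(1)} {sta.group(2)}"
            current = None
    return layout

if __name__ == "__main__":
    for addr, link in map_gpu_lanes(SAMPLE_LSPCI).items():
        print(addr, link)
```

in the sample above, two cards negotiated the full x16, one fell back to x8, and one is stuck at Gen3 x4: exactly the kind of misconfigured riser or shared-lane slot this check is meant to catch.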
Buy GPU Guide: High-Level Content Launch Preview
keep an eye on my posts for the next 2 days. i'm posting a lot of the high-level content of the Buy a GPU guide as i do a final polish before launch, and this will be one of those things
-
Local LLMs Becoming Accessible: The Windows XP Era Arrives
> the Windows XP era of
> local LLMs is almost here
> we’re about to make
> “local AI, no headaches”
> the new default
> soon you’ll fire up AI at home
> and boom, “it just works”
> get ready
> it’s going to be glorious