Is there something wrong with taking a complex and nuanced topic, thinking you can compress all that into a single character, and then making that your whole identity?
AGI
-
EA and AI risks: when concerns about harm go unheard
By
–
Yeah, I was out walking with a friend a year or two back and he told me about this awesome EA thing he'd read about. When I told him I thought it could lead to great harm, he looked at me like I'd proposed dolphin soup for dinner.
-
Intelligence as Behavior Under Uncertainty and Change
By
–
Intelligence is about generating adequate behavior in the presence of high uncertainty and constant change. If you could have full information and if your environment were static, then there would be no need for intelligence — instead, *compression* would give you an optimal
-
Current AI Systems Achieve Superhuman Skill in Narrow Well-Defined Tasks
By
–
It's well established that current systems, trained only on human-generated data, can achieve superhuman skill (super-humanity skill, as per OP) as long as the target task is sufficiently narrow and well defined. The problem is generality, not skill.
-
GPT-4 AGI Claims Fade as Monthly AI Hype Cycle Continues
By
–
Last Fall, the word in SF was that GPT-4 was already AGI. Etc etc. Now it's just a monthly thing.
-
Historical AGI Panic Cycles: From Q-Learning to Deep Reinforcement Learning
By
–
The first panic over imminent AGI was circa 2013, about Atari Q-learning by DeepMind. The second one was circa 2016, over Deep RL (partially triggered by AlphaGo). So many folks in late 2016 were convinced that Deep RL would lead to AGI in under 5 years…
-
Why We Need New Technologies: The Coming Wave
By
–
We pursue new technologies, including those in the coming wave, not just because we want them, but because, at a fundamental level, we need them. http://the-coming-wave.com
-
Brain structure and learning in artificial intelligence
By
–
A tiny amount of brain structure and lots of learning.
-
Sensory Data and Multimodal Learning in AI Development
By
–
Yes, that's one of my points.
Text is insufficient.
We need sensory inputs to learn how the world works.
We can estimate the total amount of visual data seen by a 2-year-old: 2 years ≈ 2 × 365 × 12 × 3600, or roughly 32 million seconds (assuming ~12 waking hours per day).
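The arithmetic above can be checked with a quick back-of-envelope script (the figures are the post's own; the 12 hours/day is read as waking hours):

```python
# Back-of-envelope: waking seconds experienced by a 2-year-old,
# using the post's numbers (2 years, ~12 waking hours per day).
years = 2
waking_hours_per_day = 12
seconds = years * 365 * waking_hours_per_day * 3600
print(f"{seconds:,} seconds")  # 31,536,000 — roughly 32 million
```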
We have 2 million optical nerve fibers, carrying -
Genome Storage vs LLM Size: Evolution’s Compression Challenge
By
–
Whatever it is that we learned through evolution has to be squeezed into 800MB (the size of the genome, uncompressed), and most of that is just low-level biochemical machinery. Even a tiny LLM requires 14GB.
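To make the size gap concrete, here is a minimal sketch using the post's two figures. The 7B-parameter / 16-bit-float breakdown of the 14GB is an assumption on my part, not stated in the post:

```python
# Rough size comparison using the post's figures:
# human genome ~800 MB uncompressed vs. a small LLM checkpoint ~14 GB.
genome_bytes = 800 * 10**6

# Assumption (not in the post): 14 GB corresponds to ~7B parameters
# stored as 16-bit floats (2 bytes each).
params = 7 * 10**9
bytes_per_param = 2
llm_bytes = params * bytes_per_param  # 14 GB

print(f"LLM / genome size ratio: {llm_bytes / genome_bytes:.1f}x")  # 17.5x
```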