I'm not.
There just isn't enough capacity in the genome.
Your entire genome fits in 800MB (uncompressed).
The difference between the human and chimp genomes is 1% of that, or 8MB.
Not enough to encode a significant structure.
For comparison, a small 7B LLM requires 14GB.
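A quick sanity check of those figures (the base-pair count and 16-bit weight precision below are my assumptions, not stated in the post):

    # Rough capacity comparison: human genome vs. a small 7B-parameter LLM.
    base_pairs = 3.1e9                       # approx. size of the human genome
    genome_bytes = base_pairs * 2 / 8        # 2 bits per base -> ~775 MB uncompressed
    human_chimp_diff = genome_bytes * 0.01   # ~1% divergence -> ~8 MB
    llm_bytes = 7e9 * 2                      # 7B weights at 2 bytes (fp16) -> 14 GB
    print(f"genome ~ {genome_bytes/1e6:.0f} MB, delta ~ {human_chimp_diff/1e6:.0f} MB, 7B LLM ~ {llm_bytes/1e9:.0f} GB")

The point survives even if the exact constants shift: the human-chimp genome delta and the model weights differ by roughly three orders of magnitude.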
AGI
-
Human Genome Storage Capacity vs LLM Requirements Analysis
-
Dystopian AI Takeover Narrative Remains Timeless Cliché
The dystopian fantasy of machines taking over humanity is so old that it's a cliché.
-
MetaLearning Enables AI Agents' Rapid Adaptation to New Environments
[ MetaLearning.eth ] METALEARNING enables #AIAgents to “learn to learn”, allowing them to rapidly adapt to new environments. Asset: https://opensea.io/assets/ethereum/0x57f1887a8bf19b14fc0df6fd9b2acc9af147ea85/28653756243418125983896305525248630918490602894709024564651983387428933128798 … http://MONTREAL.AI: Winning the #AGI Race MetaLearning.Eth #MetaLearning #MontrealAI
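As a rough illustration of "learning to learn" (a toy sketch, not MONTREAL.AI's actual method), here is a Reptile-style meta-update: adapt a copy of the parameters to each sampled task for a few steps, then move the meta-parameters toward the adapted ones so future adaptation starts from a better initialization.

    import numpy as np

    # Toy Reptile-style meta-learning: tasks are random linear slopes y = slope * x.
    rng = np.random.default_rng(0)
    meta_w = 0.0
    inner_lr, meta_lr, inner_steps = 0.02, 0.1, 5

    def sample_task():
        slope = rng.uniform(-2, 2)                   # each "environment" is a new slope
        x = rng.uniform(-1, 1, size=20)
        return x, slope * x

    for _ in range(1000):                            # outer loop: learn the initialization
        x, y = sample_task()
        w = meta_w
        for _ in range(inner_steps):                 # inner loop: fast adaptation to this task
            grad = 2 * np.mean((w * x - y) * x)
            w -= inner_lr * grad
        meta_w += meta_lr * (w - meta_w)             # Reptile step: move toward adapted weights

    print("meta-initialization:", round(float(meta_w), 3))

The model is deliberately trivial; the mechanism that carries over to real agents is the inner-loop adaptation plus the outer-loop update toward the adapted weights.
-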
World Models and Planning: The Next Level in AI Development
That's pretty much the point I make in my vision paper from May 2022, and all the talks I've given since: the next level in AI requires world models and planning (search, reasoning).
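For concreteness, a minimal sketch of "world model + planning" (my illustration, not the architecture from the paper): a dynamics model predicts the consequences of actions, and a planner searches over candidate action sequences for the one the model scores best.

    import numpy as np

    # Toy model-predictive planning: a 1-D point mass steered toward a goal
    # by random-shooting search over the (here, known) world model.
    rng = np.random.default_rng(0)
    dt, horizon, n_candidates, goal = 0.1, 20, 256, 1.0

    def world_model(state, action):
        pos, vel = state
        vel = vel + action * dt                  # predicted next velocity
        pos = pos + vel * dt                     # predicted next position
        return np.array([pos, vel])

    def plan(state):
        best_cost, best_action = np.inf, 0.0
        for _ in range(n_candidates):            # search: simulate candidate action sequences
            actions = rng.uniform(-1, 1, horizon)
            s, cost = state.copy(), 0.0
            for a in actions:
                s = world_model(s, a)
                cost += (s[0] - goal) ** 2
            if cost < best_cost:
                best_cost, best_action = cost, actions[0]
        return best_action                       # execute only the first action (MPC style)

    state = np.array([0.0, 0.0])
    for _ in range(50):
        state = world_model(state, plan(state))
    print("final position:", round(float(state[0]), 2))   # typically ends up near the goal at 1.0

In a learned-world-model setting the same loop applies, except world_model is trained from experience and the random search can be replaced by gradient-based or tree-based planning.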
-
Explanation Why AGI Has Been Achieved Internally
Explanation why AGI has been achieved internally https://reddit.com/r/singularity/s/rcd6Daf239 … -
Reinforcement Learning Powers Development of Web3 AGI Agents
[ ReinforcementLearning.Eth ] Reinforcement Learning (RL) is an area of AI concerned with how agents ought to take actions. http://MONTREAL.AI is using RL to build Web3 AGI Agents Asset: https://opensea.io/assets/ethereum/0x57f1887a8bf19b14fc0df6fd9b2acc9af147ea85/52694602869859807677039526017970231478273063181431708212743385155471956296026 … #ReinforcementLearning
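For concreteness, the textbook version of "how agents ought to take actions" looks like tabular Q-learning (a generic illustration, unrelated to the linked asset or to how MONTREAL.AI builds its agents): the agent improves a table of action values from reward and then acts greedily on it.

    import numpy as np

    # Minimal tabular Q-learning on a 5-state chain; reward 1.0 for reaching the rightmost state.
    rng = np.random.default_rng(0)
    n_states, n_actions = 5, 2                   # actions: 0 = left, 1 = right
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, epsilon = 0.1, 0.9, 0.2

    for _ in range(5000):                        # episodes, each from a random non-terminal state
        s = int(rng.integers(n_states - 1))
        for _ in range(50):                      # cap episode length
            a = int(rng.integers(n_actions)) if rng.random() < epsilon else int(np.argmax(Q[s]))
            s_next = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
            r = 1.0 if s_next == n_states - 1 else 0.0
            Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])   # temporal-difference update
            s = s_next
            if s == n_states - 1:
                break

    print(np.argmax(Q[:-1], axis=1))             # learned policy for non-terminal states: all 1 (go right)
-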
Genome Data Constraints and Human Cognitive Capability Development
1. The amount of data in the human genome is small: 800MB. The difference between chimp and human genomes is about 8MB. That's just not enough "instructions" to explain the difference in capability.
2. The total amount of visual data seen by a 2-year-old is pretty small:
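The post cuts off before giving the number, but the estimate is straightforward arithmetic; the waking-hours and optic-nerve bandwidth figures below are my assumptions, not the post's:

    # Illustrative estimate of raw visual input over a child's first two years.
    wake_hours_per_day = 12
    days = 2 * 365
    optic_nerve_bytes_per_sec = 1e6              # assume ~1 MB/s effective across both eyes
    total_bytes = wake_hours_per_day * 3600 * days * optic_nerve_bytes_per_sec
    print(f"~{total_bytes / 1e12:.0f} TB of raw visual input")   # a few tens of terabytes

The order of magnitude, not the exact constant, is what the comparison hinges on.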
-
Training 2B-neuron AI systems to animal-level intelligence efficiently
About 2 billion neurons, like parrots, dogs, and octopuses.
How do we get a machine with 2B neurons / 10T parameters to be as smart as octopuses, dogs, parrots, and crows with just a few months' worth of real-time training data?
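The neuron-to-parameter jump and the data budget in that question are both back-of-envelope arithmetic; a sketch with assumed figures (the synapses-per-neuron count is my assumption, not the post's):

    # Why ~2e9 neurons is read as ~1e13 ("10T") parameters, and how much
    # experience "a few months of real time" amounts to.
    neurons = 2e9
    synapses_per_neuron = 5e3                    # rough order-of-magnitude figure
    parameters = neurons * synapses_per_neuron   # ~1e13, i.e. ~10T
    months = 3
    seconds_of_experience = months * 30 * 24 * 3600
    print(f"parameters ~ {parameters:.0e}, experience ~ {seconds_of_experience:.1e} s")
-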
The woman who challenged Sam Altman: her perspective on AGI
Some weeks ago, I spoke to @hlntnr for @FT Tech Tonic podcast on artificial general intelligence. A LOT has happened since, but here’s a glimpse into the perspective of the woman who took on Sam Altman, on @FT News Briefing this morning https://on.ft.com/49ZWtdo -
OpenAI Q* Breakthrough Toward Artificial General Intelligence
OpenAI scientists have reportedly achieved a spectacular breakthrough toward an "Artificial General Intelligence" (AGI) capable of surpassing human intelligence. The project is called Q* (pronounced Q-Star). Remember that name.