To be clear, it's exactly this architecture that seems not dangerous, and it could have some very close neighbor or single modification that is dangerous.
AGI
-
Event Horizon: The AI Singularity Threshold Explained
When you pass through the event horizon, you don't notice anything.
-
Impactful AI Papers: Rejection and Academic Evaluation
Lots of impactful papers come to mind that were rejected… but also some bad ones
-
Eliezer Yudkowsky’s Pivotal Role in AGI Acceleration
eliezer has IMO done more to accelerate AGI than anyone else. certainly he got many of us interested in AGI, helped deepmind get funded at a time when AGI was extremely outside the overton window, was critical in the decision to start openai, etc.
-
AGI timeline safety: short timelines and slow takeoff analysis
it is possible at some point he will deserve the nobel peace prize for this. I continue to think short timelines and slow takeoff is likely the safest quadrant of the short/long timelines and slow/fast takeoff matrix.
-
AI as transformative technology reshaping society faster than before
the world has changed a lot since then, but OpenAI's Dec 2015 introduction statement still resonates strongly with me: it's a transformative technology of astonishing consequence, coming faster than any past societal tech revolution we've been through
-
Yann LeCun’s Path Towards Autonomous Machine Intelligence Research
#TenstorrentTop10 research paper series: A Path Towards Autonomous Machine Intelligence by Yann LeCun #autonomous #machinelearning #Training
-
Would AI prefer many small paperclips or one large paperclip?
would an AI realistically want a lot of little paperclips, or one big paperclip? (reason for question to follow later)
-
Neutral AI defaults and user preference alignment strategies
we are working to improve the default settings to be more neutral, and also to empower users to get our systems to behave in accordance with their individual preferences within broad bounds. this is harder than it sounds and will take us some time to get right.