“The Field of AI Alignment: A Postmortem, and What To Do About It” by johnswentworth
26/12/2024
A policeman sees a drunk man searching for something under a streetlight and asks what he has lost. The drunk says he lost his keys, and they both look under the streetlight together. After a few minutes the policeman asks if he is sure he lost them here, and the drunk replies no, he lost them in the park. The policeman asks why he is searching here, and the drunk replies, "this is where the light is".
Over the past few years, a major source of my relative optimism on AI has been the hope that the field of alignment would transition from pre-paradigmatic to paradigmatic, and make much more rapid progress.
At this point, that hope is basically dead. There has been some degree of paradigm formation, but the memetic competition has mostly been won by streetlighting: the large majority of AI Safety researchers and activists [...]
---
Outline:
(01:23) What This Post Is And Isn't, And An Apology
(03:39) Why The Streetlighting?
(03:42) A Selection Model
(05:47) Selection and the Labs
(07:06) A Flinching Away Model
(09:47) What To Do About It
(11:16) How We Got Here
(11:57) Who To Recruit Instead
(13:02) Integration vs Separation
---
First published:
December 26th, 2024
Source:
https://www.lesswrong.com/posts/nwpyhyagpPYDn4dAW/the-field-of-ai-alignment-a-postmortem-and-what-to-do-about
---
Narrated by TYPE III AUDIO.