
Content provided by Center for Humane Technology, Tristan Harris, Aza Raskin, and The Center for Humane Technology. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Center for Humane Technology, Tristan Harris, Aza Raskin, and The Center for Humane Technology or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process described here: https://fr.player.fm/legal.

'A Turning Point in History': Yuval Noah Harari on AI’s Cultural Takeover

1:30:41

Historian Yuval Noah Harari says that we are at a critical turning point. One in which AI’s ability to generate cultural artifacts threatens humanity’s role as the shapers of history. History will still go on, but will it be the story of people or, as he calls them, ‘alien AI agents’?

In this conversation with Aza Raskin, Harari discusses the historical struggles that emerge from new technology, humanity’s AI mistakes so far, and the immediate steps lawmakers can take right now to steer us towards a non-dystopian future.

This episode was recorded live at the Commonwealth Club World Affairs of California.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

RECOMMENDED MEDIA

NEXUS: A Brief History of Information Networks from the Stone Age to AI by Yuval Noah Harari

You Can Have the Blue Pill or the Red Pill, and We’re Out of Blue Pills: a New York Times op-ed from 2023, written by Yuval, Aza, and Tristan

The 2023 open letter calling for a pause in AI development of at least 6 months, signed by Yuval and Aza

Further reading on the Stanford Marshmallow Experiment

Further reading on AlphaGo’s “move 37”

Further reading on Social.AI

RECOMMENDED YUA EPISODES

This Moment in AI: How We Got Here and Where We’re Going

The Tech We Need for 21st Century Democracy with Divya Siddarth

Synthetic Humanity: AI & What’s At Stake

The AI Dilemma

Two Million Years in Two Hours: A Conversation with Yuval Noah Harari
