AGI, Immortality, & Visions of the Future with Adam Becker


Published November 28, 2025

About This Episode

Neil deGrasse Tyson, Gary O'Reilly, and Chuck Nice talk with physicist and author Adam Becker about how tech billionaires envision the future through ideas like AGI, space colonization, transhumanism, and digital immortality. Becker explains why many of these visions are scientifically dubious or incoherent, how they misread science fiction as literal blueprints rather than cautionary tales, and how extreme wealth concentrates power over humanity's technological trajectory. The episode closes with a reflection on the need for wisdom and ethical guardrails alongside scientific and technological ingenuity.


Disclaimer: We provide independent summaries of podcasts and are not affiliated with or endorsed in any way by any podcast or creator. All podcast names and content are the property of their respective owners. The views and opinions expressed within the podcasts belong solely to the original hosts and guests and do not reflect the views or positions of Summapod.

Quick Takeaways

  • Adam Becker argues that popular tech visions of AGI gods, Mars colonies, and digital immortality are largely science fiction narratives being treated as inevitable futures by a small group of very wealthy people.
  • He explains why large-scale Mars colonization and self-sustaining off-world civilizations are far more difficult than commonly portrayed, especially due to radiation, logistics, and the impossibility of escaping human politics and nature.
  • The idea of a technological singularity driven by endlessly accelerating computing power ignores physical limits, diminishing returns, and the fact that exponential trends in nature always end.
  • Becker criticizes the belief that AGI will automatically solve complex problems like climate change, noting that the main obstacles are political and economic interests, not a lack of intelligence or ideas.
  • Science fiction is described as a set of allegories about present-day society, often serving as warnings about concentrated power and misused technology, which tech leaders frequently misread as roadmaps.
  • The conversation highlights how extreme wealth often gets mistaken for wisdom, giving billionaires disproportionate influence over technology and policy despite their narrow expertise.
  • Uploading consciousness and true immortality are portrayed as speculative and incoherent ideas, distinct from realistic efforts to extend human healthspan through biotechnology.
  • Neil deGrasse Tyson closes by emphasizing that the key variable in shaping our technological future is not raw intelligence or innovation, but human wisdom and how we choose to harness our own creations.

Podcast Notes

Intro to StarTalk Special Edition and Future-Tech Theme

Hosts and show framing

Neil deGrasse Tyson introduces StarTalk Special Edition and identifies himself as 'your personal astrophysicist'[2:18]
• He notes that 'Special Edition' means a specific format with particular co-hosts
Gary O'Reilly and Chuck Nice are introduced as co-hosts[2:25]
• Neil jokes about Gary's British accent and calls Chuck the 'Lord of Comedy'

Framing the conversation about visions of the future

Gary outlines that many groups have visions of future technologies: scientists, sci‑fi authors, tech CEOs, futurists[3:11]
• He lists themes like AGI, nuclear fusion, the singularity, transhumanism, and living on Mars
They emphasize that science fiction is rapidly becoming science reality in some domains[3:42]
Key question posed: are we heading toward a utopian or dystopian future with these technologies?[3:45]
• They also ask whether science fiction serves as a guiding light or a blueprint for those in power

Introducing Adam Becker and His Background

Adam Becker's credentials and earlier work

Neil introduces Adam Becker and welcomes him to StarTalk[4:08]
Adam has a PhD in computational cosmology earned in 2012[4:12]
Becker wrote the 2018 book 'What Is Real?' about the unfinished quest for the meaning of quantum physics[4:19]
• Neil notes that the topic suggests a strong philosophical component to Becker's physics work

New book 'More Everything Forever'

Neil describes Becker as having been a 'science popularizer maniac' since his PhD[4:34]
Becker's new book title is given: 'More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity'[4:50]
• Neil jokes that a better title could have been 'We're F'd', and Becker jokes they considered it but thought it wouldn't sell

Neil and Adam's Past Correspondence About the Size of the Universe

Becker spots a 'mistake' at the museum

Becker recalls being a 'snot‑nosed' grad student visiting the museum and thinking he saw a mistake on a plaque[5:31]
He emailed the general astronomy department address at the Rose Center about it[5:46]
Neil responded two weeks later, leading to a back‑and‑forth about whether it was actually a mistake[5:53]

Debate on the size of the observable universe

The dispute concerned how to state the size of the universe[6:16]
• Neil explains he deals in observables: we see galaxies whose light has been traveling ~13.8 billion years, which sets the age of the universe
Neil notes it's common but 'sloppy' to say the universe is 13.8 billion light years to the edge[6:56]
• He explains that due to ongoing expansion, those galaxies are now roughly 45 billion light years away in cosmological models, even though we cannot observe them at that present-day distance
Becker characterizes the original issue as debatable, not a clear-cut error[6:21]
Neil encouraged Becker to present his view to a broad audience, leading Becker to make a podcast and send it to Neil[7:31]
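The roughly 45-billion-light-year figure discussed above comes from integrating the universe's expansion history. A back-of-envelope numerical sketch, using rough flat-ΛCDM parameters as assumptions (radiation neglected, so the result is approximate and lands a bit above 45):

```python
import math

# Rough flat-LCDM parameters (assumptions, not figures from the episode)
H0_KM_S_MPC = 67.7            # Hubble constant, km/s/Mpc
OMEGA_M, OMEGA_L = 0.31, 0.69 # matter and dark-energy fractions

C_KM_S = 299_792.458          # speed of light, km/s
MPC_PER_GLY = 306.601         # megaparsecs per billion light years

def hubble(z):
    """H(z) in km/s/Mpc for a flat matter + Lambda universe."""
    return H0_KM_S_MPC * math.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

def comoving_distance_gly(z_max=3000.0, steps=50_000):
    """Comoving distance c * integral_0^z dz'/H(z'), trapezoid rule.

    Integrating to a very large redshift stands in for the distance
    to the most distant matter whose light has reached us.
    """
    dz = z_max / steps
    total = 0.0
    for i in range(steps):
        z_lo, z_hi = i * dz, (i + 1) * dz
        total += 0.5 * (1 / hubble(z_lo) + 1 / hubble(z_hi)) * dz
    return C_KM_S * total / MPC_PER_GLY

d_gly = comoving_distance_gly()
print(f"present-day distance to the most distant visible matter: "
      f"~{d_gly:.0f} billion light years")
```

The point of the sketch is the distinction Neil draws: the light travel time is ~13.8 billion years, but the present-day distance those sources have reached in the models is several times larger.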

Researching 'More Everything Forever' and Tech CEO Narratives

Becker's research process and journalistic work

Becker prepared for the book by reading a lot of what he describes as bad writing by tech CEOs and their defenders[8:10]
• These writings argued that the future will inevitably involve superintelligent AI, space empires, and similar themes
For topics beyond his expertise, such as biology or other physics subfields, he interviewed experts and read books to understand the issues[8:39]
He attempted to interview the tech CEOs themselves, but almost all declined[9:00]
• Becker says he was upfront that his book would take a critical look at them, which likely contributed to refusals
Neil suggests CEOs avoided him because they recognized he had intellectual integrity and might expose unsupported claims[9:07]

Feasibility of Mars Colonization and Living Off-World

Elon Musk's Mars vision vs realistic constraints

Gary asks about a scenario of Mars in 2050 and whether it is realistic[10:06]
Becker recounts Elon Musk's stated goal: a million people on Mars by 2050 in a self-sustaining civilization that can survive if Earth support stops[10:15]
Becker asserts that scenario is 'definitely not happening' for many reasons[10:32]
Even getting any humans to Mars by 2050 and keeping them alive there for a time would be extremely difficult[10:35]
• He emphasizes we do not yet know how to keep someone alive in deep space that far from Earth for the full journey and stay

Radiation and environmental hazards on Mars

Asked about the biggest challenges of going that far, Becker cites radiation as a primary issue[11:18]
On Earth we are protected by the magnetic field and thick atmosphere, both of which Mars lacks[11:34]
• He notes that on the Martian surface you receive roughly the same radiation dose as in deep space
Becker says that if the protagonist of 'The Martian' had really done what he did, he would likely die of cancer a few years after returning due to radiation exposure[11:56]

ISS vs Mars missions

Asked why long stays on the ISS don't imply easy Mars trips, Becker explains the ISS is still within Earth's magnetic field[16:08]
On the ISS, astronauts can abort and be back on Earth within hours, whereas Mars return could take many months or more than a year[16:32]
Communication with ISS is effectively real-time, but Mars messages take 8-20 minutes one way depending on orbital positions[16:50]
• Becker notes that a warning like 'watch out for the cliff' is not very useful with such delays
He reiterates that a full round-trip Mars mission with ideal launch windows would span multiple years, unlike a week-long Moon round trip[17:49]
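The signal delays above follow directly from light travel time. A quick sketch using commonly cited closest and farthest Earth–Mars separations (round-number assumptions, so the range comes out slightly wider than the figures quoted in the episode):

```python
C_KM_S = 299_792.458   # speed of light, km/s

# Approximate Earth-Mars separations (assumed round figures, in km)
CLOSEST_KM = 54.6e6    # rare close approach
FARTHEST_KM = 401e6    # near solar conjunction

def one_way_delay_min(distance_km):
    """One-way light travel time in minutes."""
    return distance_km / C_KM_S / 60

lo = one_way_delay_min(CLOSEST_KM)
hi = one_way_delay_min(FARTHEST_KM)
print(f"one-way signal delay: {lo:.0f} to {hi:.0f} minutes")
```

A round-trip question and answer can therefore take from several minutes to the better part of an hour, which is why real-time guidance from Earth ('watch out for the cliff') is impossible.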

Living arrangements and resources on Mars

Asked where Martian settlers would live, Becker notes they must manage without breathable air and must generate oxygen or bring it[18:04]
All food would have to be brought initially, since Martian soil is full of toxic chemicals[18:33]
He explains that Martian soil contains perchlorates that make farming as shown in 'The Martian' unrealistic, as crops grown there would be poisonous[18:52]
Becker cites this as an example of 'unknown unknowns' that emerge as we learn more about Mars[19:19]

Immortality, Singularity, and Transhumanism

Different paths to extended life vs immortality

The hosts raise 'functional immortality', including biological strategies like growing organs in animals for transplantation[19:34]
Becker notes that you cannot replace the brain with such methods, at least not yet[19:48]
Neil points out active research into delaying cellular aging that could extend lifespan or healthspan, independent of AI[20:12]
Becker concedes that biotechnology might substantially extend lifespan or healthspan[20:37]

The technological singularity concept

Becker describes the 'singularity' idea: technology, especially AI, accelerates until it attains godlike powers[20:08]
In this narrative, such a superintelligent AI could grant humans immortality[20:28]
He says proponents imagine AI either solving the technical problem of human immortality or uploading human minds into computers[20:37]
Becker argues the singularity idea rests on flawed assumptions, such as treating intelligence as a single scalar quantity that can be arbitrarily increased[21:15]

Transhumanism and already-ongoing changes

Transhumanism is defined as using technology to transcend the limits of human biology and physics[24:58]
Neil suggests we are already transhuman in a sense, given vaccines, nutrition science, and medical advances that have doubled lifespans relative to 150 years ago[25:16]
Becker agrees technology has made many aspects of being alive much better, but questions whether such trends can continue indefinitely[25:30]

Exponential Growth, Moore's Law, and Physical Limits

Kurzweil's law of accelerating returns vs reality

Becker recounts Ray Kurzweil's view that Moore's Law is one example of a broad 'law of accelerating returns' in technology and nature[22:28]
Kurzweil claims to trace this trend back to the beginning of the universe and predicts a singularity around 2045[22:34]
Becker notes an AI group inspired by such ideas, MIRI, is so convinced the 'end is near' that it does not offer employees 401(k) plans[23:06]
Becker argues that in nature the true law about exponential trends is that they end, due to limited resources like energy[26:40]

Energy use, efficiency, and chip design

Becker states that singularity scenarios assume ever-increasing power and material consumption to support ever-growing computation[26:24]
Neil points out that computing has also become more energy efficient: laptops last longer and future quantum computers may require less energy for more computation[26:57]
Neil recalls that early computers required room-sized cooling for simple arithmetic, illustrating gains from efficiency[27:20]
Becker explains Moore's Law was not a physics law but the result of deliberate business decisions by semiconductor companies[28:18]
• Maintaining Moore's Law required increasingly large investments to keep doubling performance on schedule
He notes Moore's Law has effectively ended because silicon transistors cannot be made smaller than an atom[28:48]
Current performance gains come from adding and stacking more chips rather than shrinking individual transistors[28:52]
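The atomic limit Becker invokes can be sketched with simple arithmetic: start from the roughly 10 µm features of early-1970s chips and shrink linear feature size by a factor of √2 every two years (one 'node'), then count how long until features reach the scale of a silicon atom. The starting size, cadence, and atomic diameter are all rough assumptions for illustration:

```python
import math

START_NM = 10_000      # ~10 um features, early 1970s (assumption)
ATOM_NM = 0.2          # rough diameter of a silicon atom, nm
YEARS_PER_NODE = 2     # one sqrt(2) linear shrink per node (assumption)

# Number of sqrt(2) shrinks to get from START_NM down to ATOM_NM
nodes = math.log(START_NM / ATOM_NM) / math.log(math.sqrt(2))
years = nodes * YEARS_PER_NODE

print(f"~{nodes:.0f} nodes, i.e. ~{years:.0f} years of shrinking "
      f"before features hit the atomic scale")
```

Starting the clock around 1971, this back-of-envelope estimate runs out roughly in the present era, consistent with Becker's point that shrinking has given way to adding and stacking more chips.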

AGI, Climate Change, and Tech Solutionism

Promises about AGI solving everything

Sam Altman is cited as predicting AGI within a few years and claiming it will solve every problem, including global warming[29:44]
Becker says this is 'crazy' because today's AI systems themselves consume rapidly increasing amounts of energy[29:48]
He suggests that if a superintelligent AI were asked how to solve global warming, it might first recommend not having been built at all[30:06]
Becker argues we already know how to address climate change; the barrier is not lack of intelligence but political and economic obstacles like greed[30:30]

Limits of new physics and overhyped claims

Altman has claimed AI will discover new laws of physics that remove current limitations in the world[31:13]
Becker counters that new laws of physics often add constraints, citing Einstein's relativity, which imposes the speed of light as a universal speed limit[31:35]

Tech hype, snake oil, and televangelist analogy

Chuck suggests that extravagant claims about AI solving everything resemble snake oil pitches to keep money flowing in[31:54]
He compares it to televangelists promising to fix people's problems if they send money, framing AI promises as a similar business model[32:17]

Escapism to Space vs Solving Problems on Earth

Why tech elites focus on elsewhere futures

Gary observes that much discussed future tech is about being somewhere else rather than fixing Earth, and asks why[32:50]
Becker says some tech leaders are cynically promising grand futures to attract profit, while others genuinely believe their visions[33:57]
He argues many see Earthly problems as messy and politically complex, and imagine that going to space offers a fresh start[34:16]
Becker insists you cannot escape politics or human nature by going to space; those issues will follow any colony[34:27]

AGI as Godlike Genie and Overlord

Conception of AGI in tech billionaire fantasies

Becker says the idea is to build an AI 'god' that does whatever its creators want[34:35]
He distinguishes current AIs, which are narrow and require human supervision, from hypothetical AGI that could learn anything and act independently[34:56]
AGI is envisioned as capable of all human tasks but much faster, then self-improving to superhuman intelligence[35:35]
In Becker's framing, for billionaires AGI would function like a genie granting wishes, while for everyone else it would be an overlord[36:36]

Hubris of controlling a 'god' and incoherent expectations

Chuck argues that if you can truly control a godlike being, then you are effectively the god, highlighting the hubris involved[37:29]
Becker notes that if such AGI really existed, the critical question would be who controls it[37:43]
He asserts the good news is that this specific superintelligent-god scenario is not actually coming because the concept itself is incoherent[37:56]

Science Fiction, Misreadings, and Cultural Warnings

Star Trek and allegorical sci-fi

Becker says his formative sci-fi was Star Trek, which he emphasizes was never really about space[38:58]
He describes Star Trek as allegory about contemporary issues, often bluntly addressing topics like Nazism and racism[38:36]
• He references the episodes featuring Nazis and the one with two characters whose faces are half black and half white on opposite sides, an explicit allegory of racial prejudice
Becker argues tech elites often miss these messages and instead fixate on warp drive and gadgets[39:48]

Sci-fi as cautionary tale: Torment Nexus and cyberpunk

Becker cites the well-known tweet in which a sci-fi author writes 'Don't Create the Torment Nexus' as a cautionary tale, only for a tech company to proudly announce it has built the Torment Nexus, mirroring how tech billionaires build systems straight out of cautionary fiction[47:26]
He points to classic cyberpunk such as William Gibson's 'Neuromancer', which depicts the tech-enabled concentration of wealth and power[49:00]
• In those stories, the rich use technology to insulate themselves from consequences while extracting wealth and power from everyone else
Becker warns that if tech elites aim to make those worlds reality, it is bad news for the rest of humanity[49:32]

Science fiction as social critique (Metropolis, Le Guin, Twilight Zone)

Becker sees much science fiction as looking at current society by slightly pushing scenarios into different contexts[49:37]
He interprets 'Metropolis' as about the need for emotional intelligence to keep pace with technological development[49:51]
He praises Ursula Le Guin for repeatedly using speculative settings to examine poverty, inequality, capitalism, and gender[50:47]
Neil cites Rod Serling's explanation that 'The Twilight Zone' stories had to be set in other times or worlds to get controversial content past censors and make people reflect[51:03]
Becker argues the real problem is not science or sci‑fi but poor critical reading skills and the influence of money[54:37]

Wealth, Power, and Who Shapes the Future

Profiles of key tech power players

Becker names Sam Altman (OpenAI), Marc Andreessen (Andreessen Horowitz), and Jeff Bezos as central figures in the tech ecosystem[1:09:50]
He notes Bezos controls most of the infrastructure of the World Wide Web through Amazon Web Services (AWS)[1:10:24]
Neil describes Amazon.com retail as window dressing on a more important infrastructure business[1:10:38]

Billionaires, expertise, and misplaced deference

Becker argues society wrongly assumes ultra‑wealthy people understand everything, beyond their expertise in acquiring wealth[54:50]
He suggests many could be 'complete dumbasses' outside making money and rigging systems in their favor[57:16]
Becker critiques Elon Musk as someone who claims to care about humanity but appears to care little about actual humans and has even said empathy is bad[40:46]
He points out tech moguls often interpret their success, which involved luck and government contracts or subsidies, as proof they are the smartest people who ever lived[41:34]

Progressive taxation, wealth caps, and scale of billionaire wealth

Chuck argues for capitalism with guardrails, including progressive taxation and even a notional cap on personal wealth[54:43]
He cites high marginal tax rates under FDR as an example where income above a certain level was heavily taxed to support the infrastructure enabling wealth creation[56:36]
Chuck and Neil share calculations illustrating how long it would take to earn or spend a billion dollars at $500 per hour continuously, highlighting the absurd scale[59:05]
Becker recounts his own calculation turning Elon Musk's wealth into $100 bills that could wrap around Earth several times with enough left to reach the Moon and back[1:00:11]
Neil emphasizes the central issue is the power that comes with such wealth and the ability to influence laws, policy, and other institutions[58:59]
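The scale arguments above are easy to reproduce. A sketch with assumed round numbers (the $400 billion net worth and the bill dimensions are illustrative assumptions, not figures from the episode, so the wrap count differs from Becker's exact calculation):

```python
# Earning $1 billion at $500/hour, working nonstop
hours = 1_000_000_000 / 500
years_to_billion = hours / (24 * 365)

# Laying out an assumed $400B fortune in $100 bills end to end
NET_WORTH = 400e9            # assumed net worth, for illustration
BILL_LEN_KM = 0.156e-3       # a US bill is ~15.6 cm long
EARTH_CIRCUM_KM = 40_075     # Earth's equatorial circumference
line_km = (NET_WORTH / 100) * BILL_LEN_KM

print(f"$1B at $500/hr nonstop: ~{years_to_billion:.0f} years")
print(f"bills end to end: ~{line_km:,.0f} km, "
      f"~{line_km / EARTH_CIRCUM_KM:.0f} trips around Earth's equator")
```

For comparison, the Moon is about 384,000 km away, so the same line of bills would stretch well past it; the exact wrap-and-back figure depends on the net worth assumed.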

Neil's Closing Cosmic Perspective on Wisdom and Technology

Summarizing who shapes the future

Neil notes that many smart and wealthy people are trying to determine what kind of future we will have and should have[59:04]
He observes that human futures have always pivoted on advances in science and technology[59:15]

Emphasizing wisdom over mere cleverness

Neil argues that the crucial factor is not how advanced our science is or how clever individuals are, but how wise we are with our own creations[59:37]
He laments that wisdom is undervalued in discussions of brilliance, inventions, and discoveries[1:00:04]

Harnessing technology like a horse

Neil offers an analogy: an unharnessed horse runs wild unpredictably, while a harnessed horse still has its power but is directed toward useful work[1:00:27]
He suggests we must harness our technologies in a similarly wise way so they do what we need and want rather than run wild[1:00:25]
He expresses hope that with more wisdom, we can avoid the disasters frequently portrayed by science fiction writers[1:00:51]

Farewell and reiteration of Becker's book

Neil thanks Adam Becker for being on StarTalk and wishes him luck with his book[1:01:03]
Becker repeats the title: 'More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity'[1:01:09]
The episode closes with Neil's customary exhortation to 'keep looking up'[1:01:45]

Lessons Learned

Actionable insights and wisdom you can apply to your business, career, and personal life.

1. Exponential technological trends, like Moore's Law, are not laws of nature; they depend on resources, design choices, and economics, and they eventually hit physical and practical limits.

Reflection Questions:

  • Where in your work or life are you assuming that a current growth trend will continue indefinitely without considering its limits?
  • How could you build plans that explicitly account for diminishing returns or resource constraints instead of assuming endless acceleration?
  • What is one area this week where you could re-examine your expectations and update them based on realistic constraints rather than optimistic projections?
2. Complex societal problems such as climate change are often constrained more by politics, incentives, and human behavior than by a lack of intelligence or technical ideas.

Reflection Questions:

  • What challenge in your own environment have you been treating as a 'technical' problem when it might actually be a people or incentive problem?
  • How might your approach to a current project change if you focused first on stakeholder interests and power structures instead of tools and tactics?
  • What is one concrete step you can take this month to address the human or institutional barriers underlying a problem you care about?
3. Science fiction is most powerful when read as a mirror of present-day society and a cautionary guide, not as a literal roadmap for building technologies and social systems.

Reflection Questions:

  • When you consume stories about the future, how often do you stop to ask what they are saying about the present?
  • In what ways could you use fictional scenarios (books, films, games) to surface blind spots in your own assumptions about technology and power?
  • What is one science fiction work you could revisit this month with the explicit goal of extracting its social and ethical warnings rather than its gadgets?
4. Extreme concentrations of wealth translate directly into disproportionate power over laws, infrastructure, and narratives about the future, so billionaires' claims and agendas should be scrutinized rather than automatically trusted.

Reflection Questions:

  • Whose interests are most represented in the future scenarios you tend to believe, and whose voices are missing?
  • How might your view of a widely publicized technology change if you separated the marketing narrative from the incentives of those funding it?
  • What is one habit you can adopt to diversify the sources you rely on when forming opinions about technology, policy, or economics?
5. Wisdom (clear judgment about values, trade-offs, and long-term consequences) is as essential as ingenuity in deciding how to harness powerful new technologies.

Reflection Questions:

  • What recent decision of yours was driven mainly by what was technically possible rather than by what was wise or aligned with your values?
  • How could you build a simple 'wisdom check' into your important decisions, especially when new tools or systems are involved?
  • What is one current project where you can pause this week to ask, 'Just because we can do this, should we, and if so, under what safeguards?'

Episode Summary - Notes by Drew
