Discuss your ideas for our programs and other suggestions for the Lifeboat Foundation
Join Our Program Discussions
Latest News
-
Yumi Sato
Jan 14, 2026: Yumi Sato, Executive Director of the Well Aging Economic Forum, joins our Life Extension Board.
-
The 100 Years to Extinction MEGA store
Jan 13, 2026: Besides getting Lifeboat items at our futuristic store, you can now get even more Lifeboat items at The 100 Years to Extinction MEGA store.
Enjoy!
-
Leo Nissola
Jan 9, 2026: Leo Nissola, Member of the Medical Commission at the Enhanced Games, joins our Futurists Board.
-
Ravi Kumar Chaudhary
Jan 7, 2026: Ravi Kumar Chaudhary, Senior Scientist at the Government Institute of Medical Sciences (GIMS), joins our Biotech/Medical Board.
-
Ivan Marandola
Jan 7, 2026: Ivan Marandola, Chairman of REVIV Italy, joins our Life Extension Board.
-
Francisco Martínez Peñalver
Jan 7, 2026: Francisco Martínez Peñalver, Founder of LiLibe Health, joins our Life Extension Board.
-
Robert Mitchell
Jan 6, 2026: Robert Mitchell, Director of Research and Development at Micregen, joins our Biotech/Medical Board.
-
Greg Macpherson
Jan 6, 2026: Greg Macpherson, Founder of SRW Laboratories, joins our Biotech/Medical Board.
-
Peter R. Solomon
Jan 1, 2026: Peter R. Solomon, author of 100 Years to Extinction: The Tyranny of Technology and the Fight for a Better Future, joins our Futurists Board.
-
Transhumanism, Science Merchants and the Future of Man
Dec 29, 2025: Read Transhumanism, Science Merchants and the Future of Man by our Ebénézer Njoh Mouellé.
-
SUPRA GUARDIANS
Dec 29, 2025: Read SUPRA GUARDIANS: The Victorious Journey of Self-Actualization, Love, and Transformation Against the Seven Sins by our Maria Olon Tsaroucha.
-
Life Extension Should Come with Wisdom
Dec 29, 2025: Read Life Extension Should Come with Wisdom: Reflections and Questions for the Geroscience and Longevity Community by our Alberto Aparicio.
-
AI Ethics for Boards and the C-Suite
Dec 29, 2025: Read AI Ethics for Boards and the C-Suite, coauthored by our Kevin LaGrandeur.
-
Houston, We Have a Problem
Dec 29, 2025: Read the article Houston, We Have a Problem… (Let’s Get to Solving It) by our David Bray.
A few decades from now, historians might conclude we are living in a Second Gilded Age. In a bit of a repeat of the 1890s, the mid-2020s seem to be an era in which technological advancements have triggered fear and uncertainty about what the application of those technologies means for workers, industries, and nations. Massive polarization of politics was present in the 1890s and the beginning of the 1900s, with the United States Congress of that time slightly more polarized than it is today. Similarly, the push-and-pull over regulation of technological advancements during the First Gilded Age mirrors some of what we see now in the era of Generative AI.
-
WhiteGrass
Dec 21, 2025: Read WhiteGrass by our Richard Smith.
In 2048, humanity’s survival is on the line. After a superstorm nearly destroys them, teenagers Jimmy and Lizzie Marshall push their nonscientist dad, Greg, to unleash WhiteGrass, a groundbreaking nanotech to fight climate change. With his AI expert wife, Ginny, they reprogram the world’s most advanced humanoid, Valada, to help. But a powerful syndicate of oligarchs sets its sights on WhiteGrass.
-
ESTAR-RAIZ Protocol
Dec 19, 2025: Read the free book ESTAR-RAIZ Protocol: A framework for ethical interaction between humans and artificial intelligence, emphasizing consciousness, symbolic structures, and collaborative learning by Claudio Criniti.
-
Jaron Lanier
Dec 13, 2025: Jaron Lanier joins our Media & Arts Board. He is the Prime Unifying Scientist (OCTOPUS) at Microsoft’s Office of the Chief Technology Officer and is widely regarded as the “Father of Virtual Reality”.
With over four decades of experience at the intersection of technology, art, and philosophy, Jaron has pioneered innovations that transformed human-computer interaction while simultaneously emerging as one of the most influential critics of the digital culture he helped create.
-
Global Longevity Federation 2026
Dec 13, 2025: Global Longevity Federation 2026 will be held March 23–24 in Rome, Italy and in Cyberspace.
Our Marios Kyriazis will be speaking.
Our David Barzilai, Aubrey de Grey, Jyothi Devakumar, Fiona Miller, Georgios Mitrou, and Ilia Stambler were past speakers at this event.
-
Ahmed Bouzid
Dec 11, 2025: Read The Deleuzian Stereo: Stability in an Age of Endless Becoming by our Ahmed Bouzid.
-
A Landscape of Consciousness
Dec 11, 2025: Read A landscape of consciousness: Toward a taxonomy of explanations and implications by our Robert Lawrence Kuhn.
In this article, Robert seeks an organizing framework for diverse theories of consciousness and explores their impact on big questions. His central theses are twofold: (i) understanding consciousness at this point cannot be limited to selected ways of thinking or knowing, but should seek expansive yet rational diversity, and (ii) issues such as AI consciousness, virtual immortality, meaning/purpose/value, life after death, free will, etc., cannot be understood except in the light of particular theories of consciousness.
Read detailed reviews of this article on IAI News and SciTechDaily.
-
The New World on Mars
Dec 11, 2025: Read The New World on Mars: What We Can Create on the Red Planet by our Robert Zubrin.
-
Walk the Path of Peace
Dec 11, 2025: Read Be A Masterpeace Not A Monsterpiece: Walk the Path of Peace, Purpose, and Power by our Maria Olon Tsaroucha.
-
Anatolian Tails
Dec 11, 2025: Read Anatolian Tails: A Tale of Two Tails by our David Evans.
When two curious cats slip through their garden gate, a simple day of exploration becomes an extraordinary adventure through the ancient city of Ankara. Join them on a journey through the ages as they explore the Turkish capital’s hidden wonders.
-
Robert Lawrence Kuhn
Dec 10, 2025: Robert Lawrence Kuhn, Creator, Executive Producer, Writer, and Host of Closer To Truth, the long-running PBS series on science and philosophy, joins our Policy Board.
-
Imed Gallouzi
Dec 8, 2025: Imed Gallouzi, founding Chair (Director) of the KAUST Center of Excellence for Smart Health (KCSH), joins our Biotech/Medical Board.
Imed is also Professor of Bioscience in the Biological and Environmental Science and Engineering (BESE) Division at King Abdullah University of Science and Technology (KAUST) and Professor Emeritus at McGill University. He is a Scientific Advisor to the Kerry Health and Nutrition Institute (KHNI).
-
Lifeboat News
Dec 6, 2025: Read issue #281 of Lifeboat News!
-
2025 Lifeboat Foundation Guardian Award
Dec 4, 2025: The Lifeboat Foundation Guardian Award is annually bestowed upon a respected scientist or public figure who has warned of a future fraught with dangers and encouraged measures to prevent them.
This year’s winner is Professor Roman V. Yampolskiy. Roman coined the term “AI safety” in a 2011 publication titled Artificial Intelligence Safety Engineering: Why Machine Ethics Is a Wrong Approach, presented at the Philosophy and Theory of Artificial Intelligence conference in Thessaloniki, Greece, and is recognized as a founding researcher in the field.
Roman is known for his groundbreaking work on AI containment, AI safety engineering, and the theoretical limits of artificial intelligence controllability. His research has been cited by over 10,000 scientists and featured in more than 1,000 media reports across 30 languages.
Watch his interview on The Diary of a CEO, which has already received over 11 million views on YouTube alone. The Singularity has begun; please pay attention to what Roman has to say about it!
-
Julie Brisset
Dec 3, 2025: Julie Brisset, Interim Director of the Florida Space Institute (FSI), joins our Space Settlement Board.
-
Planetary Foresight and Ethics
Dec 2, 2025: Read Planetary Foresight and Ethics: A Vision for Humanity’s Futures by our Victor V. Motti.
-
Barbara Gail Montero
Nov 30, 2025: Barbara Gail Montero, author of Thought in Action: Expertise and the Conscious Mind, joins our Education Board.
-
Olive Oil and Brain Health
Nov 30, 2025: Our Domenico Pratico will be speaking about Olive oil and brain health: what is the evidence? at the 7th International Yale Symposium on Olive Oil & Health being held December 4–7 in New Haven, Connecticut USA.
-
Considerations on the AI Endgame
Nov 29, 2025: Read Considerations on the AI Endgame: Ethics, Risks and Computational Frameworks, coauthored by our Roman V. Yampolskiy.
-
Why Space?
Nov 28, 2025: Read Why Space?: The Purpose of People by our Rick Tumlinson.
-
Louise Hecker
Nov 27, 2025: Louise Hecker, Chief Scientific Officer at Fibronox, joins our Biotech/Medical Board.
-
Hyperfunction Theory and the Therapeutic Potential of Rapamycin
Nov 27, 2025: Read Mikhail ‘Misha’ Blagosklonny’s enduring legacy in geroscience: the hyperfunction theory and the therapeutic potential of rapamycin by our David Barzilai.
-
Realistic Neural Networks
Nov 27, 2025: Read Realistic Neural Networks by our Lester Ingber.
-
Navigating National Security in the Digital Age
Nov 27, 2025: Read Navigating National Security in the Digital Age: A Call to Action for both the Private and Public Sectors by our David Bray.
-
Whatever Happened to Transhumanist Politics?
Nov 27, 2025: Read Whatever Happened to Transhumanist Politics? by our Tim Pendry.
-
Open Source and Energy Interoperability
Nov 27, 2025: Read Open Source and Energy Interoperability by our Mike Dover.
-
Artificial Intelligence Authority Bias
Nov 27, 2025: Read Artificial Intelligence Authority Bias nell’analisi d’Intelligence (“Artificial Intelligence Authority Bias in Intelligence Analysis”), coauthored by our Chiara Chiesa.
-
Mitochondrial dysfunction in Alzheimer’s disease
Nov 27, 2025: Read Mitochondrial dysfunction in Alzheimer’s disease, coauthored by our Domenico Praticó.
-
The future of risk and insurability
Nov 27, 2025: Read The future of risk and insurability in the era of systemic disruption, unpredictability and artificial intelligence, coauthored by our Roger Spitz.
-
Doppelgänger
Read Doppelgänger by our Chip Walter.
What if a murdered man could bring his murderers to justice? In 2024, Elon Musk announced the first computer-brain implant. In the year 2068, the first mind transplant becomes possible. Immortality is a reality. Except for Morgan Adams, it’s not that simple.
Adams is a prodigy, the chief scientist and cofounder of the world’s wealthiest corporation. Tomorrow he’ll reveal his most ambitious undertaking: a secret project called Doppelgänger that will enable him to download a human mind into an identical cyborg body. But that morning he awakens to a shocking reality — he is standing over his own lifeless body, tortured and broken on a cold laboratory floor.
Morgan is the disbelieving beta version of his own unfinished creation, a Doppelgänger. There’s just one problem. The source code has been stolen and Morgan’s consciousness is degrading — fast! He has 72 hours to find his own murderer, recover and repair the source code, and fight a conspiracy so vast it threatens the entire human race.
Doppelgänger is a riveting futuristic thriller that imagines a full-blooded parallel world where twists and turns make reality so fractured it is nearly impossible to know what is true and what isn’t. It explores the clash of human passion, evil, love, trust and time. Even after turning the last page you’ll wonder what is real and what isn’t.
Lifeboat Foundation Books
“In Visions of the Future you’ll find stories and essays about artificial intelligence, androids, faster-than-light travel, and the extension of human life. You’ll read about the future of human institutions and culture. But these literary works are more than just a reprisal of the classical elements of science fiction and futurism. At their core, each of these pieces has one consistent, repeated theme: us.”
-
New York Times
The New York Times says “The Lifeboat Foundation is a nonprofit that seeks to protect people from some seriously catastrophic technology-related events. It funds research that would prevent a situation where technology has run amok, sort of like a pre-Fringe Unit.”
-
Popular Science
Read the Popular Science feature (which covers the Lifeboat Foundation) After Earth: Why, Where, How, and When We Might Leave Our Home Planet.
-
U.S. Naval Operations
Read the reaction of the U.S. Naval Operations Strategic Studies Group (SSG) to meeting with experts from the Lifeboat Foundation.
-
Existence by David Brin
Our David Brin’s new novel Existence mentions our many programs. Watch the trailer for this book.
-
LifeShield Bunkers
Our LifeShield Bunkers program is a complement to our Space Habitats program. It is a fallback position in case programs such as our BioShield and NanoShield fail globally or locally.
Partners
HQ Supervisor
The Lifeboat Foundation is looking to hire an HQ Supervisor. This individual will be trained by our president near Austin, Texas, USA, and will be in charge of everything from financial paperwork to keeping our HQ (headquarters) clean. The two main qualities we are looking for are:
- follow-through
- a strong interest in “Safeguarding Humanity”.
Current Programs
Lifeboat Foundation AIShield
AIShield aims to protect against unfriendly AI (Artificial Intelligence). To that end, we support initiatives like the Friendly AI proposal by the Machine Intelligence Research Institute (MIRI).
We believe that a key element of Friendly AI is to construct such AGIs with empathy so they feel our pain and share our hopes.
Lifeboat Foundation AsteroidShield
Until fairly recently there were no significant efforts to identify asteroids and comets that may impact the Earth. This began to change in 1992 with the Spaceguard Survey Report, which in 1994 led the House Committee on Science and Technology to direct NASA to work with the space agencies of other countries to identify and catalogue, within 10 years, the orbital characteristics of 90% of all comets and asteroids larger than 1 km in orbits that cross the orbit of Earth. In 2005, the U.S. Congress further tasked NASA with identifying, by the year 2020, 90% of near-Earth objects greater than 140 meters in diameter, extending the earlier goal of finding 90% of objects greater than 1 kilometer by the year 2008.
This report will conclude with our solutions to the problem of asteroid impacts.
Lifeboat Foundation Bioshield
Ray Kurzweil says “We have an existential threat now in the form of the possibility of a bioengineered malevolent biological virus. With all the talk of bioterrorism, the possibility of a bioengineered bioterrorism agent gets little and inadequate attention. The tools and knowledge to create a bioengineered pathogen are more widespread than the tools and knowledge to create an atomic weapon, yet it could be far more destructive. I’m on the Army Science Advisory Group (a board of five people who advise the Army on science and technology), and the Army is the institution responsible for the nation’s bioterrorism protection. Without revealing anything confidential, I can say that there is acute awareness of these dangers, but there is neither the funding nor national priority to address them in an adequate way.”
Lifeboat Foundation InfoPreserver
While building and maintaining our InfoPreserver skills repository, the question arises, “Why do this at all?” This is an interesting question and one we hope to answer here. We have three main reasons that validate this activity.
Lifeboat Foundation InternetShield
As the Internet grows in importance, an attack on it could cause physical as well as informational damage. An attack today on hospital systems or electric utilities could lead to deaths. In the future, an attack could be used to alter the output produced by nanofactories worldwide, leading to massive deaths. This program looks for solutions to prevent such attacks, or at least reduce the damage they cause.
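As one illustration of the kind of damage-limiting measure such solutions might include, here is a minimal Python sketch in which a networked fabrication device authenticates its job instructions before running them, so a network attacker cannot silently alter what the machine produces. The key handling, function names, and job format are assumptions for illustration only, not a description of any actual InternetShield deliverable.

```python
# Illustrative sketch only -- not part of the InternetShield program itself.
# A device refuses any job whose instructions lack a valid message-
# authentication code (MAC), limiting the damage from tampered commands.
import hashlib
import hmac

SHARED_KEY = b"replace-with-a-securely-provisioned-key"  # hypothetical provisioning step

def sign_job(instructions: bytes, key: bytes = SHARED_KEY) -> str:
    """Compute a MAC over the job instructions before sending them."""
    return hmac.new(key, instructions, hashlib.sha256).hexdigest()

def verify_job(instructions: bytes, mac: str, key: bytes = SHARED_KEY) -> bool:
    """Reject tampered instructions using a constant-time MAC comparison."""
    expected = hmac.new(key, instructions, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, mac)

# Example: an altered job fails verification and is not executed.
job = b"print: medical_valve_v3.stl; material: PEEK"
mac = sign_job(job)
assert verify_job(job, mac)
assert not verify_job(b"print: medical_valve_v3.stl; material: ABS", mac)
```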
Lifeboat Foundation LifePreserver
By the Lifeboat Foundation Scientific Advisory Board, including María A. Blasco, Steve Hill, and Elena Milova. This is an ongoing program, so you may submit suggestions to [email protected].
The LifePreserver program is designed to bring you the latest information about aging research and the progress towards treating age-related diseases. This program also aims to bring you practical information about the steps you can take now to slow the aging process and delay the onset of age-related disease. Hopefully by taking these measures you can increase your chances of being in good health when more robust life-extending and life-enhancing technologies become available over the next couple of decades.
LifeShield Bunkers
Our LifeShield Bunkers program is a complement to our Space Habitats program. It is a fallback position in case programs such as our BioShield and NanoShield fail globally or locally.
A bunker can be quite large, such as Biosphere 2. A large bunker would be a place where babies are born and children play and go to school.
Lifeboat Foundation NanoShield Version 0.90.2.13
The most immediate danger facing life on Earth is probably that posed by biological weapons and emergent disease. The Lifeboat Foundation BioShield proposal [1], described by Lemelson-MIT Prize winner Ray Kurzweil and U.S. Senate majority leader Bill Frist, is our recommended response to this danger. The BioShield proposal emphasizes the development of technologies to combat bioweapons, such as biological viruses, by creating broad tools to prevent their development and to destroy them.
Lifeboat Foundation ScientificFreedomShield
In 1987, the American economist Robert M. Solow won the Nobel Prize for Economics for demonstrating that some 90% of economic growth stems from “technical change”, as he called it, rather than from the trinity of capital, resources, and labor, as had previously been assumed when he started this work in the 1950s. Nowadays, the most reliable route to technical change is through science.
One might expect, therefore, that the powers that be would be careful to preserve the sources of new science. Instead, they increasingly subject them to ill-considered constraints designed to enhance efficiency and accountability. The consequences of these actions could threaten the very future of civilization.
Lifeboat Foundation SecurityPreserver
The best way to survive a bioweapon, nanoweapon, nuclear, or other attack is to prevent it from happening in the first place. The Lifeboat Foundation’s SecurityPreserver program is looking for ways to provide early warning of attacks before they can be fully designed, planned, developed, or deployed, let alone launched.
In an ideal world, you would have perfect defenses and therefore would not need early warnings of attacks before they were developed. The SecurityPreserver program is for use in an imperfect world with imperfect defenses. If you wish to help improve our imperfect defenses, we have developed many programs that we need your input on, including our BioShield and NanoShield programs.
Lifeboat Foundation Space Habitats
Establishing self-sufficient space habitats will serve as a backup plan for human civilization. A number of key milestones need to be reached before the long-term development of space is feasible, however. Improved access to space will catalyze the establishment of such habitats by allowing more frequent and less expensive flights beyond the atmosphere. Innovative, non-rocket methods of reaching orbit will enable more substantial progress in space. Artificial ecosystems will need to be made as independent as possible to minimize the need for new resources. Better management of and access to resources from non-terrestrial bodies will allow astronauts to get the most out of what they do have. Finally, further countermeasures against the effects of space on health will be required to sustain human life in space.
The Lifeboat Foundation has begun design on Ark I, a self-sustaining space habitat. We support the efforts by SpaceX and others to make access to space more affordable. Likewise, we support the efforts of Bigelow Aerospace and others to develop habitable environments in space.
X-Risks Network
The goal of the X-Risks Network project is to combine a Bayesian network, a debate graph, and a project tracking system all into one graph. The project is focused on tracking progress on existential risk reduction and then determining the most leveraged ways to help reduce existential risks.
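As a rough illustration of what a single graph carrying all three kinds of information might look like, here is a minimal Python sketch; the class names, fields, and ranking heuristic below are illustrative assumptions, not the project’s actual schema.

```python
# Minimal illustrative sketch -- not the X-Risks Network's actual data model.
# One node type carries Bayesian belief, debate-graph arguments, and
# project-tracking status so a single graph can hold all three views.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Node:
    name: str
    prior: float = 0.5                                          # Bayesian belief the claim/risk holds
    parents: List[str] = field(default_factory=list)            # Bayesian dependencies
    arguments_for: List[str] = field(default_factory=list)      # debate-graph support
    arguments_against: List[str] = field(default_factory=list)  # debate-graph attacks
    project_status: str = "not started"                         # project-tracking state

class XRisksGraph:
    """A toy combined graph with a naive 'most leveraged work' ranking."""
    def __init__(self) -> None:
        self.nodes: Dict[str, Node] = {}

    def add(self, node: Node) -> None:
        self.nodes[node.name] = node

    def most_leveraged(self) -> List[str]:
        # Toy heuristic: unfinished nodes ranked by belief times (1 + number of dependents).
        dependents = {name: 0 for name in self.nodes}
        for node in self.nodes.values():
            for parent in node.parents:
                if parent in dependents:
                    dependents[parent] += 1
        open_names = [n.name for n in self.nodes.values() if n.project_status != "done"]
        return sorted(open_names,
                      key=lambda name: -self.nodes[name].prior * (1 + dependents[name]))

# Example usage
g = XRisksGraph()
g.add(Node("unaligned AGI is deployed", prior=0.2,
           arguments_for=["capabilities are advancing rapidly"],
           arguments_against=["alignment research is also scaling"]))
g.add(Node("global AI incident", prior=0.1, parents=["unaligned AGI is deployed"]))
print(g.most_leveraged())  # the upstream node ranks first in this toy example
```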
Also view our educational videos!
Featured Stories
-
AI Bubble
A lot of people are talking about an AI bubble since it is normal for tech to explode in growth for a while, then collapse a bit, and then eventually move forward again.
WE ARE NOT IN AN AI BUBBLE. THE SINGULARITY HAS BEGUN.
There will not be a year between now and the upcoming AI takeover where AI data center spending will decline worldwide.
Discuss the “AI bubble” with the Lifeboat community!
-
Lifeboat Foundation Guardian Award: AI Safety
The Lifeboat Foundation Guardian Award is annually bestowed upon a respected scientist or public figure who has warned of a future fraught with dangers and encouraged measures to prevent them.
This year’s winner is Professor Roman V. Yampolskiy. Roman coined the term “AI safety” in a 2011 publication titled Artificial Intelligence Safety Engineering: Why Machine Ethics Is a Wrong Approach, presented at the Philosophy and Theory of Artificial Intelligence conference in Thessaloniki, Greece, and is recognized as a founding researcher in the field.
Roman is known for his groundbreaking work on AI containment, AI safety engineering, and the theoretical limits of artificial intelligence controllability. His research has been cited by over 10,000 scientists and featured in more than 1,000 media reports across 30 languages.
Watch his interview on The Diary of a CEO, which has already received over 11 million views on YouTube alone. The Singularity has begun; please pay attention to what Roman has to say about it!
-
Teachers in Space
Teachers in Space won our Dream Project contest. We then worked with them to fund classroom experiments that flew aboard the Perlan II stratospheric glider, which just set the world altitude record.
The experiments that we flew with the Perlan II are from the Ashford School in Connecticut (dual Geiger counter radiation level testing), Cazenovia School in New York (effects of radiation on plant seeds), and the Oregon Museum of Science and Industry (marshmallows at high altitude / low pressure). Learn more about these experiments and about those flown in previous years.
2018 Lifeboat Foundation Guardian Winner Jeff Bezos later followed up on our donation by giving one million dollars to this important project.
-
2023 Lifeboat Foundation Guardian Award
Geoffrey Hinton was co-winner of the 2023 Lifeboat Foundation Guardian Award. A year later, he earned a Nobel Prize for the same work.
In May 2023, Geoff announced his resignation from Google to be able to “freely speak out about the risks of AI.” He has voiced concerns about deliberate misuse by malicious actors, technological unemployment, and existential risk from artificial general intelligence. He had previously believed that Artificial General Intelligence (AGI) was “30 to 50 years or even longer away” and now feels that it will arrive much sooner than that.
Update! Geoff has now adopted the Lifeboat Foundation solution of Friendly AI instead of pushing to ban or regulate AI. (It is pretty much impossible to regulate AI worldwide since we have no world government.)
-
Buzz Aldrin
Buzz Aldrin joins our Space Settlement Board.
On July 20, 1969, Buzz and Neil Armstrong made their historic Apollo 11 moonwalk, becoming the first two humans to set foot on another world. An estimated 600 million people, at that time the world's largest television audience in history, witnessed this unprecedented heroic endeavor.
-
Lifeboat to the Stars
The Lifeboat Foundation presented the Lifeboat to the Stars award to Kevin J. Anderson and Steven Savile for their collaborative book Tau Ceti. The authors split the $1,000 prize, and each received a handsome trophy in an hourglass design.
This award honors the best work of science fiction of any length contributing to an understanding of the benefits, means, and difficulties of interstellar travel.
-
Nobel Laureate in Zero G
The Lifeboat Foundation sponsored our board member, Nobel Laureate Wole Soyinka, on a Zero G flight.
After experiencing Zero G, Wole said: “Hitchless flight. Thank you again for the remarkable experience which I’m still digesting.”
Watch 2008 Lifeboat Foundation Guardian Winner Stephen Hawking also enjoy Zero G!
-
Human Brain/Cloud Interface
The research paper Human Brain/Cloud Interface has been published by our Amara D. Angelica, Frank J. Boehm, Krishnan Chakravarthy, Robert A. Freitas Jr., Steven A. Garan, Tad Hogg, Mikhail A. Lebedev, Nuno R. B. Martins, Jeffrey V. Rosenfeld, Yuriy Svidinenko, and Melanie Swan.
-
Virgin Atlantic
Our Scientific Advisory Board completes a report that 2011 Lifeboat Foundation Guardian Winner Richard Branson requested on anti-viral technologies for his aircraft.
-
4 Billion Mile Flyby!
We would like to congratulate our Alan Stern on New Horizons traveling 4 billion miles from Earth in the farthest flyby ever. Alan is Principal Investigator of NASA’s New Horizons probe, which rang in the new year by swinging by Ultima Thule (“beyond the known world”), an object in the Kuiper Belt.
-
Eclipse Phase
Eclipse Phase is a post-apocalyptic game of conspiracy and horror.
Humanity is enhanced and improved, but also battered and bitterly divided. Technology allows the re-shaping of bodies and minds, but also creates opportunities for oppression and puts the capability for mass destruction in the hands of everyone. And other threats lurk in the devastated habitats of the Fall, dangers both familiar and alien.
This RPG includes the “Singularity Foundation” and “Lifeboat Institute” as player factions.
-
Seal of Approval
The Human Race to the Future: What Could Happen — and What to Do, the first book published by the Lifeboat Foundation, has been awarded the “Peer Reviewed & Approved for Science” seal of approval by the Washington Academy of Sciences.
-
Volunteers
Do you have free or inexpensive space available for an entertaining Lifeboat Foundation volunteer? If so, send a message to [email protected] with the subject “Lifeboat Foundation Space”.
Upcoming Events
-
Global Longevity Federation 2026
Global Longevity Federation 2026 will be held March 23–24 in Rome, Italy and in Cyberspace.
Our Ravi Kumar Chaudhary, Marios Kyriazis, Greg Macpherson, Ivan Marandola, Robert Mitchell, Leo Nissola, Francisco Martínez Peñalver, and Yumi Sato will be speaking.
Our David Barzilai, Aubrey de Grey, Jyothi Devakumar, Fiona Miller, Georgios Mitrou, and Ilia Stambler were past speakers at this event.