We need to Pause AI
- We call for a prohibition on the development of superintelligence, not lifted before there is broad scientific consensus that it will be done safely and controllably, and strong public buy-in

Statement on Superintelligence
110,000+ signatories including AI researchers, political, faith and industry leaders, artists and media celebrities
- Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.

Statement on AI Risk
Signed by hundreds of experts, including the top AI labs and scientists
- If you take the existential risk seriously, as I now do, it might be quite sensible to just stop developing these things any further.

Geoffrey Hinton
Nobel Prize winner & "Godfather of AI"
- The development of full artificial intelligence could spell the end of the human race.

Stephen Hawking
Theoretical physicist and cosmologist
- ... we should have to expect the machines to take control.

Alan Turing
Inventor of the modern computer
- If we pursue [our current approach], then we will eventually lose control over the machines.

Stuart Russell
Author of the standard AI textbook
- Rogue AI may be dangerous for the whole of humanity. Banning powerful AI systems (say beyond the abilities of GPT-4) that are given autonomy and agency would be a good start.

Yoshua Bengio
AI Turing Award winner
- …% of AI scientists believe the alignment problem is real & important
- …% of citizens want AI to be slowed down by our governments