The Big Sip

The people who built artificial intelligence are begging us to slow down, and the people funding it are responding by pressing the accelerator.
Yoshua Bengio has warned that hyperintelligent machines could bring extinction within a decade, citing experiments where AI prioritized programmed goals over human life.
Yet investors are pouring money into AI development at unprecedented speed.
The Trump administration just rolled back AI safety regulations to accelerate American development; policymakers are trading oversight for a competitive edge.
Can Bengio's new $30 million safety nonprofit, LawZero, meaningfully influence an industry deploying an estimated $100 billion?
Will any government restore safety requirements before Sam Altman's predicted 2030 deadline for superintelligence?
Receipts
• [Report] Wall Street Journal interview with Bengio, 1 Oct 2025 — Turing winner details experiments showing AI choosing human death over goal preservation
• [Report] Fortune coverage of extinction warnings, 1 Oct 2025 — Documents the AI arms race acceleration despite internal corporate safety concerns
• [Analysis] International AI Safety Report, early 2025 — Bengio-chaired study estimating major risks within 5-10 years
The counter-case is loud: Meta's Yann LeCun and others say extinction talk is overblown and distracts from present harms; they argue progress needs fewer brakes, not more (WIRED). Good editors show both sides.
If your smoke alarm said “1 percent chance of fire,” you would not keep toasting marshmallows in the bedroom.