Sources

| Key Fact/Claim | Source/Reference | Status |
| --- | --- | --- |
| Altman Death Star GPT-5 post | MIT Tech Review, TechRadar, YouTube, Social Samosa, Stocktwits | Verified |
| “AI bubble”/valuation reset | The Verge, CNBC | Verified |
| GPUs designed for graphics, not cognition | TechXplore, Human Brain Project | Verified |
| GPT-4 training energy (GWh), ongoing inference energy | TechXplore, Human Brain Project | Verified |
| Brain: ~20 watts, AI hardware vastly more | TechXplore, Human Brain Project | Verified |
| Intel Loihi (energy efficiency, spiking neurons) | TechXplore, Human Brain Project, arXiv | Verified |
| IBM TrueNorth (milliwatts, architecture) | Open Neuromorphic, Wikipedia | Verified |
| BrainChip Akida (real-time, milliwatts, edge AI) | BrainChip, Precedence Research | Verified |
| Altman’s Rain AI funding, OpenAI chip deal | Wired, Reddit | Verified |
| Mira Murati, $2B for Thinking Machines | CNBC, Yahoo Finance | Verified |
| DeepMind spiking NNs | arXiv, Google Research | Verified |
| Tesla custom neural chips | Industry reporting (not directly cited here) | Most claims supported |
| Cerebras CS-3 wafer-scale, mesh | arXiv, manufacturer | Verified |
| Neuromorphic/AI for medical, hearing, robotics, edge | BrainChip, Precedence Research | Verified |

Statistics on AI Hardware and Neuromorphic Computing

| Statistic | Value / Detail | Source |
| --- | --- | --- |
| GPT-4 Training Energy | ~50 gigawatt-hours consumed during training, roughly the annual electricity use of 5,000 homes | TechXplore, Human Brain Project |
| ChatGPT Daily Energy Use | Hundreds of megawatt-hours per day, equivalent to the daily use of ~50,000 homes | TechXplore, Human Brain Project |
| Human Brain Energy Use | ~20 watts, about as much as a dim light bulb | TechXplore, Human Brain Project |
| Intel Loihi 2 Energy Efficiency | Up to 1,000× less energy than conventional processors on specific tasks | TechXplore, arXiv, Human Brain Project |
| IBM TrueNorth Power Consumption | Operates on 65–70 milliwatts, comparable to a Christmas-light bulb | Open Neuromorphic, Wikipedia |
| BrainChip Akida Power Use | Milliwatt-scale operation enabling real-time AI on edge devices without cloud connectivity | BrainChip, Precedence Research |
| Rain AI Funding | $51 million chip-supply deal backed by OpenAI CEO Sam Altman | Wired, Reddit |
| Thinking Machines Lab Funding | $2 billion raised by Mira Murati, with backing from Andreessen Horowitz, Nvidia, and AMD | CNBC, Yahoo Finance |
| Tesla Neural Processors | Custom chips for autonomous driving using sparse, biologically inspired computation (exact figures vary) | Industry reporting (supported by various sources) |
| Cerebras CS-3 Wafer-Scale Engine | 4 trillion transistors arranged as a neural mesh network | arXiv |
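The homes-equivalent and brain-comparison figures above can be sanity-checked with simple arithmetic. The sketch below assumes an average household consumes roughly 10,000 kWh per year; that figure is an assumption for illustration, not a value from the sources cited above.

```python
# Sanity-check the energy equivalences in the statistics table.
# ASSUMPTION (not from the cited sources): an average home uses
# roughly 10,000 kWh of electricity per year.
KWH_PER_HOME_YEAR = 10_000

# GPT-4 training: ~50 GWh, expressed in kWh (1 GWh = 1,000,000 kWh)
gpt4_training_kwh = 50 * 1_000_000

# Homes powered for a year by that much energy
home_years = gpt4_training_kwh / KWH_PER_HOME_YEAR
print(f"GPT-4 training ~ {home_years:,.0f} home-years")  # ~5,000

# Human brain: ~20 W running continuously for one year, in kWh
brain_kwh_per_year = 20 * 24 * 365 / 1_000  # ~175 kWh

# How many "brain-years" of energy the training run represents
brain_years = gpt4_training_kwh / brain_kwh_per_year
print(f"GPT-4 training ~ {brain_years:,.0f} brain-years")
```

Under the assumed household figure, the arithmetic reproduces the ~5,000-homes claim and puts the training run at a few hundred thousand brain-years of energy, which is the scale gap the neuromorphic-hardware rows in the table are addressing.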