Dumb AI is a bigger risk than strong AI

Sep 5, 2022 | Technology


The year is 2052. The world has averted the climate crisis by finally adopting nuclear power for the majority of its electricity generation. Conventional wisdom now holds that nuclear power plants are a solved problem of complexity; Three Mile Island is a punchline rather than a disaster. Fears around nuclear waste and plant meltdowns have been alleviated primarily through better software automation. What we didn’t know was that the software for all nuclear power plants, made by a handful of vendors around the world, shared the same bias. After two decades of flawless operation, several unrelated plants fail in the same year. The council of nuclear power CEOs realizes that everyone who knows how to operate Class IV nuclear power plants is either dead or retired. We now have to choose between modernity and unacceptable risk.

Artificial intelligence, or AI, is having a moment. After a multi-decade “AI winter,” machine learning has awakened from its slumber to find a world of technical advances like reinforcement learning and transformers, along with computational resources that have finally matured enough to exploit them.

AI’s ascendance has not gone unnoticed; in fact, it has spurred much debate. The conversation is often dominated by those who fear AI. These people range from ethical AI researchers worried about bias to rationalists contemplating extinction events. Their concerns tend to revolve around AI that is hard to understand or too intelligent to control, ultimately e …

