When it comes to robots and artificial intelligence (AI), it's time to be afraid — be very afraid.
The latest tech luminary to join those voicing such fears for the future is Apple co-founder Steve Wozniak, who told the Australian Financial Review (AFR): "Computers are going to take over from humans, no question."
Wozniak is not the only super-rich, super-intelligent person to express these worries. He joins Tesla Motors and SpaceX chief Elon Musk, Microsoft co-founder Bill Gates, and British physicist Stephen Hawking, all of whom have warned of a "Terminator" future in which super-intelligent computers decide that humans are not as smart as they are and, therefore, that it's time to take over.
"Like people including Stephen Hawking and Elon Musk have predicted, I agree that the future is scary and very bad for people. If we build these devices to take care of everything for us, eventually they'll think faster than us and they'll get rid of the slow humans to run companies more efficiently," Wozniak told AFR.
"Will we be the gods? Will we be the family pets? Or will we be ants that get stepped on? I don't know about that."
Wozniak also expressed concern about quantum computers, machines that are, so far only in theory, capable of running extremely complicated calculations at dazzling speed.
"In the end, we just may have created the species that is above us," he said.
Gates commented on Reddit, "I am in the camp that is concerned about super intelligence. First the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that, though, the intelligence is strong enough to be a concern. I agree with Elon Musk and some others on this and don't understand why some people are not concerned."
Musk has pledged $10 million to the Future of Life Institute (FLI) for research grants to study the negative implications of AI.
He told students at the AeroAstro Centennial Symposium at MIT: "With artificial intelligence, we are summoning the demon. In all those stories where there's the guy with the pentagram and the holy water, it's like, yeah, he's sure he can control the demon. Doesn't work out.
"I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, it's probably that. So we need to be very careful. I'm increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don't do something very foolish."
Quartz quoted Musk as saying: "In the movie 'Terminator,' they didn't create A.I. to…they didn't expect, you know, some sort of 'Terminator' like outcome."
Hawking told the BBC that AI "would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded.
"The development of full artificial intelligence could spell the end of the human race."
© 2025 Newsmax. All rights reserved.