
I could see self-modifying code fucking shit up if it were given unchecked control over things like power, material production and manufacturing, and communications infrastructure. But what you're describing is a type of program people are calling "AI" (because that's the marketing buzzword) actually gaining sentience and burying itself in a variety of digital systems it was never compiled for. That's pure cyberpunk sci-fi.

What we call AI in popular culture now is really just programs that do lots of calculus very fast. Even if they could design new hardware, then procure materials for it and manufacture it end to end, the worst case is we'd end up with weird shit that doesn't do what we want, and we'd pull the plug and start over. There wouldn't be a Neuromancer-style global system-hopping sentience, let alone a Skynet scenario. There would be blackouts and supply chain interruptions, not the apocalypse. And that's if we do the dumbest thing possible with the worst code possible.
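To put the "lots of calculus very fast" point concretely: under the hood, most of what gets marketed as AI is repeated derivative calculations driving a numerical optimizer. Here's a toy sketch (the function and step sizes are just illustrative, not anyone's real model) of gradient descent finding the minimum of f(x) = (x - 3)^2 by following its derivative:

```python
# Toy illustration: "AI" training is largely this loop, scaled up to
# billions of parameters -- calculus (derivatives) applied very fast.

def f_prime(x):
    """Derivative of f(x) = (x - 3)^2, i.e. f'(x) = 2(x - 3)."""
    return 2.0 * (x - 3.0)

def gradient_descent(start, learning_rate=0.1, steps=100):
    """Repeatedly step downhill along the slope of f."""
    x = start
    for _ in range(steps):
        x -= learning_rate * f_prime(x)
    return x

minimum = gradient_descent(start=0.0)
print(round(minimum, 4))  # converges toward 3.0, the minimum of f
```

No sentience anywhere in that loop, just arithmetic repeated until a number stops moving. Real systems swap in a vastly bigger function and fancier update rules, but the character of the computation is the same.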