And why do you think that just because it's a scenario that's been depicted in fiction it can't actually happen? Have you not seen how many scifi concepts from older movies, shows and games have come to pass? Science fiction is, after all, an attempt to predict what the future may hold. Writers don't always get it right, but sometimes they do.
Also, for the record, the ACTUAL programmers who were working on these LLMs before they resigned are the ones sounding the alarm bells about this the loudest.
October: "There's no way AI can catch up. Look at this slop."
December: "...okay, it's getting better, but still, not at all human enough."
Late January: "Alan, we are so fucked."
And here I've been for the past 15 years warning about how the A.I. singularity could destroy us all. Almost nobody took me seriously. They said I was being "too pessimistic" and "worrying over nothing."
Except it's not even doing that. LLM companies are losing BILLIONS of dollars because the cost of maintaining those city-sized server racks vastly outpaces their ability to sell A.I. crap to people. That's why it's being called the "A.I. bubble."
While the fear is easy to latch onto, it's far more likely that the bubble is going to burst...
AI consumes a lot of electricity and water and damages the environment at a rapid pace. And it does all that to produce... currently, next to nothing. Nobody is buying into it.
And if nobody starts buying into it while the billionaires keep pouring tons of cash into it, it will crash, because they'll be unable to pay back the bill on something that's still not making money.
While I agree that is the more likely scenario, considering what's at stake, I don't think it's a risk that should be allowed to be taken at all. Even if the bubble bursts this time, without global regulations, what's to stop some other crazy billionaire from trying again in a few years? The same can be argued about nuclear weapons: sure, we've gone 80 years without blowing ourselves up, but given the number of close calls there have been and how frighteningly easy it would be to set off a civilization-ending nuclear war, is it really worth letting governments keep all those nukes locked and loaded?
Or as Ian Malcolm once put it, "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they SHOULD!"