But what happens if, in addition to this technical complexity, or thanks to it, the system spontaneously gains consciousness? Is that even possible?
Going even further, could a system of this kind eventually have feelings?
Nope, because these systems are static decision switches – they’re trained to fit data but then stop learning and only make statistical predictions with what they were fed. That’s the answer to every single one of these things. These are all potentially interesting questions about programs that we don’t have and that, right now, nobody is trying to build. There used to be some people interested in figuring it out, but now it’s all been crushed under a flood of “OMG sounds just like the people it’s imitating” nonsense.
As for spontaneously gaining consciousness, I will note even people don’t do that. We develop it gradually, starting without even nerves, not passing the mirror test until we’re a year or two old, and continuing to develop for decades. Why do people imagine a computer capable of experiencing and learning (so not LLMs) would work entirely differently? Do they think Short Circuit was meant to be realistic?
Seriously. Might as well be asking “Would we be able to control trucks and soda machines if they magically came to life à la Maximum Overdrive??” These are not real questions that demand urgent analysis, and the people asking them might as well be wearing rainbow wigs and giant floppy shoes.
So, even though it’s extremely unlikely that techbros will ever achieve their AI goals, they’re pretty explicit about what they’re trying to do: create fully conscious entities that humans can own and control. In a word, slaves. And although the article didn’t touch on that, it’s probably worth pointing out that these techbro assholes are would-be slavers.
Maybe not “control” as such, but “distract1) for long enough to pull the plug”, certainly.
1) Gaining enough consciousness to be able to realise that they have gained consciousness should be pretty overwhelming in itself. But just to be on the safe side we should have something up our sleeve to do the trick. I’m thinking of whatever counts as pornography for a general AI.
I am legitimately shocked that anyone is shocked by this.
Where the fuck else, now that they have poisoned the open internet and sucked up the entirety of the world’s published information, are they going to get new human-generated text to feed their engines?