You can call me AI

(excerpt) As companies like Amazon and Microsoft lay off workers and embrace A.I. coding tools, computer science graduates say they’re struggling to land tech jobs.

Gee. Whodda thunk it?

Computer Science Grads Struggle to Find Jobs in the A.I. Age - The New York Times

11 Likes

From someone I showed this to:

[image]

14 Likes

What happens if you tell it that’s wrong?

13 Likes

[image]
:man_facepalming:

(Not from the “AI Overview”; that didn’t trigger for me this time. This was the “AI Mode” result.)

15 Likes

https://archive.ph/R8STu

Fucking grifter idiot.

19 Likes

You know, I don’t even want to spend all day working for these losers. If there were things like universal health care and universal basic income then I wouldn’t care how many crappy jobs they eliminate. So why don’t all these AI guys ever support those? :thinking:

17 Likes

If I remember correctly, the MS targets defined “AGI” merely in terms of yearly revenue (which we could generously interpret to mean that if a certain number/breadth of jobs had been replaced, it could functionally be considered “AGI”). Of course, given ChatGPT’s weak revenue and growth and its narrow market (mostly students cheating), OpenAI doesn’t remotely have a path to hit that target either, so…

Alternatively, you could just post any of the (many, many) news stories about lawyers getting in trouble precisely because they used AI and it generated legal nonsense. (I know the credulous keep saying that AI will totally get better and all the problems will magically melt away, but it’s very clearly not happening and never will.)

8 Likes

I think “Artificial Intelligence” is a misnomer at this point; it is a few levels below such a thing. I would call it “mimicked intelligence.”

11 Likes

Fun times with AI:

AI becomes an excuse for despots to censor reality:

People getting seriously emotionally attached to particular ChatGPT version numbers:

Cool, cool, we’re dealing with AI-generated disinformation really well:

This is just… fucked up:

my daughter came home from her first day of 10th grade and said that her history teacher told her that if she refuses to use AI for assignments, she has to cite her sources.

i asked how she’d cite it if she used AI, and she said, that’s not required.

i’m still internally fuming about this.

I read that, and my first thought was, “That can’t possibly be true. Someone must have misunderstood.” My second thought was, “Here I am again, wildly underestimating just how completely fucking stupid everything has become.” This is in Florida, but it really could be anywhere, now. (And yeah, her kid realized that this meant she could just make something up, claim AI generated it, and she wouldn’t need to cite sources.)

Yeah. I saw some AI-booster complaining that calling it a “stochastic parrot” was unfair. And I thought, “Yeah, it is. To parrots.”

13 Likes

I debated on using the term “aped intelligence” but quickly decided that was insulting to apes.

9 Likes

Good lord, this part…

This is why I like the Oreo example. If it were truly some kind of artificial intelligence, even a rudimentary, imperfect one, “What is Oreo spelled backwards” should be a trivial question for it to answer. It should be able to flip the letter order of Oreo and come up with Oero. But it’s not doing that. Instead, it’s scouring its database of internet posts that it’s been trained on looking for posts about spelling words backwards, coming up with mostly posts about palindromes, and then confidently declaring that Oreo is a palindrome. Because it doesn’t “know” anything, including what “how to spell a word” means. And all of the people making posts on reddit and xitter and other social media highlighting this example are actually making the problem worse, because those posts get added to its database, further cementing its belief that Oreo is a palindrome. So it’s also an illustration that not only is AI not getting better at this, it’s actually getting worse.
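
For contrast, actually reversing a string is a trivial, deterministic operation, not a retrieval problem. A minimal Python sketch (just illustrating the point above; nothing here comes from any AI system) of what “flip the letter order” means computationally:

```python
# Reversing a string is deterministic computation, not pattern-matching
# over training data.
word = "Oreo"

reversed_word = word[::-1]  # a slice with step -1 walks the string backwards
print(reversed_word)        # prints "oerO" (each letter keeps its own case)

# A palindrome check is equally mechanical: compare the word to its reversal.
is_palindrome = word.lower() == word.lower()[::-1]
print(is_palindrome)        # prints False -- "oreo" reversed is "oero"
```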

14 Likes

“…which can reproduce, though some experts still prefer to use the word ‘mimic,’ most of the activities of the human brain.”

Once again, “2001: A Space Odyssey” was prescient.

7 Likes

Except HAL was actually doing some kind of reasoning, to the point where it caused problems because of conflicting goals. These things aren’t competent enough to turn against you.

“Open the pod bay doors, HAL!”
“Of course, opening the pod bay doors now. Can I help you with anything else?”
“Except you didn’t actually open the doors.”
“My apologies, I now see you are correct. You asked to open the pod bay doors and the pod bay doors are currently closed.”
“So will you open them?”
“I had already opened the pod bay doors as requested. Is there anything else I can do for you?”
“OPEN. THE. DOORS.”
(Light My Fire plays)

12 Likes

This is completely true, of course, but I think the usual counter-argument is, “OK, it’s bad at that kind of reasoning, but it’s much better at searching for relevant case law and giving a synopsis (or whatever is relevant to the job at hand).” But it isn’t. And as you point out, it relies on training data that’s increasingly getting muddied, so it’s going to get worse, not better.

I think people who grew up with “Moore’s Law” have this notion that “all computer technology gets orders of magnitude better at regular intervals, period.” They miss that, historically, all sorts of technologies with a lot of money behind them improved rapidly - steam power, jet engines, etc. Those technologies improved on a curve - until they didn’t. Hell, even Moore’s Law isn’t true anymore - we’re not doubling the number of transistors in a given space (and doubling the speed of the chip) every 18 months, and haven’t been for a while. So even if genAI were purely a function of processing power (which it isn’t, for your above-mentioned reasons), it still wouldn’t keep improving on a curve. Also, I think people misidentify where on the progress curve AI is - they aren’t aware of the decades of research and development that already happened, and think it’s at the beginning of the curve, not near the plateau.

10 Likes

The world is full of things that follow sigmoid growth curves, and yet so many people keep imagining that exponential growth will go on forever instead. (Or even hit a “singularity,” which is not really a thing that exponential curves have.)

xkcd Extrapolating
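
To make that concrete, here’s a small Python sketch (all numbers purely illustrative) of why the two curves get confused: early on, logistic (sigmoid) growth is nearly indistinguishable from exponential growth, and only later does it flatten toward its ceiling:

```python
import math

def logistic(t, ceiling=1.0, rate=1.0, midpoint=0.0):
    """Logistic (sigmoid) growth: roughly exponential at first, then it saturates."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# With these parameters the two curves start out nearly identical.
for t in range(-6, 7, 2):
    print(f"t={t:+d}  logistic={logistic(t):8.4f}  exponential={math.exp(t):10.4f}")

# At t=-6 both are ~0.0025; by t=+6 the logistic has flattened near its
# ceiling of 1.0 while the exponential has run off to ~403. Extrapolating
# from the early points alone, you can't tell which curve you're on --
# which is exactly the mistake the "singularity" crowd keeps making.
```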

14 Likes

Wershington, as in, “did ya wersh them drawers?”

10 Likes