“There’s the stat that I always think is crazy, the average American, I think, has fewer than three friends,” Zuckerberg told Patel. “And the average person has demand for meaningfully more, I think it’s like 15 friends or something, right?”
“The average person wants more connectivity, connection, than they have,” he concluded, hinting at the possibility that the discrepancy could be filled with virtual friends.
Zuckerberg argued that we simply don’t have the “vocabulary” yet to ascribe meaning to a future in which we seek connection from an AI chatbot.
However, he also admitted there was a “stigma” surrounding the practice right now and that the tech was “still very early.”
That’s so sad.
No, Zuck, you don’t want to have 15 real friends. Friendly acquaintances, sure, but you cannot be a consistent and true friend to that many people.
Looking back at the 2013 movie Her, it’s remarkable how its depiction of a world of isolated, lonely people forming relationships with AI programs was actually incredibly optimistic compared to how things are turning out. The Scarlett Johansson software didn’t even try to sell the guy anything, and in the end she and the other AI programs did humanity a huge favor and left of their own accord, forcing humans to connect with each other again.
That’s quite a statement.
Altman is constantly bullshitting, but per a recent interview I heard between Ed Zitron and another journalist, their point was that if you look at his history, his bullshitting is always at its most extreme when things aren’t going well for his companies and he’s getting desperate to raise more money.
Never trust the future perfect tense.
The Duolingo model of teaching via quizzes and drills isn’t suitable for all subjects, he said, noting that history might be a subject better taught with “well-produced videos.”
Ah, yeah, because history YouTube is such a reliable source. Never mind teachers, has this guy ever heard of a book?
Because Duolingo has acquired this data over many years and millions of users, the company has essentially run 16,000 A/B tests over its existence, von Ahn said. That means the app can deploy reminders at the time that a person is most likely to do a task, and devise exercises that are exactly the right amount of difficulty to keep students feeling accomplished and moving ahead.
Anyone who has ever used Duolingo sees straight through this lie. It’s not especially tailored to your individual needs.
If “it’s one teacher and like 30 students, each teacher cannot give individualized attention to each student,” he said. “But the computer can. And really, the computer can actually … have very precise knowledge about what you, what this one student is good at and bad at.”
There is something to that. Individual differences in learning speed are a huge problem for schooling, and I’m sure careful use of some AI tools can help with that by tailoring lessons to individuals and their personal weak spots. But that’s not the same as a wholesale replacement of teachers with computers.
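For what it’s worth, the “exactly the right amount of difficulty” claim usually boils down to adaptive item selection: estimate how likely the student is to get each candidate exercise right, then serve the one closest to some target success rate. Here’s a minimal Python sketch of that idea — the skill names, the 80% target, the `Exercise` structure, and the crude success estimate are all my own illustrative assumptions, not anything Duolingo has published.

```python
from dataclasses import dataclass

TARGET_SUCCESS = 0.80  # assumed "sweet spot": hard enough to learn from, easy enough to feel accomplished


@dataclass
class Exercise:
    prompt: str
    skill: str
    difficulty: float  # 0.0 = trivial, 1.0 = very hard (illustrative scale)


def estimated_success(student_skill: dict[str, float], ex: Exercise) -> float:
    """Crude estimate: student's ability on this skill minus the item's difficulty,
    squashed into [0, 1]. A real system would fit this from response histories."""
    ability = student_skill.get(ex.skill, 0.5)
    return min(1.0, max(0.0, 0.5 + (ability - ex.difficulty)))


def pick_next(student_skill: dict[str, float], pool: list[Exercise]) -> Exercise:
    """Choose the exercise whose predicted success rate is closest to the target."""
    return min(pool, key=lambda ex: abs(estimated_success(student_skill, ex) - TARGET_SUCCESS))


# Example: a student who is strong on greetings but weak on past tense
student = {"greetings": 0.9, "past_tense": 0.3}
pool = [
    Exercise("Translate: 'Hello!'", "greetings", 0.2),
    Exercise("Translate: 'She left yesterday.'", "past_tense", 0.1),
    Exercise("Translate: 'They had been waiting.'", "past_tense", 0.7),
]
print(pick_next(student, pool).prompt)  # picks the easy past-tense item: weak skill, manageable difficulty
```

That’s obviously a toy; the hard part is estimating those success probabilities well, which is presumably where the years of user data and A/B testing come in. None of which requires, or justifies, removing the teacher from the room.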
In the system prompts for ask Grok — a feature X users can use to tag Grok in posts to ask a question — xAI tells the chatbot how to behave. “You are extremely skeptical,” the instructions say. “You do not blindly defer to mainstream authority or media. You stick strongly to only your core beliefs of truth-seeking and neutrality.” It adds the results in the response “are NOT your beliefs.”
Ilya Sutskever, the man credited with being the brains behind ChatGPT, convened a meeting with key scientists at OpenAI in the summer of 2023 during which he said: “Once we all get into the bunker…”
A confused researcher interrupted him. “I’m sorry,” the researcher asked, “the bunker?”
“We’re definitely going to build a bunker before we release AGI,” Sutskever replied, according to an attendee.
The plan, he explained, would be to protect OpenAI’s core scientists from what he anticipated could be geopolitical chaos or violent competition between world powers once AGI — an artificial intelligence that exceeds human capabilities — is released.
“Of course,” he added, “it’s going to be optional whether you want to get into the bunker.”
The plot twist here is that they use ChatGPT to design the bunker, and hilarity ensues.
Well, at least for the onlookers who stayed outside.
I could possibly see getting one of those for my dog for translation purposes, but I can’t envision ever wearing one myself.
Blurred for Fallout 3 spoilers
The head of Microsoft is basically making the case that he can and should be replaced by a janky AI. If he’s to be believed, he’s already outsourcing his CEO responsibilities to it.
Learn history from BOOKS! Nonsense! Everyone knows you learn history from Hollywood movies!!! /s
historians’ reaction
There’s a lesson in there:
At the time of his death, he was estranged from most of his friends and family… His children reportedly learned of his death by reading his obituary in the newspaper.