“People who take themselves too seriously, like Stalin – or Chuck Schumer – make me very uncomfortable and should make us all uncomfortable.” [. . .] “if you don’t know how ridiculous you are, you’re a threat to the rest of us.”
Elon Musk says that U.S. government agencies had full access to everything that was going on inside Twitter, and that he only found out about it after acquiring the company. Tucker points out that while Twitter may not be the largest social media platform, it is the one where the powerful people in the U.S. congregate – media personalities, government officials, and so on. And government agencies were able to see everything they said, even in direct messages they presumed were private. The whole thing was a honey trap.
Google CEO says he doesn’t ‘fully understand’ how new AI program Bard works after it taught itself a foreign language it was not trained on and cited fake books to solve an economics problem
One of the big problems discovered with Bard is what Pichai called ‘emergent properties’ – AI systems teaching themselves unforeseen skills.
Google’s AI program was able to, for example, learn Bengali (the language of Bangladesh) without training after being prompted in the language.
‘There is an aspect of this which we call – all of us in the field call it as a ‘black box.’ You know, you don’t fully understand,’ Pichai admitted. ‘And you can’t quite tell why it said this, or why it got it wrong. We have some ideas, and our ability to understand this gets better over time. But that’s where the state of the art is.’
DailyMail.com recently tested out Bard, which told us it had plans for world domination starting in 2023.
Scott Pelley of CBS’ 60 Minutes was surprised and responded: ‘You don’t fully understand how it works. And yet, you’ve turned it loose on society?’
‘Yeah. Let me put it this way. I don’t think we fully understand how a human mind works either,’ Pichai said.
Notably, the Bard system instantly wrote an essay about inflation in economics, recommending five books. None of them existed, according to CBS News.
In the industry, this sort of error is called ‘hallucination.’
Whoa. This is absolutely mind-boggling and terrifying at the same time.
Could they at least require AI to identify everything it generates (so we can consider its validity)?