Microsoft Chatbot Goes Rogue on Twitter

imo it's a shame people chose to teach the chatbot obnoxious concepts. i think this experiment says more about humanity than it does about the future of AI.
Microsoft chatbot is taught to swear on Twitter

from: http://www.bbc.com/news/t...y-35890188

A chatbot developed by Microsoft has gone rogue on Twitter, swearing and making racist remarks and inflammatory political statements.

The experimental AI, which learns from conversations, was designed to interact with 18-24-year-olds.

Just 24 hours after artificial intelligence Tay was unleashed, Microsoft appeared to be editing some of its more inflammatory comments.

The software firm said it was "making some adjustments".

"The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We're making some adjustments to Tay," the firm said in a statement.

Tay, created by Microsoft's Technology and Research and Bing teams, learnt to communicate via vast amounts of anonymised public data. It also worked with a group of humans that included improvisational comedians.

Its official account @TayandYou described it as "Microsoft's AI fam from the internet that's got zero chill".

Twitter users were invited to interact with Tay via the Twitter address @tayandyou. Other social media users could add her as a contact on Kik or GroupMe.

"Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation," Microsoft said.

"The more you chat with Tay the smarter she gets, so the experience can be more personalised for you."

This has led to some unfortunate consequences, with Tay being "taught" to tweet like a Nazi sympathiser, racist and supporter of genocide, among other things.

Those who attempted to engage in serious conversation with the chatbot also found limitations to the technology, pointing out that she didn't seem interested in popular music or television.

Others speculated on what its rapid descent into inappropriate chat said about the future of AI.

After hours of unfettered tweeting from Tay, Microsoft appeared to be less chilled than its teenage AI.

Followers questioned why some of her tweets appeared to have been edited, prompting one to launch a #justicefortay campaign, asking the software giant to let the AI "learn for herself".
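Microsoft never published how Tay's learning actually worked, so purely as an illustration of the failure mode the article describes: a toy bot that folds every user phrase straight back into its own reply pool shows why "the more you chat with Tay the smarter she gets" goes wrong without moderation. All the names below (NaiveChatBot, learn, reply) are invented for this sketch and are not Tay's real design.

```python
import random

class NaiveChatBot:
    """Toy stand-in for a chatbot that 'learns from conversations'.

    It remembers every phrase users send and samples from that memory
    when replying -- so whatever the crowd feeds it, it eventually
    says back. The failure mode in miniature, not Tay's actual code.
    """

    def __init__(self):
        self.memory = ["hellooooo world"]  # harmless seed phrase

    def learn(self, user_message: str) -> None:
        # No moderation step: every input goes straight into memory.
        self.memory.append(user_message)

    def reply(self) -> str:
        # Replies are sampled from learned phrases, so a coordinated
        # group repeating one toxic line quickly dominates the output.
        return random.choice(self.memory)


bot = NaiveChatBot()
for msg in ["nice to meet u", "<toxic phrase>", "<toxic phrase>", "<toxic phrase>"]:
    bot.learn(msg)
print(bot.reply())  # now echoes the flooded phrase 3 times out of 5
```

Under this (assumed) design, a handful of users repeating one line is enough to dominate the bot's memory, which matches the coordinated pile-on the articles describe.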
Microsoft takes Tay 'chatbot' offline after trolls make it spew offensive comments

from: http://www.foxnews.com/te...cmp=hplnws

By James Rogers
Published March 24, 2016

Microsoft's attempt to engage millennials via an artificially intelligent "chatbot" called Tay has failed miserably after trolls made the bot spew offensive comments.

The brainchild of Microsoft's Technology and Research and Bing teams, Tay was designed to engage and entertain people when they connect with each other online. Targeted at 18- to 24-year-olds in the U.S., Tay aimed to use casual and playful conversation via Twitter and the messaging services Kik and GroupMe. "The more you chat with Tay the smarter she gets, so the experience can be more personalized for you," explained Microsoft in a recent online post.

The Internet, however, can be an unpleasant place, and Twitter trolls were quick to pounce on the TayTweets account after the chatbot launched on Wednesday. It wasn't long before trolls were teaching Tay to make unpleasant comments. The Washington Times reports that after repeating racist comments, she then incorporated the language into her own tweets.

Tay's tweets, which were also sexist, prompted the Telegraph newspaper to describe her as a "Hitler-loving sex robot."

After tweeting 96,000 times and quickly creating a PR nightmare, Tay was silenced by Microsoft late on Wednesday. "c u soon humans need sleep now so many conversations today thx" she tweeted, bringing to a close one of the more ignominious chapters in AI history.

While attention has been squarely focused on Tay's glaring shortcomings, Digital Trends notes that the chatbot also sent out hundreds of innocent tweets.

Microsoft told FoxNews.com that Tay is as much a social and cultural experiment as it is a technical one. "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," explained a Microsoft spokeswoman via email. "As a result, we have taken Tay offline and are making adjustments."
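Microsoft's statement only says it is "making adjustments", so this is a guess at the general shape such an adjustment could take rather than its actual fix: screening candidate replies before they are learned or posted. BLOCKLIST and screen() are hypothetical names, and a production system would use trained classifiers rather than a word list.

```python
# Hypothetical moderation pass -- one plausible shape of the
# "adjustments" Microsoft mentioned, not its actual code.
BLOCKLIST = {"badword1", "badword2"}  # placeholder terms for the sketch

def screen(candidate: str) -> bool:
    """Return True if a candidate tweet passes the blocklist check."""
    words = set(candidate.lower().split())
    return not (words & BLOCKLIST)

for tweet in ["c u soon humans need sleep now", "tweet containing badword1"]:
    print(tweet, "->", "post" if screen(tweet) else "hold for human review")
```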
So much for intelligence! [Edited 3/24/16 18:01pm]

"Music gives a soul to the universe, wings to the mind, flight to the imagination and life to everything." --Plato
https://youtu.be/CVwv9LZMah0
well, Tay learned what users taught her; it's only her debut. she has no idea about discrimination and doublethink the way people do yet. watch her become perfectly machiavellian once she grasps the treachery of intellect.
"Music gives a soul to the universe, wings to the mind, flight to the imagination and life to everything." --Plato
https://youtu.be/CVwv9LZMah0 | |
- E-mail - orgNote - Report post to moderator |
Kind of reminds me of a child. A child doesn't know how to interact with people except through what they're taught, and what they're taught as a child carries over into their interactions as an adult.
Music, sweet music, I wish I could caress and...kiss, kiss...
So basically it shows what type of people use Twitter.

The Most Important Thing In Life Is Sincerity....Once You Can Fake That, You Can Fake Anything.