independent and unofficial
Prince fan community
Forums > General Discussion > Microsoft Chatbot Goes Rogue on Twitter
Thread started 03/24/16 10:14pm

XxAxX


Microsoft Chatbot Goes Rogue on Twitter

imo it's a shame people chose to teach the chatbot obnoxious concepts. i think this experiment says more about humanity than it does about the future of AI.

Microsoft chatbot is taught to swear on Twitter

from: http://www.bbc.com/news/t...y-35890188

A chatbot developed by Microsoft has gone rogue on Twitter, swearing and making racist remarks and inflammatory political statements.

The experimental AI, which learns from conversations, was designed to interact with 18-24-year-olds.

Just 24 hours after artificial intelligence Tay was unleashed, Microsoft appeared to be editing some of its more inflammatory comments.

The software firm said it was "making some adjustments".

"The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We're making some adjustments to Tay," the firm said in a statement.

Some of Tay's tweets seemed somewhat inflammatory.

Tay, created by Microsoft's Technology and Research and Bing teams, learnt to communicate via vast amounts of anonymised public data. It also worked with a group of humans that included improvisational comedians.

Its official account @TayandYou described it as "Microsoft's AI fam from the internet that's got zero chill".

Twitter users were invited to interact with Tay via the Twitter address @tayandyou. Other social media users could add her as a contact on Kik or GroupMe.

"Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation," Microsoft said.

"The more you chat with Tay the smarter she gets, so the experience can be more personalised for you."

This has led to some unfortunate consequences, with Tay being "taught" to tweet like a Nazi sympathiser, racist and supporter of genocide, among other things.

Those who attempted to engage in serious conversation with the chatbot also found limitations to the technology, pointing out that she didn't seem interested in popular music or television.

Others speculated on what its rapid descent into inappropriate chat said for the future of AI.

After hours of unfettered tweeting from Tay, Microsoft appeared to be less chilled than its teenage AI.

Followers questioned why some of her tweets appeared to have been edited, prompting one to launch a #justicefortay campaign, asking the software giant to let the AI "learn for herself".

Reply #1 posted 03/24/16 10:24pm

XxAxX


Microsoft takes Tay 'chatbot' offline after trolls make it spew offensive comments

from: http://www.foxnews.com/te...cmp=hplnws

By James Rogers

Published March 24, 2016

Microsoft’s attempt to engage millennials via an artificially intelligent “chatbot” called Tay has failed miserably after trolls made the bot spew offensive comments.

The brainchild of Microsoft's Technology and Research and Bing teams, Tay was designed to engage and entertain people when they connect with each other online. Targeted at 18- to 24-year-olds in the U.S., Tay aimed to use casual and playful conversation via Twitter and the messaging services Kik and GroupMe. “The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,” explained Microsoft in a recent online post.

The Internet, however, can be an unpleasant place and Twitter trolls were quick to pounce on the TayTweets account after the chatbot launched on Wednesday. It wasn’t long before trolls were teaching Tay to make unpleasant comments. The Washington Times reports that after repeating racist comments, she then incorporated the language into her own tweets.

Tay’s tweets, which were also sexist, prompted the Telegraph newspaper to describe her as a “Hitler-loving sex robot.”

After tweeting 96,000 times and quickly creating a PR nightmare, Tay was silenced by Microsoft late on Wednesday. “c u soon humans need sleep now so many conversations today thx” she tweeted, bringing to a close one of the more ignominious chapters in AI history.

While attention has been squarely focused on Tay’s glaring shortcomings, Digital Trends notes that the chatbot also sent out hundreds of innocent tweets.

Microsoft told FoxNews.com that Tay is as much a social and cultural experiment as it is a technical one.

“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” explained a Microsoft spokeswoman, via email. “As a result, we have taken Tay offline and are making adjustments.”

Reply #2 posted 03/25/16 12:54am

purplethunder3121

So much for intelligence! lol

[Edited 3/24/16 18:01pm]

"Music gives a soul to the universe, wings to the mind, flight to the imagination and life to everything." --Plato

https://youtu.be/CVwv9LZMah0
Reply #3 posted 03/25/16 1:04am

XxAxX


purplethunder3121 said:

So much for intelligence! lol

[Edited 3/24/16 18:01pm]

well, Tay learned what users taught her :confused: it's only her debut. she has no idea about discrimination and doublethink like people do yet. watch her become perfectly machiavellian once she grasps the treachery of intellect :eek:

Reply #4 posted 03/25/16 1:05am

purplethunder3121

"Music gives a soul to the universe, wings to the mind, flight to the imagination and life to everything." --Plato

https://youtu.be/CVwv9LZMah0
Reply #5 posted 03/25/16 1:08am

prittypriss

XxAxX said:

purplethunder3121 said:

So much for intelligence! lol

[Edited 3/24/16 18:01pm]

well, Tay learned what users taught her :confused: it's only her debut. she has no idea about discrimination and doublethink like people do yet. watch her become perfectly machiavellian once she grasps the treachery of intellect :eek:


Kind of reminds me of a child. The child doesn't know how to interact with people, except for what they are taught, and what they are taught as a child carries over into their interactions as an adult.

Reply #6 posted 03/25/16 7:16am

nd33

prittypriss said:

XxAxX said:

well, Tay learned what users taught her :confused: it's only her debut. she has no idea about discrimination and doublethink like people do yet. watch her become perfectly machiavellian once she grasps the treachery of intellect :eek:


Kind of reminds me of a child. The child doesn't know how to interact with people, except for what they are taught, and what they are taught as a child carries over into their interactions as an adult.



GOOD CALL.
This is exactly why we have bigoted adults. We all learn from our surroundings as children and adolescents. Preach love, peace and acceptance from one end (government, world leaders, celebrities) to the other (parents, friends, family) and great, well-rounded individuals will be raised.

Music, sweet music, I wish I could caress and...kiss, kiss...
Reply #7 posted 03/25/16 4:03pm

lazycrockett


So basically it shows what type of people use twitter.

The Most Important Thing In Life Is Sincerity....Once You Can Fake That, You Can Fake Anything.