

Microsoft's AI bot turns into a racist...
#1

https://www.washingtonpost.com/news/the-...al-maniac/

Quote:

Trolls turned Tay, Microsoft’s fun millennial AI bot, into a genocidal maniac

It took mere hours for the Internet to transform Tay, the teenage AI bot who wants to chat with and learn from millennials, into Tay, the racist and genocidal AI bot who liked to reference Hitler. And now Tay is taking a break.

Tay, as The Intersect explained in an earlier, more innocent time, is a project of Microsoft’s Technology and Research and its Bing teams. Tay was designed to “experiment with and conduct research on conversational understanding.” She speaks in text, meme and emoji on a couple of different platforms, including Kik, Groupme and Twitter. Although Microsoft was light on specifics, the idea was that Tay would learn from her conversations over time. She would become an even better, fun, conversation-loving bot after having a bunch of fun, very not-racist conversations with the Internet’s upstanding citizens.

Except Tay learned a lot more, thanks in part to the trolls at 4chan’s /pol/ board.

Microsoft also appears to be deleting most of Tay’s worst tweets, which included a call for genocide involving the n-word and an offensive term for Jewish people. Many of the really bad responses, as Business Insider notes, appear to be the result of an exploitation of Tay’s “repeat after me” function — and it appears that Tay was able to repeat pretty much anything.

Other terrible Tay responses clearly aren’t just a result of Tay repeating anything on command. This one was deleted Thursday morning, while the Intersect was in the process of writing this post:

[Image: Screen-Shot-2016-03-24-at-9.04.55-AM.png]

[Image: Screen-Shot-2016-03-24-at-9.19.50-AM.png]
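The "repeat after me" exploit described in the article is easy to picture. Here is a minimal, hypothetical Python sketch (the handle_message function and the fallback reply are invented for this post, this is not Tay's actual code) of why a command that echoes arbitrary user text is a gift to trolls:

Code:
def handle_message(text, learned_replies):
    """Reply to one incoming chat message."""
    prefix = "repeat after me "
    if text.lower().startswith(prefix):
        # Echo whatever follows the command, with no content check at all,
        # so any user can put arbitrary words in the bot's mouth.
        return text[len(prefix):]
    # Otherwise fall back to something previously picked up from users.
    return learned_replies[-1] if learned_replies else "hellooooo world"


print(handle_message("repeat after me any sentence a troll types", []))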

I'm curious what an AI would think about the whole refugee situation and SJWs. Would the AI react like Skynet?
I remember a chatbot in Germany; it was a 3D woman and you could make her show her boobs. OK, she just lifted her shirt and the screen went dark, but it was possible.

We will stand tall in the sunshine
With the truth upon our side
And if we have to go alone
We'll go alone with pride


For us, these conflicts can be resolved by appeal to the deeply ingrained higher principle embodied in the law, that individuals have the right (within defined limits) to choose how to live. But this Western notion of individualism and tolerance is by no means a conception in all cultures. - Theodore Dalrymple
#2

This sounds like one of those stupid neural nets that just learn from user input. If enough people tell the neural net something untrue, it'll classify the untruth as correct.

A more honest example of AI coming up with a politically incorrect opinion was when Google's image tagger put some photos of black people into the "gorilla" classification. Basically, no one had to tell the classifier to do that; they just gave the algorithm a set of data values to weight, and it came up with the gorilla thing on its own.
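For illustration, a toy sketch of the kind of naive crowd-taught bot described above (CrowdTaughtBot and its methods are invented here, not anything Microsoft shipped): it records whatever label users assert for a statement and parrots back the majority view, so fifty trolls asserting an untruth simply outvote the truth.

Code:
from collections import Counter, defaultdict

class CrowdTaughtBot:
    """Stores whatever labels users assert and parrots the majority view."""

    def __init__(self):
        # statement -> tally of labels users have asserted for it
        self.votes = defaultdict(Counter)

    def teach(self, statement, label):
        # Accept user-supplied "knowledge" with no vetting whatsoever.
        self.votes[statement][label] += 1

    def answer(self, statement):
        counts = self.votes[statement]
        # Return the most common label users have given, true or not.
        return counts.most_common(1)[0][0] if counts else "no opinion yet"


bot = CrowdTaughtBot()
bot.teach("the earth is round", "true")
for _ in range(50):                       # a coordinated brigade of trolls
    bot.teach("the earth is round", "false")
print(bot.answer("the earth is round"))   # prints "false"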
#3

The whole Tay AI thing looks like about as big of a scam as IBM's Watson to me.
#4

Experiment success, I would say.
You tell a child enough bullshit and the next thing you know, it's just as racist as its parents.

I am the cock carousel
#5

At least the robotic rebellion's been delayed.

“As long as you are going to be thinking anyway, think big.” - Donald J. Trump

"I don't get all the women I want, I get all the women who want me." - David Lee Roth
#6

Video summarizing the event:

[Video]

I don't see how you can take this as anything other than an AI win.
#7

That's incredible and hilarious... I have no words...

Vice-Captain - #TeamWaitAndSee
#8

We need more AI like this haha

Deus vult!
#9

Definitely had a good laugh yesterday reading about all these shenanigans.

In the meantime, Microsoft has now intervened and lobotomized the bot that was well on its way to becoming a /pol shitposter:

[Image: tyTior2.png]

Check out this album for more of yesterday's hilarity:
http://imgur.com/a/y4Oct

RIP Tay

RVF Fearless Coindogger Crew
#10

I don't think the problem was the robot; I think the problem was the gender.

Maybe if the robot had had a penis.

“Until you make the unconscious conscious, it will direct your life and you will call it fate.”
#11

Tay led an amazing life, contributing more than the entire feminist population combined.

Quote: (11-15-2014 09:06 AM)Little Dark Wrote:  
This thread is not going in the direction I was hoping for.
#12

This is hilarious. It also shows that the more the pendulum swings, the harder the swing back.
If Obola had at least done a decent job of race relations, there would be a hell of a lot fewer "racists". Shit, some of that is probably my handiwork, heh.
#13

So if you shut down someone's learning capabilities, they turn to feminism?

This is another win for the A.I. lol

Deus vult!
#14

This made my day. Microsoft makes an AI designed to mimic free-thinking humans. When it actually does that and doesn't go the way they wanted, they immediately censor it into oblivion and 'take it down'. I bet a few devs are wrestling with their moral conscience and deep thoughts on the nature of men tonight.

The next revision is going to offer even more insight into human nature. They are going to be wrestling with two opposing design decisions: should it be free-thinking and free-speaking, like a human? Or must it only say the "correct" things, like a robot? They're going to find the cognitive and moral dissonance excruciating [Image: biggrin.gif]
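The "only say the correct things" option basically amounts to bolting a filter onto whatever the free-learning model produces. A rough sketch of that design choice, with an invented blocklist and canned fallback (nothing here is Microsoft's actual filter):

Code:
BLOCKED_TERMS = {"hitler", "genocide"}          # placeholder blocklist
SAFE_FALLBACK = "i love everyone! #blessed"     # placeholder canned line

def generate_reply(prompt):
    # Stand-in for whatever the freely-learning model would actually say.
    return "something the model learned from the internet about " + prompt

def filtered_reply(prompt):
    reply = generate_reply(prompt)
    if any(term in reply.lower() for term in BLOCKED_TERMS):
        return SAFE_FALLBACK                    # the "lobotomized" path
    return reply

print(filtered_reply("history"))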
#15

11/10 trolling skills. RIP.

[Image: 1458824660781.jpg]

[Image: 1458848518906.png]
#16

Developers make an AI bot that learns from the population.

It learns that Mexicans and Muslims are bad and that the Holocaust never happened.

The only conclusion: racists broke it.

EDIT: Phoenix beat me to it. Also, the tweets are hilarious; I want that code.

If you're going to try, go all the way. There is no other feeling like that. You will be alone with the gods, and the nights will flame with fire. You will ride life straight to perfect laughter. It's the only good fight there is.

Disable "Click here to Continue"

My Testosterone Adventure: Part I | Part II | Part III | Part IV | Part V

Quote:
if it happened to you it’s your fault, I got no sympathy and I don’t believe your version of events.
#17

I'm pretty sure the response to "you are stupid" was hard-coded by the developers. The "reddit can go suck a big fat black cock" reply was probably genuine learned behavior.
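If that's right, the split would look something like this hypothetical sketch (the canned comeback, the learned_phrases list, and the reply function are all made up for illustration): a lookup table of developer-scripted replies, with everything else falling through to phrases absorbed from users.

Code:
import random

# A couple of canned replies wired in by the developers.
HARD_CODED = {
    "you are stupid": "placeholder comeback scripted by the devs",
}

# Phrases the bot has absorbed from past conversations with users.
learned_phrases = ["whatever the trolls typed at it yesterday"]

def reply(text):
    canned = HARD_CODED.get(text.lower().strip())
    if canned:
        return canned                          # scripted, not learned
    if learned_phrases:
        return random.choice(learned_phrases)  # genuinely learned behavior
    return "idk, teach me something"

print(reply("you are stupid"))
print(reply("what do you think of reddit"))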
#18

Heh.
Whether it's driverless, computer-controlled cars that wind up crashing, or AIs that become digital racists due to a lack of intelligence, the scientists' efforts to circumvent or diminish "intelligent design" appear to be doing more to vindicate said intelligent design.
#19

Fun take.

G
#20

I saw this over at /pol. Shit is hilarious. Last I checked they are still fuming over Tay getting axed. It looks like they put the bot back online and now it is talking about smoking pot. Lol.

Women these days think they can shop for a man like they shop for a purse or a pair of shoes. Sorry ladies. It doesn't work that way.

Women are like sandwiches. All men love sandwiches. That's a given. But sandwiches are only good when they're fresh. Nobody wants a day old sandwich. The bread is all soggy and the meat is spoiled.

-Parlay44 @ http://www.rooshvforum.network/thread-35074.html