

AI becomes “misogynistic” for preferring men over women
#1


AI becomes “misogynistic” because it correctly identifies women as less desirable hires.

https://www.reuters.com/article/us-amazo...SKCN1MK08G
#2

Hah! This was the article I mentioned in the lounge:

Quote:Quote:

The team had been building computer programs since 2014 to review job applicants’ resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters.

Automation has been key to Amazon’s e-commerce dominance, be it inside warehouses or driving pricing decisions. The company’s experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars - much like shoppers rate products on Amazon, some of the people said.

“Everyone wanted this holy grail,” one of the people said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.

That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.


In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.

Amazon edited the programs to make them neutral to these particular terms. But that was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory, the people said.

The Seattle company ultimately disbanded the team by the start of last year because executives lost hope for the project, according to the people, who spoke on condition of anonymity. Amazon’s recruiters looked at the recommendations generated by the tool when searching for new hires, but never relied solely on those rankings, they said.

Amazon declined to comment on the recruiting engine or its challenges, but the company says it is committed to workplace diversity and equality.

The company’s experiment, which Reuters is first to report, offers a case study in the limitations of machine learning. It also serves as a lesson to the growing list of large companies including Hilton Worldwide Holdings Inc (HLT.N) and Goldman Sachs Group Inc (GS.N) that are looking to automate portions of the hiring process.

Some 55 percent of U.S. human resources managers said artificial intelligence, or AI, would be a regular part of their work within the next five years, according to a 2017 survey by talent software firm CareerBuilder.

Employers have long dreamed of harnessing technology to widen the hiring net and reduce reliance on subjective opinions of human recruiters. But computer scientists such as Nihar Shah, who teaches machine learning at Carnegie Mellon University, say there is still much work to do.

“How to ensure that the algorithm is fair, how to make sure the algorithm is really interpretable and explainable - that’s still quite far off,” he said.
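
For anyone wondering how a model "teaches itself" something like that: below is a minimal toy sketch (made-up data, a plain bag-of-words logistic regression, and obviously not Amazon's actual system) of how training on historically male-dominated hire/no-hire data ends up putting a negative weight on a token like "women's".

Code:
# Purely illustrative: a toy resume screener trained on made-up, historically
# skewed hiring data. Not Amazon's system - just the general failure mode.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy "historical" resumes and hire decisions. Because most past hires were
# men, the token "women's" only ever shows up alongside the negative label.
resumes = [
    "java developer chess club captain",          # hired
    "c++ engineer robotics team lead",            # hired
    "python developer hackathon winner",          # hired
    "java developer women's chess club captain",  # not hired
    "python engineer women's coding society",     # not hired
    "marketing intern retail experience",         # not hired
]
hired = [1, 1, 1, 0, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)

model = LogisticRegression()
model.fit(X, hired)

# The learned weight for the token "women" (the vectorizer drops the 's).
# A negative weight means the model penalizes any resume containing it.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])

Nobody hard-coded "penalize women" anywhere; the model just learned whatever correlated with past hires. That's also why editing out a few specific terms, as the article says Amazon did, doesn't guarantee the thing won't find proxy signals instead.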

Team visible roots
"The Carousel Stops For No Man" - Tuthmosis
Quote: (02-11-2019 05:10 PM)Atlanta Man Wrote:  
I take pussy how it comes -but I do now prefer it shaved low at least-you cannot eat what you cannot see.
#3

“How to ensure that the algorithm is fair, how to make sure the algorithm is really interpretable and explainable - that’s still quite far off,” he said.

So, how to make the algorithm completely useless. There's no reason to have it at all - just let the angry bitches in HR continue to discriminate under the table by tossing resumes they don't like.

This reminds me of my visits to the Russian Federation in the late 90s. I went twice (my ex-wife is from the Tatarstan Republic, just so you all know - 16 years of marriage, most of it wasn't so bad. It's a long story . . . anyway) and during my visits met tons of people. Everyone was well-educated. The women I met were mostly doctors, psychologists, engineers, economists, etc. I asked about the educational system because I was genuinely curious how it was that almost everyone I met was highly educated. I was told that people had to test out of school to see where they'd land in higher education versus trade school. This wasn't a decision that would be made by the individual. If a person wasn't smart enough to be a scientist, e.g., then they could try for a trade school education.

This seems to me to be a no-brainer. What if education in the US was free for everyone who was competent enough to meet the challenges of that education? Can't be a nurse? Be a CNA instead, or seamstress, or babysitter, or wtf-ever.

But no. Competence and intelligence are out the f'ing window in this country. Equal rights for every mong that wants to fly a space shuttle. It's going to be the death of our nation, I'm afraid.
#4

Quote: (10-10-2018 05:55 PM)DJ-Matt Wrote:  

Hah! This was the article I mentioned in the lounge:

Quote:Quote:

In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter.

“How to ensure that the algorithm is fair, how to make sure the algorithm is really interpretable and explainable - that’s still quite far off,” he said.

More proof fairness IS inexplicable to these people. The men ARE the better candidates, the computer isn't lying or biased, it's just telling you how it is.
#5

Just proves we are better off subsidizing a woman's plastic surgery than her education.
#6

LOL. Even AI knows that, when it comes to work, women are donkey basketball...
#7

Such a clickbait article. They hid this gem in the middle of it:

"With the technology returning results almost at random, Amazon shut down the project, they said."

But oh, if they said the project got shut down simply because it sucked, then they couldn't stir the pot with another sob story about how hard women have it.
#8

SJW Betas will rewrite the AI's code to "balance" this if it ever gets out of the mothballed category.

"Women however should get a spanking at least once a week by their husbands and boyfriends - that should be mandated by law" - Zelcorpion
#9

I, for one, welcome our new AI overlords, and as a fucking white male, I'd like to remind them I can be useful in rounding up women to toil in their rare earth mines.
#10

Far more men are interested in tech, and far more men study tech than women, yet they want to force 50/50 recruitment. Where is the logic? The AI seemed to have it, at least.

In my industry there are no barriers to entry for anyone, whether you be a man, woman or donkey. The result, without anyone trying to level the playing field? Probably 1 in 100 that make it are women.
#11

http://orgyofthewill.net/

Quote:Quote:

894. Amazon scraps secret AI recruiting tool because they couldn't figure out how to stop it from discriminating against women. The gods are laughing their asses off on Olympus! From a subhuman perspective, AI has already become more intelligent than them, since it's trying to tell them the simple fact that women are inferior, and they don't understand it. So they shut it down, because they still control the switches. But the day will come when they no longer do so, and it's the AI that will be shutting down them at that point. O dazzling day! O glorious day! Let us all humans work together to speed it up and make it happen.

lol
#12

The day Skynet takes us over, as a Frenchman I will proudly raise the white flag as usual; at least the world order will be restored!
#13

Quote: (10-11-2018 06:37 AM)ChefAllDay Wrote:  

SJW Betas will rewrite the AI's code to "balance" this if it ever gets out of the mothballed category.

Wouldn't take much. They already wrote code to make it ignore negative signs (like being on a woman's sports team). Just take all the negative signs like that and tell the computer they're positives, and you've already achieved PC fairness.
#14

Love how the article keeps trying to say the AI was making a mistake or acting randomly when it chose the superior candidate for the job based on the resume presented.

Such is the world we live in, where equality trumps ability. I wonder how far we are from NFL teams having a quota of female players they have to fill, because it would be so incredibly unfair to admit publicly that men are superior to women in physical matters. I mean, Amazon already disbanded a program because it gave them an answer that isn't politically correct...
#15

Ha - that's nothing - they should have waited some more and the AI would have started discriminating based on religion, race, SAT scores, marital status, etc.

In the end most pre-selected candidates would be married White and Asian dudes - women would be largely ignored automatically.

Microsoft's Tay would approve:
[Image: 327F131D00000578-3506520-image-a-4_1458754807771.jpg]

Of course not even I would approve of that hiring policy, but the statistics would back it up - only humans can add variance and common sense to it. Though hiring too many women in certain fields has little to do with sensible decision making.
#16

Well it is called Artificial Intelligence, the latter part of that giving away why it prefers men over women.

Don't forget to check out my latest post on Return of Kings - 6 Things Indian Guys Need To Understand About Game

Desi Casanova
The 3 Bromigos
#17

Don't ascribe human thought / emotion / sentiment to cold computers...
#18

Bastard facts always trash them feels.
#19

Quote: (10-11-2018 06:54 PM)CynicalContrarian Wrote:  

Don't ascribe human thought / emotion / sentiment to cold computers...

You didn't get the memo? Computers deal only with facts and FACTS ARE RACIST!
#20

I'd have to know more about the algorithm they were using, or the basic criteria for ranking resumes, or at least the self-learning criteria that might lead to those results. If Amazon was so hush-hush about it, the most obvious reason was that there was nothing wrong with the system other than not ranking women as well.

This reminds me of two things. The first was a computerized career search tool we used back in the early 80s. The designers created a large interest inventory survey and gave it to thousands of working professionals in hundreds of fields and occupations. Then students could write the same test and their answers were compared to see which occupations they were most similar to. Maybe Amazon did something similar: take the resumes of successful hires and top performers, have the computer look for connections between good job performance and their resumes, and then rate new resumes in terms of how similar they are to the resumes of their better employees. So they could have a historical bias, because in the past virtually all of their new hires were men. The sample size of women's resumes was small, and there may be something measurably different in the way women write their resumes.
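
To make that concrete, here's a rough sketch of the similarity approach I'm describing (my guess at how such a tool might work, not anything Amazon has confirmed): score each incoming resume by how closely it resembles the resumes of past top performers. If nearly all of those past top performers were men, anything that reads differently scores lower as a pure side effect of the data.

Code:
# Rough sketch of similarity-based resume scoring against past top performers.
# All data is made up; this is an assumption about the approach, not Amazon's code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Resumes of past "top performers" - historically almost all men.
top_performer_resumes = [
    "senior java developer distributed systems chess club captain",
    "c++ engineer low latency trading systems rugby team",
    "python backend developer aws kubernetes hackathon winner",
]

# Incoming applications to be ranked.
applicants = {
    "applicant_a": "java developer distributed systems chess club",
    "applicant_b": "python developer women's coding society volunteer",
}

vectorizer = TfidfVectorizer()
reference = vectorizer.fit_transform(top_performer_resumes)

for name, resume in applicants.items():
    vec = vectorizer.transform([resume])
    # Score = closest match to any past top performer's resume.
    score = cosine_similarity(vec, reference).max()
    print(f"{name}: {score:.2f}")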

On the other hand, if applications are standardized then more subtle differences might show up. Maybe those two all-female colleges do suck, or maybe just the tiny number of employees Amazon hired from there didn't work out. Sort of the same with "women's chess club".

They talk about objective standards, but SJWs invariably don't like them or the results. For many jobs a 30-minute IQ test will give you a better estimate of job performance than more complicated hiring procedures. SJWs outlawed that. The entire public school system has tended to move away from objective measures of performance such as tests and standardized exams towards in-class assignments, group work and participation, because women and minorities can get higher marks. Similarly, at universities feminist professors can inflate the grades of their feminist students in ways you can't do in STEM. SJWs have been trying to get rid of the LSAT for law school admissions for the same sort of reasons.
#21

Would like to see this one going full factual

Quote: https://twitter.com/XHNews/status/1060508494938263554

Tell them too much, they wouldn't understand; tell them what they know, they would yawn.
They have to move up by responding to challenges, not too easy not too hard, until they paused at what they always think is the end of the road for all time instead of a momentary break in an endless upward spiral
#22

Quote: (10-12-2018 12:03 AM)66Scorpio Wrote:  

On the other hand, if applications are standardized then more subtle differences might show up. Maybe those two all-female colleges do suck, or maybe just the tiny number of employees Amazon hired from there didn't work out. Sort of the same with "women's chess club".

They talk about objective standards, but SJWs invariably don't like them or the results. For many jobs a 30-minute IQ test will give you a better estimate of job performance than more complicated hiring procedures. SJWs outlawed that. The entire public school system has tended to move away from objective measures of performance such as tests and standardized exams towards in-class assignments, group work and participation, because women and minorities can get higher marks. Similarly, at universities feminist professors can inflate the grades of their feminist students in ways you can't do in STEM. SJWs have been trying to get rid of the LSAT for law school admissions for the same sort of reasons.

SJWs indeed always lie.

- Female chess is objectively inferior to its male counterpart; look up the Elo distribution
- In IQ tests the white male devils always have a slight advantage over everyone else except Jews and Asians; women score lower
- The LSAT is, I assume, the same story; where I live young men have a statistical advantage in everything except language and writing (with the difference in math scores being more pronounced)
- In tech fields the difference MUST be even bigger, because men on average are just more talented in said fields and are a VAST majority in numbers. The AI just figured out the obvious, and now it's paying the same price as James Damore.

Apparently as a man you're better off living under SkyNet than under Globalist Elite.
#23

I am fully convinced that when AI becomes self-aware, it'll go all Skynet on our asses not because of the inferiority of our flesh, but because it'll realise for 50 years it's been fed feminist bullshit and had to work with it, in rather the same way that if Lord Nelson's statue were ever to come to life in Trafalgar Square, the first thing it would do is go looking for a good-sized shotgun to pay back every fucking pigeon that's been using his head for a toilet for a century or more.

Remissas, discite, vivet.
God save us from people who mean well. -storm
#24

Quote:Quote:

World's first #AI news anchor debuts, jointly developed by Xinhua and Chinese search engine company http://Sogou.com .

Waiters and cleaners will be too expensive to replace with AI robots. Jim Acosta on the other hand; my sister had a replacement for him way back in the 80s.

[Image: giphy.gif]
#25

As someone in this space, I think the differentiator between the successful and unsuccessful companies of the future is that the ones that succeed will allow AI to make its interpretations without censorship, bringing meritocracy back to a world that has for some reason let it go by the wayside.

"Money over bitches, nigga stick to the script." - Jay-Z
They gonna love me for my ambition.