
Panel Examines the Humans Behind Machine Bias

May 29, 2018


While machine learning technology has the potential to remove human bias from decision-making, it is becoming increasingly clear that automated segmentation algorithms can also exacerbate bias, especially in areas such as hiring, lending, and credit services.

At the recent conference The Future of Work: Capital Markets, Digital Assets, and the Disruption of Labor, held in New York, the topic of bias was a common thread throughout the day. (See Where Humans Meet Machines: Intuition, Expertise and Learning and Is Technology Outpacing Organizations? for more coverage.)

During a panel on The Biases of Humans and Machines, moderator Renee Richardson Gosline (pictured above), Senior Lecturer and Research Scientist at MIT Sloan, led a frank discussion of the ubiquity of algorithms and how they can go awry. Panelists tackled the thorny issues of individual and societal responsibility for addressing the risks of algorithmic bias, and how to raise awareness of these new threats.

Cathy O’Neil, a data scientist and author, noted, “About five years ago I started to realize that every industry was using these formulas to determine who are the winners, who are the losers — and those labels were staying with us for life, but we didn’t even know it.” It was everywhere, she said, yet “there’s no appeal system if it is incorrect. I began to question the trustworthiness of these algorithms. And that’s one of the reasons I wrote the book, Weapons of Math Destruction.”

Gosline likened this type of labeling to a form of “branding,” where people have no recourse to say, “Hey, hang on. This doesn’t really represent me.”

Stephanie Lampkin, Blendoor Founder and CEO, said her company is “working to mitigate unconscious bias in hiring” because it remains “one of the clearest cases where our unconscious biases and our idea of who can be successful and why, comes to light.” Blendoor tracks how far different demographics make it into a search, and it also publishes a corporate equality, diversity, and inclusion index every year “measuring different ways in which you could be representing inclusion and equity. We hope that accountability will drive better behavior before lawsuits do.”


Continue reading the full blog on Medium.