Battling algorithmic bias at TC Sessions: Justice

At TC Sessions: Justice on March 3, we’re going to dive head-first into data discrimination, algorithmic bias and how to ensure a more just future, as technology companies rely more on automated processes to make decisions.

Algorithms are sets of rules that computers follow in order to solve problems and make decisions about a particular course of action. But there is an inherent problem with algorithms that begins at the most basic level and persists throughout their adoption: human bias that is baked into these machine-based decision-makers.

Algorithms driven by bad data are what lead to biased arrests and imprisonment of Black people. They’re also the same kind of algorithms that led Google to label photos of Black people as gorillas and that turned Microsoft’s Tay bot into a white supremacist.

At TC Sessions: Justice, we’ll hear from three experts in this field. Let’s meet them.

Dr. Safiya Umoja Noble

Image Credits: Stella Kallnina

An associate professor at the University of California, Los Angeles, previously on the faculty at the University of Southern California, and author of “Algorithms of Oppression: How Search Engines Reinforce Racism,” Noble has become known for her analyses of the intersection of race and technology.

In her aforementioned book, Noble discusses the ways in which algorithms are biased and perpetuate racism. She calls this data discrimination.

“I think that the ways in which people get coded or encoded particularly in search engines can have an incredible amount of harm,” Noble told me back in 2018 on an episode of TC Mixtape, formerly known as CTRL+T. “And this is part of what I mean when I say data discrimination.”

Mutale Nkonde

Image Credits: Via Mutale Nkonde

It’s important to explicitly call out race in order to create just technological futures, according to Nkonde. In her research paper, “Automated Anti-Blackness: Facial Recognition in Brooklyn, New York,” Nkonde examines the use of facial recognition and the history of the surveillance of Black people in New York, and presents potential ways to regulate facial recognition in the future.

Nkonde is also a United Nations adviser on race and artificial intelligence and is currently working with Amnesty International to advance a global ban on facial recognition technology.

Haben Girma

Image Credits: Courtesy of Haben Girma

A human rights lawyer and author of the memoir “Haben: The Deafblind Woman Who Conquered Harvard Law,” Girma focuses on advancing disability justice.

At Sight Tech Global last month, Girma spoke about how discussions of algorithmic bias as it pertains to race have become somewhat normalized, but those conversations too often exclude the effects of algorithms on disabled people. Girma told me that when it comes to robots, for example, the topic of algorithmic bias is lacking among developers and designers.

“Don’t blame the robots,” she said. “It’s the people who build the robots who are inserting their biases that are causing ableism and racism to continue in our society. If designers built robots in collaboration with disabled people who use our sidewalks and blind people who would use these delivery apps, then the robots and the delivery apps would be fully accessible. So we need the people designing the services to have these conversations and work with us.”

If you’ve made it this far in the post, you’re probably wondering how to attend. Well, you can snag your ticket right here for just $5.

 
