Abhijit Bhaduri

Before fixing bias in AI, let us fix our own

By Abhijit Bhaduri | Founder, Abhijit Bhaduri & Associates, and ex-CLO, Wipro Technologies

We are getting agitated about the bias being programmed into technology. But the problem is not in the tech; it is in the worldview of those who build it.

Face-recognition, voice-recognition, gesture-recognition …

You would believe that replacing the traditional lock and key with face recognition would make your home more secure. That is what the tech providers thought. Some apartment buildings in the Bronx had software-driven locks installed: face-recognition software would unlock the doors. The buildings' diverse residents discovered that the software was locking out some of them, and they asked for the lock-and-key system to replace the software.

Ruled by algorithms

Face-recognition software works well for fair-skinned males in the 18-35 age group. The data set the algorithm was trained on did not match the profile of the users, so it is no surprise that people of color get locked out. Voice-recognition software has trouble identifying high-pitched voices: children's voices and many accents are often misunderstood, because the software has been trained mostly on male voices.

Read: Can algorithms be made humane?

Your gesture is unfamiliar

A team member on Microsoft's Kinect discovered in the early prototypes that the device understood the gestures of the male members of the house; it was the gestures of women and children that it struggled with. The data used to train an algorithm often carries the historical bias of humans, and the program faithfully reflects the bias of its data set.

Read: Gender and race bias in voice recognition

Resume screening software is biased too

Amazon tried feeding a decade-old pile of job applicants' resumes to a machine to identify what the ideal applicant looked like. The machine knew one variable for sure: the ideal candidate had to be male! That is what the majority of the resumes Amazon had received over the previous decade told it.

What if the applicant has a name that could belong to either gender? For an androgynous name like Kiran, the machine can infer the gender from the hobbies listed in the resume. If the applicant has listed cricket as a sport, there is a greater likelihood of the person being male, the algorithm figured. Sometimes the name of the college attended contained the word "Women", and that gave away the gender.
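A toy sketch can make the mechanism concrete. This is not Amazon's actual system; it is a deliberately simplified, hypothetical scorer that "learns" only by counting which words appeared in past hires' resumes. Because the training pile skews male, a word like "cricket" becomes a high-scoring proxy, and two equally qualified candidates are ranked apart by a hobby alone:

```python
# Toy illustration of proxy bias (hypothetical data, not Amazon's system):
# a scorer trained on a historically skewed pile of "successful" resumes
# ends up rewarding whatever words the majority group happened to use.
from collections import Counter

# Hypothetical training data: resumes of past hires, skewed male.
past_hires = [
    "software engineer cricket club captain",
    "backend developer cricket enthusiast",
    "systems programmer chess and cricket",
    "data engineer netball team",          # lone counter-example
]

# "Training" is just counting word frequencies among past hires.
word_counts = Counter(w for resume in past_hires for w in resume.split())

def score(resume: str) -> int:
    """Score a resume by how popular its words were among past hires."""
    return sum(word_counts[w] for w in resume.split())

# Two equally qualified candidates named Kiran; only the hobby differs.
kiran_a = score("software engineer cricket club")
kiran_b = score("software engineer netball club")
# kiran_a outscores kiran_b purely because past hires skewed male:
# the model has quietly encoded the historical bias as a "skill" signal.
```

Nothing in the code mentions gender, which is exactly the point: the bias rides in on a correlated proxy word, and the scorer "faithfully reflects" its data set.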

Diversity is a challenge everywhere

The challenge of inclusion is everywhere. After 92 years, the Best Picture award finally went to a film not made in the English language: Parasite, directed by Bong Joon Ho.

Stand-up comedians have built careers around the lack of diversity in the Academy Awards jury. But the challenge remains stubbornly unchanged. How do we ensure that the diversity challenge is addressed?

Read: Algorithmic Accountability – what is it?

The “awkward” minority

Something happens to us when we see ourselves as the majority. The people who are not like us become invisible. The majority writes the rule book to favor itself in every way.

The left-handed minority

Think of the challenge faced by left-handed people who wish to play the guitar or use a pair of scissors. Spiral notebooks, scissors, car radios, computer keyboards, and handheld can openers are all designed with a right-hand bias, and so are guitars. Southpaws, as left-handed people are called, learn to live by the rules of a world designed for the right-handed.

The bias is visible in our language. The French word for right is "droit", which is how "adroit" became a synonym for skilful in English. The left side, in French, is "gauche", and gauche in English refers to someone who is socially awkward. Any deviation from the norms set by the majority is awkward (for the right-handed majority, I assume). The majority is always the norm.

Anything that is an exception to the norm is termed "abnormal". Gender was viewed as binary, and anything that challenged that definition was termed illegal. That is why transgender rights are still not part of mainstream conversations. Neither are gig workers' rights. The pandemic is eroding gig workers' livelihoods in travel, hospitality, and many other sectors.

Employees who are on the rolls of organizations are naturally given salaries and health insurance, but the same privilege is not extended to the freelancer or gig worker who does the work from outside the lines of the payroll.

Why is tech not inclusive?

Tech is designed by a homogenous group of people. The young male engineer is the default. These businesses are managed by leadership teams that look the same. In the tech world, anyone over the age of 35 is "old" and cannot be expected to learn fast enough. "Hiring married women who have children means they will not work insane hours and will expect perks like work-life balance," the hiring manager of a tech company told me on condition of anonymity.

The biggest tech companies build campuses that keep you landlocked in the office. Everything from going to the gym to snacking is a brief break you take before coming back to code. It is precisely this lack of contact with the world outside that makes them blind to opportunities. When large tech firms move into a neighborhood, rents go up because the young, footloose people in hoodies can pay more. In a few years, the area begins to look like a ghetto with no diversity. Everyone thinks the same way. The people who are different have all been forced out.

Tech is binary – humans are not

In technology, everything is binary, whereas the human world is anything but. The more techies become part of the social system, the more nuanced their view of the consumer's world becomes. They begin to notice consumers who are not literate but need to use their phones, and may even wonder how such users store all the phone numbers they need to call, and how they retrieve numbers they cannot read. How do the elderly cope with the minimum literacy that technology demands? How do people cope with the massive disruption in their careers as machines keep gobbling up the jobs that feed their families?

Before we fix the bias in the algorithms, we need to fix the biases that cloud our own view of the future. Technology has to be defined in human as well as cognitive terms. Today the definition is only cognitive, and it takes away the essence of what makes us human. The backlash against Big Tech will percolate downstream; it is only a matter of time.

Tech for the “rest of the world”

When Marie Antoinette suggested that hungry peasants who could not have bread should eat cake, the consequence she faced should give us a heads-up (pardon the dark humor).

With almost five billion adults owning phones, there is a bigger question for tech to think about: what products and services will we create for users who do not speak English? How can tech be made inclusive for those whose paying capacity is limited?

There is no better place to start than India. The first wave of technology was about iPhones and Teslas, but the biggest problems of humanity (hunger, education, affordable healthcare) remain unsolved. Tech creators have to use their superpowers to solve these challenges.

As the next billion people go online and the balance of power moves from the US to the "rest of the world" (just think of what that term smacks of), we have to stop changing our accents to match a more "globally understood" accent and instead let technology figure out the accents of 1.3 billion Indians, because that is where the next market is. That is the future.

First written for the March 2020 issue of People Matters. Republished with permission of Abhijit Bhaduri; originally published at https://abhijitbhaduri.com/

 


 
