Sex, Race, and Robots author Ayanna Howard discusses how to identify, fight bias

The Robot Report

Ayanna Howard with a Dynamic Anthropomorphic Robot with Intelligence-Open Platform (DARwIn-OP). Source: Rob Felt, Georgia Institute of Technology

Headlines regularly proclaim that robots are coming for people’s jobs or are “creepy,” but both robotics developers and the general public are increasingly aware of the many ways in which the technology can boost productivity and safety. However, the need to understand how robots and artificial intelligence can inherit negative human biases is still urgent, according to roboticist Ayanna Howard.

“Bias in AI is the responsibility of the designer,” said Howard, who recently published the book Sex, Race, and Robots: How to Be Human in the Age of AI. “Most designers and developers are fairly homogenous — largely male. I’m a roboticist, but my advisor was male, so the thinking processes were driven by male perspectives and are a product of training.”

“We need different people in terms of life experience,” she told The Robot Report. “We shouldn’t all go over a cliff because no one was trained to look down. We have a responsibility to get out of our comfort zones. For instance, engineers should work with UX [user experience] folks. Be the one to know nothing and retrain. It takes conscious effort.”

Howard applies experience to current concerns

Howard has worked at NASA and Microsoft Corp. and has been the chair of the School of Interactive Computing at the Georgia Institute of Technology for the past three years. She was named yesterday as the first female dean of Ohio State University's College of Engineering. Howard is also the founder and chief technology officer of educational robotics firm Zyrobotics LLC and a member of Autodesk Inc.'s board of directors.

“We’ve looked at our practices within academia,” she said. “In some cases, we throw grad students in and wait for people to bob up. They get advisors, but some may need more help, depending on their backgrounds. Such bias is hard to identify, but once it’s found, it’s easy to fix.”

From civil liberties concerns around facial recognition technology and voice-recognition systems that fail to recognize female voices to privacy worries about household robots and contact tracing of COVID-19 patients, the issues of trust and transparency have never been more urgent, Howard noted.

“My book’s chapters have different themes, with self-driving cars and job loss as examples,” she said. “In the case of the person killed by an autonomous Uber, the safety driver was charged, but not the company. If an engineer were charged in Volkswagen’s emissions scandal, how easy would it be to go after a programmer who didn’t look at a failsafe?”

“If the robots we create cause harm, it won’t necessarily be the company’s fault — it will be us,” Howard warned. “If companies don’t address accountability, regulations are coming for AI. We started to see cases around facial recognition three to four years ago, and bans are now here in some cities. It’s in companies’ best interests to identify problems and try to fix them.”

‘Cognitive disconnect’ around sex and robots

“The majority of voice assistants come with a default female voice,” Howard said. “The problem is that studies show 95% of administrative assistants are female, so we expect it to be female.”

Sex, Race, and Robots, by Ayanna Howard

Source: Audible

“We have a cognitive disconnect when its voice is male,” she explained. “‘Not only do I have an assistant that’s female, but she won’t bark back.’ There’s a worry that at some point, this feeds into human-to-human interactions or reinforces expectations of subservient behavior. In our society, men tend to interrupt more in conversation to establish a power differential. With understanding, we can retrain ourselves and AI.”

“Sexism can also be found in facial recognition,” said Howard. “If you look at beauty contests that use AI, why would you want an AI to evaluate certain features or hair? The latest pageants select people who would not have been included 50 years ago, but the systems are still applying ‘traditional’ concepts of beauty. It’s similar to the apps that create nude images.”

“Why is Pepper curvy?” she asked, referring to SoftBank’s humanoid service robot. “It’s very easy to cross the line from a ‘perfect shape’ or a nonthreatening design to something subservient.”

“As robots become more realistic, there have even been robot brothels,” she said. “If consent is not part of a relationship, research has shown that people can stop distinguishing what’s wrong.”

Black in Robotics strives for diversity

Howard is also a founding member of Black in Robotics, which advocates for diversity and inclusion in the robotics industry. She and Monroe Kennedy III, an assistant professor of mechanical engineering at Stanford University, recently spoke with The Robot Report Podcast.

Since then, the organization has created a Boston chapter and received inquiries from interested parties.

“A few companies have asked, ‘What can we do to increase diversity? We’ve seen the statistics, and we know it’s a problem,'” Howard said. “We hosted a series of events around IROS, which went virtual this year, on ally engagement and re-skilling computer scientists as roboticists, as well as events for students and mid-career roboticists.”

“Latinos are underrepresented in robotics, even though that’s the fastest-growing group in the U.S.,” she said. “We’re also developing ideas for dealing with ageism.”

Howard sees a window of opportunity

“COVID-19 has accelerated our need to resolve these issues because it has accelerated adoption of systems, from physical service robots to online assistants,” said Howard. “Any new technology has bugs, but now that people are using these systems, we could get used to certain biases. We now have a short time frame to institute fundamental changes as robots become pervasive.”

“It’s a matter of not introducing more harm than good,” she added. “Some people have been killed by seatbelts, but they’ve saved many more lives. It’s not like being inconvenienced or offended if a phone doesn’t recognize your face. Some applications will kill people unevenly.”

“For example, in the medical field, a lot of studies base their parameters and data on men — even for pregnancy pills,” she said. “We have to be conscious of gender bias when designing things like exoskeletons. At 5 ft. tall, I didn’t qualify for the astronaut program. That’s a design decision.”

Take action daily

“Being one of the few black females in robotics, I’ve had to learn how to navigate a space while being uncomfortable,” Howard said. “People should start thinking about what their role is in ‘rewiring’ themselves by taking small, daily actions.”

“For example, if you’re the only female in a team, you know you’ll be interrupted,” she said. “Write down three things you want to say, and raise your hand and interrupt. You’ll be uncomfortable at first, but it will become easier. When you talk to Siri, turn the voice to male. That’s a lesson for anyone, not just engineers.”

“We talk in robotics about understanding whom you’re designing for. If you change your mindset and diversify your team, unbiased design will become more natural,” she said. “Spoiler alert: At the end of Sex, Race, and Robots, I say, ‘As humans, we are all becoming anomalies in the age of AI.'”
