As artificial intelligence continues to permeate our lives, how can we ensure biases are not being built into the technologies we use every day?
“A well-known saying in the artificial intelligence community is ‘garbage in, garbage out,’” says Dr. Ayanna Howard, robotics pioneer and tech entrepreneur. “In other words, the quality of the data fed into AI algorithms has a direct impact on the quality of the results. Organizations should take this to heart: they should ensure that their ‘in’ (their leadership and workforce) represents their ‘out’ (the diverse range of customers they serve). Otherwise, just like with AI, the quality of their outputs will eventually be found substandard and without value, marking a company headed for obsolescence and ripe for failure.”
A former NASA engineer who was recently appointed the first female Dean of the College of Engineering at The Ohio State University, Dr. Howard explores in her bestselling audiobook “Sex, Race and Robots: How to Be Human in the Age of AI” (2020) how small, homogeneous communities of programmers are knowingly or unknowingly embedding biases into AI products, and how that harms society. To combat bias and address trust concerns, she says, AI developers must remember that a diverse group of users sits on the other end.
“We as people are diverse, and we’re different, and that makes us unique and beautiful. Our AI systems should be designed the same way,” she explains.
Named one of America’s Top 50 Women in Tech, Dr. Howard offers organizations a powerful opportunity to develop AI with greater awareness of bias so they can serve wider audiences well into the future.