
U of T researcher launches group to help detect hidden biases in AI systems


Parham Aarabi, of the department of electrical and computer engineering, helped start a research group that uncovers biases in AI systems, including some belonging to Apple, Google and Microsoft (photo by Johnny Guatto)

A new initiative led by U of T researcher Parham Aarabi aims to measure biases present in artificial intelligence systems as a first step toward fixing them.

AI systems often reflect biases that are present in their training datasets, or, sometimes, the AI's modelling can introduce new biases.

"Every AI system has some kind of a bias," says Aarabi, an associate professor of communications/computer engineering in the Edward S. Rogers Sr. department of electrical and computer engineering in the Faculty of Applied Science & Engineering. "I say that as someone who has worked on AI systems and algorithms for over 20 years."

Aarabi is among the academic and industry experts in U of T's HALT AI group, which tests other organizations' AI systems using diverse input sets. HALT AI creates a diversity report, including a diversity chart for key metrics, that shows weaknesses and suggests improvements.

"We found that most AI teams do not perform actual quantitative validation of their system," Aarabi says. "We are able to say, for example, 'Look, your app works 80 per cent successfully on native English speakers, but only 40 per cent for people whose first language is not English.'"

HALT AI was launched in May as a free service. The group has conducted studies on a number of popular AI systems, including some belonging to Apple, Google and Microsoft. HALT AI's statistical reports provide feedback across a variety of diversity dimensions, such as gender, age and race.
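HALT AI's actual tooling is not public, but the kind of per-group validation Aarabi describes, running a diverse set of test inputs through a system and breaking the success rate down by demographic group, can be sketched in a few lines. Everything here (function name, data layout, the example numbers) is illustrative, not HALT AI's implementation:

```python
from collections import defaultdict

def diversity_report(results):
    """Aggregate pass/fail test outcomes into per-group success rates.

    `results` is a list of (group, passed) pairs, one per test input;
    the group label can be any diversity dimension under audit
    (gender, age bracket, first language, ...).
    """
    totals = defaultdict(int)
    passes = defaultdict(int)
    for group, passed in results:
        totals[group] += 1
        if passed:
            passes[group] += 1
    return {g: passes[g] / totals[g] for g in totals}

# Hypothetical audit of a voice-to-text app by first language,
# mirroring the 80% / 40% gap quoted above:
outcomes = ([("native English", True)] * 8 + [("native English", False)] * 2
            + [("non-native", True)] * 4 + [("non-native", False)] * 6)
report = diversity_report(outcomes)
# report -> {'native English': 0.8, 'non-native': 0.4}
```

A real audit would add confidence intervals, since small per-group sample sizes can make an apparent gap meaningless, but the per-group breakdown itself is the core of the diversity chart described above.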

"In our own testing we found that Microsoft's age-estimation AI does not perform well for certain age groups," says Aarabi. "So too with Apple and Google's voice-to-text systems: If you have a certain dialect, an accent, they can work poorly. But you do not know which dialect until you test. Similar apps fail in different ways, which is interesting, and likely indicative of the type and limitation of the training data that was used for each app."

HALT AI started early this year, when AI researchers within and outside the electrical and computer engineering department began sharing their concerns about bias in AI systems. By May, the group had brought aboard external diversity experts from the private and academic sectors.

"To truly understand and measure bias, it can't just be a few people from U of T," Aarabi says. "HALT is a broad group of individuals, including the heads of diversity at Fortune 500 companies as well as AI diversity experts at other academic institutions such as University College London and Stanford University."

As AI systems are deployed in an ever-expanding range of applications, bias in AI becomes an even more critical issue. While AI system performance remains a priority, a growing number of developers are also inspecting their work for inherent biases. 

"The majority of the time, there is a training set problem," Aarabi says. "The developers simply don't have enough training data across all representative demographic groups."

If diverse training data doesn鈥檛 improve the AI鈥檚 performance, then the model itself may be flawed and require reprogramming.
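A common first check for the "training set problem" Aarabi describes is simply counting how each demographic group is represented in the training data before looking for deeper modelling flaws. The sketch below is a hypothetical illustration of that idea; the function name and the 50-per-cent-of-uniform-share threshold are assumptions for the example, not a standard method:

```python
from collections import Counter

def representation_gaps(group_labels, tolerance=0.5):
    """Flag groups whose share of the training data falls well below
    a uniform share.

    `group_labels` holds one demographic label per training example;
    a group is flagged if its share is below `tolerance` times the
    uniform share (an illustrative threshold, chosen for this sketch).
    """
    counts = Counter(group_labels)
    n = len(group_labels)
    uniform_share = 1 / len(counts)
    return [g for g, c in counts.items() if c / n < tolerance * uniform_share]

# Hypothetical age-bracket labels for 100 training examples:
labels = ["18-30"] * 70 + ["31-50"] * 25 + ["51+"] * 5
# Uniform share is 1/3; "51+" holds only 5%, below half of that.
print(representation_gaps(labels))  # ['51+']
```

If rebalancing the flagged groups with more diverse data does not close the performance gap, that points to the second failure mode mentioned above: a flaw in the model itself.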

Deepa Kundur, a professor and the chair of the department of electrical and computer engineering, says HALT AI is helping to create fairer AI systems. 

"Our push for diversity starts at home, in our department, but also extends to the electrical and computer engineering community at large, including the tools that researchers innovate for society," she says. "HALT AI is helping to ensure a way forward for equitable and fair AI."

"Right now is the right time for researchers and practitioners to be thinking about this," Aarabi adds. "They need to move from high-level abstractions and be definitive about how bias reveals itself. I think we can shed some light on that."
