By Kate Darling
There’s no point getting frustrated with AI when it doesn’t do what we expect it to. Instead, we should focus on the ways it can help and support people.
Since the start of the pandemic, artificial intelligence (AI) developers have deployed hundreds of machine learning tools to help diagnose COVID-19. The promise: to find patterns in medical data, like an algorithmic version of the television character Dr House.
Recently, researchers have found that these AI tools were overhyped. Instead of discovering relevant connections between cases, the algorithms latched onto a litany of spurious patterns, including predicting COVID cases based on the text font that hospitals happened to use in their documents.