
Most Americans believe algorithms will always be biased

A study suggests this distrust stems in part from social networks.

If you're convinced that many algorithms are biased, you're not the only one. A Pew Research Center survey indicates that 58 percent of American adults believe algorithms and other computer programs will always contain some kind of human bias. The figure skews with age (63 percent of those over 50 didn't believe algorithms could be completely neutral), but even the relatively optimistic 18-29 crowd showed some distrust, with 48 percent expecting at least some bias.

Why the skepticism? To some extent, it could stem from social networks. A full 74 percent of study participants didn't think social media accurately reflected society. It pushed their emotional hot buttons (88 percent were at least sometimes amused, 71 percent angry) and frequently led to heated discussions, whether or not participants had all the facts. Respondents also weren't big fans of how social networks used their data in some cases. They were fine with their data being used to recommend events or suggest potential friends, but balked at having it used to target ads -- especially political ads.

The survey also suggested that most Americans didn't believe algorithms should be used in situations with far-reaching consequences. About 56 percent found criminal risk assessment algorithms unacceptable, and respondents objected even more strongly to algorithms being used for automated resume screening (57 percent), analysis of job interview videos (67 percent) and personal finance scores (68 percent). Many believed that algorithms couldn't accurately capture the complexity of human nature, and that they were both unfair and prone to violating privacy.

While the study doesn't make any definitive pronouncements, it's evident that tech companies and governments will have a lot of work to do if they're going to convince the public that algorithms can outperform humans in certain cases. It also hints that these bodies may want to avoid algorithmic decision-making until they can show that the code is reasonably fair.