US report warns on big data discrimination

By GovInsider

Even data does not treat everyone equally.

Big data analytics can inadvertently discriminate against certain citizens, a United States government report has found.


“It is a mistake to assume they are objective simply because they are data-driven,” said the White House report.


The techniques can be biased by poorly selected or incomplete data, or by the algorithms themselves.


“The algorithmic systems that turn data into information are not infallible - they rely on the imperfect inputs, logic, probability, and people who design them,” writes US CTO Megan Smith in a joint blog post.


Take gender equality in hiring. Analytics tools can recommend the candidates whose skills best match a job. A startup might, for instance, screen for people who showed an early interest in computing. But if candidates come from backgrounds where boys are exposed to computers at an earlier age, that criterion would skew hiring towards male recruits, the report says.
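The report stops at the description, but a minimal sketch shows how the skew creeps in. The example below is not from the report: the data, the "age of first exposure to computing" feature, and the scikit-learn model are all invented for illustration.

```python
# Illustrative only: invented data showing how a proxy feature can encode bias.
# Assumes scikit-learn and NumPy are available; the report itself contains no code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Hypothetical population: boys tend to be exposed to computers earlier.
is_male = rng.integers(0, 2, n)
first_exposure_age = rng.normal(loc=np.where(is_male, 10, 14), scale=2)

# Past hiring decisions favoured early exposure, so the training labels
# inherit the same skew.
hired = (first_exposure_age + rng.normal(0, 2, n) < 12).astype(int)

# Train a recommender on those historical decisions. Gender is never an input.
model = LogisticRegression().fit(first_exposure_age.reshape(-1, 1), hired)
scores = model.predict_proba(first_exposure_age.reshape(-1, 1))[:, 1]

# Yet the recommendations end up favouring men, because the proxy feature does.
print("mean score for men:  ", scores[is_male == 1].mean())
print("mean score for women:", scores[is_male == 0].mean())
```

Nothing in the code mentions gender explicitly; the bias arrives through the training data and the proxy feature, which is exactly the point the report makes.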


It also looks at an example from within the US government: the recently launched College Scorecard, which gives students information on college quality and costs to help them make better choices. But federal data lacks details on individual students' preparation for college admissions, the report says, even though this is an important factor for students choosing a college.


Without this data, the College Scorecard cannot give students the best possible college matches.


Another education example is universities using analytics to predict graduation rates. Family income is a significant factor in such algorithms, the report says. This biases the system against students from low-income families, who are more likely to drop out because they cannot afford to continue. They could be denied admission not because of their abilities, but because of their family income.
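Again, the report gives no formula, so the toy score below is invented to make the mechanism concrete: once family income carries significant weight, two applicants with identical academic records receive different scores.

```python
# Illustrative only: a toy "graduation likelihood" score of the kind the report
# describes. The features and weights are invented for this sketch.
def graduation_score(gpa: float, test_score: float, family_income: float) -> float:
    """Toy linear model in which family income carries significant weight."""
    return (0.3 * gpa / 4.0
            + 0.3 * test_score / 1600
            + 0.4 * min(family_income / 100_000, 1.0))

# Two applicants with identical academic records but different family incomes.
wealthy = graduation_score(gpa=3.8, test_score=1400, family_income=150_000)
low_income = graduation_score(gpa=3.8, test_score=1400, family_income=30_000)

print(f"wealthy applicant:    {wealthy:.2f}")   # ~0.95
print(f"low-income applicant: {low_income:.2f}") # ~0.67

# If admissions set a cutoff anywhere between the two scores, the low-income
# student is rejected for family income, not ability.
```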


Government must support research into mitigating such “algorithmic discrimination”, the report recommends. It should also support multi-disciplinary projects to understand the social impacts of these technologies.


“In particular, it will be important to bring together computer scientists, social scientists, and those studying the humanities to understand these issues in their historical, social, and technological contexts,” the report says.


Image by Jimmy Thomas, licensed under CC BY 2.0