Algorithms and fairness


Posted on August 12, 2015

In the context of LIRNEasia’s big data work, we intend to wrestle with the issues raised in the excerpt below. If we are not getting our hands dirty with the data and the stories we extract from them, I fear the conversation will be sterile.

First, students should learn that design choices in algorithms embody value judgments and therefore bias the way systems operate. They should also learn that these things are subtle: For example, designing an algorithm for targeted advertising that is gender-neutral is more complicated than simply ensuring that gender is ignored. They need to understand that classification rules obtained by machine learning are not immune from bias, especially when historical data incorporates bias. Techniques for addressing these kinds of issues should be quickly incorporated into curricula as they are developed.

Interview.
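
The excerpt’s point about gender-neutral targeting can be made concrete with a small sketch. The snippet below is purely illustrative: it uses a made-up synthetic dataset (hypothetical "proxy" and "other" features, not drawn from any real ad system) and a standard off-the-shelf classifier. The gender column is dropped before training, yet the learned rule still targets the ad unevenly, because a remaining feature correlates with gender and the historical labels were themselves biased.

```python
# Illustrative sketch only: synthetic data, not any real advertising system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical features: gender (0/1) plus a "proxy" feature (e.g. browsing
# history) that is strongly correlated with gender, and one unrelated signal.
gender = rng.integers(0, 2, size=n)
proxy = gender + rng.normal(0, 0.3, size=n)   # leaks gender indirectly
other = rng.normal(0, 1, size=n)              # genuinely relevant signal

# Historically biased labels: the ad was shown mostly to one gender,
# largely regardless of the genuinely relevant feature.
shown_ad = (0.3 * other + 2.0 * gender + rng.normal(0, 1, size=n)) > 1.0

# "Ignoring gender": train without the gender column at all.
X = np.column_stack([proxy, other])
model = LogisticRegression().fit(X, shown_ad)
pred = model.predict(X)

# The predictions still track gender, reconstructed through the proxy.
print("ad rate, gender=0:", pred[gender == 0].mean())
print("ad rate, gender=1:", pred[gender == 1].mean())
```

On this synthetic data the two printed rates differ sharply even though gender was never an input to the model, which is the subtlety the excerpt is pointing at: removing the sensitive attribute does not remove the bias that the training data and correlated features carry.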
