“Even before raw data is input into an algorithm, an algorithm can be doomed to be biased if it is solving a poorly structured problem. To properly structure a problem, one must consider the social complexities of the available data, as well as how the final product will be applied to the problem. The book Data Feminism, by Catherine D’Ignazio and Lauren F. Klein, explores this concept rigorously by looking beyond the numbers and emphasizing the context in which data is collected.”
Salena Prakah-Asante cites Data Feminism in a new article for the Harvard Tech Review on the dark side of the AI revolution. Read the whole piece here.