Questioning Mathematics: Algorithms and Open Texture

Talk given at 10 Years of ‘Profiling the European Citizen’: Slow Science Seminar, 12-13 June 2018, Brussels City Campus.

Abstract: Contemporary data practices, whether we call them data science or AI, statistics or algorithms, are widely perceived to be game changers. They change what is at stake epistemologically as well as ethically. This especially applies to decision-making processes that infer new insights from data, use these insights to decide on the most beneficial action, and refer to the data and inference process to justify the chosen course of action.

Developing a critical epistemology that helps us address these challenges is a non-trivial task. There is a lack of clarity regarding the epistemological norms we should adhere to. Purely formal evaluations of decisions under uncertainty can, for instance, be hard to assess outside of the formalism they rely on. In addition, there is substantial uncertainty with regard to the applicable norms, because scientific norms may appear to be in flux (new paradigms, new epistemologies, ...). Finally, dealing with this uncertainty and lack of clarity is further complicated by promises of unprecedented progress and opportunities that invite us to imagine a data revolution with many guaranteed benefits but few risks.

As part of this broader epistemological exercise, I want to focus on a small but largely disregarded, and in my view misunderstood, fragment of the problem at hand: the role of mathematics, and the question of how some widely shared beliefs about the nature of mathematical knowledge contribute to the scientific respectability of contemporary data practices.