From algorithm to tunnel vision

Jan Tuerlinckx

DIGITIZATION SHOULD MAKE life easier: we can do more with less effort, or with a smaller workforce. We equate digitization with efficiency, and presumably it should benefit everyone: we become more productive and can achieve a better work-life balance.

For some time now, the authorities have also grasped the importance of working more efficiently, and governments are therefore turning to digital optimization. It would be wrong to think that the Belgian government does so more than its foreign counterparts. The Dutch, for instance, make frequent use of algorithms in public administration. In fact, the Netherlands is an interesting laboratory and is paving the way in this area.

SOMETHING IS DEFINITELY brewing in that laboratory. The democratic content of some algorithms, for example, is the subject of debate. Algorithms are built on interconnected parameters, and choices have to be made in selecting them. In the Netherlands, this has sparked a debate on two issues. Firstly, should politicians be entrusted with selecting those parameters, rather than the civil servants and technicians who currently do it? Secondly, as regards transparency, shouldn't citizens know how their digital footprint is being tracked?

Even though algorithms are everywhere, those used to follow digital footprints in order to detect fraud are perhaps the best example. Yet a Dutch court recently banned the Ministry of Social Affairs from continuing to use a digital fraud detection system. The court argued that, from the citizen's viewpoint, it was not clear which data was being analysed and that the transparency of the system could not be guaranteed. In the newspaper Het Financieele Dagblad, Mirko Schäfer, a media scholar at Utrecht University, criticised the model, arguing that it was based on fraud stereotypes that reinforce old prejudices. The resulting suspicions are therefore inconclusive, which leads to a reversal of the burden of proof: citizens are forced to prove their innocence, while, according to democratic principle, the onus should be on the government to demonstrate the citizen's guilt. That is the essence of the relationship between government and citizen.

The democratic content of algorithms is the subject of debate

TAX SERVICES also use algorithms to build a digital footprint of taxpayers. Our own tax administration boasts that the IT specialists at Finance will become digital Sherlock Holmeses: they capture digital data and combine information from different sources, using specialist applications or self-written scripts to identify tax evasion risks and to select cases that fit those profiles.

Those computer forensics experts analyse possible tax evasion and make logical connections based on open-source data. In doing so, they improve procedures and software applications, and they help develop new scripts to enable or automate more complex analysis processes.
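To make the idea concrete: the column does not describe how such scripts actually work, but a minimal, purely hypothetical sketch in Python might look like the following. All field names, thresholds, and weights are invented for illustration and do not reflect any real administration's rules.

# Hypothetical sketch: a rule-based risk score that combines fields drawn
# from different (fictional) data sources. Every rule, weight, and field
# name below is an invented assumption, not an actual fraud-detection model.
from dataclasses import dataclass

@dataclass
class TaxpayerProfile:
    declared_income: float    # income reported on the tax return
    observed_turnover: float  # turnover estimated from open sources
    foreign_accounts: int     # number of known foreign accounts
    late_filings: int         # filings submitted after the deadline

def risk_score(p: TaxpayerProfile) -> float:
    """Return a score between 0 and 1; higher means 'select for review'."""
    score = 0.0
    # Rule 1: observed activity far exceeds declared income.
    if p.observed_turnover > 1.5 * p.declared_income:
        score += 0.5
    # Rule 2: each known foreign account adds a fixed increment (capped).
    score += 0.1 * min(p.foreign_accounts, 3)
    # Rule 3: a history of late filings adds a small penalty (capped).
    score += 0.05 * min(p.late_filings, 4)
    return min(score, 1.0)

if __name__ == "__main__":
    example = TaxpayerProfile(declared_income=40_000.0,
                              observed_turnover=90_000.0,
                              foreign_accounts=2,
                              late_filings=1)
    print(f"risk score: {risk_score(example):.2f}")  # prints 0.75

The point of the sketch is that every threshold and weight is a choice made by whoever writes the script, and that is precisely where the stereotypes and bias discussed above can creep in.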

WE MUST THEREFORE view the actions of our northern neighbours as an experiment from which we, too, can learn. The objections raised by the Dutch court are equally valid in our legal system. My Dutch colleague, tax lawyer Roel Kerkhoffs, aptly summarised on LinkedIn why everyone should be concerned about the application of algorithms: “The Tax Administration is also using algorithms. Unfortunately, it is easy for such fraud suspicion reports to cause bias on the inspector’s part. One might assume that, if the computer has selected a particular issue, there must be something wrong. It is important to investigate all this thoroughly.”

Even though this debate is still in its infancy, its basic premise stands: everyone should remain opposed to fraud.