Data Ethics Commission: IT industry warns against general suspicion of algorithms

With its final report presented on Wednesday and the 75 recommendations for action it contains, the German Federal Government's Data Ethics Commission (DEK) has put its finger on the wound of much-criticized "surveillance capitalism". IT industry associations have accordingly responded to the work in sharp tones. Bitkom, for example, emphasizes that the goal of the dialogue initiated on the ethical use of data must not be to "turn Germany into an analog island state".

The report's authors place "almost all algorithms under general suspicion", the association complains, even though only very few of them pose a risk of discrimination or danger to life and limb. The risk assessment demanded for each of these often mundane everyday routines "is not only unnecessary but, given the sheer number of applications, not even feasible in practice". In parallel, Bitkom has published the guide "Look into the black box" to make artificial intelligence (AI) algorithms more comprehensible in practice.

With some of its demands, the commission has "clearly overshot the mark", says eco, the association of the Internet industry, striking the same note. Regulatory fantasies such as a new EU General Algorithms Regulation (EUVAS) could, according to eco, become "a real brake on digitization". As for the interoperability obligation for messengers and social networks that has been floated, it should be questioned to what extent this would limit the sovereignty of users.

The TÜV Association, on the other hand, welcomed the DEK's initiative to divide AI and self-learning algorithms into different risk classes. An autonomous vehicle is something different from an online store or an intelligent e-mail inbox. Where there is danger to life and limb, or a strong intrusion into consumers' privacy, the AI must be demonstrably safe. Independent audits of technology affecting the safety of consumers, workers, and businesses are an important and appropriate instrument for helping relevant innovations achieve a breakthrough in the European market.

The report is "very successful" and "covers the spectrum of regulatory measures necessary to develop the law further in a technologically sensible way", states Matthias Kettemann, head of a research program on "Standard Formation in Digital Communication Spaces" at the Leibniz Institute for Media Research in Hamburg. Central, he says, is "the double focus on the protection of human rights and human dignity as well as the preservation and promotion of democracy and social cohesion". Too often, the debate on "datafication" has so far lost sight of society as a whole.

"The report has been prepared with a view to our digital reality and depicts the facts very well", praises the Frankfurt data protection lawyer Anne Riechert. The proposed data trust model is important, she says, in order to preserve the individual's right to informational self-determination while at the same time relieving them of decisions. The Stuttgart media scholar Tobias Keber welcomed above all that the commission clearly rejected the concept of so-called "data ownership".

Tobias Matzner, Professor of Media, Algorithms and Society at the University of Paderborn, points to an important but somewhat hidden aspect of the report: the authors make clear that digital self-determination must also encompass the handling of non-personal data. Big data analysis has long since made it impossible to cleanly separate such data from personal information. The report is also right, he says, that data should no longer be regarded as "consideration" for digital services.

"The Commission has done what the Federal Government itself has failed to do for years: it has put forward very concrete proposals for the urgently needed regulation of central questions of digitization", state the Green Bundestag members Konstantin von Notz and Tabea Rößner. The report makes clear that the government's long-running campaigns against informational self-determination, pursued with slogans such as "data ownership" and "wealth of data", "have found, and continue to find, no support in expert circles".

Anke Domscheit-Berg and Petra Sitte of the Left Party parliamentary group commented in a similar vein. In their view, urgent action is now needed to ensure that "the use of algorithms serves the common good", and the government must revise its AI strategy. The coalition should "finally give data policy the same importance as tax policy", demanded FDP digital policy expert Manuel Höferlin. The abuse of data power must be sanctioned more sharply, he said; however, some of the DEK's proposals breathe "the spirit of a dirigisme that reaches its limits in our global information society."
