Members of parliament and scientists see a need for corrections to the EU Commission's draft regulation on a "European approach for Artificial Intelligence" (AI). In a cross-party open letter, 40 MEPs appeal to the Commission to propose a "clear prohibition of biometric mass surveillance in public spaces". A majority of citizens, they argue, wants exactly that.
The bone of contention: the Commission wants to prohibit the use of AI for biometric surveillance only for companies. Security authorities would be allowed to continue using such AI techniques. Biometric remote identification of persons, for instance through automated facial recognition, is classified as high-risk, but would in principle remain possible after passing a special approval procedure.
The parliamentarians, among them Patrick Breyer (Pirate Party), Nicola Beer (FDP), Cornelia Ernst (The Left), Evelyne Gebhardt (SPD), the Green Alexandra Geese, the liberals Moritz Körner and Svenja Hahn, and the Social Democrat Tiemo Wölken, demand that this exception be deleted. Mass surveillance, they argue, is invariably justified by appeals to public security – which is precisely where a ban would matter. Courts have repeatedly declared this approach void.
The automated recognition of sensitive characteristics of people such as gender, sexuality, ethnicity and health status "is unacceptable and must be excluded", the MEPs write. It "harbors the danger of entrenching many forms of discrimination" and serves as the basis for "far-reaching and indiscriminate monitoring and tracking of population groups based on their biometric features".
Christiane Wendehorst, co-chair of the German Data Ethics Commission, welcomed the list of banned AI practices, for example for manipulation or mass surveillance: "This finally establishes 'red lines' that AI applications must not cross." The Viennese civil-law scholar points, however, to a "strange discrepancy": on the one hand, the Commission stresses that such techniques contradict "European values and fundamental rights"; on the other, they are to be permitted by law on grounds of public security.
Platforms covered too?
"Manipulation", for example, has so far been defined so broadly "that ultimately the entire area of personalized advertising and the adaptive design of social media could fall under it", Wendehorst offers as an example. In both areas, AI is used to influence the behavior of users. Yet this is a "central feature of the platform operators' business model". How the regulation is to be enforced here is therefore an open question.
According to Kristian Kersting, who researches machine learning at TU Darmstadt, the provisions could even be read as requiring a ban on social media: "This may sound cynical, but many people believe that social networks negatively influence people's opinions." The corresponding algorithms, like social scoring, would then be classified as unacceptable from the outset.
Kersting also considers the minimum standards the Commission envisages for AI in areas such as personnel selection or police work to be barely workable: "A complete, explicit description of the effects of actions on all facts that apply in a world is difficult, if not impossible." The goal is good, he says, "but the details are excessive". The rules must not go too far, because "a presence in AI means prosperity through innovation and a strong tool in the fight against climate change and diseases".
AI still in its adolescence
"New technologies, solutions and market needs require a reliable regulatory framework, but without over-regulation", stressed Antonio Krüger, managing director of the German Research Center for Artificial Intelligence (DFKI). AI is no longer a new science, he said, but its applications and market penetration are still in their adolescence. It must be clear that the technology is not used in Europe for arbitrary surveillance purposes. The DFKI itself has faced criticism over far-reaching AI cooperations with China.
It is right to want to make Artificial Intelligence "compatible with fundamental rights", explains the Hamburg media researcher Stephan Dreyer. Under the draft, however, almost any software currently in use would count as AI, including modules and program libraries. Moreover, the Commission defines high-risk AI along "highly vague criteria whose presence can only be determined through a risk assessment".