Reconsidering artificial intelligence

Researchers at the University of Stuttgart’s Interchange Forum for Reflecting on Intelligent Systems (IRIS) explore how intelligent systems affect society.
15th November 2021
  • Technological discrimination cannot be blamed on algorithms themselves but on the standards behind them, which are set by humans.
  • The Interchange Forum for Reflecting on Intelligent Systems (IRIS) seeks to help avoid, detect, and explain such unfairness by conducting field research as well as offering interchange opportunities to discuss current ethical and social challenges.
  • IRIS helps to raise awareness of technological discrimination by educating aspiring researchers as early in their careers as possible, e.g. while they are still students.

Can an inanimate object be racist? Some time ago, a video of a soap dispenser systematically denying soap to dark-skinned people went viral on social media. The soap dispenser reacted this way because its standard infrared technology was developed by fair-skinned people and tested exclusively on their hands. This is not an isolated case. Another example occurred at the Südkreuz train station in Berlin, where a facial recognition software pilot project involving 300 test subjects was carried out in 2017. The study results showed that the system falsely flagged many people as suspects whom it was not even looking for. Once again, people of colour and women were particularly affected.

Can technological discrimination be blamed on algorithms? Maria Wirzberger of the Department of Teaching and Learning with Intelligent Systems (LLiS) and spokesperson for the new Interchange Forum for Reflecting on Intelligent Systems (IRIS) research group says: “The answer is no because, in simple terms, algorithms work like drawers. They are based on standards set by human beings, as a result of which they may well reflect their stereotypes. These stereotypes are often unconscious and are therefore incorporated unconsciously into technological developments.”

Combating unfairness

Researchers from all disciplines now wish to collaborate at IRIS to investigate such developments. “Over the past few years we’ve learned how the naive use of Artificial Intelligence (AI) for automated decision-making can lead to unfair discrimination,” says Steffen Staab of the University of Stuttgart’s Institute for Parallel and Distributed Systems and co-spokesperson of IRIS. “Which is why we are developing new methods to avoid, detect, and explain unfairness.” Through the IRIS network, researchers intend to critically reflect on the foundations, mechanisms, implications, and effects of intelligent systems in research and teaching as well as with regard to society as a whole. IRIS is funded by the German Research Foundation (DFG) as part of the German federal and state governments’ Excellence Strategy, as well as by the University of Stuttgart’s research fund.

IRIS’s activities are not limited to research: the forum also creates interchange opportunities, both within and outside the university, to discuss current ethical and social challenges — ranging from data ethics to informational self-determination and reliable AI — with partners from civil society and the business sector. The focus is also on teaching: the “Reflecting on Intelligent Systems in the Next Generation” (RISING) teaching forum, headed by Wirzberger, teaches students of all subjects to critically reflect on intelligent systems by offering courses on topics such as “cultural bias” and “open science”. Teachers can further their own training by applying reflective teaching methods.

Raising Awareness

So how does IRIS help to counter stereotyping in fields such as technology development? “What’s so good about IRIS,” Wirzberger explains, “is that it raises students’ awareness of this issue before they enter the workplace, which will help prevent such developments on this scale.” This extends even to language: “Language is also an intelligent system. Anyone who fails to use language in a sensitive manner may exclude entire demographic groups. Topics such as these should be firmly anchored in the way we think and act. We wish to create an awareness of just how colourful and diverse our society is.” At the same time, the intention is to boost the university’s networking activities at the international level. The benefit of bringing different people together, the researcher explains, is that good ideas emerge and there is a lively exchange of ideas.

Staab emphasises the fact that people also learn something about themselves in the process: “Human decisions are often based on unconscious prejudices. Digitising our decisions makes the results transparent and verifiable. Increasingly, we are now seeing how we have unfairly discriminated against others in the past. In future, AI will hopefully enable us to better scrutinise our own decisions and make fairer judgments.”

Reflecting upon intelligent systems is already firmly anchored in several areas within the University of Stuttgart, such as the ‘Platform of Reflection’ within the ‘Stuttgart Center for Simulation Science’ (SimTech) Cluster of Excellence, in one of the thematic foci of the International Center for Cultural and Technological Studies (IZKT), and in the Center for Interdisciplinary Risk and Innovation Studies (ZIRIUS). IRIS is also part of Cyber Valley, Europe’s largest AI research consortium, located in the Stuttgart-Tübingen region, which aims to strengthen research and education in machine learning, computer vision, and robotics, as well as the exchange between these scientific disciplines. To take ethical and social aspects into account, the Cyber Valley Public Advisory Board (PAB) has been established. IRIS strengthens this perspective even further: it addresses all disciplines, from technology and engineering to the humanities, social sciences, and economics, and unites their various competencies.

For more information, please contact Melina Danieli (Communications Manager at Cyber Valley).
