Shaping digitization for equal opportunities

Diversity-sensitive digital transformation: For a fair future for all

Wavy lines in shades of pink, red and orange on a dark background with scattered yellow points of light.

Diversity is crucial in digitization

Digital transformation influences our daily lives and permeates every area of our society. Apps on the smartphone, tools on the PC, or AI applications such as ChatGPT: we live and work more and more digitally.

Two women laugh together in front of a computer in a modern office, while two more people work at their screens in the background.

However, studies show that these technologies are not always designed to be diversity-sensitive. This means they can disadvantage and discriminate against women, Black people and other marginalized groups, such as people with a migration background. One reason is that a male, white perspective is often overrepresented in the development of digital applications in the STEM sector (see Diversity-sensitive software development). It is therefore important to initiate a joint discourse, to enable people to use technologies in a discrimination-aware way, and at the same time to actively seize the opportunities IT offers for greater diversity. We can all contribute to making digital transformation gender-equitable and inclusive.

Artificial intelligence and digitization in the world of work

Digital transformation of the working world...

In the office, ChatGPT helps us write faster or prepare presentations. Software takes over and reduces administrative tasks, for example through automated payroll systems, apps for shift planning or software for managing application processes. In public administration, citizens can, for instance, apply for an identity card online and thus more easily. All of this eases access and is intended to simplify work processes.

... a risk to diversity & equal opportunities?

But not all people have equal access to digital applications and tools. Anyone who does not feel confident using a computer, whether because of age or a lack of knowledge, is excluded more quickly. And that is not the only challenge: online hate and hate speech, particularly against women or democratic voices, are a threat to diversity and equal opportunities.

Algorithmic bias: When software becomes unfair

And software, apps and algorithms are not necessarily neutral. Social stereotypes and existing injustices can be reproduced through technology: people are marginalized on the basis of gender, skin color, age, religion or origin. So-called algorithmic bias appears, for example, when programs predict that women are less employable on the labor market.

Or when automatic face recognition is significantly worse at recognizing people of color. Or when AI generates stereotypical images, depicting people read as male in management positions or showing medical staff as "male doctor and female nurse" rather than "female doctor and male nurse." There are online forms that do not accept short surnames, as are common in parts of Asia, or that only provide fields for father and mother, thereby excluding same-sex parents.

Software is programmed by people

But how does such algorithmic bias arise in the first place? Primarily because digital technology is made by people: distorted data from our real world can make automated decision systems or AI-based applications unfair, namely when the data that programs draw their information from reflects only part of reality or mirrors existing injustices.
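The mechanism can be illustrated with a deliberately minimal sketch. The data below is entirely hypothetical: a tiny set of historical hiring records in which women were hired less often. A naive decision rule "trained" on this data simply reproduces the injustice baked into it.

```python
# Minimal sketch with hypothetical data: how skewed historical
# records produce a biased automated decision rule.
from collections import defaultdict

# Hypothetical hiring records: (gender, hired).
# Women were hired less often in the past -- the data reflects
# an existing injustice, not actual ability.
history = [
    ("m", True), ("m", True), ("m", True), ("m", False),
    ("f", True), ("f", False), ("f", False), ("f", False),
]

def train_majority_rule(records):
    """'Learn' the most common outcome per group from the records."""
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, not hired]
    for group, hired in records:
        counts[group][0 if hired else 1] += 1
    return {g: hired >= rejected for g, (hired, rejected) in counts.items()}

model = train_majority_rule(history)
print(model)  # {'m': True, 'f': False} -- the bias in the data is reproduced
```

Real systems are far more complex, but the principle is the same: a model can only be as fair as the data it learns from.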

And when programming and testing software, developers often take themselves as the model for future users. This is human, but it can mean the product works well for them and less well for other groups of people, for example when a male body type or white skin is taken as the unquestioned standard.

Four men in suits sit at a long conference table with an empty chair in the middle.
The IT sector is a male domain: few women, little diversity

Only 18 percent of all employees in the German IT sector are women (see the study below). More men than women still work in tech, and IT teams are usually not very diverse, whether in terms of gender, origin, skin color or religion. Remember: in development, we automatically start from ourselves and our own world of experience. As a result, the needs of marginalized groups are easily overlooked, and unconscious thought patterns, prejudices and stereotypes are reproduced in the products.

Reasons for the low proportion of women in IT

The reasons why so few women work in IT are complex and have to do with stereotypes, expectations and attributions of competence in our society. "Girls can't do math": it starts in education and at school. Girls learn that technology, math and science are not for them and are therefore less encouraged to try them out. This in turn means they are less likely to study computer science or pursue a technical apprenticeship. The CEOs of the world's biggest tech companies are overwhelmingly male and white, so young women and people of color have hardly any role models.

Negative work experience of women in IT

For many women and marginalized groups in IT, professional experience is often negative. The low proportion of women in the IT sector, together with existing gender stereotypes and unconscious prejudices, can have a negative impact on teams and organizations and lead to women experiencing sexism in the workplace, for example when men are taken more seriously than women when they speak at team meetings, or when women have to endure disparaging comments about their appearance. The result: turnover rates among female employees in tech teams are usually higher than among their male colleagues.

AI and digital innovation for more justice

However, digital technology is not bad per se. With digital applications we can work faster and in a more structured way. And a digital tool never gets tired, is never hungry, is neither in a good nor a bad mood and, in the best case, can make more rational decisions than we humans do. For this reason, software and artificial intelligence (AI) that are developed and used in a bias-sensitive manner can also be deployed specifically to increase diversity and reduce discrimination. For example, AI can reduce unconscious thought patterns, so-called unconscious biases (see Unconscious Bias), which can lead to discriminatory decisions in recruiting and HR work. Augmented writing tools can support gender-equitable and inclusive language in job advertisements and thus appeal to a more diverse range of applicants. Other AI-based applications uncover patterns of inequality and help to minimize prejudice in data and algorithms. In this way, AI and IT can not only promote equal opportunities but also contribute to making digital solutions fairer and more accessible for everyone.
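The augmented-writing idea can be sketched very simply. The word list below is a tiny illustrative placeholder; real tools use much larger, research-based lexicons of gender-coded and exclusionary terms.

```python
# Minimal sketch (hypothetical word list): flag gender-coded terms
# in a job advertisement and suggest more inclusive alternatives.

SUGGESTIONS = {
    "ninja": "expert",
    "rockstar": "skilled professional",
    "aggressive": "proactive",
    "chairman": "chairperson",
}

def review_job_ad(text: str) -> list[tuple[str, str]]:
    """Return (flagged word, suggested alternative) pairs found in the ad."""
    # Crude tokenization for the sketch; real tools use proper NLP.
    words = text.lower().replace(",", " ").replace(".", " ").split()
    return [(w, SUGGESTIONS[w]) for w in words if w in SUGGESTIONS]

ad = "We seek an aggressive coding ninja to join our team."
print(review_job_ad(ad))  # [('aggressive', 'proactive'), ('ninja', 'expert')]
```

The value of such a tool lies less in the lookup itself than in the curated lexicon and the explanations it gives writers for each suggestion.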

For a fair digital future... thinking diversity & IT together

Digitalization has the potential to be an engine for equal opportunities and participation, provided it is deliberately geared towards diversity and justice. For this to succeed, organizations must think digitization and diversity together. Decision-makers who determine which digital tools a company uses must understand technology better, be made aware of diversity-related opportunities and risks, and acquire skills and concrete courses of action for diversity-sensitive digitization.

Please contact us!

Would you like to learn more about diversity-sensitive software development or implement specific measures? Contact us and let's work together on an inclusive IT future!

Contact

Please feel free to contact the responsible contact person:

Lisa Hanstein

Senior Expert

hanstein@eaf-berlin.de

+49 (30) 3087760-46

Read more

Subscribe to the EAF Berlin newsletter

Stay up to date

By submitting the registration, you agree that the data you provide will be collected and stored electronically. For more information, please see our Privacy statement.

Unsubscribe from the newsletter