
Visualizing gender associations in language perception through text-to-image algorithms

This time I wanted to play around with text-to-image generation to reveal gender associations. When starting the creative ideation process, I usually predefine a few design qualities based on the prior research phase. My target groups are usually not defined by gender, at least not on purpose, but I had the feeling that some of my design qualities are not as gender-neutral as I like to think. With all the recent discussions about bias in language, I therefore started translating words into images.



 

Step 1 - Preparing the words

I started by collecting words in which I sensed some kind of underlying gender association. I relied on descriptions of the target group and design qualities that some of my participants had used in generative sessions.

Participant 1 was asked to describe the target group and design qualities for the next generation of 'LEGO Mindstorms robotics'. The second participant was asked to do the same for a 'summer dress design'.

 

Step 2 - Translate text-to-image

This time I used an open-source text-to-image generator called Hypnogram for my experiment. I ran some tests with the different styles it offers before settling on one that worked best for my purpose.


At first the images were so surreal, vague and artsy that I found it difficult to draw any conclusions for my experiment.
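Hypnogram itself is a web tool, so I worked through its interface rather than code. For anyone who wants to reproduce the word-to-image step locally, a rough sketch using the open-source Hugging Face diffusers library with Stable Diffusion could look like the one below; note that this is a different model than Hypnogram, and the checkpoint name, style suffix, and word list are illustrative assumptions rather than what I actually used.

```python
# Minimal sketch of the word-to-image step, using Hugging Face diffusers
# with Stable Diffusion instead of the Hypnogram web tool used in this post.
# Checkpoint, style suffix, and word list below are illustrative assumptions.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# Example words standing in for the participants' actual descriptions.
words = ["playful", "robust", "elegant", "lightweight"]

for word in words:
    # A fixed style suffix stands in for Hypnogram's style presets.
    prompt = f"{word}, abstract concept art"
    image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
    image.save(f"{word}.png")
```

Keeping one fixed style suffix across all words makes the resulting images more comparable, similar to picking a single Hypnogram style for every prompt.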

 

Step 3 - Analyze

After the first overwhelming attempts, I started to focus more on the way I analyzed the images. At first I was a bit disappointed to find that most of the words I strongly associated with, for instance, 'female' were depicted as more 'male' in the AI images. These images are obviously very ambiguous, and you might see something completely different in them. However, this clash with my expectations in turn made me question my word associations and gender biases in language.

 

Learnings

As with the previous experiment, I noticed how the process of collecting words and analyzing the images itself had an impact on my awareness of gender bias. Unlike the unconscious associations in objects and colors, this time I found a surprising number of words that relate strongly to gender.

With this awareness, I will definitely have to think twice in my next design process before using supposedly 'neutral' words to describe my design vision. Setting the right foundation for a gender-neutral design seems to be key.


 

Next Steps

I was surprised by the ambiguity of the images. Next, I am curious to explore less stylized ways of transforming words into images. I am furthermore thinking of running the same experiment with names instead of random nouns and adjectives, to see how much this changes the AI's perception of gender.
