AI Parliamentarian Portraits, 2024

I recently went to a lecture by Professor Anthony Downey, who spoke about artificial intelligence (AI) and the artists engaging with this curious technology. He discussed ways in which AI is perceived as a separate and mysterious entity, somehow independent of humans. He debunked this perception of AI as a kind of black box; rather, he spoke about how AI image databases like ImageNet were made by humans whose biases went unacknowledged and unconsidered.

In this way, I wonder if AI can be considered a vehicle of collaboration: one in which the authors are hard to attribute or interrogate, but whose definitions of the world and the people within it are built into the technology itself. If thought about in this way, any image amended by AI is amended by the person who submits the image and, by consequence, by a host of unidentified people who have fed the system at various points and in various ways.

The positionality that we increasingly demand from content creators internationally is completely absent from AI and other anonymous online platforms. Like trolls on the internet, AI does not need to take responsibility for the racism, sexism or categorisations it has been fed and perpetuates. There are companies that control it and users who wield it in ways that intentionally or unintentionally reinforce these layers of bias. Less overtly, this is how these layers of bias continue to permeate visual culture, so that an image of an Australian politician can go onto a mainstream news program with larger breasts and more revealing clothing, and the channel can say, ‘the AI did it, there was no human intervention here!’ Apart from the worrying lack of responsibility taken for this offence, it perpetuates the idea that women are always available subjects for manipulation. Why might the AI have sexualised an image of a woman politician? Is it alright if it did? And if AI is to blame, is that the end of the conversation?

When asked about the incident, Georgie Purcell, the politician whose image had been altered, said: ‘Let’s be clear – this is not something that happens to my male colleagues.’ And indeed, it usually is not.

In response, I decided to take images of men in the Australian cabinet and put them through a series of AI filters. I gave them breasts, long hair, dresses and nice hats. In essence, I turned them into ‘women’. There was no shortage of women-oriented filter categories to choose from, greatly outnumbering the male options. These included roles such as ‘Queen’, ‘Army Girl’, ‘Girl doll’ and ‘Bride’, as well as traditionally gendered names such as ‘Betty’, ‘Alaine’, ‘Sophie’ and ‘Maxine’. There was also a range of ‘costume’ options that, regardless of the gender of the image submitted, resulted in a woman-oriented output, complete with breasts, dresses, eyelashes and so on. These included ‘Fairy’, ‘Teatime’, ‘Red Bow’, ‘Ascot’, ‘Pig tails’ and ‘50s’.

For the most part, the images of male politicians came out of the AI conversion looking more aesthetically pleasing than when they went in. Arguably, the transition was favourable for most, making them look ‘prettier’. The results were a compilation of stereotypes, sexualised and beautified simply because they were now ‘women’.

I wondered whether the politicians would feel the same. I posted the images to social media, tagging each politician in turn. My goal was not to shock my own left-leaning and, I suspected, sympathetic audience. Rather, my aim was to sexualise each politician in the eyes of themselves and each other. I have no way of knowing whether this actually took place, but my experience of male banter makes me wonder whether there might be at least one person in Parliament House who took a screenshot and WhatsApped a colleague with the caption ‘I’d shag her’ or ‘like your dress, mate, but I think blue would go better with your eyes’.

I have aimed AI at the people who make decisions about how far gendered online abuse should be allowed to go unregulated. I have sexualised them in a way in which they are arguably rarely sexualised. The images are polite, even humorous, but they might not have been. Someone who isn’t trying to prod politely, someone who is perhaps trying to be aggressive, might make more violently sexualised images next time. Or they might make images of politicians in compromising situations, as has been done in the past to every single woman who has entered the political realm since images made their debut on the internet.

Have I perpetuated abuse? Have I weaponised the inherent sexism I criticise? Am I to blame? Or was it me, working collaboratively with thousands of other anonymous people whose gendered bias has been allowed to creep maliciously into the way AI is developed and, by consequence, into the way it defines what it encounters? There are a lot of people squeezed into that little black box, and as a result it is taking on a decidedly human form.