This photo series questions automated gender categorisation and the illusion of machine neutrality by exposing the limitations of machine learning. Unlike humans, machines do not think; they recognise patterns but lack genuine comprehension. Meaning is shaped by the humans whose biases influence both a system's output and how that output is interpreted. It is humans who create the categories used to interpret the androgynous-appearing person portrayed, or to answer the ambiguous question posed in the title of the work.
i dropped the metal ball on the glass table and it broke. what is it?
