Monday 11 July 2016

The Female Robot

'1923, from English translation of 1920 play "R.U.R." ("Rossum's Universal Robots"), by Karel Capek (1890-1938), from Czech robotnik "forced worker," from robota "forced labor, compulsory service, drudgery," from robotiti "to work, drudge," from an Old Czech source akin to Old Church Slavonic rabota "servitude," from rabu "slave," from Old Slavic *orbu-, from PIE *orbh- "pass from one status to another" (...). The Slavic word thus is a cousin to German Arbeit "work" (Old High German arabeit). According to Rawson the word was popularized by Karel Capek's play, "but was coined by his brother Josef (the two often collaborated), who used it initially in a short story."'
Online Etymology Dictionary



Robots do not necessarily need to have a gender. Nevertheless, they are usually gendered. In fact, when there are no gender cues at all, people tend to assume a robot is male (via). NASA, for instance, designed Robonaut, a gender-neutral robotic astronaut assistant. Despite the lack of gender cues, people still assigned the robot a gender: 99% perceived it as male. And a robot's perceived gender can change how a person interacts with it/her/him (via). Disembodied voices are mostly female, while "something fully humanoid", i.e. a more sophisticated robot, is usually male. "And when humanoid robots are female, they tend to be modeled after attractive, subservient young women" (via). If - hypothetically speaking - there is a link between how "fembots" and real women are treated, then "the technology we're creating says an uncomfortable amount about the way society understands both women and work" (via).



“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets.” Microsoft
In March 2016, Microsoft launched "Tay", an artificial intelligence chatterbot, on Twitter. Within 16 hours, Tay caused controversy by posting highly racist and sexually charged messages in response to other Twitter users. As The Guardian put it, Tay got "a crash course in racism from Twitter". In most cases it was merely repeating other users' inflammatory statements; at the same time, it was learning from these interactions (via). Twitter seems to be the perfect "learning" environment: 88% of abusive social media mentions occur on Twitter (via), and 10,000 racist tweets (although not all slurs are said to have been used in a derogatory way) are sent every day (via). Add sexist tweets and the number gets even higher.
"As many of you know by now, on Wednesday we launched a chatbot called Tay. We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay. Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values." Microsoft

"Siri behaves much like a retrograde male fantasy of the ever-compliant secretary: discreet, understanding, willing to roll with any demand a man might come up with, teasingly accepting of dirty jokes." Amanda Marcotte
Xiaoice, Cortana, Alexa, Siri, Google Now, ... There are many more examples, raising the question of whether tech firms - companies with overwhelmingly male workforces (Microsoft: 83%, Google: 82%) - are obsessed with female digital servants (via).
"Right now, as we’re anticipating the creation of AIs to serve our intimate needs, organise our diaries and care for us, and to do it all for free and without complaint, it’s easy to see how many designers might be more comfortable with those entities having the voices and faces of women. If they were designed male, users might be tempted to treat them as equals, to acknowledge them as human in some way, perhaps even offer them an entry-level salary and a cheeky drink after work." Laurie Penny

"In the not-too-distant future, robots will be social beings upon which we can heap all kinds of preexisting social constructs. Already, robots are helping with tasks like caring for the elderly and teaching—both fields traditionally associated with women. Research on human-robot interactions is revealing that gender plays a big role in how people perceive, communicate with, and treat robots, much like it does with humans. And a lot of what we’re bringing over to our technological companions of the future is old, tired stereotypes." Laura Dattaro
"The creators of robots, then, have both a fantastic opportunity and a very real responsibility to consider what gender means as they design the machines that are becoming increasingly present in our hospitals, our schools, our homes, and our public spaces at large. Some researchers suggest gender stereotypes could be beneficial for robot interfacing, by, for example, capitalizing on our tendency to be more comfortable with women as caretakers. More feminine home health care robots could put patients at ease. But that might be a dangerous path, one that’s antithetic to the decades of ongoing work to bring women into fields like business, politics, and particularly science and technology. If robots with a feminine appearance are built only when someone wants a sexbot or an in-home maid—leaving masculine robots with all the heavy lifting—what does that say to the flesh-and-blood humans who work with them?" Laura Dattaro
Interesting reads:
::: Artificial Intelligence's White Guy Problem, The New York Times: LINK
::: Ex Machina and sci-fi's obsession with sexy female robots, The Guardian: LINK
"Consider the climactic scene in Ex Machina, where the megalomaniac cartoon genius Nathan, who roars around the set like Dark Mark Zuckerberg in Bluebeard’s castle, is shown hoarding the naked bodies of previous fembot models in bedroom. For Nathan, the sentience of his sex-slaves is beside the point: meat or metal, women will never be fully human. For the fembots, the men who own them – whether it’s mad billionaire Nathan or sweet, hapless desk-jockey Caleb – are obstacles to be overcome, with violence if necessary." Laurie Penny


photographs (Dr. Who's Daleks) via
