Immoral, Obscene ALEXA

I was really struck by the hypocrisy and bigotry…and sexism in the handling of the Osé, a product that debuted at CES, the consumer technology show in Las Vegas, last week. The product, a sex toy for women, was stripped of an award at CES. Officials damned it as “immoral, obscene, indecent, profane or not in keeping with CTA’s image.” The simple reason: it’s made for women and their pleasure.

I find this blatant anti-women action even more absurd given that women – their voices, their bodies, their sensuality, their “femaleness” – or at least the traditional tropes for women – are used to sell AI ubiquitously: Siri, Alexa, Hanson Robotics’ robot Sophia… The high tech industry today uses femaleness but is certainly not feminist in any way.

So, here’s an update of a guest post by Roberta Duarte that appeared earlier on Public Linguist.


——————–

In the past few years we’ve been experiencing a huge surge in conversational artificial intelligence assistants. Whether it’s Apple’s Siri, Amazon’s Alexa, or Microsoft’s Cortana, AI machines’ ability to interact with their users is becoming an ordinary and sometimes integral part of everyday life.

Through voice queries and natural language capabilities, they answer questions, make recommendations, and perform actions for us. These computerized personalities can seem almost human-like. The more human we make them, the more important it seems that we give them names, personalities and – more worryingly – gender.

But robotic assistants don’t have a gender, really. Strip them of the names and voices added by their human creators, and there’s nothing that requires them to be ‘he’ or ‘she’ other than our own assumptions. Yet many indeed do have names, and a disproportionate number of them are sold to the public as ‘female’.

Sure, you can change Siri to a different gender and even a different accent. But at present, AI assistants seem to default to a female persona. To a certain extent, we humans are led by our assumptions and biased truths. As we move into a new age of automation, the technology being created says an uncomfortable amount about the way society understands both women and work.

Assigning gender to these AI personalities may say something about the roles we expect them to play. Virtual assistants like Siri, Cortana and Alexa perform functions historically assigned to women.

Society has long relegated women to administrative, accommodating, and aid-lending positions. Assistant and secretary jobs are especially stratified as female. With roots in the early twentieth century, secretarial employment quickly became women’s work as companies realized they could pay women lower wages.

In fact, much of the work that robots are expected one day to carry out is currently undertaken by women and girls, for low pay or no pay at all. A report by the UK Office for National Statistics (ONS) quantifies the annual value of the “home production economy” – the housework, childcare and organizational chores done largely by women – at £1 trillion, almost 60% of the “official” economy. From nurses, secretaries, and sex workers to wives and girlfriends, the emotional labor that keeps society running is still feminized – and still stigmatized.

It is no mistake that the face of AI is female. Ever ready and accommodating, these technologies take on a distilled and idealized femininity. Your AI is always working, always available, always ready at a moment’s notice to provide assistance with a positive attitude. The gendering of AI is purposely linked to our culture’s underlying sexism. Customers interpret these AI personalities through the lens of their own biases. Stereotypes about women in service roles make female AIs easier to accept, which is the ultimate goal for tech companies that want to make AI mainstream.

The fast-approaching world of competent, faithful automated assistants is sadly all too susceptible to our long-standing, faulty presumptions about the female role in society. Right now, as we expect advancing AI technology to serve our personal and organizational everyday needs, we need to be conscious of our persistent biases. It is imperative to question why we feel the need to gender this innovative tool.

Consider the artificially intelligent voices you hear on a regular basis. Your personal assistant devices should be helpful, compliant, and do as you say.
But should they also be female? 

Are technology companies catering to our desire for robotic assistants with personality, or are they reinforcing our biases about gender, and the roles that women play?

The price of empathetic research

The central way social researchers like me study a topic or problem is through qualitative research: engaging people, most often strangers, in in-depth conversations to try to understand how they see the world, what they value, what they know and want to know… and, most often for me, hearing and recording the language they use to encode all of this. The topics I study are health, disease and risk.

For over 40 years I’ve been giving myself up to my participants’ realities and emotions, struggles and triumphs. And all this time I’ve always known it was exhilarating but exhausting work. Even as a young researcher I’d come out of a one-and-a-half-hour focus group filled with phrases people used, statements that stuck with me, but also the need to rest, physically rest. I was never the one to suggest to my colleagues, “let’s go get a bite.” I had to get somewhere quiet and close my eyes and rest.

I said to myself it was just my weaker constitution – maybe I just wasn’t the hardy, robust field researcher I aspired to be.

As I grew into myself and my confidence, I didn’t hide this need to retreat after a research encounter.  My students knew that if they were note takers or observers, after a long in-depth interview or group they couldn’t expect me to expound brilliantly on what had just transpired.  They accepted my fugue state as we packed up our recorders and equipment and headed out, knowing that we’d gather the next morning to go at it.

And as the self-acceptance of age and experience became mine, I began to spend more time thinking and teaching my students about this phenomenon, this toll that qualitative research could take – should take. I taught it as a litmus test of whether you were really getting to something in your research or not.

And then, last night, I was reading a beautiful book that one of my former students – now an accomplished environmental scientist and writer – Lauren Oakes, just wrote: In Search of the Canary Tree (Basic Books, Hachette, 2018). The book is her journey studying the slowly disappearing yellow cedars of Alaska – her Stanford PhD dissertation work.

The first half of the book is a truly amazing story of a young woman setting out into some of the wildest of wilderness to map trees – some thriving, many dying. The second half of the book is the story of her qualitative research – the many interviews with conservationists, fishers, and Native Alaskans with long ties to the land and sea. And in the middle of collecting these stories, her beloved father suddenly dies. The loss, as she recounts, is “irrevocable.”

With only a short break, she continues to interview people to hear how they see the death of the cedars – this change, this loss.

Last night as I was reading I came upon a short passage.  Oakes is writing about what she felt living her research life and her personal life at such a sad and consequential time. 

“I was tired from putting my own loss on hold, from opening myself up again and again to hear and try to understand what so many people were experiencing in relation to the standing dead.” 

I stopped. How could this one sentence do this – capture what I’d been feeling for all these years? To be a good, to be a great, qualitative researcher, you work to disguise your own thoughts, opinions, needs and distress. You have to get out of the way and let the other person think and speak. On good days you have exquisite timing in what you ask and say. On off days your questions fall flat.

You hear difficult things all the time:
     “Vaccinations are a conspiracy to murder minorities”
     “His pre-school teacher says he only needs more words”
     “Diabetes is my family’s history. No changing that.”
     “I’m not planning to leave even if the volcano erupts”
     “No more soap and water for me – just sanitizer.”
     “Ebola is going to kill most of us in Manhattan”


A trained therapist who works with a patient or client over time works toward a goal: making something better for the patient. The qualitative researcher has to identify and accept what is. Often we’re in and out quickly. We’re careful to do no harm, but we intrude. We choreograph the conversation, the dance – always having it appear as though the participant is leading. As tactful and polite as we are, we nudge people to think and talk about things that stir their thoughts and emotions – the inequality, health risks, illness, endurance, fears and triumphs. We pursue these to get our questions answered, our data gathered.
And so, in big and little ways, we absorb the harm. It’s the tacit tradeoff we make when we do what Behar calls “ethnography that breaks your heart” (The Vulnerable Observer: Anthropology That Breaks Your Heart, Beacon, 1996). The momentary depletion Oakes put into words reminds me once again that it is the necessary path we walk. And perhaps, as I’m aging, a longer nap is thoroughly justified.
