
Users: A Challenge for Digital Technologies

Users of technology can be 'erased' in the process of technology development; at the same time, users' lack of competence is cited as the reason they cannot use technology effectively. Drawing on recent empirical findings from research, workshops and current examples, this post discusses difference, diversity and technology.


The history of technology is full of instances where the biases and intentions of designers and developers of technology are revealed.

For instance, the 'Shirley Card' was the standard used across North American analogue photo laboratories for 'skin colour balance': an image of a Caucasian woman wearing a colourful, high-contrast dress was used to calibrate the skin tones of the photograph being printed. Because this was the norm for photographic printing, darker-skinned people appeared either 'blown out' or 'washed out' in photographs. Simply put, analogue photo printing technology was not developed with darker-skinned people in mind.

A version of this was discovered by Alexis Madrigal in his review of the Apple Watch. Madrigal found that the heart rate monitor, one of the most exciting new features on the watch, did not work as effectively on darker-skinned people. The monitor shines a beam of light onto the wearer's wrist and evaluates how that light is scattered and absorbed at the surface of the skin; because darker skin absorbs more of that light, the reflected signal is weaker and the readings become unreliable.
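To see why a weaker reflected signal matters, here is a minimal illustrative sketch in Python, assuming a highly simplified optical pulse sensor. This is not Apple's actual algorithm; the function names, numbers and noise model are invented for illustration. The idea is that a heartbeat shows up as a small periodic change in the light reflected back to the sensor, and when the skin absorbs most of the emitted light, that change shrinks relative to sensor noise and a simple detector starts producing unreliable readings.

```python
import numpy as np

def simulated_ppg(reflectance, bpm=72, seconds=30, fs=50, noise_sd=0.02, seed=0):
    """Toy optical pulse trace: a periodic heartbeat whose amplitude is scaled
    by how much of the emitted light the skin reflects back to the sensor.
    (Illustrative assumption only; not a real sensor model.)"""
    rng = np.random.default_rng(seed)
    t = np.arange(seconds * fs) / fs
    pulse = 0.5 * (1 + np.sin(2 * np.pi * (bpm / 60) * t))  # idealised heartbeat
    return reflectance * pulse + rng.normal(0.0, noise_sd, t.size)

def estimate_bpm(signal, fs=50, seconds=30):
    """Crude detector: smooth lightly, then count upward crossings of the midline."""
    smoothed = np.convolve(signal, np.ones(5) / 5, mode="same")
    midline = (smoothed.max() + smoothed.min()) / 2
    above = smoothed > midline
    crossings = int(np.sum(~above[:-1] & above[1:]))
    return crossings * 60 / seconds

# More light reflected back to the sensor means a cleaner signal and a plausible estimate;
# when absorption leaves only a weak reflection, noise dominates and the estimate degrades.
for reflectance in (1.0, 0.05):
    trace = simulated_ppg(reflectance)
    print(f"reflectance={reflectance:.2f}  estimated bpm ~ {estimate_bpm(trace):.0f}")
```

In this toy setup the true rate is 72 beats per minute; the strong-reflection trace recovers something close to it, while the weak-reflection trace does not, which is the gap Madrigal's review points to.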

Madrigal goes on to note that this technology was tested on Apple's senior executives, and displays photographs of them from the corporate website. They are all white.

Nana Darkoa Sekyiamah, Communications Manager at AWID, spoke to a similar challenge.

“To build excitement around the 2016 AWID Forum we decided to launch a Twibbon campaign. Activists who attend the Forum come from various backgrounds and movements, and we always ensure that all AWID communications speak to the diversity of our movements. Imagine our disappointment when we realized that the default image associated with Twibbon campaigns is hard coded into their system, and that we couldn’t have an image of our choice linked to our campaign. And yes, if you guessed that the image appeared to be that of a smiling, blond, white woman, you’re right.”

(Response from an HRD to a research workshop exercise asking participants to draw how they perceive the internet. Image courtesy of Tactical Tech.)

There are many examples of this sort of erasure of users' diversity in architecture, design and technology. These design choices are political. They say more about the developers and the contexts of design and development, a community that tends to remain mostly invisible, than about the users themselves.

Designing for security

In 2016 Tactical Tech published research on the sustainability of training and the adoption of new digital tools and practices within the field of digital security for human rights defenders (HRDs). 'Digital Security in Context: Learning about how human rights defenders adopt digital security practices' looks at the direct experiences of HRDs both in the training room and afterwards.

Adopting new digital security practices is difficult, so much so that even Hillary Clinton may have jeopardised US national security by refusing to give up her BlackBerry and by sharing sensitive emails over her private email account.

In the case of niche, difficult-to-use digital security technologies, there is an underlying assumption that a user will have all the knowledge they need to use the tool. Reviewing recent research on the usability of digital security tools, Becky Kazansky writes that:

“Despite a greater emphasis on human factors and the growth of human computer interaction scholarship within computer science, some literature on human factors continues to frame security problems as a function of users' 'human error' rather than designer bias or rigid systems. Non-expert lay knowledge is presented as the main barrier to good security”.

HRDs engaged in this study needed to understand digital security and privacy through a predominantly English-language lexicon. This added to the overall challenge of creating constructive and appropriate digital security strategies, and became an issue both in tool use and in spaces for learning and discussion, such as trainings.

One participant, a trainer who contributes to the global network translating Security in a Box into its sixteen languages, felt that leaving tools untranslated would contribute to the demise of their language: “I want to keep my language alive. It’s important to me. If you cannot translate it then you have no more language”.

However, many participants and trainers interviewed explained that translating tool-related resources and elements of tool interfaces does not guarantee the cultural legibility of tools and concepts. They explained that translation efforts often fail to capture the correct, contextually appropriate or desirable words in their respective languages: “the challenge is not just translating, because there are certain words where there is no one-to-one meaning”. Participants said that prioritising local, contextual meanings would mean exploring culturally relevant metaphors to describe human relationships to networked technologies.

In one group, researchers learned that there was no appropriate analogy for the word ‘protection’, as the term for protection in the local language had a negative connotation. Fittingly, the term ‘encryption’ translated to mean ‘hard to understand’. The local word for ‘surveillance’ was largely not recognised in two groups. Participants decided the appropriate word for surveillance would be ‘monitoring’, but this was not a commonly used word, and when it was used, monitoring related ‘to people but not technology’.

Designing for difference?

Sometimes, however, making a fetish of 'difference' is simply a way to leverage marketing; it can also be dangerous or invasive, or can expose people. Think of pink phones: with absolutely no difference in the underlying technology, or in the skills required to use the device, why is it labelled a 'women's phone'?

In recent research by Tactical Tech, 'Privacy, Visibility, Anonymity: Dilemmas in Tech Use by Marginalised Communities', we found that a crowdmap created with the intention of making violence against LGBTQ people in Kenya visible ended up being unpopular, because users did not believe that reporting violence would result in any support for them.

Moreover, they did not trust an anonymous crowdmap with their personal information. At the same time, let's not assume that users have no agency.

The most famous historical example of users 're-shaping' technology is the missed call: to avoid paying for a call, users made missed calls to signal each other. In the Kenya study we found that queer Kenyans were actively subverting Facebook's (now relaxed) 'real name' policy by maintaining two or more accounts, so as to avoid detection by homophobic families and community members.

The user is a difficult proposition for designers and developers of technology: there are so many of us! There are certainly complexities in balancing scale and mainstreaming with the diversity of users; however, without a ground-up recognition of the dynamics of difference in various contexts, technologies can easily miss their mark.

 


About the author

Maya Ganesh is the Director of Applied Research at Tactical Technology Collective.

Category: Analysis
Region: Global
Source: AWID Forum