Libraries, technology, and social justice

Here’s the text of the talk I gave at Access 2016. I reused some stuff from earlier talks, but there’s some new stuff in here too. There is a video of the talk too.

(argh. I spelled Bethany Nowviskie’s name wrong on the slide in my talk. I hope she doesn’t notice.)


Thank you for inviting me to this beautiful location and to this fantastic gathering. I want to give a special shout-out to James Mackenzie and the program committee for inviting me and for taking care of all the logistics of getting me here and especially for answering all my questions.

When I am asked to speak at conferences, I try to remember to ask a set of questions that include:

Do you have a code of conduct?

Do you have scholarships for people who might not otherwise be able to attend?

Are you making efforts to ensure diversity in attendance and a diverse line-up of speakers, panelists, presenters?

Access was a YES on all 3.


Fredericton is a very beautiful place

And Fredericton earned bonus points on the secret private criteria I use, which is “is it in an interesting and beautiful location?”

So it was really a no-brainer and I am thrilled to be here; and to have a chance to talk to and with all of you.

I want to start by saying that I’m so glad that Dr. Maclean acknowledged that the land on which we gather is traditional unceded territory.

The importance of acknowledging that we work on lands that are the traditional territories of First Nations people is something I am learning from my Canadian colleagues and from my Native American colleagues. It is, I think, a much needed way of showing recognition of and respect for aboriginal peoples.

I will say though, that it is a practice that is not as widespread in the US – yet.

But there is some movement in the US among colleges and universities to wrestle with their racist pasts; to acknowledge the role of slavery, and the mistreatment of Native Americans, in their founding and early success.

Dozens of American universities – including Harvard, Brown, Columbia, Georgetown, and UVa — are in the process of publicly acknowledging their ties to slavery, including their dependence on slave labor; and these schools are beginning the work of trying to find paths to restitution, if not full reparations.

And this is not unrelated to the topic of my talk.

If I recall correctly, the abstract of this talk proclaims that libraries aren’t neutral, that technology isn’t neutral; and that we can and should leverage both in the service of social justice.

I figured I should spend at least a bit of time unpacking the claims that neither libraries nor technologies are neutral.

And one place to start for libraries – for academic libraries – is to acknowledge that our parent institutions are not and never have been neutral.

My point of reference is US colleges and universities, but I suspect the general theme is true in a Canadian context as well.

American colleges were originally built as exclusive institutions for well-connected white men; and in many cases American universities were built literally on the backs of enslaved African-American labor. Many of our institutions were built on land taken from Native peoples; and almost all of our colleges and universities excluded, in practice if not also in policy, women, non-white men, queer people, and other marginalized populations.

We start from these histories of exploitation, appropriation, enslavement, and displacement. And I believe we have a responsibility to acknowledge that we give our labor to institutions with often troubled histories with regard to the treatment and acceptance of women and non-white men. And even acknowledging that is a political act – but/and ignoring that past is also a political act. There is no neutral here.

In the US context I think it is important to give credit to the students – predominantly students of color — who came together on campuses across the country last fall, and continue to come together, to demand that universities own up to the systemic racism that exists in higher education and across the US, and to insist that we take steps to reduce discrimination and promote social justice.

Many of you likely heard about the high visibility student protests at the University of Missouri, and at Yale University; but in reality students and community members at hundreds of colleges across the nation took up the call and protested and demanded action from their own schools.

At MIT and at colleges all across the country, students have called our attention to ubiquitous and blatant incidents of racial and sexual harassment, they have demanded that we hire more faculty from underrepresented groups, and they have called for faculty and staff to be educated on unconscious bias.

In short, they have said – it isn’t enough to welcome students from marginalized groups to our campuses with words and policies; we must take concrete action to create welcoming, inclusive, and integrated communities. And in some cases, they have called on us to leverage the academy and its resources to address society-level failings.

So what does that mean for us?

Well, that’s exactly what I want to talk about today – as folks who work in and around library technologies, how can we leverage our work in the service of social justice?

First, what is social justice and what does it look like?

I’m going to cheat a bit on the question of what social justice looks like and cite a couple of things I’ve written or co-written in the past:

In an article titled Diversity, Social Justice & the Future of Libraries that I had the honor of writing with Myrna Morales and Em Claire Knowles, we defined social justice as:

“The ability of all people to fully benefit from economic and social progress and to participate equally in democratic societies.”

If you believe, as I do, that equitable access to information, and to the tools to discover, use, and understand that information, is a core enabling feature of a truly democratic society, then it is easy to see that libraries are crucial to social justice.

What would a social justice agenda look like in a library?

I was asked several months ago in a joint keynote I gave with my colleague Lareese Hall, now dean of libraries at the very prestigious Rhode Island School of Design, what a queer feminist agenda for libraries would look like, and I think that answer stands for a general social justice agenda too:

“A … feminist and queer agenda in an academic library would be one where the collections and services are not centered on the experiences of cis-straight, white western men; where the people who work in the library truly do reflect the diversity of the communities they serve; where the staff and patrons are empowered; and where the tools, systems, and policies are transparent and inclusive.”

For this crowd, at this conference, I want to talk about tools and technologies.

First, let me run through a few examples to illustrate what I mean when I say technology is not neutral; and really to convince any skeptics that technology itself – not just the users of it – is often biased.

Let’s start with search technologies. Most librarians will agree that commercial search engines are not “neutral” in the sense that commercial interests and promoted content can and do impact relevance.

But of course, declaring one thing as more relevant than another is always based on some subjective judgment – even if that judgment is coded into an algorithm many steps away from the output.

Or, as my colleague Bess Sadler says, the idea of neutral relevance is an oxymoron (this is a line from our Feminism and the Future of Library Discovery article).

And of course, you can’t talk about bias in search tools without talking about the fantastic work of another one of my library sheroes: Safiya Noble.

Safiya Noble’s work demonstrates how the non-neutrality of commercial search engines reinforces and perpetuates stereotypes, despite the fact that many of us assume the “algorithm” is neutral.

What Noble’s analysis of Google shows us is that Google’s algorithm reinforces the sexualization of women, especially black and Latina women. Because of Google’s “neutral” reliance on popularity, page rank, and promoted content, the results for searches for information on black girls or Latina girls are dominated by links to pornography and other sexualized content.

Noble suggests that users “Try Google searches on every variation you can think of for women’s and girls’ identities and you will see many of the ways in which commercial interests have subverted a diverse (or realistic) range of representations.”

So rather than show you those results, I encourage those of you who might be skeptical to do some of those searches yourself – google Asian girls, or Latina girls, or black girls, or Native girls. And then imagine being a girl or woman of color looking for yourself and your community on the web.

Or, just imagine you’re a tech worker.

We know that the stereotype of a “tech worker” is young, male, nerdy … and a Google image search verifies and reinforces that.


Google image search for “tech worker” is pretty much all dudes

And labels matter – look at the different images you get when you search for “Librarian” vs. “Information scientist”


We all like to think that library search tools can do better – and they can; but only when we are intentional about it.

Another example of technology that isn’t neutral comes from cameras and photo editing software.

Photographer Syreeta McFadden has written about how color film and other photographic technologies were developed around trying to measure the image against white skin.

The default settings for everything from film stock to lighting to shutter speed were and are designed to best capture “normal” faces – that is, faces with white skin. What that means is that it is difficult to take photos of non-white faces that will be accurately rendered without performing post-image adjustments that sacrifice the sharpness and glossy polish that is readily apparent in photos of white faces.

And how many of you heard about the Twitter bot that Microsoft created that became a crazy sexist racist idiot in less than 24 hours?

Last spring, Microsoft unveiled a Twitter bot named Tay, programmed to tweet like a teen. What could go wrong, right?

Tay was backed by artificial intelligence algorithms that were supposed to help the bot learn how to converse naturally on Twitter. But what happened is that the bot learned quickly from the worst racist, sexist corners of Twitter – and within 24 hours Microsoft had to shut the experiment down because the bot had started tweeting all kinds of sexist, racist, homophobic, anti-Semitic garbage. Again, use your Google skills to find the tweets; I’m not sharing them from the podium.

For me the Microsoft experiment with a machine-learning Twitter bot is a stark example of the fact that passive, mythical neutrality is anything but neutral. And sure, you can blame it on the racist creeps on Twitter, but creating technology that fails to anticipate the racist and sexist ways that technology might be used and exploited is not a neutral act. And I would venture to guess that it was a choice made by people who are least likely to have been the targets of discriminatory crap on the internet.

My bigger point here is that while crowd-sourcing and leveraging the social web are hot trends now in tech, I want to encourage us to think hard and critically about the consequences. Basically, I think we need to be very aware of the fact that if we crowd-source something, or if we rely on the social web or the sharing economy; we have to at least try to correct for the fact that the crowd is racist and sexist, and homophobic, and discriminatory in a whole bunch of horrifying ways.

There are all these great new services that are part of what we call the sharing economy – services that eliminate the “middle-man” and let people sell services directly to other people, to share things like rides and rooms with strangers. So there are ride-sharing apps like Uber and Lyft, and services like Airbnb, where you can avoid hotels and hotel prices and stay in someone’s spare bedroom.

Stories abound in the US of Uber & Lyft drivers refusing to pick up passengers in minority neighborhoods, or canceling rides when they learn that a passenger is disabled and requires accommodations or assistance.

But I find the case of Airbnb especially interesting, because they are trying to fix their racism problem with both policy and technology.

So here’s what happened with Airbnb – first there was an experimental study out of Harvard about a year ago showing that hosts were less likely to rent to people with black-sounding names; then there were several reports of hosts cancelling bookings for black guests, only to then rent to white guests for the same time period.

Honestly, this shouldn’t surprise us – the amount of social science evidence confirming that people act in biased ways in a huge variety of settings is overwhelming. What is interesting is that Airbnb is trying to do something about it, and they are being unusually transparent about their efforts; so we might learn what works and what doesn’t.

First, they are having everyone who participates as a renter or a host sign a community agreement to treat everyone with respect and without bias. And there is some evidence that community compacts introduce mutual accountability and have positive effects, so that’s a good start. They are also providing training on unconscious bias to hosts and highlighting the hosts who complete the training on their website – which is a decidedly not neutral way of driving more renters to hosts who have completed the training.

What’s really interesting is that they are also working on technical features to try to eliminate instances where hosts claim a room or house is booked when a black renter makes a request, only to then immediately rent for the same time period to a white renter. Here is how they explain it: with the new feature, if a host rejects a guest by stating that their space is not available, Airbnb will automatically block the calendar for subsequent reservation requests for that same trip.
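To make the mechanism concrete, here is a minimal sketch in Python of how that kind of automatic calendar blocking might work. Everything here – the class, method names, and logic – is my own illustrative assumption, not Airbnb’s actual implementation:

```python
from datetime import date, timedelta

class Listing:
    """Toy model of a single listing's calendar, purely for illustration."""

    def __init__(self):
        self.blocked = set()  # dates the host has claimed are unavailable

    def decline_as_unavailable(self, check_in: date, check_out: date) -> None:
        """Host declines a request by saying the space isn't free: block those dates."""
        for i in range((check_out - check_in).days):
            self.blocked.add(check_in + timedelta(days=i))

    def can_book(self, check_in: date, check_out: date) -> bool:
        """A later request overlapping any blocked date is refused automatically."""
        return all(
            check_in + timedelta(days=i) not in self.blocked
            for i in range((check_out - check_in).days)
        )

listing = Listing()
listing.decline_as_unavailable(date(2016, 11, 4), date(2016, 11, 6))
# A second guest asking for the same dates can no longer be accepted either.
print(listing.can_book(date(2016, 11, 4), date(2016, 11, 6)))  # False
```

The design point is that the platform, not the individual host, enforces consistency: saying “not available” to one guest makes those dates unavailable to every subsequent guest.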

They are also adding new flagging tools so people can report discrimination and hate speech.

And they have a team of engineers, data scientists, and designers who are looking for other ways to mitigate discrimination and bake some anti-bias features into their platform.

Would it have been better if they had anticipated the racist behavior enabled by their platform? Sure. But now that they are trying to make corrections, and to use technology to do it, I think there might be a real opportunity for us all to learn how we might leverage technology in combatting discrimination.

So, I’ve given some examples of how technology itself is not neutral. My point with these examples is to convince you that technology does not exist as neutral artifacts and tools that might sometimes get used in oppressive and exclusionary ways. Rather, technology itself has baked-in biases that perpetuate existing inequalities and exclusions, and that reinforce stereotypes.

How do we not just try to mitigate the bias, but actually bring a social justice mindset to our work in library technology?

How do we promote an inclusive perspective, and an agenda of equity in and through our tech work?

First, we do everything we can to make sure the teams we have working on our tools and technologies and projects are actually inclusive and diverse.

And that is admittedly hard; but we do know some things that work. And by know, I mean there are actual scholarly studies that produce evidence of practices that, for example, discourage women from pursuing tech careers or applying for jobs. If I told you of a couple of simple things you could do that have been shown to remove some social barriers to women pursuing tech careers, would you be willing to do them?

(I stopped and waited until most of the room nodded their heads yes)

OK – here goes.

First things first – Don’t be this guy.


Don’t be the guy who says: “Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live.”

Don’t share advice like this; and don’t talk like this or joke like this.

This is some of the most horrendous advice about anything I have ever seen – or at least the worst I’ve seen about coding. And quite frankly I am certain it was written by someone who has a blind spot about the fact that women have to worry about being doxed by violent psychopaths just for being on the internet; or being stalked, attacked and too often killed for ignoring the advances of strangers, or for confronting cat-callers. Queer and trans people are also overwhelmingly more likely to be victims of violent crimes; especially trans women of color.

So using, even in jest, the specter of a violent psychopath, to encourage good coding practices is not just a crappy thing to do – it also reinforces a culture that is hostile to women and to other marginalized groups.

And I know we don’t want to admit it, but technology has a culture problem – even in libraries. Remember those search results for “tech worker” – they reflect the predominant image of who works in technology.

So what are some ways we can make technology work more inclusive?

I want to talk about 3 ways:

  1. change the image of the “tech guy”
  2. change the work environment
  3. watch your language (but not in the way you might think)

First, let’s talk about the “tech guy” image.

Some colleagues of mine at Stanford, sociologists Alison Wynn and Shelley Correll, have done some very interesting work looking at how well people who are already working & succeeding in technology jobs felt they matched the cultural traits & stereotypes of a successful tech worker; and how that sense of a match, or in the case of most women, the sense of a mismatch, affects a number of outcomes. (I don’t have a citation for this study, because it is still under review for publication. Because Shelley is an old friend, I knew about the research and got to read an unpublished version, which she gave me permission to reference in talks, but no citation. Scholarly communication is broken.)

First, they developed a composite scale based on how tech workers, men and women, described successful tech workers. Ask people to come up with some adjectives to describe a “successful tech worker” and, not too surprisingly, the stereotype that emerges is masculine, obsessive, assertive, cool, geeky, young, and working long hours. In other words, the “tech guy” stereotype is widespread and well-known.

And as we would expect, their data show that women tech workers are significantly less likely than their male counterparts to view themselves as fitting that cultural image of a successful tech worker.  Where it gets interesting though is that their research goes on to show that the sense of not fitting the cultural image has consequences.

Because women are less likely to feel they fit the image of a successful tech worker, they are less likely to identify with the tech field, more likely to consider leaving the tech field for another career, and less likely to report positive treatment from their supervisors.

A reminder that their sample was men and women currently working in tech jobs in Silicon Valley tech firms. So successful women in tech see themselves as not fitting in, and as a result are leaving the field.

The bottom line is that cultural fit matters – not just in the pipeline, as women decide whether to major in STEM fields or to pursue tech jobs – but also among women who are currently working in technology. In other words, stereotypes about tech work and tech workers continue to hinder women even after they have entered tech careers. If we want to ensure that our technologies are built by diverse and inclusive groups of people, we have to find ways to break down the stereotypes and cultural images associated with tech work.

And that brings us to the Star Trek posters – which is somehow always the most controversial part of talks I give on this topic.

But let’s get to the research. In a fascinating experimental study, psychologist Sapna Cheryan and colleagues found that women who enter a computer science environment that is decorated with objects stereotypically associated with the field – such as Star Trek posters or video games – are less likely to consider pursuing computer science than women who enter a computer science environment with non-stereotypical objects – such as nature or travel posters. These results held even when the proportion of women in the environment was equal across the two differently decorated settings.

The Star Trek posters and other seemingly neutral nerdy dude paraphernalia we use to decorate our communal tech spaces serve to deter women – and I expect some of it deters men from marginalized groups as well.

So, to sum up – we can make tech more inclusive if we stop using the term “tech guy”, if we try to promote images of tech workers that aren’t just geeky, obsessive dudes who work long hours, and if we get rid of the Star Trek posters in our communal & public spaces.

And I know some of you are thinking “but I like my Star Trek posters”, but I hope your commitment to diversity wins out over your devotion to your Star Trek posters. Because increasing the number of women in tech is hard, and we have very little research to guide us; but we do know that the Star Trek stuff makes tech work less appealing to women.

And finally, watch your language.

Research shows that certain words in job ads discourage women from applying. Women are less likely to apply for engineering and programming jobs when the ads include stereotypically masculine words like “competitive” or “dominate”, and they are more likely to feel that they wouldn’t fit in or belong when words like that are part of the job description. This is a case where technology can help – there are text analysis programs that can tell you if you are using gendered language in your job ads and can suggest more neutral language.
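A minimal sketch of what such a check might look like, in Python. The tiny hand-picked word lists and the function name here are my own illustrative assumptions; real tools rely on much larger, research-derived lexicons:

```python
import re

# Tiny illustrative lexicons; real tools use much larger, research-derived word lists.
MASCULINE_CODED = {"competitive", "dominate", "dominant", "aggressive", "rockstar", "ninja"}
FEMININE_CODED = {"collaborative", "supportive", "committed", "interpersonal"}

def flag_gendered_language(ad_text: str) -> dict:
    """Return the masculine- and feminine-coded words found in a job ad."""
    words = set(re.findall(r"[a-z]+", ad_text.lower()))
    return {
        "masculine_coded": sorted(words & MASCULINE_CODED),
        "feminine_coded": sorted(words & FEMININE_CODED),
    }

ad = "We need a competitive rockstar developer ready to dominate the leaderboard."
print(flag_gendered_language(ad))
# {'masculine_coded': ['competitive', 'dominate', 'rockstar'], 'feminine_coded': []}
```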

But again, this just points to the fact that if we want our technology to work towards diversity, inclusion and equity; we have to intervene and design it explicitly to do so.

That’s one of the lessons learned by a set of researchers who trained a machine learning algorithm on Google News articles and then asked the algorithm to complete the analogy:

“Man is to Computer Programmer as Woman is to X.” The answer came back: “Homemaker.”

In fact, when asked to generate a large number of “He is to X as She is to Y” analogies, the algorithm returned plenty more stereotypes:

  • He is to doctor as She is to nurse
  • He is to brilliant as She is to lovely
  • He is to pharmaceuticals as She is to cosmetics
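Mechanically, these analogies come from simple vector arithmetic on the learned word embeddings: find the word whose vector is closest to (programmer − man + woman). A toy sketch of the idea, with made-up vectors rather than the actual trained model:

```python
import numpy as np

# Made-up 3-dimensional "embeddings"; real word2vec vectors have hundreds of dimensions.
vectors = {
    "man":        np.array([ 1.0, 0.1, 0.2]),
    "woman":      np.array([-1.0, 0.1, 0.2]),
    "programmer": np.array([ 0.9, 0.8, 0.1]),
    "homemaker":  np.array([-1.1, 0.7, 0.1]),
    "scientist":  np.array([ 0.8, 0.9, 0.3]),
}

def complete_analogy(a: str, b: str, c: str) -> str:
    """Answer 'a is to b as c is to ?' by the nearest cosine neighbour to b - a + c."""
    target = vectors[b] - vectors[a] + vectors[c]
    def cosine(u, v):
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    candidates = [w for w in vectors if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(vectors[w], target))

print(complete_analogy("man", "programmer", "woman"))  # 'homemaker' in this toy space
```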

The corpus of text the machine learning algorithm learned on was itself biased and filled with stereotypes and stereotypical associations.

But again, there are ways to de-bias the system using human intervention.

In this case, a team of researchers flagged associations the algorithm had made that were gendered and added code instructing the algorithm to remove those associations. The algorithm could be taught to recognize and remove bias.
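A rough sketch of the kind of intervention involved: identify a gender direction in the embedding space (for example, from the difference between the “man” and “woman” vectors) and remove each flagged word’s component along it, so the word ends up equidistant from both. This is only a simplified illustration of the published idea, not the researchers’ actual code:

```python
import numpy as np

# Same toy vectors as above; real embeddings have hundreds of dimensions.
man        = np.array([ 1.0, 0.1, 0.2])
woman      = np.array([-1.0, 0.1, 0.2])
programmer = np.array([ 0.9, 0.8, 0.1])  # leans toward "man" in this toy space

def debias(word_vec: np.ndarray, gender_direction: np.ndarray) -> np.ndarray:
    """Subtract the component of a word vector that lies along the gender direction."""
    g = gender_direction / np.linalg.norm(gender_direction)
    return word_vec - np.dot(word_vec, g) * g

gender_direction = man - woman
neutral_programmer = debias(programmer, gender_direction)

# After the intervention, "programmer" is equally similar to "man" and "woman".
print(np.dot(neutral_programmer, man), np.dot(neutral_programmer, woman))
```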

OK – I started off with the notion that libraries aren’t neutral and technology is not neutral; and I’ve talked about lots of examples of technologies that aren’t neutral either in their design or in their execution or both. And I’ve offered some research to help bring more diversity to our library technology teams, in the hope that more diverse and inclusive teams building our technologies will lead to design choices that favor social equity and justice.

But let me be clear – I don’t think increasing the percentage of women, and men of color in our technology departments is a magic bullet and I certainly don’t think we need to wait until we are more diverse to start thinking about how to leverage our technology work to promote social justice. I think we need to increase the diversity of our libraries, in technology and throughout the profession – but numbers aren’t the only answer.

I have some general ideas about how we might build library technologies for social justice and I’ll share them quickly because I want to hear your ideas.

First, I think we need to consciously think about social justice principles and try to build them into every step of our work. For me social justice principles are feminist principles – transparency, participation, agency, embodiment. We should also ask who is missing from our work, or from the personas we develop. And if the answer is women; then we need to dig deeper and ask which women? Too often we think adding white women fixes our diversity problem.

If we really want to work on tech projects that promote social justice in our communities then we need to talk to our most marginalized community members. At my institution, that would be the racial and ethnic identity student groups, the queer and the trans students, the Muslim students. If we reach out to these groups specifically and try to find out what they need, what they struggle with in the library and more generally at our institutions, we might realize that there are technology projects that would help.

And in all of our work, I think we get closer to social justice the more we practice the art of truly listening to each other and to our communities.

I also want to promote an ethic of care and empathy, which is something two of my favorite humanists have recently written about: Bethany Nowviskie, executive director of DLF, wrote about this in a piece titled “on capacity and care”; and just this weekend Kathleen Fitzpatrick, director of scholarly communication at the Modern Language Association, wrote about a new project she is calling “Generous Thinking.” I recommend both to you.

And in that spirit of listening, it is time for me to wrap this up and to hear from you. I hope you will feel free to say whatever you want, to make comments of all kinds; no need to phrase it in the form of a question. A conversation among all of us is much more interesting than me answering questions. So I’m ready to listen now. Thank you.

Responses to “Libraries, technology, and social justice”

  1. jessamyn October 19, 2016 at 1:36 pm

    I appreciate the heck out of this. My college thesis paper was about “generic” male pronouns and how they are not interpreted anything close to generically. I’ve really appreciated reading up on the culture of Rethinking Repair (hey that’s MIT press!) about our responsibility to stewardship as well as just innovation or service delivery even.


  2. klmccook October 16, 2016 at 7:46 am

    Back when LIBRARY SCHOOLS were adding INFORMATION SCIENCE to the names of our programs, the meme was “information science is library science in suits” – an earlier version of the same theme. Now, of course, most of the programs have dropped the word library.


  3. Christie Hurrell October 14, 2016 at 9:37 am

    Thank you for this post. At the recent Designing Libraries conference, we heard computer scientist Sheelagh Carpendale speak about gendered language in her discipline’s literature, and how it perpetuates gender stereotypes. For most people, the term “user” evokes men – which is so interesting given that this is a term libraries use to describe our communities so frequently.

