Below is the text from the OLITA Spotlight talk I gave at the OLA Super Conference (#olasc15).
I want to acknowledge from the outset that this talk has been heavily influenced by a number of people who have shared their work and their thoughts with me over the years. I’ve been privileged to learn from them, in some cases formally through their publications and in some cases through conversations on Twitter or even in person. These aren’t the only folks whose work and thinking influences me, but they are the key people I think of when I think of critical work on the intersections of libraries, technology, higher education and social justice. These are their names – a mix of students, librarians, scholars, and technologists. Again, this is not a comprehensive list of the people whose work inspires me, but they are my top 7 right now on these topics.
Let me also acknowledge that I’m well aware that the fact that I am a white woman working at an elite private US university gives me access to a platform like this one to talk about issues of bias and exclusion in libraries and technology. There are plenty of folks who have been talking and writing about these issues, with far more insight and eloquence than I can muster, but who don’t get invitations like this for a variety of reasons. And the sad truth is that what I say, as an associate director at Stanford Libraries or as Director of MIT Libraries, often gets more attention than it deserves because of my title; while folks with less impressive titles and less privilege have been talking & thinking about some of these issues for longer than I have and have insights that we all need to hear.
So next time you are looking for a speaker, please consider one of the names listed above.
If you read the blurb describing this talk, you know that a fundamental tenet that undergirds this talk, and frankly undergirds much of the work I have done in and for libraries, is the simple assertion that libraries are not now nor have they ever been merely neutral repositories of information. In fact, I’m personally not sure “neutral” is really possible in any of our social institutions … I think of neutral as really nothing more than a gear in your car.
Title slide for Never Neutral talk
But what I mean when I say libraries are not neutral is not just that libraries absorb and reflect the inequalities, biases, ethnocentrism, and power imbalances that exist throughout our host societies and (for those of us who work in academic libraries) within higher education.
I mean that libraries are not neutral in a more direct and active way.
For an exceptionally compelling take on libraries as not just not neutral, but as instruments themselves of institutional oppression, please read “Locating the Library in Institutional Oppression” by my friend and colleague nina de jesus.
nina argues that “Libraries as institutions were created not only for a specific ideological purpose, but for an ideology that is fundamentally oppressive in nature.” It is a bold argument, convincingly made; and I urge you to read it. As a bonus, the article itself is Open Access and nina elected to use only Open Access sources in writing it.
So I start with the premise that it isn’t just that libraries aren’t perfectly equitable or neutral because we live in a society that still suffers from racism, sexism, ableism, transphobia and other forms of bias and inequity; but libraries also fail to achieve any mythical state of neutrality because we contribute to bias and inequality in scholarship, and publishing, and information access.
Let me step back for a minute and own up to a few of my own biases – my library career thus far has been solely and squarely within large academic libraries; so my perspective, my examples, and my concerns come out of that experience and are likely most relevant to that sector of libraries. But, I hope we can have a conversation at the end of my talk about what the differences and similarities might be between the way these issues play out in large academic libraries and the way they play out in all kinds and sizes of libraries. I’m also definitely speaking from an American perspective, and I look forward to hearing where and how cultural differences intersect with the ideas I’ll talk about.
OK – so libraries are not neutral because we exist within societies and systems that are not neutral. But above and beyond that, libraries also contribute to certain kinds of inequalities because of the way in which we exercise influence over the diversity (or lack thereof) of information we make available to our communities and the methods by which we provide access to that information.
I have a whole other talk that I’ve given on how the collection development decisions we make impact not just how inclusive or not our own collections are, but also what kinds of books and authors and topics get published. The short version of that talk is that when we base our purchasing decisions on circulation and popularity, we eliminate a big part of the market for niche topics and underrepresented authors. That is bad for libraries, bad for publishing, and bad for society. But that’s another talk. This talk is about library technologies.
But before we get into technology per se, I think a word about our classification systems is necessary, because the choices we make about how our technologies handle metadata and catalog records have consequences for how existing biases and exclusions get perpetuated from our traditional library systems into our new digital libraries.
Many of you are likely well aware of the biases present in library classification systems.
Hope Olson – one of the heroes of feminist and critical thinking in library science – has done considerable work on applying critical feminist approaches to knowledge organization to demonstrate the ways in which libraries exert control over how books and other scholarly items are organized and therefore how, when, and by whom they are discoverable.
Our classification schemes — whether Dewey Decimal or Library of Congress — are hierarchical, which leads to the marginalization of certain kinds of knowledge and certain topics by creating separate sub-classifications for topics such as “women and computers” or “black literature”.
Let me give a couple of examples of the effects of this.
Call numbers matter
The power of library classification systems is such that a scholar browsing the shelves for books on military history is unlikely to encounter Randy Shilts’ seminal work Conduct Unbecoming: Gays & Lesbians in the US Military, because that book has been given a call number corresponding to “Minorities, women, etc. in armed forces”. In my own library at Stanford University, that means the definitive work on the history of gays and lesbians serving in the armed forces is literally shelved between Secrets of a Gay Marine Porn Star and Military Trade — a collection of stories by people with a passion for military men. Now I’m not saying we shouldn’t have books about gay military porn stars or about those who love men in uniform. I am saying that there is nothing neutral about the fact that the history of gay & lesbian service members is categorized alongside these titles, while the history of “ordinary soldiers” (that’s from an actual book title) is shelved under “United States, History – Military.”
Another example is one I learned of from my friend and colleague Myrna Morales, and you can read about it in an article I co-authored with her and Em Claire Knowles. In that article, Myrna writes about her experience doing research for her undergraduate thesis on the Puerto Rican political activism that took place in NYC in the 1960s, with a special interest in the Young Lords Party.
Here is how Myrna described her experience:
I first searched for the YLP with the subject heading “organizations,” subheading “political organization,” in the Reader’s Guide to Periodical Literature. Here I found no mention of the YLP. I was surprised, as I had known the YLP to be a prominent political organization—one that addressed political disenfranchisement, government neglect, and poverty. A (twisted) gut feeling told me to look under the subject heading of “gangs.” There it was—Young Lords Party. This experience changed my view of the library system, from one impervious to subjectivity and oppression to one that hid within the rhetoric of neutrality while continuing to uphold systemic injustices.
I suspect that this kind of experience is all too common for people of color and other marginalized people who attempt to use the resources we provide. I’ll go so far as to wonder if these sorts of experiences aren’t at least partially responsible for the incredibly low proportion of people of color who pursue careers in librarianship.
So our traditional practices and technologies are not neutral, and without active intervention we end up with collections that lack diversity, and we end up classifying and arranging our content in ways that further marginalize works by and about people of color, queer people, indigenous peoples, and others who don’t fit neatly into a classification system that sets the default as western, white, straight, and male.
Of course, the promise of technology is that we no longer need rely on arcane cataloging rules and browsing real library stacks to discover and access relevant information. With the advent of online catalogs and search engines, books and other information items can occupy multiple “places” in a library or collection.
But despite the democratizing promise of technology, our digital libraries are no more capable of neutrality than our traditional libraries; and the digital tools we build and provide are likely to reflect and perpetuate stereotypes, biases, and inequalities unless we engage in conscious acts of resistance.
Now when most people talk about bias in tech generally, or in library technology specifically, we talk about either the dismal demographics that show that white women and people of color are way underrepresented in technology, or we talk about the generally misogynistic, racist, and homophobic culture of technology; or we talk about both demographics and culture and how they are mutually reinforcing. What we talk about less often is the notion that the technology itself is biased – often gendered and/or racist, frequently ableist, and almost always developed with built-in assumptions about binary gender categories.
For some folks, the idea that technologies themselves can be gendered, or can reflect racially based and/or other forms of bias is pretty abstract. So let me give a few examples.
Most librarians will agree that commercial search engines are not “neutral” in the sense that commercial interests and promoted content can and do impact relevancy. Or, as my colleague Bess Sadler says, the idea of neutral relevance is an oxymoron.
Safiya Noble’s work demonstrates how the non-neutrality of commercial search engines reinforces and perpetuates stereotypes, despite the fact that many assume the “algorithm” is neutral.
What Noble’s analysis of Google shows us is that Google’s algorithm reinforces the sexualization of women, especially black and Latina women. Because of Google’s “neutral” reliance on popularity, page rank, and promoted content, the results for searches for information on black girls or Latina girls are dominated by links to pornography and other sexualized content. Noble suggests that users “Try Google searches on every variation you can think of for women’s and girls’ identities and you will see many of the ways in which commercial interests have subverted a diverse (or realistic) range of representations.”
Search technologies are not neutral – just as basing collection development decisions on popularity ensures that our collections reflect existing biases and inequalities, so too does basing relevancy ranking within our search products on popularity ensure the same biases persist in an online environment.
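The feedback loop at work here can be made concrete with a toy sketch. This is my own hypothetical illustration, not any search vendor’s actual ranking code: when results are ordered by accumulated popularity, an item that starts with even a small head start captures the top slot, gathers every subsequent click, and the initial skew compounds rather than corrects.

```python
# Toy model of popularity-based relevance ranking (hypothetical, for
# illustration only). Each "search" clicks the top-ranked result, which
# feeds back into the ranking -- a rich-get-richer loop.

def rank_by_popularity(items):
    """Sort items by accumulated click count, highest first."""
    return sorted(items, key=lambda item: item["clicks"], reverse=True)

def simulate_searches(items, searches=1000):
    """Each simulated user clicks the top-ranked result, reinforcing it."""
    for _ in range(searches):
        top = rank_by_popularity(items)[0]
        top["clicks"] += 1
    return rank_by_popularity(items)

# Two equally relevant items; one starts with a tiny popularity head start.
catalog = [
    {"title": "mainstream title", "clicks": 10},
    {"title": "marginalized title", "clicks": 9},
]
results = simulate_searches(catalog)
# After 1000 searches, the head start has compounded: the initially popular
# item has absorbed every click, and the other item never surfaces.
```

The point of the sketch is that nothing in the loop is overtly biased; ranking by popularity simply amplifies whatever distribution of attention it starts with.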
But it isn’t just search engines. In an article called “Teaching the Camera to See My Skin,” photographer Syreeta McFadden describes how color film and other photographic technologies were developed around measuring the image against white skin. Because the default settings for everything from film stock to lighting to shutter speed were and are designed to best capture white faces, it is difficult to take photos of non-white faces that will be accurately rendered without performing post-image adjustments that sacrifice the sharpness and glossy polish that is readily apparent in photos of white faces.
Teaching the camera to see my skin
Finally, in an example of a technology that betrays its lack of neutrality by what it ignores, Apple’s recently released health app allows users to track a seemingly endless array of health and fitness related information on their iPhone. But strangely, Apple’s health app did not include a feature for tracking menstrual cycles – an important piece of health data for a huge percentage of the population. As one critic noted, Apple insists that all iPhone users have an app to track stock prices – you can’t delete that one from your phone – but fails to provide an option for tracking menstrual cycles in its “comprehensive” health tracking application.
I hope these examples demonstrate that technologies do not exist as neutral artifacts and tools that merely happen, sometimes, to get used in oppressive and exclusionary ways. Rather, technology itself has baked-in biases that perpetuate existing inequalities and exclusions, and that reinforce stereotypes.
So how do we intervene, how do we engage in acts of resistance to create more inclusive, less biased technologies?
Note that I don’t think we can make completely neutral technologies … but I do think we can do better.
One way we might do better is simply by being aware and by asking the questions that the great black feminist thinkers taught us to ask:
Who is missing?
Whose experience is being centered?
Many, many folks argued – rather convincingly to my mind – that the dearth of women working at Apple may have contributed to the company’s ability to overlook the need for menstrual cycle tracking in its health app.
So we might also work on recruiting and retaining more white women and people of color into library technology teams and jobs. There is much good work being done on trying to increase the diversity of the pipeline of people coming into technology – Black Girls Code and the Ada Initiative are examples of excellent work of this type.
I also think the adoption of strong codes of conduct at conferences like this one and other library and technology events make professional development opportunities more welcoming and potentially safer for all – and I think those are important steps in the right direction.
But in the end, one of the biggest issues we need to address if we truly want a more diverse set of people developing the technologies we use is the existence of a prevailing stereotype about who the typical tech worker is.
I want to turn now to some research on how stereotypes about who does technology, and who is good at it, affect how interested different kinds of people are in pursuing technology related fields of study, how well people expect they will perform at tech tasks, and how well people already working in tech feel they fit in, and how likely they are to stay in tech fields.
First a definition – Stereotypes are widely shared cultural beliefs about categories of people and social roles. The insidious thing about stereotypes is that even if we personally don’t subscribe to a particular stereotype, just knowing that a stereotype exists can affect our behavior.
Second, a caution – much of this research focuses on gender, to the exclusion of intersecting social identities such as race, sexuality, or gender identity. The research that talks about “women’s” behavior and attitudes towards technology is usually based on straight white women, so keep that in mind, and recognize that much more research is needed to capture the full range of experiences that marginalized people have with and in technology.
That said, there is a huge body of research documenting the effect of negative stereotypes about women’s math and science abilities. These kinds of stereotypes lead to discriminatory decision making that obstructs women’s entry into and advancement in science and technology jobs. Moreover, negative stereotypes about women and math affect women’s own self-assessment of their skill level, interest, and suitability for science and technology jobs.
Barbie “Math is hard”
In a not-yet-published research study of men and women working in Silicon Valley technology firms, Stanford sociologists Alison Wynn and Shelley Correll examined how tech workers’ sense of how well they matched the cultural traits of a successful tech worker affected a number of outcomes.
First they developed a composite scale based on how tech employees, men and women, described successful tech workers. The stereotype that emerged was masculine, obsessive, assertive, cool, geeky, young, and working long hours.
Their data show that women tech workers are significantly less likely than their male counterparts to view themselves as fitting the cultural image of a successful tech worker. While that may not be a surprising finding, their research goes on to show that the sense of not fitting the cultural image has consequences.
Because women are less likely to feel they fit the image of a successful tech worker, they are less likely to identify with the tech field, more likely to consider leaving the tech field for another career, and less likely to report positive treatment from their supervisors.
The bottom line is that cultural fit matters – not just in the pipeline, as women decide whether to major in STEM fields or to pursue tech jobs – but also among women who are currently working in technology. In other words, stereotypes about tech work and tech workers continue to hinder women even after they have entered tech careers. If we want to ensure that our technologies are built by diverse and inclusive groups of people, we have to find ways to break down the stereotypes and cultural images associated with tech work.
How do we do that?
If we want to look to success stories, Carnegie Mellon University is a good example. At Carnegie Mellon they increased the percentage of women majoring in computer science from 7% in 1995 to 42% in 2000 by explicitly trying to change the cultural image of computer scientists. Faculty were encouraged to discuss multiple ways to be a computer scientist and to emphasize the real world applications of computer science and how computer science connects to other disciplines. They also offered computer science classes that explicitly stated that no prerequisites in math or computer science were required.
For libraries, we can talk about multiple ways to be a library technologist, and we can emphasize the value of a wide variety of skills in working on library tech projects – metadata skills, user experience skills, design skills. We can provide staff with opportunities to gain tech skills in low-threat environments and in environments where white women and people of color are less likely to feel culturally alienated.
RailsBridge workshops and AdaCamps seem like good fits here, and I’d like to see more library administrators encouraging staff from across their organizations to attend such training. At Stanford, my colleagues Bess Sadler and Cathy Aster started basic tech training workshops for women on the digital libraries’ staff who were doing tech work like scanning, but who didn’t see themselves as tech workers. Providing the opportunity to learn and ask questions in a safe environment, away from their supervisors and male co-workers, gave these women skills and confidence that enhanced their work and the work of their groups.
Another simple way we can make progress within our own organizations is to pay attention to the physical markers of culture.
In a fascinating experimental study, psychologist Sapna Cheryan and colleagues found that women who enter a computer science environment that is decorated with objects stereotypically associated with the field – such as Star Trek posters — are less likely to consider pursuing computer science than women who enter a computer science environment with non-stereotypical objects — such as nature or travel posters. These results held even when the proportion of women in the environment was equal across the two differently decorated settings.
We need to pay attention to the computer labs and maker spaces in our libraries, and we need to pay attention to the physical environments our technical staff work in. By simply ensuring that these spaces aren’t plastered with images and objects associated with stereotypes about “tech guys,” we will remove one of the impediments to women’s sense of cultural fit.
So let me try to sum up here.
I’ve argued that like libraries, technology is never neutral. I’ve offered examples from search engines to photography to Apple’s health tracking app.
I’ve talked about how the pervasive stereotypes about who does tech work limit women’s participation in tech fields, through both supply and demand side mechanisms.
The stereotypes about tech workers also contain assumptions about race and sexuality in the US context, in that the stereotypical tech guy is white (or Asian) and straight. Sadly, there is significantly less research on the effect of those stereotypes on black and Latino men and women and queer people who are also vastly underrepresented in technology work.
Let me offer some parting thoughts on how we might make progress.
To borrow from the conference theme, we need to think and we need to do.
We need to think about the technology we use in our libraries, and ask where and how it falls short of being inclusive. Whose experiences and preferences are privileged in the user design? Whose experiences are marginalized? Then we need to do what we can to push for more inclusive technology experiences. We likewise need to be transparent with our patrons about how the technology works and where and how the biases built into that technology might affect their experience. The folks who do work in critical information literacy provide great models for this.
We should think about how libraries and library staff reinforce stereotypes about technology and technology work. Subtle changes can make a difference. We should drop the term “tech guy” from our vocabulary and we should ditch the Star Trek posters. I’d like to see more libraries provide training and multiple paths for staff to develop tech skills and to become involved in technology projects. We need to pay attention to the demographics and to the culture – and remember that they are mutually reinforcing.
We also need to remember that we aren’t striving for neutral, and we aren’t aiming for perfectly equitable and inclusive technology.
While neutral technologies are not possible – or necessarily desirable – I believe that an awareness of the ways in which technology embodies and perpetuates existing biases and inequalities will help us make changes that move us towards more inclusive and equitable technologies.