Posts Tagged 'citations'

Sometimes scholars do tell us how the library impacted their work

If only scholars thanked those who contributed to their work … (Thank you by Avard Woolaver)

Like all of higher education, libraries are under increasing pressure to demonstrate their value by showing how our collections and services impact the teaching and research missions of our institutions. I have argued before that a Return on Investment approach is a bad idea, and that the value of libraries is both very real and very hard to measure. And more recently I put out a plea for us to stop equating frequency of use with importance when it comes to library resources.

One major problem with almost all of the ways we try to measure the impact of our resources is that our measures are poor proxies for what we are really trying to assess. Even citation analyses aimed at measuring how much of our holdings are cited in dissertations and faculty publications produce a sloppy and imprecise measure of actual impact. Putting aside the issues of drive-by citations, coercive citations, and negative citations, it is also the case that scholars get materials from many sources besides the library. A citation to something in our collection is not a reliable indicator that the scholar used our copy of the item in their research. So citation analyses are likely to overestimate the impact of our collections. Moreover, they provide no means of assessing the impact of our staff and our services.

Wouldn’t it be great if scholars would just tell us straight up when our collections, services, and staff contributed in tangible ways to their research? I mean, what if they just outright said things like:

The friendly staffs at Green Library, Crown Law Library, and Cubberley Education Library were also invaluable.

~Richard Cottle in Stanford Street Names, 2005

Many, many thanks for the guidance and invaluable resources provided by Maggie Kimball, Stanford University archivist, Dennis Copeland, director of the California Collection at the Monterey Public Library, … Joe Wible, head librarian at Hopkins Marine Station’s Miller Library; Neal Hotelling at the Pebble Beach archive …

~Susan Shillinglaw in A Journey into Steinbeck’s California, 2006

Obviously scholars do acknowledge the impact of libraries on their work — they do so in the acknowledgements sections of their books. In my opinion, acknowledgements provide the most direct measure of the impact of library collections and services on research.

If you want to hear more about how the amazing Jacque Hettel and I are analyzing acknowledgement data, come hear our snapshot talk at DLF. Jacque has created a text corpus of acknowledgements of the Stanford libraries from books published over the past 10 years. We are busy analyzing those acknowledgements to understand which library departments get the most shout-outs (hint: Special Collections and InterLibrary Borrowing are in the lead) and whether there are disciplinary differences in how scholars acknowledge the library: in general terms, or by singling out specific individuals. We also plan to explore acknowledgements that mention multiple libraries as a way to visualize networks of libraries across disciplines.

If we really want to know how our libraries impact scholarship, we should be paying careful attention to what scholars actually say about us when they are acknowledging those people and resources that contributed to their research.
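A first pass at this kind of acknowledgement analysis can be as simple as counting pattern matches per department. The snippet below is a minimal sketch, not the actual corpus or coding scheme: the acknowledgement strings and the department patterns are invented for illustration, and a real analysis would need to handle aliases, misspellings, and named individuals.

```python
import re
from collections import Counter

# Hypothetical acknowledgement snippets (illustrative only).
acknowledgements = [
    "Thanks to the staff of Special Collections for their patient help.",
    "I am grateful to InterLibrary Borrowing for tracking down rare items.",
    "The Special Collections reading room staff were invaluable.",
]

# One simple pattern per department; a real coding scheme would be richer.
departments = {
    "Special Collections": r"special collections",
    "InterLibrary Borrowing": r"interlibrary (borrowing|loan)",
}

# Count how many acknowledgements mention each department.
counts = Counter()
for text in acknowledgements:
    for dept, pattern in departments.items():
        if re.search(pattern, text, re.IGNORECASE):
            counts[dept] += 1

print(counts.most_common())
# → [('Special Collections', 2), ('InterLibrary Borrowing', 1)]
```

Even this crude tally reproduces the kind of "who gets the most shout-outs" ranking described above; disciplinary comparisons would just add a field per book and group the counts.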

Shameless self-promotion of publications, presentations, and unpublished manuscripts

I decided I needed a place to keep track of my publications, presentations, and unpublished manuscripts, so I added the Publications and Presentations page now linked at the top of this blog.

A second, but important, motivation is to provide access to the full text of some unpublished manuscripts, including my dissertation Gender Mistakes & Inequality. I also included an unpublished paper I wrote about civilian husbands of military women for a graduate school class back in 1994. I still get an occasional request for a copy of that paper, and it has actually been cited a few times in published books and articles. I guess it pays to be the first to write about something, and to have had an awesome advisor who continues to tell anyone writing about military women with civilian husbands that I wrote about the topic way back when.

I actually think one of my best papers is an unpublished and (so far) uncited paper with the clever title Bowling with veterans: The impact of military service on subsequent civic engagement. I wrote it for a graduate methods course at Stanford, and submitted it to the American Journal of Sociology back in 2001. I got a very kind and very helpful rejection from AJS. I worked on it a bit more and was thinking about submitting it somewhere else, but then I got a full-time gig at the Stanford Libraries and I just never could find the time to get back to it.

Does posting these old unpublished manuscripts online here count as self-publishing? I wonder if putting them online will have any effect on citations? At any rate, I’m not likely to be writing any more military sociology papers, but I do now have a place to put future presentations so I can find them again.

Moving Beyond Libraries as big photo albums

In Moving beyond the photo album, Kevin Smith describes journal articles as “snapshots” of research that are “increasingly far-removed from the actual research process and have less and less relevance to it.”

Smith summarizes a talk by G. Sayeed Choudhury, Associate Dean for Library Digital Programs at Johns Hopkins University, in which:

Choudhury called on libraries to move past a vision of themselves as merely a collection of these snapshots and become more active participants in the research process. He recounted a conversation he had with one researcher who, in focusing on the real need he felt in his own work, told Sayeed that he did not care if the library ever licensed another e-journal again, but he did need their expertise to help preserve and curate his research data. The challenge for libraries is to radically rethink how we spend our money and allocate the expertise of our staffs in ways that actually address felt needs on our campuses and do not leave us merely pasting more snapshots into a giant photo album that fewer people every day will look at.

Of course, as Smith notes, promotion and tenure systems still rely on the “outmoded system of scholarly communications that is represented by the scientific journal”, even as actual scholarly communication is happening outside of, and in spite of, published journal articles.

What are libraries to do? Dwindling budgets are forcing many of us to cancel journal subscriptions anyway; but as long as tenure and promotion rely on traditional citation and publication counts, we can’t very well stop collecting the snapshots that are traditional scholarship. But, we can play a role in promoting new modes of scholarly communication, and we ought to be participating in conversations about establishing better, more efficient, more relevant ways of evaluating scholarship.

Edited 8/27/09: Thanks to @ericrumsey for the better, more accurate title suggestion.


What do citations really mean?

Several recent papers and/or blog posts have highlighted the social nature of citation practices, challenging the assumption that citation counts stand as impartial measures of the quality and impact of a piece of scholarly work:

  1. In Position matters, Philip Davis summarizes Positional Effects on Citation and Readership in arXiv, in which the authors find that articles listed at the top of arXiv’s daily publication announcements were cited more frequently than articles listed lower. The authors go on to note:

    we’ve documented here that accidental forms of visibility can drive early readership, with consequent early citation potentially initiating a feedback loop to more readership and citation, ultimately leaving measurable and significant traces in the citation record.

  2. Although Henry at Crooked Timber rather convincingly argues that Charles Rowley’s article (behind paywall), ‘The curious citation practices of Avner Greif: Janet Landa comes to grief’, is “one of the sorriest hack-jobs that I’ve ever had the misfortune to read in an academic journal”, the Rowley article does highlight the very real influence that individual citation choices can have. Scholarly careers can be made or broken by citation (or lack thereof) in a subsequently influential piece of scholarship. Once cited in one influential paper, a work is more likely to become part of the regularly cited literature on a topic, as subsequent scholars cite the same works.
  3. Finally, Steven Greenberg’s How citation distortions create unfounded authority: analysis of a citation network highlights the social side of citations by demonstrating how “the persuasive use of citation – bias, amplification, and invention – can be used to establish unfounded scientific claims as fact.”

Clearly many factors affect when, where, why, by whom, and how often a given scholarly work is cited. We would all do well to remember not to place undue weight on individual citations or citation patterns, and not to rely blindly on citation counts as an unbiased, objective measure of quality.


How do you do a lit review?

The question of “how do you do a lit review?” is being discussed by folks at my favorite sociology blog, scatterplot. As a librarian, I find it fascinating to “eavesdrop” on faculty having this discussion and to see what advice they have:

1. ISI Citation Databases/Social Science Citation Index
-If you aren’t already familiar with this then you need to be! Extremely useful for finding key articles because of the ability to sort by number of citations. This isn’t a sure-fire way to find the most important articles so don’t just use number of citations but it pretty much always leads you to some key articles and from there you can quickly get to others.

2. Google Scholar
-Also can be good for articles but I use it primarily for books. Particularly nice for books because of the Google Books project that lets you flip through some of the book to get a sense of whether or not it might be useful before you run to the library or purchase it.

3. Comp reading lists from our department and other top departments
-A good place to find an introduction to a general field. It will give you a sense of some of the most important articles and the general topics for the field and you can dig deeper from there. Sometimes reading lists will be fairly comprehensive, though they rarely have the latest “cutting-edge” research.

4. Syllabi from key people in the field or really respected institutions
-Works similarly to the comp reading list though it may be better in some instances. Of course, if you’re looking for syllabi based on the key people in the field this requires you to know who these key people are in the first place. If it’s a fresh syllabus then it will often have newer research. Depending on the time put into the syllabus and the level of detail it may also give you a much better sense of how the field is loosely organized.

5. Annual Reviews
-Not surprisingly, these tend to have diminishing returns as they age. If there’s one that’s recent and for your specific interest then these are often money. If the article is less recent but not especially “old” it should still give you a nice framework upon which to build. Even an older one can still be useful: sometimes the key features of a debate last a long time, often debates are cyclical, and, if nothing else, they can give you a bit of a history lesson to help you understand where the current literature is coming from.

Some of the comments contain great tidbits too … like “Reviewing the literature and writing the literature review are not the same thing.”

This makes me think that we (librarians) might consider pitching our drop-in workshops not as “How to use library databases”, but as “How to do a literature review”. And maybe we should get some interested faculty to co-teach a session with us.

Citation concentration debate

Once again, the folks at hangingtogether are my source for new research. In Efficiency and scholarly information practices, they describe a new study published in the Journal of the American Society for Information Science and Technology (JASIST) that purports to refute an earlier study in Science showing that as journal content becomes available online, citations get narrower.

(Full citation: Larivière, Vincent, Yves Gingras, and Eric Archambault. “The Decline in the Concentration of Citations, 1900–2007.” Journal of the American Society for Information Science and Technology 60.4 (2009): 858–862. Stanford-only link.)

The original article “Electronic Publishing and the Narrowing of Science and Scholarship”, by friend and colleague James Evans, attracted quite a bit of attention (I wrote a bit about it myself), so I am not surprised that someone has already tried to replicate (or refute) his findings.

In “The Decline in the Concentration of Citations, 1900–2007,” the authors claim that, contrary to Evans’ findings, citations are becoming increasingly dispersed over time. The major problem here is that Evans did a fairly sophisticated analysis (including negative binomial models) that controls for time, which allows him to isolate the estimated effect of journal online availability on trends and patterns of citations. In contrast, Larivière, Gingras, and Archambault readily admit that their models “do not take into account ‘the online availability’ variable.”

In other words, the study that claims to refute Evans’ work actually fails to account for the key independent variable, and instead shows simple trends over time.
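The statistical point can be made with a toy example. The numbers and the generating function below are entirely hypothetical, invented only to illustrate the logic: if dispersion rises over time for other reasons while online availability (which also rises over time) narrows citations, then a trend-only analysis and an analysis that holds time fixed can reach opposite conclusions.

```python
# Hypothetical generating process (illustrative numbers only): citation
# dispersion rises with time but falls as online availability grows.
def dispersion(year, online_share):
    return 10.0 + 0.5 * year - 2.0 * online_share

years = range(10)
online_shares = [0.1 * y for y in years]  # availability grows over time
observed = [dispersion(y, s) for y, s in zip(years, online_shares)]

# A trend-only analysis (no availability variable) sees rising dispersion.
trend_only = observed[-1] - observed[0]
print(trend_only > 0)  # True: dispersion appears to be increasing

# Holding the year fixed isolates the availability effect, which is
# negative: more online availability means narrower citations.
controlled = dispersion(5, 0.9) - dispersion(5, 0.1)
print(controlled < 0)  # True
```

Under these made-up numbers both statements are true at once, which is exactly why a model that omits the online-availability variable cannot refute one that controls for it.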

More citations for fee-based online journals than free ones

My buddy James Evans (with co-author Jacob Reimer) continues to produce important and methodologically rigorous research on publishing and citation patterns.

In a study published in Science magazine (no link available yet; PDF, subscription required), Evans and Reimer find that:

when a research article is offered online after being in print for one year, the use of an open-source format increases citations to the article by 8 percent. But when a paid-subscription format is used to distribute a year-old print article, the citations increase by 12 percent. (from The Chronicle of Higher Education)

Some of the Chronicle comments note that Evans and Reimer elected to publish in a fee-based journal themselves.

There is a self-fulfilling prophecy at work in the pattern they find. For a scholar seeking tenure, citation count matters more than simple exposure. If fee-based journals get cited more, they will remain more attractive publication vehicles for scholars. And as the best and rising scholars continue to publish in fee-based journals, those journals will continue to get cited more.

Update (Feb. 25, 2009): Evans describes his research in this video interview from National Science Foundation.

Just for fun: Alternative source citing

I found these examples of Alternative Source Citing from the PMLA amusing.

Now I am prepared when a student comes to the reference desk and asks how to cite restroom graffiti in their research paper.

Citation non-proliferation

An old grad school classmate, James Evans, has published an article in Science that is getting a bit of attention. In Electronic Publication and the Narrowing of Science and Scholarship, Evans finds:

that as more journal issues came online, the articles referenced tended to be more recent, fewer journals and articles were cited, and more of those citations were to fewer journals and articles.

For the record, James is a wicked smart sociologist, and a generally terrific person. In fact, he responded quickly and thoughtfully when I emailed him with my concerns that his findings might be interpreted by some to argue that the increasing availability and use of online resources is leading to lower quality of research (as measured by number and breadth of citations).

To the contrary, James replied:

On the interpretation of findings, I don’t think convergence is bad…neither do I think the move from monographs to research articles is bad. If scientists are all talking about the same things, we can more quickly agree on where to go next. I think there are both individual and global (i.e., for science) benefits to this. I think there are also global costs, however. If people pick up a finding as the most important because others feel it is so, then there may be an increase in group-think, which drops diverse findings from current conversation. If something doesn’t fit the current paradigm, it is less likely to influence it.

James further notes that his research:

ironically intimates that one of the chief values of print library research is poor indexing. Poor indexing—indexing by titles and authors, primarily within core journals—likely had unintended consequences that assisted the integration of science and scholarship. By drawing researchers through unrelated articles, print browsing and perusal may have facilitated broader comparisons and led researchers into the past.

I made the argument earlier that online browsing is far more efficient than print browsing, but had not thought much about these kinds of unintended consequences. While I’m not completely convinced that serendipitous discovery of unknown and seemingly unrelated scholarship is all that common, this is the most convincing argument for the value of print browsing that I have seen. Because print browsing is so inefficient, scholars are forced to encounter books (and ideas) that they would not otherwise have encountered.

