How will undergraduates navigate a post-peer-review scholarly landscape?

With all of its flaws (and there are many), the current system lets faculty tell students to find articles that pass one crucial credibility test: peer review.

This is pretty easy for students to do, given a bit of instruction.  Many databases will indicate if something is peer reviewed (although they don’t always get it right), and most primary research articles are peer reviewed – you just need to be able to recognize one.

But peer review is changing.  It isn’t going away anytime soon, but a variety of trials, experiments, and advocacy efforts are reshaping it.  Cameron Neylon, for one, has argued in favor of doing away with the current peer review system altogether.

This may require a more informed readership: readers who understand what various metrics mean and who rely more on the general reputation of a journal (however that is measured).

All of this creates problems for your typical undergraduate.

When they are just starting out, students don’t have the required scientific knowledge of concepts and methods to adequately evaluate the quality of a journal article on their own – that’s what they are in college to learn.

So when their professors ask them to write a paper or complete a project using high quality primary research articles, how will students filter the signal from the noise if the simple “peer-reviewed” credibility test no longer works?

I can think of a few things that may help them out, although none is as simple as the peer-reviewed test seems now.  Each would also require a bit more instruction to bring students up to speed on these concepts.

  • Use the databases as a filtering tool.  Databases like Scopus, Web of Science, and SciFinder select which journals to include.  Theoretically, they wouldn’t include content from the really poor quality journals.  Of course, this doesn’t stop bad papers from appearing in good journals.  Faculty could limit students to articles found in a particular database.
  • Increased prevalence of article-level metrics on publisher websites.  Some journals already make this information prominent (PLoS ONE, for example) and more are doing so.  This would require more education (for both faculty and students) about what these metrics mean (and don’t mean).  Faculty could ask students to use only articles that meet some minimum threshold (see the sketch after this list).
  • An expansion of rating networks like Faculty of 1000.  We don’t have access to this resource at my institution, but we may see undergraduates relying more on this (and similar networks) to help them get a sense of whether an article is “worthy” or not.  Students could be limited to using articles that had a minimum rating.
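
To make the threshold idea concrete, here is a minimal Python sketch.  It assumes that a citation count pulled from Crossref’s public REST API is an acceptable stand-in for an article-level metric, and the threshold value is entirely arbitrary; treat this as an illustration, not a recommendation.

    import json
    import urllib.request

    def citation_count(doi):
        """Look up how many times an article has been cited, using the
        Crossref REST API's 'is-referenced-by-count' field."""
        url = "https://api.crossref.org/works/" + doi
        with urllib.request.urlopen(url) as response:
            record = json.load(response)
        return record["message"].get("is-referenced-by-count", 0)

    # Hypothetical assignment rule: keep only articles cited at least 5 times.
    MIN_CITATIONS = 5
    dois = ["10.1016/j.acthis.2007.10.006", "10.1186/1475-2875-9-284"]
    usable = [d for d in dois if citation_count(d) >= MIN_CITATIONS]
    print(usable)

Of course, a raw citation count says nothing about why an article is cited, which is exactly why the education piece matters.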

All of these approaches are limiting.  Hopefully, by the time students reach their senior year, faculty can drop the arbitrary requirements and simply ask for high-quality material, right?

What are some other techniques for evaluating scholarship that undergraduates may have to master as peer review changes?


It isn’t just students: Medical researchers aren’t citing previous work either

One of the things that faculty often complain about is that students don’t adequately track down and cite enough relevant material for their term papers and projects.  This problem isn’t confined to undergraduates.  A study in the January 4, 2011 issue of the Annals of Internal Medicine by Karen Robinson and Steven Goodman finds that medical researchers aren’t doing a very good job of citing previous research either.

Specifically, Robinson and Goodman looked at reports of randomized, controlled trials to determine whether the authors cited previous, related trials.  Citing previous trials is an important part of putting the results of the current trial in context, and in the case of medicine, may help save lives.

To do this study, the authors used meta-analyses to locate groups of related papers.  They reasoned that if the studies were similar enough to group mathematically, they were similar enough to cite each other.  They allowed for a 1-year gap between an original publication and a citation.
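
To make that bookkeeping concrete, here is a small Python sketch of the calculation (my own illustration of the method as described, not the authors’ code): for each trial, count the prior trials in its group that were old enough to be citable, and check how many actually were cited.

    def citation_coverage(trials):
        """Proportion of eligible prior trials that were actually cited.
        `trials` is a list of dicts: {"id", "year", "cites": set of ids}.
        A prior trial is 'eligible' only if published more than a year
        before the citing report (the 1-year gap described above)."""
        eligible = cited = 0
        for trial in trials:
            for prior in trials:
                if prior["id"] != trial["id"] and prior["year"] <= trial["year"] - 1:
                    eligible += 1
                    if prior["id"] in trial["cites"]:
                        cited += 1
        return cited / eligible if eligible else None

    group = [
        {"id": "A", "year": 2001, "cites": set()},
        {"id": "B", "year": 2004, "cites": {"A"}},
        {"id": "C", "year": 2008, "cites": {"A"}},  # misses B entirely
    ]
    print(citation_coverage(group))  # about 0.67 for this toy example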

Overall, they found that only 25% of relevant papers were actually cited.

Why might a citation not be included?  I can think of a few reasons.

  • The authors couldn’t find the previous study
  • The authors found the previous study but didn’t think it was relevant enough to cite
  • The authors found the study and purposefully excluded it for some nefarious purpose

Robinson and Goodman seem to favor the first explanation most of all:

The obvious remedy – requiring a systematic review of relevant literature [before an RCT is funded] – is hampered by a lack of necessary skills and resources.

This obviously speaks to the importance of information literacy skills in both undergraduates and medical school students.  One of the most troubling findings was Robinson and Goodman’s determination that a very simple PubMed search could locate most of the articles on one of the topics assessed.
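
For a sense of what “a very simple PubMed search” looks like in practice, here is a hedged sketch using NCBI’s E-utilities through Biopython.  The query term is a made-up example; Robinson and Goodman’s actual strategies appear in their appendix.

    from Bio import Entrez

    Entrez.email = "you@example.edu"  # NCBI asks for a contact address

    # Hypothetical topic search, limited to randomized controlled trials.
    handle = Entrez.esearch(
        db="pubmed",
        term="aspirin AND myocardial infarction AND randomized controlled trial[pt]",
        retmax=20,
    )
    result = Entrez.read(handle)
    handle.close()
    print(result["Count"], result["IdList"])  # hit count and PMIDs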

An interesting recommendation that Robinson and Goodman repeat throughout the article is that a description of the search strategy used to locate prior results be included in the final published article (and they follow their own advice in an appendix).

Robinson and Goodman's search strategy to find the meta-analyses used to locate the randomized control trials

Of course, it is hard to believe that this problem is limited to the authors of randomized, controlled trials in biomedicine.  It wouldn’t take much to convince me that this problem exists throughout scholarly work, slowing the speed at which new discoveries are made.  I would bet that the problem gets particularly difficult in interdisciplinary areas.

We need to start with our undergraduates and convince them that it isn’t enough just to find the minimum number of required sources; they need to really get at the heart of previous work on a topic.  This leads naturally into the question of getting students to pick manageable project topics.  Of course, undergraduates like clear guidelines (and for the most part this is good teaching strategy), but upper-level undergraduates should be able to handle the requirement that they find most of the relevant literature on a topic.

Robinson KA, & Goodman SN (2011). A systematic examination of the citation of prior research in reports of randomized, controlled trials. Annals of Internal Medicine, 154(1), 50-55. PMID: 21200038


A librarian in a peer editing session

On Wednesday, I stayed a bit late after work to attend a peer editing session for a class I’ve been working with all semester.  This wasn’t in our original plan, but a few weeks ago it made sense that I could offer students some assistance with their citations during the same session where they were reviewing their peers’ writing.  We didn’t have a well-thought-out plan for my participation, but decided to give it a try.

It was an incredibly good use of my time.

Wikipedian Protester. From the wonderful web comic xkcd

I had previously reviewed the mechanics of citation with the students, as well as discussed some best practices of in-text citation:

  • When to cite
  • What doesn’t need to be cited (and what the lack of a citation implies)
  • How to use authors’ names as the subject of a sentence (and avoid the passive voice)

In the peer editing session, I simply went from group to group looking at papers, making suggestions and answering questions.  Students asked a lot more questions than they would have if they had to seek me out (via email or in my office), and one student’s question would often help out another student.

Overall, I would do it again.  In classes where I know they do peer editing, I may volunteer to come in and help with citation related questions.

Using the ‘arsenic bacteria’ story as a teaching moment for undergraduates

The bacteria at the center of the debate

Background: NASA-funded scientists published an article in Science about bacteria using arsenic instead of phosphate in their DNA.  NASA held a press conference to promote the findings and comment on the importance of this discovery.  After reading the article, many scientists were not convinced that the discovery was as important as the authors were claiming, nor were they convinced that some of the methodology was sound.  And many of these scientists shared their doubts with the general public via blog posts, blog comments, Twitter, and other informal venues.  The NASA scientists fired back, saying that the scientific debate should happen through the formal process of peer review and publication.  Bloggers and science journalists responded by pointing out that the NASA scientists were the ones who had held a press conference.  Carl Zimmer’s articles in Slate and in the Discover blog The Loom outline the issues nicely, and Ed Yong has a wonderful timeline of how the story unfolded.

This story provides a unique teaching opportunity for faculty and librarians to discuss the issues of peer review and scientific communication with undergraduate students.

First, you have scientists on record saying that, basically, the peer review system didn’t work as well as we’d like.  These scientists are saying that the methods used were not as rigorous as they should have been.  In addition, many folks are arguing that what the scientists actually discovered isn’t nearly as important as the hype surrounding it makes it seem.

An in class discussion about this issue could center around several things:

  • Thinking critically about the methods – scientific criticisms of the article
  • Evaluating the importance of a new discovery – how good is peer review (or any other method of review) at evaluating this?
  • How are scientific discoveries represented in the media – do the stories about the science match up nicely with the science itself?  Students could analyze media stories from mainstream outlets as well as the original article.

Second, you have the controversy about where scientific debate should take place.  Some scientists see little value in the scientific blogosphere.  Many others (including myself) view it as a vital part of the communication between scientists and the general public.  In addition, bloggers’ comments have led to the retraction of at least one article (that I know of) in a highly respected journal.

An in class discussion about these issues could center around several themes:

  • What type of responsibility does a scientist have to communicate his/her discoveries with the general public?
  • How do less formal communication models (press conferences, blogs, etc.) interact with and relate to the formal communication process within science (peer reviewed articles)?
  • Who has a right to comment on a scientific article?  When should the authors respond to these comments directly?

The amount of information on this story available on the web is quite large, providing lots of opportunities for students to search for and find various opinions.  In addition, the science is multidisciplinary, allowing an opportunity for many courses to engage in these discussions.

The hidden landscape of scholarly publishing

Was the scholarly publishing landscape easier to understand when everything was in print? Image from Flickr user diylibrarian

Students tend to assume that all the information they need for a project (perhaps other than print books) is available freely online.  They may have rough ideas that some journals cost money (like magazine subscriptions) but I’m guessing that most students have a simplistic and rather naive concept of how they have access to information (I’d love to see some data on this).

Are we doing students a disservice by not making the details of the scholarly information landscape more prominent?

Libraries and information providers have worked hard to make much of this landscape invisible to the end user (including faculty).  If the student is on campus, many journal articles may appear “free” through a complex combination of IP authentication, proxy servers, and other behind-the-scenes technology.
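
As one small illustration of that behind-the-scenes work: many libraries rewrite article links through a proxy server so that off-campus users authenticate before being forwarded on.  The sketch below uses the common EZproxy “login?url=” pattern; the hostname is hypothetical.

    from urllib.parse import quote

    PROXY_PREFIX = "https://proxy.example.edu/login?url="  # hypothetical host

    def proxied(article_url):
        """Wrap a publisher URL in the library's proxy login URL so the
        user is authenticated before being redirected to the article."""
        return PROXY_PREFIX + quote(article_url, safe=":/?&=")

    print(proxied("http://dx.doi.org/10.1186/1475-2875-9-284"))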

When we teach students how to access information, we encourage them to use library databases, touting their scholarliness and focus.  But when users can access articles in JSTOR and ScienceDirect through a Google or Google Scholar search, the advantages of the paid databases are diminished.

We talk about journals, but we don’t talk about how we have access to them: free, direct from the publisher, in aggregators, etc.  We talk about ILL, but we rarely mention how they may find a copy of the paper archived on a website – students can discover this for themselves and then wonder if we really know what we are talking about.

We teach them about brainstorming keywords, narrowing or broadening their search as needed and identifying the types of information they may need.

But would it also be useful to them if they understood the nature of the scholarly information landscape?  Would it be easier for them to track down a copy of an article if they knew the possible ways that they might have access to it (OA vs. subscription, direct publisher subscription vs. aggregators, final copy-edited version vs. post-print)?
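
Those access routes can even be checked programmatically.  As a sketch, the Unpaywall API (a service newer than this post; the parameters below are illustrative) reports whether an open-access copy of a given DOI exists and where it lives.

    import json
    import urllib.request

    def oa_status(doi, email="you@example.edu"):
        """Ask the Unpaywall API whether an open-access copy of this DOI
        exists.  Unpaywall requires a contact email as a parameter."""
        url = f"https://api.unpaywall.org/v2/{doi}?email={email}"
        with urllib.request.urlopen(url) as response:
            record = json.load(response)
        location = record.get("best_oa_location") or {}
        return record.get("is_oa", False), location.get("url")

    print(oa_status("10.1186/1475-2875-9-284"))  # a BMC article, so likely OA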

I’m starting to think that we need to start introducing some of these concepts to students as freshmen, then build on them at more advanced levels.  I’m just not exactly sure how to do this at the moment.

What is a DOI? Just the basics

Most of the students (and some of the faculty) I work with have no idea what a DOI is or why they should care.  This is what I tell them.

A DOI – Digital Object Identifier – is like a social security number for a journal article.  DOIs can be applied to other digital items as well, but you are most likely to encounter them on scholarly articles.

A DOI normally consists of numbers, letters, and other punctuation.  It will look like this:

10.1016/j.acthis.2007.10.006

10.1186/1475-2875-9-284
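
For the curious, a rough regular expression can spot strings like these.  The pattern below is approximate (DOI suffixes are only loosely defined, so no pattern is perfect):

    import re

    # "10.", a 4-9 digit registrant code, a slash, then a suffix.
    DOI_PATTERN = re.compile(r'\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+\b')

    text = "See 10.1016/j.acthis.2007.10.006 and 10.1186/1475-2875-9-284."
    print(DOI_PATTERN.findall(text))  # finds both DOIs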

The DOI provides a way to permanently find a particular item.  Publishers and scholarly societies change their websites all the time.  Recently, a major publisher completely redid its website, breaking all inbound links.  I was quite annoyed.  But a DOI can still link you to an article in a way that a plain URL can’t.

Incidentally, you can use the DOI to create a nice, neat, compact URL for a journal article (instead of those really long URLs provided by some databases).  You just need to add a little bit to the front of the DOI:

http://dx.doi.org/DOI
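
In code, that prefixing step is a one-liner.  Here is a tiny sketch that builds the compact link and then follows the redirect to see where the DOI currently resolves (standard library only):

    import urllib.request

    def doi_to_url(doi):
        """Build the compact, permanent link for a DOI."""
        return "http://dx.doi.org/" + doi

    link = doi_to_url("10.1186/1475-2875-9-284")
    print(link)

    # Follow the redirect chain to the publisher page the DOI resolves to.
    with urllib.request.urlopen(link) as response:
        print(response.geturl())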

You can also use the DOI to quickly look up an article from your library’s homepage or this webpage.

To get a DOI, a publisher registers with a non-profit organization called CrossRef, which works with the publisher to assign each item a unique identifier.

Increasingly, journals and citation styles are requiring authors to include DOIs in article citations where available.

For additional (and much more technical) information about DOIs, see the DOI website or the Wikipedia article about DOIs.

Strategic Searching of the Geoscience Literature – GSA Presentation

This afternoon I will be standing in front of my poster at the Geological Society of America Annual Meeting:  “Fueling Learning Outside the Classroom with Strategic Searching of the Geoscience Literature”.   Stop by if you are in the area.

The poster is part of the Geoscience Education session “Learning Outside the Geoscience Classroom: Engaging Students Beyond the Lecture and Laboratory Setting.”

In this poster, we present a plan for information literacy instruction in mid-level geology courses, along with some concrete ideas for how geology professors can use these strategies in their classes whether or not they have a science librarian available to collaborate with.

Associated information: