Designing the survey seems like the easy part

Right now I am neck deep in information literacy assessment survey results.

The instruction librarians and I spent a lot of time devising our instructional goals and objectives and then developing the assessment tool (for a one-shot information literacy session in a first-year writing course). It took many meetings and a great deal of collaborative work.

Since the analysis of the results falls largely on me, however, right now the survey design seems like the easy part.

The assessment tool we designed has many short-answer questions, which take real thought to ‘grade’ effectively. We strongly feel that these questions provide a more accurate picture of student understanding, but they can be tricky to analyze.

For example, we asked the students how they can tell the difference between a scholarly and a popular source.  I need to figure out how to mark this answer from one of our students:

it will say peer reviewed

Completely correct?  Somewhat correct? Not correct?

So I will spend the next few days (weeks?) trying to figure out how to condense all of this information into a nice neat package.
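The judgment calls are the hard part; the bookkeeping afterward is simple. Just to illustrate what I mean by condensing, here is a minimal Python sketch of tallying responses once they have been hand-coded (the rubric codes and responses below are invented for illustration, not our actual data):

```python
from collections import Counter

# Hypothetical rubric: 2 = completely correct, 1 = somewhat correct, 0 = not correct
LABELS = {2: "completely correct", 1: "somewhat correct", 0: "not correct"}

# Invented examples; each response has already been coded by hand
coded_responses = [
    ("it will say peer reviewed", 1),
    ("scholarly sources are peer reviewed and cite their evidence", 2),
    ("popular sources are shorter", 0),
]

tally = Counter(code for _, code in coded_responses)
total = len(coded_responses)

for code in sorted(tally, reverse=True):
    print(f"{LABELS[code]}: {tally[code]} ({100 * tally[code] / total:.0f}%)")
```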

We’ll see how it goes.


Assessing Information Literacy Skills in First Year Students

A new open access journal, Communications in Information Literacy, recently published an article about assessing library instruction for first year students.  The paper caught my eye because I’m working on some similar things here at Geneseo.
The study sought to determine whether students’ information literacy skills and confidence with research improved more with a greater number of librarian-led information literacy sessions. The author used a pre-test and a post-test to examine students’ attitudes and stated behaviors, with Likert-style questions assessing students’ previous use of information sources and their confidence with various information-related tasks. One group of students received the typical one-shot information literacy session in a first-year writing and critical thinking class; another group received two or three information literacy sessions over the course of the semester.
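The paper reports its own analysis, of course, but for readers unfamiliar with pre-/post-test designs, the core comparison boils down to something like this sketch (the items and scores here are invented, not the author’s data):

```python
# Invented 5-point Likert scores (1 = not at all confident, 5 = very confident)
pre = {
    "finding articles in a database": [2, 3, 2, 4, 3],
    "evaluating a source": [3, 3, 2, 3, 4],
}
post = {
    "finding articles in a database": [4, 4, 3, 5, 4],
    "evaluating a source": [3, 4, 3, 4, 4],
}

def mean(scores):
    return sum(scores) / len(scores)

# Mean confidence per item, before and after instruction
for item in pre:
    change = mean(post[item]) - mean(pre[item])
    print(f"{item}: {mean(pre[item]):.1f} -> {mean(post[item]):.1f} ({change:+.1f})")
```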

The author is very clear about outlining the challenges we all face in trying to assess information literacy instruction.  Most notably, it is almost impossible to control for the wide variety of variables that have an impact on student information literacy skills:

  • Prior information literacy instruction in high school or other venues
  • Prior practice doing scholarly research
  • Student intelligence and creativity
  • Opportunity to practice skills learned in an information literacy session (and differences in the assignment requirements)
  • Differences in scholarship between various disciplines

Some information related to the factors listed above is relatively easy to obtain (although perhaps not so easy to quantify). Course faculty can be a source of information about assignment requirements, and they set the expectations that determine how much students practice information literacy and research skills.

On the other hand, getting information about prior instruction and practice normally relies on students’ self-reporting, which is not always accurate.

In addition to the Likert-style attitudinal questions, the author analyzed student bibliographies. She looked at the different types of sources used, and whether they were available through the library or through other sources.

The latter question is challenging. A student on the campus network typically doesn’t need to go through a library database to reach the full text of articles, so they could easily have used one of many search engines without even realizing they were drawing on library resources. On the other hand, articles found through library databases but supplied via Interlibrary Loan would not count as library sources. We emphasize ILL at my institution, however, so perhaps it isn’t used as much elsewhere.
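If I ever tried to automate part of this, a first pass might look like the sketch below: tag each citation by type and flag whether the journal is in our holdings. Everything here (the citations, the holdings list, the rule of thumb) is invented for illustration:

```python
# Invented sample citations, already parsed into (source_type, journal) pairs
citations = [
    ("journal article", "College & Research Libraries"),
    ("journal article", "Journal of Obscure Studies"),
    ("book", None),
    ("website", None),
]

# Hypothetical set of journals our library holds (print or electronic)
library_holdings = {"College & Research Libraries"}

for source_type, journal in citations:
    if source_type != "journal article":
        availability = "n/a"
    elif journal in library_holdings:
        availability = "library-held"
    else:
        availability = "not held (ILL?)"  # found via a database, supplied by ILL?
    print(f"{source_type:15} {journal or '-':40} {availability}")
```

Even then, a journal being in our holdings doesn’t tell you whether the student actually reached it through the library, which is exactly the problem.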

All of this raises the question: are the information literacy sessions we teach an effective way of teaching students research skills?

The author of this paper concludes that there is some positive benefit to the increased number of information literacy sessions, although the data seem a bit more mixed to me.

I wish that the author had actually tested students’ research skills. Skills may be much more difficult to evaluate, but student confidence does not necessarily correlate with student ability.

Julie K. Gilbert (2009). Using assessment data to investigate library instruction for first year students. Communications in Information Literacy, 3(2), 181–192.

Assessment without review, analysis and change is a waste of everyone’s time

Today I’ve been thinking about assessment:

  • I created a short survey to assess student learning after a one-shot library instruction session.
  • I compiled student bibliographies from Fall 2009 courses I’ve worked with, in the hopes of analyzing what these students actually did.
  • I’ve been thinking about how to effectively assess the information skills students (should have!) acquired during a Spring 2010 course I met with on 5 different occasions.
  • I made some final edits to a very brief survey of user satisfaction at the reference and circulation desks (modeled after Miller, 2008).
[Image: a Scantron sheet, courtesy of Flickr user MforMarcus. Hopefully we don’t try to assess our students to the point of exhaustion!]

I’m in the process of collecting a lot of data about how well I do my job.

What’s the next step?

If I just collect this data and report on it without making any changes, I have probably wasted everyone’s time.  It is unlikely that the assessments will indicate that I am doing everything perfectly.  The goal of assessing service, student learning, user satisfaction, etc. is to make these things better.

What kinds of changes can you make?

  • Change your focus – In some classes I realized that students had a very good understanding of one concept I was trying to teach, but a poor grasp of another. I was able to shift the focus of my instruction toward the concept they were struggling with.
  • Change your goals – In some cases your assessment might reveal that your original goals are out of line with what students need. This happened at my library in the one-shot we taught for the first-year writing class. We were able to re-align our goals with student needs. We’ll see if this helped our students when we do an assessment at the end of this semester.
  • Go back to something that was working better before you made a change – The user satisfaction survey I’m working on right now is being done just prior to some big changes in the reference/circulation/service desks at my library.  We plan to re-do the survey in the Fall and again in Spring 2011.  Perhaps we’ll find that the changes result in a decrease in user satisfaction, although I sure hope not.  It is theoretically possible that we will need to roll back some of the changes we made.

So, anyone have a quick and easy way to analyze student bibliographies?

Assessing Information Literacy Skills


[Image: Searching, originally uploaded by Flickr user mia!]

This year, the librarians at my library worked together to assess the library instruction portion of our freshman writing course.

All freshmen take this writing and critical thinking class, and faculty are required to bring their students in for one 50-minute session on library skills. Most faculty fulfill this requirement.

Last summer, we spent some time revising the goals and objectives for this one 50-minute session. Based on the ACRL information literacy standards, our goals are rather modest: it is difficult to learn very much in 50 minutes. After revising our goals and objectives, we developed a brief test to assess these objectives.

We were able to test some of our incoming freshmen during the first few weeks of their college career. We also have results from students tested at the end of their first semester, and from other students at the end of their first year.
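Comparing the three groups is mostly a matter of computing the percentage of correct answers per objective for each cohort; a sketch of that calculation, with invented numbers (our real results stay in-house), looks like this:

```python
# Invented counts: (number correct, number tested) per objective per cohort
results = {
    "distinguish scholarly vs. popular sources": {
        "incoming": (12, 40), "end of semester": (20, 38), "end of year": (25, 36),
    },
    "interpret an OPAC record": {
        "incoming": (30, 40), "end of semester": (32, 38), "end of year": (33, 36),
    },
}

for objective, cohorts in results.items():
    print(objective)
    for cohort, (correct, tested) in cohorts.items():
        print(f"  {cohort:16} {100 * correct / tested:5.1f}% correct")
```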

The results are in, and I have spent some time analyzing them. After sharing the results with the librarians, we will meet again to decide if we need to revise our original goals and objectives. To me, this is the most important part of the assessment process. Good assessment requires you to go back and look at your original goals. Have you met your goals? If so, do they need to change? If not, what can you do to achieve those goals? Simply collecting data without re-examining the original goals is a waste of everyone’s time.

So have the students met our goals?  Well, mostly.

  • Most students continue to think that our OPAC contains journal articles
  • They can’t seem to tell the difference between a book review and an article, but at least the book reviews they find are on-topic, and more students can successfully find something at the end of the year than at the beginning
  • Students can easily interpret records in our OPAC, but aren’t as good at evaluating a results list, although this improves with time
  • Worryingly (since I’m the library webmaster), students can’t seem to find our resources-by-subject lists, either at the beginning of the year or at the end.