Sunday, April 19, 2015

Determining the Value of Information Literacy Instruction in the First Year: March 27

This session presented a variety of institutional perspectives on efforts to measure the value of information literacy instruction for students in first-year programs or seminars. The presenters represented schools that had been selected to be among the first Assessment in Action (AiA) teams for 2013-14. The AiA program, a three-year grant-funded project, is part of ACRL’s Value of Academic Libraries initiative. A lot of interesting results are coming out of this initiative, so I was particularly keen to hear some first-hand reports.

The four institutions on this panel were: University of Wisconsin - Eau Claire, a large master’s-granting public university; Rockhurst University, a small private Catholic master’s-granting university; University of Redlands, a medium-sized private master’s-granting university; and the Claremont Colleges Library, which is shared among Claremont McKenna College, Harvey Mudd, Pitzer, Pomona, and Scripps colleges (all of which are small, selective private liberal arts colleges). Although each institution was evaluating the same sort of student coursework, each took a different methodological approach.

Rockhurst University measured the impact of exposure to face-to-face (F2F) and online instruction (one-shots and webchat) on the attitudes and library-resource use of students in their introductory English composition courses. Using pre- and post-tests, they reported increases of 12% in student satisfaction with instruction, 14% in students using reference help, and 38% in student use of the library website and databases. However, since there was a significant drop-off in participation in the post-test, these data raise more questions than they answer.

Similarly, the University of Redlands, although employing a longitudinal methodology, used pre- and post-tests to look at students’ self-identified competency levels with a number of IL skills, including source citation, evaluation, and search strategies. They broke these results down by course level (freshman seminars through upper-level courses) and level of interaction - i.e., no formal instruction, a traditional one-shot, or multiple formal instruction sessions. Their results did show post-test reporting of better understanding of certain IL concepts, both overall and by frequency of instruction, but they also suffered from a drop-off in response rate (95% for the pre-test to 34.6% for the post-test).

The most interesting studies of this group, however, were those of the Claremont Colleges Library and the University of Wisconsin - Eau Claire, due mainly to their greater rigor and their use of a common rubric.

Claremont’s study investigated what impact librarian intervention in first-year courses has on information literacy performance in student work. They used a version of a rubric developed at Carleton College, which assesses three “habits of mind” in student writing: Attribution, Evaluation of Sources, and Communication of Evidence. Its four evaluation levels are: 1 - initial; 2 - emerging; 3 - developed; and 4 - highly developed.

Using this rubric, they assessed over 500 student papers from first-year courses at all five of the Claremont colleges, with librarian graders using a rigorous norming process to ensure reliability. Librarian impact was measured on a “Librarian Course Engagement Level” scale (1 = lowest, 4 = highest) reported by the librarians. Engagement was defined as one or more instances of IL instruction, course guide creation, online IL tutorials/quizzes, or collaboration with faculty on syllabus or assignment design. Their overall results showed that, across all three criteria, students with higher levels of librarian engagement scored higher on the rubric.

At the University of Wisconsin - Eau Claire, the librarians have been teaching a standard lesson plan developed through a multi-year collaboration between library and English faculty, and they wanted to assess the impact of library instruction on students conducting research in the first-year composition course. They also used a version of the Carleton College rubric to evaluate 200 student papers from these classes, all of which received library instruction from a UW librarian. They too engaged in a norming process, and their overall results showed students scoring about 2.6 out of 4 (somewhere between “emerging” and “developed”) across all three rubric criteria. They also found that students who identified the course goal of information literacy as having been achieved scored an average of 3.6 out of 4, which they attribute to a metacognitive effect of information literacy instruction.

This was certainly one of the most valuable sessions I attended, because I was able not only to learn from these librarians’ experiences but also to come away with some concrete ideas for doing research at my own institution.
