Sunday, April 19, 2015

Determining the Value of Information Literacy Instruction in the First Year: March 27

This session presented a variety of institutional perspectives on assessment efforts aimed at measuring the value of information literacy instruction for students in first-year programs or seminars. The presenters represented schools that had been selected to be among the first Assessment in Action (AiA) teams for 2013-14. The AiA program, a three-year grant-funded project, is part of ACRL’s Value of Academic Libraries initiative. A lot of interesting results are coming out of this initiative, so I was particularly keen to hear some first-hand reports.

The four institutions on this panel were: University of Wisconsin - Eau Claire, a large master's-granting public university; Rockhurst University, a small private Catholic master's-granting university; University of Redlands, a medium-sized private master's-granting university; and the Claremont Colleges Library, which is shared among Claremont McKenna College, Harvey Mudd, Pitzer, Pomona, and Scripps colleges (all of which are small, selective private liberal arts colleges). Although each institution was evaluating the same sort of student coursework, each took a different methodological approach.

Rockhurst University measured the impact of exposure to F2F and online instruction (one-shots and webchat) on students’ attitudes toward and use of library resources in their introductory English composition courses. Using pre- and post-tests, they reported increases of 12% in student satisfaction with instruction, 14% in students using reference help, and 38% in student use of the library website and databases. However, since there was a significant drop-off in participation in the post-test, these data raise more questions than they answer.

Similarly, the University of Redlands, although employing a longitudinal methodology, used pre- and post-tests to look at students’ self-identified competency levels with a number of IL skills, including source citation, evaluation, and search strategies. They broke these results down by course level (freshman seminars through upper-level courses) and level of interaction - i.e., no formal instruction, a traditional one-shot, or multiple formal instruction sessions. Their results did show post-test reporting of better understanding of certain IL concepts overall, and by frequency of instruction, but they also suffered from a drop-off in response rate (95% for the pre-test to 34.6% for the post-test).

The most interesting studies of this group, however, were those of the Claremont Colleges Library and the University of Wisconsin - Eau Claire, due mainly to their greater rigor and their use of a common rubric.

Claremont’s study investigated the question of what impact librarian intervention in first year courses has on information literacy performance in student work. They used a version of a rubric developed at Carleton College, which assesses three “habits of mind” in student writing: Attribution, Evaluation of Sources, and Communication of Evidence. Its four evaluation levels are: 1-initial; 2-emerging; 3-developed; and 4-highly developed.

Using this rubric, they assessed over 500 student papers from first-year courses at all five of the Claremont colleges, with librarian graders using a rigorous norming method to ensure reliability. Librarian impact was measured using a “Librarian Course Engagement Level” scale (1=lowest, 4=highest) reported by librarians. Librarian engagement was defined as one or more instances of IL instruction, course guide creation, online IL tutorials / quizzes, or collaboration with faculty on syllabus or assignment design. Their overall results showed that across all three criteria, students who had higher levels of librarian engagement scored higher on the rubric.
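As a rough sketch of how that kind of analysis could be tabulated - using entirely hypothetical sample data, not Claremont's actual dataset - one can average each paper's three rubric criteria and group the results by engagement level:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sample rows: (engagement_level 1-4, attribution, evaluation, communication)
# Each rubric criterion is scored 1 (initial) to 4 (highly developed).
papers = [
    (1, 2, 1, 2), (1, 2, 2, 1),
    (2, 2, 2, 2), (2, 3, 2, 2),
    (3, 3, 3, 2), (3, 3, 3, 3),
    (4, 4, 3, 3), (4, 3, 4, 4),
]

# Group each paper's overall score (mean of the three criteria) by engagement level
by_level = defaultdict(list)
for level, attribution, evaluation, communication in papers:
    by_level[level].append(mean([attribution, evaluation, communication]))

for level in sorted(by_level):
    print(f"Engagement level {level}: mean rubric score {mean(by_level[level]):.2f}")
```

With made-up numbers like these, the output simply illustrates the pattern the presenters reported: mean rubric scores rising with librarian engagement level.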




At the University of Wisconsin - Eau Claire, the librarians have been teaching a standard lesson plan developed through a multi-year collaboration between library and English faculty, and they wanted to assess the impact of library instruction on students conducting research in their first-year composition course. They too used a version of the Carleton College rubric, evaluating 200 student papers from these classes - all of which received library instruction from a UW librarian. After a similar norming process, their overall results showed students scoring about 2.6 out of 4 (somewhere between “emerging” and “developed”) across all three rubric criteria. They also showed that students who identified the course goal of information literacy as having been achieved scored an average of 3.6 out of 4, which they attribute to a metacognitive effect of information literacy instruction.



This certainly was one of the most valuable sessions that I attended, because I was not only able to learn from these librarians’ experiences, but also to come away with some concrete ideas for doing research at my own institution.

Tuesday, April 7, 2015

Reaching Every Student - mandatory info lit courses: March 26



I went to this session in the hope that hearing specifics about campus-wide information literacy courses from a variety of institution types would be instructive as to how we might reorganize our own credit-bearing information literacy course. And, while I did not come away empty-handed, it confirmed my view that there is no magic formula for these types of efforts.

There were librarians from four institutions: The College of New Jersey, a medium-sized public liberal arts college; Indiana University South Bend, a medium-sized master's-granting public university; University of Maryland University College, a large master's-granting public university; and ASA College, a medium-sized associate-granting for-profit college. The presentations were framed in terms of three themes: 1) institutional dynamics & faculty buy-in; 2) limitations & challenges; and 3) benefits.

None of these institutions are truly comparable to my own college. But, because the University of Maryland course focused on both graduate and undergraduate students, and ASA is largely a technical school, I chose to concentrate more closely on the presentations from Indiana University and The College of New Jersey (TCNJ). 

At Indiana University, about eight librarians and as many adjuncts teach their hybrid, credit-bearing course to 1,200 undergraduates annually. The course is taught using a combination of the LMS and LibGuides. Students have pre- and post-tests, weekly assignments, and a final project of an annotated bibliography, and they receive a letter grade. At TCNJ, one full-time librarian teaches their online, non-credit course to 2,200 undergraduates annually; students take six multiple-choice tests and receive a pass / fail grade. For both institutions, the courses originated in institution-wide Gen Ed or curriculum revisions. The similarities, however, appear to end there.

At TCNJ, their online course is self-paced (or binge-ready, as the presenter noted) and taught through multimedia and practice tests. They note that they were able to get institutional and faculty buy-in due to their faculty partnerships and strategic placement on college-wide committees, but they acknowledge the perennial challenges of student motivation (given perceived vs. actual IL skills) and the lack of evaluation of authentic intellectual outputs. Nevertheless, they identified some of the benefits as making the library more connected college-wide, and having a course that lays a foundation for more advanced research skills.

At Indiana University, however, the course is taught both F2F and online, and includes regular homework, research assignments, and quizzes. And, because they are using so many librarians and adjuncts to teach the courses, they are able to devote more time to it. They estimate that it takes an average of 5 to 6 hours per section per week for grading, student contact, and course delivery. They also spend time in the summers doing updates, link-checking, and course revisions. They noted that what they “gave up” to teach these courses was double-staffing at the reference desk and instruction for 100-level courses.

While it is good to hear that librarians were able to draw on their relationships with faculty and administration to use these courses as a means to focus more directly on information literacy, it does seem that attempts to do more in-depth teaching and authentic assessment come at a price.  

Saturday, April 4, 2015

Out & About in Portland: March 25, 26, & 27

Portland is a beautiful city, full of interesting people and places. It also has an excellent and affordable public transport system - TriMet, including the MAX light rail - which, along with the four-day pass that ACRL gave to conference attendees, enabled me to get out and about while I was there.



Here are a few scenes of the city.

View of downtown Portland from the Convention Center


Portland's light rail system


The famous Portland food trucks
Multnomah County Central Library - dates from 1912

Powell's Books

Powell's Books



Chinese Gardens
Doc Martens Store

Poster Sessions: March 26 and 27

There were about 200 posters in the poster sessions. This was a very crowded scene on both days, with hundreds of librarians perusing the posters, chatting with presenters, and taking photos, all while balancing small plates of food and various beverages. I was able to photograph several that looked interesting, but only chatted briefly with a couple of the presenters.

Probably the most interesting to me were those that dealt in some way with either instruction or assessment. The two that I spent the most time studying were from Gustavus Adolphus College and the University of Florida.




At Gustavus Adolphus College, the library received a grant from the Mansergh-Stuessy Fund for College Innovation to explore threshold concepts in undergraduate research. This was of great interest to me, as threshold concepts are so central to the new Framework for Information Literacy. Two librarians at Gustavus Adolphus held discussions and workshops with faculty from various disciplines to identify threshold concepts common to all disciplines, and the best ways to prompt students to engage with those concepts as part of their undergraduate research experience. They were particularly inspired by the theoretical work of Townsend, Brunetti & Hofer. Interestingly, when asked to articulate their conception of the most important threshold concepts, their faculty identified the following definitions, which map closely to the final Information Literacy Framework:

Research is a recursive process.
Information needs to be organized - how it is organized makes a difference.
Knowledge is social, collaborative, and influenced by economic and social contexts. 
Students need to realize that they have something to say when they do research.

They report that the faculty found the cross-disciplinary discussions helpful both for understanding disciplinary perspectives and for thinking about approaches to undergraduate education more broadly.




At UF, the library is using a grant from University Student Technology Fee funds to transform three different learning spaces into new environments that are more flexible and engaging. They wanted to reconceptualize these learning spaces to support mobile and interactive technology and learning while utilizing existing space. They looked at technologies and configurations at various other universities - especially those utilizing iPads, portable Smartboard projectors, Smart tables, networked interactive whiteboards, etc.

They envisage learning spaces transformed to create an untethered classroom environment, incorporating mobile educational technologies that emphasize student-centered, more informal, and collaborative venues. They are currently in the process of implementing these changes, which will culminate with installation, pilot testing, and assessment in fall 2015.

Wednesday, April 1, 2015

ACRL - Leaving the One-Shot Behind: March 26


Two librarians from very different institutions (Keene State College, NH and Portland State University, OR) discussed their experiences of addressing the problem of teaching information literacy without overextending themselves or their resources. Referencing the research (Orr & Wallin, Badke, etc.), they acknowledged the perennial disconnects among one-shot training and authentic learning, librarian and faculty perceptions and expectations, and student perceptions and needs. Alternatives to the one-shot, they admit, such as embedding individual librarians in specific classes, are time-consuming and ultimately not sustainable.

They discussed alternative models of providing information literacy that have been put into practice at both of their institutions: student-to-student (peer learning & mentoring), library DIY, and train-the-trainer.




Portland State (an urban public research university) leveraged their year-long freshman inquiry program to develop peer mentors (juniors and seniors) who conduct orientations in which they model good research practices. The librarians created content and learning objects in LibGuides for the mentors to follow - a kind of tool kit. They also held training sessions for student mentors and advanced design workshops for faculty assigned to teach the freshman inquiry classes, with librarians assigned to keep in touch with both groups. The tool kit was later used as the basis of research tutorials that were embedded into an online Gen Ed course pilot - evidently a happy serendipity with a several-million-dollar university initiative.

At Keene (a small public liberal arts college), the librarians were frustrated with the limits of their instruction in a couple of lower-level Gen Ed courses, and with the lack of recognition of the library’s attempts to map IL across the curriculum. Following the college’s official focus on integrative learning, and inspired by peer mentoring models, they set up a program that allows students employed by the library or other college departments to serve as student mentors, holding one-shot sessions, conducting reference interviews, and working at the information desk.

I have to say that while the idea of getting away from the tyranny of the one-shot is appealing, I’m not sure that all of these models are transferable to a community college context - particularly the peer mentoring model. However, the library DIY and train-the-trainer concepts do seem transferable, and could be a way to involve faculty more effectively. We already have the building blocks for these things (LibGuides, the LMS); it only remains to start a conversation about collaborating on such initiatives.