Post-CARL Review, Pt. 1

Posted on 19 April 2010 at 10:25 pm in Events.

I had a great time at the CARL Conference, held April 8-10 in Sacramento. There was a lot to digest, in terms of insight and inspiration, and I’m hoping to distill at least a little of that wisdom here. The presentations and speeches covered a wide range of topics, but a couple of subject areas jump out as worthy of further discussion. My next couple of posts on this blog will tackle these. First up: Library Instruction/Information Literacy.

About ten of the twenty-four official discussion sessions were focused on information literacy and instruction. Of these, I attended several and came away with a lot of good ideas and an appreciation for the serious research going on in the field right now.

Longitudinal Research

Three staff librarians from Cal State Long Beach presented their ongoing, six-year research endeavor to determine the effectiveness of their library instruction program. Their presentation, “Are They Getting It: Seeking Evidence of Students’ Research Behavior Over Time,” described the grant-funded project from its inception to its current state, two years into the study.

I’m impressed with the depth of their research. They started by developing a large sample of freshman students with the intention of following them throughout their education. Their analysis of the students’ research skills extends to studying the students’ research paper bibliographies for source and citation quality. This sort of extensive, longitudinal study is difficult to implement and ties up a lot of staff time and resources, but the results — sure to be published — will be of use to universities and colleges throughout California and beyond (which is why CARL was the main grant-giving body behind the project). It’s easy to suppose how and why information literacy programs succeed or fail; it’s another thing to really study what’s happening in a quantitative manner.

One interesting takeaway from their presentation was their use of a statistician to analyze the substantial data their surveys were generating. You can’t merely collect information — it needs to be analyzed in a meaningful way. Sometimes it is best to bring in an outside expert instead of relying on in-house staff. Their statistician was able to model their data in several dimensions and changed their whole perspective on the information they had gathered — and saved the time and energy of the librarians themselves.

The presenters — Susan Jackson, Karin Griffin, and Carol Perruso, all of CSULB — also provided extensive survey details in the form of handouts, including a timeline, survey questions, and project budget. While the survey will run for several years yet, I’m looking forward to their eventual results and what they will teach us about what works, what doesn’t, and how research behavior is evolving.

Working With a Campus Assessment Coordinator

Another example of using outside expertise came from the presentation “Upstairs-Downstairs: Working with a Campus Assessment Coordinator and Other Allies for Effective Information Literacy Assessment” by Golden Gate University librarians Amy Hofer and Margot Hanson. In this case, their outsider was really an insider: the existing GGU Campus Assessment Coordinator. Still, they were reaching beyond the library staff to work with someone who had campus-wide responsibility and, more importantly, an understanding of program assessment.

According to their presentation, the advice and administrative approval they got from working with the Campus Assessment Coordinator were essential to the success of their program study, which involved the startup of a new, embedded library instruction program that moved away from “one shot” instructional sessions in favor of an ongoing, semester-long engagement with a class. Their Assessment Coordinator started by asking what a successful program would actually look like and who the audience for the study was, and suggested using a control group to put the study’s findings in context. They also devised measures to test discernible improvement in actual information use, rather than relying on the students’ self-assessment of their own information literacy (in the form of traditional satisfaction surveys).

Hofer and Hanson narrowed their research by focusing on a specific segment of the GGU student body: the PLUS program, a special program for foreign-born students developing their English-language research skills. Golden Gate University has an unusually high proportion of international students due to its emphasis on graduate-level business programs and its location in the heart of downtown San Francisco. They measured student research skills with written tests and an analysis of work performed at the beginning and end of the school term, and saw marked improvement in two of the three categories they measured (the hardest area to improve was the students’ choice of subject, a critical thinking skill that extends beyond the library’s sphere of influence).

The Golden Gate University study was a well-orchestrated example of research that would be easier to implement than CSULB’s expensive, time-consuming longitudinal study, yet would still yield relevant institutional results. More information about this study, including some of the test questions and suggested further reading, is available here.

The Post-Google World

The final presentation I attended at the conference was an informative workshop built around information literacy program curriculum and improving lesson content by reverse-engineering the process: start with the desired results and work backwards to build your lesson plan. Korey Brunetti and Lori Townsend of CSU East Bay were joined by Julian Prentice of Chabot College to lead the session, “Let’s Try This Again: Redefining the Content of Information Literacy for a Post-Google World,” which combined an initial group presentation with a workshop-style open discussion, using Prezi to capture the assembled attendees’ ideas.

There were a few big concepts that emerged:

  • Keep your goals simple — reduce, reduce, reduce superfluous objectives in favor of imparting a few key, simple ideas to your students.
  • Emphasize critical thinking skills across mediums. Ultimately, where a source comes from (open web vs. subscription database vs. government website, etc.) matters less than its quality and verifiability.
  • Understand how contemporary students work, and integrate better tools and critical decision-making into their existing study patterns.

This session’s notes and final “Prezi” will appear in the forthcoming CARL Conference Digital Proceedings.

Take-Home Lessons

I found a number of take-home lessons in this focus on information literacy programs. Beyond the simple opportunity to see how different libraries and universities are pursuing information instruction, it was illuminating to see the value of both long-term and short-term research for improving existing programs, jump-starting expanded programs, and ultimately — and perhaps most importantly — proving the library’s enduring value to campus administrations.

Woven through each of these sessions were also excellent ideas for instructional curricula in the 21st century: how best to capture students’ attention and impart meaningful lessons that will actually improve their research methods.

My next post will cover some of the career-development issues discussed at the conference, including Dr. Peter Hernon’s plenary lecture.

2 Comments

  1. Comment by Amanda G. on April 20, 2010 at 5:37 pm.

Thanks so much for this wrap-up. It is very useful for someone like me who couldn’t go to the conference but wanted to. I would add to your take-home lessons the importance of collaboration with those outside the library – for example, I don’t know if CSULB is working with an on-campus statistics expert, but those who are interested in doing similar projects could work with a social sciences researcher on the faculty at their campuses for statistics expertise. And GGU’s collaboration with the campus assessment coordinator is another good example.
    These sound like interesting sessions! I’m eagerly awaiting part two.

  2. Comment by Daniel Ransom on April 20, 2010 at 8:56 pm.

    In CSULB’s case, they used an outside statistician that they paid out of grant funds.

While I can see definite benefits to working with in-house faculty on that kind of project – it tightens the bonds between faculty and library, which is always important – it might also be hard to get faculty to donate their time. That’s the benefit of paying an outsider: you get their undivided attention.
