Annotated Bibliography on the Evaluation of Library Instruction

Annotated Selected Bibliography on Information Literacy Assessment
Edited by Alice Harrington Wilson (aharrington@monroecc.edu), Monroe Community College
Last updated 12/07
 
 
 
 
Barclay, Donald A. "Evaluating Library Instruction: Doing the Best You Can with What You Have." RQ 33.2 (1993): 195-202.
 

Barclay suggests that assessment of library instruction be kept simple in relation to available time and resources, and that it be directed at what patrons learn rather than at their satisfaction with the program. He emphasizes the value of pre- and post-tests, and his article presents a realistic and practical application of assessment. A sample question from the tools used is included. (Spollen)

 
 
 
Carter, Elizabeth. "Doing the Best You Can with What You Have: Lessons Learned from Outcomes Assessment." The Journal of Academic Librarianship 28.1-2 (2002): 36-41.
 

In his 1993 article, “Evaluating Library Instruction: Doing the Best You Can with What You Have,” Donald Barclay challenged librarians to create statistically valid and reliable assessment instruments, maintaining that it would be impossible to improve the effectiveness of instruction programs without the hard data such tools would provide. Carter illustrates how librarians at The Citadel incorporated Barclay’s advice when developing their assessment program.

 
 
 
Caspers, Jean and Steven Mark Bernhisel. "What do Freshmen Really Know About Research? Assess Before You Teach." Research Strategies 20.4 (2007): 458-468.
 
Caspers and Bernhisel explore options open to librarians seeking to assess the research skills that incoming freshmen bring with them to campus. Their purpose is to find reliable methods to determine “the concepts, skills, and confidence that students bring to the work of acquiring information.” They begin by examining methods previously used to assess students, starting with the “quick show of hands” to determine how many students have completed a research paper in high school. Closely related to that method are the written survey, which asks students questions about previous experiences, and the brief test of basic library skills. Three studies of this last type are cited in which students’ self-reported confidence levels were contradicted by actual testing. For example, at Berkeley, 50% to 71% of respondents rated their research skills as “Excellent” or “Pretty Good”; however, 35.5% to 81% of the same respondents received poor or failing scores on actual tests.
 
Caspers and Bernhisel then developed their own combined test and self-report assessment tool. They found a “tendency for women to self-assess their skills at a similar level to men although the women scored significantly higher on the skills test.” Unsurprisingly, students who did well academically in high school also did well on the library research skills test. The most notable finding of the study is a weak but definite correlation between students’ estimates of their research skills and their actual test results, in contrast to the other studies cited in the article.
 
The difference between Caspers and Bernhisel’s results and prior studies raises the question of whether library assessment instruments simply test knowledge of library-related terminology or truly assess students’ knowledge of actual processes and their ability to think critically about research. It also suggests that researchers may need to take a more careful look at the survey instruments themselves. The differences between men’s and women’s self-reporting, as well as the differences in results among more academically advanced students, might also suggest a need to focus more on demographic factors such as gender, economic background, and academic background when assessing student skill levels.
 
Annotation by Angela Weiler, Onondaga Community College
 
 
 
Choinski, Elizabeth and Michelle Emanuel. "The One-Minute Paper and the One-Hour Class." Reference Services Review 34.1 (2006): 148-155.
 
Choinski and Emanuel, librarians at the University of Mississippi Libraries, discuss the inherent obstacles to assessing one-shot instruction sessions and describe their efforts to produce a workable assessment tool for such sessions. The authors describe their initial research on the topic and give a balanced and thorough review of the existing library literature, acknowledging the apparent preference among other researchers for the pre- and post-test method. They then focus on the one-minute paper, explaining in clear and helpful detail how they developed, used, and scored the papers. This article is a valuable resource, not only because of the limited literature on the one-minute paper but also for its style and easily understood language.
 
Annotation by Lori Annesi, Monroe Community College
 
 
 
Rockman, Ilene and Gordon W. Smith. "Information and Communication Technology Literacy: New Assessments for Higher Education." College & Research Libraries News 66.8 (2005): 587-589.
 
This article describes four instruments developed to measure information and technology literacy skills. The ICT (Information and Communication Technology Literacy Assessment), produced by the Educational Testing Service, appraises students’ abilities to use critical thinking skills in a digital environment and in real-world situations. Project SAILS (Standardized Assessment of Information Literacy Skills) is a test designed to help librarians determine what role information literacy may or may not play in student success and retention. The Bay Area Community Colleges Information Competency Assessment Project measures students’ information competency skills against national and local standards and allows community colleges with an information competency requirement to use it as a credit-by-exam instrument. The International Computer Driving Licence (ICDL) was developed to measure information technology skills but does not cover information literacy skills.
 
This article was written before these instruments had been implemented beyond an initial testing period, so it would be beneficial to examine their websites and the literature for further results and implementations.
 
ICT (as of April 2007, the name has been changed to the “iSkills Assessment”): http://www.ets.org/iskills
Bay Area Community Colleges Information Competency Assessment Project: http://www.topsy.org/ICAP/ICAProject.html
 
Annotation by Jennifer J. Little, SUNY Brockport
 
 
  
Somerville, Mary M., et al. "Toward Large Scale Assessment of Information and Communication Technology Literacy: Implementation Considerations for the ETS Literacy Instrument." Reference Services Review 35.1 (2007): 8-20.
 
This article shares the experiences of two California State University campuses that participated in a beta test of Educational Testing Service’s Information and Communication Technology assessment tool (ETS ICT). The authors also discuss the relatively new “information and communication technology” literacy model, which in some circles has evolved from the traditionally separate assessment of information literacy and computer literacy, and they summarize the development of the ETS ICT literacy assessment instrument.
 
While the article provides useful information for libraries considering the implementation of a large-scale assessment of information, computer, and/or information and communication technology literacy abilities, the implementation hints and tips the authors share would benefit assessment efforts of any scale. These include knowing exactly what the assessment does and does not measure, securing resources and support, marketing the assessment, and ensuring that statistical expertise is available to analyze the results. Of particular interest is the need to consider carefully the timing of the assessment in relation to other campus-wide tests, the academic calendar, and technology, network, and facilities restrictions. The article also provides useful documentation of the extensive effort required to recruit participants.
 
Annotation by John Thomas, Jefferson Community College
 
 
 
Warner, Dorothy A. "Programmatic Assessment: Turning Process into Practice by Teaching for Learning." The Journal of Academic Librarianship 29.3 (2003): 169-176.
 
The 2001-2002 study reported in this article undertakes a three-pronged assessment of student learning as defined by ACRL’s Information Literacy Standards. Assignments completed during a series of four sequential freshman library instruction sessions, along with an upper-class speech assignment, were evaluated.
 
While this model of sequenced library sessions is usually hard to achieve, and some of the resources taught are now quite dated, the assessment tools developed for the study may prove useful for many library instruction programs. The first was a research journal in which students answered questions addressing, for example, use of controlled vocabulary, successful search statements, and citation components. The second, a “reflection tool,” was administered to all librarians and participating teaching faculty; respondents examined the successes and shortcomings of students’ work in process and completed assignments. The third, an “upperclassmen assessment tool,” focused on retention of research skills.
 
The author offers some helpful conclusions and observations. First, retention of skills over time is minimal, so skills should be reinforced in subsequent instruction. Second, librarians should slow down, both in instruction and at the reference desk, so that students can absorb what is being taught, and should not try to cover everything; each session should be limited to three to five objectives. Finally, a close collaborative relationship with teaching faculty is essential for helping students make the connection between information literacy and their academic studies.
 
Annotation by Barbara Shaffer, SUNY Oswego  
 
 
 
Weiler, Angela. "Two-Year College Freshmen and the Internet: Do They Really 'Know All That Stuff'?" portal: Libraries and the Academy 1 (2001): 161-167.
 
This article shares the results of a study of the computer and Internet experience of SUNY Morrisville’s fall 1999 incoming freshmen. The article also provides background information on public school information technology curricula (or the lack thereof), as well as a summary of the relatively scant research on pre-college information technology skills completed before the article was published.
 
Incoming freshmen were surveyed via a questionnaire (response rate = 26%). While the author notes possible problems with the data (students incorrectly including computer experience gained in the early weeks of their college careers, for example), the results provide useful insight into these students’ degree of computer and Internet use. In particular, students were asked about their Internet use patterns, their perceptions of the accuracy of information located via the Internet, and the amount of computer training they had received. The results are presented clearly. Of particular note, 61% of the students surveyed used the Internet “once a week” or less, and 46% had never received any formal computer training.
 
While the study is several years old, observations in this writer’s present environment indicate that many of the same issues persist among freshman students. The article provides valuable information for librarians and other campus members who work with students and may take it for granted that they “know all that stuff.”
 
Annotation by John Thomas, Jefferson Community College
 
 

 


 

 
