Assessment in Action

City University of Seattle: Project Description


Primary Outcome Examined (select one or more)

(No) Student Learning: Assignment

(No) Student Learning: Course

(No) Student Learning: Major

Student Learning: Degree

(No) Student Engagement

Student Success

(No) Academic Intimacy/Rapport

(No) Enrollment

(No) Retention

(No) Completion

(No) Graduation

(No) Articulation

(No) Graduates' Career Success

(No) Testing (e.g., GRE, MCAT, LSAT, CAAP, CLA, MAPP)

(No) Other (please describe)

Primary Library Factor Examined (select one or more)

(No) Instruction

(No) Instruction: Games

(No) Instruction: One Shot

Instruction: Course Embedded

Instruction: Self-Paced Tutorials

(No) Reference

(No) Educational Role (other than reference or instruction)

(No) Space, Physical

(No) Discovery (library resources integrated in institutional web and other information portals)

(No) Discovery (library resource guides)

(No) Discovery (from preferred user starting points)

(No) Collections (quality, depth, diversity, format or currency)

(No) Personnel (number and quality)

(No) Other (please describe)

Student Population (select one or more)



(No) Incoming

(No) Graduating

(No) Pre-College/Developmental/Basic Skills

Other (please describe)


Discipline (select one or more)

(No) Arts

(No) Humanities

(No) Social Sciences

(No) Natural Sciences (i.e., space, earth, life, chemistry or physics)

(No) Formal Sciences (i.e., computer science, logic, mathematics, statistics or systems science)

(No) Professions/Applied Sciences

(No) English Composition

(No) General Education

(No) Information Literacy Credit Course

Other (please describe)

University-level assessment across multiple disciplines

AiA Team Members (select one or more)

Assessment Office

(No) Institutional Research

(No) Teaching Faculty

(No) Writing Center

(No) Information/Academic Technology

(No) Student Affairs

(No) Campus Administrator

Library Administrator

Other Librarian

Other (please describe)



Methods and Tools (select one or more)

(No) Survey

(No) Interviews

(No) Focus Group(s)

(No) Observation

(No) Pre/Post Test


Other (please describe)

Learning Management System data tracking

Direct Data Type (select one or more)

(No) Student Portfolio

(No) Research Paper/Project

(No) Class Assignment (other than research paper/project)

Other (please describe)

Number of student clicks in library-developed instructional content

Indirect Data Type (select one or more)

(No) Test Scores

(No) GPA

(No) Degree Completion Rate

(No) Retention Rate

Other (please describe)

Scores from secondary rubrics


Executive Summary (open response, 150 words)

  • How does the project align with your institution’s priorities and needs?
  • Why did you choose the outcome and library factor as areas to examine?
  • Why was the team composition appropriate?

  1. Our institution’s academic model prioritizes outcomes-based student learning. The goal of our project was to establish a baseline for how our library’s instruction program contributes to student learning. We aligned our project with the university’s secondary assessment of its learning goals, focusing specifically on Learning Goal 3, which states that “graduates demonstrate critical thinking and information literacy.” In collaboration with faculty, the library uses an embedded information literacy instruction model, and we wanted to align our work with the greater goals of the university.

    Our project examined the number of times students clicked on library-created instructional content and analyzed those counts in conjunction with scores from secondary rubrics, which are used to evaluate attainment of the university’s learning goals and are not seen by students.

    The assessment team composition was appropriate because it brought together librarians, a dean whose school is at the fore of university assessment, and the director of institutional effectiveness, who spearheads efforts to assess university learning goals.

  • What are the significant contributions of your project?
  • What was learned about assessing the library’s impact on student learning and success?
  • What was learned about creating or contributing to a culture of assessment on campus?
  • What, if any, are the significant findings of your project?

  1. We understood from the beginning that this project would serve foremost as a foundational experience in which we would generate baseline data and processes for assessing our instruction program. Because of delays in the implementation of the university’s secondary rubric assessment, and with only two quarters’ worth of data, our results were inconclusive. However, we are in a position to continue gathering data and will be able to analyze it for trends going forward.

    The most significant finding was identifying methods for collecting data that we can continue to use and refine for assessment purposes.

  • What will you change as a result of what you learned (e.g., institutional activities, library functions or practices, personal/professional practice, other)?
  • How does this project contribute to current, past, or future assessment activities on your campus?

  1. As a result of this project, the library’s instruction team is more aware that its efforts have some impact on students; it remains to be determined what that impact is, but we can see that students click frequently on the content we embed in their courses.

    Additionally, our project helped us identify some potential gaps in instructional coverage, and we intend to follow up with our faculty to share our findings and begin conversations about how we can ensure equitable library support across the university.

Please list any articles published, presentations given, URL of project website, and team leader contact details.

  1. CityU of Seattle Library's project website:

What can a library learn about the impact of its instruction program on student learning by participating in its university’s Comprehensive Assessment Strategy? Learn about CityU’s quest to analyze student use of library-created instructional content in conjunction with information literacy assessment. Through trial, error, and persistence, the team made important discoveries, accumulated baseline data, and acquired the knowledge needed to build a foundation for assessment of library impact on student learning.

Poster: AiACityUSeattlePosterFINAL.pdf (“Do Student Clicks = Student Learning?”)