(No) Student Learning: Assignment
(No) Student Learning: Course
(No) Student Learning: Major
✓ Student Learning: Degree
(No) Student Engagement
✓ Student Success
(No) Academic Intimacy/Rapport
(No) Enrollment
(No) Retention
(No) Completion
(No) Graduation
(No) Articulation
(No) Graduates' Career Success
(No) Testing (e.g., GRE, MCAT, LSAT, CAAP, CLA, MAPP)
(No) Other (please describe)
(No) Instruction
(No) Instruction: Games
(No) Instruction: One Shot
✓ Instruction: Course Embedded
✓ Instruction: Self-Paced Tutorials
(No) Reference
(No) Educational Role (other than reference or instruction)
(No) Space, Physical
(No) Discovery (library resources integrated in institutional web and other information portals)
(No) Discovery (library resource guides)
(No) Discovery (from preferred user starting points)
(No) Collections (quality, depth, diversity, format or currency)
(No) Personnel (number and quality)
(No) Other (please describe)
✓ Undergraduate
✓ Graduate
(No) Incoming
(No) Graduating
(No) Pre-College/Developmental/Basic Skills
✓ Other (please describe)
Doctoral
(No) Arts
(No) Humanities
(No) Social Sciences
(No) Natural Sciences (i.e., space, earth, life, chemistry or physics)
(No) Formal Sciences (i.e., computer science, logic, mathematics, statistics or systems science)
(No) Professions/Applied Sciences
(No) English Composition
(No) General Education
(No) Information Literacy Credit Course
✓ Other (please describe)
University-level assessment across multiple disciplines
✓ Assessment Office
(No) Institutional Research
(No) Teaching Faculty
(No) Writing Center
(No) Information/Academic Technology
(No) Student Affairs
(No) Campus Administrator
✓ Library Administrator
✓ Other Librarian
✓ Other (please describe)
Dean
(No) Survey
(No) Interviews
(No) Focus Group(s)
(No) Observation
(No) Pre/Post Test
✓ Rubric
✓ Other (please describe)
Learning Management System data tracking
(No) Student Portfolio
(No) Research Paper/Project
(No) Class Assignment (other than research paper/project)
✓ Other (please describe)
Number of student clicks in library-developed instructional content
(No) Test Scores
(No) GPA
(No) Degree Completion Rate
(No) Retention Rate
✓ Other (please describe)
Scores from secondary rubrics
Our institution’s academic model prioritizes outcomes-based student learning. The goal of our project was to establish a baseline for how our library’s instruction program contributes to student learning. We aligned our project with the university’s secondary assessment of its learning goals, focusing specifically on Learning Goal 3, which states that “graduates demonstrate critical thinking and information literacy.” The library uses an information literacy instruction model embedded in courses in collaboration with faculty, and we wanted to align this work with the broader goals of the university.
Our project tracked the number of times students clicked on library-created instructional content and analyzed those click counts alongside scores from the university’s secondary rubrics, which are used to evaluate attainment of the university’s learning goals and are not seen by students.
The assessment team composition was appropriate because it brought together librarians, a dean whose school is at the fore of university assessment, and the director of institutional effectiveness, who spearheads efforts to assess university learning goals.
We understood from the beginning that this project would serve foremost as a building experience in which we would generate baseline data and processes for assessing our instruction program. With delays in the implementation of the university’s secondary rubric assessment and only two quarters’ worth of data, our results were inconclusive. However, we are in a position to continue gathering data and will be able to analyze it for trends going forward.
Our most significant outcome was identifying data-collection methods that we can continue to use and refine for assessment purposes.
As a result of this project, the library’s instruction team is more aware that its efforts have an impact on students. The nature of that impact remains to be determined, but we can see that students click frequently on the content we embed in their courses.
Additionally, our project helped us identify some potential gaps in instructional coverage, and we intend to follow up with our faculty to share our findings and begin conversations about how we can ensure equitable library support across the university.
Please list any articles published, presentations given, URL of project website, and team leader contact details.
CityU of Seattle Library's project website: http://library.cityu.edu/about/library-assessment.aspx
| Filename | Title | Description |
|---|---|---|
| AiACityUSeattlePosterFINAL.pdf | Do Student Clicks = Student Learning? | What can a library learn about the impact of its instruction program on student learning by participating in its university’s Comprehensive Assessment Strategy? Learn about CityU’s quest to analyze student use of library-created instructional content in conjunction with information literacy assessment. Through trial, error, and persistence the team made important discoveries, accumulated baseline data, and acquired the knowledge needed to build a foundation for assessment of library impact on student learning. |