Assessment in Action

John Carroll University: Project Description

Primary Outcome Examined (select one or more)

Student Learning: Assignment

Student Learning: Course

(No) Student Learning: Major

(No) Student Learning: Degree

(No) Student Engagement

(No) Student Success

(No) Academic Intimacy/Rapport

(No) Enrollment

(No) Retention

(No) Completion

(No) Graduation

(No) Articulation

(No) Graduates' Career Success

(No) Testing (e.g., GRE, MCAT, LSAT, CAAP, CLA, MAPP)

(No) Other (please describe)

Primary Library Factor Examined (select one or more)

(No) Instruction: Games

(No) Instruction: One Shot

(No) Instruction: Course Embedded

(No) Instruction: Self-Paced Tutorials

(No) Reference

(No) Educational Role (other than reference or instruction)

(No) Space, Physical

(No) Discovery (library resources integrated in institutional web and other information portals)

(No) Discovery (library resource guides)

(No) Discovery (from preferred user starting points)

(No) Collections (quality, depth, diversity, format or currency)

(No) Personnel (number and quality)

(No) Other (please describe)

Student Population (select one or more)

(No) Graduate

(No) Graduating

(No) Pre-College/Developmental/Basic Skills

(No) Other (please describe)

Discipline (select one or more)

(No) Arts

(No) Humanities

(No) Social Sciences

(No) Natural Sciences (i.e., space, earth, life, chemistry or physics)

(No) Formal Sciences (i.e., computer science, logic, mathematics, statistics or systems science)

(No) Professions/Applied Sciences

English Composition

(No) General Education

(No) Information Literacy Credit Course

(No) Other (please describe)

AiA Team Members (select one or more)

Assessment Office

(No) Institutional Research

Teaching Faculty

(No) Writing Center

(No) Information/Academic Technology

(No) Student Affairs

(No) Campus Administrator

(No) Library Administrator

(No) Other Librarian

(No) Other (please describe)

Methods and Tools (select one or more)

(No) Survey

(No) Interviews

(No) Focus Group(s)

(No) Observation

(No) Pre/Post Test

(No) Other (please describe)

Direct Data Type (select one or more)

(No) Student Portfolio

Research Paper/Project

(No) Class Assignment (other than research paper/project)

(No) Other (please describe)

Indirect Data Type (select one or more)

(No) Test Scores

(No) GPA

(No) Degree Completion Rate

(No) Retention Rate

Other (please describe)

Student outcomes

Inquiry Question (150 words open)

What was the project's primary inquiry question?

  1. Do students apply what we are teaching them about information literacy in first-year writing?

Executive Summary (150 words open)

  • How does the project align with your institution’s priorities and needs?
  • Why did you choose the outcome and library factor as areas to examine?
  • Why was the team composition appropriate?

  1. In Fall 2015, JCU launched a new integrated core curriculum in which information literacy (IL) is a stated foundational competency, viewed as a necessary basis for effective writing and oral communication. The Integrative Core Curriculum Committee (ICCC) is charged with aligning the components of the integrative core with the university's student learning outcomes; to that end, it reviews and approves core components, including the rubrics used to assess student performance. As the core develops, there are plans to assess student writing and IL in the student's major, in an integrated course, and in the capstone project. The outcome chosen for this project is important because it shows whether students are off to a good start in these basic competencies. The AiA action team, which includes Nevin Mayer, Library Instruction Coordinator; Tom Pace, Director of the Writing Program; and Todd Bruce, Director of Academic Assessment, is appropriately composed in that each member serves on the ICCC.

  • What are the significant contributions of your project?
  • What was learned about assessing the library’s impact on student learning and success?
  • What was learned about creating or contributing to a culture of assessment on campus?
  • What, if any, are the significant findings of your project?

  1. The development of a rubric for scoring information literacy in first-year research papers is a significant contribution of this assessment effort. Although student performance is also assessed with a rubric for rhetorically effective writing, the IL rubric lets us zero in on how well introductory writing students perform in Access, Source Type, Source Suitability, Argument & Evidence, and Ethical Use. Our data suggest that student performance improves when there is library intervention; however, inter-rater reliability between the assessment groups was too low for this finding to be conclusive.

  • What will you change as a result of what you learned (e.g., institutional activities, library functions or practices, personal/professional practice, other)?
  • How does this project contribute to current, past, or future assessment activities on your campus?

  1. Despite poor inter-rater reliability, there are noteworthy similarities between the scores given by the librarians and those given by the writing instructors; for example, most overall scores for student performance fall below expectation. In the next evaluation cycle we will norm the IL rubric more thoroughly by creating a single evaluation team of both librarians and writing instructors. We will also examine the actual assignments given by the instructor: the research proposal and the annotated bibliography.

Please list any articles published, presentations given, URL of project website, and team leader contact details.

  1. Presented the AiA project in a faculty lunch program sponsored by the Center for Teaching and Learning.

    Nevin Mayer
    Instruction Coordinator

The Assessment Office collected 240 research papers from 17 sections of EN 125. The Writing Assessment Team normed the IL rubric provided by the library and evaluated the paper samples. Later, an evaluation team of librarians examined a sample of 30 of these papers using the same rubric. Although there were similarities in scoring between the two assessment teams, inter-rater reliability was poor. We suggest ways to improve our evaluation process.
Poster: AiA_Poster_PDF.pdf, "Evaluating Information Literacy in First-Year Writing: What We Are Learning"