Assessment in Action

Miami University: Project Description



Primary Outcome Examined (select one or more)

Student Learning: Assignment

(No) Student Learning: Course

(No) Student Learning: Major

(No) Student Learning: Degree

Student Engagement

(No) Student Success

(No) Academic Intimacy/Rapport

(No) Enrollment

(No) Retention

(No) Completion

(No) Graduation

(No) Articulation

(No) Graduates' Career Success

(No) Testing (e.g., GRE, MCAT, LSAT, CAAP, CLA, MAPP)

(No) Other (please describe)

Primary Library Factor Examined (select one or more)

(No) Instruction

(No) Instruction: Games

(No) Instruction: One Shot

(No) Instruction: Course Embedded

(No) Instruction: Self-Paced Tutorials

(No) Reference

Educational Role (other than reference or instruction)

(No) Space, Physical

(No) Discovery (library resources integrated in institutional web and other information portals)

(No) Discovery (library resource guides)

(No) Discovery (from preferred user starting points)

(No) Collections (quality, depth, diversity, format or currency)

(No) Personnel (number and quality)

(No) Other (please describe)

Student Population (select one or more)


(No) Graduate

(No) Incoming

(No) Graduating

(No) Pre-College/Developmental/Basic Skills

(No) Other (please describe)

Discipline (select one or more)



Social Sciences

Natural Sciences (i.e., space, earth, life, chemistry or physics)

Formal Sciences (i.e., computer science, logic, mathematics, statistics or systems science)

(No) Professions/Applied Sciences

(No) English Composition

(No) General Education

(No) Information Literacy Credit Course

(No) Other (please describe)

AiA Team Members (select one or more)

Assessment Office

Institutional Research

(No) Teaching Faculty

(No) Writing Center

(No) Information/Academic Technology

(No) Student Affairs

Campus Administrator

(No) Library Administrator

(No) Other Librarian

(No) Other (please describe)



Methods and Tools (select one or more)


(No) Interviews

(No) Focus Group(s)

(No) Observation

(No) Pre/Post Test


(No) Other (please describe)

Direct Data Type (select one or more)

(No) Student Portfolio

Research Paper/Project

(No) Class Assignment (other than research paper/project)

(No) Other (please describe)

Indirect Data Type (select one or more)

(No) Test Scores

(No) GPA

(No) Degree Completion Rate

(No) Retention Rate

(No) Other (please describe)


Executive Summary (150 words open)

  • How does the project align with your institution’s priorities and needs?
  • Why did you choose the outcome and library factor as areas to examine?
  • Why was the team composition appropriate?

  1. Information Commons and other dedicated technology spaces provide students with the software, hardware, and expertise needed to complete both basic and complex projects. These spaces are vital to student learning and engagement, but assessment of them has traditionally been based solely on usage measures, such as users per time period or number of software uses. As the Library's assessment culture continues to mature, connecting student success to these spaces is necessary. Doing so not only fills gaps in our assessment efforts but also aligns with the University's 2020 strategic plan. We had representation from the Provost's Office, the Assessment Office, and Institutional Research, all of whom were key in aligning the project with the strategic plan, recognizing the importance of assessment in support units, and helping to maintain an assessment culture.

    Our primary research question was: How does the usage of our dedicated technology facilities contribute to the success of Miami student scholarship and research?

  • What are the significant contributions of your project?
  • What was learned about assessing the library’s impact on student learning and success?
  • What was learned about creating or contributing to a culture of assessment on campus?
  • What, if any, are the significant findings of your project?

  1. The project helped to demonstrate the contributions of our dedicated technology facilities. We were able to show statistically that students who used these facilities were nearly four times more likely to score higher on a visual literacy rubric than those who did not. We also established a process for using student-reported technological self-efficacy to examine use of space and technology. While neither study was conclusive, both set the stage for future projects that will further examine the effectiveness of the facilities. These projects also highlight the Libraries' assessment activities and their importance for continuous improvement, accreditation, and institutional effectiveness.
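A "nearly four times more likely" comparison of this kind is typically expressed as an odds ratio computed from a 2x2 contingency table (facility use vs. rubric outcome). A minimal sketch of that calculation, using hypothetical placeholder counts rather than the project's actual data:

```python
# Illustrative only: the counts below are hypothetical placeholders,
# not data from the Miami University project.

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
       a = facility users scoring high on the rubric
       b = facility users scoring low
       c = non-users scoring high
       d = non-users scoring low
    """
    return (a / b) / (c / d)

# With these placeholder counts, facility users' odds of a high
# score are 4.0 times those of non-users.
print(odds_ratio(30, 10, 15, 20))  # → 4.0
```

In practice a significance test (e.g., Fisher's exact test) would accompany the ratio to support a claim of statistical difference.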

  • What will you change as a result of what you learned (e.g., institutional activities, library functions or practices, personal/professional practice, other)?
  • How does this project contribute to current, past, or future assessment activities on your campus?

  1. As a system, we plan to increase assessment activities that connect student success and learning to space, especially our technology spaces. This project helped us identify and work on an area of assessment where we could grow and conduct more sophisticated assessment techniques. The results have given us a basis for continuing to improve assessment in this area, as well as better data to help faculty and administrators understand the value of these spaces. Personally, I continue to evolve in how I design these studies and how I analyze and report the data, especially with regard to statistical analysis. Overall, the project has furthered collaboration between the Assessment Office and the Libraries on assessment projects.

Please list any articles published, presentations given, URL of project website, and team leader contact details.

  1. Team Leader Contact:
    Eric Resnis

    Overview and results of one part of the project that was shared with Miami faculty:

Dedicated Technology Facilities: Impacts, Success, and Implications

Miami University’s Assessment in Action project examined the effectiveness of two high-end digital media facilities. The first study of the project compared the technological self-efficacy of students who used the facilities to those who used other computing facilities without a comparable suite of equipment. The second project evaluated the visual literacy of students who created research project posters using the facilities compared to those who did not.