
Program Management & Publishing: Assessment & Reporting

Overview

Assessment & Reporting
For effective assessment and reporting, specific metrics should be tracked, including adoption rates (courses, faculty, departments, colleges), students reached, and cost savings for students. With help from an institutional effectiveness or data and reporting department, it may also be possible to demonstrate impact on academic performance and engagement. Challenges abound, particularly with respect to the consistency, accuracy, and completeness of data collected across various organizational nodes. Reporting to stakeholders is also challenging; the mix of concepts is just unique enough that people conflate usage and savings, misunderstand how savings are measured, or errantly apply analytics concepts from their own areas to your data. It is important to be transparent, to provide context-rich data, and to involve stakeholders in discussions about the nuances of OER initiatives.

Assessment & Reporting Topics

Assessing the impact of an open education program requires an approach that looks at the various themes on which the program is presumed to be having an effect. Some themes are quantitative in nature; collecting the data requires extensive planning, well-prepared workflows, and coordination with others who create and collect data. Other themes are qualitative in nature and require less continual effort to monitor and manage, but more skill in interpretation and coding. What appears at first to be a relatively straightforward task (recording some numbers and performing a few mathematical operations) soon proves to be a messy art with a complex web of decisions about definitional questions, practical challenges, and unknowns about consistency.


Reporting Themes

  • Student Outcomes: Measure the impact on student performance, retention rates, and course completion rates. Assess whether OER usage improves learning and engagement.
  • Cost Savings: Evaluate the reduction in textbook costs for students and overall savings to the institution. Track the number of courses converted to OER and the financial benefits to students.
  • Faculty Adoption and Satisfaction: Monitor the number of faculty adopting OER and their satisfaction with the resources. Gather feedback on ease of integration, perceived quality, and the impact on teaching practices.
  • Access and Inclusivity: Determine if OER enhances access to learning materials for all students, including those with disabilities. Assess the accessibility of materials and the program's effectiveness in promoting equity.
  • Quality of OER Materials: Review the quality and relevance of the OER being used. Consider peer reviews, faculty evaluations, and student feedback to ensure resources meet educational standards.
  • Sustainability and Scalability: Evaluate the long-term sustainability of the OER program, including ongoing funding, support, and potential for growth. Assess how easily the program can be scaled to include more courses or departments.
  • Awareness and Engagement: Measure awareness and engagement with OER among students, faculty, and other stakeholders. Track outreach efforts, training sessions, and participation rates.
  • Impact on Pedagogy: Assess how OER adoption influences teaching methods and course design. Evaluate whether the use of OER encourages innovative teaching practices and the integration of open pedagogy.
  • Collaborative Efforts: Examine the level of collaboration between departments, libraries, and external partners. Determine how these collaborations support the development and dissemination of OER.
  • Feedback and Continuous Improvement: Collect and analyze feedback from all stakeholders to identify areas for improvement. Use this feedback to refine the OER program and enhance its impact.

 

Assessing and reporting on the impact of a library reading list management program has some similarities to reporting on an OER program. There will still be estimates to make (e.g., the average cost avoided by digitized articles or chapters), MSRPs of assigned texts to look up, and determinations of which texts are required versus optional. But more of the data needed is centralized in the library management system, and in addition to books, other content types such as articles and book chapters need to be accounted for and tabulated. Metadata for citations comes into the system from many sources (manually entered, inherited from the central index, harvested on upload) and is not tightly controlled or consistent. However, preparation and configuration can tighten up your data.


Preparations

At the end of the semester, you want to end up with a spreadsheet of all the readings from all of the course sections, along with the number of participants/students for each reading. You also need to know the type/format of each reading in your spreadsheet, in order to determine 1) whether the reading represents a cost savings (e.g., freely available websites do not), and 2) what cost to assign to it (for a required text, the book's MSRP; for an article, the standard licensing cost you have decided on).
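
As a rough illustration, here is a minimal sketch of what one row of that end-of-semester spreadsheet might look like. The column names, file name, and example values are hypothetical; your own system's export will differ, but the same categories of information need to be present.

    # Minimal sketch of the end-of-semester spreadsheet described above.
    # Column names and values are hypothetical; adapt them to whatever your
    # reading list or reserves system actually exports.
    import csv

    COLUMNS = [
        "course",         # e.g., ENGL 101
        "section",        # e.g., 02
        "reading_title",
        "material_type",  # e.g., article, book chapter, streaming media
        "required",       # required vs. optional
        "enrollment",     # students in the section after drops and adds
        "unit_cost",      # MSRP, licensing cost, or 0 for free materials
    ]

    with open("readings.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerow({
            "course": "ENGL 101",
            "section": "02",
            "reading_title": "Example article",
            "material_type": "article",
            "required": "required",
            "enrollment": 28,
            "unit_cost": 19.50,
        })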

  • Learn the system - Spend time with your system's reporting and analytics tool and learn how it provides the data. You can normally configure the metadata and drop-down menus in different ways, so aim to create metadata that is both 1) clear to front-end users and 2) definitive at identifying the resource in the back end. Pay special attention to metadata for material type, source, format, and other characteristics that will help you tell one category of content apart from another.
  • Train for consistency - If more than one person is applying metadata to citations in your system, train everyone to define and apply metadata consistently and accurately.
  • Collaborate across silos - Access and public services staff running affordability initiatives and contributing to course delivery benefit from working with catalogers, data analysts, and e-resource specialists, who can help set up metadata sets, streamline system processes, alert you when bib records are being removed, and identify and acquire materials.

Questions to Answer

  1. What are the student savings for each type of content the library is supplying? For example, each article that is used saves $X in licensing costs that students would otherwise have had to pay; not all types necessarily deliver savings (see the sketch after this list).
    • For example, you could use the price commanded by the publisher at the paywall, the average cost of paying copyright clearance for a print course pack, or licensing costs plus labor costs for digitized materials.
  2. What are all the different material types of content you are supplying? e.g., electronic article from a database, scanned book chapter, streaming media, dissertation, case study, etc.
  3. How will you distinguish between content supplied in different ways, e.g., a magazine article from The Atlantic in a library database vs. a link to the article on The Atlantic's website vs. a digitized article from the print version of The Atlantic?
  4. How many students are in each course you are serving? This is not difficult if you have a reading list management system with an accurate course loader (i.e., one that updates after drops and adds), but otherwise it will require reports from the registrar or directly from the instructor.
  5. If you are also managing freely available online materials and open access scholarly works as part of your service package, do these count toward your student savings calculations?
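
To make the arithmetic concrete, here is a minimal sketch of a savings calculation under the assumptions above: each material type is assigned a per-student cost your program has agreed on, free materials are assigned zero, and savings for a reading are that cost multiplied by the enrollment of the section. The type names and dollar figures below are placeholders, not recommendations.

    # Sketch of a per-reading savings calculation.
    # Material types and per-student costs are placeholders; substitute the
    # figures your program has decided on (MSRP, licensing cost, etc.).
    COST_PER_STUDENT = {
        "required_text": 85.00,   # e.g., MSRP of the assigned book
        "article": 19.50,         # e.g., standard licensing cost
        "book_chapter": 12.00,    # e.g., copyright clearance estimate
        "open_access": 0.00,      # counted as no savings in this sketch
        "free_website": 0.00,
    }

    readings = [
        # (material_type, enrollment)
        ("required_text", 28),
        ("article", 28),
        ("free_website", 28),
        ("book_chapter", 35),
    ]

    total_savings = sum(
        COST_PER_STUDENT.get(material_type, 0.0) * enrollment
        for material_type, enrollment in readings
    )
    print(f"Estimated student savings: ${total_savings:,.2f}")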

What sort of assessment and reporting is possible with reserves depends on what system you are using and where you are uploading eReserves. Depending on the system, you may be able to see usage trends over time and compare usage across departments.


Electronic Reserves

With electronic reserves, digitized, fair use portions of physical works held by the library are uploaded for a limited time to a server under the library's control. Normally that involves one of the following file-serving options, each with its own challenges:

  • Uploading via FTP to a server:
    • without a website interface to load the Google Analytics code snippet, a 3rd party web analytics service is needed
    • each 3rd party web analytics service will have different functionality, but likely will include at minimum how many times a file was accessed with a timestamp
    • metadata is limited to the name of the file, so file naming should follow a convention that includes info about course, section, instructor, and title (see the sketch after this list)
  • Cloud-based service: Google Drive:
    • Google Analytics will not work with Google Drive because there is no page where you can add the required tracking code snippet
    • Google Drive has a basic activity log that shows who accessed a file and when, but it does not provide detailed download statistics or user locations, so a 3rd party analytics service is needed for more robust data
    • as with uploading via FTP to a server, cramming metadata into the filename is necessary
  • Cloud-based service: Microsoft SharePoint Online:
    • SharePoint Analytics is more robust than Google Drive, and shows how often files are accessed, who accessed them, and when.
    • you can view reports on file activity, including downloads, views, and edits.
    • as with the other options, cramming metadata into the filename is necessary
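
Because all of these options force descriptive metadata into the filename, it helps to settle on one convention and apply it everywhere. The pattern below (course_section_instructor_title.pdf) is only a hypothetical convention; the point is that a consistent pattern can be parsed back into spreadsheet columns at reporting time.

    # Sketch of parsing a hypothetical filename convention such as
    # ENGL101_02_Smith_Chapter3.pdf into metadata fields for reporting.
    from pathlib import Path

    def parse_reserve_filename(filename: str) -> dict:
        """Split course_section_instructor_title.pdf into its parts."""
        stem = Path(filename).stem
        course, section, instructor, title = stem.split("_", 3)
        return {
            "course": course,
            "section": section,
            "instructor": instructor,
            "title": title,
        }

    print(parse_reserve_filename("ENGL101_02_Smith_Chapter3.pdf"))
    # {'course': 'ENGL101', 'section': '02', 'instructor': 'Smith', 'title': 'Chapter3'}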

Physical Reserves

If you are already managing reporting for a reading list management system, you have probably already folded physical reserves into your reporting workflow for that. If not, here are a few things you may consider tracking:

  • Usage: Circulations, Browses (in-house circulations) 
  • Adopters: Courses, Course Sections, Instructors, Departments, Colleges
  • Format comparison: what percentage of your reserves are textbooks, monographs, serials, etc. (a sketch follows this list)
  • Longitudinal comparison: if you have multiple years' worth of data and consistent output (e.g., if you migrated systems at some point, which is likely, is there a crosswalk between the old data and the new data?)
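
As a small illustration of the format comparison mentioned above, the sketch below tallies a list of reserve items by format and reports each format's share of the collection. The format labels and counts are invented for illustration.

    # Sketch of a format comparison for physical reserves.
    # The item formats below are made up for the example.
    from collections import Counter

    reserve_formats = [
        "textbook", "textbook", "monograph", "serial",
        "textbook", "monograph", "textbook",
    ]

    counts = Counter(reserve_formats)
    total = sum(counts.values())
    for fmt, count in counts.most_common():
        print(f"{fmt}: {count} ({count / total:.0%})")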

The chilling effect of surveillance and the potential harms of data breaches should make privacy a central concern when collecting and handling data on student use of materials. Library management systems and learning management systems are now equipped with advanced analytics capabilities that, depending on how they are configured, may be collecting usage data connected to personally identifiable information (PII) in the background. Professional dedication to privacy varies across fields, with libraries normally the most committed to maintaining patron privacy.


Assessment and Reporting Ethical Responsibilities

  • have a plan for how to protect student privacy, and follow it
  • have a plan for what data you are collecting, and why
  • be aware of all the information your system is collecting, with special attention to any information connected to PII
  • be aware of the anonymization capabilities of your system, and develop a workflow for retention and anonymization as dictated by your privacy plan (a sketch follows this list)
  • only collect data connected to PII when you have an explicit reason to do so
  • retain data connected to PII only as long as needed, then anonymize and/or purge it
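
As one illustration of the retention-and-anonymization point above, the sketch below strips personally identifiable fields from usage records once they pass a retention window. The field names, 90-day window, and record layout are all assumptions for the example; the actual workflow should follow your institution's privacy plan and your system's own anonymization features.

    # Sketch of anonymizing aged usage records per a hypothetical privacy plan.
    # Field names and the 90-day retention window are assumptions, not policy.
    from datetime import datetime, timedelta, timezone

    RETENTION = timedelta(days=90)
    PII_FIELDS = {"student_id", "student_name", "email"}

    def anonymize_aged_records(records: list[dict]) -> list[dict]:
        """Strip PII from records older than the retention window."""
        cutoff = datetime.now(timezone.utc) - RETENTION
        cleaned = []
        for record in records:
            if record["accessed_at"] < cutoff:
                record = {k: v for k, v in record.items() if k not in PII_FIELDS}
            cleaned.append(record)
        return cleaned

    sample = [{
        "item": "Chapter 3 scan",
        "accessed_at": datetime(2024, 1, 15, tzinfo=timezone.utc),
        "student_id": "A00123456",
    }]
    print(anonymize_aged_records(sample))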

The OER Starter Kit for Program Managers (2022)

by Abbey K. Elder; Stefanie Buck; Jeff Gallant; Apurva Ashok; and Marco Seiferle-Valencia

Toward Convergence: Creating Clarity to Drive More Consistency in Understanding the Benefits and Costs of OER (2022)

by Katie Zaback, prepared for Midwestern Higher Education Compact (MHEC)