Dr. M. David Miller, Director, QEP and Professor

Dr. Timothy Brophy, Director of Institutional Assessment

University of Florida, Gainesville, FL

ABSTRACT: This session will discuss the steps in developing a QEP assessment plan. The University of Florida’s QEP provides an example of a completed assessment plan that conducted an extensive review of existing instruments and developed instruments to measure the student learning outcomes (SLOs). The assessment plan includes direct assessments, indirect assessments, and outputs. We will discuss the process and criteria used to review existing instruments and to develop new instruments. Finally, we will discuss the overall assessment plan and show how we established the links between the assessments, the campus initiatives, and the SLOs.

Resources

  1. Handout (pdf)
  2. Presentation (html) – not yet posted at the time of writing.
  3. The U. of Florida QEP (html)

Phew!  This session was fast and furious.  They went through a ton of information in a short time.  Much of it is *VERY* relevant to our QEP.  This presentation essentially comprises Chapter 6 of their QEP.  They divided their assessment plan into three categories:

  1. Direct assessments of student learning (rubrics)
  2. Indirect assessments of student learning (surveys)
  3. Outputs (measures of attendance, participation, etc.)

Summary

[Figure: Instrument evaluation rubric]

The UF QEP topic is global citizenship and intercultural communication.  UF has a system where all programs develop and assess SLOs in content, critical thinking, and communication – so they aligned their QEP with the existing model.  They strongly emphasized the need for reliable direct measures of learning – this has certainly been a theme for the sessions I am attending.  Measuring outputs and indirect measures is all well and good, but if you don’t have direct measures of student learning, eyebrows will be raised.  Their learning outcomes and assessments went through several iterations of review by experts in international education and in assessment, but they consistently emphasized one point: don’t lose the alignment.  The measures MUST align with the stated outcomes.  It is also important to note that they employ a course-embedded assessment approach.  For the measures, they provided the following advice:

  1. Make sure the measures are aligned with the SLOs
  2. Test the measures for reliability and validity
  3. Think about feasibility for your situation (e.g., UF has a LOT of students)

Direct Measures

They developed rubrics for the outcomes but ended up throwing one out because they didn’t feel it provided useful information.  I wasn’t quite clear on the details.  The rubrics were developed by adapting the AAC&U VALUE rubrics to their purpose – another theme of this conference.  The adaptations were to (1) use only three levels and (2) change the mastery language to 3 = consistently, 2 = usually, 1 = rarely, 0 = never.  These rubrics were reviewed by committee several times and piloted.  It is important to note that the rubric is applied to a range of assignments/activities, which are designed and selected by the faculty/departments responsible for the course.  The university has some kind of approval process for these activities, but they didn’t really go into detail about how that works.  They have a faculty lottery for assessment – an interesting idea.
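To make the scoring concrete, here is a minimal sketch of how scores from a rubric using that adapted mastery language might be tallied for one assignment.  The level labels come from the session; the scores and the tallying are my own illustration, not UF’s actual process:

```python
from collections import Counter

# Mastery language from the session's adapted VALUE rubrics
LEVELS = {3: "consistently", 2: "usually", 1: "rarely", 0: "never"}

# Hypothetical rubric scores from one course-embedded assignment
scores = [3, 2, 2, 3, 1, 0, 2, 3, 3, 2]

# Report the percentage of students at each level, highest first
tally = Counter(scores)
for level in sorted(LEVELS, reverse=True):
    pct = 100 * tally[level] / len(scores)
    print(f"{level} ({LEVELS[level]}): {pct:.0f}%")
```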

Indirect Measures

UF gives the SERU to their students, so they added items to that survey to avoid adding another layer to their university indirect assessments.  They had graduate students trawl the literature for available items that could go on a survey to assess international education and intercultural competencies.  This search produced over 70 items.  They then sent these items through an expert (content and assessment) review process to whittle the number down.  They then piloted the remaining items and did a statistical item analysis to determine the most reliable.  They concluded that, given their sample size, 12-14 items are sufficient to get reliable results.  The handout contains some specifics on this analysis.
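The handout has the specifics, but for reference, this kind of item analysis typically relies on statistics like Cronbach’s alpha and corrected item-total correlations.  Here is a minimal sketch using simulated pilot data – the respondent count, item count, and 14-item cutoff are assumptions for illustration, not UF’s actual numbers:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_total_correlations(scores: np.ndarray) -> np.ndarray:
    """Corrected item-total correlation: each item vs. the sum of the rest."""
    rest = scores.sum(axis=1, keepdims=True) - scores
    return np.array([np.corrcoef(scores[:, j], rest[:, j])[0, 1]
                     for j in range(scores.shape[1])])

# Simulated pilot data: 200 respondents, 20 candidate items on a 1-5 scale,
# all driven by one latent trait so the items hang together.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
pilot = np.clip(np.round(3 + latent + rng.normal(size=(200, 20))), 1, 5)

print(f"alpha for all 20 items: {cronbach_alpha(pilot):.2f}")
# Keep the 14 items with the strongest corrected item-total correlations
keep = np.argsort(item_total_correlations(pilot))[::-1][:14]
print(f"alpha for retained items: {cronbach_alpha(pilot[:, keep]):.2f}")
```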

Outputs

Not much about this was shared.  However, they had some very nice flow charts that we should look at once the presentation goes up on the website.

In the Q&A, they recommended an annual review of assessment (SLO) data to make adjustments to the plan.
