Kathryne Drezek McConnell, Office of Assessment & Evaluation, Virginia Tech

Kimberly Filer, Office of Institutional Effectiveness and Assessment, Roanoke College

Steve and I attended this all-day rubric workshop.  The workshop was very informative.  We didn’t have time to make progress on our own rubric at the workshop, but I am trying to steal time to draft the rubric throughout the meeting.  You can watch my progress by looking at QEP-OutcomesRubric_DRAFT.docx in the PDA subfolder of our QEP dropbox.  Below, I have listed the take-home points from the workshop; I think they will be very useful for our rubric work.  I asked Kathryne if she would be interested in helping us conduct some train-the-trainer workshops.  She is willing, and she can recommend other people if scheduling is an issue.  I found her a motivating and engaging facilitator, sensitive to the faculty perspective.

  • The presenters highly recommend using the AAC&U VALUE rubrics to inform your rubric development; however, it was patently clear that the rubrics must be modified to make them understandable and usable by students.  Kathryne shared several iterations of the rubrics her campus developed for the Virginia Tech QEP.
  • Virginia Tech has implemented a system where the rubrics are universal across disciplines but the artifacts and assignments are developed by the faculty member or person designing the experience.  This model seems to be the emerging norm, but there is little consistency in how (or whether) the assignments are reviewed and approved.
  • Rubric development is iterative and should cycle regularly through the design, align, norm process.  Revisions to the rubric should be informed by evidence.
  • Give individuals control over how best to implement the rubric in their learning experiences.  The essential requirement is that the assignment be aligned closely with the rubric.
  • Rubric analysis is labor intensive, and the people doing the work should:
    • see how the work is consistent with their values and concerns
    • have the opportunity to develop expertise in rubric design and analysis in a way that doesn’t make inordinate demands on their time
    • see rewards for their participation (examples include stipends, travel grants, implementation grants, and course reductions)
    • see an obvious institutional commitment to provide ongoing support for the rubric analysis
  • Neither of these institutions collects student work as evidence of the learning – they collect only the faculty reviews.
  • Training is crucial.  Rubric users need to be trained in two capacities: (1) defining the rubric for their situation and (2) norming their scoring with other users.
    • After attending this workshop, I think I would be able to conduct this training.
  • The rubric MUST be directly aligned with the student learning outcome – this is one reason you are likely to have to modify the AAC&U rubrics for your situation.
  • If you are implementing a rubric assessment process (or any other innovation), the presenters strongly recommend using the Concerns-Based Adoption Model (CBAM) from Change in Schools: Facilitating the Process (Hall & Hord, 1987), or some other model that has you regularly check in with users so you can identify and address concerns soon after they arise.  I think it would be good to include this type of monitoring in our plan.  The book contains surveys you can use.
  • When reporting rubric results, avoid the temptation to report an average.  Report the distribution – it is more meaningful and can highlight rater issues (see the sketch after this list).
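
To make the distribution point concrete, here is a minimal sketch of how a set of rubric scores could be summarized as a distribution rather than a single average.  It is written in Python; the scores, the four-level scale, and the labels are all hypothetical placeholders, not data from the workshop.

    from collections import Counter

    # Hypothetical scores for one rubric criterion on a four-level
    # scale (1 = benchmark ... 4 = capstone); not real data.
    scores = [3, 2, 3, 4, 1, 3, 2, 3, 4, 2, 3, 3, 1, 2, 3]

    # The average collapses everything into one number and hides
    # clustering, gaps, and rater disagreement.
    print(f"Average: {sum(scores) / len(scores):.2f}")

    # The distribution shows how many artifacts landed at each level.
    counts = Counter(scores)
    for level in range(1, 5):
        n = counts.get(level, 0)
        pct = 100 * n / len(scores)
        print(f"Level {level}: {n:2d} ({pct:5.1f}%)  {'#' * n}")

A quick text histogram like this makes it easy to spot patterns an average hides – for example, a bimodal split in student performance or a rater who never awards the top level.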

The speakers recommended the following resources:

  • Change in Schools: Facilitating the Process (Hall & Hord, 1987)
  • Learning to Think: Disciplinary Perspectives by Janet Gail Donald (2002)
  • Dr. Mary Allen, Professor Emerita of Psychology, former Director of the Institute for Teaching and Learning, California State University, Long Beach, CA, and author of Assessing General Education Programs
  • The University of Hawai‘i at Mānoa assessment website.
