Authentic Assessment in Health and Physical Education with the Revised Curriculum

Area(s) of Focus: revised curriculum
Division(s): Intermediate, Senior
Level(s): Grade 9, Grade 10, Grade 11, Grade 12
Abstract:

This collection of tools and strategies will support assessment as learning in HPE: the more clearly what is to be assessed is articulated, the more successfully students will engage in their learning.

Assessment practices for the newly revised Health and Physical Education curriculum should create opportunities for interaction between the teacher (as facilitator of learning) and the student. Triangulation of data is naturally supported by the performance-based curriculum, which draws on observation, dialogue and product. A gap may remain, however, in opportunities for authentic self-assessment and reflection by the student (i.e., metacognitive practices). This project’s intent is to consolidate and develop a collection of such interactive assessment tools for teachers, tools that can support both rich feedback for the student and conversation to improve learning, and hence achievement and valid evaluation.

Team Members

  • Barbara O'Connor

    Halton Catholic District School Board

  • Christopher Belanger

    Halton Catholic District School Board

  • Cailin Miziolek

    Halton Catholic District School Board

Professional Learning Goals

  • Demonstrated confidence in using the tools as an anchor for dialogue opportunities with students
  • Demonstrated consistency in authentic and rich feedback, using the tools as a medium
  • Demonstrated and introduced effective, consistent assessment and evaluation practices among colleagues in Health and Physical Education departments, as a consequence of using the common assessment tools

Activities and Resources

Activity #1: The consolidated Assessment Package was provided to all colleagues in the department for review, feedback and comment, giving the team direction and identifying where the package needed amendment.

Activity #2: Team members used each of the tools over the second semester (use is still ongoing) and collected student artifacts. These artifacts demonstrated student use of the tools and the subsequent next-step interaction with the teacher (i.e., dialogue/conversation, encouragement, feedback, etc.).

Activity #3: Team members will take a sample of student artifacts. Using these samples, a moderation review session will be held to determine whether the tools in fact elicit consistent assessment among teachers and give them confidence in using the tools to support triangulation of data. The same tools were shared at the OASPHE Conference, where attendees helped demonstrate the kind of authentic responses the tools would elicit.

Unexpected Challenges

As a team, we believed that we had addressed all aspects of how the tools would be used. However, we did not anticipate that the tools themselves would be taken out of context and used as precise instruments of evaluation. The tools were designed as an ongoing opportunity for feedback and conversation between student and teacher. The information gathered from a tool was meant to be read comprehensively, not item by bulleted item. It was instead an opportunity to identify whether the teacher’s observations and the student’s self-reflection were aligned: a thinking task taken as a whole, not an evaluation of whether a student self-reflected at a Level 1, 2, 3 or 4. We realized this when we presented the workshop at the OASPHE Conference, and as a team we acknowledge that a “how to use” guide must also accompany the package.

Enhancing Student Learning and Development

Much focus is placed on assessment for and of learning. Our anticipation and expectation is that this collection of tools and strategies will support assessment as learning. Some of the components depend directly on teacher/student review of self-assessed metacognitive reflection. The expectation is fulfilled when the articulation of what is being assessed is clear: the clearer that articulation, the more successfully and effectively the student adapts and engages in their learning.

Sharing

The final assessment product/project was shared at three forums:

  1. At the February department meeting, prior to the start of semester 2, all members of the Bishop Reding C.S.S. Health and Physical Education Department were provided with the updated collection of assessment tools (rubrics and matrices). The three team members explained to the group how the newly developed rubric/matrix combination differed from our previous rubrics and highlighted how much more specific it was, thereby supporting teacher/student interaction and authentic feedback between the two.
  2. At the February Halton Catholic District School Board Health and Physical Education Subject Council (composed of all HPE Department Heads of the HCDSB), hard copies of the Assessment/Evaluation Package (the collection of rubrics/matrices) were provided for review and use as members saw appropriate. In April, the electronic versions were provided.
  3. The project team was afforded the opportunity to present the Authentic Assessment/Evaluation Package, which supports triangulation of data and student self-reflection, at the Ontario Association Supporting Health and Physical Educators (OASPHE) Conference at Geneva Park (Orillia) in April. Attendees came from various boards across Ontario. The presentation included a PowerPoint, a colour handout package, and student artifacts from both the current package’s rubrics/matrices and the previous rubric tools. The workshop was one hour in length; with a review activity, audience interaction, and a challenge to the attendees to create the same rubric/matrix for a “fitness category”, it could have gone longer.

Project Evaluation

We believe that our project was a success. All professional learning goals that we set as a team were met. We honed and developed tools that support and provide authentic feedback between student and teacher regarding attainment of performance and thinking success criteria. The tools also allowed consistency in both assessment and evaluation among colleagues in the Health and Physical Education department at our school and with colleagues across our board. The tools afford an easy-to-use, authentic and uncomplicated format that supports overall assessment of movement competency, decision-making and comprehensive participation. Specific look-fors align the learning goals and success criteria established between students and teacher, and provide direction for observation and conversation (components of triangulation of data) when teachers assess their students.

Our team’s plan was to ensure opportunities to share our work and hence do some field testing (i.e., within our department, at our board’s (HCDSB) subject council, and at the OASPHE Conference). Our school department colleagues reported that the tools supported a more directed and specific assessment opportunity and facilitated feedback with students that addressed authentic next steps to support their engagement in improvement. Students found the look-fors very specific, so they knew what to focus on for continued success. Teachers also commented that the tools could be edited or tweaked so that not all look-fors needed to be used in every evaluation.

When we presented our project intent and products at the OASPHE Conference, we believed that the tools, and how they could be used, were clear and seamless. However, during the presentation we realized that there were gaps, as follows:

  1. Prior to the use of the tools for assessment, the curriculum delivery with adherence to unit expectations must be in place. This can be achieved by following the “timeline” sample of sub-unit delivery which is included in our resources submission.
  2. The rubrics and matrices completed by the students are not meant to be disaggregated bullet by bullet, but rather taken as a “whole.” The student’s self-reflection is a medium for feedback, not a line-by-line evaluative process. The teacher reviews the rubric/matrix as a whole, assessing whether the student’s perception of their progress aligns with the teacher’s observations.

If we could look back and change something, it would be the timing of the project. It was difficult to sort out how much of the funding could be used for release time and whether it could fit within a teaching timetable. Had the funding delay not been a problem (i.e., working with our board’s business department to determine the appropriate protocol), colleagues would have had more time to use the tools consistently in their assessment/evaluation practices over a full semester, rather than relegating field testing to a smaller window. Knowing the process/protocol now definitely makes any future opportunities easier. Connecting with supportive board staff, who were extremely helpful, assisted us tremendously. On all accounts, their goal was to support our efforts toward a successful outcome.

Resources Used

Ministry of Education, Ontario, Health and Physical Education Revised Curriculum

http://www.edu.gov.on.ca/eng/curriculum/secondary/health9to12.pdf

Ministry of Education, Ontario, Growing Success: Assessment, Evaluation, and Reporting in Ontario Schools

http://www.edu.gov.on.ca/eng/policyfunding/growSuccess.pdf

Resources Created
