Saturday, October 13, 2012

Ups And Downs Of Midterm Reporting

My school recently published midterm reports. Normally a routine task at the midpoint of a trimester, the process was a bit more challenging for us this time around: it was the first time we reported standards-based scores in ELA and math. As with anything new, there was plenty to observe.
  • Too Soon To Report?: Because of my customized approach, I had more data to report than most teachers. Even so, I found the data too limited to report meaningfully. One reason is the fragmenting of content into Measurement Topics. Previously, I would have had a mass of scores run through a formula, resulting in a single score. Now, scores are assigned to one or more of the five ELA Measurement Topics. The result was that each Measurement Topic score rested on fewer data points, some on as few as one or two scores. With this in mind, I qualified scores with comments explaining which were based on limited samples. Will this be enough to clarify the matter for families? I'll soon find out, as Parent-Teacher Conferences are next week.
  • Validity/Reliability Check Required: When I ponder assessment systems, I need to remember two key components: validity and reliability. Validity refers to how well an assessment actually assesses the targeted content. For example, if an assessment of a student's addition skills focused entirely on coloring rather than addition, the assessment would have very low validity. Reliability refers to how repeatable an assessment is. For example, if the results of an assessment were inconsistent over time while a student's skills did not improve, there might be outlying scores that skew the data. As I prepared midterm reports, some red flags were raised for me. First, my students take assessments at different times: different days, weeks, or in some cases years apart. While I like to think that I know what a 4, 3, 2, or 1 looks like, my perspective is vulnerable to my ongoing experiences. It is also warped by my personal attachments to students. While I fight the urge to give a Meets score to a student who has struggled mightily against the odds to meet a Learning Target but repeatedly fallen just short, I need more support for this painful ethical dilemma. This clearly indicates the need for rubrics and anchor samples for ALL assessments. To be fair, I've simply been too busy developing and implementing to build a full bank of these. Yet it needs to happen, and soon. (By the way, this might also be an argument for instituting a .5 system, producing scores that show whether a student has just barely met a Learning Target or is knocking on the door of exceeding it.)
  • Telling It Like It WAS: Perhaps the most troubling observation from this year's midterms is that the resulting reports became out of date moments after the scores were posted. Things can change a great deal in a very short time in the customized classroom, usually for the better. Target due dates arrive periodically, rendering a midterm score rather useless. Even more frequently, students submit assessments in a surge of work ethic (we all have them), again rendering a midterm score inaccurate. These surges can happen within a single class meeting; imagine the impact of the full week of class meetings that passes between the posting and printing of midterm scores. At the upcoming conferences, this will become apparent to families as I repeatedly point out how things have changed over the last week of school.
  • Electronic Reporting: The most positive observation from this initial midterm reporting relates to the value of electronic score reporting. Our school uses Infinite Campus. While it's far from perfect, the system gives students, teachers, administrators, and families an inside look at specific assessment scores. Best of all, this data is far more accurate than a midterm score because it is updated whenever a teacher enters a score. There are two requirements for this to work well. First, teachers need to be committed to entering scores consistently. (This is easy in a customized approach, as assessments are scored in the moment, with a score often entered just minutes after a student submits the assessment.) Second, families need to check a student's scores regularly. Perhaps we can be more helpful to parents by showing them how to access scores during conferences. (It's hard to believe that a conference would not involve a close-up look at those scores anyway.)
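
For readers curious about the "too few data points" problem above, here's a minimal sketch of how a single pool of scores thins out once it's split across Measurement Topics. The topic names and scores are invented for illustration; they are not my actual gradebook or our actual topic list.

```python
# Hypothetical sketch: one pool of assessment scores, each tagged with
# the Measurement Topic it informs. Names and scores are invented.
from collections import defaultdict

scores = [
    ("Reading", 3), ("Reading", 4),
    ("Writing", 2),
    ("Speaking/Listening", 3), ("Speaking/Listening", 3), ("Speaking/Listening", 4),
    ("Language", 4),
]

# Group scores by topic instead of pooling them into one formula.
by_topic = defaultdict(list)
for topic, score in scores:
    by_topic[topic].append(score)

# Report each topic average, flagging thin samples for a comment.
for topic, topic_scores in by_topic.items():
    avg = sum(topic_scores) / len(topic_scores)
    flag = " (limited sample!)" if len(topic_scores) < 3 else ""
    print(f"{topic}: {avg:.1f} from {len(topic_scores)} score(s){flag}")
```

Seven scores that once fed one formula become four topic scores, two of them resting on only one or two data points, which is exactly the kind of thing a qualifying comment has to flag.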
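
And for the .5 system floated in the validity/reliability bullet, a quick sketch of what half-point scoring might look like. The cutoffs and labels here are my own invented illustrations, not an adopted scale:

```python
# Hypothetical sketch of a .5 scoring scale on a 1-4 rubric:
# half-point steps let a report distinguish "just barely met" (3.0)
# from "knocking on the door of exceeding" (3.5).

def snap_to_half(raw):
    """Round a raw score to the nearest 0.5, clamped to the 1-4 scale."""
    snapped = round(raw * 2) / 2
    return min(4.0, max(1.0, snapped))

def label(score):
    """Invented labels for illustration only."""
    if score >= 4.0:
        return "Exceeds"
    if score >= 3.5:
        return "Meets, approaching Exceeds"
    if score >= 3.0:
        return "Meets"
    if score >= 2.5:
        return "Approaching Meets"
    return "Partially Meets"

print(label(snap_to_half(3.4)))  # a 3.4 snaps to 3.5
```

The point is less the code than the granularity: a 3.4 and a 3.1 both report as a bare "3" on a whole-number scale, while half points preserve the difference families actually want to see.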

Wednesday, October 3, 2012

Spreading the Word

I'll be presenting at the MDOE's Experts Down the Hall conference in Augusta on October 29. The title of my session is "Customized Approaches to Standards Based Teaching and Learning".

This should be interesting. Will folks show up ready to learn about customizing classrooms, or will they show up with loaded questions? I'll hope for the former but be ready for both! It's perfectly understandable that some folks are ready to dive in now while others are less than enthusiastic about this kind of change.

Here's the tentative plan:

  • Why Customize?
  • What To Customize?
  • Customizing Tools & Tips
  • Customizing Method
  • Next Steps...