6 Ways I Differentiate Learning for Teachers
My career as a Professional Development Facilitator began much like most other adventures I jumped into: headstrong and full of optimism. Not only was I going to be the best facilitator, but everyone was going to love me. In my early days as a facilitator I worked very hard to achieve the latter goal. I lost track of the objective of my workshop and played too much to my audience. I planned exceptionally well, but found myself running from table to table, participant to participant, trying to fit my square content into their round context. As you can imagine, after a full day’s workshop I was exhausted. After five-day workshops I usually took a solid week off.
Adults are incredibly demanding, and to be fair, they probably have every reason to be. Their training and work experience can vary greatly, and this variance directly determines the degree to which teachers will benefit from PD. With kids we can get away with treating them all the same and force-feeding them content. It’s wrong, but we can get away with it. Why? Because of how we assess learning for kids; not to mention students have less choice in terms of teachers and subject matter. As to student assessment, we do it by grade level with the expectation that all kids at a certain age should be able to demonstrate age-appropriate knowledge and skills. With adults, their ‘assessment’ is much more subjective. When we try to make it objective, the performance evaluation (let’s call it what it really is) devolves into a sideshow of the teacher stressing about an observation and the supervisor feeling obligated to comment in as many areas as possible, or in all the mandated areas.
Unfortunately, teacher observations and evaluations yield very little actionable information. That is unfortunate not only because we miss a chance to celebrate what’s good and can be shared with colleagues, but because we also miss the opportunity to identify areas for growth. It’s this latter point that really plays into how we differentiate adult learning. Because schools are often resource constrained (time, money and human resources), they try to provision PD on a few days throughout the year and limit planning to whole subject teams, sections, or even corral all staff into a single session. Organizing PD in this way usually devolves into teachers catching up on planning and reports. This approach is the first misstep in differentiation and mirrors what is wrong in many classrooms across the world: force-feeding content to ensure teachers are demonstrating ‘appropriate’ knowledge and skills. It’s a wonder if anyone turns up to PD with a growth mindset, with a question they want to answer and plans for how they will apply their newfound knowledge and skills.
What I have described above is the norm for almost every school I have worked with, surveyed or am familiar with through my work with various associations; not to mention the endless literature I have read recounting similar anecdotes. So, why are schools struggling to improve student learning? My belief is that you can’t help others until you have helped yourself. Put simply, how can we assess students if we do not have effective mechanisms to assess and improve our own practice? What’s the point of collecting student data if we are incapable of discussing it? If we don’t know where we need to improve, how can we know what data to collect? Perhaps this is a chicken-and-egg scenario: Chicken – collect student data and identify ways to improve practice; Egg – identify ways to improve practice and the data that measures effectiveness.
Below are six fundamental steps for identifying, implementing and measuring the impact of learning for teachers:
Conducting purposeful needs analyses;
Gaining buy-in from all stakeholders;
Using grouping strategies;
Identifying roles for every participant that match their level of understanding;
Providing time for meaningful evaluation; and
Ensuring time to assess and debrief professional development.
Needs analyses are probably the most complex element of the consultation. Most clients ‘know’ what they need, but they don’t understand why they need it. They may have ample data (qualitative and quantitative) that points to an area for improvement, but by fixating exclusively on that data they fail to surface underlying causes; they have the effect but limit the debate on the cause. By expanding dialog around cause, they open the door to achieving the second step, which is buy-in. I achieve these steps by getting to know all the stakeholders affected by the ‘need’ I am being asked to address. It is also critical that these stakeholders have a common question they want to answer.
Next, grouping is much easier to address once we have buy-in and an aligned professional inquiry. If I know my stakeholders, I can group them in terms of their inquiry, level of understanding and attitude towards the proposed intervention. In some cases it may serve my purpose to use homogeneous groups, so that discussions can be at a level appropriate for all members. In other cases I will use heterogeneous groups to build awareness and introduce perspectives across a group. This latter strategy also serves my purpose of ensuring each member has a role. To ensure engagement from the ‘naysayers’ (think ‘it’ won’t work) and the ‘avoiders’ (know ‘it’ already), two personalities that ultimately believe they have something better to do, I ensure activities have roles and questions to engage the attention and interests of everyone.
Lastly, for evaluation and debrief, I push leadership hard to identify behaviours that demonstrate impact and to follow up with staff directly. I do this by being very persistent in my post-workshop follow-up calls and meetings. The closer I keep the learning to the top of leadership’s minds, the more likely they are to engage in meaningful dialog and purposeful evaluation. Often this requires me to script conversations and devise evaluation tools.
These six steps are mirrored in every client engagement I have. As most schools do not have the processes or time to engage directly in these steps, I take responsibility for them. At a minimum, by ensuring the right questions are answered and the right stakeholders involved, I know that everyone in the room will benefit by having their beliefs:
Affirmed – They are already doing what is being taught and they can play a role in the workshop to help others understand how to apply the knowledge, skills and tools.
Shifted – I have provided a different perspective that has given them ideas for how to build on what they are currently doing.
Changed – The skills and tools taught are new to them and immediately applicable.
I invite you to experience this process at ACAMIS’s 7th Annual Leadership Conference: Leading Change from Vision to Implementation. I am coordinating this workshop with eight very talented and experienced facilitators. Together we are engaging with all participants before the workshop in a consultative process, using grouping strategies that reflect areas of interest and professional inquiry, and devising post-conference engagements to keep the learning at the forefront of every participant’s mind as they lead change in their schools.