We wrapped up our curriculum network's 08-09 school year on Friday. One of the aims of the curriculum network this year was to introduce different web 2.0 tools to the curriculum directors for awareness. So, we showcased blogging, Delicious, Google Docs, Moodle, podcasting, wikis, Skype, RSS, and Twitter... one for each month, 30 minutes at a time.
Of course, if the aim is only awareness, that is fine. But if the aim is to make a difference in schools, or to lead to deep understanding, this is not best practice. And unfortunately, this is what sometimes happens in schools when they focus on technology integration.
So, perhaps the best thing we did all year was our "putting it all together" framework.
As you can see, this framework helps put the tools in the context of a professional development plan. We talked about two concepts during the year, PLNs and the Digital Curriculum, and the framework helps separate the two.
But what was most important were the five common pitfalls schools fall into when implementing technology professional development.
1. Not focusing (and finding clarity) on the white. It's not about the tool; it's about the end outcome. And if districts don't start with the outcome and the common pedagogy (the white sections above), technology integration won't make a difference in their schools.
2. Lacking SMART outcomes. SMART goals are specific, measurable, attainable, results-based, and time-bound. Having teachers learn about blogging is not a SMART goal. Even having them use blogging in the classroom is not a SMART goal. The outcome needs to be a transformation in what truly matters: student achievement. Too often, technology professional development is not aimed at a specific, measurable result in student achievement, so there is no way to gauge its efficacy.
3. Overdoing the gray. Much like I said above, if we fall into the trap of trying to do all the tools for all the teachers, nothing will get done. Norwalk, with all its focus on technology integration, focuses on one tool... Moodle. And that's enough.
4. Not moving past modeling. When it comes to the actual steps of the plan, the Iowa Professional Development Model, built on the work of Joyce and Showers, has to be followed. And too often, technology professional development is only modeling. Teachers are expected to practice the technology on their own instead of during professional development time, and with their other responsibilities, that practice gets lost.
Not only does the faculty need to start with the overall knowledge first (in this case, answering the question "how is teaching and learning different in the 21st century?") and follow up modeling with practice, but they also need to build in time for administrators to observe the practice and for teachers to give and get feedback on how it is working.
Joyce & Showers describe this as the importance of coaching, and while it has big implications for all professional development, it is sorely underutilized in technology. How well is a teacher "coached" through using a new tool? Aren't they usually shown the tool and expected to use it (or not)? And we expect this to lead to better instruction?
5. Forgetting about the black. Of course, I remind our schools that they aren't alone: we can help them understand this process, ask good questions to seek clarity, set good outcomes, and put quality steps in place. But really, the biggest problem is schools not gathering actual data to show whether technology integration is effective.
And this is the greatest de-legitimizer of technology professional development out there. If we go through the motions and put time and effort into the practice, but have no data to show whether it is truly working, then we are fooling ourselves. We have to show that it works, or we have to stop doing it. And yes, the "I can feel the difference technology makes in my class" anecdotal data are nice, but they don't cut it; giving students candy will change the engagement level in your room, too, but it won't have an effect on student achievement.
But it is equally foolish to rely only on ITBS data when your desired outcome is something that test doesn't measure, such as growth in creativity, collaboration, authentic learning, problem-solving, and relevance. We have to gather better data through good walkthrough observations, student surveys and feedback, and better performance assessments and rubrics.
Packaging these five common pitfalls together with the framework made a lot of sense to people. And as I joked with them, after a year of me showing them tool after tool, they probably thought it was about time!