Friday, November 20, 2009

Working ahead?

A question I recently posed to a teacher, who had some difficulty deciding what she would do:

Let's say you had a policy in your course that allowed a student to test out of a unit (and if you don't, that would be a good place to start). Once the student tests out, they have enrichment activities.

You notice a student who, in class, is reading the next chapter of the textbook, googling its key vocabulary terms, doing whatever would be involved in order to test out of that unit.

Do you as a teacher allow that to happen? Do you encourage it?

At first blush, this would seem to be the perfect scenario. The student has initiative and a measurable goal in mind. They are becoming an independent learner, acquiring knowledge in their preferred way. And they are working at their own pace.

Yet, as parents of students like this could attest, it doesn't happen much. Why not?

The first barrier is even having the option to test out of units in the first place. Often, the thought is that the curriculum is "too essential to let a student CLEP out." Or "too difficult to even let them try." Or, group work is required during this unit, so others in the class are depending on this student. Or, you can't "test out" of physical fitness. Or choir. Or...

But let's say a teacher sees the light and allows testing out. And then sees students cramming before the test-out. Isn't it difficult to let go of that locus of control? Wasn't the option intended for students who already know the material? Is it hard to accept that the student would rather not learn the material the way you teach it?

The problem is that, even in these situations, the class is teacher-centered. Not student-centered. Despite allowing students to test out, control really still rests with the teacher. Allowing it to happen is the first step. Encouraging it is the end goal of 21st century teaching and learning.

Friday, November 13, 2009

Mt. Washington is 6288 feet tall

Or so I learned this past week.

Today, my daughter takes her social studies test over the northeast states. I helped her study this week, and I'm glad to say I see a perfect score on the horizon (not due to my tutelage).

The format for this unit, at least as it was reported to me by my daughter (and this is often at variance with the truth), leaves much to be desired, however. Students have a thick social studies book filled with facts galore to cover every possible standard, benchmark, and performance indicator of all 50 states (to make the book marketable). But they don't really read the book; a CD audiobook of the text is played at the beginning of the week.

The next day, they are given a worksheet packet with 15 fill-in-the-blank questions on it. The teacher reads question number one, which reads: "Mt. Washington, which is in _________, has a height of __________ feet." The teacher asks if any students know the answer, and students try to remember what they heard the previous day. The teacher then gives the correct answer (or confirms it), and the class moves on to question #2.

That's it. Test given out today.

• There are _____ states (number). Name them.
• _________ is an important industry in the northeast because the coastline has many harbors.
• The type of government in which all of the people vote for laws is called a ___________
• When the pilgrims landed at Plymouth Rock, they drew up an agreement that would help them to make laws called the __________
• A ditch dug across land to connect one waterway to another is called a __________
• The _______ _________, built in 1817, is a long ditch that connects the Hudson River with the Great Lakes
• A way of making large quantities of the same product is called ___________
• In the U.S., your rights are made possible because of our plan of government called the _________
• Our national government has 3 branches, known as _________, __________, and _________
• The place where Congress meets is in the _________ building
• NYC is also known as the city of ___________ because people from all over the world live there.
• The reason that our nation's first factories started in the northeast was because of _______ power and __________ power
• List the state capitals from each of the northeast states.

Note a pattern to the questions? If so, you are better than I am. These are knowledge-level questions about geography, economics, government, and history. And several of them aren't even about the northeast.

This isn't to pick on my daughter's school or teacher. I think what I see here is an anomaly, and that literacy and math are not taught this way. It's more to pick on the lesson design, which I feel is unfortunately typical in elementary social studies curricula.

  1. There is too great a dependence on the textbook.
  2. There is too much information covered, given the little class time spent on it, and the information is incoherently jumbled together.
  3. There is a missed opportunity to work on reading skills embedded across the curriculum.
  4. The information covered is too low-level and fact-based, and doesn't lead to higher-level thinking.
  5. The assessment is non-authentic recall.
  6. The information is, for the most part, negligible (this will be the last time in my daughter's educational career that she will be expected to know how tall Mt. Washington is, or in what state it sits).

Here's the scenario this leads to: I started reading the question "this is an important..." and before I could get to "industry," my daughter shouted out the word "fishing!" I asked her an application question, "Where else is that industry important in the United States?" and she responded, "That's not on the test." "I know, but where else is it important?" "Dad, I don't know, we haven't gotten to the other parts of the United States."

You never will. There is too much in social studies to cover everything.

What I want her to know is that, being by a coastline, it is logical that fishing will be an important industry. Therefore, other states/nations with coastlines will also have a significant fishing industry. Even if we haven't heard the official textbook CD. The lesson design precludes her from making associations like this.

If you look at the questions in the list, they are not all equal. The state-capital knowledge is beyond negligible (how many of the 50 do you know, and has that had any impact on your life?). Some other questions could lead to deeper learning (like the Mayflower Compact or the Erie Canal), but presenting them as isolated facts and then moving on makes them irrelevant as well. Meanwhile, perhaps the two most important concepts in social studies, democracy and the Constitution, each worth an intensive unit in itself, get only one question apiece, buried in the midst of everything else. Very much a mixed message to students.

When we discuss the Iowa Core, this is an example of what we need to do. Get rid of the rest of that stuff. Determine what is important. And then build a deeper lesson, leading to deeper conceptual and procedural knowledge, with authentic and formative assessment. That is what will lead to permanent learning.

I give my daughter 5 months to forget that Mt. Washington is 6288 feet tall (she does have a good memory).

Thursday, November 12, 2009

Nate Silver, Math, and Quadrant D

The big mistakes we make with the rigor and relevance framework are assuming that higher-order thinking is "tougher" and that more relevant means a story problem written from a second-person perspective. Neither is the case.

Math is a great example. If you were a math teacher, and you were asked to make your curriculum more rigorous, you would probably eliminate some of the easier skills and replace them with tougher ones. Or, given that math often works sequentially, eliminate chapters 1-4 and replace them with chapters 21-24 of the textbook.

You'd definitely be upping the difficulty, but would that result in students having more rigorous thinking skills? Not if the replacement skills are taught the same way.

In that same vein, traditional math instruction is definitely analytical, I'd say more so than traditional language arts or social studies curriculum, and analysis is a higher-order skill. But is it critical thinking? How much do students actually critique what they are learning? Even the best math instructors struggle with that... it's hard to critique something as black and white as mathematical theorems.

One of the biggest names in math today is Nate Silver, founder of the popular political statistics blog FiveThirtyEight as well as inventor of PECOTA, the sabermetric statistical model for baseball. Silver's use of statistics and mathematical logic, in baseball as well as in politics, ranges from the simple to the extremely complex.

But above all, it's very accurate: PECOTA was the gold standard in predicting player performance and team results for years (other mathematical models have since caught up to it), and 538 successfully predicted 49 out of 50 states in the 2008 election with an algorithm built on polling data.

There are two ways teachers can use Silver's work as the basis for quadrant D math activities (and no, one of those is not to do a biographical report on him... the biographical report on the "famous mathematician" or "famous chemist" being one of the worst ways for a student to learn more about a subject). One is to look at how to mathematically measure an otherwise non-mathematical quality. Silver asked the question "How can we predict the success of a baseball team or a presidential candidate?" and then developed the mathematical model to do so. But that question could just as easily be used to rate the "best" musical act of all time or the most influential president of all time.

Starting with a theoretical question, and then looking at the math that could help you support an answer, uses a higher-order skill not often associated with math: creativity. It is also relevance... true relevance. Not just taking my problem and making it a problem that involves money, but actually looking at how real mathematicians would use the content and skills to solve a real-world, unpredictable problem.

The other way would be to look at the ethics behind math and statistics. As we mentioned earlier this week, Silver questioned the methodology of the polling company Strategic Vision, which had been producing some shaky polls used to advocate for certain political causes. His method was to look at the trailing digits of the company's published results, which showed a non-random pattern. Silver concluded the company was fabricating its polls, quite the claim.
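The flavor of that trailing-digit check is easy to sketch. Below is a minimal, hypothetical version in Python: tally the last digit of each reported percentage and compare the counts against the roughly uniform spread you'd expect from honest data, using a chi-square statistic. The poll numbers here are made up for illustration; Silver's actual analysis used Strategic Vision's published results and was considerably more involved.

```python
from collections import Counter

def trailing_digit_chisq(results):
    """Chi-square statistic for the last digits of poll results,
    compared against a uniform expectation (digits 0-9 equally likely)."""
    digits = [r % 10 for r in results]
    expected = len(digits) / 10
    counts = Counter(digits)
    return sum((counts.get(d, 0) - expected) ** 2 / expected for d in range(10))

# Hypothetical poll percentages; a real check would use the firm's
# published toplines.
polls = [47, 42, 51, 48, 43, 49, 52, 47, 44, 48, 53, 47, 42, 48, 47, 49,
         51, 47, 43, 48]
stat = trailing_digit_chisq(polls)
# With 9 degrees of freedom, a statistic above ~16.9 would be surprising
# at the 5% significance level.
print(round(stat, 2))
```

With these made-up numbers the statistic stays below the suspicious threshold; Silver's point was that Strategic Vision's digit frequencies deviated from what chance would allow far more than this.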

Not only has this episode been relevant in terms of its impact on the world we live in, but it has also unfolded before the viewer's eyes through an ongoing blog (Silver does a fairly good job of explaining some very abstract patterns). Again, you have the task of working through an unpredictable situation (proving a polling company is lying is not a predictable part of any curriculum). And, oh by the way, Strategic Vision hasn't released a poll since Silver's inquiry.

Now, this comes from a non-math teacher's perspective, but this is what the Iowa Core is getting at. The counter-arguments to doing this quadrant D work would be that 1) it's time-consuming and 2) it takes us away from the mastery of essential skills and concepts. But that's actually its strength. It takes the student from being a collector of formulas and theorems to being a critical thinker in an inquiry-based setting, and it gives the student an appreciation for how math is relevant in the world around them.

Arthur Benjamin talks about statistics (and how it is overlooked in American curricula) in this short TED talk.

Other links
An explanation of PECOTA
The methodology behind 538

Sunday, November 8, 2009

A Lesson from Oklahoma

Earlier this fall, the Oklahoma Council of Public Affairs released the results of a survey it had commissioned about the civics knowledge of high school students, and they were very alarming. Only 23% identified George Washington as the first president, 29% identified the president as the head of the executive branch, and 43% identified Democrats and Republicans as the two major parties in America (10% identified the two parties as Communist and Republican). OCPA decried the results as proof of the failure of Oklahoma's educational system, and the survey's findings were repeated in many major publications.

But almost as soon as they were released, questions began to be raised. Could it really be, as the survey claimed, that only 2.8% of the 1,000 high school students polled passed the test (a meager 6 out of 10 correct)? And that none got 8-10 correct? In a random sample that would include roughly 600 college-bound students and 50 gifted students, not one scored 8 or better?

This drew scrutiny, most notably from statistician Nate Silver. Even granting the assumption that only 23% of the sampled students knew about Washington, the results still looked fabricated. Simply put, the distribution of student scores matched almost exactly what you would expect if every student had the same probability of answering each question correctly. But students are not homogeneous... a student who gets the first three questions right is much more likely to get #4 right than one who got the first three wrong.
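Silver's homogeneity point can be demonstrated with a quick simulation (the numbers here are illustrative, not the actual survey data). If every one of 1,000 students answers each of 10 questions correctly with the same 25% probability, scores follow a binomial distribution and almost nobody reaches 8 out of 10. Let ability vary from student to student, and a healthy tail of high scorers appears.

```python
import random

random.seed(1)
N_STUDENTS, N_QUESTIONS = 1000, 10

def score(p):
    """Number of correct answers if each question is answered
    correctly with probability p."""
    return sum(random.random() < p for _ in range(N_QUESTIONS))

# Homogeneous: every student has the same 25% chance on every question.
homog = [score(0.25) for _ in range(N_STUDENTS)]

# Heterogeneous: ability varies widely from student to student.
hetero = [score(random.uniform(0.05, 0.95)) for _ in range(N_STUDENTS)]

high_homog = sum(s >= 8 for s in homog)
high_hetero = sum(s >= 8 for s in hetero)
print(high_homog, high_hetero)
```

Run this and the homogeneous group produces essentially no high scorers, while the heterogeneous group produces many, which is exactly why a real student population with zero scores of 8-10 out of 1,000 looks fabricated.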

Silver wasn't the only one, as he mentioned today. State representative Ed Cannaday, a former educator, also thought something was fishy. He conducted the same survey in school districts within his own district in Oklahoma (N = 325), and he found an entirely different set of results: 98% identified George Washington as the first president, 85% identified the president as head of the executive branch, and 95% identified the correct two political parties. The average score in Cannaday's survey was 7.8 correct out of 10, very striking considering the OCPA survey claimed that none of its 1,000 students scored more than 7 correct.

While Silver doesn't dwell on it, the subtext is clear: OCPA is a conservative group pushing educational policy changes. And judging by the reach of the survey's results, which landed in prominent places in Time, Newsweek, and USA Today, the survey was successful, however dubious. Now that headlines have blared that Oklahoma's public schools are failing miserably, the damage is done, and a page 12B follow-up article will not undo it.

The lesson here is twofold. One, in an era of data-driven decision making, standardized assessment results still run secondary to sensationalized opinion polls when it comes to swaying public opinion. And two, schools are apparently not off-limits for fabricated politicization. Which means one thing...

If you are an Iowa legislator, you had better be extra critical of any external data used to describe the achievement of Iowa's students.

And if you are a newspaper, like let's say a Register in the state capital, you should be equally cautious.

Using standardized data as a measuring stick for how well the nation's schools are doing is troubling enough for many reasons. It becomes infinitely worse when using non-standardized survey data.

There are a million reasons behind the data, all the way down to whether the students had any breakfast that morning, whether the test had cultural bias within it, or whether a teacher happened to give the students the answers. Now we have to add the nefarious purposes of the testing provider to the list.

Monday, November 2, 2009

Parent-Teacher Conferences, and Asking the Students

Parent-Teacher conferences come into full swing with the start of November. P-T conferences were frequently disappointing in our districts: parents had busy schedules and were already keeping track of grades online, which decimated participation. Even requiring parents to come to P-T conferences in order to get their student's report card didn't work; parents came in, got the paper, and immediately left. I had a similar experience with a face-to-face parental advisory group when I was a principal.

Around 2003, I stopped putting any hope in P-T conferences or parental advisory groups and instead prioritized online communication. And it worked, tremendously. I had a nearly 80% participation rate in my parental advisory group with an online communication format. I found online communication better for two reasons: I could initiate the conversation at any time, and parents were more likely to respond.

The takeaway was this. Parents care. They want what is best for their children, and they want a voice in that education. There was something about the convenience of interacting online that made it possible.

What I realize now is that I didn't offer them a forum to talk to each other online. All communication was back and forth via email with me. A discussion group or a blog would have produced more authentic sharing.

Which brings me to what Russ Goerend is doing. In addition to his other online professional learning activity, he uses blogging as a centerpiece of his teaching. His blog serves as ongoing writing instruction for students as well as a springboard for student discussion. Moreover, the blog offers parents a chance to not only find information and access notes, but also to participate in the discussion about how to improve the classroom.


Come to think of it, there was one compliment I received at my P-T conferences that was very rare for other teachers (I take it as a compliment... those who disagree with my pedagogical outlook will consider it a weakness of mine). Parents really liked the fact that I asked students for their thoughts on what we were learning in class and how we learned it.

We went beyond just offering multiple project choices for individuals. We actually had discussions about the format of instruction and the ways we learned best, and sections of my courses were taught completely differently from other sections based on what the students argued worked best for them. These discussions usually spilled over into individual conversations with students, both inside and outside of class. Frankly, I felt like I was doing what I should have been doing; it was their education, after all.

Russ, of course, does it better than I did. I love his open threads for students and parents alike to contribute to. Teachers and administrators need to do more of this. It creates an empowering, ongoing dialogue and a culture of continuous improvement. More tomorrow on the students' responses about the technology teachers should use.

Doing this requires a teacher to be open to criticism of their teaching, to admit that they do not have all the answers and are looking for input. I won't be naive; this is a major hurdle for most teachers, whether we want to admit it or not. But it is one we need to overcome.

Related Post:
Why aren't we asking students?