### February 2010

Monthly Archive

February 23, 2010

In the previous post on this subject, we mentioned the state committee that was uncomfortable with the idea of focusing on formative or ongoing (while-teaching) assessment. Our answer to them: “Yes, it does take good teaching!” There is a saying in music that there are no choirs that sing out of tune, only choir directors who allow them to. Several members of the committee seemed to accept a certain level of out-of-tune teaching as a given.

While recognizing the importance of testing, that post asked whether it should be the primary means of seeing how students struggling in math are doing. Tests have an aura about them, especially for students already struggling, that ongoing assessment may not.

Formative assessment is popping the hood, so to speak, and seeing what’s going on inside the minds of the students: how they are handling the current lesson, and their facility with related prior concepts and skills. The focus is on identifying and solving problems.

For instance, students struggling with variables may have an issue with the concept of a variable itself, or their confusion may stem from specific gaps in prior knowledge or skills. They may have missed that subtraction and addition, division and multiplication, and squaring and finding the square root are all opposite procedures. Stepping back further, they may not understand the concept of opposite procedures at all. Or they may have problems with division or squaring itself.
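The chain of opposite procedures that equation-solving depends on can be made concrete. Here is a minimal sketch (the equation and numbers are hypothetical, chosen only for illustration) of how each step undoes one operation applied to the variable:

```python
# Solve 3*x + 5 = 20 by applying opposite (inverse) procedures in turn.
# Each step undoes the last operation that was applied to x.
rhs = 20
rhs -= 5          # subtraction undoes the "+ 5"
rhs /= 3          # division undoes the "* 3"
x = rhs           # x is now 5.0
assert 3 * x + 5 == 20
```

A student with a gap at any one link in this chain (say, not seeing division as the opposite of multiplication) will stall at that step, which is exactly the kind of cause formative assessment is meant to uncover.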

Formative or ongoing assessment is measurement that is built into the fabric of teaching and learning. With it we notice the problem and can immediately adjust our teaching to uncover and address the cause.

In the next post on this topic, we’ll look at a few formative assessment approaches.

February 22, 2010

This week’s problem is similar to last week’s. Students’ development of mathematical reasoning involves making generalizations, and deeply felt generalizations are the result of making *many* specific observations (not just one)—and not merely reiterating the words that we or a book have pronounced. This week’s problem is intended to reinforce and extend the development of reasoning begun with last week’s problem.

Again, imagine that you had never heard of the Pythagorean Theorem (a^{2} + b^{2} = c^{2}), and that you did not have a ruler. Using only the drawing below, *determine the perimeter* of triangle WKX. All corners that look square are square. The answer will be posted next week.

February 22, 2010

In keeping with our posts on guiding discovery rather than explaining, we present some hints to guide your discovery of the solution to Problem of the Week #2. Do you have other solutions, observations, or comments?

1. Touch the shaded triangle. Tell the name of the triangle (“WKX”).

2. The shaded triangle is part of a rectangle. Name the rectangle.

3. What is the area (in square inches) of that rectangle?

4. What is the area of the shaded triangle?

5. What is the area of triangle WXD?

6. What is the area of triangle XYA?

7. What is the area of triangle YZB?

8. What is the area of triangle ZWC?

9. What is the combined area of the four triangles inside the dotted lines (WXD, XYA, YZB, and ZWC)?

10. What is the length of line segment XL?

11. What is the length of line segment AY?

12. What is the length of line segment ZM?

13. What is the length of line segment BY?

14. What is the length of line segment AB?

15. What are the lengths of line segments BC, CD, and DA?

16. Is figure ABCD a square? What is its area?

17. What is the total area inside figure WXYZ?

18. Is WXYZ a square? Here’s how to find out…

19. Angle WDX is a right angle (90°). If you add angles DWX and DXW together, is the sum 90°?

20. Is angle AXY the same as angle DWX?

21. If you add angles DXW and AXY together, is the sum 90°?

22. Can you prove that WXYZ is a square?

23. What is the area of WXYZ?

24. If you know the area of a square, can you determine the length of one of its sides?

25. What is the length of line segment WX?
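In general terms, the area argument these hints build toward can be written out as follows (using hypothetical leg lengths a and b read from the drawing, which is not reproduced here; ABCD is the outer square of side a + b, WXYZ the inner square):

```latex
% Area of the inner square WXYZ = outer square ABCD minus the four right triangles:
(a + b)^{2} - 4 \cdot \tfrac{ab}{2} = a^{2} + 2ab + b^{2} - 2ab = a^{2} + b^{2}
% Since WXYZ is a square with this area, each side has length
WX = \sqrt{a^{2} + b^{2}}
```

This is, of course, the Pythagorean relationship rediscovered from areas alone, which is the point of asking students to work it out without the formula.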

February 20, 2010

We recently came across a blog post stating something that seems logical and helpful: before we start working with students to help improve their math performance, we need to test them to know which areas to focus on. A quick search on the internet reinforced this. A number of sites discussed different types of assessment: diagnostic (prior) assessment, formative (ongoing) assessment, and summative (post) assessment. But when we looked for examples of each, formative assessment received far less attention than diagnostic and summative testing.

This reminded us of when we made a presentation before a state curriculum adoption committee. We explained that since we write intervention materials designed to support students struggling in math, formative assessment is built into the fabric of each lesson in the program. While there are pre-, post-, and mid-chapter tests, the emphasis is on the problem-solving support that ongoing formative assessment gives these students. One member of the committee laughed and said, “Well, that requires good teaching!”

It appeared that everyone in the room recognized that good assessment is systematic and ongoing. But several members of the committee saw repeated testing as *the* means of ensuring ongoing assessment.

We won’t go into possible reasons for the apparent preference for testing. Rather, in this post, we want to lay out some problems with relying on testing as a primary means of assessment, especially for students challenged by math. In the next post, we will think through the value of formative assessment.

Going back to the opening paragraph, diagnostic or prior assessment is essential; we’re flying blind without it. But favoring tests over integrated ongoing assessment may miss part of the potential power and value of assessment, and may even create some of the very problems our best teaching is trying to address.

When taking tests, students challenged by math will continue to write some incorrect answers. They will leave some answers blank. And a test is usually graded by someone else, not the students, and is usually handed back at a later time, disconnected from the experience of that assessment. So what’s the problem with this?

Students who are not doing well may not give us the best indication of what they know and can do by taking another test. Continuing to write incorrect answers can reinforce memories of those wrong answers, reinforce faulty procedures leading to the incorrect answers, and reinforce partially understood or misunderstood concepts. Having to leave some answers blank can reinforce a negative self-image, and frustration with and resistance to the study of math. It also means that needed practice on those problems is missed. When the test is graded by the teacher and handed back at a later time, the students are no longer engaged in the problems and many will not retrace their steps in arriving at the wrong answers.

Assessment has multiple uses for teacher and student. It gives the teacher some evidence of student preparation and performance, which informs what is taught and how it is taught. It also provides students with feedback. While testing is an essential component, should it be the primary form of feedback for the intervention teacher? The teacher may not know from a test which problems the students labored over or solved quickly, whether they got them right or not. The test may not reveal which prior conceptual, procedural, or simple nomenclature gaps led to an incorrect answer. Depending on when a test is given, it may be too late to go back and address certain problems. And because of all of this, it may not be the best form of feedback for students struggling in math either.

In a later post, we will look at a different approach, growing out of one root meaning of the word assessment: to sit beside and observe.

February 17, 2010

The New York Times reported in 2008 on a study from Ohio State University suggesting that using concrete, real-world examples and manipulatives to teach math does not enable students to transfer their knowledge to new problems. The researchers found that students who learn through examples do not do as well as those who learn only the abstract symbols.

There undoubtedly are some important “real world” lessons for us to reflect on in this study. Mere activity with manipulatives does not guarantee that learning will take place. Manipulative lessons are no exception to the need for well-designed and well-delivered instruction. Do we give proper attention to making the connection between manipulation and computation? Are students guided in making generalizations based on the specific actions and observations involved in real-world lessons? It is not automatic. It is not enough to simply present students with the abstract symbols after working with manipulatives, as this article suggests was done. How many teachers are comfortable with teaching abstract formulas, but less skilled in designing and presenting lessons that draw effective learning from tangible objects?

While there is much to think about, there are also questions about the study itself. What about developmental readiness for abstract thinking? This study focused on college students, not elementary school students. What about multiple learning styles? Were the students guided in making generalizations based on their specific observations? Were they guided in connecting manipulation and computation? Again, the article suggests that the students were simply presented with the abstract symbols after working with real-world examples. In other words, were the teachers skilled in designing and implementing effective manipulative lessons? Pre-existing bias and relative skill with different styles of teaching make a difference.

Not recognized in this study is that students who are taught a number of abstract formulas can remember them poorly, confuse them with each other, unwittingly mix parts of different formulas together, apply them incorrectly, and most importantly, fail to understand the actual meaning behind the formulas. For example, students who are taught several formulas for determining perimeters, circumferences, areas, volumes, and surface areas for different kinds of shapes can still find it confusing to determine the perimeter of a simple rectangle. They are so focused on trying to recall the abstract formula that they are unable to think about what it actually means, or to check the reasonableness of their calculations.
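The rectangle example can be grounded in meaning rather than formula recall. A minimal sketch (with hypothetical dimensions) showing that “walk around the shape and add the sides” and the memorized formula are the same thing:

```python
# Perimeter of a rectangle: instead of recalling P = 2(l + w) as an
# opaque formula, think of walking around the shape, adding each side.
length, width = 8, 3                    # hypothetical dimensions
walk_around = length + width + length + width
formula = 2 * (length + width)
assert walk_around == formula == 22     # same result, two ways of thinking
```

A student who holds the walk-around picture can also sanity-check the formula's output, which is exactly the reasonableness check the paragraph says formula-focused students skip.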

And what do the researchers think is the purpose of math education? Is the goal just to pass a test? Or is the goal to be able to harness math concepts and skills for use in the real world of objects and actions? If we expect children to learn to apply math in the real world, then would it not be useful to make that connection all along the way?

February 15, 2010

Imagine that you had never heard of the Pythagorean Theorem (a^{2} + b^{2} = c^{2}), and that you did not have a ruler. Using only the drawing below, determine the exact length of line segment WX.

All corners that look square are square. The answer will be posted next week.

February 12, 2010

How do we help students build greater competence and confidence in adding and subtracting with negative numbers? Many use a thermometer but may not approach it like this.

The most important thing to remember: do not explain this to your students. Use a thermometer and guide their discovery and experience of it. Have them start at zero and move in the direction of more hot, still more hot, more hot, and then less hot, less hot, and less hot. Start back at zero and now move in the direction of more cold, colder, still more cold, and now less cold, less cold, and less cold.

Now, think of + as more and (+) as hot, so +(+) is more hot. And think of – as less and (-) as cold, so -(-) is less cold. Ask them: more hot, or +(+), goes which way on the thermometer? They respond, up! Less cold, or -(-), goes which direction on the thermometer? They respond again, up!

Then what about more cold, or +(-): which way is that on the thermometer? Down! And likewise, less hot, or -(+), is which direction on the thermometer? Down!

So simply start with the thermometer, having the students show which direction +(-), -(-), and so on go. Then slowly introduce numbers. Ask how far and in which direction: +(+3) or +(-2). When they are comfortable with this, finally give a starting number: 3 + (+4), or 5 – (-2), or (-5) + (-2), and so on.

You can slowly introduce nomenclature, without making a big deal out of it, as the experience is settling in. With the problem +(+3), we are **adding a positive**: going how far, in which direction? Or with (-5) + (-2), we’re starting where, **adding a negative**, going how far, which way, and ending up where? Introduce the terms as they continue to experience on a thermometer what is starting to become familiar to them.
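For teachers who like to check the model against the arithmetic, the thermometer reasoning above can be sketched in a few lines. This is a hypothetical illustration, not part of the lesson itself; the function name and signature are our own:

```python
# Thermometer model for signed addition and subtraction:
# the operation sign means "more" (+) or "less" (-);
# the sign inside the parentheses means hot (up) or cold (down).
def thermometer(start, op, signed_value):
    """Move from `start`: op is '+' (more) or '-' (less);
    signed_value is the parenthesized number, e.g. -2 for (-2)."""
    direction = 1 if op == '+' else -1   # "less" reverses the move
    return start + direction * signed_value

# More hot goes up; less cold also goes up:
assert thermometer(3, '+', +4) == 7      # 3 + (+4)
assert thermometer(5, '-', -2) == 7      # 5 - (-2)
# More cold goes down:
assert thermometer(-5, '+', -2) == -7    # (-5) + (-2)
```

The point of the model is that students arrive at these same answers by moving along the thermometer, before any rule like “subtracting a negative is adding” is ever stated.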

Watch the second half of our podcast on Guided Discovery for some ideas on how to set up this lesson by orienting students to the concept and a thermometer.

http://www.masterylearningsystems.com/Podcasts/CNR-GD.mp4
