Yesterday, a friend sent me this post from Frances Woolley, an economics professor at Carleton University in Ottawa, ON. Her basic point is that there is a “mathematics gap” between older professors and younger students:

Here’s my theory: Some students struggle with economics because they do not fully understand the mathematical tools economists use. Profs do not know how their students were taught mathematics, what their students know, what their students don’t know – and have no idea how to help their students bridge those gaps.

The biggest difference is in the use of calculators. Older professors may never have had access to them, whereas younger students may have started using them at age 6 or 7.

I’ve written before about the problems with over-reliance on calculators, but I’ve never thought of the problem in quite these terms. As someone who grew up in the calculator age but with teachers who generally didn’t allow calculator use (or, in college, wrote problems that made calculators useless), I always thought of the issue as a matter of all-too-common bad teaching, but not as something that could lead to serious misunderstanding between professor and student.

The part I find most interesting is her claim that there is more than just an “arithmetic gap”; there is a “mental math gap,” because all this calculator use has led to different ways of thinking about mathematics:

But the mental arithmetic gap has more subtle implications. Mental calculations often require intuition about, and comfort with, the use of fractions. Pre-calculator: 1/3+1/3=2/3. Calculator era: 0.3333….+0.3333….=0.6666…. Pre-calculator: “To multiply by twenty-five, divide by four and add two zeros (25*Y=1/4*100*Y)” Calculator: Multiply by twenty-five. Back in the day, fractions were easier than – or at least not much more difficult than – decimals. Calculators make fractions obsolete.
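Woolley’s contrast between exact fractions and truncated decimals is easy to see in code. Here is a quick illustration (my own, not from her post) using Python’s standard-library fractions module:

```python
from fractions import Fraction

# Pre-calculator style: exact fraction arithmetic
assert Fraction(1, 3) + Fraction(1, 3) == Fraction(2, 3)

# Calculator style: truncated decimals pick up rounding error
print(0.3333 + 0.3333)  # close to, but not exactly, two thirds

# The multiply-by-25 trick she quotes: 25 * y == (y / 4) * 100
y = 48
assert 25 * y == (y // 4) * 100
```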

I’m particularly intrigued (though not surprised) by Dr. Woolley’s point about the tendency to gravitate towards decimals vs. towards fractions. It has been a long-standing frustration for me that my students *insist* on converting their final answers into messy decimals, even when the fractional answer they had was quite elegant. For many of my students, it seems that *fractions just aren’t meaningful answers*. I’d always assumed that this was just a strange cultural difference between high schools in the United States and most of the professional mathematics world: teachers and textbooks insist on answers in decimals, but most mathematicians find fractional answers more elegant. (Of course, most mathematicians rarely come up with answers that have enough actual numbers in them to convert to a decimal.) It had never occurred to me before that this difference might be the result of calculator use, but now that it has been pointed out to me, it makes perfect sense.

But (putting my Montessorian hat on), I think there is a deeper problem with this trend to prefer decimals over fractions: fractions are more concrete. This probably sounds crazy to those who struggle with math, most of whom find fractions especially challenging. I suspect this is because the *algorithms* for computing with decimals are indeed easier to learn than those for fractions, and with a calculator, they are completely trivial. There’s no need to remember when or how to find a common denominator, and so on.

The trouble is that it’s harder (though not that much harder) to fully grasp what those decimals mean. Yes, you can make a concrete “manipulative” representation of decimals, but since they operate on an exponential scale, in order to represent millionths, you’d have to represent units with something like a half-meter cube, and even then your millionth cubes would only be half a centimeter on each side. That can give a nice visual impression, but it’s not so good for actually moving pieces around to solve a problem (especially if you also want to include whole-number categories; your thousands would take up most of a room).

Moreover, once you get this concept (and it is a crucial one) that a tenth is ten times bigger than a hundredth and a hundred times bigger than a thousandth, I think that’s the end of the serious mental reasoning that you can do with these visual impressions. Hundredths, thousandths, etc. are just too small for any realistic mental estimation. If I ask you to picture 0.33333, can you do it without looking at the number, saying “oh, that’s about a third,” and then thinking of one third? I can manage to picture three tenths and three hundredths and three thousandths, etc., but this is an absolutely meaningless pile of blocks, not a useful “amount.”

On the other hand, fractions, which are trickier to manipulate on paper, can be beautifully represented. Take a circle, chop it into two equal pieces. Now you have halves. Take the same circle, chop it into three equal pieces: thirds. Now chop the circle into four equal pieces: fourths. With these pieces, we can not only get a visual impression of how big different fractions are and how they relate to each other; we can physically see why it is that adding 1/3 to 1/4 without finding a common denominator is meaningless, and we can see that 7/12 is the same size as 1/3 and 1/4 put together, even though the numbers are all different. We can use our hands to find equivalent fractions and then learn how to find common denominators. And most importantly, we can learn to estimate these amounts. Where 0.33333 converts to a meaningless pile of blocks in my head, when I think of one third, I see a third of a circle. (It’s also red and metal, but I think this may be an artifact of many years as a Montessorian!).
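The circle-pieces reasoning above translates directly into exact arithmetic. A small sketch (my own illustration) with Python’s fractions module, which finds the common denominator automatically:

```python
from fractions import Fraction

third = Fraction(1, 3)    # one third of the circle
quarter = Fraction(1, 4)  # one quarter of the circle

# Adding requires a common denominator (twelfths); Fraction handles it
assert third + quarter == Fraction(7, 12)

# Equivalent fractions: the same piece of circle under a new name
assert third == Fraction(4, 12)
assert quarter == Fraction(3, 12)
```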

Perhaps this view is the result of being a Montessori child, since I was exposed to fractions for much longer and in a far more concrete form. If the order had been reversed and I’d spent years working with very, very concrete decimal materials, would I think better in decimals? I doubt it. Many of my students seem to think that the decimals are more concrete and easier to “think in,” but I wonder, is this because they can relate to those decimals as old friends, seeing where they fit in the continuum of numbers and estimating how big they are compared to other, more common numbers, or is it because the decimals “look right” but fractions look scary and strange?

I got here by searching on the subject. Let me preface this by saying I am a software developer, so I basically have one giant calculator to work with all the time.

I have never been a fan of fractions. I completely understand how they can form a better visual for simple division, but to my mind so can a percentage, which is effectively a decimal * 100.

To me, a fraction is an unfinished answer or, if you prefer, a representation of an answer. Consider the following trivial question:

What is 15 – 5?

Whilst correct, the answer 3+7 is not complete for me, and I feel similarly about fractions.

Which is easier to visualise?

0.76, immediately recognised as 76%, or

19/25

My happiness with decimals may well be born of my profession, and I know fractions won’t be going anywhere soon, but looking back, I would have been much happier working this way.
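The commenter’s comparison can itself be sketched in a few lines of Python: the standard-library fractions module holds all three views of the same quantity at once.

```python
from fractions import Fraction

f = Fraction(19, 25)       # the "unfinished" answer
print(float(f))            # the decimal: 0.76
print(f"{float(f):.0%}")   # the percentage: 76%
```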