A long time ago I was determined to watch every interview with David Tennant, so I listened to the Christian O'Connell Breakfast Show a few times. Once there was a conversation among the hosts (David wasn’t there yet) about common fractions and why they are still being taught at schools. It started with Christian talking about how he was helping, or trying to help, his daughter with her homework the day before, and soon we learned that neither of the hosts was able to express 6.85 as a common fraction. Then David came in and was asked the question. He easily answered that it's six and eighty-five hundredths, and started to divide 85 and 100 by five, but struggled a little without a piece of paper. Before the topic was finally abandoned, one of the hosts asked: __what do we even need common fractions for, in a decimal, computerised world?__

*(as usual, click on pictures for sources)*

And this is what I’d like to talk about. About the show itself, let me just clarify that I 1) don’t believe you should be able to do calculations without a piece of paper at six in the morning and 2) don’t believe you should remember everything you learnt in Maths classes. But shouldn’t you rather read a page from your child’s textbook, think a little and get it, instead of talking in the studio the following day about how difficult and useless it is?

**Why do we learn about common fractions?**

Our intuition about fractions comes from common fractions. A fraction is a part of a whole. We intuitively understand fractions as results of division. *A half* of an apple is when you cut the apple into two equal parts and take one. When you get a part of a cake that has been cut into four equal pieces, you get *a quarter*. One of six equal slices of a pizza is more than one of eight equal slices of the pizza. Compare: what’s 0.4 of a pizza? Why does cutting something in half give you 0.5?

Decimal fractions are often better for calculations – and that’s why we learn about them and use them in banks and other places – but they work as a way of writing, not a definition. To understand them, you must know the decimal system, while you can use common fractions even without knowing how to write.
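If you'd like a machine's help with the conversion that stumped the hosts, here's a minimal sketch using Python's standard `fractions` module (my own choice of tool for illustration; only the number 6.85 comes from the show):

```python
from fractions import Fraction

# 6.85 written as a common fraction: six and eighty-five hundredths.
price = Fraction(685, 100)

# Fraction reduces automatically: 685/100 = 137/20, i.e. 6 and 17/20.
print(price)                      # 137/20
print(price == Fraction("6.85"))  # True: same number, two ways of writing it

# Split off the whole part to read it aloud: "six and seventeen twentieths".
whole, rest = divmod(price.numerator, price.denominator)
print(whole, rest, price.denominator)  # 6 17 20
```

The dividing-by-five that David was attempting in his head (85/100 = 17/20) happens automatically in the constructor, which always stores the fraction in lowest terms.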

One can argue that we should tell children about common fractions so they understand fractions as the idea and create correct intuition in their minds, then tell them how to convert common fractions into decimals and let them forget about common fractions more complicated than *a third*. Apart from the fact that the notation with numerators and denominators is convenient in Mathematics – rational numbers have many special properties, decimals can be infinitely long, there are rational functions etc. – common fractions have some huge real-life advantages. They can give more information than decimals: *4/7 of our cars are Toyotas* tells you the company has seven cars (or a multiple of seven), while *0.7 (or most likely: 70%) of our employees are men* doesn’t tell you how many employees there are, or even whether there are ten of them or a hundred. Besides, you can easily turn a common fraction into a ratio. We often use proportions: our TV screens are *16:9* and there are *two parts butter for three parts flour* notes in cookbooks. In that context, 1.5 doesn't give us as rich information as 3/2 (or 3:2).

**Decimal fractions have their traps.** Let me show you something interesting. Are you sure you really understand the decimal system? Answer me then: how much is 0.(9)?

It's easy, right? It's 0.999999999..., with an infinite number of nines. Quite right. Tell me then, how much is 1 - 0.(9)? In other words, how much smaller than one is 0.(9)?

Many people say it's *a very small number*, but they can't tell exactly what it is. The correct answer is – and it might be surprising when you hear it for the first time – that 0.999999999... __equals__ one. I'm not kidding. There are many ways to prove it and the one with infinite series is the most 'convincing' for me, but I think some people might be satisfied with a simpler explanation. We know that 0.(3) = 1/3. Now multiply both sides of the equation by 3. We get 0.(9) = 1.
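For the curious, the infinite-series argument can be made concrete with exact arithmetic. Here's a small sketch of it, again in Python's standard `fractions` module (my own illustration, nothing beyond the standard library):

```python
from fractions import Fraction

# Partial sums of 0.9 + 0.09 + 0.009 + ... computed exactly,
# together with how far each one still is from 1.
total = Fraction(0)
for n in range(1, 8):
    total += Fraction(9, 10**n)
    print(total, "   gap to 1:", 1 - total)
```

After n terms the gap to 1 is exactly 1/10^n. It shrinks below any positive number you care to name, so the only value the infinite sum can have is exactly 1 – which is what 0.(9) = 1 means.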

This is what the decimal system plus infinity (an idea that often goes against intuition) does to you. Some numbers have two representations in the decimal system, and the number one is one of them.

Decimal fractions, as a part of the decimal system, are extremely useful. When you see 6.85 as a price you probably think of the number as *a bit less than seven* and it’s enough for making a decision. You don’t have to convert all decimals into common fractions. But that doesn’t mean common fractions aren’t useful. They come in handy when you’re in a kitchen or trying to get 3/5 of a council’s votes. They give you a better perspective. So maybe, in the era of information, instead of bragging about our gaps, let’s fill them (or just be quiet). If not for our own sake, then at least so we can help our kids with their homework.

If you want to listen to the show, here's the link. It's from 2015. The topic starts at about the 5-minute mark.

## Comments

**shivver13**

I think the real gist of the discussion of the show is, "I don't have a use for this topic, so why should we need to learn it?" The difficult part is demonstrating to those people who don't think they use simple fractions how much they actually do use them, and how much more they could use them if they actually had bothered to learn them well.

Using the example that they gave, 6.85, I wonder if either of them can picture how much 0.85 or 85% actually is? It's a lot easier to picture 17 out of 20, or halfway between 8 out of 10 and 9 out of 10.

I have a similar discussion periodically with my husband about a different topic. He argues that in the digital age, analog watches have no use. I think he's wrong. If you look at a clock that reads "3:52", you have to read it to understand the digits (which I've found as I've gotten older that it gets more and more difficult to do if the font is not clear). Then, it's more likely that the 3 sticks in your mind, making you think it's closer to 3 o'clock. With an analog clock, all you need to do is look at the positions of the hands and it's obvious that it's much closer to 4 o'clock, and you know how close to 4 o'clock it is immediately - and this is independent of the language and font the clock is labeled with.

I feel that understanding fractions works pretty much the same way. There's a certain amount of information that they convey that is harder to understand if it's in decimal form.

**alumfelga**

> Using the example that they gave, 6.85, I wonder if either of them can picture how much 0.85 or 85% actually is? It's a lot easier to picture 17 out of 20, or halfway between 8 out of 10 and 9 out of 10.

Yes, exactly. If you don't know that 0.85 is 85/100, what do you think it is? How do you imagine it? It's sad if it's just "almost seven"...

Excellent remark about the clocks. Digital clocks are perfect if you need to know exactly what the time is. The online sign up starts at 11:00, not 11:03 etc. I agree that analog clocks are more intuitive and tell you the time in a different way. When I'm running late, "8:58" is not as bad as seeing the minute hand being so close to "12". Or there could be no numbers on the clock at all and I'd still know what the time was.

I wonder if someone did a study about the difference between understanding analog and digital clocks' displays.

**shivver13**

And this is probably why many people feel that it is useless. They might have once learned the topic but they never truly understood it, and therefore they never saw the use of it and can't understand why anyone else would need or want it.

**alumfelga**

> They might have once learned the topic but they never truly understood it, and therefore they never saw the use of it and can't understand why anyone else would need or want it.

True. If it happens to us, it's good to ask other people and broaden our minds before we start expressing opinions.

**aletheiafelinea**

> Our intuition of fractions is common fractions. A fraction is a part of a whole.

Ah, so it's not just me. ^^ As much as I can't really say I'm fond of math as a whole, I've always been more fond of common fractions than decimal. :) About the same way as I'm more fond of geometry than algebra, if it makes any sense... Algebra looks suspicious and squiggly. Ew. And why it's so obsessed with x's all the time...

> They can give more information than decimals: *4/7 of our cars are Toyota* tells you the company has seven cars, while *0.7 (or most likely: 70%) of our employees are men* doesn’t tell you how many employees are there or even if there's ten of them or a hundred.

Ha! :)

That thing with 0.(9) = 1 feels like "Division by Zero", though...

**alumfelga**

> About the same way as I'm more fond of geometry than algebra, if it makes any sense...

It makes perfect sense, algebraists and geometers are like Pepsi fans and Coca-Cola fans, or Windows users and Linux users :) There's a big debate about how Mathematics should be taught at schools and which approach should dominate. For decades geometry dominated, but for at least twenty years algebra has been in favour. Maybe that's why I prefer it. To me, geometry allows inaccuracy, we can be fooled by the drawings, while algebra gives us solid equations and variables. It goes with my intuition, too – geometric interpretations of some objects are still a mystery to me :)

> And why it's so obsessed with x's all the time...

Change the variable and some students are already confused ;) It's interesting how students focus on numbers and don't see what really matters – the fact that (and how) the result can be obtained. Most students are happy to solve a hundredth quadratic equation because they know what to do. They want to find the x. A mathematician says "I know the answer exists and how to get it", and does something else :)

> Ha! :)

The presentation of data is a fascinating topic in itself. I read that people react differently to the news "80% of houses in our country are safe" than to "20% of houses in our country aren't safe". Psychologists would know more about it...

> That thing with 0.(9) = 1 feels like "Division by Zero", though...

That's interesting, why? Do you (psychologically) disagree with the fact that 0.9+0.09+0.009+0.0009+...=1?

**aletheiafelinea**

> Most students are happy to solve a hundredth quadratic equation because they know what to do. They want to find the x.

Objection! I never cared about the x. It was the teachers who wanted me to find it. And keep finding... If I never see another quadratic equation in my life it'll still be too soon.

> That's interesting, why? Do you (psychologically) disagree with the fact that 0.9+0.09+0.009+0.0009+...=1?

Not quite the same, because 0.(9) = 1 is not really something you meet and can verify in everyday life, unlike 1+1=3, but it's in the same area of intuitive nonsense. "But math proves it! / Okay, if you say so..." Not quite unlike Zeno's paradox with a turtle, but in numeric form.

**alumfelga**

> "But math proves it! / Okay, if you say so..."

But math does prove it ;) Okay, I'll shut up ;) And yes, it's exactly Zeno's paradox – infinite series are its solution.

**aletheiafelinea**

> But math does prove it ;) Okay, I'll shut up ;)

No, it's not like I deny it; it's just that it *feels* wrong. :) It's like this thing with optical illusions.

**alumfelga**

What about "three in two people don't understand Maths"? ;)

**dieastra**

Edit: Wait, another one. "If there's three people on the bus and five get off, then two have to get on so the bus is empty again." LOL

Edited at 2017-01-14 05:51 pm (UTC)