
Numeracy: Numbers, Numerals & Counting

mejane

Apologies for the cryptic title - couldn't think of anything better.


I've been taught that we use base 10 as the standard counting system because that is the number of fingers humans have on both hands. But is that really true?

I've been thinking about this (yes, I have a headache now!) and it seems to me that base 5 would be the more logical choice - count on one hand, use an abacus/tool/club/whatever in the other :confused:

Another point in favour of this theory is the experimental evidence that most people can only take in up to 5 objects at a glance before resorting to counting or memory tricks.

eg:

( & 6

(3 characters, easy!)


:blah: ;) :confused:

(still easy)


* & % % $ & :cross eye:



Answers on a postcard, please...

Jane.
 
mejane said:
eg:

( & 6

(3 characters, easy!)


:blah: ;) :confused:

(still easy)


* & % % $ & :cross eye:



Answers on a postcard, please...

Jane.



WHAT? and indeed, EH?
Did you have your shift key stuck when you were doing numbers? Even then your post wouldn't make any sense. I am slightly dysnumeric though.
What the heck are you on about?

pinkle
 
Don't really know what you typed, but I have heard that the brain can instantly count up to 4 (like when you see some number of objects - if it's 4 or less, you don't need to count). After that, you have to count manually, unless the objects are arranged in special ways (like rows - you could easily count 6 if it's in two rows of three, but then you're really just counting three twice very fast). But I think I read it in Watership Down, so I doubt that's a valid scientific source. I've told people this and they think I'm crazy, and I haven't seen it mentioned elsewhere until now...
 
Schoolhouse Rock did an excellent episode on this.

They did a "what if" sort of story: what if humans had 12 fingers? Then they proceeded to go into a song and dance about base-12 math.

But seriously, if you think base-10 math isn't based on early man's ten digits, you're only fooling yourself. For God's sake, think about it.
 
Georges Ifrah in his History of Numbers gives a pretty good explanation of the reason that base 10 works better than base 5 for a number system. It basically comes down to balance between ease of counting and ease of writing. Base 5 "inflates" too quickly to really be useful as a written number system. Bases larger than 10 require too many symbols.

He also covers the many number systems that people have used over the last six thousand years, or however long it's been (since writing started, not since the creation). Base 10 and base 5 are both common, as is base 20 (fingers and toes), but also base 60 (which is preserved in the divisions of time). I'm not entirely clear on why base 60 was useful (it's a while since I read that part of the book), but it clearly has more problems than base 10 or even 20.

(Even more interesting is the manner in which some cultures count, not just using the fingers or toes, but counting joints, including the wrist, elbow, shoulder, as well as the eyes, ears, nose, etc.)

It's also true that the average person can immediately identify groups of up to 4 objects, which is why most graphical number systems (eg Sumerian, tallying, etc) tend to group things into bundles of five.

I have to admit that I'm running mostly from memory here. If I get a chance, I'll look up some of this in the book, and post better references. In the meantime, if you want to find out more about this sort of thing, you could try reading Ifrah's book. Don't let the size of it put you off, it's reasonably well written (depending on the translation you get, he is French), and much of the technical stuff and tables can be skipped. (There are one or two annoying errors in Volume 3 about computers, and he does have some odd biases, but most of the research is very good.) All that said, it is still rather long, so don't force yourself.
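To make that trade-off a bit more concrete, here's a rough sketch in Python (mine, not Ifrah's): smaller bases get by with fewer distinct symbols but need longer strings of digits to write the same number, while bigger bases keep numbers short at the cost of more symbols to memorise.

Code:
# A rough sketch of the trade-off, using nothing more than schoolbook base conversion.
def to_base(n, base):
    """Return the digits of n in the given base, most significant first."""
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append(r)
    return list(reversed(digits)) or [0]

for base in (2, 5, 10, 20, 60):
    digits = to_base(1999, base)
    print(f"base {base:2}: {base:2} symbols to learn, {len(digits):2} digits -> {digits}")

# 1999 comes out as [3, 0, 4, 4, 4] in base 5 (5 digits, only 5 symbols),
# but as [33, 19] in base 60 (just 2 digits, at the cost of 60 symbols).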
 
Another one from memory. I used to play chess quite a lot and I recall being told, or reading, that most people cannot visualise a whole chess board; at most they can only recall a 4x4 grid.
 
Anome said:
It basically comes down to balance between ease of counting and ease of writing. Base 5 "inflates" too quickly to really be useful as a written number system. Bases larger than 10 require too many symbols.
So that'll be the Goldilocks theory of number systems, then! :)

As for base 60, there was a fair bit about this in the OU History of Maths course I did a few years ago. One advantage is that it has a lot of factors, which is useful for doing fractions.

And the connection with time could be that the year is (approximately) 360 days, so that the sun appears to move one degree against the background stars every day.
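For anyone who wants the factors point spelled out, here's a quick sketch (my own, not from the OU course): 60 divides evenly far more ways than 10 does, which is why halves, thirds, quarters, fifths and sixths of an hour all come out as whole minutes.

Code:
def divisors(n):
    """List every whole number that divides n exactly."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(10))   # [1, 2, 5, 10]
print(divisors(12))   # [1, 2, 3, 4, 6, 12]
print(divisors(60))   # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]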
 
Re: Re: 10, 5

Pinklefish said:
WHAT? and indeed, EH?
Did you have your shift key stuck when you were doing numbers? Even then your post wouldn't make any sense. I am slightly dysnumeric though.
What the heck are you on about?

pinkle

Don't worry, Pinkle - I often have no idea what I'm whittering on about either!

As Piscez said, it's the Watership Down theory (I first read it there too!) that people can only immediately recognise up to 4 or 5 objects. It's an easy experiment to try at home - and what else were you going to do on a dull Sunday afternoon? - and does seem to be true.

anome - thanks for the book reference, I'll check it out. It can't possibly be as hard-going as Stephen Wolfram's recent tome.

Jane.
 
I would guess that 10 is better what with it being even, whereas 5 is odd and is therefore a more "unfriendly" number to work with (that's just the way I think of it - I too am slightly dysnumeric so I need all the help I can get!). It must be related to the digits on your hands though.

I read somewhere that the ancient Maya culture had a system based on the numbers 18 and 20, and they also used a similar counting technique to the old "|||| and then one going across for 5" way (sorry, I don't know what it's called :rolleyes: ) but they used dots instead. And of course their calendar was supposedly highly accurate.

And all that without using a calculator! :eek!!!!:
 
Re: Re: Re: 10, 5

mejane said:
anome - thanks for the book reference, I'll check it out. It can't possibly be as hard-going as Stephen Wolfram's recent tome.
You're most welcome. I haven't read Wolfram's latest (all the reviews kind of put me off the concept), but Ifrah doesn't spend any time trying to convince you that he thought of it first.

One of the interesting (in a nice way) things about it is the lengths he goes to with regard to apportioning credit to the Indian scholars who developed our modern number system. It came to Europe through Arabia (as did many other useful things, including the Greek and Roman classics that were lost in Europe during the dark ages). The Arabian scholars never took credit for it, and always credited it to the Indians, but for some reason this was ignored by the European scholars: hence the term Arabic numbers (later Hindu-Arabic, as some consolation).

More detail (I didn't quite get the title right):
Ifrah, Georges, The Universal History of Numbers - Vol I The World's First Number Systems, Vol II The Modern Number System, Vol III History of Computers (last title iffy, I don't know where I left it).
Vols I & II cover (pretty comprehensively) the development of number systems up to the modern system.
 
MadCat said:
I read somewhere that the ancient Maya culture had a system based on the numbers 18 and 20,

I thought it was 13 and 20?
 
I always wondered how..

5 & 10 the components were in the 10 to the 2nd of the triad to four digits in the whole sum of our number system that is used in some of the schools
 
Psychologists and decorators also say that groups of 3 are the easiest quantity for the brain to process. 3 objects or points are easier for the eyes to distinguish, recognize, compute, etc. Try it with anything... coins, for instance. Put two down and look at them... then add a third and notice how your field of perception changes.
 
Tezcatlipoca said:
I thought it was 13 and 20?
No, the second order is 18. All others, from memory, are 20. The use of 18 in the second order means the orders progress 20, 360, 7200, 144000, and so on. This gives the Mayan calendar 360 days (+5 festival days? I don't have the text to hand.)
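If it helps, here's a little sketch of that mixed-radix idea (my own reading of the above, so treat it as illustrative): only the second step is 18, so the place values run 1, 20, 360, 7200, 144000.

Code:
def mayan_digits(n):
    """Split n into digits for place values 1, 20, 360, 7200, 144000 (lowest first)."""
    radices = [20, 18, 20, 20]   # units per next-higher order: only the second step is 18
    digits = []
    for r in radices:
        n, d = divmod(n, r)
        digits.append(d)
    digits.append(n)
    return digits

print(mayan_digits(365))     # [5, 0, 1, 0, 0] -> 5*1 + 0*20 + 1*360 = 365
print(mayan_digits(144000))  # [0, 0, 0, 0, 1] -> one unit of the fifth order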
 
Thanks for the replies ...

It does seem that decimal is a fairly modern invention which happens to coincide with the number of digits on our hands, rather than the other way around.

In the light of all your posts, I'm obviously wrong about 5 being a good choice. Anyone care to join me in the Campaign To Make Octal Universal? ;)

Jane.
 
Personally I prefer hexadecimal

For example, mejane, a 22yo (Decimal) is 16 and a 33yo is 21 &c

Me? I'm 32
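For anyone following along, those hexadecimal ages check out like this (just a quick Python sketch):

Code:
for age in (22, 33, 50):
    print(age, "decimal ->", format(age, "x"), "hex")
# 22 decimal -> 16 hex
# 33 decimal -> 21 hex
# 50 decimal -> 32 hex   (so "I'm 32" works out to 50 in decimal)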
 
I can count to 35 on my fingers alone...

0, 1, 2, 3, 4, 5, [0-5]
10, 11, 12, 13, 14, 15, [6-11]
20, 21, 22, 23, 24, 25, [12-17]
30, 31, 32, 33, 34, 35, [18-23]
40, 41, 42, 43, 44, 45, [24-29]
50, 51, 52, 53, 54, 55. [30-35]
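In other words, one hand is the sixes digit and the other is the units digit, each running 0-5. A quick sketch to confirm it covers every count from 0 to 35:

Code:
# Left hand = sixes digit, right hand = units digit, each showing 0-5 fingers.
values = [sixes * 6 + units for sixes in range(6) for units in range(6)]
print(len(values), min(values), max(values))   # 36 0 35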
 
If you use your fingers in binary you can count to 31 on one hand, 1023 on two.
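Same trick, one finger per bit - a small sketch of the binary version, assuming finger 0 is the thumb and so on:

Code:
def fingers_up(n, fingers=10):
    """Which fingers to raise to show n in binary (finger i stands for 2**i)."""
    return [i for i in range(fingers) if n & (1 << i)]

print(2**5 - 1, 2**10 - 1)    # 31 1023 - the one-hand and two-hand maximums
print(fingers_up(31, 5))      # [0, 1, 2, 3, 4] - all five fingers on one hand
print(fingers_up(1023))       # all ten fingers raised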
 
What are the most significant numbers in human history?

From what I've gathered so far... 5 because of the number of digits on one hand... 10 if you put them together.

Then there is 3, which could be seen as "man, woman, god/deity/entity".
 
anome said:
No, the second order is 18. All others, from memory, are 20. The use of 18 in the second order means the orders progress 20, 360, 7200, 144000, and so on. This gives the Mayan calendar 360 days (+5 festival days? I don't have the text to hand.)

Correct. The solar calendar (Haab) has 365 days (18 months of 20 days, plus 1 month of 5 days); also used is the 'ritual' calendar (Tzolkin) of 260 days. These intermesh to give the 'Calendar Round' of 52 years.

http://www.jaguar-sun.com/calendr.html
http://www.michielb.nl/maya/astro.html

Fun with the Maya calendar!


http://www.halfmoon.org/
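Incidentally, the 52-year Calendar Round falls straight out of the arithmetic: it's just the least common multiple of the two cycle lengths. A quick sketch:

Code:
from math import gcd

haab, tzolkin = 365, 260
calendar_round = haab * tzolkin // gcd(haab, tzolkin)   # least common multiple
print(calendar_round)              # 18980 days
print(calendar_round // haab)      # 52 Haab years (and 73 Tzolkin rounds)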
 
Published online: 19 August 2004 | doi:10.1038/news040816-10

Tribe without names for numbers cannot count

Helen Pearson

Amazon study fuels debate on whether the concept of numbers is innate.


A study of an Amazonian tribe is stoking fierce debate about whether people can count without numbers.

Psychologists, anthropologists and linguists have long wondered whether animals, young children or certain cultures can conceptualize numbers without the language to describe them.

To tackle the issue, behavioural researcher Peter Gordon of Columbia University in New York journeyed into the Amazon. He carried out studies with the Pirahã tribe, a hunter-gatherer group of about 200 people, whose counting system consists of words which mean, approximately, 'one', 'two' and 'many'.

Gordon designed a series of tasks to examine whether tribe members could precisely count and conceive of numbers beyond one or two, even if they lacked the words. For example, he asked them to look at a group of batteries and line up a matching amount.

The tribe members struggled to perform these tasks accurately once the numbers were greater than three, Gordon reports in Science [1], and their performance got worse the higher the numbers climbed. "They couldn't keep track at all," he says.

Opposing views

Other researchers in the field have welcomed the study. But they disagree about what it means. Psychologist Charles Gallistel, at Rutgers University in Piscataway, New Jersey, says that the Pirahã simply may not recognize when one quantity of items exactly equals another, so they have trouble with matching tasks. He argues that people do possess an innate, non-verbal ability to conceive of all numbers, and that language simply helps them to refine it.

Psychologist Susan Carey of Harvard University in Massachusetts argues the opposite: she says we lack an innate ability to count beyond very small numbers, and that the Pirahã difficulty with numbers proves it. "It's a spectacular finding," she says.

Carey and other researchers believe that children and some animals are born with two basic types of 'counting' but that these are limited. First, they can recognize one, two or three objects by recording an image in their memory. Second, they can make estimates of larger numbers, such as 'about twenty'. Carey believes that the Pirahã rely on these innate systems.

Whorf hypothesis

On a broader level, the study also addresses a long-standing and controversial hypothesis developed by Benjamin Lee Whorf in the late 1930s: that language can determine the way we think or what we are able to think.

But Gordon's study is one of the best examples in which language allows people to think something completely new, says cognitive psychologist Lisa Feigenson of Johns Hopkins University in Baltimore, Maryland. "This is by far the strongest piece of evidence," she says. In this case, a lack of language seems to prevent the Pirahã from thinking about larger numbers, she says.

However, the latest study will not resolve the debate about whether language can shape thought in other examples, points out Feigenson. "I think the jury is still out," she says.


----------------------------
References
[1] Gordon P. et al., Sciencexpress (2004).

http://www.nature.com/news/2004/040816/full/040816-10.html

I had a look around and found this:

Scholars generally agree that our ability to count, and our vocabulary of counting, arose to meet practical needs and developed over many thousand years.

Numbers were originally expressed by reference to parts of the human body - nose, eyes, ears, arms, hands, feet, and particularly fingers and toes - in a specific order, making the names of these organs and members double as names of numbers.

These number names were eventually simplified in their spoken forms - long before the advent of writing - as the concept of abstract numeration came into being. Studies of "primitive peoples", who live in isolation from civilization, suggest several stages of development toward strict number names.
Hunting and gathering peoples - such as the aborigines of Australia, Tasmania, and Papua New Guinea - generally have had, or still have, specific names only for the numbers one and two and, sometimes, three, yet they can count to as many as six by combining numbers, for instance like this:

Code:
1    one            one
2    two            two
3    two-one        three
4    two-two        one-three
5    two-two-one    two-three
6    two-two-two    three-three

In these languages, speakers refer to everything beyond six as many, much, or plenty - more plenty or less plenty, as the case might be.

In:

Mathematics: From the Birth of Numbers
Jan Gullberg (1997)

http://www.amazon.co.uk/exec/obidos/ASIN/039304002X/
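The "two-counting" column in that table follows a simple enough rule that you can sketch it in a few lines (my own toy version, just to show the pattern):

Code:
def two_count(n):
    """Name n using only 'two' and 'one', in the style of the first column above."""
    twos, rest = divmod(n, 2)
    return "-".join(["two"] * twos + (["one"] if rest else []))

for n in range(1, 7):
    print(n, two_count(n))
# 1 one, 2 two, 3 two-one, 4 two-two, 5 two-two-one, 6 two-two-two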
 
Maybe having numbers in a system allows us to keep track as we count higher, and the inability to count large numbers is just a memory issue. If there are no labels, or system to keep track as we go along, we lose our place.
:confused:
 
On a tangently related note:

forteantimes.com/forum/showthread.php?s=&threadid=6915
Link is obsolete. The current link is:
https://forums.forteana.org/index.php?threads/why-do-we-use-a-base-10-numerical-system.6915/


Regarding the relationship between language and counting, I think it's more likely that primitive (for want of a better word) people have no concept of large numbers, and no words for them, simply because they don't need them. When your daily life involves finding enough food to feed your family, you have neither the need nor the time for abstract thought - you only have to know that you have 1, 2, 3 (kills, handfuls of wild corn, etc)... enough to feed the family.

It's only when a society becomes more sophisticated (again for want of a better word), especially when it moves to a more settled way of life based on agriculture and trade, that it becomes necessary to use larger numbers, for which words will inevitably be coined. Crucially, at this point, some members of the tribe will also have the time to contemplate such matters, as they will no longer face the daily grind of finding food.

It's nothing to do with innate intelligence or memory or brain chemistry - it's simply necessity combined with the luxury of free time.

Having said all that, the "Watership Down" theory (see above link) suggests that even sophisticated 21st century people like you and me only really have an innate grasp of small numbers, even though we happily talk of hundreds, even trillions.

Jane
 
Some beans

It also reminded me of this:

Edmund: Right, Baldrick, let's try again, shall we? This is called adding. If I have two beans, & then I add two more beans, what do I have?

Baldrick: Some beans.

Edmund: Yes...and no. Let's try again, shall we? I have two beans, then I add two more beans. What does that make?

Baldrick: A very small casserole.

Edmund: Baldrick, the ape creatures of the Indus have mastered this. Now try again. One, two, three, four. So how many are there?

Baldrick: Three.

Edmund: What?

Baldrick: And that one.

Edmund: Three...and that one. So if I add that one to the three, what will I have?

Baldrick: Some beans.

Edmund: Yes. To you, Baldrick, the Renaissance was something that just happened to other people, wasn't it?

http://www.blackadderhall.com/series/two_quotes.shtml
 
I've been able to deduce from careful observation of my Spaniels that they can count like this: one, two, fuckin' billions!

I found this out when the female had a litter of puppies and it came to the point when we started to give them away. It was alright till we gave the third one away at which point both the mother and father ran around frantically looking for puppies. "What happened then? One minute we had fuckin' billions of 'em and now there are only two!"
 
It's not just Amazonian tribes. I can remember, in my youth, trying to count last night's drinks: "One... two... many!"
 
What happens when you can't count past four?

Brian Butterworth asks whether you can do maths without words for big numbers

Thursday October 21, 2004
The Guardian

"Some Americans I have spoken with (who were otherwise of quick and rational parts enough) could not, as we do, by any means count to 1,000; nor had any distinct idea of that number," wrote the English philosopher John Locke in 1690.

He was referring to the Tououpinambos, a tribe from the Brazilian jungle, whose language lacked names for numbers above five. Locke's point was that number names "conduce to well-reckoning" by enabling us to keep in mind distinct numbers, and can be helpful in learning to count and to calculate, but they are not necessary for the possession of numerical ideas.

Two recent studies of Amazonian Indians, reported in the journal Science, take a crucially different view. These studies, far from maintaining that number words are merely convenient, propose that they are actually necessary.

The theory that language shapes thought is sometimes called the Whorf hypothesis, after the anthropologist Benjamin Lee Whorf. Berinmo, a stone-age tribe in New Guinea, does not put a linguistic boundary between blue and green but does have a boundary between "nol" and "wor" within what we would call green. Research by Jules Davidoff, Ian Davies and Debi Roberson showed that the Berinmo categorise blue and green together but nol and wor separately, whereas we do the opposite - see blue and green as separate colours, but nol and wor as variants of the same colour.

So if categorising objects by colour can be shaped by colour vocabulary, why shouldn't categorising the number of objects? The ideas advanced in the two studies of Amazonian Indian tribes support a strong Whorfian view that number vocabulary is necessary for categorising the world numerically. The idea being tested in the Amazon is that humans, and many other species, are born with two "core" systems of number that do not depend on language at all. The first is a small-number system related to the fact that we can recognise the exact number of objects up to three or four without counting. We use a second system to deal with numbers larger than four, but it only works with approximations. To get the idea of larger numbers - of exactly five, exactly six, and so on - you need to be able to count, and to count, you need the counting words.

The Pirahã, a tribe of 150 people who live by the banks of a remote tributary of the Amazon, studied by Columbia linguist Peter Gordon, have words for one and two, and for few and many. That's all. Even the words for one and two are not used consistently. So the question is, do they have the idea of exact numbers above three?

Not having much of a number vocabulary, and no numeral symbols such as 1, 2, 3, their arithmetical skills could not be tested in the way we would test even five-year-olds in Britain. Instead, Gordon used a matching task. He would lay out up to eight objects in front of him on a table, and the Pirahã participant's task was to place the same number of objects in order on the table. Even when the objects were placed in a line, accuracy dropped off dramatically after three objects.

The Mundurukú, another remote tribe, studied by a French team led by Pierre Pica and Stanislas Dehaene, only have words for numbers up to five. Pica and colleagues showed that the Munduruku could compare large sets of dots and add them together approximately. However, when it came to exact subtraction, they were much worse.

Mundurukú participants saw on a computer screen dots dropping into a bucket, with some dots falling through the bottom. They had to calculate exactly how many were left. The answer was always zero, one or two, and they had to select the correct answer. They were quite good, but not perfect, when the initial numbers of dots going in and falling out were five or fewer, the limit of their vocabulary, but many of them were doing little better than guessing when the numbers were more than five, even though the answers were always zero, one or two. Pica and colleagues concluded that "language plays a special role in the emergence of exact arithmetic during child development".

Tribal societies in the Amazon differ in many ways from a numerate society like ours. The Pirahã are essentially hunter-gatherers who rarely trade, and the Munduruku also have little need for counting in their everyday lives. It is therefore very difficult to tell whether it is only the difference in the number vocabularies that holds the key to their unusual performance on exact number tasks. It could be lack of practice at using the ideas of number themselves, in counting or calculating. Pica and colleagues seem to recognise this, since even in the range of their vocabulary, the Munduruku are approximate - "ebadipdip" is typically used for four, but also used for three, five and six. The words alone are not enough, they conclude. The number names need to be used to do counting, and some conception of what it is to count must co-exist with the vocabulary.

So maybe Locke was right. Counting can exist without number names, but is greatly helped by them.

-------------------
· Brian Butterworth is the author of The Mathematical Brain, and is at the Institute of Cognitive Neuroscience at UCL

http://www.guardian.co.uk/life/feature/story/0,13026,1331672,00.html

The paper looking at these studies is:

Rochel Gelman and C. R. Gallistel (2004). Language and the Origin of Numerical Concepts. Science (Cognition and Behavior Special Issue), 306(5695), 441-443.

Abstract:

Reports of research with the Pirahã and Mundurukú Amazonian Indians of Brazil lend themselves to discussions of the role of language in the origin of numerical concepts. The research findings indicate that, whether or not humans have an extensive counting list, they share with nonverbal animals a language-independent representation of number, with limited, scale-invariant precision. What causal role, then, does knowledge of the language of counting serve? We consider the strong Whorfian proposal, that of linguistic determinism; the weak Whorfian hypothesis, that language influences how we think; and that the "language of thought" maps to spoken language or symbol systems.
 
I read in the Metro today that "a billion is now widely accepted as being 1,000 million". When did that become "widely accepted"? When informed of billions of this and a billion of that, have I been thinking of the wrong number all this time? :confused:
 
min_bannister said:
I read in the Metro today that "a billion is now widely accepted as being 1,000 million". When did that become "widely accepted"? When informed of billions of this and a billion of that, have I been thinking of the wrong number all this time?
It happened some time back, I'm afraid. The good old British billion has ceased to be. I remember a billion being a million million when I was at school, but by university, I'm pretty sure that one thousand million was the norm.

Feel free to grumble (I know I did for a while) but actually the current system has its advantages. For one thing, it means that if an American scientist and a British boffin are talking about a trillion, they're talking about the same thing now, and for another, it means that numbers like 258,861,456,880 are much easier to understand when read out loud.
"Two hundred and fifty-eight billion, eight hundred and sixty-one million, four hundred and fifty-six thousand, eight hundred and eighty" is somewhat easier to take in than "two hundred and fifty-eight thousand, eight hundred and sixty-one million, four hundred and fifty-six thousand, eight hundred and eighty". Probably!
 