For the longest time I've been bothered by numbers. I mean, I get 'em, I use 'em, I know how to crunch 'em the way teachers, professors, bosses, and conductors have told me to, but I always had questions. Quantifying matter is easy, PEMDAS is easy, algebra is easy, trig is pretty easy, but there are a few things about numbers that bother me.
Here's a light example before I really embarrass myself:
I read a study about a guy who, after crunching enough data, surmised that the most efficient way to eliminate unwanted pocket change was to create an 18-cent piece. By analyzing a pile of different "change for a dollar" scenarios, he worked out a theory that an 18-cent coin would eliminate a sizeable chunk of loose change. One way to consider the benefits of that study is to appreciate the lack of excess baggage that results from the new coinage. Another is that, by reducing necessary pocket change, you reduce the number of coins needed in circulation, which saves money on raw materials and conserves resources. The one thing that wasn't brought up, and that I find exceptionally important, is that it would force people to think outside our deca-box. Ask a person to count their multiples of five. Ten? Twenty-five? All easy. Having to perform daily math that demands your knowledge of the multiples of 18, as well as being able to easily add 18, would expand people's required brain functions, I think. I mean, at some point it could become as simple as counting by fives is now, but right now multiples of 18 aren't simple, and I think a daily requirement of mental problem solving would be good for people.
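Just to make the flavor of that analysis concrete, here's a rough Python sketch of the idea (my own toy version, not the study's actual method, and the coin sets are my guess at what "adding an 18-cent piece" would mean). It finds the fewest coins needed for every change amount from 0 to 99 cents, then averages, with and without the 18-cent coin:

    def min_coins(amount, denominations):
        # Classic dynamic-programming coin count: fewest coins summing to amount.
        best = [0] + [float("inf")] * amount
        for cents in range(1, amount + 1):
            for coin in denominations:
                if coin <= cents:
                    best[cents] = min(best[cents], best[cents - coin] + 1)
        return best[amount]

    def average_coins(denominations):
        # Average coins handed back across the 100 possible change amounts (0-99 cents).
        return sum(min_coins(a, denominations) for a in range(100)) / 100

    print("1, 5, 10, 25:    ", average_coins([1, 5, 10, 25]))
    print("1, 5, 10, 18, 25:", average_coins([1, 5, 10, 18, 25]))

Run that and the 18-cent set comes out ahead on average, which is the whole pitch. The brute-force dynamic programming keeps it honest, too: the greedy "biggest coin first" habit stops being optimal once 18 enters the mix (for 36 cents, greedy grabs 25 + 10 + 1, three coins, when 18 + 18, two coins, is better).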
Now for the crux of my rant:
I have always been bothered beyond belief that we will use one-hundred thousand but we won't use one-thousand million. This may seem like a silly thing to obsess over, but bear with me. First of all, yes, I get it: if you display a number like, say, 100000000000, it's going to have the same value no matter what you call it. You can put commas wherever you'd like, you can call it "the number Reginald" if you'd like, but it's still 100000000000 of something. Most people call it 100,000,000,000, or "one-hundred billion." Some people call it "one-hundred milliard." I call it something else. I never considered what to call it until today.

I realized that, if numbers follow a specific pattern, which they should considering humans like patterns, they should follow it all the way across the board (well, that part I've thought for some time). I could never find a good correlation because I'd always start at "one-thousand million," but looking back I only needed to understand ten. In Arabic numerals, at least to a point, the pattern is that a number unit runs until it is exhausted, then a new unit begins. The ones exhaust and expand to tens, because you can no longer describe ones in terms of ones once you've exhausted that unit (but you don't necessarily have to stop at nine, either: you can invent a number called "Fred" that has a funky Prince-emblem shape to it and acts as the ten of the ones unit).

This starts to make sense when you hit the end of the tens (title of my next album). At ninety-nine you're saying that you have nine-tens and nine-ones. The next number isn't described as having ten-tens and zero ones; it is instead called one hundred. Now, in theory, you could have ten-tens, but that would be tough to incrementally organize, because you'd have to expand one organizer, then the other, and back and forth until it's time to call it the ten-ten-tens. Messy. Oh yeah, and the rule in dictionary definitions is "you cannot use a word to define itself," which is what you'd be doing. This is why you can't have ten-tens and must move on, but our system jumps the gun.

Back to the topic. So hundreds turn into thousands, thousands into ten-thousands, ten-thousands into one-hundred thousands, and we don't use one-thousand thousands, so we invent the million. From there, ideally, we would have one-thousand millions, one-million billions, one-billion trillions, etc. That was my original assertion. Until tonight. Where my thought process always hit a phone pole was that I'd never stopped to notice that one-thousand should really be ten-hundred. Which people do use, but only as a short-hand way of referencing large numbers.
"Hey Gerald, how much did you win at Keno?"
"Hey Buckwheat, I won one-thousand, two-hundred dollars."
"Gerald, 'round these parts we say 'twelve-hundred.' You're liable to get shot talkin' like that."
See my point? In reality, if we were adhering to the pattern, we would run all the way up to ninety-nine hundred ninety-nine before switching over to a new unit; the first "one-thousand" would land on what we currently call ten-thousand. But then we'd be losing the one-thousand dollars we just won at Keno. Ah, notice, though, we'd be gaining the ten-thousand we used to know. No, really, I swear the secret to sub-prime mortgage rates is in here somewhere.
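For flavor, here's a little Python toy of that hundreds-first scheme (my own sketch, nothing official): it names everything below ten-thousand using only ones, tens, and hundreds, with no "thousand" allowed.

    ONES = ["zero", "one", "two", "three", "four", "five", "six", "seven",
            "eight", "nine", "ten", "eleven", "twelve", "thirteen", "fourteen",
            "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"]
    TENS = ["", "", "twenty", "thirty", "forty", "fifty", "sixty", "seventy",
            "eighty", "ninety"]

    def name_under_100(n):
        # Ordinary English for 0-99.
        if n < 20:
            return ONES[n]
        tens, ones = divmod(n, 10)
        return TENS[tens] + ("-" + ONES[ones] if ones else "")

    def hundreds_first(n):
        # Name 0-9,999 using only ones, tens, and hundreds -- no new unit word.
        hundreds, rest = divmod(n, 100)
        if hundreds == 0:
            return name_under_100(rest)
        named = name_under_100(hundreds) + " hundred"
        return named + (" " + name_under_100(rest) if rest else "")

    for n in (1000, 1200, 9999):
        print(n, "->", hundreds_first(n))
    # 1000 -> ten hundred
    # 1200 -> twelve hundred
    # 9999 -> ninety-nine hundred ninety-nine

Gerald's winnings come out as "twelve hundred" automatically, and the hundreds don't run dry until ninety-nine hundred ninety-nine, exactly where a new unit word would finally earn its keep.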
"But Ruxton, if it's all just different names for the same quantitative value, why change the familiar system?"
Because the system we know now utilizes a linear number plot with a broken linear concept. Adhering to patterns set forth at the onset allows people to think outside of their box. It also insinuates a pattern, a reason. Einstein once said, loosely quoted, "If you can't explain it to a six-year-old, you don't understand it yourself." Honestly, given our number pattern, explaining it to a six-year-old may give way to another adage: "easier said than done." Well, I probably could teach numbers, but the institutionalized repetition that I was subjected to couldn't hurt the process any. It's not so much an issue of function, it's an issue of concept. Which, if you think about it long enough, is a pretty common duality amongst a lot of the characteristics of our lives.
The real clincher is whether or not we call "one hundred" "one-ten" instead.
(The answer is "no"; the first number in the statement is a multiplier, and the one is implied.)
Friday, March 21, 2008
2 comments:
Hmm... that's an interesting point, and one that I've honestly never even come close to thinking about before. I wonder if there isn't some stupid linguistic reason for it. At a certain point with counting, too, I think that we as humans begin to suffer from cognitive overload. Really, we've only evolved to handle very few numbers at once (I forget the exact number I read in an article a while back, but it was surprisingly small); anything more than we can immediately comprehend and see is only graspable in the abstract, and we need symbols to do it. Maybe using the lower denominators (100) to express multiples of 10 in the higher stratospheres of what is comprehensible is a mechanism for fighting abstractly large numbers with something manageable. People live to be 100 years old, not 1000 years old. I think 100 might be right on the threshold of our lizard brain's understanding of numbers.
Every culture groups numbers differently, and it's always a trip getting used to a new mental system of numeric organization. In Japan, for instance, 10,000 is its own word and organizational concept. So when talking about, say, the population of my town (110,000 people), you'd have to say "11 10,000s." And the Japanese at least use a base-10 system, unlike the Mayans and a lot of other great civilizations in the planet's history. On one level, all of this is entirely arbitrary.
I think this post demonstrates why you could have gone to MIT. The ability to think about numbers and see patterns in them in an unorthodox way is a very relevant skill at that school. Nothing is gained by those who have no vision beyond what they learn in their textbooks.