Showing posts with label mental arithmetic. Show all posts

14 August 2008

iTip

At Freakonomics: Is Tipping Really So Hard? Apparently the iPhone App Store features a rather large number of "tip calculators".

Is it really so hard to multiply (approximately) by .15? (Or some other number; I don't wish to debate the "correct" tipping percentage here.) I've always figured that the hard part of tipping is not the multiplication, but the knowing who to tip. For example, do you tip food delivery people? I do on those rare occasions that I get food delivered; they bring the food to your house, which you could argue is more work than bringing it to your table in the restaurant! Plus a lot of them pay for their own gas. But in reality, I don't get food delivered, because I know I should tip, but I don't want to. Do you see how this gets complicated?

However, I am better than average at math and worse than average at "social skills", so I hesitate to generalize from my experience.

But as some people have pointed out there, this gives you an excuse to pull out your iPhone. And it's also the case that it's basically the simplest application one could write that actually does something useful.

A bill-splitting calculator I can see being useful, in cases where the amount of food people ordered varies enough that people aren't willing to split it equally; although I personally would never be in a party that would need it, because I can do the arithmetic. (And yes, my friends know this, so on occasion they'll ask me how much they should pay instead of figuring it out themselves. But I can't eat with everybody!) Even a bill-splitting calculator, though, suffers from the fact that the hard part of splitting the bill is not the math, it's remembering who ordered what.
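
As the post says, this is about the simplest application one could write that does something useful; a minimal sketch in Python (the function names and the 15% default are mine, not any actual app's):

```python
def tip(bill, rate=0.15):
    """Tip as a flat fraction of the bill; 15% here, though the
    'correct' percentage is left undebated, as above."""
    return round(bill * rate, 2)

def split_evenly(bill, people, rate=0.15):
    """Total including tip, divided evenly among the party."""
    return round((bill + tip(bill, rate)) / people, 2)

print(tip(100), split_evenly(100, 4))   # 15.0 28.75
```

The bill-splitting version adds exactly one division, which is rather the point.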

11 July 2008

Good's "singing logarithms"

I've previously mentioned Sanjoy Mahajan's Street Fighting Mathematics. (Yes, that's right, almost the entire sentence is links, deal with it.)

One thing I didn't mention is approximating logarithms using musical intervals, from that course. We all know 2^10 and 10^3 are roughly equal; this is the approximation that leads people to use the metric prefixes kilo-, mega-, giga-, tera- for 2^10, 2^20, 2^30, and 2^40 in computing contexts. Take 120th roots; you get 2^(1/12) ≈ 10^(1/40).

Now, 2^(1/12) is the ratio corresponding to a semitone in twelve-tone equal temperament. So, for example, we know that 2^(7/12) is approximately 3/2, because seven semitones make a perfect fifth. So log10(3/2) ≈ 7/40 = 0.175; the correct value is 0.17609... Some more complicated examples are in Mahajan's handout.
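
The trick can be stated mechanically: find the equal-tempered interval (number of semitones) closest to the ratio x, then divide by 40. A sketch:

```python
import math

def singing_log10(x):
    """Approximate log10(x) by Good's trick: count the semitones in
    the interval closest to the ratio x, then use 2^(1/12) ~= 10^(1/40).
    Computing log2 here of course defeats the purpose; a mental
    calculator just *recognizes* the interval."""
    semitones = round(12 * math.log2(x))
    return semitones / 40

print(singing_log10(3/2))   # 0.175 -- a perfect fifth; true value 0.17609...
print(singing_log10(2))     # 0.3   -- an octave; true value 0.30103...
```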

You might think "yeah, but when do I ever need to know the logarithm of something?" And that may be true; they're no longer particularly useful as an aid for calculation, except when you don't have a computer around. But I often find myself doing approximate calculations while walking, and I can't pull out a calculator or a computer! (To be honest I don't use this trick, but that's only because I have an arsenal of others.)

Is this pointless? For the most part, yes. But amusingly so.

The method is supposedly due to I. J. Good, who is annoyingly difficult to Google.

Oh, and a few facts I find myself using quite often -- (2π)^(1/2) ≈ 2.5, e^3 ≈ 20.

10 March 2008

Death rates

How many people in the world do you think will die in the next day?

See here for a pseudo-hint, which is the somewhat easier related question which inspired this.

See here for the answer, from the U. S. Census Bureau.

05 March 2008

A factoring trick

I came across the polynomial f(x) = 2x^2 + 3x - 5 during a calculation I was doing a few days ago. I wanted to factor it. Sure, I could have done it the usual way. But I have a better intuition for factoring numbers than I do for factoring polynomials. So I plugged in x = 10; then f(10) = 225, which factors into 9 times 25. Perhaps this reflects a factorization f(x) = g(x) h(x), where g(10) = 9, h(10) = 25.

Indeed, it does: 2x^2 + 3x - 5 = (x-1)(2x+5). Of course, this gives a whole family of integer factorizations, plugging in different integers for x.

Of course, this doesn't work in general; consider for example 2x^2 + 2x + 5, which doesn't factor at all. And when the trick is spelled out explicitly it seems to be irredeemably flawed -- how did I know to take (x-1)(2x+5), say, and not (x-1)(3x-5)? (More importantly, can this be explained without reference to the original polynomial?) One could perhaps point out that, say, 184 = (8)(23), which is just f(9) = g(9) h(9), and so on; from a family of such facts it might be possible to deduce the polynomial factorization, but at that point it's just not worth the trouble. These sorts of tricks, like jokes, rarely stand up to explanation.
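
The trick, such as it is, can at least be verified mechanically; a sketch, with the conjectured factors written out as functions:

```python
def f(x): return 2*x**2 + 3*x - 5
def g(x): return x - 1       # conjectured from 225 = 9 * 25; g(10) = 9
def h(x): return 2*x + 5     # h(10) = 25

# The single fact f(10) = 225 = 9 * 25 suggested the factorization,
# and the polynomial identity then yields a whole family of integer ones:
assert f(10) == 225 == g(10) * h(10)
assert all(f(x) == g(x) * h(x) for x in range(-100, 100))
print(f(9), g(9), h(9))   # 184 8 23, i.e. the fact 184 = 8 * 23
```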

07 January 2008

Arthur Benjamin's mental arithmetic

Arthur Benjamin is a research mathematician (I've actually mentioned him before, although I didn't realize that until I looked at his web page and saw that the title of one of his papers looked familiar...) and also a "mathemagician" -- he has a stage show in which he does mental calculations. See this 15-minute video of his show at ted.com.

He starts out by squaring some two-digit numbers... this didn't impress me much, because I could almost keep up with him. (And 37 squared is especially easy for me. One of my favorite coffeehouses in Cambridge was the 1369 Coffee House, and at some point I noticed that that was 37 squared. So I'll always remember that one.) Squaring two-digit numbers is just a feat of memory. Three- and four-digit numbers, though... that's a bit more impressive. And of course I'm harder to impress in this area than the average person.

One trick whose workings might not be obvious: he asks four people to each compute 8649x (the 8649 was 93 squared, from the number-squaring part of the show) for some three-digit integer x of their choosing, and give him six of the seven digits in any order; he then says which digit is left out. How does this work? 8649 is divisible by 9, so the sum of the digits of 8649x must be divisible by 9. So, for example, say he gets handed 2, 2, 2, 7, 9, 3; these add up to 25, so the missing digit must be 2, to make 27. (How could he tell apart a missing zero and a missing nine? I suspect there's a workaround but I don't know what it is; the number 93 was given by someone in the audience, so I don't think it's just memory.)
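
The digit-sum rule is easy to sketch (the function name is mine; the "0 or 9" case is exactly the ambiguity just mentioned):

```python
def missing_digit(digits):
    """Given all but one digit of a multiple of 9, name the omitted
    digit via the digit-sum rule."""
    r = sum(digits) % 9
    return "0 or 9" if r == 0 else 9 - r

# The post's example: handed 2, 2, 2, 7, 9, 3, the missing digit is 2.
print(missing_digit([2, 2, 2, 7, 9, 3]))   # 2
# A real product: 8649 * 253 = 2188197; withhold the first 8:
print(missing_digit([2, 1, 8, 1, 9, 7]))   # 8
```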

He also asks people for the year, month, and day on which they were born and names the day of the week; I found myself trying to play along, but I can't do the Doomsday algorithm quite that fast... and I suspect he uses something similar. (I noticed that he asked three separate questions: first the year, then the month, then the day. This gives some extra time. I know this trick well; when a student asks a question I haven't previously thought about, I repeat it. I suspect I'm not the only one.)
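
For reference, the Doomsday algorithm fits in a few lines; a sketch of Conway's method (using 0 = Sunday), which is presumably not exactly what Benjamin does in his head:

```python
def doomsday(year):
    """Conway's doomsday: the weekday (0 = Sunday) shared by 4/4, 6/6,
    8/8, 10/10, 12/12, and the last day of February, among others."""
    return (2 + year + year // 4 - year // 100 + year // 400) % 7

def weekday(year, month, day):
    """Day of the week (0 = Sunday) for a Gregorian date."""
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    # One known doomsday date in each month, January through December:
    anchors = [4 if leap else 3, 29 if leap else 28,
               14, 4, 9, 6, 11, 8, 5, 10, 7, 12]
    return (doomsday(year) + day - anchors[month - 1]) % 7

print(weekday(2008, 1, 7))   # 1, i.e. Monday -- the date of this post
```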

The impressive part, for me, is not the ability to do mental arithmetic -- I suspect most mathematicians could, if they practiced -- but the ability to keep up an engaging stage show at the same time.

(The video is on ted.com, which shows talks from an annual conference entitled "Technology, Entertainment, and Design"; there look to be quite a few other interesting videos on there as well.)

18 December 2007

Monkeys can do math. Sort of.

Monkeys and college students equal at mental math? (Reuters)

Monkeys and college students were asked to do addition problems mentally: they were shown two sets of dots and then asked to pick out a third set of dots that had as many dots as the first two sets combined. Apparently (according to the Reuters article) the college students were told not to count or verbalize as they did the math; this doesn't seem to make a huge difference, though, since responses averaged about one second, so it really wouldn't have been practical to do so anyway. This seems like an unfair handicap; we're so used to doing math verbally that doing it any other way is difficult. I also wonder if calculator use among humans has contributed to this. What would have happened if the same study were done fifty years ago?

The headline is a bit inaccurate, though; the monkeys responded equally quickly to the problems, but they were less accurate (94% accuracy for humans, 76% for monkeys).

All snarking aside, though, it's interesting that there are parts of mathematics that appear to not depend on linguistic abilities. And it seems a bit surprising that monkeys would be nearly as good at these tasks as college students, because the college students have had quite a bit more mathematical experience! But the task in question had to be done quickly enough that it really couldn't be verbalized, and most of the students' mathematical experience has been mediated through language.

The actual article is available online: Jessica F. Cantlon and Elizabeth M. Brannon, "Basic Math in Monkeys and College Students", PLoS Biology 5(12): e328. (It's nice to be able to actually read the article that the journalists are hastily generalizing about! Often a subscription is required to do so, which is incredibly frustrating.)

09 September 2007

division in Foxtrot and some thoughts on arithmetic

From today's Foxtrot, you can learn what division really is. It also illustrates quite vividly the inefficiency of unary notation.

Division, of course, is basically just taking things and sorting them into piles of the same size, and then counting the piles; the result usually known as the division algorithm, which is not actually an algorithm but rather an existence theorem, makes this clear. I wonder if children learning arithmetic know this or if they think that "division" is just some magical algorithm that takes a bunch of numbers and pushes them around and spits out another number; it's been a very long time since I learned about division, and I've also learned not to trust my own introspection to tell me how "most people" see a given mathematical procedure. If I were "most people" I wouldn't be a mathematician.
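
The piles-and-counting picture of division translates directly into (very inefficient) code; a sketch:

```python
def divide(n, d):
    """Division as sorting n things into piles of size d: the quotient
    counts the full piles, the remainder is what's left over. (The
    existence theorem guarantees n = q*d + r with 0 <= r < d.)"""
    q = 0
    while n >= d:
        n -= d
        q += 1
    return q, n

print(divide(17, 5))   # (3, 2): three full piles of five, two left over
```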

For the most part I don't do arithmetic by the methods one learns in grade school. From just doing a lot of arithmetic (often in the course of experimenting with some combinatorial problem) I've quasi-memorized a lot of the particular arithmetic problems that come up most often. (In my case these seem to usually be products of numbers with small prime factors.) The set of arithmetical facts that I know off the top of my head is idiosyncratic; for example, I wouldn't know 39^2 or 43^2 off the top of my head, but I can instantly say that 37^2 = 1369 because it is in the name of 1369 Coffee House, which I frequented quite a bit during my undergrad years at MIT. But I also wouldn't multiply 37 by 37 by taking a bunch of things and arranging them in a 37 by 37 square and counting them. For one thing, I don't have 1,369 similarly-sized things. Also, I don't have a flat surface large enough for that in my apartment. This is why we invent calculation algorithms that don't directly mirror the definitions at the lowest level.

The grade-school arithmetic algorithms come pretty close to doing that, though, if you accept that you can arrange your objects in piles of 10, 100, 1000, and so on. But these aren't the most efficient methods; for multiplication, the Schönhage-Strassen method for integer multiplication has time-complexity Θ(N log N log log N) for N-bit integers, compared to Θ(N^2) for the grade-school method. This paper of Martin Fürer supposedly has better asymptotic behavior, although it wouldn't surprise me to learn that it's better only for very large numbers. (The Schönhage-Strassen algorithm, in turn, is only the best known method once the numbers get above 10,000 digits or so.)
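
The fast methods just mentioned are too involved to sketch here, but Karatsuba's algorithm -- the simplest one to beat the grade-school Θ(N^2), at roughly Θ(N^1.585) -- shows the flavor of how that bound gets broken:

```python
def karatsuba(x, y):
    """Multiply nonnegative integers in ~N^1.585 digit operations
    instead of the grade-school N^2, by trading one of the four
    half-size products for a few extra additions."""
    if x < 10 or y < 10:
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    p = 10 ** m
    a, b = divmod(x, p)    # x = a*p + b
    c, d = divmod(y, p)    # y = c*p + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    ad_bc = karatsuba(a + b, c + d) - ac - bd   # = ad + bc, one multiply saved
    return ac * p * p + ad_bc * p + bd

print(karatsuba(37, 37))   # 1369
```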

06 August 2007

13th roots and Rubik's cube

In the comments here a few days ago, commenter Michael S. pointed to this article telling us that the record for finding the 13th root of a 200-digit number has been broken by Alexis Lemaire. It is mentioned that the record for the 13th root of a 100-digit number was 23 minutes in 1970, and now Lemaire can do it in under four seconds. This seems a bit suspicious to me, because it seems like it would take more than four seconds just to read the problem and write down the answer, but presumably it's legitimate. I found what looks like a set of rules here.

What I'm curious about is how much of this improvement is an improvement in the existing methods, and how much of it is coming up with new methods. I suspect that some part of the method takes advantage of certain strange properties that only someone who was looking really hard could find. For example, in the case of cube roots it turns out that the cubes of 0, 1, 2, ..., 9 each end with a different digit (0, 1, 8, 7, 4, 5, 6, 3, 2, 9). So if, for example, you have a number that you know is a perfect cube and it ends in 3, it must be the cube of something that ends in 7. There's a similar trick for fifth roots -- n^5 and n always end in the same digit.
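
Both last-digit facts are easy to check exhaustively:

```python
# Cubes of 0..9 end in ten distinct digits, so the last digit of a
# perfect cube determines the last digit of its cube root:
last_cube_digits = [(n ** 3) % 10 for n in range(10)]
print(last_cube_digits)   # [0, 1, 8, 7, 4, 5, 6, 3, 2, 9]
assert len(set(last_cube_digits)) == 10

# Fifth powers (and, as it happens, 13th powers) end in the same
# digit as their root:
assert all((n ** 5) % 10 == n % 10 for n in range(10))
assert all((n ** 13) % 10 == n % 10 for n in range(10))
```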

From what I can gather the methods for the 13th root are a little less ad hoc; at least one person's method basically works by taking the logarithm of the 100-digit number and dividing it by thirteen to get the most significant digits of the 13th root of a 100-digit number; then looking at the last few digits of the 100-digit number gives you the last few digits of its 13th root. One occasionally sees large numbers given in this way, where the first few digits can be found by logarithms, the last few digits by modular arithmetic, but the middle is a lot harder to compute. I can't find descriptions of the method which these mental calculators use for finding 13th roots of 200-digit numbers; one might expect it to be similar, but it doesn't necessarily have to be.
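
The two-ended method described above can be sketched directly. This is a toy reconstruction, not Lemaire's actual procedure; it assumes the root is coprime to 10 so that the trailing digits are uniquely determined:

```python
import math

def thirteenth_root(N):
    """Recover r from N = r**13: logarithms give the leading digits,
    modular arithmetic the trailing ones."""
    k = 4
    # Trailing k digits: x -> x^13 is a bijection on the units mod 10^k,
    # so the congruence x^13 = N (mod 10^k) pins them down uniquely.
    trailing = next(x for x in range(10 ** k)
                    if pow(x, 13, 10 ** k) == N % 10 ** k)
    # Leading digits from the logarithm:
    approx = 10 ** (math.log10(N) / 13)
    # Stitch the two parts together:
    return round((approx - trailing) / 10 ** k) * 10 ** k + trailing

print(thirteenth_root(56792143 ** 13))   # 56792143, from a 101-digit input
```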

An analogy comes to mind with solving the Rubik's cube. There are people who can solve it very quickly. But there are two metrics for the speed of solving it -- how many "moves" do you have to make to get it back to the starting position, and how many seconds does that take? The first of these is the more mathematically interesting question. A simple combinatorial argument says that there must be positions which are 18 moves from the initial position. The number of positions the Rubik's cube can take is (8! × 38) (12! × 212)/12 = 43,252,003,274,489,856,000. To see this, note that to specify a position of the Rubik's cube, one must specify:
  • the positions of the eight corner pieces, in 8! ways

  • how each of these is "twisted", in 38 ways

  • the positions of the twelve edge pieces, in 12! ways

  • how each of these is "flipped", in 212 ways

and the factor of 12 arises because it turns out not to be possible to get every possible combination of flips and twists. (Douglas Hofstadter referred to these as "conservation laws" in one of his Metamagical Themas columns, because they reminded him of certain laws about conservation of spin or color in particle physics.) Now, let's say you're randomly scrambling a Rubik's cube. There are twelve "moves" you can make from the initial position -- you can twist each of the six faces in either direction. After that, at each juncture there are eleven moves you can make -- it would be redundant to turn a face clockwise and then counterclockwise on the next move. So the number of move sequences of length exactly n is 12 × 11^(n-1), which bounds the number of positions reachable in n moves; summing these up for n = 1, 2, ..., 17 gives a number which is less than the number of possible positions of the Rubik's cube. So there are positions which are at least 18 moves from the start. It turns out that there are positions that need 26 such moves ("quarter turns"), and an algorithm is known that can solve any position of the Rubik's cube in 35 quarter turns; see this Wikipedia article, or this paper for an idea of the methods. (They use a different metric, in which turning a face by 180 degrees counts as one move, not two; this lets one replace 18, 26 and 35 above by 15, 20 and 26.)
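
The counting argument above is a one-liner to verify:

```python
from math import factorial

# Total number of positions of the cube:
positions = factorial(8) * 3 ** 8 * factorial(12) * 2 ** 12 // 12
print(positions)   # 43252003274489856000

# Move sequences of length at most 17: 12 choices for the first quarter
# turn, then 11 at each later step (never immediately undo a turn):
sequences = sum(12 * 11 ** (n - 1) for n in range(1, 18))
assert sequences < positions   # so some position must be >= 18 moves away
```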

But the people that solve the cube very quickly are interested in the actual amount of time, which means first that they would not want a method that required lots of standing around thinking, and second that there are considerations of manual dexterity that just don't come into play if you're trying to minimize the number of moves. You can imagine a Rubik's cube competition where people are challenged to unscramble the cube in a minimal number of moves, as opposed to the smallest amount of time; however it would be a much different game.

Unfortunately, the analogy to the 13th-root problem breaks down. That's because in the Rubik's cube case, you can see what the players are doing. So it's possible to count the number of moves they make. You can't count the number of elementary arithmetic operations that someone does in their head when trying to extract a 13th root, and furthermore there seems to be an element of synaesthesia in some of their calculations, so that what they see as an elementary arithmetic operation might not be what a "normal" person would consider one. What model of computation one uses is tremendously important here.