### trig approximations

One virtue of my math education was that it was fairly proof-heavy; I feel I can demonstrate or prove most of what I know. I may have mentioned that this didn't extend to sin(A+B), which I had to figure out later as an adult. It also didn't include a detailed "how do we calculate this shit?" for things like trig, logs, and fractional exponents. Once you hit calculus the answer becomes "Taylor series or something better", but before then?

Feynman talks about it somewhat in Chapter 22 of his Lectures, which is a cool read; he goes from defining multiplication in terms of addition, to numerically suggesting Euler's formula, and showing how to build up log tables along the way. It's a heady ride. But he doesn't talk about trig tables.

I've long had my own thoughts about those, and finally wrote some code to try out the calculations.

Basic idea is that given sin(a) = y/r, cos(a) = x/r, and the Pythagorean theorem, there are two right triangles for which we know the values exactly. The 45-45-90 is trivial, while the 30-60-90 one can be gotten from sin(60) = cos(30) together with sin(60) = sin(30+30) = 2*sin(30)*cos(30): dividing out cos(30) gives sin(30) = 1/2, and the Pythagorean theorem fills in the rest. From there, you can apply the half-angle formula as often as you please to get very small angles, at which point you'll notice that sin(x) ~= x for small x (measured in radians). Thus you can approximate an angle like 1 degree, which you can't otherwise[1] get to exactly, then use the double-angle and addition formulas to build back up to, say, 10 degrees or anything else.
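The half-angle descent can be sketched like so (my own quick sketch, not the code I actually wrote; `halve` is a made-up helper): starting from cos(60 degrees) = 1/2 exactly, each application of sin(x/2) = sqrt((1 - cos x)/2) and cos(x/2) = sqrt((1 + cos x)/2) halves the angle.

```python
import math

def halve(cos_x):
    """Given cos(x) for 0 <= x <= pi, return (sin(x/2), cos(x/2))."""
    return math.sqrt((1 - cos_x) / 2), math.sqrt((1 + cos_x) / 2)

# Descend from 60 degrees (cos = 1/2 exactly) down to 60/2^4 = 3.75 degrees.
angle, cos_a = 60.0, 0.5
for _ in range(4):
    sin_a, cos_a = halve(cos_a)
    angle /= 2

print(angle, sin_a)
print(math.sin(math.radians(angle)))  # library value, for comparison
```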

The question in my mind always was, how good is that? And the code finally answers the question. Sticking to half-angle stuff is *very* accurate, matching the built-in function to 15 decimal places. Using a sin(1 degree) approximation and building up to sin(15 degrees) is accurate to 4 places; starting from 0.125 degrees instead is accurate to 6 places. I don't know how practically good that is; one part in a million sounds pretty good for pre-modern needs -- like, at that point can you make or measure anything that precisely? -- but Feynman says that Briggs in 1620 calculated log tables to 16 decimal places.
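The build-back-up step can be sketched like this (again my own reconstruction, not the actual code): seed with sin(1 degree) ~= pi/180, get cos(1 degree) from the Pythagorean identity, then march up to 15 degrees with the addition formulas. For me this lands in the ballpark of the 4-place accuracy described above.

```python
import math

DEG = math.pi / 180  # one degree, in radians

# Small-angle seed: sin(x) ~= x for small x, so take sin(1 deg) ~= pi/180.
s1 = DEG
c1 = math.sqrt(1 - s1 * s1)

# Build up to 15 degrees one degree at a time via the addition formulas:
# sin(a+b) = sin(a)cos(b) + cos(a)sin(b), cos(a+b) = cos(a)cos(b) - sin(a)sin(b).
s, c = s1, c1
for _ in range(14):
    s, c = s * c1 + c * s1, c * c1 - s * s1

print(s)                           # approximation of sin(15 degrees)
print(math.sin(math.radians(15)))  # library value, for comparison
```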

[ETA: hmm, I'd been assuming the built-in function, presumably based on a power series, was the most accurate version, but maybe I should view it as deviating from the exact roots of the half-angle approach. Or view both as having small errors from the 'true' value. Especially as Python's sin of 30 degrees (math.sin(math.radians(30))) returns 0.49999999999999994, not 0.5. Of course, both approaches are using floating point. sin(60 degrees) matches sqrt(3)/2 to the last digit, but cos(30 degrees) is slightly different.]
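Those last-digit quirks are easy to poke at directly (math.sin takes radians, hence the conversions):

```python
import math

# Mathematically these would all be 0.5 or sqrt(3)/2 exactly; in floating
# point, some agree to the last bit and some are off by one ulp.
print(math.sin(math.radians(30)))                   # not exactly 0.5
print(math.sin(math.radians(60)), math.sqrt(3) / 2)
print(math.cos(math.radians(30)), math.sqrt(3) / 2)
```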

[1] The two exact triangles basically give you pi/2 and pi/3, those divided by 2 as much as you want, and integer combinations of the results via the addition formulas. But you can't get to pi/5 exactly.

a/2 + b/3 = 1/5 #a and b integers

15a + 10b = 6  (multiplying through by 30)

5(3a + 2b) = 6, which has no integer solution, since 5 doesn't divide 6.

(note that 2a + 3b = 5 is totally solvable, since gcd(2,3) = 1 divides 5.)

Though I guess a more valid approach would be

c*(1/2)^a + (d/3)*(1/2)^b = 1/5  # all variables integers, a and b >= 0

c*5*3*2^b + d*5*2^a = 3*2^(a+b)  (multiplying through by 15*2^(a+b))

5(c*3*2^b + d*2^a) = 3*2^(a+b)

which still isn't soluble: the left side is divisible by 5 and the right side never is.

Likewise for getting to pi/9 (20 degrees):

c*(1/2)^a + (d/3)*(1/2)^b = 1/9

c*9*2^b + d*3*2^a = 2^(a+b)  (multiplying through by 9*2^(a+b))

3(c*3*2^b + d*2^a) = 2^(a+b), which again has no solution, since 3 never divides a power of 2.
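As a sanity check on the divisibility arguments, a brute-force search (my own sketch, over a small range of coefficients and exponents, using exact rational arithmetic) finds no way to hit 1/5 or 1/9, while known-reachable targets show up immediately:

```python
from fractions import Fraction as F
from itertools import product

def reachable(target, max_exp=10, max_coef=10):
    """Can c*(1/2)**a + (d/3)*(1/2)**b equal target, for integers
    0 <= a, b <= max_exp and |c|, |d| <= max_coef?"""
    halves = [F(1, 2)**k for k in range(max_exp + 1)]
    for a, b in product(range(max_exp + 1), repeat=2):
        for c, d in product(range(-max_coef, max_coef + 1), repeat=2):
            if c * halves[a] + F(d, 3) * halves[b] == target:
                return True
    return False

print(reachable(F(1, 5)), reachable(F(1, 9)))  # the unreachable targets
print(reachable(F(1, 3)))                      # reachable: c=0, d=1, b=0
```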
