For this week's post, we will delve back into an official MLB baseball rule that was recently brought to my attention.
1.05 Home base shall be marked by a five-sided slab of whitened rubber. It shall be a 17-inch square with two of the corners removed so that one edge is 17 inches long, two adjacent sides are 8 1/2 inches and the remaining two sides are 12 inches and set at an angle to make a point. It shall be set in the ground with the point at the intersection of the lines extending from home base to first base and to third base; with the 17-inch edge facing the pitcher’s plate, and the two 12-inch edges coinciding with the first and third base lines. The top edges of home base shall be beveled and the base shall be fixed in the ground level with the ground surface.
That looks something like this:
The rule doesn't explicitly state that the angle between the two 12-inch sides has to be a 90-degree right angle, but the rules do state that the baselines meet at 90 degrees at first base and third base, as well as at second base, which forces the angle at the point of home plate to be 90 degrees too. This is IMPOSSIBLE!
Thinking back to our geometry days, the Pythagorean theorem tells us that a² + b² = c². Here, "a" equals 12 inches, "b" equals 12 inches, and "c" equals 17 inches. This is where it falls apart: 12² + 12² = 288, but 17² = 289! It just doesn't work!
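If you'd rather let a computer do the arithmetic, here's a quick Python sketch (my own illustration, not part of the rule book) that checks whether the two 12-inch sides could meet at a right angle and still span the 17-inch edge:

```python
import math

# The two 12-inch sides of home plate, which the baseline rules
# imply should meet at a right angle at the point.
a = b = 12
# The 17-inch width of the plate, which the far ends of those
# two sides must span.
c = 17

print(a**2 + b**2)       # 288
print(c**2)              # 289
print(math.hypot(a, b))  # ~16.9706 -- just shy of 17 inches
```

The hypotenuse of a right triangle with 12-inch legs is 12√2 ≈ 16.97 inches, about three hundredths of an inch short of the 17 inches the rule demands.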
It's definitely amusing to think that the official MLB rule book specifies a home plate shape that is impossible!
2 comments:
In order for the angles to hold true what would the actual lengths need to be?
It looks like it should be about 12.02 inches, or exactly 17/√2 (equivalently (17√2)/2)! I suppose that's why the difference is negligible.
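To confirm the commenter's figure, a one-line Python check (again, just an illustration): keeping the 17-inch edge and the right angle at the point, each "12-inch" side would really need to be 17/√2 inches.

```python
import math

# Length each of the two angled sides would need to be
# for a 90-degree point and a 17-inch edge.
side = 17 / math.sqrt(2)
print(round(side, 4))  # 12.0208
```

So the rule book is off by roughly 0.02 inches per side, which is far smaller than the tolerance of any groundskeeper's work.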