A new and useful proof that $1 = 1¢

Note: this is only useful to you if you use it in the right direction.

$1 = 100¢ = (10¢)^2 = ($0.10)^2 = $0.01 = 1¢.

How can you use this proof? Propose to your student loan creditor that your $10,000 loan is actually just 10,000¢ = $100. Their strategy would then be to suggest that instead your $10,000 loan = 1,000,000¢ = $1,000,000. Then they can propose a settlement where you split the difference, and you just pay the average of $100 and $1,000,000, or $500,050. Your best bet would be to suggest that you split the difference in log form — that is, take the geometric mean of $100 and $1,000,000, which is exactly $10,000 — so you’re back to where you started.
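The two ways of "splitting the difference" can be checked in a couple of lines (a sketch; the figures are the ones from the paragraph above):

```python
import math

# The two (equally bogus) readings of the $10,000 loan:
debtor, creditor = 100, 1_000_000

# Plain average: what the creditor would love.
arithmetic_split = (debtor + creditor) / 2

# Averaging in log space is the geometric mean.
log_split = math.sqrt(debtor * creditor)

print(arithmetic_split)  # 500050.0
print(log_split)         # 10000.0 -- right back to the original loan
```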

UPDATE: What is a square cent? See this.

UPDATE 2: More seriously, there’s a claim going around in the comments that square dollars don’t exist and that you can’t square a “non-linear unit”. I’m not sure what a non-linear unit is, but it’s clear that you can square a dollar. For instance, as commenter UWIR points out, the variance of a dollar-valued function is measured in square[d?] dollars. Recall that the variance is the average square deviation from the mean. So if you have two data points X1 = $1 and X2 = $2, the mean is $1.50 and the variance is $$0.25. (The standard deviation is defined as the square root of the variance, so we get right back to dollars, in this case $0.50.)
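The two-point example above is quick to verify; here is a minimal sketch of it (using the population variance, i.e. dividing by n, as the "average square deviation" definition implies):

```python
import math

data = [1.0, 2.0]  # X1 = $1, X2 = $2, in dollars

mean = sum(data) / len(data)                            # $1.50
var = sum((x - mean) ** 2 for x in data) / len(data)    # $$0.25 (square dollars)
sd = math.sqrt(var)                                     # $0.50 -- back in plain dollars

print(mean, var, sd)  # 1.5 0.25 0.5
```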

Note that the square dollar ($$) aspect is important. You can’t just say “ignore the dollars, do the problem as though X1 = 1 and X2 = 2, so the mean is 1.5 and the variance is 0.25, and then add the dollars back.” It’s important that these be square dollars, because a square dollar is 10,000 square cents. [($1)^2 = (100¢)^2 = 10,000¢¢.] ‘Cause lookit: if you rephrased your data set as X1 = 100¢ and X2 = 200¢, you’d get a mean of 150¢ and a variance of 2500¢¢, which wouldn’t be consistent with the $$0.25 we found before unless we had $$1 = 10,000¢¢. (The same goes for square euros: if you rephrase the problem in euros, with $1 = 0.77 €, you’d better be using $$1 = 0.5929 €€.)
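A sketch of the unit-conversion check: the same data expressed in dollars, cents, and (at the $1 = 0.77 € rate assumed above) euros, confirming that the variance converts by the *square* of the exchange rate:

```python
def variance(xs):
    """Population variance: average squared deviation from the mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

var_dollars = variance([1.0, 2.0])       # 0.25 $$
var_cents = variance([100.0, 200.0])     # 2500 ¢¢
var_euros = variance([0.77, 1.54])       # ≈ 0.148225 €€

print(var_cents / var_dollars)   # 10000.0 -- so $$1 = 10,000¢¢
print(var_euros / var_dollars)   # ≈ 0.5929 = 0.77**2, so $$1 = 0.5929 €€
```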

If you pretended the variance was in dollars, you’d be expecting a ratio of 100 when you redid the problem in cents, not a ratio of 10,000; and if you pretended the variance had no units, that would be inconsistent with the idea that the variance of a set of numbers should be the same regardless of what units ($ or ¢) you express it in. (The variance had better be the same, because otherwise the standard deviation won’t be the same… and then good luck relying on the two-sigma rule that about 95% of the area is within two standard deviations!)
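The ratio argument can be made concrete: under a dollars-to-cents change of units, the standard deviation scales by 100 (it’s dollar-valued, so it converts linearly), while the variance scales by 10,000 (it’s square-dollar-valued). A sketch, using the same two-point data:

```python
import math

def variance(xs):
    """Population variance: average squared deviation from the mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

var_d, var_c = variance([1.0, 2.0]), variance([100.0, 200.0])
sd_d, sd_c = math.sqrt(var_d), math.sqrt(var_c)

print(var_c / var_d)  # 10000.0 -- variance ratio: 100**2
print(sd_c / sd_d)    # 100.0   -- standard-deviation ratio: linear
```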