Calculators also say that dividing by 0 is an error, but intuition says the answer is infinite. (If I recall correctly, it's more properly 'undefined', but I'm years out of math classes now.)
That is, as you divide a number by a smaller and smaller number, the quotient increases: 1/.1 = 10, 1/.01 = 100, 1/.001 = 1000, and so on. As the denominator approaches 0, the quotient approaches infinity. But infinity isn't a number you can pin down, and if you approach 0 from the negative side the quotient heads toward negative infinity instead, so there's no single value to give and the result is treated as undefined.
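To make that limit argument concrete, here's a quick sketch in Python (just my own illustration, nothing rigorous): the quotient grows without bound as the denominator shrinks from the positive side, plunges toward negative infinity from the negative side, and division by exactly 0 raises an error rather than returning infinity.

```python
# Watch 1/x as x shrinks toward 0 from both sides.
for x in [0.1, 0.01, 0.001, 0.000001]:
    print(f"1/{x} = {1 / x}")     # grows toward +infinity
    print(f"1/{-x} = {1 / -x}")   # grows toward -infinity

# Dividing by exactly 0 is an error, not infinity.
try:
    print(1 / 0)
except ZeroDivisionError as err:
    print("1/0 ->", err)
```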
If someone who's a mathematician wants to explain this correctly, I'm all ears.