Well, no, it’s understood that a third is .333 to infinity, so .333 + .333 + .333 does equal 1 for any use that doesn’t require precision to the point where it matters that it was actually .33333335 when measured.
No. You wrote .333
If you want to write it precisely to infinity, you write 1/3.
Holy fuck. Where did that 5 come from?
It came from it not being actually .333 to infinity when measured at the engineering precision I was talking about. It’s literally a “common use” mathematical convention (which you are clearly unaware of) that three times .333 is one. It solves a lot of problems caused by a failure of the notation.
3 times 0.333 is 0.999, not 1.
Saying it equals 1 may be a common engineering convention, but it is mathematically incorrect.
There is no failure of notation if fractions are used, which is why I gave this as an example of their usefulness.
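To make the arithmetic concrete, here is a quick Python sketch (purely illustrative, not code from either poster; it uses only the standard-library fractions and decimal modules):

    from fractions import Fraction
    from decimal import Decimal

    # The truncated decimal: three copies of 0.333 sum to 0.999, not 1.
    print(Decimal("0.333") * 3)        # 0.999

    # The exact fraction: three thirds are exactly 1, no rounding convention needed.
    print(Fraction(1, 3) * 3)          # 1
    print(Fraction(1, 3) * 3 == 1)     # True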
You know, when a person informs you of a convention people use to solve a problem created by notation, you could just fucking learn instead of arguing stupidity.
Your chosen notation solves nothing. Try representing 3227/555 using 4 trailing dots.
I started here by showing how fractions are useful.
You are the ignorant aggressor, trying to fight centuries of mathematicians by claiming decimals are always better.
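For the 3227/555 example, a short long-division sketch (Python again, written only to illustrate the point, using the standard-library fractions module) shows exactly what the trailing-dots notation throws away:

    from fractions import Fraction

    print(Fraction(3227, 555))         # 3227/555 -- exact and unambiguous

    # Long division to produce the first ten decimal digits.
    digits, rem = [], 3227 % 555
    for _ in range(10):
        rem *= 10
        digits.append(str(rem // 555))
        rem %= 555
    print("5." + "".join(digits))      # 5.8144144144

    # The repeating block is "144" and it starts one digit after the decimal
    # point, so "5.8144..." never says whether the 4, the 44, or the 144
    # repeats; the fraction carries that information for free.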