This entire argument is why we have an approximation symbol. .999... is an approximate value for 1. Using calculus theorems you can argue that it is equal to 1, but in actual real-world practice a decimal is never assumed to be truly accurate. It is always an approximate value.
-
Edited by Gearrat: 3/28/2016 5:42:01 PM
It doesn't need an approximation symbol because it isn't one. Recurring decimals aren't normal decimals. They're actually infinite geometric series.
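To see the "infinite geometric series" point concretely, here's a small sketch (the function name `partial_sum` is just for illustration) that uses exact fractions to sum 9/10 + 9/100 + ... and measure how far each partial sum falls short of 1:

```python
from fractions import Fraction

# Partial sums of the geometric series 0.9 + 0.09 + 0.009 + ...
# Fraction keeps the arithmetic exact, with no floating-point rounding.
def partial_sum(n_digits):
    return sum(Fraction(9, 10**k) for k in range(1, n_digits + 1))

for n in (1, 5, 20):
    gap = 1 - partial_sum(n)
    print(n, gap)  # the gap is exactly 1/10**n
```

Every finite string of nines falls short of 1 by exactly 1/10^n, and that gap shrinks toward 0 as the nines continue, which is what "the series sums to 1" means.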
-
-blam!-ing this. 1/3 is [i]approximated[/i] by .333... There is no -blam!-ing way to evenly divide 1 into three sections in decimal, so we use .333... to represent literally the closest thing you can get to 1/3 of 1 without actually getting there, because there's no way to express it as a finite decimal. It's a goddamn approximation and so is .999...
-
Infinite digits bruh. It works because infinite digits.
-
10 different reasons why they are equal. It's entertaining too. Another one is that there are an infinite number of numbers between any 2 unique numbers. There are 0 numbers between 1 and .9999..., so by definition of what a unique number is, they are the same.
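The density argument above can be sketched numerically. If 0.999... and 1 were different, the gap between them would be some positive number; but any positive gap you propose (the variable names here are illustrative) is beaten by taking enough nines:

```python
from fractions import Fraction

# n nines after the decimal point: 0.9, 0.99, ...
def nines(n):
    return 1 - Fraction(1, 10**n)

# Propose any positive gap between 0.999... and 1...
candidate_gap = Fraction(1, 10**6)

# ...and some finite string of nines already gets closer to 1 than that,
# so the "gap" can't be any positive number -- it must be 0.
assert 1 - nines(7) < candidate_gap
```

Since there is no room for even one number between them, and any two distinct reals have infinitely many numbers between them, the two must be the same number.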