At least when I made this thread I was specific about it, and provided an example proof relating x to .999... and 1, which clearly disobeyed algebraic rules, but trolls will troll on. C'mon OP, if you're going to copy this decade-old troll, you gotta be better than that.
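For anyone who hasn't seen it, the decade-old algebraic argument being alluded to usually runs like this (whether the subtraction step is legitimate for an infinite decimal is exactly what's disputed in this thread):

$$
\begin{aligned}
x &= 0.999\ldots \\
10x &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots = 9 \\
9x &= 9 \\
x &= 1
\end{aligned}
$$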
You can be proven wrong very easily, because you did not specify that people must prove you wrong [i]mathematically[/i]. There is a fine distinction between proving something true in mathematics and proving something true in semantics (linguistics, meaning). Simply put, .999... and 1 are not defined as the same number: .999... is an infinitesimally small distance away from 1, but if it were 1, it would be called 1. That is why we call .999... ".999..." and we call 1 "1".
Mathematically, the equals sign is used to show equivalent value on either side of an equation. There is no possible way to mathematically show a difference in value between the two numbers; therefore you cannot construct a mathematical proof against this position until you consider the numbers and their actual meanings. It is also not logical to work with infinities, as infinity is a concept and [i]not[/i] a number. This exposes a flaw in our math system: we cannot possibly define these two numbers as mathematically different, even if they are.
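To make the "no exhibitable difference" point concrete, here is a minimal sketch (the function name is mine, not from the thread) using exact rational arithmetic: every finite truncation of .999... falls short of 1 by exactly 1/10^n, and that gap shrinks below any fixed positive tolerance as n grows.

```python
from fractions import Fraction

def partial_sum(n):
    """0.999...9 with n nines, computed exactly with rationals."""
    return sum(Fraction(9, 10 ** k) for k in range(1, n + 1))

for n in (1, 5, 10):
    # The gap below 1 is exactly 1/10**n -- nonzero for every finite n,
    # but smaller than any fixed positive number once n is large enough.
    print(n, 1 - partial_sum(n))
```

Whichever side of the debate you take, the arithmetic itself is not in dispute: the disagreement is over what the limit of these truncations should be called.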