I know the main focus was slavery and the fact that Lincoln didn't win a single Southern state, but was there more to it than that?
Could Americans please elaborate?
-
Do you guys think the Southerners were right to defend their lifestyle? I know most of us agree that slavery was unjust, but back then it wasn't widely seen as morally wrong, or so I believe. If someone suddenly tried to take away everything that makes you, you, wouldn't you want to stop them as well?