Simple question with no third option. Do you think that current Western culture, at some level, tells men that they are entitled to women? (Had to shorten the title)
Explain, if you want.
Edit 5/30:
Question 2:
Do you think our society is misogynistic?
-
Not anymore. If you don't have the skill of manipulation, an amazing job, or money, it's nearly impossible to get some. Love isn't as big a part of the equation as it used to be.