Simple question with no third option. Do you think that current Western culture, at some level, tells men that they are entitled to women? (Had to shorten the title)
Explain, if you want.
Edit 5/30:
Question 2:
Do you think our society is misogynistic?
-
Male-dominated forum: "Gee, uhh, let's see. No?! Well, I'm not getting sexed, soo..." Seriously, you guys wouldn't be out of place at TRP or something.