Simple question with no third option. Do you think that current Western culture, at some level, tells men that they are entitled to women? (Had to shorten the title)
Explain, if you want.
Edit 5/30:
Question 2:
Do you think our society is misogynistic?
-
Strong desire for sex, yes. Entitled to it? No. Culturally, I'd like to think most men don't feel entitled to sex. There are certainly those who do, but I'd like to believe they are the exception, not the rule.