Simple question with no third option. Do you think that current Western culture, at some level, tells men that they are entitled to women? (Had to shorten the title)
Explain, if you want.
Edit 5/30:
Question 2:
Do you think our society is misogynistic?
-
Yes, without a doubt. You go on a date: the man pays for food, and the woman is expected to sleep with him in return. If you don't get some action at the end of the night, people will tell you it was a bad date, or that the woman screwed you over.