Simple question with no third option. Do you think that current Western culture, at some level, tells men that they are entitled to women? (Had to shorten the title)
Explain, if you want.
Edit 5/30:
Question 2:
Do you think our society is misogynistic?
-
Everyone is equally entitled to sex, no more or less than anyone else, regardless of age, gender... Though for some people it could depend on circumstances (religion, STDs). Now that you mention sex, I think it's unfair that guys who manage to get it are considered champions while girls are labelled as [i]sluts[/i].