**Previously Recorded by Phyllis Schlafly // July 2014**
If you’ve spent any time on a college campus in the last couple of years, you’ve probably heard the phrase “rape culture.” That’s the latest rallying cry of the feminists. It’s the claim that we live in a male-dominated society that encourages rape. While the statistics are ridiculously inflated, there does seem to be a lot of sexual assault going on on college campuses. Feminists claim that’s because we haven’t taught men not to rape. Of course, that’s not true – the crime of rape has always been severely condemned and punished.
I think we can lay the blame for these campus attitudes on the feminists. Thanks to the feminist notion that men and women have the same sexual drive and needs, colleges have become a sexual free-for-all. Some campuses even host an annual “sex week” where students can learn about all varieties of depraved sex. Feminists have declared war on shame, and women are told that sexual freedom is liberating and empowering. If it were really true that men and women are the same sexually, all this free sex would leave equal numbers of men and women unhappy. But that doesn’t seem to be the case.
We used to teach men to be gentlemen. Men were not supposed to coerce women into sex or take advantage of women who were too drunk to give meaningful consent. We also used to teach women to be smart and not to go alone to a man’s bedroom, especially after having too much to drink.
In place of those traditional values, we’ve been left with a world where mixed signals lead men into sex that a woman can later call assault, where predators find easy prey, and where women are told that this is all empowering, even though their hearts tell them otherwise.
Women are taught to delay any thought of marriage until they are established in a good career and, in the meantime, to join the hookup culture for fun. It’s no wonder that this bad advice has produced a lot of unhappy women.