“Rape culture” is a term/concept that gets thrown around a lot. I think it can be a valid lens through which to look at societies today. However, I think the term itself is problematic, and that it creates obstacles for those trying to make others (men) aware of how widespread rape and sexual assault are within a particular culture. The term has recently resurfaced in my mind because of the renewed rape allegations against Bill Cosby. Regardless of Mr. Cosby’s guilt or innocence, there is something to be said about a society that allows men (especially rich, powerful, famous men) to get away with such a heinous crime. For decades and decades. As we so often see with high-profile accusations, many more women have since come forward accusing Mr. Cosby of rape.
A few issues present themselves:
1) Because of Mr. Cosby’s denials, the women are being accused of lying (or, worse yet, of simply regretting their sexual encounters and then lying) in order to gain some sort of attention/fame/money(?)
2) A perpetuation of the notion that women will do anything for some sort of attention/fame/money(?), even lie about rape
3) A resurfacing of the idea that “nice” men cannot rape. Mr. Cosby appears to be an awesome, funny dude. How could he possibly do that to women, right?
We see similar patterns with high-profile men (including Woody Allen) pretty frequently. Some people experience a serious loss of self because someone they valued, someone who was a hero to them, has now been accused of a serious crime, one that involves physically debasing another human being to a degree that no one should ever, ever have to experience (this isn’t some white-collar, Martha Stewart insider-trading shit). In turn, some become fiercely defensive and angry on behalf of their now-fallen heroes (though they know nothing about the celebrity personally, and have no stake at all in whether that person is charged with a crime).
So what does this all have to do with “rape culture”?