I want to know why so many of us still fear or disdain feminists.  Why are our adjectives so negative?  “Angry feminists” and “militant feminists”: neither has a good connotation, and those are just two that come immediately to mind.

If these feminists are angry, perhaps it is for good reason?  Have you read their works?  Almost every feminist book I have read begins with a personal story (or two) of hurt, betrayal, or abuse by the men in the author’s life: husbands, fathers, spiritual leaders, and so on.  These stories are often heart-wrenching.

Yet when we look at feminists (i.e. women who are brave and strong enough to stand up, tell their stories, and insist that the wrongs be righted), we judge them, we malign them, and we offer no sympathy or empathy for the real hurts they have experienced.  We insist that in the US things are pretty damn good for women: women can vote, they can wear pants, they can get jobs, they can stay single.  So obviously equality has arrived and feminism is passé.

But… what about the wage gap?  What about the many jobs denied to women, either overtly or covertly?  What about the societal pressure on women to marry, and the stigma that attaches to a single woman but rarely to a single man?  What about all the ways, big and small, that women are still regarded as property or objects to be bought and sold?  What about rape culture?  What about the fact that so many men see women only as walking vaginas?

We’re not equal.  Justice has yet to be done.  Feminism is still needed, and not just by women.  We need men to stand alongside us, to affirm our equality, and not to be threatened by women.  If men lose the place they’ve held so long in society, the understanding should be that their place was ill-gotten and unjustly held.  There was never supposed to be a hierarchy.  But in every way possible, society upholds the false dichotomy between women and men.

Come on, folks.  Let’s get over ourselves. 
