Great blog post!
Friday, January 4, 2008

I just read a wonderful post called When Did Feminism Become a Bad Word? It sums up (very eloquently, I might add) the confusion many of us feel when someone utters the words, "I'm not a feminist, but..." and then goes on to subscribe to the very ideals that feminists advocate!
Why do people feel that they have to make it clear that they are NOT feminists? I can understand wanting people to know that you're not sexist, racist, etc., but feminist? There's nothing wrong with being a feminist (as opposed to being racist or sexist, both of which are, unequivocally, wrong and detrimental to society), so why distance yourself from it, especially if you follow feminist ideals (which, again, are nothing to be ashamed of; they simply call for economic, political, and social equality for all people)? Is it because they're afraid they'll be called man-haters (how I hate that stereotype!)? Is it because they think they'll somehow be seen as less "womanly" (or "manly") if they don't subscribe to strict gender roles that frown upon deviation? Who knows? But the point is, one should research a school of thought before vehemently rejecting it.
Check out the post; it's a great one! (It's from over a year ago, but the writing--and views--are timeless!)
Posted by Amanda at 12:51 PM