There is a common misconception that feminism is no longer needed in modern Western societies. The word itself carries connotations of man-hating, bra-burning, angry women with chips on their shoulders.
Consequently, feminism has become a touchy subject and, strangely, many women would not consider themselves feminists, let alone proudly claim the label.
So, is it true that we really don’t need feminism anymore?
Well, when millions of women are forced into sexual slavery by human traffickers, when violence against women remains frighteningly prevalent, and when a quarter of women have been victims of rape or attempted rape, the answer seems to be a resounding “no”.
Even in the workplace, women have still not achieved equality. Despite almost two centuries of feminism, women are regularly paid less than their male counterparts and often run smack into the ‘glass ceiling’.
Very few countries have, or have had, female leaders. The prospect of a female U.S. president still seems a long way off. And currently, in the U.K., only 126 of 646 MPs are women.
In everyday Western life, women have arguably succeeded in getting the worst of both worlds: still bearing the lion's share of childcare and domestic chores while also attempting to hold down a career. Something tells me this is not what the Pankhurst girls had in mind.
If we think we have truly achieved equality, we’re deluding ourselves. Of course things have improved, but it is ridiculous to assume that feminism is no longer necessary. Instead, we need to change how feminism is perceived, and to do that, more women (and men) need to embrace the cause for its core beliefs.
The truth is that there’s still a frighteningly long way to go. In this writer’s opinion, we dismiss the importance of feminism at our peril.