Updated: Nov 30, 2021
Banning burkinis, the War on Terror and the US occupation of Afghanistan have nothing to do with improving the lives of women. Why, then, did women's welfare become central to any discussion of these topics?
To wage war in the name of human rights is an oxymoron. Nevertheless, this is exactly how the United States and its mass media are diverting attention from the fact that the country has just lost another war. Feminism and LGBTQ+ rights were never the motivation behind the occupation of Afghanistan, and yet the Taliban's treatment of women and gay people seems to be at the forefront of media discussion about the withdrawal of American troops. As a woman, I find it deeply uncomfortable to see women's rights being used to paint a racist, imperialist regime in a good light. Pairing concern for the educational future of Afghan girls with demands for the extermination of "primitive" Islamists feels like using feminism as a veil for Islamophobia. Women's rights are violated everywhere in the world, every day. Could it be that we lack perspective on the gendered violence present in the Christian world because we are submerged in it?