Feminism [fem-uh-niz-uhm] –noun
1. the doctrine advocating social, political, and all other rights of women equal to those of men.
2. (sometimes initial capital letter) an organized movement for the attainment of such rights for women.
[Origin: 1890–95; < F féminisme; see feminine, -ism]
Dictionary.com Unabridged (v 1.1)
Based on the Random House Unabridged Dictionary, © Random House, Inc. 2006.
In other words, we’re not man-haters or bra-burners or women with hairy armpits. We’re just women, like you. We love ourselves and others too.
Stop being afraid. Start feeling empowered.