Hollywood’s New Feminists: Why the Old Ones Went Away and What’s Coming Next

The women’s rights movement made a major impact on Hollywood in the 1970s. Feminism, now a dirty word, was such a force to be reckoned with that you didn’t dare depict a woman in a film who didn’t have, at the very least, her own identity. It was a hard-fought war. But like...
Posted on 09 Nov 2012