
Hollywood’s New Feminists: Why the Old Ones Went Away and What’s Coming Next

Women’s rights made a major impact on Hollywood in the 1970s. Feminism, now a dirty word, was such a force to be reckoned with that you didn’t dare depict...