(© Jeff Kravitz/FilmMagic/Getty Images)
5 Female Directors Taking The Film Industry By Storm!
Written by Pauline Woodley. Published: December 2, 2019
The image of a director has evolved since the beginning of film. Now, more than ever, marginalized groups are taking the lead in telling their own stories. Thanks to the #TimesUp and #MeToo movements, Hollywood is becoming a safer place for marginalized groups to work, ushering in a new wave of female and WoC directors who will continue to bring positive change to the movie industry. Here are 5 female directors who have released must-see work and will continue to use their talent for good.