Controversial Question · 11:36pm February 8th
This might sound odd, and I apologize in advance for any outrage it causes, but if everybody is complaining about the film industry being "woke," does that mean we should get rid of movies and TV shows outright because filmmakers supposedly can't make what we'd consider a "perfect film" to save their careers?