These are five films in which Hollywood tried to shine a light on the racism, prejudice, and social injustice faced by so many, and on how we as a society might learn to relate to each other better. While none offered a fairy-tale ending, several still managed to close on a positive or hopeful note.

What do you think?