Hollywood’s Damaging Images And Stereotypes

“The South supposedly lost the Civil War. There is an overused cliché suggesting that the winners write history. Is this true? If it is, then why do so many relics of the losing side still circulate in this society so many years after the Civil War ended? The point is, films like Gone With the Wind should have been held accountable a long time ago. Further, Hollywood’s role in disseminating such demeaning, dehumanizing, stereotypical images can no longer be ignored.” – The Guardian