Question by Deep End: Why do writers and filmmakers always portray the US so pathologically, but Europe so rosily?
This has actually been going on for 200 years (well, for filmmakers, 80 years), even through the Holocaust and other European ethnic cleansings and wars.
Best answer:
Answer by stanleys_2001
It’s called LIBERAL LEANING POLITICS.
What do you think? Answer below!