Hollywood produces fiction. Nothing presented in movies can be taken as representative of facts or reality. Even (or especially) if the movie is historical or "based on a true story".
Hollywood is in the entertainment biz, not education. Is there any subject that they don't lie about?
(Not saying they're malicious, usually. Just that looks-cool pretend will almost always rake in more revenue than reality. Without the hassles or expense of researching what the truth actually is, or changing their script/casting/costumes/whatever to bear a passable resemblance to it.)