Hollywood and Wall Street have a chequered history. West Coast filmmakers have long relied on East Coast bankers to fund their dreams, but the portrayal of financiers on the silver screen has often been far from sympathetic.
Wall Street, the archetypal film about the financial industry, conjures up a world of greed and insider dealing that ends with its protagonist in prison.