What Does Hollywood's Next Chapter Look Like?

In an era of increasingly political awards speeches and an industry seemingly under fire from Washington, what's next for the city of stars?

by VICE Staff
Feb 24 2017, 7:43pm

Image via Flickr user Gnaphron. Design by Taylor Lewis

"Hollywood is crawling with outsiders and foreigners. If you kick 'em all out, you'll have nothing to watch but football and mixed martial arts, which are not the arts."

Meryl Streep's Cecil B. DeMille Lifetime Achievement Award speech at this year's Golden Globes ceremony touched on a variety of topics—politics, one's own sense of home and belonging, public cruelty, the passing of Carrie Fisher—but it was those two sentences, addressing the encroaching xenophobia and bigotry brought on by then–president-elect Donald J. Trump's ascendancy, that resonated most strongly with the public afterward. Is it accurate to refer to Hollywood as a whole—a tremendously moneyed industry that often faces derision and charges of elitism, cronyism, and nepotism—as a place where outsiders truly thrive? By drawing a line in the sand between the arts and other forms of mass entertainment, does Hollywood risk further alienating the populace that brought Trump to power? And in the end, does it matter?

The strength and swiftness of the reaction to Streep's speech—both positive and negative—demonstrated that regardless of its intentions or desires, the world pays attention when Hollywood takes a stand. People—regardless of color, faith, sexual orientation, or immigration status—like to watch stories, and that gives Hollywood an incredibly powerful opportunity. More so than any other industry, it has a direct line into the living rooms of Americans across the political spectrum.

But in this new, truly uncharted American landscape, what is Hollywood's role? Should Sunday night's Oscar ceremony devote itself to civil rights and humanitarian causes? Is it time for the industry to refocus on the messages it delivers and the politics it collectively aligns itself with? And, importantly, is Hollywood inherently liberal because of the ideology at play in its most popular themes and story arcs, or is it a conservative enterprise driven by the bottom line, like Wall Street or any other classically capitalist entity? If the industry is indeed as progressive and all-inclusive as it portrays itself, why do some in it still believe coming out might negatively affect their careers? Are new distribution systems like streaming services opening up opportunities for more diverse voices and programming, or will the major studio cabal continue to control the industry's output? Is it up to Hollywood, as society's storytellers, to help make sure ALL people continue to be heard?

As we move into an era where Washington is threatening to stifle the voices and roll back the rights of women, people of color, immigrants, and the LGBTQ community, among others, these questions are more important than ever. Through this package of articles, titled Hollywood: The Next Chapter, VICE tries to answer those questions by investigating Hollywood's past, taking the pulse of its present, and looking toward what its future may hold. The entertainment we consume and the political climate that surrounds us are poised to overlap; with this series, we're looking at how and why, as well as what may come as a result.