Monday, 8 May 2017

The reason you don't enjoy blockbusters as much as you used to (or the truth that cynical Hollywood studios won't tell you)

On the face of it, blockbusters continue to be in rude health. Box office grosses can rival the GDP of small countries, the likes of Marvel and Warner Bros continue to create genuinely exciting franchises chock full of limitless possibilities, and again and again we find ourselves drawn back to the multiplex, spending large sums of cash for premier seating and gawky plastic 3D glasses.





Deep down though, the way we enjoy these big cinematic events has changed. Even if you do successfully evade the minefield of trailers and teasers in the run-up to a film's release, each one more spoiler-filled than the last, you may emerge from the cinema with the nagging feeling that it just didn't have the same impact on you as blockbusters once did.

Gently nodding along? Wondering why you didn't realise this earlier? There is more than one reason, and they are not as obvious as you may think...

The science of getting it wrong

"Audiences are getting more discerning," says Tim Smith, a cognitive psychologist specialising in audiovisual cognition at Birkbeck University. "We're no longer won over by the spectacle of CG imagery. But so long as our attention isn't drawn to the act of the construction of the image, we can let most imperfections go by."

And that is the trouble. While our gaze has remained the same – limited to 5% of the screen, occasionally shifting to people's faces and points of high interest such as explosions – digital effects have not. No longer confined to the periphery, they have gradually moved front and centre, where we are forced to confront them for longer periods.

Stephen Prince, professor of cinema studies at Virginia Tech, notes that "cinematic representation operated significantly in terms of structured correspondences between the audiovisual display and the viewer's visual and social filmic experience." Translation: what we accept on screen always comes back to whether it squares with our real-world experience.





So, if King Kong or a Transformer happens to be on the screen – fine, we can deal with that. If Avatar (2009) is mostly digital effects, no problem. Issues arise when we process objects we wouldn't assume need digitising.

Like, say, a human being – which prompts a dilemma within our moviegoer mind.





A case in point is the recent digital resurrection of the late Peter Cushing as Grand Moff Tarkin in Gareth Edwards’ Rogue One: A Star Wars Story (2016). Aside from the ethical question over bringing the dead actor back to life, the character polarised opinion, largely due to the 'uncanny valley' effect, whereby watching ultra-real – but not quite real – humanoid forms can elicit feelings of unease.

The recreated Peter Cushing was labelled a "constant distraction" by Collider and accused of "quietly undermining every scene he's in by somehow seeming less real than the various inhuman aliens in the movie" by The Hollywood Reporter.

And don't expect the digital regeneration of actors to stop there – Ridley Scott (Prometheus) has already hinted that he may employ similar technology to rejuvenate Sigourney Weaver for future Alien outings, and it makes perfect commercial sense for studios to safeguard their franchises this way.

Yet Hollywood's quest for digital perfection is also arguably its weakness. For these dizzying spectacles to land with the same jaw-dropping impact as they did a few decades ago, what seems to be needed is a better balance between digital and practical effects.





Take Jurassic World (2015). A great story, no doubt. A box office behemoth, for sure – but did the film really make you "hold on to your butts" in quite the same way Spielberg's original did? The answer is probably no.

While the digital effects employed in Jurassic Park (1993) were certainly pioneering, much of the claustrophobic terror and joy was down to the incredible special effects work of Stan Winston, whose practical animatronics provided everything from the raptors running amok in the kitchen (their eyes and arms were radio-controlled) to the sick triceratops. Pure cinematic gold. Had these moments been purely CGI, they would likely have had a considerably different feel.


A misunderstanding in the brain

The film's apparent over-reliance on digital effects prompted YouTube science channel StoryBrain to release a video arguing that CGI peaked in the 1990s. "Where CGI outweighs the physical in a film," they explain, "a misunderstanding occurs in the way the brain processes the visuals, leaving you unconcerned about the events on screen".

And make no mistake, we are getting much more than we used to.

A few decades ago, directors would superimpose digital effects onto a real scene – a trick that lasted up until around 2004, when software passed the point where production teams could render fully computer-generated backgrounds and foregrounds.

Dubbed the WETA effect (after the studio that pioneered it), this meant entire worlds could be created without a director needing to pick up a camera, relegating the real action to a supporting role and, subconsciously for the filmgoer, draining scenes of any real sense of peril. You can see it in The Lord Of The Rings films, and even more noticeably in The Hobbit prequels.





Another major problem with this WETA effect, argues Tim Smith, is that the ease of creating digital worlds means filmmakers tend to show too much, in effect "taking away the mystery" and stopping the viewer from "actively, cognitively engaging with the construction of the film in their own mind".

One man who gets it right more often than not is director Jon Favreau (The Jungle Book). When Smith was invited by the Academy of Motion Picture Arts and Sciences to speak in a series of talks about CGI in LA last year, he sat down with Favreau to conduct a live audience experiment.

The experiment tracked viewers' eyes as they watched the Monaco Formula One sequence in Iron Man 2 (2010), to see whether they were looking at the areas Favreau had expected them to.





"He was quite surprised at how reduced the gaze was, how focused it was to a particular point on the screen, which was exactly the point he had composed his shots for," says Smith.

Meaning the background stayed in the background. "The majority of the CGI was in the periphery, exactly where the audiences weren't looking. Jon told us that they never even went to Monaco, that they shot on a backlot in LA. The crowd was basically a composite of multiple people, the cars are mostly CG. There's very limited real content in that scene, yet Jon made sure what real elements they did have were central, because he knew what things are going to attract the human eye."

So it is not just how much CGI is on screen that threatens our enjoyment of blockbusters, but rather where and how it is deployed.


Colour, rather than shape, is more closely related to emotion

You may have noticed that many movies look quite similar these days. Not in the sense of formulaic, clichéd tropes, but in a far more subliminal way. The advent of digital colour grading (whereby you can tint every frame of the film after shooting, adjusting its brightness and colour balance) has allowed filmmakers to seek out the optimum palette for their films.

One of the earliest, most distinctive uses of the technology was the Coen brothers' O Brother, Where Art Thou? (2000), in which they turned all the greenery on show into a mellow, golden sepia, evoking the hot Southern summers seen in old photographs. The Wachowskis bathed the scenes set inside the Matrix in The Matrix (1999) in green, while the real-world scenes take on a more naturally bluish daylight tint.
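For the technically curious, here is roughly what that kind of global tint amounts to – a crude, purely illustrative Python sketch (assuming a hypothetical frame.png and the numpy and Pillow libraries), nothing like the tools an actual colourist would use:

```python
# Purely illustrative: a crude global "golden sepia" grade applied to one frame.
# Assumes a hypothetical input file frame.png; real grading suites work on
# log footage with vastly more control than a single multiply per channel.
import numpy as np
from PIL import Image

frame = np.asarray(Image.open("frame.png").convert("RGB"), dtype=np.float32) / 255.0

luma = frame @ np.array([0.299, 0.587, 0.114])      # per-pixel brightness
desat = 0.6 * frame + 0.4 * luma[..., None]         # partially desaturate the greens
warm = np.array([1.10, 0.95, 0.70])                 # push the whole frame towards gold
graded = np.clip(desat * warm, 0.0, 1.0)

Image.fromarray((graded * 255).astype(np.uint8)).save("frame_sepia.png")
```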






But by far the most far-reaching consequence is what is known as the 'orange and teal effect'. In a nutshell, human skin looks its best in a warm, orange light. It is why old-school cinematographers always shot their romantic scenes during the 'magic hour' of dawn or dusk. And as any colour theorist knows, orange's complementary colour is teal (a subdued turquoise). Orange never looks as orange as when it is set against a background of teal.

Which is why every Hollywood movie with the budget to fix the colour palette in post-production turns everyone and everything orange and teal.
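If you want to see the trick in miniature, here is a minimal split-tone sketch in the same spirit (again assuming a hypothetical frame.png and the numpy and Pillow libraries, purely for illustration): shadows get nudged towards teal, brighter areas – where skin usually sits – towards orange.

```python
# A toy "orange and teal" split-tone: darker pixels drift towards teal,
# brighter ones towards orange. Illustrative only – not a real grading pipeline.
import numpy as np
from PIL import Image

frame = np.asarray(Image.open("frame.png").convert("RGB"), dtype=np.float32) / 255.0

luma = frame @ np.array([0.299, 0.587, 0.114])      # 0 = shadow, 1 = highlight
teal = np.array([0.00, 0.45, 0.50])                 # complementary pair sitting
orange = np.array([1.00, 0.55, 0.10])               # opposite each other on the wheel

# Blend each pixel a little towards teal in the shadows, orange in the highlights.
strength = 0.25
tint = teal * (1.0 - luma[..., None]) + orange * luma[..., None]
graded = np.clip((1.0 - strength) * frame + strength * tint, 0.0, 1.0)

Image.fromarray((graded * 255).astype(np.uint8)).save("frame_orange_teal.png")
```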






In short, films are literally all beginning to look the same.


The culture clash

There is also another factor at play in our declining excitement at tentpole pictures – quite a major one. "By their very definition, blockbusters are money-makers, and so have to fight for attention," says Smith. "So as films target international markets such as China and India, where language and characterisation can confuse, there will tend to be this simplification of those stories".

Simply put, as blockbusters shift towards lucrative new shores, so too do the storylines. Action becomes the universal language, reducing the threat of culture clash and narrative confusion, and the story – or at least what is left of it – becomes geographically ambiguous. Hence the humans-versus-sea-monsters spectacle of Pacific Rim (2013), and Kong: Skull Island's cynical casting of Chinese actress Jing Tian in a role nobody quite remembers.

If nothing else, these marketing ploys go some way towards explaining the soulless feel of the recent Transformers films – their human element now all but lost to a digital orgy of shapeshifting robots, exploding oil drums and wanton product placement.

Age of Extinction? Perhaps Michael Bay meant his prop team...
