Are you ready for the new era in filmmaking?
There’s been a seismic shift in filmmaking over the past ten years, with digital technology disrupting 100 years of film history to become the standard medium for cinematography. What can the developments of the past decade tell us about where VFX is headed next?
Visual effects beyond post-production
Previously, visual effects were always added in post-production, after the action had taken place on set. Over the past decade, the visual effects industry has developed tools that are now used in other phases of filmmaking - not just post. This was highlighted in 2009’s Oscar-winning visual effects epic Avatar, in which director James Cameron mixed on-set actor performances with computer-generated material.
So how could this trend develop?
If we were to take the advances in previsualization to the next level, we might envision a future in which an entire set is digitally created, with actors dropped into it live, without further post-production. Take the ‘tiger in boat’ sequences from 2012’s Life of Pi, for example. Rather than shooting the actor on a boat in a pool and then adding a digital tiger in post-production, you could create the scene first and have the actor perform in the digital set, in real time.
Beyond the set and studio, could the rise in popularity of immersive theatre-cinema hybrids - such as Secret Cinema - see visual effects tools used live in theatre in the future? Tellingly, we may already be seeing the beginnings of this. The Royal Shakespeare Company’s recent production of The Tempest has been given a groundbreaking digital twist, with Ariel rendered digitally in real time - no need to produce the image in advance and project it on stage.
And there is the intriguing possibility of the flip-side of this: rather than digital actors performing in a real set, might we see real theatre actors performing in a wholly digital set?
Performance capture bounds ahead
The past decade has seen significant advances in ‘performance capture’. Like those pioneering previsualization techniques, this is an area in which Avatar led the way, and where advances were most recently showcased by the stunning CGI work on 2017 Oscar winner The Jungle Book. Both films illustrate the progress being made in the performance capture of faces and bodies: using the digitally recorded expressions and three-dimensional movements of actors to create CGI characters later.
The future?
The obvious and most immediate scenario is that we manage to create CG characters so realistic we can’t tell which performances are given by a real-life human and which by their digital replica.
Beyond this lies the intriguing question of whether one day we might be able to forgo human actors altogether. Might we create our own believable, fully digital actors? In large part, this will come down to how well we can replicate human emotion: whether emotion can be realistically digitised. We’re not quite there yet. Every CGI character you see in a film or a game that gives a truly realistic emotional performance does so because a real actor gave that performance.
And even then, we’re often led into the ‘Uncanny Valley’ - which brings us to…
Conquering the Uncanny Valley
The so-called ‘Uncanny Valley’ - where human replicas that appear almost (but not exactly) like real human beings elicit feelings of eeriness and revulsion - has been a challenge for CGI artists for many years. There have been some notorious examples of studios not getting it quite right - such as the wobbly superimposition of the late Nancy Marchand’s head onto someone else’s body in a 2000 episode of The Sopranos.
Encouragingly, the past three years have seen huge progress in this field. The reception - particularly among younger viewers - of a digital Peter Cushing in Rogue One: A Star Wars Story indicates that visual effects technology has reached a stage where we can create a human likeness to a compelling degree of accuracy.
For the future, truly conquering the Uncanny Valley will mean mastering human emotion to the point where we can create fully digital actors who give convincing, pathos-laden performances indistinguishable from the real thing.
Key to this will be rendering.
The VFX industry has made huge strides in rendering surfaces and lighting, which is why digitally created humans are looking more and more realistic. As rendering improves in the future, it will become even easier to make things ‘look right’ - which in turn will make it even more difficult for us to distinguish between digitally created human faces and the real thing.
The march of VFX across screens large and small
The past ten years have seen visual effects become the norm in film. Tellingly, even beyond the visual effects category, almost every film nominated at this year’s Academy Awards was touched by visual effects technology in some way. A case in point is La La Land, the lauded opening five minutes of which were crafted using the careful integration of ‘invisible’ visual effects.
The proliferation of VFX doesn’t stop at film, with audiences becoming used to seeing spectacular, cinematic-style visuals on the small screen, such as the epic battle scenes from Game of Thrones. So where to next?
There is every chance that visual effects will change films beyond anything we recognise today. Imagine a world in which film has evolved far beyond linear storytelling confined to the cinema screen, to become a fully immersive - even interactive - experience.
The next generation of films may harness the power of VR and AR to become something that plunges viewers into the film, giving them 360-degree views of the action and allowing them to stand next to the characters as the plot plays out. Such a future isn’t inconceivable - and perhaps not so far off.
How efficiency is now make-or-break
Financially, visual effects production can be a risky business. This was never better illustrated than in 2013, when the Academy Award for visual effects went to Life of Pi - and, at the same time, the film’s main VFX creator, Rhythm & Hues Studios, filed for bankruptcy.
The VFX industry is globalised, and now an entirely digital process. Dependent on distributed workflows and a highly mobile workforce, it’s also subject to the ebb and flow of government initiatives, from which it alternately suffers and benefits. Because of these pressures, there’s an increasing need for technology that helps drive efficiency - through the increased use of analytics, through improved workflows and through the dynamic use of cloud computing.
In the future, for visual effects to continue transforming our experience of film, visual effects companies will have to find new ways of streamlining the complex VFX process. Those that do will stand the best chance of thrilling us with extraordinary visual experiences over the coming decades - without going bust.
Interested to see how far VFX has come over the past decade? Check out our interactive timeline of the most spectacular nominees and winners in the Best Visual Effects category of the Academy Awards.