Leveling up by light: a look at lighting in video games
Whatever your gaming platform of choice, there’s one thing that unites players of every level: a love of being transported to an entirely new world.
It might be through your favorite first person shooter. Or that particularly massive role-playing game. Or perhaps even through eSports—participant or spectator, it makes no difference. When it comes to video games, there’s something for everyone—whether that be friendly competition, puzzling over platformers, or simply a sense of escapism.
And as games get prettier thanks to advancements in everything from GPU capabilities to real-time rendering techniques, so too does their immersive quality increase. Players can enjoy locations, environments and landscapes with rarely a pixel out of place.
It’s no dark-kept secret: much like real life, a huge component of a game’s aesthetics rests in its lighting. Lens choice, use of shadows, and the tailoring of form and atmosphere can, if done just right, combine to create something truly breathtaking, underpinned by fundamental cinematography principles such as light color, intensity, quality, and placement.
We’ve already explored how these same lighting principles have evolved over time through earlier forms of entertainment, such as cinema before computer graphics and, more recently, animation. In this article, we instead explore the use of lighting and cinematography in crafting some of history’s most beloved video games, and why the future’s bright for this enduring medium of entertainment.
Setting the scene
Much like movies and theatre, the whole feel and tone of a game can be dictated by lighting. It goes a long way toward immersing a player in the brand new world they’re experiencing, whether it’s the desolate, sepia-toned land of Dark Souls III’s Lothric or the bright and vibrant hues of Hyrule.
The famous bonfires in the former act as a safe haven for players seeking shelter from the oppressive, nightmarish atmosphere that the Dark Souls series has come to be known for. They are, quite literally, a beacon of light in the darkness, shining with an unmistakable orange glow that transforms the color tones of the world whilst filling players with a sense of relief when discovered. For all intents and purposes, players are governed by bonfires in this game—hopping from one to another in order to rest, heal, and save their game. It’s certainly an intense ride.
Meanwhile, artists working on Doom 3 famously used stencil shadow volumes as a real-time shadowing technique in a pioneering feat of artistry, giving its players a truly immersive and unique visual experience. And then there’s Lightmatter: a game so imbued with the fundamentals of lighting that no other name would suit this title. Fall into the shadows whilst playing, and you get consumed by them. Now that’s immersion.
Yet getting a game to these lofty levels of escapism through use of lighting and savvy cinematography requires no small amount of technique, skill and patience. There still needs to be a sense of realism when lighting a game, no matter how fantastical or stylised the aesthetic—for example, three lightbulbs shouldn’t uniformly light a room. Bloom shouldn’t be used as a coverup for sub-par graphics. Instead, the use of light should enhance the game’s art and be chosen with reason, in line with real-world cinematography principles.
This same sentiment extends beyond in-game visuals to cutscene cinematics and game trailers. Previously, we had the pleasure of catching up with REALTIME to discuss the making of the cinematic trailer for PC browser game Game of Thrones: Winter is Coming, and the painstaking lengths the team went to in order to make each character and location as photorealistic as possible.
Chris Scubli, Senior Artist for REALTIME, offers his thoughts on this challenge in Game of Thrones: Winter is Coming - Behind the Scenes: “It’s about familiarity—that’s the whole thing with Game of Thrones. You’ve got to feel like these characters are the kind of characters I’ve seen on the show.”
Yet without access to actors or scan data, this proved tricky for the team—especially as they couldn’t capture the actors in the same way that a cinematographer could on set.
So the team went to work, poring over renders to get that perfect final image.
“You can’t just get a character to look amazing in a raw render,” Chris comments. “You build up the render in a way that you can then work with in post. On Daenerys, at some point, we had about 10 lights, so each of those lights is on its own element, so in post you can control each of the lights.”
Emulating real-world cinematography principles in this way meant that the team were better placed to achieve a photorealistic final result. As Chris comments: “It’s not just about getting the right shot—it’s about getting the authentic shot.”
Putting light to work
But beyond a game’s aesthetics, there are other, more practical considerations to take into account. Whilst the fundamental principles of lighting remain largely consistent across all mediums of entertainment, the nature of games means that their use and application differ considerably from forms of media that involve a passive audience, such as movies.
Games are interactive. Player choice and action can instantly influence and alter the lighting in fundamental ways—and vice versa.
Often, lighting is utilised as a way of indicating a quest milestone, marker, or waypoint through savvy use of light placement that prompts the player to head in that direction—even if they’re not consciously aware of exactly what turned their feet that way in the first place. The opening moments of Horizon Zero Dawn spring to mind here, when player-controlled Aloy, submerged underground, picks her way through a dark cave by following subtle streams of light from above that illuminate the path to safety.
Meanwhile, stealth games typically use light placement to underpin any sneaking that the player must do to become the ultimate assassin. Light and dark are contrasted deftly to indicate to the player exactly where they must hide: avoid lit areas at all costs, and stay in the dark to remain unseen by your unfortunate targets.
Yet players aren’t completely powerless in these same instances. Often, games give players free rein to manipulate the world state as they see fit. You may decide to shoot a water arrow at a flame (Thief, anyone?) and plunge your virtual world into darkness. Or shots fired from your gun could illuminate a previously shadowed space. Both of these are completely on-the-fly, autonomous choices—as is choosing where to stand, or what camera angle works best for your play style, both of which again can affect the light rays, refractions and shadows of the game world in very nuanced ways.
Because of this, video games—and the engines behind them—increasingly rely on dynamic lighting to cater to player choice and decision. Dynamic lighting underpins instantaneous lighting calculations that account for real-time variations in the game state. These may be narrative options, camera movements, player and character positions and decisions, and so on. Because of the level of interactivity this affords, dynamic lighting lends itself well to situations in which lighting is employed as a gameplay mechanic.
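To make that concrete, here is a minimal sketch, written in plain Python rather than any particular engine’s shading language, of what a single dynamic point light might contribute to a point on a surface each frame. The function name and values are illustrative assumptions, but the underlying idea (Lambertian shading with inverse-square falloff, recomputed as the game state changes) is the same one engines evaluate per pixel on the GPU.

```python
import math

def point_light_contribution(light_pos, light_color, intensity,
                             surface_pos, surface_normal):
    """Contribution of one dynamic point light to a single surface point."""
    # Vector from the surface point to the light, and the distance between them
    to_light = [l - s for l, s in zip(light_pos, surface_pos)]
    distance = math.sqrt(sum(c * c for c in to_light)) or 1e-6
    direction = [c / distance for c in to_light]

    # Lambert's cosine law: a surface facing the light is fully lit, edge-on is dark
    lambert = max(0.0, sum(d * n for d, n in zip(direction, surface_normal)))

    # Inverse-square falloff so the light fades naturally with distance
    attenuation = intensity / (distance * distance)

    return [channel * lambert * attenuation for channel in light_color]

# Recomputed every frame, so a torch the player drops or a door they fling open
# changes the lighting immediately.
print(point_light_contribution((0.0, 3.0, 0.0), (1.0, 0.6, 0.3), 10.0,
                               (0.0, 0.0, 0.0), (0.0, 1.0, 0.0)))
```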
Take illuminating or framing, for instance—a technique often utilised in horror games to limit the player’s field of vision for an added sense of dread. 2013’s Outlast relied on two modes of player sight—a normal field of vision, and a sharply-contrasted night vision mode seen through the player character’s video camera. Only the latter allowed the player to see in dark spaces—and it’s through this forced perspective, marked by sharp, unpleasant green hues, that most of the game’s horrors are experienced.
Other games, like Amnesia: The Dark Descent, use light as a way of telling the player the current state of affairs. In this title, light and dark are used to influence the player’s Sanity meter, and, in turn, the overall in-game aesthetic. Stay in the dark too long and all semblance of sanity breaks down, leading to auditory and visual hallucinations marked by a more intense color palette and wacky lighting cues that herald the player’s blurring vision.
Light has also been used in a novel way as a fighting mechanic. 2010 saw the release of Alan Wake, in which a ‘darkness’ possesses the game’s enemies and antagonists. To have any hope of defeating his shady assailants, Alan—and thus the player—must first shine his flashlight on each before any damage can be dealt. These ‘fights with lights’ make up the game’s combat mechanics, with the added caveat that batteries must be sourced in order to keep your weapon working. In a game rife with hostility and a sense of impending doom, this only adds to the anxiety and panic that’s felt by Alan and, by extension, the player.
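As a rough illustration of how a mechanic like this might hang together, the hypothetical sketch below (not Remedy’s actual implementation) gives an enemy a ‘darkness’ value that flashlight exposure must burn away before gunfire can do any damage:

```python
class PossessedEnemy:
    """Toy enemy in the spirit of Alan Wake's 'fights with lights':
    bullets are wasted until the protective darkness has been burned away."""

    def __init__(self, darkness=100.0, health=50.0):
        self.darkness = darkness
        self.health = health

    def expose_to_light(self, lumens, seconds):
        # Holding the flashlight beam on the enemy erodes its darkness shield
        self.darkness = max(0.0, self.darkness - lumens * seconds)

    def shoot(self, damage):
        # Gunfire only registers once the darkness is gone
        if self.darkness <= 0.0:
            self.health = max(0.0, self.health - damage)

taken = PossessedEnemy()
taken.shoot(damage=25.0)                 # no effect: still shrouded in darkness
taken.expose_to_light(lumens=15.0, seconds=7.0)
taken.shoot(damage=25.0)                 # now it connects
print(taken.darkness, taken.health)      # 0.0 25.0
```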
In contrast to this fantastical, other-worldly use of light, dividing a video game into a day/night cycle makes it align much more to our real-world experience of light. Yet it’s often used in a much more nuanced way than just telling the player what the game’s state is. Darkwood, for example, encourages exploration and foraging during the day—as long as you get back to your hideout before nightfall, lest whatever’s lurking in the woods makes you its next meal. The use of night and day here is used extremely effectively in making the player feel genuinely afraid of the encroaching darkness.
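A basic day/night cycle can be driven by little more than an in-game clock mapped to sun elevation. The snippet below is a toy sketch with made-up numbers rather than Darkwood’s own system, but it shows how a single value can set both the ambient light level and a ‘get back to the hideout’ warning:

```python
import math

def sun_state(hour):
    """Map a 24-hour in-game clock to ambient light and a nighttime flag."""
    # Sun peaks at noon and dips below the horizon around 06:00 and 18:00
    elevation = math.sin((hour - 6.0) / 24.0 * 2.0 * math.pi)
    ambient = max(0.05, elevation)   # never pitch black on screen, just dangerous
    return ambient, elevation <= 0.0

for hour in (8, 12, 17, 23):
    ambient, is_night = sun_state(hour)
    warning = "get back to the hideout!" if is_night else "safe to forage"
    print(f"{hour:02d}:00  ambient={ambient:.2f}  {warning}")
```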
Back to the future
Where before lighting may have been overlooked or created with little attention to detail, rapid and ongoing advancements in hardware mean that this no longer flies for players who have come to expect more from their games.
It’s a common theme across every audience: we want to be as immersed as possible in what we’re experiencing. When a player fires up their console, or loads up their latest save, they want to be transported somewhere special. As we’ve explored, outstanding lighting goes a long way in making this possible—and advancements in game engines go even further in enhancing the way light is simulated inside a game.
Yet who were the early pioneers that set the stage for the game engine capabilities that we’re experiencing today?
One of the first 3D game engines to use light as an AI-aware gameplay element was Thief’s ‘Dark Engine’. Released in 1998, Thief was the first PC stealth game to use light and sound as game mechanics, made all too clear in the game’s opening by an ominous, unseen narrator: ‘You must learn how to move unseen. Stay in the shadows. Avoid the light.’ To allow for this new style of emergent gameplay, Looking Glass Studios developed the Dark Engine, which enabled the combination of complex artificial intelligence with simulation systems.
Yet Thief predated the advent of consumer GPUs, so, incredibly, the Dark Engine relied entirely on software. Later, in 2002, Splinter Cell used a heavily modified Unreal 2 engine to pull off GPU-based rendering where AI is aware of the lighting—one of the first of its kind to do so.
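Conceptually, light-aware AI of this kind boils down to measuring how brightly lit the player is and comparing that against a detection threshold, much like the on-screen ‘light gem’ in Thief. The snippet below is a heavily simplified, hypothetical sketch of that idea; real systems also weigh line of sight, distance, movement and sound.

```python
def illumination_at(point, lights):
    """Sum how strongly a point is lit by nearby sources (inverse-square falloff)."""
    total = 0.0
    for light_pos, intensity in lights:
        dist_sq = sum((a - b) ** 2 for a, b in zip(point, light_pos)) or 1e-6
        total += intensity / dist_sq
    return total

def guard_can_spot(player_pos, lights, threshold=0.5):
    """Light-gem logic: the brighter the player stands, the easier they are to see."""
    return illumination_at(player_pos, lights) >= threshold

torches = [((0.0, 2.0, 0.0), 4.0)]                     # one torch on a wall
print(guard_can_spot((0.5, 0.0, 0.5), torches))        # True: lit up beneath it
print(guard_can_spot((6.0, 0.0, 6.0), torches))        # False: tucked away in shadow
```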
Since then, game engines have evolved rapidly, with Epic Games enjoying two commercial incarnations of its engine since Splinter Cell in Unreal Engine 3 and 4. Just recently, Epic demoed Unreal Engine 5 on the upcoming PS5 console, the latest in a string of increasingly powerful game engines that allow games—and the light within them—to do incredible things.
Unreal Engine 5 features a micro-polygon renderer dubbed ‘Nanite’, which can scale the geometric detail of high-quality models up and down in real time, working alongside Lumen, a new technology offering real-time dynamic global illumination. The result: accurate scene lighting with phenomenal realism.
Bright horizons
Advancements in tech such as this shed some light on what’s in store for the future of video games—and it seems safe to say that exciting times lie ahead.
Ever-increasing game engine and GPU capabilities completely transform the way lighting works in games, expanding creative opportunities for artists and designers when building the beautiful virtual worlds that players have come to know, love and expect. Now more than ever, video games can utilise the same cinematography principles and techniques seen in movies, leading to a happy convergence of the two forms of media that will only continue as time goes on.
Developments in raytracing, which models light realistically by casting rays into the scene and calculating physically correct illumination, give a glimpse into what the future of video games could look like. Watch below as Minecraft’s famously pixelated graphics style gets transformed into something wholly different via raytracing.
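At its core, raytracing is as literal as it sounds: fire a ray from the camera, find what it hits, and shade that point according to the light it can see. The toy sketch below traces a single ray against a single sphere with simple Lambertian shading; it is nowhere near a production path tracer, but it captures the principle.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the sphere's surface, or None if the ray misses."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))   # direction is unit-length
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# One ray fired from the camera straight down the z-axis at a unit sphere
origin, direction = (0.0, 0.0, -5.0), (0.0, 0.0, 1.0)
t = ray_sphere_hit(origin, direction, center=(0.0, 0.0, 0.0), radius=1.0)
if t is not None:
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = hit                                  # on a unit sphere at the origin
    to_light = (0.577, 0.577, -0.577)             # normalised direction to a light
    shade = max(0.0, sum(n * l for n, l in zip(normal, to_light)))
    print(f"hit at t={t:.2f}, brightness={shade:.2f}")
```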
It seems safe to say that developments like the above will only continue to evolve with time, opening up a whole new avenue for exploration in game-making—one marked by unbridled potential for developers and studios who want to experiment with film-grade cinematics in their games.
The question then arises as to what level of cinematography future games will embrace. How much of what we see in the cinema will transpose to gameplay—and how well will this translate for an audience unused to seeing this from their games?
And then there’s virtual reality (VR). Will we see these immersive experiences align ever more closely with our favourite films—right before our very eyes?
Players, prepare yourselves—the next generation of video games promises to bring the meaning of immersion to a whole new level.