Computed street lights in a supernatural Seattle

Accurately rendering light in a video game is a computational and physical challenge.

In the dark recesses of video games, hiding beneath the plot and behind the art, interacting with the very hardware of your system, lurk the fundamental subroutines that determine how graphics are rendered within the games. Nathan Reed, previously of Sucker Punch Productions and currently at chip-maker Nvidia, coaxes that code to run as quickly and smoothly as possible. His goal is to create beautiful, dynamic lighting effects in games like Infamous Second Son, where the action takes place in a near-future version of Seattle that is just as foggy as the real city, but inhabited by superpowered heroes and villains.

Reed is a developer technology engineer. “I develop the low-level rendering code that video games use to talk to graphical processors to implement various visual effects,” he says. In a series of emails he taught me about light transport, “our general term for light being emitted, bouncing around in a virtual scene and eventually forming an image in your virtual camera.”

This screenshot illustrates a combination of indirect and direct lighting sources in Infamous Second Son. Released in March 2014, the PlayStation 4 game is a former project of Nathan Reed. CREDIT: Image courtesy of Sony Computer Entertainment; high-resolution screenshot provided by Gematsu

Video games are by nature dynamic environments, in which a player’s interactions can alter the setting and shift camera angles. Dynamic effects need to render quickly to maintain smooth, steady motion on a TV, PC monitor, or other display. “The goal is for a game to run steadily at either 30 fps or 60 fps (to match TV refresh rates) to maintain smooth motion, so that gives you a budget of either 33.3 or 16.7 milliseconds to render a frame, and everything is constrained to fit within that budget,” Reed writes.
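
The arithmetic behind those budgets is simply the reciprocal of the target frame rate. A minimal check, in Python:

```python
# Frame budget at a fixed target frame rate: every effect in a frame
# must fit within 1/fps seconds of rendering time.
for fps in (30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms per frame")

# Output:
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
```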

The graphics processing units (GPUs) that run the rendering code are complex pieces of hardware whose billions of transistors execute that code. Predicting exactly how a processor will respond to a change in the code is impractical. Code optimization therefore entails blending a deep understanding of the hardware with knowledge of which algorithms are most amenable to the GPU, and, of course, a lot of experimentation.

Rendering speed is not the code’s only priority. The code must also create imagery that looks good from any angle, because a player’s moves mean he or she can see an object or effect from anywhere. It must also allow game designers to adapt lighting effects to different scenes and games without requiring a software developer to overhaul the tool. That flexibility is achieved either by exposing the code’s numerical parameters for direct tweaking or by including additional logic.

Originally, the tools were based on phenomenological models that rendered pretty results quickly without necessarily obeying physical laws. However, as computer hardware has advanced, better, faster GPUs have become more prevalent, and as graphics coders have become more familiar with physics, the models have become more physically accurate—or have at least leaned on physics more heavily to produce reliable, plausible results. Rigid-body mechanics is the best-developed example. Fluid dynamics is currently too computationally demanding for real-time modeling, although that will likely improve in the near future.

As for lighting, “Any given game is a conglomeration of different models being applied to different effects,” Reed tells me. “For instance, I could have a scene that includes some human characters, some trees and grass, and a car, and each of those would demand different shading models—for skin, hair, cloth, leaves, bark, metal, glass.”

Even the lighting is a collection of diverse effects. A scene could be lit by the Sun, but it could also feature a neon sign that reflects off a car and some atmospheric haze that obscures the distance. Each lighting element demands different techniques. Fortunately, GPUs allow programmers to apply bits of code on a pixel-by-pixel basis: Skin pixels get the skin shader; glass pixels get the glass shader. The effective mixing and matching of models yields the desired appearance.
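
In spirit, that per-pixel dispatch looks something like the toy sketch below. It assumes a drastically simplified scene in which every pixel is already tagged with a material ID; the shader names and dispatch table are illustrative only, since real GPUs run compiled shader programs massively in parallel.

```python
# Toy sketch of per-pixel shader selection. Each "shader" here is just
# a function; a real engine would run compiled GPU programs instead.

def skin_shader(pixel):
    return f"skin-shaded({pixel})"

def glass_shader(pixel):
    return f"glass-shaded({pixel})"

def metal_shader(pixel):
    return f"metal-shaded({pixel})"

# The material tag attached to each pixel picks the shading model.
SHADERS = {
    "skin": skin_shader,
    "glass": glass_shader,
    "metal": metal_shader,
}

def shade_frame(pixels):
    # pixels is a list of (material, data) pairs, one per screen pixel.
    return [SHADERS[material](data) for material, data in pixels]

print(shade_frame([("skin", "cheek"),
                   ("glass", "windshield"),
                   ("metal", "hood")]))
```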

Subtle, natural-looking lighting effects enhance the mood of The Vanishing of Ethan Carter, an adventure game set for release in September 2014. CREDIT: The Astronauts

The ability to combine multiple physics models is constrained by the hardware’s performance, memory, and other fixed characteristics. As a visual, interactive medium, video games need to look good, but they also need to respond quickly. To maintain that balance, engineers make tradeoffs: They render some visual effects dynamically and compute static effects in advance.

A classic example of a static effect is indirect ambient lighting. “When you play a game,” Reed explains, “you can walk around and see that lighting from any angle, but all it’s doing is projecting the known lighting values from the objects to wherever they appear on screen at the moment.” That precomputing spares rendering time for other effects that need to be dynamic, but it limits the interactivity of a game: Precomputed indirect lighting means neither the light source nor the lit objects can move without breaking the effect. Reed concedes, “In general we prefer everything to be dynamic as much as possible, but precomputing some data is a necessary optimization at times.”
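
The idea can be sketched in a few lines. The example below assumes a toy scene of named surface patches; the stand-in bake function and its hard-coded values are illustrative, not the output of a real global-illumination solver.

```python
# Precomputed ("baked") lighting: the expensive bounce calculation runs
# once, offline, and the runtime renderer only looks up stored values.

def simulate_bounced_light(name):
    # Stand-in for hours of offline global-illumination computation.
    return {"floor": 0.6, "under_arch": 0.2, "near_curtain": 0.5}[name]

def bake_lightmap(patches):
    # Offline bake: store one lighting value per static surface patch.
    return {name: simulate_bounced_light(name) for name in patches}

def shade(patch, lightmap):
    # Runtime: no lighting math at all, just a table lookup.
    return lightmap[patch]

lightmap = bake_lightmap(["floor", "under_arch", "near_curtain"])
print(shade("under_arch", lightmap))  # cheap at runtime, but static
```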

The screenshot below comes from a game produced with the Unreal Engine 3 game engine and illustrates the concept of indirect lighting. Although Reed did not work on this project, he explains that in the image the only direct light source is the sunlight in the top right; all parts of the scene not in direct light are lit by precomputed indirect lighting. Describing the image, he points out, “Note how it’s brighter in the areas open to the sky, and darker back underneath the arches where more of the light is blocked. Also, you can see how the areas around the red curtain are slightly red, due to the sunlight reflecting from it and bouncing into the surrounding areas.”

This screenshot uses the Unreal Engine 3 game engine—specifically, its LightMass tool—to calculate high-quality static lighting effects from an indirect light source. Such lighting can be precalculated, as the player cannot change where the Sun or buildings are positioned. CREDIT: Epic Games Inc

Reed specializes in the bidirectional reflectance distribution functions (BRDFs) that embody how light bounces off surfaces. A BRDF describes how much light a surface reflects as a function of the incoming and outgoing angles, so what a player sees can change as movements reveal or obscure light sources. BRDFs also depend on material properties: Metal, glass, and fabric all interact with light differently, and surface shape right down to microscopic smoothness or roughness affects absorption and scattering.
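
To make the concept concrete, here is a minimal evaluation of the simplest physically based reflectance model, the Lambertian (perfectly diffuse) BRDF, under a single directional light. The function names are illustrative; production shaders evaluate far richer models.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def lambertian_brdf(albedo):
    # A Lambertian surface scatters light equally in all outgoing
    # directions; its BRDF is the constant albedo / pi, which keeps the
    # surface from reflecting more energy than it receives.
    return tuple(c / math.pi for c in albedo)

def reflected_radiance(albedo, normal, light_dir, light_radiance):
    # Rendering equation for one directional light:
    # L_out = f_brdf * L_in * cos(theta), where theta is the angle
    # between the surface normal and the light direction.
    cos_theta = max(0.0, dot(normalize(normal), normalize(light_dir)))
    brdf = lambertian_brdf(albedo)
    return tuple(f * l * cos_theta for f, l in zip(brdf, light_radiance))

# Example: a reddish surface lit by white light from 45 degrees above.
print(reflected_radiance(
    albedo=(0.8, 0.3, 0.2),          # surface color (fraction reflected)
    normal=(0.0, 1.0, 0.0),          # surface faces straight up
    light_dir=(0.0, 1.0, 1.0),       # light arrives at 45 degrees
    light_radiance=(1.0, 1.0, 1.0),  # white light
))
```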

An interesting and challenging problem that Reed is working on is computing BRDFs for human skin. Despite feeling pleasantly solid, human skin is translucent, with subdermal scattering. For empirical proof, hold a flashlight up to your palm, or recall the eerie underlit faces from telling ghost stories as a child. Light passes through skin, scatters, and bounces back out to produce a soft glow. Reed explains, “Skin and other materials with strong subsurface scattering demand special rendering algorithms beyond what we would use for something simpler like metal, plastic, wood, [or] concrete.”
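
One common, inexpensive approximation used for real-time skin shading is “wrap lighting”: the diffuse falloff is allowed to extend past the point where the surface turns away from the light, mimicking light that scatters beneath the skin near the shadow boundary. The sketch below is a generic illustration of that idea, not Reed’s method; the wrap parameter is an artist-tuned knob, not a measured quantity.

```python
def wrapped_diffuse(cos_theta, wrap=0.5):
    # Plain Lambertian shading clamps to zero at the terminator, where
    # cos_theta = 0. Wrap lighting remaps the range so the falloff
    # extends into the geometric shadow, softening the transition the
    # way scattered light in skin does.
    return max(0.0, (cos_theta + wrap) / (1.0 + wrap))

# At the terminator, plain Lambertian shading is fully dark, but the
# wrapped version still shows a soft glow:
print(max(0.0, 0.0))          # Lambertian: 0.0
print(wrapped_diffuse(0.0))   # wrapped:    0.333...
```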

If you want to work as a graphics programmer, Reed recommends studying computer science, with a healthy dose of programming, software engineering, higher-level mathematics (particularly linear algebra, calculus, and functional analysis), and enough physics to pick up the rest on the job.

Information sharing about the visual-effects models in video games is quite strong—at the conceptual level, that is. Reed and his counterparts at other companies attend major annual conferences, such as GDC (the Game Developers Conference) and SIGGRAPH (Special Interest Group on GRAPHics), where they give talks on problem-solving strategies for different phenomena.

However, like plots, art, and game mechanics, source code is tightly guarded. A coder changing companies may carry over techniques, concepts, and problem-solving approaches, but all code must be freshly written for each employer. In Reed’s words, “It’s at the level where a good graphics programmer would have enough information to go build their own version of the technique, but they wouldn’t have working code out of the box.”

Paradoxically, as GPUs and other hardware become more powerful, the problem of accurately rendering light becomes more complicated as more of the physics of diffusion, reflection, and scattering becomes computationally tractable. Competing companies collaborate on the conceptual solving of universal problems, but each develops its own particular snippets of code. With each new generation of hardware, more research from optical physics will find its way from academia into industry.

Mika McKinnon is a disaster researcher, entertainment science consultant, and irrepressible educator. She writes about disasters at GeoMika.com, and science in fiction at SpaceMika.com. Her stories can also be found on io9.
