Community Transmission — Visual Effects in Star Wars Battlefront II

by Straatford87

EA DICE Team

INCOMING TRANSMISSION

Explosions, blasters, smoke, steam, fire, rain, lightning. Odds are that you’ve encountered these, and many more, of the Visual Effects (VFX for short) during your time playing Star Wars™ Battlefront™ II. Today we take a look at how VFX fit into the game, as well as the process behind some of them.

Despite their ubiquity, it can be quite difficult to pinpoint exactly what VFX are in a game. Senior VFX artist Anders Egleus reflects on this.

“Most people associate VFX with sprites, i.e. flat planes with transparent textures. But modern game engines like Frostbite have become so flexible and powerful that we can now use our VFX tools to create a lot of things that traditionally would have to be created by other parts of the game team, e.g. animation, lighting, technical art, even code. From birds and leaves on Naboo, to full scenes of background fighting with vehicles and hundreds of soldiers on Geonosis, a lot of people don’t even think of these elements as effects. Point being, our job is anything but repetitive, and because we dabble in many different areas of the game, we get to collaborate with a lot of talented artists, designers and coders too, which is super rewarding.”

One way of getting to grips with what VFX are in a game is to categorize them based on the role they play.

The first category is gameplay effects. These communicate gameplay mechanics and give you feedback when you do something, or when something happens. Examples are weapon impacts, muzzle flashes, blaster bolts, destruction elements, lightsabers, character and vehicle abilities, and UI-like effects. These all have to work well in many different situations and lighting conditions. Since visual fidelity is such an important aspect of Star Wars, and by extension Battlefront II, they also need to stay true to the source material, look great, and feel “Star Wars-y.” Creating these types of effects often involves many iterations until all of the above requirements are satisfied.

The second category is environment effects, which embellish most levels (maps): fog, fire, smoke, rain, snow, swirling leaves, and more all help the world come alive, but also give visual cues to help you understand your surroundings.

The third category is cinematics. These, along with many Battle Beyond scenes, often use the same types of effects as the other two categories, but the storytelling aspect is the most important part, and the setup is closer to traditional keyframe animation in that effects are driven by a timed sequence rather than triggered by gameplay events.

Capital Supremacy contains many cinematic and narrative elements

Needless to say, many effects fall into more than one, or even all three, categories, like the dust storm on Geonosis or the fires in the scene below. There’s a gameplay aspect in that they provide cover and reduce visibility, posing risk/reward decisions (e.g. do I take a shortcut and take some damage, or do I go around?); they certainly enhance the environment art; and they tell a story about a battle that had been raging before you got there.

What all these types of effects also have in common is, of course, that they help immerse the player in the world and support the artistic vision of the game.

Before starting work on an effect, it’s important to have good reference material for the desired end result. This helps identify details that would otherwise be missed and facilitates communication with the art director, designers, and other stakeholders. Anders Egleus elaborated.

“The Star Wars franchise is extremely gratifying because you almost always have access to good reference material, and because the visual style is so distinct. There’s of course a flipside to that: You’re expected to nail the look of the movies with every effect. Other IPs might not come with the same baggage of what things should look like, but I prefer that any day to an unclear vision.

“Most of the time it’s easy to find the reference we need, like in the case of the Droideka’s shield, which appears in full glory at the start of Episode I. Other times we get help from our in-house Star Wars gurus like CJ and Guillaume to dig up just the right scene from this or that episode of The Clone Wars.”

The collaboration with Lucasfilm is also crucial to the result.

“They often suggest things we would never have thought of. They provide us with the needed reference material. They’ve even sent us raw footage from practical on-set explosions and other special effects, and we constantly rely on their expertise for feedback and advice.”

Another source of reference is the work of our talented concept art team.

“The concept art establishes a clear vision and allows us to start thinking about potential challenges early on. We can look at a concept image and go ‘yeah, no, that’s never gonna fit into our performance budget’, but then you go back and start thinking about the challenge and eventually find a way to cram it all in there,” Anders stated.

So how are VFX made in a game like Star Wars Battlefront II? Unlike other artists, who use programs like Photoshop and Maya to create their assets, the main tool of the VFX team is the game engine itself. The most common part of an effect is the particle system, or emitter. A particle is just a point in space with properties like size, rotation, color, and transparency. Unlike traditional keyframe animation, particles are usually simulated, which (grossly oversimplified) could be described as the game engine applying basic physics to them as they evolve over time. The artist can then vary these physical properties (gravity, air resistance, and so on) to get the desired result.
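
To make that concrete, here is a minimal sketch of an emitter in Python. It illustrates the general technique, not Frostbite’s actual tools; the class names, the spawn logic, and the drag model are all assumptions for the example.

```python
import random

class Particle:
    """A point in space with the physical properties the engine simulates."""
    def __init__(self, pos, vel, lifetime):
        self.pos = list(pos)      # position in world space
        self.vel = list(vel)      # velocity per axis
        self.age = 0.0
        self.lifetime = lifetime  # seconds until the particle is culled

class Emitter:
    """Spawns particles and advances them with basic physics each frame."""
    def __init__(self, gravity=-9.8, drag=0.5, spawn_rate=100):
        self.gravity = gravity        # vertical acceleration
        self.drag = drag              # crude air-resistance factor
        self.spawn_rate = spawn_rate  # particles per second
        self.particles = []

    def update(self, dt):
        # Spawn new particles with randomized initial velocities.
        for _ in range(int(self.spawn_rate * dt)):
            vel = [random.uniform(-1, 1), random.uniform(2, 5), random.uniform(-1, 1)]
            self.particles.append(Particle([0.0, 0.0, 0.0], vel, lifetime=2.0))
        # Simulate: apply gravity and drag, then integrate positions.
        for p in self.particles:
            p.vel[1] += self.gravity * dt
            p.vel = [v * (1.0 - self.drag * dt) for v in p.vel]
            p.pos = [x + v * dt for x, v in zip(p.pos, p.vel)]
            p.age += dt
        # Cull particles that have outlived their lifetime.
        self.particles = [p for p in self.particles if p.age < p.lifetime]
```

An artist-facing tool essentially exposes knobs like gravity, drag, and spawn rate, plus curves for size, color, and transparency over each particle’s age, and the artist tunes them until the effect reads right.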

The final effect is typically made up of multiple emitters, lights, and other components. Often, many different effects are needed to make up a gameplay feature, environment, or cinematic event.

When it comes to making effects fit into the rest of the game world, there’s probably nothing more important than their interaction with lighting. For this to work, and be fast enough to calculate for all target platforms, the engine combines cheap per-vertex lighting with approximated texture-based volumetric lighting. A simplified forward scattering model is also used to get nicely backlit particles.
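
As an illustration of that last part, here is a sketch of forward scattering using the Henyey-Greenstein phase function, a common choice for this kind of approximation. The post doesn’t say which model Frostbite uses, so treat the function and its g parameter as assumptions.

```python
import math

def henyey_greenstein(cos_theta, g=0.6):
    """Approximate how much light scatters toward the viewer.

    cos_theta: cosine of the angle between the light direction and the
    view direction; g > 0 biases scattering forward, which is what makes
    particles light up when the camera looks toward the light through them.
    """
    denom = 1.0 + g * g - 2.0 * g * cos_theta
    return (1.0 - g * g) / (4.0 * math.pi * denom ** 1.5)

# A particle between the camera and a light source picks up a bright rim...
print(henyey_greenstein(0.99))   # strongly backlit
# ...and stays dim when the light is behind the camera instead.
print(henyey_greenstein(-0.5))
```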

To give the right look to fire and other self-illuminating materials, a so-called blackbody calculation converts grayscale images into believable fire colors and intensities.
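
Here is a sketch of what such a conversion can look like, using Tanner Helland’s widely circulated Kelvin-to-RGB curve fit. The temperature range and the linear mapping from gray value to temperature are assumptions for this sketch; the game’s actual shader would also output HDR intensities so the fire blooms correctly.

```python
import math

def blackbody_rgb(gray, t_min=1000.0, t_max=6500.0):
    """Map a grayscale value (0-1) to an approximate blackbody RGB color.

    The gray value is treated as a temperature between t_min and t_max
    Kelvin (an assumed mapping), then converted with a standard curve fit.
    """
    t = (t_min + gray * (t_max - t_min)) / 100.0  # temperature / 100 K
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda v: max(0.0, min(255.0, v))
    return clamp(r), clamp(g), clamp(b)

# Dark pixels become deep reds and oranges, bright pixels near-white:
print(blackbody_rgb(0.1))  # ~(255, 112, 0): orange flame
print(blackbody_rgb(0.9))  # ~(255, 245, 236): white-hot
```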

In addition to the “standard” particles covered so far, Star Wars Battlefront II was the first Frostbite game to ship with a new GPU particle technology.

“We were thrilled when the Frostbite engineers approached us and asked us to collaborate in developing this new system,” Anders recalled. “‘Happy to be your guinea pigs!’ we said. A lot of the effects in the game simply couldn’t have been made without it. GPU particles are much cheaper than their old CPU counterparts, so you can have lots and lots more of them. They’re also programmable, which allows for much more complexity and control. The downside is that they’re a lot harder to make, because we have to build everything from scratch, so we only use them where the old system can’t deliver what we need (e.g. background crowds).”

Some other examples of effects created with the new GPU particle system are rain, snow, sparks, embers, pebbles, metal debris, leaves, insects (and Whisties!), lightning/electricity, birds, blaster bolts, and, of course, lightsaber blades.

Lightsabers, and specifically lightsaber blades, are some of the most iconic effects in the Star Wars universe. In Star Wars Battlefront, the shape and colors of the blade are procedural, which means that instead of using a texture to define the blade, different math functions are combined in a so-called “pixel shader” which gets called (i.e. activated) when the blade is drawn to the screen (i.e. ignited). This allows for visual tricks like making the blades look like they have volume even though they’re drawn on flat planes.
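
Here is a pixel-shader-style sketch of that idea, written in Python for readability. The distance parameterization, falloff curves, and constants are illustrative guesses rather than the game’s actual shader; the point is that the blade is pure math, with a square root faking a cylindrical volume on a flat plane.

```python
import math

def blade_pixel(d, blade_color=(0.2, 0.4, 1.0), core_width=0.25):
    """Shade one pixel of the blade quad.

    d is the distance from the blade's core line, normalized so the quad's
    edge is at 1.0. All constants are illustrative, not the game's values.
    """
    d = abs(d)
    if d > 1.0:
        return (0.0, 0.0, 0.0)  # outside the blade entirely
    # Fake volume on a flat plane: shade as if the quad were the cross
    # section of a cylinder, so brightness follows the chord length.
    thickness = math.sqrt(1.0 - d * d)
    # Hot white core that falls off quickly into the colored glow.
    core = max(0.0, 1.0 - d / core_width)
    return tuple(min(1.0, c * thickness + core) for c in blade_color)

# Sampling across the quad: a white-hot core fading into a blue glow.
for d in (0.0, 0.2, 0.6, 0.95):
    print(d, blade_pixel(d))
```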

Creating the blades for Star Wars Battlefront II wasn’t without its problems, however, according to Anders.

“We used basically the same technique in both games, but it turned out to be a lot harder to get it right in the sequel.”

One reason for this was simply time constraints. There were more lightsaber wielders in the second game (the first one had only Luke and Vader), and they were more diverse.

“We weren’t 100% happy with the look, but it kind of worked, so we figured let’s move on and build all the other effects needed for the game.”

Another reason was that the differences between lighting setups were much bigger in the sequel, making it harder to keep the blades consistent across all lighting conditions. Finally, stretch bloom (horizontal glow), which was introduced to mimic the look of The Force Awakens, became very exaggerated around very bright objects. As a result, the intensities of the blades had to be toned down a lot, since glow is applied equally to everything on screen based on pixel intensity.

“Ironically, once we decided to remove stretch bloom in a post launch patch, it paved the way for a look that was more true to the movies,” Anders concluded.

As hard as it was getting the shape, colors, and intensities of the blades right, the characteristic motion fan posed its own challenges.

“Traditionally you would rely on the built-in motion blur of the game engine, which we tried during the early production of the first game.”

This proved problematic, however, as the blade would have to be a solid object (like Phasma’s staff) instead of the pulsing cylindrical volume we all know and love. Moreover, the framerate in a game is much higher than in film, and can be even higher on powerful PCs. Therefore, the amount of motion blur would be much less than expected and wouldn’t look right. The answer was to connect planes which, much like an accordion, stretch from the current position to a fixed time offset (say, 1/60 of a second ago).
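
Here is a geometric sketch of that accordion, tracking only the blade’s hilt and tip positions. A real implementation would interpolate the full transform (rotating rather than lerping), but the idea is the same: fill the swept area with a strip of planes between the pose a fixed time ago and the current pose.

```python
def lerp(a, b, t):
    """Linearly interpolate between two points, component by component."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def motion_fan(hilt_prev, tip_prev, hilt_now, tip_now, segments=8):
    """Vertex pairs for a triangle strip sweeping from the blade's pose a
    fixed time ago (say, 1/60 s) to its current pose, like an accordion."""
    strip = []
    for i in range(segments + 1):
        t = i / segments  # 0 = previous pose, 1 = current pose
        strip.append((lerp(hilt_prev, hilt_now, t),
                      lerp(tip_prev, tip_now, t)))
    return strip

# A fast swing produces a wide fan of interpolated blade poses:
for hilt, tip in motion_fan(hilt_prev=(0.0, 0.0, 0.0), tip_prev=(-0.4, 1.0, 0.2),
                            hilt_now=(0.4, 0.0, 0.0), tip_now=(0.4, 1.2, 0.0)):
    print(hilt, tip)
```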

“An additional problem was that, before the June update, the back rotation was stored in world space,” Anders continued. “So sometimes you’d get a lot more motion blur than you’d expect, e.g. when rotating with your character. The blade didn’t move at all in camera space, but rotated quite a lot in world space. Now we store it in camera space, which is closer to how real camera motion blur works.”
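
In code terms, the fix amounts to measuring the blade’s motion relative to the camera rather than the world. The sketch below assumes 4x4 transform matrices and uses NumPy for brevity; the function and its names are illustrative, not the engine’s.

```python
import numpy as np

def blade_delta_camera_space(blade_to_world_prev, world_to_camera_prev,
                             blade_to_world_now, world_to_camera_now):
    """How much the blade moved relative to the camera between two frames.

    If the camera turns together with the character, the two camera-space
    poses are nearly identical, so the fan all but disappears, matching
    how real camera motion blur behaves.
    """
    pose_prev = world_to_camera_prev @ blade_to_world_prev
    pose_now = world_to_camera_now @ blade_to_world_now
    return pose_now @ np.linalg.inv(pose_prev)

# Identical relative poses yield the identity matrix: no fan is drawn.
eye = np.eye(4)
print(blade_delta_camera_space(eye, eye, eye, eye))
```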

Anders finished, “It’s a bit painful that it’s taken so long—we tend to prioritize new content over polishing existing assets—but it’s nice to finally be at a point (no pun intended) where we’re happy with the lightsaber blades.”

We will of course continue to improve lightsabers and many other effects in future updates.

Additional VFX artists who contributed to the images/videos, and who have helped make the VFX in Star Wars Battlefront II what it is today: Jonas Andersson, Tobias Ahlgren, Nadab Goksu, Daniel Kopp, Gustav Hagerling, Steven Huang, Keith Walters.

We hope you enjoyed this behind-the-scenes look at the VFX in Star Wars Battlefront II. If you'd like us to explore further topics in this way, be sure to let us know in the comments below.
