Back in March, I got to speak at GDC 2018. My talk was called “Real-Time VFX: A Visual Language Spectrum” and was a spiritual successor to my “What Color is Slow?” PAX Dev session I did in 2016. If you have access to the GDC Vault, you can go watch it now! But if you don’t, here’s the blog-post version of my session:
The session is focused on VFX in the production pipeline. It’s not about the technical aspect of being a visual effects artist, or detailing any techniques I use in the creation of special effects for games. Instead, it’s about how this discipline fits into the game development pipeline, and how to improve communication between the disciplines we tend to be sandwiched between.
[This next bit is the “About Me” section that is fairly mandatory for GDC Talks. Skip ahead if you like.]
My name is Sarah Grissom and I’ve been a visual effects artist for eight years. After graduating from DigiPen in 2010, I began my career as an effects artist at The Amazing Society, where I worked on a children’s MMO called Super Hero Squad Online. Then I worked at Gas Powered Games for a short time on Age of Empires Online as well as a few smaller or unreleased projects.
After that, I started working at FXVille, which is a real-time VFX contracting studio. In my five years there, I’ve helped out on about ten different projects including Shadow of Mordor and Shadow of War, Injustice 2, State of Decay 2, Spider-Man PS4, and a couple of unannounced titles.
FXVille was founded in 2008 by a group of veteran VFX artists who saw a need yet to be filled in the game industry. In the last ten years, FXVille has gotten the opportunity to work with over thirty game studios – ranging from top-tier AAA studios like Bungie, Irrational, Monolith, and Epic, to smaller indie studios like Camouflaj and The Deep End Games. We work on games of every platform, in any engine you can think of. PC, Console, Mobile, VR, you name it, we’ve dabbled in it.
It’s the most rewarding job I’ve ever had. Even if I don’t get to work on every single title that comes through, I get to see all of the hard work, cutting-edge technologies, and clever solutions that people across the industry are putting into their games. It’s been an enormous privilege.
FXVille has certainly given me a unique perspective. And, across the board, I’ve noticed this problem – on every game, no matter the tools, the genre, the pipeline, at every studio – there is a disconnect between Art and Design. What the Designers want VFX to accomplish tends to be more literal, more functional, give the player information. What the Artists want VFX to accomplish tends to be more flavorful, tell a story, make the player feel something. We, VFX Artists, tend to carry out this tug-of-war throughout the entire game development process. The goal of this talk is to give artists and designers a new tool to help us all decide what the goals of our visual effects are, what they need to be communicating, and how they should communicate it. After all, these aren’t two sides of a coin. It’s a spectrum.
On one side of the spectrum, we have Practical effects. Weather effects like rain or snow are highly practical, as are blowing dust, sandstorms, fog, and floating pollen. These are all examples of ambient effects that give your setting some movement, some life, and really ground your characters in their reality. Keith Guerrette called these “Narrative Effects” in a video he did for FullSail.
Another example is gore and guts. Could be human blood, zombie blood, alien, monster, could also be yarn or stuffing or other giblets. This includes hot breath, spit, sweat, and tears. These are practical in that they give your characters internal workings, what they’re made of.
We also have destruction. Explosions, impacts, collapses. Anything that blows up, splashes, bursts, or gets crushed. Similar to the gore and guts, it tells you what your environment pieces are made of. It gives your props and set pieces a sense of weight, volume, and responsiveness.
It could also be a barrier. We’re often tasked with telling the player they can’t go someplace, or that they SHOULD go someplace. Fire or other elementals like waterfalls, boiling lava, thick smoke plumes are great practical methods of gating and guiding the player. Similar to weather effects, they add some ambient motion and even excitement to the environment.
Symbolic effects can be used to emphasize user interface elements like this example from Kingdom Hearts 2. These sprites form a green triangle, reminding the player to press the green triangle button on their controller.
They can also highlight gameplay systems that inform the player what just happened or what they can do next. This example is from Hearthstone. Lots of effects in here are highly symbolic: the lines being drawn from one card to another, the color and treatment of the card highlights, the little “hit” burst with numbers telling you how much damage a card has received.
Symbolic effects aren’t usually representative of what the in-game character is looking at so much as they are a tool for the player or user. Assassin’s Creed has a mechanic called “Eagle Vision” where the player can identify targets or other people of interest with red or yellow outlines. This is a symbolic effect in that it’s not grounded in reality. We aren’t supposed to assume Ezio Auditore is actually seeing little red dudes and little yellow dudes. Rather, it’s a representation of his special assassin training.
The same can be said for this example from Splinter Cell. This effect is called Last Known Position. An outline of your character is left behind and tells you, the player, where the enemy last saw you. We aren’t supposed to believe there is actually a little hollow Sam Fisher chillin’ out in the hallway.
League of Legends is a great example of the in-between on this spectrum. Every effect in that game toes the line between a display of the game’s systems and its setting or lore. It’s imperative for these effects to be accurate in scale, duration, and intensity as called for by design. But it’s equally imperative that they fit within the fantasy of the game.
This is an example from Shadow of Mordor. It conveys a very significant, core gameplay mechanic, but it’s also very grounded in the “reality” of Middle-earth. In fact, the entire suite of effects is very cohesive, all serving the central character and his ghostey elfy superpowers.
As this is a spectrum, we can place some of these examples further toward practical (like Shadow of Mordor), and some further toward symbolic (like League of Legends). Sometimes Barriers (like these fiery walls in Tomb Raider) need to be purely practical – they need to look like they’re just part of the world and not call attention to themselves. Sometimes Barriers need to be broken through and interactable. You can shift your interactable barriers toward symbolic as you decide how you want to display “interactable barrier” to your player. An outline? A colorful material treatment? A big spinning target? You decide.
I’ve sometimes run into issues with barriers specifically that look too exciting or eye-catching and give the player the idea that they should interact with them. Make sure you know what you’re building your effect for beforehand.
I’ve also run into issues when a symbolic effect is TOO symbolic. Too “gamey.” Taking your shape language away from perfect primitives and into more organic shapes, and muting or tinting your primary colors so they aren’t so bright, can help shift those effects toward practical and fit the overall aesthetic of your game a bit more.
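To make that color-muting idea concrete, here’s a minimal sketch of how you might nudge a bright “gamey” color toward something more grounded by blending it with its own gray value. The function name and the blend amount are hypothetical illustrations, not anything from the talk or a specific engine:

```python
def mute_color(rgb, amount=0.4):
    """Shift a bright, symbolic-feeling color toward a more practical tone
    by blending it with its own gray (luminance) value.

    rgb:    tuple of floats in [0, 1]
    amount: 0.0 = unchanged, 1.0 = fully desaturated

    Hypothetical helper for illustration only.
    """
    r, g, b = rgb
    # Rec. 709 luma weights approximate perceived brightness
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(c + (luma - c) * amount for c in (r, g, b))

# Pure symbolic red, muted 40% toward its own gray
muted = mute_color((1.0, 0.0, 0.0), 0.4)
```

Most engines and shader graphs have a built-in desaturate node that does exactly this lerp; the point is just that shifting toward practical can be a single tunable parameter rather than a full re-author.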
The one thing they all have in common is communication. They communicate setting and lore, they communicate damage radiuses, they communicate who is an enemy and who isn’t.
As Visual Effects Artists, we already understand this dichotomy. More often than not, we’re shifting around this spectrum throughout the production cycle. One of my goals for this session is to arm all of you with a new way to discuss the needs of your game, to pre-empt the conversation before we get stuck in iteration hell. There are lots of ways to use this spectrum in your daily workflow, in any part of the game development cycle.
It’s possible to apply this spectrum to your entire game based on its genre. Heavy Rain is almost entirely practical VFX – I’m pretty sure the VFX don’t even have gameplay implications – it’s all story and mood and setting. Even the HUD is pretty minimal. On the flipside, games like Defense Grid 2 are almost purely symbolic. In a tower defense game or even an RTS game, you need to understand what’s going on at a quick glance. Everything is perfectly color coded, perfect round radiuses, perfectly sized beams and projectiles describing the action, the direction, the duration.
You could apply this spectrum to different game modes. Games like Battle Chasers (or Pokémon or Final Fantasy) tend to have an overworld mode and a battle mode. The effects you see on the map trend toward practical – ambient environment effects like falling leaves, torches you may or may not be able to interact with. The symbolic effects you’ll find on the map are usually representative of potential enemy encounters or other interactables. (By the way, Alex Redfish is the artist responsible for the incredible VFX in this game. Check out his other work!)
In the battle mode, it trends toward symbolic. A sword doesn’t actually have a hot orange swipe or flares, these are used to show impact and intensity of the action or mechanic.
Lots of games implement this spectrum on a case-by-case basis, within mechanics. Here’s an example from League of Legends. Every player uses the consistent language of the cyan-colored placement reticle for area of effect, whether it’s radial or a beam or a cone. Once the effect is cast, however, that’s when the more practical effects kick in. In this case, Star Guardian Miss Fortune’s got a bunch of orangey, purpley stars falling from the sky and raining down on her enemies. But this action needed a little extra help in describing the actual mechanics of this attack, specifically its radius. The blue ring on the ground marries the design needs of showing the precise radius this attack uses and the art needs of showing Star Guardian Miss Fortune’s star guardian theme and shape language.
Here’s a classic. This is “Haste” from a game called World of Final Fantasy. Lots of RPGs have this mechanic. It can be cast on you by a mage or by an item, it could also be a passive buff. All Final Fantasy games have this spell, and they’ve all interpreted it somewhat differently across the board.
In this case, the effect we’re seeing here is a little spinning diamond followed by a persistent glow with streaks and motes animating upward. It’s pretty straightforward. Nothing about this effect tells me about the world; it doesn’t give me any insight about the characters; it really only describes the mechanic itself. So, I’d put this one firmly on the symbolic side of the spectrum.
Life is Strange is an adventure game. You aren’t commanding an army of troops, you aren’t building a stat tree or collecting weapons and crafting them together to be bigger, badder weapons. The core mechanic of this game is time traveling. Mechanically speaking, there’s not a lot of information that needs to be expressed, so the effect for this can be very practical. In fact, the only mechanical piece of information being conveyed – the events you can replay – is delineated in the UI element. The effect itself is a camera lens treatment with some camera shake, chromatic aberration, tonal shift, and flickering. It definitely conveys a strange sort of reality.
It’s entirely possible that in your production cycle, you need to draw a line in the sand.
For example, you could say:
- Damage radiuses are always perfectly round reticles.
- Or, the leading edge of an effect is always team-colored.
- Or, Anticipation is purely symbolic, but the follow-through is purely practical.
In pre-production, you can have this discussion with Art and Design pre-emptively. Find out what their goals are and map it out. What sorts of things does the Design Director want communicated? How do they want those things communicated? How many pieces of information need to be displayed through VFX? What sorts of things are going to be taken care of by UI or Animation? Ask your Art Director how grounded your VFX need to be. Bring up the kinds of mechanics you’ll be displaying, bring up things like screen real-estate, noise, and clarity. Find out where *THEY* would plot your game on this spectrum.
Find your inspiration. Where are you getting your references? Be it games, film, or even books, when you’re thinking about your references, don’t just gather images – Explore how those pieces of media deliver information to the user, not just the aesthetics, and plot it out on the spectrum.
Even though Harry Potter is a book series, the descriptions are all highly visual. They never tell the audience what’s happening, they show it. I’d put Harry Potter on the practical side. In James Bond movies, even though film is a visual medium, characters end up describing with words what all of his cool gadgets do, and we only get to see those gadgets in action a few times. Mechanically, you could put James Bond on the symbolic side.
I rarely get concept art as a VFX artist. If you’re in the pre-production phase – ASK. Maybe there’s no bandwidth for it, but it doesn’t serve you to assume. Concept art can be an invaluable tool later in production. Also, ask for some UI support to be built into your pipeline. I’ll get tasks at the tail end of a project that probably should have been the UI artist’s task, and vice versa. Help pad that time in early.
In production, you can use this spectrum as a tie-breaker for the disciplines that can be at odds with each other. If your Art Director wants it on this side and your Design Director wants it on that side, refer back to the spectrum you hashed out in pre-production to settle those odds.
You can use this spectrum to discuss sprint-to-sprint or milestone-to-milestone planning. Both of the FX below took upwards of 3 weeks to complete. The difference is that the effect on the right, while it wasn’t exactly labor intensive, went through dozens of rounds of feedback and player testing and email chains. The fire breath effect required a lot of simulation work, animating everything just right, and tuning the lighting, but fewer actual rounds of feedback.
Generally speaking, Symbolic FX require more feedback and possibly more iteration than Practical FX, while Practical FX require more labor. Let your producers know which VFX are which so they can budget that time accordingly.
You can also measure your spectrum against your own VFX. Are you being consistent? Are you being purposeful? No matter how much planning you do beforehand, there are just some things you have to suss out in Production. Make sure your VFX are in-line with your pre-established rules.
In post-production, have a post-mortem. Now that your game has launched, plot out where your game’s VFX exist on this spectrum. Does it match where you put it during Pre-Production, or did it shift around? Examine the intent vs the finished result.
Plan your patches and updates. Is there a system or a suite of VFX that deserves a little post-launch love?
Listen to the buzz! Now more than ever, if you want to know whether players understand what your VFX are communicating, you can absolutely find out. Talk to your community managers or even watch some streamers.
Once you’re done with all that – do it all over again!
Visual Effects artists are caught in between art and design, trying to serve the needs of both without sacrificing the integrity of the game’s mechanical needs or the game’s aesthetic and artistic principles. But this doesn’t have to be the case. By using the Visual Language Spectrum early and often, you can turn the tug-of-war into collaboration.
Cater your use of the spectrum for the specific part of the game development cycle you’re in now. Be mindful of the resources you have, the needs of your cohorts, and the needs of your game.
And finally, you have to make your own rules. There’s something to be said about not wanting to re-invent the wheel, but all games are unique. If you use this spectrum as a guide for your game, it will help you establish consistency and clarity in your visual language. And then you can thoughtfully and carefully break those rules.