I had two tasks on Injustice 2 that contained beams. One was the tiniest, quickest, you-barely-see-it beam ever. And one was nearly half the cinematic and it destroyed Superman. Dang.
The first one was in the very first scene of the game. Kara witnesses the destruction of her home planet by Brainiac and his Brainiac… lings. In the craziness of the shooting and the beaming and the storming and the dust, there’s a man who gets obliterated with some super cool eye beam in the background.
If you blinked, you missed it. Here’s the gif version:
This was the reference I was given:
Most of the work is in the material on the cone. Because it was a cinematic effect, I knew the exact distance the characters were going to be from the camera, so I was able to use depth to create a couple of masks, and then I used UV distortion on those masks to give it that dissolving/windblown look.
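The masking math above can be sketched outside the material editor. This is a minimal NumPy version of the idea, not the actual Unreal material graph; all the function and parameter names here are mine.

```python
import numpy as np

def depth_mask(scene_depth, near, far):
    """Remap pixel depth into a 0-1 mask between two planes the
    characters are known to sit between (possible here because the
    camera distance is fixed in a cinematic)."""
    return np.clip((scene_depth - near) / (far - near), 0.0, 1.0)

def distorted_sample(mask, uvs, noise, strength=0.05):
    """Offset the mask lookup by a noise value to get the
    dissolving / windblown edge (nearest-neighbor lookup)."""
    offset_uvs = uvs + (noise - 0.5) * strength
    h, w = mask.shape
    xs = np.clip((offset_uvs[..., 0] * (w - 1)).astype(int), 0, w - 1)
    ys = np.clip((offset_uvs[..., 1] * (h - 1)).astype(int), 0, h - 1)
    return mask[ys, xs]
```

In the real material, `noise` would be a panning noise texture sample, and the distorted mask would feed opacity or emissive on the cone.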
I also animated the cone by hand. I tried attaching it to the character’s face, but it wiggled around too much and looked really weird.
And here’s the big big beam:
It was one of the last tasks we had. I felt a lot more comfortable in their cinematic tools, but I spent way less time on the inner details of the beam. There are a lot of different camera angles here, so it was much more about getting a good composition per-shot. In addition to the big big beam, I also created all the cracks on Superman’s face and dissolved his clothes away. This wasn’t part of the task, but I thought it would be cool to include – Superman was getting blasted so hard that it was eating away at his supersuit. I put a generic male skeletal mesh underneath his so that when his clothes dissolved away, he still had skin underneath.
1) The beams in the longer shots don’t actually connect to the ship’s tentacles. I just positioned them so that they looked like they were attached. The real Brainiac ship was so far away that it would have been impossible to manage particles and meshes traveling that great a distance.
2) Related to the first fun fact, the first shot where the Brainiac ship’s tentacle charges up and then the beam shoots down at Superman, there’s a camera cut. I attached a very quick particle effect to the camera to help the transition. It’s like 4 frames.
3) At one point I dissolved too much of his skin and I had to contemplate texturing some Superman nipples because “generic male” is just a Ken doll. We split the difference and just left most of his clothing intact.
A while ago, I got to work with Ember Entertainment on Meow Match. This was a truly unique experience – mostly I work in game engines like Unreal, but all of the work here was created in a 2D particle tool called TimelineFX by RigzSoft. How cool is that? Here are some of my favorite bits:
Technically this is FXVille’s official sample reel, but I was the only one on this contract so I’m claiming it, haha.
Last year I got to work with Undead Labs on State of Decay 2. I was only on the project a short time on environment effects, impacts, and some muzzle flashes, but it was a lot of fun. By far the coolest thing I did was this little bug system blueprint. This Abzu GDC Session talks a bit about the same techniques I used here.
Here’s a gif of the final result:
All we really have for particles interacting with the world is collision. For things like dragon flies, grasshoppers, flies, and birds, we didn’t really need to get too fancy. But for things like cockroaches, rats, spiders, lizards, and really anything creepy-crawly that actually creeps and crawls, we’d need something else.
I started by animating the bug particles as if they’d always be on a flat surface, like so:
Then I created a Blueprint that takes a snapshot of the SceneDepth using a SceneCaptureComponent2D (on construction; you could do it at run-time, but I’d be wary of doing it per frame). In this picture, you can see a preview of the SceneDepth snapshot:
In the same blueprint, I put in a Particle System parameter so you could swap with any bug you’d like (tarantula, cricket, dragonfly, etc). I also created and set a Dynamic Material on that Particle System per emitter. Now, I could take the SceneDepth texture and add it to the WorldPositionOffset in the Dynamic Material. You can do this either per particle or per vertex, so I added a lerp that blended between the two called “Vertex Conformity.” Lizards and rats are squishy, so they had a “vertex conformity” of almost 1, each vert hugged the surface of the geometry. Cockroaches and spiders were set closer to 0, so they retained their shape.
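The “Vertex Conformity” blend boils down to one lerp. Here’s a minimal sketch in plain Python, outside Unreal – the real version lives in the Dynamic Material’s WorldPositionOffset, and these names are mine:

```python
import numpy as np

def conform_to_surface(vertex_heights, particle_height, conformity):
    """Blend between one height offset for the whole particle (rigid,
    keeps its silhouette) and a per-vertex offset sampled from the
    captured SceneDepth (squishy, hugs the geometry).

    conformity ~ 1.0 -> rats and lizards: every vert hugs the surface
    conformity ~ 0.0 -> cockroaches and spiders: shape is retained
    """
    return (1.0 - conformity) * particle_height + conformity * vertex_heights
```

Here `particle_height` would be the depth sampled once at the particle’s center, and `vertex_heights` the depth sampled per vertex.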
Vertical areas are where this technique kinda falls apart, as you’d imagine, so be aware of that if you want to use this technique in your game! You can even see it a bit in my own gif when rats and lizards get extra long on the more vertical parts of the rocks:
In addition to the vertex offset based on the SceneDepth, I also animated the pieces using vertex colors as masks, giving them each their own unique wigglin’. (You can see more permutations of this idea in that Abzu video I linked above. They did basically the same thing with all those fish!)
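The vertex-color wiggle can be sketched as one sine driven by a painted mask. Again, this is my own illustrative version, not the shipped material:

```python
import math

def wiggle_offset(vertex_color_r, time, speed=8.0, amplitude=0.1, phase=0.0):
    """Verts painted with red vertex color (legs, wings, tails) swing
    on a sine wave; the body (red = 0) stays put. Randomizing 'phase'
    per particle gives every bug its own beat."""
    return vertex_color_r * amplitude * math.sin(time * speed + phase)
```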
Maybe someday I’ll update this post with a video compilation of the whole thing. But that’s it for now! Have fun with your creepy crawlies.
In 2016 I had the privilege of working on Injustice 2 with NetherRealm. My team at FXVille was tasked with Cinematics (and you can see our demo reel, if you’d like), but I wanted to make a little post about Atrocitus’s blood vomit. It was a lot of fun and very gross.
It was mainly a mesh-based particle solution. Here’s the breakdown:
Three splines in 3ds Max with a moderate amount of thickness, edit poly to get rid of the caps, turbosmooth for extra geo, and a noise modifier on top to crunch it up a bit. There’s a handy little button for automatically generating UVs on a spline which is fantastic. The idea is to pan some liquidy normals and such downward to add to the velocity of the projectile vomit.
The next part was the material generation. NetherRealm has NO SHORTAGE of blood materials, so I leveraged what was around me to make a shiny, masked, bumpy shader that had panning and a clip-out function. I hand-painted the liquidy/stringy texture in photoshop.
Then it was just about animating the particles, which was super quick.
Plus, more particles, timing it to the animated characters, adding more odds and ends and bits and whatnot. Finished result:
Back in March, I got to speak at GDC 2018. My talk was called “Real-Time VFX: A Visual Language Spectrum” and was a spiritual successor to my “What Color is Slow?” PAX Dev session I did in 2016. If you have access to the GDC Vault, you can go watch it now! But if you don’t, here’s the blog-post version of my session:
The session is focused on VFX in the production pipeline. It’s not about the technical aspect of being a visual effects artist, or detailing any techniques I use in the creation of special effects for games. Instead, it’s about how this discipline fits into the game development pipeline, and how to improve communication between the disciplines we tend to be sandwiched between.
[This next bit is the “About Me” section that is fairly mandatory for GDC Talks. Skip ahead if you like.]
My name is Sarah Grissom and I’ve been a visual effects artist for eight years. After graduating from DigiPen in 2010, I began my career as an effects artist at The Amazing Society, where I worked on a children’s MMO called Super Hero Squad Online. Then I worked at Gas Powered Games for a short time on Age of Empires Online as well as a few smaller or unreleased projects. After that, I started working at FXVille, which is a real-time vfx contracting studio. In my five years there, I’ve helped out on about ten different projects including Shadow of War and Shadow of Mordor, Injustice 2, State of Decay 2, and Spider-Man PS4, and a couple of unannounced titles.
FXVille was founded in 2008 by a group of veteran VFX artists who saw a need yet to be filled in the game industry. In the last ten years, FXVille has gotten the opportunity to work with over thirty game studios – ranging from top-tier AAA studios like Bungie, Irrational, Monolith, and Epic, to smaller indie studios like Camouflaj and The Deep End Games. We work on games of every platform, in any engine you can think of. PC, Console, Mobile, VR, you name it, we’ve dabbled in it. It’s the most rewarding job I’ve ever had. Even if I don’t get to work on every single title that comes through, I get to see all of the hard work, cutting-edge technologies, and clever solutions that people across the industry are putting into their games. It’s been an enormous privilege.
FXVille has certainly given me a unique perspective. And, across the board, I’ve noticed this problem – on every game, no matter the tools, the genre, the pipeline, at every studio – there is a disconnect between Art and Design. What the Designers want VFX to accomplish tends to be more literal, more functional, give the player information. What the Artists want VFX to accomplish tends to be more flavorful, tell a story, make the player feel something. We, VFX Artists, tend to carry out this tug-of-war throughout the entire game development process. The goal of this talk is to give artists and designers a new tool to help us all decide what the goals of our visual effects are, what they need to be communicating, and how they should communicate it. After all, these aren’t two sides of a coin. It’s a spectrum.
On one side of the spectrum, we have Practical effects. Weather effects like rain or snow are highly practical. So are blowing dust, sandstorms, fog, or floating pollen. These are all examples of ambient effects that give your setting some movement, some life, and really ground your characters in their reality. Keith Guerrette called these “Narrative Effects” in a video he did for Full Sail.
Another example is gore and guts. Could be human blood, zombie blood, alien, monster, could also be yarn or stuffing or other giblets. This includes hot breath, spit, sweat, and tears. These are practical in that they give your characters internal workings, what they’re made of.
We also have destruction. Explosions, impacts, collapses. Anything that blows up, splashes, bursts, or gets crushed. Similar to the gore and guts, it tells you what your environment pieces are made of. It gives your props and set pieces a sense of weight, volume, and responsiveness.
It could also be a barrier. We’re often tasked with telling the player they can’t go someplace, or that they SHOULD go someplace. Fire or other elementals like waterfalls, boiling lava, thick smoke plumes are great practical methods of gating and guiding the player. Similar to weather effects, they add some ambient motion and even excitement to the environment.
Symbolic effects can be used to emphasize user interface elements like this example from Kingdom Hearts 2. These sprites form a green triangle, reminding the player to push the green triangle on their controller.
They can also highlight gameplay systems that inform the player what just happened or what they can do next. This example is from Hearthstone. Lots of effects in here are highly symbolic: the lines being drawn from one card to another, the color and treatment of the card highlights, the little “hit” burst with numbers telling you how much damage a card has received.
Symbolic effects aren’t usually representative of what the in-game character is looking at so much as they are a tool for the player or user. Assassin’s Creed has a mechanic called “Eagle Vision” where the player can identify targets or other people of interest with red or yellow outlines. This is a symbolic effect in that it’s not grounded in reality. We aren’t supposed to assume Ezio Auditore is actually seeing little red dudes and little yellow dudes. Rather, it’s a representation of his special assassin training.
The same can be said for this example from Splinter Cell. This effect is called Last Known Position. An outline of your character is left behind and tells you, the player, where the enemy last saw you. We aren’t supposed to believe there is actually a little hollow Sam Fisher chillin out in the hallway.
League of Legends is a great example of the in-between on this spectrum. Every effect in that game toes the line between a display of its systems and the game’s setting or lore. It’s imperative for these effects to be accurate in scale, duration, and intensity as called for by design. But it’s equally imperative that they fit within the fantasy of the game.
This is an example from Shadow of Mordor. It conveys a very significant, core gameplay mechanic, but it’s also very grounded in the “reality” of Middle-earth. In fact, the entire suite of effects is very cohesive, all serving the central character and his ghostey elfy superpowers.
As this is a spectrum, we can place some of these examples further toward practical (like Shadow of Mordor), and some further toward symbolic (Like League of Legends). Sometimes Barriers (like these fiery walls in Tomb Raider) need to be purely practical – they need to look like they’re just part of the world and not call attention to themselves. Sometimes Barriers need to be broken through and interactable. You can shift your interactable barriers toward symbolic as you decide how you want to display “interactable barrier” to your player. An outline? A colorful material treatment? A big spinning target? You decide.
I’ve run into issues sometimes with barriers specifically that look too exciting or eye-catching or give the player the idea that they should interact with it. Make sure you know what you’re building your effect for beforehand.
I’ve also run into issues sometimes when the symbolic effect is TOO symbolic. Too “gamey.” Taking your shape language away from perfect primitives and into more organic shapes, and muting or tinting your primary colors so they aren’t so bright, can help shift those effects toward practical and fit the overall aesthetic of your game a bit more. The one thing they all have in common is communication. They communicate setting and lore, they communicate damage radiuses, they communicate who is an enemy and who isn’t.
As Visual Effects Artists, we already understand this dichotomy. More often than not, we’re shifting around this spectrum throughout the production cycle. One of my goals for this session is to arm all of you with a new way to discuss the needs of your game, to pre-empt the conversation before we get stuck in iteration hell. There’s lots of ways to use this spectrum in your daily workflow. In any part of the game development cycle.
It’s possible to apply this spectrum to your entire game based on its genre. Heavy Rain is almost entirely practical VFX – I’m pretty sure the VFX don’t even have gameplay implications – it’s all story and mood and setting. Even the HUD is pretty minimal. On the flipside, games like Defense Grid 2 are almost purely symbolic. In a tower defense game or even an RTS game, you need to understand what’s going on at a quick glance. Everything is perfectly color coded, perfect round radiuses, perfectly sized beams and projectiles describing the action, the direction, the duration.
You could apply this spectrum to different game modes. Games like Battle Chasers (or Pokémon or Final Fantasy) tend to have an overworld mode and a battle mode. The effects you see on the map trend toward practical – ambient environment effects like falling leaves, torches you may or may not be able to interact with. The symbolic effects you’ll find on the map are usually representative of potential enemy encounters or other interactables. (By the way, Alex Redfish is the artist responsible for the incredible VFX in this game. Check out his other work!)
In the battle mode, it trends toward symbolic. A sword doesn’t actually have a hot orange swipe or flares, these are used to show impact and intensity of the action or mechanic.
Lots of games implement this spectrum on a case-by-case basis, within mechanics. Here’s an example from League of Legends. Every player uses the consistent language of the cyan-colored placement reticle for area of effect, whether it’s radial or a beam or a cone. Once the effect is cast, however, that’s when the more practical effects kick in. In this case, Star Guardian Miss Fortune’s got a bunch of orangey, purpley stars falling from the sky and raining down on her enemies. But this action needed a little extra help in describing the actual mechanics of this attack, specifically its radius. The blue ring on the ground marries the design needs of showing the precise radius this attack uses and the art needs of showing Star Guardian Miss Fortune’s star guardian theme and shape language.
Here’s a classic. This is “Haste” from a game called World of Final Fantasy. Lots of RPGs have this mechanic. It can be cast on you by a mage or by an item, it could also be a passive buff. All Final Fantasy games have this spell, and they’ve all interpreted it somewhat differently across the board.
In this case, the effect we’re seeing here is a little spinning diamond followed by a persistent glow with streaks and motes animating upward. It’s pretty straightforward. Nothing about this effect tells me about the world; it doesn’t give me any insight about the characters; it really only describes the mechanic itself. So, I’d put this one firmly on the symbolic side of the spectrum.
Life is Strange is an adventure game. You aren’t commanding an army of troops, you aren’t building a stat tree or collecting weapons and crafting them together to be bigger, badder weapons. The core mechanic of this game is time traveling. Mechanically speaking, there’s not a lot of information that needs to be expressed, so the effect for this can be very practical. In fact, the only mechanical piece of information being conveyed – the events you can replay – is delineated in the UI element. A camera lens effect with some camera shake, chromatic aberration, tonal shift, and flickering. It definitely conveys a strange sort of reality.
It’s entirely possible that in your production cycle, you need to draw a line in the sand. For example, you could say:
Damage radiuses are always perfectly round reticles.
Or, the leading edge of an effect is always team-colored.
Or, Anticipation is purely symbolic, but the follow-through is purely practical.
In pre-production you can have this discussion with Art and Design pre-emptively. Find out what their goals are and map it out. What sorts of things does the Design Director want communicated? How do they want those things communicated? How many pieces of information need to be displayed through VFX? What sorts of things are going to be taken care of by UI or Animation? Ask your Art Director how grounded your VFX need to be. Bring up the kinds of mechanics you’ll be displaying, bring up things like screen real estate, noise, and clarity. Find out where *THEY* would plot your game on this spectrum.
Find your inspiration. Where are you getting your references? Be it games, film, or even books, when you’re thinking about your references, don’t just gather images – Explore how those pieces of media deliver information to the user, not just the aesthetics, and plot it out on the spectrum.
Even though Harry Potter is a book series, the descriptions are all highly visual. They never tell the audience what’s happening, they show it. I’d put Harry Potter on the practical side. In James Bond movies, even though film is a visual medium, characters end up describing with words what all of his cool gadgets do, and we only get to see those gadgets in action a few times. Mechanically, you could put James Bond on the symbolic side.
I rarely get concept art as a VFX artist. If you’re in the pre-production phase – ASK. Maybe there’s no bandwidth for it, but it doesn’t serve you to assume. Concept art can be an invaluable tool later in production. Also, ask for some UI support to be built into your pipeline. I’ll get tasks at the tail end of a project that probably should have been the UI artist’s task, and vice versa. Help pad that time in early.
In production, you can use this spectrum as a tie-breaker for the disciplines that can be at odds with each other. If your Art Director wants it on this side and your Design Director wants it on that side, refer back to the spectrum you hashed out in pre-production to settle those odds.
You can use this spectrum to discuss sprint-to-sprint or milestone-to-milestone planning. Both of the FX below took upwards of three weeks to complete. The difference is that the effect on the right, while it wasn’t exactly labor intensive, went through dozens of rounds of feedback and player testing and email chains. The fire breath effect required a lot of simulations, animating everything just right, tuning the lighting, but fewer actual rounds of feedback.
Generally speaking, Symbolic FX require more feedback and possibly more iteration than Practical FX, and Practical FX require more labor. Let your producers know which VFX are which so they can budget that time accordingly.
You can also measure your spectrum against your own VFX. Are you being consistent? Are you being purposeful? No matter how much planning you do beforehand, there are just some things you have to suss out in Production. Make sure your VFX are in-line with your pre-established rules.
In post-production, have a post-mortem. Now that your game has launched, plot out where your game’s VFX exist on this spectrum. Does it match where you put it during Pre-Production, or did it shift around? Examine the intent vs the finished result.
Plan your patches and updates. Is there a system or a suite of VFX that deserves a little post-launch love?
Listen to the buzz! Now more than ever, if you want to know whether players are understanding what your VFX are communicating, you can absolutely find out. Talk to your community managers or even watch some streamers.
Once you’re done with all that – do it all over again!
Visual Effects artists are caught in between art and design, trying to serve the needs of both without sacrificing the integrity of the game’s mechanical needs or the game’s aesthetic and artistic principles. But this doesn’t have to be the case. Using the Visual Language Spectrum early and often, you can turn the tug-of-war into collaboration.
Cater your use of the spectrum for the specific part of the game development cycle you’re in now. Be mindful of the resources you have, the needs of your cohorts, and the needs of your game.
And finally, you have to make your own rules. There’s something to be said about not wanting to re-invent the wheel, but all games are unique. If you use this spectrum as a guide for your game, it will help you establish consistency and clarity in your visual language. And then you can thoughtfully and carefully break those rules.
One of the first clients I worked with at FXVille was Monolith. I started smack dab in the middle of development for Middle-earth: Shadow of Mordor (actually, this was before the game even had a title!). My first task was to create an effect for catching orcs on fire. I started with duplicating the orc’s mesh and re-mapping it so that I could place a scrolling fiery texture over the top (left), and then attaching flame particles to various sockets (right):
After a round of feedback, I made a version that was less bloom-ey, and tested it out on a bigger looking orc with armor. Part of the challenge was making this effect versatile enough to work on ANY orc – and there’s a LOT of orcs in this game. Oh man. So here’s what that looked like:
I also had to make a wraith version – a special move of Talion in the game:
Here’s a compilation of these iterations, as well as a few test captures where I set dozens of orcs on fire (I’m not sure if you can even DO that in-game, without fancy shmancy dev cheats), along with the final version as shipped:
I’ve been playing Dungeons and Dragons with my best friends for a couple of months now. Both of them have had birthdays, so I made them portraits of their characters. I’m pretty proud of how they turned out, so I thought I’d share them here! Wege is a tiefling sorcerer and ruined scholar. The Silent Savior is a half-elf rogue and assassin thief.
The summoning sickness post made me want to revisit some more old work from the online trading card game. Instead of breaking down each little piece into its own blog, I decided to make a pre-viz reel! Most of these were done without a game to even work on. It was my job to get thrifty and fake battle scenarios, animations, even systems. Every effect you see here was a work in progress, sent through email, FTP, etc. to the decision-making people who would say “keep exploring” or “try something else.” Hence the bizarre aspect ratios (sorry).
It’s neat to look back at this stuff. I always forget just how much iteration happens during the course of a project.
Just like summoning sickness, it was a mixture of UI/UX (user interface / user experience) and VFX (visual effects). Particularly challenging for someone like me who isn’t much of a UI person to begin with.
(Someday I’ll post more current work, I promise. I’ve grown a LOT since the work in this reel was completed. And I’m excited to share!)
Here we go a-time-travelin again. Age of Empires Online has been shut down for a little while, but the work I did is still fresh in my mind, as clear as the day I made it – which would be (gosh) two and a half years ago, already. One of the effects I was asked to do was “conversion.” Your team had the ability to convert units from another team to yours.
The unit that had the ability to convert you was a sort of shaman/magic-man/priest. I had this idea that he was “showing you the light” which is why I went with yellow. I also thought that his spell might confuse or disorient the affected units, which is why I went with birds circling overhead. Specifically bluebirds because they are happy and cheery, and this was a happy and positive move. More units! Less destruction! Who doesn’t like that? Also, I didn’t want to use yellow birds/canaries because yellow wooshes and yellow birds is a little too matchy-matchy. Color variation and value depth is always something you wanna push for in vfx. Especially since bright, saturated colors are often used to define teams (and AOEO was not an exception here).
One of the challenges was animation and shape. Some of the characters had a quick, snappy animation, some looped very seamlessly, some were randomly super long. Anyway, I had to make a few different effects that looked like they were the same, but had different lengths and beats. As you can see in the final product below, it worked out pretty well!
(a youtube version, if you’re curious. It has sound.)