RETURNAL VFX BREAKDOWN

Besides captivating gameplay, Housemarque is known for over-the-top particle effects. Titles such as Resogun, Alienation, Matterfall and Nex Machina all used proprietary vfx technology to bring colorful explosions to the screen, rewarding players for destroying enemies or completing levels. In Returnal we switched not only from a top-down to a third-person camera, but also to a more grounded and darker art style than before. In this article we take a closer look at how we utilized our vfx tech to make the alien planet of Atropos and its inhabitants come to life.

Below you can find the full breakdown video covering some of the showcase vfx features of Returnal. In this article we'll also go into more detail on some of those features.

HISTORY AND BACKGROUND

We have been working on our proprietary vfx tech since Resogun (2013 PS4 launch title), where the first prototype of our current particle system was used on some of the showcase effects. After Resogun, the particle system got a graphical user interface and we started referring to it as Next Gen Particle System (NGP). In 2014 we made the decision to produce all of the particle effects for Alienation with NGP. After shipping Alienation, the system was used for Nex Machina and ported to Unreal for Matterfall.

NGP is designed to be a GPU-only vfx authoring system with minimal CPU overhead. The focus is on performance and flexibility. Particle authoring is done by vfx artists who write compute shader snippets that define particle behavior and data. The process is not too different from writing particle expressions in Maya or VEX for wrangle nodes in Houdini. NGP takes care of memory allocation and most of the boilerplate code, while artists can focus on behavior and visuals.
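
To give a rough idea of what this looks like in practice, below is a minimal sketch of a per-particle update written as a CUDA-style compute kernel. NGP is not CUDA, and the struct fields, kernel name and parameters here are invented for illustration; the point is that the artist only writes the behavior in the middle, while the system generates the dispatch and memory management around it.

    #include <cuda_runtime.h>

    // Hypothetical particle layout; the real system lets artists define their own data.
    struct Particle {
        float3 position;
        float3 velocity;
        float  age;
        float  lifetime;
    };

    __global__ void UpdateParticles(Particle* particles, int count,
                                    float dt, float3 wind)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= count) return;

        Particle p = particles[i];

        // Artist-authored behavior: everything below is the kind of code a vfx
        // artist would write as a snippet, here just gravity plus a wind force.
        p.velocity.x += wind.x * dt;
        p.velocity.y += (wind.y - 9.81f) * dt;
        p.velocity.z += wind.z * dt;

        p.position.x += p.velocity.x * dt;
        p.position.y += p.velocity.y * dt;
        p.position.z += p.velocity.z * dt;

        p.age += dt;

        particles[i] = p;
    }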

NGP is not intended to handle only particle effects, though. It can also be used for controlling per-voxel behavior in volumes or for generating dynamic procedural geometry. We also have several modules that generate data to be used as input for effects. For example, we have our own fluid simulation module that can feed its simulation data to NGP. Another example is a module called the voxeliser, which can convert an animated mesh to voxels; that data can then be used for volumetric character effects. Other resources like textures, bone matrices and vertex buffers can also be used as inputs for particle effects.

Next we will go over some of the key modules and features of NGP.

NODE PARTICLES

From early on in the project it was clear that we wanted to do something special with the enemy creatures on Atropos. Game director Harry Krueger wanted them to resemble deep-sea creatures, with properties like bioluminescence and tentacles. Our enemy team animators briefly experimented with building tentacles from traditional rigid body physics, simulating chains of bones attached to enemy skeletons. This approach proved too limited: the performance cost of simulating multiple very long chains was too high, and physics simulation alone gave us no means of expressing enemy states through the tentacles. The vfx team was then assigned the task of creating dynamic tentacles that could be attached to enemy meshes and skeletons.

Luckily we already had a solution in mind. The team had been experimenting with particle vegetation for earlier projects, and a special particle type had been developed for branching vegetation such as trees. We named this particle type the Node Particle to reflect its properties and behavior. Node particles allow for creating one-directional parent-child connections. In our implementation these connections can only be established within the same particle system; in other words, connections cannot be made between different particle types. Any particle in the particle buffer may become the parent of a newly added child particle. A particle can be a parent to multiple children, but a particle can have only one parent. For these reasons a child can query its parent, but a parent cannot query its children. When reading the parent particle, the parent data is one frame old, i.e. not the data being written in the current frame. This makes it impossible to follow the parent exactly and results in a side effect that makes the motion of the particles appear organic. This side effect is used widely in the particle effects of Returnal, and it was especially useful for things like tentacles.
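
As a rough illustration of the data model, here is a hedged CUDA-style sketch of a chain update: each particle stores the index of its parent and reads the parent's state from the previous frame's buffer, which is exactly what produces the one-frame lag described above. The layout and the spring-like follow rule are invented for illustration, not NGP's actual code.

    #include <cuda_runtime.h>

    struct NodeParticle {
        float3 position;
        float3 velocity;
        int    parentIndex;   // -1 for a root particle
        float  restLength;    // desired distance to the parent
    };

    __global__ void UpdateNodeParticles(const NodeParticle* prevFrame,
                                        NodeParticle* currFrame,
                                        int count, float dt,
                                        float stiffness, float damping)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= count) return;

        NodeParticle p = prevFrame[i];

        if (p.parentIndex >= 0) {
            // Parent data comes from the previous frame's buffer, so a chain
            // can never follow its root exactly; that lag is what makes the
            // motion feel organic.
            float3 parentPos = prevFrame[p.parentIndex].position;
            float dx = parentPos.x - p.position.x;
            float dy = parentPos.y - p.position.y;
            float dz = parentPos.z - p.position.z;
            float dist = sqrtf(dx * dx + dy * dy + dz * dz) + 1e-6f;
            float pull = stiffness * (dist - p.restLength) / dist;
            p.velocity.x += pull * dx * dt;
            p.velocity.y += pull * dy * dt;
            p.velocity.z += pull * dz * dt;
        }

        p.velocity.x *= damping;
        p.velocity.y *= damping;
        p.velocity.z *= damping;

        p.position.x += p.velocity.x * dt;
        p.position.y += p.velocity.y * dt;
        p.position.z += p.velocity.z * dt;

        currFrame[i] = p;
    }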

Before we started working on the tentacle behavior, we needed to decide how to render them. First we experimented with rendering flat strips of polygons. The quality was close to acceptable, but lacking in areas like shadows. After a while we settled on rendering the tentacles as cylindrical meshes constructed in NGP at runtime. The ability of node particles to read data from their parents proved useful in constructing smooth surfaces. We used Catmull-Rom curves as the base for the cylindrical geometry, with particle positions serving as curve control vertices. One of the challenges we faced with this technique was a high amount of twisting in the curve normals when using analytical tangents. We solved this with a per-particle normal pre-pass: we pick a suitable normal vector for the first particle in the chain and then project each child particle's normal onto a plane defined by its parent's position and normal. By temporally filtering the results we managed to reduce the twisting frequency and turn it into more organic motion along the particle chain, as seen in the video below.
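
The following is a minimal, hedged sketch of one plausible version of that normal pre-pass: each child takes its parent's previous-frame normal, removes the component along the parent-to-child segment so the normal stays roughly perpendicular to the chain, and blends the result with its own previous normal as a temporal filter. Buffer names and the exact projection are illustrative; the shipped code may differ in detail.

    #include <cuda_runtime.h>

    __device__ float3 normalize3(float3 v)
    {
        float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z) + 1e-6f;
        return make_float3(v.x / len, v.y / len, v.z / len);
    }

    __global__ void NormalPrePass(const float3* prevPositions,
                                  const float3* prevNormals,
                                  const int* parentIndex,
                                  float3* outNormals,
                                  int count, float temporalBlend)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= count) return;

        int parent = parentIndex[i];
        float3 n = (parent >= 0) ? prevNormals[parent]   // inherit from the parent
                                 : prevNormals[i];       // chain root keeps its normal

        if (parent >= 0) {
            // Direction of the chain segment from parent to child.
            float3 d = make_float3(prevPositions[i].x - prevPositions[parent].x,
                                   prevPositions[i].y - prevPositions[parent].y,
                                   prevPositions[i].z - prevPositions[parent].z);
            d = normalize3(d);
            // Project the inherited normal onto the plane perpendicular to the segment.
            float nd = n.x * d.x + n.y * d.y + n.z * d.z;
            n = normalize3(make_float3(n.x - nd * d.x, n.y - nd * d.y, n.z - nd * d.z));
        }

        // Temporal filter: blend towards last frame's normal to turn sudden
        // flips into slower, more organic motion along the chain.
        float3 prevN = prevNormals[i];
        n = normalize3(make_float3(n.x + temporalBlend * (prevN.x - n.x),
                                   n.y + temporalBlend * (prevN.y - n.y),
                                   n.z + temporalBlend * (prevN.z - n.z)));

        outNormals[i] = n;
    }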

After settling on tube rendering we could start focusing on the behavior of the tentacles. Since we had full control over particle behavior, we were no longer constrained to pure physics simulation, but could conveniently change the tentacle movement based on the state of the enemy. This made it easy to experiment with things like forcing the tentacles to move in a certain way when the enemy is preparing an attack. We iterated on the timings with the enemy team and designers to ensure that the tentacle behavior would help telegraph enemy states together with animation and other vfx.
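
Conceptually, state-driven behavior boils down to branching or blending inside the particle update. The sketch below, with an invented enemy-state enum and telegraph direction, shows the general idea of pulling tentacle particles towards a readable pose while an attack is charging; it is illustrative only.

    #include <cuda_runtime.h>

    // Hypothetical enemy states; the real game drives these from gameplay code.
    enum EnemyState { IDLE = 0, CHARGING_ATTACK = 1, ATTACKING = 2 };

    __global__ void ApplyStateBehavior(float3* velocities, int count,
                                       int enemyState,
                                       float3 telegraphDirection, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= count) return;

        if (enemyState == CHARGING_ATTACK) {
            // Pull every tentacle particle towards the telegraph direction so
            // the player can read the incoming attack, on top of the normal
            // follow behavior applied elsewhere.
            float strength = 4.0f;
            velocities[i].x += telegraphDirection.x * strength * dt;
            velocities[i].y += telegraphDirection.y * strength * dt;
            velocities[i].z += telegraphDirection.z * strength * dt;
        }
    }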

Node particles also came in handy for the numerous ribbons and trails we have in the game. We wanted some of the homing bullets to have a long trail that lingers on screen for a while behind the bullet, and enemy melee attack trails used node particles as well. Below you can see a video of node particles following their parent and creating a ribbon trail, followed by a homing bullet attack by Phrike.

FLUID SIMULATIONS

One of our key principles at Housemarque regarding visual effects is to simulate as much as possible during runtime, using as little pre-baked data as possible. As we had used fluid simulations in our previous titles like Alienation and Matterfall, it was clear to us from the beginning that we shouldn't settle for pre-baked velocity fields for Returnal. Instead we use a real time fluid simulation around the player to simulate air flow that affects movement of the particles, vegetation and other vfx elements. In addition to that simulation (which we refer to as Global Fluid Simulation), we can have additional simulations attached to different actors in the game.

For an efficient and robust real-time solution we chose to implement a semi-Lagrangian grid-based fluid simulation [1]. The algorithm produces an unconditionally stable simulation. Each fluid simulation can be fed a density field which will be advected; in Returnal, density is added to these fields from the particle simulation. The simulations also react to forces and obstacles, both of which are filtered by channels, much like in a physics simulation. Forces are accumulated into the intersecting simulations using analytical shapes such as cylinders, and each analytical primitive's impact on its surrounding voxels is evaluated based on the overlapping volume (intersection). Obstacles are evaluated from signed distance fields (SDFs) as well as from skeletal meshes. An SDF's contribution to the obstacle volume is evaluated by determining whether each voxel of the obstacle volume lies inside or outside the geometry described by the SDF. Skeletal meshes contribute by being voxelized each frame, after which the resulting volumes are blitted into the final obstacle volume.
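
As a concrete anchor for the advection step, here is a hedged CUDA-style sketch of a single semi-Lagrangian density advection pass in the spirit of [1]: each voxel traces backwards along the velocity field and samples the previous density there. Grid size, memory layout and names are illustrative, and the real simulation of course also handles forces, obstacles and the rest of the solver.

    #include <cuda_runtime.h>

    #define NX 64
    #define NY 64
    #define NZ 64

    __device__ int cellIndex(int x, int y, int z)
    {
        return (z * NY + y) * NX + x;
    }

    __device__ float sampleTrilinear(const float* field, float x, float y, float z)
    {
        // Clamp the sample point inside the grid and interpolate trilinearly.
        x = fminf(fmaxf(x, 0.0f), NX - 1.001f);
        y = fminf(fmaxf(y, 0.0f), NY - 1.001f);
        z = fminf(fmaxf(z, 0.0f), NZ - 1.001f);
        int ix = (int)x, iy = (int)y, iz = (int)z;
        float fx = x - ix, fy = y - iy, fz = z - iz;

        float c00 = field[cellIndex(ix, iy,     iz    )] * (1 - fx) + field[cellIndex(ix + 1, iy,     iz    )] * fx;
        float c10 = field[cellIndex(ix, iy + 1, iz    )] * (1 - fx) + field[cellIndex(ix + 1, iy + 1, iz    )] * fx;
        float c01 = field[cellIndex(ix, iy,     iz + 1)] * (1 - fx) + field[cellIndex(ix + 1, iy,     iz + 1)] * fx;
        float c11 = field[cellIndex(ix, iy + 1, iz + 1)] * (1 - fx) + field[cellIndex(ix + 1, iy + 1, iz + 1)] * fx;
        float c0 = c00 * (1 - fy) + c10 * fy;
        float c1 = c01 * (1 - fy) + c11 * fy;
        return c0 * (1 - fz) + c1 * fz;
    }

    __global__ void AdvectDensity(const float* densityIn, float* densityOut,
                                  const float3* velocity, float dt)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        int z = blockIdx.z * blockDim.z + threadIdx.z;
        if (x >= NX || y >= NY || z >= NZ) return;

        int idx = cellIndex(x, y, z);
        float3 v = velocity[idx];

        // Semi-Lagrangian backtrace: where did the material in this voxel come from?
        float px = x - v.x * dt;
        float py = y - v.y * dt;
        float pz = z - v.z * dt;

        densityOut[idx] = sampleTrilinear(densityIn, px, py, pz);
    }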

Any gameplay event can be scripted to add forces to the fluid simulation, causing nearby vfx elements to react. For example, these forces can be tied to enemy animations so that when an enemy lands a jump attack, we add a radial impulse to the fluid simulation at that moment and location. This causes nearby particles like leaves or sparks to fly away from the impact point. In the video below you can see fluid impulses triggered by enemy animations and player actions affecting particle vegetation.
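
A hedged sketch of what such a scripted impulse could look like on the simulation side: every voxel inside the impulse radius gets an outward velocity with a simple falloff. The falloff curve and all names are invented for illustration; the center is given here in voxel coordinates.

    #include <cuda_runtime.h>

    // Adds an outward impulse around a gameplay event, e.g. an enemy landing
    // a jump attack at 'center'.
    __global__ void AddRadialImpulse(float3* velocity, int nx, int ny, int nz,
                                     float3 center, float radius, float strength)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        int z = blockIdx.z * blockDim.z + threadIdx.z;
        if (x >= nx || y >= ny || z >= nz) return;

        float dx = x - center.x;
        float dy = y - center.y;
        float dz = z - center.z;
        float dist = sqrtf(dx * dx + dy * dy + dz * dz);
        if (dist >= radius || dist < 1e-4f) return;

        // Linear falloff from the impulse centre towards the radius.
        float falloff = 1.0f - dist / radius;
        float scale = strength * falloff / dist;     // also normalizes the direction

        int idx = (z * ny + y) * nx + x;
        velocity[idx].x += dx * scale;
        velocity[idx].y += dy * scale;
        velocity[idx].z += dz * scale;
    }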

While fluid velocities alone were enough for things like vegetation, in cases where one can see discrete point particles we felt that the global fluid simulation's velocity field lacked detail. To get more detail we implemented optional vorticity calculations for the fluid simulation and, in the particle update, added the curl of a noise field to the particle velocity, proportional to the magnitude of the fluid velocity at the particle's location. In game, this technique was used for the Xeno-archive holograms and the player teleport effect.
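
Below is a hedged sketch of that idea: a divergence-free detail velocity is taken as the curl of a noise potential (here a toy sine-based potential standing in for a real noise function) and scaled by the magnitude of the fluid velocity sampled at the particle, so the extra detail only shows up where the fluid is already moving. All names and the potential function are illustrative.

    #include <cuda_runtime.h>

    // Toy vector potential standing in for a proper noise field (e.g. simplex noise).
    __device__ float3 noisePotential(float3 p)
    {
        return make_float3(sinf(p.y * 1.7f + p.z * 0.9f),
                           sinf(p.z * 1.3f + p.x * 1.1f),
                           sinf(p.x * 1.9f + p.y * 0.7f));
    }

    // Curl of the potential via central differences; the curl of any field is
    // divergence free, which keeps the added detail from creating or
    // destroying "mass" in the flow.
    __device__ float3 curlNoise(float3 p)
    {
        const float e = 0.01f;
        float3 dx0 = noisePotential(make_float3(p.x - e, p.y, p.z));
        float3 dx1 = noisePotential(make_float3(p.x + e, p.y, p.z));
        float3 dy0 = noisePotential(make_float3(p.x, p.y - e, p.z));
        float3 dy1 = noisePotential(make_float3(p.x, p.y + e, p.z));
        float3 dz0 = noisePotential(make_float3(p.x, p.y, p.z - e));
        float3 dz1 = noisePotential(make_float3(p.x, p.y, p.z + e));
        float inv = 1.0f / (2.0f * e);
        return make_float3(((dy1.z - dy0.z) - (dz1.y - dz0.y)) * inv,
                           ((dz1.x - dz0.x) - (dx1.z - dx0.z)) * inv,
                           ((dx1.y - dx0.y) - (dy1.x - dy0.x)) * inv);
    }

    __global__ void AddCurlDetail(float3* velocities, const float3* positions,
                                  const float3* fluidVelocityAtParticle,
                                  int count, float detailScale, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= count) return;

        float3 fv = fluidVelocityAtParticle[i];
        float speed = sqrtf(fv.x * fv.x + fv.y * fv.y + fv.z * fv.z);

        // The added detail is proportional to how fast the fluid moves here.
        float3 c = curlNoise(positions[i]);
        velocities[i].x += c.x * speed * detailScale * dt;
        velocities[i].y += c.y * speed * detailScale * dt;
        velocities[i].z += c.z * speed * detailScale * dt;
    }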

VOXELISER AND VOLUMETRIC EFFECTS

One of the environment elements we wanted to have in the opening biome of Returnal (Overgrown Ruins) was thick volumetric graveyard fog. Due to the height differences in our levels, the procedural placement of the fog turned out to be problematic. Instead, we decided to place the fog volumes manually. With a high number of volumes to be placed by the environment team, we had to make the process as straightforward as possible.

The flexibility of our particle system allowed us to construct these volumes in NGP. Since particle data and behavior can be completely customized, we can store a 3-dimensional index for a number of particles and have them represent a volume. Volume bounds can be passed as constant data from the CPU to NGP, and along with the 3D index we can store any other data per voxel as well. This makes it possible to store a different state for each voxel inside a volume. In addition to storing voxel states, we can also change their update logic based on their position in the game world or inside the volume. With voxels that are aware of their state and position, we could have them automatically emit more density near surfaces like floors and walls while fading out smoothly near the edges of the volume. This made placing fog volumes a lot faster, since the fog adapted automatically to its surroundings. We could also sample the global fluid simulation at the voxel position and have the fog be moved around by things like wind, bullets and player actions. In the video below you can see one of these NGP fog volumes placed in a level. The fog density is adaptively created only near surfaces and advected by the fluid simulation in game.
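
A hedged sketch of the particle-to-voxel mapping: each particle derives a 3D index from its linear index, maps it into the volume bounds passed in as constants, and fades its density towards the edges of the volume. Resolution, names and the density rule are invented for illustration; the actual fog logic also looks at nearby surfaces and samples the fluid simulation.

    #include <cuda_runtime.h>

    #define RES_X 32
    #define RES_Y 16
    #define RES_Z 32

    struct FogVoxel {
        float3 worldPosition;
        float  density;
    };

    __global__ void UpdateFogVolume(FogVoxel* voxels, float3 boundsMin,
                                    float3 boundsSize, float baseDensity)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= RES_X * RES_Y * RES_Z) return;

        // Derive a 3D voxel index from the linear particle index.
        int ix = i % RES_X;
        int iy = (i / RES_X) % RES_Y;
        int iz = i / (RES_X * RES_Y);

        // Normalized position inside the volume, then a world position using
        // the bounds passed in as constants.
        float ux = (ix + 0.5f) / RES_X;
        float uy = (iy + 0.5f) / RES_Y;
        float uz = (iz + 0.5f) / RES_Z;
        float3 world = make_float3(boundsMin.x + ux * boundsSize.x,
                                   boundsMin.y + uy * boundsSize.y,
                                   boundsMin.z + uz * boundsSize.z);

        // Fade density smoothly towards the horizontal edges of the volume so
        // neighbouring volumes and empty space blend without visible seams.
        float edgeX = fminf(ux, 1.0f - ux) * 2.0f;
        float edgeZ = fminf(uz, 1.0f - uz) * 2.0f;
        float fade = fminf(1.0f, fminf(edgeX, edgeZ) * 4.0f);

        voxels[i].worldPosition = world;
        voxels[i].density = baseDensity * fade;
        // In the real effect this is also where nearby surfaces would boost
        // the density and the global fluid simulation would be sampled.
    }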

For the Phrike boss encounter we also wanted to be able to emit volumetric fog from its skeletal mesh. As mentioned, we can access vertex data and bone matrices from NGP, but we didn't have a way of knowing whether two vertices would occupy the same voxel, and writing from two different particles to a single voxel wasn't allowed.

The solution to this was a real-time voxelizer, which we also use for voxelizing skeletal meshes when evaluating obstacles for the fluid simulations. The approach requires the meshes to be watertight and the geometry to be non-overlapping. The method uses the GPU's rasterization pipeline and is based on [2]. The depth of each rendered pixel is used to atomically update inside/outside voxels along the depth axis. Each voxel requires just one bit, so it's possible to use a 2D virtual volume texture where the depth axis is packed into the bits of each pixel. We chose a 32-bit unsigned integer format for the texture. We needed up to 256 voxels along the depth axis, which meant using a 4x2 area of pixels to represent the depth. We also rotated the mesh so that its longest axis was aligned with the virtual texture's depth axis. Finally, we resolved the virtual texture to a 3D texture, with optional downsampling to smooth out the results. The data is then passed on to NGP, where artists can write density emission logic for the effect. By sampling surrounding voxels it's possible to extract a normal for the voxelized surface. In the video below you can see the output of the voxelizer using Phrike's mesh as an input.
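
The core bit trick can be sketched as follows, with rasterized fragments emulated by a plain kernel for clarity: every fragment XORs all depth bits from its own depth slice to the far end of its column, and after all fragments have been processed a bit is set exactly when that voxel lies inside the watertight mesh, as in [2]. With 256 slices each column is eight 32-bit words, matching the 4x2 pixel footprint mentioned above. Buffer names are illustrative.

    #include <cuda_runtime.h>

    #define DEPTH_SLICES 256
    #define WORDS_PER_COLUMN (DEPTH_SLICES / 32)   // 8 words, i.e. a 4x2 pixel footprint

    __global__ void SplatVoxelFragments(const int2* fragXY, const int* fragDepthSlice,
                                        int fragCount, unsigned int* columns, int width)
    {
        int f = blockIdx.x * blockDim.x + threadIdx.x;
        if (f >= fragCount) return;

        int depth = fragDepthSlice[f];                       // 0 .. DEPTH_SLICES-1
        unsigned int* column =
            columns + (fragXY[f].y * width + fragXY[f].x) * WORDS_PER_COLUMN;

        // Flip every bit from this fragment's depth to the end of the column.
        // Front and back faces cancel out in pairs, so once all fragments are
        // processed a set bit means the voxel is inside the mesh.
        for (int w = depth / 32; w < WORDS_PER_COLUMN; ++w) {
            unsigned int mask = (w == depth / 32)
                                    ? (0xFFFFFFFFu << (depth & 31))
                                    : 0xFFFFFFFFu;
            atomicXor(&column[w], mask);
        }
    }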

In cases where we wanted to use the voxelizer but didn't have a watertight mesh for the enemy, we had to develop an alternative technique to get similar results. We had experimented with creating particles on the mesh surface, but the results looked like a hollow shell rather than something with volume and mass. We worked around this by storing the bone indices of the enemy skeleton in an unused vertex color channel. This allowed us to determine a vector from a triangle to the closest point on the bone it's bound to. In other words, we extruded the surface towards the bone and created particles along the extrusion direction. This method worked well for situations where we wanted to turn an enemy mesh into discrete particles while maintaining a sense of volume.
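
A hedged sketch of the extrusion idea: the bone index stored in a vertex color channel selects a bone segment, the closest point on that segment to the vertex gives the extrusion direction, and particles are spawned at fractions along that direction to fill the interior. Data layout and names are illustrative.

    #include <cuda_runtime.h>

    struct BoneSegment {
        float3 start;
        float3 end;
    };

    __device__ float3 closestPointOnSegment(float3 a, float3 b, float3 p)
    {
        float3 ab = make_float3(b.x - a.x, b.y - a.y, b.z - a.z);
        float3 ap = make_float3(p.x - a.x, p.y - a.y, p.z - a.z);
        float denom = ab.x * ab.x + ab.y * ab.y + ab.z * ab.z + 1e-6f;
        float t = (ap.x * ab.x + ap.y * ab.y + ap.z * ab.z) / denom;
        t = fminf(fmaxf(t, 0.0f), 1.0f);
        return make_float3(a.x + ab.x * t, a.y + ab.y * t, a.z + ab.z * t);
    }

    // One particle per (vertex, layer): layer 0 sits on the surface, the last
    // layer sits on the bone, so the enemy gets filled with particles instead
    // of forming a hollow shell.
    __global__ void EmitExtrudedParticles(const float3* skinnedVertices,
                                          const int* vertexBoneIndex,  // read from the unused vertex color channel
                                          const BoneSegment* bones,
                                          float3* outPositions,
                                          int vertexCount, int layers)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= vertexCount * layers) return;

        int v = i / layers;
        int layer = i % layers;
        float t = (layers > 1) ? (float)layer / (float)(layers - 1) : 0.0f;

        float3 surface = skinnedVertices[v];
        BoneSegment bone = bones[vertexBoneIndex[v]];
        float3 target = closestPointOnSegment(bone.start, bone.end, surface);

        // Interpolate from the surface towards the closest point on the bone.
        outPositions[i] = make_float3(surface.x + (target.x - surface.x) * t,
                                      surface.y + (target.y - surface.y) * t,
                                      surface.z + (target.z - surface.z) * t);
    }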

This concludes our deep dive into the visual effects of Returnal. We hope you enjoyed reading it, and we look forward to sharing more of our tricks and techniques in the future.

AUTHORS
Risto Jankkila - Lead VFX Artist, Housemarque
Sharman Jagadeesan - Senior Graphics Programmer, Housemarque

References

[1] Jos Stam. 1999. Stable Fluids. In SIGGRAPH 1999 Conference Proceedings, Annual Conference Series, pages 121–128, August 1999.

[2] Elmar Eisemann, Xavier Décoret. Single-pass GPU Solid Voxelization and Applications. GI '08: Proceedings of Graphics Interface 2008, May 2008, Windsor, Ontario, Canada, pp. 73–80.


