Do Witchcraft

People: Gavin Zimmerman
Timeline: May 2022 - June 2022
Focus: Realtime VFX
Tech: Blender 3.2
Link: N/A

About

Do Witchcraft is a personal project exploring SDFs, bezier curves, animation curves, procedural geometry, timing, and of course magic. The concept behind the magic spells comes from "The Owl House," an animated series created by Dana Terrace and produced by Disney Television Animation.

Each card contains a glyph/symbol responsible for performing a spell, with four base symbols: light, ice, plant, and fire magic.

These effects are created entirely procedurally, from modeling to animation and texturing.

Breakdown

As with all my projects, I've included a detailed breakdown of the making of this animation. This writeup is a collection of design choices, technical methods, code, concept images, mathematical notes and designs, and everything in between.

At the bottom is a new section on the rendering workflow, describing what went wrong in early renders and how successive iterations built up to the final animation.

If you have any questions about the process or content here, please don't hesitate to ask.

VFX Storyboards



1

2



3

4

  1. Storyboard/ Notes for Light Spell
  2. Storyboard/ Notes for Ice Spell
  3. Storyboard/ Notes for Plant Spell
  4. Storyboard/ Notes for Fire Spell
Drawing quick storyboards for each effect proved very useful for conceptualizing the work to be done. Each board includes the components of the effect, a timing reference (the primary action for animation), and initial procedural design ideas.

Not all initial ideas (such as particles on the flower) made it into the final version, and the final version contains plenty that wasn't in the initial ideas.

Material Shaders

1



There are a total of 15 shaders for the background, magic cards, eyeball dial, and elemental materials. This section covers the making of some of these shaders.

Eyeball

The dial object with the eyeball printed on it wasn't thought of until halfway through the formal rendering process. In fact, it was built entirely while another computer was working on the latest render, as a quick test to see whether the concept would fit.

The shader setup is relatively simple; two principled BSDFs (Blender's PBR shader) were mixed together by a "blinking" factor (boolean). These BSDFs were configured to look like metal and an eyeball by adjusting specular, metallic, clearcoat, roughness, and emissive parameters.

2

3



  1. Materials on Sphere
  2. Metal Shader
  3. Eyeball Shader
For simplicity, the blinking animation is driven by the frame number. The animation started off periodic: the frame is divided by the frequency (frames per blink), and the fractional component of this value is shaped into an absolute triangle wave and then into thin spikes. This spike is used as a space multiplier, which affects the result of a length comparison that in turn becomes the blinking factor. 1D Voronoi noise is added to the frame input to offset each blink into a semi-random cycle.
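As a rough illustration of that shaping (the frequency and spike sharpness below are placeholders, and the Voronoi offset is only noted in a comment), the same math could be written in Python like this:

```python
import math

def blink_spike(frame, frames_per_blink=48.0, spike_power=8.0):
    """Shape the frame number into a periodic spike in [0, 1].

    The spike is what multiplies X-axis space before the length comparison
    that shades the eyelid; the real setup also adds 1D Voronoi noise to
    `frame` so blinks land on a semi-random cycle.
    """
    t = frame / frames_per_blink       # one unit per blink cycle
    frac = t - math.floor(t)           # fractional part of the cycle
    tri = abs(frac * 2.0 - 1.0)        # absolute triangle wave in [0, 1]
    return tri ** spike_power          # sharpen the triangle into a thin spike

# Sample a few frames around a blink
for f in range(0, 97, 12):
    print(f, round(blink_spike(f), 3))
```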

4

5



  1. Shaping time into blink signals; the final power node is the number of units by which X-axis space is multiplied
  2. Checking length to shape the eyelids; note that a maximum is used so that when the blink signal is at its peak, everything is shaded as the eyelid
  3. Light parented to the eye for a custom specular reflection


6

Ice

The ice material makes some assumptions about the geometry: it expects the geometry to be shaded flat and the vertices to be sparse.

For the actual geometry, noise is added/subtracted along the vertex normals of a detailed object. The object is then tessellated down to lower detail (and fewer vertices).

As far as shading goes, the layers/blending get pretty fun here.

1

2

3



  1. The base color is driven by noise that takes the normal as input
  2. Refraction color is blended with base color
  3. Node network of 1 + 2
For scratches and extra details, a distorted wave texture is blended with a fresnel.

4

5

6



7



  1. Fresnel Blending Value
  2. Distorted Wave Texture
  3. Fresnel on Distorted Wave
  4. Node Setup
The fresnel effect emphasizes the scratches on surfaces pointing away from the camera, as is the case in reality.

The final touches involve wiring this scratched ice look into a shader that processes scene lighting. The ice ejects from an emissive surface, so adding some emission at the bottom was a nice touch here. This emission is also multiplied by the value of the base color.
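As a loose sketch of this blending logic (the mapping from world Z to the emission falloff and all constants here are assumptions, and `scratch_value` stands in for the distorted wave texture):

```python
def fresnel_weight(normal, view_dir, power=2.0):
    """Fresnel-style weight: stronger where the surface faces away from the camera."""
    facing = max(0.0, sum(n * v for n, v in zip(normal, view_dir)))
    return (1.0 - facing) ** power

def ice_surface(base_color, scratch_value, normal, view_dir, world_z, emit_color):
    """Blend scratch detail in by fresnel, then add emission near the bottom of the ice."""
    f = fresnel_weight(normal, view_dir)
    color = [c * (1.0 - f) + scratch_value * f for c in base_color]
    # Emission fades in toward the bottom (assumes the card sits near Z = 0 and
    # the ice is roughly a unit tall), scaled by the base color's value.
    bottom = max(0.0, min(1.0, 1.0 - world_z))
    value = max(base_color)
    emission = [e * bottom * value for e in emit_color]
    return color, emission
```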

8

9

10



  1. Value derived from up axis
  2. Emission
  3. Full Node Tree

Flower: Stem, Leaf, and Petal

The materials for the flower are UV-driven, meaning that a 2D UV map is required instead of 3D coordinates. This is important to point out because it can be difficult to reuse these materials across multiple objects. That isn't a concern here, though, as leaves generally look alike (and we're not in a production environment).

The most difficult part of these materials is color and lighting. I found it's really something you need to go outside and study, maybe drawing a brief sketch to feel out the colors.

1



  1. Color Study
Similar to human skin, plants have layers of material which are translucent or semi-transparent. Light passes through these layers, bounces around a while, and exits, which is known as subsurface scattering.

2

3



  1. Without Subsurface Scattering
  2. With Subsurface Scattering
Notice that the shadows are much softer with subsurface scattering. The shadow can take on a color gradient when the color depth parameters on the subsurface scattering are adjusted. The default is a reddish hue; here I found that a blue-green hue best reflected what I observed in nature.

Making the details on the leaf is largely a matter of manipulating the UV space. X and Y make up the depth and width of the leaf respectively. Rotating this space toward the center line provides a useful coordinate system for the veins of the leaf.
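A small, hypothetical sketch of that coordinate trick (the pivot and angle are illustrative, not the values in the material):

```python
import math

def vein_uv(u, v, angle=0.6, pivot=(0.0, 0.5)):
    """Rotate the leaf's UV space around a point on the midrib.

    With X as depth and Y as width, rotating toward the center line gives a
    frame where simple stripes or Voronoi bands read as veins running
    diagonally out from the midrib. The angle's sign flips across the midrib
    so the two halves of the leaf mirror each other.
    """
    du, dv = u - pivot[0], v - pivot[1]
    a = angle if v >= pivot[1] else -angle
    c, s = math.cos(a), math.sin(a)
    return (pivot[0] + c * du - s * dv,
            pivot[1] + s * du + c * dv)
```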

4

5

6

7

8



9



  1. 1D Voronoi distance output on rotated space
  2. SDF from Stem
  3. Smooth Minimum of 4 and 5
  4. Remapped Values
  5. Value mapped to Color Ramp
  6. Nodes for getting vein SDF
The petal material is similar, but is naturally less patterned. Although this setup is relatively small and perhaps lacks an elegant method, its level of detail matches the rest of the scene well and fits within the color scheme.

10

11

12

13



14



  1. Noise
  2. Voronoi with noise added onto coordinates
  3. Remapped values for greater contrast
  4. Color Ramp
  5. Full shader tree for petals

Fire

The ways of making fire are limitless.

The most common approach involves passing in some fire texture to a particle emitter and shaping the particle's motion path to look like fire. There is also fire simulation technology similar to fluid simulation, which might use a volume or surface shading technique. It warrants some research on my part.

The approach here, however, assumes the geometry for the fire is a mesh that is (mostly) already in the shape of fire. For this section I will show the shader breakdown on a sphere.

I especially wanted to recreate the look of Calcifer from "Howl's Moving Castle" by Studio Ghibli for my fire. I imagined this would pair well with the existing animations, which had all already been completed.

1

2



  1. Calcifer
  2. Color/blending study
The tendrils that move up along the edge of the shape were the first challenge. This is actually very difficult to create well in a 3D setting as opposed to a 2D setting (without using a physically based approach). However, a custom fresnel effect and some noise can get pretty close.

3

4

5

6

7



8



  1. Normal
  2. Normal blended with driven noise
  3. Fresnel with new normal (dot with camera)
  4. Remapped values
  5. Flipped values/ Remapped More
  6. Noise moves along Z (up) axis to drive the tendrils up
This mask acts as the transparency of the fire. The same trick is also used for each layer of color, with different mixes of noise. The innermost color barely mixes any noise into the normal. I also plugged in an exponent operation here to customize the falloff of the fresnel effect.
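A rough sketch of that mask, where `noise3d` is a hypothetical stand-in for the noise texture and the mix and falloff values are placeholders:

```python
import math

def normalize(v):
    l = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / l for c in v)

def tendril_mask(position, normal, view_dir, t, noise3d,
                 noise_mix=0.5, falloff=2.0):
    """Fresnel-style mask with a noisy normal, used as the fire's transparency.

    The noise input scrolls along Z over time so the tendrils appear to rise.
    noise_mix controls how much noise bends the normal (the innermost color
    layer uses almost none); falloff is the exponent shaping the fresnel edge.
    """
    x, y, z = position
    n = noise3d(x, y, z - t)                        # scrolling noise, rises over time
    noisy_normal = normalize(tuple(c + noise_mix * (n - 0.5) for c in normal))
    facing = max(0.0, sum(a * b for a, b in zip(noisy_normal, view_dir)))
    return (1.0 - facing) ** falloff                # strongest along the silhouette
```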

9

10

11



  1. Blend between Dark Red/Pink (Outer) and Orange (Middle)
  2. Yellow (Inner)
  3. Node Breakdown
  4. Blended Colors
  5. Color with Transparency
Working with the colors here required a live render. Emission would blow the darker colors up into brighter, higher-contrast values, so getting this right required a bit of fine-tuning.

12

13

Glyph Shaders

1



Creating the glyphs on each card employed custom SDFs (signed distance fields). Basic SDFs for circles, lines, triangles, rectangles, and quadratic bezier curves were joined together to produce the complete glyph.

2

3

4

5

6

7

8

9



  1. Hollow Circle SDF [abs(Length-Radius)]
  2. Line SDF
  3. Another Line SDF
  4. Rectangle SDF (w/ hard edges)
  5. Quadratic Bezier SDF
  6. 2 Mirrored Line SDFs
  7. Union of all SDFs via minimum operations
  8. Colored
The full SDF trees were collapsed into a single group node, which allowed for reuse and abstracted away a large subprocess in the node viewer. Translating the SDF into color was fairly easy using a compare operation, with a very small amount of noise added for a pencil-type effect.
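For reference, the basic SDF math behaves roughly like the following Python sketch (the shapes and stroke width here are illustrative, not the contents of the node group):

```python
import math

def sdf_circle(p, center, radius):
    """Hollow circle: distance to the ring, abs(length - radius)."""
    d = math.hypot(p[0] - center[0], p[1] - center[1])
    return abs(d - radius)

def sdf_segment(p, a, b):
    """Line segment: distance to the closest point on the segment a-b."""
    ab = (b[0] - a[0], b[1] - a[1])
    ap = (p[0] - a[0], p[1] - a[1])
    t = (ap[0] * ab[0] + ap[1] * ab[1]) / (ab[0] ** 2 + ab[1] ** 2)
    t = max(0.0, min(1.0, t))
    closest = (a[0] + t * ab[0], a[1] + t * ab[1])
    return math.hypot(p[0] - closest[0], p[1] - closest[1])

def glyph_sdf(p):
    """Union of primitives via a minimum; compare against a threshold to draw."""
    return min(sdf_circle(p, (0.0, 0.0), 0.4),
               sdf_segment(p, (-0.3, -0.3), (0.3, 0.3)))

# A point is "inked" if its SDF is below the stroke width
# (a little noise on the threshold gives the pencil look).
stroke_width = 0.02
print(glyph_sdf((0.0, 0.41)) < stroke_width)   # True: on the circle's ring
```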

10



  1. Full Glyph Shader w/ SDFs grouped
Note the section labeled 'Animated Emission'; this animates the emission over time. The setup uses two remaps as inverse-lerp and lerp functions respectively, with a float curve in between to customize how values change. The green node does nothing but organize the setup. The attribute labeled 'timeline' will be covered in the next section.
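A minimal sketch of that inverse-lerp / float-curve / lerp chain, with placeholder window and strength values rather than the ones used in the shader:

```python
def inverse_lerp(a, b, t):
    """First remap: map the timeline value into a normalized 0-1 factor."""
    return max(0.0, min(1.0, (t - a) / (b - a)))

def lerp(a, b, f):
    """Second remap: map the shaped factor onto the emission strength range."""
    return a + (b - a) * f

def animated_emission(timeline, curve=lambda f: f * f):
    f = inverse_lerp(0.0, 1.0, timeline)   # normalize the active window
    f = curve(f)                           # float-curve equivalent: shape the ramp
    return lerp(0.0, 5.0, f)               # final emission strength
```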

Animation Setup

1



The attribute labeled 'timeline' from above is driven by the Z position of an empty object, one per magic spell object. This proved to be a very useful way of animating, since the procedural effects don't depend on frames or the 'real' timeline. When each animation plays and for how long can be controlled simply by adjusting the empty's animation, without changing any code.

A standard timeline also came out of this, timing each animation similarly over an abstract range of t=[-1,3]. Around t=2.8 is where most animations play their big transformation motion.

2

  1. Animation driven by empty
  2. Geometry Nodes calculating arbitrary time for animations
Mapping the empty's Z position to a time value is done in the geometry node setup. This way the geometry nodes, which program how the animated geometry looks, have a reference to the timeline as well. They can also output attributes which are then read by the (fragment) shaders.
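A minimal sketch of that mapping, assuming a placeholder Z range for the empty (the t range of [-1, 3] is from the standard timeline above):

```python
def z_to_timeline(empty_z, z_min=0.0, z_max=1.0, t_min=-1.0, t_max=3.0):
    """Map the driver empty's Z position onto the abstract timeline t in [-1, 3].

    The geometry nodes do essentially this linear remap and expose the result
    as a 'timeline' attribute that both the geometry and the shaders can read.
    """
    f = (empty_z - z_min) / (z_max - z_min)
    return t_min + (t_max - t_min) * f
```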

Procedural Geometry and Animation

This section covers a breakdown of how the animations/models are created procedurally. These node trees read slightly differently than shaders, due to the concept of fields, which is explained very well by Erindale. However, in this section I'm generally going to avoid showing or explaining the nodes themselves in detail - or this writeup would be LONG.

Light

1



The general idea behind the light animation is to crumble a plane into a ball, move it upwards, and scale a sphere around it.

2

3



  1. Full Node Tree for Light
  2. Wireframe View (Apologies for the slow GIF)
  3. Solid View
To achieve the crumbling effect, the plane was thought of in polar coordinates (azimuth angle and radius). With the radius as the x-axis and the 3D z-axis as the y-axis, a new 2D coordinate system allowed for a model that could curl in the edges. The crumbled position is set in this 2D plane and rotated back into its 3D position using the original azimuth angle.
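One way to write this coordinate change in Python (the curl math here is illustrative; the actual node network may differ in its details):

```python
import math

def curl_vertex(x, y, z, circle_center, circle_radius):
    """Curl a flat plane's edges by working in a 2D (radius, z) plane.

    circle_center and circle_radius are the two animated parameters: where the
    curling circle sits along the radius axis, and how tight the curl is.
    """
    azimuth = math.atan2(y, x)
    radius = math.hypot(x, y)

    if radius <= circle_center:
        r2, z2 = radius, z                          # inside the circle: stay flat
    else:
        # outside: wrap the remaining length around the circle
        arc = (radius - circle_center) / circle_radius
        r2 = circle_center + math.sin(arc) * circle_radius
        z2 = z + (1.0 - math.cos(arc)) * circle_radius

    # rotate the curled 2D point back into 3D using the original azimuth
    return (r2 * math.cos(azimuth), r2 * math.sin(azimuth), z2)
```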

4

5



  1. Model Animated
  2. Curling Model
Two parameters are being animated here: where the circle (for curving) is placed and how big the circle is. Notice how at the beginning the curvature creates a smaller circle.

6

7



8

9



  1. Rotating earlier 2D model into 3D space
  2. Layering noise
  3. Adding float
  4. Adding sphere
  5. Node Snippet
To complete the crumble look, noise is added in as a displacement. Parameters for the noise are animated, specifically amplitude and distortion. Afterwards, some animated value is added to the Z position and a sphere is scaled to replace the crumbled plane visual.

To give a preview of how this looks up close, the following image shows time being plugged into node setups similar to the ones seen earlier in order to animate geometric parameters.

10

Ice

1



The geometry for the ice already exists (3D modeled separately). The core of the ice animation is cloning instances of the existing ice model and then positioning them. Some noise is added to give the ice a slightly more rigid feeling, as if there is resistance when sprouting up from the ground.

The fragment shader is then in charge of clipping anything below the magic card. (This could also be done with clipping planes in OpenGL at the vertex-processing stage, rather than in the fragment stage as done here.)

  1. Full node tree
  2. Solid View Perspective


2

Flower

1



Unlike the ice geometry, which was rigid and could be cloned, the flower is the complete opposite.

Luckily, every piece has a consistent cross-section (a circle for the stem and a flat, bent circle for the petals and leaf). To make this possible, the geometry that gets animated is a set of curves, which are then modeled into meshes with profile curves.

2



  1. Full Node Network
  2. Flower as curves
  3. Stem Curve Node Tree
There are 4 unique curves (stem, leaf, petal, and antenna). For simplicity, the petal and leaf each have only 1 curve generated, which is then instanced and rotated around the stem's tangent vector.

The stem is fairly simple, made from a simple line curve. A noise texture is used to reposition each control point. Some value X is animated over time; this value can be thought of as an offset for the noise, which allows the noise to change over time in a nonlinear fashion.

3



Adding noise onto the existing position would also offset the point the stem grows from. To correct this, the noise sampled at the bottom position is subtracted from the noise offset, so the base stays anchored. This is why there are two noise texture nodes here, one for each input.
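A rough sketch of the idea, where `noise3d` is a hypothetical stand-in for the Noise Texture node and the strength is a placeholder:

```python
def offset_stem_points(points, t, noise3d, strength=0.1):
    """Offset each control point by noise, anchored so the base stays put.

    `t` is the animated offset added to the noise input, so the noise (and the
    stem's wiggle) changes over time in a nonlinear way. The noise sampled at
    the base point is subtracted so the offset at the root is exactly zero.
    """
    base = points[0]
    base_noise = noise3d(base[0] + t, base[1], base[2])
    out = []
    for (x, y, z) in points:
        n = noise3d(x + t, y, z)
        dx = (n - base_noise) * strength
        out.append((x + dx, y, z))
    return out
```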

As for the leaves, the motion should appear somewhat similar. The curves for this use the exact same 2D model as for curving the plane in the light animation. A cubic bezier curve selects points along this model to produce a smoother version, which is used as the final leaf curve.

4

5



  1. Underlying Curve Model as before
  2. Sampled Bezier Curve
A similar idea works for the petal. One smooth curve is generated, instanced 5 times, positioned, and rotated to its correct final location. The number of animated parameters here is a bit larger, as a full quadratic bezier curve is animated, with 3 animated control points and an animated angle.
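For reference, evaluating a quadratic bezier from its three control points looks like this (the control points below are placeholders, not the animated values):

```python
def quadratic_bezier(p0, p1, p2, t):
    """Evaluate a quadratic bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(u * u * a + 2.0 * u * t * b + t * t * c
                 for a, b, c in zip(p0, p1, p2))

# The petal curve samples this with three animated control points; the whole
# curve is then instanced five times and rotated around the stem's tangent.
points = [quadratic_bezier((0, 0, 0), (0.2, 0, 0.5), (0.6, 0, 0.8), i / 15.0)
          for i in range(16)]
```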

6

  1. Node network animating control points for petal curve
  2. Petal Profile Node setup
Finally (skipping over some steps), the curves are all profiled using a network similar to the following. The profile curve must be a constant shape, but each control point along the original curve contains a radius attribute. This is used to set how large the profile should be when converting to a mesh.

7

Fire

1



Animating the paper for the fire had a similar implementation to the light animation. Some noise displaces the paper for a wavy effect, it floats upward, and the transformation is scaled up to replace the paper.

One difference was that instead of the fire simply scaling up over the magic paper, the paper dissolves from the heat of the fire and flies off in all directions. This effect was achieved by first subdividing the plane once and splitting the mesh along its edges, which resulted in 4 mesh islands. The positions of the vertices in each island were then averaged to produce a (normalized) vector for the direction the island should travel along.
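A small sketch of that direction calculation, assuming the card is centered on the origin so the averaged positions point outward:

```python
import math

def island_directions(islands):
    """For each mesh island (a list of vertex positions), average its vertices
    and normalize the result to get the direction the island flies apart along."""
    dirs = []
    for verts in islands:
        cx = sum(v[0] for v in verts) / len(verts)
        cy = sum(v[1] for v in verts) / len(verts)
        cz = sum(v[2] for v in verts) / len(verts)
        length = math.sqrt(cx * cx + cy * cy + cz * cz) or 1.0
        dirs.append((cx / length, cy / length, cz / length))
    return dirs
```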

2

3



  1. Full Node Tree
  2. Along initial vector
  3. Along animated vector
Since the flame took longer to reach the vertical ends, the paper had to split in half first, before the flame reached far enough to split again. To get this right, an additional animation parameter was added to multiply the X component of the vector noted above.

4

5



  1. Noise added to direction vector
  2. Noise added to vertex position
  3. Torus Curve before profiling
The noise shown in picture 5 is similar to how the paper in the light animation was deformed. This noise is also reused for the expanding torus.

The torus is a circle curve profiled along another circle curve. The radii of both are animated over time. Before the curves are used to generate a mesh, the position of the main (non-profile) curve is offset using the noise mentioned above.
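For reference, a point on a torus built from two circles can be written like this, with both radii as the animated parameters (a generic parameterization, not the exact node setup):

```python
import math

def torus_point(major_radius, minor_radius, u, v):
    """Point on a torus: a profile circle (parameter v, radius minor_radius)
    swept along a main ring (parameter u, radius major_radius).
    Both radii are the values animated over time."""
    x = (major_radius + minor_radius * math.cos(v)) * math.cos(u)
    y = (major_radius + minor_radius * math.cos(v)) * math.sin(u)
    z = minor_radius * math.sin(v)
    return (x, y, z)
```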

6



For creating a mesh shaped like fire, the method I used was initially a byproduct of randomly experimenting with asteroids on a former project. The trick is to use some 3D noise as a displacement on a sphere and to move that noise along the Z axis. You then use the initial Z (up) coordinate to scale how much noise is applied.
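A rough sketch of that displacement, where `noise3d` is a hypothetical stand-in for the Musgrave/noise texture and the strength is a placeholder:

```python
def fire_displace(vertices, normals, t, noise3d, strength=0.3):
    """Displace a sphere into a flame-like shape.

    The noise is sampled at a position that scrolls along the Z axis over time
    (so features appear to rise), and the displacement is scaled by the
    vertex's original Z so the top of the sphere is stretched the most.
    """
    out = []
    for (x, y, z), (nx, ny, nz) in zip(vertices, normals):
        n = noise3d(x, y, z - t)                 # scroll the noise upward over time
        amount = strength * n * max(0.0, z)      # more displacement toward the top
        out.append((x + nx * amount, y + ny * amount, z + nz * amount))
    return out
```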

7

8

9



10

11

12



  1. IcoSphere
  2. Musgrave noise as displacement
  3. Displacement scaling by initial Z position
  4. Scaling Animation and Upward Positioning
  5. Animating the scaling parameter on the 3D noise
  6. Shading
Note the jump between pictures 10 and 11. This was done by animating the scale of the noise, which is internally multiplied with the texture coordinate input. By changing this a little, the space of the noise grows and shrinks, which creates the effect. Because the space itself is multiplied by some factor, this depends heavily on the assumption that all positions are close to zero. If that is not the case, the positions change more dramatically when scaled, resulting in a quicker change in the noise output.

Getting the detail level right for the fire was also quite difficult. There needed to be enough detail that the fire's polygon edges weren't visible, but not so much detail that it became difficult to use a fresnel effect in the fragment shader. Thus, normals are taken from lower-resolution geometry and transferred to the higher-detail geometry.

Rendering

The timeline for this project largely pushed rendering toward the very end. The majority of the time was spent making each VFX look great in realtime, and not until each effect was done did I thoroughly explore composition, layout, camera animation, background, etc. Rendering therefore happened in a short enough window that documenting a comprehensive outline of this iterative process was very easy.

Here I've compiled a video with some of the renders over the entire process. Below are sections on the takeaways per render, mostly on what could be improved.

First Render

Takeaways:

Second Render

Takeaways:

Third Render

Takeaways:

Fifth Render

After the fifth render, I was fairly happy with the effects and started working on camera animation and turning this into an actual cohesive video. I also added a soft gradient to the background.

I really liked the spin and circular framing, which seemed to add anticipation from one effect to the next.

I then published this version to YouTube, because I thought I had hit the ceiling of what I could do. But I still wasn't very happy with the result - each animation individually had a lot of feeling and was very exciting to see come together, yet as a whole there wasn't much feeling in it. The context of each animation in the environment as a whole was dull. Simply playing the animation wasn't enough; there had to be a reason or context for it to play in.

In the art book for Hayao Miyazaki's Howl's Moving Castle, color designer Michiyo Yasuda mentions that she changed the color hues in every scene according to the emotional development. I thought about the nearly grey background here and wondered whether those colors were truly right - they had at first seemed favorable for a demo/portfolio piece.

Light, flowers, and fire all have at least some essence of warmth or life - a warm background and higher-contrast lighting should help bring that feeling out. I then tried to incorporate common color schemes from the original animated series into the background.

I also decided to add a dial which would move with the animations and add some narrative.

Eighth Render

During this render I had already begun working on the next iteration, making the shader for the blinking eye. This render was to test out color/lighting on the background, timing of the changes/dial switch, framing, and such.

Takeaways:

Ninth + Tenth Renders

Takeaways: