NESTED RIG ANIMATION
I’ve been enjoying Unity’s Animation Rigging package, but I’ve found that as a given setup gets more complex, the rigging constraints themselves start to stand out as candidates for being animated.
This is especially true when building something that's mostly procedural, where the rig is driving the majority of motion.
In order to push a rig between different states, you might want to blend 10 different weight values, keyframe multiple IK targets and hints, etc.
Interestingly, it is actually possible to have a parent animator driving an armature, and a child animator within its hierarchy driving a rig, which in turn targets that armature.
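As a minimal sketch of how that nesting could be used in practice (the component and field names here are illustrative, not from the original setup): a child Animator sits on the rig hierarchy and plays clips that keyframe the constraint weights and IK targets, so pushing the rig between states becomes one cross-fade instead of tweening ten values by hand, while the parent Animator keeps driving the armature underneath.

```csharp
using UnityEngine;

// Hypothetical helper for the nested-animator setup described above.
// Assumes "rigAnimator" is a child Animator whose clips keyframe the
// Animation Rigging constraint weights, IK targets, and hints on the rig.
public class RigStateDriver : MonoBehaviour
{
    [SerializeField] Animator rigAnimator; // child animator on the rig hierarchy

    public void EnterState(string stateName, float blendTime = 0.25f)
    {
        // CrossFade blends every animated weight / target in the target
        // state's clip at once, rather than lerping each value manually.
        rigAnimator.CrossFade(stateName, blendTime);
    }
}
```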
BLENDER’S NLA EDITOR, IN UNITY
Originally the grass in BOG was prototyped using a procedural mesh (above), built from an array of data points that responded to events in the game. This worked well enough as a prototype, but I’d always wanted to move more of the workload over to the GPU.
I eventually converted that mesh over to a geometry shader that could expand all of the grass's aesthetic behaviour (bend direction & amount, length, rotation, colour, etc.), and rebuilt the system that had handled the grass to instead be driven by texture data (first visual test below).
With Unity’s compute shaders I could compare frames of texture data and output changes between them as a buffer, which could then be used to generate new visuals (grass being cut, or disturbed, or bent past a certain threshold, etc).
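A rough sketch of the dispatch side of that frame-compare step, assuming the original approach (the kernel, texture, and buffer names here are placeholders): the compute shader diffs the previous and current grass-state textures and appends any changed texels to an append buffer, which the visual effects then consume.

```csharp
using UnityEngine;

// Hypothetical dispatcher for the frame-compare pass.
// Assumes a compute kernel "CompareFrames" that appends the
// coordinates of changed texels to the "_Changed" buffer.
public class GrassDiffDispatcher : MonoBehaviour
{
    [SerializeField] ComputeShader diffShader;
    [SerializeField] RenderTexture previousFrame, currentFrame;

    ComputeBuffer changedCells; // append buffer of changed texel coords
    int kernel;

    void Start()
    {
        kernel = diffShader.FindKernel("CompareFrames");
        changedCells = new ComputeBuffer(
            currentFrame.width * currentFrame.height,
            sizeof(uint) * 2, ComputeBufferType.Append);
    }

    void Update()
    {
        changedCells.SetCounterValue(0); // reset the append counter
        diffShader.SetTexture(kernel, "_Previous", previousFrame);
        diffShader.SetTexture(kernel, "_Current", currentFrame);
        diffShader.SetBuffer(kernel, "_Changed", changedCells);
        diffShader.Dispatch(kernel,
            currentFrame.width / 8, currentFrame.height / 8, 1);
        // _Changed now holds the texels that changed state this frame,
        // ready to drive cut / disturbed / bent effects.
    }

    void OnDestroy() => changedCells.Release();
}
```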
This allowed me to recreate those original states and state changes with a huge performance gain, while also making it faster and more visually intuitive to iterate on those effects.