Gavit

The Gameplay Visualization Toolkit was an Unreal Engine 3 project aiming to make prototyping, directed gameplay video and machinima authoring easier. The system was an extension of the engine’s infrastructure and assumed an intermediate knowledge of UnrealEd. The goal was to empower users so they could show an idea without involving programmers. Since everything was done inside the editor, the reduced turnaround time encouraged experimentation.


At its core, Gavit is a remapping system which links input to actor properties: for example, “mouse forward controls the actor’s position on the X axis”. The incoming input can come from a human or from a replayed, pre-recorded input data set. (The latter feature is basically loop recording for game events.) The framework consists of classes, Kismet nodes, external applications, materials and other art assets.
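
As a rough sketch of the remapping idea (conceptual only, not the actual UnrealScript classes; the names below are invented):

```python
# Conceptual sketch: a remapping entry turns "mouse forward controls the
# actor's position on the X axis" into data instead of code.

from dataclasses import dataclass

@dataclass
class Remap:
    input_axis: str       # e.g. "MouseY" (mouse forward/back)
    target_actor: str     # name of the actor to drive
    target_property: str  # e.g. "Location.X"
    scale: float = 1.0    # how far the property moves per input unit

def apply_remaps(remaps, input_state, actors):
    """Push the current input values onto the mapped actor properties."""
    for r in remaps:
        value = input_state.get(r.input_axis, 0.0) * r.scale
        actors[r.target_actor][r.target_property] = value

# "Mouse forward controls Actor position on X axis"
remaps = [Remap("MouseY", "Ball_0", "Location.X", scale=10.0)]
actors = {"Ball_0": {"Location.X": 0.0}}
apply_remaps(remaps, {"MouseY": 0.5}, actors)   # Ball_0 now sits at X = 5.0
```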

Gavit logo

While working on this self-funded project I had the pleasure of working with James Tan, Sam Evans, Rachel “Angelmapper” Cordone, Luke “Ambershee” Parkes-Haskell, Eric Blade, Danny Meister and Wormbo. I learned much from them.

I had artists helping out with character modeling and rigging: Josh Stoker, Simon Mills and Bojana Nedeljkovic. Special thanks to Brad Clark for his guidance while I was learning MotionBuilder.

Full UDK Project (Extract to UDK folder. All code and assets are licensed under GNU GPL v3.0)

Technical documentation

Game prototypes

Mallet pinball

This prototype was a riff on classic pinball mechanics. The layout is loosely based on the Billion Dollar Gameshow table from Pinball Fantasies.

The flipper and mallet controls are both assembled in the level without needing any specific code. The PhysX implementation in the engine couldn’t handle high-speed collisions, so I had to play the game at 50% speed. Running a screen capture app in the background to produce a video made it even worse; my PC at the time just couldn’t handle the load. This bumped up the priority of the input recording features, which in the end allowed me to capture the action with no dropped frames.


The first step was to record player input and the transformations of physics actors. That data was dumped to a text file, then converted into Matinee data for Kismet. When replayed, that Matinee provided a stream of inputs (replacing live input from a user) in sync with the now-animated rigid bodies. The result was a perfect recreation of an earlier play session with higher performance.
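
A minimal sketch of that pipeline, assuming a simple per-frame text dump (the real tool converted the dump into Matinee/Kismet data; the format and names below are illustrative):

```python
import json

def capture_frame(t, input_state, physics_actors, out_file):
    # One line of text per frame: the raw input plus every tracked transform.
    sample = {
        "time": t,
        "input": dict(input_state),
        "actors": {name: actor["transform"] for name, actor in physics_actors.items()},
    }
    out_file.write(json.dumps(sample) + "\n")

def to_tracks(dump_path):
    # Convert the dump into per-actor keyframe tracks (one key per sample),
    # the shape of data a Matinee-style interpolation track needs.
    tracks = {}
    with open(dump_path) as f:
        for line in f:
            sample = json.loads(line)
            for name, transform in sample["actors"].items():
                tracks.setdefault(name, []).append((sample["time"], transform))
    return tracks
```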

Winter Volleyball

The base mechanic for this prototype was inspired by the old DOS game Arcade Volleyball, which I played a lot back in the mid ’90s. The controls utilize the Razer Hydra motion controller.

Each snowman is controlled by one of the Hydra controllers: tilting a controller sideways makes its snowman lean back and forth, while swinging it horizontally makes the snowman slide. The Hydra input data is streamed from an external application running in the background (the Hydra bridge) to avoid initializing the device every time a PIE session is started. The GVMachine actor in the level listens to that input and converts it into property changes on the given actors.
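
The bridge side boils down to something like this (the transport and message format here are my own assumptions for illustration, not the actual Hydra bridge protocol):

```python
import json
import socket
import time

def run_bridge(poll_controllers, host="127.0.0.1", port=5555, hz=60):
    # Poll the device in a separate process and stream its state over local
    # UDP, so the editor side never has to initialize the hardware when a
    # PIE session starts.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        state = poll_controllers()   # e.g. {"left": {"tilt": 0.2, "swing": -0.5}, ...}
        sock.sendto(json.dumps(state).encode("utf-8"), (host, port))
        time.sleep(1.0 / hz)
```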

Technical prototypes

Multi-pass performance recording and replay

In Gavit, user actions can be captured and turned into a Matinee using an external application. The following video was a stress test for the system: by the end, all 16 balls had been controlled by me using an Xbox 360 controller. In each take I took over a single ball and moved it around, avoiding the previously controlled balls, which moved along their captured paths.

The recording data was saved as a JSON file at 60 Hz, which produced clip files of 500 KB for each second of capture. The recording system didn’t care how the input was generated (live or replay), so already captured input was recorded again each time. (It is possible to take over control of replayed components at any time.) There is a slight drift due to floating-point imprecision (the first ball’s data was re-recorded 15 times), but it’s not significant in practice.
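
Conceptually the recorder looks like this (structure and names are hypothetical): every ball’s input source is just a callable, whether it reads a live controller or replays a previous take, and all of them get sampled and written out again on every pass.

```python
def run_take(sources, rate_hz=60, duration=10.0):
    """sources: {ball_name: callable(t) -> input sample}, live or replayed."""
    tracks = {name: [] for name in sources}
    for i in range(int(duration * rate_hz)):
        t = i / rate_hz
        for name, source in sources.items():
            tracks[name].append((t, source(t)))   # re-recorded even if it came from a replay
    return tracks

# Take 2: ball_0 replays its captured track while ball_1 is driven live.
# replay() and read_x360_stick() are hypothetical helpers.
# sources = {"ball_0": replay(previous_tracks["ball_0"]), "ball_1": read_x360_stick}
```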

Guided rigid bodies

Beyond recording inputs, Gavit can record rigid body animation. Sampling the transformations is the simple part; replaying the animation is a bit trickier: pure interpolation actors won’t produce the slide and impact effects defined in the physical materials. To work around this, the active rigid bodies are constrained to movers, which are the ones replaying the recorded data: the movers guide the rigid bodies, automatically correcting any random deviation and thus keeping them on track. Since the KActors are still live, they produce the expected collision events.
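
A toy one-dimensional version of the guiding constraint (not engine code) looks like this: the mover follows the recorded positions exactly, and a spring-like pull corrects the live body toward it each step.

```python
class Body:
    """A minimal stand-in for a live rigid body (1-D, no gravity)."""
    def __init__(self, x=0.0, v=0.0):
        self.x, self.v = x, v

    def integrate(self, dt):
        self.x += self.v * dt

def step_guided(body, recorded_x, dt, stiffness=20.0):
    error = recorded_x - body.x        # mover (replayed position) minus live body
    body.v += stiffness * error * dt   # constraint pulls the body back on track
    body.integrate(dt)                 # the body still simulates, so collisions fire
```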

When a collision falls between two samples it is missed during replay, because the animation data never brings the actors close enough to the collided surface. There are two possible fixes (using physics events, or correcting missed collisions in data post-processing), but neither got implemented.

Unreal gameplay rendered in Modo

When the actor recording feature was completed I thought it would be fun to transfer that data to Modo and render an animation. I made a simple map and captured my performance to tell a short story.

The movement of the camera and the physics-driven objects was sampled at 20 Hz. To keep the data output in check, triggers were placed in the map to control which actors are captured, silencing irrelevant ones. The final animation is three different sessions merged together.
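
The trigger gating amounts to something like this (names are hypothetical): only actors whose gate is currently open get sampled, which keeps the 20 Hz dump small.

```python
def sample_scene(t, actors, gates, tracks):
    """gates: {actor_name: bool}, toggled by triggers placed in the map."""
    for name, actor in actors.items():
        if gates.get(name, False):                       # skip silenced actors
            tracks.setdefault(name, []).append((t, actor["transform"]))
```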