aiSim is a simulator for self-driving cars. The ability to recreate real-world situations and create unique scenarios supports the safe development of autonomous vehicle technology. Virtual testing helps ensure that the final product can adapt to different environments. Scenarios based on situations that are common in aggregate but rarely experienced by any single driver allow engineers to evaluate new solutions before real-world road tests. To support artificial intelligence research and training, aiSim can also generate machine learning datasets.

[Image: aiSim screenshot HighresScreenshot00011]

I was hired as an Unreal expert when work started on aiSim, and for the first few months I trained the team to use Unreal Engine 4. They learned the ropes quickly, so I transitioned into a more traditional technical artist role.

Our initial art asset workflow was designed for production speed, to provide the AI research teams with as much data as possible, as soon as possible. That meant most assets didn't have a unique texture set but instead used several material definitions from a library we built. Those materials mostly came from Substance and were applied to the meshes in Unreal. The result was 1-5 draw calls per mesh, which was acceptable within the limited scope of the early levels, which covered only a few city blocks.


As aiSim was extended with multi-camera capabilities, rendering performance became an issue, so LOD generation and per-instance max draw distance settings became a necessity. We also had to create much larger areas, which slowed down content creation even further. The biggest time sink on the modeling side was the manual reproduction of roads, while level designers spent inordinate amounts of time setting all the metadata expected by the AI trainer team.
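Unreal handles the actual culling internally; as a rough illustration of what a per-instance max draw distance buys, here is a minimal pure-Python sketch (the names and the "0 means never cull" convention are chosen for the example, not taken from Unreal's API):

```python
import math

def visible_instances(camera_pos, instances):
    """Cull instances whose distance to the camera exceeds their
    per-instance max draw distance (0 means 'never cull')."""
    result = []
    for pos, max_draw in instances:
        dist = math.dist(camera_pos, pos)
        if max_draw == 0 or dist <= max_draw:
            result.append((pos, max_draw))
    return result

# Small props get short draw distances; large landmarks keep 0 (infinite).
instances = [((0, 0, 0), 500.0),  # small prop: cull beyond 500 units
             ((0, 0, 0), 0.0)]    # building: always drawn
print(len(visible_instances((10000, 0, 0), instances)))  # only the building survives
```

Tuning these distances per asset class (trash cans vs. buildings) is what made multi-camera rendering affordable on large levels.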

[Image: aiSim OpenDriveLogo]

When aiSim reached a "feature complete for now" state, we, the tech artist duo, set out to solve the biggest issue with our content creation pipeline: creative people doing donkey work. We wanted to automate every repetitive task, from UVing to import/export file juggling, so we picked Houdini as the backbone of the new workflow.


There are two main parts to our Houdini toolset:

– OpenDrive-compatible scene components, which allow the creation of abstract representations of road networks and related objects.

– A set of mesh generators, which take those abstract descriptions of the different elements and produce polygons for them.
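OpenDrive describes a road's reference line as a chain of plan-view segments (lines, arcs, spirals), and a mesh generator samples points along them before sweeping a profile. A minimal sketch of line and arc sampling, using the standard closed-form arc formula (spirals omitted; function names are my own):

```python
import math

def sample_line(x0, y0, hdg, s):
    """Point at arc length s along a straight plan-view segment."""
    return x0 + s * math.cos(hdg), y0 + s * math.sin(hdg)

def sample_arc(x0, y0, hdg, curvature, s):
    """Point at arc length s along a constant-curvature arc segment."""
    theta = hdg + curvature * s                        # heading at s
    x = x0 + (math.sin(theta) - math.sin(hdg)) / curvature
    y = y0 + (math.cos(hdg) - math.cos(theta)) / curvature
    return x, y

# A quarter circle of radius 10, starting at the origin heading east:
s = (math.pi / 2) * 10                 # arc length of the quarter turn
x, y = sample_arc(0.0, 0.0, 0.0, 1.0 / 10.0, s)
# ends at roughly (10, 10) after the left turn
```

Sampling the reference line densely enough, then offsetting by lane widths, gives the point cloud the polygon generators extrude into road geometry.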

Editor tools

To aid in debugging performance issues, I made a tool for building and visualizing a heat map of different rendering stats.


First, the level is traversed by an editor mode blueprint: it finds the important areas and places measurement points there. The result is written to a file and used in a special game mode. That game mode places the camera at each measurement point and saves game, draw, and GPU timings for each of 8 directions. The stats are saved and read back by a visualization editor blueprint, which creates indicator meshes in the level:
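The measurement loop itself is straightforward; a sketch of the idea in Python, where `sample_stats` is a hypothetical callback standing in for reading Unreal's stat counters at a given camera pose:

```python
def measure_level(points, sample_stats):
    """At each measurement point, face 8 compass directions (45 degrees
    apart) and record the timings reported by sample_stats(point, yaw)."""
    results = {}
    for point in points:
        for direction in range(8):
            yaw = direction * 45.0
            results[(point, yaw)] = sample_stats(point, yaw)
    return results

# A fake stat sampler returning constant timings, for illustration:
fake = lambda p, yaw: {"game_ms": 5.0, "draw_ms": 3.0, "gpu_ms": 8.0}
stats = measure_level([(0, 0, 0), (100, 0, 0)], fake)
print(len(stats))  # 2 points x 8 directions = 16 entries
```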

The indicator cells have different modes:

  • Chosen stat in the direction of the viewport camera
  • Worst stat in the direction of the viewport camera
  • Worst direction for chosen stat
  • Worst stat of worst direction

The colors correspond to a freely adjustable min-max time range.
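The mapping is a simple clamped ramp; a sketch of a green-to-red gradient over an adjustable min-max range (illustrative, not the actual material code):

```python
def stat_to_color(value_ms, min_ms, max_ms):
    """Map a timing onto a green-to-red heat map color over an
    adjustable min-max range, clamping values outside it."""
    t = (value_ms - min_ms) / (max_ms - min_ms)
    t = max(0.0, min(1.0, t))
    return (t, 1.0 - t, 0.0)  # (R, G, B): green = fast, red = slow

print(stat_to_color(8.0, 8.0, 16.0))   # at the fast end: fully green
print(stat_to_color(20.0, 8.0, 16.0))  # past the slow end: clamped to red
```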

The cells are instanced static meshes sharing the same material, so thousands can be displayed without problems. I used each instance's scale to push extra data (the stats) into the material. Those values are used to change text, color, and arrow direction, while the instance's scale is compensated for in the vertex shader.
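The scale trick can be illustrated outside the shader: the instance transform multiplies vertices by the stat value, and the vertex shader divides it back out, so the geometry stays a constant size while the material still sees the raw scale as its data input. A tiny sketch of that round trip (invented names, not HLSL):

```python
def render_cell(vertex, stat_ms):
    """The instance is scaled by the stat value; the vertex shader
    divides that scale back out, so geometry size is constant while
    the material reads the scale to drive text, color and arrows."""
    scale = stat_ms                                 # scale channel carries the data
    world = tuple(v * scale for v in vertex)        # what the instance transform does
    compensated = tuple(w / scale for w in world)   # what the vertex shader undoes
    return compensated, scale

pos, stat = render_cell((1.0, 2.0, 0.5), 7.5)
print(pos, stat)  # geometry unchanged, stat recovered from the transform
```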

Another editor side visualizer I made is for displaying metadata associated with splines:

Strings, ID numbers, spline directions, and inter-spline relations are shown. The meshes and UMG widgets are managed by an editor ticker blueprint, which factors in the viewport camera's location and orientation, the nearest splines, and their properties.

We also have an editor tool blueprint for locating collision problems. When fired, it traces for colliders in a circle, increasing the radius every frame:
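The expanding search can be sketched in a few lines; here the per-frame trace is reduced to a distance test against known collider positions (a stand-in for Unreal's actual trace calls):

```python
import math

def find_nearest_collider(origin, colliders, step=50.0, max_radius=5000.0):
    """Grow a search circle by `step` per frame until a collider falls
    inside it, mirroring the editor tool's per-frame radius growth."""
    radius = step
    while radius <= max_radius:
        hits = [c for c in colliders if math.dist(origin, c) <= radius]
        if hits:
            return min(hits, key=lambda c: math.dist(origin, c)), radius
        radius += step
    return None, radius  # nothing within max_radius

hit, r = find_nearest_collider((0.0, 0.0), [(120.0, 0.0), (400.0, 300.0)])
print(hit, r)  # the closer collider, found once the circle reaches it
```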


The most complex material in the simulator, by far, is the road material. Beyond mixing textures that capture the road surface at different scales, it also reacts to changing weather. Our environment manager system controls time of day, time of year, location on Earth, temperature, wind, and so on.
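The exact parameters the road material reads are not covered here, but as a hypothetical illustration, an environment manager's raw inputs could be reduced to the wetness/snow values a weather-reactive material consumes (the function name and thresholds below are invented for the example):

```python
def road_surface_state(rain_mm_last_hour, temperature_c):
    """Hypothetical reduction of environment manager inputs to the
    parameters a weather-reactive road material would read."""
    wetness = min(1.0, rain_mm_last_hour / 10.0)   # saturate at heavy rain
    if temperature_c <= 0.0:
        return {"wetness": 0.0, "snow": wetness}   # precipitation falls as snow
    return {"wetness": wetness, "snow": 0.0}

print(road_surface_state(5.0, 20.0))   # moderate rain, warm: wet asphalt
print(road_surface_state(5.0, -3.0))   # same precipitation, freezing: snow cover
```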

I also modified the Speed Tree materials so they handle the seasonal color change and loss of foliage. The wheel tracks are either placed as a special tiling noise or can be based on recordings of actual vehicles driving on the road.

To add random potholes, oil spills, dirt and other details to the road I created a texture bomber:

Cell size, rotation, random seed, and density can be adjusted. The cell contents are picked from a texture atlas. Parallax and normal mapping work properly even if a cell's content is rotated.
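The core of any texture bomber is a deterministic per-cell decision: the same seed and cell coordinates must always yield the same decal, or the road would shimmer between frames. A sketch of that logic (the real version lives in the material, not Python):

```python
import random

def bomb_cell(cell_x, cell_y, seed, atlas_tiles, density):
    """Seeded per-cell decision: whether the cell gets a decal, which
    atlas tile, and a random rotation - stable for a given seed."""
    rng = random.Random(hash((seed, cell_x, cell_y)))
    if rng.random() > density:
        return None                       # empty cell
    tile = rng.randrange(atlas_tiles)     # which pothole/oil spill/dirt tile
    rotation = rng.uniform(0.0, 360.0)    # decal rotation inside the cell
    return tile, rotation

# The same seed and cell always produce the same decal:
a = bomb_cell(3, 7, seed=42, atlas_tiles=16, density=0.5)
b = bomb_cell(3, 7, seed=42, atlas_tiles=16, density=0.5)
print(a == b)  # True
```

Raising `density` fills more cells; changing `seed` rerolls the whole pattern at once.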

[Image: aiSim screenshot HighresScreenshot00078]