Kobii

VR Visualization + Animation Design
VR evacuation experiences of tsunami simulation results
Keywords: VR visualization, experience design, CG animation design, large-scale data, supercomputer, numerical simulation, particle/fluid simulation, evacuation simulation, climate change, disaster prevention
Professional Project in Japan
  • Time: Oct. 2016 - Mar. 2017
  • Skills: 3D modeling, CG animation, Python scripting
  • Software: Blender, Cinema 4D, Unity, After Effects, Illustrator, Photoshop, RealFlow, Sublime Text
  • Team: 2 developers (numerical simulation of tsunami) + Me (3D modeling / visualization / animation / VR environment building)
1. Project Brief
Kobii is a research project that created VR experiences to build people's awareness of climate change, in collaboration with Kyushu University and Supercomputer "Kei". I was in charge of the visualization. Numerical simulation results of a tsunami in Kobe City, Japan were calculated on Supercomputer Kei, and the large-scale data was transferred to the visualization and design software Cinema 4D and Blender. Time-based simulation results were visualized to show the exact location of the tsunami after its occurrence at six designated places in Kobe City. I prepared this virtual environment so that people could experience realistic tsunami evacuation using VR devices and omni-directional treadmills. My visualization work was chosen for publication in Supercomputer Kei's official pamphlet.
As a result, I was invited to lecture at the International Disaster Prevention Technology Communication Conference in Colombia in 2017 and 2018, teaching VR visualization. I instructed local government workers and students in Cartagena and Tumaco on using Blender to visualize tsunami simulation results. This was a two-day workshop in which I taught the principles, gave demonstrations, conducted one-on-one critiques, and solved students' problems. The result was acclaimed in both countries, and the program greatly encouraged communication between people from Japan and Colombia.
2. Project Scope
The project had three phases. The first phase was calculating the tsunami simulation results. I was in charge of the second phase, visualizing the tsunami as animations, and I brought the 3D models into Unity to create VR environments. The third phase, the VR experiences, has started and is ongoing. We also work with several research teams at well-known universities in Japan, and I had the chance to experience the scenes at Kyushu University with their advanced VR device.
Three Phases of the Project
A big challenge of this project was scale. Because the simulation covered an area of 4 square kilometers in Kobe City, the generated results were huge. We collaborated with Supercomputer Kei to calculate the tsunami simulation, and I used these results to render the tsunami animations in Blender on 4 computers, which took more than 3,144 hours of rendering time.
Project Scope
3. Design Process
There were four main steps in creating the tsunami animation: 3D modeling of the city, creating materials and textures for the terrain, buildings, and water, creating a lighting environment, and parallel rendering with Python scripts.
Working Steps

3D Modeling of the City

I received a landscape file (GIS data) from our client with rough geometries of the area. I converted the file to .stl format and cleaned up the meshes in Blender. However, many detailed areas were missing, so I created 3D models of these buildings using open-source GIS data as reference.
Cleaned Landscape File
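As an illustration of this cleanup step, here is a minimal sketch in Blender's Python API (bpy); the file path is hypothetical, and the STL importer add-on is assumed to be enabled.

```python
# A minimal sketch of the STL import and mesh cleanup step in Blender.
# The file path is illustrative.
import bpy

bpy.ops.import_mesh.stl(filepath="/data/kobe_landscape.stl")
obj = bpy.context.selected_objects[0]  # the importer leaves the new mesh selected

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.remove_doubles(threshold=0.001)        # merge duplicate vertices
bpy.ops.mesh.normals_make_consistent(inside=False)  # recalculate face normals outward
bpy.ops.object.mode_set(mode='OBJECT')
```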

Materials & Textures

For the land, the GIS data was converted to an .stl file in Blender. A satellite map, used as the texture, was adjusted and applied to the land with box projection. The problem was that, due to the limited resolution of the existing satellite imagery, the texture only held up down to a level of about 5 meters. At wide angles the rendering looked fine, but when the camera zoomed in, the images lost clarity at some point.
Apply Textures to the Map
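For reference, below is a hedged sketch of setting up box projection with Blender's Cycles node API; the image path and material name are illustrative.

```python
# A sketch of applying the satellite image to the terrain with box projection.
import bpy

mat = bpy.data.materials.new("LandSatellite")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images.load("/textures/kobe_satellite.png")
tex.projection = 'BOX'      # project the image along the axes instead of UV-unwrapping
tex.projection_blend = 0.2  # soften seams between projection directions

coords = nodes.new("ShaderNodeTexCoord")
bsdf = nodes.new("ShaderNodeBsdfDiffuse")
out = nodes.get("Material Output") or nodes.new("ShaderNodeOutputMaterial")

links.new(coords.outputs['Object'], tex.inputs['Vector'])
links.new(tex.outputs['Color'], bsdf.inputs['Color'])
links.new(bsdf.outputs['BSDF'], out.inputs['Surface'])
```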
For texturing the buildings, there were two methods. One was designed for faraway buildings: a variety of building textures was randomly distributed across these buildings with algorithms. I controlled the distribution and adjusted the settings for each area so they looked realistic from different camera angles.
Textures for Faraway Buildings
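The distribution itself can be scripted. Below is a hedged sketch assuming a pool of pre-made facade materials and a naming convention for the distant buildings; both names are hypothetical.

```python
# A sketch of randomly distributing facade materials over faraway buildings.
import bpy
import random

random.seed(42)  # a fixed seed keeps the distribution repeatable across renders

facade_materials = [m for m in bpy.data.materials if m.name.startswith("Facade_")]

for obj in bpy.data.objects:
    if obj.type == 'MESH' and obj.name.startswith("Building_far_"):
        obj.data.materials.clear()
        obj.data.materials.append(random.choice(facade_materials))
```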
The other method was created for detailed buildings at designated camera angles. Textures were collected either on open-sourced GIS system or by taking photos at local locations. I took several field trips to Kobe City to collect photos and data.
Textures for Detailed Buildings
For the water material, I tested several methods and chose the one that delivered the most realistic look when rendering the animations. Water may be the most difficult material I have ever rendered: the challenge was applying a material to a surface that also conveys the volume underneath. There were existing methods using particle systems and displacement to simulate water, but these assume a single, fixed surface. This project had to visualize a different surface, the time-based simulation result, for every frame, which made it much harder to create a realistic material.
Water Material
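As one example of the kind of setup that could be tested (the exact shader used is not documented here, so this is an assumption), a glass surface combined with volume absorption lets the rendered water darken with depth; all values are illustrative.

```python
# A hedged sketch of one possible water material: glassy surface + volume absorption.
import bpy

mat = bpy.data.materials.new("TsunamiWater")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

glass = nodes.new("ShaderNodeBsdfGlass")
glass.inputs['Color'].default_value = (0.6, 0.75, 0.8, 1.0)
glass.inputs['Roughness'].default_value = 0.05
glass.inputs['IOR'].default_value = 1.33  # refractive index of water

absorb = nodes.new("ShaderNodeVolumeAbsorption")
absorb.inputs['Color'].default_value = (0.2, 0.45, 0.5, 1.0)
absorb.inputs['Density'].default_value = 0.3  # deeper water renders darker

out = nodes.new("ShaderNodeOutputMaterial")
links.new(glass.outputs['BSDF'], out.inputs['Surface'])
links.new(absorb.outputs['Volume'], out.inputs['Volume'])
```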

Renderings

The biggest challenge of rendering was the size of the calculated simulation results. Because the files were very large, it was nearly impossible to import them all onto the timeline and render. To solve this, I developed an import-render-delete process with a Python script: for each frame, I imported the corresponding file, rendered the scene, deleted the file, and moved on to the next frame. This worked well, and I was able to render the large-scale tsunami simulation on ordinary computers.
Rendering Method
Rendering Scripts
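A minimal sketch of that import-render-delete loop, assuming one .stl water surface per frame and a pre-built water material; the paths and names are illustrative.

```python
# A sketch of the per-frame import-render-delete loop in Blender's Python API.
import bpy

scene = bpy.context.scene

for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)

    # 1. Import only this frame's simulation surface.
    bpy.ops.import_mesh.stl(filepath="/sim/water_%04d.stl" % frame)
    water = bpy.context.selected_objects[0]
    water.data.materials.append(bpy.data.materials["TsunamiWater"])

    # 2. Render the frame to its own file.
    scene.render.filepath = "/renders/frame_%04d.png" % frame
    bpy.ops.render.render(write_still=True)

    # 3. Delete the surface and its mesh data so memory use stays flat.
    mesh = water.data
    bpy.data.objects.remove(water, do_unlink=True)
    bpy.data.meshes.remove(mesh)
```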
To finish the large-scale rendering work on schedule, I incorporated a parallel rendering method using four computers simultaneously. Nine cameras were set up to render designated locations in the city.
Parallel Rendering & Camera Angles
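Splitting the work is then a matter of giving each machine its own frame range and camera. Below is a hedged sketch of the dispatch side, run headless as e.g. blender -b kobe.blend -P render_loop.py -- --start 1 --end 500 --camera Camera_03 (the flag names and camera names are hypothetical).

```python
# A sketch of parameterizing the render loop per machine via command-line flags.
import sys
import argparse
import bpy

# Blender passes everything after "--" through to the script untouched.
argv = sys.argv[sys.argv.index("--") + 1:] if "--" in sys.argv else []

parser = argparse.ArgumentParser()
parser.add_argument("--start", type=int, required=True)
parser.add_argument("--end", type=int, required=True)
parser.add_argument("--camera", default="Camera_01")
args = parser.parse_args(argv)

scene = bpy.context.scene
scene.camera = bpy.data.objects[args.camera]  # one of the nine city cameras
scene.frame_start = args.start
scene.frame_end = args.end
# ...then run the same import-render-delete loop over this sub-range.
```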
Below are some examples of the renderings. Citizens can visually get an idea of where and when the water will approach, and can experience the tsunami simulation in a realistic virtual environment.
Rendering Examples
Below is an animation sample showing how the tsunami approached Kobe City.
4. VR Experiences
After creating the 3D model of the city and the tsunami animations, I brought the model into Unity to create an interactive environment. Limited by the VR facilities, I only had one chance to experience it, at Kyushu University. The project is still ongoing, and in the future residents will have the opportunity to experience tsunami evacuation with friends and family.
The concept is to create a VR environment where people can experience disasters and get a better idea of how to evacuate and how to act responsibly and safely. With motion capture and an omni-directional treadmill simulator (such as the Virtuix Omni), people can join the game, run around the city, and experience a realistic tsunami disaster. They can communicate with others experiencing the simulation simultaneously, which helps them develop teamwork in emergencies. This will also help raise their awareness of protecting our planet.
VR Experiences
Below is an animation sample showing this concept.