I was asked by long-time collaborator Zoë Keating if I could help out with visuals for her cello performance at the WEF in Davos, Switzerland.
She was asked to close the conference with a performance for which she composed a new song. The closing was given the theme of "Leap of faith". She had seen the Oculus Rift: Gravity project I posted and thought that the global view of our planet would make a fitting backdrop for the composition.
The final video can be viewed at the bottom of this post or you can see it on Vimeo here: https://vimeo.com/85388684
Although I already had a nice working interactive globe, I decided to rework it from scratch. One issue I was excited to address was texture size. I recently purchased a new MacBook Pro, which afforded me the opportunity to explore larger texture sizes. My previous laptop could handle 8k textures ( 8,192 x 4,096 ) but would begin to glitch if I asked it to work with 16k textures ( 16,384 x 8,192 ), despite my video card being listed as 16k-texture friendly. I didn't want to explore options which split the textures into more manageable pieces that load in dynamically ( like Google Maps/Earth ); that would be no small feat. So I was thrilled to see that my new laptop could handle 16k textures without any issue. This allowed me to push the camera closer to the surface without too much texture degradation from upsampling. Some of the texture options from the NASA Blue Marble/Visible Earth site go as high as 42k so there is definitely some room for improvement.
This project is very simple despite the complex-looking result. The glory really goes to NASA for providing amazing textures at huge resolutions; the textures get most of the credit on this one. The geometry is simply 3 nested spheres inside of a much larger sphere used for the star field. Nearly all the heavy lifting is done by a few GLSL shaders.
The three nested spheres are:
1) Earth surface: This sphere is for the diffuse colors of the Earth surface blended with the city lights nighttime view. As the sun (point light) rotates around the Earth, I used the diffuse lighting on the sphere to mix between the diffuse map and the nighttime map. So in short, it is just a textured sphere that blends between two textures along the terminator. I also use the elevation texture provided by NASA (wow, this texture is available at 86.4k resolution). I use it to calculate the topographical normals for use in the lighting calculations.
Earth surface texture: http://visibleearth.nasa.gov/view.php?id=74167
Earth elevation texture: http://visibleearth.nasa.gov/view.php?id=73934
City lights texture: http://earthobservatory.nasa.gov/Features/NightLights/page3.php
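The day/night blend on the surface sphere can be sketched as a fragment shader like the following. The texture and uniform names (and the terminator softness) are my assumptions, not the project's actual code:

```glsl
uniform sampler2D uDayTex;    // Blue Marble diffuse texture
uniform sampler2D uNightTex;  // city lights texture
uniform vec3 uSunDir;         // normalized direction toward the sun

varying vec3 vNormal;
varying vec2 vTexCoord;

void main()
{
    // standard Lambert term; negative values mean the night side
    float sunDiffuse = dot( normalize( vNormal ), uSunDir );

    // remap to 0..1 and soften so the terminator isn't a hard line
    float dayAmount = smoothstep( -0.1, 0.1, sunDiffuse );

    vec3 day   = texture2D( uDayTex, vTexCoord ).rgb * max( sunDiffuse, 0.0 );
    vec3 night = texture2D( uNightTex, vTexCoord ).rgb;

    gl_FragColor = vec4( mix( night, day, dayAmount ), 1.0 );
}
```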
2) Cloud layer: My first implementation of the Earth had the cloud layer baked into the diffuse layer. This worked well enough but I wanted to have some additional control over the lighting so I ended up putting the cloud layer on its own sphere which is just slightly larger than the Earth surface sphere radius. For this project, the Earth is 200 units in radius and the cloud layer is 200.2.
Cloud layer texture: http://visibleearth.nasa.gov/view.php?id=57747
3) Atmosphere layer: This sphere adds additional atmospheric coloring. Again, the original implementation just baked the atmospheric glow into the Earth surface layer but like with the clouds layer, I wanted some additional control so I made it its own sphere with a radius of 202. This sphere is shaded based on a diffuse lighting solution using the camera position as the light source combined with diffuse lighting from the sun position. (Note: newer versions of this project use an atmospheric scattering shader.)
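One plausible reading of that two-part lighting for the atmosphere sphere, sketched as a fragment shader (the names, the tint color, and the exact way the two diffuse terms are combined are all assumptions):

```glsl
uniform vec3 uSunDir;     // normalized direction toward the sun
varying vec3 vNormal;
varying vec3 vEyeDir;     // normalized surface-to-camera direction

void main()
{
    vec3 N = normalize( vNormal );

    // diffuse term with the camera standing in as the light source
    float camDiffuse = max( dot( N, normalize( vEyeDir ) ), 0.0 );
    // diffuse term from the actual sun position
    float sunDiffuse = max( dot( N, uSunDir ), 0.0 );

    // fade face-on areas out so the glow concentrates near the sunlit limb
    float glow = ( 1.0 - camDiffuse ) * sunDiffuse;
    gl_FragColor = vec4( vec3( 0.3, 0.5, 1.0 ), glow );
}
```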
Inspired by videos (1, 2, 3) of the Earth as seen from the ISS, I decided to add some additional graphical elements to make my Earth feel more real. Here is where things get a little more complex but very manageable taken step by step.
Cloud shadows and bump mapping
I added cloud shadows to the diffuse surface layer. I chose not to complicate it unnecessarily by factoring in the position of the sun. Instead, in the Earth shader, I do a texture lookup of the cloud layer and subtract a value comparable to the alpha of the cloud pixel. When the camera is close to the Earth and pointed towards a horizon, the shadow layer helps to make the cloud layer seem like a separate entity from the surface layer. The upper half of the following image shows the addition of cloud shadows.
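Inside the Earth surface fragment shader, that cloud-shadow step might look like this sketch; the names and the shadow strength constant are assumptions:

```glsl
uniform sampler2D uDiffuseTex;
uniform sampler2D uCloudTex;   // the same texture the cloud sphere uses
varying vec2 vTexCoord;

void main()
{
    vec3 surface = texture2D( uDiffuseTex, vTexCoord ).rgb;

    // clouds are stored as a grayscale map; treat brightness as alpha
    float cloudAlpha = texture2D( uCloudTex, vTexCoord ).r;

    // darken the surface directly beneath the cloud; no sun-angle projection
    surface -= cloudAlpha * 0.5;   // 0.5 = assumed shadow strength

    gl_FragColor = vec4( max( surface, vec3( 0.0 ) ), 1.0 );
}
```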
To help the clouds feel less like a textured sphere, I create a normal map (dynamically in the vert shader but could easily just be a pre-made texture) to give the clouds some extra depth. It is especially noticeable along the terminator during sunrise and sunset. The upper half of the following image shows the cloud bump mapping.
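A normal for the cloud bump mapping can be derived from the grayscale cloud texture by comparing neighboring texels, roughly like this (names and the bump-strength constant are assumptions):

```glsl
uniform sampler2D uCloudTex;
uniform vec2 uTexelSize;   // 1.0 / texture resolution in each axis

vec3 cloudNormal( vec2 uv )
{
    // sample the four neighbors and treat brightness as height
    float hL = texture2D( uCloudTex, uv - vec2( uTexelSize.x, 0.0 ) ).r;
    float hR = texture2D( uCloudTex, uv + vec2( uTexelSize.x, 0.0 ) ).r;
    float hD = texture2D( uCloudTex, uv - vec2( 0.0, uTexelSize.y ) ).r;
    float hU = texture2D( uCloudTex, uv + vec2( 0.0, uTexelSize.y ) ).r;

    // tangent-space normal: x/y slopes, z controls bump strength
    return normalize( vec3( hL - hR, hD - hU, 0.2 ) );
}
```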
To help offset the difference between the ocean and land, I vary the lighting using a grayscale texture mask of the Earth's landmasses. You can reference the mask pixel value and use it to increase the amount of diffuse and specular lighting that falls on the ocean versus the land.
Landmass masking texture: http://visibleearth.nasa.gov/view.php?id=73963
Landmass masking texture with rivers and lakes: http://www.shadedrelief.com/natural3/pages/extra.html
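The ocean-versus-land lighting tweak amounts to scaling the lighting terms by the mask, something like this sketch (the mask convention and constants are assumptions):

```glsl
uniform sampler2D uMaskTex;   // assumed convention: white = land, black = water
varying vec2 vTexCoord;
varying float vDiffuse;       // Lambert term from the sun
varying float vSpecular;      // specular term from the sun

void main()
{
    float land = texture2D( uMaskTex, vTexCoord ).r;

    // water is glossy, land is matte: fade the specular out over landmasses
    float spec = vSpecular * mix( 1.0, 0.05, land );
    float diff = vDiffuse  * mix( 1.1, 1.0, land );   // oceans catch a bit more diffuse

    gl_FragColor = vec4( vec3( diff + spec ), 1.0 );
}
```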
The city lights texture is not as crisp as I would have hoped so I did some post processing in Photoshop. I just added some additional detail within the glow of very large cities and sharpened up the pixels a bit.
Next, I load in a version of the city lights at night texture but after it has been excessively blurred. I then add this value to the cloud layer (on the night side) to make it look like the cloud layer is being illuminated by the city lights.
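In the cloud shader, that night-side illumination is just an additive lookup of the blurred texture, masked to the dark side; a sketch with assumed names:

```glsl
uniform sampler2D uCloudTex;
uniform sampler2D uBlurredLightsTex;   // heavily blurred city-lights texture
uniform vec3 uSunDir;
varying vec3 vNormal;
varying vec2 vTexCoord;

void main()
{
    float cloudAlpha = texture2D( uCloudTex, vTexCoord ).r;
    float glow       = texture2D( uBlurredLightsTex, vTexCoord ).r;

    // 1.0 on the night side, fading to 0.0 across the terminator
    float night = 1.0 - smoothstep( -0.1, 0.1, dot( normalize( vNormal ), uSunDir ) );

    vec3 color = vec3( 1.0 ) * cloudAlpha;                      // base cloud color
    color += vec3( 1.0, 0.9, 0.7 ) * glow * cloudAlpha * night; // warm city glow from below
    gl_FragColor = vec4( color, cloudAlpha );
}
```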
Finally, for the atmospheric layer, I use a color gradient texture to fake the colorization that occurs during sunset and sunrise. Were I feeling a bit more adventurous, there are shaders out there that accurately simulate the Rayleigh-scattering effect which occurs during sunrise and sunset but that is a problem for another day (note: I eventually did return to this problem and added the Rayleigh Mies scattering to the Earth simulation).
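The gradient trick boils down to indexing a 1D color ramp by the fragment's angle to the sun; a minimal sketch, with the names and ramp layout assumed:

```glsl
uniform sampler2D uGradientTex;   // horizontal ramp: night -> sunset orange -> day blue
uniform vec3 uSunDir;
varying vec3 vNormal;

void main()
{
    // -1.0 deep night .. 0.0 terminator .. 1.0 noon, remapped to 0..1
    float sunDot = dot( normalize( vNormal ), uSunDir ) * 0.5 + 0.5;

    // the terminator region lands in the middle of the ramp, picking up the warm tones
    vec3 tint = texture2D( uGradientTex, vec2( sunDot, 0.5 ) ).rgb;
    gl_FragColor = vec4( tint, 1.0 );
}
```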
For the Northern and Southern lights, I used a series of concentric ribbons that are offset using Perlin noise. Imagine a dozen concentric bands of increasing latitude that are extruded away from the center of the Earth (see image below). Each of these bands is offset based on Perlin noise, but that offset is additionally delayed by the band's distance from the pole. This means that as the Perlin noise is animated over time, the outermost bands mirror the movement of the innermost bands but slightly delayed. The image below shows the geometry used for the aurora bands. To help minimize the overtness of the geometry, I calculate the diffuse lighting using the camera position as the light source and set the alpha of the aurora bands to the diffuse lighting amount. This ensures that any bands viewed edge-on are nearly transparent, eliminating any hard lines which would compromise the effect.
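A vertex shader version of that band displacement could look like the sketch below. It assumes the Perlin noise has been baked into a tileable texture and that each vertex carries its distance from the pole as an attribute; all names and constants are mine, and the original may well compute the noise on the CPU instead:

```glsl
uniform float uTime;
uniform sampler2D uNoiseTex;   // tileable Perlin noise baked to a texture
attribute float aPoleDist;     // 0.0 at the innermost band, 1.0 at the outermost

void main()
{
    // sample the animated noise; outer bands read an earlier point in time,
    // so their motion trails the inner bands
    float n = texture2D( uNoiseTex,
                         vec2( gl_MultiTexCoord0.x, uTime - aPoleDist * 0.2 ) ).r;

    // push the vertex away from the Earth's center along its normal
    vec3 pos = gl_Vertex.xyz + gl_Normal * n * 10.0;   // 10.0 = assumed extrusion scale

    gl_Position = gl_ModelViewProjectionMatrix * vec4( pos, 1.0 );
}
```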
I'd ultimately like to find a procedural way to do lightning effects but for now, it is half brute-force, half procedural. I created a separate app which loads in the cloud layer texture and allows me to click on it to specify where lightning flashes are likely to occur. This creates a vector of 2D points which I then use in the main app to create loci for the spawning of random flashes of white-purple. This part happens in the cloud shader with a simple distance check (one for each of the 5 flashes which occur per frame).
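The distance check in the cloud shader might be sketched like this, with the flash loci and per-flash brightness fed in as uniforms from the main app (names and the falloff constant are assumptions):

```glsl
uniform vec2  uFlashPos[5];    // texture-space loci picked in the helper app
uniform float uFlashPower[5];  // per-flash animated brightness, 0.0 when idle
varying vec2 vTexCoord;

void main()
{
    vec3 lightning = vec3( 0.0 );
    for( int i = 0; i < 5; i++ ){
        float d = distance( vTexCoord, uFlashPos[i] );
        // white-purple flash that falls off quickly with distance from the locus
        lightning += vec3( 0.85, 0.75, 1.0 ) * uFlashPower[i]
                   * max( 1.0 - d * 40.0, 0.0 );
    }
    gl_FragColor = vec4( lightning, 1.0 );
}
```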
Here is the render (with added audio) that was used during the performance by Zoë Keating at the Davos World Economic Forum 2014.