Introduction 

WebGL is a JavaScript binding of the OpenGL ES API, used for drawing 2D and 3D graphics and visualizations.  The use of JavaScript is an important distinction, as it brings these powerful capabilities directly to web browsers.  WebGL can be an intimidating API for beginners to pick up.  Three.js seeks to flatten that steep learning curve and makes it easier for budding graphics programmers to produce beautiful 3D renderings. 

Though Three.js is successful in abstracting away many of the mathematically challenging areas of 3D graphics, it is still a deep API.  This article's scope is restricted to the implementation of shaders, a very powerful aspect of both WebGL and Three.js. 

Understanding the Problem 

The best way to understand the problem is to look at a rendering I’ve made that simulates an imaginary solar system, viewable at http://luke-gl.com/ (Chrome, Firefox, Safari, and IE can all use WebGL; performance depends on the user's GPU).   

[Image: jupiter.jpg]

Here we see an example of a simple planet with a texture applied. 

[Image: shader.jpg]

If you look at the star object in the middle, however, you’ll notice that its surface is dynamic: it appears to have a thin layer that moves independently of the surface, it produces a “glow” effect, and its colors change over time.  How does a developer create such an effect on an otherwise static texture? Shaders. 

Shaders give the browser access to the computer’s GPU (and notably, shaders are a common tool across graphics APIs).  Every time a frame is drawn to the screen, the computer must render a single image.  In 3D graphics, the CPU and GPU work together to render a virtual 3D world within a “viewport”.  These complex mathematical operations are executed on every pixel of the screen, typically 30-60 times per second, which would be a huge burden to place on just one chip.  Shader scripts are part of WebGL’s (and hence Three.js’s) effort to delegate these operations to dedicated, highly parallel hardware. 

Writing Shaders 

Shaders are written as two scripts in WebGL/Three.js: the vertex shader and the fragment shader.  In Three.js they are attached via the THREE.ShaderMaterial object.   

These three files (Shader.js, Vertex.html, and Fragment.html) all work with each other. Essentially, the order is this:   

    Step 1. Shader.js (loads the optional texture image and sets the uniform variables)  

    Step 2. Vertex.html (handles vertex positioning)  

    Step 3. Fragment.html (handles vertex/pixel coloring)  

    Step 4. Shader.js (receives the returned data: each pixel now has the appropriate color)  

This illustrates an important point: each pixel of the texture in question has the same script run on it, over and over again, by the shaders.  A sketch of how these three pieces typically sit on the page follows. 
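The exact page layout is not part of the original listings, so the following is only a sketch of one common arrangement, with illustrative file names: the two shader scripts are embedded as script tags whose id attributes Shader.js later looks up. 

    <!-- index.html (illustrative layout, not the author's actual file) -->
    <script id="vertexShader" type="x-shader/x-vertex">
      /* contents of Vertex.html (see Step 2) */
    </script>
    <script id="fragmentShader" type="x-shader/x-fragment">
      /* contents of Fragment.html (see Step 3) */
    </script>
    <script src="three.min.js"></script>
    <script src="shader.js"></script>  <!-- looks the two shaders up by id -->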

Step 1. 

  1. this.textureLoader = new THREE.TextureLoader();   
  2.  this.textureLoader.load(image, function (texture) {   
  3.    texture.minFilter = THREE.LinearFilter;   
  4.    self.mesh = new THREE.Mesh(self.geometry, self.material);   
  5.    var vertShader = document.getElementById('vertexShader').innerHTML;   
  6.    var fragShader = document.getElementById('fragmentShader').innerHTML;   
  7.    var uniforms = {   
  8.     texture1: { type: "t", value: texture },   
  9.     sunVectorDelta: { type: "f", value: 0 },   
  10.     sunFragmentDelta: { type: "f", value: -0.7 }   
  11.    };   
  12.    self.shaderMaterial = new THREE.ShaderMaterial({   
  13.      uniforms: uniforms,   
  14.      vertexShader: vertShader,   
  15.      fragmentShader: fragShader,   
  16.      shading: THREE.SmoothShading,   
  17.      side: THREE.DoubleSide,   
  18.      transparent: true   
  19.    });   
  20.    self.mesh.material = self.shaderMaterial;   
  21.    self.mesh.material.needsUpdate = true;   
  22.    self.orbit = new THREE.Object3D();   
  23.    self.orbit.add(self.mesh);   
  24.    objects.push(self.orbit);   
  25.    scene.add(self.orbit);   
  26.  });   

Lines 5 and 6 are where the shader sources are retrieved, via a simple id attribute on a script tag. 

Lines 7-11 of Shader.js are where the variables that will be passed to the shaders are defined.  They include the image texture, a float for the vertex position offset (sunVectorDelta), and a float for the vertex color offset (sunFragmentDelta).  A new THREE.ShaderMaterial is then created with the appropriate arguments: the two shader scripts and the uniform variables. 

Step 2. 

  1. <script id="vertexShader" type="x-shader/x-vertex">   
  2.     uniform float sunVectorDelta;   
  3.     uniform float sunFragmentDelta;   
  4.     varying vec2 vUv;   
  5.     varying float randomNum;   
  6.     float rand(vec2 co){   
  7.         return fract(sin(dot(-co.xy ,vec2(12.9898,78.233))) * 43758.5453);   
  8.     }   
  9.     void main()   
  10.       {   
  11.         vUv = uv + vec2(sunVectorDelta, sunVectorDelta);   
  12.         if(vUv.x > 1.0 || vUv.x < -1.0){   
  13.           vUv.x = rand(uv);   
  14.         }   
  15.         if(vUv.y > 1.0 || vUv.y < -1.0){   
  16.           vUv.y = rand(uv);   
  17.         }   
  18.         randomNum = rand(uv) + sunFragmentDelta;   
  19.         gl_Position =   projectionMatrix *   
  20.                            modelViewMatrix *   
  21.                            vec4(position,1.0);   
  22.       }   
  23. </script>   

The next stop is Vertex.html (the vertex shader).   The syntax is different, since the language used is GLSL, the shading language that runs on the GPU; a close comparison would be the C programming language.  This script is where a position in the virtual 3D world is translated into a 2D position to be drawn on the screen.  Line 19 is where the gl_Position variable is set.  The calculations before it are custom operations that shift the texture coordinates (vUv) within a certain range (-1.0 to 1.0); this is where the surface “movement” is established.  The random number generator, used on line 18, sets a varying variable (randomNum) to be used later in the fragment shader (Fragment.html). 
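For comparison, here is a minimal pass-through vertex shader (a sketch, not part of the original project).  It shows the bare transform that every vertex shader performs; everything else in the listing above is the custom vUv and randomNum work layered on top.  The attributes and matrices (position, uv, projectionMatrix, modelViewMatrix) are supplied automatically by Three.js. 

    <script id="plainVertexShader" type="x-shader/x-vertex">
        varying vec2 vUv;
        void main() {
            // pass the texture coordinate through unchanged
            vUv = uv;
            // transform the vertex from model space to screen (clip) space
            gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }
    </script>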

Step 3. 

  1. <script id="fragmentShader" type="x-shader/x-fragment">   
  2.     uniform sampler2D texture1;   
  3.     varying vec2 vUv;   
  4.     varying float randomNum;   
  5.     void main() {   
  6.         gl_FragColor = texture2D(texture1, vUv) - vec4(0.0,randomNum,randomNum,0.0);   
  7.     }   
  8. </script>   

With the data regarding the vertex positions (and the varying values vUv and randomNum), the fragment shader (Fragment.html) is now ready to do its job.  This is probably one of the simpler shader scripts ever written.  With all the calculations already completed, the fragment shader executes one line of code (line 6) to determine the color, using the variables generated: it samples the texture at vUv and subtracts randomNum from the green and blue channels, pushing each pixel toward red when randomNum is positive and toward cyan when it is negative.  This is where the “glow” effect and the changing colors occur. 

Step 4. 

With the calculations done, the animation loop in Shader.js can now complete all other operations needed to draw to the screen. 
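The animation loop itself is not shown in the listings above, so the following is only a sketch of what it might look like; the renderer, scene, and camera names and the exact increments are assumptions.  The key idea is that nudging the uniform values each frame is what makes the star’s surface appear to move and its colors shift. 

    // Sketch of an animation loop (names and increments are illustrative)
    function animate() {
        requestAnimationFrame(animate);
        // Shifting sunVectorDelta scrolls the texture coordinates in the
        // vertex shader, producing the surface "movement".
        self.shaderMaterial.uniforms.sunVectorDelta.value += 0.0005;
        // Varying sunFragmentDelta changes randomNum in the fragment shader,
        // producing the shifting "glow" colors.
        self.shaderMaterial.uniforms.sunFragmentDelta.value = -0.7 + 0.3 * Math.sin(Date.now() * 0.001);
        renderer.render(scene, camera);
    }
    animate();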

Conclusion 

The fully realized potential of WebGL is still in the future.  With VR technology and an evolving internet, the opportunities for beautiful and comprehensive data visualizations on the web are provocative.  WebGL is much more than making games for the web; it’s a way to create an immersive and creative environment for people across any distance. 

3D graphics can be intimidating, and if you’re a beginner, shaders can often prove a difficult concept to embrace (especially if you come from an OOP background).  That said, Three.js is a powerful API that provides a graceful way for the beginning graphics programmer (or hobbyist) to learn the fundamentals of 3D graphics programming. 

References 

 https://en.wikipedia.org/wiki/WebGL 

http://threejs.org/ 

http://luke-gl.com/ 

https://gist.github.com/LukeDavitt/e1180738a6bebcea8424fb766e39e883 

https://gist.github.com/LukeDavitt/39a9dcec7e6bbc9c393457e54cea5243 

https://gist.github.com/LukeDavitt/ff332aa2fd2bd88a55a4c0d9158c23c3