Converting Shaders from Shadertoy to ThreeJS

by Dirk Krause, January 31st, 2018

By Markus Neuy and Dirk Krause

How to transfer Shaders from Shadertoy to ThreeJS (German version).

Introduction

As a part of our research at the Publicis Pixelpark Innovationlab we studied how we could use lower level languages for web technologies. The obvious choice seemed to be asm.js or WebAssembly.


But you can also use WebGL shaders to solve machine-oriented problems. Shaders are programmed in a language similar to C/C++, and while they are not primarily meant for solving generic problems, they can be used for more than just rendering images and 3D scenes.

A second motivation stems from the aesthetics that can be achieved with shaders. In 2002, a group of students from the University of Wisconsin-Madison published NPRQuake (“Non-PhotoRealistic Rendering Quake”), a variation of the well-known game Quake created by injecting code into the rendering pipeline.

NPRQuake Screenshot

The aesthetic quality of this variation was stunning; we immediately understood that these kinds of effects could be a game-changer in projects. While in 2002 this variation was only possible by writing drivers for OpenGL, it can now, in 2018, be achieved with shaders, even in web browsers.

So when we were recently involved in an art project, we decided to give shaders a go.

Availability of shader code

If you are not really used to programming shaders, the obvious choice is to search for freely available examples and use these (with a careful look at the licences involved). One library that stands out in this regard is Shadertoy; ShaderFrog is another example.

Since we had successfully worked with ThreeJS before, we decided to publish our findings on using postprocessing shaders from Shadertoy in ThreeJS.

Shader in ThreeJS

ThreeJS can use postprocessing shaders, which alter the whole rendered image, as well as material shaders, which alter the material of 3D objects. Both types need a vertex and a fragment shader part; a vertex shader can change the position of vertices in 3D, while a fragment shader usually changes the color of the rendered image.

This image shows the four possible variations.

Types of shaders

In the upper left a postprocessing shader adds a color gradient to the rendered image. To the right of it, a vertex shader reduces the render area. The two bottom images show material shaders; the left one only alters the color while the right one changes the position of the vertices. Since shaders are always composed of both vertex and fragment parts, the last example also changes the color.

Shadertoy

Trivial example from Shadertoy

We studied how to transfer a shader from Shadertoy to ThreeJS back in 2014 with first results published on StackOverflow. We found the following pattern useful:
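
A minimal sketch of this pattern (all names and values are illustrative, and the fragment shader body is just a stand-in for real Shadertoy code): a full-screen quad is rendered with a ShaderMaterial whose fragment shader contains the slightly adapted Shadertoy code.

```javascript
// Shadertoy-style shader on a full-screen quad (sketch)
const scene = new THREE.Scene();
const camera = new THREE.Camera();           // a default camera is enough for a full-screen quad

const uniforms = {
  iTime:       { value: 0.0 },
  iResolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) }
};

const material = new THREE.ShaderMaterial({
  uniforms: uniforms,
  vertexShader: `
    void main() {
      gl_Position = vec4(position, 1.0);     // pass the quad through unchanged
    }
  `,
  fragmentShader: `
    uniform float iTime;
    uniform vec2 iResolution;
    // Shadertoy's mainImage(out vec4 fragColor, in vec2 fragCoord) becomes main():
    void main() {
      vec2 uv = gl_FragCoord.xy / iResolution.xy;
      gl_FragColor = vec4(uv, 0.5 + 0.5 * sin(iTime), 1.0);
    }
  `
});
scene.add(new THREE.Mesh(new THREE.PlaneBufferGeometry(2, 2), material));

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const clock = new THREE.Clock();
(function animate() {
  requestAnimationFrame(animate);
  uniforms.iTime.value = clock.getElapsedTime();
  renderer.render(scene, camera);
})();
```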

By following this pattern you can transfer a simple shader to ThreeJS.

Trivial Shader

Non-trivial example from Shadertoy

With a more complex shader you need to do much more, as we will outline now. For a non-trivial example we chose Noise Contour by candycat since you run into a couple of problems with it. You can find it here: https://www.shadertoy.com/view/MscSzf

This example also creates a whole scene in shader language. But in ThreeJS you usually want control over the 3D objects, so we decided to create the scene in ThreeJS while still utilizing the shaders to alter it.

Understanding the structure of shaders

We start by trying to get a grasp of the structure of the shader; this can be achieved with Shadertoy’s editor. Since edits to the code can be seen in realtime we can make small changes to understand how it works.

Below the actual code we see that this pass reads from a channel called iChannel0, with the B indicating that it is bound to a buffer.

To see this buffer in action we comment out line 37 and add this:
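
Something along these lines (the exact variable names follow the Shadertoy listing, so treat this as a sketch):

```glsl
// instead of the original output in line 37, show the incoming buffer directly:
vec2 uv = fragCoord / iResolution.xy;
fragColor = texture(iChannel0, uv);
```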

The result should be:

This simple change shows the color of the previous buffer instead of the result of this pass.

By examining the previous buffer, Buf B, we see that this one also uses iChannel0, so we are still not looking at the code that creates the original scene.

Utilizing the same trick as before, we comment out line 29 and add a line that calculates uv and the actual color like so:
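
Again only a sketch; the idea is to output the incoming channel instead of the buffer's own result:

```glsl
vec2 uv = fragCoord / iResolution.xy;
fragColor = texture(iChannel0, uv);
```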

This should leave us with:

This looks much more like a regular scene. Also, Buf A doesn’t use another buffer, so we are looking at the code that creates the original scene.

Reconstruction in ThreeJS

Full disclaimer here: what follows is by no means ‘optimal’ code, but just one straightforward way to solve the problem.

Creating the scene

We start by creating a somewhat simpler scene with only a sphere and a plane. Additionally we want to use the MeshNormalMaterial from ThreeJS.

A possible result is shown here:

Shader in ThreeJS step 0

The code is contained in an HTML file called index.html:
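
A minimal version could look like this (the script paths are illustrative and assume local copies of three.js and OrbitControls):

```html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Shadertoy to ThreeJS</title>
    <style> body { margin: 0; overflow: hidden; } </style>
  </head>
  <body>
    <script src="js/three.min.js"></script>
    <script src="js/OrbitControls.js"></script>
    <script src="index.js"></script>
  </body>
</html>
```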

We need to take care of the dependencies on the ThreeJS library, and we also add our own code in index.js:
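
A sketch of index.js at this stage (geometry sizes and camera position are illustrative choices):

```javascript
// step 0: a plain ThreeJS scene with a sphere and a plane
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.set(0, 2, 6);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const controls = new THREE.OrbitControls(camera, renderer.domElement);

// both objects use MeshNormalMaterial, as in the Shadertoy original
const normalMaterial = new THREE.MeshNormalMaterial();

const sphere = new THREE.Mesh(new THREE.SphereBufferGeometry(1, 32, 32), normalMaterial);
sphere.position.y = 1;
scene.add(sphere);

const plane = new THREE.Mesh(new THREE.PlaneBufferGeometry(10, 10), normalMaterial);
plane.rotation.x = -Math.PI / 2;
scene.add(plane);

// keep camera and renderer in sync with the window size
window.addEventListener('resize', function () {
  camera.aspect = window.innerWidth / window.innerHeight;
  camera.updateProjectionMatrix();
  renderer.setSize(window.innerWidth, window.innerHeight);
});

(function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
})();
```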

This JavaScript code creates a renderer, a camera, an orbit control and also the plane and the sphere with a MeshNormalMaterial. It also takes care of window size changes and rendering.

This concludes step zero of porting the scene from Shadertoy.

Shader in ThreeJS step 0

Recreating the first shader pass

In the next step we try to recreate the first shader render step in the buffer; this is basically copying the shader code to ThreeJS.

This should be the result:

Shadertoy without the last pass

To achieve this we used the EffectComposer for ThreeJS, which provides a simple way to use postprocessing shaders.
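
A sketch of the setup; it assumes the postprocessing helpers from the ThreeJS examples (EffectComposer, RenderPass, ShaderPass and CopyShader) have been added to index.html, and the names contourShader and contourPass are ours:

```javascript
const composer = new THREE.EffectComposer(renderer);
composer.addPass(new THREE.RenderPass(scene, camera));

const contourShader = {
  uniforms: {
    tDiffuse: { value: null }   // filled by the EffectComposer with the previous pass
  },
  vertexShader: VERTEX,
  fragmentShader: FRAGMENT
};

const contourPass = new THREE.ShaderPass(contourShader);
contourPass.renderToScreen = true;
composer.addPass(contourPass);
```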

This creates an EffectComposer instance with a normal rendering pass and an additional shader pass. We copy the shader code into the variables VERTEX and FRAGMENT. The shader definition also declares a uniform called tDiffuse, which the EffectComposer fills with the image from the previous rendering pass so that it can be altered in the current pass.

With this new render step in action, we show this pass rather than the original scene; we also need to add some code for resizing purposes:
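
Roughly like this, reusing the names from the sketches above:

```javascript
// the composer replaces the plain renderer call and has to be resized as well
window.addEventListener('resize', function () {
  camera.aspect = window.innerWidth / window.innerHeight;
  camera.updateProjectionMatrix();
  renderer.setSize(window.innerWidth, window.innerHeight);
  composer.setSize(window.innerWidth, window.innerHeight);
});

(function animate() {
  requestAnimationFrame(animate);
  composer.render();
})();
```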

Now we need to define the constants VERTEX and FRAGMENT. Shadertoy doesn’t provide a vertex shader, so we need to define our own:
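
A minimal pass-through version is enough for a start, for example:

```javascript
const VERTEX = `
  void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`;
```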

We do use the fragment shader from Shadertoy, though, and add it to FRAGMENT:

This basically creates the shader but we still need to address the following problems:


  • the vertex shader coordinate isn’t used in the fragment shader yet
  • the fragment shader uses texture, which is unknown in the current WebGL context
  • mainImage must be renamed to main
  • iResolution isn’t set yet.

So the shader isn’t working yet.

Addressing the first problem results in this definition:
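
For example, the vertex shader can hand the UV coordinate to the fragment shader as a varying:

```javascript
const VERTEX = `
  varying vec2 vUv;
  void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`;
```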

Now we can use the vector vUv instead of fragCoord / iResolution.xy. This results in:
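
Sketched, with the rest of the copied Shadertoy code left unchanged:

```glsl
// at the top of the fragment shader:
varying vec2 vUv;

// ...and where the UV coordinate is computed:
vec2 uv = vUv;   // previously: vec2 uv = fragCoord / iResolution.xy;
```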

Now we simply replace every occurrence of texture with texture2D.
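
For example (here tDiffuse takes the role of iChannel0 from the Shadertoy code, and col is just an illustrative name):

```glsl
// before: vec4 col = texture(tDiffuse, uv);
vec4 col = texture2D(tDiffuse, uv);
```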

Additionally we alter the mainImage to main without parameters:
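
Sketched:

```glsl
// before: void mainImage( out vec4 fragColor, in vec2 fragCoord ) { ... }
void main() {
    // ...body of the Shadertoy shader...
}
```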

main should also write its result to gl_FragColor instead of fragColor, which defines the color in the shader.

Lastly we need to set iResolution by adding it to the uniforms. We do this by defining a ThreeJS vector storing width and height:
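
For example:

```javascript
const resolution = new THREE.Vector2(window.innerWidth, window.innerHeight);
```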

Now we can add the resolution to the uniforms:
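
Using the names from the sketches above:

```javascript
const contourShader = {
  uniforms: {
    tDiffuse:    { value: null },
    iResolution: { value: resolution }
  },
  vertexShader: VERTEX,
  fragmentShader: FRAGMENT
};
```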

We need to enhance our resize function:
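
For example:

```javascript
window.addEventListener('resize', function () {
  camera.aspect = window.innerWidth / window.innerHeight;
  camera.updateProjectionMatrix();
  renderer.setSize(window.innerWidth, window.innerHeight);
  composer.setSize(window.innerWidth, window.innerHeight);
  // update the uniform on the pass itself, not on the original definition (see below)
  contourPass.uniforms.iResolution.value.set(window.innerWidth, window.innerHeight);
});
```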

It is important that we use the uniforms of the actual render pass: the original ones have been deep-cloned by the EffectComposer, so changing the variable resolution would have no effect.

Since we defined two uniforms, we need to declare them in our fragment shader:
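
At the top of the fragment shader, for example:

```glsl
uniform vec2 iResolution;
uniform sampler2D tDiffuse;   // takes the role of iChannel0 from the Shadertoy code
```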

This concludes this shader pass and if all went well we see this:

Shader Pass 1 without shadows

From the blue lines we see that it generally works but the pink part is still missing. Let’s change that.

Solve the problem with shadows

The pink part is missing because the shader in Shadertoy quietly renders shadows to an alpha channel that wasn’t visible in the first place, as we can see in the next picture:

Shadows in Shadertoy

There are several ways to solve this; we used the straightforward one and added a material that holds the shadows. These must be handled in an additional render pass.

So let’s create shadows in ThreeJS:
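
A sketch, reusing the objects from the earlier code:

```javascript
// enable shadow mapping and let the objects cast and receive shadows
renderer.shadowMap.enabled = true;
sphere.castShadow = true;
plane.receiveShadow = true;
```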

Shadows need light, in this case a directional one:
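
For example:

```javascript
const light = new THREE.DirectionalLight(0xffffff, 1.0);
light.position.set(3, 5, 2);
light.castShadow = true;
scene.add(light);
```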

A MeshPhongMaterial can hold shadows:
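
For instance a plain white one, so the rendered image essentially contains only the shading and shadows:

```javascript
const shadowMaterial = new THREE.MeshPhongMaterial({ color: 0xffffff });
```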

A new render target saves them:
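
For example:

```javascript
const shadowTarget = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);
```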

And again, a resize function is needed:
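
The existing resize handler just grows by one line (sketched):

```javascript
window.addEventListener('resize', function () {
  // ...renderer, camera, composer and iResolution updates as before...
  shadowTarget.setSize(window.innerWidth, window.innerHeight);
});
```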

Now we can transfer the shadows to the new render target and prepare it for the shader:
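
One straightforward way to do this is in the render loop, before composer.render(); the uniform name tShadow is our own choice:

```javascript
sphere.material = shadowMaterial;                          // switch to the shadow material
plane.material = shadowMaterial;
renderer.render(scene, camera, shadowTarget);              // render the shadows off-screen
contourPass.uniforms.tShadow.value = shadowTarget.texture; // hand them to the shader pass
sphere.material = normalMaterial;                          // back to MeshNormalMaterial
plane.material = normalMaterial;
```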

These lines set the material, render the scene, assign the shadow texture to a uniform and change the material back to MeshNormalMaterial.

Now the shader needs to know about the shadows to be able to process them, so we change the uniforms:
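
The shader definition from above gains a third uniform:

```javascript
const contourShader = {
  uniforms: {
    tDiffuse:    { value: null },
    iResolution: { value: resolution },
    tShadow:     { value: null }   // filled each frame with the shadow render target
  },
  vertexShader: VERTEX,
  fragmentShader: FRAGMENT
};
```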

Same for the fragment shader:
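
A single declaration is enough:

```glsl
uniform sampler2D tShadow;
```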

Then we replace the former line with a lookup into our shadow texture:
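
Sketched; the exact spot depends on the original Shadertoy listing, and the variable name follows our own naming:

```glsl
// the shadow used to come from the alpha channel of the buffer; now it is read from tShadow:
float shadow = texture2D(tShadow, vUv).x;
```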

The result should look like the second step on Shadertoy.

Shader step 1

Now only the second shader pass is missing to complete the port.

The final shader pass

For the final shader pass we add another shader pass to the EffectComposer instance.

Let’s define another shader:
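
A sketch; the names noiseShader, noisePass and FRAGMENT_FINAL are ours, and FRAGMENT_FINAL will hold the adapted code of the final Shadertoy pass:

```javascript
const noiseShader = {
  uniforms: {
    tDiffuse: { value: null }
  },
  vertexShader: VERTEX,
  fragmentShader: FRAGMENT_FINAL
};

const noisePass = new THREE.ShaderPass(noiseShader);
noisePass.renderToScreen = true;
composer.addPass(noisePass);
```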

We deactivate renderToScreen for the previous render pass:
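
So that only the final pass draws to the screen:

```javascript
contourPass.renderToScreen = false;
```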

Again, more variables are introduced: iTime to drive changes over time and iChannel1 to add noise.

Shadertoy Noise and iTime

We use a ThreeJS clock for iTime.
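
For example:

```javascript
const clock = new THREE.Clock();
```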

With every frame we also update iTime:
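
In the render loop, before composer.render() (assuming the iTime uniform that is added to the shader definition below):

```javascript
noisePass.uniforms.iTime.value = clock.getElapsedTime();
```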

We add iTime and the noise to the uniforms:
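
The definition of noiseShader from above then carries all three uniforms:

```javascript
const noiseShader = {
  uniforms: {
    tDiffuse: { value: null },
    iTime:    { value: 0.0 },
    tNoise:   { value: null }   // the noise texture, loaded below
  },
  vertexShader: VERTEX,
  fragmentShader: FRAGMENT_FINAL
};
```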

The noise is simply a noisy texture (for example the one from Shadertoy) that we load with ThreeJS into tNoise.
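
A sketch; the file path is illustrative:

```javascript
new THREE.TextureLoader().load('textures/noise.png', function (texture) {
  texture.wrapS = texture.wrapT = THREE.RepeatWrapping;
  noisePass.uniforms.tNoise.value = texture;
});
```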

Now we need to adapt the fragment shader to our new variables, so we apply the following measures:

  • change mainImage to main
  • define uniforms and adapt the variables
  • define the vUv coordinates
  • change the returned result to gl_FragColor
  • replace texture with texture2D

This gives us:

After these changes the shader still won’t compile, because this shader needs a specific WebGL extension. Thankfully, this is easy to add in ThreeJS:
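
If the extension in question is the derivatives extension (which shaders using fwidth typically need; this is an assumption on our part), it can be enabled on the material of the pass like this:

```javascript
noisePass.material.extensions.derivatives = true;
```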

This gives us the following result:

Final Shader

Which is very close to the original Shadertoy:

Conclusion

We successfully transferred a complex Shadertoy shader to ThreeJS by following these steps:

  • understand the structure of the specific shader
  • implement shader passes
  • address possible GLSL incompatibilities
  • create optional shader passes and/or materials
  • activate optional extensions

We expect that these challenges will be mitigated with the upcoming WebGL2 support in ThreeJS since possible GLSL incompatibilities should vanish.

The full source code is here.

The final result

Helpful links and resources

Credits

Part of this research was funded by the EFRE.NRW project ‘ForK — Forschungsimpulse für die Kreativwirtschaft’.