Deconstructing a simple pixel shader
Kapil Kapre | Sunday, Sep 15, 2013

The other day I was checking up on the demoscene and came across this cool website called ShaderToy. It's basically a live shader editor that lets you write pixel shaders and watch them run in real-time. It uses WebGL to render the scene, which immediately makes the experience a function of your luck with browsers and drivers - huge props to Iñigo Quílez for delivering a relatively stable experience given the crap technology he has to work with.


[Image: a WebGL failure message. Caption: "Oh happy day.."]


What's interesting is that the entire effect is coded in a single GLSL pixel shader: there is no geometry, lighting, or shading anywhere in the scene. All that's in the pipeline is a single full-screen quad. This means everything happens in one pass, and each pixel is basically a function of its screen position. I found the whole thing really awesome since it reminded me of the oldskool asm demos, which also had no "graphics API" to work with and simply blitted data to 0xA0000. Anyway, I decided to spend the weekend attempting to mentally unpack and digest some of the shaders. I have some background working on graphics code, so a lot of it seemed quite familiar - but at the same time, I'm not familiar with the idioms used in modern demo code. One of the first ones I looked at was a simple two-plane shader.
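
To make "each pixel is a function of its screen position" concrete before diving in, here's a minimal sketch of my own (not one of the ShaderToy examples) in the old ShaderToy GLSL dialect - the whole "scene" is just a color computed from the pixel's coordinates:

// A minimal pixel-as-a-function-of-position shader (my sketch, not iq's).
void main(void)
{
    // Normalize the fragment position to [0,1] on both axes.
    vec2 p = gl_FragCoord.xy / iResolution.xy;

    // The output color depends only on p - no geometry, no lights.
    gl_FragColor = vec4(p.x, p.y, 0.0, 1.0);
}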


// Created by inigo quilez - iq/2013
// License Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

void main(void)
{
    // Map pixel coordinates to roughly [-1,1]; dividing by the vertical
    // resolution only keeps the aspect ratio correct.
    vec2 p = -1.0+2.0*gl_FragCoord.xy/iResolution.y;

    // Slowly rotate the coordinate frame over time.
    float an = iGlobalTime*0.1;
    float x = p.x*cos(an)-p.y*sin(an);
    float y = p.x*sin(an)+p.y*cos(an);

    // The core of the effect: a perspective-style divide by |y| maps the
    // screen onto two textured planes receding toward a horizon at y = 0.
    vec2 uv = 0.2*vec2(x,1.0)/abs(y);

    // Scroll the texture over time so the planes appear to move.
    uv.xy += 0.20*iGlobalTime;

    // Whiten toward the horizon (w approaches 0.6 as |y| approaches 0),
    // clamped to a slight darkening far from it.
    float w = max(-0.1, 0.6-abs(y) );
    gl_FragColor = vec4( texture2D(iChannel0, uv).xyz+w, 1.0);
}
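
Incidentally, the x/y pair above is just p rotated by the angle an. Written with a rotation matrix (my rewrite, not part of the original shader), it's easier to recognize:

// Equivalent to the x/y computation above. GLSL mat2 constructors are
// column-major, so this builds a standard counter-clockwise rotation.
mat2 rot = mat2( cos(an), sin(an),
                -sin(an), cos(an));
vec2 q = rot * p;   // q.x == x, q.y == y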

So the heart of the effect is this one line - vec2 uv = 0.2*vec2(x,1.0)/abs(y); - and the rest is just icing to make it look better. I didn't have an intuitive feel for the mapping, so I fired up Python and plotted the transformed texture co-ordinates next to the originals.


from itertools import product

import matplotlib.pyplot as plt
import numpy as np

# Grid of sample points; start above zero to actually avoid div by zero :P
x = np.arange(0.01, 10.0, 1.01)
xy = list(product(x, x))

# The shader's mapping f(x, y) = (x/y, c/y); the +20.0 just offsets the
# transformed points so both sets are visible side by side in one plot.
xyT = [(a/b + 20.0, 5.0/b + 20.0) for (a, b) in xy]

xN, yN = zip(*xy)
xTN, yTN = zip(*xyT)
plt.plot(xN + xTN, yN + yTN, 'ro')
plt.show()

[Figure: a plot of the transformed points next to the original points]

Visualizing the mapping makes it seem really obvious now. f(x,y) = (x/y, 1.0/y) is a simple 2D morph equation. Or, looking at it backwards: we're creating a fake quad that extends into the distance, working out which texture co-ordinate OpenGL would choose for each pixel covering it, and then doing that lookup manually. You can also see it as an inverted perspective projection - because of the divide by y, the sampled point recedes toward infinity as the screen position approaches the horizon at y = 0. Knowing this, we can apply other, non-linear transforms. For example, f(x,y) = (x^2 - y^2, 2xy) - which is just the complex square z^2 written out in components - gives an interesting effect; a sketch of that variant follows below. That's all I have to share for now. I'm going to be spending the odd weekend playing with some fractal algorithms. Hopefully I'll have an update with something of my own before long.
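
Here's a rough sketch of that variant (my code, not iq's) - the only change to the original shader is the uv mapping, and I've dropped the horizon fade since there's no horizon anymore:

// The complex-square variant: f(x,y) = (x^2 - y^2, 2xy), i.e. z -> z^2.
void main(void)
{
    vec2 p = -1.0+2.0*gl_FragCoord.xy/iResolution.y;

    float an = iGlobalTime*0.1;
    float x = p.x*cos(an)-p.y*sin(an);
    float y = p.x*sin(an)+p.y*cos(an);

    // Squaring z = x + iy doubles angles and squares radii, so the
    // texture wraps around the origin twice and stretches outward.
    vec2 uv = 0.2*vec2(x*x - y*y, 2.0*x*y);
    uv += 0.20*iGlobalTime;

    gl_FragColor = vec4( texture2D(iChannel0, uv).xyz, 1.0);
}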