Welcome to part two of my shader series.
This post is a collection of basic shader samples, each intended to show you a single, simple concept. They can be used as-is within Unity, or modified slightly for other platforms and engines. At the end of this article, there's a live Unity version of all the shaders for you to see.
Feel free to use these shaders as a reference or starting point when creating your own. This article goes hand-in-hand with my concepts overview; I recommend reading both as a kickoff to your shader journey. If you prefer hands-on learning, start here; if you prefer getting a little context first, start with the High Level Concepts.
Now that there are a full two entries in the series, I clearly need a table of contents page. Check that out here. Please make suggestions for future articles as comments on that page.
This article may contain affiliate links, mistakes, or bad puns. Please read the disclaimer for more information.
Shader Samples
In this article, all the shader samples are written as they would be if used in Unity (screenshots will be from Unity too, if that’s not obvious). Only minimal changes are needed to use these concepts in other languages. For example, converting to GLSL would primarily involve changing some data types (such as float2 to vec2).
In all the screenshots, the shader is running as a post processing effect in Unity (except “time shift”). Basically that means the entire scene is rendered, then it’s run through one of these shaders before being drawn to the screen. These shaders would work just as well on sprites, or on textures on a model.
Sampling
fixed4 sampling (v2f iTexCoord) : SV_Target
{
    //look up the color of the texture at this pixel's coordinates
    fixed4 texColor = tex2D(_MainTex, iTexCoord.uv);
    return texColor;
}
This shader simply grabs the color it needs from the texture, and draws it on the screen. This is called sampling: you are sampling from a texture when you pull a color out of it. Note that the value returned from tex2D, and from our method as a whole, is a four-component vector (a fixed4 here, which behaves like a lower-precision float4). It holds the rgba (Red, Green, Blue, Alpha) value of the pixel being sampled.
The input values I use to sample are called the texture coordinates. These are set by the earlier stages of the graphics pipeline. For a full-screen shader (post processing effect) these will range from 0 to 1. For other shaders, their bounds will match the portion of your sampled texture. This could still be 0 to 1, or a smaller range for some situations, such as a sprite from a sprite sheet or atlas.
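For context, here's a rough sketch of the vertex stage that typically feeds a fragment function like this in a Unity image effect. The exact structs and names vary from shader to shader; v2f and appdata_img are just the common convention, and the helpers assume UnityCG.cginc is included, so treat this as illustrative boilerplate rather than something you must copy exactly.

//Illustrative boilerplate: the data and vertex shader that feed the fragment functions above.
//Assumes UnityCG.cginc is included; the names are conventions, not requirements.
sampler2D _MainTex;

struct v2f
{
    float4 vertex : SV_POSITION;
    float2 uv : TEXCOORD0;
};

v2f vert (appdata_img v)
{
    v2f o;
    o.vertex = UnityObjectToClipPos(v.vertex);
    o.uv = v.texcoord; //pass the texture coordinates through to the fragment stage
    return o;
}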
Swizzling
fixed4 swizzle (v2f iTexCoord) : SV_Target
{
    fixed4 texColor = tex2D(_MainTex, iTexCoord.uv);
    //reorder the channels: r gets g, g gets b, b gets r, alpha stays put
    texColor = texColor.gbra;
    return texColor;
}
You can access the elements of any multi-element data type by using the letters corresponding to the elements after your variable. For a four element variable, the four items are rgba or xyzw. For two or three element variables, they are the first two or three letters used for the four-element type.
When accessing (reading or writing) those elements, you can list them in any order. Intentionally reordering them like this is called swizzling. In the above example, I'm rotating the three main color channels while leaving alpha in place.
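A few more patterns you might write, just to illustrate (these lines aren't part of the shaders above):

fixed4 col = tex2D(_MainTex, iTexCoord.uv);
fixed3 justColor = col.rgb; //read three of the four channels
fixed4 redScale = col.rrra; //repeat a channel: red in all three color slots, alpha kept
col.rb = col.br;            //swizzled write: swap the red and blue channels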
Inversion
fixed4 inversion (v2f iTexCoord) : SV_Target
{
    fixed4 texColor = tex2D(_MainTex, iTexCoord.uv);
    //invert each color channel, leaving alpha alone
    texColor.rgb = 1 - texColor.rgb;
    return texColor;
}
The point of this sample is to show that you can do math on your colors. Here I'm inverting the color by computing (1 - color) on each channel. Many Photoshop-style effects are just the result of doing math on the sampled color. Some need both the color of what you're drawing and the color of the background behind it, but many do not.
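To give one more example of that idea, here's a sketch of a simple brightness and contrast tweak in the same style. The _Brightness and _Contrast properties are hypothetical; you would declare and set them yourself.

//Sketch of simple color math: brightness and contrast.
//_Brightness and _Contrast are hypothetical properties you would declare yourself.
float _Brightness;
float _Contrast;

fixed4 brightnessContrast (v2f iTexCoord) : SV_Target
{
    fixed4 texColor = tex2D(_MainTex, iTexCoord.uv);
    //push values away from 0.5 for contrast, then shift everything for brightness
    texColor.rgb = saturate((texColor.rgb - 0.5) * _Contrast + 0.5 + _Brightness);
    return texColor;
}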
Time Movement
fixed4 timeMovement (v2f iTexCoord) : SV_Target
{
    float2 texCoord = iTexCoord.uv;
    //shift the x coordinate by a value that oscillates over time
    texCoord.x += _SinTime.x;
    float4 texColor = tex2D(_MainTex, texCoord);
    return texColor;
}
This demos two important concepts.
First is the concept of adjusting coordinates to animate what's going on. By adding a value that changes over time to one of the texture coordinates, I can change where I'm sampling, and thus change what I'm drawing. As discussed in my concept overview, you can sample outside the 0 to 1 range, depending on your texture settings. Note that things don't quite work the same if this texture is in a sprite atlas or sprite sheet. With an atlas or sheet, you are actually sampling a small part of a larger texture, which means your range is not really 0 to 1, and your ability to extend your bounds is limited; one workaround is sketched below.
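If you do need to scroll a sprite that lives in an atlas, one workaround is to wrap the coordinates yourself, inside the sprite's own rectangle. A rough sketch follows; the _SpriteRect property is hypothetical, and you'd pass in the sprite's corner and size however suits your setup.

//Sketch: scroll within an atlas by wrapping inside the sprite's rectangle.
//_SpriteRect is a hypothetical property: xy = bottom-left corner, zw = size.
float4 _SpriteRect;

float2 local = (texCoord - _SpriteRect.xy) / _SpriteRect.zw; //back into a 0-1 space
local.x = frac(local.x + _SinTime.x);                        //scroll and wrap around
texCoord = _SpriteRect.xy + local * _SpriteRect.zw;          //back into atlas space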
The second important observation is that I didn't declare this new variable _SinTime. It was just there. It turns out Unity provides a long list of handy shader inputs automatically. The full list is here: https://docs.unity3d.com/Manual/SL-UnityShaderVariables.html.
The two I use the most are _Time and _SinTime. Each is a float4 holding four differently scaled versions of time and sin(time), respectively.
Many engines will provide similar data, and if unavailable, you can provide these items from your own game code.
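As a sketch of what that can look like on the shader side, you can declare your own property (the _MyTime name here is hypothetical) and update it every frame from your game code, for example with Material.SetFloat in Unity.

//Sketch: supplying your own time value instead of relying on built-ins.
//_MyTime is a hypothetical property, set each frame from game code.
float _MyTime;

fixed4 timeMovementCustom (v2f iTexCoord) : SV_Target
{
    float2 texCoord = iTexCoord.uv;
    texCoord.x += sin(_MyTime); //same idea as before, just with our own clock
    return tex2D(_MainTex, texCoord);
}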
Texture Coordinate Swirl
fixed4 swirl (v2f iTexCoord) : SV_Target
{
    float2 texCoord = iTexCoord.uv;
    //Adjust the coordinates to have 0,0 in the center of the screen
    texCoord -= float2(0.5, 0.5);
    //Convert from cartesian to polar coordinates
    float radius = length(texCoord);
    float angle = 0;
    if (texCoord.x != 0)
        angle = atan(texCoord.y / texCoord.x);
    if (texCoord.x < 0)
        angle += 3.141592;
    //Rotate the angle based on time and distance from center
    // (further away will rotate faster)
    angle += _SinTime.x * radius * 10;
    //Convert back from polar to cartesian coordinates
    texCoord.x = radius * cos(angle);
    texCoord.y = radius * sin(angle);
    //un-adjust coordinates so 0,0 is back in the corner
    texCoord += float2(0.5, 0.5);
    //sample and return
    fixed4 col = tex2D(_MainTex, texCoord);
    return col;
}
This is the longest shader in the post, so I actually put comments in it. How responsible.
There are a few key ideas I want to point out. For one, I assume that the coordinates run from 0 to 1 across the screen, hence (0.5, 0.5) being the center. Adjusting the base (0,0) of your coordinates is helpful whenever you need to do anything symmetric. Here, once I've adjusted (0,0) to be the center of the screen, I can convert to and from polar coordinates using simple trigonometry. If you are not familiar with this math, feel free to just steal it, or do some research on the concepts.
My rotation algorithm (rotating more the further I am from center) is illustrated below.
A second thing worth pointing out is the set of functions length(), atan(), sin() and cos(). These are not Unity specific; they are just a few of the many built-in math functions that come with shader languages. In almost all cases they are the same between languages, and if not, you can search for your language plus the (seemingly) wrong function name to find the corresponding one. Note that while these functions come for free, executing them does not. They are very well optimized, but can still be expensive, depending on the function in question.
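As a side note, most shader languages also provide atan2(y, x), which handles the quadrant correction I did by hand in the swirl shader. The same conversion could look roughly like this:

//Sketch: the polar conversion from the swirl shader using the built-in atan2,
//which takes care of the x < 0 case for you
float radius = length(texCoord);
float angle = atan2(texCoord.y, texCoord.x);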
Color Tint and Chroma Key
fixed4 ChromaTint (v2f iTexCoord) : SV_Target
{
    float4 texColor = tex2D(_MainTex, iTexCoord.uv);
    //check for our chroma key. this specific math
    // will find colors that are roughly magenta-like
    if (texColor.g < texColor.b - 0.1 &&
        texColor.g < texColor.r - 0.1)
    {
        float delta = abs(texColor.b - texColor.r);
        if (delta < 0.2)
        {
            //make the magenta even more magenta
            return float4(
                texColor.r * 2,
                texColor.g * 0.5,
                texColor.b * 2,
                texColor.a);
        }
    }
    //convert to grayscale using perceptual (luma) weights
    float gray = texColor.r * 0.3 +
                 texColor.g * 0.59 +
                 texColor.b * 0.11;
    //tint and return
    return float4(gray, gray * 0.9, gray * 0.75, texColor.a);
}
Admittedly this one looks a little odd, and in reality it doesn't teach much about shader theory. I included it because it combines two techniques I've had to use a lot: one is chroma key, the other is grayscale tinting.
Chroma Key is the process of using a specific color in an image as a key in some algorithm. The most common example of this is putting a person in front of a green screen for filming. The algorithm replaces a specific range of green shades with alpha so that the actors (or news anchor) can be placed over an alternate background. In the example I have, I use the chroma key not to hide parts of the image, but to exclude those parts from the main algorithm. Instead, those parts get their color blown up a bit.
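For reference, the more traditional green-screen use of a chroma key knocks the keyed color out to transparency instead. A rough sketch follows; the thresholds are arbitrary, and a real keyer worries much more about soft edges and color spill.

//Sketch: classic chroma key, turning sufficiently green pixels transparent.
//Thresholds are arbitrary; displaying the result needs a transparent blend mode.
fixed4 greenScreen (v2f iTexCoord) : SV_Target
{
    fixed4 texColor = tex2D(_MainTex, iTexCoord.uv);
    if (texColor.g > 0.5 &&
        texColor.g > texColor.r + 0.2 &&
        texColor.g > texColor.b + 0.2)
    {
        texColor.a = 0; //keyed out
    }
    return texColor;
}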
Grayscale tinting is the process of converting an image to grayscale, then multiplying that gray by some base color. The main takeaway from this section of the shader is that converting a color to gray is not as simple as averaging the channels. The human eye does not perceive each of the RGB channels equally; green affects our perception of brightness far more than blue does.
Demo
Here are the above shaders demoed in Unity. Click the "Sprite" or "Screen" buttons to cycle through the shaders. For notes on how to get a running game into a WordPress page, see this tutorial.
Conclusion
Hopefully these basic shader samples are helpful to you. These are not intended to impress your friends at a party (though they may). Instead they are jumping off points. Copy/paste them into a Unity game, or anywhere else, as the baseline that you build your beautiful shaders on top of.
Do something. Make progress. Have fun.