Shader Series 3 – Shaders In Unity

There are quite a few paths to head down when creating shaders in Unity. There is support for several types of code-based shaders, and as of 2018.1 you can use Shader Graph. Today's blog will cover the basic setup for each of those methods, providing a framework to better equip you to use the info in my Shader Series lessons.

To go along with this, I've set up a bare-bones Unity project that you can use as a starting point for playing with shaders. It's available on GitHub here.

This article may contain affiliate links, mistakes, or bad puns. Please read the disclaimer for more information.

Code Based Shaders

Right-click in your Project view to create a shader. You'll have four types of shaders to choose from: Standard Surface, Unlit, Image Effect, and Compute.

Compute

I’m ignoring this one for now.  It’s for computing, not graphics-ing, and is outside the current scope of my lessons.

Standard Surface

This shader somewhat cheats. If you remember from the concept overview, there are vertex and fragment shader steps. Yet in this shader there's only one method, and it's called "surf". Conceptually, you can think of it as a step in the middle of a large fragment shader. By being just a sub-method, it lets you customize the color behavior without having to worry about lighting, depth, fog, etc. Many of my lessons related to fragment or full-screen shaders will apply just as well to a surface shader.
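To make the "one method in the middle" idea concrete, here's a trimmed-down sketch of what Unity's Standard Surface template boils down to. The shader name and property set here are my own minimal choices; Unity's generated template includes a few more properties and pragmas.

```
Shader "Custom/MinimalSurface"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        // Unity generates the surrounding vertex/fragment pair and
        // lighting code; surf only fills in the surface properties.
        #pragma surface surf Standard

        sampler2D _MainTex;

        struct Input
        {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex);
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```

Notice there's no explicit lighting math anywhere; writing to `o.Albedo` is all the color customization you do, and the generated code handles the rest.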

Unlit & Image Effect

I lump Unlit and Image Effect shaders together because they are almost the same. Both are set up to be bare-bones shaders, giving you no lighting or other built-in Unity effects. The difference is that Unlit is set up to be a bare-bones 3D shader, while Image Effect is set up to be a bare-bones full-screen (post-processing) shader. That means Image Effect shaders also work for 2D sprites; the only tweaks potentially needed are to support transparency.

There is nothing special about starting with either of these two shaders. Both create shader files, and either can be slightly edited to look like the other. I've created a basic "2D with transparency" shader below, with comments around the code you would change to swap 2D/3D or Transparent/Opaque/Camera. You can also find this shader in the sample project I put up on GitHub.

Shader "Custom/Image"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        //** No culling or depth -- use for post proc
        //Cull Off ZWrite Off ZTest Always
        //** No culling -- use for non-transparent sprites.
        //Cull Off ZWrite On ZTest Always
        //** transparent sprite
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        ZWrite Off
        Blend SrcAlpha OneMinusSrcAlpha
        //** 3D things
        //Tags { "Queue"="Geometry" "RenderType"="Opaque" }
        //ZWrite On
        
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment sampling

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            sampler2D _MainTex;

            fixed4 sampling (v2f iTexCoord) : SV_Target
            {
                fixed4 texColor = tex2D(_MainTex, iTexCoord.uv);
                return texColor;
            }
            
            ENDCG
        }
    }
}

Shadering GameObjects and Sprites

Note, “shadering” is not a word… yet.  So definitely use it all the time and let’s hope it catches on.

To apply your shader to a GameObject, first create a material, then add the shader to the material. Apply the material to your renderer component, and you're done. Make sure to set "Render Queue" to "From Shader". This lets the shader dictate whether the render will use alpha blending.
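If you'd rather wire the material up from code instead of the Inspector, a minimal sketch looks like this (the class and field names are my own):

```
using UnityEngine;

public class ApplyShaderMaterial : MonoBehaviour
{
    // Assign a material that uses your shader in the Inspector.
    public Material shaderMaterial;

    void Start()
    {
        // MeshRenderer, SpriteRenderer, etc. all derive from Renderer,
        // so this works for 2D and 3D objects alike.
        GetComponent<Renderer>().material = shaderMaterial;
    }
}
```

Assigning to `.material` gives the object its own material instance, so tweaks won't affect other objects sharing the original material.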

Full Screen Shaders

Full-screen shaders (aka post-processing effects, or camera shaders) require a small bit of code to get going. Step one is to create your shader. From there, we'll need to add a script to your camera. The simplest script takes a material as its input. This means you need to create a material and set the shader on it, just as you would for a sprite material. Alternatively, you can feed the shader into the script and create the material on the fly. That's what I'll do below, as it's the more flexible solution.

Update: this section only covers shaders in Unity's built-in render pipeline. For an overview of this type of shader setup in any renderer, see my tutorial on the subject.

Camera Changes

Add a new script component to your main camera. I called mine AwesomeScreenShader.cs.  

public class AwesomeScreenShader : MonoBehaviour {
    public Shader awesomeShader = null;
    private Material m_renderMaterial;

    void Start()
    {
        if (awesomeShader == null)
        {
            Debug.LogError("no awesome shader.");
            m_renderMaterial = null;
            return;
        }
        // Build the material on the fly from the assigned shader.
        m_renderMaterial = new Material(awesomeShader);
    }

    // Called after the camera renders; blits the frame through our
    // material so the shader runs on every pixel of the screen.
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        Graphics.Blit(source, destination, m_renderMaterial);
    }
}

Set your new shader as the input to this component and you're done. If you wanted a simpler script, you'd just make m_renderMaterial public (maybe renamed) and delete the Start() method.
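For reference, that simpler material-based variant would look something like this (the naming is my own):

```
using UnityEngine;

public class SimpleScreenShader : MonoBehaviour
{
    // Assign a pre-made material (with your shader on it) in the Inspector.
    public Material renderMaterial;

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Copy the rendered frame to the output through the material.
        Graphics.Blit(source, destination, renderMaterial);
    }
}
```

The trade-off: this version needs a material asset created ahead of time, while the shader-based version above builds its material at runtime.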

Custom Inputs

The Unity docs cover defining and controlling inputs well, so I’ll just give the brief version here.

Edit Time

To define your inputs, you have to declare your variables in two places within the shader file: inside the "Properties" section at the top of the file, and again within the SubShader section. The first defines how the variable is displayed and accessed by Unity; the latter, how it is accessed by the shader.

Properties
{
	_MainTex ("Texture", 2D) = "white" {}
	_SomeVariable ("Something", Int) = 12
}
SubShader
{
	Pass
	{
		CGPROGRAM
	   //...
		sampler2D _MainTex;
		int _SomeVariable;

Runtime

At runtime, you access the shader's properties through the material and call one of the Set* methods. For example, on my sprite, I could put a component that did this:

void Start()
{
	var sprite = GetComponent<SpriteRenderer>();
	var mat = sprite.material;
	mat.SetInt("_SomeVariable", 53);
}

Shader Graph

Shader Graph functions as a visual scripting system for shaders within Unity. For the most part, it supports the fundamental building blocks you'd actually use while coding by hand, as well as many helper blocks.

I recommend this video as a good overview.

Update: After posting this, I’ve shifted a lot of my tutorials to be Shader Graph focused, as that is the recommended way to use URP and HDRP. For the first article on the subject start here.

On To The Next Thing

This write-up and the GitHub project should get you set up to make shaders in Unity. Check out the Shader Series overview for ideas and inspiration.

Do something. Make progress. Have fun.