2D Tilemap using shaders

Update: an updated version can be found here

Recently I had been wondering if I could find an easy way of rendering a 2D tilemap with a low number of vertices and draw calls. My solution was to create a custom shader that selects the correct tile and draws it, which means I can draw a complete tilemap with just one quad (two triangles). As input I generate an extra texture that holds the tilemap information. The texture reserves space for animation data, but I haven't programmed that part yet.
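The encoding idea can be sketched outside Unity in plain Python (the helper names here are mine, not from the original script): each tile's x/y index is divided by the tile count so it fits into a 0–1 color channel, matching the color layout the script below writes into the data texture.

```python
# Sketch of the channel encoding used by the tilemap data texture.
# Each pixel stores: r = tileX / tileCountX, g = tileY / tileCountY
# (b and a hold the animation data, which is not used yet).

def encode_tile(tile_x, tile_y, tile_count_x, tile_count_y):
    """Pack an integer tile index into normalized (r, g) channels."""
    return (tile_x / tile_count_x, tile_y / tile_count_y)

def decode_tile(r, g, tile_count_x, tile_count_y):
    """Recover the integer tile index from the color channels."""
    return (round(r * tile_count_x), round(g * tile_count_y))

# Round-tripping tile (2, 0) in an 8x8 tileset:
r, g = encode_tile(2, 0, 8, 8)
assert decode_tile(r, g, 8, 8) == (2, 0)
```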

Here is the C# script that generates the texture. Currently it fills it with some test data.

using UnityEngine;
using System.Collections;

public class Tilemap2D : MonoBehaviour {
  
  public Vector2 size;
  public Vector2 tileSize;
  public Texture2D tilesetTexture;
  
  private const int MAX_ANIM_LENGTH = 1;
  private const int MAX_ANIM_FRAMERATE = 1;
  
  // Use this for initialization
  void Start () {
    
    // color layout
    // r: x index
    // g: y index
    // b: animation length
    // a: animation framerate
    
    // TODO make size square and power of 2
    
    // create texture from data
    Texture2D texture = new Texture2D((int)size.x, (int)size.y, TextureFormat.ARGB32, false);
    texture.filterMode = FilterMode.Point;
    Vector2 tileCount = new Vector2(tilesetTexture.width/tileSize.x,tilesetTexture.height/tileSize.y);
    for (int i=0; i<size.x; i++)
    {
      for (int j=0; j<size.y; j++)
      {
        // default tile (0,0); use 1f to avoid integer division if the
        // animation constants are ever raised above 1
        texture.SetPixel(i,j, new Color(0,0,1f/MAX_ANIM_LENGTH, 1f/MAX_ANIM_FRAMERATE));
        // test data: tile (1,0) along column 5, tile (2,0) along row 5
        if (i==5) texture.SetPixel(i,j, new Color(1/tileCount.x,0/tileCount.y,1f/MAX_ANIM_LENGTH, 1f/MAX_ANIM_FRAMERATE));
        if (j==5) texture.SetPixel(i,j, new Color(2/tileCount.x,0/tileCount.y,1f/MAX_ANIM_LENGTH, 1f/MAX_ANIM_FRAMERATE));
      }
    }
    
    texture.Apply();
    
    // set shader properties (newer Unity versions need
    // GetComponent<Renderer>() instead of the renderer property)
    renderer.material.SetTexture("_MainTex", tilesetTexture);
    renderer.material.SetTexture("_Tilemap", texture);
    renderer.material.SetVector("_TilesetSize", new Vector4(tilesetTexture.width, tilesetTexture.height, tileSize.x, tileSize.y));
    renderer.material.SetVector("_MapSize", new Vector4(size.x, size.y, tileCount.x, tileCount.y));
    
  }
  
  // Update is called once per frame
  void Update () {
    // update texture if necessary
  }
}
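The TODO in the script (making the data texture square and a power of two) could be handled with a small rounding helper; this is a sketch in plain Python (the helper name is mine, not part of the original script):

```python
def next_power_of_two(n):
    """Smallest power of two that is >= n (for n >= 1)."""
    p = 1
    while p < n:
        p *= 2
    return p

# A 20x13 map would get a 32x32 data texture (the larger dimension,
# rounded up), with only the top-left 20x13 pixels actually used.
assert next_power_of_two(20) == 32
assert next_power_of_two(13) == 16
assert next_power_of_two(32) == 32
```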


And here is the shader I used:

Shader "Tilemap/Tilemap2DWorldCoord" {
  Properties {
    _MainTex ("Tileset", 2D) = "black" {}
    _Tilemap ("Tilemap", 2D) = "black" {}
    _MapSize ("MapSize/tileCount", vector) = (0,0,0,0)
    _TilesetSize ("Size/Tilesize", vector) = (0,0,0,0)
  }
  SubShader {
    Tags { "RenderType"="Opaque" }
    LOD 200
    pass
    {  
      CGPROGRAM
      #pragma vertex vert
      #pragma fragment frag
      #include "UnityCG.cginc"

      sampler2D _MainTex;
      sampler2D _Tilemap;
      float4 _MapSize;
      float4 _TilesetSize;
      
      struct Vin
      {
        float4 vertex : POSITION;
      };
      
      struct Vout
      {
        float4 vertex : SV_POSITION;
        float2 texcoord : TEXCOORD0;
      };
      
      Vout vert(Vin vin)
      {
        Vout result;
        result.vertex = mul(UNITY_MATRIX_MVP, vin.vertex);
        // map the world-space XZ position into the tilemap's UV space
        result.texcoord = mul(_Object2World, vin.vertex).xz / _TilesetSize.zw;
        return result;
      }
      
      float4 frag(Vout vout) : COLOR
      {
        // get tile; tile.x and tile.y is the tileposition
        float4 tile = tex2D(_Tilemap, vout.texcoord);
        
        // offset inside the current tile: bring the 0..1 map UVs into
        // 0..mapSize tile space, keep the fractional part, then scale by
        // one tile's UV size (1 / tile count, stored in _MapSize.zw)
        float2 tileOffset = frac(vout.texcoord * _MapSize.xy) / _MapSize.zw;
        
        // small bias on u so sampling never crosses into a neighbouring
        // tile at the seam
        float2 bias = float2(-0.001, 0);
        
        return tex2D(_MainTex, tile.xy + tileOffset + bias);
      }
      ENDCG
    }
  } 
  FallBack "Diffuse"
}



The shader is relatively simple. First it samples the tilemap texture to get the correct tile; the tile position is encoded in the red and green channels, where red is the x and green is the y of the tile in the tileset. Next it calculates the internal tile offset, which is needed to sample across the whole tile instead of only its first pixels. Multiplying the texture coordinates by the map size brings them from 0–1 space into 0–map-size space, and the fractional part of that product is the internal offset within the current tile, again in 0–1 space. Dividing this by the size of one tile (in UV space) gives the final offset. Finally the shader samples the tileset texture at the tile position plus this offset to get the resulting color.
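The fragment-stage math can be checked outside the shader. This Python sketch mirrors the steps above under the same conventions (uv is the 0–1 coordinate across the whole map, map_size is the map dimensions in tiles, tile_count is the number of tiles along each axis of the tileset); the values below are illustrative, not from the post.

```python
from math import floor

def frac(x):
    """Fractional part, like Cg's frac()."""
    return x - floor(x)

def sample_uv(uv, tile_index, map_size, tile_count):
    """Tileset UV that the fragment would sample (bias omitted)."""
    # Lower-left corner of the tile in tileset UV space; this is what
    # the r/g channels of the data texture encode.
    tile_origin = (tile_index[0] / tile_count[0],
                   tile_index[1] / tile_count[1])
    # Position inside the current tile, 0..1 ...
    inner = (frac(uv[0] * map_size[0]), frac(uv[1] * map_size[1]))
    # ... scaled down to the size of one tile in UV space.
    return (tile_origin[0] + inner[0] / tile_count[0],
            tile_origin[1] + inner[1] / tile_count[1])

# Center of map cell (0, 0) on a 10x10 map showing tile (2, 0)
# from a 16x16-tile tileset:
u, v = sample_uv((0.5 / 10, 0.5 / 10), (2, 0), (10, 10), (16, 16))
```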

The only problem I found was that I had to add a small bias to the u coordinate to prevent it from sampling a neighbouring tile. This bias may not be needed on some graphics cards, and other cards may need a bias in the opposite direction.
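A common alternative to a fixed bias (not from the post, just a known technique) is to clamp the intra-tile offset so the sample always stays at least half a texel away from the tile border:

```python
def clamp_inner(inner, tile_size_px):
    """Keep a 0..1 intra-tile coordinate at least half a texel away
    from both edges of the tile, so sampling cannot bleed into a
    neighbouring tile regardless of rounding direction."""
    half_texel = 0.5 / tile_size_px
    return min(max(inner, half_texel), 1.0 - half_texel)

# With 16-pixel tiles, half a texel is 1/32 of the tile:
assert clamp_inner(0.0, 16) == 0.03125
assert clamp_inner(0.5, 16) == 0.5
assert clamp_inner(1.0, 16) == 0.96875
```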

Example Files

2 thoughts on "2D Tilemap using shaders"

    • Thanks.

      I’ve added a unitypackage to the post containing an example scene and the needed files.

      Remember that this was just made as an experiment and that performance wise this might not be the best solution.
