Shader Optimization

Hi! So I’ve been working on the dithering shader mentioned in my previous post. It works fine, and I also implemented palette quantization. However, I’m seeing a 20 ms increase in frametime when I toggle the shader on. I believe the issue lies in my closest-colors function:

vec3[2] closestColors(vec3 color) {
    const int paletteLength = 32;
    vec3 ret[2];
    // Sentinels outside the valid color range, so any real palette entry wins.
    vec3 closest = vec3(-2, 0, 0);
    vec3 secondClosest = vec3(-2, 0, 0);
    for (int i = 0; i < paletteLength; ++i) {
        vec3 temp = rgb2hsl(palette[i]);
        float tempDistance = distanceSquared(temp, color);
        if (tempDistance < distanceSquared(closest, color)) {
            // New best match; the old best becomes the runner-up.
            secondClosest = closest;
            closest = temp;
        } else if (tempDistance < distanceSquared(secondClosest, color)) {
            secondClosest = temp;
        }
    }
    ret[0] = closest;
    ret[1] = secondClosest;
    return ret;
}

where I sample the color palette every second. Unfortunately, I don’t know of any way to optimize this. Any ideas are welcome.
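One thing worth noting about the loop itself: distanceSquared(closest, color) and distanceSquared(secondClosest, color) are recomputed on every iteration, and rgb2hsl runs 32 times per fragment. Here is a sketch that caches the two best distances and assumes the palette was converted to HSL once on the CPU (the hslPalette uniform is hypothetical, not from the original code):

// Hypothetical variant: hslPalette is assumed pre-converted to HSL on the CPU.
uniform vec3 hslPalette[32];

vec3[2] closestColorsFast(vec3 color) {
    vec3 closest = vec3(-2, 0, 0);
    vec3 secondClosest = vec3(-2, 0, 0);
    // Track the two best distances so they are never recomputed.
    float closestDist = 1e20;
    float secondDist = 1e20;
    for (int i = 0; i < 32; ++i) {
        float d = distanceSquared(hslPalette[i], color);
        if (d < closestDist) {
            secondClosest = closest;
            secondDist = closestDist;
            closest = hslPalette[i];
            closestDist = d;
        } else if (d < secondDist) {
            secondClosest = hslPalette[i];
            secondDist = d;
        }
    }
    vec3 ret[2];
    ret[0] = closest;
    ret[1] = secondClosest;
    return ret;
}

This removes the per-fragment rgb2hsl calls and the redundant distance computations, but it is still a 32-iteration loop per fragment, so the bigger win is the lookup-texture approach below.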

I suspect you need to find a different way to do this. Typically these things are done with a 3D texture lookup: you do all the closest-color work on the CPU side and upload the results into a 3D texture, at a resolution of perhaps 32x32x32. Then the shader does a single lookup with the color as the 3D UV coordinate.
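A minimal sketch of the shader side of that idea, assuming the CPU has already baked the closest palette color for every RGB cell into a nearest-filtered 32x32x32 texture (the paletteLUT name and function are illustrative, not from the thread):

uniform sampler3D paletteLUT; // baked once on the CPU, nearest filtering

vec3 closestColorLUT(vec3 color) {
    // The fragment's RGB value in [0,1] is used directly as the 3D texture
    // coordinate; the texel already holds the nearest palette color.
    return texture(paletteLUT, color).rgb;
}

Since the dither needs the two closest colors, the bake could store the runner-up in a second 3D texture sampled with the same coordinate.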

Thanks for the advice, will look into 3d textures.

Define 3d texture.

Something like this: Color Grading / Correction – kosmonaut's blog

You can store the data in a 3D (volume) texture or, as shown in the article, in a 2D texture, which is probably easier to use and compatible with more hardware.
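A sketch of that 2D variant, assuming the 32x32x32 table is unwrapped into a 1024x32 texture with the 32 blue-channel slices laid out left to right, as in the linked article (the layout and the paletteLUT2D name are assumptions):

uniform sampler2D paletteLUT2D; // 1024x32: 32 slices of 32x32, side by side

vec3 closestColorLUT2D(vec3 color) {
    const float S = 32.0; // table resolution per axis
    // Snap each channel to the nearest of the S cells.
    float r = floor(color.r * (S - 1.0) + 0.5);
    float g = floor(color.g * (S - 1.0) + 0.5);
    float b = floor(color.b * (S - 1.0) + 0.5);
    // The blue channel picks the slice along x; +0.5 hits texel centers.
    vec2 uv = vec2((b * S + r + 0.5) / (S * S), (g + 0.5) / S);
    return texture(paletteLUT2D, uv).rgb;
}

For palette quantization, nearest sampling is what you want here; the trilinear blending the article uses for color grading would blend between palette entries instead of snapping to one.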