Create erasable surface

I’m trying to make a game that’s kind of like a lottery scratch card, but I don’t really know how I’d go about making the scratch surface. My current thought is using a plane, and while the player holds the left mouse button and drags the cursor it erases the part of the plane around the cursor. The other option I thought of, which would have a massive impact on performance, would be to have a large number of tiny planes that disable as the cursor passes near them. I’d prefer to avoid the second option if at all possible.

You need an alpha texture and you have to change it at runtime.

How? You can access it via lock() and change the pixel colors.

Then you have to transform your cursor position with screenToWorld, blah-blah-blah.

Or better


Pass your pointer position into the shader (don’t forget to divide it by the screen size), and then in the pixel shader you know exactly where the pointer is relative to this texture.

Combine it with a render target and bingo, I guess it should work.
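The “divide it by screen size” part is plain math and doesn’t need any engine API. A minimal sketch (function and parameter names here are made up for illustration):

```javascript
// Sketch: normalize a pointer position to [0, 1] UV-style coordinates
// before passing it to the shader as a uniform. screenWidth/screenHeight
// would come from the graphics device in a real project.
function pointerToUv(x, y, screenWidth, screenHeight) {
    return {
        u: x / screenWidth,
        // Flip Y: screen coordinates grow downward, UV coordinates grow upward.
        v: 1 - y / screenHeight
    };
}

// Example: cursor at the center of a 1280x720 screen maps to (0.5, 0.5).
var uv = pointerToUv(640, 360, 1280, 720);
```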

I’m guessing this is what they call a fragment shader here?

Yes, fragment shader is correct, I just have a bad habit of calling it “pixel shader” (I’ve been working with DX for a while).

So since this is the first time I’m using shaders, I’m assuming I need this in the shader (GLSL) file:

varying vec2 vUv0;

uniform sampler2D uDiffuseMap;
uniform sampler2D uHeightMap;
uniform float uTime;

void main(void)
{
    float height = texture2D(uHeightMap, vUv0).r;
    vec4 color = texture2D(uDiffuseMap, vUv0);
    if(height < uTime)
    {
        color = vec4(0,0.2,1,0.1);
    }
    gl_FragColor = color;
}

and then a script to define it like this:

var vertexShader = this.vs.resource;

// dynamically set the precision depending on device
// (gd is the graphics device; in a PlayCanvas script that is this.app.graphicsDevice)
var gd = this.app.graphicsDevice;
var fragmentShader = "precision " + gd.precision + " float;\n";
fragmentShader = fragmentShader + this.fs.resource;


// A shader definition used to create a new shader.
var shaderDefinition = {
    attributes: {
        aPosition: pc.gfx.SEMANTIC_POSITION,
        aUv0: pc.gfx.SEMANTIC_TEXCOORD0
    },
    vshader: vertexShader,
    fshader: fragmentShader
};

Yes, but this is a base for your effect.

Your pipeline is:

  1. Create a texture for opacity and a pc.RenderTarget with this texture.
  2. Obtain the pointer position and divide it by the screen size.
  3. Create a new layer. Add your scratch-surface entity there (World + new layer).
  4. Set your layer’s renderTarget to the target from step 1.
  5. Use the texture from step 1 as the opacity map for your material.
  6. Copy your opacity texture to a temporary one in the onPreRender callback of your layer.
  7. Write a custom shader. It should receive the pointer position and the temporary texture.
  8. In this shader, transform the vertex coords to screen space:
    new_pos = matrix_modelViewProjection * vec4(position, 1.0)
  9. Pass this new_pos.xy to the fragment shader.
  10. Compare the UV coords from the vertex shader with the pointer position; if they match, render a transparent pixel, otherwise sample from the temporary texture.

That’s it. Try to do it step by step and ask questions.
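A rough sketch of the fragment shader from steps 7–10 could look like this. The uniform names (uTemp, uPointer, uRadius) are assumptions, not fixed by the steps above, and a small distance threshold is used instead of exact equality, since interpolated UVs almost never match the pointer position exactly:

```glsl
varying vec2 vUv0;
uniform sampler2D uTemp;   // temporary copy of the opacity texture (step 6)
uniform vec2 uPointer;     // pointer position, already divided by screen size (step 2)
uniform float uRadius;     // brush radius in UV units

void main(void)
{
    // Start from the previous opacity value so scratches accumulate.
    float opacity = texture2D(uTemp, vUv0).r;
    // Erase pixels near the pointer.
    if (distance(vUv0, uPointer) < uRadius) {
        opacity = 0.0;
    }
    gl_FragColor = vec4(opacity);
}
```

This shader renders into the opacity texture via the layer’s render target, so each frame’s result feeds the next one.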

This would be the texture I want to be visible before scratching yes?

Straightforward enough.

Nope, it would be the opacity map. Like, if the pixel color is 1, render the pixel from your diffuseMap; if it’s 0, discard it.
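In GLSL, that check might look something like this sketch (uOpacityMap is an assumed uniform name for the render-target texture):

```glsl
varying vec2 vUv0;
uniform sampler2D uDiffuseMap;
uniform sampler2D uOpacityMap;

void main(void)
{
    float mask = texture2D(uOpacityMap, vUv0).r;
    // Scratched-away pixels are discarded, revealing what is underneath.
    if (mask < 0.5) {
        discard;
    }
    gl_FragColor = texture2D(uDiffuseMap, vUv0);
}
```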

Yep.

So something like the example here?

Better to find an example for pc.RenderTarget. IIRC there was an example with an appropriate texture as well.

In both the API for pc.RenderTarget and the tutorial example you provided before, the texture setup is similar to the one in the example I provided, give or take a few extra definitions.

Just use this format (R8G8B8A8), and device.width and device.height in the definition.

So something like this then, or are there some things I’m missing / don’t need?

var Rendertarget = function(entity)
{
    this.entity = entity;
};

// initialize code called once per entity
Rendertarget.prototype.initialize = function() 
{
    // The graphics device comes from the running application.
    var device = pc.Application.getApplication().graphicsDevice;
    var texture = new pc.Texture(device,
    {
        width: device.width,
        height: device.height,
        format: pc.PIXELFORMAT_R8_G8_B8_A8
    });
    texture.minFilter = pc.FILTER_LINEAR;
    texture.magFilter = pc.FILTER_LINEAR;
    var renderTarget = new pc.RenderTarget(device, texture,
        {
            depth: true
        });

    this.entity.camera.renderTarget = renderTarget;
};

Oh, I’m sorry. Your texture size:

width: device.width,
height: device.height,

should match your diffuse texture. If that’s 256x256, this has to be the same.

You don’t need depth.

Doesn’t work anymore. You have to use layers, like I said above.

I don’t get it at all.

Probably just another old thing that doesn’t work then; it was in the render-to-texture tutorial.

Okay, then delete it.

This should become surface.renderTarget = renderTarget;, where surface is a layer I assign to the entity?

You have to create a new layer and place your surface entity there, and in the World layer too.

Show me your code then.

Here’s the project link, I made a new layer called surface and the entity is in both that layer and world.
https://playcanvas.com/editor/scene/606098
And here’s direct link to the script https://playcanvas.com/editor/code/550396?tabs=12151983,12172825,12173951,12156405 (shader.js)

Add this layer’s sublayers somewhere after World solid.

I don’t see your render target, only the texture.
Don’t forget about the temporary texture.