Mapping touch/click event coordinates to button coordinates

Hi,

A sample project is provided here:
https://playcanvas.com/editor/scene/1434728

We have a button that the user can click.
Currently, whenever it is clicked, the event's x,y coordinates (as provided by the event object) are written to the console, roughly as in the sketch below.
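For context, the handler looks roughly like this (a sketch, not the exact project code; the script name is a placeholder):

```javascript
// Log the event's window-space coordinates whenever the button is clicked.
var LogClick = pc.createScript('logClick');

LogClick.prototype.initialize = function () {
    this.entity.button.on('click', function (event) {
        // These values are in window pixels, measured from the top-left
        // corner, so they vary with the display/device resolution.
        console.log('click at', event.x, event.y);
    }, this);
};
```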

However, if the display device changes (say, from an iPad to a Galaxy Note), the {x,y} coordinates are very different.
How do we convert these into coordinates that are more consistent in the local space of the button itself?
There is a more complex reason why we need the coordinates localized, so splitting the element into two buttons is not trivial; we need a way to handle this within the script.

Thanks.

Hi @sciseedlets,

Yes, those coordinates are measured from the top-left corner of the window, so when the display size changes, they change.

The button itself doesn’t have coordinates in pixels, but in local and world space (since it’s part of the 3D scene).

If you are looking to get the position of the click in the world, in front of the camera at a specified distance, you can use the following method:

https://developer.playcanvas.com/api/pc.CameraComponent.html#screenToWorld
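A minimal sketch of how that could look, assuming the script is given a camera entity via an attribute (attribute names and the distance value here are placeholders):

```javascript
// Convert the click's screen coordinates to a world-space point at a fixed
// distance in front of the camera.
var ClickToWorld = pc.createScript('clickToWorld');

ClickToWorld.attributes.add('camera', { type: 'entity' });
ClickToWorld.attributes.add('distance', { type: 'number', default: 5 });

ClickToWorld.prototype.initialize = function () {
    this.entity.button.on('click', function (event) {
        // event.x / event.y are window pixels for mouse events; for touch
        // events the coordinates come from the Touch object instead.
        var worldPos = this.camera.camera.screenToWorld(event.x, event.y, this.distance);
        console.log('World position:', worldPos.toString());
    }, this);
};
```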

If you can share some info on what you are trying to achieve, we may be able to be of more help.


Thanks @Leonidas.

The screenToWorld option worked, but we also tested the Padding option (in the Editor, under the button entity's options) using negative values.
While the documentation on it is sparse, we figured out the order in which the Vec4 components map to the borders.

That solved our problem, which was that we wanted only half of the button to respond to touch; a sketch of the idea is below.
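Roughly what we ended up doing, expressed in script rather than in the Editor (the edge order of the Vec4 components is an assumption here and should be verified, since the docs do not spell it out):

```javascript
// Shrink the button's hit area using ButtonComponent#hitPadding (a pc.Vec4).
var HalfHitArea = pc.createScript('halfHitArea');

HalfHitArea.prototype.initialize = function () {
    var button = this.entity.button;
    // Negative padding shrinks the hit area. Here we pull one edge in by
    // half the element's width so only half the button responds to touch.
    // Which Vec4 component maps to which edge is assumed (left, bottom,
    // right, top) -- confirm against your own testing.
    var halfWidth = this.entity.element.calculatedWidth * 0.5;
    button.hitPadding = new pc.Vec4(0, 0, -halfWidth, 0);
};
```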

It also solved another problem without us needing to add another button to the scene.
Whoever came up with the Padding option deserves kudos.

Thanks
