In the game you can drag or click the arrow to set the distance the golf ball will travel. I use a raycast that hits the ground and place the marker at the hit position, but if I tap the blue “Shot View” button to change the view, the marker just jumps to where the “Shot View” button is. I don't want the raycast to pass through the button and hit the ground; the tap should simply behave like a button click.
Both the raycast and the button fire on the same “touchstart” event.
On desktop I solved this by using the cursor's hover state: if the mouse was hovering over a UI element, I wouldn't fire the raycast.
One easy way to solve this is a global variable that acts as a switch, set whenever a touchstart event lands on a UI element. If it's true for that frame, cancel all other touchstart event handlers.
Let's say this is your UI button script:
var BtnStates = pc.createScript('btnStates');

BtnStates.buttonClicked = false;

// initialize code called once per entity
BtnStates.prototype.initialize = function () {
    // touch events
    this.entity.element.on('touchstart', this.onPress, this);
    this.entity.element.on('touchend', this.onRelease, this);
};

BtnStates.prototype.onPress = function (event) {
    BtnStates.buttonClicked = true;
};

BtnStates.prototype.onRelease = function (event) {
    // --- we put it in a zero timeout to have it propagate to the next frame
    window.setTimeout(function () {
        BtnStates.buttonClicked = false;
    }, 0);
};
Now, in any other script that has a touchstart event handler, you can do this:
// initialize code called once per entity
MyScript.prototype.initialize = function () {
    // touch events
    this.app.touch.on('touchstart', this.onPress, this);
};

MyScript.prototype.onPress = function (event) {
    // bail out if a UI button already claimed this touch
    if (BtnStates.buttonClicked === true) return false;
    // do your raycast
};
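Outside of PlayCanvas, the same shared-flag pattern can be sketched in plain JavaScript. The names here (uiState, onUiPress, onUiRelease, onWorldTouch) are illustrative, not part of any API:

```javascript
// Minimal sketch of the shared-flag pattern without the PlayCanvas API.
var uiState = { buttonClicked: false };

// UI button's touchstart handler: raise the flag
function onUiPress() {
    uiState.buttonClicked = true;
}

// UI button's touchend handler: clear the flag on the next tick, so
// every other handler fired this frame still sees it raised
function onUiRelease() {
    setTimeout(function () {
        uiState.buttonClicked = false;
    }, 0);
}

// Scene's touchstart handler: skip the raycast while the flag is up
function onWorldTouch(raycast) {
    if (uiState.buttonClicked) return false;
    return raycast();
}
```

The zero timeout is what makes the clear happen after all other touchstart handlers for the same frame have had a chance to check the flag.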
In my game I have a popup widget that appears during gameplay: a friend request. I need to press its Accept or Reject button without affecting the actual gameplay. I do it similarly to what is described above (pseudocode):
// in my input script
if (not UI click) {
    on 'touchstart' jump();
}

// in my button script
on 'touchstart' {
    UI click = true;
}
on 'touchend' {
    UI click = false;
}
Good point. I suspect it has to do with the order in which you subscribe, but I haven't confirmed that.
In cases where that order isn't guaranteed, I can afford to add a zero timeout on non-UI clicks, as in this case for scene raycasting. Those clicks then get handled in the next frame, after the UI clicks.
This is hacky, so the best solution is to keep track of the order in which these events get executed. It would be good to get some insight on this.
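A sketch of that deferral, using a plain array in place of a real event system (handled, uiClicked and the handler names are assumptions for this example): even if the scene handler is subscribed first, pushing its work into a zero timeout lets the UI handler set the flag before the scene reacts.

```javascript
// Sketch: defer the non-UI handler by one tick so that any UI handler
// subscribed to the same touchstart has already run and set the flag.
var handled = [];          // records which handlers actually acted
var uiClicked = false;

function onUiTouch() {
    uiClicked = true;
    handled.push('ui');
}

function onSceneTouch() {
    // By the time this timeout fires, every synchronous handler for
    // the same touchstart (including the UI one) has executed.
    setTimeout(function () {
        if (!uiClicked) handled.push('scene');
        uiClicked = false;    // reset for the next touch
    }, 0);
}
```

Simulating a tap that both handlers receive, scene first, the UI still wins because the scene's reaction runs on the next tick.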
Hmm, good point, actually. I didn't use the timeout hack, since it worked fine for me. Now that I think about it, I'm not sure why it works the way it does now. There should be a race condition, but I don't see one. Maybe UI does get priority, as @yaustar mentioned? Not sure.