
Multi-touch Support between GameEngine and View #55

Open
xiendong opened this issue Jun 4, 2020 · 26 comments

@xiendong

xiendong commented Jun 4, 2020

Hello! I am making a game in which the user may concurrently tap on both the GameEngine component and buttons
(implemented via View::onTouch) that are located outside of the GameEngine.

On iOS, it works as expected: when one finger is placed on the GameEngine component, the user can still tap on a button located outside the GameEngine, and vice versa.

On Android, however, when one finger is placed on the GameEngine component and the other finger taps on a button that is outside of GameEngine, the additional tap gesture is handled by GameEngine instead of the View button.

Do you have any insight into fixing this issue? Thanks!

@bberak
Owner

bberak commented Jun 5, 2020

Hi @xiendong,

This should be possible - it was initially designed to work that way, but it could be that a new version of React Native introduced some issues with these edge cases.

You could try playing around with the style prop of the GameEngine component to see if you can get it working (setting width, height, pointerEvents, or a combination of these).

Otherwise, if you could share your code via a Snack, I'd be happy to fire up my old Android device and do some debugging. These things are not always easy to track down because it really depends on the style and component tree of your app.

Cheers!

@xiendong
Author

xiendong commented Jun 6, 2020

Hello @bberak, thanks for the swift reply. Here is some sample code on Snack:
https://snack.expo.io/@xdrawks/game-engine-vs-view-vs-touchablewithoutfeedback

On iOS, the Game Engine and the View surface can be interacted with independently of each other, which can't be achieved on Android. Please have a look. Thank you!

@bberak
Owner

bberak commented Jun 9, 2020

Thanks @xiendong - I'll take a look. I've just bricked my old Android device, so I'm trying to find a cheap alternative (picking something up tomorrow)!

@bberak
Owner

bberak commented Jun 10, 2020

Hi @xiendong,

I got my Android device up and running and was able to reproduce the issue. I can confirm that this is an Android (React Native on Android?) issue - nothing related to RNGE (at least not directly). What I mean by that is - if you remove the <GameEngine /> component, the multi-touch bug (or limitation) still exists.

That said, I was able to fix the problem by using the TouchableOpacity component from react-native-gesture-handler on all surfaces that required touch handling. This package comes out of the box with Expo. Here is an updated Snack which allows all surfaces to be touched (in any combination) - well, it works on my Alcatel 3 phone anyway (Android 9, latest Expo client).
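
As a rough illustration of the idea (not the actual Snack code; the Surface component, its styles, and the pressed-state handling are made-up placeholders):

//-- Minimal sketch: swap React Native's touchables for the gesture-handler
//-- equivalents on every surface that needs independent touch handling.
import React, { useState } from "react";
import { View, Text } from "react-native";
import { TouchableOpacity } from "react-native-gesture-handler";

const Surface = ({ label }) => {
  const [pressed, setPressed] = useState(false);

  return (
    <TouchableOpacity
      onPressIn={() => setPressed(true)}
      onPressOut={() => setPressed(false)}
      style={{ flex: 1, backgroundColor: pressed ? "red" : "grey" }}>
      <Text>{label} {pressed ? "YES" : ""}</Text>
    </TouchableOpacity>
  );
};

export default function App() {
  return (
    <View style={{ flex: 1 }}>
      <Surface label="Surface 1" />
      <Surface label="Surface 2" />
    </View>
  );
}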

I hope that helps and let me know how it goes!

@xiendong
Author

xiendong commented Jun 10, 2020

Hello, it looks like the snack is causing some confusion. The grey colouring does not indicate that the surface is actually pressed (TouchableOpacity might show a press on the UI side, but the press was not handled correctly). I have updated the snack so that it actually indicates a successful press event (with a red background): https://snack.expo.io/@xdrawks/game-engine-vs-view-vs-touchablewithoutfeedback.

With the top-most box as Box 1 and the bottom-most box as Box 4: Box 1 is a component consisting of a TouchableWithoutFeedback that wraps a GameEngine.
Boxes 3 and 4 are TouchableWithoutFeedbacks that wrap a View.

By using RN Gesture Handler's TouchableWithoutFeedback, multi-pressing boxes 3 and 4 seems to work correctly on Android, but multi-pressing boxes 1 and 3/4 does not, as illustrated:

  • Pressing box 3 will prevent box 1 from being pressed.
  • Pressing box 1, then box 3, and then releasing box 3 will cause box 1 to be incorrectly released.

The behaviour is consistent with the description above when TouchableOpacity is used on Android. But looking at how boxes 3 and 4 behave, it seems promising that there is a solution to this problem. I also noted that this fix does not work on iOS, so I will probably need two solutions, one per platform.

Do you have any advice on this? Thanks!

@bberak
Owner

bberak commented Jun 10, 2020

Hey @xiendong,

Sorry, I've tried several things but cannot get the behaviour to be consistent and correct like it is on iOS. I'll let you know if any of my attempts work.

Can I ask what your end goal is? Perhaps this is something you can design differently to overcome this challenge? Alternatively, perhaps you can make the <GameEngine /> full screen and then create the buttons as game entities, which you can activate or deactivate using the touches array?
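
A rough sketch of what a "buttons as entities" system could look like (the entity shape, the isButton and bounds fields, and the event name are illustrative assumptions, not part of RNGE):

//-- Sketch: a system that checks each new touch against the bounds of any
//-- "button" entity and dispatches an event when one is hit.
const ButtonSystem = (entities, { touches, dispatch }) => {
  touches
    .filter(t => t.type === "start")
    .forEach(t => {
      const { pageX, pageY } = t.event;

      Object.keys(entities)
        .filter(key => entities[key].isButton)
        .forEach(key => {
          const { x, y, width, height } = entities[key].bounds;

          if (pageX >= x && pageX <= x + width && pageY >= y && pageY <= y + height) {
            dispatch({ type: "button-pressed", button: key });
          }
        });
    });

  return entities;
};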

@xiendong
Author

Currently, I have a touch-based game that uses the GameEngine, and there are buttons outside the GameEngine component that assist in the game. I'm exploring whether overlaying the GameEngine component with the touch gestures provided by RNGestureHandler, such as TapGestureHandler, would help.

@bberak
Owner

bberak commented Jun 10, 2020

Can these buttons be moved inside the GameEngine as part of the entities?

Tomorrow I'll try adding some props to the GameEngine that will change the behaviour of its internal touch handling to see if I can get something working.. Let me know how your approaches with the gesture handlers go :)

@xiendong
Author

xiendong commented Jun 10, 2020

I realised that in GameEngine.js the current touch system relies on React Native's View component. I believe that's why it works perfectly alongside other independent Views on iOS but not on Android. I am not too sure, however, whether react-native-gesture-handler offers an equivalent of the View component that can provide multiple independent touch coordinates.

Here are some relevant issues:
facebook/react-native#10068
software-mansion/react-native-gesture-handler#47

For now, I reattempted your initial solution by using a touchable from react-native-gesture-handler in my actual app, and with some hacks it actually seems to work okay on Android. So I guess the best solution so far is to use a View button for iOS and a TouchableWithoutFeedback for Android to work with the current implementation of GameEngine.
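
For the record, that per-platform split could look something like this (a sketch only; the Button wrapper and its props are made up for illustration):

//-- Sketch: a plain View (with touch events) on iOS and a
//-- react-native-gesture-handler touchable on Android.
import React from "react";
import { Platform, View } from "react-native";
import { TouchableWithoutFeedback } from "react-native-gesture-handler";

const Button = ({ children, style, onPressIn, onPressOut }) =>
  Platform.OS === "android" ? (
    <TouchableWithoutFeedback onPressIn={onPressIn} onPressOut={onPressOut}>
      <View style={style}>{children}</View>
    </TouchableWithoutFeedback>
  ) : (
    <View style={style} onTouchStart={onPressIn} onTouchEnd={onPressOut}>
      {children}
    </View>
  );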

Thanks for your help so far!

@bberak
Owner

bberak commented Jun 11, 2020

Hi @xiendong,

I'm glad you got it working. Yes, under the hood RNGE uses the View.. If it helps, I might be able to create a branch with some additional APIs that will allow the developer to handle the touches manually.. Something like this:

//-- React syntax might not be correct, just a quick example

export default function App() {

  ...

  const gameEngineRef = useRef(null);

  const handleTouchStart = ev => gameEngineRef.current.onTouchStartHandler(ev);
  const handleTouchMove = ev => gameEngineRef.current.onTouchMoveHandler(ev);
  const handleTouchEnd = ev => gameEngineRef.current.onTouchEndHandler(ev);

  return (
    <View style={styles.container}
        onTouchStart={handleTouchStart}
        onTouchMove={handleTouchMove}
        onTouchEnd={handleTouchEnd}>

        <GameEngine
          ref={gameEngineRef}
          handleTouches={false} //-- This would be a new prop to prevent the game engine from handling touches
          style={[gamePressed ? styles.pressDown : styles.pressUp]}
          systems={[handleGamePress]}>
          <Text>TouchableWithoutFeedback Game Engine {gamePressed ? 'YES' : ''}</Text>
        </GameEngine>

        {/* Other touchable components go here.. */}
    </View>
  );
}

That said, I'm not sure if this will help in your situation.. Ideally the developer won't have to worry about this touch handling stuff - and we'll have parity between iOS and Android.

@xiendong
Author

Hmm, I'm not sure either. I think the main problem lies in the inconsistent View behaviour between Android and iOS in React Native. I could imagine a new problem arising if someone wants to develop a multi-player, single-device game using multiple GameEngine components. In that case, the current TouchableWithoutFeedback solution would fail because it does not give multiple precise touch coordinates. At least this gives the developer a way to dispatch the touch event to the appropriate GameEngine component.

Thanks for your quick support nonetheless!

@xiendong
Author

xiendong commented Jun 12, 2020

Hello @bberak, I just realised that my approach still doesn't work. While the TouchableWithoutFeedback component is pressed, any new touches to the GameEngine component are blocked 😞. Looks like I will need your suggested API to manually dispatch the touch gestures according to the touch locations.

@bberak
Owner

bberak commented Jun 14, 2020

Alright @xiendong, I'll create a branch and we can try a few things. It doesn't sound like an ideal solution though - is there any way you can create these buttons as game entities (and therefore within the view of the game engine)?

@xiendong
Author

Hmm, it might not be elegant either, but let me think about how to dispatch the touch callbacks from the GameEngine to the buttons without having the buttons exist as entities. To give you some context, this is the game that I'm trying to make. Thanks for providing this great library though!

@bberak
Owner

bberak commented Jun 15, 2020

Ahh okay, thanks for the context @xiendong - that helps me visualize the problem. You might already be aware, but there is a dispatch method on the GameEngine, which is also passed to your systems and facilitates back-and-forth communication between the GameEngine and external components. I'll give your problem some more thought - hopefully we can find a reasonable solution. Cheers!
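
As a rough illustration of that two-way flow (the event names, the player entity, and the showMenu callback are made up for this example):

//-- Inside a system, dispatch an event out of the engine..
const ScoreSystem = (entities, { dispatch }) => {
  if (entities.player.score >= 10)
    dispatch({ type: "game-over" });

  return entities;
};

//-- ..and listen for it on the GameEngine via the onEvent prop:
//-- <GameEngine systems={[ScoreSystem]} onEvent={e => e.type === "game-over" && showMenu()} />
//-- External components can also push events back into the engine through a ref:
//-- gameEngineRef.current.dispatch({ type: "pause" });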

@bberak
Owner

bberak commented Jun 18, 2020

Hey @xiendong,

Not sure how far you've gotten with this, but here is kind of where my head was going with this problem (although it is not an ideal solution): https://snack.expo.io/@bberak/authentic-banana

@xiendong
Author

xiendong commented Jun 18, 2020

Hello @bberak, I’ve been working on the other parts of the game, so I haven’t worked on a fix for the bug yet.

I saw your solution, and I was thinking about how it might impact the positioning of the sprites in the game. I’m not sure if I have control over how the sprites are rendered (meaning that I might not be able to neatly bound the sprites within a box as children of some View component, the way I currently do with the GameEngine). I also rely on the layout properties of the GameEngine to translate my logical coordinate system when laying out the sprites in the GameEngine, so I’m not sure if such a translation would remain straightforward.

I am wondering if a RenderingSurface component that exists as a child of the GameEngine could help to address the above problem. With the RenderingSurface, I could add the buttons as children of the GameEngine while keeping the actual rendering work contained in the RenderingSurface, separate from the buttons. Perhaps I would then be able to rely on the layout of the RenderingSurface for my coordinate translation.

What do you think? Thank you for your help!

@bberak
Owner

bberak commented Jun 21, 2020

Hi @xiendong,

Sorry, I didn't quite follow the above, but it sounds like the default rendering method of the GameEngine might not fulfil your requirements. Whilst I'm not sure exactly what you'll be needing, you can override the default rendering method (which is quite simple and based on a View) with whatever method you require:

import MyCustomRenderer from "./my-custom-renderer";
...
<GameEngine
   systems={[]}
   entities={{}}
   renderer={MyCustomRenderer}
/>

Here is the source code of the DefaultRenderer. In a nutshell, it loops through the entities and renders the ones that have a renderer property. You can replace this with your own logic and/or nest the entities within your own styled View or RenderingSurface.
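
Something along these lines could be a starting point (a sketch only, assuming the (entities, screen, layout) signature used by the DefaultRenderer; the surrounding surface View and its styling are assumptions):

//-- Sketch: my-custom-renderer.js - loop over the entities and render the ones
//-- that have a renderer property, nested inside your own surface View.
import React from "react";
import { View } from "react-native";

const MyCustomRenderer = (entities, screen, layout) => {
  if (!layout) return null; //-- layout is only available after the first onLayout

  return (
    <View style={{ position: "absolute", top: 0, left: 0, width: layout.width, height: layout.height }}>
      {Object.keys(entities)
        .filter(key => entities[key].renderer)
        .map(key => {
          const entity = entities[key];
          const Renderer = entity.renderer;
          return <Renderer key={key} {...entity} screen={screen} layout={layout} />;
        })}
    </View>
  );
};

export default MyCustomRenderer;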

Let me know if that helps!

@xiendong
Author

xiendong commented Sep 6, 2020

Hello! Sorry for the delayed reply - I was busy with other things as well. I eventually solved the touch problem by having a parent touch View over the GameEngine that directly dispatches touch events to the individual buttons in the child components. This approach, however, required overriding the GameEngine's touch system. Hackish, but it helps to mitigate React Native's limitation on Android. Thanks!

@bberak
Owner

bberak commented Sep 13, 2020

Nice solution @xiendong! Did you provide a new touchProcessor prop to override the GameEngine's touch system or did you use an entirely different approach? I ask because I'd like the GameEngine to be as hackable and customizable as possible - so any areas where this can be improved - I'd love to hear about it.

@cmaycumber

@xiendong Any chance you could post your solution? I'm actually trying to get react-native-game-engine working for react-native-web but I'm not having any luck with the default touchProcessor.

@xiendong
Author

xiendong commented Oct 2, 2020

@bberak I didn't provide a new touchProcessor prop; I actually wrote two components: a TouchManager that dispatches all the touch events via React context and callbacks to TouchViews that are nested under the GameEngine. On top of that, I had to write the touch event handlers and manually push a new set of entities to the GameEngine by calling its setState function.
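
To give a feel for the shape of such components (purely illustrative; this is not the actual TouchManager/TouchView code, and all names here are made up):

//-- Sketch: a TouchManager that captures raw touch events on a parent View and
//-- forwards them to subscribers via React context.
import React, { createContext, useContext, useEffect, useRef } from "react";
import { View } from "react-native";

const TouchContext = createContext(null);

export const TouchManager = ({ children, style }) => {
  const listeners = useRef(new Set());

  const forward = ev =>
    listeners.current.forEach(listener => listener(ev.nativeEvent));

  return (
    <TouchContext.Provider value={listeners}>
      <View
        style={style}
        onTouchStart={forward}
        onTouchMove={forward}
        onTouchEnd={forward}>
        {children}
      </View>
    </TouchContext.Provider>
  );
};

//-- A child (e.g. a button nested under the GameEngine) subscribes to the stream:
export const useTouches = listener => {
  const listeners = useContext(TouchContext);

  useEffect(() => {
    listeners.current.add(listener);
    return () => listeners.current.delete(listener);
  }, [listener]);
};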

Separately, I realised that there is actually no need to keep the entities as state; they can be kept as a ref instead, provided there is a way to cause the React component to update on each screen refresh (e.g. via a state counter). Empirically it seems to be more efficient, and at the very least it's more explicit that the entities object should be mutated rather than copied.
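
A rough sketch of that ref-plus-counter idea (purely illustrative and not tied to the GameEngine's internals; the hook name is made up):

//-- Sketch: keep the entities in a ref (mutated in place) and bump a counter
//-- each frame so React re-renders without copying the entities object.
import { useEffect, useRef, useState } from "react";

const useMutableEntities = initialEntities => {
  const entities = useRef(initialEntities);
  const [, setTick] = useState(0);

  useEffect(() => {
    let frame;
    const loop = () => {
      setTick(t => t + 1); //-- force a re-render on each screen refresh
      frame = requestAnimationFrame(loop);
    };
    frame = requestAnimationFrame(loop);
    return () => cancelAnimationFrame(frame);
  }, []);

  return entities; //-- mutate entities.current directly in your update logic
};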

Thanks!

@xiendong
Author

xiendong commented Oct 2, 2020

@cmaycumber I'm not quite sure if my solution would help you, because you might not be able to directly call touch event handlers from react-native-web. Let me know if you can, and I'll find a way to post my solution. Thanks!

@bberak
Owner

bberak commented Oct 3, 2020

Hi @cmaycumber,

I'm not too familiar with react-native-web but would be happy to hear what sort of issues you are having with the DefaultTouchProcessor? It could be that the touch event names are different between React Native and React Native Web - thereby breaking the logic in the DefaultTouchProcessor - if that is the case I'd be keen to patch this, or provide a web-friendly touch processor with RNGE..

@cmaycumber


I'll definitely look into it a bit for you. That would be great if the touch processor worked with RNW out of the box!

From my current understanding, I think the problem resides with touches being treated separately from clicks, because it seems to work on mobile devices in the browser. I've been able to get it working in the meantime by adding a TapGestureHandler around the GameEngine and sending dispatch events on press.
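
That workaround could look roughly like this (a sketch only; the event name and wiring are illustrative, and it assumes the GameEngine instance exposes the same dispatch method that systems receive):

//-- Sketch: wrap the GameEngine in a TapGestureHandler and forward taps into
//-- the engine as custom events via its dispatch method.
import React, { useRef } from "react";
import { TapGestureHandler, State } from "react-native-gesture-handler";
import { GameEngine } from "react-native-game-engine";

export default function Game() {
  const engine = useRef(null);

  const onTap = ({ nativeEvent }) => {
    if (nativeEvent.state === State.ACTIVE) {
      engine.current.dispatch({ type: "tap", x: nativeEvent.x, y: nativeEvent.y });
    }
  };

  return (
    <TapGestureHandler onHandlerStateChange={onTap}>
      <GameEngine ref={engine} systems={[]} entities={{}} />
    </TapGestureHandler>
  );
}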

@bberak
Owner

bberak commented Oct 7, 2020

Nice workaround @cmaycumber. In the meantime, I'll have a bit of a play with React Native Web to see how best to proceed..
