Introducing Gesture Handler 2.0

Jakub Piasecki
Published in Software Mansion
Dec 1, 2021


Over the last few years, Gesture Handler has become one of the core libraries of the React Native ecosystem. It recognizes gestures such as pinch, rotation, or pan using the native touch system on each platform, and it also makes it possible to specify how gestures should interact with each other.

Today, we are happy to announce the release of Gesture Handler 2.0. It brings an entirely new way of adding gesture-based interactions to your apps and addresses many shortcomings of the previous versions. Gesture Handler 2.0 comes with a new API designed from the ground up. We spent a lot of time iterating on different ways of defining gestures, and we settled on one that, in our opinion, makes creating complex gestures much easier. To make the transition to 2.0 more approachable for the huge gesture-handler user base, this release is fully compatible with version 1 (all the APIs from version 1 stay the same). This way, you can upgrade your app today and decide if and when it makes sense to start using the new API. In this post we highlight the key changes in the new API so you can get a sense of how it can help you improve your codebase and how to take advantage of the new features.

One to rule them all

One of the frequent use cases for our library is monitoring a number of concurrent gestures over a single view (e.g., pan, pinch, and rotation of an image). To achieve this with the old API you'd have to nest a number of gesture-specific components, often interleaved with React Native's Views. In Gesture Handler 2.0 we take a different approach with a new GestureDetector component capable of recognizing all types of gestures.

In the new API, a gesture is instantiated using the Gesture object, configured in a builder-like pattern, and then passed to the detector component. Let's look at a simple example of a double tap:
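
A minimal sketch of what this looks like (the view, styles, and logging are illustrative):

```tsx
import React from 'react';
import { View, StyleSheet } from 'react-native';
import { Gesture, GestureDetector } from 'react-native-gesture-handler';

export function DoubleTapExample() {
  // A tap gesture configured with the builder-like pattern:
  // it activates only after two taps and reports success in onEnd.
  const doubleTap = Gesture.Tap()
    .numberOfTaps(2)
    .onEnd((_event, success) => {
      if (success) {
        console.log('Double tap!');
      }
    });

  return (
    <GestureDetector gesture={doubleTap}>
      <View style={styles.box} />
    </GestureDetector>
  );
}

const styles = StyleSheet.create({
  box: { width: 120, height: 120, backgroundColor: 'tomato' },
});
```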

That alone may not be very impressive, but the improvement really shows when you add more than one gesture to a component. Instead of creating multiple handlers and connecting them together using refs, we have introduced a new system of gesture composition.

Composing gestures

Alongside all the basic gestures, the Gesture object provides methods for composing gestures:

  • Simultaneous — all of the gestures can become active at the same time.
  • Race — the first gesture that becomes active cancels other gestures.
  • Exclusive — assigns priority to the gestures: the first gesture can become active at any time, the second gesture can activate only when the first one fails, the third gesture can activate only when the second one fails, and so on.

To demonstrate its capabilities, let's look at a common example: pan, pinch, and rotation on a single component.
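
A sketch of how that composition might look (the callbacks just log values to keep the example minimal; the view and styles are illustrative):

```tsx
import React from 'react';
import { View, StyleSheet } from 'react-native';
import { Gesture, GestureDetector } from 'react-native-gesture-handler';

export function TransformExample() {
  const pan = Gesture.Pan().onUpdate((e) => {
    console.log('pan', e.translationX, e.translationY);
  });

  const pinch = Gesture.Pinch().onUpdate((e) => {
    console.log('pinch', e.scale);
  });

  const rotation = Gesture.Rotation().onUpdate((e) => {
    console.log('rotation', e.rotation);
  });

  // All three gestures are recognized on the same view
  // and may be active at the same time.
  const composed = Gesture.Simultaneous(pan, pinch, rotation);

  return (
    <GestureDetector gesture={composed}>
      <View style={styles.box} />
    </GestureDetector>
  );
}

const styles = StyleSheet.create({
  box: { width: 200, height: 200, backgroundColor: 'plum' },
});
```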

Another important improvement over the 1.0 API is that GestureDetector no longer needs to wrap a plain React Native View component. You can now put any composite component directly under GestureDetector, which helps with extracting gesture-related logic into separate files if that's the approach you prefer to follow.
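
For instance, a sketch along these lines now works (Card is a hypothetical composite component, defined here only for illustration):

```tsx
import React from 'react';
import { View, Text, StyleSheet } from 'react-native';
import { Gesture, GestureDetector } from 'react-native-gesture-handler';

// A composite component, not a plain View: it renders its own subtree.
function Card() {
  return (
    <View style={styles.card}>
      <Text>Drag me</Text>
    </View>
  );
}

export function CardExample() {
  const pan = Gesture.Pan();

  // The detector can wrap the composite component directly.
  return (
    <GestureDetector gesture={pan}>
      <Card />
    </GestureDetector>
  );
}

const styles = StyleSheet.create({
  card: { padding: 24, borderRadius: 8, backgroundColor: 'lightgray' },
});
```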

Close integration with Reanimated 2

Gesture Handler 2.0 is closely integrated with Reanimated 2. If both libraries are installed, functions set as event callbacks are automatically treated as Reanimated worklets, and GestureDetector defaults to using them for synchronous event delivery. Unfortunately, there is one trade-off we had to make when designing this new API: it no longer works with Reanimated 1 or with React Native's Animated events (you can still use that approach via the old Gesture Handler API if you can't migrate to Reanimated 2 at the moment). One of the main reasons behind this decision is that worklets allow for synchronous communication with the native code, which brings us to the most important feature of this release.
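
Here is a sketch of a draggable box, assuming Reanimated 2 is installed; the pan callbacks run as worklets and update shared values synchronously on the UI thread:

```tsx
import React from 'react';
import { StyleSheet } from 'react-native';
import { Gesture, GestureDetector } from 'react-native-gesture-handler';
import Animated, {
  useSharedValue,
  useAnimatedStyle,
} from 'react-native-reanimated';

export function DraggableBox() {
  const offsetX = useSharedValue(0);
  const offsetY = useSharedValue(0);

  // With Reanimated 2 installed, these callbacks are automatically
  // turned into worklets, so no round trip to the JS thread is needed.
  const pan = Gesture.Pan().onUpdate((e) => {
    offsetX.value = e.translationX;
    offsetY.value = e.translationY;
  });

  const animatedStyle = useAnimatedStyle(() => ({
    transform: [
      { translateX: offsetX.value },
      { translateY: offsetY.value },
    ],
  }));

  return (
    <GestureDetector gesture={pan}>
      <Animated.View style={[styles.box, animatedStyle]} />
    </GestureDetector>
  );
}

const styles = StyleSheet.create({
  box: { width: 120, height: 120, backgroundColor: 'skyblue' },
});
```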

Take full control of your gestures

We are introducing a new type of event in Gesture Handler 2.0: touch events. They allow you to track the positions of individual fingers and to decide, from within the gesture's callbacks, when it should activate or fail. When used with the new Manual Gesture, it's possible to implement almost any gesture recognition logic you may want in your app. This can be used, for example, to recognize the user drawing shapes, or to make your pinch gesture activate only when the central point between the fingers is in a certain place.
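
A sketch of what a manual gesture might look like; the activation criterion (crossing x = 100) is purely illustrative:

```tsx
import React from 'react';
import { View, StyleSheet } from 'react-native';
import { Gesture, GestureDetector } from 'react-native-gesture-handler';

export function ManualExample() {
  // A manual gesture never activates on its own; we drive its state
  // from the touch event callbacks via the state manager.
  const manual = Gesture.Manual()
    .onTouchesDown((_e, manager) => {
      manager.begin();
    })
    .onTouchesMove((e, manager) => {
      const touch = e.allTouches[0];
      // Hypothetical criterion: activate once a finger crosses x = 100.
      if (touch && touch.x > 100) {
        manager.activate();
      }
    })
    .onTouchesUp((_e, manager) => {
      manager.end();
    })
    .onStart(() => {
      console.log('manual gesture activated');
    });

  return (
    <GestureDetector gesture={manual}>
      <View style={styles.area} />
    </GestureDetector>
  );
}

const styles = StyleSheet.create({
  area: { flex: 1, backgroundColor: 'beige' },
});
```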

We are also expanding on this idea and bringing it to the built-in gestures, making it possible to control the activation or failure criteria programmatically from within the touch event callbacks, as opposed to only having a set of configuration options for each gesture type. In order to use the Manual Gesture you'll need to install Reanimated 2.3 or newer.
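
For example, a sketch of a built-in pan whose activation is decided from a touch callback (the two-finger criterion is just an illustration):

```tsx
import React from 'react';
import { View, StyleSheet } from 'react-native';
import { Gesture, GestureDetector } from 'react-native-gesture-handler';

export function ManualPanExample() {
  // manualActivation(true) stops the pan from activating by itself;
  // instead, we decide in the touch callbacks when it should activate.
  const pan = Gesture.Pan()
    .manualActivation(true)
    .onTouchesMove((e, manager) => {
      // Hypothetical criterion: only start panning with two fingers down.
      if (e.numberOfTouches >= 2) {
        manager.activate();
      }
    })
    .onUpdate((e) => {
      console.log('panning', e.translationX, e.translationY);
    });

  return (
    <GestureDetector gesture={pan}>
      <View style={styles.area} />
    </GestureDetector>
  );
}

const styles = StyleSheet.create({
  area: { flex: 1, backgroundColor: 'lavender' },
});
```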

Compatibility with the old API

While we are going to focus on the new API going forward, the old one is not going away. Moreover, we made sure that it is cross-compatible with the API introduced in 2.0; this includes specifying interactions between gestures, so they can be made to work simultaneously or to wait for one another. We are sure you'll love the new API, but if you don't want to, or can't, move to it just yet, you can still safely upgrade and take advantage of all the fixes and improvements to the Gesture Handler core that we will be adding in future 2.x releases.

The new Gesture Handler is brought to you by Software Mansion and Shopify.
