Working with Gestures in Android

Android devices often rely upon touch screens for user input. Users are now quite comfortable using common finger gestures to operate their devices. Android applications can detect and react to one-finger (single-touch) and two-finger (multi-touch) gestures.

One of the reasons that gestures can be a bit tricky is that a single gesture can be made up of multiple touch events, or motions. Different sequences of motion add up to different gestures. For example, a fling gesture involves the user pressing a finger down on the screen, swiping across the screen, and lifting the finger off the screen while the swipe is still in motion (that is, without slowing to a stop before lifting the finger). Each of these steps triggers motion events that applications can react to.
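The three steps of a fling map directly onto the raw motion-event actions Android delivers. A minimal sketch of a touch listener that distinguishes those actions (the listener variable name is illustrative):

```java
import android.view.MotionEvent;
import android.view.View;

// Illustrative listener showing the raw motion events that make up a fling.
View.OnTouchListener touchLogger = new View.OnTouchListener() {
    @Override
    public boolean onTouch(View view, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                // Finger pressed down on the screen.
                break;
            case MotionEvent.ACTION_MOVE:
                // Finger swiping across the screen (may arrive many times).
                break;
            case MotionEvent.ACTION_UP:
                // Finger lifted off the screen.
                break;
        }
        return true; // Consume the event so we keep receiving the sequence.
    }
};
```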

Detecting User Motions Within a View

By now you’ve come to understand that Android application user interfaces are built using different types of View controls. Developers cannot handle gestures the way they handle click events within a View control, with the setOnClickListener() and setOnLongClickListener() methods. Instead, the onTouchEvent() callback method is used to detect that some motion has occurred within the View region.

The onTouchEvent() callback method has a single parameter: a MotionEvent object. The MotionEvent object contains all sorts of details about the motion occurring within the View, enabling the developer to determine what sort of gesture is happening by collecting and analyzing many consecutive MotionEvent objects. You could use the raw MotionEvent data to recognize virtually any gesture you can imagine. Alternatively, you can use the built-in gesture detectors provided in the Android SDK to detect common user motions in a consistent fashion. Android currently has two different classes that can detect navigational gestures:

  • The GestureDetector class can be used to detect common single-touch gestures.
  • The ScaleGestureDetector class can be used to detect multi-touch scale gestures.
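The usual pattern is to create a GestureDetector and forward every MotionEvent from onTouchEvent() to it; the detector then calls back when it recognizes a complete gesture such as a fling. A minimal sketch, assuming a custom View subclass (the class name is illustrative):

```java
import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.View;

// Illustrative custom View that uses GestureDetector to recognize a fling.
public class FlingAwareView extends View {

    private final GestureDetector gestureDetector;

    public FlingAwareView(Context context) {
        super(context);
        gestureDetector = new GestureDetector(context,
                new GestureDetector.SimpleOnGestureListener() {
                    @Override
                    public boolean onDown(MotionEvent e) {
                        // Return true so the detector keeps tracking this gesture.
                        return true;
                    }

                    @Override
                    public boolean onFling(MotionEvent e1, MotionEvent e2,
                                           float velocityX, float velocityY) {
                        // velocityX/velocityY are reported in pixels per second.
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Hand every MotionEvent to the detector, which analyzes the
        // sequence and invokes the listener callbacks.
        return gestureDetector.onTouchEvent(event);
    }
}
```

SimpleOnGestureListener lets you override only the callbacks you care about (onFling(), onScroll(), onDoubleTap(), and so on) rather than implementing the full OnGestureListener interface.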

It is likely that more gesture detectors will be added in future versions of the Android SDK. You can also implement your own gesture detectors to handle gestures not supported by the built-in ones. For example, you might want to create a two-fingered rotate gesture to, say, rotate an image, or a three-fingered swipe gesture that brings up an options menu.
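ScaleGestureDetector follows the same forwarding pattern. A minimal pinch-to-zoom sketch, assuming a custom View subclass (the class name and clamping range are illustrative):

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.View;

// Illustrative custom View that tracks a pinch-to-zoom scale factor.
public class ZoomableView extends View {

    private final ScaleGestureDetector scaleDetector;
    private float scaleFactor = 1.0f;

    public ZoomableView(Context context) {
        super(context);
        scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScale(ScaleGestureDetector detector) {
                        // getScaleFactor() is the scale change since the
                        // previous scale event, so accumulate it.
                        scaleFactor *= detector.getScaleFactor();
                        // Clamp to a sensible range (bounds chosen arbitrarily).
                        scaleFactor = Math.max(0.5f, Math.min(scaleFactor, 5.0f));
                        invalidate(); // Redraw the View at the new scale.
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        return scaleDetector.onTouchEvent(event);
    }
}
```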

In addition to common navigational gestures, you can use the android.gesture package with the GestureOverlayView to recognize command-like gestures. For instance, you could create an S-shaped gesture that brings up a search, or a zig-zag gesture that clears the screen in a drawing app. Tools are available for recording and creating libraries of this style of gesture. Because this approach relies on an overlay for detection, it is not well suited to all types of applications. This package was introduced in API Level 4.
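Recognition against a recorded gesture library typically looks like the following sketch. The resource names (R.raw.gestures, R.id.gesture_overlay, R.layout.main) are assumptions standing in for a library file built with the Gestures Builder tool and a layout containing a GestureOverlayView:

```java
import android.app.Activity;
import android.gesture.Gesture;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.Prediction;
import android.os.Bundle;
import java.util.ArrayList;

// Illustrative Activity that recognizes command-like gestures drawn
// on a GestureOverlayView against a prerecorded gesture library.
public class GestureActivity extends Activity
        implements GestureOverlayView.OnGesturePerformedListener {

    private GestureLibrary gestureLibrary;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        // Load the gesture library shipped as a raw resource.
        gestureLibrary = GestureLibraries.fromRawResource(this, R.raw.gestures);
        gestureLibrary.load();

        GestureOverlayView overlay =
                (GestureOverlayView) findViewById(R.id.gesture_overlay);
        overlay.addOnGesturePerformedListener(this);
    }

    @Override
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        // Predictions come back sorted by score, best match first.
        ArrayList<Prediction> predictions = gestureLibrary.recognize(gesture);
        if (!predictions.isEmpty() && predictions.get(0).score > 1.0) {
            String name = predictions.get(0).name; // e.g. "search" or "clear"
            // React to the named gesture here.
        }
    }
}
```

The score threshold (1.0 here) is a common heuristic for filtering out weak matches; tune it for your own gesture set.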

All rights reserved © 2018 Wisdom IT Services India Pvt. Ltd