Handling Common Single-Touch Gestures

Introduced in API Level 1, the GestureDetector class can be used to detect gestures made by a single finger. Some common single finger gestures supported by the GestureDetector class include:

  • onDown: Called when the user first presses on the touch screen.
  • onShowPress: Called after the user first presses the touch screen but before he lifts his finger or moves it around on the screen; used to visually or audibly indicate that the press has been detected.
  • onSingleTapUp: Called when the user lifts up (using the up MotionEvent) from the touch screen as part of a single-tap event.
  • onSingleTapConfirmed: Called when a single-tap event occurs.
  • onDoubleTap: Called when a double-tap event occurs.
  • onDoubleTapEvent: Called when an event within a double-tap gesture occurs, including any down, move, or up MotionEvent.
  • onLongPress: Similar to onSingleTapUp, but called if the user holds down his finger long enough to not be a standard click but also without any movement.
  • onScroll: Called after the user presses and then moves his finger in a steady motion before lifting his finger. This is commonly called dragging.
  • onFling: Called after the user presses and then moves his finger in an accelerating motion before lifting it. This is commonly called a flick gesture and usually results in some motion continuing after the user lifts his finger.

You can use the interfaces available with the GestureDetector class to listen for specific gestures such as single and double taps (see GestureDetector.OnDoubleTapListener), as well as scrolls and flings (see GestureDetector.OnGestureListener). The scrolling gesture involves touching the screen and moving your finger around on it. The fling gesture, on the other hand, causes the object to continue to move even after the finger has been lifted from the screen (though not automatically; your code must implement that continued motion). This gives the user the impression of throwing or flicking the object around on the screen.
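
As a minimal illustration of how these callbacks arrive (a sketch separate from the example that follows; the class name GestureLoggingView and the log tag are made up for demonstration), you can attach a GestureDetector to a custom View and simply log the gestures by extending GestureDetector.SimpleOnGestureListener, which provides empty implementations of both listener interfaces:

public class GestureLoggingView extends View {
    private static final String DEBUG_TAG = "GestureDemo";
    private final GestureDetector detector;

    public GestureLoggingView(Context context) {
        super(context);
        detector = new GestureDetector(context,
            new GestureDetector.SimpleOnGestureListener() {
                @Override
                public boolean onDown(MotionEvent e) {
                    // Return true so the detector keeps tracking this gesture.
                    return true;
                }

                @Override
                public boolean onSingleTapConfirmed(MotionEvent e) {
                    Log.v(DEBUG_TAG, "onSingleTapConfirmed");
                    return true;
                }

                @Override
                public boolean onFling(MotionEvent e1, MotionEvent e2,
                        float velocityX, float velocityY) {
                    Log.v(DEBUG_TAG, "onFling: " + velocityX + ", " + velocityY);
                    return true;
                }
            });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Hand every touch event to the detector for gesture interpretation.
        return detector.onTouchEvent(event);
    }
}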

Let’s look at a simple example. Let’s assume you have a game screen that enables the user to perform gestures to interact with a graphic on the screen. We can create a custom View class called GameAreaView that can dictate how a bitmap graphic moves around within the game area based upon each gesture. The GameAreaView class can use the onTouchEvent() method to pass along MotionEvent objects to a GestureDetector. In this way, the GameAreaView can react to simple gestures, interpret them, and make the appropriate changes to the bitmap, including moving it from one location to another on the screen.

In this case, the GameAreaView class interprets gestures as follows:

  • A double-tap gesture causes the bitmap graphic to return to its initial position.
  • A scroll gesture causes the bitmap graphic to “follow” the motion of the finger.
  • A fling gesture causes the bitmap graphic to “fly” in the direction of the fling.

To make these gestures work, the GameAreaView class needs to include the appropriate gesture detector, which triggers any operations upon the bitmap graphic. Based upon the specific gestures detected, the GameAreaView class must perform all translation animations and other graphical operations applied to the bitmap. To wire up the GameAreaView class for gesture support, we need to implement several important methods:

  • The class constructor must initialize any gesture detectors and bitmap graphics.
  • The onTouchEvent() method must be overridden to pass the MotionEvent data to the gesture detector for processing.
  • The onDraw() method must be overridden to draw the bitmap graphic in the appropriate position at any time.
  • Various methods are needed to perform the graphics operations required to make a bitmap move around on the screen, fly across the screen, or reset its location based upon the data provided by the specific gesture.

All these tasks are handled by our GameAreaView class definition:

public class GameAreaView extends View {
    private static final String DEBUG_TAG =
        "SimpleGesture->GameAreaView";
    private GestureDetector gestures;
    private Matrix translate;
    private Bitmap droid;
    private Matrix animateStart;
    private Interpolator animateInterpolator;
    private long startTime;
    private long endTime;
    private float totalAnimDx;
    private float totalAnimDy;

    public GameAreaView(Context context, int iGraphicResourceId) {
        super(context);
        translate = new Matrix();
        GestureListener listener = new GestureListener(this);
        gestures = new GestureDetector(context, listener, null, true);
        droid = BitmapFactory.decodeResource(getResources(),
            iGraphicResourceId);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        boolean retVal = false;
        retVal = gestures.onTouchEvent(event);
        return retVal;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        Log.v(DEBUG_TAG, "onDraw");
        canvas.drawBitmap(droid, translate, null);
    }

    public void onResetLocation() {
        translate.reset();
        invalidate();
    }

    public void onMove(float dx, float dy) {
        translate.postTranslate(dx, dy);
        invalidate();
    }

    public void onAnimateMove(float dx, float dy, long duration) {
        animateStart = new Matrix(translate);
        animateInterpolator = new OvershootInterpolator();
        startTime = System.currentTimeMillis();
        endTime = startTime + duration;
        totalAnimDx = dx;
        totalAnimDy = dy;
        post(new Runnable() {
            @Override
            public void run() {
                onAnimateStep();
            }
        });
    }

    private void onAnimateStep() {
        long curTime = System.currentTimeMillis();
        float percentTime = (float) (curTime - startTime) /
            (float) (endTime - startTime);
        float percentDistance = animateInterpolator
            .getInterpolation(percentTime);
        float curDx = percentDistance * totalAnimDx;
        float curDy = percentDistance * totalAnimDy;
        translate.set(animateStart);
        onMove(curDx, curDy);
        if (percentTime < 1.0f) {
            post(new Runnable() {
                @Override
                public void run() {
                    onAnimateStep();
                }
            });
        }
    }
}

As you can see, the GameAreaView class keeps track of where the bitmap graphic should be drawn at any time. The onTouchEvent() method is used to capture motion events and pass them along to a gesture detector whose GestureListener we must implement as well (more on this in a moment). Typically, each method of the GameAreaView applies some operation to the bitmap graphic and then calls the invalidate() method, forcing the view to be redrawn. Now we turn our attention to the methods required to implement specific gestures:

  • For double-tap gestures, we implement a method called onResetLocation() to draw the bitmap graphic in its original location.
  • For scroll gestures, we implement a method called onMove() to draw the bitmap graphic in a new location. Note that scrolling can occur in any direction—it simply refers to a finger swipe on the screen.
  • For fling gestures, things get a little tricky. To animate motion on the screen smoothly, we use a chain of asynchronous calls and a built-in Android interpolator to calculate the location to draw the graphic based upon how long it has been since the animation started. See the onAnimateMove() and onAnimateStep() methods for the full implementation of fling animation.

Now we need to implement our GestureListener class to interpret the appropriate gestures and call the GameAreaView methods we just implemented. Here’s an implementation of the GestureListener class that our GameAreaView class can use:

private class GestureListener extends
        GestureDetector.SimpleOnGestureListener {
    GameAreaView view;

    public GestureListener(GameAreaView view) {
        this.view = view;
    }

    @Override
    public boolean onDown(MotionEvent e) {
        return true;
    }

    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2,
            final float velocityX, final float velocityY) {
        final float distanceTimeFactor = 0.4f;
        final float totalDx = (distanceTimeFactor * velocityX / 2);
        final float totalDy = (distanceTimeFactor * velocityY / 2);
        view.onAnimateMove(totalDx, totalDy,
            (long) (1000 * distanceTimeFactor));
        return true;
    }

    @Override
    public boolean onDoubleTap(MotionEvent e) {
        view.onResetLocation();
        return true;
    }

    @Override
    public boolean onScroll(MotionEvent e1, MotionEvent e2,
            float distanceX, float distanceY) {
        view.onMove(-distanceX, -distanceY);
        return true;
    }
}

Note that you must return true for any gesture or motion event that you want to detect. Therefore, you must return true in the onDown() method as it happens at the beginning of a scroll-type gesture. Most of the implementation of the GestureListener class methods involves our interpretation of the data for each gesture. For example:

  • We react to double taps by resetting the bitmap to its original location using the onResetLocation() method of our GameAreaView class.
  • We use the distance data provided in the onScroll() method to determine the direction of movement to pass into the onMove() method of the GameAreaView class.
  • We use the velocity data provided in the onFling() method to determine the direction and speed to use in the movement animation of the bitmap. The distanceTimeFactor variable with a value of 0.4 is subjective, but it gives the resulting slide-to-a-stop animation enough time to be visible while remaining short enough to be controllable and responsive. For example, a horizontal fling velocity of 2000 pixels per second produces a 400-pixel slide (0.4 × 2000 / 2) animated over 400 milliseconds (1000 × 0.4). You could think of it as a high-friction surface. This information is used by the animation sequence implemented within the onAnimateMove() method of the GameAreaView class.

Now that we have implemented the GameAreaView class in its entirety, you can display it on a screen. For example, you might create an Activity that has a user interface with a FrameLayout control and add an instance of a GameAreaView using the addView() method, as shown in the sketch that follows. The resulting scroll and fling gestures look something like the figure below.
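
Here is a minimal sketch of such an Activity. The class name SimpleGestureActivity and the resources R.layout.main, R.id.game_area, and R.drawable.droid are hypothetical placeholders; substitute the layout and drawable resources from your own project.

public class SimpleGestureActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // R.layout.main is assumed to contain a FrameLayout with the id game_area.
        setContentView(R.layout.main);
        // Create the custom view, handing it the drawable it should manipulate.
        GameAreaView gameAreaView = new GameAreaView(this, R.drawable.droid);
        // Add the view to the FrameLayout so it fills the game area.
        FrameLayout frame = (FrameLayout) findViewById(R.id.game_area);
        frame.addView(gameAreaView);
    }
}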

Scroll (left) and Fling (right) gestures.

Handling Common Multi-Touch Gestures

Introduced in API Level 8 (Android 2.2), the ScaleGestureDetector class can be used to detect two-fingered scale gestures. The scale gesture enables the user to move two fingers toward and away from each other. When the fingers are moving apart, this is considered scaling up; when the fingers are moving together, this is considered scaling down. This is the “pinch-to-zoom” style often employed by map and photo applications.
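
Before building the full example, here is a minimal sketch of detecting the scale gesture on its own (the class name ScaleLoggingView and the log tag are illustrative only), using the ScaleGestureDetector.SimpleOnScaleGestureListener convenience class, which supplies default implementations of the listener callbacks:

public class ScaleLoggingView extends View {
    private static final String DEBUG_TAG = "ScaleDemo";
    private final ScaleGestureDetector scaleDetector;

    public ScaleLoggingView(Context context) {
        super(context);
        scaleDetector = new ScaleGestureDetector(context,
            new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                @Override
                public boolean onScale(ScaleGestureDetector detector) {
                    // A factor greater than 1.0 means the fingers moved apart (scale up);
                    // a factor less than 1.0 means they moved together (scale down).
                    Log.v(DEBUG_TAG, "Scale factor: " + detector.getScaleFactor());
                    return true;
                }
            });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Hand every touch event to the scale detector for interpretation.
        return scaleDetector.onTouchEvent(event);
    }
}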

Let’s look at another example. Again, we use the custom view class called GameAreaView, but this time we handle the multi-touch scale event. In this way, the GameAreaView can react to scale gestures, interpret them, and make the appropriate changes to the bitmap, including growing or shrinking it on the screen.

In order to handle scale gestures, the GameAreaView class needs to include the appropriate gesture detector: a ScaleGestureDetector. The GameAreaView class needs to be wired up for scale gesture support in a similar fashion as when we implemented single-touch gestures earlier, including initializing the gesture detector in the class constructor, overriding the onTouchEvent() method to pass the MotionEvent objects to the gesture detector, and overriding the onDraw() method to draw the view appropriately as necessary. We also need to update the GameAreaView class to keep track of the bitmap graphic size (using a Matrix) and provide a helper method for growing or shrinking the graphic. Here is the new implementation of the GameAreaView class with scale gesture support:

public class GameAreaView extends View {
    private ScaleGestureDetector multiGestures;
    private Matrix scale;
    private Bitmap droid;

    public GameAreaView(Context context, int iGraphicResourceId) {
        super(context);
        scale = new Matrix();
        GestureListener listener = new GestureListener(this);
        multiGestures = new ScaleGestureDetector(context, listener);
        droid = BitmapFactory.decodeResource(getResources(),
            iGraphicResourceId);
    }

    public void onScale(float factor) {
        scale.preScale(factor, factor);
        invalidate();
    }

    @Override
    protected void onDraw(Canvas canvas) {
        // Apply the accumulated scale around the center of the bitmap.
        Matrix transform = new Matrix();
        float width = droid.getWidth() / 2;
        float height = droid.getHeight() / 2;
        transform.postTranslate(-width, -height);
        transform.postConcat(scale);
        transform.postTranslate(width, height);
        canvas.drawBitmap(droid, transform, null);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        boolean retVal = false;
        retVal = multiGestures.onTouchEvent(event);
        return retVal;
    }
}

As you can see, the GameAreaView class keeps track of what size the bitmap should be at any time using the Matrix variable called scale. The onTouchEvent() method is used to capture motion events and pass them along to a ScaleGestureDetector gesture detector. As before, the onScale() helper method of the GameAreaView applies some scaling to the bitmap graphic and then calls the invalidate() method, forcing the view to be redrawn.

Now let’s take a look at the GestureListener class implementation necessary to interpret the scale gestures and call the GameAreaView methods we just implemented. Here’s the implementation of the GestureListener class:

private class GestureListener implements
        ScaleGestureDetector.OnScaleGestureListener {
    GameAreaView view;

    public GestureListener(GameAreaView view) {
        this.view = view;
    }

    @Override
    public boolean onScale(ScaleGestureDetector detector) {
        float scale = detector.getScaleFactor();
        view.onScale(scale);
        return true;
    }

    @Override
    public boolean onScaleBegin(ScaleGestureDetector detector) {
        return true;
    }

    @Override
    public void onScaleEnd(ScaleGestureDetector detector) {
    }
}

Remember that you must return true for any gesture or motion event that you want to detect. Therefore, you must return true in the onScaleBegin() method as it happens at the beginning of a scale-type gesture. Most of the implementation of the GestureListener methods involves our interpretation of the data for the scale gesture. Specifically, we use the scale factor (provided by the getScaleFactor() method) to calculate whether we should shrink or grow the bitmap graphic, and by how much. We pass this information to the onScale() helper method we just implemented in the GameAreaView class.

Now, if you were to use the GameAreaView class within your application, as in the sketch below, scale gestures might look something like the figure that follows.
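
As a quick illustration (the Activity name SimpleScaleActivity and the R.drawable.droid resource are hypothetical), the scale-enabled GameAreaView can simply be set as an Activity's content view:

public class SimpleScaleActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // The custom view fills the screen and handles its own scale gestures.
        setContentView(new GameAreaView(this, R.drawable.droid));
    }
}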

Making Gestures Look Natural

Gestures can enhance your Android application user interfaces in new, interesting, and intuitive ways. Closely mapping the operations being performed on the screen to the user’s finger motion makes a gesture feel natural and intuitive. Making application operations look natural requires some experimentation on the part of the developer. Keep in mind that devices vary in processing power, and this might be a factor in making things seem natural.

Scale up (left) and scale down (right) gestures.


