
Activating The Gesture Analytics Display, Part 4

Showing The Complete Debug Views, Using CML:

Every touch object rendered on stage has the built-in ability to independently display the touch point clusters it possesses, as well as detailed information on cluster geometry and motion. To turn on the GAD display, set the attribute "displayOn" equal to true in the "my_application.cml" document. For example:

<DebugKit displayOn="true"/>

This will globally activate the debug display of all touch objects rendered on stage. When the GAD is activated it is always rendered on the topmost layer of the application display hierarchy. This ensures that, when in debug mode, touch points and touch/gesture data are always visible and never obscured by media or menus.
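For context, the DebugKit element sits in the application's CML document; the surrounding root element shown here is illustrative, and only the DebugKit element and its "displayOn" attribute come from this tutorial:

```
<!-- my_application.cml (surrounding structure is illustrative) -->
<cml>
    <DebugKit displayOn="true"/>
</cml>
```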

When all "DebugLayers" in the GAD are activated, it can visualize the full gesture pipeline: from touch point tracking and cluster analysis (geometry and motion), through gesture processing (noise filtering, inertia), to property mapping and display object transformations. In this example we are concerned with showing the display object transformation geometry. For example:

<DebugKit displayOn="true">
    <DebugLayer type="touchobject_transform" displayOn="true"/>
</DebugKit>

This draws a vertex "wire-frame" of the touch object as calculated by the debugger. The wire-frame of the touch object shows the calculated positions of the four corner points and the center point of the display object as well as lines connecting the five points.

The wire-frame view represents the culmination of all transformations applied to the display of the touch object through the display hierarchy. For example, when a touch object is nested inside another display object, such as a Sprite or TouchSprite, any transformations applied to the parent are inherited by the display of the child. The wire-frame shows how the transformations calculated in the gesture pipeline compare to the inherited transformations.
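The inheritance described above can be sketched as point math: the five wire-frame points are the child's four corners and center pushed through the child's own transform and then the parent's. This is illustrative geometry, not GestureWorks code; the function names and the rotation-plus-translation transform are assumptions:

```typescript
// Illustrative sketch: concatenating a child transform with an inherited
// parent transform, here reduced to rotation (degrees) plus translation.
type Pt = { x: number; y: number };

function transform(p: Pt, rotDeg: number, tx: number, ty: number): Pt {
  const t = (rotDeg * Math.PI) / 180;
  return {
    x: p.x * Math.cos(t) - p.y * Math.sin(t) + tx,
    y: p.x * Math.sin(t) + p.y * Math.cos(t) + ty,
  };
}

// Corners and center of a 100x100 child in its own coordinate space.
const local: Pt[] = [
  { x: 0, y: 0 }, { x: 100, y: 0 }, { x: 100, y: 100 }, { x: 0, y: 100 },
  { x: 50, y: 50 }, // center point
];

// Apply the child's own transform first, then the parent's: the parent's
// rotation and offset are inherited by every point of the child's wire-frame.
const wireframe = local
  .map(p => transform(p, 0, 20, 20))   // child: translated only
  .map(p => transform(p, 90, 300, 0)); // parent: rotated 90° and offset
```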

This feature can effectively be used to see how transformations on touch objects affect measured cluster property vectors and the direction of related gestures. For example: when a touch object is rotated inside a parent container, the vertical scroll gesture must be adjusted to account for the rotation so that vectors return gesture points in the correct direction.
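As a sketch of that adjustment (illustrative math only, not the GestureWorks implementation; the function and parameter names are assumptions), the raw gesture vector can be rotated by the inverse of the object's accumulated rotation so the scroll still tracks the object's own axis:

```typescript
// Sketch: compensate a gesture delta for the touch object's rotation by
// rotating the raw screen-space vector into the object's local frame.
function adjustScrollForRotation(dx: number, dy: number, rotationDeg: number): [number, number] {
  const theta = (-rotationDeg * Math.PI) / 180; // inverse of the object's rotation
  const cos = Math.cos(theta);
  const sin = Math.sin(theta);
  return [dx * cos - dy * sin, dx * sin + dy * cos];
}

// A downward 10px drag on an object rotated 90° becomes a delta along
// the object's local x-axis, so the scroll direction stays correct.
const [lx, ly] = adjustScrollForRotation(0, 10, 90);
```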

The center point as shown in the wire-frame is actively used in certain gestures. When a single touch point is placed on a touch object, the center point is used by the "pivot" gesture. The "moment" of the touch point relative to the pivot point (the center of the touch object) is calculated and used to determine how the display object must rotate to minimize this moment. This gives a more realistic, "physical" response, so that a touch object manipulated by a single touch point behaves as if it were a heavy object on a tabletop surface.
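The moment calculation above can be sketched with a 2D cross product: the signed moment of the drag vector about the center determines the direction and size of the rotation step that reduces it. This is assumed mechanics for illustration, not the GestureWorks implementation, and all names are hypothetical:

```typescript
// Sketch: single-touch "pivot" response. The drag's moment about the
// object's center drives a rotation step in the direction that reduces it.
function rotationStep(
  cx: number, cy: number, // object center (pivot)
  tx: number, ty: number, // touch point position
  dx: number, dy: number, // drag vector this frame
  gain = 1.0              // tuning factor (hypothetical)
): number {
  const rx = tx - cx;
  const ry = ty - cy;
  // 2D cross product r x d: the signed moment of the drag about the center.
  const moment = rx * dy - ry * dx;
  const r2 = rx * rx + ry * ry || 1; // avoid dividing by zero at the center
  // Dividing by |r|^2 converts the moment into an angular step (radians).
  return gain * (moment / r2);
}
```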
