The Multitouch Aquaria Project

TouchViz progress

Wednesday 2011-04-20

I confess that I have been lax in getting this post out, given that the changes I will be discussing were on Github over a week ago. Nonetheless, I feel that I should explain the current status of TouchViz before moving on to my work on Aquaria.

Support for textual display of touch attributes has been added, which is handy for seeing the numeric value of a touch’s position. However, I ran into problems when I added support for visualizing gestures (the bounding box of the gesture is displayed, as well as the gesture type). Gestures seem to have a lot of undocumented attributes, and displaying all of them onscreen would be pointless, as you couldn’t read them quickly enough. Additionally, as multiple gestures occur at the same location over time, the text overlaps and becomes unreadable.
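
To give an idea of what the gesture display involves, here is a minimal sketch (not TouchViz’s actual code) of drawing a gesture’s bounding box and a short label. The gesture attributes and the canvas primitives are hypothetical stand-ins for whatever the GEIS bindings and the drawing toolkit actually expose:

    # Show only a small, readable subset of the gesture's many attributes.
    INTERESTING_ATTRS = ("gesture type", "touches")

    def draw_gesture(canvas, gesture):
        """Draw a gesture's bounding box and a short text label above it."""
        x1, y1, x2, y2 = gesture.bbox          # hypothetical bounding-box attribute
        canvas.draw_rect(x1, y1, x2 - x1, y2 - y1)

        # Displaying every attribute would be unreadable, so build a short label
        # from the interesting ones only.
        label = ", ".join("%s=%s" % (name, gesture.attrs[name])
                          for name in INTERESTING_ATTRS
                          if name in gesture.attrs)
        canvas.draw_text(x1, y1 - 12, label)   # place the label just above the box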

One of the things I have learned is that, while touches are immediately reported to the subscribing application as they occur, gestures are only categorized (as “tap”, “pinch”, or “rotate”, for example) as they end. This seems to be what the Unity folks were talking about when they complained in #ubuntu-touch about not being able to display a window’s “love handles” when three touches are present. I have also been able to try different ideas for gestures to activate the secondary action in Aquaria. I plan to test using a brief two-touch tap as the activation gesture. As can be seen in the video below, dragging a finger about and tapping to the upper-right to activate the secondary action manifests in GEIS as:

  1. Gesture start, single touch
  2. Many gesture update events, single touch
  3. Gesture end, “Drag,touches=1”
  4. Gesture start, two touches
  5. Gesture update, two touches
  6. Gesture end, two touches
  7. Gesture start, single touch

Given this, I think for Aquaria I can simply map single-touch gestures to cursor position and recognize the two-touch tap gesture as an activation of the secondary power.
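
A rough sketch of that mapping, with hypothetical event and attribute names (kind, touches, centroid) standing in for whatever the GEIS bindings actually report:

    def handle_gesture_event(event, game):
        """Map GEIS gesture events onto Aquaria-style input (sketch only)."""
        if event.kind == "update" and event.touches == 1:
            # Single-touch drag: follow the gesture's centroid with the cursor.
            game.set_cursor(*event.centroid)
        elif event.kind == "end" and event.touches == 2:
            # Two-touch tap: gestures are only classified when they end,
            # so the gesture-end event is the earliest point we can react to it.
            game.activate_secondary_action()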

TouchViz showing gestures

I am halting development of TouchViz for the time being, as it has served its purpose as a learning tool and there are technological barriers standing in the way of turning it into a genuinely useful one. Specifically, it is impossible to “watch” gestures via GEIS without capturing them; this functionality is planned for the next Ubuntu cycle (see this #ubuntu-classroom log at 17:51). At that point I may revive TouchViz and make some of the other changes it needs, like intelligent text positioning and better readability. Such a revival would probably involve a substantial rewrite, as the ctypes-based GEIS Python bindings have been abandoned in favor of native ones (which will likely have a slightly different API).