Handling Input from Touchscreens
To get touchscreen input to work, there are a few components that need to be available and work together:
- Kernel driver for the touch input hardware.
- The Xorg input driver for the touchscreen. (Note: standard mouse modules will not work, as touch input hardware usually reports its position in absolute coordinates, rather than the relative coordinates mice produce.)
- A toolkit which supports touch gestures (e.g. GTK3? Qt5?).
- A virtual keyboard.
For the A10 tablet that I have, touch input is handled by the ft5x_ts module, so that part is easily sorted (other tablets may not be so lucky - some use touchscreens whose drivers are unavailable).
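If the driver is built as a module, a quick sanity check is to load it and watch for a new input device. A minimal sketch follows; the module name is the one from my tablet, and the event node is only an example (yours may differ):

```sh
# Run as root. ft5x_ts is the driver for my tablet; substitute your own.
modprobe ft5x_ts
dmesg | tail                  # look for the driver binding to the hardware
ls /dev/input/                # a new eventN node should have appeared
evtest /dev/input/event1     # dump raw touch events (evtest package; event1 is an example)
```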
The Xorg input driver module is kindly donated by Pengutronix, here: http://www.pengutronix.com/software/xf86-input-tslib/index_en.html . This Xorg module requires tslib to build, and it provides mouse-like input. As far as I know it is not multi-touch capable.
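For reference, here is a rough sketch of how the pieces are wired together. The device path is an example, and the exact Xorg option names depend on the driver version, so treat this as a starting point rather than a known-good config (check the driver's own README for the authoritative list):

```sh
# Run as root. tslib finds the touchscreen via TSLIB_TSDEVICE;
# ts_calibrate writes its calibration data to /etc/pointercal.
export TSLIB_TSDEVICE=/dev/input/event1   # example path
ts_calibrate

# Minimal Xorg InputDevice section for the tslib driver. Option names
# here are assumptions from memory; older servers may also need this
# section referenced from ServerLayout.
cat > /etc/X11/xorg.conf.d/99-tslib.conf <<'EOF'
Section "InputDevice"
    Identifier "touchscreen"
    Driver     "tslib"
    Option     "Device"         "/dev/input/event1"
    Option     "ScreenNumber"   "0"
    Option     "Rotate"         "NONE"
    Option     "SendCoreEvents" "true"
EndSection
EOF
```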
Next, one needs a touch-aware toolkit so that applications can use the (by now) familiar idioms like tap-drag to scroll, pinch in/out to zoom, etc. Unfortunately, the toolkit I'm using is GTK2, and GTK2 isn't touch-aware, so all interactions need to be done using mouse idioms (scrollbars, etc.). This is difficult, especially on smaller tablets (mine has a 7-inch screen). Sure, one can enlarge the scrollbars, but then precious screen space is wasted.
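For what it's worth, enlarging the scrollbars in GTK2 is just a theme tweak. Something along these lines makes them finger-friendly (the pixel values are arbitrary; pick your own trade-off between usability and wasted space):

```sh
# Append a touch-friendly scrollbar style to the GTK2 user config.
# slider-width and min-slider-length are standard GTK2 style properties.
cat >> ~/.gtkrc-2.0 <<'EOF'
style "touch-scrollbar" {
    GtkRange::slider-width          = 30
    GtkScrollbar::min-slider-length = 50
}
class "GtkScrollbar" style "touch-scrollbar"
EOF
```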
Lastly, one needs a virtual keyboard (otherwise, how do you do data entry?). Here, one can use xvkbd or matchbox-keyboard, among others.
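Both are ordinary X clients, so starting one is enough to get a tappable on-screen keyboard:

```sh
xvkbd &               # on-screen keyboard; synthesizes keystrokes via the XTEST extension
# or
matchbox-keyboard &   # lightweight keyboard aimed at small/handheld screens
```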
I haven't tested GTK3 or any other toolkits designed with touchscreens in mind, so I can't vouch for their effectiveness. As for the setup outlined above: it works, but it is painful to use.
Thus I am still of the opinion that the best way to do real work on a tablet is to attach a physical keyboard and mouse (a large enough monitor is a bonus). Touch is fun for browsing and consuming content, but rather clumsy for "real work".