The Touch Gesture Motion Conference, held October 28-30, 2014 in Austin, Texas, exemplified how far human-computer interaction has come and revealed new prospects for the future of user interfaces. The conference organizers stated that the 2014 Touch Gesture Motion (TGM) event was the tenth such event covering these topics. The initial and subsequent TGM events came on the heels of the June 2007 launch of the Apple iPhone, which popularized multi-touch user interface technology.
The iPhone was not the first handset to incorporate a capacitive touch display; that distinction belongs to the LG Prada mobile phone, first announced in December 2006. However, the iPhone set new expectations for how consumers wish to interact with their mobile devices and set the stage for the wide range of touch-screen smartphones, tablets and other devices that we use today.
TGM 2014 kicked off on October 28 with a set of five tutorials covering touch, pen input and sensor technologies, as well as gesture and motion as user interface approaches. The pre-conference tutorials provided a good overview of the current technical and market status of touch, gesture, and motion user interface technologies. Thirty-one speaker presentations over the following two days covered the technologies, applications and markets comprising the diverse touch/gesture/motion development space. The topics included stylus and touch user interfaces, haptics, sensor integration, ITO (indium tin oxide) alternatives, human-computer interaction, market forecasts, and more.
The motion tutorial, ambitiously titled History of the World of Interfaces and Interactions (but primarily Motion) and presented by Jonathan Josephson of Quantum Interface, was of particular interest to me since gesture and motion-based user interface approaches have not yet achieved widespread acceptance in mainstream consumer electronic products. Quantum Interface (QI) described itself in a press release concerning the firm’s participation in TGM 2014: “the company has developed software and middleware to enable better navigation on mobile and tablet devices, TVs, wearables, automobiles, environmental controls, and more. The technology brings together intuitive motion, eye tracking and predictive ‘Precognition’ controls for faster menu navigation and may be blended with voice and other controls.” An introduction video illustrating Quantum Interface’s motion-based user interface can be viewed here.
Source: Quantum Interface
At TGM 2014, Josephson provided historical background and context for the development of QI’s motion user interface, which he described as “analogue” and “vector-based.” The presenter identified the five ways that humans interface with things today as touch, voice, body language, gestures and motion. Josephson described motion as “a change in position of an object with respect to time and its reference point. Motion is typically described in terms of displacement, direction, velocity, acceleration, and time.” He went on to describe the three aspects of Motion as an Interface as Menuing, Scrolling, and Attribute Control. The central idea is that all three aspects may be combined to form a complete motion-based user interface in which operations are performed in a continuous rather than a discrete manner. Nearing the end of his presentation, Josephson described an automotive head-up display (HUD) based user interface incorporating eye tracking for control purposes (photo below). A demo of this HUD was available to view and interact with in the tabletop demonstration area of the TGM meeting. The HUD demoed by QI at TGM 2014 was indicative of the undercurrent at the event concerning potential automotive applications of touch, gesture and motion user interfaces.
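To make the Menuing/Scrolling/Attribute Control idea concrete, here is a minimal sketch of how a single continuous motion vector could drive all three aspects at once. This is not QI’s implementation; the menu options, layout directions and scaling factors are hypothetical, chosen only to illustrate vector-based, continuous control:

```python
import math

# A minimal, generic sketch of "motion as an interface": the direction of
# motion selects among menu options (menuing), sustained motion advances the
# selection (scrolling), and the speed of motion continuously adjusts a value
# (attribute control). This is NOT Quantum Interface's actual algorithm --
# just an illustration of treating motion as a continuous, vector-based input.

MENU = ["Music", "Maps", "Phone", "Settings"]  # hypothetical menu options

def motion_vector(p0, p1, dt):
    """Two pointer samples taken dt seconds apart -> (unit direction, speed)."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dt <= 0:
        return None, 0.0
    return (dx / dist, dy / dist), dist / dt

def nearest_menu_item(direction):
    """Menuing: pick the option whose layout direction best matches the motion."""
    # Assume the four options fan out right, up, left and down from the cursor.
    layout = [(1, 0), (0, -1), (-1, 0), (0, 1)]
    scores = [direction[0] * lx + direction[1] * ly for lx, ly in layout]  # dot products
    return MENU[scores.index(max(scores))]

# Example: a pointer moving up and to the right over 50 ms.
direction, speed = motion_vector((100, 200), (130, 160), 0.05)
if direction:
    item = nearest_menu_item(direction)       # menuing: direction selects
    scroll_rate = speed * 0.05                # scrolling: rate follows speed
    attribute = min(1.0, speed / 2000.0)      # attribute (e.g. volume) from speed
    print(f"selected={item} scroll={scroll_rate:.0f}px/s attr={attribute:.2f}")
```

Note how nothing here is a discrete button press: selection, scrolling rate and the attribute value all vary smoothly with the motion itself, which is the continuous quality Josephson emphasized.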
Source: Quantum Interface
Josephson’s TGM Motion tutorial delivered an incisive statement of the relationship of gestures to motion-based user interfaces. I look forward to following QI’s future efforts and to assessing the impact of motion-based interfaces in upcoming products and applications.
Also appealing at TGM 2014 were Canatu’s 3D-shaped transparent sensor demos (photos below), incorporating the firm’s Carbon NanoBud (CNB) flexible transparent conducting film (TCF).
Source: Canatu
The demonstrator was fabricated with complex shapes using industry-standard film insert molding processes in a test mold featuring sharp edges, a deep draw to a depth of 7 mm at a diameter of 38 mm, and 90-degree bends at a bending radius as small as 1 mm. Canatu’s CNB technology has been promoted as a contender to replace ITO TCFs in some applications. The most interesting property of the Canatu 3D-shaped demonstrator was that the CNB film could be stretched locally by as much as 120% while still retaining its conductive properties. When I spoke with Canatu’s Bob Senior, he told me that the CNB TCF can be stretched because the individual CNB strands “slide” over each other and retain electrical continuity when the material is stretched during molding and forming.
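To put those forming numbers in perspective, here is a back-of-the-envelope sketch (assuming, purely for illustration, that the deep-drawn region approximates a spherical cap; the dimensions are from the demonstrator above, but the real test mold has sharp edges where local stretch far exceeds this average):

```python
import math

# Rough average stretch implied by Canatu's deep-draw demonstrator: a 38 mm
# diameter region formed to a depth of 7 mm. Purely for illustration we model
# the formed region as a spherical cap; the actual test mold has sharp edges,
# where local stretch far exceeds the average -- which is why 120% local
# stretchability matters.
a, h = 19.0, 7.0                      # cap base radius and draw depth, mm
R = (a**2 + h**2) / (2 * h)           # radius of the sphere the cap sits on
cap_area = 2 * math.pi * R * h        # surface area of the spherical cap
flat_area = math.pi * a**2            # area of the flat film before forming
print(f"average area increase: {100 * (cap_area / flat_area - 1):.0f}%")
# -> about 14% on average; sharp corners concentrate far more strain locally
```

As the Quantum Interface and Canatu demos at TGM 2014 illustrated, there is a lot to follow and look forward to as developers bring new user interface technologies to market. – Phil Wright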