Report: Mobile Devices Shifting from Touch to Sensors
Over the next five years, mobile and wearable devices will rely less on touchscreen user interfaces and more on sensors, with the next generation of devices and the Internet of Things driving the development of voice, gesture, eye-tracking and other interfaces, according to a new study from ABI Research.
The report, "Mobile 
Device User Interface Innovation," looks at popular types of user interfaces 
and the emergence of "natural sensory technologies" from the research lab to the 
development department. Types of user interfaces covered in the report include graphical user interfaces, home screens, sensors and perceptual computing, voice and natural language, eye tracking, gestures and proximity, sensor integration, Global Navigation Satellite System (GNSS)/GPS and augmented reality applications, as well as hybrid or blended interfaces. The report also examines the application of user interfaces to smartphones, tablets and wearables.
According to ABI Research, the shift from touch interfaces to sensors and 
other interfaces creates complexity for companies developing the next generation 
of mobile devices, and the challenge for developers will be translating that 
complexity into user interfaces simple enough to be intuitive. As the Internet of Things becomes a reality, developers must grapple with the question of
whether each device should have its own unique user interface or whether the 
devices should be controlled externally through a mobile device or centralized 
display.
“Touch got mobile device usability to where it is today, but touch will 
become one of many interfaces for future devices as well as for new and future 
markets,” said Jeff Orr, senior practice director at ABI Research, in a prepared 
statement. “The really exciting opportunity arrives when multiple user 
interfaces are blended together for entirely new experiences.”
In its examination of 11 unique device features from wireless connectivity to 
embedded sensors, the report found that from 2014 to 2019, "hand and facial 
gesture recognition will experience the greatest growth in future smartphone and 
tablet shipments," and these devices will use gesture recognition for purposes ranging from monitoring user attentiveness to navigating the device.
Ultimately, the development of new user interfaces in mobile devices will affect 
the design of devices for the home and car. 
The full report, "Mobile Device User Interface Innovation," is available for 
purchase as a downloadable PDF from the
ABI Research site.
About the Author
Leila Meyer is a technology writer based in British Columbia. She can be reached at [email protected].