Report: Mobile Devices Shifting from Touch to Sensors

Over the next five years, mobile and wearable devices will rely less on touchscreen user interfaces and more on sensors, with the next generation of devices and the Internet of Things driving the development of voice, gesture, eye-tracking and other interfaces, according to a new study from ABI Research.

The report, "Mobile Device User Interface Innovation," looks at popular types of user interfaces and the movement of "natural sensory technologies" from the research lab to the development department. Types of user interfaces covered in the report include graphical user interfaces, home screens, sensors and perceptual computing, voice and natural language, eye tracking, gestures and proximity, sensor integration, global navigation satellite system (GNSS), GPS and augmented reality applications, as well as hybrid or blended interfaces. The report also examines the application of user interfaces to smartphones, tablets and wearables.

According to ABI Research, the shift from touch interfaces to sensors and other interfaces creates complexity for companies developing the next generation of mobile devices, and the challenge for developers will be translating that complexity into user interfaces simple enough to be intuitive. As the Internet of Things becomes reality, developers must grapple with the question of whether each device should have its own unique user interface or be controlled externally through a mobile device or centralized display.

“Touch got mobile device usability to where it is today, but touch will become one of many interfaces for future devices as well as for new and future markets,” said Jeff Orr, senior practice director at ABI Research, in a prepared statement. “The really exciting opportunity arrives when multiple user interfaces are blended together for entirely new experiences.”

In its examination of 11 unique device features from wireless connectivity to embedded sensors, the report found that from 2014 to 2019, "hand and facial gesture recognition will experience the greatest growth in future smartphone and tablet shipments," and these devices will use gesture recognition for a variety of purposes, from monitoring user attentiveness to device navigation. Ultimately, the development of new user interfaces in mobile devices will affect the design of devices for the home and car.

The full report, "Mobile Device User Interface Innovation," is available for purchase as a downloadable PDF from the ABI Research site.

About the Author

Leila Meyer is a technology writer based in British Columbia. She can be reached at [email protected].
