
Technological Horizons | News

Flexible, Paper-Like Tablet Prototype Runs on Core i5 CPU

The PaperTab is a thin, flexible, tablet-like device running on an Intel Core i5 processor.

A Canadian university has unveiled a new 10.7-inch, paper-thin tablet-like device that users can control through traditional gestures or by moving or bending its flexible display.

The concept device, currently dubbed the "PaperTab," debuted at the 2013 International CES in Las Vegas this week. It was developed by the Queen's University Human Media Lab using technologies from Plastic Logic and Intel. Intel supplies the second-generation Core i5 processor that runs the PaperTab, while Plastic Logic manufactures the "plastic transistor technology" that makes flexible electronic devices possible. The PaperTab itself uses a 10.7-inch flexible touchscreen that users can bend to perform functions such as turning a page. As demonstrated in the lab's video, it can also sense when other PaperTabs are near, allowing multiple units to transfer documents through touch or, using proximity, to work together as multiple windows of a single application.
 

As Queen's University described it: "PaperTab can file and display thousands of paper documents, replacing the need for a computer monitor and stacks of papers or printouts. Unlike traditional tablets, PaperTabs keep track of their location relative to each other, and the user, providing a seamless experience across all apps, as if they were physical computer windows. For example, when a PaperTab is placed outside of reaching distance it reverts to a thumbnail overview of a document, just like icons on a computer desktop. When picked up or touched a PaperTab switches back to a full screen page view, just like opening a window on a computer. PaperTabs are lightweight and robust, so they can easily be tossed around on a desk while providing a magazine-like reading experience."
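The behavior described above amounts to a simple rule: a PaperTab tracked as out of the user's reach shows a thumbnail, while one that is picked up or touched shows a full page. A minimal sketch of that rule follows; the function name, the reach threshold, and the mode labels are illustrative assumptions, not part of the actual PaperTab software.

```python
# Hypothetical sketch of the proximity-based view switching described
# by Queen's University. The threshold value is an assumption chosen
# for illustration; the real system's tracking details are not public.

REACH_THRESHOLD_CM = 60.0  # assumed arm's-reach distance from the user

def view_mode(distance_cm: float, touched: bool) -> str:
    """Choose a display mode from tracked distance and touch state."""
    if touched or distance_cm <= REACH_THRESHOLD_CM:
        return "full_page"   # like opening a window on a desktop
    return "thumbnail"       # like an icon on a desktop

print(view_mode(120.0, False))  # out of reach: thumbnail
print(view_mode(120.0, True))   # touched: full_page
print(view_mode(30.0, False))   # within reach: full_page
```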

UPDATE (Jan. 9, 3:11 p.m.): The version of the PaperTab used in the demonstration features a 150 PPI screen pixel density and displays 16 levels of gray. It offers a viewing angle close to 180 degrees, owing to its reflective (as opposed to backlit) surface, according to information provided by Plastic Logic.

Said a spokesperson for the company: "The displays used in the PaperTab design concept are flexible plastic displays from Plastic Logic. These displays are very thin, superlight and extremely robust and use E Ink, which is bistable, meaning that the displays are extremely low-[power]."

According to Plastic Logic, the PaperTab is conceptual only and is not expected to be released to the public.

"PaperTab is a concept design created by Queen's University's Human Media Lab using Plastic Logic's flexible displays," the spokesperson said. "As such, it is not a finished device intended for release, but much more a vision of how computing will develop over the next three to five years. Neither Plastic Logic nor the Human Media Lab have plans to release the device--it is a concept, which will most likely evolve and ultimately be manufactured and marketed by an OEM."

"Using several PaperTabs makes it much easier to work with multiple documents," said Roel Vertegaal, director of Queen's University's Human Media Lab, in a blog post. "Within five to 10 years, most computers, from ultra-notebooks to tablets, will look and feel just like these sheets of printed color paper."

Also in a blog post on Queen's University's site, Ryan Brotman, research scientist, Experience Design Lead at Intel, explained: "We are actively exploring disruptive user experiences. The 'PaperTab' project, developed by the Human Media Lab at Queen's University and Plastic Logic, demonstrates innovative interactions powered by Intel Core processors that could potentially delight tablet users in the future."

Further information about the flexible display technology can be found on Plastic Logic's site. Additional details about the PaperTab itself can be found on Queen's University Human Media Lab's site.
