OyaYansa · Posted July 10, 2017

A team of researchers at Carnegie Mellon University has developed a system that can turn an ordinary table into a touchscreen. The project, called Desktopography, combines a small overhead projector, a depth-sensing camera, and a computer to project images onto a physical surface and let users interact with them.

As you can see in the accompanying video, operation is intuitive: you interact using the same gestures as on a conventional touch panel. You can select and drag elements to rearrange them, resize windows, zoom, or attach elements to physical objects so they move along with them, among other options.

According to Robert Xiao, one of the team members, the goal of Desktopography is to provide mixed physical-virtual interaction on a desktop or any other surface. To do this, the platform takes the interfaces from our device screens and merges them with our surroundings, so that they become part of physical reality.

The way the system works is very simple. An overhead projector and a depth-sensing camera are mounted at a fixed distance above the table surface. The projector renders the interface onto the desktop, while the camera tracks objects and the movements of the user's hands. Users can keep multiple interfaces open at once, organizing the space however they want.

For now, Desktopography is only a prototype, to be presented at the Symposium on Engineering Interactive Computing Systems in Lisbon, Portugal; its developers have not yet considered commercializing it.
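To give a rough idea of how a depth camera can turn a plain table into a touch surface, here is a minimal sketch of the sensing step: a pixel whose depth reading is almost at the known table depth is treated as a fingertip contact. This is purely illustrative, assuming invented names and thresholds (`TABLE_DEPTH_MM`, `TOUCH_THRESHOLD_MM`); it is not the researchers' actual implementation.

```python
# Hypothetical sketch of depth-based touch detection for an overhead
# projector + depth camera setup. All names and values are illustrative.

TABLE_DEPTH_MM = 900       # distance from camera to the empty table surface
TOUCH_THRESHOLD_MM = 10    # a fingertip this close to the table counts as a touch


def detect_touches(depth_frame):
    """Return (x, y) pixel coordinates where the depth map indicates contact.

    depth_frame: 2D list of per-pixel distances from the camera, in mm.
    """
    touches = []
    for y, row in enumerate(depth_frame):
        for x, d in enumerate(row):
            # A pixel closer to the camera than the table belongs to a hand
            # or object; if it sits almost at table depth, treat it as a touch.
            if d < TABLE_DEPTH_MM and TABLE_DEPTH_MM - d <= TOUCH_THRESHOLD_MM:
                touches.append((x, y))
    return touches


# Tiny example frame: one fingertip 7 mm above the surface, one hand in mid-air.
frame = [
    [900, 900, 900],   # empty table
    [900, 893, 600],   # (1, 1) is a touch; (2, 1) is a hand hovering well above
    [900, 900, 900],
]
print(detect_touches(frame))  # -> [(1, 1)]
```

A real system would also have to segment hands from projected imagery, cluster adjacent touch pixels into single fingertips, and calibrate the camera against the projector, but the depth-difference test above is the basic principle that lets any flat surface behave like a touchscreen.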