A Handheld Device for Real-Time Spatial Movement, Remote Control and Perspectival Orientation between Physical Objects and Virtual Environments.
The system leverages computer vision, wireless protocols and Inertial Measurement Unit (IMU) hardware to create an interactive, tangible bridge between digital and physical models: the user alters the three-dimensional perspective on a digital screen by moving an object over a replica physical model or plan.
When users want to reposition their point of view within a digital 3D model of a city, they place and orient the ViewCube device on a physical duplicate of that model, and the digital view updates in real time to match the device's position and orientation.
This kind of input is conventionally achieved by controlling a pointer with a mouse and keyboard on a display screen. The device aims to alleviate the difficulty of hand-eye and spatial coordination when positioning a virtual camera to match a location on a physical model.
Orientation and position are calculated in real time, allowing seamless hand-eye coordination between the user's hand movements and the display screen.
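The core mapping described above can be sketched as follows. This is a minimal illustrative example, not the project's actual implementation: it assumes the physical model and the digital scene share an origin corner and axis orientation, that computer vision supplies the device's (x, y) position on the physical model, and that the IMU supplies a yaw angle; the function name and parameters are hypothetical.

```python
import math

def physical_to_virtual_pose(px, py, yaw_deg,
                             model_width, model_depth,
                             scene_width, scene_depth):
    """Map a device pose on the physical model to a virtual camera pose.

    px, py      -- device position on the physical model (metres),
                   as tracked by computer vision (assumed input)
    yaw_deg     -- device heading from the IMU, in degrees (assumed input)
    model_*     -- physical model dimensions (metres)
    scene_*     -- digital scene dimensions (scene units)
    """
    # Scale physical coordinates into scene coordinates.
    sx = px / model_width * scene_width
    sy = py / model_depth * scene_depth
    # Yaw carries over directly; express it as a forward direction vector
    # so a rendering engine can aim the virtual camera.
    yaw = math.radians(yaw_deg)
    forward = (math.cos(yaw), math.sin(yaw))
    return (sx, sy), forward

# Device placed a quarter of the way along x, halfway along y,
# facing along the +y axis of a 1 m x 1 m model mapped to a
# 400 x 400 unit scene.
cam_pos, cam_dir = physical_to_virtual_pose(0.25, 0.5, 90.0,
                                            1.0, 1.0, 400.0, 400.0)
```

In a real-time loop, this function would run on every new vision/IMU sample so the on-screen camera continuously tracks the handheld device.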