AR-boxes

A stereoscopic video (cross-eyed) showing an interactive testbed scenario. Resolving occlusions and casting shadows makes high-fidelity interaction possible. Haptic feedback and physics are generated using chai3D.


Occlusions & shadows

A stereoscopic video (cross-eyed) showing the incorporation of occlusions and soft shadows. The approach does not require a priori information such as background or color models. We use a depth map reconstructed from passive stereo with three cameras. The depth initializes an occlusion map, which is then refined to align with image edges using the total variation of the color images. The algorithm is highly flexible and leaves room for future improvements, e.g., the incorporation of temporal constraints or scene knowledge. Thanks to a GPU implementation, the current version runs at 20 fps.
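
The two stages above can be sketched as follows. This is a minimal, hypothetical illustration, not the GPU implementation: the thresholded depth comparison gives the initial occlusion map, and a simple edge-stopping diffusion stands in for the actual total-variation refinement (all parameter values are made up).

```python
import numpy as np

def occlusion_mask(real_depth, virtual_depth, eps=0.01):
    # A real-scene pixel occludes the virtual object where the
    # reconstructed depth lies in front of the rendered depth.
    return (real_depth < virtual_depth - eps).astype(float)

def edge_aware_refine(mask, gray, iters=100, beta=10.0, step=0.2):
    # Simplified stand-in for the TV-based refinement: diffuse the soft
    # mask, but damp diffusion across strong image gradients so the
    # occlusion boundary snaps to edges in the color image.
    m = mask.copy()
    gy, gx = np.gradient(gray)
    w = np.exp(-beta * np.hypot(gx, gy))  # small weight at image edges
    for _ in range(iters):
        lap = (np.roll(m, 1, 0) + np.roll(m, -1, 0) +
               np.roll(m, 1, 1) + np.roll(m, -1, 1) - 4.0 * m)
        m = np.clip(m + step * w * lap, 0.0, 1.0)
    return m
```

The refined soft mask can then be used directly as per-pixel alpha when compositing the virtual object over the camera image.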

Visual delays

A video depicting visual delays. Measurements revealed a visual end-to-end delay of 66 ms in our setup. The delay results mainly from the hardware, while image processing and rendering take only about 5 ms. The depicted experimental setup was used to evaluate the effects of visual and haptic delays on the perception of stiffness. The study showed that a visual delay makes objects feel stiffer. In contrast, a haptic delay softens the perceived stiffness of an object, which can be used to compensate for the effect.
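
The compensation idea can be illustrated with a toy sketch (not our actual haptic loop): rendering the spring force on a delayed penetration signal lowers the effective stiffness felt by the user, which can offset the stiffening caused by the visual delay. The gain and delay values here are purely illustrative.

```python
from collections import deque

def make_delayed_spring(k, delay_steps):
    # Hypothetical sketch: the penetration signal passes through a ring
    # buffer, so the rendered spring force lags by delay_steps samples
    # of the haptic loop.
    buf = deque([0.0] * delay_steps, maxlen=delay_steps + 1)
    def force(penetration):
        buf.append(penetration)
        return -k * buf[0]  # oldest sample in the buffer
    return force
```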


Visual accuracy

The video shows a test sequence used to determine the visual accuracy. Comparing the detected corner positions with their projected overlay coordinates showed that the combined influence of static and dynamic errors yields a backprojection accuracy of 1.29 +/- 0.58 pixel, at an average corner movement in image space of 0.4 +/- 0.22 pixel/ms. The camera moved with a translational velocity of 0.66 +/- 0.32 mm/ms and a rotational velocity of 0.063 +/- 0.034 degree/ms. The high accuracy can be attributed to the precise synchronization and the custom-made infrared marker.
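
The accuracy figure is simply the per-corner Euclidean distance between detection and overlay, summarized as mean +/- standard deviation; a minimal sketch of that evaluation:

```python
import numpy as np

def backprojection_error(detected, projected):
    # Per-corner Euclidean distance (in pixels) between detected corner
    # positions and the projected overlay coordinates, summarized as
    # mean and standard deviation.
    d = np.linalg.norm(np.asarray(detected) - np.asarray(projected), axis=1)
    return d.mean(), d.std()
```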

Dynamic camera pose refinement

The video shows a dynamic camera pose refinement using visual landmarks. In our setup, we used 14 landmarks and 3 cameras at a resolution of 800x600 (210 features at 1,440,000 pixels). The results showed that the camera pose can be corrected in less than 25 ms and that the backprojection error could be reduced to 0.19 +/- 0.28 pixel.
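
The refinement principle — iteratively adjusting the pose to minimize the landmarks' backprojection error — can be sketched in simplified form. This toy version refines only the camera translation with Gauss-Newton under a pinhole model (rotation assumed already accurate); it is not the actual 6-DoF multi-camera implementation.

```python
import numpy as np

def refine_translation(pts3d, pts2d, K, t0, iters=10):
    # Gauss-Newton refinement of the camera translation t, minimizing
    # the backprojection error of the landmarks (R = I for simplicity).
    t = np.asarray(t0, dtype=float).copy()
    for _ in range(iters):
        P = pts3d + t                       # points in the camera frame
        proj = (K @ P.T).T
        proj = proj[:, :2] / proj[:, 2:3]   # perspective division
        r = (proj - pts2d).ravel()          # residual in pixels
        fx, fy = K[0, 0], K[1, 1]
        J = []
        for X, Y, Z in P:                   # Jacobian of projection w.r.t. t
            J.append([fx / Z, 0.0, -fx * X / Z**2])
            J.append([0.0, fy / Z, -fy * Y / Z**2])
        t -= np.linalg.lstsq(np.asarray(J), r, rcond=None)[0]
    return t
```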

Haptic augmentation

A prototypical breast palpation simulator, developed in collaboration with POSTECH, South Korea [Jeon et al. 2010]. The idea of the scenario is to simulate a virtual tumor inside a real breast model. A physical silicone model produces natural haptic feedback for breast tissue deformation, while the AR system is responsible for the force modulation caused by the presence of the tumor.
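
The division of labor can be sketched as follows. This is a hypothetical simplification, not the method of [Jeon et al. 2010]: the silicone provides the baseline tissue response for free, and the device only adds an extra stiffness term while the tool presses above the embedded virtual tumor (the gain dk and the radius are invented values).

```python
import numpy as np

def tumor_force(tool_pos, tumor_center, tumor_radius, penetration, dk=150.0):
    # Hypothetical force modulation: extra stiffness is rendered only
    # inside the tumor region, on top of the natural feedback of the
    # physical silicone model.
    inside = np.linalg.norm(np.asarray(tool_pos) - np.asarray(tumor_center))
    if inside < tumor_radius:
        return dk * penetration
    return 0.0
```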

Remote-controlled haptics

The setup allowed different configurations. The feasibility of a remote-controlled haptic system was demonstrated in an experiment aimed at inducing an illusion of altered body perception under the influence of delays. One of the haptic devices, the haptic slave, was coupled to the other, the haptic master, via a spring-damper system.
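
The spring-damper coupling amounts to a virtual spring and damper stretched between the two devices; a minimal sketch (the gains k and b are illustrative, not the values used in the experiment):

```python
def coupling_force(x_m, v_m, x_s, v_s, k=200.0, b=2.0):
    # Virtual spring-damper between master and slave: this force drives
    # the slave toward the master's position and velocity, while its
    # reaction is fed back to the master.
    return k * (x_m - x_s) + b * (v_m - v_s)
```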

Visuo-haptic setup

A short video describing our setup. The video was prepared for an internal workshop but gives a good overview.

VR vs. AR

The video gives an idea about the difference between virtual and augmented reality and the collocation of visual and haptic feedback in visuo-haptic AR.

Recorded deformations

A video showing image-based rendering of virtual objects. Visual deformations and haptic feedback are based on recordings of the real object. The work was carried out as part of the Immersence project.

Dual haptic devices

The distributed setup allows extensions for multiple users. The video depicts an interaction with two haptic devices.

Image-based rendering

A video showing image-based rendering of virtual objects. The work was carried out as part of the Immersence project.

Prototype for medical training

A prototype for medical training demonstrating the fidelity of the setup. The incisions show the high accuracy of the haptic interaction.

2 player ping pong game

A two player extension of the ping pong game, which was used for testing a collaborative setup.

AR-kanoid

Our interpretation of the cult game Arkanoid.

Mass spring deformations

Our second demonstrator. We simulated a virtual cylinder using mass-spring deformations. Realism was improved with texture mapping and stencil-buffer shadows. The application served as a testbed for an image-based camera pose refinement.
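
The core of such a mass-spring simulation is a short integration loop; a minimal sketch (explicit Euler with simple velocity damping, not the demonstrator's actual code or parameters):

```python
import numpy as np

def mass_spring_step(pos, vel, springs, rest, k, mass, dt, damping=0.98):
    # One explicit-Euler step of a mass-spring system: accumulate the
    # spring force along each edge, then integrate velocity and position.
    f = np.zeros_like(pos)
    for (i, j), L0 in zip(springs, rest):
        d = pos[j] - pos[i]
        L = np.linalg.norm(d)
        if L > 1e-12:
            fs = k * (L - L0) * d / L   # Hooke's law along the edge
            f[i] += fs
            f[j] -= fs
    vel = damping * (vel + dt * f / mass)
    return pos + dt * vel, vel
```

For a cylinder, `springs` would connect neighboring vertices of the mesh; each call advances the deformation by one time step.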

1 player ping pong game

This is how it started! Our first visuo-haptic application: a ping pong game for one player. Visual feedback was only monocular, but shadow casting still provided a good perception of the ball position. The work served as a testbed for the haptic calibration.


© 2010 Benjamin Knoerlein