Software Tools of the Department "Human Perception, Cognition and Action"

Face Database
This database contains images of seven views of 200 laser-scanned (Cyberware™) heads without hair. The 200 head models were newly synthesized by morphing real scans, to avoid close resemblance to individuals who might not want their faces to appear on computer screens or in scientific publications. Currently, five sets of full 3D head models are available. Please understand that we cannot provide further 3D head models, as we are obliged to protect the identity of the scanned individuals.
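The morphing used to synthesize new heads can be illustrated, in spirit, as a weighted blend of 3D scans that are in dense vertex correspondence. This is a simplified sketch under that assumption, not the institute's actual pipeline:

```python
import numpy as np

def morph_heads(scans, weights):
    """Blend registered head scans by a convex combination of vertices.

    scans:   list of (n_vertices, 3) arrays, all in dense correspondence
    weights: non-negative blend weights, one per scan
    """
    scans = np.asarray(scans, dtype=float)   # shape (k, n_vertices, 3)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                          # normalize so weights sum to 1
    return np.tensordot(w, scans, axes=1)    # weighted average mesh

# Example: blend three toy "scans" of 4 vertices each
rng = np.random.default_rng(0)
scans = [rng.normal(size=(4, 3)) for _ in range(3)]
new_head = morph_heads(scans, [0.5, 0.3, 0.2])
```

Because the result is an average of several identities, no single blended head coincides with any one scanned person.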

Face Video Database
This database contains videos of facial action units, recorded from autumn 2003 onwards at the MPI for Biological Cybernetics in the Face and Object Recognition Group (department of Prof. Bülthoff), using the Videolab facilities created by Mario Kleiner and Christian Wallraven.

Psychophysics Toolbox
Psychophysics Toolbox Version 3 (PTB-3) is a free set of Matlab and GNU/Octave functions for vision research. It makes it easy to synthesize and present accurately controlled visual and auditory stimuli and to interact with the observer. It has at least fifteen thousand active users (see Overview), an active forum, and is highly cited. PTB-3 is based on Psychophysics Toolbox Version 2 (PTB-2), but its Matlab extensions (written in C) were rewritten to be more modular and to use OpenGL. This wiki describes the current version (PTB-3), which runs with Matlab 7.x and Octave 3.2.x on Mac OS X, Linux, and Windows (see System Requirements). The old PTB-2, for Mac OS 9 and Windows, is still used in many labs and remains available, but is no longer developed or supported.

libGaze
We present a mobile system for tracking the gaze of an observer in real time as they move around freely and interact with a wall-sized display. The system combines a head-mounted eye tracker with a motion capture system that tracks markers attached to the eye tracker. Our open-source software library libGaze provides routines for calibrating the system and computing the viewer's position and gaze direction in real time. The modular architecture of our system supports simple replacement of each of the main components with alternative technology. We use the system to perform a psychophysical user study, designed to measure how users visually explore large displays. We find that observers use head movements during gaze shifts, even when these are well within the range that can be comfortably reached by eye movements alone. This suggests that free movement is important in normal gaze behaviour, motivating further applications in which the tracked user is free to move.
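The core computation such a system performs — combining the motion-captured head pose with the eye tracker's gaze direction to obtain a gaze ray in world coordinates — can be sketched as follows. The function names here are hypothetical; libGaze's actual API differs:

```python
import numpy as np

def gaze_in_world(head_position, head_rotation, gaze_dir_head):
    """Transform an eye-in-head gaze direction into world coordinates.

    head_position: (3,) head origin in world coordinates (from motion capture)
    head_rotation: (3, 3) rotation matrix, head frame -> world frame
    gaze_dir_head: (3,) gaze direction in head coordinates (from eye tracker)
    Returns the gaze ray as (origin, direction) in world coordinates.
    """
    direction = head_rotation @ np.asarray(gaze_dir_head, dtype=float)
    direction = direction / np.linalg.norm(direction)  # keep a unit vector
    return np.asarray(head_position, dtype=float), direction

def intersect_wall(origin, direction, wall_z=0.0):
    """Intersect the gaze ray with a display wall in the plane z = wall_z."""
    t = (wall_z - origin[2]) / direction[2]  # assumes ray not parallel to wall
    return origin + t * direction

# Example: head 2 m in front of the wall, looking straight ahead
origin, direction = gaze_in_world([0.0, 0.0, 2.0], np.eye(3), [0.0, 0.0, -1.0])
point = intersect_wall(origin, direction)
```

Because head pose and eye direction are measured by separate devices, either component can be swapped for alternative hardware as long as it delivers a pose or a direction in its own frame — which is the modularity the paragraph above describes.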

Former Software Tools

The following software tools were developed and used in the past and are no longer supported.
Virtual Environments Library (veLib)
The Virtual Environments Library (veLib) is a lightweight yet complete cross-platform software framework for distributed real-time virtual reality (VR) simulations. It was designed to offer a convenient and unified interface to many different input and output devices and to hide the hassle of different system architectures and communication methods from the user behind a simple generic abstraction layer. Its design goals are, roughly ordered from most to least important: scalability, extensibility, flexibility, stability, simplicity, ease of use, performance, completeness, and fanciness. More specifically, it contains ready-made interfaces to common input devices (keyboard, mouse, joysticks, game pads, etc.), a basic 3D graphics engine, a spatial audio framework, a network communication layer, basic simulation logic such as collision detection and motion models, and various auxiliary functions such as overlays, portable file input/output, and timers.
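The kind of unified device abstraction described above can be sketched as a small interface layer. This is illustrative only, with invented class names, not veLib's actual (C++) API:

```python
from abc import ABC, abstractmethod

class InputDevice(ABC):
    """Generic interface hiding device-specific details from the simulation."""

    @abstractmethod
    def poll(self):
        """Return the device's current state as a dict of named values."""

class Keyboard(InputDevice):
    def __init__(self):
        self._pressed = set()

    def press(self, key):  # stand-in for a real OS event hook
        self._pressed.add(key)

    def poll(self):
        return {"pressed": sorted(self._pressed)}

class Joystick(InputDevice):
    def __init__(self):
        self.axes = [0.0, 0.0]

    def poll(self):
        return {"axes": list(self.axes)}

class DeviceManager:
    """Lets simulation code query every device through one interface."""

    def __init__(self):
        self._devices = {}

    def register(self, name, device):
        self._devices[name] = device

    def poll_all(self):
        return {name: d.poll() for name, d in self._devices.items()}

# Example usage: the simulation loop only ever talks to the manager
mgr = DeviceManager()
kb = Keyboard()
mgr.register("keyboard", kb)
mgr.register("joystick", Joystick())
kb.press("space")
state = mgr.poll_all()
```

The payoff of such a layer is that simulation logic depends only on the generic `poll` interface, so new hardware can be supported by adding one adapter class rather than touching the simulation code.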
Last updated: Friday, 13.01.2017