Software Tools of the Department "Human Perception, Cognition and Action"
As part of our research, we also develop our own software, which we gladly make available to others.
This database contains images of seven views of 200 laser-scanned (Cyberware TM) heads without hair. The 200 head models were newly synthesized by morphing real scans, to avoid close resemblance to individuals who might not want to appear on your computer screen or in your scientific publications. Currently, five sets of full 3D head models are available. Please understand that we cannot provide further 3D head models, as we are obliged to protect the identities of the scanned individuals.
This database contains videos of facial action units, recorded beginning in autumn 2003 in the Face and Object Recognition Group (department of Prof. Bülthoff) at the MPI for Biological Cybernetics, using the Videolab facilities created by Mario Kleiner and Christian Wallraven.
Psychophysics Toolbox Version 3 (PTB-3) is a free set of Matlab and GNU/Octave functions for vision research. It makes it easy to synthesize and present accurately controlled visual and auditory stimuli and to interact with the observer. It has at least fifteen thousand active users (see Overview), an active forum, and is highly cited. PTB-3 is based on the Psychophysics Toolbox Version 2 (PTB-2), but its Matlab extensions (written in C) were rewritten to be more modular and to use OpenGL. This Wiki describes the current version (PTB-3), which runs with Matlab 7.x and Octave 3.2.x on Mac OS X, Linux and Windows (see System Requirements). The old PTB-2, for Mac OS 9 and Windows, is still used in many labs and remains available, but it is no longer developed or supported.
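As a flavour of how PTB-3 is used, the following minimal Matlab sketch opens a window, draws a fixation dot, and shows it for one second. The gray level, dot size, and display duration here are illustrative choices, not prescribed values; see the PTB-3 documentation (`help Screen`) for the full interface.

```matlab
% Minimal PTB-3 sketch: present a fixation dot on a gray background.
% Values (gray level 128, dot size 10 px, 1 s duration) are arbitrary examples.
screens = Screen('Screens');
screenNumber = max(screens);             % use the last attached display
[win, rect] = Screen('OpenWindow', screenNumber, 128);
[cx, cy] = RectCenter(rect);             % center of the display
Screen('DrawDots', win, [cx; cy], 10, 255, [], 2);  % white anti-aliased dot
Screen('Flip', win);                     % show the drawn frame
WaitSecs(1);                             % display for one second
sca;                                     % close all PTB windows
```

Stimulus timing in PTB-3 is controlled via `Screen('Flip', ...)`, which synchronizes drawing to the display's vertical retrace and returns precise presentation timestamps.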
We present a mobile system for tracking the gaze of an observer in real-time as they move around freely and interact with a wall-sized display. The system combines a head-mounted eye tracker with a motion capture system for tracking markers attached to the eye tracker. Our open-source software library libGaze provides routines for calibrating the system and computing the viewer's position and gaze direction in real-time. The modular architecture of our system supports simple replacement of each of the main components with alternative technology. We use the system to perform a psychophysical user study, designed to measure how users visually explore large displays. We find that observers use head movements during gaze shifts, even when these are well within the range that can be comfortably reached by eye movements alone. This suggests that free movement is important in normal gaze behaviour, motivating further applications in which the tracked user is free to move.
Software Tools of the Research Group Computational Vision and Neuroscience
Berens, P., A. S. Ecker, S. Gerwinn, A. S. Tolias and M. Bethge: Optimal population coding, revisited
Macke, J. H., S. Gerwinn, L. E. White, M. Kaschube and M. Bethge: Gaussian process methods for estimating cortical maps
Gerwinn, S., P. Berens and M. Bethge: A joint maximum-entropy model for binary neural population patterns and continuous signals
Sinz, F., E. P. Simoncelli and M. Bethge: Hierarchical Modeling of Local Image Features through Lp-Nested Symmetric Distributions
Gerwinn, S., J. H. Macke and M. Bethge: Bayesian Inference for Generalized Linear Models for Spiking Neurons
Eichhorn, J., F. H. Sinz and M. Bethge: Natural Image Coding in V1: How Much Use is Orientation Selectivity?
Sinz, F. and M. Bethge: The Conjoint Effect of Divisive Normalization and Orientation Selectivity on Redundancy Reduction