As part of our research, we also develop our own software, which we gladly make available to others.
Software Tools of the Department "Human Perception, Cognition and Action"
Psychophysics Toolbox Version 3 (PTB-3) is a free set of Matlab and GNU/Octave functions for vision research. It makes it easy to synthesize and show accurately controlled visual and auditory stimuli and to interact with the observer. It has at least fifteen thousand active users (see Overview), an active forum, and is highly cited. PTB-3 is based on the Psychophysics Toolbox Version 2 (PTB-2), but its Matlab extensions (written in C) were rewritten to be more modular and to use OpenGL. This Wiki describes the current version (PTB-3), which runs with Matlab 7.x and Octave 3.2.x on Mac OS X, Linux and Windows (see System Requirements). The old PTB-2, for Mac OS 9 and Windows, is still used in many labs and remains available, but it is no longer developed or supported.
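To illustrate the style of stimulus presentation PTB-3 supports, here is a minimal sketch using its Screen API: it opens a full-screen window, draws a white disc on a black background, shows it at the next vertical retrace, and waits for a keypress. Exact setup (screen selection, calibration, timing checks) varies per experiment; this is only a sketch, not a complete experiment script.

```matlab
% Minimal PTB-3 sketch: display a white disc until a key is pressed.
AssertOpenGL;                                    % require the OpenGL-based PTB-3
screenNumber = max(Screen('Screens'));           % use the last attached display
[win, rect] = Screen('OpenWindow', screenNumber, 0);   % full-screen, black background

discRect = CenterRect([0 0 100 100], rect);      % 100 x 100 px rect at screen center
Screen('FillOval', win, 255, discRect);          % draw a white disc into the back buffer
Screen('Flip', win);                             % show it at the next vertical retrace

KbWait;                                          % wait for any keypress
Screen('CloseAll');                              % close windows and clean up
```

All calls above are standard PTB-3 functions; a real experiment would additionally verify refresh timing and restore the display in a try/catch block.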
We present a mobile system for tracking the gaze of an observer in real-time as they move around freely and interact with a wall-sized display. The system combines a head-mounted eye tracker with a motion capture system for tracking markers attached to the eye tracker. Our open-source software library libGaze provides routines for calibrating the system and computing the viewer’s position and gaze direction in real-time. The modular architecture of our system supports simple replacement of each of the main components with alternative technology. We use the system to perform a psychophysical user study, designed to measure how users visually explore large displays. We find that observers use head movements during gaze shifts, even when these are well within the range that can be comfortably reached by eye movements alone. This suggests that free movement is important in normal gaze behaviour, motivating further applications in which the tracked user is free to move.
The POETICON enacted scenario corpus is a corpus for (inter)action understanding that contains six everyday scenarios taking place in a kitchen / living-room setting. Each scenario was acted out several times by different pairs of actors and contains simple object interactions as well as spoken dialogue. Each scenario was recorded with several HD cameras and, in addition, with motion capture of the actors and several key objects. Having access to the motion capture data allows not only for kinematic analyses, but also for the production of realistic animations in which all aspects of the scenario can be fully controlled.
Software Tools of the Research Group Computational Vision and Neuroscience
Berens, P., A. S. Ecker, S. Gerwinn, A. S. Tolias and M. Bethge: Optimal population coding, revisited
Macke, J. H., S. Gerwinn, L. E. White, M. Kaschube and M. Bethge: Gaussian process methods for estimating cortical maps
Gerwinn, S., P. Berens and M. Bethge: A joint maximum-entropy model for binary neural population patterns and continuous signals
Sinz, F., E. P. Simoncelli and M. Bethge: Hierarchical Modeling of Local Image Features through Lp-Nested Symmetric Distributions
Gerwinn, S., J. H. Macke and M. Bethge: Bayesian Inference for Generalized Linear Models for Spiking Neurons
Eichhorn, J., F. H. Sinz and M. Bethge: Natural Image Coding in V1: How Much Use is Orientation Selectivity?
Sinz, F. and M. Bethge: The Conjoint Effect of Divisive Normalization and Orientation Selectivity on Redundancy Reduction