Albert van der Veer

Alumni Group Virtual Reality
Alumni of the Department Human Perception, Cognition and Action
Alumni of the Research Group Body and Space Perception

Main Focus

The embodiment of perspectival experience

Albert H. van der Veer


In the normal case of having a perceptual experience, i.e. of being phenomenally conscious of getting acquainted with something, we stand in a bi-directional relation with what we perceive: we have a perspective on what we are perceiving, and what we perceive in turn provides us with a perspective on it. It is specifically our body, with its characteristics, that plays a central, constitutive role in shaping this perspectival character of our perceptual experiences. My current project is concerned with the experimental study of how exactly our bodies are involved in the perspectival character of our perception of ourselves as well as of the external world, i.e. with embodied perspectival experience.


To this end, I test the following:

(1) How the human body is involved in how people determine their self-location, as well as the locations of various body parts, when asked to point directly at themselves or at those body parts, respectively.

(2) How the human body is involved in how people determine the locations of external objects, when asked to judge whether objects are to their left or to their right while standing with their torso and head rotated into various positions.

(3) How the size of experienced environments may influence perception of one's own body size, and how perceived own body size and viewpoint height may in turn influence experienced environmental size.

Ultimately, we are interested in how exactly bodily perception and spatial perception may be mutually influential in perspectival experiences of self and the external world (objects and environments).


I design behavioural experimental studies employing different types of virtual reality (VR) setups. These can involve various state-of-the-art VR headsets, as well as our Panolab large-screen immersive display. My experiments are developed mainly in Unity, and the data are analysed mainly in SPSS and Matlab. Experimental design and model fitting often employ psychophysical methods.
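To illustrate the kind of psychophysical model fitting involved, here is a minimal sketch (in Python rather than Matlab, with invented synthetic data) that fits a cumulative-Gaussian psychometric function to left/right judgements by grid-search maximum likelihood; the stimulus values, trial counts, and search ranges are all illustrative assumptions, not values from our experiments:

```python
import math

def cum_gauss(x, mu, sigma):
    # Cumulative Gaussian: probability of responding "right" to azimuth x
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def fit_psychometric(stimuli, n_right, n_trials):
    """Grid-search maximum-likelihood estimates of mu (PSE) and sigma (slope)."""
    best_mu, best_sigma, best_ll = 0.0, 1.0, -float("inf")
    for mu10 in range(-100, 101):        # mu from -10 to 10 deg in 0.1 steps
        for sigma10 in range(1, 101):    # sigma from 0.1 to 10 deg in 0.1 steps
            mu, sigma = mu10 / 10.0, sigma10 / 10.0
            ll = 0.0
            for x, k, n in zip(stimuli, n_right, n_trials):
                # Clip probabilities to keep the log-likelihood finite
                p = min(max(cum_gauss(x, mu, sigma), 1e-9), 1.0 - 1e-9)
                ll += k * math.log(p) + (n - k) * math.log(1.0 - p)
            if ll > best_ll:
                best_mu, best_sigma, best_ll = mu, sigma, ll
    return best_mu, best_sigma

# Synthetic data: object azimuths (deg) and "right" responses out of 20 trials
stimuli = [-8, -4, -2, 0, 2, 4, 8]
n_right = [1, 3, 7, 10, 14, 18, 19]
n_trials = [20] * len(stimuli)
mu, sigma = fit_psychometric(stimuli, n_right, n_trials)
print(f"PSE = {mu:.1f} deg, slope sigma = {sigma:.1f} deg")
```

The point of subjective equality (PSE) estimates the azimuth a participant treats as "straight ahead"; in practice such fits would be done per participant and condition with dedicated tools rather than a grid search.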

Initial results and conclusions

(1.a) Participants locate themselves neither at points randomly distributed across the body, nor always in the same region of the body [1, 3].

(1.b) In a VR headset, participants point at their various body parts less veridically than in our Panolab, particularly at their feet, knees, and the tops of their heads. This may result from their having no visual access to their body in the headset.

(1.c) Taking into account where participants indicate their body parts to be in the different VR setups, pointing to self (on bodies rescaled on the basis of the pointing to body parts) becomes very similar in the VR headset and the Panolab, as well as more similar to pointing to self in a previous setup employing a physical pointer [3]: pointing is then mainly to the upper torso and the lower and upper face.

(2) When participants judge whether objects are to their left or to their right, torso- and head-centered frames of reference seem to make independent contributions to their egocentric spatial judgements. A stronger contribution of the torso than of the head to such judgements was found previously [2]. Our current work cannot yet be interpreted conclusively, but seems to suggest the opposite.
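One simple way to make the idea of independent frame-of-reference contributions concrete is as a weighted combination of the object's azimuth in torso- and head-centered coordinates. The sketch below is purely illustrative; the weight value is an assumption for demonstration, not a quantity fitted to our data:

```python
def judge_side(obj_deg, torso_deg, head_deg, w_torso=0.6):
    """Predict a left/right judgement from a weighted mix of two frames.

    Angles are azimuths in degrees (positive = rightward); w_torso is the
    illustrative weight given to the torso-centered frame of reference.
    """
    rel_torso = obj_deg - torso_deg   # object azimuth relative to the torso
    rel_head = obj_deg - head_deg     # object azimuth relative to the head
    combined = w_torso * rel_torso + (1.0 - w_torso) * rel_head
    return "right" if combined > 0 else "left"

# Object straight ahead in the room, torso rotated 20 deg left, head 10 deg right:
print(judge_side(0, -20, 10, w_torso=0.6))  # torso-dominant weighting -> "right"
print(judge_side(0, -20, 10, w_torso=0.2))  # head-dominant weighting  -> "left"
```

With the torso and head rotated to opposite sides, the same object can flip from "right" to "left" depending on which frame dominates, which is exactly the kind of dissociation the rotated-posture task is designed to expose.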


1. Alsmith AJT and Longo MR (2014) Where exactly am I? Self-location judgements distribute between head and torso. Consciousness and Cognition 24: 70–74.

2. Longo MR and Alsmith AJT (2013) Where is the Ego in Egocentric Representation? Perception (ECVP abstract) 42: 53.

3. van der Veer AH, Longo MR, Alsmith AJT, Wong HJ and Mohler BJ (2017) Where am I in virtual reality? 5th Mind, Brain & Body Symposium (MBBS 2017), Berlin, Germany, March 2017.

Finding perspective: Determining the embodiment of perspectival experience

Curriculum Vitae

M.Sc. in psychology (Utrecht University).
