Office PER 21 - F428
+41 26 300 8227
The Control and Perception research group is part of the Department of Neuroscience and Movement Science of the University of Fribourg. We investigate how humans perceive and control their movements.
The visual, vestibular, and proprioceptive systems provide us with information about our body orientation and movements relative to the environment. These systems contribute to our perception of the speed and amplitude of motion, notably allowing us to distinguish our own displacements in the world (self-motion perception) from movement of surrounding objects or individuals. Combining psychophysical methods with virtual reality technology, we investigate how the signals provided by these sensory systems are integrated.
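One standard model of how such signals could be combined is maximum-likelihood cue integration, in which each sensory estimate is weighted by its reliability (the inverse of its variance). Whether human observers actually behave this way is the kind of question such experiments address; the sketch below only illustrates the model, with invented example values.

```python
# Illustrative sketch of maximum-likelihood (reliability-weighted) cue
# integration. The cue labels and numbers below are made up for the
# example; they are not measurements from this group's experiments.

def integrate(estimates):
    """Combine (mean, variance) cue estimates into one (mean, variance).

    Each cue is weighted by its inverse variance; the combined variance
    is lower than that of any single cue.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    mean = sum(w * m for w, (m, _) in zip(weights, estimates)) / total
    return mean, 1.0 / total

# Hypothetical self-motion speed estimates (m/s) from two cues:
visual = (1.0, 0.04)      # more reliable: low variance, higher weight
vestibular = (1.4, 0.16)  # less reliable: high variance, lower weight
m, v = integrate([visual, vestibular])
# The combined estimate is pulled toward the more reliable visual cue
# (m = 1.08), and its variance (0.032) is below both cue variances.
```

A useful property of this model is that the integrated estimate is always at least as reliable as the best single cue, which is one behavioral signature experimenters test for.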
Moving the eyes, hands, or whole body to act on and interact with the environment is something we do effortlessly in everyday life. Yet the nervous system has to deal with physical constraints acting on the body, and sometimes with unexpected perturbations such as an unexpectedly slippery patch of floor or a jostle while carrying a full glass. Our sensory organs not only allow us to perceive the surrounding world; they are also crucial for controlling our movements and triggering motor adjustments when required. The interplay between sensory feedback and planning strategies allows us to generate mechanically stable and highly adaptive behaviors in different contexts. By analyzing movements at multiple levels (e.g., 3D kinematics, EMG or kinetic measurements, video-ocular recordings), we seek to better understand how humans implement efficient motor strategies in normal situations, in reaction to unexpected perturbations, after injuries, or in specific pathologies.
Penalty kick simulator designed to (1) assess the penalty-scoring abilities of experienced youth and adult players, and (2) improve their performance through personalized training based on an optimization algorithm. The dive of the virtual goalkeeper is driven by the kicker's movements during the run-up to the ball, and its animation is based on the real, motion-captured movements of a professional goalkeeper. The difficulty of the task constantly adapts to the performance and progress of each individual player.
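The simulator's actual optimization algorithm is not described here; as a minimal sketch of the underlying idea of difficulty tracking each player's performance, one common scheme is a staircase rule that raises difficulty after consecutive successes and lowers it after a failure. Everything below (the 2-down/1-up rule, the level scale) is an illustrative assumption, not the group's method.

```python
# Hypothetical adaptive-difficulty rule: a simple 2-down / 1-up
# staircase. This is an illustration of performance-driven difficulty
# adaptation, not the simulator's actual optimization algorithm.

class Staircase:
    def __init__(self, level=5, step=1, lo=1, hi=10):
        self.level = level   # current difficulty (e.g., goalkeeper skill)
        self.step = step
        self.lo, self.hi = lo, hi
        self._streak = 0     # consecutive scored penalties

    def update(self, scored: bool) -> int:
        """Raise difficulty after two straight goals; lower it after a miss."""
        if scored:
            self._streak += 1
            if self._streak == 2:
                self.level = min(self.hi, self.level + self.step)
                self._streak = 0
        else:
            self._streak = 0
            self.level = max(self.lo, self.level - self.step)
        return self.level

sc = Staircase()
trials = [True, True, False, True, True, True, True]
levels = [sc.update(t) for t in trials]
# levels == [5, 6, 5, 5, 6, 6, 7]: difficulty converges toward the
# level at which the player succeeds on roughly two trials in three.
```

A rule of this form keeps each player near a fixed success rate, which is the usual rationale for individually adaptive training tasks.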
Slap-shot simulator designed to improve slap-shot performance through personalized training based on an optimization algorithm. As with the penalty simulator, the virtual goalkeeper's movements are driven by the player's movements during the preparation of the shot, and its animation is based on the real, motion-captured movements of a professional goalkeeper.
How are visual and kinesthetic/efferent signals integrated during human locomotion (e.g., walking, running)? Do the speed and characteristics of the visual scene (e.g., slope, contrast) affect perceived locomotor speed and perceived effort? Athletes walk or run on a treadmill in front of a large projection screen (5 × 3 m) while the speed and characteristics of the scene are selectively manipulated, and perceptual and physiological measurements are recorded.
What perceptual strategies (information-gathering strategies) lead to the best performance? Are some gaze patterns associated with better performance? Do the gaze patterns of experts differ from those of novices? Above is an example in which we measured the eye movements of goalkeepers whose task was to anticipate the direction of the upcoming kick by observing the kicker's run-up before ball contact. All kickers were virtual characters (i.e., avatars) whose animation was based on the real, motion-captured movements of semi-professional players. A lab-developed visualization application was used to analyze the gaze patterns.
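A typical first step in any such gaze-pattern analysis is segmenting the raw eye-movement trace into fixations. As a sketch only (the lab's own application may work differently), one widely used approach is a dispersion-threshold (I-DT) detector; the thresholds and gaze samples below are invented for the example.

```python
# Illustrative dispersion-threshold (I-DT) fixation detector: a common
# way to segment a gaze trace into fixations before comparing experts'
# and novices' gaze patterns. Parameters and data are made up.

def _dispersion(window):
    """Dispersion of a window of (x, y) samples: x-range + y-range."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(gaze, max_dispersion=1.0, min_samples=3):
    """Return (start, end) sample-index pairs of detected fixations."""
    fixations, i, n = [], 0, len(gaze)
    while i < n:
        j = i + min_samples
        if j > n:
            break
        if _dispersion(gaze[i:j]) > max_dispersion:
            i += 1          # samples too spread out: not a fixation here
            continue
        # Grow the window while the samples stay tightly clustered.
        while j < n and _dispersion(gaze[i:j + 1]) <= max_dispersion:
            j += 1
        fixations.append((i, j - 1))
        i = j
    return fixations

# Two tight clusters separated by a saccade-like jump:
trace = [(0, 0), (0.1, 0.1), (0.2, 0.0), (5, 5), (5.1, 5.0), (5.0, 5.2)]
print(idt_fixations(trace))  # prints [(0, 2), (3, 5)]
```

From the resulting fixations one can then compute the usual comparison metrics (fixation count, durations, locations on the kicker's body) for expert versus novice goalkeepers.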
Falls in elderly people often result in serious medical consequences such as fractures, trauma, disability, and reduced activity. Fall prevention is therefore both a human and an economic issue. We aim to define fall-risk indicators that could routinely be used at home to automatically monitor the evolution of fall risk over time. Individuals performing everyday motor tasks such as standing up, walking, turning around, or sitting down are monitored by a Microsoft Kinect ambient sensor (top left). Several parameters describing gait and movement patterns are automatically extracted (right). Different machine learning algorithms are then used to classify individuals based on their estimated fall risk (bottom left: low fall risk in red vs. high fall risk in blue).
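The classification step can be sketched with a deliberately minimal stand-in: a nearest-centroid classifier on two invented gait parameters. The feature names, values, and the choice of classifier below are illustrative assumptions; the actual extracted parameters and machine learning algorithms used in the project may differ.

```python
# Illustrative sketch of classifying individuals into low- vs
# high-fall-risk groups from extracted gait parameters, using a minimal
# nearest-centroid classifier. Features and values are hypothetical:
# [gait speed (m/s), sit-to-stand duration (s)].

def centroid(rows):
    """Per-feature mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def nearest_centroid_predict(x, centroids):
    """Return the label whose centroid is closest (squared Euclidean) to x."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda lbl: dist2(x, centroids[lbl]))

# Hypothetical "training" recordings for each group:
low_risk = [[1.3, 1.8], [1.2, 2.0], [1.4, 1.7]]    # fast gait, quick rise
high_risk = [[0.7, 3.5], [0.6, 4.0], [0.8, 3.2]]   # slow gait, slow rise
centroids = {"low": centroid(low_risk), "high": centroid(high_risk)}

# A new individual with slow gait and a slow sit-to-stand transition:
print(nearest_centroid_predict([0.75, 3.6], centroids))  # prints "high"
```

In practice one would normalize the features, use many more gait parameters, and cross-validate whichever classifier is chosen, but the input/output shape of the problem is the same as in this sketch.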