The Visual and Cognitive Neuroscience Lab investigates how information from audition, emotion, cognition and prediction influences visual processing. Accordingly, our research is divided into several topics.
We demonstrated that the information content of sounds is represented at the earliest level of visual processing in the human brain. This sound representation exists even in the complete absence of visual input, both in blindfolded sighted participants and in individuals who are blind from birth.
Currently, we are exploring the kind of auditory information that is represented in visual cortex, and how auditory information influences vision when visual input is ambiguous. We are also studying the effect of sounds on eye movements in the absence of vision and visual awareness.
Our current work on this topic is funded by a PRIMA grant to Dr. Petra Vetter from the Swiss National Science Foundation.
Vetter, P., Smith, F. W. & Muckli, L. (2014). Decoding sound and imagery content in early visual cortex. Current Biology, 24(11), 1256-1262. (Featured in Science, 2014, 345(6193), 176-177 as Research News from other journals).
Vetter, P., Bola, L., Reich, L., Bennett, M., Muckli, L. & Amedi, A. (2020). Decoding natural sounds in early “visual” cortex of congenitally blind individuals. Current Biology, 30(15), 3039-3044.e2.
We also investigate the influence of emotional information on vision. We demonstrated that emotional faces can guide the eyes even in the absence of visual awareness.
Currently, we are exploring the neural correlates of emotional information influencing vision, when visual information is ambiguous or suppressed from awareness.
Vetter, P., Badde, S., Phelps, E. A. & Carrasco, M. (2019). Emotional faces guide the eyes in the absence of awareness. eLife, 8, e43467.
Our work bears implications for psychological and philosophical theories on how vision is affected by cognition. We have argued that there are several ways in which cognitive contents can influence visual perception, and that there is no clear-cut boundary between perception and cognition. We also aim to address this question experimentally.
Vetter, P. & Newen, A. (2014). Varieties of cognitive penetration in visual perception. Consciousness & Cognition, 27C, 62-75.
Newen, A. & Vetter, P. (2017). Why cognitive penetration of our perceptual experience is still the most plausible account. Consciousness & Cognition, 47, 26-37.
Our work demonstrates that visual cortex receives meaningful information from the rest of the brain, very likely via feedback connections. One potential function of this feedback to visual cortex is the prediction of visual input. We investigated predictive feedback to early visual cortex in several studies.
Vetter, P., Grosbras, M.-H. & Muckli, L. (2015). TMS over V5 disrupts motion prediction. Cerebral Cortex, 25(4), 1052-1059.
Edwards, G., Vetter, P., McGruer, F., Petro, L. & Muckli, L. (2017). Predictive feedback to V1 dynamically updates with sensory input. Scientific Reports, 7, 16538.
Vetter, P., Edwards, G., & Muckli, L. (2012). Transfer of predictive signals across saccades. Frontiers in Psychology, 3, 176.
Vetter, P., Sanders, L. & Muckli, L. (2014). Dissociation of prediction from conscious perception. Perception, 43, 1107-1113.
In earlier work, we investigated the role of attention and awareness in the visual perception of numerosity.
Vetter, P., Butterworth, B. & Bahrami, B. (2011). A candidate for the attentional bottleneck: Set-size specific modulation of right TPJ during attentive enumeration. Journal of Cognitive Neuroscience, 23(3), 728-736.
Bahrami, B., Vetter, P., Spolaore, E., Pagano, S., Butterworth, B. & Rees, G. (2010). Unconscious numerical priming despite interocular suppression. Psychological Science, 21(2), 224-233.
Vetter, P., Butterworth, B. & Bahrami, B. (2008). Modulating attentional load affects numerosity estimation: Evidence against a pre-attentive subitizing mechanism. PLoS ONE, 3(9), e3269.