I work at the Vrije Universiteit (VU) Amsterdam, where I investigate the neural mechanisms involved in visual perception, consciousness, attention, surface perception, object categorization and working memory. I am currently interested in how the experience of seeing depends on a person's current goals and expectations. How do functional aspects of vision - as when looking for objects or predicting when or where objects will appear - determine conscious experience? Do you see different things depending on your current state of mind, or on where you have just been? It has long been thought that the visual system works much like a camera, building up a veridical image of the outside world inside our brains. This idea has come under increasing pressure: vision and conscious experience appear to be more idiosyncratic than previously thought, and in large part determined by our current task goals and motivational states, as well as by our personal history and environmental context.
During my PhD I used a combination of psychophysical methods, EEG and fMRI to determine the processing stages involved in conscious and unconscious vision. My first experiments focused on whether the initial sweep of cortical processing is consciously accessible, and on which stages of information processing correlate with conscious perception. Using a masking paradigm, I showed that the brain detects stimuli during the first sweep of cortical processing, even when subjects are unaware of ever having processed these stimuli, culminating in a paper that is now highly cited in the literature. In a follow-up study, I showed that the brain can unconsciously extract even highly complex information, such as the presence of 'invisible' faces in one's field of view, and that conscious experience emerges only when recurrent interactions take place between higher and lower cortical areas. Together, these experiments indicate that neuronal integration is the primary neural correlate of conscious experience. Counterintuitively, I recently extended these findings to show that this hallmark of conscious experience - neuronal integration - can also take place when subjects cannot report on their experience because their attention is temporarily diverted elsewhere. This strongly suggests that consciousness can exist at different levels in the brain, even at levels that we ourselves cannot access.
The Amsterdam Decoding and Modeling Toolbox (ADAM)
I have recently developed a Matlab toolbox for performing multivariate analyses on EEG and MEG data, allowing one to determine and model the relationship between psychological states (experimental conditions) and brain dynamics (neural mechanisms) over time. I am currently working with Joram van Driel to make the toolbox more user friendly by writing wrapper functions and creating a manual/tutorial, which will be presented during a workshop/hackathon at the ICON conference in Amsterdam, 5-8 August 2017.
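The core idea behind this kind of analysis - training a classifier at each timepoint to distinguish experimental conditions from the multichannel signal, yielding decoding accuracy over time - can be sketched in a few lines. The sketch below is a minimal Python/NumPy illustration using a simple nearest-class-mean classifier on synthetic data; it is not ADAM's actual Matlab API, and all function and variable names here are hypothetical.

```python
import numpy as np

def decode_over_time(X, y, n_folds=5, seed=0):
    """Per-timepoint decoding accuracy with k-fold cross-validation.

    X : (n_trials, n_channels, n_times) epoched data
    y : (n_trials,) binary condition labels (0 or 1)
    Returns an (n_times,) array of cross-validated accuracy.
    """
    rng = np.random.default_rng(seed)
    n_trials, _, n_times = X.shape
    folds = np.array_split(rng.permutation(n_trials), n_folds)
    acc = np.zeros(n_times)
    for t in range(n_times):
        Xt = X[:, :, t]                      # trials x channels at time t
        correct = 0
        for test_idx in folds:
            train = np.ones(n_trials, bool)
            train[test_idx] = False
            # nearest-class-mean classifier, fit on training trials only
            mu0 = Xt[train & (y == 0)].mean(axis=0)
            mu1 = Xt[train & (y == 1)].mean(axis=0)
            d0 = np.linalg.norm(Xt[test_idx] - mu0, axis=1)
            d1 = np.linalg.norm(Xt[test_idx] - mu1, axis=1)
            correct += ((d1 < d0).astype(int) == y[test_idx]).sum()
        acc[t] = correct / n_trials          # each trial tested exactly once
    return acc

# Synthetic demo: a condition difference appears only late in the epoch
rng = np.random.default_rng(1)
n_trials, n_channels, n_times = 80, 16, 40
X = rng.normal(size=(n_trials, n_channels, n_times))
y = np.repeat([0, 1], n_trials // 2)
X[y == 1, :, 20:] += 1.0                     # effect from timepoint 20 onward
acc = decode_over_time(X, y)
```

On this synthetic data, accuracy hovers around chance (0.5) before the effect onset and rises well above it afterwards - the basic signature one looks for in real EEG/MEG decoding.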
Until then, you are free to download the toolbox from GitHub and experiment with it, though there is no guarantee that it is bug-free. The toolbox is updated frequently, so pull updates regularly. If you use the toolbox, please cite our forthcoming Scientific Reports paper when using forward encoding models, or our recent PNAS paper when using decoding.