PERCEPTUAL & COGNITIVE NEUROSCIENCE LAB

Research

The Perceptual and Cognitive Neuroscience Lab aims to understand how the brain represents and processes a range of perceptual and cognitive information, from low-level sensory features to high-level contexts and predictions. In contrast to our rich and seamless experience of the external world, the raw sensory data reaching the brain are only partial. To investigate how the brain creates enriched internal models of our environment from such impoverished data, we examine the human brain at the network level, using functional magnetic resonance imaging (fMRI) and computational approaches (model-based encoding/decoding using machine learning, functional connectivity, graph theory). Functional brain imaging enables us to identify the brain circuits and networks underlying perception, attention, and memory, and to examine interactions between regions and the functions of neural populations at each stage of the information-processing stream. We also link psychophysical measures to fMRI responses to elucidate how neural information relates to our cognition and behavior.
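
A minimal sketch of what a network-level analysis of this kind can look like: pairwise functional connectivity is computed from region-wise fMRI time courses and summarized with simple graph-theoretic measures. The simulated data, matrix sizes, and edge threshold below are illustrative assumptions, not a description of our actual pipeline.

```python
# Illustrative sketch: functional connectivity + graph metrics
# from simulated region-wise fMRI time courses.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_regions, n_timepoints = 20, 300                    # illustrative sizes
ts = rng.standard_normal((n_timepoints, n_regions))  # stand-in for real BOLD data

# Functional connectivity: pairwise Pearson correlation between regions.
fc = np.corrcoef(ts.T)
np.fill_diagonal(fc, 0.0)

# Keep only strong positive edges (arbitrary threshold, for illustration).
adj = np.where(fc > 0.1, fc, 0.0)
G = nx.from_numpy_array(adj)

# Simple graph-theoretic summaries of the resulting network.
degree = dict(G.degree(weight="weight"))
clustering = nx.average_clustering(G, weight="weight")
print(f"mean weighted degree: {np.mean(list(degree.values())):.3f}")
print(f"mean clustering coefficient: {clustering:.3f}")
```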

Filling in missing information in sensory inputs

We seek to understand how top-down predictions are formed in high-level cortical areas, compared against incoming sensory data at early stages of processing, and updated as we build internal models of the external world. Specifically, we study high-level computations in the early visual cortex, as well as the role of high-level cortical areas in this process, using psychophysical stimuli that can separate top-down predictions from bottom-up sensory inputs (Chong, Familiar, & Shim, 2016). We develop encoding models of diverse visual features and track how neural representations evolve as information is progressively integrated.
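
The sketch below illustrates the general logic of a voxelwise encoding model of the sort mentioned above: stimulus features are mapped to voxel responses with regularized linear regression, and held-out prediction accuracy indicates how well the feature space captures the responses. The feature space, data, and dimensions are hypothetical placeholders, not our actual stimuli or analyses.

```python
# Illustrative sketch of a voxelwise encoding model:
# predict voxel responses from a stimulus feature space.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_trials, n_features, n_voxels = 200, 8, 50   # illustrative sizes
X = rng.standard_normal((n_trials, n_features))   # stimulus features per trial
W = rng.standard_normal((n_features, n_voxels))   # unknown "true" tuning weights
Y = X @ W + 0.5 * rng.standard_normal((n_trials, n_voxels))  # simulated voxel data

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

# Fit one regularized linear model per voxel (Ridge handles multi-output).
model = Ridge(alpha=1.0).fit(X_tr, Y_tr)
Y_hat = model.predict(X_te)

# Encoding performance: correlation between predicted and observed
# responses, computed separately for each voxel.
r = [np.corrcoef(Y_hat[:, v], Y_te[:, v])[0, 1] for v in range(n_voxels)]
print(f"median held-out prediction r = {np.median(r):.3f}")
```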


Construction of color experiences

How does the human brain enable us to “see” the colorful world? To understand how the visual system gives rise to our color experiences, we use the chromatic interocular-switch rivalry paradigm, which produces a steady percept of one of two colors even though the two colors are presented simultaneously, one to each eye. In collaboration with Steven Shevell (University of Chicago) and Sang Wook Hong (Florida Atlantic University), we are building an encoding model to account for dynamic color experiences during interocular-switch rivalry.
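
As a rough illustration of a color encoding model in this general spirit (not the model under development), the sketch below builds a bank of hue-tuned channels with a rectified-cosine basis over hue angle, fits channel-to-voxel weights on simulated responses, and inverts the model to read out the represented hue. Every parameter and data set here is an assumption made for illustration.

```python
# Illustrative sketch of a channel-based color encoding model:
# hue-tuned basis functions fit to simulated responses, then inverted.
import numpy as np

rng = np.random.default_rng(2)
n_channels = 6
centers = np.linspace(0, 2 * np.pi, n_channels, endpoint=False)

def channel_responses(hues):
    """Half-wave-rectified, raised-cosine tuning over hue angle."""
    d = hues[:, None] - centers[None, :]
    return np.maximum(np.cos(d), 0.0) ** 5

# Simulated training data: responses of 40 "voxels" to 120 known hues.
hues_train = rng.uniform(0, 2 * np.pi, 120)
C_train = channel_responses(hues_train)              # trials x channels
W = rng.standard_normal((n_channels, 40))            # channels x voxels
B_train = C_train @ W + 0.1 * rng.standard_normal((120, 40))

# Forward model: estimate channel-to-voxel weights by least squares.
W_hat, *_ = np.linalg.lstsq(C_train, B_train, rcond=None)

# Inversion: recover channel responses on a new trial, then read out
# the hue with a population-vector average over channel centers.
hue_test = np.array([1.0])
b_test = channel_responses(hue_test) @ W + 0.1 * rng.standard_normal((1, 40))
C_test, *_ = np.linalg.lstsq(W_hat.T, b_test.T, rcond=None)
hue_est = np.angle(np.sum(C_test.ravel() * np.exp(1j * centers))) % (2 * np.pi)
print(f"true hue = {hue_test[0]:.2f} rad, reconstructed = {hue_est:.2f} rad")
```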

Summary statistics

The human brain is endowed with the ability to summarize the properties of similar objects, allowing it to represent a complex visual environment efficiently. Using encoding and decoding approaches, we examine how early visual areas, as well as high-level cortical regions, extract summary statistics of visual features at the level of population responses.
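
A minimal decoding sketch in the spirit of such analyses: a linear classifier is trained to read out the mean feature of a stimulus ensemble, binned coarsely into low versus high, from simulated population responses. The stimulus code, labels, and sizes are illustrative assumptions.

```python
# Illustrative sketch: decoding a summary statistic (binned ensemble mean)
# from simulated population responses with a linear classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_trials, n_voxels = 240, 60                  # illustrative sizes
mean_feature = rng.uniform(0, 1, n_trials)    # ensemble mean on each trial
labels = (mean_feature > 0.5).astype(int)     # coarse low/high binning

# Simulated responses: a weak linear code for the ensemble mean plus noise.
code = rng.standard_normal(n_voxels)
Y = mean_feature[:, None] * code[None, :] + rng.standard_normal((n_trials, n_voxels))

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, Y, labels, cv=5)
print(f"cross-validated decoding accuracy: {acc.mean():.2f} (chance = 0.50)")
```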

Understanding visual narratives

How does the brain construct a meaningful stream of information with narrative structure, a process that requires continuous integration of novel inputs? To address this question, we study how the brain alternates between different network states during narrative comprehension of a movie, and we attempt to decode the temporal evolution of mental states from the functional structure of the entire brain.
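
One common way to characterize alternation between network states, sketched below on simulated data, is to compute sliding-window functional connectivity and cluster the windows into a small set of recurring states. The window length, step size, number of states, and data are all hypothetical choices for illustration.

```python
# Illustrative sketch: time-varying network states via sliding-window
# connectivity and k-means clustering, on simulated region time courses.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
n_regions, n_timepoints, win = 15, 600, 40           # illustrative sizes
ts = rng.standard_normal((n_timepoints, n_regions))  # stand-in for movie-viewing data

# Vectorize the upper triangle of the connectivity matrix for each window.
iu = np.triu_indices(n_regions, k=1)
windows = []
for t in range(0, n_timepoints - win, 10):           # step of 10 samples
    fc = np.corrcoef(ts[t:t + win].T)
    windows.append(fc[iu])
windows = np.array(windows)

# Cluster windows into a small set of recurring network states.
n_states = 4
states = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit_predict(windows)
print("state sequence over the movie:", states)
```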

The effect of emotion on perception

Converging evidence suggests that our emotions can influence our thoughts, actions, and even basic perception. Our previous work shows that positive emotions affect visual perception by broadly relaxing attentional filters, thereby reducing selectivity (Uddenberg & Shim, 2015). This change may reflect a broader, more exploratory perceptual mode adopted to take advantage of weak but potentially valuable signals in the environment. One intriguing possibility is that this decreased selectivity at the perceptual level underlies increased breadth at the conceptual level, enhancing cognitive flexibility and creativity. We continue to explore the cognitive and neural mechanisms by which our emotions change the way we view the world.