Research

Here are some of my projects, both past and current.

For more, please visit my Google Scholar page.

Brain Computer Interfaces and Tactile Sensations

Diagram of a model closed-loop brain computer interface
Rosenthal et al., 2023: S1 represents multisensory contexts and somatotopic locations within and outside the bounds of the cortical homunculus

The brain constructs unified conscious experiences from noisy, multisensory inputs. When one of these inputs is missing or incorrect, what information can the brain use to maintain a viable interpretation of the body in relationship to the world? How does the brain organize information from multisensory inputs, and at what level of the processing hierarchy are they combined? Does an artificially generated sensation follow the same neural rules as a naturally generated one? I am investigating these questions in my PhD work, with special interest in how they relate to brain computer interfaces capable of conveying touch sensations. This work is done in Dr. Richard Andersen’s lab in the Chen Brain-machine Interface Center at Caltech.

To manipulate multisensory contexts, I use a mixture of real tactile and visual stimuli as well as virtual reality environments. My work examines the brain’s encoding of imagined, real, and artificially generated touch sensations, the last of which are produced using intracortical microstimulation (ICMS).

My most recent project examines how touch experiences are composed of multisensory inputs, and how the topographic organization of somatosensory cortex is less rigid than previously thought. Check out the paper here!

Interested in the neural encoding of imagined sensations in primary somatosensory cortex? Read our paper here.

The Neural Geometry of Color Spaces

Most humans can intuitively sort colors as more or less similar to one another – but how are colors encoded with respect to one another in the brain? By using magnetoencephalography to record rapid cortical dynamics, we construct a picture of how the human brain distinguishes between colors across hue and luminance. Work done in Dr. Bevil Conway’s lab at the NIH in collaboration with Katherine Hermann and Shridhar Singh – read more here and here.

Object Color Statistics and Color Tuning in the Brain

What kind of information could the brain extract from color? By examining a database of over 20,000 images, we showed that objects and backgrounds have different color profiles on average – objects are more likely to contain saturated, warmer colors. Furthermore, color carries information potentially useful for object categorization, including for distinguishing animate (living) from inanimate objects. Supporting this idea, we showed that neurons in the primary visual cortex of macaques were more strongly tuned to object-associated colors than to other colors. Work done in Dr. Bevil Conway’s lab at the NIH – read more here.

Theory of Mind and Autism Spectrum Disorder

Theory of Mind – the process of inferring the mental states of others – is an important and dynamic part of human social interaction. We implemented a novel learning paradigm to examine how humans update their representations of the beliefs and intentions of others. Using Rescorla-Wagner reinforcement learning models, we examined task performance of adults with Autism Spectrum Disorder as well as a neurotypical group. By doing so, we showed that the autistic group was able to track beliefs consistently but was impaired in tracking intentions. Work done with Dr. Damian Stanley, in Dr. Ralph Adolphs’ lab at Caltech – read more here.
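For readers unfamiliar with Rescorla-Wagner models, the core idea is a simple prediction-error update: an estimate moves toward each new observation in proportion to a learning rate. The sketch below is a generic illustration of that update rule, not the task model from the paper – the function name, learning rate, and outcome sequence are all hypothetical.

```python
def rw_update(value, outcome, alpha=0.3):
    """One Rescorla-Wagner step: move the estimate toward the observed
    outcome by a fraction alpha of the prediction error.
    alpha here is a hypothetical learning rate, not a fitted value."""
    prediction_error = outcome - value
    return value + alpha * prediction_error

# Track an evolving belief estimate across a toy sequence of
# binary observations (e.g., whether another agent's action
# matched the inferred intention on each trial).
belief = 0.5  # start uncertain
for outcome in [1, 1, 0, 1]:
    belief = rw_update(belief, outcome)
```

In model fitting, alpha is estimated per participant; a lower fitted alpha for one quantity (e.g., intentions) than another (e.g., beliefs) is one way such a model can express an impairment in tracking that quantity.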