Ari Rosenberg

Assistant Professor
Ph.D., Computational Neuroscience, University of Chicago

Contact Information
(608) 265-5782 Phone
(608) 265-5512 Fax

Positions Available
The lab is seeking highly motivated and industrious individuals to fill several key positions.

A technical position is available for a computer programmer interested in developing systems for real-time, dynamic (closed-loop) control of neuroscience experiments involving 3D visualizations. Demonstrated experience with OpenGL and with the software/hardware requirements of real-time control at millisecond precision is highly desired.

We are looking for graduate students with undergraduate or master's level training in the sciences or engineering. Interested students can contact me and apply to the following Ph.D. programs:
Neuroscience Training Program
Physiology Graduate Training Program

Postdoctoral positions are also available. Individuals with experience in multi-electrode neural recordings and/or fMRI in behaving animals are particularly encouraged to apply.

Research Interests
Neural computations underlying 3D vision and multisensory integration, and the neural basis of autism

3D vision
How do we perceive the three-dimensional (3D) structure of the world when our eyes only sense 2D projections like a movie on a screen? Estimating the 3D scene structure of our environment from a pair of 2D images (like those on our retinae) is mathematically an ill-posed inverse problem plagued by ambiguities and noise, and involving highly nonlinear constraints imposed by multi-view geometry. Given these complexities, it is quite impressive that the visual system is able to construct 3D representations that are accurate enough for us to successfully interact with our surroundings. A major area of research in the lab is devoted to understanding how the brain achieves accurate and reliable 3D representations of the world. A critical aspect of 3D vision is the encoding of 3D object orientation (e.g., the slant and tilt of a planar surface). By adapting mathematical tools used to analyze geomagnetic data (Bingham functions), we developed the first methods for quantifying the selectivity of visual neurons for 3D object orientation. Our work on this topic employs a synergistic, multifaceted approach combining computational modeling, neurophysiological studies, and human psychophysical experiments.
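To give a flavor of this approach, the sketch below evaluates an unnormalized Bingham function over surface normals parameterized by slant and tilt. The parameter matrix and the slant/tilt convention are illustrative choices, not the lab's actual fitting procedure; the key property shown is the antipodal symmetry of the Bingham form, which matches the fact that a surface normal and its negation describe the same plane orientation.

```python
import numpy as np

def bingham_unnormalized(n, B):
    """Unnormalized Bingham density exp(n^T B n) for a unit vector n.
    Because the exponent is a quadratic form, f(n) == f(-n)."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return float(np.exp(n @ B @ n))

def normal_from_slant_tilt(slant, tilt):
    """Unit surface normal from slant and tilt angles (radians),
    using one common spherical-coordinate convention."""
    return np.array([np.sin(slant) * np.cos(tilt),
                     np.sin(slant) * np.sin(tilt),
                     np.cos(slant)])

# Illustrative symmetric concentration matrix; this choice peaks at the
# z-axis, i.e., a fronto-parallel surface (slant = 0).
B = np.diag([-5.0, -2.0, 0.0])

f_fronto = bingham_unnormalized(normal_from_slant_tilt(0.0, 0.0), B)
f_slanted = bingham_unnormalized(normal_from_slant_tilt(np.pi / 3, 0.0), B)
```

With these parameters the function value falls off as the surface slants away from fronto-parallel, the kind of tuning profile such a function can quantify for a recorded neuron.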

Multisensory integration
Our visual system first encodes the environment in egocentric coordinates defined by our eyes. Such representations are inherently unstable in that they shift and rotate as we move our eyes or head. However, visual perception of the world is largely unaffected by such movements, a phenomenon known as spatial constancy. Perception is instead anchored to gravity, which is why buildings are seen as vertically oriented even if you tilt your head to the side. This stability of visual perception is a consequence of multisensory processing in which the brain uses gravitational signals detected by the vestibular and proprioceptive systems to re-express egocentrically encoded visual signals in gravity-centered coordinates. Vestibular deficits can thus compromise visual stability, and the absence of gravity in space can cause astronauts to experience disorienting jumps in the perceived visual orientation of their surroundings. A second area of research in the lab investigates where and how the brain combines visual information with vestibular and proprioceptive signals in order to achieve a stable, gravity-centered representation of the world. Our work on this topic relies on a combination of computational modeling and neurophysiological studies.
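The coordinate change described above can be illustrated with a minimal 2D rotation: an orientation encoded in eye-centered coordinates is re-expressed in gravity-centered coordinates by compensating for the sensed head roll. This is a toy illustration of the transformation, not the lab's model; the function name and angle convention are assumptions for the example.

```python
import numpy as np

def rotate(v, angle):
    """Rotate a 2D vector counterclockwise by `angle` radians."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]]) @ v

# A building edge that is vertical in the world, viewed with the head
# rolled 30 degrees: its retinal (egocentric) orientation is tilted,
# but compensating with the vestibular head-roll signal recovers the
# gravity-centered vertical.
head_roll = np.deg2rad(30)
v_world = np.array([0.0, 1.0])               # gravitational vertical
v_retina = rotate(v_world, -head_roll)       # what the eye encodes
v_gravity = rotate(v_retina, head_roll)      # after vestibular compensation
```

Rotating by the negative head roll and then compensating with the positive head roll is the identity, which is the sense in which perception can remain anchored to gravity despite head movements.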

Neural basis of autism
The prevalence of autism is growing at a dramatic rate, and the scientific response has grown in proportion. The extent to which this rise in prevalence reflects growing awareness, overdiagnosis, or a genuine increase in incidence is currently unclear. Considering that last year alone saw almost 4,000 publications related to autism, it is perhaps not surprising that a number of controversies are emerging within the field. A third area of research in the lab aims to lift the fog surrounding this complex disorder. Given the heterogeneity of the genetic and environmental factors that may give rise to autism, as well as its phenotypic diversity, our approach takes the perspective that we can better understand autism by studying how the disorder affects neural computation. We are chiefly interested in determining if the behavioral consequences of autism reflect alterations in canonical neural computations that occur throughout the brain, such as divisive normalization. To test this hypothesis, we are conducting psychophysical studies based on predictions of our recent computational work on the disorder. Our hope is that identifying where and how neural computations are altered in autism will provide a unique window of understanding into the disorder and may provide important insights into its treatment.
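Divisive normalization, the canonical computation mentioned above, can be sketched in a few lines: each unit's driving input is divided by the pooled activity of the population. The functional form and parameter values below are illustrative, not the model from our publications; the pool-weight parameter `w` is an assumption included only to show qualitatively how weakening normalization inflates responses.

```python
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0, w=1.0):
    """Illustrative divisive normalization: each unit's exponentiated
    drive is divided by a constant plus the weighted pooled activity.
    Smaller w means weaker normalization (less contextual suppression)."""
    drive = np.asarray(drive, dtype=float)
    num = drive ** n
    return num / (sigma ** n + w * num.sum())

drive = np.array([4.0, 2.0, 1.0])
r_typical = divisive_normalization(drive, w=1.0)
r_reduced = divisive_normalization(drive, w=0.25)  # weakened pool
```

In this toy example, reducing the normalization pool's weight increases every unit's response, the kind of qualitative signature that computational accounts of altered normalization make testable in psychophysical experiments.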

Selected Publications

  • Rosenberg A, Patterson JS, and Angelaki DE (2015) A computational perspective on autism. Proc Natl Acad Sci. 112(30): 9158-65.
  • Rosenberg A and Angelaki DE (2014) Reliability-dependent contributions of visual orientation cues in parietal cortex. Proc Natl Acad Sci. 111(50): 18043-8.
  • Rosenberg A and Angelaki DE (2014) Gravity influences the visual representation of object tilt in parietal cortex. J Neurosci. 34(43): 14170-80.
  • Seilheimer RL*, Rosenberg A*, and Angelaki DE (2014) Models and processes of multisensory cue combination. Curr Opin Neurobiol. 25: 38-46. (*equal contribution)
  • Rosenberg A, Cowan NJ, and Angelaki DE (2013) The visual representation of 3D object orientation in parietal cortex. J Neurosci. 33(49): 19352-61.
  • Dakin CJ, Elmore LC, and Rosenberg A (2013) One step closer to a functional vestibular prosthesis. J Neurosci. 33(38): 14978-80.
  • Rosenberg A and Issa NP (2011) The Y cell visual pathway implements a demodulating nonlinearity. Neuron. 71(2): 348-61.
  • Issa NP and Rosenberg A (2011) Tartini's devil: Peripheral mechanisms that underlie sensory illusions. In: A field guide to a new meta-field: Bridging the humanities-neurosciences divide. Ed. Barbara Stafford. University of Chicago Press.
  • Rosenberg A, Husson TR, and Issa NP (2010) Subcortical representation of non-Fourier image features. J Neurosci. 30(6): 1985-93.
  • Lanning K and Rosenberg A (2009) The dimensionality of American political attitudes: Tensions between equality and freedom in the wake of September 11. Behav Sci of Terrorism and Political Aggression. 1(2): 84-100.
  • Rosenberg A and Talebi V (2009) The primate retina contains distinct types of Y-like ganglion cells. J Neurosci. 29(16): 5048-50.
  • Issa NP, Rosenberg A, and Husson TR (2008) Models and measurements of functional maps in V1. J Neurophysiol. 99(6): 2745-54.
  • Mallik AK, Husson TR, Zhang JX, Rosenberg A, and Issa NP (2008) The organization of spatial frequency maps measured by cortical flavoprotein autofluorescence. Vision Res. 48(14): 1545-53.
  • Rosenberg A, Wallisch P, and Bradley DC (2008) Responses to direction and transparent motion stimuli in area FST of the macaque. Vis Neurosci. 25(2): 187-95.
  • Zhang JX, Rosenberg A, Mallik AK, Husson TR, and Issa NP (2007) The representation of complex images in spatial frequency domains of primary visual cortex. J Neurosci. 27(35): 9310-8.
