
Adam Charles, PhD

Assistant Professor

Office: 
Phone: 410-526-8120
Email: adamsc@jhu.edu


Education

Post-doctoral training, Princeton Neuroscience Institute, 2015-2020
PhD, Electrical and Computer Engineering, Georgia Institute of Technology, 2015
ME, Electrical and Computer Engineering, The Cooper Union, 2009
BE, Electrical and Computer Engineering, The Cooper Union, 2009

Research Interests

Understanding the brain requires powerful imaging and algorithmic tools that meet the unique signal processing and machine learning challenges posed by neurophysiological data, neural imaging, and computational neuroscience. My lab aims to create the next generation of imaging systems and analysis tools capable of overcoming the difficulties posed by the high dimensionality and complexity of neural activity. This goal spans the development of advanced recording technologies, via collaborative design of hardware and algorithms, as well as computational and theoretical frameworks for understanding biological and artificial neural systems.


My lab approaches these topics through highly collaborative research that draws on both theory- and data-driven philosophies. Specific lab interests span 1) advancing the capabilities of specific technologies, such as multi-photon calcium imaging, via new data science and signal processing methods; 2) the theoretical analysis and development of important models in neuroscience, such as recurrent neural networks; and 3) building on these areas to create more general-purpose data science advances with broader impact in applications beyond neuroscience.

Publications Search

From PubMed   |   Google Scholar Profile

Selected Publications

A.S. Charles*, M. Park*, J.P. Weller, G.D. Horwitz, and J.W. Pillow. Dethroning the Fano Factor: a flexible, model-based approach to partitioning neural variability. Neural Computation, 30(4):1012-1045, 2018. *Joint first author.

A. Song*, A.S. Charles*, S.A. Koay, J.L. Gauthier, S.Y. Thiberge, J.W. Pillow, and D.W. Tank. Volumetric Two-Photon Imaging of Neurons Using Spectroscopy (vTwINS). Nature Methods, 14(4):420-426, April 2017. *Joint first author.

A.S. Charles, D. Yin, and C.J. Rozell. Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks. Journal of Machine Learning Research, 18(7):1-37, January 2017.

A.S. Charles, A. Balavoine, and C.J. Rozell. Dynamic Filtering of Time-Varying Sparse Signals via ℓ1 Minimization. IEEE Transactions on Signal Processing, 64(21):5644-5656, November 2016.

A.S. Charles, P. Garrigues, and C.J. Rozell. A common network architecture efficiently implements a variety of sparsity-based inference problems. Neural Computation, 24(12):3317-3339, December 2012.