Department of Bioengineering,
Jacobs School of Engineering, and
Institute for Neural Computation,
University of California San Diego
Engineer's Degree, 1988, Vrije Universiteit Brussel, Belgium
M.S.E.E., 1989, California Institute of Technology
Ph.D., 1994, California Institute of Technology
My research and that of my students cover analog and digital VLSI microsystems for adaptive neural computation and sensory information processing, from neuromorphic systems engineering and kernel-based learning machines to micropower implantable neural interfaces, acoustic microarrays, adaptive optics, and biometric identification. My group has contributed a variety of highly efficient analog VLSI processors for vision, audition, and pattern classification that outperform general-purpose digital processors, with a 100- to 10,000-fold reduction in power dissipation and similar savings in silicon area.
The focus of our research is on cross-cutting advances at the interface between in vivo and in silico neural information processing. The goals are threefold: to empower silicon integrated circuits with adaptive intelligence inspired by sensory information processing in nervous systems; to facilitate advances in computational neuroscience by large-scale emulation of neural models in parallel analog silicon circuits; and to interface silicon with neural cells for restoring lost function in sensory and motor impaired patients.
Adaptation, learning, and memory in analog VLSI have been the focus of extensive research in our laboratory. The edited volume Learning on Silicon (Kluwer Academic, 1999) serves as a reference to this emerging field. Our work formulated and implemented mechanisms of adaptation and learning embedded in parallel distributed architectures which, like synaptic plasticity in nerve assemblies, sustain computational intelligence in variable and complex environments and compensate for imprecision in analog computation. This research earned a Presidential Early Career Award for Scientists and Engineers (PECASE) and an ONR Young Investigator Award. Our work extends to kernel learning machines that incorporate principles of statistical learning theory. Recently we developed the Kerneltron, a support vector "machine" implemented as a massively parallel VLSI array processor on a single chip, with energy efficiency on par with synaptic transmission in the mammalian brain. Our work in Bioengineering and the INC at UCSD continues to pursue advances in the cognitive performance and efficiency of neuromorphic adaptive systems and wireless brain interfaces.
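To make concrete what a kernel machine such as the Kerneltron parallelizes, the sketch below evaluates a support vector machine decision function in plain Python. This is only an illustrative software model under assumed toy parameters, not the Kerneltron's actual analog VLSI architecture; the function names, support vectors, and coefficients are hypothetical.

```python
import math

def rbf_kernel(x, x_i, gamma=0.5):
    # Gaussian (RBF) kernel: exp(-gamma * ||x - x_i||^2)
    d2 = sum((a - b) ** 2 for a, b in zip(x, x_i))
    return math.exp(-gamma * d2)

def svm_decision(x, support_vectors, alphas, labels, bias=0.0):
    # SVM decision function: f(x) = sum_i alpha_i * y_i * K(x_i, x) + b.
    # Each kernel term is independent of the others, which is the
    # property a massively parallel array processor can exploit by
    # computing all terms concurrently.
    return sum(a * y * rbf_kernel(x, sv)
               for a, y, sv in zip(alphas, labels, support_vectors)) + bias

# Hypothetical toy example: two support vectors of opposite class.
svs = [(0.0, 0.0), (2.0, 2.0)]
alphas = [1.0, 1.0]
labels = [+1, -1]
print(svm_decision((0.1, 0.1), svs, alphas, labels))  # positive (class +1 side)
print(svm_decision((1.9, 1.9), svs, alphas, labels))  # negative (class -1 side)
```

The sign of f(x) gives the predicted class; in hardware, each support-vector term maps naturally to one column of an analog array, with the summation performed on a shared output line.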