
Grant to develop sensors that mimic the brain’s ability to focus on what’s important

Assistant Professor Inna Partin-Vaisband and Associate Professor Amit Trivedi were awarded $1.9 million from the Semiconductor Research Corporation (SRC) and the Defense Advanced Research Projects Agency (DARPA), run by the U.S. Department of Defense, to improve collaboration between humans and intelligent machines.

Their grant is part of the Joint University Microelectronics Program 2.0 (JUMP 2.0), a consortium of interdisciplinary researchers from multiple institutions who are pursuing high-risk, high-payoff research to tackle the technological challenges of an increasingly connected world.

JUMP 2.0 comprises seven research centers, each focused on one overarching theme. The program aims to support the anticipated, unprecedented growth of the U.S. semiconductor industry, which is projected to double from half a trillion to a trillion dollars over the next decade.

Partin-Vaisband and Trivedi are part of the $28.2 million CogniSense: Center on Cognitive Multispectral Sensors research team, which will focus on developing sensors that dynamically adapt to what is being sensed and on how these sensed signals are processed in real time.

Electronic sensors can perceive or “see” everything around them, which generates too much information to be stored or efficiently processed. This creates what is known as a data deluge problem.

“There is an inability to efficiently process the exponentially increasing volumes of sensed data,” Trivedi said. “This prevents effective action upon the generated data.”

To improve the processing performance of the sensors, the CogniSense researchers aim to make electronic sensors more selective and energy efficient. To do this, they hope to mimic how humans operate: the way our senses and brain work together to control attention and focus efficiently on what matters.

To achieve this, the center's team will develop a system of multispectral sensing-array hardware, processing algorithms, and feedback control that generates trustworthy insights directly from wideband, multi-modal analog signals. The resulting sensors will dynamically adapt their signal processing to the information being sensed as the environment changes in real time.

Partin-Vaisband will lead the Heterogeneous System Integration and Simulation Thrust. Together, Trivedi and Partin-Vaisband will address a broad spectrum of challenges related to real-time trust and “sensor reputation tracking.”

“This includes secure operation of a system with adversarial inputs or that is under malicious attacks; dense integration with power, signal, and thermal integrity; power delivery with in-pixel voltage regulation; and physics-aware machine learning-based simulations of large scale analog and mixed-signal systems,” Trivedi said.

Partin-Vaisband has 20 years of research and industrial experience in integrated circuit design, on-chip power delivery and management, and Very Large Scale Integration (VLSI) design automation. CogniSense is one of two recently announced JUMP 2.0 centers on which she is a PI (the other being CHIMES); her total share of the two awards exceeds $2.1 million.

Trivedi has over a decade of experience in computationally efficient and reliable on-sensor computing. His related research has led to an NSF CAREER Award, funding from the DOE's AI-for-HEP program, and an IEEE CASS Chapter Award for his paper at AICAS'22.