Advanced Signal Processing and AI for Biomedical Wearable Sensing

Seminar
3700 O'Hara Street, Pittsburgh, Pennsylvania 15261, United States


Date: April 9, 2019.
Time: 6:00 PM to 8:00 PM (EDT)
Speaker: Brian Telfer.



Cost: Free
RSVP: Required.
Event Details & Registration: (URL)


Wearable sensing for consumers has become ubiquitous. However, these sensors do not meet the needs of workers in extreme environments, including firefighters and other first responders, astronauts, pilots, divers, and military service members. These groups need devices to stay safe and maximize human performance while carrying heavy loads, wearing protective clothing, and operating in extremes of temperature at prolonged, high workloads. Since these users are already physically overburdened, it is critical that devices be “wear and forget” with low size, weight and power. Extracting useful information from wearable sensing in these environments poses challenges beyond those experienced in clinical and laboratory settings: motion artifacts can be severe and must be mitigated, sensor data are often more limited in order to preserve battery life, and computing power is limited for real-time implementation.

Wearable monitoring systems are being prototyped to detect early degradation due to a variety of biomedical conditions. Several of these applications will be highlighted, as well as motion artifact mitigation:

A motion artifact mitigation algorithm based on time-frequency analysis of optical sensors and accelerometry has improved the accuracy of ambulatory heart rate measurements, compared to a state-of-the-art device.
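As a rough illustration of the kind of spectral masking such time-frequency approaches can use, the sketch below nulls PPG spectral bins that coincide with strong accelerometer energy before picking the heart-rate peak. The function, threshold, and synthetic signals are hypothetical, not the speaker's implementation:

```python
import numpy as np

def estimate_hr(ppg, accel, fs):
    """Estimate heart rate (BPM) from one window of optical (PPG) data,
    nulling spectral bins that coincide with strong accelerometer peaks.
    Illustrative sketch of spectral masking, not the presented algorithm."""
    n = len(ppg)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    ppg_mag = np.abs(np.fft.rfft(ppg - np.mean(ppg)))
    acc_mag = np.abs(np.fft.rfft(accel - np.mean(accel)))
    # Keep only the physiological band (40-180 BPM).
    ppg_mag[(freqs < 40 / 60) | (freqs > 180 / 60)] = 0.0
    # Treat bins with strong accelerometer energy as motion-corrupted.
    ppg_mag[acc_mag > 0.5 * acc_mag.max()] = 0.0
    return 60.0 * freqs[np.argmax(ppg_mag)]

# Synthetic window: a 1.5 Hz (90 BPM) pulse plus a 2.5 Hz stepping
# artifact that also shows up on the accelerometer axis.
fs = 50
t = np.arange(0, 8, 1.0 / fs)
ppg = np.sin(2 * np.pi * 1.5 * t) + 0.8 * np.sin(2 * np.pi * 2.5 * t)
accel = np.sin(2 * np.pi * 2.5 * t)
print(estimate_hr(ppg, accel, fs))  # ≈ 90.0
```

Without the accelerometer mask, the larger 2.5 Hz artifact peak would be picked, yielding a spurious 150 BPM estimate.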

A machine-learning model has been developed to provide early warning of viral or bacterial infection from ECG and other raw waveform vital sign data measured in an animal model. A mean early warning of 46 hours before fever onset has been demonstrated.

For gait monitoring, an algorithm based on eigenspectral analysis of raw accelerometry has been shown to track increasing gait asymmetry and neuromotor incoordination.
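One common form of eigenspectral analysis is to delay-embed the accelerometer signal and examine how variance spreads across the eigenvalues of its covariance matrix: a regular, well-coordinated gait concentrates energy in a few components, while incoordination spreads it out. The sketch below illustrates that idea with a hypothetical summary statistic; it is not the presented algorithm:

```python
import numpy as np

def eigenspectrum_ratio(signal, dim=40):
    """Fraction of variance captured by the top two eigenvalues of the
    delay-embedded covariance matrix of an accelerometer trace.
    Illustrative sketch of eigenspectral analysis, not the presented method."""
    x = np.asarray(signal, float)
    x = x - x.mean()
    # Delay-embedding (trajectory) matrix: rows are length-`dim` windows.
    m = np.lib.stride_tricks.sliding_window_view(x, dim)
    cov = np.cov(m, rowvar=False)
    w = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return w[:2].sum() / w.sum()

# Synthetic accelerometry: a clean 1 Hz stride versus the same stride
# with heavy noise standing in for neuromotor incoordination.
t = np.arange(0, 10, 0.02)
clean = np.sin(2 * np.pi * 1.0 * t)
rng = np.random.default_rng(0)
noisy = clean + 0.8 * rng.standard_normal(t.size)
print(eigenspectrum_ratio(clean) > eigenspectrum_ratio(noisy))  # True
```

A pure periodic stride occupies a low-dimensional subspace, so the top eigenvalues dominate; degraded coordination lowers the ratio.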

For early detection and tracking, markers of vocal and facial neuromotor coordination have been applied to several neurocognitive conditions and disorders, achieving top scores in international competitions.


Brian Telfer

Brian A. Telfer is a Senior Staff Member in the Bioengineering Systems and Technologies Group at MIT Lincoln Laboratory. He received a BS in Electrical Engineering from Virginia Tech and an MS and PhD in Electrical Engineering from Carnegie Mellon. After working at the Naval Surface Warfare Center, where he was funded by an Office of Naval Research Young Scientist Award, he joined Lincoln Laboratory in 1995. His contributions have been in signal processing, machine learning, artificial intelligence, and systems analysis, initially for ballistic missile defense and, for the past seven years, for bioengineering, particularly physiological status monitoring. The focus of this work has been prototyping and technology transition, with successful transitions leading to several hundred million dollars in government contracts to industry, as well as smaller industrial contracts that have produced several hundred (and growing) physiological monitoring units. Dr. Telfer has organized body sensor network workshops and served as Technical Program Co-Chair for the International Body Sensor Networks Conference in 2013 and 2015. He is a Senior Member of the IEEE (Institute of Electrical and Electronics Engineers) and a member of the IEEE Technical Committee on Wearable Devices. He has served on several MIT Lincoln Laboratory internal funding selection boards and as technical lead for MIT LL's Introduction to Radar Systems course and other radar courses. He has led several studies for senior leaders in the U.S. Government and has co-authored over 70 publications.