Chemical Sensing for Monitoring Environmental Hazards
Chemical sensors are being designed to exhibit colorimetric changes in response to environmental hazards (e.g., ultraviolet radiation and carbon monoxide); however, these sensors have yet to see widespread adoption. We believe that designing such sensors for both human- and machine-interpretability will support continuous engagement. My contributions to this project include developing an app that supports citizen scientists and establishing formal guidelines for chemical sensor design.
How Do People View Diagnostic Smartphone Apps?
Smartphone apps that perform medical diagnosis are coming closer to reality every month. In research, we are fortunate to work with clinicians and study participants who are often excited by the prospect of a convenient way to check their health, but what happens when these apps actually become publicly available? Will they change how people seek treatment? We are investigating questions like these through a combination of qualitative interviews and quantitative surveys built around hypothetical scenarios.
NeuroTouch: Improving Touch Accuracy for People with Motor Impairments
Touch-sensitive surfaces assume that users can precisely place and lift a single finger; this assumption is sometimes unreasonable for people with motor impairments. Taking the perspective of ability-based design, we seek to upend these assumptions and make touchscreens amenable to a wider range of touch abilities. We are exploring the use of deep learning to generate a user-specific mapping from touch patterns to touch coordinates.
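To make the idea of a user-specific mapping concrete, here is a minimal sketch of learning a per-user touch correction from calibration data. It is not the project's model: it substitutes a simple least-squares fit (over hypothetical features such as contact-area size) for the deep network described above, and all data is simulated.

```python
import numpy as np

# Hypothetical illustration: learn a user-specific correction that maps
# raw touch features (reported touch point plus a contact-area proxy)
# to the user's intended target coordinate. A linear least-squares
# model stands in here for the deep-learned mapping.

rng = np.random.default_rng(0)

# Simulated calibration data: raw touch positions whose systematic,
# user-specific offset grows with contact-area size.
n = 200
raw_xy = rng.uniform(0, 100, size=(n, 2))      # reported touch points
size = rng.uniform(5, 15, size=(n, 1))         # contact-area proxy
intended = raw_xy + np.hstack([0.4 * size, -0.3 * size])  # true targets

# Fit: intended ~= [raw_xy, size, 1] @ W
X = np.hstack([raw_xy, size, np.ones((n, 1))])
W, *_ = np.linalg.lstsq(X, intended, rcond=None)

def correct(raw, contact_size):
    """Map a raw touch event to a corrected coordinate."""
    x = np.hstack([raw, [contact_size, 1.0]])
    return x @ W

err_before = np.abs(intended - raw_xy).mean()
err_after = np.abs(intended - X @ W).mean()
print(f"mean error before: {err_before:.2f}, after: {err_after:.3f}")
```

The same calibration-then-correction loop applies when the mapping is a neural network; only the model class and input features change.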
Supporting Over-the-Counter Flu Testing
In collaboration with the Bill &amp; Melinda Gates Foundation, we are investigating how smartphones can help people administer rapid diagnostic tests (RDTs) for influenza on their own. We are tackling this project from multiple angles, including procedural checks and test-strip interpretation through computer vision. We are also exploring how smartphone sensors can be used to detect symptoms, generating additional information for the diagnostic process.
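As a rough illustration of strip interpretation (not the project's actual pipeline), one simple approach is to average pixel intensity across the strip's width and look for dark bands at the expected control- and test-line positions. The function name, parameters, and synthetic image below are all hypothetical.

```python
import numpy as np

def read_strip(strip, control_row, test_row, window=3, threshold=0.15):
    """Interpret a grayscale RDT strip image (rows run along the strip,
    intensities in 0..1). Returns (control_present, test_present)."""
    profile = strip.mean(axis=1)       # 1D intensity profile along strip
    baseline = np.median(profile)      # background membrane brightness

    def line_present(row):
        band = profile[max(0, row - window): row + window + 1]
        return (baseline - band.min()) > threshold  # dark dip => line

    return line_present(control_row), line_present(test_row)

# Synthetic example: white membrane with a dark control line at row 20
strip = np.ones((60, 10))
strip[19:22, :] = 0.4                  # control line only, no test line
control, test = read_strip(strip, control_row=20, test_row=40)
print(control, test)  # control present, test absent => valid negative
```

A real pipeline would first locate and rectify the strip in a camera frame and handle lighting variation; this sketch only shows the final classification step on an already-cropped strip.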
Full Publication List
IDCam: Precise Item Identification for AR-Enhanced Object Interactions
IEEE RFID '19, Best Paper Finalist
Challenges in Realizing Smartphone-based Health Sensing
To appear in IEEE Pervasive Computing, 2019
PupilScreen: Using Smartphones to Assess Traumatic Brain Injury
BiliScreen: Smartphone-Based Scleral Jaundice Monitoring for Liver and Pancreatic Disorders
WatchUDrive: Differentiating Drivers and Passengers Using Smartwatches
HyperCam: Hyperspectral Imaging for Ubiquitous Computing Applications
UbiComp '15, Honorable Mention