Ongoing Projects

Chemical Sensing for Monitoring Environmental Hazards

Chemical sensors are being designed to exhibit colorimetric changes in response to environmental hazards (e.g., ultraviolet radiation and carbon monoxide); however, these sensors have yet to see widespread adoption. We believe that designing such sensors for both human- and machine-interpretability will support continuous engagement with them. My contributions to this project include developing an app that supports citizen scientists and formulating design guidelines for chemical sensors.

How Do People View Diagnostic Smartphone Apps?

Smartphone apps that perform medical diagnosis are coming closer to reality every month. As researchers, we are fortunate to work with clinicians and study participants who are often excited by the prospect of a convenient way to check their health, but what will happen when these apps actually become publicly available? Will they change people's decisions about seeking treatment? We are investigating questions like these through a combination of qualitative interviews and quantitative surveys built around hypothetical scenarios.

NeuroTouch: Improving Touch Accuracy for People with Motor Impairments

Touch-sensitive surfaces assume that users can precisely land and lift a single finger, a requirement that is sometimes unreasonable for people with motor impairments. Taking the perspective of ability-based design, we seek to upend these assumptions and make touchscreens amenable to a wider range of touch abilities. We are exploring the use of deep learning to generate user-specific mappings from touch patterns to intended touch coordinates.

Supporting Over-the-Counter Flu Testing

In collaboration with the Bill & Melinda Gates Foundation, we are investigating how smartphones can help people administer rapid diagnostic tests (RDTs) for the flu on their own. We are tackling this project from multiple angles, including procedural checks and computer vision-based interpretation of test strips. We are also exploring ways that smartphone sensors can be used to detect symptoms, providing additional information for the diagnostic process.

Full Publication List



IDCam: Precise Item Identification for AR-Enhanced Object Interactions

Hanchuan Li, Eric Whitmire, Alex Mariakakis, Victor Chan, Alanson Sample, Shwetak Patel
IEEE RFID '19, Best Paper Finalist




PupilScreen: Using Smartphones to Assess Traumatic Brain Injury

Alex Mariakakis, Jacob Baudin, Eric Whitmire, Vardhman Mehta, Megan A. Banks, Anthony Law, Lynn McGrath, Shwetak Patel
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 2017


BiliScreen: Smartphone-Based Scleral Jaundice Monitoring for Liver and Pancreatic Disorders

Alex Mariakakis, Megan A. Banks, Lauren Phillipi, Lei Yu, James Taylor, Shwetak Patel
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 2017



WatchUDrive: Differentiating Drivers and Passengers Using Smartwatches

Alex Mariakakis, Vijay Srinivasan, Kiran Rachuri, Abhishek Mukherji
PerCom '16 Workshop on Sensing Systems and Applications Using Wrist-Worn Smart Devices



HyperCam: Hyperspectral Imaging for Ubiquitous Computing Applications

Mayank Goel, Eric Whitmire, Alex Mariakakis, Scott Saponas, Neel Joshi, Dan Morris, Brian Guenter, Marcel Gavriliu, Gaetano Borriello, Shwetak Patel
UbiComp '15, Honorable Mention



SAIL: Single Access Point-Based Indoor Localization

Alex Mariakakis, Souvik Sen, Jeongkeun Lee, Kyu-Han Kim
MobiSys '14