S. Russell, R. de la Rica
Balearic Islands Health Research Institute (IdISBa),
Spain
Keywords: computer vision, augmented reality, real-time detection, biosensors, mHealth, global health, digital health
Summary:
Paper-based biosensors offer several advantages over traditional, laboratory-bound diagnostic techniques: they are rapid, inexpensive, lightweight, environmentally sustainable, and easily disposable. These features make them preferable in a wide variety of healthcare contexts, as they are easily scalable in production, highly mobile in distribution, and easy to implement for the end user. As such, they are suitable for personal use, professional healthcare use, and field use by frontline healthcare workers and nonprofit global health organizations.[1] The signals that these biosensors produce must be detected with a device that is similarly mobile and low-cost. Smartphones have reached global market penetration and are equipped with image sensors, making them ideal for reading colorimetric output from biosensors. Their processing power also allows for a simple and intuitive user interface, and their communication capabilities facilitate integration into existing healthcare infrastructures.[2] Here we showcase two mobile apps developed in our lab that yield real-time diagnostic results with an augmented reality (AR) user interface designed to be accessible to anyone regardless of geographical location, native language, educational background, or economic status.
First, we report a marker-based AR app that performs a semi-quantitative analysis of procalcitonin, a sepsis biomarker, in serum at clinically relevant levels. This is achieved by overlaying a transparent sheet printed with custom AR markers on the paper-based biosensor and viewing it through the mobile camera.[3] The AR markers are calibrated to interact with the colorimetric signals: the contrast of each marker changes with the intensity of the underlying signal. The results are displayed in real time as an AR traffic light that indicates low (green), moderate (yellow), or high (red) levels of procalcitonin. This approach could be used during triage to screen patients at moderate or high risk of sepsis, prioritize their treatment, and improve clinical outcomes.
Second, we present a marker-less AR app, currently in development, that detects and quantifies the intensity of colorimetric signals automatically. An initial camera calibration step allows for uniform functionality on any mobile device, and common problems involving illuminance, light artifacts, inconsistent camera angle, and camera-to-sensor distance are addressed. Variations in illumination are overcome by incorporating a control color into the paper substrate. Variations in camera angle are rectified with edge detection algorithms and subsequent affine remapping. Light artifacts are controlled by a set of shadow detectors that monitor the illumination of the surrounding paper substrate. Finally, variations in the distance between the camera and the signal are mitigated by using image pyramids. By taking all of these variables into account, we propose a universal platform for mobile diagnostics that can be easily adapted to any colorimetric paper-based assay.
[1] Policy Considerations for Mobile Biosensors. ACS Sensors, 2018.
[2] Augmented Reality for Real-Time Detection and Interpretation of Colorimetric Signals Generated by Paper-Based Biosensors. ACS Sensors, 2017.
[3] A Robust and User-Friendly Alternative to Densitometry Using Origami Biosensors and Digital Logic. ACS Sensors, 2018.
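
To illustrate the semi-quantitative traffic-light read-out described for the marker-based app, the following is a minimal sketch, not the published implementation: it measures the mean darkness of a colorimetric spot and maps it onto the three bands. The region of interest and the cut-off values are illustrative placeholders, not the calibrated procalcitonin thresholds.

```python
# Minimal sketch: classify a colorimetric spot into the three traffic-light bands.
# Thresholds below are illustrative, not the calibrated procalcitonin cut-offs.
import cv2
import numpy as np

LOW_CUTOFF = 60    # below this mean darkness: low level  -> green light
HIGH_CUTOFF = 140  # above this mean darkness: high level -> red light

def traffic_light(frame_bgr, roi):
    """Return 'green', 'yellow' or 'red' for the spot inside roi = (x, y, w, h)."""
    x, y, w, h = roi
    spot = frame_bgr[y:y + h, x:x + w]
    gray = cv2.cvtColor(spot, cv2.COLOR_BGR2GRAY)
    darkness = 255.0 - float(gray.mean())   # darker spot = more analyte
    if darkness < LOW_CUTOFF:
        return "green"
    if darkness < HIGH_CUTOFF:
        return "yellow"
    return "red"
```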
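
The illumination handling in the marker-less app can be sketched in the same spirit: a printed control patch of known color rescales the image channel by channel, and blank regions of the paper around the signal act as simple shadow detectors. Patch locations, the reference color, and the tolerance value are assumptions for illustration only.

```python
# Minimal sketch of illumination control: a known control patch sets per-channel
# gains, and blank-paper patches around the signal flag uneven lighting (shadows).
import numpy as np

REFERENCE_BGR = np.array([200.0, 200.0, 200.0])  # illustrative control-patch color

def normalize_illumination(frame_bgr, control_roi):
    """Rescale the frame so the control patch matches its known reference color."""
    x, y, w, h = control_roi
    measured = frame_bgr[y:y + h, x:x + w].reshape(-1, 3).mean(axis=0)
    gain = REFERENCE_BGR / np.maximum(measured, 1.0)
    return np.clip(frame_bgr * gain, 0, 255).astype(np.uint8)

def shadow_free(frame_bgr, blank_rois, tolerance=0.15):
    """True if all blank-paper patches around the signal are similarly bright."""
    levels = [frame_bgr[y:y + h, x:x + w].mean() for (x, y, w, h) in blank_rois]
    return (max(levels) - min(levels)) / max(max(levels), 1.0) < tolerance
```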
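
Finally, the geometric steps of the marker-less pipeline can be sketched as follows, again as an illustration rather than the app's actual code: the sensor outline is found by edge detection, remapped to a canonical rectangle, and an image pyramid makes later processing insensitive to camera distance. The sketch uses a full perspective warp; the affine remapping mentioned in the summary corresponds to the near-parallel special case.

```python
# Minimal sketch: edge detection -> outline -> remap to a canonical view,
# then a Gaussian pyramid for distance-invariant downstream analysis.
import cv2
import numpy as np

CANONICAL_SIZE = (400, 400)  # illustrative size of the rectified sensor image

def order_corners(pts):
    # Order as top-left, top-right, bottom-right, bottom-left.
    s = pts.sum(axis=1)
    d = np.diff(pts, axis=1).ravel()
    return np.float32([pts[np.argmin(s)], pts[np.argmin(d)],
                       pts[np.argmax(s)], pts[np.argmax(d)]])

def rectify(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    quad = cv2.approxPolyDP(largest, 0.02 * cv2.arcLength(largest, True), True)
    if len(quad) != 4:
        return None  # sensor outline not found in this frame
    src = order_corners(quad.reshape(4, 2).astype(np.float32))
    w, h = CANONICAL_SIZE
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame_bgr, M, CANONICAL_SIZE)

def pyramid(image, levels=3):
    """Gaussian pyramid: successively halved copies of the rectified image."""
    out = [image]
    for _ in range(levels):
        out.append(cv2.pyrDown(out[-1]))
    return out
```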