Posted on 2023-05-03, 16:00. Authored by Yan Wang, Sina Sadeghi, Rajesh Paul, Zach Hetzler, Evgeny Danilov, Frances S. Ligler, and Qingshan Wei.
Time-resolved techniques have been widely used in time-gated and luminescence lifetime imaging. However, traditional time-resolved systems require expensive lab equipment such as high-speed excitation sources and detectors, or complicated mechanical choppers, to achieve high repetition rates. Here, we present a cost-effective and miniaturized smartphone lifetime imaging system integrated with a pulsed UV LED for 2D luminescence lifetime imaging, using a videoscopy-based virtual chopper (V-chopper) mechanism combined with machine learning. The V-chopper method generates a series of time-delayed images between excitation pulses and smartphone gating so that the luminescence lifetime can be measured at each pixel using a relatively low acquisition frame rate (e.g., 30 fps) without the need for excitation synchronization. Europium (Eu) complex dyes with different luminescence lifetimes ranging from microseconds to seconds were used to demonstrate and evaluate the principle of the V-chopper on a 3D-printed smartphone microscopy platform. A convolutional neural network (CNN) model was developed to automatically distinguish the gated images in different decay cycles with an accuracy of >99.5%. The current smartphone V-chopper system can detect lifetimes down to ~75 microseconds by utilizing the default phase shift between the smartphone video rate and the excitation pulses, and in principle it can detect much shorter lifetimes by accurately programming the time delay. This V-chopper methodology eliminates the need for the expensive and complicated instruments used in traditional time-resolved detection and can greatly expand the applications of time-resolved lifetime technologies.
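To illustrate the core idea behind per-pixel lifetime extraction from the time-delayed frames, a minimal Python sketch is given below. This is not the authors' implementation: the array names (`frames`, `delays`), the monoexponential decay model, and the use of `scipy.optimize.curve_fit` are assumptions chosen purely for illustration of how a lifetime map could be fit from gated video frames.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical sketch of per-pixel lifetime estimation.
# `frames`: stack of gated images, shape (n_frames, H, W).
# `delays`: effective delay (s) of each frame after the excitation pulse,
# assumed known from the phase shift between the LED pulse train and the
# 30 fps video rate.

def mono_exp(t, a, tau, b):
    """Monoexponential decay with baseline: I(t) = a*exp(-t/tau) + b."""
    return a * np.exp(-t / tau) + b

def lifetime_map(frames, delays, tau0=1e-3):
    """Fit a monoexponential decay at every pixel; return lifetime map in seconds."""
    n, h, w = frames.shape
    taus = np.full((h, w), np.nan)
    for i in range(h):
        for j in range(w):
            trace = frames[:, i, j].astype(float)
            try:
                popt, _ = curve_fit(
                    mono_exp, delays, trace,
                    p0=(trace.max() - trace.min(), tau0, trace.min()),
                    maxfev=2000,
                )
                taus[i, j] = popt[1]  # fitted lifetime tau
            except RuntimeError:
                pass  # leave NaN where the fit does not converge
    return taus
```

In practice, the effective delays would be recovered from the known phase relationship between the excitation pulses and the video frames (with the CNN sorting gated frames into their decay cycles), and a vectorized or GPU implementation would be preferable for full-frame lifetime maps.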