Contactless Vital Signs Measurement System Using RGB-Thermal Image Sensors and Its Clinical Screening Test on Patients with Seasonal Influenza.

Sensors (Basel, Switzerland) | 2020

Background: In the last two decades, infrared thermography (IRT) has been applied in quarantine stations for the screening of patients with suspected infectious disease. However, the fever-based screening procedure employing IRT suffers from low sensitivity, because monitoring body temperature alone is insufficient for detecting infected patients. To overcome the drawbacks of fever-based screening, this study aims to develop and evaluate a multiple vital sign (i.e., body temperature, heart rate and respiration rate) measurement system using RGB-thermal image sensors.

Methods: The RGB camera measures blood volume pulse (BVP) through variations in the light absorption from human facial areas. IRT is used to estimate the respiration rate by measuring the change in temperature near the nostrils or mouth accompanying respiration. To enable a stable and reliable system, the following image and signal processing methods were proposed and implemented: (1) an RGB-thermal image fusion approach to achieve highly reliable facial region-of-interest tracking; (2) a heart rate estimation method including a tapered window for reducing noise caused by the face tracker, reconstruction of the BVP signal from the three RGB channels by optimizing a linear function to improve the signal-to-noise ratio, and a multiple signal classification (MUSIC) algorithm for estimating the pseudo-spectrum from limited time-domain BVP signals within 15 s; and (3) a respiration rate estimation method implementing nasal or oral breathing signal selection based on a signal quality index for stable measurement and the MUSIC algorithm for rapid measurement. We tested the system on 22 healthy subjects and 28 patients with seasonal influenza, using the support vector machine (SVM) classification method.

Results: The body temperature, heart rate and respiration rate measured in a non-contact manner were highly similar to those measured via contact-type reference devices (i.e., thermometer, ECG and respiration belt), with Pearson correlation coefficients of 0.71, 0.87 and 0.87, respectively. Moreover, the optimized SVM model with three vital signs yielded sensitivity and specificity values of 85.7% and 90.1%, respectively.

Conclusion: For contactless vital sign measurement, the system achieved a performance similar to that of the reference devices. The multiple vital sign-based screening achieved higher sensitivity than fever-based screening. Thus, this system represents a promising alternative for further quarantine procedures to prevent the spread of infectious diseases.
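The Methods above describe estimating heart rate by applying the multiple signal classification (MUSIC) algorithm to a short (15 s) blood volume pulse signal to obtain a pseudo-spectrum. Below is a minimal Python sketch of that general idea; the sampling rate, embedding dimension, assumed subspace size, heart-rate search band and the synthetic test signal are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def music_pseudospectrum(x, fs, n_components=2, m=60, freqs=None):
    """Estimate a MUSIC pseudo-spectrum from a short 1-D signal.

    x            : detrended BVP samples (1-D array)        [assumed input]
    fs           : sampling rate in Hz                      [assumed 30 fps camera]
    n_components : assumed signal-subspace dimension (2 for one real sinusoid)
    m            : embedding / correlation-matrix dimension (illustrative)
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Build an m x m sample correlation matrix from overlapping snapshots.
    snapshots = np.stack([x[i:i + m] for i in range(n - m + 1)], axis=1)
    r = snapshots @ snapshots.T / snapshots.shape[1]
    # Eigendecomposition: the smallest eigenvalues span the noise subspace.
    _, eigvecs = np.linalg.eigh(r)
    noise = eigvecs[:, : m - n_components]
    if freqs is None:
        freqs = np.linspace(0.7, 3.0, 400)  # plausible heart-rate band, 42-180 bpm
    k = np.arange(m)
    pseudo = []
    for f in freqs:
        steer = np.exp(-2j * np.pi * f / fs * k)        # steering vector at f
        denom = np.linalg.norm(noise.T @ steer) ** 2    # projection onto noise subspace
        pseudo.append(1.0 / denom)
    return freqs, np.asarray(pseudo)

# Example: 15 s of a synthetic 1.2 Hz (72 bpm) pulse in noise, sampled at 30 fps.
fs = 30.0
t = np.arange(0, 15, 1 / fs)
bvp = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.random.randn(t.size)
freqs, p = music_pseudospectrum(bvp, fs)
print("Estimated heart rate: %.1f bpm" % (60 * freqs[np.argmax(p)]))
```

MUSIC scores each candidate frequency by how nearly orthogonal its steering vector is to the noise subspace, which is why it can resolve a narrowband pulse component from a window as short as 15 s, where a plain periodogram would have coarse frequency resolution.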

PubMed ID: 32294973

Research resources used in this publication

None found

Antibodies used in this publication

None found

Associated grants

None

Publication data is provided by the National Library of Medicine® and PubMed®. Data is retrieved from PubMed® on a weekly schedule. For terms and conditions, see the National Library of Medicine Terms and Conditions.

This is a list of tools and resources that we have found mentioned in this publication.


MATLAB (tool)

RRID:SCR_001622

Multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks. Allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages, including C, C++, Java, Fortran and Python. Used to explore and visualize ideas and collaborate across disciplines including signal and image processing, communications, control systems, and computational finance.
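The entry above notes that MATLAB can interface with programs written in other languages, including Python. Purely as a hedged illustration (not drawn from this publication), the sketch below drives MATLAB from Python through the MATLAB Engine API for Python; it assumes a local MATLAB installation with the matlab.engine package installed.

```python
# Illustrative only: requires MATLAB plus the "MATLAB Engine API for Python"
# (the matlab.engine package) installed on the same machine.
import matlab.engine

eng = matlab.engine.start_matlab()   # start a background MATLAB session
print(eng.sqrt(16.0))                # call a built-in MATLAB function -> 4.0
magic_square = eng.magic(3)          # returns a matlab.double array
print(magic_square)
eng.quit()                           # shut the MATLAB session down
```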


MUlti SImulation Coordinator (tool)

RRID:SCR_001756

Software that allows large-scale neuron simulators to communicate during runtime. It allows exchange of data among parallel applications in a cluster environment, interconnects large-scale neuronal network simulators with each other or with other tools, participates in multi-simulations, and is continuously developed and extended. Three simulators currently have MUSIC interfaces: Moose, NEURON and NEST. Multiple applications execute in parallel while exchanging data via MUSIC. The software interface promotes interoperability by allowing models written for different simulators to be simulated together in a larger system, and it enables re-usability of models or tools by providing a standard interface. Because data are distributed over a number of processors, it is non-trivial to coordinate data transfer so that it reaches the correct destination at the correct time.

Current and future simulators can make use of MUSIC-compliant general-purpose tools and participate in multi-simulations, for example when:

* different parts of a complex nervous system model are optimally implemented in different simulators and need to communicate with each other;
* post-processing of generated data is needed, the amounts of data are too large for intermediate storage, and the simulator must pass the data directly to the post-processing module. A standard interface enables straightforward independent third-party development and community sharing of interoperable software tools for parallel processing.

Key properties:

* The library and utilities are written in C++ and use MPI.
* It is possible to add a MUSIC interface to existing simulators.
* MUSIC works independently; no assumptions are made about other applications, which facilitates development of general-purpose tools.
* Performance: data transport with high bandwidth and low latency.


OpenCV (tool)

RRID:SCR_015526

Computer vision and machine learning software library which provides a common infrastructure for computer vision applications. The algorithms within the library can be used to detect and recognize faces, identify objects, classify human actions in videos, track camera movements and moving objects, extract 3D models of objects, produce 3D point clouds from stereo cameras, stitch images together to produce a high-resolution image of an entire scene, find similar images in an image database, follow eye movements, recognize scenery, and establish markers to overlay a scene with augmented reality. It has C++, C, Python, Java and MATLAB interfaces.
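Of the capabilities listed above, face detection and tracking is the one most relevant to the facial region-of-interest tracking described in this publication. The sketch below is a minimal, hedged Python example using the Haar cascade bundled with the opencv-python package; the image file names are hypothetical placeholders, not taken from the paper.

```python
import cv2

# Load the frontal-face Haar cascade that ships with opencv-python.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# "frame.png" is a hypothetical single frame grabbed from an RGB video stream.
image = cv2.imread("frame.png")
if image is None:
    raise FileNotFoundError("frame.png not found")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# detectMultiScale returns one (x, y, w, h) bounding box per candidate face.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("faces.png", image)
```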
