J Clin Monit Comput DOI 10.1007/s10877-016-9949-y
ORIGINAL RESEARCH
Estimation of breathing rate in thermal imaging videos: a pilot study on healthy human subjects

Carina Barbosa Pereira (1) · Xinchi Yu (1) · Michael Czaplik (2) · Vladimir Blazek (1,3) · Boudewijn Venema (1) · Steffen Leonhardt (1)
Received: 23 March 2016 / Accepted: 20 October 2016
© Springer Science+Business Media Dordrecht 2016
Abstract Diverse studies have demonstrated the importance of monitoring breathing rate (BR). Changes in BR are commonly among the earliest and most important markers of serious complications or illness. Nevertheless, BR is frequently neglected due to limitations of clinically established measurement techniques, which require the attachment of sensors. The use of adhesive pads or thoracic belts in preterm infants, as well as in traumatized or burned patients, is a further critical issue. The present paper proposes a new robust approach, based on data fusion, to remotely monitor BR using infrared thermography (IRT). The algorithm considers not only the temperature modulation around mouth and nostrils but also the movements of both shoulders. The data of these four sensors/regions of interest are then fused to reach improved accuracy. To investigate the performance of our approach, two different experiments (phase A: normal breathing; phase B: simulation of breathing disorders) were performed on twelve healthy volunteers. Thoracic effort (piezoplethysmography) was simultaneously acquired to validate our results. Excellent agreement between BR estimated with IRT and the gold standard was achieved. While in phase A a mean correlation of 0.98 and a root-mean-square error (RMSE) of 0.28 bpm were reached, in phase B the mean correlation
Correspondence: Carina Barbosa Pereira ([email protected])

1 Chair for Medical Information Technology, Helmholtz Institute for Biomedical Engineering, RWTH Aachen University, Aachen, Germany
2 Department of Anesthesiology, University Hospital RWTH Aachen, Aachen, Germany
3 Czech Institute of Informatics, Robotics and Cybernetics (CIIRC), CTU Prague, Prague, Czech Republic
and the RMSE hovered around 0.95 and 3.45 bpm, respectively. The higher RMSE in phase B results predominantly from delays between IRT and the gold standard at BR transitions: eupnea/apnea, apnea/tachypnea, etc. Moreover, this study also demonstrates the capability of IRT to capture varied breathing disorders and, consequently, to assess respiratory function. In summary, IRT may be a promising alternative to conventional contact-based monitoring techniques, given its performance and remarkable capabilities.

Keywords Physiological monitoring · Respiratory rate · Breathing disorders · Thermal imaging · Data fusion
1 Introduction

Respiration is an important physiological process whose goals are (1) to supply tissues and cells with the oxygen required for metabolism and (2) to remove the carbon dioxide originated in energy-producing reactions [18]. Breathing rate (BR) at rest varies with age, as demonstrated in Table 1 [26, 48]. For example, during adulthood BR is relatively constant, varying from 12 bpm (breaths per minute) to 20 bpm at resting conditions [16, 26, 48]. On the other hand, BR in neonates (birth to 6 weeks) ranges between 30 and 60 bpm [26, 48]. Breathing disorders can be identified by an abnormal BR, respiratory sounds, or by an atypical waveform (abnormal depth and/or rhythm) [22, 26, 47]. Tachypnea (high BR), bradypnea (low BR) or phases of apnea (temporary cessation of breathing) are examples of altered BR [26]. Kussmaul's respiration and Cheyne–Stokes respiration are further specific abnormal respiratory patterns [31, 43, 46]. The former presents a
Table 1 Breathing rates at rest according to age [26, 38, 48]

Age                  Breathing rate (breaths per minute, bpm)
Birth to 6 weeks     30–60
6 months             25–40
3 years              20–30
6 years              18–25
10 years             15–20
Adults               12–20
Adults ≥65 years     12–28
Adults ≥80 years     10–30
regular rhythm but an increased rate and depth [46]. It usually occurs as a result of metabolic disorders, in order to compensate for an acidotic blood pH [37, 43]. Additionally, it is often associated with fear, anxiety or panic [43]. The latter is characterized by (1) a gradual increase followed by (2) a gradual decrease in depth and rate and (3) a period of apnea [31, 46]. Cheyne–Stokes respiration may be associated with cerebral ischemia [12, 40], congestive heart failure [46, 48] and high intracranial pressure [43, 46], among others. Monitoring of breathing function and breathing frequency is truly crucial for patient assessment [5, 9, 13, 21]. Several diseases lead to an abnormal respiratory rate and pattern by altering either one of the three types of feedback to the central respiratory control or the control center itself. They usually induce a change in PaO2, PaCO2 or pH, thus altering the input from both the carotid body and the medullary chemoreceptors. The human body responds by changing first the tidal volume and then the respiratory rate [5]. Depression of the respiratory center by increased intracranial pressure, diabetic ketoacidosis, heart failure and obstructive sleep apnea are some examples of pathologies that affect respiratory rate [47]. Therefore, this vital parameter may be a sensitive marker of patient condition and an early predictor of patient deterioration [9]. In addition, it has a relevant impact on the diagnosis and management of respiratory impairments, for instance bronchitis, asthma or even obstructive sleep apnea [14]. Furthermore, respiratory pathologies are early and solid predictors of cardiopulmonary arrest [6, 48], intensive care admission [6, 48] and death [6, 11, 48]. Last but not least, BR is also an important vital parameter for early detection of sudden infant death syndrome, which is still one of the leading causes of death in infants, as well as for sports and sleep studies.
Although BR is a key vital sign for assessing the severity of acute diseases, it is one of the most frequently undocumented and underestimated parameters [27, 42]. According to the literature, it is commonly neglected due to
shortcomings of current clinically established monitoring modalities [8, 9, 11, 27], which require attachment of sensors to the patient's body, causing stress and discomfort. Moreover, adhesive electrodes might cause mechanical stress to the very sensitive skin of infants [1]. Vital-sign monitoring with adhesive pads in traumatized or burned patients is another critical issue [7, 36]. Current contact-based breathing rate monitoring techniques are based on measuring: (1) respiratory/acoustic sounds, detected by placing a microphone over the throat or near the respiratory airways [2, 3, 17, 29]; (2) chest or abdomen movements, with respiratory belt transducers [2, 4, 15]; (3) electrical impedance of the thorax, using electrical impedance tomography (EIT), for instance [10, 19, 23]; and lastly (4) respiratory airflow [20]. The measurement methods based on respiratory airflow are (a) nasal/oronasal thermistors, detecting temperature modulation due to exhalation of warm air from the lungs and inhalation of cold air from the environment [2, 14]; (b) capnography, monitoring the varying partial pressure of carbon dioxide (CO2) [41]; and (c) spirometry, measuring the volume and flow of inspired and expired air [4]. Furthermore, BR may be derived from body-surface electrocardiography (ECG) using one of three approaches: measuring transthoracic impedance, looking at the beat-to-beat variation in RR intervals, or using the EDR (ECG-derived respiration) technique [4, 19, 30, 41, 44]. In the last three decades, there has been an increasing interest in robust and reliable contactless alternatives for BR monitoring, in order to improve patients' quality of life as well as to overcome the limitations of clinically established methods [11, 14]. These are based on Doppler radar [11] and on imaging sensors that operate in different spectral bands: visible [4], mid-wave infrared (3–5 µm) [15, 24] and long-wave infrared (8–14 µm) [1, 34, 35].
Infrared thermography (IRT), also denominated thermal imaging, is an emerging remote and passive monitoring and diagnostic technique. It detects the radiation naturally emitted from an object, e.g. the human skin, and does not use any harmful radiation. Moreover, thermal imaging does not need a light source, which makes IRT an outstanding imaging technology [33]. This paper presents a new approach, based on several regions of interest (ROIs), for monitoring breathing function in infrared imaging. Existing algorithms have mostly used thermal information around the nostrils to extract BR [1, 14, 15, 24, 30, 34, 35]. However, this solution is not completely adequate for long-term monitoring. If patients lie in bed, the nose may fall outside the field of view of the infrared camera, suspending the signal acquisition and, consequently, the monitoring of this vital parameter. To overcome this issue, in this paper we propose a new approach that not only uses
temperature modulation around the nostrils, but also around the mouth, since open-mouth respiration is also common in adult humans. In addition, the movement of the shoulders, which is considerably prominent during the breathing process, was also taken into consideration. The methodology used in this paper is described in detail in Sect. 2. The experimental setup is, in turn, introduced in Sect. 3. Results are presented in Sect. 4 and discussed in Sect. 5. Lastly, Sect. 6 contains the conclusions and gives some future perspectives.
2 Methodology

The approach presented in this paper not only uses the fact that the temperature around the nostrils varies with respiration, but also considers the temperature modulation around the mouth (in case of open-mouth breathing) as well as the movement of the shoulders (and, indirectly, the movement of the thorax). As is well known, the diaphragm and the intercostal muscles (between the ribs) are the most important groups of respiratory muscles, responsible for the movement of the chest and shoulders, among others [49]. While during inhalation the ribs are actively pulled upward and forward, during exhalation the opposite happens passively. Equally, an upward and downward movement of the shoulders occurs during inspiration and expiration, respectively. In order to measure BR, the four ROIs, i.e. nose, mouth and both shoulders, must be automatically identified in the first frame of the video sequence. Afterwards, they need to be tracked. In a further step, breathing waveforms must be extracted and used to compute BR. Lastly, the BRs from the four sensors/ROIs must be fused to minimize the detection error probability and to achieve a higher reliability. The above-mentioned steps, illustrated in Fig. 1, are described in detail in Sects. 2.1–2.5.

2.1 Automatic detection of ROIs and extraction of breathing waveforms

As previously mentioned, the first step of our approach consists in an automatic detection of all ROIs: nose, mouth and shoulders. A detailed description follows.

2.1.1 Nostrils
To extract the nostrils ROI_n (region of interest nose), the approach presented in [34, 35] was used. It consists in segmenting the face using the multi-level Otsu method [25, 32]. Afterwards, an automatic detection of the medial canthus of the periorbital regions, which are two of the warmest areas of the face (so-called "hot spots"), needs to be carried out. Their position is fundamental to determine the area enclosing the nostrils (denominated search window B in [35]). It corresponds to approximately 1/2 of the facial width and 1/3 of the facial height and is located directly under the periorbital regions. Finally, the Canny edge detector must be applied to window B. The resulting edges correspond to the edges of the nose.

2.1.2 Mouth

As in Sect. 2.1.1, facts from human physiology are used here as well. In the previous section, the position of the nostrils was found. It can be further used to identify the mouth, which is typically located below the nose, assuming a craniocaudal image orientation. Therefore, we limited our search to window A, as represented in Fig. 2a. As illustrated, the search window begins below the nose, and its height corresponds to 1.5-fold the vertical distance between eyes and nose. According to our empirically based knowledge, the mouth is a hot spot (see Fig. 2a, b). Hence, to find it, it was assumed that only the 2 % of pixels with highest intensity in search area A correspond to the mouth (see Fig. 2c). Afterwards, the horizontal projection P_H(y) needs to be calculated as given by

    P_H(y) = Σ_{x=0}^{P−1} I(x, y),    (1)

where I(x, y) denotes the binary value at pixel (x, y) and P stands for the width of the binary image (search window A). The vertical center position of ROI_m (region of interest mouth) coincides with the maximum of P_H(y).

2.1.3 Shoulders

In order to identify both shoulders (left shoulder ROI_ls and right shoulder ROI_rs), an edge detection in the binary image I_BW(x, y) of the thermogram was carried out (see Fig. 3a, b). For that, the Sobel edge detector was applied. Afterwards, a 21-pixel-wide (2M + 1) sliding window was used to calculate the vertical projection P_V of the image containing the detected edges I_edges(x, y) (size P × Q), as represented in Fig. 3b. P_V is given by

    P_V(x_0) = Σ_{u=−M}^{M} Σ_{y=0}^{Q−1} I_edges(x_0 + u, y)  for  M + 1 < x_0 < P − M,    (2)

where x_0 is the center of the sliding window. The maxima of the vertical projection P_V give information about the position of the shoulders in the thermogram I. Their distal
Fig. 1 Schematic of the main steps used to detect breathing rate in infrared thermography. a The first step consists in recording the thermal video sequences. b After preprocessing of image I, an automatic detection and tracking of the four ROIs is carried out. Whereas the breathing signals of nostrils and mouth are extracted from the grayscale intensity image I_filt, the breathing waveform of both shoulders is detected in the binary image I_BW. c The third step consists in (1) extracting respiratory signals from the ROIs and (2) determining the associated respiratory rates and signal quality indices. d In the fourth step, data fusion (using median, signal quality index and Bayesian fusion) is performed

Fig. 2 a Thermogram of the face. The upper, middle and lower dotted lines depict the vertical positions of eyes, nose and mouth, respectively. The search window A, which encloses the mouth, is represented by the blue window. b Thermogram corresponding to search window A (depicted in a). The horizontal dotted line denotes the vertical center position of ROI_m (region of interest mouth). c The 2 % of pixels with highest intensity in search area A. d Horizontal projection P_H(y) of c
limits on the x-axis result directly from the position of the first and last local maxima in the PV , as shown in Fig. 3b, c. The position of the nostrils, calculated in Sect. 2.1.1, is utilized for detection of proximal limits on the x-axis. Therefore, it was assumed that the maximum to the left and to the right of the nose correspond to the proximal limits, as depicted in Fig. 3b, c. Obviously, just the region between distal limits was considered. The y-coordinates of the ROIs that enclose the shoulders can be determined by the intersection between the binary image containing the edges Iedges and vertical lines, which depict the x-coordinates of the distal and proximal limits. They are illustrated by the stars in Fig. 3b.
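The projection operations of Eqs. (1) and (2) can be sketched in a few lines of NumPy. This is an illustrative reimplementation, not the authors' code: the function names and the synthetic inputs are ours, while the 2M + 1 = 21 pixel window width comes from the text.

```python
import numpy as np

def horizontal_projection(binary_roi):
    """Eq. (1): sum the binary mouth mask over x for every row y.

    The row with the maximum projection is taken as the vertical
    center of the mouth ROI (Sect. 2.1.2)."""
    p_h = binary_roi.sum(axis=1)              # P_H(y) = sum_x I(x, y)
    return p_h, int(np.argmax(p_h))

def vertical_projection(edges, half_width=10):
    """Eq. (2): sliding-window column sums of the edge image.

    A window of 2*M + 1 = 21 pixels (M = 10) is slid along x; the
    local maxima of P_V mark candidate shoulder positions."""
    col_sums = edges.sum(axis=0)              # sum over y first
    kernel = np.ones(2 * half_width + 1)
    return np.convolve(col_sums, kernel, mode="same")
```

For example, a binary mask whose foreground pixels cluster in rows 40–42 yields a mouth center at row 40–42, and an edge image with a strong vertical edge at column 60 produces a P_V whose maximum covers that column.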
Note that for an automatic detection of the shoulders in the first frame of the thermal video, a front-view thermogram is required. Segmentation of the human body (foreground) from the background in thermal images is, in most cases, easier than in visible images due to the high contrast between them. Commonly, the human body is warmer than the ambient temperature and surrounding objects (e.g. mattress, blankets and pillow, in the case of patients lying in a bed). Since clothes are warmed by the body through conduction, they do not have any negative influence on our algorithm during the segmentation process. Even in neonates lying in an incubator, the contrast has proven to be surprisingly high [1].
Fig. 3 a Binary image I_BW of the thermogram I. b Image with extracted edges I_edges. While the green dashed lines represent the distal limits of the shoulders, the blue dotted lines show their proximal limits. The orange rectangle represents the 21-pixel-wide sliding window. The intersections between I_edges and the distal as well as proximal limits are shown by the four stars. c Vertical projection P_V(x) of I_edges. Circles and crosses show the maxima of P_V(x), which permit identification of the shoulders. In addition, green dashed lines and blue dotted lines represent their distal and proximal limits, respectively
2.2 Tracking of ROIs

After the automatic detection of the ROIs, it is essential to track them in order to compensate for the movements of the patient. For that, we selected the approach introduced by Mei and Ling [28], where sparse representation is integrated into a particle filter-based object tracker. Subsequent to the detection of the four target regions (ROIs), an initial particle set is generated randomly around the regions' centers from the initial state distribution p(x_0). Here, p(x_t) stands for the state variable, which describes the affine motion parameters (and thereby the location) of the particles at time t. Given all available observations up to time point t − 1, y_{0:t−1} = {y_0, y_1, y_2, ..., y_{t−1}}, the distribution of x_t, represented as p(x_t | y_{0:t−1}), can be recursively predicted as

    p(x_t | y_{0:t−1}) = ∫ p(x_t | x_{t−1}) p(x_{t−1} | y_{0:t−1}) dx_{t−1}.    (3)

After computation of the predictive distribution of x_t, the observation y_t becomes available. As a result, the state estimate can be updated using Bayes' rule:

    p(x_t | y_{0:t}) = p(y_t | x_t) p(x_t | y_{0:t−1}) / p(y_t | y_{0:t−1}).    (4)

Here, p(y_t | x_t) stands for the observation likelihood, which indicates the similarity between a target candidate and the target templates. In order to detect the tracking target in a new frame, each target candidate is sparsely represented in the template space. Sparsity is achieved by solving an ℓ1-regularized least-squares problem (LSP). The candidate with the smallest projection error (error approximated by the target templates after applying the ℓ1 minimization) is considered as the final tracking target. In summary, given A ∈ R^{m×n} and y ∈ R^m, sparsity is reached by calculating

    min_x ||Ax − y||_2^2 + λ ||x||_1,  λ ≥ 0,    (5)

where ||·||_2 denotes the Euclidean norm, ||x||_1 = Σ_i |x_i| stands for the ℓ1 norm of x, and λ corresponds to the regularization parameter. In addition, x ∈ R^n is a vector containing the target and trivial coefficients.
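The ℓ1-regularized least-squares step of Eq. (5) can be solved with any standard sparse solver; [28] uses its own minimizer, so the following ISTA (iterative shrinkage-thresholding) routine is only a generic sketch of the optimization, with our own function names:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm: shrink each entry toward zero."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_l1(A, y, lam=0.01, n_iter=200):
    """Minimize ||Ax - y||_2^2 + lam * ||x||_1 (Eq. 5) via ISTA."""
    # Step size = 1 / Lipschitz constant of the gradient 2*A^T(Ax - y).
    step = 1.0 / (2.0 * np.linalg.norm(A, 2) ** 2)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - y)
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

With A the template matrix and y the vectorized candidate patch, the residual ||Ax − y||_2 of the target coefficients then plays the role of the projection error used to rank candidates.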
2.3 Extraction of breathing waveforms and determination of BR

In order to obtain the breathing waveforms of the ROIs nose and mouth, the mean intensity value Ī_ROIx must be computed for each grayscale frame according to

    Ī_ROIx(t) = (1 / (m·n)) Σ_{i=0}^{m−1} Σ_{j=0}^{n−1} I_ROIx(i_ROIx, j_ROIx, t),    (6)

where I_ROIx stands for the intensity at pixel (i_ROIx, j_ROIx), t denotes the time point, and m and n correspond to the width and length of ROI_x, respectively. Finally, the subscript x (ROI_x) represents one of the ROIs, nose (n) or mouth (m). The breathing waveform of the shoulders stands basically for their upward and downward (vertical) movement. The respiratory rate is determined by applying a short-time Fourier transform to the raw signal Ī_ROIx. For each window location, the signal was Hamming windowed to reduce edge effects, and the normalized spectrum was calculated. The local maximum in the band-pass range between 0.1 and 3 Hz corresponds to the BR at a given time point.
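For a single analysis window, the spectral BR estimation described above can be sketched as follows. The Hamming window, the 0.1–3 Hz band and the 30 fps sampling rate come from the text; the function name, the omission of spectrum normalization (which does not move the peak) and of the sliding STFT windowing are our simplifications:

```python
import numpy as np

def estimate_br(signal, fs=30.0, f_lo=0.1, f_hi=3.0):
    """Estimate breathing rate (bpm) from one analysis window of the
    mean-intensity signal of a ROI (Sect. 2.3)."""
    x = np.asarray(signal, dtype=float)
    x = (x - x.mean()) * np.hamming(len(x))   # reduce edge effects
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)  # 0.1-3 Hz band-pass range
    f_peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * f_peak                      # Hz -> breaths per minute
```

A 60-s window at 30 fps gives a frequency resolution of 1/60 Hz, i.e. 1 bpm; a simulated 0.25 Hz breathing oscillation is recovered as 15 bpm.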
2.4 Signal quality index (SQI)

The normalized power spectrum was not only used to derive the BR, but also utilized to assess the signal quality. Therefore, for each analysis window, a signal quality index (SQI) was computed. The aim was to develop a metric that is able to detect noise and artifacts in the breathing signals of the four sensors and, as a result, improve signal fusion. As previously mentioned, the SQI is based on the normalized power spectrum, namely on four features F1–F4. Without loss of generality, the SQI may oscillate between 0 and 1, where 0 means a signal with poor quality containing noise and/or artifacts. In contrast, an SQI of 1 can be interpreted as a signal with outstanding quality, free from any artifact. Three regions of the normalized power spectrum were considered: (1) the low-pass region (0–0.1 Hz), (2) the band-pass region (0.1–3 Hz) and, lastly, the high-pass region (3–15 Hz). The first feature, F1, stands for the maximum in the high-pass region. F2 corresponds to the proportion of values in the high-pass region that are greater than a threshold. F3, in turn, denotes the difference between the two maxima in the band-pass and low-pass regions. The fourth and last feature, F4, is the ratio between the maximum in the low-pass region and the maximum in the band-pass region.

2.5 Fusion algorithms

In order to obtain a robust detection of BR, the signals from all four sensors need to be combined (multisensor data fusion). Three different approaches, (1) median of all sensors, (2) best SQI and (3) Bayesian fusion, were implemented, tested and compared. They are described in detail in the following sections.

2.5.1 Median

The first approach calculates the median breathing rate of all four sensors. The fused BR f^fus at time point k is governed by the equation

    f_k^fus = median(f_k^n),    (7)

where f^n represents the BR of sensor/ROI n.

2.5.2 Best SQI

The second approach consists in choosing the best sensor, i.e. the ROI that presents the best SQI.

2.5.3 Bayesian fusion

The last fusion algorithm, Bayesian fusion, is based on Bayes' law. Here, we adopted the approach proposed by Wartzek et al. [45]. In summary, considering a state-space representation, the Bayes estimator permits calculation of the posterior probability distribution of a real but unknown state x ∈ X. It is based on a set of measurements Z_n = {z_1 ∈ Z_1, ..., z_n ∈ Z_n}, as given by

    p(x_n | Z_n) = p(z_n | x_n) p(x_n | Z_{n−1}) / p(z_n | Z_{n−1}).    (8)

Here, the numerator corresponds to the product of p(z_n | x_n) and p(x_n | Z_{n−1}), which denote the likelihood function and the prior distribution, respectively. While the former is based on the sensor measurement model (observation model), the latter includes the transition model of the system. The denominator, in turn, is merely a normalization factor that ensures the probability density function p(x_n | Z_n) integrates to one [45].
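The first two fusion rules are simple enough to sketch directly; Eq. (7) and the best-SQI selection of Sect. 2.5.2 are reproduced below (function names and the example numbers are ours; the Bayesian estimator of [45] is omitted, as it requires the full observation and transition models):

```python
import numpy as np

def fuse_median(rates):
    """Eq. (7): median of the per-ROI breathing-rate estimates."""
    return float(np.median(rates))

def fuse_best_sqi(rates, sqis):
    """Best-SQI fusion (Sect. 2.5.2): keep the estimate of the ROI
    whose signal quality index is highest."""
    return rates[int(np.argmax(sqis))]
```

Both rules suppress a single corrupted sensor: with per-ROI estimates [14.8, 15.1, 15.0, 22.3] bpm, where the last ROI is disturbed by motion, the median returns 15.05 bpm, and if the disturbed ROI also has the lowest SQI, best-SQI fusion returns 15.1 bpm.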
3 Experimental setup

Twelve healthy subjects (5 females and 7 males), between the ages of 21 and 31 (25.25 ± 2.83 years), voluntarily accepted to participate in this study. Informed consent was obtained from the participants enrolled in the study, which was approved by the RWTH Aachen Faculty of Medicine Ethics Board (EK 081/16). Thermal sequences were acquired using a long-wave infrared (LWIR) camera, VarioCAM® HD head 820S/30 mm (InfraTec GmbH, Dresden, Germany). This camera, which presents a spatial resolution of 1024 × 768 pixels and a thermal sensitivity better than 0.05 K at 30 °C, has an uncooled infrared microbolometer focal plane array (FPA). Its sensor operates over the spectral range of 7.5–14 µm. The thermal camera was set atop a tripod located about 2 m away from the candidates, who sat comfortably on a chair. It was placed at the subject's eye level. In addition, the measurements were carried out in a temperature-controlled room with a temperature and humidity of approximately 22 °C and 50 %, respectively. The study protocol consisted of two phases, A and B. In the former, a 9-min recording was carried out during which the volunteers breathed normally. It was divided into 3-min segments. In the first and second segments, the subjects were advised to breathe only through the nose and through the mouth, respectively. In the last segment of phase A, both nose and mouth breathing were permitted. Figure 4 illustrates phase A.
Fig. 4 Graphical illustration of phase A. In the first 3-min recording the subjects were asked to breathe through the nose. In the second period (3–6 min), only mouth breathing was permitted. In the last 3 min, both nose and mouth breathing were allowed
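Because the thermal camera (30 fps) and the reference system (32 samples/s) run on different clocks, the two breathing-rate series must be brought onto a common time base before correlation and RMSE can be computed. The paper does not detail this alignment step; the sketch below uses simple linear interpolation, and the function name and timestamps are illustrative:

```python
import numpy as np

def align_to_reference(t_irt, br_irt, t_ref, br_ref):
    """Interpolate the IRT breathing-rate series onto the reference
    timestamps so both series can be compared sample by sample."""
    br_irt_on_ref = np.interp(t_ref, t_irt, br_irt)
    rmse = float(np.sqrt(np.mean((br_irt_on_ref - br_ref) ** 2)))
    return br_irt_on_ref, rmse
```

Any resampling scheme would do here; linear interpolation is adequate because BR varies slowly relative to both sampling rates.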
In the latter, the candidates were instructed to simulate, within a period of 10 min, a sequence containing normal and a varied selection of altered breathing patterns, as depicted in Fig. 5. These comprised the following: eupnea, tachypnea, apnea as well as Kussmaul and Cheyne–Stokes respiration. To facilitate the task, the diagram containing the breathing patterns was displayed on a monitor. It is valuable to note that the infrared video sequences were acquired with a frame rate of 30 fps (frames per second). In order to validate our results, the thoracic effort (piezoplethysmography) was simultaneously measured using the data recording system SOMNOlab 2 (Weinmann GmbH, Hamburg, Germany), illustrated in Fig. 6. SOMNOlab 2 is a mobile sleep diagnostic and monitoring system, which permits recording of varied physiological signals including thorax and abdomen movements (using effort sensors). The respiratory movements during inspiration and expiration cause a change in tension on the measuring sensors integrated in the thorax belt. The measuring sensors, in turn, convert this into electrical signals as a result of the piezoelectric effect. The monitoring system acquired 32 samples per second.

Fig. 5 Simulated sequence containing normal and altered breathing patterns (eupnea, tachypnea, apnea, Kussmaul breathing and Cheyne–Stokes respiration)

Fig. 6 Schematic representation of the mobile sleep diagnostic system used in this study to monitor respiratory rate. It includes two belts to measure both thorax and abdomen effort. In this study only the thorax signal was used for comparison purposes

4 Results

According to the instructions, during phase A subjects breathed normally, first through the nose, then through the mouth and, lastly, both nose and mouth breathing were permitted. Table 2 indicates the performance of the proposed approach for each of the twelve subjects. In addition, a comparison between the fusion algorithms was carried out. By using the median for data fusion, a mean correlation of 0.92 and a root-mean-square error (RMSE) of 0.62 bpm (breaths per minute) were achieved. On average, 95.90 % of the errors (25th
Table 2 Performance of the proposed algorithm for phase A

Sub.   Gen.   Median           best SQI         Bayes
              RMSE*   Corr.    RMSE*   Corr.    RMSE*   Corr.
S1     F      0.60    0.90     0.27    0.97     0.57    0.92
S2     F      0.31    0.98     0.27    0.98     0.37    0.97
S3     M      0.32    0.98     0.33    0.98     0.32    0.97
S4     M      0.19    0.98     0.20    0.98     0.25    0.98
S5     M      0.23    0.99     0.22    0.99     0.27    0.98
S6     M      2.80    0.57     0.23    0.99     0.25    0.99
S7     F      0.33    0.98     0.34    0.98     0.33    0.98
S8     M      0.98    0.85     0.40    0.97     0.42    0.97
S9     F      0.28    0.98     0.23    0.99     0.30    0.98
S10    M      0.18    0.99     0.20    0.99     0.22    0.98
S11    M      0.35    0.98     0.31    0.98     0.30    0.98
S12    F      0.83    0.85     0.33    0.97     0.46    0.95
Mean          0.62    0.92     0.28    0.98     0.34    0.97

*RMSE (bpm), root-mean-square error; Sub., subject; Gen., gender; Bayes, Bayesian fusion
to 75th percentile, 93.82–98.60 %) did not exceed 1 bpm. On the other hand, by selecting the signal with the best quality (best SQI), an average RMSE of 0.28 bpm and a mean correlation of 0.98 were obtained. Here, the percentage of absolute errors smaller than 1 bpm was 97.64 % (25th to 75th percentile, 96.09–99.30 %). Bayesian fusion, in turn, reached an RMSE of 0.34 bpm and a correlation of 0.97. The absolute errors between IRT and the gold standard were smaller than 1 bpm in 95.48 % of the cases (25th to 75th percentile, 92.95–97.18 %). Figure 7 shows a box plot comparing the root-mean-square errors obtained with sensor fusion with those corresponding to each single ROI. In this case, fusion was performed using the signal with the best quality (best SQI). In the fused data the quartiles are Q1 = 0.22 bpm, Q2 = 0.27 bpm and Q3 = 0.33 bpm. In the left-shoulder data the quartiles are, in turn, Q1 = 0.22 bpm, Q2 = 0.26 bpm and Q3 = 0.59 bpm. Note that Q1 is the first quartile (25th percentile), Q2 denotes the second quartile (50th percentile), also called the median, and Q3 stands for the third quartile (75th percentile). As an illustrative example, Fig. 8a exhibits the breathing waveforms for each sensor/ROI (black and gray lines) as well as the normalized breathing waveform (green solid line) obtained with the ground truth (piezoplethysmography) for subject S4. Figure 8b, in turn, illustrates the BR estimated with the proposed approach (red dashed line), using the median for data fusion, as well as the BR obtained with piezoplethysmography (blue solid line). A Bland–Altman diagram, comparing both measuring modalities, gold standard and IRT, as well as the three fusion approaches, is displayed in Fig. 9. The data correspond to subject S5. Using the median, a mean difference of -0.040 bpm was achieved and the limits of agreement ranged from -0.621 to 0.541 bpm. Similar results were obtained with the best SQI approach. In this case, the bias was -0.035 bpm and the limits of agreement varied between -0.593 and 0.522 bpm. The last method, Bayesian fusion, yielded a mean difference of -0.030 bpm. The 95 % limits of agreement ranged from -0.727 to 0.667 bpm. During phase B, the twelve candidates simulated the sequence of breathing patterns displayed in Fig. 5. Table 3 compares the performance of the three fusion algorithms: median, best SQI and Bayesian fusion. The median approach yielded an RMSE of 3.45 bpm and a correlation of 0.95. Additionally, 88.63 % (25th to 75th percentile, 86.89–90.24 %) of the absolute errors between IRT and the gold standard did not exceed 2 bpm. By using the best SQI, we obtained a root-mean-square error of 3.36 bpm and a correlation of 0.95. Here, the percentage of errors smaller than or equal to 2 bpm averaged 90.49 % (25th to 75th percentile, 88.53–92.37 %). Lastly, with Bayesian fusion a mean RMSE of 3.60 bpm and a mean correlation of 0.94 were achieved. In this case, 90.32 % (25th to 75th percentile, 88.33–92.26 %) of the absolute errors did not exceed 2 bpm. Figure 10 shows a box plot comparing the root-mean-square errors obtained using sensor fusion with those of each single ROI. In this example, best SQI was the approach used.

In the fused data, the quartiles are Q1 = 2.97 bpm, Q2 = 3.28 bpm and Q3 = 3.71 bpm. In the left-shoulder data, the quartiles are, in turn, Q1 = 3.18 bpm, Q2 = 3.53 bpm and Q3 = 3.73 bpm. Figure 11a depicts the BR estimated with the proposed approach (green dashed line) as well as the BR obtained with the gold standard method (blue solid line). In this example, Bayesian fusion was performed and the data is
Fig. 7 Box plot comparing the RMSEs of the fused signals with those of each single ROI (nose, mouth, and left and right shoulder). In this case, fusion was performed using the best SQI. The data correspond to phase A
Fig. 8 a Breathing waveforms obtained for each ROI, nose (gray solid line), mouth (black solid line), left shoulder (gray dotted line) and right shoulder (black dotted line) from subject S4. The same diagram compares those with the normalized breathing waveform (green solid line) obtained with piezoplethysmography. b Estimated BR (solid line ground truth, dashed line IRT). The median was used for data fusion
Fig. 9 Bland–Altman plot for subject S5 (phase A). It compares all fusion approaches, median (red), best SQI (gray) and Bayesian fusion (green). Mean differences and limits of agreement are depicted on the right side of the diagram
Table 3 Performance of the proposed algorithm for phase B

Sub.   Gen.   Median           best SQI         Bayes
              RMSE*   Corr.    RMSE*   Corr.    RMSE*   Corr.
S1     F      3.32    0.94     2.53    0.97     3.34    0.94
S2     F      5.10    0.90     5.08    0.90     5.11    0.90
S3     M      3.34    0.97     3.31    0.97     3.42    0.97
S4     M      2.58    0.98     2.51    0.98     2.52    0.98
S5     M      3.32    0.95     3.87    0.94     3.15    0.96
S6     M      3.88    0.93     3.86    0.93     3.89    0.93
S7     F      3.13    0.95     3.00    0.96     4.47    0.92
S8     M      2.71    0.97     2.95    0.95     3.00    0.95
S9     F      3.12    0.95     3.03    0.95     3.16    0.95
S10    M      3.75    0.94     3.39    0.95     3.44    0.95
S11    M      3.24    0.95     3.25    0.96     3.58    0.95
S12    F      3.87    0.94     3.57    0.95     4.14    0.93
Mean          3.45    0.95     3.36    0.95     3.60    0.94

*RMSE (bpm), root-mean-square error; Sub., subject; Gen., gender; Bayes, Bayesian fusion
corresponding to volunteer S8. Moreover, the breathing waveform of the left shoulder is also shown (see Fig. 11b). As an illustrative example, a Bland–Altman plot for subject S11 is depicted in Fig. 12. It compares both modalities, ground truth and IRT, as well as the three fusion algorithms. Applying the median, a mean difference of 0.611 bpm was reached; the limits of agreement ranged from -6.551 to 7.773 bpm. Comparable results were obtained by using the signal with the best quality (best SQI): the bias was 0.595 bpm and the limits of agreement varied between -6.709 and 7.899 bpm. With the last approach, Bayesian fusion, a mean difference of 0.833 bpm was obtained and the 95 % limits of agreement ranged from -7.128 to 8.795 bpm.
5 Discussion

A novel and robust approach using IRT was developed to monitor breathing rate unobtrusively, without contact and with outstanding performance. Measurement of BR is of high
Fig. 10 Box plot comparing the RMSEs of fused signals with those of each single ROI (nose, mouth and left and right shoulder). In this case, fusion was performed by using the best SQI. The breathing signals correspond to phase B
Fig. 11 a Estimated BR corresponding to candidate S8 (blue solid line piezoplethysmography, green dashed line IRT). The Bayesian approach was used for data fusion. b Breathing waveform of the left shoulder. Specific breathing patterns such as deep breaths and Cheyne–Stokes respiration are highlighted in red and green, respectively
Fig. 12 Bland–Altman plot for subject S11 (phase B). It compares all fusion approaches, median (red), best SQI (gray) and Bayesian fusion (green). Mean differences and limits of agreement are depicted on the right side of the diagram
clinical relevance, since several studies have shown that it is one of the most poorly documented vital parameters, owing to limitations of clinically established monitoring techniques (for instance, the need to attach sensors). In contrast to existing published studies [1, 14, 15, 24, 30, 34, 35], not only
the temperature modulation around the nostrils was used, but also the temperature variation around the mouth as well as the upward and downward movement of the shoulders. Although considering only the thermal information around the nostrils already leads to encouraging results, it
relies on the fact that the ROI, i.e. the nose of the patient, is permanently in the field of view of the infrared camera. Since this is not always the case, such solutions are not appropriate for long-term monitoring. Hence, if neither nose nor mouth is visible, our algorithm can use the information of the shoulders to estimate BR. To analyze the robustness and feasibility of the developed algorithm, a study on twelve healthy subjects was carried out. During phase A, the volunteers were advised to breathe normally, first through the nose, afterwards through the mouth, and lastly through both nose and mouth. The candidates remained still, and no abrupt movements were evidenced. Since the performance of the algorithm in the three stages, (1) nasal breathing, (2) open-mouth breathing and (3) nasal or open-mouth breathing, was relatively similar, these results are neither included, compared nor discussed in the manuscript. Table 2 shows an outstanding agreement between gold standard (piezoplethysmography) and thermal imaging. The fusion approaches best SQI and Bayesian fusion demonstrated better correlation coefficients (mean correlations of 0.98 and 0.97) as well as smaller root-mean-square errors (mean RMSEs of 0.28 and 0.34 bpm, respectively). By using the former fusion method, we obtained an error smaller than 1 bpm in 97.64 % of the cases. Equally important is the contribution of sensor fusion to improved precision and accuracy. Figure 7 compares the RMSE of the fused signal (using best SQI) with those of each single ROI. As shown in the box plot, the median and interquartile range (IQR), i.e. the statistical dispersion, are visibly smaller when signals are fused. Only the data extracted from the left shoulder present a marginally smaller median; on the other hand, their IQR is more than threefold higher. Thus, by combining data from multiple sensors, better accuracy could be reached than by using only a single ROI/sensor.
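The fusion step can be pictured with a small sketch. This is only a schematic rendering of the median and best-SQI ideas discussed above; the SQI values and BR numbers below are invented for illustration, and the paper's actual signal-quality index is not reproduced here.

```python
import numpy as np

def fuse_breathing_rates(estimates, sqi=None, method="median"):
    """Fuse per-ROI breathing-rate estimates (bpm) for one time window.

    estimates: dict ROI name -> BR estimate (bpm)
    sqi:       dict ROI name -> signal-quality index (higher = better),
               required for method="best_sqi"
    """
    if method == "median":
        return float(np.median(list(estimates.values())))
    if method == "best_sqi":
        best_roi = max(sqi, key=sqi.get)   # ROI with the highest SQI
        return float(estimates[best_roi])
    raise ValueError(f"unknown fusion method: {method!r}")

# hypothetical window: four ROI estimates, right shoulder corrupted
br  = {"nose": 12.2, "mouth": 12.0, "left_shoulder": 11.6, "right_shoulder": 14.5}
sqi = {"nose": 0.92, "mouth": 0.85, "left_shoulder": 0.60, "right_shoulder": 0.30}

br_median = fuse_breathing_rates(br, method="median")
br_best   = fuse_breathing_rates(br, sqi, method="best_sqi")
```

In this toy window, both strategies suppress the corrupted right-shoulder estimate, which matches the observation above that fusion reduces the dispersion of the RMSE compared with any single ROI.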
Figure 8a, which depicts the breathing waveforms acquired with IRT and the reference signal, exhibits a very good agreement between them. In addition, it is possible to observe that the nose and mouth waveforms are mirrored. This is due to the exhalation of warm air from the lungs and the inhalation of cold air from the environment, leading to an increase and a decrease of the temperature, respectively. The same figure shows a transition between two stages (nasal breathing/open-mouth breathing) at time point 180 s. Whereas during the first stage no temperature modulation around the mouth is visible (the signal contains purely white noise), during the second stage (open-mouth breathing) a typical respiratory waveform is nevertheless noticeable in the nose signal. From these facts we can infer that, independently of the type of breathing (whether nasal- or
mouth-respiration), the temperature around the nostrils is always suitable for BR estimation. The examples presented in Figs. 8b and 9 corroborate the great performance of our method. Here too, the excellent agreement between breathing rates (IRT and piezoplethysmography) is undeniable, as shown by the very low mean differences and narrow limits of agreement. Lewis et al. [24] published a work in which they estimated BR and relative tidal volume in thermal video sequences. Using two different cameras, an SC-6000 [320 × 240 pixel resolution and thermal sensitivity of 0.08 °C (Indigo System Inc., Goleta, CA)] and a TVS-700 [640 × 510 pixel resolution and thermal sensitivity of 0.02 °C (FLIR Inc., Santa Barbara, CA)], they achieved comparable correlation coefficients, 0.95 ± 0.05 (N = 6) and 0.98 ± 0.02 (N = 12), respectively. In 2015, our research group published a paper introducing an algorithm with the same purpose, i.e. BR estimation based on nasal temperature modulation [34]. There, a mean absolute error of 0.33 bpm and a correlation coefficient of 0.97 ± 0.02 were achieved. All approaches presented outstanding results, as demonstrated by the very good correlation coefficients and small RMSEs between IRT and ground truth. However, despite the exceptional and comparable results, the last two papers [24, 34] only considered the temperature around the nostrils, which makes them not fully appropriate for feasible and robust long-term clinical monitoring. Huang et al. [20] published a paper in which a gauze mask was used to measure respiratory airflow, and from it, BR. Twenty breathing cycles of the subjects were measured and analyzed (a) under resting conditions and (b) after 30 s of exercise. Excellent results were obtained; the absolute error averaged 0.14 bpm and the system accuracy reached 99.30 %. Unfortunately, the number of subjects enrolled was not indicated. Phase B aimed to examine clinically relevant scenarios.
Therefore, the candidates were instructed to simulate physiological and pathological breathing patterns such as tachypnea and apnea, as well as Cheyne–Stokes and Kussmaul respiration. Table 3 demonstrates a remarkable agreement between reference and thermography. The three fusion algorithms presented comparable correlation coefficients and root-mean-square errors. The best results were achieved using the best SQI, namely a mean correlation of 0.95 and a mean RMSE of 3.36 bpm. The box plot of Fig. 10 compares the RMSE of the fused signal (using best SQI) with the RMSE of each single region of interest. As depicted, sensor fusion contributes to a decrease in the root-mean-square error and, consequently, to an improvement in accuracy, as expected. As represented in the box plot, the median and the first and third quartiles are noticeably smaller for the fused signals.
Figure 11a shows the breathing rate estimated with IRT and piezoplethysmography for candidate S8. An excellent correlation between the signals is noticeable. A Bland–Altman diagram comparing both methods for volunteer S11 is depicted in Fig. 12. Table 3 together with Fig. 12 show higher root-mean-square errors when compared with Table 2. These errors are equally visible in the Bland–Altman plot of Fig. 12; they correspond to the outliers illustrated by the blue shaded regions. Phase B is characterized by abrupt changes in BR (e.g. 30 bpm) (see Figs. 5, 11); therefore, even minimal delays between reference and IRT in these passages result in large errors, which obviously have a negative impact on the RMSE given in Table 3. These large discrepancies occurred at the nine transitions (see Figs. 5, 11): 60 s, eupnea/tachypnea; 120 s, tachypnea/eupnea; 240 s, Kussmaul breathing/apnea; 300 s, apnea/eupnea; 390 s, 420 s and 450 s, Cheyne–Stokes respiration/apnea/Cheyne–Stokes respiration; 480 s, apnea/tachypnea; 540 s, tachypnea/eupnea. Furthermore, as mentioned in Sect. 4, approximately 90 % of the errors were smaller than 2 bpm. This and Fig. 11 evidence that the errors outside these transitions were actually small. In addition, methods based on different measurement principles are compared, which may contribute to an increase in the errors. From Fig. 11b, it can be clearly seen that IRT also captures pattern changes, such as deep breaths, apnea and Cheyne–Stokes respiration, very well. For instance, the typical pattern of Cheyne–Stokes respiration, characterized by a gradual increase followed by a gradual decrease in depth as well as by a period of apnea, is clearly visible. The detection of an atypical respiration (abnormal pattern or rate) might be of paramount importance for an earlier and faster medical diagnosis (e.g. Cheyne–Stokes respiration is associated with cerebral ischemia, congestive heart failure, etc.).
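The effect of transition delays on the RMSE can be made concrete with a toy calculation (the signals below are synthetic, not the study recordings): even when IRT and reference agree perfectly everywhere else, detecting a single eupnea-to-tachypnea step a few seconds late already produces an RMSE of several bpm.

```python
import numpy as np

# synthetic BR traces at 1 Hz: 60 s eupnea (12 bpm), then tachypnea (30 bpm)
ref = np.concatenate([np.full(60, 12.0), np.full(60, 30.0)])
# identical estimate, but the transition is detected 3 s late
irt = np.concatenate([np.full(63, 12.0), np.full(57, 30.0)])

rmse = np.sqrt(np.mean((irt - ref) ** 2))
# only 3 of 120 samples differ (each by 18 bpm), yet rmse is about 2.85 bpm
```

This is consistent with the phase B RMSEs in Table 3 being an order of magnitude larger than those of phase A even though roughly 90 % of the individual errors stayed below 2 bpm.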
Figures 7 and 10 show visible differences between the breathing rates extracted from the left and right shoulder. Although we tried to maintain a front-view, eye-level angle, this was not always possible due to the natural movement of the candidates. Our approach overcomes some issues of former works. However, there are still other challenges that need to be addressed in future work. First, the effects of oxygen insufflation via masks and nasal cannulae must be analyzed. For insufflation cannulae, preliminary results have demonstrated that IRT measurements are still feasible, since the nostrils are not completely covered; as a result, it is still possible to observe the typical temperature modulation around this region. Second, new camera positions, e.g. a low-angle camera (camera placed below the patient's eyes, i.e. tilted up), should be tested and our approach
adjusted. In this study, the camera was always placed at the subject's height (front-view, eye-level angle), and thus the algorithm was primarily optimized for this position. Third, new measurements in patients or subjects lying in bed should be carried out in the future. For this, the position of the camera needs to be adjusted and different sleeping/lying positions considered (e.g. supine, prone, lying on the side). Fourth, although a tracking algorithm for minor arbitrary movements was applied, major motion artifacts (e.g. head rotations) were neither considered nor evaluated. Therefore, in a future study their influence on the estimated breathing waveform must be investigated. Fifth, although several pathological breathing patterns were simulated, they were still performed by healthy young volunteers. Therefore, the current approach needs to be validated in patients suffering from different breathing problems (e.g. decreased respiratory depth), since these can affect its performance. However, we do not expect any severe obstacles. Sixth, the performance of the method in newborn/preterm infants should be validated. Regarding thermal changes around the nose, studies have demonstrated that, despite their small amplitude, it is still possible to measure them with commercially available thermal cameras [1]. The motion of the shoulders, in turn, has not yet been investigated in neonates, in contrast to adults [39]. Therefore, further studies need to be conducted to analyze whether it is possible to detect it.
6 Conclusions

There is a current demand for unobtrusive and contactless techniques for monitoring vital signs. Diverse alternatives have been proposed based on Doppler radar or imaging sensors (visible, near-infrared and long-wave infrared sensors). In the current paper we introduced a novel, robust and accurate approach for monitoring breathing function using infrared thermography. It automatically detects four different regions of interest and includes a tracking algorithm for motion compensation. The signals from the four ROIs are then fused to increase accuracy. On the one hand, the results have demonstrated that the proposed method permits accurate estimation of breathing rate in diverse events/scenarios (physiological or pathological). On the other hand, it captures the different respiratory patterns, which may be helpful for an earlier and faster medical diagnosis. Therefore, we truly believe that the presented infrared imaging approach may be suitable for clinical purposes. In fact, this technique is well suited for long-term monitoring, since it discards the need for any sensor that requires physical attachment to the patient and may cause unnecessary stress and discomfort.
In order to test the reliability and feasibility of the algorithm under real conditions, a clinical study should be carried out in the near future. In addition, it would be interesting to develop an algorithm that does not use any anatomical region to estimate BR and instead treats the thermal video sequences as black boxes.

Acknowledgments C. B. Pereira wishes to acknowledge FCT (Foundation for Science and Technology in Portugal) for her Ph.D. Grant SFRH/BD/84357/2012.

Compliance with ethical standards

Conflict of interest The authors declare that they have no conflict of interest.
References

1. Abbas AK, Heimann K, Jergus K, Orlikowsky T, Leonhardt S. Neonatal non-contact respiratory monitoring based on real-time infrared thermography. Biomed Eng Online. 2011;10:93.
2. Al-Khalidi FQ, Saatchi R, Burke D, Elphick H, Tan S. Respiration rate monitoring methods: a review. Pediatr Pulmonol. 2011;46(6):523–9.
3. Atkins JH, Mandel JE. Performance of Masimo rainbow acoustic monitoring for tracking changing respiratory rates under laryngeal mask airway general anesthesia for surgical procedures in the operating room: a prospective observational study. Anesth Analg. 2014;119(6):1307–14.
4. Bartula M, Tigges T, Muehlsteff J. Camera-based system for contactless monitoring of respiration. In: Proceedings of IEEE engineering in medicine and biology society (IEEE, 2013). 2013. pp 2672–5.
5. Braun SR. Respiratory rate and pattern. In: Walker HK, Hall WD, Hurst JW, editors. Clinical methods: the history, physical, and laboratory examinations. 3rd ed. Boston: Butterworths; 1990 (chap 43).
6. Buist M, Bernard S, Nguyen TV, Moore G, Anderson J. Association between clinically abnormal observations and subsequent in-hospital mortality: a prospective study. Resuscitation. 2004;62(2):137–41.
7. Chen F, Wu H, Hsu PL, Stronger B, Sheridan R, Ma H. SmartPad: a wireless, adhesive-electrode-free, autonomous ECG acquisition system. In: Proceedings of IEEE engineering in medicine and biology society (IEEE, 2008). 2008. pp 2345–8.
8. Cretikos M, Chen J, Hillman K, Bellomo R, Finfer S, Flabouris A; MERIT study investigators. The objective medical emergency team activation criteria: a case-control study. Resuscitation. 2007;73(1):62–72.
9. Cretikos MA, Bellomo R, Hillman K, Chen J, Finfer S, Flabouris A. Respiratory rate: the neglected vital sign. Med J Aust. 2008;188(11):657–9.
10. Czaplik M, Biener I, Dembinski R, Pelosi P, Soodt T, Schroeder W, Leonhardt S, Marx G, Rossaint R, Bickenbach J. Analysis of regional compliance in a porcine model of acute lung injury.
Respir Physiol Neurobiol. 2012;184(1):16–26.
11. Droitcour AD, Seto TB, Park BK, Yamada S, Vergara A, El Hourani C, Shing T, Yuen A, Lubecke VM, Boric-Lubecke O. Non-contact respiratory rate measurement validation for hospitalized patients. In: Proceedings of IEEE engineering in medicine and biology society (IEEE, 2009). 2009. pp 4812–15.
12. Duning T, Deppe M, Brand E, Stypmann J, Becht C, Heidbreder A, Young P. Brainstem involvement as a cause of central sleep apnea: pattern of microstructural cerebral damage in patients with cerebral microangiopathy. PLoS ONE. 2013;8(4):e60304.
13. Elliott M, Coventry A. Critical care: the eight vital signs of patient monitoring. Br J Nurs. 2012;21(10):621–5.
14. Fei J, Pavlidis I. Virtual thermistor. In: Proceedings of IEEE engineering in medicine and biology society (IEEE, 2007). 2007. pp 250–3.
15. Fei J, Pavlidis I. Thermistor at a distance: unobtrusive measurement of breathing. IEEE Trans Biomed Eng. 2010;57(4):988–98.
16. Feldman JL, Del Negro CA. Looking for inspiration: new perspectives on respiratory rhythm. Nat Rev Neurosci. 2006;7(3):232.
17. Guechi Y, Pichot A, Frasca D, Rayeh-Pelardy F, Lardeur JY, Mimoz O. Assessment of noninvasive acoustic respiration rate monitoring in patients admitted to an Emergency Department for drug or alcoholic poisoning. J Clin Monit Comput. 2015;29(6):721–6.
18. Hall JE, Guyton AC. Guyton and Hall textbook of medical physiology. 12th ed. Philadelphia: Saunders; 2010.
19. Hess DR, MacIntyre NR, Galvin WF, Mishoe SC. Respiratory care: principles and practice. Burlington: Jones & Bartlett Publishers; 2015.
20. Huang Y, Young M, Huang K. Respiratory rate monitoring gauze mask system based on a pyroelectric transducer. In: 2008 2nd international conference on bioinformatics and biomedical engineering, Shanghai. 2008. pp 1648–9.
21. Jevon P. How to ensure patient observations lead to prompt identification of tachypnoea. Nurs Times. 2010;106(2):12–4.
22. Kowalak JP, editor. Lippincott's nursing procedures. 5th ed. Baltimore: Lippincott Williams & Wilkins; 2009.
23. Leonhardt S, Lachmann B. Electrical impedance tomography: the holy grail of ventilation and perfusion monitoring? Intensive Care Med. 2012;38(12):1917–29.
24. Lewis GF, Gatto RG, Porges SW.
A novel method for extracting respiration rate and relative tidal volume from infrared thermography. Psychophysiology. 2011;48(7):877–87.
25. Liao P, Chen T, Chung P. A fast algorithm for multi-level thresholding. J Inform Sci Eng. 2001;17(5):713–27.
26. Lindh WQ, Pooler M, Tamparo CD, Dahl BM, Morris J. Delmar's comprehensive medical assisting: administrative and clinical competencies. Boston: Cengage Learning; 2013.
27. McGain F, Cretikos MA, Jones D, Van Dyk S, Buist MD, Opdam H, Pellegrino V, Robertson MS, Bellomo R. Documentation of clinical review and vital signs after major surgery. Med J Aust. 2008;189(7):380–3.
28. Mei X, Ling H. Robust visual tracking and vehicle classification via sparse representation. IEEE Trans Pattern Anal Mach Intell. 2011;33(11):2259–72.
29. Mimoz O, Benard T, Gaucher A, Frasca D, Debaene B. Accuracy of respiratory rate monitoring using a non-invasive acoustic method after general anaesthesia. Br J Anaesth. 2012;108(5):872–5.
30. Murthy R, Pavlidis I, Tsiamyrtzis P. Touchless monitoring of breathing function. In: Proceedings of IEEE engineering in medicine and biology society (IEEE, 2004). 2004. pp 1196–9.
31. O'Sullivan SB, Schmitz TJ, Fulk G. Physical rehabilitation. Philadelphia: F.A. Davis; 2013.
32. Otsu N. A threshold selection method from gray-level histograms. IEEE Trans Syst Man Cybern. 1979;9(1):62–6.
33. Pereira CB, Czaplik M, Blanik N, Rossaint R, Blazek V, Leonhardt S. Contact-free monitoring of circulation and perfusion dynamics based on the analysis of thermal imagery. Biomed Opt Expr. 2014;5(4):1075–89. doi:10.1364/BOE.5.001075.
34. Pereira CB, Yu X, Blazek V, Leonhardt S. Robust remote monitoring of breathing function by using infrared thermography. In: Conference proceedings: ... annual international conference of the IEEE engineering in medicine and biology society. 2015a. pp 4250–3.
35. Pereira CB, Yu X, Czaplik M, Rossaint R, Blazek V, Leonhardt S. Remote monitoring of breathing dynamics using infrared thermography. Biomed Opt Expr. 2015b;6(11):4378–94.
36. Poh MZ, McDuff DJ, Picard RW. Non-contact, automated cardiac pulse measurements using video imaging and blind source separation. Opt Expr. 2010;18(10):10762–74.
37. Riha RL. Diagnostic approaches to respiratory sleep disorders. J Thorac Dis. 2015;7(8):1373–84.
38. Rodríguez-Molinero A, Narvaiza L, Ruiz J, Gálvez-Barrón C. Normal respiratory rate and peripheral blood oxygen saturation in the elderly population. J Am Geriatr Soc. 2013;61(12):2238–40.
39. Shao D, Yang Y, Liu C, Tsow F, Yu H, Tao N. Noncontact monitoring breathing pattern, exhalation flow rate and pulse transit time. IEEE Trans Biomed Eng. 2014;61(11):2760–7.
40. Siccoli MM, Valko PO, Hermann DM, Bassetti CL. Central periodic breathing during sleep in 74 patients with acute ischemic stroke—neurogenic and cardiogenic factors. J Neurol. 2008;255(11):1687–92.
41. Soto RG, Fu ES, Vila H, Miguel RV. Capnography accurately detects apnea during monitored anesthesia care. Anesth Analg. 2004;99(2):379–82.
42. Strauß R, Ewig S, Richter K, König T, Heller G, Bauer TT. The prognostic significance of respiratory rate in patients with pneumonia: a retrospective analysis of data from 705,928 hospitalized patients in Germany from 2010–2012. Deutsches Ärzteblatt International. 2014;111(29–30):503–8.
43. Treas LS, Wilkinson JM. Basic nursing: concepts, skills, & reasoning. Philadelphia: F.A. Davis; 2013.
44. Trobec R, Rashkovska A, Avbelj V. Two proximal skin electrodes–a respiration rate body sensor. Sensors. 2012;12(10):13813–28.
45. Wartzek T, Brüser C, Walter M, Leonhardt S. Robust sensor fusion of unobtrusively measured heart rate. IEEE J Biomed Health Inform. 2014;18(2):654–60.
46. White G. Basic clinical lab competencies for respiratory care: an integrated approach. Boston: Cengage Learning; 2012.
47. Yoost B, Crawford L. Fundamentals of nursing: active learning for collaborative practice. 1st ed. St. Louis, MO; 2015.
48. Yuan G, Drost NA, McIvor R. Respiratory rate and breathing pattern. McMaster Univ Med J. 2013;10(1):23–8.
49. Zordan VB, Celly B, Chiu B, DiLorenzo PC. Breathe easy: model and control of human respiration for computer animation. Graph Models. 2006;68(2):113–32.