
Motion tracking in infrared imaging for quantitative medical diagnostic applications.

Abstract

In medical applications, infrared (IR) thermography is used to detect and examine the thermal signature of skin abnormalities by quantitatively analyzing skin temperature in steady-state conditions or its evolution over time, captured in an image sequence. However, during the image acquisition period, involuntary movements of the patient are unavoidable, and such movements undermine the accuracy of temperature measurement for any particular location on the skin. In this study, a tracking approach using a template-based algorithm is proposed to follow the involuntary motion of the subject in the IR image sequence. The motion tracking makes it possible to associate a temperature evolution with each spatial location on the body while the body moves relative to the image frame. The affine transformation model is adopted to estimate the motion parameters of the template image. The Lucas-Kanade algorithm is applied to search for the optimized parameters of the affine transformation. A weighting mask is incorporated into the algorithm to ensure its tracking robustness. To evaluate the feasibility of the tracking approach, two sets of IR image sequences with random in-plane motion were tested in our experiments. A steady-state (no heating or cooling) IR image sequence in which the skin temperature is in equilibrium with the environment was considered first. The thermal recovery IR image sequence, acquired while the skin is recovering from 60-s cooling, was the second case analyzed. By proper selection of the template image along with template update, satisfactory tracking results were obtained for both IR image sequences. The achieved tracking accuracies are promising in terms of satisfying the demands imposed by clinical applications of IR thermography.



1. Introduction

1.1. Background

IR thermography is a non-ionizing and non-invasive imaging modality that has been regaining interest in clinical medicine in recent years. This resurgence of interest can be attributed to the dramatic advances in IR camera and computer technology, novel image processing algorithms, and the progress in IR sensors since the 1990s. The availability of high-sensitivity, high-resolution IR detectors at a reasonable cost, along with the miniaturization of cameras, has enabled the development of low-cost, portable systems suitable for quantitative diagnostic applications in medicine. IR diagnostic techniques rely on the hypothesis that a distinct thermal signature associated with a medical condition can be detected and quantified using modern camera technology. Quantitative imaging requires high-accuracy measurements, with high sensitivity and high spatial resolution, which can be static or dynamic depending on the application. The common goal in thermographic diagnostic applications is to measure skin temperature as a function of location and time with high accuracy. These measurements are complex, since the shape of the surface, the surface properties and the motion of the subject all influence the temperature data derived from the signals captured by the sensor. This paper addresses the influence of the motion of the subject on IR temperature measurements and proposes methods to compensate for this motion in order to achieve high-accuracy temperature measurements.
Infrared imaging can be implemented either as a static or as a dynamic technique in medical diagnostic applications. In static imaging applications a steady-state situation is analyzed: the subject is typically exposed to normal ambient conditions and the spatial distribution of thermal contrasts on the body is measured and analyzed. Dynamic IR imaging detects both spatial and temporal variations of the emitted IR radiation, and this signal is related to skin temperature during postprocessing. In dynamic IR imaging, a thermal excitation, such as cooling or heating, is applied to the skin surface prior to image acquisition. During and following the thermal excitation a sequence of consecutive image frames is acquired, and the motion of the subject during the image acquisition poses a significant challenge for accurate surface temperature measurements. The phase after the removal of the thermal excitation is the thermal recovery process, which is frequently analyzed in medical imaging applications. By analyzing the thermal recovery of the skin temperature, with temperature variations in the range of hundreds of millikelvins, abnormalities such as the malignancy of skin lesions can be examined, quantified and potentially diagnosed (with appropriate calibration and clinical validation) [17].
During the acquisition period, the patient's involuntary movements (breathing or small movements of the body and limbs) hinder accurate temperature recording at any particular point on the skin, thereby undermining the validity and accuracy of the local temperature analysis and the associated diagnosis. As shown in Fig. 1, in clinical applications of medical IR imaging [2,4], the patient is typically motionless, positioned on a fixed exam chair or bed, and the IR camera is oriented towards the lesion on the skin with the camera axis perpendicular to the skin surface. Since the out-of-plane movements of the patient are limited and restrained by the exam chair, they result in a predominantly linear in-plane motion in the image sequence, as illustrated in Fig. 2. This linear in-plane motion is considered to be the major source of motion encountered in the clinical IR image analysis.
Fig. 1
Schematic of subject's motion due to respiration and small involuntary movements of the body and limbs in the clinical IR imaging environment with the patient positioned in the exam chair.
Fig. 2
Sample infrared images illustrating the subject's in-plane motion in the acquired IR image sequence. The white arrows show the motion direction relative to the previous frame (the adjacent frame on the left hand side), the rectangle is the template of known ...
Although such in-plane motion is relatively small and limited in range, it still hinders the successive measurement of temperature at a particular location on the skin surface in the IR image sequence. For example, early stage melanoma lesions are small (of the order of a few millimeters), and therefore movements of the order of a millimeter will hinder the localization of any subtle thermal features [5]. The measured temperature differences are often very small as well (a fraction of a degree) and the feature to be analyzed (the lesion) cannot be detected directly in the IR image prior to image processing. In static IR imaging, in which the temperature of the skin is approximately constant over time, accurate tracking can locate the lesion automatically in the IR image without the aid of a visible landmark. In dynamic IR imaging the temperature of the skin and the signal detected by the camera sensor change over time [24]. In such applications motion tracking is crucial for recording the temperature evolution accurately over the image sequence.

1.2. Literature review

Tracking a moving object in an image sequence is an active topic in civil and military applications. Initial applications were developed for white light imaging; however, motion and target tracking in IR image sequences has become the subject of increasing interest in recent years as well. Due to the lower cost and rapid improvement of IR technology, object tracking has also been widely used in IR imaging, for example in pedestrian detection for surveillance purposes [8–10]. In IR surveillance applications the target typically stands out against the background, it often covers a relatively small fraction of the total image frame area, the range of motion is large and accurate temperature information is generally not of interest.
Only a few researchers have tackled the motion tracking challenge in IR imaging in medical applications [5,6]. The key difference between surveillance and medical applications is the smaller thermal contrast between the diseased and healthy tissue, which is often not detectable without sophisticated image processing. Also, the target often covers a significant portion of the image frame, the range of motion of the subject is smaller, and accurate quantitative detection of temporal as well as spatial variations is essential. Most of the past clinical applications of IR imaging focused on breast cancer detection. To enable the wide use of dynamic IR imaging in breast cancer screening [11–13], a motion artifact reduction approach using image sequence realignment with marker-based image registration was proposed [14–16]. Image registration involves transforming different data sets into one coordinate system. In medical imaging and computer vision applications registration is essential in order to allow the comparison or integration of data obtained from different measurements and sources, in our application from white light and infrared images. In breast cancer imaging [14–16], the registration of each IR image frame is achieved by aligning 5–18 markers (which are directly visible in the successive images) to serve as control points in the IR image sequence. The approach can achieve accurate tracking performance in terms of signal-to-noise ratio (SNR).
In the Heat Transfer Lab of Johns Hopkins University, we have developed a dynamic IR imaging system for the detection of melanoma [27]. The system relies on thermo-stimulation with external cooling and compares the transient thermal response of cancerous lesions and healthy skin. As shown by Cetingul et al. [5] and Herman [6], based on the quadratic motion model for landmark-based registration, a sequence of IR images acquired during the thermal recovery phase can be aligned to compensate for involuntary motion of the patient. The motion compensation enables an accurate measurement of temperature differences between lesions and healthy skin, and provides quantitative information to identify the malignancy of lesions. Without motion tracking, the measurement errors are too large to detect temperature differences that are indicative of malignancy [46]. Based on the experience gained in our previous clinical studies, in this work a template-based tracking scheme is applied to dynamic IR imaging.
In the field of computer vision, automatic methods for tracking a moving object in an image sequence fall into three main categories [17,18]: feature-based tracking [19,20], contour-based tracking [21,22], and region-based tracking [23,24]. In the absence of occlusions, region-based tracking methods perform well in terms of robustness and accuracy. Template-based tracking [25] falls into the category of region-based tracking methods. Compared with other region-based methods that use a non-parametric description of the region's content [26], template-based tracking uses the image content of a region directly to track the moving object. This is accomplished by extracting a template region in the first frame and finding the best-matching region in the following frames. In this way the moving object can be tracked in a video sequence.
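To make the region-matching idea concrete, the following sketch (our own Python illustration, not code from the cited works) performs the simplest form of template matching: exhaustively sliding the template over a frame and selecting the translation with the smallest sum of squared differences. The parametric, gradient-based search used in this paper (Section 2.2) replaces this brute-force search with an iterative optimization over affine warp parameters.

import numpy as np

def ssd_match(frame, template):
    """Return the (row, col) top-left position where the template best matches
    the frame, using the sum of squared differences (SSD) as the criterion."""
    frame = frame.astype(float)
    template = template.astype(float)
    fh, fw = frame.shape
    th, tw = template.shape
    best_ssd, best_pos = np.inf, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            ssd = np.sum((frame[r:r + th, c:c + tw] - template) ** 2)
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos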
The template-based tracking method was originally based on the assumption that the object's appearance remains constant throughout the video sequence. However, in practical IR imaging applications this assumption often holds only for a certain period of time. The appearance of the object can change significantly with time and with changes in the environment, for example as the temperature of the object changes during thermal recovery in dynamic IR imaging. Considering the tracking error due to the violation of this assumption, improvements to the tracking algorithm, including template update [27] and the inclusion of a robust weighting mask for template matching [18], are proposed in this paper. Building on our preliminary results [28], in this study we introduce the tracking method and validate it on two characteristic classes of problems relevant for medical applications, the static and dynamic imaging of skin temperature.

2. Algorithms and methods

2.1. Template-based tracking algorithm

In the clinical application of IR imaging a marker (a rectangular shape applied to the skin in our experiment, Fig. 3b) is commonly used for the registration of the skin lesion [2,4]. This marker can simultaneously serve as the landmark of consistent appearance in the template region. The template image is a sub-region of an image which contains the object (a skin lesion, for example) to be tracked in the image sequence. Template-based tracking is achieved by estimating the coordinate alignment between the template image and the consecutive frames in a given video sequence. Since the involuntary movements of the patient are mainly in-plane motions with a limited range, the template-based method is well suited for motion tracking in medical IR imaging. In this paper, the method is described using the notation of Matthews et al. [27].
Fig. 3
(a) The Merlin midwave infrared camera and the IR image acquisition system; (b) Two paper adhesive markers: the square one serves as the tracking template and the round one represents the simulated lesion for error analysis; (c) IR image of the adhesive ...
The alignment of images in a sequence can be parameterized as a warp function, which transforms the pixel coordinate x in the template image to a new coordinate W(x;p) in the subsequent image frame. Here p denotes the transformation parameters p = (p_1, ..., p_n)^T of the warp function. In our application, a sub-region containing the object in the initial IR image frame I_0(x) is extracted to be the template image T(x). In a subsequent image frame I(x), each template pixel x, with value T(x), is mapped by the warp W(x;p) to the image value I(W(x;p)). If the tracking is valid, this value will be similar to its counterpart in the template image. Therefore, the optimized parameters p are sought to achieve the best match,
\[ I(W(\mathbf{x};\mathbf{p})) \approx T(\mathbf{x}). \tag{1} \]
In template-based tracking, the 2D affine transformation is utilized as the warp function W(x;p). Linear transformations, including translation, rotation, shear mapping, and scaling, are all taken into account, and any two parallel lines remain parallel after the transformation. The affine warp consists of 6 independent parameters (Section 2.2), p = (p_1, p_2, p_3, p_4, p_5, p_6)^T, which model the transformation as
\[ W(\mathbf{x};\mathbf{p}) = \begin{pmatrix} 1+p_1 & p_3 & p_5 \\ p_2 & 1+p_4 & p_6 \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}, \tag{2} \]
where (p_5, p_6) are the two parameters describing translation, (p_1, p_4) are the two parameters for the scaling of x and y, and (p_2, p_3) describe the angular change of each axis after warping.
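As an illustration of this parameterization, the short Python sketch below (our own, with hypothetical coordinate values) applies the affine warp of Eq. (2) to a set of pixel coordinates; the parameter ordering follows the convention stated above.

import numpy as np

def affine_warp(points, p):
    """Apply W(x; p) of Eq. (2) to an (N, 2) array of (x, y) pixel coordinates."""
    p1, p2, p3, p4, p5, p6 = p
    A = np.array([[1.0 + p1, p3, p5],
                  [p2, 1.0 + p4, p6]])
    xy1 = np.column_stack([points, np.ones(len(points))])  # homogeneous coordinates
    return xy1 @ A.T                                        # warped (N, 2) coordinates

# Example: a pure translation by (2, -1) pixels leaves all other parameters at zero.
corners = np.array([[0.0, 0.0], [85.0, 0.0], [85.0, 85.0], [0.0, 85.0]])
print(affine_warp(corners, [0, 0, 0, 0, 2.0, -1.0]))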

2.2. Lucas–Kanade approach

The first use of image alignment with a template reported in the technical literature is the Lucas–Kanade optical flow algorithm [25]. Since that work, the approach has become one of the most widely used methods in object tracking. In the remainder of this section, the notation of Baker and Matthews [29] is used to describe the algorithm. In order to search for the optimized parameters p of the warp function, the objective of the Lucas–Kanade algorithm is to minimize the sum of squared errors between the image content I(W(x;p)) and the template T(x),
\[ \sum_{\mathbf{x}} \left[ I(W(\mathbf{x};\mathbf{p})) - T(\mathbf{x}) \right]^2. \tag{3} \]
The minimization of Eq. (3) is a non-linear optimization problem. Based on a current estimate p for the warping parameters, the Lucas–Kanade algorithm iteratively solves for an increment Δp to update the current estimation using
\[ \mathbf{p}_{\mathrm{new}} = \mathbf{p} + \Delta\mathbf{p}, \tag{4} \]
and the expression (3) can be re-written as the error function
\[ \sum_{\mathbf{x}} \left[ I(W(\mathbf{x};\mathbf{p}+\Delta\mathbf{p})) - T(\mathbf{x}) \right]^2. \tag{5} \]
The iteration process will continue until the estimate of p converges, and the converged estimate will serve as the new set of optimized parameters p for the warp function W(x;p).
Furthermore, to linearize the error function (5), the first-order Taylor expansion is used to approximate I(W(x;p + Δp)) as
\[ I(W(\mathbf{x};\mathbf{p}+\Delta\mathbf{p})) \approx I(W(\mathbf{x};\mathbf{p})) + \nabla I \, \frac{\partial W}{\partial \mathbf{p}} \, \Delta\mathbf{p}. \tag{6} \]
Therefore, by substituting Eq. (6) into Eq. (5), the error function can be written as [29]
\[ \sum_{\mathbf{x}} \left[ I(W(\mathbf{x};\mathbf{p})) + \nabla I \, \frac{\partial W}{\partial \mathbf{p}} \, \Delta\mathbf{p} - T(\mathbf{x}) \right]^2, \tag{7} \]
where ∇I is the gradient of image I evaluated at the current warp W(x;p), and ∂W/∂p represents the Jacobian of the warp.
The affine transformation W(x;p) can be decomposed as W(x;p) = (W_x, W_y)^T, and the Jacobian of W(x;p) is represented as
\[ \frac{\partial W}{\partial \mathbf{p}} = \begin{pmatrix} \dfrac{\partial W_x}{\partial p_1} & \cdots & \dfrac{\partial W_x}{\partial p_6} \\[4pt] \dfrac{\partial W_y}{\partial p_1} & \cdots & \dfrac{\partial W_y}{\partial p_6} \end{pmatrix} = \begin{pmatrix} x & 0 & y & 0 & 1 & 0 \\ 0 & x & 0 & y & 0 & 1 \end{pmatrix}. \tag{8} \]
Based on Eq. (8), to minimize the error function (7), we first determine its partial derivative with respect to Δp and set it to zero. The closed-form solution for Δp can then be obtained as [29]
\[ \Delta\mathbf{p} = H^{-1} \sum_{\mathbf{x}} \left[ \nabla I \, \frac{\partial W}{\partial \mathbf{p}} \right]^T \left[ T(\mathbf{x}) - I(W(\mathbf{x};\mathbf{p})) \right]. \tag{9} \]
In Eq. (9), H is the n × n Hessian matrix, for which the Gauss–Newton approximation is adopted:
\[ H = \sum_{\mathbf{x}} \left[ \nabla I \, \frac{\partial W}{\partial \mathbf{p}} \right]^T \left[ \nabla I \, \frac{\partial W}{\partial \mathbf{p}} \right]. \tag{10} \]
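A minimal Python/NumPy sketch of one forward-additive Lucas–Kanade iteration (Eqs. (3)–(10)) is given below. It is our own illustration under simplifying assumptions (bilinear sampling, no convergence handling, function and variable names of our choosing), not the implementation used in this study.

import numpy as np
from scipy.ndimage import map_coordinates

def lk_affine_step(I, T, p):
    """One Gauss-Newton update dp for the affine warp, given the current image I,
    the template T (h x w) and the current parameter estimate p = (p1, ..., p6)."""
    h, w = T.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xs = xs.ravel().astype(float)
    ys = ys.ravel().astype(float)

    # Warp the template coordinates into I: W(x; p), Eq. (2)
    wx = (1 + p[0]) * xs + p[2] * ys + p[4]
    wy = p[1] * xs + (1 + p[3]) * ys + p[5]
    Iw = map_coordinates(I.astype(float), [wy, wx], order=1)   # I(W(x; p))

    # Gradient of I, sampled at the warped coordinates
    Gy, Gx = np.gradient(I.astype(float))
    Ix = map_coordinates(Gx, [wy, wx], order=1)
    Iy = map_coordinates(Gy, [wy, wx], order=1)

    # Steepest-descent images: grad(I) * dW/dp, with the Jacobian of Eq. (8)
    SD = np.column_stack([Ix * xs, Iy * xs, Ix * ys, Iy * ys, Ix, Iy])

    error = T.ravel().astype(float) - Iw                       # T(x) - I(W(x; p))
    H = SD.T @ SD                                              # Gauss-Newton Hessian, Eq. (10)
    dp = np.linalg.solve(H, SD.T @ error)                      # parameter update, Eq. (9)
    return dp

In practice the step is repeated, p ← p + Δp, until the norm of Δp falls below a chosen tolerance, as described above.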

2.3. The weighting function for robust tracking

Constant brightness over the entire image sequence is the primary assumption of template-based tracking. This assumption is often violated when brightness variations are unavoidable, such as in the imaging of transient processes. In our application the assumption holds for the static IR image sequence. For the dynamic IR image sequences this assumption is violated, since the purpose is to analyze the time-varying thermal signal of a skin lesion recovering from a cooling excitation.
Therefore, to tackle the problem of brightness variation in dynamic IR image data, the robust version of the Lucas–Kanade tracking algorithm [18,30,31] is implemented in our application. The robust version adds a weighting mask to the template pixels in the computation of the least-squares process. In this algorithm the pixels with brightness change in the template image are treated as outliers, and their contribution is suppressed in the computation. Furthermore, to improve the efficiency of the Lucas–Kanade algorithm, the inverse compositional algorithm [29–31] is also implemented in the robust version. In the inverse compositional algorithm the roles of the template image T(x) and the subsequent image I(x) are interchanged. Based on the notation of [30,31], a weighting mask M(x) of the same dimension as the template image is included in the least-squares process to replace Eq. (5):
\[ \sum_{\mathbf{x}} M(\mathbf{x}) \left[ I(W(\mathbf{x};\mathbf{p}+\Delta\mathbf{p})) - T(\mathbf{x}) \right]^2. \tag{11} \]
The closed-form expressions for the increment Δp, described by Eqs. (9) and (10), are replaced by Eqs. (12) and (13) in the robust version of the Lucas–Kanade tracking algorithm:
\[ \Delta\mathbf{p} = H^{-1} \sum_{\mathbf{x}} M(\mathbf{x}) \left[ \nabla T \, \frac{\partial W}{\partial \mathbf{p}} \right]^T \left[ I(W(\mathbf{x};\mathbf{p})) - T(\mathbf{x}) \right] \quad \text{and} \tag{12} \]
\[ H = \sum_{\mathbf{x}} M(\mathbf{x}) \left[ \nabla T \, \frac{\partial W}{\partial \mathbf{p}} \right]^T \left[ \nabla T \, \frac{\partial W}{\partial \mathbf{p}} \right]. \tag{13} \]
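The sketch below illustrates how the weighting mask enters Eqs. (12) and (13): the steepest-descent images are built from the template gradient, as in the inverse compositional formulation, and every pixel is weighted by M(x). It is a simplified illustration of ours (the warp-composition step of the full inverse compositional algorithm is omitted), not the authors' implementation, which is based on the MATLAB function cited in Section 2.6.

import numpy as np

def robust_update(T, Iw, M):
    """Weighted parameter update of Eqs. (12)-(13).
    T: template (h x w); Iw: current image warped onto the template grid, I(W(x;p));
    M: weighting mask in [0, 1] of the same shape. Returns dp of length 6."""
    h, w = T.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xs = xs.ravel().astype(float)
    ys = ys.ravel().astype(float)

    Ty, Tx = np.gradient(T.astype(float))              # template gradient
    Tx, Ty = Tx.ravel(), Ty.ravel()
    SD = np.column_stack([Tx * xs, Ty * xs, Tx * ys, Ty * ys, Tx, Ty])

    m = M.ravel().astype(float)
    err = Iw.ravel().astype(float) - T.ravel().astype(float)   # I(W(x;p)) - T(x)

    H = SD.T @ (SD * m[:, None])                       # weighted Hessian, Eq. (13)
    dp = np.linalg.solve(H, SD.T @ (m * err))          # weighted increment, Eq. (12)
    return dp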

2.4. Infrared image acquisition

To test the tracking performance of the algorithm, a subject's left hand undergoing deliberate random in-plane motion was imaged using an IR camera. Two types of IR image sequences were tested in this study:
  1. Steady-state IR image sequence: In the first imaging experiment, the temperature of the skin does not change with time, therefore the basic tracking performance can be tested under the constant brightness assumption.
  2. Thermal recovery IR image sequence: In our previous clinical application [2], the thermal recovery IR image sequence after cooling reveals critical information for detecting the malignancy of melanoma lesions. Thus in the second experiment, we first applied cooling to the skin for 60 s, and acquired the IR video after the cooling was removed. The temperature changes cause significant changes of brightness in the IR image sequence. Using these test images, the robustness of the tracking algorithm can be tested for the situation in which the skin temperature changes with time.
As shown in Fig. 3(a), the camera used in the experiment is a Merlin midwave (3–5 μm) infrared camera (MWIR) (FLIR Systems Inc., Wilsonville, OR), which has a temperature sensitivity of 0.025 °C. A 320 × 256 pixel focal plane array (FPA) is used to acquire 16-bit raw data at a frame rate of 60 Hz. Each image frame has a field of view (FOV) of 22 × 16 degrees. Using the parameters obtained from black body calibration [32], the camera generates an IR thermal image calibrated in temperature values.

2.5. Template image and the simulated lesion

In general, a skin lesion (early stage melanoma is of particular interest in the present study) is not directly identifiable in the IR image because the temperature difference between the lesion and the healthy skin is typically very small under steady-state conditions, i.e. it is of the order of the natural temperature fluctuations of the skin. The temperature difference increases during dynamic imaging, which imposes the need to track a particular physical location on the skin throughout an image sequence. To implement the template-based algorithm and evaluate its performance, we created two paper markers which are both visible in the IR image (Fig. 3(b) and (c)). One of them is a square marker of size 5 cm × 5 cm, which serves as the constant feature in the template image as shown in Fig. 3(c). The second one is a small, round marker, which is easy to segment in the IR image, so that we can simulate the lesion location and evaluate the tracking results quantitatively.

2.6. Weighting mask for the template image

As introduced in Section 2.3, to implement the robust version of the Lucas–Kanade algorithm [18,29–31], we used the normalized intensity of the template image as the weighting mask. In the template image, the image brightness is normalized to the range (0,1) with respect to the lowest and the highest temperature values. This normalized image serves as the weighting mask M(x) in Eq. (11). The weighting mask treats a pixel x as reliable if M(x) = 1, and a pixel x is considered an outlier if M(x) = 0. The two weighting masks for the steady-state and thermal recovery IR image sequences are displayed in Fig. 4a and b. It can be observed that the skin region outside the square marker, which has a higher and more constant temperature throughout the IR video, is highly weighted by the weighting mask (Fig. 4b, light colored region). On the other hand, the region exposed to cooling in the thermal recovery template image (Fig. 4b, darker region), which has the lowest temperature but the highest rate of brightness variation, is suppressed as unreliable outlier pixels by the weighting mask. This example explains why the normalized temperature image is taken as our weighting mask. The robust Lucas–Kanade algorithm with pixel weighting was implemented using the public domain MATLAB function shared by Dirk-Jan Kroon [33].
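For completeness, the normalization used to build the mask can be written in a few lines; this is a trivial sketch of the step described above, not the cited MATLAB code.

import numpy as np

def weighting_mask(template):
    """Normalize the template image to (0, 1) between its lowest and highest
    temperature values; the result is used directly as M(x)."""
    t = template.astype(float)
    return (t - t.min()) / (t.max() - t.min())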
Fig. 4
(a) Weighting mask of the template image for the steady-state IR image sequence, (b) weighting mask of the template image for the thermal recovery IR image sequence (created after the cooling is applied), (c) the appearance of the template images after ...

2.7. Determining the location of the simulated lesion for tracking error analysis

In order to evaluate and quantify the tracking performance of our algorithm, we need to know the actual location of the simulated lesion (round marker) so that it can be compared with the estimated location determined by the motion tracking algorithm. For this reason the simulated lesion, which is clearly visible in the IR image sequence, is used in the analysis instead of a real (cancerous or benign) skin lesion, which usually cannot be detected directly in the IR image. The centroid of the simulated lesion is taken as the reference coordinate; it is the mean value of the pixel coordinates of the marker's border. The coordinates of the border of the simulated lesion were determined using an outline segmentation algorithm, the random walker [34]. By selecting two seed points, one inside and one outside the closed region of an object, the random-walker algorithm can automatically segment the object's outline. After the simulated lesion outline is delineated, its centroid location can be obtained as the reference point.
When the tracking algorithm is applied, the centroid location (segmented in the initial frame) is predicted in the subsequent frames based on the parameters obtained by solving Eq. (12). Finally, knowing both the actual (from image segmentation) and predicted (from the tracking algorithm) centroid locations in the image sequence, the Euclidean distance between them is computed to quantify the tracking error.
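In code, the reference centroid and the tracking error reduce to the following sketch (our own notation; border_points would come from the random-walker segmentation).

import numpy as np

def centroid(border_points):
    """Centroid of the simulated lesion: mean of the (x, y) pixel coordinates
    on the segmented outline, given as an (N, 2) array."""
    return np.asarray(border_points, dtype=float).mean(axis=0)

def tracking_error(actual, predicted):
    """Euclidean distance (in pixels) between the actual and predicted centroids."""
    return float(np.linalg.norm(np.asarray(actual, dtype=float) - np.asarray(predicted, dtype=float)))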
In Experiment A (steady state), since no cooling is applied to the skin, the entire region of the simulated lesion is visually identifiable throughout the IR image sequence. Therefore, the outline of the simulated lesion can be directly segmented in each frame (Fig. 5) using the random-walker algorithm.
Fig. 5
(a) Adhesive markers and the segmented outline of the simulated lesion (red circle); (b) Red cross: the centroid location of the simulated lesion. (For interpretation of the references to colour in this figure legend, the reader is referred to the web ...
In Experiment B (during thermal recovery) the border of the simulated lesion is obstructed (masked) by the cooling spot, therefore the region of the simulated lesion is no longer clearly visible in the IR image (Fig. 6b) and the lesion outline cannot be segmented directly. Therefore we had to apply registration between the IR images before and after cooling in order to determine the lesion outline in the thermal recovery image sequence. To accomplish this, we used the steady-state image of the simulated lesion before cooling, for which the random-walker algorithm can be directly applied, to segment the outline. This lesion outline is then registered to the recovery image sequence via the locations of the four corners of the square marker, which are visible in the images both before and after cooling.
Fig. 6
Registration of the simulated lesion from the steady-state IR image (a) to the thermal recovery IR image (b) in Experiment B. The color change from (a) to (b) within the rectangular template is characteristic for the cooling process in dynamic IR imaging. ...
As shown in Fig. 6a, the lesion is first segmented in the image before cooling. Next, by identifying the locations of the four corners of the square marker in the images before and after cooling (Fig. 6b), a 2D projective transformation matrix is solved [35] based on the four corner correspondences. The identification of the four corners can be accomplished either manually or automatically using corner-detecting algorithms such as the Harris detector [36] or the Shi and Tomasi [37] method. The registration relation between these two frames can thus be established, and the lesion outline segmented in the image before cooling can then be mapped into the image after cooling (Fig. 6b). By applying the registration process consecutively to individual image frames in a sequence, the actual location of the lesion centroid can be determined for each frame of the thermal recovery sequence.
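Assuming an OpenCV-style implementation, the corner-based registration step can be sketched as follows; the corner and outline coordinates shown are hypothetical placeholders, and any standard four-point homography solver [35] could be used in place of this one.

import numpy as np
import cv2

# Four corners of the square marker before and after cooling (hypothetical values).
corners_before = np.float32([[120, 80], [206, 82], [204, 168], [118, 166]])
corners_after  = np.float32([[131, 74], [217, 77], [214, 163], [128, 160]])

# 2D projective (homography) matrix from the four corner correspondences.
H_proj = cv2.getPerspectiveTransform(corners_before, corners_after)

# Map the lesion outline segmented before cooling into the after-cooling frame.
outline_before = np.float32([[150, 120], [154, 118], [158, 121]])   # hypothetical outline points
outline_after = cv2.perspectiveTransform(outline_before.reshape(-1, 1, 2), H_proj).reshape(-1, 2)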

3. Results

3.1. Experiment A: Tracking performance for a steady-state IR image sequence

We applied the algorithm described in Section 2 to 23 frames of typical steady-state IR images with random in-plane motion incorporated. The tracking results, represented by 3 pairs of adjacent frames, are shown in Fig. 7. The magnified views of the predicted (circle) and actual (cross) locations of the simulated lesion are also shown in Fig. 7. Comparing the location predicted by the algorithm with the actual location measured by the segmentation, it can be seen that, despite the significant random in-plane motion, the two locations are very close to each other and the deviations are minimal.
Fig. 7
Tracking results for Experiment A in a steady-state image sequence (circle – location of the centroid predicted by the tracking algorithm, cross – actual centroid location of the simulated lesion). The magnified view of the simulated lesion ...
To evaluate the tracking error quantitatively, the Euclidean distance between the predicted and actual location of the lesion centroid in units of pixels is calculated for each frame, and the results are plotted as blue bars in Fig. 8. In addition, the displacement of the lesion centroid with respect to the previous frame (the displacement from frame i-1 to i) is also recorded in units of pixels. As evident from Fig. 8, despite the fact that the frame-to-frame displacement can exceed 35 pixels, the tracking errors over the entire image sequence are smaller than 3 pixels. This result suggests that the tracking algorithm performs very well even for relatively large in-plane motion. To translate these data into real-life dimensions, the actual size of the marker is introduced: 30 mm correspond to 86 pixels in the IR image. Therefore we can infer that 1 pixel corresponds to approximately 0.35 mm on the skin surface. Using this conversion, the highest tracking error of 3 pixels corresponds to about 1.05 mm in real dimensions. The displacement of 35 pixels corresponds to approximately 11 mm of motion amplitude in terms of physical dimensions, which is significant. In medical applications the motion amplitude of the subject is typically much smaller, of the order of a few millimeters. The measurement accuracy and spatial resolution can be further improved by improving the spatial resolution of the focal plane array of the camera.
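For reference, the pixel-to-millimeter conversion used above follows directly from the marker size:
\[ \frac{30\ \mathrm{mm}}{86\ \mathrm{px}} \approx 0.35\ \mathrm{mm/px}, \qquad 3\ \mathrm{px} \times 0.35\ \mathrm{mm/px} \approx 1.05\ \mathrm{mm}. \]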
Fig. 8
Tracking error analysis for the steady-state image sequence in experiment A: the red line represents the frame-to-frame displacement (from frame i-1 to frame i) of the lesion centroid, and the blue bars represent the Euclidean distance between the predicted ...

3.2. Experiment B-1: Tracking performance for the thermal recovery IR image sequence (initial time interval, within 30 s into thermal recovery)

The tracking performance for the thermal recovery image sequence from dynamic IR imaging with cooling is illustrated by the 3 pairs of adjacent frames in Fig. 9. In this trial, the first 23 IR images after the removal of the cooling excitation are considered. The frame rate is 1 frame/s, and the quantitative analysis of the tracking error at each frame is shown in Fig. 10. It can be observed that despite the moderate brightness variations during thermal recovery, the lesion centroid can still be tracked with reasonable accuracy within the first 23 frames. This corresponds to 23 s into the thermal recovery phase, and the highest tracking errors are smaller than 4 pixels. The results suggest that, despite the moderate brightness variations, the template-based algorithm performs with good accuracy in the first 23 s of the thermal recovery sequence. This is accomplished by taking the target region in the first frame as the template image and its normalized image as the weighting mask.
Fig. 9
Tracking performance of Experiment B-1 for the first 23 frames in the thermal recovery IR image sequence (circle – centroid location predicted by the tracking algorithm, cross – actual centroid location of the simulated lesion).
Fig. 10
Tracking error analysis (Experiment B-1) of the thermal recovery image sequence after the removal of cooling: the red line represents the frame-to-frame displacement (from frame i-1 to frame i) of the simulated centroid. The blue bars represent the Euclidean ...

3.3. Experiment B-2: Tracking performance for the thermal recovery IR image sequence (over 30 s into the thermal recovery phase) without template update

As we extended the tracking test over 60 frames (1 min) in the second trial, the tracking performance of the algorithm described in Section 3.2 deteriorated. The four frames shown in Fig. 11a illustrate the dramatic changes the IR image undergoes during thermal recovery. As a consequence of these changes, the tracking algorithm gradually loses track of the simulated lesion between frames 41 and 65. The error analysis in Fig. 11b indicates a dramatic increase of tracking errors after frame 29. The significant error growth can be attributed to the brightness inconsistency after 30 s into the thermal recovery phase, when the thermal appearance of the cooled region becomes substantially different from the template image recorded at the first frame. As indicated in Fig. 11b, the tracking errors exceed 5 pixels after frame 41. Therefore, to ensure the robustness of the tracking method, adjustments to the template-based algorithm were needed. These adjustments are expected to deal with the brightness inconsistencies encountered during the thermal recovery.
Fig. 11
Tracking performance for Experiment B-2 with tracking duration of 90 s into the thermal recovery phase without template update: (a) four representative image frames illustrating the differences between centroid location predicted by the algorithm and ...

3.4. Experiment B-3: Tracking performance of the thermal recovery IR image sequence with template update for long imaging times

To reduce the growth of tracking errors due to the brightness inconsistency after 30 s into the thermal recovery, instead of using a single template image taken at the initial time, we update the template image and the weighting mask at frame 29, as illustrated in Fig. 12. Frame 29 is chosen because it is the last frame with errors smaller than 5 pixels, before significant error growth is observed in Fig. 11b. This image frame has similar brightness to the rest of the frames recorded after 30 s. Thus the tracking procedure for longer imaging times has two stages: in the first stage the images from frames 1–29 are tracked using the initial template image. In the second stage, after the update at frame 29, the remaining frames are tracked using the updated template image (Fig. 12).
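Schematically, the two-stage procedure with a template update at frame 29 can be written as below; track_frame, extract_template and make_mask are hypothetical placeholders standing in for the robust Lucas–Kanade tracker (Sections 2.2–2.3), the template re-extraction, and the mask normalization of Section 2.6.

def track_sequence(frames, template, mask, track_frame, extract_template, make_mask,
                   update_frame=29):
    """Two-stage tracking: frames 1..update_frame use the initial template and mask;
    at update_frame the template and weighting mask are re-extracted and the
    remaining frames are tracked with the updated pair."""
    results = []
    for i, frame in enumerate(frames, start=1):
        p = track_frame(frame, template, mask)     # warp parameters for this frame
        results.append(p)
        if i == update_frame:
            template = extract_template(frame, p)  # updated template (second stage)
            mask = make_mask(template)             # updated weighting mask
    return results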
Fig. 12
(a) Updated template image, frame 29, (b) updated weighting mask for frame 29 and (c) resulting temperature value for the updated template image after applying the weighting mask.
The improvements achieved with the template update incorporated into the tracking algorithm are presented in Fig. 13. In Fig. 13a, we can observe that the simulated lesion is tracked correctly in the images after frame 30. The improved accuracy is obvious when compared with its counterpart shown in Fig. 11a (without template update). The tracking error (blue bars) and displacement magnitude (red line) charts for the experiment using template update are shown in Fig. 13b. When compared with the data in Fig. 11b, it is evident that the tracking errors are significantly reduced, to under 5 pixels, throughout the image sequence. The results suggest that, when applying the tracking algorithm to the thermal recovery image sequence, updating the template information at one (or more) selected time instant(s) can improve the tracking robustness for longer imaging times despite the brightness inconsistencies caused by cooling.
Fig. 13
Tracking performance of Experiment B-3 for 90 image frames with a template update carried out at frame 29: (a) Four representative frames showing the location differences between the predicted and actual lesion centroid and (b) the tracking errors at ...

4. Conclusions

In this study, we demonstrated the feasibility of a template-based algorithm for involuntary motion tracking in in-vivo infrared imaging. In the presence of random in-plane motion, we demonstrated that the robust version of the Lucas–Kanade algorithm can track the location of the target region in IR image sequences with good accuracy for sequences with small temperature changes (steady state). When applying the algorithm to the steady-state IR image sequence, satisfactory results were obtained using a single template extracted from the first frame.
As discussed for Experiment B-1, the tracking scheme using a single template image can also achieve accurate tracking results during the first 30 s of the thermal recovery image sequence. The results suggest that the weighting mask, created by normalizing the template image at the initial frame, can effectively ensure the tracking robustness in the early stage of the thermal recovery phase (0–40 s). During this period the temporal change of skin temperature is highest and the thermal signature of the lesion most pronounced [38].
As the duration of the thermal recovery exceeds 30 s, the brightness of the cooled region differs significantly from that of the initial template image, which causes the tracking errors to grow significantly, as demonstrated in Experiment B-2.
From the improved tracking results obtained in Experiment B-3, we can infer that by updating the template information at the end of the early stage (around 30 s), tracking robustness can be ensured throughout a longer image sequence. This shows that in dynamic IR imaging applications the template update can be an effective means to stabilize and improve the tracking performance of the template-based algorithm.
