Abstract: A method for processing a frame in a sequence of successively acquired frames in dynamic digital radiography, wherein the multi-scale representation of a frame is subjected to temporal filtering by adding at least one correction image to the corresponding detail image(s) in the multi-scale representation of the frame of interest, the correction image being computed by combining clipped difference images obtained as the difference between the multi-scale representation of the frame of interest and the multi-scale representations of a selection of other frames in said sequence.
METHOD FOR NOISE REDUCTION IN AN IMAGE SEQUENCE
[DESCRIPTION]
FIELD OF THE INVENTION
The present invention relates to a method for noise reduction in a
sequence of images.
BACKGROUND OF THE INVENTION
In dynamic digital radiography, image sequences of an object are
generated in real time. During acquisition multiple, successive
digital images or frames (frame images) are taken. Successive images
are recorded e.g. by means of a digital radiography detector.
The present invention focuses on applications in which motion or
immediate feedback are crucial, e.g. the temporal evolution in
contrast studies or interventional fluoroscopy to guide and verify
surgical actions.
Compared to static x-ray images, the dose per image or frame can be
extremely low for fluoroscopic image sequences.
As a result, the noise content in a single frame is much higher than
in static images. Noise reduction is therefore a major concern in
the process of visualization enhancement of fluoroscopic image
sequences.
Typically, spatio-temporal filtering techniques are used to reduce
the noise by making use of the strong correlation between successive
frames.
State-of-the-art algorithms use motion estimation to balance the
strength of spatial and temporal noise filtering. In static image
regions, temporal noise filtering preserves image details far better
than spatial filtering. However, temporal filtering can generate
artefacts called motion blur in strongly moving scenes.
State-of-the-art noise reduction algorithms try to avoid motion blur
by reducing the strength of temporal filtering in favour of spatial
filtering when motion is detected over the frames.
Detection of motion in fluoroscopic image sequences is extremely
difficult due to the high noise content. Motion-compensated
spatio-temporal filtering often fails to detect motion accurately as
the high noise content corrupts the image gradients used to control
the filters.
Almost all the state-of-the-art noise reduction filters are
implemented as multi-scale filters: these filters are applied to the
wavelet or Laplacian pyramid representations of the frames.
Modifying the multi-scale decompositions allows more filtering of
the high frequency noise signals while preserving the mid and low
frequency structure signals in the images.
It is an aspect of the present invention to provide a multi-scale
temporal noise reduction method that does not require motion
estimation.
SUMMARY OF THE INVENTION
The above-mentioned aspects are realized by a method having the
specific features set out in claim 1. Specific features for
preferred embodiments of the invention are set out in the dependent
claims.
The present invention is applicable but not limited to a sequence of
frames obtained in fluoroscopic medical imaging. In general, it can
be applied to images obtained by various types of dynamic digital
radiation imaging (e.g. dynamic digital radiography).
Frames can be obtained by recording images with a radiation sensor
such as a solid state image detector applied in dynamic mode.
The method of the present invention is generally implemented in the
form of a computer program product adapted to carry out the method
steps of the present invention when run on a computer. The computer
program product is commonly stored in a computer readable carrier
medium such as a DVD. Alternatively the computer program product
takes the form of an electric signal and can be communicated to a
user through electronic communication.
Further advantages and embodiments of the present invention will
become apparent from the following description and drawing.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 schematically shows the different steps of the present
invention.
DETAILED DESCRIPTION OF THE INVENTION
The proposed method is a multi-scale temporal noise reduction
technique that does not require motion estimation.
The invention is applicable to all well-known multi-scale
decomposition methods.
A multi-scale (or multi-resolution, whereby resolution refers to
spatial resolution) decomposition of an image is a process that
computes detail images at multiple scales of a grey value
representation of the image.
A multi-scale decomposition mechanism generally involves applying
filter banks to the grey value representation of the image for
computing the detail images. Well-known techniques are for example:
the Laplacian pyramid, the Burt pyramid, the Laplacian stack, the
wavelet decomposition, QMF filter banks, etc.
The pixels of the detail images represent the amount of variation of
pixel values of the original image at the scale of the detail image,
whereby scale refers to spatial extent of these variations.
An example of a multi-scale decomposition technique is described
extensively in European patent application 527 525 A2.
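As an illustration of the decomposition step, the following is a minimal Laplacian-pyramid sketch. It is not the patented decomposition itself: it assumes 2x2 block averaging as the low-pass filter and nearest-neighbour upsampling, where production decompositions use proper Gaussian filtering; all function names are illustrative.

```python
import numpy as np

def downsample(img):
    # low-pass by averaging 2x2 blocks (assumes even dimensions)
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(img):
    # nearest-neighbour expansion back to twice the size
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, levels):
    # returns detail images d_0 .. d_{levels-1} plus the low-pass residual
    details, current = [], img.astype(float)
    for _ in range(levels):
        low = downsample(current)
        details.append(current - upsample(low))
        current = low
    return details, current

def reconstruct(details, residual):
    # inverse transform: this pyramid is exactly invertible
    current = residual
    for d in reversed(details):
        current = upsample(current) + d
    return current
```

Each detail image holds the variations lost between one resolution level and the next, matching the description of detail images as the amount of pixel-value variation at a given scale.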
Multi-scale noise reduction of a frame of interest in a sequence of
frames is achieved by correcting at least one detail image of the
multi-scale representation of the frame of interest.
The correction is performed by subjecting the multi-scale
representation of said frame of interest to temporal filtering by
adding at least one correction image to the corresponding (at the
same scale) detail image(s) in the multi-scale representation of
said frame of interest, said correction image(s) being computed by
combining clipped difference images obtained as the difference
between the multi-scale representation of the frame of interest and
the multi-scale representations of a selection of other frames in
said sequence.
Instead of the above-described addition a multiplication could also
be performed. This is however a more complicated operation.
In accordance with the present invention only the small-detail
images in the multi-scale representation up to a predefined scale
s_max are modified. An optimal value for s_max needs to be chosen as
a function of the used pixel resolution and the noise
characteristics. For example, s_max will be lowered if binning of
pixels in the detector is activated.
Large-detail images at scales above s_max will not be modified.
In an image sequence of successive frames ..., F_t-2, F_t-1, F_t,
F_t+1, F_t+2, ... the noise reduction of a frame of interest F_t can
be achieved by making use of a limited number of previous frames
F_t-k, ..., F_t-2, F_t-1 (i.e. preceding the frame of interest in
the acquired image sequence, either immediately preceding or still
earlier frames).
This allows real-time processing of the image sequence with limited
delay.
A slightly better result can be achieved by making use of both a
number of next and a number of previous frames in the sequence
F_t-1, F_t+1, F_t-2, F_t+2, ... as these closest frames in the
sequence will be more similar to the frame of interest F_t (t
representing relative time of acquisition). However, making use of
the successive frames F_t+1, F_t+2, ... will introduce a larger
delay, which is not optimal in real-time applications.
Hereafter the embodiment is explained whereby only previous frames
are being used for temporal filtering.
The number of previous frames (prior to the frame of interest)
involved in the temporal filtering needs to be chosen as a function
of the noise characteristics and the desired amount of noise
reduction in the frame of interest.
For example, local standard deviation values (sliding window) are
calculated for pixels at scale 0 in the frame of interest, the
frequency histogram of these values is generated and the maximum bin
of this histogram is used to determine s_max.
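The sliding-window statistic described above can be sketched as follows. This is an illustrative sketch only: it computes the modal local standard deviation, which serves as a robust noise estimate; the window size and bin count are arbitrary choices, and how the result maps onto s_max is left to the implementation.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def modal_local_std(img, win=7, bins=50):
    # local standard deviation in every win x win sliding window
    local_std = sliding_window_view(img, (win, win)).std(axis=(-1, -2))
    # frequency histogram of these values; return the centre of the maximum bin
    hist, edges = np.histogram(local_std.ravel(), bins=bins)
    peak = np.argmax(hist)
    return 0.5 * (edges[peak] + edges[peak + 1])
```

Taking the histogram mode rather than the mean makes the estimate robust against image structure, since edges and moving objects inflate only a minority of the local windows.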
The proposed method pair-wise compares the multi-scale
representation of each individual previous frame F_t-k with the
multi-scale representation of the frame of interest F_t.
For every scale s up to scale s_max a difference image diff_s,t-k is
computed by subtracting the pixel values of detail image d_s,t from
those of detail image d_s,t-k:
diff_s,t-k = d_s,t-k - d_s,t
These difference images are computed between the multi-scale
representation of every involved previous frame F_t-k, ..., F_t-2,
F_t-1 and the multi-scale representation of the frame of interest
F_t.
The difference images diff_s,t-k, ..., diff_s,t-2, diff_s,t-1
contain differences due to noise, but also differences due to scene
motion. To avoid motion artefacts or motion blur, the differences
due to scene motion need to be reduced in the difference images.
As the pixel differences due to noise have a smaller magnitude than
the pixel differences caused by scene motion, clipping is an
efficient way to correct the pixel differences due to scene motion.
Per scale predefined clipping bounds can be used.
A more adaptive approach is to define the clipping bounds as a
function of the computed noise level n_s0 in the frame of interest
F_t. The clipping bounds can be defined as -w*f_s*n_s0 and
+w*f_s*n_s0.
The multiplicative factor f_s is a scale-dependent normalization
factor to take into account the decrease of the amplitude of the
noise over the different detail scales.
The multiplicative factor w can be assigned to a predefined value or
can be an inverse function of an image quality measurement, e.g. the
signal-to-noise ratio.
For images with a high SNR, the noise content will be low and no or
little correction is needed. For images with a low SNR, the noise
content will be high and the clipping bounds must be large enough to
achieve sufficient noise reduction.
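The adaptive clipping described above can be sketched as follows. The halving of the noise amplitude per scale is an assumed model for f_s (the actual normalization depends on the chosen decomposition), and the default value of w is illustrative.

```python
import numpy as np

def clip_difference(diff, noise_level, scale, w=1.5):
    # assumed scale-dependent normalisation: noise amplitude roughly halves per scale
    f_s = 0.5 ** scale
    # symmetric clipping bounds -w*f_s*n_s0 and +w*f_s*n_s0
    bound = w * f_s * noise_level
    return np.clip(diff, -bound, bound)
```

Differences within the noise band pass through unchanged, while the large differences caused by scene motion are limited to the bound, which is what suppresses motion blur without explicit motion estimation.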
Out of the clipped difference images, correction images corr_s are
computed. These correction images are added to the corresponding
detail images of the multi-scale representation of the frame of
interest (the values of corresponding pixels, i.e. pixels with the
same position in the frame, are added).
The correction images can be computed as the average of the clipped
difference images:
corr_s = (1/K) * (diff_s,t-K + ... + diff_s,t-2 + diff_s,t-1)
Alternatively a weighted average with decreasing weights for the
older frames can be used.
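The temporal filtering step for one scale can then be sketched as follows (a sketch under the same assumptions as above; uniform averaging is shown, the variant with decreasing weights for older frames being a straightforward extension):

```python
import numpy as np

def temporal_filter_detail(detail_t, previous_details, bound):
    # clipped differences between each previous detail image and the frame of interest
    clipped = [np.clip(d_prev - detail_t, -bound, bound)
               for d_prev in previous_details]
    # correction image corr_s: average of the clipped difference images
    corr = np.mean(clipped, axis=0)
    # add the correction to the detail image of the frame of interest
    return detail_t + corr
```

In a static region the clipped differences average out the noise; where a previous frame differs strongly (motion), its contribution is limited by the bound instead of dragging the corrected detail image toward the moving content.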
The corrected multi-scale representation of the frame of interest
can be further used for visualization enhancement processing or can
be reconstructed to a noise reduced output frame.
Having described in detail the preferred embodiment of the current
invention, it will now be apparent to those skilled in the art that
numerous modifications can be made therein without departing from
the scope of the invention as defined in the appended claims. An
example of such a modification is applying an additional spatial
filtering step to the individual difference images to remove linear
structures (originating from motion artefacts).
[CLAIMS]
1. A method for processing a frame of interest in a sequence of
successively acquired frames in digital radiation imaging,
comprising the steps of
- generating a multi-scale representation of frames, said
multi-scale representation comprising detail images at different
scales,
- subjecting the multi-scale representation of said frame of
interest to temporal filtering by adding at least one correction
image to a corresponding detail image(s) in the multi-scale
representation of said frame of interest, said correction image
being computed by combining clipped difference images obtained as
the difference between the multi-scale representation of the frame
of interest and the multi-scale representations of a selection of
other frames in said sequence.
2. A method according to claim 1 wherein said other frames are a
predefined number of image frames preceding the image to be
processed in said sequence.
3. A method according to claim 2 wherein said other frames preceding
said frame of interest are successive frames in said sequence.
4. A method according to claim 1 wherein said other frames are a
predefined number of frames preceding and a predefined number of
frames following said frame of interest.
5. A method according to claim 2 or 4 wherein said predefined number
of frames is a function of noise characteristics and/or desired
amount of noise reduction in said frame of interest.
6. A method according to claim 1 wherein said clipped difference
images are obtained by clipping difference images with a clipping
bound that is defined per scale.
7. A method according to claim 1 wherein said clipped difference
images are obtained by clipping difference images with a clipping
bound that depends on the value of a noise level computed in said
frame of interest.
8. A method according to claim 1 wherein said temporal filtering is
only applied to detail images up to a predefined scale s_max.
9. A computer program product adapted to carry out the method of any
of the preceding claims when run on a computer.
10. A computer readable medium comprising computer executable
program code adapted to carry out the steps of any of claims 1 to 8.