
System And Method For Identifying Errors In Positioning Of A Subject For Capturing A Thermal Image

Abstract: A method for identifying errors associated with subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image. The method includes (i) receiving a thermal image of a body of a subject, (ii) automatically segmenting a breast region from the thermal image, (iii) computing a plurality of positions and a plurality of deviations in the plurality of positions using a thermal imaging protocol, (iv) determining a positional adjustment to be made to a position of the thermal imaging camera or a position of the subject based on the plurality of deviations in the plurality of positions of the breast region and (v) generating a set of instructions to the user for adjusting a position of the thermal imaging camera or the subject for capturing a new thermal image at the required position as per the thermal imaging protocol. FIGS. 8A and 8B


Patent Information

Application #
Filing Date
18 October 2019
Publication Number
44/2019
Publication Type
INA
Invention Field
PHYSICS
Status
Parent Application
Patent Number
Legal Status
Grant Date
2022-01-31
Renewal Date

Applicants

NIRAMAI HEALTH ANALYTIX PVT. LTD
Flat A7-506, Elita Promenade, JP Nagar, 7th Phase, Bangalore

Inventors

1. Siva Teja Kakileti
D. No 1-45, Sundar Nagar Street, Annaipeta, Draksharamam RC Puram Mandal East, Godavari Dist, Kakinada – 533262
2. Geetha Manjunath
Flat A7-506, Elita Promenade, Bangalore – 56007

Specification

Claims:

I/We claim:
1. A method for identifying errors associated with subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image, the method comprising:
receiving a thermal image of a body of a subject, which represents the temperature distribution on the body of the subject as pixels in the thermal image with a highest temperature value being displayed in a first color and pixels with a lowest temperature value being displayed in a second color, pixels with temperature values between the lowest and highest temperature values being displayed in gradations of color between the first and second color, wherein the thermal image is captured by a thermal imaging camera, the thermal imaging camera comprising:
an array of sensors that converts infrared energy into electrical signals on a per-pixel basis;
a lens that focuses the infrared energy from the subject’s body onto the array of sensors, wherein the array of sensors detects temperature values from the subject’s body; and
a specialized processor that processes the detected temperature values into at least one block of pixels to generate the thermal image;
automatically determining a breast region in the thermal image by segmenting the breast region from the thermal image using an automated segmentation technique;
characterized in that,
computing a plurality of positions (p, q, r) of the breast region with respect to the thermal image, wherein p is a normalized length of the visible region above the breast region in the thermal image, q is a normalized length of the visible region below the breast region in the thermal image and r is a distance of the breast region from either of a first or a last pixel column of the thermal image;
computing a plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol, wherein dp is a deviation with respect to the visible region above the breast region, dq is a deviation with respect to the visible region below the breast region and dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image;
determining, using a machine learning model, a positional adjustment to be made to a position of the thermal imaging camera or a position of the subject based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region; and
generating a set of instructions to the user for adjusting a position of the thermal imaging camera or the subject for capturing a new thermal image at the required position as per the thermal imaging protocol.
2. The method as claimed in claim 1, wherein the positional adjustment comprises at least one of:
adjusting at least one of the thermal imaging camera or the subject chair up or down to capture a new thermal image such that the visible region above the breast region (p) and the part of the visible region below the breast region (q) are as per the thermal imaging protocol;
adjusting at least one of the thermal imaging camera, the subject or the subject chair front/back to capture the new thermal image such that the visible region above the breast region (p) and the part of the visible region below the breast region (q) are as per the thermal imaging protocol; and
adjusting at least one of the thermal imaging camera, the subject or the subject chair sideways such that the distance of the breast region from either of the first or the last pixel column (r) is as per the thermal imaging protocol.
3. The method as claimed in claim 1, wherein the method comprises the step of providing the new captured thermal image along with the position adjustment for an automatic tumor detection or an automatic tumor classification to detect cancerous tissue and non-cancerous tissue within the breast region of the subject.
4. The method as claimed in claim 1, wherein the detected breast region segment in the thermal image is provided for an automatic tumor detection or an automatic tumor classification to detect cancerous tissue and non-cancerous tissue within the breast region of the subject, if the plurality of deviations (dp, dq, dr) does not exceed a threshold value as per the thermal imaging protocol.
5. The method as claimed in claim 1, wherein the automated segmentation technique comprises the steps of:
determining an outer side contour of an outline of a boundary of the breast region of the subject from a body silhouette;
determining an inner side boundary of the breast region from the body silhouette and the view angle of the thermal image;
determining an upper boundary of the breast region by determining a lower boundary of an isotherm of the axilla of the subject;
determining a lower boundary of the breast region by determining an upper boundary of an isotherm of the infra-mammary fold of the subject; and
segmenting the breast region of the subject by connecting the above determined breast boundaries to segment the region from surrounding tissue in the thermal image.
6. The method as claimed in claim 5, wherein the automated segmentation technique comprises the steps of:
training a deep learning model by providing a plurality of thermal images as an input and the corresponding segmentation as an output to obtain a trained deep learning model; and
providing the new captured thermal image to the trained deep learning model to predict a final segmentation map.
7. The method as claimed in claim 1, wherein the set of instructions is provided to at least one of a robotic arm holding the thermal imaging camera, an electronically controlled camera stand, or an electronically controlled rotating chair to automatically position itself to the suggested position adjustment for capturing the new thermal image of the subject as per the thermal imaging protocol.
8. The method as claimed in claim 1, wherein the positional adjustment is provided to automatically adjust the position of the thermal imaging camera to capture the new thermal image at the required position without the user’s intervention.
9. The method as claimed in claim 1, wherein the method comprises displaying at least one of the positional adjustment or the segmented breast region on a visualization screen.
10. The method of claim 3, wherein the automatic tumor detection comprises the steps of:
determining a percentage of pixels p1 within a selected region of interest with a temperature T1pixel, where T2 ≤ T1pixel ≤ T1;
determining a percentage of pixels p2 within the selected region of interest with a temperature T2pixel, where T3 ≤ T2pixel;
determining a ratio p3 = Pedge/Pblock, where Pedge is a number of pixels around a border of a suspected tumor within the region of interest, and Pblock is a number of pixels in the perimeter of the region of interest; and
determining whether the suspected tumor region is one of the cancerous tissue or the non-cancerous tissue using a decision rule R, wherein the decision rule R is based on any combination of R1, R2, R3, where R1 = (p1 > Threshold1), R2 = (p2 > Threshold2) and R3 = (p3 > Threshold3), and wherein T1, T2 and T3 are temperature thresholds obtained from the temperature distribution.
11. The method as claimed in claim 3, wherein the automatic tumor classification comprises the steps of:
determining pixel regions m1 within a selected region of interest with a temperature T1pixel, where T2 ≤ T1pixel ≤ T1;
determining pixel regions m2 within the selected region of interest with a temperature T2pixel, where T3 ≤ T2pixel;
extracting the parameters comprising at least one temperature, at least one boundary, at least one area and at least one shape from m1 and m2; and
providing the parameters to a machine learning classifier to determine whether the selected region of interest has a cancerous lesion or not, wherein T1, T2 and T3 are temperature thresholds obtained from the temperature distribution.
12. The method as claimed in claim 11, wherein the automatic tumor classification comprises the steps of:
training a deep learning model by providing a plurality of thermal images as an input and the corresponding classification as an output to obtain a trained deep learning model; and
providing the new captured thermal image to the trained deep learning model to determine whether a selected region of interest has a cancerous lesion or not.
13. The method as claimed in claim 1, wherein the segmentation and the position deviation are computed for a thermal image obtained by selecting a single image frame of a thermal video or a live stream thermal video, wherein the thermal video or live stream thermal video is captured using the thermal imaging camera.
14. The method as claimed in claim 1, wherein the set of instructions comprises at least one of a text, a visual or an audio for capturing the new thermal image at the required position as per the thermal imaging protocol.
15. A method of automatically identifying a posture and a position of a subject in a thermal image, the method comprising:
receiving a thermal image of a body of a subject, which represents the temperature distribution on the body of the subject as pixels in the thermal image with a highest temperature value being displayed in a first color and pixels with a lowest temperature value being displayed in a second color, pixels with temperature values between the lowest and highest temperature values being displayed in gradations of color between the first and second color, wherein the thermal image is captured by a thermal imaging camera, the thermal imaging camera comprising:
an array of sensors that converts infrared energy into electrical signals on a per-pixel basis;
a lens that focuses the infrared energy from the subject’s body onto the array of sensors, wherein the array of sensors detects temperature values from the subject’s body; and
a specialized processor that processes the detected temperature values into at least one block of pixels to generate the thermal image;
automatically determining key physical structures and contours of the body in the thermal image using an automated segmentation technique and an edge detection technique, wherein the key physical structures and the contours of the body are represented as image points to define a reference body coordinate system in an n-dimensional Euclidean space;
characterized in that, assembling each image point in the Euclidean coordinate system to define the posture and the position of the body;
determining the n-dimensional Euclidean axes (X1-Xn) for a particular posture of interest to define the reference body coordinate system, wherein each Euclidean axis includes values associated with a physical structure or contour of the body, wherein the values of each Euclidean axis Xi (i = 1 to n) represent a relative distance of the respective physical structure or contour of the body from the boundaries of the thermal image; and
providing n ordinal values along each corresponding n-dimensional Euclidean axis for the thermal image as a numerical representation of the subject’s posture, position and a point in a Euclidean space to enable the user to perform further analysis.
16. A system for identifying errors associated with a subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image, the system comprising:
a storage device; and
a processor retrieving machine-readable instructions from the storage device which, when executed by the processor, enable the processor to:
receive a thermal image of a body of a subject, which represents the temperature distribution on the body of the subject as pixels in the thermal image with a highest temperature value being displayed in a first color and pixels with a lowest temperature value being displayed in a second color, pixels with temperature values between the lowest and highest temperature values being displayed in gradations of color between the first and second color, wherein the thermal image is captured by a thermal imaging camera, the thermal imaging camera comprising:
an array of sensors that converts infrared energy into electrical signals on a per-pixel basis;
a lens that focuses the infrared energy from the subject’s body onto the array of sensors, wherein the array of sensors detects temperature values from the subject’s body; and
a specialized processor that processes the detected temperature values into at least one block of pixels to generate the thermal image;
automatically determine a breast region in the thermal image by segmenting the breast region from the thermal image using an automated segmentation technique;
characterized in that,
compute a plurality of positions (p, q, r) of the breast region with respect to the thermal image, wherein p is a normalized length of the visible region above the breast region in the thermal image, q is a normalized length of the visible region below the breast region in the thermal image and r is a distance of the breast region from either of a first or a last pixel column of the thermal image;
compute a plurality of deviations (dp, dq, dr) in the plurality of positions of the detected breast region by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol, wherein dp is a deviation with respect to the visible region above the breast region, dq is a deviation with respect to the visible region below the breast region and dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image;
determine, using a machine learning model, a positional adjustment to be made to a position of the thermal imaging camera or a position of the subject based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region segment; and
generate a set of instructions to the user for adjusting a position of the thermal imaging camera or the subject for capturing a new thermal image as per the thermal imaging protocol.
17. The system as claimed in claim 16, wherein the system implements the position adjustment by:
adjusting at least one of the thermal imaging camera or the subject chair up or down to capture a new thermal image such that the visible region above the breast region (p) and the part of the visible region below the breast region (q) are as per the thermal imaging protocol;
adjusting at least one of the thermal imaging camera, the subject or the subject chair front/back to capture the new thermal image such that the visible region above the breast region (p) and the part of the visible region below the breast region (q) are as per the thermal imaging protocol; and
adjusting at least one of the thermal imaging camera, the subject or the subject chair sideways such that the distance of the breast region from either of the first or the last pixel column (r) is as per the thermal imaging protocol for the positional adjustment.
18. The system as claimed in claim 16, wherein the system provides the new captured thermal image along with the position adjustment for an automatic tumor detection or an automatic tumor classification to detect cancerous tissue and non-cancerous tissue within the breast region of the subject.
19. The system as claimed in claim 16, wherein the detected breast region segment in the thermal image is provided for an automatic tumor detection or an automatic tumor classification to detect cancerous tissue and non-cancerous tissue within the breast region of the subject, if the plurality of deviations does not exceed a threshold value as per the thermal imaging protocol.
20. The system as claimed in claim 16, wherein the system provides a set of instructions to at least one of a robotic arm holding the camera, an electronically controlled camera stand, or an electronically controlled rotating chair to automatically position itself to the suggested position adjustment for capturing the new thermal image of the subject.

Description: BACKGROUND
Technical Field
[0001] Embodiments herein are directed towards capturing a thermal image conformant to the standard operating procedure and, more particularly, to a system and method for identifying errors associated with subject positioning in a thermal image from a user and generating feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image.
Description of the Related Art
[0002] Breast cancer is among the leading causes of cancer deaths around the world, especially in women. Though mammography is considered the gold standard for detecting breast cancer, it is not affordable for the economically weaker population. Further, mammography has its own disadvantages of pain due to the compression of the breast and radiation exposure. In recent years, thermography is emerging as a promising modality for detecting breast cancer. Thermography captures the amount of heat radiating from the surface of the body and measures the temperature patterns and distribution on the chest due to the high metabolism associated with tumorous growth. There are several advantages of breast thermography compared to other methods. Breast thermography works on women of all age groups, does not involve any radiation and is non-contact, hence painless. The key challenge in breast thermography is that the correctness of interpretation greatly depends upon adherence to protocol during thermal image capture, specifically subject preconditioning and correct capture of thermal images of the patient. Breast thermography requires expertise to capture the images properly as per the protocol. Any error in the image capture could lead to misinterpretation of the images. For example, (i) the subject could be too far, and hence the region of interest in the image may be in low resolution, which may result in loss of information and affect the accuracy of prediction; (ii) the subject could be too close, so that some portion of the breast region is cut/invisible in the image, leading to false negatives; and (iii) the subject is not centered in the image, leading to inconsistency in image capture across the technicians. Also, there would be a variation in
the captured portion of the body across the technicians. In order to make breast thermography usable in large-scale population screening programs, such errors have to be minimized, as the tool will be used by health workers with basic operation skills.
[0003] Hence, there is a need for an automated guidance system or method to automatically identify errors associated with subject positioning and provide feedback to a technician for corrective subject positioning for capturing a thermal image.
SUMMARY
[0004] In view of the foregoing, an embodiment herein provides a method for identifying errors associated with a subject positioning in a thermal image from a user and generating feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image. The method includes (i) receiving a thermal image of a body of a subject, which represents the temperature distribution on the body of the subject as pixels in the thermal image with a highest temperature value being displayed in a first color and pixels with a lowest temperature value being displayed in a second color, pixels with temperature values between the lowest and highest temperature values being displayed in gradations of color between the first and second color, (ii) automatically determining a breast region in the thermal image by segmenting the breast region from the thermal image using an automated segmentation technique, (iii) computing a plurality of positions (p, q, r) of the breast region with respect to the thermal image, (iv) computing a plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol, (v) determining, using a machine learning model, a positional adjustment to be made to a position of the thermal imaging camera or a position of the subject based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region and (vi) generating a set of instructions to the user for adjusting a position of the thermal imaging camera or subject for capturing a new thermal image at the required position as per the thermal imaging protocol.
[0005] The thermal image is captured by a thermal imaging camera that includes an array of sensors, a lens and a specialized processor. The array of sensors converts infrared energy into electrical signals on a per-pixel basis. The lens focuses the infrared energy from the subject’s body onto the array of sensors. The array of sensors detects temperature values from the subject’s body. The specialized processor processes the
detected temperature values into at least one block of pixels to generate the thermal image. In one embodiment, the position p is a normalized length of the visible region above the breast region in the thermal image, the position q is a normalized length of the visible region below the breast region in the thermal image and the position r is a distance of the breast region from either of a first or a last pixel column of the thermal image. In another embodiment, the deviation dp is a deviation with respect to the visible region above the breast region, the deviation dq is a deviation with respect to the visible region below the breast region and the deviation dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image.
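The positions (p, q, r) and deviations (dp, dq, dr) described above can be sketched in Python. This is a minimal illustration, not the patented implementation: it assumes the segmented breast region is available as a bounding box (top, bottom, left, right) in pixel coordinates, and all function names are invented for this example.

```python
def compute_positions(image_h, image_w, top, bottom, left, right):
    """Positions (p, q, r) of a breast-region bounding box in the image.

    p: normalized length of the visible region above the breast region,
    q: normalized length of the visible region below the breast region,
    r: distance (in pixels) of the region from the nearer of the first
       or last pixel column of the thermal image.
    """
    p = top / image_h
    q = (image_h - bottom) / image_h
    r = min(left, image_w - right)
    return p, q, r


def compute_deviations(positions, required):
    """Deviations (dp, dq, dr): actual positions minus the positions
    required by the thermal imaging protocol."""
    return tuple(actual - target for actual, target in zip(positions, required))
```

For a 240x320 image with the region spanning rows 60-200 and columns 100-220, this yields p = 0.25, q ≈ 0.167 and r = 100; the deviations are then simple differences from the protocol targets.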
[0006] In an embodiment, the positional adjustment includes at least one of (i) adjusting at least one of the thermal imaging camera or the subject chair up or down to capture a new thermal image such that the visible region above the breast region (p) and a part of the visible region below the breast region (q) are as per the thermal imaging protocol, (ii) adjusting at least one of the thermal imaging camera, the subject or the subject chair front/back to capture the new thermal image such that the visible region above the breast region (p) and the part of the visible region below the breast region (q) are as per the thermal imaging protocol and (iii) adjusting at least one of the thermal imaging camera, the subject or the subject chair sideways such that the distance of the breast region from either of the first or the last pixel column (r) is as per the thermal imaging protocol.
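The mapping from deviations to corrective instructions can be illustrated as a simple rule table. This is a hypothetical sketch (the patent uses a machine learning model for this step); the sign conventions, tolerances and instruction strings are all assumptions of the example.

```python
def suggest_adjustment(dp, dq, dr, tol=0.02, tol_r=5):
    """Map deviations (dp, dq, dr) to coarse adjustment instructions.

    Sign convention of this sketch: positive dp means too much visible
    region above the breast region, positive dq too much below, and dr
    is the deviation of the side distance in pixels.
    """
    instructions = []
    if dp > tol and dq < -tol:
        # Body framed too low: extra headroom, too little below.
        instructions.append("lower the camera or raise the chair")
    elif dp < -tol and dq > tol:
        instructions.append("raise the camera or lower the chair")
    elif dp > tol and dq > tol:
        # Too much margin on both sides: subject appears too small.
        instructions.append("move the camera closer")
    elif dp < -tol and dq < -tol:
        instructions.append("move the camera back")
    if abs(dr) > tol_r:
        instructions.append("shift the camera or chair sideways")
    if not instructions:
        instructions.append("position OK; capture image")
    return instructions
```

For example, with dp positive and dq negative (too much headroom, too little below), the sketch suggests lowering the camera or raising the chair.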
[0007] In another embodiment, the method includes the step of providing the new captured thermal image along with the position adjustment for an automated tumor detection and an automated tumor classification to detect cancerous tissue and/or non-cancerous tissue within a breast area of the subject.
[0008] In yet another embodiment, the detected breast region segment in the thermal image is provided for an automatic tumor detection or an automatic tumor classification to detect cancerous tissue and non-cancerous tissue within the breast region of the subject, if the plurality of deviations (dp, dq, dr) does not exceed a threshold value as per the thermal imaging protocol.
[0009] In yet another embodiment, the automated segmentation technique to segment the breast area of the subject in the thermal image includes the steps of (i) determining an outer side contour of an outline of a boundary of the breast area of the subject from a body silhouette, (ii) determining an inner side boundary of the breast area
from the body silhouette and the view angle of the thermal image, (iii) determining an upper boundary of the breast area by determining a lower boundary of an isotherm of the axilla of the subject, (iv) determining a lower boundary of the breast area by determining an upper boundary of an isotherm of the infra-mammary fold of the subject and (v) segmenting the breast area of the subject by connecting the above determined breast boundaries to segment the breast from surrounding tissue in the thermal image.
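The first silhouette-based steps can be sketched as follows. This is an illustrative fragment only: it derives a body silhouette by simple temperature thresholding and traces the outer side contour per row; the axilla and infra-mammary isotherm steps are omitted, and the threshold value and function names are assumptions.

```python
def body_silhouette(thermal, threshold):
    """Binary silhouette: pixels warmer than `threshold` are body."""
    return [[1 if t > threshold else 0 for t in row] for row in thermal]


def outer_side_contour(silhouette):
    """Leftmost and rightmost body column per row, or None for rows
    with no body pixels; a crude stand-in for the outer side contour."""
    contour = []
    for row in silhouette:
        cols = [i for i, v in enumerate(row) if v]
        contour.append((cols[0], cols[-1]) if cols else None)
    return contour
```

On a tiny 3x4 temperature grid with a 30 °C threshold, the contour lists the outermost body columns row by row.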
[0010] In yet another embodiment, the automated segmentation technique includes the steps of (i) training a deep learning model by providing a plurality of thermal images as an input and the corresponding segmentation as an output to obtain a trained deep learning model and (ii) providing the new captured thermal image to the trained deep learning model to predict a final segmentation map.
[0011] In yet another embodiment, the set of instructions is provided to at least one of a robotic arm holding the camera, an electronically controlled camera stand or an electronically controlled rotating chair to automatically position itself to the suggested position adjustment for capturing the new thermal image of the subject as per the thermal imaging protocol.
[0012] In yet another embodiment, the position adjustment is provided to automatically adjust the position of the thermal imaging camera to capture the new thermal image at the required position without the user’s intervention.
[0013] In yet another embodiment, the method includes the step of displaying at least one of the position adjustment or the segmented breast region on a visualization screen.
[0014] In yet another embodiment, the automatic tumor detection includes the steps of: (i) determining a percentage of pixels p1 within the selected region of interest with a temperature T1pixel, where T2 ≤ T1pixel ≤ T1, (ii) determining a percentage of pixels p2 within the selected region of interest with a temperature T2pixel, where T3 ≤ T2pixel, (iii) determining a ratio p3 = Pedge/Pblock, where Pedge is a number of pixels around a border of a suspected tumor within the region of interest, and Pblock is the number of pixels in the perimeter of the region of interest, and (iv) determining whether a suspected tumor region is one of the cancerous tissue or the non-cancerous tissue using a decision rule R. The decision rule R is based on any combination of R1, R2, R3, where R1 = (p1 > Threshold1), R2 = (p2 > Threshold2), and R3 = (p3 > Threshold3). T1, T2 and T3 are temperature thresholds obtained from the temperature distribution.
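The decision rule R can be sketched directly from the definitions above. The band-percentage helper and the choice of combining R1, R2, R3 with `all`/`any` are illustrative assumptions; the actual thresholds T1, T2, T3 are derived from the temperature distribution and are not specified here.

```python
def band_percentage(pixels, lo, hi=None):
    """Percentage of ROI pixels whose temperature lies in [lo, hi]
    (or simply >= lo when hi is None)."""
    inside = sum(1 for t in pixels if t >= lo and (hi is None or t <= hi))
    return 100.0 * inside / len(pixels)


def tumor_decision(p1, p2, p3, th1, th2, th3, combine=all):
    """Decision rule R over R1, R2, R3; the `combine` callable (all/any)
    stands in for 'any combination of R1, R2, R3'."""
    return combine([p1 > th1, p2 > th2, p3 > th3])
```

Here p1 and p2 would come from `band_percentage` over the two temperature bands, and p3 from the Pedge/Pblock ratio.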
[0015] In yet another embodiment, the automatic tumor classification includes the steps of: (i) determining pixel regions m1 within a selected region of interest with a temperature T1pixel, where T2 ≤ T1pixel ≤ T1; (ii) determining pixel regions m2 within the selected region of interest with a temperature T2pixel, where T3 ≤ T2pixel; (iii) extracting the parameters comprising at least one temperature, at least one boundary, at least one area and at least one shape from m1 and m2; and (iv) providing the parameters to a machine learning classifier to determine whether the selected region of interest has a cancerous lesion or not. T1, T2 and T3 are temperature thresholds obtained from the temperature distribution.
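A toy version of this feature-extraction-plus-classifier step might look like the following. The chosen features and the threshold-based stand-in for the machine learning classifier are invented for illustration; the patent leaves the concrete classifier and its parameters open.

```python
def region_features(pixels):
    """Illustrative per-region features: mean temperature, area, and a
    crude peak ratio (max/mean), standing in for the temperature,
    boundary, area and shape parameters of the embodiment."""
    area = len(pixels)
    mean_t = sum(pixels) / area
    return {
        "mean_temperature": mean_t,
        "area": area,
        "peak_ratio": max(pixels) / mean_t,
    }


def classify(features, area_threshold=50, ratio_threshold=1.04):
    """Stand-in for the machine learning classifier: flags a region as a
    suspected cancerous lesion when it is both large and sharply peaked.
    Thresholds are arbitrary example values."""
    return (features["area"] > area_threshold
            and features["peak_ratio"] > ratio_threshold)
```

A real system would replace `classify` with a trained model consuming the extracted parameters.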
[0016] In yet another embodiment, the automatic tumor classification method includes (i) training a deep learning model by providing a plurality of thermal images as an input and the corresponding classification as an output to obtain a trained deep learning model; and (ii) providing the new captured thermal image to the trained deep learning model to determine whether a selected region of interest has a cancerous lesion or not.
[0017] In yet another embodiment, the segmentation and the position deviation are computed for a thermal image obtained by selecting a single image frame of a thermal video or a livestream thermal video. The thermal video or the livestream thermal video is captured using the thermal imaging camera.
[0018] In yet another embodiment, the set of instructions includes at least one of a text, a visual or audio for capturing the new thermal image at the required position as per the thermal imaging protocol.
[0019] In another aspect, a method for automatically identifying a posture and a position of a subject in a thermal image is provided. The method includes (i) receiving a thermal image of a body of a subject, which represents the temperature distribution on the body of the subject as pixels in the thermal image with a highest temperature value being displayed in a first color and pixels with a lowest temperature value being displayed in a second color, pixels with temperature values between the lowest and highest temperature values being displayed in gradations of color between the first and second color, (ii) automatically determining key physical structures and contours of the body in the thermal image using an automated segmentation technique and an edge detection technique. The
key physical structures and the contours of the body are represented as image points to define a reference body coordinate system in an n-dimensional Euclidean space, (iii) assembling each image point in the Euclidean coordinate system to define the posture and the position of the body, (iv) determining the n-dimensional Euclidean axes (X1-Xn) for a particular posture of interest to define the reference body coordinate system. Each Euclidean axis includes values associated with a physical structure or contour of the body. The values of each Euclidean axis Xi (i = 1 to n) represent a relative distance of the respective physical structure or contour of the body from the boundaries of the thermal image, and (v) providing n ordinal values along each corresponding n-dimensional Euclidean axis for the thermal image as a numerical representation of the subject’s posture and position and a point in a Euclidean space to enable the user to perform further analysis.
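The numerical posture representation can be sketched as follows, assuming each key physical structure has already been reduced to a single landmark pixel; the landmark names and the use of two axes per landmark (row and column distances) are assumptions of this example.

```python
def posture_point(landmarks, image_h, image_w):
    """Represent a posture as a point in a multi-dimensional space.

    Each landmark contributes two axis values: the relative distance of
    its row from the nearer top/bottom boundary, and of its column from
    the nearer left/right boundary. `landmarks` maps names (illustrative)
    to (row, col) pixel coordinates; iteration is sorted by name so the
    axis order is deterministic.
    """
    point = []
    for name in sorted(landmarks):
        r, c = landmarks[name]
        point.append(min(r, image_h - r) / image_h)
        point.append(min(c, image_w - c) / image_w)
    return point
```

Two thermal images can then be compared by the distance between their posture points.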
[0020] The thermal image is captured by a thermal imaging camera that includes an array of sensors, a lens and a specialized processor. The array of sensors converts infrared energy into electrical signals on a per-pixel basis. The lens focuses the infrared energy from the subject's body onto the array of sensors. The array of sensors detects temperature values from the subject's body. The specialized processor processes the detected temperature values into at least one block of pixels to generate the thermal image.
[0021] In another aspect, a system for identifying errors associated with a subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image is provided. The system includes a storage device, and a processor retrieving machine-readable instructions from the storage device which, when executed by the processor, enable the processor to (i) receive a thermal image of a body of a subject, which represents the temperature distribution on the body of the subject as pixels in the thermal image, with pixels having the highest temperature value being displayed in a first color, pixels with the lowest temperature value being displayed in a second color, and pixels with temperature values between the lowest and highest temperature values being displayed in gradations of color between the first and second color, (ii) automatically determine a breast region in the thermal image by segmenting the breast region from the thermal image using an automated segmentation technique, (iii) compute a plurality of positions (p, q, r) of the detected breast region segment with respect to the thermal image, (iv) compute a plurality of deviations (dp, dq,
dr) in the plurality of positions of the detected breast region by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol, (v) determine a positional adjustment to be made to a position of the thermal imaging camera or a position of the subject based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region segment using a machine learning model and (vi) generate a set of instructions to the user for adjusting a position of the thermal imaging camera or the subject for capturing a new thermal image as per the thermal imaging protocol.
[0022] The thermal imaging camera includes an array of sensors, a lens and a specialized processor. The array of sensors converts infrared energy into electrical signals on a per-pixel basis. The lens focuses the infrared energy from the subject's body onto the array of sensors. The array of sensors detects temperature values from the subject's body. The specialized processor processes the detected temperature values into at least one block of pixels to generate the thermal image. In one embodiment, the position p is a normalized length of the visible region above the breast region in the thermal image, the position q is a normalized length of the visible region below the breast region in the thermal image and the position r is the distance of the breast region from either a first or a last pixel column of the thermal image. In another embodiment, the deviation dp is a deviation with respect to the visible region above the breast region, the deviation dq is a deviation with respect to the visible region below the breast region and the deviation dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image.
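As a concrete illustration, the positions (p, q, r) can be derived from a binary breast-segmentation mask. The boundary conventions used here (first/last foreground row and first foreground column, normalization by the image dimensions) are assumptions for the sketch and not the patented implementation.

```python
import numpy as np

def breast_positions(mask):
    """Compute (p, q, r) for a binary breast-segmentation mask.

    p: normalized extent of the visible region above the breast,
    q: normalized extent of the visible region below the breast,
    r: normalized distance of the breast's inner boundary (near the
       sternum) from the first pixel column.

    Assumes a non-empty mask of shape (height, width).
    """
    h, w = mask.shape
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    top = np.argmax(rows)                   # first row containing breast
    bottom = h - 1 - np.argmax(rows[::-1])  # last row containing breast
    left = np.argmax(cols)                  # first column containing breast
    p = top / h
    q = (h - 1 - bottom) / h
    r = left / w
    return p, q, r
```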
[0023] In an embodiment, the system implements the position adjustment by (i) adjusting at least one of the thermal imaging camera or a chair up or down to capture a new thermal image such that the amount of visible region above the breast region (p) and the amount of visible region below the breast region (q) are as per the thermal imaging protocol, (ii) adjusting at least one of the thermal imaging camera, the subject or the chair front/back to capture the new thermal image such that the amount of visible region above the breast region (p) and the amount of visible region below the breast region (q) are as per the thermal imaging protocol, and (iii) adjusting at least one of the thermal imaging camera, the subject or the chair sideways such that the distance of the breast region from either the first or the last pixel column (r) is as per the thermal imaging protocol.
[0024] In another embodiment, the system provides the new captured thermal image and the segmented breast region along with positional adjustment for an automatic tumor detection and automatic tumor classification to detect cancerous tissue and non-cancerous tissue within the breast area of the subject.
[0025] In yet another embodiment, the system provides the detected breast region segment in the thermal image for an automatic tumor detection or an automatic tumor classification to detect cancerous tissue and non-cancerous tissue within the breast region of the subject, if the plurality of deviations does not exceed a threshold value as per the thermal imaging protocol.
[0026] In yet another embodiment, the system provides a set of instructions to at least one of a robotic arm holding the camera, an electronically controlled camera stand or an electronically controlled rotating chair to automatically position itself to the suggested position adjustment for capturing the new thermal image of the subject.
[0027] The system and method may detect the errors in the captured position and guide the technician for proper capture of the subject. The system and method standardize the image capture protocol for identifying errors associated with subject positioning in a thermal image from a user and generating the feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image. The system and method automate the image capture by sending the feedback to the technician or enable the auto adjustment with the robotic arm/chair. The system and method allow for automated image analysis with minimal or no human intervention.
[0028] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0030] FIG. 1 illustrates an example female patient with a thermal imaging
camera mounted on a slidable and axially rotatable robotic arm for moving the thermal camera along a semi-circular trajectory from side-to-side in front of the female patient according to an embodiment herein;
[0031] FIG. 2 illustrates an exploded view of a corrective positioning system for identifying errors associated with a subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image according to an embodiment herein;
[0032] FIG. 3 illustrates an exemplary process flow of an automated Region of Interest (ROI) analysis of a thermal image from a user using a corrective positioning system according to an embodiment herein;
[0033] FIG. 4 illustrates an exemplary process flow of an offline error identification associated with subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image using a corrective positioning system according to an embodiment herein;
[0034] FIG. 5 illustrates an exemplary process flow of a live stream error identification associated with subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image using a corrective positioning system according to an embodiment herein;
[0035] FIG. 6 illustrates an exemplary process flow of corrective positioning using a corrective positioning system to select a view with adaptive sampling to reduce image frame candidates according to an embodiment herein;
[0036] FIG. 7 illustrates an exemplary process flow of automatically identifying a posture and a position of a subject in a thermal image according to an embodiment herein;
[0037] FIG. 8A and 8B illustrate a flow diagram of one embodiment of the present method for identifying errors associated with subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image according to an embodiment herein; and
[0038] FIG. 9 illustrates a block diagram of one example corrective positioning system/image processing system for processing a thermal image in accordance with the embodiments described with respect to the flow diagram of FIG. 8A and 8B according to
an embodiment herein.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0039] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0040] As mentioned, there remains a need for a system and a method for identifying errors associated with a subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image. Referring now to the drawings, and more particularly to FIGS. 1 through 9, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[0041] A "person" refers to either a male or a female. Gender pronouns are not to be viewed as limiting the scope of the appended claims strictly to females. Moreover, although the term "person" or "patient" is used interchangeably throughout this disclosure, it should be appreciated that the person undergoing breast cancer screening may be something other than a human such as, for example, a primate. Therefore, the use of such terms is not to be viewed as limiting the scope of the appended claims to humans.
[0042] A "breast area" refers to a tissue of the breast and may further include surrounding tissue as is deemed appropriate for breast cancer screening. Thermal images capture the breast area at various view angles, which include a mediolateral view (centre chest), a mediolateral oblique (angular) view, and a lateral (side) view, as are generally understood in the medical imaging arts. It should be appreciated that the mediolateral view is a supplementary mammographic view which generally shows less breast tissue and pectoral muscle than the mediolateral oblique view. FIG. 1 shows the
breast area of a female 100. It should be appreciated that the patient may be stationary while the camera moves about the patient, or the patient can move while the camera remains stationary, or the patient and the camera may move to capture the appropriate view angles as desired.
[0043] A "sternum" refers to the long flat breastbone located in the central part of the chest. It connects to the ribs via cartilage and forms the front of the rib cage, thus protecting the heart, lungs, and major blood vessels.
[0044] A "thermal camera" refers to either a still camera or a video camera with a lens that focuses infrared energy from objects in a scene onto an array of specialized sensors which convert infrared energy across a desired thermal wavelength band into electrical signals on a per-pixel basis and which output an array of pixels with colors that correspond to temperatures of the objects in the image.
[0045] A "thermographic image" or simply a "thermal image" is an image captured by a thermal camera. The thermographic image comprises an array of color pixels with each color being associated with a temperature. Pixels with a higher temperature value are displayed in the thermal image in a first color and pixels with a lower temperature value are displayed in a second color. Pixels with temperature values between the lower and higher temperature values are displayed in gradations of color between the first and second colors.
[0046] "Receiving a thermal image" of a patient for cancer screening is intended to be widely construed and includes retrieving, capturing, acquiring, or otherwise obtaining video image frames.
[0047] "Analyzing the thermographic image" means to identify a plurality of points (PN) in the image.
[0048] A "software interface tool" is a composite of functionality for tumor detection and/or tumor classification using a plurality of user-selectable objects displayed on a display device such as a touchscreen display. One embodiment of a software interface tool which implements a tumor detection method is disclosed in commonly owned and co-pending U.S. Patent Application Serial No. 14/668,178, entitled: "Software Interface Tool For Breast Cancer Screening", by Krithika Venkataramani et al. Another embodiment of a software interface tool which implements a tumor classification method is disclosed in commonly owned and co-pending U.S. Patent Application Serial No.
15/053,767, entitled: "Software Interface Tool For Breast Cancer Screening", by Gayatri Sivakumar et al. Various embodiments of the software interface tool perform manual, semi-automatic, and automatic selection of a block of pixels in the thermal image for screening.
[0049] FIG. 1 illustrates an example female patient 100 with a thermal imaging camera mounted on a slidable and axially rotatable robotic arm for moving the thermal camera along a semi-circular trajectory from side-to-side in front of the patient according to an embodiment herein. The thermal imaging camera 101 is mounted on the slidable and axially rotatable robotic arm 102 capable of moving the thermal imaging camera along a semi-circular trajectory 103 in front of the patient/subject from side-to-side such that thermographic images may be captured in a right-side view 104, a front view 105, and a left-side view 106, and various oblique angles in between. The thermal imaging camera 101 can be a single-band infrared camera, a multi-band infrared camera in the thermal range or a hyperspectral infrared camera in the thermal range. The resolution of the thermal imaging camera 101 is effectively the size of the pixel. Smaller pixels mean that the resulting image has a higher resolution and thus better spatial definition. Although the thermal imaging camera 101 offers a relatively large dynamic range of temperature settings, it is preferable that the camera's temperature range be relatively small, centered around the person's body surface temperature, so that small temperature variations are amplified in terms of pixel color changes in order to provide a better measure of temperature variation. Thermal imaging cameras are readily available in various streams of commerce. The thermal imaging camera 101 is communicatively connected to a corrective positioning system 107 which processes the thermal image captured by the thermal imaging camera 101 for identifying errors associated with subject positioning in a thermal image from a user / robotic arm and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image.
[0050] FIG. 2 illustrates an exploded view of a corrective positioning system for identifying errors associated with a subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image according to an embodiment herein. The corrective positioning system 107 includes a thermal image receiving module 202, a position error identification module 204, a position adjustment determination module 206 and a thermal
image camera control module 208. The thermal image receiving module 202 receives a thermal image of a body of a subject/patient. The thermal image represents the temperature distribution on the body of the subject as pixels in the thermal image, with pixels having the highest temperature value being displayed in a first color, pixels with the lowest temperature value being displayed in a second color, and pixels with temperature values between the lowest and highest temperature values being displayed in gradations of color between the first and second color. In an embodiment, the thermal image is captured using a thermal imaging camera which is connected with the corrective positioning system 107. In an embodiment, the thermal imaging camera includes an array of sensors, a lens and a specialized processor. The array of sensors converts infrared energy into electrical signals on a per-pixel basis. The lens focuses the infrared energy from the subject's body onto the array of sensors. The array of sensors detects temperature values from the subject's body. The specialized processor processes the detected temperature values into at least one block of pixels to generate the thermal image. The position error identification module 204 automatically identifies errors associated with the subject positioning in a thermal image.
In an embodiment, the position adjustment includes (i) adjusting at least one of the thermal imaging camera or the subject chair up or down to capture a new thermal image such that the visible region above the breast region (p) and the part of the visible region below the breast region (q) are as per the thermal imaging protocol; (ii) adjusting at least one of the thermal imaging camera, the subject or the subject chair front/back to capture the new thermal image such that the visible region above the breast region (p) and the part of the visible region below the breast region (q) are as per the thermal imaging protocol; and (iii) adjusting at least one of the thermal imaging camera, the subject or the subject chair sideways such that the distance of the breast region from either the first or the last pixel column (r) is as per the thermal imaging protocol.
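The mapping from deviations to corrective instructions can be sketched as follows. The sign conventions (positive dp meaning too much visible region above the breast), the tolerance value, and the instruction wording are all illustrative assumptions; the specification leaves these choices to the machine learning model.

```python
def adjustment_instructions(dp, dq, dr, tol=0.02):
    """Translate positional deviations (dp, dq, dr) into operator
    instructions, one rule per adjustment axis in the protocol.

    Assumes positive dp/dq mean too much visible region above/below
    the breast; tol is an assumed acceptance tolerance.
    """
    steps = []
    if abs(dp) > tol or abs(dq) > tol:
        if dp > 0 and dq < 0:
            # breast sits too low in the frame
            steps.append("lower the camera or raise the chair")
        elif dp < 0 and dq > 0:
            # breast sits too high in the frame
            steps.append("raise the camera or lower the chair")
        else:
            # both margins off in the same direction: distance issue
            steps.append("move the camera or chair forward/backward")
    if abs(dr) > tol:
        steps.append("shift the camera, subject, or chair sideways")
    return steps or ["position is within protocol tolerance"]
```

The same output could equally be routed to the robotic arm or electronically controlled chair instead of a human technician, as paragraph [0026] describes.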
[0051] In an embodiment, the thermal imaging protocol includes at least one of the steps of (i) cooling the subject for a particular time interval, (ii) positioning the subject in such a way that the thermal image of the complete chest area with the axilla is visible, (iii) focusing the thermal imaging camera on the subject for capturing a high-resolution thermal image of the subject, (iv) capturing the thermal image of the subject in at least one of a front view, left oblique view, left lateral view, right oblique view or right lateral view or (v) providing the captured thermal image of the subject to the system for further analysis.
The position error identification module 204 automatically computes a plurality of positions (p, q, r) of the breast region with respect to the thermal image. The position p is the normalized distance from the top of the thermal image to the upper end of the breast region, the position q is the normalized distance from the lower end of the breast to the bottom of the thermal image and the position r is the normalized distance of the side boundary of the breast (close to the sternum) from the first or last pixel column of the thermal image. The position error identification module 204 computes a plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol. The deviation dp is a deviation with respect to the visible region above the breast region in the thermal image, the deviation dq is a deviation with respect to the visible region below the breast region in the thermal image and the deviation dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image. In an embodiment, the position error identification module 204 computes a plurality of positions (p, q, r) of the breast region with respect to the thermal image that is obtained by selecting a single image frame of a thermal video or a live stream thermal video. In an embodiment, the values of the positions p, q, r of the breast region for a typical thermal imaging protocol can be predefined for different views. For example, the predetermined value is 15% of the thermal image height for frontal, lateral and oblique views for the visible region above the breast region in the thermal image (p). The predetermined value for the visible region below the breast region in the thermal image (q) is 15% of the thermal image height for frontal, lateral and oblique views.
The predetermined value for the distance of the breast region from either the first or the last pixel column of the thermal image (r) is 50% of the image width for frontal and lateral views and 67.78% of the image width for the oblique view.
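The deviation computation against these predefined protocol values can be sketched directly. The target values below are taken from the description (15% of image height for p and q; 50% or 67.78% of image width for r); the dictionary structure and function name are assumptions of the sketch.

```python
# Protocol target values from the description: p and q are 15% of the
# image height for all views; r is 50% of the image width for frontal
# and lateral views and 67.78% for the oblique view.
PROTOCOL = {
    "frontal": (0.15, 0.15, 0.50),
    "lateral": (0.15, 0.15, 0.50),
    "oblique": (0.15, 0.15, 0.6778),
}

def position_deviations(p, q, r, view):
    """Return (dp, dq, dr): measured positions minus the protocol
    targets for the given view, all as fractions of image size."""
    tp, tq, tr = PROTOCOL[view]
    return p - tp, q - tq, r - tr
```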
[0052] The position adjustment determination module 206 determines a positional adjustment to be made to a view position of the thermal imaging camera or a position of the subject based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region using a machine learning model. The thermal image camera control module 208 provides a set of instructions to a technician or to at least one of a robotic arm holding the camera, an electronically controlled camera stand or an electronically controlled rotating chair to automatically position itself to the suggested angular
adjustment for capturing the new thermal image at the required view angle as per the thermal image protocol. The set of instructions includes at least one of a text, a visual or audio for capturing the new thermal image at the required view angle. In an embodiment, the corrective positioning system 107 provides the new captured thermal image along with corrective position adjustment for an automatic tumor detection and an automated tumor classification to detect cancerous tissue and/or non-cancerous tissue within a breast area of the subject using a thermal image analyzer. In an embodiment, the positional adjustment is provided to automatically adjust the position of the thermal imaging camera to capture the new thermal image at the required position without the user's intervention.
[0053] In an embodiment, the corrective positioning system 107 is adapted with a segmentation system or module that implements an automatic segmenting technique to segment the breast area of the subject in the thermal image by determining (i) an outer side contour of an outline of a boundary of the breast area of the subject from a body silhouette, (ii) an inner side boundary of the breast area from the body silhouette and the view angle of the thermal image, (iii) an upper boundary of the breast area by determining a lower boundary of an isotherm of the axilla of the subject, or (iv) a lower boundary of the breast area by determining an upper boundary of an isotherm of the infra-mammary fold of the person. The corrective positioning system 107 segments the breast area of the subject by connecting the above-determined breast boundaries to segment the breast from the surrounding tissue in the thermal image. In an embodiment, the corrective positioning system 107 may include a machine learning model for automatically segmenting the breast area of the subject in the thermal image. The machine learning model is trained by providing a plurality of thermal images as an input and the corresponding segmentation as an output to obtain a trained deep learning model.
[0054] The corrective positioning system 107 provides the new captured thermal image to the trained deep learning model to predict a final segmentation map. The corrective positioning system 107 may display at least one of the positional adjustment or the segmented breast region on a visualization screen. In an embodiment, the automatic tumor detection includes the steps of (i) determining a percentage of pixels p1 within said selected region of interest with a temperature T1pixel, where T2 ≤ T1pixel ≤ T1, (ii) determining a percentage of pixels p2 within said selected region of interest with a temperature T2pixel, where T3 ≤ T2pixel, and (iii) determining a ratio p3 = Pedge/Pblock, where
Pedge is a number of pixels around a border of a suspected tumor within said region of interest, and Pblock is a number of pixels in the perimeter of the region of interest. T1, T2 and T3 are temperature thresholds obtained from the temperature distribution. The automatic tumor detection includes determining whether a suspected tumor region is one of the cancerous tissue or the non-cancerous tissue using a decision rule R. The decision rule R is based on any combination of R1, R2, R3, where: R1 = (p1 > Threshold1), R2 = (p2 > Threshold2), and R3 = (p3 > Threshold3). In an embodiment, the automatic tumor classification includes the steps of: (i) determining pixel regions m1 within a selected region of interest with a temperature T1pixel, where T2 ≤ T1pixel ≤ T1; (ii) determining pixel regions m2 within the selected region of interest with a temperature T2pixel, where T3 ≤ T2pixel; (iii) extracting the parameters comprising at least one temperature, at least one boundary, at least one area and at least one shape from m1 and m2; and (iv) providing the parameters to a machine learning classifier to determine whether the selected region of interest has a cancerous lesion or not. T1, T2 and T3 are temperature thresholds obtained from the temperature distribution. The automatic tumor classification includes the steps of: (i) training a deep learning model by providing a plurality of thermal images as an input and the corresponding classification as an output to obtain a trained deep learning model; and (ii) providing the new captured thermal image to the trained deep learning model to determine whether a selected region of interest has a cancerous lesion or not.
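The decision rule R over the three pixel statistics can be sketched as follows. The description says R may be "any combination" of R1, R2 and R3, so this sketch uses a simple majority vote as one possible combination; the threshold values are placeholders, not values from the specification.

```python
def decision_rule(p1, p2, p3, t1=0.30, t2=0.20, t3=0.50):
    """Combine the three pixel statistics into a suspicion flag.

    R1, R2, R3 each compare one statistic against its threshold;
    the combination here (at least two of three) and the threshold
    defaults are illustrative assumptions.
    """
    r1 = p1 > t1   # fraction of ROI pixels in the band [T2, T1]
    r2 = p2 > t2   # fraction of ROI pixels hotter than T3
    r3 = p3 > t3   # ratio of tumor-border pixels to ROI-perimeter pixels
    return (r1 + r2 + r3) >= 2
```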
[0055] With reference to FIG. 1, FIG. 3 illustrates an exemplary process flow of the automated Region of Interest (ROI) analysis of a thermal image from a user using a corrective positioning system according to an embodiment herein. At step 302, the thermal image is captured using a thermal imaging camera. In some embodiments, the thermal image may be received or retrieved from a remote device over a network, or from a media such as a CD-ROM or DVD. The thermal image may be downloaded from a web-based system or an application that makes a video available for processing in accordance with the methods disclosed herein. The thermal image may also be received from an application such as those which are available for handheld cellular devices and processed on the cell phone or other handheld computing devices such as an iPad or Tablet-PC. The thermal image may be received directly from a memory or storage device of the imaging device that is used to capture that thermal image or a thermal video. At step 304, the thermal image is obtained by selecting a single image frame of a thermal video or a live
stream thermal video. The thermal video or the live stream thermal video is captured using the thermal imaging camera. At step 306, the breast region in the thermal image is segmented from the thermal image using an automated segmentation technique. At step 308, the segmented breast region is provided for further analysis (e.g. breast cancer screening and tumor classification).
[0056] With reference to FIG. 1, FIG. 4 illustrates an exemplary process flow of an offline positional adjustment for a thermal image from a user using a corrective positioning system according to an embodiment herein. At step 402, the thermal image is captured using a thermal imaging camera. At step 404, the thermal image is uploaded into the corrective positioning system. At step 406, the breast region is segmented from the thermal image using an automated segmentation technique. At step 408, the corrective positioning system computes a plurality of positions (p, q, r) of the breast region with respect to the thermal image and a plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol using a position analyzer. The position p is the normalized distance from the top of the thermal image to the upper end of the breast region, the position q is the normalized distance from the lower end of the breast to the bottom of the thermal image and the position r is the normalized distance of the side boundary of the breast (close to the sternum) from the first or last pixel column of the thermal image. The deviation dp is a deviation with respect to the visible region above the breast region in the thermal image, the deviation dq is a deviation with respect to the visible region below the breast region in the thermal image and the deviation dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image. At step 410, it is determined whether a positional adjustment is to be made to a position of the thermal imaging camera or a position of the subject based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region using a machine learning model.
At step 412, the thermal image is provided for further analysis, for example, breast cancer screening or tumor classification, if the thermal image is captured at the required position as per the thermal imaging protocol. If not, at step 414, a set of instructions to the user is generated for adjusting a position of the thermal imaging camera or the subject for capturing a new thermal image at the required position as per the thermal imaging protocol.
[0057] With reference to FIG. 1, FIG. 5 illustrates an exemplary process flow of a live stream positional adjustment for a thermal image from a user using a corrective positioning system according to an embodiment herein. At step 502, the live stream with sequential frames (e.g. thermal image/video) is captured using a thermal imaging camera. At step 504, a sample rate or adaptive sampling algorithms are applied to analyze selected frames of the thermal video instead of all frames to determine the corrective position for capturing a thermal image. At step 506, the breast region is segmented from the thermal image using an automated segmentation technique. At step 508, the corrective positioning system computes a plurality of positions (p, q, r) of the breast region with respect to the thermal image and a plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol using a position analyzer. The position p is the normalized distance from the top of the thermal image to the upper end of the breast region, the position q is the normalized distance from the lower end of the breast to the bottom of the thermal image and the position r is the normalized distance of the side boundary of the breast (close to the sternum) from the first or last pixel column of the thermal image. The deviation dp is a deviation with respect to the visible region above the breast region in the thermal image, the deviation dq is a deviation with respect to the visible region below the breast region in the thermal image and the deviation dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image.
At step 510, it is determined whether a positional adjustment is to be made to a position of the thermal imaging camera or a position of the subject based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region using a machine learning model. At step 512, the thermal image is provided for further analysis, for example, breast cancer screening or tumor classification, if the thermal image is captured at the required position as per the thermal imaging protocol. If not, at step 514, a set of instructions to the user is generated for adjusting a position of the thermal imaging camera or the subject for capturing a new thermal image at the required position as per the thermal imaging protocol.
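The position and deviation computation of step 508 can be sketched as follows. This is an illustrative sketch only, not the claimed implementation; the bounding-box inputs, image dimensions, and the protocol dictionary keys are assumptions made for the example.

```python
def compute_positions(top, bottom, side, image_h, image_w, left_view=True):
    """Sketch of the position analyzer output (p, q, r) for a segmented
    breast region. 'top'/'bottom' are pixel rows of the upper and lower
    ends of the breast region; 'side' is the pixel column of the
    sternum-side boundary. Values are normalized by image size so they
    are independent of camera resolution (an assumption)."""
    p = top / image_h                    # top of image -> upper end of breast
    q = (image_h - bottom) / image_h     # lower end of breast -> bottom of image
    # distance of the sternum-side boundary from the first (or last) pixel column
    r = side / image_w if left_view else (image_w - side) / image_w
    return p, q, r

def compute_deviations(p, q, r, required):
    """Deviations (dp, dq, dr) of the measured positions from the
    protocol-required positions (given here as a hypothetical dict)."""
    return p - required["p"], q - required["q"], r - required["r"]
```

For a 480x640 frame with the breast region spanning rows 60 to 420 and a sternum-side boundary at column 160, this yields p = q = 0.125 and r = 0.25.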
[0058] With reference to FIG. 1, FIG. 6 illustrates an exemplary process flow of corrective positioning using a corrective positioning system to select a view with adaptive sampling to reduce image frame candidates according to an embodiment herein. At step 602, the thermal video is captured using a thermal imaging camera. At step 604, the thermal video is uploaded to the corrective positioning system. At step 606, a sample rate or adaptive sampling algorithms are applied to analyze the thermal video and select, for each desired view angle (e.g. 45°, 90°, -45°, -90°, 0°), frames of the thermal video which have less deviation in angle and position, instead of all frames, to determine the corrective position. At step 608, the breast region is segmented from the selected frame using an automated segmentation technique. At step 610, the corrective positioning system computes a plurality of positions (p, q, r) of the breast region with respect to the thermal image and a plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol using a position analyzer, and determines a positional adjustment to be made to a position of the thermal imaging camera or a position of the subject based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region using a machine learning model. The position p is the normalized distance from the top of the thermal image to the upper end of the breast region, the position q is the normalized distance from the lower end of the breast to the bottom of the thermal image, and the position r is the normalized distance of the side boundary of the breast (close to the sternum) to the first or last pixel column of the thermal image.
[0059] The deviation dp is a deviation with respect to the visible region above the breast region in the thermal image, the deviation dq is a deviation with respect to the visible region below the breast region in the thermal image, and the deviation dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image. At step 612, the view angle of the selected frame of the thermal image is determined using a tagging classifier. The tagging classifier includes a machine learning model that determines the view angle of the thermal image and classifies the thermal image as one of the discrete views such as a right lateral view, a right oblique view, a frontal view, a left lateral view or a left oblique view as per the thermal imaging protocol. At step 614, the thermal frame which meets the requirement of the thermal imaging protocol is determined by (i) comparing the determined positions with the required positions as per the thermal imaging protocol and (ii) comparing the determined view angle with the required view angle as per the thermal imaging protocol. At step 616, the determined thermal frames are provided for further analysis, for example, breast cancer screening or tumor detection, if the thermal frames are as per the thermal imaging protocol. If not, the process goes back to step 606 to adjust the sampling rate.
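The per-view frame selection of steps 606 through 614 can be illustrated with a short sketch. The per-frame fields (an estimated view angle and a summed positional deviation |dp|+|dq|+|dr|) and the angular tolerance are assumptions for illustration; a real implementation would obtain these from the tagging classifier and position analyzer described above.

```python
def select_best_frames(frames, desired_views, angle_tol=10.0):
    """For each desired view angle (e.g. 0, 45, 90, -45, -90 degrees),
    pick the candidate frame whose estimated view angle falls within
    the tolerance and whose total positional deviation is smallest.
    'frames' is a list of dicts with hypothetical keys 'angle' and
    'deviation'."""
    best = {}
    for view in desired_views:
        candidates = [f for f in frames if abs(f["angle"] - view) <= angle_tol]
        if candidates:
            best[view] = min(candidates, key=lambda f: f["deviation"])
    return best  # views with no acceptable frame are absent: resample
```

Views for which no sampled frame falls within tolerance are simply missing from the result, mirroring the loop back to step 606 to adjust the sampling rate.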
[0060] In an embodiment, the required thermal frames selected from a captured thermal video of a subject (e.g. frames to be considered at 0, ±45, ±90 degrees) are used for automated analysis. The input to the corrective positioning system is either all frames from the thermal video or sampled frames from any adaptive sampling algorithm. The corrective positioning system determines the best frames corresponding to the required position for capturing a thermal image.
[0061] In an embodiment, the frame selection from the thermal video includes determining a view angle of the thermal image from a user using a view angle estimator. It includes determining an angular adjustment to be made to a view position of the thermal imaging camera or a position of the subject by comparing the determined view angle with a required view angle as per the thermal imaging protocol when the thermal image does not meet the required view angle.
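A minimal sketch of this angular-adjustment check, assuming the view angle estimator returns an angle in degrees and that views within a small tolerance are accepted (the tolerance value is an assumption):

```python
def angular_adjustment(estimated_angle, required_angle, tol=5.0):
    """Return the signed rotation (degrees) still needed to reach the
    required view angle; 0.0 means the current view is acceptable."""
    delta = required_angle - estimated_angle
    return 0.0 if abs(delta) <= tol else delta
```

A positive return value indicates rotation toward a larger view angle is needed, a negative value the opposite; the sign convention is illustrative.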
[0062] With reference to FIG. 1, FIG. 7 illustrates an exemplary process flow of automatically identifying a posture and a position of a subject in a thermal image according to an embodiment herein. At step 702, the thermal image is captured using a thermal imaging camera. At step 704, the thermal image is uploaded into a posture and position identification system. At step 706, key physical structures and contours of the body in the thermal image are determined using an automated segmentation technique and an edge detection technique and represented as image points to define a reference body coordinate system in an n-dimensional Euclidean space. At step 708, the posture and position identification system assembles each image point in the Euclidean coordinate system to define the posture and the position of the body. At step 710, the n-dimensional Euclidean axes (X1-Xn) for a particular posture of interest are determined to define the reference body coordinate system. Each Euclidean axis includes values associated with a physical structure or contour of the body. The values of each Euclidean axis (Xi=1-N) represent the relative distance of the respective physical structure or contour of the body from the boundaries of the thermal image. At step 712, the N ordinal values along each corresponding n-dimensional Euclidean axis for the given image are provided as a numerical representation of the subject's posture and position, and as a point in a Euclidean space, to enable the user to perform further analysis, for example, breast cancer screening or tumor classification.
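The n-dimensional Euclidean representation of steps 706 through 712 can be sketched as below. The choice of landmarks and the normalization of each image point by the image boundaries are illustrative assumptions, not the claimed coordinate system.

```python
import math

def posture_vector(landmarks, image_h, image_w):
    """Assemble image points (row, col) of key physical structures and
    contours into a single point in a 2n-dimensional Euclidean space;
    each axis holds the relative distance of one structure from the
    image boundaries."""
    coords = []
    for row, col in landmarks:
        coords.append(row / image_h)   # relative distance from the top boundary
        coords.append(col / image_w)   # relative distance from the left boundary
    return coords

def posture_distance(v1, v2):
    """Euclidean distance between two posture vectors, e.g. the current
    posture versus a reference posture of interest."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v1, v2)))
```

Two images of the same posture map to nearby points in this space, so a small distance to a reference vector indicates the posture of interest.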
[0063] FIG. 8A and 8B illustrate a flow diagram of one embodiment of the present method for identifying errors associated with subject positioning in a thermal image from a user and generating a feedback to enable the user for adaptive positioning of the subject for capturing a new thermal image, according to an embodiment herein. At step 802, a thermal image of a body of a subject is received. The thermal image represents the temperature distribution on the body of the subject, with pixels having the highest temperature value displayed in a first color, pixels having the lowest temperature value displayed in a second color, and pixels with temperature values between the lowest and highest displayed in gradations of color between the first and second colors. At step 804, a breast region in the thermal image is determined by segmenting the breast region from the thermal image using an automated segmentation technique. At step 806, a plurality of positions (p, q, r) of the breast region with respect to the thermal image is computed using a position analyzer. The position p is the normalized distance from the top of the thermal image to the upper end of the breast region, the position q is the normalized distance from the lower end of the breast to the bottom of the thermal image, and the position r is the normalized distance of the side boundary of the breast (close to the sternum) to the first or last pixel column of the thermal image. At step 808, a plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region is computed by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol using the position analyzer.
The deviation dp is a deviation with respect to the visible region above the breast region in the thermal image, the deviation dq is a deviation with respect to the visible region below the breast region in the thermal image, and the deviation dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image. At step 810, a positional adjustment to be made to a position of the thermal imaging camera or a position of the subject is determined based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region using a machine learning model. At step 812, a set of instructions is generated to the user for adjusting a position of the thermal imaging camera or the subject for capturing a new thermal image at the required position as per the thermal imaging protocol.
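The specification entrusts step 810 to a trained machine learning model; purely as a hedged illustration, a rule-based stand-in can show how deviations (dp, dq, dr) might map to user instructions. The sign conventions, threshold, and instruction wording below are assumptions, not part of the claimed method.

```python
def generate_instructions(dp, dq, dr, tol=0.05):
    """Illustrative rule-based stand-in for the machine learning model:
    map positional deviations to corrective instructions. A positive
    deviation is assumed to mean 'more than the protocol requires'."""
    instructions = []
    if abs(dp) > tol:
        instructions.append(
            "reduce visible region above the breast" if dp > 0
            else "increase visible region above the breast")
    if abs(dq) > tol:
        instructions.append(
            "reduce visible region below the breast" if dq > 0
            else "increase visible region below the breast")
    if abs(dr) > tol:
        instructions.append(
            "move camera away from the sternum side" if dr > 0
            else "move camera toward the sternum side")
    return instructions or ["position meets protocol: capture image"]
```

When all deviations fall within tolerance, the frame is accepted and the capture can proceed, matching the branch into step 812 only on failure.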
[0064] FIG. 9 illustrates a block diagram of one example of a corrective positioning system/image processing system 900 for processing a thermal image in accordance with the embodiments described with respect to the flow diagram of FIG. 8A and 8B according to an embodiment herein. Image Receiver 902 wirelessly receives the video via antenna 901, having been transmitted thereto from the video/thermal imaging device 101 of FIG. 1. Temperature Processor 903 performs a temperature-based method to detect pixels in the received image. Position Analyzer 904 computes a plurality of positions (p, q, r) of the breast region with respect to the thermal image, and a plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region is computed by comparing the plurality of positions of the breast region with a required position as per the thermal imaging protocol. The position p is the normalized distance from the top of the thermal image to the upper end of the breast region, the position q is the normalized distance from the lower end of the breast to the bottom of the thermal image, and the position r is the normalized distance of the side boundary of the breast (close to the sternum) to the first or last pixel column of the thermal image. The deviation dp is a deviation with respect to the visible region above the breast region in the thermal image, the deviation dq is a deviation with respect to the visible region below the breast region in the thermal image, and the deviation dr is a deviation with respect to the distance of the breast region from either the first or the last pixel column of the thermal image. Both modules 903 and 904 store their results to storage device 905. Machine Learning Model 906 retrieves the results from the storage device 905 and determines a positional adjustment to be made to a position of the thermal imaging camera 101 or a position of the subject based on the plurality of deviations (dp, dq, dr) in the plurality of positions of the breast region.
When the thermal image does not meet the required position for image capture as per the thermal imaging protocol, the machine learning model 906 generates a set of instructions to the user for adjusting the position of the thermal imaging camera 101 for capturing the new thermal image at the required position as per the thermal imaging protocol. Central Processing Unit 908 retrieves machine-readable program instructions from a memory 909 and facilitates the functionality of any of the modules of the system 900. CPU 908, operating alone or in conjunction with other processors, may be configured to assist or otherwise perform the functionality of any of the modules or processing units of the system 900, as well as facilitating communication between the system 900 and the workstation 910.
[0065] System 900 is shown having been placed in communication with a workstation 910. A computer case of the workstation houses various components such as a motherboard with a processor and memory, a network card, a video card, a hard drive capable of reading/writing to machine-readable media 911 such as a floppy disk, optical disk, CD-ROM, DVD, magnetic tape, and the like, and other software and hardware needed to perform the functionality of a computer workstation. The workstation further includes a display device 912, such as a CRT, LCD, or touch screen device, for displaying information, images, view angles, and the like. A user can view any of that information and make a selection from menu options displayed thereon. Keyboard 913 and mouse 914 effectuate a user input. It should be appreciated that the workstation has an operating system and other specialized software configured to display alphanumeric values, menus, scroll bars, dials, slidable bars, pull-down options, selectable buttons, and the like, for entering, selecting, modifying, and accepting information needed for processing in accordance with the teachings hereof. The workstation is further enabled to display thermal images, position adjustments to thermal images, and the like as they are derived. A user or technician may use the user interface of the workstation to set parameters, view/adjust the position, and adjust various aspects of how the position adjustment is performed, as needed or as desired, depending on the implementation. Any of these selections or inputs may be stored to or retrieved from storage device 911. Default settings can be retrieved from the storage device. A user of the workstation is also able to view or manipulate any of the data in the patient records, collectively at 915, stored in database 916. Any of the received images, results, determined view angle, and the like, may be stored to a storage device internal to the workstation 910.
Although shown as a desktop computer, the workstation can be a laptop, mainframe, or a special purpose computer such as an ASIC, circuit, or the like.
[0066] Any of the components of the workstation may be placed in communication with any of the modules and processing units of system 900. Any of the modules of the system 900 can be placed in communication with storage devices 905, 916 and 106 and/or computer-readable media 911 and may store/retrieve therefrom data, variables, records, parameters, functions, and/or machine-readable/executable program instructions, as needed to perform their intended functions. Each of the modules of the system 900 may be placed in communication with one or more remote devices over network 917. It should be appreciated that some or all of the functionality performed by
any of the modules or processing units of the system 900 can be performed, in whole or in part, by the workstation. The embodiment shown is illustrative and should not be viewed as limiting the scope of the appended claims strictly to that configuration. Various modules may designate one or more components which may, in turn, comprise software and/or hardware designed to perform the intended function.
[0067] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope.

Documents

Application Documents

# Name Date
1 201941042222-STATEMENT OF UNDERTAKING (FORM 3) [18-10-2019(online)].pdf 2019-10-18
2 201941042222-PROOF OF RIGHT [18-10-2019(online)].pdf 2019-10-18
3 201941042222-POWER OF AUTHORITY [18-10-2019(online)].pdf 2019-10-18
4 201941042222-FORM FOR STARTUP [18-10-2019(online)].pdf 2019-10-18
5 201941042222-FORM FOR SMALL ENTITY(FORM-28) [18-10-2019(online)].pdf 2019-10-18
6 201941042222-FORM 1 [18-10-2019(online)].pdf 2019-10-18
7 201941042222-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [18-10-2019(online)].pdf 2019-10-18
8 201941042222-EVIDENCE FOR REGISTRATION UNDER SSI [18-10-2019(online)].pdf 2019-10-18
9 201941042222-DRAWINGS [18-10-2019(online)].pdf 2019-10-18
10 201941042222-DECLARATION OF INVENTORSHIP (FORM 5) [18-10-2019(online)].pdf 2019-10-18
11 201941042222-COMPLETE SPECIFICATION [18-10-2019(online)].pdf 2019-10-18
12 Correspondence by Agent_Form1,Form26_21-10-2019.pdf 2019-10-21
13 abstract 201941042222.jpg 2019-10-21
14 201941042222-FORM-9 [29-10-2019(online)].pdf 2019-10-29
15 201941042222-STARTUP [04-11-2019(online)].pdf 2019-11-04
16 201941042222-FORM28 [04-11-2019(online)].pdf 2019-11-04
17 201941042222-FORM 18A [04-11-2019(online)].pdf 2019-11-04
18 201941042222-FER.pdf 2019-11-22
19 201941042222-FORM-26 [07-05-2020(online)].pdf 2020-05-07
20 201941042222-OTHERS [12-05-2020(online)].pdf 2020-05-12
21 201941042222-FER_SER_REPLY [12-05-2020(online)].pdf 2020-05-12
22 201941042222-CORRESPONDENCE [12-05-2020(online)].pdf 2020-05-12
23 201941042222-COMPLETE SPECIFICATION [12-05-2020(online)].pdf 2020-05-12
24 201941042222-CLAIMS [12-05-2020(online)].pdf 2020-05-12
25 201941042222-ABSTRACT [12-05-2020(online)].pdf 2020-05-12
26 201941042222-US(14)-HearingNotice-(HearingDate-10-07-2020).pdf 2020-06-25
27 201941042222-Correspondence to notify the Controller [04-07-2020(online)].pdf 2020-07-04
28 201941042222-US(14)-HearingNotice-(HearingDate-19-08-2020).pdf 2020-07-08
29 201941042222-Correspondence to notify the Controller [18-08-2020(online)].pdf 2020-08-18
30 201941042222-Correspondence to notify the Controller [18-08-2020(online)]-1.pdf 2020-08-18
31 201941042222-Annexure [18-08-2020(online)].pdf 2020-08-18
32 201941042222-Request Letter-Correspondence [17-11-2020(online)].pdf 2020-11-17
33 201941042222-Power of Attorney [17-11-2020(online)].pdf 2020-11-17
34 201941042222-FORM28 [17-11-2020(online)].pdf 2020-11-17
35 201941042222-Form 1 (Submitted on date of filing) [17-11-2020(online)].pdf 2020-11-17
36 201941042222-Covering Letter [17-11-2020(online)].pdf 2020-11-17
37 201941042222-CERTIFIED COPIES TRANSMISSION TO IB [17-11-2020(online)].pdf 2020-11-17
38 201941042222-FORM 3 [31-12-2020(online)].pdf 2020-12-31
39 201941042222-US(14)-ExtendedHearingNotice-(HearingDate-27-12-2021).pdf 2021-12-16
40 201941042222-Correspondence to notify the Controller [26-12-2021(online)].pdf 2021-12-26
41 201941042222-Annexure [26-12-2021(online)].pdf 2021-12-26
42 201941042222-Written submissions and relevant documents [10-01-2022(online)].pdf 2022-01-10
43 201941042222-PatentCertificate31-01-2022.pdf 2022-01-31
44 201941042222-IntimationOfGrant31-01-2022.pdf 2022-01-31
45 201941042222-RELEVANT DOCUMENTS [28-08-2023(online)].pdf 2023-08-28

Search Strategy

1 Search_21-11-2019.pdf
2 searchAE_25-06-2020.pdf

ERegister / Renewals

3rd: 25 Apr 2022 (From 18/10/2021 To 18/10/2022)
4th: 25 Apr 2022 (From 18/10/2022 To 18/10/2023)
5th: 25 Apr 2022 (From 18/10/2023 To 18/10/2024)
6th: 25 Apr 2022 (From 18/10/2024 To 18/10/2025)
7th: 25 Apr 2022 (From 18/10/2025 To 18/10/2026)
8th: 25 Apr 2022 (From 18/10/2026 To 18/10/2027)
9th: 25 Apr 2022 (From 18/10/2027 To 18/10/2028)
10th: 25 Apr 2022 (From 18/10/2028 To 18/10/2029)
11th: 25 Apr 2022 (From 18/10/2029 To 18/10/2030)
12th: 25 Apr 2022 (From 18/10/2030 To 18/10/2031)
13th: 25 Apr 2022 (From 18/10/2031 To 18/10/2032)
14th: 25 Apr 2022 (From 18/10/2032 To 18/10/2033)
15th: 25 Apr 2022 (From 18/10/2033 To 18/10/2034)