Abstract: SYSTEM AND METHOD FOR DETERMINING HIGH TEMPERATURE REGION PIXELS ON A MAMMOGRAPHY IMAGE. A system (104) for annotating mammography images of a subject (100) using thermal images and mammography images of the subject (100) by (i) identifying a first breast region in the thermal image and a second breast region in the mammography image, (ii) identifying a block of pixels (Pt) with a high temperature region associated with a breast lesion within the first breast region, (iii) estimating a location (l) of the breast lesion corresponding to the identified block of pixels (Pt), (iv) determining a block of pixels (Pm) corresponding to the location (l) of the block of pixels (Pt) within the second breast region and (v) generating a report with an annotated mammography image with markings of the determined block of pixels (Pm) on the mammography image of the subject (100) corresponding to the block of pixels (Pt) on the thermal image. FIG. 1
Claims:
I/ We Claim:
1. A system (104) for annotating mammography images of a subject (100) using thermal images and mammography images of the subject (100) by determining a block of pixels on the mammography images corresponding to high temperature regions on the thermal images of the subject (100), the system (104) comprising:
a thermal imaging device (101) that captures a thermal image of the subject;
a mammography imaging device (106) that captures the mammography image of the subject (100); and
a processor (108) that is configured to:
characterized in that,
identify a first breast region in the thermal image of the subject (100);
identify a second breast region in the mammography image of the subject (100);
identify a block of pixels (Pt) with the high temperature region associated with a breast lesion within the first identified breast region;
estimate a location (l) of the breast lesion corresponding to the identified block of pixels (Pt);
determine, using a first machine learning model, a block of pixels (Pm) on the mammography image corresponding to the location (l) of the block of pixels (Pt) within the second breast region; and
generate a report with an annotated mammography image with a marking of a determined block of pixels (Pm) on the mammography image of the subject corresponding to the block of pixels (Pt) associated with the high temperature regions on the thermal image to enable lesion identification on the mammography image of the subject (100).
2. The system (104) as claimed in claim 1, wherein the thermal imaging device (101) comprises:
an array of sensors that converts infrared energy into electrical signals on a per-pixel basis, wherein the array of sensors detects temperature values from the subject; and
a specialized thermal processor that processes detected temperature values into pixels of a thermal image, wherein intensity values of the pixels correspond to the detected temperature values.
3. The system (104) as claimed in claim 1, wherein the mammography imaging device (106) comprises:
an X-ray tube that produces low energy X-rays;
a plurality of filters that are placed in a path of an X-ray beam to modify an X-ray spectrum that is projected on a body of the subject (100);
a plurality of compression paddles that is attached to the body of the subject (100) to compress a part of the body of the subject (100) being exposed to the X-rays to obtain cross section density information; and
a specialized mammogram processor that converts obtained cross section density information into pixels to generate a mammography image, wherein intensity values of the pixels correspond to the obtained cross section density information on a per-pixel basis.
4. The system (104) as claimed in claim 1, wherein the processor (108) is configured to identify the block of pixels (Pt) with high temperature regions on the thermal image of the breast region of the subject (100) by determining a first pixel region (m1) with a temperature Tpixel, where T2 ≤ Tpixel ≤ T1, wherein T1 and T2 are temperature thresholds obtained from the temperature distribution of the thermal image of the subject.
5. The system (104) as claimed in claim 1, wherein the processor (108) is configured to identify the block of pixels (Pt) with the high temperature regions on the thermal image of the breast region of the subject (100) by:
determining the first pixel region (m1) with a temperature T¹pixel, where T2 ≤ T¹pixel ≤ T1;
determining a second pixel region (m2) with a temperature T²pixel, where T3 ≤ T²pixel; and
detecting a plurality of hotspot regions using the first pixel region (m1) and the second pixel region (m2) with AND or OR rules, wherein T1, T2 and T3 are temperature thresholds obtained from the temperature distribution of the thermal image of the subject.
6. The system (104) as claimed in claim 1, wherein the processor (108) is configured to obtain the plurality of high temperature regions on the first breast region of the subject (100) as an input from the user to estimate the location (l) of the breast lesion corresponding to the block of pixels (Pt) in the thermal image of the subject (100).
7. The system (104) as claimed in claim 1, wherein the processor (108) is configured to identify the block of pixels (Pt) with high temperature regions on the first breast region of the subject (100) using a second machine learning model, wherein the second machine learning model is trained by providing a plurality of thermal images and corresponding annotated high temperature regions associated with different patients as training data.
8. The system (104) as claimed in claim 1, wherein the first machine learning model identifies the block of pixels (Pm) on the mammography image corresponding to the location (l) on the thermography image by:
obtaining a breast quadrant corresponding to the location (l);
identifying a view of the mammography image;
dividing the mammography image into different quadrants; and
identifying the block of pixels (Pm) that lies within an obtained quadrant corresponding to the location (l) of the breast lesion on the thermographic image.
9. The system (104) as claimed in claim 1, wherein the first machine learning model identifies the block of pixels (Pm) on the mammography image corresponding to the location (l) on the thermography image by:
identifying candidate blocks of pixels with high density in the mammography image;
obtaining a nipple point (N) on the mammography image;
calculating locations (lm1-n) of each candidate block of pixels with respect to the obtained nipple point (N);
comparing the locations (lm1-n) of each of the candidate blocks of pixels with the location (l) of the breast lesion on the thermographic image; and
selecting the block of pixels (Pm) among the candidate blocks of pixels by selecting a block of pixels corresponding to a nearest location (lmi) among the locations (lm1-n) that is close to the location (l) of the breast lesion on the thermographic image.
10. The system (104) of claim 1, wherein the first machine learning model identifies the block of pixels (Pm) on the mammography image corresponding to the location (l) on the thermography image by:
obtaining a clock position (θ) and a distance (r) corresponding to the location (l);
dividing the mammography image into different sectors corresponding to different clock positions; and
identifying the block of pixels (Pm) that lies within the sector corresponding to the clock position (θ) and distance (r) from the nipple region in the mammography image.
11. The system (104) as claimed in claim 9, wherein the processor (108) is configured to identify the block of pixels (Pm) as the high-density regions in the mammography image within the sector corresponding to the clock position (θ) and distance (r) from the nipple region in the mammography image.
12. The system (104) as claimed in claim 10, wherein the processor (108) is configured to identify the high-density regions on the mammography image of the subject (100) using a third machine learning model, wherein the third machine learning model is trained by providing a plurality of mammography images and the corresponding annotated high-density lesions associated with different patients as training data.
13. The system (104) as claimed in claim 1, wherein the report comprises at least one of:
an annotated mammography image with markings of the determined block of pixels (Pm) in a different color as annotations on the mammography image;
an annotated mammography image with markings of the boundary of the determined block of pixels (Pm) in a different color as annotations on the mammography image; and
a text report that comprises quantitative parameters of the block of pixels (Pm) on the mammography image corresponding to the high temperature region on the thermal image.
14. The system (104) as claimed in claim 12, wherein the markings comprise an annotation of the block of pixels (Pm) on the mammogram image corresponding to the high temperature region on the thermal image and a text annotation that comprises quantitative parameters of the block of pixels (Pm).
15. A method for annotating mammography images of a subject (100) using thermal images and mammography images of the subject (100) by determining a block of pixels on the mammography images corresponding to high temperature regions on the thermal images of the subject (100), comprising:
capturing a thermal image of the subject (100) using a thermal imaging device (101);
capturing the mammography image of the subject using a mammography imaging device (106);
identifying a first breast region in the thermal image of the subject (100);
identifying a second breast region in the mammography image of the subject (100);
identifying a block of pixels (Pt) with the high temperature region associated with a breast lesion within the first identified breast region;
estimating a location (l) of the breast lesion corresponding to the identified block of pixels (Pt);
determining, using a first machine learning model, a block of pixels (Pm) on the mammography image corresponding to the location (l) of the block of pixels (Pt) within the second breast region; and
generating a report with an annotated mammography image with a marking of a determined block of pixels (Pm) on the mammography image of the subject corresponding to the block of pixels (Pt) associated with the high temperature regions on the thermal image to enable lesion identification on the mammography image of the subject (100).
Description:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
TITLE OF THE INVENTION
SYSTEM AND METHOD FOR DETERMINING HIGH TEMPERATURE REGION PIXELS ON A MAMMOGRAPHY IMAGE
APPLICANT
NAME: NIRAMAI HEALTH ANALYTIX PVT LTD
NATIONALITY: INDIAN
ADDRESS:
PREAMBLE TO THE DESCRIPTION:
The following specification particularly describes the invention and the manner in which it is
to be performed.
SYSTEM AND METHOD FOR DETERMINING HIGH-TEMPERATURE REGION
PIXELS ON A MAMMOGRAPHY IMAGE
Technical Field
[0001] The embodiments herein generally relate to thermography and mammography, and more particularly to a system and method for annotating mammography images of a subject using thermal images and mammography images of the subject by determining a block of pixels on the mammography images corresponding to high temperature regions on the thermal images.
Description of the Related Art
[0002] In today's world, along with technological advances, diseases and illnesses of all types are growing as well. Cancer has cropped up as one such disease causing an alarming rate of deaths. The detection of cancer along with its severity is one of the major problems with a disease such as cancer. By the time the symptoms of cancer are realized, the severity and spread would have already reached a dire stage. Researchers across the world have identified breast cancer as the leading type of cancer that is widespread among women. Breast cancer ranks second among all cancers with respect to death rates worldwide. Survival rates are extremely low, especially in developing countries like India. Research shows high survival rates, improved quality of life, and cost-effective treatment for those whose breast cancer is detected at an early stage. Mammography is commonly used to examine human breasts for primary screening and diagnosis. Radiologists use mammographic images to detect breast cancer, as it is efficient in recording the visualized details of the internal regions of the breast. However, the accuracy of mammography drops drastically to about 50% for women with dense breasts, making the usage of mammography ineffective for young women. Further, interpretation of mammography images requires high expertise. In developing countries like India, this becomes a major hurdle due to the shortage of experienced radiologists.
[0003] Thermography is an FDA-approved adjunct modality for breast cancer detection. Thermography captures the amount of heat radiating from the surface of the body and, in particular, breast thermography measures the temperature patterns and thermal distribution on the chest due to the high metabolism associated with tumorous growth. There are several advantages of breast thermography compared to other breast imaging modalities. It is low-cost imaging, non-contact, works on women of all age groups, irradiation-free and privacy aware. This also makes it suitable for developing countries like India. Today, detecting an early-stage breast cancer might not be possible with a single standalone test. Therefore, the use of breast thermography as an adjunct to mammography can help in improving the detection of breast cancers. However, this requires correlation of the high thermal regions on thermal images to the high density regions on the mammography images. This is not a trivial problem as the reference systems of mammography imaging and thermal imaging are different. Mammography captures the cross-section density information of the breast, whereas thermography captures surface temperature distribution.
[0004] Therefore, there arises a need to address the aforementioned technical drawbacks in existing technologies to determine an overlapping lesion region of a thermal image on a mammography image.
SUMMARY
[0005] In view of the foregoing, an embodiment herein provides a system for annotating mammography images of a subject using thermal images and mammography images of the subject by determining a block of pixels on the mammography images corresponding to high temperature regions on the thermal images. The system includes a thermal imaging device, a mammography imaging device and a processor. The thermal imaging device captures a thermal image of the subject. The mammography imaging device captures the mammography image of the subject. The processor is configured to (i) identify a first breast region in the thermal image of the subject, (ii) identify a second breast region in the mammography image of the subject, (iii) identify a block of pixels (Pt) with the high temperature region associated with a breast lesion within the first identified breast region, (iv) estimate a location (l) of the breast lesion corresponding to the identified block of pixels (Pt), (v) determine, using a first machine learning model, a block of pixels (Pm) on the mammography image corresponding to the location (l) of the block of pixels (Pt) within the second breast region and (vi) generate a report with an annotated mammography image with a marking of a determined block of pixels (Pm) on the mammography image of the subject corresponding to the block of pixels (Pt) associated with the high temperature regions on the thermal image to enable lesion identification on the mammography image of the subject.
[0006] In some embodiments, the thermal imaging device includes an array of sensors and a specialized thermal processor. The array of sensors converts infrared energy into electrical signals on a per-pixel basis. The array of sensors detects temperature values from the subject. The specialized thermal processor processes detected temperature values into pixels of a thermal image. Intensity values of the pixels correspond to the detected temperature values.
[0007] In some embodiments, the mammography imaging device includes an X-ray tube, a plurality of filters, a plurality of compression paddles and a specialized mammogram processor. The X-ray tube produces low energy X-rays. The plurality of filters is placed in a path of an X-ray beam to modify an X-ray spectrum that is projected on a body of the subject. The plurality of compression paddles is attached to the body of the subject to compress a part of the body of the subject being exposed to the X-rays to obtain cross section density information. The specialized mammogram processor converts obtained cross section density information into pixels to generate a mammography image. Intensity values of the pixels correspond to the obtained cross section density information on a per-pixel basis.
[0008] In some embodiments, the processor is configured to identify the block of pixels (Pt) with high temperature regions on the thermal image of the breast region of the subject by determining a first pixel region (m1) with a temperature Tpixel, where T2 ≤ Tpixel ≤ T1. T1 and T2 are temperature thresholds obtained from a temperature distribution.
[0009] In some embodiments, the processor is configured to identify the block of pixels (Pt) with the high temperature regions on the thermal image of the breast region of the subject by (i) determining the first pixel region (m1) with a temperature T¹pixel, where T2 ≤ T¹pixel ≤ T1, (ii) determining a second pixel region (m2) with a temperature T²pixel, where T3 ≤ T²pixel, and (iii) detecting a plurality of hotspot regions using the first pixel region (m1) and the second pixel region (m2) with AND or OR rules. T1, T2 and T3 are temperature thresholds obtained from a temperature distribution.
[0010] In some embodiments, the processor is configured to obtain the plurality of high temperature regions on the first breast region of the subject as an input from the user to estimate the location (l) of the breast lesion corresponding to the block of pixels (Pt) in the thermal image of the subject.
[0011] In some embodiments, the processor is configured to identify the block of pixels (Pt) with high temperature regions on the first breast region of the subject using a second machine learning model. The second machine learning model is trained by providing a plurality of thermal images and corresponding annotated high temperature regions associated with different patients as training data. In some embodiments, the first machine learning model identifies the block of pixels (Pm) on the mammography image corresponding to the location (l) on the thermography image by (i) obtaining a breast quadrant corresponding to the location (l), (ii) identifying a view of the mammography image, (iii) dividing the mammography image into different quadrants and (iv) identifying the block of pixels (Pm) that lies within an obtained quadrant corresponding to the location (l) of the breast lesion on the thermographic image.
[0012] In some embodiments, the first machine learning model identifies the block of pixels (Pm) on the mammography image corresponding to the location (l) on the thermography image by (i) identifying candidate blocks of pixels with high density in the mammography image, (ii) obtaining a nipple point (N) on the mammography image, (iii) calculating locations (lm1-n) of each candidate block of pixels with respect to the obtained nipple point (N), (iv) comparing the locations (lm1-n) of each of the candidate blocks of pixels with the location (l) of the breast lesion on the thermographic image and (v) selecting the block of pixels (Pm) among the candidate blocks of pixels by selecting a block of pixels corresponding to a nearest location (lmi) among the locations (lm1-n) that is close to the location (l) of the breast lesion on the thermographic image.
[0013] In some embodiments, the first machine learning model identifies the block of pixels (Pm) on the mammography image corresponding to the location (l) on the thermography image by (i) obtaining a clock position (θ) and a distance (r) from the location (l), (ii) dividing the mammography image into different sectors corresponding to different clock positions and (iii) identifying the block of pixels (Pm) that lies within the sector corresponding to the clock position (θ) and distance (r) from the nipple region in the mammography image.
[0014] In some embodiments, the processor is configured to identify the block of
pixels (Pm) as the high-density regions in the mammography image within the sector
corresponding to the clock position (θ) and distance (r) from the nipple region in the
mammography image.
[0015] In some embodiments, the processor is configured to identify the high-density regions on the mammography image of the subject using a third machine learning model. The third machine learning model is trained by providing a plurality of mammography images and the corresponding annotated high-density lesions associated with different patients as training data.
[0016] In some embodiments, the report includes at least one of (i) an annotated mammography image with markings of the determined block of pixels (Pm) in a different color as annotations on the mammography image, (ii) an annotated mammography image with markings of the boundary of the determined block of pixels (Pm) in a different color as annotations on the mammography image and (iii) a text report that includes quantitative parameters of the block of pixels (Pm) on the mammography image corresponding to the high temperature region on the thermal image.
[0017] In some embodiments, the markings include an annotation of the block of
pixels (Pm) on the mammogram image corresponding to the high temperature region on the
thermal image and a text annotation that includes quantitative parameters of the block of pixels
(Pm).
[0018] In another aspect, a method for annotating mammography images of a subject using thermal images and mammography images of the subject by determining a block of pixels on the mammography images corresponding to high temperature regions on the thermal images is provided. The method includes (i) capturing a thermal image of the subject using a thermal imaging device, (ii) capturing the mammography image of the subject using a mammography imaging device, (iii) identifying a first breast region in the thermal image of the subject, (iv) identifying a second breast region in the mammography image of the subject, (v) identifying a block of pixels (Pt) with the high temperature region associated with a breast lesion within the first identified breast region, (vi) estimating a location (l) of the breast lesion corresponding to the identified block of pixels (Pt), (vii) determining, using a first machine learning model, a block of pixels (Pm) on the mammography image corresponding to the location (l) of the block of pixels (Pt) within the second breast region and (viii) generating a report with an annotated mammography image with a marking of a determined block of pixels (Pm) on the mammography image of the subject corresponding to the block of pixels (Pt) associated with the high temperature regions on the thermal image to enable lesion identification on the mammography image of the subject.
[0019] The system and method allow the thermal image to be used as an adjunct to the mammography image in a completely automated manner. This adjunctive-ness allows the physician to identify the high-density regions that include high thermal activities. This information is used for at least one of diagnosis, prognosis, and treatment monitoring. The system and method include the adjunctive-ness that is used by a machine-learning algorithm to improve the overall accuracy for breast cancer detection.
[0020] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] The embodiments herein will be better understood from the following
detailed description with reference to the drawings, in which:
[0022] FIG. 1 illustrates a system view of annotating mammography images of a
subject using thermal images and mammography images of the subject by determining a block
of pixels on the mammography images corresponding to high temperature regions on the
thermal images according to an embodiment herein;
[0023] FIG. 2 illustrates an exploded view of the computing system 104 according
to an embodiment herein;
[0024] FIG. 3A and 3B illustrate an exemplary location of a breast lesion on a breast region of a subject according to some embodiments herein;
[0025] FIG. 4 illustrates an exemplary view of a location of a high temperature
region determined using a nipple point as an origin and a breast region as the boundary
according to an embodiment herein;
[0026] FIG. 5A to 5B illustrate an exemplary view of the high temperature region
associated with a breast lesion that is identified on a breast region of the subject according to
an embodiment herein;
[0027] FIG. 6A and 6B illustrate an exemplary view of dividing the mammography
image into different sectors using different clock positions according to an embodiment herein;
[0028] FIG. 7A to 7C illustrate an exemplary view of an annotation of the block of
pixels on a mammography image corresponding to high temperature region on a thermal image
according to an embodiment herein;
[0029] FIG. 7D illustrates an exemplary mammography image with a text annotation
of the high temperature region according to an embodiment herein;
[0030] FIG. 8A and 8B are a flow diagram that illustrates a method for annotating mammography images of a subject using thermal images and mammography images of the subject by determining a block of pixels on the mammography images corresponding to high temperature regions on the thermal images according to an embodiment herein; and
[0031] FIG. 9 illustrates a block diagram of one example system for determining a
block of pixels on the mammography images corresponding to high temperature regions on the
thermal images using thermal images and mammography images in accordance with the
embodiments described with respect to the flow diagram of FIG. 8 according to some
embodiments herein.
DETAILED DESCRIPTION OF THE DRAWINGS
[0032] The embodiments herein and the various features and advantageous details
thereof are explained more fully with reference to the non-limiting embodiments that are
illustrated in the accompanying drawings and detailed in the following description.
Descriptions of well-known components and processing techniques are omitted so as to not
unnecessarily obscure the embodiments herein. The examples used herein are intended merely
to facilitate an understanding of ways in which the embodiments herein may be practiced and
to further enable those of skill in the art to practice the embodiments herein. Accordingly, the
examples should not be construed as limiting the scope of the embodiments herein.
[0033] As mentioned, there remains a need for a system and a method for
determining a block of pixels corresponding to high temperature regions on a mammography
image by annotating mammography images of a subject using thermal images and
mammography images of the subject. Referring now to the drawings, and more particularly to
FIGS. 1 through 10, where similar reference characters denote corresponding features
consistently throughout the figures, there are shown preferred embodiments.
[0034] A "person" and “subject” refer to either a male or a female. Gender pronouns
are not to be viewed as limiting the scope of the appended claims strictly to females. Moreover,
25 although the term “person” or “patient” or “subject” is used interchangeably throughout this
disclosure, it should be appreciated that the person undergoing breast cancer screening may be
something other than a human such as, for example, a primate. Therefore, the use of such terms
is not to be viewed as limiting the scope of the appended claims to humans.
[0035] A “breast area” refers to the tissue of the breast and may further include
surrounding tissue as is deemed appropriate for breast cancer screening.
[0036] A “thermal camera” refers to either a still camera or a video camera with a
lens that focuses infrared energy from objects in a scene onto an array of specialized sensors
which convert infrared energy across a desired thermal wavelength band into electrical signals
on a per-pixel basis and which output an array of pixels with colours that correspond to
temperatures of the objects in the image.
[0037] A "thermographic image" or simply a “thermal image” is an image captured
by a thermal camera. The thermographic image comprises an array of color pixels with each
color being associated with temperature. Pixels with a higher temperature value are displayed
in the thermal image in a first color and pixels with a lower temperature value are displayed in
a second color. Pixels with temperature values between the lower and higher temperature values
are displayed in gradations of color between the first and second colors.
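As a simple illustration of the colour mapping described for a thermographic image, the following sketch linearly blends between two colours according to per-pixel temperature; the choice of blue for the lowest and red for the highest temperature, and the use of NumPy, are assumptions made for illustration only and are not fixed by this description.

```python
import numpy as np

def temperatures_to_colors(temp, low_color=(0, 0, 255), high_color=(255, 0, 0)):
    """Map per-pixel temperatures to RGB colours.

    Pixels at the minimum temperature take low_color, pixels at the maximum
    temperature take high_color, and intermediate temperatures are shown in
    gradations between the two colours.
    """
    t = (temp - temp.min()) / max(temp.max() - temp.min(), 1e-6)  # normalise to [0, 1]
    low = np.array(low_color, dtype=np.float32)
    high = np.array(high_color, dtype=np.float32)
    return (low + t[..., None] * (high - low)).astype(np.uint8)
```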
[0038] A “mammography imaging device” generates the mammography image of the breast area of the subject. The breast area of the subject is placed on a flat support plate and compressed with a parallel plate called a paddle. A mammography imaging device produces a small burst of x-rays that pass through the breast area to a detector that is located on the opposite side. The detector captures the x-ray image on film, or a solid-state detector, which transmits electronic signals to a computer to form a mammography image.
[0039] A “mammography image” is an X-ray image of the breast area of the subject.
The background of the mammography image is displayed in black and the breast is in grays
and whites.
[0040] FIG. 1 illustrates a system view of annotating mammography images of a subject using thermal images and mammography images of the subject by determining a block of pixels on the mammography images corresponding to high temperature regions on the thermal images according to an embodiment herein. The system view includes a subject 100, a thermal imaging device 101, a computing system 104, and a mammography imaging device 106. The computing device 104 includes a processor 108 and a storage device 110. In one embodiment, the thermal imaging camera 101 is mounted on a slidable and axially rotatable robotic arm that is capable of moving the thermal imaging camera 101 along a semicircular trajectory 103 in front of the patient/subject from side to side such that thermographic images may be captured in a right-side view, a front view, and a left-side view, and various oblique angles in between. In some embodiments, the thermographic images are captured in different views manually by moving the thermal imaging device 101 or the subject by a user. In some embodiments, the thermographic images are captured using at least one of thermal sensors or wearable devices. In some embodiments, the thermal imaging device 101 consists of an array of sensors and a specialized thermal processor. The array of sensors
converts infrared energy into electrical signals on a per-pixel basis. The array of sensors in a thermal imaging device 101 detects temperature values from a body of the subject 100. The specialized thermal processor in the thermal imaging device 101 processes the detected temperature values into at least one block of pixels to generate a thermal image. In some embodiments, the thermal imaging device includes a first screen that shows the temperature values captured by the thermal imaging device 101, along with hotspots (high temperature regions) of the body of the subject 100. In some embodiments, the mammography imaging device 106 includes a second screen that shows information related to the mammography image of the subject 100. In some embodiments, the thermal image includes an upper body region of the subject's body. The mammography imaging device 106 captures a mammography image of the body of the subject 100 and generates standard views. In some embodiments, the standard views include, but are not limited to, bilateral craniocaudal view and mediolateral oblique view. In some embodiments, the mammography imaging device 106 includes an X-ray tube, a plurality of filters, a plurality of compression paddles and a specialized mammogram processor. The X-ray tube produces low energy X-rays. The plurality of filters is placed in the path of the X-ray beam to modify the X-ray spectrum that is projected on the body of the subject 100. The plurality of compression paddles is attached to the body of the subject 100 to compress a part of the body of the subject 100 being exposed to the X-rays to obtain cross section density information. The specialized mammogram processor converts obtained cross section density information into pixels to generate a mammography image. In some embodiments, intensity values of the pixels correspond to the obtained cross section density information on a per-pixel basis.
[0041] The storage device 110 stores a set of instructions that are executed by the processor 108 for performing one or more functions. A thermal image from the thermal imaging device 101 and a mammography image from the mammography imaging device 106 are received by the computing system 104 to determine the block of pixels associated with the high-density region on a mammography image. In some embodiments, the thermal image and the mammography image are provided to the computing system 104 using a wired network or a wireless network such as Bluetooth, Wi-Fi, ZigBee, cloud, or any other communication network. In some embodiments, the thermal image and the mammography image contain one or more features such as a user id, a timestamp, user interaction details, and a number of distinct applications launched by the thermal imaging device 101 or the mammography imaging device 106. In some embodiments, the computing system 104 obtains functional and structural images
from a Compact Disc Read-Only Memory (CDROM) or Digital Versatile/Video Disc (DVD). The functional and structural images may be downloaded from a web-based system or an application that makes the thermal and mammography images available for processing. In some embodiments, the thermal images and mammography images are received from a mobile application that is available on a handheld device. In some embodiments, the handheld device includes, but is not limited to, a cell phone, a handheld computing device, an electronic notepad, a smart phone and a personal assistant device. In some embodiments, the thermal images and mammography images are received directly from a memory or the storage device 110 of the computing system 104. The storage device 110 stores the data received from the thermal imaging device 101 and the mammography imaging device 106 as an input file along with data processed by the processor 108. In some embodiments, the input file may be one or more two-dimensional images produced by the thermal imaging device 101 and the mammography imaging device 106, being stored in the data storage in a digital imaging and communications in medicine (DICOM) format. In some embodiments, the thermal imaging device 101 and the mammography imaging device 106 may collect data of the subject 100 in the DICOM format. In some embodiments, the computing system 104 may use the input file data of the subject 100 in the DICOM format; for example, the data of the subject 100 may include a number of attributes such as a name of the subject 100, an ID of the subject 100, medical history of the subject 100, a number of slices of the thermal and mammography images, a voxel size, and a number of functional time-series points from a DICOM header for image processing.
[0042] The computing device 104 identifies the block of pixels (Pt) with the high
temperature associated with a breast lesion within an identified first breast region of the thermal
image. The computing device 104 estimates the location (l) of the breast lesion corresponding
to the block of pixels (Pt) in the thermal image. The computing device 104 determines a block
of pixels (Pm) on the mammography image corresponding to the location (l) of the block of
pixels (Pt) within the second breast region using a first machine learning model. The computing
device 104 generates a report with an annotated mammography image with a marking of a
determined block of pixels (Pm) on the mammography image of the subject 100 corresponding
to the block of pixels (Pt) associated with the high temperature regions on the thermal image to
30 enable lesion identification on the mammography image of the subject 100.
[0043] With reference to FIG. 1, FIG. 2 illustrates an exploded view of the computing system 104 according to an embodiment herein. The computing system 104 includes a database 202, a first breast region identification module 204, a second breast region identification module 206, a high temperature region identification module 208, a location estimation module 210, an annotation module 212 and a report generation module 214. The database 202 stores thermal images and mammography images that are received from the thermal imaging device 101 and the mammography imaging device 106. In some embodiments, the thermal images and the mammography images are stored in a DICOM format. The DICOM format includes data sets with a header and image data, including, but not limited to, the thermal images and the mammography images. In some embodiments, the thermal images and the mammography images of the subject may include one or more attributes (for example, a name of the subject, an identification (ID) of the subject, etc.), a number of slices of the images, a voxel size, and a number of functional time-series points from a DICOM header. In some embodiments, the thermal images and the mammography images are pre-processed to improve the image quality to precisely distinguish between defective and non-defective images.
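To make the DICOM handling described above concrete, the following is a minimal sketch of how a stored thermal or mammography image and a few of its header attributes might be read; it assumes the pydicom and numpy packages and an illustrative file name, neither of which is specified in this description.

```python
# Minimal sketch (assumption: pydicom and numpy are available; the file path is illustrative).
import pydicom
import numpy as np

def load_dicom_image(path):
    """Read a DICOM file and return its pixel array plus selected header attributes."""
    ds = pydicom.dcmread(path)
    pixels = ds.pixel_array.astype(np.float32)   # 2-D image data
    header = {
        "patient_name": str(ds.get("PatientName", "")),
        "patient_id": str(ds.get("PatientID", "")),
        "modality": str(ds.get("Modality", "")),
    }
    return pixels, header

# Example usage with a hypothetical file name:
# image, meta = load_dicom_image("subject_100_thermal.dcm")
```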
[0044] The first breast region identification module 204, identifies a first breast
region on the thermal image of the subject 100. In some embodiments, the first breast region
identification module 204 obtains the thermal image from the thermal imaging device 101 and
identifies the first breast region using machine learning models. In some embodiments, the first
breast region is identified on the thermal image of the subject 100 manually. The second breast
region identification module 206 identifies a second breast region on the mammography image
of the subject 100. In some embodiments, the second breast region identification module 206
obtains the mammography image from the mammography imaging device 106 and identifies
the second breast region using machine learning models. In some embodiments, the second
breast region is identified on the mammography image of the subject 100 manually.
[0045] The high temperature region identification module 208 identifies a block of pixels (Pt) with a high temperature region associated with a breast lesion within the first identified breast region. The block of pixels (Pt) with the high temperature regions is identified on the thermal image of the breast region of the subject by determining a first pixel region (m1) with a temperature Tpixel, where T2 ≤ Tpixel ≤ T1, wherein T1 and T2 are temperature thresholds obtained from a temperature distribution. The block of pixels (Pt) with the high temperature regions is identified on the thermal image of the breast region of the subject by (i) determining the first pixel region (m1) with a temperature T¹pixel, where T2 ≤ T¹pixel ≤ T1, (ii) determining a second pixel region (m2) with a temperature T²pixel, where T3 ≤ T²pixel and (iii) detecting a plurality of hotspot regions using the first pixel region (m1) and the second pixel region (m2) with AND or OR rules. T1, T2 and T3 are temperature thresholds obtained from a temperature distribution. The block of pixels (Pt) with the high temperature regions is identified on the first breast region of the subject using a second machine learning model. The second machine learning model is trained by providing a plurality of thermal images and corresponding annotated high temperature regions associated with different patients as training data. In some embodiments, T1 may be a maximum temperature and T2 is calculated from a histogram of the breast thermal image. In some embodiments, T1, T2 and T3 are provided by a user.
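A minimal sketch of the two-region hotspot detection described in this paragraph is given below; it assumes the thermal image is available as a NumPy array of per-pixel temperatures and that the thresholds T1, T2 and T3 have already been derived from the temperature distribution (for example, T1 as the maximum temperature and T2 from a histogram, as stated above). The choice of combining the regions with AND or OR is passed as a parameter.

```python
import numpy as np

def detect_hotspots(temp, t1, t2, t3, rule="OR"):
    """Return a boolean mask of hotspot pixels in a breast thermal image.

    temp        : 2-D array of per-pixel temperatures within the breast region
    t1, t2, t3  : temperature thresholds obtained from the temperature
                  distribution of the thermal image
    rule        : "AND" or "OR", the rule used to combine the two pixel regions
    """
    m1 = (temp >= t2) & (temp <= t1)   # first pixel region: T2 <= Tpixel <= T1
    m2 = temp >= t3                    # second pixel region: T3 <= Tpixel
    return (m1 & m2) if rule == "AND" else (m1 | m2)

# Illustrative thresholds for one image:
# t1 = temp.max(); t2 and t3 chosen from a histogram of the breast temperatures.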
[0046] The location estimation module 210 estimates a location (l) of the breast
lesion corresponding to the identified block of pixels (Pt). In some embodiments, the plurality
of high temperature regions on the first breast region of the subject is obtained as an input from
a user to estimate the location (l) of the breast lesion corresponding to the block of pixels (Pt)
in the thermal image of the subject 100. The first location (l) of pixels associated with the high
temperature regions is obtained as a plurality of two-dimensional coordinates by considering a
two-dimensional coordinate system with a nipple as an origin and a breast region as the
boundary.
[0047] The annotation module 212 determines a block of pixels (Pm) on the
mammography image corresponding to the location (l) of the block of pixels (Pt) within the
second breast region using a first machine learning model. The first machine learning model
identifies the block of pixels (Pm) on the mammography image corresponding to the location
(l) on the thermography image by (i) obtaining a clock position (θ) and a distance (r) from the
location (l), (ii) dividing the mammography image into different sectors corresponding to
different clock positions, and (iii) identifying the block of pixels (Pm) that lies within a sector
corresponding to the clock position (θ) and at a minimum distance (r) from a nipple region in
the mammography image. In some embodiments, the block of pixels (Pm) is identified as the
high-density regions in the mammography image within the sector corresponding to the clock
position (θ) and at a minimum distance (r) from the nipple region in the mammography image.
In some embodiments, the high-density regions are identified on the mammography image of
the subject 100 using a third machine learning model. The third machine learning model is
trained by providing a plurality of mammography images and the corresponding annotated
high-density lesions associated with different patients as training data.
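By way of illustration only, the sketch below shows one possible way to realise the sector-based matching described above: each candidate high-density region in the mammography image is expressed as an angle and a normalised distance from the nipple point, and the candidate falling in the sector of the clock position (θ) whose distance is closest to (r) is selected. The candidate list, nipple point, breast width and sector width are assumed inputs; this is a sketch of the geometric matching step, not the claimed machine learning model itself.

```python
import math

def select_block_by_sector(candidates, nipple, breast_width, theta, r, sector_deg=30.0):
    """Pick the candidate (x, y) centroid lying in the sector of clock position
    theta whose normalised nipple distance is closest to r.

    candidates   : list of (x, y) centroids of candidate high-density regions
    nipple       : (x, y) nipple point N on the mammography image
    breast_width : width of the breast region, used to normalise distances
    theta        : clock position of the thermal hotspot, in degrees
    r            : normalised nipple-to-hotspot distance from the thermal image
    """
    best, best_err = None, float("inf")
    for (cx, cy) in candidates:
        ang = math.degrees(math.atan2(cy - nipple[1], cx - nipple[0])) % 360.0
        dist = math.hypot(cx - nipple[0], cy - nipple[1]) / breast_width
        # keep only candidates inside the sector centred on theta
        if abs((ang - theta + 180.0) % 360.0 - 180.0) <= sector_deg / 2.0:
            err = abs(dist - r)
            if err < best_err:
                best, best_err = (cx, cy), err
    return best
```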
[0048] In some embodiments, the first machine learning model identifies the block of pixels (Pm) on the mammography image corresponding to the location (l) on the thermography image by (i) obtaining a breast quadrant corresponding to the location (l), (ii) identifying a view of the mammography image, (iii) dividing the mammography image into different quadrants and (iv) identifying the block of pixels (Pm) that lies within an obtained quadrant corresponding to the location (l) of the breast lesion on the thermographic image.
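A simplified sketch of this quadrant-based alternative follows; it maps a thermal location given in nipple-centred coordinates to one of the four breast quadrants (UOQ, UIQ, LIQ, LOQ) and filters candidate blocks to those falling in the same quadrant of the mammography image. The laterality argument, the orientation convention and the candidate list are illustrative assumptions, since the exact coordinate convention is not fixed in this description.

```python
def quadrant_of(x, y, side="left"):
    """Return the breast quadrant of a point in nipple-centred coordinates.

    x, y : offsets from the nipple point, with y positive upwards and x positive
           towards the outer side for a left breast (mirrored for a right breast).
    """
    if side == "right":
        x = -x                      # mirror so "outer" is always positive x
    upper, outer = y >= 0, x >= 0
    if upper:
        return "UOQ" if outer else "UIQ"
    return "LOQ" if outer else "LIQ"

def blocks_in_quadrant(candidates, nipple, target_quadrant, side="left"):
    """Keep candidate block centroids whose quadrant matches the thermal location's quadrant."""
    return [(cx, cy) for (cx, cy) in candidates
            if quadrant_of(cx - nipple[0], nipple[1] - cy, side) == target_quadrant]
```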
[0049] In some embodiments, the first machine learning model identifies the block of pixels (Pm) on the mammography image corresponding to the location (l) on the thermography image by (i) identifying candidate blocks of pixels with high density in the mammography image, (ii) obtaining a nipple point (N) on the mammography image, (iii) calculating locations (lm1-n) of each candidate block of pixels with respect to the obtained nipple point (N), (iv) comparing the locations (lm1-n) of each of the candidate blocks of pixels with the location (l) of the breast lesion on the thermographic image and (v) selecting the block of pixels (Pm) among the candidate blocks of pixels by selecting a block of pixels corresponding to a nearest location (lmi) among the locations (lm1-n) that is close to the location (l) of the breast lesion on the thermographic image. In some embodiments, the locations of the candidate blocks of pixels compared with the location (l) of the breast lesion on the thermographic image include lm1, lm2, ..., lmn.
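The nearest-candidate selection described in this paragraph can be sketched as follows, assuming the candidate locations lm1..lmn and the thermal location (l) are already expressed in the same nipple-referenced coordinate system; the Euclidean distance shown is an illustrative choice and is not fixed by this description.

```python
import math

def select_nearest_block(candidate_locations, lesion_location):
    """Return the index of the candidate location closest to the thermal lesion location (l).

    candidate_locations : list of (x, y) locations lm1..lmn of candidate blocks,
                          expressed relative to the nipple point N
    lesion_location     : (x, y) location l of the breast lesion from the thermal
                          image, in the same coordinate system
    """
    lx, ly = lesion_location
    return min(range(len(candidate_locations)),
               key=lambda i: math.hypot(candidate_locations[i][0] - lx,
                                        candidate_locations[i][1] - ly))
```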
[0050] The report generation module 214 generates a report with an annotated mammography image with a marking of a determined block of pixels (Pm) on the mammography image of the subject corresponding to the block of pixels (Pt) associated with the high temperature regions on the thermal image to enable lesion identification on the mammography image of the subject. The report includes at least one of (i) an annotated mammography image with markings of the determined block of pixels (Pm) in a different color as annotations on the mammography image, (ii) an annotated mammography image with markings of the boundary of the determined block of pixels (Pm) in a different color as annotations on the mammography image and (iii) a text report that includes quantitative parameters of the block of pixels (Pm) on the mammography image corresponding to the high temperature region on the thermal image. The markings include an annotation of the block of pixels (Pm) on the mammogram image corresponding to the high temperature region on the thermal image and a text annotation that comprises quantitative parameters of the block of pixels (Pm). In some embodiments, the first machine learning model, the second machine learning model and the third machine learning model include supervised learning algorithms and unsupervised learning algorithms. In some embodiments, the supervised learning algorithms include decision tree learning, linear model analysis, a support vector machine algorithm, graphical models, deep neural networks, an ensemble learning algorithm, classification models, and regression models. In some embodiments, the unsupervised learning algorithms include a clustering-based algorithm, a graph-based algorithm, a component-based learning algorithm, a hierarchical clustering-based algorithm, deep neural networks, and a mixture model. In some embodiments, the markings may be displayed as a DICOM overlay on the actual mammogram image, or as a mammogram image with a high density marking represented in a different color. In some embodiments, the markings are text annotations on the DICOM or a text report characterizing the lesion corresponding to the high temperature regions.
[0051] FIG. 3A and 3B illustrate an exemplary location of a breast lesion on a breast region of a subject according to some embodiments herein. The detection of a breast lesion includes detection of high thermal activities on the thermal image and their corresponding location on the mammography image of the subject 100. A first location is the detected hotspot region on the thermal image. The identified hotspot regions may be considered as regions corresponding to abnormalities. The thermal image of the subject's body includes a right breast 302, a right nipple point 304 which is represented as x, a left breast 306 and a left nipple point 308 which is represented as y. In some embodiments, the nipple points (x, y) (i.e. 304 and 308) are detected by the user or machine learning models. Each breast region in the thermal image includes quadrants that include at least one of (i) an upper outer quadrant (UOQ) 310A, (ii) an upper inner quadrant (UIQ) 310B, (iii) a lower inner quadrant (LIQ) 310C or (iv) a lower outer quadrant (LOQ) 310D. FIG. 3B shows the exemplary mammography image with a high-density region 320 corresponding to the high thermal activity.
[0052] FIG. 4 illustrates an exemplary view of a location of a high temperature region determined using a nipple point as an origin and a breast region as the boundary according to an embodiment herein. In FIG. 4, the location (l) of pixels associated with a high-temperature region obtained by considering a two-dimensional coordinate system is shown. In some embodiments, the first location (l) is represented using the clock position as an angle (θ) and a radial distance (r). In some embodiments, the location (l) is represented as zones of the breast, and the distance (r) and clock position (θ) are derived from the zones. In some embodiments, the location (l) of a high-temperature region on the thermography image is calculated by estimating the distance (r) and the angle (θ) using the nipple point as a reference origin point of the coordinate system. In some embodiments, the clock position (θ) is identified by detecting the angle formed by the nipple point and a centroid of the high-temperature region with a horizontal axis. In some embodiments, the clock position is available as an angle (θ) that is represented as the corresponding o'clock position of that high temperature region. In some embodiments, the clock position is available as an angle (θ) that is represented as a corresponding quadrant of the breasts containing the high temperature region. In some embodiments, the distance (r) is identified by taking the ratio of the distance between the nipple point and the centroid of the high temperature region to the width of the breast region.
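To make this geometry concrete, the sketch below computes the clock position (θ) as the angle between the horizontal axis and the line joining the nipple point to the hotspot centroid, and the distance (r) as the nipple-to-centroid distance normalised by the breast width, following the description above; the pixel coordinate convention (rows increasing downwards) and the o'clock conversion are assumptions for illustration.

```python
import math

def location_from_nipple(nipple, centroid, breast_width):
    """Return (theta_degrees, r) for a hotspot on a thermal image.

    nipple       : (x, y) nipple point used as the origin of the coordinate system
    centroid     : (x, y) centroid of the high temperature region
    breast_width : width of the breast region in pixels, used to normalise r
    """
    dx = centroid[0] - nipple[0]
    dy = nipple[1] - centroid[1]           # flip y because image rows grow downwards
    theta = math.degrees(math.atan2(dy, dx)) % 360.0
    r = math.hypot(dx, dy) / breast_width  # ratio of nipple-centroid distance to breast width
    return theta, r

def clock_position(theta):
    """Convert an angle in degrees to an approximate o'clock position (1-12)."""
    hour = int(round((90.0 - theta) % 360.0 / 30.0))
    return 12 if hour == 0 else hour
```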
[0053] FIG. 5A to 5B illustrate an exemplary view of the high temperature region associated with a breast lesion that is identified on a breast region of the subject according to an embodiment herein. In some embodiments, a plurality of high temperature regions on the thermal image of the subject 100 is obtained as an input from the user. A high temperature region 502 is a region with high thermal activity. The computing system 104 detects the location of the high temperature region 502 corresponding to the high thermal activity. The breast region may be identified manually or via machine learning techniques. In some embodiments, a polygon ring may be marked and adjusted manually or via machine learning techniques for identification of the breast region.
[0054] FIG. 6A and 6B illustrate an exemplary view of dividing the mammography
image into different sectors using different clock positions according to an embodiment herein.
The mammography image is divided into different sectors 602 corresponding to different
possible clock positions. In some embodiments, the sectors correspond to quadrants of each
mammography image which are identified by using the nipple point as a center of a reference
coordinate system with pectoral muscle as y-axis and normal vector of pectoral muscle passing
through nipple point as x-axis. In some embodiments, the sectors correspond to different zones
of the breast region.
[0055] FIG. 7A to 7C illustrate an exemplary view of an annotation of the block of pixels on a mammography image corresponding to a high temperature region on a thermal image according to an embodiment herein. In some embodiments, the mammography image is first divided into equal sectors corresponding to different clock positions. A search space representing the block of pixels (Pm) is identified by selecting the sector corresponding to the clock position (θ) and using the distance (r) from the nipple region in the mammography image on this sector as one of the reference points. FIG. 7A and 7B show the block of pixels (Pm) as the high-density regions in the mammography image using the location (l) obtained from the thermal image. FIG. 7C shows the block of pixels (Pm) as a triangular search region with one of the vertices or reference points lying within the sector corresponding to the clock position (θ) and at a distance (r) from the nipple region on the mammography image.
[0056] FIG. 7D illustrates an exemplary mammography image with a text annotation of the high temperature region according to an embodiment herein. The mammogram images consist of an annotation that represents a block of pixels (Pm) corresponding to the location (l) of the block of pixels (Pt). In an embodiment, the mammogram image displays at least one of: (i) an annotated mammography image with markings of the determined block of pixels (Pm) in a different color as annotations on the mammography image, (ii) an annotated mammography image with markings of the boundary of the determined block of pixels (Pm) in a different color as annotations on the mammography image or (iii) a text report that comprises quantitative parameters of the block of pixels (Pm) on the mammography image corresponding to the high temperature region on the thermal image. FIG. 7D shows the annotation of the block of pixels (Pm) on the mammogram image corresponding to the high temperature region on the thermal image and a text annotation 710 that includes quantitative parameters of the block of pixels (Pm).
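As an illustration of the annotated output described above, the sketch below draws a coloured boundary around the determined block of pixels (Pm) on a mammography image and places a short text annotation beside it; OpenCV is assumed as the drawing library, and the bounding box and quantitative parameters shown in the usage example are placeholders rather than values taken from this description.

```python
import cv2

def annotate_mammogram(image_gray, block_bbox, text, color=(0, 0, 255)):
    """Mark the determined block of pixels (Pm) and add a text annotation.

    image_gray : 2-D uint8 mammography image
    block_bbox : (x, y, w, h) bounding box of the determined block of pixels
    text       : text annotation, e.g. quantitative parameters of the block
    """
    annotated = cv2.cvtColor(image_gray, cv2.COLOR_GRAY2BGR)
    x, y, w, h = block_bbox
    cv2.rectangle(annotated, (x, y), (x + w, y + h), color, thickness=2)
    cv2.putText(annotated, text, (x, max(y - 10, 15)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, color, 1, cv2.LINE_AA)
    return annotated

# Example usage with placeholder values:
# out = annotate_mammogram(mammo, (220, 180, 60, 45), "Pm: area=2700 px, r=0.4, 2 o'clock")
# cv2.imwrite("annotated_mammogram.png", out)
```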
[0057] FIG. 8A and 8B are a flow diagram that illustrates a method for annotating mammography images of a subject using thermal images and mammography images of the subject by determining a block of pixels on the mammography images corresponding to high temperature regions on the thermal images according to an embodiment herein. At step 802, a thermal image of the subject is captured using a thermal imaging device 101. At step 804, the mammography image of the subject is captured using the mammography imaging device 106. At step 806, a first breast region in the thermal image of the subject is identified. At step 808, a second breast region in the mammography image of the subject is identified. At step 810, a block of pixels (Pt) with the high temperature region associated with a breast lesion within the first identified breast region is identified. At step 812, a location (l) of the breast lesion corresponding to the identified block of pixels (Pt) is estimated. At step 814, a block of pixels (Pm) corresponding to the location (l) of the block of pixels (Pt) within the second breast region is determined on the mammography image using a first machine learning model. At step 816, a report is generated with an annotated mammography image with a marking of a determined block of pixels (Pm) on the mammography image of the subject corresponding to the block of pixels (Pt) associated with the high temperature regions on the thermal image to enable lesion identification on the mammography image of the subject.
[0058] FIG. 9 illustrates a block diagram of one example system for determining a block of pixels on the mammography images corresponding to high temperature regions on the thermal images using thermal images and mammography images in accordance with the embodiments described with respect to the flow diagram of FIG. 8 according to some embodiments herein. Thermal imaging device 901 captures thermal images of the subject 100. Mammography imaging device 902 captures mammography images of the subject 100. In some embodiments, the thermal imaging device 901 and the mammography imaging device 902 may transmit a thermal image or a mammography image, respectively. In some embodiments, the thermal imaging device may capture a video of the subject 100. Thermal imaging device 901 comprises an array of sensors that converts infrared energy into electrical signals on a per-pixel basis, wherein the array of sensors detects temperature values from the subject. The mammography imaging device 902 converts cross section density information into pixels to generate a mammography image. Intensity values of the pixels correspond to the obtained cross section density information on a per-pixel basis. System 900 includes a processor 903 that receives the thermal image and the mammography image of the subject 100 from the thermal imaging device 901 and the mammography imaging device 902. The processor includes an
automation module 904 and a machine learning unit 906. In some embodiments, the machine
learning unit 906 includes at least one of a first machine learning model for determining a block
of pixels (Pm), a second machine learning model for identifying a block of pixels (Pt) and a
third machine learning model for identifying high-density regions on the mammography image.
15 The automation module 904 determines the block of pixels (Pm) on the mammography image
corresponding to the location (l) of the block of pixels (Pt) within the second breast region using
In some embodiments, the block of pixels (Pt) with the high
temperature regions is identified on the thermal image of the breast region of the subject by
determining a first pixel region (m1) with a temperature Tpixel, where T2 ≤ Tpixel ≤ T1, wherein
T1 and T2 are temperature thresholds obtained from a temperature distribution. The block of
pixels (Pt) with the high temperature regions is identified on the thermal image of the breast
region of the subject by (i) determining the first pixel region (m1) with a temperature T1pixel,
where T2 ≤ T1pixel ≤ T1, (ii) determining a second pixel region (m2) with a temperature T2pixel,
where T3 ≤ T2pixel, and (iii) detecting a plurality of hotspot regions using the first pixel region
(m1) and the second pixel region (m2) with AND or OR rules. The T1, T2 and T3 are temperature
thresholds obtained from a temperature distribution. The block of pixels (Pt) with the high
temperature regions is identified on the first breast region of the subject using the machine
learning unit 906. The automation module 904 estimates a location (l) of the breast lesion
corresponding to the identified block of pixels (Pt) using the machine learning unit 906. In some
embodiments, the plurality of high temperature regions on the breast region of the subject is
obtained as an input from a user to estimate the location (l) of the breast lesion corresponding
to the block of pixels (Pt) in the thermal image of the subject 100. The processor 903 and the
automation module 904 store their results to the storage device 905. The machine learning unit
906 stores and retrieves the results from the storage device 905. In some embodiments, the machine
learning unit 906 is used to annotate mammography images of the subject 100 using thermal
images and mammography images of the subject 100 by determining a block of pixels on the
mammography images corresponding to high temperature regions on the thermal images.
Central Processing Unit (CPU) 908 retrieves machine-readable program instructions from a
memory 910 to facilitate the functionality of any of the modules of the system 900. The CPU
908, operating alone or in conjunction with other processors, may be configured to assist or
otherwise perform the functionality of any of the modules or processing units of the system
900, as well as to facilitate communication between the system 900 and the workstation 910.
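The two-threshold hotspot rule described above lends itself to a straightforward implementation. The following Python sketch, offered only as an illustration and not as the system's actual code, forms the first pixel region (m1) from pixels whose temperature lies between T2 and T1, forms the second pixel region (m2) from pixels at or above T3, and combines the two with an AND or an OR rule; the percentile-based choice of the thresholds T1, T2 and T3 from the image's temperature distribution is an assumption made here for concreteness.

```python
# Sketch of the two-threshold hotspot rule: m1 holds pixels with
# T2 <= T1pixel <= T1, m2 holds pixels with T3 <= T2pixel, and the hotspot
# mask is their combination under an AND or an OR rule. Threshold selection
# via percentiles is an illustrative assumption.
import numpy as np

def hotspot_regions(thermal: np.ndarray, rule: str = "AND") -> np.ndarray:
    # Illustrative thresholds derived from the image's temperature distribution.
    t1 = np.percentile(thermal, 99)   # upper threshold T1
    t2 = np.percentile(thermal, 90)   # lower threshold T2
    t3 = np.percentile(thermal, 95)   # threshold T3 for the second region
    m1 = (thermal >= t2) & (thermal <= t1)    # first pixel region (m1)
    m2 = thermal >= t3                         # second pixel region (m2)
    return (m1 & m2) if rule == "AND" else (m1 | m2)

if __name__ == "__main__":
    thermal = np.random.normal(33.0, 1.5, (240, 320))   # stand-in thermal image (deg C)
    mask = hotspot_regions(thermal, rule="OR")
    print("hotspot pixels:", int(mask.sum()))
```

Under this sketch, the AND rule yields a conservative hotspot mask that requires both conditions to hold, while the OR rule is more permissive; the description above allows either combination.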
[0059] System 900 is shown having been placed in communication with a
workstation 910. A computer case of the workstation houses various components such as a
motherboard with a processor and memory, a network card, a video card, a hard drive capable
of reading/writing to machine-readable media 911 such as a floppy disk, optical disk, CD-ROM,
DVD, magnetic tape, and the like, and other software and hardware needed to perform
the functionality of a computer workstation. The workstation 910 further includes a display
device 912, such as a CRT, LCD, or touch screen device, for displaying information, images,
view angles, and the like. A user can view any of that information and make a selection from
menu options displayed thereon. Keyboard 913 and mouse 914 effectuate a user input. It
should be appreciated that the workstation 910 has an operating system and other specialized
software configured to display alphanumeric values, menus, scroll bars, dials, slideable bars,
pull-down options, selectable buttons, and the like, for entering, selecting, modifying, and
accepting information needed for processing in accordance with the teachings hereof. The
workstation 910 is further enabled to display thermal images, mammography images and the
like as they are derived. A user or technician may use the user interface of the workstation 910
to set parameters and adjust various aspects of the identification of the first breast region and
the second breast region, the location estimation, the determination of the blocks of pixels (Pm)
and (Pt), and the report generation, as needed or as desired, depending on the implementation.
Any of these selections or inputs may be stored/retrieved to storage device 911. Default settings
can be retrieved from the storage device. A user of the workstation 910 is also able to view or
manipulate any of the data in the patient records, collectively at 915, stored in database 916.
Server 918 is connected with the patient records 915 and the database 916 to access any of the
data in the patient records. In some embodiments, the server 918 is a PACS (picture archiving
and communication system) server. Any of the received images, results, determined view angle,
and the like, may
be stored to a storage device internal to the workstation 910. Although shown as a desktop
computer, the workstation 910 can be a laptop, mainframe, or a special purpose computer such
as an ASIC, circuit, or the like.
[0060] Any of the components of the workstation 910 may be placed in
communication with any of the modules and processing units of system 900. Any of the
modules of the system 900 can be placed in communication with storage devices 905, 916 and
906 and/or computer-readable media 911 and may store/retrieve therefrom data, variables,
records, parameters, functions, and/or machine-readable/executable program instructions, as
needed to perform their intended functions. Each of the modules of the system 900 may be
placed in communication with one or more remote devices over network 917. It should be
appreciated that some or all of the functionality performed by any of the modules or processing
units of the system 900 can be performed, in whole or in part, by the workstation 910. The
embodiment shown is illustrative and should not be viewed as limiting the scope of the
appended claims strictly to that configuration. Various modules may designate one or more
components which may, in turn, comprise software and/or hardware designed to perform the
intended function.
[0061] The foregoing description of the specific embodiments will so fully reveal
the general nature of the embodiments herein that others can, by applying current knowledge,
readily modify and/or adapt for various applications such specific embodiments without
departing from the generic concept, and, therefore, such adaptations and modifications should
and are intended to be comprehended within the meaning and range of equivalents of the
disclosed embodiments. It is to be understood that the phraseology or terminology employed
herein is for the purpose of description and not of limitation. Therefore, while the embodiments
herein have been described in terms of preferred embodiments, those skilled in the art will
recognize that the embodiments herein can be practiced with modification within the spirit and
scope.
| # | Name | Date |
|---|---|---|
| 1 | 202141044322-STATEMENT OF UNDERTAKING (FORM 3) [29-09-2021(online)].pdf | 2021-09-29 |
| 2 | 202141044322-PROOF OF RIGHT [29-09-2021(online)].pdf | 2021-09-29 |
| 3 | 202141044322-POWER OF AUTHORITY [29-09-2021(online)].pdf | 2021-09-29 |
| 4 | 202141044322-FORM FOR STARTUP [29-09-2021(online)].pdf | 2021-09-29 |
| 5 | 202141044322-FORM FOR SMALL ENTITY(FORM-28) [29-09-2021(online)].pdf | 2021-09-29 |
| 6 | 202141044322-FORM 1 [29-09-2021(online)].pdf | 2021-09-29 |
| 7 | 202141044322-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-09-2021(online)].pdf | 2021-09-29 |
| 8 | 202141044322-EVIDENCE FOR REGISTRATION UNDER SSI [29-09-2021(online)].pdf | 2021-09-29 |
| 9 | 202141044322-DRAWINGS [29-09-2021(online)].pdf | 2021-09-29 |
| 10 | 202141044322-DECLARATION OF INVENTORSHIP (FORM 5) [29-09-2021(online)].pdf | 2021-09-29 |
| 11 | 202141044322-COMPLETE SPECIFICATION [29-09-2021(online)].pdf | 2021-09-29 |
| 12 | 202141044322-FORM-9 [08-06-2022(online)].pdf | 2022-06-08 |
| 13 | 202141044322-STARTUP [10-06-2022(online)].pdf | 2022-06-10 |
| 14 | 202141044322-FORM28 [10-06-2022(online)].pdf | 2022-06-10 |
| 15 | 202141044322-FORM 18A [10-06-2022(online)].pdf | 2022-06-10 |
| 16 | 202141044322-FER.pdf | 2022-11-04 |
| 17 | 202141044322-Request Letter-Correspondence [16-03-2023(online)].pdf | 2023-03-16 |
| 18 | 202141044322-Power of Attorney [16-03-2023(online)].pdf | 2023-03-16 |
| 19 | 202141044322-FORM28 [16-03-2023(online)].pdf | 2023-03-16 |
| 20 | 202141044322-Form 1 (Submitted on date of filing) [16-03-2023(online)].pdf | 2023-03-16 |
| 21 | 202141044322-Covering Letter [16-03-2023(online)].pdf | 2023-03-16 |
| 22 | 202141044322-Request Letter-Correspondence [22-03-2023(online)].pdf | 2023-03-22 |
| 23 | 202141044322-Power of Attorney [22-03-2023(online)].pdf | 2023-03-22 |
| 24 | 202141044322-FORM28 [22-03-2023(online)].pdf | 2023-03-22 |
| 25 | 202141044322-Form 1 (Submitted on date of filing) [22-03-2023(online)].pdf | 2023-03-22 |
| 26 | 202141044322-Covering Letter [22-03-2023(online)].pdf | 2023-03-22 |
| 27 | 202141044322-OTHERS [04-05-2023(online)].pdf | 2023-05-04 |
| 28 | 202141044322-FER_SER_REPLY [04-05-2023(online)].pdf | 2023-05-04 |
| 29 | 202141044322-CORRESPONDENCE [04-05-2023(online)].pdf | 2023-05-04 |
| 30 | 202141044322-COMPLETE SPECIFICATION [04-05-2023(online)].pdf | 2023-05-04 |
| 31 | 202141044322-CLAIMS [04-05-2023(online)].pdf | 2023-05-04 |
| 32 | 202141044322-US(14)-HearingNotice-(HearingDate-27-02-2024).pdf | 2024-01-29 |
| 33 | 202141044322-Correspondence to notify the Controller [15-02-2024(online)].pdf | 2024-02-15 |
| 34 | 202141044322-FORM-26 [26-02-2024(online)].pdf | 2024-02-26 |
| 35 | 202141044322-Correspondence to notify the Controller [26-02-2024(online)].pdf | 2024-02-26 |
| 36 | 202141044322-Annexure [26-02-2024(online)].pdf | 2024-02-26 |
| 37 | 202141044322-Written submissions and relevant documents [13-03-2024(online)].pdf | 2024-03-13 |
| 38 | 202141044322-PatentCertificate18-04-2024.pdf | 2024-04-18 |
| 39 | 202141044322-IntimationOfGrant18-04-2024.pdf | 2024-04-18 |
| 1 | 202141044322E_06-07-2022.pdf | |