Abstract: SYSTEM AND METHOD FOR ESTIMATING DENSITY OF A TISSUE FROM THERMOGRAMS A system and method for estimating density of a tissue from thermal images by (i) receiving a thermal image of a tissue of a subject, which represents a temperature distribution on the tissue of the subject as pixels in the thermal image, (ii) classifying the density of the tissue from the captured thermal image into two or more density classes by providing the captured thermal image as an input to a pre-trained deep learning model to determine a density class of the tissue and (iii) automatically generating a text report with the determined density class of the tissue to estimate the density of the tissue. The classification includes (i) analyzing the captured thermal image to determine probabilities of the tissue belonging to each density class and (ii) determining the class with the maximum probability to obtain the density class of the tissue. FIG. 2
Claims: I/We Claim:
1. A system for estimating density of a tissue from thermal images, the system comprising:
a storage device storing a set of machine-readable instructions; and
a processor configured to retrieve the machine-readable instructions from the storage device which, when executed by the processor, enable the processor to:
receive a thermal image of a tissue of a subject, which represents a temperature distribution on the tissue of the subject as pixels in the thermal image, wherein the thermal image is captured using at least one of a thermal imaging camera or a wearable device, wherein the thermal imaging camera or the wearable device comprises:
an array of sensors that convert infrared energy into electrical signals on a per-pixel basis;
a lens that focuses the infrared energy from the subject’s tissue onto the array of sensors, wherein the array of sensors detect temperature values from the subject’s tissue; and
a specialized processor that processes the detected temperature values into at least one block of pixels to generate the thermal image;
integrate a pre-trained deep learning model to classify the density of the tissue from the captured thermal image into two or more density classes;
classify the density of the tissue from the captured thermal image into two or more density classes by providing the captured thermal image as an input to the pre-trained deep learning model to determine a density class of the tissue, wherein the classification comprises:
analyzing the captured thermal image, using the pre-trained deep learning model, to determine a plurality of probabilities of the tissue belonging to each of the two or more density classes; and
determining the density class associated with a maximum probability from among the plurality of probabilities; and
automatically generate a text report with the determined density class of the tissue for further clinical diagnosis and management of the subject.
2. The system as claimed in claim 1, wherein the thermal image is of a breast tissue, and the processor is configured to classify the density of the breast tissue into any one of BI-RADS A, BI-RADS B, BI-RADS C, or BI-RADS D, with the density of the tissue increasing from BI-RADS A to BI-RADS D.
3. The system as claimed in claim 1, wherein the processor is configured to automatically generate the text report that comprises tissue density values estimated based on a weighted average of the plurality of determined probabilities.
4. The system as claimed in claim 1, wherein the processor is configured to automatically generate the text report that comprises tissue density values for any body part estimated based on the determined density class and its associated range of density values.
5. The system as claimed in claim 1, wherein the processor is configured to train the deep learning model by providing a plurality of existing thermal images of different tissues and their corresponding densities as training data, wherein the training data is obtained from at least one data source, wherein the system comprises at least one interface for communicating with the at least one data source.
6. The system as claimed in claim 1, wherein the processor is configured to
automatically identify a preferred modality of follow-up breast examination for the subject based on the density class of the tissue and an age of the subject; and
automatically generate a text report that comprises at least one of the density class of the tissue or density values along with recommendation of the preferred modality of the breast examination for the subject.
7. The system as claimed in claim 1, wherein the processor is configured to train a deep learning model by providing a plurality of existing thermal images of different tissues and a plurality of density classes corresponding to different tissues as training data to obtain the pre-trained deep learning model.
8. A processor-implemented method for estimating density of breast region using a deep learning model, the method comprising:
receiving a thermal image of a breast region of a subject, which represents a temperature distribution on the breast region of the subject as pixels in the thermal image, wherein the thermal image is captured using at least one of a thermal imaging camera or a wearable device;
integrating a pre-trained deep learning model to classify the density of the tissue from the captured thermal image into two or more density classes;
classifying the density of the tissue from the captured thermal image into two or more density classes by providing the captured thermal image as an input to the pre-trained deep learning model to determine a density class of the tissue, wherein the classification comprises:
analyzing the captured thermal image, using the pre-trained deep learning model, to determine a plurality of probabilities of the tissue belonging to each of the two or more density classes; and
determining a density class associated with a maximum probability from among the plurality of probabilities; and
automatically generating a text report with the determined density class of the tissue for further clinical diagnosis and management of the subject.
9. The method as claimed in claim 8, wherein the pixels in the thermal image include a first color and a second color, wherein pixels with a highest temperature value are displayed in the first color, pixels with a lowest temperature value are displayed in the second color, and pixels with temperature values between the lowest and highest temperature values are displayed in gradations of color between the first and second colors.
10. The method as claimed in claim 8, wherein the method comprises classifying the density of the breast region into any one of BI-RADS A, BI-RADS B, BI-RADS C, or BI-RADS D, with the density of the breast region increasing from BI-RADS A to BI-RADS D.
11. The method as claimed in claim 8, wherein the text report comprises tissue density values estimated based on a weighted average of the plurality of determined probabilities, and estimated tissue densities for any body part based on the determined density class and its associated range of density values.
12. The method as claimed in claim 8, wherein the method comprises training the deep learning model by providing a plurality of existing thermal images of different breast regions and their corresponding densities as training data, wherein the training data is obtained from at least one data source through at least one interface for communicating with the at least one data source.
13. The method as claimed in claim 8, wherein the method comprises:
automatically identifying a preferred modality of follow-up breast examination for the subject based on an estimated density of the tissue and an age of the subject; and
automatically generating a text report that comprises at least one of the density class of the tissue or density values along with recommendation of the preferred modality of the breast examination for the subject.
14. The method as claimed in claim 8, wherein the method comprises training a deep learning model by providing a plurality of existing thermal images of different tissues and a plurality of density classes corresponding to different tissues as training data to obtain the pre-trained deep learning model.
Description: BACKGROUND
Technical Field
[0001] The present invention is directed towards estimation of a tissue density from a thermal image conformant to a standard operating procedure and, more particularly, to a system and method for estimating density of a tissue from thermal images.
Description of the Related Art
[0002] Breast cancer is one of the most common cancers in women worldwide, accounting for approximately 570,000 deaths in 2015. Over 1.5 million women (25% of all women with cancer) are diagnosed with breast cancer every year throughout the world. Survival rates are much worse in developing countries such as India, where 1 in every 2 of the 0.15 million women diagnosed with breast cancer loses her life. According to the World Health Organization (WHO), 1.7 million people were diagnosed with breast cancer in 2012, and a staggering 6.2 million cases could be diagnosed within the five-year prevalence period.
[0003] Early diagnosis of breast cancer is one of the best approaches to reducing mortality rates. Mammography is a widely used screening approach for the detection of breast cancer and has proven effective in reducing mortality. Other screening methods, such as Magnetic Resonance Imaging (MRI), which is more sensitive than mammography, have also been implemented and studied during the last decade. Ultrasound is typically used as a correlation modality.
[0004] Research shows that high survival rates are associated with early detection. Additionally, screening of dense breasts has become a major concern, as the accuracy of mammography in dense tissue is low and leads to inconclusive results. Due to this, many developed countries have introduced density reporting laws requiring mammography health care providers to include appropriate information about breast density in the report. Generally, the density of the breast is classified into four classes, BI-RADS A, BI-RADS B, BI-RADS C and BI-RADS D, with increasing breast density from A to D. Typically, BI-RADS A and BI-RADS B are considered fatty breasts, while BI-RADS C and BI-RADS D are considered dense breasts.
[0005] In recent years, there has been an increase in the use of thermograms/thermography as a pre-screening imaging modality for breast cancer detection before approaching other imaging modalities. Thermography was approved as an adjunct tool for breast cancer screening by the FDA in 1982. It captures the amount of heat radiating from the surface of the body by measuring the temperature pattern and distribution on the chest resulting from the high metabolism associated with tumorous growth. There are several advantages of breast thermography compared to other known techniques: it works on women of all age groups, does not involve radiation, and is non-contact and hence painless.
[0006] The heat generated from the chest wall reaches the breast surface by transmitting through different layers of breast tissue. A fatty tissue absorbs more heat due to its low conductivity, whereas a dense tissue conducts more heat. These variations in conductivity result in variations of heat on the surface of the skin, depending on the ratio of fatty to dense tissue in the breast. The devices used to capture thermograms can detect temperature variations as small as 0.05 degrees Celsius. However, the manual interpretation of these temperature readings is not trivial, as it involves the visual analysis of more than a hundred thousand pixels.
[0007] Mammography is known to have lower sensitivity for the BI-RADS C and BI-RADS D categories of breasts. Approximately 30% of women have an inconclusive mammogram despite going through the X-ray based test due to high tissue density. Since mammography uses X-ray radiation, repeated exposure can increase the patient’s risk of cancer. Hence, there is a need to identify patients who need an alternate test for breast screening without putting them through a radiation-based test like mammography, also saving them from the anxiousness of an inconclusive result.
[0008] Accordingly, there is a need for a system and method for estimating density of a tissue from non-radiation based thermal images for enabling a user to determine the appropriate imaging modality for further clinical diagnosis and management of the subject.
SUMMARY
[0009] In view of the foregoing, an embodiment herein provides a system for estimating density of a tissue from thermal images. The system includes a storage device storing a set of machine-readable instructions and a processor. The processor is configured to retrieve the machine-readable instructions from the storage device which, when executed by the processor, enable the processor to (i) receive a thermal image of a tissue of a subject, which represents a temperature distribution on the tissue of the subject as pixels in the thermal image, (ii) integrate a pre-trained deep learning model to classify the density of the tissue from the captured thermal image into two or more density classes, (iii) classify the density of the tissue from the captured thermal image into the two or more density classes by providing the captured thermal image as an input to the pre-trained deep learning model to determine a density class of the tissue and (iv) automatically generate a text report with the determined density class of the tissue to estimate the density of the tissue. The thermal image is captured using at least one of a thermal imaging camera or a wearable device. The thermal imaging camera or the wearable device includes (i) an array of sensors that convert infrared energy into electrical signals on a per-pixel basis, (ii) a lens that focuses the infrared energy from the subject’s tissue onto the array of sensors and (iii) a specialized processor that processes the detected temperature values into at least one block of pixels to generate the thermal image. The array of sensors detects temperature values from the subject’s tissue. The classification includes (i) analyzing the captured thermal image, using the pre-trained deep learning model, to determine a plurality of probabilities of the tissue belonging to each of the two or more density classes and (ii) determining the density class associated with a maximum probability from among the plurality of probabilities.
[0010] In some embodiments, the pixels in the thermal image include a first color and a second color. Pixels with a highest temperature value are displayed in the first color, pixels with a lowest temperature value are displayed in the second color, and pixels with temperature values between the lowest and the highest temperature values are displayed in gradations of color between the first and the second colors.
[0011] In some embodiments, the thermal image is of a breast tissue, and the processor is configured to classify the density of the breast tissue into any one of BI-RADS A, BI-RADS B, BI-RADS C, or BI-RADS D, with the density of the tissue increasing from BI-RADS A to BI-RADS D.
[0012] In some embodiments, the processor is configured to automatically generate the text report that comprises tissue density values estimated based on a weighted average of the plurality of determined probabilities.
[0013] In some embodiments, the processor is configured to automatically generate the text report that comprises tissue density values for any body part estimated based on the determined density class and its associated range of density values.
[0014] In some embodiments, the processor is configured to automatically identify a preferred modality of follow-up breast examination for the subject based on at least one of the density class of the tissue, tissue density values or an age of the subject and automatically generate a text report that comprises at least one of the density class of the tissue or density values along with recommendation of the preferred modality of the breast examination for the subject.
[0015] In some embodiments, the processor is configured to automatically identify the preferred modality of follow-up breast examination based on the estimated density and the age of the patient.
[0016] In some embodiments, the processor is configured to train the pre-trained deep learning model by providing a plurality of existing thermal images of different tissues and their corresponding densities as training data.
[0017] In some embodiments, the processor is configured to receive the plurality of existing thermal images of different tissues and their corresponding densities from at least one data source.
[0018] In some embodiments, the system includes at least one interface for communicating with at least one data source.
[0019] In some embodiments, the processor is configured to recommend subsequent imaging modality to the subject based on the density of the tissue.
[0020] In some embodiments, the processor is configured to train the pre-trained deep learning classifier model by providing a plurality of existing thermal images of different tissues and a plurality of density classes corresponding to different tissues as training data to obtain the pre-trained deep learning model.
[0021] In another aspect, a processor-implemented method for estimating density of a breast region using a deep learning model includes (i) receiving a thermal image of a breast region of a subject, which represents a temperature distribution on the breast region of the subject as pixels in the thermal image, (ii) integrating a pre-trained deep learning model to classify the density of the tissue from the captured thermal image into two or more density classes, (iii) classifying the density of the tissue from the captured thermal image into the two or more density classes by providing the captured thermal image as an input to the pre-trained deep learning model to determine a density class of the tissue and (iv) automatically generating a text report with the determined density class of the tissue to estimate the density of the tissue. The thermal image is captured using at least one of a thermal imaging camera or a wearable device. The classification includes (i) analyzing the captured thermal image, using the pre-trained deep learning model, to determine a plurality of probabilities of the tissue belonging to each of the two or more density classes and (ii) determining a density class associated with a maximum probability from among the plurality of probabilities.
[0022] In some embodiments, the pixels in the thermal image include a first color and a second color. Pixels with a highest temperature value are displayed in the first color, pixels with a lowest temperature value are displayed in the second color, and pixels with temperature values between the lowest and highest temperature values are displayed in gradations of color between the first and the second colors.
[0023] In some embodiments, the method includes classifying the density of the breast region into any one of BI-RADS A, BI-RADS B, BI-RADS C, or BI-RADS D, with the density of the breast region increasing from BI-RADS A to BI-RADS D.
[0024] In some embodiments, the text report includes tissue density values estimated based on a weighted average of the determined probabilities and the estimated tissue densities based on the determined density class and its associated range of density values. The text report includes at least one of the estimated density of the tissues or density values, along with a recommendation of the preferred modality of the breast examination for the subject.
[0025] In some embodiments, the method includes training the pre-trained deep learning model by providing a plurality of existing thermal images of different breast regions and their corresponding densities as training data.
[0026] In some embodiments, the method includes receiving the plurality of existing thermal images of different breast regions and their corresponding densities from at least one data source using at least one interface.
[0027] In some embodiments, the method includes recommending subsequent imaging modality to the subject based on the density of the tissue.
[0028] In some embodiments, the method includes training a deep learning model by providing a plurality of existing thermal images of different tissues and a plurality of density classes corresponding to the different tissues as training data to obtain the pre-trained deep learning model.
[0029] The system limits the number of women going for radiation-based screening such as mammography when their result is likely to be inconclusive. The estimated density provides a non-invasive, non-radiation and non-contact way of obtaining the breast density to determine the imaging modality, such as a non-mammography modality for high-density breasts and mammography for low-density breasts.
[0030] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF THE DRAWINGS
[0031] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0032] FIG. 1 illustrates a first exemplary environment diagram for estimating density of a tissue from thermal images using a deep learning model for enabling user to determine image modality according to an embodiment herein;
[0033] FIG. 2 illustrates an exploded view of a system for estimating density of a tissue from thermal images of a breast region of a subject according to some embodiments herein;
[0034] FIGS. 3A and 3B illustrate an exemplary process flow of a classification of density of a tissue into two or more density classes using a thermal image of a subject according to some embodiments herein;
[0035] FIG. 4 illustrates an exemplary process flow of a determination of probabilities of the tissue belonging to each density class using a deep learning model according to some embodiments herein;
[0036] FIG. 5 illustrates a flow diagram of one embodiment of the present method for estimating density of breast region using a deep learning model according to some embodiments herein; and
[0037] FIG. 6 illustrates a block diagram of one example system for processing a thermal image in accordance with the embodiments described with respect to the flow diagram of FIG. 5 according to some embodiments herein.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0038] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0039] As mentioned, there remains a need for a system and a method estimating the density of tissue from thermal images using a deep learning model for enabling a user to determine image modality. Referring now to the drawings, and more particularly to FIGS. 1 through 6, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[0040] A "person" and “subject” refer to either a male or a female. Gender pronouns are not to be viewed as limiting the scope of the appended claims strictly to females. Moreover, although the term “person” or “patient” or “subject” is used interchangeably throughout this disclosure, it should be appreciated that the person undergoing breast health screening may be something other than a human such as, for example, a primate. Therefore, the use of such terms is not to be viewed as limiting the scope of the appended claims to humans.
[0041] A “breast area” refers to a tissue of the breast and may further include surrounding tissue as is deemed appropriate for breast health screening. Thermal images are the capture of the breast area in various view angles which include a mediolateral view (center chest), a mediolateral oblique (angular) view, and a lateral (side) view, as are generally understood in the medical imaging arts. It should be appreciated that the mediolateral view is a supplementary mammographic view which generally shows less breast tissue and pectoral muscle than the mediolateral oblique view. FIG. 1 shows the breast area of a female 100. It should be appreciated that the patient may be stationary while the camera moves about the patient, or the patient can move while the camera remains stationary, or the patient and the camera may move to capture the appropriate view angles as desired.
[0042] A “thermal camera” refers to either a still camera or a video camera with a lens that focuses infrared energy from objects in a scene onto an array of specialized sensors which convert infrared energy across a desired thermal wavelength band into electrical signals on a per-pixel basis and which output an array of pixels with colors that correspond to temperatures of the objects in the image.
[0043] A "thermographic image" or simply a “thermal image” is an image captured by a thermal camera. The thermographic image comprises an array of color pixels with each color being associated with temperature. Pixels with a higher temperature value are displayed in the thermal image in a first color and pixels with a lower temperature value are displayed in a second color. Pixels with temperature values between the lower and higher temperature values are displayed in gradations of color between the first and second colors.
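By way of a non-limiting example, the temperature-to-color mapping described above can be sketched as a linear interpolation between two colors; the blue-to-red gradient and the `temperatures_to_colors` helper below are illustrative assumptions rather than a required implementation:

```python
import numpy as np

def temperatures_to_colors(temps, cold_color=(0, 0, 255), hot_color=(255, 0, 0)):
    """Map a 2-D array of per-pixel temperatures onto a two-color gradient.

    Pixels at the lowest temperature get `cold_color`, pixels at the highest
    temperature get `hot_color`, and intermediate temperatures are displayed
    in linear gradations between the two.
    """
    temps = np.asarray(temps, dtype=float)
    t_min, t_max = temps.min(), temps.max()
    # Normalize temperatures to [0, 1]; guard against a flat image.
    if t_max > t_min:
        scale = (temps - t_min) / (t_max - t_min)
    else:
        scale = np.zeros_like(temps)
    cold = np.array(cold_color, dtype=float)
    hot = np.array(hot_color, dtype=float)
    # Broadcast the scalar weight over the RGB channels.
    rgb = cold + scale[..., None] * (hot - cold)
    return rgb.astype(np.uint8)
```

In this sketch a pixel midway between the coldest and hottest readings is rendered halfway between the two endpoint colors, which is one simple way to realize "gradations of color between the first and second colors."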
[0044] "Receiving a thermal image" of a patient for breast health screening is intended to be widely construed and includes retrieving, capturing, acquiring, or otherwise obtaining video image frames.
[0045] "Analysing the thermographic image" means to identify a plurality of points (PN) in the image.
[0046] FIG. 1 illustrates a first exemplary environment diagram for estimating density of a tissue from thermal images using a deep learning model for enabling a user to determine image modality according to an embodiment herein. The thermal imaging camera 101 is mounted on a slidable and axially rotatable robotic arm 102 capable of moving the thermal imaging camera 101 along a semi-circular trajectory 103 in front of the patient/subject from side to side such that thermographic images may be captured in a right-side view 104, a front view 105, and a left-side view 106, and various oblique angles in between. The thermal imaging camera 101 can be a single-band infrared camera, a multi-band infrared camera in the thermal range, or a hyperspectral infrared camera in the thermal range. The resolution of the thermal imaging camera 101 is effectively the size of the pixel. Smaller pixels mean that the resulting image has a higher resolution and thus better spatial definition. Although the thermal imaging camera 101 offers a relatively large dynamic range of temperature settings, it is preferable that the temperature range of the camera be relatively small and centered around a surface temperature of the body of the patient so that even small temperature variations are amplified in terms of pixel color changes in order to provide a better measure of temperature variation. Thermal imaging cameras are readily available in various streams of commerce. The thermal imaging camera 101 is communicatively connected to a tissue density estimation system 107 which processes the thermal image captured by the thermal imaging camera 101 for estimating the density of a tissue from the thermal image of the female patient using a deep learning model to enable the user to perform further clinical diagnosis and management.
In some embodiments, the clinical diagnosis and management include automatic identification of a preferred modality of follow-up breast examination for the subject based on an estimated density of the tissue and an age of the subject. In some embodiments, the system automatically generates a text report with the estimated density of the tissues along with the recommendation of the preferred modality of the breast examination for the subject.
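The modality recommendation can be illustrated with a minimal rule-based sketch. The function name, the age threshold, and the modality strings below are hypothetical placeholders chosen for illustration and are not prescribed by this disclosure:

```python
def recommend_followup_modality(density_class, age):
    """Illustrative rule table: dense tissue (BI-RADS C/D) favors a
    non-radiation modality, as does a younger subject; fatty tissue
    (BI-RADS A/B) in an older subject favors mammography.
    The age cutoff of 40 is a hypothetical placeholder."""
    dense = density_class in ("BI-RADS C", "BI-RADS D")
    if dense or age < 40:
        return "ultrasound/MRI (non-radiation follow-up)"
    return "mammography"
```

A real deployment would derive such rules from clinical guidelines; the sketch only shows how the density class and age can jointly select a follow-up modality for the text report.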
[0047] FIG. 2 illustrates an exploded view of a system for estimating density of a tissue from thermal images of a breast region of a subject according to some embodiments herein. The block diagram 200 of the tissue density estimation system 107 includes a thermal image receiving module 202, a density classification module 204, a density analysis module 206, a data source interface module 208, a density class determination module 210, a text report generation module 212 and a data source 214. The thermal image receiving module 202 receives a thermal image of a tissue of a subject/patient. In some embodiments, the thermal image represents a temperature distribution on the tissue of the subject as pixels in the thermal image, where pixels with the highest temperature value are displayed in a first color and pixels with the lowest temperature value are displayed in a second color. Pixels with temperature values between the lowest and highest temperature values are displayed in gradations of color between the first and second colors. In some embodiments, the thermal image is captured using at least one of a thermal imaging camera 101 or a wearable device that is connected with the tissue density estimation system 107. In some embodiments, the thermal imaging camera 101 or the wearable device includes an array of sensors, a lens, and a specialized processor. The array of sensors converts infrared energy into electrical signals on a per-pixel basis. The lens focuses the infrared energy from the tissue of the subject onto the array of sensors. The array of sensors detects temperature values from the tissue of the subject. The specialized processor processes the detected temperature values into at least one block of pixels to generate the thermal image.
[0048] The density classification module 204 classifies the density of the tissue into two or more density classes from the captured thermal image to determine a density class of the tissue. In some embodiments, the density classification module classifies the density of the tissue using a pre-trained deep learning model by providing the captured thermal image as an input. In some embodiments, the density classes include at least one of BI-RADS A, BI-RADS B, BI-RADS C, or BI-RADS D, with the density of the tissue increasing from BI-RADS A to BI-RADS D. In some embodiments, the pre-trained deep learning model is trained by providing a plurality of existing thermal images of different tissues and a plurality of density classes corresponding to the different tissues as training data to obtain the pre-trained deep learning model for classifying the density of the tissue. In some embodiments, the classification includes analyzing the thermal image to determine a plurality of probabilities of the tissue belonging to each density class from among the two or more density classes using the deep learning model. In some embodiments, the classification includes determining a density class associated with a maximum probability from among the plurality of probabilities.
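As a non-limiting sketch of this classification step, the model's raw output scores can be mapped to the plurality of probabilities with a softmax and the maximum-probability class selected; the class list, the softmax mapping, and the helper names are illustrative assumptions, since the disclosure does not fix a particular model architecture:

```python
import numpy as np

# Illustrative class list in order of increasing density.
BIRADS_CLASSES = ["BI-RADS A", "BI-RADS B", "BI-RADS C", "BI-RADS D"]

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

def classify_density(logits, classes=BIRADS_CLASSES):
    """Turn raw per-class model scores into probabilities and pick the
    density class associated with the maximum probability."""
    probs = softmax(np.asarray(logits, dtype=float))
    best = int(np.argmax(probs))
    return classes[best], probs
```

The returned probability vector is what the later weighted-average density estimate and the text report draw on.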
[0049] The density analysis module 206 analyses the captured thermal image using the deep learning model to determine the plurality of probabilities of the tissue belonging to each density class. In some embodiments, the deep learning model is trained to determine a plurality of probabilities of the tissue belonging to each density class from among the two or more density classes by providing the plurality of existing thermal images of different tissues and their corresponding densities as training data. The data source interface module 208 includes at least one interface for communicating with at least one data source to obtain the plurality of existing thermal images of different tissues and their corresponding densities. The density class determination module 210 determines the density class associated with the maximum probability from among the plurality of probabilities. The text report generation module 212 automatically generates a text report with the determined class of the tissue to estimate the density of the tissue. In some embodiments, the text report includes the estimated tissue densities based on a weighted average of the determined probabilities. In some embodiments, the text report includes estimated tissue densities based on the determined density class and its associated range of density values. In some embodiments, the tissue density estimation system 107 provides the text report to the user to enable the user to determine an image modality. In some embodiments, the image modality may be associated with breast cancer screening.
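The weighted-average density estimate mentioned above can be sketched as below. The representative percent-density value assigned to each BI-RADS class is an assumption for illustration (the midpoint of a conventional density range for that class), not a value taken from the specification.

```python
# Assumed representative percent-density midpoints per BI-RADS class,
# chosen for illustration only.
CLASS_DENSITY_MIDPOINTS = {
    "BI-RADS A": 12.5,   # almost entirely fatty
    "BI-RADS B": 37.5,   # scattered fibroglandular density
    "BI-RADS C": 62.5,   # heterogeneously dense
    "BI-RADS D": 87.5,   # extremely dense
}

def weighted_density_estimate(probs):
    """Estimate tissue density as a probability-weighted average.

    ``probs`` maps each density class to its predicted probability;
    each class contributes its representative density value weighted
    by that probability.
    """
    return sum(CLASS_DENSITY_MIDPOINTS[c] * p for c, p in probs.items())
```

This yields a single continuous density figure for the report, alongside the discrete class determined by the maximum probability.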
[0050] With reference to FIG. 2, FIGS. 3A and 3B illustrate an exemplary process flow of a classification of density of a tissue into two or more density classes using a thermal image of a subject according to some embodiments herein. At step 302, the thermal image is captured using a thermal imaging camera. In some embodiments, the thermal image may be received or retrieved from a remote device over a network, or from media such as a Compact Disc Read-Only Memory (CD-ROM) or Digital Versatile/Video Disc (DVD). The thermal image may be downloaded from a web-based system or an application that makes a video available for processing, in accordance with the methods disclosed herein. The thermal image may also be received from an application such as applications available for handheld cellular devices and processed on the cell phone or other handheld computing devices such as an iPad or Tablet-PC. The thermal image may be received directly from a memory or storage device of the imaging device that is used to capture that thermal image or a thermal video. At step 304, the density of the tissue from the captured thermal image is classified into two or more density classes using a pre-trained deep learning model. In some embodiments, the pre-trained deep learning model is integrated with the tissue density estimation system 107 to classify the density of the tissue from the captured thermal image into two or more density classes. In some embodiments, the density classes include at least one of BI-RADS A, BI-RADS B, BI-RADS C, or BI-RADS D, with the density of the tissue increasing from BI-RADS A to BI-RADS D. In some embodiments, the deep learning model determines a plurality of probabilities (PB1-N) of the tissue belonging to each density class from the two or more density classes by analyzing the captured thermal image.
In some embodiments, the tissue density estimation system 107 determines the density class associated with a maximum probability from among the plurality of probabilities to estimate the density of the tissue. At step 306, the classified density of the tissue is provided to the tissue density estimation system 107 for further analysis.
[0051] With reference to FIG. 2, FIG. 4 illustrates an exemplary process flow of a determination of probabilities of the tissue belonging to each density class using a deep learning model according to some embodiments herein. At step 402, the thermal image is captured using a thermal imaging camera. At step 404, the pre-trained deep learning model is integrated with the tissue density estimation system 107 to classify the density of the tissue from the captured thermal image into two or more density classes. In some embodiments, the density of the tissue from the captured thermal image is classified into two or more density classes using the pre-trained deep learning model to determine the density class of the tissue to estimate the density of the tissue. At step 406, the plurality of probabilities of the tissue belonging to each density class is determined from among the two or more density classes of the tissue using the deep learning model. At step 408, the density class of the tissue is obtained by determining the class with the maximum probability using the deep learning model. In some embodiments, the classification includes analyzing the thermal image to determine a plurality of probabilities of the tissue belonging to each density class from among the two or more density classes using a deep learning model. In some embodiments, the classification includes determining a density class associated with the maximum probability from among the plurality of probabilities.
[0052] At step 410, a text report is generated with the determined class of the tissue to estimate the density of the tissue. At step 412, the generated text report is provided to at least one of the tissue density estimation system 107 or the user for further analysis.
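The automatic text report generation at step 410 can be sketched as below. This is a minimal illustrative example; the report wording and the fields shown (subject identifier, class, confidence) are assumptions, as the specification does not fix a report format.

```python
def generate_text_report(subject_id, density_class, max_probability):
    """Compose a plain-text report from the determined density class.

    ``max_probability`` is the probability of the determined class,
    reported here as a confidence figure. Field names and layout are
    illustrative assumptions.
    """
    lines = [
        "Tissue Density Estimation Report",
        f"Subject ID: {subject_id}",
        f"Determined density class: {density_class}",
        f"Confidence (maximum class probability): {max_probability:.1%}",
    ]
    return "\n".join(lines)
```

The resulting string can then be displayed to the user or stored by the tissue density estimation system for further analysis, as at step 412.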
[0053] With reference to FIG. 2, FIG. 5 illustrates a flow diagram of one embodiment of the present method for estimating density of a breast region using a deep learning model according to some embodiments herein. At step 502, the thermal image of the body of the subject is received. In some embodiments, the thermal image represents a temperature distribution on the tissue of the subject as pixels in the thermal image. At step 504, the pre-trained deep learning model is integrated with the tissue density estimation system 107 to classify the density of the tissue from the captured thermal image into two or more density classes. At step 506, the density of the tissue from the captured thermal image is classified into two or more density classes by providing the captured thermal image as an input to the pre-trained deep learning model to determine the density class of the tissue. The classification of the density of the tissue from the captured thermal image includes analyzing the captured thermal image to determine a plurality of probabilities of the tissue belonging to each density class from among the two or more density classes using a deep learning model and determining the density class associated with a maximum probability from among the plurality of probabilities. At step 508, the text report with the determined class of the tissue is generated to estimate the density of the tissue. In some embodiments, the classification includes analyzing the thermal image to determine a plurality of probabilities of the tissue belonging to each density class from among the two or more density classes using a deep learning model. In some embodiments, the classification includes determining a density class associated with the maximum probability from among the plurality of probabilities.
[0054] FIG. 6 illustrates a block diagram of one example system for processing a thermal image in accordance with the embodiments described with respect to the flow diagram of FIG. 5 according to some embodiments herein. The system includes an image receiver 602, a temperature processor 603, a density classifier 604, a storage device 605, a machine learning model 606, a Central Processing Unit (CPU) 608, a memory 609, a workstation 610, machine-readable media 611, a display device 612, a keyboard 613, a mouse 614, a database 616 and a network 617. In some embodiments, the Central Processing Unit (CPU) 608, the memory 609, the workstation 610, the machine-readable media 611, the display device 612, the keyboard 613, the mouse 614 and the database 616 are connected to the system using the network 617. In some embodiments, the image receiver 602 wirelessly receives a video via an antenna 601, the video having been transmitted thereto from the video/thermal imaging device 101 of FIG. 1. The temperature processor 603 uses a temperature-based method to detect pixels in the received image. The density classifier 604 classifies the density of the tissue into two or more density classes from the captured thermal image. Both the temperature processor 603 and the density classifier 604 store their results to the storage device 605. The machine learning model 606 retrieves the results from the storage device 605 and proceeds to estimate the density of the tissue from thermal images of a breast region of a subject. The machine learning model 606 determines the class with the maximum probability to obtain the density class of the tissue. The machine learning model 606 is trained using a plurality of existing thermal images of different tissues and their corresponding densities as training data. The Central Processing Unit (CPU) 608 retrieves machine-readable program instructions from the memory 609 to facilitate the functionality of any of the modules of the system 600.
The Central Processing Unit (CPU) 608, operating alone or in conjunction with other processors, may be configured to assist or otherwise perform the functionality of any of the modules or processing units of the system 600 as well as facilitating communication between the system 600 and the workstation 610.
[0055] The computer case of the workstation 610 houses various components such as a motherboard with a processor and a memory, a network card, a video card, a hard drive capable of reading/writing to machine-readable media 611 such as a floppy disk, optical disk, CD-ROM, DVD, magnetic tape, and the like, and other software and hardware needed to perform the functionality of a computer workstation. The workstation 610 further includes the display device 612, such as a CRT, LCD, or touch screen device, for displaying information, images, view angles, and the like. A user can view any of that information and make a selection from menu options displayed thereon. The keyboard 613 and the mouse 614 effectuate a user input. It should be appreciated that the workstation 610 has an operating system and other specialized software configured to display alphanumeric values, menus, scroll bars, dials, slideable bars, pull-down options, selectable buttons, and the like, for entering, selecting, modifying, and accepting information needed for processing in accordance with the teachings hereof. The workstation 610 is further enabled to display thermal images, the view angle of the thermal images and the like as they are derived. A user or technician may use the user interface of the workstation 610 to set parameters and adjust various aspects of how the integration of the pre-trained deep learning model, the determination of probabilities of the tissue belonging to each density class, the determination of the class with a maximum probability and the generation of the text report are performed, as needed or as desired, depending on the implementation. Any of these selections or inputs may be stored to/retrieved from the storage device 611. Default settings can be retrieved from the storage device 611. A user of the workstation 610 is also able to view or manipulate any of the data in the patient records, collectively at 615, stored in the database 616.
Any of the received images, results, determined view angle, and the like, may be stored to the storage device 611 internal to the workstation 610. Although shown as a desktop computer, the workstation 610 can be a laptop, mainframe, or a special purpose computer such as an ASIC, circuit, or the like.
[0056] Any of the components of the workstation 610 may be placed in communication with any of the modules and processing units of system 600. Any of the modules of the system 600 can be placed in communication with the storage devices 605, 616 and 606 and/or the computer-readable media 611 and may store/retrieve therefrom data, variables, records, parameters, functions, and/or machine-readable/executable program instructions, as needed to perform their intended functions. Each of the modules of the system 600 may be placed in communication with one or more remote devices over the network 617. It should be appreciated that some or all of the functionality performed by any of the modules or processing units of the system 600 can be performed, in whole or in part, by the workstation 610. The embodiment shown is illustrative and should not be viewed as limiting the scope of the appended claims strictly to that configuration. Various modules may designate one or more components which may, in turn, comprise software and/or hardware designed to perform the intended function.
[0057] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope.
| # | Name | Date |
|---|---|---|
| 1 | 202041014220-STATEMENT OF UNDERTAKING (FORM 3) [31-03-2020(online)].pdf | 2020-03-31 |
| 2 | 202041014220-PROOF OF RIGHT [31-03-2020(online)].pdf | 2020-03-31 |
| 3 | 202041014220-POWER OF AUTHORITY [31-03-2020(online)].pdf | 2020-03-31 |
| 4 | 202041014220-FORM FOR STARTUP [31-03-2020(online)].pdf | 2020-03-31 |
| 5 | 202041014220-FORM FOR SMALL ENTITY(FORM-28) [31-03-2020(online)].pdf | 2020-03-31 |
| 6 | 202041014220-FORM 1 [31-03-2020(online)].pdf | 2020-03-31 |
| 7 | 202041014220-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [31-03-2020(online)].pdf | 2020-03-31 |
| 8 | 202041014220-EVIDENCE FOR REGISTRATION UNDER SSI [31-03-2020(online)].pdf | 2020-03-31 |
| 9 | 202041014220-DRAWINGS [31-03-2020(online)].pdf | 2020-03-31 |
| 10 | 202041014220-DECLARATION OF INVENTORSHIP (FORM 5) [31-03-2020(online)].pdf | 2020-03-31 |
| 11 | 202041014220-COMPLETE SPECIFICATION [31-03-2020(online)].pdf | 2020-03-31 |
| 12 | 202041014220-FORM-9 [15-05-2020(online)].pdf | 2020-05-15 |
| 13 | 202041014220-STARTUP [16-05-2020(online)].pdf | 2020-05-16 |
| 14 | 202041014220-FORM28 [16-05-2020(online)].pdf | 2020-05-16 |
| 15 | 202041014220-FORM 18A [16-05-2020(online)].pdf | 2020-05-16 |
| 16 | 202041014220-FER.pdf | 2020-06-09 |
| 17 | 202041014220-OTHERS [09-12-2020(online)].pdf | 2020-12-09 |
| 18 | 202041014220-FER_SER_REPLY [09-12-2020(online)].pdf | 2020-12-09 |
| 19 | 202041014220-DRAWING [09-12-2020(online)].pdf | 2020-12-09 |
| 20 | 202041014220-CORRESPONDENCE [09-12-2020(online)].pdf | 2020-12-09 |
| 21 | 202041014220-COMPLETE SPECIFICATION [09-12-2020(online)].pdf | 2020-12-09 |
| 22 | 202041014220-CLAIMS [09-12-2020(online)].pdf | 2020-12-09 |
| 23 | 202041014220-PatentCertificate28-06-2021.pdf | 2021-06-28 |
| 24 | 202041014220-IntimationOfGrant28-06-2021.pdf | 2021-06-28 |
| 25 | 202041014220-RELEVANT DOCUMENTS [30-09-2022(online)].pdf | 2022-09-30 |
| 26 | 202041014220-RELEVANT DOCUMENTS [28-08-2023(online)].pdf | 2023-08-28 |

| # | Name |
|---|---|
| 1 | US20180000461A1E_08-06-2020.pdf |
| 2 | US20110026791A1E_08-06-2020.pdf |
| 3 | searchstrategyE_08-06-2020.pdf |