Abstract: A system 100 for breast cancer screening using thermal imaging is provided. The system 100 comprises at least one thermal imaging camera 110, 508, at least one first temperature sensor 112, 510, at least one second temperature sensor 114, 512, and a portable edge device 102, 500. The device 102, 500 includes a processing unit 106, 504 configured to obtain thermal images, calibrate pixel values based on temperature data measured by the first temperature sensor 112, 510 and the second temperature sensor 114, 512, preprocess the calibrated thermal images, extract temperature-based features using deep learning models, generate a probability score indicating abnormality, and classify breast tissue as normal or abnormal. The system performs data processing locally on the portable edge device 102, 500. The system of the present disclosure is cost-effective, portable, and enables early detection of breast abnormalities without complex infrastructure, enhancing accessibility to breast cancer screening. FIG. 1
Description: FIELD OF INVENTION
[0001] The present disclosure relates generally to breast cancer screening, and more particularly to a cost-effective system and method for breast cancer screening using thermal images and artificial intelligence (AI). Moreover, the system and method of the present disclosure employs a portable edge device for real-time diagnosis in resource-limited settings, thereby enhancing accessibility to early breast cancer detection in underserved areas.
BACKGROUND
[0002] Breast cancer screening and early detection are crucial aspects of women's healthcare, particularly in regions with limited access to advanced medical facilities. The field of breast cancer diagnostics has seen significant advancements in recent years, with various technologies being developed to address the challenges associated with traditional screening methods.
[0003] Mammography has long been considered the gold standard for breast cancer screening. However, mammography has several limitations, including the use of ionizing radiation, high cost, and the need for sophisticated infrastructure. These factors make mammography unsuitable for widespread deployment in remote or underserved areas, creating a critical gap in early detection capabilities for many women worldwide.
[0004] The need for portable, affordable, and non-invasive early screening solutions has led to the development of alternative technologies. Thermal imaging, for instance, has emerged as a promising approach for breast cancer detection. This method relies on the principle that cancerous tissues often exhibit higher temperatures compared to surrounding healthy tissues due to increased blood flow and metabolic activity.
[0005] Several commercial solutions have been introduced to address the limitations of traditional mammography. However, such existing solutions face challenges in terms of affordability and practicality, especially in low-resource settings.
[0006] Despite advancements, existing solutions often depend on expensive hardware or cloud-based processing, limiting their usability in rural settings. Many current systems require high-end thermal imaging devices and rely on internet connectivity for cloud-based artificial intelligence (AI) analysis. This dependency on sophisticated infrastructure and connectivity poses significant challenges for deployment in remote areas with limited resources.
[0007] Furthermore, the cost of the existing screening systems remains a major barrier to widespread adoption. Commercial alternatives typically cost between INR 5-10 lakh, making them prohibitively expensive for many healthcare providers, especially in developing regions. This high cost significantly limits the potential for mass deployment in government screening programs, rural healthcare centers, and mobile diagnostic units.
[0008] Another limitation of existing thermal imaging-based screening solutions is their reliance solely on thermal data. This approach can potentially lead to false positives or negatives, as the surrounding environmental conditions, for instance, may influence thermal readings.
[0009] Additionally, the dependency on external processing units or cloud connectivity in many current solutions poses challenges for real-time, on-site diagnosis. This reliance on external resources can lead to delays in results and may be impractical in areas with unreliable internet connectivity or limited access to advanced computing infrastructure.
[0010] Therefore, there is a need to overcome the problems discussed above in breast cancer screening.
OBJECTIVES
[0011] The primary objective of the present disclosure is to provide a system for breast cancer screening using thermal imaging and artificial intelligence (AI).
[0012] Another objective of the present disclosure is to offer a portable edge device for local data processing in breast cancer screening.
[0013] Yet another objective of the present disclosure is to provide a system and method for breast cancer screening that accurately compensates for environmental temperature variations, thereby enhancing the reliability of thermal image analysis across diverse ambient conditions.
[0014] Yet another objective of the present disclosure is to offer a breast cancer screening solution that operates without requiring internet connectivity or cloud-based processing, making it suitable for use in remote or rural areas with limited network infrastructure.
[0015] A still further objective of the present disclosure is to provide a cost-effective alternative to traditional mammography screening, potentially increasing access to breast cancer early detection in resource-limited settings.
[0016] Another objective of the present disclosure is to develop a non-invasive and radiation-free method for breast cancer screening, allowing for more frequent examinations without the risks associated with ionizing radiation.
[0017] An additional objective of the present disclosure is to create a user-friendly system that can be operated by healthcare workers with minimal specialized training, thereby expanding the reach of breast cancer screening programs.
[0018] A further objective of the present disclosure is to enable real-time analysis and immediate display of screening results.
SUMMARY
[0019] The present disclosure provides a technical solution for enhancing breast cancer screening accessibility and accuracy in resource-limited settings through the integration of thermal imaging and edge computing. Specifically, the present disclosure addresses the challenges of performing high-accuracy breast abnormality detection using thermal images by optimizing thermal images and analyzing the optimized thermal images using artificial intelligence (AI) models. Further, the present disclosure compensates for environmental and physiological temperature variations to improve screening reliability. More particularly, the present disclosure enables real-time, on-device analysis without reliance on cloud computing or internet connectivity. Accordingly, the present disclosure offers a non-invasive, radiation-free alternative to traditional mammography screening. In addition, the present disclosure provides a cost-effective, portable solution suitable for deployment in remote or underserved areas. By combining these technological elements, the present disclosure offers a comprehensive approach to overcoming existing limitations in breast cancer screening, particularly in contexts where access to conventional diagnostic tools is limited.
[0020] According to one aspect of the present disclosure, a system for breast cancer screening is provided. The system comprises at least one thermal imaging camera, at least one first temperature sensor, at least one second temperature sensor, and a portable edge device. The thermal imaging camera, the first temperature sensor, and the second temperature sensor are positioned in the portable edge device or communicatively coupled to the portable edge device. The thermal imaging camera is configured for capturing one or more thermal images of a right side breast and a left side breast of a subject. The captured thermal images comprise first pixel values. The first temperature sensor is configured for measuring body surface temperature of the subject. The second temperature sensor is configured for measuring environment temperature associated with the subject. The portable edge device comprises a memory storing a set of instructions and a processing unit. The processing unit is configured to execute the set of instructions to: obtain the thermal images of the right side breast and the left side breast of the subject; obtain body surface temperature data and environment temperature data; calibrate pixel values of the thermal images of the right side breast and the left side breast based on the body surface temperature data and the environment temperature data; preprocess the calibrated thermal images; extract, using one or more deep learning convolution models deployed in the portable edge device, one or more temperature-based features from the preprocessed thermal images; generate, using the one or more deep learning convolution models, a probability score indicating presence of abnormality associated with breast cancer, based on analysis of the extracted temperature-based features; and classify, using the one or more deep learning convolution models, breast tissue associated with the thermal images as normal or abnormal based on the probability score.
The portable edge device is configured to perform data processing locally in the portable edge device. The first temperature sensor and the second temperature sensor are non-contact infrared thermal sensors.
[0021] The processing unit is further configured to display, on a display unit, at least one of the thermal images of the breast, the body surface temperature data, the environment temperature data, a classification result associated with the thermal images, a warning message if an abnormality is detected, or combinations thereof.
[0022] The processing unit is configured to preprocess the calibrated thermal images by: resizing, using bilinear interpolation, the calibrated thermal images comprising first pixel values into thermal images comprising second pixel values; replicating resized single-channel thermal images across three channels (red, green, blue), thereby generating a three-dimensional tensor; normalizing pixel intensity values of the thermal images between 0 and 1 after generating the three-dimensional tensor; applying a median filter for noise reduction in the thermal images after normalization; applying contrast limited adaptive histogram equalization (CLAHE) for contrast enhancement and improvement of visibility of subtle thermal anomalies after noise reduction; and masking out background and non-breast regions from the thermal images.
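By way of illustration only, the preprocessing chain of paragraph [0022] may be sketched in Python roughly as follows. This is a minimal, dependency-free sketch: the 224 × 224 target size and the 3 × 3 median kernel are assumptions chosen as typical values, plain global histogram equalization stands in for CLAHE to keep the sketch self-contained, and the channel replication is performed last here (rather than second) since the three replicated channels are identical either way.

```python
import numpy as np

def bilinear_resize(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Bilinear interpolation from the camera's native grid to out_h x out_w."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def median3x3(img: np.ndarray) -> np.ndarray:
    """3x3 median filter for noise reduction (edges handled by border padding)."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    stack = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(stack), axis=0)

def preprocess(calibrated: np.ndarray, mask: np.ndarray, size: int = 224) -> np.ndarray:
    # 1. Resize the calibrated image (first pixel values -> second pixel values).
    img = bilinear_resize(calibrated.astype(np.float64), size, size)
    # 2. Normalize pixel intensities to the [0, 1] range.
    img = (img - img.min()) / (img.max() - img.min() + 1e-9)
    # 3. Median filter for noise reduction.
    img = median3x3(img)
    # 4. Global histogram equalization (standing in for CLAHE).
    hist, bins = np.histogram(img.ravel(), bins=256, range=(0.0, 1.0))
    cdf = hist.cumsum() / hist.sum()
    img = np.interp(img.ravel(), bins[:-1], cdf).reshape(img.shape)
    # 5. Mask out background / non-breast regions (mask assumed available).
    img = img * bilinear_resize(mask.astype(np.float64), size, size)
    # 6. Replicate the single channel across R, G, B -> three-dimensional tensor.
    return np.stack([img] * 3, axis=-1).astype(np.float32)
```

The resulting tensor matches the input shape expected by CNN backbones pretrained on three-channel images, which is the stated motivation for the channel replication step.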
[0023] The one or more temperature-based features comprise thermal asymmetry between the left side breast and the right side breast, localized hotspots in the breast tissue, contour and edge irregularities, thermal gradients across breast tissue, texture and micro-patterns, or combinations thereof.
[0024] The processing unit is configured to extract the one or more temperature-based features from the preprocessed thermal images by: detecting differences in temperature distribution between the left side breast and the right side breast; identifying focal high-temperature regions indicating abnormal tissue activity; detecting irregular thermal boundaries suggesting abnormal growth patterns in the breast tissue; determining a rate of temperature change across breast tissue regions, where sharp gradients indicate malignancies; and extracting subtle heat distribution textures reflecting early pathological changes.
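A hand-written approximation of the feature families of paragraphs [0023] and [0024] is sketched below. In the disclosed system these features are learned by the deep learning convolution models, so the summary statistics and the mean-plus-two-sigma hotspot threshold used here are illustrative assumptions only.

```python
import numpy as np

def thermal_features(left: np.ndarray, right: np.ndarray) -> dict:
    """Toy versions of the disclosed feature families for two single-channel
    temperature maps of equal shape (left and right breast)."""
    # Thermal asymmetry: compare the left map against the mirrored right map.
    mirrored = right[:, ::-1]
    asymmetry = float(np.mean(np.abs(left - mirrored)))
    # Localized hotspots: fraction of pixels well above the tissue's own mean.
    combined = np.concatenate([left.ravel(), right.ravel()])
    threshold = combined.mean() + 2.0 * combined.std()
    hotspot_fraction = float(np.mean(combined > threshold))
    # Thermal gradient: mean rate of temperature change across the tissue.
    gy, gx = np.gradient(left)
    gradient_mag = float(np.mean(np.hypot(gx, gy)))
    return {"asymmetry": asymmetry,
            "hotspot_fraction": hotspot_fraction,
            "mean_gradient": gradient_mag}
```

For a perfectly symmetric pair of breasts the asymmetry score is zero, which matches the clinical intuition that healthy contralateral tissue exhibits near-mirror thermal distributions.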
[0025] The one or more deep learning convolution models are trained by: obtaining a plurality of calibrated thermal images of the breast; preprocessing the plurality of calibrated thermal images; labeling the preprocessed calibrated thermal images as normal or abnormal based on clinical diagnosis; extracting temperature-based features from the labeled preprocessed thermal images; and training the one or more deep learning convolution models using the calibrated thermal images, the extracted temperature-based features, and corresponding labels to classify breast tissue as normal or abnormal.
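The training workflow of paragraph [0025] may be illustrated with a deliberately simplified stand-in: a logistic-regression classifier trained on extracted feature vectors in place of a full convolutional network, so the sketch stays self-contained and runnable on an edge device. The learning rate, epoch count, and 0.5 decision threshold are assumptions, not values from the disclosure.

```python
import numpy as np

def train_classifier(X: np.ndarray, y: np.ndarray,
                     lr: float = 0.1, epochs: int = 500) -> np.ndarray:
    """Logistic-regression stand-in for the CNN training loop: X holds one
    feature vector per labeled thermal image, y holds 0 (normal) / 1 (abnormal)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))           # probability scores
        w -= lr * Xb.T @ (p - y) / len(y)           # cross-entropy gradient step
    return w

def predict(w: np.ndarray, X: np.ndarray):
    """Return probability scores and normal (0) / abnormal (1) classifications."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    scores = 1.0 / (1.0 + np.exp(-Xb @ w))
    return scores, (scores >= 0.5).astype(int)
```

The probability score returned by `predict` plays the role of the abnormality score in the disclosed pipeline, with thresholding producing the final normal/abnormal classification.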
[0026] The processing unit is configured to store at least one of the thermal images of the breast, the body surface temperature data, the environment temperature data, a classification result associated with the thermal images, or combinations thereof.
[0027] According to another aspect of the present disclosure, a method for breast cancer screening is provided. The method comprises: capturing, using at least one thermal imaging camera, one or more thermal images of a right side breast and a left side breast of a subject, wherein the captured thermal images comprise first pixel values; measuring, using at least one first temperature sensor, body surface temperature of the subject; measuring, using at least one second temperature sensor, environment temperature associated with the subject; calibrating, using a processing unit of a portable edge device, pixel values of the thermal images of the right side breast and the left side breast based on the body surface temperature data and the environment temperature data; preprocessing, using the processing unit, the calibrated thermal images; extracting, using one or more deep learning convolution models deployed in the portable edge device, one or more temperature-based features from the preprocessed thermal images; generating, using the one or more deep learning convolution models, a probability score indicating presence of abnormality associated with breast cancer, based on analysis of the extracted temperature-based features; and classifying, using the one or more deep learning convolution models, breast tissue associated with the thermal images as normal or abnormal based on the probability score.
[0028] The foregoing paragraphs have been provided by way of general introduction and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF DRAWINGS
[0029] FIG. 1 is a block diagram illustrating a system for breast cancer screening in accordance with the present disclosure.
[0030] FIG. 2 is a block diagram illustrating one or more modules of a portable edge device of FIG. 1 in accordance with the present disclosure.
[0031] FIGS. 3A-3B are flow charts illustrating a method of breast cancer screening in accordance with the present disclosure.
[0032] FIG. 4 is a flow chart illustrating a method of training a deep learning convolution neural network model in accordance with the present disclosure.
[0033] FIG. 5 is a block diagram illustrating a portable edge device in accordance with the present disclosure.
DETAILED DESCRIPTION OF THE PRESENT DISCLOSURE
[0034] Aspects of the present invention are best understood by reference to the description set forth herein. All the aspects described herein will be better appreciated and understood when considered in conjunction with the following descriptions. It should be understood, however, that the following descriptions, while indicating preferred aspects and numerous specific details thereof, are given by way of illustration only and should not be treated as limitations. Changes and modifications may be made within the scope herein without departing from the spirit and scope thereof, and the present invention herein includes all such modifications.
[0035] As mentioned above, there is a need for a technical solution to solve the aforementioned technical problems in breast cancer screening. The present disclosure provides a cost-effective system and a method for breast cancer screening in resource-limited settings through the integration of thermal imaging and edge computing. Specifically, the system optimizes thermal images using a preprocessing technique, thereby enabling enhancement of image quality and detail retention despite the low resolution of the initial input, efficient utilization of affordable thermal imaging hardware without compromising diagnostic accuracy, compatibility with deep learning models designed for higher resolution inputs, improved feature extraction capabilities leading to more reliable abnormality detection, reduction of noise and artifacts commonly associated with thermal imaging, maximization of relevant thermal pattern visibility while minimizing irrelevant background information, adaptation of thermal image data for optimal processing by convolutional neural networks, and real-time image processing on resource-constrained edge devices without compromising analytical performance. The preprocessing technique of the present disclosure thus forms a bridge between low-cost hardware capabilities and sophisticated AI-driven analysis, enabling high-quality breast cancer screening in resource-limited settings. Further, the system compensates for environmental and physiological temperature variations to improve screening reliability by calibrating thermal images with the subject's body surface temperature and the ambient environmental temperature. The dual-sensor approach allows for real-time calibration of the thermal images, adjusting for both individual physiological differences and varying environmental conditions. 
The calibration mechanism enhances the consistency and accuracy of the breast cancer screening process across diverse settings, from climate-controlled medical facilities to variable outdoor environments, thereby increasing the adaptability and reliability of the screening system in resource-limited and remote locations. More particularly, the system enables real-time, on-device analysis without reliance on cloud computing or internet connectivity. Accordingly, the present disclosure offers a non-invasive, radiation-free alternative to traditional mammography screening. By combining the low-cost hardware capabilities and sophisticated AI-driven analysis, the present disclosure offers a comprehensive approach to overcoming existing limitations in breast cancer screening, particularly in contexts where access to conventional diagnostic tools is limited.
[0036] As used herein, several terms are defined below:
[0037] The term “thermal imaging camera” as used herein generally refers to a thermal imaging device capable of capturing infrared radiation emitted by an object and converting it into a visual representation.
[0038] The term “thermal images” as used herein generally refers to thermal images captured by a thermal imaging camera, which represent temperature distributions of the captured scene.
[0039] The term “temperature sensor” as used herein generally refers to a device capable of measuring temperature without physical contact, such as an infrared thermometer or a non-contact temperature probe, used for measuring body surface or ambient temperatures.
[0040] The term “portable edge device” as used herein generally refers to a compact, self-contained computing device capable of performing data processing and analysis at or near the source of data generation, without relying on cloud computing or internet connectivity.
[0041] The term “pixel values” as used herein generally refers to the numerical representation of temperature or intensity at each point (pixel) in a thermal image.
[0042] The term “deep learning convolution models” as used herein generally refers to artificial neural network architectures, such as convolutional neural networks (CNNs), configured to automatically learn and extract relevant features from image data for tasks such as classification or anomaly detection.
[0043] The term “locally” as used herein generally refers to operations or processes performed entirely within a portable edge device, without requiring external network connectivity or cloud-based resources.
[0044] The term “preprocessing” as used herein generally refers to a series of image processing techniques applied to raw thermal images to enhance their quality, standardize their format, and prepare them for analysis by deep learning models.
[0045] The term “calibrating” as used herein generally refers to the process of adjusting the pixel values of thermal images to ensure accurate representation of temperature distributions for analysis.
[0046] Referring now to the drawings, and more particularly to FIGS. 1 through 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[0047] FIG. 1 is a block diagram illustrating a system 100 for breast cancer screening in accordance with the present disclosure. The system 100 includes a portable edge device 102, a thermal imaging camera 110, a first temperature sensor 112, and a second temperature sensor 114. The thermal imaging camera 110, the first temperature sensor 112, and the second temperature sensor 114 are communicatively connected with the portable edge device 102. For instance, the connection may be established through a communication interface, either via wired means or wirelessly. In some embodiments, the wired connection may utilize protocols such as USB (Universal Serial Bus), SPI (Serial Peripheral Interface), or I2C (Inter-Integrated Circuit) for data transmission. These wired connections can provide stable, high-speed data transfer with minimal latency. In other embodiments, wireless communication may be employed, utilizing protocols such as Wi-Fi, Bluetooth Low Energy (BLE), or ZigBee. Wireless connections offer flexibility in device placement and can contribute to the overall portability of the system 100.
[0048] The thermal imaging camera 110 is configured for capturing one or more thermal images of a right side breast and a left side breast of a subject. In some embodiments, the thermal imaging camera 110 is a low-resolution thermal imaging camera. The low-resolution thermal imaging camera 110 may have a resolution of 160 × 120 pixels. In some embodiments, the thermal imaging camera 110 has a resolution below 160 × 120 pixels. In some embodiments, the thermal imaging camera 110 has a resolution above 160 × 120 pixels. It is to be noted that more than one thermal imaging camera may be used. In some embodiments, the thermal imaging camera 110 comprises a lens system, an infrared detector, and a signal processing unit. The lens system is configured for focusing the infrared radiation emitted from the right side breast and the left side breast of the subject onto the infrared detector. The infrared detector is configured for capturing infrared radiation emitted from the right side breast and the left side breast of the subject and converting the captured infrared radiation into an electrical signal. In one exemplary embodiment, the infrared detector is a microbolometer. The signal processing unit is configured for converting the electrical signals into thermal images, where different temperatures are represented by different colors. In one exemplary embodiment, warmer areas are typically represented in red or yellow, while cooler areas are in blue or green. In some embodiments, the captured thermal images comprise first pixel values. In one exemplary embodiment, the first pixel values correspond to a 160 × 120 pixel array.
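The pseudocolor mapping performed by the signal processing unit may be illustrated with a toy blue-to-green-to-red ramp; commercial thermal cameras ship their own palettes, so the specific ramp below is an assumption for illustration only.

```python
import numpy as np

def pseudocolor(temp_map: np.ndarray) -> np.ndarray:
    """Map a temperature array to RGB: cool areas blue/green, warm areas
    yellow/red, mimicking the color convention described in the disclosure."""
    # Normalize temperatures to [0, 1].
    t = (temp_map - temp_map.min()) / (temp_map.max() - temp_map.min() + 1e-9)
    r = np.clip(2.0 * t - 0.5, 0.0, 1.0)   # red rises in the warm half
    g = 1.0 - np.abs(2.0 * t - 1.0)        # green peaks mid-range
    b = np.clip(1.0 - 2.0 * t, 0.0, 1.0)   # blue dominates the cool half
    return np.stack([r, g, b], axis=-1)
```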
[0049] The first temperature sensor 112 is configured for measuring body surface temperature of the subject. In some embodiments, the first temperature sensor 112 is a non-contact infrared (IR) temperature sensor. In one exemplary embodiment, the first temperature sensor 112 comprises a lens to collect and focus IR radiation emitted by the subject; a detector, often a thermopile, to convert the focused IR radiation into an electrical signal; and a signal processing unit to convert the electrical signal to temperature readings. This non-invasive approach enhances patient comfort and eliminates the risk of cross-contamination between subjects. To ensure accuracy, the first temperature sensor 112 includes an internal ambient temperature compensation mechanism, thereby adjusting for variations in the sensor's own temperature, which could otherwise affect the accuracy of the body surface temperature readings. It is to be noted that more than one first temperature sensor 112 may also be used.
[0050] The second temperature sensor 114 is configured for measuring environmental temperature associated with the subject. In some embodiments, the second temperature sensor 114 is a non-contact infrared (IR) temperature sensor. In one exemplary embodiment, the second temperature sensor 114 comprises a lens to collect and focus IR radiation emitted from the surrounding environment; a detector, often a thermopile, to convert the focused IR radiation into an electrical signal; and a signal processing unit to convert the electrical signal to temperature readings. To minimize the impact of rapid temperature fluctuations and improve measurement stability, the second temperature sensor 114 incorporates a low-pass filter in its signal processing circuit. The filter helps to smooth out short-term temperature variations, providing a more representative reading of the overall ambient temperature. The second temperature sensor 114 is calibrated to compensate for its own heat emission, ensuring that its presence does not affect the accuracy of the ambient temperature readings. The second temperature sensor 114 also includes an internal temperature reference for continuous self-calibration, maintaining measurement accuracy over time and varying conditions. It is to be noted that more than one second temperature sensor 114 may also be used.
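The low-pass filtering described in paragraph [0050] may be illustrated with its first-order digital equivalent, an exponential moving average; the disclosure describes an analog filter in the sensor's signal processing circuit, and the smoothing factor below is an assumption.

```python
def smooth(readings, alpha: float = 0.1):
    """First-order low-pass filter (exponential moving average) over a sequence
    of ambient temperature samples; small alpha = heavier smoothing."""
    out, state = [], readings[0]
    for r in readings:
        state = alpha * r + (1 - alpha) * state
        out.append(state)
    return out
```

A constant input passes through unchanged, while a sudden step in ambient temperature is absorbed gradually, which is the stabilizing behavior the disclosure attributes to the filter.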
[0051] The thermal imaging camera 110, the first temperature sensor 112, and the second temperature sensor 114 are configured to transmit the thermal images, the body surface temperature data, and the environment temperature data, respectively, to the portable edge device 102.
[0052] The portable edge device 102 comprises a memory 104 storing a set of instructions and a processing unit 106. In some embodiments, the memory 104 is a non-volatile storage medium configured to store the set of instructions, data, and deep learning models required for the breast cancer screening process. In some embodiments, the memory 104 comprises a combination of high-speed RAM (random access memory) for temporary data storage and processing, and flash memory or solid-state drive (SSD) for long-term storage of instructions, models, and the like. The processing unit 106 is a computational engine optimized for edge computing and AI inference. In some embodiments, the processing unit 106 comprises any one of a central processing unit (CPU), a graphics processing unit (GPU), a neural processing unit (NPU), a digital signal processor (DSP), or combinations thereof. It is to be noted that the portable edge device 102 is any low-cost computing device capable of running preprocessing algorithms and AI models. For instance, the portable edge device 102 is a Raspberry Pi™. It is to be noted that the examples are given for illustrative purposes and do not restrict the scope of the present disclosure. The portable edge device 102 is configured to run any one of Linux, Android, or a lightweight real-time operating system (RTOS), and is equipped with sufficient processing resources to perform local data acquisition, signal processing, and inference operations without reliance on cloud-based computing infrastructure.
[0053] The portable edge device 102 is configured to obtain the thermal images of the right side breast and the left side breast of the subject; and obtain body surface temperature data and environment temperature data. The portable edge device 102 is further configured to calibrate pixel values of the thermal images of the right side breast and the left side breast based on the body surface temperature data and the environment temperature data; preprocess the calibrated thermal images; extract, using one or more deep learning convolution models deployed in the portable edge device 102, one or more temperature-based features from the preprocessed thermal images; generate, using the one or more deep learning convolution models, a probability score indicating presence of abnormality associated with breast cancer, based on analysis of the extracted features; and classify, using the one or more deep learning convolution models, breast tissue associated with the thermal images as normal or abnormal based on the probability score.
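The calibration step is not specified in closed form in the disclosure. One plausible interpretation is a two-point linear mapping that anchors a background level of the raw image to the measured ambient temperature and a skin-surface level to the measured body surface temperature; the percentile choices below are assumptions for illustration only.

```python
import numpy as np

def calibrate(raw: np.ndarray, body_temp: float, ambient_temp: float) -> np.ndarray:
    """Two-point linear calibration of raw thermal pixel values using the two
    sensor readings as temperature references.

    The 5th percentile is assumed to sample the background and the 95th
    percentile the skin surface; these anchors are illustrative, not disclosed.
    """
    low = np.percentile(raw, 5)     # assumed background reference level
    high = np.percentile(raw, 95)   # assumed skin-surface reference level
    scale = (body_temp - ambient_temp) / (high - low + 1e-9)
    return ambient_temp + (raw - low) * scale
```

Because the mapping is affine with a positive scale, the calibrated image preserves all relative thermal structure while expressing pixel values in degrees consistent with the two sensor readings, which is what allows screenings taken under different ambient conditions to be compared.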
[0054] The portable edge device 102 further includes a display unit 108 that is configured to display at least one of the thermal images of the breast, the body surface temperature data, the environment temperature data, a classification result associated with the thermal images, a warning message if an abnormality is detected, or combinations thereof. In some embodiments, the display unit 108 is selected from a group comprising a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an e-paper display, a touchscreen display, a light-emitting diode (LED) matrix, and a seven-segment display. It is to be noted that any display unit compatible with the chosen edge computing platform and meeting the basic requirements for displaying the breast cancer screening information can be utilized in the portable edge device 102. The selection of the display unit 108 is not limited to the examples provided above.
[0055] FIG. 2 is a block diagram illustrating one or more modules of the portable edge device 102 of FIG. 1 in accordance with the present disclosure. The portable edge device 102 includes a database 200, a thermal image obtaining module 202, a temperature data obtaining module 204, a calibration module 206, a preprocessing module 208, a feature extraction module 210, a score generating module 212, a classification module 214, and a display module 216. It should be understood that this modular structure is flexible and adaptable to various implementation requirements and future enhancements. One or more additional modules may be incorporated into the system 100 to expand its functionality, such as a data encryption module for enhanced security or a telemedicine module for remote consultation capabilities. Conversely, existing modules may be combined to optimize processing efficiency; for instance, the feature extraction module 210 and score generating module 212 could potentially be merged into a single analysis module. Similarly, modules may be subdivided for more granular control or to accommodate specific hardware configurations; the preprocessing module 208, for example, could be split into separate noise reduction and image enhancement modules. This modular approach allows for scalability and customization of the portable edge device 102, ensuring it can be adapted to meet evolving technological advancements and diverse healthcare needs while maintaining its core breast cancer screening functionality.
[0056] The database 200 stores thermal images, temperature data, deep learning model parameters for quick loading and execution, securely encrypted user profiles, anonymized historical screening records for longitudinal studies; system logs for maintenance and optimization; standardized reference ranges for result interpretation; and a repository of verified software updates. In one exemplary embodiment, the database 200 utilizes SQLite, a self-contained, serverless, and zero-configuration database engine. SQLite is chosen for its small footprint, low resource requirements, and compatibility with various low-power computing platforms. In some embodiments, the portable edge device 102 may utilize external database for enhanced data management and storage capabilities.
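An illustrative on-device schema for such a SQLite store is sketched below using Python's standard sqlite3 module; all table and column names are assumptions for illustration, not part of the disclosure.

```python
import sqlite3

# Illustrative schema only; table and column names are assumed, not disclosed.
SCHEMA = """
CREATE TABLE IF NOT EXISTS screenings (
    id              INTEGER PRIMARY KEY AUTOINCREMENT,
    captured_at     TEXT NOT NULL,
    body_temp_c     REAL NOT NULL,
    ambient_temp_c  REAL NOT NULL,
    thermal_image   BLOB NOT NULL,      -- calibrated thermal image bytes
    probability     REAL,               -- model's abnormality score
    label           TEXT CHECK (label IN ('normal', 'abnormal'))
);
"""

def open_db(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the on-device screening database."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

Screening records can then be inserted and retrieved with ordinary parameterized SQL statements, keeping all data local to the portable edge device as the disclosure requires.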
[0057] The thermal image obtaining module 202 is configured to obtain the thermal images of the right side and the left side of breast of the subject from the database 200 or the thermal imaging camera 110. The thermal image obtaining module 202 functions as an interface for thermal image acquisition, offering flexibility in image sourcing to accommodate various operational scenarios. In some embodiments, when obtaining images directly from the thermal imaging camera 110, the thermal image obtaining module 202 manages real-time image capture, controlling parameters such as exposure time and frame rate to ensure optimal image quality. The thermal image obtaining module 202 also implements a standardized imaging protocol, guiding the operator to capture consistent views of both breasts for accurate comparative analysis. In cases where images are retrieved from the database 200, the thermal image obtaining module 202 employs data retrieval algorithms to quickly access stored thermal images, which may be useful for follow-up screenings or offline analysis. Additionally, the thermal image obtaining module 202 performs initial quality checks on the obtained images, flagging any that may be suboptimal due to factors like motion blur or improper framing, thereby maintaining the integrity of subsequent analysis steps. By supporting both real-time capture and database retrieval, the thermal image obtaining module 202 enhances the versatility and reliability of the breast cancer screening system 100, allowing it to function effectively in both connected and standalone operational modes.
[0058] The temperature data obtaining module 204 is configured to obtain body surface temperature data and environment temperature data from the first temperature sensor 112 and the second temperature sensor 114, respectively, or from the database 200. The temperature data obtaining module 204 functions as a central hub for temperature data collection and management, crucial for accurate thermal image calibration. The temperature data obtaining module 204 ensures synchronized temperature data acquisition with thermal imaging, adapts to diverse operational settings, and maintains measurement accuracy, thereby providing critical contextual information for precise interpretation of thermal images and enhancing the overall efficacy of the breast cancer screening process.
[0059] The calibration module 206 is configured to calibrate pixel values of the thermal images of the right side and the left side of breast based on the body surface temperature data and the environment temperature data. The calibration formula typically follows:
[0060] T_{calibrated} = T_{pixel} + {ambient correction factor} - {body surface baseline adjustment}.
[0061] where: i) T_{calibrated} is a final calibrated temperature value for each pixel in the thermal images;
[0062] ii) T_{pixel} is an initial temperature value recorded by the thermal imaging camera 110;
[0063] iii) {ambient correction factor} is derived from the environmental temperature data; and
[0064] iv) {body surface baseline adjustment} is calculated based on measured body surface temperature.
[0065] The dual-sensor approach ensures environmental and individual physiological factors are both considered, resulting in accurate thermal readings under varying conditions. The calibration module 206 implements dynamic ambient correction, personalized baseline adjustment, region-specific calibration, and temporal drift compensation. The calibration technique significantly enhances the reliability and precision of the thermal images by continuously updating the ambient correction factor, individualizing the body surface baseline adjustment, and applying region-specific parameters. Consequently, the calibration technique improves the overall accuracy of the breast cancer screening process across diverse environmental conditions and patient populations.
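The calibration formula of paragraph [0060] can be sketched as follows. The reference temperatures and the linear form of the two correction terms are illustrative assumptions; the disclosure does not fix specific values for them.

```python
import numpy as np

def calibrate_thermal_image(t_pixel, ambient_temp_c, body_surface_temp_c,
                            reference_ambient_c=25.0, reference_body_c=36.5):
    """T_calibrated = T_pixel + ambient correction - body surface baseline.

    The reference constants and the linear derivation of each correction
    term are illustrative assumptions, not values from the disclosure.
    """
    t_pixel = np.asarray(t_pixel, dtype=float)
    ambient_correction = reference_ambient_c - ambient_temp_c
    body_baseline_adjustment = body_surface_temp_c - reference_body_c
    return t_pixel + ambient_correction - body_baseline_adjustment
```

For example, a 30.0 °C pixel reading taken in a 20 °C room from a subject with a 37.5 °C surface temperature would, under these assumed references, be corrected upward by 5 °C for ambient and downward by 1 °C for baseline.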
[0066] The preprocessing module 208 is configured to enhance and standardize the calibrated thermal images. It is to be noted that the thermal images refer to both left side and right side breast thermal images. The preprocessing module 208 resizes the calibrated thermal images using bilinear interpolation, transforming the first pixel values (e.g., 160x120) into second pixel values (e.g., 224x224). Bilinear interpolation is defined as a method that computes a geometrically transformed image by taking a weighted average of the gray levels of the four nearest neighbors using a bilinear function. This process results in a visually smoother interpolation. The advantages of resizing include: compatibility with standard deep learning model input sizes, enabling transfer learning techniques; preservation of important thermal pattern details through bilinear interpolation; standardization of image sizes across all samples for uniform model input; enhanced feature extraction capability due to increased resolution; potential noise reduction through the interpolation process; adaptability to thermal cameras with varying native resolutions; and computational efficiency balancing detail with edge device constraints. This resizing step optimally prepares thermal images for subsequent deep learning-based analysis, potentially improving the system's ability to identify subtle thermal anomalies while maintaining real-time processing capabilities on portable devices.
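A minimal NumPy sketch of the bilinear resizing step follows; in practice a library routine (for example OpenCV's resize) would typically be used on the edge device.

```python
import numpy as np

def resize_bilinear(img, out_h, out_w):
    """Resize a 2-D thermal image (e.g., shape 120x160) to (out_h, out_w)
    by weighted averaging of the four nearest input pixels."""
    img = np.asarray(img, dtype=float)
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)   # output rows mapped into input space
    xs = np.linspace(0, in_w - 1, out_w)   # output columns mapped into input space
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]                # vertical interpolation weights
    wx = (xs - x0)[None, :]                # horizontal interpolation weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy
```

Calling `resize_bilinear(frame, 224, 224)` brings a low-resolution capture up to the 224x224 input size expected by common pre-trained CNNs.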
[0067] The preprocessing module 208 further replicates the resized single-channel thermal images across three channels (red, green, blue), generating a three-dimensional tensor compatible with convolutional neural network architectures. In one exemplary embodiment, the process, known as channel replication, involves copying the grayscale intensity values of the thermal images to each of the RGB channels, creating a 224x224x3 tensor. The channel replication is achieved through array broadcasting operations, where the single-channel image matrix is expanded along a new axis to create three identical channels. The advantages of channel replication include: compatibility with pre-trained CNNs configured for RGB inputs, allowing for transfer learning and leveraging established architectures; preservation of the full thermal information in each channel, ensuring no data loss during the conversion; simplified integration with existing deep learning frameworks and libraries optimized for three-channel inputs; potential for future extension to pseudo-color representations or multi-spectral analysis without changing the core network architecture; and maintenance of a standardized input format that can accommodate both visible light and thermal imaging modalities in a unified processing pipeline. The channel replication step thus bridges the gap between single-channel thermal imaging data and the requirements of deep learning models, enhancing the system's flexibility and potential for advanced image analysis techniques.
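The channel replication step described above amounts to a single broadcasting operation, sketched here:

```python
import numpy as np

def replicate_channels(gray):
    """Copy a single-channel HxW image into three identical channels,
    yielding an HxWx3 tensor for CNNs that expect RGB input."""
    gray = np.asarray(gray)
    return np.repeat(gray[:, :, np.newaxis], 3, axis=2)
```

Because every channel is an identical copy, no thermal information is lost in the conversion; the tensor merely matches the input shape of RGB-trained networks.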
[0068] The preprocessing module 208 is further configured to normalize pixel intensity values between 0 and 1, ensuring consistent input scaling for the neural networks. In one exemplary embodiment, the normalization process involves determining minimum and maximum pixel values in the thermal images; applying a linear transformation to each pixel value, subtracting the minimum value and dividing by the range (maximum - minimum), effectively scaling all values to fall within the [0, 1] interval. The advantages of the normalization include: improved numerical stability during neural network training and inference; mitigation of the impact of varying image intensities due to different thermal camera sensitivities or environmental conditions; facilitation of faster convergence during model training by bringing all input features to a common scale; enhancement of the model's ability to learn relevant features rather than being biased by absolute intensity values; and standardization of inputs across different datasets or imaging sessions, enabling more robust and generalizable models.
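The min-max normalization described above can be sketched in a few lines:

```python
import numpy as np

def normalize_minmax(img):
    """Linearly rescale pixel intensities to the [0, 1] interval."""
    img = np.asarray(img, dtype=float)
    lo, hi = img.min(), img.max()
    if hi == lo:                       # constant image: avoid division by zero
        return np.zeros_like(img)
    return (img - lo) / (hi - lo)
```

The constant-image guard is a defensive addition for this sketch; the disclosure does not discuss that edge case.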
[0069] The preprocessing module 208 is further configured to apply one or more median filters to reduce noise while preserving edge information critical for anomaly detection. The preprocessing module 208 further enhances image quality by applying contrast limited adaptive histogram equalization (CLAHE), which improves the visibility of subtle thermal anomalies by optimizing local contrast. The CLAHE technique operates by dividing the image into small tiles, equalizing the histogram of each tile independently, and then interpolating the results to eliminate artificially induced boundaries. The 'contrast limited' aspect prevents over-amplification of noise by clipping the histogram at a predefined value before equalization. Finally, the preprocessing module 208 implements a masking algorithm to isolate the breast regions, eliminating background and non-breast areas that could potentially introduce artifacts or mislead the analysis. The masking process may involve: a) edge detection techniques to identify the breast contours; b) morphological operations to refine the detected regions; c) template matching or anatomical landmark detection to ensure accurate breast region identification; d) dynamic thresholding based on temperature distributions to separate breast tissue from surrounding areas; e) machine learning-based segmentation models trained on diverse breast morphologies to handle variations in breast size and shape.
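The median filtering stage can be sketched as below. CLAHE and the masking algorithm are typically supplied by an image-processing library (for example, OpenCV's createCLAHE with a clip limit and tile grid size) and are omitted here for brevity.

```python
import numpy as np

def median_filter(img, k=3):
    """Edge-preserving noise reduction via a k x k sliding median.
    A straightforward (unoptimized) sketch of the filter described above."""
    a = np.asarray(img, dtype=float)
    pad = k // 2
    padded = np.pad(a, pad, mode='edge')   # replicate borders at the edges
    out = np.empty_like(a)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out
```

Unlike a mean filter, the median discards isolated impulse values entirely, which is why it removes sensor noise while keeping the sharp thermal boundaries needed for anomaly detection.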
[0070] The feature extraction module 210 is configured to extract the one or more features from the preprocessed thermal images using the one or more deep learning convolution models deployed in the portable edge device 102. To extract the temperature based features, the feature extraction module 210 divides the left and right breast regions into comparable zones (quadrants or grids); and identifies for each zone on one breast, the corresponding zone on the other breast. Thereafter, the feature extraction module 210 is configured to detect differences in temperature distribution between the left side breast and the right side breast by comparing the temperature statistics (mean, max, and standard deviation) between these corresponding zones. Further, the feature extraction module 210 is configured to identify focal high-temperature regions indicating abnormal tissue activity by calculating average temperature for each breast and each zone within the breast; determining a temperature threshold above which a region is considered noticeably warmer; and scanning each breast zone to identify pixels or small regions that exceed the temperature threshold. The feature extraction module 210 is configured to detect irregular thermal boundaries suggesting abnormal growth patterns in the breast tissue by detecting edges, tracing contours of different temperature regions, and analyzing their shapes, symmetry, and complexity. The one or more deep learning convolution models are trained to recognize patterns indicative of normal and abnormal tissue growth, considering factors such as boundary curvature, continuity, and fractal dimensions. The one or more deep learning convolution models compare the detected boundaries against learned representations of normal breast thermal patterns, flagging significant deviations that may suggest abnormal growth.
The feature extraction module 210 is configured to determine rate of temperature change across breast tissue regions where sharp gradients indicate malignancies by computing and analyzing temperature gradients across the entire breast tissue. Thereafter, the feature extraction module 210 is configured to extract subtle heat distribution textures reflecting early pathological changes by comparing corresponding regions of the left and right breasts and identifying asymmetric texture patterns that could indicate early pathological changes.
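The zone-wise left/right comparison described above can be illustrated with a short sketch. The 2x2 grid, the horizontal mirroring of the right-breast image, and the choice of statistics are assumptions for illustration, not parameters fixed by the disclosure.

```python
import numpy as np

def zone_statistics(breast, grid=(2, 2)):
    """Split a breast temperature map into grid zones; return per-zone
    (mean, max, std) rows, one per zone."""
    breast = np.asarray(breast, dtype=float)
    h, w = breast.shape
    gh, gw = grid
    stats = []
    for i in range(gh):
        for j in range(gw):
            zone = breast[i * h // gh:(i + 1) * h // gh,
                          j * w // gw:(j + 1) * w // gw]
            stats.append((zone.mean(), zone.max(), zone.std()))
    return np.array(stats)

def zone_asymmetry(left, right, grid=(2, 2)):
    """Absolute per-zone differences in (mean, max, std) between breasts.
    The right image is mirrored horizontally so zones correspond
    anatomically (an assumption about image orientation)."""
    right = np.asarray(right, dtype=float)
    return np.abs(zone_statistics(left, grid)
                  - zone_statistics(right[:, ::-1], grid))
```

A perfectly symmetric pair yields zero asymmetry in every zone, while a localized hot region on one breast produces a large difference in the corresponding zone's statistics.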
[0071] The score generating module 212 is configured to generate, using the one or more deep learning convolution models, a probability score indicating presence of abnormality associated with breast cancer, based on analysis of extracted features. The deep learning models process the extracted temperature based features through multiple layers, learning complex patterns and relationships that are indicative of breast abnormalities. The final layer of the neural network outputs a probability score, typically between 0 and 1, representing the likelihood of an abnormality being present. For instance, a score of 0.05 is assigned when there is minimal thermal asymmetry between breasts, no focal hot spots, regular thermal boundaries, and uniform texture patterns. In another instance, a score of 0.75 is assigned when there is significant thermal asymmetry, a pronounced focal hot spot with sharp temperature gradients, irregular thermal boundaries, and distinct texture changes in a localized area. In yet another instance, a score of 0.95 is assigned when there are multiple high-risk features present, such as extreme thermal asymmetry, several focal hot spots with very sharp temperature gradients, highly irregular thermal boundaries, and pronounced texture changes consistent with known malignant patterns.
[0072] The classification module 214 is configured to classify, using the one or more deep learning convolution models, breast tissue associated with the thermal images as normal or abnormal based on the probability score. If the probability score is below 0.4, then the classification module 214 classifies the breast tissue as normal. If the probability score is 0.4 or higher, then the classification module 214 classifies the breast tissue as abnormal. The threshold of 0.4 is selected based on receiver operating characteristic (ROC) analysis, balancing high sensitivity (greater than or equal to 90 percent) with acceptable specificity. This configuration prioritizes early detection of abnormalities, which is critical in breast cancer screening applications.
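The thresholding rule above reduces to a one-line decision, sketched here with the 0.4 cut-off described in the disclosure:

```python
def classify_tissue(probability_score, threshold=0.4):
    """Label breast tissue from the model's probability score.

    Scores at or above the ROC-derived threshold (0.4 per the disclosure)
    are flagged abnormal, favoring sensitivity over specificity.
    """
    return "abnormal" if probability_score >= threshold else "normal"
```

Applied to the example scores of paragraph [0071], 0.05 maps to normal while 0.75 and 0.95 both map to abnormal.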
[0073] The display module 216 is configured to display, on the display unit 108, at least one of the thermal images of the breast, the body surface temperature data, the environment temperature data, a classification result associated with the thermal images, and a warning message if abnormality is detected, or a combination thereof. In some embodiments, the display module 216 employs an intuitive layout with touch-enabled controls for zooming, panning, and adjusting image contrast, while also displaying data privacy indicators and offering secure export options. The comprehensive display ensures that healthcare professionals can quickly interpret results, facilitates patient understanding during consultations, and adapts to various screen sizes and orientations for optimal visibility and usability across different hardware configurations of the portable edge device 102.
[0074] FIGS. 3A-3B are flow charts illustrating a method of breast cancer screening in accordance with the present disclosure. The process described in FIGS. 3A-3B may be implemented in the system 100 of FIG. 1. For brevity and to avoid redundancy, the individual components of the system 100 will not be re-explained here, as they have been previously detailed in the description of FIG. 1. It is understood that the steps outlined in FIGS. 3A-3B are executed by the relevant modules and components of system 100, utilizing the hardware and software infrastructure described earlier.
[0075] At step 302, the method includes capturing, using at least one thermal imaging camera 110, one or more thermal images of a right side breast and a left side breast of a subject. In one exemplary embodiment, the thermal imaging camera 110 has a resolution of 160x120 pixels. For capturing thermal images, the thermal imaging camera 110 is positioned at a predetermined optimal distance and angle from the subject. Further, the camera's field of view is adjusted to ensure both breasts are fully captured within a single frame. In some embodiments, the method includes capturing multiple images in quick succession to account for potential movement artifacts and ensure the best quality image is selected for analysis. It is to be noted that the capture process is standardized, with clear instructions provided to the subject to ensure consistent positioning and environmental conditions. After capturing thermal images, the method includes storing the thermal images in the database 200 or transferring the thermal images to the portable edge device 102 for further analysis.
[0076] At step 304, the method includes measuring, using at least one first temperature sensor 112, body surface temperature of a subject. The body surface temperature is measured at predefined anatomical locations of the subject. In some embodiments, the first temperature sensor 112 is a non-contact infrared thermal sensor. While measuring the temperature, the first temperature sensor 112 is held at a specific distance from the skin surface, typically 5-10 cm, as recommended by the manufacturer. The subject is asked to remain still during measurements to avoid motion-induced errors. It is to be noted that more than one temperature sensor can be used for measurement. It is to be further noted that while measuring body temperature, factors such as time of day, recent physical activity, and menstrual cycle phase for female subjects are considered for accurate measurement. Multiple measurements may be taken at each location to ensure accuracy and account for any momentary fluctuations.
[0077] At step 306, the method includes measuring, using at least one second temperature sensor 114, temperature of an environment associated with the subject. In some embodiments, the second temperature sensor 114 is a non-contact infrared thermal sensor. It is to be noted that more than one temperature sensor can be used for measurement. The second temperature sensor 114 is utilized to measure the environmental temperature in the immediate vicinity of the subject. In some embodiments, the method performs continuous ambient temperature monitoring throughout the screening process to detect any fluctuations. In some embodiments, the method performs measuring temperature at multiple points in the room to account for potential temperature gradients or localized heating/cooling sources and calculating an average environmental temperature from multiple readings to ensure a representative value. It is to be noted that while measuring ambient temperature, factors such as humidity and air flow are considered for accurate measurement.
[0078] At step 308, the method includes calibrating, using the calibration module 206 of the edge device 102, pixel values of at least one thermal image of the right side breast and the left side breast of the subject, using body surface temperature data and environment temperature data. The dual-sensor approach ensures environmental and individual physiological factors are both considered, resulting in accurate thermal readings under varying conditions.
[0079] The calibration formula typically follows:
[0080] T_{calibrated} = T_{pixel} + {ambient correction factor} - {body surface baseline adjustment}
[0081] At step 310, the method includes preprocessing, using the preprocessing module 208 of the edge device 102, calibrated thermal image of the right side breast and the left side breast of the subject. The preprocessing involves resizing, using bilinear interpolation, the calibrated thermal images comprising first pixel values into thermal images comprising second pixel values; replicating resized single-channel thermal images across three channels (red, green, blue), thereby generating a three-dimensional tensor; normalizing pixel intensity values of the thermal images between 0 and 1 after generating the three-dimensional tensor; applying a median filter for noise reduction in the thermal images after normalization; applying contrast limited adaptive histogram equalization (CLAHE) for contrast enhancement and improvement of visibility of subtle thermal anomalies after noise reduction; and masking out background and non-breast regions from the thermal images.
[0082] At step 312, the method includes extracting, using the feature extraction module 210 of the edge device 102, one or more temperature based features from preprocessed thermal images of the right side breast and the left side breast of the subject. The feature extraction involves detecting differences in temperature distribution between the left side breast and the right side breast; identifying focal high-temperature regions indicating abnormal tissue activity; detecting irregular thermal boundaries suggesting abnormal growth patterns in the breast tissue; determining rate of temperature change across breast tissue regions where sharp gradients indicate malignancies; and extracting subtle heat distribution textures reflecting early pathological changes.
[0083] At step 314, the method includes generating, using the score generating module 212 of the edge device 102, a probability score indicating presence of abnormality associated with breast cancer, based on analysis of extracted temperature based features. The deep learning models process the extracted temperature based features through multiple layers, learning complex patterns and relationships that are indicative of breast abnormalities. The final layer of the neural network outputs a probability score, typically between 0 and 1, representing the likelihood of an abnormality being present.
[0084] At step 316, the method includes classifying, using the classification module 214 of the edge device 102, breast tissue associated with the thermal images as normal or abnormal based on the probability score. If the probability score is below 0.4, then the classification module 214 classifies the breast tissue as normal. If the probability score is 0.4 or higher, then the classification module 214 classifies the breast tissue as abnormal.
[0085] At step 318, the method includes displaying, using the display module 216 of the edge device 102, thermal images of the breast, temperature data, classification result associated with the thermal images, and warning message if abnormality is detected.
[0086] In some embodiments, the method further includes recommending treatment based on the classification result. The method interprets the classification result, considering the probability score and other extracted features, to stratify the risk level (e.g., low, moderate, high). Based on this stratification, the method generates tailored treatment recommendations, ranging from routine screening for low-risk cases to urgent specialist referrals or biopsy recommendations for high-risk cases. These recommendations are personalized considering patient-specific factors such as age and family history.
[0087] FIG. 4 is a flow chart illustrating a method of training a deep learning convolution neural network model in accordance with the present disclosure. It is to be noted that, in many cases, the model may be pre-trained on more powerful external computing resources and then installed on the edge device 102 for inference and potential fine-tuning. This approach allows for the development of sophisticated models using large datasets while maintaining the portability and efficiency of the edge device 102 for deployment in various healthcare settings. In some embodiments, the training process depicted in FIG. 4 may be implemented using the system components described in FIG. 1, particularly leveraging the computational capabilities of the portable edge device 102.
[0088] At step 402, the method includes obtaining a plurality of thermal images of breast, body surface temperature data and environment temperature data. The method performs acquiring a large, diverse dataset of thermal breast images from various sources, encompassing a wide range of breast types, sizes, and pathologies. The dataset includes longitudinal data where possible, relevant metadata, and undergoes quality assurance checks. All images are captured following a standardized protocol to ensure consistency, and the dataset is expanded through controlled augmentation techniques. Care is taken to ensure a balanced representation of normal and abnormal cases, as well as different types of breast abnormalities.
[0089] At step 404, the method includes calibrating the plurality of thermal images with the body surface temperature data and the environment temperature data. The method calibrates the thermal images using the following formula:
[0090] T_{calibrated} = T_{pixel} + {ambient correction factor} - {body surface baseline adjustment}.
[0091] At step 406, the method includes preprocessing calibrated plurality of thermal images by resizing, using bilinear interpolation, the calibrated thermal images comprising first pixel values into thermal images comprising second pixel values; replicating resized single-channel thermal images across three channels (red, green, blue), thereby generating a three-dimensional tensor; normalizing pixel intensity values of the thermal images between 0 and 1 after generating the three-dimensional tensor; applying a median filter for noise reduction in the thermal images after normalization; applying contrast limited adaptive histogram equalization (CLAHE) for contrast enhancement and improvement of visibility of subtle thermal anomalies after noise reduction; and masking out background and non-breast regions from the thermal images.
[0092] At step 408, the method includes labeling the calibrated plurality of thermal images as cancerous or non-cancerous based on clinical diagnosis. The labeling process involves annotating the calibrated thermal images with diagnosis results from any one of mammography, ultrasound, biopsy reports, an expert review, or combinations thereof. Regions of interest are annotated for cancerous cases, and each label is assigned a confidence score. The process includes correlation with histopathological findings where available, temporal labeling for longitudinal data, and measures to assess inter-observer variability. The labels are integrated with relevant metadata, including patient history and risk factors, while adhering to strict ethical guidelines and data protection protocols.
[0093] At step 410, the method includes extracting one or more temperature based features from the calibrated plurality of thermal images. The one or more temperature based features include thermal asymmetry between the left side breast and the right side breast, localized hotspots in the breast tissue, contour and edge irregularities, thermal gradient across breast tissue, texture and micro-pattern, or combinations thereof.
[0094] At step 412, the method includes training a deep learning model with calibrated thermal images, extracted temperature based features and associated diagnostic labels. The training step involves feeding the preprocessed dataset into a convolutional neural network (CNN) optimized for thermal image analysis. In some embodiments, the training process utilizes transfer learning, where a pre-trained model on a large dataset is fine-tuned for breast thermal imaging, and data augmentation to enhance model robustness. The model learns to recognize complex patterns and correlations between thermal distributions and breast abnormalities, with a focus on minimizing false positives and negatives. During training, the model is exposed to a diverse range of thermal patterns, including subtle early-stage indicators and more pronounced late-stage thermal signatures. The process employs techniques like batch normalization, dropout, and adaptive learning rates to improve convergence and prevent overfitting. Regular validation checks are performed using a separate dataset to monitor the model's performance and guide hyperparameter tuning. The training also incorporates multi-task learning, simultaneously optimizing for classification accuracy and localization of abnormal regions. In one exemplary embodiment, the deep learning model is VGG-16 (visual geometry group-16). In some embodiments, other machine learning approaches can also be effective. For instance, ensemble methods such as random forests or gradient boosting machines (e.g., XGBoost) can be employed, particularly when working with limited datasets. Support vector machines (SVMs) can also be used. The choice of model depends on factors such as dataset size, computational resources, interpretability requirements, and the specific characteristics of the thermal imaging data. 
Regardless of the model chosen, the training process involves careful cross-validation, hyperparameter tuning, and performance evaluation using metrics such as accuracy, sensitivity, specificity, and area under the ROC curve (AUC-ROC) to ensure robust and reliable breast cancer screening capabilities.
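As a stand-in for the classical-ML alternatives mentioned above (the VGG-16 path would instead use a deep learning framework), a minimal gradient-descent logistic-regression trainer over extracted temperature-based features might look like the sketch below. The feature encoding, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

def train_logistic(features, labels, lr=0.5, epochs=2000):
    """Gradient-descent logistic regression on extracted thermal features.

    A simple stand-in for the classical alternatives named above (SVM,
    gradient boosting); hyperparameters are illustrative assumptions.
    """
    X = np.asarray(features, dtype=float)
    y = np.asarray(labels, dtype=float)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
        grad = p - y                              # cross-entropy gradient
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def predict_proba(features, w, b):
    """Probability scores in [0, 1] for new feature vectors."""
    X = np.asarray(features, dtype=float)
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))
```

The returned probabilities would feed the same 0.4 screening threshold used elsewhere in the disclosure; in a real deployment, cross-validation and ROC analysis as described above would guide both the model choice and the threshold.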
[0095] FIG. 5 is a block diagram illustrating a portable edge device 500 in accordance with the present disclosure. The portable edge device 500 is functionally equivalent to the portable edge device 102 described in FIG. 1, but with a structural difference. Unlike the edge device 102, where the thermal imaging camera 110, and temperature sensors 112, 114, are communicatively coupled but not physically integrated, device 500 comprises these components within a single unit. Specifically, the portable edge device 500 comprises a thermal imaging camera 508, a first temperature sensor 510, a second temperature sensor 512, and a display 514, all integrated into the device itself. The portable edge device 500 comprises a memory 502 storing a set of instructions and a processing unit 504, similar to edge device 102. The functionalities of these components remain the same as in edge device 102, with the thermal imaging camera 508 capturing breast images, sensors 510 and 512 measuring body and environmental temperatures respectively, and the processing unit 504 executing the breast cancer screening algorithms. To avoid repetition, the detailed explanations of the screening process and software modules will not be reiterated here, as they are identical to those described for edge device 102. The integrated design of device 500 offers enhanced portability and potentially simplified operation compared to the externally connected components of edge device 102, while maintaining the same breast cancer screening capabilities.
[0096] The present disclosure offers significant technical and economic advantages. Technically, the system 100 of the present disclosure leverages thermal imaging and AI technologies for breast cancer screening. The system 100 works even with low-resolution thermal cameras, as the preprocessing technique optimizes the thermal images for processing with AI models. The system's use of low-resolution thermal cameras, coupled with deep learning models, demonstrates the ability to extract diagnostic information from seemingly limited data sources. The edge computing capabilities enable real-time processing and analysis, overcoming limitations of cloud-dependent and internet-dependent systems. The calibration technique, incorporating both body and environmental temperature data, enhances the accuracy of thermal image interpretation. The AI model's ability to detect subtle thermal patterns and asymmetries showcases advanced feature extraction capabilities. Economically, the low-cost components and portable design reduce the price point and infrastructure requirements compared to traditional screening methods. More specifically, the system 100 of the present disclosure costs around 40,000–45,000 INR, whereas existing thermal-imaging-based approaches cost around 5–10 lakhs (500,000–1,000,000) INR. The cost-effectiveness, combined with its non-invasive nature, significantly increases screening frequency and reach, particularly in underserved areas. The potential for early detection could lead to substantial reductions in overall healthcare costs associated with late-stage cancer treatment. Moreover, the system's efficiency could optimize resource utilization in healthcare settings. Overall, the present disclosure represents a technically sophisticated yet economically viable solution that has the potential to democratize access to breast cancer screening globally.
[0097] The embodiments of the present invention disclosed herein are intended to be illustrative and not limiting. Other embodiments are possible and modifications may be made to the embodiments without departing from the spirit and scope of the invention. As such, these embodiments are only illustrative of the inventive concepts contained herein.
Claims:
1. A system (100) for breast cancer screening, comprising:
at least one thermal imaging camera (110, 508) configured for capturing one or more thermal images of a right side breast and a left side breast of a subject, wherein the captured thermal images comprise first pixel values;
at least one first temperature sensor (112, 510) configured for measuring body surface temperature of the subject;
at least one second temperature sensor (114, 512) configured for measuring environment temperature associated with the subject; and
a portable edge device (102, 500) comprising a memory (104, 502) storing a set of instructions and a processing unit (106, 504),
wherein the thermal imaging camera (110, 508), the first temperature sensor (112, 510), and the second temperature sensor (114, 512) are positioned in the edge device (102, 500) or communicatively coupled to the portable edge device (102, 500),
wherein the processing unit (106, 504) is configured to execute the set of instructions to:
obtain the thermal images of the right side breast and the left side breast of the subject;
obtain body surface temperature data and environment temperature data;
calibrate pixel values of the thermal images of the right side breast and the left side breast based on the body surface temperature data and the environment temperature data;
preprocess the calibrated thermal images;
extract, using one or more deep learning convolution models deployed in the portable edge device (102, 500), one or more temperature based features from the preprocessed thermal images;
generate, using the one or more deep learning convolution models, a probability score indicating presence of abnormality associated with breast cancer, based on analysis of the extracted temperature based features; and
classify, using the one or more deep learning convolution models, breast tissue associated with the thermal images as normal or abnormal based on the probability score,
wherein the portable edge device (102, 500) is configured to perform data processing locally in the portable edge device (102, 500).
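For illustration only (not part of the claims): the disclosure does not fix the exact calibration formula, but a plausible scheme for the "calibrate pixel values" step is a two-point linear mapping that anchors the coldest pixels in the frame to the environment temperature from the second sensor and the hottest pixels to the body surface temperature from the first sensor. The function name `calibrate_pixels` is hypothetical:

```python
import numpy as np

def calibrate_pixels(raw, body_temp_c, env_temp_c):
    # Hypothetical two-point linear calibration: assume the coldest pixels
    # in the frame see the ambient environment (second sensor reading) and
    # the hottest pixels see the body surface (first sensor reading).
    raw = raw.astype(np.float64)
    lo, hi = raw.min(), raw.max()
    scale = (body_temp_c - env_temp_c) / (hi - lo + 1e-9)
    # Map raw camera counts to approximate degrees Celsius.
    return env_temp_c + (raw - lo) * scale
```

A real device would likely refine this with per-camera factory calibration data; the sketch only shows how the two sensor readings can anchor the pixel-to-temperature mapping.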
2. The system (100) as claimed in claim 1, wherein the processing unit (106, 504) is further configured to display, on a display unit (108, 506), at least one of the thermal images of the breast, the body surface temperature data, the environment temperature data, classification result associated with the thermal images, and warning message if abnormality is detected, or combination thereof.
3. The system (100) as claimed in claim 1, wherein the processing unit (106, 504) is configured to preprocess the calibrated thermal images by:
resizing, using bilinear interpolation, the calibrated thermal images comprising first pixel values into thermal images comprising second pixel values;
replicating resized single-channel thermal images across three channels (red, green, blue), thereby generating a three-dimensional tensor;
normalizing pixel intensity values of the thermal images between 0 and 1 after generating the three-dimensional tensor;
applying a median filter for noise reduction in the thermal images after normalization;
applying contrast limited adaptive histogram equalization (CLAHE) for contrast enhancement and improvement of visibility of subtle thermal anomalies after noise reduction; and
masking out background and non-breast regions from the thermal images.
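For illustration only (not part of the claims): the preprocessing steps of claim 3 can be sketched in numpy as below. The CLAHE step is noted as a placeholder (in practice it could be performed with OpenCV's `createCLAHE`) so the sketch stays dependency-free, and because the three replicated channels are identical copies, filtering once before replication is equivalent to the claimed order. The names `bilinear_resize` and `preprocess` are hypothetical:

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    # Resize a single-channel image with bilinear interpolation.
    in_h, in_w = img.shape
    ys = np.linspace(0.0, in_h - 1.0, out_h)
    xs = np.linspace(0.0, in_w - 1.0, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def preprocess(calibrated, size=224, breast_mask=None):
    # Resize first pixel values to the model's input resolution.
    img = bilinear_resize(calibrated.astype(np.float64), size, size)
    # Normalize pixel intensities to [0, 1].
    img = (img - img.min()) / (img.max() - img.min() + 1e-9)
    # 3x3 median filter for noise reduction.
    p = np.pad(img, 1, mode="edge")
    shifts = [p[dy:dy + size, dx:dx + size] for dy in range(3) for dx in range(3)]
    img = np.median(np.stack(shifts), axis=0)
    # CLAHE for contrast enhancement would be applied here
    # (e.g. cv2.createCLAHE); omitted to keep the sketch self-contained.
    if breast_mask is not None:
        img = img * breast_mask  # mask out background / non-breast regions
    # Replicate the single channel across R, G, B -> (size, size, 3) tensor.
    return np.repeat(img[:, :, None], 3, axis=2)
```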
4. The system (100) as claimed in claim 1, wherein the one or more temperature based features comprise thermal asymmetry between the left side breast and the right side breast, localized hotspots in the breast tissue, contour and edge irregularities, thermal gradient across breast tissue, texture and micro-pattern, or combinations thereof.
5. The system (100) as claimed in claim 4, wherein the processing unit (106, 504) is configured to extract the one or more temperature based features from the preprocessed thermal images by:
detecting differences in temperature distribution between the left side breast and the right side breast;
identifying focal high-temperature regions indicating abnormal tissue activity;
detecting irregular thermal boundaries suggesting abnormal growth patterns in the breast tissue;
determining rate of temperature change across breast tissue regions where sharp gradients indicate malignancies; and
extracting subtle heat distribution textures reflecting early pathological changes.
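For illustration only (not part of the claims): in the disclosure these features are learned by the deep learning convolution models, but three of them — thermal asymmetry, localized hotspots, and thermal gradient — can be approximated with simple hand-crafted statistics. The function name `temperature_features` and the `hot_sigma` threshold are hypothetical:

```python
import numpy as np

def temperature_features(left, right, hot_sigma=2.0):
    left = np.asarray(left, float); right = np.asarray(right, float)
    # Thermal asymmetry: mirror the right-side image so the anatomy aligns,
    # then take the mean absolute temperature difference between sides.
    asymmetry = float(np.mean(np.abs(left - right[:, ::-1])))
    # Localized hotspots: fraction of pixels well above the pooled mean
    # temperature, indicating focal high-temperature regions.
    pooled = np.concatenate([left.ravel(), right.ravel()])
    threshold = pooled.mean() + hot_sigma * pooled.std()
    hotspot_fraction = float(np.mean(pooled > threshold))
    # Thermal gradient: mean gradient magnitude across the tissue; sharp
    # gradients may flag abnormal regions.
    gy, gx = np.gradient(left)
    gradient = float(np.mean(np.hypot(gx, gy)))
    return {"asymmetry": asymmetry,
            "hotspot_fraction": hotspot_fraction,
            "gradient": gradient}
```

A CNN would learn richer texture and micro-pattern features than these statistics; the sketch only makes the claimed feature categories concrete.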
6. The system (100) as claimed in claim 1, wherein the first temperature sensor (112, 510) and the second temperature sensor (114, 512) are non-contact infrared thermal sensors.
7. The system (100) as claimed in claim 1, wherein the one or more deep learning convolution models are trained by
obtaining a plurality of calibrated thermal images of breast;
preprocessing the plurality of calibrated thermal images;
labeling the preprocessed calibrated thermal images as normal or abnormal based on clinical diagnosis;
extracting temperature based features from the labeled preprocessed thermal images; and
training the one or more deep learning convolution models using the calibrated thermal images, the extracted temperature based features and corresponding labels to classify breast tissue as normal or abnormal.
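For illustration only (not part of the claims): the training procedure of claim 7 culminates in a model that maps extracted features to a probability score and a normal/abnormal label. A minimal stand-in for the classification head — a logistic-regression layer trained by gradient descent, not the full convolution model — can be sketched as below; the names `train_classifier` and `classify` are hypothetical:

```python
import numpy as np

def train_classifier(features, labels, lr=0.5, epochs=2000):
    # Logistic-regression head trained with gradient descent on extracted
    # temperature-based features; labels: 0 = normal, 1 = abnormal.
    X = np.asarray(features, float)
    y = np.asarray(labels, float)
    w = np.zeros(X.shape[1]); b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # probability scores
        grad = p - y                            # gradient of the log-loss
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def classify(feature_vec, w, b, threshold=0.5):
    # Probability score indicating presence of abnormality, thresholded
    # into a normal/abnormal classification.
    score = float(1.0 / (1.0 + np.exp(-(np.asarray(feature_vec, float) @ w + b))))
    return ("abnormal" if score > threshold else "normal"), score
```

In the disclosed system the convolution models learn both the features and this decision boundary end-to-end; the sketch isolates only the final score-and-threshold step.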
8. The system (100) as claimed in claim 1, wherein the processing unit (106, 504) is configured to store at least one of the thermal images of the breast, the body surface temperature data, the environment temperature data, classification result associated with the thermal images, or combination thereof.
9. A method for breast cancer screening, comprising:
capturing, using at least one thermal imaging camera (110, 508), one or more thermal images of a right side breast and a left side breast of a subject, wherein the captured thermal images comprise first pixel values;
measuring, using at least one first temperature sensor (112, 510), body surface temperature of the subject;
measuring, using at least one second temperature sensor (114, 512), environment temperature associated with the subject;
calibrating, using a processing unit (106, 504) of a portable edge device (102, 500), pixel values of the thermal images of the right side breast and the left side breast based on the body surface temperature data and the environment temperature data;
preprocessing, using the processing unit (106, 504), the calibrated thermal images;
extracting, using one or more deep learning convolution models deployed in the portable edge device (102, 500), one or more temperature based features from the preprocessed thermal images;
generating, using the one or more deep learning convolution models, a probability score indicating presence of abnormality associated with breast cancer, based on analysis of the extracted temperature based features; and
classifying, using the one or more deep learning convolution models, breast tissue associated with the thermal images as normal or abnormal based on the probability score.
| # | Name | Date |
|---|---|---|
| 1 | 202511054084-STATEMENT OF UNDERTAKING (FORM 3) [04-06-2025(online)].pdf | 2025-06-04 |
| 2 | 202511054084-REQUEST FOR EXAMINATION (FORM-18) [04-06-2025(online)].pdf | 2025-06-04 |
| 3 | 202511054084-REQUEST FOR EARLY PUBLICATION(FORM-9) [04-06-2025(online)].pdf | 2025-06-04 |
| 4 | 202511054084-FORM-9 [04-06-2025(online)].pdf | 2025-06-04 |
| 5 | 202511054084-FORM-8 [04-06-2025(online)].pdf | 2025-06-04 |
| 6 | 202511054084-FORM FOR SMALL ENTITY(FORM-28) [04-06-2025(online)].pdf | 2025-06-04 |
| 7 | 202511054084-FORM 18 [04-06-2025(online)].pdf | 2025-06-04 |
| 8 | 202511054084-FORM 1 [04-06-2025(online)].pdf | 2025-06-04 |
| 9 | 202511054084-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [04-06-2025(online)].pdf | 2025-06-04 |
| 10 | 202511054084-EVIDENCE FOR REGISTRATION UNDER SSI [04-06-2025(online)].pdf | 2025-06-04 |
| 11 | 202511054084-EDUCATIONAL INSTITUTION(S) [04-06-2025(online)].pdf | 2025-06-04 |
| 12 | 202511054084-DRAWINGS [04-06-2025(online)].pdf | 2025-06-04 |
| 13 | 202511054084-DECLARATION OF INVENTORSHIP (FORM 5) [04-06-2025(online)].pdf | 2025-06-04 |
| 14 | 202511054084-COMPLETE SPECIFICATION [04-06-2025(online)].pdf | 2025-06-04 |
| 15 | 202511054084-FORM-26 [07-08-2025(online)].pdf | 2025-08-07 |
| 16 | 202511054084-Proof of Right [03-11-2025(online)].pdf | 2025-11-03 |