Abstract: A system 100 and method for identifying an abnormality in a human breast region by automatically determining a physical change on a surface of the human breast region is provided. The system 100 includes an image capturing device 102 and a computing device 104 that includes a memory and a processor 106. The processor 106 identifies an abnormality using a plurality of images of the breast received from the image capturing device 102. The system 100 generates a plurality of depth maps from the images, wherein each pixel represents the distance of a surface pixel with respect to a reference point. A first machine learning model 210 generates a Breast Physical Characteristic (BPC) Index to identify characteristics indicating a risk of breast abnormality by analyzing the depth maps, assigning a quantified score, and mapping it to predefined ranges from normal to critical. The system 100 enables a user 214 to identify the abnormality by providing the BPC Index as a quantified score. FIG. 1
Description: BACKGROUND
Technical Field
[0001] Embodiments herein are related to artificial intelligence-enabled abnormality detection, and more particularly to a system and method for identifying a breast abnormality by automatic analysis of depth maps to identify a physical change on a human breast surface.
Description of the Related Art
[0002] Breast cancer is one of the leading causes of cancer-related deaths among women worldwide. According to the World Health Organization (WHO), in 2022, approximately 2.3 million women were diagnosed with breast cancer, resulting in 670,000 fatalities globally. In certain regions, survival rates are notably lower, and 50% of individuals diagnosed with breast cancer lose their lives. Self-breast examination (SBE) is widely recommended as a proactive measure for timely detection, allowing individuals to self-monitor for initial signs of breast abnormalities. However, a key limitation of these examinations is a lack of awareness of the method of self-examination and the absence of tools or quantitative metrics to reliably identify breast abnormalities. As a result, the assessments are often subjective, increasing the risk of late diagnosis.
[0003] Breast abnormalities, including cancerous and non-cancerous conditions, often manifest as physical changes in the breast region, such as a lump, asymmetry in size and shape, swelling, or variations in skin texture. Early detection of these abnormalities significantly improves the chances of successful treatment. Traditionally, mammography and other imaging techniques such as ultrasound and MRI are used for breast cancer screening. However, regular screening with these imaging methods is expensive and is recommended only for specific age groups, annually or at intervals of two to three years. Consequently, most women seek diagnostic evaluation only after experiencing symptoms or concerns, leading to potential delays in cancer detection.
[0004] Furthermore, conventional imaging methods require sophisticated equipment, trained professionals, and access to healthcare facilities. These requirements make routine breast imaging tests inaccessible to many individuals, particularly those in remote, underserved, or resource-limited settings. Additionally, the cost and logistical challenges associated with these imaging techniques can further limit their widespread adoption. As a result, many women either do not undergo regular screenings or experience delays in accessing diagnostic evaluations, leading to the late detection of abnormalities and poorer treatment outcomes.
[0005] Due to these limitations, the World Health Organization recommends clinical breast examination (CBE) for resource-constrained settings. SBE is often recommended for early detection of breast cancer symptoms. However, both of these methods rely heavily on subjective assessment, where individuals or clinicians palpate the breast for lumps, asymmetry, or other physical changes. The lack of objective and quantitative tracking mechanisms makes it difficult to monitor subtle changes over time, potentially resulting in undetected abnormalities at an early stage. Moreover, factors such as variations in individual perception, experience, and technique can further contribute to inconsistencies in detection.
[0006] While advancements in computer vision and artificial intelligence have led to the development of automated breast cancer detection tools, these methods primarily rely on medical imaging modalities such as Mammography, Ultrasound, and MRI. These imaging techniques require sophisticated equipment typically found in large hospitals or diagnostic imaging centers and must be used under the supervision of trained radiologists. Consequently, they are not suitable for SBE or CBE.
[0007] Therefore, there is a need for a system and method that allows individuals to systematically and accurately identify changes in breast surface characteristics, facilitating early and reliable abnormality detection using simple, affordable equipment such as a mobile phone with sensors. Such a system would enable objective and quantitative assessment, facilitating early detection without requiring sophisticated equipment or clinical expertise.
SUMMARY
[0008] In view of the foregoing, an embodiment herein provides a system for identifying an abnormality in a human breast region by automatically determining a physical change on a surface of the human breast region. The system includes an image-capturing device and a computing device. The image-capturing device is configured to generate a plurality of images of a human breast. The computing device includes a memory and a processor. The processor retrieves machine-readable instructions from the memory which, when executed, cause the processor to identify an abnormality in a human breast region using the plurality of images of the human breast received from the image-capturing device by (i) generating a plurality of depth maps from the plurality of images of the human breast region, each pixel of the plurality of depth maps representing a distance of a surface pixel of the human breast region with respect to a reference point, (ii) generating, using a first machine learning model, a Breast Physical Characteristic (BPC) Index to identify one or more characteristics that indicate a risk of a breast-related abnormality from the plurality of depth maps by (a) analyzing the plurality of depth maps to assign a quantified score indicating the breast-related abnormality based on the plurality of depth maps of the breast region and (b) mapping the quantified score to predefined ranges to generate the BPC Index, the predefined ranges extending from a normal range to a critical range; and (iii) enabling a user to identify the abnormality in the breast region by providing the BPC Index as a quantified score to the user.
[0009] In some embodiments, the image capturing device is configured to capture the plurality of images of the human breast region using a visual lens or an infrared lens at a plurality of predetermined angles. The processor is configured to implement a second machine learning model that generates a plurality of depth maps from the plurality of images of the human breast region captured using the visual or infrared lens.
[0010] In some embodiments, the second machine learning model is trained by providing training data that comprises a plurality of images of breast regions and their corresponding ground truth depth maps.
[0011] In some embodiments, the image-capturing device comprises a small portable Light Detection and Ranging (LIDAR) sensor or a depth sensor for capturing the plurality of images of the human breast region that represent the depth maps of the human breast region. The image-capturing device is configured to capture the plurality of images of the breast region at a plurality of predetermined angles for covering the entire surface of the human breast region.
[0012] In some embodiments, the generation of the BPC Index further comprises employing a segmentation module to segment the human breast region in the plurality of depth maps before sending them to the first machine learning model.
[0013] In some embodiments, the BPC Index comprises information on one or more characteristics that indicate a risk of the breast abnormality in the human breast region. The characteristics comprise at least one of a physical asymmetry, a surface contour deformation, a size or a shape of the human breast region, health data of the user, the likelihood of breast malignancy, or any risk indicator associated with breast cancer.
[0014] In some embodiments, the processor is configured to store the generated BPC Index into a database for trend analysis and to enable the user to track changes over time for indicating progressive abnormalities in the human breast region. The processor is configured to send alerts to a healthcare provider if the BPC Index indicates a high likelihood of abnormality and the BPC Index exceeds a threshold value associated with abnormality.
[0015] In some embodiments, the processor is configured to combine the plurality of depth maps generated from the plurality of images of the human breast region to generate a 3D model of the human breast region and use the generated 3D model to generate the BPC Index. The processor generates the 3D model by registering the plurality of depth maps of the plurality of captured images of the human breast region.
[0016] In some embodiments, the processor is configured to combine the plurality of images of the breast region generated from the visual or infrared lens to generate a 3D model of the human breast region and use the generated 3D model to generate the BPC Index. The processor generates the 3D model by registering the plurality of captured images of the breast region.
[0017] In some embodiments, the processor is configured to analyze the plurality of depth maps to assign a quantified score by providing the plurality of depth maps to the first machine learning model that is trained to characterize the abnormality in the breast region. The first machine learning model is trained by providing a plurality of depth maps and the corresponding output parameters related to abnormality.
[0018] In another aspect, a method for identifying an abnormality in a human breast region by automatically determining a physical change on a surface of the human breast region is provided. The method includes (i) generating a plurality of images of a human breast from an image-capturing device to identify an abnormality in a human breast region using the plurality of images of the human breast received from the image-capturing device; (ii) generating a plurality of depth maps from the plurality of images of the human breast region, each pixel of the plurality of depth maps representing a distance of a surface pixel of the human breast region with respect to a reference point; (iii) generating, using a first machine learning model, a Breast Physical Characteristic (BPC) Index to identify one or more characteristics that indicate a risk of a breast-related abnormality from the plurality of depth maps by (a) analyzing the plurality of depth maps to assign a quantified score indicating the breast-related abnormality based on the plurality of depth maps of the breast region and (b) mapping the quantified score to predefined ranges to generate the BPC Index, the predefined ranges extending from a normal range to a critical range; and (iv) enabling a user to identify the abnormality in the breast region by providing the BPC Index as a quantified score to the user.
[0019] The system and method enable convenient at-home assessment of the human breast to identify abnormalities, to facilitate early detection, and to improve treatment outcomes. By providing a quantitative assessment, the system and method offer more reliable and objective results compared to traditional subjective methods. Additionally, the system and method enhance early detection rates of breast abnormalities through regular and accurate self-assessments to enable individuals to take proactive measures for their health.
[0020] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0022] FIG. 1A illustrates a system for identifying an abnormality in a human breast region by automatically determining a physical change on the surface of the human breast region according to some embodiments herein;
[0023] FIG. 1B illustrates an exemplary system to identify an abnormality in a human breast region according to some embodiments herein;
[0024] FIG. 2 illustrates an exploded view of a system for identifying an abnormality in a human breast region according to some embodiments herein;
[0025] FIGS. 3A and 3B illustrate the depth map of the human breast surface in 3D representation according to some embodiments herein;
[0026] FIG. 3C illustrates a depth map of the human breast surface according to some embodiments herein;
[0027] FIG. 4 illustrates a flow diagram of a method for identifying an abnormality in a human breast region by automatically determining a physical change on a surface of the human breast region according to some embodiments herein; and
[0028] FIG. 5 illustrates a block diagram of one example system for identifying an abnormality in a human breast region by automatically determining a physical change on the surface of the human breast region, described with respect to the flow diagram of FIG. 4, according to some embodiments herein.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0029] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0030] As mentioned, there remains a need for a system and method for identifying an abnormality in a human breast region by automatically determining a physical change on a surface of the human breast region. Referring now to the drawings, and more particularly to FIGS. 1 through 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[0031] FIG. 1A illustrates a system for identifying an abnormality in a human breast region by automatically determining a physical change on the surface of the human breast region according to some embodiments herein. The system 100 includes an image-capturing device 102 and a computing device 104. The image-capturing device 102 is configured to generate one or more images of the human breast region. The computing device 104 includes a memory and a processor 106. The processor 106 retrieves machine-readable instructions from the memory which, when executed, enable the processor 106 to identify an abnormality in a human breast region using the one or more images of the human breast received from the image-capturing device 102. The image-capturing device 102 may include a depth sensor, a visual lens, or an infrared lens. The depth sensor may be at least one of a Kinect sensor or a Light Detection and Ranging (LiDAR) sensor configured to capture the one or more images of the human breast region representing one or more depth maps of the human breast region. The LiDAR sensor may be small and portable. The image-capturing device 102 captures the one or more images of the breast region in multiple views at one or more predetermined angles for covering the entire surface of the human breast region for improved abnormality detection. The predetermined angles may include 0° (front view), 90° (left lateral view), 45° (left oblique view), -90° (right lateral view), and -45° (right oblique view). In some embodiments, the system 100 is communicably connected to the image-capturing device 102.
[0032] In some embodiments, at least one of the depth sensor or the visual or infrared lenses is an external device that can be attached to a mobile phone, the processor 106, or a laptop to capture the one or more images of the human breast region. In some embodiments, the mobile phone includes at least any of the depth sensor, the visual or infrared lenses, a thermal lens, or infrared sensors to capture the one or more images of the human breast region in a convenient, non-invasive, no-touch breast examination at home.
[0033] The processor 106 identifies an abnormality in the human breast region using the one or more images of the human breast. The processor 106 generates one or more depth maps from the one or more images of the human breast region. For example, each pixel of the one or more depth maps represents a distance of a surface pixel of the human breast region with respect to a reference point. The reference point may be the image-capturing device 102, the background, or the farthest object in the image. The one or more depth maps may be obtained from the depth sensor.
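The re-expression of sensor depths relative to a reference point can be sketched as follows. This is a minimal illustration, assuming the depth maps arrive as NumPy arrays of sensor-to-surface distances and taking the farthest point in the frame (the background) as the reference; the function name and the toy values are purely illustrative.

```python
import numpy as np

def normalize_depth_map(raw_depth: np.ndarray) -> np.ndarray:
    """Re-express raw sensor depths relative to a reference point.

    Here the reference is taken to be the farthest point in the frame
    (e.g. the background), so every pixel becomes the distance of that
    surface point in front of the reference, in the sensor's units.
    """
    reference = raw_depth.max()      # farthest object as the reference point
    return reference - raw_depth     # 0 at the reference; larger means closer

# Toy 3x3 "depth map": sensor-to-surface distances in millimetres.
raw = np.array([[500., 480., 500.],
                [470., 450., 470.],
                [500., 480., 500.]])
relative = normalize_depth_map(raw)
# The closest surface pixel (450 mm) is now 50 mm in front of the reference.
```

Choosing the image-capturing device 102 itself as the reference would simply mean returning `raw_depth` unchanged; the background reference shown here makes surface protrusions appear as positive values.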
[0034] The processor 106 implements a first machine learning model that generates a Breast Physical Characteristic (BPC) Index to identify one or more characteristics indicating a risk of a breast-related abnormality from the one or more depth maps. The BPC Index provides one or more combinations of surface characteristics of the human breast region, such as a physical asymmetry, a contour deformation, a size, and a shape. The processor 106 analyzes the one or more depth maps to assign a quantified score by providing the one or more depth maps to the first machine learning model that is trained to characterize the abnormality in the breast region. The first machine learning model is trained by providing one or more depth maps and the corresponding output parameter related to abnormality. The machine learning model may include deep learning-based analysis or traditional machine learning techniques that extract key breast characteristics and assign the quantified score using a machine learning classifier. The machine learning classifiers may comprise random forests, support vector machines, logistic regression, gradient boosting, decision trees, artificial neural networks, deep learning, an LLM or equivalent technique, or any custom classifier using a combination of the same. For the traditional machine learning techniques, the abnormality in the breast region may be derived from at least any one of the depth map, a reconstructed 3D model, or a segmented breast depth map.
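The scoring step of the first machine learning model can be sketched with the simplest of the named classifiers, logistic regression. This is illustrative only: the feature vector, weights, and bias below are hypothetical stand-ins, not clinically derived values, and any of the other classifiers (random forest, SVM, gradient boosting) could take the same role.

```python
import numpy as np

# Hypothetical hand-crafted features extracted from the depth maps:
# [volumetric asymmetry, contour deformation, surface curvature change].
# These weights are illustrative, not clinically derived.
WEIGHTS = np.array([2.0, 1.5, 1.0])
BIAS = -1.0

def quantified_score(features: np.ndarray) -> float:
    """Map depth-map features to a quantified score in (0, 1).

    A logistic model stands in here for the trained first machine
    learning model; larger scores indicate a higher assessed risk.
    """
    z = float(WEIGHTS @ features + BIAS)
    return 1.0 / (1.0 + np.exp(-z))

normal = quantified_score(np.array([0.0, 0.0, 0.0]))    # low-risk features
suspect = quantified_score(np.array([0.8, 0.6, 0.5]))   # pronounced changes
```

In a deployed system the weights would come from training on depth maps paired with abnormality labels, as described above, rather than being hard-coded.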
[0035] In some embodiments, an image processing technique is used to compute characteristics of the abnormality in the breast region that include measurements related to physical changes. These measurements may comprise a volumetric difference between the left and right breasts, asymmetry, localized shape irregularities, deformation patterns, and surface curvature. The extracted characteristics may either serve as direct indicators of abnormality or be used as input features for further classification by the first machine learning model. To improve model generalization when training the first machine learning model, training data may be augmented using transformations such as rotation, scaling, flipping, and noise addition.
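A few of these physical-change measurements can be computed directly from a depth map. The sketch below assumes, for illustration only, that the left and right breasts occupy the two halves of the map; a real pipeline would first segment each breast region, and the feature names are hypothetical.

```python
import numpy as np

def physical_change_features(depth: np.ndarray) -> dict:
    """Compute illustrative physical-change measurements from one depth map.

    Assumes the left and right breasts occupy the left and right halves
    of the map; real use would first segment each breast region.
    """
    h, w = depth.shape
    left, right = depth[:, : w // 2], depth[:, w - w // 2:]
    mirrored = right[:, ::-1]                  # mirror right half for comparison
    grad_y, grad_x = np.gradient(depth)
    return {
        # volumetric difference: total depth mass of each half
        "volume_diff": float(abs(left.sum() - right.sum())),
        # symmetry error: mean left vs mirrored-right deviation
        "asymmetry": float(np.mean(np.abs(left - mirrored))),
        # surface roughness proxy: mean gradient magnitude
        "roughness": float(np.mean(np.abs(grad_y)) + np.mean(np.abs(grad_x))),
    }

symmetric = np.tile(np.array([[1., 2., 2., 1.]]), (4, 1))
feats = physical_change_features(symmetric)   # perfectly symmetric toy input
```

During training, the same depth maps would additionally be augmented with rotations, scaling, flips, and added noise, as noted above, so that the classifier does not overfit to one capture geometry.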
[0036] The processor 106 maps the quantified score to predefined ranges to generate the BPC Index. The predefined ranges may be from 0 to 5, as decided by the doctor. For example, 0 may be considered as the normal range and 5 may be considered as the critical range. The processor 106 enables a user to identify the abnormality in the breast region by providing the BPC Index as the quantified score to the user. In some embodiments, the user performs a routine self-breast examination using the image-capturing device 102 to capture one or more images of the human breast region in multiple views. The processor 106 processes the captured images to generate the one or more depth maps. The processor 106 extracts morphological characteristics, such as asymmetry and surface deformation of the human breast region, and computes the BPC Index score. If the computed BPC Index score is determined to be 3, the system 100, based on predefined threshold values, identifies a moderate abnormality, thereby triggering a system-generated recommendation for further medical evaluation. If the computed BPC Index score meets or exceeds a threshold value of 4 or 5, the system 100 may autonomously generate an alert to notify a healthcare provider or recommend an immediate clinical examination. The historical BPC Index values stored in the database facilitate longitudinal analysis, enabling early detection of progressive abnormalities in the human breast region.
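The mapping from quantified score to the 0-5 BPC Index and the threshold behavior described above can be sketched as follows. The six equal bins and the action strings are assumptions for illustration; in practice the bin edges would be set clinically by a doctor rather than hard-coded.

```python
def bpc_index(score: float) -> int:
    """Map a quantified score in [0, 1] onto the 0-5 BPC Index.

    Six equal bins are assumed here; 0 corresponds to the normal range
    and 5 to the critical range.
    """
    return min(int(score * 6), 5)   # clamp so that score == 1.0 maps to 5

def recommended_action(index: int) -> str:
    """Illustrative threshold logic: 3 = moderate, 4-5 = critical."""
    if index >= 4:
        return "alert healthcare provider"       # critical range
    if index == 3:
        return "recommend medical evaluation"    # moderate abnormality
    return "continue routine self-examination"

moderate = bpc_index(0.55)            # falls in the moderate band
action = recommended_action(moderate)
```

A score of 0.55 lands in index 3, triggering the evaluation recommendation; any score reaching index 4 or 5 triggers the provider alert, mirroring the thresholds described above.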
[0037] In some embodiments, the generation of the BPC Index also involves utilizing additional health data associated with the user, such as age, self-assessed complaints, prior history, and lifestyle factors of the user, along with the obtained depth maps to improve abnormality detection accuracy in a user-friendly format. The system 100 stores the historical BPC Index for trend analysis and the ability to track changes over time. This capability can indicate the progression of abnormalities and support proactive healthcare management. The system 100 may also be equipped with a notification system to alert healthcare providers when the BPC Index suggests a high likelihood of abnormality. In some embodiments, the processor 106 sends a visual indicator to a healthcare provider if the BPC Index indicates a high likelihood of abnormality.
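The storage and trend-analysis capability can be sketched with a small relational store. The schema, table name, and the "strictly rising over the last three readings" rule below are illustrative assumptions; the patent does not prescribe a particular database or trend criterion.

```python
import sqlite3

# In-memory store standing in for the system's BPC Index database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bpc_history (day TEXT, bpc_index INTEGER)")

def record_bpc(day: str, index: int) -> None:
    """Persist one BPC Index reading (ISO date string sorts correctly)."""
    conn.execute("INSERT INTO bpc_history VALUES (?, ?)", (day, index))

def is_progressive(window: int = 3) -> bool:
    """Flag a strictly rising BPC Index over the last `window` readings."""
    rows = conn.execute(
        "SELECT bpc_index FROM bpc_history ORDER BY day DESC LIMIT ?",
        (window,),
    ).fetchall()
    values = [r[0] for r in rows][::-1]        # oldest first
    return len(values) == window and all(
        a < b for a, b in zip(values, values[1:])
    )

for day, idx in [("2025-01-01", 1), ("2025-02-01", 2), ("2025-03-01", 3)]:
    record_bpc(day, idx)
```

A rising sequence such as 1, 2, 3 would be flagged as progressive and could drive the notification to a healthcare provider described above.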
[0038] In some embodiments, the processor 106 is configured to combine the one or more depth maps generated from the one or more images of the human breast region to generate a 3D model of the human breast region and use the generated 3D model to generate the BPC Index. The processor 106 generates the 3D model by registering the one or more depth maps of the one or more captured images of the human breast region. The 3D model is registered by aligning the depth maps using an image registration technique. The image registration technique may include: a deep learning method that automatically learns optimal transformations and aligns images based on underlying patterns; feature-based techniques that identify and match distinctive key points for determining transformations; intensity-based registration that compares pixel intensities using similarity metrics such as mutual information or cross-correlation; rigid-transformation registration that aligns images through translation and rotation while preserving shape and size; non-rigid (deformable) registration that accounts for local deformations, often used for soft tissue in medical imaging; and other machine learning-driven frameworks.
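The rigid-transformation registration named above can be sketched with the Kabsch algorithm, which recovers the rotation and translation aligning two point clouds with known correspondences. This is a minimal building block, not the full pipeline: depth-map registration would combine it with correspondence search (e.g. ICP) or with the learned and intensity-based methods listed above.

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Estimate rotation R and translation t aligning src onto dst.

    Kabsch algorithm for N x 3 point clouds with known correspondences:
    dst_i is approximately R @ src_i + t after registration.
    """
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

# A toy cloud (e.g. sampled from one depth map) and a rotated, shifted copy.
cloud = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
theta = np.pi / 6
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.],
               [np.sin(theta),  np.cos(theta), 0.],
               [0., 0., 1.]])
moved = cloud @ Rz.T + np.array([0.5, -0.2, 0.1])

R, t = rigid_register(cloud, moved)
aligned = cloud @ R.T + t                        # should coincide with moved
```

Registering each view's depth map into a common frame this way is what allows the multiple predetermined-angle captures to be fused into one 3D model.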
[0039] In some embodiments, the image-capturing device 102 includes a visual or infrared lens to capture the one or more images of the human breast region at one or more predetermined angles. The visual lens may capture one or more visual images of the human breast region, and the infrared lens may capture one or more infrared images of the human breast region. The processor 106 implements a second machine learning model that generates one or more depth maps from the images of the human breast region captured using the visual or infrared lens. In some embodiments, the second machine learning model is trained by providing training data that comprises one or more images of breast regions and their corresponding ground truth depth maps. In some embodiments, the ground truth depth maps are paired with corresponding ground truth labels indicating whether the condition is abnormal or normal.
[0040] The second machine learning model may use deep learning techniques, including but not limited to encoder-decoder convolutional neural networks (CNNs), diffusion models, vision transformers, or general-purpose depth estimation models, to predict depth from the captured images. In some embodiments, the second machine learning model may leverage self-supervised learning frameworks to predict depth from the captured images.
[0041] The processor 106 is configured to combine the images of the breast region generated from the visual or infrared lens to generate a 3D model of the human breast region and use the generated 3D model to generate the BPC Index. The processor 106 generates the 3D model by registering the plurality of captured images of the breast region in multiple views using the image registration techniques, and uses the generated 3D model to compute the BPC Index.
[0042] FIG. 1B illustrates an exemplary system to identify an abnormality in a human breast region according to some embodiments herein. The system is connected to a mobile phone 108. The mobile phone 108 is equipped with a visual lens (sensor), which is used to capture images of the human breast region. In some embodiments, the mobile phone 108 is equipped with a thermal lens (sensor) or a depth sensor. The captured images are processed by the system 100 to generate a depth map, which provides depth-related information of the breast surface. The generated depth map is analyzed to compute a Breast Physical Characteristic (BPC) Index to identify any abnormalities based on changes in surface structure, symmetry, or other physical attributes.
[0043] In some embodiments, the human breast region is captured using at least any one of a Bluetooth clicker, a selfie clicker, or hand gesture recognition. In some embodiments, the human breast region is captured by another person assisting the user.
[0044] FIG. 2 illustrates an exploded view of a system for identifying an abnormality in a human breast region according to some embodiments herein. The exploded view of the system 100 includes a database 202, an image-receiving module 204, a depth map generating module 206, a Breast Physical Characteristic (BPC) Index generating module 208, and a first machine learning model 210.
[0045] The image-receiving module 204 receives one or more images of the human breast region from an image-capturing device 102. The image-capturing device 102 may be a Light Detection and Ranging (LIDAR) sensor or a depth sensor for capturing one or more images of the human breast region that represent the depth maps of the human breast region.
[0046] In some embodiments, the image-receiving module 204 captures the one or more images of the human breast region using at least any one of a visual lens, a depth sensor, or an infrared lens at one or more predetermined angles. The infrared lens may include at least one or a combination of a long-wave infrared lens (thermal lens), a near-infrared lens, or a mid-wave infrared lens. In some embodiments, the image-receiving module 204 is configured to capture the one or more images of the breast region at the one or more predetermined angles for covering the entire surface of the human breast region.
[0047] The depth map generating module 206 generates one or more depth maps from the one or more images of the human breast region. The depth map generating module 206 provides information on the relative distance of each pixel on the human breast region with respect to a reference point. The depth map generating module 206 implements a second machine learning model that generates one or more depth maps from the one or more images of the human breast region captured using the visual or infrared lens. In some embodiments, the second machine learning model is trained by providing training data that includes one or more images of breast regions and their corresponding ground truth depth maps. The second machine learning model may use deep learning techniques, including but not limited to encoder-decoder convolutional neural networks (CNNs), diffusion models, vision transformers, or general-purpose depth estimation models. In some embodiments, the second machine learning model leverages a self-supervised learning framework to predict depth from a single image across various domains.
[0048] In some embodiments, the one or more depth maps generated from the one or more images of the human breast region are combined to generate a 3D model of the human breast region, and the generated 3D model is used to generate the BPC Index. The 3D model is generated by registering the one or more depth maps of the one or more captured images of the human breast region.
[0049] The BPC Index generating module 208 generates a Breast Physical Characteristic (BPC) Index to identify one or more characteristics that indicate a risk of a breast-related abnormality from the one or more depth maps by analyzing the one or more depth maps to assign a quantified score indicating the breast-related abnormality based on the one or more depth maps of the human breast region and mapping the quantified score to predefined ranges to generate the BPC Index using a first machine learning model 210. The predefined ranges extend from a normal range to a critical range.
[0050] In some embodiments, the BPC Index generating module 208 segments the human breast region in the one or more depth maps before sending them to the first machine learning model 210. The BPC Index generating module 208 provides information on one or more characteristics that indicate a risk of the breast abnormality in the human breast region. In some embodiments, the characteristics include a physical asymmetry, a surface contour deformation, a size or a shape of the human breast region, health data of the user 214, the likelihood of breast malignancy, or any risk indicator associated with breast cancer.
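The segmentation step that precedes the first machine learning model 210 can be sketched with the crudest possible approach, foreground thresholding on the depth values. This is an assumption-laden stand-in: a deployed segmentation module would more likely use a learned model, but the interface is the same, a depth map in and a binary mask out. The `margin` parameter is hypothetical.

```python
import numpy as np

def segment_breast_region(depth: np.ndarray, margin: float = 0.1) -> np.ndarray:
    """Crude foreground segmentation of a depth map by thresholding.

    Keeps pixels sufficiently closer to the sensor than the background,
    taken here to be the farthest depth in the map.
    """
    background = depth.max()
    return depth < background * (1.0 - margin)   # boolean foreground mask

# Toy depth map: background at 1.0, a closer surface patch at 0.6-0.7.
depth = np.array([[1.0, 1.0, 1.0],
                  [1.0, 0.6, 0.7],
                  [1.0, 0.6, 1.0]])
mask = segment_breast_region(depth)   # True where the surface is foreground
```

Only the masked pixels would then be forwarded to the first machine learning model 210, so that background geometry does not contaminate the quantified score.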
[0051] In some embodiments, the BPC Index generating module 208 stores the generated BPC Index into a database for trend analysis and to enable the user 214 to track changes over time for indicating progressive abnormalities in the human breast region. The BPC Index generating module 208 is configured to send alerts to a healthcare provider if the BPC Index indicates a high likelihood of abnormality and the BPC Index exceeds a threshold value associated with abnormality.
[0052] In some embodiments, the BPC Index generating module 208 analyzes the one or more depth maps to assign the quantified score by providing the one or more depth maps to the first machine learning model 210. The first machine learning model 210 is trained to assess the abnormality of the human breast region. The first machine learning model 210 may be trained by providing the one or more depth maps and the corresponding output parameter related to abnormality. The first machine learning model 210 may utilize Convolutional Neural Networks (CNNs), such as Visual Geometry Group (VGG), Residual Networks (ResNet), Vision Transformers, or custom-designed networks, to effectively identify and classify potential abnormalities based on the one or more depth maps.
[0054] FIGS. 3A and 3B illustrate the depth map of the human breast surface in 3D representation according to some embodiments herein. FIGS. 3A and 3B depict different orientations of the human breast surface. The depth maps are utilized to compute the BPC Index that identifies initial signs of breast abnormalities in the human breast region.
[0055] FIG. 3C illustrates a depth map of the human breast surface according to some embodiments herein. The depth map may be an array of pixels with depth values. In some embodiments, the depth map enables visualization of the human breast region to analyze variations in the depth map. The depth map may serve as input for a first machine learning model to generate a Breast Physical Characteristic (BPC) Index for identifying one or more characteristics indicating a risk of a breast-related abnormality.
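The per-pixel representation described above, where each pixel holds the distance of a surface point from a reference point, can be sketched as follows. The grid of 3D surface points and the reference point are hypothetical sample data for illustration.

```python
import math

def build_depth_map(surface_points, reference):
    """surface_points: 2-D grid of (x, y, z) tuples; reference: an (x, y, z) point.
    Returns a 2-D grid of Euclidean distances, one depth value per pixel."""
    return [
        [math.dist(point, reference) for point in row]
        for row in surface_points
    ]
```

For a single-row grid of points at (0, 0, 3) and (0, 4, 0) with the reference at the origin, the resulting depth map is `[[3.0, 4.0]]`.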
[0056] FIG. 4 illustrates a flow diagram of a method for identifying an abnormality in a human breast region by automatically determining a physical change on a surface of the human breast region according to some embodiments herein. At step 402, the method includes generating one or more images of a human breast from an image capturing device to identify an abnormality in a human breast region using the one or more images of the human breast received from the image capturing device. At step 404, the method includes generating one or more depth maps from the one or more images of the human breast region. Each pixel of the one or more depth maps represents a distance of a surface pixel of the human breast region with respect to a reference point. At step 406, the method includes generating, using a first machine learning model, a Breast Physical Characteristic (BPC) Index to identify one or more characteristics that indicate a risk of a breast-related abnormality from the one or more depth maps by (i) analyzing the one or more depth maps to assign a quantified score indicating the breast-related abnormality based on the one or more depth maps of the breast region and (ii) mapping the quantified score to predefined ranges to generate the BPC Index. The predefined ranges extend from a normal range to a critical range. At step 408, the method includes enabling a user to identify the abnormality in the breast region by providing the BPC Index as a quantified score to the user.
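The steps of FIG. 4 can be sketched end to end as follows. The three callables `capture_images`, `depth_map_from_image`, and `score_depth_maps` are hypothetical stand-ins for the image capturing device, the depth-map generation, and the first machine learning model, respectively, and the default ranges are illustrative.

```python
def identify_abnormality(capture_images, depth_map_from_image, score_depth_maps,
                         ranges=((0.0, 0.5, "normal"), (0.5, 1.01, "critical"))):
    """Sketch of the method of FIG. 4 under the stated assumptions."""
    images = capture_images()                                     # step 402
    depth_maps = [depth_map_from_image(img) for img in images]    # step 404
    score = score_depth_maps(depth_maps)                          # step 406 (i)
    label = next(l for lo, hi, l in ranges if lo <= score < hi)   # step 406 (ii)
    return {"score": score, "bpc_index": label}                   # step 408
```

Providing the result dictionary to the user corresponds to step 408: the user receives both the quantified score and its mapped BPC Index range.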
[0057] FIG. 5 illustrates a block diagram of one example system for identifying an abnormality in a human breast region by automatically determining a physical change on the surface of the human breast region according to some embodiments described with respect to the flow diagram of FIG. 4. The system 500 includes an antenna 501, an image receiver 502, a processor 503, an image generator 504, a storage device 505, a machine learning model 506, a Central Processing Unit (CPU) 508, a memory 509, a workstation 510, machine-readable media 511, a display device 512, a keyboard 513, a mouse 514, patient records 515, a database 516, and a network 517. The image receiver 502 wirelessly receives the video via the antenna 501, having been transmitted thereto from the image capturing device 102 of FIG. 1. The processor 503 identifies an abnormality in a human breast region using one or more images of a human breast. The image generator 504 generates one or more depth maps from the one or more images of the human breast region. Both the processor 503 and the image generator 504 store their results to the storage device 505. The machine learning model 506 retrieves the results from the storage device 505 and generates an Index to identify one or more characteristics that indicate a risk of a breast-related abnormality from the one or more depth maps. The system 500 generates the report to the user. The Central Processing Unit (CPU) 508 retrieves machine-readable program instructions from the memory 509 to facilitate the functionality of any of the modules of the system 500. The CPU 508, operating alone or in conjunction with other processors, may be configured to assist or otherwise perform the functionality of any of the modules or processing units of the system 500, as well as facilitating communication between the system 500 and the workstation 510.
[0058] The system 500 is shown having been placed in communication with the workstation 510. A computer case of the workstation houses various components such as a motherboard with a processor and memory, a network card, a video card, a hard drive capable of reading/writing to the machine-readable media 511 such as a floppy disk, optical disk, CD-ROM, DVD, magnetic tape, and the like, and other software and hardware needed to perform the functionality of a computer workstation. The workstation 510 further includes a display device 512, such as a CRT, LCD, or touch screen device, for displaying information, images, view angles, and the like. A user can view any of that information and select from menu options displayed thereon. The keyboard 513 and the mouse 514 effectuate a user input. It should be appreciated that the workstation 510 has an operating system and other specialized software configured to display alphanumeric values, menus, scroll bars, dials, slidable bars, pull-down options, selectable buttons, and the like, for entering, selecting, modifying, and accepting information needed for processing in accordance with the teachings hereof. The workstation 510 is further enabled to display images and the presence of the abnormalities in the human breast region. A user or technician may use the user interface of the workstation to identify the abnormality in the human breast region using the BPC Index as a quantified score, as needed or as desired, depending on the implementation. Any of these selections or inputs may be stored to or retrieved from the machine-readable media 511. Default settings can be retrieved from the machine-readable media 511. A user of the workstation 510 is also able to view or manipulate any of the data in the patient records, collectively at 515, stored in the database 516.
Any of the received images, results, and the like may be stored to a storage device internal to the workstation 510. Although shown as a desktop computer, the workstation 510 can be a laptop, mainframe, or a special purpose computer such as an ASIC, circuit, or the like.
[0059] Any of the components of the workstation may be placed in communication with any of the modules and processing units of the system 500. Any of the modules of the system 500 can be placed in communication with the storage device 505 and/or the computer-readable media 511 and may store/retrieve therefrom data, variables, records, parameters, functions, and/or machine-readable/executable program instructions, as needed to perform their intended functions. Each of the modules of the system 500 may be placed in communication with one or more remote devices over the network 517. It should be appreciated that some or all of the functionality performed by any of the modules or processing units of the system 500 can be performed, in whole or in part, by the workstation. The embodiment shown is illustrative and should not be viewed as limiting the scope of the appended claims strictly to that configuration. Various modules may designate one or more components which may, in turn, comprise software and/or hardware designed to perform the intended function.
[0060] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the scope.
Claims:
I/We Claim:
1. A system (100) for identifying an abnormality in a human breast region by automatically determining a physical change on a surface of the human breast region, wherein the system (100) comprises:
an image-capturing device (102) that is configured to generate a plurality of images of a human breast; and
a computing device (104) comprising:
a memory; and
a processor (106) retrieving machine-readable instructions from the memory which, when executed by the processor (106), enable the processor (106) to identify an abnormality in a human breast region, using the plurality of images of the human breast received from the image capturing device (102), characterized in that, by:
generating a plurality of depth maps from the plurality of images of the human breast region, wherein each pixel of the plurality of depth maps represents a distance of a surface pixel of the human breast region with respect to a reference point;
generating, using a first machine learning model (210), a Breast Physical Characteristic (BPC) Index to identify one or more characteristics that indicate a risk of a breast-related abnormality from the plurality of depth maps by (i) analyzing the plurality of depth maps to assign a quantified score indicating the breast-related abnormality based on the plurality of depth maps of the breast region and (ii) mapping the quantified score to predefined ranges to generate the BPC Index, wherein the predefined ranges extend from a normal range to a critical range; and
enabling a user (214) to identify the abnormality in the breast region by providing the BPC Index as the quantified score to the user (214).
2. The system (100) as claimed in claim 1, wherein the image capturing device (102) is configured to capture the plurality of images of the human breast region using a visual lens or an infrared lens at a plurality of predetermined angles, wherein the processor (106) is configured to implement a second machine learning model that generates a plurality of depth maps from the plurality of images of the human breast region captured using the visual or infrared lens.
3. The system (100) as claimed in claim 2, wherein the second machine learning model is trained by providing training data that comprises a plurality of images of breast regions and their corresponding ground truth depth maps.
4. The system (100) as claimed in claim 1, wherein the image capturing device (102) comprises a Light Detection and Ranging (LIDAR) sensor or a depth sensor for capturing the plurality of images of the human breast region that represent the depth maps of the human breast region, wherein the image capturing device (102) is configured to capture the plurality of images of the breast region at a plurality of predetermined angles for covering the entire surface of the human breast region.
5. The system (100) as claimed in claim 1, wherein the generation of the BPC Index further comprises employing a segmentation module to segment the human breast region in the plurality of depth maps before sending to the first machine learning model (210).
6. The system (100) as claimed in claim 1, wherein the BPC Index comprises information of one or more characteristics that indicate a risk of the breast abnormality in the human breast region, wherein the characteristics comprise at least one of a physical asymmetry, a surface contour deformation, a size or a shape of the human breast region, health data of the user (214), a likelihood of breast malignancy, or any risk indicator associated with breast cancer.
7. The system (100) as claimed in claim 1, wherein the processor (106) is configured to store the generated BPC Index into a database for trend analysis and to enable the user (214) to track changes over time for indicating progressive abnormalities in the human breast region, wherein the processor (106) is configured to send alerts to a healthcare provider if the BPC Index indicates a high likelihood of abnormality and the BPC Index exceeds a threshold value associated with abnormality.
8. The system (100) as claimed in claim 1, wherein the processor (106) is configured to combine the plurality of depth maps generated from the plurality of images of the human breast region to generate a 3D model of the human breast region and use the generated 3D model to generate the BPC Index, wherein the processor (106) generates the 3D model by registering the plurality of depth maps of the plurality of captured images of the human breast region.
9. The system (100) as claimed in claim 2, wherein the processor (106) is configured to combine the plurality of images of the breast region generated from the visual or infrared lens to generate a 3D model of the human breast region and use the generated 3D model to generate the BPC Index, wherein the processor (106) generates the 3D model by registering the plurality of captured images of the breast region.
10. The system (100) as claimed in claim 1, wherein the processor (106) is configured to analyze the plurality of depth maps to assign a quantified score by providing the plurality of depth maps to the first machine learning model (210) that is trained to characterize the abnormality in the breast region, wherein the first machine learning model (210) is trained by providing a plurality of depth maps and the corresponding output parameter related to abnormality.
11. A method for identifying an abnormality in a human breast region by automatically determining a physical change on a surface of the human breast region, wherein the method comprises:
generating a plurality of images of a human breast from an image-capturing device (102) to identify an abnormality in a human breast region, using the plurality of images of the human breast received from the image capturing device (102);
generating a plurality of depth maps from the plurality of images of the human breast region, wherein each pixel of the plurality of depth maps represents a distance of a surface pixel of the human breast region with respect to a reference point;
generating, using a first machine learning model (210), a Breast Physical Characteristic (BPC) Index to identify one or more characteristics that indicate a risk of a breast-related abnormality from the plurality of depth maps by (i) analyzing the plurality of depth maps to assign a quantified score indicating the breast-related abnormality based on the plurality of depth maps of the breast region and (ii) mapping the quantified score to predefined ranges to generate the BPC Index, wherein the predefined ranges extend from a normal range to a critical range; and
enabling a user (214) to identify the abnormality in the breast region by providing the BPC Index as a quantified score to the user (214).
| # | Name | Date |
|---|---|---|
| 1 | 202541036896-STATEMENT OF UNDERTAKING (FORM 3) [16-04-2025(online)].pdf | 2025-04-16 |
| 2 | 202541036896-PROOF OF RIGHT [16-04-2025(online)].pdf | 2025-04-16 |
| 3 | 202541036896-POWER OF AUTHORITY [16-04-2025(online)].pdf | 2025-04-16 |
| 4 | 202541036896-FORM FOR STARTUP [16-04-2025(online)].pdf | 2025-04-16 |
| 5 | 202541036896-FORM FOR SMALL ENTITY(FORM-28) [16-04-2025(online)].pdf | 2025-04-16 |
| 6 | 202541036896-FORM 1 [16-04-2025(online)].pdf | 2025-04-16 |
| 7 | 202541036896-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [16-04-2025(online)].pdf | 2025-04-16 |
| 8 | 202541036896-EVIDENCE FOR REGISTRATION UNDER SSI [16-04-2025(online)].pdf | 2025-04-16 |
| 9 | 202541036896-DRAWINGS [16-04-2025(online)].pdf | 2025-04-16 |
| 10 | 202541036896-DECLARATION OF INVENTORSHIP (FORM 5) [16-04-2025(online)].pdf | 2025-04-16 |
| 11 | 202541036896-COMPLETE SPECIFICATION [16-04-2025(online)].pdf | 2025-04-16 |
| 12 | 202541036896-FORM-9 [29-04-2025(online)].pdf | 2025-04-29 |
| 13 | 202541036896-STARTUP [03-05-2025(online)].pdf | 2025-05-03 |
| 14 | 202541036896-FORM28 [03-05-2025(online)].pdf | 2025-05-03 |
| 15 | 202541036896-FORM 18A [03-05-2025(online)].pdf | 2025-05-03 |
| 16 | 202541036896-Request Letter-Correspondence [21-08-2025(online)].pdf | 2025-08-21 |
| 17 | 202541036896-Power of Attorney [21-08-2025(online)].pdf | 2025-08-21 |
| 18 | 202541036896-FORM28 [21-08-2025(online)].pdf | 2025-08-21 |
| 19 | 202541036896-Form 1 (Submitted on date of filing) [21-08-2025(online)].pdf | 2025-08-21 |
| 20 | 202541036896-Covering Letter [21-08-2025(online)].pdf | 2025-08-21 |