Abstract: This disclosure relates to a method and system for identifying a disease through image processing. The method (800) includes receiving (801) image data corresponding to a patient from a data source; and preprocessing (802) the image data through a predictive model to obtain a disease-specific image dataset. The method (800) further includes, for each image in the disease-specific image dataset, identifying (807) a disease associated with the patient from a plurality of diseases using the disease-specific image dataset through a Hybrid Neural Architecture Search (H-NAS) model. The H-NAS model includes a plurality of NAS models. Each of the plurality of NAS models provides an output accuracy score for an image. Each of the plurality of diseases comprises a set of stages. The method (800) further includes, for each of the plurality of NAS models, determining (808) a stage from the set of stages associated with the identified disease from the image based on the output accuracy score.
Claims:
CLAIMS
1. A method (800) for identifying a disease through image processing, the method (800) comprising:
receiving (801), by a disease prediction device (101), image data corresponding to a patient from a data source, wherein the image data comprises labeled image data and unlabeled image data, wherein each of the labeled image data comprises a label, and wherein the data source is one of a batch data source or a real-time data source;
preprocessing (802), by the disease prediction device (101), the image data through a predictive model to obtain a disease-specific image dataset, wherein the predictive model is based on an image processing algorithm;
for each image in the disease-specific image dataset,
identifying (807), by the disease prediction device (101), a disease associated with the patient from a plurality of diseases using the disease-specific image dataset through a Hybrid Neural Architecture Search (H-NAS) model, wherein the H-NAS model comprises a plurality of NAS models, wherein each of the plurality of NAS models provides an output accuracy score for an image, and wherein each of the plurality of diseases comprises a set of stages; and
for each of the plurality of NAS models, determining (808), by the disease prediction device (101), a stage from the set of stages associated with the identified disease from the image based on the output accuracy score.
2. The method of claim 1, further comprising:
for each of the set of stages associated with each of a plurality of diseases,
assigning (809) a rank to each of the plurality of NAS models based on the output accuracy score;
generating (810) a ranking matrix for each of the plurality of NAS models and each of the set of stages associated with a disease from the plurality of diseases based on the rank;
using (811) a top-ranked NAS model to identify the stage of the disease from the image data; and
for each of the set of stages associated with each of a plurality of diseases,
identifying (812) a core disease pattern and a pattern associated with the set of stages corresponding to the core disease pattern based on the image data through a Dynamic Ensemble Siamese Image and Disease Mapping System (DESIDMS)-based Deep NAS (DNAS) Siamese processor (602);
comparing (813) an accuracy score associated with each of the plurality of NAS models for each of the set of stages of the core disease pattern through the DESIDMS-based DNAS Siamese processor (602);
determining (814) a category of the disease and a rank corresponding to the category determined, based on the accuracy score associated with each of the plurality of NAS models for the core disease pattern; and
selecting (815), by the DESIDMS-based DNAS Siamese processor (602), an optimal NAS model to identify the disease from the image data, wherein the accuracy score for the core disease pattern associated with the optimal NAS model is highest among the plurality of NAS models.
3. The method of claim 1, wherein preprocessing (802) the image data through a predictive model further comprises:
determining a label for each of the unlabeled image data through the predictive model; and
training the predictive model with labeled image data to determine the label for each of the unlabeled image data through the predictive model.
4. The method of claim 1, wherein preprocessing (802) the image data through a predictive model further comprises:
identifying (803) a set of patterns in each of the image data through a pattern recognition model;
assigning (804) a rank to each of the image data corresponding to each of a plurality of diseases based on an accuracy score of the pattern recognition model, wherein the accuracy score is based on the set of patterns identified;
generating (805) a ranking matrix based on the ranking assigned, wherein the ranking matrix comprises the accuracy score associated with each of the image data and each of the plurality of diseases; and
creating (806) a disease-specific image dataset based on the ranking matrix, wherein the accuracy score associated with each of the image data in the disease-specific image dataset is above a predefined threshold score.
5. The method of claim 1, further comprising validating each of the image data in the disease-specific image dataset through an ensemble model.
6. The method of claim 1, further comprising:
enhancing at least one of the image data when the at least one of the image data is a false positive upon classification by the H-NAS model;
validating the at least one of the image data with metadata associated with the patient; and
identifying a disease associated with the patient from a plurality of diseases using the at least one of the image data through the H-NAS model.
7. The method of claim 6, wherein enhancing at least one of the image data further comprises:
receiving true positive image data and the at least one of the image data from an administrator, wherein each of the true positive image data comprises true positive metadata;
identifying metadata associated with the disease based on the true positive image data;
determining enhanced image data from each of the true positive image data and the at least one of the image data using a Conditional Random Field (CRF) model based on the metadata associated with the disease; and
associating the metadata associated with the disease with the at least one of the image data determined as the enhanced image data.
8. A system (200) for identifying a disease through image processing, the system (200) comprising:
a processor (102); and
a memory communicatively coupled to the processor (102), wherein the memory stores processor instructions, which when executed by the processor (102), cause the processor (102) to:
receive (801) image data corresponding to a patient from a data source, wherein the image data comprises labeled image data and unlabeled image data, wherein each of the labeled image data comprises a label, and wherein the data source is one of a batch data source or a real-time data source;
preprocess (802) the image data through a predictive model to obtain a disease-specific image dataset, wherein the predictive model is based on an image processing algorithm;
for each image in the disease-specific image dataset,
identify (807) a disease associated with the patient from a plurality of diseases using the disease-specific image dataset through a Hybrid Neural Architecture Search (H-NAS) model, wherein the H-NAS model comprises a plurality of NAS models, wherein each of the plurality of NAS models provides an output accuracy score for an image, and wherein each of the plurality of diseases comprises a set of stages; and
for each of the plurality of NAS models, determine (808) a stage from the set of stages associated with the identified disease from the image based on the output accuracy score.
9. The system of claim 8, wherein the processor instructions, on execution, further cause the processor (102) to:
for each of the set of stages associated with each of a plurality of diseases,
assign (809) a rank to each of the plurality of NAS models based on the output accuracy score;
generate (810) a ranking matrix for each of the plurality of NAS models and each of the set of stages associated with a disease from the plurality of diseases based on the rank; and
use (811) a top-ranked NAS model to identify the stage of the disease from the image data; and
for each of the set of stages associated with each of a plurality of diseases,
identify (812) a core disease pattern and a pattern associated with the set of stages corresponding to the core disease pattern based on the image data through a Dynamic Ensemble Siamese Image and Disease Mapping System (DESIDMS)-based Deep NAS (DNAS) Siamese processor (602);
compare (813) an accuracy score associated with each of the plurality of NAS models for each of the set of stages of the core disease pattern through the DESIDMS-based DNAS Siamese processor (602);
determine (814) a category of the disease and a rank corresponding to the category determined, based on the accuracy score associated with each of the plurality of NAS models for the core disease pattern; and
select (815), by the DESIDMS-based DNAS Siamese processor (602), an optimal NAS model to identify the disease from the image data, wherein the accuracy score for the core disease pattern associated with the optimal NAS model is highest among the plurality of NAS models.
10. The system of claim 8, wherein to preprocess (802) the image data through a predictive model, the processor instructions, on execution, further cause the processor (102) to:
determine a label for each of the unlabeled image data through the predictive model; and
train the predictive model with labeled image data to determine the label for each of the unlabeled image data through the predictive model.
11. The system of claim 8, wherein to preprocess (802) the image data through a predictive model, the processor instructions, on execution, further cause the processor (102) to:
identify (803) a set of patterns in each of the image data through a pattern recognition model;
assign (804) a rank to each of the image data corresponding to each of a plurality of diseases based on an accuracy score of the pattern recognition model, wherein the accuracy score is based on the set of patterns identified;
generate (805) a ranking matrix based on the ranking assigned, wherein the ranking matrix comprises the accuracy score associated with each of the image data and each of the plurality of diseases; and
create (806) a disease-specific image dataset based on the ranking matrix, wherein the accuracy score associated with each of the image data in the disease-specific image dataset is above a predefined threshold score.
12. The system of claim 8, wherein the processor instructions, on execution, further cause the processor (102) to:
enhance at least one of the image data when the at least one of the image data is a false positive upon classification by the H-NAS model;
validate the at least one of the image data with metadata associated with the patient; and
identify a disease associated with the patient from a plurality of diseases using the at least one of the image data through the H-NAS model.
13. The system of claim 12, wherein to enhance at least one of the image data, the processor instructions, on execution, further cause the processor (102) to:
receive true positive image data and the at least one of the image data from an administrator, wherein each of the true positive image data comprises true positive metadata;
identify metadata associated with the disease based on the true positive image data;
determine enhanced image data from each of the true positive image data and the at least one of the image data using a Conditional Random Field (CRF) model based on the metadata associated with the disease; and
associate the metadata associated with the disease with the at least one of the image data determined as the enhanced image data.
Description:
DESCRIPTION
Technical Field
[001] This disclosure relates generally to image processing, and more particularly to method and system for identifying a disease through image processing.
Background
[002] In the present scenario, the healthcare industry is constrained by a lack of the resources needed to obtain an improved diagnosis from medical imaging techniques. Medical imaging includes a set of processes or techniques to create visual representations of the interior parts of the body, such as organs or tissues, for clinical purposes, namely to monitor health, diagnose, and treat diseases and injuries.
[003] Traditional algorithmic approaches to medical image analysis suffer from numerous technical problems related to an inability to adequately perform the analysis without significant human intervention and guidance, which precludes completely automated disease prediction and suggestion. Therefore, in the present state of the art, automated medical image analysis generally yields low-accuracy or uncertain predictions for medical images (such as, X-Ray images, Computed Tomography (CT) scan images, Magnetic Resonance Imaging (MRI) images, Positron Emission Tomography (PET) images, and the like). With an inaccurate diagnosis, the treatment recommendation may also be inaccurate, which may prove fatal for the patient.
[004] The conventional techniques fail to provide methods to optimally identify a disease for a patient using medical images through image processing. There is, therefore, a need in the present state of the art for techniques to accurately identify diseases and their stages, and to recommend treatment based on the diseases and the associated stages, using image processing.
SUMMARY
[005] In one embodiment, a method for identifying a disease through image processing is disclosed. In one example, the method includes receiving image data corresponding to a patient from a data source. The image data includes labeled image data and unlabeled image data. Each of the labeled image data includes a label. The data source is one of a batch data source or a real-time data source. The method further includes preprocessing the image data through a predictive model to obtain a disease-specific image dataset. The predictive model is based on an image processing algorithm. For each image in the disease-specific image dataset, the method further includes identifying a disease associated with the patient from a plurality of diseases using the disease-specific image dataset through a Hybrid Neural Architecture Search (H-NAS) model. Each of the plurality of diseases comprises a set of stages. The H-NAS model includes a plurality of NAS (Neural Architecture Search) models. Each of the plurality of NAS models provides an output accuracy score for an image. For each image in the disease-specific image dataset, the method further includes, for each of the plurality of NAS models, determining a stage from the set of stages associated with the identified disease from the image based on the output accuracy score.
[006] In one embodiment, a system for identifying a disease through image processing is disclosed. In one example, the system includes a processor and a computer-readable medium communicatively coupled to the processor. The computer-readable medium stores processor-executable instructions, which, on execution, cause the processor to receive image data corresponding to a patient from a data source. The image data includes labeled image data and unlabeled image data. Each of the labeled image data includes a label. The data source is one of a batch data source or a real-time data source. The processor-executable instructions, on execution, further cause the processor to preprocess the image data through a predictive model to obtain a disease-specific image dataset. The predictive model is based on an image processing algorithm. For each image in the disease-specific image dataset, the processor-executable instructions, on execution, further cause the processor to identify a disease associated with the patient from a plurality of diseases using the disease-specific image dataset through a Hybrid Neural Architecture Search (H-NAS) model. Each of the plurality of diseases includes a set of stages. The H-NAS model includes a plurality of NAS models. Each of the plurality of NAS models provides an output accuracy score for an image. For each image in the disease-specific image dataset, the processor-executable instructions, on execution, further cause the processor to, for each of the plurality of NAS models, determine a stage from the set of stages associated with the identified disease from the image based on the output accuracy score.
[007] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles.
[009] FIG. 1 is a block diagram of an exemplary system for identifying a disease through image processing, in accordance with some embodiments of the present disclosure.
[010] FIG. 2 illustrates a functional block diagram of an exemplary system for identifying a disease through image processing, in accordance with some embodiments of the present disclosure.
[011] FIGS. 3A and 3B illustrate a detailed block diagram of an exemplary system for identifying a disease through image processing, in accordance with some embodiments of the present disclosure.
[012] FIG. 4 illustrates a block diagram of an exemplary Image/Non-image Unique Pattern Detector (IUPD), in accordance with some embodiments of the present disclosure.
[013] FIG. 5 illustrates a block diagram of an exemplary Hybrid-Neural Architecture Search (H-NAS) system, in accordance with some embodiments of the present disclosure.
[014] FIG. 6 illustrates a block diagram of an exemplary Dynamic Ensemble Siamese Image and Disease Mapping System (DESIDMS), in accordance with some embodiments of the present disclosure.
[015] FIG. 7 illustrates a block diagram of an exemplary NAS Siamese Interpreter (NSI), in accordance with some embodiments of the present disclosure.
[016] FIGS. 8A and 8B illustrate a flow diagram of an exemplary process for identifying a disease through image processing, in accordance with some embodiments of the present disclosure.
[017] FIG. 9 illustrates a flow diagram of an exemplary process for identifying a disease associated with the patient from a plurality of diseases using image data, in accordance with some embodiments of the present disclosure.
[018] FIG. 10 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
DETAILED DESCRIPTION
[019] Exemplary embodiments are described with reference to the accompanying drawings. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the following claims.
[020] Referring now to FIG. 1, an exemplary system 100 for identifying a disease through image processing is illustrated, in accordance with some embodiments of the present disclosure. The system 100 may implement a disease prediction device 101 (for example, server, desktop, laptop, notebook, netbook, tablet, smartphone, mobile phone, or any other computing device), in accordance with some embodiments of the present disclosure. The disease prediction device 101 may identify a disease from a plurality of diseases using the disease-specific image dataset through a Hybrid Neural Architecture Search (H-NAS) model. It should be noted that, in some embodiments, the disease prediction device 101 may determine a stage from a set of stages associated with the identified disease from the image based on an output accuracy score.
[021] As will be described in greater detail in conjunction with FIGS. 2 – 9, the disease prediction device may receive image data corresponding to a patient from a data source. The image data includes labeled image data and unlabeled image data. Each of the labeled image data includes a label. The data source is one of a batch data source or a real-time data source. The disease prediction device may further preprocess the image data through a predictive model to obtain a disease-specific image dataset. The predictive model is based on an image processing algorithm. Each of the plurality of diseases includes a set of stages. For each image in the disease-specific image dataset, the disease prediction device may further identify a disease associated with the patient from a plurality of diseases using the disease-specific image dataset through a Hybrid Neural Architecture Search (H-NAS) model. The H-NAS model includes a plurality of NAS models. Each of the plurality of NAS models provides an output accuracy score for an image. For each image in the disease-specific image dataset, the disease prediction device may further, for each of the plurality of NAS models, determine a stage from the set of stages associated with the identified disease from the image based on the output accuracy score.
[022] In some embodiments, the disease prediction device 101 may include one or more processors 102 and a computer-readable medium 103 (for example, a memory). The computer-readable medium 103 may include the H-NAS model. Further, the computer-readable storage medium 103 may store instructions that, when executed by the one or more processors 102, cause the one or more processors 102 to identify a disease through image processing, in accordance with aspects of the present disclosure. The computer-readable storage medium 103 may also store various data (for example, labeled image data, unlabeled image data, disease-specific image dataset, Dynamic Ensemble Siamese Image and Disease Mapping System (DESIDMS), and the like) that may be captured, processed, and/or required by the system 100.
[023] The system 100 may further include a display 104. The system 100 may interact with a user via a user interface 105 accessible via the display 104. The system 100 may also include one or more external devices 106. In some embodiments, the disease prediction device 101 may interact with the one or more external devices 106 over a communication network 107 for sending or receiving various data. The external devices 106 may include, but may not be limited to, a remote server, a digital device, or another computing system.
[024] Referring now to FIG. 2, a functional block diagram of an exemplary system 200 for identifying a disease through image processing is illustrated, in accordance with some embodiments of the present disclosure. In an embodiment, the system 200 is analogous to the disease prediction device 101 of the system 100. The system 200 includes an Image/Non-image Unique Pattern Detector (IUPD) 201, an image database 202, multi-input coherence Siamese with complex feature descriptors 203, a core image unique pattern detector and ensemble classifier 204, a NAS-based image system prediction and recommender system 205, a Dynamic Ensemble Siamese Image and Disease Mapping System (DESIDMS) disease detection, prediction tuning, and NAS suggestion module 206, an image database update and optimizer 207, and a Siamese deep detect system for rejected images 208. The IUPD 201 includes a feature extractor, hierarchy feature retriever, image classifier 209, and a pattern detector and image ranker 210.
[025] The IUPD 201 receives image/non-image data 211 corresponding to a patient from a data source. The image/non-image data 211 includes a plurality of images. The data source may be a batch data source or a real-time data source. The IUPD 201 is a sub-system to identify necessary images for classification by a proposed NAS-Siamese system. The feature extractor, hierarchy feature retriever, image classifier 209 extracts a set of features from the image/non-image data 211. In some embodiments, the feature extractor, hierarchy feature retriever, image classifier 209 retrieves hierarchy features from the image database 202 and assigns labels to unlabeled image data through the multi-input coherence Siamese with complex feature descriptors 203. Further, the feature extractor, hierarchy feature retriever, image classifier 209 classifies each of the plurality of images into relevant images and non-relevant images. Further, the feature extractor, hierarchy feature retriever, image classifier 209 sends the relevant images to the pattern detector and image ranker 210 for additional image processing and accuracy improvement.
[026] In an embodiment, when the image label accuracy of the predictive model is above a predefined threshold accuracy (for example, 95%), the pattern detector and image ranker 210 activates a virtual image intelligent agent within the feature extractor, hierarchy feature retriever, image classifier 209. Further, the feature extractor, hierarchy feature retriever, image classifier 209 handles image classification of incoming labeled and unlabeled data. Additionally, the feature extractor, hierarchy feature retriever, image classifier 209 provides feedback to the pattern detector and image ranker 210 which generates a ranking matrix (such as, [disease 1, image 1, accuracy], [disease 1, image 2, accuracy], etc.) for disease-specific images to obtain preprocessed data.
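By way of an illustrative, non-limiting example, the following Python sketch shows one way the ranking matrix and the disease-specific image dataset described above might be assembled. All identifiers, data shapes, and the threshold value are hypothetical assumptions for illustration, not the actual implementation of the pattern detector and image ranker 210.

```python
# Minimal sketch: flatten per-disease accuracy scores into a
# [disease, image, accuracy] ranking matrix, then keep only the
# disease-specific images whose score exceeds a threshold.
from typing import Dict, List, Tuple

def build_ranking_matrix(scores: Dict[str, Dict[str, float]]) -> List[Tuple[str, str, float]]:
    """Flatten {disease: {image_id: accuracy}} into [disease, image, accuracy] rows."""
    matrix = [(disease, image_id, accuracy)
              for disease, images in scores.items()
              for image_id, accuracy in images.items()]
    matrix.sort(key=lambda row: row[2], reverse=True)  # rank by accuracy, best first
    return matrix

def disease_specific_dataset(matrix, threshold=0.95):
    """Retain only image entries whose accuracy score is above the threshold."""
    return [row for row in matrix if row[2] > threshold]

scores = {"disease 1": {"image 1": 0.97, "image 2": 0.88},
          "disease 2": {"image 1": 0.96}}
print(disease_specific_dataset(build_ranking_matrix(scores)))
```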
[027] Further, the preprocessed data is received by the core image unique pattern detector and ensemble classifier 204. The core image unique pattern detector and ensemble classifier 204 classifies images for relevance based on bias or variance of the images in the preprocessed data. It should be noted that the core image unique pattern detector and ensemble classifier 204 provides for an improved image coherence. Further, the core image unique pattern detector and ensemble classifier 204 sends the preprocessed data to the NAS-based image system prediction and recommender system 205.
[028] The NAS-based image system prediction and recommender system 205 includes an H-NAS model for predicting a disease from a plurality of diseases for an image. The H-NAS model includes a plurality of NAS models. Each of the plurality of NAS models predicts a disease from the image in parallel with a first output accuracy score. Additionally, the NAS-based image system prediction and recommender system 205 determines a stage from a set of stages of the predicted disease (such as, Cancer stage-1) through each of the plurality of NAS models with a second output accuracy score. Further, the NAS-based image system prediction and recommender system 205 provides a treatment recommendation based on the predicted disease and the predicted stage of the disease. The NAS-based image system prediction and recommender system 205 sends the predicted disease, the associated first output accuracy score, the predicted stage, and the associated second output accuracy score of each of the plurality of NAS models to the DESIDMS disease detection, prediction tuning, and NAS suggestion module 206 to receive an optimal NAS model for predicting a disease and a stage associated with the disease.
[029] The DESIDMS disease detection, prediction tuning, and NAS suggestion module 206 determines the optimal NAS model from the plurality of NAS models for predicting the disease based on the first output accuracy score and the second output accuracy score of each of the plurality of NAS models. The optimal NAS model is used to predict the disease from image data that may be received in the future. The NAS-based image system prediction and recommender system 205 sends failed images (false positives and false negatives) to the Siamese deep detect system for rejected images 208. The Siamese deep detect system for rejected images 208 enhances the failed images for improved disease prediction using Conditional Random Field (CRF) or Markov Random Field (MRF)-based image segmentation.
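By way of an illustrative, non-limiting example, the following Python sketch captures the H-NAS idea described above: several NAS models score the same image in parallel, and the model with the highest output accuracy score is treated as optimal. The predictor functions are stand-ins for illustration, not the actual NAS architectures of the disclosure.

```python
# Minimal sketch: run placeholder NAS predictors over one image in parallel
# and select the prediction with the highest output accuracy score.
from concurrent.futures import ThreadPoolExecutor

def nas_model_a(image):  # stand-in for one NAS architecture
    return {"model": "NAS-A", "disease": "cancer", "stage": "stage-1", "accuracy": 0.91}

def nas_model_b(image):  # stand-in for another NAS architecture
    return {"model": "NAS-B", "disease": "cancer", "stage": "stage-2", "accuracy": 0.87}

def hnas_predict(image, models):
    """Each NAS model scores the image in parallel."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda model: model(image), models))

def select_optimal(predictions):
    """The optimal model is the one with the highest output accuracy score."""
    return max(predictions, key=lambda p: p["accuracy"])

predictions = hnas_predict("image 1", [nas_model_a, nas_model_b])
print(select_optimal(predictions))  # -> the NAS-A prediction (accuracy 0.91)
```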
[030] A network/cloud admin may upload true positive low pixel images. Further, each of the true positive low pixel images is annotated with disease metadata, such as disease name, type, stage, etc. The true positive low pixel images are mapped with MRF-CRF modeling of exact disease image data. Further, a relevant disease image pattern is generated. Further, the disease metadata is mapped with a failed image to obtain combined data. By way of an example, the combined data may include the failed image with a type of disease, an associated stage of the disease, and other data. Each of the failed images is tuned and enhanced to the exact disease image pattern based on enhanced failed image metadata. Further, the Siamese deep detect system for rejected images 208 sends the combined data to the image database update and optimizer 207.
[031] The image database update and optimizer 207 receives the combined data from the Siamese deep detect system for rejected images 208 and updates database with the enhanced failed image and the metadata. Further, the image database update and optimizer 207 sends the combined data to the NAS-based image system prediction and recommender system 205 for a final disease prediction and a treatment recommendation.
[032] It should be noted that all such aforementioned modules 201 – 210 may be represented as a single module or a combination of different modules. Further, as will be appreciated by those skilled in the art, each of the modules 201 – 210 may reside, in whole or in parts, on one device or multiple devices in communication with each other. In some embodiments, each of the modules 201 – 210 may be implemented as dedicated hardware circuit comprising custom application-specific integrated circuit (ASIC) or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components. Each of the modules 201 – 210 may also be implemented in a programmable hardware device such as a field programmable gate array (FPGA), programmable array logic, programmable logic device, and so forth. Alternatively, each of the modules 201 – 210 may be implemented in software for execution by various types of processors (e.g., processor 102). An identified module of executable code may, for instance, include one or more physical or logical blocks of computer instructions, which may, for instance, be organized as an object, procedure, function, or other construct. Nevertheless, the executables of an identified module or component need not be physically located together but may include disparate instructions stored in different locations which, when joined logically together, include the module and achieve the stated purpose of the module. Indeed, a module of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices.
[033] As will be appreciated by one skilled in the art, a variety of processes may be employed for identifying a disease through image processing. For example, the exemplary system 100 and the associated disease prediction device 101 may identify a disease through image processing by the processes discussed herein. In particular, as will be appreciated by those of ordinary skill in the art, control logic and/or automated routines for performing the techniques and steps described herein may be implemented by the system 100 and the associated disease prediction device 101 either by hardware, software, or combinations of hardware and software. For example, suitable code may be accessed and executed by the one or more processors on the system 100 to perform some or all of the techniques described herein. Similarly, application specific integrated circuits (ASICs) configured to perform some or all of the processes described herein may be included in the one or more processors on the system 100.
[034] Referring now to FIGS. 3A and 3B, a detailed block diagram of an exemplary system 300 for identifying a disease through image processing is illustrated, in accordance with some embodiments of the present disclosure. In an embodiment, the system 300 is analogous to the system 200. The system 300 includes a host/cloud/IoT 301, a predefined image database 302, a pretrained Batch Image Process (BIP) engine 303, a pretrained Real-time Classifier Engine (RCE) 304, an IUPD 305, an Intelligent NAS-based Image Processing System (iNAS) 306, a DESIDMS 307, an image/non-image storage 308, a batch and real-time low pixel image infuser with disease metadata 309, and a Similarity Deep Detect System (SDDS) 310.
[035] The host/cloud/IoT 301 is a data source that provides image data or non-image data corresponding to a patient in the form of at least one of batch input 325 and real-time data 326. By way of an example, the host/cloud/IoT 301 may include, but may not be limited to, IoT image capturing devices, a network-controlled CT scan bed, a cloud-connected hospital network, emergency public centers, and the like. The host/cloud/IoT 301 may be places for registration for diagnosis, or storage databases for mass disease images in big data systems (such as, Kubernetes, HBase, NoSQL, SQL, etc.), and the like. The batch input 325 is sent to the pretrained BIP engine 303 and the real-time data 326 is sent to the pretrained RCE 304. Each of the pretrained BIP engine 303 and the pretrained RCE 304 may reject undesirable or out-of-bound image data and non-image data to obtain processed data. For example, an aircraft image received from the host/cloud/IoT 301 is undesirable image data for disease detection. Further, the pretrained BIP engine 303 and the pretrained RCE 304 send the processed data to the IUPD 305.
[036] The IUPD 305 is a sub-system to identify necessary images for classification by a proposed NAS-Siamese system. The IUPD 305 analyses the processed data and creates a unique pattern. It may be noted that a requirement for the unique pattern may be received from a configuration or detected by the IUPD algorithm. Additionally, the IUPD 305 adds labels to unlabeled image data. Further, the IUPD 305 generates a ranking matrix (such as, [disease 1, image 1, accuracy], [disease 1, image 2, accuracy], etc.) for disease-specific images to obtain preprocessed data. Further, the preprocessed data is sent to the iNAS 306.
[037] The iNAS 306 includes a non-image NAS-Siamese disease recommender 311, a multi-input image pipe 312, an image and non-image segregation and rectification unit 313, an ensemble image processor (EIP) 314, an image database update and optimizer 315, an iNAS disease prediction and recommend system 316, an on-demand disease prediction and recommender module 317, a H-NAS Single Image Multi Out (SIMO)-based deep learn image classifier 318, a Failed Image Output (FIO) 319, an SIMO 320, and an NAS Siamese Interpreter (NSI) 321.
[038] The iNAS 306 is a self-adjusting H-NAS system based on response data from the DESIDMS 307. The image and non-image segregation and rectification unit 313 receives the preprocessed data (including multiple images from the multi-input image pipe 312) from the IUPD 305. Further, the image and non-image segregation and rectification unit 313 analyses the preprocessed data. The image and non-image segregation and rectification unit 313 segregates medical images and non-image data in the preprocessed data. It may be noted that the non-image data may be processed by the non-image NAS-Siamese disease recommender 311.
[039] Further, the image and non-image segregation and rectification unit 313 and the EIP 314 perform fine mapping of the medical images defined by the system 300 or the medical images available in the image/non-image storage 308. The EIP 314 may include an ensemble of convolutional neural networks or simple image processing algorithms to send an improvised image ranking output to the H-NAS SIMO-based deep learn image classifier 318.
[040] The H-NAS SIMO-based deep learn image classifier 318 includes a plurality of NAS models (such as, DI-NAS, ResNet, NAS-DIP, PNAS, RNN, DARTS, V-NAS, etc.). It may be noted that the H-NAS SIMO-based deep learn image classifier 318 may include various open-source as well as proprietary NAS algorithms. The H-NAS SIMO-based deep learn image classifier 318 receives a set of images from the EIP 314. Further, each of the medical images may be processed by multiple NAS models in parallel. It may be noted that the plurality of NAS models in the H-NAS SIMO-based deep learn image classifier 318 may be pretrained based on disease processing requirements. The H-NAS SIMO-based deep learn image classifier 318 provides a prediction output for each of the medical images. Each of the plurality of NAS models predicts a disease from a plurality of diseases for each of the medical images.
[041] It may be noted that an output of the H-NAS SIMO-based deep learn image classifier 318 may include multiple disease-based image predictions for each of the plurality of NAS models. In an embodiment, each of the medical images includes ‘n’ predictions corresponding to ‘n’ NAS models with accuracy parameters such as, precision, output accuracy score, number of records, and the like. Therefore, a single medical image is mapped to the plurality of NAS models with the accuracy parameters and sent to the DESIDMS 307 via the SIMO 320. It may be noted that each output includes prediction metadata for one NAS model.
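By way of an illustrative, non-limiting example, the following Python sketch shows one possible representation of the SIMO output described above, in which a single medical image carries one prediction record per NAS model, each with its accuracy parameters. The field names are assumptions for illustration only.

```python
# Minimal sketch: a single-image, multi-output (SIMO) record in which one
# image maps to 'n' NAS predictions, each carrying its accuracy parameters.
from dataclasses import dataclass, field
from typing import List

@dataclass
class NasPrediction:
    nas_name: str       # e.g. "DARTS" or "PNAS"
    disease: str
    precision: float
    accuracy: float     # output accuracy score
    num_records: int

@dataclass
class SimoRecord:
    image_id: str
    predictions: List[NasPrediction] = field(default_factory=list)

record = SimoRecord("image 1")
record.predictions.append(NasPrediction("DARTS", "cancer", 0.93, 0.91, 1200))
record.predictions.append(NasPrediction("PNAS", "cancer", 0.90, 0.87, 1200))
print(record)  # the whole record would be forwarded to the DESIDMS for ranking
```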
[042] The DESIDMS 307 predicts a set of stages for a specific disease. Further, the DESIDMS 307 recommends the iNAS 306 to use a specific NAS model for the specific disease and each of the set of stages associated with the specific disease in real-time via a ranking matrix. It should be noted that the ranking matrix is based on the accuracy parameters (such as, the output accuracy score) of each of the plurality of NAS models for the specific disease and each of the set of stages associated with the specific disease. The DESIDMS 307 aids in improving accuracy of disease detection, prediction, and functionality recommendation by the iNAS 306. Using feedback from the DESIDMS 307, the iNAS disease prediction and recommend system 316 improves disease classification, prediction and recommends a medical procedure for the predicted disease or predicted stage of the predicted disease. Further, the DESIDMS 307 sends the ranking matrix to the iNAS disease prediction and recommend system 316 via the NSI 321.
[043] The NSI 321 processes the ranking matrix received from the DESIDMS 307. Further, the NSI 321 maps the disease with an optimal NAS model from the plurality of NAS models. For example, when a NAS model predicts a number of stages of the disease with high accuracy, and when the number of stages from the set of stages of the disease is above a predefined threshold number, the NSI 321 recommends the NAS model for future data prediction of the disease associated with the set of stages. Therefore, the NSI 321 fine-tunes disease-NAS use for real-time data with an optimal level of accuracy.
[044] The NSI 321 sends the data to the iNAS disease prediction and recommend system 316. Further, the iNAS 306 determines a NAS model from the plurality of NAS models for future image analysis for the disease. The iNAS 306 may send an acknowledgement-based software agent to the H-NAS SIMO-based deep learn image classifier 318 and the DESIDMS 307 via the NSI 321 with embedded data. The embedded data may include accuracy, real-time data, accuracy for the recommender (i.e., treatment recommendation for a disease), accuracy for a disease sub-stage, etc. The DESIDMS 307 and the H-NAS SIMO-based deep learn image classifier 318 may be idle when the iNAS 306 correctly predicts the disease for an input image. When the prediction accuracy falls below a threshold accuracy parameter, the iNAS 306 enables the DESIDMS 307 and the H-NAS SIMO-based deep learn image classifier 318 to achieve better accuracy for incoming data.
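By way of an illustrative, non-limiting example, the following Python sketch shows a simple accuracy gate of the kind described above, assuming a single scalar threshold: the DESIDMS and the H-NAS classifier stay idle while predictions remain accurate, and are re-enabled when accuracy drifts below the threshold. The class and threshold value are hypothetical.

```python
# Minimal sketch: enable the DESIDMS/H-NAS deep path only when the
# recent prediction accuracy falls below a threshold accuracy parameter.
THRESHOLD_ACCURACY = 0.90  # assumed value, for illustration only

class AccuracyGate:
    def __init__(self):
        self.deep_path_enabled = False  # DESIDMS and H-NAS classifier idle

    def update(self, recent_accuracy: float) -> bool:
        """Re-enable the deep path when accuracy drops below the threshold."""
        self.deep_path_enabled = recent_accuracy < THRESHOLD_ACCURACY
        return self.deep_path_enabled

gate = AccuracyGate()
print(gate.update(0.95))  # False: prediction is accurate, keep DESIDMS idle
print(gate.update(0.82))  # True: re-enable DESIDMS and H-NAS re-ranking
```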
[045] The SDDS 310 includes a CRF-based image segmentation system 322, a multistage Siamese image/non-image tune and calibrate system 323, and an On-Train Image Enhancer Neural System (OTIE) 324.
[046] The SDDS 310 receives the failed image output 319 from the iNAS 306. The failed image output 319 may include false positive medical images, false negative medical images, or poor resolution medical images which were not classified by the iNAS 306. Further, the SDDS 310 transfers the failed image output 319 to the multistage Siamese image/non-image tune and calibrate system 323. Each of the failed images in the failed image output 319 is associated with metadata, such as patient name, hospital, disease category, and the like, for the disease that was tested.
[047] Before the next stage, the SDDS 310 processes the failed images to obtain relevant object images and close patterns of the relevant object images using an Image Deep Match Detector and Aggregator (IDDA) system. The IDDA may be enabled to detect possible positive cases from admin-based metadata information. Further, the IDDA determines image similarity. When a large pattern is identified, the IDDA sends the pattern to an optimizer stage. The IDDA helps in avoiding false positive and false negative pattern mapping to achieve image classification with increased accuracy.
[048] Further, images rejected by NAS/CNN are sent to the SDDS 310. The SDDS 310 retrieves optimized true positive images. Further, the SDDS 310 enhances the images and sends the images to an optimizer, and finally to the iNAS 306.
[049] Additionally, failed images are sent to the SDDS 310. Rejected images of the disease may be processed. In general, a failed image recognition implies incorrect mapping of a reference image. However, the failed image recognition may not imply that the image does not belong to a respective disease family. In some exemplary scenarios, due to image blur or distortion, an important image may be rejected as failed. In such scenarios, considerable damage to correct patient disease prediction and treatment recommendation may be caused, which may lead to incorrect guidance of a patient.
[050] In some embodiments, the multistage Siamese image/non-image tune and calibrate system 323 performs two functions. Firstly, the multistage Siamese image/non-image tune and calibrate system 323 uses a multistage Siamese deep analysis to check a pattern associated with each of the failed images. Secondly, the multistage Siamese image/non-image tune and calibrate system 323 maps the pattern with image segmentation uploaded by a network/cloud admin 327 and a pattern associated with the image segmentation.
[051] The network/cloud admin 327 uploads true positive low pixel images. Further, each of the true positive low pixel images is annotated with disease metadata, such as disease name, type, stage, etc., by the batch and real-time low pixel image infuser with disease metadata 309. The CRF-based image segmentation system 322 receives the true positive low pixel images through the batch and real-time low pixel image infuser with disease metadata 309.
[052] Further, the CRF-based image segmentation system 322 maps the true positive low pixel images with MRF-CRF modeling of exact disease image data. Further, the CRF-based image segmentation system 322 generates a relevant disease image pattern.
[053] The multistage Siamese image/non-image tune and calibrate system 323 sends a tuned failed image to the CRF-based image segmentation system 322. Further, the CRF-based image segmentation system 322 performs a 3-step image modeling. The CRF-based image segmentation system 322 arranges failed images, true positive images, and relevant disease images into an embedded matrix (such as, [failed image, true positive image, relevant disease image, disease metadata]). It may be noted that the failed images, true positive images, and relevant disease images are three image patterns. Further, the CRF-based image segmentation system 322 applies MRF-CRF modeling simultaneously on each of the three image patterns. Further, the CRF-based image segmentation system 322 filters out relevant image files from the three image patterns using a CRF image process. Further, the CRF-based image segmentation system 322 maps the disease metadata with a failed image and sends the failed image with a type of disease, an associated stage of the disease, and other data to the OTIE 324.
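By way of an illustrative, non-limiting example, the following Python sketch mirrors the structure of the 3-step modeling described above: the three image patterns and the disease metadata are arranged into an embedded matrix row, and the metadata is attached to the failed image when the patterns agree. The MRF-CRF step itself is abstracted behind a placeholder affinity function; all names and values are hypothetical.

```python
# Minimal sketch: embedded matrix [failed, true positive, relevant, metadata]
# with a placeholder standing in for MRF-CRF pattern affinity.
def pattern_affinity(image_a, image_b):
    """Placeholder for MRF-CRF-based affinity; returns a score in [0, 1]."""
    return 1.0 if image_a["pattern"] == image_b["pattern"] else 0.0

def map_disease_metadata(failed, true_positive, relevant, metadata, min_affinity=0.5):
    """Attach disease metadata to the failed image when all three patterns agree."""
    embedded_row = [failed, true_positive, relevant, metadata]
    score = min(pattern_affinity(failed, true_positive),
                pattern_affinity(failed, relevant))
    if score >= min_affinity:
        failed = dict(failed, **metadata)  # enhanced failed image metadata
    return failed, embedded_row

failed = {"id": "img_f1", "pattern": "lesion-A"}
true_positive = {"id": "img_tp1", "pattern": "lesion-A"}
relevant = {"id": "img_r1", "pattern": "lesion-A"}
metadata = {"disease": "cancer", "stage": "stage-2"}
print(map_disease_metadata(failed, true_positive, relevant, metadata)[0])
```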
[054] The OTIE 324 receives enhanced failed image metadata from the CRF-based image segmentation system 322. Further, the OTIE 324 tunes and enhances each of the failed images to the exact disease image pattern based on the enhanced failed image metadata. Further, the OTIE 324 verifies the enhanced failed image with metadata such as, patient ID, hospital ID, etc. Further, the OTIE 324 combines the enhanced failed image with the metadata to obtain combined data. Further, the OTIE 324 sends the combined data to the image database update and optimizer 315.
[055] The image database update and optimizer 315 receives the combined data from the OTIE 324 and updates database with the enhanced failed image and the metadata. Further, the image database update and optimizer 315 sends the combined data to the iNAS disease prediction and recommend system 316 for a final disease prediction and a treatment recommendation.
[056] The on-demand disease prediction and recommender module 317 performs disease prediction and treatment suggestion based on a pretrained disease-treatment mapping algorithm. It may be noted that the system 300 may enhance the accuracy of image processing and the optimal selection of deep learning architectures and algorithms using a multistage Siamese and NAS combination. Additionally, the system 300 may perform a deep analysis of failed or rejected images and develop a mechanism to detect true positive images from failed images using the SDDS 310.
[057] A scenario of a patient with a wrong disease prediction may be avoided and accuracy of disease detection and treatment recommendation may be enhanced through the system 300. As will be appreciated, the system 300 may be applied in domains other than medical and healthcare domains which make use of automated image processing and recognition such as, but not limited to, emotion monitoring of a driver in a vehicle for road safety, industrial manufacturing, supply chain, and the like.
[058] Referring now to FIG. 4, a block diagram of an exemplary IUPD 400 is illustrated, in accordance with some embodiments of the present disclosure. In an embodiment, the IUPD 400 is analogous to the IUPD 305 of the system 300. The IUPD 400 includes an image mapping module 401, a labeled specific image train module 402, an intelligent label predict module 403, a disease image database 404, an open standard-based image classification, pattern mapping module 405, and an image-based ranking system 406. The IUPD 400 is a sub-system to identify necessary images for classification by a proposed NAS-Siamese system. The IUPD 400 analyses the processed data and creates a unique pattern. It may be noted that a requirement for the unique pattern may be received from a configuration or detected by the IUPD algorithm.
[059] The image mapping module 401 receives image input 407 via a labeled image pipe 408 and an unlabeled image pipe 409. Further, the image mapping module 401 makes a robust image/non-image dataset. It should be noted that images labeled with a disease pattern and inaccurate images are separated as secondary images. In some embodiments, the image mapping module 401 equates labeled images with unlabeled images and makes a separate list of each of a set of labels.
[060] The labeled specific image train module 402 receives labeled secondary images and unlabeled secondary images during the training stage. Further, the labeled specific image train module 402 trains the predictive model for image label accuracy. Further, upon completing the training stage of the predictive model, the labeled specific image train module 402 sends future images to the intelligent label predict module 403 for an automated image label prediction.
[061] Further, the open standard-based image classification, pattern mapping module 405 receives the image dataset for additional image processing and accuracy improvement. In an embodiment, when the image label accuracy of the predictive model is above a predefined threshold accuracy (for example, 95%), the image-based ranking system 406 bypasses the labeled specific image train module 402, the intelligent label predict module 403, and the open standard-based image classification, pattern mapping module 405. In such an embodiment, when the image label accuracy of the predictive model is above the predefined threshold accuracy, the image-based ranking system 406 activates a virtual image intelligent agent within the open standard-based image classification, pattern mapping module 405. Further, the virtual image intelligent agent handles image classification of incoming labeled and unlabeled data. Additionally, the virtual image intelligent agent provides feedback to the image-based ranking system 406, which generates a ranking matrix (such as, [disease 1, image 1, accuracy], [disease 1, image 2, accuracy], etc.) for disease-specific images to obtain preprocessed data. Further, the preprocessed data is sent to an iNAS (for example, the iNAS 306). As will be appreciated, each of the modules of the IUPD 400 may aid in obtaining higher accuracy images for the iNAS-based image filtering system.
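By way of an illustrative, non-limiting example, the following Python sketch shows the bypass logic described above, assuming a 95% threshold: once the predictive model's label accuracy exceeds the threshold, the heavier training and classification stages are skipped and the virtual agent labels incoming images directly. All functions are illustrative placeholders.

```python
# Minimal sketch: route incoming images through the full pipeline or the
# virtual image intelligent agent, depending on model label accuracy.
THRESHOLD_ACCURACY = 0.95  # predefined threshold accuracy (e.g., 95%)

def full_pipeline(image):
    # train module -> label predict module -> open standard-based classifier
    return {"image": image, "label": "disease 1", "path": "full pipeline"}

def virtual_agent(image):
    # direct classification by the activated virtual image intelligent agent
    return {"image": image, "label": "disease 1", "path": "virtual agent"}

def route(image, model_label_accuracy):
    if model_label_accuracy > THRESHOLD_ACCURACY:
        return virtual_agent(image)  # bypass the heavier stages
    return full_pipeline(image)

print(route("image 1", 0.97)["path"])  # virtual agent
print(route("image 2", 0.90)["path"])  # full pipeline
```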
[062] Referring now to FIG. 5, a block diagram of an exemplary H-NAS system 500 is illustrated, in accordance with some embodiments of the present disclosure. In an embodiment, the H-NAS system 500 is analogous to the H-NAS SIMO-based deep learn image classifier 318 of the system 300. The H-NAS system 500 includes a plurality of NAS models (for example, a NAS algorithm 502a, a NAS algorithm 502b, a NAS algorithm 502c, a NAS algorithm 502d, and a NAS algorithm 502e). By way of an example, the plurality of NAS models may include, but may not be limited to, DI-NAS, ResNet, NAS-DIP, PNAS, RNN, DARTS, V-NAS, and the like. The H-NAS system 500 may include open-source NAS algorithms, proprietary NAS algorithms, or a combination thereof. The H-NAS system 500 receives a set of medical images from an ensemble model (such as, the EIP 314 of the system 300) in the form of ensembled input images 501. Each of the ensembled input images 501 is processed by multiple NAS architectures in parallel.
[063] Each of the plurality of NAS models may be pretrained based on disease processing requirements. Further, each of the plurality of NAS models provides a prediction output for each of the ensembled input images 501. Further, each of the plurality of NAS models provides image prediction for a specific disease. Further, the H-NAS system 500 provides a NAS predicted output 503. It may be noted that the NAS predicted output 503 is a multiple disease-based image prediction for each of the plurality of NAS models. In an embodiment, each of the plurality of diseases includes ‘n’ NAS predictions with accuracy parameters such as, precision, accuracy, number of records, etc. Therefore, a disease pattern is mapped to multiple NAS models with accuracy parameters and is sent to a DESIDMS (such as, the DESIDMS 307) via a SIMO (such as, the SIMO 320). In some embodiments, the NAS predicted output 503 includes metadata prediction information corresponding to each of the plurality of NAS models.
[064] Referring now to FIG. 6, a block diagram of an exemplary DESIDMS 600 is illustrated, in accordance with some embodiments of the present disclosure. In an embodiment, the DESIDMS 600 is analogous to the DESIDMS 307. The DESIDMS 600 includes an H-NAS-Siamese disease classifier 601 and a plurality of DNAS-Siamese processors (DNSPs) (for example, DNSP 602, DNSP 603, DNSP 604, DNSP 605, DNSP 606, and DNSP 607). Each of the plurality of DNSPs includes one or more Sub-Siamese Blocks (SSBs) (for example, SSB 608a and SSB 608b in the DNSP 602). Further, the DESIDMS 600 includes a Siamese ranking system 609 and a deep Siamese image rectification unit 610.
[065] The DESIDMS 600 predicts a set of stages for a specific disease. Further, the DESIDMS 600 recommends an iNAS (for example, the iNAS 306) to use a specific NAS model for predicting the specific disease and each of the set of stages associated with the specific disease in real-time. The DESIDMS 600 aids in improving accuracy of disease detection, prediction, and functionality recommendation by the iNAS. Using feedback from the DESIDMS 600, the iNAS improves disease classification, prediction, and recommends a medical procedure for the predicted disease or predicted stage of the predicted disease.
[066] The H-NAS-Siamese disease classifier 601 classifies the disease through a plurality of NAS models. The H-NAS-Siamese disease classifier 601 receives output from iNAS SIMO. Further, the H-NAS-Siamese disease classifier 601 determines a disease pattern and associates the disease pattern with records. Further, the H-NAS-Siamese disease classifier 601 identifies a disease from a plurality of diseases using input from each of the plurality of NAS models. Further, the H-NAS-Siamese disease classifier 601 maps disease metadata with NAS metadata. It may be noted that each of the plurality of diseases may include a plurality of images and associated NAS accuracy parameters. Each of the plurality of images of a disease may include subcategories or substages (for example, Cancer-stage-1, Cancer-stage-2, etc.).
[067] Further, the H-NAS-Siamese disease classifier 601 selects NAS with a disease label to obtain classified disease-NAS image data. Further, the H-NAS-Siamese disease classifier 601 sends the classified disease-NAS image data and disease image metadata to a next-stage DNSP.
[068] Each of the plurality of DNSPs is a next-level disease image process system. In a DNSP, each disease is divided into sub-disease patterns using image rectification and deep processing algorithms. Output of each of the plurality of NAS models for a specific disease image may be different. Therefore, the DNSP may map disease sub-stage images with parameters corresponding to each of the plurality of NAS models. Based on analysis, each sub-stage of a disease may include multiple NAS accuracy parameters. For example, NAS accuracy parameters for Cancer stage-1 may include [disease 1(cancer), substage-1, image 1, NAS-1, accuracy], [disease 1(cancer), substage-1, image 1, NAS-2, accuracy], [disease 1(cancer), substage-2, image 10, NAS-3, accuracy], etc.
[069] The SSB 608a and the SSB 608b map disease sub-stage images with NAS prediction data. It may be noted that each of the plurality of DNSPs may include one or more SSBs. However, for ease of understanding, only the SSBs within the DNSP 602 are shown in the figure. Further, the deep Siamese image rectification unit 610 helps the one or more SSBs to precisely identify the disease sub-stages from an incoming disease cluster. In some embodiments, each of the one or more SSBs uses one-shot learning or multi-shot Siamese image processing algorithms.
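By way of an illustrative, non-limiting example, the following Python sketch shows one-shot matching of the kind an SSB might perform, assuming each sub-stage has a single reference embedding: the incoming image embedding is compared against the references and the nearest sub-stage is selected. The toy vectors stand in for real Siamese network outputs.

```python
# Minimal sketch: one-shot sub-stage identification by nearest reference embedding.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def one_shot_substage(query, references):
    """references: {substage: embedding}; return the closest sub-stage."""
    return min(references, key=lambda name: euclidean(query, references[name]))

references = {"cancer-stage-1": [0.1, 0.9], "cancer-stage-2": [0.8, 0.2]}
print(one_shot_substage([0.15, 0.85], references))  # -> cancer-stage-1
```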
[070] Further, each of the plurality of diseases and the associated sub-stages are mapped as a feature matrix with the accuracy parameters corresponding to each of the plurality of NAS models. Further, the Siamese ranking system 609 assigns a rank to each of the sub-stages of the disease based on the accuracy parameters associated with each of the plurality of NAS models. Therefore, each of the plurality of diseases may be assigned an optimal NAS model to use for future classification operations.
[071] Further, sub-stage mapping for each of the plurality of diseases is performed through the optimal NAS model to generate a Siamese ranking matrix such as, [disease 1(cancer), substage-1, NAS, NAS-Ranking, accuracy, imageid]. Further, the Siamese ranking matrix is sent to an iNAS disease prediction and recommend system (for example, the iNAS disease prediction and recommend system 316 of the system 300) via an NSI (for example, the NSI 321 of the system 300).
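By way of an illustrative, non-limiting example, the following Python sketch shows how rows shaped like the matrices above might be ranked per sub-stage by accuracy to yield a Siamese ranking matrix of the form [disease, substage, NAS, NAS-Ranking, accuracy, imageid]. The row shapes and values are assumptions for illustration.

```python
# Minimal sketch: rank NAS models per (disease, sub-stage) by accuracy.
from collections import defaultdict

rows = [
    ("cancer", "substage-1", "NAS-1", 0.91, "image 1"),
    ("cancer", "substage-1", "NAS-2", 0.87, "image 1"),
    ("cancer", "substage-2", "NAS-3", 0.93, "image 10"),
]

def siamese_ranking_matrix(rows):
    by_substage = defaultdict(list)
    for disease, substage, nas, accuracy, image_id in rows:
        by_substage[(disease, substage)].append((nas, accuracy, image_id))
    matrix = []
    for (disease, substage), entries in by_substage.items():
        entries.sort(key=lambda e: e[1], reverse=True)  # best accuracy first
        for rank, (nas, accuracy, image_id) in enumerate(entries, start=1):
            matrix.append((disease, substage, nas, rank, accuracy, image_id))
    return matrix

for row in siamese_ranking_matrix(rows):
    print(row)
```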
[072] Referring now to FIG. 7, a block diagram of an exemplary NSI 700 is illustrated, in accordance with some embodiments of the present disclosure. In an embodiment, the NSI 700 is analogous to the NSI 321 of the system 300. The NSI 700 includes a ranking matrix generation module 701, a mapping analysis module 702, and a most frequent NAS algorithm identification module 703.
[073] The ranking matrix generation module 701 receives ranking matrix data 704 from the DESIDMS (for example, the DESIDMS 307). It may be noted that the ranking matrix data 704 includes the Siamese ranking matrix. Further, the ranking matrix generation module 701 sends the ranking matrix data 704 to the mapping analysis module 702. The mapping analysis module 702 analyses the interaction between a disease, the associated set of sub-stages, and each of a plurality of NAS models in an iNAS (for example, the iNAS 306). Further, the mapping analysis module 702 sends interaction analysis data to the most frequent NAS algorithm identification module 703. The most frequent NAS algorithm identification module 703 identifies the most frequently used NAS model from the plurality of NAS models for the disease based on the interaction of each of the associated set of sub-stages with each of the plurality of NAS models.
[074] Further, the most frequent NAS algorithm identification module 703 selects an optimal NAS model for future disease image prediction. For example, when the number of sub-stages of a disease mapped to the same NAS model is higher than a predefined threshold, the NSI 700 recommends that NAS model for future data prediction and treatment recommendation for the disease. When an optimal NAS for the set of sub-stages of the disease is not determined, the most frequent NAS algorithm identification module 703 notifies the iNAS to use the sub-stage-NAS mapping for each of the set of sub-stages of the disease. Therefore, the NSI 700 fine-tunes disease-NAS use for real-time data with an improved level of accuracy.
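The most-frequent-model rule with its per-sub-stage fallback lends itself to a compact sketch. The mapping layout and threshold semantics below are assumptions for illustration only:

```python
from collections import Counter

def recommend_nas(substage_to_nas: dict, threshold: int):
    """If one NAS model covers more than `threshold` sub-stages of a
    disease, recommend it for all future predictions of that disease;
    otherwise fall back to the per-sub-stage NAS mapping."""
    counts = Counter(substage_to_nas.values())
    model, hits = counts.most_common(1)[0]
    if hits > threshold:
        return {"mode": "global", "nas": model}
    return {"mode": "per-substage", "mapping": dict(substage_to_nas)}

print(recommend_nas({"substage-1": "NAS-2", "substage-2": "NAS-2",
                     "substage-3": "NAS-1"}, threshold=1))
# {'mode': 'global', 'nas': 'NAS-2'}
```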
[075] Referring now to FIGS. 8A and 8B, an exemplary process 800 for identifying a disease through image processing is depicted via a flowchart, in accordance with some embodiments of the present disclosure. In an embodiment, the process 800 is implemented by the disease prediction device 101 of the system 100. The process 800 includes receiving image data corresponding to a patient from a data source, at step 801. It should be noted that the image data includes labeled image data and unlabeled image data. Each of the labeled image data includes a label. The data source is one of a batch data source or a real-time data source. Further, the process 800 includes preprocessing the image data through a predictive model to obtain a disease-specific image dataset, at step 802. The predictive model is based on an image processing algorithm. Each of the plurality of diseases includes a set of stages. In some embodiments, a label for each of the unlabeled image data is determined through the predictive model, which is trained with the labeled image data for this purpose. It may be noted that each of the image data in the disease-specific image dataset is validated through an ensemble model.
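The pseudo-labeling behavior described above (train on the labeled portion, then label the unlabeled portion) can be sketched as follows. This assumes, purely for illustration, that image features have already been extracted into fixed-length vectors, and uses scikit-learn's LogisticRegression as a stand-in for the predictive model, whose form the disclosure does not pin down.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature vectors standing in for preprocessed image data.
X_labeled = np.random.rand(100, 32)
y_labeled = np.random.randint(0, 2, size=100)  # labels of the labeled portion
X_unlabeled = np.random.rand(40, 32)

# Train the predictive model on the labeled image data ...
model = LogisticRegression(max_iter=1000).fit(X_labeled, y_labeled)
# ... then determine a label for each of the unlabeled image data.
pseudo_labels = model.predict(X_unlabeled)
```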
[076] Further, the step 802 of the process 800 includes identifying a set of patterns in each of the image data through a pattern recognition model, at step 803. Further, the step 802 of the process 800 includes assigning a rank to each of the image data corresponding to each of a plurality of diseases based on an accuracy score of the pattern recognition model, at step 804. The accuracy score is based on the set of patterns identified. Further, the step 802 of the process 800 includes generating a ranking matrix based on the ranking assigned, at step 805. The ranking matrix includes the accuracy score associated with each of the image data and each of the plurality of diseases. Further, the step 802 of the process 800 includes creating a disease-specific image dataset based on the ranking matrix, at step 806. The accuracy score associated with each of the image data in the disease-specific image dataset is above a predefined threshold score.
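Steps 805 and 806 reduce to a threshold filter over the ranking matrix. A minimal sketch, assuming the matrix rows are dictionaries with an `accuracy` field (a hypothetical layout):

```python
def create_disease_specific_dataset(ranking_matrix, threshold: float):
    """Keep only images whose accuracy score from the pattern recognition
    model exceeds the predefined threshold (steps 805-806)."""
    return [row for row in ranking_matrix if row["accuracy"] > threshold]

matrix = [
    {"image_id": "img-001", "disease": "cancer", "accuracy": 0.91},
    {"image_id": "img-002", "disease": "cancer", "accuracy": 0.42},
]
dataset = create_disease_specific_dataset(matrix, threshold=0.8)
# dataset retains only img-001
```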
[077] By way of an example, the IUPD 305 receives labeled image data and unlabeled image data corresponding to a patient from the host/cloud/IoT 301 and the predefined image database 302. Further, the IUPD 305 identifies a set of patterns in each of the image data through a pattern recognition model. Further, the IUPD 305 assigns a rank to each of the image data corresponding to each of a plurality of diseases based on an accuracy score of the pattern recognition model. Further, the IUPD 305 generates a ranking matrix based on the ranking assigned and creates a disease-specific image dataset with images ranked above a predefined threshold rank.
[078] Further, for each image in the disease-specific image dataset, the process 800 includes identifying a disease associated with the patient from a plurality of diseases using the disease-specific image dataset through an H-NAS model, at step 807. The H-NAS model includes a plurality of NAS models. Each of the plurality of NAS models provides an output accuracy score for an image. Further, for each image in the disease-specific image dataset, the process 800 includes determining, for each of the plurality of NAS models, a stage from the set of stages associated with the identified disease from the image based on the output accuracy score, at step 808. In continuation of the example above, the IUPD 305 sends the disease-specific image dataset to the iNAS 306. The iNAS 306 includes the HNAS SIMO-based deep learn image classifier 318. The HNAS SIMO-based deep learn image classifier 318 includes a plurality of NAS models. For each image in the disease-specific image dataset, each of the plurality of NAS models predicts a disease with an output accuracy score. Further, each of the plurality of NAS models predicts a stage of the predicted disease.
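A minimal sketch of steps 807 and 808 follows, assuming each trained NAS model can be invoked as a callable that returns a (disease, stage, score) triple; the lambda stand-ins below are purely illustrative, not trained models.

```python
def hnas_classify(image, nas_models):
    """Run every NAS model in the H-NAS ensemble on one image; each model
    contributes a (disease, stage, output accuracy score) prediction
    (steps 807-808)."""
    predictions = []
    for name, model in nas_models.items():
        disease, stage, score = model(image)
        predictions.append({"nas": name, "disease": disease,
                            "stage": stage, "score": score})
    return predictions

# Toy stand-ins for trained NAS models; real models would consume the image.
nas_models = {
    "NAS-1": lambda img: ("cancer", "stage-1", 0.93),
    "NAS-2": lambda img: ("cancer", "stage-2", 0.88),
}
print(hnas_classify(object(), nas_models))
```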
[079] Further, for each of the set of stages associated with each of a plurality of diseases, the process 800 includes assigning a rank to each of the plurality of NAS models based on the output accuracy score, at step 809. Further, for each of the set of stages associated with each of a plurality of diseases, the process 800 includes generating a ranking matrix for each of the plurality of NAS models and each of the set of stages associated with a disease from the plurality of diseases based on the rank, at step 810. Further, for each of the set of stages associated with each of a plurality of diseases, the process 800 includes using a top-ranked NAS model to identify the stage of the disease from the image data, at step 811.
[080] Further, for each of the set of stages associated with each of a plurality of diseases, the process 800 includes identifying a core disease pattern and a pattern associated with the set of stages corresponding to the core disease pattern based on the image data through a DESIDMS-based DNAS Siamese processor, at step 812. Further, for each of the set of stages associated with each of a plurality of diseases, the process 800 includes comparing an accuracy score associated with each of the plurality of NAS models for each of the set of stages of the core disease pattern through the DESIDMS-based DNAS Siamese processor, at step 813. Further, for each of the set of stages associated with each of a plurality of diseases, the process 800 includes determining a category of the disease and a rank corresponding to the category determined, based on the accuracy score associated with each of the plurality of NAS models for the core disease pattern, at step 814.
[081] Further, for each of the set of stages associated with each of a plurality of diseases, the process 800 includes selecting, by the DESIDMS-based DNAS Siamese processor, an optimal NAS model to identify the disease from the image data, at step 815. The accuracy score for the core disease pattern associated with the optimal NAS model is the highest among the plurality of NAS models. In continuation of the example above, the DESIDMS 307 receives the output accuracy score for each of the plurality of NAS models through the SIMO 320. The DESIDMS 307 assigns a rank to each of the plurality of NAS models for the disease based on the output accuracy score. Further, the DESIDMS 307 generates a ranking matrix for the plurality of NAS models for each of the plurality of diseases and the associated set of stages. Further, the DESIDMS 307 determines an optimal NAS model from the plurality of NAS models for identifying the disease for future image inputs based on the ranking matrix. The DESIDMS 307 sends the optimal NAS model recommendation to the iNAS disease prediction and recommender system 316 via the NSI 321.
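Steps 809 through 815 amount to a per-(disease, stage) argmax over the output accuracy scores. The following Python sketch is a minimal illustration of that selection; the dictionary row layout mirrors the ranking matrix described above but is otherwise hypothetical.

```python
def select_optimal_nas(ranking_matrix):
    """For each (disease, stage), pick the NAS model whose accuracy score
    for the core disease pattern is the highest (step 815)."""
    best = {}
    for row in ranking_matrix:
        key = (row["disease"], row["stage"])
        if key not in best or row["accuracy"] > best[key]["accuracy"]:
            best[key] = row
    return {key: row["nas"] for key, row in best.items()}

matrix = [
    {"disease": "cancer", "stage": "stage-1", "nas": "NAS-1", "accuracy": 0.93},
    {"disease": "cancer", "stage": "stage-1", "nas": "NAS-2", "accuracy": 0.88},
]
print(select_optimal_nas(matrix))  # {('cancer', 'stage-1'): 'NAS-1'}
```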
[082] Referring now to FIG. 9, an exemplary process 900 for identifying a disease associated with the patient from a plurality of diseases using image data is depicted via a flowchart, in accordance with some embodiments of the present disclosure. In an embodiment, the process 900 is implemented by the disease prediction device 101 of the system 100. The process 900 includes enhancing at least one of the image data when the at least one of the image data is a false positive upon classification by the H-NAS model, at step 901. Further, the step 901 of the process 900 includes receiving true positive image data and the at least one of the image data from an administrator, at step 902. Each of the true positive image data includes true positive metadata. Further, the step 901 of the process 900 includes identifying metadata associated with the disease based on the true positive image data, at step 903. Further, the step 901 of the process 900 includes determining enhanced image data from each of the true positive image data and the at least one of the image data using a CRF model based on the metadata associated with the disease, at step 904.
[083] Further, the step 901 of the process 900 includes associating the metadata associated with the disease with the at least one of the image data determined as the enhanced image data, at step 905. Further, the process 900 includes validating the at least one of the image data with metadata associated with the patient, at step 906. Further, the process 900 includes identifying a disease associated with the patient from a plurality of diseases using the at least one of the image data through the H-NAS model, at step 907.
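The enhancement loop of process 900 can be summarized as a control flow, shown in the sketch below. This is a minimal illustration only: the `enhance` callable stands in for the CRF-based enhancement of step 904, whose internals the disclosure does not specify, and the record layout is hypothetical.

```python
def enhance_false_positives(flagged_images, true_positives, enhance):
    """Control-flow sketch of process 900: for each image flagged as a
    false positive, gather disease metadata from the true-positive image
    data (step 903), derive enhanced image data via the CRF-based model
    (step 904, abstracted here as the `enhance` callable), and attach the
    disease metadata to the enhanced image (step 905)."""
    enhanced = []
    for img in flagged_images:
        disease_meta = [tp["metadata"] for tp in true_positives]
        new_img = enhance(img, true_positives, disease_meta)
        enhanced.append({"image": new_img, "metadata": disease_meta})
    return enhanced

result = enhance_false_positives(
    [{"image": "fp-001"}],
    [{"image": "tp-001", "metadata": {"disease": "cancer"}}],
    enhance=lambda img, tps, meta: img,  # identity stand-in for the CRF model
)
```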
[085] As will be also appreciated, the above described techniques may take the form of computer or controller implemented processes and apparatuses for practicing those processes. The disclosure can also be embodied in the form of computer program code containing instructions embodied in tangible media, such as floppy diskettes, solid state drives, CD-ROMs, hard drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer or controller, the computer becomes an apparatus for practicing the invention. The disclosure may also be embodied in the form of computer program code or signal, for example, whether stored in a storage medium, loaded into and/or executed by a computer or controller, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention. When implemented on a general-purpose microprocessor, the computer program code segments configure the microprocessor to create specific logic circuits.
[086] The disclosed methods and systems may be implemented on a conventional or a general-purpose computer system, such as a personal computer (PC) or server computer. Referring now to FIG. 10, an exemplary computing system 1000 that may be employed to implement processing functionality for various embodiments (e.g., as a SIMD device, client device, server device, one or more processors, or the like) is illustrated. Those skilled in the relevant art will also recognize how to implement the invention using other computer systems or architectures. The computing system 1000 may represent, for example, a user device such as a desktop, a laptop, a mobile phone, a personal entertainment device, a DVR, and so on, or any other type of special or general-purpose computing device as may be desirable or appropriate for a given application or environment. The computing system 1000 may include one or more processors, such as a processor 1001 that may be implemented using a general or special purpose processing engine such as, for example, a microprocessor, microcontroller, or other control logic. In this example, the processor 1001 is connected to a bus 1002 or other communication medium. In some embodiments, the processor 1001 may be an Artificial Intelligence (AI) processor, which may be implemented as a Tensor Processing Unit (TPU), a graphics processing unit (GPU), or a custom programmable solution such as a Field-Programmable Gate Array (FPGA).
[087] The computing system 1000 may also include a memory 1003 (main memory), for example, Random Access Memory (RAM) or other dynamic memory, for storing information and instructions to be executed by the processor 1001. The memory 1003 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 1001. The computing system 1000 may likewise include a read only memory (“ROM”) or other static storage device coupled to bus 1002 for storing static information and instructions for the processor 1001.
[088] The computing system 1000 may also include a storage device 1004, which may include, for example, a media drive 1005 and a removable storage interface. The media drive 1005 may include a drive or other mechanism to support fixed or removable storage media, such as a hard disk drive, a floppy disk drive, a magnetic tape drive, an SD card port, a USB port, a micro USB port, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive. A storage media 1006 may include, for example, a hard disk, magnetic tape, flash drive, or other fixed or removable medium that is read by and written to by the media drive 1005. As these examples illustrate, the storage media 1006 may include a computer-readable storage medium having particular computer software or data stored therein.
[089] In alternative embodiments, the storage devices 1004 may include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into the computing system 1000. Such instrumentalities may include, for example, a removable storage unit 1007 and a storage unit interface 1008, such as a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, and other removable storage units and interfaces that allow software and data to be transferred from the removable storage unit 1007 to the computing system 1000.
[090] The computing system 1000 may also include a communications interface 1009. The communications interface 1009 may be used to allow software and data to be transferred between the computing system 1000 and external devices. Examples of the communications interface 1009 may include a network interface (such as an Ethernet or other NIC card), a communications port (such as, for example, a USB port or a micro USB port), Near Field Communication (NFC), etc. Software and data transferred via the communications interface 1009 are in the form of signals which may be electronic, electromagnetic, optical, or other signals capable of being received by the communications interface 1009. These signals are provided to the communications interface 1009 via a channel 1010. The channel 1010 may carry signals and may be implemented using a wireless medium, wire or cable, fiber optics, or other communications medium. Some examples of the channel 1010 may include a phone line, a cellular phone link, an RF link, a Bluetooth link, a network interface, a local or wide area network, and other communications channels.
[091] The computing system 1000 may further include Input/Output (I/O) devices 1011. Examples may include, but are not limited to a display, keypad, microphone, audio speakers, vibrating motor, LED lights, etc. The I/O devices 1011 may receive input from a user and also display an output of the computation performed by the processor 1001. In this document, the terms “computer program product” and “computer-readable medium” may be used generally to refer to media such as, for example, the memory 1003, the storage devices 1004, the removable storage unit 1007, or signal(s) on the channel 1010. These and other forms of computer-readable media may be involved in providing one or more sequences of one or more instructions to the processor 1001 for execution. Such instructions, generally referred to as “computer program code” (which may be grouped in the form of computer programs or other groupings), when executed, enable the computing system 1000 to perform features or functions of embodiments of the present invention.
[092] In an embodiment where the elements are implemented using software, the software may be stored in a computer-readable medium and loaded into the computing system 1000 using, for example, the removable storage unit 1007, the media drive 1005 or the communications interface 1009. The control logic (in this example, software instructions or computer program code), when executed by the processor 1001, causes the processor 1001 to perform the functions of the invention as described herein.
[093] Thus, the disclosed method and system try to overcome the technical problem of identifying a disease through image processing. The method and system provide improvements to medical and non-medical image processing methods in terms of accuracy and selection of an optimal neural architecture. Further, the method and system develop an accurate disease prediction and treatment recommendation for an end user. Further, the method and system provide improvements to existing deep learning and machine learning-based image classification. Further, the method and system provide improvements to accuracy and subcategorization of image classification. Further, the method and system provide for multiple disease pattern detection, classification, sub-classification, disease detection, treatment prediction, and recommendation. Further, the method and system reduce the requirement for and cost of manual processing.
[094] As will be appreciated by those skilled in the art, the techniques described in the various embodiments discussed above are not routine, or conventional, or well understood in the art. The techniques discussed above provide for identifying a disease through image processing. The techniques first receive image data corresponding to a patient from a data source. The image data includes labeled image data and unlabeled image data. Each of the labeled image data includes a label. The data source is one of a batch data source or a real-time data source. The techniques then preprocess the image data through a predictive model to obtain a disease-specific image dataset. The predictive model is based on an image processing algorithm. For each image in the disease-specific image dataset, the techniques then identify a disease associated with the patient from a plurality of diseases using the disease-specific image dataset through a Hybrid Neural Architecture Search (H-NAS) model. The H-NAS model includes a plurality of NAS models. Each of the plurality of NAS models provides an output accuracy score for an image. Each of the plurality of diseases includes a set of stages. For each image in the disease-specific image dataset, the techniques then, for each of the plurality of NAS models, determine a stage from the set of stages associated with the identified disease from the image based on the output accuracy score.
[095] In light of the above-mentioned advantages and the technical advancements provided by the disclosed method and system, the claimed steps as discussed above are not routine, conventional, or well understood in the art, as the claimed steps enable the following solutions to the existing problems in conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the device itself as the claimed steps provide a technical solution to a technical problem.
[096] The specification has described method and system for identifying a disease through image processing. The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[097] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[098] It is intended that the disclosure and examples be considered as exemplary only, with a true scope and spirit of disclosed embodiments being indicated by the following claims.