
System And Method For Processing Medical Data, Based On The Type Of Medical Data, For Predicting Medical Condition Of Subject

Abstract: In one embodiment, a system 100 for analyzing medical data based on the type of medical data for predicting a medical condition of a subject is disclosed. The system 100 comprises a processing unit 102 in communication with a host device 200, the processing unit 102 configured for: receiving medical data from the host device 200 along with information on the type of the medical data, the medical data being associated with one or more organs of a subject; processing the medical data, based on the type of the medical data, in order to detect a medical condition associated with one of the organs, using a convolutional neural network; and transmitting information concerning the medical condition to the processor of the host device 200 so as to enable the host device 200 to display information on the detected medical condition, wherein processing the medical data comprises classifying the medical data based on the information received on the type of the medical data, determining one or more abnormalities, anatomical features and artifacts based on the type of the medical data, and detecting a medical condition upon identifying the presence of one or more abnormalities in the medical data.


Patent Information

Application #: 202341027123
Filing Date: 12 April 2023
Publication Number: 47/2023
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

ARTIFICIAL LEARNING SYSTEMS INDIA PVT LTD
1665/A, 14th Main Rd, Sector 7, HSR Layout, Bengaluru, Karnataka 560102, India.

Inventors

1. Dipan Roy
34/2, 1st Link Road, Rishi Arabindo Road, Madhyamgram, Kolkata - 700130
2. Pradeep Walia
60B, Crosstie st , Knightdale, North Carolina, USA-27545
3. Girsh Somvanshi
1603, Pelican, Skylark Enclave, Hiranandani Estate,Thane (W) -400607

Specification

Description: TECHNICAL FIELD OF THE INVENTION
[0001] The invention relates in general to the field of medical data analytics. More particularly, the invention relates to a processing unit configured for analyzing different types of medical data using machine learning algorithms.

BACKGROUND OF THE INVENTION
[0002] Analysis of medical data, for example, medical images such as retinal fundus images, radiographs, etc., of a subject provides insight into the subject’s health condition. Analysis of medical data, especially medical images associated with an organ or body part, requires specific knowledge of the features and abnormalities present in the medical images to accurately determine the subject’s health condition. Diverse expertise is thus required for the analysis of medical data associated with the subject’s different organs or body parts. This diverse expertise may not be conveniently available at all times in the same place. For example, a patient may be required to visit different hospitals with medical personnel having the specific expertise to analyze medical data associated with different body parts. This causes delay, increases expense and inconveniences the patient in obtaining all the required medical information from medical data associated with different organs or body parts, thereby preventing the patient from seeking timely medical attention.

[0003] Further, healthcare data is diverse and comes in different formats. Moreover, the rapid use of wearable sensors in healthcare has resulted in a tremendous increase in the size of heterogeneous data. Effective prevention methods require integration of this data.

[0004] A vast amount of important clinical data exists in EMRs. However, much of this data remains out of reach of most researchers because it is unstructured and cannot be easily searched.

[0005] Systems such as a Picture Archival and Communication System (PACS) use protocols such as DICOM (Digital Imaging and Communications in Medicine) to deliver images to local workstations. However, data exchange through PACS relies on structured data to retrieve medical images. This by nature misses the unstructured information contained in some biomedical images. It may also miss additional information about the patient’s health status that is present in these images. A professional focused on diagnosing an unrelated condition may not be trained to observe any other emerging trend.

[0006] Hence, there exists a need to integrate the predictive nature of different medical data, including but not limited to clinical data, data from wearables, medical images and generic data, allowing the development of machine learning algorithms which aid in the prediction of the subject’s health condition. Specifically, there is a need for a single portable processor employing machine learning algorithms which is compatible with a wide spectrum of medical devices and can be used for the analysis of medical data associated with different organs or body parts of the subject.

SUMMARY OF INVENTION
[0007] This summary is provided to introduce a selection of concepts in a simplified form that are further disclosed in the detailed description of the invention. This summary is not intended to identify key or essential inventive concepts of the claimed subject matter, nor is it intended for determining the scope of the claimed subject matter.

[0008] Certain examples provide a system and a method for analyzing medical data, based on the type of medical data, for predicting a medical condition of a subject.

[0009] Accordingly, in one embodiment, a system 100 for analyzing medical data based on the type of medical data for predicting a medical condition of a subject is disclosed. The system 100 comprises a processing unit, in communication with a host device 200, the processing unit configured for: receiving medical data from the host device 200 along with information on the type of the medical data, the medical data being associated with one or more organs of a subject; processing the medical data, based on the type of the medical data, in order to detect a medical condition associated with one of the organs, using a convolutional neural network; and transmitting information concerning the medical condition to the processor of the host device 200 so as to enable the host device 200 to display information on the detected medical condition, wherein processing the medical data comprises classifying the medical data based on the information received on the type of the medical data, determining one or more abnormalities, anatomical features and artifacts based on the type of the medical data, and detecting a medical condition upon identifying the presence of one or more abnormalities in the medical data.

[0010] In another embodiment, a method for analysis of medical data of a patient is disclosed, the method comprising the steps of: receiving, by a system 100 for analyzing medical data, medical data of a patient along with information on the type of the medical data from a host device 200; processing the medical data, by a processing unit 102 using a convolutional neural network, based on the type of the medical data; detecting a medical condition associated with the medical data upon processing by the processing unit 102 using the convolutional neural network; transmitting the detected medical condition to the host device 200 by the processing unit 102; and displaying the detected medical condition by the host device 200.

[0011] In yet another embodiment, a computer program product storing computer readable instructions which, when executed by a processor, cause the processor to perform a method for analysis of medical data of a patient is disclosed, the method comprising the steps of: receiving, by a system 100 for analyzing medical data, medical data of a patient along with information on the type of the medical data from a host device 200; processing the medical data, by a processing unit 102 using a convolutional neural network, based on the type of the medical data; detecting a medical condition associated with the medical data upon processing by the processing unit 102 using the convolutional neural network; transmitting the detected medical condition to the host device 200 by the processing unit 102; and displaying the detected medical condition by the host device 200.

BRIEF DESCRIPTION OF DRAWINGS
[0012] The present invention is described with reference to the accompanying figures. The accompanying figures, which are incorporated herein, are given by way of illustration only and form part of the specification, together with the description, to explain how to make and use the invention, in which:

[0013] Figure 1 illustrates a block diagram of the system for receiving medical data of a subject, determining type of medical data, analyzing the medical data based on determination and predicting medical condition of the subject in accordance with the invention; and

[0014] Figure 2 illustrates an exemplary flowchart for the analysis of medical data of a patient in accordance with the invention.

[0015] The foregoing summary, as well as the following detailed description of certain embodiments of the present invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, certain embodiments are shown in the drawings. It should be understood, however, that the present invention is not limited to the arrangements and instrumentality shown in the attached drawings. The figures are not to scale. Wherever possible, the same reference numbers will be used throughout the drawings and the accompanying written description to refer to the same or like parts.

DETAILED DESCRIPTION OF THE INVENTION
[0016] In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific examples that may be practiced. These examples are described in sufficient detail to enable one skilled in the art to practice the subject matter, and it is to be understood that other examples may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the subject matter of this disclosure. The following detailed description is, therefore, provided to describe an exemplary implementation and not to be taken as limiting on the scope of the subject matter described in this disclosure. Certain features from different aspects of the following description may be combined to form yet new aspects of the subject matter discussed below.

[0017] When introducing elements of various embodiments of the present disclosure, the articles “a,” “an,” “the,” and “said” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements.

[0018] While certain examples are described below in the context of medical or healthcare systems, other examples can be implemented outside the medical environment. For example, certain examples can be applied to non-medical imaging such as non-destructive testing, explosive detection, etc.

[0019] In one embodiment, a system 100 for receiving medical data, determining the type of medical data and analyzing the medical data based on the type of medical data for predicting a medical condition of a subject is disclosed. The system 100 comprises an interface unit configured for receiving an input from a host device 200, the input selected from a group consisting of clinical data, image data, generic data, sensing data, an electronic health record, a public health record and textual data; a processing unit 102 configured for identifying the type of data from the input data and processing the input to determine a medical condition associated with the input data; and a memory unit 104 coupled to the processing unit 102, the memory unit 104 configured for storing the medical data along with the determined type of medical data.
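
By way of a non-limiting illustration, the split into an interface unit, a processing unit 102 and a memory unit 104 can be sketched in Python as below; the class names, field names and placeholder return value are assumptions made for explanation only and are not terminology defined in this specification.

```python
# Illustrative sketch only: the class and field names below are assumptions
# made for explanation and are not terminology defined in the specification.
from dataclasses import dataclass, field
from typing import Any, List, Optional


@dataclass
class MedicalInput:
    data: Any                     # image array, clinical text, sensor readings, etc.
    data_type: str                # e.g. "retinal_fundus_image", "lab_report"
    label: Optional[str] = None   # optional label supplied by the host device 200


@dataclass
class MemoryUnit:
    """Stores medical data together with its determined type (element 104)."""
    records: List[MedicalInput] = field(default_factory=list)

    def store(self, item: MedicalInput) -> None:
        self.records.append(item)


class ProcessingUnit:
    """Identifies the type of the input and processes it (element 102)."""

    def __init__(self, memory: MemoryUnit) -> None:
        self.memory = memory

    def process(self, item: MedicalInput) -> dict:
        self.memory.store(item)
        # The detection step itself is sketched in later paragraphs.
        return {"data_type": item.data_type, "condition": "pending analysis"}


unit = ProcessingUnit(MemoryUnit())
print(unit.process(MedicalInput(data="...", data_type="lab_report")))
```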

[0020] The interface unit may be removably coupled to the host device 200 via a communication port. The communication port is selected from a group comprising Personal System (PS), Serial Port (SP), Parallel Port (PP), Video Graphics Array (VGA), High Definition Multimedia Interface (HDMI), Digital Visual Interface (DVI), Universal Serial Bus (USB), Radio Corporation of America (RCA) and Registered Jack (RJ).

[0021] The interface unit may be configured to communicate using an Internet Protocol compatible wireless communications standard that may comprise any suitable wireless access system, e.g. Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Time Division Duplex (TDD), Orthogonal Frequency Division Multiple Access (OFDMA), or combinations of these such as CDMA/FDMA, CDMA/FDMA/TDMA and FDMA/TDMA. As a specific example, one of IEEE 802.11b, Bluetooth and the General Packet Radio Service (GPRS) may be selected.

[0022] Accordingly, in one exemplary embodiment, the system 100 for processing medical data is configured to communicate with the host device 200 using the IEEE 802.11 standard for wireless communication.

[0023] In one embodiment, the processing unit 102 may be implemented using one or more processors. The processor may be a general purpose processor, a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), and/or the like. The processor may be configured to retrieve data from and/or write data to the memory. The memory may be, for example, a random access memory (RAM), a memory buffer, a hard drive, a database, an erasable programmable read only memory (EPROM), an electrically erasable programmable read only memory (EEPROM), a read only memory (ROM), a flash memory, a hard disk, a floppy disk, cloud storage, and/or so forth. In one embodiment, the system 100 may include one or more hardware-based modules (e.g., DSP, FPGA, ASIC) and/or software-based modules (e.g., a module of computer code stored in the memory and executed by the processor, or a set of processor-readable instructions that may be stored in the memory and executed by the processor) associated with executing an application.

[0024] The medical data is one of medical text, metadata, medical images or any such data, or a combination thereof. The medical images can be one of a two-dimensional image, a three-dimensional image, a digital video, a real-time video or the like. The medical condition is one of a healthy condition and a diseased condition.

[0025] Accordingly, in an embodiment, the host device 200 is a medical device used for capturing a medical image of a patient, such as an X-ray apparatus, a Computed Tomography (CT) imaging device, a Positron Emission Tomography (PET) imaging device, an ultrasound imaging device, an Optical Coherence Tomography (OCT) imaging device, a Magnetic Resonance Imaging (MRI) device, an electroencephalography (EEG) imaging device or a mammography imaging device.

[0026] In another embodiment, the host device 200 is a personal computer, a laptop, a tablet computing device, a personal digital assistant, a smart phone, a mobile phone, a device with an end-to-end augmented or virtual reality interface, etc.

[0027] In yet another embodiment, the host device 200 is a wearable device such as a smart watch, a smart phone, smart glasses, etc. Some examples of IoT devices used in healthcare include fitness/health tracking wearable devices, biosensors and clinical devices for monitoring vital signs.

[0028] In yet another embodiment, the host device 200 is a laboratory device selected from a group consisting of a colorimeter, a spectrometer, a spectrophotometer and a photometer.

[0029] Figure 1 illustrates a schematic diagram depicting the working environment. Accordingly, the system 100 for processing medical data is shown in communication with a host device 200 in accordance with an embodiment of the invention. The processing unit 102 is configured to decide, based on the type of medical data, a medical condition of the captured medical data using a trained algorithm. Accordingly, the processing unit 102 employs deep learning for analyzing each item of medical data and identifying a medical condition, if any.

[0030] Deep learning is a class of machine learning techniques employing representation learning methods that allows a machine to be given raw data and determine the representations needed for data classification. Deep learning ascertains structure in data sets using backpropagation algorithms which are used to alter internal parameters (e.g., node weights) of the deep learning machine. Deep learning machines can utilize a variety of multilayer architectures and algorithms. While machine learning, for example, involves an identification of features to be used in training the network, deep learning processes raw data to identify features of interest without the external identification.

[0031] Deep learning in a neural network environment includes numerous interconnected nodes referred to as neurons. Input neurons, activated from an outside source, activate other neurons based on connections to those other neurons which are governed by the machine parameters. A neural network behaves in a certain manner based on its own parameters. Learning refines the machine parameters, and, by extension, the connections between neurons in the network, such that the neural network behaves in a desired manner.

[0032] Deep learning that utilizes a convolutional neural network segments data using convolutional filters to locate and identify learned, observable features in the data. Each filter or layer of the CNN architecture transforms the input data to increase the selectivity and invariance of the data. This abstraction of the data allows the machine to focus on the features in the data it is attempting to classify and ignore irrelevant background information.
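
By way of a non-limiting illustration, a minimal convolutional network of this kind could be sketched as follows; PyTorch is an assumed framework and the layer sizes are arbitrary, since this specification does not prescribe a particular architecture.

```python
# Minimal CNN sketch (assumed framework: PyTorch; layer sizes are arbitrary).
import torch
import torch.nn as nn


class SimpleMedicalCNN(nn.Module):
    """Two convolutional blocks followed by a classification head."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # learn low-level edges/motifs
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # combine motifs into parts
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)              # (N, 32, 56, 56) for 224x224 inputs
        return self.classifier(x.flatten(1))


# Example: score a batch of 224x224 RGB medical images as healthy/diseased.
model = SimpleMedicalCNN(num_classes=2)
logits = model(torch.randn(4, 3, 224, 224))
print(logits.shape)  # torch.Size([4, 2])
```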

[0033] Deep learning operates on the understanding that many datasets include high level features which include low level features. While examining an image, for example, rather than looking for an object, it is more efficient to look for edges which form motifs which form parts, which form the object being sought. These hierarchies of features can be found in many different forms of data such as speech and text, etc.

[0034] Learned observable features include objects and quantifiable regularities learned by the machine during supervised learning. A machine provided with a large set of well classified data is better equipped to distinguish and extract the features pertinent to successful classification of new data.

[0035] In one embodiment, the processing unit 102 receives the medical data, along with information on the type of the medical data and a label representing the medical data, from the host device 200. Accordingly, in an embodiment, the user of the host device 200 is prompted by the processor 101 to select the type of medical data associated with the medical data of the patient via a user interface of the host device 200. The medical data is, for example, text, a laboratory report, metadata, an image, genetic data or any such data. The medical images can be one of a two-dimensional or a three-dimensional array of image data, a digital video clip, a live stream of digital medical data, etc. For example, the medical data is a retinal fundus image or a radiographic medical image of a human body part such as a chest X-ray, a mammogram, a dental X-ray, a PET image, etc. The laboratory report may include text, an image or a video of a human body-fluid sample such as a blood sample, a urine sample, a semen sample, etc.

[0036] In an alternative embodiment, the processing unit 102 can also be trained to identify or determine the type of medical data based on the metadata and/or from the mappings of the medical data and the type of medical data stored in the memory unit 104.
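
A minimal sketch of this fallback is given below; the modality-to-type mapping and the metadata key are hypothetical examples introduced for illustration, not values defined in this specification.

```python
# Sketch of determining the type of medical data: prefer the type declared by the
# host device 200, otherwise fall back to a stored metadata-to-type mapping.
# The mapping and the metadata key below are illustrative assumptions.
from typing import Optional

TYPE_BY_MODALITY = {            # would be persisted in the memory unit 104
    "fundus_camera": "retinal_fundus_image",
    "chest_xray": "chest_radiograph",
    "dental_xray": "dental_radiograph",
}


def determine_data_type(declared_type: Optional[str], metadata: dict) -> Optional[str]:
    if declared_type:                          # explicit type supplied by the host device
        return declared_type
    modality = metadata.get("modality")        # e.g. a DICOM-style modality tag
    return TYPE_BY_MODALITY.get(modality)      # None if the type cannot be resolved


print(determine_data_type(None, {"modality": "chest_xray"}))  # chest_radiograph
```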

[0037] The processing unit 102 applies the trained algorithm to classify the medical data of the patient. The type of the medical data represents the one or more body parts whose information constitutes the medical data, as well as the technology employed in obtaining the medical data, such as ultrasonography, radiography, thermography, fundoscopy, optical coherence tomography, positron emission tomography (PET), magnetic resonance imaging (MRI), etc.

[0038] Trained algorithms of the present invention include algorithms that have been developed using a reference set of known diseased and normal samples. As used herein, the term “trained algorithm” refers to a class of deep artificial neural network, for example, a convolutional neural network, that can be applied to analyzing visual imagery. The convolutional neural network corresponds to a specific model of an artificial neural network. In an embodiment, one or more convolutional neural networks are applied to process the medical data of the patient. The trained algorithm may also comprise, for example, a support vector machine, recurrent neural networks, deep belief networks, a random forest, gradient boosting, decision trees, boosted decision trees, partial least squares classification or regression, branch-and-bound algorithms, neural network models, deep neural networks, convolutional deep neural networks or any combination thereof.
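
One way to picture the per-type application of trained algorithms is a registry keyed by data type, as sketched below; the registry, function names and returned strings are hypothetical stand-ins for trained models and are not part of this specification.

```python
# Hypothetical registry routing each type of medical data to its own trained model.
from typing import Any, Callable, Dict

MODEL_REGISTRY: Dict[str, Callable[[Any], str]] = {}


def register_model(data_type: str):
    def decorator(fn: Callable[[Any], str]) -> Callable[[Any], str]:
        MODEL_REGISTRY[data_type] = fn
        return fn
    return decorator


@register_model("retinal_fundus_image")
def analyze_fundus(image: Any) -> str:
    return "diabetic retinopathy suspected"   # stand-in for a CNN inference call


@register_model("chest_radiograph")
def analyze_chest_xray(image: Any) -> str:
    return "no abnormality detected"          # stand-in for a CNN inference call


def analyze(data: Any, data_type: str) -> str:
    model = MODEL_REGISTRY.get(data_type)
    if model is None:
        raise ValueError(f"No trained model registered for type: {data_type}")
    return model(data)


print(analyze(object(), "chest_radiograph"))  # no abnormality detected
```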

[0039] In an example, the host device 200 is a fundus camera. The type of the medical data associated with the fundus camera is a retinal fundus image of the patient. In another example, the host device 200 is a chest radiographic device. The type of medical data associated with the host device 200 is a chest radiograph. In another example, the host device 200 is a dental X-ray machine. The type of the medical data associated with the host device 200 is a dental radiograph.

[0040] In order to identify unstructured data in medical images, the processing unit 102 employs machine learning and pattern recognition techniques to draw insights from massive amounts of clinical image data to transform diagnosis and treatment. This enhances the diagnostic capability of medical imaging for clinical decision-making.

[0041] In order to process unstructured textual data, the processing unit 102 is configured to employ Natural Language Processing (NLP) to identify medical data in an EHR or clinical record and to extract clean, structured information from the unstructured input. The EHR includes information such as the patient’s medical history (diagnosis- and prescription-related data), medical and clinical data (such as data from imaging and laboratory examinations), and other private or personal medical data. The EHR also includes information such as medical diagnoses, prescriptions, data related to known allergies, demographics, clinical narratives, and the results obtained from various laboratory tests. The recognition and treatment of medical conditions is thus time efficient due to a reduction in the lag time of previous test results.

[0042] The NLP procedures aim at turning text into machine-readable structured data, which can then be analyzed by machine learning techniques. Using NLP, the processing unit 102 can sort through over a million electronic medical records (EMRs) and echocardiogram reports to identify certain abbreviations, words and phrases associated with aortic stenosis.
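
A simplified sketch of such abbreviation and phrase matching over report text is shown below; the term list is a small illustrative sample rather than a clinical vocabulary.

```python
# Simplified keyword/abbreviation matching over echocardiogram report text.
# The term list is illustrative only and not an exhaustive clinical vocabulary.
import re

# Phrases are matched case-insensitively; the abbreviation "AS" is matched
# case-sensitively so that it does not collide with the ordinary word "as".
PHRASES = [r"\baortic stenosis\b", r"\baortic valve area\b"]
ABBREVIATIONS = [r"\bAS\b"]


def flags_aortic_stenosis(report_text: str) -> bool:
    if any(re.search(p, report_text, re.IGNORECASE) for p in PHRASES):
        return True
    return any(re.search(a, report_text) for a in ABBREVIATIONS)


report = "Echo findings: calcified aortic valve, severe AS, AVA 0.8 cm2."
print(flags_aortic_stenosis(report))  # True
```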

[0043] In another embodiment, the processing unit 102, in addition to employing NLP, employs Optical Character Recognition (OCR) to recognize text in different fonts and thereby convert static images containing patient data into machine-readable text.
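
A minimal sketch of this OCR step is shown below, assuming the Tesseract engine via the pytesseract package; the specification does not name a particular OCR engine, so that choice is an assumption.

```python
# Minimal OCR sketch: convert a scanned report image into machine-readable text.
# Assumes the Tesseract engine via pytesseract; the specification does not name
# a particular OCR engine.
from PIL import Image
import pytesseract


def extract_text(image_path: str) -> str:
    """Return the recognized text content of a scanned report image."""
    return pytesseract.image_to_string(Image.open(image_path))


# The recognized text can then be handed to the NLP step sketched above, e.g.
# flags_aortic_stenosis(extract_text("scanned_echo_report.png")).
```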

[0044] Further, the processing unit 102 may employ a number of algorithms based on functionalities such as generic processing, registration, segmentation, visualization, reconstruction, simulation and diffusion to perform medical image analysis in order to extract hidden information. For example, the Visualization Toolkit is freely available software which allows powerful processing and analysis of 3D images from medical tests [23], while SPM can process and analyze five different types of brain images (e.g. MRI, fMRI, PET, CT scan and EEG) [24]. Other software such as GIMIAS, Elastix and MITK supports all types of images. Various other widely used tools and their features in this domain are listed in Table 1. Such bioinformatics-based big data analysis may extract greater insights and value from imaging data to boost and support precision medicine projects, clinical decision support tools and other modes of healthcare. For example, it can also be used to monitor new targeted treatments for cancer.

[0045] The processing unit 102 applies the trained algorithm to detect whether any abnormalities are present in the medical data. In an embodiment, the processing unit 102 also locates anatomical features and/or any artifacts present in the medical data using the trained algorithm. The anatomical features represent one or more body parts of the patient present in the medical data. For example, the anatomical features in a chest radiograph are the lungs, heart, chest wall, great vessels, etc. The artifacts are, for example, metal artifacts. The processing unit 102 determines the abnormalities, anatomical features and artifacts based on the type of the medical data. In other words, the processing unit 102 gains insight, from the type of medical data, into the body parts that are the focus of the medical data and the technology associated with the medical data.

[0046] The processing unit 102 estimates the medical condition based on the presence of any abnormalities in the medical data using the trained algorithm. The medical condition is one of a healthy condition or a diseased condition. The processing unit 102 classifies the medical condition as the diseased condition when the processing unit 102 determines the presence of any abnormalities in the medical data. The processing unit 102 classifies the medical condition as the healthy condition when the processing unit 102 determines an absence of abnormalities in the medical data. The abnormalities in the medical data are based on the type of the medical data.
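
The decision described above reduces to checking whether any abnormality was detected for the given type of medical data; a minimal sketch, with a hypothetical abnormality record and an assumed confidence threshold, is shown below.

```python
# Minimal sketch of the healthy/diseased decision described above.
# The Abnormality structure and the confidence threshold are illustrative assumptions.
from dataclasses import dataclass
from typing import List


@dataclass
class Abnormality:
    name: str          # e.g. "opacity", "microaneurysm"
    confidence: float  # score produced by the trained algorithm


def classify_condition(abnormalities: List[Abnormality],
                       threshold: float = 0.5) -> str:
    """Diseased if at least one abnormality exceeds the confidence threshold."""
    if any(a.confidence >= threshold for a in abnormalities):
        return "diseased condition"
    return "healthy condition"


print(classify_condition([Abnormality("opacity", 0.82)]))  # diseased condition
print(classify_condition([]))                              # healthy condition
```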

[0047] Accordingly, if the medical data represents a laboratory report, the processing unit 102 is configured to parse through the medical data to identify a reading that is beyond the permissible limits and consequently determines the presence of a medical condition.
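
A sketch of that range check is shown below; the test names and reference ranges are illustrative placeholders rather than clinical limits.

```python
# Sketch of flagging laboratory readings that fall outside permissible limits.
# The test names and reference ranges are illustrative placeholders only.
PERMISSIBLE_LIMITS = {
    "fasting_glucose_mg_dl": (70, 100),
    "hemoglobin_g_dl": (12.0, 17.5),
}


def out_of_range_readings(report: dict) -> dict:
    flagged = {}
    for test, value in report.items():
        limits = PERMISSIBLE_LIMITS.get(test)
        if limits and not (limits[0] <= value <= limits[1]):
            flagged[test] = {"value": value, "permissible": limits}
    return flagged


report = {"fasting_glucose_mg_dl": 145, "hemoglobin_g_dl": 13.4}
print(out_of_range_readings(report))
# {'fasting_glucose_mg_dl': {'value': 145, 'permissible': (70, 100)}}
```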

[0048] In another embodiment, if the medical data represents a medical image from one of the imaging devices, the processing unit 102 is configured to analyze the medical image for candidate objects or artifacts and consequently determine the presence of a medical condition upon identifying the artifacts.

[0049] The processing unit 102 transmits an output of the trained algorithm to the host device 200. The output of the trained algorithm includes the medical condition of the patient. The host device 200 may generate a report based on the input received from the processing unit 102 and accordingly display the report on a display device operably coupled to the host device 200 either internally or externally. The display device provides a mechanism to display data to a user and may be, for example, a computer monitor or a smart phone display screen.

[0050] The output from the processing unit 102 may further be stored in the memory unit 104 of the system 100 for future training.

[0051] Figure 2 illustrates an exemplary flowchart for the analysis of the medical data of the patient in accordance with an embodiment of the invention.

[0052] At step S2, the processing unit 102 receives the medical data and the type of the medical data from the host device 200. At step S3, the processing unit 102 detects the medical condition associated with the medical data of the patient based on the type of the medical data using the trained algorithm. The medical condition is one of the healthy condition or the diseased condition. The processing unit 102 classifies the medical condition as the diseased condition when it determines the presence of any abnormalities in the medical data. The processing unit 102 classifies the medical condition as the healthy condition in the absence of abnormalities in the medical data. The abnormalities in the medical data are determined based on the type of the medical data. The processing unit 102 may further identify an intensity of the diseased condition using the trained algorithm.

[0053] At step S4, the processing unit 102 transmits the detected medical condition to the host device 200. At step S5, the host device 200 generates a report based on the detected medical condition and displays the generated report to a user of the host device 200 via the user interface 105. In an embodiment, the processor 101 may transmit the generated report to a smart phone using wireless communication. The processor 101 may transmit the generated report to a respective medical practitioner for appropriate medical care of the patient via a network. The network is, for example, a wired network, a wireless network, etc.
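
Tying steps S2 to S5 together, a minimal end-to-end sketch is shown below; detect_condition is a hypothetical stand-in for the trained, type-specific analysis described above.

```python
# End-to-end sketch of steps S2 to S5. detect_condition is a hypothetical
# stand-in for the trained, type-specific analysis described above.
def detect_condition(medical_data, data_type: str) -> str:
    # Placeholder for CNN/NLP inference selected according to the data type.
    return "no abnormality detected"


def handle_request(medical_data, data_type: str) -> dict:
    # S2: receive the medical data and its type from the host device 200.
    # S3: detect the medical condition based on the type of the medical data.
    condition = detect_condition(medical_data, data_type)
    # S4: transmit the detected medical condition back to the host device 200.
    report = {"data_type": data_type, "condition": condition}
    # S5: the host device 200 generates and displays a report to the user.
    return report


print(handle_request(object(), "chest_radiograph"))
```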

[0054] The system 100, configured for receiving different types of medical data and analyzing the same, is compatible with a range of host devices 200. The seamless compatibility of the system 100 with different host devices 200 reduces the cost and time involved in identifying and analyzing different types of medical data.

[0055] Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.

[0056] The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.

[0057] In one embodiment, the processing unit 102 is part of a system 100 that is hosted in the cloud and may have a Java servlet for identifying, classifying and processing the medical data for determining the medical condition. Such a system 100 may assist a physician in diagnosing a dermatological condition of a remotely located patient. The system 100 may be implemented in multiple ways, such as through mobile phones, computers and various smart devices. Moreover, it may be safely handled by non-specialist or less-skilled medical personnel. Once the software is downloaded onto a user device such as a mobile phone, the user device acts as a stand-alone device capable of analyzing the medical data of the subject. This allows initial medical support in remote areas with no access to a global computer network such as the Internet.

[0058] In yet further aspects of the present disclosure, there is provided a computer program which comprises program code means for causing a computer to perform the steps of the method disclosed herein when said computer program is carried out on a computer, as well as a non-transitory computer-readable recording medium that stores therein a computer program product which, when executed by a processor, causes the method disclosed herein to be performed.

[0060] These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.

[0061] The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.

[0062] In an example, the method for analysis of the medical data of a subject is implemented as a software application downloadable onto a processing unit 102 of a user device. The user device is one of a personal computer, a laptop, a tablet computing device, a personal digital assistant, a smart phone, a mobile phone, a device with an end-to-end augmented or virtual reality interface, etc. The computer program, when downloaded, causes the user device to classify and analyze the medical data in order to identify a medical condition associated with the medical data.

[0063] Accordingly, in an exemplary embodiment, the processing unit 102 receives the chest radiograph and the type of the medical data from the host device 200. The processing unit 102 detects the medical condition associated with the chest radiograph based on the type of the medical data using the trained algorithm. For instance, the trained algorithm is burned-in code present in a read-only memory (ROM) of the user device. The processing unit 102 transmits the detected medical condition to the processor 101 of the mobile phone. The user device generates the report based on the detected medical condition and displays it to the user via a user interface.

[0064] Any combination of computer-readable media may be utilized. Computer-readable media may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of a computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.

[0065] A computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.

[0066] Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

[0067] Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as JAVA (note: the term(s) “Java” may be subject to trademark rights in various jurisdictions throughout the world and are used here only in reference to the products or services properly denominated by the marks to the extent that such trademark rights may exist), Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on a user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).

[0068] The processing unit 102 reduces errors resulting from the manual identification of various medical conditions during screening of the patient. The processing unit 102 acts as an important supporting tool in detecting and monitoring one or more diseases, associated with different body parts of the patient, in an effective manner. The processing unit 102 reduces the time consumption involved in manually recording the medical condition present or absent in the medical data of the patient.

[0069] This approach uses ML and pattern recognition techniques to draw insights from massive volumes of clinical image data to transform the diagnosis, treatment and monitoring of patients. It focuses on enhancing the diagnostic capability of medical imaging for clinical decision-making.

[0070] The present invention described above, although described functionally, may be configured to work in a network environment comprising a computer in communication with one or more devices. The present invention may be implemented by computer programmable instructions stored on one or more computer readable media and executed by a processor of the computer. The computer comprises the processor, a memory unit 104, an input/output (I/O) controller, and a display communicating via a data bus. The computer may comprise multiple processors to increase the computing capability of the computer. The processor is an electronic circuit which executes computer programs. The processor executes the instructions to assess the input medical data, for example, a fundus image.

[0071] The memory unit 104, for example, comprises a ROM and a RAM. The memory unit 104 stores the instructions for execution by the processor. In this invention, the storage unit 107 is the memory unit 104. The memory unit 104 stores the training fundus image dataset and the ground-truth file. The memory unit 104 may also store intermediate, static and temporary information required by the processor during the execution of the instructions. The computer comprises one or more input devices, for example, a keyboard such as an alphanumeric keyboard, a mouse, a joystick, etc. The I/O controller controls the input and output actions performed by a user. The data bus allows communication between the modules of the computer. The computer directly or indirectly communicates with the devices via an interface, for example, a local area network (LAN), a wide area network (WAN), Ethernet, the Internet, a token ring, Bluetooth connectivity, or the like. Further, each of the devices adapted to communicate with the computer may comprise computers with, for example, Sun® processors, IBM® processors, Intel® processors, AMD® processors, etc.

[0072] The computer readable media comprises, for example, CDs, DVDs, floppy disks, optical disks, magnetic-optical disks, ROMs, RAMs, EEPROMs, magnetic cards, application specific integrated circuits (ASICs), or the like. Each of the computer readable media is coupled to the data bus.

[0073] The foregoing examples have been provided merely for the purpose of explanation and do not limit the present invention disclosed herein. While the invention has been described with reference to various embodiments, it is understood that the words used are words of illustration and are not limiting. Those skilled in the art may effect numerous modifications thereto, and changes may be made, without departing from the scope and spirit of the invention in its aspects.

Claims: What is claimed is:
1. A system 100 for analyzing medical data based on the type of medical data for predicting a medical condition of a subject, the system 100 comprising:
a processing unit 102, in communication with a host device 200, the processing unit 102 configured for: receiving medical data from the host device 200 along with information on the type of the medical data, the medical data being associated with one or more organs of a subject; processing the medical data, based on the type of the medical data, in order to detect a medical condition associated with one of the organs, using a convolutional neural network; and transmitting information concerning the medical condition to the processor of the host device 200 so as to enable the host device 200 to display information on the detected medical condition, wherein processing the medical data comprises classifying the medical data based on the information received on the type of the medical data, determining one or more abnormalities, anatomical features and artifacts based on the type of the medical data, and detecting a medical condition upon identifying the presence of one or more abnormalities in the medical data.

2. The system 100 as claimed in claim 1, wherein the medical condition is one of a healthy condition, depicting the absence of abnormalities, and a diseased condition, depicting the presence of at least one abnormality.

3. The system 100 of claim 1, wherein the processing unit 102 is configured for receiving information on type of medical data from the host device 200.

4. The system 100 of claim 1, wherein the processing unit 102 is configured for identifying the type of medical data, the medical data being one of clinical data, image data, an EHR (Electronic Health Record), an EMR (Electronic Medical Record), generic data, sensing data, a public health record and textual data.

5. The system 100 of claim 1, wherein the host device 200 is a medical device selected from a group consisting of an ultrasound imaging system, a radiographic imaging system, a tactile imaging system, a thermographic imaging system, a funduscopic imaging system, a positron emission tomography (PET) system and a magnetic resonance imaging (MRI) system.

6. The system 100 as claimed in claim 1, wherein the host device 200 comprises a memory to store medical data of a subject, a medical data capturing means and a user interface configured to enable a user to interact with the host device 200.

7. The system 100 as claimed in claim 1, is coupled to the host device 200 via one of a wired and a wireless communication port.

8. The system 100 as claimed in claim 1, wherein the host device 200 is one of a medical device configured for capturing medical data of the subject, a clinical diagnostic device, an imaging device, a communication device and a wearable device.

9. A method for analysis of a medical data of a patient, comprising:
receiving, by a system 100 for analyzing medical data, medical data of a patient along with information on the type of medical data from a host device 200;
processing the medical data, by a processing unit 102 using a convolutional neural network, based on the type of the medical data;
detecting a medical condition associated with the medical data upon processing by the processing unit 102 using a convolutional neural network;
transmitting the detected medical condition to the host device 200 by the processing unit 102; and
displaying the detected medical condition by the host device 200.

10. A computer program product storing computer readable instructions which when executed by a processor, cause the processor to execute a method for analysis of a medical data of a patient, comprising steps of:
receiving, by a system 100 for analyzing medical data, medical data of a patient along with information on the type of medical data from a host device 200;
processing the medical data, by a processing unit 102 using a convolutional neural network, based on the type of the medical data;
detecting a medical condition associated with the medical data upon processing by the processing unit 102 using a convolutional neural network;
transmitting the detected medical condition to the host device 200 by the processing unit 102; and
displaying the detected medical condition by the host device 200.

Dated this 12th day of April, 2023

(Digitally signed)
SUMA K.B.(INPA-1753)
Agent for the Applicant
ARTIFICIAL LEARNING SYSTEMS INDIA PVT LTD.

Documents

Application Documents

# Name Date
1 202341027123-POWER OF AUTHORITY [12-04-2023(online)].pdf 2023-04-12
2 202341027123-FORM FOR STARTUP [12-04-2023(online)].pdf 2023-04-12
3 202341027123-FORM FOR SMALL ENTITY(FORM-28) [12-04-2023(online)].pdf 2023-04-12
4 202341027123-FORM 1 [12-04-2023(online)].pdf 2023-04-12
5 202341027123-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [12-04-2023(online)].pdf 2023-04-12
6 202341027123-DRAWINGS [12-04-2023(online)].pdf 2023-04-12
7 202341027123-COMPLETE SPECIFICATION [12-04-2023(online)].pdf 2023-04-12
8 202341027123-STARTUP [29-11-2023(online)].pdf 2023-11-29
9 202341027123-FORM28 [29-11-2023(online)].pdf 2023-11-29
10 202341027123-FORM 18A [29-11-2023(online)].pdf 2023-11-29
11 202341027123-FER.pdf 2024-03-08
12 202341027123-OTHERS [03-09-2024(online)].pdf 2024-09-03
13 202341027123-FER_SER_REPLY [03-09-2024(online)].pdf 2024-09-03
14 202341027123-DRAWING [03-09-2024(online)].pdf 2024-09-03
15 202341027123-COMPLETE SPECIFICATION [03-09-2024(online)].pdf 2024-09-03
16 202341027123-CLAIMS [03-09-2024(online)].pdf 2024-09-03
17 202341027123-ABSTRACT [03-09-2024(online)].pdf 2024-09-03
18 202341027123-RELEVANT DOCUMENTS [13-09-2024(online)].pdf 2024-09-13
19 202341027123-POA [13-09-2024(online)].pdf 2024-09-13
20 202341027123-FORM 13 [13-09-2024(online)].pdf 2024-09-13
21 202341027123-AMMENDED DOCUMENTS [13-09-2024(online)].pdf 2024-09-13
22 202341027123-Correspondence to notify the Controller [21-02-2025(online)].pdf 2025-02-21
23 202341027123-FORM 13 [27-02-2025(online)].pdf 2025-02-27
24 202341027123-EVIDENCE FOR REGISTRATION UNDER SSI [09-07-2025(online)].pdf 2025-07-09
25 202341027123-Proof of Right [19-07-2025(online)].pdf 2025-07-19

Search Strategy

1 searchstrategyE_19-12-2023.pdf