
System And Method To Provide Insights Regarding Health Conditions Of A Patient Utilizing Medical Image

Abstract: A system (10) to provide insights regarding health conditions of a patient utilizing at least one medical image is disclosed. The system includes a processing subsystem (20) including a data acquisition module (60) to acquire a medical image. The processing subsystem includes a region of interest module (70) to isolate regions. The region of interest module is to extract features from the regions. The processing subsystem includes a length scale analysis module (80) to derive predominant features. The processing subsystem includes a classification module (90) to classify the predominant features. The classification module is to assign at least one identifier to the predominant features to obtain a tagged image. The processing subsystem includes a comparison module (100) to compare the tagged image with a prestored tagged image to provide assessments. The processing subsystem further includes a visualization module (110) to render the tagged image, the prestored tagged image and the assessments. FIG. 1


Patent Information

Application #
Filing Date
01 November 2022
Publication Number
03/2024
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING
Status
Email
Parent Application
Patent Number
Legal Status
Grant Date
2024-12-03
Renewal Date

Applicants

COGNOTA HEALTHCARE PRIVATE LIMITED
FLAT NO. 5/6, 3RD FLOOR, HOSHBANOO MANSION, OLD ICE FACTORY, GOKHALE ROAD, NAUPADA, THANE - 400601, MAHARASHTRA, INDIA.

Inventors

1. RAJENDRA PATIL
FLAT NO. 5/6, 3RD FLOOR, HOSHBANOO MANSION, OLD ICE FACTORY, GOKHALE ROAD, NAUPADA, THANE - 400601, MAHARASHTRA, INDIA.

Specification

EARLIEST PRIORITY DATE:
This Application claims priority from a provisional patent application filed in India having Patent Application No. 202221043848, filed on November 01, 2022, and titled A METHOD OF ACCURATE SCALING OF SUBJECT AREA.
FIELD OF INVENTION
[0001] Embodiments of the present disclosure relate to the field of image processing and, more particularly, to a system and a method to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image.
BACKGROUND
[0002] Medical imaging techniques provide medical images of internal structures of a human body. The medical imaging techniques include X-rays, computed tomography scans, magnetic resonance imaging, ultrasound imaging and the like. The medical imaging techniques help physicians to identify various medical conditions, including fractures, tumors, infections, and abnormalities in organs and tissues.
[0003] Currently, medical practitioners rely on a report provided by a technician interpreting the medical images to make treatment plans. The lack of direct interaction between the medical practitioners and the technicians leads to miscommunication and may render the treatment plans inefficient. Further, treatment plans made based on the report, without viewing the medical images, limit the ability of the medical practitioner to make informed decisions. Furthermore, the dependency of the medical practitioners on the report may cause undue delays in providing medical care to a patient.
[0004] Additionally, the medical practitioners lack real-time information regarding the completion of a medical imaging procedure, leading to delays in treatment. Furthermore, the medical images produced by the X-rays, the computed tomography scans, the magnetic resonance imaging, and the ultrasound imaging are viewed using proprietary techniques that lack interoperability, preventing simultaneous side-by-side comparisons on a single display, thereby affecting the efficiency of the medical practitioners devising the treatment plans. Also, the lack of interoperability prevents the medical practitioners and patients from collaborating with other medical practitioners located in various geographic locations.
[0005] Hence, there is a need for an improved system and method to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image to address the aforementioned issue(s).
OBJECTIVE OF THE INVENTION
[0006] An objective of the invention is to provide one or more insights regarding one or more health conditions of a patient by analyzing at least one medical image.
[0007] Another objective of the invention is to notify a plurality of users upon providing one or more assessments by a comparison module.
[0008] Yet another objective of the invention is to enable bidirectional communication between the plurality of users to discuss the one or more insights by viewing the at least one medical image.
[0009] Yet another objective of the invention is to enable the patient to access the medical image by authenticating the patient.
BRIEF DESCRIPTION
[0010] In accordance with an embodiment of the present disclosure, a system to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image is provided. The system includes a processing subsystem hosted on a server and configured to execute on a network to control bidirectional communications among a plurality of modules. The processing subsystem includes a data acquisition module operatively coupled to an integrated database. The data acquisition module is configured to acquire the at least one medical image from at least one medical device. The processing subsystem also includes a region of interest module configured to segment the at least one medical image using a plurality of techniques to isolate one or more regions. The region of interest module is also configured to extract one or more features from the one or more regions based on a first plurality of parameters comprising shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions. The processing subsystem also includes a length scale analysis module operatively coupled to the region of interest module. The length scale analysis module is configured to derive one or more predominant features from the one or more features through a length scale analysis technique. The processing subsystem further includes a classification module configured to classify each of the one or more predominant features into at least one category based on a second plurality of parameters. The classification module is also configured to assign at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image. The processing subsystem also includes a comparison module configured to compare the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score. The comparison module is also configured to provide one or more assessments based on the healing score evaluated. The one or more assessments comprise improved, stable, and worsened. The processing subsystem further includes a visualization module configured to render the tagged image, the prestored tagged image and the one or more assessments in a user interface, thereby providing the one or more insights regarding the one or more health conditions of the patient utilizing the at least one medical image.
[0011] In accordance with another embodiment of the present disclosure, a method to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image is provided. The method includes acquiring, by a data acquisition module, the at least one medical image from at least one medical device. The method also includes segmenting, by a region of interest module, the at least one medical image using a plurality of techniques to isolate one or more regions. The method further includes extracting, by the region of interest module, one or more features from the one or more regions based on a first plurality of parameters including shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions. The method also includes deriving, by a length scale analysis module, one or more predominant features from the one or more features through a length scale analysis technique. The method also includes classifying, by a classification module, each of the one or more predominant features into at least one category based on a second plurality of parameters. The method also includes assigning, by the classification module, at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image. The method also includes comparing, by a comparison module, the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score. The method also includes providing, by the comparison module, one or more assessments based on the healing score evaluated. The one or more assessments include improved, stable, and worsened. The method further includes rendering, by a visualization module, the tagged image, the prestored tagged image and the one or more assessments in a user interface, thereby providing the one or more insights regarding the one or more health conditions of the patient utilizing the at least one medical image.
[0012] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[0014] FIG. 1 is a block diagram representation of a system to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image in accordance with an embodiment of the present disclosure;
[0015] FIG. 2 is a block diagram representation of one embodiment of the system of FIG. 1 in accordance with an embodiment of the present disclosure;
[0016] FIG. 3 is a schematic representation of an exemplary embodiment of the system of FIG. 1, in accordance with an embodiment of the present disclosure;
[0017] FIG. 4 is a block diagram of a computer or a server in accordance with an embodiment of the present disclosure; and
[0018] FIG. 5 is a flow chart representing the steps involved in a method to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image in accordance with an embodiment of the present disclosure.
[0019] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0020] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[0021] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, sub-systems, elements, structures, components, additional devices, additional sub-systems, additional elements, additional structures, or additional components. Appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but not necessarily do, all refer to the same embodiment.
[0022] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[0023] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[0024] Embodiments of the present disclosure relate to a system and a method to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image. The system includes a processing subsystem hosted on a server and configured to execute on a network to control bidirectional communications among a plurality of modules. The processing subsystem includes a data acquisition module operatively coupled to an integrated database. The data acquisition module is configured to acquire the at least one medical image from at least one medical device. The processing subsystem also includes a region of interest module configured to segment the at least one medical image using a plurality of techniques to isolate one or more regions. The region of interest module is also configured to extract one or more features from the one or more regions based on a first plurality of parameters comprising shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions. The processing subsystem also includes a length scale analysis module operatively coupled to the region of interest module. The length scale analysis module is configured to derive one or more predominant features from the one or more features through a length scale analysis technique. The processing subsystem further includes a classification module configured to classify each of the one or more predominant features into at least one category based on a second plurality of parameters. The classification module is also configured to assign at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image. The processing subsystem also includes a comparison module configured to compare the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score. The comparison module is also configured to provide one or more assessments based on the healing score evaluated. The one or more assessments comprise improved, stable, and worsened. The processing subsystem further includes a visualization module configured to render the tagged image, the prestored tagged image and the one or more assessments in a user interface, thereby providing the one or more insights regarding the one or more health conditions of the patient utilizing the at least one medical image.
[0025] FIG. 1 is a block diagram representation of a system (10) to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image in accordance with an embodiment of the present disclosure. The system (10) includes a processing subsystem (20) hosted on a server (30) and configured to execute on a network (40) to control bidirectional communications among a plurality of modules. Further, in one embodiment, the server (30) may be a cloud-based server. In another embodiment, the server (30) may be a local server. In one example, the network (40) may be a private or public local area network (LAN) or wide area network (WAN), such as the Internet.
[0026] Further, in another embodiment, the network (40) may include both wired and wireless communications according to one or more standards and/or via one or more transport mediums. Furthermore, in one example, the network (40) may include wireless communications according to one of the 802.11 or Bluetooth specification sets, or another standard or proprietary wireless communication protocol. In yet another embodiment, the network (40) may also include communications over a terrestrial cellular network, including a GSM (global system for mobile communications), CDMA (code division multiple access), and/or EDGE (enhanced data for global evolution) network.
[0027] Furthermore, the processing subsystem (20) includes a data acquisition module (60) operatively coupled to an integrated database (50). In some embodiments, the integrated database (50) may include a structured query language database, a non-structured query language database, a columnar database and the like. The data acquisition module (60) is configured to acquire the at least one medical image from at least one medical device. In one embodiment, the at least one medical image may include an X-ray image, a computerized tomography (CT) scan image, a magnetic resonance image, a positron emission tomography (PET) scan image and the like. In some embodiments, the at least one medical device may include an X-ray machine, a computerized tomography scan machine, a magnetic resonance imaging machine and a PET scan machine. In one embodiment, the data acquisition module (60) may convert the at least one medical image into a standard format including bitmap (BMP), joint photographic experts group (JPEG), and portable network graphics (PNG).
[0028] Moreover, in some embodiments, the data acquisition module (60) may store the at least one medical image into the integrated database (50) by embedding a plurality of metadata in the at least one medical image. In such an embodiment, the plurality of metadata may include the name of the patient, address of the patient, contact number of the patient, email address of the patient, modality of the at least one medical image, date of imaging, location of imaging and the like. For example, consider a scenario in which the data acquisition module (60) is configured to acquire the PET scan image of a patient X. The data acquisition module (60) may convert the PET scan image into the PNG format and may store the PET scan image in the integrated database (50) along with the name of the patient X and the contact number of the patient X.
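By way of a non-limiting illustration, the format conversion and metadata embedding described for the data acquisition module (60) could be realized, for example, with the Pillow library in Python; the function name, metadata field names, and file paths below are illustrative assumptions and are not part of the disclosure.

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def store_with_metadata(src_path: str, dst_path: str, metadata: dict) -> None:
    """Convert an acquired image to PNG and embed patient metadata as PNG text chunks."""
    image = Image.open(src_path)          # e.g. a BMP or JPEG export of the scan
    info = PngInfo()
    for key, value in metadata.items():   # name, contact number, modality, date of imaging, ...
        info.add_text(key, str(value))
    image.save(dst_path, format="PNG", pnginfo=info)

# Illustrative usage for the running example of patient X's PET scan
store_with_metadata(
    "pet_scan_patient_x.jpg",
    "pet_scan_patient_x.png",
    {"patient_name": "Patient X", "contact": "9999999999", "modality": "PET"},
)
```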
[0029] Additionally, the processing subsystem (20) includes a region of interest module (70) configured to segment the at least one medical image using a plurality of techniques to isolate one or more regions. In one embodiment, the one or more regions may include organs, tissues, blood vessels, muscles, bones and the like. In one embodiment, the plurality of techniques may include at least one of a thresholding, region growing, and watershed technique. As used herein, the thresholding may be an image processing technique which separates the one or more regions from the background based on pixel values of the one or more regions lying below or above a predefined threshold. As used herein, region growing may be defined as a technique that involves grouping of pixels with similar characteristics to isolate the one or more regions. As used herein, the watershed technique may utilize pixel intensity to isolate the one or more regions. In continuation with the ongoing example, the region of interest module (70) may segment the PET scan image utilizing the watershed technique to isolate the one or more regions associated with the liver and spine of the patient X.
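A minimal sketch of such segmentation, assuming scikit-image and SciPy are available, is given below; it combines thresholding (Otsu) with the watershed technique and is only one of many possible realizations of the plurality of techniques named above.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.filters import threshold_otsu
from skimage.segmentation import watershed

def isolate_regions(gray: np.ndarray) -> np.ndarray:
    """Return a labelled image in which each isolated region carries its own integer label."""
    binary = gray > threshold_otsu(gray)             # thresholding step
    distance = ndi.distance_transform_edt(binary)    # distance map used to seed the watershed
    coords = peak_local_max(distance, footprint=np.ones((7, 7)), labels=binary)
    markers = np.zeros(distance.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    return watershed(-distance, markers, mask=binary)  # label 0 denotes the background
```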
[0030] Also, the region of interest module (70) is configured to extract one or more features from the one or more regions based on a first plurality of parameters including shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions. In one embodiment, the one or more features may include one or more inflammations, tumors, deformations, wounds, infections and the like. In continuation with the ongoing example, the region of interest module (70) may extract portions of the one or more regions including a tumor present in the liver based on the one or more dimensions of the tumor. The region of interest module (70) may also extract the portions of the one or more regions including a discoloration in the spine.
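The feature extraction described above may, purely for illustration, be sketched with scikit-image region properties; the particular descriptors chosen below (area, perimeter, eccentricity, mean intensity) are assumptions standing in for the first plurality of parameters.

```python
from skimage.measure import regionprops

def extract_features(label_image, intensity_image):
    """Describe each isolated region by simple shape-, size- and intensity-related parameters."""
    features = []
    for region in regionprops(label_image, intensity_image=intensity_image):
        features.append({
            "label": region.label,
            "area": region.area,                      # one of the dimensional parameters
            "perimeter": region.perimeter,
            "eccentricity": region.eccentricity,      # a simple shape descriptor
            "mean_intensity": region.mean_intensity,  # a crude texture/colour proxy
            "bbox": region.bbox,                      # bounding box reused later for tagging
        })
    return features
```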
[0031] Further, the processing subsystem (20) includes a length scale analysis module (80) operatively coupled to the region of interest module (70). The length scale analysis module (80) is configured to derive one or more predominant features from the one or more features through a length scale analysis technique. In one embodiment, the length scale analysis module (80) may be configured to derive the one or more predominant features from the one or more features based on respective area of the one or more features. In one embodiment, the length scale analysis technique may include at least one of pyramid decomposition, scale-space theory, wavelet transform, morphological scale space, and local binary patterns. As used herein the pyramid decomposition may be defined as an image processing technique that creates a series of images at different scales for enabling multi resolution analysis.
[0032] Furthermore, as used herein the scale space theory may be defined as a framework for analyzing the series of images at multiple scales to analyze the one or more features having different dimensions. As used herein the wavelet transforms may be used to analyze the one or more features to extract time and frequency information. As used herein the morphological scale space may be used to analyze the one or more features using morphological operations. As used herein the local binary patterns may be used to analyze the one or more features based on the intensity of the pixels. In continuation with the ongoing example, the length scale analysis module (80) may derive the portions including the tumor and the portions including the discolorations based on the area of the tumor and the area of the discolorations while discarding other portions lying around the tumor and the discolorations.
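As a hedged sketch of the length scale analysis module (80), the snippet below keeps only the features whose area dominates and shows a Gaussian pyramid as one example of the pyramid decomposition mentioned above; the 0.5 area fraction and the scikit-image pyramid are illustrative assumptions rather than the prescribed technique.

```python
from skimage.transform import pyramid_gaussian

def predominant_by_area(features, keep_fraction=0.5):
    """Keep features whose area is at least `keep_fraction` of the largest feature's area."""
    if not features:
        return []
    largest = max(f["area"] for f in features)
    return [f for f in features if f["area"] >= keep_fraction * largest]

def image_pyramid(image, levels=3):
    """Pyramid decomposition: a series of progressively downscaled copies of the image."""
    return list(pyramid_gaussian(image, max_layer=levels, downscale=2))
```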
[0033] Additionally, the processing subsystem (20) includes a classification module (90) configured to classify each of the one or more predominant features into at least one category based on a second plurality of parameters. In one embodiment, the at least one category may include small category, medium category, and large category. In some embodiments, the second plurality of parameters may include at least one of a size of each of the one or more predominant features, a clinical reference, a user preference, an ease of interpretation, consistency and clarity. In continuation with the ongoing example, the classification module (90) may classify the tumor and the discoloration into the large category based on the clinical reference.
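A minimal illustration of such size-based classification is given below; the pixel-area cut-offs are arbitrary assumptions standing in for the clinical reference or other members of the second plurality of parameters.

```python
def classify_by_size(feature, small_max=100, large_min=1000):
    """Map a predominant feature to the small, medium, or large category by its pixel area."""
    area = feature["area"]
    if area < small_max:
        return "small"
    if area >= large_min:
        return "large"
    return "medium"
```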
[0034] Moreover, the classification module (90) is configured to assign at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image. In one embodiment, the at least one identifier comprises a numerical identifier, an alphanumeric identifier, a color code, a symbolic identifier, a string. In such an embodiment, the numerical identifier may represent the small category with 1, the medium category with 2, and the large category with 3. In one embodiment, the alphanumeric identifier may represent the small category with ‘S1’, the medium category with ‘M1’ and the large category with 'L1’.
[0035] Also, in some embodiments, the color code may represent the small category with a green marker, the medium category with a yellow marker, and the large category with a red marker. In a specific embodiment, the symbolic identifier may represent the small category with a star symbol, the medium category with a circular symbol, and the large category with a rectangular symbol. In one embodiment, the string may represent the small category with a word ‘small’, medium category with the word ‘medium’, and the large category with the word 'large’. In continuation with the ongoing example, the classification module (90) may overlay the red marker on top of the tumor and the discoloration to obtain the tagged image.
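Purely as an illustration of assigning colour-code identifiers, the sketch below overlays a coloured bounding box and a category string on each predominant feature using OpenCV; the colours, font, and use of bounding boxes are assumptions made for this example only.

```python
import cv2

CATEGORY_COLOUR = {            # colour-code identifiers in BGR order
    "small": (0, 255, 0),      # green marker
    "medium": (0, 255, 255),   # yellow marker
    "large": (0, 0, 255),      # red marker
}

def tag_image(image_bgr, features_with_categories):
    """Overlay a colour-coded marker on each predominant feature to obtain the tagged image."""
    tagged = image_bgr.copy()
    for feature, category in features_with_categories:
        min_row, min_col, max_row, max_col = feature["bbox"]
        cv2.rectangle(tagged, (min_col, min_row), (max_col, max_row),
                      CATEGORY_COLOUR[category], thickness=2)
        cv2.putText(tagged, category, (min_col, max(min_row - 5, 10)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, CATEGORY_COLOUR[category], 1)
    return tagged
```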
[0036] Further, the processing subsystem (20) includes a comparison module (100) configured to compare the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score. In one embodiment, the prestored tagged image may be stored in the integrated database (50). The comparison module (100) is also configured to provide one or more assessments based on the healing score evaluated. The one or more assessments comprises improved, stable, worsened.
[0037] Furthermore, in continuation with the ongoing example, the comparison module (100) may compare the tagged image including the tumor and the discoloration with the prestored tagged image. The prestored tagged image may include images of the tumor and the discoloration acquired by the data acquisition module (60) at a previous instance. The comparison module (100) may compare the tumor and the discoloration with the corresponding images acquired by the data acquisition module (60) at the previous instance to generate the healing score. Consider a scenario in which the tumor in the tagged image may have grown with respect to the tumor in the prestored tagged image and the discoloration may be getting more prominent compared to the discoloration in the prestored tagged image. In such a scenario, the healing score generated by the comparison module (100) may be below a predefined threshold. The comparison module (100) may provide the one or more assessments as worsened based on the healing score.
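The disclosure does not prescribe a formula for the healing score, so the sketch below is only an assumed example: it scores healing as the ratio of previously tagged area to currently tagged area and maps the score to the three disclosed assessments with an arbitrary tolerance.

```python
def healing_score(current_features, prestored_features):
    """Assumed healing score: previously tagged area divided by currently tagged area."""
    current_area = sum(f["area"] for f in current_features) or 1
    previous_area = sum(f["area"] for f in prestored_features) or 1
    return previous_area / current_area   # > 1 suggests the tagged features have shrunk

def assessment(score, threshold=1.0, tolerance=0.05):
    """Map the healing score to one of the disclosed assessments."""
    if score > threshold + tolerance:
        return "improved"
    if score < threshold - tolerance:
        return "worsened"
    return "stable"
```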
[0038] Moreover, the processing subsystem (20) includes a visualization module (110) configured to render the tagged image, the prestored tagged image and the one or more assessments in a user interface, thereby providing the one or more insights regarding the one or more health conditions of the patient utilizing the at least one medical image. In one embodiment, the user interface may be associated with at least one of a medical practitioner and the patient. In some embodiments, the one or more insights may include the size of the one or more predominant features and the one or more assessments. In continuation with the ongoing example, the visualization module (110) may render the tagged image including the tumor and the discoloration, the prestored tagged image including the tumor and the discoloration acquired at a previous instance, with the one or more assessments in the user interface associated with a medical practitioner Y treating the patient X.
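A minimal rendering sketch, assuming a matplotlib-based display, is shown below; an actual deployment of the user interface would more likely use a web or native front end.

```python
import matplotlib.pyplot as plt

def render_comparison(tagged, prestored_tagged, assessment_text):
    """Render the prestored and current tagged images side by side with the assessment."""
    fig, (left, right) = plt.subplots(1, 2, figsize=(10, 5))
    left.imshow(prestored_tagged)
    left.set_title("Previous (prestored tagged image)")
    right.imshow(tagged)
    right.set_title(f"Current - assessment: {assessment_text}")
    for axis in (left, right):
        axis.axis("off")
    plt.show()
```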
[0039] FIG. 2 is a block diagram representation of one embodiment of the system (10) of FIG. 1 in accordance with an embodiment of the present disclosure. The system (10) of FIG. 1 includes the data acquisition module (60), the region of interest module (70), the length scale analysis module (80), the classification module (90), the comparison module (100) and the visualization module (110). In one embodiment, the system (10) of FIG. 1 may include the processing subsystem (20) including a notification module (120) configured to notify a plurality of users upon providing the one or more assessments by the comparison module (100). In one embodiment, the plurality of users may include the patient, the medical practitioner, and the like. In continuation with the ongoing example, the notification module (120) may notify at least one of the medical practitioner Y and the patient X upon providing the one or more assessments by the comparison module (100).
[0040] Further, in one embodiment, the processing subsystem (20) may include an interactive module (130) configured to enable bidirectional communication between the plurality of users when the visualization module (110) is rendering the tagged image, the prestored tagged image and the one or more assessments in the user interface associated with the plurality of users. In continuation with the ongoing example, the interactive module (130) may enable bidirectional communication between the patient X, the medical practitioner Y, and a medical practitioner Z when the visualization module (110) is rendering the tagged image, the prestored tagged image and the one or more assessments in the user interface associated with the patient X, the medical practitioner Y, and the medical practitioner Z.
[0041] Furthermore, in some embodiments, the processing subsystem (20) may include an authentication module (140) configured to authenticate the patient based on one or more credentials provided by the patient. In such an embodiment, the authentication module (140) may also be configured to enable the patient to access the at least one medical image stored in the integrated database (50) upon authenticating the patient. In continuation with the ongoing example, the authentication module (140) may be configured to authenticate the patient X based on a username and a password provided by the patient X to enable the patient to access the PET scan image stored in the integrated database (50).
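One conventional way to realize such credential-based authentication is a salted password hash, sketched below with Python's standard library; the key-derivation parameters are assumptions, and the disclosure does not limit the authentication module (140) to passwords.

```python
import hashlib
import hmac
import os

def hash_credentials(password: str, salt: bytes | None = None) -> tuple[bytes, bytes]:
    """Derive a password digest to be stored alongside the username."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def authenticate(password: str, salt: bytes, stored_digest: bytes) -> bool:
    """Constant-time comparison of the supplied password against the stored digest."""
    _, digest = hash_credentials(password, salt)
    return hmac.compare_digest(digest, stored_digest)
```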
[0042] FIG. 3 is a schematic representation of an exemplary embodiment (150) of the system (10) of FIG. 1 in accordance with an embodiment of the present disclosure. Consider a scenario in which a patient A (160) may be seeking medical assistance from a medical practitioner B (170) after an accident. The medical practitioner B (170) may prescribe a CT scan of head for the patient A (160). The data acquisition module (60) may acquire the CT scan image of the patient A (160) from the CT scan machine (not shown in FIG. 3). The region of interest module (70) may segment the CT scan image using the plurality of techniques to isolate the one or more regions. The region of interest module (70) may extract the one or more inflammations from the one or more regions based on the size. The length scale analysis module (80) may derive one or more predominant inflammations from the one or more inflammations extracted through the length scale analysis technique.
[0043] Further, the classification module (90) may classify each of the one or more predominant inflammations into the large category based on the clinical reference. The classification module (90) may further assign the red marker to each of the one or more predominant inflammations based on the classification to obtain the tagged image. The comparison module (100) may compare the tagged image with the prestored tagged image based on the size of each of the one or more predominant inflammations. The prestored tagged image may include images of the one or more predominant inflammations classified and assigned a corresponding tag by the classification module (90) at a previous instance.
[0044] Furthermore, the comparison module (100) may compare each of the one or more predominant inflammations present in the tagged image with a corresponding one or more predominant inflammations present in the prestored tagged image to evaluate the healing score. The comparison module (100) may provide the one or more assessments based on the healing score evaluated. The one or more assessments provided by the comparison module (100) may be worsened since the one or more predominant inflammations present in the tagged image have increased with respect to the one or more predominant inflammations present in the prestored tagged image. The notification module (120) may notify the medical practitioner B (170) upon providing the one or more assessments by the comparison module (100).
[0045] Moreover, the visualization module (110) may render the tagged image, the prestored tagged image and the one or more assessments in the user interface associated with the medical practitioner B (170), thereby providing the one or more insights regarding the one or more health conditions of the patient A (160). The interactive module (130) may enable bidirectional communication between the medical practitioner B (170) and a medical practitioner C (180) when the visualization module (110) is rendering the tagged image, the prestored tagged image and the one or more assessments in the user interface associated with the medical practitioner C (180) and the medical practitioner B (170) to discuss the one or more insights. Also, the authentication module (140) may authenticate the patient A (160) based on the username and the password provided by the patient A (160) to enable the patient A (160) to access the at least one medical image stored in the integrated database (50).
[0046] FIG. 4 is a block diagram of a computer or a server (30) in accordance with an embodiment of the present disclosure. The server (30) includes processor(s) (190), and memory (200) operatively coupled to the bus (210). The processor(s) (190), as used herein, includes any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing microprocessor, a reduced instruction set computing microprocessor, a very long instruction word microprocessor, an explicitly parallel instruction computing microprocessor, a digital signal processor, or any other type of processing circuit, or a combination thereof.
[0047] The memory (200) includes several subsystems stored in the form of an executable program which instructs the processor to perform the method steps illustrated in FIG. 5. The memory (200) is substantially similar to the system (10) of FIG. 1. The memory (200) has the following subsystems: the processing subsystem (20) including the data acquisition module (60), the region of interest module (70), the length scale analysis module (80), the classification module (90), the comparison module (100), the visualization module (110), the notification module (120), the interactive module (130), and the authentication module (140). The plurality of modules of the processing subsystem (20) perform the functions as stated in FIG. 1 and FIG. 2. The bus (210), as used herein, refers to the internal memory channels or computer network used to connect computer components and transfer data between them. The bus (210) includes a serial bus or a parallel bus, wherein the serial bus transmits data in bit-serial format and the parallel bus transmits data across multiple wires. The bus (210), as used herein, may include, but is not limited to, a system bus, an internal bus, an external bus, an expansion bus, a frontside bus, a backside bus, and the like.
[0048] The processing subsystem (20) includes a data acquisition module (60) operatively coupled to an integrated database (50). The data acquisition module (60) is configured to acquire the at least one medical image from at least one medical device. The processing subsystem (20) also includes a region of interest module (70) configured to segment the at least one medical image using a plurality of techniques to isolate one or more regions. The region of interest module (70) is also configured to extract one or more features from the one or more regions based on a first plurality of parameters comprising shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions. The processing subsystem (20) also includes a length scale analysis module (80) operatively coupled to the region of interest module (70). The length scale analysis module (80) is configured to derive one or more predominant features from the one or more features through a length scale analysis technique. The processing subsystem (20) further includes a classification module (90) configured to classify each of the one or more predominant features into at least one category based on a second plurality of parameters. The classification module (90) is also configured to assign at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image. The processing subsystem (20) also includes a comparison module (100) configured to compare the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score. The comparison module (100) is also configured to provide one or more assessments based on the healing score evaluated. The one or more assessments comprise improved, stable, and worsened. The processing subsystem (20) further includes a visualization module (110) configured to render the tagged image, the prestored tagged image and the one or more assessments in a user interface, thereby providing the one or more insights regarding the one or more health conditions of the patient utilizing the at least one medical image. The processing subsystem (20) also includes a notification module (120) configured to notify a plurality of users upon providing the one or more assessments by the comparison module (100).
[0049] The processing subsystem (20) also includes an interactive module (130) configured to enable bidirectional communication between a plurality of users when the visualization module (110) is rendering the tagged image, the prestored tagged image and the one or more assessments in the user interface associated with the plurality of users.
[0050] The processing subsystem (20) further includes an authentication module (140) configured to authenticate the patient based on one or more credentials provided by the patient. The authentication module (140) is also configured to enable the patient to access the at least one medical image stored in the integrated database (50) upon authenticating the patient.
[0051] Computer memory elements may include any suitable memory device(s) for storing data and executable program, such as read only memory, random access memory, erasable programmable read only memory, electrically erasable programmable read only memory, hard drive, removable media drive for handling memory cards and the like. Embodiments of the present subject matter may be implemented in conjunction with program modules, including functions, procedures, data structures, and application programs, for performing tasks, or defining abstract data types or low-level hardware contexts. Executable program stored on any of the above-mentioned storage media may be executable by the processor(s) (190).
[0052] FIG. 5 is a flow chart representing the steps involved in a method (300) to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image in accordance with an embodiment of the present disclosure. The method (300) includes acquiring the at least one medical image from at least one medical device in step 310. In one embodiment, acquiring the at least one medical image from at least one medical device includes acquiring the at least one medical image from at least one medical device by a data acquisition module. In one embodiment, the at least one medical image may include an X-ray image, a computerized tomography (CT) scan image, a magnetic resonance image, a positron emission tomography (PET) scan image and the like. In some embodiments, the at least one medical device may include an X-ray machine, a computerized tomography scan machine, a magnetic resonance imaging machine and a PET scan machine. In one embodiment, the data acquisition module may convert the at least one medical image into a standard format including bitmap (BMP), joint photographic experts group (JPEG), and portable network graphics (PNG).
[0053] Further, in some embodiments, the data acquisition module may store the at least one medical image into an integrated database by embedding a plurality of metadata in the at least one medical image. In such an embodiment, the plurality of metadata may include the name of the patient, address of the patient, contact number of the patient, email address of the patient, modality of the at least one medical image, date of imaging, location of imaging and the like. For example, consider a scenario in which the data acquisition module is configured to acquire the PET scan image of a patient X. The data acquisition module may convert the PET scan image into the PNG format and may store the PET scan image in the integrated database along with the name of the patient X and the contact number of the patient X.
[0054] The method (300) also includes segmenting the at least one medical image using a plurality of techniques to isolate one or more regions in step 320. In one embodiment, segmenting the at least one medical image using a plurality of techniques to isolate one or more regions includes segmenting the at least one medical image using a plurality of techniques to isolate one or more regions by a region of interest module.
[0055] Further, in one embodiment, the plurality of techniques may include at least one of a thresholding, region growing, and watershed technique. As used herein, the thresholding may be an image processing technique which separates the one or more regions from the background based on pixel values of the one or more regions lying below or above a predefined threshold. As used herein, region growing may be defined as a technique that involves grouping of pixels with similar characteristics to isolate the one or more regions. As used herein, the watershed technique may utilize pixel intensity to isolate the one or more regions.
[0056] The method (300) also includes extracting one or more features from the one or more regions based on a first plurality of parameters comprising shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions in step 330. In one embodiment, extracting one or more features from the one or more regions based on a first plurality of parameters comprising shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions includes extracting one or more features from the one or more regions based on a first plurality of parameters comprising shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions by the region of interest module. In one embodiment, the one or more features may include one or more inflammations, tumors, deformations, wounds, infections and the like.
[0057] The method (300) also includes deriving one or more predominant features from the one or more features through a length scale analysis technique in step 340. In one embodiment, deriving one or more predominant features from the one or more features through a length scale analysis technique includes deriving one or more predominant features from the one or more features through a length scale analysis technique by a length scale analysis module.
[0058] Further, in one embodiment, the length scale analysis module may be configured to derive the one or more predominant features from the one or more features based on respective area of the one or more features. In one embodiment, the length scale analysis technique may include at least one of pyramid decomposition, scale-space theory, wavelet transform, morphological scale space, and local binary patterns. As used herein the pyramid decomposition may be defined as an image processing technique that creates a series of images at different scales for enabling multi resolution analysis. As used herein the scale space theory may be defined as a framework for analyzing the series of images at multiple scales to analyze the one or more features having different dimensions. As used herein the wavelet transforms may be used to analyze the one or more features to extract time and frequency information. As used herein the morphological scale space may be used to analyze the one or more features using morphological operations. As used herein the local binary patterns may be used to analyze the one or more features based on the intensity of the pixels.
[0059] The method (300) also includes classifying each of the one or more predominant features into at least one category based on a second plurality of parameters in step 350. In one embodiment, classifying each of the one or more predominant features into at least one category based on a second plurality of parameters includes classifying each of the one or more predominant features into at least one category based on a second plurality of parameters by a classification module.
[0060] Additionally, in one embodiment, the at least one category may include small category, medium category, and large category. In some embodiments, the second plurality of parameters may include at least one of a size of each of the one or more predominant features, a clinical reference, a user preference, an ease of interpretation, consistency and clarity.
[0061] The method (300) also includes assigning at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image in step 360. In one embodiment, assigning at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image includes assigning at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image by the classification module.
[0062] Further, in one embodiment, the at least one identifier comprises a numerical identifier, an alphanumeric identifier, a color code, a symbolic identifier, a string. In such an embodiment, the numerical identifier may represent the small category with 1, the medium category with 2, and the large category with 3. In one embodiment, the alphanumeric identifier may represent the small category with ‘S1’, the medium category with ‘M1’ and the large category with 'L1’. In some embodiments, the color code may represent the small category with a green marker, the medium category with a yellow marker, and the large category with a red marker. In a specific embodiment, the symbolic identifier may represent the small category with a star symbol, the medium category with a circular symbol, and the large category with a rectangular symbol. In one embodiment, the string may represent the small category with a word ‘small’, medium category with the word ‘medium’, and the large category with the word 'large’.
[0063] The method (300) also includes comparing the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score in step 370. In one embodiment, comparing the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score includes comparing the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score by a comparison module. In one embodiment, the prestored tagged image may be stored in the integrated database.
[0064] The method (300) also includes providing one or more assessments based on the healing score evaluated in step 380. In one embodiment, providing one or more assessments based on the healing score evaluated includes providing one or more assessments based on the healing score evaluated by the comparison module.
[0065] The method (300) also includes rendering the tagged image, the prestored tagged image and the one or more assessments in a user interface, thereby providing the one or more insights regarding the one or more health conditions of the patient utilizing the at least one medical image in step 390. In one embodiment, rendering the tagged image, the prestored tagged image and the one or more assessments in a user interface includes rendering the tagged image, the prestored tagged image and the one or more assessments in a user interface by a visualization module. In one embodiment, the user interface may be associated with at least one of a medical practitioner and the patient. In some embodiments, the one or more insights may include the size of the one or more predominant features and the one or more assessments.
[0066] Various embodiments of the system and the method to provide the one or more insights regarding the one or more health conditions of the patient utilizing the at least one medical image described above provide various advantages. The combination of the data acquisition module, the region of interest module, the length scale analysis module, the classification module, the comparison module, and the visualization module is capable of providing the one or more insights regarding the one or more health conditions of the patient based on the at least one medical image without relying on the technician and the report provided by the technician, thereby ensuring timely medical care to the patient.
[0067] Further, the visualization module is capable of rendering the tagged image, the prestored tagged image, and the one or more assessments in the graphical user interface, thereby enabling the medical practitioner to make informed decisions. The notification module is capable of notifying the medical practitioner and the patient upon providing the one or more assessments by the comparison module, thereby fast-tracking the treatment procedures. The data acquisition module is capable of converting the at least one medical image acquired in various formats into a common format, thereby facilitating concurrent visualization and comparative analysis of the at least one medical image belonging to different modalities in the user interface. The interactive module enables bidirectional communication between the medical practitioner, the patient, and another medical practitioner located at another place, thereby providing efficient care to the patient.
[0068] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the disclosure and are not intended to be restrictive thereof. While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended.
[0069] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all the acts need to be necessarily performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.

CLAIMS:
1. A system (10) to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image comprising:
a processing subsystem (20) hosted on a server (30) and configured to execute on a network (40) to control bidirectional communications among a plurality of modules comprising:
a data acquisition module (60) operatively coupled to an integrated database (50), wherein the data acquisition module (60) is configured to acquire the at least one medical image from at least one medical device;
characterized in that:
a region of interest module (70) configured to:
segment the at least one medical image using a plurality of techniques to isolate one or more regions;
extract one or more features from the one or more regions based on a first plurality of parameters comprising, shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions;
a length scale analysis module (80) operatively coupled to the region of interest module (70), wherein the length scale analysis module (80) is configured to derive one or more predominant features from the one or more features through a length scale analysis technique;
a classification module (90) configured to:
classify each of the one or more predominant features into at least one category based on a second plurality of parameters;
assign at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image;
a comparison module (100) configured to:
compare the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score;
provide one or more assessments based on the healing score evaluated, wherein the one or more assessments comprises improved, stable, worsened; and
a visualization module (110) configured to render the tagged image, the prestored tagged image and the one or more assessments in a user interface, thereby providing the one or more insights regarding the one or more health conditions of the patient utilizing the at least one medical image.
2. The system (10) as claimed in claim 1, wherein the plurality of techniques comprises at least one of a thresholding, region growing, and watershed technique.
3. The system (10) as claimed in claim 1, wherein the length scale analysis technique comprises at least one of pyramid decomposition, scale-space theory, wavelet transform, morphological scale space, and local binary patterns.
4. The system (10) as claimed in claim 1, wherein the at least one category comprises small category, medium category, and large category.
5. The system (10) as claimed in claim 1, wherein the at least one identifier comprises a numerical identifier, an alphanumeric identifier, a color code, a symbolic identifier, a string.
6. The system (10) as claimed in claim 1, wherein the processing subsystem (20) comprises a notification module (120) configured to notify a plurality of users upon providing the one or more assessments by the comparison module (100).
7. The system (10) as claimed in claim 1, wherein the processing subsystem (20) comprises an interactive module (130) configured to enable bidirectional communication between a plurality of users when the visualization module (110) is rendering the tagged image, the prestored tagged image and the one or more assessments in the user interface associated with the plurality of users.
8. The system (10) as claimed in claim 1, wherein the processing subsystem (20) comprises an authentication module (140) configured to:
authenticate the patient based on one or more credentials provided by the patient; and
enable the patient to access to the at least one medical image stored in the integrated database (50) upon authenticating the patient.
9. The system (10) as claimed in claim 1, wherein the second plurality of parameters comprises at least one of a size of each of the one or more predominant features, a clinical reference, a user preference, an ease of interpretation, consistency and clarity.
10. A method (300) to provide one or more insights regarding one or more health conditions of a patient utilizing at least one medical image comprising:
acquiring, by a data acquisition module, the at least one medical image from at least one medical device; (310)
characterized in that:
segmenting, by a region of interest module, the at least one medical image using a plurality of techniques to isolate one or more regions; (320)
extracting, by the region of interest module, one or more features from the one or more regions based on a first plurality of parameters comprising, shape, texture, color, presence of discharge, temperature gradient, and one or more dimensions; (330)
deriving, by a length scale analysis module, one or more predominant features from the one or more features through a length scale analysis technique; (340)
classifying, by a classification module, each of the one or more predominant features into at least one category based on a second plurality of parameters; (350)
assigning, by the classification module, at least one identifier to each of the one or more predominant features based on the at least one category to obtain a tagged image; (360)
comparing, by a comparison module, the tagged image with a prestored tagged image based on the first plurality of parameters to evaluate a healing score; (370)
providing, by the comparison module, one or more assessments based on the healing score evaluated, wherein the one or more assessments comprises improved, stable, worsened; (380) and
rendering, by a visualization module, the tagged image, the prestored tagged image and the one or more assessments in a user interface, thereby providing the one or more insights regarding the one or more health conditions of the patient utilizing the at least one medical image. (390)

Dated this 25th day of October 2023
Signature

Jinsu Abraham
Patent Agent (IN/PA-3267)
Agent for the Applicant

Documents

Application Documents

# Name Date
1 202221043848-FORM 2-01-08-2022.pdf 2022-08-01
2 202221043848-FORM 1-01-08-2022.pdf 2022-08-01
3 202221043848-PostDating-(19-07-2023)-(E-6-148-2023-MUM).pdf 2023-07-19
4 202221043848-POA [19-07-2023(online)].pdf 2023-07-19
5 202221043848-FORM-26 [19-07-2023(online)].pdf 2023-07-19
6 202221043848-FORM 13 [19-07-2023(online)].pdf 2023-07-19
7 202221043848-APPLICATIONFORPOSTDATING [19-07-2023(online)].pdf 2023-07-19
8 202221043848-FORM FOR STARTUP [26-10-2023(online)].pdf 2023-10-26
9 202221043848-EVIDENCE FOR REGISTRATION UNDER SSI [26-10-2023(online)].pdf 2023-10-26
10 202221043848-DRAWING [26-10-2023(online)].pdf 2023-10-26
11 202221043848-CORRESPONDENCE-OTHERS [26-10-2023(online)].pdf 2023-10-26
12 202221043848-COMPLETE SPECIFICATION [26-10-2023(online)].pdf 2023-10-26
13 202221043848-STARTUP [21-11-2023(online)].pdf 2023-11-21
14 202221043848-FORM28 [21-11-2023(online)].pdf 2023-11-21
15 202221043848-FORM-9 [21-11-2023(online)].pdf 2023-11-21
16 202221043848-FORM 18A [21-11-2023(online)].pdf 2023-11-21
17 Abstract.jpg 2023-12-14
18 202221043848-FER.pdf 2024-06-21
19 202221043848-FORM 3 [28-06-2024(online)].pdf 2024-06-28
20 202221043848-FER_SER_REPLY [22-08-2024(online)].pdf 2024-08-22
21 202221043848-US(14)-HearingNotice-(HearingDate-18-11-2024).pdf 2024-10-18
22 202221043848-FORM-26 [15-11-2024(online)].pdf 2024-11-15
23 202221043848-Correspondence to notify the Controller [15-11-2024(online)].pdf 2024-11-15
24 202221043848-Written submissions and relevant documents [28-11-2024(online)].pdf 2024-11-28
25 202221043848-PatentCertificate03-12-2024.pdf 2024-12-03
26 202221043848-IntimationOfGrant03-12-2024.pdf 2024-12-03

Search Strategy

1 202221043848E_20-06-2024.pdf

ERegister / Renewals

3rd: 28 Feb 2025

From 01/11/2024 - To 01/11/2025

4th: 28 Feb 2025

From 01/11/2025 - To 01/11/2026