ABSTRACT
A REAL TIME METHOD FOR REMOTELY ANALYSING LESIONS
The invention provides a real time method for remotely analysing lesions for effective triaging of patients. The method includes performing a first validation for capturing an image of the lesions, the first validation includes positioning an image capturing device with respect to the lesions within a predefined degree of orientation in X axis and Y axis, performing a second validation for capturing the image of the lesions by validating on real time basis the positioning of the image capturing device with respect to the lesions by an expert, capturing image of the lesions from the position validated through the first validation and the second validation, storing the captured image in a central server and analysing the lesions in the image for degree of severity of the lesions. The degree of severity of the lesions in the image is analysed by calculating the length, width, area, colour and depth of the lesions. FIG.1
A REAL TIME METHOD FOR REMOTELY ANALYSING LESIONS
FIELD OF INVENTION
The invention generally relates to the field of medical imaging and more specifically, embodiments of the invention relate to a method for remotely analysing lesions in real time for effective triaging of patients.
BACKGROUND
For any disease management process, effective and efficient triaging of patients is the first and most crucial step. To arrive at a logical clinical judgement, clinicians have to rely on and study various clinically relevant images. In today's digital world, clinically relevant images are provided through means including but not limited to EMR systems, image storage systems, social media tools, MMS or image sharing tools, CDs, USB drives, printed versions, soft copies in various formats, video-consultation tools, tele-consultation tools etc.
Though remote analysis of clinical images by clinicians and health professionals provides early diagnosis, even for patients located in places with poor health facilities, providing a logical diagnosis based on clinical images faces many challenges. Firstly, the images are not accompanied by relevant clinical information, leaving ample room for miscommunication, misunderstanding and misrepresentation, which, if not attended to on a priority basis, could lead to life threatening situations. Secondly, there can be considerable variability in the quality of images in terms of the angle of view or the angle at which the image is captured; with even a slight variation, measurements such as length, angles and area cannot be assessed accurately.
Various solutions are provided in the art for addressing the above mentioned challenges. One way to address calibration issues disclosed in the prior art includes inserting a digital marker comprising a frame, for example, a circular or rectangular frame, indicating to the user to position the area of interest within the frame when the image is captured. Another solution involves using a first (original) stored image in blur while capturing subsequent images, informing the user about the focus, saturation and underexposure of the image.
One significant disadvantage of using a frame or a first image in blur is that it ensures an appropriate distance from which the image is captured but not the angle at which the image is captured.
Another way to address the challenge includes the use of two mobile devices: a first mobile device captures a first image of the lesion from a first predefined distance, and a second mobile device captures a second image of the lesion along with the first image from a distance greater than the first distance. A data processor analyses the second image to assist in the medical diagnosis of the lesion; the analysis of the second image includes analysis of the size, shape and/or light intensity of objects in the first image to help evaluate the size, area, depth and volume of the wound. One significant disadvantage of this technique is the requirement of two mobile devices and a user-unfriendly procedure for capturing the image of the wound.
Still another method disclosed for aiding the user in capturing an image includes the use of an internal inclinometer (including accelerometers, liquid capacitive and other level-sensing devices) in the smartphone to aid the user in aligning the camera in the desired orientation. One significant disadvantage of this method is that the aid in image acquisition is provided only at the device level, which is a limitation for users who may not be able to follow the instructions provided on the device.
Hence, there is a need for a method that allows a user, located even in a remote location and with minimal technical skills and resources, to capture clinically relevant images to be used by the clinician for real time analysis of lesions, thereby resulting in effective triaging of patients.
BRIEF DESCRIPTION OF DRAWINGS
So that the manner in which the recited features of the invention can be understood in detail, some of the embodiments are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
Fig.1 shows a flow chart depicting a real-time method for remotely analysing lesions, according to an embodiment of the invention.
FIG.2 illustrates a block diagram of a real-time system for remotely analysing lesions, according to an embodiment of the invention.
FIG.3 shows a representative screenshot of a user interface showing the patient’s journey and clinician’s journey, according to an embodiment of the invention.
FIG.4 shows a representative screenshot of a user interface showing capturing of the image with correct degree of orientation, according to an embodiment of the invention.
FIG.5 shows a representative screenshot of a user interface showing calculation of length, width and area of lesion, according to an embodiment of the invention.
SUMMARY OF THE INVENTION
One aspect of the invention provides a real time method for remotely analysing lesions for effective triaging of patients. The method includes performing a first validation for capturing an image of the lesions, the first validation includes positioning an image capturing device with respect to the lesions within a predefined degree of orientation in X axis and Y axis, performing a second validation for capturing the image of the lesions by validating on real time basis the positioning of the image capturing device with respect to the lesions by an expert, capturing image of the lesions from the position validated through the first validation and the second validation, storing the captured image in a central server and analysing the lesions in the image for degree of severity of the lesions. The degree of severity of the lesions in the image is analysed by calculating the length, width, area, colour and depth of the lesions.
Another aspect of the invention provides a real time system for remotely analysing lesions; the system includes an image capturing device for capturing image of a lesion, a first validation module for performing a first validation, a second validation module for performing a second validation, a centralised server for retrievably storing the captured images and an analysing module for analysing lesions in the captured image for degree of severity.
DETAILED DESCRIPTION OF THE INVENTION
The definitions, terms and terminology adopted in the disclosure have their usual meaning and interpretations, unless otherwise specified. Various embodiments of the invention provide a real time method for remotely analysing lesions. The method includes performing a first validation for capturing an image of the lesions. The first validation is performed by positioning an image capturing device with respect to the lesions within a predefined degree of orientation in X axis and Y axis. Following the first validation, a second validation is performed for capturing the image of the lesions by validating on real time basis the positioning of the image capturing device with respect to the lesions by an expert. An image of the lesions is captured through the image capturing device from the position validated through the first validation and the second validation. The captured image is then stored in a centralized server and the lesions in the image are analysed for degree of severity of the lesions by calculating the length, width, area, colour and depth of the lesions. Analysing the lesions for degree of severity helps in effective triaging of patients by grading the patients in different groups depending upon the degree of severity.
FIG.1 shows a flow chart depicting a real-time method for remotely analysing lesions, according to an embodiment of the invention. Capturing a clinically relevant image is of prime importance to enable remote analysis of the lesion. The image capturing device should be in correct orientation and range with respect to the lesions. The image capturing device includes but is not limited to a digital camera, a smartphone with a camera, a mobile device with a USB extendable camera, a tablet with an external camera, a camera embedded in a digital kiosk and other such image capturing devices known to a person skilled in the art. In one embodiment of the invention, the image capturing device is the digital camera of a smartphone. For capturing an image of the lesion, a user activates the camera of the image capturing device 101. In one embodiment of the invention, a user activates the camera of a smartphone 101; additionally, a level sensing means including but not limited to an inclinometer, a gyroscope or an air bubble in liquid is activated. In one embodiment of the invention, an in-built gyroscope of the smartphone is activated to enable positioning of the camera. The first validation 103 includes device based means for enabling a user to position the image capturing device with respect to the lesions such that the orientation of the image capturing device in the X axis and in the Y axis with respect to the lesion is within a predefined degree of orientation.
In one embodiment of the invention, the screen of the smartphone displays markings overlaid upon the camera's preview field-of-view, allowing the user to align the camera 105 within the displayed markings by moving the camera front/back or clockwise/counter-clockwise with respect to the lesion, and to capture the image of the lesions when the orientation of the image capturing device in the X axis and in the Y axis with respect to the lesion is within a predefined degree of orientation. The predefined degree of orientation varies in the range of 0 to 5 degrees in the X axis and in the Y axis. The markings displayed include but are not limited to a vertical line, a horizontal line, a plurality of lines (for example, a protractor), numbers displaying the degree of tilt in the X axis and the Y axis, and a virtual bubble. In one embodiment of the invention, numbers are displayed under the X axis and the Y axis informing the user of the degree of deviation in each axis. In one embodiment of the invention, the 'click' button of the camera is activated only when the degree of orientation of the camera with respect to the lesion is less than 5 degrees in the X axis and in the Y axis. In another embodiment of the invention, the user is guided by an audio signal to position the camera such that the degree of orientation of the camera with respect to the lesion is less than 5 degrees in the X axis and less than 5 degrees in the Y axis. The audio signal includes but is not limited to a computer-to-human voice interface, automated voice commands, sounds, and sound patterns. In another embodiment of the invention, artificial intelligence and machine learning are employed for the first validation.
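The first validation described above can be sketched in code. The following is a minimal illustration, not the patent's actual implementation: it assumes the device's level sensing means reports tilt about the X and Y axes in degrees, and gates the capture button on the 5-degree tolerance of the described embodiment. The function and variable names are hypothetical.

```python
# Minimal sketch of the first validation: the capture ('click') button is
# enabled only while the tilt reported by a level sensor stays within the
# predefined degree of orientation (5 degrees in both axes, per the
# described embodiment). Names here are illustrative, not from the patent.

TOLERANCE_DEG = 5.0  # predefined degree of orientation


def first_validation(tilt_x_deg: float, tilt_y_deg: float,
                     tolerance_deg: float = TOLERANCE_DEG) -> bool:
    """Return True when the camera is level enough to allow capture."""
    return abs(tilt_x_deg) < tolerance_deg and abs(tilt_y_deg) < tolerance_deg


def guidance(tilt_x_deg: float, tilt_y_deg: float) -> str:
    """Hypothetical helper producing the kind of prompt the on-screen
    markings or audio signal might give the user for each axis."""
    hints = []
    if abs(tilt_x_deg) >= TOLERANCE_DEG:
        hints.append("correct tilt about the X axis")
    if abs(tilt_y_deg) >= TOLERANCE_DEG:
        hints.append("correct tilt about the Y axis")
    return "hold steady and capture" if not hints else "; ".join(hints)
```

In practice the tilt values would come from the in-built gyroscope or inclinometer, and the boolean result would enable or disable the camera's click button.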
After performing the first validation 103, a second validation 109 is performed by an expert in real time for validating the position of the image capturing device with respect to the lesions. The expert is either an expert person, an expert system or a combination of both. The expert person includes a clinician/doctor/healthcare official with a minimum of 3 years of specialist training. The expert system includes a computer programme that uses artificial intelligence and machine learning. In one embodiment of the invention, the second validation is performed by an expert person manually; the manual process involves following certain guidelines and standards as described in textbooks of medicine, surgery and histopathology. In another embodiment of the invention, the expert person performs the second validation through a digital device including but not limited to a smartphone, a tablet, a smartwatch, a computer and a digital kiosk. In yet another embodiment of the invention, an expert person uses artificial intelligence for performing the second validation.
In one embodiment of the invention, the second validation is performed by an expert person in real time by connecting to the user through virtual consultation/video conference.
For the second validation, the expert prompts the user to correct the orientation 113 and/or range of the camera with respect to the lesions by either tilting the image capturing device or moving it back or front. Once the orientation of the camera of the image capturing device is validated 115 by the expert in real time, the user is allowed to capture the image of the lesions 117. In one embodiment of the invention, the user is asked to place a reference object adjacent to the lesions. The reference object is an object with known and standard dimensions and includes a ruler, a credit/debit card, a coin, a WHO dental/periodontal probe, a Williams periodontal probe, a Nabers probe or a standard millimetre scale. In one embodiment of the invention, the click button of the camera of the smartphone is activated, allowing the user to capture the image of the lesion from the position validated after the first validation and the second validation. The user can enter metadata related to the image. The metadata entered includes but is not limited to the age of the patient, the name of the patient, medical history if any, clinical information related to the lesion, and general information related to the lesion including size, shape, surface appearance, colour, location in relation to adjacent anatomical structures, consistency, pulsatility, fluctuation, mobility, compressibility, reducibility, transillumination, percussion, auscultation and additional notes. The captured image along with the metadata entered by the user is transmitted to a centralized server 119. The stored image is analysed by an expert 121 for the degree of severity of the lesions. The expert is either an expert person, an expert system or a combination of both. The expert person includes a clinician/doctor/healthcare official with a minimum of 3 years of specialist training. The expert system includes a computer programme that uses artificial intelligence and machine learning.
The analysis includes but is not limited to calculating the length, width, area, colour and depth of the lesions. In one embodiment of the invention, for analysing the lesions, the expert retrieves the image stored in the central server. Basic image editing operations like crop, rotate, align to grids, zoom in and zoom out can be performed on the image as required by the expert. Subsequent to this, the image size of the lesions and the image size of the reference standard object are determined. In one embodiment of the invention, the expert person employs software that uses colour coded lines for measuring the length of the reference object and the length, the width and the area of the lesions. Determining the length of the reference standard object enables relative measurement of the lesions, resulting in increased accuracy. Further measurement of the dimensions of the lesions is enabled through various colour coded lines. In one embodiment of the invention, a maroon colour line shows the length of the reference object, a blue colour line shows the length of the lesion, yellow lines show angle measurements and red colour lines show area measurements. In another embodiment of the invention, the expert system uses artificial intelligence for analysing the lesions for the degree of severity. In another embodiment of the invention, the expert person uses artificial intelligence for analysing the lesions for the degree of severity.
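The relative measurement enabled by the reference object can be illustrated with a short sketch. This is a hypothetical example of the underlying arithmetic, not the patent's software: the known physical length of the reference object yields a millimetres-per-pixel scale, which converts pixel measurements of the lesion into real dimensions (area scales with the square of the linear scale).

```python
# Minimal sketch, with hypothetical names, of reference-object calibration:
# measuring the reference object of known dimensions in the image gives a
# mm-per-pixel scale factor for converting lesion measurements.

def mm_per_pixel(reference_length_mm: float, reference_length_px: float) -> float:
    """Scale factor derived from a reference object of known dimensions."""
    return reference_length_mm / reference_length_px


def lesion_dimensions(length_px: float, width_px: float,
                      area_px: float, scale: float) -> dict:
    """Convert pixel measurements of the lesion into millimetres.
    Linear dimensions scale by `scale`; area scales by `scale` squared."""
    return {
        "length_mm": length_px * scale,
        "width_mm": width_px * scale,
        "area_mm2": area_px * scale ** 2,
    }


# Illustrative values: a standard 85.6 mm card spanning 428 px
# gives a scale of 0.2 mm per pixel.
scale = mm_per_pixel(85.6, 428.0)
dims = lesion_dimensions(length_px=120, width_px=60, area_px=5400, scale=scale)
```

With these illustrative numbers, a 120 px lesion length corresponds to 24 mm, and a 5400 px² region to 216 mm², which is the kind of relative measurement the colour coded lines support.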
On the basis of the calculations of the length, width, area and depth of the lesion, the degree of severity of the lesions is estimated. In one embodiment of the invention, based on the degree of severity of the lesions, a patient can be graded as: A grade – Normal, B grade – needs urgent care and further examination, C grade – needs priority care and further examination and D grade – needs emergency care and further examination. Through the grading of patients, effective triaging and monitoring of patients can be done.
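The four-grade triage described above can be sketched as a simple mapping. The numeric severity score and its cut-offs below are hypothetical illustrations; the specification does not fix particular thresholds, only the A–D grades and their meanings.

```python
# Minimal sketch mapping an estimated severity score in [0, 1] to the four
# grades described in the specification. The thresholds are hypothetical.

GRADES = [
    (0.25, "A", "Normal"),
    (0.50, "B", "needs urgent care and further examination"),
    (0.75, "C", "needs priority care and further examination"),
    (1.00, "D", "needs emergency care and further examination"),
]


def grade_patient(severity: float) -> tuple:
    """Return (grade, description) for a severity score in [0, 1]."""
    for upper, grade, description in GRADES:
        if severity <= upper:
            return grade, description
    # Scores above 1.0 fall through to the most severe grade.
    return "D", GRADES[-1][2]
```

Such a mapping allows patients to be sorted into queues by grade, which is the triaging and monitoring step the method is aimed at.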
In another embodiment of the invention, the expert clinician can refer the analysed lesions for a second opinion from another expert person or from AI. Once the AI system is trained using ML methods aimed at lesion analysis, its efficiency increases over a period of time. Additionally, every single image which is added or processed adds to the knowledge bank.
In one embodiment of the invention, the analysed images of the lesions are stored in the centralized server to be used by the expert as reference for analysing other images and as a training data set for machine learning. In one embodiment of the invention, for analysing, multiple images from medical and surgery open libraries, soft copies of text books and pathology databanks are used by the expert to look for specific abnormalities in size, shape, colour, anatomic features and microscopic features by comparing to normal images.
In an alternate embodiment of the invention, the real time method is performed using a digital kiosk for patients who do not have access to technology or who do not have the required skills. The digital kiosks are enabled with a camera for capturing an image of the lesions, a level sensing means for performing the first validation, and AI for performing the second validation and analysing the lesions in the captured image for degree of severity. The digital kiosk provides tele-consultation/virtual consultation and recording of basic medical vitals with third-party tools and instruments located at remote community centres/camps/villages/kiosks at hospitals.
Various embodiments of the invention also provide a real time system for remotely analysing lesions. FIG.2 illustrates a block diagram of a real-time system for remotely analysing lesions, according to an embodiment of the invention. The system includes an image capturing device 201 for capturing an image of a lesion, a first validation module 203 coupled to the image capturing device for performing a first validation, a second validation module 205 coupled to the image capturing device for performing a second validation, a centralized server 207 for retrievably storing the captured images and an analysis module 209 for analysing lesions in the captured image for degree of severity. The image capturing device 201 is either a hand held device or a wearable device and includes but is not limited to a digital camera, a smartphone with a camera, a mobile device with a USB extendable camera, a tablet with an in-built or external camera, and a camera embedded in a digital kiosk. The first validation module 203 includes a level sensing means and is either in-built in the image capturing device or is external. The level sensing means includes but is not limited to an inclinometer, a gyroscope, an air bubble in liquid and other level sensing means as known to a person skilled in the art. The second validation module 205 includes an expert system that uses AI and ML for performing the second validation, and means for an expert person to guide the user to correct the orientation of the image capturing device with respect to the lesions. The analysis module includes an AI and ML enabled expert system for analysing the lesions for degree of severity, and software providing colour coded lines for the expert person for analysing the lesions. The invention also provides a user interface for analysing lesions in real time. FIG.3 shows a representative screenshot of a user interface showing the patient's journey and clinician's journey, according to an embodiment of the invention.
Once a user logs in with a valid user id and password, the user is allowed to capture an image of the lesion following a two step validation process. In the first validation, the camera of the smartphone and the in-built gyroscope are activated to allow a user to capture an image of the lesion when the degree of orientation of the camera with respect to the lesion is less than 5 degrees in the X axis and in the Y axis (FIG. 4). The camera click button is activated only when the degree of orientation of the camera with respect to the lesion is less than 5 degrees in the X axis and in the Y axis. Following the first validation, a second validation is initiated where an expert in real time guides the user to correct the degree of orientation of the camera with respect to the lesion. Once the position of the camera with respect to the lesion is validated by the first validation and the second validation, the user is allowed to capture the image of the lesion. The user is asked to place a reference object adjacent to the lesion. The reference object is an object with known and standard dimensions and includes a ruler, a credit/debit card, a coin, a WHO dental/periodontal probe, a Williams periodontal probe, a Nabers probe or a standard millimetre scale. The captured image is saved and transmitted to the central server. For analysing the lesions in the image, the clinician can retrieve the image from the central server. The clinician analyses the lesions in the image for degree of severity by calculating the length, width, area, colour and depth of the lesion. FIG.5 shows a representative screenshot of a user interface showing calculation of length, width and area of a lesion, according to an embodiment of the invention. A maroon colour line shows the length of the reference object, a blue colour line shows the length of the lesion, yellow lines show angle measurements and red colour lines show area measurements.
Based on degree of severity of the lesions, a patient can be graded as; A grade – Normal, B grade – needs urgent care and further examination, C grade – needs priority care and further examination and D grade – needs emergency care and further examination.
The invention thus advantageously provides a real time method for remotely analysing lesions: it enables capturing of clinically relevant images by a person having minimal technical knowledge, through a first validation and a second validation by an expert (in real time) of the correct orientation and range with respect to the lesions, and analysis of the image by an expert for degree of severity of the lesions, for effective triaging of patients.
The foregoing description of the invention has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof.
CLAIMS
We Claim:
1. A real time method for remotely analysing lesions, the method comprising:
performing a first validation for capturing an image of the lesions wherein the first validation includes positioning an image capturing device with respect to the lesions within a predefined degree of orientation in X axis and Y axis;
performing a second validation for capturing the image of the lesions wherein the positioning of the image capturing device with respect to the lesions is validated by an expert in real time;
capturing image of the lesions through the image capturing device from the position of the image capturing device with respect to the lesions validated through the first validation and the second validation;
storing the captured image in a central server; and
analysing the lesions in the image for degree of severity of the lesions wherein the analysing includes calculating the length, width, area, colour and depth of the lesions.
2. The method as claimed in claim 1, wherein the predefined degree of orientation in X axis is within 5 degrees.
3. The method as claimed in claim 1, wherein the predefined degree of orientation in Y axis is within 5 degrees.
4. The method as claimed in claim 1, wherein the image capturing device includes a hand held device or a wearable device wherein the image capturing device includes a digital camera, a smartphone with a camera, a mobile device with a USB extendable camera, a tablet with an external camera, a camera embedded in a digital kiosk.
5. The method as claimed in claim 1, wherein the first validation is performed through a level sensing means wherein the level sensing means is either in-built with an image capturing device or is external.
6. The method as claimed in claim 1, wherein the level sensing means includes an inclinometer, a gyroscope, an air bubble in liquid, or other level sensing means.
7. The method as claimed in claim 1, wherein the level sensing means displays the readings as markings including a vertical line, a horizontal line, a plurality of lines (for example, a protractor), numbers displaying degree of tilt in X axis and Y axis, and a virtual bubble.
8. The method as claimed in claim 1, wherein the second validation is performed in real time by an expert wherein the expert includes an expert person, an expert system that uses artificial intelligence and machine learning or a combination thereof.
9. The method as claimed in claim 1, wherein the length, width, area, colour and depth of the lesions in the captured image is calculated by an expert person or an expert system that uses AI and ML.
10. The method as claimed in claim 1, wherein the analysis of lesions in the image for degree of severity helps in effective triaging of patients wherein, depending on the degree of severity of lesions, the patient can be graded as A grade – Normal, B grade – needs urgent care and further examination, C grade – needs priority care and further examination and D grade – needs emergency care and further examination.
11. A real time system for remotely analysing lesions, the system comprising:
an image capturing device for capturing image of a lesion;
a first validation module coupled to the image capturing device for performing a first validation;
a second validation module coupled to the image capturing device for performing a second validation;
a centralized server for retrievably storing the captured images; and
an analysis module for analysing lesions in the captured image for degree of severity.
12. The system as claimed in claim 11, wherein the image capturing device includes a hand held device or a wearable device wherein the image capturing device includes a digital camera, a smartphone with a camera, a mobile device with a USB extendable camera, a tablet with an external camera, a camera embedded in a digital kiosk.
13. The system as claimed in claim 11, wherein the first validation module includes a level sensing means including an inclinometer, a gyroscope, an air bubble in liquid wherein the level sensing means is either in-built in the image capturing device or is external.
14. The system as claimed in claim 11, wherein the second validation module includes a means for an expert to guide the user to correct the orientation of the image capturing device with respect to the lesions.
15. The system as claimed in claim 11, wherein the analysis module includes an AI and ML enabled system that allows an expert to analyse the lesions for degree of severity by using colour coded lines.
Bangalore ANJU RAWAT
29th December 2023 (INTELLOCOPIA IP SERVICES)
AGENT FOR APPLICANT
| # | Name | Date |
|---|---|---|
| 1 | 202241076984-PROVISIONAL SPECIFICATION [29-12-2022(online)].pdf | 2022-12-29 |
| 2 | 202241076984-FORM FOR SMALL ENTITY(FORM-28) [29-12-2022(online)].pdf | 2022-12-29 |
| 3 | 202241076984-FORM FOR SMALL ENTITY [29-12-2022(online)].pdf | 2022-12-29 |
| 4 | 202241076984-FORM 1 [29-12-2022(online)].pdf | 2022-12-29 |
| 5 | 202241076984-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-12-2022(online)].pdf | 2022-12-29 |
| 6 | 202241076984-EVIDENCE FOR REGISTRATION UNDER SSI [29-12-2022(online)].pdf | 2022-12-29 |
| 7 | 202241076984-DRAWINGS [29-12-2022(online)].pdf | 2022-12-29 |
| 8 | 202241076984-Proof of Right [06-01-2023(online)].pdf | 2023-01-06 |
| 9 | 202241076984-FORM-26 [06-01-2023(online)].pdf | 2023-01-06 |
| 10 | 202241076984-ENDORSEMENT BY INVENTORS [06-01-2023(online)].pdf | 2023-01-06 |
| 11 | 202241076984-FORM 3 [29-12-2023(online)].pdf | 2023-12-29 |
| 12 | 202241076984-DRAWING [29-12-2023(online)].pdf | 2023-12-29 |
| 13 | 202241076984-CORRESPONDENCE-OTHERS [29-12-2023(online)].pdf | 2023-12-29 |
| 14 | 202241076984-COMPLETE SPECIFICATION [29-12-2023(online)].pdf | 2023-12-29 |
| 15 | 202241076984-FORM-9 [02-02-2024(online)].pdf | 2024-02-02 |
| 16 | 202241076984-MSME CERTIFICATE [09-02-2024(online)].pdf | 2024-02-09 |
| 17 | 202241076984-FORM28 [09-02-2024(online)].pdf | 2024-02-09 |
| 18 | 202241076984-FORM 18A [09-02-2024(online)].pdf | 2024-02-09 |
| 19 | 202241076984-FER.pdf | 2024-03-21 |
| 20 | 202241076984-RELEVANT DOCUMENTS [20-06-2024(online)].pdf | 2024-06-20 |
| 21 | 202241076984-POA [20-06-2024(online)].pdf | 2024-06-20 |
| 22 | 202241076984-FORM 13 [20-06-2024(online)].pdf | 2024-06-20 |
| 23 | 202241076984-FORM FOR SMALL ENTITY [24-06-2024(online)].pdf | 2024-06-24 |
| 24 | 202241076984-EVIDENCE FOR REGISTRATION UNDER SSI [24-06-2024(online)].pdf | 2024-06-24 |
| 25 | 202241076984-FER_SER_REPLY [03-07-2024(online)].pdf | 2024-07-03 |
| 26 | 202241076984-DRAWING [03-07-2024(online)].pdf | 2024-07-03 |
| 27 | 202241076984-CORRESPONDENCE [03-07-2024(online)].pdf | 2024-07-03 |
| 28 | 202241076984-COMPLETE SPECIFICATION [03-07-2024(online)].pdf | 2024-07-03 |
| 29 | 202241076984-CLAIMS [03-07-2024(online)].pdf | 2024-07-03 |
| 30 | 202241076984-US(14)-HearingNotice-(HearingDate-27-05-2025).pdf | 2025-05-06 |
| 31 | 202241076984-Correspondence to notify the Controller [21-05-2025(online)].pdf | 2025-05-21 |
| 32 | 202241076984-Written submissions and relevant documents [11-06-2025(online)].pdf | 2025-06-11 |
| 33 | 202241076984-MARKED COPIES OF AMENDEMENTS [11-06-2025(online)].pdf | 2025-06-11 |
| 34 | 202241076984-FORM 13 [11-06-2025(online)].pdf | 2025-06-11 |
| 35 | 202241076984-Annexure [11-06-2025(online)].pdf | 2025-06-11 |
| 36 | 202241076984-AMMENDED DOCUMENTS [11-06-2025(online)].pdf | 2025-06-11 |
| 37 | 202241076984-PatentCertificate26-06-2025.pdf | 2025-06-26 |
| 38 | 202241076984-IntimationOfGrant26-06-2025.pdf | 2025-06-26 |
| 1 | SearchHistoryE_20-03-2024.pdf | |