Abstract: The system 100 comprises a central server 102 and a remote screening unit 120. The central server 102 comprises a retinal disease detection unit 112 configured for obtaining at least one two-dimensional (2D) fundus image using an imaging device 105 and analyzing the fundus image for detecting the presence of a retinal disease, and a communication unit 116 configured for activating a remote screening unit 120 upon detecting the presence of a retinal disease. The remote screening unit 120 comprises a haptic sensor for remotely controlling the operation of the imaging device 105, an eXtended Reality (XR) viewing device 122 for obtaining at least one three-dimensional (3D) fundus image, and a control unit 125 operably coupled to the extended reality viewing device 122, the control unit 125 being configured for receiving, processing, storing and communicating at least one fundus image from the extended reality viewing device 122. A treatment scheduling and tracking unit 118 is configured for generating a treatment plan, and scheduling and tracking the treatment, thereby closing a single cycle of patient monitoring and tracking. FIG. 1
SYSTEM AND METHOD FOR REMOTE PATIENT MANAGEMENT
FIELD OF INVENTION
[0001] The invention relates in general to the field of healthcare systems and methods. More particularly, systems and methods are described that provide Artificial Intelligence (AI) enabled telemedicine-based healthcare for remote patient management.
BACKGROUND
[0002] Approximately seventy percent of the Indian population lives in rural areas plagued by inadequate medical facilities, a consequence of insufficient investment from public and private establishments. In addition to the lack of medical facilities, retaining specialist doctors in rural areas is increasingly difficult. As a result, access to even primary care facilities is a challenge for the rural population.
[0003] In contrast, about ninety percent of secondary and tertiary care facilities are located in cities and towns, resulting in lower penetration of healthcare services.
[0004] Many patients requiring specialized medical attention cannot reach medical specialists in time, or cannot afford them, and therefore go untreated. This may be due to infrastructure issues or to unavailability of medical staff. Attempts have been made to provide rural, resource-limited populations with access to specialists who are geographically located in remote locations.
[0005] A specialist positioned elsewhere can treat the patient through a number of available telemedicine platforms.
[0006] However, patients who cannot afford to travel to the nearest facility providing telemedicine services, or are too unwell to do so, are effectively excluded from this specialized care.
[0007] Further, even if diagnosis is made possible through a telemedicine platform, the patient may miss the treatment due to various complexities involved in scheduling an appointment and keeping up with it.
[0008] Hence there exists a need for a telemedicine platform that is able to perform remote patient monitoring and tracking till the completion of the recommended treatment.
SUMMARY OF THE INVENTION
[0009] The invention discloses a system 100 for remote patient management. The system 100 comprises a central server 102 and a remote screening unit 120. The central server 102 comprises a retinal disease detection unit 112 configured for obtaining at least one two-dimensional (2D) fundus image using an imaging device 105 and analyzing the fundus image for detecting the presence of a retinal disease, and a communication unit 116 configured for activating the remote screening unit 120 upon detecting the presence of a retinal disease. The remote screening unit 120 comprises a haptic sensor for remotely controlling the operation of the imaging device 105, an eXtended Reality (XR) viewing device 122 for obtaining at least one three-dimensional (3D) fundus image, and a control unit 125 operably coupled to the extended reality viewing device 122, the control unit 125 being configured for receiving, processing, storing and communicating at least one fundus image from the extended reality viewing device 122. A treatment scheduling and tracking unit 118 is configured for generating a treatment plan, and scheduling and tracking the treatment, thereby closing a single cycle of patient monitoring and tracking.
[0010] These together with other embodiments of the present invention, along with the various features of novelty that characterize the present invention, are pointed out with particularity in the claims annexed hereto and form a part of this present invention. For a better understanding of the present invention, its operating advantages, and the specific objects attained by its uses, reference should be made to the accompanying drawing and descriptive matter in which there are illustrated exemplary embodiments of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The present invention is described by way of embodiments illustrated in the accompanying drawings wherein:
[0012] FIG. 1 shows a schematic diagram depicting a system for remote patient monitoring, as described in an embodiment of the invention.
DETAILED DESCRIPTION
[0013] For a thorough understanding of the present invention, reference is to be made to the following detailed description, including the appended claims, in connection with the above-described drawing. Although the present invention is described in connection with exemplary embodiments, the present invention is not intended to be limited to the specific forms set forth herein. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but these are intended to cover the application or implementation without departing from the spirit or scope of the claims of the present invention. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
[0014] The terms "a" and "an" herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item.
[0015] A telemedicine platform for connecting medical professionals, specifically ophthalmologists with patients, that is able to perform remote patient monitoring and tracking till the completion of the recommended treatment is disclosed herein.
[0016] Accordingly, a system 100 and method for remotely connecting with medical professionals using Mixed Reality over fifth generation (5G) wireless communication is described herein. The system 100 further includes a primary retinal disease detection unit 112 that employs deep learning to effectively carry out the diagnosis in order to detect the presence of the retinal disease. Upon detecting the presence of the disease, a remote screening unit 120 is activated for remotely performing diagnosis by a registered medical practitioner. A communication unit 116 is employed for communicating with the remote screening unit 120 and remotely connecting the medical practitioner with an imaging device 105 such as an ophthalmoscope. Upon confirmation of the diagnosis by the medical practitioner, a treatment plan is recommended. Consequently, a treatment scheduling and tracking unit 118 is activated that schedules and tracks the treatment, thereby closing a single cycle of patient monitoring and tracking.
[0017] The system 100 enables the patient to connect remotely with medical professionals located elsewhere. Further, the use of deep learning for initial analysis of the fundus image to detect the presence of a retinal disease, together with the extended reality viewing device 122 and communication of the captured images over 5G networks, enables the application of telemedicine in the field of ophthalmology, thereby eliminating the need for the physical presence of the patient at the medical facility. The initial analysis of the fundus image supports effective decision making by the medical practitioner, and the communication of the captured images over 5G networks enables personalized diagnosis.
[0018] Though the invention is explained with respect to ophthalmology, skilled artisans shall however appreciate that the application of the invention can be extended to other fields of medicine.
[0019] Accordingly, FIG. 1 shows a schematic diagram of the system 100 for remote patient management as described in an embodiment herein. The system 100 comprises a central server 102 and a remote screening unit 120. The central server 102 comprises a retinal disease detection unit 112, a processing unit 110 coupled to the retinal disease detection unit 112, a memory 114 coupled to the processing unit 110, a communication unit 116 coupled to the processing unit 110, and a treatment scheduling and tracking unit 118 coupled to the processing unit 110.
[0020] The retinal disease detection unit 112 is configured for receiving a fundus image of a patient who is registered with the medical facility and for performing an initial diagnosis of the fundus image in order to detect the presence of a retinal disease.
[0021] In one embodiment, the fundus image can be obtained from an imaging device 105 such as an ophthalmoscope, a fundus camera, an Optical Coherence Tomography device, or a cell phone with a built-in camera.
[0022] The fundus image, herein, refers to a two-dimensional array of digital image data, however, this is merely illustrative and not limiting of the scope of the invention. Further, the fundus image may be a live image or an image that is stored in the memory 114.
[0023] The retinal disease detection unit 112 is configured to identify a plurality of indicators throughout the fundus image using a convolutional network, detect a presence or absence of a retinal disease based on the identified indicators using the convolutional network, and classify a severity of the retinal disease based on the presence or absence of the retinal disease using the convolutional network.
[0024] The indicator is one of an abnormality, a retinal feature, or the like. The retinal feature is an optic disc, a macula, a blood vessel, or the like. The abnormality is a lesion such as a venous beading, a venous loop, an intraretinal microvascular abnormality, an intraretinal hemorrhage, a microaneurysm, a soft exudate (cotton-wool spot), a hard exudate, a vitreous/preretinal hemorrhage, neovascularization, a drusen, or the like. The retinal disease is one of diabetic retinopathy, diabetic macular edema, glaucoma, coloboma, retinal tear, retinal detachment, or the like. The severity of the retinal disease is represented as levels of increasing seriousness of the retinal disease.
[0025] The retinal disease detection unit 112 is trained to detect the presence of the retinal disease. A reference dataset is used for training the retinal disease detection unit 112. The reference dataset comprises a set of reference fundus images. The term ‘training’ herein refers to a process of developing the retinal disease detection unit 112 for the detection and classification of the retinal disease based on the reference dataset and a reference ground-truth file. The reference ground-truth file comprises a label and a reference fundus image identifier for each of the reference fundus images. The label provides information about the reference fundus image, such as the presence or absence of a retinal disease, the type of retinal disease, and the corresponding severity of the retinal disease identified in the reference fundus image. The label is generated by an annotator upon analyzing the indicators present in the reference fundus image. The annotator accesses the reference fundus images via an annotation platform that includes a graphical user interface (GUI) and performs annotation.
[0026] The reference fundus image identifier of a reference fundus image is, for example, a name or identity assigned to the reference fundus image.
[0027] The annotator may consider one or more standard DR grading standards, such as the American ophthalmology DR grading scheme, the Scottish DR grading scheme, the UK DR grading scheme, etc., to annotate the reference fundus images. The annotator may assign a DR severity grade 0 (representing no DR), grade 1 (representing mild DR), grade 2 (representing moderate DR), grade 3 (representing severe DR) or grade 4 (representing proliferative DR) to each of the reference fundus images. The label of the reference fundus image represents the DR severity level associated with the patient. Accordingly, the labels for annotations may be selected from a group consisting of ‘No DR’, ‘DR1’, ‘DR2’, ‘DR3’ and ‘DR4’ based on an increasing severity of DR associated with the patient.
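By way of illustration only, the reference ground-truth file described above could be realized as a simple CSV keyed by the reference fundus image identifier; the file name, column names, and helper below are assumptions, not part of the specification:

```python
import csv

# Hypothetical label set following the DR severity grades described above.
DR_LABELS = {0: "No DR", 1: "DR1", 2: "DR2", 3: "DR3", 4: "DR4"}

def load_ground_truth(path="reference_ground_truth.csv"):
    """Read a ground-truth file mapping each reference fundus image
    identifier to its annotated label (e.g. 'image_0042' -> 'DR2')."""
    ground_truth = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):      # assumed columns: image_id, grade
            grade = int(row["grade"])      # 0..4 per the grading scheme
            ground_truth[row["image_id"]] = DR_LABELS[grade]
    return ground_truth
```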
[0028] The set of reference fundus images, each along with its label and identifier, is stored in the memory 114 as the reference dataset.
[0029] The memory 114 can include a volatile and/or non-volatile memory. For example, the memory 114 can store commands or data related to the central server 102 and the XR viewing device 122. In various embodiments, the memory 114 can store spatial map data that can include mapping information of a real environment, such as the interior of an organ, or any other real-world or virtual-world mapping information utilized by an application on the XR viewing device 122.
[0030] The retinal disease detection unit 112 comprises a deep learning module that is configured for identifying one or more indicators in each of the reference fundus images in the reference dataset to detect the presence or absence of the retinal disease using image analysis techniques. The deep learning module classifies the severity of the retinal disease based on the presence of the retinal disease using a set of predetermined rules. The predetermined rules comprise considering a type of each of the indicators, a count of each indicator, a region of occurrence of each of the indicators, a contrast level of each of the indicators, a size of each of the indicators, or any combination thereof to recognize the retinal disease and the severity of the retinal disease. The deep learning module classifies each of the detected retinal diseases according to a corresponding severity grading and generates the label.
[0031] The deep learning module utilizes the reference dataset to train a convolutional network for subsequent detection and classification of the retinal disease in the fundus image. Hereafter, the fundus image that is subsequently analyzed by the deep learning module is referred to as the input fundus image for clarity.
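Purely as an illustrative sketch of such training, assuming PyTorch, a five-class DR severity output, and a generic ResNet backbone (none of which are mandated by this specification), the deep learning module's training loop might resemble:

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # No DR, DR1, DR2, DR3, DR4

def build_model():
    # A generic convolutional backbone; the specification does not fix
    # a particular architecture, so ResNet-18 is an assumption here.
    model = models.resnet18(weights=None)
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
    return model

def train(model, loader, epochs=10, lr=1e-4, device="cpu"):
    """Train on (image, label) batches drawn from the reference dataset."""
    model.to(device).train()
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for images, labels in loader:  # loader yields annotated fundus images
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```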
[0032] Accordingly, the retinal disease detection unit 112 is configured to identify a plurality of indicators throughout the input fundus image using the convolutional network, detect a presence or absence of a retinal disease based on the identified indicators using the convolutional network, and classify a severity of the retinal disease based on the presence or absence of the retinal disease using the convolutional network.
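A corresponding inference sketch for the input fundus image, under the same assumptions (the preprocessing transform and input size are illustrative), might be:

```python
import torch
from PIL import Image
from torchvision import transforms

DR_LABELS = ["No DR", "DR1", "DR2", "DR3", "DR4"]  # as in the earlier sketch

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),   # assumed input size
    transforms.ToTensor(),
])

def classify_fundus(model, image_path, device="cpu"):
    """Return the predicted DR severity label for one input fundus image."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0).to(device)
    model.eval()
    with torch.no_grad():
        logits = model(batch)        # shape (1, 5): per-class scores
    return DR_LABELS[int(logits.argmax(dim=1))]
```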
[0033] Patients with retinal abnormalities are taken up for live retinal examination by ophthalmologists available in the network, using AI and extended reality powered by a fifth generation (5G) wireless communication network.
[0034] Accordingly, the central server 102 comprises a processing unit 110 coupled to the retinal disease detection unit 112 configured for receiving an input from the retinal disease detection unit 112 indicating the presence of the retinal disease. The processing unit 110 upon receiving an input from the retinal disease detection unit 112 indicating the presence of the retinal disease, activates the remote screening unit 120.
[0035] The processing unit 110 includes one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). The processing unit 110 is able to perform control over at least one of the other components of the central server 102 and the remote screening unit 120, including the XR viewing device 122, and/or perform an operation or data processing relating to communication.
[0036] The remote screening unit 120 comprises a user interface unit 124 configured for interfacing the remote screening unit 120 with the imaging device 105, an XR viewing device 122 that is head-mounted on the medical practitioner and configured for viewing the retina of the patient through the imaging device 105, a control unit 125 for controlling the operation of the remote screening unit 120, and a local memory 126 for storing the three-dimensional images obtained using the XR viewing device 122.
[0037] In one embodiment, the user interface unit 124 may include haptic input and/or output devices (referred to hereafter as haptic devices). The imaging device 105 can be powered and/or operated in a controlled manner using the haptic devices.
[0038] The wearable haptic devices are selected from a group consisting of: a glove, a ring, a wrist band, a wrist watch, an arm band, head gear, a belt, a necklace, a shirt, foot wear, pants, overalls, coveralls, and safety goggles.
[0039] The wearable haptic devices present the data from the sensors as human detectable stimuli including at least one of tactile, vibration, heat, sound, and force. In embodiments, the haptic stimulus represents an effect on the machine resulting from the sensed data. In embodiments, a bending effect may be presented as bending a finger of a haptic glove. In embodiments, a vibrating effect may be presented as vibrating a haptic arm band. In embodiments, a heating effect may be presented as an increase in temperature of a haptic wrist band. In embodiments, an electrical effect (e.g., over voltage, current, and others) may be presented as a change in sound of a haptic audio system.
[0040] In one embodiment, the imaging device 105 being an ophthalmoscope, the haptic devices can be employed for operating the aperture, thereby controlling the amount of light that is allowed through the lens, controlling the rheostatic switch that allows the practitioner to manually adjust the brightness of the halogen light, rotating the lens selection disc that enables selection among multiple viewing lenses, and the like.
[0041] The eXtended Reality (XR) viewing device 122 disclosed herein is a visualization tool that utilizes patient image data to generate real-time three-dimensional images of the organs being viewed, such as the retina. Extended reality collectively refers to Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR). VR technology provides objects or backgrounds of the real world only in the form of Computer Generated (CG) images, AR technology provides virtual CG images overlaid on images of physical objects, and MR technology employs computer graphics technology to mix and merge virtual objects with the real world.
[0042] The XR technology may be applied to Head-Mounted Display (HMD), Head-Up Display (HUD), mobile phone, tablet PC, laptop computer, desktop computer, TV, digital signage, and so on, where a device employing the XR technology may be called an XR device.
[0043] In an exemplary embodiment, the medical practitioner wears the extended reality viewing device 122 and haptic sensors, takes control of slit lamp operations, and obtains a direct view of the patient’s retina for further diagnosis and treatment. The haptic sensors pick up the hand movements of the medical practitioner, determine at least one haptic stimulation that corresponds to the sensed data, generate a signal in response to the at least one haptic stimulation, and relay the signal via a 5G communication network to one or more corresponding parts of the ophthalmoscope in order to control the operation of the ophthalmoscope.
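Purely to illustrate the control path just described, the sketch below encodes a sensed hand movement as a hypothetical ophthalmoscope control message and relays it over the network; the message fields, host name, and port are illustrative assumptions, and the 5G transport itself is provided by the underlying network:

```python
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class ControlCommand:
    """Hypothetical message relayed from the haptic device to the ophthalmoscope."""
    part: str         # e.g. "aperture", "rheostat", "lens_disc"
    action: str       # e.g. "rotate", "increase", "decrease"
    magnitude: float  # scaled from the sensed hand movement

def relay_command(cmd, host="ophthalmoscope.local", port=9000):
    """Send one control command to the imaging device over the network."""
    payload = json.dumps(asdict(cmd)).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)

# Example: a hand rotation mapped to the lens selection disc.
relay_command(ControlCommand(part="lens_disc", action="rotate", magnitude=12.5))
```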
[0044] The XR viewing device 122 enables the medical practitioner to view real-world objects, i.e., actual objects in a real-world environment, such as the retina, through the holographic lenses, while concurrently viewing virtual objects, thereby enabling the medical practitioner to view 2-dimensional surfaces, 3-dimensional models, or other user-perceptible elements that are not actually present in the physical, real-world environment in which they are presented as coexisting.
[0045] Using the 3-D representation of the fundus that is available through the XR viewing device 122, the medical practitioner analyses the indicators in the retinal fundus image and accordingly marks the label. If the medical practitioner detects a microaneurysm, the practitioner considers it a mild level of DR and marks the label as DR1 for the reference fundus image. Similarly, when the medical practitioner detects one or more of the following – a hard exudate, a soft exudate, a hemorrhage, a venous loop, a venous beading, etc. – the practitioner marks the label as DR2 for the reference fundus image. The label DR2 indicates a moderate level of DR. The medical practitioner marks the label as DR3 for the reference fundus image with a severe level of DR upon detection of multiple hemorrhages, hard or soft exudates, etc., and DR4 for the reference fundus image with a proliferative level of DR upon detection of vitreous hemorrhage, neovascularization, etc. The reference fundus image with no traces of DR is marked with the label ‘No DR’ by the medical practitioner.
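The grading logic described in the preceding paragraph can be summarized, for illustration, as a simple rule table over the detected indicator names; the thresholds below are an approximation of the paragraph's criteria and are not a substitute for the practitioner's clinical judgment:

```python
def grade_dr(indicators: set) -> str:
    """Map a set of detected indicator names to a DR severity label,
    approximating the rules described in the paragraph above."""
    proliferative = {"vitreous hemorrhage", "neovascularization"}
    moderate = {"hard exudate", "soft exudate", "hemorrhage",
                "venous loop", "venous beading"}
    if indicators & proliferative:
        return "DR4"                        # proliferative DR
    if len(indicators & moderate) > 1:
        return "DR3"                        # severe DR: multiple findings
    if indicators & moderate:
        return "DR2"                        # moderate DR
    if "microaneurysm" in indicators:
        return "DR1"                        # mild DR
    return "No DR"

print(grade_dr({"microaneurysm"}))          # -> DR1
```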
[0046] In an additional embodiment, the remote screening unit 120 may comprise a communication device (not shown) configured for enabling audio and/or video communication between the medical practitioner and a patient undergoing the live diagnosis. The patient, on the other hand, may communicate using a user device. The communication device (not shown) is configured for providing cross-platform communication between systems and devices not otherwise communicatively compatible. The communication device (not shown) employs dynamic and intelligent Global Positioning System/Local Positioning System (GPS/LPS) to provide real-time cross-platform communication enabling the medical practitioner to connect to an individual patient or a group of patients distributed within the communication network.
[0047] In one embodiment, the user device may be a personal computing device, among other things for example, a desktop computer, a laptop computer, a notebook, a netbook, a tablet personal computer (PC), a control panel, a smart phone, a mobile phone, a personal digital assistant (PDA), and/or any other suitable device operable to send and receive data and display data.
[0048] Accordingly, the medical practitioner may pose a series of questions to the patient undergoing diagnosis, the responses to which facilitate the medical practitioner’s decision-making towards the diagnosis and the treatment.
[0049] The control unit 125 of the remote screening unit 120 may employ Natural Language Processing (NLP) techniques, including, for example, speech recognition, understanding, generation, and translation, for facilitating the interaction between the medical practitioner and the patient.
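As a sketch only, an NLP-assisted questionnaire loop in the control unit 125 might be structured as below; transcribe() and speak() are hypothetical stand-ins for whatever speech recognition and generation services are actually employed:

```python
def transcribe(audio_bytes) -> str:
    """Hypothetical speech-to-text stub (e.g. backed by an ASR model)."""
    raise NotImplementedError

def speak(text: str) -> None:
    """Hypothetical text-to-speech stub that renders a question as audio."""
    raise NotImplementedError

def run_questionnaire(questions, get_patient_audio):
    """Pose each question, transcribe the patient's spoken reply, and
    record the exchange for later use as documentary evidence."""
    transcript = []
    for question in questions:
        speak(question)                          # question rendered as speech
        answer = transcribe(get_patient_audio()) # patient's spoken response
        transcript.append({"question": question, "answer": answer})
    return transcript                            # stored alongside the report
```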
[0050] Further, the questionnaire may be recorded for providing documentary evidence in support of the decisions made by the medical practitioner. The documentary evidence may be produced in a court of law in an effort to defend the medical practitioner, in case a legal issue is lodged by the patient.
[0051] The recorded questionnaire may further be employed to train a Convolutional Neural Network (CNN) 111 by drawing a correlation between the signs and symptoms experienced by the patient and the drug prescriptions handed out by the medical practitioner. The trained convolutional neural network 111 may be employed by the medical practitioner, upon generating future medical prescriptions, for determining the appropriateness of the prescribed drugs.
[0052] Upon successful diagnosis, a report is generated by the remote screening unit 120 detailing the diagnosis and the recommended treatment options, with the digital signature of the medical practitioner who performed the diagnosis. The recommended treatment options may include, but are not limited to, a surgical intervention and/or intake of one or more drugs.
[0053] Accordingly, a prescription may be recommended by the medical practitioner. The term "prescription" as used herein includes a list of one or more drugs identified by generic and/or brand name, along with dosage and timing for each drug. Further, the terms "drug" and "medication" may be used interchangeably herein. The prescription may further be included in the generated report.
[0054] The report thus generated is communicated to the processing unit 110 via the communication network. The processing unit 110, upon receiving the report, stores it in the memory 114 and triggers the treatment scheduling and tracking unit 118. The treatment scheduling and tracking unit 118 is configured for generating a treatment schedule based on the availability of the preferred medical practitioners and tracking the treatment cycle till its completion.
[0055] In an exemplary embodiment, the treatment may include scheduling a surgery or dispensing one or more drugs recommended by the medical practitioner. The treatment scheduling and tracking unit 118 is therefore configured for generating a treatment schedule based on the availability of the preferred medical practitioners and tracking the intake of the recommended drugs till the completion of the treatment cycle.
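A minimal sketch of the records the treatment scheduling and tracking unit 118 might maintain is given below; the field names and structure are assumptions:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TreatmentItem:
    description: str          # e.g. a surgery or a scheduled drug dose
    due: date
    done: bool = False

@dataclass
class TreatmentPlan:
    patient_id: str
    practitioner: str         # the preferred medical practitioner
    items: list = field(default_factory=list)

    def mark_done(self, index: int) -> None:
        self.items[index].done = True

    def cycle_complete(self) -> bool:
        """True once every scheduled item is completed, closing one
        cycle of patient monitoring and tracking."""
        return all(item.done for item in self.items)
```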
[0056] The details of patient monitoring, diagnosis and treatment tracking for each patient may be stored in the memory 114 for future reference and training purpose.
[0057] The central server 102 is configured for storing, processing, and providing information to a user. The user may be a patient, a medical practitioner, or an authorized person in the healthcare domain. The processing unit may be a general purpose processor, a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), and/or the like. The processing unit may be configured to retrieve data from and/or write data to the memory. The memory may be, for example, a random access memory (RAM), a memory buffer, a hard drive, a database, an erasable programmable read only memory (EPROM), an electrically erasable programmable read only memory (EEPROM), a read only memory (ROM), a flash memory, a hard disk, a floppy disk, cloud storage, and/or so forth. In one embodiment, the central server 102 may include one or more hardware-based modules (e.g., DSP, FPGA, ASIC) and/or software-based modules (e.g., a module of computer code stored at the memory and executed at the processor, a set of processor-readable instructions that may be stored at the memory and executed at the processor) associated with executing an application, such as, for example, receiving, processing and displaying image data from the various imaging devices.
[0058] Further, the central server 102 may include a database (e.g., in memory and/or through a wired and/or a wireless connection) for storing data received from the multiple kiosks coupled to the central server 102 via the communication network. Additionally, the central server 102 may store information related to the user. The central server 102 may further comprise a User Interface (UI) (not shown) directly coupled to the database so as to facilitate display of data stored in the database.
[0059] Any database discussed herein may include relational, hierarchical, graphical, or object-oriented structure and/or any other database configurations. Common database products that may be used to implement the databases include DB2 by IBM (White Plains, N.Y.), various database products available from Oracle Corporation (Redwood Shores, Calif.), Microsoft Access or Microsoft SQL Server by Microsoft Corporation (Redmond, Wash.), MySQL, or any other suitable database product. Moreover, the databases may be organized in any suitable manner, for example, as data tables or lookup tables. Each record may be a single file, a series of files, a linked series of data fields or any other data structure. Association of certain data may be accomplished through any desired data association technique such as those known or practiced in the art.
[0060] The databases may include various information such as but not limited to details of the medical professional registered with the system, details of the healthcare professional having access to the system, the details of patient data and details of the healthcare unit.
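Purely as an illustrative sketch of such record layouts (the table and column names are assumptions, and any of the database products mentioned above could be used instead of SQLite), a minimal schema might be:

```python
import sqlite3

# Hypothetical minimal schema for the records described above.
conn = sqlite3.connect("telemedicine.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS medical_professional (
    id INTEGER PRIMARY KEY, name TEXT, speciality TEXT, registration_no TEXT);
CREATE TABLE IF NOT EXISTS patient (
    id INTEGER PRIMARY KEY, name TEXT, healthcare_unit_id INTEGER);
CREATE TABLE IF NOT EXISTS diagnosis_report (
    id INTEGER PRIMARY KEY, patient_id INTEGER, professional_id INTEGER,
    severity_label TEXT,   -- e.g. 'No DR' .. 'DR4'
    report_date TEXT);
""")
conn.commit()
```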
[0061] To aggregate general use templates, individual patient data is scrubbed of any unique patient-specific identifiers in compliance with all applicable patient security and privacy guidelines, such as HIPAA, and then combined with additional aggregate patient therapy data to generate processed therapy data. The processed therapy data allows the use of views, queries, rules-based processing, algorithms, and/or AI machine learning for analysis, research, and discovery of new and previously unknown combinations of complementary, integrative or alternative therapy and traditional medicine. This actively promotes future patient discovery of individual ways to manage patient experience factors, lowering drug use and improving the patient experience by allowing the patient to positively control an aspect of their own therapeutic healing environment.
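For illustration only, scrubbing direct patient identifiers before aggregation could look like the sketch below; the field names are assumptions, and genuine HIPAA compliance involves far more than removing a fixed list of fields:

```python
# Hypothetical set of direct identifiers to remove before aggregation.
IDENTIFYING_FIELDS = {"name", "patient_id", "phone", "email", "address"}

def scrub_record(record: dict) -> dict:
    """Return a copy of a patient record with direct identifiers removed,
    leaving only therapy-relevant fields for aggregate analysis."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

scrubbed = scrub_record({
    "name": "A. Patient", "patient_id": "P-1001",
    "diagnosis": "DR2", "therapy": "anti-VEGF", "outcome": "improved",
})
# -> {'diagnosis': 'DR2', 'therapy': 'anti-VEGF', 'outcome': 'improved'}
```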
[0062] The communication unit 116 may transmit or receive wireless signals via a communication network according to wireless internet technologies.
[0063] 5G wireless communication is a requirement for enabling the telemedicine platform disclosed herein. This fast speed is required not only for virtual reality and augmented reality but also for transferring video with a resolution of more than 4K (6K, 8K or more). 5G wireless communication networks support AR and VR environment devices that need to access and manage huge amounts of data, among other things. The higher throughput of 5G wireless communication is necessary for VR and AR content that is streamed from the cloud.
[0064] In particular, the remote screening unit 120 may perform data communications with the central server 102 and the imaging device 105 by using at least one network service among enhanced mobile broadband (eMBB), ultra-reliable and low latency communications (URLLC), and massive machine-type communications (mMTC).
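As an illustrative sketch of how the three 5G network services mentioned above could map onto the system's traffic, consider the following; the mapping itself is an assumption, not specified herein:

```python
# Illustrative mapping of system traffic types to 5G network services.
SERVICE_FOR_TRAFFIC = {
    "xr_video_stream": "eMBB",    # high-throughput 3D fundus imagery
    "haptic_control": "URLLC",    # low-latency remote device control
    "device_telemetry": "mMTC",   # many low-rate sensor/status messages
}

def select_service(traffic_type: str) -> str:
    """Choose the 5G network service class for a given traffic type."""
    return SERVICE_FOR_TRAFFIC.get(traffic_type, "eMBB")  # default: broadband

assert select_service("haptic_control") == "URLLC"
```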
[0065] The system 100 and method for remote patient monitoring disclosed herein are economical and effective in treating patients located in rural areas, and eliminate the need for the presence of the doctor and the patient at the medical facility, thereby reducing inconvenience to the patient’s family and the caregiver.
[0066] The telemedicine platform disclosed herein undertakes continuous monitoring and tracking, starting from diagnosis till completion of the treatment cycle, including tracking of routine checkups. This reduces the burden on secondary hospitals.
[0067] The ability to provide on-demand and rapid access to those seeking immediate medical care minimizes the inequity of, and barriers to, accessing medical care that may not otherwise be available.
[0068] The system 100 engages patients by providing personalized monitoring, diagnosis, and treatment when compared to traditional healthcare, and thus encourages patients to adhere to health goals.
[0069] Thus, the present invention provides a remote patient monitoring system that is capable of diagnosing and treating retinal diseases in an accurate, efficient, economical and a time-saving manner.
[0070] The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
CLAIMS:
1. A system 100 for remote patient monitoring and tracking, the system 100 comprising:
a central server 102;
a retinal disease detection unit 112 configured for obtaining at least one two-dimensional (2D) fundus image using an imaging device 105 and analyzing the fundus image for detecting the presence of a retinal disease;
a communication unit 116 configured for activating a remote screening unit 120 upon detecting the presence of a retinal disease;
the remote screening unit 120 comprising:
a haptic sensor for remotely controlling the operation of the imaging device 105;
an eXtended Reality (XR) viewing device 122 for obtaining at least one three-dimensional (3D) fundus image;
a control unit 125 operably coupled to an extended reality viewing device 122, the control unit 125 being configured for receiving, processing, storing and communicating at least one fundus image from the extended reality viewing device 122; and
a treatment scheduling and tracking unit 118 configured for generating a treatment plan, scheduling and tracking the treatment thereby closing a single cycle of patient monitoring and tracking.
2. The system of claim 1, wherein the fundus image can be obtained from an imaging device 105 such as an ophthalmoscope, a fundus camera, an Optical Coherence Tomography device, or a cell phone with a built-in camera.
3. The system of claim 1, further comprising a primary retinal disease detection unit that employs deep learning to effectively carry out the diagnosis in order to detect the presence of the retinal disease.
4. The system of claim 1, wherein the remote screening unit 120 comprises a user interface unit 124 configured for interfacing the remote screening unit 120 with the imaging device 105.
5. The system of claim 1, wherein the remote screening unit 120 comprises a local memory 126 for storing the three dimensional images obtained using the XR viewing device 122.
6. The system of claim 1, wherein the wearable haptic devices are selected from a group consisting of: a glove, a ring, a wrist band, a wrist watch, an arm band, head gear, a belt, a necklace, a shirt, foot wear, pants, overalls, coveralls, and safety goggles.
7. A method for remote patient monitoring and tracking, the method comprising steps of:
employing deep learning to effectively carry out the diagnosis in order to detect the presence of the retinal disease by a primary retinal disease detection unit;
activating a remote screening unit in response to detecting the presence of the disease, for remotely performing diagnosis by a registered medical practitioner;
communicating with the remote screening unit by a communication unit;
remotely connecting the medical practitioner with an imaging device such as an ophthalmoscope;
recommending a treatment plan upon confirmation of the diagnosis by the medical practitioner; and
activating a treatment scheduling and tracking unit that schedules and tracks the treatment thereby closing a single cycle of patient monitoring and tracking.
8. A computer programme product storing computer readable instructions which when executed by a processor cause the processor to execute a method comprising the steps of:
employing deep learning to effectively carry out the diagnosis in order to detect the presence of the retinal disease by a primary retinal disease detection unit;
activating a remote screening unit in response to detecting the presence of the disease, for remotely performing diagnosis by a registered medical practitioner;
communicating with the remote screening unit by a communication unit;
remotely connecting the medical practitioner with an imaging device such as an ophthalmoscope;
recommending a treatment plan upon confirmation of the diagnosis by the medical practitioner; and
activating a treatment scheduling and tracking unit that schedules and tracks the treatment thereby closing a single cycle of patient monitoring and tracking.
Dated this 22nd July 2021
(Digitally Signed)
Suma KB-INPA 1753
Agent for the applicant