
Diabetic Retinopathy Prediction Device System And Method Thereof

Abstract: An aspect of the present disclosure relates to a device (202) for early prediction of diabetic retinopathy with application of deep learning. The device (202) includes an image capturing device (206) and a memory (208) coupled to a processor (204). The image capturing device (206) obtains a retinal fundus image from the user. The memory comprises executable instructions which, upon execution by the processor (204), configure the device to obtain one or more physiological parameters of the user in real-time from the image capturing device, retrieve the obtained retinal fundus image and the one or more obtained physiological parameters, and compare one or more extracted features with at least one pre-stored feature in a database to generate at least a prediction result indicative of detection of the presence, the progression or the treatment effect of the disease in the user.


Patent Information

Application #
Filing Date
23 July 2019
Publication Number
05/2021
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING
Status
Email
info@khuranaandkhurana.com
Parent Application
Patent Number
Legal Status
Grant Date
2024-08-26
Renewal Date

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector -9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. GUPTA, Sheifali
Professor, Department of Electronics and Communication Engineering, Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.
2. AHUJA, Rakesh
Professor, Department of Computer Science and Engineering, Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.
3. GARG, Meenu
Assistant Professor, Department of Electronics and Communication Engineering, Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.
4. GUPTA, Rupesh
Professor, Department of Mechanical Engineering, Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.
5. GUPTA, Deepali
Professor, Department of Computer Science and Engineering, Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.
6. AHUJA, Sachin
Director Research, Department of Computer Science & Engineering, Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.

Specification

[0001] The present disclosure relates to diagnostic devices, and more specifically, to a
diabetic retinopathy prediction device, system and method for early prediction of diabetic retinopathy with application of deep learning.
BACKGROUND
[0002] Background description includes information that may be useful in
understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] In medical research, the diagnosis of ocular diseases is an important area for early and timely treatment of people at high risk. Diabetic retinopathy (DR) is the most common form of diabetic eye disease. The International Diabetes Federation (IDF) estimated that the global population with diabetes in 2017 was 451 million and over one-third of this population had DR, representing a tremendous population at risk of visual impairment or blindness. By 2045, the worldwide prevalence of diabetes is expected to increase to 693 million people.
[0004] In addition, almost half (49.7%) of all people living with diabetes remain
undiagnosed for years because of silent symptoms. However, long-term high blood sugar levels ultimately destroy blood vessels and nerves, leading to complications, such as cardiovascular disease and blindness. Detection and treatment of DR in the early stage will prevent its development or progression.
[0005] Diabetic retinopathy usually only affects people who have had diabetes (diagnosed or undiagnosed) for a significant number of years, and involves the abnormal growth of blood vessels in the retina.
[0006] Retinopathy can affect all diabetics and becomes particularly dangerous, increasing the risk of blindness, if left untreated. The risk of developing diabetic retinopathy is known to increase with age as well as with less well-controlled blood sugar and blood pressure levels. Diabetic retinopathy involves the abnormal growth of blood vessels in the retina. Complications can lead to serious vision problems: vitreous hemorrhage, retinal detachment, glaucoma and blindness. The risk of developing the eye condition can increase as a result of: duration of diabetes (the longer a patient has diabetes, the greater the risk of developing diabetic retinopathy), poor control of blood sugar level, high blood pressure, high cholesterol, pregnancy, tobacco use, and being African-American, Hispanic or Native American.
[0007] Over time, too much sugar in your blood can lead to the blockage of the tiny
blood vessels that nourish the retina, cutting off its blood supply. As a result, the eye attempts to grow new blood vessels. But these new blood vessels don't develop properly and can leak easily.
[0008] There are two types of diabetic retinopathy: early diabetic retinopathy and advanced diabetic retinopathy. Early diabetic retinopathy is called non-proliferative diabetic retinopathy (NPDR). In early diabetic retinopathy, new blood vessels aren't growing (proliferating). When a patient has NPDR, the walls of the blood vessels in the retina weaken. Tiny bulges (microaneurysms) protrude from the vessel walls of the smaller vessels, sometimes leaking fluid and blood into the retina. Larger retinal vessels can begin to dilate and become irregular in diameter as well. NPDR can progress from mild to severe as more blood vessels become blocked. Nerve fibers in the retina may begin to swell. Sometimes the central part of the retina (macula) begins to swell (macular edema), a condition that requires treatment.
[0009] Diabetic retinopathy can progress to a more severe type, known as proliferative diabetic retinopathy. In advanced diabetic retinopathy, damaged blood vessels close off, causing the growth of new, abnormal blood vessels in the retina, which can leak into the clear, jelly-like substance that fills the center of the eye (vitreous). Eventually, scar tissue stimulated by the growth of new blood vessels may cause the retina to detach from the back of the eye. If the new blood vessels interfere with the normal flow of fluid out of the eye, pressure may build up in the eyeball. This can damage the nerve that carries images from the eye to the brain (optic nerve), resulting in glaucoma.
[0010] Efforts have been made to apply DR detection techniques in the health care system at primary care sites, where patients with diabetes are regularly seen; this could improve the percentage of patients screened when indicated. Screening for retinopathy is conventionally done through fundus examination by ophthalmologists or retinal colour photography using conventional mydriatic or non-mydriatic fundus cameras by optometrists or trained eye technicians. However, conventional DR screening programs commonly utilize retinal fundus photography, which depends on skilled readers for manual DR evaluation. This is labor-intensive and suffers from inconsistency. Further, there are several other clinical parameters associated with the causation and progression of DR, such as blood pressure, body mass index, protein in urine, raised fats (triglycerides in the blood), glycosylated hemoglobin (HbA1c), etc. Existing technologies do not provide a technique to analyse all these parameters simultaneously. Furthermore, at present, in medical image processing technology, reliable diabetic retinopathy detection from digital fundus images is known as an open problem and alternative solutions need to be developed. In this context, manual interpretation of retinal fundus images requires a significant magnitude of work, expertise, and processing time. This leads not only to delayed interpretation, but also to loss to follow-up, miscommunication, and delay in proceeding with prediction of DR.
[0011] Therefore, there exists a need for an efficient, effective and improved diabetic retinopathy prediction device, system and method for early prediction of diabetic retinopathy with application of deep learning. Further, there is a need for a device, system and method for early prediction of diabetic retinopathy that overcome the above-mentioned and other limitations of the existing solutions and utilize techniques which are robust, accurate, fast, efficient and simple, and which do not require a skilled professional for prediction of DR. Furthermore, there is a need for a diagnostic device for early prediction of DR with increased classification sensitivity, specificity, accuracy and speed with application of deep learning.
[0012] As used in the description herein and throughout the claims that follow, the
meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.
[0013] In some embodiments, the numerical parameters set forth in the written
description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.

[0014] The recitation of ranges of values herein is merely intended to serve as a
shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. "such as") provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[0015] Groupings of alternative elements or embodiments of the invention disclosed
herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.
OBJECTS OF THE INVENTION
[0016] Some of the objects of the present disclosure, which at least one embodiment
herein satisfies are as listed herein below.
[0017] It is an object of the present disclosure to provide a diagnostic device, system
and method for early prediction of diabetic retinopathy (DR).
[0018] It is another object of the present disclosure to provide a diagnostic device,
system and method for early prediction of DR with an increased classification sensitivity,
specificity, accuracy and speed with application of deep learning.
[0019] It is another object of the present disclosure to provide a diagnostic device, system and method for early prediction of DR in which all clinical parameters of the body are analysed along with the retinal fundus image for accurate identification and prediction of DR.
[0020] It is another object of the present disclosure to provide a diagnostic device, system and method for early prediction of DR which is fully automated, unlike manual inspection which is time-consuming, expensive, tedious and requires a trained professional to analyze the fundus image.

[0021] It is another object of the present disclosure to provide a diagnostic device, system and method for early prediction of DR in which all clinical parameters of the body are analysed along with the retinal fundus image for identification of DR.
[0022] It is another object of the present disclosure to provide a diagnostic device, system and method for early prediction of DR which provides a real-time output on the device with a prediction result of DR to avoid loss to follow-up, miscommunication, and delay in proceeding with management of DR.
SUMMARY
[0023] This summary is provided to introduce a selection of concepts in a simplified form to be further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[0024] An aspect of the present disclosure relates to a device to detect a presence, progression or treatment effect of a disease characterized by retinal pathological changes in a user. The device includes an image capturing device and a memory coupled to one or more processors. The image capturing device obtains a retinal fundus image from the user. The image capturing device includes a light source for emitting a light and a light diffusing device adapted to receive the light from the light source and to redirect the light toward an eye of the user to provide a substantially even illumination to a retina of the eye to obtain the retinal fundus image. The memory comprises executable instructions which, upon execution by the one or more processors, configure the device to obtain one or more physiological parameters of the user in real-time from the image capturing device, retrieve the obtained retinal fundus image and the one or more obtained physiological parameters, and compare one or more extracted features with at least one pre-stored feature in a database to generate at least a prediction result indicative of detection of the presence, the progression or the treatment effect of the disease in the user.
[0025] In an aspect, the retinal pathological changes are associated with a diabetic
retinopathy (DR).
[0026] In an aspect, the image capturing device is a retinal fundus camera.
[0027] In an aspect, the image capturing device is communicably coupled to the
device.

[0028] In an aspect, the one or more physiological parameters are selected from any or a combination of blood pressure, body mass index, protein in urine, raised fats (triglycerides in the blood), and glycosylated hemoglobin (HbA1c).
[0029] In an aspect, the device further includes a display device configured to display the generated prediction result indicative of the detection of the presence, the progression or the treatment effect of the disease in the user.
[0030] In an aspect, the disease is selected from the group consisting of stroke,
hypertension, diabetes, cardiovascular diseases including coronary heart disease and cerebral vascular disease, glaucoma, prematurity, papilloedema, and common retina disease.
[0031] In an aspect, the prediction result is generated by at least one algorithm selected from a machine-learning algorithm, a deep learning algorithm, and an artificial intelligence (AI) algorithm that outputs the prediction result.
[0032] In an aspect, the device further comprises an input device configured to receive the one or more physiological parameters as an input.
[0033] An aspect of the present disclosure relates to a method for detecting a presence, progression or treatment effect of a disease characterized by retinal pathological changes in a user. The method includes the steps of: obtaining a retinal fundus image from the user at an image capturing device, obtaining one or more physiological parameters of the user in real-time from the image capturing device at the device, retrieving the obtained retinal fundus image and the one or more obtained physiological parameters at the device, and comparing the one or more extracted features with at least one pre-stored feature in a database at the device.
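For illustration only, the method steps above can be sketched as a minimal pipeline. The toy feature extraction, the cosine-similarity comparison, and all names and thresholds below are assumptions chosen for exposition; they are not details from the claimed method.

```python
# Illustrative sketch of the claimed steps: extract features from the
# fundus image plus physiological parameters, then compare against
# pre-stored feature vectors in a database to produce a prediction.
import math

def extract_features(fundus_image, physiological_params):
    """Toy feature vector: mean pixel intensity plus raw parameters."""
    mean_intensity = sum(fundus_image) / len(fundus_image)
    return [mean_intensity] + list(physiological_params)

def compare(features, stored_features):
    """Cosine similarity between extracted and pre-stored features."""
    dot = sum(a * b for a, b in zip(features, stored_features))
    norm = math.sqrt(sum(a * a for a in features)) * math.sqrt(
        sum(b * b for b in stored_features))
    return dot / norm if norm else 0.0

def predict_dr(fundus_image, physiological_params, database, threshold=0.95):
    """Return the best-matching stored label, or 'inconclusive'."""
    features = extract_features(fundus_image, physiological_params)
    best_label, best_score = "unknown", 0.0
    for label, stored in database.items():
        score = compare(features, stored)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else "inconclusive"
```

For example, with a database of two hypothetical stored profiles, `predict_dr([118, 122], (139, 27.4), {"NPDR": [120.0, 140, 27.5], "no_DR": [90.0, 115, 22.0]})` selects the closer profile.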
[0034] Various objects, features, aspects and advantages of the present disclosure will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like features.
BRIEF DESCRIPTION OF THE DRAWINGS
[0035] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0036] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0037] FIG. 1 illustrates an exemplary network architecture of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
[0038] FIG. 2 illustrates an exemplary module diagram of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
[0039] FIG. 3 illustrates an exemplary flow diagram of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
[0040] FIG. 4 illustrates an exemplary computer system utilized for implementation of the proposed system, in accordance with an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION
[0041] Embodiments of the present disclosure include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, and firmware or by human operators.
[0042] If the specification states a component or feature “may”, “can”, “could”, or
“might” be included or have a characteristic, that particular component or feature is not
required to be included or have the characteristic.
[0043] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the disclosure to those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).

[0044] Thus, for example, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this disclosure. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any electronic code generators shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this disclosure. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular implementation.
[0045] Various terms as used herein are shown below. To the extent a term used in a claim is not defined below, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0046] Problems to be solved by the present invention are as follows. Efforts have been made to apply DR detection techniques in the health care system at primary care sites, where patients with diabetes are regularly seen; this could improve the percentage of patients screened when indicated. Screening for retinopathy is conventionally done through fundus examination by ophthalmologists or retinal colour photography using conventional mydriatic or non-mydriatic fundus cameras by optometrists or trained eye technicians. However, conventional DR screening programs commonly utilize retinal fundus photography, which depends on skilled readers for manual DR evaluation. This is labor-intensive and suffers from inconsistency. Further, there are several other clinical parameters associated with the causation and progression of DR, such as blood pressure, body mass index, protein in urine, raised fats (triglycerides in the blood), glycosylated hemoglobin (HbA1c), etc. Existing technologies do not provide a technique to analyse all these parameters simultaneously. Furthermore, at present, in medical image processing technology, reliable diabetic retinopathy detection from digital fundus images is known as an open problem and alternative solutions need to be developed. In this context, manual interpretation of retinal fundus images requires a significant magnitude of work, expertise, and processing time. This leads not only to delayed interpretation, but also to loss to follow-up, miscommunication, and delay in proceeding with prediction of DR.

[0047] In the conventional techniques, only the retinal fundus image is the input to deep learning to detect diabetic retinopathy. In the proposed system and device, not only the retinal fundus image but also various body parameters like BMI, blood pressure, HbA1c, etc. are used as input to the deep learning network to diagnose diabetic retinopathy.
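The multimodal input described above can be sketched, purely as an illustration, as a two-branch network that fuses image-derived features with body parameters before classification. The layer sizes, the fusion by concatenation, and the randomly initialized weights below are assumptions for exposition, not details taken from the specification.

```python
# Two-branch sketch: one branch embeds fundus-image features, another
# embeds body parameters (e.g., BP, BMI, HbA1c); a fused head outputs
# a DR probability. Weights are random placeholders for illustration.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MultimodalDRNet:
    def __init__(self, image_dim=64, params_dim=3, hidden=16):
        self.w_img = rng.normal(0.0, 0.1, (image_dim, hidden))
        self.w_par = rng.normal(0.0, 0.1, (params_dim, hidden))
        self.w_out = rng.normal(0.0, 0.1, (2 * hidden, 1))

    def forward(self, image_features, body_params):
        img = relu(image_features @ self.w_img)      # image branch
        par = relu(body_params @ self.w_par)         # body-parameter branch
        fused = np.concatenate([img, par], axis=-1)  # fusion by concatenation
        return sigmoid(fused @ self.w_out)           # DR probability

net = MultimodalDRNet()
image_features = rng.normal(size=64)        # stand-in for CNN features
body_params = np.array([140.0, 27.5, 7.2])  # hypothetical BP, BMI, HbA1c
dr_probability = float(net.forward(image_features, body_params))
```

In a trained system the weights would be learned from labelled data; the point of the sketch is only the joint use of both input modalities.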
[0048] Therefore, there exists a need for an efficient, effective and improved diabetic retinopathy prediction device, system and method for early prediction of diabetic retinopathy with application of deep learning. Further, there is a need for a device, system and method for early prediction of diabetic retinopathy that overcome the above-mentioned and other limitations of the existing solutions and utilize techniques which are robust, accurate, fast, efficient and simple, and which do not require a skilled professional for prediction of DR. Furthermore, there is a need for a diagnostic device for early prediction of DR with increased classification sensitivity, specificity, accuracy and speed with application of deep learning.
[0049] An aspect of the present disclosure relates to a diabetic retinopathy prediction system for the prediction of diabetic retinopathy. The system includes a fundus camera and an artificial intelligence (AI)-configured processing module.
[0050] In an aspect, the system is capable of analyzing Retinal Fundus Images in
view of bodily parameters such as Blood Pressure (BP), Body Mass Index (BMI), Protein in Urine, Raised fats, and Glycosylated Hemoglobin (HbA1c) and predicting the presence/absence/severity of Diabetic Retinopathy in a patient.
[0051] In an aspect, the system receives retinal fundus images of a patient with the help of the fundus camera.
[0052] In an aspect, various body parameters of the patient are given as input to the AI-configured processing module. The system autonomously analyzes the retinal fundus images in view of bodily parameters and predicts the presence/absence/severity of diabetic retinopathy in the patient.
[0053] In an embodiment, the retinal fundus camera can capture the retinal fundus image. The retinal fundus camera can be connected to the device.
[0054] In another embodiment, various body parameters like blood pressure, body mass index, protein in urine, triglycerides, and haemoglobin can be input using a keyboard.
[0055] In another embodiment, a deep learning algorithm considering all computed parameters of the body can be applied on the retinal fundus image for identification of DR. At the end, one database with all details of persons will be managed by a database management system. The output can be the prediction result regarding DR on the display of the device.
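The per-person database mentioned above could be managed, for example, with a relational store. SQLite, the table name, and the column names below are assumptions for illustration; the specification does not name a particular database management system or schema.

```python
# Illustrative per-person record store: body parameters plus the
# generated prediction result, kept under a database management system.
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory DB for the sketch
conn.execute("""
    CREATE TABLE dr_records (
        patient_id     TEXT PRIMARY KEY,
        blood_pressure REAL,
        bmi            REAL,
        hba1c          REAL,
        prediction     TEXT
    )
""")
# Store one hypothetical patient's parameters and prediction result.
conn.execute(
    "INSERT INTO dr_records VALUES (?, ?, ?, ?, ?)",
    ("P001", 140.0, 27.5, 7.2, "mild NPDR"),
)
# Retrieve the prediction for display on the device.
row = conn.execute(
    "SELECT prediction FROM dr_records WHERE patient_id = ?", ("P001",)
).fetchone()
```

A deployed system would persist the database to disk and hold one row per screened person, but the access pattern is the same.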

[0056] In an embodiment, the proposed device and system can check parameters of the body like blood pressure, body mass index, etc. It also considers the features of the retinal fundus image. The device indicates whether the person is suffering from DR or not. If DR is at an early stage, then preventive steps can be taken to prevent vision loss or damage.
[0057] FIG. 1 illustrates an exemplary network architecture of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
[0058] Although the present subject matter is explained considering that a diabetic retinopathy prediction system 102 is implemented as an engine on a cloud server 104, it may be understood that the diabetic retinopathy prediction system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. It may be understood that the diabetic retinopathy prediction system 102 may be accessed by multiple users 110-1, 110-2…110-N, collectively referred to as user 110 hereinafter, through one or more respective computing devices 108-1, 108-2…108-N, collectively referred to as user device 108 hereinafter, or applications residing on the computing devices 108. Examples of the computing devices 108 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The computing devices 108 are communicatively coupled to the diabetic retinopathy prediction system 102 through a network 106.
[0059] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network can be implemented as one of the different types of networks, such as an intranet, local area network (LAN), wide area network (WAN), the Internet, and the like. The network 106 may either be a dedicated network or a shared network. A shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0060] In an implementation, the system is embedded with/incorporated with one or more Internet of Things (IoT) devices. A typical network architecture of the present disclosure can include a plurality of network devices, such as transmitters, receivers, and/or transceivers, that may include one or more IoT devices.
[0061] As used herein, an IoT device can be a device that includes sensing and/or control functionality as well as a WiFi™ transceiver radio or interface, a Bluetooth™ transceiver radio or interface, a Zigbee™ transceiver radio or interface, an Ultra-Wideband (UWB) transceiver radio or interface, a Wi-Fi-Direct transceiver radio or interface, a Bluetooth™ Low Energy (BLE) transceiver radio or interface, and/or any other wireless network transceiver radio or interface that allows the IoT device to communicate with a wide area network and with one or more other devices. In some embodiments, an IoT device does not include a cellular network transceiver radio or interface, and thus may not be configured to directly communicate with a cellular network. In some embodiments, an IoT device may include a cellular transceiver radio, and may be configured to communicate with a cellular network using the cellular network transceiver radio.
[0062] A user may communicate with the network devices using an access device that may include any human-to-machine interface with network connection capability that allows access to a network. For example, the access device may include a stand-alone interface (e.g., a cellular telephone, a smartphone, a home computer, a laptop computer, a tablet, a personal digital assistant (PDA), a computing device, a wearable device such as a smart watch, a wall panel, a keypad, or the like), an interface that is built into an appliance or other device (e.g., a television, a refrigerator, a security system, a game console, a browser, or the like), a speech or gesture interface (e.g., a Kinect™ sensor, a Wiimote™, or the like), an IoT device interface (e.g., an Internet-enabled device such as a wall switch, a control interface, or other suitable interface), or the like. In some embodiments, the access device may include a cellular or other broadband network transceiver radio or interface, and may be configured to communicate with a cellular or other broadband network using the cellular or broadband network transceiver radio. In some embodiments, the access device may not include a cellular network transceiver radio or interface.
[0063] A user may interact with the network devices using an application, a web browser, a proprietary program, or any other program executed and operated by the access device. In some embodiments, the access device may communicate directly with the network devices (e.g., via a communication signal). For example, the access device may communicate directly with network devices using Zigbee™ signals, Bluetooth™ signals, WiFi™ signals, infrared (IR) signals, UWB signals, WiFi-Direct signals, BLE signals, sound frequency signals, or the like. In some embodiments, the access device may communicate with the network devices via the gateways and/or a cloud network.
[0064] The network access provided by gateway may be of any type of network
familiar to those skilled in the art that can support data communications using any of a variety of commercially-available protocols. For example, gateways may provide wireless
communication capabilities for the local area network using particular communications
protocols, such as WiFi™ (e.g., IEEE 802.11 family standards, or other wireless
communication technologies, or any combination thereof). Using the communications
protocol(s), the gateways may provide radio frequencies on which wireless enabled devices
in the local area network can communicate. A gateway may also be referred to as a base
station, an access point, Node B, Evolved Node B (eNodeB), access point base station, a Femtocell, home base station, home Node B, home eNodeB, or the like.
[0065] A router gateway may include access point and router functionality, and may
further include an Ethernet switch and/or a modem. For example, a router gateway may
receive and forward data packets among different networks. When a data packet is received,
the router gateway may read identification information (e.g., a media access control (MAC) address) in the packet to determine the intended destination for the packet. The router gateway may then access information in a routing table or routing policy, and may direct the packet to the next network or device in the transmission path of the packet. The data packet
may be forwarded from one gateway to another through the computer networks until the
packet is received at the intended destination.
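The forwarding behaviour described above can be sketched as a simple destination lookup. This is a minimal illustration, not the patented method; the `forwarding_table` contents, the packet layout, and the hop names are assumptions made for the example:

```python
# Minimal sketch of MAC-based forwarding as described above: read the
# destination identifier from the packet, consult a routing table, and
# return the next hop. All table entries are illustrative.
forwarding_table = {
    "aa:bb:cc:dd:ee:01": "gateway-2",
    "aa:bb:cc:dd:ee:02": "local-lan",
}

def route_packet(packet: dict, default_next_hop: str = "upstream") -> str:
    """Return the next hop for a packet based on its destination MAC."""
    return forwarding_table.get(packet["dest_mac"], default_next_hop)

print(route_packet({"dest_mac": "aa:bb:cc:dd:ee:01"}))  # gateway-2
print(route_packet({"dest_mac": "ff:ff:ff:ff:ff:ff"}))  # upstream
```

A real router gateway would also rewrite headers and consult routing policies; only the lookup-then-forward structure is mirrored here.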
[0066] A typical network architecture of the present disclosure can include a plurality of network devices such as transmitters, receivers, and/or transceivers that may include one or more Internet of Things (IoT) devices. As used herein, an IoT device can be a device that includes sensing and/or control functionality as well as a Wi-Fi transceiver radio or interface, a Bluetooth transceiver radio or interface, a Zigbee transceiver radio or interface, an Ultra-Wideband (UWB) transceiver radio or interface, a Wi-Fi Direct transceiver radio or interface, a Bluetooth Low Energy (BLE) transceiver radio or interface, and/or any other wireless network transceiver radio or interface that allows the IoT device to communicate with a wide area network and with one or more other devices. In some embodiments, an IoT device may include a cellular transceiver radio, and may be configured to communicate with a cellular network using the cellular network transceiver radio. IoT devices may include home automation network devices that allow a user to access, control, and/or configure various home appliances located within the user's home (e.g., a television, radio, light, fan, humidifier, sensor, microwave, iron, and/or the like), or outside of the user's home (e.g., exterior motion sensors, exterior lighting, garage door openers, sprinkler systems, or the like). A network device may include a home automation switch that may be coupled with a home appliance. In some embodiments, network devices may be used in other environments, such as a business, a school, an establishment, a park, or any place that can
support a local area network to enable communication with network devices. For example, a
network device can allow a user to access, control, and/or configure devices, such as office-
related devices (e.g., copy machine, printer, fax machine, or the like), audio and/or video
related devices (e.g., a receiver, a speaker, a projector, a DVD player, a television, or the
like), media-playback devices (e.g., a compact disc (CD) player or the like),
computing devices (e.g., a home computer, a laptop computer, a tablet, a personal digital assistant (PDA), a computing device, a wearable device, or the like), lighting devices (e.g., a lamp, recessed lighting, or the like), devices associated with a security system, devices associated with an alarm system, devices that can be operated in an automobile (e.g., radio
devices, navigation devices), and/or the like.
[0067] FIG. 2 illustrates an exemplary module diagram of a proposed system, in
accordance with an exemplary embodiment of the present disclosure.
[0068] In one embodiment, the proposed diabetic retinopathy prediction device or
device 202 may include at least one processor 204, an input/output (I/O) interface 206, an
image capturing device 208 and a memory 210.
[0069] The processor 204 may be implemented as one or more microprocessors,
microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor is configured to fetch and
execute computer-readable instructions stored in the memory.
[0070] The I/O interface 206 may include a variety of software and hardware
interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface may allow the system to interact with a user directly. The I/O interface can facilitate multiple communications within a wide variety of networks and protocol types, including
wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN,
cellular, or satellite. The I/O interface may include one or more ports for connecting a number of devices to one another or to another server.
[0071] The memory 210 may include any computer-readable medium known in the art
including, for example, volatile memory, such as static random access memory (SRAM) and
dynamic random access memory (DRAM), and/or non-volatile memory, such as read only
memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory may include modules and data. The modules include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types.
[0072] In one implementation, the diabetic retinopathy prediction device 202
can detect a presence, progression or treatment effect of a disease characterized by retinal pathological changes in a user.
[0073] In an embodiment, the image capturing device 208 can obtain a retinal fundus
image from the user. The image capturing device 208 can include a light source for emitting
a light and a light diffusing device adapted to receive the light from the light source and to redirect the light toward an eye of a user to provide a substantially even illumination to a retina of the eye to obtain the retinal fundus image. The image capturing device 208 is a retinal fundus camera.
[0074] In another embodiment, the diabetic retinopathy prediction device 202
can obtain one or more physiological parameters of the user in real-time from the image capturing device.
[0075] In another embodiment, the diabetic retinopathy prediction device 202
can retrieve the obtained retinal fundus image and the one or more obtained physiological
parameters to extract one or more features associated with a current health condition of the
retina of the eye of the user.
[0076] In another embodiment, the diabetic retinopathy prediction device 202
can compare the one or more extracted features with at least one pre-stored feature in a database to generate at least a prediction result indicative of detection of the presence, the
progression or the treatment effect of the disease in the user.
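The compare step above can be illustrated with a nearest-match sketch. The disclosure does not specify feature dimensions, labels, or a distance measure, so the three-element vectors, the label names, and the Euclidean distance below are all illustrative assumptions:

```python
import math

# Pre-stored reference features in the database (illustrative values only).
PRESTORED_FEATURES = {
    "no_dr":     [0.10, 0.05, 0.02],
    "mild_dr":   [0.40, 0.30, 0.20],
    "severe_dr": [0.85, 0.75, 0.70],
}

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def predict(extracted_features):
    """Return the label of the closest pre-stored feature vector."""
    return min(PRESTORED_FEATURES,
               key=lambda label: euclidean(PRESTORED_FEATURES[label],
                                           extracted_features))

print(predict([0.82, 0.70, 0.68]))  # severe_dr
```

The generated label stands in for the "prediction result" of the disclosure; a deployed system would produce it with the trained machine-learning or deep learning model rather than a raw distance rule.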
[0077] In another embodiment, the image capturing device 208 can be communicably
coupled, in a wired or wireless manner, to the diabetic retinopathy prediction device 202.
[0078] In another embodiment, the proposed device 202 can include a display device
218 configured to display the generated prediction result indicative of the detection of the
presence, the progression or the treatment effect of the disease in the user.
[0079] In another embodiment, the prediction result can be generated by at least one
algorithm selected from a machine-learning algorithm, a deep learning algorithm, and an
artificial intelligence (AI) algorithm that outputs the prediction result.
[0080] FIG. 3 illustrates an exemplary flow diagram of a proposed system, in
accordance with an exemplary embodiment of the present disclosure.
[0081] At step 302, a retinal fundus image can be obtained from the user at an image capturing device.
[0082] At step 304, one or more physiological parameters of the user can be obtained at the device, in real-time, from the image capturing device.
[0083] At step 306, the obtained retinal fundus image and the one or more obtained physiological parameters can be retrieved at the device to extract one or more features.
[0084] At step 308, the one or more extracted features can be compared with at least one pre-stored feature in a database at the device.
[0085] FIG. 4 illustrates an exemplary computer system utilized for implementation
of the proposed system, in accordance with an exemplary embodiment of the present disclosure. In an embodiment, the proposed diabetic retinopathy prediction can be implemented in the computer system 400 to enable aspects of the present disclosure. Embodiments of the present disclosure include various steps, which have
been described above. A variety of these steps may be performed by hardware components or
may be tangibly embodied on a computer-readable storage medium in the form of machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with instructions to perform these steps. Alternatively, the steps may be performed by a combination of hardware, software, and/or firmware. As shown in the
figure, computer system 400 includes an external storage device 410, a bus 420, a main
memory 430, a read only memory 440, a mass storage device 450, communication port 460, and a processor 470. A person skilled in the art will appreciate that computer system 400 may include more than one processor and communication ports. Examples of processor 470 include, but are not limited to, an Intel® Itanium® or Itanium 2 processor(s), or AMD®
Opteron® or Athlon MP® processor(s), Motorola® lines of processors, FortiSOC™ system
on a chip processors or other future processors. Processor 470 may include various modules associated with embodiments of the present invention. Communication port 460 can be any of an RS-232 port for use with a modem based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing
or future ports. Communication port 460 may be chosen depending on a network, such as a
Local Area Network (LAN), Wide Area Network (WAN), or any network to which computer system 400 connects. Memory 430 can be Random Access Memory (RAM), or any other dynamic storage device commonly known in the art. Read only memory 440 can be any static storage device(s), e.g., but not limited to, Programmable Read Only Memory (PROM) chips for storing static information, e.g., start-up or BIOS instructions for processor 470. Mass
storage 450 may be any current or future mass storage solution, which can be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external,
e.g., having Universal Serial Bus (USB) and/or Firewire interfaces), e.g. those available from
Seagate (e.g., the Seagate Barracuda 7200 family) or Hitachi (e.g., the Hitachi Deskstar
7K1000), one or more optical discs, Redundant Array of Independent Disks (RAID) storage,
e.g. an array of disks (e.g., SATA arrays), available from various vendors including Dot Hill
Systems Corp., LaCie, Nexsan Technologies, Inc. and Enhance Technology, Inc. Bus 420
communicatively couples processor(s) 470 with the other memory, storage, and communication blocks. Bus 420 can be, e.g., a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), USB or the like, for connecting expansion cards, drives, and other subsystems, as well as other buses, such as a front side bus (FSB), which connects processor 470 to the software system. Optionally, operator and
administrative interfaces, e.g. a display, keyboard, and a cursor control device, may also be coupled to bus 420 to support direct operator interaction with computer system 400. Other operator and administrative interfaces can be provided through network connections connected through communication port 460. External storage device 410 can be any kind of
external hard drives, floppy drives, IOMEGA® Zip Drives, Compact Disc - Read Only Memory (CD-ROM), Compact Disc - Re-Writable (CD-RW), Digital Video Disk - Read Only Memory (DVD-ROM), or the like. Components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system limit the scope of the present disclosure.
[0086] In this respect, before explaining at least one embodiment of the invention in
detail, it is to be understood that the invention is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology
and terminology employed herein are for the purpose of description and should not be
regarded as limiting. These together with other objects of the invention, along with the various features of novelty which characterize the invention, are pointed out with particularity in the disclosure. For a better understanding of the invention, its operating advantages and the specific objects attained by its uses, reference should be had to the
accompanying drawings and descriptive matter in which there are illustrated preferred
embodiments of the invention.
[0087] While the preferred embodiment of the invention has been set forth for the
purpose of disclosure, modifications of the disclosed embodiment of the invention as well as other embodiments thereof may occur to those skilled in the art. Accordingly, the appended
claims are intended to cover all embodiments, which do not depart from the spirit and scope of the invention.
[0088] The foregoing objects, features, and advantages are described below in detail with reference to the accompanying drawings so that one of ordinary skill in the art can easily carry out the technical features of the present invention. In the following description, a detailed description of known art related to the invention is omitted where it is determined to unnecessarily obscure the subject matter of the present invention. Preferred embodiments according to the present invention will be described below in detail with reference to the accompanying drawings. Like reference numerals in the drawings are used to refer to the same or similar elements.
[0089] It should be apparent to those skilled in the art that many more modifications
besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of
the appended claims. Moreover, in interpreting both the specification and the claims, all
terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements,
components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C … and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily
modify and/or adapt for various applications such specific embodiments without departing
from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein
have been described in terms of preferred embodiments, those skilled in the art will recognize
that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.
While embodiments of the present disclosure have been illustrated and described, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications,
changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.

We Claim

1. A device (202) to detect a presence, progression or treatment effect of a disease
characterized by retinal pathological changes in a user, the device comprising:
an image capturing device (208) to obtain a retinal fundus image from the user, the image capturing device comprises a light source for emitting a light and a light diffusing device adapted to receive the light from the light source and to redirect the light toward an eye of a user to provide a substantially even illumination to a retina of the eye to obtain the retinal fundus image;
a memory (210) coupled to one or more processors (204), the memory comprising executable instructions which upon execution by the one or more processors configures the device to:
obtain one or more physiological parameters of the user in real-time from the
image capturing device;
retrieve the obtained retinal fundus image and the one or more obtained physiological parameters to extract one or more features associated with a current health condition of the retina of the eye of the user;
compare the one or more extracted features with at least one pre-stored feature
in a database to generate at least a prediction result indicative of detection of the presence, the progression or the treatment effect of the disease in the user.
2. The device (202) as claimed in claim 1, wherein the retinal pathological changes are associated with a diabetic retinopathy (DR).
3. The device (202) as claimed in claim 1, wherein the image capturing device (208) is a retinal fundus camera.
4. The device (202) as claimed in claim 1, wherein the image capturing device (208) is communicably coupled to the device (202).
5. The device (202) as claimed in claim 1, wherein the one or more physiological parameters are selected from any or a combination of blood pressure, body mass index, protein in urine, raised fats (triglycerides in the blood), and glycosylated hemoglobin (HbA1c).

6. The device (202) as claimed in claim 1, wherein the device further comprises:
a display device (218) configured to display the generated prediction result indicative of the detection of the presence, the progression or the treatment effect of the disease in the user.
7. The device (202) as claimed in claim 1, wherein the disease is selected from the group consisting of stroke, hypertension, diabetes, cardiovascular diseases including coronary heart disease and cerebral vascular disease, glaucoma, prematurity, papilledema, and common retina disease.
8. The device (202) as claimed in claim 1, wherein the prediction result is generated by at least one algorithm selected from a machine-learning algorithm, a deep learning algorithm, and an artificial intelligence (AI) algorithm that outputs the prediction result.
9. The device (202) as claimed in claim 1, wherein the device further comprises an input device configured to receive the one or more physiological parameters as an input.
10. A method for detecting a presence, progression or treatment effect of a disease characterized by retinal pathological changes in a user, using a device as claimed in claim 1, the method comprising the steps of:
obtaining (302), at an image capturing device, a retinal fundus image from the user, wherein the image capturing device comprises a light source for emitting a light and a light diffusing device adapted to receive the light from the light source and to redirect the light toward an eye of a user to provide a substantially even illumination to a retina of the eye to obtain the retinal fundus image;
obtaining (304), at the device, one or more physiological parameters of the user in real-time from the image capturing device, wherein the one or more physiological parameters are selected from any or a combination of blood pressure, body mass index, protein in urine, raised fats (triglycerides in the blood), and glycosylated hemoglobin (HbA1c);
retrieving (306), at the device, the obtained retinal fundus image and the one or more obtained physiological parameters to extract one or more features associated with a current health condition of the retina of the eye of the user; and
comparing (308), at the device, the one or more extracted features with at least one pre-stored feature in a database to generate at least a prediction result indicative of detection

of the presence, the progression or the treatment effect of the disease in the user, wherein the prediction result is generated by at least one algorithm selected from a machine-learning algorithm, a deep learning algorithm, and an artificial intelligence (AI) algorithm that outputs the prediction result.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 201911029805-IntimationOfGrant26-08-2024.pdf 2024-08-26
2 201911029805-PatentCertificate26-08-2024.pdf 2024-08-26
3 201911029805-Annexure [24-08-2024(online)].pdf 2024-08-24
4 201911029805-Written submissions and relevant documents [24-08-2024(online)].pdf 2024-08-24
5 201911029805-Correspondence to notify the Controller [08-08-2024(online)].pdf 2024-08-08
6 201911029805-US(14)-ExtendedHearingNotice-(HearingDate-12-08-2024)-1500.pdf 2024-07-30
7 201911029805-Annexure [01-03-2024(online)].pdf 2024-03-01
8 201911029805-Written submissions and relevant documents [01-03-2024(online)].pdf 2024-03-01
9 201911029805-FORM-26 [13-02-2024(online)].pdf 2024-02-13
10 201911029805-Correspondence to notify the Controller [12-02-2024(online)].pdf 2024-02-12
11 201911029805-US(14)-HearingNotice-(HearingDate-15-02-2024).pdf 2024-01-16
12 201911029805-CLAIMS [09-09-2022(online)].pdf 2022-09-09
13 201911029805-CORRESPONDENCE [09-09-2022(online)].pdf 2022-09-09
14 201911029805-FER_SER_REPLY [09-09-2022(online)].pdf 2022-09-09
15 201911029805-FER.pdf 2022-03-10
16 201911029805-FORM 18 [12-06-2021(online)].pdf 2021-06-12
17 201911029805-Proof of Right (MANDATORY) [24-10-2019(online)].pdf 2019-10-24
18 201911029805-FORM-26 [03-10-2019(online)].pdf 2019-10-03
19 abstract.jpg 2019-08-31
20 201911029805-STATEMENT OF UNDERTAKING (FORM 3) [23-07-2019(online)].pdf 2019-07-23
21 201911029805-FORM FOR STARTUP [23-07-2019(online)].pdf 2019-07-23
22 201911029805-FORM FOR SMALL ENTITY(FORM-28) [23-07-2019(online)].pdf 2019-07-23
23 201911029805-FORM 1 [23-07-2019(online)].pdf 2019-07-23
24 201911029805-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [23-07-2019(online)].pdf 2019-07-23
25 201911029805-EVIDENCE FOR REGISTRATION UNDER SSI [23-07-2019(online)].pdf 2019-07-23
26 201911029805-DRAWINGS [23-07-2019(online)].pdf 2019-07-23
27 201911029805-DECLARATION OF INVENTORSHIP (FORM 5) [23-07-2019(online)].pdf 2019-07-23
28 201911029805-COMPLETE SPECIFICATION [23-07-2019(online)].pdf 2019-07-23

Search Strategy

1 SS2_201911029805AE_13-09-2022.pdf
2 SS_201911029805E_10-03-2022.pdf

ERegister / Renewals