
A Portable Device For Determining Ocular Features

Abstract: The present disclosure relates to a portable device for determining ocular features non-invasively. The device includes a light source to illuminate a target positioned along an optical axis; an optical sensor to capture one or more optical images of an eye of a user illuminated by the light source, wherein the optical sensor is positioned at a predefined distance from the eye and aligned along the optical axis; a transceiver adapted to generate an ultrasound signal such that the generated ultrasound signal is received at the transceiver after reflection from the eye; and a control unit configured to process the captured one or more images and extract values of one or more attributes from the processed one or more images, wherein the extracted values of the one or more attributes are compared with pre-defined values of the one or more attributes stored in a first database, wherein, based on the comparison, one or more features associated with the eye are determined non-invasively, and wherein, based on the received ultrasound signal, the control unit is configured to determine a first set of features from the one or more features.


Patent Information

Application #
Filing Date
29 November 2018
Publication Number
34/2020
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Parent Application
Patent Number
Legal Status
Grant Date
2024-01-24
Renewal Date

Applicants

Chitkara Innovation Incubation Foundation
SCO: 160-161, Sector -9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. SINGH, Varsha
Chitkara School of Health Sciences, Chitkara University, Punjab Campus, Chandigarh-Patiala National Highway (NH-64), Tehsil: Rajpura, Distt. Patiala-140401, Punjab, India.
2. ZAMAN, Mohammed Noor Uz
Chitkara School of Health Sciences, Chitkara University, Punjab Campus, Chandigarh-Patiala National Highway (NH-64), Tehsil: Rajpura, Distt. Patiala-140401, Punjab, India.

Specification

DESC:TECHNICAL FIELD
[001] The present disclosure pertains generally to ocular imaging devices. In particular, the present disclosure relates to a portable device for determining ocular features non-invasively.

BACKGROUND
[002] Since the late 19th century, when retinal imaging was first described, there has been steady technical improvement in imaging of the fundus of the eye. Digital fundus imaging is used extensively in the diagnosis, monitoring, and management of many retinal diseases. Access to fundus photography is often limited by patient morbidity, high equipment cost, and a shortage of trained personnel. Direct ophthalmoscopes, binocular indirect ophthalmoscopes, and funduscopic cameras are the standard of care for ocular examination and these devices can cost as much as $30,000 each.
[003] Commercial medical diagnosis instruments are generally expensive to purchase. In remote or sparsely populated regions, the cost of purchasing the diagnosis instruments may be commercially unrealistic when weighed against the number of people such instruments will service. This means that the patient has to travel to a remote service provider in order to be diagnosed via such instruments. This can then be costly for the patient, both in monetary terms and possibly in health terms. Even in situations where the above problem does not exist, there may still be the problem of the lack of specialist expertise necessary to properly operate such instruments.
[004] There is therefore a need in the art for a device for determining ocular features that is easy to implement, cost-effective, and capable of determining ocular features non-invasively, such as the extent of damage to the meibomian glands and tear film, quantifying the tear break-up time, and imaging eyes for ocular health in diseases such as dry eye disease and meibomian gland dysfunction, through transillumination of eyelid structures and the like.
[005] All publications herein are incorporated by reference to the same extent as if each individual publication or patent application were specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.
[006] In some embodiments, the numbers expressing quantities or dimensions of items, and so forth, used to describe and claim certain embodiments of the invention are to be understood as being modified in some instances by the term “about.” Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.
[007] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[008] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[009] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.
OBJECTS OF THE PRESENT DISCLOSURE
[0010] Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are listed herein below.
[0011] It is an object of the present disclosure to provide a portable device for determining ocular features.
[0012] It is another object of the present disclosure to provide a portable device for determining ocular features that is easy to implement and does not require an expert to operate.
[0013] It is another object of the present disclosure to provide a portable device for determining ocular features that is cost-effective.
[0014] It is another object of the present disclosure to provide a portable device for determining ocular features non-invasively.
[0015] It is another object of the present disclosure to provide a portable device for determining ocular features that enables determining of multiple features using a single device.

SUMMARY
[0016] The present disclosure pertains generally to ocular imaging devices. In particular, the present disclosure relates to a portable device for determining ocular features non-invasively.
[0017] An aspect of the present disclosure relates to a portable device for determining ocular features non-invasively. The device can include: a support structure; a light source coupled to the support structure to illuminate a target positioned along an optical axis; an optical sensor coupled to the support structure to capture one or more optical images of an eye of a user illuminated by the light source, wherein the optical sensor is positioned at a predefined distance from the eye and aligned along the optical axis; a transceiver adapted to generate an ultrasound signal such that the generated ultrasound signal is received at the transceiver after reflection from the eye; and a control unit configured to process the captured one or more images and extract values of one or more attributes from the processed one or more images, wherein the extracted values of the one or more attributes are compared with pre-defined values of the one or more attributes stored in a first database, wherein, based on the comparison, one or more features associated with the eye are determined non-invasively, and wherein, based on the received ultrasound signal, the control unit is configured to determine a first set of features from the one or more features.
[0018] In an embodiment, the one or more features comprise dry eye diseases and meibomian gland dysfunction, determined through transillumination of eyelid structures.
[0019] In an embodiment, the first set of features comprises dry eye disease, keratoconus screening, and corneal surface imaging.
[0020] In an embodiment, the second set of features comprises astigmatism and a corneal dystrophy.
[0021] In an embodiment, the device comprises an infrared camera to capture a thermal image of the eye.
[0022] In an embodiment, the pre-defined distance is about 5 to 10 centimetres.
[0023] In an embodiment, the device comprises a second memory to store the captured one or more images for analysis.
[0024] In an embodiment, the light source comprises any or a combination of a white light source, a red light source and a green light source.

BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The diagrams are for illustration only and thus do not limit the present disclosure, wherein:
[0026] FIG. 1 illustrates an exemplary representation of a portable device for determining ocular features in accordance with an embodiment of the present disclosure.
[0027] FIG. 2 illustrates an exemplary architectural representation of the proposed device to facilitate determining ocular features in accordance with an embodiment of the present disclosure.
[0028] FIGs. 3A and 3B illustrate exemplary representations of a device for determining ocular health features in accordance with an embodiment of the present disclosure.
[0029] FIG. 4 illustrates an exemplary computer system in which or with which embodiments of the present invention may be utilized in accordance with embodiments of the present disclosure.

DETAILED DESCRIPTION
[0030] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0031] Embodiments of the present invention include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, and firmware and/or by human operators.
[0032] Embodiments of the present invention may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, and semiconductor memories, such as ROMs, programmable read-only memories (PROMs), random access memories (RAMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other types of media/machine-readable media suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).
[0033] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present invention with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present invention may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the invention could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
[0034] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0035] The present disclosure pertains generally to ocular imaging devices. In particular, the present disclosure relates to a portable device for determining ocular features non-invasively.
[0036] An aspect of the present disclosure relates to a portable device for determining ocular features non-invasively. The device can include: a support structure; a light source coupled to the support structure to illuminate a target positioned along an optical axis; an optical sensor coupled to the support structure to capture one or more optical images of an eye of a user illuminated by the light source, wherein the optical sensor is positioned at a predefined distance from the eye and aligned along the optical axis; a transceiver adapted to generate an ultrasound signal such that the generated ultrasound signal is received at the transceiver after reflection from the eye; and a control unit configured to process the captured one or more images and extract values of one or more attributes from the processed one or more images, wherein the extracted values of the one or more attributes are compared with pre-defined values of the one or more attributes stored in a first database, wherein, based on the comparison, one or more features associated with the eye are determined non-invasively, and wherein, based on the received ultrasound signal, the control unit is configured to determine a first set of features from the one or more features.
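The compare-and-determine step described above can be sketched as follows. This is a minimal illustration under assumed inputs, not the patented implementation: the attribute names, threshold values, and feature labels are all hypothetical.

```python
# Hypothetical sketch of the control unit's comparison step: extracted
# attribute values are checked against pre-defined values (the "first
# database") to flag ocular features. All names and thresholds are
# illustrative, not taken from the disclosure.

PREDEFINED = {
    "tear_meniscus_height_mm": 0.25,  # illustrative lower bound for a healthy tear film
    "redness_index": 0.30,            # illustrative upper bound for ocular redness
}

def determine_features(extracted: dict) -> list:
    """Compare extracted attribute values against pre-defined values
    and return the list of ocular features flagged by the comparison."""
    features = []
    if extracted.get("tear_meniscus_height_mm", 1.0) < PREDEFINED["tear_meniscus_height_mm"]:
        features.append("possible dry eye disease")
    if extracted.get("redness_index", 0.0) > PREDEFINED["redness_index"]:
        features.append("ocular surface inflammation")
    return features
```

For example, an image yielding a tear meniscus height of 0.18 mm and a redness index of 0.12 would flag only the dry-eye feature.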
[0037] In an embodiment, the one or more features comprise dry eye diseases and meibomian gland dysfunction, determined through transillumination of eyelid structures.
[0038] In an embodiment, the second set of features comprises astigmatism and a corneal dystrophy.
[0039] In an embodiment, the device comprises an infrared camera to capture a thermal image of the eye.
[0040] In an embodiment, the pre-defined distance is about 5 to 10 centimetres.
[0041] In an embodiment, the device comprises a second memory to store the captured one or more images for analysis.
[0042] In an embodiment, the light source comprises any or a combination of a white light source, a red-light source and a green light source.
[0043] FIG. 1 illustrates an exemplary representation of a portable device for determining ocular features in accordance with an embodiment of the present disclosure.
[0044] In an embodiment, a portable device 100 for determining ocular features of a user non-invasively is shown. The device 100 can comprise a housing 102 for providing support to the various components of the device 100.
[0045] In an embodiment, the device 100 can include an optical sensor to capture one or more images of an eye 108 of the user. The optical sensor can include a digital camera, an analogue camera, and the like to capture high-resolution images of the eye for analysis to detect keratoconus, tear break-up time (TBUT), progressive eye diseases, and the like.
[0046] In an embodiment, the device 100 can include an infrared camera to capture a thermal image of the eye that can be used to detect dry eye diseases, meibomian gland dysfunction through transillumination of eyelid structures, and the like.
[0047] In an embodiment, the device 100 can include a transceiver 106 to generate ultrasonic signals. The generated ultrasonic signals reflect off the eye, and the transceiver 106 receives the reflected signals. The received ultrasonic signals can be used to determine dry eye disease, and the like.
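The echo-ranging principle behind such a transceiver can be illustrated with a simple time-of-flight calculation. This is a sketch under stated assumptions: the speed-of-sound constant and the working-range check (tied to the 5 to 10 centimetre pre-defined distance mentioned elsewhere in the disclosure) are illustrative choices, not values from the specification.

```python
# Illustrative time-of-flight sketch: an ultrasonic pulse travels from the
# transceiver to the eye and back, so the one-way distance is half the
# round-trip path. The constant assumes propagation through air.

SPEED_OF_SOUND_AIR_M_S = 343.0  # approximate speed of sound in air at room temperature

def echo_distance_cm(round_trip_time_s: float) -> float:
    """Convert an echo's round-trip time into a one-way distance in centimetres."""
    return SPEED_OF_SOUND_AIR_M_S * round_trip_time_s / 2.0 * 100.0

def within_working_range(distance_cm: float) -> bool:
    """Check an estimated distance against the 5-10 cm pre-defined standoff."""
    return 5.0 <= distance_cm <= 10.0
```

A round-trip time of roughly 408 microseconds corresponds to a standoff of about 7 cm, comfortably inside the assumed working range.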
[0048] FIG. 2 illustrates an exemplary representation of architectural representation of proposed device to facilitate determining ocular features in accordance with an embodiment of the present disclosure.
[0049] In an exemplary embodiment, the device 200 for determining ocular features of the eye of the user can include a source 202 for generating light and/or an ultrasonic signal. The source can be adapted to illuminate a target placed along an optical axis. In an embodiment, the light source can generate any or a combination of a white light, a red light and a green light.
[0050] In an embodiment, the device 200 can include the image sensor 204. The image sensor can include a high-definition camera to capture one or more images or a video of the eye for analysis. In an embodiment, the camera 204 can include any or a combination of an infrared camera and a visible-light camera. In an embodiment, the camera can be placed at a pre-defined distance from the eye. The pre-defined distance is about 5 to 10 centimetres.
[0051] In an embodiment, the device 200 can include a control unit 206 to receive the one or more images or the video of the eye from the camera 204. The received one or more images can be analyzed by the control unit 206. In an embodiment, assessment of overall ocular surface health is automated and integrated in the control unit 206, which is specially developed for this purpose. Ultrasonic probing uses a specific frequency to capture binary data of the eye surface that is directly proportional to the symptoms of dry eye diseases, and further investigates the inner architecture of the eye. The binary output received is processed automatically, and the results are combined with patient data received from thermographic images and corneal topography. Fluorescence imaging can be used to analyze tear break-up time. The data received from the test can be further integrated with the results received from corneal topography and automated for accurate quantitative and qualitative analysis.
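As one hedged example of the fluorescence-based tear break-up analysis mentioned above, break-up time can be estimated as the time at which the dry-area fraction of the imaged tear film first exceeds a threshold. The frame rate, the threshold, and the idea of a pre-computed per-frame dry-area fraction are assumptions for illustration, not details from the disclosure.

```python
# Illustrative TBUT estimator: given a per-frame fraction of the tear film
# that has "broken up" (e.g. derived from fluorescence image segmentation),
# return the elapsed time until that fraction first exceeds a threshold.

def tear_break_up_time(dry_fraction_per_frame, fps, threshold=0.05):
    """Return the time in seconds at which the dry-area fraction first
    exceeds the threshold, or None if the tear film stays intact
    throughout the recording."""
    for frame_index, dry_fraction in enumerate(dry_fraction_per_frame):
        if dry_fraction > threshold:
            return frame_index / fps
    return None
```

For instance, at 2 frames per second, a sequence whose dry fraction first crosses the 5% threshold on the fourth frame yields a break-up time of 1.5 seconds.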
[0052] In an embodiment, the device 200 can include a memory 208 for storing the captured one or more images for analysis.
[0053] In an embodiment, the device 200 comprises a display unit 210. The display unit 210 is adapted to display any or a combination of a real-time video, the captured one or more images, and a captured video of the eye.
[0054] In an embodiment, the device 200 can be operatively coupled with a database 212. The database 212 can be internally or externally coupled with the device 200. It would be appreciated that the database 212 can be present on a cloud/server. Further, the database 212 can be used for storing the captured one or more images or the captured video of the eye for analysis by an expert located at a remote location.
[0055] In an embodiment, the control unit 206 can include one or more processors or controllers. Examples of controllers include, but are not limited to, the PIC® 16F877A microcontroller, AVR® ATmega8 and ATmega16, Renesas® microcontrollers, and the like. Examples of processors include, but are not limited to, Intel® Itanium® or Itanium 2 processor(s), AMD® Opteron® or Athlon MP® processor(s), Motorola® lines of processors, FortiSOC™ system-on-a-chip processors, or other future processors.
[0056] FIGs. 3A and 3B illustrate exemplary representation of a device 300 for determining ocular health features in accordance with an embodiment of the present disclosure.
[0057] In an embodiment, a device 300 for determining ocular health features can include a support structure for providing support to various components of the device 300. The support structure can include a first member 302 and a second member 304 detachably coupled to each other. In another embodiment, the first member 302 and the second member 304 can be slidably coupled to one another. In yet another embodiment, the first member 302 and the second member 304 can be rotatably coupled to one another.
[0058] In an embodiment, the device 300 can include an optical sensor 308 for capturing images of an eye of the user. The optical sensor 308 can include, but is not limited to, one or more of a camera, an infrared camera, and a digital camera. In an embodiment, the device can include a set of light sources 310. The set of light sources can include, but is not limited to, a green light source, a yellow light source, a blue light source, and an infrared light source. Further, the device 300 can include a set of switches 306 for controlling turning ON and OFF of the set of light sources 310, either individually or together.
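Switching the light sources ON and OFF, individually or together, can be sketched as a bitmask where each bit holds the state of one source. This is a minimal software model of the switch bank; the class name, source names, and bitmask layout are illustrative and not taken from the disclosure.

```python
# Illustrative model of the switch bank 306 driving the light sources 310:
# one bit per source, so sources can be toggled individually or all at once.

LIGHT_SOURCES = ["green", "yellow", "blue", "infrared"]

class LightBank:
    """Track the ON/OFF state of each light source in a single bitmask."""

    def __init__(self):
        self.state = 0  # all sources start OFF

    def turn_on(self, name: str):
        self.state |= 1 << LIGHT_SOURCES.index(name)

    def turn_off(self, name: str):
        self.state &= ~(1 << LIGHT_SOURCES.index(name))

    def all_on(self):
        self.state = (1 << len(LIGHT_SOURCES)) - 1  # set every bit

    def is_on(self, name: str) -> bool:
        return bool(self.state & (1 << LIGHT_SOURCES.index(name)))
```

In a physical device the bitmask would be written to GPIO pins or a driver IC; here it simply records which sources a real driver should energise.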
[0059] FIG. 4 illustrates an exemplary computer system in which or with which embodiments of the present invention may be utilized in accordance with embodiments of the present disclosure.
[0060] As shown in FIG. 4, computer system 400 includes an external storage device 410, a bus 420, a main memory 430, a read only memory 440, a mass storage device 450, a communication port 460, and a processor 470. A person skilled in the art will appreciate that the computer system may include more than one processor and communication port. Examples of processor 470 include, but are not limited to, an Intel® Itanium® or Itanium 2 processor(s), or AMD® Opteron® or Athlon MP® processor(s), Motorola® lines of processors, FortiSOC™ system-on-a-chip processors or other future processors. Processor 470 may include various modules associated with embodiments of the present invention. Communication port 460 can be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. Communication port 460 may be chosen depending on a network, such as a Local Area Network (LAN), Wide Area Network (WAN), or any network to which the computer system connects.
[0061] Memory 430 can be Random Access Memory (RAM), or any other dynamic storage device commonly known in the art. Read only memory 440 can be any static storage device(s) e.g., but not limited to, a Programmable Read Only Memory (PROM) chips for storing static information e.g., start-up or BIOS instructions for processor 470. Mass storage 450 may be any current or future mass storage solution, which can be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or Firewire interfaces), e.g. those available from Seagate (e.g., the Seagate Barracuda 7102 family) or Hitachi (e.g., the Hitachi Deskstar 7K1000), one or more optical discs, Redundant Array of Independent Disks (RAID) storage, e.g. an array of disks (e.g., SATA arrays), available from various vendors including Dot Hill Systems Corp., LaCie, Nexsan Technologies, Inc. and Enhance Technology, Inc.
[0062] Bus 420 communicatively couples processor(s) 470 with the other memory, storage and communication blocks. Bus 420 can be, e.g. a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), USB or the like, for connecting expansion cards, drives and other subsystems as well as other buses, such as a front side bus (FSB), which connects processor 470 to the software system.
[0063] Optionally, operator and administrative interfaces, e.g. a display, keyboard, and a cursor control device, may also be coupled to bus 420 to support direct operator interaction with computer system. Other operator and administrative interfaces can be provided through network connections connected through communication port 460. External storage device 410 can be any kind of external hard-drives, floppy drives, IOMEGA® Zip Drives, Compact Disc - Read Only Memory (CD-ROM), Compact Disc - Re-Writable (CD-RW), Digital Video Disk - Read Only Memory (DVD-ROM). Components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system limit the scope of the present disclosure.
[0064] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document terms "coupled to" and "coupled with" are also used euphemistically to mean “communicatively coupled with” over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary device.
[0065] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C … and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
[0066] While embodiments of the present invention have been illustrated and described, it will be clear that the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the invention, as described in the claims.
[0067] In the foregoing description, numerous details are set forth. It will be apparent, however, to one of ordinary skill in the art having the benefit of this disclosure, that the present invention may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, to avoid obscuring the present invention.
[0068] Some portions of the detailed description have been presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0069] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “computing”, “comparing”, “determining”, “adjusting”, “applying”, “creating”, “ranking,” “classifying,” or the like, refer to the actions and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (e.g., electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0070] Certain embodiments of the present invention also relate to an apparatus for performing the operations herein. This apparatus may be constructed for the intended purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions.
[0071] It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reading and understanding the above description. The scope of the invention should therefore be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

ADVANTAGES OF THE PRESENT DISCLOSURE
[0072] The present disclosure provides a portable device for determining ocular features.
[0073] The present disclosure provides a portable device for determining ocular features that is easy to implement and does not require an expert to operate.
[0074] The present disclosure provides a portable device for determining ocular features that is cost-effective.
[0075] The present disclosure provides a portable device for determining ocular features non-invasively.
[0076] The present disclosure provides a portable device for determining ocular features that enables determining of multiple features using a single device.

We Claim:
1. A portable device for determining ocular features non-invasively, said device comprising:
a support structure;
a light source coupled to the support structure to illuminate a target positioned along an optical axis;
an optical sensor coupled to the support structure to capture one or more optical images of an eye of a user illuminated by the light source, wherein the optical sensor is positioned at a predefined distance from the eye and aligned along the optical axis;
a transceiver adapted to generate an ultrasound signal such that said generated ultrasound signal is received at the transceiver after reflection from the eye; and
a control unit configured to process the captured one or more images, and extract values of one or more attributes from the processed one or more images, wherein the extracted values of the one or more attributes are compared with pre-defined values of the one or more attributes stored in a first database, and
wherein based on comparison, one or more features associated with the eye are determined non-invasively, and
wherein based on the received ultrasound signal, the control unit is configured to determine a first set of features from the one or more features.
2. The device as claimed in claim 1, wherein the one or more features comprise dry eye diseases and meibomian gland dysfunction through transillumination of eyelid structures.
3. The device as claimed in claim 1, wherein the first set of features comprises dry eye disease, keratoconus screening, and corneal surface imaging.
4. The device as claimed in claim 3, wherein the second set of features comprises astigmatism and a corneal dystrophy.
5. The device as claimed in claim 1, wherein the device comprises an infrared camera to capture a thermal image of the eye.
6. The device as claimed in claim 1, wherein the pre-defined distance is about 5 to 10 centimetres.
7. The device as claimed in claim 1, wherein the device comprises a second memory to store the captured one or more images for analysis.
8. The device as claimed in claim 1, wherein the light source comprises any or a combination of a white light source, a red-light source and a green light source.

Documents

Application Documents

# Name Date
1 201811045181-STATEMENT OF UNDERTAKING (FORM 3) [29-11-2018(online)].pdf 2018-11-29
2 201811045181-FORM FOR STARTUP [29-11-2018(online)].pdf 2018-11-29
3 201811045181-FORM FOR SMALL ENTITY(FORM-28) [29-11-2018(online)].pdf 2018-11-29
4 201811045181-FORM 1 [29-11-2018(online)].pdf 2018-11-29
5 201811045181-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-11-2018(online)].pdf 2018-11-29
6 201811045181-EVIDENCE FOR REGISTRATION UNDER SSI [29-11-2018(online)].pdf 2018-11-29
7 201811045181-DRAWINGS [29-11-2018(online)].pdf 2018-11-29
8 201811045181-DECLARATION OF INVENTORSHIP (FORM 5) [29-11-2018(online)].pdf 2018-11-29
9 201811045181-COMPLETE SPECIFICATION [29-11-2018(online)].pdf 2018-11-29
10 201811045181-Power of Attorney-131218.pdf 2018-12-15
11 201811045181-OTHERS-131218.pdf 2018-12-15
12 201811045181-Correspondence-131218.pdf 2018-12-15
13 201811045181-Proof of Right (MANDATORY) [27-12-2018(online)].pdf 2018-12-27
14 201811045181-FORM-26 [27-12-2018(online)].pdf 2018-12-27
15 abstract.jpg 2018-12-28
16 201811045181-Covering Letter(Mandatory) [16-09-2019(online)].pdf 2019-09-16
17 201811045181-Annexure (Optional) [16-09-2019(online)].pdf 2019-09-16
18 201811045181-DRAWING [28-11-2019(online)].pdf 2019-11-28
19 201811045181-COMPLETE SPECIFICATION [28-11-2019(online)].pdf 2019-11-28
20 201811045181-FORM 18 [11-05-2022(online)].pdf 2022-05-11
21 201811045181-FER.pdf 2022-09-21
22 201811045181-FER_SER_REPLY [21-03-2023(online)].pdf 2023-03-21
23 201811045181-CORRESPONDENCE [21-03-2023(online)].pdf 2023-03-21
24 201811045181-CLAIMS [21-03-2023(online)].pdf 2023-03-21
25 201811045181-PatentCertificate24-01-2024.pdf 2024-01-24
26 201811045181-IntimationOfGrant24-01-2024.pdf 2024-01-24

Search Strategy

1 201811045181E_20-09-2022.pdf

ERegister / Renewals