Abstract: The present disclosure provides an eye-ware 100 that includes a wearable frame 302 adapted to be worn by a user, two frame inserts 304-1 and 304-2 attached to the frame 302, sensors 104 positioned at a first pre-determined position on the frame 302, and irradiating sources 102, where the irradiating sources 102 and the sensors 104 are positioned such that the irradiation emitted by the irradiating sources 102 passes through at least one optical axis associated with at least a part of an eye of the user before being sensed by the sensors 104. A processing unit 106 of the eye-ware 100 generates a set of data packets corresponding to the irradiation sensed by the sensors 104, and the set of data packets is transmitted to mobile computing devices 108, enabling the eye-ware 100 to assist in ocular communication.
The present disclosure relates to systems and methods for communication.
More particularly, the present disclosure relates to an eye-ware to assist in ocular communication.
BACKGROUND
[0002] Background description includes information that may be useful in
understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] There are numerous health-related issues in which an individual is not able to
voluntarily communicate using normal speech. For example, the individual may have a temporary or permanent health-related issue that renders him or her unable to speak, or the individual may be paralyzed, intubated, and the like. In serious conditions, the individual is not even able to communicate by making gestures with the hands, neck, and other body parts.
[0004] According to a survey, nearly 1 in every 5000 people is paralyzed.
Individuals affected by quadriplegia, paraplegia, and other physical disabilities may have impaired speech and, moreover, may be unable to use alternative methods of communication such as sign language, handwriting, or typing. Such individuals therefore require constant monitoring and support. However, monitoring and supporting these individuals is a demanding and complex task, as it is very difficult to communicate with them, make sense of their needs, and attend to their small as well as vital requirements such as thirst, hunger, and other related requirements.
[0005] Complications in the process of communicating with a paralyzed individual
add to the uneasiness of the paralyzed individual, which may increase his or her suffering, compound existing problems, worsen his or her condition, and may even lead to the death of the paralyzed individual.
[0006] There is, therefore, a need in the art to provide an efficient, improved, and
cost-effective apparatus to overcome the above-mentioned problems, and provide a means to assist the paralyzed person in communicating in a better way.
OBJECTS OF THE PRESENT DISCLOSURE
[0007] Some of the objects of the present disclosure, which at least one embodiment
herein satisfies are as listed herein below.
[0008] It is an object of the present disclosure to provide an eye-ware to monitor
ophthalmic attributes of at least an eye of a user.
[0009] It is another object of the present disclosure to provide an eye-ware to assist a
user in communicating ocularly.
[0010] It is another object of the present disclosure to provide an eye-ware that
distinguishes between voluntary eye-gestures and involuntary eye-gestures.
[0011] It is another object of the present disclosure to provide an eye-ware to transmit
one or more messages corresponding to the eye-blinks made by the user to one or more
computing devices pertaining to one or more persons related to the user.
[0012] It is another object of the present disclosure to provide an eye-ware to assist a
doctor in monitoring eye-gestures, eye-blinks, and related health of the user.
[0013] It is another object of the present disclosure to provide an eye-ware that is
efficient, reliable, user-friendly, and cost-effective.
SUMMARY
[0014] The present disclosure relates to systems and methods for communication.
More particularly, the present disclosure relates to an eye-ware to assist in ocular communication.
[0015] An aspect of the present disclosure pertains to an eye-ware to assist a user in
communicating ocularly, the eye-ware comprising one or more irradiating sources to emit an irradiation, having pre-configured irradiation attributes, on at least a part of an eye of the user; a wearable frame adapted to be worn by the user, said frame comprising: one or more sensors positioned at a first pre-defined position on the frame, wherein the one or more sensors may be adapted to sense an irradiation reflected by the at least a part of the eye of the user, and wherein a first set of signals may be generated based on the sensed irradiation received after the reflection; and a processing unit operatively coupled to the one or more sensors and the one or more irradiating sources, the processing unit comprising one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors and configured to: extract one or more irradiation attributes from the generated first set of signals; determine a variation in at least one of the one or more irradiation attributes by comparing the one or more extracted irradiation attributes of the
sensed irradiation received after the reflection with the pre-configured irradiation attributes,
and generate one or more set of data packets based on the determined variation, wherein the
generated one or more set of data packets may pertain to any or a combination of an image,
an audio, and a text message, and may be transmitted to one or more mobile computing
devices, communicatively coupled to the processing unit, to assist the user in ocular
communication.
[0016] In an aspect, any or a combination of the one or more irradiating sources and
the one or more sensors may be positioned such that the irradiation emitted through the one
or more irradiating sources passes through at least one optical axis associated with the at least
a part of the eye of the user before being sensed by the one or more sensors.
[0017] In an aspect, the one or more irradiating sources may be positioned at a second
pre-defined position on the frame of the eye-ware.
[0018] In an aspect, the one or more set of data packets may be generated by the
processing unit when the detected variation is within one or more ranges of pre-defined
thresholds, wherein the one or more ranges of pre-defined thresholds may pertain to
voluntary movements associated with ophthalmic attributes of the at least a part of the eye of
the user, so that the eye-ware may be able to obviate involuntary movements associated with
ophthalmic attributes.
[0019] In an aspect, the ophthalmic attributes may comprise any or a combination of
an orientation, a position, an eyelid closure, an eye blink, a pupil radius, and cornea of the at
least an eye of the user.
[0020] In an aspect, any or a combination of the one or more ranges of thresholds and
the pre-configured irradiation attributes may be configured through the one or more mobile
computing devices.
[0021] In an aspect, the one or more irradiating sources may comprise any or a
combination of infrared rays (IR) source, LED source, and white light source, and wherein the
one or more irradiation attributes may be any or a combination of intensity, colour,
frequency, and wavelength of the one or more irradiating sources.
[0022] In an aspect, the one or more sensors may be any or a combination of
illumination sensor, optical triangular position sensor, ultrasonic sensor, irradiation sensor,
and colour sensor.
[0023] In an aspect, the one or more set of data packets may be indicative of any or a
combination of hunger, thirstiness, and need to go to washroom.
[0024] In an aspect, the eye-ware may comprise at least two frame inserts such that
any or a combination of transmission and reflection properties of the at least two frame inserts may be identical, wherein the frame may be configured in such a way that the at least two frame inserts are exchangeably attachable to the eye-ware.
[0025] In an aspect, the one or more set of data packets may be transmitted to the one
or more mobile computing devices of any or a combination of a doctor, and a person related to the user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0026] The accompanying drawings are included to provide a further understanding
of the present disclosure, and are incorporated in and constitute a part of this specification.
The drawings illustrate exemplary embodiments of the present disclosure and, together with
the description, serve to explain the principles of the present disclosure.
[0027] The diagrams are for illustration only, which thus is not a limitation of the
present disclosure, and wherein:
[0028] FIG. 1 illustrates exemplary block diagram of the proposed eye-ware for
providing assistance in ocular communication to illustrate its overall working in accordance
with an embodiment of the present disclosure.
[0029] FIG. 2 illustrates exemplary functional components of a processing unit
associated with the eye-ware, in accordance with an exemplary embodiment of the present
disclosure.
[0030] FIG. 3 illustrates an exemplary structure of an eye-ware, to illustrate its overall
working in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0031] The following is a detailed description of embodiments of the disclosure
depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0032] Various terms as used herein are shown below. To the extent a term used in a
claim is not defined below, it should be given the broadest definition persons in the pertinent
art have given that term as reflected in printed publications and issued patents at the time of filing.
[0033] In some embodiments, the numerical parameters set forth in the written
description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.
[0034] As used in the description herein and throughout the claims that follow, the
meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.
[0035] The recitation of ranges of values herein is merely intended to serve as a
shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. "such as") provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[0036] Groupings of alternative elements or embodiments of the invention disclosed
herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.
[0037] The present disclosure relates to systems and methods for communication.
More particularly, the present disclosure relates to an eye-ware to assist in ocular communication.
[0038] According to an aspect, the present disclosure pertains to an eye-ware to assist
a user in communicating ocularly, the eye-ware including: one or more irradiating sources to
emit an irradiation, having pre-configured irradiation attributes, on at least a part of an eye of
the user; a wearable frame adapted to be worn by the user, said frame including: one or more
sensors positioned at a first pre-defined position on the frame, wherein the one or more
sensors can be adapted to sense an irradiation reflected by the at least a part of the eye of the
user, and wherein a first set of signals can be generated based on the sensed irradiation
received after the reflection; and a processing unit operatively coupled to the one or more
sensors and the one or more irradiating sources, the processing unit including one or more
processors coupled with a memory, the memory storing instructions executable by the one or
more processors and configured to: extract one or more irradiation attributes from the
generated first set of signals; determine a variation in at least one of the one or more
irradiation attributes by comparing the one or more extracted irradiation attributes of the
sensed irradiation received after the reflection with the pre-configured irradiation attributes,
and generate one or more set of data packets based on the determined variation, wherein the
generated one or more set of data packets can pertain to any or a combination of an image, an
audio, and a text message, and can be transmitted to one or more mobile computing devices,
communicatively coupled to the processing unit, to assist the user in ocular communication.
[0039] In an embodiment, any or a combination of the one or more irradiating sources
and the one or more sensors can be positioned such that the irradiation emitted through the
one or more irradiating sources passes through at least one optical axis associated with the at
least a part of the eye of the user before being sensed by the one or more sensors.
[0040] In an embodiment, the one or more irradiating sources can be positioned at a
second pre-defined position on the frame of the eye-ware.
[0041] In an embodiment, the one or more set of data packets can be generated by the
processing unit when the detected variation is within one or more ranges of pre-defined thresholds, wherein the one or more ranges of pre-defined thresholds can pertain to voluntary movements associated with ophthalmic attributes of the at least a part of the eye of the user, so that the eye-ware can be able to obviate involuntary movements associated with ophthalmic attributes.
[0042] In an embodiment, the ophthalmic attributes can include any or a combination
of an orientation, a position, an eyelid closure, an eye blink, a pupil radius, and cornea of the
at least an eye of the user.
[0043] In an embodiment, any or a combination of the one or more ranges of
thresholds and the pre-configured irradiation attributes can be configured through the one or
more mobile computing devices.
[0044] In an embodiment, the one or more irradiating sources can include any or a
combination of infrared rays (IR) source, LED source, and white light source, and wherein the
one or more irradiation attributes can be any or a combination of intensity, colour, frequency,
and wavelength of the one or more irradiating sources.
[0045] In an embodiment, the one or more sensors can be any or a combination of
illumination sensor, optical triangular position sensor, ultrasonic sensor, irradiation sensor,
and colour sensor.
[0046] In an embodiment, the one or more set of data packets may be indicative of
any or a combination of hunger, thirstiness, need to go to washroom, and other requirements
of the user.
[0047] In an embodiment, the eye-ware can include at least two frame inserts such
that any or a combination of transmission and reflection properties of the at least two frame
inserts can be identical, wherein the frame can be configured in such a way that the at least
two frame inserts are exchangeably attachable to the eye-ware.
[0048] In an embodiment, the one or more set of data packets can be transmitted to
the one or more mobile computing devices of any or a combination of a doctor, and a person
related to the user.
[0049] FIG. 1 illustrates exemplary block diagram of the proposed eye-ware for
providing assistance in ocular communication to illustrate its overall working in accordance
with an embodiment of the present disclosure.
[0050] As illustrated, in an aspect, the block diagram of the proposed eye-ware 100 can
include one or more irradiating sources 102 (also referred to as irradiating sources 102,
herein). The irradiating sources 102 can emit an irradiation having pre-configured irradiation
attributes on at least a part of an eye of a user. The irradiating sources 102 can be any or a
combination of infrared rays (IR) source, LED source, white light source, and the likes. The
pre-configured irradiation attributes of the irradiating sources 102 can be any or a
combination of intensity, colour, frequency, wavelength, and the likes.
[0051] In an embodiment, the block diagram of the proposed eye-ware 100 can
include one or more sensors 104 (also referred to as sensors 104, herein) positioned on a
wearable frame adapted to be worn by the user. The sensors 104 can be positioned at a first
pre-defined position on the frame, such that the sensors 104 can sense an irradiation reflected
by the at least a part of the eye of the user. A first set of signals can be generated based on the
sensed irradiation received after the reflection from the at least a part of the eye of the user.
The sensors 104 can be any or a combination of illumination sensor, optical triangular
position sensor, ultrasonic sensor, irradiation sensor, colour sensor, and the likes. In an
embodiment, any or a combination of the irradiating sources 102 and the sensors 104 can be
positioned such that the irradiation emitted through the irradiating sources 102 can pass
through at least one optical axis associated with the at least a part of the eye of the user before
being sensed by the sensors 104. In an illustrative embodiment, the irradiating sources 102
can be positioned at a second pre-defined position on the frame of the eye-ware 100.
[0052] In an embodiment, the block diagram of the proposed eye-ware 100 can
include a processing unit 106. The processing unit 106 can be operatively coupled to the sensors 104 and the irradiating sources 102, and the processing unit 106 can include one or more processors coupled with a memory, where the memory can store instructions executable by the one or more processors. In an embodiment, the processing unit 106 can extract one or more irradiation attributes from the generated first set of signals. In another embodiment, the processing unit 106 can determine a variation in at least one of the one or more irradiation attributes by comparing the one or more extracted irradiation attributes of the sensed irradiation received after the reflection with the pre-configured irradiation attributes. In yet another embodiment, the processing unit 106 can generate one or more set of data packets based on the determined variation, such that the generated one or more set of data packets pertains to any or a combination of an image, an audio, a text message, and the likes. In an illustrative implementation, the one or more set of data packets can be generated by the processing unit 106 when the determined variation is within one or more ranges of pre-defined thresholds, wherein the one or more ranges of pre-defined thresholds pertain to voluntary movements associated with ophthalmic attributes of the at least a part of the eye of the user, so that the eye-ware 100 can obviate involuntary movements associated with ophthalmic attributes. The ophthalmic attributes can include any or a combination of an orientation, a position, an eyelid closure, an eye blink, a pupil radius, cornea of the at least an eye of the user, and the likes.
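By way of a non-limiting illustration only, the extract/compare/generate flow of the processing unit 106 described above can be sketched in Python. All function names, attribute names, and numeric values below are hypothetical examples chosen for illustration and form no part of the disclosure.

```python
# Illustrative sketch of the processing unit's extract/compare/generate flow.
# All names and threshold values are hypothetical examples.

PRECONFIGURED = {"intensity": 100.0}  # pre-configured irradiation attribute

# Ranges of pre-defined thresholds: a variation falling inside any of these
# ranges is treated as a voluntary eye-gesture.
VOLUNTARY_RANGES = {"intensity": [(20.0, 90.0)]}

def extract_attributes(signals):
    """Extract an irradiation attribute from the first set of signals
    (here, simply the mean sensed intensity)."""
    return {"intensity": sum(signals) / len(signals)}

def variation(extracted):
    """Absolute variation of each extracted attribute vs. its preset."""
    return {k: abs(v - PRECONFIGURED[k]) for k, v in extracted.items()}

def generate_packets(var):
    """Generate a data packet only when the variation is within a
    voluntary-movement threshold range (involuntary gestures obviated)."""
    packets = []
    for attr, delta in var.items():
        if any(lo <= delta <= hi for lo, hi in VOLUNTARY_RANGES.get(attr, [])):
            packets.append({"attribute": attr, "variation": delta,
                            "message": "voluntary gesture detected"})
    return packets
```

In this sketch, an averaged sensed intensity of 70 against a pre-configured intensity of 100 yields a variation of 30, which falls inside the illustrative voluntary range and therefore produces a data packet, while a variation of 5 produces none.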
[0053] In an embodiment, the block diagram of the proposed eye-ware 100 can
include one or more mobile computing devices 108 (also referred to as mobile computing devices 108, herein), which can be communicatively coupled to the processing unit 106. The one or more set of data packets generated by the processing unit 106 can be transmitted to the mobile computing devices 108 of any or a combination of a doctor, and a person related to the user, to assist the user in ocular communication. The one or more set of data packets can be indicative of any or a combination of hunger, thirstiness, need to go to washroom, and other requirements of the user. The mobile computing devices 108 can enable configuration of any or a combination of the one or more ranges of thresholds and the pre-configured irradiation attributes associated with the irradiating sources 102.
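The transmission of generated data packets to the mobile computing devices 108 can likewise be sketched as follows; the `MobileDevice` class and its methods are hypothetical stand-ins for real coupled devices, not part of the disclosure.

```python
# Hypothetical sketch of broadcasting generated data packets to the
# communicatively coupled mobile computing devices (e.g. of a doctor
# and a person related to the user).

class MobileDevice:
    """Minimal stand-in for a coupled mobile computing device."""
    def __init__(self, owner):
        self.owner = owner
        self.inbox = []

    def receive(self, packet):
        self.inbox.append(packet)

def transmit(packet, devices):
    """Send one data packet (image, audio, or text payload) to every
    coupled device and return the number of deliveries."""
    for device in devices:
        device.receive(packet)
    return len(devices)
```

A packet indicative of, say, thirst would thus reach both the doctor's device and the relative's device in a single broadcast.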
[0054] In an embodiment, the frame of the eye-ware 100 can include at least two
frame inserts such that any or a combination of transmission and reflection properties of the
at least two frame inserts are identical, and the frame can be configured in such a way that the
at least two frame inserts are exchangeably attachable to the eye-ware 100.
[0055] FIG. 2 illustrates exemplary functional components of a processing unit
associated with the eye-ware, in accordance with an exemplary embodiment of the present disclosure.
[0056] As illustrated, in an embodiment, the processing unit 106 can include one or
more processor(s) 202, the one or more processor(s) 202 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 can be configured to fetch and execute computer-readable instructions stored in a memory 204 of the processing unit 106. The memory 204 can store one or more computer-readable instructions or routines, which can be fetched and executed to create or share the data units over a network service. The memory 204 can be any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0057] The processing unit 106 can include an interface(s) 206. The interface(s) 206
can include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 can facilitate communication of the processing unit 106 with various devices coupled to the processing unit 106, such as an input unit and an output unit. The interface(s) 206 can also provide a communication pathway for one or more components of the processing unit 106 and the
mobile computing devices 108. Examples of such components include, but are not limited to, processing engine(s) 208 and database 210.
[0058] The processing engine(s) 208 can be implemented as a combination of
hardware and programming (for example, programmable instructions) to implement one or
more functionalities of the processing engine(s) 208. In examples described herein, such
combinations of hardware and programming may be implemented in several different ways.
For example, the programming for the processing engine(s) 208 can be processor executable
instructions stored on a non-transitory machine-readable storage medium and the hardware
for the processing engine(s) 208 can include a processing resource (for example, one or more
processors), to execute such instructions. In the present examples, the machine-readable
storage medium may store instructions that, when executed by the processing resource,
implement the processing engine(s) 208. In such examples, the processing engine(s) 208 can
include the machine-readable storage medium storing the instructions and the processing
resource to execute the instructions, or the machine-readable storage medium may be
separate but accessible to the processing unit 106 and the processing resource. In other
examples, the processing engine(s) 208 can be implemented by electronic circuitry.
[0059] In an embodiment, the database 210 can include data that is either stored or
generated as a result of functionalities implemented by any of the components of the processing engine(s) 208.
[0060] In an embodiment, the processing engine(s) 208 can include an extraction
engine 212, which can enable extraction of one or more irradiation attributes from a first set of signals. The first set of signals can be transmitted from sensors 104 to the processing unit 106 corresponding to an irradiation received by the sensors 104 after the reflection from the at least a part of an eye of a user, who is wearing the eye-ware 100, and where the irradiation can be emitted by irradiating sources 102. The irradiating sources 102, having pre-configured attributes, can be any or a combination of infrared rays (IR) source, LED source, white light source, and the likes, and the extracted one or more irradiation attributes can be any or a combination of intensity, colour, frequency, wavelength, and the likes.
[0061] In an embodiment, the processing engine(s) 208 can include a variation
determining engine 214 that enables determination of a variation in at least one of the extracted one or more irradiation attributes. The variation in the at least one of the one or more irradiation attributes can be determined by comparing the one or more extracted irradiation attributes of the sensed irradiation received after the reflection with the pre-configured irradiation attributes. In an embodiment, the pre-configured attributes can be
stored in the database 210 of the processing unit 106. In another embodiment, the pre-configured attributes can be retrieved from a third source through one or more processing engine(s) 208. In an embodiment, one or more set of data packets can be generated based on the determined variation, such that the generated one or more set of data packets pertains to any or a combination of an image, an audio, a text message, and the likes, such that the one or more set of data packets can be indicative of any or a combination of hunger, thirstiness, need to go to washroom, and other requirements of the user. The generated one or more set of data packets can be transmitted to one or more mobile computing devices 108, which are communicatively coupled to the processing unit 106, where the one or more mobile computing devices 108 can be of any or a combination of a doctor, and a person related to the user, to assist the user in ocular communication. In an illustrative embodiment, the one or more set of data packets can be generated by the processing unit 106 when the determined variation is within one or more ranges of pre-defined thresholds, such that the one or more ranges of pre-defined thresholds pertain to voluntary movements associated with ophthalmic attributes of the at least a part of the eye of the user, so that the eye-ware can obviate involuntary movements associated with ophthalmic attributes. The ophthalmic attributes can be any or a combination of an orientation, a position, an eyelid closure, an eye blink, a pupil radius, cornea of the at least an eye of the user, and the likes.
[0062] In an illustrative embodiment, the variation determined in the one or more
irradiation attributes can be associated with eye-blinking of at least one eye of the user. In an embodiment, when the at least one eye of the user is open, the irradiation emitted by the irradiating sources 102 cannot be reflected to the sensors 104, or even if the irradiation is reflected to the sensors 104, the reflected irradiation can be negligibly small. In an embodiment, when the user closes the eye-lid of the at least one eye, most of the irradiation emitted by the irradiating sources 102 can be reflected to the sensors 104 from the at least one eye of the user.
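As a non-limiting illustration, the eyelid-state behaviour described above, where an open eye reflects negligible irradiation to the sensors 104 while a closed eyelid reflects most of it, can be sketched as follows. The 0-to-1 intensity scale and the 0.5 cut-off are purely illustrative assumptions.

```python
# Sketch of eyelid-state inference from reflected-irradiation intensity.
# Assumption: sensed intensity is expressed as a fraction of the emitted
# intensity, and a reflected fraction >= 0.5 indicates a closed eyelid.

CLOSED_THRESHOLD = 0.5  # illustrative reflected-fraction cut-off

def eyelid_closed(emitted, sensed):
    """Return True when the reflected fraction suggests a closed eyelid."""
    if emitted <= 0:
        raise ValueError("emitted intensity must be positive")
    return (sensed / emitted) >= CLOSED_THRESHOLD

def blink_spans(samples, emitted=1.0):
    """Convert a time-series of sensed intensities into a list of
    (start_index, end_index) spans during which the eyelid was closed."""
    spans, start = [], None
    for i, sensed in enumerate(samples):
        if eyelid_closed(emitted, sensed):
            if start is None:
                start = i
        elif start is not None:
            spans.append((start, i - 1))
            start = None
    if start is not None:  # closure still ongoing at end of series
        spans.append((start, len(samples) - 1))
    return spans
```

For a sampled series such as `[0.0, 0.9, 0.8, 0.1, 0.0]`, the single span of high reflection corresponds to one eyelid closure.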
[0063] In an illustrative implementation, the one or more ranges of pre-defined
thresholds can be based on a time-duration associated with the eye-blink corresponding to the at least one eye of the user. In an embodiment, as the normal-rate of the eye-blink corresponding to the at least one eye is 2 seconds to 8 seconds, the time-duration of less than 2 seconds and more than 8 seconds can be set as the one or more ranges of pre-defined thresholds, and the one or more set of data packets can be generated by the processing unit 106 when the determined variation associated with the eye-blink corresponding to the at least one eye of the user is within the one or more ranges of pre-defined thresholds, where the one or more set of data
packets can be indicative of any or a combination of hunger, thirstiness, need to go to washroom, and other requirements of the user. So, voluntary eye-blink can be distinguished from involuntary eye-blink by the eye-ware 100, and the one or more set of data packets can be generated corresponding to the voluntary eye-blink, whereas the involuntary eye-blink can be obviated by the eye-ware 100.
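The duration-based distinction described above can be sketched as follows: taking the text's 2-to-8-second window as the normal range, a closure duration falling outside that window is treated as voluntary, and only voluntary closures yield data packets. The function names are illustrative only.

```python
# Sketch of the duration-based voluntary/involuntary distinction.
# Per the description, durations inside the normal 2-8 s window are
# treated as involuntary and obviated; durations outside it are voluntary.

NORMAL_MIN_S = 2.0
NORMAL_MAX_S = 8.0

def is_voluntary(duration_s):
    """A closure duration outside the normal window counts as voluntary."""
    return duration_s < NORMAL_MIN_S or duration_s > NORMAL_MAX_S

def packets_for_blinks(durations_s):
    """Generate one packet per voluntary blink; involuntary blinks
    produce nothing."""
    return [{"duration_s": d, "message": "voluntary blink"}
            for d in durations_s if is_voluntary(d)]
```

Thus a 5-second closure (inside the normal window) is obviated, while 1-second and 9-second closures each produce a packet.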
[0064] In another illustrative embodiment, the detected variation can be associated
with the number of the eye-blinks corresponding to the at least one eye of the user in a pre-defined time-period, and the one or more data packets are generated based on the number of the eye-blinks in the pre-defined time-period. For example, in a case, when the user blinks the at least one eye one time in a pre-defined time-period, such that the detected variation associated with the eye-blink corresponding to the at least one eye of the user is within one or more ranges of pre-defined thresholds, a first set of data packets can be generated, such that the first set of data packets can be indicative of water being demanded by the user. In another case, when the user blinks the at least one eye two times in a pre-defined time-period, such that the detected variation associated with the eye-blink corresponding to the at least one eye of the user is within one or more ranges of pre-defined thresholds, a second set of data packets can be generated, such that the second set of data packets can be indicative of any or a combination of hunger, and food being demanded by the user. In yet another case, when the user blinks the at least one eye three times in a pre-defined time-period, such that the detected variation associated with the eye-blink corresponding to the at least one eye of the user is within one or more ranges of pre-defined thresholds, a third set of data packets can be generated, such that the third set of data packets can be indicative of a requirement of the user to use the washroom.
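The blink-count mapping of the preceding example can be sketched as a simple lookup; the exact messages and the dictionary shown are illustrative only.

```python
# Sketch of the blink-count mapping: within a pre-defined time-period,
# one voluntary blink requests water, two request food, and three signal
# a need to use the washroom. Messages are illustrative placeholders.

BLINK_MESSAGES = {
    1: "water requested",
    2: "hunger / food requested",
    3: "needs to use washroom",
}

def packet_for_count(blink_count):
    """Return the data packet for a blink count, or None if unmapped."""
    message = BLINK_MESSAGES.get(blink_count)
    if message is None:
        return None
    return {"blinks": blink_count, "type": "text", "message": message}
```

An unmapped count (say, five blinks) simply produces no packet, so stray blink sequences are ignored.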
[0065] In an embodiment, the processing engine(s) 208 can include a configuration
engine 216 that can enable configuration of any or a combination of the one or more ranges of thresholds and the pre-configured irradiation attributes through the one or more mobile computing devices 108. Such configuration can be performed to make the eye-ware 100 more suitable and comfortable for the user, and can be done through an input module of the one or more mobile computing devices 108, such as a keyboard, a mouse, a microphone, and the like.
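A configuration object of the kind the configuration engine 216 might manage could look like the sketch below. The field names, defaults, and the `update` method are illustrative assumptions; the disclosure only states that threshold ranges and irradiation attributes are configurable from the mobile device.

```python
# Hypothetical settings container for the configurable threshold ranges and
# irradiation attributes; field names and defaults are assumptions.
from dataclasses import dataclass, field

@dataclass
class EyewearConfig:
    # (low, high) ranges for the detected-variation thresholds
    threshold_ranges: list = field(default_factory=lambda: [(0.4, 1.0)])
    # pre-configured irradiation attributes
    intensity: float = 0.5      # normalised emitter intensity
    wavelength_nm: int = 850    # a typical near-infrared wavelength

    def update(self, **changes):
        """Apply settings received from the mobile computing device 108."""
        for key, value in changes.items():
            if not hasattr(self, key):
                raise KeyError(f"unknown setting: {key}")
            setattr(self, key, value)
```

Keeping the settings in one validated object means a malformed update from the mobile device fails loudly instead of silently corrupting the thresholds.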
[0066] Those skilled in the art would appreciate that, though the working of the eye-ware 100 is explained with an illustration of eye-blinking, it can operate in a similar manner for the other ophthalmic attributes of the at least a part of the eye of the user, without deviating from the scope of the present disclosure.
[0067] FIG. 3 illustrates an exemplary structure of an eye-ware, to illustrate its overall
working in accordance with an embodiment of the present disclosure.
[0068] As illustrated in FIG. 3, in an aspect, the proposed eye-ware 100 can include a wearable frame 302, which can be adapted to be worn by a user. The eye-ware 100 can include at least two frame inserts 304-1 and 304-2, such that any or a combination of transmission and reflection properties of the at least two frame inserts 304-1 and 304-2 are identical. The frame 302 can be configured in such a way that the at least two frame inserts 304-1 and 304-2 can be exchangeably attachable to the eye-ware 100. In an embodiment, each of the at least two frame inserts 304-1 and 304-2 can block undesired light from the space through which the optical path extends. In another embodiment, each of the at least two frame inserts 304-1 and 304-2 can block unwanted reflections and stray light. In an embodiment, the frame 302 can be configured from materials that are strong, reliable, corrosion-resistant, and light in weight; the materials used for the frame 302 can include any or a combination of polyamide, nylon, polycarbonate, carbon nanotubes, aluminium, Optyl, and the like.
[0069] In an embodiment, the proposed eye-ware 100 can include irradiating sources
102 (not shown) to emit an irradiation having pre-configured irradiation attributes on at least a part of an eye of a user. The irradiating sources 102 can be any or a combination of an infrared (IR) source, an LED source, a white light source, and the like. The pre-configured irradiation attributes of the irradiating sources 102 can be any or a combination of intensity, colour, frequency, wavelength, and the like.
[0070] In an embodiment, the proposed eye-ware 100 can include sensors 104-1 and
104-2, which can be positioned at a first pre-defined position at each of the at least two frame inserts 304-1 and 304-2 on the frame 302, such that the sensors 104 can sense an irradiation reflected by the at least a part of the eye of the user. A first set of signals can be generated based on the sensed irradiation received after the reflection from the at least a part of the eye of the user. The sensors 104 can be any or a combination of an illumination sensor, an optical triangular position sensor, an ultrasonic sensor, an irradiation sensor, a colour sensor, and the like. In an embodiment, any or a combination of the irradiating sources 102 and the sensors 104 can be positioned such that the irradiation emitted through the irradiating sources 102 can pass through at least one optical axis associated with the at least a part of the eye of the user
before being sensed by the sensors 104. In an illustrative embodiment, the irradiating sources
102 can be positioned at a second pre-defined position on the frame 302 of the eye-ware 100.
[0071] In an embodiment, the proposed eye-ware 100 can include a processing unit
106. The processing unit 106 can be operatively coupled to the sensors 104 and the irradiating sources 102. In an embodiment, the processing unit 106 can extract one or more irradiation attributes from the generated first set of signals. In another embodiment, the processing unit 106 can determine a variation in at least one of the one or more irradiation attributes by comparing the one or more extracted irradiation attributes of the sensed irradiation received after the reflection with the pre-configured irradiation attributes. In yet another embodiment, the processing unit 106 can generate one or more set of data packets based on the determined variation, such that the generated one or more set of data packets pertains to any or a combination of an image, an audio, a text message, and the like. In an illustrative implementation, the one or more set of data packets can be generated by the processing unit 106 when the determined variation is within one or more ranges of pre-defined thresholds, wherein the one or more ranges of pre-defined thresholds pertain to voluntary movements associated with ophthalmic attributes of the at least a part of the eye of the user, so that the eye-ware 100 can obviate involuntary movements associated with ophthalmic attributes. The ophthalmic attributes can include any or a combination of an orientation, a position, an eyelid closure, an eye blink, a pupil radius, a cornea of the at least an eye of the user, and the like.
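The extract/compare/generate flow of the processing unit 106 can be sketched as three small steps. This is a minimal sketch under stated assumptions: the attribute set (a single mean intensity), the packet format, and all function names are illustrative, not taken from the disclosure.

```python
# Minimal sketch of the processing-unit flow: extract attributes from the
# sensed signal, determine the variation against the pre-configured
# attributes, and emit a packet only for voluntary-range variations.
# Names and the packet payload format are assumptions.

def extract_attributes(signal):
    """Reduce a raw sensor signal (list of samples) to irradiation attributes."""
    return {"intensity": sum(signal) / len(signal)}

def determine_variation(extracted, preconfigured):
    """Per-attribute absolute difference between sensed and pre-configured."""
    return {k: abs(extracted[k] - preconfigured[k]) for k in extracted}

def generate_packet(variation, threshold_ranges):
    """Return a text-message packet if any variation lies inside a
    voluntary-movement threshold range; variations outside all ranges
    (involuntary movements) produce no packet."""
    for attr, delta in variation.items():
        if any(lo <= delta <= hi for lo, hi in threshold_ranges):
            return {"type": "text", "attribute": attr, "variation": delta}
    return None
```

The design choice worth noting is that involuntary movements are filtered simply by returning `None`, so downstream transmission logic never sees them.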
[0072] In an embodiment, the proposed eye-ware 100 can include a mobile
computing device 108, which can be communicatively coupled to the processing unit 106.
The one or more set of data packets generated by the processing unit 106 can be transmitted
to the mobile computing devices 108 of any or a combination of a doctor and a person
related to the user, to assist the user in ocular communication. The one or more set of data
packets can be indicative of any or a combination of hunger, thirstiness, need to go to
washroom, and other requirements of the user. The mobile computing device 108 can enable
configuration of any or a combination of the one or more ranges of thresholds and the pre-
configured irradiation attributes associated with the irradiating sources 102.
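Delivery of a generated packet to the registered devices (e.g. of a doctor and a relative) can be abstracted as a simple fan-out, since the disclosure does not fix a transport protocol. The function name and the callable-based transport below are assumptions for illustration only.

```python
# Hedged sketch: fan out one generated data packet to every registered
# mobile computing device. Each device is represented by a send callable,
# since the disclosure leaves the actual transport unspecified.

def transmit(packet, devices):
    """Send one data packet to every registered device.

    `devices` maps a device name to a callable that delivers the packet.
    Returns the list of device names the packet was handed to."""
    log = []
    for name, send in devices.items():
        send(packet)
        log.append(name)
    return log
```

In a real system each callable would wrap a network client (e.g. a push notification or socket send); representing it as a callable keeps the sketch transport-agnostic.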
[0073] Embodiments of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in an implementation combining software and hardware that may all generally be referred to herein as a "circuit," "engine," "component," or "system." Furthermore, aspects of the present
disclosure may take the form of a computer program product comprising one or more
computer readable media having computer readable program code embodied thereon.
[0074] Thus, it will be appreciated by those of ordinary skill in the art that the
diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named manufacturer.
[0075] As used herein, and unless the context dictates otherwise, the term "coupled
to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document terms "coupled to" and "coupled with" are also used euphemistically to mean "communicatively coupled with" over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary device.
[0076] It should be apparent to those skilled in the art that many more modifications
besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C ... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
[0077] While the foregoing describes various embodiments of the invention, other
and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE INVENTION
[0078] The present disclosure provides an eye-ware to monitor ophthalmic attributes
of at least an eye of a user.
[0079] The present disclosure provides an eye-ware to assist a user in communicating
ocularly.
[0080] The present disclosure provides an eye-ware that distinguishes between
voluntary eye-gestures and involuntary eye-gestures.
[0081] The present disclosure provides an eye-ware to transmit one or more messages
corresponding to the eye-blinks made by the user to one or more computing devices pertaining to one or more persons related to the user.
[0082] The present disclosure provides an eye-ware to assist a doctor in monitoring
eye-gestures, eye-blinks, and related health of the user.
[0083] The present disclosure provides an eye-ware that is efficient, reliable, user-
friendly, and cost-effective.
We Claim:
1. An eye-ware to assist a user in communicating ocularly, the eye-ware comprising:
one or more irradiating sources to emit an irradiation, having pre-configured
irradiation attributes, on at least a part of an eye of the user;
a wearable frame adapted to be worn by the user, said frame comprising:
one or more sensors positioned at a first pre-defined position on the frame,
wherein the one or more sensors are adapted to sense an irradiation reflected by the at
least a part of the eye of the user, and wherein a first set of signals is generated based
on the sensed irradiation received after the reflection; and
a processing unit operatively coupled to the one or more sensors and the one
or more irradiating sources, the processing unit comprising one or more processors
coupled with a memory, the memory storing instructions executable by the one or
more processors and configured to:
extract one or more irradiation attributes from the generated first set of signals;
determine a variation in at least one of the one or more irradiation attributes by comparing the one or more extracted irradiation attributes of the sensed irradiation received after the reflection with the pre-configured irradiation attributes; and
generate one or more set of data packets based on the determined variation, wherein the generated one or more set of data packets pertains to any or a combination of an image, an audio, and a text message, and is transmitted to one or more mobile computing devices, communicatively coupled to the processing unit, to assist the user in ocular communication.
2. The eye-ware as claimed in claim 1, wherein any or a combination of the one or more irradiating sources and the one or more sensors are positioned such that the irradiation emitted through the one or more irradiating sources passes through at least one optical axis associated with the at least a part of the eye of the user before being sensed by the one or more sensors.
3. The eye-ware as claimed in claim 1, wherein the one or more irradiating sources are positioned at a second pre-defined position on the frame of the eye-ware.
4. The eye-ware as claimed in claim 1, wherein the one or more set of data packets is generated by the processing unit when the detected variation is within one or more ranges of pre-defined thresholds, wherein the one or more ranges of pre-defined thresholds pertain to voluntary movements associated with ophthalmic attributes of the at least a part of the eye of the user, so that the eye-ware is able to obviate involuntary movements associated with ophthalmic attributes.
5. The eye-ware as claimed in claim 4, wherein the ophthalmic attributes comprise any or a combination of an orientation, a position, an eyelid closure, an eye blink, a pupil radius, and cornea of the at least an eye of the user.
6. The eye-ware as claimed in claim 1, wherein any or a combination of the one or more ranges of thresholds and the pre-configured irradiation attributes are configured through the one or more mobile computing devices.
7. The eye-ware as claimed in claim 1, wherein the one or more irradiating sources comprise any or a combination of infrared (IR) source, LED source, and white light source, and wherein the one or more irradiation attributes are any or a combination of intensity, colour, frequency, and wavelength of the one or more irradiating sources.
8. The eye-ware as claimed in claim 1, wherein the one or more sensors are any or a combination of illumination sensor, optical triangular position sensor, ultrasonic sensor, irradiation sensor, and colour sensor.
9. The eye-ware as claimed in claim 1, wherein the one or more set of data packets is indicative of any or a combination of hunger, thirstiness, and need to go to washroom.
10. The eye-ware as claimed in claim 1, wherein the eye-ware comprises at least two frame inserts such that any or a combination of transmission and reflection properties of the at least two frame inserts are identical, wherein the frame is configured in such a way that the at least two frame inserts are exchangeably attachable to the eye-ware.
| # | Name | Date |
|---|---|---|
| 1 | 202011003608-STATEMENT OF UNDERTAKING (FORM 3) [27-01-2020(online)].pdf | 2020-01-27 |
| 2 | 202011003608-FORM FOR STARTUP [27-01-2020(online)].pdf | 2020-01-27 |
| 3 | 202011003608-FORM FOR SMALL ENTITY(FORM-28) [27-01-2020(online)].pdf | 2020-01-27 |
| 4 | 202011003608-FORM 1 [27-01-2020(online)].pdf | 2020-01-27 |
| 5 | 202011003608-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [27-01-2020(online)].pdf | 2020-01-27 |
| 6 | 202011003608-EVIDENCE FOR REGISTRATION UNDER SSI [27-01-2020(online)].pdf | 2020-01-27 |
| 7 | 202011003608-DRAWINGS [27-01-2020(online)].pdf | 2020-01-27 |
| 8 | 202011003608-DECLARATION OF INVENTORSHIP (FORM 5) [27-01-2020(online)].pdf | 2020-01-27 |
| 9 | 202011003608-COMPLETE SPECIFICATION [27-01-2020(online)].pdf | 2020-01-27 |
| 10 | abstract.jpg | 2020-02-04 |
| 11 | 202011003608-FORM-26 [19-03-2020(online)].pdf | 2020-03-19 |
| 12 | 202011003608-Proof of Right [27-06-2020(online)].pdf | 2020-06-27 |
| 13 | 202011003608-FORM 18 [17-09-2021(online)].pdf | 2021-09-17 |
| 14 | 202011003608-FER.pdf | 2022-03-23 |
| 15 | 202011003608-FORM-26 [22-09-2022(online)].pdf | 2022-09-22 |
| 16 | 202011003608-FER_SER_REPLY [22-09-2022(online)].pdf | 2022-09-22 |
| 17 | 202011003608-DRAWING [22-09-2022(online)].pdf | 2022-09-22 |
| 18 | 202011003608-CORRESPONDENCE [22-09-2022(online)].pdf | 2022-09-22 |
| 19 | 202011003608-CLAIMS [22-09-2022(online)].pdf | 2022-09-22 |
| 20 | 202011003608-US(14)-HearingNotice-(HearingDate-07-06-2024).pdf | 2024-03-12 |
| 21 | 202011003608-US(14)-ExtendedHearingNotice-(HearingDate-01-07-2024).pdf | 2024-06-03 |
| 22 | 202011003608-FORM-26 [27-06-2024(online)].pdf | 2024-06-27 |
| 23 | 202011003608-Correspondence to notify the Controller [27-06-2024(online)].pdf | 2024-06-27 |
| 24 | 202011003608-Written submissions and relevant documents [16-07-2024(online)].pdf | 2024-07-16 |
| 25 | 202011003608-FORM-26 [16-07-2024(online)].pdf | 2024-07-16 |
| 26 | 202011003608-Annexure [16-07-2024(online)].pdf | 2024-07-16 |
| 27 | 202011003608-PatentCertificate20-09-2024.pdf | 2024-09-20 |
| 28 | 202011003608-IntimationOfGrant20-09-2024.pdf | 2024-09-20 |
| 1 | 202011003608_SearchStrategyE_22-03-2022.pdf | |