Abstract: The present disclosure provides an eyewear apparatus for real-time motion monitoring of a user. The apparatus comprises an eyewear frame and a first image capturing unit mounted on the eyewear frame and configured to record first data related to the user, wherein the first data is indicative of events occurring in a scene viewed by the user. A second image capturing unit is mounted at a predetermined location to record second data related to the user, wherein the second data is indicative of the user's eye gaze and eye pupil size. A sensor is configured for sensing third data related to the user, wherein the third data is indicative of motion of the user's head. A memory unit is configured to store the first data, the second data and the third data. A control unit analyses the first data, the second data and the third data stored in the memory unit for real-time motion monitoring of the user. Figure 1
TECHNICAL FIELD
The present disclosure relates to a wearable apparatus for a user. Embodiments of the disclosure relate to an eyewear apparatus for real-time motion monitoring of a user. More particularly, embodiments of the disclosure disclose a method for monitoring real-time motion of a user.
BACKGROUND OF THE DISCLOSURE
Human nervous system disorders, or neurological disorders, such as vertigo, syncope, presyncope, epilepsy and the like are prevalent in many human beings around the world. These neurological disorders occur randomly in a subject (human being) when the subject is manoeuvring at a particular acceleration and position. Thus, diagnosis of these neurological disorders requires accurate determination of the occurrence of the attack or event, so that the clinician can prescribe an effective mode of treatment for the neurological disorder. The treatment broadly includes manoeuvres, rehabilitative physiotherapy, medications and surgery.
In the current state of the art, accurate determination of the occurrence of the above-mentioned attacks involves manoeuvring the subject into different positions for particular intervals of time, in a controlled environment. This process is tedious and time-consuming, as it involves multiple iterations and trials. More importantly, during this process the subject is uncomfortable due to manoeuvring at unusual positions.
To overcome these limitations, tools such as video electroencephalography [video EEG] and videonystagmography are employed. Video EEG includes recording the subject's brain activity with simultaneous video recording of the subject's movements. Videonystagmography likewise includes recording the subject's eye movement pattern during a provocative manoeuvre. However, these tools are bulky and are made available in clinical settings/environments only. Also, these tools involve painful re-enactment of the attack by passing fluids into the subject's ear to induce an attack. This inherently causes acute discomfort to the subject, irrespective of whether an attack was induced or not. Further, these tools cannot be used in real-time conditions, where most attacks occur. During a real-time attack, the subject may experience a fall or collapse, which can be fatal.
In the light of the above, there is a need to develop an apparatus and a method for monitoring the subject, by overcoming the limitations stated above.
SUMMARY OF THE DISCLOSURE
One or more shortcomings of the prior art are overcome and additional advantages are provided through a system and a method of the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the disclosure.
It is to be understood that the aspects and embodiments of the disclosure described above may be used in any combination with each other. Several of the aspects and embodiments may be combined together to form a further embodiment of the disclosure.
In an embodiment of the present disclosure, an eyewear apparatus for real-time motion monitoring of a user is disclosed. The apparatus comprises an eyewear frame and at least one first image capturing unit mounted on the eyewear frame. The at least one first image capturing unit is configured to record first data related to the user, wherein the first data is indicative of events occurring in a scene viewed by the user. At least one second image capturing unit is mounted at a predetermined location on the eyewear frame to record second data related to the user, wherein the second data is indicative of the user's eye gaze and eye pupil size. At least one sensor is configured on the eyewear frame for sensing third data related to the user, wherein the third data is indicative of motion of the user's head. A memory unit is configured on the eyewear frame, wherein the memory unit stores the first data, the second data and the third data received from the at least one first image capturing unit, the at least one second image capturing unit and the at least one sensor, respectively. A control unit configured on the eyewear frame is adapted to process and analyse the first data, the second data and the third data stored in the memory unit for real-time motion monitoring of the user.
In an embodiment, the control unit analyses the first data and the third data with a first reference parameter, and the second data with a second reference parameter, for real-time motion monitoring of the user.
In an embodiment, analysis of the first data, the second data and the third data by the control unit comprises steps of processing the first data and the third data to determine the user's head position relative to a first reference parameter, wherein the first reference parameter provides the normal head position of the user at a predetermined time period. The second data is processed to determine the user's eye gaze and eye pupil size relative to a second reference parameter, wherein the second reference parameter provides the normal eye gaze and eye pupil size of the user at a predetermined time period. The first data and the third data are compared with the first reference parameter to detect the user's head position. The second data is compared with the second reference parameter to detect the user's eye gaze and eye pupil size, and real-time motion of the user is monitored based on the comparison steps.
In an embodiment, the at least one first image capturing unit is a scene camera.
In an embodiment, the at least one second image capturing unit is an infrared camera.
In an embodiment, the at least one sensor is at least one of an accelerometer and a gyroscope.
In an embodiment, a wireless communication unit is provisioned to provide a feedback signal to a computing unit located external to the eyewear apparatus, upon detecting a collapse of the user.
In an embodiment, a tracking unit is provisioned on the eyewear frame, for tracking location of the user.
In an embodiment, a light sensor is configured on the eyewear frame for recording ambient light conditions of the user's environment.
In an embodiment, a microphone is provisioned on the eyewear frame, for recording sound in the user's environment.
In an embodiment, a power pack is provisioned in the eyewear frame for supplying power.
In an embodiment, a toggle switch is provisioned, for enabling and disabling the apparatus.
In an embodiment, a method of real-time motion monitoring of a user by an eyewear apparatus is provided. The method comprises acts of recording, by at least one first image capturing unit, first data of events occurring in a scene viewed by the user, wherein the at least one first image capturing unit is mounted on an eyewear frame of the eyewear apparatus. Second data of the user's eye gaze and eye pupil size is recorded by at least one second image capturing unit, wherein the at least one second image capturing unit is configured at a predetermined position on the eyewear frame. Third data of the motion of the user's head is recorded by at least one sensor, wherein the at least one sensor is configured on the eyewear frame. The first data, the second data and the third data received from the at least one first image capturing unit, the at least one second image capturing unit and the at least one sensor, respectively, are stored by a memory unit, wherein the memory unit is configured on the eyewear frame. Lastly, the first data, the second data and the third data stored in the memory unit are analysed by a control unit, thereby monitoring real-time motion of the user.
In an embodiment, analysing by the control unit comprises acts of processing the first data and the third data to determine the user's head position relative to a first reference parameter, wherein the first reference parameter provides the normal head position of the user at a predetermined time period. The second data is processed to determine the user's eye gaze and eye pupil size relative to a second reference parameter, wherein the second reference parameter provides the normal eye gaze and eye pupil size of the user at a predetermined time period. The first data and the third data are compared with the first reference parameter to detect the user's head position. The second data is compared with the second reference parameter to detect the user's eye gaze and eye pupil size, and real-time motion of the user is monitored based on the comparison steps.
In an embodiment, the method comprises act of indicating, by a tracking unit, location of the user, wherein the tracking unit is provisioned on the eyewear frame.
In an embodiment, the method comprises act of providing, by a wireless communication unit, a feedback signal to a computing unit upon detecting a mismatch of the first data and the third data with the first reference parameter, and a mismatch of the second data with the second reference parameter, wherein the computing unit is configured external to the eyewear apparatus.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The novel features and characteristics of the disclosure are set forth in the appended description. The embodiments of the disclosure themselves, however, as well as a preferred mode of use and further objectives and advantages thereof, will best be understood by reference to the following detailed description of the illustrative embodiments when read in conjunction with the accompanying drawings. One or more embodiments are now described, by way of example only, with reference to the accompanying drawings.
Figure 1 illustrates a perspective view of the eyewear apparatus, in accordance with one embodiment of the present disclosure.
Figure 2a illustrates a side view of the eyewear apparatus, in accordance with one embodiment of the present disclosure.
Figure 2b illustrates another side view of the eyewear apparatus, in accordance with one embodiment of the present disclosure.
Figure 3 illustrates a top view of the eyewear apparatus, in accordance with one embodiment of the present disclosure.
Figure 4 illustrates a side view of a user, in accordance with an embodiment of the present disclosure.
Figure 5 illustrates a flow chart of the sequence of operations in the eyewear apparatus, in accordance with one embodiment of the present disclosure.
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter.
The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the system illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION OF THE DISCLOSURE
While the embodiments in the disclosure are subject to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the figures and will be described below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
It is to be noted that a person skilled in the art would be motivated by the present disclosure to modify the eyewear apparatus and the method of motion monitoring of a user. However, such modifications should be construed to fall within the scope of the disclosure. Accordingly, the drawings show only those specific details that are pertinent to understanding the embodiments of the present disclosure, so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
The terms "comprises", "comprising", or any other variations thereof used in the disclosure, are intended to cover a non-exclusive inclusion, such that a method, system, or assembly that comprises a list of components does not include only those components but may include other components not expressly listed or inherent to such system, assembly, or device. In other words, one or more elements in a system preceded by "comprises… a" does not, without more constraints, preclude the existence of other elements or additional elements in the system or device.
To overcome the limitations stated in the background, an eyewear apparatus for real-time motion monitoring of a user is provided. The eyewear apparatus comprises an eyewear frame, which acts as a base for receiving all the components of the eyewear apparatus. The eyewear frame is configured with a plurality of elongated temple members, which provide support for wearing the eyewear apparatus on the head of the user. At least one first image capturing unit is mounted on the eyewear frame and configured to record first data related to the user. The first data is indicative of events occurring in a scene viewed by the user. In an embodiment, the at least one first image capturing unit is a scene camera. At least one second image capturing unit is mounted on the eyewear frame and configured to record second data related to the user's eye. The second data is indicative of the user's eye gaze and eye pupil size. In an embodiment, the at least one second image capturing unit is an infrared camera. The eyewear apparatus also comprises at least one sensor, configured on the eyewear frame for sensing third data related to the user. The third data is indicative of motion of the user's head. In an embodiment, the at least one sensor is at least one of an accelerometer and a gyroscope for sensing the acceleration and orientation of the motion of the user's head.
A memory unit is configured in the eyewear frame, to store the first data, the second data and the third data. A control unit is configured in the eyewear frame and interfaced with the memory unit, to process and analyse the first data, the second data and the third data. The control unit analyses the first data and the third data with a first reference parameter, and the second data with a second reference parameter, for real-time motion monitoring of the user. The first reference parameter provides the normal head position of the user at a predetermined time period. The second reference parameter provides the normal user eye gaze and user pupil size at the predetermined time period.
The analysis performed by the control unit includes determination of the user's head position by processing the first data and the third data relative to the first reference parameter. The first data and the third data are compared with the first reference parameter to detect the user's head position. Concurrently, the user's eye gaze and eye pupil size are determined relative to the second reference parameter; the second data is compared with the second reference parameter to detect the eye gaze and eye pupil size. Real-time motion of the user is thus monitored based on the comparison of the first data, the second data and the third data with the respective reference parameters.
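Purely for illustration, the following Python sketch shows one way such a comparison against the reference parameters might be realised. The scalar baselines, field names and numeric thresholds are assumptions made for the sketch and are not part of the disclosure.

```python
# A minimal sketch, assuming the reference parameters reduce to scalar
# baselines and that deviation is tested against fixed thresholds. All
# names and values here are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class ReferenceParameters:
    """Baselines captured over a predetermined time period."""
    normal_head_pitch_deg: float  # first reference parameter: normal head position
    normal_gaze_angle_deg: float  # second reference parameter: normal eye gaze
    normal_pupil_size_mm: float   # second reference parameter: normal pupil size


def head_position_mismatch(head_pitch_deg: float,
                           ref: ReferenceParameters,
                           threshold_deg: float = 30.0) -> bool:
    """Compare head-position data against the first reference parameter."""
    return abs(head_pitch_deg - ref.normal_head_pitch_deg) > threshold_deg


def eye_mismatch(gaze_angle_deg: float,
                 pupil_size_mm: float,
                 ref: ReferenceParameters,
                 gaze_threshold_deg: float = 25.0,
                 pupil_threshold_mm: float = 2.0) -> bool:
    """Compare eye gaze and pupil size against the second reference parameter."""
    return (abs(gaze_angle_deg - ref.normal_gaze_angle_deg) > gaze_threshold_deg
            or abs(pupil_size_mm - ref.normal_pupil_size_mm) > pupil_threshold_mm)
```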
The following paragraphs describe the present disclosure with reference to Figures 1 to 5. In the figures, the same element or elements which have the same function are indicated by the same reference signs. In an exemplary embodiment of the disclosure, the figures illustrate aspects involved in an eyewear apparatus and a method for real-time motion monitoring of a user.
Figures 1, 2a, 2b and 3 illustrate a perspective view, side views and a top view of the eyewear apparatus (100) for real-time motion monitoring of a user, according to an exemplary embodiment of the present disclosure. The eyewear apparatus (100) comprises an eyewear frame (101) which acts as a base to accommodate all the components of the eyewear apparatus (100). The eyewear frame (101) is configured to accommodate a lens (102), which enables the user to view the surroundings through the lens (102).
The eyewear frame (101) is provisioned with elongated temple members (104), on either side of the eyewear frame (101), as a support to the user during use. The elongated temple members (104) anchor onto the user's ears, thereby acting as a support to the user during use of the eyewear apparatus (100). In an embodiment, the elongated temple members (104) are fixed on either side of the eyewear frame (101).
The eyewear frame (101) is provisioned with at least one first image capturing unit (106) [shown in figures 2a and 2b], configured to record first data of the user. The first data corresponds to the events occurring in the scene viewed by the user or, in other words, a recording of events occurring in front of the user. In an embodiment, the at least one first image capturing unit (106) is located at an outer portion of the eyewear frame (101), to record the view of the user during use of the eyewear apparatus (100). In another embodiment, the at least one first image capturing unit (106) is configured at a centre portion of the eyewear frame (101) in order to have a wider scene view in the line of sight of the user. In another embodiment, the at least one first image capturing unit (106) can be positioned at any location on the eyewear frame (101) which serves the purpose of recording events occurring in the scene viewed by the user. In an exemplary embodiment, the at least one first image capturing unit (106) is at least one of a motion capture camera, an infrared camera, a reflex camera, a high-speed camera or any other image capturing unit which serves the purpose.
Further, at least one second image capturing unit (103) [shown in figure 1] is configured in the eyewear frame (101), to record second data of the user. The second data corresponds to the user's eye gaze and eye pupil size. In an embodiment, the user's eye gaze corresponds to the line of sight of the user, or the user intently looking at the surroundings or environment. In an embodiment, the user's pupil size corresponds to the eye pupil size of the user during gazing, and its variation due to changes in light and due to looking at near and far objects. In an embodiment, the at least one second image capturing unit (103) is positioned at an inner portion of the eyewear frame (101), to record the user's eye gaze and eye pupil size. In another embodiment, the at least one second image capturing unit (103) is positioned proximal to the edges of the inner portion of the eyewear frame (101) [shown in figure 1]. The second image capturing unit (103) aids in uninterrupted recording of the user's eye gaze and eye pupil size without causing any hindrance to the vision of the user. In another embodiment, the at least one second image capturing unit (103) can be positioned at any location on the eyewear frame (101) which serves the purpose of recording the user's eye gaze and eye pupil size. In an exemplary embodiment, the at least one second image capturing unit (103) is at least one of a motion capture camera, an infrared camera, a reflex camera, a high-speed camera or any other image capturing unit which serves the purpose.
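For concreteness, the second data could be represented as in the short sketch below; the field names and units are assumptions made for illustration only.

```python
# An illustrative container for the second data (eye gaze and pupil size);
# the field names and units are assumptions, not part of the disclosure.
from dataclasses import dataclass


@dataclass
class EyeSample:
    gaze_angle_deg: float  # angular deviation of the gaze from straight ahead
    pupil_size_mm: float   # measured eye pupil diameter
```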
At least one sensor (107) is provisioned in the eyewear frame (101) for sensing third data of the user. The third data corresponds to the motion of the user's head; more specifically, to the orientation/tilt, position, movement and acceleration of the head. In an embodiment, the head orientation/tilt is recorded in the yaw, pitch and roll planes with respect to the horizontal and vertical axes. In an embodiment, the linear acceleration of the user's head is recorded in the heave, fore-aft, and up-down planes (positive and negative vertical axis). In an embodiment, the rotational acceleration of the head is recorded in the yaw, pitch and roll planes (X, Y and Z axes, with respect to the horizontal plane). In an embodiment, the at least one sensor (107) is located at the inner portion of the eyewear frame (101) [shown in figure 1], for determining abrupt or sudden motion of the user's head. In another embodiment, the at least one sensor (107) may be positioned on an outer portion of the eyewear frame (101) to determine abrupt or sudden motion of the user's head. In an exemplary embodiment, the at least one sensor (107) is located at the centre of the inner portion of the eyewear frame (101). In an embodiment, the at least one sensor (107) is at least one of a gyroscopic sensor, an accelerometer sensor, a proximity sensor or any other sensor which serves the purpose of determining abrupt or sudden motion of the user's head.
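The third data might then be represented as in the following sketch; the field names, units, and the acceleration-magnitude rule for flagging abrupt motion are assumptions made for illustration, not part of the disclosure.

```python
# An illustrative head-motion sample (the third data): orientation in the
# yaw, pitch and roll planes plus linear acceleration. The abrupt-motion
# threshold is an assumed value for the sketch.
import math
from dataclasses import dataclass


@dataclass
class HeadMotionSample:
    yaw_deg: float    # rotation about the vertical axis
    pitch_deg: float  # rotation about the side-to-side axis
    roll_deg: float   # rotation about the front-to-back axis
    accel_x: float    # linear acceleration, fore-aft (m/s^2)
    accel_y: float    # linear acceleration, lateral (m/s^2)
    accel_z: float    # linear acceleration, heave / up-down (m/s^2)


def is_abrupt_motion(sample: HeadMotionSample,
                     accel_threshold: float = 20.0) -> bool:
    """Flag abrupt or sudden head motion from the linear acceleration magnitude."""
    magnitude = math.sqrt(sample.accel_x ** 2
                          + sample.accel_y ** 2
                          + sample.accel_z ** 2)
    return magnitude > accel_threshold
```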
A wireless communication unit (113) is configured in the elongated temple members (104) of the eyewear frame (101) for wireless data transfer of the first data, the second data and the third data. In an embodiment, the wireless communication unit (113) is at least one of a Bluetooth device, an infrared device and a smart card. The smart card is at least one of a Subscriber Identity Module (SIM) card, a transmitter chip card or any other means which serves the purpose.
The elongated temple members (104) also include a toggle switch (109), for enabling and disabling the eyewear apparatus (100) as required by the user. The toggle switch (109) is located in a position that is accessible to the user. Further, a tracking unit (115) for tracking the location of the user is included in the elongated temple members (104). A microphone (117) is configured on the elongated temple members (104) to record sound in the user's environment. A light sensor (116) is also provisioned in the elongated temple members (104) for sensing the ambient light condition of the user's environment. In an embodiment, when no light is detected in the user's environment, the light sensor (116) temporarily disengages the eyewear apparatus (100).
A power pack (114) is configured in the elongated temple members (104) of the eyewear frame (101) for supplying power to the components of the eyewear apparatus (100). In an embodiment, the power pack (114) is a battery, which is at least one of a Li-ion battery, a rechargeable battery, a solar-powered battery or any other battery that serves the purpose. A slot may be provisioned [not shown in figures] in the elongated temple members (104) for receiving the power pack (114) for supplying power to the eyewear apparatus (100).
A memory unit (111) is provisioned in the elongated temple members (104) and is interfaced with the at least one first image capturing unit (106), the at least one second image capturing unit (103) and the at least one sensor (107). The memory unit (111) is configured for storing the data acquired by the at least one first image capturing unit (106), the at least one second image capturing unit (103) and the at least one sensor (107). In an embodiment, the memory unit (111) is configured to store the computations required for real-time motion monitoring of the user. In an embodiment, the memory unit (111) is at least one of a RAM, a ROM or any other storage device, which serves the purpose of storing data. In an embodiment, the computations stored in the memory unit (111) include measurement of deviation of data procured by the components relative to set reference parameters.
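As one way to picture the memory unit's role, the sketch below buffers the three data streams alongside the stored reference parameters; the bounded-buffer structure and its capacity are assumptions made for illustration.

```python
# An illustrative memory unit: bounded buffers for the three data streams
# plus the stored reference parameters used in the deviation computations.
# The buffer size and layout are assumptions for the sketch.
from collections import deque


class MemoryUnit:
    def __init__(self, reference_parameters, capacity: int = 1024):
        self.reference_parameters = reference_parameters
        self.first_data = deque(maxlen=capacity)   # scene-camera frames
        self.second_data = deque(maxlen=capacity)  # EyeSample records
        self.third_data = deque(maxlen=capacity)   # HeadMotionSample records

    def store(self, first, second, third) -> None:
        """Store one synchronised triple of first, second and third data."""
        self.first_data.append(first)
        self.second_data.append(second)
        self.third_data.append(third)
```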
The memory unit (111) is interfaced to a control unit (110) for processing and analysing the first data, the second data and the third data for real-time motion monitoring of the user. The control unit (110) includes a processing unit [not shown in figures] to process or perform analysis based on the data received from the components. The processing unit may comprise at least one data processor for executing program components that execute user- or system-generated requests. The processor may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc. The processing unit may include a microprocessor, such as AMD Athlon, Duron or Opteron, ARM's application, embedded or secure processors, IBM PowerPC, Intel's Core, Itanium, Xeon, Celeron or other lines of processors, etc. The processing unit may be implemented using mainframe, distributed processor, multi-core, parallel, grid, or other architectures. Some embodiments may utilize embedded technologies like application-specific integrated circuits (ASICs), digital signal processors (DSPs), Field Programmable Gate Arrays (FPGAs), etc. The processing unit may be disposed in communication with one or more input/output (I/O) devices via an I/O interface. The I/O interface may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc. In an embodiment, the control unit (110) may be directly interfaced with the at least one first image capturing unit (106), the at least one second image capturing unit (103) and the at least one sensor (107) for processing the procured data instantaneously.
The control unit (110), upon receipt of the first data, the second data and the third data, analyses the first data and the third data with a first reference parameter, and the second data with a second reference parameter. The first reference parameter corresponds to the normal head position of the user at a predetermined time period. In an embodiment, the normal head position corresponds to the head position along axis A-A [as shown in figure 4]. The second reference parameter corresponds to the normal user eye gaze position and eye pupil size at a predetermined time period. In an embodiment, the normal eye gaze position corresponds to the direction of gaze of the user along axis B-B [as shown in figure 4]. In an embodiment, the normal eye pupil size corresponds to the circumferential size of the eye pupil about axis B-B. Subsequently, the control unit (110) compares the first data and the third data with the first reference parameter, thereby detecting the user's head position. Similarly, the second data is compared with the second reference parameter, thereby detecting the user's eye gaze and eye pupil size. Based on these comparisons, the control unit (110) monitors motion of the user in real-time. If a mismatch of the first data and the third data with the first reference parameter, and of the second data with the second reference parameter, is detected, the control unit (110) operates the wireless communication unit (113) to provide a feedback signal to a computing unit (112). The computing unit (112) is an external server, located external to the eyewear apparatus (100). The computing unit (112) alerts the caretaker and/or clinician upon receipt of the feedback signal from the control unit (110). In an embodiment, detection of a mismatch of the data corresponds to a collapse of the user due to a neurological attack.
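If one assumes the feedback signal is carried over an IP network, a minimal sketch of the alert path could look as follows. The HTTP transport, the payload shape and the server URL are assumptions; the disclosure requires only a wireless feedback signal to an external computing unit, not a particular protocol.

```python
# A sketch of the feedback path only; transport and payload are assumed.
import json
import urllib.request


def send_feedback_signal(server_url: str, payload: dict) -> None:
    """Notify the external computing unit that a collapse was detected."""
    request = urllib.request.Request(
        server_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        response.read()  # alerting the caretaker/clinician happens server-side
```

A Bluetooth or cellular transport would follow the same pattern: serialise the event and hand it to the radio of the wireless communication unit (113).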
The components of the eyewear apparatus (100) described above are enclosed in an enclosure (108) located within the elongated temple members (104). The enclosure (108) is configured to provide ingress protection. Additionally, the enclosure (108) adds to the aesthetic appeal of the eyewear apparatus (100), by covering the components arranged in the eyewear frame (101).
In an embodiment, the lens (102) is snap-fitted into the eyewear frame (101). In another embodiment, the lens (102) is at least one of a polarised lens, a clear lens, a prescription lens, or combinations thereof.
Figure 5 illustrates a flow chart of the processes performed by the eyewear apparatus (100) for real-time motion monitoring of the user, according to an exemplary embodiment of the present disclosure. The eyewear apparatus (100) is worn by the user in real-time conditions (i.e. while performing daily activities). The user operates the toggle switch (109) to engage the eyewear apparatus (100) for real-time motion monitoring. Once the eyewear apparatus (100) is engaged, the power pack (114) supplies power to all the components of the eyewear apparatus (100).
In step 501, the at least one first image capturing unit (106) records the events occurring in the scene viewed by the user, as the first data. Simultaneously, the at least one second image capturing unit (103) records the user's eye gaze and eye pupil size, as the second data. In step 502, the at least one sensor (107) records the motion of the user's head as the third data. The first data, the second data and the third data are stored in the memory unit (111).
In step 503, the control unit (110) processes the first data and the second data for analysis. Simultaneously, in step 504 the control unit (110) processes the third data for analysis.
In step 505, the control unit (110) analyses the first data and the second data for abrupt or sudden movement patterns. In an embodiment, an abrupt or sudden movement pattern corresponds to a sudden change in position of the user from an initial position. From the first data, the control unit (110) analyses whether there are abrupt or sudden changes in the events occurring as viewed by the user; the first data is compared with the first reference parameter, to ascertain abrupt or sudden movement of the user's head (i.e. abrupt or sudden change in head motion relative to the normal position). In an embodiment, the control unit (110) detects abrupt or sudden change in head movement of the user along axis A-A [as shown in figure 4]. The second data is compared with the second reference parameter, to ascertain abrupt or sudden change in the user's eye gaze and pupil size (i.e. relative to the normal eye gaze and eye pupil size). In an embodiment, the control unit (110) detects abrupt or sudden change in the user's eye gaze and pupil size relative to axis B-B [as shown in figure 4]. When no abrupt movement of the user's head, eye gaze and pupil size is detected, the control unit (110) continues to process the first data and the second data via step 507.
Correspondingly, in step 506 the control unit (110) analyses and compares the third data with the first reference parameter, to ascertain any abrupt or sudden movement of the user's head (i.e. whether there are any abrupt or sudden changes in head motion relative to the normal head position). In an embodiment, the control unit (110) detects abrupt or sudden change in head movement of the user along axis A-A [as shown in figure 4]. When no abrupt head movement is detected, the control unit (110) continues to process the third data via step 508.
When abrupt movements are detected in both steps 505 and 506, the control unit (110) proceeds to step 509. In step 509, the control unit (110) ascertains that there is abrupt or sudden movement of the user's head as well as of the user's eye gaze and eye pupil size, and accordingly transmits a feedback signal to the computing unit (112). The computing unit (112), upon receiving the feedback signal, alerts the caretaker or clinician that a collapse of the user due to a neurological attack has been detected. When no abrupt movement is detected in steps 505 and 506, the control unit (110) reiterates the steps for real-time motion monitoring of the user.
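As an illustration of this flow only, the following Python sketch strings the helpers from the earlier sketches into a monitoring loop covering steps 501 to 509. The driver callables read_scene(), read_eye() and read_head(), and the alert URL, are hypothetical placeholders for the camera and sensor interfaces, which the disclosure does not specify.

```python
# A loop-level sketch of steps 501-509, reusing ReferenceParameters,
# MemoryUnit, head_position_mismatch, eye_mismatch, is_abrupt_motion and
# send_feedback_signal from the earlier sketches. All driver callables and
# the server URL below are assumed placeholders.
def monitoring_loop(memory, read_scene, read_eye, read_head,
                    server_url: str = "http://example.invalid/alerts") -> None:
    ref = memory.reference_parameters
    while True:  # runs for as long as the apparatus is engaged
        # Steps 501-502: record the first, second and third data, then store.
        first, second, third = read_scene(), read_eye(), read_head()
        memory.store(first, second, third)

        # Steps 503-506: process the data and compare against the first and
        # second reference parameters.
        head_abrupt = (head_position_mismatch(third.pitch_deg, ref)
                       or is_abrupt_motion(third))
        eye_abrupt = eye_mismatch(second.gaze_angle_deg,
                                  second.pupil_size_mm, ref)

        # Step 509: a joint mismatch is treated as a collapse due to a
        # neurological attack; otherwise steps 507-508 continue monitoring.
        if head_abrupt and eye_abrupt:
            send_feedback_signal(server_url, {"event": "user_collapse"})
```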
In an embodiment, the real-time motion monitoring of the user includes monitoring at least one of gait of the user, collapse of the user, fall of the user, uncontrolled motor coordination, unconscious free-fall of the user, seizure of the user and the like.
Equivalents
With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.”
In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
REFERRAL NUMERALS
REFERRAL NUMERAL	DESCRIPTION
100	Eyewear apparatus
101	Eyewear frame
102	Eyewear lens
103	Second image capturing unit
104	Elongated temple members
105	Hinges
106	First image capturing unit
107	Sensor
108	Enclosure
109	Toggle switch
110	Control unit
111	Memory unit
112	Computing unit
113	Wireless communication unit
114	Power pack
115	Tracking unit
116	Light sensor
117	Microphone
CLAIMS:
1. An eyewear apparatus (100) for real-time motion monitoring of a user, the apparatus (100) comprising:
an eyewear frame (101);
at least one first image capturing unit (106) mounted on the eyewear frame (101), configured to record first data related to the user, wherein the first data is indicative of events occurring in a scene viewed by the user;
at least one second image capturing unit (103) mounted at a predetermined location on the eyewear frame (101) to record second data related to the user, wherein the second data is indicative of the user eye gaze and the user eye pupil size;
at least one sensor (107) configured on the eyewear frame (101) for sensing third data related to the user, wherein the third data is indicative of motion of head of the user;
a memory unit (111) configured to the eyewear frame (101), wherein the memory unit (111) stores the first data, the second data and the third data received from the at least one first image capturing unit (106), the at least one second image capturing unit (103) and the at least one sensor (107) respectively, and
a control unit (110) configured to the eyewear frame (101) and adapted to process and analyse the first data, the second data and the third data stored in the memory unit (111) for real-time motion monitoring of the user.
2. The apparatus (100) as claimed in claim 1, wherein the control unit (110) analyses the first data and the third data with a first reference parameter, and the second data with a second reference parameter, for real-time motion monitoring of the user.
3. The apparatus (100) as claimed in claim 1, wherein analysis of the first data, the second data and the third data by the control unit (110) comprises steps of:
processing the first data and the third data to determine the user head position relative to a first reference parameter, wherein the first reference parameter provides normal head position of the user at a predetermined time period;
processing the second data to determine the user eye gaze and the user eye pupil size relative to the second reference parameter, wherein the second reference parameter provides normal eye gaze and eye pupil size of the user at a predetermined time period;
comparing the first data and the third data with the first reference parameter to detect user head position;
comparing the second data with the second reference parameter to detect user eye gaze and the eye pupil size; and
monitoring real-time motion of the user based on the comparison steps.
4. The apparatus (100) as claimed in claim 1, wherein the at least one first image capturing unit (106) is a scene camera.
5. The apparatus (100) as claimed in claim 1, wherein the at least one second image capturing unit (103) is an infrared camera.
6. The apparatus (100) as claimed in claim 1, wherein the at least one sensor (107) is at least one of an accelerometer and a gyroscope.
7. The apparatus (100) as claimed in claim 1 comprises a wireless communication unit (113) to provide a feedback signal to a computing unit (112) located external to the eyewear apparatus (100), upon detecting a collapse of the user.
8. The apparatus (100) as claimed in claim 1 comprises a tracking unit (115) provisioned on the eyewear frame (101), for tracking location of the user.
9. The apparatus (100) as claimed in claim 1 comprises a light sensor (116) configured on the eyewear frame (101) for recording ambient light conditions of user’s environment.
10. The apparatus (100) as claimed in claim 1 comprises a microphone (117) provisioned on the eyewear frame (101), for recording sound in the user’s environment.
11. The apparatus (100) as claimed in claim 1 comprises a power pack (114) provisioned in the eyewear frame (101) for supplying power.
12. The apparatus (100) as claimed in claim 1 comprises a toggle switch (109) for enabling and disabling the apparatus (100).
13. A method of real-time motion monitoring of a user by an eyewear apparatus (100), the method comprising acts of:
recording, by at least one first image capturing unit (106), first data of events occurring in a scene viewed by the user, wherein the at least one first image capturing unit (106) is mounted on an eyewear frame (101) of the eyewear apparatus (100);
recording, by at least one second image capturing unit (103), second data of the user eye gaze and the user eye pupil size, wherein the at least one second image capturing unit (103) is configured at a predetermined position on the eyewear frame (101);
recording, by at least one sensor (107), third data of motion of head of the user, wherein the at least one sensor (107) is configured on the eyewear frame (101);
storing, by a memory unit (111), the first data, the second data and the third data received from the at least one first image capturing unit (106), the at least one second image capturing unit (103) and the at least one sensor (107) respectively, wherein the memory unit (111) is configured to the eyewear frame (101), and
analysing, by a control unit (110), the first data, the second data and the third data stored in the memory unit (111), thereby monitoring real-time motion of the user.
14. The method as claimed in claim 13, wherein analysing, by the control unit (110), comprises acts of:
processing the first data and the third data to determine the user head position relative to a first reference parameter, wherein the first reference parameter provides normal head position of the user at a predetermined time period;
processing the second data to determine the user eye gaze and the user eye pupil size relative to the second reference parameter, wherein the second reference parameter provides normal eye gaze and eye pupil size of the user at a predetermined time period;
comparing the first data and the third data with the first reference parameter to detect user head position;
comparing the second data with the second reference parameter to detect user eye gaze and the eye pupil size; and
monitoring real-time motion of the user based on the comparison steps.
15. The method as claimed in claim 13 comprises act of indicating, by a tracking unit (115), location of the user, wherein the tracking unit (115) is provisioned on the eyewear frame (101).
16. The method as claimed in claim 13 comprises act of providing, by a wireless communication unit (113), a feedback signal to a computing unit (112) upon detecting a mismatch of the first data and the third data with the first reference parameter, and a mismatch of the second data with the second reference parameter, wherein the computing unit (112) is configured external to the eyewear apparatus (100).