
Smart 3D Glass

Abstract: Disclosed is a method and system for controlling a smart 3D glass. The method comprises obtaining data associated with a multimedia, a user, and a smart 3D glass and determining an alignment of the smart 3D glass with respect to a device displaying the multimedia. Further, the method comprises computing a blink rate of an eye of the user, a time elapsed after last eye blink, a stress on the eye of the user due to watching of the multimedia, and a time elapsed upon wearing the smart 3D glass based on image processing of eye tracking camera data. Furthermore, the method comprises switching a mode of the 3D glass.


Patent Information

Application #
Filing Date
06 January 2016
Publication Number
07/2016
Publication Type
INA
Invention Field
PHYSICS
Status
Email
ip@legasis.in
Parent Application

Applicants

HCL Technologies Limited
B-39, Sector 1, Noida 201 301, Uttar Pradesh, India

Inventors

1. TAMMANA, Sankar Uma
HCL Technologies Limited, A-8 & 9, Sec-60, Noida, Uttar Pradesh - 201301, India
2. DHALIWAL, Jasbir Singh
HCL Technologies Limited, A-8 & 9, Sec-60, Noida, Uttar Pradesh - 201301, India

Specification

FIELD
[001] The present subject matter described herein, in general, relates to a system and a method for a smart 3D glass, and more particularly, to a system and a method for controlling a smart 3D glass.
BACKGROUND
[002] In the current market, 3D movies, games, and other content are rapidly becoming popular. Generally, to enable the watching of a 3D multimedia such as a movie, different display information must be provided to each eye of an individual. In other words, the right eye of the individual does not see any of the information meant for the left eye of the individual, and vice versa. The two current methods employed by a 3D glass in order to enable viewing of a 3D multimedia are called the active method and the passive method. Generally, active 3D glasses are battery-operated shutter glasses that rapidly shutter open and closed. This, in theory, means the information meant for the left eye is blocked from the right eye by a closed shutter.
[003] 3D content provides an experience of reality and is thus becoming popular day by day, resulting in an increase in the number of 3D content providers. Generally, it is observed that while watching a video, viewers typically forget to blink their eyes, resulting in high stress on their eyes and dry eyes. The condition becomes even more severe while watching a 3D multimedia using a 3D glass, since high-intensity activity, such as shuttering or polarizing, is being performed by the 3D glass in front of the eyes in order to enable viewing of the 3D content. Thus, constant use of the 3D glass, along with low blinking by the viewer, may also result in nausea or headache. This may further deteriorate the health of the viewer's eyes. Conventional techniques available for monitoring blinking of the eyes fail to enforce blinking when implemented with 3D glasses. Conventional techniques also fail to control the 3D glass in order to maintain the health of a viewer's eyes while watching a 3D multimedia using a 3D glass.
SUMMARY
[004] Before the present systems and methods for controlling a smart 3D glass are described, it is to be understood that this application is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosures. It is also to be understood that the terminology used in the description is for the purpose of describing the particular implementations or versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce aspects related to a system and a method for controlling a smart 3D glass. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[005] In one implementation, a system for controlling a smart 3D glass is disclosed. In one aspect, the system may obtain data associated with a multimedia, a user, and a smart 3D glass comprising an infrared sensor and an eye tracking camera. The data may comprise infrared sensor data, eye tracking camera data, multimedia data, and user data. Further, the system may determine an alignment of the smart 3D glass with respect to a device displaying the multimedia based on the infrared sensor data, and compute a blink rate of an eye of the user and a time elapsed after a last eye blink based on image processing of the eye tracking camera data. Furthermore, the system may compute a stress on the eye of the user due to watching of the multimedia based on image processing of the eye tracking camera data, and a time elapsed upon wearing the smart 3D glass based on the eye tracking camera data. Finally, the system may switch a mode of the 3D glass based on one or more of the alignment of the smart 3D glass, the blink rate of the eye, the time elapsed after the last eye blink, the stress on the eye, the multimedia data, a user specific time, a time elapsed upon start of the multimedia, and the time elapsed upon wearing the smart 3D glass.
[006] In one implementation, a method for controlling a smart 3D glass is disclosed. In one aspect, the method may comprise obtaining data associated with a multimedia, a user, and a smart 3D glass comprising an infrared sensor and an eye tracking camera. The data may comprise infrared sensor data, eye tracking camera data, multimedia data, and user data. The method may further comprise determining an alignment of the smart 3D glass with respect to a device displaying the multimedia based on the infrared sensor data, and computing a blink rate of an eye of the user and a time elapsed after a last eye blink based on image processing of the eye tracking camera data. The method may furthermore comprise computing a stress on the eye of the user due to watching of the multimedia based on image processing of the eye tracking camera data, and computing a time elapsed upon wearing the smart 3D glass based on the eye tracking camera data. The method may finally comprise switching a mode of the 3D glass based on one or more of the alignment of the smart 3D glass, the blink rate of the eye, the time elapsed after the last eye blink, the stress on the eye, the multimedia data, a user specific time, a time elapsed upon start of the multimedia, and the time elapsed upon wearing the smart 3D glass.
[007] In yet another implementation, a non-transitory computer readable medium embodying a program executable in a computing device for controlling a smart 3D glass is disclosed. In one aspect, the program may comprise a program code for obtaining data associated with a multimedia, a user, and a smart 3D glass comprising an infrared sensor and an eye tracking camera, wherein the data comprises infrared sensor data, eye tracking camera data, multimedia data, and user data. The program may comprise a program code for determining an alignment of the smart 3D glass with respect to a device displaying the multimedia based on the infrared sensor data. The program may comprise a program code for computing a blink rate of an eye of the user based on image processing of the eye tracking camera data. The program may comprise a program code for computing a stress on the eye of the user due to watching of the multimedia based on image processing of the eye tracking camera data. The program may comprise a program code for computing a time elapsed upon wearing the smart 3D glass based on the eye tracking camera data. The program may comprise a program code for switching a mode of the 3D glass based on one or more of the alignment of the smart 3D glass, the blink rate of the eye, the time elapsed after the last eye blink, the stress on the eye, the multimedia data, a user specific time, a time elapsed upon start of the multimedia, and the time elapsed upon wearing the smart 3D glass.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The foregoing detailed description of embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present subject matter, an example of construction of the present subject matter is provided as figures; however, the invention is not limited to the specific method and system disclosed in the document and the figures.
[009] The present subject matter is described in detail with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to various features of the present subject matter.
[010] Figure 1 illustrates a network implementation of a system for controlling a smart 3D glass, in accordance with an embodiment of the present subject matter.
[011] Figure 2 illustrates the system for controlling a smart 3D glass, in accordance with an embodiment of the present subject matter.
[012] Figure 3 illustrates a method for controlling a smart 3D glass, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[013] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended, in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that, as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods for controlling a smart 3D glass similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present disclosure, the exemplary systems and methods for controlling a smart 3D glass are now described. The disclosed embodiments for controlling a smart 3D glass are merely examples of the disclosure, which may be embodied in various forms.
[014] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments for controlling a smart 3D glass. However, one of ordinary skill in the art will readily recognize that the present disclosure for controlling a smart 3D glass is not intended to be limited to the embodiments described, but is to be accorded the widest scope consistent with the principles and features described herein.
[015] In an implementation, a system and method for controlling a smart 3D glass is described. In one embodiment, data is obtained. In one example, the data may be associated with one or more users, one or more multimedia being watched by the users and displayed on a screen of a device such as a television or a laptop, and one or more smart 3D glasses being worn by the users. Further, the smart 3D glass may comprise an infrared sensor and an eye tracking camera. Furthermore, the data may comprise infrared sensor data obtained from the infrared sensor, eye tracking camera data obtained from the eye tracking camera, multimedia data obtained from the device on which the multimedia is displayed, and user data obtained from the user or one or more central servers.
[016] Upon obtaining the data, an alignment of the smart 3D glass with respect to the device displaying the multimedia may be determined. The determination may be based on the infrared sensor data obtained by an alignment of a first infrared sensor in the smart 3D glass with a second infrared sensor in the device.
[017] Further to the determination, a blink rate of an eye of the user and a time elapsed after the last blink may be computed. The blink rate and the time elapsed may be computed based on image processing of the eye tracking camera data obtained from the eye tracking camera installed in the smart 3D glass. The blink rate may be understood as the speed at which the user of the 3D glass watching the multimedia on the device blinks his eyes.
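As an illustration only, and not part of the specification, the two metrics can be derived once an upstream image-processing step is assumed to have extracted blink events from the eye tracking camera data as a list of timestamps; the function name and units below are the author's own:

```python
def blink_stats(blink_timestamps, now):
    """Return (blink rate in blinks per minute, seconds since the last
    blink) from an ascending list of blink-event timestamps in seconds.
    A hypothetical helper; the specification does not define this API."""
    if not blink_timestamps:
        return 0.0, float("inf")  # no blink observed yet
    window = now - blink_timestamps[0]
    # Blinks per minute over the observed window of camera data.
    rate = (len(blink_timestamps) / window) * 60.0 if window > 0 else 0.0
    return rate, now - blink_timestamps[-1]
```

For example, four blinks spread over a sixty-second window give a rate of about 4 blinks per minute, with the time since the last blink read directly off the final timestamp.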
[018] Subsequent to the computation of the blink rate, a stress on the eye of the user may be computed. In one example, the stress may be due to watching of the multimedia. In one embodiment, the stress may be computed based on image processing of the eye tracking camera data obtained from the eye tracking camera installed in the smart 3D glass. Upon computing the stress, a time elapsed upon wearing the smart 3D glass may be computed based on the eye tracking camera data.
[019] Further to the computation of the time elapsed, a mode of the 3D glass may be switched based on one or more of the alignment of the smart 3D glass, the blink rate of the eye, the time elapsed after the last eye blink, the stress on the eye, the multimedia data, a user specific time, a time elapsed upon start of the multimedia, and the time elapsed upon wearing the smart 3D glass. In one example, the switching may comprise changing the mode of the smart 3D glass from 3D to 2D or from 2D to 3D. Thus, control of the 3D glass to prevent strain to the eyes is enabled.
[020] Referring now to Figure 1, a network implementation of a system 102 for controlling a smart 3D glass, in accordance with an embodiment of the present subject matter, may be described. In one embodiment, the present subject matter is explained considering that the system 102 may be implemented as a system installed within the smart 3D glass connected to a network 106. It may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, a cloud-based computing environment, and the like, in communication with the smart 3D glass through the network 106 in order to enable the control of the 3D glass.
[021] In another embodiment, the system 102 may also be implemented on a display device 104. It may be understood that the system implemented on the display device supports a plurality of browsers and all viewports. Examples of the plurality of browsers may include, but are not limited to, Chrome™, Mozilla™, Internet Explorer™, Safari™, and Opera™. It will also be understood that the system 102 may be accessed by multiple users through one or more display devices 104-1 …, and 104-N, collectively referred to as display devices 104 hereinafter, or through applications residing on the display devices 104. Examples of the display devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, a television, and a workstation. The display devices 104 are communicatively coupled to the system 102 and the 3D glasses through a network 106.
[022] In one implementation, the network 106 may be a wireless network, a wired network, or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as an intranet, a local area network (LAN), a wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. A shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like. In one embodiment, the system 102 may be communicatively coupled via a network to a database 110. In one example, the display devices may obtain the multimedia to be viewed from the database 110 via a network or via an electromagnetic signal transmission. In one embodiment, 109 represents communication over a Wi-Fi network, and 111 refers to wireless communication requiring a line of sight, such as infrared communication.
[023] Referring now to Figure 2, the system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 may be configured to fetch and execute computer-readable instructions stored in the memory 206.
[024] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with the user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[025] The memory 206 may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[026] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a receiving module 212, a computing module 214, a switching module 216, and other modules 218. The other modules 218 may include programs or coded instructions that supplement applications and functions of the system 102. The modules 208 described herein may be implemented as software modules that may be executed in the cloud-based computing environment of the system 102.
[027] The memory 206, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The memory 206 may include data generated as a result of the execution of one or more modules in the other modules 218. In one implementation, the memory may include data 210. Further, the data 210 may include system data 220 for storing data processed, computed, received, and generated by one or more of the modules 208. Furthermore, the data 210 may include other data 224 for storing data generated as a result of the execution of one or more modules in the other modules 218.
[028] In one implementation, at first, a user may use the device 104 or the smart 3D glass 108 to access the system 102 via the I/O interface 204. The user may register using the I/O interface 204 in order to use the system 102. In one aspect, the user may access the I/O interface 204 of the system 102 for obtaining information, providing input information, or switching information. In one implementation, the system 102 may automatically provide information to the user through the I/O interface 204.
RECEIVING MODULE 212
[029] Referring to Figure 2, in an embodiment, the receiving module 212 may obtain data associated with a multimedia, a user, and a smart 3D glass comprising an infrared sensor and an eye tracking camera. Further, the data may comprise infrared sensor data, eye tracking camera data, multimedia data, and user data. In one example, the multimedia may be a video, a television program, a movie, an image, or an advertisement being displayed on a device such as a television or a laptop. In another example, the multimedia may be a 2-dimensional (2D) multimedia or a 3-dimensional (3D) multimedia. Further, the multimedia data may be one or more of a type of multimedia, depth of 3D program data, and frequency data. Further, the eye tracking camera data may comprise one or more of eye movement data, eye blink data, eye retina data, eye pupil data, an eye open state, an eye close state, and gaze data. The user data may further comprise timer data. The timer data may be understood as the time for which the 3D glass should be active in a 3D mode. The infrared sensor data may further comprise infrared codes. In one example, the infrared codes may be transmitted by an infrared sensor installed in the device, such as a TV, and received by the infrared sensor installed in the smart 3D glass. In the embodiment, the receiving module 212 may store the obtained data in the system data 220.
[030] In the embodiment, upon obtaining the data associated with the multimedia, the user, and the smart 3D glass, the receiving module 212 may determine an alignment of the smart 3D glass with respect to a device displaying the multimedia based on the infrared sensor data. In one example, the determination may be based on receiving of the infrared codes when the smart 3D glass aligns with the device. In the embodiment, the receiving module 212 may store the alignment in the system data 220.
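A minimal sketch of this alignment check follows, under the assumption (not stated in the specification) that the display repeatedly broadcasts a fixed infrared code and the glass-side sensor logs whatever codes it receives; the code value and hit threshold are placeholders:

```python
DISPLAY_IR_CODE = 0x3D01  # hypothetical code broadcast by the display device

def is_aligned(received_codes, expected=DISPLAY_IR_CODE, min_hits=3):
    """Treat the smart 3D glass as aligned with the display when the
    expected infrared code was received at least min_hits times in the
    most recent sampling window. Thresholds are illustrative only."""
    return received_codes.count(expected) >= min_hits
```

Requiring several hits rather than one is a design choice to debounce brief reflections or head turns; a single stray code then does not flip the alignment state.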
COMPUTING MODULE 214
[031] In the implementation, further to obtaining the data and determining the alignment, the computing module 214 may compute a blink rate of an eye of the user. In one example, the blink rate of the eye may be computed based on image processing of the eye tracking camera data obtained from the eye tracking camera embedded in the smart 3D glass 108. Upon computing the blink rate of the eye, the computing module 214 may compute a stress on the eye of the user due to watching of the multimedia. In one example, the stress on the eye may be based on image processing of the eye tracking camera data. In one example, the image-processing techniques involve treating the image of the eye as a two-dimensional signal and applying standard signal-processing techniques. Subsequent to computing the stress on the eye, the computing module 214 may compute a time elapsed upon wearing of the smart 3D glass 108 by the user. The wearing of the smart 3D glass may be detected based on the eye tracking camera data. Upon computing the time elapsed upon wearing of the smart 3D glass 108 by the user, the computing module 214 may compute a time elapsed upon start of the multimedia based on the multimedia data.
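As one hedged example of treating the eye image as a two-dimensional signal, an open/closed decision can be made by thresholding the fraction of dark (pupil and iris) pixels in the eye region; the thresholds below are invented for illustration and are not from the specification:

```python
def eye_is_open(gray_frame, dark_threshold=60, min_dark_ratio=0.08):
    """Classify an eye-region grayscale frame (rows of 0-255 intensities)
    as open (True) or closed (False) by the fraction of dark pixels.
    A deliberately crude stand-in for real eye-tracking image processing."""
    pixels = [p for row in gray_frame for p in row]  # flatten the 2D signal
    dark = sum(1 for p in pixels if p < dark_threshold)
    return dark / len(pixels) >= min_dark_ratio
```

Counting open-to-closed transitions across successive frames then yields the blink events from which the blink rate, the time since the last blink, and a stress estimate can be derived.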
[032] Further, the computing module 214 may store the blink rate of the eye, the stress on the eye, the time elapsed upon wearing the smart 3D glass, and the time elapsed upon start of the multimedia in the system data 220.
SWITCHING MODULE 216
[033] In the implementation, the switching module 216 may compare the alignment of the smart 3D glass, the blink rate of the eye, the stress on the eye, and the time elapsed with a predefined criterion. The predefined criterion may comprise a maximum allowable stress on the eye, a maximum allowable time elapsed after the last eye blink, and a maximum allowable time elapsed upon wearing the smart 3D glass. The predefined criterion may further comprise a minimum blink rate and the alignment condition. Further to the comparison, the switching module 216 may switch a mode of the 3D glass based on one or more of the alignment of the smart 3D glass, the blink rate of the eye, the time elapsed after the last eye blink, the stress on the eye, the multimedia data, the user data, a time elapsed upon start of the multimedia, and the time elapsed upon wearing the smart 3D glass. Further, the switching may be from a 3D mode to a 2D mode or from a 2D mode to a 3D mode. Thus, controlling of the 3D glass is enabled. Further, the switching module 216 may also store switching data and the current mode in the system data 220. In one more embodiment, the switching module 216 may monitor the last eye blink and the stress, and switch to a 3D mode from a 2D mode when the eyes are adequately de-stressed or have blinked.
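The comparison against the predefined criterion can be sketched as follows; every threshold value, and the policy of falling back to 2D on any violation, is an assumption made for illustration rather than the specification's own rule:

```python
CRITERIA = {                        # illustrative placeholder thresholds
    "min_blink_rate": 10.0,         # blinks per minute
    "max_since_last_blink": 8.0,    # seconds
    "max_eye_stress": 0.7,          # normalized 0..1
    "max_wear_time": 3600.0,        # seconds
}

def select_mode(aligned, blink_rate, since_last_blink, eye_stress,
                wear_time, criteria=CRITERIA):
    """Return "3D" only when the glass is aligned with the display and
    every eye-health criterion is satisfied; otherwise fall back to "2D"."""
    if (not aligned
            or blink_rate < criteria["min_blink_rate"]
            or since_last_blink > criteria["max_since_last_blink"]
            or eye_stress > criteria["max_eye_stress"]
            or wear_time > criteria["max_wear_time"]):
        return "2D"
    return "3D"
```

Because the check is stateless, it also captures the reverse switch described above: once the monitored blink and stress values return within the criterion, the same call yields "3D" again.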
[034] Exemplary embodiments for controlling a smart 3D glass discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[035] Some embodiments enable the smart 3D glasses to be used as normal smart glasses, since they will default to a 2D mode in case no IR signal is received.
[036] Some embodiments of the system and the method enable keeping the viewer’s eyes healthy.
[037] Some embodiments of the system and the method eliminate the need to remove the smart 3D glasses for de-stressing the eyes.
[038] Some embodiments of the system and the method enable the smart 3D glasses to be used as normal smart glasses.
[039] Some embodiments of the system and the method enable distribution and watching of 2D and 3D mixed multimedia.
[040] Referring now to Figure 3, a method 300 for controlling a smart 3D glass is shown, in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
[041] The order in which the method 300 for controlling a smart 3D glass is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented in the above described system 102.
[042] At block 302, data associated with a multimedia, a user, and a smart 3D glass comprising an infrared sensor and an eye tracking camera is obtained. The data may comprise infrared sensor data, eye tracking camera data, multimedia data, and user data. In an implementation, the receiving module 212 may obtain the data associated with the multimedia, the user, and the smart 3D glass comprising the infrared sensor and the eye tracking camera, and store the data in the system data 220.
[043] At block 304, an alignment of the smart 3D glass with respect to a device displaying the multimedia may be determined based on the infrared sensor data. In the implementation, the receiving module 212 may determine the alignment of the smart 3D glass with respect to the device based on the infrared sensor data and store the alignment in the system data 220.
[044] At block 306, a blink rate of an eye of the user may be computed. The blink rate of the eye of the user may be computed based on image processing of the eye tracking camera data. In the implementation, the computing module 214 may compute the blink rate of the eye of the user and store the blink rate in the system data 220.
[045] At block 308, a stress on the eye of the user due to watching of the multimedia may be computed. The stress on the eye of the user due to watching of the multimedia may be computed based on image processing of the eye tracking camera data. In the implementation, the computing module 214 may compute the stress on the eye of the user due to watching of the multimedia and store the stress on the eye of the user in the system data 220.
[046] At block 310, a time elapsed upon wearing the smart 3D glass based on the eye tracking camera data may be computed. In the implementation, the computing module 214 may compute a time elapsed upon wearing the smart 3D glass and store the time elapsed in system data 220.
[047] At block 312, a mode of the 3D glass is switched. The switching of the mode may be based on one or more of the alignment of the smart 3D glass, the blink rate of the eye, the time elapsed after the last eye blink, the stress on the eye, the multimedia data, the user data, a time elapsed upon start of the multimedia, and the time elapsed upon wearing the smart 3D glass. In the implementation, the switching module 216 may switch the mode of the smart 3D glass and store the mode in the system data 220.
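Blocks 302 through 312 can be combined into a single self-contained control pass; as before, the infrared code, the thresholds, and the function shape are illustrative assumptions, not the claimed implementation:

```python
def control_step(ir_codes, blink_times, now, wear_start, eye_stress,
                 display_code=0x3D01):
    """One pass over blocks 302-312: derive the metrics from raw sensor
    data and choose the glass mode. All constants are placeholders."""
    aligned = display_code in ir_codes                   # block 304
    if blink_times:
        window = max(now - blink_times[0], 1e-9)
        blink_rate = len(blink_times) / window * 60.0    # block 306
        since_last = now - blink_times[-1]
    else:
        blink_rate, since_last = 0.0, float("inf")
    wear_time = now - wear_start                         # block 310
    healthy = (blink_rate >= 10.0 and since_last <= 8.0  # blocks 308/312
               and eye_stress <= 0.7 and wear_time <= 3600.0)
    return "3D" if aligned and healthy else "2D"
```

Each call consumes only the current sensor snapshot, so such a routine could run on a timer inside the glass, switching the mode whenever the derived metrics cross the placeholder thresholds.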
[048] Exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include a method for controlling a smart 3D glass.
[049] Although implementations for methods and systems for controlling a smart 3D glass have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for controlling a smart 3D glass.

WE CLAIM:
1. A method for controlling a smart 3D glass, the method comprising:
obtaining, by a processor, data associated with a multimedia, a user, and a smart 3D glass comprising an infrared sensor and an eye tracking camera, wherein the data comprises infrared sensor data, eye tracking camera data, multimedia data, and user data;
determining, by the processor, an alignment of the smart 3D glass with respect to a device displaying the multimedia based on the infrared sensor data;
computing, by the processor, a blink rate of an eye of the user and a time elapsed after last eye blink based on image processing of the eye tracking camera data;
computing, by the processor, a stress on the eye of the user due to watching of the multimedia based on image processing of the eye tracking camera data;
computing, by the processor, a time elapsed upon wearing the smart 3D glass based on the eye tracking camera data; and
switching, by the processor, a mode of the 3D glass based on one or more of the alignment of the smart 3D glass, the blink rate of the eye, the time elapsed after the last eye blink, the stress on the eye, the multimedia data, the user data, a time elapsed upon start of the multimedia, and the time elapsed upon wearing the smart 3D glass.
2. The method of claim 1, further comprising computing, by the processor, the time elapsed upon start of the multimedia based on the multimedia data.
3. The method of claim 1, further comprising comparing, by the processor, the alignment of the smart 3D glass, the blink rate of the eye, the stress on the eye, and the time elapsed with a predefined criterion.
4. The method of claim 1, wherein the multimedia is one or more of a 2 dimensional multimedia and a 3 dimensional multimedia.
5. The method of claim 1, wherein the multimedia data is one or more of a type of multimedia, a depth of 3D program data, and frequency data.
6. The method of claim 1, wherein the multimedia may be one of a video, a television program, a movie, an image, and an advertisement.
7. The method of claim 1, wherein the eye tracking camera data further comprises one or more of eye movement data, eye blink rate data, eye retina data, an eye open state, an eye close state, and gaze data.
8. The method of claim 1, wherein the user data further comprises timer data for switching of the mode of the smart 3D glass.
9. The method of claim 1, wherein the infrared sensor data further comprises infrared codes.
10. A system for controlling a smart 3D glass, the system comprising:
a memory; and
a processor coupled to the memory, wherein the processor is capable of executing instructions to perform steps of:
obtaining data associated with a multimedia, a user, and a smart 3D glass comprising an infrared sensor and an eye tracking camera, wherein the data comprises infrared sensor data, eye tracking camera data, multimedia data, and user data;
determining an alignment of the smart 3D glass with respect to a device displaying the multimedia based on the infrared sensor data;
computing a blink rate of an eye of the user and a time elapsed after last eye blink based on image processing of the eye tracking camera data;
computing a stress on the eye of the user due to watching of the multimedia based on image processing of the eye tracking camera data;
computing a time elapsed upon wearing the smart 3D glass based on the eye tracking camera data; and
switching a mode of the 3D glass based on the alignment of the smart 3D glass, the blink rate of the eye, the time elapsed after last eye blink, the stress on the eye, multimedia data, user data, a time elapsed upon start of the multimedia, and the time elapsed upon wearing the smart 3D glass.
11. The system of claim 10, further comprising computing, by the processor, the time elapsed upon start of the multimedia based on the multimedia data.
12. The system of claim 10, further comprising comparing the alignment of the smart 3D glass, the blink rate of the eye, the stress on the eye, and the time elapsed with a predefined criterion.
13. The system of claim 10, wherein the multimedia is one or more of a 2-dimensional multimedia and a 3-dimensional multimedia.
14. The system of claim 10, wherein the multimedia data is one or more of a type of multimedia, depth of 3D program data, and frequency data.
15. The system of claim 10, wherein the multimedia may be one of a video, a television program, a movie, an image, and an advertisement.
16. The system of claim 10, wherein the eye tracking camera data further comprises one or more of eye movement data, eye blink data, eye retina data, eye pupil data, an eye open state, an eye close state, and gaze data.
17. The system of claim 10, wherein the user data further comprises timer data for switching of the mode of the smart 3D glass.
18. The system of claim 10, wherein the infrared sensor data further comprises infrared codes.
19. A non-transitory computer program product having embodied thereon a computer program for controlling a smart 3D glass, the computer program product storing instructions, the instructions comprising instructions for:
obtaining data associated with a multimedia, a user, and a smart 3D glass comprising an infrared sensor and an eye tracking camera, wherein the data comprises infrared sensor data, eye tracking camera data, multimedia data, and user data;
determining an alignment of the smart 3D glass with respect to a device displaying the multimedia based on the infrared sensor data;
computing a blink rate of an eye of the user and a time elapsed after last eye blink based on image processing of the eye tracking camera data;
computing a stress on the eye of the user due to watching of the multimedia based on image processing of the eye tracking camera data;
computing a time elapsed upon wearing the smart 3D glass based on the eye tracking camera data;
computing a time elapsed upon start of the multimedia based on the multimedia data;
comparing the alignment of the smart 3D glass, the blink rate of the eye, the stress on the eye, and the time elapsed with a predefined criterion; and
switching a mode of the 3D glass based on the alignment of the smart 3D glass, the blink rate of the eye, the time elapsed after last eye blink, the stress on the eye, multimedia data, user data, a time elapsed upon start of the multimedia, and the time elapsed upon wearing the smart 3D glass.
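The mode-switching decision recited across the method, system, and computer-program-product claims can be sketched as below. This is an illustrative reading only: the class name, threshold values, two-mode model ("3D" vs. "2D"), and field names are all assumptions of this sketch, not part of the claimed specification, which leaves the "predefined criterion" unspecified.

```python
from dataclasses import dataclass

# Hypothetical thresholds standing in for the "predefined criterion"
# of claims 3 and 12; the specification does not fix these values.
MAX_BLINK_RATE = 30          # blinks per minute
MAX_STRESS_LEVEL = 0.8       # normalized eye-strain score, 0..1
MAX_WEAR_TIME_S = 2 * 3600   # seconds of continuous wear

@dataclass
class GlassState:
    aligned: bool            # alignment with the display (infrared sensor data)
    blink_rate: float        # blinks per minute (eye tracking camera data)
    stress: float            # normalized eye-strain score
    wear_time_s: float       # time elapsed upon wearing the glass
    mode: str = "3D"         # current mode of the glass

def switch_mode(state: GlassState) -> str:
    """Compare the computed measures with the predefined criteria and
    switch the glass between 3D and 2D (pass-through) modes."""
    if (not state.aligned
            or state.blink_rate > MAX_BLINK_RATE
            or state.stress > MAX_STRESS_LEVEL
            or state.wear_time_s > MAX_WEAR_TIME_S):
        state.mode = "2D"    # misaligned viewing or excessive eye strain
    else:
        state.mode = "3D"
    return state.mode
```

In this reading, any single criterion being exceeded is enough to drop out of 3D mode, consistent with claim 1's "based on one or more of" language; a real implementation could equally weight or combine the measures.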

Documents

Application Documents

# Name Date
1 Form 3 [06-01-2016(online)].pdf 2016-01-06
3 Drawing [06-01-2016(online)].pdf 2016-01-06
4 Description(Complete) [06-01-2016(online)].pdf 2016-01-06
5 201611000511-GPA-(13-05-2016).pdf 2016-05-13
6 201611000511-Form-1-(13-05-2016).pdf 2016-05-13
7 201611000511-Correspondence Others-(13-05-2016).pdf 2016-05-13
8 abstract.jpg 2016-07-10
9 201611000511-FER.pdf 2018-12-21
10 201611000511-OTHERS [28-05-2019(online)].pdf 2019-05-28
11 201611000511-FER_SER_REPLY [28-05-2019(online)].pdf 2019-05-28
12 201611000511-DRAWING [28-05-2019(online)].pdf 2019-05-28
13 201611000511-COMPLETE SPECIFICATION [28-05-2019(online)].pdf 2019-05-28
14 201611000511-CLAIMS [28-05-2019(online)].pdf 2019-05-28
15 201611000511-POA [09-07-2021(online)].pdf 2021-07-09
16 201611000511-FORM 13 [09-07-2021(online)].pdf 2021-07-09
17 201611000511-Proof of Right [22-10-2021(online)].pdf 2021-10-22
18 201611000511-US(14)-HearingNotice-(HearingDate-17-06-2022).pdf 2022-05-26
19 201611000511-Correspondence to notify the Controller [06-06-2022(online)].pdf 2022-06-06

Search Strategy

1 Search201611000511_26-03-2018.pdf