
Measuring Multimedia Impact

Abstract: Disclosed is a method and system for determining an impact of a multimedia on a user watching the multimedia. The method comprises obtaining data associated with a smart glass, a user, and a multimedia, and detecting if the user is watching the multimedia based on image processing of the eye tracking camera data. The method further comprises computing an actual impact of the multimedia on the user based on image processing of the eye tracking camera data and the detection, and generating a variance based on a comparison of the actual impact of the multimedia on the user and an expected impact of the multimedia on the user. Further, the actual impact and the expected impact are each one of an emotional impact and a psychological impact.


Patent Information

Application #: 201611001448
Filing Date: 14 January 2016
Publication Number: 07/2016
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Grant Date: 2022-02-10

Applicants

HCL Technologies Limited
B-39, Sector 1, Noida 201 301, Uttar Pradesh, India

Inventors

1. DHALIWAL, Jasbir Singh
HCL Technologies Limited, A-8 & 9, Sec-60, Noida, Uttar Pradesh 201301, India
2. TAMMANA, Sankar Uma
HCL Technologies Limited, A-8 & 9, Sec-60, Noida, Uttar Pradesh 201301, India

Specification

TECHNICAL FIELD
[001] The present subject matter described herein, in general, relates to a system and a method for determining an impact and, more particularly, to a system and a method for determining an impact of a multimedia on a user watching the multimedia.
BACKGROUND
[002] In the current market, 3D movies, games, and other content are rapidly becoming popular. Generally, data indicating the viewing habits and activities of household members is very important for developers and advertisers to understand the effect of the multimedia content being developed on the viewers. By determining the effect of the multimedia content, developers and advertisers can determine the popularity of their shows and the strength of their message, further helping in the development of a robust marketing strategy. Conventional systems, such as surveys, require direct viewer participation to record the necessary information and data in order to determine the effect of any content. Other systems have attempted to reduce the required participation by the viewer, but have failed to enable a completely autonomous impact identification system with zero viewer participation. It is apparent from the above that there exists a need for a device and system that enables determination of an impact of a multimedia on a viewer watching the multimedia.
SUMMARY
[003] Before the present systems and methods are described, it is to be understood that this application is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosures. It is also to be understood that the terminology used in the description is for the purpose of describing the particular implementations or versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce aspects related to a system and a method for determining an impact of a multimedia on a user watching the multimedia. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[004] In one implementation, a system for determining an impact of a multimedia on a user watching the multimedia is disclosed. In one aspect, the system may obtain data associated with a smart glass, a user, a sensor, and a multimedia, wherein the smart glass comprises an eye tracking camera. The data may comprise eye tracking camera data, multimedia data, sensor data, and user data. Upon obtaining the data, the system may detect if the user is watching the multimedia based on image processing of the eye tracking camera data. The image processing of eye tracking camera data may comprise identifying a gaze direction. Further to detecting, the system may compute an actual impact of the multimedia on the user watching the multimedia based on image processing of the eye tracking camera data, the sensor data, and the detection. Subsequent to computing, the system may generate a variance based on a comparison of the actual impact of the multimedia on the user and an expected impact of the multimedia on the user. The actual impact and the expected impact may be one of an emotional impact and a psychological impact.
[005] In one implementation, a method for determining an impact of a multimedia on a user watching the multimedia is disclosed. In one aspect, the method may comprise obtaining data associated with a smart glass, a user, a sensor and a multimedia, wherein the smart glass may comprise an eye tracking camera. The data may comprise eye tracking camera data, multimedia data, sensor data, and user data. The method may further comprise detecting if the user is watching the multimedia based on image processing of the eye tracking camera data. The image processing of eye tracking camera data may comprise identifying a gaze direction. The method may further comprise computing an actual impact of the multimedia on the user watching the multimedia based on image processing of the eye tracking camera data, sensor data, and the detection. The method may further comprise generating a variance based on a comparison of the actual impact of the multimedia on the user and an expected impact of the multimedia on the user. The actual impact and the expected impact may be one of an emotional impact and a psychological impact.
[006] In yet another implementation, a non-transitory computer readable medium embodying a program executable in a computing device for determining an impact of a multimedia on a user watching the multimedia is disclosed. In one aspect, the program may comprise obtaining data associated with a smart glass, a user, a sensor, and a multimedia. The smart glass may comprise an eye tracking camera. The data may comprise eye tracking camera data, multimedia data, sensor data, and user data. The program may further comprise detecting if the user is watching the multimedia based on image processing of the eye tracking camera data. The image processing of eye tracking camera data may comprise identifying a gaze direction. The program may comprise computing an actual impact of the multimedia on the user watching the multimedia based on image processing of the eye tracking camera data, sensor data, and the detection. The actual impact may be one of an emotional impact and a psychological impact. The program may comprise generating a variance based on a comparison of the actual impact of the multimedia on the user and an expected impact of the multimedia on the user. The expected impact may be one of an emotional impact and a psychological impact.
BRIEF DESCRIPTION OF THE DRAWINGS
[007] The foregoing detailed description of embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present subject matter, an example of construction of the present subject matter is provided as figures; however, the invention is not limited to the specific method and system disclosed in the document and the figures.
[008] The present subject matter is described in detail with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to various features of the present subject matter.
[009] Figure 1 illustrates a network implementation of a system for determining an impact of a multimedia on a user watching the multimedia, in accordance with an embodiment of the present subject matter.
[010] Figure 2 illustrates the system determining the impact of a multimedia on the user watching the multimedia, in accordance with an embodiment of the present subject matter.
[011] Figure 3 illustrates a method for determining the impact of the multimedia on the user watching the multimedia, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[012] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and open-ended, in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that, as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods for determining an impact of a multimedia on a user watching the multimedia, similar or equivalent to those described herein, can be used in the practice or testing of embodiments of the present disclosure, the exemplary systems and methods for determining an impact of a multimedia on a user watching the multimedia are now described. The disclosed embodiments for determining an impact of a multimedia on a user watching the multimedia are merely examples of the disclosure, which may be embodied in various forms.
[013] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments for determining an impact of a multimedia on a user watching the multimedia. However, one of ordinary skill in the art will readily recognize that the present disclosure for determining an impact of a multimedia on a user watching the multimedia is not intended to be limited to the embodiments described, but is to be accorded the widest scope consistent with the principles and features described herein.
[014] In an implementation, a system and method for determining an impact of a multimedia on a user watching the multimedia is described. In one embodiment, data associated with a smart glass, a user, a sensor, and a multimedia may be obtained, and it is detected if the user is watching the multimedia based on image processing of the eye tracking camera data. In one example, the smart glass may comprise an eye tracking camera. In another example, the sensors may be heart beat sensors, skin sensors, and the like. In the example, the data may be obtained when the user is wearing the smart glass and watching the multimedia. Further, the data may comprise eye tracking camera data, multimedia data, sensor data, and user data. In one example, the detection may be based on image processing of eye tracking camera data for identifying the gaze direction and matching the gaze direction with the location of the multimedia.
[015] Further to obtaining the data and detecting, an actual impact of the multimedia on the user watching the multimedia may be computed. The actual impact and the expected impact may be an emotional impact, such as crying or laughing, or a psychological impact, such as sadness or anger. In an example, the actual impact and the expected impact may be of a scene, of a character, of the ambience, or of the music. In one example, the actual impact may be computed based on image processing of the eye tracking camera data. In one more example, the image processing may be based on computing the dilation of a pupil of the eye of the user due to the watching of the multimedia. In another example, the image processing may be based on computing a facial expression of the user. In yet another example, the image processing may be based on a combination of computing the dilation of a pupil of the eye of the user and the facial expression. In another example, the actual impact may be computed based on sensor data, such as rate of heart beat, skin temperature, and perspiration level.
[016] Upon computing the actual impact, a variance may be generated. The variance may be understood as the quantity of divergence of an actual result from a predefined result. The variance may be generated based on a comparison of the actual impact of the multimedia on the user and an expected impact of the multimedia on the user.
[017] Referring now to Figure 1, a network implementation of a system 102 for determining an impact of a multimedia on a user watching the multimedia, in accordance with an embodiment of the present subject matter, may be described. In one embodiment, the present subject matter is explained considering that the system 102 may be implemented as a standalone system connected to a network. It may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, a cloud-based computing environment, and the like.
[018] In one implementation, the system 102 may comprise the cloud-based computing environment in which the user may operate individual computing systems configured to execute remotely located applications. In another embodiment, the system 102 may also be implemented on a client device, hereinafter referred to as a user device 104. It may be understood that the system implemented on the client device supports a plurality of browsers and all viewports. Examples of the plurality of browsers may include, but are not limited to, Chrome™, Mozilla™, Internet Explorer™, Safari™, and Opera™. It will also be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2 … and 104-N, collectively referred to as user devices 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106. The user devices 104 are communicatively coupled to smart glasses 108-1, 108-2, 108-3 … 108-N, hereinafter referred to as smart glass 108, through the network 106.
[019] In one implementation, the network 106 may be a wireless network, a wired network, or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as an intranet, a local area network (LAN), a wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like. In one embodiment, 109 represents communication over a Wi-Fi network, and 111 refers to wireless communication requiring a line of sight, such as infrared communication.
[020] Referring now to Figure 2, the system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 may be configured to fetch and execute computer-readable instructions stored in the memory 206.
[021] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with the user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[022] The memory 206 may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[023] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include an obtaining module 212, a computing module 214, a generating module 216, and other modules 218. The other modules 218 may include programs or coded instructions that supplement applications and functions of the system 102. The modules 208 described herein may be implemented as software modules that may be executed in the cloud-based computing environment of the system 102.
[024] The memory 206, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The memory 206 may include data generated as a result of the execution of one or more modules in the other modules 218. In one implementation, the memory may include data 210. Further, the data 210 may include system data 220 for storing data processed, computed, received, and generated by one or more of the modules 208. Furthermore, the data 210 may include other data 224 for storing data generated as a result of the execution of one or more modules in the other modules 218.
[025] In one implementation, at first, a user may use the client device 104 to access the system 102 via the I/O interface 204. The user may register using the I/O interface 204 in order to use the system 102. In one aspect, the user may access the I/O interface 204 of the system 102 for obtaining information or providing input information. In one implementation, the system 102 may automatically provide information to the user through the I/O interface 204.
OBTAINING MODULE 212
[026] Referring to Figure 2, in an embodiment for determining an impact of a multimedia on a user watching the multimedia, the obtaining module 212 may obtain data associated with a smart glass, a user, a sensor, and a multimedia. Further, the smart glass may comprise an eye tracking camera. In one example, the smart glass may be a 2-dimensional glass or a 3-dimensional glass. In another example, the sensors may be heart beat sensors, skin sensors, and the like. Furthermore, the data comprises eye tracking camera data, multimedia data, sensor data, and user data. In another example, the multimedia may be a 2-dimensional multimedia or a 3-dimensional multimedia. Furthermore, the multimedia may be an image, a video, a movie, an advertisement, and the like. In one example, the multimedia data may further comprise a time of watching the multimedia, the expected impact of the multimedia on the user, and a duration of the multimedia. In the example, the user data may comprise gender, age, and location, and the eye tracking camera data may comprise image data and video data. The sensor data may comprise rate of heart beat, temperature of skin, perspiration level, etc. The obtaining module 212 may store the data in the system data 220.
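Purely as an illustration, not as the applicant's implementation, the obtained data can be pictured as a simple record grouping the four kinds of data named above. All type and field names below are hypothetical:

```python
from dataclasses import dataclass
from typing import Any, List

@dataclass
class SensorData:
    heart_rate_bpm: float       # rate of heart beat
    skin_temperature_c: float   # temperature of skin
    perspiration_level: float   # e.g. normalised to 0..1

@dataclass
class MultimediaData:
    media_type: str             # "image", "video", "movie", "advertisement"
    dimensionality: str         # "2D" or "3D"
    expected_impact: str        # e.g. "empathy"
    duration_s: float           # duration of the multimedia
    time_of_watching: str       # time of watching the multimedia

@dataclass
class UserData:
    gender: str
    age: int
    location: str

@dataclass
class ObtainedData:
    eye_frames: List[Any]       # eye tracking camera image/video frames
    multimedia: MultimediaData
    sensors: SensorData
    user: UserData
```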
[027] In the embodiment, further to obtaining the data associated with the smart glass, the user, and the multimedia, the obtaining module 212 may detect if the user is watching the multimedia. In one example, the detection may be based on image processing of eye tracking camera data for identifying the gaze direction and matching the gaze direction with the location of the multimedia. The gaze direction may be understood as the direction in which the user is looking. The obtaining module 212 may store the detection in the system data 220.
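The gaze-matching step might be realised, under assumptions of our own (unit gaze and screen-direction vectors in the smart-glass coordinate frame, and an arbitrary angular tolerance), roughly as follows; this is a minimal sketch, not the patented method:

```python
import math

def is_watching(gaze_vector, screen_direction, tolerance_deg=10.0):
    """Return True if the gaze direction matches the location of the multimedia.

    Both arguments are assumed unit 3-vectors (x, y, z); the 10-degree
    tolerance is an invented parameter, not from the specification.
    """
    dot = sum(g * s for g, s in zip(gaze_vector, screen_direction))
    dot = max(-1.0, min(1.0, dot))        # clamp to the domain of acos
    angle = math.degrees(math.acos(dot))  # angle between gaze and screen
    return angle <= tolerance_deg

# Example: gaze almost straight ahead, screen straight ahead -> watching.
print(is_watching((0.05, 0.0, 0.999), (0.0, 0.0, 1.0)))  # True
```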
COMPUTING MODULE 214
[028] In the implementation, the computing module 214 may compute an actual impact. The actual impact may be understood as the effect of watching the multimedia on a user. Further, the actual impact may be computed based on image processing of the eye tracking camera data, the sensor data, and the detection. In one example, the image processing may comprise computing the dilation of a pupil of the eye of the user. Upon computing the dilation, the percentage of dilation of the pupil may be compared with a predefined impact table to compute the actual impact. In one example, the image processing may comprise computing a facial expression of the user. Upon computing the facial expression, the expression may be compared with a predefined impact table to compute the actual impact. In another example, the image processing may comprise a combination of computing the dilation of a pupil of the eye of the user and computing a facial expression of the user. In an example, the actual impact may be computed based on the rate of heart beat, skin temperature, and perspiration level, which are compared with a predefined impact table to compute the actual impact. In one example, the actual impact may be computed while the user is watching the multimedia. In another example, the actual impact may be computed after the user has completed watching the multimedia. In one more example, the actual impact may be computed at a predefined time interval. Further, the actual impact may be an emotional impact, such as crying or laughing, or a psychological impact, such as sadness, anger, depression, or happiness. The computing module 214 may compute an actual impact and may store the actual impact in the system data 220. In one example, the actual impact may be of one or more of a scene in the multimedia, of a character in the multimedia, and of a music in the multimedia.
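As a sketch of the dilation-to-impact lookup described above: the specification does not publish the predefined impact table, so the ranges and labels below are invented for illustration only.

```python
# A toy "predefined impact table": pupil-dilation percentage ranges mapped
# to impact labels. The ranges and labels are assumptions, not from the spec.
IMPACT_TABLE = [
    (0.0, 5.0, "neutral"),
    (5.0, 15.0, "interest"),
    (15.0, 30.0, "excitement"),
    (30.0, 100.0, "fear"),
]

def pupil_dilation_percent(baseline_mm: float, current_mm: float) -> float:
    return (current_mm - baseline_mm) / baseline_mm * 100.0

def actual_impact(baseline_mm: float, current_mm: float) -> str:
    pct = pupil_dilation_percent(baseline_mm, current_mm)
    for low, high, label in IMPACT_TABLE:
        if low <= pct < high:
            return label
    return "neutral"  # fall-back when the dilation is outside every range

print(actual_impact(3.0, 3.6))  # 20% dilation -> "excitement"
```

The same table-lookup pattern would apply to facial expressions or sensor readings, with the keys changed accordingly.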
[029] Furthermore, the computing module 214 may also compute the total number of users watching the multimedia in a particular location, such as a house. The computing module 214 may store the number of users in the system data 220.
GENERATING MODULE 216
[030] In the implementation, upon computing the actual impact, the generating module 216 may generate a variance based on a comparison of the actual impact of the multimedia on the user and an expected impact of the multimedia on the user. In one example, the actual impact may be sadness, whereas the expected impact may be empathy. In one example, the actual impact and the expected impact may be of one or more of a scene in the multimedia, of a character in the multimedia, and of a music in the multimedia. Further, the generating module 216 may store the variance in the system data 220, thus enabling determination of an impact of a multimedia on a user watching the multimedia.
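Since the impacts are labels (e.g. sadness vs. empathy), quantifying the divergence requires some scale. One assumed approach, sketched below, maps each label onto a numeric intensity; the mapping itself is hypothetical:

```python
# Hypothetical mapping of impact labels to a numeric intensity scale, so the
# divergence of the actual impact from the expected one can be quantified.
IMPACT_SCALE = {"sadness": -2, "empathy": -1, "neutral": 0, "interest": 1, "happiness": 2}

def variance(actual: str, expected: str) -> int:
    """Quantity of divergence of the actual impact from the expected impact."""
    return abs(IMPACT_SCALE[actual] - IMPACT_SCALE[expected])

print(variance("sadness", "empathy"))  # -> 1
```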
[031] Further, in one more embodiment, the generating module 216 may further obtain the actual impact, the variance, and the user data associated with a plurality of users. Upon obtaining these, the generating module 216 may categorize the plurality of users into one or more user categories based on the user data. In one example, the generating module 216 may categorize the plurality of users based on gender, such as male or female. In another example, the categorization may be based on location, such as Boston and Dallas. In one more example, the categorization may be based on age, such as 1-15, 16-30, 31-45, 46-60, and 61 and above. Upon categorizing the plurality of users, the generating module 216 may compute an average actual impact of the multimedia and an average variance for the one or more user categories. The generating module 216 may further display or generate a report enabling optimization of multimedia content.
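A minimal sketch of the categorize-then-average step, assuming numeric impact scores and using the age brackets from the example above (the record layout is invented):

```python
from collections import defaultdict
from statistics import mean

def age_bracket(age: int) -> str:
    """Age categories follow the example brackets in the description."""
    for low, high in [(1, 15), (16, 30), (31, 45), (46, 60)]:
        if low <= age <= high:
            return f"{low}-{high}"
    return "61 and above"

def averages_by_category(records):
    """records: iterable of (user_age, actual_impact_score, variance) tuples."""
    groups = defaultdict(list)
    for age, impact, var in records:
        groups[age_bracket(age)].append((impact, var))
    return {
        category: (mean(i for i, _ in pairs),  # average actual impact
                   mean(v for _, v in pairs))  # average variance
        for category, pairs in groups.items()
    }

print(averages_by_category([(20, 2.0, 1), (25, 4.0, 3), (50, 1.0, 0)]))
# {'16-30': (3.0, 2), '46-60': (1.0, 0)}
```

Grouping by gender or location would follow the same pattern with a different key function.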
[032] In another embodiment, the generating module 216 may obtain the actual impact of a multimedia in 2-dimensional mode and the actual impact of the same multimedia in 3-dimensional mode. Upon obtaining the actual impacts, the generating module 216 may generate a relative impact of the multimedia on the user based on a comparison of the actual impacts. Further, the generating module 216 may store the relative impact in the system data 220.
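The specification does not define the form of the comparison; one plausible reading, assuming numeric impact scores, expresses the relative impact as a ratio:

```python
def relative_impact(impact_2d: float, impact_3d: float) -> float:
    """Relative impact of the 3-dimensional mode versus the 2-dimensional
    mode, expressed as a ratio (an assumption, not from the specification)."""
    if impact_2d == 0:
        raise ValueError("2D impact score must be non-zero")
    return impact_3d / impact_2d

print(relative_impact(2.0, 3.0))  # the 3D version scored 1.5x the 2D version
```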
[033] Furthermore, in yet another embodiment, the generating module 216 may utilize the computed number of users in a particular location, such as a house, and generate a bill based on the number of users. For example, if the cost for a multimedia is $5 and 5 users in a first house are watching the multimedia, the bill would amount to $25. If 2 users in another household are watching the multimedia, the bill would amount to $10. Thus, the system enables optimized bill generation based on the number of users.
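The billing rule reduces to a per-viewer multiplication, as in this sketch reproducing the worked example above (function and constant names are hypothetical):

```python
COST_PER_MULTIMEDIA = 5.00  # $5 per viewer, as in the worked example above

def generate_bill(num_users: int, cost_per_user: float = COST_PER_MULTIMEDIA) -> float:
    """Bill for a predefined location based on the counted number of
    smart-glass viewers."""
    return num_users * cost_per_user

print(generate_bill(5))  # -> 25.0 (first house)
print(generate_bill(2))  # -> 10.0 (second household)
```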
[034] Exemplary embodiments for determining an impact of a multimedia on a user watching the multimedia discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[035] Some embodiments enable the system and the method to accurately identify the number of viewers of a TV program.
[036] Some embodiments enable the system and the method to accurately identify the emotional state of a viewer.
[037] Some embodiments enable the system and the method to generate reports for producers, promoters, advertisers, and distributors, indicating the types of viewers and the emotional states of the users.
[038] Some embodiments enable the system and the method to assess an impact of advertisement on a viewer.
[039] Referring now to Figure 3, a method 300 for determining an impact of a multimedia on a user watching the multimedia is shown, in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
[040] The order in which the method 300 for determining an impact of a multimedia on a user watching the multimedia is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented in the above described system 102.
[041] At block 302, data associated with a smart glass, a user, and a multimedia is obtained. The user is wearing the smart glass comprising an eye tracking camera and watching the multimedia. The data comprises eye tracking camera data, multimedia data, and user data. In an implementation, the obtaining module 212 may obtain the data associated with a smart glass, a user, and a multimedia and store the data in the system data 220.
[042] At block 304, whether the user is watching the multimedia is detected based on image processing of the eye tracking camera data. In an implementation, the obtaining module 212 may detect if the user is watching the multimedia and store the detection in the system data 220.
[043] At block 306, an actual impact of the multimedia on the user may be computed based on image processing of the eye tracking camera data. In the implementation, the computing module 214 may compute an actual impact of the multimedia on the user based on image processing of the eye tracking camera data and store the actual impact in the system data 220.
[044] At block 308, a variance may be generated based on the comparison of the actual impact and an expected impact of the multimedia on the user. In the implementation, the generating module 216 may generate a variance based on the comparison of the actual impact and an expected impact of the multimedia on the user and store the variance in the system data 220.
[045] Exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include a method for determining an impact of a multimedia on a user watching the multimedia.
[046] Although implementations for methods and systems for determining an impact of a multimedia on a user watching the multimedia have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for determining an impact of a multimedia on a user watching the multimedia.

WE CLAIM
1. A method for determining an impact of a multimedia on a user watching the multimedia, the method comprising:
obtaining, by a processor, data associated with a smart glass, a sensor, a user, and a multimedia, wherein the smart glass comprises an eye tracking camera, and wherein the data comprises eye tracking camera data, multimedia data, sensor data, and user data;
detecting, by the processor, if the user is watching the multimedia based on image processing of the eye tracking camera data, wherein the image processing of eye tracking camera data comprises identifying a gaze direction, and wherein the user is wearing the smart glass for watching the multimedia;
computing, by the processor, an actual impact of the multimedia on the user watching the multimedia based on image processing of the eye tracking camera data, sensor data and the detection, and wherein the actual impact is one of an emotional impact and a psychological impact; and
generating, by the processor, a variance based on a comparison of the actual impact of the multimedia on the user and an expected impact of the multimedia on the user, and wherein the expected impact is one of an emotional impact and a psychological impact.
2. The method of claim 1, further comprises
obtaining, by the processor, the actual impact, the variance and the user data associated with a plurality of users;
categorizing, by the processor, the plurality of users into one or more user categories based on the user data; and
computing, by the processor, an average actual impact of the multimedia and an average variance for the one or more user categories.
3. The method of claim 1, further comprises matching the gaze direction with a location of the multimedia.
4. The method of claim 1, further comprises generating a relative impact of the multimedia on the user based on comparison of the actual impact of the multimedia in 2 dimensional mode and the actual impact of the multimedia in 3 dimensional mode.
5. The method of claim 1, wherein the actual impact and the expected impact is one or more of a scene in the multimedia, of a character in the multimedia, and of a music in the multimedia.
6. The method of claim 1, wherein the image processing comprises computing the dilation of a pupil of the eye of the user based on the watching of the multimedia.
7. The method of claim 1, wherein the image processing comprises computing the facial expressions of the user.
8. The method of claim 1, wherein the multimedia data further comprises one or more of a time of watching the multimedia, the expected impact of the multimedia on the user and duration of the multimedia.
9. The method of claim 1, wherein the multimedia is one of an audio, a video, an image, a movie, and an advertisement.
10. The method of claim 1, wherein the multimedia is one of a 2 dimensional multimedia and a 3 dimensional multimedia.
11. The method of claim 1, wherein the user data further comprises one or more of gender, age, and location.
12. The method of claim 1, wherein the eye tracking camera data further comprises one or more of image data and video data.
13. The method of claim 1, further comprises
computing, by the processor, a number of users wearing the smart glass and watching the multimedia in a predefined location; and
generating, by the processor, a bill based on the number of users in the predefined location.
14. A system for determining an impact of a multimedia on a user watching the multimedia, the system comprising:
a memory; and
a processor coupled to the memory, wherein the processor is capable of executing instructions to perform steps of:
obtaining data associated with a smart glass, a user, a sensor, and a multimedia, wherein the smart glass comprises an eye tracking camera, and wherein the data comprises eye tracking camera data, multimedia data, sensor data, and user data;
detecting if the user is watching the multimedia based on image processing of the eye tracking camera data, wherein the image processing of eye tracking camera data comprises identifying a gaze direction, and wherein the user is wearing the smart glass for watching the multimedia;
computing an actual impact of the multimedia on the user watching the multimedia based on image processing of the eye tracking camera data, sensor data, and the detection, and wherein the actual impact is one of an emotional impact and a psychological impact; and
generating a variance based on a comparison of the actual impact of the multimedia on the user and an expected impact of the multimedia on the user, and wherein the expected impact is one of an emotional impact and a psychological impact.
15. The system of claim 14, further comprises
obtaining the actual impact, the variance, and the user data associated with a plurality of users;
categorizing the plurality of users into one or more user categories based on the user data; and
computing an average actual impact of the multimedia and an average variance for the one or more user categories.
16. The system of claim 14, further comprises generating a relative impact of the multimedia on the user based on comparison of the actual impact of the multimedia in 2 dimensional mode and the actual impact of the multimedia in 3 dimensional mode.
17. The system of claim 14, wherein the actual impact and the expected impact is one or more of a scene in the multimedia, of a character in the multimedia, and of a music in the multimedia.
18. The system of claim 14, wherein the image processing comprises computing the dilation of a pupil of the eye of the user due to watching of the multimedia.
19. The system of claim 14, wherein the image processing comprises computing the facial expressions of the user.
20. The system of claim 14, further comprises matching the gaze direction with a location of the multimedia.
21. The system of claim 14, wherein the multimedia data further comprises one or more of a time of watching the multimedia, the expected impact of the multimedia on the user and duration of the multimedia.
22. The system of claim 14, wherein the multimedia is one of an audio, a video, an image, a movie, and an advertisement.
23. The system of claim 14, wherein the multimedia is one of a 2 dimensional multimedia and a 3 dimensional multimedia.
24. The system of claim 14, wherein the user data further comprises one or more of gender, age, and location.
25. The system of claim 14, wherein the eye tracking camera data further comprises one or more of image data and video data.
26. The system of claim 14, further comprises
computing a number of users wearing the smart glass and watching the multimedia in a predefined location; and
generating a bill based on the number of users in the predefined location.
27. A non-transitory computer program product having embodied thereon a computer program for determining an impact of a multimedia on a user watching the multimedia, the computer program product storing instructions, the instructions comprising instructions for:
obtaining data associated with a smart glass, a user, a sensor, and a multimedia, wherein the smart glass comprises an eye tracking camera, and wherein the data comprises eye tracking camera data, multimedia data, sensor data, and user data;
detecting if the user is watching the multimedia based on image processing of the eye tracking camera data, wherein the image processing of eye tracking camera data comprises identifying a gaze direction, and wherein the user is wearing the smart glass for watching the multimedia;
computing an actual impact of the multimedia on the user watching the multimedia based on image processing of the eye tracking camera data, sensor data, and the detection, and wherein the actual impact is one of an emotional impact and a psychological impact; and
generating a variance based on a comparison of the actual impact of the multimedia on the user and an expected impact of the multimedia on the user, and wherein the expected impact is one of an emotional impact and a psychological impact.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 Form 3 [14-01-2016(online)].pdf 2016-01-14
2 Form 20 [14-01-2016(online)].pdf 2016-01-14
3 Drawing [14-01-2016(online)].pdf 2016-01-14
4 Description(Complete) [14-01-2016(online)].pdf 2016-01-14
5 201611001448-Form-1-(13-05-2016).pdf 2016-05-13
6 201611001448-GPA-(13-05-2016).pdf 2016-05-13
7 201611001448-Correspondence Others-(13-05-2016).pdf 2016-05-13
8 abstract.jpg 2016-07-11
9 201611001448-FER.pdf 2020-01-20
10 201611001448-FER_SER_REPLY [13-07-2020(online)].pdf 2020-07-13
11 201611001448-COMPLETE SPECIFICATION [13-07-2020(online)].pdf 2020-07-13
12 201611001448-CLAIMS [13-07-2020(online)].pdf 2020-07-13
13 201611001448-OTHERS [13-07-2020(online)].pdf 2020-07-13
14 201611001448-FORM 13 [09-07-2021(online)].pdf 2021-07-09
15 201611001448-POA [09-07-2021(online)].pdf 2021-07-09
16 201611001448-Correspondence to notify the Controller [14-08-2021(online)].pdf 2021-08-14
17 201611001448-Response to office action [01-09-2021(online)].pdf 2021-09-01
18 201611001448-Response to office action [01-09-2021(online)]-1.pdf 2021-09-01
19 201611001448-Annexure [15-09-2021(online)].pdf 2021-09-15
20 201611001448-Written submissions and relevant documents [15-09-2021(online)].pdf 2021-09-15
21 201611001448-US(14)-HearingNotice-(HearingDate-02-09-2021).pdf 2021-10-17
22 201611001448-Proof of Right [20-10-2021(online)].pdf 2021-10-20
23 201611001448-IntimationOfGrant10-02-2022.pdf 2022-02-10
24 201611001448-PatentCertificate10-02-2022.pdf 2022-02-10
25 201611001448-RELEVANT DOCUMENTS [20-09-2023(online)].pdf 2023-09-20

Search Strategy

1 SearchStrategy_201611001448_20-01-2020.pdf
2 SearchStrategy_A201611001448AE_25-03-2021.pdf

ERegister / Renewals