
Multimedia Control

Abstract: Disclosed is a method and system for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device. The method comprises initiating multimedia on a device providing a virtual reality environment to a user, and obtaining primary sensor data from one or more sensors on a body of the user watching the multimedia on the device. The method further comprises, generating an intensity of an emotion of the user based on the primary sensor data, and modifying the multimedia based on a comparison of the intensity with a predefined intensity, thereby controlling the multimedia displayed on a device.


Patent Information

Application #:
Filing Date: 08 March 2016
Publication Number: 12/2016
Publication Type: INA
Invention Field: ELECTRICAL
Status:
Email: ip@legasis.in
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-04-15
Renewal Date:

Applicants

HCL Technologies Limited
B-39, Sector 1, Noida 201 301, Uttar Pradesh, India

Inventors

1. DHALIWAL, Jasbir Singh
HCL Technologies Limited, A-8 & 9, Sec-60, Noida, UP-201301, India
2. TAMMANA, Sankar Uma
HCL Technologies Limited, A-8 & 9, Sec-60, Noida, UP-201301, India

Specification

TECHNICAL FIELD
[001] The present subject matter described herein, in general, relates to a system and a method for controlling multimedia, and more particularly a system and a method for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device.
BACKGROUND
[002] Technologies such as Augmented Reality (AR) and Virtual Reality (VR) are gaining considerable momentum in today’s world because of the immersive, real-life experience they provide. This immersive experience greatly enhances the user’s experience by reducing the difference between reality and virtual reality.
[003] However, the reduced difference between reality and virtual reality, and the realistic experiences enabled by VR technology, have given rise to multiple problems, such as physical injury or emotional trauma. In other words, uncontrolled use of VR might result in physical injury, mental trauma, or both, due to over-immersion in the virtual reality. Making these problems even more complex is the fact that the psychological limit for enduring such virtual reality varies from user to user. This is especially true in cases where virtual reality experiences have a direct and significant bearing on the emotional state of the user, such as scary scenes or violent games. Thus, there exists a need to balance two opposing requirements: firstly, to make the content as immersive and as close to the real world as possible, and secondly, to ensure that the immersion and real-world experience do not adversely impact, emotionally or physically, the persons using the VR technology.
SUMMARY
[004] Before the present systems and methods for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device, are described, it is to be understood that this application is not limited to the particular systems, and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosures. It is also to be understood that the terminology used in the description is for the purpose of describing the particular implementations or versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce aspects related to a system and a method for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[005] In one implementation, a system for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device is disclosed. In one aspect, the system may initiate multimedia on a device providing a virtual reality environment to a user. Upon initiating, the system may obtain primary sensor data from one or more sensors located on a body of the user watching the multimedia on the device. Further to obtaining, the system may generate an intensity of an emotion of the user based on the primary sensor data. Subsequent to generating, the system may modify the multimedia based on a comparison of the intensity with a predefined intensity, thereby controlling the multimedia displayed on a device.
[006] In one implementation, a method for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device is disclosed. In one aspect, the method may comprise initiating multimedia on a device providing a virtual reality environment to a user. Upon initiating, the method may comprise obtaining primary sensor data from one or more sensors located on a body of the user watching the multimedia on the device. Further to obtaining, the method may comprise generating an intensity of an emotion of the user based on the primary sensor data. Subsequent to generating, the method may comprise modifying the multimedia based on a comparison of the intensity with a predefined intensity, thereby controlling the multimedia displayed on a device.
[007] In yet another implementation, non-transitory computer readable medium embodying a program executable in a computing device for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device is disclosed. In one aspect, the program may comprise a program code for initiating multimedia on a device providing a virtual reality environment to a user. The program may comprise a program code for obtaining primary sensor data and secondary sensor data from one or more sensors on a body of the user watching the multimedia on the device. The program may comprise a program code for generating an intensity of an emotion of the user based on the primary sensor data, and monitoring a tilting of the user based on the secondary sensor data. The program may comprise a program code for modifying the multimedia based on a comparison of the intensity with a predefined intensity. The program may comprise a program code for generating an alert when the tilting and the intensity exceed a predefined threshold, thereby controlling multimedia displayed on the device.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The foregoing detailed description of embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present subject matter, an example of construction of the present subject matter is provided as figures; however, the invention is not limited to the specific method and system disclosed in the document and the figures.
[009] The present subject matter is described in detail with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features of the present subject matter.
[010] Figure 1 illustrates a network implementation of a system for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device, in accordance with an embodiment of the present subject matter.
[011] Figure 2 illustrates the system and its subcomponents for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device, in accordance with an embodiment of the present subject matter.
[012] Figure 3 illustrates a method for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[013] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that, as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device, similar or equivalent to those described herein, can be used in the practice or testing of embodiments of the present disclosure, the exemplary systems and methods for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device are now described. The disclosed embodiments for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device are merely examples of the disclosure, which may be embodied in various forms.
[014] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device. However, one of ordinary skill in the art will readily recognize that the present disclosure for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device is not intended to be limited to the embodiments described, but is to be accorded the widest scope consistent with the principles and features described herein.
[015] In an implementation, a system and method for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device is described. In an embodiment, multimedia may be initiated on a device. In one example, the device may be configured to provide a virtual reality environment to a user. Upon initiating the multimedia, primary sensor data and secondary sensor data may be obtained. In one example, the primary sensor data and the secondary sensor data may be obtained from one or more sensors. Further, the one or more sensors may be located on a body of the user watching the multimedia on the device. Subsequent to obtaining the primary sensor data and the secondary sensor data, an intensity of an emotion of the user may be generated based on the primary sensor data, and a tilting of the user may be monitored based on the secondary sensor data. Upon generating the intensity and monitoring the tilting, the multimedia may be modified based on a comparison of the intensity with a predefined intensity, and an alert may be generated when the tilting and the intensity exceed a predefined threshold, thereby controlling the multimedia displayed on the device.
[016] Referring now to Figure 1, a network implementation 100 of a system 102 for controlling multimedia displayed on a device 104 based upon a set of parameters associated with a user of the device 104, in accordance with an embodiment of the present subject matter may be described. In one embodiment, the present subject matter is explained considering that the system 102 may be implemented as a system 102 installed within a central server 112 connected to a network 106. It may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, a cloud-based computing environment and the like.
[017] In one embodiment, the system 102 may also be implemented in the device 104. In one example, the device 104 may be a virtual reality headset such as PlayStation VR™, Oculus Rift™, and HTC Vive™. It may also be understood that the system 102 implemented on the device 104 supports a plurality of browsers and all viewports. Examples of the plurality of browsers may include, but are not limited to, Chrome™, Mozilla™, Internet Explorer™, Safari™, and Opera™. It will also be understood that the system 102 may be accessed by multiple users through one or more controllers 114. Further, the system 102 may be communicatively coupled to sensors 110 located on a user. Furthermore, the system 102 may be communicatively coupled to a database 108. In one example, the device 104 may obtain the multimedia to be viewed from the database 108 via the network 106. In one example, the database may be a relational database or the like. In one embodiment, the device 104, the controller 114, the database 108, and the sensors 110 are communicatively coupled to the system 102 through the network 106.
[018] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), Wireless Personal Area Network (WPAN), Wireless Local Area Network (WLAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, MQ Telemetry Transport (MQTT), Extensible Messaging and Presence Protocol (XMPP), Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[019] In one embodiment, the network system 100 for controlling multimedia displayed on the device 104 based upon a set of parameters associated with a user of the device 104 may comprise one or more sensors 110 located on a body of the user. In one example, the sensors may include heart rate sensors, body temperature sensors, acceleration sensors, proximity sensors, tilt sensors, gyroscopes, and perspiration sensors. Further, the network system 100 may comprise a wearable virtual reality device for displaying multimedia in a virtual reality environment to the user. In one example, the multimedia may be a video or a game. Furthermore, the network system 100 may comprise a system 102 communicatively coupled with the one or more sensors 110 and the device 104, for example a wearable virtual reality device.
[020] In the embodiment, the system 102 may initiate the multimedia on the wearable virtual reality device 104 in the virtual reality environment. Upon initiating, the system 102 may obtain primary sensor data and secondary sensor data from the sensors 110. Further to obtaining, the system 102 may generate an intensity of an emotion of the user based on the primary sensor data, and monitor a tilting of the user based on the secondary sensor data. Subsequent to generating and monitoring, the system 102 may modify the multimedia based on a comparison of the intensity with a predefined intensity. In one example, the modification may comprise one or more of changing the multimedia, usage of alternate characters in the multimedia, usage of alternate scenes in the multimedia, changing a resolution of the multimedia, changing a colour of the multimedia, changing a brightness of the multimedia, altering a volume of a sound being played in the background of the multimedia, and altering a pitch of a sound being played in the background of the multimedia. Upon modifying, the system may generate an alert when the tilting and the intensity exceed a predefined threshold, thereby controlling the multimedia. Furthermore, the immersive experience of the multimedia in the virtual environment may be maximized and controlled without any physical or mental effect on the user.
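The monitoring cycle described in this embodiment can be viewed as a simple sense-evaluate-act loop. The following is a minimal, non-limiting sketch; the callback names, the 0-10 intensity scale, and the 45-degree tilt threshold are assumptions chosen for illustration and are not fixed by the specification.

```python
# Illustrative sketch of one iteration of the monitoring cycle.
# All interfaces (read_primary, modify, alert, ...) are hypothetical
# placeholders for the sensor and playback integrations.

def control_loop(read_primary, read_secondary, intensity_of, tilt_of,
                 modify, alert, predefined_intensity=7.0, tilt_threshold=45.0):
    """One iteration: sense, evaluate intensity and tilt, then act."""
    primary = read_primary()          # e.g. heart rate, body temperature
    secondary = read_secondary()      # e.g. acceleration, orientation
    intensity = intensity_of(primary)
    tilt = tilt_of(secondary)
    if intensity > predefined_intensity:
        modify()                      # soften the multimedia
    if tilt > tilt_threshold and intensity > predefined_intensity:
        alert()                       # e.g. vibration or sound, per this embodiment
    return intensity, tilt
```

In a real deployment this loop would run continuously while the multimedia plays, with the callbacks wired to the sensors 110 and the device 104.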
[021] Referring now to Figure 2, the system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 may be configured to fetch and execute computer-readable instructions stored in the memory 206.
[022] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with the user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[023] The memory 206 may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[024] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include an obtaining module 212, a generating module 214, a modifying module 216 and other module 218. The other modules 218 may include programs or coded instructions that supplement applications and functions of the system 102. The modules 208 described herein may be implemented as software modules that may be executed in the cloud-based computing environment of the system 102.
[025] The memory 206, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The memory 206 may include data generated as a result of the execution of one or more modules in the other modules 218. In one implementation, the memory may include data 210. Further, the data 210 may include a system data 220 for storing data processed, computed, received, and generated by one or more of the modules 208. Furthermore, the data 210 may include other data 224 for storing data generated as a result of the execution of one or more modules in the other modules 218.
[026] In one implementation, at first, a user may use the controller 114 and the device 104 to access the system 102 via the I/O interface 204. The user may register using the I/O interface 204 in order to use the system 102. In one aspect, the user may access the I/O interface 204 of the system 102 for obtaining information or providing input information. In one implementation, the system 102 may automatically provide information to the user through the I/O interface 204 and the device 104.
OBTAINING MODULE 212
[027] Referring to figure 2, in an embodiment the obtaining module 212 may initiate multimedia on a device 104 providing a virtual reality environment to a user. In one other embodiment, the obtaining module 212 may obtain the multimedia from the database 108 and provide the multimedia to the device 104 for initiating the multimedia on the device 104. In one example, the device 104 may be a wearable virtual reality headset such as a PlayStation VR™ headset, Oculus Rift™ headset, and HTC Vive™ headset. In one more example, the multimedia may be an image, a video, a movie, a game, an e-learning module, and a live broadcast. Further, the obtaining module 212 may track the multimedia being played on the device 104.
[028] In the embodiment, upon initiating the multimedia, the obtaining module 212 may obtain primary sensor data and secondary sensor data from one or more sensors 110 located on a body of the user watching the multimedia on the device. In one example, the primary sensor data may comprise heart beat rate, perspiration level, body temperature, and dilation of the pupil. The secondary sensor data may comprise acceleration data, proximity data, orientation data, and movement data. In one example, the sensors may include heart rate sensors, body temperature sensors, acceleration sensors, proximity sensors, tilt sensors, gyroscopes, and perspiration sensors. Further, the obtaining module 212 may store the obtained primary sensor data and secondary sensor data in the system data 220.
GENERATION MODULE 214
[029] In the implementation, further to obtaining the primary sensor data and the secondary sensor data, the generation module 214 may generate an intensity of an emotion of the user based on the primary sensor data. The intensity of the emotion may be understood as the quantitative value of the emotional response exhibited by the user while watching the multimedia in the virtual reality environment. In an example, the intensity may be generated based on a weighted matrix approach and the primary sensor data. In the example, a high heart rate, high body temperature, and high perspiration level may be indicative of a high intensity of user emotion in response to the multimedia. In one example, the intensity may be generated as a numerical value, such as 5 on a predefined scale of 0-10. In one other example, the intensity may be generated as a percentile, such as 30. Further, the generation module 214 may store the intensity in the system data 220.
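The weighted-matrix generation of intensity described above might be sketched as a weighted sum of normalized sensor readings mapped onto the 0-10 scale. The sensor names, weights, and normalization ranges below are assumptions chosen for illustration; the specification does not fix particular values.

```python
# Illustrative weighted-matrix intensity computation.
# Weights and physiological ranges are assumed, not specified.

def emotion_intensity(primary, weights=None):
    """Map primary sensor readings to an intensity on a 0-10 scale."""
    if weights is None:
        weights = {"heart_rate": 0.4, "body_temp": 0.3, "perspiration": 0.3}
    # Assumed baselines and maxima used to normalize each reading to [0, 1].
    ranges = {
        "heart_rate": (60.0, 180.0),    # beats per minute
        "body_temp": (36.5, 39.0),      # degrees Celsius
        "perspiration": (0.0, 100.0),   # relative sensor units
    }
    score = 0.0
    for key, weight in weights.items():
        low, high = ranges[key]
        norm = (primary[key] - low) / (high - low)
        score += weight * min(max(norm, 0.0), 1.0)  # clamp to [0, 1]
    return round(score * 10, 1)  # scale to the 0-10 intensity range
```

Here resting readings map to intensity 0.0 and simultaneously elevated readings map toward 10.0, matching the example in which high heart rate, temperature, and perspiration indicate high emotional intensity.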
[030] Along with generating the intensity, the generation module 214 may also monitor a tilting of the user based on the secondary sensor data. The tilting may be understood as a change in orientation of the user with respect to the ground. The tilting may comprise tilting forwards, tilting backwards, tilting sideways, and twisting. Furthermore, tilting may also be understood as a change in orientation of the user that may cause physical injury when it exceeds a predefined threshold. In one example, the generation module 214 may monitor the tilting based on the acceleration data and the orientation data in the secondary sensor data. Further, the generation module 214 may store the tilting in the system data 220.
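The tilt monitoring described above can be illustrated with a standard accelerometer computation, in which the tilt angle is the angle between the measured gravity vector and the vertical. The axis convention and the 45-degree threshold below are assumptions for illustration only.

```python
import math

# Illustrative tilt computation from accelerometer data.
# Assumes readings in g units with the z axis pointing up when upright.

def tilt_angle(ax, ay, az):
    """Angle in degrees between the measured gravity vector and vertical."""
    horizontal = math.hypot(ax, ay)          # magnitude in the ground plane
    return math.degrees(math.atan2(horizontal, az))

def tilt_exceeded(ax, ay, az, threshold_deg=45.0):
    """True when the user has tilted beyond the assumed safety threshold."""
    return tilt_angle(ax, ay, az) > threshold_deg
```

An upright user yields an angle near 0 degrees; a user leaning far forwards or sideways yields a large angle that can be compared against the predefined threshold.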
MODIFICATION MODULE 216
[031] In the implementation, upon generating the intensity and monitoring the tilting, the modification module 216 may modify the multimedia based on a comparison of the intensity with a predefined intensity. In an example, the predefined intensity may be a range, and the multimedia may be modified if the intensity exceeds or falls below the predefined intensity. In one example, modifying the multimedia may comprise one or more of changing the multimedia, usage of alternate or different characters in the multimedia, usage of alternate scenes in the multimedia, changing a resolution of the multimedia, changing a colour of the multimedia, changing a brightness of the multimedia, altering a volume of a sound being played in the background of the multimedia, and altering a pitch of a sound being played in the background of the multimedia.
[032] In an example implementation, a child may get scared and may demonstrate a high intensity of emotion while watching a movie about dinosaurs in a virtual environment. In such an example, the movie may be modified by the modification module 216. Further, the modification may be changing the movie, usage of alternate or different dinosaurs, usage of alternate scenes in the movie, changing a resolution of the movie, changing a colour of the dinosaurs, changing a brightness of the movie, altering a volume of the sound of the dinosaurs, altering a pitch of the sound of the dinosaurs, and the like. Further, the modification module 216 may store the modification in the system data 220.
[033] In one example, the modification module 216 may modify the multimedia so as to achieve an Ideal Immersive Experience. In the example, the Ideal Immersive Experience may be understood as a predefined range of the intensity of emotion within which an effective immersive experience may be obtained in the virtual environment. In the example, based on the modification of the multimedia, the intensity of emotion may be increased or decreased to achieve the Ideal Immersive Experience. Further, the modification module 216 may determine an emotional threshold of the user based on the intensity of the emotional response of the user over a predefined time interval. The emotional threshold may be a range. In one embodiment, the predefined intensity may be replaced by the emotional threshold, once determined by the modification module 216.
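The comparison-and-modification step, including steering the intensity back into the Ideal Immersive Experience range, might be sketched as follows. The target range and the particular modification actions are illustrative assumptions drawn from the examples above, not a definitive implementation.

```python
# Illustrative selection of modification actions based on where the
# measured intensity falls relative to an assumed target range.

def choose_modification(intensity, target_range=(3.0, 7.0)):
    """Pick modifications that nudge the intensity back into the range
    associated with the Ideal Immersive Experience."""
    low, high = target_range
    if intensity > high:
        # Over-immersion: soften the content.
        return ["reduce_brightness", "lower_volume", "use_alternate_scene"]
    if intensity < low:
        # Under-immersion: intensify the content.
        return ["increase_brightness", "raise_volume"]
    return []  # intensity within range: no change needed
```

Replacing `target_range` with a per-user emotional threshold, once learned over a predefined time interval, corresponds to the personalization described in this example.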
[034] In the embodiment, along with modification of the multimedia, the modification module 216 may generate an alert. In one example, the alert may be generated when the tilting or the intensity exceeds a predefined threshold. In one example, the generation of the alert may protect the user from physical or mental harm and injury. Further, in one other example, the alert is one of a vibration at one or more locations on the body of the user, or a sound.
[035] Exemplary embodiments for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[036] Some embodiments of the system and the method enable ideal immersive experience in the virtual reality environment.
[037] Some embodiments of the system and the method enable safety from physical injury due to over immersion in virtual reality environment.
[038] Some embodiments of the system and the method enable determination of the psychological threshold of the users which may ultimately be used for judging and enhancing user personality traits.
[039] Some embodiments of the system and the method enable identification of emotional equilibrium in real time.
[040] Referring now to Figure 3, a method 300 for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device is shown, in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
[041] The order in which the method 300 for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device as described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented in the above described system 102.
[042] At block 302, multimedia may be initiated on a device providing a virtual reality environment to a user. In an implementation, the obtaining module 212 may initiate the multimedia and store entry-point data in the system data 220.
[043] At block 304, primary sensor data may be obtained from one or more sensors on a body of the user watching the multimedia on the device. In the implementation, the obtaining module 212 may obtain the primary sensor data from the one or more sensors on the body of the user watching the multimedia on the device. The obtaining module 212 may store the primary sensor data in the system data 220.
[044] At block 306, an intensity of an emotion of the user may be generated based on the primary sensor data. In the implementation, the generation module 214 may generate an intensity of an emotion of the user based on the primary sensor data and store the intensity in the system data 220.
[045] At block 308, the multimedia may be modified based on a comparison of the intensity with a predefined intensity, thereby controlling the multimedia displayed on a device. In the implementation, the modification module 216 may modify the multimedia based on a comparison of the intensity with a predefined intensity, thereby controlling the multimedia displayed on a device. The modification module 216 may store the modification in the system data 220.
[046] Exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include a method and system for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device.
[047] Although implementations for methods and systems for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device.

Claims:
1. A method for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device, the method comprising:
initiating, by a processor, multimedia on a device providing a virtual reality environment to a user;
obtaining, by the processor, primary sensor data from one or more sensors on a body of the user watching the multimedia on the device;
generating, by the processor, an intensity of an emotion of the user based on the primary sensor data; and
modifying, by the processor, the multimedia based on a comparison of the intensity with a predefined intensity, thereby controlling the multimedia displayed on a device.

2. The method of claim 1, further comprising:
obtaining, by the processor, secondary sensor data from the one or more sensors;
monitoring, by the processor, a tilting of the user based on the secondary sensor data; and
generating, by the processor, an alert when one of the tilting and the intensity exceeds a predefined threshold.

3. The method of claim 1, further comprising:
determining, by the processor, an emotional threshold of the user based on the intensity of the emotional response of the user over a predefined time interval.

4. The method of claim 1, wherein the one or more sensors are heart rate sensors, body temperature sensor, acceleration sensor, proximity sensor, tilt sensor, gyroscope, and perspiration sensor.

5. The method of claim 1, wherein the alert is one of a vibration at one or more locations on the body of the user, or a sound.

6. The method of claim 1, wherein the multimedia is one of an image, a video, a movie, a game, an e-learning module, and a live broadcast.

7. The method of claim 1, wherein the modifying the multimedia comprises one or more of changing the multimedia, usage of alternate characters in the multimedia, usage of alternate scenes in the multimedia, changing a resolution of the multimedia, changing a colour of the multimedia, changing a brightness of the multimedia, altering a volume of a sound being played in the background of the multimedia, and altering a pitch of a sound being played in the background of the multimedia.

8. A system for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device, the system comprising:
a memory; and
a processor coupled to the memory, wherein the processor is capable of executing instructions to perform steps of:
initiating multimedia on a device providing a virtual reality environment to a user;
obtaining primary sensor data from one or more sensors on a body of the user watching the multimedia on the device;
generating an intensity of an emotion of the user based on the primary sensor data; and
modifying the multimedia based on a comparison of the intensity with a predefined intensity, thereby controlling the multimedia displayed on the device.

9. The system of claim 8, wherein the processor is further capable of executing instructions to perform steps of:
obtaining secondary sensor data from the one or more sensors;
monitoring a tilting of the user based on the secondary sensor data; and
generating an alert when the tilting exceeds a predefined threshold.

10. The system of claim 8, wherein the processor is further capable of executing instructions to perform steps of:
determining an emotional threshold of the user based on the intensity of the emotional response of the user over a predefined time interval.

11. The system of claim 8, wherein the one or more sensors are heart rate sensors, body temperature sensors, acceleration sensors, proximity sensors, tilt sensors, gyroscope sensors, and perspiration sensors.

12. The system of claim 9, wherein the alert is one of a vibration at one or more locations on the body of the user and a sound.

13. The system of claim 8, wherein the multimedia is one of an image, a video, a movie, a game, an e-learning module, and a live broadcast.

14. The system of claim 8, wherein the modifying the multimedia comprises one or more of changing the multimedia, usage of alternate characters in the multimedia, usage of alternate scenes in the multimedia, changing a resolution of the multimedia, changing a colour of the multimedia, changing a brightness of the multimedia, altering a volume of a sound being played in the background of the multimedia, and altering a pitch of a sound being played in the background of the multimedia.

15. A non-transitory computer program product having embodied thereon a computer program for controlling multimedia displayed on a device based upon a set of parameters associated with a user of the device, the computer program comprising instructions for:
initiating multimedia on a device providing a virtual reality environment to a user;
obtaining primary sensor data and secondary sensor data from one or more sensors on a body of the user watching the multimedia on the device;
generating an intensity of an emotion of the user based on the primary sensor data, and monitoring a tilting of the user based on the secondary sensor data;
modifying the multimedia based on a comparison of the intensity with a predefined intensity; and
generating an alert when one of the tilting and the intensity exceeds a predefined threshold, thereby controlling the multimedia displayed on the device.
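The control loop recited in claim 1 (obtain sensor data, estimate an emotion intensity, modify the multimedia when the intensity exceeds a predefined value) can be illustrated with a minimal sketch. The specification does not prescribe any particular intensity model or function names; every identifier below (`estimate_intensity`, `control_multimedia`, the weighting constants, and the modification labels) is a hypothetical placeholder chosen for illustration only.

```python
# Illustrative sketch of the claimed control loop; all names and weights
# are assumptions, not taken from the specification.

def estimate_intensity(heart_rate_bpm, skin_temp_c, perspiration_level):
    """Map raw body-sensor readings to a 0..1 emotion-intensity score.

    A simple weighted normalisation; the claims leave the model open,
    so this is only an assumed placeholder.
    """
    hr_score = min(max((heart_rate_bpm - 60) / 80.0, 0.0), 1.0)
    temp_score = min(max((skin_temp_c - 36.5) / 2.0, 0.0), 1.0)
    sweat_score = min(max(perspiration_level, 0.0), 1.0)
    return 0.5 * hr_score + 0.2 * temp_score + 0.3 * sweat_score

def control_multimedia(sensor_sample, predefined_intensity=0.7):
    """Return modifications to apply when intensity exceeds the threshold.

    The modification labels mirror the options listed in claim 7
    (brightness, volume, alternate scenes); an empty list means the
    multimedia plays on unchanged.
    """
    intensity = estimate_intensity(**sensor_sample)
    if intensity > predefined_intensity:
        return ["reduce_brightness", "lower_volume", "use_alternate_scene"]
    return []

# An elevated sample triggers modification; a calm one does not.
excited = {"heart_rate_bpm": 132, "skin_temp_c": 37.6, "perspiration_level": 0.8}
calm = {"heart_rate_bpm": 70, "skin_temp_c": 36.6, "perspiration_level": 0.1}
print(control_multimedia(excited))  # modifications applied
print(control_multimedia(calm))     # no change
```

In this reading, the "predefined intensity" of claim 1 is simply a scalar threshold, and claim 2's tilt-based alert would be a second branch of the same loop fed by the secondary sensor data.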

Documents

Application Documents

# Name Date
1 Form 9 [08-03-2016(online)].pdf 2016-03-08
2 Form 3 [08-03-2016(online)].pdf 2016-03-08
3 Form 20 [08-03-2016(online)].pdf 2016-03-08
4 Form 18 [08-03-2016(online)].pdf 2016-03-08
5 Drawing [08-03-2016(online)].pdf 2016-03-08
6 Description(Complete) [08-03-2016(online)].pdf 2016-03-08
7 Form 26 [06-07-2016(online)].pdf 2016-07-06
8 201611008107-GPA-(11-07-2016).pdf 2016-07-11
9 201611008107-Form-1-(11-07-2016).pdf 2016-07-11
10 201611008107-Correspondence Others-(11-07-2016).pdf 2016-07-11
11 abstract.jpg 2016-07-14
12 201611008107-FER.pdf 2020-02-03
13 201611008107-OTHERS [23-07-2020(online)].pdf 2020-07-23
14 201611008107-FER_SER_REPLY [23-07-2020(online)].pdf 2020-07-23
15 201611008107-COMPLETE SPECIFICATION [23-07-2020(online)].pdf 2020-07-23
16 201611008107-CLAIMS [23-07-2020(online)].pdf 2020-07-23
17 201611008107-POA [09-07-2021(online)].pdf 2021-07-09
18 201611008107-FORM 13 [09-07-2021(online)].pdf 2021-07-09
19 201611008107-Proof of Right [22-10-2021(online)].pdf 2021-10-22
20 201611008107-US(14)-HearingNotice-(HearingDate-29-01-2024).pdf 2023-12-29
21 201611008107-FORM-26 [19-01-2024(online)].pdf 2024-01-19
22 201611008107-Correspondence to notify the Controller [19-01-2024(online)].pdf 2024-01-19
23 201611008107-Written submissions and relevant documents [13-02-2024(online)].pdf 2024-02-13
24 201611008107-RELEVANT DOCUMENTS [13-02-2024(online)].pdf 2024-02-13
25 201611008107-RELEVANT DOCUMENTS [13-02-2024(online)]-2.pdf 2024-02-13
26 201611008107-RELEVANT DOCUMENTS [13-02-2024(online)]-1.pdf 2024-02-13
27 201611008107-PETITION UNDER RULE 137 [13-02-2024(online)].pdf 2024-02-13
28 201611008107-PETITION UNDER RULE 137 [13-02-2024(online)]-2.pdf 2024-02-13
29 201611008107-PETITION UNDER RULE 137 [13-02-2024(online)]-1.pdf 2024-02-13
30 201611008107-PatentCertificate15-04-2024.pdf 2024-04-15
31 201611008107-IntimationOfGrant15-04-2024.pdf 2024-04-15

Search Strategy

1 Searchstrategy201611008107AE_01-12-2022.pdf
2 2020-01-3116-31-18_31-01-2020.pdf

ERegister / Renewals

3rd: 22 May 2024 (08/03/2018 to 08/03/2019)
4th: 22 May 2024 (08/03/2019 to 08/03/2020)
5th: 22 May 2024 (08/03/2020 to 08/03/2021)
6th: 22 May 2024 (08/03/2021 to 08/03/2022)
7th: 22 May 2024 (08/03/2022 to 08/03/2023)
8th: 22 May 2024 (08/03/2023 to 08/03/2024)
9th: 22 May 2024 (08/03/2024 to 08/03/2025)
10th: 06 Mar 2025 (08/03/2025 to 08/03/2026)