Abstract: In-game advertising (IGA) refers to the use of computer and video games as a medium to deliver advertising. Currently, advertisements displayed inside an electronic game are intrusive, do not complement the gameplay, and are not context sensitive. A system and method for displaying context aware advertisements during an electronic gameplay has been provided. The system is configured to enhance the user experience of a player playing the electronic game. The system is configured to advertise a relevant advertisement matching the context and story of the game in the game's native world setting. The system can be used to place manual and automated advertisements inside the game. The advertisements are either time based (game time) or event based (in-game event or advertisement world event). Advertisements act as complementing elements, adding a resource, game item or game-hint to the player during the gameplay without disrupting the player's gameplay experience. [To be published with FIG. 2]
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION (See Section 10 and Rule 13)
Title of invention:
METHOD AND SYSTEM FOR DISPLAYING CONTEXT AWARE ADVERTISEMENTS DURING AN ELECTRONIC GAMEPLAY
Applicant
Tata Consultancy Services Limited A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman point, Mumbai 400021,
Maharashtra, India
Preamble to the description
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD [001] The disclosure herein generally relates to the field of enhancing user experience while providing in-game advertising, and, more particularly, to a method and system for displaying context aware advertisements during an electronic gameplay.
BACKGROUND
[002] In-game advertising (IGA) refers to the use of computer and video games as a medium in which to deliver advertising. Advertisers see IGA as a prime way to target a certain age group, who are increasingly neglecting television in favor of computer and video games (electronic games). Aside from the ability to reach an ever-growing audience, the appeal of IGA for advertisers also lies in the long shelf-life and high replay value of electronic games.
[003] IGA has advanced from a static towards a dynamic advertising medium. Early examples of in-game advertising were static. Some of these consisted of virtual billboards, whereas others could be considered in-game product placement. These advertisements were placed directly into the game by artists or programmers and could not be changed later. Similar to how websites rotate different advertisements through the same fixed space, dynamic in-game advertising can change the messaging on the same billboard in the electronic game. This requires inserting billboard space into the game, as well as an internet-connected gaming system.
[004] Increased internet connectivity has led to the growth of dynamic in-game advertising. Unlike the fixed advertisements found in static in-game advertising, dynamic advertisements can be altered remotely by an advertising agency. Advertisements can be tailored according to various parameters such as geographical location or time of day. Metrics such as time spent looking at advertisements and the type of advertisement may be used to better formulate future campaigns, and also allow advertisers to offer more flexible advertising campaigns to their clients. Currently, many advertisements advertised inside video games are intrusive, non-congruent, non-complementing the gameplay, and static.
SUMMARY
[005] Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one embodiment, a system for displaying context aware advertisements during an electronic gameplay is provided. The system comprises an input/output interface, one or more hardware processors and a memory in communication with the one or more hardware processors, wherein the one or more hardware processors are configured to execute programmed instructions stored in the memory, to: identify one or more in-game advertising spaces in the game; design the identified in-game advertising spaces using a customizable set of specifications; generate a first set of mood tags indicating the mood of the user based on gameplay data obtained from the game using a flow theory model, a GameFlow model and one or more processing techniques; generate a second set of mood tags indicating the mood of the user based on external sensors; determine one or more in-game events using one or more natural language processing (NLP) techniques; analyze the context of the game using at least one or more of the first set of mood tags, the second set of mood tags, a set of game level tags, and the one or more in-game events processed with the help of the NLP techniques; choose a set of objects based on the context of the game; type-cast advertisement data into the chosen set of objects, wherein the advertisement data is selected based on advertisement characterization; convert the type-casted advertisement data into an advertisement form to be rendered in the in-game space; and display the converted advertisement form in the designed in-game advertisement space when the player is in the flow channel.
[006] In another aspect, a method for displaying context aware advertisements during an electronic gameplay is provided. Initially, one or more in-game advertising spaces are identified in the game. Further, the identified in-game advertising spaces are designed using a customizable set of specifications. In the next step, a first set of mood tags indicating the mood of the user is generated based on gameplay data obtained from the game using a flow theory model, a GameFlow model and one or more processing techniques. A second set of mood tags indicating the mood of the user is generated based on external sensors. Further, one or more in-game events are determined using one or more natural language processing (NLP) techniques. In the next step, the context of the game is analyzed using at least one or more of the first set of mood tags, the second set of mood tags, a set of game level tags, and the one or more in-game events processed with the help of the NLP techniques. Later, a set of objects is chosen based on the context of the game. Advertisement data is then type-casted into the chosen set of objects, wherein the advertisement data is selected based on advertisement characterization. The type-casted advertisement data is then converted into an advertisement form to be rendered in the in-game space. Finally, the converted advertisement form is displayed in the designed in-game advertisement space when the player is in the flow channel.
[007] In yet another aspect, one or more non-transitory machine readable information storage mediums comprising one or more instructions which when executed by one or more hardware processors cause displaying context aware advertisements during an electronic gameplay are provided. Initially, one or more in-game advertising spaces are identified in the game. Further, the identified in-game advertising spaces are designed using a customizable set of specifications. In the next step, a first set of mood tags indicating the mood of the user is generated based on gameplay data obtained from the game using a flow theory model, a GameFlow model and one or more processing techniques. A second set of mood tags indicating the mood of the user is generated based on external sensors. Further, one or more in-game events are determined using one or more natural language processing (NLP) techniques. In the next step, the context of the game is analyzed using at least one or more of the first set of mood tags, the second set of mood tags, a set of game level tags, and the one or more in-game events processed with the help of the NLP techniques. Later, a set of objects is chosen based on the context of the game. Advertisement data is then type-casted into the chosen set of objects, wherein the advertisement data is selected based on advertisement characterization. The type-casted advertisement data is then converted into an advertisement form to be rendered in the in-game space. Finally, the converted advertisement form is displayed in the designed in-game advertisement space when the player is in the flow channel.
[008] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
[010] FIG. 1 illustrates a network diagram of a system for displaying context aware advertisements during an electronic gameplay according to some embodiments of the present disclosure.
[011] FIG. 2 illustrates an architecture of the system of FIG. 1 for displaying context aware advertisements during an electronic gameplay according to some embodiments of the present disclosure.
[012] FIG. 3 is a flowchart of a method for displaying context aware advertisements during an electronic gameplay according to some embodiments of the present disclosure.
[013] FIG. 4 is a graphical representation of flow theory to be used for detecting player mood and generating mood tags for displaying context aware advertisements. The diagram also shows the flow channel within which the advertisements are displayed for receiving user’s ideal attention during an electronic gameplay in accordance with some embodiments of the present disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS [014] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever
convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the scope of the disclosed embodiments.
[015] Currently, advertisements displayed inside an electronic game (computer, mobile or video games) are intrusive, do not complement the gameplay, and are not context sensitive. In addition, they are static and non-congruent, due to which gamers have shown a distaste for advertisements that distract them while they are trying to immerse themselves in the game world. Gamers are more likely to respond favorably to advertisements and products which are congruent with the game environment.
[016] According to an embodiment of the disclosure, a technical solution is provided to solve the technical problem mentioned above. A system and method for displaying context aware advertisements during an electronic gameplay has been provided. The system may also be referred to as an in-game advertising engine (IGAE). The system is configured to enhance the user experience of the person playing the electronic game. The system is configured to advertise a relevant advertisement matching the context and story of the game in the game's native world setting. The IGAE is configured to place advertisements inside the game, and can be configured to work with existing game engines. It is used for designing in-game advertisement spaces and placing manual and automated advertisements inside the game.
[017] Referring now to the drawings, and more particularly to FIG. 1 through FIG. 4, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[018] According to an embodiment of the disclosure, a network diagram of a system 100 for displaying context aware advertisements during an electronic gameplay is shown in FIG. 1. In an example, the system 100 may also be referred to as an in-game advertisement engine (IGAE). The IGAE enables one or more new methods of in-game advertisement placement. It is used for designing in-game advertisement spaces and placing manual and automated advertisements inside the electronic gameplay.
[019] The advertisements are either time based (game time), event based (in-game event or an advertisement world event), or based on other constructs. Advertisements can be displayed when the gameplay corresponds to the flow model by Mihaly Csikszentmihalyi. Advertisements act as complementing elements, adding a resource, game item or game-hint to the player during the gameplay without disrupting the player's gameplay experience.
[020] The in-game advertisements engine (IGAE) enables the game developers and the advertisers to display advertisements conforming to the above-mentioned constraints. The IGAE tracks the player's interaction with the advertisements inside the game using an advertisement report. The system 100 also helps the players reach out to an interested brand from the game. The advertisement can be changed dynamically depending on the change in the context of the game. When a gamer reaches the required destination or level, an advertisement is placed in the game world to enable the player to relate the advertisement with victory in the game.
[021] The system 100 is implemented on one or more computing devices 102, such as a laptop computer, a desktop computer, a notebook, a workstation, a cloud-based computing environment and the like. It will be understood that the system 100 may be accessed through an input/output interface referred to as I/O interface 104. Examples of the I/O interface 104 may include, but are not limited to, a user interface, a portable computer, a personal digital assistant, a handheld device, a smartphone, a tablet computer, a workstation and the like. The I/O interface 104 is communicatively coupled to the system 100 through a network 106.
[022] In an embodiment, the network 106 may be a wireless or a wired network, or a combination thereof. In an example, the network 106 can be implemented as a computer network, as one of the different types of networks, such as virtual private network (VPN), intranet, local area network (LAN), wide area network (WAN), the internet, and such. The network 106 may either be a dedicated
network or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), and Wireless Application Protocol (WAP), to communicate with each other. Further, the network may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices. The network devices within the network may interact with the system 100 through communication links.
[023] According to an embodiment of the disclosure, the system 100 further comprises one or more hardware processors 108, a memory 110 in communication with the one or more hardware processors 108 and a data repository 112. The one or more hardware processors 108 are configured to execute programmed instructions stored in the memory 110, to perform various functions as explained in the later part of the disclosure. The data repository 112 may store data processed, received, and generated by the system 100.
[024] The system 100 supports various connectivity options such as BLUETOOTH®, USB, ZigBee and other cellular services. The network environment enables connection of various components of the system 100 using any communication link including Internet, WAN, MAN, and so on. In an exemplary embodiment, the system 100 is implemented to operate as a stand-alone device. In another embodiment, the system 100 may be implemented to work as a loosely coupled device to a smart computing environment. The components and functionalities of the system 100 are described further in detail.
[025] According to an embodiment of the disclosure, the memory 110 comprises a plurality of modules as shown in the architecture diagram 200 of FIG. 2. The plurality of modules are sets of instructions configured to perform a plurality of functions. The system 100 comprises a game 114 and a game engine 116. The game 114 is referred to as a form of digital circuits capable of performing functions as controlled by the player. The use of any existing game engine is well within the scope of this disclosure. The advertisements are displayed on the I/O interface 104. The game 114 further comprises IGAE presets 118. The IGAE presets 118 are a set of preconfigured modules which are packaged inside the game 114. The IGAE presets 118 further comprise a mood analyzer 120, a context analyzer 122, a skinner unit 124, an entity unit 126 and an ad typecasting unit 128. The game engine 116 further comprises a smart ads unit 130 and a static ads unit 132.
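The module layout described above can be summarized in code; the following is a minimal sketch, not part of the disclosure, in which the dictionary keys are illustrative names and the values are the reference numerals from FIG. 2:

```python
# Sketch of the FIG. 2 architecture: the IGAE presets 118 packaged
# inside the game 114, and the units inside the game engine 116.
igae_presets = {           # IGAE presets 118
    "mood_analyzer": 120,
    "context_analyzer": 122,
    "skinner_unit": 124,
    "entity_unit": 126,
    "ad_typecasting_unit": 128,
}
game_engine = {            # game engine 116
    "smart_ads_unit": 130,
    "static_ads_unit": 132,
}
```

This merely fixes the vocabulary used in the sketches that follow; any real implementation would replace the numerals with module objects.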
[026] According to an embodiment of the disclosure, the mood analyzer 120 is configured to generate the first set of mood tags and the second set of mood tags. The first set of mood tags indicates the mood of the user based on gameplay data obtained from the game using one or more processing techniques. The second set of mood tags indicates the mood of the user based on a plurality of external sensors 134. The gameplay data comprises one or more of a game level, a game based story, and performance of a player within the game world.
[027] According to an embodiment of the disclosure, the context analyzer 122 is configured to receive the context data from the game. Based on the received context data, the context analyzer 122 uses context analysis algorithms, text analytics, and the first and the second set of mood tags to create context based advertisements data.
[028] According to an embodiment of the disclosure, the entity unit 126 is configured to store, as data, the information of the game entity that needs to be displayed as an advertisement. The skinner unit 124 is configured to receive the entities data from the entity unit 126 and convert it into a displayable advertisement.
[029] According to an embodiment of the disclosure, the ad typecasting unit 128 is configured to typecast the advertisement data to a set of 3D game entities, images, text, audio and music files. The advertisement data is selected based on advertisement characterization. The game engine 116 further comprises the smart ads unit 130 and the static ads unit 132.
[030] According to an embodiment of the disclosure, there are two categories of advertisements that can be placed in the game: static advertisements and smart advertisements. Static advertisements are those for which the content can be controlled by the developer. The static advertisements are placed with the help of the static ads unit 132. Advertisement spaces and advertisements are created and placed manually by the developer. The static ads unit 132 gives full control of the advertisement content and the advertisement space. The developer can form templates that are reusable for static advertisements.
[031] The smart advertisements, on the other hand, are the context aware advertisements that are placed automatically. The smart advertisements are placed by the smart ads unit 130. The advertisement spaces are initially created by the developer, similar to static advertisement creation, but are dynamic in nature. The advertisements are generated automatically by the smart ads unit 130 and the presets 118. The smart ads unit 130 speeds up the process of placing advertisements in the game. The generated advertisements are dynamic in nature and can change periodically. The advertisements are also displayed based on the player's mood.
[032] Referring to FIG. 3, a flow diagram of a method 300 for displaying context aware advertisements during an electronic gameplay is described in accordance with an example embodiment. The method 300 depicted in the flow chart may be executed by a system, for example, the system 100 of FIG. 1. In an example embodiment, the system 100 may be embodied in the computing device as explained above.
[033] Operations of the flowchart, and combinations of operation in the flowchart, may be implemented by various means, such as hardware, firmware, processor, circuitry and/or other device associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described in various embodiments may be embodied by computer program instructions. In an example embodiment, the computer program instructions, which embody the procedures, described in various embodiments may be stored by at least one memory device of a system and executed by at least one processor in the system. Any such computer program instructions may be loaded onto a computer or other programmable system (for example, hardware) to produce a machine, such that the resulting computer or other programmable system embody means for implementing the operations specified in the flowchart. It will be noted herein that the operations of the method 300 are described with help of system 100.
However, the operations of the method 300 can be described and/or practiced by using any other system.
[034] Initially, at step 302, one or more in-game advertising spaces are identified in the game. The in-game advertisement space comprises one or more of a game entity, a conversation, game inventory, in-game music, in-game banners, or a heads up display (HUD).
[035] At step 304, the identified in-game advertising space is designed using a customizable set of specifications. In the case of both static advertisements and smart advertisements, the position where advertisements need to be placed inside the game is selected by the user. In the present context, the position refers to a place on the display device of the game which is visible to the player. Once a position to place an advertisement is defined, a dynamic advertisement space with properties (e.g., min / max advertisements, advertisement type, frequency, etc.) can be designed in the selected position. This advertisement space holds the dynamic advertisements to be displayed.
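The customizable advertisement-space properties named above (min / max advertisements, advertisement type, frequency) can be sketched as a small data structure; field names and defaults are assumptions for illustration, as the disclosure does not fix a schema:

```python
from dataclasses import dataclass

# Hypothetical sketch of a dynamic advertisement space (step 304).
@dataclass
class AdSpace:
    position: tuple            # place on the display visible to the player
    min_ads: int = 1           # minimum advertisements held by the space
    max_ads: int = 3           # maximum advertisements held by the space
    ad_type: str = "2D"        # e.g. "2D", "3D", "text", "audio"
    frequency_s: float = 60.0  # assumed rotation interval, in seconds

    def can_hold(self, n_ads: int) -> bool:
        # The space holds only a bounded number of dynamic advertisements.
        return self.min_ads <= n_ads <= self.max_ads

space = AdSpace(position=(100, 40))
```

A game engine integration would create one such object per position the developer selects, and the smart ads unit would fill it at runtime.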
[036] At step 306, a first set of mood tags is generated. The first set of mood tags indicates the mood of the user based on gameplay data obtained from the game using a flow theory model, a GameFlow model or one or more processing techniques. The first set of mood tags is generated from:
• Level data inside the game. For example, when a keyword “enemy” is present in the level, corresponding tags can be generated.
• Gameplay data received from the game, generated from the player’s play pattern. This gameplay data is validated by 1. the GameFlow model (GFM) and 2. the flow theory model (FTM). For example, when a player defeats an enemy, corresponding tags are generated.
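The two tag sources above can be sketched as simple lookups; the keyword-to-tag and event-to-tag mappings below are illustrative assumptions, since the disclosure names the sources but not the concrete tag vocabulary:

```python
# Hypothetical sketch of step 306: generating the first set of mood tags
# from level keywords and from gameplay events in the player's play pattern.
LEVEL_KEYWORD_TAGS = {"enemy": ["combat"], "treasure": ["reward"]}
EVENT_TAGS = {"defeated_enemy": ["victory", "happy"]}

def first_mood_tags(level_keywords, gameplay_events):
    tags = []
    for kw in level_keywords:            # level data inside the game
        tags.extend(LEVEL_KEYWORD_TAGS.get(kw, []))
    for ev in gameplay_events:           # player's play pattern
        tags.extend(EVENT_TAGS.get(ev, []))
    return tags
```

In the disclosure these tags are additionally validated against the GameFlow model and the flow theory model before use; that validation is omitted here.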
[037] According to an embodiment of the disclosure, FIG. 4 shows a graphical representation of advertisements based on the challenges in the game and the skill level of the user according to some embodiments of the present disclosure. When the player is in a flow channel, the player does not feel distracted or irritated by the advertisement and feels that the advertisement is a part of the gameplay experience. This figure is a diagrammatical representation of flow theory to be used for detecting player mood and generating mood tags for displaying context aware advertisements. It also shows the flow channel within which the advertisements are displayed for receiving the user’s ideal attention during an electronic gameplay. In FIG. 4, advertisement 1 and advertisement 2 are placed in the flow channel. Advertisement 3 is placed where the challenge is high and the skill is low; therefore, the player is feeling anxious. Similarly, advertisement 4 is placed where the challenge is low and the skill is high; therefore, the player is feeling bored.
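The flow-channel classification of FIG. 4 can be sketched as a balance test between challenge and skill; the tolerance band is an assumed parameter, not something the disclosure specifies:

```python
# Sketch of the FIG. 4 flow-theory classification: balanced challenge and
# skill put the player in the flow channel; high challenge with low skill
# implies anxiety; low challenge with high skill implies boredom.
def player_state(challenge: float, skill: float, band: float = 0.2) -> str:
    if abs(challenge - skill) <= band:
        return "flow"      # advertisements 1 and 2 of FIG. 4 fall here
    return "anxiety" if challenge > skill else "boredom"
```

Under this sketch, an advertisement would only be displayed when `player_state(...)` returns `"flow"`.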
[038] At step 308, a second set of mood tags is generated. The second set of mood tags indicates the mood of the user based on the plurality of external sensors 134. The second set of mood tags is generated from external sensors and devices connected to the game. Various basic emotions such as happy, surprise, sad, anger, fear and disgust are detected. The first and the second set of mood tags are then sent to the context analyzer 122.
[039] At step 310, one or more in-game events are determined using the one or more natural language processing (NLP) techniques, namely text analysis and text analytics. The one or more in-game events are chosen in such a way that they do not disturb the game-flow of the player and enhance the user experience when an advertisement is placed. Examples of the one or more events include win conditions, checkpoints, change in level, etc.
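A minimal stand-in for this step is keyword-based text analysis over in-game text; the event names and trigger phrases below are illustrative assumptions, and a real system could use richer NLP techniques:

```python
# Sketch of step 310: determining in-game events (win conditions,
# checkpoints, level changes) from game text via simple phrase matching.
EVENT_TRIGGERS = {
    "win_condition": ["you win", "victory"],
    "checkpoint": ["checkpoint reached"],
    "level_change": ["level up", "entering level"],
}

def detect_events(game_text: str):
    text = game_text.lower()
    return [event for event, phrases in EVENT_TRIGGERS.items()
            if any(p in text for p in phrases)]
```

The detected events feed the context analysis of step 312 alongside the mood tags.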
[040] At step 312, the context of the game is analyzed using at least one or more of the first set of mood tags, the second set of mood tags, a set of game level tags, and the one or more in-game events processed with the help of the NLP techniques, namely text analysis and text analytics.
[041] At step 314, a set of objects is chosen based on the context of the game. The set of objects is chosen by the context analyzer 122. The context analyzer 122 analyzes various contexts available in the game using keywords, text analysis and text analytics (mining) using NLP. For example, a set of relevant nouns and verbs from the game is chosen as keywords. Further, relevant objects are chosen for the keyword to be converted to an advertisement. For example: Keyword: Hungry[] = {“Apple”, “Pizza”, “Bread”, “Meat”}.
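The keyword-to-object selection can be sketched directly from the example above; the "Hungry" entry follows the disclosure, while the "Thirsty" entry is an illustrative assumption:

```python
# Sketch of step 314: each context keyword maps to a set of candidate
# objects that can be converted into advertisements.
KEYWORD_OBJECTS = {
    "Hungry": ["Apple", "Pizza", "Bread", "Meat"],  # example from [041]
    "Thirsty": ["Water", "Juice"],                  # illustrative entry
}

def choose_objects(keyword: str):
    # An unknown keyword simply yields no candidate objects.
    return KEYWORD_OBJECTS.get(keyword, [])
```

An advertiser match against these candidate objects would then drive the type-casting of step 316.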
[042] At step 316, advertisement data is type-casted into the chosen set of objects. The advertisement data is selected based on advertisement characterization. The data derived from the context analyzer 122 is type-casted to the accepted real-world form of that object. For example, the text “apple” is converted to 2D image data of the apple or 3D mesh data of an apple. Further, logos, labels and text are skinned onto the entities, if required, based on the displayed entities’ properties from the entity module.
[043] At step 318, the type-casted advertisement data is converted into an advertisement form to be rendered in the in-game space. For example, the type-casted 2D and 3D data is converted into a 2D image and a 3D mesh, and that image or mesh is rendered into the game world / virtual world. Finally, at step 320, the converted advertisement form is displayed in the designed in-game advertisement space when the player is in the flow channel.
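Steps 316 to 320 can be sketched end to end; the rendering itself is stubbed out as a string, and the function names are illustrative assumptions:

```python
# Sketch of steps 316-320: type-cast an object into an advertisement
# form (paragraph [044] names 2D, 3D, text and audio forms) and display
# it only while the player is in the flow channel.
def typecast(obj: str, form: str) -> dict:
    assert form in ("2D", "3D", "text", "audio")
    return {"object": obj, "form": form}

def display(ad: dict, in_flow_channel: bool):
    if not in_flow_channel:
        return None  # hold the advertisement until the player is in flow
    return f"render {ad['form']} advertisement of {ad['object']}"
```

The flow-channel gate is the final check of step 320: an advertisement that is ready is still withheld while the player is anxious or bored.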
[044] According to an embodiment of the disclosure, the smart advertisements or the static advertisements are one or more of 2D advertisements, 3D advertisements, text advertisements, or audio advertisements. The 2D advertisements are one or more of banners, images or popups. The 3D advertisements are one or more of 3D models / game entities (animated or non-animated), 3D holograms or VFX. The text advertisements are one or more of conversations or information. The audio advertisements are one or more of dialogues, speech, music, or SFX.
[045] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[046] The embodiments of the present disclosure herein address the unresolved problem of intrusive advertisements being placed and disturbing the gameplay of the player. The embodiments thus provide a method and system for displaying context aware advertisements during an electronic gameplay which further enhances the user experience of the player.
[047] It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software processing components located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[048] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various components described herein may be implemented in other components or combinations of other components. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[049] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological
development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[050] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[051] It is intended that the disclosure and examples be considered as exemplary only, with a true scope of disclosed embodiments being indicated by the following claims.
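[052] By way of a non-limiting illustration only, the claimed pipeline of fusing mood tags, game level tags, and in-game events into a game context, choosing a matching in-game object, and type-casting advertisement data into it, may be sketched as follows. All names, tags, and data structures below are hypothetical and do not limit the claims:

```python
from dataclasses import dataclass

# Hypothetical, non-limiting sketch: mood tags + game-level tags +
# in-game events -> game context -> context-matched in-game object
# -> advertisement data type-cast into that object.

@dataclass
class Advertisement:
    brand: str   # advertiser-supplied data
    mood: str    # advertisement characterization (e.g. "calm", "tense")
    form: str    # "text", "audio", "image", "3d-object", ...

def analyze_context(mood_tags_gameplay, mood_tags_sensors, level_tags, events):
    """Fuse the first set of mood tags (from gameplay data), the second set
    (from external sensors), and game level tags with in-game events."""
    tags = set(mood_tags_gameplay) | set(mood_tags_sensors) | set(level_tags)
    return {"tags": tags, "events": list(events)}

def choose_object(context, available_objects):
    """Pick the in-game object whose tags best overlap the game context."""
    def score(obj):
        return len(context["tags"] & obj["tags"])
    return max(available_objects, key=score)

def type_cast(ad: Advertisement, obj):
    """Render the advertisement data in the chosen object's native form."""
    return {"object": obj["name"], "payload": f"{ad.brand} ({ad.form})"}

# Usage: a tense boss-fight context matched against two candidate objects.
ctx = analyze_context(["tense"], ["high-heart-rate"], ["boss-level"],
                      ["boss_spawned"])
objects = [
    {"name": "roadside_banner", "tags": {"calm", "open-world"}},
    {"name": "health_pack",     "tags": {"tense", "boss-level"}},
]
obj = choose_object(ctx, objects)
placed = type_cast(Advertisement("EnergyDrinkCo", "tense", "3d-object"), obj)
print(placed["object"])  # the context-matched object
```

In this sketch the advertisement complements the gameplay, since the context-matched object (here, a health pack during a tense boss level) doubles as a useful game resource for the player.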
WE CLAIM:
1. A processor implemented method (300) for displaying context aware advertisements during an electronic gameplay, the method comprising:
identifying, via one or more hardware processors, one or more in-game advertising spaces in a game (302);
designing, via the one or more hardware processors, the identified in-game advertising space using a customizable set of specifications (304);
generating, via the one or more hardware processors, a first set of mood tags indicating a mood of the user based on gameplay data obtained from the game using a flow theory model, a GameFlow model and one or more processing techniques (306);
generating, via the one or more hardware processors, a second set of mood tags indicating the mood of the user based on data from external sensors (308);
determining, via the one or more hardware processors, one or more in-game events using one or more natural language processing (NLP) techniques (310);
analyzing, via the one or more hardware processors, the context of the game using one or more of the first set of mood tags, the second set of mood tags, a set of game level tags, and the one or more in-game events processed with the help of the NLP techniques (312);
choosing, via the one or more hardware processors, a set of objects based on the context of the game (314);
type-casting, via the one or more hardware processors, advertisement data into the chosen set of objects, wherein the advertisement data is selected based on advertisement characterization (316);
converting, via the one or more hardware processors, the type-casted advertisement data into an advertisement form to be rendered in the in-game space (318); and
displaying, via the one or more hardware processors, the converted advertisement form in the designed in-game advertisement space, wherein the player is in a flow channel (320).
2. The method of claim 1 further comprising the step of changing the advertisement depending on a change of the context of the game.
3. The method of claim 1, wherein the gameplay data comprises one or more of a game level, a game-based story, and a performance of a player of the game.
4. The method of claim 1, wherein the advertisement is provided in the form of one or more of text, audio in the form of speech and music, videos, two-dimensional images, or three-dimensional objects.
5. The method of claim 1, wherein the in-game advertisement space comprises one or more of conversation, game inventory, in-game music, in-game banners, and heads up display (HUD).
6. The method of claim 1, wherein the external sensors comprise one or more of wearable sensors, and sensors present in the surrounding area.
7. The method of claim 1 further comprising generating a trigger to display the advertisements.
8. A system (100) for displaying context aware advertisements during an electronic gameplay, the system comprising:
an input/output interface (104);
one or more hardware processors (108); and
a memory (110) in communication with the one or more hardware processors, wherein the one or more hardware processors are configured to execute programmed instructions stored in the memory, to:
identify one or more in-game advertising spaces in a game;
design the identified in-game advertising space using a customizable set of specifications;
generate a first set of mood tags indicating a mood of the user based on gameplay data obtained from the game using a flow theory model, a GameFlow model and one or more processing techniques;
generate a second set of mood tags indicating the mood of the user based on data from external sensors;
determine one or more in-game events using one or more natural language processing (NLP) techniques;
analyze the context of the game using one or more of the first set of mood tags, the second set of mood tags, a set of game level tags, and the one or more in-game events processed with the help of the NLP techniques;
choose a set of objects based on the context of the game;
type-cast advertisement data into the chosen set of objects, wherein the advertisement data is selected based on advertisement characterization;
convert the type-casted advertisement data into an advertisement form to be rendered in the in-game space; and
display the converted advertisement form in the designed in-game advertisement space.
9. The system of claim 8 further configured to perform the step of changing the advertisement depending on a change of the context of the game.
10. The system of claim 8, wherein the gameplay data comprises one or more of a game level, a game-based story, and a performance of a player of the game.
11. The system of claim 8, wherein the advertisement is provided in the form of one or more of text, audio in the form of speech and music, videos, two-dimensional images, or three-dimensional objects.
12. The system of claim 8, wherein the in-game advertisement space comprises one or more of conversation, game inventory, in-game music, in-game banners, and heads up display (HUD).
13. The system of claim 8, wherein the external sensors comprise one or more of wearable sensors, and sensors present in the surrounding area.
Dated this 01 Day of April 2021
| # | Name | Date |
|---|---|---|
| 1 | 202121015628-STATEMENT OF UNDERTAKING (FORM 3) [01-04-2021(online)].pdf | 2021-04-01 |
| 2 | 202121015628-REQUEST FOR EXAMINATION (FORM-18) [01-04-2021(online)].pdf | 2021-04-01 |
| 3 | 202121015628-FORM 18 [01-04-2021(online)].pdf | 2021-04-01 |
| 4 | 202121015628-FORM 1 [01-04-2021(online)].pdf | 2021-04-01 |
| 5 | 202121015628-FIGURE OF ABSTRACT [01-04-2021(online)].jpg | 2021-04-01 |
| 6 | 202121015628-DRAWINGS [01-04-2021(online)].pdf | 2021-04-01 |
| 7 | 202121015628-DECLARATION OF INVENTORSHIP (FORM 5) [01-04-2021(online)].pdf | 2021-04-01 |
| 8 | 202121015628-COMPLETE SPECIFICATION [01-04-2021(online)].pdf | 2021-04-01 |
| 9 | 202121015628-Proof of Right [16-06-2021(online)].pdf | 2021-06-16 |
| 10 | Abstract1.jpg | 2021-10-19 |
| 11 | 202121015628-FORM-26 [22-10-2021(online)].pdf | 2021-10-22 |
| 12 | 202121015628-FER.pdf | 2022-11-01 |
| 13 | 202121015628-OTHERS [25-01-2023(online)].pdf | 2023-01-25 |
| 14 | 202121015628-FER_SER_REPLY [25-01-2023(online)].pdf | 2023-01-25 |
| 15 | 202121015628-CLAIMS [25-01-2023(online)].pdf | 2023-01-25 |
| 16 | 202121015628-US(14)-HearingNotice-(HearingDate-09-01-2025).pdf | 2024-12-13 |
| 17 | 202121015628-Correspondence to notify the Controller [06-01-2025(online)].pdf | 2025-01-06 |
| 18 | 202121015628-Written submissions and relevant documents [22-01-2025(online)].pdf | 2025-01-22 |

| # | Name |
|---|---|
| 1 | SearchStrategyE_01-11-2022.pdf |