Abstract: A method and system for providing augmented reality based 3D gaming applications using cloud computing for visual display devices is provided. The method includes enabling a user of a visual display device to select a game from a list of games, displaying one or more gaming characters associated with the game along with an option to replace one of the one or more gaming characters with an image model of the user, enabling one or more images of the user to be captured based on a selected gaming character, processing the one or more images to obtain the image model, replacing the selected gaming character with the image model of the user, and enabling the user to play the game with the image model. The system includes a visual display device, a communication interface in electronic communication with the visual display device, a memory that stores instructions, and a processor.
A METHOD AND SYSTEM FOR PROVIDING AUGMENTED REALITY BASED 3D GAMING APPLICATIONS USING CLOUD COMPUTING FOR VISUAL DISPLAY DEVICES
FIELD OF THE INVENTION
[0001] The present invention relates to the field of multimedia, and more specifically to the field of providing augmented reality based 3D gaming applications using cloud computing for visual display devices.
BACKGROUND
[0002] Gaming on different types of electronic devices is in wide use. Currently, three-dimensional (3D) games are popular due to augmented reality. However, such games are played with a fixed set of characters, which does not enhance the user experience. Also, real-time data manipulation is not present in the games, and such real-time data manipulation would require higher processing capabilities.
[0003] In light of the foregoing discussion, there is a need for an efficient method and system for providing augmented reality based 3D gaming applications using cloud computing for visual display devices.
SUMMARY
[0004] Embodiments of the present disclosure described herein provide a method and system for providing augmented reality based 3D gaming applications using cloud computing for visual display devices.
[0005] An example of a method of providing augmented reality based 3D gaming applications using cloud computing for visual display devices includes enabling a user of a visual display device to select a game from a list of games. The method also includes displaying one or more gaming characters associated with the game along with an option to replace one of the one or more gaming characters with an image model of the user. The method further includes enabling one or more images of the user to be captured based on a selected gaming character. Further, the method includes processing the one or more images to obtain the image model and replacing the selected gaming character with the image model of the user. Moreover, the method includes enabling the user to play the game with the image model.
[0006] An example of a system for providing augmented reality based 3D gaming applications using cloud computing for visual display devices includes a visual display device and a communication interface in electronic communication with the visual display device. The system also includes a memory that stores instructions and a processor. The processor is responsive to the instructions to enable a user of the visual display device to select a game from a list of games, to display one or more gaming characters associated with the game along with an option to replace one of the one or more gaming characters with an image model of the user, to enable one or more images of the user to be captured based on a selected gaming character, to process the one or more images to obtain the image model, to replace the selected gaming character with the image model of the user, and to enable the user to play the game with the image model.
BRIEF DESCRIPTION OF FIGURES
[0007] In the accompanying figures, similar reference numerals may refer to identical or functionally similar elements. These reference numerals are used in the detailed description to illustrate various embodiments and to explain various aspects and advantages of the present disclosure.
[0008] FIG. 1 is a block diagram of an environment, in accordance with which various embodiments can be implemented;
[0009] FIG. 2 is an exemplary flow diagram illustrating a method of providing augmented reality based 3D gaming applications using cloud computing for visual display devices, in accordance with one embodiment;
[0010] FIG. 3 is a block diagram of a server, in accordance with one embodiment;
[0011] FIG. 4 is a flowchart illustrating a method of providing augmented reality based 3D gaming applications using cloud computing for visual display devices, in accordance with one embodiment;
[0012] FIG. 5 is a flowchart exemplarily illustrating a method of obtaining an image model of a user, in accordance with one embodiment; and
[0013] FIGS. 6A - 6C exemplarily illustrate a method of providing augmented reality based 3D gaming applications using cloud computing for visual display devices, in accordance with one embodiment.
DETAILED DESCRIPTION
[0014] It should be observed that the method steps and system components have been represented by conventional symbols in the figures, showing only those specific details that are relevant to an understanding of the present disclosure. Further, details that may be readily apparent to a person ordinarily skilled in the art may not have been disclosed. In the present disclosure, relational terms such as first and second, and the like, may be used to distinguish one entity from another entity, without necessarily implying any actual relationship or order between such entities.
[0015] Embodiments of the present disclosure described herein provide a method and system for providing augmented reality based 3D gaming applications using cloud computing for visual display devices.
[0016] FIG. 1 is a block diagram of an environment 100 in accordance with which various embodiments can be implemented. The environment 100 includes an electronic device, for example a visual display device 105, a network 110, and a server 115.
[0017] The visual display device 105 is connected to the server 115 through the network 110. Examples of the visual display device 105 include, but are not limited to, a digital television, a mobile device, a laptop, a tablet device, a personal digital assistant (PDA), a smart phone, and other hand held display devices. Examples of the network 110 include, but are not limited to, a local area network, a wide area network and a wireless network.
[0018] In some embodiments, the visual display device 105 can perform functions of the server 115.
[0019] The server 115 including a plurality of elements is explained in detail in conjunction with FIG. 3.
[0020] FIG. 2 is an exemplary flow diagram illustrating a method of providing augmented reality based 3D gaming applications using cloud computing for visual display devices, in accordance with one embodiment.
[0021] A user of a client, the visual display device 105, provides login information to access a list of games on the server 115. The server 115 grants access to the client. The user selects the 3D games option, and the server 115 provides a list of 3D games from which the user selects a 3D game. The server 115 provides an option to select a gaming character associated with the 3D game for substitution with a user image, and the user selects this option. The server 115 then requests one or more images of the user in different angles and projections. The user stands in front of a 2D camera and allows the images to be captured in the different angles and projections, and the images are then sent to the server 115. The server 115 receives the images and applies image processing techniques to them. A 3D image is created and displayed to the user. The server 115 subsequently asks the user to select the gaming character to be replaced with the 3D image model of the user. The user selects the gaming character and initiates the 3D game. The 3D game starts, and the user plays the 3D game with the image model.
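The exchange above can be sketched as a minimal client-server interaction. All class names, method names, game titles, and data shapes below are hypothetical placeholders for illustration; the disclosure does not specify an API.

```python
# Hypothetical sketch of the flow in paragraph [0021]; names and data
# shapes are illustrative placeholders, not part of the disclosure.

class GameServer:
    def __init__(self):
        self.games = {"space-race": ["pilot", "navigator", "gunner"]}
        self.models = {}  # user -> prepared 3D image model

    def list_games(self):
        return sorted(self.games)

    def build_model(self, user, images_2d):
        # stand-in for the server-side 3D reconstruction step
        self.models[user] = {"user": user, "frames": len(images_2d)}
        return self.models[user]

    def substitute(self, game, character, model):
        # replace the selected gaming character with the user's model
        roster = self.games[game]
        roster[roster.index(character)] = model["user"]
        return roster


server = GameServer()
game = server.list_games()[0]                        # user selects a game
images = [f"frame-{a}" for a in range(0, 360, 45)]   # captured at several angles
model = server.build_model("alice", images)
roster = server.substitute(game, "navigator", model)
print(roster)  # ['pilot', 'alice', 'gunner']
```

The character substitution is a simple in-place roster replacement here; the actual disclosure performs it on the server with the reconstructed 3D model.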
[0022] FIG. 3 is a block diagram of the server 115, in accordance with one embodiment.
[0023] The server 115, for example a three dimensional (3D) games server, includes a bus 305 or other communication mechanism for communicating information, and a processor 310 coupled with the bus 305 for processing information. The server 115 also includes a memory 315, for example a random access memory (RAM) or other dynamic storage device, coupled to the bus 305 for storing information and instructions to be executed by the processor 310. The memory 315 can be used for storing temporary variables or other intermediate information during execution of instructions by the processor 310. The server 115 further includes a read only memory (ROM) 320 or other static storage device coupled to the bus 305 for storing static information and instructions for the processor 310. A storage unit 325, for example a magnetic disk or optical disk, is provided and coupled to the bus 305 for storing information.
[0024] The server 115 can be coupled via the bus 305 to a display 330, for example a cathode ray tube (CRT), for displaying a list of games. An input device 335, including alphanumeric and other keys, is coupled to the bus 305 for communicating information and command selections to the processor 310. Another type of user input device is a cursor control 340, for example a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 310 and for controlling cursor movement on the display 330.
[0025] Various embodiments are related to the use of the server 115 for implementing the techniques described herein. In some embodiments, the techniques are performed by the server 115 in response to the processor 310 executing instructions included in the memory 315. Such instructions can be read into the memory 315 from another machine-readable medium, for example the storage unit 325.
Execution of the instructions included in the memory 315 causes the processor 310 to perform the process steps described herein.
[0026] In some embodiments, the processor 310 can include one or more processing units for performing one or more functions of the processor 310. The processing units are hardware circuitry used in place of or in combination with software instructions to perform specified functions.
[0027] The term "machine-readable medium" as used herein refers to any medium that participates in providing data that causes a machine to perform a specific function. In an embodiment implemented using the server 115, various machine-readable media are involved, for example, in providing instructions to the processor 310 for execution. The machine-readable medium can be a storage medium, either volatile or non-volatile. Volatile media include dynamic memory, for example the memory 315. Non-volatile media include optical or magnetic disks, for example the storage unit 325. All such media must be tangible so that the instructions carried by the media can be detected by a physical mechanism that reads the instructions into a machine.
[0028] Common forms of machine-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic media, a CD-ROM, any other optical media, punch cards, paper tape, any other physical media with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, and any other memory chip or cartridge.
[0029] In another embodiment, the machine-readable media can be transmission media, including coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus 305. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications. Examples of machine-readable media may include, but are not limited to, a carrier wave as described hereinafter or any other media from which the server 115 can read. For example, the instructions can initially be carried on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to the server 115 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal, and appropriate circuitry can place the data on the bus 305. The bus 305 carries the data to the memory 315, from which the processor 310 retrieves and executes the instructions. The instructions received by the memory 315 can optionally be stored on the storage unit 325 either before or after execution by the processor 310.
[0030] The server 115 also includes a communication interface 345 coupled to the bus 305. The communication interface 345 provides a two-way data communication coupling between the server 115 and the visual display device 105 through the network 110. For example, the communication interface 345 can be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, the communication interface 345 can be a local area network (LAN) card to provide a data communication connection to a compatible LAN. In any such implementation, the communication interface 345 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
[0031] The server 115 further includes an image capture device 350 that captures one or more images of a user of the visual display device 105.
[0032] The processor 310 in the server 115 is operable to enable the user of the visual display device 105 to select a game from a list of games. The processor 310 further displays one or more gaming characters associated with the game along with an option to replace one of the gaming characters with an image model of the user. The processor 310 enables the images of the user to be captured based on a selected gaming character, using the image capture device 350. The processor 310 then processes the images to obtain the image model and replaces the selected gaming character with the image model of the user. The processor 310 further enables the user to play the game with the image model.
[0033] A method of providing augmented reality based 3D gaming applications using cloud computing for visual display devices is explained in detail in conjunction with FIG. 4.
[0034] FIG. 4 is a flowchart illustrating a method of providing augmented reality based three dimensional (3D) gaming applications using cloud computing for visual display devices, in accordance with one embodiment. Augmented reality (AR) is a live, direct or indirect, view of a physical, real-world environment whose elements are augmented by computer-generated sensory input, for example sound, video, graphics, or Global Positioning System (GPS) data. AR augments the real-world environment with virtual information, enhancing people's senses and skills, and mixes virtual characters with the real-world environment. AR scenes share three common characteristics: they combine the real-world environment with computer-generated characters, they are interactive, and they are rendered in 3D.
[0035] In some embodiments, the method of providing augmented reality based gaming applications is performed using cloud computing.
[0036] The method starts at step 405.
[0037] At step 410, a user of a visual display device, for example the visual display device 105, is enabled to select a game from a list of games, for example 3D games. The user first provides login information to access the list of games present on a server, for example the server 115. In one example, the server is a 3D games server with advanced graphics and processing capabilities. The list of games is displayed to the user, and the user subsequently selects one game.
[0038] At step 415, one or more gaming characters associated with the game are displayed along with an option to replace one of the gaming characters with an image model of the user. The user can select a gaming character as desired.
[0039] At step 420, one or more images of the user are enabled to be captured based on a selected gaming character. The user is connected to an image capture device, for example the image capture device 350, and multi-directional tracking of the user is performed by the server. In one example, the image capture device can be a web camera included in the visual display device. In another example, the image capture device can be a web camera separate from and connected to the visual display device. The user rotates 360 degrees in front of the image capture device, and the images are hence captured in one or more angles and projections. The captured images are then sent to the server for processing. In one example, the images are sent as two-dimensional (2D) data. The images can be sent through a wired or wireless medium using, for example, Transmission Control Protocol/Internet Protocol (TCP/IP) or User Datagram Protocol (UDP).
[0040] In some embodiments, the images obtained by the image capture device are streamed to the visual display device which in turn streams the images to the server via a network.
[0041] In some embodiments, video data is captured in place of the images.
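The transport described above, in which 2D frames are sent to the server over TCP/IP or UDP, can be illustrated with a minimal length-prefixed framing scheme over a TCP-style socket. The 4-byte length prefix and the helper names are assumptions for illustration; the disclosure names only the protocols, not a framing format.

```python
# Illustrative 2D-frame transport with a 4-byte big-endian length prefix.
# The framing scheme is an assumption; the disclosure specifies only TCP/IP or UDP.
import socket
import struct

def _recv_exact(sock, n):
    """Read exactly n bytes, looping over short reads."""
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        data += chunk
    return data

def send_frame(sock, frame_bytes):
    # length prefix first, then the raw 2D frame payload
    sock.sendall(struct.pack(">I", len(frame_bytes)) + frame_bytes)

def recv_frame(sock):
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

client, server = socket.socketpair()     # stands in for the device-server link
send_frame(client, b"jpeg-bytes-angle-045")
frame = recv_frame(server)
print(frame)  # b'jpeg-bytes-angle-045'
```

In a deployment, `socketpair` would be replaced by a real connection between the visual display device and the server over the network 110.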
[0042] At step 425, the images are processed to obtain the image model. One or more image processing techniques, for example 3D image processing techniques, are performed on the images. Examples of the image processing techniques include, but are not limited to, image registration and image segmentation. A plurality of intensity derivatives, for example spatio-temporal intensity derivatives for an optical flow, associated with the images are then determined. A plurality of depth parameters associated with the one or more images are also determined. The depth parameters are calculated with respect to different attributes and create a third dimension from the two image coordinates. Examples of the attributes include, but are not limited to, image cues, each feature vector point, blur due to focus and defocus, motion cues, color intensity cues, geometric perspective analysis cues, and edge information.
[0043] If the depth parameters meet the required conditions, a segmented, depth-parameter-calculated image is mapped to a base image, or first image. The rest of the images, with varying degrees and profiles, are then superimposed, or registration of the images with respect to the 3D content is performed.
[0044] In some embodiments, the segmented, depth-parameter-calculated image is segmented in either a static state or a dynamic state.
[0045] In some embodiments, movement activity is calculated in real time, starting from the next frame, for mapping onto the base image or the selected gaming character.
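The spatio-temporal intensity derivatives mentioned in the processing step can be sketched with a common finite-difference scheme from the optical-flow literature. The particular scheme, and the use of NumPy, are assumptions for illustration and are not taken from the disclosure.

```python
# Sketch of spatio-temporal intensity derivatives (Ix, Iy, It) for optical
# flow, using simple finite differences; an assumed textbook scheme, not
# the disclosure's own algorithm.
import numpy as np

def intensity_derivatives(frame_t, frame_t1):
    """Return (Ix, Iy, It) for two consecutive grayscale frames."""
    Ix = np.gradient(frame_t, axis=1)   # spatial derivative along x
    Iy = np.gradient(frame_t, axis=0)   # spatial derivative along y
    It = frame_t1 - frame_t             # temporal derivative between frames
    return Ix, Iy, It

# A flat frame brightening uniformly: no spatial gradient, unit temporal change.
a = np.zeros((4, 4))
b = np.ones((4, 4))
Ix, Iy, It = intensity_derivatives(a, b)
print(It.mean())  # 1.0
```

These derivatives feed the brightness-constancy constraint Ix*u + Iy*v + It = 0, from which flow vectors (u, v) are typically estimated.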
[0046] At step 430, the selected gaming character is replaced with the image model of the user.
[0047] In some embodiments, the user can deselect the selected gaming character and select another gaming character to be replaced with the image model.
[0048] At step 435, the user is enabled to play the game with the image model as the image model imitates the selected gaming character.
[0049] The method stops at step 440.
[0050] In some embodiments, the image model can be stored in the server and can be directly used for playing various games.
[0051] In some embodiments, the user can update the image model as desired by capturing a different set of images.
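Storing the image model for reuse across games and updating it from a new capture, as the two embodiments above describe, amounts to a simple keyed store on the server. The sketch below uses hypothetical names and a placeholder model representation.

```python
# Hypothetical server-side store for image models ([0050]-[0051]);
# class name and model representation are illustrative placeholders.

class ModelStore:
    def __init__(self):
        self._models = {}          # user -> stored 3D image model

    def save(self, user, model):
        # saving again (after a new capture) simply overwrites the old model
        self._models[user] = model

    def get(self, user):
        return self._models.get(user)


store = ModelStore()
store.save("alice", {"views": 8})
store.save("alice", {"views": 12})   # user recaptures and updates the model
print(store.get("alice"))  # {'views': 12}
```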
[0052] Advantageously, the embodiments specified in the present disclosure enable 3D gaming on the cloud for handheld devices and other visual display devices using a basic image capture device, by preparing a 3D image model from real-time 2D data. The present disclosure further enables manipulation of data on the cloud.
[0053] FIG. 5 is a flowchart exemplarily illustrating a method of obtaining an image model of a user, in accordance with one embodiment.
[0054] The method starts at step 505.
[0055] At step 510, image acquisition of the user using a 2D camera is performed.
[0056] At step 515, multi-directional tracking of the user is performed as the user rotates through the different angles and projections needed for the one or more images.
[0057] At step 520, image segmentation is performed, whether the user is static or in motion.
[0058] At step 525, a plurality of intensity derivatives associated with the images are determined for optical flow.
[0059] At step 530, the depth parameters are calculated with respect to image cues, each and every feature vector point, blur due to focus and defocus, motion cues, color intensity cues, geometric perspective analysis cues, and edge information.
[0060] At step 535, if the depth parameters meet the required conditions, a segmented, depth-parameter-calculated image is mapped to a base image, or first image. The rest of the images, with varying degrees and profiles, are then superimposed, or registration of the images with respect to the 3D content is performed.
[0061] At step 540, movement activity is calculated in real time, starting from the next frame, for mapping onto the base image or the selected gaming character.
[0062] At step 545, a 3D image model of the user is prepared.
[0063] The method stops at step 550.
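The FIG. 5 pipeline, steps 505 through 550, can be sketched as a chain of placeholder functions, one per step. Every function body and data shape below is an illustrative stand-in; the disclosure does not specify the actual reconstruction algorithms.

```python
# Placeholder pipeline mirroring FIG. 5; step numbers in comments map to the
# flowchart, and all bodies are hypothetical stand-ins for the real algorithms.

def acquire_images(camera_frames):            # step 510: 2D image acquisition
    return list(camera_frames)

def track_multidirectional(images):           # step 515: angles and projections
    return [{"angle": i * 45, "image": img} for i, img in enumerate(images)]

def segment(tracked):                         # step 520: static or in motion
    return [dict(t, mask="foreground") for t in tracked]

def derive_depth(segments):                   # steps 525-530: derivatives, depth
    return [dict(s, depth=1.0) for s in segments]

def register_to_base(depth_maps):             # step 535: map to base image
    base, rest = depth_maps[0], depth_maps[1:]
    return {"base": base, "registered": rest}

def build_model(mapped):                      # steps 540-545: prepare 3D model
    return {"views": 1 + len(mapped["registered"])}


model = build_model(register_to_base(derive_depth(
    segment(track_multidirectional(acquire_images(["f0", "f1", "f2"]))))))
print(model)  # {'views': 3}
```

Each stage consumes the previous stage's output, so the whole flowchart reduces to one function composition.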
[0064] FIGS. 6A - 6C exemplarily illustrate a method of providing augmented reality based 3D gaming applications using cloud computing for visual display devices, in accordance with one embodiment.
[0065] FIG. 6A illustrates three gaming characters, for example a gaming character 605, a gaming character 610, and a gaming character 615, associated with a 3D game, that is displayed to a user. The user can select one of the three gaming characters, for example the gaming character 610, to be replaced with a 3D image model of the user.
[0066] FIG. 6B illustrates a video snapshot of the 3D game with the three gaming characters originally present. The gaming character 610 which is represented with a dotted line is to be replaced with the 3D image model of the user.
[0067] FIG. 6C illustrates a video snapshot of the 3D game with two of the three gaming characters along with another gaming character 620, which is the 3D image model of the user and which replaces the gaming character 610. Hence, the 3D image model of the user imitates the gaming character 610.
[0068] In the preceding specification, the present disclosure and its advantages have been described with reference to specific embodiments. However, it will be apparent to a person of ordinary skill in the art that various modifications and changes can be made without departing from the scope of the present disclosure, as set forth in the claims below. Accordingly, the specification and figures are to be regarded as illustrative examples of the present disclosure, rather than in a restrictive sense. All such possible modifications are intended to be included within the scope of the present disclosure.
I/We claim:
1. A method of providing augmented reality based 3D gaming applications using cloud computing for visual display devices, the method comprising:
enabling a user of a visual display device to select a game from a list of games;
displaying one or more gaming characters associated with the game along with an option to replace one of the one or more gaming characters with an image model of the user;
enabling one or more images of the user to be captured based on a selected gaming character;
processing the one or more images to obtain the image model;
replacing the selected gaming character with the image model of the user; and
enabling the user to play the game with the image model.
2. The method as claimed in claim 1, wherein the user provides login information to access the list of games.
3. The method as claimed in claim 1, wherein enabling the one or more images of the user to be captured comprises:
connecting the user to an image capture device; and performing multi-directional tracking of the user.
4. The method as claimed in claim 3, wherein the one or more images of the user are captured in one or more angles and projections.
5. The method as claimed in claim 1, wherein processing the one or more images comprises:
performing one or more image processing techniques on the one or more images;
determining a plurality of intensity derivatives associated with the one or more images; and
determining a plurality of depth parameters associated with the one or more images.
6. The method as claimed in claim 1, wherein the image model reproduces the selected gaming character.
7. A system for providing augmented reality based 3D gaming applications using cloud computing for visual display devices, the system comprising:
a visual display device;
a communication interface in electronic communication with the visual display device;
a memory that stores instructions; and
a processor responsive to the instructions to enable a user of the visual display device to select a game from a list of games;
display one or more gaming characters associated with the game along with an option to replace one of the one or more gaming characters with an image model of the user;
enable one or more images of the user to be captured based on a selected gaming character;
process the one or more images to obtain the image model;
replace the selected gaming character with the image model of the user; and
enable the user to play the game with the image model.
8. The system as claimed in claim 7, further comprising
an image capture device that captures the one or more images of the user.
9. The system as claimed in claim 7, wherein the processor is configured to:
connect the user to the image capture device; and
perform multi-directional tracking of the user.
10. The system as claimed in claim 7, wherein the processor is configured to process
the one or more images by performing one or more image processing techniques on the one or more images;
determining a plurality of intensity derivatives associated with the one or more images; and
determining a plurality of depth parameters associated with the one or more images.