
System And Method For Optimizing Rendering Time In Three Dimensional Virtual Reality Environments

Abstract: The various embodiments of the present invention provide a system and method for reducing the time taken for real-time rendering of a 3D virtual scene, with a lower memory (RAM) footprint and improved quality. The method utilizes pre-baking, which calculates the shadows and lighting on different materials and textures in a non-real-time manner, applying pre-baking to one object at a time. In real time, instead of performing the entire set of rendering calculations, a simple method is used to merge the pre-baked texture with the actual texture or material that needs to be updated on the object within the virtual scene. This reduces the rendering time by nearly 90% while maintaining the quality of rendering. The 3D virtual scene can also be rendered at the same quality with a much lower memory (RAM) footprint. FIG. 1


Patent Information

Application #
Filing Date
23 December 2020
Publication Number
25/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
PATENT@LRSWAMI.COM
Parent Application
Patent Number
Legal Status
Grant Date
2025-01-24
Renewal Date

Applicants

SOCIOGRAPH SOLUTIONS PRIVATE LIMITED
677, 1ST FLOOR, 27TH MAIN, 13TH CROSS, HSR LAYOUT, SECTOR 1, BANGALORE-560102, KARNATAKA, INDIA

Inventors

1. ANANTHAKRISHNAN GOPAL
4081 SOBHA DAISY, GREEN GLEN LAYOUT, BELLANDUR, BANGALORE- 560103

Specification

DESC:CROSS-REFERENCE TO RELATED APPLICATIONS
This patent application claims the priority of the Indian Provisional Patent Application filed on December 23, 2020, with the number 202041056166 and titled "SYSTEM AND METHOD FOR OPTIMIZING RENDERING TIME IN THREE-DIMENSIONAL VIRTUAL REALITY ENVIRONMENTS", the contents of which are incorporated herein by way of reference.

A) TECHNICAL FIELD
[0001] The present invention is generally related to three-dimensional (3D) virtual reality technology that requires real-time rendering of two-dimensional (2D) or 3D images from a 3D virtual scene. The invention is specifically related to reducing the rendering time and digital memory requirements for rendering the 3D virtual scene in real time. The invention is more specifically related to a system and method for optimizing the rendering time in 3D virtual reality environments using pre-baking technology.

B) BACKGROUND OF THE INVENTION
[0002] In many applications of 3D visualization, such as Virtual Reality, Augmented Reality and digital 3D applications, the application of lights and shadows and the reflection and diffusion of light are calculated in real time, in a process called rendering. Real-time rendering computes the application of light and shadows in real time to create images of the 3D scene based on the view of the camera. Real-time rendering requires a Graphical Processing Unit (GPU) as well as a large amount of Random-Access Memory (RAM).
[0003] In current systems, many of the calculations are approximated in order to reduce the time taken to render the 3D scene in real time. This results in reduced quality of the rendering and an unrealistic view of the 3D scene.
[0004] The process where a part of the calculations is performed in a non-real-time renderer and pre-applied on the material textures of each object in the scene is called pre-baking. This process helps in optimizing the quality of real-time rendering to some extent. However, in a scenario where the material textures are variable and change in real time, the pre-baking mechanism does not work efficiently.
[0005] Therefore, there is a need for a system and method to enable the pre-baking mechanism in scenarios where the textures of objects and images in a virtual reality scene change dynamically.
[0006] The abovementioned shortcomings, disadvantages and problems are addressed herein, which will be understood by reading and studying the following specification.

C) OBJECT OF THE INVENTION
[0007] The primary object of the present invention is to provide a system and method for rendering a 3D-scene in the presence of dynamic materials or textures.
[0008] Another object of the present invention is to provide a system and a method for reducing the time required to render a 3D-scene in real-time in the presence of dynamic materials or textures.
[0009] Yet another object of the present invention is to provide a system and method for reducing the digital memory (RAM) required to render a 3D-scene in real-time in the presence of dynamic materials or textures.
[0010] Yet another object of the present invention is to improve the quality of the real-time rendering of a 3D-scene by applying the lights and shadows on the textures and materials in a non-real-time system and applying them dynamically on the material textures.
[0011] Yet another object of the present invention is to provide a system and method to enable the pre-baking mechanism in scenarios where the textures of objects and images in a virtual reality scene change dynamically.
[0012] These and other objects and advantages of the present invention will become readily apparent from the following detailed description taken in conjunction with the accompanying drawings.

D) SUMMARY OF THE INVENTION
[0013] The various embodiments of the present invention provide a system and method for rendering a 3D-scene in the presence of dynamic materials or textures. The embodiments also provide a system and method for reducing the rendering time of 3D-scenes in 3D virtual reality environments using pre-baking technology.
[0014] According to one embodiment of the present invention, a system and method is provided to enable the pre-baking mechanism in scenarios where the textures of objects and images in a virtual reality scene change dynamically. Instead of carrying out a non-real-time pre-baking on the actual texture of the objects themselves, the pre-baking is carried out on a replacement texture that is completely white. This process is carried out for each object in the scene sequentially: after pre-baking the white texture for one object, the default texture is restored before moving on to pre-baking the white texture for the next object.
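The per-object white-bake sequence described above can be sketched as follows. This is a minimal illustration, not the patented implementation: `SceneObject` and `Baker` are hypothetical stand-ins for a real engine's scene graph and offline light baker.

```python
class SceneObject:
    """Hypothetical scene object holding a named texture."""
    def __init__(self, name, texture):
        self.name = name
        self.texture = texture

class Baker:
    """Stand-in for a non-real-time (offline) light/shadow baker."""
    def bake(self, obj):
        # A real baker would run a physics-based light simulation here;
        # for illustration we just record which texture was baked.
        return f"lightmap({obj.name}, {obj.texture})"

def prebake_scene(objects, baker):
    """Bake each object against an all-white replacement texture,
    restoring its default texture before moving to the next object."""
    light_maps = {}
    for obj in objects:
        original = obj.texture          # remember the default texture
        obj.texture = "white"           # swap in the all-white replacement
        light_maps[obj.name] = baker.bake(obj)  # offline light/shadow bake
        obj.texture = original          # restore before the next object
    return light_maps
```

The key property of the loop is that every object is baked against the white texture, yet each object ends the process with its default texture intact.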
[0015] According to one embodiment of the present invention, instead of carrying out the entire calculation of light and shadows for each new texture in the real-time rendering, the pre-baked white texture is merged with the actual dynamic texture that is loaded.
[0016] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating the preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.

E) BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The other objects, features and advantages will occur to those skilled in the art from the following description of the preferred embodiment and the accompanying drawings in which:
[0018] FIG. 1 illustrates a system for optimizing the rendering time in 3D virtual reality environments, according to one embodiment of the present invention.
[0019] FIG. 2 illustrates a method for optimizing the rendering time in 3D virtual reality environments, according to one embodiment of the present invention.
[0020] Although specific features of the present invention are shown in some drawings and not in others, this is done for convenience only, as each feature may be combined with any or all of the other features in accordance with the present invention.

F) DETAILED DESCRIPTION OF THE INVENTION
[0021] In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which specific embodiments that may be practiced are shown by way of illustration. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other changes may be made without departing from the scope of the embodiments. The following detailed description is therefore not to be taken in a limiting sense.
[0022] The various embodiments of the present invention provide a system and method for rendering a 3D-scene in the presence of dynamic materials or textures. The embodiments also provide a system and method for reducing the rendering time of 3D-scenes in 3D virtual reality environments using pre-baking technology.
[0023] According to one embodiment of the present invention, a system is provided for rendering a three-dimensional environment on visual display devices in the presence of dynamic objects, materials, surfaces or textures in the three-dimensional environment. The system comprises a camera module, a plurality of visual display devices, a server module, a pre-baked object repository module and a plurality of client computing devices. The server module further comprises a server computing module, a virtual room module, a server graphical computation module and a pre-baking module. The client computing devices further comprise a client computation module and a client graphical computation module.
[0024] According to one embodiment of the present invention, the virtual room module is a programmatic or digital representation of a room constructed in a digital three-dimensional (3D) space. The virtual room module is configured to be accessed through any computing device that is communicably coupled with the server module, and to be rendered on the plurality of visual display devices. The virtual room module includes a digital representation of the dimensions of a plurality of objects, materials, surfaces and textures that are present in a three-dimensional environment. The digital representation includes information on the lighting and a plurality of other parameters that define the visual representation of the plurality of objects, materials, surfaces and textures that are present in the three-dimensional environment.
[0025] According to one embodiment of the present invention, the client graphical computation module is configured to provide a rendering of the virtual room module in the plurality of visual display devices. The rendering is a two-dimensional (2D) representation or projection of the virtual room module through a viewing port at a specific angle in the virtual room module. The rendering includes information on a plurality of parameters that define the visual representation of the virtual room module. The information of a plurality of parameters comprises a calculation of how the lights that exist in the room illuminate and reflect from all the objects, materials, surfaces and textures in the room.
[0026] According to one embodiment of the present invention, the pre-baked object repository module comprises a plurality of pre-baked digital representations of a plurality of objects, materials, surfaces and textures that are present in a three-dimensional environment. Pre-baking includes pre-calculating the representation of those parts of the objects, materials, surfaces and textures which are unlikely to change based on interactivity or any other dynamic factors in the three-dimensional environment.
[0027] According to one embodiment of the present invention, the pre-baking process includes a calculation step and an unwrapping step. The calculation step includes calculation of the light and shadows in the three-dimensional environment based on pre-existing lighting conditions in the three-dimensional environment. The calculation is a physics-based simulation of the impact of rays of light on the rendering of the plurality of objects, materials, surfaces and textures in the three-dimensional environment. The calculation step is also configured to identify which components of the virtual room occlude, completely or partially, each of the sources of light in the virtual room. The calculation step is configured to be processed in the server graphical computation module or the client graphical computation module. The unwrapping step includes converting the surface of a three-dimensional virtual object into a two-dimensional image.
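The unwrapping step above maps surface points to a 2D (UV-space) image. A minimal sketch of that idea follows; the point-sample representation, function name, and image format are illustrative assumptions, not details from the specification:

```python
def unwrap_to_image(samples, width, height):
    """Write per-surface-point light intensities into a 2D (UV-space) image.

    `samples` is a list of (u, v, intensity) tuples, where (u, v) are the
    unwrapped texture coordinates of a surface point in [0, 1).
    Returns a height x width grid of intensities (0 where nothing mapped).
    """
    image = [[0 for _ in range(width)] for _ in range(height)]
    for u, v, intensity in samples:
        # Map continuous UV coordinates onto discrete pixel indices,
        # clamping to the image bounds at the upper edge.
        x = min(int(u * width), width - 1)
        y = min(int(v * height), height - 1)
        image[y][x] = intensity
    return image
```

A real unwrapper would rasterize whole triangles into the UV atlas rather than point-sampling, but the essential data flow (3D surface to 2D light map) is the same.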
[0028] According to one embodiment of the present invention, the plurality of visual display devices are configured to render the virtual room module in real-time. The plurality of visual display devices are communicably coupled to the server module and the plurality of client computing modules. The plurality of visual display devices includes any two-dimensional digital display devices, digital kiosks, interactive displays, virtual reality devices and any type of two-dimensional or three-dimensional digital display systems. The visual display devices are configured to load and render a plurality of renderings depending on the angle in which a user wishes to view the three-dimensional environment in the virtual room module and the plurality of objects, materials, surfaces and textures in the three-dimensional environment.
[0029] According to one embodiment of the present invention, a method is provided for rendering a three-dimensional environment on visual display devices in the presence of dynamic objects, materials, surfaces or textures in the three-dimensional environment. The method comprises calculating lighting and shadow information for a white three-dimensional object in a three-dimensional environment; unwrapping the material of a three-dimensional object in a three-dimensional environment into a two-dimensional surface; applying lighting and shadow information on a new material based on preset rules or a formula; re-wrapping the new material on the three-dimensional object; and displaying the three-dimensional object in a three-dimensional environment on a plurality of display devices.
[0030] According to one embodiment of the present invention, the method of applying pre-baking on different materials without re-computation is configured for only un-wrapped and pre-baked materials. The method is configured only when the visual properties of the three-dimensional object including surface roughness and lustrous nature do not change appreciably due to interactivity or any other dynamic factors. The method is configured to take into account a plurality of factors including the plurality of colours and patterns that the new material possesses, in order to re-compute the pre-baking process with a drastic reduction in computation compared to applying the light and shadows calculation.
[0031] According to one embodiment of the present invention, the method is configured to be processed in the client computation module without impacting the operation and performance of a plurality of other applications that are also configured to be processed in the client computation module. The client computation module is a general-purpose computation module that is included in the plurality of client computing devices, and the client computation module is not configured for specialized graphical processing.
[0032] According to one embodiment of the present invention, a system and method is provided to enable the pre-baking mechanism in scenarios where the textures of objects and images in a virtual reality scene change dynamically. Instead of carrying out a non-real-time pre-baking on the actual texture of the objects themselves, the pre-baking is carried out on a replacement texture that is completely white. This process is carried out for each object in the scene sequentially: after pre-baking the white texture for one object, the default texture is restored before moving on to pre-baking the white texture for the next object.
[0033] According to one embodiment of the present invention, the pre-baking is performed on objects in a 3D scene considering the material to be a completely white surface that absorbs all the light falling on it. The amount of light falling on any other material is then determined from the pre-baked values of the white surface using the following formula:
applied_material[p] = (intensity in the white pre-baked surface at p − maximum intensity in the white pre-baked surface) + (material[p] − maximum intensity in the material to be applied)
Here 'p' is each pixel in the unwrapped surface.
The surface is then re-wrapped around the 3D object in the environment, producing a result equivalent to full real-time rendering.
[0034] FIG. 1 illustrates a system for optimizing the rendering time in 3D virtual reality environments. The system comprises Server Graphical Computation module 101, Pre-baking module 102, Pre-Baked Object Repository 103, Client Computation module 104, Visual Display Unit 105 and Client Graphical Computation module 106.
[0035] FIG. 2 illustrates a method for optimizing the rendering time in 3D virtual reality environments. The method includes: calculating the lighting and shadows for a white 3D object in a 3D scene (201); unwrapping the material of a 3D object in the 3D scene into a 2D surface (202); applying the lighting information to a new material based on preset rules or a formula (203); re-wrapping the material on the 3D object (204); and displaying the object in a 3D environment on a preset device (205).
[0036] Although the embodiments herein are described with various specific embodiments, it will be obvious to a person skilled in the art that the embodiments herein may be practiced with modifications.

G) ADVANTAGES OF THE INVENTION
[0037] The various embodiments of the present invention provide a system and method for rendering a 3D-scene in the presence of dynamic materials or textures. The embodiments also provide a system and method for reducing the rendering time of 3D-scenes in 3D virtual reality environments using pre-baking technology. The present invention enables deploying 3D visualization of virtual scenes, which is specifically useful when textures or materials of the objects in the scene change dynamically. The present invention also enables reducing the time required to render a 3D virtual scene, while maintaining the quality of the rendering. The present invention also enables rendering a 3D virtual scene on computing devices with a lower memory (RAM) compared to high-end computing devices, while maintaining a high quality of rendering.
[0038] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such as specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modifications. However, all such modifications are deemed to be within the scope of the claims.
CLAIMS:
We claim:
1. A system for rendering a three-dimensional environment on visual display devices in the presence of dynamic objects, materials, surfaces or textures in the three-dimensional environment, the system comprising:
a camera module;
a plurality of visual display devices;
a server module, wherein the server module further comprises a server computing module, a virtual room module, a server graphical computation module and a pre-baking module;
a pre-baked object repository module;
a plurality of client computing devices, wherein the client computing devices further comprise a client computation module and a client graphical computation module.

2. The system as claimed in claim 1, wherein the virtual room module is a programmatic or digital representation of a room constructed in a digital three-dimensional (3D) space, and wherein, the virtual room module is configured to be accessed through any computing device that is communicably coupled with the server module, and to be rendered on the plurality of visual display devices, and wherein, the virtual room module includes a digital representation of the dimensions of a plurality of objects, materials, surfaces and textures that are present in a three-dimensional environment, and wherein, the digital representation includes information on the lighting and a plurality of other parameters that define the visual representation of the plurality of objects, materials, surfaces and textures that are present in the three-dimensional environment.

3. The system as claimed in claim 1, wherein the client graphical computation module is configured to provide a rendering of the virtual room module in the plurality of visual display devices, and wherein, the rendering is a two-dimensional (2D) representation or projection of the virtual room module through a viewing port at a specific angle in the virtual room module, and wherein, the rendering includes information on a plurality of parameters that define the visual representation of the virtual room module, and wherein, the information of a plurality of parameters comprises a calculation of how the lights that exist in the room illuminate and reflect from all the objects, materials, surfaces and textures in the room.

4. The system as claimed in claim 1, wherein the pre-baked object repository module comprises a plurality of pre-baked digital representations of a plurality of objects, materials, surfaces and textures that are present in a three-dimensional environment, and wherein, pre-baking includes pre-calculating the representation of those parts of the objects, materials, surfaces and textures which are unlikely to change based on interactivity or any other dynamic factors in the three-dimensional environment.

5. The system as claimed in claim 1, wherein the pre-baking process includes a calculation step and an unwrapping step, and wherein, the calculation step includes calculation of the light and shadows in the three-dimensional environment based on pre-existing lighting conditions in the three-dimensional environment, and wherein, the calculation is a physics-based simulation of the impact of rays of light on the rendering of the plurality of objects, materials, surfaces and textures in the three-dimensional environment, and wherein, the calculation step is also configured to identify which components of the virtual room occlude completely or partially each of the sources of light in the virtual room, and wherein, the calculation step is configured to be processed in the server graphical computation module or the client graphical computation module, and wherein, the unwrapping step includes converting the surface of a three-dimensional virtual object into a two-dimensional image.
6. The system as claimed in claim 1, wherein the plurality of visual display devices are configured to render the virtual room module in real-time, and wherein, the plurality of visual display devices are communicably coupled to the server module and the plurality of client computing modules, and wherein, the plurality of visual display devices includes any two-dimensional digital display devices, digital kiosks, interactive displays, virtual reality devices and any type of two-dimensional or three-dimensional digital display systems, and wherein, the visual display devices are configured to load and render a plurality of renderings depending on the angle in which a user wishes to view the three-dimensional environment in the virtual room module and the plurality of objects, materials, surfaces and textures in the three-dimensional environment.
7. A method for rendering a three-dimensional environment on visual display devices in the presence of dynamic objects, materials, surfaces or textures in the three-dimensional environment, the method comprising:
calculating lighting and shadow information for a white three-dimensional object in a three-dimensional environment;
unwrapping the material of a three-dimensional object in a three-dimensional environment into a two-dimensional surface;
applying lighting and shadow information on a new material based on preset rules or a formula;
re-wrapping the new material on the three-dimensional object; and,
displaying the three-dimensional object in a three-dimensional environment on a plurality of display devices.

8. The method as claimed in claim 7, wherein the method of applying pre-baking on different materials without re-computation is configured for only un-wrapped and pre-baked materials, and wherein, the method is configured only when the visual properties of the three-dimensional object including surface roughness and lustrous nature do not change appreciably due to interactivity or any other dynamic factors, and wherein, the method is configured to take into account a plurality of factors including the plurality of colours and patterns that the new material possesses, in order to re-compute the pre-baking process with a drastic reduction in computation compared to applying the light and shadows calculation.
9. The method as claimed in claim 7, wherein the method is configured to be processed in the client computation module without impacting the operation and performance of a plurality of other applications that are also configured to be processed in the client computation module, and wherein, the client computation module is a general-purpose computation module that is included in the plurality of client computing devices, and wherein, the client computation module is not configured for specialized graphical processing.

Documents

Application Documents

# Name Date
1 202041056166-PROVISIONAL SPECIFICATION [23-12-2020(online)].pdf 2020-12-23
2 202041056166-OTHERS [23-12-2020(online)].pdf 2020-12-23
3 202041056166-FORM FOR STARTUP [23-12-2020(online)].pdf 2020-12-23
4 202041056166-FORM FOR SMALL ENTITY(FORM-28) [23-12-2020(online)].pdf 2020-12-23
5 202041056166-FORM 1 [23-12-2020(online)].pdf 2020-12-23
6 202041056166-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [23-12-2020(online)].pdf 2020-12-23
7 202041056166-DRAWINGS [23-12-2020(online)].pdf 2020-12-23
8 202041056166-Proof of Right [14-04-2021(online)].pdf 2021-04-14
9 202041056166-FORM-26 [14-04-2021(online)].pdf 2021-04-14
10 202041056166-Correspondence_Form1, Power of Attorney_19-04-2021.pdf 2021-04-19
11 202041056166-FORM 3 [21-12-2021(online)].pdf 2021-12-21
12 202041056166-FORM 18 [21-12-2021(online)].pdf 2021-12-21
13 202041056166-ENDORSEMENT BY INVENTORS [21-12-2021(online)].pdf 2021-12-21
14 202041056166-DRAWING [21-12-2021(online)].pdf 2021-12-21
15 202041056166-COMPLETE SPECIFICATION [21-12-2021(online)].pdf 2021-12-21
16 202041056166-FER.pdf 2022-07-25
17 202041056166-FER_SER_REPLY [25-01-2023(online)].pdf 2023-01-25
18 202041056166-COMPLETE SPECIFICATION [25-01-2023(online)].pdf 2023-01-25
19 202041056166-CLAIMS [25-01-2023(online)].pdf 2023-01-25
20 202041056166-ABSTRACT [25-01-2023(online)].pdf 2023-01-25
21 202041056166-PETITION UNDER RULE 137 [26-04-2024(online)].pdf 2024-04-26
22 202041056166-PatentCertificate24-01-2025.pdf 2025-01-24
23 202041056166-IntimationOfGrant24-01-2025.pdf 2025-01-24
24 202041056166-EVIDENCE FOR REGISTRATION UNDER SSI [18-04-2025(online)].pdf 2025-04-18

Search Strategy

1 SearchHistoryE_22-07-2022.pdf

ERegister / Renewals

3rd: 18 Apr 2025

From 23/12/2022 - To 23/12/2023

4th: 18 Apr 2025

From 23/12/2023 - To 23/12/2024

5th: 18 Apr 2025

From 23/12/2024 - To 23/12/2025

6th: 18 Apr 2025

From 23/12/2025 - To 23/12/2026