Abstract: This disclosure relates to a system and method for data visualization in a virtual reality (VR) environment that enables end users to collaborate and draw deeper insights for action in real time. The system is configured for creating data visualization charts in the VR environment with all the features of a data visualization tool, or for augmenting an existing data visualization tool with immersive 3D charts. The data visualization is performed using low-cost frugal devices, such as a VR box or Google Cardboard, compatible with Android and iOS smartphones. Real-time collaboration is provided when the users are in multiple locations, and a 360-degree view of dashboards facilitates building the context of the report/dashboard in VR. Further, the system is integrated with a voice BOT solution to access the VR dashboards/reports, wherein a BOT engine parses the voice command in natural language to fetch VR data visualization reports.
DESC:FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
METHOD AND SYSTEM FOR COLLABORATIVE DATA VISUALIZATION IN VIRTUAL REALITY
Applicant
Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[001] The present application claims priority from Indian patent application number 201821031845, filed on August 24, 2018, the complete disclosure of which, in its entirety, is herein incorporated by reference.
TECHNICAL FIELD
[002] The disclosure herein generally relates to data visualization, and, more particularly, to a method and system for collaborative data visualization of complex and multidimensional data using virtual reality (VR) displays.
BACKGROUND
[003] Data visualization commonly refers to techniques utilized to communicate data or information by encoding it as visual objects that can be displayed via a computer. Visualization is an essential component of any data analysis and/or data mining process. In many instances, a graphical representation of the geometry and topology of a data distribution can enable selection of appropriate analysis tools, revealing further insights, and aid the interpretation of the results.
[004] Recent techniques in data visualization are focused on enabling self-service along with sharing of reports for collaboration and building advanced/complex dashboards/reports. To generate the insights, datasets are visualized in two-dimensional dashboards in the browser of either a laptop or a mobile device. Computer displays typically present information in two dimensions (2D), and portability is restricted while visualizing. Similarly, using mobile devices for data visualization restricts the viewing area, thereby impacting the contextual insights that could be drawn from a complex dashboard or report. Examples of some complex reports are: (a) a word embedding graph for data scientists, (b) a scatter plot, (c) network analytics, and (d) a complex relationship graph.
[005] Conventionally, in the virtual reality (VR) environment, most of the advancements are restricted to high-end head mounted devices (HMDs) like the HTC Vive and Oculus, and it is known that one size does not fit all. Data visualization gets disrupted as humans find it difficult to wear these HMDs for extended periods of time; rather, they are comfortable wearing them only for short durations. Also, these HMDs are costly devices when compared to low-cost frugal devices such as the VR box and Google Cardboard. Further, the usage of HMDs has restricted the adoption of this technology for several use cases in the enterprise analytics field for daily operations.
[006] Also, in today's world, when it comes to data-driven informed decision making, collaboration plays a very important role. It has been a challenge to collaborate and interact with end users who often work in different locations and cannot wear HMDs for long, and collaboration for insights has always been through sharing the report via the traditional medium of asynchronous communication (email). Thus, advancements in communication and collaboration technologies are leveraged to minimize the gap of face-to-face interaction. Data visualization in virtual reality is used as an interim phase of real-world analysis, keeping HMD usage short for user comfort, to reduce the time to comprehend complex reports/dashboards and to derive deeper insights that are not possible in the 2D world.
[007] There is, therefore, a need for a system that acquires context from the real world into the virtual world and transmits actions performed in the virtual world back to the real world, integrating the virtual and real worlds seamlessly.
SUMMARY
[008] Embodiments of the present disclosure provide technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one embodiment, a method and system are provided for data visualization in VR that enable end users to collaborate and draw deeper insights for action in real time. The method and system perform data visualization using low-cost frugal devices, such as a VR box or Google Cardboard, compatible with Android and iOS smartphones. The method and system provide collaboration in real time when the users are in multiple locations and provide a 360-degree view of dashboards, which facilitates building the context of the report/dashboard in VR. The method and system are integrated with a voice BOT solution to access the VR dashboards/reports, wherein a BOT engine parses the voice command in natural language to fetch VR data visualization reports.
[009] A processor-implemented method is provided for data visualization in a virtual reality (VR) environment that enables end users to collaborate and draw deeper insights for action in real time. The method comprises one or more steps of: collecting one or more information charts from a plurality of predefined sources of information; authenticating the one or more information charts using one or more predefined parameters; transmitting the authenticated one or more information charts from a real environment to a virtual reality environment to visualize three-dimensional aspects of the charts in the virtual reality environment; identifying at least one information chart of the transmitted one or more information charts for further exploration to draw deeper insight; interacting with the identified at least one information chart by applying filters, drilling down, drilling up, point markers, zoom, and line markers to the chart based on the requirement of a predefined one or more users; and identifying a set of protocols of the virtual reality environment to connect through a voice BOT, wherein the voice BOT is configured to interface with the virtual reality environment solution to call one or more solutions with voice commands. Further, the voice BOT is invoked to access the identified information chart in the VR environment and to make one or more changes in the identified chart, a format of the identified chart in the VR environment is identified to store the transactions in the real environment, and the at least one transaction between the VR environment and the real environment is completed.
[010] A system is configured for data visualization in VR that enables end users to collaborate and draw deeper insights for action in real time. The system comprises a memory, a database, and one or more hardware processors configured to execute at least one instruction stored in the memory for the purpose of performing one or more processes disclosed herein. The one or more processors are also coupled to a communication interface to receive input data as well as to transmit instructions from the real world to the VR environment for action to be taken, and vice versa. Further, the system includes a collection module, an authentication module, a transmitting module, an identification module, an interaction module, a determination module, an invocation module, a format identifying module, and a transaction module.
[011] Herein, the collection module of the system is configured to collect one or more information charts from a plurality of predefined sources of information. The collected one or more information charts are authenticated at the authentication module using one or more predefined parameters. The transmitting module of the system is configured to transmit the authenticated one or more information charts from a real environment to a VR environment to visualize three-dimensional aspects of the charts in the virtual reality environment. The identification module of the system is configured to identify at least one information chart of the transmitted one or more information charts for further exploration to draw deeper insight, and the interaction module of the system is configured to interact with the identified at least one information chart by applying filters, drilling down, drilling up, point markers, zoom, and line markers to the chart based on the requirement of a predefined one or more users. A set of protocols of the VR environment is determined at the determination module to connect through a voice BOT. It is to be noted that the voice BOT is configured to interface with the virtual reality environment solution to call one or more solutions with voice commands. The invocation module is configured to invoke the voice BOT to access the identified information chart in the VR environment and to make one or more changes in the identified chart, and the format identifying module is configured to identify a format of the identified chart in the virtual reality environment to store the transactions in the real environment. The transaction module of the system is configured to complete the at least one transaction between the virtual reality environment and the real environment.
[012] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[013] FIG. 1 illustrates a system for generating 3D visualizations of a multidimensional data space according to some embodiments of the present disclosure;
[014] FIG. 2 illustrates a multidimensional data visualization computing system according to some embodiments of the present disclosure;
[015] FIG. 3 is a functional block diagram of various modules stored in module(s) of a memory of the system according to some embodiments of the present disclosure;
[016] FIG. 4 is a flow diagram illustrating a process for generating 3D visualizations of a multidimensional data space according to some embodiments of the present disclosure; and
[017] FIG. 5 is an exemplary flow diagram illustrating a process for generating a multidimensional data visualization in accordance with some embodiments of the present disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS
[018] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the claims (when included in the specification).
[019] The embodiments herein provide a method and a system for data visualization in a VR environment that enables one or more end users to collaborate and draw deeper insights for action in real time. It would be appreciated that the method and system perform data visualization using low-cost frugal devices, such as a VR box or Google Cardboard, compatible with Android and iOS smartphones. The method and system provide collaboration in real time when the one or more end users are in multiple locations and provide a 360-degree view of dashboards, which facilitates building the context of the report/dashboard in the VR environment. Further, the method and system are integrated with a voice BOT solution to access the VR dashboards/reports, wherein a BOT engine parses the voice command in natural language to fetch VR environment data visualization reports.
[020] Referring now to the drawings, and more particularly to FIG. 1 through FIG. 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[021] Referring to FIG. 1, a system (100) is illustrated for data visualization in a VR environment that enables one or more end users to collaborate and draw deeper insights for action in real time. In an embodiment of the present disclosure, the system (100) comprises a memory (102) for acquiring and storing data that is provided as input by the user depending upon the context. The system (100) comprises a database (108) for storing the received data and the corresponding output. The system (100) further comprises one or more hardware processors (104) configured to execute at least one program stored in the memory (102) for the purposes of performing one or more processes disclosed herein. The processor (104) may be coupled to a communication interface (106) to receive input data as well as to transmit instructions from the real world to the virtual world for action to be taken, and vice versa. The processor (104) may also receive and transmit data in addition to storing instructions for the program; the memory (102) may store preliminary, intermediate, and final datasets involved in the techniques that are described herein.
[022] In the preferred embodiment of the present disclosure, the system (100) further comprises a collection module (110), an authentication module (112), a transmitting module (114), an identification module (116), an interaction module (118), a determination module (120), an invocation module (122), a format identifying module (124) and a transaction module (126).
[023] In the preferred embodiment of the present disclosure, the collection module (110) of the system (100) is configured to collect one or more information charts from a plurality of predefined sources of information. The one or more information charts include a word embedding graph for data scientists, a scatter plot, network analytics, and a complex relationship graph. It is to be noted that each plotter has the responsibility of making an appropriate request to a server and plotting the corresponding graph. Custom scripts are built to create the charts using basic objects. This results in higher adoption in an enterprise, as almost all employees with smartphones can immerse themselves in data and derive insights.
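The per-chart plotter responsibility described above may be sketched as follows; the server endpoint, class name, and the mapping of records to basic objects are illustrative assumptions and not part of the disclosure.

```python
import json
import urllib.request

# Hypothetical endpoint; the disclosure does not name a specific server API.
SERVER_URL = "https://example.com/api/charts"

class ScatterPlotter:
    """Each plotter requests its own data from the server and plots it."""

    chart_type = "scatter"

    def fetch(self):
        # Request only the data needed for this chart type.
        req = urllib.request.Request(f"{SERVER_URL}?type={self.chart_type}")
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

    def plot(self, data):
        # Build the chart from basic objects, e.g. one sphere per data point.
        return [("sphere", p["x"], p["y"], p["z"]) for p in data["points"]]
```

A plotter for a network graph or word embedding chart would follow the same shape, differing only in its `chart_type` and the basic objects it emits.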
[024] In the preferred embodiment of the present disclosure, the authentication module (112) of the system (100) is configured to authenticate the collected one or more information charts using one or more predefined parameters. The system (100) is configured to obtain input from one or more users; depending upon the context, a button press can be interpreted by the system (100) as conveying different information. A simple input modality involves allowing the user to move the position of the data visualization relative to the user when the button is not being pressed, and to rotate the 3D data visualization when the button is pressed. In an embodiment, a ray caster based on the user's gaze direction and/or a remote control that simply includes one or more buttons can be used as user inputs. It is to be noted that any of a variety of processing can be initiated based upon a button input as appropriate to a specific user interface context and the requirements of a given application. Furthermore, any of a variety of additional input modalities can also be supported as appropriate to the needs of a given data visualization application.
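The simple single-button modality described above may be sketched as follows; the state representation and function name are illustrative assumptions only.

```python
def handle_input(button_pressed, delta, state):
    """Interpret the single button per the modality described above:
    when unpressed, the input moves the visualization relative to the user;
    when pressed, the same input rotates the 3D data visualization."""
    if button_pressed:
        state["rotation"] = tuple(r + d for r, d in zip(state["rotation"], delta))
    else:
        state["position"] = tuple(p + d for p, d in zip(state["position"], delta))
    return state
```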
[025] It would be appreciated that the one or more users are authenticated based on the privileges assigned to them in the reporting universe. The authentication of the one or more users in the VR environment is integrated with the authentication of the reporting application, which may or may not be integrated with enterprise authentication systems such as AD/LDAP. Thus, only the one or more users who have access to the data will be authorized to visualize the reports in the VR environment as well.
[026] In the preferred embodiment of the present disclosure, the transmitting module (114) of the system (100) is configured to transmit the authenticated one or more information charts from a real environment to a VR environment to visualize three-dimensional aspects of the charts in the virtual reality environment.
[027] It would be noted that the ability to visualize the data within a multidimensional data space in 3D opens up a vast array of possibilities for the analysis of complex data. 3D data visualization systems in accordance with an embodiment of the present invention enable data exploration to be performed in a collaborative manner. In another embodiment, one or more users who may or may not be in the same physical location can independently explore the same shared, virtual, multidimensional data space. One user, as a lead user, may host a multiplayer interactive session in which other users are called to view the 3D data visualization space from the same viewpoint controlled by the lead user. While the one or more users are physically present in different geographical locations, insights are shared for their action in real time.
[028] Herein, the lead user shares the insights using the immersive experience in VR environment with all the other users and interacts with the reports (for example, draws lines between the points in the scatter plot, selects only certain nodes in the network diagram, updates the summary wall based on these insights). Along with these, certain points are discussed and noted in the VR environment.
[029] In the preferred embodiment of the present disclosure, the identification module (116) of the system (100) is configured to identify at least one information chart of the transmitted one or more information charts for further exploration to draw deeper insight. In an example, a network graph used to trace a fraudulent transaction requires the user to trace the money trail from one node of the network to another; in the process, the user filters out or deletes the nodes that lead to the wrong trail. The same cannot be achieved in a 2D chart, as the user would not be able to segregate or trace the links in the complex network of transactions, no matter how much the chart is zoomed in, due to overlaps.
[030] In the preferred embodiment of the present disclosure, the interaction module (118) of the system (100) is configured to interact with the identified at least one information chart by applying filters, drilling down, drilling up, point markers, zoom, and line markers to the chart based on the requirement of a predefined one or more users. The operations which the one or more users apply in the real environment while exploring the report/dashboard on the mobile device are stored, and the same are used to carry forward the context from the real environment to the VR environment.
[031] The filters and drill-downs applied by the one or more users in the real environment on the mobile device are carried forward and presented to the one or more users. In turn, the one or more users will be able to access the filters and drill-downs when they enter the VR environment. In the VR environment, the one or more users immerse themselves in between the data points of the charts in the report/dashboard and derive deeper insights. Upon completion of the task in the VR environment, the lead user and the other users come back to the real environment, and the context from the VR environment is carried to the real environment with all the interactions and insights noted in the virtual environment. Further, analysis, reports, and actions are continued in the real environment. Users may access the relevant data by connecting to the backend databases rather than storing the data locally on the device.
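The carry-forward of context between the real and virtual environments may be sketched as a session store that records operations and replays them on entry to the other environment; the class, action names, and chart-state shape are illustrative assumptions only.

```python
class SessionContext:
    """Stores operations applied in one environment so they can be
    replayed when the user enters the other environment."""

    def __init__(self):
        self.operations = []  # ordered list of (action, payload)

    def record(self, action, payload=None):
        self.operations.append((action, payload))

    def replay(self, chart_state):
        # Re-apply each stored operation to the target environment's chart.
        for action, payload in self.operations:
            if action == "filter":
                chart_state["filters"].append(payload)
            elif action == "drill_down":
                chart_state["level"] += 1
        return chart_state
```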
[032] When one or more users transition to interact with the system (100) via an immersive 3D display such as a VR headset, the input of the one or more users can be obtained using a variety of input modalities such as voice or a Bluetooth controller. A simple and frugal cardboard-like headset, with a smartphone to be kept in it, is required per user. Speech recognition engines with speech-to-text capabilities and speaker identification capabilities are used to analyze the user's voice commands and store the context.
[033] In the preferred embodiment of the present disclosure, the determination module (120) of the system (100) is configured to determine a set of protocols of the VR environment to connect through a voice BOT. The voice BOT is configured to interface with the VR environment to call one or more solutions with voice commands. The one or more users either operate on the dashboard themselves or instruct the voice BOT to perform relevant operations. When the information report/dashboard requires further exploration to draw deeper insights, and to reduce the time to comprehend these insights, the one or more users choose to explore the information report/dashboard in the VR environment.
[034] In the preferred embodiment of the present disclosure, the invocation module (122) of the system (100) is configured to invoke the voice BOT to access the identified one or more information charts in the virtual reality environment and to make one or more changes in the identified chart.
[035] It would be appreciated that the voice BOT of the system (100), which can interface with the VR environment, can call the information chart with the voice commands. The system (100) has a Bluetooth controller with a joystick and four functional buttons. The Bluetooth controller enables not only the functionalities of movement of the user in the VR environment and selection of points of interest, but also additional functionalities required, such as toggling a laser pointer ON/OFF, filters, zoom, etc. The one or more users can also send voice commands when in the VR environment, which uses the speech recognition engine to convert the user's commands into actionable events. Further, when multiple users exist in the virtual environment, the exchange of information or speech also happens through voice. The system also comprises Unity3D, a dominant immersive VR development platform (mostly for game development, but also for professional activities), which is required for rendering the virtual environment.
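The conversion of a recognized voice command into an actionable VR event may be sketched as a simple lookup over the transcript produced by the speech recognition engine; the command phrases and event tuples below are illustrative assumptions, not the disclosed command set.

```python
# Hypothetical command table; actual phrases depend on the speech engine
# and the operations exposed in the VR environment.
COMMANDS = {
    "laser on": ("laser", True),
    "laser off": ("laser", False),
    "zoom in": ("zoom", 1.25),
    "apply filter": ("filter", None),
}

def to_event(transcript):
    """Map a speech-to-text transcript to an actionable VR event."""
    key = transcript.strip().lower()
    if key not in COMMANDS:
        return ("unknown", None)
    return COMMANDS[key]
```

A production system would likely use fuzzy or intent-based matching rather than exact phrases, but the transcript-to-event step remains the same.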
[036] In the preferred embodiment of the present disclosure, the format identifying module (124) of the system (100) is configured to identify a format of the identified chart in the VR environment to store the transactions in the real environment.
[037] In the preferred embodiment of the present disclosure, the transaction module (126) of the system (100) is configured to complete the at least one transaction between the VR environment and the real environment.
[038] FIG. 2 shows the multiple views, from business to technology, at a high level of the system for data visualization in the VR environment. The system (100) is configured for creating data visualization charts in the VR environment with all the features of a data visualization tool, or for augmenting an existing data visualization tool with immersive 3D charts. Upon authentication, one or more users may select from existing three-dimensional charts for visualization in the VR environment and may access the existing reports in the VR environment by integrating with a voice BOT on mobile. By using cognitive services, the voice BOT understands the user's requirements of parameters (dimensions) and report names in natural language, thereby connecting to the backend services to fetch, refresh, and open the right report/dashboard on the user's mobile.
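The extraction of a report name and a dimension parameter from a natural-language request may be sketched as follows; the regular expressions and supported phrasing are illustrative assumptions, since the disclosure relies on cognitive services rather than any specific parsing rules.

```python
import re

def parse_request(utterance):
    """Extract the report name and a simple dimension filter from a
    natural-language request such as 'show the sales report for Europe'."""
    text = utterance.lower()
    m = re.search(r"(\w+) (?:report|dashboard)", text)
    report = m.group(1) if m else None
    f = re.search(r"for ([\w ]+)$", text)
    dimension = f.group(1) if f else None
    return {"report": report, "dimension": dimension}
```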
[039] Referring to FIG. 3, a schematic diagram is shown wherein the VR headset requires a mobile device to be kept in it, and the mobile device comprises a plurality of layers to generate multidimensional data visualizations. The users request dashboards and reports via a chat window on the mobile device depending on the context. The request is sent to a middleware layer, which fetches the data from the backend servers, parses it, and publishes the data as JSON back to the mobile device over the HTTPS protocol using web service brokers. This data is then processed and plotted in the business layer of the mobile device and visualized in the presentation layer of the mobile device, wherein the native device features of the mobile, such as the gyroscope and accelerometer, are accessed. The user actions are communicated via the middleware layer and maintained in the session management component, which is used to store and maintain the context. The middleware layer receives the input from the user over a network, performs the necessary action to be taken, and carries it forward from the real world to the virtual world and vice versa. The middleware layer integrates with the backend database layer and with authentication and authorization components, so that users can access the visualized data in a safe, secure, and compliant manner by connecting to the databases.
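The fetch-parse-publish responsibility of the middleware layer may be sketched as follows; the handler signature, request shape, and session field are illustrative assumptions (the actual layer would run behind HTTPS with web service brokers, as described above).

```python
import json

def middleware_handler(request, backend):
    """Fetch data from the backend, parse it, and publish JSON back to
    the mobile device; session state is kept to carry context between
    the real and virtual environments."""
    session = request.setdefault("session", {})
    rows = backend(request["report"])            # fetch from backend servers
    session["last_report"] = request["report"]   # maintain context
    payload = {"report": request["report"], "data": rows}
    return json.dumps(payload)                   # published over HTTPS in practice
```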
[040] As depicted in FIG. 4, a processor-implemented method (300) according to the present invention is depicted in the flowchart. The method illustrates creating data visualization charts in the VR environment with all the features of a data visualization tool, or augmenting an existing data visualization tool with immersive 3D charts.
[041] Initially, at step (302), one or more information charts are collected from a plurality of predefined sources of information at the collection module (110) of the system (100). The one or more information charts include a word embedding graph for data scientists, a scatter plot, network analytics, and a complex relationship graph.
[042] In the preferred embodiment of the present disclosure, at the next step (304), the one or more information charts are authenticated using one or more predefined parameters at the authentication module (112) of the system (100).
[043] In the preferred embodiment of the present disclosure, at the next step (306), the authenticated one or more information charts are transmitted from a real environment to a VR environment at the transmitting module (114) of the system (100) to visualize three-dimensional aspects of the charts in the virtual reality environment. The ability to visualize the data within a multidimensional data space in 3D opens up a vast array of possibilities for the analysis of complex data; 3D data visualization systems in accordance with an embodiment of the present disclosure enable data exploration to be performed in a collaborative manner.
[044] In the preferred embodiment of the present disclosure, at the next step (308), at least one information chart of the transmitted one or more information charts is identified at the identification module (116) of the system (100) for further exploration to draw deeper insight. Herein, the visualization reports requested by the one or more users are fetched from the server and presented to the one or more users. Upon presentation of the reports, the one or more users then start exploring the report in virtual reality mode.
[045] In the preferred embodiment of the present disclosure, at the next step (310), the identified at least one information chart is interacted with by applying filters, drilling down, drilling up, point markers, zoom, and line markers to the chart based on the requirement of a predefined one or more users at the interaction module (118) of the system (100). It would be appreciated that the users either operate on the dashboard themselves or instruct the BOT to perform relevant operations.
[046] In the preferred embodiment of the present disclosure, at the next step (312), a set of protocols of the virtual reality environment is determined at the determination module (120) of the system (100) to connect through a voice BOT. The voice BOT is configured to interface with the VR environment to call one or more solutions with voice commands. The operations which the one or more users apply in the real environment while exploring the report on the mobile device are stored, carried along with the context from the real environment to the VR environment, and presented to the one or more users. In turn, the one or more users will be able to access the filters and drill-downs when they enter the virtual environment.
[047] In the preferred embodiment of the present disclosure, at the next step (314), the voice BOT is invoked at the invocation module (122) of the system (100) to access the identified one or more information charts in the virtual reality environment and to make one or more changes in the identified chart.
[048] In the preferred embodiment of the present disclosure, at the next step (316), the format of each of the one or more information charts in the virtual reality environment is identified at the format identifying module (124) of the system (100) to store the transactions in the real environment.
[049] It is to be noted that a single user, as a lead user, may host a multiplayer interactive session in which one or more other users are called to view the 3D data visualization space from the same viewpoint controlled by the lead user. While the one or more other users are physically present in different geographical locations, insights are shared for their action in real time. The lead user shares the insights using the immersive experience in the VR environment with all the other users and interacts with the reports (for example, draws lines between the points in the scatter plot, selects only certain nodes in the network diagram, and updates the summary wall based on these insights). Along with these, certain points are discussed and noted in the virtual environment.
[050] In the preferred embodiment of the present disclosure, at the last step (318), at least one transaction between the VR environment and the real environment is completed at the transaction module (126) of the system (100). Upon completion of the task in the VR environment, the lead user and the other one or more users come back to the real world, and the context from the virtual environment is carried to the real world with all the interactions and insights noted in the VR environment. Further, analysis, reports, and actions are continued in the real world. Further, the one or more users may access the relevant data by connecting to the backend databases rather than storing the data locally on the device.
[051] In one example, two users, A and B, are located in different places and are willing to discuss the sales reports. A requests the voice BOT on his mobile device to fetch the sales report from the server with defined parameters. Both A and B enter the same virtual space and are prepared to interact with the same dashboard to draw insights. Herein, the sales dashboard has summary charts, a scatter plot, bar charts, and line graphs in 3D, thereby making it intuitive for A and B. Both of them discuss two unique cases of sales for which they want to find the correlation of certain parameters using a scatter plot. Amongst thousands of points, A filters some points in the scatter plot using the buttons available in the virtual environment for interactivity. B is also able to observe the changes in the scatter plot in real time. A and B move in between these points to explore the two unique cases. A interacts with the points, and B notices the distance between the two points as a label on the scatter plot.
[052] Referring to FIG. 5, an exemplary flow diagram, a user requests dashboards and reports in a chat window on his mobile device. Using cognitive services, the voice BOT understands the user's requirements of parameters (dimensions) and report names in natural language and connects to the backend services to fetch, refresh and open the right report/dashboard on the user's mobile. The user interacts with the report/dashboard by applying certain filters and drilling down the charts based on his requirement. The user either operates on the dashboard itself or commands the BOT to perform these operations. A complex report/dashboard needs further exploration to draw deeper insights, and the limited screen size of the mobile sometimes increases the time needed to comprehend these insights. Hence, the user chooses to explore the report/dashboard in virtual reality mode. All operations which the user had applied in the real world while exploring the report/dashboard on his mobile are stored and used to carry forward the context from the real world to the virtual world. Thus, when the user enters the virtual world, all charts in the report/dashboard have the filters and drill-downs applied by the user in the real world on his mobile. In the virtual world, the user immerses himself in between the data points of the charts in the report/dashboard and derives deeper insights. Wanting to share these insights with the team for action in real time, he hosts a multiplayer session for collaboration and invites his colleagues into the virtual world, while these colleagues are physically present in different geographical locations with internet connectivity. The user shares his insights, using the immersive experience in the virtual environment, with all the colleagues and interacts with the reports, for example, by drawing lines between the points in the scatter plot, selecting only certain nodes in the network diagram, and updating the summary wall based on these insights.
Along with these, certain points are discussed and noted in the virtual reality environment. Once the task is completed in the virtual environment, the user and his colleagues come back to the real world. However, the context from the virtual world is carried to the real world with all interactions and insights noted. Further analysis, reports and actions continue in the real world. Hence, data visualization in virtual reality is used in an interim period of real-world analysis, keeping HMD usage short for an apt user experience, to reduce the time to comprehend complex reports/dashboards and to derive deeper insights which are not possible in the 2D world.
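The context carry-over between the real-world mobile view and the VR view can be sketched as serializing and restoring the dashboard state. The JSON schema and function names below are assumptions for illustration; the disclosure does not specify a storage format.

```python
import json

def save_context(filters, drilldowns, notes):
    """Serialize the dashboard state when the user switches environments."""
    return json.dumps({"filters": filters,
                       "drilldowns": drilldowns,
                       "notes": notes})

def restore_context(blob):
    """Re-apply the saved state on entry to the other environment."""
    return json.loads(blob)

# The user filters on mobile, then enters VR with the same state...
blob = save_context({"region": "EMEA"}, ["quarter"], [])
ctx = restore_context(blob)

# ...annotates in VR, and the enriched context returns to the real world.
ctx["notes"].append("outlier in Q3 scatter plot")
blob_back = save_context(ctx["filters"], ctx["drilldowns"], ctx["notes"])
```

Because the state round-trips in both directions, the filters and drill-downs applied on mobile appear in VR, and the notes taken in VR survive the return to the real world.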
[053] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[054] The embodiments of the present disclosure herein address the unresolved problem of enabling an enterprise to use a VR environment as a medium of data visualization, given the limitations of VR such as (a) nausea caused by spending too much time in the VR environment and (b) the high cost of HMDs. These limitations are overcome by keeping the real-world context of reporting and data visualization, where the user spends the majority of his time, carrying the same context and interactions with the charts in and out of the virtual world of data visualizations to avoid nausea, and using low-cost HMDs to collaborate across users from CXOs to the rank and file, thereby improving the adoption and adaptation of VR for enterprise data visualization.
[055] It is to be understood that the scope of the protection is extended to such instructions and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more instructions of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[056] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[057] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[058] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[059] It is intended that the disclosure and examples be considered as exemplary only, with a true scope and spirit of disclosed embodiments being indicated by the following claims.
CLAIMS:
1. A processor-implemented method comprising one or more steps of:
collecting one or more information charts from a plurality of predefined sources of information;
authenticating the one or more information charts using one or more predefined parameters;
transmitting the authenticated one or more information charts from a real environment to a virtual reality (VR) environment to visualize three-dimensional aspects of the one or more information charts in the VR environment;
identifying at least one information chart of the transmitted one or more information charts for further exploration to draw deeper insight;
interacting with the identified at least one information chart by applying a plurality of editing tools on the information chart based on requirement of the one or more users, wherein the plurality of editing tools includes filters, drilling down, drilling up, point markers, zoom and line markers;
determining a set of protocols of the VR environment to connect through a voice BOT, wherein the voice BOT is configured to interface with the VR environment to call one or more solutions using voice commands;
invoking the voice BOT to access the at least one identified information chart in the VR environment and to make one or more changes in the at least one identified chart;
identifying a format of the at least one identified chart in the VR environment to store one or more transactions in the real environment; and
completing the at least one transaction between the VR environment and the real environment.
2. The method claimed in claim 1, further comprising:
receiving a response from the one or more users present in different geographical locations of a real environment; and
processing the received response to complete the transaction within the VR environment.
3. The method claimed in claim 1, wherein the voice BOT comprises a speech recognition engine with speech-to-text capabilities and speaker identification capabilities, which are used to analyze the users’ voice commands.
4. The method claimed in claim 1, wherein a lead user of the one or more users hosts a multiplayer interaction in which other users are called to view the 3D data visualization in the virtual reality environment from the same viewpoint as controlled by the lead user.
5. The method claimed in claim 1, wherein the one or more transactions carried out in the virtual reality environment are moved to the real environment with all the interactions and insights noted in the virtual reality environment.
6. A system comprising:
a collection module configured to collect one or more information charts from a plurality of predefined sources of information;
an authentication module configured to authenticate the one or more information charts using one or more predefined parameters;
a transmitting module configured to transmit the authenticated one or more information charts from a real environment to a virtual reality (VR) environment to visualize three-dimensional aspects of the one or more information charts in the VR environment;
an identification module configured to identify at least one information chart of the transmitted one or more information charts for further exploration to draw deeper insight;
an interaction module configured to interact with the identified at least one information chart by applying a plurality of editing tools on the information chart based on requirement of the one or more users, wherein the plurality of editing tools includes filters, drilling down, drilling up, point markers, zoom and line markers;
a determination module configured to determine a set of protocols of the VR environment to connect through a voice BOT, wherein the voice BOT is configured to interface with the VR environment to call one or more solutions using voice commands;
an invocation module configured to invoke the voice BOT to access the at least one identified information chart in the VR environment and to make one or more changes in the at least one identified chart;
a format identifying module configured to identify a format of the at least one identified chart in the VR environment to store one or more transactions in the real environment; and
a transaction module configured to complete at least one transaction between the VR environment and the real environment.
7. The system claimed in claim 6, further comprising:
one or more means of receiving configured to receive a response from the one or more users present in different geographical locations of a real environment; and
one or more means of processing configured to process the received response to complete the transaction within the VR environment.
8. The system claimed in claim 6, wherein the voice BOT comprises a speech recognition engine with speech-to-text capabilities and speaker identification capabilities, which are used to analyze the users’ voice commands.
9. The system claimed in claim 6, wherein a lead user of the one or more users hosts a multiplayer interaction in which other users are called to view the 3D data visualization in the virtual reality environment from the same viewpoint as controlled by the lead user.
10. The system claimed in claim 6, wherein the one or more transactions carried out in the virtual reality environment are moved to the real environment with all the interactions and insights noted in the virtual reality environment.