Abstract: The present disclosure relates to a method and system for providing industrial safety training to one or more users. The system comprises virtual reality (VR) headsets [102]. Each VR headset [102] comprises a processing unit [104], a memory unit [108] and a user interface [110]. Further, the system comprises haptic devices [106a-106n] coupled to the one or more VR headsets [102]. The VR headsets [102] are connected to a central server [114] via the communication unit [112] and are also connected to each other via the communication unit [112]. Also, the VR headsets [102] display a media content. The haptic devices [106a-106n] may generate a haptic feedback in synchronization with the media content being displayed via the user interface [110], and a user input received via: the haptic devices [106a-106n], and the VR headsets [102]. [FIG. 1]
FORM 2
THE PATENTS ACT, 1970
(39 OF 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
“SYSTEM AND METHOD FOR PROVIDING INDUSTRIAL SAFETY
TRAINING TO USERS”
We, Prolearner Interactives Pvt. Ltd., an Indian National, of C-201, Ganesh Meridian, SG Highway, Sola, Ahmedabad, GJ - 380060, India.
The following specification particularly describes the invention and the manner in which it is to be performed:
FIELD OF THE INVENTION:
The present invention generally relates to virtual reality devices and more particularly to virtual reality systems for providing industrial safety training to users.
BACKGROUND OF THE DISCLOSURE:
The following description of the related art is intended to provide background information pertaining to the field of the disclosure. This section may include certain aspects of the art that may be related to various features of the present disclosure. However, it should be appreciated that this section is used only to enhance the understanding of the reader with respect to the present disclosure, and not as admissions of the prior art.
Industrial environments are at high risk of hazards, and many of them are prone to accidents as well. These environments face safety challenges for various reasons which extend beyond just human errors. A fire in the workplace or a chemical hazard can not only affect the production output of the industry, but can also lead to loss of life or irreparable damage to people or property. Implementing standardized industrial safety measures is the best way to ensure smooth-running operation of workplaces as well as to preserve the interests of the workers employed at such places.
Safety training is essential for workers, especially those working in hazard-prone industrial environments, because of the higher risks in these areas. Thus, there is an ethical as well as a legal liability on employers to contribute to safety and health by providing proper formal training to the workers. Further, employees should contribute to safety and health by learning formal health and safety measures and implementing them at the workplace when needed.
However, safety education has been insufficient due to the unstructured nature of, and lack of standards compliance in, available safety training. It is also not always possible to provide physical training, for several reasons. For example, existing training methods that use actual tools in relation to fire involve the use of consumable materials, limited training space, facility management issues, and the risk of safety hazards to novices from voltage, current, heat release, spatter, etc. Further, safety training composed of text and simple videos is boring for people and thus not beneficial. Also, in some instances people find it difficult to cope with the language in which the training is imparted. There are many vocational training institutes for imparting industrial training to people, but there is a huge gap between the number of people who need the training and the total number of seats available for training. As a result, people do not receive formal training and do not know the formal standards of industrial safety.
In order to solve the above problems, there is an imperative need to create an environment that is the same as, or at least similar to, the actual work environment, so as to impart industrial safety training to multiple people at the same time. This will help employees receive standards-compliant formal training in industrial safety, and will help employers provide formal and efficient training at a minimized cost.
SUMMARY OF THE DISCLOSURE
This section is provided to introduce certain objects and aspects of the present invention in a simplified form that are further described below in the detailed description. This summary is not intended to identify the key features or the scope of the claimed subject matter.
One object of the present disclosure is to provide a method and system for providing industrial safety training to users that overcomes at least some of the limitations of the existing approaches for imparting industrial safety training. Another object of the present disclosure is to provide a method and system that
enables providing structured industrial safety training that is standard compliant. Yet another object of the present disclosure is to provide a method and system that enables providing structured industrial safety training that creates an environment that is the same as or at least similar to the actual work environment. Yet another object of the present disclosure is to provide a method and system that enables providing structured industrial safety training that keeps users engaged. Yet another object of the present disclosure is to provide a method and system that enables providing structured industrial safety training in various languages. Yet another object of the present disclosure is to provide a method and system that enables providing structured industrial safety training that overcomes the limitation of the gap between the number of people who need the training and the total number of seats available for training. Yet another object of the present disclosure is to provide a method and system that enables providing formal and efficient training with minimized cost of training.
In order to achieve at least one of the objectives mentioned above, one aspect of the present invention relates to a method for providing industrial safety training to one or more users. The method comprises displaying, by one or more VR headsets, a media content. Each VR headset of the one or more VR headsets comprises at least a processing unit, a memory unit, and a user interface. Also, the one or more VR headsets are connected to a central server via a communication unit and the one or more VR headsets are connected to each other via the communication unit. Further, the method comprises generating, by one or more haptic devices, a haptic feedback in synchronization with the media content being displayed on the user interface on the one or more VR headsets, and a user input received via: at least one of the one or more haptic devices, and the one or more VR headsets. The one or more haptic devices are included in one or more sets of haptic devices, and the one or more sets of haptic devices are coupled to the one or more VR headsets.
Another aspect of the present invention relates to a system for providing industrial safety training to one or more users. The system comprises one or more virtual reality (VR) headsets. Each VR headset of the one or more VR headsets comprises at least a processing unit, a memory unit and a user interface. Further, the system comprises one or more sets of haptic devices coupled to the one or more VR headsets. Each set of haptic devices comprises one or more haptic devices. Further, the system comprises a communication unit coupled to the one or more VR headsets, and the one or more sets of haptic devices. The one or more VR headsets are connected to a central server via the communication unit and the one or more VR headsets are connected to each other via the communication unit. Also, each VR headset is configured to display a media content via the processing unit and the user interface. Further, the one or more haptic devices are configured to generate a haptic feedback in synchronization with the media content being displayed on the user interface on the one or more VR headsets, and a user input received via: at least one of the one or more haptic devices, and the one or more VR headsets.
BRIEF DESCRIPTION OF DRAWINGS
The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
Figure 1 illustrates an exemplary overview of components of a system for providing industrial safety training to users, in accordance with exemplary embodiments of the present invention.
Figure 2 illustrates an exemplary overview of a system for revalidating user accounts of users for providing industrial safety training to users, in accordance with exemplary embodiments of the present invention.
Figure 3 illustrates an exemplary overview of a system for re-licensing VR headsets for providing industrial safety training to users, in accordance with exemplary embodiments of the present invention.
Figure 4 illustrates an exemplary flow chart of a method for providing industrial safety training to users, in accordance with exemplary embodiments of the present invention.
The foregoing shall be more apparent from the following more detailed description of the disclosure.
DESCRIPTION OF THE INVENTION
In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address any of the problems discussed above or might address only some of the problems discussed above.
The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure.
Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the disclosure as set forth.
Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail.
Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed but could have additional steps not included in a figure.
The word “exemplary” and/or “demonstrative” is used herein to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” and/or “demonstrative” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, such terms are intended to be inclusive—in a
manner similar to the term “comprising” as an open transition word—without precluding any additional or other elements.
As used herein, a “processor” or a “processing unit” may be a general-purpose or a special-purpose processing unit. Also, as used herein, a “processing unit” or “general-purpose processing unit” or “special-purpose processing unit” or “processor” or “operating processor” includes one or more processors, wherein a processor refers to any logic circuitry for processing instructions. A processor may be a general-purpose processor, a special-purpose processor, a conventional processor, a digital signal processor, a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits, Field Programmable Gate Array circuits, any other type of integrated circuit, etc. The processor may perform signal coding, data processing, input/output processing, and/or any other functionality that enables the working of the system according to the present disclosure. More specifically, the processor or processing unit is a hardware processor.
As used herein, a “central server” may be any electrical, electronic and/or computing device or equipment capable of implementing the features of the present disclosure. The central server may include, but is not limited to, a mobile phone, smart phone, laptop, a general-purpose computer, desktop, personal digital assistant, tablet computer, wearable device or any other computing device which is capable of implementing the features of the present disclosure. Also, the central server may contain an input means configured to receive an input from and/or an output means to send an output to a processing unit, a transceiver unit, a storage unit and any other such unit(s) which are required to implement the features of the present disclosure.
As used herein, “storage unit” or “memory unit” refers to a machine or computer-readable medium including any mechanism for storing information in a form
readable by a computer or similar machine. For example, a computer-readable medium includes read-only memory (“ROM”), random access memory (“RAM”), magnetic disk storage media, optical storage media, flash memory devices or other types of machine-accessible storage media. The storage unit stores at least the data that may be required by one or more units of the system to perform their respective functions. The memory unit may be distributed in various components of the system or may also form a part of a remote server with which the components of the disclosed invention may be interacting.
As used herein, “haptic feedback” may include feedback perceptible to a user via the sense of touch by applying, for example, forces, vibrations, or other movements. The haptic feedback may be controlled to provide a signal indicative of a real world sound, speech or experience that may coincide with a user's interaction with the computing device or other activities performed using the computing device. Haptic feedback in the disclosed embodiments or implementations may be output with varied timing, duration, patterns, or intensity using a single haptic feedback device or in conjunction with additional “haptic feedback devices” or “haptic devices”. Thus, as used herein, a “haptic feedback device” or “haptic device” is a hardware system configured to generate a haptic feedback according to a received signal, and/or receive instructions from a source to generate a haptic feedback. The haptic feedback may be generated based on a haptic feedback profile identifying haptic feedback timing, duration, pattern or intensities that mimic recognizable sounds or that provide unique indications personal to the user or the user's interaction with a computing device. Numerous different haptic feedback profiles may be defined for a variety of user interactions with an application executed on the computing device. In some embodiments or implementations, variable haptic feedback may be provided together with a sound or visual indication.
As used herein, a “user interface” typically includes an output device in the form of a display, such as a liquid crystal display (LCD), cathode ray tube (CRT) monitor, light emitting diode (LED) screen, etc., and/or one or more input devices such as touchpads or touchscreens. The display may be a part of a portable electronic device such as the VR headset, smartphones, tablets, mobile phones, wearable devices, etc. The display may also be a monitor, LED/LCD screen, television screen, etc. that is not portable. The display is typically configured to provide visual information such as text and graphics. An input device is typically configured to perform operations such as issuing commands, selecting, and moving a cursor or selector in an electronic device.
As disclosed in the background section, safety education has been insufficient due to the unstructured nature of, and lack of standards compliance in, available training, while the same is needed for workers working in hazard-prone industrial environments. Further, the existing technologies have many limitations. For instance, the existing solutions are unable to create an environment similar to the actual work environment and impart formal and efficient training to multiple employees at the same time at a minimized cost. In order to overcome at least some of the limitations of the prior known solutions, the present disclosure provides a solution for providing industrial safety training to users. The system of the present disclosure comprises virtual reality (VR) headsets, haptic devices coupled to the one or more VR headsets, and a communication unit coupled to the VR headsets, and/or the haptic devices. The VR headsets also comprise a user interface on which a media content can be displayed. This media content is industrial safety training material in the form of video modules, graphical presentations, etc. This media content is either played by the user by accessing the content stored in the memory storage on the VR headset (in standalone mode), or is played by a central server to which the VR headset is connected (in classroom mode). Also, this central server may be controlled manually by a trainer. The haptic devices generate a haptic feedback in synchronization with the media content being displayed on the user interface. These haptic devices also generate a haptic feedback based on the inputs received from the users as and when required.
Also, the media content modules may be shoot-based or based on 3D reconstruction. In shoot-based content, high-resolution 360° cameras may be used to film the scenes. Incidents may be recreated in reality or using visual effects, including 3D modelling, green-screen removal, etc., and audio effects. The modules may be created using standard video editing tools as generally known in the art. The video may also be captured in stereo 3D for added depth perception. In 3D reconstruction, a high-resolution 3D reconstruction of the environment may be created using game engines as generally known in the art.
Hereinafter, exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present disclosure.
Referring to Figure 1, an exemplary block diagram of a system [100] for providing industrial safety training to one or more users is shown. The system [100] comprises one or more virtual reality (VR) headsets [102], one or more sets of haptic devices [106], and a communication unit [112], all of which are presumed to be connected with each other unless otherwise indicated herein. Further, each VR headset [102] comprises at least a processing unit [104], a memory unit [108] and a user interface [110]. Also, at least one set of haptic devices [106] is coupled to one VR headset [102], and each set of haptic devices [106] comprises one or more haptic devices [106a-106n]. Further, the communication unit [112] is coupled to the one or more VR headsets [102], and the one or more sets of haptic devices [106]. The one or more VR headsets [102] may be connected to a central server [114] via the communication unit [112]. This central server [114] can be any computing device such as a smartphone or a laptop/desktop computer. Also, the
one or more VR headsets are connected to each other via the communication unit [112].
Each VR headset [102] is configured to display a media content via the processing unit [104] and the user interface [110]. This media content, in an implementation, is saved in the memory unit [108] connected to the VR headset [102]. The media content can be a text file, an audio file, a video file, or any combination thereof. In an implementation, examples of media content include, but are not limited to, images, gamified assessment reports, user-tracking reports, etc.
The present disclosure encompasses that the VR headset [102] is configured to operate in two modes: a first mode and a second mode. The first mode may be a ‘standalone mode’ wherein the user has more control over the VR headset [102]. For example, the user in possession of the VR headset [102] and the set of haptic devices [106] is situated at some location and wants to view media content related to the industrial training. In that case, since the complete media content comprising all the modules related to the training is stored in the memory unit [108] that is connected to, or integrated with, the VR headset [102], the user is able to access any module of the media content from the memory unit [108] to learn from.
Further, the second mode may be a ‘classroom mode’ where the user has less control, or only limited control over the VR headset [102]. In this mode, the user is in the vicinity of a central server [114]. The central server [114] may be a mobile phone, smart phone, laptop, a desktop, a general-purpose computer, or any other computing device. In an implementation, the central server [114] is in full or partial control of a human operator or a trainer. In the classroom mode, the one or more VR headsets [102] connect to the central server [114] via the communication unit [112]. In an implementation, the communication unit [112] is a Wi-Fi router. In
other implementations, the communication unit [112] may comprise any other suitable communication module such as a Wi-Fi router, Bluetooth module, Zigbee module, infrared module, near field communication module, etc., either alone or in any combination. Further, the VR headset [102], in the classroom mode, is configured to automatically connect to the communication unit [112] after the VR headset [102] is powered on. Thus, when a user switches ON the VR headset [102] in the classroom mode, the VR headset [102] automatically connects, via the communication unit [112], say a Wi-Fi router, to the central server [114], say a smartphone. Also, after switching ON both the VR headset [102] and the set of haptic devices [106], the VR headset [102] also connects to the set of haptic devices [106] automatically. Further, in this classroom mode, the control of the VR headset [102] is with the central server [114]. This means that the control as to which module to play on the VR headset [102] for the user is with the central server [114]. The central server [114], in an implementation, is controlled by a human operator. In another implementation, the central server [114] may be programmed to automatically control the one or more VR headsets [102], for example, the central server [114] may be programmed to display a particular module at a particular time. However, in this second mode, i.e., the classroom mode, the user in possession of the VR headset [102] does not have control over which module of the media content to play on the VR headset [102]. Thus, in the classroom mode, the central server [114] is configured to facilitate a limited set of functions of the one or more VR headsets [102]. Also, in the classroom mode, only the trainer selects which module to play on the one or more VR headsets [102], and only the media content module selected by the trainer on the central server [114] is played on all the VR headsets [102] in the training session.
In an implementation, the one or more VR headsets [102] and the central server [114] communicate via a User Datagram Protocol (UDP) network. Using UDP may substantially reduce the network search time as compared to other
techniques such as Transmission Control Protocol/Internet Protocol network searching.
The present disclosure encompasses that both the one or more VR headsets [102] and the central server [114] are pre-configured to connect to the same communication network, say the Wi-Fi network, which is broadcast by the communication unit [112], say the Wi-Fi router of the system [100]. Further, the central server application [114a] is programmed to operate the central server [114] as a server device in the UDP network, while the VR headset application [102a] is programmed to operate the VR headset [102] as a client device. The UDP protocol allows the central server [114] to connect to multiple client devices, i.e., VR headsets [102], but every client is connected to only one UDP server, i.e., the central server [114], through the central server application [114a]. Once the UDP server [114] and the clients [102] are on the same network, the UDP server application [114a] starts searching for the client devices [102] connected to the same network and creates a list of all the clients, i.e., VR headsets [102], connected to the same network. When a VR headset [102] connects to the central server [114], the VR headset [102] sends its details such as IP address, device name, battery level, etc. as an introductory message to the central server [114]. At the central server [114] side, all these details about each connected VR headset [102] are stored in different arrays, i.e., a separate corresponding array for each VR headset [102], to handle further communications. Once the details are exchanged, the UDP server [114], i.e., the central server [114], controls all the client devices [102], i.e., the VR headsets [102]. For example, the IP address of a particular VR headset [102] is bound with the name of the VR headset [102] in an array created for that particular VR headset [102]. So, in further communications, whenever the central server [114] is required to send a private message to a particular VR headset [102], it attaches the IP address of that particular VR headset [102] with the message while sending it in the network. In this manner the central server [114] uniquely identifies each of the VR headsets [102].
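By way of illustration only, and without limiting the present disclosure, the following Python sketch shows one possible way in which the UDP-based discovery and registration described above could be realised on the central server [114] side. The port number, the message layout (e.g., the “HELLO” introductory message and the “SCORE” message), and the helper names are assumptions made for the purpose of illustration and do not form part of the disclosed subject matter.

# Illustrative sketch only: a UDP "central server" that registers VR headsets from
# their introductory messages and addresses each one privately by its IP address.
# The port number and message layout below are assumptions for illustration.
import socket

SERVER_PORT = 9999  # assumed UDP port on the shared Wi-Fi network

def run_central_server():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", SERVER_PORT))
    headsets = {}  # one record of details per connected VR headset, keyed by IP address

    while True:
        data, (ip, port) = sock.recvfrom(1024)
        fields = data.decode().split("|")
        if fields[0] == "HELLO":
            # Introductory message of the form "HELLO|<device name>|<battery level>"
            headsets[ip] = {"name": fields[1], "battery": fields[2], "port": port}
        elif fields[0] == "SCORE":
            # Example of a further communication from an already registered headset
            name = headsets.get(ip, {}).get("name", ip)
            print(f"Score received from {name}: {fields[1]}")

def send_private_message(sock, headsets, headset_name, text):
    # The headset name is bound to its IP address, so a private message is sent
    # only to the matching client address.
    for ip, record in headsets.items():
        if record["name"] == headset_name:
            sock.sendto(f"MSG|{text}".encode(), (ip, record["port"]))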
In an implementation, the haptic devices [106] may implement eccentric rotating mass (ERM) and/or linear resonant actuator (LRA) motors (1.5 V - 3 V DC; 0.1 A - 0.3 A) based on a Bluetooth universal asynchronous receiver-transmitter (UART) and/or Bluetooth low energy (BLE) trigger from a software application. Additionally, the haptic devices [106] may drive externally powered 12 V DC geared motors based on a Bluetooth trigger. There may be several variants of haptic devices such as head mounted, wrist mounted, shoulder mounted, jacket type, arm mounted, hand-glove type, feet mounted, etc. Further, one or more of the haptic devices [106] may also be externally installed on other hardware for motor actuation. Further, in some exemplary implementations, one or more of the haptic devices may implement one or more of: an ESP32-C3 BLE5/Wi-Fi compatible microcontroller, programmable in Arduino and ESP-IDF; a Texas Instruments DRV2605L haptic driver for driving the ERM/LRA motors; an external clock oscillator for sleep/battery conservation; an MCP73831T-2DCI/MC LiPo charge management controller; and an MCP1826ST-3302E/DB LDO regulator, as known in the existing art.
Each haptic controller unit may comprise a microcontroller, a power source such as a rechargeable lithium polymer battery, and an actuator. The haptic devices [106a-106n] are designed to connect to the VR headset [102] via a wireless connection such as a Bluetooth connection, Wi-Fi connection, etc. The VR headset application [102a] can read instructions in the processing unit [104] during the training module and direct the signal to the appropriate haptic controller as needed. For this, each course module has an associated file that includes information about (a) the timecode for the haptic trigger, (b) the device to be triggered, and (c) the trigger pattern. For example, an illustrative code “20, B1R, 12” means that at 20 seconds, vibration pattern 12 is triggered on the right wrist-band motor; and another code “75, C1C, 45” means that at 75 seconds, vibration pattern 45 is triggered on the centre chest motor. A person skilled in the art would appreciate that the above example is provided for understanding purposes only and does not limit or restrict the disclosure in any possible manner.
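By way of illustration only, the following Python sketch shows one way in which the per-module haptic trigger file described above (entries of the form “20, B1R, 12”) could be parsed and dispatched by the VR headset application [102a]. The data structure names and the transport callback are hypothetical stand-ins and are not part of the claimed subject matter.

# Illustrative sketch only: parsing a module's haptic trigger file of the form
# "<timecode in seconds>, <device code>, <vibration pattern>", e.g. "20, B1R, 12",
# and dispatching each trigger at the right playback time. The transport callback
# is a hypothetical stand-in for the Bluetooth/BLE link to the haptic controller.
from dataclasses import dataclass

@dataclass
class HapticTrigger:
    timecode: float      # seconds from the start of the module
    device_code: str     # e.g. "B1R" = right wrist-band motor, "C1C" = centre chest motor
    pattern: int         # vibration pattern number understood by the haptic driver

def load_triggers(path: str) -> list[HapticTrigger]:
    triggers = []
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue
            time_s, device, pattern = (part.strip() for part in line.split(","))
            triggers.append(HapticTrigger(float(time_s), device, int(pattern)))
    return sorted(triggers, key=lambda t: t.timecode)

def dispatch_due_triggers(triggers, playback_seconds, send_to_controller):
    # Called periodically by the headset application with the current playback time;
    # sends every trigger whose timecode has been reached and removes it from the queue.
    for t in list(triggers):
        if t.timecode <= playback_seconds:
            send_to_controller(t.device_code, t.pattern)
            triggers.remove(t)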
Also, the user interface [110], which is generally an input and output device in the form of a display screen, is used to display the media content on the VR headsets [102]. The same user interface [110] is also configured to display an assessment data comprising one or more assessment questions. The user interface [110] also obtains a user input as a response to each of the one or more assessment questions. For example, after the end of each module, a set of assessment questions is displayed to the users on the user interface [110]. The user submits the response to each of the assessment questions via the user interface [110]. In another example, the assessment questions are shown to the users at the end of the training session.
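By way of illustration only, the following Python sketch shows a minimal representation of assessment data as could be displayed via the user interface [110], together with a simple scoring of the user responses. The field names and the one-point-per-correct-answer scoring rule are assumptions made for illustration.

# Illustrative sketch only: minimal assessment data shown after a module and a
# simple scoring of the user's responses. Field names are assumptions.
from dataclasses import dataclass

@dataclass
class AssessmentQuestion:
    prompt: str
    options: list[str]
    correct_index: int

def score_assessment(questions: list[AssessmentQuestion], responses: list[int]) -> int:
    # One point per question whose selected option matches the correct option.
    return sum(1 for q, r in zip(questions, responses) if r == q.correct_index)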
Further, in an implementation, in the classroom mode, the central server [114] first adds a subset of the one or more VR headsets [102] to join a training session. Since each VR headset [102] may have a unique ID, the central server [114] identifies which VR headset [102] has switched ON and connected with the central server [114]. For this, the central server [114] stores a corresponding set of details of each VR headset [102] in a corresponding array. Further, the central server [114] receives, say, via the operator input, a playlist of modules for playing in the training session. Say the operator is operating the central server [114]; the operator selects which module to play on the VR headset [102]. This selection may be made by the operator on the central server [114] via an application or a software program installed on the central server [114]. Further, the central server [114] triggers, via an operator input, a start of the training session. For this purpose, after selecting which modules to play on the VR headsets [102], the operator may, say, click on a button ‘Start Training Session’. After the start of the training session is triggered, the central server [114] sends information to the one or more VR headsets [102]. The information comprises the playlist of modules for playing in the training session. This information may inform the VR headsets [102] as to which module to play, or the sequence of modules to play
during the training session. The modules are displayed via the user interface [110] of the VR headsets [102].
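By way of illustration only, the following Python sketch shows how the central server application [114a] might assemble the training session information (the playlist of modules selected by the trainer) and send it to each VR headset added to the session. The JSON message layout, the headset port number and the helper name are assumptions made for illustration.

# Illustrative sketch only: the central server assembles the session information
# (the playlist chosen by the trainer) and sends it to each selected headset over
# the UDP socket created elsewhere. Message layout and port are assumptions.
import json

def start_training_session(sock, selected_headsets, playlist, headset_port=9998):
    # selected_headsets: {ip: record} for the subset of headsets added to the session
    # playlist: ordered list of module identifiers chosen by the trainer
    message = json.dumps({"command": "START_SESSION", "playlist": playlist})
    for ip in selected_headsets:
        sock.sendto(message.encode(), (ip, headset_port))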
Also, in an implementation, when a trainer/operator creates a classroom for a training session, she/he can select any number of VR headsets [102] from the list to add to the training session in the classroom mode. At the time of selecting a VR headset [102], the central server [114] sends a message to that VR headset [102] with instructions to join the training session. Upon receipt of the instructions, the VR headset [102] overrides the current activities with the actions directed in the instructions sent by the central server [114], and loads the classroom waiting page on the VR headset [102]. Additionally, the VR headset [102] in this classroom mode disables the standalone mode and delegates partial or complete control of the VR headset [102]. For example, the VR headset [102] may delegate to the central server [114] control over all features except button-press and gaze-lock events during the assessment. The VR headset [102] stays under the control of the central server [114] until it is removed from the classroom mode or the training session is completed.
Further, in an implementation, after the sending of the information by the central server [114] to the one or more VR headsets [102], the one or more VR headsets [102] are configured to pre-configure, in the processing unit [104], the playlist of modules for playing in the training session, from the memory unit [108], and the assessment data. Since the modules are stored in the memory unit [108] installed on the VR headsets [102], the central server [114] sends the information about the playlist of modules to be played during the training session to the VR headsets [102]. Using this information, the processing unit [104] of each VR headset [102] determines which modules to play during the session. The processing unit [104], for example, sequences the links from the memory unit [108] to play one after another during the training session. Along with this, the processing unit [104] also fetches the assessment data to be shown to the users after each module is finished or after the training session is completed, and shows the assessment data at the desired time points to the users to obtain their input accordingly.
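By way of illustration only, the following Python sketch shows one way in which the processing unit [104] could pre-configure the received playlist by resolving each module identifier to the locally stored media file and queuing the matching assessment data. The index structures shown are assumptions made for illustration.

# Illustrative sketch only: headset-side pre-configuration of the session. Each
# module identifier in the received playlist is resolved to its locally stored
# media file, and the matching assessment data is queued for later display.
def preconfigure_session(playlist, module_index, assessment_index):
    # module_index: {module_id: path of the media file in the memory unit}
    # assessment_index: {module_id: list of assessment questions for that module}
    queued_media = [module_index[module_id] for module_id in playlist]
    queued_assessments = [assessment_index.get(module_id, []) for module_id in playlist]
    return queued_media, queued_assessments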
Once the VR headset [102] starts displaying the selected modules via the processing unit [104] and the user interface [110], the one or more haptic devices [106a-106n] may generate a haptic feedback in synchronization with the media content being displayed on the user interface [110] on the one or more VR headsets [102], and a user input received via: at least one of the one or more haptic devices [106a-106n], and the one or more VR headsets [102]. For example, a safety training module is being displayed on the user interface to train the users that they should not touch any live wire at their work location site. In this example, say a video is being displayed and a virtual user is created on the display. Further, this virtual user is configured to perform actions similar to those of the real user. These actions of the virtual user can be derived from the wearable haptic devices that the real user is wearing. Say the real user is wearing a pair of gloves as haptic devices on his/her hands. Now, the motion that the real user makes in the real world is tracked, and the virtual user performs the same or similar motions. In this case, say the real user made a motion of moving his/her hand which, in the virtual world, causes the virtual user to touch a live wire carrying electric current. Thus, a haptic feedback may be generated in this case in the gloves of the real user and in other haptic devices [106] that the real user is wearing, such as a jacket, a helmet, etc. This haptic feedback may generate a vibration in the user’s body to make the user realize that the user has touched a live wire. The user learns via such haptic feedback that he/she must not touch a live wire at the work location site.
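By way of illustration only, the following Python sketch shows one way of synchronising haptic feedback with a tracked user input: when the tracked hand position intersects a hazard region in the virtual scene (such as the live wire in the example above), vibration patterns are triggered on the glove and jacket devices. The hazard geometry, device codes and pattern numbers are assumptions made for illustration.

# Illustrative sketch only: trigger haptic feedback when the tracked hand of the
# virtual user enters a hazard region (e.g. a live wire). Device codes and
# pattern numbers are assumptions for illustration.
def check_hazard_contact(hand_position, hazard_region, trigger_haptic):
    # hand_position: (x, y, z); hazard_region: (min_xyz, max_xyz) axis-aligned bounds
    lo, hi = hazard_region
    inside = all(lo[i] <= hand_position[i] <= hi[i] for i in range(3))
    if inside:
        trigger_haptic("GLOVE_R", 12)   # sharp vibration on the right glove
        trigger_haptic("JACKET_C", 45)  # sustained vibration on the chest
    return inside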
Additionally, the system [100] is configured to implement other features such as: the VR headsets may re-join the classroom in the event that the VR headset application [102a] crashes during a training session or otherwise; the VR headset application [102a] may play, pause, and stop the display of media content, and also send the score to the central server application [114a] after finishing each module and
assessment of the user, etc.; the trainer/operator via the central server [114] may be able to send a message to the VR headset [102] that may be displayed via the user interface [110] of the VR headset [102]; the haptic devices [106] may be able to communicate information such as battery levels, etc. to the VR headsets [102]; the central server [114] may control the VR headsets [102] in a classroom mode via internet from a remote location using TCP tunnelling and other protocols.
Reference is now made to Fig. 2, which illustrates an exemplary overview of a system for revalidating user accounts of users for providing industrial safety training to users, in accordance with exemplary embodiments of the present invention. The present disclosure encompasses that the users, for training, need to create a user account and register themselves via the central server [114]. For this purpose, users need to create their user account using the application installed on the central server [114] or by accessing a portal via the Internet. Further, in an implementation, the user accounts need to be revalidated after specific intervals of time, say, once every 7 days, via a communication network [204] such as the Internet, by connecting to a secondary server [202]. For this purpose, the central server [114] encrypts the user account details and sends the encrypted user account details to the secondary server [202]. The secondary server [202] maintains a database of at least all users that have created user accounts for using the system [100]. This database comprises records related to the users such as their user account details, user account expiry details, user details, subscription details, subscription expiry details, etc. In an implementation, the database may form a part of a secondary memory unit [206] associated with the secondary server [202]. This interaction between the central server [114] and the secondary server [202] takes place via a software application [114a] installed in the central server [114] and a software application [202a] installed in the secondary server [202]. The secondary server [202], via the software application [202a], checks the validity of the license and/or subscription of the user account in a license database associated with the secondary server [202], and if the validation of the user account is positive, i.e., if the
user account has a valid license and/or valid subscription, the secondary server [202] shares an encrypted authentication token back to the central server [114]. This process occurs if a user signs-in for the first time, and/or if the user was logged-out and is logging back in, or if the user has connected to the communication network [204] which may be the Internet, after the specific interval of time for revalidation of the user account by connecting to the secondary server [202], has expired. In an example, where the secondary server [202] is in possession of a third party that maintains and monitors the usage of the system [100] by various users and provides services related to the same, this functionality of revalidation of user accounts after specific intervals of time can be used to monitor and manage the use of the system [100] by various users. In an implementation, the license database may form a part of the secondary memory unit [206] associated with the secondary server [202].
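By way of illustration only, the following Python sketch shows the general shape of the revalidation handshake between the central server application [114a] and the secondary server application [202a]. For brevity, an HMAC signature over the account details stands in for the encryption described above, and the license database is shown as a plain dictionary; both are simplifying assumptions and not part of the disclosed subject matter.

# Illustrative sketch only: revalidation handshake. An HMAC signature stands in
# for the encryption described in the disclosure; the license database is a
# plain dictionary keyed by account identifier. All names are assumptions.
import hashlib
import hmac
import json
import time

SHARED_SECRET = b"illustrative-shared-secret"

def make_revalidation_request(account_id: str, device_clock: float) -> bytes:
    # Central server side: package and sign the account details before sending.
    payload = json.dumps({"account": account_id, "sent_at": device_clock}).encode()
    signature = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    return json.dumps({"payload": payload.decode(), "sig": signature}).encode()

def revalidate_account(request: bytes, license_db: dict) -> dict:
    # Secondary server side: verify the request, check subscription validity and,
    # if valid, return an authentication token for the central server to store.
    message = json.loads(request)
    payload = message["payload"].encode()
    expected = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, message["sig"]):
        return {"valid": False}
    account = json.loads(payload)["account"]
    record = license_db.get(account)
    if record and record["subscription_expiry"] > time.time():
        token_basis = f"{account}|{record['subscription_expiry']}".encode()
        token = hmac.new(SHARED_SECRET, token_basis, hashlib.sha256).hexdigest()
        return {"valid": True, "auth_token": token}
    return {"valid": False}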
Further, the interaction between the central server [114] and the secondary server [202] via the software application [114a] and the software application [202a] through the communication network [204] may also update the software application [114a] installed in the central server [114]. The update, for example, can be a software update such as security patches, new features, etc. or a date and time update. The update may also be transmitted to the VR headsets [102] when they connect to the central server [114] after the update.
Reference is now made to Fig. 3, which illustrates an exemplary overview of a system for re-licensing VR headsets [102] for providing industrial safety training to users, in accordance with exemplary embodiments of the present invention. In an implementation, the VR headset application [102a], i.e., the software application installed on the VR headsets [102], is re-licensed after a second specific interval of time. This second specific interval of time may be the same as or different from the first specific interval of time, i.e., the time interval for revalidating user accounts as explained above in this disclosure. This feature of re-licensing may be bypassed in case the same is intentionally disabled by an administrator that is providing services related to the system [100]. For the purpose of re-licensing the VR headsets [102], in an implementation, each VR headset [102] sends, via the communication network [204], an encrypted token to the secondary server [202]. Each encrypted token of the corresponding VR headset may be a combination of the MAC ID, serial number and other device-specific information of the VR headset [102] and its current device date and time. In an implementation, a separate date and time token is transmitted by the VR headset [102]. The secondary server application [202a] checks the registration status and license status of the VR headset [102] through the license database. In case the validation is positive, i.e., if the VR headset [102] has a valid license and/or valid registration, the secondary server [202] shares a validation token back to the VR headset [102]. That is, the VR headset [102] receives, via the communication network [204], a validation token from the secondary server [202]. In this process, a date and time validation token response may also be generated to update the VR headset [102] date and time as required; for example, if the VR headset [102] date and/or time has drifted due to any reason such as resets and internal clock issues, the date and time of the VR headset is corrected and updated.
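By way of illustration only, the following Python sketch shows how the per-headset token could be composed from device-specific details and the device clock, and how a date and time correction returned with a positive validation could be applied. The field names, the plain JSON representation used in place of encryption, and the response layout are assumptions made for illustration.

# Illustrative sketch only: composing the per-headset token from device-specific
# details and the device clock, and applying the returned date/time correction.
# Plain JSON stands in for the encryption described in the disclosure.
import json

def build_headset_token(mac_id: str, serial_number: str, device_datetime: str) -> str:
    # Combination of MAC ID, serial number and current device date/time.
    return json.dumps({"mac": mac_id, "serial": serial_number, "device_time": device_datetime})

def apply_validation_response(response: dict, set_device_clock) -> bool:
    # response: {"licensed": bool, "server_time": "<ISO timestamp>"} from the secondary server
    if response.get("licensed") and "server_time" in response:
        set_device_clock(response["server_time"])  # correct any drift in the headset clock
    return bool(response.get("licensed"))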
In another implementation, for re-licensing VR headsets [102], the VR headsets [102] transmit the encrypted token and the date and time token to the central server application [114a]. The central server application [114a] stores the tokens sent by all such VR headsets [102] in the local storage, i.e., the memory unit associated with the central server [114]. Further, when the central server [114] connects to the communication network [204] such as the Internet, the central server application [114a] transmits to the secondary server [202], the encrypted token and the date and time token. Further, in this implementation, the secondary server application [202a] checks the registration status and license status of the VR headset [102] through the license database. In case the validation is positive,
i.e., if the VR headset [102] has a valid license and/or valid registration, the secondary server [202] shares a validation token back to the central server [114].
In an example, say a VR headset [102] is coming to the end of its validation token duration, say, at 2 days to expiry. At this point, i.e., Day {-2}, say the VR headset [102] has transmitted its details to the central server [114] for re-licensing, but the central server [114] is not connected to the communication network [204] and does not have an Internet connection. Further, say the VR headset [102] is powered off and not used for 17 days. In the meantime, say the central server [114] has obtained the validation token for the VR headset [102]. The validation token of the VR headset [102] already expired on Day {0}, but the VR headset [102] was not used or connected with the central server [114] or the communication network [204] till Day {15}. Say the second specific interval of time, i.e., the time interval after which the VR headsets [102] are re-licensed, is 30 days. Then, on Day {15}, when the VR headset [102] connects to the central server [114] via the central server application [114a], the license validation token will be valid only for the next 15 days. At Day {31}, the VR headset [102] will need to be re-licensed. This means that the time-dependent license is based on calendar day counts and not on relative elapsed time.
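By way of illustration only, the following Python sketch reflects the calendar-day interpretation of the validation token in the above example: the token expires a fixed number of calendar days after it is issued, irrespective of how long the VR headset [102] was actually powered on or connected. The concrete dates are assumptions chosen to mirror the Day {0} to Day {31} timeline above.

# Illustrative sketch only: calendar-day based validity of the validation token.
from datetime import date, timedelta

RELICENSE_INTERVAL_DAYS = 30  # the "second specific interval of time" in the example

def days_of_validity_left(issued_on: date, today: date) -> int:
    expiry = issued_on + timedelta(days=RELICENSE_INTERVAL_DAYS)
    return max((expiry - today).days, 0)

# Mirroring the example: a token issued on Day {0} and first used on Day {15}
# is valid only for the remaining 15 days; re-licensing is needed at Day {31}.
print(days_of_validity_left(date(2024, 1, 1), date(2024, 1, 16)))  # -> 15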
The secondary server [202], in an implementation, may track the issued tokens for each VR headset [102], and manage the tokens of the VR headsets [102] to avoid any conflict of tokens that might occur due to connection or disconnection with the communication network [204]. In other words, the encrypted tokens sent by each VR headset [102] or the central server [114] via the communication network [204] are stored in the secondary memory unit [206] using the secondary server application [202a].
Reference is now made to Figure 4, which illustrates an exemplary flow chart of a method for providing industrial safety training to users, in accordance with exemplary embodiments of the present invention. As shown, the method starts at step 402 upon a trigger received by the system [100], such as a user switching ON the power of devices of the system [100] such as the VR headsets [102] and/or the haptic feedback devices [106a-106n], and goes to step 404. At step 404, the method comprises displaying, by one or more VR headsets [102], a media content. At this step, each VR headset [102] displays a media content via the processing unit [104] and the user interface [110]. Also, prior to displaying the media content by the one or more VR headsets [102], a connection is first established: between at least the VR headsets [102] and the sets of haptic devices [106] in the standalone mode; and between at least the VR headsets [102], the sets of haptic devices [106], and the central server [114] in the classroom mode. Further, this media content, in an implementation, is saved in the memory unit [108] connected to the VR headset [102]. The media content can be a text file, an audio file, a video file, or any combination thereof. In an implementation, examples of media content include, but are not limited to, images, gamified assessment reports, user-tracking reports, etc. Also, in the standalone mode, the media content to be displayed on the VR headset [102] via the user interface [110] is selected by the user from the media content stored in the memory unit [108]. Thus, in the standalone mode, the one or more VR headsets [102] can play the same or different media content modules as per the users’ choice. However, in the classroom mode, the media content to be played on the one or more VR headsets [102] in the training session is selected by the trainer on the central server [114]. Thus, the same media content module is displayed on the one or more VR headsets [102] in the training session in the classroom mode.
In an implementation, the method comprises operating the VR headset [102] in two modes: a first mode and a second mode. The first mode may be a ‘standalone mode’ where the user has more control over the VR headset [102]. For example, the user in possession of the VR headset [102] and the set of haptic devices [106] is situated at some location and wants to view media content related to the industrial training. In that case, since the complete media content comprising all
the modules related to the training is stored in the memory unit [108] that is connected to, or integrated with, the VR headset [102], the user is able to access any module of the media content from the memory unit [108] to learn from.
Further, the second mode may be a ‘classroom mode’ where the user has less control, or only limited control, over the VR headset [102]. In this mode, the user is in the vicinity of a central server [114]. In an implementation, the central server [114] is in full or partial control of a human operator or a trainer. In the classroom mode, the one or more VR headsets [102] connect to the central server [114] via the communication unit [112]. Further, in the classroom mode, the VR headset [102] automatically connects to the communication unit [112] after the VR headset [102] is powered on. Thus, when a user switches ON the VR headset [102] in the classroom mode, the VR headset [102] automatically connects, via the communication unit [112], say a Wi-Fi router, to the central server [114], say a smartphone. Also, after switching ON both the VR headset [102] and the set of haptic devices [106], the VR headset [102] also connects to the set of haptic devices [106] automatically. Further, the control of the VR headset [102] is with the central server [114]. This means that the control as to which module to play on the VR headset [102] for the user is with the central server [114]. The central server [114], in an implementation, is controlled by a human operator. In another implementation, the central server [114] may be programmed to automatically control the one or more VR headsets [102], for example, the central server [114] may be programmed to display a particular module at a particular time. However, in this second mode, i.e., the classroom mode, the user in possession of the VR headset [102] does not have control over which module of the media content to play on the VR headset [102]. Thus, in the classroom mode, the central server [114] is configured to facilitate a limited set of functions of the one or more VR headsets [102]. Also, in the classroom mode, only the trainer selects which module to play on the one or more VR headsets [102], and only the media content module selected by the trainer on
the central server [114] is played on all the VR headsets [102] in the training session.
In an implementation, the method comprises the one or more VR headsets [102] and the central server [114] communicating via a User Datagram Protocol (UDP) network. Using UDP may substantially reduce the network search time as compared to other techniques such as Transmission Control Protocol/Internet Protocol network searching.
In an implementation, both the one or more VR headsets [102] and the central server [114] are pre-configured to connect to the same communication network, say the Wi-Fi network, which is broadcast by the communication unit [112], say the Wi-Fi router of the system [100]. Further, the central server application [114a] is programmed to operate the central server [114] as a server device in the UDP network, while the VR headset application [102a] is programmed to operate the VR headset [102] as a client device. The UDP protocol allows a server to connect to multiple client devices, i.e., VR headsets [102], but every client is connected to only one UDP server, i.e., the central server [114], through the central server application [114a]. Once the UDP server [114] and the clients [102] are on the same network, the UDP server application [114a] starts searching for the client devices [102] connected to the same network and creates a list of all the clients, i.e., VR headsets [102], connected to the same network. When a VR headset [102] connects to the central server [114], the VR headset [102] sends its details such as IP address, device name, battery level, etc. as an introductory message to the central server [114]. At the central server [114] side, all these details about each connected VR headset [102] are stored in different arrays, i.e., a separate corresponding array for each VR headset [102], to handle further communications. Once the details are exchanged, the method comprises controlling, by the UDP server [114], i.e., by the central server [114], all the client devices [102], i.e., the VR headsets [102]. For example, the IP address of a particular VR headset [102] is bound with the name
of the VR headset [102] in an array created for that particular VR headset [102]. So, in further communications, whenever the central server [114] wants to send a private message to a particular VR headset, it attaches the IP address of that particular VR headset with the message while sending in the network. In this manner the central server [114] uniquely identifies the VR headsets [102].
Also, in an implementation, the method comprises displaying, by the user interface [110], an assessment data comprising one or more assessment questions. Further, the method also comprises obtaining, by the user interface [110], a user input as a response to each of the one or more assessment questions. For example, after the end of each module, a set of assessment questions is displayed to the users on the user interface [110]. The user submits the response to each of the assessment questions via the user interface [110]. In another example, the assessment questions are shown to the users at the end of the training session.
Further, in an implementation, in the classroom mode, the method comprises adding, by the central server [114], a subset of the one or more VR headsets [102] to join a training session. Since each VR headset [102] may have a unique ID, the central server [114] identifies which VR headset [102] has switched ON and connected with the central server [114]. For this, in an implementation, the central server [114] stores a corresponding set of details of each VR headset [102] in a corresponding array. Further, the method comprises receiving, by the central server [114], say, via the operator input, a playlist of modules for playing in the training session. Say the operator is operating the central server [114]; the operator selects which module to play on the VR headset [102]. This selection may be made by the operator on the central server [114] via an application or a software program installed on the central server [114]. Further, the method comprises triggering, by the central server [114], via an operator input, a start of the training session. For this purpose, after selecting which modules to play on the VR headsets [102], the operator may, say, click on a button ‘Start Training Session’. After the start of the training session is triggered, the method comprises sending, by the central server [114], information to the one or more VR headsets [102]. The information comprises the playlist of modules for playing in the training session. This information may inform the VR headsets [102] as to which module to play, or the sequence of modules to play during the training session. The modules are displayed via the user interface [110] of the VR headsets [102].
Also, in an implementation, when a trainer/operator creates a classroom for a training session, the method comprises selecting, by the trainer/operator, any number of VR headsets [102] from the list to add to the training session in the classroom mode. At the time of selecting a VR headset [102], the central server [114] sends a message to that VR headset [102] with instructions to join the training session. Upon receipt of the instructions, the VR headset [102] overrides the current activities with the actions directed in the instructions sent by the central server [114], and loads the classroom waiting page on the VR headset [102]. Additionally, the VR headset [102] in this classroom mode disables the standalone mode and delegates partial or complete control of the VR headset [102]. For example, the VR headset [102] may delegate to the central server [114] control over all features except button-press and gaze-lock events during the assessment. The VR headset [102] stays under the control of the central server [114] until it is removed from the classroom mode or the training session is completed.
Further, in an implementation, after the sending of the information by the central server [114] to the one or more VR headsets [102], the one or more VR headsets [102] pre-configure, in the processing unit [104], the playlist of modules for playing in the training session, from the memory unit [108], and the assessment data. Since the modules are stored in the memory unit [108] installed on the VR headsets [102], the central server [114] sends the information about the playlist of modules to be played during the training session to the VR headsets [102]. Using this information, the processing unit [104] of each VR headset [102] determines which modules to play during the session. The processing unit [104], for example, sequences the links from the memory unit [108] to play one after another during the training session. Along with this, the processing unit [104] also fetches the assessment data to be shown to the users after each module is finished or after the training session is completed, and shows the assessment data at the desired time points to the users to obtain their input accordingly.
Once the VR headset [102] starts displaying the selected modules via the processing unit [104] and the user interface [110] at step 404, the one or more haptic devices [106a-106n], at step 406, generate a haptic feedback in synchronization with the media content being displayed on the user interface [110] on the one or more VR headsets [102], and a user input received via: at least one of the one or more haptic devices [106a-106n], and the one or more VR headsets [102]. The user learns via such haptic feedback what practices the user is required to follow while working at the work location site, and the process ends with this learning of the users at step 408.
Additionally, the method enables the system [100] to implement other features such as:
- the VR headset [102] may re-join the classroom in the event the VR headset application [102a] crashes during a training session or otherwise;
- the VR headset application [102a] may play, pause, and stop the display of media content, and may also send the score to the central server application [114a] after finishing each module and assessment of the user;
- the trainer/operator, via the central server [114], may send a message to a VR headset [102] that is displayed via the user interface [110] of that VR headset [102];
- the haptic devices [106] may communicate information such as battery levels to the VR headsets [102]; and
- the central server [114] may control the VR headsets [102] in the classroom mode via the internet from a remote location using TCP tunnelling and other protocols.
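For illustration only, the following sketch lists example message structures that such features could use; all field names and values are hypothetical and not prescribed by the disclosure.

```python
# Illustrative message structures for the additional features listed above.
score_report = {            # headset -> central server after each module/assessment
    "type": "SCORE_REPORT",
    "headset_id": "HS-001",
    "module_id": "fire_safety_01",
    "score": 8,
    "max_score": 10,
}
operator_message = {        # central server -> headset, shown on the user interface [110]
    "type": "OPERATOR_MESSAGE",
    "text": "Please re-seat your headset strap.",
}
battery_status = {          # haptic device -> headset
    "type": "BATTERY_STATUS",
    "device_id": "HD-03",
    "level_percent": 72,
}
print(score_report, operator_message, battery_status, sep="\n")
```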
Further, the method comprises revalidating the user accounts of the users, in accordance with exemplary embodiments of the present invention, as explained above in this disclosure with reference to Fig. 2. In an implementation, the users need to create a user account and register themselves via the central server [114]. For this purpose, the users create their user account using the application installed on the central server [114] or by accessing a portal via the internet.
Further, in an implementation, the user accounts need to be revalidated after specific intervals of time, via a communication network [204] such as the internet, by connecting to a secondary server [202]. For this purpose, the central server [114] encrypts the user account details and sends the encrypted user account details to the secondary server [202]. The secondary server [202] maintains a database of at least all users that have created a user account for using the system [100]. This database comprises records related to the users, such as their user account details, user account expiry details, user details, subscription details, subscription expiry details, etc. In an implementation, the database may form a part of a secondary memory unit [206] associated with the secondary server [202]. This interaction between the central server [114] and the secondary server [202] takes place via a software application [114a] installed in the central server [114] and a software application [202a] installed in the secondary server [202]. The secondary server [202], via the software application [202a], checks the validity of the license and/or subscription of the user account in a license database associated with the secondary server [202], and if the validation of the user account is positive, i.e., if the user account has a valid license and/or a valid subscription, the secondary server [202] shares an encrypted authentication token back to the central server [114]. This process occurs if a user signs in for the first time, if the user was logged out and is logging back in, or if the user connects to the communication network [204], which may be the internet, after the specific interval of time for revalidation of the user account by connecting to the secondary server [202] has expired.
In an example, where the secondary server [202] is in the possession of a third party that maintains and monitors the usage of the system [100] by various users and provides related services, this functionality of revalidating user accounts after specific intervals of time can be used to monitor and manage the use of the system [100] by the various users. In an implementation, the license database may form a part of the secondary memory unit [206] associated with the secondary server [202].
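For illustration only, the following sketch traces the revalidation exchange described above; the token layout and the 'encrypt' placeholder (base64 encoding, not real encryption) are assumptions of the sketch, and a real deployment would use proper authenticated encryption and a persistent license database.

```python
# Minimal sketch of the account revalidation exchange between the central
# server application [114a] and the secondary server application [202a].
import base64
import json
from datetime import date, timedelta

LICENSE_DB = {
    # username -> subscription expiry date (illustrative records only)
    "worker01": date.today() + timedelta(days=90),
    "worker02": date.today() - timedelta(days=5),
}


def encrypt(payload: dict) -> str:
    # Placeholder only: base64 is encoding, not encryption.
    return base64.b64encode(json.dumps(payload).encode()).decode()


def decrypt(token: str) -> dict:
    return json.loads(base64.b64decode(token).decode())


def central_server_revalidate(username: str) -> str:
    """Central server [114] sends encrypted account details to the secondary server [202]."""
    return encrypt({"username": username})


def secondary_server_validate(encrypted_details: str) -> str | None:
    """Secondary server [202] checks the license database and returns an auth token if valid."""
    details = decrypt(encrypted_details)
    expiry = LICENSE_DB.get(details["username"])
    if expiry is not None and expiry >= date.today():
        return encrypt({"username": details["username"], "valid_until": expiry.isoformat()})
    return None  # validation negative: no authentication token is issued


print(secondary_server_validate(central_server_revalidate("worker01")))  # token issued
print(secondary_server_validate(central_server_revalidate("worker02")))  # None (expired)
```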
Further, the interaction between the central server [114] and the secondary server [202], via the software application [114a] and the software application [202a] through the communication network [204], may also be used to update the software application [114a] installed in the central server [114]. The update may also be transmitted to the VR headsets [102] when they connect to the central server [114] after the update.
Further, the method also comprises re-licensing the VR headsets [102] for providing industrial safety training to users, in accordance with exemplary embodiments of the present invention, as explained above in this disclosure with reference to Fig. 3. In an implementation, the VR headset application [102a], i.e., the software application installed on the VR headsets [102], is re-licensed after a second specific interval of time. This second specific interval of time may be the same as or different from the first specific interval of time, i.e., the time interval for revalidating user accounts as explained above in this disclosure. This feature of re-licensing may be bypassed in case it is intentionally disabled. For the purpose of re-licensing of the VR headsets [102], in an implementation, each VR headset [102] sends, via the communication network [204], an encrypted token to the secondary server [202]. Each encrypted token of the corresponding VR headset may be a combination of the MAC ID, the serial number, other device-specific information of the VR headset [102], and its current device date and time. In an implementation, a separate date and time token is transmitted by the VR headset [102]. The secondary server application [202a] checks the registration status and the license status of the VR headset [102] through the license database. In case the validation is positive, i.e., if the VR headset [102] has a valid license and/or a valid registration, the secondary server [202] shares a validation token back to the VR headset [102]. That is, the VR headset [102] receives, via the communication network [204], a validation token from the secondary server [202]. In this process, a date and time validation token response may also be generated to update the date and time of the VR headset [102] as required; for example, if the date and/or time of the VR headset [102] has drifted due to any reason such as resets or internal clock issues, the date and time of the VR headset is corrected and updated.
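For illustration only, the following sketch shows one way the re-licensing token could be built from the MAC ID, serial number, and device date/time, and validated by the secondary server [202]; the encoding scheme and the registration records are hypothetical.

```python
# Minimal sketch of the headset re-licensing exchange. The token layout follows
# the description above; the encoding and registration database are assumptions.
import base64
import json
from datetime import datetime, timezone

REGISTERED_HEADSETS = {"AA:BB:CC:DD:EE:01": "SN-0001"}  # MAC ID -> serial number


def build_headset_token(mac_id: str, serial: str) -> str:
    payload = {
        "mac_id": mac_id,
        "serial": serial,
        "device_datetime": datetime.now(timezone.utc).isoformat(),
    }
    return base64.b64encode(json.dumps(payload).encode()).decode()


def validate_headset_token(token: str) -> dict:
    payload = json.loads(base64.b64decode(token).decode())
    registered = REGISTERED_HEADSETS.get(payload["mac_id"]) == payload["serial"]
    return {
        "valid": registered,
        # The date/time response lets the headset correct clock drift after resets.
        "server_datetime": datetime.now(timezone.utc).isoformat(),
    }


print(validate_headset_token(build_headset_token("AA:BB:CC:DD:EE:01", "SN-0001")))
```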
In another implementation, for re-licensing the VR headsets [102], the method comprises transmitting, by the VR headsets [102], the encrypted token and the date and time token to the central server application [114a]. The central server application [114a] stores the tokens sent by all such VR headsets [102] in the local storage, i.e., the memory unit associated with the central server [114]. Further, when the central server [114] connects to the communication network [204], such as the internet, the central server application [114a] transmits the encrypted token and the date and time token to the secondary server [202]. Further, in this implementation, the secondary server application [202a] checks the registration status and the license status of the VR headset [102] through the license database. In case the validation is positive, i.e., if the VR headset [102] has a valid license and/or a valid registration, the secondary server [202] shares a validation token back to the central server [114]. Pertinently, the time-dependent license is based on calendar day counts and not on relative elapsed time.
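The distinction between calendar day counts and relative elapsed time may be illustrated, for example only, by the following sketch; the dates used are hypothetical.

```python
# Illustrative contrast between calendar-day counting (as used for the
# time-dependent license) and relative elapsed time.
from datetime import date, datetime

activation_day = date(2024, 1, 1)
today = date(2024, 1, 3)

# Calendar-day count: 1 Jan, 2 Jan, and 3 Jan count as three licensed days,
# regardless of how many hours the headset was actually in use.
days_used = (today - activation_day).days + 1
print(days_used)  # 3

# Relative elapsed time would instead depend on the exact timestamps:
elapsed_hours = (datetime(2024, 1, 3, 9, 0) - datetime(2024, 1, 1, 18, 0)).total_seconds() / 3600
print(elapsed_hours)  # 39.0 hours, i.e. less than two full 24-hour periods
```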
The secondary server [202], in an implementation, may track the issued tokens for each VR headset [102], and manage the tokens of the VR headsets [102] to avoid any conflict of tokens that might occur due to connection with, or disconnection from, the communication network [204]. In other words, the encrypted tokens sent by the each VR headset [102] or the central server [114] via the communication network [204] are stored in a secondary memory unit [206] using a secondary server application [202a].
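For illustration only, the following sketch shows one possible bookkeeping of issued tokens per VR headset [102], in which a re-issued token supersedes an earlier one; the data structure is an assumption of the sketch.

```python
# Minimal sketch of token bookkeeping on the secondary server [202]: only the
# most recently issued token per headset is treated as current, so tokens
# re-sent after a disconnection do not conflict.
issued_tokens: dict[str, list[str]] = {}   # headset unique ID -> issued tokens, oldest first


def record_token(headset_id: str, token: str) -> None:
    issued_tokens.setdefault(headset_id, []).append(token)


def current_token(headset_id: str) -> str | None:
    tokens = issued_tokens.get(headset_id, [])
    return tokens[-1] if tokens else None


record_token("HS-001", "token-A")   # issued while directly connected
record_token("HS-001", "token-B")   # re-issued after reconnection via the central server
print(current_token("HS-001"))      # token-B supersedes token-A
```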
Thus, the present invention provides a novel solution for providing industrial safety training to users. The present invention provides a solution that is technically advanced over the currently known solutions, as it enables providing structured industrial safety training that is standard compliant and that keeps users engaged. The solution of the present disclosure provides structured industrial safety training that creates an environment that is the same as, or at least similar to, the actual work environment, by using VR headsets and haptic devices along with the other components disclosed herein. Based on the implementation of the features of the present invention, one can obtain a method and system for providing structured industrial safety training in various languages. Also, based on the implementation of the features of the present invention, one can obtain a method and system for providing structured industrial safety training that overcomes the limitation of the gap between the number of people who need the training and the total number of seats available for training. The features of the present disclosure also enable one to provide formal and efficient training with a minimized cost of training.
While considerable emphasis has been placed herein on the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the invention. These and other changes in the preferred embodiments of the invention will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.
WE CLAIM:
1. A system for providing industrial safety training to one or more users, the system comprising:
- one or more virtual reality (VR) headsets [102], wherein each VR headset [102] of the one or more VR headsets [102] comprises at least a processing unit [104], a memory unit [108] and a user interface [110];
- one or more sets of haptic devices [106] coupled to the one or more VR headsets [102], wherein each set of haptic devices [106] comprises one or more haptic devices [106a-106n]; and
- a communication unit [112] coupled to the one or more VR headsets [102], and the one or more sets of haptic devices [106];
wherein
- the one or more VR headsets [102] are connected to a central server
[114] via the communication unit [112] and the one or more VR
headsets are connected to each other via the communication unit
[112];
and wherein
- the each VR headset [102] is configured to:
o display a media content via the processing unit [104] and the user interface [110]; and
- the one or more haptic devices [106a-106n] are configured to:
o generate a haptic feedback in synchronization with the media content being displayed on the user interface [110] on the one or more VR headsets [102], and a user input received via: at
least one of the one or more haptic devices [106a-106n], and the one or more VR headsets [102].
2. The system as claimed in claim 1, wherein the memory unit [108] of the each VR headset [102] is configured to store the media content.
3. The system as claimed in claim 1, wherein the one or more VR headsets [102] are configured to operate in one of a classroom mode and a standalone mode.
4. The system as claimed in claim 3, wherein the central server [114] is configured to facilitate a limited set of functions of the one or more VR headsets [102] in the classroom mode.
5. The system as claimed in claim 1, wherein the each VR headset [102], on being powered on, is configured to automatically connect to the communication unit [112] and one of the sets of haptic devices [106].
6. The system as claimed in claim 1, wherein the user interface [110] is configured to:
- display an assessment data comprising one or more assessment questions; and
- obtain a user input as a response to each of the one or more assessment questions.
7. The system as claimed in claim 1, wherein the each VR headset [102] is
configured to:
- send, via a communication network [204], an encrypted token to a secondary server [202]; and
- receive, via the communication network [204], a validation token from the secondary server [202].
8. The system as claimed in claim 7, wherein the encrypted token sent by the each VR headset [102] via the communication network [204] is stored in a secondary memory unit [206] using a secondary server application [202a].
9. The system as claimed in claim 1, wherein the central server [114] is configured to store a corresponding set of details of the each VR headset [102] in a corresponding array.
10. The system as claimed in claim 3, wherein in the classroom mode, the central server [114] is configured to:
- add a subset of the one or more VR headsets [102] to join a training session;
- receive, via an operator input, a playlist of modules for playing in the training session;
- trigger, via an operator input, a start of the training session; and
- send an information to the one or more VR headsets [102], wherein the information comprises the playlist of modules for playing in the training session.
11. The system as claimed in claims 6 and 10, wherein post the sending of the
information by the central server [114] to the one or more VR headsets
[102], the one or more VR headsets [102] are configured to:
- pre-configure, in the processing unit [104], the playlist of modules for
playing in the training session, from the memory unit [108], and the
assessment data.
12. A method for providing industrial safety training to one or more users,
the method comprising:
- displaying, by one or more VR headsets [102], a media content,
wherein each VR headset of the one or more VR headsets [102] comprises at least a processing unit [104], a memory unit [108], and a user interface [110],
and wherein the one or more VR headsets [102] are connected to a central server [114] via a communication unit [112] and the one or more VR headsets are connected to each other via the communication unit [112]; and
- generating, by one or more haptic devices [106a-106n], a haptic
feedback in synchronization with the media content being displayed on the user interface [110] on the one or more VR headsets [102], and a user input received via: at least one of the one or more haptic devices [106a-106n], and the one or more VR headsets [102],
wherein the one or more haptic devices [106a-106n] are included in one or more sets of haptic devices [106], and the one or more sets of haptic devices [106] are coupled to the one or more VR headsets [102].
13. The method as claimed in claim 12, wherein the media content is stored in the memory unit [108].
14. The method as claimed in claim 12, wherein the one or more VR headsets [102] operate in one of a classroom mode and a standalone mode.
15. The method as claimed in claim 14, wherein the central server [114] is configured to facilitate a limited set of functions of the one or more VR headsets [102] in the classroom mode.
16. The method as claimed in claim 12, wherein the each VR headset [102], on being powered on, automatically connects to the communication unit [112] and one of the sets of haptic devices [106].
17. The method as claimed in claim 12, wherein the method comprises:
- displaying, by the user interface [110], an assessment data comprising one or more assessment questions; and
- obtaining, by the user interface [110], a user input as a response to each of the one or more assessment questions.
18. The method as claimed in claim 12, wherein the method comprises:
- sending, by the each VR headset [102] via a communication network [204], an encrypted token to a secondary server [202]; and
- receiving, by the each VR headset [102] via the communication network [204], a validation token from the secondary server [202].
19. The method as claimed in claim 18, wherein the method comprises:
- storing, by a secondary memory unit [206] using a secondary server
application [202a], the encrypted token sent by the each VR headset
[102] via the communication network [204].
20. The method as claimed in claim 12, wherein the method comprises:
- storing, by the central server [114], a corresponding set of details of
the each VR headset [102], in a corresponding array.
21. The method as claimed in claim 14, wherein in the classroom mode, the
method comprises:
- adding, by the central server [114], a subset of the one or more VR headsets [102] to join a training session;
- receiving, by the central server [114] via an operator input, a playlist of modules for playing in the training session;
- triggering, by the central server [114], via another operator input, a start of the training session; and
- sending, by the central server [114], an information to the one or
more VR headsets [102], wherein the information comprises the
playlist of modules for playing in the training session.
22. The method as claimed in claims 17 and 21, wherein post the sending of the information by the central server [114] to the one or more VR headsets [102], the method comprises:
- pre-configuring, by the one or more VR headsets [102] in the
processing unit [104], the playlist of modules for playing in the
training session, from the memory unit [108], and the assessment
data.
| # | Name | Date |
|---|---|---|
| 1 | 202221020391-STATEMENT OF UNDERTAKING (FORM 3) [05-04-2022(online)].pdf | 2022-04-05 |
| 2 | 202221020391-PROVISIONAL SPECIFICATION [05-04-2022(online)].pdf | 2022-04-05 |
| 3 | 202221020391-FORM FOR STARTUP [05-04-2022(online)].pdf | 2022-04-05 |
| 4 | 202221020391-FORM FOR SMALL ENTITY(FORM-28) [05-04-2022(online)].pdf | 2022-04-05 |
| 5 | 202221020391-FORM 1 [05-04-2022(online)].pdf | 2022-04-05 |
| 6 | 202221020391-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [05-04-2022(online)].pdf | 2022-04-05 |
| 7 | 202221020391-EVIDENCE FOR REGISTRATION UNDER SSI [05-04-2022(online)].pdf | 2022-04-05 |
| 8 | 202221020391-FORM-26 [16-06-2022(online)].pdf | 2022-06-16 |
| 9 | 202221020391-Proof of Right [16-08-2022(online)].pdf | 2022-08-16 |
| 10 | 202221020391-ORIGINAL UR 6(1A) FORM 1-290822.pdf | 2022-09-01 |
| 11 | 202221020391-ENDORSEMENT BY INVENTORS [05-04-2023(online)].pdf | 2023-04-05 |
| 12 | 202221020391-DRAWING [05-04-2023(online)].pdf | 2023-04-05 |
| 13 | 202221020391-CORRESPONDENCE-OTHERS [05-04-2023(online)].pdf | 2023-04-05 |
| 14 | 202221020391-COMPLETE SPECIFICATION [05-04-2023(online)].pdf | 2023-04-05 |
| 15 | Abstract1.jpg | 2023-05-03 |
| 16 | 202221020391-FORM-9 [28-12-2023(online)].pdf | 2023-12-28 |
| 17 | 202221020391-FORM 18 [28-12-2023(online)].pdf | 2023-12-28 |
| 18 | 202221020391-FER.pdf | 2025-06-10 |
| 19 | 202221020391-FORM 3 [08-09-2025(online)].pdf | 2025-09-08 |
| 20 | 202221020391-FER_SER_REPLY [07-10-2025(online)].pdf | 2025-10-07 |
| 21 | 202221020391-US(14)-HearingNotice-(HearingDate-20-11-2025).pdf | 2025-10-30 |
| 22 | 202221020391-FORM-26 [13-11-2025(online)].pdf | 2025-11-13 |
| 23 | 202221020391-Correspondence to notify the Controller [13-11-2025(online)].pdf | 2025-11-13 |
| # | Name |
|---|---|
| 1 | 202221020391_SearchStrategyNew_E_SearchStrategy57E_23-05-2025.pdf |