
Workforce Training And Development Device

Abstract: A workforce training and development device, comprising a cuboidal body 101 having a set of extendable rods 102 for managing the height of the body 101 from the ground surface, a motorized wheel 103 enabling movement of the body 101, an artificial intelligence-based imaging unit 104 to detect the presence of a user in proximity to the body 101, a touch interactive display panel 105 mounted on the body 101 that displays a series of interactive prompts and quiz-style questions to evaluate the user's initial understanding of a topic and accordingly presents a training program on a specific topic, a dedicated chamber 106 to hold different physical tools, equipped with a robotic arm 107 that includes a ball-and-socket joint for securely adjusting the position of the tools, a holographic projection unit 108 to deliver presentations related to various training programs, and a fingerprint scanner 109 that captures the user's biometric data to securely store and authenticate the user's identity.


Patent Information

Application #
Filing Date
01 December 2024
Publication Number
1/2025
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Email
Parent Application

Applicants

Marwadi University
Rajkot – Morbi Road, Rajkot 360003 Gujarat, India.

Inventors

1. Dr. Jyoti Ishwar Laljan
Faculty of Management Studies, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat, India.
2. Pooja Sarvaliya
Faculty of Management Studies, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat, India.

Specification

Description:

FIELD OF THE INVENTION

[0001] The present invention relates to a workforce training and development device capable of providing a comprehensive training experience that is personalized, engaging, and secure, adapting to the user's needs and understanding while continuously assessing the user's emotional and physical cues.

BACKGROUND OF THE INVENTION

[0002] The modern workforce is increasingly complex and dynamic, with employees requiring continuous training and development to stay competitive. Effective training is crucial for enhancing productivity, improving job satisfaction, and driving business success. However, traditional training methods often fall short in providing personalized, engaging, and effective learning experiences.

[0003] Conventional training approaches typically involve classroom-based instruction, online courses, or on-the-job training. These methods often rely on a one-size-fits-all approach, failing to account for individual learning styles, needs, and preferences. Traditional training methods also tend to be passive, lacking interactivity and engagement, which leads to decreased motivation and retention.

[0004] US20030009742A1 discloses an automated job training and performance tool, a suite of computer software applications that enables an organization to develop a program for the instruction and training of its members. The tool enables those charged with developing instruction and training to develop a web-based training course without any formal acquaintance with computer programming languages, either individually or jointly in synchronous or asynchronous modes. The suite includes a guidelines application describing the procedures for developing a job training program, a design application that uses analysis and design templates to guide the user in course development, and a Web Author application for automating the process of generating an HTML document implementing the course. The three applications may be used individually, but are seamlessly integrated through object-oriented programming techniques so that data entered in the templates and forms is carried over to the Web Author application.

[0005] WO2003075124A3 discloses a system that automates workforce management tasks through the integrated use of structured content accessible from a database, a set of business-logic rules engines, and input from users via user interfaces. The invention also provides a methodology for creating engineered content accessible to a strategic workforce management system that manages human resources tasks.

[0006] Conventionally, there exist many devices capable of providing training programs to a user; however, these existing devices fail to provide real-time information and explanations suited to the requirements or intellect of the user, which affects not only learning but also the user's productivity. In addition, these existing devices are incapable of maintaining data confidentiality while providing training to individuals, which raises security risks.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a device capable of streamlining training programs by providing information and explanations during the training itself, helping to keep the user engaged with the program. Furthermore, the developed device must be capable of securely maintaining the confidentiality of user data while allowing the user easy access to the program.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a device that is capable of providing a suitable training program based on the user's initial understanding, attention level, engagement, and comprehension by continuously assessing the user's emotional and physical cues to adjust the training content effectively, thereby providing personalized and adaptive training.

[0010] Another object of the present invention is to develop a device that is capable of offering an immersive training experience through interactive and visual aids by providing real-time information and explanations, thereby enhancing user engagement and interaction, making the training experience more effective.

[0011] Yet another object of the present invention is to develop a device that is capable of ensuring secure and efficient user identification by maintaining data confidentiality while providing users with easy access to their training programs for enabling personalized interactions and access to training programs.

[0012] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0013] The present invention relates to a workforce training and development device that is capable of offering a unified training solution that combines personalization, engagement, and security, while continuously gauging user emotions and responses to foster a seamless learning experience.

[0014] According to an embodiment of the present invention, a workforce training and development device comprises a cuboidal body featuring extendable rods attached to its bottom portion, allowing for height adjustment from the ground surface, each rod being equipped with a motorized wheel facilitating smooth movement of the body from one location to another. An artificial intelligence-based imaging unit is installed on the body to capture multiple images of the surroundings and detect users in proximity. Upon detecting a user, an inbuilt microcontroller displays interactive prompts and quiz-style questions on the touch interactive display panel to assess the user's initial understanding of a topic. The microcontroller stores the user's responses and presents a training program on the display panel, continuously monitoring the user's facial expressions, body language, and non-verbal cues via the imaging unit. The microcontroller utilizes machine learning modules to analyze real-time emotional and physical cues, determining the effectiveness of the training content.

[0015] According to another embodiment of the present invention, the proposed device further comprises a dedicated chamber on top of the body that holds various physical tools, which are securely held and adjusted by a robotic arm with a ball-and-socket joint. The robotic arm ensures easy viewing of the tools, providing real-time descriptions of physical tool specifications to assist trainers. A holographic projection unit integrated into the body delivers presentations related to various training programs, offering an alternative method for displaying training content. A fingerprint scanner mounted on the body captures users' biometric data, securely storing and authenticating user identities in a database linked with the microcontroller.

[0016] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0017] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a workforce training and development device.

DETAILED DESCRIPTION OF THE INVENTION

[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0019] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0020] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0021] The present invention relates to a workforce training and development device that is capable of providing an integrated training process that adapts to individual user needs, promotes active engagement, and maintains robust security, all while assessing user emotions and responses in real-time.

[0022] Referring to Figure 1, an isometric view of a workforce training and development device, comprising a cuboidal body 101 attached with a set of extendable rods 102, a motorized wheel 103 is integrated with each of the rods 102, an artificial intelligence-based imaging unit 104 installed on the body 101, a touch interactive display panel 105 mounted on the body 101, a dedicated chamber 106 positioned on top of the body 101, the chamber 106 is equipped with a robotic arm 107, a holographic projection unit 108 is integrated on the body 101 and a fingerprint scanner 109 is mounted on the body 101.

[0023] The device disclosed herein comprises a cuboidal body 101, which serves as the main structure of the device and is developed to be positioned on a ground surface. The body 101 has multiple extendable rods 102 that provide stability to the body 101 over the surface by extending and retracting to manage the height of the body 101 as per the requirement. The rods mentioned herein are powered by a pneumatic unit that utilizes compressed air to extend and retract the rods 102.

[0024] The process begins with an air compressor, which compresses atmospheric air to a higher pressure. The air cylinder of the pneumatic unit contains a piston that moves back and forth within the cylinder. The cylinder is connected to one end of the rods 102. The piston is attached to the rods 102, and its movement is controlled by the flow of compressed air. To extend the rods 102, the air valve is activated to allow compressed air to flow into the cylinder chamber behind the piston. As the pressure increases in this chamber, the piston pushes the rods 102 to the desired length, adjusting the height of the body 101 as per the user requirement.
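The valve-driven extension described above can be summarized as a simple control loop: keep admitting air until the rods reach the requested extension. The step size and maximum travel below are illustrative assumptions, not values from the specification.

```python
# Hypothetical simulation of the pneumatic height adjustment: the valve
# stays "open" (one step of travel per iteration) until the requested
# extension is reached. STEP_MM and MAX_TRAVEL_MM are assumed values.
STEP_MM = 5        # assumed travel per control step while the valve is open
MAX_TRAVEL_MM = 300  # assumed mechanical limit of the rods

def extend_rods(current_mm: int, target_mm: int) -> int:
    """Step the piston toward the target height, clamped to the rod limit."""
    target = min(target_mm, MAX_TRAVEL_MM)
    while current_mm < target:
        # valve open: compressed air enters the chamber, piston advances
        current_mm = min(current_mm + STEP_MM, target)
    return current_mm
```

A request beyond the mechanical limit simply stops at `MAX_TRAVEL_MM`, and a target at or below the current height leaves the rods where they are.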

[0025] Each of the rods is equipped with a motorized wheel 103 that provides movement of the body 101 over the surface. The wheels 103 move independently over the surface without changing their orientation. Rollers on the smaller wheels 103 create a lateral force that allows each wheel 103 to move in a direction perpendicular to its axis of rotation. The motorized wheel's 103 design enables it to move on any type of surface with high agility and versatility.

[0026] The device features a fingerprint scanner 109 installed on the body 101 to scan the user's biometric data for authentication purposes. The fingerprint scanner 109 captures the user's unique physiological or behavioral characteristics and converts them into a digital template, which is a mathematical representation of the fingerprint. The digital template is compared to the pre-saved fingerprints stored in the database to verify the user. This is done by calculating the similarity between the new template and the stored templates.

[0027] A matching threshold is set to determine whether the captured fingerprint sufficiently matches any of the pre-saved fingerprints. Authentication is decided based on the comparison results and the matching threshold. Only if the captured template matches a template stored in the database within the acceptable threshold are personalized training and interaction provided; the device also captures and stores real-time data on the user's comprehension and intellectual abilities. This data is utilized to create a dynamic user profile, which is continuously updated and refined with each successive evaluation, providing a comprehensive and evolving picture of the user's knowledge and abilities.
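The threshold-based comparison described above can be sketched as follows. The template representation (plain feature vectors), cosine similarity as the metric, and the value of `MATCH_THRESHOLD` are all illustrative assumptions; real fingerprint matchers use minutiae-based scoring.

```python
# Hedged sketch of template matching against a database with an
# acceptance threshold. Vectors, metric, and threshold are assumed.
import math

MATCH_THRESHOLD = 0.90  # assumed acceptance threshold

def similarity(a, b):
    """Cosine similarity between two fingerprint template vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def authenticate(captured, stored_templates):
    """Return the user ID of the best match above threshold, or None."""
    best_id, best_score = None, 0.0
    for user_id, template in stored_templates.items():
        score = similarity(captured, template)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None
```

A template below the threshold for every stored entry yields `None`, so no training session or profile access is granted.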

[0028] Herein, an artificial intelligence-based imaging unit 104 is mounted on the body 101 that captures multiple high-resolution images of the surroundings to monitor the presence of a user near the body 101. The artificial intelligence-based imaging unit 104 is constructed with a camera lens and a processor, wherein the camera lens is adapted to capture a series of images of the surroundings in proximity to the body 101. The processor carries out a sequence of image processing operations including pre-processing, feature extraction, and classification.

[0029] The image captured by the imaging unit 104 is a real-time image of the body's 101 surroundings. The artificial intelligence-based imaging unit 104 is in communication with a microcontroller, wherein the microcontroller used herein is an Arduino Uno microcontroller. The artificial intelligence-based imaging unit 104 transmits the captured image signal in the form of digital bits to the microcontroller. The microcontroller, upon receiving the image signals, compares the received image signal with the pre-fed data stored in a database and constantly determines the presence of a user in close proximity to the body 101.
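One minimal way to realize the "compare against pre-fed data" step is background differencing: flag a user when enough pixels deviate from a stored reference frame. The flat grayscale frame format, pixel delta, and presence threshold below are illustrative assumptions, not the patent's classifier.

```python
# Hypothetical presence check: compare a captured frame against a
# pre-fed background reference. Frame layout and thresholds are assumed.
PRESENCE_THRESHOLD = 0.10  # assumed fraction of pixels that must change
PIXEL_DELTA = 30           # assumed per-pixel intensity difference

def user_present(frame, background):
    """Return True when enough pixels deviate from the reference image."""
    changed = sum(
        1 for f, b in zip(frame, background) if abs(f - b) > PIXEL_DELTA
    )
    return changed / len(frame) >= PRESENCE_THRESHOLD
```

In the described device this decision would gate the next step, i.e. actuating the display panel to show the initial quiz prompts.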

[0030] Based on the detection of the user, the microcontroller actuates a touch interactive display panel 105 installed over the body 101 to display multiple interactive prompts and quiz-style questions for analyzing the user's knowledge of a particular topic. The touch interactive display panel 105 mentioned herein is typically an LCD (Liquid Crystal Display) screen that displays the interactive prompts and quiz-style questions in visible form. The screen is equipped with touch-sensitive technology, allowing the user to interact directly with the display using their fingers. A touch controller IC (Integrated Circuit) is responsible for processing the analog signals generated when the user inputs their response. The touch controller is typically connected to the microcontroller through various interfaces, which may include but are not limited to SPI (Serial Peripheral Interface) or I2C (Inter-Integrated Circuit).

[0031] After the user provides their responses, the microcontroller stores these responses and utilizes them to present a training program. The microcontroller assesses the user responses to categorize them into distinct intellectual tiers, comprising low, medium, and high levels of proficiency.
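The tier categorization above amounts to bucketing a quiz score. The boundary values below are illustrative assumptions; the patent does not specify how the low/medium/high cut-offs are chosen.

```python
# Minimal sketch of mapping quiz results to the low/medium/high
# proficiency tiers; the 40% and 75% boundaries are assumed values.
def proficiency_tier(correct: int, total: int) -> str:
    """Categorize a quiz score into an intellectual tier."""
    ratio = correct / total
    if ratio < 0.40:
        return "low"
    if ratio < 0.75:
        return "medium"
    return "high"
```

The selected tier would then index into the stored training content so the program matches the user's baseline understanding.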

[0032] The program is specifically designed to address the user's knowledge gaps and areas of improvement, ensuring that the training is relevant and effective. As the training program unfolds, the microcontroller continuously monitors the user's facial expressions, body language, and other non-verbal cues. This is achieved through the imaging unit 104 and machine learning modules, which capture and analyze the user's visual and behavioral responses in real time. By assessing these non-verbal cues, the microcontroller gauges the user's attention level, engagement, and comprehension of the training material. The microcontroller delivers instantaneous feedback to the user through visual cues, providing guidance on correcting their form, focus, or pace. This real-time feedback enables users to make adjustments, get back on track, and ultimately improve their performance, fostering a more effective and engaging learning experience.

[0033] The process begins with an initial assessment, referred to as the first phase, which starts by displaying foundational text to gauge the user's baseline understanding. This assessment informs the selection of suitable content and examples that align with the user's intellectual level. In the second phase, the device presents a customized training program on a specific topic. Simultaneously, it continuously monitors the user's non-verbal cues, such as facial expressions and body language, to assess their engagement, attention, and comprehension levels. Upon completion of the training program, the device evaluates the user's understanding and intellect. If necessary, it provides supplementary examples to reinforce learning and prompts the user to articulate their comprehension of the concepts, ensuring a solid grasp of the material.

[0034] The microcontroller's ability to monitor and respond to the user's non-verbal cues enables it to adjust the training program in real time. For instance, if the microcontroller detects that the user is becoming disengaged or confused, it adapts the training program to include additional explanations, examples, or interactive elements, thereby ensuring that the user remains engaged and motivated throughout the training process, ultimately leading to improved knowledge retention and skill acquisition.
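The adaptation rule described above can be sketched as a small decision function. Representing the analyzed cues as engagement and comprehension scores in the range 0 to 1, and the particular thresholds and intervention names, are all illustrative assumptions.

```python
# Hedged sketch of the real-time adaptation: normalized cue scores
# (0.0-1.0) stand in for the machine-learning analysis of non-verbal
# cues; thresholds and intervention labels are assumed.
def adapt_training(engagement: float, comprehension: float) -> str:
    """Choose an intervention from the monitored cue scores."""
    if engagement < 0.4:
        # user disengaged: inject interactive elements to recapture focus
        return "add_interactive_elements"
    if comprehension < 0.5:
        # user engaged but confused: show further explanations/examples
        return "show_additional_examples"
    return "continue_program"
```

Run once per monitoring cycle, this loop is what lets the device keep the program aligned with the user's state rather than playing fixed content.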

[0035] The body 101 features a chamber 106 designed to hold a range of tools, allowing trainers to showcase and explain different components in a hands-on manner. For instance, the chamber 106 might hold a collection of wrenches, pliers, and screwdrivers, each with distinct features and applications. A robotic arm 107 is installed within the chamber 106 by means of a ball-and-socket joint and is actuated by the microcontroller to adjust the positioning of the tools so that they are easily visible, enabling trainers to describe each tool or component effectively. The robotic arm's 107 flexibility allows it to manipulate the tools in various ways, facilitating real-time demonstrations and explanations.
• For example, during a training session on plumbing, the trainer might use the robotic arm 107 to hold a small pipe wrench, allowing the users to see the tool's intricate details. As the trainer explains the wrench's functionality and application, the robotic arm 107 adjusts the tool's position, providing a clear view of its components. This interactive and immersive experience enables users to better comprehend complex topics and develop a deeper understanding of the physical tools involved.

[0036] A holographic projection unit 108 is installed on the body 101 to project presentations corresponding to multiple training programs, providing an alternative way to deliver training content to the user. On actuation of the holographic projection unit 108 by the microcontroller, the light source emits various combinations of light towards the lens, which projects virtual images as an alternative method for displaying training content to the user. In case any knowledge gap is identified through evaluation, the holographic projection unit is triggered, generating a real-time, immersive environment that facilitates in-depth comprehension of the topic, thereby bridging the understanding gap and enhancing the learning experience.

[0037] The present invention works best in the following manner. With the cuboidal body 101 positioned on the ground surface, the process begins by detecting the presence of the user in proximity to the cuboidal body 101 through its artificial intelligence-based imaging unit 104. Once the user is detected, the inbuilt microcontroller displays the series of interactive prompts and quiz-style questions on the touch interactive display panel 105 to evaluate the user's initial understanding of the topic. Based on the user's responses, the microcontroller presents the training program on the display panel 105, continuously monitoring the user's facial expressions, body language, and other non-verbal cues through the imaging unit 104. This assessment enables the device to gauge the user's attention level, engagement, and comprehension. As the training program progresses, the device utilizes machine learning modules to analyze real-time emotional and physical cues, determining whether the user is effectively following the training content. To further enhance the training experience, the device projects presentations related to various training programs through its integrated holographic projection unit 108. Throughout the training process, the device ensures personalized interactions and access to training programs by securely storing and authenticating user identities through its fingerprint scanner 109. The device also provides real-time descriptions of physical tool specifications through its dedicated chamber 106 and robotic arm 107, assisting trainers in explaining details about smaller tools or components effectively.

[0038] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) A workforce training and development device, comprising:

i) a cuboidal body 101 attached with a set of extendable rods 102 provided on a bottom portion of said body 101 for managing the height of said body 101 from ground surface, wherein a motorized wheel 103 is integrated with each of said rods 102, enabling movement of said body 101 from one location to another;
ii) an artificial intelligence-based imaging unit 104 installed on said body 101 and paired with a processor for capturing and processing multiple images of surroundings, respectively, to detect presence of a user in proximity to said body 101, wherein upon successful detection an inbuilt microcontroller displays a series of interactive prompts and quiz-style questions over a touch interactive display panel 105 mounted on said body 101 to evaluate user’s initial understanding of a topic;
iii) said microcontroller stores responses to said prompts and accordingly presents a training program over said display panel 105 on a specific topic while continuously monitoring the user's facial expressions and body language via said imaging unit 104 to assess the user's attention level, engagement, and comprehension; and
iv) a dedicated chamber 106 positioned on top of said body 101, configured to hold different physical tools, wherein said chamber 106 is equipped with a robotic arm 107 that includes a ball-and-socket joint capable of securely holding and adjusting the position of said tools, ensuring said tools are easily viewed to provide real-time descriptions of physical tool specifications to assist trainers in explaining details about smaller tools or components in case the users are unable to understand any topic.

2) The device as claimed in claim 1, wherein said microcontroller utilizes machine learning modules to analyze real-time emotional and physical cues and determine whether said user is following the training content effectively, and said microcontroller provides real-time feedback to said user via visual prompts to correct form, focus, or pace, guiding and encouraging said user to get back on track.

3) The device as claimed in claim 1, wherein a holographic projection unit 108 is integrated on said body 101 to deliver presentations related to various training programs, in case said microcontroller detects a drop in user’s performance, demonstrating more detailed visual guides and verbal explanation, making it more effective, engaging, and personalized.

4) The device as claimed in claim 1, wherein a fingerprint scanner 109 is mounted on said body 101 that captures user's biometric data to securely store and authenticate user’s identity and securely link to a personalized database managed by said microcontroller, ensuring each user's data, training history, and progress are accurately recorded.

Documents

Application Documents

# Name Date
1 202421094500-STATEMENT OF UNDERTAKING (FORM 3) [01-12-2024(online)].pdf 2024-12-01
2 202421094500-REQUEST FOR EXAMINATION (FORM-18) [01-12-2024(online)].pdf 2024-12-01
3 202421094500-REQUEST FOR EARLY PUBLICATION(FORM-9) [01-12-2024(online)].pdf 2024-12-01
4 202421094500-PROOF OF RIGHT [01-12-2024(online)].pdf 2024-12-01
5 202421094500-POWER OF AUTHORITY [01-12-2024(online)].pdf 2024-12-01
6 202421094500-FORM-9 [01-12-2024(online)].pdf 2024-12-01
7 202421094500-FORM FOR SMALL ENTITY(FORM-28) [01-12-2024(online)].pdf 2024-12-01
8 202421094500-FORM 18 [01-12-2024(online)].pdf 2024-12-01
9 202421094500-FORM 1 [01-12-2024(online)].pdf 2024-12-01
10 202421094500-FIGURE OF ABSTRACT [01-12-2024(online)].pdf 2024-12-01
11 202421094500-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [01-12-2024(online)].pdf 2024-12-01
12 202421094500-EVIDENCE FOR REGISTRATION UNDER SSI [01-12-2024(online)].pdf 2024-12-01
13 202421094500-EDUCATIONAL INSTITUTION(S) [01-12-2024(online)].pdf 2024-12-01
14 202421094500-DRAWINGS [01-12-2024(online)].pdf 2024-12-01
15 202421094500-DECLARATION OF INVENTORSHIP (FORM 5) [01-12-2024(online)].pdf 2024-12-01
16 202421094500-COMPLETE SPECIFICATION [01-12-2024(online)].pdf 2024-12-01
17 Abstract.jpg 2024-12-27
18 202421094500-FORM-26 [03-06-2025(online)].pdf 2025-06-03