
Candidate Onboarding Assistance System

Abstract: A candidate onboarding assistance system, comprising a bot 101; an imaging unit 102 that performs facial recognition of the candidate; a microphone 103 and speaker 104, operatively coupled with a microcontroller, to provide input and generate output for enabling interaction with the candidate; an OCR (optical character recognition) scanner 105 to scan the documents of the candidates; at least two L-shaped rods 106 with grippers 107 that flip the documents while scanning; a compartment 108 storing multiple joining kits and secured with a motorized lid 109; and a touch-interactive display panel 111 providing an option of selecting one or more departments or wings within an office premises.


Patent Information

Application #
Filing Date
26 April 2025
Publication Number
20/2025
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application

Applicants

Marwadi University
Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Inventors

1. Shashank Agarwal
Department of Computer Engineering - Artificial Intelligence, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
2. Dr. Sanket Badiyani
Department of Computer Engineering - Artificial Intelligence, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
3. Ayush Gour
Department of Computer Engineering - Artificial Intelligence, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
4. Dr. Madhu Shukla
Department of Computer Engineering - Artificial Intelligence, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Specification

Description:

FIELD OF THE INVENTION

[0001] The present invention relates to a candidate onboarding assistance system that is developed for automating the process of onboarding individuals within organizational premises by enabling secure identification, streamlined document verification, and guided movement, thereby enhancing transparency, accuracy, and speed in handling procedural formalities for new entrants in professional or institutional environments.

BACKGROUND OF THE INVENTION

[0002] Generally, onboarding processes in offices and institutions involve multiple manual steps that are time-consuming and error-prone. Typically, a candidate is greeted by a receptionist, asked to fill out paper forms, submit physical copies of documents, and wait in queues for verification. Identity checks are often done visually or using basic biometric scanners, while document verification is handled by staff members comparing details manually. If the candidate needs to go to a specific department, they are usually given verbal directions or a printed map. These conventional methods not only require constant human involvement but also lead to long waiting times, miscommunication, and missed details, especially when handling a large number of candidates. There is also the challenge of storing and retrieving documents efficiently. This disjointed, paper-heavy approach lacks real-time integration and consistency, making the process less reliable and more stressful for both the candidate and the organization handling the onboarding.

[0003] Conventionally, the process of candidate onboarding, identity verification, and document management has evolved slowly over the years, primarily relying on manual and semi-automated methods. Initially, the process was completely paper-based. Candidates were asked to fill out handwritten application forms and submit passport-sized photographs and photocopies of ID and educational documents. Verification was performed by human staff, who manually cross-checked these documents against physical records or internal files. Physical ledgers were used to maintain attendance, appointments, and joining logs. However, such methods are prone to human error and are quite time-consuming. Consequently, organizations also adopted digital systems for record-keeping, wherein candidates fill out digital forms and scanned documents are stored on local systems. Fingerprint-based biometric systems became common in large offices and government organizations, allowing basic identity validation. But even with these digital systems, poor syncing between local databases and central servers might result in loss or mismatch of records.

[0004] US20190251515A1 discloses an automated system for onboarding. The system comprises a database to store details of existing employees, a vacancy repository to store details related to vacant positions within an organization, a recruitment module to receive candidate application details corresponding to a plurality of candidates, a test module to provide test cases to candidates, an assessment module to provide an assessment score, a selection module to select candidates based on the assessment score, a pre-boarding module to identify and receive missing details of the selected candidates to obtain complete details, a verifier module to verify the complete details of the selected candidates to obtain verified details, and an onboarding module to receive verified details and facilitate onboarding of the selected candidates.

[0005] US20150310394A1 discloses about an invention that includes an apparatus and method for hiring, onboarding, and collaborating. The system and method may utilize a server and user interfaces to facilitate the posting of job openings, the narrowing of a pool of jobseekers, the scheduling of interviews, and the selection of a jobseeker for hire. Once hired, the system and method may facilitate the onboarding process through the sharing of documents and other necessary information. The system and method may additionally allow for collaboration among users.

[0006] Conventionally, many systems have been developed that are capable of aiding organizations in onboarding candidates. However, these existing systems are incapable of guiding individuals to designated locations using visual, digital, or physical means. In addition, the developed systems also lack means for monitoring and analyzing behaviour to support decision-making during selection or induction.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system that is capable of providing a means for guiding individuals to designated locations using visual, digital, or physical means. In addition, the developed system should also monitor and analyze behaviour to support decision-making during selection or induction.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a system that is capable of managing candidate onboarding and verification processes in an efficient and organized manner, thereby eliminating delays, inconsistencies, and human errors associated with manual identity verification and document handling.

[0010] Another object of the present invention is to develop a system that is capable of distributing required resources automatically upon successful completion of preliminary formalities.

[0011] Another object of the present invention is to develop a system that is capable of maintaining and syncing user records securely with centralized data sources.

[0012] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0013] The present invention relates to a candidate onboarding assistance system that facilitates efficient candidate intake procedures and eligibility verification through structured and effective means, thereby removing lags, discrepancies, and operator-dependent mistakes typically encountered during manual identity assessment and credential review.

[0014] According to an embodiment of the present invention, a candidate onboarding assistance system comprises a bot wirelessly linked with a server via a communication module, the server configured to store credentials of a selected candidate; an imaging unit installed over the bot to perform facial recognition of the candidate; a microphone and speaker, operatively coupled with a microcontroller, wherein the microcontroller takes input from the microphone and imaging unit and generates an output through the speaker, enabling interaction with the candidate; the microcontroller is configured in a multi-lingual configuration, interacting with the candidates in multiple languages as per the candidate’s requirement; the microcontroller relays greetings via the speaker in case facial recognition is matched and generates an alert to the concerned authority for updating the profile of the candidate in case facial recognition is not matched; an OCR (optical character recognition) scanner installed over the bot to scan the documents of the candidates, wherein the scanned documents are digitally stored in the server in sync with the candidate’s credentials; and the microcontroller correlates the scanned documents of the candidate with a central database stored within the server to cross-verify whether the details provided by the candidate during the selection process match the documents.

[0015] According to another embodiment of the present invention, the system further includes at least two L-shaped rods, with grippers installed as end effectors for flipping the documents while scanning; the microcontroller is configured to highlight one or more forms and annexures over the display panel, to be digitally signed, wherein the forms and annexures are synced with the user’s credentials and stored within the server; a compartment stored with a plurality of joining kits and secured with a motorized lid, wherein on successful uploading and verification of documents, the microcontroller activates the lid, along with an inbuilt pusher, to eject one of the joining kits out of the compartment; a touch-interactive display panel, integrated over the bot, to provide an option of selecting one or more departments or wings within an office premises, wherein on selection of a corresponding department, the microcontroller fetches coordinates via an indoor navigation system and navigates the candidate towards the department in a first, second, or third manner, the first manner including display of a map towards the designated department from the initial location of the candidate, the second manner referring to an AR (augmented reality) unit interconnected with the microcontroller to show an augmented path towards the department, and the third manner referring to activation of one or more motorized wheels installed with the bot to move towards the department for navigating the candidate; a screening module, operated by one or more processors linked with the microcontroller, the screening module comprising a plurality of cameras configured to process a dressing pattern, behaviour, and interaction with team members; a sensor configured to detect and categorize screen time and idle time of the candidate; and a user interface, employed by a concerned authority, wherein in case of mismatch of the behaviour, interaction, dressing pattern, or idle time with a threshold value, the processor transmits an alert over the user interface.

[0016] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0017] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates a perspective view of a candidate onboarding assistance system.

DETAILED DESCRIPTION OF THE INVENTION

[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0019] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0020] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0021] The present invention relates to a candidate onboarding assistance system that enables optimized entrant processing and validation mechanisms in a coherent and methodical fashion, thus avoiding postponements, irregularities, and manual-based inaccuracies typically found in conventional personal detail checking and credential evaluation practices.

[0022] Referring to Figure 1, a perspective view of a candidate onboarding assistance system is illustrated, comprising a bot 101, an imaging unit 102 installed over the bot 101, a microphone 103 and speaker 104 installed over the bot 101, an OCR (optical character recognition) scanner 105 installed over the bot 101, at least two L-shaped rods 106, with grippers 107 arranged with the bot 101, a compartment 108 secured with a motorized lid 109, and having an inbuilt pusher 110, a touch interactive display panel 111, integrated over the bot 101, an AR (augmented reality) unit 112 integrated over the bot 101, one or more motorized wheels 113 installed with the bot 101.

[0023] The system disclosed herein comprises a bot 101 that is operatively configured to establish wireless communication with a remote server via an integrated communication module, wherein the server is programmed to receive, store, and manage the digital credentials corresponding to a selected candidate. Upon initiation of the onboarding process, the bot 101 transmits identification data to the server, which, in turn, archives the data in a structured format for subsequent retrieval, verification, and correlation with additional candidate-specific inputs acquired during the operational sequence. The server functions as a centralized repository, maintaining secure and synchronized access to the candidate’s credential information throughout the interaction.

[0024] An imaging unit 102 is structurally installed over the bot 101 and operatively coupled with a microcontroller. The imaging unit 102 is activated upon detection of the candidate's presence within its defined field of view and captures high-resolution facial images for the purpose of executing facial recognition. Thereafter, the imaging unit 102 extracts distinguishing facial feature points such as eye distance, jawline, and nose shape. These features are converted into digital biometric markers, which are relayed to the microcontroller for real-time comparison against stored facial data. The unit continuously adjusts focus and lighting parameters to ensure optimal image quality, enabling reliable recognition under varied environmental conditions. Once identification has been verified or denied, the imaging unit 102 resets for the next input cycle.
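
The matching step described above, comparing live biometric markers against stored facial data, can be sketched in minimal Python. This is an illustrative sketch only, not the disclosed implementation; the vector representation of the facial features, the `is_match` helper, and the 0.9 threshold are assumptions for the example.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 means identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(live_features, stored_features, threshold=0.9):
    """Declare a facial match when similarity meets the (assumed) threshold."""
    return cosine_similarity(live_features, stored_features) >= threshold
```

In practice the feature vectors would come from a trained face-embedding model; the final comparison, however, typically reduces to a distance or similarity test of this shape.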

[0025] A microphone 103 and speaker 104 are operatively coupled with the microcontroller, wherein the microcontroller processes the input received from the microphone 103 and imaging unit 102 to generate a corresponding output through the speaker 104. The microphone 103 captures auditory input, which is processed and analyzed by the microcontroller to facilitate interaction with the candidate, allowing for dynamic communication. The speaker 104, in turn, delivers the output generated by the microcontroller, ensuring a responsive, real-time interaction. This arrangement allows the bot 101 to engage with the candidate, providing auditory feedback based on the candidate's actions or responses as detected by the microphone 103 and imaging unit 102.

[0026] The microphone 103 captures audio signals from the surrounding environment, specifically focusing on the sound emitted by the candidate during interaction. The captured audio is then converted into an electrical signal and transmitted to the microcontroller. The microcontroller processes the audio input to interpret spoken words or sounds, which may be used for triggering specific actions or responses within the system. The microphone 103 continuously monitors and relays input to the microcontroller, ensuring a seamless exchange of information.

[0027] The speaker 104 receives output signals from the microcontroller, converting the electrical signals into audible sound. The microcontroller generates the required audio output based on pre-programmed responses or actions derived from the input captured by the microphone 103 and imaging unit 102. The speaker 104 then emits the sound, providing real-time auditory feedback or guidance to the candidate. The system is designed to adjust volume and clarity to ensure effective communication in diverse environments.

[0028] The microcontroller herein operates in a multi-lingual configuration, wherein the microcontroller interacts with candidates in a variety of languages based on the candidate's specific requirements or preferences. Upon detection of the candidate's language preference, the microcontroller adjusts its communication parameters to deliver prompts, instructions, or feedback in the selected language.

[0029] This multi-lingual functionality ensures that the system remains accessible and user-friendly for individuals from diverse linguistic backgrounds, thereby enhancing the overall efficiency and inclusivity of the candidate interaction process. The system's language selection capability is synchronized with the bot 101's user interface, ensuring seamless communication throughout the engagement.
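
One way the multi-lingual prompt selection could work is a lookup table keyed by language code, with a fallback to a default language when a preference is unavailable. The prompt texts, language codes, and `get_prompt` helper below are hypothetical, offered only as a sketch of the mechanism.

```python
# Illustrative prompt catalogue; the entries and language codes are assumptions.
PROMPTS = {
    "en": {"greeting": "Welcome! Your identity has been verified.",
           "scan": "Please place your document on the scanner."},
    "hi": {"greeting": "Swagat hai! Aapki pehchaan satyapit ho gayi hai.",
           "scan": "Kripya apna dastavez scanner par rakhein."},
}

def get_prompt(key, lang="en"):
    """Return the prompt in the requested language, falling back to English."""
    catalogue = PROMPTS.get(lang, PROMPTS["en"])
    return catalogue.get(key, PROMPTS["en"][key])
```

The fallback keeps the bot functional even when a candidate requests a language the catalogue does not yet cover.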

[0030] Upon the successful verification of the candidate's identity through facial recognition, the microcontroller initiates a greeting sequence via the speaker 104, thereby confirming the candidate's identity in a manner consistent with the system's programmed protocols. Conversely, in the event that facial recognition fails to match the candidate's stored data, the microcontroller is programmed to immediately generate and transmit an alert to the designated authority. This alert serves as a notification to prompt the authority to take necessary corrective actions, such as updating or modifying the candidate's profile, ensuring that any discrepancies are addressed in a timely and efficient manner.
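
The greet-or-alert decision described in this paragraph is a simple branch. The sketch below is illustrative only; the callback names `speak` and `notify_authority` are placeholders for the speaker output and the alert channel to the concerned authority.

```python
def handle_recognition(matched, candidate_id, speak, notify_authority):
    """Greet the candidate on a successful match; otherwise alert the authority."""
    if matched:
        speak(f"Welcome, candidate {candidate_id}. Your identity is verified.")
        return "greeted"
    # Mismatch: prompt the concerned authority to update the candidate's profile.
    notify_authority(f"Face mismatch for candidate {candidate_id}: profile update needed.")
    return "alert_sent"
```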

[0031] An OCR (Optical Character Recognition) scanner 105 is installed over the bot 101 and operatively connected to the microcontroller. The OCR scanner 105 scans the candidate’s document using a high-resolution camera, capturing an image of the text. The captured image is then processed by the OCR module, which analyzes the characters, differentiates between letters, numbers, and other symbols, and converts the visual data into machine-readable text. The output is validated for accuracy, with corrections made for any discrepancies detected during the recognition process. The resulting text is transmitted to the microcontroller for further action, such as storing or cross-referencing the information.
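
Once the OCR module has produced machine-readable text, the system must pull structured fields out of it. A minimal sketch of that parsing step is shown below; the field labels ("Name", "DOB", "ID No") and regular expressions are illustrative assumptions, not the format of any particular document.

```python
import re

def extract_fields(ocr_text):
    """Pull candidate fields out of raw OCR text (labels are illustrative)."""
    patterns = {
        "name": r"Name[:\s]+([A-Za-z .]+)",
        "dob": r"(?:DOB|Date of Birth)[:\s]+(\d{2}[-/]\d{2}[-/]\d{4})",
        "id_number": r"ID(?:\s*No\.?)?[:\s]+([A-Z0-9]+)",
    }
    fields = {}
    for key, pattern in patterns.items():
        match = re.search(pattern, ocr_text)
        if match:
            fields[key] = match.group(1).strip()
    return fields
```

A production system would also need the validation and correction pass the paragraph mentions, since OCR output routinely contains misread characters.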

[0032] The scanned documents are processed by the OCR scanner 105 and subsequently converted into a digital format, which is then transmitted to the server for secure storage. The server is configured to synchronize the stored document data with the candidate’s credentials, ensuring that each document is accurately linked to the corresponding candidate profile. This synchronized storage process enables seamless retrieval, verification, and updating of the candidate’s information as needed, while maintaining the integrity and security of the data. This ensures that all documents are stored in compliance with relevant data protection standards and are readily accessible for future reference or validation.
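
The synchronized storage described here amounts to filing each digitized document under the candidate's credential record. The in-memory dictionary below stands in for the server's repository; the structure and the `store_document` helper are assumptions made for illustration.

```python
def store_document(server_store, candidate_id, doc_name, doc_text):
    """File a digitized document under the candidate's credential record."""
    record = server_store.setdefault(candidate_id, {"documents": {}})
    record["documents"][doc_name] = doc_text
    return record

def fetch_documents(server_store, candidate_id):
    """Retrieve all documents linked to a candidate, or an empty dict."""
    return server_store.get(candidate_id, {}).get("documents", {})
```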

[0033] Thereafter the microcontroller correlates the scanned documents of the candidate with a central database stored within the server. This process involves cross-referencing the candidate’s personal information, as provided during the selection process, with the corresponding data in the scanned documents. The microcontroller retrieves both the candidate’s credentials and the digitized document data from the server and performs a matching operation to ensure that the details are consistent across both sets of data. If any discrepancies are detected between the information provided by the candidate and the scanned documents, the microcontroller triggers an alert for further review or corrective action. This verification ensures that all candidate data is accurate and authentic before proceeding with the next stages of the process.
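
The cross-verification described here is a field-by-field comparison between the details the candidate supplied during selection and the fields extracted from the scanned documents. The sketch below is one minimal way to express it; case-insensitive matching is an assumption.

```python
def cross_verify(application_details, scanned_fields):
    """Compare application data against OCR-extracted fields.

    Returns the list of mismatched field names; an empty list means
    every provided detail matched the scanned documents.
    """
    mismatches = []
    for key, expected in application_details.items():
        scanned = scanned_fields.get(key)
        if scanned is None or scanned.strip().lower() != expected.strip().lower():
            mismatches.append(key)
    return mismatches
```

A non-empty result would drive the alert for "further review or corrective action" that the paragraph describes.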

[0034] Two L-shaped rods 106, equipped with grippers 107 as end effectors, are incorporated in the bot 101 to facilitate the flipping of documents during the scanning process. The rods 106 are operated in a controlled manner to securely grasp the edges of the documents, allowing for precise and smooth flipping. This arrangement ensures that each document is accurately positioned for scanning, while minimizing the risk of damage.

[0035] The rods 106 are pneumatically actuated, wherein the pneumatic arrangement of the rods 106 comprises a cylinder incorporating an air piston, and an air compressor, wherein the compressor controls the discharge of compressed air into the cylinder via air valves, which in turn leads to the extension or retraction of the piston. The piston is attached to the rods 106, wherein the extension or retraction of the piston corresponds to the extension or retraction of the rods 106. The actuated compressor allows extension of the rods 106 to position the grippers 107 in proximity to the documents being scanned.

[0036] The gripper 107 disclosed above is a robotic gripper. It includes a link connected with multiple motorized ball-and-socket joints for smooth and precise gripping of documents. Each motorized ball-and-socket joint includes a motor powered via the microcontroller, a ball-shaped element, and a socket. The ball moves freely within the socket. The motor rotates the ball in various directions under the control of the microcontroller, which commands the motor to position the ball precisely. The microcontroller further actuates the motor to rotate the joint, providing movement to the gripper 107 for flipping the documents while scanning.

[0037] Afterwards, the microcontroller identifies and highlights one or more forms and annexures on a touch-interactive display panel 111 integrated over the bot 101, which are designated for digital signing. These forms and annexures are directly linked to the user’s credentials, ensuring that the correct documents are presented for signature. Upon selection, the relevant forms and annexures are synchronized with the user’s stored credentials on the server. Once digitally signed, the signed documents are securely stored within the server, maintaining a record of the candidate’s acknowledgment and consent. This process ensures the integrity, security, and proper alignment of the documents with the user's profile throughout the signing procedure.

[0038] A compartment 108 is arranged on the bot 101, wherein a plurality of joining kits is securely stored and enclosed by a motorized lid 109. Upon successful uploading and verification of the candidate’s documents, the microcontroller is configured to trigger the motorized lid 109 to open. Simultaneously, an inbuilt pusher 110 is actuated to eject a single joining kit from the compartment 108, thereby facilitating automated and contactless distribution. The process ensures that only candidates who have completed all required steps receive a joining kit, thereby maintaining procedural accuracy, access control, and operational efficiency in distribution.

[0039] The motorized lid 109 remains in a secured closed position until it receives an activation signal from the microcontroller. Upon receiving this signal following successful document verification, the motor engages and initiates the opening sequence of the lid 109. This ensures smooth and controlled lifting to prevent obstruction or misalignment. Once fully open, it allows for the ejection of a joining kit through the pusher 110. After the kit is dispensed, the lid 109 closes automatically, returning to a locked state to prevent unauthorized access. The cycle then resets for the next operation.
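
The dispensing cycle described in the last two paragraphs (unlock, open, push a kit out, close, re-lock) can be modeled as a short fixed action sequence gated on verification. The sketch below is illustrative only; the action names and the `dispense_kit` helper are assumptions, standing in for the actual motor and pusher commands.

```python
def dispense_kit(kits_remaining, verified):
    """One dispensing cycle: open the lid, push a kit out, close and re-lock.

    Returns (actions_issued, kits_remaining_after). If the candidate is not
    verified or the compartment is empty, the lid stays locked and no
    actions are issued.
    """
    if not verified or kits_remaining <= 0:
        return [], kits_remaining
    actions = ["unlock_lid", "open_lid", "extend_pusher",
               "retract_pusher", "close_lid", "lock_lid"]
    return actions, kits_remaining - 1
```

Gating the whole cycle on `verified` mirrors the access-control requirement that only candidates who completed all steps receive a kit.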

[0040] The pusher 110 herein is a pneumatic pusher and is synchronously actuated by the microcontroller. The pneumatic arrangement of the pusher 110 comprises a cylinder incorporating an air piston, and an air compressor, wherein the compressor controls the discharge of compressed air into the cylinder via air valves, which in turn leads to the extension or retraction of the piston. The piston is attached to the pusher 110, wherein the extension or retraction of the piston corresponds to the extension or retraction of the pusher 110. The actuated compressor allows extension of the pusher 110 to eject one of the joining kits out of the compartment 108.

[0041] The touch-interactive display panel 111 integrated into the bot 101, provides an intuitive interface for the candidate to select one or more departments or wings within the office premises. The display is configured to present a list or menu of available departments or wings, allowing the candidate to make a selection based on their intended destination.

[0042] The touch-interactive display panel 111 operates by detecting physical touch input from the candidate via capacitive touch sensors integrated within the screen. When a candidate touches the screen, the sensors register the touch location and transmit this data to the microcontroller. The microcontroller processes the input data, identifies the selected department or option, and immediately updates the display to reflect the selected choice. The microcontroller then triggers the corresponding actions, such as fetching coordinates from the indoor navigation system or displaying relevant information. The panel 111 remains responsive throughout the interaction, ensuring real-time feedback to the user, allowing for continuous navigation and option selection.

[0043] Upon the candidate’s selection of a corresponding department, the microcontroller is programmed to retrieve the necessary coordinates through an indoor navigation system. These coordinates are then utilized to direct the bot 101 towards the selected department. The navigation is facilitated in one of three manners: firstly, by displaying a map on the interactive display panel 111 that guides the candidate towards the department; secondly, by utilizing an augmented reality (AR) unit 112 to overlay an augmented path on the display, guiding the candidate visually; and thirdly, by activating the motorized wheels 113 of the bot 101 to physically move the candidate towards the department, ensuring a seamless and efficient navigational process.
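
The three navigation manners reduce to a dispatch on the selected mode once the coordinates have been fetched. The sketch below illustrates that dispatch; the mode names ("map", "ar", "escort"), the `fetch_coordinates` callback, and the returned action strings are assumptions for the example.

```python
def navigate(department, mode, fetch_coordinates):
    """Dispatch one of the three navigation manners for a selected department."""
    coords = fetch_coordinates(department)  # indoor navigation system lookup
    if mode == "map":
        return f"display route to {department} at {coords}"
    if mode == "ar":
        return f"overlay AR path to {department} at {coords}"
    if mode == "escort":
        return f"drive bot to {coords}"
    raise ValueError(f"unknown navigation mode: {mode}")
```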

[0044] The augmented reality AR unit 112 operates by receiving real-time location data from the microcontroller, which processes the coordinates for the selected department. The AR unit 112 uses inbuilt infrared sensors, ultrasonic sensors, and cameras to scan the environment, mapping out the physical space. Upon capturing the surroundings, the unit 112 overlays a virtual path or directional arrows onto the display, guiding the candidate towards the destination. As the candidate moves, the AR unit 112 continuously adjusts the visual guidance in real time, updating the overlay to reflect any changes in the environment. This ensures accurate and intuitive navigation to the selected department.

[0045] Simultaneously, the motorized wheels 113 function by receiving directional commands from the microcontroller. Upon receiving navigation data, the microcontroller sends signals to the motors embedded in the wheels 113. These signals control the speed and direction of each wheel 113, allowing precise movement. The wheels 113 rotate in sync, enabling forward, backward, or turning motions. The microcontroller continuously adjusts the power sent to the motors to ensure smooth and accurate movement in real time, guiding the bot 101 toward the selected department. The system also adjusts the movement based on environmental feedback, such as obstacles, ensuring safe and efficient navigation.

[0046] A screening module is operated by one or more processors in communication with the microcontroller, wherein the module incorporates a plurality of cameras. These cameras are strategically positioned to capture and analyze the candidate’s dressing pattern, behaviour, and interaction with team members. The captured visual data is processed by the processor to assess whether the candidate meets predefined criteria for professional conduct and presentation. The microcontroller then evaluates the data in real-time, enabling the system to flag any inconsistencies or deviations from the expected behaviour or appearance, ensuring alignment with company standards.

[0047] The cameras are activated and continuously capture visual data from the candidate’s environment. These cameras are calibrated to detect key attributes such as the candidate’s dressing pattern, gestures, and physical interactions. The processed images are transmitted to the processor, where advanced protocols analyze the visual information. The microcontroller then compares the captured data with preset standards for behaviour and appearance, and generates a response based on the analysis, such as triggering alerts or confirming compliance.

[0048] Also, the screening module incorporates a proximity sensor, which is configured to detect and categorize the candidate’s screen time and idle time. The proximity sensor operates by emitting infrared light or electromagnetic waves towards the candidate or surrounding environment. When the candidate comes within a predefined detection range, the sensor measures the reflected signal or change in the received electromagnetic field. The sensor detects the intensity and the angle of the reflected signal to determine the proximity of the candidate to the device. Based on this information, the microcontroller processes the data to categorize whether the candidate is engaged with the screen or idle. This allows for real-time tracking of screen time and idle periods, aiding in accurate evaluation of the candidate's interaction with the system.
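
The categorization of screen time versus idle time can be sketched as thresholding a stream of proximity readings and accumulating the time spent in each state. The normalized reading scale, the 0.5 threshold, and the one-second sample period below are assumptions for the example, not values disclosed by the invention.

```python
def categorize_activity(proximity_samples, threshold=0.5, sample_period=1.0):
    """Classify each proximity reading as screen engagement or idleness.

    A reading at or above the threshold (strong reflection) is taken to
    mean the candidate is near the screen; each sample contributes one
    sample_period of time to its category.
    """
    screen_time = idle_time = 0.0
    for reading in proximity_samples:
        if reading >= threshold:
            screen_time += sample_period
        else:
            idle_time += sample_period
    return {"screen_time": screen_time, "idle_time": idle_time}
```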

[0049] Furthermore, the screening module includes a user interface, which is utilized by the concerned authority to monitor and evaluate the candidate’s behaviour, interaction, dressing pattern, and idle time. In the event that any of these parameters deviate from a predefined threshold value or do not meet the established standards, the processor, upon detecting such discrepancies, transmits an alert through the user interface. This alert serves to notify the authority of the mismatch, enabling timely corrective action or further investigation. The system ensures real-time monitoring and facilitates proactive decision-making, thereby maintaining adherence to the required standards for candidate conduct and presentation.
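
The threshold-based alerting described here is a comparison of each monitored parameter against its configured limit, with a notification for every breach. The parameter names and the `send_alert` callback in the sketch below are placeholders for the actual monitored metrics and the user-interface alert channel.

```python
def check_thresholds(observations, thresholds, send_alert):
    """Alert the authority for every monitored parameter that breaches its limit.

    Returns the list of flagged parameter names.
    """
    flagged = []
    for param, value in observations.items():
        limit = thresholds.get(param)
        if limit is not None and value > limit:
            send_alert(f"{param} exceeded threshold: {value} > {limit}")
            flagged.append(param)
    return flagged
```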

[0050] Moreover, a battery is associated with the system to power the electrical and electronically operated components of the system and supply a voltage to them. The battery used herein is preferably a lithium-ion battery, a rechargeable unit that requires recharging once drained. The battery stores electric current drawn from an external source in the form of chemical energy; whenever an electronic component of the system requires power, it draws the needed supply from the battery for proper functioning of the system.

[0051] The present invention works in the best manner where the bot 101 is wirelessly linked with the server via the communication module, and the server is configured to store the credentials of the selected candidate. The imaging unit 102 performs facial recognition of the candidate. The microphone 103 and speaker 104 are operatively coupled with the microcontroller. The microcontroller takes input from the microphone 103 and imaging unit 102 and generates the output through the speaker 104, thereby enabling interaction with the candidate. Also, the microcontroller is configured in the multi-lingual configuration, interacting with candidates in multiple languages as per the candidate’s requirement. The microcontroller relays greetings via the speaker 104 in case facial recognition is matched, and generates the alert to the concerned authority for updating the profile of the candidate in case facial recognition is not matched. The OCR (optical character recognition) scanner 105 scans the documents of the candidates. The scanned documents are digitally stored in the server in sync with the candidate’s credentials. The microcontroller correlates the scanned documents of the candidate with the central database stored within the server to cross-verify whether the details provided by the candidate during the selection process match the documents. Thereafter, the at least two L-shaped rods 106, with grippers 107, flip the documents while scanning. The microcontroller is configured to highlight one or more forms and annexures over the display panel 111 to be digitally signed. The forms and annexures are synced with the user’s credentials and stored within the server. Afterwards, the compartment 108 is stored with the plurality of joining kits and secured with the motorized lid 109. On successful uploading and verification of documents, the lid 109, along with the inbuilt pusher 110, ejects one of the joining kits out of the compartment 108.
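The cross-verification step in the workflow above can be sketched as below. The field names and the matching rule (case-insensitive exact match after trimming whitespace) are illustrative assumptions about how the microcontroller might correlate OCR output with the server record.

```python
# Illustrative sketch of the document cross-verification step: fields
# extracted by the OCR scanner are compared with the candidate record held
# in the server's central database. Field names and the matching rule are
# hypothetical.

def cross_verify(ocr_fields, server_record):
    """Return a per-field match report between scanned document fields and
    the credentials stored on the server."""
    report = {}
    for field, stored in server_record.items():
        scanned = ocr_fields.get(field, "")
        # Case-insensitive, whitespace-trimmed comparison (assumed rule).
        report[field] = scanned.strip().lower() == stored.strip().lower()
    return report

record = {"name": "A. Candidate", "dob": "2000-01-01"}  # server-side credentials
scan = {"name": "a. candidate", "dob": "2000-01-02"}    # OCR-extracted fields
print(cross_verify(scan, record))  # {'name': True, 'dob': False}
```

Any `False` entry in the report would correspond to the mismatch case in which verification fails and the joining kit is not ejected.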

[0052] In continuation, the touch interactive display panel 111 provides the option of selecting one or more departments or wings within the office premises. On selection of the corresponding department, the microcontroller fetches coordinates via the indoor navigation system and navigates the candidate towards the department in the first, second and third manner. The first manner includes display of the map towards the designated department from the initial location of the candidate. The second manner refers to the AR (augmented reality) unit 112 interconnected with the microcontroller to show the augmented path towards the department. The third manner refers to activation of one or more motorized wheels 113 installed with the bot 101 to move towards the department for navigating the candidate. Further, the screening module, operated by one or more processors linked with the microcontroller, comprises the plurality of cameras configured to process the dressing pattern, behavior, and interaction with team members; the sensor configured to detect and categorize the screen time and idle time of the candidate; and the user interface employed by the concerned authority. Furthermore, in case of mismatch of the behavior, interaction, dressing pattern or idle time with the threshold value, the processor indirectly transmits the alert over the user interface.
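The dispatch among the three navigation manners can be sketched as follows. The department names, coordinates, and mode labels are hypothetical; the coordinate lookup stands in for the indoor navigation system described above.

```python
# Illustrative sketch of the three navigation manners: after a department is
# selected on the display panel, its coordinates are fetched and the
# candidate is guided by map display, AR path, or by driving the bot's
# motorized wheels. All names and coordinates below are hypothetical.

DEPARTMENTS = {"HR": (12, 4), "IT": (3, 9)}   # assumed indoor (x, y) coordinates

def navigate(department, manner):
    """Return the action taken for the chosen department and manner."""
    coords = DEPARTMENTS[department]          # indoor navigation lookup
    if manner == "map":
        return f"Displaying map route to {department} at {coords}"
    if manner == "ar":
        return f"Rendering AR path to {department} at {coords}"
    if manner == "wheels":
        return f"Driving bot to {department} at {coords}"
    raise ValueError(f"unknown manner: {manner}")

print(navigate("HR", "ar"))  # Rendering AR path to HR at (12, 4)
```

A real implementation would plan a path rather than return a string, but the three-way branching mirrors the first, second, and third manners of the specification.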

[0053] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) A candidate onboarding assistance system, comprising:

i) a bot 101 wirelessly linked with a server via a communication module, the server is configured to store credentials of a selected candidate;
ii) an imaging unit 102 installed over the bot 101 and operatively coupled with a microcontroller to perform facial recognition of the candidate;
iii) a microphone 103 and speaker 104, operatively coupled with the microcontroller, wherein the microcontroller takes input from the microphone 103 and imaging unit 102 and generates an output through the speaker 104, enabling interaction with the candidate;
iv) an OCR (optical character recognition) scanner 105 installed over the bot 101 and linked with the microcontroller, configured to scan the documents of the candidates, wherein the scanned documents are digitally stored in the server in sync with the candidate’s credentials;
v) at least two L-shaped rods 106, with grippers 107 installed as an end effector for flipping the documents while scanning;
vi) a compartment 108 stored with a plurality of joining kits and secured with a motorized lid 109, wherein, on successful uploading and verification of documents, the microcontroller activates the lid 109, along with an inbuilt pusher 110 to eject one of the joining kits out of the compartment 108;
vii) a touch interactive display panel 111 integrated over the bot 101 to provide an option of selecting one or more departments or wings within an office premises, wherein, on selection of a corresponding department, the microcontroller fetches coordinates via an indoor navigation system and navigates the candidate towards the department in a first, second and third manner;
viii) a screening module operated by one or more processors linked with the microcontroller, the screening module comprising:
a. a plurality of cameras configured to process a dressing pattern, behavior, and interaction with team members;
b. a sensor configured to detect and categorize screen time and idle time of the candidate;
c. a user interface employed by a concerned authority, wherein, in case of mismatch of the behavior, interaction, dressing pattern or idle time with a threshold value, the processor indirectly transmits an alert over the user interface.

2) The system as claimed in claim 1, wherein the microcontroller relays greetings via the speaker 104 in case facial recognition is matched and generates an alert to the concerned authority for updating the profile of the candidate in case facial recognition is not matched.

3) The system as claimed in claim 1, wherein the microcontroller is configured in a multi-lingual configuration, interacting with the candidates in multiple languages as per the candidate’s requirement.

4) The system as claimed in claim 1, wherein the first manner includes display of a map towards the designated department from the initial location of the candidate.

5) The system as claimed in claim 1, wherein the second manner refers to an AR (augmented reality) unit 112 interconnected with the microcontroller to show an augmented path towards the department.

6) The system as claimed in claim 1, wherein the third manner refers to activation of one or more motorized wheels 113 installed with the bot 101 to move towards the department for navigating the candidate.

7) The system as claimed in claim 1, wherein the microcontroller is configured to highlight one or more forms and annexures over the display panel 111, to be digitally signed, wherein the forms and annexures are synced with the user’s credentials and stored within the server.

8) The system as claimed in claim 1, wherein the microcontroller correlates the scanned documents of the candidate with a central database stored within the server to cross-verify whether the details provided by the candidate during the selection process match the documents.

Documents

Application Documents

# Name Date
1 202521040540-STATEMENT OF UNDERTAKING (FORM 3) [26-04-2025(online)].pdf 2025-04-26
2 202521040540-REQUEST FOR EXAMINATION (FORM-18) [26-04-2025(online)].pdf 2025-04-26
3 202521040540-REQUEST FOR EARLY PUBLICATION(FORM-9) [26-04-2025(online)].pdf 2025-04-26
4 202521040540-PROOF OF RIGHT [26-04-2025(online)].pdf 2025-04-26
5 202521040540-POWER OF AUTHORITY [26-04-2025(online)].pdf 2025-04-26
6 202521040540-FORM-9 [26-04-2025(online)].pdf 2025-04-26
7 202521040540-FORM FOR SMALL ENTITY(FORM-28) [26-04-2025(online)].pdf 2025-04-26
8 202521040540-FORM 18 [26-04-2025(online)].pdf 2025-04-26
9 202521040540-FORM 1 [26-04-2025(online)].pdf 2025-04-26
10 202521040540-FIGURE OF ABSTRACT [26-04-2025(online)].pdf 2025-04-26
11 202521040540-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [26-04-2025(online)].pdf 2025-04-26
12 202521040540-EVIDENCE FOR REGISTRATION UNDER SSI [26-04-2025(online)].pdf 2025-04-26
13 202521040540-EDUCATIONAL INSTITUTION(S) [26-04-2025(online)].pdf 2025-04-26
14 202521040540-DRAWINGS [26-04-2025(online)].pdf 2025-04-26
15 202521040540-DECLARATION OF INVENTORSHIP (FORM 5) [26-04-2025(online)].pdf 2025-04-26
16 202521040540-COMPLETE SPECIFICATION [26-04-2025(online)].pdf 2025-04-26
17 Abstract.jpg 2025-05-14
18 202521040540-FORM-26 [03-06-2025(online)].pdf 2025-06-03