Abstract: An autonomous student assistance and navigation device for educational institutions, comprising a body 101 with motorized omnidirectional wheels 102 for navigating educational institutions, a touch-enabled display panel 103 that gathers admission details, a navigation unit 104 that enables real-time mapping and path planning, an articulated arm with an electromagnetic spring that enables elevator button operation for multi-floor transit, a holographic display unit 107 that displays institutional information, an artificial intelligence-based imaging unit 108 that analyses student facial expressions, a pair of motorized sliders with plates and a motorized scissor arrangement 111 that aid in transporting students with mobility issues, a secured printing unit 112 with a slit-based drawer arrangement 114 that provides authenticated document retrieval, a retractable luggage tray 115 deployed via a folding arrangement 116 for transporting student belongings, a retractable directional microphone pod 117 to capture student voice accurately, and a QR (quick response) code scanner 118 for scanning student IDs and/or room-specific codes.
Description:
FIELD OF THE INVENTION
[0001] The present invention relates to an autonomous student assistance and navigation device for educational institutions that is capable of providing comprehensive support to students through navigation, information dissemination, well-being monitoring, and physical assistance, thereby enhancing their overall campus experience and accessibility.
BACKGROUND OF THE INVENTION
[0002] The increasing size and complexity of educational institutions present significant navigation and information access challenges for students, particularly new enrollees, those with disabilities, or individuals seeking specific resources. Locating classrooms, administrative offices, and libraries, and accessing timely information about events, academic procedures, and support services can be time-consuming and frustrating. Students also face difficulties in managing their belongings, navigating crowded spaces, or seeking help in cases of distress or mobility issues. The requirement for an autonomous student assistance and navigation device arises from the need to streamline these processes, enhance accessibility, and provide personalized support, ultimately fostering a more efficient and inclusive learning environment by addressing these user-centric challenges.
[0003] Currently available similar devices often include static campus maps or basic digital directories accessible through kiosks or mobile applications. These devices typically lack real-time dynamic guidance, personalized assistance, and the ability to physically aid students with mobility issues or carrying belongings. While some institutions utilize wayfinding apps, these often suffer from accuracy limitations indoors, lack integration with elevators or other accessibility features, and do not offer proactive support for student well-being or document retrieval. Existing educational robots primarily focus on teaching or social interaction and do not encompass the comprehensive navigation, assistance, and support functionalities envisioned in this invention, particularly the physical assistance and well-being monitoring aspects.
[0004] US5363305A discloses a navigation system for a mobile autonomous robot that includes apparatus for creating and maintaining a map of an environment the mobile autonomous robot is to traverse, including provision for storing in a map, at an assigned location, features representative of geometric beacons located in the environment. Because of the uncertainty in its sensor's operating conditions and changes in the environment, a credibility measure is associated with each map feature stored. This credibility measure is increased or decreased whenever the map feature assigned to a location matches or does not match, respectively, a geometric beacon corresponding to such location. Whenever a geometric beacon is observed for a location that does not match a previously stored map feature, an appropriate map feature is added for such location.
[0005] US7243024B2 discloses a computer-implemented wayfinding system and method, wherein the system includes an interconnected network of computer entities capable of transmitting and receiving audio and visual outputs and inputs to users in order to guide them through an environment. In one aspect of the wayfinding system, a series of kiosks are interconnected, with at least one kiosk being capable of printing a barcode that provides destination information inputted by the user. The kiosks may be located at designated locations of the environment, so that a user searching for his or her destination may scan their barcode, and the kiosk will provide audio and visual directions to the destination. The kiosks know their designated locations in the environment, and thus, may provide the most direct route to any other point in the environment.
[0006] Conventionally, many devices have been available for assisting students through navigation, information dissemination, well-being monitoring, and physical assistance. However, these existing devices fail to provide a comprehensive integration of these functionalities within a single autonomous mobile platform tailored specifically for the dynamic environment of educational institutions, often providing only isolated solutions that do not address the multifaceted needs of students in a cohesive and proactive manner.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a device capable of providing autonomous navigation within educational institutions, offering seamless access to information and resources, proactively monitoring student well-being, and providing physical assistance for enhanced accessibility and support, all within a single, integrated mobile platform.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a device that is capable of achieving self-directed movement throughout an educational institution for enabling the device to accompany and aid students in various locations within the campus environment.
[0010] Another object of the present invention is to develop a device that is capable of presenting information to students and receiving their input for facilitating seamless communication and access to device functionalities.
[0011] Another object of the present invention is to develop a device that is capable of implementing mapping and pathfinding capabilities, allowing the device to navigate indoor spaces and guide students efficiently to their desired destinations.
[0012] Yet another object of the present invention is to develop a device that is capable of analyzing student expressions and cues, providing insights into their well-being and enabling proactive assistance.
[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0014] The present invention relates to an autonomous student assistance and navigation device for educational institutions that is capable of providing guidance, interactive information access, proactive well-being monitoring, and physical support to enhance the student experience and accessibility within the campus environment.
[0015] According to an embodiment of the present invention, an autonomous student assistance and navigation device for educational institutions comprises a body configured with a plurality of motorized wheels at a bottom portion to enable multi-directional movement inside an educational institution, the body being assigned to assist a group of students by navigating, advising, and interacting with the students, a touch-enabled display panel mounted on the body for interacting with students, wherein upon activation during admission, the display panel presents a questionnaire to receive student details, a microcontroller linked with the display panel that determines the most suitable academic stream for the student based on entered credentials and provides real-time suggestions for alternate departments if eligibility is not met, a navigation unit comprising a LiDAR (Light Detection and Ranging) sensor and a Simultaneous Localization and Mapping (SLAM) module for real-time indoor mapping and dynamic path planning, a pair of articulated arms assembled on the body and integrated with electromagnetic springs to press elevator buttons, enabling the body to travel across building floors while accompanying student(s), a holographic display unit configured on the body to project visual representations of university history, campus events, or classroom layouts, an artificial intelligence-based imaging unit installed on the body for capturing facial expressions and emotional cues of students, and a pair of motorized sliders and plates arranged at the body's base and linked via a motorized scissor arrangement to extend and lift the plate to an optimum height, so as to safely transport the student to the desired location.
[0016] According to another embodiment of the present invention, the device further comprises a secured printing unit enclosed within a chamber inside the body to print and dispense institute-related documents via a slit-based drawer arrangement crafted on the body for ensuring authenticated retrieval of requested documents, a retractable luggage tray integrated within the body for transporting student belongings, including books or laptops, a folding arrangement to provide portability support for disabled or burdened students, a retractable directional microphone pod arranged at the top of the body and configured with a beamforming microphone array, wherein the pod extends towards the user's face to capture voice accurately in noisy environments and supports voiceprint recognition for personalized interactions, a natural language interface configured with the microphone to enable user interaction in multiple languages and synchronized with a database to display course-specific queries, finance-related suggestions, and student loan options with EMI calculations based on user inputs, a trained image recognition module configured with the arm to interpret various elevator panel layouts for autonomous interaction with multi-floor building environments, a QR (quick response) code scanner provided on the body for scanning student IDs and/or room-specific codes, and a battery associated with the device for supplying power to electrical and electronically operated components associated with the device.
[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of an autonomous student assistance and navigation device for educational institutions.
DETAILED DESCRIPTION OF THE INVENTION
[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0022] The present invention relates to an autonomous student assistance and navigation device for educational institutions that is capable of navigating campus environments, providing interactive assistance and information, monitoring student well-being, and offering physical support to enhance the overall student experience and accessibility.
[0023] Referring to Figure 1, an isometric view of an autonomous student assistance and navigation device for educational institutions is illustrated, comprising a body 101 configured with a plurality of motorized wheels 102 at a bottom portion of the body 101, a touch-enabled display panel 103 mounted on the body 101, a navigation unit 104 installed on the body 101, a pair of articulated arms 105 assembled on the body 101 and integrated with electromagnetic springs 106, a holographic display unit 107 configured on the body 101, and an artificial intelligence-based imaging unit 108 installed on the body 101.
[0024] Figure 1 further illustrates a pair of motorized sliders 109 and plates 110 arranged at the body's 101 base and linked via a motorized scissor arrangement 111, a secured printing unit 112 enclosed within a chamber 113 inside the body 101, a slit-based drawer arrangement 114 crafted on the body 101, a retractable luggage tray 115 integrated within the body 101, a folding arrangement 116 configured with the tray 115, a retractable directional microphone pod 117 arranged at the top of the body 101, and a QR (quick response) code scanner 118 provided on the body 101.
[0025] The device disclosed herein includes a body 101 that is developed to be positioned on a surface in an educational institution. The body 101 herein incorporates all the components of the device required for assisting a group of students by navigating, advising, and interacting with the students.
[0026] The body 101 is installed with a push button, accessed by the students to activate the device for performing the required operations. When the user presses the push button, the electrical circuit is completed, which in turn switches the device on. The push button is integrated with an actuator and a spring, which are automatically activated when pressed. They work together to move the internal contact, completing the circuit and allowing electrical current to flow, thereby activating an inbuilt microcontroller.
[0027] The microcontroller associated with the device is pre-fed to detect the signal and actuate/activate the required component of the device. The microcontroller used herein is pre-fed using artificial intelligence and machine learning protocols to coordinate the working of the device. Further, the microcontroller activates a touch-enabled display panel 103 installed on the body 101 to interact with the students during admission by displaying a pre-fed questionnaire to receive student details.
[0028] The touch-enabled display panel 103, commonly known as a touchscreen, serves as an electronic visual interface capable of detecting the presence and location of a touch within the panel's display area. In a typical configuration, this panel comprises several layers, including a display layer, a touch sensor, and a controller.
[0029] The display layer, which is typically a Liquid Crystal Display (LCD) screen, forms the outermost layer and is responsible for presenting visual information to the user. Situated beneath the display layer is the touch sensor, often made of a transparent conductive material such as indium tin oxide (ITO). This touch sensor is structured as a grid of rows and columns, with each intersection defining a unique touch point.
[0030] When the students interact with the display by touching the display panel 103, their finger acts as a conductor, allowing electricity to flow through the point of contact. This physical touch causes a change in the electrical signal at that specific location, a change that is detected by the touch sensor. The touch sensor then transmits this information to a touch controller IC (Integrated Circuit). This controller is responsible for processing the analog signals generated by the touch input, effectively deciphering details regarding the touch location and characteristics (such as pressure or duration).
[0031] The touch controller IC is typically connected to the microcontroller embedded within the device through various interfaces, including but not limited to SPI (Serial Peripheral Interface) or I2C (Inter-Integrated Circuit). Upon interpreting the input commands, the touch controller forwards these inputs to the microcontroller in the form of electrical signals. The controller also plays a crucial role in filtering out any noise or interference present in the signal, ensuring accurate touch detection.
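By way of a non-limiting illustration, the grid-scan principle described above may be sketched as follows; the function, baseline value, and readings are all hypothetical, standing in for the analog front end of an actual touch controller:

```python
# Hypothetical sketch: a touch is reported at the row/column intersection
# whose capacitance reading deviates most from its untouched baseline.

BASELINE = 100  # nominal reading of an untouched intersection (hypothetical)

def locate_touch(grid_readings, threshold=20):
    """Return (row, col) of the strongest touch, or None if no touch."""
    best, best_delta = None, threshold
    for r, row in enumerate(grid_readings):
        for c, reading in enumerate(row):
            delta = abs(reading - BASELINE)
            if delta >= best_delta:
                best, best_delta = (r, c), delta
    return best

# A finger at row 1, column 2 raises the local reading well above baseline.
readings = [
    [100, 101,  99, 100],
    [100, 102, 160, 101],
    [ 99, 100, 100, 100],
]
print(locate_touch(readings))  # (1, 2)
```

In a real controller the raw readings would be debounced and interpolated between intersections to yield sub-cell resolution; the threshold here plays the role of the noise filtering described above.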
[0032] After receiving the signals representing the user's inputs from the display panel 103, the microcontroller processes these signals using real-time student data inputs, historical admission trends, and predictive machine learning models in order to determine the most suitable academic stream for the student, and provides real-time suggestions for alternate departments in the event that the student's eligibility criteria are not met for their initial choice; the results are displayed on the display panel 103 to enable the student to choose an eligible department.
[0033] Once the student has opted for a department via the display panel 103, the microcontroller activates a navigation unit 104 installed on the body 101 and equipped with a LiDAR (Light Detection and Ranging) sensor and a Simultaneous Localization and Mapping (SLAM) module to enable real-time indoor mapping and dynamic path planning to guide the students in the institution.
[0034] The LiDAR sensor operates by emitting rapid pulses of infrared light and then measuring the time it takes for these pulses to return after reflecting off surrounding objects. The LiDAR sensor includes a laser emitter, a scanner (which directs the laser pulses across the environment), and a detector that measures the reflected light. This time-of-flight measurement allows the microcontroller to calculate the distance to numerous points in the environment, creating a dense three-dimensional point cloud representing the physical space. This point cloud captures the geometry and spatial relationships of walls, obstacles, and other features within the indoor environment.
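As a non-limiting illustration, the time-of-flight computation reduces to distance = (speed of light × round-trip time) / 2. The sketch below (a simplified two-dimensional case with hypothetical names) converts a scan of angle/time samples into points:

```python
import math

C = 299_792_458.0  # speed of light in air (approx.), m/s

def tof_to_distance(round_trip_s):
    """Distance to a target from a pulse's round-trip time of flight."""
    return C * round_trip_s / 2.0

def scan_to_points(scan):
    """Convert (angle_rad, round_trip_s) samples into 2D (x, y) points."""
    return [(tof_to_distance(t) * math.cos(a), tof_to_distance(t) * math.sin(a))
            for a, t in scan]

# A pulse returning after ~33.36 ns corresponds to a surface ~5 m away.
d = tof_to_distance(2 * 5.0 / C)
print(round(d, 6))  # 5.0
```

A real scanner accumulates many such points per revolution (and a vertical axis for 3D), yielding the dense point cloud described above.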
[0035] Simultaneously, the SLAM module takes the continuous stream of data from the LiDAR sensor and performs two crucial tasks concurrently: building a map of the unknown environment and simultaneously determining the device's own location within that map. The SLAM protocols achieve this by analyzing the features within the LiDAR point clouds, such as corners, edges, and planar surfaces, and matching these features across successive scans. By identifying how these features move relative to the sensor over time, the SLAM module estimates the device's trajectory and refines the map being built.
[0036] The SLAM module comprises sophisticated protocols running on a processing unit that analyze the incoming sensor data, perform feature extraction and matching, and optimize the map and the device's pose through techniques like Kalman filtering or graph optimization. The resulting real-time map and the device's precise localization within it enable the microcontroller to dynamically plan an optimal path for the student to their chosen department, adapting to any changes or obstacles encountered along the way.
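Practical SLAM and path-planning protocols are considerably more elaborate, but the path-planning step over a finished map can be illustrated, in a non-limiting manner, by a plain breadth-first search on a hypothetical occupancy grid (0 = free cell, 1 = obstacle):

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest path on a 4-connected occupancy grid via breadth-first
    search; returns the list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk parents back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall forces a detour through the right column
        [0, 0, 0]]
print(plan_path(grid, (0, 0), (2, 0)))
```

Replanning on the updated map whenever SLAM reports a new obstacle yields the dynamic behaviour described above; production systems would typically use A* with a distance heuristic instead of plain breadth-first search.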
[0037] Once the building layout is generated, the microcontroller activates a retractable directional microphone pod 117 installed at the top of the body 101 and configured with a beamforming microphone array, which extends towards the student's face to enable the student to input voice commands regarding campus navigation. The retractable directional microphone pod 117 extends using a miniature motor or a solenoid actuator connected to a set of gears. Upon receiving a signal from the microcontroller, the motor or solenoid is activated. This causes the gears to turn, which in turn pushes the microphone pod 117 outwards and upwards, positioning it closer to the student's face. The microcontroller controls the motor or solenoid to hold the pod 117 securely in place at the specific point. When voice input is no longer required, the microcontroller sends another signal to reverse the motor or solenoid, retracting the pod 117 back into its housing.
[0038] The beamforming microphone array utilizes multiple microphones working in concert to capture sound from a specific direction while minimizing noise and sounds from other directions. Each microphone contains a small diaphragm connected to a moving coil. When sound waves from the user strike the diaphragm, the coil vibrates, moving back and forth in the magnet's field and generating an electrical signal; these signals are then processed using sophisticated protocols. These protocols analyze the time delays and phase differences of the sound waves as they arrive at each microphone.
[0039] By combining these signals through techniques like delay-and-sum, the pod 117 constructively reinforces sound waves originating from the desired direction, effectively amplifying them. Conversely, sound waves arriving from other angles experience destructive interference, leading to their attenuation. This electronic steering creates a focused beam of sensitivity, allowing the array to isolate a specific sound source, such as a student's voice, even in a noisy environment. The corrected signal is sent to the microcontroller, which further processes the student's input voice commands.
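As a non-limiting illustration, the delay-and-sum technique can be sketched with whole-sample delays; a real array would use fractional delays and adaptive weighting, and all names here are hypothetical:

```python
def delay_and_sum(channels, delays):
    """Align each channel by its known arrival delay (in whole samples)
    for the desired direction, then average, reinforcing the on-axis
    source while off-axis sounds add incoherently."""
    n = len(channels[0])
    out = []
    for i in range(n):
        acc = 0.0
        for ch, d in zip(channels, delays):
            j = i + d  # undo this microphone's d-sample arrival delay
            acc += ch[j] if 0 <= j < n else 0.0
        out.append(acc / len(channels))
    return out

# An impulse from the target direction reaches mic 0 first, mic 1 one
# sample later, mic 2 two samples later; aligning restores the impulse.
mics = [
    [0, 0, 1, 0, 0, 0],
    [0, 0, 0, 1, 0, 0],
    [0, 0, 0, 0, 1, 0],
]
print(delay_and_sum(mics, [0, 1, 2]))  # [0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
```

The per-microphone delays are chosen from the array geometry and the steering direction, which is how the "electronic steering" described above is realized.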
[0040] A natural language interface (NLI) integrated with the microphone enables multilingual user interaction by combining speech recognition with natural language processing capabilities for various languages. When the student speaks into the microphone, the interface first employs multilingual Automatic Speech Recognition (ASR) to transcribe the spoken audio into text. This ASR is trained on acoustic models for multiple languages, allowing it to identify the language being spoken and accurately convert the speech into its written form. Once the speech is transcribed, the NLI's multilingual Natural Language Processing (NLP) engine takes over. This engine, equipped with language-specific models and rules, analyzes the text to understand its grammatical structure, meaning, and the user's intent, regardless of the language used. Techniques like tokenization, parsing, semantic analysis, and intent recognition are applied, tailored to the identified language.
[0041] If the interface needs to respond, its multilingual Natural Language Generation (NLG) module constructs an answer in the same language, ensuring grammatically correct and contextually appropriate phrasing. In an embodiment of the present invention, for a spoken response, a multilingual Text-to-Speech (TTS) module synthesizes the generated text into natural-sounding speech in the student's language, which is then output through a speaker. This seamless integration of multilingual speech recognition, natural language understanding, and speech synthesis allows students to interact with the device using their preferred language through spoken commands and receive spoken responses in the same language. In the event that the student inputs voice commands via the microphone concerning course-specific queries, finance-related suggestions, and student loan options with EMI calculations, the microcontroller accesses a linked database to calculate these parameters and displays the result on the display panel 103.
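For the EMI calculations mentioned above, the standard equated monthly instalment formula is EMI = P·r·(1+r)^n / ((1+r)^n − 1), where P is the principal, r the monthly interest rate, and n the number of instalments. A non-limiting sketch (function name and sample figures are hypothetical):

```python
def monthly_emi(principal, annual_rate_pct, months):
    """Equated monthly instalment for a loan of `principal` repaid over
    `months` instalments at a nominal annual rate of `annual_rate_pct`%."""
    r = annual_rate_pct / 12.0 / 100.0  # monthly interest rate
    if r == 0:
        return principal / months       # interest-free: straight division
    factor = (1 + r) ** months
    return principal * r * factor / (factor - 1)

# e.g. a 100,000 loan at 12% annual interest repaid over 12 months
print(round(monthly_emi(100_000, 12, 12), 2))
```

The linked database would supply the rate and tenure options per lender; only the arithmetic is shown here.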
[0042] After processing the student's voice commands, the microcontroller actuates multiple motorized wheels 102 (preferably four to six) installed at the bottom portion to enable multi-directional movement inside the educational institution towards the student-desired location/department. Each motorized wheel 102 comprises a wheel coupled with a motor via a shaft, wherein upon the motor receiving a command from the microcontroller, it rotates in a clockwise or anti-clockwise direction to drive the wheel 102 via the shaft. The wheels 102 thus maneuver the body 101 towards the chosen department.
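As a non-limiting illustration, and assuming mecanum-style omnidirectional wheels with one common sign convention (the disclosure does not specify the wheel type or geometry; all parameter values below are hypothetical), the per-wheel speeds for a desired body motion may be sketched as:

```python
def wheel_speeds(vx, vy, wz, lx=0.2, ly=0.15, r=0.05):
    """Inverse kinematics for four mecanum wheels: map desired body motion
    (vx forward m/s, vy leftward m/s, wz yaw rad/s) to wheel angular
    speeds (front-left, front-right, rear-left, rear-right), rad/s.
    lx/ly are half the wheelbase/track; r is the wheel radius."""
    k = lx + ly
    return (
        (vx - vy - k * wz) / r,  # front-left
        (vx + vy + k * wz) / r,  # front-right
        (vx + vy - k * wz) / r,  # rear-left
        (vx - vy + k * wz) / r,  # rear-right
    )

# Pure forward motion drives all four wheels at the same speed.
print(wheel_speeds(1.0, 0.0, 0.0))  # (20.0, 20.0, 20.0, 20.0)
```

Sideways translation and rotation in place fall out of the same four equations with differing signs, which is what gives the platform its multi-directional movement.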
[0043] While moving towards the department, the microcontroller activates an artificial intelligence-based imaging unit 108 installed on the body 101 to capture multiple images in proximity of the body 101 and analyze facial expressions and emotional cues, assessing the student's well-being and detecting potential indicators of emotional distress or harassment. The imaging unit 108 comprises an image capturing module including a set of lenses that captures multiple images of the body's 101 surroundings, and the captured images are stored within the memory of the imaging unit 108 in the form of optical data.
[0044] The imaging unit 108 also comprises a processor programmed with artificial intelligence protocols, such that the processor processes the optical data and extracts the required data from the captured images. The extracted data is further converted into digital pulses and bits and transmitted to the microcontroller. The microcontroller processes the received data to make a determination regarding the student's well-being and identify any potential emotional distress or harassment incidents. Upon such detection, the microcontroller raises a formal complaint to the institution authorities.
[0045] A pair of articulated arms 105 is integrated with the body 101 and incorporates electromagnetic springs 106. These arms 105 are controlled by the microcontroller, in synchronization with the imaging unit 108. In scenarios requiring vertical transportation via an elevator, the microcontroller commands the articulated arms 105 to precisely engage and depress the elevator control buttons. This functionality enables the device to navigate between different building floors while physically accompanying a student or students.
[0046] When the device approaches an elevator, the microcontroller, potentially guided by its mapping and localization data, determines the location of the desired button. The microcontroller then actuates a pair of articulated arms 105 integrated with the body 101 and incorporating electromagnetic springs 106 to precisely engage and depress the elevator control buttons. The articulated arms 105 comprise of multiple joints and segments connected through rotary and/or prismatic joints, forming a multi-degree-of-freedom mechanical arm.
[0047] In an embodiment of the present invention, a motor associated with these arms 105 is activated by the microcontroller's signal, causing it to position the fingertips (or specialized end-effectors) of the arms 105 in front of the button. As the arms 105 extend towards the button, the electromagnetic springs are actively controlled to provide a gentle but firm pressing force. Instead of a purely rigid movement that could damage the button or the device, the electromagnetic springs allow for a degree of compliance, ensuring a reliable press even if the alignment is not perfectly exact.
[0048] The electromagnetic spring operates by using an electromagnetic field to control its expansion and contraction. The microcontroller monitors the current flowing through the electromagnetic springs 106 to sense the force being applied, allowing for feedback and adjustment to ensure the button is adequately depressed without excessive force. Once the button press is registered via the imaging unit 108, the microcontroller retracts the arms 105, again using the motors and a controlled release of force through the electromagnetic springs to ensure smooth movement. This enables interaction with elevator buttons in a precise, reliable, and safe manner.
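The current-based force feedback described above can be illustrated, in a non-limiting manner, by a simple proportional loop in which the press force is modelled as proportional to coil current; the gains and constants below are hypothetical, not taken from the disclosure:

```python
def press_until(target_force_n, k_force_per_amp=5.0, gain=0.2, steps=50):
    """Closed-loop sketch: adjust coil current until the modelled press
    force (assumed proportional to current) reaches the target, so the
    button is depressed without excessive force."""
    current = 0.0
    for _ in range(steps):
        force = k_force_per_amp * current      # modelled actuator force
        error = target_force_n - force         # sensed via coil current
        current += gain * error / k_force_per_amp
    return k_force_per_amp * current           # final applied force, N

# Converges geometrically onto a 3 N press target.
print(round(press_until(3.0), 3))  # 3.0
```

Each iteration shrinks the force error by the loop gain, which is the feedback-and-adjustment behaviour the paragraph above attributes to the microcontroller.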
[0049] The articulated arms 105 utilize a trained image recognition module that is activated by the microcontroller to interpret various elevator panel layouts for autonomous interaction with multi-floor building environments. The trained image recognition module enables autonomous elevator interaction by first capturing an image of the elevator panel using an integrated camera. This image is then fed into a pre-trained neural network, a sophisticated type of artificial intelligence protocol. This network has been trained on a vast dataset of images depicting various elevator panel designs, including different button arrangements, numbering schemes, and labeling conventions.
[0050] During the training process, the network learns to identify key visual features associated with each button and their corresponding functions (e.g., floor numbers, door open/close, alarm). When a new image of an elevator panel is presented, the trained network analyzes its pixel patterns, identifying edges, shapes, and textures that correspond to the learned features. By comparing these identified features with its internal knowledge base, the module accurately determines the location and function of each button on the specific panel it is currently viewing. The output of this process is a structured interpretation of the elevator panel layout, providing the microcontroller with the precise coordinates and functions of each button, thus allowing the articulated arms 105 to be accurately controlled to press the desired buttons for autonomous floor navigation.
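Once the recognizer has produced such a structured interpretation, selecting press coordinates for the arm reduces to a lookup. The non-limiting sketch below assumes a hypothetical output format of (label, coordinates) pairs; the actual module's output format is not specified in this disclosure:

```python
def find_button(panel_layout, target_label):
    """panel_layout: (label, (x_mm, y_mm)) pairs as produced by the
    recognizer; returns the coordinates the arm should press for the
    requested label, or None if the panel lacks that button."""
    for label, coords in panel_layout:
        if label == target_label:
            return coords
    return None

# Hypothetical recognizer output for a small panel
panel = [("1", (20, 30)), ("2", (60, 30)), ("3", (20, 70)),
         ("open", (60, 70)), ("alarm", (40, 110))]
print(find_button(panel, "3"))  # (20, 70)
```

The returned coordinates would then be transformed from the camera frame into the arm's frame before the press is commanded.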
[0051] The device incorporates a pair of motorized sliders 109 and plates 110 at the body 101 base, interconnected by a motorized scissor arrangement 111. Upon the imaging unit 108 detecting a student injury or mobility issue, the microcontroller activates the sliders 109 and the scissor arrangement 111, which work in coordination to extend and elevate the plate to an optimal height, providing a platform for safely transporting the student to their desired location. The motorized sliders 109 typically consist of a motorized carriage attached to a rail, enabling the controlled linear movement of the plate. Upon actuation of the motorized slider by the microcontroller, the motor drives the carriage along the rail, facilitating a smooth and precise sliding motion of the plate.
[0052] Once the plate is successfully out, the microcontroller actuates the motorized scissor arrangement 111 to lift the plate, providing the platform to the student. The scissor arrangement 111 comprises bars linked in a scissor-like arrangement that is powered by a pneumatic unit associated with the arrangement, including an air compressor, air cylinders, air valves, and a piston, which work in collaboration to aid in extension and retraction of the arrangement. The pneumatic unit is operated by the microcontroller, such that the microcontroller actuates a valve to allow passage of compressed air from the compressor into the cylinder; the compressed air develops pressure against the piston, pushing and extending it. The piston is connected with the bars, and under the applied pressure the arrangement extends; similarly, the microcontroller retracts the arrangement by closing the valve, resulting in retraction of the piston. Thus, the microcontroller regulates the extension/retraction of the arrangement in order to elevate the plate to an optimum height, providing the platform for the injured student to sit on. In an embodiment of the present invention, the plate is fabricated with a cushioned member.
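As a non-limiting geometric illustration, the platform height of such a scissor arrangement follows directly from the link geometry: each crossed pair of links of length L whose feet are drawn to a horizontal separation s contributes √(L² − s²) of rise. All dimensions below are hypothetical:

```python
import math

def lift_height(n_pairs, link_len, feet_sep):
    """Platform height of an n-stage scissor lift: each crossed pair of
    links of length `link_len` whose feet are `feet_sep` apart rises
    sqrt(link_len^2 - feet_sep^2)."""
    if not 0 < feet_sep < link_len:
        raise ValueError("feet separation must lie between 0 and link length")
    return n_pairs * math.sqrt(link_len ** 2 - feet_sep ** 2)

# Two stages of 0.5 m links drawn in to 0.3 m feet separation lift 0.8 m.
print(round(lift_height(2, 0.5, 0.3), 6))  # 0.8
```

The pneumatic piston sets the feet separation, so the microcontroller can target a seating height by solving this relation in reverse.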
[0053] A secured printing unit 112, enclosed within a chamber 113 inside the body 101, is activated by the microcontroller when a student requests to print institute-related documents as specified via the display panel 103. Upon activation, the printing unit 112 receives digital information representing these documents from the microcontroller. The unit's internal controller processes this data, converting it into a format suitable for the printing unit 112. This printing unit 112 utilizes a positively charged printing drum and negatively charged toner. Toner particles are electrostatically attracted to the positively charged areas on the rotating printing drum that correspond to the image or text to be printed. Subsequently, a sheet of paper, often carrying a negative charge or passing through a charging corona wire to acquire a negative charge, is brought into contact with the drum. The negatively charged paper then attracts the positively charged toner particles from the drum, transferring the toner image/text onto its surface. Finally, a fuser unit, typically involving heated rollers, melts and presses the toner onto the paper, permanently adhering it.
[0054] Once a student requests an institute-related document through the display panel 103, the microcontroller actuates a slit-based drawer arrangement 114 crafted on the body 101 to dispense the document. The slit-based drawer arrangement 114 typically operates with a spring-loaded platform or a motor that pushes the printed document forward from a stack within the drawer. Once the document is printed by the internal printing unit 112, it is fed into the drawer. When the microcontroller initiates the dispense action, a motor or a release lever disengages a holding arrangement, allowing the spring-loaded platform beneath the stack of documents to push the topmost document towards a narrow slit or opening at the front of the drawer. Alternatively, a motorized roller or belt may actively feed the document towards the slit. The width of the slit is carefully designed to be just wide enough for a single sheet of paper to be easily grasped and pulled out by the student, preventing multiple documents from being dispensed simultaneously and ensuring a controlled and orderly retrieval process.
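The single-sheet constraint above can be sketched with a toy dispense routine; the sheet thickness and slit opening are assumed figures chosen only to show why the slit geometry admits one document at a time.

```python
SHEET_THICKNESS_MM = 0.10  # assumed paper thickness (illustrative)
SLIT_HEIGHT_MM = 0.15      # assumed slit opening: passes one sheet only


def dispense(stack):
    """Disengage the holding arrangement and push the topmost document
    through the slit; the slit geometry blocks a second sheet."""
    if SLIT_HEIGHT_MM >= 2 * SHEET_THICKNESS_MM:
        raise ValueError("slit opening would pass two sheets at once")
    return stack.pop() if stack else None  # topmost document, or nothing


queue = ["admission_form.pdf", "fee_receipt.pdf"]
print(dispense(queue))  # topmost document is released
print(len(queue))       # the rest remain held in the drawer
```

The guard expresses the design rule stated in the paragraph: the opening must be smaller than two sheet thicknesses so simultaneous dispensing is physically impossible.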
[0055] The device incorporates a retractable luggage tray 115 designed to assist students with transporting their belongings, such as books or laptops. This tray 115 utilizes a folding arrangement 116 that is actuated by the microcontroller. Activation of the tray 115 is triggered by the imaging unit’s 108 detection of a student with mobility issues or carrying a heavy load, offering portability support as needed. The folding arrangement 116 for the retractable luggage tray 115 typically employs hinged panels or interconnected segments.
[0056] When activated by the microcontroller, a small motor or actuator drives a series of gears or linkages. This motion unfolds the tray 115 from its compact, stowed position within the device's body 101 into a stable, horizontal platform. Support arms or a locking unit engage to ensure the tray 115 remains rigid and bears weight. The extended tray 115 provides a convenient surface for a student to place books, a laptop, or other belongings, reducing the physical burden of carrying them. The folding design allows the tray 115 to be neatly tucked away when not needed, maintaining the device's streamlined profile and maneuverability.
[0057] Upon detecting exam-related anxiety from a student’s facial expression via the imaging unit 108, the microcontroller suggests study resources available in the institution library and navigates the student to the respective location for access.
[0058] A QR (quick response) code scanner 118 is arranged on the body 101 and is activated by the microcontroller to scan student IDs and/or room-specific codes. The QR code scanner 118 works by illuminating the QR code with a light source, typically infrared or red light, and capturing the reflected light using a camera or an optical sensor. The scanner then analyzes the captured two-dimensional image, identifying the distinct black and white squares arranged in a specific pattern. Sophisticated algorithms within the scanner's processing unit locate the three characteristic finder patterns (large squares in three corners) that indicate the presence and orientation of a QR code. Once the code is located, the scanner maps the grid of black and white squares, assigning a binary value (0 or 1) to each square based on its color.
[0059] These binary values are then interpreted according to the QR code's encoding scheme, which can represent various types of data, including text, URLs, or, in this case, encoded information representing the student IDs or room-specific codes. The scanner essentially deciphers this binary data back into its original alphanumeric or symbolic form, allowing the microcontroller to quickly and accurately read and process the information embedded within the QR code for initiating personalized guidance and resource access based on the scanned data and role-based permissions.
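The finder-pattern location step mentioned above relies on a well-known property of QR codes: a scan line crossing a finder pattern produces black/white pixel runs in a 1:1:3:1:1 ratio. A minimal sketch of that test follows; the tolerance value and sample row are assumptions for illustration.

```python
def run_lengths(row):
    """Collapse a row of 0/1 pixels into consecutive run lengths."""
    runs, count = [], 1
    for prev, cur in zip(row, row[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return runs


def looks_like_finder(runs, tolerance=0.5):
    """Check five runs against the 1:1:3:1:1 finder-pattern ratio."""
    if len(runs) != 5:
        return False
    module = sum(runs) / 7.0  # the pattern spans 7 modules in total
    expected = [1, 1, 3, 1, 1]
    return all(abs(r - e * module) <= tolerance * module
               for r, e in zip(runs, expected))


# Sample scan line: 2 black, 2 white, 6 black, 2 white, 2 black pixels
row = [1, 1, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 1, 1]
print(looks_like_finder(run_lengths(row)))  # True
```

A full decoder would repeat this check across rows and columns to fix the code's position and orientation before mapping the module grid to bits.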
[0060] The device incorporates a holographic display unit 107 integrated into its body 101, which the microcontroller activates to project visual representations of university history, campus events, or classroom layouts. This functionality aims to improve student awareness and familiarity with campus facilities. The holographic display unit 107 projects visual representations using principles of light manipulation. The holographic display unit 107 operates by generating and projecting holograms, which are three-dimensional images formed through the interference of light waves. Initially, the laser light emitted from the holographic display unit 107 is divided into two beams: the object beam, which interacts with the digital representation of the university history, campus event, or classroom layout, and whose light waves are altered based on the subject's shape and features; and the reference beam, which remains unchanged. The altered object beam and the reference beam then intersect, creating an interference pattern. This pattern is recorded on a photosensitive surface, such as a holographic plate or within a spatial light modulator. The interference pattern encapsulates information about the phase and amplitude of the light waves, thus preserving the three-dimensional details of the subject. During projection, a laser beam is directed onto the recorded interference pattern, diffracting the laser light and reconstructing the original wave fronts from the object and the reference beams. The reconstructed wave fronts create a three-dimensional image that appears to float in space and provides students with an enhanced visual understanding of university history, campus events, or classroom layouts.
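The recording and reconstruction described above can be summarized, under standard holography assumptions, by the interference of the object wave $O$ and reference wave $R$:

```latex
I = |O + R|^2 = |O|^2 + |R|^2 + O R^{*} + O^{*} R
```

Re-illuminating the recorded pattern with the reference beam gives $R\,I = R\left(|O|^2 + |R|^2\right) + |R|^2 O + R^2 O^{*}$; the $|R|^2 O$ term reproduces the original object wave front, which is what the viewer perceives as the three-dimensional image floating in space.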
[0061] Lastly, a battery (not shown in figure) is associated with the device to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, namely a cathode and an anode. The battery uses oxidation/reduction chemical reactions to do work on charge and produce a voltage between the anode and cathode, thereby generating the electrical energy used to operate the device.
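The voltage produced by the redox couple mentioned above follows the standard electrochemical relation (a textbook identity, not a value from the specification):

```latex
E_{\text{cell}} = E^{\circ}_{\text{cathode}} - E^{\circ}_{\text{anode}}
```

The cell delivers useful work only while $E_{\text{cell}} > 0$, i.e. while the cathode's reduction potential exceeds the anode's.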
[0062] The present invention works best in the following manner, where the body 101 as disclosed in the invention assists students within educational institutions. Equipped with the plurality of motorized wheels 102, the body 101 is capable of multi-directional movement. The touch-enabled display panel 103 facilitates student interaction, administering questionnaires during admission to determine the most suitable academic stream based on entered credentials and suggesting alternate departments if eligibility criteria are unmet. The navigation unit 104, integrating the LiDAR sensor and SLAM module, enables real-time indoor mapping and dynamic path planning, responding to student audio queries with live building layouts and providing guidance to specified campus locations. The pair of articulated arms 105, integrated with the electromagnetic springs and actuated by the microcontroller in synchronization with the imaging unit 108, allows the body 101 to operate elevator buttons for multi-floor navigation while accompanying students. The holographic display unit 107 projects visual information such as university history, campus events, or classroom layouts. The artificial intelligence-based imaging unit 108 captures facial expressions and emotional cues to assess student well-being and detect emotional distress or harassment, triggering formal complaints to university authorities upon detection. The pair of motorized sliders 109 and plates 110, linked via the motorized scissor arrangement 111, extends and lifts the plate to safely transport students with injuries or mobility issues.
[0063] In continuation, the secured printing unit 112 in the chamber 113 prints and dispenses university-related documents via the slit-based drawer arrangement 114 upon user request through the display panel 103, ensuring authenticated retrieval. The device further incorporates the retractable luggage tray 115 for transporting student belongings, the retractable directional microphone pod 117 with the beamforming microphone array for accurate voice command capture and voiceprint recognition, and the QR code scanner 118 for student ID and room-specific code scanning, enabling personalized guidance and resource access. The natural language interface configured with the microphone facilitates multilingual user interaction, accessing the linked database to respond to course-specific queries and to provide finance-related suggestions and student loan options with EMI calculations. The device utilizes real-time student data, historical admission trends, and predictive machine learning models to evaluate course eligibility and recommend optimal departments. The articulated arms 105 employ the trained image recognition module for autonomous interaction with various elevator panel layouts. Upon detecting exam-related anxiety via the imaging unit 108, the device suggests and navigates students to relevant library resources.
[0064] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:
1) An autonomous student assistance and navigation device for educational institutions, comprising:
i) a body 101 configured with plurality of motorized wheels 102 at a bottom portion to enable multi-directional movement inside an educational institution, wherein said body 101 is assigned to assist a group of students by navigating, advising, and interacting with said students;
ii) a touch-enabled display panel 103 mounted on said body 101 for interacting with students, wherein upon activation during admission, said display panel 103 presents a questionnaire to receive student details, wherein a microcontroller linked with the display panel 103 determines the most suitable academic stream for the student based on entered credentials, along with providing real-time suggestions for alternate departments if eligibility is not met;
iii) a navigation unit 104 comprising a LiDAR (Light Detection and Ranging) sensor and Simultaneous Localization and Mapping (SLAM) module for real-time indoor mapping and dynamic path planning, wherein said microcontroller responds to student audio queries by retrieving live building layouts and guiding said students to specified campus locations;
iv) a pair of articulated arms 105 assembled on said body 101 and integrated with electromagnetic springs 106, said arms 105 being actuated by the microcontroller in sync with the imaging unit 108 to press elevator buttons and enable the body 101 to travel across building floors while accompanying student(s);
v) a holographic display unit 107 configured on said body 101 to project visual representations of university history, campus events, or classroom layouts, thereby enhancing student awareness and familiarity with campus facilities;
vi) an artificial intelligence-based imaging unit 108 installed on the body 101 and paired with a processor for capturing facial expressions and emotional cues of students, to assess student well-being and detect emotional distress and/or harassment incidents, wherein upon successful detection, said microcontroller raises a formal complaint to university authorities;
vii) a pair of motorized sliders 109 and plates 110 arranged at the base of said body 101 and linked via a motorized scissor arrangement 111, wherein upon detecting student injury or mobility issues, said sliders 109 and scissor arrangement 111 work in collaboration to extend and lift said plate to an optimum height, to safely transport said student to the desired location; and
viii) a secured printing unit 112 enclosed within a chamber 113 inside said body 101, said printing unit 112 is configured to print and dispense university-related documents via a slit-based drawer arrangement 114 crafted on the body 101 upon user request received through said display panel 103, ensuring authenticated retrieval of requested documents.
2) The device as claimed in claim 1, wherein a retractable luggage tray 115 is integrated within said body 101 for transporting student belongings, including books or laptops, said tray 115 is actuated by a folding arrangement 116 to provide portability support for disabled or burdened students.
3) The device as claimed in claim 1, wherein a retractable directional microphone pod 117 is arranged at the top of said body 101 and configured with a beamforming microphone array, said pod 117 extends towards the student’s face to capture voice accurately in noisy environments and supports voiceprint recognition for personalized interactions.
4) The device as claimed in claim 1, wherein said navigation unit 104 guides a student using both visual display and spoken language instructions, and upon request, physically escorts the student using the shortest accessible route, including the use of elevators if required.
5) The device as claimed in claim 1, wherein said microphone is configured with a natural language interface to enable user interaction in multiple languages, and synchronized with said database to display course-specific queries, finance-related suggestions, and student loan options with EMI (Equated Monthly Installment) calculations based on user inputs.
6) The device as claimed in claim 1, wherein said microcontroller evaluates course eligibility using real-time student data inputs, historical admission trends, and predictive machine learning models to recommend departments where admission is likely and academic success is maximized.
7) The device as claimed in claim 1, wherein said articulated arms 105 utilize a trained image recognition module to interpret various elevator panel layouts for autonomous interaction with multi-floor building environments.
8) The device as claimed in claim 1, wherein, upon detecting exam-related anxiety from a student’s facial expression, said microcontroller suggests study resources available in the university library and navigates the student to the respective location for access.
9) The device as claimed in claim 1, wherein a QR (quick response) code scanner 118 is provided on the body 101 for scanning student IDs and/or room-specific codes, and said microcontroller accordingly initiates personalized guidance and resource access based on said scanned data and role-based permissions.
10) The device as claimed in claim 1, wherein a battery is associated with said device for supplying power to electrical and electronically operated components associated with said device.
| # | Name | Date |
|---|---|---|
| 1 | 202521052810-STATEMENT OF UNDERTAKING (FORM 3) [30-05-2025(online)].pdf | 2025-05-30 |
| 2 | 202521052810-REQUEST FOR EXAMINATION (FORM-18) [30-05-2025(online)].pdf | 2025-05-30 |
| 3 | 202521052810-REQUEST FOR EARLY PUBLICATION(FORM-9) [30-05-2025(online)].pdf | 2025-05-30 |
| 4 | 202521052810-PROOF OF RIGHT [30-05-2025(online)].pdf | 2025-05-30 |
| 5 | 202521052810-POWER OF AUTHORITY [30-05-2025(online)].pdf | 2025-05-30 |
| 6 | 202521052810-FORM-9 [30-05-2025(online)].pdf | 2025-05-30 |
| 7 | 202521052810-FORM FOR SMALL ENTITY(FORM-28) [30-05-2025(online)].pdf | 2025-05-30 |
| 8 | 202521052810-FORM 18 [30-05-2025(online)].pdf | 2025-05-30 |
| 9 | 202521052810-FORM 1 [30-05-2025(online)].pdf | 2025-05-30 |
| 10 | 202521052810-FIGURE OF ABSTRACT [30-05-2025(online)].pdf | 2025-05-30 |
| 11 | 202521052810-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [30-05-2025(online)].pdf | 2025-05-30 |
| 12 | 202521052810-EVIDENCE FOR REGISTRATION UNDER SSI [30-05-2025(online)].pdf | 2025-05-30 |
| 13 | 202521052810-EDUCATIONAL INSTITUTION(S) [30-05-2025(online)].pdf | 2025-05-30 |
| 14 | 202521052810-DRAWINGS [30-05-2025(online)].pdf | 2025-05-30 |
| 15 | 202521052810-DECLARATION OF INVENTORSHIP (FORM 5) [30-05-2025(online)].pdf | 2025-05-30 |
| 16 | 202521052810-COMPLETE SPECIFICATION [30-05-2025(online)].pdf | 2025-05-30 |
| 17 | Abstract.jpg | 2025-06-17 |
| 18 | 202521052810-FORM-26 [01-07-2025(online)].pdf | 2025-07-01 |