
Navigation And Safety System For Railway Stations

Abstract: A navigation and safety system for railway stations, comprising a housing unit 101 installed at designated locations within railway stations; two storage chambers 102 that securely hold augmented reality (AR) glasses 103 and assistive sticks 104 for passenger use; an augmented reality holographic projector 106 that projects visual guidance to assist passengers in navigating and boarding trains; a scanning unit 107 that verifies passenger ticket details and platform assignment; the AR glasses 103 and assistive sticks 104 being equipped with AI-enabled cameras 108, GPS modules, haptic feedback units, and LED lights 110, configured to assist passengers with navigation, obstacle detection, and safety alerts throughout their journey; a speaker module 111 providing voice instructions, real-time notifications, and alerts related to travel progress, platform changes, and safety warnings; and a touch screen 112 displaying travel-related information and adapted to receive passenger inputs via a braille keypad 113 and voice commands through an integrated microphone 109.


Patent Information

Filing Date
27 June 2025
Publication Number
29/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE

Applicants

Marwadi University
Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Inventors

1. Aryan Mahida
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
2. Chandrasinh D Parmar
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
3. Vijay Dubey
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Specification

Description: FIELD OF THE INVENTION

[0001] The present invention relates to a navigation and safety system for railway stations, designed to assist passengers in navigating station premises while ensuring safe movement, accurate boarding, and access to real-time travel information, with special support for individuals with visual impairments and other mobility challenges.

BACKGROUND OF THE INVENTION

[0002] Passengers at railway stations often face significant challenges related to navigation and safety. Common issues include inadequate signage, confusing platform layouts, poor lighting, and a lack of real-time information, leading to missed trains and disorientation, especially for elderly or visually impaired individuals. Overcrowding and unregulated pedestrian flow increase the risk of accidents, theft, or injuries. Limited accessibility features, such as ramps or tactile paths, further hinder movement for differently-abled passengers. Additionally, insufficient security presence and emergency response infrastructure compromise personal safety. Visually impaired passengers face particular difficulty in navigating stations and maintaining personal safety: identifying platforms, locating ticket counters, navigating crowded or obstructed walkways, and interpreting audio or visual announcements that are unclear or poorly timed. Inadequate tactile guidance paths, the absence of accessible signage, and inconsistent staff assistance further hinder independent mobility. Safety risks arise from proximity to train tracks, gaps between trains and platforms, and unmarked obstacles. Delays in emergency communication and a lack of assistive infrastructure amplify this vulnerability, often leading to anxiety, disorientation, and increased dependence on others for safe and efficient travel.

[0003] Traditionally, railway stations have relied upon manual methods for passenger guidance and assistance, including static signage, printed timetables, public announcements, and manual intervention by station staff. Assistance to visually impaired or elderly passengers has been limited to basic tactile surfaces, guide rails, or ad-hoc escort services. Platform and train information is typically disseminated through overhead display units or loudspeaker systems, which may not be timely or personalized. Issuance and return of assistive devices such as mobility sticks or visual aids are managed without digital logging, leading to inconsistencies in availability and accountability. These approaches lack real-time responsiveness and adaptive functionality.

[0004] CN110758406A discloses a transfer island system applied to a station, comprising a transfer island, special channels, a railway luggage storing and transporting device with a security check function, and a railway passenger ferry vehicle with a ticket check/security check function. The railway luggage storing and transporting device and the railway passenger ferry vehicle travel between the station waiting hall and the transfer island through the special channels, which comprise luggage passing channels, ferry vehicle passing channels, and rapid through channels; the luggage passing channels and the ferry vehicle passing channels extend from the transfer island into the station waiting hall, and the rapid through channels extend from the transfer island to the station security inspection position. The railway luggage storing and transporting device stores passengers' luggage, performs security inspection, and delivers the luggage to the waiting hall through a luggage passing channel; the railway passenger ferry vehicle verifies the passenger's ticket, performs security checks on the passenger and the passenger's luggage, and delivers the passenger into the waiting hall through a ferry vehicle passage; and the rapid through channel provides quick passage to the security check point for passengers with or without luggage.

[0005] EP0791518A1 discloses a method and apparatus for a railway navigation system which provides information defining the position of a railway vehicle on a track system. The system uses an on-board computer with a track database representing the locations of the rail lines, including locations of curves and switches. The system uses a turn rate indicator and a speedometer means to obtain a value for the curvature of the track on which the railway vehicle moves. Curvature data so obtained is compared with data in the track database to determine the position of the railway vehicle in relation to curves and switches of the track system.

[0006] Conventionally, many systems have been developed to facilitate navigation and safety at railway stations; however, the systems mentioned in the prior art have limitations pertaining to providing comprehensive platform-specific navigation, real-time validation of travel details, or automated guidance to boarding areas. Additionally, the existing systems lack support for visually impaired or elderly users, and fail to perform continuous data collection or analysis.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system that is capable of improving the boarding process by guiding and helping passengers move easily and safely within railway stations. Additionally, the system is capable of supporting visually impaired and elderly passengers in navigating crowded and complex station areas, and analyzing travel and station data continuously for improving service quality and passenger experience.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a system that is capable of helping passengers move easily and safely within railway stations by providing clear navigation support.

[0010] Another object of the present invention is to develop a system that is capable of improving the boarding process by guiding passengers accurately to the correct platform and train.

[0011] Another object of the present invention is to develop a system that is capable of assisting passengers in verifying their travel details before boarding to reduce errors and confusion.

[0012] Another object of the present invention is to develop a system that is capable of supporting visually impaired and elderly passengers in navigating crowded and complex station areas.

[0013] Another object of the present invention is to develop a system that is capable of monitoring and managing the usage of travel assistance tools across stations in an organized way.

[0014] Yet another object of the present invention is to develop a system that is capable of analyzing travel and station data continuously for improving service quality and passenger experience.

[0015] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0016] The present invention relates to a navigation and safety system for railway stations developed to help passengers to navigate railway stations safely and efficiently, offering timely travel information and dedicated support for those with visual or mobility difficulties to ensure accurate and secure boarding.

[0017] According to an embodiment of the present invention, a navigation and safety system for railway stations comprises a housing unit configured to be installed at designated locations within railway stations; at least two storage chambers provided within the housing to securely hold augmented reality (AR) glasses and assistive sticks for passenger use, the storage chambers in the housing unit being equipped with drawer arrangements configured to securely open and close for issuance and return of the AR glasses and assistive sticks; an augmented reality holographic projector mounted on the housing unit, configured to project visual guidance to assist passengers in navigation and boarding of trains; a scanning unit integrated within the housing unit to verify passenger ticket details and platform assignment, the scanning unit comprising an array of cameras to capture images of the ticket, working in conjunction with an integrated OCR (Optical Character Recognition) module; the AR glasses and assistive sticks being equipped with AI (artificial intelligence)-enabled cameras, GPS (Global Positioning System) modules, haptic feedback units, and LED (Light Emitting Diode) lights, configured to assist passengers with navigation, obstacle detection, and safety alerts throughout their journey; and the AR glasses and assistive sticks being configured to detect whether a passenger is carrying luggage or a heavy load, prompting the holographic projector to suggest assistance options such as shuttle carts or porter services.

[0018] According to another embodiment of the present invention, the system further includes a speaker module integrated with the AR glasses and assistive sticks, providing voice instructions, real-time notifications, and alerts related to travel progress, platform changes, and safety warnings; a touch screen mounted on the housing unit, displaying travel-related information including passenger PNR (Passenger Name Record) numbers and platform details, and adapted to receive passenger inputs via a braille keypad and voice commands through an integrated microphone; an IoT module interlinked with the housing units at different railway stations, configured to track issuance and return of the AR glasses and assistive sticks between stations; an ultrasonic sensor integrated with the AI-enabled camera on the assistive stick, configured to detect obstacles and provide haptic and audio alerts to visually impaired users; and a database integrated with the microcontroller, storing platform assignments, train schedules, PNR details, and other relevant data continuously analyzed by a machine learning algorithm to optimize user experience and station operations. Failure to return the AR glasses or assistive sticks triggers notifications via the housing unit screen, speaker modules, and connected computing unit(s), including escalating fines and reminders.

[0019] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0020] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a navigation and safety system for railway stations.

DETAILED DESCRIPTION OF THE INVENTION

[0021] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0022] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0023] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0024] The present invention relates to a navigation and safety system for railway stations developed to facilitate safe and efficient movement of passengers within railway stations, providing essential travel guidance and specialized assistance for individuals with mobility or visual impairments to ensure proper access to trains.

[0025] Referring to Figure 1, an isometric view of a navigation and safety system for railway stations is illustrated, comprising a housing unit 101 configured to be installed at designated locations within railway stations, two storage chambers 102 provided within the housing to hold augmented reality (AR) glasses 103 and assistive sticks 104, the storage chambers 102 in the housing unit 101 being equipped with drawer arrangements 105, an augmented reality holographic projector 106 mounted on the housing unit 101, a scanning unit 107 integrated within the housing unit 101, the AR glasses 103 and assistive sticks 104 being equipped with AI (artificial-intelligence)-enabled cameras 108 and LED (Light Emitting Diode) lights 110, a speaker module 111 integrated with the AR glasses 103 and assistive sticks 104, and a touch screen 112 mounted on the housing unit 101 and configured with a braille keypad 113 and an integrated microphone 109.

[0026] The system disclosed herein comprises a housing unit 101 configured for fixed installation at designated railway station premises and structured to facilitate passenger access and interaction. Upon user authentication or station staff initiation, the housing unit 101 activates to manage issuance and receipt processes for stored assistive units. The housing unit 101 remains operational during station hours and is embedded with an inbuilt microcontroller to detect unauthorized access, tampering, or failure in operation, thereby ensuring continuous and lawful availability of assistive tools to eligible railway passengers. Each housing unit 101 incorporates at least two internally partitioned storage chambers 102, independently configured for AR glasses 103 and assistive sticks 104, respectively.

[0027] Upon command issuance via an authorized interface, a designated chamber 102 unlocks to release a single unit of the stored item. Post successful usage of the AR glasses 103 and assistive sticks 104, the passenger or staff inserts the item back, triggering automatic locking and inventory verification. Anti-theft, fire-resistant, and hygiene-compliant measures are embedded within each chamber 102. The chambers 102 operate independently, allowing parallel access and return functions, thereby ensuring operational efficiency and user safety within regulated public infrastructure settings.

[0028] The storage chambers 102 in the housing unit 101 are equipped with drawer arrangements 105 configured for secure, automated opening and closing functions. Upon receipt of a command via the user interface or station personnel authorization, the drawer arrangements 105 actuate to glide on controlled rails, halting in a precise open position to permit retrieval or return of AR glasses 103 or assistive sticks 104. The microcontroller prevents forceful opening and automatically reverses in case of obstruction. Each drawer arrangement 105 ensures controlled, user-specific access, aligned with prescribed usage protocols and asset accountability requirements.
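The drawer behaviour described above (glide to an open position, reverse fully on obstruction) can be sketched as a small control loop. This is an illustrative sketch only: the motor and obstruction-sensor interfaces, travel distances, and class names are assumptions, not part of the disclosed embodiment.

```python
OPEN_POS, CLOSED_POS = 100, 0   # slider travel in millimetres (assumed values)

class DrawerController:
    """Hypothetical sketch of the microcontroller's drawer control loop."""

    def __init__(self, motor, obstruction_sensor):
        self.motor = motor               # stand-in with drive(direction) / stop()
        self.sensor = obstruction_sensor  # stand-in with blocked() -> bool
        self.position = CLOSED_POS

    def move_to(self, target_mm, step_mm=5):
        direction = 1 if target_mm > self.position else -1
        while self.position != target_mm:
            if self.sensor.blocked():
                # Obstruction detected: reverse fully and stop, as the text requires.
                self.motor.drive(-direction)
                self.position = CLOSED_POS if direction > 0 else OPEN_POS
                self.motor.stop()
                return False
            self.motor.drive(direction)
            # Advance one step, clamping at the target position.
            if direction > 0:
                self.position = min(target_mm, self.position + step_mm)
            else:
                self.position = max(target_mm, self.position - step_mm)
        self.motor.stop()
        return True
```

A return value of `False` signals the inventory logic that the issuance or return cycle did not complete.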

[0029] The drawer arrangements 105 mentioned herein consist of multiple overlapping plates coupled to a sliding unit, wherein upon actuation of the drawer arrangements 105 by the microcontroller, a motor in the sliding unit rotates a wheel coupled via a shaft in the clockwise or anticlockwise direction, moving the slider in the drawer arrangements 105 to open or close for issuance and return of AR glasses 103 and assistive sticks 104. A biometric verification module integrated within the housing unit operates by initiating an identity authentication protocol upon user interaction. The biometric verification module comprises at least one of:
i) a fingerprint scanner;
ii) an iris scanner;
iii) a QR-code reader linked to a prior registration database with disability credentials; or
iv) a smart card reader for government-issued disability identity cards.

[0030] The fingerprint scanner mentioned herein is connected to the biometric verification module and activates upon user contact. The scanner captures the user's fingerprint data in real time and transmits it to an inbuilt microcontroller for processing. The captured data is matched against a pre-enrolled database of individuals registered with verified visual disabilities. If the fingerprint matches an approved profile, the microcontroller transmits an authentication signal to unlock the assistive stick 104 dispenser. If unmatched, the process is halted and an alert is sent to remote personnel for further verification, ensuring that only eligible users can access the assistive aid.

[0031] The iris scanner captures high-resolution imagery of the user's iris when positioned appropriately before the sensor. The module automatically focuses and extracts unique iris features for identity verification. This biometric data is cross-referenced against a stored dataset of users pre-registered with visual disabilities. Upon a successful match, the microcontroller issues an activation signal to the assistive stick 104 dispenser. In case of mismatch or image obstruction, the system either retries scanning or escalates the authentication process to remote personnel for manual approval, thus ensuring secure and identity-bound dispensing of the assistive stick 104 within the housing unit 101.

[0032] The QR-code reader initiates upon user presentation of a QR-enabled disability credential, typically linked to a central registration database. The reader captures the encoded information and transmits it to the verification module for decoding and database matching. The microcontroller validates the user's identity and associated visual impairment credentials embedded in the QR code. Upon verification, the module generates an authentication signal that triggers the assistive stick dispenser. If the code is invalid, expired, or unmatched, the system denies dispensing and notifies remote personnel to review the user’s eligibility via the live feed, ensuring controlled and justified allocation.

[0033] The smart card reader becomes active when a user inserts or taps a government-issued disability identity card. The reader extracts encrypted data including user ID, disability classification, and registration number. This information is matched in real time against a central database of authorized users with verified visual impairment. If validated, the system permits dispensing of the assistive stick 104 by issuing an actuation command to the dispenser. In the event of failure to authenticate, the microcontroller blocks dispensing and prompts remote personnel for live verification, thereby maintaining secure access and restricting use to eligible individuals only, as per authorization protocol.
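All four verification modalities in paragraphs [0030]–[0033] share the same decision flow: capture a credential, match it against the enrolled-disability register, and either unlock the dispenser or escalate to remote personnel. A minimal sketch of that shared flow, with an in-memory register and key format that are purely illustrative assumptions:

```python
# Hypothetical pre-enrolled register: modality-prefixed credential -> user ID.
ENROLLED = {
    "fp:1a2b": "user-001",          # fingerprint template hash (assumed)
    "qr:DIS-2024-0042": "user-002",  # QR-encoded disability credential (assumed)
}

def verify_and_dispense(modality: str, credential: str):
    """Return (authorised, action) for a captured credential.

    On a match, the microcontroller would emit the actuation signal to the
    chamber; on a mismatch, the attempt is escalated for manual review.
    """
    key = f"{modality}:{credential}"
    if key in ENROLLED:
        return True, "unlock_dispenser"
    return False, "escalate_to_remote_staff"
```

In a real deployment the register lookup would be a query against the central registration database rather than a local dictionary.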

[0034] An augmented reality holographic projector 106 is affixed to the housing unit 101, configured to emit and display real-time, three-dimensional holographic visual cues within the passenger environment. The projector 106 is integrated to assist and guide passengers in navigating station layouts, locating designated boarding zones, and identifying train car alignment and boarding sequences. The AR projector 106 is synchronized with transport schedules and platform sensors, ensuring dynamic projection of pertinent visual navigation data. The projector 106 adapts responsively to crowd density, platform changes, or train arrivals, thereby enhancing boarding efficiency and passenger orientation.

[0035] The augmented reality holographic projector 106 projects educational and entertainment content such as tourist spot information and historical facts during train travel. The projector 106 herein operates by generating 3D images using laser-based light modulation, typically through digital light processing (DLP). The projector 106 maps the physical environment and superimposes contextual holographic visuals aligned to user positioning via cameras and real-time data sources. The projector 106 utilizes SLAM (simultaneous localization and mapping) to ensure spatial accuracy, projecting directional arrows, symbols, or train indicators.

[0036] A scanning unit 107, integrally housed within the housing unit 101, operates to authenticate passenger tickets and ascertain platform allocation. Upon initiation by user interaction, the microcontroller activates the scanning unit 107, which comprises an embedded array of cameras that captures the ticket image. This image is instantaneously transmitted to an integrated OCR module. The scanning unit 107 validates ticket authenticity by matching the extracted data with the microcontroller. Any discrepancies or mismatches result in access denial. The scanning unit 107 functions autonomously and in real time, enabling secure, efficient passenger flow management through automated verification protocols.

[0037] Post successful detection of a ticket within the designated capture zone, the microcontroller triggers the array of cameras, configured in a geometrically-aligned matrix within the scanning unit 107 to capture images of the ticket. The cameras operate synchronously to acquire high-resolution visual data of the ticket surface by utilizing multi-angle imaging protocols. Each camera in the array is calibrated for focal alignment and light correction, thereby mitigating obstructions such as folds, shadows, or partial visibility. The resultant composite image ensures complete data capture for accurate processing. This image is then relayed to the OCR module without intermediate manual intervention, forming a seamless imaging-to-recognition unit.

[0038] After the image is received via the camera array, the OCR module initiates data extraction from the ticket. The OCR module interprets alphanumeric characters, QR codes, and other machine-readable elements encoded on the ticket by utilizing pattern recognition and neural parsing protocols. The OCR module converts the captured image into a structured data string, which is cross-referenced against a secure ticketing database to verify ticket validity and determine the appropriate platform assignment. Any errors in interpretation are resolved through built-in correction heuristics. The module operates autonomously and provides verified output to the microcontroller for further routing or gate access control.
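The imaging-to-recognition pipeline of paragraphs [0036]–[0038] can be sketched as: OCR text is parsed into a structured record and cross-referenced against the ticketing database. The field layout, regular expressions, and the in-memory `TICKET_DB` below are assumptions for illustration; a real deployment would feed the composite camera image through an OCR engine (such as Tesseract) to obtain `ocr_text`.

```python
import re

# Hypothetical ticketing database: PNR -> expected train and platform.
TICKET_DB = {"1234567890": {"train": "12951", "platform": 4}}

def parse_ticket(ocr_text: str) -> dict:
    """Extract a structured record from OCR output (assumed ticket layout)."""
    pnr = re.search(r"PNR[:\s]*(\d{10})", ocr_text)
    train = re.search(r"TRAIN[:\s]*(\d{5})", ocr_text)
    return {"pnr": pnr.group(1) if pnr else None,
            "train": train.group(1) if train else None}

def verify(ocr_text: str) -> dict:
    """Cross-reference the parsed record; mismatches deny access."""
    rec = parse_ticket(ocr_text)
    entry = TICKET_DB.get(rec["pnr"])
    if entry and entry["train"] == rec["train"]:
        return {"valid": True, "platform": entry["platform"]}
    return {"valid": False, "platform": None}
```

The built-in correction heuristics mentioned in [0038] would sit between `parse_ticket` and `verify`, normalising common OCR confusions before the database lookup.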

[0039] The AR (Augmented Reality) glasses 103 and assistive sticks 104 are integrated with AI (Artificial Intelligence)-enabled cameras 108, GPS (Global Positioning System) modules, haptic feedback units, and LED (Light Emitting Diode) lights 110, collectively configured and functionally aligned to provide real-time navigational assistance, obstacle detection, and safety alerts to passengers during transit. The AR glasses 103 and assistive sticks 104 are further programmed to detect whether a passenger is carrying luggage or a heavy load, whereupon the microcontroller triggers the holographic projector 106 to suggest relevant assistance options, including but not limited to the deployment of shuttle carts or porter services for enhanced mobility and support.

[0040] The AI (artificial-intelligence)-enabled cameras 108 comprise an image capturing arrangement including a set of lenses that captures multiple images from the surrounding environment, and the captured images are stored within the memory of the imaging unit in the form of optical data. The cameras 108 also comprise a processor integrated with artificial intelligence protocols, such that the processor processes the optical data and extracts the required data from the captured images.

[0041] The extracted data is further converted into digital pulses and bits and transmitted to the microcontroller. The microcontroller processes the received data and determines passenger behavior, such as signs of strain or load-bearing posture. Upon detecting predefined cues such as baggage presence, the cameras 108 relay actionable inputs to the central processor, prompting context-specific responses, including holographic suggestions or adjustments in navigation protocols.
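The camera-to-microcontroller decision above (detected cue triggers a context-specific prompt) can be sketched as a mapping from detection labels to projector suggestions. The label names, confidence threshold, and action string are illustrative assumptions; the actual detection model is not specified in the disclosure.

```python
def assistance_prompt(detections: list) -> "str | None":
    """Map camera detections to a projector suggestion.

    detections: list of dicts such as {'label': 'suitcase', 'confidence': 0.91},
    as might be emitted by an onboard object-detection model (assumed format).
    """
    LUGGAGE = {"suitcase", "backpack", "trolley_bag"}   # assumed label set
    for d in detections:
        if d["label"] in LUGGAGE and d["confidence"] >= 0.8:
            # Predefined cue met: prompt the holographic projector.
            return "offer_shuttle_cart_or_porter"
    return None   # no actionable cue; navigation continues unchanged
```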

[0042] The GPS (Global Positioning System) modules herein receive satellite signals to compute the precise geolocation of the passenger. This geospatial data is transmitted to the navigation unit for real-time positioning. Integrated mapping software correlates the GPS output with waypoints, routes, and predefined zones such as terminals, gates, exits, or transport hubs. The microcontroller dynamically updates the passenger's path and can reroute in response to deviations, delays, or obstructions. Moreover, the GPS module coordinates are utilized to localize incidents or call for proximity-based services, such as dispatching nearby porter assistance when luggage is detected, thereby enabling automated, context-aware operational responses.
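The waypoint-correlation step above can be sketched as a nearest-waypoint lookup using the haversine great-circle distance. The waypoint names and coordinates below are placeholders, not actual station data.

```python
import math

# Hypothetical station waypoints: name -> (latitude, longitude) in degrees.
WAYPOINTS = {
    "platform_4_gate": (22.3039, 70.8022),
    "ticket_counter":  (22.3045, 70.8030),
}

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))   # mean Earth radius 6371 km

def nearest_waypoint(fix):
    """Correlate a GPS fix with the closest predefined waypoint."""
    return min(WAYPOINTS, key=lambda name: haversine_m(fix, WAYPOINTS[name]))
```

Rerouting would then re-run the path planner from the matched waypoint whenever the fix deviates from the expected route.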

[0043] The haptic feedback units herein operate by converting microcontroller-generated signals into tactile stimuli, typically through controlled vibrations or mechanical pulses. These stimuli are issued based on inputs received from the AI cameras 108 or GPS module, such as proximity to an obstacle or directional prompts. For example, a left-arm vibration indicates a left turn, while rapid pulses signal an imminent obstruction. The feedback intensity and patterns are customized based on urgency or environmental complexity. These alerts ensure the passenger remains situationally aware, especially in low-visibility or crowded conditions, by providing discreet, real-time physical prompts directly through the AR glasses 103 and assistive sticks 104.
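The event-to-stimulus mapping described above (left-arm vibration for a left turn, rapid pulses for an obstruction) can be sketched as a pattern table. The specific pulse durations, actuator names, and fallback pattern are illustrative assumptions.

```python
# Assumed pattern table: event -> (actuator side, pulse lengths in ms).
HAPTIC_PATTERNS = {
    "turn_left":      ("left_arm",  [300]),          # single long pulse
    "turn_right":     ("right_arm", [300]),
    "obstacle_close": ("both",      [80, 80, 80]),   # rapid pulses: imminent obstruction
    "platform_edge":  ("both",      [80] * 5),       # highest urgency
}

def haptic_command(event: str) -> dict:
    """Translate a navigation/safety event into a tactile-stimulus command."""
    side, pulses = HAPTIC_PATTERNS.get(event, ("both", [150]))  # default: generic alert
    return {"actuator": side, "pulses_ms": pulses, "repeats": len(pulses)}
```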

[0044] Based on signals received from the AI-enabled cameras 108 and the GPS modules, the microcontroller actuates the LED (Light Emitting Diode) lights 110 to serve as visual indicators to both the passenger and nearby individuals, displaying status alerts (e.g., safe to proceed, stop, caution). The LED patterns or color changes are contextually triggered, such as flashing red for obstacles or green for clear paths. Additionally, the LEDs highlight specific areas, such as uneven surfaces or curbs, enhancing visibility.

[0045] Upon successful detection of luggage, the LEDs blink in a pattern to draw the attention of assistance personnel. Light intensity and behavior are algorithmically modulated for adaptive response. The LED herein is a two-lead semiconductor light source, also known as a p-n junction, which produces light when a voltage is supplied across the diode. When the voltage is applied, electrons recombine with electron holes in the diode, releasing energy in the form of photons, i.e., light.
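The contextual LED signalling of paragraphs [0044]–[0045] amounts to a mapping from microcontroller conditions to colour and blink behaviour. The specific colours, frequencies, and fallback state below are illustrative assumptions.

```python
# Assumed condition table: microcontroller state -> LED behaviour.
LED_STATES = {
    "clear_path":    {"colour": "green", "mode": "steady"},
    "obstacle":      {"colour": "red",   "mode": "flash", "hz": 4},
    "caution":       {"colour": "amber", "mode": "flash", "hz": 1},
    "luggage_found": {"colour": "blue",  "mode": "blink_pattern"},  # summons staff
}

def led_command(condition: str) -> dict:
    """Select the LED behaviour for a detected condition (default: caution)."""
    return LED_STATES.get(condition, LED_STATES["caution"])
```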

[0046] A speaker module 111, integrated within the AR glasses 103 and assistive sticks 104, is designed to provide audio-based assistance to the user. The speaker module 111 delivers voice instructions, real-time travel notifications, and critical alerts, including but not limited to platform changes, navigation updates, and safety warnings. The speaker module 111 operates by receiving signals from the microcontroller and converting them into sound waves through the vibration of a diaphragm, producing audible output with the help of amplification and control circuitry.

[0047] A touch screen 112 is affixed to the housing unit 101 to display travel-related data, including but not limited to Passenger Name Record (PNR) numbers and platform allocations. The touch screen 112 is further configured to receive passenger-initiated inputs through two distinct auxiliary input units: a braille keypad 113 configured for tactile entry to accommodate visually impaired users, and a voice input unit integrated via an embedded microphone 109. These units facilitate multimodal user interaction compliant with accessibility and user-interface standards applicable under prevailing regulatory frameworks. The touch screen 112 is adapted to provide passengers with information about nearby restrooms, refreshment shops, and crowd densities at upcoming stations based on stoppage times and IoT data.

[0048] The touch screen 112 as mentioned herein is typically an LCD (Liquid Crystal Display) screen that presents output in a visible form. The screen 112 is equipped with touch-sensitive technology, allowing the user to interact directly with the display using their fingers. A touch controller IC (Integrated Circuit) is responsible for processing the analog signals generated when the user inputs details regarding travel-related information, including passenger PNR (Passenger Name Record) numbers and platform details. The touch controller is typically connected to the microcontroller through various interfaces, which may include but are not limited to SPI (Serial Peripheral Interface) or I2C (Inter-Integrated Circuit).

[0049] The braille keypad 113 operates as a physical, tactile input unit composed of raised-dot buttons compliant with braille standards. Each key corresponds to an alphanumeric character or microcontroller command. When a visually impaired passenger presses a key, a capacitive signal is transmitted to the microcontroller, which decodes the braille input and maps it to its digital equivalent. The microcontroller further executes the corresponding function such as querying PNR information or selecting platform data. The audio and/or visual feedback is rendered to confirm successful input. The keypad 113 ensures secure and accessible interaction in accordance with accessibility norms.
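The braille decoding step above (raised-dot keypress mapped to its digital equivalent) can be sketched with a dot-pattern lookup table. Only a few cells are shown for brevity; the patterns follow standard six-dot braille (dots numbered 1–3 down the left column, 4–6 down the right).

```python
# Standard six-dot braille cells for the first few letters.
BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def decode_braille(dots) -> "str | None":
    """Map a set of pressed dot positions (1-6) to a character, or None.

    In the described system, the microcontroller would then execute the
    mapped command and render audio/visual confirmation of the input.
    """
    return BRAILLE.get(frozenset(dots))
```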

[0050] The integrated microphone 109 functions by capturing audio signals in the form of voice commands from the user. These analog audio inputs are converted into digital signals via an analog-to-digital converter (ADC). The digitized voice input is then processed through embedded voice recognition software capable of parsing spoken language into actionable commands. Once validated, the microcontroller executes the corresponding operations, such as retrieving or announcing the PNR status or platform number.
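
After the voice recogniser returns a transcript, the validation-and-dispatch step can be sketched as a simple keyword match; the keywords and action names below are illustrative assumptions, not part of the specification.

```python
# Illustrative sketch: mapping a recognised voice transcript to a system
# action, as the microcontroller might do after speech recognition.
# Keywords and action identifiers are assumed for illustration.

def parse_command(transcript: str) -> str:
    """Return the action identifier for a spoken command."""
    text = transcript.lower()
    if "pnr" in text:
        return "QUERY_PNR"        # e.g. retrieve or announce PNR status
    if "platform" in text:
        return "QUERY_PLATFORM"   # e.g. announce the platform number
    if "help" in text:
        return "CALL_STAFF"
    return "UNKNOWN"

print(parse_command("What is my PNR status"))        # → QUERY_PNR
print(parse_command("Which platform for my train"))  # → QUERY_PLATFORM
```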

[0051] The microphone 109 includes noise cancellation features to enhance recognition accuracy in public environments. The voice interface is designed to comply with accessibility standards and to enable hands-free operation. In the event of discrepancies arising between ticket particulars and the corresponding platform assignment, an automated alert protocol is triggered, notifying both passengers and duly authorized station personnel. An Internet of Things (IoT) module is integrally networked with the housing unit 101 across participating railway stations.
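
The mismatch check that triggers this alert protocol can be sketched as below; the message wording and data shapes are illustrative assumptions.

```python
# Illustrative sketch: comparing the platform on the scanned ticket with the
# live platform assignment, and producing alerts for both the passenger and
# station personnel when they differ. Message formats are assumed.

def check_platform(ticket_platform: str, assigned_platform: str) -> list[str]:
    """Return the alert messages to dispatch; empty when details match."""
    if ticket_platform == assigned_platform:
        return []
    return [
        f"PASSENGER: your train now departs from platform {assigned_platform}",
        f"STAFF: ticket/platform mismatch ({ticket_platform} -> {assigned_platform})",
    ]

print(check_platform("4", "4"))     # matching details: no alerts
print(check_platform("4", "6")[0])  # passenger-facing alert on mismatch
```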

[0052] The assistive sticks 104 are interlinked via the IoT module to share information about less crowded areas and the shortest queues among visually impaired users. The IoT module is pre-configured to monitor and log the issuance and return of the augmented reality (AR) glasses 103 and assistive sticks 104. The module ensures accountability and traceability in the handling of these devices, providing verifiable data trails and compliance with station-level asset management policies and operational mandates.

[0053] The IoT module herein operates by embedding RFID/NFC tags in both the AR glasses 103 and assistive sticks 104, and integrating tag readers within the housing unit 101. Upon issuance, the unique ID is captured and logged against the user profile via a secure cloud-based database, triggered by a user's authenticated scan (e.g., ticket barcode or ID). The module timestamps the event, updates availability status, and transmits this data in real time to the microcontroller.
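
The issuance/return ledger kept by the IoT module can be sketched as follows; the in-memory dictionary stands in for the secure cloud-based database, and the tag and user identifiers are hypothetical.

```python
# Illustrative sketch: logging issuance and return of tagged devices.
# Each RFID/NFC tag ID is recorded against the authenticated user with a
# timestamp, and its availability status is updated on each event.

from datetime import datetime, timezone

ledger: dict[str, dict] = {}   # tag_id -> latest record (stands in for cloud DB)

def issue(tag_id: str, user_id: str) -> dict:
    """Log issuance of a device to an authenticated user."""
    record = {"user": user_id, "status": "ISSUED",
              "time": datetime.now(timezone.utc).isoformat()}
    ledger[tag_id] = record
    return record

def give_back(tag_id: str) -> dict:
    """Log return of a device and mark it available again."""
    record = ledger[tag_id]
    record["status"] = "AVAILABLE"
    record["time"] = datetime.now(timezone.utc).isoformat()
    return record

issue("AR-GLASS-007", "PNR-1234567890")      # hypothetical tag and user IDs
print(ledger["AR-GLASS-007"]["status"])      # → ISSUED
give_back("AR-GLASS-007")
print(ledger["AR-GLASS-007"]["status"])      # → AVAILABLE
```

A production system would replace the dictionary with the networked database and attach the reader events to the microcontroller's real-time feed.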

[0054] An ultrasonic sensor is integrated within the AI-enabled camera on the assistive stick for enabling real-time detection of physical obstacles. The sensor is designed to issue haptic and auditory feedback to visually impaired users to ensure safe navigation. The ultrasonic sensor works by emitting ultrasonic waves and then measuring the time taken by these waves to bounce back after hitting the surface of the obstacles.

[0055] The ultrasonic sensor includes two main parts, viz. a transmitter and a receiver. The transmitter sends a short ultrasonic pulse towards the surface of obstacles, which propagates through the air at the speed of sound and reflects back as an echo when the pulse hits an obstacle. The receiver then detects the reflected echo from the surface of the obstacle, and a calculation is performed by the sensor based on the time interval between sending the pulse and receiving the echo to determine the distance to the obstacle.
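
The time-of-flight calculation described above can be sketched numerically: distance = (speed of sound × round-trip time) / 2, halved because the pulse travels to the obstacle and back.

```python
# Illustrative sketch of the ultrasonic time-of-flight calculation.
# 343 m/s is the approximate speed of sound in air at ~20 °C.

SPEED_OF_SOUND = 343.0  # metres per second

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to the obstacle from the echo's round-trip time (seconds)."""
    return SPEED_OF_SOUND * round_trip_s / 2

# An echo returning after 10 ms indicates an obstacle ~1.715 m away.
print(round(echo_distance_m(0.010), 3))  # → 1.715
```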

[0056] The determined data is sent to the microcontroller in signal form, based on which the microcontroller further processes the signal to provide haptic and audio alerts to visually impaired users. A database, embedded within the microcontroller's firmware or hosted on an attached memory module, stores structured datasets including train schedules, platform assignments, and PNR details.

[0057] Upon user interaction, the microcontroller executes query routines to retrieve relevant records. This data is then parsed and analyzed by a machine learning protocol that identifies patterns, optimizes route recommendations, and predicts congestion or delays. The microcontroller ensures read/write integrity through indexed keys and data validation routines, maintaining operational accuracy and supporting informed decision-making for the end-user.

[0058] Additionally, the AR glasses 103 and assistive sticks 104 are configured to receive real-time announcements from station staff via the embedded microphone 109, and to display or notify passengers accordingly, compensating for crowd noise or inaudible audio. The AR glasses 103 and assistive sticks 104 provide real-time alerts on theft-prone or unsafe areas of the station platform, based on analysis of local camera feeds and machine learning predictions.

[0059] The AR glasses 103 are configured for limited use only within a predefined vicinity of the housing unit 101, such that upon activation, they provide brief holographic guidance related to path layout and optimal station exits, and subsequently deactivate after completion of interaction. To address logistical challenges such as inventory management, sanitation, and potential loss or theft associated with deploying AR glasses 103 universally across all stations, the usage of the AR glasses 103 is restricted to the vicinity of the housing unit 101.

[0060] Passengers approach the housing unit 101, input their current location and intended platform, and briefly wear the AR glasses 103 for localized holographic guidance within the housing unit’s 101 immediate vicinity. Following this interaction, comprehensive routing information is transmitted directly to the passenger’s computing unit. This effectively simplifies management, enhances hygiene control, minimizes hardware deployment across multiple stations, and ensures efficient and secure distribution of AR-based navigational assistance.

[0061] After this short interaction, the microcontroller automatically sends the full routing information directly to the user’s computing unit, via third party communication interfaces. This enables the passenger to continue receiving navigation assistance on their own device without needing to carry or manage the AR glasses 103 throughout their journey. By confining the AR glasses 103 usage to the housing unit 101 vicinity, this approach significantly simplifies device tracking and maintenance, enhances user safety through controlled sanitation protocols, and reduces the number of AR glasses 103 required overall.

[0062] Moreover, a battery is associated with the system to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, known as a cathode and an anode. A voltage is generated between the anode and the cathode via oxidation/reduction reactions, thus producing the electrical energy supplied to the system.

[0063] The present invention works best in the following manner, where the housing unit 101 is installed at designated locations within the railway stations and is equipped with the storage chambers 102 that securely hold the AR glasses 103 and the assistive sticks 104. The scanning unit 107 integrated within the housing unit 101 verifies the passenger ticket details and platform assignment through the array of cameras in conjunction with the OCR module. The touch screen 112 displays relevant travel information, while the microphone 109 and the braille keypad 113 facilitate passenger interaction. Upon verification, the storage chamber 102 drawers open to issue the AR glasses 103 and the assistive sticks 104, both of which are embedded with the AI-enabled cameras 108, GPS modules, haptic feedback units, LED lights 110, and the speaker module 111. These components assist the passenger in navigation, obstacle detection, and provide real-time audio-visual alerts. The ultrasonic sensor integrated with the AI-enabled camera on the assistive stick enhances object detection for visually impaired users. The augmented reality holographic projector 106 mounted on the housing unit 101 displays visual navigation aids. All operations are monitored and optimized by the microcontroller interfaced with the IoT module and the database, ensuring seamless tracking, predictive assistance, and real-time updates across the station network.

[0064] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:
1) A navigation and safety system for railway stations, comprising:
i) a housing unit 101 configured to be installed at designated locations within railway stations;
ii) at least two storage chambers 102 provided within the housing unit 101 to securely hold augmented reality (AR) glasses 103 and assistive sticks 104 for passenger use;
iii) a biometric verification module provided with the housing unit 101 that authorizes dispensing of assistive sticks 104 only upon successful identity confirmation of visual impairment, to prevent unauthorized use and ensure fair allocation;
iv) an augmented reality holographic projector 106 mounted on the housing unit 101, configured to project visual guidance to assist passengers in navigation and boarding of trains;
v) a scanning unit 107 integrated within the housing unit 101 to verify passenger ticket details and platform assignment;
vi) the AR glasses 103 and assistive sticks 104 equipped with AI (Artificial Intelligence) enabled cameras 108, GPS (Global Positioning System) modules, haptic feedback units, and LED (Light Emitting Diode) lights 110, configured to assist passengers in navigation, obstacle detection, and safety alerts throughout their journey; and
vii) a speaker module 111 integrated with the AR glasses 103 and assistive sticks 104, providing voice instructions, real-time notifications, and alerts related to travel progress, platform changes, and safety warnings.

2) The navigation and safety system as claimed in claim 1, wherein a touch screen 112 is mounted on the housing unit 101, displaying travel-related information including passenger PNR (Passenger Name Record) numbers and platform details, and adapted to receive passenger inputs via a braille keypad 113 and voice commands through an integrated microphone 109.

3) The navigation and safety system as claimed in claim 1, wherein a mismatch between the ticket details and the platform assignment triggers alerts to passengers and authorized station personnel.

4) The navigation and safety system as claimed in claim 1, wherein an IoT module is interlinked with the housing units 101 at different railway stations, configured to track issuance and return of the AR glasses 103 and assistive sticks 104 between stations.

5) The navigation and safety system as claimed in claim 1, wherein an ultrasonic sensor is integrated with the AI-enabled camera on the assistive stick, configured to detect obstacles and provide haptic and audio alerts to visually impaired users.

6) The navigation and safety system as claimed in claim 1, wherein a database is integrated with the microcontroller, storing platform, train schedules, PNR details, and other relevant data continuously analyzed by the machine learning algorithm to optimize user experience and station operations.

7) The navigation and safety system as claimed in claim 1, wherein the scanning unit 107 comprises an array of cameras to capture images of the ticket, working in conjunction with an integrated OCR (Optical Character Recognition) module.

8) The navigation and safety system as claimed in claim 1, wherein the storage chambers 102 in the housing unit 101 are equipped with drawer arrangements 105 configured to securely open and close for issuance and return of AR glasses 103 and assistive sticks 104.

9) The navigation and safety system as claimed in claim 1, wherein the AR glasses 103 and assistive sticks 104 are configured to detect if a passenger is carrying luggage or heavy load, prompting the holographic projector 106 to suggest assistance options such as shuttle carts or porter services.

10) The navigation and safety system as claimed in claim 1, wherein failure to return the AR glasses 103 or assistive sticks 104 triggers notifications via the housing unit 101 screen 112, speaker modules 111, and connected computing unit(s), including escalating fines and reminders.

Documents

Application Documents

# Name Date
1 202521061682-STATEMENT OF UNDERTAKING (FORM 3) [27-06-2025(online)].pdf 2025-06-27
2 202521061682-REQUEST FOR EXAMINATION (FORM-18) [27-06-2025(online)].pdf 2025-06-27
3 202521061682-REQUEST FOR EARLY PUBLICATION(FORM-9) [27-06-2025(online)].pdf 2025-06-27
4 202521061682-PROOF OF RIGHT [27-06-2025(online)].pdf 2025-06-27
5 202521061682-POWER OF AUTHORITY [27-06-2025(online)].pdf 2025-06-27
6 202521061682-FORM-9 [27-06-2025(online)].pdf 2025-06-27
7 202521061682-FORM FOR SMALL ENTITY(FORM-28) [27-06-2025(online)].pdf 2025-06-27
8 202521061682-FORM 18 [27-06-2025(online)].pdf 2025-06-27
9 202521061682-FORM 1 [27-06-2025(online)].pdf 2025-06-27
10 202521061682-FIGURE OF ABSTRACT [27-06-2025(online)].pdf 2025-06-27
11 202521061682-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [27-06-2025(online)].pdf 2025-06-27
12 202521061682-EVIDENCE FOR REGISTRATION UNDER SSI [27-06-2025(online)].pdf 2025-06-27
13 202521061682-EDUCATIONAL INSTITUTION(S) [27-06-2025(online)].pdf 2025-06-27
14 202521061682-DRAWINGS [27-06-2025(online)].pdf 2025-06-27
15 202521061682-DECLARATION OF INVENTORSHIP (FORM 5) [27-06-2025(online)].pdf 2025-06-27
16 202521061682-COMPLETE SPECIFICATION [27-06-2025(online)].pdf 2025-06-27
17 Abstract.jpg 2025-07-11