Abstract: SYSTEM AND METHOD FOR ADAPTIVE AUGMENTED REALITY NAVIGATION IN INDOOR SPACES ABSTRACT A system (100) for adaptive Augmented Reality (AR) navigation in indoor spaces is disclosed. The system (100) comprises a positioning unit (102) to determine an indoor location of a user, and a route generation unit (104) adapted to dynamically adjust a navigation path of the user based on the determined indoor location of the user. The system (100) further comprises a communication engine (106) that fetches meta information, and a feedback unit (108) that generates multi-sensory feedback to guide the user along the adjusted navigation path. The system (100) further comprises a processor (110) that is configured to receive the determined indoor location and the fetched meta information; analyze them to generate the adjusted navigation path of the user; and guide the user along the adjusted navigation path. The system (100) dynamically adjusts routes in real time based on environmental conditions, crowd density, and obstacles for efficient and safe navigation. Claims: 10, Figures: 3
Description: BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to a navigation platform and particularly to a system for adaptive Augmented Reality (AR) navigation in indoor spaces.
Description of Related Art
[002] Indoor navigation technology has advanced significantly, incorporating methods such as Wireless Fidelity (Wi-Fi) positioning, Bluetooth beacons, LiDAR mapping, and augmented reality (AR). These technologies have allowed users to navigate complex indoor environments, such as malls, hospitals, airports, and office buildings. Indoor Positioning Systems (IPS) utilize a combination of GPS alternatives, including visual markers, radio frequency signals, and motion sensors, to determine a user’s precise location. While these systems provide accurate mapping and navigation, they often rely on static data that fails to adapt to real-time changes in the environment.
[003] Augmented reality has emerged as a key enabler in indoor navigation, offering immersive guidance by overlaying digital directions onto real-world views. Platforms such as Google AR Core and Apple AR Kit provide foundational tools for AR-based navigation, utilizing visual markers and spatial recognition to generate interactive pathways. However, these solutions generally depend on predefined routes and lack the ability to respond dynamically to factors such as sudden crowd congestion, temporary closures, or evolving user preferences. As a result, users may experience outdated or inefficient guidance when navigating indoor spaces.
[004] Despite advancements in indoor navigation, accessibility remains a critical challenge, particularly for individuals with disabilities. Some existing systems offer auditory cues or haptic feedback, but they often function in isolation rather than as part of a comprehensive multi-sensory experience. Additionally, indoor navigation technologies struggle to scale effectively in large, multi-floor environments, where factors such as escalator availability, elevator delays, or event-driven layout changes impact movement. These limitations underscore the need for an advanced system capable of adapting in real time to environmental and user-specific factors, ensuring a seamless and accessible navigation experience.
[005] There is thus a need for an improved and advanced system for adaptive Augmented Reality (AR) navigation in indoor spaces that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a system for adaptive Augmented Reality (AR) navigation in indoor spaces. The system comprises a positioning unit adapted to determine an indoor location of a user. The system further comprises a route generation unit adapted to dynamically adjust a navigation path of the user based on the determined indoor location of the user. The system further comprises a communication engine adapted to fetch meta information selected from environmental data, a crowd density, user preferences for factors such as lighting and noise level, or a combination thereof. The system further comprises a feedback unit adapted to generate multi-sensory feedback selected from a visual feedback, an auditory feedback, a haptic feedback, or a combination thereof to guide the user along the adjusted navigation path. The system further comprises a processor communicatively connected to the positioning unit, the route generation unit, the communication engine, and the feedback unit. The processor is configured to: receive the determined indoor location of the user from the positioning unit; receive the fetched meta information from the communication engine; analyze the determined indoor location of the user and the fetched meta information for generation of the adjusted navigation path of the user using the route generation unit; and activate the feedback unit adapted to guide the user along the adjusted navigation path.
[007] Embodiments in accordance with the present invention further provide a method for adaptive Augmented Reality (AR) navigation in indoor spaces. The method comprises the steps of: receiving a determined indoor location of a user from a positioning unit; receiving fetched meta information from a communication engine; analyzing the determined indoor location of the user and the fetched meta information for generation of an adjusted navigation path of the user; and activating a feedback unit adapted to guide the user along the adjusted navigation path.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a system for adaptive Augmented Reality (AR) navigation in indoor spaces.
[009] Next, embodiments of the present application may provide an Augmented Reality (AR) navigator that tailors the navigation path based on user preferences such as lighting, noise levels, and congestion, ensuring a comfortable and user-friendly experience.
[0010] Next, embodiments of the present application may provide an Augmented Reality (AR) navigator that dynamically adjusts routes in real-time based on environmental conditions, crowd density, and obstacles, leading to more efficient and safe navigation.
[0011] Next, embodiments of the present application may provide an Augmented Reality (AR) navigator whose multi-sensory feedback (visual, auditory, and haptic) benefits users with diverse needs, including individuals with visual or hearing impairments, ensuring inclusivity in navigation.
[0012] Next, embodiments of the present application may provide an Augmented Reality (AR) navigator that utilizes advanced positioning technologies such as Wireless Fidelity (Wi-Fi), Bluetooth Low Energy (BLE) beacons, Ultra-Wide Band (UWB) chips, and Radio Frequency Identifiers (RFID) to enhance location accuracy, making the system more reliable for complex indoor environments such as malls, airports, and hospitals.
[0013] Next, embodiments of the present application may provide an Augmented Reality (AR) navigator that leverages IoT connectivity to interact with smart building systems, optimizing navigation paths by considering factors like elevator availability, emergency exits, and restricted zones.
[0014] These and other advantages will be apparent from the present application of the embodiments described herein.
[0015] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0017] FIG. 1 illustrates a schematic block diagram of a system for adaptive Augmented Reality (AR) navigation in indoor spaces, according to an embodiment of the present invention;
[0018] FIG. 2 illustrates a block diagram of a processor, according to an embodiment of the present invention; and
[0019] FIG. 3 depicts a flowchart of a method for adaptive Augmented Reality (AR) navigation in indoor spaces, according to an embodiment of the present invention.
[0020] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0021] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0022] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0023] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0024] FIG. 1 illustrates a schematic block diagram of a system 100 for adaptive Augmented Reality (AR) navigation in indoor spaces, according to an embodiment of the present invention. In an embodiment of the present invention, the system 100 may be adapted to navigate a user in an indoor space such as, but not limited to, an airport, a shopping mall, and so forth. Embodiments of the present invention are intended to include or otherwise cover any indoor space for installation of the system 100, including known, related art, and/or later developed technologies. The system 100 may be adapted to provide feedback and cues to the user indicating navigational actions such as, but not limited to, turn right, turn left, look above, look below, stop, watch your step, and so forth. Embodiments of the present invention are intended to include or otherwise cover any feedback and cues that may be provided to the user, including known, related art, and/or later developed technologies. Further, the system 100 may provide guidance access to visually impaired and motor-challenged users.
[0025] According to the embodiments of the present invention, the system 100 may incorporate non-limiting hardware components to enhance the processing speed and efficiency as the system 100 may comprise a positioning unit 102, a route generation unit 104, a communication engine 106, a feedback unit 108, a processor 110, and a floor detection algorithm 112. In an embodiment of the present invention, the hardware components of the system 100 may be integrated with computer-executable instructions for overcoming the challenges and the limitations of the existing systems.
[0026] In an embodiment of the present invention, the positioning unit 102 may be centrally installed in an indoor location. In another embodiment of the present invention, the positioning unit 102 may be distributively installed in the indoor location. The positioning unit 102 may be mounted on surfaces such as, but not limited to, a mast, a ceiling, a pole, a wall, a beam, and so forth. Embodiments of the present invention are intended to include or otherwise cover any surface for mounting the positioning unit 102, including known, related art, and/or later developed technologies.
[0027] In an embodiment of the present invention, the positioning unit 102 may be adapted to determine the indoor location of the user. Further, the accuracy of the determined indoor location of the user may be enhanced by operations of units such as, but not limited to, a Wireless Fidelity (Wi-Fi) based positioning system, Bluetooth Low Energy (BLE) beacons, Ultra-Wideband (UWB) sensors, Radio Frequency Identifiers (RFID) based trackers, and so forth. Embodiments of the present invention are intended to include or otherwise cover any units for enhancing accuracy of the determined indoor location of the user, including known, related art, and/or later developed technologies.
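By way of illustration only, one conventional way to combine location estimates from several such units is an inverse-variance weighted average, where a source with a tighter accuracy figure contributes more to the final fix. The following sketch is an editorial illustration under assumed coordinates and accuracy values; it is not part of the disclosed invention.

```python
# Illustrative sketch (assumption): fusing indoor-position estimates
# from multiple sources (e.g. Wi-Fi, BLE, UWB) by inverse-variance
# weighting. Coordinates and accuracy figures below are invented.

def fuse_positions(estimates):
    """estimates: list of ((x, y), accuracy_m) tuples.

    Returns an inverse-variance weighted average of the (x, y)
    estimates; a smaller accuracy_m (a better fix) gets more weight.
    """
    if not estimates:
        raise ValueError("no position estimates available")
    weights = [1.0 / (acc * acc) for _, acc in estimates]
    total = sum(weights)
    x = sum(w * p[0] for (p, _), w in zip(estimates, weights)) / total
    y = sum(w * p[1] for (p, _), w in zip(estimates, weights)) / total
    return (x, y)

# Example: a 0.3 m UWB fix dominates a 3 m Wi-Fi fix.
fix = fuse_positions([((10.0, 4.0), 3.0),   # assumed Wi-Fi estimate
                      ((12.0, 5.0), 0.3)])  # assumed UWB estimate
```

In this example the fused fix lands very close to the UWB estimate, reflecting its hundredfold larger weight.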
[0028] In an embodiment of the present invention, the route generation unit 104 may be adapted to dynamically adjust a navigation path of the user based on the determined indoor location of the user. The adjusted navigation path may connect the determined indoor location of the user with a destination location of the user. In an embodiment of the present invention, the destination location may be manually selected by the user using a user device (not shown). For example, the user may browse shopping points in a mall and select a desired shopping point. In another embodiment of the present invention, the destination location may be pre-coded for every corresponding user.
[0029] For example, when the user is flying from city A to city B, the boarding gate and/or terminal may be pre-coded for every corresponding user (here, a traveler), and the route generation unit 104 may adjust the navigation path accordingly. Further, the navigation path adjusted by the route generation unit 104 may prioritize navigation paths that optimize energy efficiency, minimize congestion, avoid detected obstacles, and so forth, in real time using data from the communication engine 106.
[0030] In an embodiment of the present invention, the route generation unit 104 may be adapted to remember a navigational preference of the user. For example, the route generation unit 104 may be adapted to avoid staircases while adjusting the navigation path for a user with mobility impairments. Moreover, the route generation unit 104 may be adapted to prefer well-lit areas and avoid dark and obscure areas while adjusting the navigation path for a female user.
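The preference-aware adjustment described in paragraphs [0029] and [0030] can be illustrated, purely editorially, as a shortest-path search in which edge costs are inflated by per-preference penalty multipliers (e.g. a large penalty on staircase edges). The graph, tags, and penalty values below are assumptions of this sketch, not the claimed route generation unit.

```python
# Illustrative sketch (assumption): Dijkstra search where edges
# carry tags (e.g. "stairs", "lift") and user preferences map
# tags to cost multipliers, so penalised edges are avoided.
import heapq

def adjusted_path(graph, start, goal, penalties):
    """graph: {node: [(neighbor, metres, {tags})]}.
    penalties: {tag: multiplier}, e.g. {"stairs": 100.0}.
    Returns the list of nodes on the cheapest path, or None."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, metres, tags in graph.get(node, []):
            factor = max([penalties.get(t, 1.0) for t in tags] or [1.0])
            heapq.heappush(queue, (cost + metres * factor, nxt, path + [nxt]))
    return None

# Example: with stairs heavily penalised, the lift route wins
# even though it is longer in metres (14 vs 7 unpenalised).
g = {"lobby": [("stairs_top", 5, {"stairs"}), ("lift_top", 12, {"lift"})],
     "stairs_top": [("hall", 2, set())],
     "lift_top": [("hall", 2, set())]}
route = adjusted_path(g, "lobby", "hall", {"stairs": 100.0})
```

A multiplier of 100.0 effectively forbids staircases without disconnecting the graph; a softer value (say 2.0) would merely discourage them.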
[0031] In an embodiment of the present invention, the communication engine 106 may be adapted to fetch meta information relating to the indoor location. The meta information may be, but not limited to, environmental data such as a temperature level, a humidity level, an air quality, a crowd density, user preferences for factors such as lighting and noise level, and so forth. Embodiments of the present invention are intended to include or otherwise cover any meta information, including known, related art, and/or later developed technologies.
[0032] In an embodiment of the present invention, the communication engine 106 may be a smart Internet of Things (IoT)-based modem. The communication engine 106 may be adapted to continuously update the fetched meta information in real time based on changing environmental conditions of the indoor location. The communication engine 106 may further dynamically modify the adjusted navigation path to optimize user experience.
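The continuous-update behaviour of paragraph [0032] can be pictured as a comparison between the meta information used to plan the current path and a freshly fetched snapshot, with a re-route triggered when the difference crosses a threshold. The field names and the 0.2 density threshold below are assumptions of this illustrative sketch.

```python
# Illustrative sketch (assumption): decide whether freshly fetched
# meta information warrants recomputing the adjusted navigation
# path. "crowd_density" and "blocked_segments" are invented fields.

def needs_reroute(current_meta, fresh_meta, density_threshold=0.2):
    """Return True when the fetched meta information has changed
    enough to justify re-running route generation."""
    # Any change in blocked segments (e.g. a temporary closure)
    # forces an immediate re-route.
    if fresh_meta.get("blocked_segments") != current_meta.get("blocked_segments"):
        return True
    # Otherwise, re-route only on a significant crowd-density drift.
    drift = abs(fresh_meta.get("crowd_density", 0.0)
                - current_meta.get("crowd_density", 0.0))
    return drift > density_threshold

stale = {"crowd_density": 0.3, "blocked_segments": set()}
fresh = {"crowd_density": 0.75, "blocked_segments": set()}
```

Here the density drift (0.45) exceeds the assumed threshold, so a re-route would be triggered; an unchanged snapshot would not.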
[0033] In an embodiment of the present invention, the feedback unit 108 may be a portable device that may be carried by the user while traversing in the indoor location. In an embodiment of the present invention, the feedback unit 108 may be a wearable device that may be worn by the user while traversing in the indoor location. The feedback unit 108 may be adapted to generate a multi-sensory feedback to guide the user along the adjusted navigation path. The multi-sensory feedback may be, but not limited to, a visual feedback, an auditory feedback, a haptic feedback, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the multi-sensory feedback, including known, related art, and/or later developed technologies.
[0034] The feedback unit 108 may further be adapted to generate an Augmented Reality (AR) overlay that may be projected on glasses and/or goggles worn by the user. The Augmented Reality (AR) overlay may provide a turn-by-turn navigation to the user. The Augmented Reality (AR) overlay may further indicate signages such as, but not limited to, restrooms, cafeteria, escalators, lifts, exits, and so forth.
[0035] In an embodiment of the present invention, the feedback unit 108 may be adapted to enable the user to adjust a feedback intensity based on a need of the user. The adjustment of the feedback intensity may enable the feedback unit 108 to provide distinct and differentiable multi-sensory feedback for users with mobility impairments, or auditory feedback for users with visual impairments. Further, the feedback unit 108 may personalize the feedback mechanism based on user preferences and accessibility needs. Moreover, the preferences and the accessibility needs may be stored in a user profile accessible by the processor 110.
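The personalization described in paragraph [0035] amounts to mapping a stored user profile to a set of feedback channels and an intensity level. The channel names, profile keys, and defaults in the following sketch are editorial assumptions, not the disclosed feedback unit.

```python
# Illustrative sketch (assumption): choose feedback channels and
# intensity from a user profile. Keys like "needs" and "intensity"
# are invented for this example.

def select_feedback(profile):
    """Map accessibility needs to feedback channels and intensity."""
    channels = {"visual", "auditory", "haptic"}
    if "visual_impairment" in profile.get("needs", ()):
        channels.discard("visual")      # rely on sound and touch
    if "hearing_impairment" in profile.get("needs", ()):
        channels.discard("auditory")    # rely on sight and touch
    # User-adjustable intensity, clamped to the range 0..1.
    intensity = profile.get("intensity", 0.5)
    return {"channels": channels,
            "intensity": min(max(intensity, 0.0), 1.0)}

cfg = select_feedback({"needs": {"visual_impairment"}, "intensity": 0.8})
```

For a visually impaired user the sketch drops the visual channel and keeps auditory and haptic guidance at the requested intensity.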
[0036] In an embodiment of the present invention, the processor 110 may be connected to the positioning unit 102, the route generation unit 104, the communication engine 106, and to the feedback unit 108. The processor 110 may further be configured to execute computer-executable instructions to generate an output relating to the system 100. According to embodiments of the present invention, the processor 110 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the processor 110 including known, related art, and/or later developed technologies. In an embodiment of the present invention, the processor 110 may further be explained in conjunction with FIG. 2.
[0037] FIG. 2 illustrates a block diagram of the processor 110, according to an embodiment of the present invention. The processor 110 may comprise the computer-executable instructions in the form of programming modules such as a data receiving module 200, a data analysis module 202, a route generation module 204, and a feedback module 206.
[0038] In an embodiment of the present invention, the data receiving module 200 may be configured to receive the determined indoor location of the user from the positioning unit 102. The data receiving module 200 may be configured to receive the fetched meta information from the communication engine 106. The data receiving module 200 may be configured to transmit the determined indoor location and the fetched meta information to the data analysis module 202.
[0039] The data analysis module 202 may be activated upon receipt of the determined indoor location and the fetched meta information from the data receiving module 200. In an embodiment of the present invention, the data analysis module 202 may be configured to analyze the determined indoor location and the fetched meta information. The analysis by the data analysis module 202 may further incorporate the destination location of the user and the time available to the user. Upon analysis, the data analysis module 202 may transmit a routing signal to the route generation module 204.
[0040] The route generation module 204 may be activated upon receipt of the routing signal from the data analysis module 202. In an embodiment of the present invention, the route generation module 204 may be configured to activate the route generation unit 104 to generate the adjusted navigation path connecting the indoor location with the destination location. In an embodiment of the present invention, the route generation module 204 may be configured to activate the floor detection algorithm 112 to enable the user to seamlessly transition, in real time, between different floors in the indoor space. Upon generation of the adjusted navigation path, the route generation module 204 may transmit an activation signal to the feedback module 206.
[0041] The feedback module 206 may be activated upon receipt of the activation signal from the route generation module 204. In an embodiment of the present invention, the feedback module 206 may be configured to activate the feedback unit 108 adapted to guide the user along the adjusted navigation path. The guidance may be carried out by the multi-sensory feedback such as, but not limited to, a visual feedback, an auditory feedback, a haptic feedback, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the multi-sensory feedback, including known, related art, and/or later developed technologies.
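The module chain of FIG. 2 — receive, analyze, generate a route, then activate feedback — can be illustrated as a simple function pipeline. Every name and behaviour in this sketch is an editorial assumption; the real modules are defined only by the specification above.

```python
# Illustrative sketch (assumption): the FIG. 2 pipeline reduced to
# one function per module. route_fn and feedback_fn stand in for
# the route generation unit and the feedback unit respectively.

def run_pipeline(location, meta, destination, route_fn, feedback_fn):
    # Data receiving module: bundle the inputs for analysis.
    inputs = {"location": location, "meta": meta,
              "destination": destination}
    # Data analysis module: derive a routing signal, e.g. which
    # segments to avoid based on the fetched meta information.
    routing_signal = {"avoid": meta.get("blocked_segments", set()),
                      **inputs}
    # Route generation module: produce the adjusted path.
    path = route_fn(routing_signal)
    # Feedback module: guide the user along the path.
    return feedback_fn(path)

out = run_pipeline(
    location="lobby",
    meta={"blocked_segments": {"corridor_b"}},
    destination="gate_7",
    route_fn=lambda sig: [sig["location"], "atrium", sig["destination"]],
    feedback_fn=lambda path: " -> ".join(path))
```

Passing the route and feedback behaviours in as parameters mirrors how the processor activates separate hardware units rather than computing everything itself.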
[0042] FIG. 3 depicts a flowchart of a method 300 for the adaptive Augmented Reality (AR) navigation in the indoor spaces using the system 100, according to an embodiment of the present invention.
[0043] At step 302, the system 100 may be activated upon receipt of the user input.
[0044] At step 304, the system 100 may determine the indoor location of the user.
[0045] At step 306, the system 100 may fetch the meta information.
[0046] At step 308, the system 100 may analyze the determined indoor location of the user and the fetched meta information.
[0047] At step 310, the system 100 may generate the adjusted navigation path.
[0048] At step 312, the system 100 may activate the feedback unit 108 adapted to guide the user along the adjusted navigation path.
[0049] At step 314, the system 100 may interact with the user.
[0050] At step 316, the system 100 may further adjust the navigation path based on the interaction with the user.
[0051] At step 318, the system 100 may dispatch the user to the destination location.
[0052] At step 320, the system 100 may collect feedback from the user.
[0053] At step 322, the system 100 may incorporate the collected feedback.
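The steps 302 through 322 above can be pictured, purely as an editorial illustration, as a locate/fetch/plan/guide loop that repeats until the user reaches the destination. All helper behaviours are passed in as stand-ins (assumptions), since the specification does not prescribe their implementation.

```python
# Illustrative sketch (assumption): the method 300 as a loop.
# locate, fetch_meta, plan, and guide are caller-supplied stand-ins
# for the positioning unit, communication engine, route generation
# unit, and feedback unit.

def navigate(locate, fetch_meta, plan, guide, destination, max_steps=100):
    for _ in range(max_steps):
        here = locate()                        # step 304
        meta = fetch_meta()                    # step 306
        path = plan(here, destination, meta)   # steps 308-310
        guide(path)                            # step 312
        if here == destination:                # step 318: arrived
            return "arrived"
    return "timeout"

# Example run over a simulated sequence of user positions.
positions = iter(["lobby", "atrium", "gate_7", "gate_7"])
log = []
status = navigate(
    locate=lambda: next(positions),
    fetch_meta=lambda: {},
    plan=lambda here, dest, meta: [here, dest],
    guide=log.append,
    destination="gate_7")
```

Steps 314, 320, and 322 (user interaction and feedback collection) would hook into the loop body; they are omitted here to keep the sketch minimal.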
[0054] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0055] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims: CLAIMS
I/We Claim:
1. A system (100) for adaptive Augmented Reality (AR) navigation in indoor spaces, the system (100) comprising:
a positioning unit (102) adapted to determine an indoor location of a user;
a route generation unit (104) adapted to dynamically adjust a navigation path of the user based on the determined indoor location of the user;
a communication engine (106) adapted to fetch meta information selected from environmental data, a crowd density, user preferences for factors such as lighting and noise level, or a combination thereof;
a feedback unit (108) adapted to generate a multi-sensory feedback selected from a visual feedback, an auditory feedback, a haptic feedback, or a combination thereof to guide the user along the adjusted navigation path; and
a processor (110) communicatively connected to the positioning unit (102), the route generation unit (104), the communication engine (106), and the feedback unit (108), characterized in that the processor (110) is configured to:
receive the determined indoor location of the user from the positioning unit (102);
receive the fetched meta information from the communication engine (106);
analyze the determined indoor location of the user and the fetched meta information for generation of the adjusted navigation path of the user using the route generation unit (104); and
activate the feedback unit (108) adapted to guide the user along the adjusted navigation path.
2. The system (100) as claimed in claim 1, wherein the processor (110) is configured to enable the user to seamlessly transition between different floors in the indoor space using a floor detection algorithm (112).
3. The system (100) as claimed in claim 1, wherein the positioning unit (102) comprises a Wireless Fidelity (Wi-Fi) based positioning system, Bluetooth Low Energy (BLE) beacons, Ultra-Wideband (UWB) sensors, Radio Frequency Identifiers (RFID) based trackers, or a combination thereof to enhance the accuracy of the determined indoor location of the user.
4. The system (100) as claimed in claim 1, wherein the communication engine (106) continuously updates the fetched meta information in real time based on changing environmental conditions and dynamically modifies the adjusted navigation path to optimize user experience.
5. The system (100) as claimed in claim 1, wherein the feedback unit (108) personalizes the feedback mechanism based on user preferences and accessibility needs, such that the preferences are stored in a user profile accessible by the processor (110).
6. The system (100) as claimed in claim 1, wherein the route generation unit (104) prioritizes navigation paths that optimize energy efficiency, minimize congestion, or avoid obstacles detected in real time using data from the communication engine (106).
7. A method (300) for adaptive Augmented Reality (AR) navigation in indoor spaces, the method (300) being characterized by the steps of:
receiving a determined indoor location of a user from a positioning unit (102);
fetching meta information from a communication engine (106);
analyzing the determined indoor location of the user and the fetched meta information for generation of an adjusted navigation path of the user; and
activating a feedback unit (108) adapted to guide the user along the adjusted navigation path.
8. The method (300) as claimed in claim 7, wherein the positioning unit (102) comprises a Wireless Fidelity (Wi-Fi) based positioning system, Bluetooth Low Energy (BLE) beacons, Ultra-Wideband (UWB) sensors, Radio Frequency Identifiers (RFID) based trackers, or a combination thereof to enhance the accuracy of the determined indoor location of the user.
9. The method (300) as claimed in claim 7, wherein the communication engine (106) continuously updates the fetched meta information in real time based on changing environmental conditions and dynamically modifies the adjusted navigation path to optimize user experience.
10. The method (300) as claimed in claim 7, wherein the feedback unit (108) personalizes the feedback mechanism based on user preferences and accessibility needs, such that the preferences are stored in a user profile accessible by a processor (110).
Date: March 27, 2025
Place: Noida
Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant
| # | Name | Date |
|---|---|---|
| 1 | 202541030224-STATEMENT OF UNDERTAKING (FORM 3) [28-03-2025(online)].pdf | 2025-03-28 |
| 2 | 202541030224-REQUEST FOR EARLY PUBLICATION(FORM-9) [28-03-2025(online)].pdf | 2025-03-28 |
| 3 | 202541030224-POWER OF AUTHORITY [28-03-2025(online)].pdf | 2025-03-28 |
| 4 | 202541030224-OTHERS [28-03-2025(online)].pdf | 2025-03-28 |
| 5 | 202541030224-FORM-9 [28-03-2025(online)].pdf | 2025-03-28 |
| 6 | 202541030224-FORM FOR SMALL ENTITY(FORM-28) [28-03-2025(online)].pdf | 2025-03-28 |
| 7 | 202541030224-FORM 1 [28-03-2025(online)].pdf | 2025-03-28 |
| 8 | 202541030224-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [28-03-2025(online)].pdf | 2025-03-28 |
| 9 | 202541030224-EDUCATIONAL INSTITUTION(S) [28-03-2025(online)].pdf | 2025-03-28 |
| 10 | 202541030224-DRAWINGS [28-03-2025(online)].pdf | 2025-03-28 |
| 11 | 202541030224-DECLARATION OF INVENTORSHIP (FORM 5) [28-03-2025(online)].pdf | 2025-03-28 |
| 12 | 202541030224-COMPLETE SPECIFICATION [28-03-2025(online)].pdf | 2025-03-28 |