Abstract: INTEGRATED NAVIGATION SYSTEM AND METHOD FOR NAVIGATION ASSISTANCE WITHIN A BUILT FACILITY. An integrated navigation system (100) for navigation assistance within a built facility comprises a navigation display interface (102) having a network of sensors (102A) and a map layout (102B). Each sensor is positioned at the map layout of the navigation display interface (102) to indicate one location of a plurality of defined locations within the built facility. The integrated navigation system (100) comprises a hand-held portable interaction device (104) configured to be moved over the navigation display interface (102) such that a navigation path from a current location to a target location of a user is automatically highlighted. When the hand-held portable interaction device (104) is moved to a target spot corresponding to the target location of the user, a sensor is configured to detect the hand-held portable interaction device (104) and cause the navigation display interface (102) to highlight the navigation path. A display arrangement (106) electronically connected to the navigation display interface (102) is configured to perform a mechanical movement to automatically make supplementary information about the target location visible to the user. FIG. 1
Description:
TECHNICAL FIELD
The present disclosure relates generally to the field of navigation systems and, more specifically, to an integrated navigation system and a method for use in the integrated navigation system for navigation assistance within a built facility.
BACKGROUND
When people arrive in a new location, such as a city, a town, or a specific place (e.g., an airport, a hospital, or a shopping mall), they typically start with very little understanding of their surroundings. Navigating a new place can be challenging due to unfamiliarity with the surroundings. People commonly employ various strategies to navigate and explore new places to address such challenges. For example, people might ask local residents for suggestions or directions, search online for information about the new place, look for signs that point out interesting spots, or use maps that show a general layout of the new place. However, collecting information about the new place can take a lot of time and can be stressful. For example, local residents may not always be around to assist, or their advice might be based on personal opinion rather than accurate details. Searching the internet can feel overwhelming due to the large amount of information, which may not always be reliable or up-to-date. Signs around the new place might be few and far between, or might not always provide accurate information.
Additionally, maps may not always show the latest updates or provide enough details to be fully useful. The inconsistency and potential inaccuracy in the information provided on the maps can lead to confusion and frustration. Visitors might end up missing out on exploring some interesting or worthwhile destinations simply because they were not aware of the destinations or were misled by incorrect or incomplete details. Navigation becomes harder when the information is incorrect or missing, which can spoil the experience of exploring new areas.
Existing navigation devices, such as digital kiosks, extended reality (XR) lab kiosks, and immersive display systems for interacting with three-dimensional (3D) content, are found in various public spaces like airports, shopping malls, and hospitals. Digital kiosks, such as the 22Miles digital kiosks, are interactive terminals that offer touchscreen interfaces to provide digital maps, directions, and place information. While the digital kiosks offer useful information, they are prone to technical issues and require regular maintenance. Additionally, many people are not technically proficient and may struggle to interact with such navigation devices. The lack of an intuitive interaction device can make navigation challenging and inaccessible, especially for users without technical knowledge. Touchscreen malfunctions or outdated information can also hinder the user experience. Another existing device used for navigation is the XR lab kiosk, which provides an immersive experience by combining virtual reality (VR) headsets with computer setups, allowing people to explore real environments in a virtual space. However, XR lab kiosks require complex hardware setups that are often expensive, making them less accessible for widespread use. The display system for interacting with 3D content includes multiple sensors and gesture detection modules, leading to increased costs and technical challenges during implementation. A navigation device dependent on physical interactions from users may encounter performance issues due to sensor limitations and reliance on specific hardware. Therefore, a technical problem exists in navigating unfamiliar areas due to the complexity of facilities, overwhelming maps, lack of information, and difficulties in remembering directions.
Therefore, in light of the foregoing discussion, there exists a need to overcome the aforementioned drawbacks associated with conventional navigation approaches and devices.
SUMMARY
The present disclosure provides an integrated navigation system and a method for use in an integrated navigation system for navigation assistance within a built facility that combines a sensor network, a hand-held interaction device, and a mechanically dynamic display arrangement to assist users in navigating. The present disclosure provides a solution to the technical problem of how to navigate unfamiliar areas due to the complexity of facilities, overwhelming maps, lack of information, and difficulties in remembering directions. An aim of the present disclosure is to provide a solution that overcomes at least partially the problems encountered in the prior art and provides an improved integrated navigation system and an improved method for navigation assistance within the built facility by providing automatic highlighting of navigation paths and displaying supplementary information about target locations.
The object of the present disclosure is achieved by the solutions provided in the enclosed independent claims. Advantageous implementations of the present disclosure are further defined in the dependent claims.
In one aspect, the present disclosure provides an integrated navigation system for navigation assistance within a built facility, comprising a navigation display interface comprising a network of sensors and a map layout. Each sensor is positioned at the map layout of the navigation display interface to indicate one location of a plurality of defined locations within the built facility. The integrated navigation system further comprises a hand-held portable interaction device configured to be moved over the navigation display interface such that a navigation path from a current deployment location of the integrated navigation system to a target location is automatically highlighted on the navigation display interface. When the hand-held portable interaction device is moved to a target spot on the navigation display interface, a sensor of the network of sensors is configured to detect the hand-held portable interaction device moved to the target spot to cause the navigation display interface to highlight the navigation path. The integrated navigation system further comprises a display arrangement electronically connected to the navigation display interface. The display arrangement is configured to perform a mechanical movement in the display arrangement to automatically make supplementary information about the target location visible to a user.
The present disclosure introduces an integrated navigation system designed to provide navigation assistance within the built facility. The integrated navigation system comprises the navigation display interface with the network of sensors and the map layout, where each sensor indicates a specific location within the built facility. Additionally, the hand-held portable interaction device is used to interact with the navigation display interface. When the user moves the hand-held portable interaction device over the navigation display interface, a navigation path from the current deployment location to a target location is highlighted. The network of sensors detects the movement and the position of the hand-held portable interaction device, causing the navigation display interface to highlight the navigation path. Furthermore, the display arrangement electronically connected to the navigation display interface provides supplementary information about the target spot. The integrated navigation system enables seamless exploration of built environments, allowing the users to navigate easily to the target spots using the highlighted navigation path and additional information. The integrated components of the integrated navigation system, including the hand-held portable interaction device, the network of sensors, and the display arrangement, work together to provide a comprehensive and user-friendly navigation experience. The hand-held portable interaction device activates the highlighted navigation path, the network of sensors detects movement, and the display arrangement presents supplementary information, all contributing to an enhanced experience within the built facility.
In another aspect, the present disclosure provides a method for use in the integrated navigation system for navigation assistance within the built facility. The method comprises detecting a movement of the hand-held portable interaction device over the navigation display interface of the integrated navigation system. The method further comprises detecting, using a sensor from the network of sensors of the navigation display interface, a target spot on the navigation display interface based on the movement and placement of the hand-held portable interaction device to the target spot on the navigation display interface. The method further comprises determining a current user location corresponding to a deployment location of the integrated navigation system. The method further comprises automatically generating a navigation path from the current user location to a target user location corresponding to the identified target spot. The method further comprises automatically highlighting the generated navigation path on the navigation display interface. The method further comprises activating the display arrangement electronically connected to the navigation display interface and performing a mechanical movement in the display arrangement to automatically make supplementary information about the target user location visible to a user.
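The sequence of method steps above can be illustrated with a minimal software sketch. This sketch is illustrative only and not part of the claimed subject matter: the grid model of the map layout, the breadth-first search, and the names `MAP_GRID`, `find_path`, and `highlight_path` are assumptions introduced here for clarity; the disclosure does not prescribe any particular path-generation algorithm.

```python
from collections import deque

# Hypothetical model of the map layout 102B as a grid of walkable (0)
# and blocked (1) cells; a sensor detection yields the target cell.
MAP_GRID = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]

def find_path(grid, start, goal):
    """Breadth-first search: shortest walkable path from start to goal."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the parent links back to the start to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parents):
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no route between the two locations

def highlight_path(path):
    # Stand-in for driving the navigation display interface 102; a real
    # system would light up the corresponding cells of the map layout.
    return " -> ".join(f"({r},{c})" for r, c in path)

route = find_path(MAP_GRID, (0, 0), (3, 3))
print(highlight_path(route))
```

In this sketch, the detected target spot supplies `goal`, the deployment location supplies `start`, and the returned cell sequence is what the interface would highlight.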
The method achieves all the advantages and technical effects of the integrated navigation system of the present disclosure.
It is to be appreciated that all the aforementioned implementation forms can be combined.
It has to be noted that all devices, elements, circuitry, units and means described in the present application could be implemented in the software or hardware elements or any kind of combination thereof. All steps which are performed by the various entities described in the present application as well as the functionalities described to be performed by the various entities are intended to mean that the respective entity is adapted to or configured to perform the respective steps and functionalities. Even if, in the following description of specific embodiments, a specific functionality or step to be performed by external entities is not reflected in the description of a specific detailed element of that entity which performs that specific step or functionality, it should be clear for a skilled person that these methods and functionalities can be implemented in respective software or hardware elements, or any kind of combination thereof. It will be appreciated that features of the present disclosure are susceptible to being combined in various combinations without departing from the scope of the present disclosure as defined by the appended claims.
Additional aspects, advantages, features, and objects of the present disclosure would be made apparent from the drawings and the detailed description of the illustrative implementations construed in conjunction with the appended claims that follow.
BRIEF DESCRIPTION OF THE DRAWINGS
The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the present disclosure is not limited to specific methods and instrumentalities disclosed herein. Moreover, those skilled in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:
FIG. 1 is a diagram that illustrates an integrated navigation system, in accordance with an embodiment of the present disclosure;
FIG. 2 is a diagram that illustrates an isometric view of the integrated navigation system, in accordance with an embodiment of the present disclosure;
FIG. 3 is a diagram of the integrated navigation system that illustrates a side view of the integrated navigation system, in accordance with an embodiment of the present disclosure;
FIG. 4A is a diagram of a parallel stacked flex of the display arrangement in the integrated navigation system, in accordance with an embodiment of the present disclosure;
FIG. 4B is another diagram of the parallel stacked flex of the display arrangement in the integrated navigation system, in accordance with an embodiment of the present disclosure;
FIG. 4C is a diagram of an information flex holder for the parallel stacked flex in the integrated navigation system, in accordance with an embodiment of the present disclosure;
FIG. 5 is the illustration of a side view of the navigation display interface, in accordance with an embodiment of the present disclosure;
FIG. 6 is a diagram that illustrates an exploded view of the hand-held portable interaction device, in accordance with an embodiment of the present disclosure;
FIGs. 7A, 7B, and 7C are diagrams depicting an operational workflow of the integrated navigation system, in accordance with an embodiment of the present disclosure; and
FIG. 8 is a flowchart of a method for use in an integrated navigation system for navigation assistance within a built facility, in accordance with an embodiment of the present disclosure.
In the accompanying drawings, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
DETAILED DESCRIPTION OF EMBODIMENTS
The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
FIG. 1 is a diagram that illustrates an integrated navigation system in accordance with an embodiment of the present disclosure. With reference to FIG. 1, there is shown an integrated navigation system 100. The integrated navigation system 100 includes a navigation display interface 102, a hand-held portable interaction device 104, a display arrangement 106, and a direction retention device 108. The navigation display interface 102 further includes a network of sensors 102A and a map layout 102B. The network of sensors 102A is represented by dashed sections used for illustration only. The hand-held portable interaction device 104 includes a scroll wheel 104A and a microcontroller 104B. The integrated navigation system 100 further shows a plurality of independent physical sheets 106A in the display arrangement 106. The navigation display interface 102 and the display arrangement 106 are electronically connected.
The integrated navigation system 100 refers to a system that combines different types of navigation sensors and methods to provide more accurate and reliable location, direction, and movement information. The integrated navigation system 100 may include sensors like a gyroscope, which measures orientation; an accelerometer, which detects changes in speed; and a magnetometer, which senses the earth's magnetic field to find direction; in some cases, sensors like radar or sonar are also used to detect objects and distances. The integrated navigation system 100 might use different navigation techniques. Examples of the integrated navigation system 100 may include, but are not limited to, the global positioning system (GPS), an inertial navigation system (INS), radio navigation, a satellite-based augmentation system (SBAS), and cellular navigation. The integrated navigation system 100 is typically used in vehicles, ships, aircraft, and other applications that require precise navigation.
The navigation display interface 102 refers to a component of the integrated navigation system 100 that provides a platform to the user for visually communicating with the integrated navigation system 100. The navigation display interface 102 refers to a screen or digital display where all the important navigation information is presented. In an implementation, the navigation display interface 102 is a screen showing a basic map. In another implementation, the navigation display interface 102 is advanced, having an interactive display that lets the user control or customize the integrated navigation system 100. For example, in cars, the navigation display interface 102 is often a touchscreen embedded in a dashboard, which shows useful details like maps, turn-by-turn directions, current location, and even live traffic updates, assisting the driver in making informed decisions.
The network of sensors 102A refers to a collection of interconnected sensors working together to gather and transmit data about the environment or specific objects. Each sensor within the network of sensors 102A monitors specific parameters, providing comprehensive, real-time information to support navigation. The network of sensors 102A is often used in autonomous vehicles, wearable technology, robotics, indoor navigation systems, medical devices, and smartphones.
The map layout 102B is the visual representation of a physical space or environment, showing key locations and features. The map layout 102B serves as a guide for navigating through various areas (such as hospitals, schools, airports, hotels, roads and highways, and railway stations), facilitating the users to understand their surroundings and find their way to specific destinations. The map layout 102B identifies specific locations of various areas, like patient rooms in the hospital map layout, classrooms in the school map layout, guest rooms in the hotel map layout, and platforms in the railway station map layout. The map layout 102B also displays significant features and landmarks such as points of interest, safety features, and service facilities.
The hand-held portable interaction device 104 is a portable electronic device that is designed for user interaction within the integrated navigation system 100. The hand-held portable interaction device 104 is engineered for easy use and portability, allowing users to operate it with one hand while on the move. The hand-held portable interaction device 104 typically features a user-friendly interface, which may include a touchscreen, buttons, or a combination of both, enabling the users to input commands, select options, and navigate through various navigation system features. The hand-held portable interaction device 104 may also integrate with the network of sensors 102A, which can detect the device’s position and movement. Further, the hand-held portable interaction device 104 includes the scroll wheel 104A and the microcontroller 104B.
The scroll wheel 104A is a cylindrical or disc-shaped component that the user rotates with the finger. The scroll wheel 104A is made from plastic or rubber and may have a textured surface to enhance grip and control. The scroll wheel 104A may consist of an encoder or sensor that detects the rotation of the scroll wheel 104A. The scroll wheel 104A assembly includes electrical contacts or connections that interface with the hand-held portable interaction device 104 circuitry. Such electrical contacts transmit signals generated by the scroll wheel 104A movement to the microcontroller 104B.
The microcontroller 104B is often referred to as a "computer on a chip" because the microcontroller 104B contains components such as a processor core, memory, and programmable input/output peripherals, all integrated into a single chip. The combination of such components allows the microcontroller 104B to execute programs and control various devices in a wide range of applications. Examples of the microcontroller 104B may include, but are not limited to, the Arduino family (Arduino Uno, Arduino Mega), PIC microcontrollers, AVR microcontrollers, ARM Cortex-M microcontrollers, the ESP8266, the Microchip Technology SAM series, the NXP Kinetis series, and other microcontrollers. Moreover, the microcontroller 104B may refer to one or more individual processors, processing devices, computing devices, or a processing unit that is part of a machine.
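As an illustration of how the signals from the scroll wheel 104A could be interpreted by the microcontroller 104B, the sketch below decodes the two-channel (A/B) output of a quadrature encoder into a signed step count. This is an assumption introduced for clarity: the disclosure does not fix a particular encoder type, and the transition table and function name are hypothetical.

```python
# Hypothetical quadrature decode for the scroll wheel 104A.
# Each (previous A/B state, new A/B state) pair maps to a step of
# +1 (clockwise) or -1 (counter-clockwise); unknown pairs count as 0.
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def count_steps(samples):
    """Accumulate signed wheel steps from a sequence of 2-bit A/B samples."""
    position = 0
    prev = samples[0]
    for state in samples[1:]:
        position += TRANSITIONS.get((prev, state), 0)  # 0 = no change/glitch
        prev = state
    return position

# One full clockwise cycle (+4) followed by one counter-clockwise step (-1):
print(count_steps([0b00, 0b01, 0b11, 0b10, 0b00, 0b10]))  # → 3
```

The accumulated count is what the electrical contacts of the scroll wheel assembly would deliver to the microcontroller 104B for interpretation, e.g., as scrolling through locations on the map layout 102B.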
The display arrangement 106 is a setup or configuration of components that present visual information to the user in a structured and organized manner. The display arrangement 106 can include the physical components and the way the physical components are organized to deliver specific types of information. The display arrangement 106 may include a display screen, user interface elements, backlighting, interactive components (touchscreen, gesture sensors, and styluses), connectivity, and mounting and adjustment mechanism (such as flex changing mechanism). The display arrangement 106 is configured to perform a mechanical movement that automatically makes supplementary information about a target location visible to the user.
The plurality of independent physical sheets 106A refers to a collection of multiple individual sheets used in the display arrangement 106. The plurality of independent physical sheets 106A are physical sheets made up of tangible materials that can display information, such as paper or thin plastic sheets with printed or other printable substrates. Each of the plurality of independent physical sheets 106A presents separate pieces of information. For example, one sheet shows a detailed map of a particular floor of a building, while another sheet provides information about facilities or services available in the building.
The direction retention device 108 is a mechanism or system designed to maintain or preserve a specific direction or orientation, ensuring that an object, system, or person remains aligned or oriented in a particular way. Examples of the direction retention device 108 may include, but are not limited to, gyroscopes, magnetic compasses, inertial measurement units (IMUs), and other direction retention devices.
The hand-held portable interaction device 104 moves over the navigation display interface 102 to select the target spot, with the integrated navigation system 100 automatically highlighting a navigation path from the current user location to the target spot. The display arrangement 106, electronically connected to the navigation display interface 102, adjusts to make supplementary information about the target spot visible to the user. The integrated navigation system 100 provides accurate navigation and additional details, enhancing user orientation within complex environments.
There is provided that the integrated navigation system 100 comprises the navigation display interface 102. The navigation display interface 102 comprises the network of sensors 102A and the map layout 102B. Each sensor in the network of sensors 102A is positioned at the map layout 102B of the navigation display interface 102 to indicate one location of a plurality of defined locations within the built facility. The network of sensors 102A is positioned corresponding to the locations in the map layout 102B. The network of sensors 102A tracks the navigation path that the user selects to follow on the map layout 102B. The network of sensors 102A communicates user information to the navigation display interface 102. The navigation display interface 102 then highlights the selected navigation path on the map layout 102B. The highlighting of the navigation path provides seamless navigation assistance within the built facility, enabling users to explore and navigate the environment easily. The integration of the network of sensors 102A with the navigation display interface 102 and map layout 102B allows for real-time updates and accurate path highlighting, enhancing the overall user experience in navigating the built facility.
The integrated navigation system 100 comprises the hand-held portable interaction device 104 configured to be moved over the navigation display interface 102. The navigation path from the current user location to the target user location is automatically highlighted on the navigation display interface 102. The hand-held portable interaction device 104 is moved to the target spot corresponding to the target user location on the navigation display interface 102. The network of sensors 102A detects the movement and positions of the hand-held portable interaction device 104. The network of sensors 102A detects that the hand-held portable interaction device 104 has been moved to the target spot on the navigation display interface 102. The hand-held portable interaction device 104 sends a signal to the mechanical interfaces (such as the navigation display interface 102). The signal may communicate the position or movement of the hand-held portable interaction device 104, allowing the network of sensors 102A to track the target spot and adjust the navigation display interface 102 accordingly. The navigation display interface 102 updates the map layout 102B to reflect the target spot's most relevant and updated information. Also, the navigation display interface 102 highlights the optimal path (the navigation path) from the user's current location to the selected target spot, providing clear and intuitive navigation guidance.
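The detection step described above, in which a sensor of the network 102A resolves the placement of the hand-held portable interaction device 104 into a target location, can be sketched as a simple lookup. The sensor identifiers, location names, and the `on_device_detected` handler below are hypothetical and serve only to illustrate the binding between sensors and defined locations; they are not part of the disclosed implementation.

```python
# Hypothetical registry: each sensor in the network 102A is keyed by an
# identifier and bound to one defined location in the map layout 102B.
SENSOR_LOCATIONS = {
    "S01": "Main Entrance",
    "S07": "Pharmacy",
    "S12": "Radiology",
}

def on_device_detected(sensor_id, registry=SENSOR_LOCATIONS):
    """Resolve a detection of the hand-held device 104 to a target location."""
    location = registry.get(sensor_id)
    if location is None:
        return None  # sensor not bound to a defined location; ignore event
    # In a real system, this event would cause the navigation display
    # interface 102 to highlight the path toward the resolved location.
    return {"target": location, "action": "highlight_path"}

print(on_device_detected("S07"))
```

The returned event is the kind of signal that, per the description above, would prompt the navigation display interface 102 to update the map layout 102B and highlight the navigation path.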
Furthermore, the integrated navigation system 100 comprises the display arrangement 106 electronically connected to the navigation display interface 102. The display arrangement 106 is configured to perform a mechanical movement in the display arrangement 106 to automatically make supplementary information about the target user location visible to the user. The integrated navigation system 100 utilizes two mechanical interfaces (the navigation display interface 102 uses a flex-changing mechanism and the display arrangement 106 uses a mechanical flex-lifting mechanism) to display multilayer information. The mechanical flex-lifting mechanism and the flex-changing mechanism are shown and described in detail, for example, in FIG. 2. When the supplementary information about the target spot is required to be displayed, the mechanical flex-lifting mechanism activates and triggers the flexible parts (e.g., cables or springs) to move the plurality of independent physical sheets 106A of the display arrangement 106.
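The control logic for which of the independent physical sheets 106A the flex-lifting mechanism should raise can be illustrated with a small sketch. The location-to-sheet binding, the angle values, and the `lift_commands` helper are assumptions introduced here for illustration; the disclosure does not specify actuator angles or a particular control scheme.

```python
# Hypothetical binding of target locations to sheet indices in the
# display arrangement 106, plus lift commands for the flex-lifting
# mechanism (angle values are illustrative only).
SHEET_FOR_LOCATION = {"Pharmacy": 0, "Radiology": 1, "Cafeteria": 2}
LIFT_ANGLE_DEG = 60  # assumed raise angle for a revealed sheet
REST_ANGLE_DEG = 0

def lift_commands(target_location):
    """Raise the sheet for the target location; keep all others lowered."""
    commands = []
    for location, sheet_index in SHEET_FOR_LOCATION.items():
        angle = LIFT_ANGLE_DEG if location == target_location else REST_ANGLE_DEG
        commands.append((sheet_index, angle))
    return sorted(commands)

print(lift_commands("Radiology"))
```

Each (sheet index, angle) pair stands in for one actuation of the flexible parts, so that only the sheet carrying supplementary information about the target location becomes visible to the user.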
FIG. 2 is a diagram that illustrates an isometric view of the integrated navigation system in accordance with an embodiment of the present disclosure. FIG. 2 is described in conjunction with elements from FIG. 1. With reference to FIG. 2, there is shown an isometric view 200 of the integrated navigation system 100. The integrated navigation system 100 includes a structural frame 201, a mechanical flex-lifting mechanism 202, an information flex-changing mechanism 204, an information flex sheet 206, a vertical slider 208, a horizontal slider 210, and a site plan flex 212.
The structural frame 201 provides support for the integrated navigation system 100. The structural frame 201 is the rigid, external framework that provides overall support to the integrated navigation system 100. The structural frame 201 is typically made from materials that are strong, durable, and capable of supporting significant loads without bending or breaking, such as aluminium, steel, composite materials, and titanium. The structural frame 201 holds all the internal components, such as the mechanical flex-lifting mechanism 202, the direction retention device 108, the information flex-changing mechanism 204, the information flex sheets 206, the vertical slider 208, the horizontal slider 210, and the site plan flex 212.
The mechanical flex-lifting mechanism 202 refers to a mechanical system or device designed to provide lifting or raising capabilities using flexible components, such as cables, springs, or other elastic elements, to achieve vertical displacement of an object or load. The mechanical flex-lifting mechanism 202 is commonly used in a variety of settings, including elevators and lifts, medical equipment (surgical tables, adjusting the height of patient beds), industrial systems (for lifting heavy machinery or parts in factories), and robotics (vertical movement of robotic arms).
The information flex-changing mechanism 204 is a dynamic system that adjusts and handles the information in real-time. The information flex-changing mechanism 204 works like a smart control system that can automatically change settings to make sure that everything runs smoothly, even if conditions change. The information flex-changing mechanism 204 may use flexible components or software to adapt to different situations, ensuring everything works smoothly. The information flex-changing mechanism 204 optimizes the flow of navigation data between the network of sensors 102A and the navigation display interface 102.
The information flex sheet 206 refers to a dynamic, adaptable interface within the integrated navigation system 100 that manages and displays information related to the relevant data. In an implementation, the information flex sheet 206 could be part of the user interface that adjusts the information display based on the user’s current need or context. For example, the information flex sheet 206 might show route details, traffic conditions, or points of interest depending on the relevant situation. The information flex sheet 206 integrates various data sources such as GPS, maps, and real-time traffic updates and presents information cohesively.
The vertical slider 208 is a mechanical component in the form of a small, movable bar that moves up and down. The vertical slider 208 is commonly used in applications such as adjusting the volume level of audio or video, changing the brightness, modifying the zoom level on a map, and dragging the information sheets. The horizontal slider 210 is similar to the vertical slider 208, but moves left and right instead of up and down.
The site plan flex 212 is a versatile approach that helps in designing and updating site plans, which are detailed drawings showing how a piece of land will be used. The site plan flex 212 integrates seamlessly with other components of the integrated navigation system 100, such as the information flex-changing mechanism 204, to ensure that site plan flex 212 remains up-to-date and responsive to the evolving changes in the built facility.
In accordance with an embodiment, the display arrangement 106 comprises the mechanical flex-lifting mechanism 202 configured to reveal portions of the supplementary information pertaining to the target user location. The mechanical flex-lifting mechanism 202 is specifically designed to offer users a more comprehensive and dynamic way to access information about the surroundings of the target user location. Unlike traditional digital displays, which often require constant manual input or interaction with people for guidance, the mechanical flex-lifting mechanism 202 provides a tangible, interactive method for retrieving information. When the user interacts with the integrated navigation system 100 through the hand-held portable interaction device placed on the navigation display interface 102, the hand-held portable interaction device detects the information from the navigation display interface 102 and sends a signal to the mechanical flex-lifting mechanism 202. The mechanical flex-lifting mechanism 202 then adjusts the displays to show supplementary details about the location and path, such as nearby landmarks, alternative routes, or real-time traffic updates. The integration of the mechanical flex-lifting mechanism 202 into the display arrangement 106 ensures that the users have seamless, on-demand access to specific supplementary information about the target spot. Whether the user is exploring a complex built environment, like a large shopping mall or an unfamiliar airport, the integrated navigation system 100 enables the users to obtain detailed and relevant information in real-time. The real-time information not only improves the user's ability to navigate but also enhances the understanding of the target spot. The mechanical flex-lifting mechanism 202 ensures the user stays focused by preventing distractions from too much information.
In accordance with an embodiment, the navigation display interface 102 comprises the flex-changing mechanism configured to display updated map information on the map layout 102B based on a placement of the hand-held portable interaction device 104 to a new target spot over the navigation display interface 102. The flex-changing mechanism is designed to work in real-time, ensuring that as soon as the hand-held portable interaction device 104 is repositioned over a different spot on the map layout 102B, the displayed updated map information adjusts accordingly. The flex-changing mechanism modifies the navigation display interface to reflect the user's current position or user’s new target spot allowing for instant updates without any lag or delay. By updating the information on the map layout 102B, according to the placement of the hand-held portable interaction device 104, the users can effortlessly access details about the target spot. The displayed updated map information enhances the navigation experience by providing the users with real-time updates on the map layout 102B, such as highlighted routes, nearby landmarks, or specific points of interest relevant to the target spot.
In accordance with an embodiment, the display arrangement 106 is configured to hold the plurality of independent physical sheets 106A mechanically. Each of the plurality of independent physical sheets 106A is configured to be operated via a pulley mechanism to display the supplementary information pertaining to said one location of the plurality of defined locations.
The pulley mechanism refers to a system of ropes, pulleys, and possibly other mechanical parts that allow the plurality of independent physical sheets 106A to be moved or adjusted. The pulley mechanism enables each of the plurality of independent physical sheets 106A to be moved or adjusted as needed. When the user selects or requests information about a particular location, the pulley mechanism operates to position the relevant sheet into view. Once the appropriate sheet (sheet containing the supplementary information of the target spot) is in the correct position, the display arrangement 106 displays the supplementary information related to the selected location. The supplementary information might include maps, directions, or other relevant details that provide context or guidance about the location selected by the user. By using the plurality of independent physical sheets 106A, the display arrangement 106 allows the users to access specific information about different places or points of interest in a tangible format. Such a tangible format of the specific information helps the users to explore the area with ease, giving the user details required to navigate quickly and saving time in searching for the target spot.
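The sheet-positioning behaviour described above can be sketched in a few lines. This is a minimal illustrative sketch only: the sheet height, pulley circumference, and function name below are assumptions for illustration and are not taken from this disclosure.

```python
# Hypothetical sketch of how a pulley mechanism might position one of the
# plurality of independent physical sheets 106A. All dimensions are assumed.
SHEET_HEIGHT_MM = 150          # assumed travel needed to clear one sheet
PULLEY_CIRCUMFERENCE_MM = 30   # assumed pulley circumference

def rotations_to_reveal(sheet_index: int) -> float:
    """Return pulley rotations needed to lift the sheets above sheet_index."""
    travel_mm = sheet_index * SHEET_HEIGHT_MM
    return travel_mm / PULLEY_CIRCUMFERENCE_MM

# Revealing the third sheet (index 2) lifts the two sheets above it away.
print(rotations_to_reveal(2))  # 10.0
```

In such a sketch, selecting a location simply maps to a sheet index, and the pulley travel follows from the stack geometry.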
FIG. 3 is a diagram of the integrated navigation system that illustrates a side view of the integrated navigation system in accordance with an embodiment of the present disclosure. FIG. 3 is described in conjunction with the element from FIG. 1. With reference to FIG. 3, there is shown a side view 300 of the integrated navigation system 100. The side view 300 shows an information highlighter 301, a vertical slider guide 302, a site plan flex changing mechanism 304, an information flex holder 306, and the direction retention device 108.
The information highlighter 301 is a tool or a feature used to draw attention to specific pieces of information by visually emphasizing them. For example, when writing or reading a document, the information highlighter 301 can be used to mark important sections of the text, significant dates, or key instructions. Likewise, in the integrated navigation system 100, the information highlighter 301 highlights the route or landmarks of the destination of interest.
The vertical slider guide 302 is the path or track that helps the vertical slider 208 move up and down smoothly. The vertical slider guide 302 keeps the vertical slider 208 moving in a straight line so the vertical slider 208 doesn’t go off track.
The site plan flex changing mechanism 304 refers to a flexible system used to modify or adjust the site plan flex 212 in response to changing conditions or requirements. The site plan flex 212 comprises the detailed layouts used in planning, showing arrangements of buildings, infrastructure, and construction.
The information flex holder 306 refers to a flexible, adjustable holder, which is used to present information in a clear and organized manner. The information flex holder 306 allows the users to access or view information easily while also being able to adjust or reposition the display based on the user’s requirement. The information flex holder 306 is often used in retail, museums, kiosks, or any setting where information needs to be displayed prominently and accessed by users. The information flex holder 306 may hold maps, brochures, instructions, or digital displays that users can interact with. The information flex holder 306 could be a part of the integrated navigation system 100 that allows the content on the display to change and adapt depending on the user's interaction.
FIG. 4A is a diagram of a parallel stacked flex of the display arrangement in the integrated navigation system, in accordance with an embodiment of the present disclosure. FIG. 4A is described in conjunction with the element from FIG. 1. With reference to FIG. 4A, there is shown a diagram 400A of a parallel stacked flex in the display arrangement 106. The parallel stacked flex is shown in a zoom-in view 402. Furthermore, FIG. 4A shows a top view 404 of the parallel stacked flex.
The diagram 400A showcases the parallel stacked flex, which consists of the plurality of independent physical sheets 106A arranged in parallel to enhance flexibility and performance within the integrated navigation system 100. The plurality of independent physical sheets 106A allows for visual representation, providing users with real-time navigation information. The dotted box A indicates a specific area of interest within the parallel stacked flex, shown in the zoom-in view 402. The zoom-in view 402 provides greater insight into the intricate layering and connectivity of the parallel stacked flex components, emphasizing the overall functionality of the display arrangement 106.
FIG. 4B is a diagram of a parallel stacked flex of the display arrangement in the integrated navigation system, in accordance with an embodiment of the present disclosure. FIG. 4B is described in conjunction with the element from FIG. 1. With reference to FIG. 4B, there is shown a diagram 400B of the parallel stacked flex of the display arrangement 106. Further, there is shown a zoom-in view 406 of the dotted box B and a zoom-in view 408 of the dotted box C. The dotted box B shows the side-top view of the arrangement of the parallel stacked flex and the dotted box C represents the side-bottom view of the arrangement of the parallel stacked flex in the display arrangement 106.
The zoom-in view 406 focuses on the top view of the arrangement of the parallel stacked flex, as indicated by the dotted box B. The top view highlights the structural organization and layout of the parallel stacked flex, showcasing how the parallel stacked flex is configured to maximize flexibility and functionality in the integrated navigation system 100. Conversely, the zoom-in view 408 located within the dotted box C offers a bottom view of the parallel stacked flex arrangement in the display arrangement 106. The bottom view reveals additional details regarding the underlying connections and components, allowing for a comprehensive understanding of the parallel stacked flex structure integrated with other elements in the integrated navigation system 100.
FIG. 4C is a diagram of an information flex holder for the parallel stacked flex in the integrated navigation system, in accordance with an embodiment of the present disclosure. FIG. 4C is described in conjunction with the element from FIG. 1. With reference to FIG. 4C, there is shown a diagram 400C of the information flex holder 306 for the parallel stacked flex in the integrated navigation system 100.
The information flex holder 306 is configured to organize and support a structure of the parallel stacked flex of the integrated navigation system 100. The information flex holder 306 ensures that the multiple layers of the display arrangement 106 are properly aligned and secured, which is required for optimal performance in displaying the supplementary information. The design of the information flex holder 306 allows for easy integration with the parallel stacked flex, facilitating efficient data transmission and enhancing the overall functionality of the integrated navigation system 100.
FIG. 5 is the illustration of a side view of the navigation display interface, in accordance with an embodiment of the present disclosure. FIG. 5 is described in conjunction with the element from FIG. 1. With reference to FIG. 5, there is shown a side view 500 of the navigation display interface 102. The navigation display interface 102 consists of a Radio Frequency Identification (RFID) tag 502 and the site plan flex changing mechanism 304.
The RFID tag 502 is a small device that uses radio waves to communicate information wirelessly. The RFID tag 502 is often used to track objects, identify items, or transmit small amounts of data. The RFID tag 502 might contain a chip and an antenna. The chip stores information such as a unique identification number. A passive RFID tag does not have a power source, such as a battery, and only sends data when activated by a nearby RFID reader (further described in FIG. 6) using radio waves. The RFID tag 502 may be used in retail stores, security systems, libraries, and pet tracking.
FIG. 6 is a diagram that illustrates an exploded view of the hand-held portable interaction device, in accordance with an embodiment of the present disclosure. FIG. 6 is described in conjunction with elements from FIG. 1. With reference to FIG. 6, there is shown an exploded view 600 of the hand-held portable interaction device 104.
The exploded view 600 illustrates the separated integral parts of the hand-held portable interaction device 104. The exploded view 600 of the hand-held portable interaction device 104 includes a digital display 602, the microcontroller 104B, a rotary encoder 604, and an RFID reader 606.
The digital display 602 is an electronic screen that shows information, images, or video using digital signals. The digital display 602 is made up of tiny pixels (small dots) that light up in different colours to create images, text, or videos. The digital display 602 uses digital signals (a series of ones and zeros) to control what appears on the screen, like showing a picture or displaying information. There are different types of the digital display 602, such as LCD (liquid crystal display), LED (light emitting diode), and OLED (organic light emitting diode).
The rotary encoder 604 is a device that tracks the movement of a spinning part, like the scroll wheel 104A or a motor shaft. The rotary encoder 604 converts the rotational motion into an electronic signal that machines, computers, microprocessors, or microcontrollers can read to monitor and control the specific movements within the integrated navigation system 100. When the scroll wheel 104A rotates, the rotary encoder 604 produces an electrical signal, which can either be in pulses (for basic movement) or more detailed (to measure the exact angle or speed of rotation). The rotary encoder 604 may be used in robots and industrial machines, computer mice (scroll wheels), elevators, and volume knobs.
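The pulse-counting behaviour of such an encoder can be illustrated with a standard quadrature decoding sketch. The two-bit A/B signal sequence below is a common incremental-encoder convention assumed for illustration; it is not specified in this disclosure.

```python
# Minimal quadrature decoder sketch for a rotary encoder such as 604.
# Gray-code order assumed for clockwise rotation: 00 -> 01 -> 11 -> 10 -> 00
CW_TRANSITIONS = {(0, 0): (0, 1), (0, 1): (1, 1), (1, 1): (1, 0), (1, 0): (0, 0)}

def decode_step(prev: tuple, curr: tuple) -> int:
    """Return +1 for a clockwise step, -1 for anticlockwise, 0 otherwise."""
    if CW_TRANSITIONS.get(prev) == curr:
        return 1
    if CW_TRANSITIONS.get(curr) == prev:
        return -1
    return 0

position = 0
for prev, curr in [((0, 0), (0, 1)), ((0, 1), (1, 1)), ((1, 1), (0, 1))]:
    position += decode_step(prev, curr)
print(position)  # 1  (two clockwise steps, one anticlockwise step)
```

The running position count is the kind of digital signal a microcontroller such as 104B could read to step through menu options.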
The RFID reader 606 is a device that communicates with the RFID tag 502 to read the information stored on the RFID tag 502. The RFID reader 606 works by sending out radio waves to activate the RFID tag 502. On activation, the RFID tag 502 sends the stored data to the RFID reader 606, such as a unique ID number or other information. The RFID reader 606 processes the received data and sends the received data to a computer or system (such as the integrated navigation system 100) for further action, like identifying an item or tracking movement. The RFID reader 606 is installed in portable devices (i.e., the hand-held portable interaction devices 104). The RFID reader 606 can be used in retail stores (scan items and track inventory), libraries, and keycard systems.
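The tag-to-location lookup described above can be sketched as a simple table. The tag IDs, location names, and function name below are illustrative placeholders and are not taken from this disclosure.

```python
# Hypothetical mapping from a detected RFID tag ID to a defined location
# on the map layout; all IDs and names are made up for illustration.
TAG_TO_LOCATION = {
    "04A1B2": "Physics department",
    "04C3D4": "Library",
}

def resolve_location(tag_id: str) -> str:
    """Look up the map-layout location encoded by a detected RFID tag."""
    return TAG_TO_LOCATION.get(tag_id, "unknown location")

print(resolve_location("04A1B2"))  # Physics department
```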
In accordance with an embodiment, the hand-held portable interaction device 104 comprises the scroll wheel 104A configured to allow user selection of the target spot over the navigation display interface 102. The target spot indicates the target user's location within the built facility, and the navigation display interface 102 is configured to update the map layout 102B based on the placement of the hand-held portable interaction device 104 on the target spot over the navigation display interface 102.
The hand-held portable interaction device 104 with the scroll wheel 104A allows the users to easily select and pinpoint a specific location on the navigation display interface 102, making it simple to select a destination as the target spot. As the scroll wheel 104A moves, the map layout 102B updates in real-time to reflect the selected location, providing an intuitive and interactive way to navigate. Once the hand-held portable interaction device 104 is placed on the target spot, the hand-held portable interaction device 104 automatically triggers an update in the map layout 102B on the navigation display interface 102. The updated map layout 102B reflects the user’s current location and adjusts accordingly, showing routes, nearby areas, or points of interest related to the selected location. By using the scroll wheel 104A, the users can effortlessly select the target spot. At the same time, the integrated navigation system 100 responds in real-time by adjusting the map layout 102B to provide accurate, updated information. The requirement for complex manual inputs or repetitive searches is eliminated, making it easier for the users to select the target spot with minimal effort. By simplifying the location selection and navigation process, the integrated navigation system 100 improves user engagement, reduces the time spent searching for directions, and makes exploring the built facility more convenient.
In accordance with an embodiment, the hand-held portable interaction device 104 comprises the microcontroller 104B configured to coordinate communication between the hand-held portable interaction device 104, the navigation display interface 102, and the display arrangement 106 to execute real-time updates to the navigation path and the supplementary information based on the movement and device-to-device interaction between the hand-held portable interaction device 104 and the navigation display interface 102. The coordinated communication between the hand-held portable interaction device 104, the navigation display interface 102, and the display arrangement 106 ensures that all components of the integrated navigation system 100 work together seamlessly to provide the users with accurate and timely navigation information. The microcontroller 104B enables real-time updates to both the navigation path and supplementary information displayed to the user. As the user interacts with the hand-held portable interaction device 104, which may involve selecting locations or adjusting settings, the microcontroller 104B ensures that such inputs are communicated effectively to the navigation display interface 102 and the display arrangement 106. By coordinating the real-time updates, the microcontroller 104B allows for a smooth and responsive navigation experience. The integrated navigation system 100 delivers timely feedback and useful information in response to user actions, enabling more effective and confident navigation.
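The coordinating role of the microcontroller can be pictured as a small event dispatcher. The component names, event types, and message shapes below are assumptions chosen for illustration; the disclosure does not specify a message format.

```python
# Illustrative sketch of a coordinator routing device events to the
# display interface and display arrangement; names are hypothetical.
class Coordinator:
    def __init__(self):
        self.log = []  # records (component, command, payload) messages

    def on_event(self, event: dict):
        if event["type"] == "target_selected":
            # A selected target spot triggers both path highlighting and
            # supplementary information display.
            self.log.append(("display_interface", "highlight_path", event["spot"]))
            self.log.append(("display_arrangement", "show_info", event["spot"]))
        elif event["type"] == "scroll":
            self.log.append(("digital_display", "next_option", event["delta"]))

c = Coordinator()
c.on_event({"type": "target_selected", "spot": "Physics department"})
print(len(c.log))  # 2
```

The point of the sketch is only that a single coordinator fans one user input out to several output components in real-time.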
In accordance with an embodiment, the integrated navigation system 100 comprises the direction retention device 108 configured to generate a physical media comprising navigation instructions from the current user location to the target user location. The direction retention device 108 in the integrated navigation system 100 is designed to enhance the user's navigation experience by providing physical, printed directions. When the direction retention device 108 receives the signal from the hand-held portable interaction device 104, the direction retention device 108 triggers a thermal printer to generate a printed guide that contains step-by-step directions from the user’s current location to their desired destination. The direction retention device 108 provides a reliable, offline way of guiding users to the target spot, even without access to electronic devices or internet connectivity. The physical media, like a printed map or directions, ensures that the users can still navigate effectively if devices run out of battery or lose signal. Additionally, the direction retention device 108, which activates the thermal printer, executes a printing task such as generating tickets, receipts, or other printed materials as requested by the users. By combining real-time updates on the navigation display interface 102 and the display arrangement 106 with printed directions, the integrated navigation system 100 offers the users both visual and physical guidance to aid the users in reaching the target spot efficiently.
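The physical media generated by such a device might look like a short printed ticket. The text layout, function name, and example directions below are illustrative assumptions, not a format specified in this disclosure.

```python
# Hypothetical formatting of step-by-step directions for a thermal printer.
def format_ticket(current: str, target: str, steps: list) -> str:
    """Render a printable navigation ticket from current to target location."""
    lines = [f"From: {current}", f"To:   {target}", "-" * 20]
    lines += [f"{i}. {step}" for i, step in enumerate(steps, start=1)]
    return "\n".join(lines)

ticket = format_ticket("Main entrance", "Physics department",
                       ["Walk straight 20 m", "Turn left at the atrium"])
print(ticket)
```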
FIGs. 7A, 7B, and 7C are diagrams depicting an operational workflow of the integrated navigation system, in accordance with an embodiment of the present disclosure. FIGs. 7A, 7B, and 7C are described in conjunction with elements from FIGs. 1, 2, 3, 4A, 4B, 4C, 5, and 6. With reference to FIGs. 7A, 7B, and 7C, there is shown a diagram 700 that includes a series of operations 702 to 770 performed by the integrated navigation system 100.
At operation 702, an initial screen displays the welcome message “welcome screen” when the integrated navigation system 100 is activated or turned ON. The welcome message is displayed on the digital display 602 of the hand-held portable interaction device 104.
At operation 704, the hand-held portable interaction device 104 automatically turns off the digital display 602 if the user does not interact with the navigation display interface 102 for 10 seconds.
At operation 706, the user reads the welcome message on the digital display 602 of the hand-held portable interaction device 104. The user initiates the interaction by touching the digital display 602 of the hand-held portable interaction device 104.
At operation 708, the user touches the touch interface (i.e., the digital display 602) of the hand-held portable interaction device 104.
At operation 710, the gyroscope becomes active when the user touches the digital display 602 of the hand-held portable interaction device 104. The gyroscope sensor in the hand-held portable interaction device 104 is used to prevent the digital display 602 from turning off by detecting motion or orientation changes.
At operation 712, the user touches and initiates the interaction, which activates a stepper motor. The stepper motor is used to adjust or alter the horizontal layout of a site map within the navigation display interface 102.
At operation 714, the hand-held portable interaction device 104 detects the user touch input on the digital display 602.
At operation 716, the digital display 602 of the hand-held portable interaction device 104 displays the options for the user to initiate the navigation of the built facility (such as building, school, and hospital).
At operation 718, the microcontroller 104B of the hand-held portable interaction device 104 takes the input from the gyroscope and the digital display 602. The microcontroller 104B processes the inputs and displays the output on the digital display 602.
At operation 720, the digital display 602 displays the message to select the mode. The message displays the choice to the user. The user can choose between two modes of interaction: one choice is to explore the area and the second choice is to search the area.
At operation 722, if the user chooses the search mode of interaction, the user can physically rotate the scroll wheel 104A clockwise or anticlockwise to get the options in the digital display 602 in the hand-held portable interaction device 104.
At operation 724, the user can click on the arrow buttons “(< >)” to change options in the digital display 602. For example, on rotation of the scroll wheel 104A, the user gets the first option, “Physics department”; by using the arrow buttons, the user can change the place accordingly.
At operation 726, when the user physically rotates the scroll wheel 104A, the rotary encoder 604 translates the physical rotation of the scroll wheel 104A and converts it into digital signals. The digital signal is input for the microcontroller 104B in the hand-held portable interaction device 104.
At operation 728, the microcontroller 104B is the central processing unit of the hand-held portable interaction device 104. The microcontroller 104B interprets the inputs from the rotary encoder 604 or button clicks and updates the digital display 602 accordingly.
At operation 730, if the user chooses the explore mode of interaction, the hand-held portable interaction device 104 is placed on the navigation display interface 102 at the map layout 102B.
At operation 732, the RFID reader 606 detects the target spot of the user on the map layout 102B. The navigation display interface 102 consists of the RFID tags 502. The RFID reader 606 detects the RFID tag 502 and identifies the user’s selected destination.
At operation 734, the microcontroller 104B processes the input of the RFID reader 606 and manages the explore mode of interaction. The microcontroller 104B updates the digital display 602.
At operation 736, the digital display 602 allows the user to select the target spot and select various features like showing information about the target spot, highlighting the navigation path of the destination, or printing the navigation path of the destination.
At operation 738, if the user selects the navigation path of the target spot, it triggers the navigation display interface 102 to highlight the navigation path of the destination on the map layout 102B.
At operation 740, the microcontroller 104B of the hand-held portable interaction device 104 detects the user’s selected target spot and sends the signal through a wireless connection (e.g., WiFi) to the microcontroller of the integrated navigation system 100.
At operation 742, the integrated navigation system 100 consists of a microcontroller, which receives the input signal from the microcontroller 104B of the hand-held portable interaction device 104. The microcontroller of the integrated navigation system 100 processes the input and specifically controls the LED array to produce accurate output.
At operation 744, an array of LEDs is positioned beneath a transparent or translucent map layout (i.e., the map layout 102B), which is used to highlight the navigation path.
At operation 746, the LEDs of the specific points (i.e. target spot selected by the user) are illuminated on the navigation display interface 102.
At operation 748, the LEDs are used to highlight the navigation path to the selected target spot by the user on the navigation display interface 102.
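Operations 744 to 748 amount to mapping path waypoints to individual LEDs beneath the map layout. The grid size and flat indexing scheme below are assumptions for illustration; the disclosure does not specify the LED layout.

```python
# Sketch of converting path waypoints into LED indices to illuminate;
# the 16-wide grid beneath the translucent map layout is assumed.
GRID_WIDTH = 16

def path_to_led_indices(path: list) -> list:
    """Convert (row, col) waypoints into flat LED indices, row-major order."""
    return [row * GRID_WIDTH + col for row, col in path]

print(path_to_led_indices([(0, 0), (0, 1), (1, 1)]))  # [0, 1, 17]
```

A driver would then switch on exactly those indices to make the highlighted navigation path visible through the map layout 102B.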
At operation 750, if the user selects the option on the digital display 602 to print the navigation path of the target spot, the process of generating a printed version of the navigation path to the selected target spot is initiated.
At operation 752, the microcontroller 104B of the hand-held portable interaction device 104 detects the user’s target spot and sends a signal through a wireless connection to the microcontroller of the integrated navigation system 100.
At operation 754, the microcontroller of the integrated navigation system 100 processes the input from the microcontroller 104B of the hand-held portable interaction device 104 and specifically sends a signal to the thermal printer.
At operation 756, a thermal printer uses heat to create images on thermal paper, likely for printing maps or directions. The thermal printer gets the route information from the microcontroller of the integrated navigation system 100 in the built facility.
At operation 758, the printed destination route from the thermal printer provides the users with a tangible printed navigation path of the target spot.
At operation 760, if the user selects “show info”, the user gets detailed information about the target spot, possibly including descriptions, images, or other relevant data. The supplementary information of the target spot is shown by the display arrangement 106 in the integrated navigation system 100.
At operation 762, the microcontroller 104B of the hand-held portable interaction device 104 detects the user’s target spot and sends a signal through the interface to the microcontroller of the integrated navigation system 100.
At operation 764, the microcontroller of the integrated navigation system 100 receives the signals from the other microcontroller (i.e., the microcontroller 104B of the hand-held portable interaction device 104) and coordinates the overall response of the integrated navigation system 100.
At operation 766, the stepper motor controls the movement of the flex sheets, likely for showing different information or maps on a vertical display. The vertical display involves the use of the plurality of sheets 106B, the vertical slider guide 302, the site plan flex changing mechanism 304, the information flex holder 306, the information flex-changing mechanism 204, the information flex sheets 206, and the vertical slider 208.
At operation 768, the stepper motor is used to lift or release the information flex sheet 206, likely sliding it up or down. The stepper motor ensures that the information flex sheet 206 moves smoothly and effectively.
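The stepper motion of operations 766 and 768 can be estimated with a back-of-envelope conversion from linear travel to motor steps. The steps-per-revolution and travel-per-revolution values below are common assumed figures, not values from this disclosure.

```python
# Hypothetical sketch of sizing stepper moves for lifting a flex sheet.
STEPS_PER_REV = 200     # common full-step resolution, assumed
TRAVEL_PER_REV_MM = 8   # assumed linear travel per motor revolution

def steps_for_travel(distance_mm: float) -> int:
    """Whole motor steps needed to slide the sheet a given distance."""
    return round(distance_mm * STEPS_PER_REV / TRAVEL_PER_REV_MM)

print(steps_for_travel(40))  # 1000 steps to lift the sheet 40 mm
```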
At operation 770, the output is shown on the display arrangement 106 of the built facility, where the relevant information, maps, or other visual data are shown to the user.
FIG. 8 is a flowchart of a method for use in an integrated navigation system for navigation assistance within a built facility, in accordance with an embodiment of the present disclosure. FIG. 8 is described in conjunction with the element from FIGs. 1-6. With reference to FIG. 8, there is shown a method 800 that includes steps 802 to 814.
There is provided the method 800 for use in the integrated navigation system 100 for navigation assistance within a built facility. The method 800 is used for detecting the movement of the hand-held portable interaction device 104 over the navigation display interface 102 of the integrated navigation system 100. The method 800 is designed for the integrated navigation system 100, which allows the user to physically interact with the navigation display interface 102 to select the target spot and receive navigation guidance.
At step 802, the method 800 comprises interaction of the hand-held portable interaction device 104 with the navigation display interface 102, which displays the map layout 102B of the built facility (i.e., buildings, schools, hospitals, and airports) to be navigated. As the user moves the hand-held portable interaction device 104 over the navigation display interface 102, the navigation display interface 102 continuously tracks the movement of the hand-held portable interaction device 104. The tracking of the hand-held portable interaction device 104 helps the integrated navigation system 100 to determine the user’s target spot on the map layout 102B.
At step 804, the method 800 involves utilizing the network of sensors 102A within the navigation display interface 102 to detect the target spot. The user’s target spot detection is based on the movement and placement of the hand-held portable interaction device 104. By tracking the position and movement of the hand-held portable interaction device 104 relative to the navigation display interface 102, the integrated navigation system 100 accurately identifies the specific target spot on the navigation display interface 102. The use of the network of sensors 102A allows for precise detection, ensuring that the navigation display interface 102 responds effectively to the user’s inputs and gestures.
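One way to picture the detection at step 804 is nearest-sensor matching: the defined location whose sensor lies closest to the device position is taken as the target spot. The coordinates, sensor table, and function name below are illustrative assumptions.

```python
# Sketch of nearest-sensor target-spot detection; positions are made up.
SENSOR_SPOTS = {(2, 3): "Physics department", (8, 1): "Library"}

def detect_target_spot(device_xy: tuple) -> str:
    """Return the defined location whose sensor is nearest the device."""
    def dist2(p):
        # Squared Euclidean distance; square root is unnecessary for ranking.
        return (p[0] - device_xy[0]) ** 2 + (p[1] - device_xy[1]) ** 2
    nearest = min(SENSOR_SPOTS, key=dist2)
    return SENSOR_SPOTS[nearest]

print(detect_target_spot((3, 3)))  # Physics department
```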
At step 806, method 800 involves identifying the user’s current location within the built facility. The current location of the user is compared or matched with a deployment location. A deployment location is a pre-defined area or point where the integrated navigation system 100 has been installed or is operational. The integrated navigation system 100 uses the user's exact location to fine-tune the features and deliver navigation assistance that fits the user’s position relative to the integrated navigation system 100. The integrated navigation system 100 ensures the delivery of navigation is precisely tuned to the user’s location.
At step 808, the method 800 involves automatically generating the navigation path from the current user location to the target user location, which corresponds to the identified target spot on the navigation display interface 102. By utilizing real-time location data from the network of sensors 102A, the integrated navigation system 100 calculates the most efficient path between the user’s current location and the target spot. The automatic path generation allows the integrated navigation system 100 to dynamically adjust the navigation path based on the user's movement, ensuring that the user is guided accurately toward the target spot. The method 800 provides a seamless and hands-free experience, optimizing navigation by minimizing manual input and enabling the user to reach the identified target spot more effectively and quickly.
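The automatic path generation of step 808 can be sketched as a shortest-path search over a walkway graph of the facility. Breadth-first search and the example graph below are illustrative choices; the disclosure does not name a specific routing algorithm.

```python
from collections import deque

def shortest_path(graph: dict, start: str, goal: str) -> list:
    """Return the fewest-hop route from start to goal, or [] if none exists."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return []

# Made-up walkway graph of a built facility for illustration.
facility = {"entrance": ["atrium"], "atrium": ["library", "physics"],
            "library": [], "physics": []}
print(shortest_path(facility, "entrance", "physics"))
# ['entrance', 'atrium', 'physics']
```

A weighted variant (e.g. Dijkstra's algorithm with corridor lengths) would be the natural extension if "most efficient path" means shortest distance rather than fewest hops.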
At step 810, the method 800 involves automatically highlighting the generated navigation path on the navigation display interface 102. By utilizing the previously calculated path from the current user location to the target user location, the system visually represents the navigation path on the navigation display interface 102. The automatic highlighting ensures that the user can easily follow the path without confusion, enhancing the overall navigation experience. The step 810 ensures that the navigation path is visible and continuously updated as the user progresses toward the identified target spot.
At step 812, the method 800 involves activating the display arrangement 106. The activation of the display arrangement 106 allows the integrated navigation system 100 to extend or adjust the display arrangement 106 in response to the user's interaction. The mechanical adjustments to the display arrangement 106 enable the integrated navigation system 100 to provide an optimized view of navigation data, ensuring visibility and accessibility for the user. The link between the display arrangement 106 and the navigation display interface 102 permits dynamic movements, such as rotation, tilting, or extending, improving the ease of use and adaptability of the integrated navigation system 100.
At step 814, the method 800 involves performing mechanical movement within the display arrangement 106. The movement in the display arrangement 106 is designed to reveal supplementary information related to the user's location automatically. The mechanical movement within the display arrangement 106 enables the integrated navigation system 100 to automatically modify display configuration, making additional, contextually relevant information visible to the user based on the target spot. The approach enhances the user experience by ensuring that pertinent information is readily accessible without manual intervention, resulting in a more intuitive and interactive display.
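The mechanical-reveal step can be sketched in software terms as selecting which physical information sheet to expose for a given target spot. The sheet indices and the `MotorDriver` stub below are illustrative assumptions standing in for the actual pulley-mechanism hardware; they are not part of the disclosed apparatus.

```python
# Hypothetical controller sketch: given a detected target spot, choose which
# physical information sheet the display arrangement should reveal.

SHEET_INDEX = {
    "pharmacy": 0,
    "radiology": 1,
    "cafeteria": 2,
}

class MotorDriver:
    """Stub standing in for the pulley-mechanism motor controller."""
    def __init__(self):
        self.commands = []  # record issued commands for inspection

    def lift(self, sheet):
        self.commands.append(("lift", sheet))

    def lower_all(self):
        self.commands.append(("lower_all",))

def reveal_supplementary_info(target_spot, motor):
    """Retract any visible sheet, then lift the sheet for the target spot."""
    sheet = SHEET_INDEX.get(target_spot)
    if sheet is None:
        return False  # no supplementary sheet exists for this location
    motor.lower_all()
    motor.lift(sheet)
    return True

motor = MotorDriver()
reveal_supplementary_info("radiology", motor)
print(motor.commands)  # → [('lower_all',), ('lift', 1)]
```

Retracting before lifting ensures only the contextually relevant sheet is visible, mirroring the automatic display reconfiguration described above.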
The method 800 further comprises detecting the movement of the hand-held portable interaction device 104 over the navigation display interface 102 of the integrated navigation system 100. When the hand-held portable interaction device 104 is placed on the navigation display interface 102 over the desired destination, the hand-held portable interaction device 104 sends a signal to the navigation display interface 102 and the direction retention device 108 to execute the task. The navigation display interface 102 contains the map layout 102B of the environment, which updates the displayed information depending on the floor and highlights the navigation path upon receiving the signal from the hand-held portable interaction device 104. By detecting the movement of the hand-held portable interaction device 104 over the navigation display interface 102, the integrated navigation system 100 can accurately track the user's intended destination and provide real-time updates and directions. The method 800 further comprises detecting, using the sensor from the network of sensors 102A of the navigation display interface 102, the target spot on the navigation display interface 102 based on a movement and placement of the hand-held portable interaction device 104 to the target spot on the navigation display interface 102. As the user moves the hand-held portable interaction device 104 across the navigation display interface 102, the sensor identifies the specific target spot that the user has selected, whether it be a location on the map or a point of interest. The sensor from the network of sensors 102A tracks the movements of the hand-held portable interaction device 104 and accurately detects where the user intends to interact with the integrated navigation system 100.
Once the sensor detects the target spot, the hand-held portable interaction device 104 sends the signal to multiple components of the integrated navigation system 100, including the navigation display interface 102 and the direction retention device 108. The signal triggers the integrated navigation system 100 to update both the visual map and any supplementary information relevant to the user's current selection. By allowing the integrated navigation system 100 to respond immediately to the movement of the hand-held portable interaction device 104, users can easily explore the environment and obtain up-to-date information about the target spot. The combination of the network of sensors 102A and the navigation display interface 102 updates eliminates the need for manual input or repetitive searches, streamlining the navigation process. The method 800 determines the current user location corresponding to the deployment location of the integrated navigation system 100. A kiosk featuring an information screen, an interaction display, a device for holding directional data, and an interaction tool is used to establish the user's position inside the facility. The method 800 then automatically generates the navigation path from the current user location to the target user location corresponding to the identified target spot; automatically highlights the generated navigation path on the navigation display interface 102; activates the display arrangement 106 electronically connected to the navigation display interface 102; and performs a mechanical movement in the display arrangement 106 to automatically make supplementary information about the target user location visible to the user. The method 800 is designed to provide comprehensive navigation assistance within the built facility, allowing users to seamlessly explore the environment and access diverse information about the space.
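The sensor-detection logic described above can be sketched as a lookup from sensor readings to a defined location. The sensor identifiers, signal values, and threshold below are assumptions introduced purely for illustration; the disclosure does not specify a particular sensing technology or signal format.

```python
# Illustrative sketch only: each sensor in the network embedded in the map
# layout is tied to one defined location. detect_target_spot() returns the
# location whose sensor reports the strongest reading above a threshold,
# i.e. the spot where the interaction device has been placed.

SENSOR_TO_LOCATION = {
    "s01": "entrance",
    "s02": "pharmacy",
    "s03": "radiology",
}

def detect_target_spot(readings, threshold=0.5):
    """readings: {sensor_id: signal strength in 0..1} from the sensor network.

    Returns the mapped location for the strongest reading above the
    threshold, or None when no sensor detects the interaction device.
    """
    best_id, best_val = None, threshold
    for sensor_id, value in readings.items():
        if value > best_val and sensor_id in SENSOR_TO_LOCATION:
            best_id, best_val = sensor_id, value
    return SENSOR_TO_LOCATION.get(best_id)

print(detect_target_spot({"s01": 0.10, "s03": 0.92}))  # → radiology
```

Picking the strongest above-threshold reading is one simple way to disambiguate when the device rests between adjacent sensors on the map layout.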
The method 800 aims to offer an alternative to digital displays and traditional methods of gaining information, reducing dependency on people and other sources. The method 800 integrates tangible interaction and precisely designed mechanisms to ensure that users receive the required information intuitively. It also combines multiple sub-systems within the product to assist users in exploring the area, saving their search time and providing various information about the space.
In accordance with an embodiment, the method 800 comprises providing physical user selection of the target spot over the navigation display interface 102 using the hand-held portable interaction device 104, wherein the target spot indicates the target user location within the built facility. By utilizing the hand-held portable interaction device 104, users can intuitively select locations without complicated input methods, facilitating a seamless interaction with the integrated navigation system 100. The direct selection process streamlines the navigation experience, enabling users to quickly identify the target spot while reducing the potential for errors in location identification. The immediate feedback from the navigation display interface 102 contributes to a more efficient and user-friendly navigation process, ultimately improving the overall effectiveness of the integrated navigation system 100 within the built facility.
In accordance with an embodiment, the method 800 comprises displaying updated map information on the navigation display interface 102 based on the placement of the hand-held portable interaction device 104 to the new target spot over the navigation display interface 102. By directly correlating the hand-held portable interaction device 104 placement with the updated map information, users receive immediate access to relevant navigation data, ensuring that the information reflects the current context and surroundings. The integration of real-time updates enhances user awareness and understanding of the built facility, leading to improved decision-making during navigation. The immediate availability of accurate and updated map information significantly improves the usability of the integrated navigation system 100, empowering users to navigate confidently and easily.
Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as "including", "comprising", "incorporating", "have", "is" used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural. The word "exemplary" is used herein to mean "serving as an example, instance or illustration". Any embodiment described as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments. The word "optionally" is used herein to mean "is provided in some embodiments and not provided in other embodiments". It is appreciated that certain features of the present disclosure, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the present disclosure, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable combination or as suitable in any other described embodiment of the disclosure.
Claims:CLAIMS
I/We claim:
1. An integrated navigation system (100) for navigation assistance within a built facility, comprising:
a navigation display interface (102) comprising a network of sensors (102A) and a map layout (102B), wherein each sensor is positioned at the map layout of the navigation display interface (102) to indicate one location of a plurality of defined locations within the built facility;
a hand-held portable interaction device (104) configured to be moved over the navigation display interface (102) such that a navigation path from a current user location to a target user location is automatically highlighted on the navigation display interface (102) when the hand-held portable interaction device (104) is moved to a target spot corresponding to the target user location on the navigation display interface (102), wherein a sensor of the network of sensors (102A) is configured to detect the hand-held portable interaction device (104) moved to the target spot to cause the navigation display interface (102) to highlight the navigation path; and
a display arrangement (106) electronically connected to the navigation display interface (102), wherein the display arrangement (106) is configured to perform a mechanical movement to automatically make supplementary information about the target user location visible to a user.
2. The integrated navigation system (100) as claimed in claim 1, wherein the display arrangement (106) comprises a mechanical flex-lifting mechanism (202) configured to reveal portions of the supplementary information pertaining to the target user location.
3. The integrated navigation system (100) as claimed in claim 1, wherein the integrated navigation system (100) comprises a direction retention device (108) configured to generate a physical media comprising navigation instructions from the current user location to the target user location.
4. The integrated navigation system (100) as claimed in claim 1, wherein the hand-held portable interaction device (104) comprises a scroll wheel (104A) configured to allow user selection of the target spot over the navigation display interface (102), wherein the target spot indicates the target user location within the built facility, and wherein the navigation display interface (102) is configured to update the map layout (102B) based on the placement of the hand-held portable interaction device (104) on the target spot over the navigation display interface (102).
5. The integrated navigation system (100) as claimed in claim 1, wherein the navigation display interface (102) comprises a flex-changing mechanism configured to display updated map information on the map layout (102B) based on a placement of the hand-held portable interaction device (104) to a new target spot over the navigation display interface (102).
6. The integrated navigation system (100) as claimed in claim 1, wherein the display arrangement (106) is configured to mechanically hold a plurality of independent physical sheets (106A), wherein each of the plurality of independent physical sheets (106A) is configured to be operated via a pulley mechanism to display the supplementary information pertaining to said one location of the plurality of defined locations.
7. The integrated navigation system (100) as claimed in claim 1, wherein the hand-held portable interaction device (104) comprises a microcontroller (104B) configured to coordinate communication between the hand-held portable interaction device (104), the navigation display interface (102), and the display arrangement (106) to execute real-time updates to the navigation path and the supplementary information based on the movement and device-to-device interaction between the hand-held portable interaction device (104) and the navigation display interface (102).
8. A method (800) for use in an integrated navigation system (100) for navigation assistance within a built facility, the method comprising:
detecting movement of a hand-held portable interaction device (104) over a navigation display interface (102) of the integrated navigation system (100);
detecting, using a sensor from a network of sensors (102A) of the navigation display interface (102), a target spot on the navigation display interface (102) based on a movement and placement of the hand-held portable interaction device (104) to the target spot on the navigation display interface (102);
determining a current user location corresponding to a deployment location of the integrated navigation system (100);
automatically generating a navigation path from the current user location to a target user location corresponding to the identified target spot;
automatically highlighting the generated navigation path on the navigation display interface (102);
activating a display arrangement (106) electronically connected to the navigation display interface (102); and
performing a mechanical movement in the display arrangement (106) to automatically make supplementary information about the target user location visible to a user.
9. The method (800) as claimed in claim 8, further comprising providing physical user selection of the target spot over the navigation display interface (102) using the hand-held portable interaction device (104), wherein the target spot indicates the target user location within the built facility.
10. The method (800) as claimed in claim 8, further comprising displaying updated map information on the navigation display interface (102) based on a placement of the hand-held portable interaction device (104) to a new target spot over the navigation display interface (102).
| # | Name | Date |
|---|---|---|
| 1 | 202441092845-STATEMENT OF UNDERTAKING (FORM 3) [27-11-2024(online)].pdf | 2024-11-27 |
| 2 | 202441092845-FORM FOR SMALL ENTITY(FORM-28) [27-11-2024(online)].pdf | 2024-11-27 |
| 3 | 202441092845-FORM 1 [27-11-2024(online)].pdf | 2024-11-27 |
| 4 | 202441092845-FIGURE OF ABSTRACT [27-11-2024(online)].pdf | 2024-11-27 |
| 5 | 202441092845-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [27-11-2024(online)].pdf | 2024-11-27 |
| 6 | 202441092845-EVIDENCE FOR REGISTRATION UNDER SSI [27-11-2024(online)].pdf | 2024-11-27 |
| 7 | 202441092845-EDUCATIONAL INSTITUTION(S) [27-11-2024(online)].pdf | 2024-11-27 |
| 8 | 202441092845-DRAWINGS [27-11-2024(online)].pdf | 2024-11-27 |
| 9 | 202441092845-DECLARATION OF INVENTORSHIP (FORM 5) [27-11-2024(online)].pdf | 2024-11-27 |
| 10 | 202441092845-COMPLETE SPECIFICATION [27-11-2024(online)].pdf | 2024-11-27 |
| 11 | 202441092845-FORM-9 [28-11-2024(online)].pdf | 2024-11-28 |
| 12 | 202441092845-FORM-8 [28-11-2024(online)].pdf | 2024-11-28 |
| 13 | 202441092845-FORM 18A [28-11-2024(online)].pdf | 2024-11-28 |
| 14 | 202441092845-EVIDENCE OF ELIGIBILTY RULE 24C1f [28-11-2024(online)].pdf | 2024-11-28 |
| 15 | 202441092845-FORM-26 [12-02-2025(online)].pdf | 2025-02-12 |
| 16 | 202441092845-Proof of Right [18-02-2025(online)].pdf | 2025-02-18 |
| 17 | 202441092845-FER.pdf | 2025-03-26 |
| 18 | 202441092845-FORM-26 [30-04-2025(online)].pdf | 2025-04-30 |
| 19 | 202441092845-FER_SER_REPLY [03-06-2025(online)].pdf | 2025-06-03 |
| 20 | 202441092845-DRAWING [03-06-2025(online)].pdf | 2025-06-03 |
| 21 | 202441092845-CLAIMS [03-06-2025(online)].pdf | 2025-06-03 |