Abstract: Disclosed is a navigation display system that includes at least one host processor (101) configured to receive real-time location information of a vehicle from a navigation system (103). Further, the host processor (101) is configured to receive real-time navigational data from a remote network (105). A plurality of common segments is determined corresponding to the real-time location information based on a first Look-Up-Table (LUT) (501). Also, a plurality of non-essential segments is determined corresponding to the navigational data based on a second LUT (503). Thereafter, a first bitmap and a second bitmap are generated based on the plurality of common segments and the plurality of non-essential segments, respectively. Furthermore, a display unit (109) is illuminated based on a first set of segments and a second set of segments corresponding to the first bitmap and the second bitmap, respectively, to display “Turn-By-Turn” navigation on the display unit (109).
DESC:TECHNICAL FIELD
[0001] The present disclosure relates to the field of navigation display systems and more particularly, relates to a Turn-By-Turn navigation display system and a method thereof.
BACKGROUND
[0002] A navigational system refers to a technology used to provide navigation information to drivers or riders. The navigational system is commonly found in modern vehicles and offers various features to assist with navigation, route guidance, and situational awareness. The navigational system generally provides “Turn-By-Turn navigation”, indicating to a user upcoming real-time navigational directions, such as turns, lane changes, exits, etc., based on a destination that is manually selected by the user. The “Turn-By-Turn navigation” may be accompanied by visual cues, such as arrows, illuminations, icons, etc., and/or automated voice-assisted guidance. Additionally, Turn-By-Turn navigation can be displayed as an overlay on a map, or it can be displayed using a multi-segment navigation unit. In vehicles such as two-wheeler vehicles, three-wheeler vehicles, or four-wheeler vehicles, a pre-requisite for ensuring seamless navigation guidance is the connection between a user device, such as a mobile phone, and an instrument cluster (commonly known as an instrument dashboard) of the vehicle. The connection between the user device and the instrument cluster is required so that the user device can push real-time navigation data to the instrument cluster for display.
[0003] To ensure seamless navigation guidance, conventional “Turn-By-Turn navigation” solutions use a single Look-Up-Table (LUT) to identify the turn information and activate the corresponding segments of a display unit of the instrument cluster. The single LUT comprises all segments of the display unit. The display unit is divided into a plurality of segments arranged in a specific pattern or layout. Each of the plurality of segments may be independently controlled to display different information, a combination of characters, different paths, etc. The single LUT may require a large number of rows and columns to represent each of the plurality of segments (i.e., all segments) of the display unit. Thus, a complex addressing mechanism is usually required to select a set of cells of the single LUT, where each cell among the set of cells is represented by a unique address. Further, upon selecting a set of unique addresses corresponding to the set of cells, the corresponding segments associated with the set of unique addresses are illuminated to display the required information or navigational path. Furthermore, since only the single LUT is used, the process of fetching the set of unique addresses for all the required segments, followed by illumination, is delayed. The delayed illumination may cause the user/rider to miss a turn or to take a last-minute turn, thereby causing inconvenience to the user/rider as well as nearby riders/drivers or pedestrians. Additionally, more entries in the single LUT require more addressing bits to address every entry of the single LUT. Therefore, there is a need for a better and simpler addressing mechanism that can enable faster fetching of the set of unique addresses corresponding to the required segments in order to facilitate a quicker response action from the user/rider.
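The scaling of addressing bits with LUT size noted above can be illustrated with a short sketch; the entry counts below are hypothetical and chosen only for illustration:

```python
import math

def address_bits(num_entries: int) -> int:
    """Bits needed to uniquely address every entry of a LUT."""
    return max(1, math.ceil(math.log2(num_entries)))

# Hypothetical sizes: one large LUT covering all display segments,
# versus two smaller LUTs after splitting the segments into groups.
single_lut = 256
split_luts = (16, 64)

print(address_bits(single_lut))                  # 8 bits for the single LUT
print(max(address_bits(n) for n in split_luts))  # 6 bits worst case after splitting
```

Each entry removed from a table can shorten the address word, which in turn shortens each fetch cycle.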
[0004] Several conventional techniques have attempted to overcome the drawbacks pertaining to the employment of the single LUT, wherein one such conventional technique discloses an electronic instrument cluster for displaying navigational information. The electronic instrument cluster comprises a segmented display capable of displaying graphical maneuvers representing the navigational path to be followed by a vehicle, wherein each graphical maneuver is displayed based upon activation of one or more segments of the segmented display. A controller electronically coupled with the segmented display is configured to execute programmed instructions for: receiving navigational information of the vehicle; processing the navigational information, in real-time, to determine the current position of the vehicle and a navigational path to be followed by the vehicle with respect to the current position of the vehicle; activating one or more segments of the segmented display corresponding to the navigational path determined, thereby forming a graphical maneuver; and displaying the graphical maneuver formed on the segmented display. In particular, the aforementioned conventional technique discloses graphical maneuvers which represent a navigational path to be followed by the vehicle with respect to the current position of the vehicle, wherein each graphical maneuver is displayed based upon activation of one or more segments of the segmented display. Here, each graphical maneuver is assigned an identifier, wherein each identifier is pre-mapped with a corresponding navigational command in order to trigger the display of the corresponding maneuver on the segmented display. Thus, the aforementioned conventional technique fails to disclose details pertaining to a bit-wise mapping or generating the bitmaps of the segments, which implies that the mapping may be fetched from the single LUT.
[0005] Further, another conventional technique discloses a vehicle navigation icon provided in a vehicle information display instrument fitted to a vehicle. The vehicle includes a mobile navigation application connected to the vehicle information display instrument via a Bluetooth communication module. The Bluetooth communication module is provided with the vehicle information display instrument to receive the navigation information fetched from the mobile navigation application. The vehicle information display instrument is also equipped with a microcontroller to receive the navigation information from a Bluetooth Low Energy (BLE) module to subsequently customize and display the received navigation information onto the vehicle navigation icon. The tracked segments present in the navigation icon glow, which sequentially leads to a completely stagnant illumination of the specific highlighted track of the navigation icon. However, this conventional technique focuses mainly on the systemic approach to the illumination of segments based on a location and a destination of the vehicle. Thus, this conventional technique fails to mention details pertaining to how the bitmap is generated thereby leaving the issue of “complex addressing mechanism” unresolved.
[0006] Hence, there lies a need for an improved segmented display system and method that can provide an enhanced navigation display system for providing real-time Turn-by-Turn guidance to a user of a vehicle.
SUMMARY
[0007] This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention nor is it intended for determining the scope of the invention.
[0008] According to an embodiment of the present disclosure, disclosed herein is a navigation display system that comprises a host memory, a display unit, a decoder unit, and at least one host processor. The decoder unit is communicatively coupled with the host memory. The at least one host processor is communicatively coupled with the host memory, the display unit, and the decoder unit. The at least one host processor is configured to receive real-time location information of a vehicle from a navigation system. The at least one host processor is further configured to receive, from a remote network, real-time navigational data including a navigational path to be followed by the vehicle. Upon receiving the real-time navigational data, the at least one host processor is configured to determine a plurality of common segments corresponding to the received real-time location information based on a first Look-Up-Table (LUT) stored in the host memory. Subsequently, the at least one host processor is configured to determine a plurality of non-essential segments corresponding to the navigational path based on a second LUT stored in the host memory. Furthermore, the decoder unit is configured to generate a first bitmap based on the determined plurality of common segments and generate a second bitmap based on the determined plurality of non-essential segments. Subsequently, the at least one host processor is configured to control the display unit to illuminate a first set of segments corresponding to the first bitmap and a second set of segments corresponding to the second bitmap.
[0009] According to another embodiment of the present disclosure, also disclosed herein is a navigation display system that comprises a user device including a memory and at least one processor coupled with the memory, a display unit, a host memory, a decoder unit communicatively coupled with the host memory, and at least one host processor. The at least one host processor is communicatively coupled with the at least one processor of the user device, the host memory, the display unit, and the decoder unit. The at least one processor of the user device is configured to receive real-time location information of a vehicle from a navigation system. The at least one processor is further configured to receive, from a remote network, real-time navigational data including a navigational path to be followed by the vehicle. The at least one processor is further configured to determine a plurality of common segments corresponding to the received real-time location information based on a first LUT stored in the memory. Furthermore, the at least one processor is configured to determine a plurality of non-essential segments corresponding to the navigational path based on a second LUT stored in the memory. The decoder unit is configured to generate a first bitmap based on the determined plurality of common segments and generate a second bitmap based on the determined plurality of non-essential segments. Subsequently, the at least one host processor is configured to control the display unit to illuminate a first set of segments corresponding to the first bitmap and a second set of segments corresponding to the second bitmap.
[0010] According to yet another embodiment of the present disclosure, a navigation display method is disclosed. The method includes receiving real-time location information of a vehicle from a navigation system. The method further includes receiving, from a remote network, real-time navigational data including a navigational path to be followed by the vehicle. The method includes determining a plurality of common segments corresponding to the received real-time location information based on a first LUT stored in a host memory. Also, the method includes determining, based on a second LUT stored in the host memory, a plurality of non-essential segments corresponding to the navigational path. Upon determining the plurality of common segments, the method includes generating a first bitmap based on the determined plurality of common segments. Similarly, upon determining the plurality of non-essential segments, the method includes generating a second bitmap based on the determined plurality of non-essential segments. Further, the method includes controlling the display unit to illuminate a first set of segments corresponding to the first bitmap and a second set of segments corresponding to the second bitmap.
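The method steps above can be sketched as a short illustrative flow; the LUT keys and segment indices below are invented purely for illustration and are not the actual tables of the disclosure:

```python
def to_bitmap(segments):
    """Pack a list of segment indices into a bitmap, one bit per display segment."""
    bitmap = 0
    for seg in segments:
        bitmap |= 1 << seg
    return bitmap

def navigation_display_method(location, nav_path, first_lut, second_lut):
    """Sketch of the claimed method flow; table contents are illustrative."""
    common = first_lut.get(location, [])          # plurality of common segments
    non_essential = second_lut.get(nav_path, [])  # plurality of non-essential segments
    first_bitmap = to_bitmap(common)
    second_bitmap = to_bitmap(non_essential)
    # The display unit would illuminate the union of both segment sets.
    return first_bitmap | second_bitmap

# Hypothetical LUT entries for an upcoming left turn.
first_lut = {"approach": [0, 1, 2]}
second_lut = {"left-turn": [5, 6]}
print(bin(navigation_display_method("approach", "left-turn", first_lut, second_lut)))
# → 0b1100111
```

Because each LUT is consulted independently, the two bitmaps can be generated in parallel before the final illumination step.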
[0011] To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The foregoing and other features of embodiments will become more apparent from the following detailed description of embodiments when read in conjunction with the accompanying drawings. In the drawings, like reference numerals refer to like elements.
[0013] Figure 1a illustrates a block diagram of an embodiment of an Electronic Control Unit (ECU) of a vehicle, in accordance with an embodiment of the present disclosure.
[0014] Figure 1b illustrates a block diagram of an embodiment of a navigation display system, in accordance with an embodiment of the present disclosure.
[0015] Figure 1c illustrates a block diagram of another embodiment of the navigation display system, in accordance with an embodiment of the present disclosure.
[0016] Figure 1d illustrates a block diagram of yet another embodiment of the navigation display system, in accordance with an embodiment of the present disclosure.
[0017] Figure 1e illustrates a block diagram of yet another embodiment of the navigation display system, in accordance with an embodiment of the present disclosure.
[0018] Figure 1f illustrates a block diagram of yet another embodiment of the navigation display system, in accordance with an embodiment of the present disclosure.
[0019] Figure 1g illustrates an example diagram depicting illuminating frequency of segments during navigational path being followed by the vehicle, in accordance with an embodiment of the present disclosure.
[0020] Figure 2 illustrates a method flow chart of a navigation display method, in accordance with an embodiment of the present disclosure.
[0021] Figure 3 illustrates another flow chart depicting a detailed process flow of the navigation display method, in accordance with an embodiment of the present disclosure.
[0022] Figure 4 illustrates a segmented display of the navigation display system, in accordance with an embodiment of the present disclosure.
[0023] Figure 5a illustrates a first LUT for the plurality of common segments, in accordance with an embodiment of the present disclosure.
[0024] Figure 5b illustrates a second LUT for the plurality of non-essential segments of the navigation display system, in accordance with an embodiment of the present disclosure.
[0025] Figures 6a, 6b, and 6c illustrate an example of the segmented display and memory map of the navigation display system, in accordance with an exemplary embodiment of the present disclosure.
[0026] Figures 7a, 7b, and 7c illustrate an example of the segmented display and memory map of the navigation display system, in accordance with an exemplary embodiment of the present disclosure.
[0027] Figures 8a, 8b, and 8c illustrate an example of the segmented display and memory map of the navigation display system indicating a U-turn on the left side, in accordance with an exemplary embodiment of the present disclosure.
[0028] Figures 9a, 9b, and 9c illustrate an example of the segmented display and memory map of the navigation display system indicating a fork-left turn, in accordance with an exemplary embodiment of the present disclosure.
[0029] Figures 10a, 10b, and 10c illustrate an example of the segmented display and memory map of the navigation display system indicating a sharp left turn, in accordance with an exemplary embodiment of the present disclosure.
[0030] Figures 11a, 11b, and 11c illustrate an example of the segmented display and memory map of the navigation display system indicating a roundabout left turn, in accordance with an exemplary embodiment of the present disclosure.
[0031] Figure 12 illustrates an example of the segmented display of the navigation display system indicating a fork right turn, in accordance with an exemplary embodiment of the disclosure.
[0032] Figures 13a and 13b illustrate examples of the segmented display of the navigation display system indicating a ramp right turn and a ramp left turn, in accordance with exemplary embodiments of the present disclosure.
[0033] Figure 14 illustrates an example of the segmented display of the navigation display system indicating a sharp right turn, in accordance with an embodiment of the present disclosure.
[0034] Figures 15a and 15b illustrate examples of the segmented display of the navigation system indicating a slight right turn and a slight left turn, in accordance with exemplary embodiments of the present disclosure.
[0035] Figure 16 illustrates an example of the segmented display of the navigation system indicating a straight path, in accordance with an exemplary embodiment of the present disclosure.
[0036] Figure 17 illustrates an example of the segmented display of the navigation system indicating a U-turn towards the right side, in accordance with an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION
[0037] For the purpose of promoting an understanding of the principles of the present disclosure, reference will now be made to the various embodiments and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the present disclosure is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the present disclosure as illustrated therein being contemplated as would normally occur to one skilled in the art to which the present disclosure relates.
[0038] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are explanatory of the present disclosure and are not intended to be restrictive thereof.
[0039] Whether or not a certain feature or element was limited to being used only once, it may still be referred to as “one or more features” or “one or more elements” or “at least one feature” or “at least one element.” Furthermore, the use of the terms “one or more” or “at least one” feature or element does not preclude there being none of that feature or element, unless otherwise specified by limiting language including, but not limited to, “there needs to be one or more…” or “one or more elements is required.”
[0040] Reference is made herein to some “embodiments.” It should be understood that an embodiment is an example of a possible implementation of any features and/or elements of the present disclosure. Some embodiments have been described for the purpose of explaining one or more of the potential ways in which the specific features and/or elements of the proposed disclosure fulfil the requirements of uniqueness, utility, and non-obviousness.
[0041] Use of the phrases and/or terms including, but not limited to, “a first embodiment,” “a further embodiment,” “an alternate embodiment,” “one embodiment,” “an embodiment,” “multiple embodiments,” “some embodiments,” “other embodiments,” “further embodiment”, “furthermore embodiment”, “additional embodiment” or other variants thereof do not necessarily refer to the same embodiments. Unless otherwise specified, one or more particular features and/or elements described in connection with one or more embodiments may be found in one embodiment, or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments. Although one or more features and/or elements may be described herein in the context of only a single embodiment, or in the context of more than one embodiment, or in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all. Conversely, any features and/or elements described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.
[0042] Any particular and all details set forth herein are used in the context of some embodiments and therefore should not necessarily be taken as limiting factors to the proposed disclosure.
[0043] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by “comprises... a” does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
[0044] Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
[0045] For the sake of clarity, the first digit of a reference numeral of each component of the present disclosure is indicative of the Figure number, in which the corresponding component is shown. For example, reference numerals starting with digit “1” are shown at least in Figure 1. Similarly, reference numerals starting with digit “2” are shown at least in Figure 2.
[0046] An Electric Vehicle (EV) or a battery-powered vehicle, including, but not limited to, two-wheelers such as scooters, mopeds, and motorbikes/motorcycles; three-wheelers such as auto-rickshaws; and four-wheelers such as cars and other Light Commercial Vehicles (LCVs) and Heavy Commercial Vehicles (HCVs), primarily works on the principle of driving an electric motor using the power from the batteries provided in the EV. Furthermore, the electric vehicle may have at least one wheel which is electrically powered to traverse such a vehicle. The term ‘wheel’ may refer to any ground-engaging member which allows traversal of the electric vehicle over a path. The types of EVs include Battery Electric Vehicle (BEV), Hybrid Electric Vehicle (HEV), and Range Extended Electric Vehicle. However, the subsequent paragraphs pertain to the different elements of a Battery Electric Vehicle (BEV).
[0047] In construction, an EV typically comprises hardware components such as a battery or battery pack enclosed within a battery casing and includes a Battery Management System (BMS), an on-board charger, a Motor Controller Unit (MCU), an electric motor, and an electric transmission system. In addition to the hardware components/elements, the EV may be supported with software modules comprising intelligent features including, but not limited to, navigation assistance, hill assistance, cloud connectivity, Over-The-Air (OTA) updates, adaptive display techniques, and so on. The firmware of the EV may also comprise Artificial Intelligence (AI) & Machine Learning (ML) driven modules which enable the prediction of a plurality of parameters such as, but not limited to, driver/rider behaviour, road condition, charging infrastructures/charging grids in the vicinity, and so on. The data pertaining to the intelligent features may be displayed through a display unit present in the dashboard of the vehicle. In one embodiment, the display unit may contain a Liquid Crystal Display (LCD) screen of a predefined dimension. In another embodiment, the display unit may contain a Light-Emitting Diode (LED) screen of a predefined dimension. The display unit may be a water-resistant display supporting one or more User-Interface (UI) designs. The EV may support multiple cellular network generations such as 2G, 3G, 4G, 5G, and so on. Additionally, the EV may also be equipped with wireless infrastructure such as, but not limited to, Bluetooth, Wi-Fi, and so on to facilitate wireless communication with other EVs or the cloud.
[0048] Figure 1a illustrates a block diagram of an embodiment of an Electronic Control Unit (ECU) of a vehicle, in accordance with an embodiment of the present disclosure. The ECU of the vehicle is responsible for managing all the operations of the EV, wherein the key elements of the ECU (10) typically include (i) a microcontroller core (or processor unit) (12); (ii) a memory unit (14); (iii) a plurality of input (16) and output modules (18) and (iv) communication protocols including, but not limited to CAN protocol, Serial Communication Interface (SCI) protocol and so on. The sequence of programmed instructions and data associated therewith can be stored in a non-transitory computer-readable medium such as a memory unit or a storage device which may be any suitable memory apparatus such as, but not limited to read-only memory (ROM), programmable read-only memory (PROM), electrically erasable programmable read-only memory (EEPROM), random-access memory (RAM), flash memory, disk drive and the like. In one or more embodiments of the disclosed subject matter, non-transitory computer-readable storage media can be embodied with a sequence of programmed instructions for monitoring and controlling the operation of different components of the EV.
[0049] The processor may include any computing system which includes, but is not limited to, Central Processing Unit (CPU), an Application Processor (AP), a Graphics Processing Unit (GPU), a Visual Processing Unit (VPU), and/or an AI-dedicated processor such as a Neural Processing Unit (NPU). In an embodiment, the processor can be a single processing unit or several units, all of which could include multiple computing units. The processor may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor is configured to fetch and execute computer-readable instructions and data stored in the memory. The instructions can be compiled from source code instructions provided in accordance with a programming language such as Java, C++, C#.net or the like. The instructions can also comprise code and data objects provided in accordance with, for example, the Visual Basic™ language, LabVIEW, or another structured or object-oriented programming language. The one or a plurality of processors control the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory. The predefined operating rule or artificial intelligence model is provided through training or learning algorithms which include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.
[0050] Furthermore, the modules, processes, systems, and devices can be implemented as a single processor or as a distributed processor. Also, the processes, modules, and sub-modules described in the various figures of and for embodiments herein may be distributed across multiple computers or systems or may be co-located in a single processor or system. Further, the modules can be implemented in hardware, in instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor such as the processor described above, a state machine, a logic array, or any other suitable devices capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the required tasks, or the processing unit can be dedicated to performing the required functions. In another embodiment of the present disclosure, the modules may be machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the described functionalities. In an embodiment, the modules may include a receiving module, a generating module, a comparing module, a pairing module, and a transmitting module. The receiving module, the generating module, the comparing module, the pairing module, and the transmitting module may be in communication with each other. The data serves, amongst other things, as a repository for storing data processed, received, and generated by one or more of the modules. Exemplary structural embodiment alternatives suitable for implementing the modules, sections, systems, means, or processes described herein are provided below.
[0051] In one embodiment, disclosed herein is a navigation display system that includes at least a host memory, a display unit, a decoder unit, and at least one host processor communicatively coupled with the host memory, the display unit, and the decoder unit. The at least one host processor is configured to receive real-time location information of the vehicle from a navigation system and also receive real-time navigational information including a navigational path from a remote network. Further, the at least one host processor is configured to determine a plurality of common segments, based on a first LUT, corresponding to the real-time location information of the vehicle. Also, the at least one host processor is configured to determine a plurality of non-essential segments, based on a second LUT, corresponding to the real-time navigational path to be followed by the vehicle. Subsequently, the decoder unit is configured to generate a first bitmap and a second bitmap based on the determined common segments and non-essential segments, respectively. Further, the at least one host processor is configured to control the display unit to illuminate a first set of segments corresponding to the first bitmap and a second set of segments corresponding to the second bitmap.
[0052] The first set of segments and the second set of segments are illuminated to display a Turn-By-Turn navigation to a user of the vehicle. In a non-limiting example, the Turn-By-Turn navigation display is provided in an instrument cluster panel (i.e., instrument dashboard panel) of the vehicle to show a directional representation of a route or a navigational path through a segmented display of the display unit. Based on navigational data, the navigation display system may keep a user/rider informed about the best possible route that can be taken by the user by illuminating one or more segments of the display unit to form a pattern in a required direction. The navigation display system uses two separate LUTs that are stored in the host memory. The first LUT includes a set of common segments whereas the second LUT includes a set of non-essential segments. Each segment in the first LUT and the second LUT comprises a corresponding bitmap/memory map using which the host processor causes the corresponding segment of the navigation display to illuminate based on the real-time navigation data.
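A minimal sketch of the decoder step described above, assuming each LUT entry stores a pre-computed bitmap for its segment (all segment identifiers and bit patterns below are hypothetical):

```python
# Hypothetical per-segment bitmaps as they might appear in the two LUTs.
FIRST_LUT = {"S1": 0b0001, "S2": 0b0010, "S3": 0b0100}  # common segments
SECOND_LUT = {"S7": 0b1000}                              # non-essential segments

def decode(segment_ids, lut):
    """Decoder-unit sketch: OR the per-segment bitmaps into one display frame."""
    frame = 0
    for seg in segment_ids:
        frame |= lut[seg]
    return frame

first_bitmap = decode(["S1", "S3"], FIRST_LUT)   # 0b0101
second_bitmap = decode(["S7"], SECOND_LUT)       # 0b1000
illuminated = first_bitmap | second_bitmap       # 0b1101: segments to light up
```

Storing the bitmap alongside each segment entry avoids a second address computation at display time: the host processor only combines pre-computed masks.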
[0053] Figure 1b illustrates a block diagram of an embodiment of a navigation display system (100), in accordance with an embodiment of the present disclosure.
[0054] According to an embodiment, the navigation display system (100) comprises at least one host processor (101) (hereinafter also referred to as a host processor). The host processor (101) may correspond to the instrument cluster of the vehicle. The instrument cluster may correspond to an edge device with limited processing capabilities.
[0055] The host processor (101) may include any computing system which includes, but is not limited to, a Central Processing Unit (CPU), an Application Processor (AP), a Graphics Processing Unit (GPU), a Visual Processing Unit (VPU), and/or an AI-dedicated processor such as a Neural Processing Unit (NPU). In an embodiment, the host processor (101) can be a single processing unit or several units, all of which could include multiple computing units. The host processor (101) may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the host processor (101) is configured to fetch and execute computer-readable instructions and data stored in the host memory. The instructions can be compiled from source code instructions provided in accordance with a programming language such as Java, C++, C#.net, or the like. The instructions can also comprise code and data objects provided in accordance with, for example, the Visual Basic™ language, LabVIEW, or another structured or object-oriented programming language. A single processor or a plurality of processors controls the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory. The predefined operating rule or artificial intelligence model is provided through training or learning algorithms which include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.
[0056] As one of the inputs, the host processor (101) may receive the real-time vehicle location information of the vehicle from the navigation system (103). The navigation system (103) may correspond to a Global Navigation Satellite System (GNSS). The navigation system (103) is also referred to as the GNSS (103), interchangeably in the forthcoming paragraphs without any deviation from the scope of the present disclosure. The real-time vehicle location (i.e., a live location of the vehicle) is automatically updated after a pre-defined interval. In a non-limiting example, the pre-defined interval may correspond to a time interval of a few milliseconds. The real-time vehicle location may be determined based on a GNSS receiver attached to the vehicle. The GNSS receiver receives signals from one or more satellites of the GNSS (103) to determine a precise location of the vehicle. As another input, the host processor (101) may receive, from the remote network (105), real-time navigation data including the navigational path to be followed by the vehicle in accordance with a preferred embodiment of the present disclosure. The real-time navigation path may be fetched in real-time from a plurality of navigation-based platforms such as but not limited to Google Maps™, MapmyIndia™, and the like. The real-time navigation path may be generated based on destination location information manually provided by the user of the user device and the real-time vehicle location.
[0057] Based on the inputs received from the GNSS (103) and the remote network (105), the host processor (101) is configured to process the received input data using at least two LUTs which are stored in the host memory (115). The at least two LUTs correspond to a first LUT (501) and a second LUT (503). The first LUT (501) includes a plurality of common segments. Upon receiving the real-time location information from the GNSS (103), the host processor (101) is configured to determine the plurality of common segments corresponding to the received real-time location information based on the first LUT (501). The first LUT includes the plurality of common segments in a tabular form. In a non-limiting example, the plurality of common segments relates to segments A, B, C, and D which are illuminated by the host processor (101) to indicate a majority of navigation signs. The plurality of common segments relates to the segments which may be frequently illuminated by the host processor (101). The plurality of common segments is represented in a separate table, which helps reduce the retrieval or fetching time of the common segments when illuminating the display segments. Further, the plurality of common segments includes a set of common essential segments, i.e., segments C and D, which are always required to be illuminated by the host processor (101) to indicate the navigation signs during navigation. The host processor (101) is configured to control, based on the real-time navigational data, the display unit to illuminate display segments corresponding to the set of common essential segments, i.e., segments C and D, when the navigational path is being followed by the vehicle.
[0058] Upon receiving the navigational path from the remote network (105), the host processor (101) is configured to determine, based on the second LUT (503), the plurality of non-essential segments corresponding to the navigational path. The second LUT (503) includes the plurality of non-essential segments in the tabular form. The plurality of non-essential segments relates to segments 1-54 as shown in Figure 5b. Each segment among the segments 1-54 represents a particular portion/segment of the display unit (109). Therefore, to display any navigation sign on the display unit (109), a set of portions/segments among the segments 1-54 needs to be selected for the illumination.
[0059] The decoder unit (107) is configured to generate a first bitmap based on the determined plurality of common segments. Further, the decoder unit (107) is configured to generate a second bitmap based on the determined plurality of non-essential segments. Each segment of the first LUT (501) and the second LUT (503) comprises a bitmap/memory map. The bitmap represents a particular memory location of the corresponding segment. The bitmap of the corresponding segments from the first LUT (501) and the second LUT (503) is transmitted from the host processor (101) to the decoder unit (107) using a communication interface. In the preferred embodiment, a serial communication is employed for the transmission of the bitmaps of the corresponding segments to the decoder unit (107). The decoder unit (107) comprises a plurality of output ports, wherein each output port is connected to the corresponding segment of the display unit (109). Based on the outputs of the decoder unit (107), the host processor (101) is configured to control the display unit (109) to illuminate the first set of segments corresponding to the first bitmap and the second set of segments corresponding to the second bitmap.
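The bitmap generation described above can be sketched as follows. This is a minimal illustration, assuming the (SEG, BIT) LUT entries shown in Figures 5a and 5b; the packing of set bits into per-SEG bytes and the names `FIRST_LUT`, `SECOND_LUT`, and `generate_bitmap` are assumptions of this sketch, not the disclosed implementation.

```python
# First LUT (501): common segments A-D with (SEG, BIT) pairs from Figure 5a.
FIRST_LUT = {"A": (14, 0), "B": (14, 1), "C": (15, 0), "D": (15, 1)}

# Second LUT (503): a few non-essential segments, values from Figure 5b.
SECOND_LUT = {10: (2, 1), 30: (7, 1), 44: (10, 3)}

def generate_bitmap(segments, lut):
    """Pack the (SEG, BIT) entries of the requested segments into a
    {SEG: byte} map, setting one bit per segment to be illuminated."""
    bitmap = {}
    for name in segments:
        seg, bit = lut[name]
        bitmap[seg] = bitmap.get(seg, 0) | (1 << bit)
    return bitmap

# Segments A and B share SEG 14; C and D share SEG 15.
print(generate_bitmap(["A", "B", "C", "D"], FIRST_LUT))  # {14: 3, 15: 3}
```

On the decoder side, each set bit in such a map would then correspond to one output port driving the matching segment of the display unit (109).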
[0060] According to another embodiment, the host processor (101) may further be configured to control the display unit to continuously illuminate segments corresponding to the set of common essential segments in the first bitmap. Particularly, the host processor (101) is configured to control, based on the real-time navigational data, the display unit to illuminate segments corresponding to the set of common essential segments, i.e., segments C and D, throughout the navigation duration during which the navigational path is being followed by the vehicle.
[0061] According to another embodiment, the host processor (101) is configured to control, based on the real-time navigational data, the display unit (109) to illuminate the segments corresponding to the plurality of common segments, i.e., segments A, B, C, and D, along with at least one of the plurality of non-essential segments when the vehicle approaches a plurality of turns while following the navigational path. The plurality of turns may include, but is not limited to, a right turn, a left turn, a fork-left turn, a sharp left turn, a roundabout left turn, a fork right turn, a U-turn, a ramp right turn, a ramp left turn, a sharp right turn, etc. However, in the case of a roundabout turn, the illumination of the segments corresponding to the segments A and B may not be required, and only the set of common essential segments, i.e., segments C and D, along with at least one of the plurality of non-essential segments is required to be illuminated, as illustrated in Figure 11a of the present disclosure.
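The turn-dependent selection of common segments can be illustrated with a short sketch; the rule that a roundabout omits segments A and B follows the paragraph above and Figure 11a, while the turn-type strings and the helper name `common_segments_for` are assumptions for illustration.

```python
COMMON_SEGMENTS = ["A", "B", "C", "D"]   # plurality of common segments
COMMON_ESSENTIAL = ["C", "D"]            # set of common essential segments

def common_segments_for(turn_type):
    """Return the common segments to illuminate for an upcoming turn;
    roundabout turns need only the common essential segments C and D."""
    if turn_type.startswith("roundabout"):
        return COMMON_ESSENTIAL
    return COMMON_SEGMENTS

print(common_segments_for("sharp_left"))       # ['A', 'B', 'C', 'D']
print(common_segments_for("roundabout_left"))  # ['C', 'D']
```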
[0062] Resultantly, based on the live location of the vehicle and the final destination, the user receives Turn-By-Turn navigation instructions through the display unit (109). In an embodiment, the Turn-By-Turn navigation instructions may be additionally provided as an audio output using a speaker (111) which is electronically connected to the host processor (101). Additionally, all the elements of the navigation display system (100) are powered using a power supply unit (113) which may be a grid supply or a battery supply.
[0063] Figure 1c illustrates a block diagram of another embodiment of the navigation display system, in accordance with an embodiment of the present disclosure. The navigation display system (100), according to the present embodiment, is identical to the navigation display system (100) illustrated in Figure 1b. However, in the present embodiment, a user device (102) may be used to fetch the real-time navigation data including the navigational path from the remote network (105). The user device (102) may correspond to, but is not limited to, a mobile phone, a smartphone, a tablet, a wearable device, a personal navigation device, a smartwatch, etc. The user device (102) includes a memory and at least one processor coupled with the memory. The received navigation path may be transmitted from the user device (102) to the host processor (101) using a wireless network infrastructure such as, but not limited to, BLE (Bluetooth Low Energy), Wi-Fi, and so on. This embodiment is particularly useful when there is limited/no network connectivity on the host processor (101) to receive the navigational path from the remote network (105).
[0064] Figure 1d illustrates a block diagram of yet another embodiment of the navigation display system, in accordance with an embodiment of the present disclosure. The navigation display system (100) in the present embodiment is identical to the navigation display system (100) illustrated in Figure 1b. However, in the present embodiment, user devices (102a, 102b) may be used to receive the real-time location information of the vehicle from the GNSS (103) and the real-time navigational data including the navigational path from the remote network (105). The received real-time location information and the navigation path may be transmitted from the user devices (102a, 102b) to the host processor (101) using the wireless network infrastructure, such as, but not limited to, the BLE, Wi-Fi, and so on. This embodiment is particularly useful when there is limited/no network connectivity and no GPS access or GNSS receiver available on the host processor (101). It is understood that the user devices (102a) and (102b) can be a single user device.
[0065] Figure 1e illustrates a block diagram of yet another embodiment of the navigation display system, in accordance with an embodiment of the present disclosure. The navigation display system (100) in the present embodiment is identical to the navigation display system (100) illustrated in Figure 1b. However, in the present embodiment, a user device processor (102c) such as but not limited to at least one processor present in a mobile phone may be used to fetch real-time location information of the vehicle from the GNSS (103) and the real-time navigational data including the navigational path from the remote network (105). The received real-time location information and the navigational path may be processed locally on the user device processor (102c) to subsequently transmit from the user device processor (102c) to the host processor (101). The user device processor (102c) may be configured to determine the plurality of common segments corresponding to the received real-time location information based on the first LUT stored in the memory of the user device. The user device processor (102c) may further be configured to determine a plurality of non-essential segments corresponding to the navigational path based on the second LUT stored in the memory of the user device. The transmission may be performed using the wireless network infrastructure, such as, but not limited to, the BLE, Wi-Fi, and so on. Further, upon transmission, the decoder unit (107) may be configured to generate the first bitmap based on the determined plurality of common segments. Further, the decoder unit (107) may be configured to generate the second bitmap based on the determined plurality of non-essential segments. Subsequently, the host processor (101) is configured to control the display unit to illuminate the first set of segments corresponding to the first bitmap and the second set of segments corresponding to the second bitmap. 
In the present embodiment, utilizing the user device processor (102c) decreases the processing load on the host processor (101), thereby resulting in a faster indication of the navigation information using the Turn-By-Turn segmented display unit (109).
[0066] Figure 1f illustrates a block diagram of yet another embodiment of the navigation display system, in accordance with an embodiment of the present disclosure. The navigation display system (100) in the present embodiment is identical to the navigation display system (100) as illustrated in Figure 1b. However, in the present embodiment, the host processor (101) may control a plurality of specifications, such as, but not limited to, a blink rate of the illuminated segments of the display unit (109), based on the proximity of the vehicle to the upcoming navigation action, such as taking the left/right turn, the U-turn, and so on. Particularly, the host processor (101) is configured to control the illuminating frequency (i.e., the blinking frequency of the segments) of each of the first set of segments and the second set of segments based on a remaining distance that is to be covered by the vehicle before approaching an upcoming turn along the navigational path. In this embodiment, the blink rate of the upcoming navigation sign may be inversely proportional to the remaining distance to the upcoming turn or action on the path to be taken, i.e., the blink rate increases as the vehicle approaches the turn. An example diagram depicting the illuminating frequency of segments during the navigational path being followed by the vehicle is shown in Figure 1g of the drawings, in accordance with an embodiment of the present disclosure.
[0067] For example, when the rider is 250 meters away from a right turn, there is no blinking of the segments of the display unit (109). However, when the rider is 100 meters away from an intended right turn, the blinking action on the segments of the display unit (109) is initiated, wherein the blinking frequency is low, such as once every 0.75 seconds. Subsequently, when the rider is 75 meters away from the intended right turn, the frequency of the blinking action of the segments of the display unit (109) increases to a pre-defined moderate frequency such as, but not limited to, once every 0.50 seconds. However, when the rider is 50 meters away from the intended right turn, the frequency of the blinking action of the segmented display unit (109) increases to a pre-defined high frequency such as, but not limited to, once every 0.25 seconds. Finally, when the rider is 10 meters away from the intended right turn, the frequency of the blinking action of the segmented display unit (109) further increases, such as to once every 0.10 seconds. Additionally, when the rider is approaching the intended turn, an audio output may also be provided using a speaker device (111).
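The distance-to-blink-rate schedule of this example can be sketched as a simple threshold function; the thresholds and periods mirror the paragraph above, while the function name and the return convention (`None` meaning steady illumination) are assumptions of this sketch.

```python
def blink_period(distance_m):
    """Blink period in seconds for the remaining distance to the
    upcoming turn; None means the segments stay steadily lit."""
    if distance_m <= 10:
        return 0.10   # further increased frequency near the turn
    if distance_m <= 50:
        return 0.25   # pre-defined high frequency
    if distance_m <= 75:
        return 0.50   # pre-defined moderate frequency
    if distance_m <= 100:
        return 0.75   # low frequency, blinking just initiated
    return None       # e.g. 250 m away: no blinking yet

print(blink_period(250))  # None
print(blink_period(60))   # 0.5
```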
[0068] Figure 2 illustrates a method flow chart of a navigation display method, in accordance with an embodiment of the present disclosure. Figure 2 illustrates a method 200 for navigation display. The method 200 includes a series of operational steps 201 through 213.
[0069] In step 201, the method 200 comprises receiving real-time location information of the vehicle from the navigation system (103). The host processor (101) may be configured to receive the real-time location information of the vehicle from the navigation system (103). Alternatively, the user device processor (102c) may be configured to receive the real-time location information from the navigation system (103). The navigation system may correspond to the GNSS (103). The real-time vehicle location may be determined based on the GNSS receiver attached to the vehicle. The vehicle location may be determined frequently (i.e., in milliseconds) to receive the real-time location of the vehicle. The flow of the method now proceeds to step 203.
[0070] In step 203, the method 200 comprises receiving real-time navigational data including the navigational path to be followed by the vehicle from the remote network (105). The host processor (101) may be configured to receive real-time navigational data including the navigational path from the remote network (105). Alternatively, the user device processor (102c) may be configured to receive the real-time navigational data from the remote network (105). The navigational path may be identified by the remote network (105) based on the destination location information manually received in the user device and the real-time location information of the vehicle. The navigational path may correspond to an optimal navigational path that is available to reach the destination location from the current location of the vehicle (i.e., the real-time location of the vehicle). The flow of the method now proceeds to steps 205 and 207.
[0071] In step 205, the method 200 comprises determining the plurality of common segments corresponding to the received real-time location information based on the first LUT (501) stored in the host memory (115). The host processor (101) may be configured to determine the plurality of common segments corresponding to the real-time location information. Alternatively, the user device processor (102c) may also be configured to determine some or all of the plurality of common segments. The plurality of common segments relates to segments A, B, C, and D. The plurality of common segments is illuminated in the display unit (109) to indicate the majority of the navigation signs. The flow of the method proceeds to step 209 from step 205.
[0072] In step 207, the method 200 comprises determining the plurality of non-essential segments corresponding to the navigational path based on the second LUT (503) stored in the host memory (115). The host processor (101) may be configured to determine the plurality of non-essential segments corresponding to the navigational path. Alternatively, the user device processor (102c) may also be configured to determine the plurality of non-essential segments. The plurality of non-essential segments is represented in the second LUT (503). The plurality of non-essential segments relates to segments 1-54. The corresponding non-essential segments are thus determined based on the navigational path. In a non-limiting example, if the navigational path relates to “turn right”, then the plurality of non-essential segments corresponding to the navigational path “turn right” is determined. The flow of the method proceeds to step 211 from step 207.
[0073] In step 209, the method 200 comprises generating the first bitmap based on the determined plurality of common segments. The decoder unit (107) is configured to generate the first bitmap. The first bitmap corresponds to the particular memory location of the corresponding plurality of common segments in the first LUT (501). In a non-limiting example, as illustrated in the first LUT (501) of Figure 5a, the bitmap corresponding to the common segment A is (14,0). The bitmap (14,0) refers to “SEG 14” and “BIT 0”. Thus, each of the plurality of common segments is represented by the corresponding bitmap from the first LUT (501). The flow of the method now proceeds to step 213.
[0074] In step 211, the method 200 comprises generating the second bitmap based on the determined plurality of non-essential segments. The decoder unit (107) is configured to generate the second bitmap. The second bitmap corresponds to the particular memory location of the corresponding plurality of non-essential segments in the second LUT (503). In a non-limiting example, as illustrated in the second LUT (503) of Figure 5b, the bitmap corresponding to the non-essential segment 10 is (2,1). The bitmap (2,1) refers to “SEG 2” and “BIT 1”. Thus, each of the plurality of non-essential segments is represented by the corresponding bitmap from the second LUT (503). The flow of the method now proceeds to step 213.
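On the receiving side, the decoder's fan-out to output ports can be sketched by expanding such a {SEG: byte} bitmap back into (SEG, BIT) positions; the one-port-per-bit model is an assumption based on the description of the decoder unit (107), and the function name is ours.

```python
def ports_to_drive(bitmap):
    """Expand a {SEG: byte} bitmap into the sorted list of (SEG, BIT)
    positions whose output ports should be driven to illuminate."""
    ports = []
    for seg, byte in sorted(bitmap.items()):
        for bit in range(8):          # assume 8 bits per SEG byte
            if byte & (1 << bit):
                ports.append((seg, bit))
    return ports

# Non-essential segment 10 maps to (SEG 2, BIT 1), i.e. byte 0b10 on SEG 2:
print(ports_to_drive({2: 0b10}))  # [(2, 1)]
```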
[0075] In step 213, the method 200 comprises controlling the display unit (109) to illuminate a first set of segments corresponding to the first bitmap and a second set of segments corresponding to the second bitmap. The host processor (101) is configured to control the display unit (109) to illuminate the first set of segments and the second set of segments. The first set of segments represents the first bitmap corresponding to the plurality of common segments. Further, the second set of segments represents the second bitmap corresponding to the plurality of non-essential segments. Thus, the display unit (109) displays Turn-by-Turn navigation to the user by turning-ON the corresponding first set of segments and the second set of segments. The user may maneuver the vehicle based on the Turn-by-Turn navigation displayed on the display unit (109).
[0076] Figure 3 illustrates a flow chart depicting a detailed process flow of the navigation display method, in accordance with an embodiment of the present disclosure. Figure 3 depicts a process flow of a method 300 including a series of operation steps 301 through 313 for the navigation display.
[0077] In step 301, the method 300 comprises receiving the destination location information manually from the user via the user device. In a non-limiting example, the user may initiate an application or app corresponding to the navigation display system in the user device to enter the destination location information manually. In another non-limiting example, the user may access any of the plurality of navigation-based platforms, such as but not limited to Google Maps™, and MapmyIndia™ in the user device, and thereby may enter the destination location information manually using an input interface of the navigation-based platforms. The flow of the method now proceeds to step 303.
[0078] In step 303, the method 300 comprises transmitting the destination location information to the GNSS (103) from the user device. The user device (102a) remains connected with the GNSS (103) via a network connection. Therefore, upon receiving the destination location information in the user device, the destination location information is transmitted to the GNSS (103) via the network connection. The flow of the method now proceeds to step 305.
[0079] In step 305, the method 300 comprises receiving the destination location information including a destination location at the instrument cluster of the vehicle from the GNSS (103). The instrument cluster of the vehicle also remains connected with the GNSS (103) via the network connection. Thus, the instrument cluster receives the destination location from the GNSS (103) once the user device transmits the destination information to the GNSS (103). The flow of the method now proceeds to step 307.
[0080] In step 307, the method 300 comprises triggering a navigation module in the instrument cluster of the vehicle. In step 307, the method 300 further comprises fetching the current location of the vehicle using the GNSS receiver attached to the vehicle. The method thereby uses the current location of the vehicle along with the destination location to determine the navigation path. In a non-limiting example, upon being triggered, the navigation module may initiate displaying a navigational map in the instrument cluster of the vehicle. The flow of the method now proceeds to step 309.
[0081] In step 309, the method 300 comprises synchronizing the destination location and the current location of the vehicle between the remote network (105) and the instrument cluster of the vehicle. The instrument cluster of the vehicle remains connected with the remote network (105) via the network connection. The flow of the method now proceeds to step 311.
[0082] In step 311, the method 300 comprises receiving real-time navigational data including the navigational path at the instrument cluster of the vehicle from the remote network (105). Therefore, the instrument cluster of the vehicle receives the navigational path indicating the direction the vehicle should follow to reach the destination location. The host processor (101) of the instrument cluster of the vehicle is configured to receive data relating to the navigational path from the remote network (105) by removing packet headers, cyclic redundancy check (CRC) fields, etc. from the data packets to reduce the processing load of the host processor (101). The data relating to the navigational path is further truncated to extract the next turn event along with the remaining distance to the next turn. The flow of the method now proceeds to step 313.
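A hypothetical sketch of the truncation step: after headers and CRC fields are stripped, the remaining payload is reduced to the next turn event and its remaining distance. The compact `"event:distance"` payload format and the function name are assumptions of this sketch; the actual packet layout is not specified in the disclosure.

```python
def extract_next_turn(payload):
    """Parse a stripped payload such as 'turn_right:100' into the next
    turn event and the remaining distance (in meters) to that turn."""
    event, distance = payload.split(":")
    return event, int(distance)

print(extract_next_turn("turn_right:100"))  # ('turn_right', 100)
```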
[0083] In step 313, the method 300 comprises displaying real-time navigation by controlling illuminating and turning-OFF illumination of the segments of the display unit (109) using the first bitmap and the second bitmap. Particularly, the plurality of common segments is determined based on the current location of the vehicle, and the plurality of non-essential segments is determined based on the navigational path. Thereby, the first bitmap is generated based on the plurality of common segments and the second bitmap is generated based on the plurality of non-essential segments. Upon determining the first bitmap and the second bitmap, the method 300 comprises displaying the navigation by illuminating the corresponding segments of the display unit (109).
[0084] Figure 4 illustrates a segmented display of a navigation display system, in accordance with an embodiment of the present disclosure. The segment display (401) includes a plurality of segments. A pre-defined number is assigned to each segment of the plurality of segments, wherein each pre-defined number comprises bitmap values that are generated using a memory mapping table which is stored in the host memory (115). The segment display (401) comprises common segments including common essential segments, wherein A, B, C, and D are common segments, out of which segments C and D are common essential segments, as these two segments remain illuminated at all times while the vehicle is being navigated. Further, the segmented display comprises non-essential segments ranging from 1 to 54, wherein each of the non-essential segments comprises the bitmap that is generated using the memory mapping table which is stored in the host memory (115). The non-essential segments are illuminated only based on the real-time navigation data received from the remote network (105).
[0085] Figure 5a illustrates the first LUT (501) for the plurality of common segments, in accordance with an embodiment of the present disclosure. The plurality of common segments includes the set of common essential segments of the navigation display system. The segmented display comprises the plurality of common segments including the set of common essential segments, wherein A, B, C, and D are the plurality of common segments, out of which C and D are the set of common essential segments that remain illuminated at all times while the vehicle is being navigated along the navigational path. Each of the common segments comprises the bitmap that is generated using the memory mapping table which is stored in the host memory (115). As illustrated in Figure 5a, the bitmap of segment A is (14,0), the bitmap of segment B is (14,1), the bitmap of segment C is (15,0), and the bitmap of segment D is (15,1). The bitmap of segment A is represented by “SEG 14” (segment 14) and “BIT 0” (Bit 0). Thus, the bitmap of segment A is (14, 0).
[0086] Figure 5b illustrates the second LUT (503) for the non-essential segments of the navigation display system, in accordance with an embodiment of the present disclosure. The non-essential segments range from 1 to 54, wherein each of the non-essential segments comprises the bitmap that is generated using the memory mapping table which is stored in the host memory (115) of the host processor (101). For example, the bitmap of segment 30 is (7,1), i.e., segment 30 is mapped under segment 7 and Bit 1. In yet another example, the bitmap of segment 44 is (10,3), i.e., segment 44 is mapped under segment 10 and Bit 3. The non-essential segments are illuminated only based on the real-time navigation data received from the remote network (105).
[0087] Figures 6a, 6b, and 6c illustrate an example of the segmented display and memory map of the navigation display system, in accordance with an exemplary embodiment of the present disclosure. Figure 6a indicates a 90-degree right turn. Based on the real-time location of the vehicle received from the GNSS (103) and the real-time navigation data received from the remote server (105), the host processor (101) generates a bitmap using the first LUT (501) and the second LUT (503) stored in the host memory (115). In this exemplary scenario, to indicate the 90-degree right turn using the segmented display unit (109) (as indicated in Figure 6a), common segments A, B, C, and D, as well as specific non-essential segments such as 3, 14, 31, 36, 45, 47, 46 and 44, are illuminated. The bitmaps for the common segments and the non-essential segments required for indicating a right turn are illustrated in Figures 6b and 6c, respectively.
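The full illumination set for this sign can be sketched as the union of the common segments and the turn-specific non-essential segments; the segment numbers come from the paragraph above, while the set-based composition and the names are illustrative assumptions.

```python
COMMON = {"A", "B", "C", "D"}                       # common segments
RIGHT_TURN_NON_ESSENTIAL = {3, 14, 31, 36, 44, 45, 46, 47}

def right_turn_sign():
    """All display segments to illuminate for the 90-degree right turn
    of Figure 6a."""
    return COMMON | RIGHT_TURN_NON_ESSENTIAL

print(len(right_turn_sign()))  # 12 segments in total
```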
[0088] Figures 7a, 7b, and 7c illustrate an example of the segmented display and memory map of the navigation display system, in accordance with an exemplary embodiment of the present disclosure. Figure 7a indicates a 90-degree left turn. Based on the real-time location of the vehicle received from the GNSS (103) and the real-time navigation data received from the remote server (105), the host processor (101) generates a bitmap using the first LUT (501) and second LUT (503) stored in the host memory (115). In this exemplary scenario, to indicate a left turn using the segmented display unit (109) (as indicated in Figure 7a), common segments A, B, C, and D, as well as specific non-essential segments such as 8, 25, 34, 39, 48, 49, 50 and 51, are illuminated. The bitmaps for the common segments and the non-essential segments required for indicating the 90-degree left turn are illustrated in Figures 7b and 7c, respectively.
[0089] Figures 8a, 8b, and 8c illustrate an example of the segmented display and memory map of the navigation display system indicating a U-turn on the left side, in accordance with an exemplary embodiment of the present disclosure. Based on the real-time location of the vehicle received from the GNSS (103) and the real-time navigation data received from the remote server (105), the host processor (101) generates a bitmap using the first LUT (501) and the second LUT (503) stored in the host memory (115). In this case, to indicate the U-turn on the left side using the segmented display unit (109) (as indicated in Figure 8a), common segments A, B, C, and D as well as specific non-essential segments such as 6, 21, 22, 54, 33, 41, 39, 48, 49, 50 and 51 are illuminated. The bitmaps for the common segments and the non-essential segments required for indicating the U-turn on the left side are illustrated in Figures 8b and 8c, respectively.
[0090] Figures 9a, 9b, and 9c illustrate an example of the segmented display and memory map of the navigation display system indicating a fork left turn, in accordance with an exemplary embodiment of the present disclosure. Based on the real-time location of the vehicle received from the GNSS (103) and the real-time navigation data received from the remote server (105), the host processor (101) generates a bitmap using the first LUT (501) and the second LUT (503) stored in the host memory (115). In this case, to indicate the fork left turn using the segmented display unit (109) (as indicated in Figure 9a), common segments A, B, C, and D as well as specific non-essential segments such as 9, 27, 35, 49, 50, 51, 52, 42, 43, 44, 45, 46, 30 and 12 are illuminated. The bitmaps for the common segments and the non-essential segments required for indicating the fork left turn are illustrated in Figures 9b and 9c, respectively.
[0091] Figures 10a, 10b, and 10c illustrate an example of the segmented display and memory map of the navigation display system indicating a sharp left turn, in accordance with an exemplary embodiment of the present disclosure. Based on the real-time location of the vehicle received from the GNSS (103) and the real-time navigation data received from the remote server (105), the host processor (101) generates a bitmap using the first LUT (501) and the second LUT (503) stored in the host memory (115). In this case, to indicate the sharp left turn using the segmented display unit (109) (as indicated in Figure 10a), common segments A, B, C, and D as well as specific non-essential segments such as 7, 22, 23, 33, 38, 48, and 49 are illuminated. The bitmaps for the common segments and the non-essential segments required for indicating the sharp left turn are illustrated in Figures 10b and 10c, respectively.
[0092] Figures 11a, 11b, and 11c illustrate an example of the segmented display and memory map of the navigation display system indicating a roundabout left turn, in accordance with an exemplary embodiment of the present disclosure. Based on the real-time location of the vehicle received from the GNSS (103) and the real-time navigation data received from the remote server (105), the host processor (101) generates a bitmap using the first LUT (501) and the second LUT (503) stored in the host memory (115). In this case, to indicate the roundabout left turn using the segmented display unit (109) (as indicated in Figure 11a), the common essential segments C and D as well as specific non-essential segments such as 9, 27, 26, 25, 24, 23, 22, 21, and 20 are illuminated. The bitmaps for the common essential segments and the non-essential segments required for indicating the roundabout left turn are illustrated in Figures 11b and 11c, respectively.
[0093] Figure 12 illustrates an example of the segmented display of the navigation display system indicating a fork right turn, in accordance with an exemplary embodiment of the disclosure. In this exemplary embodiment, all common segments and specific non-essential segments are illuminated based on generated bitmaps using the first LUT (501) and the second LUT (503).
[0094] Figures 13a and 13b illustrate examples of the segmented display of the navigation display system indicating a ramp right turn and ramp left turn, in accordance with exemplary embodiments of the present disclosure. In both cases, as illustrated in Figures 13a, and 13b, all common segments and specific non-essential segments are illuminated.
[0095] Figure 14 illustrates an example of the segmented display of the navigation display system indicating a sharp right turn, in accordance with an embodiment of the present disclosure. In this case, all common segments and specific non-essential segments are illuminated to display the sharp right turn.
[0096] Figures 15a and 15b illustrate examples of the segmented display of the navigation display system indicating a slight right turn and slight left turn, in accordance with exemplary embodiments of the present disclosure. In both cases, all common segments and specific non-essential segments are illuminated.
[0097] Figure 16 illustrates an example of the segmented display of the navigation display system indicating a straight path, in accordance with an exemplary embodiment of the present disclosure. In this case, all common segments and specific non-essential segments are illuminated.
[0098] Figure 17 illustrates an example of the segmented display of the navigation display system indicating a U-turn towards the right side, in accordance with an exemplary embodiment of the present disclosure. In this case, all common segments and specific non-essential segments are illuminated.
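Taken together, the examples of Figures 7a through 11a amount to a mapping from manoeuvre type to two segment sets. A minimal sketch of that mapping, using the segment numbers quoted in the description (the key names and the dictionary layout are hypothetical, added only for illustration):

```python
# Illustrative manoeuvre-to-segment mapping gathered from the examples of
# Figures 7a-11a. Segment numbers are as listed in the description; the
# dictionary layout and key names are assumptions for illustration.
NON_ESSENTIAL = {
    "left_turn_90":    [8, 25, 34, 39, 48, 49, 50, 51],                      # Fig. 7a
    "u_turn_left":     [6, 21, 22, 54, 33, 41, 39, 48, 49, 50, 51],          # Fig. 8a
    "fork_left":       [9, 27, 35, 49, 50, 51, 52, 42, 43, 44, 45, 46, 30, 12],  # Fig. 9a
    "sharp_left":      [7, 22, 23, 33, 38, 48, 49],                          # Fig. 10a
    "roundabout_left": [9, 27, 26, 25, 24, 23, 22, 21, 20],                  # Fig. 11a
}

COMMON = {
    # The roundabout example illuminates only the common essential
    # segments C and D; the remaining manoeuvres use all of A-D.
    "roundabout_left": ["C", "D"],
}

def segments_for(manoeuvre):
    """Return (common_segments, non_essential_segments) for a manoeuvre."""
    return COMMON.get(manoeuvre, ["A", "B", "C", "D"]), NON_ESSENTIAL[manoeuvre]
```

As paragraph [0092] notes, the roundabout left turn is the one case in these examples that restricts the common set to the essential segments C and D rather than the full set A through D.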
[0099] It will be appreciated that the modules, processes, systems, and devices described above can be implemented in hardware, hardware programmed by software, software instructions stored on a non-transitory computer readable medium, or a combination of the above. Embodiments of the methods, processes, modules, devices, and systems (or their sub-components or modules) may be implemented on a general-purpose computer, a special-purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element, an ASIC or other integrated circuit, a digital signal processor, a hardwired electronic or logic circuit such as a discrete element circuit, a programmed logic circuit such as a programmable logic device (PLD), programmable logic array (PLA), field-programmable gate array (FPGA), programmable array logic (PAL) device, or the like. In general, any process capable of implementing the functions or steps described herein can be used to implement embodiments of the methods, systems, or computer program products (software programs stored on a non-transitory computer readable medium).
[0100] Furthermore, embodiments of the disclosed methods, processes, modules, devices, systems, and computer program product may be readily implemented, fully or partially, in software using, for example, object or object-oriented software development environments that provide portable source code that can be used on a variety of computer platforms. Alternatively, embodiments of the disclosed methods, processes, modules, devices, systems, and computer program product can be implemented partially or fully in hardware using, for example, standard logic circuits or a very-large-scale integration (VLSI) design. Other hardware or software can be used to implement embodiments depending on the speed and/or efficiency requirements of the systems, the particular function, and/or particular software or hardware system, microprocessor, or microcomputer being utilized.
[0101] In this application, unless specifically stated otherwise, the use of the singular includes the plural and the use of “or” means “and/or.” Furthermore, use of the terms “including” or “having” is not limiting. Any range described herein will be understood to include the endpoints and all values between the endpoints. Features of the disclosed embodiments may be combined, rearranged, omitted, etc., within the scope of the invention to produce additional embodiments. Furthermore, certain features may sometimes be used to advantage without a corresponding use of other features.
[0102] While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist.
Reference numbers:

| Component | Reference Number |
|---|---|
| Navigation display system | 100 |
| Host processor | 101 |
| User device(s) | 102 (102a, 102b) |
| User device processor | 102c |
| GNSS | 103 |
| Remote network | 105 |
| Decoder unit | 107 |
| Segmented display unit | 109 |
| Speaker | 111 |
| Power supply unit | 113 |
| Host memory | 115 |
| First LUT | 501 |
| Second LUT | 503 |
CLAIMS:
1. A navigation display system (100), comprising:
a host memory (115);
a display unit (109);
a decoder unit (107) communicatively coupled with the host memory;
at least one host processor (101) that is communicatively coupled with the host memory (115), the display unit, and the decoder unit, wherein
the at least one host processor (101) is configured to:
receive real-time location information of a vehicle from a navigation system (103);
receive, from a remote network (105), real-time navigational data including a navigational path to be followed by the vehicle;
determine a plurality of common segments corresponding to the received real-time location information based on a first Look-Up-Table (LUT) (501) stored in the host memory (115); and
determine, based on a second LUT (503) stored in the host memory (115), a plurality of non-essential segments corresponding to the navigational path,
the decoder unit (107) is configured to:
generate a first bitmap based on at least one of the determined plurality of common segments; and
generate a second bitmap based on the determined plurality of non-essential segments,
the at least one host processor (101) is further configured to:
control the display unit (109) to illuminate a first set of segments corresponding to the first bitmap and a second set of segments corresponding to the second bitmap.
2. The system as claimed in claim 1, wherein the plurality of common segments includes a set of common essential segments.
3. The system as claimed in claim 2, wherein the at least one host processor (101) is further configured to control, based on the real-time navigational data, the display unit (109) to illuminate segments corresponding to the set of common essential segments throughout a navigation duration during which the navigational path is being followed by the vehicle.
4. The system as claimed in claim 2, wherein the at least one host processor (101) is further configured to control, based on the real-time navigational data, the display unit to illuminate segments corresponding to the set of common essential segments excluding at least one of the plurality of common segments which are not part of the set of common essential segments when the vehicle approaches a plurality of turns while following the navigational path, wherein the plurality of turns excludes a roundabout turn.
5. The system as claimed in claim 1, wherein
the first LUT (501) includes a first plurality of values corresponding to the plurality of common segments in a tabular form, and
the second LUT (503) includes a second plurality of values corresponding to the plurality of non-essential segments in the tabular form.
6. The system as claimed in claim 1, wherein the at least one host processor (101) is further configured to control illuminating frequency of each of the first set of segments and the second set of segments based on a remaining distance that is to be covered by the vehicle before approaching an upcoming turn along the navigational path.
7. A navigation display system, comprising:
a user device (102, 102a, 102b) that includes a memory and at least one processor (102c) coupled with the memory;
a display unit (109);
a host memory (115);
a decoder unit (107) communicatively coupled with the host memory;
at least one host processor that is communicatively coupled with the at least one processor, the host memory (115), the display unit, and the decoder unit, wherein
the at least one processor (102c) is configured to:
receive real-time location information of a vehicle from a navigation system;
receive, from a remote network, real-time navigational data including a navigational path to be followed by the vehicle;
determine a plurality of common segments corresponding to the received real-time location information based on a first Look-Up-Table (LUT) (501) stored in the memory; and
determine, based on a second LUT (503) stored in the memory, a plurality of non-essential segments corresponding to the navigational path;
the decoder unit (107) is configured to:
generate a first bitmap based on at least one of the determined plurality of common segments; and
generate a second bitmap based on the determined plurality of non-essential segments,
the at least one host processor (101) is configured to:
control the display unit to illuminate a first set of segments corresponding to the first bitmap and a second set of segments corresponding to the second bitmap.
8. A navigation display method (200), comprising:
receiving (201) real-time location information of a vehicle from a navigation system;
receiving (203), from a remote network, real-time navigational data including a navigational path to be followed by the vehicle;
determining (205) a plurality of common segments corresponding to the received real-time location information based on a first Look-Up-Table (LUT) stored in a host memory;
determining (207), based on a second LUT stored in the host memory, a plurality of non-essential segments corresponding to the navigational path;
generating (209) a first bitmap based on at least one of the determined plurality of common segments;
generating (211) a second bitmap based on the determined plurality of non-essential segments; and
controlling (213) the display unit to illuminate a first set of segments corresponding to the first bitmap and a second set of segments corresponding to the second bitmap.
9. The method as claimed in claim 8, wherein the plurality of common segments includes a set of common essential segments.
10. The method as claimed in claim 9, further comprising:
controlling, based on the real-time navigational data, the display unit to illuminate segments corresponding to the set of common essential segments throughout a navigation duration during which the navigational path is being followed by the vehicle.
11. The method as claimed in claim 9, further comprising:
controlling, based on the real-time navigational data, the display unit to illuminate segments corresponding to the set of common essential segments excluding at least one of the plurality of common segments which are not part of the set of common essential segments when the vehicle approaches a plurality of turns while following the navigational path, wherein the plurality of turns excludes a roundabout turn.
12. The method as claimed in claim 8, wherein
the first LUT (501) includes a first plurality of values corresponding to the plurality of common segments in a tabular form, and
the second LUT (503) includes a second plurality of values corresponding to the plurality of non-essential segments in the tabular form.
13. The method as claimed in claim 8, further comprising:
controlling illuminating frequency of each of the first set of segments and the second set of segments based on a remaining distance that is to be covered by the vehicle before approaching an upcoming turn along the navigational path.
| # | Name | Date |
|---|---|---|
| 1 | 202341041091-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [16-06-2023(online)].pdf | 2023-06-16 |
| 2 | 202341041091-STATEMENT OF UNDERTAKING (FORM 3) [16-06-2023(online)].pdf | 2023-06-16 |
| 3 | 202341041091-PROVISIONAL SPECIFICATION [16-06-2023(online)].pdf | 2023-06-16 |
| 4 | 202341041091-POWER OF AUTHORITY [16-06-2023(online)].pdf | 2023-06-16 |
| 5 | 202341041091-FORM 1 [16-06-2023(online)].pdf | 2023-06-16 |
| 6 | 202341041091-DRAWINGS [16-06-2023(online)].pdf | 2023-06-16 |
| 7 | 202341041091-DECLARATION OF INVENTORSHIP (FORM 5) [16-06-2023(online)].pdf | 2023-06-16 |
| 8 | 202341041091-FORM 18 [30-06-2023(online)].pdf | 2023-06-30 |
| 9 | 202341041091-DRAWING [30-06-2023(online)].pdf | 2023-06-30 |
| 10 | 202341041091-CORRESPONDENCE-OTHERS [30-06-2023(online)].pdf | 2023-06-30 |
| 11 | 202341041091-COMPLETE SPECIFICATION [30-06-2023(online)].pdf | 2023-06-30 |
| 12 | 202341041091-Proof of Right [17-07-2023(online)].pdf | 2023-07-17 |
| 13 | 202341041091-RELEVANT DOCUMENTS [25-09-2024(online)].pdf | 2024-09-25 |
| 14 | 202341041091-POA [25-09-2024(online)].pdf | 2024-09-25 |
| 15 | 202341041091-FORM 13 [25-09-2024(online)].pdf | 2024-09-25 |
| 16 | 202341041091-AMENDED DOCUMENTS [25-09-2024(online)].pdf | 2024-09-25 |