Abstract: Disclosed is a system (100) and method (300) to display one or more parameters on a display unit (104) associated with a vehicle. The one or more parameters may be associated with vehicle components of the vehicle. The method comprises receiving (301) one or more parameters associated with the vehicle. The method further comprises fetching (302), based on the one or more parameters, corresponding API functions from a plurality of data libraries stored in a host memory (102). The data libraries comprise a set of classes corresponding to the one or more parameters. The method comprises determining (303) a set of segments from among a plurality of segments associated with the display unit (104) based on the corresponding API functions and Look-Up-Tables (LUTs) stored in the host memory. The method comprises displaying (304), based on the determined set of segments, the one or more parameters on the display unit (104).
Description:
TECHNICAL FIELD
[1] The present disclosure relates to the field of vehicle display systems and, more particularly, to an electronic display system installed in a vehicle and a method of operation thereof.
BACKGROUND
[2] Nowadays, vehicles generally comprise display systems for displaying various information to drivers or riders of the vehicles. For instance, a two-wheeler vehicle may have a display system provided in a rider’s field of view, thereby facilitating the display of various information to the rider. The information may include, for instance, information associated with an instrument cluster, navigational information, information associated with situational awareness while driving the vehicle, and the like.
[3] Conventionally, a microprocessor unit may be integrated with the display system to control the display of information to the user via the display system. In particular, the display system may comprise a display screen that can be controlled by the microprocessor unit to display the information to a driver or a rider of the vehicle. The display screen may be a segmented display screen. Further, the microprocessor unit may be integrated with computer-executable and programmable instructions which cause the microprocessor unit to display the information via the display system. Conventionally, such stored executable and programmable instructions do not have any kind of layer differentiation.
[4] Specifically, the instructions may be associated with a look-up-table (LUT), and the microprocessor unit may enable or disable various segments/pixels on the display screen based on user inputs, thereby controlling the display screen to display the information. For instance, the microprocessor unit may determine which bits need to be enabled and disabled to display the information on the display screen. Each of the bits indicates a specific location corresponding to a particular segment/pixel group of the display screen.
[5] However, there are a few limitations associated with the conventional way of displaying the information on the display screen of the display system of the vehicle. For instance, the executable and programmable instructions are not compatible with different Operating Systems (OS). Thus, porting the executable and programmable instructions from one operating system to another operating system requires substantial modifications to be made to the instructions, which is time-consuming and inconvenient.
[6] Additionally, the executable and programmable instructions are not communication interface agnostic and may not necessarily work with different communication interfaces. Also, significant modifications would be required in the executable and programmable instructions in case of any change in a display design of the display screen or any change in a display type of the display screen. For instance, in a non-limiting example, there may be a change in a segment design of the display screen which would require substantial modification in the executable and programmable instructions to make the instructions compatible with new designs. In another non-limiting example, there may be some new design elements associated with the display screen, which again would require modifications in the executable and programmable instructions.
[7] Therefore, in view of the above-mentioned problems, it is advantageous to provide an improved system and method that can overcome the above-mentioned problems and limitations associated with conventional display systems of the vehicle.
SUMMARY
[8] This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention nor is it intended for determining the scope of the invention.
[9] According to an embodiment of the present disclosure, disclosed herein is a system associated with a vehicle. The system comprises a host memory, the host memory configured to store a plurality of modules in the form of programmable instructions, a plurality of data libraries, and one or more Look-Up-Tables (LUTs). The system further comprises a display unit, a decoder unit communicatively coupled to the display unit, and a host processor communicatively coupled to the host memory, the display unit, and the decoder unit. The host processor is configured to execute the programmable instructions associated with the plurality of modules. The plurality of modules comprises an application module configured to receive one or more parameters associated with the vehicle. The plurality of modules comprises an Application Program Interface (API) module configured to fetch, based on the one or more parameters, corresponding API functions from the plurality of data libraries. The plurality of data libraries comprises a set of classes corresponding to the one or more parameters. The plurality of modules comprises a segment module configured to determine a set of segments from among a plurality of segments associated with the display unit based on the corresponding API functions and based on the one or more Look-Up-Tables (LUTs). The decoder unit is configured to display, based on the determined set of segments, the one or more parameters on the display unit.
[10] According to another embodiment of the present disclosure, also disclosed herein is a method for displaying one or more parameters on a display unit associated with a vehicle. The method comprises receiving one or more parameters associated with the vehicle. The method comprises fetching, based on the one or more parameters, corresponding API functions from a plurality of data libraries stored in a host memory. The plurality of data libraries comprises a set of classes corresponding to the one or more parameters. The method comprises determining a set of segments from among a plurality of segments associated with the display unit based on the corresponding API functions and based on one or more Look-Up-Tables (LUTs) stored in the host memory. The method comprises displaying, based on the determined set of segments, the one or more parameters on the display unit.
[11] To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[12] The foregoing and other features of embodiments will become more apparent from the following detailed description of embodiments when read in conjunction with the accompanying drawings. In the drawings, like reference numerals refer to like elements.
[13] Figure 1A illustrates a block diagram of an embodiment of an Electronic Control Unit (ECU) of a vehicle, in accordance with an embodiment of the present disclosure.
[14] Figure 1B illustrates a block diagram of an embodiment of a system associated with a vehicle, in accordance with an embodiment of the present disclosure.
[15] Figure 1C illustrates a block diagram of a set of modules associated with the system, in accordance with an embodiment of the present disclosure.
[16] Figure 1D illustrates a block diagram depicting a plurality of layers associated with the system, in accordance with an embodiment of the present disclosure.
[17] Figure 2A illustrates an example of a group of segments and the corresponding memory map, in accordance with an embodiment of the present disclosure.
[18] Figure 2B illustrates an example of a set of code representations for alphanumeric characters, in accordance with an embodiment of the present disclosure.
[19] Figure 2C illustrates an example of a final output bitmap and a decoder unit linked to the group of segments, in accordance with an embodiment of the present disclosure.
[20] Figure 3 illustrates a method flow chart of a method, in accordance with an embodiment of the present disclosure.
[21] Figure 4 illustrates a sequential process flow for displaying one or more parameters on the display unit, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[22] For the purpose of promoting an understanding of the principles of the present disclosure, reference will now be made to the various embodiments, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the present disclosure is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the present disclosure as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the present disclosure relates.
[23] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are explanatory of the present disclosure and are not intended to be restrictive thereof.
[24] Whether or not a certain feature or element was limited to being used only once, it may still be referred to as “one or more features” or “one or more elements” or “at least one feature” or “at least one element.” Furthermore, the use of the terms “one or more” or “at least one” feature or element does not preclude there being none of that feature or element, unless otherwise specified by limiting language including, but not limited to, “there needs to be one or more…” or “one or more elements is required.”
[25] Reference is made herein to some “embodiments.” It should be understood that an embodiment is an example of a possible implementation of any features and/or elements of the present disclosure. Some embodiments have been described for the purpose of explaining one or more of the potential ways in which the specific features and/or elements of the proposed disclosure fulfil the requirements of uniqueness, utility, and non-obviousness.
[26] Use of the phrases and/or terms including, but not limited to, “a first embodiment,” “a further embodiment,” “an alternate embodiment,” “one embodiment,” “an embodiment,” “multiple embodiments,” “some embodiments,” “other embodiments,” “further embodiment”, “furthermore embodiment”, “additional embodiment” or other variants thereof do not necessarily refer to the same embodiments. Unless otherwise specified, one or more particular features and/or elements described in connection with one or more embodiments may be found in one embodiment, or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments. Although one or more features and/or elements may be described herein in the context of only a single embodiment, or in the context of more than one embodiment, or in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all. Conversely, any features and/or elements described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.
[27] Any particular and all details set forth herein are used in the context of some embodiments and therefore should not necessarily be taken as limiting factors to the proposed disclosure.
[28] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by “comprises... a” does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
[29] Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
[30] For the sake of clarity, the first digit of a reference numeral of each component of the present disclosure is indicative of the Figure number, in which the corresponding component is shown. For example, reference numerals starting with digit “1” are shown at least in Figure 1. Similarly, reference numerals starting with digit “2” are shown at least in Figure 2.
[31] An Electric Vehicle (EV) or a battery-powered vehicle, including, but not limited to, two-wheelers such as scooters, mopeds, and motorbikes/motorcycles; three-wheelers such as auto-rickshaws; and four-wheelers such as cars and other Light Commercial Vehicles (LCVs) and Heavy Commercial Vehicles (HCVs), primarily works on the principle of driving an electric motor using the power from the batteries provided in the EV. Furthermore, the electric vehicle may have at least one wheel which is electrically powered to traverse such a vehicle. The term ‘wheel’ may refer to any ground-engaging member which allows traversal of the electric vehicle over a path. The types of EVs include Battery Electric Vehicles (BEVs), Hybrid Electric Vehicles (HEVs), and Range Extended Electric Vehicles. However, the subsequent paragraphs pertain to the different elements of a Battery Electric Vehicle (BEV).
[32] In construction, an EV typically comprises hardware components such as a battery or battery pack enclosed within a battery casing and includes a Battery Management System (BMS), an on-board charger, a Motor Controller Unit (MCU), an electric motor, and an electric transmission system. In addition to the hardware components/elements, the EV may be supported with software modules comprising intelligent features including and not limited to navigation assistance, hill assistance, cloud connectivity, Over-The-Air (OTA) updates, adaptive display techniques, and so on. The firmware of the EV may also comprise Artificial Intelligence (AI) & Machine Learning (ML) driven modules which enable the prediction of a plurality of parameters such as and not limited to driver/rider behavior, road condition, charging infrastructures/charging grids in the vicinity and so on. The data pertaining to the intelligent features may be displayed through a display unit present in the dashboard of the vehicle. In one embodiment, the display unit may contain a Liquid Crystal Display (LCD) screen of a predefined dimension. In another embodiment, the display unit may contain a Light-Emitting Diode (LED) screen of a predefined dimension. The display unit may be a water-resistant display supporting one or more User-Interface (UI) designs. The EV may support multiple frequency bands such as 2G, 3G, 4G, 5G, and so on. Additionally, the EV may also be equipped with wireless infrastructure such as, and not limited to Bluetooth, Wi-Fi and so on to facilitate wireless communication with other EVs or the cloud.
[33] Figure 1A illustrates a block diagram of an embodiment of an Electronic Control Unit (ECU) of a vehicle, in accordance with an embodiment of the present disclosure. The ECU of the vehicle is responsible for managing all the operations of the EV, wherein the key elements of the ECU (10) typically include (i) a microcontroller core (or processor unit) (12); (ii) a memory unit (14); (iii) a plurality of input (16) and output modules (18) and (iv) communication protocols including, but not limited to CAN protocol, Serial Communication Interface (SCI) protocol and so on. The sequence of programmed instructions and data associated therewith can be stored in a non-transitory computer-readable medium such as a memory unit or a storage device which may be any suitable memory apparatus such as, but not limited to read-only memory (ROM), programmable read-only memory (PROM), electrically erasable programmable read-only memory (EEPROM), random-access memory (RAM), flash memory, disk drive and the like. In one or more embodiments of the disclosed subject matter, non-transitory computer-readable storage media can be embodied with a sequence of programmed instructions for monitoring and controlling the operation of different components of the EV.
[34] The processor may include any computing system which includes, but is not limited to, Central Processing Unit (CPU), an Application Processor (AP), a Graphics Processing Unit (GPU), a Visual Processing Unit (VPU), and/or an AI-dedicated processor such as a Neural Processing Unit (NPU). In an embodiment, the processor can be a single processing unit or several units, all of which could include multiple computing units. The processor may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor is configured to fetch and execute computer-readable instructions and data stored in the memory. The instructions can be compiled from source code instructions provided in accordance with a programming language such as Java, C++, C#.net or the like. The instructions can also comprise code and data objects provided in accordance with, for example, the Visual Basic™ language, LabVIEW, or another structured or object-oriented programming language. The one or a plurality of processors control the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory. The predefined operating rule or artificial intelligence model is provided through training or learning algorithms which include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.
[35] Furthermore, the modules, processes, systems, and devices can be implemented as a single processor or as a distributed processor. Also, the processes, modules, and sub-modules described in the various figures of and for embodiments herein may be distributed across multiple computers or systems or may be co-located in a single processor or system. Further, the modules can be implemented in hardware, instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, such as the processor, a state machine, a logic array, or any other suitable devices capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the required tasks or, the processing unit can be dedicated to performing the required functions. In another embodiment of the present disclosure, the modules may be machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the described functionalities. In an embodiment, the modules may include a receiving module, a generating module, a comparing module, a pairing module, and a transmitting module. The receiving module, the generating module, the comparing module, the pairing module, and the transmitting module may be in communication with each other. The data serves, amongst other things, as a repository for storing data processed, received, and generated by one or more of the modules. Exemplary structural embodiment alternatives suitable for implementing the modules, sections, systems, means, or processes described herein are provided below.
[36] Figure 1B illustrates a block diagram of an embodiment of a system (100) associated with a vehicle, in accordance with an embodiment of the present disclosure.
[37] According to an embodiment, the vehicle may be an electric vehicle (EV). The system (100) comprises a host processor (101) and a host memory (102). In an embodiment, the host processor (101) may correspond to an instrument cluster of the vehicle. The instrument cluster may correspond to an edge device with limited processing capabilities.
[38] The host processor (101) may include any computing system which includes, but is not limited to, a Central Processing Unit (CPU), an Application Processor (AP), a Graphics Processing Unit (GPU), a Visual Processing Unit (VPU), and/or an AI-dedicated processor such as a Neural Processing Unit (NPU). In an embodiment, the host processor (101) can be a single processing unit or several units, all of which could include multiple computing units. The host processor (101) may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the host processor (101) is configured to fetch and execute computer-readable instructions and data stored in the host memory (102). The instructions can be compiled from source code instructions provided in accordance with a programming language such as Java, C++, C#.net, or the like. The instructions can also comprise code and data objects provided in accordance with, for example, the Visual Basic™ language, LabVIEW, or another structured or object-oriented programming language. A single processor or a plurality of processors controls the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory. The predefined operating rule or artificial intelligence model is provided through training or learning algorithms which include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.
[39] In some embodiments, the system (100) is integrated within the vehicle. In some embodiments, one or more components of the system (100) may be implemented in a cloud-based architecture or on a physical server (not shown). The system (100) further comprises a set of modules (110), as described with reference to Figure 1C further below. The set of modules (110) may be configured to perform their designated functions in conjunction with the host memory (102) and the host processor (101).
[40] In some embodiments, the host memory (102) may be communicatively coupled to the host processor (101). In some embodiments, the set of modules (110) may be included within the host memory (102). The host memory (102) may be configured to store data, and instructions executable by the host processor (101). The host memory (102) may include a database configured to store data.
[41] In some embodiments, the set of modules (110) may include a set of instructions that may be executed to cause the host system (100) to perform any one or more of the methods disclosed herein. The set of modules (110) may be configured to perform the steps of the present disclosure using the data stored in the host memory (102), as discussed throughout this disclosure. In an embodiment, each of the set of modules (110) may be hardware units that may be outside the host memory (102). Further, the host memory (102) may include an operating system for performing one or more tasks of the host system (100).
[42] The host memory (102) may be operable to store instructions executable by the host processor (101). The functions, acts, or tasks illustrated in the figures or described may be performed by the host processor (101), in conjunction with the set of modules, for executing the instructions stored in the host memory (102). The functions, acts, or tasks are independent of the particular type of instruction set, storage media, processor, or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro-code, and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing, and the like.
[43] For the sake of brevity, the architecture and standard operations of the host memory (102) and the host processor (101) are not discussed in detail. In one embodiment, the host memory (102) may be configured to store the information as required by the set of modules and/or the host processor (101) to perform the methods described herein.
[44] In some embodiments, the host memory (102) may communicate via a bus within the system (100). The host memory (102) may include, but is not limited to, a non-transitory computer-readable storage media, such as various types of volatile and non-volatile storage media including, but not limited to, random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media and the like. In one example, the host memory (102) may include a cache or random-access memory for the processor. In alternative examples, the host memory (102) is separate from the processor, such as a cache memory of a processor, the system memory, or other memory.
[45] In one embodiment, the host processor (101) may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc. In one embodiment, the host processor (101) may include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or both. The host processor (101) may be one or more general processors, digital signal processors, application-specific integrated circuits, field-programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now-known or later developed devices for analyzing and processing data. In some embodiments, the host processor (101) may include one or a plurality of processors. The host processor (101) may implement a software program, such as code generated manually (i.e., programmed).
[46] Referring to Figure 1B, the system (100) may comprise a decoder unit (103) in communication with the host processor (101), the host memory (102), and the set of modules (110). The system (100) may further comprise a display unit (104) in communication with the decoder unit (103). In some embodiments, the display unit (104) may be a segmented display and may comprise a plurality of segments. In some embodiments, the decoder unit (103) may be integrated with the display unit (104). In some embodiments, the display unit (104) may be a pixelated display. It is appreciated that the details regarding the functionality of the system (100) may be explained with reference to ‘segments’; however, the details are equally applicable to other forms of displays, such as pixelated displays.
[47] Figure 1C illustrates a block diagram depicting the modules (110) of the system (100), in accordance with an embodiment of the present disclosure. The modules (110) comprise an application module (111), an Application Program Interface (API) module (112), a segment module (113), a communication module (114), and a display driver module (115).
[48] Figure 1D illustrates a block diagram depicting a plurality of layers (120) associated with the system (100), in accordance with an embodiment of the present disclosure. The plurality of layers (120) comprises an application layer (121), a hardware abstraction library layer (122), a kernel layer (123), a host device hardware layer (124), and a display device hardware layer (125).
[49] Referring collectively to Figures 1B-1D, the application module (111), in conjunction with the host processor (101) may be configured to receive one or more parameters associated with the vehicle from one or more vehicle components (105). The one or more parameters may be associated with the one or more vehicle components (105) of the vehicle. In non-limiting examples, the one or more parameters may include navigation parameters such as location information, destination information, route information, updates in distances, directional information, and the like. In some embodiments, the application module (111) may be associated with a human-machine interface (HMI) application. The application module (111) may be associated with the application layer (121) shown in Figure 1D.
[50] In other non-limiting examples, the one or more parameters may include information related to an instrument cluster, such as but not limited to, speed information, motor information, timing information, state of charge (SoC) information, kill switch information, and the like. In other non-limiting examples, the one or more parameters may include information associated with connected user devices, such as but not limited to, caller information, accept or reject information, music information, and the like.
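As a purely illustrative, non-limiting sketch (the structure and all names below are hypothetical and not part of the disclosure), the one or more parameters received by the application module (111) could be represented as tagged values carrying the information to be rendered together with the reporting vehicle component:

```cpp
// Hypothetical representation of a received vehicle parameter (names are illustrative only).
#include <cstdint>
#include <string>

enum class ParameterType : std::uint8_t {
    Speed,          // instrument-cluster speed information
    StateOfCharge,  // SoC information from the battery-tracking component
    Navigation,     // directional/route updates
    CallerInfo      // connected user-device information
};

struct VehicleParameter {
    ParameterType type;     // which kind of information is being reported
    std::string   value;    // value to render, e.g. "78" for 78% SoC
    std::uint8_t  sourceId; // identifier of the reporting vehicle component (105)
};
```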
[51] The host memory (102) may be configured to store a plurality of data libraries associated with the one or more parameters. In some embodiments, the plurality of data libraries may comprise a set of classes corresponding to the one or more parameters. In some embodiments, the plurality of data libraries may be interfaced with the application module (111). In some embodiments, each class of the set of classes may be associated with a plurality of API functions. The API functions may be related to a corresponding parameter of the one or more parameters.
[52] The API module (112) in conjunction with the host processor (101) may be configured to fetch corresponding API functions from the plurality of data libraries stored in the host memory (102). In some embodiments, the corresponding API functions may be associated with one or more of a display type of the display unit (104), a segment map associated with the display unit (104), the communication interface associated with the display unit (104), and a vendor hardware abstraction associated with the display unit (104).
[53] In some embodiments, the communication module (114) may be associated with a communication interface for facilitating communication between the plurality of modules (110) and the decoder unit (103). In some embodiments, the communication interface may be an Inter-Integrated Circuit (I2C) interface, a Universal Asynchronous Receiver-Transmitter (UART) interface, a Serial Peripheral Interface (SPI), and the like.
[54] In some embodiments, a class within the plurality of data libraries abstracts the type of communication interface associated with the display unit (104). In some embodiments, the type of communication interface may be based on the type of the display unit (104). As an example, a display with segments may be associated with an I2C interface while a display with liquid crystal display (LCD) and segments may be associated with the SPI interface. In some embodiments, another class within the plurality of data libraries abstracts the type of display unit (104), forming a part of the vendor hardware abstraction. In some embodiments, yet another class within the plurality of data libraries abstracts a segment design type associated with the display unit (104), forming a part of the vendor hardware abstraction. In some embodiments, yet another class within the plurality of data libraries abstracts the type of operating platform associated with the display unit (104) and/or the system (100).
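The following minimal sketch illustrates one way such abstraction classes could be organised; the class names and method signatures are assumptions for illustration and do not reproduce the actual classes of the data libraries:

```cpp
// Hypothetical abstraction classes: the communication interface and the display
// hardware are hidden behind pure-virtual interfaces so that upper layers stay
// interface- and vendor-agnostic.
#include <cstddef>
#include <cstdint>
#include <vector>

class ICommInterface {            // abstracts I2C / SPI / UART
public:
    virtual ~ICommInterface() = default;
    virtual bool write(const std::vector<std::uint8_t>& bytes) = 0;
};

class IDisplayHw {                // abstracts vendor display type / segment layout
public:
    virtual ~IDisplayHw() = default;
    virtual std::size_t segmentCount() const = 0;
    virtual bool pushBitmap(const std::vector<std::uint8_t>& bitmap,
                            ICommInterface& link) = 0;
};
```

In this arrangement, a segment-only display wired over I2C would derive from IDisplayHw and be handed an I2C implementation of ICommInterface, while an LCD-plus-segments panel could be given an SPI implementation instead, with no change to the layers above.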
[55] The host memory (102) may be configured to store one or more look-up-tables (LUTs). In some embodiments, the one or more LUTs may include a plurality of values correlated to each of the plurality of segments associated with the display unit (104). In some embodiments, the plurality of values may be stored in tabular form. In some embodiments, the one or more LUTs may comprise a first LUT associated with essential segments from among the plurality of segments of the display unit (104) and a second LUT associated with non-essential segments from among the plurality of segments of the display unit (104).
[56] In some embodiments, the plurality of values in the one or more LUTs may be associated with a bitmap or a memory map, i.e., the one or more LUTs may define a memory map for the plurality of segments of the display unit (104). The memory map may be indicative of a particular location of the plurality of segments on the display unit (104). In some embodiments, the decoder unit (103) may be configured to illuminate the one or more segments from among the plurality of segments on the display unit (104) based on the one or more LUTs stored in the host memory (102).
[57] In some embodiments, the plurality of segments on the display unit (104) may comprise one or more groups of segments, in that, segments from among the plurality of segments may be grouped into one or more groups of segments. Each group of segments may be associated with a corresponding LUT, i.e., a corresponding memory map. In some embodiments, the display unit (104) may comprise the plurality of segments in the form of groups of segments arranged in a mosaic pattern.
[58] Referring to Figure 2A, an example of a group of segments (201) and the corresponding memory map (202) is illustrated, in accordance with an exemplary embodiment of the present disclosure. The group of segments (201) comprises segments 1-19, also referred to as A1, A2, …A19. The group of segments (201) may be associated with the corresponding memory map (202). The memory map (202) may be a bit-addressable memory map. Each value for the segments A1, A2, …A19 within the corresponding memory map (202) may be mapped to one of BIT0, BIT1, BIT2, and BIT3 of the host memory (102). That is, each BIT in the host memory (102) may be mapped to a corresponding segment in the display unit (104). For example, the value for segment A1 is mapped to BIT0. The value for each of the segments A1, A2, …A19 may be filled with a ‘1’ binary value indicating activation or a ‘0’ binary value indicating deactivation. It is appreciated that although a single group of segments (201) is illustrated in Figure 2A, the details provided herein are equally applicable to each of the one or more groups of segments on the display unit (104).
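A minimal sketch of how such a per-group memory map could be captured in a LUT is given below; the (address, bit) assignments are purely illustrative and do not reproduce the actual map of Figure 2A:

```cpp
// Hypothetical bit-addressable memory map for one group of 19 segments: each
// segment is assigned a (byte address, bit position) pair, using BIT0..BIT3
// per address, and the LUT records where each segment lives.
#include <array>
#include <cstdint>

struct SegmentLocation {
    std::uint8_t address;  // byte offset inside the group's memory map
    std::uint8_t bit;      // one of BIT0..BIT3 within that byte
};

// Illustrative LUT for segments A1..A19 of one group.
constexpr std::array<SegmentLocation, 19> kGroupLut = {{
    {0, 0}, {0, 1}, {0, 2}, {0, 3},   // A1..A4  -> address 0, BIT0..BIT3
    {1, 0}, {1, 1}, {1, 2}, {1, 3},   // A5..A8  -> address 1, BIT0..BIT3
    {2, 0}, {2, 1}, {2, 2}, {2, 3},   // A9..A12
    {3, 0}, {3, 1}, {3, 2}, {3, 3},   // A13..A16
    {4, 0}, {4, 1}, {4, 2}            // A17..A19
}};
```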
[59] The host memory (102) may be configured to store a set of code representations of a plurality of alphanumeric characters. As a non-limiting example, a set of code representations (203) is depicted in Figure 2B. The set of code representations (203) comprises a representation in code, such as a hexadecimal code, for each of the plurality of alphanumeric characters. For instance, the hexadecimal code for the character ‘A’ is 0x00FDF. In some embodiments, the binary values for each of the plurality of segments are arranged from Least Significant Bit (LSB) to Most Significant Bit (MSB).
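A minimal sketch of such a code-representation table is shown below, assuming a simple character-to-code map; only the entry for ‘A’ uses the 0x00FDF value given above, and the remaining entries are placeholders:

```cpp
// Hypothetical code-representation table: each alphanumeric character maps to a
// bit pattern whose bits, read LSB to MSB, correspond to segments A1, A2, ... of a group.
#include <cstdint>
#include <unordered_map>

const std::unordered_map<char, std::uint32_t> kCharCodes = {
    {'A', 0x00FDF},   // segments to light for the character 'A' (per the description above)
    {'B', 0x00000},   // placeholder
    {'0', 0x00000},   // placeholder
};

// Bit 0 of a code corresponds to segment A1, bit 1 to A2, and so on,
// i.e. the binary values are arranged from LSB to MSB across the group.
```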
[60] The segment module (113) may be configured, in conjunction with the host processor (101), to determine a set of segments from among the plurality of segments associated with the display unit (104) based on the corresponding API functions and based on the one or more Look-Up-Tables (LUTs) stored in the host memory (102). In some embodiments, to determine the set of segments, the segment module (113) may be configured to determine the alphanumeric characters to be displayed based on the one or more parameters and, further, to arrange the hexadecimal codes of the alphanumeric characters into the corresponding memory maps. The segment module (113) may further be configured to assign a bit value of ‘1’ to the set of segments to indicate activation of the set of segments. For the other segments (other than the selected set of segments), the value ‘0’ may be assigned to indicate no activation.
[61] Accordingly, the segment module (113) is configured to determine which segments are to be activated (enabled) and which segments are to remain deactivated (disabled). The segment module (113) may be configured to generate a final bitmap based on the selected set of segments. The final bitmap comprises the binary values (‘0’ or ‘1’) for each of the plurality of segments, in which the value ‘1’ is provided to the set of segments that are to be activated and the value ‘0’ is provided to the remaining segments.
[62] For instance, considering the group of segments (201) in Figures 2A-2B, in a scenario where the alphanumeric character ‘A’ is to be displayed on the display unit (104), the segment module (113) may be configured to retrieve the hexadecimal code for ‘A’ from the set of code representations (203) and thereafter arrange the retrieved code into the memory map (202) of the group of segments where ‘A’ is to be displayed. Accordingly, a final output bitmap may be generated for the group of segments. Referring to Figure 2C, a non-limiting example of a final output bitmap (204) is depicted in which segments A1, A2, A3, A4, A5, A7, A8, A9, and A10 are to be activated and the rest of the segments within the group are to remain deactivated. Similarly, final output bitmaps for each of the one or more groups of segments on the display unit (104) may be generated.
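A minimal sketch of this expansion step is given below; it restates the hypothetical SegmentLocation type from the earlier LUT sketch for self-containment, takes the LUT as an argument, and assumes a 5-byte per-group map purely for illustration:

```cpp
// Expand a character code into a final output bitmap for one group of 19 segments:
// a '1' at a segment's mapped bit marks it for activation, all other bits stay '0'.
#include <array>
#include <cstddef>
#include <cstdint>

struct SegmentLocation {           // (address, bit) location of one segment, as in the earlier sketch
    std::uint8_t address;
    std::uint8_t bit;
};

std::array<std::uint8_t, 5> buildGroupBitmap(
        std::uint32_t charCode,
        const std::array<SegmentLocation, 19>& lut) {   // assumes addresses fall within the 5-byte map
    std::array<std::uint8_t, 5> bitmap{};                // zero-initialised: all segments off
    for (std::size_t seg = 0; seg < lut.size(); ++seg) {
        if (charCode & (1u << seg)) {                    // bit 'seg' of the code: segment needed
            bitmap[lut[seg].address] |=
                static_cast<std::uint8_t>(1u << lut[seg].bit);
        }
    }
    return bitmap;                                       // final output bitmap for this group
}
```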
[63] Accordingly, based on the API functions, the LUTs, and the set of code representations, relevant sets of segments may be determined so that the relevant sets of segments may be illuminated at the display unit (104) to allow a rider to view information associated with the one or more parameters associated with the vehicle.
[64] As another example, in case of an update in the SoC of the vehicle, the parameters associated with the update in SoC may be received from a vehicle component (105) associated with tracking the SoC. The API module (112) calls the API function associated with the received parameter, i.e., the update in SoC. Based on the API function, the set of segments (from among one group or multiple groups of segments) that is linked to the SoC on the display unit (104) is determined based on the one or more LUTs and code representations stored in the host memory (102), such that enabling and disabling of the set of segments allows the display of the update in SoC on the display unit (104).
[65] In some embodiments, the segment module (113) may be configured to store information related to the set of segments to be activated, and the related one or more parameters, in the host memory (102). The communication module (114) may be configured to send the stored information to the decoder unit (103) over a communication interface. In some embodiments, for each group of segments, the communication module (114) may be configured to send the final output bitmap (204) to the decoder unit (103). In some embodiments, the communication via the communication module (114) may be in a serial sequence.
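A minimal sketch of this hand-off is shown below; the one-byte group-identifier framing and the interface shape (mirroring the earlier ICommInterface sketch) are assumptions for illustration only:

```cpp
// Send one group's final output bitmap to the decoder unit as a single serial frame
// over an abstracted link (an I2C, SPI, or UART implementation would derive from it).
#include <cstdint>
#include <vector>

class ICommInterface {
public:
    virtual ~ICommInterface() = default;
    virtual bool write(const std::vector<std::uint8_t>& bytes) = 0;
};

bool sendGroupBitmap(ICommInterface& link,
                     std::uint8_t groupId,
                     const std::vector<std::uint8_t>& finalBitmap) {
    std::vector<std::uint8_t> frame;
    frame.reserve(1 + finalBitmap.size());
    frame.push_back(groupId);                                     // which group this frame addresses
    frame.insert(frame.end(), finalBitmap.begin(), finalBitmap.end());
    return link.write(frame);                                     // one serial transfer per group
}
```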
[66] In some embodiments, the communication module (114) may be associated with a communication controller within the host device hardware layer (124) as shown in Figure 1D. In some embodiments, the decoder unit (103) and the display unit (104) may be associated with the display device hardware layer (125) as shown in Figure 1D.
[67] In some embodiments, the API module (112) and the segment module (113) may be associated with the hardware abstraction library layer (122) shown in Figure 1D. The hardware abstraction library layer (122) comprises a display HAL API sub-layer which facilitates the fetching of API functions related to the one or more parameters received from the vehicle components (105). The hardware abstraction library layer (122) comprises an animation sub-layer which facilitates the display of in-use animation sequences on the display unit (104). The in-use animation sequences may be defined in corresponding API functions. In non-limiting examples, the in-use animation sequences may comprise Text Marquee, Text Blink, Text Scrolling (Right-to-Left and Left-to-Right), Progress Bar, Tell Tale Blink, Park Assist Animation, Boot Animations, and the like.
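As one hedged, non-limiting example of what such an animation API function might do internally (all names hypothetical), a Text Blink sequence could simply alternate between the glyph bitmap and an all-off bitmap at a fixed period, pushing each frame towards the decoder unit:

```cpp
// Hypothetical blink helper: alternate the glyph bitmap with an all-off bitmap,
// handing each frame to the supplied callback for transfer to the decoder unit.
#include <chrono>
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

void blink(const std::vector<std::uint8_t>& glyphBitmap,
           const std::function<void(const std::vector<std::uint8_t>&)>& pushFrame,
           int cycles,
           std::chrono::milliseconds period) {
    const std::vector<std::uint8_t> off(glyphBitmap.size(), 0x00);  // all segments deactivated
    for (int i = 0; i < cycles; ++i) {
        pushFrame(glyphBitmap);               // segments on
        std::this_thread::sleep_for(period);
        pushFrame(off);                       // segments off
        std::this_thread::sleep_for(period);
    }
}
```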
[68] The hardware abstraction library layer (122) comprises a display hardware (HW) abstraction sub-layer which facilitates the abstraction of display types of the display unit (104), different design layouts associated with the display unit (104), and vendor hardware abstraction associated with the display unit (104) in corresponding API functions and classes in the data libraries. Accordingly, the relevant type of vendor display, display type, etc. may be selected based on the display unit (104) being used on the vehicle.
[69] The hardware abstraction library layer (122) comprises a segment abstraction sub-layer which facilitates the abstraction of segment designs and segment maps for the display unit (104) in corresponding API functions and classes in the data libraries. For instance, a layout 1 segment map and a layout 2 segment map may be represented in corresponding LUTs for different designs of the display unit (104).
[70] The hardware abstraction library layer (122) comprises a communication interface abstraction sub-layer which facilitates the abstraction of different communication interfaces associated with the display unit (104) in corresponding API functions and classes in the data libraries.
[71] The decoder unit (103) may be configured to display the one or more parameters on the display unit (104) based on the received final output bitmap (204), i.e., based on the set of segments to be activated. In some embodiments, the decoder unit (103) may be configured to control the display unit (104) to illuminate the set of segments from among the plurality of segments to indicate the one or more parameters.
[72] The decoder unit (103) may be configured to control individually each of the plurality of segments on the display unit (104). In a non-limiting example, the decoder unit (103) linked to the group of segments (201) is depicted in Figure 2C. The decoder unit (103) comprises a plurality of output ports, wherein each output port among the plurality of output ports is connected to a corresponding segment of the display unit (104), such as a corresponding segment of the group of segments (201). Based on the received final output bitmap for the group of segments (201), the set of segments to be activated are illuminated.
[73] In some embodiments, the decoder unit (103) may be configured to initialize the display unit (104). For instance, during boot-up of the display unit (104), the decoder unit (103) may be configured to initialize the display unit (104). In some embodiments, the decoder unit (103) may be in communication with the display driver module (115). In some embodiments, the display driver module (115) may be associated with the kernel layer (123) shown in Figure 1D. The display driver module (115) may be configured to receive an input indicative of a boot-up of the display unit (104) associated with the vehicle. The display driver module (115) may be configured to send a first boot-up command to the decoder unit (103) which is indicative of initializing the display unit (104). In response to receiving the first boot-up command, the decoder unit (103) may be configured to initialize the display unit (104).
[74] In some embodiments, the display driver module (115) may be configured to cause one or more boot-up animations to be displayed on the display unit (104). The display driver module (115) may be configured to determine an initial set of segments corresponding to the one or more boot-up animations. The initial set of segments may be determined based on the one or more LUTs stored in the host memory (102). A bitmap may be determined that corresponds to the initial set of segments. The display driver module (115) may be configured to send a second boot-up command to the decoder unit (103) which is indicative of the boot-up animations to be displayed on the display unit (104). In some embodiments, an entire bitmap of an animation sequence stored in the layer associated with the display driver module (115) can be sent to the decoder unit (103) at a defined interval to realize the boot-up animation. In response to receiving the second boot-up command, the decoder unit (103) may be configured to display the one or more boot-up animations on the display unit (104). For instance, the decoder unit (103) may be configured to illuminate the initial set of segments of the display unit (104) corresponding to the generated bitmap.
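A minimal sketch of this boot-up handshake is given below; the DecoderUnit stand-in and its member functions are hypothetical and stand only for the behaviour described above:

```cpp
// Initialise the display on the first boot-up command, then play the boot-up
// animation frames at a defined interval on the second boot-up command.
#include <chrono>
#include <cstdint>
#include <thread>
#include <vector>

struct DecoderUnit {               // stand-in for the decoder unit (103)
    void initializeDisplay() { /* act on the first boot-up command: initialise the panel */ }
    void showBitmap(const std::vector<std::uint8_t>&) { /* update the decoder buffer / segments */ }
};

void runBootSequence(DecoderUnit& decoder,
                     const std::vector<std::vector<std::uint8_t>>& animationFrames,
                     std::chrono::milliseconds frameInterval) {
    decoder.initializeDisplay();                       // first boot-up command
    for (const auto& frame : animationFrames) {        // second boot-up command: animation bitmaps
        decoder.showBitmap(frame);
        std::this_thread::sleep_for(frameInterval);    // defined interval between frames
    }
}
```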
[75] Figure 3 illustrates a method flow chart of a method (300) for displaying one or more parameters on a display unit (104), in accordance with an embodiment of the present disclosure. The method (300) includes a series of operational steps 301 through 304. In one embodiment, the steps of the method (300) may be performed by the system (100), as discussed above.
[76] In step (301), one or more parameters associated with the vehicle are received. In step (302), corresponding API functions from a plurality of data libraries stored in the host memory (102) are fetched. In some embodiments, the plurality of data libraries comprises a set of classes corresponding to the one or more parameters.
[77] In step (303), a set of segments from among a plurality of segments associated with the display unit (104) are determined based on the corresponding API functions and based on the one or more LUTs stored in the host memory (102). In step (304), the one or more parameters are displayed on the display unit (104) based on the determined set of segments.
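For orientation, a minimal sketch of steps 301-304 chained together is shown below; all types and names are hypothetical, and the flat map stands in for the class-based data libraries only for illustration:

```cpp
// Receive a parameter, fetch the matching API function from the data libraries,
// resolve the segments (bitmap) via the LUTs inside that function, and hand the
// result to the decoder unit for display.
#include <cstdint>
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

using ApiFunction   = std::function<std::vector<std::uint8_t>(const std::string&)>;
using DecoderWriter = std::function<void(const std::vector<std::uint8_t>&)>;

void displayParameter(const std::string& parameterName,                        // step 301
                      const std::string& parameterValue,
                      const std::unordered_map<std::string, ApiFunction>& dataLibrary,
                      const DecoderWriter& writeToDecoder) {
    const ApiFunction& api = dataLibrary.at(parameterName);       // step 302: fetch the API function
    const std::vector<std::uint8_t> bitmap = api(parameterValue); // step 303: segments via the LUTs
    writeToDecoder(bitmap);                                       // step 304: decoder displays them
}
```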
[78] While the above-discussed steps in Figure 3 are shown and described in a particular sequence, the steps may occur in variations to the sequence in accordance with various embodiments. Further, a detailed description related to the various steps of Figure 3 is already covered in the description related to Figures 1A-2C and is omitted herein for the sake of brevity.
[79] Figure 4 illustrates a sequential process flow (400) for displaying one or more parameters on the display unit, in accordance with an embodiment of the present disclosure. At block (401), the display unit (104) is powered up. In some embodiments, a user input or command may be received to power up the display unit (104).
[80] At block (402), the display unit (104) is initialized by the display driver module (115). A pre-determined sequence may be run for initializing the display unit (104). At block (403), one or more boot-up animations may be displayed on the display unit (104). In some embodiments, the blocks (401)-(402) may be run by the display driver module (115) within the kernel layer (123).
[81] At block (404), one or more parameters associated with the vehicle may be received from the vehicle components (105). At block (405), within the display HAL API sub-layer of the hardware abstraction library layer (122), the corresponding API functions may be fetched and relevant alphanumeric characters to be displayed may be determined from the set of code representations.
[82] At block (406), the relevant display type, communication interface, and other information (such as RAM, buffer, etc.) may be determined within the display HW abstraction sub-layer of the hardware abstraction library layer (122). At block (407), the set of segments to be activated may be determined within the segment map sub-layer of the hardware abstraction library layer (122). The set of segments may be determined based on the one or more LUTs stored in the host memory (102). In some embodiments, the process at blocks (404)-(405) and (406)-(407) may be simultaneous.
[83] At block (408), the final output bitmap may be determined indicating the set of segments that are to be activated. At block (409), the final output bitmap may be passed to the decoder unit (103) via the communication interface by the communication module (114). In particular, buffer data within the decoder unit (103) may be updated based on the received final output bitmap.
[84] At block (410), the decoder unit (103) may control the set of segments on the display unit (104) to be activated so as to illuminate the set of segments and cause display of the alphanumeric characters on the display unit (104).
[85] The present disclosure provides a system and method that allows display of parameters on display units of vehicles. The data libraries stored in the host memory are portable to multiple operating system platforms and bare-metal environments by virtue of the classes abstracting the different platform types. The data libraries can be ported to constrained hardware platforms, such as ECUs of vehicles. The data libraries are communication interface agnostic and can work with multiple interfaces since the classes abstract the type of interface that the display unit uses. Moreover, the display of parameters is also possible for different display designs and layouts. The designs and maps are abstracted in the data libraries, and the same data libraries can be used for different segment designs on different vehicles. Thus, the effort of bringing up new display designs is reduced, since only the LUT for the new design has to be added to the data libraries and/or stored in the host memory. Moreover, the data libraries can support different vendor hardware displays, and the display type is also abstracted. The effort to bring up a new display type is thus reduced.
[86] It will be appreciated that the modules, processes, systems, and devices described above can be implemented in hardware, hardware programmed by software, software instruction stored on a non-transitory computer readable medium or a combination of the above. Embodiments of the methods, processes, modules, devices, and systems (or their sub-components or modules), may be implemented on a general-purpose computer, a special-purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element, an ASIC or other integrated circuit, a digital signal processor, a hardwired electronic or logic circuit such as a discrete element circuit, a programmed logic circuit such as a programmable logic device (PLD), programmable logic array (PLA), field-programmable gate array (FPGA), programmable array logic (PAL) device, or the like. In general, any process capable of implementing the functions or steps described herein can be used to implement embodiments of the methods, systems, or computer program products (software program stored on a non-transitory computer readable medium).
[87] Furthermore, embodiments of the disclosed methods, processes, modules, devices, systems, and computer program product may be readily implemented, fully or partially, in software using, for example, object or object-oriented software development environments that provide portable source code that can be used on a variety of computer platforms. Alternatively, embodiments of the disclosed methods, processes, modules, devices, systems, and computer program product can be implemented partially or fully in hardware using, for example, standard logic circuits or a very-large-scale integration (VLSI) design. Other hardware or software can be used to implement embodiments depending on the speed and/or efficiency requirements of the systems, the particular function, and/or particular software or hardware system, microprocessor, or microcomputer being utilized.
[88] In this application, unless specifically stated otherwise, the use of the singular includes the plural and the use of “or” means “and/or.” Furthermore, use of the terms “including” or “having” is not limiting. Any range described herein will be understood to include the endpoints and all values between the endpoints. Features of the disclosed embodiments may be combined, rearranged, omitted, etc., within the scope of the invention to produce additional embodiments. Furthermore, certain features may sometimes be used to advantage without a corresponding use of other features.
[89] While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist.
Reference numbers:
Components Reference Numbers
System 100
Host processor 101
Host memory 102
Decoder unit 103
Display unit 104
Vehicle components 105
Set of modules 110
Application module 111
API module 112
Segment module 113
Communication module 114
Display driver module 115
Plurality of layers 120
Application layer 121
Hardware abstraction library layer 122
Kernel layer 123
Host device hardware layer 124
Display device hardware layer 125
Group of segments 201
Memory map 202
Set of code representations 203
Final output bitmap 204
Claims:
1. A system (100) associated with a vehicle, the system (100) comprising:
a host memory (102), the host memory (102) configured to store a plurality of modules (110) in the form of programmable instructions, a plurality of data libraries, and one or more Look-Up-Tables (LUTs);
a display unit (104);
a decoder unit (103) communicatively coupled to the display unit (104); and
a host processor (101) communicatively coupled to the host memory (102), the display unit (104), and the decoder unit (103), wherein the host processor (101) is configured to execute the programmable instructions associated with the plurality of modules (110), the plurality of modules (110) comprising:
an application module (111) configured to receive one or more parameters associated with the vehicle;
an Application Program Interface (API) module (112) configured to fetch, based on the one or more parameters, corresponding API functions from the plurality of data libraries, wherein the plurality of data libraries comprises a set of classes corresponding to the one or more parameters; and
a segment module (113) configured to determine a set of segments from among a plurality of segments associated with the display unit (104) based on the corresponding API functions and based on the one or more Look-Up-Tables (LUTs);
wherein the decoder unit (103) is configured to display, based on the determined set of segments, the one or more parameters on the display unit (104).
2. The system (100) as claimed in claim 1, wherein the segment module (113) is configured to store information related to the set of segments and the one or more parameters in the host memory (102), and wherein the plurality of modules comprises a communication module (114) configured to send the stored information to the decoder unit (103), the communication module (114) being associated with a communication interface for facilitating communication between the plurality of modules (110) and the decoder unit (103).
3. The system (100) as claimed in claim 1, wherein the plurality of modules comprises a display driver module (115), the display driver module (115) being configured to:
receive an input indicative of a boot-up of the display unit (104) associated with the vehicle; and
send a first boot-up command to the decoder unit (103), the first boot-up command being indicative of initializing the display unit (104),
wherein, in response to receiving the first boot-up command, the decoder unit (103) is configured to initialize the display unit (104).
4. The system (100) as claimed in claim 3, wherein the display driver module (115) is configured to:
determine an initial set of segments corresponding to one or more boot-up animations to be displayed on the display unit (104), based on the one or more LUTs stored in the host memory (102);
generate a bitmap corresponding to the determined initial set of segments; and
send a second boot-up command to the decoder unit (103), the second command being indicative of the one or more boot-up animations to be displayed on the display unit (104);
wherein, in response to receiving the second boot-up command, the decoder unit (103) is configured to display the one or more boot-up animations on the display unit (104) by illuminating the initial set of segments corresponding to the generated bitmap.
5. The system (100) as claimed in claim 1, wherein the one or more LUTs include a plurality of values correlated to each of the plurality of segments associated with the display unit (104).
6. The system (100) as claimed in claim 1, wherein the corresponding API functions are associated with one or more of a display type of the display unit (104), a segment map associated with the display unit (104), the communication interface associated with the display unit (104), and a vendor hardware abstraction associated with the display unit (104).
7. A method (300) for displaying one or more parameters on a display unit (104) associated with a vehicle, the method comprising:
receiving (301) one or more parameters associated with the vehicle;
fetching (302), based on the one or more parameters, corresponding API functions from a plurality of data libraries stored in a host memory (102), wherein the plurality of data libraries comprises a set of classes corresponding to the one or more parameters;
determining (303) a set of segments from among a plurality of segments associated with the display unit (104) based on the corresponding API functions and based on one or more Look-Up-Tables (LUTs) stored in the host memory (102); and
displaying (304), based on the determined set of segments, the one or more parameters on the display unit (104).
8. The method (300) as claimed in claim 7, comprising:
storing information related to the set of segments and the one or more parameters in the host memory (102), and
sending the stored information to a decoder unit (103) via a communication interface.
9. The method (300) as claimed in claim 7, comprising:
receiving an input indicative of a boot-up of the display unit associated with the vehicle;
sending a first boot-up command to a decoder unit (103), the first boot-up command being indicative of initializing the display unit (104); and
in response to receiving the first boot-up command, initializing the display unit (104).
10. The method (300) as claimed in claim 9, comprising:
determining an initial set of segments corresponding to one or more boot-up animations to be displayed on the display unit (104), based on the one or more LUTs stored in the host memory (102);
generating a bitmap corresponding to the determined initial set of segments; and
sending a second boot-up command to the decoder unit (103), the second command being indicative of the one or more boot-up animations to be displayed on the display unit (104);
in response to receiving the second boot-up command, displaying the one or more boot-up animations on the display unit (104) by illuminating the initial set of segments corresponding to the generated bitmap.
11. The method (300) as claimed in claim 7, wherein the one or more LUTs include a plurality of values correlated to each of the plurality of segments associated with the display unit (104).
12. The method (300) as claimed in claim 7, wherein the corresponding API functions are associated with one or more of a display type of the display unit (104), a segment map associated with the display unit (104), the communication interface associated with the display unit (104), and a vendor hardware abstraction associated with the display unit (104).