Abstract: Described herein is a device (D) for controlling the human machine interface (HMI) or user interface (UI) of an in-vehicle infotainment system. The device comprises an external wearable device (WD) configured to measure muscle stress (MS) of the user while operating the in-vehicle infotainment system, a plurality of sensors (S1, S2, S3) configured to sense a plurality of parameters of the user and to generate and record said parameter values, and a module (MC) integrated with the vehicle camera (C) configured to record the physical seating attributes of the user, the hand length (HL) and a real-time position of the user’s eye point, wherein the device calibrates and personalizes the user interface by configuring the input parameters received from the wearable device, the plurality of sensors and the module. The device provides better comfort, entertainment and information for an enhanced in-vehicle experience by easing the mode of operating the in-vehicle infotainment system.
Claims:
We Claim:
1. A device (D) for controlling human machine interface (HMI) or user interface (UI) of an in-vehicle infotainment system, the device (D) comprises:
an external wearable device (WD) configured to measure muscle stress (MS) of the user while operating the in-vehicle infotainment system;
a plurality of sensors (S1, S2, S3) configured to sense a plurality of parameters of the user to generate and record said parameter values; and
a module (MC) integrated with the vehicle camera (C) configured to record hand length (HL), physical seating attributes of the user and a real-time position of the user’s eye point,
wherein,
the device calibrates and personalizes the user interface by configuring input parameters received from the wearable device, plurality of sensors and the module.
2. The device as claimed in claim 1, wherein the plurality of sensors (S1, S2, S3) includes:
a first sensor (S1) configured to sense and measure seatbelt pressure when the user operates the in-vehicle infotainment system;
a second sensor (S2) configured to sense and measure position of the user’s seat; and
a third sensor (S3) configured to sense the reclined position of the user’s seat in the vehicle.
3. The device as claimed in claim 1, wherein the device is integrated with the in-vehicle infotainment system through a hardwire connection or a controller area network (CAN).
4. The device (D) as claimed in claim 1, wherein the plurality of parameters includes in-vehicle parameters and user’s physical attributes.
5. The device (D) as claimed in claim 4, wherein the in-vehicle parameters include seat position, seat reclining angle and seat belt tension.
6. The device (D) as claimed in claim 4, wherein the user’s physical attributes include hand length, muscle stress, user’s posture and eye point position.
7. The device (D) as claimed in claim 1, wherein the device includes a repository database comprising a predefined value for each in-vehicle parameter and user’s physical attributes.
8. A method (100) of controlling the human machine interface (HMI) or user interface based on the device (D) claimed in claim 1, the method (100) comprising:
creating (S101a) a user profile by capturing a front image of the user by means of a face recognition device;
mapping (S101b) and storing (S101b) the user profile captured by the face recognition device;
measuring (S102) and recording (S102) the vehicle parameters through the plurality of sensors;
estimating (S103) validity of recorded vehicle parameters for customization;
measuring (S104) and recording (S104) of user’s physical attributes through the wearable device and the module;
assessing (S105) screen display area of the in-vehicle infotainment system based on ease of reachability of the user by configuring the vehicle parameter values with user’s physical attributes;
calibrating (S106) and personalizing (S106) the display interface of the in-vehicle infotainment system based on the user’s usage pattern; and
modifying (S107) and setting (S107) the display interface of the in-vehicle infotainment system.
Description:
FIELD OF INVENTION
[0001] The present disclosure, in general, relates to a device and a method for controlling the human machine interface (HMI) or user interface (UI) of an in-vehicle infotainment system and, in particular, to a device and a method that customize and personalize the human machine interface (HMI) or user interface (UI) of an in-vehicle infotainment system based on the user’s preferences and ease the mode of operation of the in-vehicle infotainment system for better comfort, entertainment and information for an enhanced in-vehicle experience.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention.
[0003] With the growing demand for luxurious, safe and smart vehicles, automotive vehicle manufacturers are increasingly developing automobiles with integrated infotainment systems, i.e., systems that provide a combination of entertainment and information for an enhanced in-vehicle experience. The entire automotive industry is moving towards developing innovative technologies to enable better connectivity solutions, improve vehicle safety and enhance the in-vehicle user experience. One of the key technologies, which works as a focal point of all modern automotive systems and integrates their functions so that they can be controlled and monitored from one central unit, is an auto-adjustable user interface in the vehicle infotainment system.
[0004] The main components of an infotainment system, which works with many other in-vehicle and external devices to deliver information and entertainment to the user and passengers, include an integrated head unit, that is, a touch-screen-based, tablet-like device mounted on the vehicle’s dashboard. With a user-friendly human machine interface (HMI), the head unit acts as a connected control center for the infotainment system. Other components include an operating system capable of supporting connectivity, convenience functions and downloadable software applications that integrate new functions into the system, standardized communication protocols such as the controller area network (CAN) for protocol support, integrated automotive sensors, and the like.
[0005] The existing state of the art comprises in-vehicle infotainment systems that focus on and monitor a wide ambit of user and vehicular parameters or factors such as the heart rate of the user, fatigue levels while driving, facial features, induced stress, and the like. Thus, the existing systems at present deliver more on the safety aspects of the user during driving of the vehicle, while there is less focus and emphasis on the user’s experience combined with better comfort and enhanced entertainment while driving the vehicle. In addition, there is no technology that understands and utilizes information related to the physical stresses induced on the hand muscles of the user while operating the in-vehicle infotainment system, along with other physical attributes of the user such as body posture, hand length and eye point position, in combination with in-vehicle parameters, for example the orientation of the user’s seat, the angle of recline of the seat and the seatbelt tension.
[0006] Therefore, in order to overcome the limitations of the existing provisions, there is a need in the art to provide a device and a method to customize and personalize the user interface (UI) of an in-vehicle infotainment system based on the user’s preferences and to ease the stress in operating the in-vehicle infotainment system for better comfort, entertainment and information for an enhanced in-vehicle experience.
OBJECTS OF THE DISCLOSURE
[0007] It is therefore an object of the invention to overcome the aforementioned and other drawbacks in prior systems used for controlling the human machine interface (HMI) or user interface (UI) of an in-vehicle infotainment system in a vehicle.
[0008] Another object of the present invention is to provide a device that customizes and personalizes the human machine interface (HMI) or user interface (UI) of the in-vehicle infotainment system based on the user’s preferences and eases the mode of operation of the in-vehicle infotainment system for better comfort and experience.
[0009] Another object of the present invention is to ensure a usable layout of in-vehicle infotainment options on the display screen of the system.
[0010] Another object of the present invention is to reduce the muscle stress and pressure induced on the hand of the user while using and operating the in-vehicle infotainment system of the vehicle.
[0011] Another object of the present invention is to provide for a user-friendly experience to the user while operating the in-vehicle infotainment system.
[0012] These and other objects and advantages of the present invention will be apparent to those skilled in the art after a consideration of the following detailed description taken in conjunction with the accompanying drawings in which a preferred form of the present invention is illustrated.
SUMMARY
[0013] This summary is provided to introduce concepts related to a device and a method that customize and personalize the human machine interface (HMI) or user interface (UI) of an in-vehicle infotainment system based on the user’s preferences and ease the mode of operation of the in-vehicle infotainment system for better comfort and experience.
[0014] The concepts are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[0015] In an embodiment, the present disclosure relates to a device (D) for controlling the human machine interface (HMI) or user interface (UI) of an in-vehicle infotainment system in a vehicle. The device comprises an external wearable device (WD) configured to measure muscle stress (MS) of the user while operating the in-vehicle infotainment system, a plurality of sensors (S1, S2, S3) configured to sense a plurality of parameters of the user and to generate and record said parameter values, and a module (MC) integrated with the vehicle camera (C) configured to record the physical seating attributes of the user, the hand length (HL) and a real-time position of the user’s eye point relative to the in-vehicle infotainment system, wherein the device calibrates and personalizes the user interface by configuring the input parameters received from the wearable device (WD), the plurality of sensors (S1, S2, S3) and the module (MC).
[0016] In an aspect, the plurality of sensors includes a first sensor (S1) configured to sense and measure the seatbelt pressure or seatbelt tension when the user operates the in-vehicle infotainment system, a second sensor (S2) configured to sense and measure the position of the user’s seat, and a third sensor (S3) configured to sense the reclined position of the user’s seat in the vehicle.
[0017] In an aspect, the device is integrated with the in-vehicle infotainment system through a hardwire connection or a controller area network (CAN).
[0018] In an aspect, the plurality of parameters includes in-vehicle parameters and user’s physical attributes.
[0019] In an aspect, the in-vehicle parameters include seat position, seat reclining angle and seat belt tension.
[0020] In an aspect, the user’s physical attributes include hand length, muscle stress, the user’s posture and eye point position.
[0021] In an aspect, the device includes a repository database comprising a predefined value for each in-vehicle parameter and user’s physical attributes.
[0022] In an embodiment, the present disclosure relates to a method of controlling the user interface based on the device, comprising the steps of creating a user profile by capturing a front image of the user by means of a face recognition device, mapping and storing the user profile captured by the face recognition device or any other comparable technology, measuring and recording the vehicle parameters through the plurality of sensors, estimating the validity of the recorded vehicle parameters for customization, measuring and recording the user’s physical attributes through the wearable device and the module, assessing the screen display area of the in-vehicle infotainment system based on the ease of reachability of the user by configuring the vehicle parameter values with the user’s physical attributes, calibrating and personalizing the display interface of the in-vehicle infotainment system based on the user’s usage, and modifying and setting the display interface of the infotainment system.
[0023] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
[0024] It is to be understood that the aspects and embodiments of the disclosure described above may be used in any combination with each other. Several of the aspects and embodiments may be combined to form a further embodiment of the disclosure.
[0025] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF DRAWINGS
[0026] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
[0027] Figure 1 illustrates a schematic view of the method of controlling and customizing the human machine interface (HMI) or user interface of the in-vehicle infotainment system by the device in accordance with an embodiment of the present disclosure;
[0028] Figure 2 illustrates a schematic view of a customized and personalized display interface of the in-vehicle infotainment system based on the user’s usage and preferences in accordance with an embodiment of the present disclosure.
[0029] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes, which may be substantially represented in a computer-readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0030] The detailed description of various exemplary embodiments of the disclosure is described herein with reference to the accompanying drawings. It should be noted that the embodiments are described herein in such details as to clearly communicate the disclosure. However, the amount of details provided herein is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0031] It is also to be understood that various arrangements may be devised that, although not explicitly described or shown herein, embody the principles of the present disclosure. Moreover, all statements herein reciting principles, aspects, and embodiments of the present disclosure, as well as specific examples, are intended to encompass equivalents thereof.
[0032] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
[0033] In addition, the descriptions of "first", "second", “third”, and the like in the present invention are used for the purpose of description only, and are not to be construed as indicating or implying their relative importance or implicitly indicating the number of technical features indicated. Thus, features defining "first" and "second" may include at least one of the features, either explicitly or implicitly.
[0034] It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0035] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0036] The present subject matter relates to a device and a method to customize and personalize the user interface (UI) of an in-vehicle infotainment system based on the user’s preferences and to ease the mode of operation of the in-vehicle infotainment system for better comfort, entertainment and information for an enhanced in-vehicle experience.
[0037] Figure 1 illustrates a schematic view of the method of controlling and customizing the human machine interface (HMI) or user interface of the in-vehicle infotainment system by the device. The device, operating according to the method shown in Figure 1, includes an external wearable device (WD), a plurality of sensors (S1, S2, S3) and a module (MC). It is understood that implementing all of the illustrated components of the present device is not a requirement and that greater or fewer components may alternatively be implemented.
[0038] The device (D) includes an external wearable device (WD) configured to measure muscle stress (MS) of the user while operating the in-vehicle infotainment system, a plurality of sensors (S1, S2, S3) configured to sense a plurality of parameters and to generate and record said parameter values, and a module (MC) integrated with the vehicle camera (C) configured to record the physical seating attributes, the hand length (HL) along with the facial features of the user, and a real-time position of the user’s eye point. The device calibrates and personalizes the human machine interface (HMI) or user interface (UI) of the in-vehicle infotainment system by configuring the input parameters received from the wearable device (WD), the plurality of sensors (S1, S2, S3) and the module (MC) to customize and personalize the display of the in-vehicle infotainment system.
[0039] The plurality of sensors (S1, S2, S3) comprises a first sensor (S1) configured to sense and measure the seatbelt pressure or seat belt tension when the user operates the in-vehicle infotainment system, a second sensor (S2) configured to sense and measure the position of the user’s seat relative to the in-vehicle infotainment system, and a third sensor (S3) configured to sense the reclined position of the user’s seat or any difference in distance or seat reclining angle due to a change of position of the user’s seat in the vehicle. The plurality of sensors may include a greater number of sensors in different embodiments of the present disclosure. The sensors (S1, S2, S3) measure input parameters including in-vehicle parameters.
[0040] The aforementioned in-vehicle parameters include the seat position, that is, the orientation and position of the user’s seat; the seat reclining angle, that is, the angle of recline of the user’s seat while the user is sitting on the seat; and the seat belt tension, that is, the tension developed in the seat belt when the user sits on the seat while operating the in-vehicle infotainment system.
[0041] Further, the user’s physical attributes include the hand length, that is, the length of the user’s hand while operating the in-vehicle infotainment system; the muscle stress, that is, the stress induced on the hand muscles of the user while operating the in-vehicle infotainment system; the body posture, that is, the complete posture of the user while accessing the in-vehicle infotainment system; and the eye point position of the user while using the in-vehicle infotainment system.
[0042] The device (D) is integrated with the in-vehicle infotainment system through certain standardized forms of communication commonly used in the automobile industry, such as a hardwire connection or a controller area network (CAN). Further, the device also comprises a repository database having a set of predefined values for each input parameter, including the in-vehicle parameters and the user’s physical attributes.
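By way of illustration only, the repository database and the input parameters described above could be organized as in the following sketch; the class names, field names and numeric values are assumptions introduced for this example and do not form part of the disclosure.

```python
# Illustrative sketch only: one possible in-memory layout for the input
# parameters and the repository database of predefined values described above.
# All class names, field names and numeric values are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, Tuple


@dataclass
class InVehicleParameters:
    """In-vehicle parameters measured by the sensors S1, S2, S3."""
    seat_belt_tension_n: float   # seat belt tension (assumed unit: newtons)
    seat_position_mm: float      # seat position relative to the head unit (assumed unit: mm)
    seat_recline_deg: float      # seat reclining angle (assumed unit: degrees)


@dataclass
class PhysicalAttributes:
    """User's physical attributes captured by the wearable device (WD) and module (MC)."""
    hand_length_mm: float
    muscle_stress: float                        # normalized stress index from the wearable device (assumed 0..1)
    posture: str                                # e.g. "upright" or "reclined"
    eye_point_xyz: Tuple[float, float, float]   # eye point position in an assumed cabin frame


@dataclass
class Repository:
    """Repository database holding a predefined (standard) value for each parameter."""
    standard_vehicle: InVehicleParameters
    standard_physical: PhysicalAttributes
    tolerance: Dict[str, float] = field(default_factory=dict)  # per-parameter compensatory range


# Example predefined values; the numbers are placeholders, not values from the disclosure.
repository = Repository(
    standard_vehicle=InVehicleParameters(50.0, 300.0, 25.0),
    standard_physical=PhysicalAttributes(180.0, 0.2, "upright", (0.6, 0.4, 1.1)),
    tolerance={"seat_belt_tension_n": 10.0, "seat_position_mm": 40.0, "seat_recline_deg": 5.0},
)
```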
[0043] Figure 1 further illustrates a schematic view of the process of controlling and customizing the human machine interface (HMI) or user interface of the in-vehicle infotainment system by the device. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any appropriate order to carry out the method 100 or an alternative method. Additionally, individual blocks may be deleted from the method 100 without departing from the scope of the subject matter described herein.
[0044] At step 101a, the method includes creating a user profile by capturing a front image or the facial features of the user by means of a face recognition device such as a camera. The camera may be placed in front of the seat of the user inside the vehicle.
[0045] At step 101b, the method includes mapping and storing the user profile captured by the face recognition device. The face recognition device captures the user profile in the form of user identification data (ID).
[0046] At step 102, the method includes measuring and recording the input parameters through the plurality of sensors (S1, S2, S3). The sensors (S1, S2, S3), for example a seat belt tension sensor and a seat position sensor, measure the seat belt tension at different touch points on the seat belt, the seat position and the seat reclining angle in different seat positions.
[0047] At step 103, the method includes estimating the validity of the recorded input parameters for customization and personalization of the interface of the in-vehicle infotainment system. Herein, a validity check is conducted in which the values of the recorded in-vehicle parameters, i.e. the seat belt tension, the seat reclining angle and the user’s seat position, are first monitored and then compared with a standard data set or standard value of each parameter to determine any form of deviation or abnormality. Further, the deviations or abnormalities recorded for each parameter are correlated with each other and stored against the mapped user ID of the user. At this step 103, if the deviation or abnormality value is within a feasible range of a pre-defined compensatory value in relation to the standard values, the device (D) proceeds further to measure and record the other parameter values required for customization of the display interface of the in-vehicle infotainment system.
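A minimal sketch of the validity check of step 103 follows, assuming the recorded and standard parameter values are held as simple dictionaries (for instance populated from the repository sketched earlier); the absolute-deviation rule, the function name and the numeric values are illustrative assumptions rather than the claimed method.

```python
# Illustrative sketch of step 103: compare recorded in-vehicle parameters with
# the standard (predefined) values and check that every deviation stays within
# the pre-defined compensatory range. Names and the rule used are assumptions.

def estimate_validity(recorded, standard, tolerance):
    """Return (is_valid, deviations) for the recorded in-vehicle parameters."""
    deviations = {name: recorded[name] - standard[name] for name in recorded}
    is_valid = all(abs(dev) <= tolerance.get(name, 0.0) for name, dev in deviations.items())
    return is_valid, deviations


# Hypothetical values; deviations are correlated and stored against the mapped user ID.
recorded = {"seat_belt_tension_n": 54.0, "seat_position_mm": 320.0, "seat_recline_deg": 27.0}
standard = {"seat_belt_tension_n": 50.0, "seat_position_mm": 300.0, "seat_recline_deg": 25.0}
tolerance = {"seat_belt_tension_n": 10.0, "seat_position_mm": 40.0, "seat_recline_deg": 5.0}

valid, deviations = estimate_validity(recorded, standard, tolerance)
user_profiles = {"user_001": {"deviations": deviations, "valid": valid}}
if valid:
    pass  # proceed to step 104: measure and record the user's physical attributes
```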
[0048] At step 104, the method includes measuring and recording the user’s physical attributes through the wearable device and the module (MC).
[0049] At step 105, the method includes assessing the screen display area of the in-vehicle infotainment system based on the ease of reachability of the user by configuring the vehicle parameter values with the user’s physical attributes.
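The following sketch shows one possible way of carrying out the reachability assessment of step 105 by combining the seat position, seat reclining angle and hand length in a simple geometric reach model; the coordinate frame, the arm-reach constant and every numeric value are assumptions made only for illustration.

```python
# Illustrative sketch of step 105: a simple geometric reach model combining the
# seat position, seat reclining angle and hand length to decide which display
# zones are easy to reach. The coordinate frame and all constants are assumptions.
import math


def reachable_zones(seat_position_mm, seat_recline_deg, hand_length_mm, screen_zones):
    """Return the display zones within the user's easy reach and their distances.

    screen_zones: dict of zone name -> (y_mm, z_mm) centre of the zone on the
    display plane, in an assumed cabin frame with x pointing from the display
    toward the seat.
    """
    # Approximate shoulder location: more recline moves the shoulder away from the display.
    shoulder_x = seat_position_mm + 100.0 * math.sin(math.radians(seat_recline_deg))
    shoulder_y, shoulder_z = -350.0, 650.0   # assumed lateral/height offsets (placeholders)
    reach_radius = hand_length_mm + 350.0    # hand length plus an assumed arm-reach constant

    reachable = {}
    for name, (y, z) in screen_zones.items():
        distance = math.sqrt(shoulder_x ** 2 + (y - shoulder_y) ** 2 + (z - shoulder_z) ** 2)
        if distance <= reach_radius:
            reachable[name] = round(distance, 1)
    return reachable


# Hypothetical display zones of the head unit; all coordinates are placeholders.
zones = {"top_left": (-300.0, 900.0), "top_right": (0.0, 900.0),
         "bottom_left": (-300.0, 600.0), "bottom_right": (0.0, 600.0)}
print(reachable_zones(seat_position_mm=300.0, seat_recline_deg=25.0,
                      hand_length_mm=180.0, screen_zones=zones))
# In this example the top-right zone falls outside the easy-reach radius.
```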
[0050] At step 106, the method includes calibrating and personalizing the display interface of the in-vehicle infotainment system based on the user’s usage pattern.
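As a sketch of the calibration and personalization of step 106, the most frequently used infotainment options could be assigned to the most easily reachable display zones identified at step 105; the greedy pairing rule, the option names and the usage counts are illustrative assumptions.

```python
# Illustrative sketch of step 106: assign the most frequently used options to
# the most easily reachable display zones (e.g. those found at step 105).
# The greedy pairing rule, option names and counts are assumptions.

def personalize_layout(usage_counts, zone_distances):
    """Map infotainment options to display zones, most used -> closest zone."""
    options_by_usage = sorted(usage_counts, key=usage_counts.get, reverse=True)
    zones_by_reach = sorted(zone_distances, key=zone_distances.get)
    return dict(zip(zones_by_reach, options_by_usage))


# Hypothetical usage pattern and the reach distances from the previous sketch.
usage = {"navigation": 42, "media": 30, "phone": 12, "settings": 3}
distances = {"bottom_left": 349.5, "top_left": 426.8, "bottom_right": 492.1}
print(personalize_layout(usage, distances))
# {'bottom_left': 'navigation', 'top_left': 'media', 'bottom_right': 'phone'}
# "settings", the least used option here, stays outside the easy-reach zones.
```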
[0051] At step 107, the method includes modifying and setting the display interface of the in-vehicle infotainment system.
[0052] It is important to note that the functioning of the device (D) would not be impaired if any one of the parameter values is not measured or is not feasible to measure. The present device (D) is not limited to the aforementioned parameters, including the in-vehicle parameters and the user’s physical attributes, and the desired customization and personalization of the display interface of the in-vehicle infotainment system would not be affected by the limitations described above. Further, since the system collects valid data and creates individual profiles, the system accommodates the different parameter values and accordingly refines the user profiles.
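A brief sketch of the fallback behavior noted above, assuming the repository of predefined values sketched earlier: a parameter that could not be measured is substituted by its predefined value so that the customization can still proceed; the function name and the values are hypothetical.

```python
# Illustrative sketch of the fallback noted above: a parameter that was not
# measured (or is not feasible to measure) is substituted by its predefined
# value from the repository database, so customization can still proceed.
# The function name and all values are hypothetical.

def resolve_parameters(measured, predefined):
    """Merge measured values with repository defaults for missing parameters."""
    return {name: measured.get(name, default) for name, default in predefined.items()}


measured = {"seat_position_mm": 320.0}   # e.g. only the seat position sensor reported a value
predefined = {"seat_position_mm": 300.0, "seat_recline_deg": 25.0, "seat_belt_tension_n": 50.0}
print(resolve_parameters(measured, predefined))
# {'seat_position_mm': 320.0, 'seat_recline_deg': 25.0, 'seat_belt_tension_n': 50.0}
```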
Technical advantages:
[0053] With the present device implemented in the in-vehicle infotainment system, it is possible to monitor and measure the stress induced on the hand muscles of the user while operating the in-vehicle infotainment system.
[0054] With the present disclosure, it is possible to determine the hand length of the user to correlate the hand length with the muscle stress induced on the hand muscles of the user to estimate the easily reachable points on the in-vehicle infotainment system.
[0055] With the present disclosure, the in-vehicle infotainment system can be customized and personalized in accordance with the user preferences and choices.
[0056] With the present disclosure, the in-vehicle infotainment system provides better comfort, entertainment and information for an enhanced in-vehicle experience.
[0057] It should be understood, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, as apparent from the discussion herein, it is appreciated that throughout the description, discussions utilizing terms such as “receiving,” or “determining,” or “retrieving,” or “controlling,” or “comparing,” or the like, refer to the action and processes of an electronic control unit, or similar electronic device, that manipulates and transforms data represented as physical (electronic) quantities within the control unit’s registers and memories into other data similarly represented as physical quantities within the control unit memories or registers or other such information storage, transmission or display devices.
[0058] Further, the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. It will be appreciated that several of the above-disclosed and other features and functions, or alternatives thereof, may be combined into other systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations, or improvements therein may subsequently be made by those skilled in the art without departing from the scope of the present disclosure as encompassed by the following claims.
[0059] It will be appreciated that variants of the above-disclosed and other features and functions, or alternatives thereof, may be combined into many other different systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations, or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.
[0060] The claims, as originally presented and as they may be amended, encompass variations, alternatives, modifications, improvements, equivalents, and substantial equivalents of the embodiments and teachings disclosed herein, including those that are presently unforeseen or unappreciated, and that, for example, may arise from applicants/patentees and others.