Abstract: User interaction systems in vehicles. Embodiments disclosed herein relate to vehicle systems and more particularly to interfaces that enable user interactions with vehicle systems. Embodiments herein disclose methods and systems for enabling user interactions with a vehicle using at least one display, wherein the system configures the display to present information in a plurality of depth layers using a 3 dimensional depth field, and wherein the user can interact with the depth layers using a minimal number of touches. FIG. 4
Claims: STATEMENT OF CLAIMS
We claim:
1. A system (100) for enabling user interactions with a vehicle using at least one display (102), the system (100) configured for
displaying at least one item using the display (102) by a controller (101) in a 3 dimensional manner, wherein the at least one item is displayed using a plurality of depth layers; and
enabling at least one user to interact with the at least one item by the controller (101) using at least one of the display (102), at least one button, at least one switch, at least one steering wheel controller, at least one rotary controller, at least one joystick, and at least one gesture.
2. The system, as claimed in claim 1, wherein the controller (101) is configured to arrange positions and perspectives of the depth layers.
3. The system, as claimed in claim 1, wherein the depth layers comprise a plurality of layers, wherein a first layer is closest to the user and displays a primary menu item, and subsequent layers serve as placeholders for additional information.
4. The system, as claimed in claim 1, wherein the controller (101) is configured to enable the user to interact with the system (100) using at least one virtual object.
5. The system, as claimed in claim 1, wherein the controller (101) is configured to enable at least one of a user or an authorized person to configure the depth layers.
Description: TECHNICAL FIELD
[001] Embodiments disclosed herein relate to vehicle systems and more particularly to interfaces, which enable user interactions with vehicle systems.
BACKGROUND
[002] Currently, most vehicles are provided with systems comprising a display, such as an infotainment system, HVAC (Heating, Ventilation and Air Conditioning) systems, instrument clusters, and so on. With touch screen displays, driver interaction is mainly through touch or through control buttons (for example, steering wheel controls). Considering an example of an infotainment system with a display, a typical home screen of the infotainment system provides a highest level view of the functionalities, such as options for accessing media, phone, and navigation. However, if the driver wishes to operate the infotainment system or view information on the system, the driver has to focus on the screen, maneuver through different screens, and perform a series of touches to get to the relevant information he is looking for. This demands complete driver attention towards the screen and forces him to take his eyes off the road. This can be distracting for the driver, thereby adversely affecting the safety of the vehicle, and can result in accidents or dangerous situations.
OBJECTS
[003] The principal object of this invention is to provide methods and systems for enabling user interactions with a vehicle using at least one display.
[004] Another object of the invention is to provide methods and systems for enabling user interactions with a vehicle using at least one display, wherein the system configures the display for displaying information in a plurality of depth layers using a 3 dimensional depth field.
[005] A further object of the invention is to provide methods and systems for enabling user interactions with a vehicle using at least one display, wherein the user can interact with the depth layers using a minimal number of touches.
BRIEF DESCRIPTION OF FIGURES
[006] This invention is illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[007] FIGs. 1a and 1b depict vehicle systems wherein a user of the vehicle can interact with the system, according to embodiments as disclosed herein;
[008] FIG. 2 depicts the controller, according to embodiments as disclosed herein;
[009] FIGs. 3a and 3b depict example displays for an infotainment system, according to embodiments as disclosed herein; and
[0010] FIG. 4 depicts another example of a display of an infotainment system, according to embodiments as disclosed herein.
DETAILED DESCRIPTION
[0011] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0012] The embodiments herein disclose methods and systems for enabling user interactions with a vehicle using at least one display. Referring now to the drawings, and more particularly to FIGS. 1 through 4, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[0013] The vehicle as disclosed herein can be at least one of a car, van, truck, bus, motorcycle, scooter, or any other vehicle comprising at least one display based system. Examples of the system can be an infotainment system (wherein the infotainment system can depict at least one of a media menu, a phone menu, navigation, and so on), an HVAC (Heating, Ventilation and Air Conditioning) system, an instrument cluster, or any other system present in the vehicle which provides information to the user and/or enables the user to interact with the system.
[0014] User as disclosed herein can refer to any person present in the vehicle, such as the driver, at least one passenger, and so on.
[0015] FIGs. 1a and 1b depict vehicle systems wherein a user of the vehicle can interact with the system. The system 100 as depicted can comprise at least one controller 101 and at least one display 102. In an embodiment herein, the controller 101 can be separate from the display 102 and can be co-located with or located remotely from the display 102 (as depicted in FIG. 1a). The controller 101 can be connected to more than one system and can control more than one system present in the vehicle. In an embodiment herein, the controller 101 can be present internal to the vehicle system (as depicted in FIG. 1b). In an embodiment herein, the system 100 can also comprise at least one user interface. Examples of the user interface can be touchscreens, physical buttons, steering wheel buttons, a joystick, a rotary controller, interfaces present in the rear of the vehicle so as to enable passengers seated in the rear to interact with the vehicle system 100, gesture based means (which enable the system 100 to sense gestures made by at least one user), and so on.
[0016] FIG. 2 depicts the controller. The controller 101, as depicted, comprises a rendering engine 201, an object controller 202, at least one communication interface 203, and a memory 204. The at least one communication interface 203 can enable the system 100 to interact with at least one system present in the vehicle. The at least one communication interface 203 can also enable a user to interact with the system using at least one of the display 102, physical buttons, steering wheel buttons, a joystick, a rotary controller, interfaces present in the rear of the vehicle so as to enable passengers seated in the rear to interact with the vehicle system 100, gesture based means, and so on. The memory 204 can be at least one of a co-located memory or a remotely located memory.
[0017] The object controller 202 can display a plurality of menu items to the user on the display 102, wherein a primary menu item is displayed as the nearest object on the layer closest to the user, and subsequent layers display further information related to the primary menu item or subsequent items. For example, the display 102 can display the current media item being played in the first layer, with the subsequent layers showing the other media items in the queue. The manner of displaying items as a plurality of layers on the display 102 in a 3D (3 dimensional) manner, wherein the first layer is the layer closest to the front and displays the primary menu item, and subsequent layers are present behind the first layer and serve as placeholders for additional information, is referred to herein as depth layers. The object controller 202 can enable at least one user or an authorized person to customize the depth layers, on a global basis or on a per item basis, such as the number of depth layers to be displayed, the items to be displayed, and so on. FIGs. 3a and 3b depict example displays for an infotainment system. In FIG. 3a, the top panel displays generic information such as the date, time, and so on, and the display panel in the center displays the media currently being played, information related to a connected mobile device, information related to the vehicle, and navigation information. In FIG. 3b, the user has selected the media item, and further information about the media is displayed in a prominent manner. Additional panels can display additional information such as information related to a connected mobile device, information related to the vehicle, and navigation information.
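The depth-layer arrangement described in paragraph [0017] can be sketched as a simple data model. This is a minimal illustrative sketch, not the patented implementation; all class and method names (DepthLayer, DepthLayerMenu, primary_item, placeholder_info) are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DepthLayer:
    """One layer of the 3D depth field; depth 0 is closest to the user."""
    depth: int
    items: List[str] = field(default_factory=list)

@dataclass
class DepthLayerMenu:
    """Layers ordered front to back: the first layer carries the primary
    menu item; later layers are placeholders for additional information."""
    layers: List[DepthLayer] = field(default_factory=list)

    def primary_item(self) -> str:
        # The first (closest) layer displays the primary menu item.
        return self.layers[0].items[0]

    def placeholder_info(self) -> List[str]:
        # Subsequent layers serve as placeholders for additional information.
        return [item for layer in self.layers[1:] for item in layer.items]

# Example from the text: the current media item in front, the queue behind it.
menu = DepthLayerMenu(layers=[
    DepthLayer(depth=0, items=["Now playing: Track A"]),
    DepthLayer(depth=1, items=["Track B", "Track C"]),
    DepthLayer(depth=2, items=["Track D"]),
])
```

Per-item customization (number of layers, items shown) would then amount to constructing a different list of layers for each primary menu item.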
[0018] The object controller 202 can use the rendering engine 201 to arrange the depth layers in a 3 dimensional (3D) manner, so as to provide the user with a feeling that the display is 3 dimensional (3D), by arranging the position and perspective sizes of the depth layers. FIG. 4 depicts another example of a display of an infotainment system.
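One way the rendering engine 201 could arrange the position and perspective sizes of the depth layers is a simple pinhole-style projection, where deeper layers render smaller. The formula, the focal length, and the layer spacing below are assumptions for illustration; the embodiments do not prescribe a particular projection.

```python
def perspective_scale(layer_depth: int, focal_length: float = 1.0,
                      layer_spacing: float = 0.5) -> float:
    """Return the size scale factor for a depth layer.

    Uses scale = f / (f + z), where z is the layer's distance behind the
    front plane. Depth 0 (closest to the user) renders at full size;
    deeper layers shrink, producing the 3D feeling described in [0018].
    """
    z = layer_depth * layer_spacing
    return focal_length / (focal_length + z)

# Scale factors for four layers, front to back.
scales = [round(perspective_scale(d), 3) for d in range(4)]
```

The same z value could also drive a vertical or horizontal offset per layer, so that deeper layers appear stacked behind the front layer rather than directly underneath it.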
[0019] The object controller 202 can create and represent at least one virtual object in the display 102, wherein a virtual object can aid in facilitating the transition from 2D mechanisms to 3D interactions. In an example herein, the virtual object can be a virtual floating sphere graphically drawn on the display 102, which can be used to represent the user's hand inside this 3D environment. In an embodiment herein, the user can interact with the object using the display 102, wherein the display 102 is a touch based display, using at least one of basic touch gestures and multi touch gestures (such as swiping, pinching, and so on). In an embodiment herein, the user can interact with the object using a rotary controller or a joystick. The user can perform actions such as clockwise/anticlockwise rotations, pushing down, pulling up, 360 degree push and tilt, and so on. In an embodiment herein, the user can interact with the object using controls present on the steering wheel. In an embodiment herein, the user can interact with the object using controls present in the instrument cluster. In an embodiment herein, the user can interact with the object using gestures performed in a pre-defined region within the vehicle. The interface 203 can comprise means such as camera based means and depth sensing means to identify the gesture made by the user. The interface 203 can locate the position of the user's hand and then translate the distance that the hand has travelled from a fixed point of reference into a distance into the 3D depth. This space can be used to represent and interact with the items on the display 102. Additional feedback mechanisms can be provided to give the driver proper feedback and minimize localization errors.
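The translation in paragraph [0019] from hand travel past a fixed reference point into the 3D depth can be sketched as follows. The function name, the size of the interaction region, and the linear mapping are all assumptions for illustration; a real system would calibrate these against the camera or depth sensor in use.

```python
def hand_depth_to_layer(hand_z_mm: float, reference_z_mm: float,
                        num_layers: int,
                        region_depth_mm: float = 200.0) -> int:
    """Map how far the hand has travelled past a fixed reference plane
    (both positions in millimetres, e.g. from a depth sensor) to the
    index of the depth layer being interacted with (0 = front layer)."""
    travelled = hand_z_mm - reference_z_mm
    # Clamp to the pre-defined interaction region within the vehicle.
    travelled = max(0.0, min(travelled, region_depth_mm))
    # Divide the region linearly among the layers.
    layer = int(travelled / region_depth_mm * num_layers)
    return min(layer, num_layers - 1)
```

The virtual floating sphere would then be drawn at the position and scale of the selected layer, giving the driver visual feedback that helps minimize localization errors.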
[0020] The above examples are explained herein using infotainment systems merely as an example; however, it will be obvious to a person of ordinary skill in the art to extend the embodiments as disclosed herein to any display based system present in a vehicle.
[0021] Embodiments herein disclose a more ergonomic way of information representation, which will give the user the basic and relevant information by just glancing at the screen and by enabling mechanisms to interact with the depth information layers using minimal number of touches. Embodiments disclosed herein alleviate the demand for driver attention by creating and using a 3D depth field made available in the display 102 to show additional information in these depth layers. Embodiments disclosed herein enable users to interact with these depth layers reducing the number of interactions required to access the information as compared to the currently deployed display based solutions.
[0022] The embodiments disclosed herein describe methods and systems for enabling user interactions with a vehicle using at least one display. Therefore, it is understood that the scope of the protection extends to such a program and, in addition, to a computer readable means having a message therein, wherein such computer readable storage means contain program code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The method is implemented in a preferred embodiment through or together with a software program written in, e.g., Very high speed integrated circuit Hardware Description Language (VHDL) or another programming language, or implemented by one or more VHDL or software modules being executed on at least one hardware device. The hardware device can be any kind of portable device that can be programmed. The device may also include means which could be, e.g., hardware means such as an ASIC, or a combination of hardware and software means, e.g., an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. The method embodiments described herein could be implemented partly in hardware and partly in software. Alternatively, the invention may be implemented on different hardware devices, e.g., using a plurality of CPUs.
[0023] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
| # | Name | Date |
|---|---|---|
| 1 | 201641033165-FER.pdf | 2020-04-30 |
| 2 | Power of Attorney [28-09-2016(online)].pdf | 2016-09-28 |
| 3 | Correspondence by Agent_Form1 F5 PA_28-11-2016.pdf | 2016-11-28 |
| 4 | Form 5 [28-09-2016(online)].pdf | 2016-09-28 |
| 5 | abstract 201641033165 .jpg | 2016-11-04 |
| 6 | Form 3 [28-09-2016(online)].pdf | 2016-09-28 |
| 7 | Description(Complete) [28-09-2016(online)].pdf | 2016-09-28 |
| 8 | Form 18 [28-09-2016(online)].pdf | 2016-09-28 |
| 9 | Drawing [28-09-2016(online)].pdf | 2016-09-28 |
| 10 | search033165E_24-04-2020.pdf | |