
Touch Gesture Recognition System For Vehicles

Abstract: The present subject matter describes a touch-gesture recognition system (100) in a vehicle. The touch-gesture recognition system (100) includes a touch-pad sensor (101) and a microcontroller (105). Touch-gesture patterns made by a user are received by the touch-pad sensor (101). The microcontroller (105) connected to the touch-pad sensor (101) analyzes the touch-gesture patterns. Based on the analysis, the microcontroller (105) identifies actuate commands corresponding to the touch-gesture patterns. The actuate commands are executed to operate various devices in the vehicle.


Patent Information

Application #
Filing Date
09 February 2009
Publication Number
15/2012
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

TVS MOTOR COMPANY LIMITED
JAYALAKSHMI ESTATE, 24(OLD #8) HADDOWS ROAD, CHENNAI 600006

Inventors

1. SAMRAJ JABEZ DHINAGAR
JAYALAKSHMI ESTATE, 24(OLD #8) HADDOWS ROAD, CHENNAI 600006
2. SUNIL KUMAR CHIPPA
JAYALAKSHMI ESTATE, 24(OLD #8) HADDOWS ROAD, CHENNAI 600006

Specification

Technical Field

The present subject matter, in general, relates to switching systems in vehicles and, in particular, relates to a touch-gesture recognition based switching system for vehicles.

Background

Conventional switching systems in a vehicle require physical activity or manual effort from a driver to accomplish various switching operations within the vehicle. The physical activity that may be required includes pushing a button or moving a switch or knob located within the vehicle. For example, in order to turn on the headlights of the vehicle, a driver of a four-wheeled vehicle has to operate a switch located in the vicinity of the steering wheel. In the case of a two- or three-wheeled vehicle, the switch lies in the vicinity of a handlebar assembly. However, any manual operation performed by the driver while driving tends to distract the driver's attention from surrounding traffic and road conditions. Such distractions can be detrimental to traffic safety.

In an attempt to alleviate the amount of manual handling of vehicle controls and to minimize the driver's distraction, speech recognition and voice command systems have been developed. However, the operation of these systems depends on the clarity of the voice command given, without which the systems will not operate in the prescribed way. This could, in fact, increase the driver's distraction, as the driver may have to repeat a command several times in order to execute a switching operation. Moreover, the driver has to remember a large number of commands in order to operate all the systems. Thus, although these systems provide a solution in principle, they have not yet been effectively implemented in real-life scenarios.

New-age vehicles come with multiple switches to operate multiple devices within these vehicles. These switches may be in the form of buttons, knobs, etc., which are fitted almost everywhere around the driver. The presence of such an excessive number of switches and devices may lead to chaos and confusion in the mind of the driver while driving the vehicle. Therefore, manual handling of vehicles has become more complex than ever. On the other hand, if usage of only a few switches and devices is advised as a precautionary measure to avoid distraction, the available resources are under-utilized. In addition, non-utilization of the available resources denies the driver the benefit of advanced technology.

Summary

The present subject matter is directed towards a touch-gesture pattern recognition system implemented in a vehicle. The touch-gesture pattern recognition system includes a touch-pad sensor and a microcontroller. The touch-pad sensor receives touch-gesture patterns from a user. The microcontroller connected to the touch-pad sensor analyzes the touch-gesture patterns to execute actuate commands for operating one or more devices of the vehicle.

The touch-pad sensor, in tandem with the microcontroller, operates upon the various devices, such as a horn, high beam lamp, low beam lamp, dipper, head light, tail light, indicator, etc., within the vehicle, thereby executing a number of operations.

Accordingly, the touch-gesture recognition system acts as a centralized switching system to execute any number of switching operations of the vehicle, thereby saving the user from any chaos or confusion.

These and other features, aspects, and advantages of the present subject matter will be better understood with reference to the following description and appended claims. This summary is provided to introduce a selection of concepts in a simplified form. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

Brief Description of Drawings

The above and other features, aspects and advantages of the subject matter will be better understood with regard to the following description, appended claims, and accompanying drawings where:

Fig. 1 illustrates a block diagram representation of a touch-gesture recognition system in a vehicle, according to an embodiment of the present subject matter.

Fig. 2(a-f) depicts configurations of a touch-pad sensor of the touch-gesture recognition system of Fig. 1, in three different modes of operation, in accordance with an embodiment of the present subject matter.

Fig. 3 illustrates a detailed block diagram representation of the touch-gesture recognition system of Fig. 1, according to an embodiment of the present subject matter.

Fig. 4 depicts a data and control flow diagram representation of a microcontroller of the touch-gesture recognition system of Fig. 1, according to an embodiment of the present subject matter.

Detailed Description

The disclosed subject matter relates to a touch-gesture recognition system in a vehicle for executing a number of actuate commands to selectively operate devices within the vehicle. The touch-gesture recognition system includes a touch-pad sensor and a microcontroller. A user or driver of the vehicle touches the touch-pad sensor in a predefined way to apply a touch-gesture pattern for selectively switching ON or OFF the devices to perform the number of switching operations within the vehicle. The various operations may include switching ON or OFF a horn, a high beam lamp, a low beam lamp, a dipper, a head light, a tail light, indicators, an air-conditioner, a music player, a window switch, and a door lock.

In response to the receipt of the touch-gesture pattern, the touch-pad sensor generates a particular intermediate code. The microcontroller analyzes the intermediate code and, based on this analysis, activates a particular device or control within the vehicle. For the purpose of data analysis associated with the intermediate code, the microcontroller includes a movement detection module, a pressure differentiation module, and a gesture recognition module.

The movement detection module determines whether the contact made by the operator on the touch-pad sensor is a point contact or a sliding contact. The pressure differentiation module evaluates the amount of pressure exerted by the user on the touch-pad sensor while making the touch-gesture pattern. Based upon the results obtained from the pressure differentiation module and the movement detection module, the gesture recognition module retrieves and executes a relevant instruction which corresponds to the touch-gesture pattern. Accordingly, a particular device or control within the vehicle gets either switched ON or OFF. In addition, a display module is connected to the microcontroller to depict the validity of the applied touch-gesture pattern and either a success or failure of the execution of the actuate command in the vehicle.

The touch-gesture recognition system as disclosed herein provides a centralized switching system for activating or deactivating the devices implemented within the vehicle. Accordingly, the present subject matter eliminates the possibility of any confusion arising in the mind of the user due to simultaneous usage of two or more manual switches and control knobs in the vehicle while driving.

Further, the touch-gesture recognition system facilitates an ease of operation as compared to conventional manually operated switches and knobs. The ease in operation is due to the fact that the pressure applied on the touch-pad sensor is much less as compared to the force or pressure required to operate or manually actuate mechanical switches or knobs.
Fig. 1 illustrates a block diagram representation of a touch-gesture recognition system 100 in a vehicle, in accordance with one embodiment of the present subject matter.

As depicted in Fig. 1, the touch-gesture recognition system 100, hereinafter interchangeably referred to as the recognition system 100, includes a touch-pad sensor 101 and a microcontroller 105. The touch-pad sensor 101 is communicatively connected to the microcontroller 105. The combination of the touch-pad sensor 101 and the microcontroller 105 is used to execute actuate commands that in turn operate a number of devices within the vehicle. The devices may vary depending upon the type and class of the vehicle. For example, the devices in a two-wheeled vehicle may be fewer in number and complexity as compared to a four-wheeled vehicle. Without limiting the scope of the subject matter, the recognition system 100 may be implemented in various types of vehicles, such as a two-wheeled, a three-wheeled, or a four-wheeled vehicle.

Further, the touch-pad sensor 101 may be of various sizes and shapes, also depending upon the type and class of the vehicle. The shape of the touch-sensitive surface of the touch-pad sensor 101 may be rectangular, circular, oval, or any other shape in accordance with the aesthetics of the vehicle. Without limiting the scope of the present subject matter, the touch-pad sensor 101 herein explained is rectangular in shape.

Figs. 2(a-f) depict configurations of the touch-pad sensor 101 of the recognition system 100 of Fig. 1, in three different modes of operation, in accordance with an embodiment of the present subject matter.

In one implementation, the touch-pad sensor 101 of the present recognition system 100 operates under different modes of operation. These modes of operation may be classified as a switch mode, a write-pad mode, and a keypad mode. Based upon the type and class of the vehicle, the touch-pad sensor 101 can be implemented in any of the aforementioned modes. In another implementation, the touch-pad sensor 101 may exhibit all of the aforementioned modes simultaneously.

The switch mode and the write-pad mode are location-insensitive modes. The touch-gesture pattern in these modes of operation may be applied anywhere within the periphery of the touch-pad sensor 101. These two modes involve application of pressure on a touch-sensitive surface of the touch-pad sensor 101 by a user. In these modes, the touch-gesture pattern includes a directional movement. Accordingly, the contact made by the user on the touch-sensitive surface of the touch-pad sensor 101 may be a sliding contact.

In contrast to the switch mode and the write-pad mode, the keypad mode is a location-sensitive mode and is operable only by the application of pressure by the user on a predefined portion of the touch-sensitive surface of the touch-pad sensor 101. The keypad mode does not involve application of any directional movement by the user. Accordingly, the contact made by the user during this mode may be a point contact.

Fig. 2(a) depicts the touch-pad sensor 101 configured in the switch mode. Under this mode, a user may apply a touch-gesture pattern by pressing his or her finger against the touch-sensitive surface of the touch-pad sensor 101. Such a touch-gesture pattern may be made either by sliding a finger or stylus on the touch-sensitive surface in one direction, or by making other geometrical shapes and figures, such as a square, a circle, a triangle, etc., thereon.

All these touch-gesture patterns are not location or position sensitive and may be applied anywhere within the boundaries of the touch-pad sensor 101. Each touch-gesture pattern may be made to correspond to an actuate command which may be executed for switching ON or OFF a particular device of the vehicle. For instance, a straight line may be drawn to execute an actuate command that activates a headlight, while a clockwise and an anticlockwise circle may be made to activate a right and a left turn indicator, respectively. Moreover, as evident from Fig. 2(a), the touch-pad sensor 101 under this mode does not bear any marking or inscription engraved on its surface.

Further, Figs. 2(b-d) depict the touch-pad sensor 101 configured in the keypad mode. In one embodiment, as evident from the figures, the touch-pad sensor 101 is segmented into a number of zones. The number of segmented zones may be three, four, or more, depending upon the implementation.

As aforementioned, under the switch mode of operation, the touch-gesture pattern may be made anywhere on the touch-sensitive surface of the touch-pad sensor 101. However, the touch-gesture pattern with respect to the keypad mode is location sensitive as the touch-sensitive surface of the touch-pad sensor 101 is segmented. Within a particular zone or segment, a point contact may be made by the user by touching the aforementioned zone with the help of a finger or a stylus pen. This point contact is applied by the user at any zone of the touch-sensitive surface of the touch-pad sensor 101. Accordingly, a symbol denoted by the contacted zone gets selected in the form of an input.

Likewise, other zones provided on the touch-sensitive surface may also be contacted by the user so as to select only those symbols that are desired by the user. The touch-pad sensor 101, under the keypad mode, may act as a keyboard of a computer. As an example, the symbols associated with the zones may include numerical digits. In an implementation, these symbols may be alphanumeric symbols or special characters or a combination of both. Each symbol may correspond to a predetermined actuate command.

For example, the numeral "1" may be entered as the input by making a point contact over the zone bearing the numeral "1", so as to activate a horn of the vehicle. Similarly, the numeral "2" may be entered to activate a head lamp.
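By way of a non-limiting illustration, the keypad-mode assignment of zone symbols to device operations described above may be sketched as a simple look-up (the mapping below is hypothetical, fixed only by the numerals "1" and "2" exemplified above):

```python
# Illustrative keypad-mode mapping (zone symbol -> device operation),
# following the examples of numeral "1" (horn) and "2" (head lamp).
KEYPAD_COMMANDS = {"1": "horn", "2": "head_lamp"}

def keypad_input(symbol):
    """Return the device operation assigned to the contacted zone, or
    None when the zone's symbol has no predetermined actuate command."""
    return KEYPAD_COMMANDS.get(symbol)
```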

Figs. 2(e-f) depict the write-pad mode of the touch-pad sensor 101. The write-pad mode is location insensitive like the switch mode and unlike the keypad mode. However, the touch-gesture patterns under the write-pad mode include a signature or a scribbled letter applied on the touch-sensitive surface of the touch-pad sensor 101. Any pre-defined linguistic character or a string may be scribbled within the periphery of the touch-sensitive surface of the touch-pad sensor 101. Such mode of operation of the touch-pad sensor 101 makes the recognition system 100 an effective password driven authentication system. In addition, the touch-pad sensor 101 may also act as a Graphical User Interface (GUI) to display the scribbled character or word.

Further, the touch-gesture pattern in the switch mode may include various geometrical shapes and figures, while in the write pad mode, characters or a string including several characters may be entered. Therefore, the write-pad mode and the switch mode differ in terms of the applicable touch-gesture patterns.

In operation, the touch-pad sensor 101 generates a corresponding intermediate code in response to the touch-gesture pattern made thereon by the user. Specifically, the intermediate code may include numerical values, such as binary strings, which indicate the nature of the touch-gesture pattern made on the touch-pad sensor 101. In other words, each touch-gesture pattern may be associated with a unique intermediate code.

As discussed previously, the touch-gesture pattern under the switch mode and the write pad mode includes the application of directional movement and pressure. Accordingly, the corresponding intermediate code generated by the touch-pad sensor 101 includes a set of position coordinates corresponding to the directional movement and pressure data. Such pressure data indicates a pressure value. The set of position coordinates associated with a touch-gesture pattern are significant for analyzing the shape and size of the applied touch-gesture pattern to determine the corresponding actuate command associated with the touch-gesture pattern.

In case of the keypad mode of operation, the touch-gesture pattern is a point contact. Such type of touch-gesture pattern includes application of pressure without any directional movement. Accordingly, the corresponding intermediate code may include the position coordinates of the point of contact and the pressure value.
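By way of a non-limiting illustration, the intermediate code described above may be modeled as a set of position coordinates together with a pressure value (the names below are purely illustrative and do not appear in the specification):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class IntermediateCode:
    """Hypothetical model of the intermediate code generated by the
    touch-pad sensor (101): position coordinates plus a pressure value."""
    coordinates: List[Tuple[int, int]]  # one pair for a point contact; several for a sliding contact
    pressure: float                     # pressure exerted by the user (arbitrary units)

# A point contact (keypad mode) carries a single coordinate pair.
point = IntermediateCode(coordinates=[(12, 34)], pressure=0.8)

# A sliding contact (switch or write-pad mode) carries a trace of pairs.
slide = IntermediateCode(coordinates=[(0, 0), (5, 0), (10, 0)], pressure=0.6)
```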

As discussed before, numerical values in the form of the intermediate code are generated by the touch-pad sensor 101. Accordingly, these numerical values may be transmitted as a part of the intermediate code by the touch-pad sensor 101 to the microcontroller 105. It is understood that the intermediate code for every touch-gesture pattern may be formed as a result of permutation and combination of different position coordinates and pressure values.

Fig. 3 illustrates a detailed block diagram representation of the touch-gesture recognition system 100 of Fig. 1, in accordance with an embodiment of the present subject matter. As shown in Fig. 3, the microcontroller 105 of the recognition system 100 includes a pressure differentiation module 301, a movement detection module 305, a database 310, a gesture recognition module 320, and a display module 325. The pressure differentiation module 301, the movement detection module 305, and the gesture recognition module 320 are executable within the microcontroller 105.

The pressure differentiation module 301 and the movement detection module 305 are communicatively connected to the gesture recognition module 320. The gesture recognition module 320 is further communicatively connected to the database 310 and the display module 325. The display module 325 may include light emitting diode (LED) indicators, a liquid crystal display (LCD) monitor, bulbs, incandescent lamps, etc., based on the type and class of the vehicle.

As previously discussed, the touch-pad sensor 101 generates the intermediate code corresponding to the application of the touch-gesture pattern. This intermediate code is communicated to the microcontroller 105. The microcontroller 105 analyzes or decodes the data pattern generated by the touch-pad sensor 101.

For this purpose, the pressure differentiation module 301 determines the pressure value in the intermediate code of the touch-gesture pattern. Typically, the amount of pressure exerted by the user on the touch-pad sensor 101, while applying the touch-gesture pattern, is compared with a pre-defined threshold value by the pressure differentiation module 301. In this way, the pressure differentiation module 301 alerts a failure of operation if the applied pressure is below the threshold value.
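The threshold comparison performed by the pressure differentiation module 301 may be sketched, by way of non-limiting illustration, as follows (the threshold constant is an assumed placeholder; the actual value is implementation-defined):

```python
THRESHOLD_P = 0.5  # illustrative threshold value P; the actual value is implementation-defined

def pressure_ok(pressure, threshold=THRESHOLD_P):
    """Compare the applied pressure against the pre-defined threshold,
    as the pressure differentiation module (301) does; a pressure
    below the threshold is treated as a failure of operation."""
    return pressure >= threshold
```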

Further, the movement detection module 305 determines whether the contact made by the user on the touch-sensitive surface of the touch-pad sensor 101 is a point contact or a sliding contact. For this purpose, the movement detection module 305 analyzes the position coordinates corresponding to the intermediate code of the touch-gesture pattern. If the point contact is present in the touch-gesture pattern, then the corresponding intermediate code will include a single pair of X and Y coordinates corresponding to the position of the point contact in a two-dimensional X-Y plane.

However, if the sliding contact is present in the touch-gesture pattern, then the corresponding intermediate code will include more than one pair of X-Y coordinates. Accordingly, if there is more than one pair of position coordinates corresponding to any instantaneous intermediate code, then the movement detection module 305 calculates an intended directional movement of the corresponding touch-gesture pattern.

On the basis of the calculated directional movement, the movement detection module 305 further determines the shape and size of the corresponding touch-gesture pattern. This enables ascertaining whether the touch-gesture pattern is a geometrical figure (in case of the switch mode) or a linguistic character or string (in case of the write-pad mode). However, if a single position coordinate is present, then the movement detection module 305 declares the absence of any directional movement in the intermediate code. Accordingly, the corresponding touch-gesture pattern may be treated as a point contact, when the touch-pad sensor 101 is configured to operate in the keypad mode of operation.
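The point-versus-sliding determination made by the movement detection module 305 may be illustrated, without limitation, by the following sketch (function names are illustrative only):

```python
def classify_contact(coordinates):
    """Sketch of the movement detection module (305): a single pair of
    position coordinates indicates a point contact; more than one pair
    indicates a sliding contact with directional movement."""
    return "point" if len(coordinates) <= 1 else "sliding"

def directional_movement(coordinates):
    """Net displacement between the first and last sampled positions,
    from which the shape and size of the pattern may be inferred."""
    (x0, y0), (x1, y1) = coordinates[0], coordinates[-1]
    return (x1 - x0, y1 - y0)
```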

In one embodiment, the movement detection module 305 may be a computerized motion detection sensor known in the existing art. Such motion detection sensor may have an embedded electronic system for calculating the shape and size of the applied touch-gesture pattern. Likewise, the pressure differentiation module 301 may be a computerized pressure detection sensor known in the existing art.

The results obtained from the evaluation of the numerical values by the pressure differentiation module 301 and the movement detection module 305 are communicated to the gesture recognition module 320. Such results in the form of data are analyzed by the gesture recognition module 320. Based upon this analysis, the gesture recognition module 320 generates a composite data pattern. On the basis of this composite data pattern, the gesture recognition module 320 retrieves a related actuate command out of a plurality of actuate commands stored in the database 310.

The actuate commands may be stored in the database 310 as factory settings, during the manufacturing process. These commands may correspond to a variety of composite data patterns related to different touch-gesture patterns. As discussed previously, every touch-gesture pattern includes a unique set of numerical values in terms of the pressure value and the position coordinates. Accordingly, there is a unique actuate command related to each predefined touch-gesture pattern. In one embodiment, these commands may be configurable by the user.

Based upon the analysis of the gesture recognition module 320, the actuate command related to any instantaneous touch-gesture pattern is fetched from the database 310 and then executed. On the basis of this execution, the gesture recognition module 320 actuates the corresponding one or more devices in the vehicle. For this purpose, the gesture recognition module 320 may be operably connected to the devices.
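The retrieval of an actuate command from the database 310 may be sketched as a look-up, by way of non-limiting illustration (the pattern names and device operations below are hypothetical, drawn from the switch-mode examples given earlier):

```python
# Hypothetical look-up table standing in for the database (310); the
# pattern names and device operations are illustrative only.
ACTUATE_COMMANDS = {
    "straight_line": "headlight_on",
    "clockwise_circle": "right_indicator_on",
    "anticlockwise_circle": "left_indicator_on",
}

def fetch_actuate_command(pattern):
    """Retrieve the actuate command for a recognized touch-gesture
    pattern; None signals an unsuccessful analysis (failure alert)."""
    return ACTUATE_COMMANDS.get(pattern)
```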

Moreover, an alert or indication of a successfully analyzed touch-gesture pattern for execution of an associated operation is given by the display module 325. Similarly, in case of an incorrect touch-gesture pattern, a failure alert is exhibited by the display module 325. In addition, in case the touch-gesture pattern is intended for an already ongoing operation, an appropriate failure alert may be given by the display module 325. Such a failure alert occurs in spite of the successful analysis of the touch-gesture pattern. However, the failure alert in this case may be of a different type than the failure alert in case of the incorrect touch-gesture pattern.

In one embodiment, the gesture recognition module 320 is a software program, which is programmed to execute on the present touch-gesture recognition system 100 or any other computerized system. In addition, the database 310 is a portion of memory used by the recognition system 100 and has an in-built look up table or spreadsheet to facilitate the gesture recognition module 320 in searching and retrieving the appropriate actuate command for any touch-gesture pattern.

Fig. 4 depicts a data and control flow diagram representation 400 of the microcontroller 105 of the touch-gesture recognition system 100 of Fig. 1, in accordance with an embodiment of the present subject matter. In one implementation, the data and control flow diagram representation 400 can be employed by the microcontroller 105 to recognize the touch-gesture pattern, when the touch-pad sensor 101 operates under any of the aforementioned modes.

As explained under Fig. 3, data from the pressure differentiation module 301 and the movement detection module 305 is communicated to the gesture recognition module 320. On the basis of this received data, the gesture recognition module 320 facilitates generation of the composite data pattern.

At decision box 401, the pressure value associated with the intermediate code is compared with a predetermined threshold pressure value P. Thus, it is determined whether the pressure value of any instantaneous touch-gesture pattern is below or above the threshold value P. Any pressure below the threshold value may be regarded as null. Likewise, any pressure value above the threshold indicates a presence of the pressure data in the corresponding intermediate code.

The presence or absence of the pressure data in terms of the instantaneous touch-gesture pattern is accomplished at decision box 401. Simultaneously, at decision box 405, it is ascertained whether there is any directional movement included in the applied touch-gesture pattern. For this purpose, the position coordinates associated with the touch-gesture pattern are considered. On the basis of the presence of more than one pair of position coordinates, the shape and size of the instantaneous touch-gesture pattern may be ascertained by the movement detection module 305. As discussed before, if only a single pair of position coordinates is present in the data pattern, then the corresponding touch-gesture pattern is considered as a point contact having no directional movement.

Based on all possible results that may be generated at the decision boxes 401 and 405, the plurality of possible operations in terms of the devices may be classified into one or more operation sets 410. In one implementation, the plurality of operations may be classified into four operation sets 410. Each of these operation sets represents a broad category of operations for switching ON or OFF the devices. For example, Operation Set 1 may correspond to results having an absence of pressure data, whereas Operation Set 3 may correspond to results having an absence of more than one pair of position coordinates.
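The classification of decision-box results into operation sets may be illustrated, without limitation, by the following sketch (the specification fixes only the assignments for Operation Sets 1 and 3; the remaining assignment is an assumption):

```python
def select_operation_set(pressure_present, movement_present):
    """Hypothetical mapping of the results of decision boxes 401 and
    405 onto operation sets (410); only Operation Set 1 (absent
    pressure data) and Operation Set 3 (single coordinate pair, i.e.
    no directional movement) are fixed by the specification."""
    if not pressure_present:
        return 1  # no pressure data in the intermediate code
    if not movement_present:
        return 3  # point contact: a single pair of position coordinates
    return 2      # sliding contact: illustrative assignment only
```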

Based upon the results generated at the decision boxes 401 and 405 in terms of the instantaneous touch-gesture pattern, any one set of operations out of the multiple operation sets 410 may be shortlisted by the gesture recognition module 320. Simultaneously, the gesture recognition module 320 generates a composite data pattern based upon the pressure data and the type of contact (point or sliding) included within the results in terms of the instantaneous touch-gesture pattern.

Further, the gesture recognition module 320 retrieves a set of actuate commands out of all actuate commands from the database 310. This retrieved set of actuate commands corresponds to the aforementioned shortlisted operation set out of the operation sets 410.

Then, a most relevant actuate command is selected out of the retrieved set of commands, based upon the composite data pattern. The gesture recognition module 320 executes this most relevant actuate command to perform a particular operation or a number of operations in the vehicle.

As known in the existing art, application of an amount of pressure either greater than or equal to a predefined threshold value is necessary while performing any touch-gesture pattern. For any pressure value below the threshold value P at the decision box 401, the gesture recognition module 320 may actuate the display module 325 to alert a failure. Likewise, if a relevant actuate command corresponding to the instantaneous touch-gesture pattern is not found within the database 310, then the failure is alerted. Such non-retrieval of the relevant actuate command may be referred to as the unsuccessful analysis of the touch-gesture pattern, as mentioned under the description of Fig. 3.

In one embodiment, neural network algorithms may be employed by the gesture recognition module 320 for fetching different commands from the database 310. In case a variety of users use the same vehicle, neural network algorithms may additionally be utilized by the microcontroller 105 for storage of one or more user identifications.

Typically, various vehicle users using the same recognition system 100 have different ways of applying touch-gesture patterns on the touch-pad sensor 101. In other words, each user tends to apply touch-gesture patterns in his or her own characteristic manner. In order to interpret touch-gesture patterns of various characteristics, a user authentication may be performed at the beginning of a driving session. Such user authentication may be performed with the help of stored profiles of the users who generally drive the vehicle. Moreover, the information related to characteristic gestures exhibited by these users may be pre-entered into the microcontroller 105. In another implementation, all of the users may use a common pre-defined gesture for a given operation.

Further, the database 310 of the recognition system 100 may also be employed for storage of other parameters apart from the actuate commands. As an example, if the keypad mode of the touch-pad sensor 101 is implemented as a password driven authentication means, then the database 310 may also be employed to store the password in its memory. Accordingly, the keypad mode of operation may be employed to switch ON or OFF any device within the vehicle by password protection. In the present embodiment, the touch-pad sensor 101 in the keypad mode is segmented into different zones and may accordingly have different numerical digits engraved in different zones. In another implementation, alphanumeric and special characters may be engraved on the touch-pad sensor 101, and accordingly a touch-gesture pattern may include a sequence of such alphanumeric or special characters. For example, a user may use a sequence of letters and numbers, such as "A12" or "A&B", as his or her password on the touch-pad sensor 101 configured in the keypad mode.

Further, in one embodiment, the user may customize the database 310 through any known means existing in the art. The user may associate the actuate commands with various touch-gesture patterns of his or her own choice. Such user-defined customization of the database 310 may be referred to as a configuration of the microcontroller 105 by the user.

In yet another embodiment, the touch-pad sensor 101 may be placed on either side of the handlebar of the two-wheeled vehicle. Likewise, the touch-pad sensor 101 may be placed on a dashboard of a three-wheeled vehicle. In case of a four-wheeled vehicle, the touch-pad sensor 101 may be placed either on the dashboard or on the steering wheel.

In yet another embodiment, the touch-gesture patterns may be made on the touch-pad sensor 101 by hand or by using a stylus pen. Moreover, more than one touch-pad sensor 101 may be employed in the same vehicle.

In yet another embodiment, the devices provided within the vehicle and activated by the touch-gesture pattern may include the horn, the high beam and low beam lamps, the dipper, the head and tail lights, the indicators, the air-conditioner, the music player, the window switch, and the door lock.

The previously described embodiments of the subject matter, and equivalents thereof, have many advantages, including those described below.

The touch-gesture recognition system 100 is a centralized switching system that activates or deactivates various devices in the vehicle. Accordingly, the need for numerous independent switches, one for each device, is eliminated. Hence, the touch-gesture recognition system 100 removes the confusion caused to the user by having to operate a number of switches simultaneously while driving the vehicle. In addition, the time elapsed between use of a switch and performance of the concerned operation is minimized.

The touch-pad sensor 101 occupies little space and can be placed practically anywhere within the vehicle. Moreover, the present touch-pad sensor 101 is not prone to mechanical wear and tear and can be used reliably for years. In addition, the touch-gesture recognition system 100 may be reconfigured with the help of software programs known in the existing art.

Although the subject matter has been described with reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the subject matter, will become apparent to persons skilled in the art upon reference to the description of the subject matter. It is therefore contemplated that such modifications can be made without departing from the spirit or scope of the present subject matter as defined.

I/We claim:

1. A two-wheeled vehicle including a touch-gesture pattern recognition system (100), the touch-gesture pattern recognition system (100) comprising:

a touch-pad sensor (101) to receive at least one touch-gesture pattern; and

a microcontroller (105) connected to the touch-pad sensor (101) to analyze the touch-gesture patterns to execute at least one actuate command for operating one or more devices of the two-wheeled vehicle.

2. The two-wheeled vehicle as claimed in claim 1, wherein the devices comprise one or more of a horn, a high beam lamp, a low beam lamp, a dipper, a head light, a tail light and an indicator.

3. The two-wheeled vehicle as claimed in claim 1, wherein the microcontroller (105) comprises:

a pressure differentiation module (301) to evaluate pressure values of the touch-gesture patterns;

a movement detection module (305) to determine directional movement associated with the touch-gesture patterns; and

a gesture recognition module (320) to identify the actuate commands corresponding to the touch-gesture patterns based on the pressure values and the directional movement.

4. The two-wheeled vehicle as claimed in claim 3, wherein the microcontroller (105) further comprises a database (310) to store the actuate commands.

5. The two-wheeled vehicle as claimed in claim 1, wherein the microcontroller (105) is coupled to a display module (325) to depict a validity of the touch-gesture patterns.

6. The two-wheeled vehicle as claimed in claim 1, wherein the touch-pad sensor (101) is operated under one or more of a switch mode, a keypad mode, and a writing-pad mode.

7. The two-wheeled vehicle as claimed in claim 1, wherein the touch-gesture pattern recognition system (100) is located in a handlebar assembly of the two-wheeled vehicle.

8. A four-wheeled vehicle including a touch-gesture pattern recognition system (100), the touch-gesture pattern recognition system (100) comprising:

a touch-pad sensor (101) to receive at least one touch-gesture pattern; and

a microcontroller (105) connected to the touch-pad sensor (101) to analyze the touch-gesture patterns to execute at least one actuate command for operating one or more devices of the four-wheeled vehicle, the microcontroller comprising:

a pressure differentiation module (301) to evaluate pressure values of the touch-gesture patterns;

a movement detection module (305) to determine directional movement associated with the touch-gesture patterns; and

a gesture recognition module (320) to identify the actuate commands corresponding to the touch-gesture patterns based on the pressure values and the directional movement.

9. The four-wheeled vehicle as claimed in claim 8, wherein the devices comprise one or more of a horn, a high beam lamp, a low beam lamp, a dipper, a head light, a tail light, indicators, an air-conditioner, a music player, a window switch, and a door lock.

10. The four-wheeled vehicle as claimed in claim 8, wherein the touch-gesture pattern recognition system (100) is located in a steering-wheel assembly of the four-wheeled vehicle.

11. A method of operating a touch-gesture recognition system (100) in a vehicle, the method comprising:

receiving at least one touch-gesture pattern;

analyzing the touch-gesture patterns to retrieve at least one actuate command corresponding to the touch-gesture patterns; and

executing the actuate commands to operate one or more devices of the vehicle.

12. The method as claimed in claim 11, wherein the analyzing comprises configuring a microcontroller (105) to associate the actuate commands with the touch-gesture patterns.

Documents


Application Documents

# Name Date
1 276-CHE-2009 FORM-1 07-05-2009.pdf 2009-05-07
2 276-CHE-2009 POWER OF ATTORNEY 07-05-2009.pdf 2009-05-07
3 276-CHE-2009 FORM -5 04-02-2010.pdf 2010-02-04
4 276-CHE-2009 FORM -3 04-02-2010.pdf 2010-02-04
5 276-CHE-2009 FORM -2 04-02-2010.pdf 2010-02-04
6 276-CHE-2009 FORM -1 04-02-2010.pdf 2010-02-04
7 276-CHE-2009 DRAWINGS 04-02-2010.pdf 2010-02-04
8 276-CHE-2009 DESCRIPTION (COMPLETE) 04-02-2010.pdf 2010-02-04
9 276-CHE-2009 CORRESPONDENCE OTHERS 04-02-2010.pdf 2010-02-04
10 276-CHE-2009 CLAIMS 04-02-2010.pdf 2010-02-04
11 276-CHE-2009 ABSTRACT 04-02-2010.pdf 2010-02-04
12 276-CHE-2009 FORM -18 08-02-2010.pdf 2010-02-08
13 0276-che--2009 form-3.pdf 2011-09-02
14 0276-che--2009 form-1.pdf 2011-09-02
15 0276-che--2009 drawings.pdf 2011-09-02
16 0276-che--2009 correspondence-others.pdf 2011-09-02
17 276-CHE-2009 DESCRIPTION (PROVISIONAL).pdf 2011-12-09
18 abstract276-CHE-2009.jpg 2012-03-05
19 Correspondence [18-08-2015(online)].pdf 2015-08-18
20 Description(Complete) [18-08-2015(online)].pdf 2015-08-18
21 Examination Report Reply Recieved [18-08-2015(online)].pdf 2015-08-18
22 Other Document [11-03-2016(online)].pdf 2016-03-11
23 Petition Under Rule 137 [11-03-2016(online)].pdf 2016-03-11
24 Abstract [14-03-2016(online)].pdf 2016-03-14
25 Claims [14-03-2016(online)].pdf 2016-03-14
26 Correspondence [14-03-2016(online)].pdf 2016-03-14
27 Description(Complete) [14-03-2016(online)].pdf 2016-03-14
28 Examination Report Reply Recieved [14-03-2016(online)].pdf 2016-03-14
29 OTHERS [14-03-2016(online)].pdf 2016-03-14
30 276-CHE-2009_EXAMREPORT.pdf 2016-07-02
31 Other Patent Document [23-09-2016(online)].pdf 2016-09-23
32 276-CHE-2009-HearingNoticeLetter.pdf 2018-07-02
33 276-CHE-2009-FORM-26 [31-07-2018(online)].pdf 2018-07-31
34 276-CHE-2009-Correspondence to notify the Controller (Mandatory) [31-07-2018(online)].pdf 2018-07-31
35 276-CHE-2009-Correspondence to notify the Controller (Mandatory) [03-08-2018(online)].pdf 2018-08-03