Abstract: System for Detecting User Activity and a Method Thereof
The present invention relates to a system (100) for detecting user activity. The system (100) includes one or more sensors (102) attached to a wearable device of a user of a vehicle, the one or more sensors (102) being configured to detect a movement of the user. The system (100) further includes a control unit (104) communicably connected to the one or more sensors (102). The control unit (104) is configured to receive data related to the movement of the user from the one or more sensors (102), determine a plurality of gestures based on the received data, and enable or disable a plurality of functionalities based on the determination of the plurality of gestures.
Reference: Figure 1
Description: FIELD OF THE INVENTION
[001] The present invention relates to a system for detecting user activity.
BACKGROUND OF THE INVENTION
[002] Typically, in existing vehicles, and more particularly in their infotainment systems, only a few switches are available for enabling a feature of the vehicle. The user has to execute a series of mechanical presses of a selected switch to enable the feature. The user is therefore forced to remember these mechanical operations and the mapping of the switches, which is not intuitive. The user is also forced to release the handlebar and actuate the switches manually. This can lead to additional distraction for the user and an increased safety risk.
[003] Further, the user may attempt to give a voice instruction in place of manually pressing the switch to enable the feature of the vehicle. Often, the user faces difficulty in giving voice instructions, as the infotainment system fails to recognise the user's voice in scenarios such as rain or high vehicle speed. Voice inputs are often misheard or misinterpreted, which can cause frustration to the user.
[004] In order to eliminate the above existing problems, nowadays the user can give instructions via a helmet-mounted user interface to enable certain features of the vehicle by way of hand gestures. However, the user has to take his hands off the handlebar to give instructions using hand gestures over the helmet user interface, which inadvertently increases the safety risk to the user.
[005] Thus, there is a need in the art for a system which can address at least the aforementioned problems.
SUMMARY OF THE INVENTION
[006] In one aspect, the present invention is directed towards a system for detecting user activity. The system includes one or more sensors attached to a wearable device of a user of a vehicle. The one or more sensors are configured to detect a movement of the user. The system includes a control unit which is communicably connected to the one or more sensors. The control unit is configured to receive data related to the movement of the user from the one or more sensors. The control unit is configured to determine a plurality of gestures based on the data received from the one or more sensors. Further, the control unit is configured to enable or disable a plurality of functionalities based on the determination of the plurality of gestures.
[007] In an embodiment of the invention, the system includes a communication unit. The communication unit is configured to communicate the determined plurality of gestures to an instrument cluster of the vehicle. The communication unit is also configured to communicate the enabling or disabling of the plurality of functionalities based on the determination of the plurality of gestures.
[008] In an embodiment of the invention, the wearable device comprises any one of a headgear, shoes, gloves and an ankle/foot device.
[009] In another embodiment of the invention, the one or more sensors are disposed in any one of a forehead area, a side area, a bottom area, a chin area and an ear lobe area of an interior surface of a headgear.
[010] In a further embodiment of the invention, the headgear comprises a shell having an exterior surface and an interior surface. The headgear includes a visor which is connected to the shell.
[011] In yet another embodiment of the invention, the control unit is any one of the instrument cluster of the vehicle, a microprocessor and a remote device.
[012] In another embodiment of the invention, the one or more sensors are any one of an accelerometer, a gyroscopic sensor, or a combination thereof.
[013] In another aspect, the present invention is directed towards a method for detecting user activity. The method includes the step of detecting, by one or more sensors, a movement of the user. The method further includes the step of receiving, by a control unit, data related to the movement of the user from the one or more sensors. The method further includes the step of determining, by the control unit, a plurality of gestures based on the data received from the one or more sensors. The method further includes the step of enabling or disabling, by the control unit, a plurality of functionalities based on the determination of the plurality of gestures.
[014] In an embodiment of the invention, the method includes the steps of communicating, by a communication unit, the determined plurality of gestures to an instrument cluster of the vehicle and communicating the enabling or disabling of the plurality of functionalities based on the determination of the plurality of gestures.
[015] In an embodiment of the invention, the one or more sensors are disposed in any one of a forehead area, a side area, a bottom area, a chin area and an ear lobe area of an interior surface of a wearable device, the wearable device being any one of a headgear, shoes, gloves and an ankle/foot device.
BRIEF DESCRIPTION OF THE DRAWINGS
[016] Reference will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that the scope of the invention is not intended to be limited to these particular embodiments.
Figure 1 illustrates a block diagram of a system for detecting user activity, in accordance with an embodiment of the invention.
Figure 2 illustrates another block diagram of the system for detecting user activity, in accordance with an embodiment of the invention.
Figure 3 illustrates potential locations where the sensor for detecting user activity is mounted in a headgear, in accordance with an embodiment of the invention.
Figure 4 illustrates a method flow diagram for detecting user activity, in accordance with an embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION
[017] Various features and embodiments of the present invention will be discernible from the further description thereof, set out hereunder.
[018] The present invention generally relates to a system for detecting user activity. In the ensuing exemplary embodiments, the vehicle (not shown) is a motorcycle. However, it is contemplated that the disclosure in the present invention may be applied to any automobile, such as a scooter or any other saddle-type vehicle, capable of accommodating the present subject matter without defeating the scope of the present invention.
[019] In an embodiment, the vehicle may be a two-wheeled vehicle, a three-wheeled vehicle, a four-wheeled vehicle or a multi-wheeled vehicle. The vehicle may be powered by an internal combustion engine, an electric motor through one or more batteries, or a hybrid-electric motor, as per requirement. It should be understood that the scope of the present invention is not limited to the illustrated two-wheeled vehicle having the internal combustion engine.
[020] Figure 1 illustrates a block diagram of the system 100 for detecting user activity, in accordance with an embodiment of the invention. The system 100 includes one or more sensors 102. The one or more sensors 102 are configured to detect a movement of the user. The one or more sensors 102 are attached to a wearable device (not shown) of the user of the vehicle. In an embodiment, the wearable device includes any one of, or a combination of, a headgear, shoes, gloves and an ankle/foot device (not shown). The wearable device is mounted on the body of the user. In a non-limiting example, when the wearable device is the headgear, the one or more sensors 102 are disposed in any one of a forehead area, a side area, a bottom area, a chin area and an ear lobe area of an interior surface of the headgear (as shown in Figure 3). The headgear includes a shell having an exterior surface and an interior surface. The headgear also includes a visor connected to the shell. In another embodiment, the one or more sensors 102 are any one of an accelerometer, a gyroscopic sensor, or a combination thereof.
[021] The system 100 further includes a control unit 104. The control unit 104 is communicably connected to the one or more sensors 102. In a non-limiting example, the control unit 104 is communicably connected to the one or more sensors 102 via Bluetooth, Wi-Fi, or any other wireless or wired communication system. In another embodiment, the control unit 104 is any one of an instrument cluster 110 of the vehicle, a microprocessor, a mobile phone, a tablet, a personal digital assistant (PDA) and a remote device.
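Purely by way of a non-limiting illustration, the sketch below shows how a control unit might receive IMU data from the headgear over Bluetooth Low Energy. The device address, characteristic UUID and packet layout are assumptions made for this sketch (they are not specified in the disclosure), and the `bleak` Python library is used here only as one possible implementation choice.

```python
# Hypothetical sketch: receiving IMU data from the headgear over BLE.
# The address, characteristic UUID and packet layout are illustrative assumptions.
import asyncio
import struct

from bleak import BleakClient

HEADGEAR_ADDRESS = "AA:BB:CC:DD:EE:FF"                   # hypothetical device address
IMU_CHAR_UUID = "0000fff1-0000-1000-8000-00805f9b34fb"   # hypothetical GATT characteristic

def on_imu_packet(_sender, payload: bytearray) -> None:
    # Assumed packet layout: six little-endian floats (ax, ay, az, gx, gy, gz).
    ax, ay, az, gx, gy, gz = struct.unpack("<6f", payload)
    print(f"accel=({ax:.2f}, {ay:.2f}, {az:.2f}) gyro=({gx:.2f}, {gy:.2f}, {gz:.2f})")

async def main() -> None:
    async with BleakClient(HEADGEAR_ADDRESS) as client:
        # Subscribe to IMU notifications and stream samples for 30 seconds.
        await client.start_notify(IMU_CHAR_UUID, on_imu_packet)
        await asyncio.sleep(30)

asyncio.run(main())
```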
[022] Further, the control unit 104 is configured to receive data related to the movement of the user from the one or more sensors 102. Once the control unit 104 receives the data, the control unit 104 is configured to determine a plurality of gestures based on the data received from the one or more sensors 102. In a non-limiting example, the gestures are made by head movements. The gestures include, but are not limited to, a series of head movements performed in a particular sequence, as well as traditional head nods and side-to-side movements of the head. In a non-limiting example, the control unit 104 determines a nod of the head as a 'Yes' gesture and a side-to-side movement of the head as a 'No' gesture. In an embodiment, vehicle information and road information are taken into consideration by the control unit 104 to ensure that the head movement is intentional, thereby ruling out false positives and false negatives. Further, in an embodiment, the system 100 uses Artificial Intelligence (AI) models to rule out false positives and false negatives. In an embodiment, the present invention involves the development of an Android application and an application software layer on an Advanced RISC Machine (ARM) Cortex platform, both aimed at enhancing headgear safety through gesture detection. In the Android application, modifications to the Bluetooth protocol enable seamless communication with the system 100. Additionally, the application incorporates features such as auto-connect functionality for wireless connectivity with the headgear and the instrument cluster 110, along with telephony state detection and a control system for added user convenience. These features are crucial for ensuring smooth interaction between the application and the headgear, enhancing user experience and safety. On the ARM Cortex platform, the application software layer is designed to interface with an Inertial Measurement Unit (IMU) (not shown) and to implement a machine learning (ML) model for gesture recognition. Implemented in the C programming language, this layer handles the integration with the hardware components and communicates with the mobile application upon detecting gestures from the wearable device. The ML model, powered by TensorFlow Lite, receives as inputs the accelerometer and gyroscope data collected by the IMU. These inputs are processed to determine whether specific gestures have been detected, contributing to the overall safety and functionality of the system. Training the AI model is a crucial aspect of the project, accomplished using TinyML and the collaborative Google Colab platform. The model architecture is a Sequential model, chosen for its lightweight nature and optimized for edge devices, ensuring efficient performance with minimal maintenance requirements; however, other model architectures can also be used. The model is trained with 119 samples per instance, each sample consisting of accelerometer and gyroscope data, so that the system 100 learns to recognize gestures accurately. The input tensor, with six inputs representing the accelerometer and gyroscope data, feeds into the model, while the output tensor categorizes the detected gestures into binary outcomes ("yes" or "no"). This training process ensures the reliability and effectiveness of the system 100, ultimately enhancing the safety and usability of the headgear.
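By way of a non-limiting illustration, the following minimal training sketch is consistent with the pipeline described above: a Sequential model taking 119 IMU samples of six channels per instance, producing a binary "yes"/"no" outcome, and converted to TensorFlow Lite for the edge device. The layer sizes, training data and file name are assumptions for illustration only.

```python
# Illustrative sketch of the described training pipeline (layer sizes assumed).
import numpy as np
import tensorflow as tf

SAMPLES_PER_GESTURE = 119   # per the description: 119 samples per instance
CHANNELS = 6                # ax, ay, az, gx, gy, gz from the IMU

# Placeholder data; in practice this comes from recorded gesture captures.
x_train = np.random.rand(200, SAMPLES_PER_GESTURE * CHANNELS).astype(np.float32)
y_train = np.random.randint(0, 2, size=(200,)).astype(np.float32)  # 1 = "yes", 0 = "no"

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SAMPLES_PER_GESTURE * CHANNELS,)),
    tf.keras.layers.Dense(50, activation="relu"),
    tf.keras.layers.Dense(15, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary "yes"/"no" outcome
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=20, batch_size=16)

# Convert to TensorFlow Lite for deployment on the edge device.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open("gesture_model.tflite", "wb") as f:
    f.write(tflite_model)
```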
[023] Further, the control unit 104 is configured to enable or disable a plurality of functionalities based on the determination of the plurality of gestures. In an embodiment, the plurality of functionalities includes, but is not limited to, accepting/rejecting incoming calls, enabling/disabling the music system, increasing/decreasing the volume level, changing songs, cruise control, navigation control, audio/video output, temperature adjustments, headlight control and parking assistance. In a non-limiting example, the control unit 104 is configured to accept an incoming call when the 'Yes' gesture is determined by the control unit 104 and to reject the incoming call when the 'No' gesture is determined by the control unit 104.
[024] Further, in another non-limiting example, nodding the head in a specific manner (e.g., a double nod) could start the playing of music, while a different nodding pattern might pause or stop it. Furthermore, in a non-limiting example, tilting the head to the right or left can be interpreted as a command to increase or decrease the volume, and a side-to-side head shake can be recognized as a command to skip to the next or previous song. Head movements can also be used to detect fatigue in the user and to tune the vehicle features accordingly. For example, the user can customize a specific head-nod pattern to signify fatigue, upon which the vehicle gradually reduces its speed to ensure the safety of the user. The user can thus define a unique head-movement combination, such as a double nod, to trigger the system to enable certain features, and can customize the gestures for different functionalities based on his or her requirements. Further, a gradual movement of an ankle or a finger can also be determined as a gesture by the control unit 104 based on the requirement of the user.
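Purely for illustration, the sketch below shows a user-customizable mapping from determined gestures to vehicle functionalities of the kind described above. The gesture names and action callbacks are hypothetical placeholders, not part of the disclosure.

```python
# Hypothetical gesture-to-functionality dispatch table (all names illustrative).
from typing import Callable, Dict

def accept_call() -> None: print("Incoming call accepted")
def reject_call() -> None: print("Incoming call rejected")
def volume_up() -> None: print("Volume increased")
def reduce_speed() -> None: print("Fatigue pattern detected: gradually reducing speed")

# The user can remap entries, e.g. assign a double nod to a chosen feature.
GESTURE_ACTIONS: Dict[str, Callable[[], None]] = {
    "nod": accept_call,                    # 'Yes' gesture
    "shake": reject_call,                  # 'No' gesture
    "tilt_right": volume_up,
    "fatigue_nod_pattern": reduce_speed,   # user-defined fatigue pattern
}

def handle_gesture(gesture: str) -> None:
    action = GESTURE_ACTIONS.get(gesture)
    if action is not None:
        action()

handle_gesture("nod")  # prints "Incoming call accepted"
```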
[025] Further, in an embodiment, the system 100 includes a communication unit 106. The communication unit 106 is communicatively connected to the control unit 104. The communication unit 106 is configured to communicate the determined plurality of gestures from the control unit 104 to the instrument cluster 110 of the vehicle. The communication unit 106 is also configured to communicate the enabling or disabling of the plurality of functionalities, based on the determination of the plurality of gestures, from the control unit 104 to the instrument cluster 110 of the vehicle. As shown in Figure 2, in an embodiment, the communication unit 106 is configured to communicate with the instrument cluster 110 via a mobile application 108 installed in a mobile phone of the user, when the system 100 is communicatively connected to the mobile phone of the user.
[026] Further, Figure 4 illustrates a method 400 flow diagram for detecting user activity, in accordance with an exemplary embodiment of the present invention.
[027] At step 402, the one or more sensors 102 detect the movement of the user. The one or more sensors 102 are attached to the wearable device (not shown) of the user of the vehicle and are communicably connected to the control unit 104. At step 404, the control unit 104 receives the data related to the movement of the user from the one or more sensors 102. At step 406, the control unit 104 determines the plurality of gestures based on the data received from the one or more sensors 102. In a non-limiting example, the gestures are made by head movements. The gestures include, but are not limited to, a series of head movements performed in a particular sequence, as well as traditional head nods and side-to-side movements of the head. In a non-limiting example, the control unit 104 determines a nod of the head as the 'Yes' gesture and a side-to-side movement of the head as the 'No' gesture.
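As a non-limiting illustration of the determination step, the sketch below runs the trained TensorFlow Lite model on one window of IMU data. The on-target implementation described above is written in C on the ARM Cortex platform; Python is used here only for readability, and the model file name and decision threshold are assumptions.

```python
# Illustrative inference sketch using the TensorFlow Lite interpreter.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="gesture_model.tflite")  # assumed file name
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# One window of IMU data: 119 samples x 6 channels, flattened as in training.
window = np.random.rand(1, 119 * 6).astype(np.float32)  # placeholder data

interpreter.set_tensor(input_details[0]["index"], window)
interpreter.invoke()
score = float(interpreter.get_tensor(output_details[0]["index"])[0][0])

# Assumed decision threshold of 0.5 for the binary 'yes'/'no' outcome.
gesture = "yes" if score >= 0.5 else "no"
print(f"score={score:.2f} -> gesture '{gesture}'")
```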
[028] At step 408, the control unit 104 enables or disables the plurality of functionalities based on the determination of the plurality of gestures. In an embodiment, the plurality of functionalities includes, but is not limited to, accepting/rejecting incoming calls, enabling/disabling the music system, increasing/decreasing the volume level, changing songs, cruise control, navigation control, audio/video output, temperature adjustments, headlight control and parking assistance. In a non-limiting example, the control unit 104 is configured to accept an incoming call when the 'Yes' gesture is determined by the control unit 104 and to reject the incoming call when the 'No' gesture is determined by the control unit 104.
[029] In an embodiment, at step 410, the communication unit 106 communicates the determined plurality of gestures from the control unit 104 to the instrument cluster 110 of the vehicle. The communication unit 106 also communicates the enabling or disabling of the plurality of functionalities based on the determination of the plurality of gestures by the control unit 104. In an embodiment, the communication unit 106 is configured to communicate with the instrument cluster 110 via the mobile application 108 installed in the mobile phone of the user, when the system 100 is communicatively connected to the mobile phone of the user.
[030] Advantageously, the present invention provides a system for detecting user activity that uses the movement of the user to determine a gesture and enables or disables a feature of the vehicle based on that gesture. Thus, the user does not have to take his hands off the handlebar to give an instruction to enable or disable certain features of the vehicle, which increases the safety of the user. The present invention also allows the user to configure different gestures for different features of the vehicle as per the user's requirements, ensuring ease of operability.
[031] Further, the present invention is easy to assemble and can be incorporated into a wearable device of the user. The incorporation of the present invention into a wearable device ensures the comfort of the user and does not present any hindrance to the user while riding the vehicle, making the present invention commercially attractive.
[032] In light of the abovementioned advantages and the technical advancements provided by the disclosed method and system, the claimed steps as discussed above are not routine, conventional, or well understood in the art, as the claimed steps provide solutions to the existing problems in conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the system itself, as the claimed steps provide a technical solution to a technical problem.
[033] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[034] While the present invention has been described with respect to certain embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.
List of Reference Numerals
100 - System
102 - Sensor
104 – Control unit
106 – Communication unit
108 – Mobile Application
110 – Instrument cluster
400 – Method
Claims: WE CLAIM:
1. A system (100) for detecting user activity, the system (100) comprising:
one or more sensors (102) attached to a wearable device of a user of a vehicle, the one or more sensors (102) being configured to detect a movement of the user; and
a control unit (104) being communicably connected to the one or more sensors (102), the control unit (104) being configured to:
receive data related to the movement of the user from the one or more sensors (102);
determine a plurality of gestures based on the data received from the one or more sensors (102); and
enable or disable a plurality of functionalities based on the determination of the plurality of gestures.
2. The system (100) as claimed in claim 1, comprising a communication unit (106), the communication unit (106) being configured to communicate, to an instrument cluster (110) of the vehicle, the determined plurality of gestures and the enabling or disabling of the plurality of functionalities based on the determination of the plurality of gestures.
3. The system (100) as claimed in claim 1, wherein the wearable device comprises any one of a headgear, shoes, gloves and an ankle/foot device.
4. The system (100) as claimed in claim 1, wherein the one or more sensors (102) are disposed in any one of a forehead area, a side area, a bottom area, a chin area and an ear lobe area of an interior surface of a headgear.
5. The system (100) as claimed in claim 4, wherein the headgear comprises:
a shell having an exterior surface and an interior surface; and
a visor connected to the shell.
6. The system (100) as claimed in claim 1, wherein the control unit (104) is any one of the instrument cluster (110) of the vehicle, a microprocessor and a remote device.
7. The system (100) as claimed in claim 1, wherein the one or more sensors (102) are any one of an accelerometer, a gyroscopic sensor, or a combination thereof.
8. A method (400) for detecting user activity, the method (400) comprising the steps of:
detecting (402), by one or more sensors (102), a movement of the user;
receiving (404), by a control unit (104), data related to the movement of the user from the one or more sensors (102);
determining (406), by the control unit (104), a plurality of gestures based on the data received from the one or more sensors (102); and
enabling or disabling (408), by the control unit (104), a plurality of functionalities based on the determination of the plurality of gestures.
9. The method (400) as claimed in claim 8, comprising the step of: communicating (410), by a communication unit (106), to an instrument cluster (110) of the vehicle, the determined plurality of gestures and the enabling or disabling of the plurality of functionalities based on the determination of the plurality of gestures.
10. The method (400) as claimed in claim 8, wherein the one or more sensors (102) are disposed in any one of a forehead area, a side area, a bottom area, a chin area and an ear lobe area of an interior surface of a wearable device, the wearable device being any one of a headgear, shoes, gloves and an ankle/foot device.
Dated this 13th day of March 2024
TVS MOTOR COMPANY LIMITED
By their Agent & Attorney
(Nikhil Ranjan)
of Khaitan & Co
Reg No IN/PA-1471