Abstract: The present invention relates to a system (100) and method (200) for monitoring a user in a vehicle. The system (100) has one or more image sensors (110) for capturing real-time images of the user riding the vehicle. The system (100) has a processing unit (120) configured to receive the real-time images of the user from the one or more image sensors (110) and determine one or more user activities based on the real-time images received from the one or more image sensors (110). The processing unit (120) is configured to determine whether the one or more user activities correspond to a set of predefined user conditions. The system (100) has a feedback module (130) configured to receive an input from the processing unit (120) if the one or more user activities correspond to the set of predefined user conditions. The feedback module (130) is configured to generate an output command. Reference: Figure 1
Description:
FIELD OF THE INVENTION
[001] The present invention relates to monitoring of a user in a vehicle. More particularly, the present invention relates to a system and a method for monitoring the user in the vehicle.
BACKGROUND OF THE INVENTION
[002] With the advancement in vehicle technologies, there is greater focus on enhancing user assistance and user safety systems and on improving the overall riding experience of the user. In existing designs, user safety and monitoring systems have been explored only to a very limited extent, leading to several problems affecting the safety of the user. Conventionally, it is not possible to monitor user behaviour and detect riding patterns in real-time, and therefore the overall performance of the user cannot be evaluated. Absent or insufficient monitoring of users may lead to safety concerns including but not limited to drowsy riding, distraction, impaired riding, medical emergency, aggressive riding, and the like.
[003] There is a growing need to implement a user monitoring system tailored to the riding skills of the user, to ensure the safety of the user and reduce the risk of accidents. However, existing solutions do not focus on monitoring the user, as implementing advanced user monitoring systems in vehicles requires significant investment in research, development, and integration. There is a need for a user monitoring system that is cost-friendly and affordable to the user. Also, a saddle type vehicle faces technical challenges in implementing a user monitoring system. For example, the saddle type vehicle is more susceptible to vibrations and movements due to road conditions, which impacts the accuracy and reliability of the sensors and camera. Similarly, the saddle type vehicle is more susceptible to noise, which may impact the overall efficiency of the user monitoring system. Further, saddle type vehicles also have space constraints and hence cannot accommodate a bulky user monitoring system.
[004] Therefore, a compact design is required that integrates user-specific customisation such as fatigue detection, drowsiness detection, distraction detection, abnormal riding recognition, bad pose detection, and the like. Users of personal vehicles now require their vehicle to be more customised to their requirements and expect the vehicle to adapt to their riding style so as to monitor user behaviour while riding the vehicle.
[005] The automation also needs to allow the vehicle to make informed decisions on the safety of the user, in addition to improving the overall user experience. In existing systems, a user monitoring system is absent due to various constraints such as cost, technical challenges, space, and the like. Further, the existing systems do not allow the vehicle to intervene in the middle of a trip, which severely restricts the user experience and raises significant safety concerns.
[006] Thus, there is a need in the art for a system and method for monitoring a user in a vehicle, which addresses at least the aforementioned problems.
SUMMARY OF THE INVENTION
[007] In one aspect, the present invention relates to a system for monitoring a user in a vehicle. The system has one or more image sensors for capturing real-time images of the user riding the vehicle. The system further has a processing unit configured to receive the real-time images of the user from the one or more image sensors during riding condition of the vehicle. The processing unit is further configured to determine one or more user activities based on the real-time images received from the one or more image sensors. The processing unit is then configured to determine whether the one or more user activities correspond to a set of predefined user conditions. The system further has a feedback module configured to receive an input from the processing unit if the one or more user activities correspond to the set of predefined user conditions. The feedback module is then configured to generate an output command.
[008] In an embodiment of the invention, the output command generated by the feedback module includes at least one of an indication to the user or activating one or more rider assistance and comfort functions.
[009] In a further embodiment of the invention, the processing unit has one or more modules, the one or more modules being configured to receive real-time images of the user for determining the one or more user activities during riding condition of the vehicle.
[010] In a further embodiment of the invention, the processing unit has an analysis module, the analysis module being configured to determine one or more of a set of predefined user conditions based on the frequency of the one or more user activities during a predefined time, wherein the set of predefined user conditions includes one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
[011] In a further embodiment of the invention, the system has an illumination sensor unit, the illumination sensor unit is in communication with the processing unit and is configured to detect a level of ambient light around the vehicle, and the processing unit is configured to switch on a vehicle lighting system or an illuminator unit if the ambient light is below a predetermined threshold value of ambient light.
[012] In a further embodiment of the invention, the system has an auxiliary sensor unit, the auxiliary sensor unit is in communication with the processing unit and is configured to detect one or more vehicle parameters. The auxiliary sensor unit is configured to determine whether the one or more vehicle parameters are below a first predetermined threshold. The processing unit is further configured to switch off the one or more image sensors and the illumination sensor unit if the one or more vehicle parameters are below the first predetermined threshold.
[013] In another aspect, the present invention relates to a method for monitoring a user in a vehicle. The method has the steps of capturing, by one or more image sensors, real-time images of the user riding the vehicle; receiving, by a processing unit, the real-time images of the user riding the vehicle captured by the one or more image sensors; determining, by the processing unit, one or more user activities based on the real-time images received from the one or more image sensors; determining, by the processing unit, whether the one or more user activities correspond to a set of predefined user conditions; and generating, by a feedback module, an output command if the one or more user activities correspond to the set of predefined user conditions.
[014] In an embodiment of the invention, the output command generated by the feedback module has at least one of an indication to the user or activating one or more rider assistance and comfort functions.
[015] In a further embodiment of the invention, the one or more modules of the processing unit are configured to receive real-time images of the user for determining the one or more user activities during riding condition of the vehicle.
[016] In a further embodiment of the invention, the processing unit has an analysis module, the analysis module being configured to determine one or more of a set of predefined user conditions based on the frequency of the one or more user activities during a predefined time, wherein the set of predefined user conditions includes one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
[017] In a further embodiment of the invention, the method further has the steps of collecting a data set in relation to one or more activities of the user; processing the data set for further analysis by filtering and transforming the data set; annotating the data set; feeding the annotated data set into the one or more modules; and training the one or more modules to determine the one or more user activities based on the real-time images of the user.
[018] In a further embodiment of the invention, the method further has the steps of detecting, by an illumination sensor unit, an ambient light around the vehicle; and switching on, by the processing unit, a vehicle lighting system or an illuminator unit if the ambient light is below a predetermined threshold value of ambient light.
[019] In a further embodiment of the invention, the method further has the steps of detecting, by an auxiliary sensor unit, one or more vehicle parameters; determining, by the auxiliary sensor unit, whether the one or more vehicle parameters are below a first predetermined threshold; and switching off, by the processing unit, the one or more image sensors and the illumination sensor unit if the one or more vehicle parameters are below the first predetermined threshold.
BRIEF DESCRIPTION OF THE DRAWINGS
[020] Reference will be made to embodiments of the invention, examples of which may be illustrated in accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 illustrates a system for monitoring a user in a vehicle, in accordance with an embodiment of the present invention.
Figure 2 illustrates the user activities and the predefined user conditions, in accordance with an embodiment of the invention.
Figure 3 illustrates the steps involved in the method for monitoring a user in a vehicle, in accordance with an embodiment of the invention.
Figure 4 illustrates the steps involved in the method for training the one or more modules for monitoring a user in a vehicle, in accordance with an embodiment of the invention.
Figure 5 illustrates a process flow of system and method for monitoring the user in the vehicle, in accordance with an embodiment of the present invention.
Figure 6 illustrates a software architecture for the system and method for monitoring the user in the vehicle, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[021] The present invention relates to monitoring of a user in a vehicle. More particularly, the present invention relates to a system and a method for monitoring a user in a vehicle. The system and method of the present invention are typically used in a vehicle such as a two-wheeled vehicle, a three-wheeled vehicle including trikes, a four-wheeled vehicle, or other multi-wheeled vehicles as required.
[022] Figure 1 illustrates a system 100 for monitoring a user in a vehicle, in accordance with an embodiment of the present invention. The monitoring of the user is done based on certain predefined conditions of the user in real-time during riding condition of the vehicle. These conditions are the static and dynamic characteristics of the user that define the behaviour or performance of the user when the vehicle is on the road and the user is riding the vehicle. The predefined user conditions include but are not limited to the following:
1. Drowsiness: The user is monitored based on the fatigue and the drowsiness of the user. For example, the user may fall asleep while riding the vehicle or experience decreased alertness and hence, leading to an increased risk of accident. Therefore, it becomes important to monitor the user in such situations.
2. Distraction: The user may be distracted while riding the vehicle and the same may lead to road accidents. For example, the users often engage in activities such as texting, browsing the internet, calling, and the like. Therefore, it becomes important to monitor the user in such distracted situations.
3. Impaired Riding: The users, sometimes, may be riding the vehicle under the influence of drugs or alcohol consumption. It reduces the overall riding skills of the user and hence, may lead to a significant risk to the safety of the user. Therefore, it becomes important to monitor the user in such erratic situations.
4. Medical Emergencies: The users may experience sudden medical emergencies while riding the vehicle, such as a heart attack, seizure, and the like. The user monitoring systems can help detect abnormal behaviour, vital signs, or other indicators of medical distress. Without such monitoring, the timely identification of a medical emergency becomes difficult, which may lead to accidents.
5. User Pose and Actions: The user monitoring systems are essential for evaluating the performance and behaviour of the user. They can provide valuable data on aspects such as user pose, behaviour, and actions. This information can be utilized for training the user, identifying areas of improvement, and promoting safe riding habits in the user.
[023] The predefined user conditions are detected dynamically by the system 100 in real-time during the course of riding of the vehicle by the user. As illustrated, the system 100 has one or more image sensors 110 configured for capturing real-time images of the user riding the vehicle. In an embodiment, the one or more image sensors 110 capture the user in real-time and generate a live feed of the user in the form of images and videos for further processing. In an embodiment, the one or more image sensors 110 comprise one or more of, but not limited to, a camera, a Red-Green-Blue camera, a Red-Green-Blue + Infrared camera, an Infrared camera, a monochrome camera, a thermal camera, a RADAR, and the like. In an example, once the vehicle is powered ON by the user, the camera is configured to capture the images and the videos of the user in real-time.
[024] Further, the system 100 has a processing unit 120 that is configured to receive the real-time images of the user from the one or more image sensors 110. The processing unit 120 is further configured to determine one or more user activities based on the real-time images received from the one or more image sensors 110. In an embodiment, the processing unit 120 includes one or more modules, the one or more modules being configured to determine the one or more user activities during riding condition of the vehicle based on the real-time images. In an embodiment, the one or more modules comprise a plurality of Artificial Intelligence based models having machine learning and deep learning capabilities. In this regard, the one or more user activities correspond to one or more of: head movement of the user, lip movement of the user, yawn by the user, eye movement of the user, blinking by the user, hand position of the user, usage of a mobile device by the user, and sitting or standing by the user.
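By way of illustration only, the following is a minimal Python sketch of how such per-frame activity detection could be structured. The Activity names, the Detection record, and the detector callables are hypothetical stand-ins; the specification does not prescribe any particular model, library, or API.

```python
# Illustrative sketch only: hypothetical per-activity detectors run on each
# real-time frame from the image sensors (110); all names are placeholders.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, Dict, List


class Activity(Enum):
    HEAD_MOVEMENT = auto()
    LIP_MOVEMENT = auto()
    YAWN = auto()
    GAZE = auto()            # eye movement of the user
    EYE_BLINK = auto()
    HAND_POSE = auto()
    MOBILE_USAGE = auto()
    SITTING_STANDING = auto()


@dataclass
class Detection:
    activity: Activity
    confidence: float
    timestamp: float         # seconds, e.g. frame capture time


def detect_activities(
    frame: bytes,
    timestamp: float,
    detectors: Dict[Activity, Callable[[bytes], float]],
    threshold: float = 0.5,
) -> List[Detection]:
    """Run each per-activity detector on one frame and keep confident hits.

    Each detector stands in for an AI-based model (e.g. a yawn or gaze
    classifier) returning a confidence score in [0, 1].
    """
    hits = []
    for activity, detector in detectors.items():
        confidence = detector(frame)  # hypothetical model inference
        if confidence >= threshold:
            hits.append(Detection(activity, confidence, timestamp))
    return hits
```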
[025] Further, the processing unit 120 is configured to determine whether the one or more user activities correspond to one or more of a set of predefined user conditions. In an embodiment, the processing unit 120 includes an analysis module 122. In an embodiment, the analysis module 122 is configured to determine one or more user conditions from the set of predefined user conditions based on the frequency of the one or more user activities during a predefined time. In an embodiment, the predefined time ranges between 5 seconds and 5 minutes. For example, if the analysis module 122 determines that one or more user activities, say yawning, occurs a greater number of times than a threshold value in a span of 5 minutes, the analysis module 122 determines that the same corresponds to one or more of the set of predefined user conditions. In this regard, the set of predefined user conditions comprises one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user. For example, the analysis module 122 of the processing unit 120 determines whether the one or more user activities correspond to the set of predefined user conditions based on the frequency of the one or more user activities during the predefined time, interdependency of activities, activity thresholds, data correlation, data dependency, and the like to determine the drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
[026] In an embodiment as depicted in Figure 2, the processing unit 120 is configured to detect one or more of a head movement, yawn, lip movement, gaze, eye blink, hand pose, presence of a mobile phone, and seating & standing pose. Herein, the inputs of the aforementioned user activities are received by the analysis module 122. The analysis module 122 determines one or more of a set of predefined user conditions based on the frequency of the one or more user activities during the predefined time. As illustrated, the analysis module 122 determines that the one or more user activities correspond to a fatigue condition based on the frequency of one or more of the following detected user activities within the predetermined time: yawn, gaze, eye blink, hand pose, and seating & standing pose. Further, the analysis module 122 determines that the one or more user activities correspond to a distraction condition based on the frequency of one or more of the following detected user activities within the predetermined time: lip movement, gaze, presence of mobile phone, and head movement. Similarly, the analysis module 122 determines that the one or more user activities correspond to a drowsiness condition based on the frequency of one or more of the following detected user activities within the predetermined time: yawn, gaze, eye blink, and head movement. Similarly, the analysis module 122 determines that the one or more user activities correspond to an abnormal riding condition based on the frequency of one or more of the following detected user activities within the predetermined time: presence of mobile phone, hand pose, and seating & standing pose. Further, the analysis module 122 determines that the one or more user activities correspond to a bad riding pose condition based on the frequency of one or more of the following detected user activities within the predetermined time: hand pose and seating & standing pose.
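The frequency-based analysis of paragraphs [025]-[026] can be pictured with the short sketch below, which builds on the Activity and Detection types from the previous sketch. The condition-to-activity mapping mirrors Figure 2; the window length and count threshold are placeholder values chosen within the 5-second-to-5-minute range stated above.

```python
# Sketch of the analysis module (122): count activity detections in a
# sliding time window and flag a predefined condition above a threshold.
from collections import deque
from typing import Deque, List, Tuple

# Activity-to-condition mapping per Figure 2.
CONDITION_ACTIVITIES = {
    "fatigue": {Activity.YAWN, Activity.GAZE, Activity.EYE_BLINK,
                Activity.HAND_POSE, Activity.SITTING_STANDING},
    "distraction": {Activity.LIP_MOVEMENT, Activity.GAZE,
                    Activity.MOBILE_USAGE, Activity.HEAD_MOVEMENT},
    "drowsiness": {Activity.YAWN, Activity.GAZE, Activity.EYE_BLINK,
                   Activity.HEAD_MOVEMENT},
    "abnormal_riding": {Activity.MOBILE_USAGE, Activity.HAND_POSE,
                        Activity.SITTING_STANDING},
    "bad_riding_pose": {Activity.HAND_POSE, Activity.SITTING_STANDING},
}


class ConditionAnalyzer:
    """Flags predefined user conditions from activity frequency over time."""

    def __init__(self, window_s: float = 300.0, count_threshold: int = 5):
        self.window_s = window_s                  # e.g. 5 s to 5 min
        self.count_threshold = count_threshold    # placeholder threshold
        self.events: Deque[Tuple[float, Activity]] = deque()

    def add(self, detection: Detection) -> None:
        """Record a detection and drop events older than the window."""
        self.events.append((detection.timestamp, detection.activity))
        cutoff = detection.timestamp - self.window_s
        while self.events and self.events[0][0] < cutoff:
            self.events.popleft()

    def conditions(self) -> List[str]:
        """Return every condition whose mapped activities occur frequently."""
        flagged = []
        for condition, activities in CONDITION_ACTIVITIES.items():
            count = sum(1 for _, a in self.events if a in activities)
            if count >= self.count_threshold:
                flagged.append(condition)
        return flagged
```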
[027] For example, a user ‘X’ is riding the vehicle. While riding, the user ‘X’ yawns n times and blinks his eyes m times in five minutes. The camera captures the images and videos of the user yawning and blinking in real-time. The processing unit then analyses the images and videos and determines that the user ‘X’ is in a fatigue condition. Based on the analysis, the feedback module is activated, and it generates an indication to the user and also activates the rider assistance unit. Through the rider assistance unit, the speed of the vehicle is reduced and the seat of the vehicle is cooled down to provide comfort to the user ‘X’.
[028] As another example, a user ‘Y’ is continuously riding the vehicle for five hours in cruise control mode. The processing unit analyses the data based on geographical factors, road conditions, and climatic factors of the area. Based on the analysis, the feedback module is activated, and it generates an indication to the user and also activates the rider assistance unit. Through the rider assistance unit, the speed of the vehicle is reduced to improve the safety of the user ‘Y’.
[029] The system 100 has a feedback module 130. The feedback module 130 is in communication with the processing unit 120. The feedback module 130 is configured to receive an input from the processing unit 120 if the one or more user activities correspond to one or more of the set of predefined user conditions. The feedback module 130 is configured to generate an output command if the one or more user activities correspond to one or more of the set of predefined user conditions. In an embodiment, the output command generated by the feedback module 130 includes at least one of an indication to the user or activating one or more rider assistance and comfort functions. In this regard, the one or more rider assistance and comfort functions comprise one or more of cooling of a seat of the vehicle, limiting a speed of the vehicle to a predetermined value of vehicle speed, adaptive cruise control functionality, and automatic braking. For example, the feedback module 130 generates an output command for cooling of the seat of the vehicle when a fatigue condition is detected by the processing unit 120. In an embodiment, the speed of the vehicle may be limited to 30% of the maximum speed limit in case an output command is generated by the feedback module 130. Further, the indication to the user comprises one or more of a voice indication, a video indication, a haptic indication, a display indication, and a user activity report. In an embodiment, if the user is riding the vehicle for a predefined duration, for example more than five hours, the indication is also sent to the user based on the road condition, climate details, geographical location, and the like.
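A hedged sketch of the feedback module's output command follows, reusing the condition labels from the analyzer sketch above. The condition-to-action mapping and the 30% speed cap follow the examples in the text; the command field names and action strings are hypothetical.

```python
# Sketch of the feedback module (130): turn flagged conditions into an
# output command carrying user indications and rider assistance actions.
from dataclasses import dataclass, field
from typing import List


@dataclass
class OutputCommand:
    indications: List[str] = field(default_factory=list)         # voice/display/haptic
    assistance_actions: List[str] = field(default_factory=list)  # comfort functions


def build_output_command(conditions: List[str],
                         max_speed_kmph: float) -> OutputCommand:
    cmd = OutputCommand()
    for condition in conditions:
        # An indication is always raised for a detected condition.
        cmd.indications.append(f"display_alert:{condition}")
        if condition == "fatigue":
            # Per the example above: cool the seat on a fatigue condition.
            cmd.assistance_actions.append("cool_seat")
        if condition in ("fatigue", "drowsiness"):
            # Per the embodiment above: cap speed at 30% of the maximum.
            cmd.assistance_actions.append(
                f"limit_speed:{0.3 * max_speed_kmph:.0f}")
    return cmd
```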
[030] The system 100 further includes an illumination sensor unit 140. The illumination sensor unit 140 is in communication with the processing unit 120 and is configured to detect a level of ambient light around the vehicle. Further, the processing unit 120 is configured to switch on a vehicle lighting system or an illuminator unit if the ambient light is below a predetermined threshold value of ambient light. For example, the brightness of the instrument cluster is increased if the detected ambient light is below the predetermined threshold. The system 100 further includes an auxiliary sensor unit 150. The auxiliary sensor unit 150 is in communication with the processing unit 120 and is configured to detect one or more vehicle parameters. The auxiliary sensor unit 150 is configured to determine whether the one or more vehicle parameters are below a first predetermined threshold. In an embodiment, the one or more vehicle parameters include, but are not limited to, a state of charge of a battery of the vehicle, and the first predetermined threshold is a state of charge of the battery ranging between 15-20%. The processing unit 120 is further configured to switch off the one or more image sensors 110 and the illumination sensor unit 140 if the one or more vehicle parameters are below the first predetermined threshold. Thus, the provision of the auxiliary sensor unit 150 prevents deep discharge of the battery.
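The lighting and power-gating behaviour of this paragraph can be summarised in a small sketch. The lux threshold is a placeholder assumption; the 20% state-of-charge cut-off follows the 15-20% band stated above.

```python
# Sketch of the illumination sensor unit (140) and auxiliary sensor unit
# (150) logic: switch lighting on in low light, and switch the camera
# chain off on low battery to prevent deep discharge.
def manage_power_and_lighting(ambient_lux: float,
                              battery_soc_pct: float,
                              lux_threshold: float = 10.0,
                              soc_threshold_pct: float = 20.0) -> dict:
    sensors_on = battery_soc_pct >= soc_threshold_pct
    return {
        # Vehicle lighting system / illuminator unit switched on when
        # ambient light is below the predetermined threshold.
        "lighting_on": ambient_lux < lux_threshold,
        # Image sensors (110) and illumination sensor unit (140) are
        # switched off below the first predetermined threshold.
        "image_sensors_on": sensors_on,
        "illumination_sensor_on": sensors_on,
    }
```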
[031] In another aspect, the present invention relates to a method 200 for monitoring a user in a vehicle. The steps involved in the method 200 for monitoring the user in the vehicle are illustrated in Figure 3. As illustrated, at step 202, one or more image sensors 110 are activated. In an embodiment, the one or more image sensors 110 are activated when the vehicle is switched ON by the user. This activates all the sensors of the vehicle for monitoring the user during the riding condition in real-time.
[032] At step 204, the one or more image sensors 110 capture real-time images of the user riding the vehicle. Therefore, once the one or more image sensors 110 are activated, they capture the images and the videos of the user in real-time and send them to the processing unit 120 for further processing. The processing unit 120 receives the real-time images of the user riding the vehicle captured by the one or more image sensors 110.
[033] At step 206, the processing unit 120 determines one or more user activities based on the real-time images received from the one or more image sensors 110 and processed by the processing unit 120. In an embodiment, the processing unit 120 includes one or more modules, the one or more modules being configured to determine the one or more user activities. In an embodiment, the one or more modules of the processing unit 120 are configured to receive real-time images of the user for determining the one or more user activities during the riding condition of the vehicle. In this regard, the one or more user activities correspond to one or more of: head movement of the user, lip movement of the user, yawn by the user, eye movement of the user, blinking by the user, hand position of the user, usage of a mobile device by the user, and sitting or standing by the user.
[034] At step 208, the processing unit 120 determines whether the one or more user activities correspond to one or more of a set of predefined user conditions. In an embodiment, the processing unit 120 includes an analysis module 122. In an embodiment, the analysis module 122 is configured to determine one or more user conditions from the set of predefined user conditions based on the frequency of the one or more user activities during a predefined time. In an embodiment, the predefined time ranges between 5 seconds and 5 minutes. For example, if the analysis module 122 determines that one or more user activities, say yawning, occurs a greater number of times than a threshold value in a span of 5 minutes, the analysis module 122 determines that the same corresponds to one or more of the set of predefined user conditions. In this regard, the set of predefined user conditions comprises one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user. For example, the analysis module 122 of the processing unit 120 determines whether the one or more user activities correspond to the set of predefined user conditions based on the frequency of the one or more user activities during the predefined time, interdependency of activities, activity thresholds, data correlation, data dependency, and the like to determine the drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
[035] At step 210, the feedback module 130 generates an output command if the one or more user activities correspond to one or more of the set of predefined user conditions. In an embodiment, the output command generated by the feedback module 130 includes at least one of an indication to the user or activating one or more rider assistance and comfort functions. In this regard, the one or more rider assistance and comfort functions comprise one or more of cooling of a seat of the vehicle, limiting a speed of the vehicle to a predetermined value of vehicle speed, adaptive cruise control functionality, and automatic braking. In an embodiment, the speed of the vehicle may be limited to a predefined percentage of the maximum speed limit in case an output command is generated by the feedback module 130. Further, the indication to the user comprises one or more of a voice alert, a video alert, a haptic alert, a display alert, and a user activity report.
[036] In an embodiment, if the user is riding the vehicle continuously for a predefined duration, for example more than four hours, the indication is also sent to the user based on the road condition, climate details, geographical location, and the like for the safety of the user. For example, if the user is drowsy, then the feedback module generates an output command and sends an indication to the user. The rider assistance unit is also activated which cools down the seat of the user and reduces the speed of the vehicle below a predetermined speed.
[037] As further illustrated in Figure 3, the method includes detecting, by an illumination sensor unit 140, an ambient light around the vehicle. The method then includes switching on, by the processing unit 120, a vehicle lighting system or an illuminator unit if the ambient light is below a predetermined threshold value of the ambient light. This ensures a safe riding condition for the user in real-time. In an embodiment, the method includes detecting, by an auxiliary sensor unit 150, one or more vehicle parameters; determining, by the auxiliary sensor unit 150, whether the one or more vehicle parameters are below a first predetermined threshold; and switching off, by the processing unit 120, the one or more image sensors 110 and the illumination sensor unit 140 if the one or more vehicle parameters are below the first predetermined threshold. In an embodiment, the vehicle parameters include, but are not limited to, an aggressiveness factor, which is indicative of variation in throttle input and other rider control inputs such as braking, clutch actuation, and the like, lean angle data indicative of the lean of the vehicle by the user, and the illumination around the vehicle. In an embodiment, the one or more vehicle parameters comprise a state of charge of the battery of the vehicle, and the first predetermined threshold is a state of charge of the battery ranging between 15-20%.
[038] As illustrated in Figure 4, the steps involved in the method for training the one or more modules for monitoring a user in a vehicle are disclosed. At step 302, a data set is collected in relation to one or more activities of the user. In an embodiment, the inputs are received in relation to one or more activities of the user from at least one of an Artificial Intelligence model or the one or more image sensors 110. The one or more activities of the user correspond to head movement of the user, lip movement of the user, yawn by the user, eye movement of the user, blinking by the user, hand position of the user, usage of a mobile device by the user, and sitting or standing by the user. In an embodiment, a set of data related to the physical activity of the user is collected for training the one or more modules for making predictions and decisions.
[039] At step 304, the collected data set is processed for further analysis. The processing is done by filtering and transforming the data set. In an embodiment, the data set is processed to prepare it for further analysis and modelling. This involves transforming and cleaning the data to improve its quality, reliability, and compatibility.
[040] At step 306, the data set is annotated. In an embodiment, the processed data is annotated, for example, by attaching specific attributes to the data points to provide context and meaning to the information for easy processing. At step 308, the annotated data set is fed into the one or more modules.
[041] At step 310, the one or more modules are trained to determine the one or more user activities based on the real-time images of the user. At step 312, the one or more modules are evaluated. Lastly, at step 314, the one or more modules, now configured to determine the activities of the user, are deployed in the processing unit 120.
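The training workflow of Figure 4 can be sketched as a simple pipeline. Every function body below is a placeholder, since the specification names the steps but not the tooling, data formats, or model family.

```python
# Schematic sketch of steps 302-314; all bodies are placeholders.
from typing import Any, List


def collect_dataset() -> List[Any]:
    """Step 302: gather raw images/videos of user activities."""
    return []


def preprocess(dataset: List[Any]) -> List[Any]:
    """Step 304: filter, transform, and clean the raw data."""
    return dataset


def annotate(dataset: List[Any]) -> List[Any]:
    """Step 306: attach activity labels/attributes to each data point."""
    return dataset


def train_modules(annotated: List[Any]) -> Any:
    """Steps 308-310: feed the annotated set to the modules and train."""
    return object()  # placeholder for the trained activity model


def evaluate(model: Any) -> Any:
    """Step 312: evaluate the trained modules before deployment."""
    return model


def deploy(model: Any) -> None:
    """Step 314: deploy the trained modules into the processing unit (120)."""


if __name__ == "__main__":
    deploy(evaluate(train_modules(annotate(preprocess(collect_dataset())))))
```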
[042] Figure 5 illustrates a process flow of the present invention in accordance with an embodiment of the invention. In operation, for example, the one or more image sensors 110 capture real-time images of the user as a video stream, which is converted to a video input and then encoded for further processing. Thereafter, the processing unit 120 detects the one or more user activities based on the Artificial Intelligence model. The processing unit 120 receives input from the image sensors 110 in relation to one or more of the following: head movement detection, yawn detection, lip movement detection, seating & standing pose detection, hand pose detection, mobile detection, gaze detection, and eye blink detection. Thereafter, the analysis module 122 of the processing unit 120 determines whether the one or more user activities correspond to one or more of the set of predefined user conditions, which include fatigue detection, drowsiness detection, distraction detection, bad pose detection, and abnormal driving detection. Based on the inputs from the analysis module 122 of the processing unit 120, the feedback module 130 generates indications for the user using one or more of a voice alert functionality, a haptic alert functionality, a display alert functionality, or a user activity report. Further, the feedback module 130 generates an output command for seat cooling, limiting of vehicle speed (referred to as limp home mode), and limiting of vehicle autonomous functionalities.
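Tying the earlier sketches together, the Figure 5 flow could loop roughly as below. The frame source and detector callables remain hypothetical stand-ins, and the printed output stands in for the feedback module's real indications and commands.

```python
# End-to-end sketch: frames -> activity detection -> condition analysis ->
# feedback, mirroring the Figure 5 process flow.
def monitoring_loop(frame_source, detectors, analyzer: ConditionAnalyzer,
                    max_speed_kmph: float = 90.0) -> None:
    for frame, timestamp in frame_source:        # encoded video input
        for detection in detect_activities(frame, timestamp, detectors):
            analyzer.add(detection)
        flagged = analyzer.conditions()
        if flagged:
            command = build_output_command(flagged, max_speed_kmph)
            print(timestamp, flagged, command)   # stand-in for module 130
```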
[043] Figure 6 illustrates the software architecture in relation to the present invention. As illustrated in Figure 6, the software architecture has a user activity analysis module 122. The processing unit 120 comprises a plurality of modules including the analysis module 122. The analysis module 122 of the processing unit 120 is operatively coupled to a plurality of microservices 176. The microservices are libraries which are a part of the processing unit 120. In operation, the processing unit 120 receives the real-time images from the one or more image sensors 110 through a hardware abstraction layer 172 and an operating system 170. The analysis module 122 receives inputs of one or more user activities from a user activity detection module 124. The user activity is analysed by the analysis module 122, and the user activity interpretation module 123 determines whether the one or more user activities correspond to one or more of a set of predefined user conditions based on user activity frequency and time. The processing unit 120 also has a debug module 178 for limiting chances of error in the functioning of the processing unit 120.
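The module wiring of Figure 6 could be expressed as below, again reusing the earlier sketches. The class names loosely mirror the detection module 124 and interpretation module 123; the hardware abstraction layer and operating system boundary are abstracted away as whatever supplies the frames.

```python
# Sketch of the Figure 6 module wiring inside the processing unit (120).
class UserActivityDetectionModule:               # module 124
    def __init__(self, detectors):
        self.detectors = detectors

    def detect(self, frame, timestamp):
        return detect_activities(frame, timestamp, self.detectors)


class UserActivityInterpretationModule:          # module 123
    def __init__(self, analyzer: ConditionAnalyzer):
        self.analyzer = analyzer

    def interpret(self, detections):
        for detection in detections:
            self.analyzer.add(detection)
        return self.analyzer.conditions()
```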
[044] Advantageously, the present invention provides a system and a method for monitoring a user in a vehicle, wherein the system monitors a physical state of the user using one or more modules in real time, which enhances the overall user experience and the safety of the user. Further, the present invention allows for providing an accurate, efficient, and reliable system for monitoring the user according to the riding style of the user.
[045] Furthermore, the present invention generates an indication to be sent to the user and therefore enhances the safety of the user in real-time. The indication is generated based on the user's physical characteristics such as drowsiness detection, distraction detection, impaired riding detection, medical emergency detection, bad pose detection, and the like. The present invention allows for providing comfort to the user based on the user's physical characteristics. The present system is customised to generate the indication in real-time without the intervention of the user, thereby increasing the performance, handling, and market attractiveness of the vehicle. Further, the present invention allows the vehicle to intervene in the middle of a trip, which allows the vehicle to make better informed and correct decisions, thus enhancing the safety and monitoring of the user.
[046] In addition, implementation of the system and method of the present invention is done in the real-time based on the activities of the user and the vehicle parameters, thus ensuring better safety and monitoring of the user in the vehicle. Further, the present system is cost-effective and reliable and hence, the system can be integrated with the vehicle for the safety of the user.
[047] In light of the abovementioned advantages and the technical advancements provided by the disclosed method and system, the claimed steps as discussed above are not routine, conventional, or well understood in the art, as the claimed steps enable solutions to the existing problems in conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the device itself, as the claimed steps provide a technical solution to a technical problem.
[048] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[049] While the present invention has been described with respect to certain embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.
List of Reference Numerals
100: System for monitoring a user in a vehicle
110: One or More Image Sensors
120: Processing Unit
122: Analysis Module
130: Feedback Module
140: Illumination Sensor Unit
150: Auxiliary Sensor Unit
170: Operating System
172: Hardware Abstraction Layer
176: Microservices
178: Debug Module
200: Method for monitoring a user in a vehicle
300: Method for Training the one or more Modules
Claims:
1. A system (100) for monitoring a user in a vehicle, the system (100) comprising:
one or more image sensors (110), the one or more image sensors (110) being configured to capture real-time images of the user riding the vehicle;
a processing unit (120), the processing unit (120) being configured to receive the real-time images of the user from the one or more image sensors (110), and the processing unit (120) being configured to determine one or more user activities based on the real-time images received from the one or more image sensors (110), and the processing unit (120) being configured to determine whether the one or more user activities correspond to one or more of a set of predefined user conditions; and
a feedback module (130), the feedback module (130) being configured to receive an input from the processing unit (120) if the one or more user activities correspond to one or more of the set of predefined user conditions, and the feedback module (130) being configured to generate an output command.
2. The system (100) as claimed in claim 1, wherein the output command generated by the feedback module (130) comprises at least one of an indication to the user or activating one or more rider assistance and comfort functions.
3. The system (100) as claimed in claim 1, wherein the processing unit (120) comprises one or more modules, the one or more modules being configured to receive real-time images of the user for determining the one or more user activities during riding condition of the vehicle.
4. The system (100) as claimed in claim 1, wherein the one or more modules include an analysis module (122), the analysis module (122) being configured to determine the one or more user conditions from the set of predefined user conditions based on the frequency of the one or more user activities during a predefined time, wherein the set of predefined user conditions includes one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
5. The system (100) as claimed in claim 1, comprising an illumination sensor unit (140), the illumination sensor unit (140) being in communication with the processing unit (120) and being configured to detect a level of ambient light around the vehicle, and the processing unit (120) being configured to switch on at least one of a vehicle lighting system and an illuminating unit, if the ambient light is below a predetermined threshold value of the ambient light.
6. The system (100) as claimed in claim 5, comprising an auxiliary sensor unit (150), the auxiliary sensor unit (150) being in communication with the processing unit (120) and being configured to detect one or more vehicle parameters and determine whether the one or more vehicle parameters are below a first predetermined threshold; and
switch off the one or more image sensors (110) and the illumination sensor unit (140) if the one or more vehicle parameters are below the first predetermined threshold.
7. A method (200) for monitoring a user in a vehicle, the method (200) comprising the steps of:
capturing, by one or more image sensors (110), real-time images of the user riding the vehicle;
receiving, by a processing unit (120), the real-time images of the user riding the vehicle captured by the one or more image sensors (110);
determining, by the processing unit (120), one or more user activities based on the real-time images received from the one or more image sensors (110);
determining, by the processing unit (120), whether the one or more user activities correspond to one or more of a set of predefined user conditions; and
generating, by a feedback module (130), an output command if the one or more user activities correspond to one or more of the set of predefined user conditions.
8. The method (200) as claimed in claim 7, wherein the output command generated by the feedback module (130) comprises at least one of an indication to the user and activating one or more rider assistance and comfort functions.
9. The method (200) as claimed in claim 7, wherein one or more modules of the processing unit (120) are configured to receive real-time images of the user for determining the one or more user activities during the riding condition of the vehicle.
10. The method (200) as claimed in claim 7, wherein the one or more modules include an analysis module (122), the analysis module (122) being configured to determine one or more user conditions from the set of predefined user conditions based on the frequency of the one or more user activities during a predefined time, wherein the set of predefined user conditions includes one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
11. The method (200) as claimed in claim 7, comprising the steps of:
collecting a data set in relation to one or more activities of the user;
processing the data set for further analysis by filtering and transforming the data set;
annotating the data set;
feeding the annotated data set into the one or more modules; and
training the one or more modules to determine the one or more user activities based on the real-time images of the user.
12. The method (200) as claimed in claim 7, comprising the steps of:
detecting, by an illumination sensor unit (140), an ambient light around the vehicle; and
switching on, by the processing unit (120), at least one of a vehicle lighting system or an illuminating unit if the ambient light is below a predetermined threshold value of the ambient light.
13. The method (200) as claimed in claim 12, comprising the steps of:
detecting, by an auxiliary sensor unit (150), one or more vehicle parameters;
determining, by the auxiliary sensor unit (150), whether the one or more vehicle parameters are below a first predetermined threshold; and
switching off, by the processing unit (120), the one or more image sensors (110) and the illumination sensor unit (140) if the one or more vehicle parameters are below the first predetermined threshold.