Abstract: A System for Monitoring a Wearable Safety Gear in a Vehicle and a Method thereof. The present invention provides a system (100) and a method (200) for monitoring a wearable safety gear in a vehicle. The system (100) comprises one or more image sensors (110) configured to capture real time images of a user riding the vehicle; and a processing unit (120) configured to receive the real time images of the user. The processing unit (120) has one or more processing modules configured to determine one or more conditions of the user in relation to the wearable safety gear. The system (100) has a feedback module (130) configured to receive an input from the processing unit (120) if any one of the conditions of the user in relation to the wearable safety gear is true, and to generate an output command. Reference Figure 1
Description: FORM 2
THE PATENTS ACT, 1970
(39 OF 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[Refer Section 10, Rule 13]
TITLE OF INVENTION
A System for Monitoring a Wearable Safety Gear in a Vehicle and a Method thereof
APPLICANT
TVS MOTOR COMPANY LIMITED, an Indian company, having its address at “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India.
PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
[001] The present invention relates to monitoring of a wearable safety gear. More particularly, the present invention relates to a system and method for monitoring a wearable safety gear in a vehicle.
BACKGROUND OF THE INVENTION
[002] It is widely known and appreciated that wearable safety gears such as helmets are essential for users of two-wheeled vehicles such as motorcycles and scooters, or three-wheelers such as trikes. Helmets are specifically designed to protect the head of the user from impact and to reduce the severity of injuries in the event of an accident, minimizing the risk of head injuries including traumatic brain injuries, skull fractures, and concussions. Wearing a helmet significantly reduces the risk of fatalities in two-wheeler accidents; studies have shown that helmets can reduce the risk of death by up to 42% for motorcycle riders and 29% for bicyclists. Apart from injury prevention in case of accidents, helmets often come with visors or face shields that provide clear visibility and protect the rider's eyes from dust, debris, insects, wind, and harsh weather conditions. This improves the overall riding experience and reduces the risk of accidents caused by impaired vision.
[003] In existing systems, to ensure that the user is wearing a helmet or any other safety gear, complex technologies such as smart helmets are used. Smart helmets are extremely expensive because multiple sensors must be provided in a compact helmet-like setup. Further, a smart helmet is configured for, and therefore dependent on, one specific vehicle. Usage of smart helmets is not a reliable approach to ensure the use of helmets because the user is entirely dependent on one particular smart device or gadget. Thus, in cases where the user forgets to take the safety device, or it is stolen, such a system fails to ensure the safety of the user.
[004] In existing systems for ensuring that the rider is wearing a helmet, when the vehicle senses that the user is not wearing one, the vehicle stops abruptly. This can lead to a dangerous situation, because in cases where the rider is unable to carry the helmet, or where the rider's helmet is stolen, the user is stuck with no option. Further, in such systems, even if the user removes the helmet or safety gear during a vehicle riding condition for some reason, the vehicle tends to stop abruptly. This abrupt stopping of the vehicle is dangerous and can lead to sudden hazardous accidents and life-threatening unforeseen scenarios. The conventional ways of detecting safety gears do not deal with emergency scenarios in which the user is not carrying the safety gear or the safety gear is stolen.
[005] Further, existing systems only detect whether the user is wearing the helmet. The conventional systems fail to ensure that the rider is wearing the helmet properly, i.e. with proper strapping in a locked condition and the helmet fully covering the head of the rider. Further, the conventional systems also use a traditional image processing unit, which takes a longer time to process images and is not suited to computationally intensive processes. This leads to a large time lag in detecting whether the helmet is being worn.
[006] Thus, there is a need in the art for a system and method for monitoring a wearable safety gear in a vehicle which addresses at least the aforementioned problems.
SUMMARY OF THE INVENTION
[007] In one aspect, the present invention relates to a system for monitoring a wearable safety gear in a vehicle. The system has one or more image sensors configured to capture real time images of a user riding the vehicle. The system has a processing unit configured to receive the real time images of the user from the one or more image sensors. The processing unit has one or more processing modules, wherein the one or more processing modules are configured to determine one or more conditions of the user in relation to the wearable safety gear based on the real time images of the user. The system has a feedback module configured to receive an input from the processing unit if any of the one or more conditions of the user in relation to the wearable safety gear is true, and to generate an output command.
[008] In an embodiment of the invention, the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, and the user not wearing the wearable safety gear in a predetermined manner.
[009] In an embodiment of the invention, the processing unit has a first processing module and a second processing module. The first processing module is configured to determine whether the user is wearing the wearable safety gear based on the real time images of the user, and the second processing module is configured to determine whether the user is wearing the wearable safety gear in a predetermined manner based on the real time images of the user.
[010] In a further embodiment of the invention, the output command generated by the feedback module includes at least one of an indication to the user or limiting a vehicle operating parameter to a predetermined value of the vehicle operating parameter.
[011] In a further embodiment of the invention, the feedback module is configured to generate the output command for limiting the vehicle operating parameter to the predetermined value after the indication has been communicated to the user for a predetermined period of time.
[012] In a further embodiment of the invention, the wearable safety gear comprises a helmet of a quality prescribed by the bylaws of the geographical location, and the predetermined manner of wearing the wearable safety gear includes the helmet being effectively positioned.
[013] In a further embodiment of the invention, the first processing module has a first computational model and the second processing module has a second computational model. The first computational model is trained based on collected data in respect of bylaws or safety regulations in relation to wearable safety gear across various geographical locations, and the second computational model is trained based on collected data in respect of different shapes of wearable safety gears and effective positioning of the wearable safety gear. Herein the first processing module and the second processing module receive input from one or more vehicle sensors in relation to a geographical location, a geographical condition, and one or more climatic conditions and determine whether the user is wearing the wearable safety gear and whether the user is wearing the wearable safety gear in the predetermined manner.
[014] In a further embodiment of the invention, the system has an illumination sensor unit. The illumination sensor unit is in communication with the processing unit and is configured to detect a level of ambient light around the vehicle, and the processing unit is configured to switch on a vehicle lighting system if the ambient light is below a predetermined threshold value of ambient light.
[015] In a further embodiment of the invention, the system has an auxiliary sensor unit. The auxiliary sensor unit is in communication with the processing unit and is configured to detect one or more vehicle parameters. The processing unit is configured to: determine whether the one or more vehicle parameters are below a first predetermined threshold; and switch off the one or more image sensors and the illumination sensor unit or switch off the system, if the one or more vehicle parameters are below the first predetermined threshold.
[016] In a further embodiment of the invention, the processing unit has a vision processing unit in communication with the first processing module and the second processing module. The vision processing unit is configured to receive inputs from a hardware through an operating system and a hardware abstraction layer.
[017] In another aspect, the present invention relates to a method for monitoring a wearable safety gear in a vehicle. The method has the steps of: capturing, by one or more image sensors, real time images of a user riding the vehicle; receiving, by a processing unit, the real time images of the user riding the vehicle captured by the one or more image sensors; determining one or more conditions of the user in relation to the wearable safety gear based on the real time images of the user; receiving, by a feedback module, an input from the processing unit if any one of the one or more conditions of the user in relation to the wearable safety gear is true; and generating an output command.
[018] In an embodiment of the invention, the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, and the user not wearing the wearable safety gear in a predetermined manner.
[019] In an embodiment of the invention, the method has the steps of: determining whether the user is wearing the wearable safety gear based on the real time images of the user; and determining whether the user is wearing the wearable safety gear in a predetermined manner based on the real time images of the user.
[020] In an embodiment of the invention, the output command generated by the feedback module includes at least one of an indication to the user or limiting a vehicle operating parameter to a predetermined value of the vehicle operating parameter.
[021] In an embodiment of the invention, an output command for limiting the vehicle operating parameter to the predetermined value is generated after the indication has been communicated to the user for a period of predetermined time.
[022] In an embodiment of the invention, the wearable safety gear comprises a helmet of the quality, shape and size prescribed by the bylaws of the geographical location, and the predetermined manner of wearing the wearable safety gear includes the helmet being effectively positioned.
[023] In a further embodiment of the invention, the method has the steps of: detecting an ambient light around the vehicle; and switching on a vehicle lighting system if the ambient light is below a predetermined threshold value of ambient light.
[024] In a further embodiment of the invention, the method has the steps of: detecting one or more vehicle parameters; determining whether the one or more vehicle parameters are below a first predetermined threshold; and switching off the one or more image sensors and the illumination sensor unit or the system, if the one or more vehicle parameters are below the first predetermined threshold.
[025] In another aspect, the present invention relates to a method for training one or more computational models for monitoring a wearable safety gear. The method has the steps of: collecting, by a processing unit, a data set in respect of bylaws or safety regulations in relation to the wearable safety gear across various geographical locations, and in respect of different shapes of wearable safety gears and effective positioning of the wearable safety gear; processing the data set for further analysis by filtering and transforming the data set; annotating the data set; feeding the annotated data set into a first computational model and a second computational model; training the first computational model to determine whether the user is wearing the wearable safety gear; and training the second computational model to determine whether the user is wearing the wearable safety gear in a predetermined manner.
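The training steps enumerated above (collect, filter and transform, annotate, then feed into two models) can be sketched as follows. This is a minimal illustrative sketch only; the helper names and the simple `fit` interface are assumptions for exposition, and real computational models would be machine learning training routines operating on image data.

```python
# Illustrative sketch of the training pipeline described above.
# The `keep`, `transform`, and `annotate` callables are hypothetical
# stand-ins for the filtering, transformation, and annotation steps.

def prepare_dataset(raw_records, keep, transform, annotate):
    """Filter, transform, and annotate the collected data set."""
    processed = [transform(r) for r in raw_records if keep(r)]
    return [annotate(r) for r in processed]

def train_models(annotated, first_model, second_model):
    """Feed the annotated data set into both computational models."""
    first_model.fit(annotated)    # learns: is the safety gear worn at all?
    second_model.fit(annotated)   # learns: is it worn in the predetermined manner?
    return first_model, second_model
```

In practice the two models would be trained on differently annotated subsets (gear presence labels versus strap-position labels); the shared data set here is a simplification.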
BRIEF DESCRIPTION OF THE DRAWINGS
[026] Reference will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 illustrates a system for monitoring a wearable safety gear in a vehicle, in accordance with an embodiment of the present invention.
Figure 2 illustrates a method for monitoring the wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
Figure 3A illustrates a method for training one or more computational models for monitoring the wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
Figure 3B illustrates a method for training one or more computational models for monitoring the wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
Figure 4 illustrates a process flow of system and method for monitoring a wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
Figure 5 illustrates a software architecture for the system and method for monitoring the wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[027] The present invention relates to monitoring of a wearable safety gear. More particularly, the present invention relates to a system and method for monitoring a wearable safety gear in a vehicle. The system and method of the present invention are typically used in a vehicle such as a two wheeled vehicle, or a three wheeled vehicle including trikes, or a four wheeled vehicle, or other multi-wheeled vehicles as required.
[028] Figure 1 illustrates a system 100 for monitoring a wearable safety gear in a vehicle. As illustrated, the system 100 comprises one or more image sensors 110. The one or more image sensors 110 are configured to capture real time images of a user riding the vehicle. In essence, the one or more image sensors 110 capture a series of real time images, or a video feed or live feed, of the user riding the vehicle. The real time images of the user are captured as soon as the vehicle is in a riding condition. The real time images, or video feed or live feed, are a series of individual image frames, which can be analysed for monitoring the wearable safety gear. In an embodiment, the one or more image sensors 110 comprise one or more of a camera, a Red-Green-Blue wavelength camera, a Red-Green-Blue-Infrared wavelength camera, an Infrared camera, a Monochrome camera, a Thermal camera, a Radio Detection and Ranging camera, a Light Detection and Ranging camera, or a Time-of-Flight camera.
[029] As illustrated in Figure 1, the system 100 further comprises a processing unit 120. The processing unit 120 is configured to receive the real time images of the user from the one or more image sensors 110. Further, the processing unit 120 has one or more processing modules. The one or more processing modules of the processing unit 120 are configured to determine one or more conditions of the user in relation to the wearable safety gear based on the real time images of the user. In an embodiment, the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, the wearable safety gear not being in compliance with the bylaws of the geographical location, and the user not wearing the wearable safety gear in a predetermined manner. Thus, the processing unit 120 is configured not only to determine whether the user is wearing the wearable safety gear, but also to determine whether the user is wearing the wearable safety gear in a predetermined manner and whether the wearable safety gear is of optimum quality.
[030] In an embodiment, the processing unit 120 comprises a first processing module 122 and a second processing module 124. The first processing module 122 and the second processing module 124 are artificial intelligence modules equipped with machine learning and/or deep machine learning models. Herein, the first processing module 122 is configured to determine, based on the real time images of the user, whether the user is wearing the wearable safety gear as per the bylaws of the geographical location. If the first processing module 122 determines that the user is wearing the wearable safety gear, the second processing module 124 is configured to determine whether the user is wearing the wearable safety gear in the predetermined manner based on the real time images of the user. The first processing module 122 of the processing unit 120 analyses one or more frames from the real time images or live feed to determine whether the user is wearing the safety gear. If, in a frame, the first processing module 122 determines that the user is wearing the wearable safety gear, that frame is analysed by the second processing module 124 to determine whether the user is wearing the wearable safety gear in the predetermined manner.
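The cascaded operation of the two modules described above can be sketched as follows. This is an illustrative sketch only, not the claimed implementation: the `gear_detector` and `manner_detector` callables are hypothetical stand-ins for the trained models of the first processing module 122 and the second processing module 124.

```python
# Illustrative two-stage frame analysis: the second detector runs only
# on frames in which the first detector has found the safety gear.

def analyse_frame(frame, gear_detector, manner_detector):
    """Return the list of user conditions that are true for this frame."""
    conditions = []
    if not gear_detector(frame):
        # Stage 1: the user is not wearing the safety gear at all.
        conditions.append("not_wearing_gear")
    elif not manner_detector(frame):
        # Stage 2: gear is present but not worn in the predetermined manner.
        conditions.append("not_worn_in_predetermined_manner")
    return conditions
```

The cascade mirrors the text: stage 2 is skipped entirely when stage 1 fails, which avoids spending computation on frames where the gear is absent.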
[031] As illustrated in Figure 1, the system further has a feedback module 130. If the processing unit 120 detects or determines that any one of the one or more user conditions is true, the feedback module 130 receives an input from the processing unit 120. Whenever the feedback module 130 receives an input from the processing unit 120, the feedback module 130 generates an output command. Based on the output command, an appropriate action is taken.
[032] In an embodiment, the output command generated by the feedback module 130 comprises at least one of an indication to the user or limiting a vehicle operating parameter to a predetermined value of the vehicle operating parameter. In operation, if the processing unit 120 determines that the user is not wearing the wearable safety gear, that the wearable safety gear is not in compliance with the bylaws of the geographical location, or that the user is not wearing the wearable safety gear in the predetermined manner, the feedback module 130 generates the output command based on which the user is alerted by an indication. For example, the bylaws of the geographical location may prescribe a specific shape, size, configuration and quality of the wearable safety gear. Further, if the processing unit 120 determines that the user is not wearing the wearable safety gear or is not wearing the wearable safety gear in the predetermined manner, the feedback module 130 is also capable of generating the output command based on which the vehicle operating parameter is limited, for example the speed of the vehicle being limited to a predetermined vehicle speed. In an exemplary embodiment, the predetermined vehicle speed is 30% of the maximum vehicle speed. Restriction of the speed of the vehicle not only reduces the chances of accidents, but also prevents the vehicle from coming to an abrupt stop if the user removes the wearable safety gear during vehicle riding or tampers with the manner in which the safety gear is worn. Eliminating the abrupt stop of the vehicle also reduces the chances of unforeseen accidents. Further, limiting the speed ensures a reduction of the severity of accidents, if any.
[033] In an embodiment, the feedback module 130 is configured to generate the output command for limiting the vehicle operating parameter to the predetermined value after the indication has been communicated to the user for a predetermined period of time. Thus, in operation, for example, if the user is not wearing the wearable safety gear or is not wearing the wearable safety gear in the predetermined manner, an indication is generated for the user for a predetermined time. In an embodiment, the predetermined time ranges from 10 seconds to 5 minutes. If, even after the indication has been generated for the predetermined time, the processing unit 120 determines that the user is not wearing the wearable safety gear or is not wearing it in the predetermined manner, the speed of the vehicle is limited based on the output command from the feedback module 130.
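The graded feedback described above (indication first, speed limitation only after the condition persists) can be sketched as follows. This is an illustrative sketch, not the claimed implementation: the 30% figure is taken from the exemplary embodiment, the 30-second grace period is an assumed value within the 10-second-to-5-minute range given in the text, and the function name is hypothetical.

```python
# Illustrative graded feedback: alert first, then cap speed; never stop.

ALERT_GRACE_SECONDS = 30      # assumed value within the 10 s - 5 min range
SPEED_LIMIT_FRACTION = 0.30   # exemplary 30% of maximum vehicle speed

def feedback_action(condition_true, seconds_elapsed, max_speed):
    """Return (alert_active, speed_cap) for the current monitored state."""
    if not condition_true:
        return (False, max_speed)               # no restriction
    if seconds_elapsed < ALERT_GRACE_SECONDS:
        return (True, max_speed)                # indication only, for now
    return (True, max_speed * SPEED_LIMIT_FRACTION)  # limit, never abrupt stop
```

Note that the cap never reaches zero, reflecting the stated design choice of avoiding the abrupt stop of conventional systems.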
[034] In an embodiment, the wearable safety gear comprises a helmet of the quality, shape and size prescribed by the bylaws of the geographical location. The helmet means any helmet which complies with the bylaws and safety regulations of the region in which the vehicle is being used. The predetermined manner of wearing the wearable safety gear comprises the helmet being effectively positioned. For example, in today's context, the helmet being effectively positioned means a strap of the helmet being in an engaged condition. Thus, for example, the system 100 is configured to determine whether the user is wearing the helmet, as well as whether the user is wearing the helmet with the strap of the helmet engaged. If the user is not wearing a helmet of the prescribed quality, or is wearing the helmet but has not engaged or locked the strap, the processing unit 120 generates an input for the feedback module 130, and the feedback module 130 then generates an output command for the indication to the user and the limitation of vehicle speed.
[035] In a further embodiment, the first processing module 122 comprises a first computational model and the second processing module 124 comprises a second computational model. The first computational model is an artificial intelligence based model having machine learning and deep machine learning capabilities, and is trained based on collected data in respect of bylaws or safety regulations in relation to the wearable safety gear across various geographical locations. The second computational model is an artificial intelligence based model having machine learning and deep machine learning capabilities, and is trained based on collected data in respect of different shapes of wearable safety gears and effective positioning of the wearable safety gear. Further, the first processing module 122 and the second processing module 124 receive input from one or more vehicle sensors in relation to a geographical location, a geographical condition, and one or more climatic conditions, and determine whether the user is wearing the wearable safety gear and whether the user is wearing the wearable safety gear in the predetermined manner. The collected data allows training of the first computational model and the second computational model based on the local laws, regulations and practices of that region, and the different ways in which the wearable safety gear is worn. For example, if, according to the regulations of a local region, wearing of a wearable safety gear is not mandatory, the processing unit 120 will not generate an indication, in accordance with the local bylaws and regulations.
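The region-aware behaviour described above, where the indication is suppressed in regions whose regulations do not mandate the safety gear, can be sketched as follows. This is an illustrative sketch only: the rule table, region names, and defaulting behaviour are hypothetical assumptions, not actual bylaw data.

```python
# Illustrative region-aware indication logic. REGION_RULES is a
# hypothetical stand-in for the trained model's knowledge of local bylaws.

REGION_RULES = {
    "region_a": {"helmet_mandatory": True},
    "region_b": {"helmet_mandatory": False},
}

def indication_required(region, wearing_gear):
    """Indicate only where local rules mandate the gear and it is absent."""
    # Assumed conservative default: treat unknown regions as mandatory.
    rules = REGION_RULES.get(region, {"helmet_mandatory": True})
    return rules["helmet_mandatory"] and not wearing_gear
```

Defaulting unknown regions to "mandatory" is a design choice in this sketch, erring on the side of rider safety when no regional data is available.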
[036] In an embodiment, the system 100 is also configured to create a log record of the number of times, and the duration for which, the user has ridden the vehicle without the wearable safety gear being worn in the predetermined manner. The log can be sent to the user through an application or any other web service.
[037] As further illustrated in the embodiment depicted in Figure 1, the system further comprises an illumination sensor unit 140. The illumination sensor unit 140 is in communication with the processing unit 120 and is configured to detect a level of ambient light around the vehicle. Correspondingly, the processing unit 120 is configured to switch on a vehicle lighting system, or any dedicated light, if the ambient light is below a predetermined threshold value of ambient light. For example, during riding conditions such as overcast conditions or night time riding when the ambient light is low, to ensure that the real time images are captured appropriately, the processing unit 120 switches on the vehicle lighting system, such as a bulb, or increases the brightness of the display cluster to increase the ambient light around the user.
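The ambient-light handling described above reduces to a simple threshold comparison, sketched below. The specific lux threshold and the function name are illustrative assumptions; the specification leaves the predetermined threshold value open.

```python
# Illustrative ambient-light check for the illumination sensor unit 140.

AMBIENT_LUX_THRESHOLD = 50  # hypothetical predetermined threshold of ambient light

def lighting_command(ambient_lux):
    """Return True when the vehicle lighting system should be switched on."""
    return ambient_lux < AMBIENT_LUX_THRESHOLD
```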
[038] As further illustrated in the embodiment depicted in Figure 1, the system 100 further has an auxiliary sensor unit 150. The auxiliary sensor unit 150 is in communication with the processing unit 120 and is configured to detect one or more vehicle parameters. The processing unit 120 receives the one or more vehicle parameters from the auxiliary sensor unit 150 and is configured to determine whether the one or more vehicle parameters are below a first predetermined threshold. If the processing unit 120 determines that the one or more vehicle parameters are below the first predetermined threshold, the processing unit 120 is configured to switch off the one or more image sensors 110 or the illumination sensor unit 140, or to switch off the system 100. In an embodiment, the one or more vehicle parameters comprise a State of Charge (SOC) of a battery of the vehicle, and the predetermined threshold of the SOC of the battery is 15-20%. If, based on the input from the auxiliary sensor unit 150, the processing unit 120 determines that the SOC of the battery is lower than 15-20%, the processing unit 120 switches off the system 100, or switches off the one or more image sensors 110 or the illumination sensor unit 140. Such switching off of the system 100, or of the one or more image sensors 110 and the illumination sensor unit 140, prevents deep discharging of the battery.
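The battery-protection behaviour described above can be sketched as a cut-off check. This is an illustrative sketch: the 15-20% band comes from the text, and treating the upper end of that band as the single cut-off value is an assumption for exposition.

```python
# Illustrative SOC cut-off for the monitoring subsystem: sensors stay on
# only while the battery is above the threshold, preventing deep discharge.

SOC_THRESHOLD_PERCENT = 20  # assumed cut-off at the upper end of the 15-20% band

def monitoring_enabled(state_of_charge_percent):
    """Return True while the image sensors and illumination unit may run."""
    return state_of_charge_percent >= SOC_THRESHOLD_PERCENT
```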
[039] In another aspect, the present invention provides a method 200 for monitoring a wearable safety gear in a vehicle. Figure 2 illustrates the method 200 for monitoring the wearable safety gear in a vehicle. At step 202, the one or more image sensors 110 are activated. The one or more image sensors 110 are activated as soon as the vehicle is started and remain activated during vehicle riding conditions. At step 204, real time images of the user of the vehicle are captured by the one or more image sensors 110. Thereafter, the real time images of the user of the vehicle captured by the one or more image sensors 110 are received by the processing unit 120. In essence, a series of real time images, or a video feed or live feed, of the user riding the vehicle is captured by the one or more image sensors 110 and received by the processing unit 120. The real time images, or video feed or live feed, are a series of individual image frames, which can be analysed for monitoring the wearable safety gear.
[040] Thereafter, the method has the step of determining one or more conditions of the user in relation to the wearable safety gear based on the real time images of the user. In an embodiment, the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, the wearable safety gear not being in compliance with the bylaws of the geographical location, and the user not wearing the wearable safety gear in a predetermined manner. In the embodiment depicted in Figure 2, at step 206, whether the user is wearing the wearable safety gear is determined by one or more processing modules of the processing unit 120 based on the real time images of the user. One or more frames of the real time images are used for determining whether the user is wearing the wearable safety gear. If at step 206 it is determined that one or more conditions of the user in relation to the wearable safety gear is true, for example, the user is not wearing the wearable safety gear, the method moves to step 208. At step 208, when it is determined that the user is not wearing the wearable safety gear, an input is received by the feedback module 130 from the processing unit 120, based on which, at step 210, the feedback module 130 generates an output command based on which the action is taken.
[041] At step 206, if it is determined that the user is wearing the wearable safety gear, the method moves to step 212. At step 212, it is determined whether any of the one or more conditions of the user in relation to the wearable safety gear is true, for example, whether the user is wearing the wearable safety gear in the predetermined manner, based on the real time images of the user. If at step 212 it is determined that the user is not wearing the wearable safety gear in the predetermined manner, the method moves to step 214. At step 214, when it is determined that the user is not wearing the wearable safety gear in the predetermined manner, an input is received by the feedback module 130 from the processing unit 120, based on which the feedback module 130 generates an output command based on which the action is taken. If at step 212 it is determined that none of the conditions of the user in relation to the wearable safety gear are true, i.e. the user is wearing the wearable safety gear in the predetermined manner, no input is received by the feedback module 130 from the processing unit 120, and the method 200 reverts to step 206.
[042] In an embodiment, the method 200 comprises the step of determining, by the first processing module 122 of the processing unit 120, whether the user is wearing the wearable safety gear as per the bylaws of the geographical location, based on the real time images of the user. The method 200 further has the step of determining, by the second processing module 124 of the processing unit 120, whether the user is wearing the wearable safety gear in the predetermined manner based on the real time images of the user. One or more frames from the real time images or live feed are analysed by the first processing module 122 to determine whether the user is wearing the safety gear. If, in a frame, it is determined that the user is wearing the wearable safety gear, that frame is analysed by the second processing module 124 to determine whether the user is wearing the wearable safety gear in the predetermined manner.
[043] In an embodiment, the output command generated by the feedback module 130 comprises at least one of an indication to the user or limiting one or more vehicle operating parameters to a predetermined value of the vehicle operating parameter. In operation, for example, if it is determined that the user is not wearing the wearable safety gear or is not wearing the wearable safety gear in the predetermined manner, the output command is generated based on which the user is alerted. Further, if it is determined that the user is not wearing the wearable safety gear or is not wearing the wearable safety gear in the predetermined manner, the output command is generated based on which the vehicle operating parameter is limited to the predetermined value of the vehicle operating parameter. In an exemplary embodiment, the vehicle operating parameter comprises vehicle speed and the predetermined vehicle speed is 30% of maximum vehicle speed.
[044] In an embodiment, the output command for limiting the one or more vehicle operating parameters to the predetermined value is generated after the indication has been communicated to the user for a period of predetermined time. Thus, in operation, if the user is not wearing the wearable safety gear or is not wearing the wearable safety gear in the predetermined manner, an indication is generated for the user for a predetermined time. If even after the predetermined time of the alert being generated, it is determined that the user is not wearing the wearable safety gear or not wearing the wearable safety gear in the predetermined manner, the speed of the vehicle is restricted based on the output command from the feedback module 130.
[045] In an embodiment, the wearable safety gear comprises the helmet of quality, shape and size prescribed by the bylaws of the geographical location, and the predetermined manner of wearing the wearable safety gear comprises the helmet being effectively positioned. In an embodiment, the helmet being effectively positioned comprises the strap of the helmet being in an engaged condition. For example, if the user is not wearing the helmet, or is wearing the helmet but has not engaged or locked the strap, the input is generated for the feedback module 130 by the processing unit 120, and the output command is then generated by the feedback module 130 for the alert to the user and restriction of vehicle speed.
[046] In a further embodiment, the method 200 has the step of detecting ambient light around the vehicle by the illumination sensor unit 140. Further, the method 200 has the step of switching on a vehicle lighting system if the ambient light is below a predetermined threshold value of ambient light. For example, during riding conditions such as overcast conditions or night time riding conditions when the ambient light is low, to ensure that the one or more real time images are captured appropriately, an illuminator, such as a bulb, is switched on to increase the ambient light around the user.
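The ambient-light check above reduces to a threshold comparison. A minimal sketch, assuming a lux reading from the illumination sensor unit 140 and an illustrative threshold of 50 lux (the actual predetermined threshold value is not specified in the text):

```python
def should_switch_on_lighting(ambient_lux: float,
                              threshold_lux: float = 50.0) -> bool:
    """Switch on the vehicle lighting system when ambient light falls
    below the predetermined threshold (threshold value assumed here)."""
    return ambient_lux < threshold_lux
```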
[047] The method 200 further has the step of detecting one or more vehicle parameters by the auxiliary sensor unit 150. The method further has the steps of determining whether the one or more vehicle parameters are below a first predetermined threshold by the processing unit 120; and switching off the one or more image sensors 110 or the illumination sensor unit 140, if the one or more vehicle parameters are below the first predetermined threshold. In an embodiment, the one or more vehicle parameters comprise the State of Charge (SOC) of a battery of the vehicle, and the predetermined threshold of the SOC of the battery is 15-20%. If it is determined that the SOC of the battery is lower than 15-20%, the one or more image sensors 110 or the illumination sensor unit 140 are switched off. Such switching off of the one or more image sensors 110 and the illumination sensor unit 140 prevents deep discharging of the battery.
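The SOC-based switch-off in paragraph [047] can be sketched as a simple power guard. The 15% threshold is taken from the lower end of the 15-20% range stated above; the function and key names are hypothetical.

```python
def power_guard(soc_percent: float, threshold_percent: float = 15.0) -> dict:
    """Return which units should remain powered given the battery SOC.

    Below the first predetermined threshold, the image sensors (110) and the
    illumination sensor unit (140) are switched off to prevent deep
    discharging of the battery.
    """
    low = soc_percent < threshold_percent
    return {"image_sensors_on": not low, "illumination_unit_on": not low}
```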
[048] In another aspect, the present invention relates to a method 300, 400 for training one or more computational models for monitoring a wearable safety gear. The method steps involved in the method 300, 400 for training the one or more computational models for monitoring the wearable safety gear have been illustrated in Figure 3A and Figure 3B. As illustrated in Figure 3A and 3B, at step 302/402, a data set is collected by the processing unit 120. The data set is in respect of bylaws or safety regulations in relation to wearable safety gear across various geographical locations, different shapes of wearable safety gears and effective positioning of the wearable safety gear. Thereafter, at step 304/404, the data set is processed by the processing unit 120 for further analysis by filtering and transforming the data set. The filtering and transforming is done to improve the quality, reliability and compatibility of the data set.
[049] Thereafter, at step 306/406, the data set is annotated by the processing unit 120. For example, the data set is annotated with specific attributes to the data points to provide context and meaning to the information for easy processing by the processing unit 120. Thereafter at step 308, the annotated data set is fed into the first computational model and at step 310, the first computational model is trained to determine whether the user is wearing the wearable safety gear. Thereafter at step 312, the first computational model is evaluated, after which at step 314, the final first computational model is obtained for determining whether the user is wearing the wearable safety gear, and the first computational model is deployed in the vehicle. Similarly, at step 408, the annotated data set is fed into the second computational model and at step 410, the second computational model is trained to determine whether the user is wearing the wearable safety gear in the predetermined manner. Thereafter at step 412, the second computational model is evaluated, after which at step 414, the final second computational model is obtained for determining whether the user is wearing the wearable safety gear in the predetermined manner, and the second computational model is deployed in the vehicle.
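The training flow of Figures 3A and 3B (collect, process, annotate, feed, train, evaluate, deploy) can be outlined as a skeleton pipeline. This is a structural sketch only: the filtering, annotation, training and evaluation bodies are stubs standing in for the actual computational models, and the acceptance threshold is an assumption.

```python
def train_model(records, label_fn, accept_threshold=0.9):
    """Skeleton of steps 302-314 / 402-414 applied to one computational model.

    records  -- raw collected data set (step 302/402)
    label_fn -- annotation function attaching an attribute to each record
                (step 306/406); hypothetical placeholder
    """
    # Step 304/404: filter and transform to improve quality and compatibility.
    processed = [r for r in records if r is not None]
    # Step 306/406: annotate each data point with context and meaning.
    annotated = [(r, label_fn(r)) for r in processed]
    # Steps 308-310 / 408-410: feed the annotated set and "train" (stubbed).
    model = {"examples": len(annotated)}
    # Step 312/412: evaluate (stubbed as a trivial accuracy figure).
    accuracy = 1.0 if annotated else 0.0
    # Step 314/414: deploy in the vehicle only if evaluation passes.
    deployed = accuracy >= accept_threshold
    return model, deployed
```

The same pipeline would be run twice on the annotated data set: once for the first computational model (is the safety gear worn at all) and once for the second (is it worn in the predetermined manner).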
[050] Figure 4 illustrates a process flow of the present invention in accordance with an embodiment of the invention. In operation, for example, the one or more image sensors 110 capture real time images of the user. The one or more image sensors 110 capture the real time images of the user as a video stream, which is then converted to video input, and this video input is then encoded for further processing. Thereafter, the processing unit 120 breaks down the encoded video input into a plurality of frames by a frame grabber functionality. Thereafter each of the frames is processed through the digital image processing. Thereafter, the processing unit 120 detects whether the user is wearing the helmet through the helmet detection functionality using the artificial intelligence module, i.e. the first processing module 122. Thereafter, the processing unit 120 determines whether the user is wearing the helmet in the predetermined manner through the helmet strap detection functionality using the artificial intelligence module, i.e. the second processing module 124. Based on the inputs from the processing unit 120, the feedback module 130 generates indications for the user using one or more of a voice alert functionality, a haptic alert functionality or a display alert functionality.
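The per-frame flow of Figure 4 (frame grabber, helmet detection by the first processing module 122, strap detection by the second processing module 124, then feedback) can be sketched as a generator over grabbed frames. The two detector callables are hypothetical stand-ins for the trained computational models, not the actual artificial intelligence modules of the invention.

```python
def process_stream(frames, detect_helmet, detect_strap):
    """Yield an alert decision for each frame of the encoded video input.

    detect_helmet -- first-stage check (helmet detection functionality)
    detect_strap  -- second-stage check (helmet strap detection functionality)
    """
    for frame in frames:
        if not detect_helmet(frame):
            # Helmet absent: feedback module issues a voice/haptic/display alert.
            yield "alert: no helmet"
        elif not detect_strap(frame):
            # Helmet present but not worn in the predetermined manner.
            yield "alert: strap not engaged"
        else:
            yield "ok"
```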
[051] Figure 5 illustrates the software architecture in relation to the present invention. As illustrated in Figure 5, the software architecture has a vision processing unit 170. The vision processing unit 170 is operatively coupled to a plurality of microservices 172, namely microservice 1, microservice 2 and microservice 3. Herein, microservices are predefined libraries for supporting and enabling the functioning of the software architecture. Microservice 1 is in relation to capturing of the real time images using a hardware 178 such as the one or more image sensors. In operation, the microservice 1 receives the real time images from the hardware 178 through a hardware abstraction layer 176 and an operating system 174, and communicates the real time images to the vision processing unit 170. The first processing module 122 and the second processing module 124 coupled with the vision processing unit 170 determine whether the user is wearing the wearable safety gear and whether the user is wearing the wearable safety gear in the predetermined manner. Similarly, microservice 2 is in relation to detection of ambient light, wherein the microservice 2 receives input from the relevant hardware 178, namely the illumination sensor unit 140, through the operating system 174 and the hardware abstraction layer 176, to be sent to the vision processing unit 170 for detection of ambient light. Based on the detection of the ambient light, the processing unit 120 determines whether to switch on the vehicle lighting system. Similarly, microservice 3 is in relation to detection of the state of charge of the battery, wherein microservice 3 receives input from relevant hardware 178 through the hardware abstraction layer 176 and the operating system 174 to be sent to the vision processing unit 170 for detection of the state of charge of the battery. Based on the detection of the state of charge of the battery, the processing unit 120 determines whether to switch off the system or the illumination sensor unit 140.
Further, a debug module 180 is provided for debugging the software as per requirement.
[052] Advantageously, the present invention provides a system and method for monitoring a wearable safety gear which is capable of determining whether the user is wearing the safety gear, as well as determining whether the user is wearing the safety gear in the predetermined manner. The present invention provides a real time feedback to the user based on real time images of the user, thus providing instant alerts/speed reduction of the vehicle if the user is not wearing the wearable safety gear or is not wearing the wearable safety gear in the predetermined manner. The reduction in speed and alerts greatly reduce the chances of accidents.
[053] The present invention ensures that the vehicle does not come to an abrupt stop if the user removes the wearable safety gear or tampers with the wearable safety gear during vehicle riding, which reduces the chances of unforeseen accidents. The present invention provides for the system and method to be synchronized with the local laws, regulations and practices.
[054] Furthermore, the present invention reduces the dependency on smart devices such as smart helmets, which reduces the chances of disruption and interdependence of multiple hardware devices. The limitation of the smart helmets being configured for a specific vehicle is also obviated by the present invention. Situations such as the user forgetting to carry the specific smart device, leading to incapacitation of the vehicle, are also eliminated in the present invention. The reduced interdependence also reduces the time lag in the detection of whether the user is wearing the wearable safety gear and whether the user is wearing the safety gear in the predetermined manner. Further, the present invention also allows for faster processing of the real time images of the user for detection of the one or more conditions of the user in relation to the wearable safety gear, which not only reduces the required processing capabilities and processing time, but also enhances safety.
[055] In light of the abovementioned advantages and the technical advancements provided by the disclosed system and method, the claimed steps as discussed above are not routine, conventional, or well understood in the art, as the claimed steps provide solutions to the existing problems in conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the system itself as the claimed steps provide a technical solution to a technical problem.
[056] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[057] While the present invention has been described with respect to certain embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.
List of Reference Numerals
100: System for Monitoring a Wearable Safety Gear in a vehicle
110: One or more image sensors
120: Processing Unit
122: First Processing Module
124: Second Processing Module
130: Feedback Module
140: Illumination Sensor Unit
150: Auxiliary Sensor Unit
170: Vision Processing Unit
172: Microservices
174: Operating System
176: Hardware Abstraction Layer
178: Hardware
180: Debug Module
200: Method for Monitoring a Wearable Safety Gear in a vehicle
300, 400: Method for training one or more computational models for monitoring a Wearable Safety Gear
WE CLAIM:
1. A system (100) for monitoring a wearable safety gear in a vehicle, the system (100) comprising:
one or more image sensors (110), the one or more image sensors (110) being configured to capture real time images of a user riding the vehicle;
a processing unit (120), the processing unit (120) being configured to receive the real time images of the user from the one or more image sensors (110), and the processing unit (120) having one or more processing modules, the one or more processing modules being configured to determine one or more conditions of the user in relation to the wearable safety gear based on the real time images of the user; and
a feedback module (130), the feedback module (130) being configured to receive an input from the processing unit (120) if any of the one or more conditions of the user in relation to the wearable safety gear is true, and the feedback module (130) being configured to generate an output command.
2. The system (100) as claimed in claim 1, wherein the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, the wearable safety gear not being in compliance with the bylaws of the geographical location, and the user not wearing the wearable safety gear in a predetermined manner.
3. The system (100) as claimed in claim 2, wherein the processing unit (120) comprises a plurality of processing modules having at least a first processing module (122) and a second processing module (124), wherein the first processing module (122) being configured to determine whether the user is wearing the wearable safety gear as per the bylaws of the geographical location based on the real time images of the user, and the second processing module (124) being configured to determine whether the user is wearing the wearable safety gear in the predetermined manner based on the real time images of the user.
4. The system (100) as claimed in claim 1, wherein the output command generated by the feedback module (130) comprises at least one of an indication to the user or limiting one or more vehicle operating parameters to a predetermined value of the vehicle operating parameter.
5. The system (100) as claimed in claim 4, wherein the feedback module (130) being configured to generate the output command for limiting a vehicle operating parameter to the predetermined value after the indication has been communicated to the user for a period of predetermined time.
6. The system (100) as claimed in claim 2, wherein the wearable safety gear comprises a helmet of quality, shape and size prescribed by the bylaws of the geographical location, and the predetermined manner of wearing the wearable safety gear comprises the helmet being effectively positioned.
7. The system (100) as claimed in claim 3, wherein the first processing module (122) comprises a first computational model and the second processing module (124) comprises a second computational model, the first computational model is trained based on collected data in respect of bylaws or safety regulations in relation to wearable safety gear across various geographical locations, and the second computational model is trained based on collected data in respect of different shapes of wearable safety gears and effective positioning of the wearable safety gear; wherein the first processing module (122) and the second processing module (124) receive input from one or more vehicle sensors in relation to a geographical location, a geographical condition, and one or more climatic conditions and determine whether the user is wearing the wearable safety gear and whether the user is wearing the wearable safety gear in the predetermined manner.
8. The system (100) as claimed in claim 1, comprising an illumination sensor unit (140), the illumination sensor unit (140) being in communication with the processing unit (120) and being configured to detect a level of ambient light around the vehicle, and the processing unit (120) being configured to switch on a vehicle lighting system if the ambient light is below a predetermined threshold value of ambient light.
9. The system (100) as claimed in claim 8, comprising an auxiliary sensor unit (150), the auxiliary sensor unit (150) being in communication with the processing unit (120) and being configured to detect one or more vehicle parameters; and
the processing unit (120) being configured to:
determine whether the one or more vehicle parameters are below a first predetermined threshold; and
switch off the one or more image sensors (110) or the illumination sensor unit (140) or switch off the system (100), if the one or more vehicle parameters are below the first predetermined threshold.
10. The system (100) as claimed in claim 3, wherein the processing unit (120) comprises a vision processing unit (170) in communication with the first processing module (122) and the second processing module (124), the vision processing unit (170) being configured to receive inputs from a hardware (178) through an operating system (174) and a hardware abstraction layer (176).
11. A method (200) for monitoring a wearable safety gear in a vehicle, the method comprising the steps of:
capturing, by one or more image sensors (110), real time images of a user riding the vehicle;
receiving, by a processing unit (120), real time images of the user riding the vehicle captured by the one or more image sensors (110);
determining, by one or more processing modules of the processing unit (120), one or more conditions of a user in relation to the wearable safety gear based on the real time images of the user;
receiving, by a feedback module (130), an input from the processing unit (120) if any one of the one or more conditions of the user in relation to the wearable safety gear is true; and
generating, by the feedback module (130), an output command.
12. The method as claimed in claim 11, wherein the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, the wearable safety gear not being in compliance with the bylaws of the geographical location, and the user not wearing the wearable safety gear in a predetermined manner.
13. The method as claimed in claim 12, comprising the steps of:
determining, by a first processing module (122) of the processing unit (120), whether the user is wearing the wearable safety gear as per the bylaws of the geographical location based on the real time images of the user; and
determining, by a second processing module (124) of the processing unit (120), whether the user is wearing the wearable safety gear in the predetermined manner based on the real time images of the user.
14. The method (200) as claimed in claim 11, wherein the output command comprises at least one of an indication to the user or limiting one or more vehicle operating parameters to a predetermined value of the vehicle operating parameter.
15. The method (200) as claimed in claim 14, wherein an output command for limiting the vehicle operating parameter to the predetermined value is generated after the indication has been communicated to the user for a period of predetermined time.
16. The method (200) as claimed in claim 11, wherein the wearable safety gear comprises a helmet of quality, shape and size prescribed by the bylaws of the geographical location, and the predetermined manner of wearing the wearable safety gear comprises the helmet being effectively positioned.
17. The method (200) as claimed in claim 11, comprising the steps of:
detecting, by an illumination sensor unit (140), a level of ambient light around the vehicle; and
switching on, by the processing unit (120), a vehicle lighting system if the ambient light is below a predetermined threshold value of ambient light.
18. The method (200) as claimed in claim 17, comprising the steps of:
detecting, by an auxiliary sensor unit (150), one or more vehicle parameters;
determining, by the processing unit (120), whether the one or more vehicle parameters are below a first predetermined threshold; and
switching off, by the processing unit (120), the one or more image sensors (110) or the illumination sensor unit (140) if the one or more vehicle parameters are below the first predetermined threshold.
19. A method (300, 400) for training one or more computational models for monitoring a wearable safety gear, comprising the steps of:
collecting, by a processing unit (120), a data set in respect of bylaws or safety regulations in relation to wearable safety gear across various geographical locations, and in respect of different shapes of wearable safety gears and effective positioning of the wearable safety gear;
processing, by the processing unit (120), the data set for further analysis by filtering and transforming of the data set;
annotating, by the processing unit (120), the data set;
feeding, by the processing unit (120), the annotated data set into a first computation model and a second computation model;
training, by the processing unit (120), the first computation model to determine whether the user is wearing the wearable safety gear; and
training, by the processing unit (120), the second computation model to determine whether the user is wearing the wearable safety gear in a predetermined manner.
Dated this 6th day of July 2023
TVS MOTOR COMPANY LIMITED
By their Agent & Attorney
(Nikhil Ranjan)
of Khaitan & Co
Reg No IN/PA-1471