
“Method And System For Detecting Event For Alerting Users Associated With An Automotive Vehicle”

Abstract: Disclosed herein is a method and a system (101) for detecting an event for alerting one or more users (103) associated with an automotive vehicle (102). Here, a control unit (107) in the automotive vehicle (102) receives tilt angle data (111) associated with head movement of a user (1031) driving the automotive vehicle (102) from a sensor (1091) configured in a wearable product (109) worn by the user. Further, the control unit (107) compares the tilt angle data (111) with a predefined range of angle values. When the tilt angle data (111) deviates from the predefined range of angle values based on the comparison for a preset time duration, the control unit (107) detects the event associated with drowsiness state (113) of the user. Upon detecting occurrence of the event associated with the drowsiness state (113) of the user, the control unit (107) activates one or more alert generating components (105) in the automotive vehicle (102) for alerting the one or more users (103). Fig. 1a


Patent Information

Application #:
Filing Date: 04 December 2020
Publication Number: 23/2022
Publication Type: INA
Invention Field: ELECTRONICS
Status:
Email: ipo@knspartners.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2025-05-07
Renewal Date:

Applicants

TATA MOTORS LIMITED
Bombay House, 24 Homi Mody Street, Hutatma Chowk, Mumbai – 400 001, Maharashtra, India

Inventors

1. Vivek Jagannath Shinde
C/o. Tata Motors Limited, Bombay House, 24 Homi Mody Street, Hutatma Chowk, Mumbai – 400 001, Maharashtra, India

Specification

Claims:
We claim:
1. A method of detecting an event for alerting one or more users 103 associated with an automotive vehicle 102, the method comprising:
receiving, by a control unit 107 in the automotive vehicle 102, tilt angle data 111 associated with head movement of a user 1031 driving the automotive vehicle 102, from a sensor 1091 configured in a wearable product 109 associated with the user;
comparing, by the control unit 107, the tilt angle data 111 with a predefined range of angle values stored in a memory associated with the control unit 107;
detecting, by the control unit 107, an event associated with drowsiness state 113 of the user when the tilt angle data 111 deviates from the predefined range of angle values based on the comparison for a preset time duration; and
activating, by the control unit 107, one or more alert generating components 105 in the automotive vehicle 102 upon detecting occurrence of the event associated with the drowsiness state 113 of the user for alerting the one or more users 103.

2. The method as claimed in claim 1 further comprises:
detecting, by the control unit 107, one or more sound signals 121 associated with the one or more users 103 of the automotive vehicle 102 from a microphone 115 associated with the control unit 107;
comparing, by the control unit 107, intensity of the one or more sound signals 121 with a predefined intensity value; and
activating, by the control unit 107, the one or more alert generating components 105 in the automotive vehicle 102 when the intensity of the one or more sound signals 121 exceeds the predefined intensity value.

3. The method as claimed in claim 1 further comprises:
receiving, by the control unit 107, measured real time pulse rate data 123 associated with the user through a pulse rate sensor 117 associated with the user 1031 driving the automotive vehicle 102;
comparing, by the control unit 107, the pulse rate data 123 with a predefined pulse rate value;
detecting, by the control unit 107, health issue of the user when the pulse rate data 123 exceeds the predefined pulse rate value; and
activating, by the control unit 107, the one or more alert generating components 105 in the automotive vehicle 102 based on the detected health issue of the user.

4. The method as claimed in claim 3 further comprises:
notifying, by the control unit 107, one or more health care facilities 127 about the detected health issue of the user for assisting the user.

5. The method as claimed in claim 1, wherein activating the one or more alert generating components 105 comprises:
activating, by the control unit 107, a speaker 1051 associated with the control unit 107 to generate a sound alert inside the automotive vehicle 102;
activating, by the control unit 107, vibration of a steering wheel 1052 of the automotive vehicle 102;
activating, by the control unit 107, one or more illuminating devices 1053 provided on interior part of the automotive vehicle 102 to generate a predefined luminous intensity inside the automotive vehicle 102; and
activating, by the control unit 107, one or more illuminating devices 1054 provided on exterior part of the automotive vehicle 102.

6. The method as claimed in claim 1 further comprises:
transmitting, by the control unit 107, in real-time one or more video frames 112, associated with an exterior environment of the automotive vehicle 102, to a server 125 for further analysis to assist the one or more users 103, wherein the one or more video frames 112 are captured while driving the automotive vehicle 102 by a video capturing unit 1092 configured in the wearable product 109.

7. A system 101 for detecting an event for alerting one or more users 103 associated with an automotive vehicle 102, the system 101 comprises:
a wearable product 109, associated with a user, comprising a sensor 1091;
one or more alert generating components 105 associated with the automotive vehicle 102; and
a control unit 107, communicatively coupled to the sensor 1091 and the one or more alert generating components 105, wherein the control unit 107 is configured to:
receive tilt angle data 111 associated with head movement of the user 1031 driving the automotive vehicle 102 from the sensor 1091;
compare the tilt angle data 111 with a predefined range of angle values stored in a memory associated with the control unit 107;
detect an event associated with drowsiness state 113 of the user when the tilt angle data 111 deviates from the predefined range of angle values based on the comparison for a preset time duration; and
activate the one or more alert generating components 105 in the automotive vehicle 102 upon detecting occurrence of the event associated with the drowsiness state 113 of the user for alerting the one or more users 103.

8. The system 101 as claimed in claim 7, wherein the control unit 107 is configured to:
detect one or more sound signals 121 associated with the one or more users 103 of the automotive vehicle 102 from a microphone 115 associated with the control unit 107;
compare intensity of the one or more sound signals 121 with a predefined intensity value; and
activate the one or more alert generating components 105 in the automotive vehicle 102 when the intensity of the one or more sound signals 121 exceeds the predefined intensity value.

9. The system 101 as claimed in claim 7, wherein the control unit 107 is configured to:
receive measured real time pulse rate data 123 associated with the user through a pulse rate sensor 117 associated with the user 1031 driving the automotive vehicle 102;
compare the pulse rate data 123 with a predefined pulse rate value;
detect health issue of the user when the pulse rate data 123 exceeds the predefined pulse rate value; and
activate the one or more alert generating components 105 in the automotive vehicle 102 based on the detected health issue of the user.

10. The system 101 as claimed in claim 9, wherein the control unit 107 is configured to:
notify one or more health care facilities 127 about the detected health issue of the user for assisting the user.

11. The system 101 as claimed in claim 7, wherein the control unit 107 is configured to:
activate a speaker 1051 associated with the control unit 107 to generate a sound alert inside the automotive vehicle 102;
activate vibration of a steering wheel 1052 of the automotive vehicle 102;
activate one or more illuminating devices 1053 provided on interior part of the automotive vehicle 102 to generate a predefined luminous intensity inside the automotive vehicle 102; and
activate one or more illuminating devices 1054 provided on exterior part of the automotive vehicle 102.

12. The system 101 as claimed in claim 7, wherein the control unit 107 is configured to:
transmit in real-time one or more video frames 112, associated with an exterior environment of the automotive vehicle 102, to a server 125 for further analysis to assist the one or more users 103, wherein the one or more video frames 112 are captured while driving the automotive vehicle 102 by a video capturing unit 1092 configured in the wearable product 109.
Description:
TECHNICAL FIELD
The present subject matter is generally related to Vehicle Safety Technology (VST) in automotive industry and more particularly, but not exclusively, to a method and a system for detecting an event for alerting one or more users associated with an automotive vehicle.
BACKGROUND
Road accidents involving automotive vehicles frequently occur late at night, early in the morning, and in the late afternoon. Most of these accidents are caused by drowsy driving, which occurs when the user driving the automotive vehicle is too tired to remain alert. More specifically, a drowsiness state of the user is associated with slow reaction times, reduced vigilance, and impaired thinking. If the user feels drowsy while driving the automotive vehicle, the user may not properly control the steering wheel of the automotive vehicle. In a worst-case scenario, the user may fall asleep, leading to fatalities for the user of the automotive vehicle and any passengers accompanying the user. In some cases, the automotive vehicle collides with other neighboring automotive vehicles, thereby adversely affecting nearby road users as well.
Further, in some scenarios, critical health issues may suddenly arise while the user is driving the automotive vehicle. For example, the user may suffer a stroke or feel breathless while driving. If the passengers accompanying the user remain unaware of the user's health issue, this may lead to a fatal accident. Moreover, in such a panic situation, no visual alert is provided to nearby road users to help avert multiple accidents.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
SUMMARY
One or more shortcomings of the prior art are overcome by a system and a method as claimed and additional advantages are provided through the system and the method as claimed in the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
In one non-limiting embodiment of the present disclosure, a method of detecting an event for alerting one or more users associated with an automotive vehicle is disclosed. The method comprises receiving, by a control unit in the automotive vehicle, tilt angle data associated with head movement of a user driving the automotive vehicle. The tilt angle data is received from a sensor configured in a wearable product associated with the user. Thereafter, the method comprises comparing, by the control unit, the tilt angle data with a predefined range of angle values stored in a memory associated with the control unit. Further, the method comprises detecting, by the control unit, the event associated with drowsiness state of the user when the tilt angle data deviates from the predefined range of angle values based on the comparison for a preset time duration. Thereafter, the method comprises activating, by the control unit, one or more alert generating components in the automotive vehicle upon detecting occurrence of the event associated with the drowsiness state of the user. The one or more alert generating components are activated in the automotive vehicle for alerting the one or more users regarding the drowsiness state of the user.
In an embodiment of the present disclosure, the method comprises detecting, by the control unit, one or more sound signals associated with the one or more users of the automotive vehicle from a microphone associated with the control unit. Further, the method comprises comparing, by the control unit, intensity of the one or more sound signals with a predefined intensity value. Thereafter, the method comprises activating, by the control unit, the one or more alert generating components in the automotive vehicle when the intensity of the one or more sound signals exceeds the predefined intensity value.
In another embodiment of the present disclosure, the method comprises receiving, by the control unit, measured real time pulse rate data associated with the user through a pulse rate sensor associated with the user driving the automotive vehicle. Further, the method comprises comparing, by the control unit, the pulse rate data with a predefined pulse rate value. Further, the method comprises detecting, by the control unit, a health issue of the user when the pulse rate data exceeds the predefined pulse rate value. Thereafter, the method comprises activating, by the control unit, the one or more alert generating components in the automotive vehicle based on the detected health issue of the user. Here, the method also comprises notifying, by the control unit, one or more health care facilities about the detected health issue of the user for assisting the user.
In an embodiment of the present disclosure, the method of activating the one or more alert generating components comprises activating, by the control unit, a speaker associated with the control unit to generate a sound alert inside the automotive vehicle. The method also comprises activating, by the control unit, vibration of a steering wheel of the automotive vehicle. The method also comprises activating, by the control unit, one or more illuminating devices provided on interior part of the automotive vehicle to generate a predefined luminous intensity inside the automotive vehicle. The method also comprises activating, by the control unit, one or more illuminating devices provided on exterior part of the automotive vehicle.

In some embodiments of the present disclosure, the method comprises transmitting, by the control unit, in real-time one or more video frames, associated with an exterior environment of the automotive vehicle, to a server for further analysis to assist the one or more users. Here, the one or more video frames are captured while driving the automotive vehicle by a video capturing unit configured in the wearable product.

In another non-limiting embodiment of the present disclosure, a system for detecting an event for alerting one or more users associated with an automotive vehicle is disclosed. The system comprises a wearable product associated with a user. The wearable product comprises a sensor to detect tilt angle data associated with head movement of the user. The system also comprises one or more alert generating components associated with the automotive vehicle. Further, the system comprises a control unit which is communicatively coupled to the sensor and the one or more alert generating components. In the automotive vehicle, the control unit receives the tilt angle data associated with head movement of the user who is driving the automotive vehicle, from the sensor. Further, the control unit compares the tilt angle data with a predefined range of angle values stored in a memory associated with the control unit. When the tilt angle data deviates from the predefined range of angle values based on the comparison for a preset time duration, the control unit detects the event associated with the drowsiness state of the user. Upon detecting occurrence of the event associated with the drowsiness state of the user, the control unit activates the one or more alert generating components in the automotive vehicle for alerting the one or more users.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The novel features and characteristic of the disclosure are set forth in the appended claims. The disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying figures. One or more embodiments are now described, by way of example only, with reference to the accompanying figures wherein like reference numerals represent like elements and in which:
Fig.1a shows an exemplary architecture for detecting an event for alerting one or more users associated with an automotive vehicle in accordance with some embodiments of the present disclosure.
Fig.1b shows exemplary illustration of detecting event associated with drowsiness state of a user driving an automotive vehicle by a control unit of a system in accordance with some embodiments of the present disclosure.
Fig.1c shows a block diagram of a system for detecting an event for alerting one or more users associated with an automotive vehicle in accordance with some embodiments of the present disclosure.
Fig.1d shows exemplary illustration of activating one or more alert generating components of an automotive vehicle in multiple exemplary scenarios in accordance with some embodiments of the present disclosure.
Fig.2 shows a flow chart illustrating a method of detecting an event for alerting one or more users associated with an automotive vehicle in accordance with some embodiments of the present disclosure.
The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the system and method illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the specific forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, “includes”, “including”, or any other variations thereof are intended to cover a non-exclusive inclusion, such that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup, device, or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other or additional elements in the system or method.
The present disclosure relates to a method and a system for detecting an event for alerting one or more users associated with an automotive vehicle. A user may feel drowsy while driving the automotive vehicle. During driving, a control unit in the automotive vehicle may receive tilt angle data associated with head movement of the user. The tilt angle data may be received from a sensor configured in a wearable product associated with the user. Thereafter, the control unit may compare the tilt angle data with a predefined range of angle values. Further, the control unit may detect the event associated with drowsiness/distracted state of the user, when the tilt angle data deviates from the predefined range of angle values based on the comparison for a preset time duration. Upon detecting occurrence of the event associated with the drowsiness state of the user, the control unit may activate one or more alert generating components in the automotive vehicle for alerting the one or more users. Here, the control unit may activate a speaker associated with the control unit to generate a sound alert inside the automotive vehicle. Further, the control unit may activate vibration of a steering wheel of the automotive vehicle. Also, the control unit may activate one or more illuminating devices provided on interior part of the automotive vehicle to generate a predefined luminous intensity inside the automotive vehicle and one or more illuminating devices provided on exterior part of the automotive vehicle.
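The detection flow described above can be sketched as a short Python routine. The threshold values, the sample format, and the function name below are illustrative assumptions; the disclosure leaves the actual range of angle values and the preset time duration as configurable parameters.

```python
# Hypothetical thresholds, chosen for illustration only; the disclosure
# leaves the actual range and duration configurable.
ANGLE_RANGE = (-10.0, 10.0)   # predefined range of head-tilt angles, degrees
PRESET_DURATION = 2.0         # seconds the tilt must stay out of range

def detect_drowsiness(samples):
    """Return True when the tilt angle stays outside ANGLE_RANGE for at
    least PRESET_DURATION seconds. `samples` is a time-ordered list of
    (timestamp_seconds, angle_degrees) pairs from the wearable sensor."""
    deviation_start = None
    for t, angle in samples:
        if ANGLE_RANGE[0] <= angle <= ANGLE_RANGE[1]:
            deviation_start = None          # head back within range; reset
        elif deviation_start is None:
            deviation_start = t             # deviation begins
        elif t - deviation_start >= PRESET_DURATION:
            return True                     # sustained deviation -> event
    return False
```

A sustained out-of-range reading triggers the event, while a momentary head movement that returns within the range resets the timer, which mirrors the preset-time-duration condition in the claims.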
In an embodiment, the control unit may detect one or more sound signals associated with the one or more users of the automotive vehicle. The one or more sound signals may be received from a microphone associated with the control unit. Further, the control unit may compare intensity of the one or more sound signals with a predefined intensity value. Thereafter, the control unit may activate the one or more alert generating components in the automotive vehicle. The one or more alert generating components may be activated when the intensity of the one or more sound signals exceeds the predefined intensity value. To alert the one or more users associated with the automotive vehicle, the control unit may activate a speaker associated with the control unit to generate a sound alert inside the automotive vehicle. Further, the control unit may activate vibration of a steering wheel of the automotive vehicle to alert the user driving the automotive vehicle. Also, the control unit may activate one or more illuminating devices provided on interior part of the automotive vehicle to generate a predefined luminous intensity inside the automotive vehicle and one or more illuminating devices provided on exterior part of the automotive vehicle.
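A minimal sketch of the sound-intensity check is shown below, assuming a hypothetical threshold of 85 dB; the disclosure does not fix a numeric intensity value, and the function name is a stand-in.

```python
# Illustrative panic-scream threshold in decibels (an assumption; the
# disclosure only speaks of a "predefined intensity value").
PREDEFINED_INTENSITY_DB = 85.0

def should_alert_on_sound(sound_levels_db):
    """Return True when any sampled sound intensity from the cabin
    microphone exceeds the predefined value (e.g. a scream)."""
    return any(level > PREDEFINED_INTENSITY_DB for level in sound_levels_db)
```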
In another embodiment, the control unit may receive measured real time pulse rate data associated with the user. The pulse rate data may be received through a pulse rate sensor, which may be associated with the user driving the automotive vehicle. Further, the control unit may compare the pulse rate data with a predefined pulse rate value. When the pulse rate data exceeds the predefined pulse rate value, the control unit may detect health issue of the user. Thereafter, the control unit may activate one or more alert generating components in the automotive vehicle based on the detected health issue of the user. Here, the control unit may activate a speaker associated with the control unit to generate a sound alert inside the automotive vehicle. Further, the control unit may activate vibration of a steering wheel of the automotive vehicle. Also, the control unit may activate one or more illuminating devices provided on interior part of the automotive vehicle to generate a predefined luminous intensity inside the automotive vehicle and one or more illuminating devices provided on exterior part of the automotive vehicle. In an embodiment, the control unit may notify one or more health care facilities for assisting the user when the pulse rate data exceeds the predefined pulse rate value. The one or more health care facilities may be notified about the detected health issue of the user.
In another embodiment, the control unit may transmit in real-time one or more video frames associated with an exterior environment of the automotive vehicle. The one or more video frames may be transmitted to a server for further analysis to assist the one or more users. Here, the one or more video frames may be captured while driving the automotive vehicle by a video capturing unit, which may be configured in the wearable product.
In this manner, the present disclosure provides a method and a system for alerting one or more users associated with an automotive vehicle. In the disclosed system, the one or more alert generating components are activated to alert not only the user driving the automotive vehicle but also the passengers accompanying the user and nearby road users. The system wakes up the user to prevent drowsy driving so that the automotive vehicle can be parked safely. This prevents the probable occurrence of multiple accidents due to collision of the automotive vehicle with one or more other automotive vehicles on the road, thereby avoiding damage to roadside infrastructure and saving lives.
Fig.1a shows an exemplary architecture for detecting an event for alerting one or more users associated with an automotive vehicle in accordance with some embodiments of the present disclosure.
As shown in Fig.1a, the architecture 100 may include one or more users 103 (user 1031 to user 1033) associated with an automotive vehicle 102, and a system 101 for detecting an event for alerting the one or more users 103. Here, the one or more users 103 may include the user 1031 driving the automotive vehicle 102 (driver 1031), one or more passengers 1032, 1033 accompanying the driver 1031, and nearby passengers travelling on a road (not shown in figure).
In the automotive vehicle 102, the system 101 may comprise a wearable product 109 [shown in Fig.1b] associated with the user 1031 driving the automotive vehicle 102. As an example, the wearable product 109 may be mounted on an ear of the user 1031. Further, the wearable product 109 may be wirelessly connected with an In-Vehicle Infotainment (IVI) system (not shown in figure) of the automotive vehicle 102. As an example, the wearable product 109 may be paired with an IVI system controller (not shown in figure) via communication protocols, which may include, but are not limited to, Bluetooth or Wi-Fi. Further, the wearable product 109 may include, but is not limited to, a sensor 1091 and a video capturing unit 1092 [shown in Fig.1b]. For example, the sensor 1091 may include, but is not limited to, an angle sensor and a posture sensor. The sensor 1091 may be configured to detect tilt angle data 111 [as indicated in Fig.1b] associated with head movement of the user 1031 and send the detected tilt angle data 111 to a control unit 107 [as shown in Fig.1b] of the system 101. Further, the video capturing unit 1092 may be configured to capture one or more video frames associated with an exterior environment of the automotive vehicle 102 and send the captured one or more video frames to the control unit 107 of the system 101. As an example, the video capturing unit 1092 may include, but is not limited to, a mini cam recorder.
Further, the system 101 may comprise one or more alert generating components 105 associated with the automotive vehicle 102. The one or more alert generating components 105 may include, but are not limited to, a speaker 1051 associated with the control unit 107, a steering wheel 1052 of the automotive vehicle 102, one or more illuminating devices 1053 provided on an interior part of the automotive vehicle 102, and one or more illuminating devices 1054 provided on an exterior part of the automotive vehicle 102. The one or more alert generating components 105 may be activated for alerting the one or more users 103 associated with the automotive vehicle 102 upon receiving an actuating signal from the control unit 107 of the system 101.
Further, the control unit 107 may be communicatively coupled to the sensor 1091 of the wearable product 109 and the one or more alert generating components 105 of the automotive vehicle 102. The control unit 107 may be configured in a telematics system (not shown in figure) of the automotive vehicle 102. Alternatively, the control unit 107 may be configured as a separate hardware device, which may be communicatively coupled to the telematics system of the automotive vehicle 102. The telematics system may send a notification indicating activation of the sensor 1091 and the video capturing unit 1092 to a user device (not shown in figure) upon mounting of the wearable product 109 on an ear of the user 1031 driving the automotive vehicle 102. As an example, the user device may include, but is not limited to, a mobile phone, a smart phone, a tablet, and the like. In an embodiment, upon receipt of the sensor activation notification from the telematics system, the user 1031 may be permitted to crank an Internal Combustion Engine (ICE) of the automotive vehicle 102 by an ignition interlock device. While the user 1031 is driving the automotive vehicle 102, the control unit 107 may continuously receive tilt angle data 111 associated with head movement of the user 1031 from the sensor 1091 of the wearable product 109. The control unit 107 may compare the received tilt angle data 111 with a predefined range of angle values stored in a memory associated with the control unit 107. When the tilt angle data 111 deviates from the predefined range of angle values based on the comparison for a preset time duration, the control unit 107 may detect the event associated with the drowsiness/distracted state 113 of the user 1031 [as indicated in Fig.1b]. Upon detecting the drowsiness state 113 of the user 1031, the control unit 107 may activate the one or more alert generating components 105 in the automotive vehicle 102 for alerting the one or more users 103 [shown in Fig.1a].
In an embodiment, the system 101 may receive one or more sound signals associated with the one or more users 103 of the automotive vehicle 102 from a microphone associated with the control unit 107. For example, the one or more sound signals may include screaming. Upon receiving the one or more sound signals, the system 101 may compare the intensity of the one or more sound signals with a predefined intensity value to detect whether the received one or more sound signals are associated with a panic state or inconvenience of the one or more users 103 of the automotive vehicle 102. When the intensity of the one or more sound signals exceeds the predefined intensity value, the system 101 may activate the one or more alert generating components 105 in the automotive vehicle 102.
In another embodiment, the system 101 may receive measured real time pulse rate data associated with the user 1031 through a pulse rate sensor associated with the user 1031 driving the automotive vehicle 102. As an example, the pulse rate sensor may be configured in a Fitbit band worn on a wrist of the user 1031 driving the automotive vehicle 102. Upon receiving the measured real time pulse rate data, the system 101 may compare the pulse rate data with a predefined pulse rate value. For example, the predefined pulse rate value may be set within 60–100 beats per minute. As an example, the predefined pulse rate value may also be set based on the age and previous health conditions of the user 1031 driving the automotive vehicle 102. When the received pulse rate data deviates from the predefined pulse rate value, the system 101 may detect a health issue of the user 1031. Based on the detected health issue of the user 1031, the system 101 may activate the one or more alert generating components 105 in the automotive vehicle 102. In an embodiment, the system 101 may notify one or more health care facilities about the detected health issue of the user 1031 for assisting the user 1031.
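The pulse-rate check in this embodiment might be sketched as follows; the 60–100 beats-per-minute window comes from the example above, while the function and constant names are hypothetical stand-ins.

```python
# Illustrative resting-rate bounds; the disclosure notes these may also
# be tuned per user age and previous health conditions.
PULSE_LOW, PULSE_HIGH = 60, 100   # beats per minute

def check_pulse(bpm):
    """Return 'health_issue' when the measured pulse rate deviates from
    the predefined 60-100 bpm window, else 'ok'."""
    if bpm < PULSE_LOW or bpm > PULSE_HIGH:
        # In the disclosed system this would activate the alert
        # generating components and notify a health care facility.
        return "health_issue"
    return "ok"
```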
In another embodiment, the system 101 may activate a speaker 1051 associated with the control unit 107 to generate a sound alert inside the automotive vehicle 102 to alert the one or more users 103 of the automotive vehicle 102. Further, the system 101 may activate vibration of a steering wheel 1052 of the automotive vehicle 102 to wake up the user 1031 driving the automotive vehicle 102 from the drowsiness state. The system 101 may also activate one or more illuminating devices 1053 provided on interior part of the automotive vehicle 102 to generate a predefined luminous intensity inside the automotive vehicle 102. The interior one or more illuminating devices 1053 may be activated to alert the one or more users 103 of the automotive vehicle 102. The system 101 may further activate one or more illuminating devices 1054 provided on exterior part of the automotive vehicle 102. The exterior one or more illuminating devices 1054 may be activated to alert one or more users 103 of other automotive vehicles (not shown in figure) travelling at a proximal distance from the automotive vehicle 102.
In another embodiment, the system 101 may transmit, in real time, one or more video frames associated with an exterior environment of the automotive vehicle 102. The one or more video frames may be captured by the video capturing unit 1092 configured in the wearable product 109 [shown in Fig.1b]. The one or more video frames may be captured while driving the automotive vehicle 102. The system 101 may transmit the captured one or more video frames to a server for further analysis to assist the one or more users 103.
Fig.1b shows exemplary illustration of detecting event associated with drowsiness state of a user driving an automotive vehicle by a control unit of a system in accordance with some embodiments of the present disclosure.
In an exemplary scenario, a user 1031 may wear a wearable product 109 comprising a sensor 1091 while driving the automotive vehicle 102. The sensor 1091 may be configured to sense tilt angle data 111 associated with head movement of the user 1031 while driving the automotive vehicle 102. The sensed tilt angle data 111 may be transmitted to the control unit 107 of the system 101. Upon receiving the tilt angle data 111 from the sensor 1091 of the wearable product 109 of the user 1031, the control unit 107 may compare the tilt angle data 111 with a predefined range of angle values. The predefined range of angle values may be user configurable. Here, the predefined range of angle values may be considered for comparison of the received tilt angle data 111 to avoid false detection of the drowsiness state 113 of the user 1031 driving the automotive vehicle 102. As an example, the predefined range of angle values may be set between -10° (negative ten degrees) and +10° (positive ten degrees). The predefined range of angle values may be indicated as ±10°. Here, if the received tilt angle data 111 either exceeds +10° or falls below -10°, then the control unit 107 may detect the event associated with the drowsiness state 113 of the user 1031. As an example, initially, the user 1031 may be awake, and the sensor 1091 may detect the tilt angle data 111 to be varying within a range of -4° (negative four degrees) to +4° (positive four degrees), such as -3°, -2°, -1°, +1°, +2°, and +3°. The control unit 107 may receive the detected tilt angle data 111 from the sensor 1091. Further, the control unit 107 may compare the received tilt angle data 111 with the predefined range of angle values, which is ±10° (ten degrees).
Upon comparing the tilt angle data 111, the control unit 107 may detect the received tilt angle data 111, such as -3°, -2°, -1°, +1°, +2°, and +3°, to be within -10° to +10° (i.e., ±10°) for the preset time duration of 5 minutes. Hence, the control unit 107 may interpret that the user 1031 is in an alert state.
After driving for a time duration, the user 1031 may feel drowsy. As a result, the user 1031 may keep nodding off while driving the automotive vehicle 102. As an example, the sensor 1091 may detect the tilt angle data 111 associated with the head position of the user 1031 as +15° (fifteen degrees), -11° (eleven degrees), -13° (thirteen degrees), and +12° (twelve degrees) for a predefined duration. The control unit 107 may receive the detected tilt angle data 111 from the sensor 1091 and may compare it with the predefined range of angle values, which is ±10° (ten degrees). Thereafter, the control unit 107 may detect the received tilt angle data 111, such as +15° (fifteen degrees), to be deviating from the predefined range of angle values of ±10° (ten degrees) for a preset time duration of 5 minutes. Upon detecting this deviation of the tilt angle data 111 from the predefined range of angle values for the preset time duration of 5 minutes, the control unit 107 may detect the event associated with the drowsiness state 113 of the user 1031. Further, the control unit 107 may activate a speaker 1051 associated with the control unit 107 to generate a sound alert inside the automotive vehicle 102. The sound alert generated from the speaker 1051 may wake up the user 1031 driving the automotive vehicle 102 from the drowsiness state 113 and may also alert the one or more users 1032, 1033 of the automotive vehicle 102 accompanying the user 1031 [shown in Fig.1a]. Further, the control unit 107 may also activate vibration of a steering wheel 1052 of the automotive vehicle 102 to wake up the user 1031 driving the automotive vehicle 102 from the drowsiness state 113. Also, the control unit 107 may activate one or more illuminating devices 1053 provided on interior part of the automotive vehicle 102 to generate a predefined luminous intensity inside the automotive vehicle 102.
Due to activation of the aforesaid speaker 1051 and the steering wheel 1052, the user 1031 may stop nodding off while driving the automotive vehicle 102, thereby averting a probable road accident involving the automotive vehicle 102. Further, the control unit 107 may activate one or more illuminating devices 1054 provided on exterior part of the automotive vehicle 102 upon detecting the drowsiness state 113 of the user 1031 to wake up the user 1031 driving the automotive vehicle 102. The exterior one or more illuminating devices 1054 may be activated to provide a visual alert to the one or more users 103 of nearby automotive vehicles (not shown in figure) regarding the drowsiness state 113 of the user 1031 of the automotive vehicle 102. Consequently, the one or more users 103 of the nearby automotive vehicles may keep a safe distance from the automotive vehicle 102 to avoid occurrence of multiple road accidents, damage to roadside infrastructure and fatalities.
Fig.1c shows a block diagram of a system for detecting an event for alerting one or more users associated with an automotive vehicle in accordance with some embodiments of the present disclosure.
In some implementations, the system 101 may include a wearable product 109, one or more alert generating components 105, a microphone 115, a pulse rate sensor 117, a control unit 107, and a transceiver 119. The wearable product 109 may include a sensor 1091 and the video capturing unit 1092. The sensor 1091 may be configured to detect tilt angle data 111 associated with head movement of a user 1031 driving the automotive vehicle 102. The video capturing unit 1092 may be configured to capture one or more video frames 112 associated with an exterior environment of the automotive vehicle 102. The sensor 1091 and the video capturing unit 1092 may be enabled when the wearable product 109 is mounted on the ear of the user 1031, who is ready to drive the automotive vehicle 102, and when the wearable product 109 is paired with an infotainment system (not shown in figure) of the automotive vehicle 102 through Bluetooth or Wi-Fi.
In some embodiments, the one or more alert generating components 105 may include a speaker 1051, a steering wheel 1052, one or more illuminating devices 1053 provided on interior part of the automotive vehicle 102 and one or more illuminating devices 1054 provided on exterior part of the automotive vehicle 102. The speaker 1051 may be provided at a proximal distance from the ear of the user 1031 driving the automotive vehicle 102. For example, the speaker 1051 may be provided in the interior part of the vehicle door near the user 1031 driving the automotive vehicle 102. As an example, the speaker 1051 may generate a buzzer sound to alert the one or more users 103 of the automotive vehicle 102. Further, the interior one or more illuminating devices 1053 may be provided near the foot pedals, such as the accelerator, brake and clutch of the automotive vehicle 102, below the glove box, inside the cup holder, and in the car roof, rear passenger compartment and luggage compartment of the automotive vehicle 102. Further, the one or more illuminating devices 1054 provided on exterior part may be hazard lamps. The aforesaid one or more alert generating components 105 may be activated by the control unit 107 of the system 101, when at least one of the following events occurs:
i. upon detecting an event associated with drowsiness state 113 of the user 1031 driving the automotive vehicle 102,
ii. when intensity of one or more sound signals associated with the one or more users 103 of the automotive vehicle 102 exceeds a predefined intensity value, and
iii. upon detecting health issue of the user 1031 driving the automotive vehicle 102.
The one or more alert generating components 105 may be activated by the control unit 107 to avoid accidents and fatalities.
In some embodiments, the microphone 115 may be configured to detect one or more sound signals associated with the one or more users 103 of the automotive vehicle 102 [shown in Fig.1a]. The one or more sound signals may be sent to the control unit 107 for further signal processing to detect whether the one or more sound signals 121 are associated with panic or accidental situations associated with the one or more users 103 of the automotive vehicle 102.
In some embodiments, the pulse rate sensor 117 may be configured to measure the pulse rate of the user 1031 driving the automotive vehicle 102. As an example, the pulse rate sensor 117 may be configured in a Fitbit band, which may be worn on the wrist of the user 1031.
In some embodiments, the control unit 107 in the automotive vehicle 102 may receive tilt angle data 111 associated with head movement of the user 1031 driving the automotive vehicle 102. The control unit 107 may receive the tilt angle data 111 from the sensor 1091 configured in the wearable product 109 associated with the user 1031. Upon receiving the tilt angle data 111, the control unit 107 may compare the tilt angle data 111 with a predefined range of angle values. If the tilt angle data 111 is within the predefined range of angle values, then the control unit 107 may detect that the user 1031 driving the automotive vehicle 102 is awake. The control unit 107 may deactivate the one or more alert generating components 105 based on detection of the alert state of the user 1031 driving the automotive vehicle 102. However, when the tilt angle data 111 deviates from the predefined range of angle values, based on the comparison, for a preset time duration, the control unit 107 may detect an event associated with the drowsiness state 113 of the user 1031 as illustrated in Fig.1b. Upon detecting occurrence of the event associated with the drowsiness state 113 of the user 1031, the control unit 107 may activate one or more alert generating components 105 in the automotive vehicle 102 as illustrated in Fig.1a and Fig.1d. Particularly, the control unit 107 may activate the speaker 1051 associated with the control unit 107. The speaker 1051 may generate a sound alert inside the automotive vehicle 102 to alert the one or more users 1032, 1033 of the automotive vehicle 102 regarding the drowsiness state 113 of the user 1031. As an example, the speaker 1051 may generate one or more preset sounds or user configurable sounds upon activation. Further, the control unit 107 may activate vibration of the steering wheel 1052 of the automotive vehicle 102 to wake up the user 1031 driving the automotive vehicle 102 from the drowsiness state 113.
Further, the control unit 107 may activate the one or more illuminating devices 1053 provided on interior part of the automotive vehicle 102 to generate a predefined luminous intensity inside the automotive vehicle 102. The generated predefined luminous intensity may be sufficient to wake up the user 1031 driving the automotive vehicle 102 from the drowsiness state 113. Further, the control unit 107 may activate the one or more illuminating devices 1054 provided on exterior part of the automotive vehicle 102. The exterior one or more illuminating devices 1054 may alert the one or more users 103 of nearby automotive vehicles (not shown in figure) regarding drowsiness state 113 of the user 1031 driving the automotive vehicle 102. The one or more users 103 of nearby automotive vehicles may be alerted to keep a safe distance from the automotive vehicle 102 whose user 1031 is in drowsiness state 113. Keeping a safe distance from the automotive vehicle 102 may avoid occurrence of road accidents and fatalities.
In some embodiments, the control unit 107 in the automotive vehicle 102 may detect one or more sound signals 121 [as shown in Fig.1d] from a microphone 115 associated with the control unit 107. The one or more sound signals 121 may be associated with the one or more users 103 of the automotive vehicle 102 [as shown in Fig.1a]. Upon detecting the one or more sound signals 121, the control unit 107 may compare the intensity of the one or more sound signals 121 with a predefined intensity value. If the intensity of the one or more sound signals 121 is below the predefined intensity value, the control unit 107 may deactivate the one or more alert generating components 105. However, when the intensity of the one or more sound signals 121 exceeds the predefined intensity value, the control unit 107 may activate the one or more alert generating components 105 in the automotive vehicle 102 as illustrated in Fig.1d. For example, in the automotive vehicle 102, two passengers 1032, 1033 may be accompanying the driver 1031. In panic or accidental situations, the two passengers 1032, 1033 may scream in the automotive vehicle 102. The sound signals 121 associated with screaming of the passengers 1032, 1033 of the automotive vehicle 102 may be detected by the microphone 115 associated with the control unit 107 as illustrated in Fig.1d. Further, the control unit 107 may determine an intensity of the detected sound signals 121 associated with screaming to be 110 dB. The control unit 107 may compare the determined intensity of the sound signals 121 with a predefined intensity value, which may be set as 70 dB, and may determine that the intensity of 110 dB is higher than the predefined intensity value of 70 dB. Based on this determination, the control unit 107 may activate the speaker 1051, vibration of the steering wheel 1052, the one or more interior illuminating devices 1053 and one or more exterior illuminating devices 1054 in the automotive vehicle 102 as illustrated in Fig.1d.
Also, the control unit 107 may assist the user 1031 driving the automotive vehicle 102 and one or more users 1032, 1033 of the automotive vehicle 102 to access various control panels inside the automotive vehicle 102 by illuminating interior of the automotive vehicle 102 based on sound intensity. The user 1031 driving the automotive vehicle 102 may customize the interior illumination based on comfort level in such a way that it would not have an adverse effect on visibility of the user 1031 during nighttime driving.
In some embodiments, the control unit 107 in the automotive vehicle 102 may receive measured real time pulse rate data 123 associated with the user 1031 driving the automotive vehicle 102. The control unit 107 may receive the measured real time pulse rate data 123 through the pulse rate sensor 117 configured in a Fitbit band associated with the user 1031 driving the automotive vehicle 102. Upon receiving the pulse rate data 123, the control unit 107 may compare the pulse rate data 123 with a predefined pulse rate value. If the pulse rate data 123 does not deviate from the predefined pulse rate value, the control unit 107 may deactivate the one or more alert generating components 105. However, when the pulse rate data 123 deviates from the predefined pulse rate value, the control unit 107 may detect a health issue of the user 1031. Based on the detected health issue of the user 1031, the control unit 107 may activate the one or more alert generating components 105 in the automotive vehicle 102 as illustrated in Fig.1d. Also, the control unit 107 may notify the one or more health care facilities 127 about the detected health issue of the user 1031 for assisting the user 1031, as illustrated in Fig.1d. For example, in the automotive vehicle 102, the user 1031 driving the automotive vehicle 102 may wear a Fitbit band in which the pulse rate sensor 117 is configured. The Fitbit band may be communicatively coupled to the control unit 107 to transmit the measured pulse rate data 123 of the user 1031 while driving the vehicle. The control unit 107 may determine the pulse rate of the user 1031 to be 120 beats per minute. The control unit 107 may compare the pulse rate data 123 with a predefined pulse rate value, which may be set to 100 beats per minute. The predefined pulse rate value may also be configured to a lower value based on previous health conditions and medical records of the user 1031 driving the vehicle.
Upon comparing the received pulse rate data 123 of the user 1031 while driving the automotive vehicle 102, the control unit 107 may determine that the pulse rate of 120 beats per minute of the user 1031 is greater than the predefined pulse rate value of 100 beats per minute. Upon determining this, the control unit 107 may detect the health issue of the user 1031 as “tachycardia”. Thereafter, the control unit 107 may activate the speaker 1051, vibration of the steering wheel 1052, the one or more interior illuminating devices 1053 and the one or more exterior illuminating devices 1054 in the automotive vehicle 102, as illustrated in Fig.1d, based on the detected health issue of the user 1031. Also, the control unit 107 may determine locations of one or more health care facilities 127 using a Global Positioning System (GPS) (not shown in figure) of the automotive vehicle 102. The one or more health care facilities 127 may be located within a predefined distance of the automotive vehicle 102. The control unit 107 may notify the determined one or more health care facilities 127 regarding the “tachycardia” of the user 1031 driving the automotive vehicle 102. Here, the predefined pulse rate value may also be set to 60 beats per minute to detect a “bradycardia” health issue of the user 1031 when the pulse rate of the user 1031 falls below the predefined pulse rate value.
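The tachycardia/bradycardia classification in this example can be sketched as a simple threshold check. This is an illustrative sketch, not the patented implementation; the 60 and 100 beats-per-minute thresholds are the example values from this description, and the function name and label strings are assumptions.

```python
# Illustrative sketch of the pulse-rate check (thresholds from the example;
# in practice the range may be tuned per the user's age and medical records).

NORMAL_PULSE_BPM = (60, 100)   # predefined pulse rate range, beats per minute

def classify_pulse(pulse_bpm):
    """Map a measured pulse rate to the health-issue labels used in the
    example: above the range -> tachycardia, below it -> bradycardia,
    within it -> no health issue detected (None)."""
    lo, hi = NORMAL_PULSE_BPM
    if pulse_bpm > hi:
        return "tachycardia"
    if pulse_bpm < lo:
        return "bradycardia"
    return None
```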
In some embodiments, the control unit 107 may transmit one or more video frames 112 associated with an exterior environment of the automotive vehicle 102 to a server 125 as illustrated in Fig. 1d. The one or more video frames 112 may be captured by the video capturing unit 1092 configured in the wearable product 109. The one or more video frames 112 may be captured while driving the automotive vehicle 102. The control unit 107 may transmit these one or more video frames 112, by a transceiver 119 associated with the control unit 107, to the server 125 for further analysis. The one or more video frames 112 may be transmitted to assist the one or more users 103 of the automotive vehicle 102. For example, the transmitted one or more video frames 112 from an automotive vehicle 102 may be used to determine whether an accident of the automotive vehicle 102 has occurred due to the drowsiness state 113 of the user 1031. As an example, a mini cam recorder configured in the wearable product 109 may record live video frames 112 matching whatever is seen by the driver 1031 while driving on the road. If the driver 1031 is awake while driving the automotive vehicle 102, then the captured one or more video frames 112 may be straight. However, when the driver 1031 feels drowsy and starts nodding off, then the captured one or more video frames 112 may not be straight. In other words, the captured one or more video frames 112 may get rotated with some rotation angle based on bending of the driver’s neck while feeling drowsy. As an example, upon receiving the one or more video frames 112 from the control unit 107, the server 125 may determine the received one or more video frames 112 to be straight. As a result, the server 125 may determine that the driver 1031 was awake during the accident and the driver 1031 is not at fault for the accident.
However, when the received one or more video frames 112 are determined to be rotated, the server 125 may determine that the driver 1031 was in the drowsiness/distracted state 113 during the accident.
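The server-side decision described above can be sketched as follows. This is an assumption-laden illustration, not the patented implementation: estimating each frame's rotation angle (e.g. via horizon or feature alignment) is assumed to happen upstream, and the tolerance value, function name, and state labels are hypothetical.

```python
# Illustrative sketch of the server's frame analysis: frames whose estimated
# rotation exceeds a tolerance are treated as "not straight".

ROTATION_TOLERANCE_DEG = 10.0   # assumed tolerance for a "straight" frame

def driver_state_from_frames(frame_rotation_angles):
    """Infer the driver's state from per-frame rotation angles (degrees).
    Any frame rotated beyond the tolerance suggests a bent neck,
    i.e. a drowsy or distracted driver at the time of capture."""
    rotated = [a for a in frame_rotation_angles
               if abs(a) > ROTATION_TOLERANCE_DEG]
    return "drowsy/distracted" if rotated else "awake"
```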
Fig.2 shows a flow chart illustrating a method of detecting an event for alerting one or more users associated with an automotive vehicle in accordance with some embodiments of the present disclosure.
As shown in Fig.2, the method 200 includes one or more blocks illustrating a method of detecting an event for alerting one or more users 103 associated with an automotive vehicle 102. The order in which the method 200 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 201, the method may include receiving, by a control unit 107 in the automotive vehicle 102, tilt angle data 111 associated with head movement of a user 1031 driving the automotive vehicle 102, from a sensor 1091 configured in a wearable product 109 associated with the user 1031. Here, tilt angle data 111 may be detected by the sensor 1091 of the wearable product 109. The wearable product 109 may be worn by the user 1031 driving the vehicle.
At block 203, the method may include comparing, by the control unit 107, the tilt angle data 111 with a predefined range of angle values. The predefined range of angle values may be user configurable. Here, the tilt angle data 111 received from the sensor 1091 may be compared with the predefined range of angle values to detect the drowsiness state 113 of the user 1031.
At block 205, the method may include detecting, by the control unit 107, an event associated with the drowsiness state 113 of the user 1031 when the tilt angle data 111 deviates from the predefined range of angle values, based on the comparison, for a preset time duration. Here, the received tilt angle data 111 may deviate from the predefined range of angle values due to nodding of the head of the user 1031 driving the vehicle. The nodding off may be due to the drowsiness state 113 of the user 1031.
At block 207, the method may include activating, by the control unit 107, one or more alert generating components 105 in the automotive vehicle 102 upon detecting occurrence of the event associated with the drowsiness state 113 of the user 1031 for alerting the one or more users 103. Here, a speaker 1051 may be activated by the control unit 107 to generate a sound alert inside the automotive vehicle 102. The speaker 1051 may be associated with the control unit 107. Further, vibration of a steering wheel 1052 of the automotive vehicle 102 may be activated by the control unit 107 to alert the user 1031 driving the automotive vehicle 102. Also, one or more illuminating devices 1053 may be activated by the control unit 107 to generate a predefined luminous intensity inside the automotive vehicle 102. The one or more illuminating devices 1053 may be provided on interior part of the automotive vehicle 102. Further, one or more illuminating devices 1054 provided on exterior part of the automotive vehicle 102 may be activated by the control unit 107 to alert one or more users 103 of nearby automotive vehicles associated with the automotive vehicle 102.
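The sequence of blocks 201 through 207 can be sketched end to end. This is an illustrative sketch only: the thresholds reuse the example values from this description, and the class, function names, and component labels are hypothetical stand-ins for the speaker 1051, steering wheel 1052, and illuminating devices 1053, 1054.

```python
# Illustrative end-to-end sketch of method 200: receive (201), compare (203),
# detect (205), activate alert components (207). Names are hypothetical.
from dataclasses import dataclass, field

ANGLE_RANGE = (-10.0, 10.0)   # predefined range of angle values, degrees
PRESET_SAMPLES = 300          # assumed sample count spanning the preset duration

@dataclass
class AlertComponents:
    """Stand-in for the speaker, steering-wheel vibration and lamps."""
    activated: list = field(default_factory=list)

    def activate_all(self):
        self.activated = ["speaker", "steering_vibration",
                          "interior_lamps", "exterior_lamps"]

def run_method_200(tilt_samples, components):
    """Run blocks 201-207 over a stream of tilt-angle samples; return True
    when the drowsiness event was detected and the components activated."""
    run = 0
    for angle in tilt_samples:                                # block 201: receive
        in_range = ANGLE_RANGE[0] <= angle <= ANGLE_RANGE[1]  # block 203: compare
        run = 0 if in_range else run + 1
        if run >= PRESET_SAMPLES:                             # block 205: detect
            components.activate_all()                         # block 207: alert
            return True
    return False
```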
In an embodiment, the present disclosure provides a method and a system for detecting an event for alerting one or more users associated with an automotive vehicle.
In an embodiment, the present disclosure provides a method for automatically activating one or more alert generating components in the automotive vehicle to alert the user driving the vehicle, one or more users of the automotive vehicle and one or more users of nearby automotive vehicles. As a consequence, occurrence of accidents can be avoided promptly.
In an embodiment, the present disclosure provides a method for activating one or more alert generating components in the automotive vehicle upon detecting drowsiness state of the user driving the automotive vehicle to alert the user beforehand, thereby reducing occurrence of road accidents caused due to drowsy driving. Here, the system ensures that safety of the user driving the automotive vehicle and safety of the one or more users of the automotive vehicle are maintained.
In an embodiment, the present disclosure provides a method for activating one or more alert generating components in the automotive vehicle upon detecting sound signals associated with the one or more users of the automotive vehicle in panic or accidental situations to alert one or more users of nearby automotive vehicles.
In an embodiment, the present disclosure provides a method for activating one or more alert generating components in the automotive vehicle upon detecting health issue of the user driving the automotive vehicle to alert one or more users of nearby automotive vehicles. Also, the method assists the user by notifying one or more health care facilities about the detected health issue of the user to facilitate early treatment of the detected health issue.
EQUIVALENTS

With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.

It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). 
Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system (100) having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system (100) having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.”

In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.

While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Referral Numerals:

Reference Number Description
100 Architecture
101 System
102 Automotive vehicle
103 Users
1031 User driving the automotive vehicle
1032, 1033 Users of the automotive vehicle
105 Alert generating components
1051 Speaker
1052 Steering wheel
1053 Illuminating devices on interior part of the automotive vehicle
1054 Illuminating devices on exterior part of the automotive vehicle
107 Control unit
109 Wearable product
1091 Sensor
1092 Video capturing unit
111 Tilt angle data
112 Video frames
113 Drowsiness state
115 Microphone
117 Pulse rate sensor
119 Transceiver
121 Sound signals
123 Pulse rate data
125 Server
127 Health care facilities
