
A System And Method For Generating And Transmitting An Emergency Alert Signal

Abstract: The present invention relates to a system (100) and a method (200) for generating and transmitting an emergency alert signal in a simple, reliable and cost-effective manner without human intervention. The system (100) comprises one or more first recording units (106), one or more second recording units (108), a processing unit (112), an emergency alert unit (114) and a transmitting unit (116). The processing unit (112) is configured to receive and process the recorded sounds of the vehicle (102), recorded sounds of the rider of the vehicle (102) and the recorded sounds of the environment surrounding the vehicle (102). The emergency alert unit (114) is configured to generate an emergency alert signal on satisfaction of one or more pre-defined conditions. The transmitting unit (116) is configured to perform one or more pre-defined operations when an emergency alert signal is generated by the emergency alert unit (114). Reference Figure 1


Patent Information

Application #: 202241069181
Filing Date: 30 November 2022
Publication Number: 22/2024
Publication Type: INA
Invention Field: ELECTRONICS

Applicants

TVS MOTOR COMPANY LIMITED
“Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India.

Inventors

1. NAVEEN NATARAJAN KRISHNAKUMAR
TVS Motor Company Limited, “Chaitanya”, No 12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India
2. SATHIAMOORTHY MURALIMANOHAR
TVS Motor Company Limited, “Chaitanya”, No 12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India
3. RAGHAVENDRA PRASAD
TVS Motor Company Limited, “Chaitanya”, No 12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India
4. DATTA RAJARAM SAGARE
TVS Motor Company Limited, “Chaitanya”, No 12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India

Specification

Description: FORM 2
THE PATENTS ACT, 1970
(39 OF 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[See section 10, Rule 13]

TITLE OF INVENTION
A SYSTEM AND METHOD FOR GENERATING AND TRANSMITTING AN EMERGENCY ALERT SIGNAL

APPLICANT
TVS MOTOR COMPANY LIMITED, an Indian company, having its address at “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India.

PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed.

FIELD OF THE INVENTION
[001] The present invention relates to a system and a method for generating and transmitting an emergency alert signal. More particularly, the present invention relates to generating and transmitting an emergency alert signal without human intervention.

BACKGROUND OF THE INVENTION
[002] Vehicles are prone to wear, tear and/or accidents/collisions. In such conditions, a rider of the vehicle needs assistance for further mobility or as a life saver. In the modern day scenario, various emergency rescue operations are available to the rider of the vehicle. In one example, the emergency rescue operation can be triggered through a phone call initiated by the rider of the vehicle. The phone call can also be initiated by passers-by in case the rider of the vehicle is severely injured or in an unconscious state. It is to be noted that such emergency rescue operation needs human intervention to be triggered. In case the rider of the vehicle is severely injured, unconscious or in a remote area with no passers-by, the emergency rescue operation will not be triggered, which can lead to loss of life of the rider or the rider being stranded in the remote area without any assistance.
[003] In another example, the emergency rescue operation can be triggered by an emergency switch available on the vehicle. The switch can be pressed by the rider of the vehicle in emergency situations. The switch can also be pressed by passers-by in case the rider of the vehicle is severely injured or in an unconscious state. It is to be noted that such emergency rescue operation also needs human intervention to be triggered. In case the rider of the vehicle is severely injured, unconscious or in a remote area with no passers-by, the emergency rescue operation will not be triggered which can lead to loss of life of the rider or the rider being stranded in the remote area without any assistance.
[004] In yet another example, the emergency rescue operation can be triggered by one or more sensors disposed in the vehicle or on user peripherals which can detect the breakdown condition of the vehicle but cannot detect condition of the rider of the vehicle. There can be instances wherein the vehicle is not damaged but the rider is severely injured. In such a scenario, the sensors will not trigger the emergency rescue operation leading to loss of life of the rider.
[005] In view thereof, there is a felt need to overcome at least the above-mentioned disadvantages of the prior art and provide a reliable and cost-effective system and method for generating and transmitting an emergency alert signal without human intervention.

SUMMARY OF THE INVENTION
[006] In one aspect of the present invention, a system for generating and transmitting an emergency alert signal is disclosed. The system comprises one or more first recording units, one or more second recording units, a processing unit, an emergency alert unit and a transmitting unit. The one or more first recording units are configured to record sounds of the vehicle. The one or more second recording units are configured to record sounds of a rider of the vehicle and sounds of an environment surrounding the vehicle. The processing unit is configured to receive and process the sounds of the vehicle, the sounds of the rider of the vehicle and the sounds of the environment surrounding the vehicle. The emergency alert unit is in communication with the processing unit and configured to generate an emergency alert signal on satisfaction of one or more pre-defined conditions. The transmitting unit is in communication with the emergency alert unit and is configured to perform one or more pre-defined operations on receipt of the emergency alert signal.
[007] In an embodiment, the processing unit, the emergency alert unit and the transmitting unit are disposed in a remote server. The remote server is in communication with the personal digital assistant of the rider and the vehicle.
[008] In an embodiment, the processing unit, the emergency alert unit and the transmitting unit are disposed in the personal digital assistant of the rider.
[009] In an embodiment, the processing unit and the emergency alert unit are disposed in the personal digital assistant of the rider of the vehicle and the transmitting unit is disposed in the remote server.
[010] In an embodiment, the one or more first recording units are disposed in the vehicle and the one or more second recording units are disposed in at least one of the personal digital assistant and one or more peripheral devices coupled with the personal digital assistant, such as, but not being limited to, helmets, earphones, headphones, watches, clothes, etc.
[011] In an embodiment, the one or more first recording units are configured to transmit the recorded sounds of the vehicle to the personal digital assistant of the rider of the vehicle such that the recorded sounds of the vehicle, the recorded sound of the rider of the vehicle and the recorded sound of the environment surrounding the vehicle are transmitted by the personal digital assistant to the processing unit. As already stated, the processing unit can either be disposed in the personal digital assistant of the rider of the vehicle or the remote server.
[012] In an embodiment, the one or more pre-defined conditions comprise: (a) a match between at least one of the recorded sounds of the vehicle, the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle and at least one of the pre-stored reference sounds; and (b) in an event of no match between at least one of the recorded sounds of the vehicle, the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle and the at least one of the pre-stored reference sounds, a voice engine in the processing unit is configured to process the recorded sounds of the vehicle, the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle to detect a state of emergency in the recorded sounds.
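For illustration only, a minimal Python sketch of the two-stage check described in paragraph [012] is set out below. The keyword set, the stubbed voice engine and its threshold are assumptions introduced for the sketch and do not form part of the specification.

```python
# Illustrative sketch of the two-stage pre-defined condition check: (a) match against
# pre-stored reference sounds, (b) fall back to a voice engine when no match is found.
# The keyword list, the threshold and the voice-engine stub are assumptions only.

from dataclasses import dataclass

# Hypothetical pre-stored reference sounds, represented here as transcribed keywords.
PRE_STORED_REFERENCE_SOUNDS = {"help", "accident", "save", "emergency", "hospital", "police"}

@dataclass
class RecordedSounds:
    vehicle: str      # transcript/label of the vehicle sounds, e.g. "tyre screech"
    rider: str        # transcript of the rider's utterances
    environment: str  # transcript of the surrounding sounds


def matches_reference(recorded: RecordedSounds) -> bool:
    """Condition (a): any recorded sound matches a pre-stored reference sound."""
    return any(
        keyword in text.lower()
        for text in (recorded.vehicle, recorded.rider, recorded.environment)
        for keyword in PRE_STORED_REFERENCE_SOUNDS
    )


def voice_engine_detects_emergency(recorded: RecordedSounds) -> bool:
    """Condition (b): placeholder for the voice engine that scores distress in the audio."""
    distress_score = 0.0  # a trained voice engine would supply this score
    return distress_score > 0.8  # assumed threshold


def pre_defined_condition_satisfied(recorded: RecordedSounds) -> bool:
    """Condition (a) is checked first; condition (b) only when no match is found."""
    return matches_reference(recorded) or voice_engine_detects_emergency(recorded)


if __name__ == "__main__":
    sounds = RecordedSounds(vehicle="tyre screech", rider="help, accident!", environment="")
    print(pre_defined_condition_satisfied(sounds))  # True via condition (a)
```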
[013] In an embodiment, the emergency alert unit is configured to receive inputs from a telematics unit disposed in the vehicle and/or a speed detection unit disposed in the vehicle. The telematics unit is configured to receive information from a lean angle sensor of the vehicle and transmit information indicative of the lean angle of the vehicle to the emergency alert unit. The speed detection unit is configured to receive information from one or more speed sensors disposed on wheels of the vehicle and transmit information indicative of the speed of the vehicle to the emergency alert unit. It is to be understood that such inputs are received by the emergency alert unit only after receiving an input from the processing unit that at least one of the recorded sounds of the vehicle, the recorded sounds of the rider of the vehicle and the recorded sounds of the environment surrounding the vehicle found a match in the pre-stored reference sounds or a state of emergency is detected in at least one of the recorded sounds of the vehicle, the recorded sounds of the rider of the vehicle and the recorded sounds of the environment surrounding the vehicle when processed by the voice engine. In other words, the emergency alert unit is configured to receive inputs with respect to the lean angle and/or the speed of the vehicle to corroborate the occurrence of an emergency situation such that triggering of false emergency signals can be avoided. The emergency alert unit is configured to check whether the speed of the vehicle is less than a pre-defined speed and/or the lean angle of the vehicle is less than a pre-defined lean angle. In a scenario wherein the speed of the vehicle is less than the pre-defined speed and the lean angle of the vehicle is less than the pre-defined lean angle, the emergency alert unit generates an emergency alert signal.
[014] In an embodiment, on generation of an emergency alert signal by the emergency alert unit, the one or more pre-defined operations comprise transmitting a signal to the telematics unit of the vehicle and/or the personal digital assistant of the rider of the vehicle for retrieving coordinates of the vehicle. On receiving the coordinates of the vehicle, the transmitting unit is configured to detect one or more emergency centres within a pre-defined distance of the retrieved coordinates of the vehicle. On detection of one or more emergency centres, the transmitting unit is configured to transmit an alert message to the one or more emergency centres along with the retrieved coordinates of the vehicle. The emergency centres comprise medical centres and/or police stations. The coordinates of the vehicle are transmitted by the telematics unit disposed in the vehicle.
[015] In another aspect of the invention, a method for generating and transmitting an emergency alert signal is disclosed. The method comprises a step of recording sounds of the vehicle. The sounds of the vehicle are recorded by one or more first recording units. The method further comprises a step of recording sounds of the rider of the vehicle and sounds of the environment surrounding the vehicle. The sounds of the rider and the sounds of the environment surrounding the vehicle are recorded by one or more second recording units. The method further comprises a step of receiving and processing the recorded sounds of the vehicle, the recorded sounds of the rider of the vehicle and the recorded sounds of the environment surrounding the vehicle. The step of receiving and processing is performed by a processing unit. The method further comprises a step of generating an emergency alert signal on satisfaction of one or more pre-defined conditions. The step of generating the emergency alert signal is performed by an emergency alert unit. The emergency alert unit is in communication with the processing unit. The method further comprises performing one or more pre-defined operations on generation of the emergency alert signal. The one or more pre-defined operations are performed by a transmitting unit. The transmitting unit is in communication with the emergency alert unit.
[016] In an embodiment, the one or more pre-defined conditions comprise: (a) a match between at least one of the recorded sounds of the vehicle, the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle and at least one of the pre-stored reference sounds; and (b) in an event of no match between at least one of the recorded sounds of the vehicle, the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle and the at least one of the pre-stored reference sounds, a voice engine in the processing unit is configured to process the recorded sounds of the vehicle, the rider and the environment surrounding the vehicle to detect a state of emergency, such as distress, in the recorded sounds.
[017] In an embodiment, the method further comprises receiving inputs from a telematics unit and/or a speed detection unit disposed in the vehicle. The telematics unit is configured to receive information from a lean angle sensor of the vehicle and transmit information indicative of the lean angle of the vehicle to the emergency alert unit. The speed detection unit is configured to receive information from one or more speed sensors disposed on wheels of the vehicle and transmit information indicative of the speed of the vehicle to the emergency alert unit. It is to be understood that such inputs are received by the emergency alert unit only after receiving an input from the processing unit that at least one of the recorded sounds of the vehicle, the recorded sounds of the rider of the vehicle and the recorded sounds of the environment surrounding the vehicle found a match in the pre-stored reference sounds or a state of emergency is detected in at least one of the recorded sounds of the vehicle, the recorded sounds of the rider of the vehicle and the recorded sounds of the environment surrounding the vehicle when processed by the voice engine. In other words, the emergency alert unit is configured to receive inputs with respect to the lean angle and/or the speed of the vehicle to corroborate the occurrence of an emergency situation such that triggering of false emergency signals can be avoided. The emergency alert unit is configured to check whether the speed of the vehicle is less than a pre-defined speed and/or the lean angle of the vehicle is less than a pre-defined lean angle. In a scenario wherein the speed of the vehicle is less than the pre-defined speed and/or the lean angle of the vehicle is less than the pre-defined lean angle, the emergency alert unit generates an emergency alert signal.
[018] In an embodiment, on generation of an emergency alert signal by the emergency alert unit, the one or more pre-defined operations comprise transmitting a signal to the telematics unit of the vehicle and/or the personal digital assistant of the rider of the vehicle for retrieving coordinates of the vehicle. On receiving the coordinates of the vehicle, the transmitting unit is configured to detect one or more emergency centres within a pre-defined distance of the retrieved coordinates of the vehicle. On detection of one or more emergency centres, an alert message is transmitted to the one or more emergency centres along with the retrieved coordinates of the vehicle. The emergency centres comprise medical centres and/or police stations. The coordinates of the vehicle are transmitted by the telematics unit disposed in the vehicle.

BRIEF DESCRIPTION OF THE DRAWINGS
[019] Reference will be made to embodiments of the invention, examples of which may be illustrated in accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 is a block diagram illustrating a system for generating and transmitting an emergency alert signal, in accordance with an embodiment of the present invention.
Figure 2 is a flow chart illustrating a method for generating and transmitting an emergency alert signal, in accordance with an embodiment of the present invention.
Figure 3 is a flow chart illustrating a method of processing recorded sounds by a processing unit, in accordance with an exemplary embodiment of the present invention.

DETAILED DESCRIPTION OF THE INVENTION
[020] Various features and embodiments of the present invention here will be discernible from the following further description thereof, set out hereunder.
[021] One object of the present invention is to generate and transmit an emergency alert signal without human intervention. Another object of the present invention is to provide a system and method to generate and transmit an emergency alert signal which is simple, reliable and cost effective.
[022] Figure 1 is a block diagram illustrating a system 100 for generating and transmitting an emergency alert signal, in accordance with an embodiment of the present invention.
[023] For the purpose of the present invention, the term “vehicle” comprises any vehicle provided with one or more recording units such as, but not being limited to, bicycles, scooters, motorcycles, rickshaws, cars, trucks, etc. The term “vehicle” also comprises, but not being limited to, conventional internal combustion engine vehicles, electric vehicles and hybrid vehicles.
[024] As shown in Figure 1, the system comprises one or more first recording units 106, one or more second recording units 108, a processing unit 112, an emergency alert unit 114 and a transmitting unit 116.
[025] The one or more first recording units 106 are configured to record the sounds of the vehicle 102. The one or more second recording units 108 are configured to record the sounds of a rider of the vehicle 102 and sounds of the environment surrounding the vehicle 102. The processing unit 112 is configured to receive and process the recorded sounds of the vehicle 102, recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102. The emergency alert unit 114 is in communication with the processing unit 112 and generates an emergency alert signal on satisfaction of one or more pre-defined conditions. The transmitting unit 116 is in communication with the emergency alert unit 114 and is configured to perform one or more pre-defined operations when an emergency alert signal is generated by the emergency alert unit 114.
[026] In an embodiment, the one or more first recording units 106 are disposed in the vehicle 102. The one or more first recording units 106 are configured to record the sounds of the vehicle 102 before the occurrence of an emergency situation, during the occurrence of the emergency situation and after the occurrence of the emergency situation.
[027] In an embodiment, the one or more second recording units 108 are disposed in at least one of a personal digital assistant 104 of the rider and one or more peripheral devices 118 coupled with the personal digital assistant 104 of the rider. The one or more second recording units 108 are configured to record the sounds of the rider of the vehicle 102 and the sounds of the environment surrounding the vehicle 102 before the occurrence of an emergency situation, during the occurrence of the emergency situation and after the occurrence of the emergency situation. The one or more peripheral devices 118 include, but are not limited to, headphones, earphones, smart helmets, watches, clothes, etc. In a scenario wherein the one or more second recording units 108 are disposed in the peripheral devices 118, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 will be transmitted to the personal digital assistant 104 of the rider. Also, the one or more first recording units 106 transmit the recorded sounds of the vehicle 102 to the personal digital assistant 104 of the rider.
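For illustration only, a minimal Python sketch of how a recording unit might retain sound from before, during and after an emergency situation, as described in paragraph [027], is given below. The sample rate, buffer length and event hook are assumptions and do not form part of the specification.

```python
# Sketch of a recording unit that keeps a rolling pre-event buffer so that sounds from
# before, during and after an emergency can be forwarded to the personal digital
# assistant. Sample rate and buffer length below are assumed values.

import collections
from typing import List

import numpy as np

SAMPLE_RATE_HZ = 16_000
PRE_EVENT_SECONDS = 10   # assumed amount of audio kept from before the event


class RollingRecorder:
    """Keeps a rolling window of recent audio so pre-event sound is not lost."""

    def __init__(self) -> None:
        self._pre = collections.deque(maxlen=SAMPLE_RATE_HZ * PRE_EVENT_SECONDS)
        self._post: List[float] = []
        self._event_active = False

    def push_samples(self, samples: np.ndarray) -> None:
        # Before the event, samples roll through the bounded pre-event buffer;
        # after the event is marked, they are appended to the post-event list.
        target = self._post if self._event_active else self._pre
        target.extend(samples.tolist())

    def mark_event(self) -> None:
        """Called when a possible emergency is first noticed; starts post-event capture."""
        self._event_active = True

    def export(self) -> np.ndarray:
        """Return pre-event and post-event audio as one clip for the processing unit."""
        return np.array(list(self._pre) + self._post, dtype=np.float32)


if __name__ == "__main__":
    recorder = RollingRecorder()
    recorder.push_samples(np.zeros(SAMPLE_RATE_HZ))  # 1 s of audio before the event
    recorder.mark_event()
    recorder.push_samples(np.ones(SAMPLE_RATE_HZ))   # 1 s of audio after the event
    print(recorder.export().shape)                   # (32000,)
```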
[028] In an embodiment, the processing unit 112, the emergency alert unit 114 and the transmitting unit 116 are disposed in the personal digital assistant 104 of the rider.
[029] In an embodiment, the processing unit 112, the emergency alert unit 114 and the transmitting unit 116 are disposed in a remote server 110. The remote server 110 is in communication with the vehicle 102 and/or the personal digital assistant 104 of the rider. In one scenario, the one or more first recording units 106 are configured to transmit the recorded sounds of the vehicle 102 directly to the processing unit 112 in the remote server 110. In another scenario, the one or more first recording units 106 are configured to transmit the recorded sounds of the vehicle 102 to the personal digital assistant 104 of the rider of the vehicle 102, which thereafter transmits the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 to the processing unit 112.
[030] In an embodiment, the one or more pre-defined conditions comprise a match between at least one of the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 and at least one of the pre-stored reference sounds. The pre-stored reference sounds comprise sounds of the vehicle 102 which are generally identified during an accident or crash of the vehicle 102 such as, but not being limited to, sounds related to airbag deployment, engine explosion, screeching of tyres, tyre explosion, collision/crash of the vehicle, etc. The pre-stored reference sounds also comprise sounds which can be uttered by the rider of the vehicle 102 such as, but not being limited to, HELP, ACCIDENT, SAVE, EMERGENCY, etc. It is to be noted that sounds which can be uttered by the rider of the vehicle 102 in an emergency situation are stored in different translations including translations in vernacular languages depending upon the geographical location in which the emergency situation has occurred. The pre-stored reference sounds also comprise sounds of the environment surrounding the vehicle 102 such as sounds which can be uttered by passers-by or onlookers. Such sounds include, but are not limited to, HELP, ACCIDENT, SAVE, EMERGENCY, HOSPITAL, POLICE, etc. It is to be noted that sounds which can be uttered by the onlookers and passers-by in an emergency situation are stored in different translations including translations in vernacular languages depending upon the geographical location in which the emergency situation has occurred. In case no match is found between the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 and the pre-stored reference sounds, the above-mentioned recorded sounds are processed by a voice engine in the processing unit 112. The voice engine is a trained model configured to detect a state of emergency in the above-mentioned recorded sounds. The state of emergency can be detected by various factors such as, but not being limited to, overall severity of the recorded sound, roughness of the recorded sound, breathiness of the recorded sound, tension in the recorded sound, distress in the recorded sound, pitch of the recorded sound and loudness of the recorded sound. In case of detection of a match between at least one of the above-mentioned recorded sounds and at least one of the pre-stored reference sounds, or in case of detection of a state of emergency in at least one of the above-mentioned recorded sounds, the processing unit 112 transmits a signal to the emergency alert unit 114 for generation of an emergency alert signal. In one scenario, the emergency alert unit 114, on receiving a signal from the processing unit 112, may generate an emergency alert signal without any further inputs from the vehicle 102. In another scenario, the emergency alert unit 114 may be configured to corroborate the occurrence of an emergency situation by receiving inputs from the vehicle 102. The emergency alert unit 114 can be configured to receive various inputs from the vehicle 102 to rule out or confirm the possibility of the occurrence of an emergency situation. For example, the emergency alert unit 114 may be configured to receive inputs from a telematics unit 120 disposed in the vehicle 102 and/or a speed detection unit 122 disposed in the vehicle 102.
The telematics unit 120 is configured to receive information from one or more lean angle sensors disposed in the vehicle 102 and transmit information indicative of the lean angle of the vehicle 102 to the emergency alert unit 114. The speed detection unit 122 is configured to receive information from one or more speed sensors disposed on one or more wheels of the vehicle 102 and transmit information indicative of the speed of the vehicle 102 to the emergency alert unit 114. On receiving inputs from the telematics unit 120 and/or the speed detection unit 122, the emergency alert unit 114 detects whether the speed of the vehicle 102 is less than a pre-defined speed and/or the lean angle of the vehicle 102 is less than a pre-defined lean angle. In case the speed of the vehicle 102 is less than the pre-defined speed and/or the lean angle of the vehicle 102 is less than the pre-defined lean angle, the emergency alert unit 114 generates an emergency alert signal. On generation of the emergency alert signal by the emergency alert unit 114, the transmitting unit 116 is configured to perform one or more pre-defined operations.
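For illustration only, the corroboration step described in paragraph [030] can be sketched as below. The threshold values are assumptions (the specification fixes no numbers), and the "at least one of" combination of speed and lean angle follows the wording used in the claims.

```python
# Sketch of the corroboration check performed by the emergency alert unit 114 after the
# processing unit 112 flags a possible emergency. Threshold values are assumed.

from typing import Optional

PRE_DEFINED_SPEED_KMPH = 5.0        # assumed value; not fixed in the specification
PRE_DEFINED_LEAN_ANGLE_DEG = 30.0   # assumed value; not fixed in the specification


def should_generate_alert(sound_condition_met: bool,
                          speed_kmph: Optional[float],
                          lean_angle_deg: Optional[float]) -> bool:
    """Generate the emergency alert signal only when the sound-based condition holds
    and the available speed/lean-angle inputs corroborate it."""
    if not sound_condition_met:
        return False
    if speed_kmph is None and lean_angle_deg is None:
        # No telematics/speed input available: fall back to the sound condition alone.
        return True
    speed_ok = speed_kmph is not None and speed_kmph < PRE_DEFINED_SPEED_KMPH
    lean_ok = lean_angle_deg is not None and lean_angle_deg < PRE_DEFINED_LEAN_ANGLE_DEG
    return speed_ok or lean_ok


if __name__ == "__main__":
    print(should_generate_alert(True, speed_kmph=2.0, lean_angle_deg=45.0))   # True (speed)
    print(should_generate_alert(True, speed_kmph=60.0, lean_angle_deg=45.0))  # False
```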
[031] In an embodiment, the one or more pre-defined operations comprise transmitting a signal to the telematics unit 120 of the vehicle 102 and/or the personal digital assistant 104 for retrieving coordinates of the vehicle 102. On receiving the coordinates of the vehicle 102, the transmitting unit 116 is configured to detect the location of one or more emergency centres 124 within a pre-defined distance of the retrieved coordinates of the vehicle 102. On detection of the one or more emergency centres 124, the transmitting unit 116 is configured to transmit an emergency alert message/signal to the one or more emergency centres 124 along with the retrieved coordinates of the vehicle 102 such that a rescue operation can be performed by the emergency centres 124. The one or more emergency centres 124 include, but are not limited to, hospitals and police stations. The telematics unit 120 disposed in the vehicle 102 is configured to transmit the retrieved coordinates of the vehicle 102 to the transmitting unit 116. In an event of failure of the telematics unit 120 owing to a crash of the vehicle 102, the transmitting unit 116 is configured to transmit a signal to the personal digital assistant 104 for coordinates of the personal digital assistant 104 to determine the nearest location of the vehicle 102. The personal digital assistants 104 available nowadays are generally enabled with geo-location, vehicle navigation and global positioning features.
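For illustration only, a minimal Python sketch of the pre-defined operations of the transmitting unit 116 described in paragraph [031] is given below. The emergency-centre list, the pre-defined distance and the print-based transmission are assumptions introduced for the sketch and do not form part of the specification.

```python
# Sketch of the transmitting unit's operations: take the retrieved vehicle coordinates,
# find emergency centres within a pre-defined distance and send the alert message.
# The centre list, radius and notification stub are illustrative assumptions.

import math
from typing import List, Tuple

PRE_DEFINED_DISTANCE_KM = 10.0   # assumed radius; not fixed in the specification

# Hypothetical emergency centres: (name, latitude, longitude)
EMERGENCY_CENTRES: List[Tuple[str, float, float]] = [
    ("City Hospital", 13.0604, 80.2496),
    ("Central Police Station", 13.0827, 80.2707),
]


def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two coordinates in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def perform_pre_defined_operations(vehicle_lat: float, vehicle_lon: float) -> None:
    """Detect nearby emergency centres and transmit an alert with the coordinates."""
    nearby = [
        name
        for name, lat, lon in EMERGENCY_CENTRES
        if haversine_km(vehicle_lat, vehicle_lon, lat, lon) <= PRE_DEFINED_DISTANCE_KM
    ]
    for name in nearby:
        # Stand-in for the actual transmission channel (SMS, app notification, etc.).
        print(f"ALERT to {name}: vehicle at ({vehicle_lat:.4f}, {vehicle_lon:.4f})")


if __name__ == "__main__":
    perform_pre_defined_operations(13.0700, 80.2600)
```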
[032] In an embodiment, the one or more first recording units 106 are microphones inbuilt in the vehicle 102.
[033] In an embodiment, the one or more first recording units 106 are inbuilt in a speedometer of the vehicle 102.
[034] In an embodiment, the one or more second recording units 108 are microphones inbuilt in the personal digital assistant 104 and/or peripheral devices 118 communicatively coupled with the personal digital assistant 104.
[035] Figure 2 is a flow chart illustrating a method 200 for generating and transmitting an emergency alert signal, in accordance with an embodiment of the present invention.
[036] As shown, at step 201, the method comprises recording sounds of the vehicle 102. The step 201 is performed by one or more first recording units 106. In an embodiment, the one or more first recording units 106 are disposed in the vehicle 102.
[037] At step 202, the method comprises recording sounds of the rider of the vehicle 102 and sounds of an environment surrounding the vehicle 102. The step 202 is performed by one or more second recording units 108. In an embodiment, the one or more second recording units 108 are disposed in at least one of a personal digital assistant 104 and one or more peripheral devices 118 communicatively coupled to the personal digital assistant 104. In one non-limiting example, the one or more peripheral devices 118 include earphones, headphones, smart helmets, watches, clothes, etc.
[038] At step 203, the method comprises receiving, by a processing unit 112, the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102. In an embodiment, the processing unit 112 is disposed in the personal digital assistant 104 of the rider. In another embodiment, the processing unit 112 is disposed in a remote server 110. The processing unit 112 is in communication with the personal digital assistant 104 of the rider of the vehicle 102.
[039] At step 204, the method comprises processing of the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102. The step 204 is performed by the processing unit 112.
[040] At step 205, the method comprises generating an emergency alert signal on satisfaction of one or more pre-defined conditions. The step 205 of generating the emergency alert signal is performed by an emergency alert unit 114. In an embodiment, the emergency alert unit 114 is disposed in the personal digital assistant 104 of the rider. In an embodiment, the emergency alert unit 114 is disposed in the remote server 110. The emergency alert unit 114 is communicatively coupled to the processing unit 112.
[041] At step 206, the method comprises performing, by a transmitting unit 116, one or more pre-defined operations on receipt of an emergency alert signal. In an embodiment, the transmitting unit 116 is disposed in the personal digital assistant 104 of the rider of the vehicle 102. In another embodiment, the transmitting unit 116 is disposed in the remote server 110. The transmitting unit 116 is communicatively coupled to the emergency alert unit 114, the vehicle 102 and the personal digital assistant 104 of the rider of the vehicle 102.
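For illustration only, steps 201 to 206 of the method 200 can be read as the short pipeline sketched below. Every function body is a stub standing in for the corresponding unit, and the names, keywords and return values are assumptions introduced for the sketch.

```python
# Compact sketch tying steps 201-206 together. All function bodies are placeholders
# for the recording, processing, alert and transmitting units described above.

from typing import Tuple


def record_vehicle_sounds() -> str:                     # step 201, first recording units 106
    return "tyre screech"


def record_rider_and_environment() -> Tuple[str, str]:  # step 202, second recording units 108
    return "help", "call an ambulance"


def process_sounds(vehicle: str, rider: str, env: str) -> bool:  # steps 203-204, processing unit 112
    keywords = {"help", "accident", "emergency", "ambulance"}    # assumed reference keywords
    return any(k in s for s in (vehicle, rider, env) for k in keywords)


def generate_alert_signal() -> dict:                    # step 205, emergency alert unit 114
    return {"type": "emergency_alert"}


def transmit(alert: dict) -> None:                      # step 206, transmitting unit 116
    print("transmitting", alert)


if __name__ == "__main__":
    vehicle = record_vehicle_sounds()
    rider, env = record_rider_and_environment()
    if process_sounds(vehicle, rider, env):
        transmit(generate_alert_signal())
```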
[042] In an embodiment, the one or more pre-defined conditions comprise a match between at least one of the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 and at least one of the pre-stored reference sounds. The pre-stored reference sounds comprise sounds of the vehicle 102 which are generally identified during an accident or crash of the vehicle 102 such as, but not being limited to, sounds related to airbag deployment, engine explosion, screeching of tyres, tyre explosion, collision/crash of the vehicle 102, etc. The pre-stored reference sounds also comprise sounds which can be uttered by the rider of the vehicle 102 such as, but not being limited to, HELP, ACCIDENT, SAVE, EMERGENCY, AMBULANCE, CRASH, etc. It is to be noted that sounds which can be uttered by the rider of the vehicle 102 in emergency situations are stored in different translations including translations in vernacular languages depending upon the geographical location in which the emergency situation has occurred. The pre-stored reference sounds also comprise sounds of the environment surrounding the vehicle such as sounds which can be uttered by passers-by or onlookers. Such sounds include, but are not limited to, HELP, ACCIDENT, SAVE, EMERGENCY, HOSPITAL, POLICE, AMBULANCE, CRASH, etc. It is to be noted that sounds which can be uttered by the onlookers and passers-by in an emergency situation are stored in different translations including translations in vernacular languages depending upon the geographical location in which the emergency situation has occurred. In case no match is found between the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 and the pre-stored reference sounds, the above-mentioned recorded sounds are processed by a voice engine in the processing unit 112. The voice engine is configured to detect a state of emergency in the above-mentioned recorded sounds. The state of emergency can be detected by various factors such as, but not being limited to, overall severity of the recorded sound, roughness of the recorded sound, breathiness of the recorded sound, tension in the recorded sound, distress in the recorded sound, pitch of the recorded sound and loudness of the recorded sound. In case of detection of a match between at least one of the above-mentioned recorded sounds and at least one of the pre-stored reference sounds, or in case of detection of a state of emergency in at least one of the above-mentioned recorded sounds, the processing unit 112 transmits a signal to the emergency alert unit 114 for generation of an emergency alert signal.
[043] In an embodiment, the method further comprises receiving inputs by the emergency alert unit 114 from the vehicle 102. The emergency alert unit 114 is communicatively coupled to the vehicle 102. This method step is performed to corroborate the occurrence of an emergency situation by the emergency alert unit 114 prior to generation of an emergency alert signal. It should however be understood that this method step is an optional step and the emergency alert unit 114 can be configured to generate an emergency alert signal based only on the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102. The emergency alert unit 114 can be configured to receive various inputs from the vehicle 102 to rule out or confirm the possibility of the occurrence of an emergency situation. For example, the emergency alert unit 114 may be configured to receive inputs from a telematics unit 120 disposed in the vehicle 102 and/or a speed detection unit 122 disposed in the vehicle 102. The telematics unit 120 is configured to receive information from one or more lean angle sensors disposed on the vehicle 102 and transmit information indicative of the lean angle of the vehicle 102 to the emergency alert unit 114. The speed detection unit 122 is configured to receive information from one or more speed sensors disposed on one or more wheels of the vehicle 102 and transmit information indicative of the speed of the vehicle 102 to the emergency alert unit 114. On receiving such inputs from the telematics unit 120 and the speed detection unit 122, the emergency alert unit 114 detects whether the speed of the vehicle 102 is less than a pre-defined speed and/or the lean angle of the vehicle 102 is less than a pre-defined lean angle. In case the speed of the vehicle 102 is less than the pre-defined speed and/or the lean angle of the vehicle 102 is less than the pre-defined lean angle, the emergency alert unit 114 generates an emergency alert signal. On generation of the emergency alert signal by the emergency alert unit 114, the transmitting unit 116 is configured to perform one or more pre-defined operations.
[044] In an embodiment, the one or more pre-defined operations comprise transmitting a signal to the telematics unit 120 of the vehicle 102 and/or the personal digital assistant 104 for retrieving coordinates of the vehicle 102. On receiving the coordinates of the vehicle 102, the transmitting unit 116 is configured to detect the location of one or more emergency centres 124 within a pre-defined distance of the retrieved coordinates of the vehicle 102. On detection of the one or more emergency centres 124, the transmitting unit 116 is configured to transmit an emergency alert message or signal to the one or more emergency centres 124 along with the retrieved coordinates of the vehicle 102 such that a rescue operation can be performed by the one or more emergency centres 124. The one or more emergency centres 124 include, but are not limited to, hospitals and police stations. The telematics unit 120 disposed in the vehicle 102 is configured to transmit the retrieved coordinates of the vehicle 102 to the transmitting unit 116. In an event of failure of the telematics unit 120 owing to a crash of the vehicle 102, the transmitting unit 116 is configured to transmit a signal to the personal digital assistant 104 for coordinates of the personal digital assistant 104 to determine the nearest location of the vehicle 102.
[045] Figure 3 is a flow chart illustrating a method 300 of processing recorded sounds and generating an emergency alert signal, in accordance with an embodiment of the present invention.
[046] At step 301, the method comprises receiving the recorded sounds of a vehicle 102, recorded sounds of a rider of the vehicle 102 and recorded sounds of an environment surrounding the vehicle 102. The recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 are received by a processing unit 112. The recorded sounds of the vehicle 102 are received by the processing unit 112 from one or more first recording units 106. The recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 are received by the processing unit 112 from one or more second recording units 108.
[047] At step 302, the method comprises comparing at least one of the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 with at least one of the pre-stored reference sounds. At step 303, in case a match is found between at least one of the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 and at least one of the pre-stored reference sounds, the method moves to step 304, else the method moves to step 305.
[048] At step 304, the method comprises generating an emergency alert signal. The step 304 is performed by an emergency alert unit 114 which is communicatively coupled with the processing unit 112. At step 305, the method comprises processing at least one of the recorded sounds of the vehicle 102, recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102 by a voice engine in the processing unit 112. The voice engine is configured to detect a state of emergency in the above-mentioned recorded sounds. The state of emergency can be detected by various factors such as, but not being limited to, overall severity of the recorded sound, roughness of the recorded sound, breathiness of the recorded sound, tension in the recorded sound, distress in the recorded sound, pitch of the recorded sound and loudness of the recorded sound. In case a state of emergency is detected in at least one of the recorded sounds of the vehicle 102, recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102, the method moves to step 304, else the method moves to step 301.
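For illustration only, two of the factors relied upon by the voice engine at step 305 (loudness and pitch) can be computed as sketched below using NumPy alone. The decision rule, threshold values and the toy "scream-like" example are assumptions standing in for the trained voice engine and do not form part of the specification.

```python
# Sketch of two acoustic factors the voice engine might use: RMS loudness and a crude
# autocorrelation pitch estimate. The thresholds and decision rule are assumed values.

import numpy as np

SAMPLE_RATE_HZ = 16_000


def rms_loudness(audio: np.ndarray) -> float:
    """Root-mean-square amplitude of the clip, a simple loudness proxy."""
    return float(np.sqrt(np.mean(np.square(audio))))


def estimate_pitch_hz(audio: np.ndarray, frame_len: int = 2048) -> float:
    """Very rough fundamental-frequency estimate via autocorrelation of one frame."""
    frame = audio[:frame_len] - np.mean(audio[:frame_len])
    corr = np.correlate(frame, frame, mode="full")[frame_len - 1:]
    min_lag = SAMPLE_RATE_HZ // 500   # upper pitch bound of 500 Hz
    max_lag = SAMPLE_RATE_HZ // 60    # lower pitch bound of 60 Hz
    lag = min_lag + int(np.argmax(corr[min_lag:max_lag]))
    return SAMPLE_RATE_HZ / lag


def emergency_detected(audio: np.ndarray) -> bool:
    """Toy decision rule: loud, high-pitched audio is treated as a distress indicator."""
    return rms_loudness(audio) > 0.3 and estimate_pitch_hz(audio) > 250.0


if __name__ == "__main__":
    t = np.linspace(0, 1, SAMPLE_RATE_HZ, endpoint=False)
    scream_like = 0.8 * np.sin(2 * np.pi * 400 * t)   # loud 400 Hz tone
    print(emergency_detected(scream_like))            # True under this toy rule
```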
[049] It is to be understood that typical hardware configuration of the personal digital assistant 104, the processing unit 112, the emergency alert unit 114 and the transmitting unit 116 disclosed in the present invention can include a set of instructions that can be executed to cause the personal digital assistant 104, the processing unit 112, the emergency alert unit 114 and the transmitting unit 116 to perform the above-disclosed method.
[050] Each of the personal digital assistant 104, the processing unit 112, the emergency alert unit 114 and the transmitting unit 116 may include a processor which may be a central processing unit (CPU), a graphics processing unit (GPU), or both. The processor may be one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analysing and processing data. The processor may implement a software program, such as code generated manually, i.e., programmed code.
[051] Each of the personal digital assistant 104, the processing unit 112, the emergency alert unit 114 and the transmitting unit 116 comprises a storage unit which may include a memory. The memory may be a main memory, a static memory, or a dynamic memory. The memory may include, but is not limited to, computer readable storage media such as various types of volatile and non-volatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media and the like. The memory is operable to store instructions executable by the processor. The functions, acts or tasks illustrated in the figures or described may be performed by the programmed processor executing the instructions stored in the memory.
[052] Each of the personal digital assistant 104, the processing unit 112, the emergency alert unit 114 and the transmitting unit 116 may also include a disk or optical drive unit. The disk drive unit may include a computer-readable medium in which one or more sets of instructions, e.g., software, can be embedded. Further, the instructions may embody one or more of the methods or logic as described. In a particular example, the instructions may reside completely, or at least partially, within the memory or within the processor during execution by the personal digital assistant 104, the processing unit 112, the emergency alert unit 114 and the transmitting unit 116. The memory and the processor also may include computer-readable media as discussed above. The present invention contemplates a computer-readable medium that includes instructions or receives and executes instructions responsive to a propagated signal so that a device connected to a network can communicate data over the network. Further, the instructions may be transmitted or received over the network. The network includes wireless networks, Ethernet AVB networks, or combinations thereof. The wireless network may be a cellular telephone network. Further, the network may be a public network, such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize a variety of networking protocols now available or later developed.
[053] Each of the personal digital assistant 104, the processing unit 112, the emergency alert unit 114 and the transmitting unit 116 may accept incoming content and send content to connected components via a communication channel such as Controller Area Network (CAN), Local Interconnect Network (LIN) and Bluetooth.
[054] The claimed features/method steps of the present invention as discussed above are not routine, conventional, or well understood in the art, as the claimed features/steps enable the following solutions to the existing problems in conventional technologies. Specifically, the technical problem of human intervention while generating an emergency alert signal is solved by the present invention. As already stated, in the prior art, alerts were generated through a phone call initiated by the rider of the vehicle or through an emergency switch available on the vehicle. In prior-art approaches that do not require human intervention, the alerts were generated by sensors disposed on the vehicle, which completely ignored the condition of the rider of the vehicle. The present invention overcomes the disadvantages of the prior art by generating an emergency alert signal without human intervention and taking into account the condition of the rider of the vehicle in addition to the vehicle condition.
[055] In the present invention, the sounds of the vehicle 102 are recorded by an inbuilt microphone which is generally available in the vehicle 102. Also, sounds of the rider of the vehicle 102 and sounds of the environment surrounding the vehicle 102 are recorded by inbuilt microphone generally available on the personal digital assistant 104 or peripheral devices 118 coupled with the personal digital assistant 104 which are also generally available. Further, telematics units 120 are also generally available in the vehicle 102. Therefore, the present invention can be implemented without incurring any additional costs on the vehicle 102, personal digital assistant 104 and the peripheral devices 118 communicatively coupled to the personal digital assistant 104. The present invention is, therefore, cost effective.
[056] In the present invention, rash driving of the vehicle 102 can also be monitored through the recorded sounds of the vehicle 102, the recorded sounds of the rider of the vehicle 102 and the recorded sounds of the environment surrounding the vehicle 102, and post-crash analysis of such recorded sounds can be executed.
[057] While the present invention has been described with respect to certain embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.

List of Reference Numerals
100- system
102- vehicle
104- personal digital assistant
106- first recording unit
108- second recording units
110- remote server
112- processing unit
114- emergency alert unit
116- transmitting unit
118- peripheral devices
120- telematics unit
122- speed detection unit
124- emergency center

Claims:

WE CLAIM:

1. A system (100) for generating and transmitting an emergency alert signal, the system (100) comprising:
- one or more first recording units (106), the one or more first recording units (106) configured to record sounds of a vehicle (102);
- one or more second recording units (108), the one or more second recording units (108) configured to record sounds of a rider of the vehicle (102) and sounds of an environment surrounding the vehicle (102);
- a processing unit (112), the processing unit (112) configured to receive and process the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102);
- an emergency alert unit (114), the emergency alert unit (114) being in communication with the processing unit (112), wherein the emergency alert unit (114) is configured to generate an emergency alert signal on satisfaction of one or more pre-defined conditions; and
- a transmitting unit (116), the transmitting unit (116) being in communication with the emergency alert unit (114), the transmitting unit (116) configured to perform one or more pre-defined operations on receipt of the emergency alert signal.

2. The system as claimed in claim 1, wherein the processing unit (112), the emergency alert unit (114) and the transmitting unit (116) are disposed in a remote server (110), the remote server (110) being in communication with at least one of a personal digital assistant (104) of the rider and the vehicle (102).

3. The system (100) as claimed in claim 2, wherein the one or more first recording units (106) are disposed on the vehicle (102) and the one or more second recording units (108) are disposed in at least one of the personal digital assistant (104) and one or more peripheral devices (118), wherein the one or more peripheral devices (118) are communicatively coupled to the personal digital assistant (104).

4. The system (100) as claimed in claim 3, wherein the one or more first recording units (106) are configured to transmit the recorded sounds of the vehicle (102) to the personal digital assistant (104).

5. The system as claimed in claim 4, wherein the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102) are transmitted to the processing unit (112) via the personal digital assistant (104) of the rider.

6. The system (100) as claimed in claim 5, wherein the one or more pre-defined conditions comprises:
- a match between at least one of the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102) with at least one of pre-stored reference sounds, wherein the at least one of pre-stored reference sounds are stored in a memory of the processing unit (112); and
- in an event of no match between at least one of the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102) with the at least one of pre-stored reference sounds, the processing unit (112) being configured to detect a state of emergency in at least one of the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102) when such recorded sounds are processed by a voice engine stored in the processing unit (112).

7. The system (100) as claimed in claim 6, wherein the emergency alert unit (114) is configured to receive inputs from at least one of: a telematics unit (120) disposed in the vehicle (102) and a speed detection unit (122) disposed in the vehicle (102).

8. The system (100) as claimed in claim 7, wherein the one or more pre-defined conditions further comprises at least one of:
- speed of the vehicle (102) being less than a pre-defined speed, the speed of the vehicle (102) being measured and transmitted to the emergency alert unit (114) by the speed detection unit (122) disposed in the vehicle (102); and
- a lean angle of the vehicle (102) being less than a pre-defined lean angle, the lean angle of the vehicle (102) being measured by a lean angle sensor disposed in the vehicle (102) and transmitted to the emergency alert unit (114) by the telematics unit (120) disposed in the vehicle (102).

9. The system (100) as claimed in claim 6 or claim 8, wherein the one or more pre-defined operations comprises:
- transmitting a signal to at least one of the telematics unit (120) of the vehicle (102) and the personal digital assistant (104) for retrieving coordinates of the vehicle (102);
- detecting one or more emergency centers (124) within a pre-defined distance of the retrieved coordinates of the vehicle (102); and
- transmitting an alert message to the one or more emergency centers (124) along with retrieved coordinates of the vehicle (102).

10. The system (100) as claimed in claim 9, wherein the one or more emergency centers (124) comprise at least one of: a medical center and a police station.

11. The system (100) as claimed in claim 9, wherein the retrieved coordinates of the vehicle (102) are transmitted by the telematics unit (120) disposed in the vehicle (102).

12. A method (200) for generating and transmitting an emergency alert signal, the method (200) comprising:
- recording (201), by one or more first recording units (106), sounds of a vehicle (102);
- recording (202), by one or more second recording units (108), sounds of a rider and sounds of an environment surrounding the vehicle (102);
- receiving (203), by a processing unit (112), the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102);
- processing (204), by the processing unit (112), the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102);
- generating (205), by an emergency alert unit (114), an emergency alert signal on satisfaction of one or more pre-defined conditions, wherein said emergency alert unit (114) is in communication with the processing unit (112); and
- performing (206), by a transmitting unit (116), one or more pre-defined operations on receipt of the emergency alert signal, wherein the transmitting unit (116) is in communication with the emergency alert unit (114).

13. The method (200) as claimed in claim 12, wherein the one or more pre-defined conditions comprises:
- a match between at least one of the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102) with at least one of pre-stored reference sounds, wherein the at least one of pre-stored reference sounds are stored in a memory of the processing unit (112);
- in an event of no match between at least one of the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102) with the at least one of pre-stored reference sounds, the processing unit (112) being configured to detect a state of emergency in at least one of the recorded sounds of the vehicle (102), the recorded sounds of the rider and the recorded sounds of the environment surrounding the vehicle (102) when such recorded sounds are processed by a voice engine stored in the processing unit (112).

14. The method (200) as claimed in claim 13, comprising:
- receiving, by the emergency alert unit (114), inputs from at least one of: a telematics unit (120) disposed in the vehicle (102) and a speed detection unit (122) disposed in the vehicle (102).

15. The method (200) as claimed in claim 14, wherein the one or more pre-defined conditions further comprises at least one of:
- speed of the vehicle (102) being less than a pre-defined value, the speed of the vehicle (102) being measured and transmitted to the emergency alert unit (114) by the speed detection unit (122) disposed in the vehicle (102); and
- a lean angle of the vehicle (102) being less than a pre-defined value, the lean angle of the vehicle (102) being measured by a lean angle sensor disposed in the vehicle (102) and transmitted to the emergency alert unit (114) by the telematics unit (120) disposed in the vehicle.

16. The method (200) as claimed in claim 13 or claim 15, wherein the one or more pre-defined operations comprises:
- transmitting a signal to at least one of the telematics unit (120) of the vehicle (102) and the personal digital assistant (104) of the rider for retrieving coordinates of the vehicle (102);
- detecting one or more emergency centers (124) within a pre-defined distance of the retrieved coordinates of the vehicle (102); and
- transmitting an alert message to the one or more emergency centers (124) along with retrieved coordinates of the vehicle (102).

Dated this 30th day of November 2022

TVS MOTOR COMPANY LIMITED
By their Agent & Attorney

(Nikhil Ranjan)
of Khaitan & Co
Reg No IN/PA-1471

Documents

Application Documents

# Name Date
1 202241069181-STATEMENT OF UNDERTAKING (FORM 3) [30-11-2022(online)].pdf 2022-11-30
2 202241069181-REQUEST FOR EXAMINATION (FORM-18) [30-11-2022(online)].pdf 2022-11-30
3 202241069181-PROOF OF RIGHT [30-11-2022(online)].pdf 2022-11-30
4 202241069181-POWER OF AUTHORITY [30-11-2022(online)].pdf 2022-11-30
5 202241069181-FORM 18 [30-11-2022(online)].pdf 2022-11-30
6 202241069181-FORM 1 [30-11-2022(online)].pdf 2022-11-30
7 202241069181-FIGURE OF ABSTRACT [30-11-2022(online)].pdf 2022-11-30
8 202241069181-DRAWINGS [30-11-2022(online)].pdf 2022-11-30
9 202241069181-DECLARATION OF INVENTORSHIP (FORM 5) [30-11-2022(online)].pdf 2022-11-30
10 202241069181-COMPLETE SPECIFICATION [30-11-2022(online)].pdf 2022-11-30
11 202241069181-FER.pdf 2025-07-15
12 202241069181-FORM 3 [17-07-2025(online)].pdf 2025-07-17

Search Strategy

1 202241069181_SearchStrategyNew_E_SEARCHSTRATEGYE_14-07-2025.pdf