
Accident Detection And Reporting System

Abstract: The present disclosure pertains to a system 100 for detecting an accident and immediately reporting it to nearby hospitals, police stations, and concerned persons. The system 100 includes one or more sensors 102 located inside the vehicle to sense one or more parameters of the vehicle, and a location identifier 104 for determining the real-time location of the vehicle. Upon detection of an accident by analysing the received parameters, a first processing unit 106 notifies a centralized server 110, which in turn instructs a UAV to fly to the accident location with a first aid kit and capture images of the accident. The captured images are analysed to evaluate the severity of the accident, and an emergency vehicle, i.e. an ambulance, is correspondingly dispatched to the location with the required medical equipment.


Patent Information

Application #
Filing Date
22 January 2022
Publication Number
46/2022
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application
Patent Number
Legal Status
Grant Date
2025-09-18
Renewal Date

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector - 9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. LILHORE, Umesh Kumar
Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jhansla, Rajpura, Punjab - 140401, India.
2. SIMAIYA, Sarita
Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jhansla, Rajpura, Punjab - 140401, India.
3. KHURANA, Meenu
Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jhansla, Rajpura, Punjab - 140401, India.

Specification

TECHNICAL FIELD
[0001] The present invention generally relates to automobiles. More particularly, the present invention relates to a system installed within a vehicle for detecting an accident and automatically reporting the accident information to concerned parties such as emergency services and family members.

BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Transportation facilities in metropolitan cities have proliferated tremendously in recent years, making human life easier by increasing mobility. However, these advancements in transportation have simultaneously increased traffic hazards. Although various efforts have been made by governments and other organizations to create awareness against careless driving, there has been no reduction in the number of accidents. It is also observed that the rate of road accidents is increasing day by day, causing an immense loss of life due to improper emergency and alerting facilities. There can be various reasons behind the increase in road accidents, such as over-speeding, careless driving, drunken driving, and avoiding seat belts and helmets. In light of increasing road accidents, preventing such accidents and saving human or animal life has become a concern. Therefore, there is a need to provide a quick rescue to the driver or other persons in the vehicle immediately after an accident, without wasting time.
[0004] Presently, there exist several solutions to detect accidents and inform hospitals and the police about them. In one existing solution, the system detects an accident based on the vehicle's speed, i.e. if a vehicle stops abruptly for any reason, it may be considered an accident. However, such a system fails to verify the cause of the stop. For example, if a driver travelling at 100 km/hr suddenly stops the vehicle because an animal appears in front of it, the system will still consider this an accident and perform the requisite actions. The system therefore lacks the deciding criteria required to identify whether or not an accident has actually occurred, which makes it a time-wasting and costly process. In another existing solution, sensors are installed in vehicles to detect accidents, but these sensors are unable to determine the exact location of the accident and are therefore entirely dependent on information received from people present near the accident location. Thus, the existing solutions are unable to inform nearby hospitals and the police in real time, and they also fail to determine the direct impact and the associated deformation of the vehicle. Due to the lack of the exact accident location, the existing solutions result in delays in informing nearby hospitals and other concerned persons.
[0005] There is a need for a solution that overcomes the above-mentioned and other limitations of existing solutions by providing an efficient mechanism to detect an accident and notify nearby hospitals, police, etc. in real time without delay.

OBJECTS OF THE PRESENT DISCLOSURE
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0007] An object of the present disclosure is to provide a system for detecting accidents.
[0008] Another object of the present disclosure is to provide a system for notifying nearby hospitals, police, etc. in real time without delay.
[0009] Another object of the present disclosure is to provide a system that increases the probability of saving lives during an accident.
[0010] Another object of the present disclosure is to provide a system for tracking the location of the vehicle, which facilitates obtaining the exact location where the accident occurs.
[0011] Another object of the present disclosure is to provide an efficient and cost-effective system for detecting and reporting accidents.
[0012] Various objects, features, aspects and advantages of the present disclosure will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like features.

SUMMARY
[0013] Various aspects of the present disclosure relate to a vehicle assistance system. In particular, the present disclosure relates to a system installed within a vehicle for detecting an accident and automatically reporting the accident information to concerned parties such as emergency services and family members.
[0014] According to an aspect of the present disclosure, a system for detecting and reporting an accident is disclosed. The system may include one or more sensors coupled with the vehicle to detect a plurality of parameters of the vehicle, a location identifier located inside the vehicle to determine location information of the vehicle, and a first processing unit operatively coupled with the one or more sensors and the location identifier.
[0015] In an aspect, the first processing unit may include a learning engine coupled with a memory, the memory storing instructions executable by the learning engine and configured to: analyse the plurality of parameters to determine the occurrence of an accident, correspondingly actuate an alert unit located inside the vehicle, and generate a warning signal, wherein the warning signal contains the location information of the vehicle and is transmitted to a centralized server 110.
[0016] In an aspect, the centralized server may include a second processing unit configured to receive one or more images of the accident location from one or more image acquisition units, pre-process the received one or more images to evaluate the severity of the accident, and correspondingly instruct an emergency vehicle to accommodate one or more items of medical equipment and move to the accident location, wherein the location information of the vehicle is extracted from the received warning signal.
[0017] In an aspect, the second processing unit may implement one or more deep learning models to evaluate the severity of the accident.
[0018] In an aspect, the one or more image acquisition units may be located on a plurality of roads to capture images of a plurality of vehicles on the road, and upon receiving information of the accident, the centralized server may acquire one or more images of the accident location from the associated image acquisition unit to evaluate the severity of the accident.
[0019] In an aspect, the one or more image acquisition units may be located on one or more unmanned aerial vehicles (UAVs), and upon receiving information of the accident, the centralized server may instruct the UAVs found in the vicinity of the accident to move to the accident location to acquire one or more images of the accident.
[0020] In an aspect, the one or more sensors may include any or a combination of a speed sensor, a noise sensor, a pressure sensor, and an accelerometer.
[0021] In an aspect, the plurality of parameters may include any or a combination of speed, impact, pressure, vibration, and sound.
[0022] In an aspect, the location identifier may include any or a combination of a location sensor, a Global Positioning System (GPS) sensor, and a geolocation sensor.
[0023] Another aspect of the present disclosure discloses a method for detecting and reporting an accident of a vehicle. The method may include detecting, by one or more sensors, a plurality of parameters; detecting, by a location identifier, location information of the vehicle; analysing, by a first processing unit, the plurality of parameters to detect an accident; generating, by the first processing unit, a warning signal, which may be transmitted to a centralized server; receiving, by the centralized server, one or more images acquired from one or more image acquisition units; pre-processing, by a second processing unit, the received one or more images to evaluate the severity of the accident; and instructing, by the second processing unit 112, an emergency vehicle to move to the accident location with the medical equipment required based on the evaluated severity of the accident.
[0024] In an aspect, the one or more image acquisition units 114 may be positioned along a plurality of roads and/or mounted on one or more unmanned aerial vehicles (UAVs).

BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0026] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0027] FIG. 1 illustrates a block diagram of a proposed system for detecting and reporting a vehicle accident, in accordance with an embodiment of the present disclosure.
[0028] FIG. 2 illustrates a flow diagram of a method for detecting and reporting a vehicle accident, in accordance with an embodiment of the present disclosure.
[0029] FIG. 3 illustrates a method for detecting and reporting a vehicle accident, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[0030] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present disclosure as defined by the appended claims.
[0031] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details. Embodiments explained herein relate to a vehicle assistance system. In particular, the present disclosure relates to a system installed within a vehicle for detecting an accident and automatically reporting the accident information to concerned parties such as emergency services and family members.
[0032] FIG. 1 illustrates a block diagram of a proposed system for detecting and reporting a vehicle accident, in accordance with an embodiment of the present disclosure.
[0033] As illustrated in FIG. 1, the proposed system 100 for detecting an accident of a vehicle (also referred to as system 100 herein) can include one or more sensors 102 coupled with the vehicle to detect one or more parameters, a location identifier 104 located inside the vehicle to determine location information of the vehicle, and a first processing unit 106 for analysing the received parameters and detecting an accident accurately.
[0034] In an embodiment, the system 100 can be configured with any vehicle to facilitate detecting an accident of the vehicle and reporting it to concerned persons to save the life of the driver and other persons in the vehicle. In another embodiment, the vehicle can be a two-wheeler, three-wheeler, or four-wheeler, and can include, but is not limited to, buses, cars, trucks, vans, auto rickshaws, and motorcycles.
[0035] In an embodiment, the system 100 can include the one or more sensors 102 (interchangeably referred to as sensors 102 hereinafter) that can be installed at pre-defined positions in the vehicle, and the sensors 102 can be configured to sense one or more parameters (interchangeably referred to as information hereinafter) of the vehicle. In another embodiment, the one or more sensors 102 can include, but are not limited to, any or a combination of a speed sensor, a noise sensor, a pressure sensor, and an accelerometer. In an exemplary embodiment, the one or more sensors 102 can also include a tilt sensor, a wheel sensor, a shock sensor, a gyroscopic sensor, and a MEMS sensor for collecting various other parameters.
[0036] In an embodiment, when the vehicle is moving, the one or more sensors can be configured to sense one or more parameters of the vehicle, which can include, but are not limited to, speed, sound, wheel rotation, steering rotation, impact, pressure, angular velocity, vibration, tilt of the vehicle, and rotation of the vehicle. In another embodiment, upon sensing the one or more parameters, the sensors 102 can generate a first set of signals.
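The sensing step above can be sketched in code. This is a minimal illustrative model, not the disclosed implementation: the names `SensorReading` and `as_signal`, and the specific parameter fields and units, are assumptions chosen for the example.

```python
from dataclasses import dataclass, field
import time

@dataclass
class SensorReading:
    """One snapshot of vehicle parameters of the kind named above.
    Field names and units are illustrative assumptions."""
    speed_kmh: float
    impact_g: float
    pressure_kpa: float
    vibration: float
    sound_db: float
    timestamp: float = field(default_factory=time.time)

    def as_signal(self) -> dict:
        """Package the reading as the 'first set of signals' that the
        sensors would send to the first processing unit."""
        return {
            "speed_kmh": self.speed_kmh,
            "impact_g": self.impact_g,
            "pressure_kpa": self.pressure_kpa,
            "vibration": self.vibration,
            "sound_db": self.sound_db,
            "timestamp": self.timestamp,
        }

# Example reading while the vehicle is moving normally.
reading = SensorReading(speed_kmh=92.0, impact_g=0.4, pressure_kpa=101.3,
                        vibration=0.1, sound_db=68.0)
signal = reading.as_signal()
```

In a real installation each physical sensor would populate only its own field, and the processing unit would merge the streams; the single dataclass here just keeps the sketch self-contained.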
[0037] In an embodiment, the pressure sensor can be configured to detect an impact on the vehicle; for example, when an object such as another vehicle or a tree hits the vehicle, the impact can be detected. The noise sensor can be configured to detect the sound generated at the time of a collision, i.e. when two vehicles collide, the generated sound can be captured by the noise sensor or by a microphone positioned on the vehicle. The accelerometer can be configured to detect a sudden change in the axes of the vehicle. The information collected from these sensors can be transmitted to the first processing unit 106 in the form of signals.
[0038] In an exemplary embodiment, ABS sensors can be installed at each wheel of the vehicle to facilitate monitoring the rotational speed of the wheels while the vehicle is moving. In another exemplary embodiment, the gyroscopic sensor can be mounted at the wheels and the rear side of the vehicle to detect orientation errors and parameters of the vehicle such as wheel rotation, steering movement, vehicle rotation during an accident, and the like.
[0039] In an exemplary embodiment, the shock sensor can be mounted at the front side of the vehicle to detect a physical shock or impact to the vehicle. In another exemplary embodiment, the shock sensors can be configured to detect metal-to-metal impact, pyrotechnic shock, and vibrations caused by the motion of the vehicle.
[0040] In an exemplary embodiment, various other sensors, such as a flame detector, a temperature sensor, a gas sensor, a gas leakage detector, and a humidity detector, can be positioned inside the vehicle to detect smoke, gas leakage, humidity, and the like. For example, the flame detector can be configured to detect the presence of fire in the engine or any other part of the vehicle, and the temperature sensor can be configured to detect the temperature inside the vehicle; the collected information can be transmitted to the first processing unit 106.
[0041] In an embodiment, the location identifier 104 can be configured to determine real-time location information (latitude and longitude) of the vehicle. In another embodiment, the location identifier 104 can be selected from a location sensor, a Global Positioning System (GPS) sensor, and a geolocation sensor. The location identifier 104 can sense the real-time geographic location of the vehicle and correspondingly generate a location signal. In an exemplary embodiment, the location identifier 104 can also display the location of the vehicle on Google Maps on a display coupled with the dashboard of the vehicle.
[0042] In an exemplary embodiment, the location identifier 104 can provide details of nearby hospitals, ambulances, and police stations to the first processing unit 106.
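One way the location fix could be used to find the nearest facility is a great-circle distance comparison. The sketch below is an assumption about how such a lookup might work; the facility names and coordinates are made up for illustration.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km: mean Earth radius

def nearest_facility(vehicle_fix, facilities):
    """Return the facility closest to the vehicle's GPS fix."""
    lat, lon = vehicle_fix
    return min(facilities,
               key=lambda f: haversine_km(lat, lon, f["lat"], f["lon"]))

# Hypothetical facility list; a real system would query a database.
hospitals = [
    {"name": "Hospital A", "lat": 30.75, "lon": 76.78},
    {"name": "Hospital B", "lat": 30.35, "lon": 76.37},
]
closest = nearest_facility((30.74, 76.79), hospitals)
```

The same ranking works for police stations and ambulance depots by swapping the facility list.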
[0043] In an embodiment, the first processing unit 106 can be operatively coupled to the one or more sensors 102 and the location identifier 104. The first processing unit 106 can include a memory storing a set of instructions executable by a processor. In another embodiment, the first processing unit 106 can be configured to analyse the received information to determine the occurrence of an accident and transmit warning signals to mobile computing device(s) through a communication unit 116.
[0044] In an exemplary embodiment, the mobile computing device(s) can be a desktop computer, a vehicle computer, a tablet computer, a personal digital assistant, a laptop, a navigational device, a portable media device, and a smart phone. The mobile computing device(s) can include any one of a web client or application to facilitate communication and interaction between entities and the system 100. In various embodiments, information communicated between the system 100 and the mobile computing device(s) can involve user-selected functions available through one or more user interfaces (UIs). The UIs may be specifically associated with the web client (e.g., a browser) or the application. Accordingly, during a communication session with the mobile computing device(s), the system 100 may provide the mobile computing device(s) with a set of machine-readable instructions that, when interpreted by the client device using the web client or the application, cause the client device to present the UI, and transmit user input received through such UIs back to the system 100. As an example, the UIs provided to the mobile computing device(s) by the system 100 can allow entities to view information regarding vehicle accident along with location.
[0045] In an embodiment, the communication unit 116 can be configured to facilitate wireless Internet technology. Examples of such wireless Internet technology include GSM, Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like.
[0046] In addition, the communication unit 116 can be configured to facilitate short-range communication. For example, short-range communication can be supported using at least one of Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Wireless Universal Serial Bus), and the like.
[0047] In an exemplary embodiment, upon detection of an accident of the vehicle, the first processing unit 106 can generate a warning signal that can be transmitted to the mobile computing device through the communication unit 116 (e.g. GSM), where the warning signal can include the real-time location of the vehicle received from the location identifier 104 and the one or more parameters of the vehicle collected from the sensors. The warning signal can be transmitted in the form of a text message or an e-mail through the communication unit 116. Thus, the driver and other persons sitting in the vehicle can be provided medical assistance on time.
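A warning signal of the kind described above could be serialized as a small structured message before being sent as a text or e-mail body. This is a hedged sketch only: the field names (`event`, `location`, `parameters`) are assumptions, not part of the disclosure.

```python
import json

def build_warning_signal(location, parameters):
    """Bundle the real-time GPS fix and the sensed parameters into a
    message body suitable for SMS or e-mail transmission."""
    payload = {
        "event": "ACCIDENT_DETECTED",          # assumed event tag
        "location": {"lat": location[0], "lon": location[1]},
        "parameters": parameters,              # last sensor values
    }
    return json.dumps(payload)

# Example: vehicle stopped with a large impact reading.
msg = build_warning_signal((30.74, 76.79),
                           {"speed_kmh": 0.0, "impact_g": 9.2})
```

The receiving server can parse the same JSON to recover the location and parameter values without any out-of-band agreement beyond the field names.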
[0048] In an embodiment, the system 100 can include a power source to provide power supply to the system 100. The power source can be operatively coupled with the sensors 102, the location identifier 104, and the first processing unit 106.
[0049] In an embodiment, the first processing unit 106 can include one or more processor(s) that can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) can be configured to fetch and execute computer-readable instructions stored in a memory of the first processing unit 106. The memory can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0050] In an embodiment, the first processing unit 106 can also include interface(s) that can include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) can facilitate communication of the first processing unit 106 with various devices coupled to it.
[0051] In an embodiment, the first processing unit 106 can include a learning engine 108 that can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the learning engine 108. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for learning engine 108 may be processor executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the learning engine 108 may include a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the learning engine 108. In such examples, the first processing unit 106 can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the first processing unit 106 and the processing resource. In other examples, the learning engine 108 may be implemented by electronic circuitry. A database can include data that is either stored or generated as a result of functionalities implemented by any of the components of the learning engine 108.
[0052] In an embodiment, the learning engine 108 can be trained using supervised or unsupervised machine learning, and the machine learning program may employ a neural network, which may be a convolutional neural network (CNN), a deep learning neural network, or a combined learning module or program that learns in two or more fields or areas of interest. Machine learning may involve identifying and recognizing patterns in existing data. Additionally or alternatively, the machine learning programs may be trained by inputting sample data sets or certain data into the programs, such as autonomous system sensor and/or control signal data, and other data discussed herein. The machine learning programs may utilize deep learning algorithms primarily focused on pattern recognition, and may be trained after processing multiple examples. After training, the machine learning programs (or information generated by such machine learning programs) may be used to evaluate additional data.
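The supervised train-then-classify pattern described above can be shown with a deliberately tiny stand-in. The disclosure contemplates neural networks; the nearest-centroid classifier below is only an illustrative substitute, and the feature values and labels are invented for the example.

```python
def centroid(rows):
    """Mean feature vector of a list of equal-length vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def train(samples):
    """samples: {label: [feature vectors]} -> {label: centroid}."""
    return {label: centroid(rows) for label, rows in samples.items()}

def classify(model, x):
    """Assign x the label of the nearest centroid (squared distance)."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(model, key=lambda label: dist2(model[label], x))

# Features: (impact_g, sound_db) -- illustrative training values only.
model = train({
    "normal":   [(0.2, 60.0), (0.4, 70.0)],
    "accident": [(8.0, 120.0), (10.0, 130.0)],
})
label = classify(model, (9.0, 125.0))
```

A CNN replaces the hand-built centroids with learned features, but the workflow (fit on labelled samples, then evaluate new data) is the same.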
[0053] In an embodiment, the machine learning programs can include, but are not limited to, VGG-19, Bayesian program learning (BPL), deep learning models, support vector machines, decision trees, artificial neural networks, and convolutional neural networks (CNNs).
[0054] In an embodiment, the first processing unit 106 can be configured to receive the signals generated by the sensors 102 and extract the values of the one or more parameters from the received signals using the learning engine 108. The extracted values can be compared with a set of pre-defined values to detect an accident. Upon detection of the accident, the first processing unit 106 can generate a warning signal that can be transmitted to a centralized server 110.
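The comparison against pre-defined values can be sketched as follows. The threshold numbers here are illustrative assumptions only; a deployed system would calibrate them per vehicle.

```python
# Assumed per-parameter limits; exceeding any one flags a possible accident.
THRESHOLDS = {
    "impact_g": 4.0,        # sudden deceleration, in g
    "pressure_kpa": 150.0,  # body-panel pressure spike
    "sound_db": 110.0,      # crash-level noise
}

def detect_accident(values, thresholds=THRESHOLDS):
    """Return the list of parameters that exceed their thresholds;
    a non-empty list is treated as a detected accident."""
    return [name for name, limit in thresholds.items()
            if values.get(name, 0.0) > limit]

exceeded = detect_accident({"impact_g": 9.2,
                            "pressure_kpa": 90.0,
                            "sound_db": 126.0})
accident = bool(exceeded)
```

Returning the list of offending parameters, rather than a bare boolean, lets the warning signal report which readings triggered the detection.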
[0055] In an embodiment, the centralized server 110 can be configured to store information of hospitals, police stations, ambulances, healthcare professionals, and unmanned aerial vehicle(s) (UAVs), as well as information received from the system 100. A second processing unit 112 can be provided on the centralized server 110; upon receiving the warning signal from the system 100, the second processing unit 112 can receive one or more images of the accident location from one or more image acquisition units 114, such as cameras, webcams, surveillance cameras, and the like. In addition, the second processing unit 112 can pre-process the received one or more images to evaluate the severity of the accident, and correspondingly instruct an emergency vehicle to accommodate one or more items of medical equipment and move to the accident location. The location information of the vehicle can be extracted from the received warning signal.
[0056] In an embodiment, the second processing unit 112 can include another learning engine that can be trained using different neural models, such as VGG19, AlexNet, and VGG16, which assist in image analysis and feature extraction from images, facilitating detection of the severity of the accident. Transfer learning can be used to improve the accuracy of the image classification.
[0057] In an embodiment, the one or more image acquisition units 114 can be positioned along a plurality of roads to capture images of vehicles travelling on the road. Upon receiving information of the accident, the centralized server 110 can acquire one or more images of the accident location from the associated image acquisition unit 114 to evaluate the severity of the accident.
[0058] In an embodiment, the one or more image acquisition units 114 can be located on one or more unmanned aerial vehicles (UAVs). Upon receiving information of the accident, the centralized server 110 can instruct the UAVs found in the vicinity of the accident to move to the accident location to acquire one or more images of the accident. The location information of the accident can be received from the location identifier 104 of the vehicle.
[0059] In an exemplary embodiment, upon receiving the warning signal, the centralized server 110 instructs a UAV in the vicinity of the accident to capture images of the accident location. The UAV can also be sent with a first aid kit to assist persons injured in the accident immediately, as an ambulance takes time to reach the spot. Further, the captured images can be transmitted to the second processing unit 112, which can analyse the received images to evaluate the severity of the accident; accordingly, the second processing unit 112 can instruct the ambulance (i.e. the driver, an administrator, or healthcare professionals) to carry an oxygen cylinder, blood, and the like, and move to the accident location.
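The step of matching equipment to the evaluated severity can be sketched as a simple lookup. The severity grades and kit lists below are assumptions invented for the example, not values from the disclosure.

```python
# Hypothetical mapping from severity grade to the equipment the
# ambulance is asked to carry; higher grades include the lower kits.
EQUIPMENT_BY_SEVERITY = {
    "minor":    ["first aid kit"],
    "moderate": ["first aid kit", "oxygen cylinder", "stretcher"],
    "severe":   ["first aid kit", "oxygen cylinder", "stretcher",
                 "blood units", "defibrillator"],
}

def dispatch_instruction(severity, location):
    """Build the instruction sent to the ambulance crew.
    Unknown severities fall back to the fullest kit, erring safe."""
    kit = EQUIPMENT_BY_SEVERITY.get(severity, EQUIPMENT_BY_SEVERITY["severe"])
    return {"destination": location, "equipment": kit}

order = dispatch_instruction("severe", (30.74, 76.79))
```

Falling back to the fullest kit for an unrecognized grade is a deliberate fail-safe: over-provisioning an ambulance is cheaper than a second trip.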
[0060] In an embodiment, the power source can include any or a combination of a rechargeable battery, a lithium-ion (Li-ion) cell, rechargeable cells, a solar cell, a solar battery, electrochemical cells, a storage battery, a secondary cell, and the like.
[0061] In an embodiment, the existing power source of the vehicle, which provides power to the dashboard, can be connected with the system 100 to provide the power supply.
[0062] FIG. 2 illustrates a flow diagram of a method for detecting and reporting a vehicle accident, in accordance with an embodiment of the present disclosure.
[0063] As illustrated in FIG. 2, one or more sensors 102 can be located in the vehicle to track one or more parameters of the vehicle by collecting their values. The collected values can be transmitted to a first processing unit 106 that can compare the received values with a set of threshold values; if any value exceeds its threshold, the system 100 can detect an accident. Upon detection of the accident, an alarm can be generated for 10 seconds. If the alarm is closed within 10 seconds, this indicates that no accident occurred. If the alarm is not closed within 10 seconds, the first processing unit 106 can determine the location of the vehicle with the GPS 104 (i.e. the location identifier 104) and correspondingly send a notification (warning signal) to one or more mobile computing devices associated with one or more entities using a communication unit such as Wi-Fi, GSM, and the like.
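The 10-second cancellation window in FIG. 2 can be sketched as below. Simulated timestamps are used instead of a real clock so the logic stays testable; the function name is an assumption.

```python
ALARM_WINDOW_S = 10.0  # occupant has this long to dismiss a false alarm

def should_escalate(alarm_raised_at, dismissed_at=None):
    """True when the alarm was not dismissed within the window, i.e. the
    accident is treated as real and the warning signal is sent."""
    if dismissed_at is None:          # never dismissed: escalate
        return True
    return (dismissed_at - alarm_raised_at) > ALARM_WINDOW_S

# Occupant dismisses the alarm 4 s after it is raised: no escalation.
false_alarm = should_escalate(alarm_raised_at=0.0, dismissed_at=4.0)
# No dismissal at all: the warning signal goes out.
real_accident = should_escalate(alarm_raised_at=0.0)
```

Treating "no dismissal" as escalation is the safe default: an unconscious occupant cannot close the alarm, which is exactly the case the system exists for.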
[0064] In an embodiment, the one or more mobile computing devices can be a desktop computer, a vehicle computer, a tablet computer, a personal digital assistant, a laptop, a navigational device, a portable media device, and a smart phone.
[0065] In an embodiment, the one or more entities can be a police station, the nearest hospital (whose location can be detected using GPS), and registered contact numbers (such as those of family members or the owner).
[0066] In an embodiment, the notification can be transmitted to a centralized server 110 of the hospitals, and the centralized server 110 can send a drone (i.e. a UAV) with a first aid box to the accident spot (i.e. the location received from the GPS). The drone can capture images using one or more image acquisition units 114 coupled to the drone, and transmit the acquired images to the centralized server 110.
[0067] In an embodiment, upon receiving the images, a second processing unit 112 provided with the centralized server 110 can determine the severity of the accident by implementing one or more deep learning models, such as a convolutional neural network (CNN), to evaluate the severity of the accident accurately.
[0068] FIG. 3 illustrates a method for detecting and reporting a vehicle accident, in accordance with an embodiment of the present disclosure.
[0069] As illustrated in FIG. 3, a method 300 for detecting and reporting an accident is disclosed. At step 302, the method 300 can include detection of a plurality of parameters by one or more sensors 102. The one or more sensors 102 can include any or a combination of a speed sensor, a noise sensor, a pressure sensor, and an accelerometer, and each of the sensors 102 can be configured to detect at least one of the one or more parameters, such as speed, impact, pressure, vibration, and sound.
[0070] At step 304, the method 300 can include detection of location information of the vehicle by a location identifier 104. The location identifier 104 can include any or a combination of a location sensor, a Global Positioning System sensor, and a geolocation sensor, and can be configured to detect latitude and longitude coordinates, which can be used to determine the location of the vehicle at the time of the accident.
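Since the disclosure uses the detected coordinates to locate the nearest hospital, the selection could be sketched with the standard haversine great-circle distance. The hospital names and coordinates below are hypothetical examples only.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two latitude/longitude pairs."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km: mean Earth radius

def nearest_hospital(vehicle_coords, hospitals):
    """Pick the hospital closest to the accident coordinates."""
    return min(hospitals, key=lambda h: haversine_km(*vehicle_coords, *h["coords"]))

# Illustrative registry; a real system would query the centralized server 110.
hospitals = [
    {"name": "Hospital A", "coords": (30.52, 76.66)},
    {"name": "Hospital B", "coords": (30.74, 76.78)},
]
closest = nearest_hospital((30.71, 76.80), hospitals)
```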
[0071] At step 306, the method 300 can include analyzing the one or more parameters to detect an accident. The analysis can be performed using a machine learning model, such as a trained VGG-19 model, to detect an accident accurately.
[0072] At step 308, the method 300 can include generation of a warning signal by the first processing unit 106. The generated warning signal can be transmitted to a centralized server 110. The centralized server 110 can be a server of hospitals for storing information of emergency vehicles, such as ambulances and unmanned aerial vehicles (UAVs), which are used by the hospitals.
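As a minimal sketch of the warning signal carrying the location information to the centralized server, the payload could be serialized as JSON. The field names (`vehicle_id`, `location`, `readings`) are assumptions for the example; the disclosure does not mandate a wire format.

```python
import json
import time

def build_warning_signal(vehicle_id, lat, lon, readings):
    """Assemble the warning payload transmitted to the centralized server;
    field names are illustrative, not mandated by the disclosure."""
    return json.dumps({
        "vehicle_id": vehicle_id,
        "timestamp": int(time.time()),          # when the accident was detected
        "location": {"lat": lat, "lon": lon},   # from the location identifier 104
        "readings": readings,                   # raw sensor values for triage
    })

signal = build_warning_signal("VH-001", 30.71, 76.80, {"impact_g": 5.2})
```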
[0073] At step 310, the method 300 can include receiving of one or more images by the centralized server 110, the images being acquired from one or more image acquisition units 114. The one or more image acquisition units 114 can be positioned on roads or mounted on one or more unmanned aerial vehicles (UAVs).
[0074] In an exemplary embodiment, upon receiving the warning signal, a second processing unit 112 of the centralized server 110 can instruct a UAV found in the vicinity of the accident to acquire images of the accident. In another exemplary embodiment, upon receiving the warning signal, the second processing unit 112 can collect images of the location of the accident from the surveillance cameras positioned on the road.
[0075] At step 312, the method 300 can include pre-processing of the received one or more images by the second processing unit 112 to evaluate the severity of the accident. For the analysis, the second processing unit 112 can implement one or more deep learning models (such as a convolutional neural network (CNN)) to evaluate the severity of the accident accurately.
[0076] At step 314, the method 300 can include instructing, by the second processing unit 112, an emergency vehicle to move to the location of the accident with one or more medical equipment required based on the evaluated severity of the accident. For example, if the driver of the vehicle is found in a critical condition, then oxygen cylinders, a first aid kit, and associated healthcare professionals can be sent to the location with the emergency vehicle to save the life of the driver and to reduce the chances of a casualty. Also, the second processing unit 112 can automatically notify a nearby police station regarding the accident, and family members of the driver or the owner of the vehicle can be informed by transmitting other warning signals to stored contact numbers.
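The severity-driven dispatch and notification described above could be sketched as a simple lookup, shown below. The severity classes, equipment lists, and notification order are illustrative assumptions, not requirements of the disclosure.

```python
# Hypothetical mapping from evaluated severity to dispatched resources.
DISPATCH_PLAN = {
    "minor": {"vehicle": "ambulance", "equipment": ["first aid kit"]},
    "moderate": {"vehicle": "ambulance",
                 "equipment": ["first aid kit", "stretcher"]},
    "severe": {"vehicle": "ambulance",
               "equipment": ["first aid kit", "oxygen cylinder", "defibrillator"],
               "staff": ["paramedic", "doctor"]},
}

def dispatch(severity, location, notify):
    """Select the plan matching the evaluated severity, then notify the
    nearby police station and the stored family contacts."""
    plan = DISPATCH_PLAN[severity]
    notify("police station", location)
    notify("family contacts", location)
    return {"location": location, **plan}

# Example: a severe accident; the lambda stands in for a real messaging channel.
notifications = []
order = dispatch("severe", (30.71, 76.80),
                 lambda who, loc: notifications.append(who))
```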
[0077] The above described features, configurations, effects, and the like are included in at least one of the embodiments of the present invention, and should not be limited to only one embodiment. In addition, the features, configurations, effects, and the like as illustrated in each embodiment may be implemented with regard to other embodiments as they are combined with one another or modified by those skilled in the art. Thus, content related to these combinations and modifications should be construed as being included in the scope and spirit of the invention as disclosed in the accompanying claims.
[0078] Further, although the embodiments have been mainly described until now, they are just exemplary and do not limit the present invention. Thus, those skilled in the art to which the present invention pertains will know that various modifications and applications which have not been exemplified may be performed within a range which does not deviate from the essential characteristics of the embodiments. For instance, the constituent elements described in detail in the exemplary embodiments can be modified to be performed. Further, the differences related to such modifications and applications shall be construed to be included in the scope of the present invention specified in the attached claims.
[0079] The present invention encompasses various modifications to each of the examples and embodiments discussed herein. According to the invention, one or more features described above in one embodiment or example can be equally applied to another embodiment or example described above. The features of one or more embodiments or examples described above can be combined into each of the embodiments or examples described above. Any full or partial combination of one or more embodiment or examples of the invention is also part of the invention.

ADVANTAGES OF THE PRESENT DISCLOSURE
[0080] The present disclosure provides a system for detecting an accident.
[0081] The present disclosure provides a system for notifying nearby hospitals, police, etc. in a real-time manner without causing any delay.
[0082] The present disclosure provides a system to increase the probability of saving lives during an accident.
[0083] The present disclosure provides a system for tracking the location of the vehicle, which facilitates obtaining the exact location of the vehicle where the accident occurs.
[0084] The present disclosure provides a system for detecting an accident and notifying with an efficient and cost-effective solution.

We Claim:

1. A system 100 for detecting an accident and reporting, the system comprising:
one or more sensors 102 coupled with a vehicle to detect a plurality of parameters of the vehicle;
a location identifier 104 located inside the vehicle to determine location information of the vehicle;
a first processing unit 106 operatively coupled with the one or more sensors 102 and the location identifier 104, wherein the first processing unit 106 comprises a learning engine 108 coupled with a memory, the memory storing instructions executable by the learning engine 108 and configured to:
analyse the plurality of parameters to determine occurring of an accident, and correspondingly actuate an alert unit located inside the vehicle;
generate a warning signal, wherein the warning signal pertains to location information of the vehicle, wherein the warning signal is transmitted to a centralized server 110; and
the centralized server 110 comprising:
a second processing unit 112, configured to:
receive one or more images of location of the accident by one or more image acquisition units 114; and
pre-process the received one or more images to evaluate severity of the accident, and correspondingly instruct an emergency vehicle to accommodate one or more medical equipment and move to the location of the accident, wherein the location information of the vehicle is extracted from the received warning signal.
2. The system as claimed in claim 1, wherein the second processing unit 112 implements one or more deep learning models to evaluate the severity of the accident.
3. The system as claimed in claim 1, wherein the one or more image acquisition units 114 are located on a plurality of roads to capture images of a plurality of vehicles on the road, wherein upon receiving information of the accident, the centralized server acquires one or more images of the location of the accident from the associated image acquisition unit 114 to evaluate the severity of the accident.
4. The system as claimed in claim 1, wherein the one or more image acquisition units 114 are located on one or more unmanned aerial vehicles (UAVs), wherein upon receiving information of the accident, the centralized server instructs the UAVs found in the vicinity of the accident to move to the location of the accident to acquire one or more images of the accident.
5. The system as claimed in claim 1, wherein the one or more sensors 102 comprise any or a combination of a speed sensor, a noise sensor, a pressure sensor, and an accelerometer.
6. The system as claimed in claim 1, wherein the plurality of parameters comprises any or a combination of speed, impact, pressure, vibration, and sound.
7. The system as claimed in claim 1, wherein the location identifier 104 comprises any or a combination of a location sensor, a Global Positioning System (GPS) sensor, and a geolocation sensor.
8. A method 300 for detecting an accident of a vehicle and reporting, the method comprising:
detecting, by one or more sensors 102, a plurality of parameters;
detecting, by a location identifier 104, location information of the vehicle;
analyzing, by a first processing unit 106, the plurality of parameters to detect an accident;
generating, by the first processing unit 106, a warning signal, wherein the warning signal is transmitted to a centralized server;
receiving, by the centralized server 110, one or more images acquired from one or more acquisition units;
pre-processing the received one or more images, by a second processing unit 112 to evaluate severity of the accident; and
instructing, by the second processing unit 112, an emergency vehicle to move to the location of accident with one or more medical equipment required based on the evaluated severity of the accident.
9. The method as claimed in claim 8, wherein the one or more image acquisition units 114 are positioned on a plurality of roads or mounted on one or more unmanned aerial vehicles (UAVs).

Documents

Application Documents

# Name Date
1 202211003728-STATEMENT OF UNDERTAKING (FORM 3) [22-01-2022(online)].pdf 2022-01-22
2 202211003728-POWER OF AUTHORITY [22-01-2022(online)].pdf 2022-01-22
3 202211003728-FORM FOR SMALL ENTITY(FORM-28) [22-01-2022(online)].pdf 2022-01-22
4 202211003728-FORM 1 [22-01-2022(online)].pdf 2022-01-22
5 202211003728-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [22-01-2022(online)].pdf 2022-01-22
6 202211003728-DRAWINGS [22-01-2022(online)].pdf 2022-01-22
7 202211003728-DECLARATION OF INVENTORSHIP (FORM 5) [22-01-2022(online)].pdf 2022-01-22
8 202211003728-COMPLETE SPECIFICATION [22-01-2022(online)].pdf 2022-01-22
9 202211003728-FORM FOR STARTUP [24-01-2022(online)].pdf 2022-01-24
10 202211003728-EVIDENCE FOR REGISTRATION UNDER SSI [24-01-2022(online)].pdf 2022-01-24
11 202211003728-Proof of Right [04-02-2022(online)].pdf 2022-02-04
12 202211003728-FORM-9 [09-11-2022(online)].pdf 2022-11-09
13 202211003728-FORM 18 [06-11-2023(online)].pdf 2023-11-06
14 202211003728-FER.pdf 2024-02-28
15 202211003728-FER_SER_REPLY [28-08-2024(online)].pdf 2024-08-28
16 202211003728-CORRESPONDENCE [28-08-2024(online)].pdf 2024-08-28
17 202211003728-COMPLETE SPECIFICATION [28-08-2024(online)].pdf 2024-08-28
18 202211003728-CLAIMS [28-08-2024(online)].pdf 2024-08-28
19 202211003728-US(14)-HearingNotice-(HearingDate-25-02-2025).pdf 2025-01-24
20 202211003728-FORM-26 [20-02-2025(online)].pdf 2025-02-20
21 202211003728-Correspondence to notify the Controller [20-02-2025(online)].pdf 2025-02-20
22 202211003728-Written submissions and relevant documents [12-03-2025(online)].pdf 2025-03-12
23 202211003728-PatentCertificate18-09-2025.pdf 2025-09-18
24 202211003728-IntimationOfGrant18-09-2025.pdf 2025-09-18

Search Strategy

1 SearchStrategyE_09-02-2024.pdf
