Abstract: The present invention is directed to a crash warning triggering system (100) for a two-wheeled vehicle (102). The system (100) comprises a first control unit (108) communicably coupled to vehicle data sensors (104) and to a UAV (106). The first control unit (108) is configured to determine vehicle operating parameters of the vehicle (102) based on one or more vehicle related information. A crash condition of the vehicle (102) is then determined when the vehicle operating parameters exceed a threshold value. One or more control commands are then generated and transmitted to the UAV (106) upon determining the crash condition. The UAV (106) is then operated based on the one or more control commands, the one or more control commands comprising an image capture command for enabling the UAV (106) to capture environment data of a driving environment of the rider (R) for determining an injury condition of the rider (R). Reference Figure 1
Description: FIELD OF THE INVENTION
[001] The present invention relates to a crash warning triggering system for a two-wheeled vehicle and a method thereof. Particularly, the present invention relates to a crash warning triggering system that operates through an Unmanned Aerial Vehicle (UAV) in the event of a crash condition of the two-wheeled vehicle.
BACKGROUND OF THE INVENTION
[002] In the recent past, vehicles such as two-wheeled vehicles are typically equipped with an automatic Save Our Ship (SOS) triggering system as a safety accessory. The SOS triggering system comprises a telematics unit that is adapted to trigger emergency SOS calls to emergency services and/or contacts of a rider of the vehicle when a crash of the vehicle is determined. As a result, in the event of a crash of the vehicle, the emergency services (such as hospitals) are notified, thereby facilitating suitable medical care for the rider.
[003] However, the existing SOS triggering systems are incapable of determining a severity of the crash condition and/or an injury condition of the rider. Such a deficiency may leave a rescue team from the emergency services uninformed about the condition of the rider, and therefore less equipped to provide adequate treatment or first aid to the rider upon reaching the location of the crash. Also, as the injury condition of the rider is unknown to the emergency services, medical assistance is unavailable to the rider until the rescue team arrives. Further, the lack of determination of the severity of the crash leads to every crash event being reported to the emergency services. Such a situation may affect effective utilization of the emergency services, since not every injury sustained by the rider needs to be reported. Moreover, the existing SOS triggering systems are incapable of interacting with the rider to maintain the consciousness of the rider, which may further decrease the probability of overall recovery of the rider from the crash.
[004] Thus, there is a need for a crash warning triggering system and a method thereof which addresses at least one of the aforementioned problems.
SUMMARY OF THE INVENTION
[005] In one aspect, a crash warning triggering system for a two-wheeled vehicle is provided. The system comprises one or more vehicle data sensors disposed in the two-wheeled vehicle. Each of the one or more vehicle data sensors is adapted to procure one or more vehicle related information of the two-wheeled vehicle. An Unmanned Aerial Vehicle (UAV) is disposed on the two-wheeled vehicle. The UAV comprises a vision sensor for capturing environment data of a driving environment of a rider of the two-wheeled vehicle. A first control unit is disposed in the two-wheeled vehicle and is communicably coupled to the one or more vehicle data sensors and to the UAV. The first control unit is configured to receive the one or more vehicle related information of the two-wheeled vehicle from each of the one or more vehicle data sensors. One or more vehicle operating parameters of the two-wheeled vehicle are then determined based on the one or more vehicle related information. Subsequently, a crash condition of the two-wheeled vehicle is determined when the one or more vehicle operating parameters exceed a threshold value. One or more control commands are generated by the first control unit, wherein the one or more control commands are to be transmitted to the UAV upon determining the crash condition of the two-wheeled vehicle. The UAV is operated based on the one or more control commands, the one or more control commands comprising an image capture command for enabling the UAV to capture the environment data of the driving environment of the rider for determining an injury condition of the rider.
[006] In an embodiment, an image processing unit is disposed in one of the UAV and the two-wheeled vehicle. The image processing unit is communicably coupled to the first control unit and is adapted to receive the environment data from the UAV for analyzing a bloodstain pattern, wherein the first control unit is adapted to perform a blood splatter analysis based on the bloodstain pattern for determining the injury condition of the rider.
[007] In an embodiment, the first control unit is adapted to classify the injury condition of the rider into one of a less severe condition, an intermediately severe condition, a moderately severe condition, and a highly severe condition based on the blood splatter analysis.
[008] In an embodiment, the first control unit is adapted to operate a flight controller unit in the UAV, for enabling the UAV to ascend to one of a predefined flight pattern and an adaptive flight pattern for capturing the environment data of the rider upon providing the image capture command.
[009] In an embodiment, a telematics unit is disposed in one of the UAV and the two-wheeled vehicle. The telematics unit is communicably coupled to the first control unit for providing a location data and an elevation data of a location of the crash from the telematics unit, wherein the first control unit is adapted to generate a flight pattern for the UAV based on the location data and the elevation data for procuring the environment data of the rider.
[010] In an embodiment, the one or more control commands comprise at least one of an audio communication command, for maintaining an audio communication with the rider to ensure consciousness of the rider; a kit deploy command, for providing a first-aid package to the rider; an alert command, for alerting an emergency service team regarding the crash condition of the rider; and a reimburse communication command, for providing the environment data to a financial institution.
[011] In an embodiment, the UAV comprises a voice communication unit having a speaker unit and a microphone unit. The voice communication unit is adapted to maintain audio communication with the rider for ensuring consciousness of the rider, upon receiving the audio communication command from the first control unit.
[012] In an embodiment, the first control unit is configured to operate the UAV to deploy the first-aid package through a payload unit of the UAV to the rider, when the injury condition of the rider is one of a less severe condition, an intermediately severe condition, and a moderately severe condition.
[013] In an embodiment, the first control unit is adapted to operate a signal transmission unit of the UAV for enabling the UAV to alert an emergency service team to the location of the rider during the crash condition, when the injury condition of the rider is a highly severe condition.
[014] In an embodiment, the first control unit is adapted to operate the signal transmission unit of the UAV for transferring the environment data to a financial institution, when an insurance claim is made by the rider to the financial institution.
[015] In an embodiment, the one or more vehicle data sensors comprise an Anti-lock Braking (ABS) sensor, a Light Detection and Ranging (LiDAR) sensor, a Radio Detection and Ranging (RADAR) sensor and an image sensor.
[016] In an embodiment, the UAV comprises one or more sensors. The one or more sensors are adapted to determine the crash condition of the two-wheeled vehicle if the one or more vehicle data sensors are damaged during the crash condition of the two-wheeled vehicle.
[017] In an embodiment, a second control unit is disposed in the UAV. The second control unit is adapted to generate the one or more control commands and operate the UAV based on the one or more control commands for determining the injury condition of the rider during the crash condition, when the first control unit is determined to be in a failure condition.
[018] In another aspect, a method of operating the crash warning triggering system of the two-wheeled vehicle is provided. The method comprises the act of receiving, by the first control unit, the one or more vehicle related information of the two-wheeled vehicle from each of the one or more vehicle data sensors. The one or more vehicle operating parameters of the two-wheeled vehicle are then determined based on the one or more vehicle related information. Subsequently, the crash condition of the two-wheeled vehicle is determined when the one or more vehicle operating parameters exceed the threshold value. The one or more control commands are generated by the first control unit, wherein the one or more control commands are to be transmitted to the UAV upon determining the crash condition of the two-wheeled vehicle. The UAV is operated based on the one or more control commands, the one or more control commands comprising an image capture command for enabling the UAV to capture the environment data of the driving environment of the rider for determining an injury condition of the rider.
BRIEF DESCRIPTION OF THE DRAWINGS
[019] Reference will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 is a block diagram of the crash warning triggering system, in accordance with an exemplary embodiment of the present invention.
Figure 2 is a schematic view of a two-wheeled vehicle comprising a crash warning triggering system, in accordance with an exemplary embodiment of the present invention.
Figure 3 is a block diagram of a first control unit of the crash warning triggering system of Figure 1, in accordance with an exemplary embodiment of the present invention.
Figure 4 is a flow diagram of a method for operating the crash warning triggering system based on severity of a crash of the two-wheeled vehicle, in accordance with an exemplary embodiment of the present invention.
Figure 5 is a flow diagram of the method for operating the crash warning triggering system depicting determination of an injury condition of the rider based on image processing results, in accordance with an exemplary embodiment of the present invention.
Figure 6 is a flow diagram of a method for operating the crash warning triggering system, in accordance with an exemplary embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[020] The present invention relates to a crash warning triggering system and a method thereof for a two-wheeled vehicle. Particularly, the present invention relates to a crash warning triggering system that operates through an Unmanned Aerial Vehicle (UAV) in the event of a crash condition of the two-wheeled vehicle.
[021] Figure 1 is a block diagram of a crash warning triggering system 100 for a two-wheeled vehicle 102, in accordance with an exemplary embodiment of the present invention. The crash warning triggering system 100 (hereinafter referred to as ‘system 100’) is adapted to provide dynamic assistance to a rider R (shown in Figure 2) of the two-wheeled vehicle 102 (hereinafter referred to as ‘vehicle 102’) in the event of a crash condition of the rider R. In the present disclosure, the term “crash condition” refers to a collision or an accident of the vehicle 102 with an obstacle or another vehicle. Accordingly, the term “crash condition” of the vehicle 102 is interchangeably referred to as “crash” or as “crash condition” in the present disclosure. In the present embodiment, the vehicle 102 comprises a prime mover adapted to provide motive force required for movement of the vehicle. The prime mover may be an internal combustion engine and/or an electric motor.
[022] The system 100 comprises one or more vehicle data sensors 104 (hereinafter interchangeably referred to as ‘vehicle data sensor 104’) disposed in the vehicle 102 (as shown in Figure 2). Each of the vehicle data sensors 104 is adapted to procure one or more vehicle related information of the vehicle 102. The vehicle related information of the vehicle 102 may be information pertaining to parameters that govern or are associated with the operation of the vehicle 102. In an embodiment, the vehicle related information may be information pertaining to a speed of the vehicle 102, a braking condition of the vehicle 102 and an orientation or inclination of the vehicle 102 with respect to a ground surface. In the present embodiment, the vehicle data sensors 104 comprise an Anti-lock Braking (ABS) sensor 104a, a Light Detection and Ranging (LiDAR) sensor 104b, a Radio Detection and Ranging (RADAR) sensor 104d, and an image sensor 104c.
[023] In an embodiment, the ABS sensor 104a is located within a hub (not shown) of a wheel (not shown) of the vehicle 102. The ABS sensor 104a is adapted to monitor a wheel speed of the vehicle 102, during movement of the vehicle 102. Accordingly, the ABS sensor 104a is adapted to monitor an abrupt deceleration of the vehicle 102 or the rate of engagement of brakes (not shown) by the rider R.
[024] In an embodiment, the LiDAR sensor 104b is located within a rear-view mirror (not shown) or below a handle member (not shown) of the vehicle 102. The LiDAR sensor 104b is adapted to detect objects in front of the vehicle 102 and estimate distance between the vehicle 102 and the object, during movement of the vehicle 102.
[025] In an embodiment, the image sensor 104c is located below the headlamp assembly (not shown) of the vehicle 102. The image sensor 104c is adapted to capture an image data of the surroundings, during movement of the vehicle 102.
[026] In an embodiment, the RADAR sensor 104d is located in front of the handlebar member. The RADAR sensor 104d is adapted to detect objects in front of the vehicle 102 and estimate distance between the vehicle 102 and the object, during movement of the vehicle 102.
[027] In an embodiment, the vehicle data sensors 104 may also comprise an Inertial Measurement Unit (IMU) for determining inclination of the vehicle 102 with respect to the ground surface.
[028] Further, the system 100 comprises an Unmanned Aerial Vehicle (UAV) 106 disposed on the vehicle 102. In the present embodiment, the UAV 106 is disposed on a rear end of the vehicle 102. In an embodiment, the UAV 106 is disposed on the vehicle 102 through a mounting mechanism such as a fastening mechanism or a clamping mechanism known in the art. The UAV 106 is provided with a vision sensor 116 (or image sensor) for capturing an environment data of a driving environment of the rider R of the vehicle 102. In an embodiment, the term “environment data” pertains to data of the driving environment or surroundings at which the crash of the vehicle 102 has occurred. As such, the UAV 106, using the vision sensor 116, is capable of capturing data pertaining to the surroundings of the rider R during traversal or travelling of the vehicle 102. In an embodiment, the UAV 106 is a drone mounted onto the rear end of the vehicle 102. In the present embodiment, the vision sensor 116 is a camera module mounted onto the UAV 106.
[029] Referring to Figure 3 in conjunction with Figures 1 and 2, the system 100 comprises a first control unit 108 disposed in the vehicle 102. In the present embodiment, the first control unit 108 may be disposed within an instrument cluster (not shown) of the vehicle 102. The first control unit 108 is communicably coupled to the UAV 106 and the vehicle data sensors 104. The first control unit 108 is communicably coupled to the UAV 106 and the vehicle data sensors 104 through a wired connection or a wireless connection as per requirement. The first control unit 108 is adapted to operate the UAV 106 suitably based on the information received from the vehicle data sensors 104, for capturing the environment data of the rider R during the crash condition.
[030] In an embodiment, the first control unit 108 is adapted to operate the UAV 106 through a second control unit 120 disposed in the UAV 106. As such, the UAV 106 comprises the second control unit 120. In another embodiment, the second control unit 120 is adapted to independently operate the UAV 106 for capturing the environment data, when the first control unit 108 is in a failure condition, during the crash of the vehicle 102. The term “failure condition” may be a damaged condition of the first control unit 108 during the crash of the vehicle 102.
[031] In an embodiment, the first control unit 108 can be in communication with at least one vehicle control unit (not shown) of the vehicle 102. Accordingly, the first control unit 108 may obtain the one or more vehicle related information from the vehicle control unit. In an embodiment, the first control unit 108 may comprise one or more additional components such as, but not limited to, an input/output module 122 and a processing module 124 with an analytic module 126.
[032] In an embodiment, the first control unit 108 may be embodied as a multi-core processor, a single core processor or a combination of one or more multi-core processors and one or more single core processors. For example, the first control unit 108 is embodied as one or more of various processing devices or modules, such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, but not limited to, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In yet another embodiment, the first control unit 108 may be configured to execute hard-coded functionality. In still another embodiment, the first control unit 108 may be embodied as an executor of instructions, where the instructions specifically configure the first control unit 108 to perform the steps or operations described herein for operating the UAV 106 during the crash condition of the vehicle 102.
[033] Further, the processing module 124 is communicably coupled to a memory unit 128. The memory unit 128 is capable of storing information processed by the processing module 124 while operating the UAV 106 during the crash condition of the vehicle 102. In an embodiment, the memory unit 128 may be external to the first control unit 108.
[034] In an embodiment, the memory unit 128 is embodied as one or more volatile memory devices, one or more non-volatile memory devices and/or a combination thereof, such as magnetic storage devices, optical-magnetic storage devices and the like, as per design feasibility and requirement. The memory unit 128 communicates with the first control unit 108 or the processing module 124 via suitable interfaces such as an Advanced Technology Attachment (ATA) adapter, a Serial ATA (SATA) adapter, a Small Computer System Interface (SCSI) adapter, a network adapter or any other component enabling communication between the memory unit 128 and the first control unit 108 or the processing module 124. In an embodiment, the first control unit 108 may be connected to a power supply such as a battery module (not shown) of the vehicle 102, for receiving electrical power. In an embodiment, the first control unit 108 may have an inbuilt power supply 130 for drawing power from the battery module of the vehicle 102.
[035] In an embodiment, the first control unit 108, or the analytic module 126 of the processing module 124 of the first control unit 108, is adapted to operate the UAV 106 during the crash condition of the rider R based on the one or more vehicle related information procured by the vehicle data sensors 104. The first control unit 108 is adapted to determine one or more vehicle operating parameters of the vehicle 102 based on the vehicle related information from the vehicle data sensors 104. Based on the vehicle operating parameters, the first control unit 108 is adapted to operate the UAV 106.
[036] In an embodiment, the first control unit 108 is adapted to determine the crash condition of the vehicle 102 based on the vehicle operating parameters. As an example, if an abrupt deceleration is determined by the first control unit 108 based on data received from the ABS sensor 104a, and/or the vehicle 102 is determined to be tilted horizontally (or beyond a permissible tilt angle, e.g. 50 degrees) with respect to a ground surface based on data received from the LiDAR sensor 104b or the RADAR sensor 104d, the first control unit 108 may determine the vehicle 102 to be in the crash condition. In another embodiment, the first control unit 108 may be adapted to determine the crash condition of the vehicle 102 when a collision is detected by a collision sensor (not shown).
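The following is a minimal, illustrative sketch (in Python) of the threshold check described in paragraph [036]. The parameter names and the deceleration threshold are assumptions introduced for illustration; only the 50 degree tilt angle is taken from the example above, and the specification does not prescribe any particular implementation.

```python
# Illustrative sketch of the crash-condition check; not part of the specification.
# The deceleration threshold below is an assumed placeholder value.

DECEL_THRESHOLD_MPS2 = 8.0   # assumed abrupt-deceleration limit (placeholder)
TILT_THRESHOLD_DEG = 50.0    # permissible tilt angle given as an example above

def is_crash_condition(deceleration_mps2: float, tilt_deg: float) -> bool:
    """Return True when either vehicle operating parameter exceeds its threshold."""
    abrupt_deceleration = deceleration_mps2 > DECEL_THRESHOLD_MPS2
    excessive_tilt = abs(tilt_deg) > TILT_THRESHOLD_DEG
    return abrupt_deceleration or excessive_tilt
```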
[037] The first control unit 108 upon determining the crash condition of the vehicle 102 is adapted to generate one or more control commands for operating the UAV 106. The one or more control commands are transmitted to the UAV 106 by the first control unit 108, for operating the UAV 106 suitably. In an embodiment, the one or more control commands are transmitted to the UAV 106 through an input/output module (not shown) provided in the UAV 106.
[038] In an embodiment, the one or more control commands generated by the first control unit 108 may be an image capture command. The first control unit 108 transmits the image capture command to the UAV 106. Upon receiving the image capture command from the first control unit 108, the UAV 106 activates the vision sensor 116 for capturing the environment data of the rider R in the crash. In an embodiment, the first control unit 108 upon transmitting the image capture command to the UAV 106 subsequently activates a flight controller unit (not shown) disposed in the UAV 106, for enabling the UAV 106 to ascend to one of a predefined flight pattern and an adaptive flight pattern for capturing the environment data of the rider R. The images captured by the UAV 106 are transferred to the first control unit 108 for determining an injury condition of the rider R.
[039] In an embodiment, the predefined flight pattern refers to a movement pattern of the UAV 106 for capturing a field of vision of the crash of the vehicle 102. In an embodiment, the field of vision of the crash may be obtained through a front view, a rear view, a top view and side views of the environment at the location of the crash of the vehicle 102. Accordingly, the predefined flight pattern may include movement of the UAV 106 to at least one of a front side, a rear side, a top side, a left side and a right side of the environment for capturing the field of vision of the crash. In another embodiment, the views captured by the UAV 106 may be within a predetermined perimeter around the location of the crash of the vehicle 102. As such, data such as the elevation or height to which the drone is to be moved may be set in the predefined flight pattern of the UAV 106. As an example, the first control unit 108 is adapted to control movement of the UAV 106 in the predefined flight pattern for capturing the front view, the rear view, the top view and the side views of the environment data within a perimeter of 50 meters around the location of the crash of the vehicle 102.
[040] In an embodiment, the adaptive flight pattern refers to a movement pattern of the UAV 106 required for capturing intricate details of the crash that could not be captured through the predefined flight pattern. As such, in the adaptive flight pattern, the UAV 106 may be required to be elevated at an altitude higher or lower than the elevation set in the predefined flight pattern for obtaining the field of vision. As an example, if the crash of the vehicle 102 occurs in a hilly terrain and the vehicle 102 and the rider R are separated subsequent to the crash, the first control unit 108 may first receive the images captured by the UAV 106 through the predefined flight pattern. If the images are unable to provide information for assessing the severity of the injury of the rider R, the first control unit 108 operates the UAV 106 in the adaptive flight pattern. Accordingly, the first control unit 108 may provide elevation and location details to the flight controller unit of the UAV 106 for capturing the images of the crash suitably.
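As a non-limiting illustration of paragraphs [039] and [040], the sketch below generates waypoints for a predefined flight pattern (front, rear, side and top views around a 50 meter perimeter) and derives an adaptive pass at a different altitude. The coordinate frame, the altitude values and the Waypoint structure are assumptions made purely for illustration.

```python
# Illustrative waypoint generation for the predefined and adaptive flight
# patterns; the coordinate frame and altitude values are assumptions.

from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    x_m: float    # lateral offset from the crash location (metres)
    y_m: float    # longitudinal offset from the crash location (metres)
    alt_m: float  # altitude above ground (metres)

def predefined_pattern(radius_m: float = 50.0, alt_m: float = 10.0) -> List[Waypoint]:
    """Front, rear, left, right and top views around the crash location."""
    return [
        Waypoint(0.0, radius_m, alt_m),    # front view
        Waypoint(0.0, -radius_m, alt_m),   # rear view
        Waypoint(-radius_m, 0.0, alt_m),   # left side view
        Waypoint(radius_m, 0.0, alt_m),    # right side view
        Waypoint(0.0, 0.0, alt_m * 2.0),   # top view
    ]

def adaptive_pattern(base: List[Waypoint], alt_offset_m: float) -> List[Waypoint]:
    """Re-fly the same stations at a higher or lower altitude when the
    predefined pass did not capture enough detail of the crash."""
    return [Waypoint(w.x_m, w.y_m, max(2.0, w.alt_m + alt_offset_m)) for w in base]
```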
[041] The first control unit 108 transfers the images captured by the vision sensor 116 to an image processing device, which analyses the images for a bloodstain. In an embodiment, the image processing device may be an image processing unit 112 (shown in Figure 1) disposed in the UAV 106 or in the vehicle 102. The first control unit 108 is adapted to perform a blood splatter analysis based on the bloodstain information obtained from the image processing unit 112. The blood splatter analysis determines the injury condition of the rider R. Thus, if the images provided by the UAV 106 do not provide information on a bloodstain, the image processing unit 112 determines that the images lack the information required for assessing the severity of the injury of the rider R. Accordingly, the first control unit 108 operates the UAV 106 in the adaptive flight pattern until the bloodstains at the crash are identified. In an embodiment, the first control unit 108 performs the blood splatter analysis through one or more techniques known in the art. In another embodiment, the first control unit 108 may perform the blood splatter analysis by analyzing a volume of blood lost by the rider R based on the analysis performed by the image processing unit 112 along with the elevation data and a slope of the ground surface at the location of the crash. In an embodiment, the first control unit 108 may perform the blood splatter analysis based on a general posture of the rider R, such as gripping an injured body part, determined through sensors (for e.g. sensors 106a).
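The retry behaviour described in paragraph [041], in which the UAV is re-flown in the adaptive flight pattern until a bloodstain is visible, could be sketched as follows. The fly_and_capture and detect_bloodstain callables are hypothetical placeholders standing in for the flight controller unit and the image processing unit 112; the pass count and altitude step are assumed values.

```python
# Hedged sketch of the capture-and-retry loop from paragraph [041].
# `uav.fly_and_capture` and `detect_bloodstain` are hypothetical placeholders.

def capture_until_bloodstain(uav, detect_bloodstain, max_passes: int = 3,
                             alt_step_m: float = 5.0):
    """Fly the predefined pattern first; if no bloodstain is detected in the
    images, re-fly adaptive passes at increasing altitude offsets."""
    images = uav.fly_and_capture(altitude_offset_m=0.0)        # predefined pattern
    for attempt in range(1, max_passes + 1):
        if detect_bloodstain(images):
            break
        images = uav.fly_and_capture(altitude_offset_m=attempt * alt_step_m)  # adaptive pattern
    return images
```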
[042] Upon analyzing the bloodstain, the first control unit 108 may determine a score representing the severity of the injury of the rider R. In an embodiment, the first control unit 108 classifies the injury of the rider R into one of a less severe condition, an intermediately severe condition, a moderately severe condition, and a highly severe condition based on the blood splatter analysis. In an embodiment, the first control unit 108 classifies the injury of the rider R as the highly severe condition when the score exceeds 89. In an embodiment, the first control unit 108 classifies the injury of the rider R as the moderately severe condition when the score is in the range of about 70 to about 89. In an embodiment, the first control unit 108 classifies the injury of the rider R as the intermediately severe condition when the score is in the range of about 30 to about 69. In an embodiment, the first control unit 108 classifies the injury of the rider R as the less severe condition when the score is in the range of about 0 to about 29.
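The score boundaries of paragraph [042] map directly onto a simple classification routine, sketched below. The scoring itself (the blood splatter analysis) is outside the scope of this sketch.

```python
# Score-to-severity mapping following the ranges given in paragraph [042].

def classify_injury(score: float) -> str:
    """Classify the injury condition of the rider from the severity score."""
    if score > 89:
        return "highly severe"
    if score >= 70:
        return "moderately severe"      # about 70 to about 89
    if score >= 30:
        return "intermediately severe"  # about 30 to about 69
    return "less severe"                # about 0 to about 29
```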
[043] Further, the first control unit 108 is also communicably coupled to a telematics unit 110 disposed in one of the UAV 106 and the vehicle 102. The telematics unit 110 is adapted to provide a location data and an elevation data of the location of the crash of the vehicle 102. Thus, the first control unit 108 may be adapted to monitor and adjust the predefined flight pattern and/or the adaptive flight pattern of the UAV 106 based on the location data and the elevation data provided by the telematics unit 110. As such, the first control unit 108 considers not only the result from the image processing unit 112 but also the data from the telematics unit 110 for monitoring and adjusting the predefined flight pattern and/or the adaptive flight pattern of the UAV 106.
[044] The first control unit 108 is further communicably coupled to a voice communication unit 114 disposed in the UAV 106. The voice communication unit 114 may comprise a speaker unit (not shown) and a microphone unit (not shown) and is adapted to maintain an audio communication with the rider R of the vehicle 102. As such, the UAV 106 is adapted to maintain a voice communication with the rider R through the voice communication unit 114. Such a configuration of the system 100 maintains the consciousness of the rider R, thereby increasing the probability of recovery of the rider R upon suitable medical assistance. In an embodiment, the voice communication unit 114 (i.e. the speaker unit and the microphone unit) may operate in accordance with speech recognition and relay techniques known in the art. In an embodiment, the UAV 106 is adapted to provide audio inputs to the rider R through the voice communication unit 114 upon determining the crash condition of the vehicle 102.
[045] In an embodiment, the UAV 106 may be equipped with medical checklist queries that may be posed to the rider R upon determination of the crash of the vehicle 102. The UAV 106 may request an audible response from the rider R for the medical checklist queries. Based on the response received from the rider R, the first control unit 108 may determine the severity of the injury of the rider R. The first control unit 108 may also transmit the audible response of the rider R to an emergency service team or to a medical team to facilitate diagnosis or for equipping the necessary medical equipment for providing medical assistance to the rider R. As an example, the medical checklist queries may include questions to the rider R such as “do you feel pain or discomfort anywhere” and the like. The audible response received from the rider R may be stored and sent to the emergency service team or to the medical team for facilitating diagnosis. In an embodiment, the first control unit 108 is adapted to perform a posture analysis of the rider R based on the audible response received from the rider R, when blood is not detected from the analysis of the image processing unit 112.
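A minimal sketch of the checklist interaction in paragraph [045] is given below. Only the first query is taken from the example above; the remaining queries and the ask()/record() helpers, which stand in for the speaker unit, the microphone unit and the speech recognition pipeline, are assumptions.

```python
# Illustrative medical-checklist interaction; query list (beyond the first
# item) and the ask()/record() helpers are hypothetical placeholders.

MEDICAL_CHECKLIST = [
    "Do you feel pain or discomfort anywhere?",   # example query from the text
    "Can you move your arms and legs?",           # assumed additional query
    "Are you bleeding anywhere?",                 # assumed additional query
]

def run_checklist(ask, record):
    """Pose each query via the voice communication unit and collect the
    audible responses for forwarding to the emergency service team."""
    responses = []
    for query in MEDICAL_CHECKLIST:
        ask(query)                            # played through the speaker unit
        responses.append((query, record()))   # captured via the microphone unit
    return responses
```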
[046] Further, the first control unit 108 is communicably coupled to a payload unit (not shown) of the UAV 106, wherein the payload unit is adapted to store a first-aid package. The first control unit 108 is also adapted to deploy the payload unit for enabling the rider R to access the first-aid package based on the severity of the injury of the rider R. In an embodiment, the first control unit 108 is adapted to locate the rider R and move the UAV 106 to the location of the rider R for deploying the first-aid package to the rider R. In an embodiment, the first control unit 108 is adapted to deploy the payload unit for enabling access to the first-aid package, when the injury condition of the rider R is one of the less severe condition, the moderately severe condition and the intermediately severe condition.
[047] In an embodiment, the payload unit is adapted to store the first-aid package through a mechanism (not shown) operable between a retracted position and a deployed position. In the retracted position of the mechanism, the first-aid package is stored within the payload unit, while in the deployed position, the first-aid package is ejected or dispensed from the payload unit.
[048] Furthermore, the first control unit 108 is communicably coupled to a signal transmission unit 118 of the UAV 106. The signal transmission unit 118 is adapted to transfer the environment data to a financial institution or an emergency service team upon determination of the crash of the vehicle 102. The signal transmission unit 118 is also adapted to transfer the audible response received from the rider R for the medical checklist queries to the emergency service team. In an embodiment, the signal transmission unit 118 may be a radio-transmission device for enabling transfer of data from the system 100 to at least one of the financial institution or the emergency service team. In an embodiment, the financial institution may be an insurance corporation. In another embodiment, the emergency service team may be a hospital. In an embodiment, the first control unit 108 may enable the signal transmission unit 118 to transfer the environment data and/or the audible response from the rider R to the nearest financial institution or emergency service team based on the location of the crash of the vehicle 102, thereby enabling a swifter response from the emergency service team for providing medical assistance to the rider R. In an embodiment, the first control unit 108 may enable the signal transmission unit 118 to transfer the environment data and/or the audible response to a computing device such as a mobile device of the rider R as per requirement.
[049] Apart from the image capture command, the first control unit 108 is also adapted to generate the one or more control commands for the UAV 106 based on the severity of the injury to the rider R. In an embodiment, the image capture command is a primary command to the UAV 106, while the other one or more control commands are secondary commands. In another embodiment, the secondary commands are situational commands, which are generated and transmitted to the UAV 106 based on the severity of the injury condition of the rider R.
[050] In an embodiment, the first control unit 108 also generates an audio communication command, a kit deployment command, an alert command and a reimburse communication command based on the severity of the injury of the rider R. Upon providing the audio communication command, the first control unit 108 enables the voice communication unit 114 to maintain audio communication with the rider R for ensuring the consciousness of the rider R. Upon providing the kit deployment command, the first control unit 108 operates the UAV 106 to deploy the first-aid package through the payload unit to the rider R. In an embodiment, the kit deployment command is generated and provided to the UAV 106 by the first control unit 108, when the injury condition of the rider R is one of a less severe condition, an intermediately severe condition, and a moderately severe condition. Further, upon providing the alert command, the first control unit 108 is adapted to operate the signal transmission unit 118 for transferring the environment data and/or the audible response from the rider R to the emergency service team. Additionally, upon providing the reimburse communication command, the first control unit 108 is adapted to operate the signal transmission unit 118 for transferring the environment data to the financial institution. In an embodiment, the reimburse communication command is provided by the first control unit 108 when an insurance claim is made by the rider R to the financial institution. In an embodiment, the first control unit 108 may automatically upload the environment data to a closed cloud server (not shown), accessible through the computing device associated with the rider R, for sharing with an insurance agent or the financial institution along with a certificate indicating that the data is untampered.
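One possible mapping from the classified injury condition to the secondary commands, consistent with the flow of Figure 5 and paragraphs [049] and [050], is sketched below. The command identifiers are plain strings chosen for illustration; the actual commands transmitted to the UAV 106 are not limited to this form.

```python
# Hedged sketch of secondary command selection; command names are illustrative.

def secondary_commands(severity: str, insurance_claim: bool = False):
    commands = []
    if severity == "highly severe":
        commands.append("ALERT")                           # notify the emergency service team
    elif severity == "moderately severe":
        commands += ["KIT_DEPLOY", "AUDIO_COMMUNICATION"]  # first aid and voice assistance
    elif severity == "intermediately severe":
        commands.append("KIT_DEPLOY")                      # first aid only
    # "less severe": environment data is simply updated with rider confirmation
    if insurance_claim:
        commands.append("REIMBURSE_COMMUNICATION")         # environment data to the financial institution
    return commands
```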
[051] In an embodiment, the architecture or configuration of the second control unit 120 is identical to that of the first control unit 108, so that, in the event of failure of the first control unit 108, the second control unit 120 operates the system 100 suitably. Accordingly, the second control unit 120 is also configured with modules identical to those of the first control unit 108 for enabling operation of the system 100. Further, the second control unit 120 is associated with components corresponding to those of the first control unit 108. In another embodiment, the UAV 106 is also provided with one or more sensors 106a adapted to determine the crash condition of the vehicle 102, if the vehicle data sensors 104 are damaged or are in a failure condition during the crash of the vehicle 102. As such, the one or more sensors 106a are identical to the vehicle data sensors 104.
[052] In an embodiment, the system 100 may also be communicably coupled to a computing device or a mobile device of the rider R. Accordingly, the system 100 may utilize the signal transmission unit provided in the mobile device for ensuring communication with the emergency service team or the financial institution. In an embodiment, the system 100 may also provide an alert message to a priority contact set by the rider R, in the event of the crash of the vehicle 102. In an embodiment, the first control unit 108 is adapted to determine the location of the rider R through the mobile device, when the rider R is not visible at the location of the crash.
[053] In an embodiment, the system 100 may also be communicably coupled to the instrument cluster of the vehicle 102. Accordingly, the rider R may customize operation of the UAV 106 as per requirement.
[054] Figure 4 is a flow diagram of a method 400 for operating the system 100 based on the severity of the crash of the vehicle 102, in accordance with an exemplary embodiment of the present invention. In the present disclosure, the method 400 is described as carried out by the first control unit 108. The method 400 can be carried out by the second control unit 120 as well, when the first control unit 108 reaches the failure condition during the crash of the vehicle 102.
[055] At step 402, the first control unit 108 is adapted to determine the vehicle operating parameters based on the vehicle related information received from the vehicle data sensors 104. Based on the vehicle operating parameters, the first control unit 108 is adapted to determine whether a crash of the vehicle 102 has occurred. If the crash has occurred, the method 400 moves to step 404.
[056] At step 404, the first control unit 108 is adapted to check the severity of the crash through data received from the vehicle data sensors 104. As such, the severity condition of the rider R is determined when data received from the vehicle data sensors 104 exceeds a threshold value. If the crash is not severe, the first control unit 108 may consider the crash to be a minor incident or may trigger a “Save Our Ship” (SOS) signal to the nearby emergency service team for medical assistance at step 406. If the crash is severe, the method 400 moves to step 408. In an embodiment, the system 100 may seek confirmation from the rider R (through the instrument cluster of the vehicle 102 or through the mobile device) if the crash is a minor incident. The system 100 may also seek confirmation from the rider R if the rider R is in need of medical assistance for tending to injuries.
[057] At step 408, the first control unit 108 is adapted to disengage the UAV 106 from the vehicle 102 and activate the UAV 106. In an embodiment, activating the UAV 106 may refer to providing the predefined flight pattern or the adaptive flight pattern to the UAV 106 for traversing around the vehicle 102, for monitoring the environment data at the location of the crash of the vehicle 102.
[058] At step 410, the first control unit 108 enables the UAV 106 to capture images of the location of the crash of the vehicle 102 through the vision sensor 116, while traversing around the vehicle 102. As such, thorough data pertaining to the crash is obtained through the UAV 106.
[059] At step 412, the images captured by the vision sensor 116 are transmitted to the first control unit 108. The first control unit 108, upon receiving the images, is adapted to carry out image processing of the images through the image processing unit 112 based on one or more image processing techniques known in the art, at step 414. As already described with reference to Figure 3, the bloodstain analysis is carried out on the images obtained through the UAV 106. At step 416, the severity of the injury to the rider R is determined based on the bloodstain analysis.
[060] Referring to Figure 5 in conjunction with Figure 4, the severity analysis of the injury to the rider R is depicted. As depicted, once the bloodstain analysis is completed, a severity scale or the score is determined by the first control unit 108 for classifying the severity of the injury of the rider R, at step 502. Upon classification, based on the severity of the injury, the first control unit 108 generates the one or more control commands to the UAV 106 for suitable operation. In an embodiment, the first control unit 108 classifies the injury of the rider R into one of the less severe condition, the intermediately severe condition, the moderately severe condition, and the highly severe condition based on the blood splatter analysis.
[061] In an embodiment, when the severity score exceeds 89, the first control unit 108 moves to step 504, wherein the first control unit 108 identifies that the rider R is bleeding, unconscious and unable to move. At this stage, the first control unit 108 moves to step 512, wherein the first control unit 108 is adapted to provide the environment data to the emergency service team through the signal transmission unit 118 of the UAV 106.
[062] In an embodiment, when the severity score is in the range of about 70 to about 89, the first control unit 108 moves to step 506, wherein the first control unit 108 identifies that the rider R is bleeding, has body injuries and movement of the rider R is detected. In this scenario, the first control unit 108 moves to step 514, wherein the first control unit 108 delivers the first-aid package to the rider R. Also, the first control unit 108 moves to step 516 and provides voice assistance for ensuring the consciousness of the rider R.
[063] In an embodiment, when the severity score is in the range of about 30 to about 69, the first control unit 108 moves to step 508, wherein the first control unit 108 identifies that the rider R has minor injuries and no threat is detected. In this scenario, the first control unit 108 moves to step 518, wherein the first control unit 108 delivers the first-aid package to the rider R.
[064] In an embodiment, when the severity score is in the range of about 0 to about 29, the first control unit 108 identifies that the rider R had a minor crash and further assistance is not required. In this scenario, the first control unit 108 moves to step 520, wherein the first control unit 108 updates the environment data with confirmation from the rider R.
[065] Figure 6 is a flow diagram of a method 600 for operating the system 100, in accordance with an exemplary embodiment of the present invention. In the present disclosure, the method 600 is described as carried out by the first control unit 108. The method 600 can be carried out by the second control unit 120 as well, when the first control unit 108 reaches the failure condition during the crash of the vehicle 102.
[066] At step 602, the first control unit 108 is adapted to receive the one or more vehicle related information of the vehicle 102 from the vehicle data sensors 104.
[067] At step 604, the first control unit 108 is adapted to determine the vehicle operating parameters of the vehicle 102, based on the vehicle related information received from the vehicle data sensors 104. As an example, the first control unit 108 determines if an abrupt deceleration has occurred in the vehicle 102 or if an abrupt tilting of the vehicle 102 has occurred, based on the vehicle operating parameters of the vehicle 102.
[068] At step 606, the first control unit 108 is adapted to determine the crash condition of the vehicle 102 based on the determined vehicle operating parameters. Considering the above-mentioned example, the first control unit 108 determines the crash condition of the vehicle 102 when the abrupt deceleration has occurred in the vehicle 102 or when the abrupt tilting of the vehicle 102 has occurred, based on the vehicle operating parameters of the vehicle 102.
[069] At step 608, the first control unit 108 generates one or more control commands upon determining the crash condition of the vehicle 102. The one or more control commands are transmitted to the UAV 106, for controlling the UAV 106 suitably.
[070] At step 610, the first control unit 108 operates the UAV 106 based on the one or more control commands, wherein the one or more control commands comprise the image capture command for enabling the UAV 106 to capture the environment data of the driving environment of the rider R. Based on the environment data, the injury condition of the rider R is determined. Accordingly, the one or more control commands are generated in accordance with the severity condition of the injury to the rider R and transmitted to the UAV 106, for ensuring that the UAV 106 operates to provide the assistance required for the rider R.
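Bringing the steps of method 600 together, a condensed end-to-end sketch is given below. The sensors, uav and analyse_environment objects are hypothetical placeholders for the hardware interfaces described above, and the threshold values repeat the assumptions used in the earlier sketches.

```python
# Condensed sketch of method 600 (steps 602-610); all interfaces are placeholders.

def method_600(sensors, uav, analyse_environment,
               decel_threshold_mps2: float = 8.0, tilt_threshold_deg: float = 50.0):
    info = sensors.read()                                  # step 602: vehicle related information
    decel = info["deceleration_mps2"]                      # step 604: operating parameters
    tilt = info["tilt_deg"]
    crashed = decel > decel_threshold_mps2 or abs(tilt) > tilt_threshold_deg  # step 606
    if not crashed:
        return None
    control_commands = ["IMAGE_CAPTURE"]                   # step 608: generate control commands
    uav.execute(control_commands)                          # step 610: operate the UAV
    environment_data = uav.capture_images()
    return analyse_environment(environment_data)           # injury condition of the rider R
```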
[071] The claimed invention as disclosed above is not routine, conventional or well understood in the art, as the claimed aspects enable the following solutions to the existing problems in conventional technologies. Specifically, the system of the present invention is adapted to provide dynamic assistance to a rider R of the vehicle in the event of the crash. Particularly, the claimed aspect of generating and providing the image capture command of the one or more control commands to the UAV by the first control unit enables determination of the severity condition of the injury to the rider R. Thus, the system is capable of procuring the injury related information accurately and providing the information to the emergency service team for ensuring that a rescue team is suitably equipped for providing the necessary medical assistance to the rider R. Also, the UAV is capable of maintaining voice communication or providing voice assistance to the rider R and thus ensuring the consciousness of the rider R. Consequently, the probability of recovery of the rider R is enhanced. Additionally, the provision of the second control unit and the corresponding sensors in the UAV ensures that, in the event of failure of the first control unit and the vehicle data sensors, the UAV is capable of operating the system 100 for ensuring assistance to the rider R. Moreover, the system is also capable of automatically providing the first-aid package to the rider R, based on the severity of the injury of the rider R. Further, the system is capable of transmitting the environment data to the insurance companies directly and thus ensuring immediate processing of an insurance claim request.
[072] In light of the abovementioned advantages and the technical advancements provided by the disclosed system and method, the claimed steps as discussed above are not routine, conventional, or well understood in the art, as the claimed steps enable the following solutions to the existing problems in conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the system itself as the claimed steps provide a technical solution to a technical problem.
[073] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable storage medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
Reference numerals and Characters
100 - Crash warning triggering system
102 - Two-wheeled Vehicle
104 - Vehicle data sensors
104a - Anti-lock Braking sensor
104b - LiDAR sensor
104c - Image sensor
104d - RADAR sensor
106 - Unmanned aerial vehicle
108 - First control unit
110 - Telematics unit
112 - Image processing unit
114 - Voice communication unit
116 - Vision sensor
118 - Signal transmission unit
120 - Second control unit
122 - Input/Output module
124 - Processing module
126 - Analytic module
128 - Memory unit
130 - Power supply
R - Rider of the Vehicle
Claims: 1. A crash warning triggering system (100) for a two-wheeled vehicle (102), the system (100) comprising:
one or more vehicle data sensors (104) disposed in the two-wheeled vehicle (102), each of the one or more vehicle data sensors (104) being adapted to procure one or more vehicle related information of the two-wheeled vehicle (102);
an Unmanned Aerial Vehicle (UAV) (106) disposed on the two-wheeled vehicle (102), the UAV (106) comprising a vision sensor (116) for capturing environment data of a driving environment of a rider (R) of the two-wheeled vehicle (102); and
a first control unit (108) disposed in the two-wheeled vehicle (102) and communicably coupled to the one or more vehicle data sensors (104) and to the UAV (106), the first control unit (108) being configured to:
receive, the one or more vehicle related information of the two-wheeled vehicle (102) from each of the one or more vehicle data sensors (104);
determine, one or more vehicle operating parameters of the two-wheeled vehicle (102) based on the one or more vehicle related information;
determine, a crash condition of the two-wheeled vehicle (102) when the one or more vehicle operating parameters exceed a threshold value;
generate, one or more control commands to be transmitted to the UAV (106), upon determining the crash condition of the two-wheeled vehicle (102); and
operate, the UAV (106) based on the one or more control commands, the one or more control commands comprising an image capture command for enabling the UAV (106) to capture the environment data of the driving environment of the rider (R) for determining an injury condition of the rider (R).
2. The system (100) as claimed in claim 1 comprising an image processing unit (112) disposed in one of the UAV (106) and the two-wheeled vehicle (102), the image processing unit (112) being communicably coupled to the first control unit (108) and being adapted to receive the environment data from the UAV (106) for analyzing a bloodstain pattern,
wherein the first control unit (108) is adapted to perform a blood splatter analysis based on the bloodstain pattern for determining the injury condition of the rider (R).
3. The system (100) as claimed in claim 2, wherein the first control unit (108) is adapted to classify the injury condition of the rider (R) into one of a less severe condition, an intermediately severe condition, a moderately severe condition, and a highly severe condition based on the blood splatter analysis.
4. The system (100) as claimed in claim 1, wherein the first control unit (108) is adapted to operate a flight controller unit in the UAV (106), for enabling the UAV (106) to ascend to one of a predefined flight pattern and an adaptive flight pattern for capturing the environment data of the rider (R) upon providing the image capture command.
5. The system (100) as claimed in claim 3 comprising a telematics unit (110) disposed in one of the UAV (106) and the two-wheeled vehicle (102), the telematics unit (110) being communicably coupled to the first control unit (108) for providing a location data and an elevation data of a location of the crash from the telematics unit (110),
wherein, the first control unit (108) is adapted to generate a flight pattern for the UAV (106) based on the location data and the elevation data for procuring the environment data of the rider (R).
6. The system (100) as claimed in claim 1, wherein the one or more control commands comprise at least one of:
an audio communication command, for maintaining an audio communication with the rider (R) to ensure consciousness of the rider (R);
a kit deploy command, for providing a first-aid package to the rider (R);
an alert command, for alerting an emergency service team regarding the crash condition of the rider (R); and
a reimburse communication command, for providing the environment data to a financial institution.
7. The system (100) as claimed in claim 6, wherein the UAV (106) comprises a voice communication unit (114) having a speaker unit and a microphone unit, the voice communication unit (114) being adapted to maintain audio communication with the rider (R) for ensuring consciousness of the rider (R), upon receiving the audio communication command from the first control unit (108).
8. The system (100) as claimed in claim 6, wherein the first control unit (108) is configured to operate the UAV (106) to deploy the first-aid package through a payload unit of the UAV (106) to the rider (R), when the injury condition of the rider (R) is one of a less severe condition, an intermediately severe condition, and a moderately severe condition.
9. The system (100) as claimed in claim 6, wherein the first control unit (108) is adapted to operate a signal transmission unit (118) of the UAV (106) for enabling the UAV (106) to alert an emergency service team to the location of the rider (R) during the crash condition, when the injury condition of the rider (R) is a highly severe condition.
10. The system (100) as claimed in claim 9, wherein the first control unit (108) is adapted to operate the signal transmission unit (118) of the UAV (106) for transferring the environment data to a financial institution, when an insurance claim is made by the rider (R) to the financial institution.
11. The system (100) as claimed in claim 1, wherein the one or more vehicle data sensors (104) comprise:
an Anti-lock Braking (ABS) sensor (104a);
a Light Detection and Ranging (LiDAR) sensor (104b);
a Radio Detection and Ranging (RADAR) sensor (104d); and
an image sensor (104c).
12. The system (100) as claimed in claim 1, wherein the UAV (106) comprises one or more sensors (106a), the one or more sensors (106a) being adapted to determine the crash condition of the two-wheeled vehicle (102), if the one or more vehicle data sensors (104) are damaged during the crash condition of the two-wheeled vehicle (102).
13. The system (100) as claimed in claim 1 comprises a second control unit (120) disposed in the UAV (106), the second control unit (120) being adapted to generate the one or more control commands and operate the UAV (106) based on the one or more control commands for determining the injury condition of the rider (R) during the crash condition, when the first control unit (108) is determined to be in a failure condition.
14. A method of operating a crash warning triggering system (100) of a two-wheeled vehicle (102), the method comprising:
receiving, by a first control unit (108), one or more vehicle related information of the two-wheeled vehicle (102) from each of one or more vehicle data sensors (104), the one or more vehicle data sensors (104) being disposed in the two-wheeled vehicle (102);
determining, by the first control unit (108), one or more vehicle operating parameters of the two-wheeled vehicle (102) based on the one or more vehicle related information received from each of the one or more vehicle data sensors (104);
determining, by the first control unit (108), a crash condition of the two-wheeled vehicle (102) when the one or more vehicle operating parameters exceed a threshold value;
generating, by the first control unit (108), one or more control commands to be transmitted to an Unmanned Aerial Vehicle (UAV) (106), upon determining the crash condition of the two-wheeled vehicle (102), the UAV (106) being disposed in the two-wheeled vehicle (102); and
operating, by the first control unit (108), the UAV (106) based on the one or more control commands, the one or more control commands comprising an image capture command for enabling the UAV (106) to capture an environment data of a driving environment of the rider (R) for determining an injury condition of the rider (R).
15. The method as claimed in claim 14 comprising receiving, by an image processing unit (112) disposed in one of the UAV (106) and the two-wheeled vehicle (102), the environment data from the UAV (106) for analyzing a bloodstain pattern,
wherein the first control unit (108) is adapted to perform a blood splatter analysis based on the bloodstain pattern for determining the injury condition of the rider (R).
16. The method as claimed in claim 15 comprising classifying, by the first control unit (108), the injury condition of the rider (R) into one of a less severe condition, an intermediately severe condition, a moderately severe condition and a highly severe condition based on the blood splatter analysis.
17. The method as claimed in claim 14 comprising operating, by the first control unit (108), a flight controller unit in the UAV (106), for enabling the UAV (106) to ascend to one of a predefined flight pattern and an adaptive flight pattern for capturing the environment data of the rider (R) upon providing the image capture command.
18. The method as claimed in claim 17 comprising providing, by a telematics unit (110) disposed in one of the UAV (106) and the two-wheeled vehicle (102), to the first control unit (108), a location data and an elevation data of a location of the crash from the telematics unit (110),
wherein, the first control unit (108) is adapted to generate a flight pattern for the UAV (106) based on the location data and the elevation data for procuring the environment data of the rider (R).
19. The method as claimed in claim 14, wherein the one or more control commands comprise at least one of:
an audio communication command, for maintaining an audio communication with the rider (R) to ensure consciousness of the rider (R);
a kit deploy command, for providing a first-aid package to the rider (R);
an alert command, for alerting an emergency service team regarding the crash condition of the rider (R); and
a reimburse communication command, for providing the environment data to a financial institution.
20. The method as claimed in claim 19 comprising operating, by the first control unit (108), a voice communication unit (114) disposed in the UAV (106), the voice communication unit (114) being adapted to maintain audio communication with the rider for ensuring consciousness of the rider (R), upon receiving the audio communication command from the first control unit (108).
21. The method as claimed in claim 19 comprising operating, by the first control unit (108), a payload unit of the UAV (106), to deploy the first-aid package to the rider (R), when the injury condition of the rider is one of a less severe condition, an intermediately severe condition and a moderately severe condition.
22. The method as claimed in claim 19 comprising operating, by the first control unit (108), a signal transmission unit (118) of the UAV (106) for enabling the UAV (106) to alert an emergency service team to the location of the rider (R) during the crash condition, when the injury condition of the rider (R) is a highly severe condition.
23. The method as claimed in claim 19 comprising operating, by the first control unit (108), a signal transmission unit (118) of the UAV (106) for transferring the environment data to the financial institution, when an insurance claim is made by the rider to the financial institution.
24. The method as claimed in claim 19 comprising determining, by the first control unit (108), the crash condition of the two-wheeled vehicle (102) through one or more sensors (106a) disposed in the UAV (106), if the one or more vehicle data sensors (104) are damaged during the crash of the two-wheeled vehicle (102).
25. The method as claimed in claim 14 comprising generating, by a second control unit (120) disposed in the UAV (106), the one or more control commands, and operating the UAV (106) based on the one or more control commands for determining the injury condition of the rider (R) during the crash condition, when the first control unit (108) is determined to be in a failure condition.