
Method And System For Avoiding Collision Of Moving Vehicle

Abstract: Embodiments of the disclosure describe a collision avoidance method for a moving vehicle. The method includes detecting at least one obstacle in the path and corresponding surroundings of the path of the moving vehicle. The method includes predicting, using at least one of a predetermined Artificial Intelligence (AI) model and a machine learning-based ethological model, a behavior and a type of the detected at least one obstacle. The method includes determining a plurality of operational parameters associated with movement and characteristics of the moving vehicle based on the prediction. The method includes determining an optimal action based on the plurality of operational parameters, a response time required to stop the moving vehicle, and a braking distance of the moving vehicle from the at least one obstacle. The method includes controlling, based on the determined optimal action, an operational metric of the moving vehicle to avoid collision with the detected obstacle.


Patent Information

Application #
Filing Date
21 May 2025
Publication Number
29/2025
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application

Applicants

Tardid Technologies Pvt Ltd
Tardid House, No.137, Sy No. 68/5 and 72/2, Bengaluru IT Park, KIADB Industrial Area, Bandikodige Halli, Karnataka 562149, India

Inventors

1. DUTTA, Niladri
#303, Sai Samhitha Apartments, Nandanavanam Layout Vidyaranyapura, Banglore North, Vidyaranyapura, Bangalore - 560097, India
2. VERMA, Aastha
RMZ Galleria Residency, Plot No: D-1004, D Block, 10th Floor, Opposite to Yelahanka Police Station, Bangalore - 560064, Karnataka, India
3. SAHU, Pawan
H.No.207, Bishnu Complex, Bishnu Stores, Gar-Ali, Jorhat - 785001, Assam, India
4. SINGH, Dinesh Kumar
Agam parvati Niwas, New mainpura, pragati Nagar, Danapur cantt, Patna - 801503, India
5. DEEPTI, Mallu
#9, 3rd Main, 4th cross, opposite to xerox shop, next to soundarya Apartment, Ganga Bakery, GangaNagar, Bangalore – 560032, Karnataka, India

Specification

Description: FIELD OF THE INVENTION

[0001] The present invention generally relates to the field of collision avoidance systems, and more specifically relates to a method and system for avoiding collision of a moving vehicle.

BACKGROUND

[0002] The information disclosed in this background section is only for enhancement of understanding of the general background of the disclosure and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
[0003] A train collision occurs when a train unintentionally comes into contact with another object, which can include vehicles/moving vehicles, animals, boulders, debris, or even natural occurrences like landslides. These collisions can result from factors such as human error, equipment malfunction, or unforeseen environmental conditions. Such incidents can lead to severe accidents, causing injuries, fatalities, and considerable damage to the trains and surrounding infrastructure. To minimize the risk of these collisions, railways implement an Automated Pilot Assistance System (APAS). The APAS leverages advanced technologies, including automatic train control, signal systems, and communication networks, to monitor train speed and location, provide real-time updates to operators, and facilitate communication between trains and control centers. The primary objective of the APAS is to enhance safety and reduce the likelihood of collisions with various objects.
[0004] For instance, consider a scenario where a railway line crosses a busy road. If a car fails to yield at a crossing, it could lead to a dangerous collision with an oncoming train. However, with the APAS in place, the system can detect the presence of the car on the tracks and automatically alert the train operator. If necessary, the APAS can also apply the brakes on the train to prevent a collision, ensuring the safety of passengers and minimizing damage to both the train and the vehicle involved.
[0005] Several existing train collision avoidance methods/systems have incorporated advanced mechanisms to enhance safety and operational efficiency. For instance, certain existing systems have developed mechanisms such as an Obstacle Detection Assistance System (ODAS) and a Collision & Overspeed Monitoring & Prevention Assistance System (COMPAS). The ODAS is configured to utilize sophisticated sensors to identify potential obstacles on the tracks, providing real-time alerts to train operators, thereby reducing the risk of collisions. The COMPAS is configured to monitor train speed and operational parameters to prevent overspeeding and facilitate safe braking in critical situations.
[0006] Similarly, for instance, certain existing systems offer a Protran collision avoidance system, which integrates multiple features into a single platform. The Protran collision avoidance system is configured to maintain the operator’s focus while approaching other vehicles, workers, and designated worksite boundaries. The Protran collision avoidance system is configured to employ a combination of visual and auditory alerts to ensure that operators are aware of their surroundings, thereby enhancing situational awareness and reducing the likelihood of accidents.
[0007] Additionally, for instance, certain existing systems provide innovative solutions such as main-line and shunting yard systems, which leverage cutting-edge vision sensor technology. These main-line and shunting yard systems are tailored for the railway industry, offering advanced safety features that enable precise monitoring of the railway environment. By utilizing high-resolution cameras and intelligent image processing algorithms, rail vision technologies enhance obstacle detection and improve overall safety in both mainline and shunting operations. Through the integration of the above-mentioned existing train collision avoidance methods/systems, the railway industry is making significant strides in collision prevention, thereby ensuring safer transportation for passengers and freight alike. However, the existing train collision avoidance methods/systems still encounter several problems.
[0008] Thus, it is desired to address the above-mentioned disadvantages or other shortcomings or at least provide a useful alternative for avoiding collision of the moving vehicle (e.g., train).

SUMMARY

[0009] This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention nor is it intended for determining the scope of the invention.
[0010] According to one embodiment of the present disclosure, a collision avoidance method for a moving vehicle is disclosed herein. The method includes detecting at least one obstacle in the path and corresponding surroundings of the path of the moving vehicle. The method further includes predicting, using at least one of a predetermined Artificial Intelligence (AI) model and a machine learning-based ethological model, a behavior and a type of the detected at least one obstacle. The method further includes determining a plurality of operational parameters associated with movement and characteristics of the moving vehicle based on the predicted behavior and the predicted type of the at least one detected obstacle. The method further includes determining an optimal action based on the plurality of operational parameters, a response time required to stop the moving vehicle, and a braking distance of the moving vehicle from the at least one obstacle. The method further includes controlling, based on the determined optimal action, an operational metric of the moving vehicle to avoid collision with the detected obstacle.
[0011] According to one embodiment of the present disclosure, a collision avoidance system for a moving vehicle is disclosed herein. The system includes an Artificial Intelligence (AI) driven collision avoidance controller coupled with a memory, a processor, and a communicator. The AI-driven collision avoidance controller is configured to detect at least one obstacle in the path and corresponding surroundings of the path of the moving vehicle. The AI-driven collision avoidance controller is further configured to predict behavior and a type of the detected at least one obstacle. The AI-driven collision avoidance controller is further configured to determine a plurality of operational parameters associated with movement and characteristics of the moving vehicle based on the predicted behavior and the predicted type of the at least one detected obstacle. The AI-driven collision avoidance controller is further configured to determine an optimal action based on the plurality of operational parameters, a response time required to stop the moving vehicle, and a braking distance of the moving vehicle from the at least one obstacle. The AI-driven collision avoidance controller is further configured to control, based on the determined optimal action, an operational metric of the moving vehicle to avoid collision with the detected obstacle.
[0012] To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail in the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[0013] These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0014] FIGS. 1A, 1B, and 1C illustrate exemplary scenarios associated with a collision avoidance method and/or system for a moving vehicle, according to an embodiment as disclosed herein;
[0015] FIG. 2 illustrates a block diagram of an electronic device for the collision avoidance mechanism, according to an embodiment as disclosed herein;
[0016] FIG. 3 illustrates a block diagram of an AI-driven collision avoidance controller associated with the electronic device for the collision avoidance mechanism, according to an embodiment as disclosed herein;
[0017] FIG. 4 illustrates end-to-end services from AI solutions and application development to testing and comprehensive solutions for the collision avoidance mechanism, according to an embodiment as disclosed herein; and
[0018] FIG. 5 is a flow diagram illustrating a method for the collision avoidance, according to an embodiment as disclosed herein.
[0019] Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help to improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.

DETAILED DESCRIPTION OF FIGURES

[0020] For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.
[0021] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are explanatory of the invention and are not intended to be restrictive thereof.
[0022] Reference throughout this specification to “an aspect”, “another aspect” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrase “in an embodiment”, “in one embodiment”, “in another embodiment”, and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0023] The terms “comprise”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by “comprises... a” does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
[0024] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term “or” as used herein, refers to a non-exclusive or unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those skilled in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0025] As is traditional in the field, embodiments may be described and illustrated in terms of blocks that carry out a described function or functions. These blocks, which may be referred to herein as units or modules or the like, are physically implemented by analog or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits, or the like, and may optionally be driven by firmware and software. The circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like. The circuits constituting a block may be implemented by dedicated hardware, or by a processor (e.g., one or more programmed microprocessors and associated circuitry), or by a combination of dedicated hardware to perform some functions of the block and a processor to perform other functions of the block. Each block of the embodiments may be physically separated into two or more interacting and discrete blocks without departing from the scope of the invention. Likewise, the blocks of the embodiments may be physically combined into more complex blocks without departing from the scope of the invention.
[0026] The accompanying drawings are used to help easily understand various technical features and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed to extend to any alterations, equivalents, and substitutes in addition to those which are particularly set out in the accompanying drawings. Although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another.
[0027] Referring now to the drawings, and more particularly to FIGS. 1A to 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[0028] FIGS. 1A, 1B, and 1C illustrate exemplary scenarios (e.g., 101, 102, and 103) associated with a collision avoidance method and/or system (autonomous pilot assistance system) for a moving vehicle 100 (e.g., train, locomotive, etc.), according to an embodiment as disclosed herein.
[0029] In some example embodiments, the disclosed autonomous pilot assistance system signifies a significant innovation in railway transportation, effectively addressing critical industry challenges while improving safety, efficiency, and reliability. By utilizing state-of-the-art Artificial Intelligence (AI) and automation technologies, the disclosed autonomous pilot assistance system adeptly mitigates potential obstacles and accidents with exceptional accuracy. Its sophisticated obstacle detection mechanism identifies and responds to hazards including wildlife 101, large debris 102, and landslides 103 on the tracks through predictive modeling and real-time monitoring, thereby preventing collisions and safeguarding the well-being of passengers and cargo, as described in conjunction with FIG. 2, FIG. 3, and FIG. 4.
[0030] For instance, consider a scenario (refer to FIG. 1A) where the train 100 frequently passes through areas inhabited by the wildlife 101; here, the autonomous pilot assistance system plays a crucial role. As the train 100 approaches a section of track known for animal (101a and 101b) crossings, the autonomous pilot assistance system utilizes its advanced sensors and cameras to monitor the area ahead. When the autonomous pilot assistance system detects a herd of animals on the tracks, it immediately processes the information and assesses the distance and speed of the train 100. The autonomous pilot assistance system then activates emergency braking, which slows the train 100 down in a controlled manner. This action not only prevents a potential collision but also ensures the safety of the passengers onboard. As a result, the autonomous pilot assistance system demonstrates its effectiveness in protecting both human lives and wildlife (101a and 101b), as one of the advantages of the disclosed method and/or system.
[0031] For instance, consider a scenario (refer to FIG. 1B) in which a severe storm with strong winds and heavy rain causes debris 102 to accumulate on railway tracks. In this scenario, the autonomous pilot assistance system continuously scans the track environment using real-time data from various sensors. Upon detecting the debris 102, the autonomous pilot assistance system calculates the train’s speed and distance from the obstacle 102a. It then triggers an alert to a train operator, providing critical information about the situation. Simultaneously, the autonomous pilot assistance system initiates an emergency stop protocol, allowing the train 100 to halt safely before reaching the obstruction. As a result, the autonomous pilot assistance system minimizes the risk of derailment and ensures the safety of both passengers and cargo, as one of the advantages of the disclosed method and/or system.
[0032] For instance, consider a scenario (refer to FIG. 1C) in mountainous regions 103a, where heavy rainfall can lead to landslides 103 that pose significant risks to train operations. The autonomous pilot assistance system incorporates predictive modeling techniques that analyze weather patterns and geological data to assess the likelihood of landslides. When heavy rain is detected, the autonomous pilot assistance system continuously monitors the terrain for signs of instability. If the autonomous pilot assistance system detects the landslide 103, the autonomous pilot assistance system activates a warning protocol. As the train 100 approaches the affected area, it receives real-time alerts to halt its progress. This timely intervention prevents the train 100 from entering a hazardous zone, thus safeguarding the lives of passengers and ensuring the integrity of the cargo being transported, as one of the advantages of the disclosed method and/or system.
[0033] In some example embodiments, to address personnel shortages, the autonomous pilot assistance system functions as an intelligent assistant for train operators/loco-pilots, handling repetitive tasks such as speed monitoring, signal detection, and compliance with speed limits. This reduces the workload for human operators, allowing focus on critical decisions while the AI manages routine safety and operational duties. Such capabilities ensure continuous operations, even when skilled personnel are unavailable, as one of the advantages of the disclosed method and/or system.
[0034] In some example embodiments, the autonomous pilot assistance system achieves reliable operation in challenging conditions, such as fog, rain, or low light, through robust sensors and AI models. Enhanced safety is maintained around the clock and over long distances, minimizing risks associated with poor visibility or operator fatigue, as one of the advantages of the disclosed method and/or system.
[0035] In some example embodiments, the autonomous pilot assistance system provides a comprehensive solution for the rail industry by enhancing safety, optimizing costs, and improving operational efficiency. The ability to tackle complex challenges, such as adverse environmental conditions, sets a new standard for autonomous train systems, leading to a safer, smarter, and more sustainable future in rail transportation, as one of the advantages of the disclosed method and/or system.
[0036] FIG. 2 illustrates a block diagram of an electronic device 200 for the collision avoidance mechanism, according to an embodiment as disclosed herein. In one embodiment, the electronic device 200 may relate to the autonomous pilot assistance system.
[0037] In some example embodiments, the electronic device 200 comprises a system 201. The system 201 may include a memory 210, a processor 220, a communicator 230, and an AI-driven collision avoidance controller 240. In one or more embodiments, the system 201 may be implemented on one or multiple electronic devices (not shown in FIG. 2).
[0038] In some example embodiments, the memory 210 stores instructions to be executed by the processor 220 for the collision avoidance mechanism, as discussed throughout the disclosure. The memory 210 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In addition, the memory 210 may, in some examples, be considered a non-transitory storage medium. The term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted to mean that the memory 210 is non-movable. In some examples, the memory 210 can be configured to store larger amounts of information. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in Random Access Memory (RAM) or cache). The memory 210 can be an internal storage unit, or it can be an external storage unit of the electronic device 200, a cloud storage, or any other type of external storage.
[0039] In some example embodiments, the processor 220 communicates with the memory 210, the communicator 230, and the AI-driven collision avoidance controller 240. The processor 220 is configured to execute instructions stored in the memory 210 and to perform various processes for the collision avoidance mechanism, as discussed throughout the disclosure. The processor 220 may include one or a plurality of processors, and may be a general-purpose processor, such as a Central Processing Unit (CPU) or an Application Processor (AP); a graphics-only processing unit such as a Graphics Processing Unit (GPU) or a Visual Processing Unit (VPU); and/or an Artificial Intelligence (AI)-dedicated processor such as a Neural Processing Unit (NPU).
[0040] In some example embodiments, the communicator 230 is configured for communicating internally between internal hardware components and with external devices (e.g., server) via one or more networks (e.g., radio technology). The communicator 230 includes an electronic circuit specific to a standard that enables wired or wireless communication.
[0041] In some example embodiments, the system 201 may include a display module (not shown in FIG. 2). The display module can accept user inputs and is made of a Liquid Crystal Display (LCD), a Light Emitting Diode (LED), an Organic Light Emitting Diode (OLED), or another type of display. The user inputs may include but are not limited to, touch, swipe, drag, gesture, and so on.
[0042] In some example embodiments, the system 201 may include a camera module (not shown in FIG. 2). The camera module includes one or more image sensors (e.g., Charged Coupled Device (CCD), Complementary Metal-Oxide Semiconductor (CMOS)) to capture one or more images/image frames/video to be processed for the collision avoidance mechanism. In an alternative embodiment, the camera module may not be present, and the system 201 may process an image/video received from an external device or process a pre-stored image/video displayed at the display module.
[0043] In some example embodiments, the AI-driven collision avoidance controller 240 is implemented by processing circuitry such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits, or the like, and may optionally be driven by firmware. The circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like.
[0044] In some example embodiments, the AI-driven collision avoidance controller 240 is configured to detect at least one obstacle in the path and corresponding surroundings of the path of the moving vehicle (e.g., train 100). The at least one obstacle is one of a stable obstacle (e.g., boulder/debris) and a moving obstacle (e.g., cattle, elephant, herds, another train, vehicle, etc.). In one embodiment, the AI-driven collision avoidance controller 240 is configured to detect the at least one obstacle in the path and the corresponding surroundings of the path of the moving vehicle using one or more long-range cameras, one or more wide-angle cameras, one or more thermal imaging sensors, and one or more radars. In addition, the AI-driven collision avoidance controller 240 is configured to predict, using at least one of a predetermined AI model and a machine learning-based ethological model, a behavior and a type of the detected at least one obstacle. For example, the AI-driven collision avoidance controller 240 understands that a leader’s movements can influence the entire herd’s direction. Moreover, the AI-driven collision avoidance controller 240 is configured to determine a plurality of operational parameters associated with movement and characteristics of the moving vehicle based on the predicted behavior and the predicted type of the at least one detected obstacle. The AI-driven collision avoidance controller 240 is further configured to determine an optimal action based on the plurality of operational parameters, a response time required to stop the moving vehicle, and a braking distance of the moving vehicle from the at least one obstacle. The AI-driven collision avoidance controller 240 is further configured to control, based on the determined optimal action, an operational metric of the moving vehicle to avoid collision with the detected obstacle.
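The detect–predict–decide–control flow described above can be sketched as a simple pipeline. This is an illustrative sketch only: the function names (`detect_obstacles`, `predict_behavior`, `choose_action`), the dictionary fields, and the 1.5× safety margin are hypothetical stand-ins and are not part of the disclosed system, whose actual AI and ethological models are not specified at this level of detail.

```python
# Hypothetical sketch of the AI-driven collision avoidance pipeline
# of paragraph [0044]. All names and thresholds are assumptions.

def detect_obstacles(sensor_frames):
    """Fuse detections from camera/thermal/radar frames.
    Here each frame is assumed to already carry obstacle records."""
    return [obs for frame in sensor_frames for obs in frame]

def predict_behavior(obstacle):
    """Stand-in for the predetermined AI / ethological behavior model."""
    if obstacle["type"] in ("boulder", "debris"):
        return "stationary"
    return "moving_toward_track" if obstacle["heading_to_track"] else "moving_away"

def choose_action(distance_m, braking_distance_m, behavior):
    """Pick one of the three disclosed optimal actions, using an
    assumed 1.5x margin over the computed braking distance."""
    if behavior == "moving_away":
        return "proceed_with_caution"
    if distance_m <= braking_distance_m * 1.5:
        return "emergency_brake"
    return "reduce_speed"
```

As a usage example, a stationary boulder detected 100 m ahead of a train whose braking distance is 200 m would yield `choose_action(100, 200, "stationary") == "emergency_brake"`, while the same boulder at 1000 m would yield `"reduce_speed"`.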
[0045] In some example embodiments, the plurality of operational parameters utilized to determine the optimal action comprises a distance of the at least one detected obstacle from the moving vehicle, a size of the at least one detected obstacle, a location of the at least one detected obstacle with respect to a track of the moving vehicle, a direction of movement of the at least one detected obstacle with respect to the track of the moving vehicle, one or more dynamics of the at least one detected obstacle, and a behavior of the at least one detected obstacle based on historical data.
[0046] In some example embodiments, the optimal action may include one of reducing the speed of the moving vehicle, initiating emergency braking to stop the moving vehicle, or issuing a notification for continuing movement of the moving vehicle with caution.
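The operational parameters of paragraph [0045] and the three actions of paragraph [0046] can be grouped into simple data types, as in the minimal sketch below. The field names, units, and example values are illustrative assumptions; the disclosure does not prescribe any particular representation.

```python
from dataclasses import dataclass
from enum import Enum

class OptimalAction(Enum):
    # The three optimal actions enumerated in paragraph [0046].
    REDUCE_SPEED = "reduce_speed"
    EMERGENCY_BRAKE = "emergency_brake"
    PROCEED_WITH_CAUTION = "proceed_with_caution"

@dataclass
class ObstacleParameters:
    # Operational parameters of paragraph [0045]; names/units assumed.
    distance_m: float          # distance of the obstacle from the vehicle
    size_m: float              # approximate obstacle size
    on_track: bool             # location with respect to the track
    heading_deg: float         # direction of movement relative to the track
    speed_mps: float           # obstacle dynamics
    historical_behavior: str   # e.g. "herd_follows_leader"
```

A record for an elephant herd 250 m ahead might then read `ObstacleParameters(250.0, 3.0, True, 90.0, 1.5, "herd_follows_leader")`.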
[0047] For instance, consider an example scenario where a high-speed train 100 is equipped with the AI-driven collision avoidance controller 240 (advanced collision avoidance system) designed to enhance safety by detecting potential obstacles on the tracks. This AI-driven collision avoidance controller 240 utilizes a combination of one or more long-range and one or more wide-angle cameras, one or more thermal imaging sensors, and radar technology to monitor the area ahead of the train 100. These sensors work individually or in fusion to identify both stable obstacles, such as large rocks or debris, and moving obstacles, like cattle that may wander onto the tracks. The AI-driven collision avoidance controller 240 is not only capable of detecting these obstacles but also employs the predetermined AI model to predict their behavior. For instance, when the wildlife (101a and 101b) is detected near the tracks, the AI-driven collision avoidance controller 240 analyzes various factors to determine whether the wildlife (101a and 101b) is likely to move away or remain stationary. The AI-driven collision avoidance controller 240 calculates essential operational parameters, including the distance of the obstacle from the train 100, its size, and its location relative to the train’s path. Additionally, the AI-driven collision avoidance controller 240 assesses the direction of the obstacle’s movement (e.g., wildlife (101a and 101b)), its dynamics, and historical behavioral data to make informed predictions about its future actions.
[0048] Based on this comprehensive analysis, the AI-driven collision avoidance controller 240 may determine the optimal course of action. If the wildlife (101a and 101b) is detected moving towards the tracks, the AI-driven collision avoidance controller 240 may alert the train operator and, if necessary, initiate emergency braking to prevent a collision. Furthermore, the AI-driven collision avoidance controller 240 may adjust the train’s speed to minimize the risk of an accident. By effectively detecting and predicting the behavior of obstacles, the AI-driven collision avoidance controller 240 significantly enhances the safety of train operations, allowing for timely and informed decision-making in potentially hazardous situations.
[0049] In some example embodiments, the AI-driven collision avoidance controller 240 is configured to perform one or more operations to determine the optimal action, which are given below.
[0050] The AI-driven collision avoidance controller 240 may determine, using a physics-driven AI model, the braking distance of the moving vehicle with respect to the detected at least one obstacle. In some example embodiments, the AI-driven collision avoidance controller 240 may determine the response time required to stop the moving vehicle based on the braking distance and the plurality of operational parameters. In some example embodiments, the AI-driven collision avoidance controller 240 may determine the optimal action based on the plurality of operational parameters, the response time required, and the braking distance of the moving vehicle.
[0051] In some example embodiments, the physics-driven AI model is configured to perform one or more operations, which are given below.
[0052] The AI-driven collision avoidance controller 240 may execute a decision intelligence model to evaluate an appropriate timing for triggering a braking distance model. In some example embodiments, the AI-driven collision avoidance controller 240 may determine one or more situational parameters to differentiate between normal and emergency braking mechanisms. In some example embodiments, the AI-driven collision avoidance controller 240 may trigger the braking distance model in response to determining the one or more situational parameters.
[0053] In some example embodiments, when the optimal action is determined to be initiating the emergency braking, controlling the operational metric (e.g., train speed or emergency braking) comprises controlling the emergency braking of the moving vehicle to prevent vehicle instability (e.g., derailment, flipping over, or tipping).
[0054] In some example embodiments, when the optimal action is determined to be continuing the movement of the moving vehicle with caution, the AI-driven collision avoidance controller 240 is configured to allow the moving vehicle to proceed without controlling the operational metric.
[0055] In some example embodiments, the AI-driven collision avoidance controller 240 is configured to monitor a condition along the route of the moving vehicle (e.g., train 100), wherein the condition includes at least one of a signboard, a warning indicator, a signal, a speed limit board, a home signal, a turnout indicator, and an animal zone indicator. The AI-driven collision avoidance controller 240 is further configured to generate an alert to prompt a driver of the moving vehicle to perform a corrective action based on the monitored condition. The AI-driven collision avoidance controller 240 is further configured to perform the corrective action in response to determining a failure of the driver to perform the corrective action within a predefined period of time.
[0056] For instance, consider a scenario where the train 100 is traveling along its route and approaches a caution signal indicating a possible obstruction ahead. The AI-driven collision avoidance controller 240 is actively monitoring the situation. As the train 100 gets closer to the signal, the AI-driven collision avoidance controller 240 detects it and immediately alerts the driver, saying, “caution: reduce speed and prepare to stop”. If the driver does not respond to this warning within 30 seconds, the AI-driven collision avoidance controller 240 takes over and automatically applies the brakes, slowing the train 100 down to prevent a potential collision with whatever might be on the tracks.
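The alert-then-takeover behaviour in this example can be sketched as a simple supervision loop. This is a minimal illustration only: the polling loop, the callback names, and the default 30-second window are assumptions for the sketch, not details taken from the disclosure.

```python
import time

def supervise_alert(issue_alert, driver_acknowledged, apply_brakes,
                    timeout_s=30.0, poll_s=0.1, clock=time.monotonic):
    """Issue an alert; if the driver does not acknowledge within timeout_s,
    perform the corrective action (braking) automatically."""
    issue_alert("caution: reduce speed and prepare to stop")
    deadline = clock() + timeout_s
    while clock() < deadline:
        if driver_acknowledged():
            return "driver_responded"   # driver acted within the window
        time.sleep(poll_s)
    apply_brakes()                      # predefined period elapsed: take over
    return "automatic_braking"
```

In a real controller the acknowledgement would come from the cab interface and the braking command from the train control system; here both are injected as callables so the timeout logic can be tested in isolation.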
[0057] In some example embodiments, the AI-driven collision avoidance controller 240 is configured to determine a drag force associated with the moving vehicle. The AI-driven collision avoidance controller 240 is further configured to detect a change in the drag force beyond a predetermined threshold. The AI-driven collision avoidance controller 240 is further configured to detect, based on the change in the drag force, at least one of a potential derailment of the moving vehicle or a mechanical anomaly in the moving vehicle.
[0058] For instance, consider a scenario where the train 100 is moving through a section of track while its AI-driven collision avoidance controller 240 is constantly measuring how much force is needed to keep it moving. Suddenly, the AI-driven collision avoidance controller 240 detects a sharp increase in this drag force, which suggests that something might be wrong, such as a track issue or a mechanical failure. The AI-driven collision avoidance controller 240 immediately sends a warning to the driver, stating, “Unusual resistance detected. Please slow down”. At the same time, the AI-driven collision avoidance controller 240 starts running checks on the train’s systems to identify the problem, ensuring that any issues are addressed quickly to maintain safety.
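One way to sketch the drag-force threshold check is to compare the latest measurement against a short rolling baseline. The window size and the 20% band below are illustrative assumptions; the disclosure only specifies that a change beyond a predetermined threshold triggers the detection.

```python
def drag_change_alert(drag_history_n, threshold_ratio=0.2, window=5):
    """Flag a sudden change in drag force: compare the latest sample with
    the mean of the preceding `window` samples (forces in newtons)."""
    if len(drag_history_n) < window + 1:
        return False                      # not enough history yet
    baseline = sum(drag_history_n[-window - 1:-1]) / window
    latest = drag_history_n[-1]
    return abs(latest - baseline) / baseline > threshold_ratio
```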
[0059] In some example embodiments, the AI-driven collision avoidance controller 240 is configured to detect an approach of one or more high-risk zones (e.g., a landslide-prone area, accident hotspot, or animal crossing path) by utilizing historical records (e.g., landslides, accident-prone zones, and animal migration patterns along railway tracks) and real-time telemetry data. In response to detecting the one or more high-risk zones, the AI-driven collision avoidance controller 240 dynamically adjusts the plurality of operational parameters to generate an alert (e.g., “Reduce Speed” or “Increase Caution”) to notify a driver of the moving vehicle. In response to generating the alert, an autonomous control system interprets the generated alert and executes one or more pre-defined safety protocols (e.g., speed adjustments, emergency braking, or enhanced visual monitoring) corresponding to the optimal action.
[0060] For instance, consider a scenario where the train 100 approaches a region known for frequent animal crossings. The AI-driven collision avoidance controller 240 utilizes both historical data and real-time information to recognize this high-risk area and triggers an alert to the driver, saying, “Warning: animal crossing zone ahead. Please reduce speed”. In response, the system automatically adjusts the train’s speed to a safer level and enhances the visual monitoring of the surroundings. The AI-driven collision avoidance controller 240 also prepares the emergency braking system in case an animal appears on the tracks, ensuring the safety of both the train 100 and any wildlife in the area.
[0061] In some example embodiments, a function associated with the various components of the electronic device 200 may be performed through the non-volatile memory, the volatile memory, and the processor 220. One or a plurality of processors controls the processing of the input data in accordance with a predefined operating rule or AI model stored in the non-volatile memory and the volatile memory. The predefined operating rule or AI model is provided through training or learning. Here, being provided through learning means that, by applying a learning algorithm to a plurality of learning data, a predefined operating rule or AI model of the desired characteristic is made. The learning may be performed in a device itself in which AI according to an embodiment is performed, and/or may be implemented through a separate server/system. The learning algorithm is a method for training a predetermined target device (for example, a robot) using a plurality of learning data to cause, allow, or control the target device to decide or predict. Examples of learning algorithms include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.
[0062] The AI model may consist of a plurality of neural network layers. Each layer has a plurality of weight values and performs a layer operation through a calculation of a previous layer and an operation of a plurality of weights. Examples of neural networks include, but are not limited to, convolutional neural network (CNN), deep neural network (DNN), recurrent neural network (RNN), restricted Boltzmann Machine (RBM), deep belief network (DBN), bidirectional recurrent deep neural network (BRDNN), generative adversarial networks (GAN), and deep Q-networks.
[0063] Although FIG. 2 shows various hardware components of the electronic device 200, it is to be understood that other embodiments are not limited thereto. In other embodiments, the electronic device 200 may include fewer or more components. Further, the labels or names of the components are used only for illustrative purposes and do not limit the scope of the invention. One or more components can be combined to perform the same or substantially similar functions for the collision avoidance mechanism.
[0064] FIG. 3 illustrates a block diagram of the AI-driven collision avoidance controller 240 associated with the electronic device 200 for the collision avoidance mechanism, according to an embodiment as disclosed herein.
[0065] In some example embodiments, the AI-driven collision avoidance controller 240 may continuously monitor the tracks and surroundings for obstacles. The AI-driven collision avoidance controller 240 may use a mix of the one or more long-range cameras, the one or more wide-angle cameras, the one or more thermal imaging sensors, and the one or more radars to detect objects (e.g., 101a, 101b, 102a, 103a) near the tracks. The AI-driven collision avoidance controller 240 may analyze one or more key parameters, such as train speed, distance to obstacles, and total train load. Based on the one or more key parameters, the AI-driven collision avoidance controller 240 may calculate response times and required braking distances, and suggest the optimal actions. The AI-driven collision avoidance controller 240 may provide real-time alerts to a locomotive pilot, recommending actions such as slowing down or applying emergency brakes. By using the one or more long-range cameras, the one or more wide-angle cameras, and the one or more thermal imaging sensors along with the one or more radars, the AI-driven collision avoidance controller 240 may effectively detect obstacles in various weather and lighting conditions. As a result, the AI-driven collision avoidance controller 240 may offer precise, real-time recommendations, enhancing the locomotive pilot’s decision-making and overall train safety.
[0066] In some example embodiments, the AI-driven collision avoidance controller 240 may include a wild-life tracking module 241, a train-vehicle detection module 242, a landslide-boulder detection module 243, a signal-sign detection module 244, a neural network-based braking module 245, a derailment alert module 246, an Overhead Equipment (OHE) integration module 247, and a real-time train route monitoring module 248.
[0067] In some example embodiments, to address risks from wildlife (e.g., 101a, 101b) and livestock on railway tracks, the wild-life tracking module 241 is configured to employ advanced Large Vision Model (LVM) technology combined with an Ethology Behavioral Model (EBM). The LVM is trained on a diverse dataset of animals near railway corridors, ensuring accurate detection across different species and environments. The EBM enhances the AI’s ability to predict animal behavior by considering species-specific movement patterns and group dynamics. The EBM is particularly effective for large animals like elephants and cattle, which can pose significant safety risks. While the LVM excels in detecting and classifying objects, the EBM adds predictive capabilities, accounting for the complex behaviors of these animals. For example, it understands that a leader’s movements can influence the entire herd’s direction.
[0068] In some example embodiments, once an animal or herd is detected, the wild-life tracking module 241 may begin real-time tracking to refine its predictions. Cameras and sensors monitor their position, speed, and movement, differentiating between behaviors like grazing, resting, or approaching the tracks. For herds, the wild-life tracking module 241 may predict both individual and group dynamics. The EBM, combined with real-time data, assesses the likelihood of the herd crossing the tracks and when this might happen. When a potential collision risk is identified, the wild-life tracking module 241 may calculate the best response strategy. The wild-life tracking module 241 may consider the train’s speed, weight, brake capacity, and track conditions to determine the stopping distance. In low-risk situations, like animals grazing far away, the wild-life tracking module 241 may suggest early deceleration. In higher-risk scenarios, such as a herd moving toward the tracks, the wild-life tracking module 241 may alert the locomotive pilot and trigger emergency brakes if needed.
[0069] In some example embodiments, the wild-life tracking module 241 may also aim to minimize secondary hazards. For instance, if a herd is present, the wild-life tracking module 241 may manage actions to avoid scattering the group, which may lead to animals running onto adjacent tracks. The EBM ensures responses align with natural animal behaviors, reducing unpredictability and enhancing safety for both trains and wildlife. By integrating advanced LVM technology, real-time tracking, and EBMs, the wild-life tracking module 241 may offer a comprehensive solution to the challenges posed by the wildlife and livestock on railway tracks. As a result, the AI-driven collision avoidance controller 240 may enable proactive decision-making, ensuring operational safety, protecting wildlife, and minimizing disruptions to rail operations.
[0070] In some example embodiments, the train-vehicle detection module 242 may detect and locate locomotives or bogies ahead of the train 100, as well as stationary or slow-moving vehicles at railway crossings, issuing alerts as needed. Before calculating stopping distances, the train-vehicle detection module 242 may use AI-based object detection to identify obstacles on the track, for instance:
a. Sensor data integration: The train-vehicle detection module 242 may combine data from various sensors such as radar, Light Detection and Ranging (LiDAR), and cameras to create a detailed 3D map of the train’s surroundings. These sensors provide information on the distance, shape, and movement of detected objects (e.g., 101a, 101b, 102a, 103a).
b. AI object detection: The train-vehicle detection module 242 may employ deep learning to classify objects (e.g., animals, debris, other trains) and estimate their positions relative to the train 100. The train-vehicle detection module 242 may continuously update the location and movement of these objects (e.g., 101a, 101b, 102a, 103a) in real-time as the train 100 approaches.
c. Tracking and prediction: For moving objects (like animals or people), the train-vehicle detection module 242 may predict future positions based on their current speed and trajectory. For stationary objects, the train-vehicle detection module 242 may calculate the required stopping distance using a static classification.
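The constant-velocity prediction in operation (c) above could be sketched as follows. The along-track/lateral coordinate convention, the corridor half-width, and the function names are illustrative assumptions for this sketch, not specifics from the disclosure.

```python
def predict_positions(pos, vel, horizon_s, dt=0.5):
    """Linearly extrapolate a tracked object's 2-D position (x along-track,
    y lateral offset from the track centreline, metres) over a horizon."""
    steps = int(horizon_s / dt)
    return [(pos[0] + vel[0] * k * dt, pos[1] + vel[1] * k * dt)
            for k in range(1, steps + 1)]

def will_enter_corridor(pos, vel, horizon_s, corridor_halfwidth_m):
    """True if any predicted position falls within the track corridor,
    modeled as |lateral offset| <= corridor half-width."""
    return any(abs(y) <= corridor_halfwidth_m
               for _, y in predict_positions(pos, vel, horizon_s))
```

For a stationary object the same check degenerates to testing the current position, which corresponds to the static classification mentioned above.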
[0071] In some example embodiments, the landslide-boulder detection module 243 may utilize a Brainbox ML model and AI techniques that enhance the detection of hazards on the tracks. For instance, Very High Frequency (VHF) sensors emit radio waves to measure distance. If a boulder is on the tracks, the one or more radars detect reflected waves and trigger an alert. The landslide-boulder detection module 243 may identify inanimate hazards such as boulders, debris, and landslides. Unlike moving objects, these static obstacles require a straightforward response: the train must stop. The landslide-boulder detection module 243 may calculate the most efficient stopping method based on the distance to the obstacle (e.g., 102a, 103a) and the train’s speed.
[0072] When a hazard is detected, the landslide-boulder detection module 243 may calculate the stopping distance considering the train’s speed, weight, braking efficiency, and track conditions. If the train 100 is far enough from the obstacle (e.g., 102a, 103a), the landslide-boulder detection module 243 may use normal braking. If the obstacle (e.g., 102a, 103a) is too close, the landslide-boulder detection module 243 may initiate emergency braking to prevent the collision. To optimize the detection of the obstacle (e.g., 102a, 103a), the landslide-boulder detection module 243 may use advanced sensors and cameras to identify hazards early. For instance, technologies like LiDAR, thermal imaging, and radar operate effectively in various conditions, including low visibility from fog or rain. The landslide-boulder detection module 243 may continuously monitor the train’s speed and distance to potential hazards, ensuring real-time braking decisions. This approach enables the train 100 to respond quickly to immovable hazards, minimizing accident risks. By using normal or emergency braking as needed, the landslide-boulder detection module 243 may enhance passenger safety, protect infrastructure, and reduce the likelihood of derailments or serious incidents.
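The normal-versus-emergency decision described above might be sketched as a threshold comparison between the distance to the obstacle and the pre-computed normal stopping distance. The safety margin and return labels are assumptions made for this sketch.

```python
def select_braking(distance_to_obstacle_m, normal_stop_m, margin_m=50.0):
    """Use normal braking when the obstacle is beyond the normal stopping
    distance plus an assumed safety margin; otherwise escalate to
    emergency braking to prevent the collision."""
    if distance_to_obstacle_m >= normal_stop_m + margin_m:
        return "normal_braking"
    return "emergency_braking"
```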
[0073] In some example embodiments, the signal-sign detection module 244 may utilize a Computer Vision (CV) system, which integrates with existing railway infrastructure to enhance train schedule reliability and optimize energy use by adjusting train speeds and stopping distances. This proactive strategy reduces unnecessary idling and braking, leading to lower fuel consumption and emissions. The signal-sign detection module 244 may improve the safety, efficiency, and sustainability of train operations in railways. The signal-sign detection module 244 may support railway loco pilots by continuously monitoring signboards, warning signals, and speed limit indicators along the tracks. Human operators can sometimes miss or misinterpret these critical signals due to fatigue, distraction, or difficult conditions. The signal-sign detection module 244 may act as a reliable assistant, ensuring that important signals are always recognized.
[0074] In some example embodiments, the signal-sign detection module 244 may employ advanced vision systems and sensors to detect and identify signs and signals in real-time. For instance, when a speed limit sign is detected, the signal-sign detection module 244 may compare the limit with the train’s current speed. If the train 100 exceeds the limit, the signal-sign detection module 244 may provide an audio alert to the loco pilot for immediate action. If the pilot does not respond quickly enough, the signal-sign detection module 244 may automatically adjust the train’s speed to ensure compliance, enhancing safety.
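The compare-alert-enforce sequence above can be sketched as a single decision function. The return labels and the pilot-acknowledged flag are illustrative assumptions; in practice the alert and the automatic adjustment happen at different times, which a fuller implementation would model with a timer as in the caution-signal example.

```python
def speed_compliance(current_kmh, limit_kmh, pilot_acknowledged):
    """Compare the train's speed with a detected limit: comply, alert the
    pilot, or (if the pilot has not responded) enforce the limit."""
    if current_kmh <= limit_kmh:
        return ("compliant", current_kmh)
    if pilot_acknowledged:
        return ("audio_alert", current_kmh)   # pilot is slowing the train
    return ("auto_adjust", limit_kmh)         # controller enforces the limit
```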
[0075] In some example embodiments, for warning signs indicating sharp curves, approaching stations, or construction zones, the signal-sign detection module 244 may issue clear audio warnings, giving the pilot time to prepare for necessary actions. If the train 100 needs to slow down or stop such as at a red signal the signal-sign detection module 244 may autonomously apply the brakes if the pilot does not respond in time.
[0076] In some example embodiments, the signal-sign detection module 244 may also monitor dynamic signals like railway traffic lights and gate crossing indicators. Using machine learning and image recognition technologies, the signal-sign detection module 244 may adapt to different sign formats and operate effectively in low visibility conditions, such as rain, fog, or nighttime.
[0077] In some example embodiments, by providing audio alerts and automated speed control, the signal-sign detection module 244 may reduce the cognitive load on the loco pilot and minimize the risk of human error. This proactive approach ensures strict adherence to railway protocols, enhances safety for passengers and crew, and improves operational efficiency by preventing delays or accidents from missed signals (signs or warnings).
[0078] In some example embodiments, the neural network-based braking module 245 may determine the stopping distance when objects are detected on the track by utilizing real-time sensor data, object detection algorithms, and the physical principles of motion. The stopping distance is the total distance required for the train 100 to come to a complete stop, and includes the distance traveled during the driver’s reaction time and the distance covered during braking. Factors affecting the stopping of the train include braking force, train mass (weight), speed, brake system efficiency, train length, track conditions (friction and gradient), brake delay time, weather conditions, and load distribution.
[0079] In some example embodiments, the stopping distance calculation relies on the principles of kinematics and dynamics. When the neural network-based braking module 245 detects the at least one object on the track, it may calculate how far the train 100 needs to travel before the train 100 can safely come to a complete stop, given the train’s current speed and braking capability. For instance, the calculation of the stopping distance starts with calculating the deceleration 𝑎𝑗 at time instant 𝑡𝑗, as shown in the equation(s) below.
𝑎𝑗 = (( Σ𝐹𝐵,𝑖 +Σ𝐹𝑒𝑥𝑡) /𝑚𝑑𝑦𝑛) (1)
where 𝑎𝑗 represents a train deceleration in time 𝑡𝑗, 𝐹𝐵,𝑖 represents the braking force of each brake, 𝐹𝑒𝑥𝑡 represents an external force, and 𝑚𝑑𝑦𝑛 represents a vehicle dynamic mass. Then the speed of the train 100 is calculated at time 𝑡𝑗+1, as shown in the equation below.
𝑣𝑗+1 = 𝑣𝑗 − 𝑎𝑗*Δ𝑡 (2)
where 𝑣𝑗+1 represents a vehicle speed in time 𝑡𝑗+1, 𝑣𝑗 represents a train speed in time 𝑡𝑗, Δt represents a time step. The next step is to calculate the stopping distance at time 𝑡𝑗+1, the mathematical representation is shown in the below equation.
𝑠𝑗+1 = 𝑠𝑗 − 𝑣𝑗*Δ𝑡 − (1/2)*𝑎𝑗*Δ𝑡² (3)
where 𝑠𝑗+1 represents a vehicle braking distance in time 𝑡𝑗+1, 𝑠𝑗 represents a vehicle braking distance in time 𝑡𝑗. The last step is to calculate the deceleration in time 𝑡𝑗+1, which is calculated as follows,
𝑎𝑗+1 = ( (Σ𝐹𝐵,𝑖 +Σ𝐹𝑒𝑥𝑡)/ 𝑚𝑑𝑦𝑛)𝑗+1 (4)
where 𝑎𝑗+1 represents the train deceleration in time step 𝑡𝑗+1.
[0080] The above physics model, which is based on the kinematics and dynamics of the train 100, is implemented in real time to calculate the deceleration of the train and, finally, the stopping distance. Once the at least one object is detected and classified, the neural network-based braking module 245 may determine the required deceleration needed to stop the train 100 in time. The deceleration is not constant; it depends on several factors that the model considers.
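The iteration of equations (1) to (3) can be sketched as below. This sketch assumes a constant total braking force over the stop and accumulates the distance travelled (rather than counting a remaining distance down); function and parameter names are illustrative, and the factor-dependent variation of the deceleration discussed in the following paragraphs is not modeled.

```python
def stopping_distance(v0_ms, braking_forces_n, external_force_n,
                      dynamic_mass_kg, dt=0.1):
    """Iterate the per-step kinematics until the train stops and return
    the total braking distance in metres (constant-force assumption)."""
    # eq. (1)/(4): deceleration from total force and dynamic mass
    a = (sum(braking_forces_n) + external_force_n) / dynamic_mass_kg
    v, s = v0_ms, 0.0
    while v > 0.0:
        if v <= a * dt:
            s += v * v / (2.0 * a)           # close the final partial step analytically
            break
        s += v * dt - 0.5 * a * dt * dt      # distance covered this step, cf. eq. (3)
        v -= a * dt                          # eq. (2)
    return s
```

Under the constant-force assumption the result matches the closed form v0²/(2a); the iterative form is what allows the later paragraphs' varying brake efficiency, friction, and gradient to be folded in per time step.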
[0081] In some example embodiments, the neural network-based braking module 245 may account for the efficiency of the regenerative and friction brakes, which is essential for determining how quickly the train 100 can stop. This efficiency is affected by factors like the train’s weight, the state of the braking system, and the friction between the wheels and the track.
[0082] In some example embodiments, the neural network-based braking module 245 may determine one or more track conditions, such as wet or icy rails, which lower the friction between the wheels and the track. In such situations, the neural network-based braking module 245 may adjust the expected deceleration to reflect the reduced braking force.
[0083] In some example embodiments, on inclined tracks, gravity can either assist or impede braking, depending on the direction of the slope. The neural network-based braking module 245 may therefore modify the deceleration calculations based on the gradient, as it influences the train’s stopping ability.
[0084] In some example embodiments, the derailment alert module 246 may perform real-time monitoring of train dynamics and derailment detection based on a cutting-edge implementation using Inertial Measurement Units (IMUs). The IMUs measure key physical parameters like linear acceleration, angular velocity, and orientation, which relate directly to the train's movement and stability. Under normal conditions, these parameters follow predictable patterns based on physical laws, considering factors like train speed, track curvature, gradient, and load distribution. The derailment alert module 246 may analyze IMU data against standard models of train dynamics, using thresholds based on physical principles like centrifugal force, friction, and angular momentum. Any anomalies such as sudden lateral movements or unusual tilting can indicate risks of derailment, bogie instability, or mechanical issues.
[0085] In some example embodiments, the derailment alert module 246 may be trained on both simulated and real-world data to detect deviations from expected behavior. If a significant anomaly occurs, like excessive lateral motion or tilt, the derailment alert module 246 may issue a real-time alert to identify the affected train section. This allows the loco-pilot to take corrective actions, such as reducing speed or stopping the train. In critical situations, where immediate danger is detected (like an ongoing derailment), the system can automatically activate emergency braking to reduce impact.
[0086] In some example embodiments, by combining IMU monitoring with a physics-based AI model, the derailment alert module 246 may enhance safety by detecting derailments and mechanical issues as they happen. The derailment alert module 246 may provide loco-pilots with actionable insights for quick responses, merging theoretical motion dynamics with practical railway operations to ensure safer transport for passengers, cargo, and infrastructure.
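The threshold comparison at the heart of the IMU monitoring above could be sketched as follows. The numeric limits for lateral acceleration and roll rate are illustrative assumptions; the disclosure specifies only that thresholds are derived from physical principles and train dynamics models.

```python
def imu_anomalies(lateral_acc_ms2, roll_rate_rads,
                  lat_limit_ms2=2.0, roll_limit_rads=0.15):
    """Compare a pair of IMU readings against assumed safe-operation
    thresholds and return the anomaly labels used for alerting."""
    alerts = []
    if abs(lateral_acc_ms2) > lat_limit_ms2:
        alerts.append("excessive_lateral_motion")
    if abs(roll_rate_rads) > roll_limit_rads:
        alerts.append("unusual_tilt")
    return alerts
```

A production system would evaluate windows of samples against curvature- and speed-dependent thresholds rather than fixed constants, but the alert structure is the same.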
[0087] In some example embodiments, the OHE integration module 247 may utilize advanced object detection and Optical Character Recognition (OCR) technologies to improve safety and operational efficiency. The OHE integration module 247 may focus on identifying one or more Overhead Equipment (OHE) structures, extracting important information from them, and linking that data with a caution order document. This integration allows the loco pilot to receive real-time alerts about track restrictions, enhancing the safety and efficiency of train operations. In other words, the OHE integration module 247 may perform multiple operations, for instance, which are mentioned below.
a. OHE structure detection: The OHE integration module 247 may detect OHE structures, particularly the masts that serve as reference points along the railway. Each mast is labeled with a unique combination of a kilometer number and a mast number. The kilometer number indicates the distance along the track, while the mast number specifies the individual mast at that distance. For example, an OHE mast labeled 16/2 refers to a second mast in the 16th kilometer.
b. Bounding box generation: Once the object detection model identifies one or more OHE masts, the OHE integration module 247 may create bounding boxes around the one or more OHE masts for accurate localization.
c. Text extraction: After identifying the one or more OHE masts, the OHE integration module 247 may apply an OCR algorithm to extract the text displayed on the mast, such as the kilometer and mast numbers. This information helps pinpoint the train’s location on the track.
d. Integration with caution order: The extracted data is integrated with the caution order, which contains essential operational instructions for the loco pilot.
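The label parsing in operations (c) and (d) above can be sketched as follows, using the 16/2 example from operation (a). The caution-order mapping shown here is a hypothetical illustration of how the extracted kilometre could key into operational instructions; the real caution order document's format is not specified in this excerpt.

```python
def parse_mast_label(label):
    """Parse an OCR string such as '16/2' into (kilometre, mast) numbers,
    e.g. the second mast in the 16th kilometre."""
    km_s, mast_s = label.strip().split("/")
    return int(km_s), int(mast_s)

def caution_for_mast(label, caution_order):
    """Look up the caution-order entry (assumed here to be a simple
    kilometre -> instruction mapping) for a detected mast label."""
    km, _mast = parse_mast_label(label)
    return caution_order.get(km, "no restriction")
```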
[0088] The integration of the one or more OHE masts detection with the caution order has several benefits, for instance, which are mentioned below.
a. Speed compliance: Ensures that trains adhere to speed limits in restricted areas.
b. Safety alerts: Provides warnings for sharp curves to maintain train stability.
c. Wildlife warnings: Alerts the crew about wildlife zones, such as areas where elephants may cross, to prevent accidents.
d. Real-time updates: Incorporates temporary restrictions due to maintenance or construction, ensuring that the loco pilot receives timely information.
[0089] In some example embodiments, the real-time train route monitoring module 248 may provide real-time monitoring of train routes using Global Positioning System (GPS) technology on the train 100 and others on the same route. The real-time train route monitoring module 248 may continuously track nearby train locations, giving the loco pilot proactive alerts to improve safety and efficiency. This ensures that the loco pilot is always informed about the positions and movements of other trains, helping to prevent accidents and facilitate smoother operations.
[0090] In some example embodiments, the real-time train route monitoring module 248 may activate a visual alert on a User Interface (UI) (display module) when any other train is detected within a certain distance, whether ahead or approaching from behind. One or more nearby trains are highlighted in red on a route map, prompting the loco pilot to be cautious. This clear visual cue allows for quick assessment and necessary actions to maintain safety, enhancing situational awareness. The real-time train route monitoring module 248 may also detect stationary trains ahead on the same track. If any other train is identified as stationary, the real-time train route monitoring module 248 may send an emergency alert to the loco pilot and activate a speed control module (e.g., train-vehicle detection module 242). This automatically adjusts the train’s speed to maintain a safe distance, which is crucial for preventing accidents from sudden stops or delays. As a result, the real-time train route monitoring module 248 may provide real-time monitoring, proximity alerts, and intelligent speed control, significantly boosting the safety and reliability of an autonomous train system, providing loco pilots with essential tools to navigate shared tracks more effectively, and reducing risks, as one of the advantages of the disclosed method and/or system.
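A minimal sketch of the GPS proximity check: compute the great-circle distance between reported positions and flag trains inside an alert radius. The 5 km radius and the dictionary of other-train positions are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r_earth = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2.0 * r_earth * math.asin(math.sqrt(a))

def proximity_alerts(own_pos, other_trains, radius_m=5000.0):
    """Return ids of trains within radius_m of the own position; these
    would be the ones highlighted in red on the route map."""
    return [tid for tid, (lat, lon) in other_trains.items()
            if haversine_m(own_pos[0], own_pos[1], lat, lon) <= radius_m]
```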
[0091] FIG. 4 illustrates end-to-end services from AI solutions and application development to testing and comprehensive solutions for the collision avoidance mechanism, according to an embodiment as disclosed herein. FIG. 4 illustrates a four-stage process for managing the data lifecycle within a Data Annotation and Data Sharing (DADS) system.
[0092] In some example embodiments, stage 1 represents test data protection, which is an initial stage that focuses on the raw data generation and capture at the source location (e.g., the train 100). Various sensors, including cameras, GPS, and other devices, are used to collect large volumes of raw data, often reaching terabytes (TBs) per day, by utilizing various modules associated with the AI-driven collision avoidance controller 240. This raw data is the foundation for the subsequent data processing and sharing activities.
[0093] In some example embodiments, stage 2 represents data processing. In this stage, the raw data is loaded, copied, and cleansed at a remote location. The workflow establishes a restricted “red zone” and a secure “green zone” to handle the data. Dedicated stations are used for data copying and cleaning, ensuring the data is prepared for the next stage.
[0094] In some example embodiments, stage 3 represents clean data sharing, the processed data is then securely transferred to a centralized data center facility. This facility has a secure zone for data sharing access, with robust storage infrastructure and tight security controls. This ensures the confidentiality, integrity, and availability of the data as it is shared across various stages.
[0095] In some example embodiments, stage 4, the final stage, represents user access, which grants authorized users access to the data for various DADS activities, such as data annotation and labeling, data processing using AI/ML techniques, testing, and deep learning. Additionally, remote locations can access the data for DADS engineering, modeling, testing, and simulation tasks. The interconnected nature of these stages creates a comprehensive data protection and processing workflow for the collision avoidance mechanism, enabling efficient and secure data management within the DADS system.
[0096] FIG. 5 is a flow diagram illustrating a method 500 for the collision avoidance, according to an embodiment as disclosed herein. The method 500 may execute multiple operations for the collision avoidance, which are given below.
[0097] At operation 501, the method 500 includes detecting the at least one obstacle (e.g., 101a, 101b, 102a, 103a) in the path and corresponding surrounding of the path of the moving vehicle (e.g., the train 100). At operation 502, the method 500 includes predicting, using at least one of predetermined AI model and machine learning based ethological model, behavior and the type of the detected at least one obstacle. At operation 503, the method 500 includes determining the plurality of operational parameters associated with movement and characteristics of the moving vehicle based on the predicted behavior and the predicted type of the at least one detected obstacle. At operation 504, the method 500 includes determining the optimal action based on the plurality of operational parameters, the response time required to stop the moving vehicle, and the braking distance of the moving vehicle from the at least one obstacle. At operation 505, the method 500 includes controlling, based on the determined optimal action, the operational metric of the moving vehicle to avoid collision with the detected obstacle. Further, a detailed description related to the various operations of FIG. 5 is covered in the description related to FIG. 1, FIG. 2, and FIG. 3, and is omitted herein for the sake of brevity.
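Operations 504 and 505 can be illustrated with a minimal sketch. The specification does not disclose concrete equations, so the kinematics below (reaction distance plus v²/(2a)) and all parameter names, default values, and thresholds are assumptions, not the patented method; the three action labels correspond to those named in the disclosure.

```python
def braking_distance(speed_mps: float, deceleration_mps2: float,
                     reaction_time_s: float) -> float:
    """Distance covered during the response time plus braking to a full stop."""
    reaction_distance = speed_mps * reaction_time_s
    stopping_distance = speed_mps ** 2 / (2.0 * deceleration_mps2)
    return reaction_distance + stopping_distance

def optimal_action(obstacle_distance_m: float, speed_mps: float,
                   deceleration_mps2: float = 1.0,
                   reaction_time_s: float = 2.0,
                   caution_margin: float = 2.0) -> str:
    """Map obstacle distance to one of the three actions in the disclosure."""
    needed = braking_distance(speed_mps, deceleration_mps2, reaction_time_s)
    if obstacle_distance_m <= needed:
        return "emergency_brake"        # cannot stop in time otherwise
    if obstacle_distance_m <= caution_margin * needed:
        return "reduce_speed"           # within a safety margin of the limit
    return "proceed_with_caution"       # ample distance; notify only
```

For example, at 20 m/s with 1.0 m/s² deceleration and a 2 s response time, the required distance is 240 m, so an obstacle at 200 m triggers emergency braking while one at 600 m only prompts a caution notification.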
[0098] The various actions, acts, blocks, steps, or the like in the flow diagrams may be performed in the order presented, in a different order, or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, added, modified, skipped, or the like without departing from the scope of the invention.
[0099] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and are not intended to be limiting.
[0100] While specific language has been used to describe the present subject matter, no limitations arising on account thereof are intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method to implement the inventive concept as taught herein. The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment.
[0101] The embodiments disclosed herein can be implemented using at least one hardware device performing network management functions to control the elements.
[0102] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept; therefore, such adaptations and modifications should be, and are intended to be, comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the scope of the embodiments as described herein.

Claims:

1. A collision avoidance method (500) for a moving vehicle, the method (500) comprising:
detecting (501) at least one obstacle in the path and corresponding surrounding of the path of the moving vehicle;
predicting (502), using at least one of predetermined artificial intelligence (AI) model and machine learning based ethological model, behavior and a type of the detected at least one obstacle;
determining (503) a plurality of operational parameters associated with movement and characteristics of the moving vehicle based on the predicted behavior and the predicted type of the at least one detected obstacle;
determining (504) an optimal action based on the plurality of operational parameters, a response time required to stop the moving vehicle, and a braking distance of the moving vehicle from the at least one obstacle; and
controlling (505), based on the determined optimal action, an operational metric of the moving vehicle to avoid collision with the detected obstacle.

2. The method (500) as claimed in claim 1, wherein the plurality of operational parameters utilized to perform the optimal action comprises a distance of the at least one detected obstacle from the moving vehicle, a size of the at least one detected obstacle, a location of the at least one detected obstacle with respect to a track of the moving vehicle, a direction of movement of the at least one detected obstacle with respect to the track of the moving vehicle, one or more dynamics of the at least one detected obstacle, and a behaviour of the at least one detected obstacle based on historical data.

3. The method (500) as claimed in claim 1, wherein the at least one obstacle is one of a stable obstacle and a moving obstacle.

4. The method (500) as claimed in claim 1, wherein detecting the at least one obstacle comprises:
detecting the at least one obstacle in the path and the corresponding surrounding of the path of the moving vehicle using one or more long-range cameras, one or more wide-angle cameras, one or more thermal imaging sensors, and one or more radars.

5. The method (500) as claimed in claim 1, wherein determining the optimal action comprises:
determining, using a physics-driven AI model, the braking distance of the moving vehicle with respect to the detected at least one obstacle;
determining the response time required to stop the moving vehicle based on the braking distance and the plurality of operational parameters; and
determining the optimal action based on the plurality of operational parameters, the response time required, and the braking distance of the moving vehicle.

6. The method (500) as claimed in claim 5, wherein the physics-driven AI model performs one or more operations comprising:
executing a decision intelligence model to evaluate an appropriate timing for triggering a braking distance model;
determining one or more situational parameters to differentiate between normal and emergency braking mechanisms; and
triggering the braking distance model in response to determining the one or more situational parameters.

7. The method (500) as claimed in claim 1, wherein the optimal action comprises one of reducing speed of the moving vehicle, initiating emergency braking to stop the moving vehicle, or issuing a notification for continuing movement of the moving vehicle with caution.

8. The method (500) as claimed in claim 5, wherein when the optimal action is determined to be reducing the speed of the moving vehicle, controlling the operational metric comprises controlling the speed of the moving vehicle;
when the optimal action is determined to be initiating the emergency braking, controlling the operational metric comprises controlling the emergency braking of the moving vehicle to prevent vehicle instability; or
when the optimal action is to proceed with continuing movement of the moving vehicle with caution, the movement of the moving vehicle is continued without controlling the operational metric.

9. The method (500) as claimed in claim 1, further comprising:
monitoring a condition along the route of the moving vehicle, wherein the condition includes at least one of a signboard, a warning indicator, a signal, a speed limit board, a home signal, a turnout indicator, and an animal zone indicator;
generating an alert to prompt a driver of the moving vehicle to perform a corrective action based on the monitored condition; and
performing the corrective action in response to determining a failure of the driver to perform the corrective action within a predefined period of time.

10. The method (500) as claimed in claim 1, wherein when the moving vehicle is a moving train, the method (500) further comprises:
determining a drag force associated with the moving vehicle;
detecting a change in the drag force beyond a predetermined threshold; and
detecting, based on the change in the drag force, at least one of potential derailment of the moving train or mechanical anomalies in the moving train.

11. The method (500) as claimed in claim 1, further comprising:
detecting an approach of one or more high-risk zones by utilizing historical records and real-time telemetry data;
in response to detecting the one or more high-risk zones, dynamically adjusting the plurality of operational parameters to generate an alert to notify a driver of the moving vehicle; and
in response to generating the alert, interpreting, by an autonomous control system, the generated alert and executing one or more pre-defined safety protocols corresponding to the optimal action.

12. A collision avoidance system (201) for a moving vehicle, wherein the system (201) comprising:
a memory (210);
a processor (220);
a communicator (230); and
an Artificial Intelligence (AI)-driven collision avoidance controller (240), operably connected to the memory (210), the processor (220), and the communicator (230), configured to:
detect at least one obstacle in the path and corresponding surrounding of the path of the moving vehicle;
predict, using at least one of predetermined artificial intelligence (AI) model and machine learning based ethological model, behavior and a type of the detected at least one obstacle;
determine a plurality of operational parameters associated with movement and characteristics of the moving vehicle based on the predicted behavior and the predicted type of the at least one detected obstacle;
determine an optimal action based on the plurality of operational parameters, a response time required to stop the moving vehicle, and a braking distance of the moving vehicle from the at least one obstacle; and
control, based on the determined optimal action, an operational metric of the moving vehicle to avoid collision with the detected obstacle.

13. The system (201) as claimed in claim 12, wherein the plurality of operational parameters utilized to perform the optimal action comprises a distance of the at least one detected obstacle from the moving vehicle, a size of the at least one detected obstacle, a location of the at least one detected obstacle with respect to a track of the moving vehicle, a direction of movement of the at least one detected obstacle with respect to the track of the moving vehicle, one or more dynamics of the at least one detected obstacle, and a behaviour of the at least one detected obstacle based on historical data.

14. The system (201) as claimed in claim 12, wherein the at least one obstacle is one of a stable obstacle and a moving obstacle.

15. The system (201) as claimed in claim 12, wherein to detect the at least one obstacle, the AI-driven collision avoidance controller (240) is configured to:
detect the at least one obstacle in the path and the corresponding surrounding of the path of the moving vehicle using one or more long-range cameras, one or more wide-angle cameras, one or more thermal imaging sensors, and one or more radars.

16. The system (201) as claimed in claim 12, wherein to determine the optimal action, the AI-driven collision avoidance controller (240) is configured to:
determine, using a physics-driven AI model, the braking distance of the moving vehicle with respect to the detected at least one obstacle;
determine the response time required to stop the moving vehicle based on the braking distance and the plurality of operational parameters; and
determine the optimal action based on the plurality of operational parameters, the response time required, and the braking distance of the moving vehicle.

17. The system (201) as claimed in claim 16, wherein to perform the one or more operations of the physics-driven AI model, the AI-driven collision avoidance controller (240) is configured to:
execute a decision intelligence model to evaluate an appropriate timing for triggering a braking distance model;
determine one or more situational parameters to differentiate between normal and emergency braking mechanisms; and
trigger the braking distance model in response to determining the one or more situational parameters.

18. The system (201) as claimed in claim 12, wherein the optimal action comprises one of reducing speed of the moving vehicle, initiating emergency braking to stop the moving vehicle, or issuing a notification for continuing movement of the moving vehicle with caution.

19. The system (201) as claimed in claim 16, wherein when the optimal action is determined to be reducing the speed of the moving vehicle, controlling the operational metric comprises controlling the speed of the moving vehicle;
when the optimal action is determined to be initiating the emergency braking, controlling the operational metric comprises controlling the emergency braking of the moving vehicle to prevent vehicle instability; or
when the optimal action is to proceed with continuing movement of the moving vehicle with caution, the movement of the moving vehicle is continued without controlling the operational metric.

20. The system (201) as claimed in claim 12, wherein the AI-driven collision avoidance controller (240) is further configured to:
monitor a condition along the route of the moving vehicle, wherein the condition includes at least one of a signboard, a warning indicator, a signal, a speed limit board, a home signal, a turnout indicator, and an animal zone indicator;
generate an alert to prompt a driver of the moving vehicle to perform a corrective action based on the monitored condition; and
perform the corrective action in response to determining a failure of the driver to perform the corrective action within a predefined period of time.

21. The system (201) as claimed in claim 12, wherein when the moving vehicle is a moving train, the AI-driven collision avoidance controller (240) is further configured to:
determine a drag force associated with the moving vehicle;
detect a change in the drag force beyond a predetermined threshold; and
detect, based on the change in the drag force, at least one of potential derailment of the moving train or mechanical anomalies in the moving train.

22. The system (201) as claimed in claim 12, wherein the AI-driven collision avoidance controller (240) is configured to:
detect an approach of one or more high-risk zones by utilizing historical records and real-time telemetry data;
in response to detecting the one or more high-risk zones, dynamically adjust the plurality of operational parameters to generate an alert to notify a driver of the moving vehicle; and
in response to generating the alert, interpret, by an autonomous control system, the generated alert and execute one or more pre-defined safety protocols corresponding to the optimal action.

Documents

Application Documents

# Name Date
1 202541049077-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [21-05-2025(online)].pdf 2025-05-21
2 202541049077-STATEMENT OF UNDERTAKING (FORM 3) [21-05-2025(online)].pdf 2025-05-21
3 202541049077-REQUEST FOR EXAMINATION (FORM-18) [21-05-2025(online)].pdf 2025-05-21
4 202541049077-OTHERS [21-05-2025(online)].pdf 2025-05-21
5 202541049077-FORM FOR STARTUP [21-05-2025(online)].pdf 2025-05-21
6 202541049077-FORM FOR SMALL ENTITY(FORM-28) [21-05-2025(online)].pdf 2025-05-21
7 202541049077-FORM 18 [21-05-2025(online)].pdf 2025-05-21
8 202541049077-FORM 1 [21-05-2025(online)].pdf 2025-05-21
9 202541049077-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [21-05-2025(online)].pdf 2025-05-21
10 202541049077-EVIDENCE FOR REGISTRATION UNDER SSI [21-05-2025(online)].pdf 2025-05-21
11 202541049077-DRAWINGS [21-05-2025(online)].pdf 2025-05-21
12 202541049077-DECLARATION OF INVENTORSHIP (FORM 5) [21-05-2025(online)].pdf 2025-05-21
13 202541049077-COMPLETE SPECIFICATION [21-05-2025(online)].pdf 2025-05-21
14 202541049077-Proof of Right [05-06-2025(online)].pdf 2025-06-05
15 202541049077-FORM-26 [09-06-2025(online)].pdf 2025-06-09
16 202541049077-STARTUP [10-07-2025(online)].pdf 2025-07-10
17 202541049077-FORM28 [10-07-2025(online)].pdf 2025-07-10
18 202541049077-FORM-9 [10-07-2025(online)].pdf 2025-07-10
19 202541049077-FORM FOR STARTUP [10-07-2025(online)].pdf 2025-07-10
20 202541049077-FORM 18A [10-07-2025(online)].pdf 2025-07-10
21 202541049077-FER.pdf 2025-08-14

Search Strategy

1 202541049077_SearchStrategyNew_E_Document7E_14-08-2025.pdf