
System And Method For Detection, Recognition And/Or Identification Of The Objects On A Railway Track

Abstract: The present invention comprises a system (100) for detection, recognition, and/or identification of one or more objects. The system (100) comprises one or more first sensors (102a), one or more second sensors (102d) and a control unit (104). The one or more first sensors (102a) are configured to capture an image and/or a video of an environment. The control unit (104) is configured to detect, recognize and/or identify one or more objects in the environment from the captured image and/or video. The one or more second sensors (102d) are configured to detect a distance of the one or more objects from the locomotive/train. The control unit (104) is communicatively coupled to the one or more second sensors (102d) to perform one or more pre-defined actions/operations upon determination of distance of the locomotive/train from the one or more objects being less than one or more pre-defined distances. Reference Figure 1


Patent Information

Application #:
Filing Date: 12 April 2024
Publication Number: 44/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

LEAPEDGE AEROSPACE AND DEFENSE TECHNOLOGIES PRIVATE LIMITED
Lower Ground Floor, A-294, Road No. 6, Nh-8, Mahipalpur, Southwest Delhi, Delhi- 110037, India.

Inventors

1. UPADHYAY MUKESH
N1-1601, M3M, THE MARINA, SECTOR 68, AKLIMPUR-TEEKLI ROAD, GURUGRAM-122101, HARYANA, INDIA
2. SHARMA VIPAN
254, THE NAVAL TECHNICAL OFFICERS CGHS, PLOT NO. 3A, SECTOR 22, DWARKA, NEW DELHI – 110075. INDIA.
3. SAXENA AMAN
12/117, SEC-12, JAIPURIA INSTT ROAD, AVAS VIKAS COLONY, VASUNDHARA, GHAZIABAD (U.P. - 201012), INDIA.
4. VISSAPRAGADA NAGA SURYA NARAYANA
FLAT NO: PC-802, ELDECO ACCOLADE, SECTOR-2, SOHNA, GURUGRAM - 122103, HARYANA, INDIA

Specification

FORM 2

THE PATENTS ACT, 1970
(39 OF 1970)
&
PATENT RULES, 2003

COMPLETE SPECIFICATION
(Section 10 and Rule 13)

SYSTEM AND METHOD FOR DETECTION, RECOGNITION AND/OR IDENTIFICATION OF THE OBJECTS ON A RAILWAY TRACK

LEAPEDGE AEROSPACE AND DEFENSE TECHNOLOGIES PRIVATE LIMITED
Lower Ground Floor, A-294, Road No. 6, NH-8, Mahipalpur, Southwest Delhi, Delhi-110037, India.

The following specification particularly describes the invention and the manner in which it is to be performed.
SYSTEM AND METHOD FOR DETECTION, RECOGNITION AND/OR IDENTIFICATION OF THE OBJECTS ON A RAILWAY TRACK
FIELD OF THE INVENTION
[001] The present invention relates to a system and method for detection, recognition and identification of objects on a railway track.
BACKGROUND OF THE INVENTION
[002] As per publicly available data, in November 2013, a speeding train passed through the Chapramari Wildlife Sanctuary, West Bengal, and crashed into a herd of 40-50 elephants. The accident led to the death of 10 elephants, fatally injuring several others. The remaining grieving elephants were forced to leave the site by forest guards and other relief workers.
[003] Between 2016 and 2018, 32,000 animals were killed on railway tracks in India. What is more concerning is that this number may be underestimated considering that incidents report the death of animals like lions, leopards, tigers, elephants and cattle, leaving out smaller-sized species that are equally in danger. As railway lines cut through forests and conservation sites, animals lose access to their food and water sources and are forced to encounter speeding trains every time they move about in their natural habitat.
[004] The danger is not limited to animals. Despite several steps taken to prevent loss of human lives on railway tracks, a total of 27,987 railway accidents were reported in 2019, with 3,569 persons injured and 24,619 deaths, according to National Crime Records Bureau (NCRB) data. The NCRB data indicates that the majority (76.3%) of railway accident cases (21,361) were reported under the ‘Fall from trains/collision with people on track’ category.
[005] Although several guidelines and security checks are followed by railway personnel/loco pilots to avoid such tragedies, recognizing moving or stationary objects such as, not being limited to, humans, animals and trees on a railway track from a moving train/locomotive poses a significant challenge owing to several factors such as, not being limited to, day-night transition and varying atmospheric conditions including fog and rain, over which humans have no control. Such factors hinder the ability of railway personnel/loco pilots to visually detect, recognize and/or identify animals/humans/large objects on the railway track. Even if the railway personnel/loco pilot is able to visually detect and identify objects on the railway track, it is not always possible to take corrective action in time to avoid accidents/collisions. Moreover, loco pilots are tasked with multiple responsibilities while operating the locomotive, making it almost impractical to maintain constant focus on the railway tracks at all times.
[006] In view of the foregoing, there is a felt need to overcome at least the above-mentioned disadvantages of the prior art.
SUMMARY OF THE INVENTION
[007] In one aspect of the invention, a system for detection, recognition, and/or identification of one or more objects is disclosed. The system comprises one or more first sensors, one or more second sensors and a control unit. The one or more first sensors, the one or more second sensors and the control unit are disposed on a locomotive/train. The one or more first sensors and the one or more second sensors are communicatively coupled to the control unit. The one or more first sensors are configured to capture at least one of an image and a video of an environment including a railway track and a pre-defined area surrounding the railway track. The one or more first sensors are configured to capture at least one of an image and a video of the environment in real time. The control unit is configured to detect, recognize and/or identify one or more objects in the environment from the captured image and/or video. The one or more second sensors are configured to detect a distance of the one or more detected, recognized and/or identified object from the locomotive/train in real time. The control unit is configured to perform one or more pre-defined actions upon determination of the distance of the locomotive/train from the one or more detected, recognized and/or identified object being less than one or more pre-defined distances.
[008] In an embodiment, the one or more first sensors include at least one of a day vision sensor, a thermal vision sensor, a fog vision sensor and a light sensor.
[009] In an embodiment, the one or more second sensors include at least one of a RADAR sensor and a LIDAR sensor.
[010] In an embodiment, the control unit is configured to dynamically adjust calibration of the one or more first sensors including brightness, contrast, thermal palette, and auto gain control based on ambient and environmental conditions.
[011] In an embodiment, the control unit is configured to dynamically switch between the plurality of the first sensors based on ambient and environmental conditions.
[012] In an embodiment, the control unit is configured to couple with one or more display units disposed on the locomotive/train. The one or more display units are configured to perform at least one of: display images of detected, recognized and/or identified objects; display live video feed of selected one or more first sensors; display Picture-in-Picture video feeds; display GPS information; display current date and time; display speed of the locomotive; display distance of detected, recognized and/or identified object from the locomotive/train; display configuration setting information; display health of selected components of the system; and display virtual visual color bar/indicator to indicate danger level.
[013] In an embodiment, the control unit is configured to couple with one or more alarm units disposed on the locomotive/train. The one or more alarm units are activated by the control unit to generate an alarm upon determination of the distance of the locomotive/train from the one or more objects being less than a first pre-defined distance.
[014] In an embodiment, a frequency of generation of an alarm is inversely proportional to the distance between the detected, recognized and/or identified object and the locomotive/train.
[015] In an embodiment, the control unit is configured to vary blink rate, colour code, and audio frequency of the one or more alarm units based on the level of danger and proximity to the detected, recognized and/or identified object.
[016] In an embodiment, the control unit is configured to couple with one or more actuators disposed in the locomotive/train. The one or more actuators are actuated by the control unit to control speed and/or braking operations of the locomotive/train upon determination of the distance of the locomotive/train from the one or more objects being less than a second pre-defined distance. The second pre-defined distance is less than the first pre-defined distance.
[017] In an embodiment, the control unit comprises a collision forecast unit configured to dynamically compute a braking threshold distance for collision avoidance.
[018] In an embodiment, the control unit is configured to automatically or manually control the one or more actuators. The one or more actuators can be manually controlled by the control unit upon receiving one or more manual inputs from a railway personnel.
[019] In an embodiment, the control unit is configured to communicatively couple with at least one of an acknowledgement unit, an input unit, a wiper control unit, a communication unit and an external interface.
[020] In an embodiment, the acknowledgement unit is configured to receive one or more first inputs from the railway personnel of the locomotive/train upon detection, recognition and/or identification of the object. The control unit, upon receiving the one or more first inputs from the acknowledgement unit, is configured to log the received input with geolocation, date and time stamp for future reference and analysis.
[021] In an embodiment, the input unit is configured to receive one or more second inputs from the railway personnel of the locomotive/train for operation of the locomotive/train.
[022] In an embodiment, the wiper control unit is configured to clean the one or more first sensors and/or the one or more second sensors.
[023] In an embodiment, the communication unit is configured to receive information from the control unit and/or transmit information to the control unit.
[024] In an embodiment, the external interface is configured for maintenance, data backup, configuration, OBD, provision of external video-feeds for viewing on separate viewing devices, and updating and upgrading of system software.
[025] In an embodiment, the system comprises a pan and tilt assembly. The one or more first sensors are disposed on the pan and tilt assembly. The pan and tilt assembly is controlled by the control unit to adjust alignment of the one or more first sensors to maintain field-of-view along curved tracks, ensuring minimal blind zones.
[026] In an embodiment, the control unit comprises a decision making unit configured to determine a certainty/confidence percentage of detected, recognized and/or identified object.
[027] In an embodiment, the control unit is configured to control changes in speed of the locomotive/train based on recorded historical data and regulatory geofencing zones.
[028] In an embodiment, the one or more first sensors and/or second sensors are mounted in modular, magnetically attachable pods with vibration isolation, allowing quick installation and removal without violating the maximum moving dimensions of the locomotive/train.
[029] In another aspect of the present invention, a method for detection, recognition, and/or identification of one or more objects is disclosed. The method comprises capturing, in real time, by one or more first sensors disposed on a locomotive/train, at least one of an image and a video of an environment including a railway track and a pre-defined area surrounding the railway track. The method further comprises detecting, recognizing and/or identifying, by a control unit communicatively coupled to the one or more first sensors, one or more objects in the environment from the captured image and/or video. The method further comprises detecting, in real time, by one or more second sensors disposed on the locomotive/train, a distance of the one or more detected, recognized and/or identified object from the locomotive/train. The method further comprises performing, by the control unit communicatively coupled to the one or more second sensors, one or more pre-defined actions/operations upon determination of the distance of the locomotive/train from the one or more detected, recognized and/or identified object being less than one or more pre-defined distances.
[030] In an embodiment, the method further comprises dynamically adjusting, by the control unit, calibration of the one or more first sensors including brightness, contrast, thermal palette, and auto gain control based on ambient and environmental conditions.
[031] In an embodiment, the method further comprises dynamically switching, by the control unit, between the plurality of the first sensors based on ambient and environmental conditions.
[032] In an embodiment, the method further comprises displaying, by a display unit, coupled with the control unit, images of detected, recognized and/or identified objects.
[033] In an embodiment, the method further comprises displaying, by the display unit, live video feed of selected one or more first sensors.
[034] In an embodiment, the method further comprises displaying, by the display unit, Picture-in-Picture video feeds.
[035] In an embodiment, the method further comprises displaying, by the display unit, GPS information.
[036] In an embodiment, the method further comprises displaying, by the display unit, current date and time.
[037] In an embodiment, the method further comprises displaying, by the display unit, speed of the train/locomotive.
[038] In an embodiment, the method further comprises displaying, by the display unit, distance of detected, recognized and/or identified object from the locomotive/train.
[039] In an embodiment, the method further comprises displaying, by the display unit, distance of detected, recognized and/or identified object from the locomotive/train.
[040] In an embodiment, the method further comprises displaying, by the display unit, configuration setting information.
[041] In an embodiment, the method further comprises displaying, by the display unit, health of selected components of the system.
[042] In an embodiment, the method further comprises displaying, by the display unit, virtual visual color bar/indicator to indicate danger level.
[043] In an embodiment, the method further comprises activating, by the control unit, one or more alarm units disposed on the locomotive/train to generate an alarm upon determination of the distance of the locomotive/train from the one or more objects being less than a first pre-defined distance.
[044] In an embodiment, the method further comprises controlling, by the control unit, speed and/or braking operations of the locomotive/train upon determination of the distance of the locomotive/train from the one or more objects being less than a second pre-defined distance.
[045] In an embodiment, the method further comprises dynamically computing, by a collision forecast unit of the control unit, a braking threshold distance for collision avoidance.
[046] In an embodiment, the method further comprises automatically or manually controlling, by the control unit, one or more actuators.
[047] In an embodiment, the method further comprises receiving one or more first inputs, by an acknowledgement unit communicatively coupled to the control unit, upon detection, recognition and/or identification of the one or more objects and, upon receiving the one or more inputs from the acknowledgement unit, logging the received input with geolocation, date and time stamp for future reference and analysis.
[048] In an embodiment, the method further comprises receiving, by an input unit communicatively coupled to the control unit, one or more second inputs from the railway personnel of the locomotive/train for operation of the locomotive/train.
[049] In an embodiment, the method further comprises cleaning, by the wiper control unit communicatively coupled to the control unit, the one or more first sensors and/or one or more second sensors.
[050] In an embodiment, the method further comprises receiving and/or transmitting, by the communication unit communicatively coupled to the control unit, information from and/or to the control unit.
[051] In an embodiment, the method further comprises performing, by an external interface communicatively coupled to the control unit, maintenance, data backup, configuration, OBD, provision of external video-feeds for viewing on separate viewing devices, and updating and upgrading of system software.
[052] In an embodiment, the method further comprises adjusting, by a pan and tilt assembly controlled by the control unit, alignment of the one or more first sensors to maintain field-of-view along curved tracks, ensuring minimal blind zones.
[053] In an embodiment, the method further comprises determining, by a decision making unit of the control unit, a certainty/confidence percentage of detected, recognized and/or identified object.
[054] In an embodiment, the method further comprises controlling, by the control unit, changes in speed of the locomotive/train based on historical recorded data and regulatory geofencing zones.
BRIEF DESCRIPTION OF THE DRAWINGS
[055] Reference will be made to embodiments of the invention, examples of which may be illustrated in accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 illustrates a block diagram of a system for detection, recognition and/or identification of the objects on a railway track, in accordance with an embodiment of the present invention.
Figure 2 illustrates a block diagram of a memory unit of the system, in accordance with an embodiment of the present invention.
Figure 3 illustrates a flowchart of a method for detection, recognition and/or identification of the objects on a railway track, in accordance with an embodiment of the present invention.
Figure 4 illustrates a flow chart of a method for selection of a vision sensor 102a, in accordance with an embodiment of the present invention.

DESCRIPTION OF THE INVENTION
[056] In one aspect of the present invention, a system 100 for detection, recognition and identification of the objects on a railway track is disclosed. The system 100 comprises one or more sensors 102 and a control unit 104 communicatively coupled to the one or more sensors 102. The control unit 104 comprises one or more memory units 104a and one or more processors 104b. The sensors 102 are configured to detect/measure one or more pre-defined parameters. The control unit 104 is configured to receive information indicative of the one or more pre-defined parameters and perform one or more pre-defined operations upon satisfaction of one or more pre-defined conditions. In another aspect of the present invention, a method for detection, recognition and identification of the objects on a railway track is disclosed. The method comprises a step of detecting/measuring, by one or more sensors 102, one or more pre-defined parameters. The method further comprises a step of receiving information indicative of the one or more pre-defined parameters. The step of receiving is performed by a control unit 104 communicatively coupled to the one or more sensors 102. The method further comprises a step of performing one or more pre-defined operations upon satisfaction of the one or more pre-defined conditions. The step of performing is also performed by the control unit 104.
[057] Figure 1 illustrates a block diagram of a system 100 for detection, recognition and/or identification of the objects on a railway track, in accordance with an embodiment of the present invention.
[058] For the purposes of the present invention, the term “object” includes living beings as well as non-living things. As shown, the system 100 comprises a plurality of sensors 102. The plurality of sensors 102 are configured to detect one or more pre-defined parameters. Each of the plurality of sensors 102 is disposed on a locomotive or a train at one or more locations which ensure efficient detection/measurement of the one or more pre-defined parameters. The plurality of sensors 102 may include one or more first sensors and one or more second sensors. The one or more first sensors are configured to capture, in real time, at least one of an image and a video of an environment including a railway track and a pre-defined area surrounding the railway track. The one or more second sensors are configured to detect a distance of the one or more detected, recognized and/or identified objects from the locomotive/train in real time. The one or more first sensors are vision sensors 102a and include at least one of a day vision sensor, a thermal vision sensor, a fog vision sensor and a light sensor, and the one or more second sensors are distance sensors 102d and include at least one of a RADAR sensor and a LIDAR sensor. The plurality of sensors may also include geo-location sensors 102b, speed sensors 102c, vibration sensors 102e and BIT (Built-in Test) sensors 102f. However, this should not be construed as limiting and other now known or later developed sensors are also well within the scope of the present invention. The day vision sensor is configured to capture images and/or videos of an object during daytime when there is adequate sunlight. The thermal vision sensor is configured to capture images and/or videos during nighttime or when the sunlight is not adequate. The fog vision sensor is configured to capture images and/or videos in fog conditions. It is to be understood that other now known or later developed sensors which are configured to capture images and/or videos in different ambient and environmental conditions such as rain, hail, etc. are well within the scope of the present invention. The light sensor is configured to determine the value of ambient light. Based on the determination of ambient light, a control unit 104 (discussed later) switches between the day vision sensor, the thermal vision sensor and the fog vision sensor. The vision/first sensors 102a are generally disposed or mounted at a front of the locomotive/train or a rear of the locomotive/train. The one or more second sensors, i.e., the distance sensors 102d, are configured to calculate a distance of the locomotive/train from a detected, recognized and/or identified object. In a non-limiting example, the distance sensor 102d is a LIDAR. In a non-limiting example, the distance sensor 102d is a RADAR. The geo-location sensors 102b are configured to detect geographical locations in real time. In a non-limiting example, the geo-location sensor 102b is a GPS/GNSS module. The speed sensors 102c are configured to detect speed of the locomotive/train in real time. In a non-limiting example, the speed sensor 102c is selected from a group comprising a tachometer, GPS and GNSS. The BIT sensors 102f are configured to determine the status/health of different components of the system 100. In a non-limiting example, the BIT sensors 102f are selected from a group comprising a current sensor, a voltage sensor, a power sensor, a signal sensor, a video feed sensor and a continuity sensor.
The vibration sensors 102e are configured to sense vibration of different components of the system 100 mounted on the train/locomotive. The input of the vibration sensors 102e is used to determine whether the vibration of the one or more components of the system 100 disposed in the locomotive/train is within a pre-defined range. In case the vibrations of the components are not within the pre-defined range, an indication of the preventive maintenance or replacement of the component is displayed on a display unit 106 of the system. Although a plurality of sensors 102 is shown in Figure 1, it has to be understood that all the sensors 102 are not required for the working of the system 100 and different combinations of sensors 102 can be used to achieve a desired result with respect to efficiency as well as cost of the system 100.
[059] The system further comprises the control unit 104. The control unit 104 is communicatively coupled to the one or more first sensors 102a and configured to detect, recognize and/or identify the one or more objects in the environment from the captured image and/or video. The control unit 104 is also communicatively coupled to the one or more second sensors 102d to perform one or more pre-defined actions upon determination of the distance of the locomotive/train from the one or more detected, recognized and/or identified objects being less than one or more pre-defined distances, say, a first pre-defined distance and a second pre-defined distance. In a non-limiting example, the control unit 104 comprises one or more memory units 104a and one or more processor units 104b. The control unit 104 is configured to receive inputs from the one or more sensors 102 and process the same to perform one or more pre-defined operations. The detection of an object indicates the presence of an object on the railway track. The recognition of the object indicates the class of the detected object. In a non-limiting example, the class of the object may be animal, human, tree, rock, etc. The identification of the object shows the object in an enclosed rectangle along with the name of the object. The one or more processors 104b and the one or more memory units 104a are configured to interact with each other to perform the process of detection, recognition and identification of the object.
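The following non-limiting, purely illustrative Python sketch shows one possible way the three-level DRI outcome described above (detection of presence, recognition of class, identification with an enclosing rectangle and name) could be represented in software. The data structure, field names and example values are assumptions for illustration only, not the claimed implementation.

```python
# Illustrative sketch only: one possible representation of a DRI result,
# assuming a generic detector that returns a confidence score, an optional
# class label and an optional bounding box.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class DRIResult:
    detected: bool                      # detection: an object is present on/near the track
    object_class: Optional[str] = None  # recognition: e.g. "animal", "human", "tree", "rock"
    label: Optional[str] = None         # identification: specific name, e.g. "elephant"
    box: Optional[Tuple[int, int, int, int]] = None  # enclosing rectangle (x, y, w, h)
    confidence: float = 0.0             # certainty/confidence percentage (0-100)


def classify_stage(result: DRIResult) -> str:
    """Return which DRI stage was reached, mirroring the detection ->
    recognition -> identification hierarchy described above."""
    if result.label and result.box:
        return "identified"   # shown as an enclosed rectangle with the object name
    if result.object_class:
        return "recognized"   # only the class of the object is known
    if result.detected:
        return "detected"     # only the presence of an object is known
    return "none"


if __name__ == "__main__":
    r = DRIResult(detected=True, object_class="animal", label="elephant",
                  box=(120, 80, 60, 90), confidence=92.5)
    print(classify_stage(r), f"{r.confidence:.1f}%")
```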
[060] The system 100 further comprises one or more display units 106. The one or more display units 106 are configured to display images of detected, recognized and identified objects. The display unit 106 is communicatively coupled to the control unit 104. The display unit 106 is disposed in the locomotive at a location which is easily accessible to the railway personnel operating the locomotive. In a non-limiting example, the display unit 106 is configured to display live video feed of the selected one or more vision sensors 102a. In a non-limiting example, the display unit 106 is configured to display Picture-in-Picture video feeds. In a non-limiting example, the display unit 106 is configured to display GPS information. In a non-limiting example, the display unit 106 is configured to display current date and time. In a non-limiting example, the display unit 106 is configured to display speed of the locomotive. In a non-limiting example, the display unit 106 is configured to display distance of the detected, recognized and/or identified object. In a non-limiting example, the display unit 106 is configured to display the annotated object. In a non-limiting example, the display unit 106 is configured to display configuration setting information. In a non-limiting example, the display unit 106 is configured to display BIT (Built-in Test) status, i.e., the health of selected components of the system 100. Malfunction of any of the components of the system 100 can be displayed for preventive measures. In a non-limiting example, the display unit 106 is configured to display a virtual visual color bar/indicator to indicate danger level.
[061] The system 100 may further comprise an acknowledgement unit 108. The acknowledgement unit 108 comprises a plurality of input means such as, not being limited to, buttons to receive inputs from the railway personnel of the locomotive. In a non-limiting example, the input is given by the railway personnel upon detection, recognition and identification of the object. The acknowledgement unit 108 is communicatively coupled with the control unit 104. Upon receiving the input from the railway personnel/loco pilot, the control unit 104 may log the received input with geolocation, date and time stamp for future reference and analysis.
[062] The system 100 may further comprise one or more alarm units 110 communicatively coupled with the control unit 104. Upon determination of the distance of the detected, identified and/or recognized object being less than a first pre-defined distance, the control unit 104 is configured to activate the one or more alarm units 110. The one or more alarm units 110 serve at least one of two purposes. The first purpose of the alarm unit 110 is to alert the railway personnel/loco pilot upon detection, identification and recognition of the object. The second purpose of the alarm unit 110 is to alert the detected, recognized and/or identified living object. To achieve both functionalities, the alarm units 110 are disposed on the locomotive/train. The one or more alarm units 110 can be selected from a group comprising an audio alarm unit, a visual alarm unit and a haptic alarm unit. The audio alarm unit is configured to generate an audio sound. The visual alarm unit is configured to generate a visual indication such as blinking of lights. The haptic alarm unit is configured to generate vibrations. The haptic alarm unit can be provided under the seat of the railway personnel operating the locomotive. In a non-limiting example, the frequency of generation of the alarm is inversely proportional to the distance between the detected, recognized and/or identified object and the locomotive/train. In other words, the frequency of the alarm will increase with decreasing distance between the detected, recognized and/or identified object and the locomotive/train. In a non-limiting example, the intensity of the alarm is inversely proportional to the distance between the detected, recognized and/or identified object and the locomotive/train. In other words, the intensity of the alarm will increase with decreasing distance between the detected, recognized and/or identified object and the locomotive/train. In an embodiment, the control unit 104 is configured to vary the blink rate, colour code, and audio frequency of the one or more alarm units based on the level of danger and proximity to the detected, recognized and/or identified object.
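The following non-limiting, purely illustrative Python sketch shows one way the inverse relationship described above could map the object distance to alarm frequency, blink rate and colour. The distance band, rates and colour break-points are assumed values, not taken from the specification.

```python
# Illustrative sketch only: a simple mapping from object distance to alarm
# parameters, consistent with the inversely proportional behaviour described
# above. Thresholds, rates and colours are assumptions.
def alarm_parameters(distance_m: float, first_predefined_distance_m: float = 600.0):
    """Return (beep_hz, blink_hz, colour) for an object at distance_m.
    Alarm frequency and blink rate rise as the object gets closer."""
    if distance_m >= first_predefined_distance_m:
        return 0.0, 0.0, "off"               # object beyond the first pre-defined distance
    # Normalised closeness in [0, 1]: 0 at the threshold, 1 at the object.
    closeness = 1.0 - max(distance_m, 0.0) / first_predefined_distance_m
    beep_hz = 0.5 + 4.5 * closeness          # low, smooth sound -> high, shrill sound
    blink_hz = 1.0 + 5.0 * closeness         # slow blink -> fast blink
    colour = "green" if closeness < 0.33 else "yellow" if closeness < 0.66 else "red"
    return beep_hz, blink_hz, colour


for d in (550, 300, 50):
    print(d, alarm_parameters(d))
```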
[064] The system 100 may further comprise one or more actuators 112. Upon determination of the distance of the detected, identified and/or recognized object being less than a second pre-defined distance, the control unit 104 is configured to actuate the one or more actuators 112. It is to be understood that the first pre-defined distance is generally greater than the second pre-defined distance. In other words, the alarms will be activated prior to actuation of the one or more actuators. The one or more actuators 112 are communicatively coupled to the control unit 104. In a non-limiting example, the one or more actuators are actuated simultaneously with the alarm units. The one or more actuators 112 are configured to control speed and braking operations of the locomotive/train. In the absence of any corrective action by the railway personnel/loco pilot to prevent an accident/collision, the control unit 104 is configured to actuate the one or more actuators 112 automatically to prevent accidents/collisions. The activation of braking actuation is configurable as manual or automatic. In a non-limiting example, the actuators are selected from a group comprising relays, SSRs (Solid State Relays), MOSFETs and power transistors.
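The following non-limiting, purely illustrative Python sketch shows how the first and second pre-defined distances could gate the alarm and the speed/braking actuation, with braking configurable as automatic or manual. The threshold values and action names are assumptions.

```python
# Illustrative sketch only: two-threshold action selection. The second
# pre-defined distance is less than the first, so alarms are raised before
# any actuator is triggered. Values are assumed.
def decide_actions(distance_m: float,
                   first_predefined_m: float = 600.0,
                   second_predefined_m: float = 250.0,
                   auto_braking: bool = True):
    """Return the list of pre-defined actions for the current object distance."""
    actions = []
    if distance_m < first_predefined_m:
        actions.append("activate_alarm_units")           # aural/visual/haptic alert
    if distance_m < second_predefined_m:
        if auto_braking:
            actions.append("actuate_speed_and_braking")   # relay/SSR/MOSFET actuators
        else:
            actions.append("prompt_loco_pilot_to_brake")  # manual braking responsibility
    return actions


print(decide_actions(700))   # []
print(decide_actions(400))   # alarm only
print(decide_actions(150))   # alarm + automatic braking
```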
[065] The system 100 may further comprise an input unit 114. The input unit 114 is communicatively coupled to the control unit 104. The input unit 114 is configured to receive one or more inputs from the railway personnel/loco pilot which are necessary for operation of the locomotive/train. The input unit 114 comprises special ports, menu selection buttons, entry key buttons, on-off buttons, OBD (On Board Diagnostics) systems, software-based buttons on the display screen (with touch capability) and the like.
[066] The system 100 may further comprise a wiper control unit 116. The wiper control unit 116 is communicatively coupled to the control unit 104. The wiper control unit 116 is configured to clean the one or more sensors 102 such as the vision sensors 102a. It is to be understood that over time and/or with changing atmospheric conditions, the one or more vision sensors 102a get covered with dust, which negatively impacts the working of such vision sensors 102a. Upon determination of the images and/or videos being captured by the one or more vision sensors 102a being blurred or unclear, the control unit 104 actuates the wiper control unit 116 to clean the one or more vision sensors 102a, more particularly the lenses of the one or more vision sensors 102a. In a non-limiting example, the wiper control unit 116 includes DC motors, servo motors, stepper motors and linear actuators.
[067] The system 100 may further comprise a communication unit 118. The communication unit 118 is communicatively coupled to the control unit 104 to receive information from the control unit 104 and/or transmit information to the control unit 104. In a non-limiting example, the communication unit 118 provides interaction with a railway network management system. The communication unit 118 may comprise radio sets, wireless sets, mobile sets and the like.
[068] The system 100 may further comprise an external interface 120. The external interface 120 is communicatively coupled with the control unit 104 for the purposes of maintenance, data backup, configuration, OBD, external video-feeds to view on separate viewing devices, updating and upgrading system software and the like. In a non-limiting example, the external interface 120 comprises display ports, USB ports, special purpose ports, parallel ports and serial ports.
[069] The system 100 may further comprise a pan and tilt assembly 122. The pan and tilt assembly 122 is mounted on the locomotive/train at a location to maintain the field of view bore-sighted to the tracks in case of curves to avoid vision dead zones. The pan and tilt assembly 122 may comprise servo motors, DC motors, stepper motors, hydraulic systems and the like.
[070] It is to be understood that adequate power supply is required to be given to different components of the system 100. Since the power supply required for different components of the system 100 is different from conventional power supplies, power conversion is required to run the system 100 of the present invention; this is achieved by power converters, buck-boost converters, power controllers, etc. In a non-limiting example, the power supply is provided to different components of the locomotive/train using DC-DC converters, buck converters, boost converters, buck-boost converters, level converters, current limiting components and voltage and current measuring systems.
[071] In an embodiment, the control unit 104 is configured to dynamically adjust calibration of the one or more first sensors 102a including brightness, contrast, thermal palette, and auto gain control based on ambient and environmental conditions.
[072] In an embodiment, the control unit 104 is configured to dynamically switch between the plurality of the first sensors based on ambient and environmental conditions.
[073] In an embodiment, the control unit 104 comprises a collision forecast unit configured to dynamically compute a braking threshold distance for collision avoidance.
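As a non-limiting, purely illustrative sketch, a collision forecast unit of the kind described above could compute the braking threshold distance from the conventional stopping-distance relation (reaction distance plus v²/(2·deceleration) plus a safety margin). The deceleration, reaction time and margin below are assumed values, not parameters disclosed in the specification.

```python
# Illustrative sketch only: dynamic braking threshold distance from the
# standard stopping-distance formula. All numeric defaults are assumptions.
def braking_threshold_m(speed_kmph: float,
                        deceleration_mps2: float = 0.7,
                        reaction_time_s: float = 2.0,
                        safety_margin_m: float = 100.0) -> float:
    """Distance at which braking must begin to stop short of the object."""
    v = speed_kmph / 3.6                       # convert km/h to m/s
    reaction_distance = v * reaction_time_s    # distance covered before brakes bite
    braking_distance = v * v / (2.0 * deceleration_mps2)
    return reaction_distance + braking_distance + safety_margin_m


print(round(braking_threshold_m(100.0)))  # ~707 m at 100 km/h with the assumed values
```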
[074] In an embodiment, the control unit 104 comprises a decision-making unit configured to determine a certainty/confidence percentage of the detected, recognized and/or identified object.
[075] In an embodiment, the control unit 104 is configured to control changes in speed of the locomotive/train based on historical recorded data and regulatory geofencing zones.
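The following non-limiting, purely illustrative Python sketch shows one possible geofence check supporting such speed control, assuming each regulatory zone is modelled as a circle around a centre coordinate with an associated speed limit. The zone data and limits are hypothetical and for demonstration only.

```python
# Illustrative sketch only: look up the most restrictive speed limit for the
# current GPS position against a set of hypothetical regulatory geofence zones.
import math


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def zone_speed_limit(lat, lon, zones, default_kmph=110.0):
    """Return the most restrictive speed limit applicable at (lat, lon)."""
    limits = [z["limit_kmph"] for z in zones
              if haversine_km(lat, lon, z["lat"], z["lon"]) <= z["radius_km"]]
    return min(limits, default=default_kmph)


# Hypothetical elephant-corridor zone used only for demonstration.
zones = [{"lat": 26.90, "lon": 88.85, "radius_km": 5.0, "limit_kmph": 40.0}]
print(zone_speed_limit(26.91, 88.86, zones))  # 40.0 inside the zone
```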
[076] In an embodiment, the plurality of sensors including one or more first sensors may be mounted in modular, magnetically attachable pods with vibration isolation, allowing quick installation and removal without violating the maximum moving dimensions of the locomotive/train.
[077] Figure 2 illustrates a block diagram of a memory unit 104a of the control unit 104, in accordance with an embodiment of the present invention.
[078] As can be seen, the memory unit 104a is configured to store configuration, features of the objects, trained knowledge, procedural knowledge, static data, dynamic data, shared data, buffers, a video data store and a log data store. The configuration includes, not being limited to, activation of reading of the light sensor, geo position, speed sensor, vibration sensor, BIT sensors and vision sensors, automatic braking, aural and visual timings and their frequencies, backup timings, and disk overflow threshold. The features include, not being limited to, colouring of the video feed, overlays and display message setup. The trained knowledge includes, not being limited to, the trained models for individual or multiple objects for the purpose of detection; it also includes the false alarm data set. The procedural knowledge includes, not being limited to, the processing functions for complete system operations, including process threads and concurrent processing of sensor data. The static data includes, not being limited to, the set of data values fixed for a specific system. The dynamic data includes, not being limited to, variables, pointers and memory specifications of continuously sensed data values. The shared data includes, not being limited to, semaphores and sets of values shared between concurrent processes. The buffer includes, not being limited to, stacks, queues and double buffers for video. The video data store includes, not being limited to, secondary storage devices and auxiliary storage devices including HDDs, pen drives and SSDs. The log data store includes, not being limited to, the memory structure holding the live data log of various sensors including GPS, temperature and the like.
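The following non-limiting, purely illustrative Python sketch shows one possible in-memory representation of a subset of the configuration items listed above. The field names and default values are assumptions for illustration only.

```python
# Illustrative sketch only: a possible configuration structure mirroring the
# items described for the memory unit. Names and defaults are assumed.
from dataclasses import dataclass


@dataclass
class SystemConfiguration:
    # Activation of cyclic sensor reading
    read_light_sensor: bool = True
    read_geo_position: bool = True
    read_speed_sensor: bool = True
    read_vibration_sensor: bool = True
    read_bit_sensors: bool = True
    read_vision_sensors: bool = True
    # Braking and alarm behaviour
    automatic_braking: bool = False
    aural_alarm_frequency_hz: float = 1.0
    visual_alarm_frequency_hz: float = 1.0
    # Recording management
    backup_interval_min: int = 60
    disk_overflow_threshold_pct: float = 90.0
    # Display features
    video_overlays: bool = True
    colour_palette: str = "white-hot"


config = SystemConfiguration(automatic_braking=True)
print(config.automatic_braking, config.disk_overflow_threshold_pct)
```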
[079] In a non-limiting example, the system 100 continuously records the output from the one or more vision sensors 102a on a secondary memory available in the control unit 104 or an external memory (not shown), with the purpose of maintaining sufficient storage at all times and providing a facility for overflow management. In a non-limiting example, the memory includes RAM, ROM, EEPROM, hard disks, magnetic memories and the like.
[080] Figure 3 illustrates a flowchart of a method 300 for detection, recognition and/or identification of the objects on a railway track, in accordance with an embodiment of the present invention.
[081] As shown, at step 301, the method comprises receiving one or more inputs from the one or more sensors 102. The one or more sensors are first sensors 102a communicatively coupled to a control unit 104 and configured to capture, in real time, one of an image and a video of an environment including a railway track and a pre-defined area surrounding the railway track. The images and/or video of the environment are received as the one or more inputs. The step of receiving is performed by the control unit 104. At step 302, the method 300 comprises processing the inputs received from the one or more sensors 102. The step 302 of processing is performed by the control unit by means of trained models based on neural networks, artificial intelligence, etc. Based on the processing, at step 303, the control unit 104 detects whether there are one or more objects in the environment within a pre-defined distance of the railway track. The distance is calculated by the one or more second sensors 102d. The one or more second sensors 102d are also communicatively coupled to the control unit. In case the object is not detected, the method moves to step 301, else to steps 304 and 305. At steps 304 and 305, the method 300 comprises processing the information in relation to the detected object to recognize the object (discussed in detail in subsequent paragraphs). In case the control unit 104 is not able to recognize the object, the method 300 moves to step 308, else to steps 306 and 307. At step 308, the method 300 displays the detected object. At steps 306 and 307, the method 300 comprises processing information in relation to the recognized object to identify the object (discussed in detail in subsequent paragraphs). In case the object is not identified, the method moves to step 309, else to step 310. At step 309, the method displays the recognized object. At step 310, the method processes information in relation to the recognized object to identify the object (discussed in detail in subsequent paragraphs) and displays the identified object. Upon the detected object, recognized object and/or identified object being at a distance less than a first pre-defined distance, the method 300 may generate an aural, visual and/or haptic alarm to alert the railway personnel/loco pilot as well as the detected, recognized or identified object. Upon the detected object, recognized object and/or identified object being at a distance less than a second pre-defined distance, the method 300 may actuate one or more actuators to control speed and braking operations of the train/locomotive. In a non-limiting example, the braking operation is carried out automatically in case the configuration of braking is set to automatic; else, it is the responsibility of the loco pilot to carry out the braking operation. All the above-mentioned steps are performed by the control unit 104.
[082] In a non-limiting example, the method steps of the present invention can be broadly divided into three phases i.e., supervised learning, real time operation and unsupervised learning.
[083] In the supervised learning, the method comprises capturing images and/or videos of the objects by different vision sensors 102a under different ambient conditions and time(s) of the day. The images will be captured with different orientations and projections of the objects with reference to the vision sensors 102a. The method further comprises a step of processing and annotating the captured images and/or videos to create image pools. A single pool will contain images pertaining to one object class. The annotation is defined for recognition and further identification of the object under consideration. For example, the pools can be for humans or for different animals such as, not being limited to, elephant, rhino, cow, goat, sheep, etc. The image set is then trained over a neural network with the devised method over time until a specific accuracy and precision are obtained for a typical confidence value. The outcome of the neural network is a trained model to be used for the purpose of detection, recognition and identification of the object. The system/method of the present invention is provided with a feature of eliminating false positives by iterating the neural networks with the devised method, leading to a better confusion matrix in terms of higher true positives.
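The following non-limiting, purely illustrative Python sketch shows how iterating on a detection threshold can be evaluated with a confusion matrix so that false positives are reduced while true positives remain high, in line with the false-positive elimination described above. The scores, labels and thresholds are made up for demonstration and are not disclosed training data.

```python
# Illustrative sketch only: evaluate detection thresholds with a confusion
# matrix (true/false positives and negatives) on made-up scores and labels.
def confusion_matrix(scores, labels, threshold):
    """scores: model confidence per sample; labels: 1 = object present, 0 = absent."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    return tp, fp, fn, tn


scores = [0.95, 0.80, 0.40, 0.90, 0.30, 0.65]
labels = [1, 1, 0, 1, 0, 0]
for threshold in (0.5, 0.7):
    tp, fp, fn, tn = confusion_matrix(scores, labels, threshold)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    print(f"threshold={threshold}: TP={tp} FP={fp} precision={precision:.2f} recall={recall:.2f}")
```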
[084] In real time operation, with the trained model, defined I/O and set configuration, the system is ready to run in the field (locomotive run). The real time operation involves many processes running concurrently. When power is applied to the system 100, the following actions take place: (a) First, the system 100 enters into Power-On Self-Test (POST) mode, checks the working status of designated components of the system 100 and reports it to the railway personnel/loco pilot via aural and visual signals in terms of indicators, beeps and the display unit. If the system passes POST, it proceeds to further actions; otherwise, a fault is indicated for taking a decision on a partial run or a complete stop. (b) The POST is followed by configuration setting, whereby the system parameters are set in accordance with the settings as carried out by the user. (c) The configuration settings are followed by activation of cyclic reading of the one or more sensors 102 such as the vision sensor 102a, geo-location sensor 102b, speed sensor 102c, vibration sensor 102e and BIT sensors 102f. If the configuration has auto selection of video feed, then depending on the light sensor (ambient light conditions) the respective sensor (camera) from among the day, night and fog sensors is selected for DRI and display on the display unit 106. In manual selection of video feed, depending on the selection as carried out by the user, the respective camera feed is used for the purpose of DRI (detection, recognition and identification) and display. The system 100 has a facility of blending multiple camera video feeds, and PIP (picture-in-picture), whereby one camera feed is made primary and the other secondary. The primary feed is shown full screen whereas the secondary feed is shown in the form of a window, or vice versa. The system 100 has a facility of text and graphics overlays for the purpose of annotation. The video feeds are provided with colour palette selection to change the gamut of the video over the display unit. This is helpful in thermal vision, where the output is in gray scale; the colour palette selection converts the live video into colour output as desired. Also, the colour system has a facility of mapping the screen view into temperature zones depending on the ambient temperature value. In case of detection, recognition and identification (DRI), the identified object is shown by an enclosed rectangle with its name. In case of recognition only, the class of the object is shown, and in case of detection only, the display shows the presence of an object on the track. After detection, recognition and identification, the object details are saved in the form of an annotated photograph with GPS location and date-time stamp. The detection, recognition and identification (DRI) are immediately followed by activating the distance sensor, which in turn returns the distance of the object from the locomotive/train. The aural/audio alarm, visual alarm and/or haptic alarm indications are computed in proportion to the distance. In case of the aural alarm, the frequency and amplitude are varied from a low and smooth sound to a high and shrill sound to alert the loco pilot and the detected, recognized and identified object. Similarly, the blink speed and colour of the visual indicators are also varied in proportion to the distance. In a non-limiting example, the colour of the visual indicators will vary from green with a slow blink, to yellow with a medium blink, to red with a fast blink, inversely proportional to the distance of the DRI object from the locomotive.
Further, depending on auto or manual selection of the actuators, the actuators are activated to perform speed control operation and/or braking operation. If auto braking is configured, automatic braking is applied to the locomotive/train, and in case of manual braking, it is the responsibility of the railway personnel/loco-pilot to activate the speed control and/or braking units. In case of detection, recognition and identification or braking, the loco-pilot acknowledges the same by giving an input such as, not being limited to, pressing a button (hard or soft). The action of the railway personnel/loco-pilot in this case is recorded into the database with GPS location, date and time stamp. In case of an exception generated by any of the running processes of the system 100, such as vibration sensing, BIT sensing, etc., proper messages or alarms are raised and logged into the system. The system 100 has a facility of auto recording the video feed in the memory 104a of the system 100 while maintaining sufficient storage. In case the storage of the memory 104a falls below a configured threshold, an indication can be shown on the display screen along with a visual/aural alarm to alert the user to take a backup. In case the backup is not taken and the recording reaches the configured maximum limit, the video feeds of earlier times will be deleted automatically. The field-of-view of the vision sensors 102a is maintained by the pan-and-tilt assembly 122, such that in case of curved tracks, the vision sensors 102a automatically move to maintain bore-sight with reference to the tracks. The system is provided with external interfaces for purposes such as taking auto/manual backup of video feeds and log files to external storage devices such as HDDs, pen drives, etc., on-board diagnostics to check the health of system units, and a wiper control mechanism being activated on detection of blurriness in the video feed for cleaning the lenses.
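The following non-limiting, purely illustrative Python sketch shows one possible overflow-management routine of the kind described above: it warns when free recording space falls below a configured threshold and, if the limit is still exceeded, deletes the oldest video files first. The directory path, file extension and thresholds are assumptions, not part of the disclosed system.

```python
# Illustrative sketch only: warn on low recording space and delete the
# earliest recordings until the configured minimum free space is restored.
import os
import shutil


def manage_recording_storage(record_dir: str,
                             warn_free_pct: float = 15.0,
                             min_free_pct: float = 5.0) -> None:
    usage = shutil.disk_usage(record_dir)
    free_pct = 100.0 * usage.free / usage.total
    if free_pct < warn_free_pct:
        print("WARNING: recording storage low - please take a backup")  # alarm/indication
    # Delete the oldest recordings until the minimum free space is restored.
    files = sorted((os.path.join(record_dir, f) for f in os.listdir(record_dir)
                    if f.endswith(".mp4")), key=os.path.getmtime)
    while free_pct < min_free_pct and files:
        os.remove(files.pop(0))                      # oldest video feed deleted first
        usage = shutil.disk_usage(record_dir)
        free_pct = 100.0 * usage.free / usage.total


# manage_recording_storage("/var/recordings")  # example invocation (hypothetical path)
```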
[085] The method of unsupervised learning is generally an automated process, whereby, in case of recognition and/or identification carried out by the system 100, the image of the object is made part of the image pool-set. The moving of the image into the respective pool is decided by recognition or identification, as the case may be. The detected object is automatically annotated according to its class. When the system is not live, it will automatically/manually enter into a self-learning mode. In the self-learning mode, it will execute the devised method over the neural networks for certain epochs, making the model more robust over time.
[086] In an embodiment, the method further comprises dynamically adjusting, by the control unit 104, calibration of the one or more first sensors 102a including brightness, contrast, thermal palette, and auto gain control based on ambient and environmental conditions.
[087] In an embodiment, the method further comprises dynamically switching, by the control unit 104, between the plurality of the first sensors 102a based on ambient and environmental conditions.
[088] In an embodiment, the method further comprises displaying, by a display unit 106, coupled with the control unit 104, images of detected, recognized and/or identified objects.
[089] In an embodiment, the method further comprises displaying, by the display unit 106, live video feed of selected one or more first sensors 102a.
[090] In an embodiment, the method further comprises displaying, by the display unit 106, Picture-in-Picture video feeds.
[091] In an embodiment, the method further comprises displaying, by the display unit 106, GPS information.
[092] In an embodiment, the method further comprises displaying, by the display unit 106, current date and time.
[093] In an embodiment, the method further comprises displaying, by the display unit 106, speed of the train/locomotive.
[094] In an embodiment, the method further comprises displaying, by the display unit 106, distance of detected, recognized and/or identified object from the locomotive/train.
[095] In an embodiment, the method further comprises displaying, by the display unit 106, distance of detected, recognized and/or identified object from the locomotive/train.
[096] In an embodiment, the method further comprises displaying, by the display unit 106, configuration setting information.
[097] In an embodiment, the method further comprises displaying, by the display unit 106, health of selected components of the system.
[098] In an embodiment, the method further comprises displaying, by the display unit 106, virtual visual color bar/indicator to indicate danger level.
[099] In an embodiment, the method further comprises activating, by the control unit 104, one or more alarm units 110 disposed on the locomotive/train to generate an alarm upon determination of the distance of the locomotive/train from the one or more objects being less than a first pre-defined distance.
[0100] In an embodiment, the method further comprises controlling, by the control unit 104, speed and/or braking operations of the locomotive/train upon determination of the distance of the locomotive/train from the one or more objects being less than a second pre-defined distance.
[0101] In an embodiment, the method further comprises dynamically computing, by a collision forecast unit of the control unit 104, a braking threshold distance for collision avoidance.
[0102] In an embodiment, the method further comprises automatically or manually controlling, by the control unit 104, one or more actuators 112.
[0103] In an embodiment, the method further comprises receiving one or more first inputs, by an acknowledgement unit 108 communicatively coupled to the control unit 104, upon detection, recognition and/or identification of the one or more objects and, upon receiving the one or more inputs from the acknowledgement unit 108, logging the received input with geolocation, date and time stamp for future reference and analysis.
[0104] In an embodiment, the method further comprises receiving, by an input unit 114 communicatively coupled to the control unit 104, one or more second inputs from the railway personnel of the locomotive/train for operation of the locomotive/train.
[0105] In an embodiment, the method further comprises cleaning, by the wiper control unit 116 communicatively coupled to the control unit 104, the one or more first sensors 102a and/or the one or more second sensors 102d.
[0106] In an embodiment, the method further comprises receiving and/or transmitting, by the communication unit 118 communicatively coupled to the control unit 104, information from and/or to the control unit 104.
[0107] In an embodiment, the method further comprises performing, by an external interface 120 communicatively coupled to the control unit 104, maintenance, data backup, configuration, OBD, provision of external video-feeds for viewing on separate viewing devices, and updating and upgrading of system software.
[0108] In an embodiment, the method further comprises adjusting, by a pan and tilt assembly 122 controlled by the control unit 104, alignment of the one or more first sensors 102a to maintain field-of-view along curved tracks, ensuring minimal blind zones.
[0109] In an embodiment, the method further comprises determining, by a decision-making unit of the control unit 104, a certainty/confidence percentage of detected, recognized and/or identified object.
[0110] In an embodiment, the method further comprises controlling, by the control unit 104, changes in speed of the locomotive/train based on historical recorded data and regulatory geofencing zones.
[0111] Figure 4 illustrates a flow chart of a method 400 for selection of a vision sensor 102a, in accordance with an embodiment of the present invention.
[0112] As shown, at step 401, the method comprises selection of a day sensor. At step 402, the method comprises reading the ambient light sensor value. At step 403, the method comprises processing the light sensor data. At step 404, the method determines whether switching is required or not. In case switching is required, the method moves to step 405, else to step 402. At step 405, the appropriate vision sensor is selected.
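The following non-limiting, purely illustrative Python sketch corresponds to steps 401-405, selecting between the day, thermal and fog vision sensors from an ambient light reading, with a small hysteresis band to avoid rapid switching around dawn and dusk. The lux thresholds and the fog flag are assumed values, not disclosed parameters.

```python
# Illustrative sketch only: ambient-light-based vision sensor selection with
# hysteresis. Thresholds are assumptions.
def select_vision_sensor(ambient_lux: float, fog_detected: bool,
                         current: str = "day",
                         day_on_lux: float = 120.0, day_off_lux: float = 80.0) -> str:
    """Return the vision sensor to use; the hysteresis band (day_off_lux to
    day_on_lux) prevents rapid switching near transition light levels."""
    if fog_detected:
        return "fog"
    if current == "day":
        return "day" if ambient_lux >= day_off_lux else "thermal"
    return "day" if ambient_lux >= day_on_lux else "thermal"


sensor = "day"
for lux, fog in [(300, False), (90, False), (60, False), (200, True)]:
    sensor = select_vision_sensor(lux, fog, current=sensor)
    print(lux, fog, "->", sensor)
```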
[0113] The claimed features/method steps of the present invention as discussed above are not routine, conventional, or well understood in the art, as the claimed features/steps enable the following solutions to the existing problems in conventional technologies. Specifically, the technical problem of accidents/collisions on railway tracks is solved by the present invention.
[0114] An object/advantage of the present invention is to provide a system and method to automatically detect, recognize and identify one or more objects on railway tracks or in the vicinity of the railway track in real time under different atmospheric conditions and perform one or more pre-defined operations to prevent accidents/collisions.
[0115] Another object/advantage of the invention is to display the certainty percentage of the identified object.
[0116] Another object/advantage of the invention is to detect, recognize and identify multiple objects on the railway track or in vicinity of the railway track.
[0117] Another object/advantage of the invention is to provide a system and method to add the geographical locations along with date and time.
[0118] Another object/advantage of the invention is to provide aural, visual and/or haptic alarm units to alert the locomotive driver inside the locomotive and animals and humans on the railway track or in the vicinity of the railway track.
[0119] Another object/advantage of the present invention is to automatically switch (or view in parallel) the vision sensors according to the atmospheric conditions. This is achieved by incorporating ambient light sensors, together with the sunset and sunrise timings of the geographical location of the area in which the locomotive is running.
[0120] Another object/advantage of the invention is to automatically store the images and/or video recordings in internal storage devices.
[0121] Another object/advantage of the invention is to compute the braking distance in accordance with the locomotive speed and the object in the vicinity, to further enable automatic/manual braking.
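By way of illustration only, one common way of relating the locomotive speed to such a braking threshold distance is sketched below; the assumed deceleration and reaction-time values are examples and are not prescribed by the invention:

# Illustrative braking-distance computation; the deceleration and reaction time are assumed values.
def braking_threshold_m(speed_kmph: float,
                        deceleration_mps2: float = 0.7,
                        reaction_time_s: float = 2.0) -> float:
    """Distance covered during driver/system reaction plus braking to a full stop."""
    v = speed_kmph / 3.6  # convert km/h to m/s
    return v * reaction_time_s + v * v / (2.0 * deceleration_mps2)

# Example: at 80 km/h the threshold works out to roughly 397 m under these assumptions.
print(round(braking_threshold_m(80.0), 1))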
[0122] Another object/advantage of the invention is to store the location, date-time stamp of the identified object.
[0123] Another object/advantage of the invention is to alert the locomotive driver to reduce the speed in a particular area/zone depending on the learning history of appearance of the objects.
[0124] Another object/advantage of the invention is to provide a self-diagnosis of the system to indicate any malfunction or failure of the device sensors or components to the locomotive driver to take preventive measures.
[0125] Another object/advantage of the invention is to allow automatic transfer of the recorded data to external devices.
[0126] Another object/advantage of the invention is to control recording storage overflow by automatically maintaining sufficient storage at all the times.
[0127] Another object/advantage of the invention is to make the system automatically learn the feature(s) of the recognized and identified object, making the system more robust (unsupervised learning).
[0128] Another object/advantage of the invention is to display various information pertaining to the scene to the locomotive driver, including but not limited to geographic location, date-time stamp and storage availability.
[0129] Another object/advantage of the invention is to carry out automatic non-uniformity corrections of the vision devices.
[0130] Another object/advantage of the invention is to define a region of interest correlating the tracks and the vision devices' view area.
[0131] Another object/advantage of the invention is to mark the identified object on the display unit.
[0132] Another object/advantage of the invention is to configure various parameters of the system in accordance with the need of locomotive driver and/or supervisor settings.
[0133] Another object/advantage of the invention is to send automatic alert messages to the remote railway authorities via radio signals, wireless equipment, mobile devices etc.
[0134] Another object/advantage of the invention is to implement auto-update of vision intelligence between two identical Vision Systems installed on either side of the locomotive, which operate one at a time depending on the direction of travel.
[0135] Another object/advantage of the invention is to have a power supply system matching the output of locomotive power and input of the system power requirement.
[0136] Another object/advantage of the invention is to design a system that automates processes, reducing the need for user intervention and reducing the ‘load’ on railway personnel responsible for operating the locomotive/train. This allows locomotive drivers to focus more on other responsibilities.
[0137] Another object/advantage of the invention is to design a robust enclosure of the system, which can withstand the vibrations and atmospheric conditions.
[0138] Another object/advantage of the invention is to provide an external interface for the purpose of data transfer, diagnostics and setting the system configuration.
[0139] Another object/advantage of the invention is to provide touch sensitive display to the locomotive driver to change/set certain parameter(s), if needed, at the run time of the locomotive.
[0140] Another object/advantage of the invention is to make the system flexible, such that it can be trained for different types of animals, including but not limited to elephants, cows, sheep, rhinos, etc.
[0141] Another object/advantage of the invention is to maximize the accuracy of identifying the object and minimize the false alarm rate, making the system robust.
[0142] Another object of the invention is to provide supervised learning.
[0143] Another object/advantage of the invention is the interfacing of sensors, such as vision sensors, location sensors, distance sensors, etc., working in tandem with automatic switching as per the ambient conditions.
[0144] Another object/advantage of the invention is to provide On Board Diagnostics (OBD) System to log system health with date, time and locomotive id stamp.
[0145] Another object/advantage of the invention is to suggest to the locomotive driver to maintain the speed of the locomotive with respect to geo-fencing created with reference to the frequency of appearance of animals in the area under concern (the area under concern being the location identified by the system for appearance of the animals).
[0146] Another object/advantage of the invention is to define geo-fencing as provided by the Government and/or authorities governing the locomotion systems.
[0147] Another object/advantage of the invention is to create a centralized system monitoring the online health of all the Systems installed in the locomotives. (This requires network connectivity)
[0148] Another object/advantage of the invention is to create a centralized system to log all the detected, recognized and identified objects for the purpose of analysis and creating preventive measure operative procedures to reduce the accidents.
[0149] Another object/advantage of the invention is to incorporate a mechanism to activate the inbuilt siren of the locomotive, sending caution signals to animals/humans in the vicinity of the locomotive when there is potential danger en route.
[0150] Another object/advantage of the invention is to send alert/emergency messages to the nearby station when needed (DRI or casualty). In this case the proposed system should have proper network connectivity, which can be through, but is not limited to, a wireless set, GSM connectivity or a radio system. The systems installed in different locomotives can log their DRI data to a Central Server using the network connectivity/communication systems for the purpose of analysis. Since the system can be installed on multiple locomotives in different regions/areas, the proposed invention has a facility for a Central Server system to which all the units can be connected through wireless/radio systems.
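By way of illustration only, a minimal sketch of forwarding one detection log entry to such a Central Server is given below; the endpoint URL, the payload fields and the use of an HTTP transport via Python's standard urllib library are assumptions, and the actual transport may equally be a wireless set, GSM connectivity or a radio system as stated above:

# Illustrative sketch of logging a detection event to a central server; the URL and payload
# schema are hypothetical, and a real unit would queue and retry entries on network failure.
import json
import urllib.request
from datetime import datetime, timezone

def log_detection(server_url: str, loco_id: str, obj_name: str,
                  confidence: float, lat: float, lon: float) -> None:
    payload = {
        "locomotive_id": loco_id,
        "object": obj_name,
        "confidence": confidence,
        "latitude": lat,
        "longitude": lon,
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
    }
    req = urllib.request.Request(
        server_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        resp.read()

# Example call (hypothetical endpoint):
# log_detection("https://central-server.example/api/dri-log", "LOCO-1234",
#               "elephant", 0.92, 26.92, 88.77)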
[0151] Another object/advantage of the invention is to provide a mechanism to update/upgrade the system as and when required.
[0152] Another object/advantage of the invention is to train the system to provide different projections (360-degree view) of the objects with respect to the vision sensor's position, for example: moving towards, moving away, moving broadside, etc.
[0153] Another object/advantage of the invention is to automatically find the actual distance of the recognized and identified object.
[0154] Another object/advantage of the invention is to identify multiple objects and simultaneously monitor and display their recognition and identification along with distance marking.
[0155] Another object/advantage of the invention is to mark (window) the recognized and identified object with “name” and “confidence value” in accordance with the model training.
[0156] Another object/advantage of the invention is to change the blinking rate of the visual signal in accordance with the distance of detection and identification. For example, if the object is at a safe distance the blink can be slow, if the distance is moderately safe the blink rate can be medium, and if the distance poses a potential danger of accident the blinking can be very fast.
[0157] Another object/advantage of the invention is to provide different colour visual indicators in accordance with the danger level, for example green for low danger, yellow for moderate danger and red for high danger.
[0158] Another object/advantage of the invention is to change the aural alarm frequency in accordance with the visual signal indicator. The alarm can sound from normal (smooth) to high alert mode (shrill) in accordance with the observed distance from the object.
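By way of illustration only, a possible mapping of the detected-object distance to blink rate, indicator colour and alarm tone is sketched below; the distance bands, blink rates, colours and tone frequencies are hypothetical example values and do not limit the invention:

# Illustrative mapping of object distance to visual/aural alert behaviour;
# all distance bands, blink rates, colours and tone frequencies are example values only.
def alert_profile(distance_m: float) -> dict:
    if distance_m > 1000.0:  # safe distance: slow green blink, smooth tone
        return {"colour": "green", "blink_hz": 0.5, "tone_hz": 500}
    if distance_m > 400.0:   # moderately safe: medium yellow blink
        return {"colour": "yellow", "blink_hz": 2.0, "tone_hz": 1000}
    return {"colour": "red", "blink_hz": 5.0, "tone_hz": 2000}  # potential danger: fast, shrill

print(alert_profile(1500.0))
print(alert_profile(250.0))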
[0159] Another object/advantage of the invention is to provide a mechanism for applying different colour palettes to the grayscale video of the thermal vision feed for clarity of visualization.
[0160] Another object/advantage of the invention is to provide a pan system for the vision sensor assembly, along with the distance sensors, to maintain the field-of-view of the vision sensors bore-sighted to the tracks in case of curves, so as to avoid vision dead zones.
[0161] Another object/advantage of the invention is to provide a visual colour bar on the display screen, referred to as a danger/collision bar, having colour levels in accordance with the degree of danger, correlating the speed of the locomotive and the distance of the object in the track vicinity.
[0162] Another object/advantage of the invention is to create a Network Management System (NMS) for centralized control and distribution of the Vision Intelligence amongst the locomotives ganged in a particular region, path or track.
[0163] Another object/advantage of the invention is to provide a system with vibration sensors and/or accelerometers, alerting the locomotive driver beforehand in case the system needs mechanical attention for preventive maintenance.
[0164] Another object/advantage of the invention is to provide a system of mountings of Vision Sensors along with Distance Sensor to carry out harmonization/bore sighting.
[0165] Another object/advantage of the invention is to provide POST (Power on Self-Test) for the system every time it is switched on. The POST reports will be saved on the system memory unit.
[0166] Another object/advantage of the invention is to continuously monitor the health of all units of the system and display their health status continuously on the display Unit.
[0167] Another object/advantage of the invention is to have a wiper system to automatically clean the lenses of the Vision System whenever the system's imagery analysis finds blurring of images/video due to dirt and debris.
[0168] Another object/advantage of the invention is to have the complete system formed in a modular structure for the purpose of maintenance, repairing and replacements, when needed.
[0169] Another object/advantage of the invention is to have an anti-theft, anti-vandalization system for the safety and security of equipment mounted inside/outside of the locomotive (mechanical and electrical safety).
[0170] Another object/advantage of the invention is to provide automatic brightness control of the display depending on the ambient light conditions of the locomotive cabin.
[0171] Another object/advantage of the invention is to provide Automatic Gain Control for each of the Vision Sensors for better video production.
[0172] Another object/advantage of the invention is to provide mechanism for Brightness and Contrast control for each Vision Sensor.
[0173] Another object/advantage of the invention is to provide a mechanism for Auto/Manual transition of White-Hot and Black-Hot imagery in case of Thermal Vision Sensor.
[0174] Another object/advantage of the invention is to provide a configurable Text and Image annotation on the Video Screen (Monitor).
[0175] Another object/advantage of the invention is to provide a proper compression mechanism for video recording, reducing the storage space without compromising the quality of the video.
[0176] Another object/advantage of the invention is to store the snapshot of DRI objects with Geotagging, Date and Time stamp.
[0177] While the present invention has been described with respect to certain embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.
CLAIMS:
WE CLAIM

1. A system (100) for detection, recognition, and/or identification of one or more objects, the system comprising:
- one or more first sensors (102a) disposed on a locomotive/train, the one or more first sensors (102a) configured to capture in real time at least one of an image and a video of an environment including a railway track and a pre-defined area surrounding the railway track;
- a control unit (104), the control unit (104) communicatively coupled to the one or more first sensors (102a) and configured to detect, recognize and/or identify one or more objects in the environment from the captured image and/or video;
- one or more second sensors (102d) disposed on the locomotive/train, the one or more second sensors (102d) configured to detect a distance of the one or more detected, recognized and/or identified object from the locomotive/train in real time; and
- the control unit (104) being communicatively coupled to the one or more second sensors to perform one or more pre-defined actions/operations upon determination of the distance of the locomotive/train from the one or more detected, recognized and/or identified object being less than one or more pre-defined distances.
2. The system (100) as claimed in claim 1, wherein the one or more first sensors (102a) include at least one of a day vision sensor, a thermal vision sensor, a fog vision sensor and a light sensor and the one or more second sensors (102d) include at least one of a RADAR sensor and a LIDAR sensor.
3. The system (100) as claimed in claim 1, wherein the control unit (104) is configured to perform at least one of: dynamically adjust calibration of the one or more first sensors (102a) including brightness, contrast, thermal palette, and auto gain control based on ambient and environmental conditions and dynamically switch between the plurality of the first sensors (102a) based on ambient and environmental conditions.
4. The system (100) as claimed in claim 1, wherein the control unit (104) is configured to couple with one or more display units (106) disposed on the locomotive/train, the one or more display units (106) being configured to perform at least one of: display images of detected, recognized and/or identified objects; display live video feed of selected one or more first sensors (102a); display Picture-in-Picture video feeds; display GPS information; display current date and time; display speed of the locomotive; display distance of detected, recognized and/or identified object from the locomotive/train; display configuration setting information; display health of selected components of the system; and display virtual visual color bar/indicator to indicate danger level.
5. The system (100) as claimed in claim 1, wherein the control unit (104) being configured to couple with one or more alarm units (110) disposed on the locomotive/train, the one or more alarm units (110) being activated by the control unit (104) to generate an alarm upon determination of a distance of the locomotive/train from the one or more objects being less than a first pre-defined distance.
6. The system (100) as claimed in claim 5, wherein a frequency of generation of alarm by the one or more alarm units (110) being inversely proportional to the distance between the detected, recognized and/or identified object and the locomotive/train.
7. The system (100) as claimed in claim 6, wherein the control unit (104) being configured to vary blink rate, colour code, and audio frequency of the one or more alarm units (110) based on the level of danger and proximity to the detected, recognized and/or identified object.
8. The system (100) as claimed in claim 1, wherein the control unit (104) being configured to couple with one or more actuators (112) disposed in the locomotive/train, the one or more actuators (112) being actuated by the control unit (104) to control speed and/or braking operations of the locomotive/train upon determination of a distance of the locomotive/train from the one or more objects being less than a second pre-defined distance.
9. The system (100) as claimed in claim 8, wherein the control unit (104) comprises a collision forecast unit configured to dynamically compute a braking threshold distance for collision avoidance.
10. The system (100) as claimed in claim 8, wherein the control unit (104) being configured to automatically or manually control the one or more actuators (112), the one or more actuators (112) being manually controlled by the control unit (104) upon receiving one or more manual inputs from a railway personnel.
11. The system (100) as claimed in claim 5 and claim 8, wherein the second pre-defined distance being less than the first pre-defined distance.
12. The system (100) as claimed in claim 1, wherein the control unit (104) being configured to communicatively couple with at least one of: an acknowledgement unit (108), an input unit (120), a wiper control unit (116), a communication unit/module and an external interface (120).
13. The system as claimed in claim 12, wherein:
- the acknowledgement unit (108) being configured to receive one or more first inputs from the railway personnel of the locomotive/train upon detection, recognition and/or identification of the object and the control unit (104), upon receiving the one or more first inputs from the acknowledgement unit (108), configured to log the received input with geolocation, date and time stamp for future reference and analysis;
- the input unit (104) being configured to receive one or more second inputs from the railway personnel of the locomotive/train for operation of the locomotive/train;
- the wiper control unit (116) being configured to clean the one or more first sensors (102a) and/or one or more second sensors (102d);
- the communication unit being configured to receive information from the control unit (104) and/or transmit information to the control unit (104); and
- the external interface (120) being configured for maintenance, data backup, configuration, OBD, external video-feeds to view on separate viewing devices, updating and upgrading system software.
14. The system (100) as claimed in claim 1, wherein the one or more first sensors (102a) being disposed on a pan and tilt assembly (122), the pan and tilt assembly (122) being controlled by the control unit (104) to adjust alignment of the one or more first sensors (102a) to maintain field-of-view along curved tracks, ensuring minimal blind zones.
15. The system (100) as claimed in claim 1, wherein the control unit (104) comprises a decision-making unit configured to determine a certainty/confidence percentage of a detected, recognized and/or identified object.
16. The system (100) as claimed in claim 1, wherein the control unit (104) being configured to control changes in speed of the locomotive/train based on historical recorded data and regulatory geofencing zones.
17. The system (100) as claimed in claim 1, wherein the one or more first sensors (102a) being mounted in modular, magnetically attachable pods with vibration isolation, allowing quick installation and removal without violating the maximum moving dimensions of the locomotive/train.
18. A method for detection, recognition, and/or identification of one or more objects, the method comprising:
- capturing, by one or more first sensors (102a) disposed on a locomotive/train, at least one of an image and a video of an environment including a railway track and a pre-defined area surrounding the railway track in real time;
- detecting, recognizing and/or identifying, by a control unit (104) communicatively coupled to the one or more first sensors (102a), one or more objects in the environment from the captured image and/or video;
- detecting, by one or more second sensors (102d) disposed on the locomotive/train, a distance of the one or more detected, recognized and/or identified object from the locomotive/train in real time; and
- performing, by the control unit (104) communicatively coupled to the one or more second sensors (102d), one or more pre-defined actions upon determination of the distance of the locomotive/train from the one or more detected, recognized and/or identified object being less than one or more pre-defined distances.
19. The method as claimed in claim 18, wherein the one or more first sensors (102a) include at least one of a day vision sensor, a thermal vision sensor, a fog vision sensor and a light sensor and the one or more second sensors (102d) include at least one of a RADAR sensor and a LIDAR sensor.
20. The method as claimed in claim 18, comprising:
dynamically adjusting, by the control unit (104), calibration of the one or more first sensors (102a) including brightness, contrast, thermal palette, and auto gain control based on ambient and environmental conditions; and/or
dynamically switching, by the control unit (104), between the plurality of the first sensors (102a) based on ambient and environmental conditions.
21. The method as claimed in claim 18, comprising:
displaying, by a display unit (106) coupled with the control unit (104), images of detected, recognized and/or identified objects;
displaying, by the display unit (106), live video feed of selected one or more first sensors (102a);
displaying, by the display unit (106), Picture-in-Picture video feeds;
displaying, by the display unit (106), GPS information;
displaying, by the display unit (106), current date and time;
displaying, by the display unit (106), speed of the train/locomotive;
displaying, by the display unit (106), distance of detected, recognized and/or identified object from the locomotive/train;
displaying, by the display unit (106), configuration setting information;
displaying, by the display unit (106), health of selected components of the system; and/or
displaying, by the display unit (106), virtual visual color bar/indicator to indicate danger level.
22. The method as claimed in claim 18, comprising:
activating, by the control unit (104), one or more alarm units (110) disposed on the locomotive/train to generate an alarm upon determination of the distance of the locomotive/train from the one or more objects being less than a first pre-defined distance.
23. The method as claimed in claim 22, wherein a frequency of generation of alarm by the one or more alarm units (110) being inversely proportional to the distance between the detected, recognized and/or identified object and the locomotive/train.
24. The method as claimed in claim 23, wherein the control unit (104) being configured to vary blink rate, colour code, and audio frequency of the one or more alarm units (110) based on the level of danger and proximity to the detected, recognized and/or identified object.
25. The method as claimed in claim 18, comprising:
actuating, by the control unit (104), one or more actuators (112) to control speed and/or braking operations of the locomotive/train upon determination of the distance of the locomotive/train from the one or more objects being less than a second pre-defined distance.
26. The method as claimed in claim 18, comprising:
dynamically computing, by a collision forecast unit of the control unit (104), a braking threshold distance for collision avoidance.
27. The method as claimed in claim 25, comprising:
automatically or manually controlling, by the control unit (104), one or more actuators (112).
28. The method as claimed in claim 18, comprising:
- receiving one or more first inputs, by an acknowledgement unit (108) communicatively coupled to the control unit (104), upon detection, recognition and/or identification of the one or more objects and, upon receiving the one or more inputs from the acknowledgement unit (108), logging the received input with geolocation, date and time stamp for future reference and analysis;
- receiving, by an input unit (120) communicatively coupled to the control unit (104), one or more second inputs from the railway personnel of the locomotive/train for operation of the locomotive/train;
- cleaning, by the wiper control unit (116) communicatively coupled to the control unit (104), one or more first sensors (102a) and/or one or more second sensors (102d);
- receiving and/or transmitting, by the communication unit/module communicatively coupled to the control unit (104), information from and/or to the control unit (104); and/or
- performing, by an external interface communicatively coupled to the control unit (104), maintenance, data backup, configuration, OBD, provision of external video-feeds for viewing on separate viewing devices, and updating and upgrading of system software.
29. The method as claimed in claim 18, comprising:
- adjusting, by a pan and tilt assembly (122) controlled by the control unit (104), alignment of the one or more first sensors (102a) to maintain field-of-view along curved tracks, ensuring minimal blind zones.
30. The method as claimed in claim 18, comprising:
determining, by a decision-making unit of the control unit (104), a certainty/confidence percentage of a detected, recognized and/or identified object.
31. The method as claimed in claim 18, comprising:
controlling, by the control unit (104), changes in speed of the locomotive/train based on historical recorded data and regulatory geofencing zones.

Dated this the 11th day of April 2025.

Saravanan Gopalan
Registered Patent Agent
(INPA – 3249)

Documents

Application Documents

# Name Date
1 202411029796-PROVISIONAL SPECIFICATION [12-04-2024(online)].pdf 2024-04-12
2 202411029796-FORM 1 [12-04-2024(online)].pdf 2024-04-12
3 202411029796-DRAWINGS [12-04-2024(online)].pdf 2024-04-12
4 202411029796-FORM-5 [11-04-2025(online)].pdf 2025-04-11
5 202411029796-FORM FOR SMALL ENTITY [11-04-2025(online)].pdf 2025-04-11
6 202411029796-FORM 18 [11-04-2025(online)].pdf 2025-04-11
7 202411029796-EVIDENCE FOR REGISTRATION UNDER SSI [11-04-2025(online)].pdf 2025-04-11
8 202411029796-ENDORSEMENT BY INVENTORS [11-04-2025(online)].pdf 2025-04-11
9 202411029796-DRAWING [11-04-2025(online)].pdf 2025-04-11
10 202411029796-CORRESPONDENCE-OTHERS [11-04-2025(online)].pdf 2025-04-11
11 202411029796-COMPLETE SPECIFICATION [11-04-2025(online)].pdf 2025-04-11