Abstract: Disclosed subject matter relates generally to wagons and tipplers, and particularly discloses a method, device and system for monitoring coupling of uncoupled wagons in real-time. The method includes receiving, by a monitoring unit, video frames related to rail-bound wagons captured by image capturing devices. The monitoring unit may detect presence of an uncoupled wagon with a coupler and a former wagon in at least one of the plurality of video frames based on a trained first AI technique. The monitoring unit then tracks the distance between the uncoupled wagon with the coupler and the former wagon, as the uncoupled wagon moves towards the former wagon, and detects the coupling status of the uncoupled wagon and the former wagon based on the tracked distance. The present disclosure provides the advantage of determining unsuccessful coupling of uncoupled wagons and providing suitable alerts for operators to be prepared in advance for rectifying the coupling.
TECHNICAL FIELD
The present subject matter relates generally to wagons and tipplers. Particularly, but not exclusively, embodiments of the disclosure disclose a method, device and system for monitoring coupling of uncoupled wagons in real-time.
BACKGROUND
Generally, loading and unloading of different materials such as raw materials may include wagons, specifically rail-bound wagons, carrying the load of raw materials from sources such as domestic mines and ports, and wagon tipplers tippling the loaded wagons for unloading the raw materials. This process of loading and unloading the wagons may include sub-routine tasks such as rake placement, decoupling, indexing, tippling, charging, coupling and rake-out. This process has been illustrated with the help of FIG.1 (prior art). Some sub-routines such as unloading of rakes (in other words, rake-out) may involve multiple interventions of operators on rail lines, between wagons, and platforms. Such situations involving intervention of the operators may be life-threatening and hence call for multiple safety measures and checks in the form of manual Standard Operating Procedure (SOP) practice to avoid any unfortunate or hazardous situation. However, often due to negligence or accidents, operators lose their lives or may be subjected to serious injury while performing the sub-routine tasks. Currently, existing techniques do not teach monitoring and providing a safe working environment to the operators in a tippler and wagon movement region.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the disclosure and should not be taken as an acknowledgement or any form of suggestion that this information forms prior art already known to a person skilled in the art.
SUMMARY
One or more shortcomings of the prior art may be overcome, and additional advantages may be provided through the present disclosure. Additional features and advantages may be realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
In one non-limiting embodiment of the disclosure, a method of monitoring coupling of uncoupled wagons in real-time is disclosed. The method comprises receiving, by a monitoring unit associated with one or more image capturing devices, a plurality of video frames related to rail-bound wagons in a wagon movement area captured by the one or more image capturing devices.
Thereafter, the method comprises detecting presence of an uncoupled wagon with a coupler and a former wagon in at least one of the plurality of video frames based on a trained first artificial intelligence technique. Subsequently, the method comprises tracking distance between the uncoupled wagon with the coupler and the former wagon, as the uncoupled wagon moves towards the former wagon. Finally, the method comprises detecting the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance.
In an embodiment of the disclosure, tracking the distance between the uncoupled wagon with coupler and the former wagon comprises detecting position of edges of the uncoupled wagon with the coupler and the former wagon, and tracking the distance based on relative change in the position of edges of the uncoupled wagon with coupler and the former wagon, as the uncoupled wagon moves towards the former wagon for coupling.
In an embodiment of the disclosure, the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance is detected to be (a) successful when the distance reaches a pre-set minimum threshold value and remains constant within the pre-set minimum threshold value in each of a plurality of subsequent video frames, or (b) unsuccessful when the distance reaches the pre-set minimum threshold value and increases thereafter in the plurality of subsequent video frames.
In an embodiment of the disclosure, the method includes providing alerts to one or more personnel, via a server associated with the monitoring unit, to manually couple the uncoupled wagon with the coupler and the former wagon, when the coupling status is determined to be unsuccessful, wherein the manual coupling is performed by the one or more personnel without violating predefined Standard Operating Procedure (SOP).
In an embodiment of the disclosure, each of the plurality of video frames comprising both the uncoupled wagon with coupler and the former wagon is subjected to video frame perspective transformation.
In an embodiment of the disclosure, the method includes receiving the plurality of video frames from the one or more image capturing devices, and detecting presence of one or more subjects in at least one of the plurality of video frames using a trained second artificial intelligence technique. Thereafter, the method includes determining whether the presence of the one or more subjects is authorized as per SOP, and providing alerts to the one or more subjects and one or more security personnel, via a server associated with monitoring unit, to step out of at
least one of the wagon movement area and the wagon tippler area, when the presence of the one or more subjects is determined to be unauthorized as per the SOP.
In an embodiment of the disclosure, the SOP includes a predefined list of authorized scenarios comprising at least one of rack placement, emergency stop of tippler equipment, maintenance mode of equipment and complete power shutdown.
In another non-limiting embodiment of the disclosure, a monitoring unit for monitoring coupling of uncoupled wagons in real-time is disclosed. The monitoring unit includes a controller and a memory communicatively coupled to the controller. The memory stores the controller-executable instructions, which, on execution, cause the controller to receive a plurality of video frames related to rail-bound wagons in a wagon movement area captured by one or more image capturing devices associated with the monitoring unit. Thereafter, the controller detects presence of an uncoupled wagon with a coupler and a former wagon in at least one of the plurality of video frames based on a trained first artificial intelligence technique. Subsequently, the controller tracks the distance between the uncoupled wagon with the coupler and the former wagon, as the uncoupled wagon moves towards the former wagon. Finally, the controller detects the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance.
In yet another non-limiting embodiment of the disclosure, a system for monitoring alignment of uncoupled wagons in real-time is disclosed. The system includes one or more image capturing devices, a monitoring unit associated with the one or more image capturing devices, a server associated with the monitoring unit, and one or more computing devices associated with the server. The one or more image capturing devices are configured to capture a plurality of video frames related to a wagon movement area and a wagon tippler area. Further, the monitoring unit is configured to perform the method of monitoring coupling of uncoupled wagons in real-time. Furthermore, the server is configured to receive data related to alerts from the monitoring unit, and generate one or more alerts for one or more subjects and one or more security personnel. Finally, the one or more computing devices are configured to receive and display the one or more alerts received from the server.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DIAGRAMS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
FIG.1 (prior art) shows an exemplary conventional process of loading and unloading using a wagon tippler and rail-bound wagons;
FIG.2A shows an exemplary system for monitoring coupling of uncoupled wagons in real-time in accordance with some embodiments of the present disclosure;
FIG.2B shows an exemplary set up of a monitoring unit in accordance with some embodiments of the present disclosure;
FIG.2C shows a detailed block diagram of the monitoring unit for monitoring coupling of uncoupled wagons in real-time in accordance with some embodiments of the present disclosure;
FIG.2D(1) – FIG.2D(3) show exemplary training images of the uncoupled wagons and couplers used for training the first artificial intelligence technique in accordance with some embodiments of the present disclosure.
FIG.2E shows an exemplary video frame comprising the uncoupled wagon and the former wagon in a video frame, after being transformed based on perspective transformation in accordance with some embodiments of the present disclosure.
FIG.2F shows an exemplary illustration of detecting edges of the uncoupled wagon and the former wagon in a video frame, in accordance with some embodiments of the present disclosure.
FIG.2G (1) and FIG.2G (2) show exemplary illustration of tracking distance between the uncoupled wagon and the former wagon in accordance with some embodiments of the present disclosure.
FIG.2H (1) shows an exemplary graphical illustration of constant distance between the uncoupled wagon and the former wagon due to successful coupling status in accordance with some embodiments of the present disclosure.
FIG.2H (2) shows an exemplary graphical illustration of varying distance between the uncoupled wagon and the former wagon due to unsuccessful coupling status in accordance with some embodiments of the present disclosure.
FIG.3A is a flowchart illustrating a method of monitoring coupling of uncoupled wagons in real-time in accordance with some embodiments of the present disclosure;
FIG.3B is a flowchart illustrating a method of detecting presence of one or more subjects parallelly while monitoring coupling of uncoupled wagons in real-time in accordance with some embodiments of the present disclosure; and
FIG.4 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, “includes” or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device or method that includes a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
Disclosed herein are a method, device and a system for monitoring coupling of uncoupled wagons in real-time. In some embodiments, the device that implements the method disclosed in the present disclosure may be a monitoring unit. The monitoring unit may be associated with one or more image capturing devices. As an example, the one or more image capturing devices may be a camera, handycam, video recorder and the like. The monitoring unit may receive a plurality of video frames related to rail-bound wagons in a wagon movement area captured by the one or more image capturing devices. In some embodiments, the monitoring unit may extract at least one of a plurality of image frames from the plurality of video frames based on a trained artificial intelligence technique. In some embodiments, the monitoring unit may detect presence of an uncoupled wagon with a coupler and a former wagon in at least one of the plurality of video frames based on a trained first artificial intelligence technique. In some embodiments, both the uncoupled wagon with a coupler and the former wagon may be detected in a single frame. Subsequently, the monitoring unit may track distance between the uncoupled wagon with the coupler and the former wagon, as the uncoupled wagon moves towards the former wagon. In some embodiments, the former wagon may be a wagon which is available for coupling with an uncoupled wagon. For instance, if wagon 2 is an uncoupled wagon arriving for coupling, then wagon 1, which is available to be coupled with wagon 2, may be referred to as the former wagon. Similarly, wagon 2 would then act as the former wagon for another uncoupled wagon 3 which is arriving for coupling. In some embodiments, for successful coupling, the status of the coupler of the uncoupled wagon should be opposite to the status
of the coupler of a corresponding former wagon. Consider a scenario where wagon 1 is the former wagon available for coupling with an uncoupled wagon 2. In such a scenario, the status of the coupler of the former wagon and the coupler of the uncoupled wagon 2 should be opposite, which means that the status of one coupler should be “open” and the status of the other coupler should be “closed”. In this scenario, consider that the status of the coupler of the uncoupled wagon 2 arriving for coupling is “open” and the status of the coupler of the former wagon is “closed”. Therefore, since the statuses are opposite, the coupler of the uncoupled wagon 2 is said to be aligned for coupling with the corresponding former wagon. In some embodiments, upon tracking the distance between the uncoupled wagon with the coupler and the former wagon, as the uncoupled wagon moves towards the former wagon, the monitoring unit may detect the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance.
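By way of a purely illustrative, non-limiting sketch of this alignment rule (not part of the claimed subject matter), the check that the coupler statuses of the arriving uncoupled wagon and its corresponding former wagon are opposite may be expressed as follows; the status strings are assumed labels produced by the trained first artificial intelligence technique:

```python
# Illustrative sketch only: checks whether the coupler statuses of an arriving
# uncoupled wagon and its corresponding former wagon are opposite ("open" vs.
# "closed"), i.e. aligned for coupling. The status labels are assumed outputs
# of the trained first artificial intelligence technique.

def couplers_aligned(uncoupled_status: str, former_status: str) -> bool:
    """Return True when exactly one of the two couplers is open."""
    valid = {"open", "closed"}
    if uncoupled_status not in valid or former_status not in valid:
        raise ValueError("coupler status must be 'open' or 'closed'")
    return uncoupled_status != former_status

# Example: uncoupled wagon 2 arrives with an open coupler, former wagon 1 is closed.
assert couplers_aligned("open", "closed") is True
assert couplers_aligned("open", "open") is False
```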
In some embodiments, the present disclosure provides an Artificial Intelligence (AI) based technique, which performs analysis of at least one of the plurality of image frames based on its learning from the training data, to determine whether the uncoupled wagon and the former wagon are properly aligned or not, without any manual interference. Therefore, when an unsuccessful coupling is detected, the monitoring unit may notify such coupling failure status to a server associated with the monitoring unit. The server may thereafter provide alerts to one or more personnel to manually perform coupling of the coupler of the one or more uncoupled wagons. Such determination of unsuccessful coupling and the notifying process may enable one or more personnel such as ground crew to be aware of the unsuccessful coupling condition beforehand, which otherwise may not be visible. This in turn enables the one or more personnel to be prepared for the manual coupling procedure.
In some embodiments, the manual coupling is performed by the one or more personnel without violating predefined Standard Operating Procedure (SOP), thereby ensuring safety of the one or more personnel while performing the coupling operation manually. For instance, as per the SOP, the movement of the one or more uncoupled wagons and the wagon tippler may be halted for manual coupling.
In some embodiments, the present disclosure provides a feature wherein, when the presence of one or more subjects is detected in the wagon movement area and the wagon tippler area, it is determined whether such presence of the one or more subjects is authorized according to the SOP. In the scenario when the presence of the one or more subjects is unauthorized, the one or more
subjects may be alerted immediately, thus eliminating fatal accidents that may occur in the wagon movement area and the wagon tippler area.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
FIG.2A shows an exemplary system for monitoring coupling of uncoupled wagons in real-time in accordance with some embodiments of the present disclosure.
The exemplary system 200 comprises an image capturing device 201₁ to an image capturing device 201ₙ (also referred to as one or more image capturing devices 201), a monitoring unit 203, a server 205 and a computing device 207₁ to a computing device 207ₙ (also referred to as one or more computing devices 207). The exemplary system 200 is implemented in an environment such as a wagon movement area and a wagon tippler area. For instance, the wagons discussed in the present disclosure may be rail-bound wagons. However, this should not be construed as a limitation of the present disclosure since the aspects of the present disclosure may be applicable to other types of wagons that perform similar functionalities related to coupling and decoupling the wagons for loading and unloading material carried in the wagons. For instance, the wagon movement area may be an area comprising a rail track for the movement of the wagons upon loading the material into the wagons and unloading the material from the wagons. For instance, the wagon tippler area may be the area comprising a tippler equipment which tipples wagons arriving at the tippler to unload material loaded in the wagons. In some embodiments, the one or more image capturing devices 201 may be configured to capture a plurality of video frames related to the wagon movement area and the wagon tippler area. As an example, the one or more image capturing devices 201 may include, but not limited to, a camera, a handycam, a
video recorder, and a mobile phone comprising an embedded camera. The monitoring unit 203 associated with the one or more image capturing devices 201 may receive the plurality of video frames related to the wagon movement area and the wagon tippler area from the one or more image capturing devices 201. In some embodiments, the monitoring unit 203 may be configured as an embedded system in the one or more image capturing devices 201. In some embodiments, the monitoring unit 203 may be associated with more than one image capturing device 201 as shown in the FIG.2A. In alternative embodiments, one monitoring unit 203 may be associated with one image capturing device 201 (not shown in the FIG.2A). The monitoring unit 203 may extract a plurality of video frames based on a trained first Artificial Intelligence (AI) technique. In some embodiments, the monitoring unit 203 may monitor coupling of one or more uncoupled wagons (not shown in FIG.2A) in real-time based on analysis of at least one of the plurality of video frames. In this implementation, uncoupled wagons may refer to the wagons that are moving independently on the rail track without being coupled with other wagons moving on the rail track to form a chain of wagons. In some embodiments, the monitoring unit 203 may monitor coupling of the uncoupled wagons by tracking distance between the uncoupled wagon and a former wagon when the uncoupled wagon moves towards the former wagon for coupling.
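By way of a non-limiting illustration only, frame ingestion by such a monitoring unit could resemble the following OpenCV-based sketch; the stream URL and the frame-sampling interval below are assumptions for illustration and are not prescribed by the disclosure:

```python
# Illustrative sketch: reading video frames from an image capturing device.
# The RTSP URL and the frame-sampling interval are hypothetical values.
import cv2

STREAM_URL = "rtsp://camera-201-1/stream"  # hypothetical camera endpoint
SAMPLE_EVERY_N_FRAMES = 5                  # assumed sampling rate

def frame_generator(url=STREAM_URL, step=SAMPLE_EVERY_N_FRAMES):
    """Yield every `step`-th frame from the capture device."""
    capture = cv2.VideoCapture(url)
    index = 0
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        if index % step == 0:
            yield frame
        index += 1
    capture.release()
```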
While monitoring the coupling of uncoupled wagons in real-time, when the monitoring unit 203 determines an unsuccessful condition for coupling, the monitoring unit 203 may provide data related to such determination to the server 205 associated with the monitoring unit 203. In some embodiments, the monitoring unit 203 and the server 205 may communicate via a wired communication network, a wireless communication network or a combination of both wired and wireless communication networks. The server 205 may be configured to receive data related to alerts from the monitoring unit 203, and generate one or more alerts for one or more subjects and one or more security personnel. As an example, the one or more security personnel may include operators, security, technicians and the like in the wagon movement area and the wagon tippler area. In one implementation, the one or more alerts may be provided to one or more computing devices 207 of the one or more security personnel and one or more subjects. As an example, the one or more computing devices 207 may include, but not limited to, a mobile phone, a tablet phone, a laptop, a desktop, and the like. In another implementation, the one or more alerts may be provided via sirens, light indications and the like in the wagon tippler area and the wagon movement area. As an example, the server 205 may provide alerts to manually perform coupling of the uncoupled wagon by properly fixing the coupler of the uncoupled
wagon with the coupler of the former wagon. In some alternative embodiments, the monitoring unit 203 may be facilitated with the capability of alerting directly, without the requirement of a server 205 for alerting.
In some embodiments, the monitoring unit 203 may also detect presence of one or more subjects in at least one of the plurality of video frames using a trained second artificial intelligence technique, and may determine whether such presence of the one or more subjects is authorized as per SOP. When the presence of the one or more subjects is determined to be unauthorized as per the SOP, the monitoring unit 203 may send alerts to at least one of the one or more subjects or the one or more security personnel, via the server 205.
FIG.2B shows an exemplary set up of a monitoring unit in accordance with some embodiments of the present disclosure.
The implementation shown in the FIG.2B indicates the monitoring unit 203 configured/embedded in an image capturing device 201. However, this should not be construed as a limitation of the present disclosure, as the monitoring unit 203 may be associated with the image capturing device 201 and need not be explicitly embedded within the image capturing device 201. The monitoring unit 203 may include an Advanced RISC Machine (ARM) based controller 209 (also referred to as controller 209 in the present disclosure), a network module 211, and an Interface Card (IC) 213 such as a camera IC. In some embodiments, the monitoring unit 203 may be interfaced with the image capturing device 201 using the IC 213. As an example, the IC 213 may communicate with the image capturing device 201 via a Low-Voltage Differential Signaling (LVDS) interface. Further, the IC 213 may be communicatively connected to the ARM based controller 209 via an interface such as a Universal Serial Bus (USB) interface. The ARM based controller 209 may further be associated with the network module 211, such as a Long Term Evolution (LTE) module, to communicate data from the monitoring unit 203 to a server 205 associated with the monitoring unit 203. In some embodiments, the ARM based controller 209 may include, for example, a Graphical Processing Unit (GPU) that has a parallel processing capability suitable for fast computing and deep learning required for the first and second trained AI techniques.
In some embodiments, the image capturing device 201 embedded with the monitoring unit 203 may include an enclosure 215 covering the image capturing device 201 as shown in the FIG.2B. In some embodiments, when there is more than one image capturing device, each of the one or more image capturing devices 201 may be provided with the enclosure 215. In some
embodiments, the enclosure 215 may be used for preventing dust and water ingress towards the image capturing device 201. In some embodiments, the enclosure 215 may further be configured with an electronic cooling device 217 as shown in FIG.2B to maintain a normal temperature inside the enclosure 215. In some embodiments, the electronic cooling device 217 may be an electrically operated Peltier cooler that performs active cooling to maintain the normal temperature inside the enclosure 215. This in turn ensures efficient working of the monitoring unit 203 and the image capturing device 201 embedding the monitoring unit 203. The electronic cooling device 217 may operate, for example, at a 24-volt power supply. Hence, there may be no requirement for separate utility lines such as air and water lines for cooling of the embedded hardware, enclosure layout, and power arrangement. In some embodiments, the enclosure 215 may be required since the monitoring unit 203 is used, in the implementation disclosed in the present disclosure, in a harsh industrial environment which involves an enormous amount of dust, pollution and heat generation.
FIG.2C shows a detailed block diagram of the monitoring unit for monitoring coupling of uncoupled wagons in real-time in accordance with some embodiments of the present disclosure.
In some implementations, the monitoring unit 203 may include data 219 and modules 221 related to the monitoring unit 203. As an example, the data 219 is stored in a memory 223 associated with an ARM based controller 209 of the monitoring unit 203. In some embodiments, the data 219 may include video data 227, distance data 229, coupling status data 231, Standard Operating Procedure (SOP) data 233, training data 235, alert data 237, and other data 239.
In some embodiments, the data 219 may be stored in the memory 223 in the form of various data structures. Additionally, the data 219 can be organized using data models, such as relational or hierarchical data models. The other data 239 may be stored data, including temporary data and temporary files, generated by the modules 221 for performing the various functions of the ARM based controller 209.
In some embodiments, the data 219 stored in the memory 223 may be processed by the modules 221 of the ARM based controller 209. The modules 221 may be stored within the memory 223. In an example, the modules 221 communicatively coupled to the ARM based controller 209 may also be present outside the memory 223 as shown in FIG.2B and implemented as
hardware. As used herein, the term modules 221 may refer to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
In some embodiments, the modules 221 may include, for example, a receiving module 241, an uncoupled wagon detecting module 243, a distance tracking module 245, a coupling status detecting module 247, an alerting module 249, a subject detecting module 251 and other modules 253. The other modules 253 may be used to perform various miscellaneous functionalities of the ARM based controller 209. It will be appreciated that such aforementioned modules 221 may be represented as a single module or a combination of different modules.
In some embodiments, the receiving module 241 may receive the video data 227 from one or more image capturing devices 201 via an Input/Output (I/O) interface 225. The video data 227 may include a plurality of video frames related to rail-bound wagons in a wagon movement area captured by the one or more image capturing devices 201. As an example, the plurality of video frames may capture videos of the rail-bound wagons moving on a rail track with load and without load, and also videos of loading at the source and unloading at the tippler.
In some embodiments, the uncoupled wagon detecting module 243 may detect an uncoupled wagon with a coupler and a former wagon in at least one frame of the plurality of video frames based on a trained first artificial intelligence technique. In some embodiments, an uncoupled wagon with a coupler and a former wagon may both be detected in a single video frame. The first artificial intelligence technique may be trained for object detection, which in this case is an uncoupled wagon and a former wagon, by initially creating a database with multiple training images. In some embodiments, each of the training images includes the uncoupled wagon which is captured under different conditions of lighting, view point, clarity, resolution and the like. Exemplary training images are as shown in the FIG.2D (1) and FIG.2D (2). FIG.2D (1) is a training image that shows presence of an exemplary uncoupled wagon 244. In some of the training images, the uncoupled wagon may be completely visible and in some other training images, the uncoupled wagon may be partially visible. FIG.2D (2) shows training images comprising an exemplary coupler 242 under various lighting conditions. The training images may be annotated to indicate the presence of at least one of the uncoupled wagon, a coupler of the uncoupled wagon, and a status of the coupler of the uncoupled wagon, in the training
images. Further, FIG.2D (3) indicates an exemplary training video frame which comprises both an uncoupled wagon with a coupler and a former wagon in a single video frame. In the exemplary training video frame, reference number 244 may refer to an exemplary uncoupled wagon which is moving towards the exemplary former wagon 246 in the exemplary video frame. Thereafter, the annotated training video frames and images are fed as an input to a model, such as a Deep Neural Network (DNN) model associated with the first artificial intelligence technique. In some embodiments, the training video frames and images may be annotated manually by a subject matter expert. Upon receiving the annotated video frames and training images, the DNN model of the first artificial intelligence technique may analyse the training images to extract a plurality of features that enable the DNN model to infer the presence of the uncoupled wagon in the training images. The DNN model may refine the plurality of features and modify weights in one or more neural network layers to enhance the accuracy of the detection of the uncoupled wagon with a coupler and the former wagon in a single video frame, with analysis of a large number of training images. Similarly, the DNN model may self-learn features to detect additional aspects such as presence of the coupler of the uncoupled wagon, status of the coupler of the uncoupled wagon, status of the coupler of the former wagon and the like, in the annotated training images. Upon self-learning the features, the DNN model of the first artificial intelligence technique may be referred to, for ease, as the trained first artificial intelligence technique. In some embodiments, even when the trained first artificial intelligence technique is deployed for real-time detection in a real-world scenario, the DNN model of the trained first artificial intelligence technique continuously self-learns to provide accurate detection results. The data thus used for training the first artificial intelligence technique and self-learning of the trained first artificial intelligence technique may be stored as the training data 235.
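The disclosure does not fix a particular network architecture; as one hedged, non-limiting example, an off-the-shelf detector could be fine-tuned on the annotated wagon and coupler images roughly as sketched below, where the class labels and the hypothetical WagonDataset (yielding images and annotations in the torchvision detection format) are assumptions for illustration:

```python
# Illustrative sketch only: fine-tuning an off-the-shelf object detector on the
# annotated training images (uncoupled wagon, coupler, former wagon). The class
# names, the hypothetical WagonDataset, and the choice of Faster R-CNN are
# assumptions for illustration, not the claimed technique.
import torch
from torch.utils.data import DataLoader
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 4  # background + {uncoupled wagon, coupler, former wagon} (assumed labels)

def build_detector(num_classes=NUM_CLASSES):
    model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model

def train_one_epoch(model, dataset, device="cpu", lr=1e-4):
    loader = DataLoader(dataset, batch_size=2, shuffle=True,
                        collate_fn=lambda batch: tuple(zip(*batch)))
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    model.train()
    for images, targets in loader:
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
        losses = model(images, targets)  # detector returns a dict of losses in train mode
        loss = sum(losses.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```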
In some embodiments, each of the plurality of video frames comprising both the uncoupled wagon with the coupler and the former wagon is subjected to video frame perspective transformation. As an example, the exemplary video frame in FIG.2D (3) may be transformed into an exemplary video frame as shown in the FIG.2E. In the FIG.2D (3), which is a raw video frame, the video frame seems to be tilted or rotated due to the viewing angle of the one or more image capturing devices that captured the video frames. However, upon subjecting the raw video frame to perspective transformation, the video frame is tilted or rotated or aligned in order to showcase the content of the video frame clearly and also to ensure that the content of the video frame is suitable for coupling status detection, as shown in the FIG.2E. In some
embodiments, the perspective transformation may be performed using one or more predefined transformation techniques.
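As one hedged illustration of such a predefined transformation technique, a homography computed from four reference points in the raw frame could be applied with OpenCV, as sketched below; the corner coordinates and output size are placeholder values assumed for illustration:

```python
# Illustrative sketch: warping a raw, tilted video frame onto a rectified view
# so that the wagon edges lie along the image axes. The source corner points
# and output size are placeholder values; in practice they would be calibrated
# per camera.
import cv2
import numpy as np

def rectify_frame(frame, src_corners, out_size=(1280, 360)):
    """Apply a perspective (homography) transform to a raw frame."""
    width, height = out_size
    dst_corners = np.float32([[0, 0], [width, 0], [width, height], [0, height]])
    matrix = cv2.getPerspectiveTransform(np.float32(src_corners), dst_corners)
    return cv2.warpPerspective(frame, matrix, (width, height))

# Example (placeholder corner coordinates of the track region in the raw frame):
# rectified = rectify_frame(raw_frame, [(120, 80), (1150, 60), (1200, 640), (90, 660)])
```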
Further, the distance tracking module 245 may track the distance between the uncoupled wagon with the coupler and the former wagon, as the uncoupled wagon moves towards the former wagon. In some embodiments, the former wagon may be a wagon which is available to be coupled with an uncoupled wagon that arrives for coupling. For instance, as shown in the FIG.2F, the wagon 244 is an uncoupled wagon arriving for coupling and the wagon 246 is the former wagon available for coupling with the wagon 244. Similarly, the wagon 244 would then act as the former wagon available for coupling with another uncoupled wagon (not shown in the FIG.2F). Therefore, in this example, where the wagon 246 is the former wagon available to be coupled with the uncoupled wagon 244, the status of the coupler of the former wagon 246 and the status of the coupler of the uncoupled wagon 244 should be opposite. This means that, prior to coupling, while the uncoupled wagon is moving towards the former wagon for coupling, the couplers of the uncoupled wagon and the former wagon should be aligned, that is, in opposite statuses.
In some embodiments, as the uncoupled wagon moves towards the former wagon for coupling, as part of tracking the distance between the uncoupled wagon with coupler and the former wagon, the distance tracking module 245 may detect the position of edges of the uncoupled wagon with the coupler and the former wagon that are detected in the video frame. As an example, as shown in the FIG.2F, the reference numeral 248 indicates the edge of the exemplary uncoupled wagon 244 and the reference numeral 250 indicates the edge of the exemplary former wagon 246. Thereafter, the distance tracking module 245 may track the distance based on the relative change in the position of edges of the uncoupled wagon with coupler and the former wagon, as the uncoupled wagon moves towards the former wagon for coupling. For instance, as shown in the FIG.2G (1), when the exemplary uncoupled wagon 244 moves towards the exemplary former wagon 246, the distance between the edge 248 and the edge 250 varies relatively, which is tracked in the video frame and indicated via an arrow mark between the edges 248 and 250. In some embodiments, the trained first artificial intelligence technique may analyse the at least one of the plurality of video frames to detect the one or more changes which the DNN model of the first artificial intelligence technique had detected at the time of training, and accordingly track the relative change in the position of edges of the uncoupled wagon with coupler and the former wagon, when the uncoupled wagon with coupler moves towards the former wagon for coupling. In some embodiments, the detected edges of the uncoupled wagon and the
former wagon, and the tracked distance between the uncoupled wagon and the former wagon may be stored as the distance data 229.
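Purely as an illustration of this edge-based tracking, the gap between the facing edges of the two wagons could be measured from the detected bounding boxes in the rectified frame, as in the hedged sketch below; the box format (x1, y1, x2, y2), the assumption that the uncoupled wagon approaches from the left, and the pixel-to-metre scale are placeholders, not values prescribed by the disclosure:

```python
# Illustrative sketch: tracking the gap between the facing edges of the former
# wagon and the approaching uncoupled wagon from their detected bounding boxes
# (x1, y1, x2, y2) in the rectified frame. The approach direction and the
# pixel-to-metre scale below are placeholder assumptions.

PIXELS_PER_METRE = 50.0  # hypothetical calibration constant

def edge_gap_metres(uncoupled_box, former_box, scale=PIXELS_PER_METRE):
    """Distance between the leading edge of the uncoupled wagon and the
    trailing edge of the former wagon, in metres (negative means overlap)."""
    uncoupled_leading_edge = uncoupled_box[2]  # right edge of approaching wagon
    former_trailing_edge = former_box[0]       # left edge of former wagon
    return (former_trailing_edge - uncoupled_leading_edge) / scale

def track_gaps(detections_per_frame):
    """Build the per-frame distance series from detections of both wagons."""
    return [edge_gap_metres(d["uncoupled"], d["former"])
            for d in detections_per_frame
            if "uncoupled" in d and "former" in d]
```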
Further, the coupling status detecting module 247 may detect the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance. In some embodiments, to detect the coupling status, the coupling status detecting module 247 may compare the tracked distance between the uncoupled wagon with the coupler and the former wagon with a pre-set minimum threshold value. For instance, the coupling status detecting module 247 may detect the coupling status of the uncoupled wagon with coupler and the former wagon to be successful when the distance reaches a pre-set minimum threshold value as shown in the FIG.2G (2) and remains constant within the pre-set minimum threshold value in each of a plurality of subsequent video frames. In other words, when the coupler of the uncoupled wagon hits the coupler of the former wagon, the couplers that are in the aligned state get coupled with each other, thus rendering the distance between the uncoupled wagon and the former wagon equal to the pre-set minimum threshold value. When the couplers of the uncoupled wagon and the former wagon are coupled correctly, or in other words, successfully, then both the wagons would move in a single direction together or stand still together, which does not change the pre-set minimum threshold value which was reached due to coupling. Therefore, in each of the plurality of the subsequent video frames, the distance between the two wagons, i.e. the wagon that is referred as the uncoupled wagon and the former wagon, would remain constant. FIG.2H (1) shows a graphical illustration of the distance between the two wagons being constant, post the coupling point, i.e. the point after which the two wagons coupled with each other. In the graphical illustration, the X-axis indicates the movement of the wagon and the Y-axis indicates the distance between the wagons. The curve moving in a straight line in the graph indicates a constant distance equal to the pre-set minimum threshold value, due to the movement of the two wagons together towards a single direction.
In some other embodiments, the coupling status detecting module 247 may detect the coupling status of the uncoupled wagon with coupler and the former wagon to be unsuccessful when the distance between the uncoupled wagon with coupler and the former wagon reaches a pre-set minimum threshold value post coupling, and increases thereafter in the plurality of subsequent video frames. In other words, when the uncoupled wagon with the coupler hits the coupler of the former wagon, the couplers that are in aligned state get coupled with each other, thus rendering the distance between the uncoupled wagon and the former wagon to be equal to a
pre-set minimum threshold value. However, in some scenarios, due to weak coupling, the coupling may break and become unsuccessful. In some other scenarios, the couplers of the uncoupled wagon and the former wagon may not be aligned for coupling, due to which, though the uncoupled wagon moves towards the former wagon, the coupling may fail. When the couplers of the uncoupled wagon and the former wagon are not coupled or are coupled weakly, or in other words, unsuccessfully, then both the wagons would move in the same direction or in opposite directions, but not together. Therefore, the distance between the two wagons would not be constantly equal to the pre-set minimum threshold value, but would vary unpredictably. Therefore, in each of the plurality of the subsequent video frames, the distance between the two wagons, i.e. the wagon that is referred as the uncoupled wagon and the former wagon, would vary. FIG.2H (2) shows a graphical illustration of the distance between the two wagons varying or increasing, as the two wagons move away from each other due to unsuccessful coupling, post the coupling point, i.e. the point after which the two wagons coupled with each other. In the graphical illustration, the X-axis indicates the movement of the wagon and the Y-axis indicates the distance between the wagons. The curve moving upward in the graph indicates an increase in the distance from the pre-set minimum threshold value, due to the movement of the two wagons away from each other. In some embodiments, the coupling status thus detected may be stored as the coupling status data 231.
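A hedged sketch of this decision rule over the tracked distance series is given below; the threshold value and tolerance band are assumed, illustrative numbers and are not values prescribed by the disclosure:

```python
# Illustrative sketch: classifying the coupling status from the distance series
# tracked over subsequent video frames. The minimum threshold and the tolerance
# band are assumed, illustrative values.

MIN_THRESHOLD = 0.05   # metres; assumed pre-set minimum threshold value
TOLERANCE = 0.02       # metres; assumed allowance for measurement noise

def coupling_status(distances, threshold=MIN_THRESHOLD, tol=TOLERANCE):
    """Return 'successful', 'unsuccessful', or 'pending' from tracked distances."""
    for i, d in enumerate(distances):
        # First frame at which the gap reaches the pre-set minimum threshold.
        if d <= threshold:
            after = distances[i:]
            # Successful: the gap stays within the threshold band in subsequent frames.
            if all(x <= threshold + tol for x in after):
                return "successful"
            # Unsuccessful: the gap increases again after the coupling point.
            return "unsuccessful"
    return "pending"  # the wagons have not yet reached the coupling point

# Example: the gap shrinks, reaches the threshold, then grows again -> unsuccessful.
assert coupling_status([1.2, 0.6, 0.04, 0.3, 0.8]) == "unsuccessful"
assert coupling_status([1.2, 0.6, 0.04, 0.05, 0.05]) == "successful"
```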
Further, an alerting module 249 may provide alerts to one or more personnel via a server 205 associated with the monitoring unit 203. In some embodiments, when the unsuccessful coupling is detected, the alerting module 249 may share data associated with detection of the unsuccessful coupling with the server 205. The server 205 may analyze the data received and provide one or more alerts to one or more personnel (interchangeably used as one or more security personnel). As an example, the one or more security personnel may include operators, security, technicians and the like in the wagon movement area and the wagon tippler area. In one implementation, the one or more alerts may be provided to one or more computing devices 207 of the one or more security personnel and one or more subjects. As an example, the one or more computing devices 207 may include, but not limited to, a mobile phone, a tablet phone, a laptop, a desktop, and the like. In another implementation, the one or more alerts may be provided via sirens, light indications and the like in the wagon tippler area and the wagon movement area. As an example, the server 205 may provide alerts to manually couple the uncoupled wagon with the coupler and the former wagon, when the coupling status is determined to be unsuccessful. In some embodiments, the manual coupling may be performed
by the one or more personnel without violating predefined Standard Operating Procedure (SOP). In some embodiments, the alert data 237 may store data associated with one or more alerts provided to the one or more personnel. As an example, the alert data 237 may include, but not limited to, reason behind the alert, time at which alerts were provided, number of times alerts were provided, and the like. In some embodiments, the one or more warning alerts may include, but not limited to, audio alerts, visual alerts, and haptic alerts and the like.
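As a non-limiting illustration only, the data that the alerting module shares with the server could be a small structured payload sent over the network, as sketched below; the endpoint URL, field names and use of the requests library are hypothetical assumptions, not part of the disclosure:

```python
# Illustrative sketch: sending unsuccessful-coupling alert data from the
# monitoring unit to the server. The endpoint URL, field names, and use of the
# `requests` library are assumptions for illustration only.
import json
import time
import requests

SERVER_ALERT_ENDPOINT = "http://server-205.local/alerts"  # hypothetical URL

def send_coupling_alert(wagon_id, status, frame_index):
    payload = {
        "reason": "unsuccessful_coupling",
        "wagon_id": wagon_id,
        "coupling_status": status,
        "frame_index": frame_index,
        "timestamp": time.time(),
    }
    response = requests.post(SERVER_ALERT_ENDPOINT,
                             data=json.dumps(payload),
                             headers={"Content-Type": "application/json"},
                             timeout=5)
    response.raise_for_status()
```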
In some embodiments, the subject detecting module 251 may parallelly detect presence of one or more subjects in at least one of the plurality of video frames using a trained second artificial intelligence technique. As an example, the one or more subjects may include, but not limited to, humans, animals and some obstacles that may hinder movement of the wagons or the tippler. In some embodiments, like the first artificial intelligence technique, the second artificial intelligence technique may also be trained by providing annotated training images related to presence of one or more subjects in the training images. Upon receiving the annotated training images, the DNN model of the second artificial intelligence technique may analyse the training images to extract a plurality of features that enable the DNN model to infer the presence of the one or more subjects in the training images. The DNN model may refine the plurality of features and modify weights in one or more neural network layers to enhance the accuracy of the detection of the presence of the one or more subjects, with analysis of a large number of training images. Upon self-learning the features for detecting the presence of the one or more subjects, the DNN model of the second artificial intelligence technique may be referred to, for ease, as the trained second artificial intelligence technique. In some embodiments, even when the trained second artificial intelligence technique is deployed for real-time detection in a real-world scenario, the DNN model of the trained second artificial intelligence technique continuously self-learns to provide accurate detection results. The data thus used for training the second artificial intelligence technique and self-learning of the trained second artificial intelligence technique may also be stored as the training data 235.
In some embodiments, upon determining the presence of the one or more subjects in at least one of the wagon movement area and the wagon tippler area, the subject detecting module 251 may determine whether such presence of the one or more subjects is authorized as per Standard Operating Procedure (SOP). In some embodiments, the SOP may include, but not limited to, a predefined list of authorized scenarios comprising at least one of rack placement, emergency stop of tippler equipment, maintenance mode of equipment and complete power shutdown.
Such SOP may be stored as the SOP data 233. When the presence of the one or more subjects is determined to be unauthorized as per the SOP, the alerting module 249 may send alerts to at least one of the one or more subjects and the one or more security personnel, via the server 205. The one or more alerts may direct the one or more subjects to step out of at least one of the wagon movement area and the wagon tippler area, when the presence of the one or more subjects is determined to be unauthorized as per the SOP. In the embodiment where the one or more detected subjects is a non-living thing such as an obstacle, the one or more alerts may be provided to remove the obstacle from its position, to ensure unobstructed movement of the wagons and the tippler. In some embodiments, data related to the alerts, such as the number of alerts sent, the purpose of the alerts, the time at which alerts were provided and the like, may be stored as part of the alert data 237.
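A minimal sketch of this authorization check against the predefined list of SOP scenarios is given below; the scenario identifiers mirror the list above, and the way the currently active scenarios are obtained (for example, from plant control signals) is an assumption left to the caller:

```python
# Illustrative sketch: deciding whether detected subjects are authorized to be
# present, based on the predefined list of authorized SOP scenarios. How the
# currently active scenarios are obtained is outside this sketch and assumed
# to be provided by the caller.

AUTHORIZED_SCENARIOS = {
    "rack_placement",
    "emergency_stop_of_tippler",
    "equipment_maintenance_mode",
    "complete_power_shutdown",
}

def presence_authorized(active_scenarios):
    """Presence is authorized only if at least one authorized scenario is active."""
    return bool(AUTHORIZED_SCENARIOS & set(active_scenarios))

# Example: subjects detected while the tippler is running normally -> unauthorized.
assert presence_authorized([]) is False
assert presence_authorized(["equipment_maintenance_mode"]) is True
```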
FIG.3A is a flowchart illustrating a method of monitoring coupling of uncoupled wagons in real-time in accordance with some embodiments of the present disclosure.
As illustrated in FIG.3A, the method 300A comprises one or more blocks illustrating a method of monitoring coupling of uncoupled wagons in real-time. The method 300A may be described in the general context of computer-executable instructions. Generally, computer-executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform functions or implement abstract data types.
The order in which the method 300A is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300A. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 300A can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 301, the method 300A may include receiving, by a monitoring unit 203 associated with the one or more image capturing devices 201, a plurality of video frames related to rail-bound wagons in a wagon movement area captured by the one or more image capturing devices 201.
At block 303, the method 300A may include detecting, by the monitoring unit 203, presence of an uncoupled wagon with a coupler and a former wagon in at least one of the plurality of video frames based on a trained first artificial intelligence technique.
At block 305, the method 300A may include tracking, by the monitoring unit 203, distance between the uncoupled wagon with the coupler and the former wagon, as the uncoupled wagon moves towards the former wagon. In some embodiments, to track the distance between the uncoupled wagon with coupler and the former wagon, the controller of the monitoring unit 203 may detect position of edges of the uncoupled wagon with coupler and the former wagon in the video frame. Thereafter, the controller of the monitoring unit 203 may track the distance based on relative change in the position of edges of the uncoupled wagon with coupler and the former wagon, as the uncoupled wagon moves towards the former wagon for coupling.
At block 307, the method 300A may include detecting, by the monitoring unit 203, the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance. In some embodiments, the controller detects the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance to be successful when the distance reaches a pre-set minimum threshold value and remains constant within the pre-set minimum threshold value in each of a plurality of subsequent video frames. In some other embodiments, the controller detects the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance to be unsuccessful when the distance reaches the pre-set minimum threshold value and increases thereafter in the plurality of subsequent video frames. In some embodiments, the monitoring unit 203 may provide alerts to one or more personnel, via a server associated with the monitoring unit 203, to manually couple the uncoupled wagon with the coupler and the former wagon, when the coupling status is determined to be unsuccessful. The manual coupling is performed by the one or more personnel without violating the predefined Standard Operating Procedure (SOP).
FIG.3B is a flowchart illustrating a method of detecting presence of one or more subjects parallelly while monitoring coupling of uncoupled wagons in real-time in accordance with some embodiments of the present disclosure.
As illustrated in FIG.3B, the method 300B comprises one or more blocks illustrating a method of detecting presence of one or more subjects parallelly while monitoring coupling of uncoupled wagons in real-time. The method 300B may be described in the general context of computer-executable instructions. Generally, computer-executable instructions can include
routines, programs, objects, components, data structures, procedures, modules, and functions, which perform functions or implement abstract data types.
The order in which the method 300B is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300B. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 300B can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 311, the method 300B may include receiving, by a monitoring unit 203, a plurality of video frames from the one or more image capturing devices 201.
At block 313, the method 300B may include detecting, by the monitoring unit 203, presence of one or more subjects in at least one of the plurality of video frames using a trained second artificial intelligence technique.
At block 315, the method 300B may include determining, by the monitoring unit 203, whether the presence of the one or more subjects is authorized as per SOP. In some embodiments, the SOP may include, but not limited to, a predefined list of authorized scenarios comprising at least one of rack placement, emergency stop of tippler equipment, maintenance mode of equipment and complete power shutdown.
At block 317, the method 300B may include providing, by the monitoring unit 203, alerts to the one or more subjects and one or more security personnel, via a server associated with monitoring unit, to step out of at least one of the wagon movement area and the wagon tippler area, when the presence of the one or more subjects is determined to be unauthorized as per the SOP.
FIG.4 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
In some embodiments, FIG.4 illustrates a block diagram of an exemplary computer system 400 for implementing embodiments consistent with the present disclosure. In some embodiments, the computer system 400 can be a monitoring unit 203 (also referred to as a processor 402 in this FIG.4) that is used for monitoring coupling of uncoupled wagons in real-time. The processor 402 may include at least one data processor for executing program
components for executing user or system-generated business processes. The processor 402 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
The processor 402 may be disposed in communication with input devices 411 and output devices 412 via I/O interface 401. The I/O interface 401 may employ communication protocols/methods such as, without limitation, audio, analog, digital, stereo, IEEE-1394, serial bus, Universal Serial Bus (USB), infrared, PS/2, BNC, coaxial, component, composite, Digital Visual Interface (DVI), high-definition multimedia interface (HDMI), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System For Mobile Communications (GSM), Long-Term Evolution (LTE), WiMax, or the like), etc.
Using the I/O interface 401, computer system 400 may communicate with input devices 411 and output devices 412.
In some embodiments, the processor 402 may be disposed in communication with a communication network 409 via a network interface 403. The network interface 403 may communicate with the communication network 409. The network interface 403 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. Using the network interface 403 and the communication network 409, the computer system 400 may communicate with an image capturing device 201₁ to an image capturing device 201ₙ (also referred to as one or more image capturing devices 201) and a server 205. The communication network 409 can be implemented as one of the different types of networks, such as intranet or Local Area Network (LAN) and such within the organization. The communication network 409 may either be a dedicated network or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with each other. Further, the communication network 409 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc. In some embodiments, the processor 402 may be disposed in communication with a memory 405 (e.g.,
RAM, ROM, etc. not shown in FIG.4) via a storage interface 404. The storage interface 404 may connect to memory 405 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE-1394, Universal Serial Bus (USB), fibre channel, Small Computer Systems Interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
The memory 405 may store a collection of program or database components, including, without limitation, a user interface 406, an operating system 407, a web browser 408, etc. In some embodiments, the computer system 400 may store user/application data, such as the data, variables, records, etc., as described in this invention. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle® or Sybase®.
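As a further non-limiting illustration of how such user/application data might be persisted, the snippet below appends coupling status records to a local database. The schema, the file name and the use of the SQLite library are assumptions for this sketch; the disclosure itself only contemplates a relational database in general terms.

import sqlite3  # stand-in for any relational database of the kind noted above

def record_status(db_path, wagon_id, status):
    # Append one coupling-status record, creating the table on first use.
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS coupling_status ("
            "wagon_id TEXT, status TEXT, ts DATETIME DEFAULT CURRENT_TIMESTAMP)"
        )
        conn.execute(
            "INSERT INTO coupling_status (wagon_id, status) VALUES (?, ?)",
            (wagon_id, status),
        )

# Example with hypothetical identifiers: record_status("monitoring.db", "wagon-07", "unsuccessful")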
Operating system 407 may facilitate resource management and operation of computer system 400. Examples of operating systems include, without limitation, APPLE® MACINTOSH® OS X®, UNIX®, UNIX-like system distributions (e.g., BERKELEY SOFTWARE DISTRIBUTION® (BSD), FREEBSD®, NETBSD®, OPENBSD®, etc.), LINUX® distributions (e.g., RED HAT®, UBUNTU®, KUBUNTU®, etc.), IBM® OS/2®, MICROSOFT® WINDOWS® (XP®, VISTA®, 7, 8, 10, etc.), APPLE® IOS®, GOOGLE™ ANDROID™, BLACKBERRY® OS, or the like. User interface 406 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities. For example, user interfaces may provide computer interaction interface elements on a display system operatively connected to computer system 400, such as cursors, icons, check boxes, menus, scrollers, windows, widgets, etc. Graphical User Interfaces (GUIs) may be employed, including, without limitation, Apple® Macintosh® operating systems’ Aqua®, IBM® OS/2®, Microsoft® Windows® (e.g., Aero, Metro, etc.), web interface libraries (e.g., ActiveX®, Java®, JavaScript®, AJAX, HTML, Adobe® Flash®, etc.), or the like.
Computer system 400 may implement web browser 408 stored program components. Web browser 408 may be a hypertext viewing application, such as MICROSOFT® INTERNET EXPLORER®, GOOGLE™ CHROME™, MOZILLA® FIREFOX®, APPLE® SAFARI®, etc.
Secure web browsing may be provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS), etc. Web browsers 408 may utilize facilities such as AJAX, DHTML, ADOBE® FLASH®, JAVASCRIPT®, JAVA®,
Application Programming Interfaces (APIs), etc. Computer system 400 may implement a mail server stored program component. The mail server may be an Internet mail server such as Microsoft Exchange, or the like. The mail server may utilize facilities such as ASP, ACTIVEX®, ANSI® C++/C#, MICROSOFT® .NET, CGI SCRIPTS, JAVA®, JAVASCRIPT®, PERL®, PHP, PYTHON®, WEBOBJECTS®, etc. The mail server may utilize communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), MICROSOFT® Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. In some embodiments, the computer system 400 may implement a mail client stored program component. The mail client may be a mail viewing application, such as APPLE® MAIL, MICROSOFT® ENTOURAGE®, MICROSOFT® OUTLOOK®, MOZILLA® THUNDERBIRD®, etc.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present invention. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, non-volatile memory, hard drives, Compact Disc (CD) ROMs, Digital Video Discs (DVDs), flash drives, disks, and any other known physical storage media.
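Purely as a non-limiting illustration of the coupling-status determination described in this disclosure, the sketch below assumes that an upstream detector (for example, the trained first artificial intelligence technique) already supplies, for each video frame, the positions of the facing edges of the uncoupled wagon and the former wagon. The threshold value, the confirmation frame count, the direction of approach, and all names below are hypothetical choices made only for readability.

from dataclasses import dataclass

MIN_GAP = 5          # hypothetical pre-set minimum threshold value (pixels)
CONFIRM_FRAMES = 30  # hypothetical number of subsequent frames used to confirm status

@dataclass
class EdgePositions:
    # Assumes the uncoupled wagon approaches the former wagon from the left of the frame.
    uncoupled_leading_edge: float  # x-position of the leading edge of the uncoupled wagon
    former_trailing_edge: float    # x-position of the trailing edge of the former wagon

def coupling_status(per_frame_edges):
    # Return "successful", "unsuccessful" or "in_progress" from per-frame edge positions.
    gaps = [e.former_trailing_edge - e.uncoupled_leading_edge for e in per_frame_edges]
    reached = [i for i, gap in enumerate(gaps) if gap <= MIN_GAP]
    if not reached:
        return "in_progress"      # wagons have not yet closed to the threshold
    after = gaps[reached[0]:reached[0] + CONFIRM_FRAMES]
    if len(after) < CONFIRM_FRAMES:
        return "in_progress"      # not enough subsequent frames observed yet
    if all(gap <= MIN_GAP for gap in after):
        return "successful"       # the gap stays within the threshold: wagons coupled
    return "unsuccessful"         # the gap re-opens after contact: coupling failed

In such a sketch, a returned status of "unsuccessful" is the condition under which the alerts to personnel contemplated by this disclosure could be raised.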
Equivalents:
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention. When a single device or article is described herein, it will be apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device
may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
The specification has described a method, device and a system for monitoring coupling of uncoupled wagons in real-time. The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that on-going technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments. Also, the words "comprising," "having," "containing," and "including," and other similar forms are intended to be equivalent in meaning and be open-ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based here on. Accordingly, the embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
Reference numerals
Reference Number Description
200 Exemplary system
201 One or more image capturing devices
203 Monitoring unit
205 Server
207 Computing device
209 ARM based Controller
211 Network module
213 Interface Card
215 Enclosure
217 Electronic cooling device
219 Data
221 Modules
223 Memory
225 I/O interface
227 Video data
229 Distance data
231 Coupling status data
233 Standard Operating Procedure (SOP) data
235 Training data
237 Alert data
239 Other data
241 Receiving module
242 Exemplary coupler
243 Uncoupled wagon detecting module
244 Exemplary uncoupled wagon
245 Distance tracking module
246 Exemplary former wagon
247 Coupler status detecting module
248 Edge of the exemplary uncoupled wagon 244
249 Alerting module
250 Edge of the exemplary former wagon 246
251 Subject detecting module
253 Other modules
400 Exemplary computer system
401 I/O Interface of the exemplary computer system
402 Processor of the exemplary computer system
403 Network interface
404 Storage interface
405 Memory of the exemplary computer system
406 User interface
407 Operating system
408 Web browser
409 Communication network
411 Input devices
412 Output devices
We claim:
1. A method of monitoring coupling of uncoupled wagons in real-time, the method comprising:
receiving, by a monitoring unit (203) associated with one or more image capturing devices (201), a plurality of video frames related to rail-bound wagons in a wagon movement area captured by the one or more image capturing devices (201);
detecting, by the monitoring unit (203), presence of an uncoupled wagon with a coupler and a former wagon in at least one of the plurality of video frames based on a trained first artificial intelligence technique;
tracking, by the monitoring unit (203), distance between the uncoupled wagon with the coupler and the former wagon, as the uncoupled wagon moves towards the former wagon; and
detecting, by the monitoring unit (203), the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance.
2. The method as claimed in claim 1, wherein tracking the distance between the uncoupled
wagon with coupler and the former wagon comprises:
detecting position of edges of the uncoupled wagon with the coupler and the former wagon; and
tracking the distance based on relative change in the position of edges of the uncoupled wagon with coupler and the former wagon, as the uncoupled wagon moves towards the former wagon for coupling.
3. The method as claimed in claim 1, wherein the coupling status of the uncoupled wagon
with coupler and the former wagon based on the tracked distance is detected to be:
a) successful when the distance reaches a pre-set minimum threshold value and
remains constant within the pre-set minimum threshold value in each of a plurality
of subsequent video frames; or
b) unsuccessful when the distance reaches the pre-set minimum threshold value
and increases thereafter in the plurality of subsequent video frames.
4. The method as claimed in claim 1 further comprises providing, by the monitoring unit (203), alerts to one or more personnel, via a server (205) associated with the monitoring unit (203), to manually couple the uncoupled wagon with the coupler and the former wagon, when the coupling status is determined to be unsuccessful, wherein the manual coupling is performed by the one or more personnel without violating predefined Standard Operating Procedure (SOP).
5. The method as claimed in claim 1 further comprises:
receiving, by the monitoring unit (203), the plurality of video frames from the one or more image capturing devices (201);
detecting, by the monitoring unit (203), presence of one or more subjects in at least one of the plurality of video frames using a trained second artificial intelligence technique;
determining, by the monitoring unit (203), whether the presence of the one or more subjects is authorized as per SOP; and
providing, by the monitoring unit (203), alerts to the one or more subjects and one or more security personnel, via a server (205) associated with the monitoring unit (203), to step out of at least one of the wagon movement area and the wagon tippler area, when the presence of the one or more subjects is determined to be unauthorized as per the SOP.
6. The method as claimed in claim 5, wherein the SOP comprises a predefined list of authorized scenarios comprising at least one of rake placement, emergency stop of tippler equipment, maintenance mode of equipment and complete power shutdown.
7. A monitoring unit (203) for monitoring coupling of uncoupled wagons in real-time, the monitoring unit (203) comprising:
a controller (209);
a memory (223) communicatively coupled to the controller (209), wherein the memory (223) stores controller-executable instructions which, on execution, cause the controller (209) to:
receive a plurality of video frames related to rail-bound wagons in a wagon movement area captured by one or more image capturing devices (201) associated with the monitoring unit (203);
detect presence of an uncoupled wagon with a coupler and a former wagon, both in at least one of the plurality of video frames, based on a trained first artificial intelligence technique;
track distance between the uncoupled wagon with coupler and the former wagon, as the uncoupled wagon moves towards the former wagon; and
detect the coupling status of the uncoupled wagon with coupler and the former wagon based on the tracked distance.
8. The monitoring unit (203) as claimed in claim 7, wherein to track the distance between the
uncoupled wagon with coupler and the former wagon, the controller (209) is configured to:
detect position of edges of the uncoupled wagon with coupler and the former wagon; and
track the distance based on relative change in the position of edges of the uncoupled wagon with coupler and the former wagon, as the uncoupled wagon moves towards the former wagon for coupling.
9. The monitoring unit (203) as claimed in claim 7, wherein the controller (209) detects the
coupling status of the uncoupled wagon with coupler and the former wagon based on the
tracked distance to be:
a) successful when the distance reaches a pre-set minimum threshold value and
remains constant within the pre-set minimum threshold value in each of a plurality
of subsequent video frames; or
b) unsuccessful when the distance reaches the pre-set minimum threshold value
and increases thereafter in the plurality of subsequent video frames.
10. The monitoring unit (203) as claimed in claim 7, wherein the controller (209) is further configured to provide alerts to one or more personnel, via a server (205) associated with the monitoring unit (203), to manually couple the uncoupled wagon with the coupler and the former wagon, when the coupling status is determined to be unsuccessful, wherein the manual coupling is performed by the one or more personnel without violating predefined Standard Operating Procedure (SOP).
11. The monitoring unit (203) as claimed in claim 7, wherein the controller (209) is further configured to:
receive the plurality of video frames from the one or more image capturing devices (201);
detect presence of one or more subjects in at least one of the plurality of video frames using a trained second artificial intelligence technique;
determine whether the presence of the one or more subjects is authorized as per SOP; and
provide alerts to the one or more subjects and one or more security personnel, via a server (205) associated with the monitoring unit (203), to step out of at least one of the wagon movement area and the wagon tippler area, when the presence of the one or more subjects is determined to be unauthorized as per the SOP.
12. The monitoring unit (203) as claimed in claim 11, wherein the SOP comprises a predefined list of authorized scenarios comprising at least one of rake placement, emergency stop of tippler equipment, maintenance mode of equipment and complete power shutdown.
13. A system for monitoring coupling of uncoupled wagons in real-time, the system comprising:
one or more image capturing devices (201);
a monitoring unit (203) associated with the one or more image capturing devices (201);
a server (205) associated with the monitoring unit (203); and
one or more computing devices (207) associated with the server (205), wherein
the one or more image capturing devices (201) are configured to capture a plurality of video frames related to a wagon movement area and a wagon tippler area;
the monitoring unit (203) is configured to perform the method as claimed in any of the preceding claims 1 to 6;
the server (205) is configured to receive data related to alerts from the monitoring unit (203), and generate one or more alerts for one or more subjects and one or more security personnel; and
the one or more computing devices (207) are configured to receive and display the one or more alerts received from the server (205).
14. The system as claimed in claim 13 further comprises an enclosure (215) covering each of the one or more image capturing devices (201) for preventing dust and water ingress.
15. The system as claimed in claim 14, wherein the enclosure (215) is configured with an electronic cooling device (217) to maintain normal temperature inside the enclosure (215).