Abstract: The present disclosure describes techniques for collaborative edge computing in video surveillance using AI models and data analytics. In an embodiment, an edge device is configured to detect a first road incident involving at least one first object and derive at least one first road incident parameter and a first time of incident (ToI). The edge device is configured to relay the derived at least one first road incident parameter and the first ToI to one or more neighbouring edge devices and a command server. The edge device is configured to detect an existence of at least one second object. The edge device is configured to relay at least one second road incident parameter and a second ToI to a first neighbouring edge device and the command server. The edge device is configured to relay a confirmation of the existence of the at least one second object to the command server.
DESC:FIELD OF THE INVENTION
[0001] The present disclosure relates generally to systems and methods for collaborative edge computing in video surveillance using AI models and data analytics.
BACKGROUND
[0002] The field of video surveillance has become crucial in enhancing security and monitoring activities across various environments, such as public spaces, retail locations, and private properties. Traditional video surveillance systems include various components, such as cameras for capturing visual data, storage devices for archiving video footage, servers for video processing, and software applications for facilitating access and management of video files. These surveillance systems provide both real-time monitoring and offline analysis capabilities. However, despite advancements in surveillance technology, significant challenges remain. These challenges are particularly evident in the efficient management, processing, and analysis of the large volumes of data generated by these surveillance systems.
[0003] One of the primary challenges in current video surveillance systems is the reliance on manual intervention for both real-time operation and offline analysis. This manual dependency becomes especially cumbersome when the surveillance system is deployed across large areas, and each area is equipped with numerous cameras. These cameras are tasked with recording video footage from various surveillance zones. Such tasks often lead to an overwhelming amount of video data that must be stored, processed, and analyzed. Such cumbersome tasks not only place a strain on storage and processing resources but also result in inefficiencies in handling large-scale surveillance operations.
[0004] Furthermore, incidents that occur across multiple surveillance zones present an additional challenge. The current approach to incident analysis typically requires a manual review of video footage from different cameras covering various areas and time periods. This manual review process is time-consuming, error-prone, and inefficient. The need for accurate and timely compilation of an incident's complete overview is often hindered by the sheer volume of data and the fragmented nature of video footage captured across different zones.
[0005] Existing solutions in the field of video surveillance have attempted to address some of these challenges. For example, some existing systems utilize edge-assisted technologies, where an edge device communicates with connected vehicles or roadside units to provide alerts based on observations and driving data. These edge devices classify the behaviour of detected vehicles and determine whether and how a connected vehicle may be impacted, providing vehicle-specific alerts. However, these systems tend to be focused on specific environments, such as road traffic, and are limited in their application to a broader range of surveillance scenarios. Additionally, such systems typically do not offer a comprehensive solution for seamlessly analyzing video data across multiple surveillance zones.
[0006] In other existing technologies, Internet of Things (IoT)-based integrated devices have been employed to monitor and control events in real-time environments. These IoT devices include sensors and image/video capture devices that are connected to processors to correlate sensor data with media files, such as images, audio, and video. The processor validates events and transmits relevant data to edge nodes, cloud servers, or user devices for further monitoring. While these systems improve real-time monitoring, they are often fragmented. These systems do not address the complexities involved in efficiently processing and analyzing large amounts of video surveillance data across diverse environments and multiple surveillance zones.
[0007] Additionally, some existing systems focus on edge alert coordination for mobile devices. In these systems, an edge device monitors device data and metrics. The edge device transmits alerts based on inherited device data and real-time monitoring. While these systems provide useful notifications, they do not directly address the core problem of efficiently analyzing video surveillance data from multiple surveillance zones in an integrated and automated manner.
[0008] Therefore, there remains a need for an improved solution that addresses the aforementioned challenges in video surveillance systems.
SUMMARY
[0009] This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify essential inventive concepts of the invention nor is it intended for determining the scope of the invention.
[0010] According to an embodiment of the present disclosure, an edge device for monitoring and relaying data associated with road incidents is disclosed. The edge device comprises a camera module configured to capture real-time video data associated with one or more objects on a road. The edge device further comprises a processor coupled to the camera module. The processor is configured to detect, by at least one artificial intelligence (AI) model, a first road incident involving at least one first object among the one or more objects using the captured real-time video data. The processor is further configured to derive at least one first road incident parameter associated with the at least one first object and a first time of incident (ToI) of the first road incident. The processor is furthermore configured to relay the derived at least one first road incident parameter and the first ToI to one or more neighbouring edge devices and a command server connected to the edge device. Further, when the edge device receives, from at least one neighbouring edge device among the one or more neighbouring edge devices, at least one second road incident parameter and a second ToI associated with a second road incident, the processor is further configured to detect an existence of at least one second object associated with the second road incident by analyzing the captured real-time video data within a predefined time period. The processor is further configured to relay the at least one second road incident parameter and the second ToI to a first neighbouring edge device among the one or more neighbouring edge devices based on the detection of the existence of the at least one second object. The processor is further configured to relay the at least one second road incident parameter, the second ToI, and a confirmation of the existence of the at least one second object to the command server, in response to detecting an existence of the at least one second object.
[0011] In another embodiment, a command server for generating road incident information is disclosed. The command server comprises a memory and a processor coupled to the memory. The processor is configured to receive road incident data associated with a road incident from a first edge device, captured at a first time instance, and one or more other edge devices at corresponding one or more subsequent time instances. The processor is further configured to generate the road incident information comprising an incident number, a sequence of edge devices involved in the road incident, time of existence (TOE) of at least one object involved in the road incident and mapped real-time video data corresponding to the road incident by analyzing the received road incident data.
[0012] In yet another embodiment, a method for monitoring and relaying data associated with road incidents is disclosed. The method includes capturing, by an edge device, real-time video data associated with one or more objects on a road. The method further includes detecting, by at least one artificial intelligence (AI) model of the edge device, a first road incident involving at least one first object among the one or more objects using the captured real-time video data. The method furthermore includes deriving, by the edge device, at least one first road incident parameter associated with the at least one first object and a first time of incident (ToI) of the first road incident. The method further includes relaying, by the edge device, the derived at least one first road incident parameter and the first ToI to one or more neighbouring edge devices and a command server connected to the edge device. Further, when the edge device receives, from at least one neighbouring edge device among the one or more neighbouring edge devices, at least one second road incident parameter and a second ToI associated with a second road incident, the method further comprises detecting an existence of at least one second object associated with the second road incident by analyzing the captured real-time video data within a predefined time period. The method further comprises relaying the at least one second road incident parameter and the second ToI to a first neighbouring edge device among the one or more neighbouring edge devices based on the detection of the existence of the at least one second object. The method furthermore comprises in response to detecting an existence of the at least one second object, relaying the at least one second road incident parameter, the second ToI, and a confirmation of the existence of the at least one second object to the command server.
[0013] In yet another embodiment, a method for generating road incident information is disclosed. The method comprises receiving road incident data associated with a road incident from a first edge device, captured at a first time instance, and one or more other edge devices at corresponding one or more subsequent time instances. The method further comprises generating the road incident information comprising an incident number, a sequence of edge devices involved in the road incident, time of existence (TOE) of at least one object involved in the road incident and mapped real-time video data corresponding to the road incident by analyzing the received road incident data.
[0014] To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting to its scope. The invention will be described and explained with additional specificity and detail in the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0016] FIG. 1 illustrates an environment for monitoring and analysing road incidents, according to an embodiment of the present disclosure;
[0017] FIG. 2 illustrates a block diagram of an edge device for monitoring and relaying data associated with the road incidents, according to an embodiment of the present disclosure;
[0018] FIG. 3 illustrates an exemplary scenario for monitoring and analysing the road incidents, according to an embodiment of the present disclosure;
[0019] FIG. 4 illustrates a block diagram of a command server for generating road incident information, according to an embodiment of the present disclosure;
[0020] FIGS. 5A-5B illustrate flowcharts depicting a method for monitoring and relaying data associated with road incidents, according to an embodiment of the present disclosure; and
[0021] FIG. 6 illustrates a flowchart depicting a method for generating road incident information, according to an embodiment of the present disclosure;
[0022] Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help to improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0023] For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the various embodiments, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.
[0024] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are explanatory of the invention and are not intended to be restrictive thereof.
[0025] Reference throughout this specification to “an aspect”, “another aspect” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0026] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components proceeded by “comprises... a” does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
[0027] The present disclosure discloses techniques for monitoring and analysing road incidents on a road. In an embodiment, the disclosed techniques provide an edge device that monitors and relays data associated with the road incidents. The disclosed techniques also provide a command server that generates road incident information associated with the road incidents.
[0028] Example embodiments of the present inventive concepts will be described below in detail with reference to the accompanying drawings.
[0029] FIG. 1 illustrates an environment 100 for monitoring and analysing road incidents, according to an embodiment of the present disclosure. As shown, the environment 100 includes a plurality of edge devices, such as a first edge device 101A, a second edge device 101B, and a third edge device 101C (also referred to as edge devices 101). The edge devices 101 may be connected to a command center 103 through a wireless communication network 105. In some examples, the wireless communication network 105 may be a Long Term Evolution (LTE) network, an LTE-Advanced (LTE-A) network, an LTE-A Pro network, a New Radio (NR) network, a Fifth-Generation (5G) network, a Sixth-Generation (6G) network, a 6G-pro network or similar networks. In some examples, the wireless communication network 105 may support enhanced broadband communications, ultra-reliable (e.g., mission-critical) communications, low latency communications, communications with low-cost and low-complexity devices, or any combination thereof.
[0030] In an embodiment, the plurality of edge devices 101 may be connected with each other in a pre-defined network configuration. In an embodiment, the pre-defined network configuration may include but is not limited to a linear configuration, a circular configuration, a mesh configuration, and similar configurations. In an embodiment, the network configuration may be defined based on an area of surveillance on the road and available network connectivity.
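As an illustrative sketch (not part of the disclosure), the neighbour relationships implied by the pre-defined network configurations above can be expressed as a small helper; the device identifiers and the function below are hypothetical.

```python
# Hypothetical sketch: derive the neighbouring edge devices for a device
# at a given index, under the linear, circular, and mesh configurations
# named in the disclosure. Device ids and helper name are assumptions.

def neighbours(device_ids, index, configuration="linear"):
    """Return the ids of the neighbouring edge devices for `device_ids[index]`."""
    n = len(device_ids)
    if configuration == "linear":
        result = []
        if index > 0:
            result.append(device_ids[index - 1])   # previous device on the route
        if index < n - 1:
            result.append(device_ids[index + 1])   # next device on the route
        return result
    if configuration == "circular":
        # the route wraps around, so every device has exactly two neighbours
        return [device_ids[(index - 1) % n], device_ids[(index + 1) % n]]
    if configuration == "mesh":
        # every device is connected to every other device
        return [d for i, d in enumerate(device_ids) if i != index]
    raise ValueError(f"unknown configuration: {configuration}")
```

In a linear deployment the devices at the ends of the route have a single neighbour, which is one reason the configuration may be chosen based on the surveillance area and available connectivity.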
[0031] In an embodiment, the edge devices 101 may be placed across various routes on a road.
[0032] Further, each of the plurality of edge devices 101 may be capable of executing Artificial Intelligence (AI) models. Each of the plurality of edge devices 101 may include or may be referred to as a mobile device, a wireless device, a remote device, a handheld device, a subscriber device, or some other suitable terminology, where the “device” may also be referred to as a unit, a station, a terminal, or a client, among other examples. Each of the plurality of edge devices 101 may also include or may be referred to as a personal electronic device, such as a cellular phone, a Personal Digital Assistant (PDA), a tablet computer, a laptop computer, or a personal computer. In some examples, each of the plurality of edge devices 101 may include or be referred to as an Internet of Things (IoT) device, an Internet of Everything (IoE) device, or a Machine-Type Communications (MTC) device, among other examples, which may be implemented in various objects such as appliances, vehicles, or meters, among other examples. The edge devices 101 described herein may be able to communicate with various types of devices, such as other edge devices that may sometimes act as relays. The edge devices 101 and working thereof have been further explained in reference to FIG. 2.
[0033] FIG. 2 illustrates a block diagram of the edge device for monitoring and relaying data associated with the road incidents, according to an embodiment of the present disclosure. In an exemplary embodiment, the first edge device 101A (also referred to as the edge device 101A) has been explained in FIG. 2. However, it should be noted that the structure of each of the edge devices 101 is the same as the structure of the edge device 101A.
[0034] As shown in FIG. 2, the edge device 101A may include a memory 202, at least one processor 204 (herein referred to as the processor 204), a camera module 206, and an Input/ Output (I/O) interface 208. In an exemplary embodiment, the at least one processor 204 may be operatively coupled to the camera module 206, the I/O interface 208, and the memory 202.
[0035] In one embodiment, the at least one processor 204 may be operatively coupled to the memory 202 for processing, executing, or performing a set of operations. The at least one processor 204 may include at least one data processor for executing processes in a Virtual Storage Area Network. In another embodiment, the at least one processor 204 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc. In one embodiment, the processor 204 may include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or both. In another embodiment, the at least one processor 204 may be one or more general processors, digital signal processors, application-specific integrated circuits, field-programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analyzing and processing data. The at least one processor 204 may execute a software program, such as code generated manually (i.e., programmed) to perform one or more operations disclosed in the present disclosure.
[0036] The at least one processor 204 may be disposed in communication with one or more I/O devices, such as the edge device 101, via the I/O interface 208. The I/O interface 208 may employ communication protocols such as Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System for Mobile Communications (GSM), Long-Term Evolution (LTE), WiMax, or the like.
[0037] In an embodiment, the at least one processor 204 may be disposed in communication with a communication network via a network interface. In an embodiment, the network interface may be the I/O interface 208. The network interface may connect to the communication network to enable connection of the edge device 101A with the outside environment and/or device/system. The network interface may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE 802.11/b/g/n/x, etc. The communication network may include, without limitation, a direct interconnection, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol (WAP)), the Internet, etc. Using the network interface and the communication network, the edge device 101A may communicate with other devices.
[0038] In an embodiment, the processor 204 may use at least one artificial intelligence (AI) model. A function associated with AI may be performed through the non-volatile memory, the volatile memory, and the processor 204. Accordingly, the processor 204 may include one or a plurality of processors. At this time, one or a plurality of processors may be a general purpose processor, such as a central processing unit (CPU), an application processor (AP), or the like, a graphics-only processing unit such as a graphics processing unit (GPU), a visual processing unit (VPU), and/or an AI-dedicated processor such as a neural processing unit (NPU). The one or a plurality of processors control the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory. The predefined operating rule or artificial intelligence model is provided through training or learning.
[0039] In an embodiment, the processor 204 may be configured to perform the functions of the edge device 101A, as described throughout the specification.
[0040] Furthermore, the memory 202 may include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as Static Random-Access Memory (SRAM) and Dynamic Random-Access Memory (DRAM), and/or non-volatile memory, such as Read-Only Memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0041] The memory 202 is communicatively coupled with the processor 204 to store bitstreams or processing instructions for completing the process. Further, the memory 202 may include an operating system 210 for performing one or more tasks of the edge device 101A, as performed by a generic operating system in the communications domain or the standalone device. In an embodiment, the memory 202 may comprise a database 212 configured to store the information as required by the processor 204 to perform one or more functions for monitoring and relaying data associated with the road incidents, as discussed throughout the disclosure.
[0042] The memory 202 may be operable to store instructions executable by the processor 204. The functions, acts, or tasks illustrated in the figures or described may be performed by the processor 204 for executing the instructions stored in the memory 202. The functions, acts, or tasks are independent of the particular type of instruction set, storage media, processor, or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro-code, and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing, and the like.
[0043] For the sake of brevity, the architecture, and standard operations of the memory 202 and the processor 204 are not discussed in detail. In one embodiment, the memory 202 may be configured to store the information as required by the processor 204 to perform the methods described herein.
[0044] The camera module 206 may capture real-time video data associated with one or more objects on a road. In an embodiment, the one or more objects may include, but are not limited to, a vehicle, a person, and a stationary object. In an embodiment, the camera module 206 may convert the real-time video data into an advanced digital video format using video codecs such as H.264/H.265. Accordingly, the real-time video data may be analysed to detect a road incident in accordance with the techniques described in the following paragraphs.
[0045] Further, the captured real-time video data may be stored in the memory 202 for further processing.
[0046] FIG. 2 is further explained in conjunction with FIG. 3. FIG. 3 illustrates an exemplary scenario for monitoring and analysing the road incidents, according to an embodiment of the present disclosure. As shown in FIG. 3, the plurality of edge devices 101 are connected to each other on a route 300. In particular, the plurality of edge devices 101, i.e., Ed(n-3), Ed(n-2), Ed(n-1), Ed(n), Ed(n+1), Ed(n+2), and Ed(n+3), are deployed on the route 300, i.e., from A to B. Each of the edge devices 101 has a different surveillance area. For example, the first edge device 101A has a surveillance area 302A, the second edge device 101B has a surveillance area 302B, the third edge device 101C has a surveillance area 302C, and so on.
[0047] Referring back to FIG. 2, the edge device 101A (also referred to as a source edge device 101A) may detect a first road incident on the route 300. The edge device 101A may detect the first road incident using at least one AI model. In an embodiment, the at least one AI model may correspond to an AI model pre-loaded with different scenarios for the road incidents. For example, the at least one AI model may be loaded with a hit-and-run scenario, a collision scenario, a road incident involving multiple objects, etc. Accordingly, the at least one AI model may detect the first road incident based on the loaded scenarios. Further, in an embodiment, the first road incident may involve at least one first object among the one or more objects. The edge device 101A may detect the first road incident using the captured real-time video data. Accordingly, the at least one AI model may analyse the real-time video data and detect the first road incident.
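The scenario-based detection step above can be sketched as follows. This is an illustrative stand-in, not the disclosure's implementation: the model is abstracted as a callable that scores each frame against the pre-loaded scenarios, whereas a real edge deployment would execute a trained network on the device.

```python
# Hedged sketch: scan real-time video frames with an AI model pre-loaded
# with road incident scenarios. The `model` callable, scenario labels,
# and threshold are assumptions for illustration.

PRELOADED_SCENARIOS = ["hit_and_run", "collision", "multi_object"]

def detect_incident(frames, model, threshold=0.8):
    """Return (scenario, frame_index) of the first detected incident, or None.

    `model(frame)` is assumed to return a dict of scenario -> confidence.
    """
    for i, frame in enumerate(frames):
        scores = model(frame)
        for scenario in PRELOADED_SCENARIOS:
            if scores.get(scenario, 0.0) >= threshold:
                return scenario, i   # first frame matching a loaded scenario
    return None                      # no road incident detected
```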
[0048] Further, the edge device Ed(n) 101A may derive at least one first road incident parameter associated with the at least one first object and a first Time of Incident (ToI) of the first road incident. In an embodiment, the at least one first road incident parameter may include but is not limited to identification of the at least one first object, direction of the at least one first object, a location of the first road incident, and a type of the first road incident. For example, the edge device Ed(n) 101A may determine that a hit-and-run incident took place at 10 PM at “XYZ” place. The incident involved a car of “ABC” model and white colour with a license plate “AAAAAAA”. The car was coming from the north direction towards the “XYZ” place.
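The derived parameters and ToI of the example above can be represented by a simple data structure. The field names below are assumptions for illustration, not the disclosure's wire format.

```python
# Hypothetical representation of the at least one first road incident
# parameter and the Time of Incident (ToI) derived by the edge device.

from dataclasses import dataclass, asdict
from datetime import datetime

@dataclass
class RoadIncidentParameters:
    object_id: str       # identification of the object, e.g. model/colour/plate
    direction: str       # direction the object was travelling
    location: str        # location of the road incident
    incident_type: str   # type of the road incident, e.g. "hit_and_run"

@dataclass
class IncidentReport:
    parameters: RoadIncidentParameters
    toi: datetime        # Time of Incident (ToI)

# The worked example from the text: a hit-and-run at 10 PM at "XYZ" place.
params = RoadIncidentParameters(
    object_id="white ABC-model car, licence plate AAAAAAA",
    direction="from the north towards XYZ place",
    location="XYZ place",
    incident_type="hit_and_run",
)
report = IncidentReport(parameters=params, toi=datetime(2024, 1, 1, 22, 0))
```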
[0049] Thereafter, the edge device 101A Ed(n) may relay the derived at least one first road incident parameter and the first ToI to one or more neighbouring edge devices and the command server 103. For example, the edge device Ed(n) 101A may relay the derived at least one first road incident parameter and the first ToI to the edge devices Ed(n-1) 101B and Ed(n+1) 101C.
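The relay step can be sketched as below. Transport is abstracted as plain callables, which is an assumption; an actual system would carry the report over the wireless communication network 105 of FIG. 1.

```python
# Hedged sketch: the source edge device relays the derived incident
# parameters and ToI to each neighbouring edge device and to the
# command server. The link callables are illustrative placeholders.

def relay_incident(report, neighbour_links, command_server_link):
    """Send `report` over every neighbour link and the command-server link."""
    for send in neighbour_links:
        send(report)                   # e.g. to Ed(n-1) and Ed(n+1)
    command_server_link(report)        # always forwarded to the command server
    return 1 + len(neighbour_links)    # number of messages relayed
```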
[0050] In an embodiment, the edge device Ed(n) 101A may map the real-time video data corresponding to the first road incident and forward the mapped real-time video data to the command server 103. In an embodiment, the mapped real-time video data may include mapped video frames corresponding to the road incident in the real-time video data. The edge device Ed(n) 101A may also forward meta-data associated with the first road incident to the command server 103.
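The frame-mapping step can be sketched as selecting the frames whose timestamps fall around the ToI, so that only the relevant clip (with its metadata) is forwarded to the command server. The window size below is an assumed parameter, not specified by the disclosure.

```python
# Illustrative sketch: map the real-time video data to the road incident
# by keeping only frames timestamped near the ToI. Window size is an
# assumption for illustration.

from datetime import datetime, timedelta

def map_incident_frames(frames, toi, window=timedelta(seconds=30)):
    """`frames` is a list of (timestamp, frame) pairs; keep those near `toi`."""
    return [(ts, f) for ts, f in frames if toi - window <= ts <= toi + window]
```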
[0051] In an embodiment, the edge device Ed(n) 101A may receive a confirmation of existence of the at least one first object from the one or more neighbouring edge devices in response to relaying the derived at least one first road incident parameter and the first ToI.
[0052] In an embodiment, the working of the edge device as a neighbouring edge device has been explained with reference to the edge device 101A for the sake of continuity of the description. Accordingly, to differentiate this from the working of the edge device 101A as the source edge device, the following embodiments are explained with reference to a second road incident.
[0053] Accordingly, in an embodiment, the edge device 101A may receive at least one second road incident parameter and a second ToI associated with the second road incident. The edge device Ed(n) 101A may receive the at least one second road incident parameter and the second ToI from at least one neighbouring edge device, such as the edge device Ed(n-1) 101B. Then, the edge device Ed(n) 101A may detect the existence of at least one second object associated with the second road incident. In an embodiment, the edge device Ed(n) 101A may detect the existence of the at least one second object by analyzing the captured real-time video data within a predefined time period. In an embodiment, the predefined time period corresponds to a time period prior to the second ToI, when the at least one second object is captured by the at least one neighbouring edge device prior to the edge device. In an embodiment, the predefined time period may be configurable by the edge device 101A. For example, let us assume that the edge device Ed(n) 101A receives the at least one second road incident parameter and the second ToI from the edge device Ed(n+1) 101C; in that case, the edge device Ed(n) 101A may have captured the at least one second object prior to the edge device Ed(n+1) 101C. Accordingly, the edge device Ed(n) 101A may analyse the captured real-time video data within a time period such as 5 minutes prior to the second ToI. For example, if the second ToI is 10 PM, then the edge device Ed(n) 101A may analyse the captured real-time video data from 9:55 PM to 10 PM.
[0054] In another embodiment, the predefined time period corresponds to a time period subsequent to the second ToI, when the at least one second object is captured by the at least one neighbouring edge device subsequent to the edge device. For example, let us assume that the edge device Ed(n) 101A receives the at least one second road incident parameter and the second ToI from the edge device Ed(n-1) 101B; in that case, the edge device Ed(n) 101A may capture the at least one second object subsequent to the edge device Ed(n-1) 101B. Accordingly, the edge device Ed(n) 101A may analyse the captured real-time video data within a time period such as 5 minutes subsequent to the second ToI. For example, if the second ToI is 10 PM, then the edge device Ed(n) 101A may analyse the captured real-time video data from 10 PM to 10:05 PM.
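The two cases above can be sketched as a single helper that returns the analysis window relative to the second ToI. The 5-minute default mirrors the examples in the text; the direction of the window depends on whether the sending neighbour is downstream of the edge device (object seen here first) or upstream (object seen here later).

```python
# Illustrative sketch of the predefined time period: footage prior to
# the ToI when the parameters come from a downstream device such as
# Ed(n+1), footage subsequent to the ToI when they come from an
# upstream device such as Ed(n-1). Default period is an assumption.

from datetime import datetime, timedelta

def analysis_window(toi, sender_is_downstream, period=timedelta(minutes=5)):
    """Return (start, end) of the video interval to analyse."""
    if sender_is_downstream:         # e.g. parameters received from Ed(n+1)
        return toi - period, toi     # analyse footage prior to the second ToI
    return toi, toi + period         # e.g. from Ed(n-1): analyse footage after
```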
[0055] It should be noted that the at least one second road incident parameter and the second ToI may correspond to the at least one first road incident parameter and the first ToI, respectively. Similarly, the second road incident and the at least one second object may correspond to the first road incident and the at least one first object, respectively. Accordingly, the at least one second road incident parameter may include but is not limited to the identification of the at least one second object, the direction of the at least one second object, the location of the second road incident, and the type of the second road incident.
[0056] After detecting the existence of the at least one second object, the edge device Ed(n) 101A may relay the at least one second road incident parameter and the second ToI to a first neighbouring edge device among the one or more neighbouring edge devices. For example, the edge device Ed(n) 101A may relay the at least one second road incident parameter and the second ToI to the edge device Ed(n+1) 101C or the edge device Ed(n-1) 101B. However, it should be noted that if the edge device Ed(n) 101A does not detect the existence of the at least one second object, then the edge device Ed(n) 101A does not relay the at least one second road incident parameter and the second ToI to the first neighbouring edge device.
[0057] In an embodiment, after detecting the existence of the at least one second object, the edge device Ed(n) 101A may also relay the at least one second road incident parameter, the second ToI, and a confirmation of the existence of the at least one second object to the command server 103.
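The conditional relay described in paragraphs [0056] and [0057] amounts to: forward the report onward only when the second object is actually found in local footage, and confirm to the command server on detection. The sketch below is an assumption-laden illustration; all callables and names are hypothetical placeholders, not a disclosed API.

```python
def handle_received_report(params, second_toi, detect_object,
                           relay_to_neighbour, relay_to_server):
    """Forward a received road incident report only on local detection.

    detect_object analyzes the local footage window and returns True
    if the second object is found there.
    """
    if not detect_object(params, second_toi):
        # No detection: nothing is relayed onward.
        return False
    relay_to_neighbour(params, second_toi)
    relay_to_server(params, second_toi, confirmation=True)
    return True

# Minimal usage with stub callables standing in for real transports.
sent = []
found = handle_received_report(
    {"object": "white car"}, "22:00",
    detect_object=lambda p, t: True,
    relay_to_neighbour=lambda p, t: sent.append(("neighbour", p)),
    relay_to_server=lambda p, t, confirmation: sent.append(("server", confirmation)),
)
```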
[0058] Referring back to FIG. 2, the command server 103 may correspond to a server capable of real-time data processing and decision-making without relying heavily on centralized cloud systems, reducing latency and bandwidth usage. Further, the command server 103 is capable of collaborating with the edge devices 101 to share and process data collectively, creating a memory link for the road incident. The command server 103 and its working are further explained with reference to FIG. 4.
[0059] FIG. 4 illustrates a block diagram of the command server 103 for generating road incident information, according to an embodiment of the present disclosure. As shown, the command server 103 may include a memory 402, at least one processor 404 (herein referred to as the processor 404), and an Input/Output (I/O) interface 406. In an exemplary embodiment, the at least one processor 404 may be operatively coupled to the I/O interface 406 and the memory 402.
[0060] In one embodiment, the at least one processor 404 may be operatively coupled to the memory 402 for processing, executing, or performing a set of operations. The at least one processor 404 may include at least one data processor for executing processes in a Virtual Storage Area Network. In another embodiment, the at least one processor 404 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc. In one embodiment, the processor 404 may include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or both. In another embodiment, the at least one processor 404 may be one or more general processors, digital signal processors, application-specific integrated circuits, field-programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analyzing and processing data. The at least one processor 404 may execute a software program, such as code generated manually (i.e., programmed) to perform one or more operations disclosed in the present disclosure.
[0061] The at least one processor 404 may be disposed in communication with one or more I/O devices, such as the user devices 170, via the I/O interface 406. The I/O interface 406 may employ communication technologies such as Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System for Mobile Communications (GSM), Long-Term Evolution (LTE), WiMAX, or the like.
[0062] In an embodiment, the at least one processor 404 may be disposed in communication with a communication network via a network interface. In an embodiment, the network interface may be the I/O interface 406. The network interface may connect to the communication network to enable connection of the command server 103 with the outside environment and/or external devices/systems. The network interface may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. The communication network may include, without limitation, a direct interconnection, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol (WAP)), the Internet, etc. Using the network interface and the communication network, the command server 103 may communicate with other devices.
[0063] In an embodiment, the processor 404 may be configured to perform the functions of the command server 103, as described throughout the disclosure.
[0064] Furthermore, the memory 402 may include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as Static Random-Access Memory (SRAM) and Dynamic Random-Access Memory (DRAM), and/or non-volatile memory, such as Read-Only Memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0065] The memory 402 is communicatively coupled with the processor 404 to store bitstreams or processing instructions for completing the process. Further, the memory 402 may include an operating system 410 for performing one or more tasks of the command server 103, as performed by a generic operating system in the communications domain or the standalone device. In an embodiment, the memory 402 may comprise a database 412 configured to store the information as required by the processor 404 to perform one or more functions for generating the road incident information, as discussed throughout the disclosure.
[0066] The memory 402 may be operable to store instructions executable by the processor 404. The functions, acts, or tasks illustrated in the figures or described may be performed by the processor 404 for executing the instructions stored in the memory 402. The functions, acts, or tasks are independent of the particular type of instruction set, storage media, processor, or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro-code, and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing, and the like.
[0067] For the sake of brevity, the architecture, and standard operations of the memory 402 and the processor 404 are not discussed in detail. In one embodiment, the memory 402 may be configured to store the information as required by the processor 404 to perform the methods described herein.
[0068] In an embodiment, the command server 103 may receive road incident data associated with a road incident from a first edge device, captured at a first time instance, and from one or more other edge devices at corresponding one or more subsequent time instances. For example, the command server 103 may receive the road incident data from the edge device 101A captured at the first time instance, such as 10 PM. The command server 103 may also receive the road incident data from the edge devices 101B and 101C at 10.05 PM and 10.10 PM, respectively. In an embodiment, the road incident data may include, but is not limited to, at least one road incident parameter associated with the at least one object involved in the road incident, time of incident (ToI) of the road incident, and time of existence (ToE) of the at least one object at each of the plurality of edge devices. The ToE may refer to a time at which the at least one object was captured at the corresponding edge device. Further, the at least one road incident parameter may include, but is not limited to, identification of the at least one object, direction of the at least one object, a location of the road incident, and a type of the road incident. For example, the road incident data may define that a hit-and-run incident took place at 10 PM at an “XYZ” place. The incident involved a white car of an “ABC” model with a license plate “AAAAAAA”. The car was coming from the north direction towards the “XYZ” place. The road incident data may also indicate that the car was captured at the edge device 101B at 9.55 PM and captured at the edge device 101C at 10.05 PM.
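The hit-and-run example above can be represented as a simple record. This is a minimal sketch assuming a plain data container; the field names are illustrative and are not the disclosed schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class RoadIncidentData:
    """Illustrative container for the road incident data described above."""
    object_identification: str  # e.g. model, colour, license plate
    object_direction: str
    incident_location: str
    incident_type: str
    toi: datetime                            # time of incident (ToI)
    toe: dict = field(default_factory=dict)  # edge device id -> ToE

# The hit-and-run example from the description.
data = RoadIncidentData(
    object_identification="ABC model, white, plate AAAAAAA",
    object_direction="from north",
    incident_location="XYZ place",
    incident_type="hit-and-run",
    toi=datetime(2024, 1, 1, 22, 0),
    toe={"101B": datetime(2024, 1, 1, 21, 55),
         "101C": datetime(2024, 1, 1, 22, 5)},
)
```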
[0069] Further, the command server 103 may generate the road incident information based on the received road incident data. In an embodiment, the command server 103 may analyze the received road incident data to generate the road incident information. For example, the command server 103 may create a memory link of the road incident using the road incident data and may backtrack the road incident using the memory link. In an embodiment, the road incident information may include, but is not limited to, an incident number, a sequence of edge devices involved in the road incident, time of existence (ToE) of at least one object involved in the road incident, and mapped real-time video data corresponding to the road incident. The mapped real-time video data may include mapped video frames corresponding to the road incident in the real-time video data.
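One way to read the memory link described above is as an ordering of edge device sightings by ToE, which then supports backtracking by walking the sequence in reverse. The sketch below is an assumption about one possible realization; the function name and returned structure are illustrative, not disclosed.

```python
def build_memory_link(incident_number, sightings):
    """Order sightings by time of existence (ToE) to reconstruct the
    sequence of edge devices the object passed.

    sightings: iterable of (edge_device_id, toe) pairs, where toe is
    any comparable timestamp representation.
    """
    ordered = sorted(sightings, key=lambda pair: pair[1])
    return {
        "incident_number": incident_number,
        "edge_sequence": [device for device, _ in ordered],
        "toe": dict(ordered),
    }

# ToE values as same-day HH:MM strings, which compare chronologically.
link = build_memory_link(
    "INC-001",
    [("101C", "22:05"), ("101B", "21:55"), ("101A", "22:00")],
)
# Backtracking the incident is simply the reverse of the sequence.
backtrack = list(reversed(link["edge_sequence"]))
```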
[0070] In a further embodiment, the command server 103 may provide a notification of the road incident to one or more authorities, such as emergency services, law and order authorities, etc.
[0071] FIGS. 5A-5B illustrate flowcharts depicting a method 500 for monitoring and relaying data associated with the road incidents, according to an embodiment of the present disclosure. The method 500 may be performed by any of the edge devices 101. In another embodiment, the method 500 may be performed by the processor 204 of the edge device 101A.
[0072] At step 502, the method 500 may include capturing, by the edge device 101A, real-time video data associated with the one or more objects on the road.
[0073] At step 504, the method 500 may include detecting, by the at least one AI model of the edge device 101A, the first road incident involving the at least one first object among the one or more objects using the captured real-time video data.
[0074] At step 506, the method 500 may include deriving, by the edge device 101A, the at least one first road incident parameter associated with the at least one first object and the first ToI of the first road incident.
[0075] At step 508, the method 500 may include relaying, by the edge device 101A, the derived at least one first road incident parameter and the first ToI to one or more neighbouring edge devices and a command server 103 connected to the edge device 101A.
[0076] Further, in an embodiment, the edge device 101A may receive, from at least one neighbouring edge device among the one or more neighbouring edge devices, the at least one second road incident parameter and the second ToI associated with a second road incident. Accordingly, at step 510, the method 500 may include detecting the existence of the at least one second object associated with the second road incident. The existence of the at least one second object may be detected by analyzing the captured real-time video data within the predefined time period. Thereafter, at step 512, the method 500 may include relaying the at least one second road incident parameter and the second ToI to a first neighbouring edge device among the one or more neighbouring edge devices based on the detection of the existence of the at least one second object. Then, at step 514, the method 500 may include, in response to detecting the existence of the at least one second object, relaying the at least one second road incident parameter, the second ToI, and the confirmation of the existence of the at least one second object to the command server 103.
[0077] While the above-discussed steps in FIGS. 5A-5B are shown and described in a particular sequence, the steps may occur in variations to the sequence in accordance with various embodiments. Further, a detailed description related to the various steps of FIGS. 5A-5B is already covered in the description related to FIGS. 1-3 and is omitted herein for the sake of brevity.
[0078] FIG. 6 illustrates a flowchart depicting a method 600 for generating the road incident information, according to an embodiment of the present disclosure. The method 600 may be performed by the command server 103. In another embodiment, the method 600 may be performed by the processor 404 of the command server 103.
[0079] At step 602, the method 600 may include receiving road incident data associated with a road incident from a first edge device, captured at a first time instance, and one or more other edge devices at corresponding one or more subsequent time instances.
[0080] At step 604, the method 600 may include generating the road incident information comprising an incident number, a sequence of edge devices involved in the road incident, time of existence (ToE) of at least one object involved in the road incident, and mapped real-time video data corresponding to the road incident by analyzing the received road incident data.
[0081] While the above-discussed steps in FIG. 6 are shown and described in a particular sequence, the steps may occur in variations to the sequence in accordance with various embodiments. Further, a detailed description related to the various steps of FIG. 6 is already covered in the description related to FIGS. 1 and 3-4 and is omitted herein for the sake of brevity.
[0082] Accordingly, the present disclosure provides various advantages. For example, the disclosed techniques help in enhancing the speed and accuracy of the road incident analysis. Further, the disclosed techniques provide a more holistic and integrated view of the road incidents as the road incidents unfold across different surveillance areas. The disclosed techniques also automate the management, processing, and analysis of video data, minimizing manual intervention and reducing inefficiencies. The disclosed techniques also help in improving both the effectiveness and reliability of road surveillance systems.
[0083] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[0084] The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein.
[0085] Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.
[0086] Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component of any or all the claims.
CLAIMS:
1. An edge device (101A) for monitoring and relaying data associated with road incidents, the edge device (101A) comprising:
a camera module configured to capture real-time video data associated with one or more objects on a road;
a processor coupled to the camera module and configured to:
detect, by at least one artificial intelligence (AI) model, a first road incident involving at least one first object among the one or more objects using the captured real-time video data;
derive at least one first road incident parameter associated with the at least one first object and a first time of incident (ToI) of the first road incident; and
relay the derived at least one first road incident parameter and the first ToI to one or more neighbouring edge devices and a command server (103) connected to the edge device;
wherein, when the edge device (101A) receives, from at least one neighbouring edge device among the one or more neighbouring edge devices, at least one second road incident parameter and a second ToI associated with a second road incident, the processor is further configured to:
detect an existence of at least one second object associated with the second road incident by analyzing the captured real-time video data within a predefined time period;
relay the at least one second road incident parameter and the second ToI to a first neighbouring edge device among the one or more neighbouring edge devices based on the detection of the existence of the at least one second object; and
in response to detecting an existence of the at least one second object, relay the at least one second road incident parameter, the second ToI, and a confirmation of the existence of the at least one second object to the command server (103).
2. The edge device (101A) as claimed in claim 1, wherein the processor is further configured to:
map the real-time video data corresponding to the first road incident and the second road incident; and
forward the mapped real-time video data to the command server (103).
3. The edge device (101A) as claimed in claim 1, wherein the at least one first road incident parameter includes identification of the at least one first object, direction of the at least one first object, a location of the first road incident, and a type of the first road incident; and
wherein the at least one second road incident parameter includes an identification of the at least one second object, direction of the at least one second object, a location of the second road incident, and a type of the second road incident.
4. The edge device (101A) as claimed in claim 1, wherein the predefined time period corresponds to a time period prior to the second ToI, when the at least one second object is captured by the at least one neighbouring edge device subsequent to the edge device; and
wherein the predefined time period corresponds to a time period subsequent to the second ToI, when the at least one second object is captured by the at least one neighbouring edge device prior to the edge device.
5. The edge device (101A) as claimed in claim 1, wherein the processor is further configured to:
receive a confirmation of existence of the at least one first object from the one or more neighbouring edge devices in response to relaying the derived at least one first road incident parameter and the first ToI.
6. A command server (103) for generating road incident information, the command server (103) comprising:
a memory; and
a processor coupled to the memory and configured to:
receive road incident data associated with a road incident from a first edge device, captured at a first time instance, and one or more other edge devices at corresponding one or more subsequent time instances; and
generate the road incident information comprising an incident number, a sequence of edge devices involved in the road incident, time of existence (ToE) of at least one object involved in the road incident, and mapped real-time video data corresponding to the road incident by analyzing the received road incident data.
7. The command server (103) as claimed in claim 6, wherein the road incident data includes at least one road incident parameter associated with the at least one object involved in the road incident, time of incident (ToI) of the road incident, and time of existence (ToE) of the at least one object at each of the plurality of edge devices; and
wherein the at least one road incident parameter includes identification of the at least one object, direction of the at least one object, a location of the road incident, and a type of the road incident.
8. The command server (103) as claimed in claim 6, wherein the processor is further configured to:
provide a notification of the road incident to one or more authorities.
9. A method (500) for monitoring and relaying data associated with road incidents, the method (500) comprising:
capturing (502), by an edge device, real-time video data associated with one or more objects on a road;
detecting (504), by at least one artificial intelligence (AI) model of the edge device, a first road incident involving at least one first object among the one or more objects using the captured real-time video data;
deriving (506), by the edge device, at least one first road incident parameter associated with the at least one first object and a first time of incident (ToI) of the first road incident; and
relaying (508), by the edge device, the derived at least one first road incident parameter and the first ToI to one or more neighbouring edge devices and a command server (103) connected to the edge device;
wherein, when the edge device receives, from at least one neighbouring edge device among the one or more neighbouring edge devices, at least one second road incident parameter and a second ToI associated with a second road incident, the method (500) further comprising:
detecting (510) an existence of at least one second object associated with the second road incident by analyzing the captured real-time video data within a predefined time period;
relaying (512) the at least one second road incident parameter and the second ToI to a first neighbouring edge device among the one or more neighbouring edge devices based on the detection of the existence of the at least one second object; and
in response to detecting an existence of the at least one second object, relaying (514) the at least one second road incident parameter, the second ToI, and a confirmation of the existence of the at least one second object to the command server (103).
10. The method (500) as claimed in claim 9, the method (500) further comprising:
mapping the real-time video data corresponding to the first road incident and the second road incident; and
forwarding the mapped real-time video data to the command server (103).
11. The method (500) as claimed in claim 9, wherein the at least one first road incident parameter includes identification of the at least one first object, direction of the at least one first object, a location of the first road incident, and a type of the first road incident; and
wherein the at least one second road incident parameter includes an identification of the at least one second object, direction of the at least one second object, a location of the second road incident, and a type of the second road incident.
12. The method (500) as claimed in claim 9, wherein the predefined time period corresponds to a time period prior to the second ToI, when the at least one second object is captured by the at least one neighbouring edge device subsequent to the edge device; and
wherein the predefined time period corresponds to a time period subsequent to the second ToI, when the at least one second object is captured by the at least one neighbouring edge device prior to the edge device.
13. The method (500) as claimed in claim 9, the method (500) further comprising:
receiving a confirmation of existence of the at least one first object from the one or more neighbouring edge devices in response to relaying the derived at least one first road incident parameter and the first ToI.
14. A method (600) for generating road incident information, the method (600) comprising:
receiving (602) road incident data associated with a road incident from a first edge device, captured at a first time instance, and one or more other edge devices at corresponding one or more subsequent time instances; and
generating (604) the road incident information comprising an incident number, a sequence of edge devices involved in the road incident, time of existence (ToE) of at least one object involved in the road incident, and mapped real-time video data corresponding to the road incident by analyzing the received road incident data.
15. The method (600) as claimed in claim 14, wherein the road incident data includes at least one road incident parameter associated with the at least one object involved in the road incident, time of incident (ToI) of the road incident, and time of existence (ToE) of the at least one object at each of the plurality of edge devices; and
wherein the at least one road incident parameter includes identification of the at least one object, direction of the at least one object, a location of the road incident, and a type of the road incident.
16. The method (600) as claimed in claim 14, the method (600) further comprising:
providing a notification of the road incident to one or more authorities.
| # | Name | Date |
|---|---|---|
| 1 | 202441025694-PROVISIONAL SPECIFICATION [28-03-2024(online)].pdf | 2024-03-28 |
| 2 | 202441025694-FORM 1 [28-03-2024(online)].pdf | 2024-03-28 |
| 3 | 202441025694-DRAWINGS [28-03-2024(online)].pdf | 2024-03-28 |
| 4 | 202441025694-FORM-26 [07-06-2024(online)].pdf | 2024-06-07 |
| 5 | 202441025694-Proof of Right [16-09-2024(online)].pdf | 2024-09-16 |
| 6 | 202441025694-RELEVANT DOCUMENTS [04-10-2024(online)].pdf | 2024-10-04 |
| 7 | 202441025694-POA [04-10-2024(online)].pdf | 2024-10-04 |
| 8 | 202441025694-FORM 13 [04-10-2024(online)].pdf | 2024-10-04 |
| 9 | 202441025694-Response to office action [01-11-2024(online)].pdf | 2024-11-01 |
| 10 | 202441025694-DRAWING [18-03-2025(online)].pdf | 2025-03-18 |
| 11 | 202441025694-CORRESPONDENCE-OTHERS [18-03-2025(online)].pdf | 2025-03-18 |
| 12 | 202441025694-COMPLETE SPECIFICATION [18-03-2025(online)].pdf | 2025-03-18 |