
System And Method For Automatic Vehicle Tracking From Incidents And Selected Routes

Abstract: The present disclosure relates to a system (100) to detect a vehicle of interest. The system includes a processing server (104) loaded with a tracking program and a plurality of edge devices (108) coupled to the processing server, which loads multiple incidents into the plurality of edge devices (108) to identify the vehicle of interest. The multiple incidents are a plurality of machine-learning models loaded onto the plurality of edge devices (108) from a command centre (102). The command centre selects the route of the vehicle of interest; the system acquires video of the vehicle of interest, processes the video feed, tracks the vehicle of interest, and communicates with the image-capturing unit upon detection of a deviation in the route, wherein the route is selected automatically to enable the image-capturing units in the path for vehicle tracking.


Patent Information

Application #
Filing Date
29 September 2023
Publication Number
14/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

Bharat Electronics Limited
Corporate Office, Outer Ring Road, Nagavara, Bangalore - 560045, Karnataka, India.

Inventors

1. VASUDEVA RAO PRASADULA
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
2. VENKATESWARLU NAIK B
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
3. SUMIT KUMAR
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
4. JOYDEV GHOSH
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
5. SHIVAKUMAR MURUGESH
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.

Specification

Description:
TECHNICAL FIELD
[0001] The present disclosure relates, in general, to an automatic vehicle selection and tracking system, and more specifically, relates to a system and method for automatic vehicle tracking from incidents and selected routes.

BACKGROUND
[0002] City surveillance mainly depends on constant monitoring and manual intervention. Tracking lost vehicles, perimeter security, escape-route estimation, and the like are areas where constant monitoring and intervention are required. To provide the framework and automate this process, modern technologies such as edge computing, machine learning, computer vision, and data analytics can be used to build the system.
[0003] Therefore, it is desired to overcome the drawbacks, shortcomings, and limitations associated with existing solutions, and develop a system that reduces manual intervention and increases the efficiency of the surveillance system and response time.

OBJECTS OF THE PRESENT DISCLOSURE
[0004] An object of the present disclosure relates, in general, to an automatic vehicle selection and tracking system, and more specifically, relates to a system and method for automatic vehicle tracking from incidents and selected routes.
[0005] Another object of the present disclosure is to provide a system that tracks based on multiple ML models of incidents/scenarios loaded to edge devices.
[0006] Another object of the present disclosure is to provide a system that performs an automatic selection of the route of the vehicle that may travel, thus enabling the edge devices in the route for automatic tracking of selected objects/vehicles.
[0007] Another object of the present disclosure is to provide a system that passes the selected object/vehicle information by the edge device to other edge devices in the selected route.
[0008] Another object of the present disclosure is to provide a system that performs dynamic changeover in selected routes and corresponding display camera changes on the display console.
[0009] Yet another object of the present disclosure is to provide a system that finds the direction of the vehicle based on angle of deviation and distance between the vehicle and edge device.

SUMMARY
[0010] The present disclosure relates to an automatic vehicle selection and tracking system, and more specifically, to a system and method for automatic vehicle tracking from incidents and selected routes. The main objective of the present disclosure is to overcome the drawbacks, limitations, and shortcomings of existing systems and solutions by providing a system and method to detect and track vehicles of interest using edge devices with cameras placed across the routes in a city, mapping the edge devices located on a route to the route number for faster tracking of the vehicle, and automatically selecting the navigation route by processing the video feeds of the cameras in the path of the route.
[0011] The system includes a computing server loaded with a tracking program, a database system to store and retrieve tracking information, multiple edge devices deployed across a route, a command center, and a display console. Various scenarios are selected to identify the vehicle of interest. These scenarios are categorized as incidents that occur in the daily commute. A scenario is loaded in the edge devices to automatically select a route. These scenarios are modeled as a plurality of machine-learning algorithms and loaded onto the edge devices for detecting vehicles with respect to incidents.
[0012] The present disclosure relates to a system to detect a vehicle of interest. The system includes a processing server loaded with a tracking program and a plurality of edge devices coupled to the processing server and deployed across a route. The processing server is configured to load multiple incidents in the plurality of edge devices to identify the vehicle of interest, wherein the multiple incidents are a plurality of machine-learning models loaded onto the plurality of edge devices from a command centre through the processing server. The command centre selects the route of the vehicle of interest upon detection of the loaded incidents from the plurality of edge devices. The processing server acquires video of the vehicle of interest from an image-capturing unit provided in the plurality of edge devices, processes the video feed, tracks the vehicle of interest, and communicates with the image-capturing unit upon detection of a deviation in the route, wherein the route is selected automatically to enable the image-capturing unit in the path for vehicle tracking.
[0013] The multiple incidents pertain to accidents, hit-and-run cases, and vehicle features. The database is adapted to store and retrieve tracking information of the vehicle of interest. The display console is capable of showing multiple video feeds. The plurality of edge devices consists of the image-capturing unit and a graphics processing unit (GPU), capable of being loaded with multiple incidents to identify the vehicle of interest and of communicating with the processing server and other edge devices. The vehicle is selected automatically for tracking based on parameters defined as violations or rules/policies of a scenario, and the plurality of edge devices run a mathematical module to predict the direction in which the vehicle of interest turns, enabling a quick response from the edge devices in the corresponding direction of the vehicle route and from the command center.
[0014] The automatic selection of the route enables the image-capturing unit in the path of the route for video tracking by streaming the live video feed of the image-capturing unit selected in the route to the display console and changing the image-capturing-unit feed based on the route change. The route change occurs whenever the plurality of edge devices detects a deviation from the selected route; the deviation of the vehicle of interest is detected by other edge devices on the deviated route and is conveyed to the edge devices on the new route either by the edge device that first detected the deviation or by the processing server.
[0015] The vehicle features and models of the incidents are loaded onto the plurality of edge devices from the processing server, wherein the vehicle features comprise parameters that define the vehicle in a video frame. The parameters are the inputs to the corresponding machine-learning model that detects the vehicle of interest, and user-defined scenarios are defined based on vehicle classification parameters, the vehicle number, or any combination thereof.
[0016] The processing server is configured to predict the direction in which the vehicle of interest turns using the angle of deviation, for an image-capturing unit situated in line with the vehicle of interest. For an image-capturing unit situated sidewise to the vehicle of interest, the processing server uses the change in the distance between the image-capturing unit and the vehicle of interest over consecutive time instants, and passes the direction information to the command center and the corresponding edge devices located in the route of the derived direction.
[0017] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF THE DRAWINGS
[0018] The following drawings form part of the present specification and are included to further illustrate aspects of the present disclosure. The disclosure may be better understood by reference to the drawings in combination with the detailed description of the specific embodiments presented herein.
[0019] FIG. 1A illustrates an exemplary automatic vehicle selection and tracking system, in accordance with an embodiment of the present disclosure.
[0020] FIG. 1B illustrates an exemplary command center, in accordance with an embodiment of the present disclosure.
[0021] FIG. 1C illustrates an exemplary processing server, in accordance with an embodiment of the present disclosure.
[0022] FIG. 1D illustrates an exemplary edge device, in accordance with an embodiment of the present disclosure.
[0023] FIG. 1E illustrates an exemplary display console, in accordance with an embodiment of the present disclosure.
[0024] FIG. 2 illustrates an exemplary selected route for a scenario with the vehicle of interest, in accordance with an embodiment of the present disclosure.
[0025] FIG. 3 illustrates an exemplary deviation selected route for the vehicle of interest, in accordance with an embodiment of the present disclosure.
[0026] FIG. 4 illustrates an exemplary flow chart of a method of vehicle tracking, and FIG. 5 illustrates an exemplary flow chart of a method of automatic vehicle selection and tracking, in accordance with embodiments of the present disclosure.

DETAILED DESCRIPTION
[0027] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0028] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0029] The present disclosure relates to an automatic vehicle selection and tracking system, and more specifically, to a system and method for automatic vehicle tracking from incidents and selected routes.
[0030] The proposed system of the present disclosure overcomes the drawbacks, shortcomings, and limitations associated with conventional systems by providing a system and method to detect and track vehicles of interest using edge devices with cameras placed across the routes in a city, mapping the edge devices located on a route to the route number for faster tracking of the vehicle, and automatically selecting the navigation route by processing the video feeds of the cameras in the path of the route. The system includes a computing server loaded with a tracking program, a database system to store and retrieve tracking information, multiple edge devices deployed across a route, a command center, and a display console. Various scenarios are selected to identify the vehicle of interest. These scenarios are categorized as incidents that occur in the daily commute. A scenario is loaded in the edge devices to automatically select a route. These scenarios are modeled as a plurality of machine-learning algorithms and loaded onto the edge devices for detecting vehicles with respect to incidents. The present disclosure is described in enabling detail in the following examples, which may represent more than one embodiment of the present disclosure.
[0031] The system detects the vehicle of interest and includes the processing server loaded with a tracking program and the plurality of edge devices coupled to the processing server and deployed across a route. The processing server is configured to: load multiple incidents in the plurality of edge devices to identify the vehicle of interest, wherein the multiple incidents are the plurality of machine-learning models loaded onto the plurality of edge devices from a command centre through the processing server; select, by the command centre, the route of the vehicle of interest upon detection of the loaded incidents from the plurality of edge devices; acquire video of the vehicle of interest from an image-capturing unit provided in the plurality of edge devices; process the video feed and track the vehicle of interest; and communicate with the image-capturing unit upon detection of a deviation in the route, wherein the route is selected automatically to enable the image-capturing unit in the path for vehicle tracking.
[0032] In an aspect, the multiple incidents pertain to accidents, hit-and-run cases, and vehicle features. The database is adapted to store and retrieve tracking information of the vehicle of interest. The display console is capable of showing multiple video feeds. The plurality of edge devices comprises the image-capturing unit and a graphics processing unit (GPU), capable of being loaded with multiple incidents to identify the vehicle of interest and of communicating with the processing server and other edge devices.
[0033] In another aspect, the processing server is configured to perform an automatic selection of the vehicle of interest for tracking based on parameters defined as violations or rules/policies of a scenario, and the plurality of edge devices run a mathematical module to predict the direction in which the vehicle of interest turns, enabling a quick response from the edge devices in the corresponding direction of the vehicle route and from the command center. The automatic selection of the route enables the image-capturing unit in the path of the route for video tracking by streaming the live video feed of the image-capturing unit selected in the route to the display console and changing the image-capturing-unit feed based on the route change, wherein the route change occurs whenever the plurality of edge devices detects a deviation from the selected route; the deviation of the vehicle of interest is detected by other edge devices on the deviated route; and the deviation of the vehicle of interest is conveyed to other edge devices on the new route either by the edge device that first detected the deviation or by the processing server.
[0034] In another aspect, the vehicle features and models of the incidents are loaded onto the plurality of edge devices from the processing server, wherein the vehicle features comprise parameters that define the vehicle in a video frame. The parameters are the inputs to the corresponding machine-learning model that detects the vehicle of interest, and user-defined scenarios are defined based on vehicle classification parameters, the vehicle number, or any combination thereof.
[0035] The processing server is configured to predict the direction in which the vehicle of interest turns using the angle of deviation for an image-capturing unit situated in line with the vehicle of interest, or using the change in the distance between the image-capturing unit and the vehicle of interest over consecutive time instants for an image-capturing unit situated sidewise to the vehicle of interest; and to pass the direction information to the command center and the corresponding edge devices located in the route of the derived direction.
[0036] The advantages achieved by the system of the present disclosure are clear from the embodiments provided herein. The system tracks based on multiple ML models of incidents/scenarios loaded to the edge devices and performs an automatic selection of the route the vehicle may travel, thus enabling the edge devices on the route for automatic tracking of selected objects/vehicles. The system passes the selected object/vehicle information from the detecting edge device to other edge devices on the selected route.
[0037] The system performs a dynamic changeover in selected routes with corresponding display camera changes on the display console. Further, the system finds the direction of the vehicle based on the angle of deviation and the distance between the vehicle and the edge device. The description of terms and features related to the present disclosure shall be clear from the embodiments that are illustrated and described; however, the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents of the embodiments are possible within the scope of the present disclosure. Additionally, the invention can include other embodiments that are within the scope of the claims but are not described in detail with respect to the following description.
[0038] FIG. 1A illustrates an exemplary automatic vehicle selection and tracking system, in accordance with an embodiment of the present disclosure.
[0039] Referring to FIG. 1A, an automatic vehicle selection and tracking system 100 is disclosed. The system 100 can include a command center 102, a processing server 104, a database 106, a network of edge devices 108 with image-capturing units (e.g., cameras), and a display console 110. The processing server 104 is loaded with the tracking program; the database 106 stores and retrieves tracking information; and the multiple edge devices 108 are deployed across a route together with the command center 102 and the display console 110. Each edge device 108 can include an image-capturing unit and a graphics processing unit (GPU), capable of being loaded with multiple scenarios or incidents to identify the vehicle of interest and of communicating with the computing server 104 (also referred to as the processing server 104 herein) and other edge devices. The display console 110 is a device capable of showing multiple video feeds.
[0040] Based on the vehicle parameters, the vehicle tracking scenario is sent to the processing server 104. The processing server 104 in turn is connected with the command centre 102, the network of edge devices 108, the database 106, and the display console 110. The database contains the records of the mapping of cameras on the route and the geo-locations of the edge devices mapped on the route.
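The route-to-camera mapping described above can be sketched as a minimal data structure. This is an illustrative assumption, not the disclosure's actual schema: the names `ROUTE_MAP` and `edge_devices_on_route`, the route number, and the coordinates are all hypothetical.

```python
# Hypothetical sketch of the database records mapping a route number to the
# edge devices (and their geo-locations) installed along that route.
ROUTE_MAP = {
    # route_number: ordered list of (edge_device_id, (latitude, longitude))
    7: [("edge-01", (13.0450, 77.5450)),
        ("edge-02", (13.0480, 77.5500)),
        ("edge-03", (13.0510, 77.5555))],
}

def edge_devices_on_route(route_number):
    """Return the edge-device IDs mapped to a route, in travel order."""
    return [device_id for device_id, _ in ROUTE_MAP.get(route_number, [])]

print(edge_devices_on_route(7))  # ['edge-01', 'edge-02', 'edge-03']
```

Keeping the list ordered by travel direction lets the server enable cameras ahead of the vehicle one by one as it progresses along the route.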
[0041] The selected route is the path the vehicle may travel, selected automatically by the command centre 102 using the vehicle information received from the edge devices 108. The route of the vehicle can be selected by the command centre 102 when an edge device 108 first detects the vehicle of interest from the loaded incidents.
[0042] The scenarios are based on the automatic selection of vehicle features derived from the plurality of vehicle detection algorithms. These features can be loaded by the user or operator using the GUI, which can be operated at the command center 102. Incidents considered for the scenario are defined based on accidents and hit-and-run cases. User-defined scenarios are defined based on vehicle classification parameters and the vehicle number. The identification parameters for accidents and hit-and-run cases are trained by the plurality of machine-learning models.
[0043] The vehicle features consist of parameters that define the vehicle in a video frame. Incidents such as accidents and hit-and-run cases are identified based on vehicle features such as the colour of the vehicle, the model of the vehicle, and the vehicle number. The occurrence of these incidents is identified and detected using the plurality of machine-learning models. Vehicle features or characteristics and the models of a scenario are loaded to the edge devices from the command centre. These features are sent to the edge devices as the parameters for identifying a vehicle of interest.
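How an edge device might apply the pushed vehicle-feature parameters to a detection can be sketched as below. This is a minimal sketch under assumed names (`matches_features`, the feature keys, and the example values are not from the disclosure); in practice the comparison would come out of the machine-learning models rather than exact string equality.

```python
# Hypothetical sketch: compare a detected vehicle's attributes against the
# feature parameters (colour, model, registration number) loaded from the
# command centre. Every name and value here is an illustrative assumption.
def matches_features(detection, features):
    """True if every loaded feature is present and equal in the detection."""
    return all(detection.get(key) == value for key, value in features.items())

loaded = {"colour": "white", "model": "sedan", "number": "KA01AB1234"}
seen = {"colour": "white", "model": "sedan", "number": "KA01AB1234", "speed": 42}
print(matches_features(seen, loaded))  # True
```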
[0044] The processing server 104 has control over the scenario and gets the feed from the edge devices 108. It processes the scenario and builds the route for tracking the vehicle. The vehicle tracking is done with the plurality of machine-learning models. The built scenario in the processing server 104 is fed to the edge devices 108 and establishes the route for vehicle tracking using the vehicle features. This enables the automatic selection of the route. Automatic selection of the route enables the cameras in the path of the route for vehicle tracking by streaming the live video feed of the cameras selected in the route to the display console. A deviation in the route is identified at the processing server 104 using the camera feeds received from the edge devices. A route change happens whenever an edge device detects a deviation from the selected route.
[0045] FIG. 1B illustrates an exemplary command center, in accordance with an embodiment of the present disclosure. The command center 102 can include push parameters and models. The vehicle features can be loaded by the user or operator using the GUI which can be operated at the command center.
[0046] FIG. 1C illustrates an exemplary processing server, in accordance with an embodiment of the present disclosure. The processing server 104 can include edge communication, a tracking module, a data module, a display handler, and edge-device management. The processing server 104 has control over the scenario and gets the feed from the edge devices. It processes the scenario and builds the route for tracking the vehicle. The vehicle tracking is done with the plurality of machine-learning models. The built scenario in the processing server 104 is fed to the edge devices 108 and establishes the route for vehicle tracking using the vehicle features. This enables the automatic selection of the route, which in turn enables the cameras in the path of the route for vehicle tracking by streaming the live video feed of the cameras selected in the route to the display console. A deviation in the route is identified at the processing server using the camera feeds received from the edge devices. A route change happens whenever an edge device detects a deviation from the selected route.
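The server-side deviation check described above can be sketched as follows. All names are assumptions: the disclosure says only that a deviation is identified when the vehicle is seen off the selected route, and that the route is then updated.

```python
# Hypothetical sketch: the processing server flags a route deviation when a
# sighting is reported by an edge device that is not on the selected route,
# and picks the route containing that device as the new route (assumption).
def check_deviation(selected_route, reporting_device, route_map):
    """Return the updated route number if the sighting is off-route, else None."""
    if reporting_device in route_map[selected_route]:
        return None  # vehicle is still on the selected route
    for route, devices in route_map.items():
        if reporting_device in devices:
            return route  # first route containing the reporting device
    return None  # device not mapped to any known route

routes = {7: ["edge-01", "edge-02"], 9: ["edge-05", "edge-06"]}
print(check_deviation(7, "edge-05", routes))  # 9
```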
[0047] FIG. 1D illustrates an exemplary edge device, in accordance with an embodiment of the present disclosure. The edge device 108 can include an image-capturing unit, a GPU, the loaded models, and network connectivity hardware. The object/vehicle selection for tracking is based on the multiple ML models of incidents/scenarios loaded to the edge devices.
[0048] FIG. 1E illustrates an exemplary display console, in accordance with an embodiment of the present disclosure. A dynamic changeover in the selected route produces corresponding display camera changes on the display console 110. The camera feed may be changed in the display console 110 based on a route change. The display console displays multiple video feeds for the loaded scenarios.
[0049] Thus, the present invention overcomes the drawbacks, shortcomings, and limitations associated with existing solutions, and provides a system that tracks based on multiple ML models of incidents/scenarios loaded to the edge devices and performs an automatic selection of the route the vehicle may travel, thus enabling the edge devices on the route for automatic tracking of selected objects/vehicles. The system passes the selected object/vehicle information from the detecting edge device to other edge devices on the selected route.
[0050] The system performs a dynamic changeover in selected routes with corresponding display camera changes on the display console. Further, the system finds the direction of the vehicle based on the angle of deviation and the distance between the vehicle and the edge device. The description of terms and features related to the present disclosure shall be clear from the embodiments that are illustrated and described; however, the invention is not limited to these embodiments only.
[0051] FIG. 2 illustrates an exemplary selected route for a scenario with the vehicle of interest, in accordance with an embodiment of the present disclosure.
[0052] The direction of route change is determined using the following algorithm:
[0053] If the edge device sensor faces in line with the vehicle, then the angle of deflection from the line of sight can determine the probable direction in which the vehicle may turn. If the angle of deflection is positive, the vehicle may turn to its left; otherwise, to its right.
[0054] If the edge device sensor is sidewise to the vehicle, then the distance between the vehicle and the edge device sensor at two consecutive times can determine the probable direction in which the vehicle may turn. If the edge device sensor is situated to the left of the vehicle, then a reduction in the distance between the edge device sensor and the vehicle suggests a probable turn to the left; if the distance is increasing, a turn to the right; and if the distance is constant, the vehicle is presumably moving straight.
[0055] FIG. 3 illustrates an exemplary deviation selected route for vehicle of interest, in accordance with an embodiment of the present disclosure.
[0056] As depicted in FIG. 3, if the edge device sensor is to the right side of the vehicle, then a reduction in the distance between the edge device sensor and the vehicle suggests a probable turn to the right; if the distance is increasing, a turn to the left; and if the distance is constant, the vehicle is presumably moving straight. The deviation of the vehicle is conveyed to the other edge devices on the new route either by the edge device that first detected the deviation or by the processing server. When a deviation exists, the route is updated accordingly in the edge devices. The camera feed may be changed in the display console based on the route change. The display console displays the multiple video feeds for the loaded scenarios.
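The turn-direction rules of paragraphs [0053], [0054], and [0056] can be collected into a short sketch. The function names and the treatment of the zero/constant cases are assumptions; the disclosure states only the sign and distance rules.

```python
# Minimal sketch of the direction-of-turn rules described above.
def direction_in_line(angle_of_deflection):
    """Sensor facing in line with the vehicle: positive angle => left turn."""
    if angle_of_deflection > 0:
        return "left"
    if angle_of_deflection < 0:
        return "right"
    return "straight"  # zero deflection treated as straight (assumption)

def direction_sidewise(sensor_side, distance_before, distance_after):
    """Sensor sidewise to the vehicle: shrinking distance means a turn
    toward the sensor's side; growing distance means the opposite side."""
    if distance_after < distance_before:
        return sensor_side  # turning toward the sensor ('left' or 'right')
    if distance_after > distance_before:
        return "right" if sensor_side == "left" else "left"
    return "straight"  # constant distance => presumably moving straight

print(direction_in_line(12.5))                  # left
print(direction_sidewise("left", 40.0, 35.0))   # left
print(direction_sidewise("right", 40.0, 48.0))  # left
```

The predicted direction would then be passed to the command center and to the edge devices on the route in that direction, as described in paragraph [0035].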
[0057] Dead zones are identified in a route, and drone cameras are deployed in the dead zones of the selected route. A dead zone is an area where no edge device is installed to capture video of an area in the path of the route. Dead zones in a route are identified, and the dead-zone information of a route is saved in the database. Dynamic updating of the dead zones of a route, in case of a camera failure identified by a plurality of methods, can be inducted into the system. During operation, deploying drone cameras in the dead zones of a selected route by a plurality of methods can be explored as a futuristic scenario.
[0058] FIG. 4 illustrates an exemplary flow chart of a method of vehicle tracking, in accordance with an embodiment of the present disclosure.
[0059] Referring to FIG. 4, the method 400 includes, at block 402, defining the push parameters. At block 404, the scenario is built and established. At block 406, the scenario is loaded in the edge devices. At block 408, video of the object of interest is acquired from the cameras. At block 410, the system processes the video feed and tracks the vehicle.
[0060] At block 412, the system communicates with the cameras on the route. At block 414, if a deviation from the route is detected, the video feed is processed and the vehicle is tracked.
[0061] FIG. 5 illustrates an exemplary flow chart of a method of automatic vehicle selection and tracking, in accordance with an embodiment of the present disclosure.
[0062] Referring to FIG. 5, the method 500 includes, at block 502, loading the processing server with the tracking program. At block 504, the plurality of edge devices is coupled to the processing server and deployed across a route.
[0063] At block 506, the processing server is configured to load multiple incidents in the plurality of edge devices to identify the vehicle of interest, wherein the multiple incidents are the plurality of machine-learning models loaded onto the plurality of edge devices from a command centre through the processing server. At block 508, the route of the vehicle of interest is selected by the command centre upon detection of the loaded incidents from the plurality of edge devices. At block 510, the processing server acquires video of the vehicle of interest from an image-capturing unit provided in the plurality of edge devices. At block 512, the processing server processes the video feed and tracks the vehicle of interest, and at block 514, communicates with the image-capturing unit upon detection of a deviation in the route, wherein the route is selected automatically to enable the image-capturing unit in the path for vehicle tracking.
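The per-frame portion of method 500 (blocks 510 through 514) can be sketched as one control loop. Everything here is an illustrative assumption: the disclosure describes these steps only in prose, and the callables `matches_incident` and `on_route` stand in for the ML models and the route database.

```python
# Hedged sketch of the tracking loop: for each (device, frame) report, record
# a sighting if the device is on the selected route, otherwise flag a
# deviation so the route can be re-selected. Names are assumptions.
def track_vehicle(frames, matches_incident, on_route):
    """Process frame reports; return sighting and deviation events in order."""
    events = []
    for device_id, frame in frames:
        if not matches_incident(frame):
            continue  # no loaded incident detected in this frame
        if on_route(device_id):
            events.append(("sighting", device_id))
        else:
            events.append(("deviation", device_id))  # trigger route re-selection
    return events

frames = [("edge-01", "frame-1"), ("edge-09", "frame-2")]
result = track_vehicle(frames, lambda f: True, lambda d: d == "edge-01")
print(result)  # [('sighting', 'edge-01'), ('deviation', 'edge-09')]
```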
[0064] It will be apparent to those skilled in the art that the system 100 of the disclosure may be provided using some or all of the mentioned features and components without departing from the scope of the present disclosure. While various embodiments of the present disclosure have been illustrated and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.

ADVANTAGES OF THE PRESENT INVENTION
[0065] The present invention provides a system that tracks based on multiple ML models of incidents/scenarios loaded to the edge devices.
[0066] The present invention provides a system that performs an automatic selection of the route of the vehicle that may travel, thus enabling the edge devices in the route for automatic tracking of selected objects/vehicle.
[0067] The present invention provides a system that passes the selected object/vehicle information by the edge device to other edge devices in the selected route.
[0068] The present invention provides a system that performs dynamic change over in selected routes and corresponding display cameras changes on the display console.
[0069] The present invention provides a system that finds the direction of the vehicle based on angle of deviation and distance between the vehicle and edge device.
Claims:
1. A system (100) to detect a vehicle of interest, the system comprising:
a processing server (104) loaded with a tracking program;
a database (106) adapted to store and retrieve tracking information of the vehicle of interest;
a display console (110) capable of showing multiple video feeds;
a plurality of edge devices (108) coupled to the processing server, the plurality of edge devices deployed across a route, wherein the plurality of edge devices (108) comprises an image capturing unit and a graphics processing unit (GPU) capable of being loaded with the multiple incidents to identify the vehicle of interest and of communicating with the processing server (104) and other edge devices, the processing server (104) configured to:
load multiple incidents in the plurality of edge devices (108) to identify the vehicle of interest, wherein the multiple incidents are a plurality of machine-learning models loaded to the plurality of edge devices (108) from a command centre (102) through the processing server, and the multiple incidents pertain to accidents, hit-and-run events and vehicle features;
select, by the command centre (102), the route of the vehicle of interest, upon detection of the loaded incidents from the plurality of edge devices;
acquire video of the vehicle of interest from an image capturing unit provided in the plurality of edge devices (108);
process the video feed and track the vehicle of interest; and
communicate to the image-capturing unit, upon detection of deviation in the route, wherein the route is selected automatically to enable the image-capturing unit in the path for vehicle tracking.
2. The system as claimed in claim 1, wherein the processing server (104) is configured to perform automatic selection of the vehicle of interest for tracking based on parameters defined as violations and rules/policies of a scenario, and wherein the plurality of edge devices run a mathematical module to predict the direction in which the vehicle of interest turns, for a quick response from the plurality of edge devices in the corresponding direction of the vehicle route and from the command centre.
3. The system as claimed in claim 1, wherein the automatic selection of the route enables the image-capturing unit in the path of the route for video tracking by streaming the live video feed of the image-capturing unit selected in the route to the display console and changing the image-capturing unit feed based on the route change, wherein
route change occurs whenever the plurality of edge devices detects deviation from the selected route;
deviation of the vehicle of interest is detected by other edge devices in the deviated route;
and
deviation of the vehicle of interest is conveyed to other edge devices in the new route either by the plurality of edge devices which first detected the deviation or by the processing server.
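By way of illustration only (this sketch is not part of the claimed subject matter), the route-change hand-off of claim 3 could be modelled as follows; the class and function names (`EdgeDevice`, `notify`, `handle_deviation`) are hypothetical stand-ins for the real components:

```python
# Hypothetical sketch of the deviation hand-off in claim 3: the edge
# device that first detects the deviation (or the processing server on
# its behalf) alerts every edge device on the new route.

class EdgeDevice:
    def __init__(self, device_id, route_id):
        self.device_id = device_id
        self.route_id = route_id
        self.inbox = []  # (vehicle_id, route_id) alerts received

    def notify(self, vehicle_id, new_route_id):
        # Record that the vehicle of interest is now expected on this route.
        self.inbox.append((vehicle_id, new_route_id))


def handle_deviation(detecting_device, vehicle_id, new_route_id, all_devices):
    """Convey the detected deviation to the other edge devices situated
    on the new route, returning the ids of the devices notified."""
    notified = []
    for device in all_devices:
        if device.route_id == new_route_id and device is not detecting_device:
            device.notify(vehicle_id, new_route_id)
            notified.append(device.device_id)
    return notified
```

In this toy model, a device on route "R1" that sees the vehicle deviate toward route "R2" would call `handle_deviation` so that every camera on "R2" is primed before the vehicle arrives.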
4. The system as claimed in claim 1, wherein the vehicle features and models of the incidents are loaded to the plurality of edge devices from the processing server, wherein the vehicle features comprise parameters that define the vehicle in a video frame, the parameters are the inputs to the corresponding machine-learning model that detects the vehicle of interest, and user-defined scenarios are defined based on vehicle classification parameters, vehicle number, or any combination thereof.
5. The system as claimed in claim 1, wherein the processing server is configured to:
predict the direction in which the vehicle of interest turns using the angle of deviation for the image-capturing unit situated in line with the vehicle of interest;
use the change in the distance between the image-capturing unit and the vehicle of interest over consecutive durations for the image-capturing unit situated side-wise to the vehicle of interest; and
pass the direction information to the command centre and the corresponding edge devices located in the route of the derived direction.
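As a purely illustrative sketch (not part of the claims), the two direction cues of claim 5 might be computed as below. The thresholds and function names are hypothetical; the first function uses the angle of deviation for a camera in line with the vehicle, the second uses the change in camera-to-vehicle distance for a side-wise camera:

```python
def direction_from_angle(prev_bearing_deg, curr_bearing_deg, threshold_deg=10.0):
    """In-line camera: infer a turn from the change in the vehicle's
    bearing (angle of deviation) between consecutive observations."""
    delta = curr_bearing_deg - prev_bearing_deg
    if delta > threshold_deg:
        return "right"
    if delta < -threshold_deg:
        return "left"
    return "straight"


def direction_from_distance(prev_dist_m, curr_dist_m, threshold_m=1.0):
    """Side-wise camera: use the change in camera-to-vehicle distance
    over consecutive durations to tell whether the vehicle is turning
    toward or away from this camera's side of the road."""
    delta = curr_dist_m - prev_dist_m
    if delta > threshold_m:
        return "moving away"
    if delta < -threshold_m:
        return "approaching"
    return "parallel"
```

Either cue would then be passed to the command centre and the edge devices on the route in the derived direction.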
6. A method (500) to detect a vehicle of interest, the method comprising:
loading (502), a processing server, with a tracking program;
deploying (504), a plurality of edge devices coupled to the processing server across a route, the processing server configured to:
load (506) multiple incidents in the plurality of edge devices to identify the vehicle of interest, wherein the multiple incidents are a plurality of machine-learning models loaded to the plurality of edge devices from a command centre through the processing server;
select (508), by the command centre, the route of the vehicle of interest, upon detection of the loaded incidents from the plurality of edge devices;
acquire (510) video of the vehicle of interest from an image-capturing unit provided in the plurality of edge devices;
process (512) the video feed and track the vehicle of interest; and
communicate (514) to the image-capturing unit upon detection of deviation in the route, wherein the route is selected automatically to enable the image-capturing unit in the path for vehicle tracking.
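For illustration only, the sequence of steps (502)–(514) in the method claim can be traced in a toy orchestration like the one below; every structure here is a plain dict standing in for the real component, and `select_route` is a hypothetical stand-in for the command centre's route-selection logic:

```python
def select_route(detection):
    # Stand-in for the command centre's route selection in step (508).
    return detection["expected_route"]


def run_tracking_method(command_centre, processing_server, edge_devices):
    """Toy walk-through of method 500; returns the selected route and the
    enabled video feeds, or None if no loaded incident was detected."""
    # (502) the processing server is loaded with a tracking program
    processing_server["program"] = "tracking"
    # (504)/(506) edge devices are deployed and loaded with the incident
    # machine-learning models supplied by the command centre
    for device in edge_devices:
        device["models"] = list(command_centre["incident_models"])
    # (508) the command centre selects the route once an incident fires
    detections = [d for d in edge_devices if d.get("incident_detected")]
    if not detections:
        return None
    route = select_route(detections[0])
    # (510)/(512) acquire and process video from devices on that route
    feeds = [d["video"] for d in edge_devices if d["route"] == route]
    # (514) the enabled feeds on the route are returned for tracking
    return {"route": route, "feeds": feeds}
```

A route change detected later would simply re-enter this flow with the new route, mirroring the hand-off described in claim 3.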

Documents

Application Documents

# Name Date
1 202341065884-STATEMENT OF UNDERTAKING (FORM 3) [29-09-2023(online)].pdf 2023-09-29
2 202341065884-POWER OF AUTHORITY [29-09-2023(online)].pdf 2023-09-29
3 202341065884-FORM 1 [29-09-2023(online)].pdf 2023-09-29
4 202341065884-DRAWINGS [29-09-2023(online)].pdf 2023-09-29
5 202341065884-DECLARATION OF INVENTORSHIP (FORM 5) [29-09-2023(online)].pdf 2023-09-29
6 202341065884-COMPLETE SPECIFICATION [29-09-2023(online)].pdf 2023-09-29
7 202341065884-RELEVANT DOCUMENTS [04-10-2024(online)].pdf 2024-10-04
8 202341065884-POA [04-10-2024(online)].pdf 2024-10-04
9 202341065884-FORM 13 [04-10-2024(online)].pdf 2024-10-04
10 202341065884-Response to office action [01-11-2024(online)].pdf 2024-11-01