Abstract: The present invention relates to a method and system for autonomous airborne vehicle tracking by splitting a captured video frame/process window from the recorded video of the target vehicle obtained from a video surveillance sensor (102). A process window splitting block (106) splits the captured video frame/process window into top and bottom segments. The output of the process window splitting block (106) is further sent to a detection and tracking block (108), which tracks the target vehicle using a priority-based approach. Refer to Fig.
DESC:TECHNICAL FIELD
[0001] The present invention relates generally to vehicle tracking. More particularly, the invention relates to a method and system for autonomous airborne vehicle tracking.
BACKGROUND
[0002] Many conventional solutions exist for the tracking or video surveillance of moving objects such as automobiles, humans, or airborne vehicles. These solutions are used to monitor the movement of such objects.
[0003] For example, one conventional solution, proposed in US8477998 titled “Object Tracking in Video with Visual Constraints”, discloses object tracking in video. In an embodiment, a computer-implemented method tracks an object in a frame of a video. An adaptive term value is determined based on an adaptive model and at least a portion of the frame. A pose constraint value is determined based on a pose model and at least a portion of the frame. An alignment confidence score is determined based on an alignment model and at least a portion of the frame. Based on the adaptive term value, the pose constraint value, and the alignment confidence score, an energy value is determined. Based on the energy value, a resultant tracking state is determined. The resultant tracking state defines a likely position of the object in the frame given the object's likely position in a set of previous frames in the video.
[0004] Another conventional solution, proposed in US20140133703 titled “Video object tracking using multi-path trajectory analysis”, discloses a method for obtaining the trajectory of an object using a multi-path tracking mode. The method includes marking a portion of the object in a frame of a video, obtaining consecutive frames in the video, and tracking the marked portion of the object in consecutive frames by estimating a sum of absolute differences. The method further includes comparing the sum of absolute differences to a sum-of-absolute-differences threshold, switching between the multi-path tracking mode and a single-path tracking mode based on that comparison, and obtaining the trajectory of the marked portion by combining the single-path tracking mode and the multi-path tracking mode.
[0005] Another conventional solution, proposed in US20040100563 titled “Video tracking system and method”, discloses a video tracking system and method that includes a video camera having a selectively adjustable panning orientation, tilting orientation, and focal length. A processor receives video images acquired by the camera. The processor is programmed to detect target objects in the images and selectively adjust the camera to track the target object. The camera is adjusted at variable rates which are selected as a function of a property, such as the velocity, of the target object. The focal length of the camera is selectively adjusted as a function of the distance of the target object from the camera. The images acquired by the camera are geometrically transformed to align images having different fields of view, which facilitates the analysis of the images and thereby allows the camera to be continuously adjusted to produce video images with relatively smooth transitional movements.
[0006] A further limitation of these conventional solutions is that a single window (or segment) is used as the process window.
[0007] Thus, there is a need for an invention that solves the above problems and provides a system and method for tracking an autonomous airborne vehicle.
SUMMARY OF THE INVENTION
[0008] This summary is provided to disclose a system and method for autonomous airborne vehicle tracking. This summary is neither intended to identify essential features of the present invention nor is it intended for use in determining or limiting the scope of the present invention.
[0009] For example, various embodiments herein may include one or more systems and methods for autonomous airborne vehicle tracking.
[0010] In an embodiment, the present invention discloses an autonomous airborne vehicle tracking system. The system comprises a video surveillance sensor mounted on a pan and tilt unit to record a video of a target vehicle that enters the video surveillance sensor field of view (FOV). Further, a video frame capturing block is configured to capture video frames from the recorded video of the target vehicle. The video frame capturing block further sends the captured video frames to a process window splitting block. The process window splitting block is configured to receive the captured video frames and split the captured video frame/process window into top and bottom segments. The process window is the area in which the target needs to be detected and tracked.
[0011] The tracking system of the present invention further includes a detection and tracking block. The detection and tracking block is configured to receive the output from the process window splitting block and to track the target vehicle using a priority-based approach. The tracking system of the present invention further includes a display and a database. The display is configured to display the tracked target vehicle, and the database is configured to store a trajectory of the tracked target vehicle.
[0012] In an exemplary implementation, the present invention discloses a method for tracking an autonomous airborne vehicle. The method comprises recording, by a video surveillance sensor, a video of the target vehicle entering the video surveillance sensor field of view (FOV). The method further discloses capturing, by a video frame capturing block, video frames from the recorded video of the target vehicle. The method further discloses splitting, by a process window splitting block, the captured video frame/process window into top and bottom segments. The method further discloses receiving, by a detection and tracking block, the output from the process window splitting block. The method further discloses tracking the target vehicle using a priority-based approach. The method further includes displaying, by a display, the tracked target vehicle and storing, by a database, a trajectory of the tracked target vehicle.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0013] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.
[0014] Fig. 1 illustrates a block diagram depicting video surveillance sensor based autonomous airborne vehicle tracking, according to an embodiment of the present invention.
[0015] Fig. 2 (a) illustrates a schematic diagram depicting a priority for the top segment if the drone moves upward, according to an embodiment of the present invention.
[0016] Fig. 2 (b) illustrates a schematic diagram depicting priority for the bottom segment if the drone moves downwards, according to an embodiment of the present invention.
[0017] Fig. 2 (c) illustrates a schematic diagram depicting equal priority given to both segments till the drone is found, according to an embodiment of the present invention.
[0018] Fig. 3 illustrates a flow diagram for autonomous airborne vehicle tracking using a priority-based approach, according to an embodiment of the present invention.
[0019] Fig. 4 illustrates a method for tracking an autonomous airborne vehicle, according to an exemplary implementation of the present invention.
[0020] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative methods embodying the principles of the present disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes that may be substantially represented in a computer-readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0021] The various embodiments of the present invention describe a method and system for autonomous airborne vehicle tracking.
[0022] In the following description, for purpose of explanation, specific details are outlined to provide an understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these details. One skilled in the art will recognize that embodiments of the present invention, some of which are described below, may be incorporated into a number of systems.
[0023] However, the systems and methods are not limited to the specific embodiments described herein. Further, structures and devices shown in the figures are illustrative of exemplary embodiments of the present invention and are meant to avoid obscuring the present disclosure.
[0024] It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all examples recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0025] In an exemplary implementation, the present invention discloses a method for tracking an autonomous airborne vehicle. The method includes recording, by a video surveillance sensor, a video of a target vehicle, i.e., the autonomous airborne vehicle, entering the video surveillance sensor field of view (FOV). The autonomous airborne vehicles may be, but are not limited to, drones. These drones are observed by using a video surveillance sensor. This method relates to the detection and tracking of an intruder's drone which is at a far distance with respect to the position of the video sensor. Objects at a far distance appear to be at horizon level, i.e., the place where the sky and the ground appear to merge. If the process window is treated as a single segment, there is a chance of losing the track at horizon level. To differentiate the sky portion and the ground portion, the method further includes capturing, by a video frame capturing block, video frames from the recorded video of the target vehicle. The method further includes splitting the captured video frame/process window, by a process window splitting block, into two segments, i.e., top and bottom segments. The top segment is considered as the sky portion and the bottom segment is considered as the ground portion. A priority has to be given to these segments for tracking the target. Hence the detection and tracking of autonomous vehicles in both segments is achieved by using a priority-based approach.
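The splitting step described above can be sketched as a simple frame operation. This is an illustrative sketch only: the function name, the use of NumPy arrays, and the default midpoint split are assumptions for clarity, not part of the disclosed system.

```python
import numpy as np

def split_process_window(window, split_row=None):
    """Split a process window into a top (sky) and a bottom (ground)
    segment. Before tracking is initiated the split defaults to the
    midpoint, matching the equal division described above."""
    height = window.shape[0]
    if split_row is None:
        split_row = height // 2      # equal segments before tracking starts
    top = window[:split_row]         # sky portion
    bottom = window[split_row:]      # ground portion
    return top, bottom

# Example: a 10x10 process window split into two 5x10 segments
frame = np.arange(100).reshape(10, 10)
top, bottom = split_process_window(frame)
```

During tracking, a caller would pass an explicit `split_row` so the two segment sizes can vary while the overall window size stays fixed.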
[0026] In another embodiment, the present invention discloses that the detection and tracking block detects and tracks the target (drone) and generates the position error with respect to the centre of the video at the sensor frame rate. The position errors of all the frames are stored in a database that can be used for trajectory information. The tracking output is shown on a video display device for visualization purposes. The outer thick window represents the video display, and the inner thick window represents the process window split into top and bottom segments. The thin window around the drone is the track window, which is in the top segment. The circle shown inside the bottom segment is an unwanted object.
[0027] In another embodiment, the present invention discloses the pan and tilt unit. The pan and tilt unit is configured to selectively adjust the position of the video surveillance sensor. The video surveillance sensor is mounted on the pan and tilt unit.
[0028] In another embodiment, the present invention discloses the video frame capturing block. The video frame capturing block is configured to differentiate a sky portion and a ground portion present in the process window. The sky portion is displayed in the top segment of the process window and the ground portion is displayed in the bottom segment of the process window.
[0029] In another embodiment, the present invention discloses that the size of the top and bottom segments of the process window is variable. The top and bottom segment sizes change dynamically during tracking with respect to the position of the target vehicle present in the process window.
[0030] In another embodiment, the present invention discloses that the tracking of the target vehicle is done using a priority-based approach. The priority-based approach applies upon the occurrence of any one of three conditions, i.e., the first, second, and third conditions. In the first condition, if the target vehicle is moving upward towards the sky, a major priority is allotted to the top segment of the process window for tracking the target, and a minor priority is allotted to the bottom segment of the process window. Further, in the second condition, if the target is moving downwards towards the ground, then major priority is allotted to the bottom segment of the process window for tracking the target, and minor priority is allotted to the top segment of the process window. In the third condition, if the target vehicle is in the detection stage, equal priority is allotted to both segments.
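The three conditions can be expressed as a small selection rule. The sketch below is illustrative: the function signature and stage/direction names are assumptions, and the 75/25 ratio follows the example allocation given elsewhere in the description.

```python
def assign_priority(stage, direction=None):
    """Return (top, bottom) priority as fractions of the process window.

    stage: 'detection' or 'tracking'; direction: 'up' or 'down'."""
    if stage == 'detection':
        return 0.5, 0.5              # third condition: equal priority
    if direction == 'up':
        return 0.75, 0.25            # first condition: major priority to top
    if direction == 'down':
        return 0.25, 0.75            # second condition: major priority to bottom
    raise ValueError("direction must be 'up' or 'down' while tracking")
```

The two priorities always sum to the full process window, reflecting that the window size itself remains fixed.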
[0031] In another embodiment, the present invention discloses Electro-Optic (EO) sensor-based autonomous airborne vehicle tracking in the presence of an intruder's drone at far range. Electro-Optic sensor-based object tracking systems, methods for real-time object tracking, apparatus, and devices are widely available. Most EO-based object trackers work on the principle of background modelling and suppression of the background in the process window. The leftover portion in the process window after background suppression is the target portion, which is selected as the target for tracking. In this method, drones are selected as the target for tracking.
[0032] The available existing methods use a single window (or segment) as the process window. The present invention, in contrast, mainly relates to the splitting of the process window into two segments, i.e., a top segment and a bottom segment. The top segment is related to the sky portion and the bottom segment is related to the ground portion. The sky portion refers to drones flying in the open air without any obstacles, above 200 meters from ground level. The ground portion refers to drones flying on top of a terrace, on top of trees, or below the trees.
[0033] In another exemplary implementation, the present invention discloses the sensor-based autonomous airborne vehicle tracking. Assume an intruder's autonomous airborne vehicle, for example a drone, is flying at far range with respect to the video surveillance sensor and is required to be detected and tracked. The video frames of autonomous airborne vehicles flying at far distances are captured by the video surveillance sensor, and the drone appears to be at horizon level. The term horizon level denotes the location in the video frame where the sky and ground portions appear to merge. This is the case where a single-window-based process window will fail to detect and track the target, because background modelling and subtraction of the successive frames may fail to detect the target.
[0034] In the present method, the tracking of autonomous airborne vehicles is done by splitting the process window into two segments. The top segment refers to the sky portion and the bottom segment refers to the ground portion. The size of the process window is fixed, but the size of the two segments is not fixed and varies dynamically depending on the position of the drone inside the process window. The term process window is defined as the particular area in the video frame which is selectable by the operator. The size of the process window is fixed and needs to be decided before initiating tracking. Before tracking is initiated, the two segments are of equal, fixed size inside the process window. The track command should be initiated by the operator once he observes the autonomous airborne vehicle (drone) entering the process window. Once the drone is detected inside the process window, it is tracked continuously depending on the priority of the two segments. The size of the two segments inside the process window varies dynamically once the drone enters tracking.
[0035] In another exemplary implementation, the present invention discloses that, in a case where the autonomous airborne vehicle, i.e., the drone, is flying towards the sky portion, major priority is given to the top segment for tracking the target and minor priority is given to the bottom segment, i.e., with respect to the centre location of the video, 75% of the process window size is allotted to the top segment and 25% to the bottom segment.
[0036] Further, in a case where the autonomous airborne vehicle, i.e., the drone, is flying towards the ground portion (i.e., onto the trees or below the trees), major priority is given to the bottom segment for tracking the target and minor priority is given to the top segment, i.e., with respect to the centre location of the video, 75% of the process window size is allotted to the bottom segment and 25% to the top segment.
[0037] In case the target is present in both segments, priority is given to the segment of maximum size, and that segment is considered as containing the actual target irrespective of the size and shape of the target. The trajectory of the actual tracked target needs to be stored as a backup to confirm whether the segment selected for tracking matches the stored trajectory information. If the target is moving towards the sky, the trajectory information will have positive error values in the database with respect to the centre location of the video. If the target is moving towards the ground, the trajectory information will have negative error values in the database with respect to the centre location of the video. Hence autonomous airborne vehicle tracking is achieved using a priority-based approach by splitting the process window.
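The sign convention for the stored position errors can be checked with a simple heuristic. The description only states that positive values indicate motion towards the sky and negative values towards the ground; averaging the stored errors, as done below, is an illustrative choice for the consistency check, not the disclosed method.

```python
def trajectory_direction(errors):
    """Infer the target's direction from position errors stored with
    respect to the centre of the video: positive -> sky, negative -> ground."""
    mean_error = sum(errors) / len(errors)
    if mean_error > 0:
        return 'sky'
    if mean_error < 0:
        return 'ground'
    return 'undetermined'

# A climbing target accumulates positive errors in the database
direction = trajectory_direction([2, 5, 9])
```

A tracker could compare this inferred direction against the segment currently given major priority and flag a mismatch.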
[0038] In another exemplary implementation, the present invention discloses a method for autonomous airborne vehicle tracking. The method employs a video surveillance sensor, a pan and tilt unit, an FPGA device for processing, a video frame capturing block for capturing video frames, a process window splitting block, a detection and tracking block that detects and tracks the target (drone) and generates the position error with respect to the centre of the video at the sensor frame rate, and a video display device for displaying the output.
[0039] In another exemplary implementation, the present invention discloses that autonomous airborne vehicle tracking is performed by splitting the process window into two segments. In this method, the autonomous airborne vehicles are drones.
[0040] In another exemplary implementation, the present invention considers the autonomous airborne vehicle to be at far range with respect to the position of the video sensor. In the view of the sensor, the target is at horizon level, i.e., the sky and the ground appear to merge.
[0041] In another exemplary implementation, the present invention discloses that the discrimination of the sky and ground portions inside the process window should be performed initially, before splitting the process window. The autonomous airborne vehicle (drone) may, for example, fly between the trees, above or below the trees, or stay on top of a terrace. Hence the process window is split into two segments, namely the top segment and the bottom segment.
[0042] In another exemplary implementation, the present invention discloses that the detection of the autonomous airborne vehicle is performed separately, using standard methods, in both the top and bottom segments. The top segment is considered as the sky portion, e.g., drones flying above the trees, towers, etc., and the bottom segment is considered as the ground portion, e.g., drones flying below or in between the trees. In case the autonomous airborne vehicle is detected in either or both segments, major priority is given to the top segment of the process window, irrespective of the size and shape of the target, if the target is moving towards the sky portion; otherwise, major priority is given to the bottom segment of the process window if the target is moving towards the ground portion.
[0043] In a case where the detection is not present in either segment before initiating the track command, equal priority is given to both segments. If the detection is not present in either segment after initiating the track command, major priority is given to the bottom segment and the target is searched for in the bottom segment while diluting the threshold.
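The fallback search with a diluted threshold can be sketched as a retry loop. The dilution factor, the threshold floor, and the detector callback below are illustrative assumptions; the description does not specify how much the threshold is relaxed per attempt.

```python
def search_with_dilution(detect, segment, threshold, factor=0.8, floor=0.1):
    """Search a segment for the target, progressively relaxing
    ('diluting') the detection threshold until the target is found
    or the threshold floor is reached."""
    while threshold >= floor:
        hit = detect(segment, threshold)
        if hit is not None:
            return hit, threshold
        threshold *= factor          # dilute the threshold and retry
    return None, threshold

# Stand-in detector that only succeeds once the threshold drops below 0.5
fake_detect = lambda seg, t: (3, 4) if t < 0.5 else None
hit, final_threshold = search_with_dilution(fake_detect, segment=None, threshold=1.0)
```

In a real tracker, `detect` would run the background-suppression detection on the bottom segment at the given threshold.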
[0044] In another embodiment, the present invention discloses that the sizes of the two segments should be changed dynamically after detection and track initiation, depending on the manoeuvring of the target position and the segment containing the target, where major priority for target tracking is given to the segment of maximum size.
[0045] In another embodiment, the present invention discloses that the trajectory of the actual tracked target needs to be stored in the database as a backup to confirm whether the segment selected for tracking matches the stored trajectory information.
[0046] In another embodiment, the present invention discloses autonomous airborne vehicles flying at a range from 20 meters to above 1 km from ground level. These drones can be observed by using a video surveillance sensor. This method relates to the detection and tracking of an intruder's drone which is at far range with respect to the position of the video sensor. Objects at a far distance appear to be at horizon level, i.e., the place where the sky and the ground appear to merge. So, to discriminate the sky portion and the ground portion, the process window is split into two segments, i.e., the top and the bottom segment. The top segment is considered as the sky portion and the bottom segment is considered as the ground portion. A priority has to be given to these two segments for tracking the target. Hence the detection and tracking of the autonomous airborne vehicle in both segments can be done by using a priority-based approach.
[0047] Fig. 1 illustrates a block diagram depicting video surveillance sensor based autonomous airborne vehicle tracking.
[0048] Fig. 1 depicts all the blocks of the video surveillance sensor based autonomous airborne vehicle tracking system according to the present invention. The tracking system comprises a video surveillance sensor (102), a pan and tilt unit, a video frame capturing block (104), a process window splitting block (106), a detection and tracking block (108), and a video display (110). The video surveillance sensor (102) is mounted on top of the pan and tilt unit.
[0049] The video surveillance sensor (102) is configured to record a video of a target vehicle entering the video surveillance sensor field of view (FOV). Further, the video frame capturing block (104) is configured to capture video frames from the recorded video of the target vehicle. The captured video frame/process window should cover both the sky background and the ground background, since the drone travels at a height of 20 meters to above 1 km from ground level. Objects at far distances with respect to the position of the video surveillance sensor (102) appear to be at horizon level, i.e., the place where the sky and the ground appear to merge. In order to differentiate the sky portion and the ground portion inside the process window, the process window splitting block (106) splits the process window into two segments, i.e., top and bottom segments. The top segment represents the sky portion and the bottom segment represents the ground portion. Depending on the position of the drone in the top or bottom segment, the background is suppressed inside the respective segment, and hence the leftover portion inside the segment shows the actual target, i.e., the drone. There are different standard threshold methods to suppress the background. Further, the output of the split process window is fed to the detection and tracking block (108). The detection and tracking block (108) is configured to track the target vehicle using a priority-based approach.
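The threshold-based background suppression mentioned above can be illustrated with a simple global rule. The description only refers to "standard threshold methods", so the median-based threshold used here is an illustrative assumption, not the disclosed detector.

```python
import numpy as np

def suppress_background(segment, threshold):
    """Suppress the roughly uniform background in a segment; the
    leftover (True) pixels indicate the candidate target."""
    return np.abs(segment - np.median(segment)) > threshold

# A mostly flat sky segment with a single bright drone pixel
segment = np.full((4, 4), 10.0)
segment[1, 2] = 200.0
mask = suppress_background(segment, threshold=50.0)
```

Running the suppression per segment (rather than over the whole window) is what lets the split-window scheme cope with the sky and ground having very different background statistics.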
[0050] The tracking of the target vehicle using the priority-based approach applies upon the occurrence of any one of three conditions, i.e., the first, second, and third conditions.
[0051] In the first condition, if the target vehicle is moving upward towards the sky, a major priority is allotted to the top segment of the process window for tracking the target, and a minor priority is allotted to the bottom segment of the process window. In the second condition, if the target is moving downwards towards the ground, then major priority is allotted to the bottom segment of the process window for tracking the target, and minor priority is allotted to the top segment of the process window. In the third condition, if the target vehicle is in the detection stage, equal priority is allotted to both segments.
[0052] Fig. 2 (a) illustrates a schematic diagram depicting priority for the top segment if the drone moves upward. In Fig. 2 (a), if the target moves upwards towards the sky, then major priority is given to the top segment for tracking the target and minor priority is given to the bottom segment, i.e., with respect to the centre location of the video, 75% of the process window size is allotted to the top segment and 25% to the bottom segment.
[0053] Fig. 2 (b) illustrates a schematic diagram depicting priority for the bottom segment if the drone moves downwards. In Fig. 2 (b), in case the target moves downwards towards the ground (i.e., onto the trees or below the trees), then major priority is given to the bottom segment for tracking the target and minor priority is given to the top segment, i.e., with respect to the centre location of the video, 25% of the process window size is allotted to the top segment and 75% to the bottom segment. If the actual target (drone) and an unwanted object (circle) are present in both segments, then major priority is given to the segment of maximum size for target tracking.
[0054] Fig. 2 (c) illustrates a schematic diagram depicting equal priority given to both segments until the drone is found. In Fig. 2 (c), initially, before starting to track the target, equal priority should be given to both segments. When no target is found inside the process window, equal priority is likewise given to both segments. When the target enters the field of view (FOV) of the sensor (102), the target is detected and tracked by the detection and tracking block (108). The size of the two segments changes dynamically during tracking with respect to the position of the target (drone) as it enters either the top or the bottom segment. The size of the process window always remains fixed, before and after tracking is initiated.
[0055] Fig. 3 illustrates a flow diagram for autonomous airborne vehicle tracking using a priority-based approach. Initially, the target enters the field of view of the sensor (102). The operator selects the size of the process window before initiating tracking. To split the process window into two segments, the sky portion and the ground portion are differentiated before initiating tracking. The top segment refers to the sky portion and the bottom segment refers to the ground portion inside the process window. The track command is initiated to perform tracking once the target is detected by the detection and tracking block (108). In this method, the tracking of a target depends purely on the priority-based approach, and the size of the segments varies dynamically with respect to the position of the target inside the process window. If the target moves upwards towards the sky, the top segment is chosen as the priority; if it moves towards the ground, the bottom segment is chosen as the priority. The segment selected for tracking the target is reconfirmed by using the trajectory information. This method is a single-FPGA-based solution in which the blocks shown in Fig. 1, i.e., the video frame capturing block (104), the process window splitting block (106), and the detection and tracking block (108), are implemented on an FPGA processor. The inbuilt memory of the FPGA processor is used to perform the method for autonomous airborne vehicle tracking.
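The flow of Fig. 3 can be condensed into a minimal loop. This is a sketch under stated assumptions: the detector callback and the sign convention (positive error when the target is above the video centre) are illustrative, and the real system runs on an FPGA rather than in Python.

```python
def track(frames, detect):
    """Sketch of the Fig. 3 flow: equal priority until the target is
    detected, then priority follows the target's vertical motion and
    each position error is accumulated for the trajectory database."""
    top, bottom = 0.5, 0.5           # equal priority in the detection stage
    trajectory = []
    for frame in frames:
        error = detect(frame)
        if error is None:
            continue                 # no detection yet: keep searching
        trajectory.append(error)
        if error > 0:                # above centre: moving towards the sky
            top, bottom = 0.75, 0.25
        elif error < 0:              # below centre: moving towards the ground
            top, bottom = 0.25, 0.75
    return trajectory, (top, bottom)

# Target appears on the third frame and climbs steadily
errors = iter([None, None, 2, 5, 9])
trajectory, priority = track(range(5), lambda f: next(errors))
```

The accumulated `trajectory` list plays the role of the database backup used to reconfirm the selected segment.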
[0056] In an exemplary implementation, the present invention discloses Electro-Optic (EO) sensor-based autonomous airborne vehicle tracking in the presence of an intruder's drone at far range. Electro-Optic sensor-based object tracking systems, methods for real-time object tracking, apparatus, and devices are widely available. Most EO-based object trackers work on the principle of background modelling and suppression of the background in the process window. The leftover portion in the process window after background suppression is the target portion, which is selected as the target for tracking. In this method, drones are selected as the target for tracking. After selection, the video frame/process window is split into two segments, i.e., a top segment and a bottom segment. The top segment is related to the sky portion and the bottom segment is related to the ground portion. The sky portion refers to drones flying in the open air without any obstacles, above 200 meters from ground level. The ground portion refers to drones flying on top of a terrace, on top of trees, or below the trees.
[0057] Now, assume an intruder's autonomous airborne vehicle, say a drone, is flying at far range with respect to the video surveillance sensor and is required to be detected and tracked. The video frames of autonomous airborne vehicles flying at far distances are captured by the video surveillance sensor (102), and the drone appears to be at horizon level. The term horizon level denotes the location in the video frame where the sky and ground portions appear to merge. This is the case where a single-window-based process window will fail to detect and track the target, because background modelling and subtraction of the successive frames may fail to detect the target.
[0058] The tracking of the autonomous airborne vehicle is performed by the process window splitting block (106). The process window splitting block (106) splits the captured video frame/process window into two segments. The top segment refers to the sky portion and the bottom segment refers to the ground portion. The size of the process window is fixed, but the size of the two segments is not fixed and varies dynamically depending on the position of the drone inside the process window. The term process window is defined as the particular area in the video frame which is selectable by the operator. The size of the process window is fixed and needs to be decided before initiating tracking. Before tracking is initiated, the two segments are of equal size inside the process window. The track command should be initiated by the operator once he observes the autonomous airborne vehicle (drone) entering the process window. Once the drone is detected inside the process window, the target vehicle is continuously tracked. The continuous tracking of the target vehicle depends on the priority of the two segments. The size of the two segments inside the process window varies dynamically once the drone enters tracking.
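The dynamic resizing of the two segments during tracking can be sketched as moving the split row while the process-window height stays fixed. The function below is an illustrative assumption; the 75/25 ratio follows the example allocation in the description.

```python
def update_split_row(window_height, target_row):
    """Enlarge the segment that contains the target to 75% of the
    process window; the overall window height stays fixed."""
    if target_row < window_height // 2:      # target in top (sky) segment
        return int(window_height * 0.75)     # top segment grows to 75%
    return int(window_height * 0.25)         # bottom segment grows to 75%

# Target near the top of a 100-row window: split moves down to row 75
split = update_split_row(100, target_row=20)
```

The returned split row could be fed back into the splitting step on the next frame, so the segment boundary tracks the drone's vertical position.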
[0059] For example, in the first condition, when the autonomous airborne vehicle, i.e. the drone, is flying towards the sky portion, major priority is given to the top segment for tracking the target and minor priority is given to the bottom segment; that is, with respect to the centre location of the video, 75% of the process window size is allotted to the top segment and 25% to the bottom segment.
[0060] In the second condition, if the autonomous airborne vehicle, i.e. the drone, is flying towards the ground portion (i.e. onto the trees or below the trees), major priority is given to the bottom segment for tracking the target and minor priority is given to the top segment; that is, with respect to the centre location of the video, 75% of the process window size is allotted to the bottom segment and 25% to the top segment. In a third condition, if the target vehicle is present in both segments, priority is given to the segment of maximum size, which is considered as containing the actual target irrespective of the size and shape of the target.
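The priority conditions above can be summarised in a small helper. This is a sketch only; the comparison against the centre row is an assumption consistent with the 75%/25% allotment described, and the function name is hypothetical.

```python
def allot_segment_fractions(target_row, window_height, tracking=True):
    # Returns (top_fraction, bottom_fraction) of the process window.
    if not tracking:
        return 0.5, 0.5            # detection stage: equal priority
    centre = window_height / 2.0
    if target_row < centre:        # first condition: target in sky portion
        return 0.75, 0.25          # major priority to the top segment
    if target_row > centre:        # second condition: target in ground portion
        return 0.25, 0.75          # major priority to the bottom segment
    return 0.5, 0.5                # target straddling the centre
```

The returned fractions would then feed the dynamic split of the process window at each frame.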
[0061] The trajectory of the actual tracked target needs to be stored as a backup to confirm whether the segment selected for tracking matches the stored trajectory information. If the target is moving towards the sky, the trajectory information will have positive error values in the database with respect to the centre location of the video. If the target is moving towards the ground, the trajectory information will have negative error values in the database with respect to the centre location of the video.
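The sign convention for the stored trajectory errors can be expressed as follows. The function name and the row-based convention are assumptions consistent with the description (positive towards the sky, negative towards the ground).

```python
def trajectory_error(target_row, window_height):
    # Signed error with respect to the centre of the video: positive
    # when the target is above the centre (moving towards the sky),
    # negative when it is below (moving towards the ground).
    return window_height // 2 - target_row

# A target climbing through rows 70 -> 50 -> 30 of a 100-row frame
# yields errors -20, 0, +20: increasingly positive, i.e. sky-bound.
errors = [trajectory_error(r, 100) for r in (70, 50, 30)]
```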
[0062] Fig. 4 illustrates a method for tracking an autonomous airborne vehicle, according to an exemplary implementation of the present invention.
[0063] Referring now to Fig. 4, which illustrates a flowchart (400) of tracking an autonomous airborne vehicle, according to an exemplary implementation of the present invention. The flowchart (400) of Fig. 4 is explained below with reference to Fig. 1 as described above.
[0064] At step 402, recording, by a video surveillance sensor (102), a video of a target vehicle entering the video surveillance sensor field of view (FOV).
[0065] At step 404, capturing, by a video frame capturing block (104), video frames from the recorded video of the target vehicle.
[0066] At step 406, splitting, by a process window splitting block (106), the captured video frame/process window into top and bottom segments.
[0067] At step 408, receiving, by a detection & tracking block (108), the output from the process window splitting block (106), and tracking the target vehicle on a priority-based approach.
[0068] At step 410, displaying, by a video display (110), the tracked target vehicle.
[0069] At step 412, storing, by a database, a trajectory of the tracked target vehicle.
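The steps of Fig. 4 (402 through 412) can be sketched as a simple loop. The block functions below are placeholders standing in for the hardware and tracking blocks, not the actual signal chain.

```python
def split_window(frame):
    # Step 406: split the process window into top and bottom segments.
    h = len(frame)
    return frame[: h // 2], frame[h // 2:]

def detect_and_track(top, bottom):
    # Step 408 (placeholder): report which segment holds the brightest
    # response, standing in for the priority-based tracker.
    if max(top, default=0) >= max(bottom, default=0):
        return ("top", max(top, default=0))
    return ("bottom", max(bottom, default=0))

def track_pipeline(sensor_frames, display, database):
    # Steps 402/404: recorded and captured frames arrive one by one.
    for frame in sensor_frames:
        top, bottom = split_window(frame)      # step 406
        track = detect_and_track(top, bottom)  # step 408
        display.append(track)                  # step 410: display target
        database.append(track)                 # step 412: store trajectory
```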
[0070] The foregoing description of the invention has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the substance of the invention may occur to a person skilled in the art, the invention should be construed to include everything within the scope of the invention.
CLAIMS:
1. An autonomous airborne vehicle tracking system, the tracking system comprising:
a video surveillance sensor (102) mounted on a pan and tilt unit, to record a video of a target vehicle entering the video surveillance sensor (102) field of view (FOV);
a video frame capturing block (104) configured to capture video frames from the recorded video of the target vehicle;
a process window splitting block (106) configured to split the captured video frame/process window into top and bottom segments;
a detection & tracking block (108) configured to receive the output from the process window splitting block (106) and to track the target vehicle on a priority-based approach;
a video display (110) configured to display the tracked target vehicle; and
a database to store a trajectory of the tracked target vehicle.
2. The system as claimed in claim 1, wherein the pan & tilt unit is configured to selectively adjust the position of the video surveillance sensor (102).
3. The system as claimed in claim 1, wherein the video frame capturing block (104) is further configured to differentiate a sky portion and a ground portion present in the process window.
4. The system as claimed in claim 3, wherein the sky portion is displayed in the top segment of the process window and the ground portion is displayed in the bottom segment of the process window.
5. The system as claimed in claim 1, wherein the top and bottom segments of the process window have a variable size, and the top and bottom segment sizes dynamically change during tracking with respect to the position of the target vehicle present in the process window.
6. The system as claimed in claim 1, wherein the tracking of the target vehicle on the priority-based approach applies on the occurrence of any one of three conditions, i.e. first, second and third conditions.
7. The system as claimed in claim 6, wherein in the first condition if the target vehicle is moving upward towards the sky, a major priority is allotted to the top segment of the process window for tracking of the target, and a minor priority is allotted to the bottom segment of the process window.
8. The system as claimed in claim 6, wherein in the second condition if the target is moving downwards toward the ground, then major priority is allotted to the bottom segment of the process window for tracking of the target, and minor priority is allotted to the top segment of the process window.
9. The system as claimed in claim 6, wherein in the third condition if the target vehicle is in detection stage, equal priority is allotted to both the segments.
10. A method for tracking an autonomous airborne vehicle, the method comprising:
recording, by a video surveillance sensor (102), a video of a target vehicle entering the video surveillance sensor (102) field of view (FOV);
capturing, by a video frame capturing block (104), video frames from the recorded video of the target vehicle;
splitting, by a process window splitting block (106), the captured video frame/process window into top and bottom segments;
receiving, by a detection & tracking block (108), the output from the process window splitting block, and
tracking the target vehicle on a priority-based approach;
displaying, by a video display (110), the tracked target vehicle; and
storing, by a database, a trajectory of the tracked target vehicle.
| # | Name | Date |
|---|---|---|
| 1 | 202141013498-PROVISIONAL SPECIFICATION [26-03-2021(online)].pdf | 2021-03-26 |
| 2 | 202141013498-FORM 1 [26-03-2021(online)].pdf | 2021-03-26 |
| 3 | 202141013498-DRAWINGS [26-03-2021(online)].pdf | 2021-03-26 |
| 4 | 202141013498-Proof of Right [04-05-2021(online)].pdf | 2021-05-04 |
| 5 | 202141013498-FORM-26 [15-07-2021(online)].pdf | 2021-07-15 |
| 6 | 202141013498-Correspondence, Form-1_15-07-2021.pdf | 2021-07-15 |
| 7 | 202141013498-FORM 3 [28-03-2022(online)].pdf | 2022-03-28 |
| 8 | 202141013498-ENDORSEMENT BY INVENTORS [28-03-2022(online)].pdf | 2022-03-28 |
| 9 | 202141013498-DRAWING [28-03-2022(online)].pdf | 2022-03-28 |
| 10 | 202141013498-COMPLETE SPECIFICATION [28-03-2022(online)].pdf | 2022-03-28 |
| 11 | 202141013498-POA [07-10-2024(online)].pdf | 2024-10-07 |
| 12 | 202141013498-FORM 13 [07-10-2024(online)].pdf | 2024-10-07 |
| 13 | 202141013498-AMENDED DOCUMENTS [07-10-2024(online)].pdf | 2024-10-07 |
| 14 | 202141013498-Response to office action [01-11-2024(online)].pdf | 2024-11-01 |
| 15 | 202141013498-FORM 18 [15-03-2025(online)].pdf | 2025-03-15 |