Abstract: AUTOMOBILE ASSISTANCE SYSTEM FOR OBJECT DETECTION AND TRACKING. A system and method for detecting and tracking one or more objects in front of a host automobile. The system comprises an image capturing device for capturing images in a plurality of frames and a processor configured for processing the captured images. The processor includes a detector for applying one or more object detection techniques to obtain a template. The processor further includes an evaluation module to generate a dynamic threshold matching score for a region of interest and a fixed matching score value for the surrounding region. The evaluation module further updates the stored template after a fixed interval of frames and also determines an offset value for modifying the matching score obtained after performing the template matching, in order to reduce false detections. A tracking module performs repetitive tracking of the detected object based on the template matching and the modified matching score.
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
AUTOMOBILE ASSISTANCE SYSTEM FOR OBJECT DETECTION AND
TRACKING
Applicant
TATA Consultancy Services, a company incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
The present invention relates to a system and method for detecting and tracking an object. More particularly, the invention relates to a system and method for detecting an object in a driver assistance system.
BACKGROUND OF THE INVENTION
Over the past several decades, driver safety has become a matter of great concern. Advancement in technology has provided faster means of transport, which has increased the dependency on automobiles while drastically increasing the risk of road accidents. To avoid vehicle collisions, vehicles are being provided with collision detection systems which are supposed to warn the driver when a collision-like situation is expected. Fast and reliable alert systems are in high demand to provide a safer vehicle assistance system and to minimize the probability of collision while driving. The detection of an object in images is an important step in applications such as traffic monitoring, driver assistance systems, etc.
Many solutions have been proposed to address this problem. In most of them, vehicles are provided with object detection devices which not only detect an object but also classify it according to its size and shape. The most widely followed techniques for detecting objects involve the use of multiple sensors, such as cameras, RADAR, LIDAR, etc., to detect objects in front of a vehicle and alert the driver about a possible collision.
Advanced computational algorithms are also widely used for detecting objects. The Histogram of Oriented Gradients (HOG) is a feature descriptor used in computer vision and image processing for the purpose of object detection. Similarly, Haar-like features are digital image features used in object recognition. In conventional object detection systems, images of the object are captured and processed by applying one or more image detection algorithms for classifying and detecting the object. Template matching is further performed for identifying the matching features in the object image.
For robust detection, multiple sensors (RADAR, LIDAR, or multiple cameras) are used, which require complex algorithms to be processed in real time for efficient results and hence call for complex hardware. The use of template matching also involves the possibility of error due to variation in the matching score between the template and the object. Also, most of the solutions are focused on the detection of moving objects; they remain silent with respect to still object detection.
Therefore, there is a need for a system which is capable of detecting objects in front of a moving vehicle while reducing the number of sensors and the computational complexity. The system should also be capable of detecting still and moving objects without the use of complex hardware and should also reduce false detections.
OBJECTS OF THE INVENTION
It is the primary object of the invention to provide a system and method for detecting and tracking an object in front of a host automobile.
It is another object of the invention to provide a system for applying an adaptive threshold matching score to a region of interest.
It is yet another object of the invention to provide a system for determining an offset value for obtaining a modified matching score.
It is yet another object of the invention to provide a system and a method for detecting still and moving objects.
It is yet another object of the invention to provide a system and method for calculating a distance between the object and the host vehicle.
SUMMARY OF THE INVENTION
The present invention provides a method for detecting and tracking one or more objects in front of a host automobile. The method comprises capturing an image in a plurality of frames and processing the captured image for detecting and tracking the object. The processing further comprises applying one or more image detection techniques on a predefined area of the image in a valid frame to detect the object, storing the detected object as a template, generating a dynamic threshold matching score for a region of interest in the image while maintaining a fixed matching score value for a surrounding region, and performing a template matching for a fixed number of consecutive frames. The processing further comprises modifying the matching score by including a predetermined offset value and comparing the modified matching score with the dynamic threshold value for tracking the detected object, and updating the template after a fixed interval of frames and generating an updated dynamic threshold matching score value with respect to the updated template for repetitive tracking of the detected object.
The present invention also provides a system for detecting and tracking one or more objects in front of a host automobile. The system comprises an image capturing device for capturing an image in a plurality of frames and a processor configured for processing the image for detecting and tracking the object. The processor further comprises a detector for applying one or more object detection techniques on a predefined area of the image in a valid frame and storing the detected object as a template in a storage medium, and an evaluation module configured for generating a dynamic threshold matching score value for a region of interest in the image while maintaining a fixed threshold matching score value for a surrounding region. The evaluation module is further configured to update the template and the dynamic threshold matching score value after a fixed interval of frames in order to provide repetitive and dynamic tracking. The processor further comprises a matching module for performing a template matching for a fixed number of consecutive frames and a tracking module configured to modify the matching score by including a predetermined offset value and to compare the modified matching score with the predetermined threshold value for tracking the detected object.
BRIEF DESCRIPTION OF DRAWINGS
Figure 1 illustrates the architecture of the system for detecting and tracking an object, in accordance with an embodiment of the invention.
Figure 2 illustrates an exemplary flow chart of an alternate embodiment of the system.
Figure 3 illustrates an exemplary embodiment of the system.
DETAILED DESCRIPTION OF THE INVENTION
Some embodiments of this invention, illustrating its features, will now be discussed:
The words "comprising", "having", "containing", and "including", and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.
It must also be noted that, as used herein and in the appended claims, the singular forms "a", "an", and "the" include plural references unless the context clearly dictates otherwise. Although any systems, methods, apparatuses, and devices similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present invention, the preferred systems and parts are now described. In the following description, for the purpose of explanation and understanding, reference has been made to numerous embodiments, the intent of which is not to limit the scope of the invention.
One or more components of the invention are described as modules for the understanding of the specification. For example, a module may include a self-contained component in a hardware circuit comprising logic gates, semiconductor devices, integrated circuits or any other discrete components. The module may also be a part of any software programme executed by any hardware entity, for example a processor. The implementation of a module as a software programme may include a set of logical instructions to be executed by the processor or any other hardware entity. Further, a module may be incorporated with the set of instructions or a programme by means of an interface.
The disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms.
A system and method for tracking and detecting one or more objects in front of a host automobile is disclosed. The object is detected and tracked by applying one or more image detection techniques and by updating a matching score along with a stored template for reducing false detections.
In accordance with an aspect, referring to figures 1 and 2, the system (100) comprises an image capturing device (104) for capturing one or more objects (102) (moving or still) in front of the host vehicle. The image capturing device may include a rear view camera for capturing the image of the object (102) in a plurality of frames. As shown in steps (201 and 203), out of the plurality of frames, one or more valid frames are selected and fed as an input to a processor (106).
In accordance with an embodiment, the object may include, but is not limited to, a car, a truck or other heavy vehicle, a pedestrian or an animal.
The processor (106) is configured to process the image captured in the valid input frame. A predefined triangular region, based on the width and height of the frame and containing the probable road area, is selected. The selected triangular region is divided into one or more blocks. The processor of the system (100) further comprises a detector (108) for detecting the object (102). Referring to figure 2, as shown in step 205, the detector (108) is configured for applying one or more object detection techniques over each block at different scales. The object detection techniques may include, but are not limited to, the Histogram of Oriented Gradients (HOG). The detected object (102) in a first valid input frame is then stored as a template in a storage medium, as shown in step 207 of figure 2. This template will be used for matching the image captured in the consecutive frames. The matching will be performed for a fixed number of frames, and a new template will be stored for performing the template matching after a predetermined number of frames.
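For illustration only, this detection step could be sketched as follows in Python with OpenCV. The triangle proportions, the use of cv2.HOGDescriptor with its default detector, and a single multi-scale detection pass over the masked road area (instead of an explicit per-block scan) are assumptions of this sketch, not prescriptions of the specification:

    import cv2
    import numpy as np

    def detect_and_store_template(frame):
        # Select a predefined triangular region covering the probable road area
        # (the exact proportions are assumed here for illustration).
        h, w = frame.shape[:2]
        mask = np.zeros((h, w), dtype=np.uint8)
        triangle = np.array([[0, h - 1], [w - 1, h - 1], [w // 2, h // 2]], dtype=np.int32)
        cv2.fillConvexPoly(mask, triangle, 255)
        road = cv2.bitwise_and(frame, frame, mask=mask)

        # Stand-in multi-scale HOG detector; the specification does not fix a model.
        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
        rects, scores = hog.detectMultiScale(road, winStride=(8, 8), scale=1.05)
        if len(rects) == 0:
            return None                                  # nothing detected in this frame
        x, y, bw, bh = rects[int(np.argmax(scores))]
        return frame[y:y + bh, x:x + bw].copy()          # stored as the template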
In accordance with an embodiment, by way of specific example, the fixed preset number of frames will be the frames before and after a frame number multiple of five for performing the template matching with the template obtained after applying HOG. For each frame number multiple of five, a new template will be stored in the storage medium by replacing the old template.
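A minimal sketch of this frame schedule, assuming the interval of five frames given in the example above, could be:

    def is_detection_frame(frame_no, interval=5):
        # Re-run HOG and replace the stored template on every frame number that is
        # a multiple of the interval; the frames in between are handled by template
        # matching against the currently stored template.
        return frame_no > 0 and frame_no % interval == 0

    # e.g. frames 5, 10, 15, ... refresh the template; frames 6 to 9 only track it.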
Again referring to figure 2, as shown in step 209, when the template is stored, an evaluation module (110) of the processor generates a dynamic threshold matching score for a region of interest (ROI) for the consecutive frames for applying HOG. Using the dynamic matching score in the ROI ensures that even when the object moves away relative to the earlier frame, HOG will be able to detect it. The evaluation module (110) also generates a fixed matching score value for a surrounding region. The fixed matching score value for the surrounding region does not change while performing HOG, in order to reduce false detections, as a variation in the matching score for the surrounding region may result in finding a match in related objects which are not relevant.
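Purely for illustration, the two thresholds described in this paragraph could be looked up as below; the fixed value of 0.80 is hypothetical and the dynamic value is assumed to come from the evaluation module:

    def threshold_for(candidate_xy, roi_rect, dynamic_roi_threshold, fixed_threshold=0.80):
        # Inside the region of interest the threshold is the dynamically regenerated
        # value; outside it, a fixed value is kept so that similar-looking but
        # irrelevant objects in the surrounding region are not matched.
        x, y = candidate_xy
        rx, ry, rw, rh = roi_rect
        inside_roi = rx <= x < rx + rw and ry <= y < ry + rh
        return dynamic_roi_threshold if inside_roi else fixed_threshold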
Referring to figure 1, the processor (106) further comprises of a matching module (112) configured for performing the template matching. The matching module (112) matches the image of the object detected (102) with the stored template and obtains updated coordinates of the object.
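A minimal sketch of this matching step, assuming OpenCV's normalised cross-correlation as the matching measure (the specification does not name one), could read:

    import cv2

    def match_template(frame, template):
        # Slide the stored template over the frame and return the best matching
        # score together with the updated top-left coordinates of the object.
        result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, max_score, _, max_loc = cv2.minMaxLoc(result)
        return max_score, max_loc      # score roughly in [-1, 1], (x, y) of best match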
Referring to figure 2, as shown in step 211, for achieving more accurate results, the evaluation module (110) further determines an offset value for modifying the matching score obtained after matching the image captured in the consecutive frames with the HOG-based template. If the matching score obtained for the image in consecutive frames within the earlier detected ROI window is less than the threshold matching score, then the predetermined offset value is added to modify the obtained matching score. This predetermined offset value is statistically determined by considering the worst-case movement of objects over an interval of frames. The offset value is added to the threshold matching score generated for the region of interest, while the matching score value for the surrounding region remains the same.
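A sketch of this offset step, following the wording of claim 1 in which the offset is added to the obtained matching score (the offset value itself is assumed to have been determined statistically offline), could be:

    def apply_offset(raw_score, roi_threshold, offset):
        # If the score obtained inside the earlier ROI window falls below the
        # dynamic threshold, add the statistically determined worst-case-movement
        # offset so that a genuine but displaced object is not rejected outright.
        if raw_score < roi_threshold:
            return raw_score + offset
        return raw_score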
Referring to figure 2, as shown in steps 213 and 215, after obtaining the modified matching score, the matching module (112) further tracks the object by finding a region which matches the stored template. Once the matched region is determined, the template is updated with the new region for further tracking. If no match is found, the template will be deleted.
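Combining the hypothetical helpers sketched above, steps 211 to 215 could be expressed, for illustration only, as:

    def track_step(frame, template, roi_threshold, offset):
        # Match the stored template, modify the score if needed, then either
        # refresh the template with the newly matched region or drop the track.
        score, (x, y) = match_template(frame, template)
        score = apply_offset(score, roi_threshold, offset)
        if score < roi_threshold:
            return None                                  # no match: delete the template
        th, tw = template.shape[:2]
        return frame[y:y + th, x:x + tw].copy()          # updated template for further tracking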
As shown in step 217, for every frame number that is a multiple of five, the image captured in the corresponding valid frame will be processed by the processor (106) and the detector (108) will apply one or more object detection techniques (HOG) for detecting the object. The HOG-detected region will again be stored as a new template for performing the template matching. This starts a cycle and results in repetitive detection of the object (102) in front of the host automobile.
In accordance with an embodiment, referring to figure 3, the system (100) is capable of detecting a plurality of objects. By way of specific example, the image capturing device (104) may capture one or more images in a plurality of frames. When two separate images are captured for two objects, the processor (106) will process them separately, the detector (108) will apply an object detection technique (say HOG) over both images, and two different templates will be stored in the storage medium.
The evaluation module (110) will generate a different threshold matching score for each image for the region of interest and a different fixed matching score value for each image for the surrounding region. The matching module (112) will perform separate template matching for the images in consecutive frames and will update the templates for every frame number that is a multiple of five.
When the matching score is less than the threshold matching score, the evaluation module (110) will add the predetermined offset value in order to increase the accuracy in object detection. The matching module (112) will again determine the matching region by template matching. The object will only be detected if the matching score obtained during HOG is equal to or greater than the pre-determined threshold. Once the object is detected, a warning alert is transmitted to the driver.
In accordance with an embodiment, referring to figure 1, the system (100) further comprises a classifier configured to classify the object by comparing it to a reference vector. The reference vector is obtained by using an SVM (Support Vector Machine).
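For illustration, this classification step could be sketched as a linear SVM decision over the HOG feature vector of the detected patch; the weight vector and bias are assumed to have been trained offline and are not specified by the invention:

    import cv2
    import numpy as np

    def classify_object(patch, svm_weights, svm_bias):
        # Compute the HOG feature vector of the detected patch and compare it with
        # the reference vector learnt by a linear SVM; the sign of the decision
        # value gives the class (e.g. vehicle versus non-vehicle).
        patch = cv2.resize(patch, (64, 128))             # default HOGDescriptor window size
        features = cv2.HOGDescriptor().compute(patch).flatten()
        decision = float(np.dot(svm_weights, features)) + svm_bias
        return decision >= 0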
In accordance with an embodiment, referring to figure 1, the system (100) further comprises a calculation means configured for determining the distance between the detected object and the host vehicle. The calculation means uses inverse perspective mapping for determining the distance between the object and the host vehicle.
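A minimal sketch of this distance step, assuming a pre-calibrated homography that maps image points on the road surface to road-plane coordinates in metres with the host vehicle at the origin, could be:

    import cv2
    import numpy as np

    def distance_to_object(bottom_centre_px, road_homography):
        # Inverse perspective mapping: project the bottom-centre pixel of the
        # detected bounding box onto the road plane and take the planar distance.
        pt = np.array([[bottom_centre_px]], dtype=np.float32)        # shape (1, 1, 2)
        gx, gz = cv2.perspectiveTransform(pt, road_homography)[0, 0]
        return float(np.hypot(gx, gz))                               # distance in metres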
BEST MODE/EXAMPLE FOR WORKING OF THE INVENTION
The process illustrated above for object detection in front of a host vehicle can be supported by the working example shown in the following paragraphs; the process is not restricted to the said example only:
Let us assume a Tata Indica car moving in a lane, and let another car, say a Maruti Zen, overtake the Tata Indica. The Tata Indica is provided with an automatic object detection and alert system to alert the driver about any preceding vehicle and/or object. The system embedded in the Tata Indica comprises a single camera which captures images of the Maruti Zen in a plurality of frames. These images are further fed as an input to a processor present in the system.
The processor applies the HOG technique and, if it detects an object (for example the Maruti Zen), generates a template for the first valid frame. The camera functions continuously and keeps capturing images of the Maruti Zen for a fixed range of distance. The images captured are compared with the HOG-generated template for five successive frames, and the template is updated with the matched regions. This helps in tracking the vehicle detected by HOG. In the sixth frame HOG is applied again and a matching score is obtained for the region of interest (ROI). This matching score is compared to a threshold value to check whether the detected vehicle is still present in the video frame. In the ROI window, HOG is applied with a dynamic matching score which is computed by adding an offset (determined by statistical analysis) to the original matching score. This ensures that even when the Maruti Zen moves away from the Indica, it is detected consistently and no false detections occur because of the movement of the vehicle or other environmental conditions.
The dynamic threshold matching score is dynamic in nature as it is repeatedly generated after a fixed number of frames (say, for every frame number that is a multiple of five), provided there is a detection by HOG in the previous interval. Also, a new template is generated by applying HOG for every fixed number of frames (say, every frame number that is a multiple of five) if HOG detects any object in that frame. The matching score threshold for the surrounding region will remain fixed. Also, the template obtained during HOG is updated in subsequent frames during template matching if a matched region is found.
If the obtained matching score is less than the threshold matching score, then the system is capable of modifying this obtained matching score by adding a predetermined offset value in order to increase the accuracy of detecting the Maruti Zen. The Maruti Zen will be detected if the modified matching score is greater than or equal to the threshold matching score; otherwise it will be considered to be out of range.
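By way of a purely illustrative calculation (the numbers below are hypothetical and not taken from the specification), the decision rule of the preceding paragraph reads:

    # Hypothetical numbers for the Maruti Zen example.
    roi_threshold = 0.75     # dynamic threshold for the ROI in the current interval
    raw_score     = 0.70     # matching score obtained inside the ROI window
    offset        = 0.08     # statistically determined worst-case-movement offset

    score = raw_score + offset if raw_score < roi_threshold else raw_score
    detected = score >= roi_threshold     # 0.78 >= 0.75, so the Maruti Zen is detected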
Whenever the Maruti Zen is detected, its distance from the Tata Indica will also be calculated and an alert will be transmitted to the driver.
The system embedded in the Tata Indica is capable of detecting one or more vehicles moving in front of it and is also capable of calculating the distance between the Tata Indica and the other vehicles.
We claim:
1. A method for detecting and tracking one or more objects in front of a host automobile, the method comprising:
capturing an image in a plurality of frames; and
processing the captured image for detecting and tracking the object, the processing further comprising:
applying one or more image detection techniques on a predefined area of the image in a valid frame to detect the object;
storing the detected object as a template and generating a dynamic threshold matching score for a region of interest in the image while maintaining a fixed matching score value for a surrounding region;
performing a template matching for a fixed number of consecutive frames;
modifying the matching score by including a predetermined offset value and comparing the modified matching score with the dynamic threshold value for tracking the detected object during HOG based detections; and
updating the template after a fixed interval of frames and generating an updated dynamic threshold value with respect to the updated template for dynamic and repetitive tracking of the detected object.
2. The method as claimed in claim 1, wherein the one or more image detection techniques include, but are not limited to, the Histogram of Oriented Gradients (HOG).
3. The method as claimed in claim 1, wherein the template is updated for a frame number multiple of five.
4. The method as claimed in claims 1 and 3, wherein the fixed number of consecutive frames comprises frame numbers before and after the frame number multiple of five.
5. The method as claimed in claim 1, wherein the predetermined offset value is calculated by performing an analysis of a plurality of test vectors.
6. The method as claimed in claim 1, wherein the method further comprises determining a distance of the detected object from the host vehicle by using Inverse Perspective Mapping.
7. The method as claimed in claim 1, wherein the method further comprises classifying the object by comparing it to a reference vector, the reference vector being obtained by using an SVM (Support Vector Machine).
8. A system for detecting and tracking one or more objects in front of a host automobile, the system comprising:
an image capturing device for capturing an image in a plurality of frames;
a processor configured for processing the image for detecting and tracking the object, the processor further comprising:
a detector for applying one or more object detection techniques on a predefined area of the image in a valid frame and storing the detected object as a template in a storage medium;
an evaluation module configured for generating a dynamic threshold matching score for a region of interest in the image while maintaining a fixed matching score value for a surrounding region, the evaluation module being further configured to update the template and the dynamic threshold value after a fixed interval of frames in order to provide a repetitive tracking;
a matching module for performing a template matching for obtaining a matching score for a fixed number of consecutive frames;
a tracking module configured to track the object by performing template matching.
9. The system as claimed in claim 8, wherein the image capturing device includes a rear view camera.
10. The system as claimed in claim 8, wherein the one or more image detection techniques include, but are not limited to, the Histogram of Oriented Gradients (HOG).
11. The system as claimed in claim 8, wherein the template is updated for a frame number multiple of five.
12. The system as claimed in claims 8 and 11, wherein the fixed number of consecutive frames comprises frame numbers before and after the frame number multiple of five.
13. The system as claimed in claim 8, wherein the predetermined offset value is calculated by performing an analysis of a plurality of test vectors.
14. The system as claimed in claim 8, wherein the system further comprises a calculation means for determining a distance of the detected object from the host vehicle by using the Inverse Perspective Mapping.
15. The system as claimed in claim 8, wherein the system further comprises a classifier configured to classify the object by comparing it to a reference vector, the reference vector being obtained by using an SVM (Support Vector Machine).