Abstract: A METHOD OF DETECTING A SPEED OF A VEHICLE USING A VISION-BASED TECHNIQUE ABSTRACT Multiple images of a number plate of the vehicle (16) are received from a camera (12) using an automatic number plate recognition (ANPR) engine and are represented as a local binary pattern (LBP) histogram. At least one feature is extracted from the received images using the LBP technique, and the occurrence of the number plate in another image is found using template matching performed in the LBP histogram feature space. A scale invariant feature transform (SIFT) match of the received images is calculated, and the SIFT features of one said received image are matched with those of another said received image to detect the best SIFT match pair. Actual coordinates of the match pair are estimated using a homography matrix, and a distance is calculated using a measuring technique. The vehicle speed is calculated from said distance value and a time value from said received images. (FIGURE 1)
Description: Complete Specification:
The following specification describes and ascertains the nature of this invention and the manner in which it is to be performed:
[0001] Field of the invention:
The invention relates to a method of detecting a speed of a vehicle using a vision-based technique.
[0002] Background of the invention:
Determining vehicle speed is an important task for traffic management, as information about the variation of the velocity distribution of vehicles may help to estimate the occurrence of traffic congestion. The information may also be used to issue fines when drivers exceed speed limits. Vehicle speed is measured by intrusive or non-intrusive technologies. Intrusive systems, usually based on inductive loop detectors, are highly sensitive and accurate. However, they have high installation and maintenance costs, complex installation and maintenance procedures, and can damage the roadway. Non-intrusive systems are mostly based on laser sensors, infrared sensors, Doppler radar, or audio-based sensors. They are portable, but expensive, and require frequent maintenance. Radar guns are reliable when a moving object is in the field of view and no other moving objects are nearby. They have the limitations of cosine error and radio interference and are less accurate. LIDAR can determine a vehicle's speed accurately, but it cannot be used while the measuring vehicle is in motion. It also requires the operator to actively target each vehicle. Both of these techniques require a camera in order to document offending vehicles.
[0003] US patent 9607220 discloses methods and systems for estimating the speed of passing vehicles based on License Plate Recognition (LPR) image information. The distance traversed by a vehicle between image frames is estimated based on the difference between a pixel location of a tracking point on the vehicle in a first image and a second pixel location of the tracking point in a second image of the vehicle. The distance is converted to a projected displacement on the roadway surface based on the mapping between pixel locations within the field of view of the image sensor and locations on the surface of the roadway. An estimate of vehicle speed is calculated from the displacement and the elapsed time between image frames.
[0004] Brief description of the accompanying drawings:
An embodiment of the disclosure is described with reference to the following accompanying drawings, in which:
[0005] Figure 1 illustrates a control unit for calculating a speed of a vehicle according to one embodiment of the invention; and
[0006] Figure 2 illustrates a flow chart of a method of calculating a speed of a vehicle according to the present invention.
Detailed description of the embodiments:
[0007] Figure 1 illustrates a control unit for calculating a speed of a vehicle according to one embodiment of the invention. The control unit 10 receives multiple images of a number plate of the vehicle 16 from a camera 12 located at different locations. The control unit 10 extracts at least one textural feature from the received images using a local binary pattern (LBP) histogram, which is referred to as the template of that number plate. The template is matched in the next frame/image, represented in LBP space. The match between the template histogram and the current histogram at each location in the current image is calculated using a coefficient. The location where the similarity is maximum is noted as the position of occurrence of the template number plate in the current image. The control unit 10 calculates a scale invariant feature transform (SIFT) match between the received images, i.e., the template number plate image and the matched number plate in the current frame. The control unit 10 finds the best matching SIFT pair locations using a Euclidean distance measure. The control unit 10 estimates actual coordinates of the match pair using a homography matrix and calculates the Euclidean distance. The control unit 10 calculates the vehicle speed from the distance value and a time value from the received images.
[0008] The construction of the control unit and its components is now explained in detail. The control unit 10 is chosen from a group of control units comprising a microprocessor, a microcontroller, a digital circuit, an integrated chip and the like. The images of the number plate of the vehicle 16 are taken by a camera 12 that is installed at a location and uses an automatic number plate recognition (ANPR) engine. Each location has a camera 12 installed, and the camera captures the images of the number plate of the vehicle 16 when the vehicle passes that location. The captured image is referred to as the first image and is taken as the vehicle 16 enters that location, at a time instant referred to as t=0.
[0009] The control unit 10 then receives another image of the number plate of the vehicle 16, referred to as the second image, at a next location from the same camera 12, i.e., after a time period from the start of the vehicle 16, at t = t+i, wherein i is the time increment from the start of the vehicle 16. For instance, if the vehicle number plate is captured at 10 o'clock at location "X", then the second image is captured at 10:30 at location "Y".
[0010] A method for calculating a speed of a vehicle 16 is described below and comprises the following steps. In step S1, multiple images of a number plate of the vehicle (16) are received from a camera (12) using an automatic number plate recognition engine and are represented as a local binary pattern (LBP) histogram. In step S2, at least one feature is extracted from the received images using said local binary pattern technique, and the most probable occurrence of the number plate in another image is found using template matching performed in the LBP histogram feature space. The LBP histogram feature extracted from the received images is referred to as the template of that number plate. The template is matched in another image, represented in LBP space. The match between the template histogram and the current histogram at each location in the current image is calculated using a coefficient, which according to one embodiment is a Bhattacharyya coefficient. The location where the similarity is maximum is noted as the position of occurrence of the template number plate in the current frame. In step S3, a scale invariant feature transform (SIFT) match of the received images which were matched using said template matching is calculated, and the SIFT features of one said received image are matched with those of another said received image to detect the best SIFT match pair. In step S4, actual coordinates of the match pair are estimated using a homography matrix and a distance is calculated using a measuring technique. In step S5, the vehicle speed is calculated from said distance value and a time value from said received images.
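By way of a non-limiting illustration of steps S1 and S2, the following sketch shows how the LBP histogram of a plate template may be matched against a later frame by scoring every candidate window with the Bhattacharyya coefficient. The library choices (scikit-image, OpenCV, NumPy), the LBP parameters, and the sliding-window step are assumptions made for the example only and do not form part of the specification.

```python
# Sketch of steps S1-S2: LBP histogram of the plate template and template
# matching by sliding the template over the current grayscale frame, scoring
# each location with a Bhattacharyya-based similarity. Parameters are assumed.
import cv2
import numpy as np
from skimage.feature import local_binary_pattern

P, R = 8, 1          # LBP neighbourhood: 8 samples on a radius-1 circle (assumed)
N_BINS = P + 2       # number of bins for the "uniform" LBP variant

def lbp_histogram(gray_patch):
    """Represent a grayscale patch as a normalised LBP histogram."""
    lbp = local_binary_pattern(gray_patch, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=N_BINS, range=(0, N_BINS), density=True)
    return hist.astype(np.float32)

def match_template_lbp(template_patch, current_frame, step=4):
    """Slide the template over the frame and return the location whose LBP
    histogram is most similar to the template histogram."""
    t_hist = lbp_histogram(template_patch)
    th, tw = template_patch.shape
    best_score, best_xy = -1.0, None
    for y in range(0, current_frame.shape[0] - th, step):
        for x in range(0, current_frame.shape[1] - tw, step):
            window = current_frame[y:y + th, x:x + tw]
            w_hist = lbp_histogram(window)
            # OpenCV returns a Bhattacharyya *distance* (smaller = more similar),
            # so convert it to a similarity score before taking the maximum.
            score = 1.0 - cv2.compareHist(t_hist, w_hist, cv2.HISTCMP_BHATTACHARYYA)
            if score > best_score:
                best_score, best_xy = score, (x, y)
    return best_xy, best_score
```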
[0011] The method is now explained in detail. The method is used to detect the speed using a vision-based technique. The number plate of the vehicle 16 is captured using an overhead camera 12 located at a location "x" at time t=0 using the automatic number plate recognition engine. The captured number plate is represented using a local binary pattern histogram. The captured number plate image is transmitted to the control unit 10. When the vehicle moves from the "x" location to the "y" location, the camera 12 present at that location captures another image of the number plate at time t = t+i, and the number plate is located by template matching of the LBP histogram of the template number plate against the current frame, represented in LBP histogram space.
[0012] The two images received by the control unit 10 from two different locations are referred to as the first image and the second image (also called the current image). It is to be noted that, in some parts of the description, the received images are referred to as one image and another image. The first image and the one image are the same; the second image and the another image are the same.
[0013] The control unit 10 extracts the features from the first image and the second image using the local binary pattern (LBP) technique. It is to be noted that the extraction of the features from the received images (the first image and the second image) can be performed by any other extraction technique known to a person skilled in the art. The extracted features in the LBP feature space of the first image are matched with the extracted features in the LBP feature space of the second image using template matching. The template is different for each number plate and is updated over time using a linear adaptive filter.
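The specification does not detail the "linear adaptive filter" used to update the template; one plausible reading, shown below purely as an assumption, is a first-order linear blend of the stored LBP histogram with the histogram of the newly matched region.

```python
import numpy as np

def update_template_hist(template_hist, matched_hist, alpha=0.1):
    """Assumed 'linear adaptive filter': blend the stored LBP histogram of the
    plate template with the histogram of the newly matched region.
    alpha (learning rate) is an assumed tuning parameter."""
    updated = (1.0 - alpha) * template_hist + alpha * matched_hist
    return updated / updated.sum()   # keep the result a valid normalised histogram
```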
[0014] After finding the location of the same number plate in the second image, the control unit 10 further calculates the SIFT (scale invariant feature transform) matching between both received images having the same number plate enclosed by a bounding box, i.e., the SIFT features of the first image are matched with the SIFT features of the second image. The SIFT matches between the first image and the second image are calculated to find the best SIFT match pair. A SIFT match pair is a pair that has close features in both images (the first image and the second image). After finding the SIFT match pair, the control unit 10 estimates the actual coordinates of the matched pair in both images using the homography matrix.
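An illustrative sketch of this SIFT matching step, using OpenCV's SIFT implementation and a brute-force matcher with the Euclidean (L2) norm, is given below; selecting the single pair with the smallest descriptor distance is one possible interpretation of the "best SIFT match pair" and is an assumption of the example.

```python
import cv2

def best_sift_match(plate_img_1, plate_img_2):
    """Compute SIFT keypoints/descriptors on both grayscale plate crops and
    return the keypoint pair with the smallest Euclidean descriptor distance."""
    sift = cv2.SIFT_create()                      # requires OpenCV >= 4.4
    kp1, des1 = sift.detectAndCompute(plate_img_1, None)
    kp2, des2 = sift.detectAndCompute(plate_img_2, None)
    if des1 is None or des2 is None:
        return None
    # Brute-force matcher with L2 (Euclidean) norm; crossCheck keeps only
    # mutually consistent matches between the two descriptor sets.
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = matcher.match(des1, des2)
    if not matches:
        return None
    best = min(matches, key=lambda m: m.distance)  # smallest descriptor distance
    pt1 = kp1[best.queryIdx].pt                    # (x, y) in the first image
    pt2 = kp2[best.trainIdx].pt                    # (x, y) in the second image
    return pt1, pt2
```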
[0015] The homography matrix is calculated offline using a checkerboard 14 at the same location where the camera 12 is placed. The checkerboard 14 is placed at a target place on a perfectly horizontal table, which is at the same height as the number plate, to prevent projection error. The image of the checkerboard 14 is captured, and the actual coordinates along with the image coordinates of the four corners of the checkerboard 14 are detected. Using a RANSAC (Random Sample Consensus) technique, the homography matrix is calculated. Each location has a corresponding homography matrix.
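A possible form of this offline calibration is sketched below, using OpenCV's checkerboard corner detector and findHomography with RANSAC. The board dimensions and square size are assumed example values, and the sketch uses all detected inner corners rather than only the four outer corners mentioned above.

```python
import cv2
import numpy as np

def homography_from_checkerboard(calib_image_gray, pattern_size=(7, 5), square_m=0.05):
    """Offline, per-location calibration sketch: estimate the image-to-world
    homography from a checkerboard lying in the number-plate plane.
    pattern_size (inner corners) and square_m (square side in metres) are
    assumed example values."""
    found, corners = cv2.findChessboardCorners(calib_image_gray, pattern_size)
    if not found:
        raise RuntimeError("checkerboard not found in calibration image")
    img_pts = corners.reshape(-1, 2).astype(np.float32)
    # World coordinates of the same corners on the board plane, in metres,
    # laid out row by row to mirror OpenCV's corner ordering.
    xs, ys = np.meshgrid(np.arange(pattern_size[0]), np.arange(pattern_size[1]))
    world_pts = (np.stack([xs, ys], axis=-1).reshape(-1, 2) * square_m).astype(np.float32)
    # RANSAC rejects badly localised corners; the reprojection threshold is in
    # destination units (metres here), so a small value such as 1 cm is used.
    H, _mask = cv2.findHomography(img_pts, world_pts, cv2.RANSAC, 0.01)
    return H
```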
[0016] Once the actual coordinates are estimated, the distance between the matched pair of points is calculated using a measuring technique. According to one embodiment of the invention, the measuring technique is a Euclidean measure. However, the measuring technique can be any other technique known to a person skilled in the art. The distance is calculated between the matched pair, corresponding to the two consecutive occurrences of the number plate in the first image and the second image.
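The projection of the matched pixel pair into road-plane coordinates and the Euclidean distance between the projected points may be computed as in the following sketch. It assumes both occurrences of the number plate are seen in views calibrated by the same homography; where each location has its own homography, the two projections would have to be expressed in a common world frame.

```python
import cv2
import numpy as np

def world_point(H, pixel_xy):
    """Map an image point to plane (world) coordinates using homography H."""
    pt = np.array([[pixel_xy]], dtype=np.float32)     # shape (1, 1, 2)
    return cv2.perspectiveTransform(pt, H)[0, 0]      # (X, Y) in metres

def travelled_distance(H, pt_first, pt_second):
    """Euclidean distance, in the road plane, between the matched SIFT pair
    found in the first and second images (single-homography assumption)."""
    p1 = world_point(H, pt_first)
    p2 = world_point(H, pt_second)
    return float(np.linalg.norm(p1 - p2))
```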
[0017] The control unit 10 determines the time at which the first image and the second image were captured and the difference between these times; from this time difference and the calculated distance, the speed at which the vehicle 16 has moved between these two locations is detected/calculated. If the detected speed exceeds a threshold value, the control unit 10 transmits a fine to the owner of the vehicle 16.
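The resulting speed computation is simple arithmetic, illustrated below; the timestamps and the 50 km/h threshold are assumed example values.

```python
def vehicle_speed_kmph(distance_m, t_first_s, t_second_s):
    """Speed from the road-plane distance (metres) and the two image
    timestamps (seconds); the factor 3.6 converts m/s to km/h."""
    dt = t_second_s - t_first_s
    if dt <= 0:
        raise ValueError("second image must be later than the first")
    return (distance_m / dt) * 3.6

# Example: 25 m covered in 1.5 s -> 60 km/h; a fine is flagged above an
# assumed 50 km/h limit.
speed = vehicle_speed_kmph(25.0, 0.0, 1.5)
if speed > 50.0:
    print(f"over the limit at {speed:.1f} km/h - issue fine")
```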
[0018] The above disclosed method and control unit provide an efficient, low-cost solution using the existing cameras that are installed at various locations in the city. The homography matrix for each location is computed offline and is location specific. The method further provides an effective speed calculation even with one captured number plate. The method further provides a texture-based template matching and SIFT matching technique for detecting the vehicle speed without disclosing the identity of the person driving the vehicle 16.
[0019] It should be understood that the embodiments explained in the description above are only illustrative and do not limit the scope of this invention. Many such embodiments and other modifications and changes to the embodiments explained in the description are envisaged. The scope of the invention is limited only by the scope of the claims.
Claims: We claim:
1. A control unit (10) for calculating a speed of a vehicle (16), said control unit (10) adapted to:
- receive multiple images of a number plate of said vehicle (16) from a camera (12) using an automatic number plate recognition engine and represent them in a local binary pattern (LBP) histogram;
- extract at least one feature from said received images using said local binary pattern technique and find a most probable occurrence of said number plate in another said received image using a template matching performed in the LBP histogram feature space;
- calculate a scale invariant feature transform (SIFT) match of said received images which are matched using said template matching and match the SIFT features of one said received image with those of another said received image for detecting a best SIFT match pair;
- estimate actual coordinates of said match pair using a homography matrix and calculate a distance using a measuring technique;
- calculate said vehicle speed from said distance value and a time value from said received images.
2. The control unit (10) as claimed in claim 1, wherein one said received image is captured at a start of said vehicle (16) (t=0) and another said received image is captured after a time period of said start of said vehicle (16) (t=t+i).
3. The control unit (10) as claimed in claim 1, wherein said extracted features in the LBP feature space detected in said one image are matched with said another image in said LBP feature space using a template matching technique.
4. The control unit (10) as claimed in claim 1, wherein each said vehicle (16) has a corresponding template which is updated in a database using a linear adaptive filter technique.
5. The control unit (10) as claimed in claim 1, wherein said SIFT match pair is a pair that has close features in said received images.
6. The control unit (10) as claimed in claim 1, wherein an image of a checkerboard (14) that is positioned at said location is captured for detecting world coordinates and image coordinates associated with corners of said checkerboard (14).
7. The control unit (10) as claimed in claim 6, wherein said homography matrix is generated using said detected world coordinates and said image coordinates.
8. The control unit (10) as claimed in claim 7, wherein each of said different locations has a corresponding homography matrix.
9. A method for calculating a speed of a vehicle (16), said method comprising:
- receiving multiple images of a number plate of said vehicle (16) from a camera (12) using an automatic number plate recognition engine and representing them in a local binary pattern (LBP) histogram;
- extracting at least one feature from said received images using said local binary pattern technique and finding a most probable occurrence of said number plate in another said received image using a template matching performed in the LBP histogram feature space;
- calculating a scale invariant feature transform (SIFT) match of said received images which are matched using said template matching and matching the SIFT features of one said received image with those of another said received image for detecting a best SIFT match pair;
- estimating actual coordinates of said match pair using a homography matrix and calculating a distance using a measuring technique;
- calculating said vehicle speed from said distance value and a time value from said received images.