Abstract: An image processing device detects, in a frame image, feature points (#1, #2) for specifications measurement which are feature points on the side surfaces of a vehicle and a feature point (#3) for specifications measurement which is a feature point on the bottom surface, and calculates the width of the vehicle by performing a motion stereo process on the feature points (#1, #2, #3) for specifications measurement. The feature points (#1 to #3) for specifications measurement detected by the image processing device are detected such that their positions have a degree of freedom on the side surfaces or the bottom surface of the vehicle.
VEHICLE DIMENSION MEASUREMENT PROCESSING APPARATUS, VEHICLE DIMENSION MEASURING METHOD, AND STORAGE MEDIUM
Technical Field
[0001]
The present invention relates to a vehicle dimension measurement processing apparatus, a vehicle dimension measuring method and a storage medium, and especially relates to an apparatus and a method for measuring a dimension (a width, a length, a height) of a vehicle.
Background Art
[0002]
At a toll gate of an expressway, a toll road and so on, there is a case in which a vehicle type judgement apparatus for judging a type of a vehicle passing through the toll gate is provided. Although various methods are used for judging the vehicle type, in recent years, it has been studied to obtain an image of the vehicle and perform the judgement of the vehicle type using the obtained image. Since it is possible to obtain various kinds of information regarding the vehicle from the vehicle image, the judgement of the vehicle type using the image of the vehicle is useful for increasing the accuracy of the judgement of the vehicle type.
[0003]
In the judgement of the vehicle type using the vehicle image, there is a case in which a feature point is extracted from the vehicle image and the judgement of the vehicle type is performed based on the extracted feature point. For example, Non Patent Literature 1 (J. Prokaj, G. Medioni, "3-D model based vehicle recognition", Applications of Computer Vision (WACV), 2009) discloses a technique in which a template is created by studying the feature point for each vehicle type using CG (computer graphics), a database having the template registered therein is prepared in advance, and the vehicle type is estimated by a matching process between the feature point extracted from the image and the template in the database. However, in this technique, it is necessary to prepare the template for each vehicle type.
[0004]
As another method of judging the vehicle type using the vehicle image, a technique is known in which the dimension (for example, the width, the length, the height) of the vehicle is calculated by image processing and the calculated vehicle dimension is used as information for judging the vehicle type. For example, Patent Literature 1 (JP H11-86185 A) discloses a technique in which geometric correction images of front and side surfaces of the vehicle are created by performing correction on an obtained vehicle image and a size of the vehicle is estimated from the geometric correction images of the front and side surfaces. More specifically, in the technique disclosed in Patent Literature 1, a contact position between a tire and a road surface is detected, and a vehicle position and a vehicle approach angle in the three-dimensional space are calculated from the contact position and installation conditions of an imaging device. The geometric correction images of the front and side surfaces of the vehicle are created by performing the correction on the obtained vehicle image based on the calculated vehicle position and vehicle approach angle in the three-dimensional space, and the size of the vehicle is estimated from the created geometric correction images.
[0005]
In addition, Non Patent Literature 2 (A. Lai, G. Fung and N. Yung, "Vehicle type classification from visual-based dimension estimation", IEEE, Intelligent Transportation Systems, pages 201-206, 2001) discloses a technique in which the dimension of the vehicle is estimated from an apparent vehicle size in the image. More specifically, in the technique disclosed in Non Patent Literature 2, a region in the image correlated with a vehicle contour is extracted as a hexagonal region. Coordinates of three vertexes closer to the road surface among vertexes of the hexagon are calculated, the width and length of the vehicle are calculated from the coordinates, and furthermore the height of the vehicle is calculated from a length of a side of the hexagon correlated with a height direction in the real space.
[0006]
In the method of calculating the dimension of the vehicle by the image processing, it is preferable that the vehicle dimension can be calculated by image processing in which certainty of extracting the feature point is not required, that is, image processing which is robust regarding the feature point detection. A method in which the vehicle dimension cannot be calculated, or the accuracy is decreased, when a specific feature point cannot be detected is not preferable. For example, in the technique disclosed in Non Patent Literature 2, unless the three vertexes closer to the road surface among the vertexes of the hexagon correlated with the vehicle contour can be detected as the feature points, it is not possible to calculate the vehicle dimension. If the constraints on the detected feature points are strong, the measurement of the vehicle dimension by image processing becomes unreliable.
[0007]
Under the above-mentioned background, provision of a technique in which a dimension (for example, a width, a length, a height) of a vehicle can be calculated by image processing which is robust regarding feature point detection is required.
Citation List
[Patent Literature]
[0008]
Patent Literature 1: Japanese Patent Application Publication JP H11-86185 A
[Non Patent Literature]
[0009]
Non Patent Literature 1: J. Prokaj, G. Medioni, "3-D model based vehicle recognition", Applications of Computer Vision (WACV), 2009
Non Patent Literature 2: A. Lai, G. Fung and N. Yung, "Vehicle type classification from visual-based dimension estimation", IEEE, Intelligent Transportation Systems, pages 201-206, 2001
Summary of the Invention
[0010]
Accordingly, an object of the present invention is to provide a technique in which a dimension (for example, a width, a length, a height) of a vehicle can be calculated by image processing which is robust regarding feature point detection.
[0011]
In one aspect of the present invention, a vehicle dimension measuring apparatus includes an imaging device which obtains frame images of a vehicle by successively imaging the vehicle, and an image processing device which calculates a width of the
vehicle based on the frame images. The image processing device includes a feature point detection means which detects a first feature point for measuring a dimension which is a feature point in a first plane of a rectangular parallelepiped that approximates a shape of the vehicle, a second feature point for measuring the dimension which is a feature point in a second plane of the rectangular parallelepiped and a third feature point for measuring the dimension which is a feature point in a third plane of the rectangular parallelepiped in at least one frame image among the frame images, and a vehicle dimension calculation means which calculates the width of the vehicle based on the first feature point for measuring the dimension, the second feature point for measuring the dimension and the third feature point for measuring the dimension. In this case, the first plane is a plane of the rectangular parallelepiped correlated with a first side surface of the vehicle, the second plane is a plane of the rectangular parallelepiped correlated with a second side surface of the vehicle, and the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle. The vehicle dimension calculation means calculates an amount of movement of the vehicle from an amount of movement of the third feature point, calculates three-dimensional coordinates of the first feature point and three-dimensional coordinates of the second feature point by performing a motion stereo process on the first feature point and the second feature point using the amount of movement of the vehicle, and calculates the width of the vehicle from the calculated three-dimensional coordinates of the first feature point and the calculated three-dimensional coordinates of the second feature point. A position of the first feature
point detected by the feature point detection means is arbitrary in the first plane, a position of the second feature point detected by the feature point detection means is arbitrary in the second plane, and a position of the third feature point detected by the feature point detection means is arbitrary in the third plane.
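As an illustrative aid (not part of the claimed subject matter), the width calculation outlined above can be sketched as follows. The sketch assumes a pinhole camera viewing the vehicle side-on, so that each side plane lies at a roughly constant depth from the camera; the function name, parameter names, and the numeric values in the usage are hypothetical.

```python
def vehicle_width(x1_a, x1_b, x2_a, x2_b, baseline, focal_px):
    """Illustrative motion-stereo width estimate (hypothetical names).

    x1_a, x1_b: image x-coordinates (relative to the optical axis) of the
    first feature point in two frames; x2_a, x2_b: the same for the second
    feature point; baseline: vehicle travel between the two frames, derived
    from the movement of the third (bottom-surface) feature point;
    focal_px: focal length in pixels.  Depth follows Z = f * B / d.
    """
    def depth(xa, xb):
        disparity = abs(xa - xb)          # image shift caused by the motion
        return focal_px * baseline / disparity

    z1 = depth(x1_a, x1_b)                # depth of the near side plane
    z2 = depth(x2_a, x2_b)                # depth of the far side plane
    return abs(z2 - z1)                   # width = separation of the planes
```

Because the first and second feature points may sit anywhere on their respective side planes, only the depths of the two planes matter, which is why arbitrary feature point positions suffice in this formulation.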
[0012]
In one embodiment, the feature point detection means may be configured to extract a moving region correlated with the vehicle from the frame image, extract tangential lines on the moving region in the frame image correlated with lines contacting the vehicle in real space, and determine an intersection point in the first plane among intersection points of the extracted tangential lines as the first feature point for measuring the dimension. In this case, the feature point detection means may be configured to determine an intersection point in the second plane among the intersection points of the extracted tangential lines as the second feature point, and may be configured to determine an intersection point in the third plane among the intersection points of the extracted tangential lines as the third feature point.
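A minimal sketch of the intersection step described above: tangential lines fitted to the silhouette of the moving region can be represented in homogeneous form ax + by + c = 0 and intersected via a cross product. The representation and function name are illustrative assumptions, not the embodiment's actual implementation.

```python
import numpy as np

def line_intersection(l1, l2):
    """Intersect two image lines given as homogeneous triples (a, b, c)
    for ax + by + c = 0; returns (x, y), or None for parallel lines.
    Intersection points falling in the first, second or third plane would
    then be taken as the corresponding feature points for measuring the
    dimension."""
    p = np.cross(np.asarray(l1, float), np.asarray(l2, float))
    if abs(p[2]) < 1e-9:
        return None                 # parallel tangents do not intersect
    return (p[0] / p[2], p[1] / p[2])
```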
[0013]
In one embodiment, the feature point detection means may be configured to extract a moving region correlated with the vehicle from the frame image, extract tangential lines on the moving region in the frame image correlated with lines contacting the vehicle in real space, and determine the first feature point by searching a feature point along a first search path which is constituted by a portion of the extracted tangential lines in the first plane. In this case, the feature point detection means may be configured to determine the second feature point by searching a feature point along a second search path
which is constituted by a portion of the extracted tangential lines in the second plane, and may be configured to determine the third feature point by searching a feature point along a third search path which is constituted by a portion of the extracted tangential lines in the third plane.
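The search along a tangential path can be sketched as follows; the feature-response map, the threshold value, and the function name are placeholders assumed purely for illustration.

```python
import numpy as np

def search_feature_along_path(response, path, threshold=0.5):
    """Walk the pixel coordinates in `path` (a portion of a tangential
    line, ordered along the search direction) over a feature-response
    map and return the first position whose response exceeds
    `threshold`."""
    for y, x in path:
        if response[y, x] > threshold:
            return (y, x)
    return None                     # no feature found on this path
```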
[0014]
In one embodiment, the feature point detection means may include a first template database having a first template of a component supposed to exist on the first side surface of the vehicle and first feature point position data indicating a position of a feature point on the first template. In this case, the feature point detection means may be configured to calculate a first matching position which is a position of the first template at which the first template in the first template database matches the best with the frame image by means of template matching, and detect the first feature point by supposing that the first feature point is at a position determined from the first matching position and the first feature point position data.
[0015]
Further, the feature point detection means may include a second template database having a second template of a component supposed to exist on the second side surface of the vehicle and second feature point position data indicating a position of a feature point on the second template. In this case, the feature point detection means may be configured to calculate a second matching position which is a position of the second template at which the second template in the second template database matches the best with the frame image by means of template matching, and detect the second feature point by supposing that the second feature point is at a
position determined from the second matching position and the second feature point position data.
[0016]
Moreover, the feature point detection means may include a third template database having a third template of a component supposed to exist on the bottom surface of the vehicle and third feature point position data indicating a position of a feature point on the third template. In this case, the feature point detection means may be configured to calculate a third matching position which is a position of the third template at which the third template in the third template database matches the best with the frame image by means of template matching, and detect the third feature point by supposing that the third feature point is at a position determined from the third matching position and the third feature point position data.
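The template-matching step common to the first, second and third template databases can be sketched with a plain zero-mean normalized cross-correlation. The exhaustive search below is a simplified stand-in (a practical system would typically use an optimized routine such as OpenCV's matchTemplate), and all names are illustrative.

```python
import numpy as np

def match_template(frame, template):
    """Return the top-left (y, x) position where `template` best matches
    `frame` under zero-mean normalized cross-correlation.  The feature
    point is then this matching position offset by the feature point
    position data stored with the template."""
    H, W = frame.shape
    h, w = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            win = frame[y:y + h, x:x + w]
            win = win - win.mean()
            denom = np.sqrt((win ** 2).sum()) * t_norm
            score = (win * t).sum() / denom if denom > 0 else -np.inf
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos
```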
[0017]
In another aspect of the present invention, a vehicle dimension measuring apparatus includes an imaging device which obtains frame images of a vehicle by successively imaging the vehicle, and an image processing device which calculates a length of the vehicle based on the frame images. The image processing device includes a feature point detection means which detects a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle, a fourth feature point for measuring the dimension which is a feature point in a fourth plane of the rectangular parallelepiped and a fifth feature point for measuring the dimension which is a feature point in a fifth plane of the rectangular parallelepiped in at least one frame image among the frame images, and a vehicle
dimension calculation means which calculates the length of the vehicle based on the third feature point for measuring the dimension, the fourth feature point for measuring the dimension and the fifth feature point for measuring the dimension. In this case, the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, the fourth plane is a plane of the rectangular parallelepiped correlated with a front surface of the vehicle, and the fifth plane is a plane of the rectangular parallelepiped correlated with a rear surface of the vehicle. The vehicle dimension calculation means calculates an amount of movement of the vehicle from an amount of movement of the third feature point, calculates three-dimensional coordinates of the fourth feature point and three-dimensional coordinates of the fifth feature point by performing a motion stereo process on the fourth feature point and the fifth feature point using the amount of movement of the vehicle, and calculates the length of the vehicle from the calculated three-dimensional coordinates of the fourth feature point and the calculated three-dimensional coordinates of the fifth feature point. A position of the third feature point detected by the feature point detection means is arbitrary in the third plane, a position of the fourth feature point detected by the feature point detection means is arbitrary in the fourth plane, and a position of the fifth feature point detected by the feature point detection means is arbitrary in the fifth plane.
[0018]
In still another aspect of the present invention, a vehicle dimension measuring apparatus includes an imaging device which obtains frame images of a vehicle by successively imaging the vehicle, and
an image processing device which calculates a height of the vehicle based on the frame images. The image processing device includes a feature point detection means which detects a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle and a sixth feature point for measuring the dimension which is a feature point in a sixth plane of the rectangular parallelepiped in at least one frame image among the frame images, and a vehicle dimension calculation means which calculates the height of the vehicle based on the third feature point for measuring the dimension and the sixth feature point for measuring the dimension. In this case, the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, and the sixth plane is a plane of the rectangular parallelepiped correlated with an upper surface of the vehicle. The vehicle dimension calculation means calculates an amount of movement of the vehicle from an amount of movement of the third feature point, calculates three-dimensional coordinates of the third feature point and three-dimensional coordinates of the sixth feature point by performing a motion stereo process on the third feature point and the sixth feature point using the amount of movement of the vehicle, and calculates the height of the vehicle from the calculated three-dimensional coordinates of the third feature point and the calculated three-dimensional coordinates of the sixth feature point. A position of the third feature point detected by the feature point detection means is arbitrary in the third plane, and a position of the sixth feature point detected by the feature point detection means is arbitrary in the sixth plane.
[0019]
In still another aspect of the present invention, a vehicle dimension measuring method includes: obtaining frame images of a vehicle by successively imaging the vehicle; detecting a first feature point for measuring a dimension which is a feature point in a first plane of a rectangular parallelepiped that approximates a shape of the vehicle, a second feature point for measuring the dimension which is a feature point in a second plane of the rectangular parallelepiped and a third feature point for measuring the dimension which is a feature point in a third plane of the rectangular parallelepiped in the frame image; and calculating a width of the vehicle based on the first feature point, the second feature point and the third feature point. In this case, the first plane is a plane of the rectangular parallelepiped correlated with a first side surface of the vehicle, the second plane is a plane of the rectangular parallelepiped correlated with a second side surface of the vehicle, and the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle. The calculating a width of the vehicle includes: calculating an amount of movement of the vehicle from an amount of movement of the third feature point; calculating three-dimensional coordinates of the first feature point and three-dimensional coordinates of the second feature point by performing a motion stereo process on the first feature point and the second feature point using the amount of movement of the vehicle; and calculating the width of the vehicle from the calculated three-dimensional coordinates of the first feature point and the calculated three-dimensional coordinates of the second feature point. A position of the first feature point detected by the step of the detecting is
arbitrary in the first plane, and a position of the second feature point detected by the step of the detecting is arbitrary in the second plane. In addition, a position of the third feature point detected by the step of the detecting is arbitrary in the third plane.
[0020]
In still another aspect of the present invention, a vehicle dimension measuring method includes: obtaining frame images of a vehicle by successively imaging the vehicle; detecting a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle, a fourth feature point for measuring the dimension which is a feature point in a fourth plane of the rectangular parallelepiped and a fifth feature point for measuring the dimension which is a feature point in a fifth plane of the rectangular parallelepiped in at least one frame image among the frame images; and calculating a length of the vehicle based on the third feature point, the fourth feature point and the fifth feature point. In this case, the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, the fourth plane is a plane of the rectangular parallelepiped correlated with a front surface of the vehicle, and the fifth plane is a plane of the rectangular parallelepiped correlated with a rear surface of the vehicle. The calculating a length of the vehicle includes: calculating an amount of movement of the vehicle from an amount of movement of the third feature point; calculating three-dimensional coordinates of the fourth feature point and three-dimensional coordinates of the fifth feature point by performing a motion stereo process on the fourth
feature point and the fifth feature point using the amount of movement of the vehicle; and calculating the length of the vehicle from the calculated three-dimensional coordinates of the fourth feature point and the calculated three-dimensional coordinates of the fifth feature point. A position of the third feature point detected by the step of the detecting is arbitrary in the third plane, a position of the fourth feature point detected by the step of the detecting is arbitrary in the fourth plane, and a position of the fifth feature point detected by the step of the detecting is arbitrary in the fifth plane.
[0021]
In still another aspect of the present invention, a vehicle dimension measuring method includes: obtaining frame images of a vehicle by successively imaging the vehicle; detecting a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle and a sixth feature point for measuring the dimension which is a feature point in a sixth plane of the rectangular parallelepiped in at least one frame image among the plurality of frame images; and calculating a height of the vehicle based on the third feature point and the sixth feature point. In this case, the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, and the sixth plane is a plane of the rectangular parallelepiped correlated with an upper surface of the vehicle. The calculating a height of the vehicle includes: calculating an amount of movement of the vehicle from an amount of movement of the third feature point; calculating three-dimensional coordinates of the third feature point and three-dimensional coordinates of the sixth feature point by
performing a motion stereo process on the third feature point and the sixth feature point using the amount of movement of the vehicle; and calculating the height of the vehicle from the calculated three-dimensional coordinates of the third feature point and the calculated three-dimensional coordinates of the sixth feature point. A position of the third feature point detected by the step of the detecting is arbitrary in the third plane, and a position of the sixth feature point detected by the step of the detecting is arbitrary in the sixth plane.
[0022]
The above-mentioned vehicle dimension measuring method may be performed by executing a program by a computer. In this case, the program may be installed in the computer using a storage medium in which the program is recorded.
[0023]
According to the present invention, it is possible to provide the technique in which the dimension (for example, the width, the length, the height) of the vehicle can be calculated by image processing which is robust regarding the feature point detection.
Brief Description of the Drawings
[0024]
Fig. 1A is a conceptual diagram for indicating a position of a feature point for measuring a dimension used for measuring a width of a vehicle according to one embodiment;
Fig. 1B is a conceptual diagram for indicating a position of a feature point for measuring the dimension used for measuring a length of the vehicle according to the present embodiment;
Fig. 1C is a conceptual diagram for indicating a
position of a feature point for measuring the dimension used for measuring a height of the vehicle according to the present embodiment;
Fig. 2 is a block diagram for indicating an example of a hardware configuration of a vehicle dimension measurement processing apparatus according to one embodiment;
Fig. 3A is a flowchart for indicating image processing for calculating the dimension of the vehicle according to the present embodiment;
Fig. 3B is the flowchart for indicating the image processing for calculating the dimension of the vehicle according to the present embodiment;
Fig. 4 is a diagram for conceptually indicating detection of a moving feature point according to the present embodiment;
Fig. 5 is a diagram for conceptually indicating clustering of the moving feature point according to the present embodiment;
Fig. 6 is a diagram for conceptually indicating detection of a moving region according to the present embodiment;
Fig. 7A is a diagram for indicating procedure to obtain tangential lines on each moving region and intersection points of the tangential lines according to the present embodiment;
Fig. 7B is a diagram for indicating the procedure to obtain the tangential lines on each moving region and the intersection points of the tangential lines according to the present embodiment;
Fig. 8A is a diagram for indicating procedure to detect the feature points for measuring the dimension in side and bottom surfaces of the vehicle from the obtained tangential lines on the moving region and the obtained intersection points according to the present embodiment;
Fig. 8B is a diagram for indicating procedure to detect the feature points for measuring the dimension in front and rear surfaces of the vehicle from the obtained tangential lines on the moving region and the obtained intersection points according to the present embodiment;
Fig. 8C is a diagram for indicating procedure to detect the feature point for measuring the dimension in an upper surface of the vehicle from the obtained tangential lines on the moving region and the obtained intersection points according to the present embodiment;
Fig. 9A is a diagram for indicating the procedure to detect the feature points for measuring the dimension in the side and bottom surfaces of the vehicle by searching the feature points along search paths defined by the obtained tangential lines on the moving region and the obtained intersection points according to the present embodiment;
Fig. 9B is a diagram for indicating the procedure to detect the feature points for measuring the dimension in the front and rear surfaces of the vehicle by searching the feature points along search paths defined by the obtained tangential lines on the moving region and the obtained intersection points according to the present embodiment;
Fig. 9C is a diagram for indicating the procedure to detect the feature point for measuring the dimension in the upper surface of the vehicle by searching the feature points along search paths defined by the obtained tangential lines on the moving region and the obtained intersection points according to the present embodiment;
Fig. 10 is a conceptual diagram for indicating contents of a template database used for detecting the feature point for measuring the dimension in each
surface of the vehicle according to the present embodiment;
Fig. 11 is a diagram for conceptually indicating a method of detecting the feature point for measuring the dimension in each surface of the vehicle by
template matching according to the present embodiment;
Fig. 12 is a diagram for conceptually indicating detection of entering and leaving of the vehicle into and from a processing area according to the present embodiment;
Fig. 13A is a diagram for conceptually explaining motion stereo process performed in the present embodiment; and
Fig. 13B is a diagram for conceptually explaining the motion stereo process performed in the present embodiment.
Description of Embodiments
[0025]
Firstly, an outline of a vehicle dimension measuring method according to one embodiment will be explained. In the present embodiment, images of a vehicle which is traveling are successively captured (that is, the images of the vehicle are captured at different times), and detection and tracking of feature points are performed on the plurality of images. Furthermore, a motion stereo process is performed on the feature points. The motion stereo process is a technique in which images of a moving object are captured at different times, a moving distance of the object is calculated by some method (it may be calculated from the obtained images themselves), and three-dimensional coordinates of the object in a three-dimensional coordinate system which moves together with the object are calculated from the obtained images and the moving distance of the object.
It is possible to calculate the three-dimensional coordinates of the object by a method similar to a general stereo process, using the information that the positions of the object differ from each other by the moving distance even though the images have been captured from the same place. In the present invention, the three-dimensional coordinates of the feature points (more specifically, the three-dimensional coordinates of the feature points in the three-dimensional coordinate system which moves together with the vehicle) are calculated by means of the motion stereo process, and the dimension of the vehicle is calculated from the three-dimensional coordinates.
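Under the simplifying assumptions of a pinhole camera and a vehicle translating parallel to the image x-axis, the motion stereo computation outlined above reduces to ordinary stereo triangulation with the roles of camera and object swapped. The following sketch uses hypothetical names and ignores lens distortion and non-parallel motion.

```python
import numpy as np

def motion_stereo_point(xa, ya, xb, baseline, focal_px):
    """Triangulate a feature point observed at (xa, ya) in one frame and
    (xb, ya) in a later frame, the object having translated `baseline`
    parallel to the image x-axis between the frames.  Pixel coordinates
    are relative to the principal point; focal_px is the focal length in
    pixels.  Returns (X, Y, Z) in the coordinate system moving with the
    object."""
    d = xa - xb                   # disparity caused by the object's motion
    Z = focal_px * baseline / d   # depth from the stereo relation Z = f*B/d
    X = xa * Z / focal_px
    Y = ya * Z / focal_px
    return np.array([X, Y, Z])
```

This is the same relation used in two-camera stereo, with the object's own motion supplying the baseline; hence the need to first estimate the moving distance of the vehicle from the bottom-surface feature point.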
[0026]
One of the features of a vehicle dimension measuring process according to the present embodiment is in a technique of extracting feature points which are to be used for calculating a width, a length and a height of the vehicle. Figs. 1A to 1C are the diagrams for indicating the feature points used for measuring the width, length and height of the vehicle, respectively. In the present invention, a rectangular parallelepiped 30, which approximates a contour of the vehicle, is defined, and the feature points at positions along specific planes of the rectangular parallelepiped 30 are used as the feature points for measuring the width, length and height of the vehicle. Note that the rectangular parallelepiped 30 is only imaginarily determined. In Figs. 1A to 1C, symbols v1 to v8 represent the vertices of the rectangular parallelepiped 30. Note that the feature points used for measuring the width, length or height of the vehicle are referred to as feature points for measuring a dimension (in order to distinguish them from
feature points used for detecting a moving object from the images (as mentioned later)).
[0027]
In detail, as shown in Fig. 1A, the following three types of the feature points for measuring the dimension are used in measuring the width of the vehicle.
Feature point #1 for measuring the dimension: the feature point in a side surface 31 of the vehicle (strictly speaking, the feature point which is at a position along the plane of the rectangular parallelepiped 30 correlated with the side surface 31 of the vehicle);
Feature point #2 for measuring the dimension: the feature point in a side surface 32 of the vehicle (strictly speaking, the feature point which is at a position along the plane of the rectangular parallelepiped 30 correlated with the side surface 32 of the vehicle);
Feature point #3 for measuring the dimension: the feature point in a bottom surface 33 of the vehicle (strictly speaking, the feature point which is at a position along the plane of the rectangular parallelepiped 30 correlated with the bottom surface 33 of the vehicle).
Note that the side surface 31 is a side surface which is opposed to a camera for measuring the traveling vehicle, and the side surface 32 is the other side surface. The plane of the rectangular parallelepiped 30 correlated with the side surface 31 includes the vertices v1 to v4, and the plane of the rectangular parallelepiped 30 correlated with the side surface 32 includes the vertices v5 to v8.
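For illustration, the correspondence between the surfaces 31 to 36 and the vertices v1 to v8 of the rectangular parallelepiped 30 can be held in a small table. The vertex assignments below follow the description where it is explicit (side surfaces, the contour edges named later) and are otherwise hypothetical:

```python
# Surfaces of the imaginary rectangular parallelepiped 30, each given by
# the numbers of the four vertices (v1..v8) in its plane.
# side_31/side_32 follow the text; the remaining faces are a hypothetical
# but self-consistent reading (each vertex lies in exactly three faces).
BOX_SURFACES = {
    "side_31":   (1, 2, 3, 4),   # side surface facing the camera
    "side_32":   (5, 6, 7, 8),   # opposite side surface
    "bottom_33": (3, 4, 7, 8),   # consistent with contour edges v3v7, v3v4
    "front_34":  (1, 4, 5, 8),   # consistent with contour edges v1v5, v1v4
    "rear_35":   (2, 3, 6, 7),   # hypothetical
    "top_36":    (1, 2, 5, 6),   # hypothetical
}

def surfaces_containing(vertex):
    """Return the names of the surfaces whose planes include a vertex."""
    return [name for name, verts in BOX_SURFACES.items() if vertex in verts]
```

Such a table makes it easy to check, for any extracted feature point associated with a vertex, which surfaces' dimension measurements it can contribute to.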
[0028]
In the present embodiment, the feature points #1, #2, #3 for measuring the dimension can be extracted from arbitrary positions, respectively, as long as the conditions that the feature points #1, #2, #3 for measuring the dimension are in the side surface 31, in the side surface 32 and in the bottom surface 33, respectively, are satisfied. For example, regarding the side surface 31 opposed to the camera, it is allowed to use the feature point at an arbitrary position in the side surface 31 as the feature point #1 for measuring the dimension.
[0029]
Similarly, regarding the side surface 32 on the opposite side from the camera, it is allowed to use the feature point at an arbitrary position in the side surface 32 as the feature point #2 for measuring the dimension. However, since the side surface 32 is actually hidden by the vehicle, in practice, the feature point extracted from an arbitrary position along the side v5v6 and the side v6v7 on the vehicle contour is used as the feature point in the side surface 32.
[0030]
Furthermore, regarding the bottom surface 33, it is allowed to use the feature point at an arbitrary position in the bottom surface 33 as the feature point #3 for measuring the dimension. However, since the bottom surface 33 is actually hidden by the vehicle, in practice, the feature point extracted from an arbitrary position along the side v3v7 and the side v3v4 on the vehicle contour is used as the feature point in the bottom surface 33.
[0031]
As described later, among the extracted feature points #1 to #3 for measuring the dimension, the feature point #3 for measuring the dimension in the bottom surface 33 is used for calculating a moving distance of the vehicle. Furthermore, the motion stereo process is performed on the feature points #1, #2 for measuring the dimension using the calculated moving distance, the three-dimensional coordinates of the feature points #1, #2 for measuring the dimension in the three-dimensional coordinate system which moves together with the vehicle (for example, relative coordinates with respect to the feature point #3 for measuring the dimension) are calculated, and the width of the vehicle is calculated from the calculated three-dimensional coordinates of the feature points #1, #2 for measuring the dimension.
[0032]
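A much-simplified sketch of this motion stereo computation is given below. It assumes the vehicle motion between the two frames acts as a baseline parallel to the image plane, so that the standard disparity relation Z = f·B/d applies; the function names and the width-as-distance simplification are illustrative assumptions, not the exact process of the embodiment:

```python
import numpy as np

def motion_stereo_point(p_prev, p_curr, moving_distance, focal_px):
    # Translation-only ("motion") stereo: the vehicle's movement between
    # two frames serves as the baseline B. Assuming the motion is parallel
    # to the image plane, depth follows Z = f * B / disparity.
    disparity = np.linalg.norm(np.asarray(p_curr, float) - np.asarray(p_prev, float))
    z = focal_px * moving_distance / disparity
    # Back-project the current pixel (relative to the principal point)
    # into camera coordinates.
    x, y = p_curr
    return np.array([x * z / focal_px, y * z / focal_px, z])

def vehicle_width(fp1_prev, fp1_curr, fp2_prev, fp2_curr,
                  moving_distance, focal_px):
    # Reconstruct one feature point on each side surface and take the
    # distance between them as a (simplified) width estimate.
    p1 = motion_stereo_point(fp1_prev, fp1_curr, moving_distance, focal_px)
    p2 = motion_stereo_point(fp2_prev, fp2_curr, moving_distance, focal_px)
    return float(np.linalg.norm(p1 - p2))
```

The same reconstruction applies, with other feature-point pairs, to the length (feature points #4, #5) and the height (feature points #3, #6) described below.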
On the other hand, as shown in Fig. 1B, the following three types of the feature points for measuring the dimension are used in measuring the length of the vehicle.
Feature point #3 for measuring the dimension: the feature point in the bottom surface 33 of the vehicle;
Feature point #4 for measuring the dimension: the feature point in a front surface 34 of the vehicle (strictly speaking, the feature point which is at a position along the plane of the rectangular parallelepiped 30 correlated with the front surface 34 of the vehicle);
Feature point #5 for measuring the dimension: the feature point in a rear surface 35 of the vehicle (strictly speaking, the feature point which is at a position along the plane of the rectangular parallelepiped 30 correlated with the rear surface 35 of the vehicle).
[0033]
Similarly to the feature points #1 to #3 for measuring the dimension, the feature points #4, #5 for measuring the dimension can also be extracted from arbitrary positions as long as the conditions that the feature points #4, #5 for measuring the dimension are in the front surface 34 and in the rear surface 35 are satisfied. However, the positions of the feature points actually extracted depend on the arrangement of the camera. If the camera is arranged so as to image the traveling vehicle from the back side, the rear surface 35 is opposed to the camera. In this case, regarding the rear surface 35, it is allowed to use the feature point at an arbitrary position in the rear surface 35 as the feature point #5 for measuring the dimension. On the other hand, regarding the front surface 34 which is actually hidden by the vehicle, the feature point extracted from an arbitrary position along the side v1v5 and the side v1v4 on the vehicle contour is used as the feature point #4 for measuring the dimension in the front surface 34. Fig. 1B illustrates the case in which such a camera arrangement is taken. On the contrary, if the camera is arranged so as to image the traveling vehicle from the front side, the front surface 34 is opposed to the camera. In this case, regarding the front surface 34, it is allowed to use the feature point at an arbitrary position in the front surface 34 as the feature point #4 for measuring the dimension. On the other hand, regarding the rear surface 35 which is actually hidden by the vehicle, the feature point extracted from an arbitrary position along the sides, which are capable of being imaged from the camera, on the vehicle contour is used as the feature point #5 for measuring the dimension in the rear surface 35.
[0034]
In calculating the length of the vehicle, the motion stereo process is performed on the feature points #4, #5 for measuring the dimension using the moving distance of the vehicle, which is calculated based on the feature point #3 for measuring the dimension in the bottom surface 33. Thereby, the three-dimensional coordinates of the feature points #4, #5 for measuring the dimension in the three-dimensional coordinate system which moves together with the vehicle (for example, the relative coordinates with respect to the feature point #3 for measuring the dimension) are calculated, and the length of the vehicle is calculated from the calculated three-dimensional coordinates of the feature points #4, #5 for measuring the dimension.
[0035]
Furthermore, as shown in Fig. 1C, the following two types of the feature points for measuring the dimension are used in measuring the height of the vehicle:
Feature point #3 for measuring the dimension: the feature point in the bottom surface 33 of the vehicle (strictly speaking, the feature point which is at a position along the surface of the rectangular parallelepiped 30 correlated with the bottom surface 33 of the vehicle);
Feature point #6 for measuring the dimension: the feature point in the upper surface 36 of the vehicle (strictly speaking, the feature point which is at a position along the surface of the rectangular parallelepiped 30 correlated with the upper surface 36 of the vehicle).
[0036]
Similarly to the feature points #1 to #5 for measuring the dimension, the feature point #6 for measuring the dimension can also be extracted from an arbitrary position as long as the condition that the feature point #6 for measuring the dimension is in the upper surface 36 is satisfied. Note that, in the case of imaging the vehicle traveling on the road, since the camera is usually arranged above the vehicle (for example, the camera is arranged on a gantry provided on the road), the whole of the upper surface 36 of the vehicle is usually imaged.
[0037]
In calculating the height of the vehicle, the motion stereo process is performed on the feature point #6 for measuring the dimension using the moving distance of the vehicle calculated based on the feature point #3 for measuring the dimension in the bottom surface 33. Thereby, the three-dimensional coordinates of the feature points #3, #6 for measuring the dimension in the three-dimensional coordinate system which moves together with the vehicle (for example, the relative coordinates with respect to the feature point #3 for measuring the dimension) are calculated, and the height of the vehicle is calculated from the calculated three-dimensional coordinates of the feature points #3, #6 for measuring the dimension.
[0038]
Subsequently, the vehicle dimension measurement processing apparatus and the vehicle dimension measuring method according to one embodiment will be described in detail.
[0039]
Fig. 2 is the block diagram for indicating an example of the configuration of the vehicle dimension measurement processing apparatus 10 according to the present embodiment. The vehicle dimension measurement processing apparatus 10 includes a camera 1 and an image processing device 2. The camera 1 is an imaging device for imaging the vehicle traveling on the road. The camera 1 generates image data 21 of the vehicle image (the frame image), and supplies the generated image data 21 to the image processing device 2. The image processing device 2 measures the dimension of the vehicle, more specifically, the width, length and height of the vehicle captured in the image correlated with the image data 21, by performing image processing on the image data 21 received from the camera 1, and generates vehicle dimension data 22 indicating the width, length and height of the vehicle.
[0040]
The image processing device 2 includes an image processing IC (integrated circuit) 3, an external interface 4, an external storage device 5, a memory 6, and a ROM (read only memory) 7. The external interface 4 is connected to the camera 1 and supplies the image data 21 received from the camera 1 to the image processing IC 3. The external storage device 5 stores data which are generated in image processing by the image processing device 2. The data stored in the external storage device 5 include the vehicle dimension data 22. The memory 6 is used as a working area in calculation processing by the image processing IC 3. The ROM 7 stores a program which is to be performed by the image processing IC 3. The program stored in the ROM 7 includes a vehicle dimension measurement processing program 7a, which is a program for performing image processing for measuring the vehicle dimension.
[0041]
Note that, in the above, the configuration of the image processing device 2 for calculating the vehicle dimension from the image data 21 has been described, but vehicle type judgement using the vehicle dimension data 22 may also be performed with the image processing device 2. In this case, the program required for the vehicle type judgement may be installed in the ROM 7.
[0042]
In this case, as the ROM 7, a rewritable non-volatile memory (such as a flash memory) may be used. In this case, the vehicle dimension measurement processing program 7a (and the program used for the vehicle type judgement if necessary) may be installed in the ROM 7 from the outside. In the installation of the vehicle dimension measurement processing program 7a in the ROM 7, a storage medium which has stored the vehicle dimension measurement processing program 7a may be used. In this case, a device for accessing the storage medium is provided in the image processing device 2. Instead of the ROM 7, a suitable storage device (e.g. a hard disk drive (HDD) or a solid state drive) may be used.
[0043]
The image processing IC 3 includes an arithmetic module 11, an image input interface 12, a data input/output interface 13, a memory controller 14, and a ROM controller 15. The arithmetic module 11, the image input interface 12, the data input/output interface 13, the memory controller 14 and the ROM controller 15 are connected via an internal bus 16. The arithmetic module 11 performs the vehicle dimension measurement processing program 7a while using the memory 6 as the working area, and performs image processing for calculating the vehicle dimension from the image data 21. The image input interface 12 is an interface that receives the image data 21 from the camera 1. The data input/output interface 13 is an interface for performing access to the external storage device 5. The memory controller 14 is an interface for performing access to the memory 6. Furthermore, the ROM controller 15 is an interface for performing access to the ROM 7.
[0044]
The image processing for calculating the dimension of the vehicle mentioned below is performed using the hardware shown in Fig. 2. This image processing will be described below in detail.
[0045]
Fig. 3A and Fig. 3B are flowcharts for indicating the image processing for calculating the dimension of the vehicle according to one embodiment. The above-mentioned vehicle dimension measurement processing program 7a is a program code group for performing the image processing. The image processing for calculating the dimension of the vehicle roughly includes the following three stages:
1. Detection and tracking of feature points for detecting the moving object (steps S01 to S04 in Fig. 3A)
2. Detection and separation of the moving object, and detection and tracking of the feature points for measuring the dimension (steps S05 to S07 in Fig. 3A)
3. Calculation of the vehicle dimension based on the feature points for measuring the dimension (steps S08 to S12 in Fig. 3B)
Each of the stages will be described below in detail.
[0046]
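Before each stage is detailed, the three stages above can be connected in a top-level skeleton such as the following (a hypothetical sketch; the stage functions are illustrative stand-ins for the processing of steps S01 to S12, not part of the embodiment):

```python
from typing import Callable, List, Sequence

def run_dimension_pipeline(frames: Sequence,
                           detect_and_track: Callable,
                           separate_and_pick: Callable,
                           calc_dimensions: Callable) -> List[tuple]:
    # Stage 1 (steps S01 to S04): detect and track feature points
    # for detecting the moving object over the captured frames.
    tracks = detect_and_track(frames)
    # Stage 2 (steps S05 to S07): separate moving objects and detect
    # the feature points for measuring the dimension per vehicle.
    vehicles = separate_and_pick(tracks)
    # Stage 3 (steps S08 to S12): calculate (width, length, height)
    # for each detected vehicle.
    return [calc_dimensions(v) for v in vehicles]
```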
1. Detection and tracking of the feature points for detecting the moving object
With reference to Fig. 3A, firstly, after the initialization of the image processing device 2 is performed (step S01), the image data 21 of the frame images obtained by imaging the vehicle are successively captured (step S02). Furthermore, the feature points included in the frame image correlated with the image data 21 are detected, and the feature points are tracked (step S03). However, the feature points detected and tracked in step S03 do not necessarily match the above-mentioned feature points for measuring the dimension. Therefore, in order to distinguish them from the feature points for measuring the dimension, the feature points detected and tracked in step S03 are hereinafter referred to as feature points for detecting the moving object. In step S03, a list 23 of the feature points for detecting the moving object, which is a list that enumerates the feature points for detecting the moving object to be detected and tracked, is generated. When new feature points for detecting the moving object are detected in a region into which the moving object (which is the vehicle in many cases) has entered, these new feature points for detecting the moving object are added to the list 23 of the feature points for detecting the moving object. Moreover, the feature points for detecting the moving object which have disappeared toward the outside of the image or the feature points for detecting the moving object which have not been able to be tracked are removed from the list 23 of the feature points for detecting the moving object.
[0047]
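The maintenance of the list 23 described above (adding newly detected points, removing points that left the image or were lost) can be sketched as follows; the data layout and function name are illustrative assumptions:

```python
def update_tracking_list(track_list, tracked, new_points, image_size):
    """Maintain the 'list 23' of moving-object feature points (sketch).

    track_list : dict mapping point id -> last known (x, y) position
    tracked    : dict mapping point id -> new (x, y), or None if lost
    new_points : iterable of (x, y) newly detected in an entry region
    image_size : (width, height) of the frame image
    """
    w, h = image_size
    next_id = max(track_list, default=-1) + 1
    updated = {}
    for pid, pos in tracked.items():
        # Remove points that could not be tracked.
        if pos is None:
            continue
        # Remove points that have disappeared toward the outside of the image.
        x, y = pos
        if 0 <= x < w and 0 <= y < h:
            updated[pid] = pos
    # Add newly detected feature points under fresh ids.
    for p in new_points:
        updated[next_id] = p
        next_id += 1
    return updated
```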
The capture of the image data 21 (step S02) and the detection and tracking of the feature points for detecting the moving object (step S03) are performed repeatedly one after another until an end signal is given to the image processing device 2. When the end signal is given to the image processing device 2 (step S04), the image processing by the image processing device 2 comes to an end.
[0048]
2. Detection and separation of the moving object, and detection and tracking of the feature points for measuring the dimension
At this stage, feature points correlated with each vehicle are detected among the feature points enumerated in the list 23 of the feature points for detecting the moving object, that is, the feature points for detecting the moving object currently tracked. Specifically, firstly, among the feature points for detecting the moving object enumerated in the list 23 of the feature points for detecting the moving object, feature points correlated with the vehicle (i.e. feature points excluding feature points not correlated with the vehicle) are detected as moving feature points (step S05). More specifically, among the feature points for detecting the moving object enumerated in the list 23 of the feature points for detecting the moving object, feature points which are stationary and feature points whose movement is different from the movement of the vehicle are removed. Since a vehicle traveling on the road generally moves in the direction of the road, it is possible to judge that feature points without such movement are feature points that do not correspond to the vehicle.
[0049]
Fig. 4 is the diagram for indicating an example of the detection of the moving feature points in step S05. As shown in the left side diagram in Fig. 4, in step S03, the feature points 41 in a series of images are detected, and further, the feature points 41 which are moving are tracked. In the left side diagram in Fig. 4, the feature points 41 being tracked are illustrated so as to be connected to one another by line segments. The feature points which are judged to be stationary are removed from the list 23 of the feature points for detecting the moving object.
[0050]
Furthermore, among the feature points being tracked, feature points which satisfy the following conditions (1) to (3) are detected as the moving feature points correlated with the vehicle:
(1) Tracking has been able to be performed over a prescribed number NSTD or more of serial frame images. (The prescribed number NSTD is given as a parameter.)
(2) The magnitude of a vector connecting a start point (the position at which the tracking on the image is started regarding the corresponding feature point) to the current point (i.e. the latest position of the corresponding feature point on the image) is equal to μ × NTRC or more. Note that μ is a prescribed parameter value (technically, a threshold of a moving amount per unit time) and NTRC is the number of frame images in which the tracking is performed regarding the corresponding feature point.
(3) The angle between the vector directed to the current point from the start point and a vector directed to a vanishing point on the image from the start point is equal to a prescribed threshold angle or less. Note that the above-mentioned vanishing point is a vanishing point defined in a perspective image when the image of the vehicle is approximated to the perspective image. In the right side diagram in Fig. 4, the vanishing point 43 is illustrated.
[0051]
The right side diagram in Fig. 4 illustrates an example of the moving feature points detected in step S05. In the right side diagram in Fig. 4, the feature points 42a, 42b, 42c being tracked over the prescribed number or more of serial frame images are illustrated. Note that reference numeral 42a-1 indicates the start point of the feature point 42a, and reference numeral 42a-3 indicates the current point of the feature point 42a. Reference numeral 42a-2 indicates a position of the feature point 42a in a frame image between the frame image correlated with the start point and the frame image correlated with the current point. Similarly, reference numeral 42b-1 indicates the start point of the feature point 42b, and reference numeral 42b-3 indicates the current point of the feature point 42b. Reference numeral 42b-2 indicates a position of the feature point 42b in a frame image between the frame image correlated with the start point and the frame image correlated with the current point. Further, reference numeral 42c-1 indicates the start point of the feature point 42c, and reference numeral 42c-3 indicates the current point of the feature point 42c. Reference numeral 42c-2 indicates a position of the feature point 42c in a frame image between the frame image correlated with the start point and the frame image correlated with the current point.
[0052]
For the feature point 42a, the judgement whether or not the magnitude of the vector connecting the start point 42a-1 to the current point 42a-3 is equal to μ × NTRC or more and the judgement whether or not the angle between the vector directed to the current point 42a-3 from the start point 42a-1 and the vector directed to the vanishing point 43 from the start point 42a-1 is equal to the prescribed threshold value θTH1 or less are performed. Based on these judgements, whether or not the feature point 42a is extracted as the moving feature point is judged.
[0053]
Regarding the feature points 42b, 42c, similar judgements are performed. In the following, explanation will be made under the assumption that the feature points 42a, 42b, 42c have been detected as the moving feature points in step S05.
[0054]
As shown in Fig. 3A, after the detection of the moving feature points (step S05), clustering of the detected moving feature points is performed (step S06). Referring to Fig. 5, this clustering is performed based on the distance to the vanishing point 43 of each feature point and the orientation of the vector directed to the current point from the start point. Note that, as the distance to the vanishing point 43 of a feature point, the distance from the current point to the vanishing point 43 (in Fig. 5, this distance is illustrated as the distance DM) may be used, the distance from the start point to the vanishing point 43 may be used, or the average distance during the tracking of the feature point may be used. According to this clustering, a group of the feature points correlated with one vehicle is classified into the same cluster.
[0055]
Fig. 5 illustrates an example of the clustering in step S06. Referring to Fig. 5, in step S05, it is supposed that six feature points, the feature points 42a, 42b, 42c, 42d, 42e and 42f, are extracted. In this case, for each of the feature points 42a, 42b, 42c, 42d, 42e, 42f, the distance to the vanishing point 43 and the orientation of the vector directed to the current point from the start point are calculated. In Fig. 5, the distance to the vanishing point 43 of each feature point is defined as the distance DM from the current point to the vanishing point 43, and the orientation of the vector directed to the current point from the start point is defined as an angle from the horizontal direction in the image.
[0056]
For example, regarding the feature point 42a, the distance from the current point 42a-3 to the vanishing point 43 and the angle θ between the vector directed to the current point 42a-3 from the start point 42a-1 and the horizontal direction in the image are calculated. Regarding the other feature points 42b, 42c, 42d, 42e, 42f, the distance from the current point to the vanishing point 43 and the orientation of the vector directed to the current point from the start point are calculated in the same manner. Based on the calculated distance from the current point to the vanishing point 43 and the calculated orientation of the vector directed to the current point from the start point, clustering of the feature points 42a, 42b, 42c, 42d, 42e, 42f is performed. In the example shown in Fig. 5, as a result of the clustering, the feature points 42a, 42b, 42c are classified into the cluster correlated with the first vehicle, and the feature points 42d, 42e, 42f are classified into the cluster correlated with the second vehicle.
[0057]
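One plausible (greedy) realization of the step-S06 clustering over the pairs (distance to the vanishing point, vector orientation) is sketched below; the tolerances are illustrative assumptions, and a production system might use a standard clustering algorithm instead:

```python
def cluster_moving_points(points, dist_tol=30.0, angle_tol_deg=10.0):
    """Greedy sketch of the step-S06 clustering: a point joins a cluster
    when both its distance to the vanishing point and its motion-vector
    orientation are close to the cluster's first member.

    points : list of (distance_to_vanishing_point, angle_deg) per point
    Returns a list of clusters, each a list of indices into `points`.
    """
    clusters = []
    for i, (d, a) in enumerate(points):
        for cluster in clusters:
            d0, a0 = points[cluster[0]]
            if abs(d - d0) <= dist_tol and abs(a - a0) <= angle_tol_deg:
                cluster.append(i)
                break
        else:
            clusters.append([i])
    return clusters
```

In the Fig. 5 example, the six feature points would fall into two such clusters, one per vehicle.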
As shown in Fig. 3A, after clustering the moving feature points, a moving region correlated with each vehicle is extracted from the image, and further, for each moving region, detection and tracking of the feature points for measuring the dimension are performed (step S07).
[0058]
Firstly, as shown in Fig. 6, based on the moving feature points of each cluster, the moving region in which the vehicle is imaged is extracted in each image. The moving region is extracted for each cluster. The moving region is extracted as a region having a polygonal shape which corresponds to the vehicle contour in the image. In this case, the moving region is extracted such that the feature points of each cluster correspond to the vertices of the polygon.
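As one plausible way to form such a polygonal moving region from a cluster's feature points, a convex hull of the points can be taken (an illustrative sketch; the embodiment does not specify the polygon construction):

```python
def convex_hull(points):
    """Monotone-chain convex hull: returns the hull vertices in
    counterclockwise order, as one way to turn a cluster's feature
    points into a polygonal moving region."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```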
[0059]
Furthermore, regarding the moving region correlated with each vehicle, a process of detecting the above-mentioned feature points for measuring the dimension and a process of tracking them are performed. Note that, as explained with reference to Fig. 1A, the feature points #1, #2 for measuring the dimension are the feature points in the side surfaces 31, 32 of the vehicle, respectively, and the feature point #3 for measuring the dimension is the feature point in the bottom surface 33 of the vehicle. Moreover, the feature point #4 for measuring the dimension is the feature point in the front surface 34 of the vehicle, the feature point #5 for measuring the dimension is the feature point in the rear surface 35 of the vehicle, and further, the feature point #6 for measuring the dimension is the feature point in the upper surface 36 of the vehicle. It should be noted that the number of the feature points #1 for measuring the dimension to be extracted is not necessarily limited to one; a plurality of the feature points #1 for measuring the dimension may be extracted. The same applies to the other feature points #2 to #6 for measuring the dimension.
[0060]
Note that the tracking of the feature points for measuring the dimension may be performed by means of a method in which the feature points for measuring the dimension are detected for each frame image and the feature points for measuring the dimension are associated with one another among the frame images. Alternatively, the tracking of the feature points for measuring the dimension may be performed by means of a method in which the feature points for measuring the dimension are detected in a frame image (an initial frame image) in which the moving region initially appears and motion tracking of the feature points for measuring the dimension in subsequent frame images is performed.
[0061]
In either case, since the process of detecting the feature points for measuring the dimension is performed for at least the initial frame, in the following, firstly, the process of detecting the feature points for measuring the dimension will be explained. The process of detecting the feature points for measuring the dimension is performed using at least one of the following methods (1) to (3). It should be noted that the number of the feature points #1 for measuring the dimension to be extracted is not necessarily limited to one; a plurality of the feature points #1 for measuring the dimension may be extracted. Therefore, it is possible that the plurality of the feature points #1 for measuring the dimension is extracted using a plurality of the following methods (1) to (3). The same applies to the other feature points #2 to #6 for measuring the dimension.
[0062]
Method (1):
In the method (1) of detecting the feature points for measuring the dimension, tangential lines on each vehicle are extracted and intersection points of the tangential lines are calculated. The feature points for measuring the dimension are selected and determined from the calculated intersection points. The above-mentioned "tangential line" means a line in the image which corresponds to a tangential line on the vehicle in real space. Figs. 7A and 7B illustrate an example of the process for calculating the tangential lines on the vehicle and the intersection points thereof. There is a case in which the tangential lines are also tangential lines on the moving region in the image (the tangential lines indicated by reference numerals 61 to 66), but the case is not limited thereto (e.g. the tangential lines indicated by reference numerals 67 to 69).
[0063]
As shown in diagram (1) in Figs. 7A and 7B, in the method (1), firstly, the tangential lines 61, 62 correlated with lines which are over the road surface and parallel to the traffic lanes in the real space are extracted. Since it can be determined, based on the spatial arrangement of the camera 1, at what angle the lines over the road surface and parallel to the traffic lanes in the real space are imaged in the image, this is information that is known in advance. Using this fact, the tangential lines correlated with the lines which are parallel to the traffic lanes as well as over the road surface are extracted. Note that the tangential line 61 is a tangential line on a near side of the vehicle (when viewed from the camera 1) among the extracted tangential lines correlated with the lines parallel to the traffic lanes, and the tangential line 62 is a tangential line on a far side of the vehicle.
[0064]
Furthermore, as shown in diagram (2) in Figs. 7A and 7B, the tangential lines correlated with lines which are over the road surface and perpendicular to the traffic lanes in the real space are extracted. As with the lines parallel to the traffic lanes in the real space, it can be determined, based on the spatial arrangement of the camera 1, at what angle the lines over the road surface and perpendicular to the traffic lanes in the real space are imaged in the image. Therefore, this is information that is known in advance. Using this fact, the tangential lines correlated with the lines perpendicular to the traffic lanes in the real space are extracted. In diagram (2) in Figs. 7A and 7B, the tangential line 63 is a tangential line on a near side of the vehicle (when viewed from the camera 1) among the extracted tangential lines correlated with the lines perpendicular to the traffic lanes, and the tangential line 64 is a tangential line on a far side of the vehicle.
[0065]
In addition, the intersection point 51 of the tangential line 61 and the tangential line 63 is calculated, and the intersection point 52 of the tangential line 62 and the tangential line 64 is calculated.
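Such intersection points can be computed with a generic line-intersection helper; the homogeneous-line representation below (each line as coefficients of ax + by + c = 0) is an illustrative choice:

```python
def line_intersection(l1, l2):
    """Intersection of two lines, each given as (a, b, c) with
    a*x + b*y + c = 0, solved by Cramer's rule."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel lines: no finite intersection
    x = (b1 * c2 - b2 * c1) / det
    y = (a2 * c1 - a1 * c2) / det
    return (x, y)
```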
[0066]
Furthermore, as shown in diagram (3) in Figs. 7A and 7B, the tangential lines correlated with lines which are perpendicular to the road surface in the real space are extracted. Since it is information that is known in advance at what angle the lines perpendicular to the road surface in the real space are imaged in the image, it is possible to extract the tangential lines on the moving region correlated with the lines perpendicular to the road surface in the real space using this fact. In diagram (3) in Figs. 7A and 7B, among the extracted tangential lines correlated with the lines perpendicular to the road surface, the tangential line which is on a far side of the vehicle is indicated by reference numeral 65, and the tangential lines which are on a near side of the vehicle are indicated by reference numerals 66, 67. Note that the tangential line 67 is extracted as a line passing through the intersection point 51 of the tangential lines 61, 63 on the near side of the vehicle.
[0067]
In addition, as shown in diagram (3), the intersection point 53 of the tangential line 61 and the tangential line 65, the intersection point 54 of the tangential line 64 and the tangential line 65, the intersection point 55 of the tangential line 63 and the tangential line 66 and the intersection point 56 of the tangential line 62 and the tangential line 66 are calculated.
[0068]
Furthermore, as shown in diagram (3) in Figs. 7A and 7B, the tangential lines 68, 69 correlated with lines which are located away from and above the road surface, which are in a plane parallel to the road surface and which are perpendicular or parallel to the traffic lanes are extracted. Note that the tangential line 68 is the tangential line correlated with the line perpendicular to the traffic lanes, and the tangential line 69 is the tangential line correlated with the line parallel to the traffic lanes. The tangential line 68 is extracted as a line passing through the intersection point 56. In addition, the intersection point 57 of the tangential line 67 and the tangential line 68 is calculated, and the tangential line 69 is extracted as a line passing through the intersection point 57.
[0069]
Figs. 8A to 8C are diagrams indicating the procedure to determine each feature point for measuring the dimension among the calculated intersection points. More specifically, Fig. 8A is the diagram indicating the method to determine the feature points #1, #2 for measuring the dimension in the side surfaces 31, 32 of the vehicle and the feature point #3 for measuring the dimension in the bottom surface 33 of the vehicle. Moreover, Fig. 8B is the diagram indicating the method to determine the feature point #4 for measuring the dimension in the front surface 34 of the vehicle and the feature point #5 for measuring the dimension in the rear surface 35 of the vehicle. Fig. 8C is the diagram indicating the method to determine the feature point #6 for measuring the dimension in the upper surface 36 of the vehicle. Note that the method of determining the feature point for measuring the dimension in the bottom surface 33 of the vehicle is illustrated not only in Fig. 8A, but also in Figs. 8B and 8C.
[0070]
Referring to Fig. 8A, the feature point #1 for measuring the dimension is selected from among the intersection points in the side surface 31 of the vehicle (i.e. the intersection points 51, 53, 54, 57) out of the intersection points calculated as mentioned above. Note that all of the intersection points 51, 53, 54, 57 may be determined as the feature points #1 for measuring the dimension. Alternatively, one or more intersection points appropriately selected from among the intersection points 51, 53, 54, 57 may be determined as the feature points #1 for measuring the dimension.
[0071]
Furthermore, the feature point #2 for measuring the dimension is selected from among the intersection points (i.e. the intersection points 52, 55, 56) in the side surface 32 of the vehicle. Note that all of the intersection points 52, 55, 56 may be determined as the feature points #2 for measuring the dimension. Alternatively, one or more intersection points appropriately selected from among the intersection points 52, 55, 56 may be determined as the feature points #2 for measuring the dimension.
[0072]
In addition, the feature point #3 for measuring the dimension is selected from among the intersection points (i.e. the intersection points 51, 53, 55) in the bottom surface 33 of the vehicle. Note that all of the intersection points 51, 53, 55 may be determined as the feature points #3 for measuring the dimension. Alternatively, one or more intersection points appropriately selected from among the intersection points 51, 53, 55 may be determined as the feature points #3 for measuring the dimension.
[0073]
Furthermore, as shown in Fig. 8B, the feature point #4 for measuring the dimension is selected from among the intersection points (i.e. the intersection points 52, 53, 54) in the front surface 34 of the vehicle. Note that all of the intersection points 52, 53, 54 may be determined as the feature points #4 for measuring the dimension. Alternatively, one or more intersection points appropriately selected from among the intersection points 52, 53, 54 may be determined as the feature points #4 for measuring the dimension.
[0074]
In addition, the feature point #5 for measuring the dimension is selected from among the intersection points (i.e. the intersection points 51, 55, 56, 57) in the rear surface 35 of the vehicle. Note that all of the intersection points 51, 55, 56, 57 may be determined as the feature points #5 for measuring the dimension. Alternatively, one or more intersection points appropriately selected from among the intersection points 51, 55, 56, 57 may be determined as the feature points #5 for measuring the dimension.
[0075]
Furthermore, as shown in Fig. 8C, the feature point #6 for measuring the dimension is selected from among the intersection points (i.e. the intersection points 52, 56, 57) in the upper surface 36 of the vehicle. Note that all of the intersection points 52, 56, 57 may be determined as the feature points #6 for measuring the dimension. Alternatively, one or more intersection points appropriately selected from among the intersection points 52, 56, 57 may be determined as the feature points #6 for measuring the dimension.
[0076]
Method (2):
In the method (2) of detecting the feature points for measuring the dimension, after the tangential lines and the intersection points are calculated similarly to the method (1), the feature point in each surface which is easily tracked is searched for along a search path which is defined by the calculated tangential lines and the calculated intersection points. The feature point obtained by this search is determined as the feature point for measuring the dimension. Figs. 9A to 9C are diagrams indicating the procedure to determine each feature point for measuring the dimension by the method (2).
[0077]
Referring to Fig. 9A, in determining the feature point #1 for measuring the dimension in the side surface 31, a path constituted by a portion of the above-mentioned calculated tangential lines in the side surface 31 is determined as the search path, and the search for the feature point is performed around this search path. In this case, the intersection points are used for obtaining the portion of the tangential lines in the side surface 31 in determining this search path. More specifically, the search for the feature point is performed around the search path constituted by the portion of the tangential line 65 between the intersection point 54 and the intersection point 53, the portion of the tangential line 62 between the intersection point 53 and the intersection point 51 and the portion of the tangential line 67 between the intersection point 51 and the intersection point 57. The feature point obtained by this search is determined as the feature point #1 for measuring the dimension. The number of the determined feature points #1 for measuring the dimension may be one or more.
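The "search around the search path" described above can be sketched as sampling candidate pixels along the polyline 54 → 53 → 51 → 57 and scoring each neighborhood with a simple cornerness measure. The following is a minimal illustration only; the gradient-magnitude score, the sampling step and the search radius are assumptions made here, not the specification's method:

```python
import numpy as np

def sample_polyline(points, step=1.0):
    """Sample integer pixel positions along a polyline of (x, y) vertices."""
    samples = []
    for (x0, y0), (x1, y1) in zip(points[:-1], points[1:]):
        length = np.hypot(x1 - x0, y1 - y0)
        n = max(int(length / step), 1)
        for t in np.linspace(0.0, 1.0, n + 1):
            samples.append((int(round(x0 + t * (x1 - x0))),
                            int(round(y0 + t * (y1 - y0)))))
    return samples

def search_feature_point(image, path, radius=2):
    """Return the pixel near the search path with the largest gradient magnitude."""
    gy, gx = np.gradient(image.astype(float))  # np.gradient returns (d/drow, d/dcol)
    score = np.hypot(gx, gy)
    best, best_score = None, -1.0
    h, w = image.shape
    for (px, py) in sample_polyline(path):
        # scan a small window around each sampled path position
        for y in range(max(py - radius, 0), min(py + radius + 1, h)):
            for x in range(max(px - radius, 0), min(px + radius + 1, w)):
                if score[y, x] > best_score:
                    best, best_score = (x, y), score[y, x]
    return best
```

Restricting the search to a band around the path keeps the cost low and biases the detection toward points that lie on the relevant surface of the vehicle.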
[0078]
In determining the feature point #2 for measuring the dimension in the side surface 32, a path constituted by a portion of the calculated tangential lines in the side surface 32 is determined as the search path, and the search for the feature point is performed around this search path. More specifically, the search for the feature point is performed around the search path constituted by the portion of the tangential line 66 between the intersection point 55 and the intersection point 56 and the portion of the tangential line 62 between the intersection point 56 and the intersection point 52. The feature point obtained by this search is determined as the feature point #2 for measuring the dimension. The number of the determined feature points #2 for measuring the dimension may be one or more.
[0079]
Furthermore, in determining the feature point #3 for measuring the dimension in the bottom surface 33, a path constituted by a portion of the calculated tangential lines in the bottom surface 33 is determined as the search path, and the search for the feature point is performed around this search path. More specifically, the search for the feature point is performed around the search path constituted by the portion of the tangential line 61 between the intersection point 53 and the intersection point 51 and the portion of the tangential line 63 between the intersection point 51 and the intersection point 55. The feature point obtained by this search is determined as the feature point #3 for measuring the dimension. The number of the determined feature points #3 for measuring the dimension may be one or more.
[0080]
Further, referring to Fig. 9B, in determining the feature point #4 for measuring the dimension in the front surface 34, a path constituted by a portion of the above-mentioned calculated tangential lines in the front surface 34 is determined as the search path, and the search for the feature point is performed around this search path. More specifically, the search for the feature point is performed around the search path constituted by the portion of the tangential line 65 between the intersection point 53 and the intersection point 54 and the portion of the tangential line 64 between the intersection point 54 and the intersection point 52. The feature point obtained by this search is determined as the feature point #4 for measuring the dimension. The number of the determined feature points #4 for measuring the dimension may be one or more.
[0081]
Further, in determining the feature point #5 for measuring the dimension in the rear surface 35, a path constituted by a portion of the calculated tangential lines in the rear surface 35 is determined as the search path, and the search for the feature point is performed around this search path. More specifically, the search for the feature point is performed around the search path constituted by the portion of the tangential line 66 between the intersection point 55 and the intersection point 56, the portion of the tangential line 68 between the intersection point 56 and the intersection point 57, the portion of the tangential line 67 between the intersection point 57 and the intersection point 51, and the portion of the tangential line 63 between the intersection point 51 and the intersection point 55. The feature point obtained by this search is determined as the feature point #5 for measuring the dimension. The number of the determined feature points #5 for measuring the dimension may be one or more.
[0082]
Further, referring to Fig. 9C, in determining the feature point #6 for measuring the dimension in the upper surface 36, a path constituted by a portion of the calculated tangential lines in the upper surface 36 is determined as the search path, and the search for the feature point is performed around this search path. More specifically, the search for the feature point is performed around the search path constituted by the portion of the tangential line 68 between the intersection point 57 and the intersection point 56 and the portion of the tangential line 62 between the intersection point 56 and the intersection point 52. The feature point obtained by this search is determined as the feature point #6 for measuring the dimension. The number of the determined feature points #6 for measuring the dimension may be one or more.
[0083]
Method (3):
In the method (3) of detecting the feature points for measuring the dimension, regarding each surface of the vehicle, the search for the feature point is performed directly (that is, without referring to the tangential lines and the intersection points which are calculated by the above-mentioned method (1)). The feature point obtained by this search is determined as the feature point for measuring the dimension. Figs. 10 and 11 are diagrams indicating the procedure to determine the feature point for measuring the dimension by the method (3).
[0084]
In the case in which the method (3) of detecting the feature point for measuring the dimension is adopted, a template database including templates of components which are supposed to exist on each surface of the vehicle is prepared, and template matching is performed using the templates. Fig. 10 is the diagram conceptually indicating an example of the contents of the prepared template database. The prepared template database includes the following six databases:
- a template database 71 including the templates of the components on the side surface 31, which is on the camera side;
- a template database 72 including the templates of the components on the side surface 32, which is on the other side from the camera;
- a template database 73 including the templates of the components on the bottom surface 33;
- a template database 74 including the templates of the components on the front surface 34;
- a template database 75 including the templates of the components on the rear surface 35; and
- a template database 76 including the templates of the components on the upper surface 36.
In each of the template databases 71 to 76, feature point position data indicating the position of the feature point on the template are given to the template of each component.
[0085]
For example, regarding the template database 71, the templates created from images of components (e.g. a tire, a window, a mirror, a lamp) that are supposed to be on the side surface 31 are registered, and further, the position of the feature point is designated for each template. Black dots in Fig. 10 indicate the feature points defined for each template. In order to accommodate multiple vehicle types, a plurality of templates is prepared for each component in the template database 71. Similarly, regarding the other template databases 72 to 76, the templates created from images of components that are supposed to be on each surface are registered and the feature point is designated for each template. The template databases 71 to 76 are stored in an appropriate storage means, for example, the external storage device 5 shown in Fig. 2.
[0086]
Fig. 11 is the diagram conceptually explaining the detection of the feature point for measuring the dimension by the template matching. Note that Fig. 11 indicates the detection of the feature point #5 for measuring the dimension in the rear surface 35 using the templates registered in the template database 75. Regarding the feature point #5 for measuring the dimension in the rear surface 35, a matching process with the templates (that is, the templates of the components on the rear surface 35) in the template database 75 is performed on the entire frame image which is a processing target. Similarity with each template is calculated for each position of each frame image, and for each component, the best match template and the position of the template (the matching position) are calculated. Furthermore, the position of the feature point for measuring the dimension is determined from the position of the best match template and the position of the feature point designated by the feature point position data concerning the best match template. By this procedure, the feature point for measuring the dimension is detected.
[0087]
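The matching step described above can be sketched as a brute-force similarity scan over the frame image, followed by adding the stored feature-point offset to the best matching position. The sum-of-squared-differences similarity below is one possible choice, an assumption made here rather than the specification's similarity measure:

```python
import numpy as np

def match_template(frame, template):
    """Return (x, y) of the top-left corner where SSD with the template is smallest.

    Brute-force illustration of the matching process; a practical system
    would typically use normalized correlation and an image pyramid.
    """
    fh, fw = frame.shape
    th, tw = template.shape
    best, best_ssd = None, None
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            patch = frame[y:y + th, x:x + tw]
            ssd = np.sum((patch.astype(float) - template.astype(float)) ** 2)
            if best_ssd is None or ssd < best_ssd:
                best, best_ssd = (x, y), ssd
    return best

def feature_point(frame, template, offset):
    """Matching position plus the feature-point offset from the position data."""
    mx, my = match_template(frame, template)
    return (mx + offset[0], my + offset[1])
```

Here `offset` plays the role of the feature point position data registered with each template: the detected feature point is the matching position shifted by the offset designated for that template.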
In the above, the detection of the feature point #5 for measuring the dimension in the rear surface 35 has been explained. Regarding the other surfaces (the side surfaces 31, 32, the bottom surface 33, the front surface 34, the upper surface 36), the detection of the feature points for measuring the dimension by the template matching is performed in a similar manner.
[0088]
The detection of the feature point for measuring the dimension according to the above-mentioned methods (1) to (3) may be performed on each frame image. In this case, the tracking of the feature point for measuring the dimension is performed by a method in which the feature points for measuring the dimension are associated with each other among the frame images.
[0089]
The detection of the feature point for measuring the dimension according to the above-mentioned methods (1) to (3) may be performed on the frame image (the initial frame image) in which the moving region first appears. In this case, regarding subsequent frame images, the motion tracking of the detected feature points for measuring the dimension is performed using an appropriate algorithm. The detection and tracking of the feature points for measuring the dimension in step S07 are performed as described above.
[0090]
With reference to Fig. 3A again, when the detection and tracking of the feature points for measuring the dimension in step S07 are performed, historical data of the detection and tracking of the feature points for measuring the dimension, that is, data indicating the positions of the feature points for measuring the dimension in the frame images at each time, are stored in a list 24 of the feature points for measuring the dimension. The list 24 of the feature points for measuring the dimension is stored in an appropriate storage device (for example, the external storage device 5 in Fig. 2). The list 24 of the feature points for measuring the dimension is created for each moving region every time a moving region appears in the image.
[0091]
3. Calculation of the vehicle dimension based on the feature points for measuring the dimension
After the process of the detection and tracking of the feature points for measuring the dimension is performed, the vehicle dimension is calculated using the feature points for measuring the dimension. More specifically, as shown in Fig. 3B, firstly, the time when the whole of each vehicle has entered into a particular processing area defined in each frame image is detected (step S08), and further, the time when the whole of the vehicle has left from the processing area is detected (step S09). Fig. 12 is the diagram conceptually indicating the detection of entering and leaving of the vehicle into and from the processing area. A particular area (for example, an area defined in a lower portion of the frame image) of each frame image is defined as the processing area 44, and the time when the entire moving region correlated with each vehicle enters into the processing area 44 and the time when the entire moving region leaves from the processing area 44 are detected.
[0092]
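The detection of steps S08 and S09 reduces to checking, frame by frame, whether the moving region's bounding box lies entirely inside the processing area 44, and recording the first frame where it does and the first subsequent frame where it no longer does. A sketch under the assumption that bounding boxes are given as (xmin, ymin, xmax, ymax) tuples:

```python
def fully_inside(region, area):
    """True if bounding box `region` lies entirely within bounding box `area`."""
    rx0, ry0, rx1, ry1 = region
    ax0, ay0, ax1, ay1 = area
    return rx0 >= ax0 and ry0 >= ay0 and rx1 <= ax1 and ry1 <= ay1

def enter_leave_times(regions_per_frame, area):
    """First frame index where the region is fully inside the area (enter)
    and the first later index where it no longer is (leave)."""
    enter = leave = None
    for t, region in enumerate(regions_per_frame):
        inside = region is not None and fully_inside(region, area)
        if enter is None and inside:
            enter = t
        elif enter is not None and leave is None and not inside:
            leave = t
    return enter, leave
```

The frame indices returned correspond to the entering and leaving times between which the historical feature-point data are used for the dimension calculation.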
In addition, the vehicle dimension data 22 indicating the vehicle dimension are calculated using the historical data in a period between the time when the whole of each vehicle enters into the processing area 44 and the time when the whole of the vehicle leaves from the processing area 44, among the historical data of the feature points for measuring the dimension of each moving region (i.e. each vehicle) stored in the list 24 of the feature points for measuring the dimension (step S13). The calculation of the vehicle dimension data 22 is performed by performing the motion stereo process on the position data of the feature points for measuring the dimension in the frame image at each time during the period between the time when the whole of each vehicle enters into the processing area 44 and the time when the whole of the vehicle leaves from the processing area 44. By the motion stereo process, the three-dimensional coordinates of the feature points at each time (more specifically, the three-dimensional coordinates of the feature points in the three-dimensional coordinate system which moves together with the vehicle) are calculated, and the dimension of the vehicle is calculated from the three-dimensional coordinates.
[0093]
Fig. 13A and Fig. 13B are conceptual diagrams explaining the motion stereo process for calculating the three-dimensional coordinates of each feature point for measuring the dimension according to the present embodiment. Referring to Fig. 13A, in the motion stereo process in the present embodiment, the moving amount dC of the vehicle in the real space is calculated from the moving amount of the feature point #3 for measuring the dimension in the bottom surface 33 of the vehicle in the frame image. The correspondence between the moving amount of a point on the real road surface in the frame image and the moving amount of the point in the real space can be obtained in advance based on the spatial arrangement of the camera 1. Using this fact, it is possible to calculate the moving amount dC in the real space from the moving amount in the frame image of the feature point #3 for measuring the dimension in the bottom surface 33 of the vehicle (which can be approximated as a point located on the road surface in the real space).
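The image-to-road-surface correspondence obtained from the camera arrangement is commonly expressed as a planar homography. The following sketch maps the image positions of feature point #3 in two frames to road-plane coordinates and takes the distance as dC; the homography values in the test are placeholders, not calibration data from the specification:

```python
import numpy as np

def to_road_plane(H, pt):
    """Map an image point (u, v) to road-plane coordinates with a 3x3 homography H."""
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[:2] / v[2]  # perspective division

def moving_amount(H, pt_prev, pt_curr):
    """Real-space displacement dC of a road-surface point between two frames."""
    return float(np.linalg.norm(to_road_plane(H, pt_curr) - to_road_plane(H, pt_prev)))
```

Because feature point #3 is approximated as lying on the road surface, this planar mapping is valid for it even though it is not valid for points higher up on the vehicle.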
[0094]
As shown in Fig. 13B, the motion stereo process is performed based on the assumption that the feature points #1 to #6 for measuring the dimension have been moved by the calculated moving amount dC. That is, the three-dimensional coordinates of the feature points #1 to #6 for measuring the dimension are calculated by the motion stereo process based on the assumption that the feature points #1 to #6 for measuring the dimension have been moved by the calculated moving amount dC.
[0095]
After calculating the three-dimensional coordinates of each feature point for measuring the dimension, the vehicle dimension, that is, the width, length and height of the vehicle, is calculated using the three-dimensional coordinates. As shown in Fig. 1A, the width of the vehicle is calculated from the three-dimensional coordinates of the feature points #1, #2 for measuring the dimension in the side surfaces 31, 32. Further, as shown in Fig. 1B, the length of the vehicle is calculated from the three-dimensional coordinates of the feature points #4, #5 for measuring the dimension in the front surface 34 and the rear surface 35. Furthermore, as shown in Fig. 1C, the height of the vehicle is calculated from the three-dimensional coordinates of the feature points #3, #6 for measuring the dimension in the bottom surface 33 and the upper surface 36.
[0096]
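Once the three-dimensional coordinates are available in the vehicle-fixed coordinate system, each dimension is a coordinate difference between the paired feature points. A sketch; the axis convention (x across the vehicle, y along it, z upward) is an assumption made here for illustration:

```python
def vehicle_dimensions(p1, p2, p3, p4, p5, p6):
    """Width, length and height from feature points #1..#6 given as (x, y, z).

    Assumes x spans the side surfaces, y the front/rear surfaces and
    z the bottom/upper surfaces in the vehicle-fixed frame.
    """
    width = abs(p1[0] - p2[0])   # side surface 31 vs side surface 32
    length = abs(p4[1] - p5[1])  # front surface 34 vs rear surface 35
    height = abs(p6[2] - p3[2])  # upper surface 36 vs bottom surface 33
    return width, length, height
```

When several feature points are detected per surface, the differences can be taken between the extreme coordinates on each pair of opposing surfaces instead of between single points.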
In addition, as shown in Fig. 3B, the vehicle dimension data 22 indicating the calculated vehicle dimension are generated. The vehicle dimension data 22 are stored in the external storage device 5. Then, the calculated vehicle dimension is output using an appropriate output device (e.g. a display device) (step S11), and the list 24 of the feature points for measuring the dimension correlated with the vehicle of which the dimension has been calculated is deleted (step S12). Thus, the process for calculating the vehicle dimension is completed.
[0097]
As described above, according to the present embodiment, the feature points for measuring the dimension in each surface of the vehicle are extracted, and the three-dimensional coordinates of each feature point for measuring the dimension are calculated by performing the motion stereo process on the feature points for measuring the dimension. From the calculated three-dimensional coordinates of each feature point for measuring the dimension, the vehicle dimension (the width, the length, the height) is calculated. Note that, in the process of measuring the vehicle dimension according to the present embodiment, there is a certain arbitrariness in the position of the extracted feature point for measuring the dimension in each surface of the vehicle. Therefore, it is possible to detect the feature points for measuring the dimension by robust image processing. Thus, in the process of measuring the vehicle dimension according to the present embodiment, the vehicle dimension (e.g. the width, the length, the height) can be calculated by the robust image processing.
[0098]
In the above, although embodiments of the present invention have been specifically described, the present invention is not limited to the above-mentioned embodiments. It is obvious to a person skilled in the art that the present invention may be implemented with various modifications. For example, in the above, embodiments in which all of the width, length and height of the vehicle are calculated are described. However, the present invention can also be applied to the case in which at least one of the width, length or height of the vehicle is calculated. In this case, only the required ones among the feature points for measuring the dimension may be detected.
CLAIMS
1. A vehicle dimension measuring apparatus comprising:
an imaging device configured to obtain frame images of a vehicle by successively imaging the vehicle; and
an image processing device configured to calculate a width of the vehicle based on the frame images,
wherein the image processing device includes:
a feature point detection means configured to detect a first feature point for measuring a dimension which is a feature point in a first plane of a rectangular parallelepiped that approximates a shape of the vehicle, a second feature point for measuring the dimension which is a feature point in a second plane of the rectangular parallelepiped and a third feature point for measuring the dimension which is a feature point in a third plane of the rectangular parallelepiped in at least one frame image among the frame images; and
a vehicle dimension calculation means configured to calculate the width of the vehicle based on the first feature point for measuring the dimension, the second feature point for measuring the dimension and the third feature point for measuring the dimension,
wherein the first plane is a plane of the rectangular parallelepiped correlated with a first side surface of the vehicle, the second plane is a plane of the rectangular parallelepiped correlated with a second side surface of the vehicle, and the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle,
wherein the vehicle dimension calculation means is configured to calculate an amount of movement of the vehicle from an amount of movement of the third feature point, calculate three-dimensional coordinates of the first feature point and the second feature point by performing a motion stereo process on the first feature point and the second feature point using the amount of movement of the vehicle, and calculate the width of the vehicle from the calculated three-dimensional coordinates of the first feature point and the second feature point,
wherein a position of the first feature point detected by the feature point detection means is arbitrary in the first plane,
wherein a position of the second feature point detected by the feature point detection means is arbitrary in the second plane, and
wherein a position of the third feature point detected by the feature point detection means is arbitrary in the third plane.
2. The vehicle dimension measuring apparatus according to claim 1, wherein the feature point detection means is configured to extract a moving region correlated with the vehicle from the frame image, extract tangential lines on the moving region in the frame image correlated with lines contacting the vehicle in real space, and determine an intersection point in the first plane among intersection points of the extracted tangential lines as the first feature point for measuring the dimension.
3. The vehicle dimension measuring apparatus according to claim 2, wherein the feature point detection means is configured to determine an intersection point in the second plane among the intersection points of the extracted tangential lines as the second feature point for measuring the dimension.
4. The vehicle dimension measuring apparatus according to claim 2 or 3, wherein the feature point detection means is configured to determine an intersection point in the third plane among the intersection points of the extracted tangential lines as the third feature point for measuring the dimension.
5. The vehicle dimension measuring apparatus according to claim 1, wherein the feature point detection means is configured to extract a moving region correlated with the vehicle from the frame image, extract tangential lines on the moving region in the frame image correlated with lines contacting the vehicle in real space, and determine the first feature point for measuring the dimension by searching a feature point along a first search path which is constituted by a portion of the extracted tangential lines in the first plane.
6. The vehicle dimension measuring apparatus according to claim 5, wherein the feature point detection means is configured to determine the second feature point for measuring the dimension by searching a feature point along a second search path which is constituted by a portion of the extracted tangential lines in the second plane.
7. The vehicle dimension measuring apparatus according to claim 5 or 6, wherein the feature point detection means is configured to determine the third feature point for measuring the dimension by searching a feature point along a third search path which is constituted by a portion of the extracted tangential lines in the third plane.
8. The vehicle dimension measuring apparatus according to claim 1, wherein the feature point detection means includes a first template database having a first template of a component supposed to exist on the first side surface of the vehicle and first feature point position data indicating a position of a feature point on the first template,
wherein the feature point detection means is configured to calculate a first matching position which is a position of the first template at which the first template in the first template database matches the best with the frame image by means of template matching, and detect the first feature point for measuring the dimension by supposing that the first feature point is at a position determined from the first matching position and the first feature point position data.
9. The vehicle dimension measuring apparatus according to claim 1 or 8, wherein the feature point detection means includes a second template database having a second template of a component supposed to exist on the second side surface of the vehicle and second feature point position data indicating a position of a feature point on the second template,
wherein the feature point detection means is configured to calculate a second matching position which is a position of the second template at which the second template in the second template database matches the best with the frame image by means of template matching, and detect the second feature point for measuring the dimension by supposing that the second feature point is at a position determined from the second matching position and the second feature point position data.
10. The vehicle dimension measuring apparatus according to claim 8 or 9, wherein the feature point detection means includes a third template database having a third template of a component supposed to exist on the bottom surface of the vehicle and third feature point position data indicating a position of a feature point on the third template,
wherein the feature point detection means is configured to calculate a third matching position which is a position of the third template at which the third template in the third template database matches the best with the frame image by means of template matching, and detect the third feature point for measuring the dimension by supposing that the third feature point is at a position determined from the third matching position and the third feature point position data.
11. The vehicle dimension measuring apparatus according to claim 1, wherein the feature point detection means is configured to detect a fourth feature point for measuring the dimension which is a feature point in a fourth plane of the rectangular parallelepiped and a fifth feature point for measuring the dimension which is a feature point in a fifth plane of the rectangular parallelepiped in the frame image,
wherein the fourth plane is a plane of the rectangular parallelepiped correlated with a front surface of the vehicle, and the fifth plane is a plane of the rectangular parallelepiped correlated with a rear surface of the vehicle,
wherein the vehicle dimension calculation means is configured to calculate three-dimensional coordinates of the fourth feature point and the fifth feature point by performing the motion stereo process on the fourth feature point and the fifth feature point using the amount of movement of the vehicle, and calculate a length of the vehicle from the calculated three-dimensional coordinates of the fourth feature point and the fifth feature point,
wherein a position of the fourth feature point detected by the feature point detection means is arbitrary in the fourth plane, and
wherein a position of the fifth feature point detected by the feature point detection means is arbitrary in the fifth plane.
12. The vehicle dimension measuring apparatus according to claim 1 or 11, wherein the feature point detection means is configured to detect a sixth feature point for measuring the dimension which is a feature point in a sixth plane of the rectangular parallelepiped in the frame image,
wherein the sixth plane is a plane of the rectangular parallelepiped correlated with an upper surface of the vehicle,
wherein the vehicle dimension calculation means is configured to calculate three-dimensional coordinates of the third feature point and the sixth feature point by performing the motion stereo process on the third feature point and the sixth feature point using the amount of movement of the vehicle, and calculate a height of the vehicle from the calculated three-dimensional coordinates of the third feature point and the sixth feature point,
wherein a position of the sixth feature point detected by the feature point detection means is arbitrary in the sixth plane.
13. A vehicle dimension measuring apparatus comprising:
an imaging device configured to obtain frame images of a vehicle by successively imaging the vehicle; and
an image processing device configured to calculate a length of the vehicle based on the frame images,
wherein the image processing device includes:
a feature point detection means configured to detect a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle, a fourth feature point for measuring the dimension which is a feature point in a fourth plane of the rectangular parallelepiped and a fifth feature point for measuring the dimension which is a feature point in a fifth plane of the rectangular parallelepiped in at least one frame image among the frame images; and
a vehicle dimension calculation means configured to calculate the length of the vehicle based on the third feature point for measuring the dimension, the fourth feature point for measuring the dimension and the fifth feature point for measuring the dimension,
wherein the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, the fourth plane is a plane of the rectangular parallelepiped correlated with a front surface of the vehicle, and the fifth plane is a plane of the rectangular parallelepiped correlated with a rear surface of the vehicle,
wherein the vehicle dimension calculation means is configured to calculate an amount of movement of the vehicle from an amount of movement of the third feature point, calculate three-dimensional coordinates of the fourth feature point and the fifth feature point by performing a motion stereo process on the fourth feature point and the fifth feature point using the amount of movement of the vehicle, and calculate the length of the vehicle from the calculated three-dimensional coordinates of the fourth feature point and the fifth feature point,
wherein a position of the third feature point detected by the feature point detection means is arbitrary in the third plane,
wherein a position of the fourth feature point detected by the feature point detection means is arbitrary in the fourth plane, and
wherein a position of the fifth feature point detected by the feature point detection means is arbitrary in the fifth plane.
14. A vehicle dimension measuring apparatus comprising:
an imaging device configured to obtain frame images of a vehicle by successively imaging the vehicle; and
an image processing device configured to calculate a height of the vehicle based on the frame images,
wherein the image processing device includes:
a feature point detection means configured to detect a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle and a sixth feature point for measuring the dimension which is a feature point in a sixth plane of the rectangular parallelepiped in at least one frame image among the frame images; and
a vehicle dimension calculation means configured to calculate the height of the vehicle based on the third feature point for measuring the dimension and the sixth feature point for measuring the dimension,
wherein the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, and the sixth plane is a plane of the rectangular parallelepiped correlated with an upper surface of the vehicle,
wherein the vehicle dimension calculation means is configured to calculate an amount of movement of the vehicle from an amount of movement of the third feature point, calculate three-dimensional coordinates of the third feature point and the sixth feature point by performing a motion stereo process on the third feature point and the sixth feature point using the amount of movement of the vehicle, and calculate the height of the vehicle from the calculated three-dimensional coordinates of the third feature point and the sixth feature point,
wherein a position of the third feature point detected by the feature point detection means is arbitrary in the third plane, and
wherein a position of the sixth feature point detected by the feature point detection means is arbitrary in the sixth plane.
15. A vehicle dimension measuring method comprising:
obtaining frame images of a vehicle by successively imaging the vehicle;
detecting a first feature point for measuring a dimension which is a feature point in a first plane of a rectangular parallelepiped that approximates a shape of the vehicle, a second feature point for measuring the dimension which is a feature point in a second plane of the rectangular parallelepiped and a third feature point for measuring the dimension which is a feature point in a third plane of the rectangular parallelepiped in the frame image; and
calculating a width of the vehicle based on the first feature point for measuring the dimension, the second feature point for measuring the dimension and the third feature point for measuring the dimension,
wherein the first plane is a plane of the rectangular parallelepiped correlated with a first side surface of the vehicle, the second plane is a plane of the rectangular parallelepiped correlated with a second side surface of the vehicle, and the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle,
wherein the calculating a width of the vehicle comprises:
calculating an amount of movement of the vehicle from an amount of movement of the third feature point;
calculating three-dimensional coordinates of the first feature point and the second feature point by performing a motion stereo process on the first feature point and the second feature point using the amount of movement of the vehicle; and
calculating the width of the vehicle from the calculated three-dimensional coordinates of the first feature point and the second feature point,
wherein a position of the first feature point detected by said detecting is arbitrary in the first plane,
wherein a position of the second feature point detected by said detecting is arbitrary in the second plane, and
wherein a position of the third feature point detected by said detecting is arbitrary in the third plane.
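For illustration only (not the claimed implementation), the motion stereo process of claim 15 can be sketched as ordinary stereo triangulation: the amount of vehicle movement between two frames acts as the stereo baseline, so a feature point observed in both frames yields a depth via the disparity relation Z = f·B/d. A pinhole camera and vehicle motion parallel to the image x-axis are simplifying assumptions of this sketch, not requirements of the claim; all function names are hypothetical.

```python
def triangulate(u1, u2, v, f, baseline):
    """3D point (X, Y, Z) of a feature seen at pixel x-coordinates u1, u2
    in two frames, with the vehicle movement `baseline` as stereo base
    and focal length f (pixels)."""
    disparity = u1 - u2            # image shift caused by the movement
    Z = f * baseline / disparity   # depth from the disparity relation
    X = u1 * Z / f                 # lateral position
    Y = v * Z / f                  # vertical position
    return (X, Y, Z)

def vehicle_width(left_pt, right_pt, f, baseline):
    """Width as the lateral distance between a feature point on the first
    (left) side surface and one on the second (right) side surface; each
    point is given as (u1, u2, v)."""
    X_left = triangulate(*left_pt, f, baseline)[0]
    X_right = triangulate(*right_pt, f, baseline)[0]
    return abs(X_right - X_left)
```

Length and height (claims 16 and 17) follow the same pattern, differencing the coordinates of front/rear or bottom/top feature points instead.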
16. A vehicle dimension measuring method comprising:
obtaining frame images of a vehicle by successively imaging the vehicle;
detecting a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle, a fourth feature point for measuring the dimension which is a feature point in a fourth plane of the rectangular parallelepiped and a fifth feature point for measuring the dimension which is a feature point in a fifth plane of the rectangular parallelepiped in at least one frame image among the frame images; and
calculating a length of the vehicle based on the third feature point for measuring the dimension, the fourth feature point for measuring the dimension and the fifth feature point for measuring the dimension,
wherein the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, the fourth plane is a plane of the rectangular parallelepiped correlated with a front surface of the vehicle, and the fifth plane is a plane of the rectangular parallelepiped correlated with a rear surface of the vehicle,
wherein the calculating a length of the vehicle comprises:
calculating an amount of movement of the vehicle from an amount of movement of the third feature point;
calculating three-dimensional coordinates of the fourth feature point and the fifth feature point by performing a motion stereo process on the fourth feature point and the fifth feature point using the amount of movement of the vehicle; and
calculating the length of the vehicle from the calculated three-dimensional coordinates of the fourth feature point and the fifth feature point,
wherein a position of the third feature point detected by said detecting is arbitrary in the third plane,
wherein a position of the fourth feature point detected by said detecting is arbitrary in the fourth plane, and
wherein a position of the fifth feature point detected by said detecting is arbitrary in the fifth plane.
17. A vehicle dimension measuring method comprising:
obtaining frame images of a vehicle by successively imaging the vehicle;
detecting a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle and a sixth feature point for measuring the dimension which is a feature point in a sixth plane of the rectangular parallelepiped in at least one frame image among the frame images; and
calculating a height of the vehicle based on the third feature point for measuring the dimension and the sixth feature point for measuring the dimension,
wherein the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, and the sixth plane is a plane of the rectangular parallelepiped correlated with an upper surface of the vehicle,
wherein the calculating a height of the vehicle comprises:
calculating an amount of movement of the vehicle from an amount of movement of the third feature point;
calculating three-dimensional coordinates of the third feature point and the sixth feature point by performing a motion stereo process on the third feature point and the sixth feature point using the amount of movement of the vehicle; and
calculating the height of the vehicle from the calculated three-dimensional coordinates of the third feature point and the sixth feature point,
wherein a position of the third feature point detected by said detecting is arbitrary in the third plane, and
wherein a position of the sixth feature point detected by said detecting is arbitrary in the sixth plane.
18. A storage medium storing a program which when executed causes a computer to perform steps of:
detecting a first feature point for measuring a dimension which is a feature point in a first plane of a rectangular parallelepiped that approximates a shape of a vehicle, a second feature point for measuring the dimension which is a feature point in a second plane of the rectangular parallelepiped and a third feature point for measuring the dimension which is a feature point in a third plane of the rectangular parallelepiped in a frame image of the vehicle, the frame image being obtained by successively imaging the vehicle; and
calculating a width of the vehicle based on the first feature point for measuring the dimension, the second feature point for measuring the dimension and the third feature point for measuring the dimension,
wherein the first plane is a plane of the rectangular parallelepiped correlated with a first side surface of the vehicle, the second plane is a plane of the rectangular parallelepiped correlated with a second side surface of the vehicle, and the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle,
wherein the calculating a width of the vehicle comprises:
calculating an amount of movement of the vehicle from an amount of movement of the third feature point;
calculating three-dimensional coordinates of the first feature point and the second feature point by performing a motion stereo process on the first feature point and the second feature point using the amount of movement of the vehicle; and
calculating the width of the vehicle from the calculated three-dimensional coordinates of the first feature point and the second feature point,
wherein a position of the first feature point detected by said detecting is arbitrary in the first plane,
wherein a position of the second feature point detected by said detecting is arbitrary in the second plane, and
wherein a position of the third feature point detected by said detecting is arbitrary in the third plane.
19. A storage medium storing a program which when executed causes a computer to perform steps of:
obtaining frame images of a vehicle by successively imaging the vehicle;
detecting a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle, a fourth feature point for measuring the dimension which is a feature point in a fourth plane of the rectangular parallelepiped and a fifth feature point for measuring the dimension which is a feature point in a fifth plane of the rectangular parallelepiped in at least one frame image among the frame images; and
calculating a length of the vehicle based on the third feature point for measuring the dimension, the fourth feature point for measuring the dimension and the fifth feature point for measuring the dimension,
wherein the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, the fourth plane is a plane of the rectangular parallelepiped correlated with a front surface of the vehicle, and the fifth plane is a plane of the rectangular parallelepiped correlated with a rear surface of the vehicle,
wherein the calculating a length of the vehicle comprises:
calculating an amount of movement of the vehicle from an amount of movement of the third feature point;
calculating three-dimensional coordinates of the fourth feature point and the fifth feature point by performing a motion stereo process on the fourth feature point and the fifth feature point using the amount of movement of the vehicle; and
calculating the length of the vehicle from the calculated three-dimensional coordinates of the fourth feature point and the fifth feature point,
wherein a position of the third feature point detected by said detecting is arbitrary in the third plane,
wherein a position of the fourth feature point detected by said detecting is arbitrary in the fourth plane, and
wherein a position of the fifth feature point detected by said detecting is arbitrary in the fifth plane.
20. A storage medium storing a program which when executed causes a computer to perform steps of:
obtaining frame images of a vehicle by successively imaging the vehicle;
detecting a third feature point for measuring a dimension which is a feature point in a third plane of a rectangular parallelepiped that approximates a shape of the vehicle and a sixth feature point for measuring the dimension which is a feature point in a sixth plane of the rectangular parallelepiped in at least one frame image among the frame images; and
calculating a height of the vehicle based on the third feature point for measuring the dimension and the sixth feature point for measuring the dimension,
wherein the third plane is a plane of the rectangular parallelepiped correlated with a bottom surface of the vehicle, and the sixth plane is a plane of the rectangular parallelepiped correlated with an upper surface of the vehicle,
wherein the calculating a height of the vehicle comprises:
calculating an amount of movement of the vehicle from an amount of movement of the third feature point;
calculating three-dimensional coordinates of the third feature point and the sixth feature point by performing a motion stereo process on the third feature point and the sixth feature point using the amount of movement of the vehicle; and
calculating the height of the vehicle from the calculated three-dimensional coordinates of the third feature point and the sixth feature point,
wherein a position of the third feature point detected by said detecting is arbitrary in the third plane, and
wherein a position of the sixth feature point detected by said detecting is arbitrary in the sixth plane.
| # | Name | Date |
|---|---|---|
| 1 | 7541-DELNP-2015-IntimationOfGrant09-11-2023.pdf | 2023-11-09 |
| 2 | 7541-DELNP-2015-PatentCertificate09-11-2023.pdf | 2023-11-09 |
| 3 | 7541-DELNP-2015-CLAIMS [08-02-2019(online)].pdf | 2019-02-08 |
| 4 | 7541-DELNP-2015-COMPLETE SPECIFICATION [08-02-2019(online)].pdf | 2019-02-08 |
| 5 | 7541-DELNP-2015-DRAWING [08-02-2019(online)].pdf | 2019-02-08 |
| 6 | 7541-DELNP-2015-FER_SER_REPLY [08-02-2019(online)].pdf | 2019-02-08 |
| 7 | 7541-DELNP-2015-Correspondence-030119.pdf | 2019-01-07 |
| 8 | 7541-DELNP-2015-OTHERS-030119.pdf | 2019-01-07 |
| 9 | 7541-DELNP-2015-Power of Attorney-030119.pdf | 2019-01-07 |
| 10 | 7541-DELNP-2015-FORM-26 [26-12-2018(online)].pdf | 2018-12-26 |
| 11 | Correspondence-251018.pdf | 2018-10-27 |
| 12 | 7541-DELNP-2015-FER.pdf | 2018-08-10 |
| 13 | 7541-DELNP-2015-Correspondence-260418-.pdf | 2018-05-02 |
| 14 | 7541-DELNP-2015-OTHERS-260418-.pdf | 2018-05-02 |
| 15 | 7541-DELNP-2015-8(i)-Substitution-Change Of Applicant - Form 6 [24-04-2018(online)].pdf | 2018-04-24 |
| 16 | 7541-DELNP-2015-ASSIGNMENT DOCUMENTS [24-04-2018(online)].pdf | 2018-04-24 |
| 17 | 7541-DELNP-2015-PA [24-04-2018(online)].pdf | 2018-04-24 |
| 18 | 7541-DELNP-2015-Changing Name-Nationality-Address For Service [23-04-2018(online)].pdf | 2018-04-23 |
| 19 | 7541-DELNP-2015-FORM 13 [23-04-2018(online)].pdf | 2018-04-23 |
| 20 | 7541-DELNP-2015-RELEVANT DOCUMENTS [23-04-2018(online)].pdf | 2018-04-23 |
| 21 | 7541-DELNP-2015-RELEVANT DOCUMENTS [23-04-2018(online)]-1.pdf | 2018-04-23 |
| 22 | 7541-DELNP-2015-FORM 3 [13-09-2017(online)].pdf | 2017-09-13 |
| 23 | Form 3 [25-04-2017(online)].pdf | 2017-04-25 |
| 24 | 7541-delnp-2015-Form-3-(03-11-2015).pdf | 2015-11-03 |
| 25 | 7541-delnp-2015-Correspondence Others-(03-11-2015).pdf | 2015-11-03 |
| 26 | 7541-delnp-2015-Correspondence Others-(28-10-2015).pdf | 2015-10-28 |
| 27 | 7541-delnp-2015-Form-1-(28-10-2015).pdf | 2015-10-28 |
| 28 | 7541-delnp-2015-GPA-(28-10-2015).pdf | 2015-10-28 |
| 29 | 7541-delnp-2015-Others-(28-10-2015).pdf | 2015-10-28 |
| 30 | 7541-delnp-2015-Verification Translation-(28-10-2015).pdf | 2015-10-28 |
| 31 | 7541-DELNP-2015.pdf | 2015-08-29 |
| 32 | Description(Complete) [25-08-2015(online)].pdf | 2015-08-25 |
| 33 | Drawing [25-08-2015(online)].pdf | 2015-08-25 |
| 34 | Form 3 [25-08-2015(online)].pdf | 2015-08-25 |
| 35 | Form 5 [25-08-2015(online)].pdf | 2015-08-25 |
| 36 | search_02-02-2018.pdf | |