
An Automated Three Dimensional Mapping Method

Abstract: The present invention relates to an automated three dimensional mapping method estimating three dimensional models taking advantage of a plurality of images. The object of the invention is to eliminate or at least reduce the need for smoothing and as a consequence increase the stability of a 3D model, avoiding blur. According to the method the positions (x, y, z) and attitudes (α, β, γ) for at least one camera are recorded when images are taken, the at least one camera is geometrically calibrated to indicate the direction of each pixel of an image, a stereo disparity is calculated (42) for a plurality of image pairs covering a same scene position, setting a disparity and a certainty measure estimate for each stereo disparity, the different stereo disparity estimates are weighted together (43) to form a 3D model, and the stereo disparity estimates are reweighted automatically and adaptively based on the estimated 3D model.


Patent Information

Application #:
Filing Date: 18 July 2012
Publication Number: 40/2015
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2021-10-27
Renewal Date:

Applicants

SAAB AB
S-581 88 Linköping

Inventors

1. LEIF HAGLUND
Rängevägen 16  S-590 45 Brokind
2. JOHAN BORG
3. INGEMAR ANDERSSON
Tegskiftesgatan 275  S-583 34 Linköping
4. FOLKE ISAKSSON
Nya Tanneforsvägen 21B  S-582 42 Linköping

Specification

An automated three dimensional mapping method
Technical field
The present invention relates to an automated three dimensional mapping method
estimating three dimensional models taking advantage of a plurality of images.
Background
Estimation of three dimensional, 3D, models by stereo photogrammetric methods is
generally known in connection with manual utilization of stereo goggles. There are also
solutions utilizing computers and examples of such computerized solutions are inter
alia found in our patent applications PCT/EP2007/056780 and PCT/SE2000/000739.
Classically the result is based on two images taken from different positions covering
the same scene in the world.
From US 5,808,626 and US 2002/0101438 A1 methods using multi overlapping
images are known. These methods are based on identification and selection of key
points. Other examples of methods using overlapping images are known from US
6,658,207 B1, US 2004/0105090 A1, US 2002/0163582 A1 and US 5,104,217.
To get accuracy in the automated estimates it is normally required to introduce some
kind of smoothing scheme. A drawback with such smoothing is that sharp changes in
depth will be smoothed and the 3D model as a whole will be blurred.
It is an object of this invention to obtain a method reducing the need for smoothing,
resulting in a more stable 3D model requiring no or very little post-smoothing.
Summary of the invention
The object is obtained by a method characterized in that the positions and attitudes for
at least one camera are recorded when images are taken, that the at least one camera is
geometrically calibrated to indicate the direction of each pixel in an image, that a
stereo disparity is calculated for a plurality of image pairs covering a same scene
position setting a disparity and a certainty measure estimate for each stereo disparity,
that the different stereo disparity estimates are weighted together to form a 3D model,
and that the stereo disparity estimates are reweighted automatically and adaptively
based on the estimated 3D model.
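By way of illustration only (not part of the disclosure): the geometric calibration step means that every pixel can be turned into a viewing ray in the world. A minimal sketch under a standard pinhole-camera assumption, where the intrinsic matrix K and the rotation R built from the recorded attitude angles α, β, γ are hypothetical inputs:

```python
import numpy as np

def pixel_direction(u: float, v: float, K: np.ndarray, R: np.ndarray) -> np.ndarray:
    """Unit direction in world coordinates of the ray through pixel (u, v).
    Pinhole model and rotation convention are assumptions, not from the patent."""
    ray_cam = np.linalg.solve(K, np.array([u, v, 1.0]))  # back-project into the camera frame
    ray_world = R.T @ ray_cam                            # rotate into the world frame
    return ray_world / np.linalg.norm(ray_world)
```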
Our mapping method relies on the 3D models covering the whole area covered by the
collected images, without selection of key points or segmentation of objects.
By calculating the stereo disparity for a plurality of image pairs covering the same
scene position according to the preceding paragraph, no or only slight post-smoothing
needs to be carried out to obtain a stable 3D model. By using several different images
taken from different angles following a voting or weighting scheme, it is possible to
combine the results from stereo pairs into a three dimensional model that will be
smooth where the world is smooth, like on a street, and that simultaneously can be
kept sharp at sharp depth changes. The stereo disparity estimates could for example be
reweighted based on normal vectors of the estimated 3D model. Furthermore the
different stereo disparity estimates could be weighted together to form a 3D height
model.
According to a preferred development of the method a stereo disparity is calculated
for each possible image pair. By taking advantage of as many image pairs as possible
the 3D model is optimized with respect to accuracy.
According to a still preferred development of the method images in the direction of
flight are taken with an overlap of approximately 60-90%.
According to another preferred development of the method the images between
adjacent flights are taken with an overlap of approximately 60-80%.
The choice of overlaps as proposed above, as to direction of flight and between
adjacent flights, results in a coverage of at least 10 images being available for
contribution to the estimates for each point in the scene.
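As a rough plausibility check of these coverage figures (an editorial sketch assuming idealized, evenly spaced footprints; not from the disclosure):

```python
import math

def images_per_point(forward_overlap: float, side_overlap: float) -> int:
    """Lower bound on how many images see a ground point, assuming a regular
    strip pattern: 1/(1 - overlap) consecutive footprints along track and the
    corresponding number of adjacent strips across track."""
    along = math.floor(1.0 / (1.0 - forward_overlap))
    across = math.floor(1.0 / (1.0 - side_overlap))
    return along * across

print(images_per_point(0.80, 0.60))  # 10, matching the "at least 10 images" figure
print(images_per_point(0.80, 0.67))  # 15, matching the "at least 15" figure below
```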
In order to further increase the number of images available it is proposed according to
yet another preferred development of the method that images are taken with overlap in
two essentially perpendicular directions of flight.
The weighting of the disparity estimates can be performed in many different ways.
According to one proposal of the method the weighting of the stereo disparities is
based on averaging. To avoid uncertain measurements it is also proposed that the
weighting of the stereo disparities involves exclusion of outliers.
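A minimal sketch of such a weighting scheme, assuming a simple sigma rule for the outlier exclusion (the disclosure names averaging and outlier exclusion but prescribes no formula):

```python
import numpy as np

def fuse_disparities(disparities: np.ndarray, certainties: np.ndarray,
                     outlier_sigma: float = 2.0) -> float:
    """Weight the stereo disparity estimates for one scene position together,
    excluding outliers before the certainty-weighted average."""
    mean = np.average(disparities, weights=certainties)
    spread = np.sqrt(np.average((disparities - mean) ** 2, weights=certainties))
    keep = np.abs(disparities - mean) <= outlier_sigma * spread  # exclude outliers
    return float(np.average(disparities[keep], weights=certainties[keep]))
```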
The certainty measure estimate for each stereo disparity can be set in consideration of
local contrast, visibility (affected for example by obscuration), resolution, or a mixture
of these considerations.
Brief description of the drawings
The invention will now be described in more detail with reference to the
accompanying drawings in which:
Figure 1 schematically illustrates the capturing of images from above.
Figure 2a illustrates an example of a known stereo scheme used to collect data.
Figure 2b illustrates a proposed stereo scheme to be used to collect data.
Figure 3 more schematically illustrates another proposed scheme to be used to collect
data.
Figure 4 schematically illustrates image processing involved in the 3D mapping
method according to the invention.
Detailed description
According to figure 1 an air plane 1 provided with a camera 2 is shown in a first
position by unbroken lines and in a second position by broken lines above a landscape
3. As illustrated in the figure the landscape differs in height and there are abrupt
configurations 4 such as houses and more billowing configurations 5 such as
billowing hills. The position of the camera in the first position is denoted by x, y, z
and the attitude by α, β, γ. Accordingly, all six degrees of rotation and position are
available. The corresponding position and attitude for the second camera position
shown are denoted by x', y', z' and α', β', γ'. The coverage of the landscape by the
camera 2 is indicated by lines 6, 7 for the first position and 6', 7' for the second
position. When comparing an image of the landscape taken from the first position with
an image taken from the second position, an overlapping section 8 can be identified. If
the overlapping section 8 is observed, it can be seen that an image taken from the first
position lacks image information about the vertical right part 4.1 of the abrupt
configuration 4, while the same vertical right part 4.1 is easily imaged from the second
position. Accordingly, being in possession of a plurality of images covering the same
scene position increases the possibilities to build up three dimensional images
coinciding closer with the real world.
In figure 1 there is shown an overlap of about 25%. Of course this overlap could be
much higher, such as for example 75%.
Figure 2a shows an example of a known stereo scheme. Such a scheme is obtained by
flying an air plane or other airborne vehicle provided with a downwards looking
camera above the landscape such that there is an overlap of about 50-60% in the
direction of flight and, for adjacent flights, principally without overlap, in practice
about 10% in order to avoid holes. In the figure an upper gray strip 9 illustrates the
footprints of a first flight and a lower gray strip 10 the footprints of a second flight. In
the strips 9, 10 the footprints from every second image are illustrated as solid
rectangles 13-20 while the footprints from every second image in between are
illustrated as rectangles 21-26 delimited by dashed lines perpendicular to the flight
direction 12. By the scheme shown each point on the ground is covered with two
images and from these images stereo estimates can be calculated.
Figure 2b shows an example of a stereo scheme that can be used in our proposed
invention. In the proposed scheme the upper and lower strips 9, 10 illustrate an
overlap of 80% in the direction of flight 12 and an overlap between adjacent flights of
60%. Suitable proposed overlapping in the flight direction is about 60-90% and about
60-80% between adjacent flights. In the different strips 9, 10 five different rectangles
27-31 can be identified illustrating five consecutive footprints that are repeatedly
present along the flight direction. The five rectangles are indicated by five different
delimiting lines (solid, dash-dotted, short-dashed, long-dashed, and dash-double-
dotted) perpendicular to the flight direction. By the scheme as shown and described
with reference to figure 2b each point on the ground is covered with at least 10
images and all these images can contribute to the stereo estimates for each point in the
scene. The number could be at least 15 with an overlap of 67% sidewise.
Figure 3 schematically shows an example of a scheme offering still more overlapping.
In this case images are collected not only from essentially parallel flight paths in one
and first flight direction 32 but also in a second flight direction 33 essentially
perpendicular to the first flight direction. The flights are here only indicated as arrows
34.1-34.5 in the first flight direction 32 and arrows 38.1-38.5 in the second flight
direction 33. Even though the arrows are shown pointing in the same direction for a
flight direction, some of them, for example every second one, could be pointing in the
opposite direction. The overlapping between adjacent parallel flights and overlapping
in the flight direction are not particularly shown in figure 3 but can be varied as
described with reference to figure 2b within wide limits. For example each point on
the ground can be covered by at least 20 images that can contribute to the stereo
disparity estimates for each point in the scene.
The image processing involved in the 3D mapping method of the invention is now
described with reference to figure 4.
Images 44, collected according to the description above with reference to figures 1,
2b and 3 and available in a storing unit 45, are applied to a stereo disparity
block 42 calculating a stereo disparity for each possible image pair n covering the
same scene position. For each image involved the position x, y, z and the attitude
α, β, γ from which the image is taken are known, i.e. all six degrees of rotation and
position are known. Furthermore, a measure of the certainty for each stereo disparity
is estimated. This measure can be based upon local contrasts, visibility and/or
resolution.
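A hypothetical certainty measure along these lines (the disclosure names the factors but not their combination; the product form below is an assumption for illustration):

```python
import numpy as np

def certainty_measure(patch: np.ndarray, resolution: float, visible: bool) -> float:
    """Certainty of one stereo disparity estimate from local contrast,
    visibility and resolution. Obscured points get zero weight."""
    if not visible:                          # e.g. the point is obscured in one image
        return 0.0
    local_contrast = float(np.std(patch))    # standard deviation as a contrast proxy
    return local_contrast * resolution
```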
The stereo disparities calculated in the stereo disparity block 42 are subjected to a
weighting process in a weighting block 43 taking notice of estimated certainty
measures. Available as an output of the weighting block 43 after weighting is a height
model 46 that can be visualised as a grid. From this first model the original stereo
estimates are reweighted automatically and adaptively based on normal vectors of the
estimated 3D model, taking information such as local contrast, resolution and
visibility, for example obscuration, into consideration. In this connection, for example, an
image taken straight above a building is used to estimate the roof structure and not the
sides of a building. Another example could be to avoid mixing of the front side and
back side of buildings. By an iterative process taking advantage of images from the side
and connected measurements, a more reliable 3D model is obtained disclosing hidden
sections. In the weighting process outliers can be sorted out and as a straightforward
example the remaining stereo disparities for a scene are weighted together by
averaging or other mathematical methods to find a concentration of similar stereo
disparities.
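One conceivable realization of the normal-vector-based reweighting, sketched under the assumption of a cosine weighting between viewing direction and surface normal (the disclosure does not specify a formula): a nadir image then dominates a roof while contributing little to a facade.

```python
import numpy as np

def reweight(certainty: float, view_dir: np.ndarray, normal: np.ndarray) -> float:
    """Scale a certainty by how head-on the camera views the estimated surface,
    so images looking straight at a surface outweigh grazing views."""
    cos_angle = abs(float(np.dot(view_dir, normal)) /
                    (np.linalg.norm(view_dir) * np.linalg.norm(normal)))
    return certainty * cos_angle
```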
Based upon the 3D model 46 on the output of the weighting block 43 a wire model 47
of triangles is built up and the triangles are draped with images fitting the direction of
viewing.
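The triangulation itself is not detailed in the disclosure; a common construction, sketched here for illustration, splits each cell of the height grid into two triangles:

```python
def grid_to_triangles(rows: int, cols: int):
    """Index triples for a triangle wire model over a rows x cols height grid,
    splitting each grid cell into two triangles (vertices in row-major order)."""
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            a = r * cols + c          # top-left corner of the cell
            b = a + 1                 # top-right
            d = a + cols              # bottom-left
            e = d + 1                 # bottom-right
            triangles.append((a, b, d))
            triangles.append((b, e, d))
    return triangles
```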
The invention is not limited to the method exemplified above but may be modified
within the scope of the attached claims.

Patent Claims
1. An automated three dimensional mapping method estimating three dimensional
models taking advantage of a plurality of images, characterized in that the positions
and attitudes for at least one camera are recorded when images are taken, that the at least
one camera is geometrically calibrated to indicate the direction of each pixel in an
image, that a stereo disparity is calculated for a plurality of image pairs covering a
same scene position setting a disparity and a certainty measure estimate for each
stereo disparity, that the different stereo disparity estimates are weighted together to
form a 3D model, and that the stereo disparity estimates are reweighted automatically
and adaptively based on the estimated 3D model.
2. A method as claimed in claim 1, characterized in that the stereo disparity
estimates are reweighted automatically and adaptively based on the normal vectors of
the estimated 3D model.
3. A method as claimed in any of the preceding claims, characterized in that
different stereo disparity estimates are weighted together to form a 3D height model.
4. A method as claimed in any of the preceding claims, characterized in that a
stereo disparity is calculated for each possible image pair.
5. A method as claimed in any of the preceding claims, characterized in that
images in the direction of flight are taken with an overlap of approximately 60-90%.
6. A method as claimed in any of the preceding claims, characterized in that the
images between adjacent flights are taken with an overlap of approximately 60-80%.
7. A method as claimed in any of the preceding claims, characterized in that
images are taken with overlap in two essentially perpendicular directions of flight.
8. A method as claimed in any of the preceding claims, characterized in that the
certainty measure estimate for each disparity is set in consideration of resolution.
9. A method as claimed in any of the preceding claims, characterized in that the
weighting of the stereo disparities involves exclusion of outliers.
10. A method as claimed in any of the preceding claims, characterized in that the
weighting of the stereo disparities is based on averaging.
11. A method as claimed in any of the preceding claims, characterized in that the
certainty measure estimate for each stereo disparity is set in consideration of local
contrast.
12. A method as claimed in any of the preceding claims, characterized in that the
certainty measure estimate for each disparity is set in consideration of visibility
affected for example by obscuration.

Documents

Orders


Application Documents

# Name Date
1 Form-5.doc 2012-07-25
2 Form-3.doc 2012-07-25
3 Form-1.pdf 2012-07-25
4 6329-delnp-2012-GPA-(19-09-2012).pdf 2012-09-19
5 6329-delnp-2012-Form-1-(19-09-2012).pdf 2012-09-19
6 6329-delnp-2012-Correspondence-Others-(19-09-2012).pdf 2012-09-19
7 6329-delnp-2012-Form-3-(12-12-2012).pdf 2012-12-12
8 6329-delnp-2012-Correspondence Others-(12-12-2012).pdf 2012-12-12
9 6329-delnp-2012-Form-18-(15-01-2014).pdf 2014-01-15
10 6329-delnp-2012-Correspondence-Others-(15-01-2014).pdf 2014-01-15
11 6329-delnp-2012--Form-3-(15-01-2014).pdf 2014-01-15
12 6329-delnp-2012--Correspondence-Others-(15-01-2014).pdf 2014-01-15
13 6329-delnp-2012.pdf 2015-09-23
14 6329-DELNP-2012-FER.pdf 2019-10-11
15 6329-DELNP-2012-CLAIMS [09-04-2020(online)].pdf 2020-04-09
16 6329-DELNP-2012-DRAWING [09-04-2020(online)].pdf 2020-04-09
17 6329-DELNP-2012-FER_SER_REPLY [09-04-2020(online)].pdf 2020-04-09
18 6329-DELNP-2012-FORM 3 [09-04-2020(online)].pdf 2020-04-09
19 6329-DELNP-2012-FORM-26 [09-04-2020(online)].pdf 2020-04-09
20 6329-DELNP-2012-FORM-26 [09-04-2020(online)]-1.pdf 2020-04-09
21 6329-DELNP-2012-Information under section 8(2) [09-04-2020(online)].pdf 2020-04-09
22 6329-DELNP-2012-PETITION UNDER RULE 137 [09-04-2020(online)].pdf 2020-04-09
23 6329-DELNP-2012-OTHERS [09-04-2020(online)].pdf 2020-04-09
24 6329-DELNP-2012-FORM 13 [19-03-2021(online)].pdf 2021-03-19
25 6329-DELNP-2012-RELEVANT DOCUMENTS [19-03-2021(online)].pdf 2021-03-19
26 6329-DELNP-2012-Correspondence to notify the Controller [10-06-2021(online)].pdf 2021-06-10
27 6329-DELNP-2012-Written submissions and relevant documents [29-06-2021(online)].pdf 2021-06-29
28 6329-DELNP-2012-US(14)-HearingNotice-(HearingDate-15-06-2021).pdf 2021-10-17
29 6329-DELNP-2012-IntimationOfGrant27-10-2021.pdf 2021-10-27
30 6329-DELNP-2012-PatentCertificate27-10-2021.pdf 2021-10-27
31 6329-DELNP-2012-RELEVANT DOCUMENTS [26-09-2023(online)].pdf 2023-09-26

Search Strategy

1 SearchStrategyMatrix8_10-10-2019.pdf

ERegister / Renewals

3rd: 29 Dec 2021 (26/01/2012 - 26/01/2013)
4th: 29 Dec 2021 (26/01/2013 - 26/01/2014)
5th: 29 Dec 2021 (26/01/2014 - 26/01/2015)
6th: 29 Dec 2021 (26/01/2015 - 26/01/2016)
7th: 29 Dec 2021 (26/01/2016 - 26/01/2017)
8th: 29 Dec 2021 (26/01/2017 - 26/01/2018)
9th: 29 Dec 2021 (26/01/2018 - 26/01/2019)
10th: 29 Dec 2021 (26/01/2019 - 26/01/2020)
11th: 29 Dec 2021 (26/01/2020 - 26/01/2021)
12th: 29 Dec 2021 (26/01/2021 - 26/01/2022)
13th: 29 Dec 2021 (26/01/2022 - 26/01/2023)
14th: 07 Dec 2022 (26/01/2023 - 26/01/2024)
15th: 29 Nov 2023 (26/01/2024 - 26/01/2025)
16th: 21 Nov 2024 (26/01/2025 - 26/01/2026)
17th: 18 Nov 2025 (26/01/2026 - 26/01/2027)