Abstract: The present invention relates to a three dimensional model method based on combination of ground based images (54) and images (44) taken from above. According to the invention an existing 3D model (46) based on images (44) taken from above is matched with images (54) taken from ground level, all images comprising information about the position and attitude of the camera when the images (54) from ground level and the images (44) taken from above were taken, and about the direction of each pixel. The method offers automatically carried out imaging that solves or at least mitigates occlusion problems.
A three dimensional model method based on combination of ground based
images and images taken from above
Technical field
The present invention relates to a three dimensional model method based on
combination of ground based images and images taken from above.
In this connection ground based images are to include images taken directly from
ground as well as images taken from a low height for example by a low flying
helicopter.
Background
The basic idea to combine images taken from ground based equipment with images
taken from the air by means of for example air planes is inter alia known from US
2008/0221843 A1 and also from an article by Früh, C. et al., "Constructing 3D City
Models by Merging Ground-Based and Airborne Views", in Proceedings of the 2003
IEEE Computer Society Conference on Computer Vision and Pattern Recognition
(CVPR'03).
The solutions according to these two references for combining images taken from ground
based equipment with images taken from the air are rather complex and involve
manual processing of the image information. Furthermore, there is no discussion of
the complex of problems that arises due to occlusion. Examples of objects causing
occlusion by obscuration are trees, light poles, cars, etc.
It could also be noted that there are solutions available today to build views from
images taken from the street, often called "street views". These solutions are based
on images, often covering 360 degrees, taken from a known georeferenced location
with known directions. By choosing a specific point, e.g. on a map, it is possible to
WO 2011/093751 PCT/SE2010/000014
view the environment from this point. No other three dimensional model is created
than the interpretation made in the eye of the observer.
It is an object of the invention to obtain a three dimensional model method that is less
complicated to carry out, is carried out automatically, can take care of the occlusion
complex of problems, and that can build a detailed model of a realistic or real three
dimensional world.
Summary of the invention
The object of the invention is obtained by a three dimensional model method according to the
first paragraph characterized in that an existing 3D model based on images taken from
above is matched with a 3D model based on images taken from ground level in order
to improve an overall 3D model.
The images taken from ground preferably comprise information about position and
attitudes of the camera when the images from ground level were taken and direction of
each pixel.
A three dimensional model based on images taken from above, matched with
images taken from ground level, enables handling of multiple depths seen from ground
based systems, for example when a tree is located in front of a building, by
separating observations of the tree and the building.
Preferably, the 3D model based on images taken from ground level is controlled by
the existing 3D model based on images taken from above.
According to a preferred method image information taken from ground level having a
high resolved texture is used to enhance images of the existing 3D model based on
images taken from above by replacing essentially vertical and downwards inclined
surfaces with images based on images taken from ground level. The use of high
resolved images taken from ground as textures in the total model results in an
increased image quality of the total model.
According to a still preferred method the matching of the existing 3D model based on
images taken from above with images taken from the ground level is based on position
and attitude information of images taken.
According to another preferred method all images available from the ground level and
images taken from above are considered for estimating a three dimensional model
both in terms of geometries and textures. This method utilizes a maximum of image
information in the creation of a final three dimensional model.
Advantageously, high level surfaces such as roofs are estimated and textured from
images taken from above and vertical surfaces such as house facades are estimated
from available images taken from ground level and textured from these images. This
way of estimating ensures that high quality image information is used in the building
of a final three dimensional model.
When taking images from the ground level and from above some deviation between
the exact position and the exact attitude of an image is likely to occur. In order to
minimise such deviations it is proposed that images taken from the ground are
mutually correlated to compensate for position and attitude deviations. Furthermore it
is proposed that images taken from the ground in a 3D model are correlated with
images taken from above in a 3D model to compensate for position and attitude
deviations. A fitting procedure for mutual images from the ground level as well as in
combination with images from above is possible.
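As a very reduced illustration of such a correlation step, the sketch below assumes that corresponding points have already been identified in both models and estimates only a translation offset between them; the source does not specify the fitting procedure, and a full procedure would also correct attitude deviations.

```python
def estimate_offset(ground_pts, aerial_pts):
    # Mean translation between corresponding 3D points seen in the
    # ground based model and in the model taken from above.
    # Attitude correction is omitted in this hypothetical sketch.
    n = len(ground_pts)
    return tuple(sum(a[i] - g[i] for g, a in zip(ground_pts, aerial_pts)) / n
                 for i in range(3))
```

Subtracting the estimated offset from the ground based model brings the two models into closer agreement before textures are combined.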
Brief description of the drawings
The invention will now be described in more detail with reference to the
accompanying drawings in which:
Figures 1a-1d schematically illustrate the occlusion problem involving obscuration,
such that:
Figure 1a is a view of a house from above illustrating imaging by a ground based
camera,
Figure 1b is a ground based view of the house of figure 1a,
Figure 1c is a view from above of the house of figures 1a and 1b to be imaged from
above, and
Figure 1d is a ground based view of the house illustrating imaging by a camera
imaging from above.
Figure 2 schematically illustrates the capturing of images from above.
Figure 3a illustrates an example of a known stereo scheme used to collect data.
Figure 3b illustrates a proposed stereo scheme to be used to collect data.
Figure 4 schematically illustrates an example of a model method based on
combination of ground based images and images taken from above.
Detailed description
The obscuration problem will now be described with reference to figures 1a-1d
showing a house 1 with a tree 2 in front of the house 1. According to figure 1a, a
camera is positioned to take images in positions 3.1-3.4 and the image fields of view
are indicated by angle indications 4.1-4.4. These images are used to form a three
dimensional ground based model. From the angle indications shown it is obvious that
the tree 2, especially its crown 8, hides parts of the house façade 9. Furthermore,
depending on light conditions, the tree 2 will create shadows (not shown) on the house
façade 9.
According to figures 1c and 1d, imaging from above is illustrated. Again four camera
positions 5.1-5.4 are shown with corresponding angle indications 6.1-6.4 indicating
the image fields of view for the taken images. These images are used to form a three
dimensional model taken from above. When imaging from above from an airborne
system certain parts of the real environment are excluded from imaging. In this case
the trunk 7 of the tree 2 is invisible.
However, by combining image information from ground based images and images from
airborne systems, most of the occlusion and obscuration problems are addressed. If a three
dimensional model based on airborne images is available, this a priori knowledge can
be used to handle the obscuration of the tree 2, i.e. several different depths must be
handled in the observation direction. On the other hand, the three dimensional model
taken from above cannot observe the tree trunk 7, which instead can be modelled from the
ground based three dimensional model.
The principles for collecting images to be used for stereo imaging are now discussed.
According to figure 2 an air plane 11 provided with a camera 12 is shown in a first
position by unbroken lines and in a second position by broken lines above a landscape
13. As illustrated in the figure, the landscape differs in height and there are abrupt
configurations 14 such as houses and more billowing configurations 15 such as
billowing hills. The position of the camera in the first position is denoted by x, y, z
and the attitude by α, β, γ. Accordingly, all six degrees of rotation and position are
available. The corresponding position and attitude for the second camera position
shown are denoted by x', y', z' and α', β', γ'. The coverage of the landscape by the
camera 12 is indicated by lines 16, 17 for the first position and 16', 17' for the second
position. When comparing an image of the landscape taken from the first position with
an image taken from the second position an overlapping section 18 can be identified.
If the overlapping section 18 is observed, it can be seen that an image taken from the
first position lacks image information about the vertical right part 14.1 of the abrupt
configuration 14, while the same vertical right part 14.1 is easily imaged from the
second position. Accordingly, being in possession of a plurality of images covering
the same scene position increases the possibilities of building up three dimensional
images coinciding more closely with the real world.
Figure 3a shows an example of a known stereo scheme. Such a scheme is obtained by
flying an air plane or other airborne vehicle provided with a downwards looking
camera above the landscape such that there is an overlap of about 50-60% in the
direction of flight and for adjacent flights principally without overlap and in practice
about 10% in order to avoid holes. In the figure an upper gray strip 19 illustrates the
footprints of a first flight and a lower gray strip 20 the footprints of a second flight. In
the strips 19, 20 the footprints from every second image are illustrated as solid
rectangles 23-30 while the footprints from every second image in between are
illustrated as rectangles 31-36 delimited by dashed lines perpendicular to the flight
direction 22. By the scheme shown each point on the ground is covered with two
images and from these images stereo estimates can be calculated.
Figure 3b shows another proposed example of a stereo scheme that can be used. In the
proposed scheme the upper and lower strips 19, 20 illustrate an overlap of 80% in the
direction of flight 22 and an overlap between adjacent flights of 60%. Suitable
proposed overlaps are about 60-90% in the flight direction and about 60-80%
between adjacent flights. In the different strips 19, 20 five different rectangles 37-41
can be identified illustrating five consecutive footprints that are repeatedly present
along the flight direction. The five rectangles are indicated by five different delimiting
lines (solid, dash-dotted, short-dashed, long-dashed, and dash-double-dotted)
perpendicular to the flight direction. By the scheme as shown and described with
reference to figure 3b each point on the ground is covered with at least 10 images and
all these images can contribute to the stereo estimates for each point in the scene.
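The coverage figures quoted for the two schemes can be checked with a small calculation. Assuming an idealised scheme where all footprints are equal rectangles, a point on the ground falls inside roughly 1/(1 - overlap) consecutive footprints in each direction:

```python
import math

def images_per_point(forward_overlap, side_overlap):
    # Lower bound on the number of images covering a ground point in an
    # idealised scheme with the given fractional overlaps in the flight
    # direction and between adjacent flight lines.
    along_track = math.floor(1.0 / (1.0 - forward_overlap))
    across_track = math.floor(1.0 / (1.0 - side_overlap))
    return along_track * across_track

# Known scheme of figure 3a: ~50% forward, ~10% side overlap -> 2 images.
# Proposed scheme of figure 3b: 80% forward, 60% side overlap -> at least 10.
```

This reproduces the two images per point of the known scheme and the at least 10 images per point of the proposed scheme.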
The image processing involved in the 3D model method of the invention is now
described with reference to figure 4.
Images 44 collected according to the description above with reference to the figures 2,
3a and 3b and that can be available in a storing unit 45 are applied to a stereo disparity
block 42 calculating a stereo disparity for each possible image pair n covering the
same scene position. For each image involved the position x, y, z and the attitude
α, β, γ from which the image is taken are known, i.e. all six degrees of rotation and
position are known. Furthermore, a measure of the certainty for each stereo disparity
is estimated. This measure can be based upon local contrasts, visibility and/or
resolution.
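As an illustration of what the stereo disparity block 42 computes for one image pair, a minimal one dimensional block matching sketch is given below. The actual matching method is not specified in the source; the sum-of-absolute-differences cost used here is an assumed stand-in for illustration only.

```python
def best_disparity(row_left, row_right, x, win, max_d):
    # For the window of half-width `win` around column x in the left
    # image row, find the disparity d (0..max_d) whose shifted window
    # in the right image row matches best under a sum of absolute
    # differences cost.
    ref = row_left[x - win : x + win + 1]

    def sad(d):
        cand = row_right[x - d - win : x - d + win + 1]
        return sum(abs(a - b) for a, b in zip(ref, cand))

    return min(range(max_d + 1), key=sad)
```

A certainty measure of the kind mentioned above could then, for example, compare the best matching cost against the second best, or use the local contrast of the window; both are illustrative choices, not taken from the source.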
The stereo disparities calculated in the stereo disparity block 42 are subjected to a
weighting process in a weighting block 43 taking notice of estimated certainty
measures. Available as an output of the weighting block 43 after weighting is a height
model 46 that can be visualised as a grid. From this first model the original stereo
estimates are reweighted automatically and adaptively based on normal vectors of the
estimated 3D model, taking information such as local contrast, resolution
and visibility, for example obscuration, into consideration. In this connection, for example, an
image taken straight above a building is used to estimate the roof structure and not the
sides of a building. Another example could be to avoid mixing of the front side and
back side of buildings. By an iterative process taking advantage of images from the side
and connected measurements a more reliable 3D model is obtained disclosing hidden
sections. In the weighting process outliers can be sorted out and as a straight forward
example the remaining stereo disparities for a scene are weighted together by
averaging or other mathematical methods to find a concentration of similar stereo
disparities.
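In its simplest averaging form, the weighting with outlier rejection just described might look like the following sketch; the certainty weights and the outlier threshold of 2.0 disparity units are assumed values for illustration only.

```python
def fuse_disparities(estimates, threshold=2.0):
    # `estimates` is a list of (disparity, certainty) pairs for one
    # scene position. Disparities far from the median are treated as
    # outliers and sorted out; the remaining ones are averaged,
    # weighted by their certainty measures.
    disps = sorted(d for d, _ in estimates)
    median = disps[len(disps) // 2]
    kept = [(d, c) for d, c in estimates if abs(d - median) <= threshold]
    total = sum(c for _, c in kept)
    return sum(d * c for d, c in kept) / total
```

With only one stereo pair, as mentioned below, the list holds a single estimate and the weighting reduces to returning it unchanged.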
In a simpler calculation just one stereo pair is enough, requiring just two images
of the same area, and the weighting can be simplified or even left out.
Based upon the 3D model 46 on the output of the weighting block 43, a wire model 47
of triangles is built up and the triangles are draped with images fitting the direction of
viewing.
A similar imaging is taken up from the ground level and images 54 for stereo
treatment are stored in a storing unit 55 and treated in a stereo block 52. For each
image involved, as for imaging from above, the position x, y, z and the attitude α, β, γ
from which the image is taken are known, i.e. all six degrees of rotation and position
are known. The stereo disparities can then be subjected to weighting in a weighting
block before a three dimensional grid model 56 is built up. When combining ground
based images with images taken from above, image information is fetched from the
grid model 56 on request from the image model for images taken from above, and high
resolved textures are fetched from the ground based grid model to complete the wire
model 47 of triangles with draping taken from the ground based grid model. All
combining of texture from the ground based model and the model based on images
taken from above utilizes the fact that complete information concerning position x, y,
z and attitude α, β, γ from which the images are taken is known for all images.
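Because position and attitude are known for every image, any point of one model can be projected into any image to fetch texture. A minimal pinhole-camera sketch is given below; the z-y-x Euler convention and the pure pinhole model are assumptions for illustration, since the source fixes neither.

```python
import math

def rotation(alpha, beta, gamma):
    # Rotation matrix from the attitude angles, composed as
    # Rz(alpha) * Ry(beta) * Rx(gamma) (assumed convention).
    ca, sa = math.cos(alpha), math.sin(alpha)
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gamma), math.sin(gamma)
    rz = [[ca, -sa, 0.0], [sa, ca, 0.0], [0.0, 0.0, 1.0]]
    ry = [[cb, 0.0, sb], [0.0, 1.0, 0.0], [-sb, 0.0, cb]]
    rx = [[1.0, 0.0, 0.0], [0.0, cg, -sg], [0.0, sg, cg]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(matmul(rz, ry), rx)

def project(point, cam_pos, attitude, focal):
    # Transform a world point into the camera frame using the known
    # position x, y, z and attitude, then apply a pinhole projection.
    R = rotation(*attitude)
    d = [point[i] - cam_pos[i] for i in range(3)]
    cam = [sum(R[j][i] * d[j] for j in range(3)) for i in range(3)]  # R^T d
    return (focal * cam[0] / cam[2], focal * cam[1] / cam[2])
```

The returned image coordinates indicate which pixel of the image holds the texture for the given scene point.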
In figure 4 the image treatment has been illustrated by different processing channels
for images taken from above and images taken from ground level. It is, however,
possible that at least some of the included blocks are common for both channels.
The invention is not limited to the method exemplified above but may be modified
within the scope of the attached claims.
Patent Claims
1. A three dimensional model method based on combination of ground based
images and images taken from above, characterized in that an existing 3D model
based on images taken from above is matched with a 3D model based on images taken
from ground level in order to improve an overall 3D model.
2. A method as claimed in claim 1, characterized in that the 3D model based on
images taken from ground level is controlled by the existing 3D model based on
images taken from above.
3. A method as claimed in any of the preceding claims, characterized in that all
images available from the ground level and images taken from above are considered
for estimating a three dimensional model both in terms of geometries and textures.
4. A method as claimed in any of the preceding claims, characterized in that the
matching of the existing 3D model based on images taken from above with images
taken from the ground level is based on position and attitude information of images
taken.
5. A method as claimed in any of the preceding claims, characterized in that
image information taken from ground level having a high resolved texture is used to
enhance images of the existing 3D model based on images taken from above by
replacing essentially vertical and downwards inclined surfaces with images based on
images taken from ground level.
6. A method as claimed in any of the preceding claims, characterized in that
high level surfaces such as roofs are estimated and textured from images taken from
above.
7. A method as claimed in any of the preceding claims, characterized in that
vertical surfaces such as house facades are estimated from available images taken
from ground level and textured from these images.
8. A method as claimed in any of the preceding claims, characterized in that
images taken from the ground are mutually correlated to compensate for position and
attitude deviations.
9. A method as claimed in any of the preceding claims, characterized in that
images taken from the ground in a 3D model are correlated with images taken from
above in a 3D model to compensate for position and attitude deviations.