Abstract: A method for calibrating a multi-view three dimensional camera (4) having a time-of-flight camera (4) for generating a three dimensional depth map (6) through a 360 degree lens system (8), the method including placing the camera (4) relative to a surrounding hollow body (10) of a known inner shape (12), generating the depth map (6) of the known inner shape (12) of the hollow body (10), and comparing the depth map (6) with the known inner shape (12) to determine correction parameters (14) of the depth map (6).
Description
A method and a system for calibrating a multi-view three
dimensional camera
This invention relates to a method and a system for
calibrating a multi-view three dimensional camera including a
time-of-flight camera.
Cameras based on the time-of-flight principle use an electromagnetic pulse to generate a depth map of a surface of an object in a scene when the electromagnetic pulse is reflected by the surface of the object. A lens of the camera gathers the reflected electromagnetic pulses from the various such surfaces of the objects in the scene and generates a depth map of the whole scene. Depending on the distance travelled, each pulse experiences a delay before it is captured by the lens of the camera.
A 360 degree lens system combines refraction via lenses and reflection via curved mirrors to gather the reflected electromagnetic pulses from all angles of the scene.
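For illustration only (this reading of the time-of-flight principle is added here and is not a quotation of the specification): the distance d to a reflecting surface follows from the measured round-trip delay Δt of the pulse as d = c·Δt/2, where c is the speed of light, since the pulse travels to the surface and back before reaching the sensor.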
To generate a multi-view depth map of the scene representing
views from various angles of the scene, a multi-view three
dimensional camera based on the time-of-flight principle is
used. The multi-view three dimensional camera uses the 360
degree lens system to gather the reflected electromagnetic
pulses from various angles of the scene to generate the depth
map. However, the 360 degree lens system tends to distort the path length of the reflected electromagnetic pulse as the pulse passes through it. Thus the depth map generated by the time-of-flight camera using the 360 degree lens system is distorted.
It is an object of the present invention to calibrate a
multi-view three dimensional camera based on the time-of-
flight principle to compensate for the distortion of
electromagnetic pulse path lengths due to the 360 degree lens
system.
The object of the invention is achieved by a method according to claim 1 and a system according to claim 12.
The underlying idea of the invention is to generate, using a 360 degree lens system, a depth map representing a multi-view of a hollow body with a known inner shape, and to compare this depth map with the known inner shape to determine correction parameters for a set of undetermined parameters of the depth map which compensate the distortion of the electromagnetic pulse path length caused by the 360 degree lens system. This allows the multi-view three dimensional camera to be calibrated easily.
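A minimal sketch of this calibration idea is given below for illustration only; the camera interface and the helper names (capture_depth_map, expected_depth_map) are assumptions made for the sketch and are not defined in the specification.

```python
import numpy as np

def calibrate(camera, inner_shape):
    """Illustrative calibration flow: generate a depth map of a hollow
    body of known inner shape through the 360 degree lens system and
    compare it with the known inner shape to obtain per-pixel
    correction parameters (a sketch, not the claimed method itself)."""
    measured = camera.capture_depth_map()              # depth map of the known inner shape
    expected = inner_shape.expected_depth_map(camera)  # ideal map from the known geometry
    # Simplest possible correction: the per-pixel offset that maps the
    # measured values onto the expected ones.
    return expected - measured

def apply_correction(depth_map, correction):
    # A depth map corrected this way should correspond to the known inner shape.
    return depth_map + correction
```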
According to an exemplary embodiment, the correction parameters are determined such that the depth map corrected with the correction parameters corresponds to the known inner shape, so that the correction parameters are aligned directly with the comparison of the depth map with the known inner shape. Such correction parameters calibrate the multi-view three dimensional camera more precisely.
According to one embodiment, the correction parameters compensate a difference offset of a path length of an electromagnetic pulse for each pixel of the depth map, which arises from generating the depth map through the 360 degree lens system. This helps to correct the path length of a pulse which has been shortened or lengthened as it travelled through the 360 degree lens system, so that the electromagnetic pulse appears non-distorted. The depth map generated from the corrected electromagnetic pulses will correspond to the known inner shape.
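Expressed as a simple formula (an illustrative reading of this embodiment, not wording from the claims): if L_lens is the path length measured through the 360 degree lens system and L_free the path length the pulse would have travelled without it, the correction parameter for a pixel compensates the difference offset Δ = L_lens − L_free, so that the corrected value of that pixel again corresponds to the known inner shape.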
According to another embodiment, the method includes determining the correction parameters for every pixel of the depth map, so that correction parameters are available for the entire depth map and the entire depth map can be corrected at the same time to provide a corrected depth map which corresponds to the inner shape of the hollow body.
According to yet another embodiment, the correction parameters are determined by matching the depth map to the known inner shape by a matching algorithm. Using a matching algorithm provides a simple way to obtain the correction parameters.
According to an exemplary embodiment, the matching algorithm is based on a linear shift or a spline transformation of the depth map, or a combination thereof. Such matching algorithms are easy to implement, as the concepts of linear shift and spline transformation are generally known.
According to one embodiment, the inner shape of the hollow body is a cylinder or a hemisphere. This makes the method easy to implement and use, as hollow bodies with such inner shapes are readily available.
According to another embodiment, the hollow body is a room of known shape, so that the room itself serves as the hollow body for calibration of the camera. This removes the need for a separate hollow body for the calibration of the camera and also provides a solution for calibration when the separate hollow body generally used for calibration of the camera is lost.
According to another embodiment, the camera is placed in a predetermined position relative to the hollow body. This helps to determine the correction parameters easily and quickly, as the time the processor would need to calculate the position of the camera with respect to the surrounding body is saved.
According to yet another embodiment, the shape of the hollow body has a symmetry axis and the hollow body is placed coaxially with an optical axis of the camera. As the symmetry axis is easy to locate, a user can easily place the camera relative to the surrounding hollow body.
According to an exemplary embodiment, the system includes a
holder for receiving the hollow body, so that the hollow body
can be supported during the calibration of the camera.
FIG 1 shows a schematic diagram of a system for calibrating a
multi-view three dimensional camera.
FIG 2 shows a flowchart for a matching algorithm used for
determining the correction parameters by matching the depth
map to the known inner shape.
The camera includes an electromagnetic pulse source which emits an electromagnetic pulse that is captured back by the camera after being reflected by a surface of a known inner shape, so as to generate a multi-view depth map of the known inner shape representing views from various angles of the known inner shape. However, as the electromagnetic pulse passes through the 360 degree lens system, the path length of the electromagnetic pulse gets distorted, resulting in a change in the path length. A system for calibrating the camera, so that the generated depth map corresponds to the known inner shape once the distortion of the path length is compensated, is illustrated in FIG 1.
In reference to FIG 1, a system 2 is exemplified, showing a multi-view three dimensional camera 4 with a 360 degree lens system 8 for generating a multi-view depth map 6, a surrounding hollow body 10 of known inner shape 12 surrounding the camera 4, wherein the camera 4 generates the multi-view depth map of the inner shape 12, and a processor 22 receiving the depth map from the camera 4 to calibrate the camera 4 by determining correction parameters 14 for a set of undetermined parameters 26 for each pixel 16 of the depth map 6, such that the depth map 6, on being corrected using the correction parameters 14, corresponds to the known inner shape 12.
The undetermined parameters 26 are based on various physical properties of the camera 4, the hollow body 10, the known inner shape 12 of the hollow body 10, the interrelation between the hollow body 10 and the camera 4, or a combination thereof.
The camera 4 also includes a source of electromagnetic pulses which emits an electromagnetic pulse, such as a light pulse, a laser pulse or any such pulse having electromagnetic properties, and a shutter to cut off the electromagnetic pulse when the pulse returns after being reflected by the surface of the known inner shape 12 through the 360 degree lens system 8. The 360 degree lens system 8 distorts the path length of the electromagnetic pulse by elongating or shortening the path length or by changing the geometry of the path of the electromagnetic pulse, so the depth map 6 generated from the electromagnetic pulses does not map onto the known inner shape 12. The processor 22 takes the depth map 6 as an input, processes the depth map 6 and determines the correction parameters 14. The depth map 6, on correction using the correction parameters 14, represents the known inner shape 12.
The correction parameters 14, on being used with the depth map 6, compensate a difference offset of the path length of the electromagnetic pulse for each pixel 16 of the depth map 6, wherein the difference offset is defined by the difference in the path lengths of the electromagnetic pulse when the 360 degree lens system 8 is used and when the 360 degree lens system 8 is not used.
The correction parameters 14 compensate a difference offset of the path length of the electromagnetic pulse for each pixel 16 of the depth map 6 by compensating the intensity of each pixel 16 of the depth map 6. Alternatively, the correction parameters 14 can compensate the difference offset of the path length of the electromagnetic pulse for each pixel 16 of the depth map 6 by compensating the frequency or the wavelength, or a combination of the intensity, wavelength and frequency, of each pixel 16 of the depth map 6.
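A minimal per-pixel sketch of such a compensation, assuming the difference offset is already known for every pixel; the array shapes and names are placeholders chosen only for this illustration:

```python
import numpy as np

# 'measured' is the depth map generated through the 360 degree lens system,
# 'offset' holds the difference offset of the path length per pixel, i.e.
# path length with the lens system minus path length without it.
measured = np.zeros((480, 640))   # placeholder depth map
offset = np.zeros((480, 640))     # correction parameters, one per pixel

# Compensating every pixel with its own offset yields a depth map that
# should correspond to the known inner shape.
corrected = measured - offset
```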
The depth map 6 comprises various pixels 16, and each pixel 16 is generated by a different electromagnetic pulse. In general, the electromagnetic pulse source emits electromagnetic pulses in different directions to reach different parts of the known inner shape 12. Each part of the inner shape 12 is located at a different distance, so each electromagnetic pulse travels a different distance and generates a pixel 16 on the depth map 6 on the basis of the distance it travelled. When passing through the 360 degree lens system 8, each electromagnetic pulse is distorted differently, as each pulse comes from a different direction after travelling a different distance. So the correction parameters are determined for each of the pixels 16 on the depth map 6, as the electromagnetic pulses corresponding to the pixels 16 have been distorted differently by the 360 degree lens system 8. Alternatively, the correction parameters 14 can be determined as a function representing the correction parameters 14 for each of the pixels 16 generated from the electromagnetic pulses distorted differently by the 360 degree lens system 8. The function for the correction parameters 14 can be an algebraic function, a vector function, a trigonometric function or any such mathematical function which can represent all the correction parameters 14 for each of the pixels 16.
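As one illustration of such a functional representation (the polynomial form and the image centre are assumptions made for this sketch, not details given in the specification), the correction for a pixel can be written as a function of its radial distance from the optical centre of the 360 degree lens system:

```python
import numpy as np

def radial_correction(u, v, coeffs, centre=(320.0, 240.0)):
    """Hypothetical functional form of the correction parameters: a
    polynomial in the radial pixel distance r from the image centre,
    evaluated per pixel instead of storing one value per pixel."""
    r = np.hypot(u - centre[0], v - centre[1])
    # coeffs = (c0, c1, c2, ...) gives correction = c0 + c1*r + c2*r**2 + ...
    return sum(c * r**k for k, c in enumerate(coeffs))
```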
The correction parameters 14 are determined by matching the depth map 6 to the known inner shape 12 by a matching algorithm 18. The matching algorithm takes the depth map 6 and spatial coordinates referring to the known inner shape 12 as inputs and matches those spatial coordinates with the pixels 16 in the depth map 6 which refer to them. On the basis of the match, the processor 22 determines the correction parameters 14. The matching algorithm can be based on a linear shift or a spline transformation of the depth map 6, on any such spatial transform based algorithm which determines the correction parameters 14, on any other matching algorithm which determines the correction parameters 14, or on combinations thereof.
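A minimal sketch of a linear-shift match, assuming the expected depth values of the known inner shape are available per pixel; the least-squares criterion and the function names are choices made only for this illustration, and a spline fit could take the place of the linear fit:

```python
import numpy as np

def match_linear_shift(measured, expected):
    """Fit a linear shift a*measured + b that best maps the measured
    depth values onto the expected values of the known inner shape,
    in the least-squares sense, and return the remaining mismatch."""
    x = measured.ravel()
    y = expected.ravel()
    a, b = np.polyfit(x, y, 1)       # linear shift parameters
    residual = y - (a * x + b)       # per-pixel mismatch after the shift
    return a, b, residual.reshape(measured.shape)
```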
The known inner shape 12 of the hollow body 10 is a cylinder. The inner shape 12 is known through the dimensions of the cylinder 10, i.e. a length and a radius of the cylinder 10. The cylinder 10 has a regular shape, which makes the calculations performed by the processor 22 while determining the correction parameters 14 fast. Once the dimensions of the cylinder 10 are known, it becomes easy for the processor 22 to calculate the orientation and location of the camera 4 within the surroundings of the cylinder 10. The location and orientation of the camera 4 can be calculated by measuring the spatial distances travelled by the electromagnetic pulses and using these spatial distances along with the dimensions of the cylinder 10. Further, from the dimensions of the cylinder 10, the location and orientation of the camera 4 within the surroundings of the cylinder 10 and the spatial distances travelled by the electromagnetic pulses, the processor 22 calculates the correction parameters 14. Alternatively, the known inner shape 12 can be a hemisphere. The shape of the hemisphere is known through a single dimension, i.e. the radius of the hemisphere. The processor 22 takes the radius into consideration to determine the correction parameters 14, using the spatial distance travelled by the electromagnetic pulse along with the radius of the hemisphere. In an intermediate step, the orientation and location of the camera 4 within the surrounding of the hemisphere can also be determined using the spatial distance travelled by the electromagnetic pulse and the radius of the hemisphere. Alternatively, the known inner shape 12 can be a cuboid, a cube, a trapezium or any other known shape for which the dimensions of the inner shape 12 are known, and the correction parameters 14 can be determined by the processor 22 using the dimensions of the inner shape 12 and the spatial distances travelled by the electromagnetic pulses to various parts of the inner shape 12 of the hollow body 10.
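For a cylinder, the expected distances are easy to write down. The sketch below assumes the camera sits on the cylinder axis at mid-height, a simplifying assumption made only for this illustration:

```python
import numpy as np

def expected_cylinder_distance(elevation, radius, half_length):
    """Distance from a camera on the axis of a cylinder, at mid-height,
    to the inner surface for a ray at the given elevation angle above
    the plane perpendicular to the axis (angle in radians)."""
    elevation = np.asarray(elevation, dtype=float)
    with np.errstate(divide="ignore"):
        to_wall = radius / np.abs(np.cos(elevation))       # ray hits the lateral wall
        to_cap = half_length / np.abs(np.sin(elevation))   # ray hits an end cap
    # The pulse is reflected by whichever surface the ray reaches first.
    return np.minimum(to_wall, to_cap)
```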
In an alternative embodiment, the known inner shape 12 can be a room of known dimensions and the camera 4 can be placed inside the room to determine the correction parameters 14. The dimensions of the room can be made available from an architectural map of the room, and the data relating to the dimensions of the room can be fed into the processor 22. When the electromagnetic pulses return after being reflected from various parts of the inner shape 12 of the room, the processor 22 determines the correction parameters 14 taking the dimensions of the room into consideration. In an intermediate step, the orientation and location of the camera 4 within the surrounding of the room can also be determined using the spatial distances travelled by the electromagnetic pulses to various parts of the room and the dimensions of the room.
The camera 4 is placed in a predetermined position relative to the hollow body 10. If the predetermined position is known, the determination of the correction parameters 14 by the processor 22 becomes even faster, because the orientation and location of the camera 4 do not need to be calculated; these data are available from the knowledge of the predetermined position. Alternatively, the camera 4 need not be placed in a predetermined position; rather, it can be placed arbitrarily with respect to the surrounding hollow body 10 and the correction parameters 14 are calculated by the processor 22 using the matching algorithm.
The hollow body 10 has a symmetry axis 18 and the camera 4 is placed within the surrounding hollow body 10 so that the optical axis 20 of the camera 4 and the symmetry axis 18 are coaxial. Placing the camera 4 in such a way helps to place the camera 4 in the predetermined position, as when the camera 4 is placed coaxially to the hollow body 10 the orientation and location of the camera 4 are easily known due to the symmetry of the hollow body 10. Alternatively, the hollow body 10 need not have a symmetry axis 18, and the camera 4 is placed in a predetermined position with respect to the hollow body 10 such that the hollow body 10 surrounds the camera 4.
Both the hollow body 10 and the camera 4 are movable, so that a desired position of the camera 4 and the hollow body 10 relative to each other can be attained. In an alternative embodiment, only one of the camera 4 and the hollow body 10 is movable to attain the desired position of the camera 4 and the hollow body 10 relative to each other.
The hollow body 10 is placed on a holder 24, so that the hollow body 10 can be easily placed and retained in a position relative to the camera 4. Alternatively, the holder 24 can be used to hold the camera 4 when the camera 4 is kept fixed and the hollow body 10 is movable. Yet alternatively, the holder 24 can be provided to keep both the hollow body 10 and the camera 4 in a desired position relative to each other.
The holder 24 is provided with the flexibility to move the hollow body 10 rotationally and translationally in three dimensional space. While moving the hollow body 10, when the hollow body 10 has attained the desired position with respect to the camera 4, the movement of the hollow body 10 is locked using a locking mechanism. Alternatively, the holder 24 can provide resistive movement of the hollow body 10 by means of a resistive movement mechanism, so that the hollow body 10 can be moved into the desired position easily and quickly with greater precision.
The 360 degree lens system 8 includes a combination of lenses and curved mirrors to provide a multi-view depth map 6 of the known inner shape 12, so that a sectional view of a part of the known inner shape 12 is generated. In an alternative embodiment, the 360 degree lens system 8 generates a 360 degree view depth map 6 of the known inner shape 12 representing a complete view of each part of the known inner shape 12.
The processor 22 receives the multi-view depth map 6 generated by the camera 4 and compares the depth map 6 with the known inner shape 12 to determine the correction parameters 14. While generating the correction parameters 14, the processor 22 uses the matching algorithm. The processor 22 can be a general purpose computer, such as a central processing unit of a personal computer, or a calculating device able to perform arithmetic and logical functions, which is adapted to generate the correction parameters 14 using the depth map 6.
FIG 2 illustrates a matching algorithm used for determining
the correction parameters by matching a depth map of a known
inner shape, while calibrating a multi-view three dimensional
camera.
According to the matching algorithm, the intensities of pixels on the depth map are compensated using a linear shift, a spline transform or any other such transform, on the basis of the path length of an electromagnetic pulse.
The matching algorithm comprises the following steps. In step 102 a set of undetermined parameters, based on various physical properties of the camera, the hollow body, the known inner shape of the hollow body, the interrelation between the hollow body and the camera, or a combination thereof, is chosen for each of the pixels on the depth map of a known inner shape of the hollow body. In step 104 the intensities of the pixels are transformed radially for the set of undetermined parameters. A transformed depth map is produced in step 106 using the intensities transformed in step 104. In step 108 the depth maps of the known inner shape of the hollow body are compared at various relative positions of the camera with respect to the hollow body to obtain the best approximation of the position of the hollow body with respect to the camera and to identify the best match of the transformed depth map and the known inner shape of the hollow body, so that the transformed depth map exactly fits the known inner shape. In step 110 the undetermined parameters are changed if the match between the depth map and the known inner shape of the hollow body is not appropriate, and steps 104 to 110 are iterated until an appropriate match is found. In step 112, on finding the appropriate match, the undetermined parameters are saved as correction parameters to transform the intensities of the pixels on the depth map so as to obtain an accurate depth map.
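Read as pseudocode, the loop of FIG 2 could look roughly as follows; the termination test, the radial transform and the parameter update rule are placeholders chosen for this sketch rather than details fixed by the specification:

```python
import numpy as np

def matching_loop(depth_map, expected_map, initial_params,
                  transform, update, tolerance=1e-3, max_iter=100):
    """Illustrative version of steps 102 to 112: choose undetermined
    parameters, transform the depth map radially, compare it with the
    known inner shape and iterate until an appropriate match is found."""
    params = initial_params                                   # step 102
    for _ in range(max_iter):
        transformed = transform(depth_map, params)            # steps 104 and 106
        error = np.mean(np.abs(transformed - expected_map))   # step 108
        if error < tolerance:                                 # appropriate match found
            break                                             # step 112
        params = update(params, transformed, expected_map)    # step 110
    return params   # saved as the correction parameters
```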
WE CLAIM
1. A method for calibrating a multi-view three dimensional
camera (4), comprising a time-of-flight camera (4) for
generating a three dimensional depth map (6) through a 360
degree lens system (8), wherein the method comprises:
- placing the camera (4) relative to a surrounding hollow
body (10) of a known inner shape (12),
- generating the depth map (6) of the known inner shape (12)
of the hollow body (10),
- comparing the depth map (6) with the known inner shape (12)
to determine correction parameters (14) for a set of
undetermined parameters (26) of the depth map (6).
2. The method according to claim 1, wherein the correction
parameters (14) are determined such that the depth map (6)
corrected with the correction parameters (14) corresponds to
the known inner shape (12).
3. The method according to any of the claims 1 or 2, wherein
the correction parameters (14) compensate an offset of a path
length of an electromagnetic pulse for each pixel of the depth
map due to generating the depth map through the 360 degree
lens system (8).
4. The method according to claim 3, wherein the correction
parameters (14) compensate the offset of the path length of
the electromagnetic pulse for each pixel of the depth map by
compensating intensity of each pixel of the depth map.
5. The method according to any of the claims from 1 to 4,
wherein the correction parameters (14) are determined for
every pixel (16) of the depth map (6).
6. The method according to any of the claims from 1 to 5,
wherein the correction parameters (14) are determined by
matching the depth map (6) to the known inner shape (12) by a
matching algorithm.
7. The method according to the claim 6, wherein the matching
algorithm is based on linear shift or spline transformation
or combination thereof.
8. The method according to any of the claims from 1 to 7,
wherein the inner shape (12) of the hollow body (10) is a
cylinder or a hemisphere.
9. The method according to any of the claims from 1 to 8,
wherein the hollow body (10) is a room of known shape.
10. The method according to any of the claims from 1 to 9,
wherein the camera (4) is placed in a predetermined position
relative to the hollow body (10).
11. The method according to claim 10, wherein the inner shape
of the hollow body (10) has a symmetry axis (18) and the
hollow body (10) is placed coaxially to an optical axis (20)
of the camera (4).
12. A system (2) comprising:
- a multi-view three dimensional camera (4) for generating
the three dimensional depth map (6) through the 360 degree
lens system (8),
- a processor (22) adapted to calibrate the camera (4) according to the method of any of the claims from 1 to 11.
13. The system (2) according to claim 12 comprising:
- a holder (24) for receiving the hollow body in a
predetermined position relative to the camera (4).