
Dynamic Calibration Of Multi Camera Systems Using Multiple Multi View Image Frames

Abstract: System, apparatus, method, and computer readable media for on-the-fly dynamic calibration of multi-camera platforms using images of multiple different scenes. Image frame sets previously captured by the platform are scored as potential candidates from which new calibration parameters may be computed. The candidate frame sets are ranked according to their score and iteratively added to the calibration frame set according to an objective function. The calibration frame set may be selected from the candidates based on a reference frame, which may be, for example, a most recently captured frame. A device platform including a camera module (CM) and comporting with the exemplary architecture may enhance multi-camera functionality in the field by keeping calibration parameters current. Various computer vision algorithms may then rely upon these parameters.


Patent Information

Application #:
Filing Date: 15 October 2018
Publication Number: 20/2019
Publication Type: INA
Invention Field: ELECTRONICS
Status:
Email: ipo@iphorizons.com
Parent Application:

Applicants

INTEL CORPORATION
2200 Mission College Boulevard, Santa Clara, California, 95054, USA.

Inventors

1. AVINASH KUMAR
3655 Pruneridge Avenue, Apt. 125, Santa Clara, CA 95051, US.
2. RAMKUMAR NARAYANSWAMY
845 Waite Drive, Boulder, CO 80303, US.
3. MANJULA GURURAJ
34200 Finnigan Terrace, Fremont, CA 94555, US.

Specification

Claims:

1. A camera calibration method, comprising:
collecting a plurality of image frame sets, wherein each image frame set comprises two or more image frames of a single scene captured with two or more cameras;
scoring the plurality of image frame sets;
selecting two or more of the image frame sets based on the scoring and an objective function of error associated with camera calibration parameters determined from the image frame set selection; and
updating one or more camera calibration parameters based, at least in part, on the selected image frame sets.
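The score-rank-select loop recited in claim 1 could be sketched as a greedy search that keeps a candidate frame set only when it improves the calibration objective. The scoring and error functions below are hypothetical stand-ins for those described in the specification:

```python
def select_frame_sets(frame_sets, score, calibration_error):
    """Greedily select frame sets per the claimed method: rank candidates
    by score, then add each one only if the calibration-error objective
    improves with it included."""
    ranked = sorted(frame_sets, key=score, reverse=True)  # best-scoring first
    selected, best_error = [], float("inf")
    for candidate in ranked:
        trial_error = calibration_error(selected + [candidate])
        if trial_error < best_error:
            selected.append(candidate)
            best_error = trial_error
    return selected, best_error

# Illustrative stand-ins: score a frame set by its feature count, and let
# the error objective shrink as the selected sets contribute more features.
sets = [{"id": i, "features": n} for i, n in enumerate([50, 200, 120])]
chosen, err = select_frame_sets(
    sets,
    score=lambda s: s["features"],
    calibration_error=lambda sel: 1.0 / (1 + sum(s["features"] for s in sel)),
)
```

Because each candidate is admitted only when the objective improves, a frame set that adds no information (or degrades the calibration) is skipped even if its standalone score is high.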
Description:

BACKGROUND
A digital camera is a component often included in commercial electronic media device platforms. Digital cameras are now available in wearable form factors (e.g., image capture earpieces, image capture headsets, image capture eyeglasses, etc.), as well as embedded within smartphones, tablet computers, and notebook computers, etc. Multiple cameras are now often embedded in the same device platform. For such multi-camera platforms, two or more cameras may each capture or acquire an image frame at one instant in time (e.g., in a stereo image mode). With synchronous multi-camera image capture, computer vision techniques may be employed to process the stereo image sets and generate novel output effects. For example, a number of computational imaging tasks, such as depth mapping, depth dependent blurring, image stitching, and 3D scene object measurement, can be performed based on the image frame data collected by a multi-camera platform. However, the accuracy of many of these tasks relies heavily on calibration parameters of the cameras. Camera calibration is therefore an important interface between captured images and a computer vision algorithm.
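As one concrete illustration of why these tasks depend on calibration parameters, stereo depth mapping converts a measured pixel disparity into metric depth using the calibrated focal length and camera baseline (Z = f·B/d). A minimal sketch, with hypothetical parameter values:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Stereo depth from the pinhole relation Z = f * B / d.

    focal_px   -- calibrated focal length in pixels (intrinsic parameter)
    baseline_m -- calibrated distance between the two cameras in meters
    """
    return focal_px * baseline_m / disparity_px

# With a 1200 px focal length and a 12.5 cm baseline, a 30 px disparity
# corresponds to a depth of 5.0 m.
depth = depth_from_disparity(30.0, focal_px=1200.0, baseline_m=0.125)
```

A small error in the calibrated focal length or baseline scales the computed depth proportionally, which is one reason stale calibration parameters degrade these applications.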
Camera calibration estimates the intrinsic geometric properties of a single camera, such as focal length, pixel pitch, and center point, which allow image pixels to be transformed into metric units (e.g., mm) of a scene. Camera calibration also estimates extrinsic parameters characterizing the relative pose between all pairs of cameras in a multi-camera system. Together, these parameters can be used to accurately compute a 3D reconstruction of the imaged scene, which is an important component driving many computational photography applications. An out-of-calibration camera can therefore result in inaccurate 3D reconstructions, and thus degrade the performance of these applications.
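The role of the intrinsic parameters can be sketched with the standard pinhole projection, which maps a 3D point in the camera frame to pixel coordinates. The parameter values below are illustrative, not taken from the specification:

```python
def project(point_3d, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame 3D point (metric units) to
    pixel coordinates: u = fx * X / Z + cx, v = fy * Y / Z + cy.

    (fx, fy) are focal lengths in pixels; (cx, cy) is the center point.
    """
    X, Y, Z = point_3d
    return (fx * X / Z + cx, fy * Y / Z + cy)

# A point 1 m in front of the camera and 0.25 m to its right lands
# 250 px right of the image center:
u, v = project((0.25, 0.0, 1.0), fx=1000.0, fy=1000.0, cx=640.0, cy=360.0)
```

Inverting this mapping, which is what 3D reconstruction requires, only recovers correct metric coordinates when (fx, fy, cx, cy) are accurate, which is why out-of-calibration intrinsics corrupt downstream results.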
When multi-camera platforms are manufactured, a calibration is typically performed to determine an accurate estimate of the platform configuration. While such a calibration can be very accurate, the platform configuration can change over time as a result of repeated use and exposure to various external factors, making it unlikely that the factory calibration will hold over a camera platform’s life cycle. For example, changes in ambient temperature, platform orientation with respect to gravity, and deformations induced by physical impacts can all result in changes to the platform configuration that will induce significant error in computational imaging tasks if those tasks are performed based on calibration parameters that remain fixed at the time of manufacture.
The camera parameter calibrations performed at the time of manufacture are tedious and difficult to duplicate in the field, particularly when the platform is a consumer device (e.g., a smartphone). A dynamic calibration method that uses captured images of natural scenes collected in the field to refine or update the camera calibration parameters is therefore more practical. For dynamic calibration, the target scene geometry is not known a priori and is instead computed as part of the camera calibration process. The accuracy of dynamic calibration depends on the number of feature points in a captured scene and their 3D distribution within the scene. Capturing a scene in the field with a suitable count and distribution of feature points is not easy. This is a particular issue for consumer device platforms, where the environment in which the platform is used is unpredictable.

BRIEF DESCRIPTION OF THE DRAWINGS
The material described herein is illustrated by way of example and not by way of limitation in the accompanying figures. For simplicity and clarity of illustration, elements illustrated in the figures are not necessarily drawn to scale. For example, the dimensions of some elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference labels have been repeated among the figures to indicate corresponding or analogous elements. In the figures:
FIG. 1 is a schematic illustrating an exemplary single camera geometric model and associated coordinate systems, in accordance with some embodiments;
FIG. 2 is a flow diagram illustrating computer-implemented methods that include multi-image, multi-camera dynamic calibration, and the further use of calibration parameters determined through such a calibration, in accordance with some embodiments;
FIGS. 3A and 3B are flow diagrams illustrating computer-implemented multi-image, multi-capture dynamic calibration methods, in accordance with some embodiments;
FIG. 4 is a flow diagram illustrating a computer-implemented dynamic calibration method that may be enlisted in a multi-image, multi-camera dynamic calibration method, in accordance with some embodiments;
FIG. 5A is a schematic illustrating rectification error, which may be employed to assess error associated with calibration parameter values determined through a calibration routine, in accordance with some embodiments;
FIGS. 5B and 5C are exemplary feature distribution histograms illustrating differences in feature distribution between two candidate image frame sets, in accordance with some embodiments;
FIG. 6 is a block diagram of a system platform that includes multiple cameras and a processor operable to implement multi-image, multi-capture dynamic calibration methods, in accordance with some embodiments;
FIG. 7 is a diagram of an exemplary system that includes multiple cameras and a processor operable to implement multi-image, multi-capture dynamic calibration methods, in accordance with one or more embodiments; and
FIG. 8 is a diagram of an exemplary mobile handset platform, arranged in accordance with some embodiments.

DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
One or more embodiments are described with reference to the enclosed figures. While specific configurations and arrangements are depicted and discussed in detail, it should be understood that this is done for illustrative purposes only. Persons skilled in the relevant art will recognize that other configurations and arrangements are possible without departing from the spirit and scope of the description. It will be apparent to those skilled in the relevant art that techniques and/or arrangements described herein may be employed in a variety of other systems and applications beyond what is described in detail herein.

Documents

Application Documents

# Name Date
1 201844039076-FORM 1 [15-10-2018(online)].pdf 2018-10-15
2 201844039076-DRAWINGS [15-10-2018(online)].pdf 2018-10-15
3 201844039076-DECLARATION OF INVENTORSHIP (FORM 5) [15-10-2018(online)].pdf 2018-10-15
4 201844039076-COMPLETE SPECIFICATION [15-10-2018(online)].pdf 2018-10-15
5 201844039076-FORM 18 [18-10-2018(online)].pdf 2018-10-18
6 Correspondence by Agent_Form5_22-10-2018.pdf 2018-10-22
7 201844039076-Correspondence-Letter [02-11-2018(online)].pdf 2018-11-02
8 201844039076-FORM-26 [05-11-2018(online)].pdf 2018-11-05
9 Correspondence by Agent_Power Of Attorney_09-11-2018.pdf 2018-11-09
10 201844039076-FORM 3 [12-04-2019(online)].pdf 2019-04-12
11 201844039076-FER.pdf 2020-08-11
12 201844039076-Information under section 8(2) [12-01-2021(online)].pdf 2021-01-12
13 201844039076-FORM 3 [12-01-2021(online)].pdf 2021-01-12
14 201844039076-OTHERS [29-01-2021(online)].pdf 2021-01-29
15 201844039076-FER_SER_REPLY [29-01-2021(online)].pdf 2021-01-29
16 201844039076-CLAIMS [29-01-2021(online)].pdf 2021-01-29
17 201844039076-ABSTRACT [29-01-2021(online)].pdf 2021-01-29
18 201844039076-US(14)-HearingNotice-(HearingDate-26-12-2022).pdf 2022-12-08
19 201844039076-Correspondence to notify the Controller [14-12-2022(online)].pdf 2022-12-14
20 201844039076-Proof of Right [16-12-2022(online)].pdf 2022-12-16
21 201844039076-FORM-26 [23-12-2022(online)].pdf 2022-12-23
22 201844039076-Response to office action [13-01-2023(online)].pdf 2023-01-13

Search Strategy

1 Search_Strategy_201844039076E_03-08-2020.pdf