Abstract: Aspects of the present disclosure relate to technologies (systems, devices, methods, etc.) for performing feature detection and/or feature tracking based on image data. In embodiments, the technologies include or leverage a SLAM hardware accelerator (SWA) that includes a feature detection component and, optionally, a feature tracking component. The feature detection component may be configured to perform feature detection on working data encompassed by a sliding window. The feature tracking component may be configured to perform feature tracking operations to track one or more detected features, e.g., using normalized cross correlation (NCC) or another method.
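To make the tracking step concrete, the following is a minimal sketch (not the disclosed implementation) of an NCC score between a template patch around a detected feature and a candidate patch in a later frame. The patch size, data layout, and function name are assumptions for illustration only.

```c
/*
 * Illustrative sketch only: a scalar normalized cross correlation (NCC)
 * score between a template patch (around a detected feature) and a
 * candidate patch in a later frame. Patch size, data layout, and the
 * function name are assumptions, not the disclosed implementation.
 */
#include <math.h>
#include <stddef.h>

#define PATCH 8 /* assumed square patch side, in pixels */

double ncc_score(const unsigned char *tmpl, const unsigned char *cand,
                 size_t stride)
{
    /* Mean intensity of each patch. */
    double mt = 0.0, mc = 0.0;
    for (int y = 0; y < PATCH; y++)
        for (int x = 0; x < PATCH; x++) {
            mt += tmpl[y * stride + x];
            mc += cand[y * stride + x];
        }
    mt /= PATCH * PATCH;
    mc /= PATCH * PATCH;

    /* Zero-mean correlation, normalized by both patch energies. */
    double num = 0.0, dt = 0.0, dc = 0.0;
    for (int y = 0; y < PATCH; y++)
        for (int x = 0; x < PATCH; x++) {
            double a = tmpl[y * stride + x] - mt;
            double b = cand[y * stride + x] - mc;
            num += a * b;
            dt  += a * a;
            dc  += b * b;
        }
    /* Score in [-1, 1]; flat patches yield 0 to avoid division by zero. */
    return (dt > 0.0 && dc > 0.0) ? num / sqrt(dt * dc) : 0.0;
}
```

A tracker built on such a score would evaluate it over a search region in the new frame and take the best-scoring candidate position as the feature's tracked location.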
Claims: 1. A computer-implemented method for detecting features in a digital image, comprising the following computer-implemented operations:
defining a first sliding window encompassing first working data, the first working data comprising a first portion of image data of the digital image;
performing first feature detection operations on one or more first candidate pixels in the first working data within the first sliding window to classify whether said one or more first candidate pixels is or is not a feature;
defining a second sliding window encompassing second working data, the second working data comprising reuse data and new data; and
performing second feature detection operations on one or more second candidate pixels within the second working data to classify whether said one or more second candidate pixels is or is not a feature;
wherein said reuse data comprises a portion of the first working data, and said new data comprises image data of the digital image that was not included in the first working data.
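The data-reuse pattern recited in claim 1 can be sketched as follows; the window dimensions, stride, and is_feature() classifier are hypothetical placeholders rather than the claimed hardware. Each step retains the overlapping portion of the prior working data (the reuse data) and fetches only the newly exposed columns (the new data) from the image.

```c
/*
 * Minimal sketch of sliding-window feature detection with data reuse,
 * under assumed sizes. Each window step keeps the overlapping columns
 * of the previous window ("reuse data") in a local buffer and fetches
 * only the newly exposed columns ("new data") from the image.
 */
#include <string.h>

#define WIN_W 16 /* assumed window width  */
#define WIN_H 16 /* assumed window height */
#define STEP   4 /* assumed horizontal stride; reuse = WIN_W - STEP */

/* Hypothetical placeholder classifier: a real detector (e.g., a corner
   test) would examine the pixel's neighborhood within the window. */
static int is_feature(const unsigned char win[WIN_H][WIN_W], int x, int y)
{
    return win[y][x] > 200; /* stand-in test only */
}

void detect_row(const unsigned char *image, int img_w, int row0,
                void (*emit)(int x, int y))
{
    unsigned char win[WIN_H][WIN_W];
    int first = 1;

    /* First sliding window: fill entirely with new image data. */
    for (int y = 0; y < WIN_H; y++)
        memcpy(win[y], &image[(size_t)(row0 + y) * img_w], WIN_W);

    for (int x0 = 0; x0 + WIN_W <= img_w; x0 += STEP) {
        /* Classify candidate pixels; after the first window, only the
           newly fetched columns remain to be classified. */
        int xstart = first ? 0 : WIN_W - STEP;
        first = 0;
        for (int y = 0; y < WIN_H; y++)
            for (int x = xstart; x < WIN_W; x++)
                if (is_feature(win, x, y))
                    emit(x0 + x, row0 + y);

        /* Form the next window: shift the reuse data left, then fetch
           only STEP new columns from the image. */
        if (x0 + STEP + WIN_W <= img_w) {
            for (int y = 0; y < WIN_H; y++) {
                memmove(win[y], win[y] + STEP, WIN_W - STEP);
                memcpy(win[y] + WIN_W - STEP,
                       &image[(size_t)(row0 + y) * img_w + x0 + WIN_W],
                       STEP);
            }
        }
    }
}
```

Under these assumed sizes, per-window image traffic drops from WIN_W × WIN_H pixels to STEP × WIN_H pixels, which is the I/O saving that motivates the reuse-data arrangement.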
Description:
FIELD
[001] The present disclosure generally relates to feature detection and tracking technologies and, in particular, to feature detection and tracking technologies that are useful for computer vision applications such as Simultaneous Localization and Mapping (SLAM). Methods, devices, and systems utilizing such technologies are also described.
BACKGROUND
[002] Simultaneous Localization and Mapping (SLAM) is a computer vision task concerned with the computational problem of constructing and/or updating a map of an environment while also keeping track of an agent/platform within that map. A wide variety of SLAM algorithms are known and are often used to facilitate computer vision on various platforms such as automated robots, self-driving vehicles, virtual reality (VR) headsets, augmented reality (AR) headsets, and the like. Many SLAM algorithms are tailored to the resources available on the platform on which they are implemented. For example, a visual SLAM algorithm may be configured to utilize image data provided by one or more cameras on a platform to determine a three-dimensional (3D) map of the environment surrounding the platform, as well as the position (pose) of the camera within that map. In such instances, the map of the environment and the 3D position of the platform and/or a camera may be estimated by analyzing a temporal sequence of images provided by the camera, e.g., as the platform and/or the camera moves.
[003] Feature (e.g., corner) detection is often an initial image processing step in many visual SLAM algorithms. A large number of feature (e.g., corner) detectors have therefore been developed, though practical implementation of such detectors remains challenging in some applications. For example, some feature detectors for visual SLAM are configured to detect features, on a frame-by-frame basis, in a 30 frames per second (30 FPS) video graphics array (VGA) image stream provided by a camera. When such feature detectors perform a pixel-by-pixel determination as to whether any features (e.g., corners) are present in each frame, large quantities of compute cycles, input/output (I/O) operations, electric power, etc. may be consumed; a VGA frame contains 640 × 480 = 307,200 pixels, so a 30 FPS stream implies roughly 9.2 million candidate-pixel evaluations per second. Indeed, despite enormous increases in computing power over time, many existing feature detectors can still consume much or even all of the processing bandwidth of a processor. Implementation of feature detectors for visual SLAM in software (e.g., on a general purpose processor) may also be too slow for latency-sensitive applications, such as but not limited to VR, AR, and/or real-time feature detection/tracking applications.
[004] Similar challenges exist with regard to other aspects of visual SLAM. For example, in addition to one or more feature detectors, some systems for implementing visual SLAM may include one or more feature trackers to track the position of detected features in image data. Like the feature detectors noted above, many feature tracking techniques are computationally expensive, consume significant I/O operations, and/or consume significant electrical power.
[005] Implementation of feature detection and/or feature tracking operations for visual SLAM therefore remains challenging in some applications. This is particularly true with regard to implementing visual SLAM on platforms with limited computing and/or power resources (e.g., mobile platforms such as smart phones, robots, laptop computers, tablet computers, etc.), and/or in applications that are latency sensitive (e.g., AR, VR, real-time detection and/or tracking, etc.).