
Three Dimensional Reconstruction In Mobile Devices

Abstract: Disclosed is a system (102) and method for generating a 3D structure. The system (102) of the present disclosure comprises an image capturing device (108) for capturing images of an object. The system (102) further comprises an accelerometer (110) for determining acceleration data, and a gyroscope (112) and a magnetometer (114) for determining rotation data. The system (102) may determine positions of the images. Further, the system (102) may determine a fundamental matrix using the acceleration data, the positions of the images, and an internal calibration matrix. The system (102) extracts features of the images for establishing point correspondences between the images. Further, the point correspondences are filtered using the fundamental matrix to generate filtered point correspondences. The filtered point correspondences may be triangulated for determining 3D points representing the 3D structure.


Patent Information

Application #: 94/MUM/2015
Filing Date: 09 January 2015
Publication Number: 29/2016
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status: Granted
Email: ip@legasis.in
Parent Application:
Patent Number:
Legal Status:
Grant Date: 01 March 2022
Renewal Date:

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

Inventors

1. MALLIK, Apurbaa
Tata Consultancy Services Limited, Building 1B, Ecospace Plot - IIF/12, New Town, Rajarhat, Kolkata - 700156, West Bengal, India
2. BHOWMICK, Brojeshwar
Tata Consultancy Services Limited, Building 1B, Ecospace Plot - IIF/12, New Town, Rajarhat, Kolkata - 700156, West Bengal, India

Specification

FORM 2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENT RULES, 2003
COMPLETE SPECIFICATION (See Section 10 and Rule 13)

Title of invention: SYSTEM AND METHOD FOR GENERATING A 3D STRUCTURE

Applicant: Tata Consultancy Services Limited, a company incorporated in India under the Companies Act, 1956, having address: Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

The following specification particularly describes the invention and the manner in which it is to be performed.

CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY

The present application claims priority from Indian patent application 94/MUM/2015, filed on 09th January, 2015, the entirety of which is enclosed for reference.

TECHNICAL FIELD

The present subject matter described herein relates, in general, to generating a 3D structure, and more particularly to generating a 3D structure of an object in real time.

BACKGROUND

A 3D structure may be generated using several known techniques. Most of the known techniques rely on computationally heavy algorithms to determine extrinsic camera parameters from images captured by a camera. Thus, a device used for generating the 3D structure requires a Graphics Processing Unit (GPU) for processing the images in order to determine the extrinsic camera parameters. Sensors present within the device, such as an accelerometer, a gyroscope, and a magnetometer, may be used for determining parameters used while generating the 3D structure. However, the values determined by these sensors may be erroneous for several reasons, among them effects of the environment, magnetic fields, and temperature.

SUMMARY

This summary is provided to introduce aspects related to systems and methods for generating a 3D structure; the aspects are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.

In one implementation, a method for generating a three-dimensional (3D) structure is disclosed. The method comprises capturing images of an object. The method further comprises determining acceleration data of the images of the object simultaneously while the images of the object are captured. The acceleration data may comprise directional components and magnitudes of accelerations of the images of the object. The method further comprises filtering the acceleration data by using a median filtering technique and an extended Kalman filter on the acceleration data to generate filtered acceleration data. The method further comprises capturing rotation data of the images simultaneously while the images of the object are captured. The method further comprises determining refined rotation data by using a Kalman filter and a rotation averaging technique on the rotation data. The method further comprises determining positions of the image capturing device while capturing the images by using the filtered acceleration data. The method further comprises determining optimized positions of the image capturing device by optimizing the positions of the image capturing device. The positions of the image capturing device may be optimized using a Pseudo-Huber loss function. The method further comprises computing a fundamental matrix using the refined rotation data, the optimized positions of the image capturing device, and an internal calibration matrix. The method further comprises extracting features of the images. The method further comprises establishing point correspondences amongst the images. The point correspondences may be established using the features of the images. The method further comprises generating filtered point correspondences by filtering the point correspondences using the fundamental matrix. The method further comprises generating a 3D structure using the filtered point correspondences. The filtered point correspondences may be triangulated to generate a 3D structure of the images.
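The record does not spell out the algebra behind the fundamental-matrix step. In standard epipolar geometry, a fundamental matrix can be assembled from a relative rotation R, a relative translation t, and an internal calibration matrix K as F = K^(-T) [t]x R K^(-1). The sketch below is a minimal illustration under that assumption, taking the same K for both views (a single mobile camera); all numeric values are invented for the example.

```python
import numpy as np

def skew(t):
    """Skew-symmetric cross-product matrix [t]x of a 3-vector t."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def fundamental_matrix(K, R, t):
    """Assemble F = K^(-T) [t]x R K^(-1) from a refined rotation R, an
    optimized relative position t, and an internal calibration matrix K."""
    K_inv = np.linalg.inv(K)
    E = skew(t) @ R                      # essential matrix E = [t]x R
    F = K_inv.T @ E @ K_inv
    return F / np.linalg.norm(F)         # scale-normalize (F is up to scale)

# Invented example values: a rough 1080p mobile camera and a small motion.
K = np.array([[1500.0, 0.0, 960.0],
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                            # refined relative rotation
t = np.array([0.10, 0.00, 0.02])         # optimized relative position
F = fundamental_matrix(K, R, t)          # x2^T F x1 = 0 for true matches
```

Because F here comes from the sensor-derived rotation and position rather than from the image matches themselves, it can be used to check candidate correspondences, which is the filtering role it plays in the steps above.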
In one implementation, a system for generating a 3D structure is disclosed. The system comprises an image capturing device, an accelerometer, a gyroscope, and a magnetometer. Further, the system comprises a processor, and a memory coupled to the processor for executing programmed instructions stored in the memory. The image capturing device captures images of an object. The accelerometer may determine acceleration data of the images of the object simultaneously while the images of the object are captured. The acceleration data may comprise directional components and magnitudes of accelerations of the images of the object. The processor filters the acceleration data by using a median filtering technique and an extended Kalman filter on the acceleration data to generate filtered acceleration data. The gyroscope and the magnetometer capture rotation data of the images simultaneously while the images of the object are captured. The processor determines refined rotation data by using a Kalman filter and a rotation averaging technique on the rotation data. The processor further determines positions of the image capturing device while capturing the images by using the filtered acceleration data. The processor further determines optimized positions of the image capturing device by optimizing the positions of the image capturing device. The positions of the image capturing device may be optimized using a Pseudo-Huber loss function. The processor further computes a fundamental matrix using the refined rotation data, the optimized positions of the image capturing device, and an internal calibration matrix. The processor further extracts features of the images. The processor further establishes point correspondences amongst the images. The point correspondences may be established using the features of the images. The processor further generates filtered point correspondences by filtering the point correspondences using the fundamental matrix. The processor further generates a 3D structure using the filtered point correspondences. The filtered point correspondences may be triangulated to generate a 3D structure of the images.
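The acceleration-filtering step above names a median filtering technique followed by an extended Kalman filter, but the filter models are not given on this page. In the sketch below, a sliding median plus a scalar linear Kalman smoother stands in for that pipeline on one accelerometer axis; the window size and noise parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import medfilt

def filter_acceleration(raw, window=5, q=1e-4, r=1e-2):
    """Median-filter one axis of raw accelerometer samples, then smooth
    the result with a scalar Kalman filter (a linear stand-in for the
    extended Kalman filter named in the specification)."""
    med = medfilt(raw, kernel_size=window)   # suppress impulsive spikes
    x, p = med[0], 1.0                       # state estimate and variance
    out = np.empty_like(med)
    for i, z in enumerate(med):
        p = p + q                            # predict: random-walk model
        k = p / (p + r)                      # Kalman gain
        x = x + k * (z - x)                  # correct with measurement z
        p = (1.0 - k) * p
        out[i] = x
    return out

# Invented example: 200 noisy samples along one axis, in m/s^2.
rng = np.random.default_rng(0)
raw = 0.5 + 0.05 * rng.standard_normal(200)
filtered = filter_acceleration(raw)
```

The filtered stream is what the system would double-integrate to obtain the positions of the image capturing device, before optimizing those positions with the Pseudo-Huber loss.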
In one implementation, a non-transitory computer readable medium embodying a program executable in a computing device for generating a 3D structure is disclosed. The program comprises a program code for capturing images of an object. The program comprises a program code for determining acceleration data of the images of the object simultaneously while the images of the object are captured. The acceleration data may comprise directional components and magnitudes of accelerations of the images of the object. The program further comprises a program code for filtering the acceleration data by using a median filtering technique and an extended Kalman filter on the acceleration data to generate filtered acceleration data. The program further comprises a program code for capturing rotation data of the images simultaneously while the images of the object are captured. The program further comprises a program code for determining refined rotation data by using a Kalman filter and a rotation averaging technique on the rotation data. The program further comprises a program code for determining positions of the image capturing device while capturing the images by using the filtered acceleration data. The program further comprises a program code for determining optimized positions of the image capturing device by optimizing the positions of the image capturing device. The positions of the image capturing device may be optimized using a Pseudo-Huber loss function. The program further comprises a program code for computing a fundamental matrix using the refined rotation data, the optimized positions of the image capturing device, and an internal calibration matrix. The program further comprises a program code for extracting features of the images. The program further comprises a program code for establishing point correspondences amongst the images. The point correspondences may be established using the features of the images. The program further comprises a program code for generating filtered point correspondences by filtering the point correspondences using the fundamental matrix. The program further comprises a program code for generating a 3D structure using the filtered point correspondences. The filtered point correspondences may be triangulated to generate a 3D structure of the images.

BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components. Figure 1 illustrates a system for generating a 3D structure, in accordance with an embodiment of the present subject matter. Figures 2a and 2b jointly show a flowchart illustrating a method for generating a 3D structure, in accordance with an embodiment of the present subject matter.

DETAILED DESCRIPTION

Systems and methods for generating a 3D structure are described. The present subject matter discloses a mechanism for generating the 3D structure. The system 102 may comprise an image capturing device 108 for capturing images of an object. Further, the system 102 may also comprise an accelerometer 110 for determining acceleration data of the image capturing device. The system 102 may comprise a gyroscope 112 and a magnetometer 114 to determine rotation data of the images. The system 102 may filter the acceleration data and the rotation data of the image capturing device 108 to generate filtered acceleration data and refined rotation data. The system 102 may determine positions of the image capturing device 108, while capturing the images, by using the filtered acceleration data. The system 102 may determine optimized positions of the image capturing device 108 by optimizing the positions of the image capturing device 108. Subsequently, the system 102 may compute a fundamental matrix by using the refined rotation data, the optimized positions of the image capturing device 108, and an internal calibration matrix. Further, the system 102 may extract features of the images using a suitable technique.
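The specification leaves the feature technique open ("a suitable technique"). As one concrete possibility, the sketch below uses ORB keypoints and brute-force Hamming matching from OpenCV to establish tentative point correspondences between two captured views; ORB, the file names, and all parameters are illustrative choices, not ones named by the patent.

```python
import cv2

def match_features(img1, img2, max_matches=500):
    """Detect ORB keypoints in two grayscale views and establish tentative
    point correspondences via brute-force Hamming matching."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = [kp1[m.queryIdx].pt for m in matches[:max_matches]]
    pts2 = [kp2[m.trainIdx].pt for m in matches[:max_matches]]
    return pts1, pts2

# Hypothetical file names for two views of the same object.
img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
pts1, pts2 = match_features(img1, img2)
```

A binary descriptor such as ORB suits the mobile setting the patent targets, since it avoids the GPU-heavy processing criticized in the background section.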
The features of the images may be used for establishing point correspondences between the images. The point correspondences may then be filtered, by the system 102, using the fundamental matrix for generating filtered point correspondences. The filtered point correspondences may be used for generating the 3D structure using a triangulation technique. Also, the 3D structure may be optimized for minimizing reprojection errors associated with the 3D structure; a sketch of the filtering and triangulation steps follows the system description below.

While aspects of the described system and method for generating a 3D structure may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.

Referring now to Figure 1, the system 102 for generating a 3D structure by using images of an object is shown, in accordance with an embodiment of the present subject matter. Although the present subject matter is explained considering that the system 102 is implemented on a mobile device, it may be understood that the system 102 may also be implemented in a variety of computing systems, including but not limited to a smart phone, a tablet, a notepad, a personal digital assistant, a handheld device, a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, and a network server, wherein each of the devices includes an image capturing device/camera. In one embodiment, the system 102 may include at least one processor 104, an input/output (I/O) interface 106, an image capturing device 108, an accelerometer 110, a gyroscope 112, a magnetometer 114, and a memory 116. The accelerometer 110, the gyroscope 112, and the magnetometer 114 may have similar sampling rates. Further, the at least one processor 104 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 104 is configured to fetch and execute computer-readable instructions stored in the memory 116.

The I/O interface 106 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 106 may allow the system 102 to interact with a user directly. Further, the I/O interface 106 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 106 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 106 may include one or more ports for connecting a number of devices to one another or to a server.

The memory 116 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 116 may include data 118. The data 118, amongst other things, serves as a repository for storing data processed, received, and generated by the at least one processor 104.
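As noted above, a correspondence (x1, x2) agrees with the sensor-derived fundamental matrix when the epipolar residual x2^T F x1 is close to zero. Building on the two earlier sketches, the code below filters the tentative matches on that residual and triangulates the survivors into 3D points; the threshold, the pose conventions, and the use of cv2.triangulatePoints in place of whichever triangulation technique the system employs are all assumptions.

```python
import numpy as np
import cv2

def filter_and_triangulate(F, K, R, t, pts1, pts2, thresh=0.01):
    """Keep correspondences whose epipolar residual |x2^T F x1| is small,
    then triangulate the filtered correspondences into 3D points."""
    h1 = np.column_stack([pts1, np.ones(len(pts1))])   # homogeneous pixels
    h2 = np.column_stack([pts2, np.ones(len(pts2))])
    residual = np.abs(np.einsum("ij,jk,ik->i", h2, F, h1))
    keep = residual < thresh                           # filtered matches
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # first view at origin
    P2 = K @ np.hstack([R, t.reshape(3, 1)])           # second view pose
    X = cv2.triangulatePoints(P1, P2,
                              np.asarray(pts1, float)[keep].T,
                              np.asarray(pts2, float)[keep].T)
    return (X[:3] / X[3]).T                            # one 3D point per row
```

The returned points are the 3D structure, which the system may then refine by minimizing reprojection error (e.g., via bundle adjustment, a common choice the page does not name explicitly).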
In one implementation, at first, a user may capture the images of an object by using the image capturing device 108. The image capturing device 108 may also be identified as a camera. The images may be captured from different viewpoints/locations/angles in order to collect maximum image details of the object. Further, both the system 102 and the object may be stationary while the image capturing device 108 captures the images of the object. The system 102, when stationary, allows the image capturing device 108 to capture the images with a proper focus. In one embodiment, the images may be recorded in a suitable image format like Joint Photographic Experts Group (JPEG), Exchangeable image file format (Exif), Tagged Image File Format (TIFF), Raw Image Format (RAW), Graphics Interchange Format (GIF), Bitmap format (BMP), and Portable Network Graphics (PNG). In an example, the system 102 may capture the images in JPEG format. The images, when present in the JPEG format, are usually of smaller sizes and require less computation for their processing.

Simultaneously while capturing the images, the system 102 may use the accelerometer 110 for determining acceleration data of the images. The acceleration data determined by the accelerometer 110 may include a static bias before the accelerometer 110 of the system 102 is calibrated. The static bias refers to a non-zero value of the acceleration data, determined by the accelerometer 110, while the system 102 is stationary. The system 102 may compute a mean value of the static bias (b_mean) and a standard deviation of the static bias (b_std) using Equations 1 and 2 below, where the symbols have their usual meanings:

b_mean = (1/N) * Σ_{i=0..N} (a_i - g_i) ………. Equation 1

b_std = sqrt( Σ_{i=0..N} (a_i - g_i - b_mean)² / (N - 1) ) ………. Equation 2

Here, 'N' represents the total number of samples of sensor data while the system 102 is stationary, 'a_i' represents the acceleration of the system 102 for the i-th sensor data, and 'g_i' represents a gravity vector of the system 102 for the i-th sensor data.

In one embodiment, the acceleration data may comprise directional components and magnitudes of accelerations of the images. The accelerometer 110 may be a 3-axis accelerometer determining the magnitude of accelerations along an x-axis, a y-axis, and a z-axis. The 3-axis accelerometer 110 may be calibrated for eliminating the static bias present in the magnitudes of acceleration determined along the x-axis, the y-axis, and the z-axis. Thus, the accelerometer 110, upon calibration, determines the acceleration data excluding any noises and errors.

In another embodiment, the system 102 may falsely identify a movement of the image capturing device 108 while capturing the images and determining the accelerometer data, for example when the hands of a user holding the image capturing device shake. A false identification of the movement may result in an erroneous identification of the direction or position of the image capturing device 108. In order to avoid the false identification of the movement, the system 102 may determine a region of active movement of the image capturing device 108 by using the static bias and an acceleration profile. The system 102 may determine the region of active movement using a condition mentioned as below. -3 * b_std
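Equations 1 and 2 translate directly into code. Below is a minimal NumPy sketch, reading the sums as running over the N stationary samples; the sample readings are placeholders, and in practice a_i and g_i would come from the accelerometer and the device's gravity estimate while the phone is at rest.

```python
import numpy as np

def static_bias(a, g):
    """Mean (Equation 1) and standard deviation (Equation 2) of the
    accelerometer's static bias over N stationary samples.
    a, g: (N, 3) arrays of acceleration and gravity-vector readings."""
    d = a - g                                   # bias samples a_i - g_i
    n = len(d)
    b_mean = d.sum(axis=0) / n                  # Equation 1
    b_std = np.sqrt(((d - b_mean) ** 2).sum(axis=0) / (n - 1))  # Equation 2
    return b_mean, b_std

# Placeholder stationary readings; real values come from the sensors.
a = np.array([[0.02, -0.01, 9.83],
              [0.03, 0.00, 9.80],
              [0.01, -0.02, 9.82]])
g = np.array([[0.00, 0.00, 9.81]] * 3)
b_mean, b_std = static_bias(a, g)
# A band of roughly +/- 3 * b_std around b_mean can then separate rest
# from active movement (an assumed reading of the truncated condition).
```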

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 94-MUM-2015-RELEVANT DOCUMENTS [30-09-2023(online)].pdf 2023-09-30
2 94-MUM-2015-IntimationOfGrant01-03-2022.pdf 2022-03-01
3 94-MUM-2015-PatentCertificate01-03-2022.pdf 2022-03-01
4 94-MUM-2015-Written submissions and relevant documents [16-02-2022(online)].pdf 2022-02-16
5 94-MUM-2015-Response to office action [07-02-2022(online)].pdf 2022-02-07
6 94-MUM-2015-Correspondence to notify the Controller [28-01-2022(online)].pdf 2022-01-28
7 94-MUM-2015-FORM-26 [28-01-2022(online)].pdf 2022-01-28
8 94-MUM-2015-US(14)-HearingNotice-(HearingDate-09-02-2022).pdf 2022-01-06
9 94-MUM-2015-CLAIMS [25-06-2020(online)].pdf 2020-06-25
10 94-MUM-2015-COMPLETE SPECIFICATION [25-06-2020(online)].pdf 2020-06-25
11 94-MUM-2015-FER_SER_REPLY [25-06-2020(online)].pdf 2020-06-25
12 94-MUM-2015-OTHERS [25-06-2020(online)].pdf 2020-06-25
13 94-MUM-2015-FORM 4(ii) [25-05-2020(online)].pdf 2020-05-25
14 94-MUM-2015-FER.pdf 2019-11-25
15 94-MUM-2015-ORIGINAL UR 6(1A) FORM 26-120719.pdf 2019-11-07
16 94-MUM-2015-FORM-26 [05-07-2019(online)].pdf 2019-07-05
17 94-MUM-2015-ORIGINAL UR 6(1A) FORM 26-021118.pdf 2019-04-08
18 94-MUM-2015-FORM-26 [29-10-2018(online)].pdf 2018-10-29
19 94-MUM-2015-Form 1-050215.pdf 2018-08-11
20 94-MUM-2015-Power of Attorney-250215.pdf 2018-08-11
21 94-MUM-2015-Correspondence-050215.pdf 2018-08-11
22 94-MUM-2015-Correspondence-250215.pdf 2018-08-11
23 Form 2.pdf 2018-08-11
24 Form 5.pdf 2018-08-11
25 Figure for Abstract.jpg 2018-08-11

Search Strategy

1 D1_22-11-2019.pdf

ERegister / Renewals

3rd: 03 Mar 2022 (09/01/2017 to 09/01/2018)
4th: 03 Mar 2022 (09/01/2018 to 09/01/2019)
5th: 03 Mar 2022 (09/01/2019 to 09/01/2020)
6th: 03 Mar 2022 (09/01/2020 to 09/01/2021)
7th: 03 Mar 2022 (09/01/2021 to 09/01/2022)
8th: 03 Mar 2022 (09/01/2022 to 09/01/2023)
9th: 03 Jan 2023 (09/01/2023 to 09/01/2024)
10th: 09 Jan 2024 (09/01/2024 to 09/01/2025)
11th: 09 Jan 2025 (09/01/2025 to 09/01/2026)