
System And Method For Augmented Reality Inspection And Data Visualization

Abstract: A 3D tracking system is provided. The 3D tracking system includes at least one acoustic emission sensor disposed around an object. The acoustic emission sensor is configured to identify a location of a probe inserted into the object based upon time of arrival of an acoustic signature emitted from a location on or near the probe. The 3D tracking system also includes a first sensor configured to detect an elevation of the probe, and a second sensor configured to detect an azimuth of the probe.


Patent Information

Application #: 3014/CHE/2007
Filing Date: 18 December 2007
Publication Number: 37/2009
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

GENERAL ELECTRIC COMPANY
1 RIVER ROAD, SCHENECTADY, NEW YORK 12345, USA

Inventors

1. SENGUPTA, ANANDRAJ
APT 802-B, KNIGHTSBRIDGE APTS, ITPL MAIN ROAD, BROOKFIELDS, BANGALORE 560037, KARNATAKA, INDIA
2. KUMAR, VINOD PADMANABHAN
FLAT 118, PRESTIGE LANGLEIGH, PHASE 1, ECC ROAD, WHITEFIELD, BANGALORE 560066, KARNATAKA, INDIA
3. GORAVAR, SHIVAPPA NINGAPPA
17/121, HIREBAN, LAXMESHWAR, KARNATAKA 582116, INDIA

Specification

SYSTEM AND METHOD FOR AUGMENTED
REALITY INSPECTION AND DATA
VISUALIZATION
BACKGROUND
[0001] The invention relates generally to non-destructive inspection
techniques and, more particularly, to inspection techniques employing augmented reality.
[0002] Inspection techniques are commonly used in a variety of
applications, ranging from the aircraft and health industries to security. Inspection of complex parts and structures generally requires immense inspector skill and experience. Borescope inspection is one of the most commonly used sources of information for monitoring industrial infrastructure, owing to easy access to in-service parts and reduced downtime. Condition-based maintenance strategies for gas turbines and related systems rely heavily on data obtained from such inspection. Generally, probes that use long cables with display pendants have been employed for borescope inspection. However, once the probe is inserted into a borescope inspection hole, minimal information about the location and pose of the tip of the borescope is available to the operator. Tracking the location and pose reduces measurement error and is critical to accurately locating observed flaws and damage. Moreover, during tip-change scenarios, it is almost impossible to return the tip to the same location.
[0003] Thus, much of the inspection is dependent on operator skill and is
subjective. Accurate information about the borescope tip location and pose also enables automation and control of the entire inspection process, from inspection planning through guidance to damage reporting.

[0004] Therefore, a need exists for an improved inspection system that addresses the problems set forth above.
BRIEF DESCRIPTION
[0005] In accordance with an embodiment of the invention, a 3D tracking
system is provided. The 3D tracking system includes at least two acoustic emission sensors disposed around an object. The acoustic emission sensors are configured to identify a location of a probe inserted into the object based upon time of arrival of an acoustic signature emitted from a location on or near the probe. The 3D tracking system also includes a first sensor configured to detect an elevation of the probe. The 3D tracking system further includes a second sensor configured to detect an azimuth of the probe.
[0006] In accordance with another embodiment of the invention, an
augmented reality system for inspection within an object is provided. The augmented reality system includes a tracking system configured to identify a 3D location of a probe inserted into the object. The tracking system includes at least one acoustic emission sensor disposed around the object, the acoustic emission sensor configured to identify a location of the probe based upon time of arrival of an acoustic signature emitted from a location on or near the probe. The tracking system also includes a first sensor configured to detect an elevation of the probe and a second sensor configured to detect an azimuth of the probe. The augmented reality system also includes a camera configured to capture an image of the object, a microprocessor configured to generate graphics and superimpose the graphics on the image captured by the camera based upon the 3D location identified by the tracking system, and a display unit configured to display an augmented reality image.
[0007] In accordance with another embodiment of the invention, a
method of 3D tracking within an object is provided. The method includes inserting a probe into the object. The method also includes disposing at least one acoustic emission sensor around the object. The method further includes attaching a first sensor and a second sensor to the probe.
[0008] In accordance with another embodiment of the invention, a
method for forming an augmented reality image for inspection within an object is provided. The method includes capturing an image via a camera. The method also includes identifying a location of a probe within the object via a plurality of acoustic emission sensors. The method further includes determining an elevation of the probe via a first sensor and an azimuth of the probe via a second sensor. The method also includes generating graphics of the object and registering the graphics on the captured image, based upon the determined location, elevation, and azimuth, to form an augmented reality image.
DRAWINGS
[0009] These and other features, aspects, and advantages of the present
invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0010] FIG. 1 is a block diagram representation of an augmented reality
image system including a tracking system in accordance with an embodiment of the invention;

[0011] FIG. 2 is a block diagram representation of elements within the
tracking system in FIG. 1;
[0012] FIG. 3 is a diagrammatic illustration of an exemplary borescope in
accordance with an embodiment of the invention;
[0013] FIG. 4 is a schematic illustration of an augmented reality image
formed for inspection of a gas turbine using the borescope in FIG. 3;
[0014] FIG. 5 is a schematic illustration of an exemplary display unit in
accordance with an embodiment of the invention;
[0015] FIG. 6 is a flow chart representing steps in an exemplary method
for 3D tracking within an object; and
[0016] FIG. 7 is a flow chart representing steps in an exemplary method
for forming an augmented reality image for inspection within an object.
DETAILED DESCRIPTION
[0017] As discussed in detail below, embodiments of the invention
include a system and method for non-destructive inspection of an object. The system and method disclosed herein generate an augmented reality image using an improved tracking system for inspection. As used herein, 'augmented reality image' refers to an image that includes real world data superimposed with computer generated data. Non-limiting examples of the object include aircraft engines, gas turbines, steam turbines, diesel engines and a living organism.
[0018] Turning to the drawings, FIG. 1 is a high-level block diagram
representation of an augmented reality system 10 to inspect an object 12. A tracking system 14 is employed to identify a 3D location and position of a probe 13 inserted into the object 12. In a particular embodiment, the probe 13 includes a borescope or an endoscope. A camera 18 captures a view of the object 12 as a real image. In a particular embodiment, the camera 18 captures a monocular view. In another embodiment, the camera 18 captures a stereoscopic view. Non-limiting examples of the camera 18 include a web camera, a video camera, or a CCD camera. In another embodiment, more than one camera may be used; for example, two cameras could be arranged so as to provide stereoscopic images. Non-limiting examples of the real image include a video image and a still image.
[0019] The captured real image is used as a reference by a microprocessor
20 that is configured to generate graphics corresponding to the real image. In an example, the graphics include computer-aided design drawings of the object 12. The microprocessor 20 further superimposes the graphics on the real image based upon the 3D location identified by the tracking system 14 to generate an augmented reality image. Thus, the view obtained from the camera 18 is augmented with additional information and provided to a user in real time. The additional information may include, for example, text, audio, video, and still images. For example, in a surgical workspace, a surgeon may be provided with a composite view including, inter alia, the real view of the patient and an overlay generated by the microprocessor 20. The overlay may include a view of the patient's internal anatomical structures as determined, for example, during a Computerized Axial Tomography (CAT) scan or by Magnetic Resonance Imaging (MRI).
[0020] In another embodiment, the overlay includes a textual view of the
patient's medical and family history. The overlays may be displayed in real time. The augmented reality image includes the real image captured by the camera 18 overlaid with an additional virtual view. The virtual view is derived from the microprocessor 20 and stored information, for example, images. The augmented reality image also enables detection of flaws or cracks in the object 12. In an exemplary embodiment, the microprocessor 20 includes a wearable computer. The microprocessor 20 displays an augmented reality image on a display unit 22 such as, but not limited to, a personal digital assistant (PDA), a pendant, an external computer, or semi-transparent goggles.
[0021] It should be noted that embodiments of the invention are not
limited to any particular microprocessor for performing the processing tasks of the invention. The term "microprocessor," as used herein, is intended to denote any machine capable of performing the calculations or computations necessary to perform the tasks of the invention, and any machine that is capable of accepting a structured input and processing the input in accordance with prescribed rules to produce an output.
[0022] FIG. 2 is a block diagram representation of elements 30 within the
tracking system 14 in FIG. 1. The tracking system 14 includes at least one acoustic emission sensor 32 disposed around the object 12 (FIG. 1). The acoustic emission sensor 32 is configured to identify a location of the probe 13 (FIG. 1) inserted into the object 12 for inspection. In a particular embodiment, the acoustic emission sensor has a diameter in a range between about 6 mm and about 12 mm. The location is determined based upon calculation of a time of arrival of an acoustic signature emitted from within the object 12. In a particular embodiment, the acoustic signature is emitted via a single speaker or multiple speakers disposed at a tip of the probe 13. A first sensor 34 detects an elevation of the probe 13. In one embodiment, the first sensor 34 includes a gravity sensor implemented in integrated micro-electro-mechanical systems (MEMS) technology and configured to detect the elevation based upon acceleration due to gravity.
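The specification does not spell out the localization math, but time-of-arrival positioning of this kind is commonly posed as a small least-squares problem. The sketch below is a minimal illustration under assumed conditions: known sensor positions, a single effective speed of sound, and time differences referenced to one sensor so that the unknown emission time cancels. All names and the sensor layout are hypothetical, not taken from the patent.

```python
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s; assumed effective propagation speed

def locate_tip(sensor_xyz, arrival_times):
    """Estimate the acoustic source (probe tip) position from times of
    arrival at several acoustic emission sensors around the object."""
    sensor_xyz = np.asarray(sensor_xyz, dtype=float)
    arrival_times = np.asarray(arrival_times, dtype=float)

    def residuals(p):
        ranges = np.linalg.norm(sensor_xyz - p, axis=1)
        # Differencing against sensor 0 cancels the unknown emission time.
        dt_model = (ranges - ranges[0]) / SPEED_OF_SOUND
        dt_meas = arrival_times - arrival_times[0]
        return dt_model - dt_meas

    x0 = sensor_xyz.mean(axis=0)  # start at the sensor centroid
    return least_squares(residuals, x0).x

# Synthetic check: four sensors on a casing, ping emitted at (0.2, 0.1, 0.3) m.
sensors = np.array([(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)], dtype=float)
true_p = np.array([0.2, 0.1, 0.3])
times = np.linalg.norm(sensors - true_p, axis=1) / SPEED_OF_SOUND
print(locate_tip(sensors, times))  # approximately [0.2 0.1 0.3]
```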

[0023] The tracking system 14 further includes a second sensor 36
configured to detect an azimuth of the probe 13. In an exemplary embodiment, the second sensor 36 includes a magnetic sensor, such as, but not limited to, a magnetic compass, configured to detect the azimuth in the presence of a magnetic field. In an example, a solenoid is employed to apply a magnetic field to the magnetic sensor. In yet another embodiment, the second sensor 36 is a gyroscope that detects angular rotation rate along three orthogonal axes.
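As a rough illustration of the two azimuth options named above, the sketch below shows the usual atan2-based heading from a two-axis magnetic reading (sensor assumed level) and a simple integration of gyroscope angular rate about the vertical axis. Axis conventions and function names are assumptions, not from the specification.

```python
import math

def azimuth_from_magnetometer(mx, my):
    # Heading relative to magnetic north; assumes the sensor is level
    # and a conventional axis orientation.
    return math.degrees(math.atan2(my, mx)) % 360.0

def azimuth_from_gyro(rates_dps, dt, azimuth0=0.0):
    # Integrate angular rate (degrees/second) about the vertical axis.
    az = azimuth0
    for rate in rates_dps:
        az = (az + rate * dt) % 360.0
    return az

print(azimuth_from_magnetometer(1.0, 1.0))    # 45.0
print(azimuth_from_gyro([10.0] * 5, dt=0.1))  # 5.0 degrees after 0.5 s
```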
[0024] FIG. 3 is a schematic illustration of an exemplary probe 13 (FIG.
1), such as a borescope 50, employed for inspection of an object such as, but not limited to, a gas turbine. The borescope 50 includes a first sensor 52 and a second sensor 54 disposed near a tip 56 of the borescope 50. Multiple miniature speakers 58 are also disposed near the tip 56. It will be appreciated that although multiple speakers 58 have been illustrated in FIG. 3, a single speaker may also be employed. The speakers 58 emit acoustic signals at various locations within the object that are captured by the acoustic emission sensor 32 (FIG. 2). In the illustrated embodiment, the first sensor 52 is a MEMS gravity sensor. The MEMS gravity sensor 52 measures acceleration/gravity along a sensitive axis having direction 60. In one embodiment, when the borescope 50 is aligned vertically downward towards the earth in a direction 62, the sensitive axis 60 points vertically downward, resulting in a measured value of acceleration/gravity equal to 1. In another embodiment, when the borescope 50 is tilted with respect to the vertical, the sensitive axis of the MEMS gravity sensor 52 is tilted with respect to the vertical, and the MEMS gravity sensor 52 measures a value of acceleration/gravity less than 1. Accordingly, the output of the MEMS gravity sensor detects an elevation of the borescope 50. In a particular embodiment, the MEMS gravity sensor has a diameter in a range between about 1 mm and about 4 mm. The second sensor 54 may be an angular rate gyroscope implemented in integrated MEMS technology. The gyroscope senses a change in rotation of the borescope 50 and accordingly detects an azimuth of the borescope 50. In one embodiment, the second sensor 54 has a diameter in a range between about 2 mm and about 8 mm.
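Paragraph [0024] implies that the gravity sensor reading along its sensitive axis behaves as the cosine of the tilt from vertical: 1 g when the borescope points straight down, less than 1 g when tilted. A minimal sketch of that relationship; the clamping and naming are our assumptions.

```python
import math

def elevation_from_gravity(a_measured_g):
    """Tilt of the sensitive axis from the vertical, in degrees, given the
    acceleration measured along that axis in units of g."""
    a = max(-1.0, min(1.0, a_measured_g))  # clamp sensor noise to [-1, 1]
    return math.degrees(math.acos(a))

print(elevation_from_gravity(1.0))  # 0.0  -> pointing straight down
print(elevation_from_gravity(0.5))  # 60.0 -> tilted 60 degrees from vertical
```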
[0025] FIG. 4 is a schematic illustration of an augmented reality image 70
including real-world data, such as a gas turbine 72, and computer-generated data, such as blades 74 interior to the gas turbine 72. The augmented reality image 70 is obtained by superimposing generated graphics of the blades 74 on an image of the gas turbine, including its casing, captured by the camera 18 (FIG. 1). In a particular embodiment, the image includes a 2D image. The microprocessor 20 (FIG. 1) stores and registers information from the captured image of the gas turbine and generates graphics based upon the 3D location obtained from the tracking system 14 (FIG. 1). The microprocessor 20 contains the software necessary to generate a graphical representation and an augmented reality image based upon the image from the camera 18 and the generated graphical representation. Further, the microprocessor 20 contains a storage medium in order to save and restore previously saved information.
[0026] In order to overlay an image, the position and orientation of the
camera 18 with respect to the gas turbine 72 must be determined. As a result, it is desirable to know the relationship between two coordinate systems: a camera coordinate system (not shown) attached to the camera 18, and a coordinate system 78 attached to the gas turbine 72. Tracking denotes the process of monitoring the relationship between these coordinate systems. The microprocessor 20 (FIG. 1) registers the 3D location obtained from the tracking system 14 in a reference frame having the coordinate system 78 of the gas turbine 72.
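The patent does not give the form of the relationship between the two coordinate systems. A common formulation, assumed here, is a rigid rotation R and translation t from the object (gas turbine) frame to the camera frame, followed by a pinhole projection using the internal camera parameters determined in the calibration procedure mentioned in [0028]. A sketch:

```python
import numpy as np

def project_to_image(p_obj, R, t, fx, fy, cx, cy):
    """Map a 3D point in the gas-turbine coordinate system 78 into pixel
    coordinates of the camera image (pinhole model assumed)."""
    p_cam = R @ np.asarray(p_obj, dtype=float) + t  # object -> camera frame
    u = fx * p_cam[0] / p_cam[2] + cx               # perspective division
    v = fy * p_cam[1] / p_cam[2] + cy
    return u, v

# Illustrative use: identity pose, point 2 m in front of the camera.
R, t = np.eye(3), np.zeros(3)
print(project_to_image([0.1, 0.0, 2.0], R, t,
                       fx=800, fy=800, cx=320, cy=240))  # (360.0, 240.0)
```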
[0027] FIG. 5 is a schematic illustration of an exemplary display unit 100.
The display unit 100 includes a handheld display commercially available from General Electric Inspection Technologies under the designation Everest XLG3. The display unit 100 displays an image of blades 102 in an interior of the gas turbine 72 (FIG. 4), superimposed with information 104 generated by the microprocessor 20. Some examples of the information include a serial number of a blade, time of operation, and identification of a crack. The display unit 100 also includes navigation buttons 106 to select and edit the display.
[0028] As illustrated, a real view and a virtual view are blended. For
example, the virtual view is provided as a transparency over the real view of the gas turbine 72. Registration aligns the real and virtual views, and accounts for, inter alia, position, orientation, scale, perspective, and the internal camera parameters of each camera. Preferably, internal camera parameters such as, but not limited to, magnification are determined in a prior camera calibration procedure. The registered virtual view is aligned with the real image of the gas turbine 72 in real time. In a particular embodiment, an operator carries the display unit 100, which provides an augmented reality view of the gas turbine 72. In an exemplary embodiment, the display unit 100 is of a "video see-through" type.
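One simple way to render the transparency described above is alpha blending of the registered virtual view over the real camera frame. The blend weight and function below are illustrative assumptions, not taken from the specification.

```python
import numpy as np

def blend(real_rgb, virtual_rgb, alpha=0.4):
    """Overlay the registered virtual view on the real view as a
    transparency; both inputs are same-shaped uint8 RGB images."""
    real = np.asarray(real_rgb, dtype=float)
    virtual = np.asarray(virtual_rgb, dtype=float)
    out = (1.0 - alpha) * real + alpha * virtual
    return out.astype(np.uint8)
```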
[0029] "Video see-through' generates and presents an augmented reality
world at a handheld display device such as display unit 100. The camera integrated with the display device is used to capture a live video stream of the real world. The camera 18 (FIG. 1) is located in relation with the display unit 100 in such a way that it provides the same view, as an user would get by looking "through" the display device. The live video stream combined with computer-generated graphics is presented in real-time at the display unit 100. Additional functionality includes camera zooming with output of the actual camera focal length. This will enable an accurate display of the computer-generated graphics correctly while zooming.

[0030] FIG. 6 is a flow chart representing steps in an exemplary method
120 for 3D tracking within an object. The method 120 includes inserting a probe into the object in step 122. One or more acoustic emission sensors are disposed around the object in step 124. A first sensor and a second sensor are further attached to the probe in step 126. In a particular embodiment, the first sensor and the second sensor are attached at a tip of the probe. In another embodiment, a single speaker or multiple speakers are disposed at the tip of the probe.
[0031] FIG. 7 is a flow chart representing steps in an exemplary method
140 for forming an augmented reality image for inspection within an object. The method 140 includes capturing an image via a camera in step 142. A location of a probe within the object is identified in step 144 via one or more acoustic emission sensors. In a particular embodiment, the location is identified by calculating a time of travel of an acoustic signal emitted from the tip of the probe to the acoustic sensors. An elevation of the probe is determined via a first sensor in step 146. An azimuth of the probe is further determined via a second sensor in step 148. In an exemplary embodiment, the azimuth of the probe is determined by applying a magnetic field to the second sensor. Graphics of the object are generated in step 150. The graphics are registered on the captured image, based upon the determined location, elevation, and azimuth, to form an augmented reality image in step 152.
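Tying the steps of method 140 together, the following hypothetical end-to-end sketch reuses the illustrative helpers from the earlier sketches; render_graphics stands in for whatever CAD rendering the microprocessor performs and is not defined by the patent.

```python
def form_augmented_image(frame, sensor_xyz, arrival_times,
                         a_axis_g, mag_xy, render_graphics):
    """Steps 144-152 of method 140, sketched with the helpers above."""
    location = locate_tip(sensor_xyz, arrival_times)          # step 144
    elevation = elevation_from_gravity(a_axis_g)              # step 146
    azimuth = azimuth_from_magnetometer(*mag_xy)              # step 148
    graphics = render_graphics(location, elevation, azimuth)  # step 150: image
    return blend(frame, graphics)                             # step 152
```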
[0032] The various embodiments of an augmented reality system and
method described above thus provide a convenient and efficient means for inspection. The system and method also provide for guided and enhanced in-situ inspection, repair, and foreign-debris removal. Further, they provide a lower risk of forced outage due to improved damage reporting.
[0033] It is to be understood that not necessarily all such objects or
advantages described above may be achieved in accordance with any particular embodiment. Thus, for example, those skilled in the art will recognize that the systems and techniques described herein may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other objects or advantages as may be taught or suggested herein.
[0034] Furthermore, the skilled artisan will recognize the
interchangeability of various features from different embodiments. For example, the use of a web camera with respect to one embodiment can be adapted for use with a pendant as a display unit described with respect to another. Similarly, the various features described, as well as other known equivalents for each feature, can be mixed and matched by one of ordinary skill in this art to construct additional systems and techniques in accordance with principles of this disclosure.
[0035] While the invention has been described in detail in connection
with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.

WE CLAIM:
1. A 3D tracking system comprising:
at least one acoustic emission sensor disposed around an object, said at least one acoustic emission sensor configured to identify a location of a probe inserted into the object based upon time of arrival of an acoustic signature emitted from a location on or near the probe;
a first sensor configured to detect an elevation of the probe; and
a second sensor configured to detect an azimuth of the probe.
2. The 3D tracking system of claim 1, wherein the acoustic signature is emitted via a single speaker or a plurality of speakers disposed at a tip of the probe.
3. The 3D tracking system of claim 1, wherein the first sensor comprises a micro-electro-mechanical systems gravity sensor configured to detect the elevation based upon acceleration due to gravity.
4. The 3D tracking system of claim 1, wherein at least one of the first sensor and the second sensor is disposed near a tip of the probe.
5. The system of claim 1, wherein the second sensor comprises a micro-electro-mechanical systems magnetic sensor configured to detect the azimuth in the presence of a magnetic field.

6. The system of claim 1, wherein the second sensor comprises at least one of a magnetic compass or a gyroscope.
7. The system of claim 1, further comprising a solenoid to apply a magnetic field to the second sensor.
8. An augmented reality system for inspection within an object, comprising:
a tracking system configured to identify a 3D location of a probe inserted into the object, the tracking system comprising:
at least one acoustic emission sensor disposed around the object, the acoustic emission sensor configured to identify a location of the probe based upon time of arrival of an acoustic signature emitted from a location on or near the probe;
a first sensor configured to detect an elevation of the probe; and
a second sensor configured to detect an azimuth of the probe;
a camera configured to capture an image of the object;
a microprocessor configured to generate graphics and superimpose the graphics on the image captured by the camera based upon the 3D location identified by the tracking system; and
a display unit configured to display an augmented reality image.

9. The augmented reality system of claim 8, wherein the display unit comprises a handheld display or an external computer.
10. The augmented reality system of claim 8, wherein the image captured comprises a 2D or a 3D image.
11. A method of 3D tracking within an object comprising:
inserting a probe into the object;
disposing at least one acoustic emission sensor around the object; and
attaching a first sensor and a second sensor to the probe.
12. The method of claim 11, further comprising disposing a plurality of speakers at a tip of the probe.
13. The method of claim 11, wherein the attaching comprises attaching the first sensor and the second sensor near a tip of the probe.
14. A method for forming an augmented reality image for inspection within an object comprising:
capturing an image via a camera;
identifying a location of a probe within the object via a plurality of acoustic emission sensors;
determining elevation of the probe via a first sensor;
determining azimuth of the probe via a second sensor;
generating graphics of the object; and
registering the graphics on the image captured based upon the location, elevation and the azimuth determined to form an augmented reality image.
15. The method of claim 14, wherein the identifying comprises calculating a time of travel of an acoustic signal emitted from the tip of the probe to the acoustic sensors.
16. The method of claim 14, wherein the determining azimuth comprises applying a magnetic field to the second sensor.

Documents

Application Documents

# Name Date
1 3014-che-2007-form 1.pdf 2011-09-04
2 3014-che-2007-form 3.pdf 2011-09-04
3 3014-che-2007-form 26.pdf 2011-09-04
4 3014-che-2007-abstract.pdf 2011-09-04
5 3014-che-2007-claims.pdf 2011-09-04
6 3014-che-2007-description(complete).pdf 2011-09-04
7 3014-che-2007-drawings.pdf 2011-09-04
8 3014-che-2007-correspondnece-others.pdf 2011-09-04
9 3014-CHE-2007 POWER OF ATTORNEY 05-12-2011.pdf 2011-12-05
10 3014-CHE-2007 FORM-18 05-12-2011.pdf 2011-12-05
11 3014-CHE-2007 CORRESPONDENCE OTHERS 05-12-2011.pdf 2011-12-05
12 3014-CHE-2007 POWER OF ATTORNEY 09-04-2012.pdf 2012-04-09
13 3014-CHE-2007 CORRESPONDENCE OTHERS 09-04-2012.pdf 2012-04-09
14 3014-CHE-2007 FORM-3 04-10-2013.pdf 2013-10-04
15 3014-CHE-2007 OTHER PATENT DOCUMENT 04-10-2013.pdf 2013-10-04
16 3014-CHE-2007 FORM-3 07-10-2013.pdf 2013-10-07
17 3014-CHE-2007 OTHER PATENT DOCUMENT 07-10-2013.pdf 2013-10-07
18 3014-CHE-2007 CORRESPONDENCE OTHERS 07-10-2013.pdf 2013-10-07
19 3014-CHE-2007-FER.pdf 2017-10-31
20 3014-CHE-2007-DUPLICATE-FER-2017-11-03-14-29-17.pdf 2017-11-03
21 3014-CHE-2007-AbandonedLetter.pdf 2018-05-17

Search Strategy

1 PatSeer_26-10-2017.pdf