
Real Time Surface Defect Analysis And Correction In Friction Stir Welding Process By Image Processing

Abstract: The present invention discloses an image processing based system for online monitoring of weld surface defects during the friction stir welding (FSW) process, involving 3D (three dimensional) laser scanning of the weld bead surface for feedback and taking the requisite control action in real time by changing the FSW process parameters to avoid continuance of any weld defects once detected and/or overcome upcoming defects. The 3D laser scanning technology employed in the present system is modified and includes a faster and more accurate/precise 3D reconstruction technique for real-time construction of the 3D surface profile of the surface being welded by an FSW device.


Patent Information

Application #: 201831035477
Filing Date: 20 September 2018
Publication Number: 13/2020
Publication Type: INA
Invention Field: MECHANICAL ENGINEERING
Email: anjanonline@vsnl.net
Grant Date: 2024-02-26

Applicants

1. INDIAN INSTITUTE OF TECHNOLOGY, KHARAGPUR
Sponsored Research & Industrial Consultancy, Indian Institute of Technology, Kharagpur Kharagpur West Bengal India 721302

Inventors

1. Prof. Surjya Kanta Pal
Professor, Department of Mechanical Engineering, Indian Institute Of Technology, Kharagpur; Kharagpur West Bengal India 721302
2. Aaquib Reza Khan
Ex Student, Department of Electrical and Electronics Engineering, Birla Institute of Technology Mesra; Ranchi Jharkhand India
3. Ravi Ranjan
Ex Student, Department of Electronics and Communication Engineering, Birla Institute of Technology Mesra; Ranchi Jharkhand India
4. Chirag Parikh
Ex Student, Department of Chemical Engineering, Birla Institute of Technology Mesra; Ranchi Jharkhand India
5. Prof. Srikanta Pal
Professor, Department of Electronics and Communication Engineering, Birla Institute of Technology Mesra; Ranchi Jharkhand India
6. Prof. Debashish Chakravarty
Associate Professor, Department of Mining Engineering, Indian Institute Of Technology, Kharagpur; Kharagpur West Bengal India 721302
7. Abhik Maiti
Doctoral Scholar, Department of Mining Engineering, Indian Institute Of Technology, Kharagpur; Kharagpur West Bengal India 721302

Specification

Claims:

WE CLAIM:

1. A system for real-time monitoring of surface defects and on-line controlling of friction stir welding (FSW) based thereon comprising

at least one three dimensional (3D) optical scanner means for scanning of the weld bead for feedback, involving a light source and a camera for imaging and visualization of the weld surface during welding, involving 3D reconstruction of the weld surface;

3D coordinating system for generating 3D point cloud data of the weld surface for comparison with respect to a reference plane weld surface, enabling measurement of the position and depth of surface weld defects in real time;

GUI based FSW Control means for FSW process parameter control in real time for changing the process parameters to overcome the measured surface weld defects.

2. The system as claimed in claim 1, wherein the 3D coordinating system generates 3D triangulation and reconstruction based on change in dynamics and relative motion between the camera, FSW tool on the weld bead and the scanning plane.

3. The system as claimed in claim 1 or 2 including a laser based light source generating a thin laser line and the camera kept still with respect to the FSW device, for visualizing the relatively moving laser plane, with respect to the moving weld surface, over the surface during the welding, said 3D reconstruction adapted for thin laser line based reconstruction for faster and more accurate reconstruction.

4. The system as claimed in claim 1 comprising

said light source to project stripe of light contiguous to the rotating FSW tool of the FSW device on the weld surface thus welded by said rotating FSW tool;

said camera to capture images of said stripe of light shifting translationally with translational movement of the rotating FSW tool on the weld surface during welding in direction of weld motion; and

computing means to filter out occurrence of weld defects in the weld surface during said translational shifting of the rotating FSW tool on the weld surface in real time by continuously constructing the 3D surface profile of the weld surface thus welded by the rotating FSW tool from the captured images and comparing the constructed 3D surface profile with a reference weld surface;

said computing means generates required operative changes in FSW process parameters based on such filtered out weld defects and communicates with the FSW device to effect such changed FSW process parameters on the FSW device in real time to avoid continuation of the filtered out defects with the welding.

5. The system as claimed in claim 4, wherein the weld surface is placed on a background having fixed colored marker enabling tracking of the translationally shifting stripe of light in the captured images.

6. The system as claimed in claim 4 or 5, wherein the camera, which is preferably a digital camera, is selectively disposed with respect to the FSW device to have a clear view of the translational movement of the FSW tool on the weld surface during welding and of the background having the colored marker thereon for capturing the images.

7. The system as claimed in any one of the claims 4 to 6, wherein the light source preferably includes a laser source which is adapted to generate a highly coherent laser line to project a stripe of laser contiguous to the rotating FSW tool of a FSW device on the weld surface thus welded by said rotating FSW tool, highlighting minor surface variations over the weld surface.

8. The system as claimed in any one of the claims 4 to 7, wherein the computing means comprises

a master computer operatively connected to the digital camera to receive the captured images for filtering out occurrence of the weld defects and generating the required operative changes in the FSW process parameters to rectify the filtered out defects; and

a slave computer operatively connected to the master computer for receiving the changes in FSW process parameters and accordingly control the FSW device in real time to effect the changed FSW process parameters in order to avoid continuation of the filtered out weld defects with the welding.

9. The system as claimed in any one of claims 4 to 8, wherein the master computer is configured to analyze each of the captured images in real time to filter out occurrence of the weld defects including

constructing the 3D surface profile of the weld surface under the laser stripe in the image under analysis by involving intrinsic and extrinsic parameters of the camera;

generating a depth map of the weld surface in real time by comparing the 3D surface profile for each captured image with a reference flat weld surface; and

thresholding the generated depth map based on threshold depth map values using minimum and maximum depth values and filtering out the weld surface defect portions, which can be either depressed below the weld surface (like voids) or raised above it (like flash defects), enabling altering the FSW process parameters including FSW tool-spindle rpm, plunge depth and weld traversal speed accordingly.

10. The system as claimed in any one of the claims 4 to 9, wherein the master computer is configured to construct the 3D surface profile of the weld surface under the laser stripe in each of the captured images under analysis by

identifying and localizing the laser stripe in the captured image by converting the image to grayscale and thresholding the grayscale converted image to create a binary image whose white pixels correspond to points on the laser stripe;

determining 3D coordinates for each pixel point's x, y coordinate lying on the laser stripe by evaluating the optical ray vector pointing towards the pixel point from the digital camera center using camera intrinsic and extrinsic parameters, whereby the intersection of this optical ray vector with the laser plane is used to determine real world 3D coordinates of that pixel point on the weld surface based on the perspective transformation given as

s * m' = A * [R|t] * M'

or, in expanded form,

s * [u, v, 1]^T = A * [R|t] * [X, Y, Z, 1]^T, with A = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]

where:
(X, Y, Z) are the coordinates of a 3D point in the world coordinate space
(u, v) are the coordinates of the projection point in pixels
A is a camera matrix, or a matrix of intrinsic parameters
(cx, cy) is a principal point that is usually at the image center
fx, fy are the focal lengths expressed in pixel units
[R|t] is a matrix of extrinsic parameters describing the rigid motion of an object in front of a still camera.

11. The system as claimed in any one of the claims 4 to 10, wherein the master computer is configured to construct the 3D surface profile of the weld surface in an iterative manner for the captured images of the translationally shifting laser stripe including

shifting the reconstructed 3D world coordinates to encompass the pixels of the laser stripe which is being shifted translationally in the direction of weld motion in the continuously captured images of the welding, by considering all the pixels as apparently moving in the opposite weld direction to save the extra time for re-shifting already reconstructed 3D coordinates;

wherein a particular current pixel's (x, y) coordinate is apparently relatively shifted to new image coordinates (x_new, y_new), by amounts (Δx and Δy) in the x and y directions respectively, in the direction of weld motion as
(x_new, y_new) = (x + Δx, y + Δy);

wherein the Δx and Δy are determined by calculating the amount of shift in the centroid pixel coordinates of the colored marker;

wherein the optical ray vector, v, pointing towards the shifted new image coordinates (x_new, y_new) from the camera center, C, is evaluated using the camera intrinsic and extrinsic parameters and therefrom the real world 3D coordinates of the object at the (x_new, y_new) image coordinates, denoted by the point vector P_shift, are computed as follows;
wherein the unit vector, u2, is evaluated in the direction of the line segment vector pointing towards the already reconstructed point vector P from its initial point, the camera center C, by subtraction of C from P;

computing magnitudes M1, M2, M3 along the unit vectors u1, u2, u3 respectively, u1 being the unit vector along the optical ray vector v and u3 the unit vector in the direction of weld motion, where M2 is determined by
M2 = magnitude of (P - C)
which is formulated, from the triangle formed by C, P and P_shift, as:
(M1 * u1) x (M2 * u2) = (M3 * u3) x (M1 * u1)
and simplified to determine the magnitude M3 by:
M3 = magnitude of (u1 x (M2 * u2)) / magnitude of (u1 x u3);

determining the shifted 3D world coordinate for all shifted pixels by
(P_shift - P) = (M3 * u3)
implying,
P_shift = P + (M3 * u3).

12. The system as claimed in any one of the claims 4 to 11, wherein the point vector P_shift is iteratively evaluated and stored for all pixel coordinates lying along the laser stripe representing a weld surface cross-section, while a binary image is periodically updated as the weld progresses, where a value of 1 indicates that 3D reconstruction for that pixel's x, y coordinates has already been performed, to avoid any repetition of 3D reconstruction at any particular image x, y coordinates as the weld traverses.

13. The system as claimed in any one of the claims 4 to 12, wherein the master computer transmits the changed FSW process parameters upon identifying a defect on the weld surface and the slave computers are set into listening mode for acquiring the changed FSW process parameters sent from the master computer;

wherein the master computer sends two different data sequences on detecting a weld defect including
a first data which is received on the slave computer interfaced with the FSW device as an identifier to figure out whether the next data will be FSW tool-spindle rpm, plunge depth or weld traversal speed; and
a second data which is the actual value of one of the parameters and which is received on the slave computer interfaced with the FSW device to overcome upcoming defects.

14. The system as claimed in any one of the claims 4 to 13, wherein the master computer includes a memory element to store all the 3D coordinates of the weld surface in the order of their occurrence to form a 3D point cloud, visualizing the entire weld surface and validating the inception of defects and their removal after automatic change in welding parameters based on real time generated feedback.

15. A method of real-time monitoring of surface defects and controlling of friction stir welding (FSW) based thereon involving the system as claimed in any one of the claims 1 to 14, comprising:

scanning of the weld bead involving 3D laser scanner means for feedback, with a laser and a camera for imaging and visualization of the weld surface during welding, involving 3D reconstruction;

generating 3D point cloud data of the weld surface for comparison with respect to a plane weld surface, enabling measurement of the position and depth of surface weld defects in real time;

controlling FSW process parameters in real time for changing the process parameters to overcome the measured surface weld defects.

16. The method as claimed in claim 15, wherein scanning of the weld bead includes shifting the reconstructed 3D world coordinates to encompass the pixels of the laser stripe which is being shifted translationally in the direction of weld motion in the continuously captured images of the welding, by considering all the pixels as apparently moving in the opposite weld direction to save the extra time for re-shifting already reconstructed 3D coordinates.

Dated this the 20th day of September, 2018
Anjan Sen
Of Anjan Sen & Associates
(Applicants Agent)
IN/PA-199
Description:

FIELD OF THE INVENTION:
The present invention relates to improving friction stir welding (FSW). More specifically, the present invention is directed to developing a system for online monitoring and real-time controlling of the welding process parameters of the FSW process to overcome surface defects during the FSW process. The system of the present invention is low cost and adapted to be used with any FSW machine.

BACKGROUND OF THE INVENTION:
FSW is a relatively simple method of solid phase welding developed to join work-pieces by their abutting edges. In a typical FSW process, a non-consumable cylindrical tool is rotated and plunged into a joint formed by the abutting edges of the work-pieces that are to be joined until a surface of the shoulder contacts the surface of the work-pieces. The rotation of the tool develops frictional heating of the work-pieces and the tool forces plasticized work-piece material from the leading edge of the tool to the rear of the tool, while the shoulder confines the plasticized material from above and the plasticized material consolidates and cools to form a high quality weld.

The FSW process has a number of potential advantages over conventional fusion-welding processes but it is not free of surface defects. The surface defects which occur during the FSW process may degrade the weld strength. A bad quality weld is highly undesirable in industries as it indirectly increases the production cost.

Numerous destructive and non-destructive post-processing methods have been reported in the state of the art for analyzing weld quality. But these reported weld quality analyzing methods are mostly offline and adapted to detect defects in the welded product only after the welding process. The defective welds must eventually be discarded, which accounts for a huge wastage of the welding material.

Zhang et al. have used an analytic reconstruction algorithm based on the slope field of the reflected laser pattern to measure the 3D weld pool surface in real time for Gas tungsten arc welding (GTAW) [Ref: https://ieeexplore.ieee.org/document/6555692/]. But, the online monitoring output has not been used to control the welding parameters.

US 2005/0040209 A1 attempts to maintain a constant load by increasing or decreasing the travel speed. But, in this invention no defect identification algorithm has been employed to monitor and eradicate weld defects in real time during welding.

Thus, online monitoring and control of the FSW process is required for subsequently improving the weld quality, avoiding the occurrence of significant defects during the welding.

OBJECT OF THE INVENTION:

It is thus the basic object of the present invention to develop an automated system for online monitoring and real time control of the FSW process to monitor and eradicate weld defects in real time during FSW welding.

Another object of the present invention is to develop an automated system for online monitoring and real time control of the FSW process which would be adapted to analyze the surface of the weld bead being welded by the FSW process in real time and instantly detect weld defects, including the nature of the defect.

Yet another object of the present invention is to develop an automated system for online monitoring and real time control of the FSW process which would be adapted to analyze the surface of the weld bead being welded by the FSW process in real time by image processing techniques and instantly detect weld defects, including the nature of the defect.

A still further object of the present invention is to develop a system for real time surface defect analysis and correction in the FSW process by image processing which would be adapted to determine required operative changes in FSW process parameters based on filtered out weld defects and communicate with an FSW device to effect such changed FSW process parameters on the FSW device in real time to avoid continuation of the filtered out defects with the welding.

SUMMARY OF THE INVENTION:

Thus, according to the basic aspect of the present invention there is provided a system for real-time monitoring of surface defects and on-line controlling of friction stir welding (FSW) based thereon comprising

at least one three dimensional (3D) optical scanner means for scanning of the weld bead for feedback, involving a light source and a camera for imaging and visualization of the weld surface during welding, involving 3D reconstruction of the weld surface;

3D coordinating system for generating 3D point cloud data of the weld surface for comparison with respect to a reference plane weld surface, enabling measurement of the position and depth of surface weld defects in real time;

GUI based FSW Control means for FSW process parameter control in real time for changing the process parameters to overcome the measured surface weld defects.

In a preferred embodiment of the present system, the 3D coordinating system generates 3D triangulation and reconstruction based on change in dynamics and relative motion between the camera, FSW tool on the weld bead and the scanning plane.

According to an embodiment, the present system includes a laser based light source generating a thin laser line and the camera kept still with respect to the FSW device, for visualizing the relatively moving laser plane, with respect to the moving weld surface, over the surface during the welding, said 3D reconstruction adapted for thin laser line based reconstruction for faster and more accurate reconstruction.

According to a preferred embodiment, the present system for real-time monitoring of surface defects and on-line controlling of friction stir welding (FSW) based thereon comprises

said light source to project stripe of light contiguous to the rotating FSW tool of the FSW device on the weld surface thus welded by said rotating FSW tool;

said camera to capture images of said stripe of light shifting translationally with translational movement of the rotating FSW tool on the weld surface during welding in direction of weld motion; and

computing means to filter out occurrence of weld defects in the weld surface during said translational shifting of the rotating FSW tool on the weld surface in real time by continuously constructing the 3D surface profile of the weld surface thus welded by the rotating FSW tool from the captured images and comparing the constructed 3D surface profile with a reference weld surface;

said computing means generates required operative changes in FSW process parameters based on such filtered out weld defects and communicates with the FSW device to effect such changed FSW process parameters on the FSW device in real time to avoid continuation of the filtered out defects with the welding.

In an embodiment of the present system, the weld surface is placed on a background having fixed colored marker enabling tracking of the translationally shifting stripe of light in the captured images.

In an embodiment of the present system, the camera, which is preferably a digital camera, is selectively disposed with respect to the FSW device to have a clear view of the translational movement of the FSW tool on the weld surface during welding and of the background having the colored marker thereon for capturing the images.

In an embodiment of the present system, the light source preferably includes a laser source which is adapted to generate a highly coherent laser line to project a stripe of laser contiguous to the rotating FSW tool of a FSW device on the weld surface thus welded by said rotating FSW tool, highlighting minor surface variations over the weld surface.

In an embodiment of the present system, the computing means comprises

a master computer operatively connected to the digital camera to receive the captured images for filtering out occurrence of the weld defects and generating the required operative changes in the FSW process parameters to rectify the filtered out defects; and

a slave computer operatively connected to the master computer for receiving the changes in FSW process parameters and accordingly control the FSW device in real time to effect the changed FSW process parameters in order to avoid continuation of the filtered out weld defects with the welding.

In an embodiment of the present system, the master computer is configured to analyze each of the captured images in real time to filter out occurrence of the weld defects including

constructing the 3D surface profile of the weld surface under the laser stripe in the image under analysis by involving intrinsic and extrinsic parameters of the camera;

generating a depth map of the weld surface in real time by comparing the 3D surface profile for each captured image with a reference flat weld surface; and

thresholding the generated depth map based on threshold depth map values using minimum and maximum depth values and filtering out the weld surface defect portions, which can be either depressed below the weld surface (like voids) or raised above it (like flash defects), enabling altering the FSW process parameters including FSW tool-spindle rpm, plunge depth and weld traversal speed accordingly.

In an embodiment of the present system, the master computer is configured to construct the 3D surface profile of the weld surface under the laser stripe in each of the captured images under analysis by

identifying and localizing the laser stripe in the captured image by converting the image to grayscale and thresholding the grayscale converted image to create a binary image whose white pixels correspond to points on the laser stripe;

determining 3D coordinates for each pixel point's x, y coordinate lying on the laser stripe by evaluating the optical ray vector pointing towards the pixel point from the digital camera center using camera intrinsic and extrinsic parameters, whereby the intersection of this optical ray vector with the laser plane is used to determine real world 3D coordinates of that pixel point on the weld surface based on the perspective transformation given as

s * m' = A * [R|t] * M'

or, in expanded form,

s * [u, v, 1]^T = A * [R|t] * [X, Y, Z, 1]^T, with A = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]

where:
(X, Y, Z) are the coordinates of a 3D point in the world coordinate space
(u, v) are the coordinates of the projection point in pixels
A is a camera matrix, or a matrix of intrinsic parameters
(cx, cy) is a principal point that is usually at the image center
fx, fy are the focal lengths expressed in pixel units
[R|t] is a matrix of extrinsic parameters describing the rigid motion of an object in front of a still camera.

In an embodiment of the present system, the master computer is configured to construct the 3D surface profile of the weld surface in an iterative manner for the captured images of the translationally shifting laser stripe including

shifting the reconstructed 3D world coordinates to encompass the pixels of the laser stripe which is being shifted translationally in the direction of weld motion in the continuously captured images of the welding, by considering all the pixels as apparently moving in the opposite weld direction to save the extra time for re-shifting already reconstructed 3D coordinates;
wherein a particular current pixel's (x, y) coordinate is apparently relatively shifted to new image coordinates (x_new, y_new), by amounts (Δx and Δy) in the x and y directions respectively, in the direction of weld motion as
(x_new, y_new) = (x + Δx, y + Δy);
wherein the Δx and Δy are determined by calculating the amount of shift in the centroid pixel coordinates of the colored marker;
wherein the optical ray vector, v, pointing towards the shifted new image coordinates (x_new, y_new) from the camera center, C, is evaluated using the camera intrinsic and extrinsic parameters and therefrom the real world 3D coordinates of the object at the (x_new, y_new) image coordinates, denoted by the point vector P_shift, are computed as follows;
wherein the unit vector, u2, is evaluated in the direction of the line segment vector pointing towards the already reconstructed point vector P from its initial point, the camera center C, by subtraction of C from P;
computing magnitudes M1, M2, M3 along the unit vectors u1, u2, u3 respectively, u1 being the unit vector along the optical ray vector v and u3 the unit vector in the direction of weld motion, where M2 is determined by
M2 = magnitude of (P - C)
which is formulated, from the triangle formed by C, P and P_shift, as:
(M1 * u1) x (M2 * u2) = (M3 * u3) x (M1 * u1)
and simplified to determine the magnitude M3 by:
M3 = magnitude of (u1 x (M2 * u2)) / magnitude of (u1 x u3)

determining the shifted 3D world coordinate for all shifted pixels by
(P_shift - P) = (M3 * u3)
implying,
P_shift = P + (M3 * u3).

In an embodiment of the present system, the point vector P_shift is iteratively evaluated and stored for all pixel coordinates lying along the laser stripe representing a weld surface cross-section, while a binary image is periodically updated as the weld progresses, where a value of 1 indicates that 3D reconstruction for that pixel's x, y coordinates has already been performed, to avoid any repetition of 3D reconstruction at any particular image x, y coordinates as the weld traverses.

In an embodiment of the present system, the master computer transmits the changed FSW process parameters upon identifying a defect on the weld surface and the slave computers are set into listening mode for acquiring the changed FSW process parameters sent from the master computer;

wherein the master computer sends two different data sequences on detecting a weld defect including
a first data which is received on the slave computer interfaced with the FSW device as an identifier to figure out whether the next data will be FSW tool-spindle rpm, plunge depth or weld traversal speed; and
a second data which is the actual value of one of the parameters and which is received on the slave computer interfaced with the FSW device to overcome upcoming defects.

In an embodiment of the present system, the master computer includes a memory element to store all the 3D coordinates of the weld surface in the order of their occurrence to form a 3D point cloud, visualizing the entire weld surface and validating the inception of defects and their removal after automatic change in welding parameters based on real time generated feedback.

According to another aspect of the present invention there is provided a method of real-time monitoring of surface defects and controlling of friction stir welding (FSW) based thereon involving the above system, comprising:

scanning of the weld bead involving 3D laser scanner means for feedback, with a laser and a camera for imaging and visualization of the weld surface during welding, involving 3D reconstruction;

generating 3D point cloud data of the weld surface for comparison with respect to a plane weld surface, enabling measurement of the position and depth of surface weld defects in real time;

controlling FSW process parameters in real time for changing the process parameters to overcome the measured surface weld defects.

In the above method, scanning of the weld bead includes shifting the reconstructed 3D world coordinates to encompass the pixels of the laser stripe which is being shifted translationally in the direction of weld motion in the continuously captured images of the welding, by considering all the pixels as apparently moving in the opposite weld direction to save the extra time for re-shifting already reconstructed 3D coordinates.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS:
Figure 1 shows a preferred embodiment of the system for online monitoring and real-time controlling of welding process parameters of the FSW process to overcome surface defects during the FSW process in accordance with the present invention.

Figure 2a shows red line laser based highlighting of the keyhole defect during the FSW process in accordance with the preferred embodiment of the present system.

Figure 2b shows conventional light based highlighting of the keyhole defect during the FSW process in accordance with the preferred embodiment of the present system.

Figure 3 shows setup for modified 3D reconstruction of weld surface in accordance with the preferred embodiment of the present system.

Figure 4 shows pinhole camera model in accordance with online monitoring of the surface welded by the FSW process involving the present system.

Figure 5 shows checkerboard images with multiple orientations used for camera calibration in accordance with the preferred embodiment of the present system.

Figure 6 shows extrinsic parameter visualization in accordance with the preferred embodiment of the present system.

Figure 7 shows re-projection errors.

Figure 8 shows GUI for controlling the 2 Ton linear NC controlled FSW machine in accordance with the preferred embodiment of the present system.

Figure 9 shows changing of FSW parameters after the detection of defects involving the preferred embodiment of the present system.

DETAILED DESCRIPTION OF THE INVENTION WITH REFERENCE TO THE ACCOMPANYING DRAWINGS:
As stated hereinbefore, the present invention discloses an image processing based system for online monitoring of the weld surface defects during FSW process involving 3D (three dimensional) Laser scanning of weld bead surface for feedback and taking the requisite control action in real time by changing the FSW process parameters to avoid continuance of any weld defects once detected and/or overcome the upcoming defects.

The 3D laser scanning technology employed in the present invention is modified and includes a faster and more accurate/precise 3D reconstruction technique for real-time construction of the 3D surface profile of the surface being welded by an FSW device. This technology uses a thin laser line and a USB camera, both kept still with respect to the FSW device, for visualizing the relatively moving laser plane, with respect to the moving weld surface, over the surface during the welding. While the present 3D reconstruction technique can be used for thick shadow planes slowly moving over the welding surface, a higher accuracy is achieved by using a thin laser line, instead of thick shadow regions, which can even penetrate through micrometer-level depths of surface defects, highlighting even minor surface variations over the weld bead. Moreover, the reconstruction time is drastically reduced by using already known constraints like the current weld traversal speed and predefined region-of-interest pixels for only one-time reconstruction of the thin laser line, instead of the reconstruction done twice for the left and right parts of a thick shadow plane. Since the camera is kept still, intrinsic and extrinsic camera calibration parameters are evaluated, using standard sized checkerboards, for converting the reconstructed x, y & z coordinates to the 3D world coordinate system. Then 3D triangulation of the laser plane and the camera viewing plane is carried out to get 3D point cloud data of the weld surface.

Reference is now invited from the accompanying figure 1 which is a schematic representation of a preferred embodiment of the present system for online real-time monitoring and controlling of welding process parameters of the FSW process to overcome surface defects during the FSW process.

The system setup (1) as shown in Figure 1 comprises, as its primary components, a digital camera (2), a light source (3), preferably a laser stripe projector, a calibration checkerboard (6), and computing means with capable processors for executing the real-time reconstruction algorithm and generating ideal FSW process parameters.

While the rotating FSW tool (4) is translationally moving during welding, the positions of the camera (2) and the laser projector (3) are always fixed, projecting a fixed plane of laser stripe (5) at the region of interest (ROI) on the moving weld surface. The light source (3), which is preferably a laser source, projects the laser stripe (5) contiguous to the rotating FSW tool (4) on the weld surface thus welded by said rotating FSW tool (4).

The digital camera (2) is selectively disposed with respect to the FSW device, having a clear view of the translational movement of the FSW tool (4) on the weld surface during welding, to capture images of the laser stripe (5) being shifted translationally with the translational movement of the rotating FSW tool on the weld surface during welding in the direction of weld motion.

The weld surface is placed to have the calibration checkerboard (6) as a background which includes fixed colored marker enabling tracking of the translationally shifting laser stripe in the captured images.

The computing means of the present system is adapted to filter out occurrence of the weld defects in the weld surface in real time during the translational shifting of the rotating FSW tool on the weld surface by continuously constructing the 3D surface profile of the weld surface thus welded by the rotating FSW tool from the captured images and comparing the constructed 3D surface profile with a reference weld surface. The computing means thereby generates required operative changes in FSW process parameters based on such filtered out weld defects and communicates with the FSW device to effect such changed FSW process parameters on the FSW device in real time to avoid continuation of the filtered out defects with the welding.

In a preferred embodiment, the computing means comprises a master computer and slave computers. The slave computers are connected with the master computer through a secure ethernet based TCP/IP communication protocol. The master computer is operatively connected to the digital camera to receive the captured images for filtering out occurrence of the weld defects by real time 3D reconstruction of the weld surface from the captured images and generating the required operative changes in the FSW process parameters to rectify the filtered out defects.

The slave computers are operatively connected to the master computer and adapted to receive the changes in FSW process parameters from the master computer and accordingly control the FSW device in real time to effect the changed FSW process parameters in order to avoid continuation of the filtered out weld defects with the welding.

The master computer runs MATLAB for transmitting changed FSW process parameters upon identifying a defect on the weld surface. The slave computers are set into listening mode in LabVIEW for acquiring the data sent from the master. The master computer sends two different data sequences on detecting a weld defect. The first data received on the slave computer interfaced with the FSW device is used as an identifier to figure out whether the next data will be FSW tool-spindle rpm, plunge depth or weld traversal speed. Then, the next data, which is the actual value of one of the parameters, is sent to the FSW device via the slave computer to overcome upcoming defects.
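By way of illustration only, this two-part hand-off can be sketched as below. The port, byte encoding and identifier codes are assumptions made for the sketch; the specification names the parameters but not a wire format, and the actual implementation uses MATLAB and LabVIEW rather than Python.

```python
import socket
import struct

# Hypothetical identifier codes for the first data item; the actual codes
# exchanged between the MATLAB master and LabVIEW slaves are not specified.
PARAM_IDS = {"tool_rpm": 1, "plunge_depth": 2, "traverse_speed": 3}

def _recv_exactly(conn, n):
    """Read exactly n bytes from the connection."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed early")
        buf += chunk
    return buf

def send_correction(host, port, param, value):
    """Master side: first data = parameter identifier, second data = value."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(struct.pack("!i", PARAM_IDS[param]))  # identifier
        conn.sendall(struct.pack("!d", float(value)))      # actual value

def listen_for_correction(port):
    """Slave side (listening mode): returns e.g. ("tool_rpm", 1000.0)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            ident = struct.unpack("!i", _recv_exactly(conn, 4))[0]
            value = struct.unpack("!d", _recv_exactly(conn, 8))[0]
    names = {v: k for k, v in PARAM_IDS.items()}
    return names[ident], value
```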

Now, to operate the present system, firstly the digital camera is initialized. The initial camera adjustment is done manually by fixing its focal length such that a sharp and clear visual is observed throughout the FSW process. The capture rate of the camera is preferably fixed at 60 frames per second with an image capture resolution of 960x960. The intrinsic and extrinsic camera calibration parameters are computed using the checkerboard squares' real world dimensions, with the checkerboard captured at different orientations over two perpendicular planar surfaces, i.e., the vertical and horizontal planes. Thereby, using the extrinsic parameters, the vertical and horizontal plane equations and the camera center coordinates in the 3D world coordinate system are evaluated.
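This calibration step can be sketched with OpenCV's standard checkerboard pipeline; the board geometry, square size and file paths below are illustrative assumptions.

```python
import glob
import cv2
import numpy as np

BOARD = (9, 6)        # inner corners per row and column (assumed)
SQUARE_MM = 25.0      # real-world square size in mm (assumed)

# 3D object points of the board corners in the board's own plane (Z = 0)
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):     # board poses on both planes
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)

# Intrinsics (camera matrix A, distortion) and per-view extrinsics [R|t]
rms, A, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```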

The position and orientation of the dividing line between the two planes, and of the upper and lower lines in the vertical and horizontal planes respectively, are specified manually in the image coordinate system. Also, the ROI is specified in both the vertical and horizontal planes for estimating the fixed line equations of the laser stripe over these planes in the image coordinate system.

The master computer identifies and localizes the laser stripe edge in the selected ROI of the captured image by thresholding its grayscale converted image and creating a binary image out of it, where white pixels correspond to points on the edge of the laser stripe. The master computer then fits lines over the x and y coordinates of the white pixels extracted in the image to determine the equations of the edge lines on the vertical and horizontal planes separately. The points of intersection between the laser line on the vertical plane with the dividing and upper lines, and between the laser line on the horizontal plane with the dividing and lower lines, are evaluated. Then optical ray vectors are determined, in the world coordinate system, emerging from the camera center to these intersection points in the image using the intrinsic and extrinsic camera parameters. The real world point vectors are computed from the intersection of these optical rays with the vertical and horizontal planes. The line segment vectors along the laser stripe in the vertical and horizontal planes are separately evaluated upon subtraction of these point vectors. Hence, the position and orientation of the fixed laser plane is determined by taking the cross product of these line segment vectors.
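The stripe edge localization itself reduces to a threshold and a line fit, sketched below under assumed threshold and ROI conventions; cv2.fitLine returns the fitted line as a unit direction and a point on the line.

```python
import cv2
import numpy as np

def locate_stripe_edge(image_bgr, roi, thresh=200):
    """White pixels of the thresholded stripe plus a fitted edge line."""
    x0, y0, w, h = roi                            # region of interest
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)[y0:y0 + h, x0:x0 + w]
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    ys, xs = np.nonzero(binary)                   # stripe-edge pixels
    pts = np.column_stack((xs + x0, ys + y0)).astype(np.float32)
    vx, vy, px, py = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    return pts, (vx, vy, px, py)                  # pixels + line parameters
```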

Finally, after configuring all the required system settings and computing all static parameters and equations required for real time 3D reconstruction, the welding process is initiated by sending the initial welding parameters to the slave computer via TCP/IP communication.

During the inception of welding, the first camera-captured image in which the laser stripe edge lies over the weld bead surface and is positioned past the location of the colored marker is chosen as the reference image, after which the real time reconstruction process actually starts. This reference image is used for calculating the amount of distance travelled in the x and y directions as the welding proceeds, i.e., by computing the difference between the corresponding centroid x and y coordinates of the identified colored markers in the current and reference images. The positions of the colored markers are extracted on the basis of thresholding on the RGB color channels of the image.
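A sketch of this marker tracking step is given below; the color bounds are illustrative assumptions (note that OpenCV loads images in BGR channel order).

```python
import cv2
import numpy as np

def marker_centroid(image_bgr, lo=(0, 0, 150), hi=(80, 80, 255)):
    """Centroid (x, y) of pixels within the BGR bounds of the marker color."""
    mask = cv2.inRange(image_bgr, np.array(lo, np.uint8), np.array(hi, np.uint8))
    m = cv2.moments(mask, binaryImage=True)
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

def pixel_shift(current_bgr, reference_bgr):
    """(dx, dy) marker shift between the reference and the current frame."""
    return marker_centroid(current_bgr) - marker_centroid(reference_bgr)
```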

Initiating the process of real time 3D reconstruction, for each image captured during welding, firstly the localization of the laser stripe edge is carried out in the same way as explained before within the specified ROI. Now, for each white pixel on the laser stripe edge, the 3D reconstruction process is carried out to ultimately create the 3D surface profile of the weld cross-section as observed from the position and orientation of the laser stripe edge in the current image. To carry out the 3D reconstruction process, the master computer determines 3D coordinates for each pixel point's x, y coordinate lying on the laser stripe by evaluating the optical ray vector pointing towards the pixel point from the digital camera center using the camera intrinsic and extrinsic parameters, whereby the intersection of this optical ray vector with the laser plane is used to determine the real world 3D coordinates of that pixel point on the weld surface. The 3D point cloud of the weld surface under the captured laser stripe is generated in real time by using the world 3D coordinates of all the pixel points under the captured laser stripe. This 3D point cloud is basically the 3D surface profile of the weld surface under the captured laser stripe.
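A minimal sketch of this per-pixel back-projection and ray-plane intersection is given below, assuming the laser plane is expressed as n . X = d in world coordinates and that [R|t] maps world to camera coordinates.

```python
import numpy as np

def pixel_ray(u, v, A, R, t):
    """Camera center and unit ray direction, in world coordinates, for (u, v)."""
    d_cam = np.linalg.inv(A) @ np.array([u, v, 1.0])  # ray in camera frame
    center = -R.T @ t                                 # camera center in world frame
    direction = R.T @ d_cam                           # ray direction in world frame
    return center, direction / np.linalg.norm(direction)

def intersect_laser_plane(center, direction, n, d):
    """World point where the optical ray meets the laser plane n . X = d."""
    s = (d - n @ center) / (n @ direction)
    return center + s * direction
```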

The spatial and temporal location of the defect is determined by coordinate locations in reconstructed 3D point cloud data and the time at which the current image is captured respectively. Distance travelled and time elapsed for the occurrence of defect can accordingly be derived by the cumulative addition of distances between consecutive parallel lines across the weld surface and cumulative addition of change in time between the capture of two consecutive images of weld respectively.

The master computer, after constructing the 3D surface profile of the weld surface under the laser stripe for each captured image, generates a depth map of the weld surface in real time by comparing the 3D surface profile for each said captured image with a reference flat weld surface. Preferably, the depth map is calculated with reference to the smooth and flat weld surface. The position and orientation of the plane representing this flat weld surface is evaluated by fitting a plane equation over the 3D reconstructed points in the point cloud. Now, with respect to this plane, the depth or shortest distance of all newly reconstructed points is evaluated, creating the depth map of the weld surface in real world dimensions. Threshold values are designed based upon the minimum and maximum positive and negative depth values in the depth map, to filter out surface defect portions which can be either depressed below the weld surface (like voids) or raised above it (like flash defects). Based on the nature of the defects, i.e., whether they are voids or flash defects, the FSW process parameters, which include FSW tool-spindle rpm, plunge depth and weld traversal speed, are altered by the master computer.
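For illustration, the plane fit and depth thresholding can be sketched as follows; the threshold values here are assumptions and would in practice be tuned to the process.

```python
import numpy as np

def fit_reference_plane(points):
    """Least-squares plane through an Nx3 point cloud: unit normal n and d."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]                       # smallest singular vector = plane normal
    return n, n @ centroid           # plane: n . X = d

def classify_defects(points, n, d, void_thresh=-0.2, flash_thresh=0.2):
    """Signed depth w.r.t. the reference plane; split into voids and flash."""
    depth = points @ n - d           # negative = depressed, positive = raised
    return {"void": points[depth < void_thresh],
            "flash": points[depth > flash_thresh]}
```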

Since the laser plane and camera position are fixed while the weld is moving, the reconstructed surface coordinates are spatially translated in the direction of weld motion by the amount of triangulated distance traversed by the weld during the time elapsed between the capture of two consecutive camera frames. These shifted 3D coordinates are iteratively stored for each instance captured during welding. The modified 3D reconstruction technique of the present invention is computationally less expensive than the standard swept-plane based 3D reconstruction algorithm. The position and orientation of the projected laser plane does not need to be re-evaluated at each time interval since it is always fixed. 3D reconstruction of only one laser stripe edge is evaluated instead of both the entering and leaving edges. 3D reconstruction is only done for a specific region of interest in the image. A check is maintained to avoid any re-computation of already reconstructed points. The processing time is thus reduced, making it possible for the algorithm to be run in real time during welding.

The steps involved in the present modified 3D triangulation algorithm as executed in the master computer for real time generation of 3D point cloud data of the weld surface are explained below:

First of all, for each pixel's x, y coordinate lying on the laser stripe, as shown in Figure 3, 3D reconstruction is carried out. This is achieved by evaluating the optical ray vector pointing towards this point from the camera center using the camera intrinsic and extrinsic parameters. The intersection of this optical ray vector with the laser plane is then used to determine the real world 3D coordinates of that point, i.e., the point vector P, on the object surface.

Since the camera and the laser plane are at rest while the weld plate is moving, all pixel coordinates of the object over the laser stripe in the previous image snapshot appear to be shifted translationally in the direction of weld motion. Due to this, all the already reconstructed 3D world coordinates would need to be shifted by a specified amount, which would turn out to be computationally expensive if the same process were iteratively repeated for all the images captured at different time intervals. Thus, to compensate for this shift in a computationally efficient manner, the weld plate is assumed to be at rest while the laser plane is relatively moving in the direction opposite to that of the weld plate, so all the pixels appear to be moving in the opposite direction. By this method only the current pixels over the weld cross-section need to be reconstructed and shifted in the opposite direction, thereby saving all the extra time for re-shifting already reconstructed 3D coordinates.

Thus current pixel’s (x, y) coordinate will appear to be relatively shifted to new image coordinates (x_new, y_new), by amounts ( and ) in x and y directions respectively, in the direction of weld motion. This and is determined by calculating the amount of shift in centroid pixel coordinates of colored marker. Thus,
(x_new, y_new) = (x + , y + )
The optical ray vector, , pointing towards this (x_new, y_new) image coordinates from camera center, , is evaluated using camera intrinsic and extrinsic parameters. Let the real world 3D coordinates of object at (x_new,y_new) image coordinates be ( ) point vector, all shown in Figure 3.

The unit vector, u2, in the direction of the line segment vector pointing towards the already reconstructed point P from the camera center C, i.e., along the earlier optical ray, is evaluated by subtraction of C from the point vector P.

Considering the magnitudes along the projected unit vectors u1, u2, u3 to be M1, M2, M3 respectively, where u1 is the unit vector along the optical ray vector v and u3 is the unit vector in the direction of weld motion, M2 can be determined by the equation:

M2 = magnitude of (P - C)

Since the cross products of any two pairs of edge vectors of a triangle are equivalent to each other, the equivalence for the triangle formed by C, P and P_shift can be given as:

(M1 * u1) x (M2 * u2) = (M3 * u3) x (M1 * u1)

This equation can further be simplified to determine the magnitude M3 by:

M3 = magnitude of (u1 x (M2 * u2)) / magnitude of (u1 x u3)

Hence the shifted 3D world coordinate is determined by the equation:

(P_shift - P) = (M3 * u3)

implying,
P_shift = P + (M3 * u3)
This point vector P_shift is iteratively evaluated and stored for all pixel coordinates lying along the laser stripe representing a weld surface cross-section.

Meanwhile, a binary image is periodically updated as the weld progresses, where a value of 1 indicates that 3D reconstruction for that pixel's x, y coordinates has already been performed. This is done so as to avoid any repetition of 3D reconstruction at any particular image x, y coordinates as the weld traverses.
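Under the symbol reconstruction used above (P the already reconstructed point, C the camera center, u1 the unit vector along the new optical ray, u3 the unit vector along the weld motion), the shift computation reduces to a few lines, sketched here for illustration only.

```python
import numpy as np

def shift_point(P, C, u1, u3):
    """P_shift = P + M3 * u3, with M3 from the cross-product relation over
    the triangle formed by C, P and P_shift."""
    M2u2 = P - C                     # M2 * u2, by subtraction of C from P
    M3 = np.linalg.norm(np.cross(u1, M2u2)) / np.linalg.norm(np.cross(u1, u3))
    return P + M3 * u3

# 'Visited' binary image: 1 marks pixel coordinates already reconstructed,
# so reconstruction is not repeated as the weld traverses. The 960x960 size
# matches the capture resolution described earlier.
visited = np.zeros((960, 960), dtype=np.uint8)
```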

On the occurrence of a surface defect, depending on its severity, based upon the amount of depth observed, the current welding parameters are altered to good welding parameters so as to prevent any future occurrence of such defects.

The entire process above is repeated for all images captured at different time intervals, covering 3D reconstruction of the entire weld surface. All the reconstructed coordinates of the weld surface are stored in the order of their occurrence to form a 3D point cloud. Post welding, this 3D point cloud data can be used to visualize the entire weld surface and validate the inception of defects and their removal after the automatic change in welding parameters based on the real time feedback generated by the proposed algorithm above.

The accuracy produced by the algorithm is directly proportional to the image resolution captured by the camera, while processing higher resolution imagery is computationally expensive and increases the processing time of the algorithm. Thus a tradeoff between processing time and accuracy was made such that the algorithm is suitable for running in real time while processing a higher image resolution, around 960x960, to get more detailed 3D point cloud data. Also, with the use of a thin laser stripe instead of a thick shadow, minor weld surface variations can be visualized, creating a more accurate 3D point cloud model of the weld surface. The proposed algorithm is hence comparatively faster and more accurate than the standard 3D reconstruction algorithm.

Camera calibration is also an important aspect of the present 3D reconstruction of the weld surface. Camera calibration is the process of finding the characteristics internal to a camera and finding the camera's location in space with respect to a fixed object. This is critical when one wants to correct for lens distortion or measure the size of an object in world units. The characteristics inside a camera are known as intrinsic parameters and include the focal length of the lens used, the optical center, the lens distortion coefficients and the pixel scaling factors. These parameters improve image quality, correct for lens distortion and map real world distances to pixels. The camera's location in space is known as its extrinsic parameters. Camera calibration is useful in applications like machine vision, 3D scene reconstruction, etc.

For camera calibration a pinhole camera model is used, as shown in Figure 4. In this model, light is visualized as entering from the scene or a distant object, but only a single ray enters from any given point.

In this model, a scene view is formed by projecting 3D points into the image plane using a perspective transformation:

s * m' = A * [R|t] * M'

or, in expanded form,

s * [u, v, 1]^T = A * [R|t] * [X, Y, Z, 1]^T, with A = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]
where:
• (X, Y, Z) are the coordinates of a 3D point in the world coordinate space
• (u, v) are the coordinates of the projection point in pixels
• A is a camera matrix, or a matrix of intrinsic parameters
• (cx, cy) is a principal point that is usually at the image center
• fx, fy are the focal lengths expressed in pixel units

Thus, if an image from the camera is scaled by a factor, all of these parameters should be scaled (multiplied/divided, respectively) by the same factor. The matrix of intrinsic parameters does not depend on the scene viewed. So, once estimated, it can be re-used as long as the focal length is fixed (in the case of a zoom lens). The joint rotation-translation matrix [R|t] is called a matrix of extrinsic parameters. It is used to describe the camera motion around a static scene, or vice versa, the rigid motion of an object in front of a still camera. That is, [R|t] translates coordinates of a point (X, Y, Z) to a coordinate system fixed with respect to the camera. The transformation above is equivalent to the following (when z ≠ 0):

[x, y, z]^T = R * [X, Y, Z]^T + t
x' = x / z
y' = y / z
u = fx * x' + cx
v = fy * y' + cy
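As a numeric illustration of the equivalent form (all intrinsic and extrinsic values below are arbitrary assumptions, not calibration results from the described setup):

```python
import numpy as np

A = np.array([[800.0,   0.0, 480.0],    # fx, cx (assumed, pixels)
              [  0.0, 800.0, 480.0],    # fy, cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                           # assumed rotation
t = np.array([0.0, 0.0, 500.0])         # assumed translation (mm)

X_world = np.array([10.0, -5.0, 0.0])   # a 3D point (X, Y, Z)
x, y, z = R @ X_world + t               # camera coordinates (z != 0)
u = A[0, 0] * (x / z) + A[0, 2]         # u = fx * x' + cx
v = A[1, 1] * (y / z) + A[1, 2]         # v = fy * y' + cy
print(u, v)                             # pixel coordinates of the projection
```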

Real lenses usually have some distortion, mostly radial distortion and slight tangential distortion. Radial distortion arises because of the shape of the lens: rays farther from the center of a simple lens are bent too much compared to rays that pass closer to the center. This can be corrected using a non-linear model given by the equations below:

x_corrected = x * (1 + k1*r^2 + k2*r^4 + k3*r^6)
y_corrected = y * (1 + k1*r^2 + k2*r^4 + k3*r^6)

Here, (x, y) is the original location (on the imager) of the distorted point and (x_corrected, y_corrected) is the new location because of the correction. k1, k2, k3 are radial distortion coefficients and r is the radial distance of points on the image.

Further, tangential distortion arises due to manufacturing defects resulting from the lens not being exactly parallel to the imaging plane. This can be corrected using another non-linear model given by the equations below:

x_corrected = x + [2*p1*x*y + p2*(r^2 + 2*x^2)]
y_corrected = y + [p1*(r^2 + 2*y^2) + 2*p2*x*y]

Here, (x, y) is the original location (on the imager) of the distorted point and (x_corrected, y_corrected) is the new location because of the correction. p1 and p2 are tangential distortion coefficients and r is the radial distance of points on the image.
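A direct transcription of the two correction models, taking (x, y) as normalized image coordinates with r^2 = x^2 + y^2, might look like:

```python
def correct_distortion(x, y, k1, k2, k3, p1, p2):
    """Apply the radial and tangential correction models stated above."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3  # 1 + k1 r^2 + k2 r^4 + k3 r^6
    x_tan = 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_tan = p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x * radial + x_tan, y * radial + y_tan
```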

For the calculation of these parameters the checkerboard is used because its regular pattern makes it easy to detect automatically. The next step is to enter the size of the checkerboard square in world units. This is necessary to find the mapping between world units and image pixels. The checkerboard calibration pattern is then detected automatically by the toolbox. The corner extraction engine in the toolbox extracts all the corners as well as the boundary of the checkerboard. Further, the number of radial distortion coefficients can be specified in order to correct for more bending of light at the edges of the lens as compared to its center. Two coefficients are sufficient, but for severe distortion, as in the case of a wide-angle lens, 3 coefficients might be necessary. The estimation of tangential distortion can also be enabled for the cases when the lens and camera center are not parallel. The checkerboard pattern in multiple orientations that has been used for camera calibration is shown in Figure 5.

In order to calibrate the camera, a correspondence between 3D world points and 2D image points is needed. This is the reason for using multiple images with different orientations of the checkerboard in them, as the 3D points cannot all be coplanar. So, for generating non-coplanar points from a planar checkerboard pattern, multiple images in different 3D orientations are required. Furthermore, multiple images of the calibration pattern also mitigate noise and errors, helping estimate the camera parameters more reliably. Also, the extrinsic parameter visualization is represented in Figure 6 to show the angles at which the calibration images have been taken with respect to the camera.

Calibration errors have been evaluated by visualizing re-projection errors. The re-projection error is a global measure of calibration error and is the difference between the points detected in the image and the points re-projected back onto the image using the camera parameters that have been calculated. The re-projection error observed after the calibration is shown in Figure 7.
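For reference, the mean re-projection error can be computed from the calibration outputs as sketched below (variable names follow the earlier calibration sketch and are assumptions):

```python
import cv2
import numpy as np

def mean_reprojection_error(obj_pts, img_pts, A, dist, rvecs, tvecs):
    """Mean pixel distance between detected and re-projected corners."""
    total, count = 0.0, 0
    for objp, imgp, r, t in zip(obj_pts, img_pts, rvecs, tvecs):
        proj, _ = cv2.projectPoints(objp, r, t, A, dist)
        diffs = proj.reshape(-1, 2) - imgp.reshape(-1, 2)
        total += np.linalg.norm(diffs, axis=1).sum()
        count += len(objp)
    return total / count
```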

In the present system a red laser source is preferred, as a red laser beam is collimated, meaning it consists of waves traveling parallel to each other in a single direction with very little divergence. This allows the red laser light to be focused to very high intensity. Also, red laser light is coherent, which means all the light waves move in phase together in both time and space. This allows the laser to have a very tight beam that is strong and concentrated. These two properties of the red laser allow it to highlight even minor surface variations over the weld bead. Moreover, the segmentation of red color from an image is very easy, as only strongly red pixels (ideally R = 255, G = 0, B = 0) need to be searched for in the image matrix. This makes the red laser line easily detectable, as compared to a thick shadow region, in any lighting conditions. This is a prime requirement for the 3D reconstruction algorithm mentioned in the invention to work properly.

In figure 2a a red line laser has been used for 3D reconstruction of weld bead and in figure 2b a thick shadow region has been used for the same. It is quite evident that the red line laser is able to penetrate more deeply into the keyhole defect shown in the green box and highlight minor variations as compared to the thick shadow region shown in figure 2b.

The Graphical User Interface (GUI) associated with the present system, developed in LabVIEW and used for controlling the process parameters in real time, is shown in figure 8.

The accompanying figure 9 depicts the change in the welding parameters after the detection of defects in real time, to improve the quality of the forthcoming weld. The experiment has been carried out on a 2 Ton linear NC controlled FSW machine. The work piece material is AA1100, a commercially pure aluminium alloy, of 3 mm thickness, with a length and width of 200 mm and 75 mm respectively. The FSW tool is made of H13 tool steel and has the following dimensions: shoulder diameter of 15 mm, pin diameter of 4 mm and pin height of 2.4 mm. As shown in the referred figure, the welding defect is filtered out at a weld speed of 80 mm/min, tool RPM of 600 and plunge depth of 0.05 mm. After online detection of such a weld defect by the present system, the FSW parameters are changed to a weld speed of 30 mm/min, tool RPM of 1000 and plunge depth of 0.15 mm in real time, which avoids the continuation of the welding defect and results in a good welding surface.

Documents

Application Documents

# Name Date
1 201831035477-STATEMENT OF UNDERTAKING (FORM 3) [20-09-2018(online)].pdf 2018-09-20
2 201831035477-FORM 1 [20-09-2018(online)].pdf 2018-09-20
3 201831035477-DRAWINGS [20-09-2018(online)].pdf 2018-09-20
4 201831035477-COMPLETE SPECIFICATION [20-09-2018(online)].pdf 2018-09-20
5 201831035477-FORM-26 [13-12-2018(online)].pdf 2018-12-13
6 201831035477-Proof of Right (MANDATORY) [20-03-2019(online)].pdf 2019-03-20
7 201831035477-FORM 18 [04-03-2022(online)].pdf 2022-03-04
8 201831035477-FER.pdf 2022-09-08
9 201831035477-OTHERS [06-03-2023(online)].pdf 2023-03-06
10 201831035477-FORM-26 [06-03-2023(online)].pdf 2023-03-06
11 201831035477-FER_SER_REPLY [06-03-2023(online)].pdf 2023-03-06
12 201831035477-DRAWING [06-03-2023(online)].pdf 2023-03-06
13 201831035477-COMPLETE SPECIFICATION [06-03-2023(online)].pdf 2023-03-06
14 201831035477-CLAIMS [06-03-2023(online)].pdf 2023-03-06
15 201831035477-PatentCertificate26-02-2024.pdf 2024-02-26
16 201831035477-IntimationOfGrant26-02-2024.pdf 2024-02-26

Search Strategy

1 1KeyWordUsedE_07-09-2022.pdf
