Abstract: Systems and methods for identifying damage on homogeneous surface of an object using images obtained from polarization angles. The traditional systems and methods provide for damage detection by focusing only on sub-micron damage on thin films and do not provide for detecting damage on the homogeneous surface using the images obtained from the polarization angles. Embodiments of the present disclosure provide for identifying damage on the homogeneous surface using the images obtained from the polarization angles by, capturing a set of images using an image capturing device, computing, by one or more hardware processors (104), intensities of pixels based upon the set of images, determining, an angle of zenith and an angle of azimuth of surface normal, obtaining, one or more gradients of pixels based upon the angle of zenith and the angle of azimuth, and obtaining, a combined gradient value, to identify damage on the homogeneous surface of the object.
Claims:1. A method for identifying damage on homogeneous surface of an object using a set of images obtained from polarization angles, the method comprising processor implemented steps of:
capturing, using an image capturing device, the set of images, wherein the set of images comprise a plurality of multi-pixel images varying with the polarization angles (201);
computing, by one or more hardware processors, one or more intensities of pixels, based upon the set of images, wherein the one or more intensities of pixels are a function of maximum and minimum intensities of a sinusoidal variation in the set of images and a function of a phase angle of a sinusoid (202);
determining, an angle of zenith and an angle of azimuth of surface normal, based upon the one or more intensities of pixels, wherein the surface normal comprises a vector value corresponding to a gradient of point of depth of the object (203);
obtaining, one or more gradients of pixels, based upon the angle of zenith and the angle of azimuth, wherein the gradients of pixels comprises variations in a depth of the set of images on primary and secondary axis (204); and
obtaining, a combined gradient value, based upon the one or more gradients of pixels to identify damage on the homogeneous surface of the object (205).
2. The method of claim 1, wherein the step of obtaining the one or more gradients of pixels is preceded by computing, a degree of polarization of an incident light corresponding to the object, based upon the one or more intensities and the polarization angles to identify damage on the homogeneous surface of the object.
3. The method of claim 1, wherein the angle of zenith is a function of a degree of polarization and refractive index of the homogeneous surface, and wherein the azimuth angle is an angle between a projection of the surface normal on horizontal plane of the surface normal and X-axis of the surface normal.
4. The method of claim 1, wherein the one or more intensities of pixels generate a pattern of a sinusoidal wave when plotted at the polarization angles.
5. The method of claim 1, wherein the step of determining the angle of azimuth comprises differentiating, by an Independent Component Analysis (ICA) technique, a set of specular and diffuse images of each pixel corresponding to the set of images to identify damage on the homogeneous surface of the object.
6. A system (100) for identifying damage on homogeneous surface of an object using a set of images obtained from polarization angles, the system (100) comprising:
a memory (102) storing instructions;
one or more communication interfaces (106); and
one or more hardware processors (104) coupled to the memory (102) via the one or more communication interfaces (106), wherein the one or more hardware processors (104) are configured by the instructions to:
capture, using an image capturing device, the set of images, wherein the set of images comprise a plurality of multi-pixel images varying with the polarization angles;
compute, by the one or more hardware processors (104), one or more intensities of pixels, based upon the set of images, wherein the one or more intensities of pixels are a function of maximum and minimum intensities of a sinusoidal variation in the set of images and a function of a phase angle of a sinusoid;
determine, an angle of zenith and an angle of azimuth of surface normal, based upon the one or more intensities of pixels, wherein the surface normal comprises a vector value corresponding to a gradient of point of depth of the object;
obtain, one or more gradients of pixels, based upon the angle of zenith and the angle of azimuth, wherein the gradients of pixels comprises variations in a depth of the set of images on primary and secondary axis; and
obtain, a combined gradient value, based upon the one or more gradients of pixels to identify damage on the homogeneous surface of the object.
7. The system (100) of claim 6, wherein the one or more hardware processors (104) are configured to obtain the one or more gradients of pixels by computing, a degree of polarization of an incident light corresponding to the object, based upon the one or more intensities and the polarization angles to identify damage on the homogeneous surface of the object.
8. The system (100) of claim 6, wherein the angle of zenith is a function of a degree of polarization and refractive index of the homogeneous surface, and wherein the azimuth angle is an angle between a projection of the surface normal on horizontal plane of the surface normal and X-axis of the surface normal.
9. The system (100) of claim 6, wherein the one or more intensities of pixels generate a pattern of a sinusoidal wave when plotted at the polarization angles.
10. The system (100) of claim 6, wherein the angle of azimuth is determined by differentiating, by an Independent Component Analysis (ICA) technique, a set of specular and diffuse images of each pixel corresponding to the set of images to identify damage on the homogeneous surface of the object.
Description:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
SYSTEMS AND METHODS FOR IDENTIFYING DAMAGE ON HOMOGENEOUS SURFACE USING IMAGES OBTAINED FROM POLARIZATION ANGLES
Applicant:
Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
The present disclosure generally relates to identifying damage on homogeneous surface using polarization angles. More particularly, the present disclosure relates to systems and methods for identifying damage on homogeneous surface of an object using images obtained from polarization angles.
BACKGROUND
Imaging based damage detection techniques are increasingly being utilized to identify damage on surfaces. Image processing methods use inexpensive and readily available equipment (for example, a standard digital camera). Furthermore, advances in camera technology mean that rich, detailed imagery of damaged components can be acquired. Applying damage detection algorithms to images may find and quantify visible damage on the surface of homogeneous elements with minimal human supervision. Physical properties of the identified damage, such as size and shape characteristics, may be quantified using image processing. The quantitative nature of data obtained from image processing is important, and an accurate data analysis may provide for numerous applications for identifying damage.
Various other techniques may also be used for identifying damage on surfaces. Interferometric techniques are commonly used to measure the profile of a surface of an object. To do so, an interferometer combines measurement light reflected from the surface of interest with reference light reflected from a reference surface to produce an interferogram. Fringes in the interferogram are indicative of spatial and structural variations between the surface of interest and the reference surface. Optical metrology involves directing an incident beam at a structure, measuring the resulting diffracted beam, and analyzing the diffracted beam to determine various characteristics, such as the profile of the structure.
However, image processing and other traditional systems and methods for damage detection face numerous problems. For example, most of the image processing systems do not estimate all critical quantifiable parameters. Further, the traditional systems and methods use a source to send light or electromagnetic waves onto the surface of the object, with appropriate mirrors to aid the reflection, to detect the damaged surface. Also, the traditional systems and methods focus on wafers, where defects are at the sub-micron level and cannot be detected by normal cameras. Some of the traditional systems and methods also provide for damage detection using transparent photographic material or films. However, a film may be mishandled, which may lead to physical damage. Films unspooled on a dirty worktable or passed through worn rollers can pick up dust, dirt, scratches, and abrasions. The visualization of a photographic image always entails a large magnification and, obviously, the flaws are magnified together with the image; therefore, even very small blemishes can have a strong impact on the image. This in turn greatly affects the accuracy of identifying damage on surfaces.
SUMMARY
Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one embodiment, a method for identifying damage on homogeneous surface of an object using a set of images obtained from polarization angles is provided, the method comprising: capturing, using an image capturing device, the set of images, wherein the set of images comprise a plurality of multi-pixel images varying with the polarization angles; computing, by one or more hardware processors, one or more intensities of pixels, based upon the set of images, wherein the one or more intensities of pixels are a function of maximum and minimum intensities of a sinusoidal variation in the set of images and a function of a phase angle of a sinusoid; determining, an angle of zenith and an angle of azimuth of surface normal, based upon the one or more intensities of pixels, wherein the surface normal comprises a vector value corresponding to a gradient of point of depth of the object; obtaining, one or more gradients of pixels, based upon the angle of zenith and the angle of azimuth, wherein the gradients of pixels comprises variations in a depth of the set of images on primary and secondary axis; obtaining, a combined gradient value, based upon the one or more gradients of pixels to identify damage on the homogeneous surface of the object; obtaining the one or more gradients of pixels by computing, a degree of polarization of an incident light corresponding to the object, based upon the one or more intensities and the polarization angles to identify damage on the homogeneous surface of the object; and determining the angle of azimuth by differentiating, by an independent component analysis (ICA) technique, a set of specular and diffuse images of each pixel corresponding to the set of images to identify damage on the homogeneous surface of the object.
In another aspect, there is provided a system for identifying damage on homogeneous surface of an object using a set of images obtained from polarization angles, the system comprising a memory storing instructions; one or more communication interfaces; and one or more hardware processors coupled to the memory via the one or more communication interfaces, wherein the one or more hardware processors are configured by the instructions to: capture, using an image capturing device, the set of images, wherein the set of images comprise a plurality of multi-pixel images varying with the polarization angles; compute, by the one or more hardware processors, one or more intensities of pixels, based upon the set of images, wherein the one or more intensities of pixels are a function of maximum and minimum intensities of a sinusoidal variation in the set of images and a function of a phase angle of a sinusoid; determine, an angle of zenith and an angle of azimuth of surface normal, based upon the one or more intensities of pixels, wherein the surface normal comprises a vector value corresponding to a gradient of point of depth of the object; obtain, one or more gradients of pixels, based upon the angle of zenith and the angle of azimuth, wherein the gradients of pixels comprises variations in a depth of the set of images on primary and secondary axis; obtain, a combined gradient value, based upon the one or more gradients of pixels to identify damage on the homogeneous surface of the object; obtain the one or more gradients of pixels by computing, a degree of polarization of an incident light corresponding to the object, based upon the one or more intensities and the polarization angles to identify damage on the homogeneous surface of the object; and determine the angle of azimuth by differentiating, by an independent component analysis (ICA) technique, a set of specular and diffuse images of each pixel corresponding to the set of images to identify damage on the homogeneous surface of the object.
In yet another aspect, there is provided one or more non-transitory machine readable information storage mediums comprising one or more instructions which when executed by one or more hardware processors cause the one or more hardware processors to perform a method for identifying damage on homogeneous surface of an object using a set of images obtained from polarization angles, the method comprising: capturing, using an image capturing device, the set of images, wherein the set of images comprise a plurality of multi-pixel images varying with the polarization angles; computing, by the one or more hardware processors, one or more intensities of pixels, based upon the set of images, wherein the one or more intensities of pixels are a function of maximum and minimum intensities of a sinusoidal variation in the set of images and a function of a phase angle of a sinusoid; determining, an angle of zenith and an angle of azimuth of surface normal, based upon the one or more intensities of pixels, wherein the surface normal comprises a vector value corresponding to a gradient of point of depth of the object; obtaining, one or more gradients of pixels, based upon the angle of zenith and the angle of azimuth, wherein the gradients of pixels comprises variations in a depth of the set of images on primary and secondary axis; obtaining, a combined gradient value, based upon the one or more gradients of pixels to identify damage on the homogeneous surface of the object; obtaining the one or more gradients of pixels by computing, a degree of polarization of an incident light corresponding to the object, based upon the one or more intensities and the polarization angles to identify damage on the homogeneous surface of the object; and determining the angle of azimuth by differentiating, by an independent component analysis (ICA) technique, a set of specular and diffuse images of each pixel corresponding to the set of images to identify damage on the homogeneous surface of the object.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
FIG. 1 illustrates a block diagram of a system for identifying damage on homogeneous surface of an object using a set of images obtained from polarization angles, according to some embodiments of the present disclosure.
FIG. 2 is a flowchart illustrating the steps involved for identifying damage on the homogeneous surface of the object using the set of images obtained from the polarization angles, according to some embodiments of the present disclosure.
FIG. 3 illustrates visual representation of a combination of a specular reflection and a diffuse reflection by the object, according to some embodiments of the present disclosure.
FIG. 4 illustrates visual representation of the object being captured using an image capturing device comprising of a polarizing lens, according to some embodiments of the present disclosure.
FIG. 5 illustrates visual representation of the set of images captured through the image capturing device with the polarization angles, according to some embodiments of the present disclosure.
FIG. 6 shows the graphical representation of one or more intensities of pixels and a pattern of a sinusoidal wave generated by the one or more intensities of pixels when plotted at the polarization angles, according to some embodiments of the present disclosure.
FIG. 7 illustrates visual representation of the identified damaged area of the homogeneous surface of the object, according to some embodiments of the present disclosure.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the following claims.
The embodiments of the present disclosure provide systems and methods for identifying damage on homogeneous surface of an object using a set of images obtained from polarization angles. Nowadays, techniques in image processing make it possible to detect damage, such as moisture or biological changes, on the surfaces of objects or historical buildings. Digital classification techniques can be used to identify damage in objects in a non-destructive way. Imaging based damage detection techniques are increasingly being utilized to identify damage on surfaces. The quantitative nature of data obtained from image processing provides for numerous applications for identifying damage.
Some of the traditional systems and methods use image processing to detect damage to some extent, but do not provide for estimating all critical parameters involved in detecting damage. For example, some of the traditional systems and methods capture the images and apply object oriented classification. This requires prior segmentation of the scene followed by classification. A three-dimensional (3D) model is then constructed using laser scanning. The focus is simply to detect damage affecting biocalcarenite stone. Further, some of the traditional systems and methods use a source to send light or electromagnetic waves onto the surface of the object, with appropriate mirrors to aid the reflection, to detect the damaged surface. Also, the traditional systems and methods focus on wafers, where defects are at the sub-micron level and cannot be detected by normal cameras. Further, some of the traditional systems and methods also provide for damage detection using transparent photographic material or films. However, a film may be mishandled, which may lead to physical damage. This in turn greatly affects the accuracy of identifying damage on surfaces.
Hence, there is a need for a technology that can provide for identifying damage on the homogeneous surface using polarized images captured with a standard camera. The intensity of the images varies with the angle of polarization. Also, the technology must provide for capturing these varying intensities, along with the knowledge of the polarization angles, to calculate the degree of polarization and the phase angle and thereby accurately identify damage on the homogeneous surface.
Referring now to the drawings, and more particularly to FIGS. 1 through FIG. 7, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
FIG. 1 illustrates an exemplary block diagram of a system 100 for identifying damage on homogeneous surface of an object using a set of images obtained from polarization angles according to an embodiment of the present disclosure. In an embodiment, the system 100 includes one or more processors 104, communication interface device(s) or input/output (I/O) interface(s) 106, and one or more data storage devices or memory 102 operatively coupled to the one or more processors 104. The one or more processors 104 that are hardware processors can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) is configured to fetch and execute computer-readable instructions stored in the memory. In an embodiment, the system 100 can be implemented in a variety of computing systems, such as laptop computers, notebooks, hand-held devices, workstations, mainframe computers, servers, a network cloud and the like.
The I/O interface device(s) 106 can include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like and can facilitate multiple communications within a wide variety of networks N/W and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. In an embodiment, the I/O interface device(s) can include one or more ports for connecting a number of devices to one another or to another server.
The memory 102 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
FIG. 2, with reference to FIG. 1, illustrates an exemplary flow diagram of a method for identifying damage on the homogeneous surface of the object using the set of images obtained from the polarization angles according to an embodiment of the present disclosure. In an embodiment, the system 100 comprises one or more data storage devices or the memory 102 operatively coupled to the one or more hardware processors 104 and is configured to store instructions for execution of steps of the method by the one or more processors 104. The steps of the method of the present disclosure will now be explained with reference to the components of the system 100 as depicted in FIG. 1 and the flow diagram. In the embodiments of the present disclosure, the hardware processors 104, when configured with the instructions, perform one or more methodologies described herein.
According to an embodiment of the present disclosure, at step 201, the set of images may be captured using an image capturing device, wherein the set of images comprise a plurality of multi-pixel images varying with the polarization angles. The image capturing device may be a digital single-lens reflex (DSLR) camera comprising a polarizing lens. The polarizing lens is rotated at different angles and the set of images corresponding to the object (of interest) may then be captured. When an un-polarized light ray falls on the object, two types of reflections may be observed, namely, a specular reflection and a diffuse reflection. Referring to FIG. 3, the object reflects a combination of both the specular reflection and the diffuse reflection. Further, both the specular reflection and the diffuse reflection are partially polarized in nature. In an embodiment, referring to FIG. 4, the partially polarized light rays may be passed through the polarizing lens as depicted. The polarizing lens may then be rotated at different angles to capture the set of images. FIG. 5 shows the set of images captured through the image capturing device at the polarization angles. It may be noted that the embodiments of the present disclosure require the set of images to be captured from at least three different angles of polarization for the purposes of identifying damage on the homogeneous surface of the object.
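By way of illustration only, the sketch below shows how such a set of images, captured at three polarizer rotations, might be loaded and stacked for the per-pixel computations of the subsequent steps. The file names, the use of OpenCV, and the specific angles are assumptions of the sketch and not requirements of the present disclosure.

```python
# A minimal sketch (not the claimed method itself): stack grayscale images
# captured at three hypothetical polarizer rotations for per-pixel processing.
import cv2
import numpy as np

pol_angles_deg = [30.0, 75.0, 120.0]                          # example rotations
file_names = ["pol_030.png", "pol_075.png", "pol_120.png"]    # hypothetical files

# Read each image as grayscale and stack into an (H, W, M) array,
# where M is the number of polarizer angles used.
images = [cv2.imread(f, cv2.IMREAD_GRAYSCALE).astype(np.float64) for f in file_names]
stack = np.dstack(images)                                     # shape (H, W, 3)
pol_angles_rad = np.deg2rad(pol_angles_deg)
```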
According to an embodiment of the present disclosure, at step 202, the one or more hardware processors 104 compute, one or more intensities of pixels, based upon the set of images, wherein the one or more intensities of pixels are a function of maximum and minimum intensities of a sinusoidal variation in the set of images and a function of a phase angle of a sinusoid. In an embodiment, the term ‘sinusoidal variation’ implies a smooth repetitive oscillation of the one or more intensities of pixels over the set of images captured at the polarization angles (that is, when the polarizing lens is rotated at different angles). In an embodiment, the one or more intensities of pixels may be obtained by using equation (1) as shown below:
$$I(\phi_{pol}) = \frac{I_{max}+I_{min}}{2} + \frac{I_{max}-I_{min}}{2}\,\cos\!\left(2\left(\phi_{pol}-\phi\right)\right) \qquad \text{equation (1)}$$
In an embodiment, referring to equation (1), it may be noted that I_max and I_min represent the maximum and minimum intensities of pixels. Referring to equation (1) again, suppose three different sets of images are captured at three different polarization angles to compute I_max, I_min and a phase angle Ø (phi).
In an example implementation, suppose the three different sets of images are captured by the image capturing device at the three different polarization angles of, for example, 30, 75 and 120 degrees. For a particular pixel, I(30)=124, I(75)=120 and I(120)=119, where I denotes the one or more intensities of pixels (corresponding to the set of images). Using a non-linear least squares algorithm and equation (1), the maximum and minimum intensities of pixels and the phase angle may be obtained as below:
I_max=120;
I_min=119; and
Ø=1.5 degrees
It may be noted that I(30)=124, I(75)=120 and I(120)=119 are obtained by rotating the polarizing lens fitted into the image capturing device, for example, the DSLR camera. When the polarizing lens is rotated by 30 degrees, for a particular pixel, I(30)=124 may be obtained. Further, for a rotation of 75 degrees, I(75)=120 may be obtained and for a rotation of 120 degrees, I(120)=119 may be obtained. As is known in the art, the phase angle Ø (phi) denotes a fraction of an oscillation which has elapsed with respect to the origin or zero crossing. In an embodiment, the term ‘function’ (that is, the function of the maximum and minimum intensities of the sinusoidal variation and the function of a phase angle of the sinusoid) may be defined as I = f(I_max, I_min, Ø), that is, the value of a pixel at a particular location depends on the values of I_max, I_min and Ø.
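By way of illustration only, the per-pixel fit of equation (1) could be sketched as follows. The use of SciPy's curve_fit, the initial guess, and the example intensities are assumptions of the sketch; the exact fitted values depend on the solver and the initial guess.

```python
# A minimal sketch, assuming equation (1): fit I_max, I_min and the phase angle
# phi for one pixel from intensities observed at known polarizer angles.
import numpy as np
from scipy.optimize import curve_fit

def polarizer_intensity(phi_pol, i_max, i_min, phi):
    """Equation (1): intensity as a function of the polarizer angle (radians)."""
    return (i_max + i_min) / 2.0 + (i_max - i_min) / 2.0 * np.cos(2.0 * (phi_pol - phi))

# Example values from the description: I(30) = 124, I(75) = 120, I(120) = 119.
angles = np.deg2rad([30.0, 75.0, 120.0])
intensities = np.array([124.0, 120.0, 119.0])

# Non-linear least squares; the initial guess below is a heuristic assumption.
p0 = [intensities.max(), intensities.min(), 0.0]
(i_max, i_min, phi), _ = curve_fit(polarizer_intensity, angles, intensities, p0=p0)
print(i_max, i_min, np.rad2deg(phi))
```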
Referring to FIG. 6, it may be noted that the one or more intensities of pixels generate a pattern of a sinusoidal wave or the sinusoidal variation when plotted at the polarization angles. Referring to FIG. 6 again, it may be noted that I_max and I_min represent the maximum and minimum intensities of pixels respectively.
According to an embodiment of the present disclosure, at step 203, the one or more hardware processors 104 determine, an angle of zenith and an angle of azimuth of surface normal, based upon the one or more intensities of pixels, wherein the surface normal (that is, the surface normal corresponding to the angle of zenith and the angle of azimuth) comprises a vector value corresponding to a gradient of point of depth of the object. In an embodiment, the angle of zenith is a function of a degree of polarization and refractive index of the homogeneous surface and the angle of azimuth is an angle between a projection of the surface normal on horizontal plane of the surface normal and X-axis of the surface normal. In an embodiment, the degree of polarization comprises polarization degree of an incident light corresponding to the object.
The angle of azimuth has an ambiguity of 90 degrees with respect to the surface normal of reflection. In the case of the specular reflection from the surface of the object, the angle of azimuth is equal to the phase angle Ø, with an additional 90 degrees in the case of the diffuse reflection. In an embodiment, this ambiguity may be resolved by implementing an independent component analysis (ICA) technique. The ICA differentiates a set of specular and diffuse images of each pixel corresponding to the set of images. It may be noted that the ICA does not, per se or directly, identify damage on the homogeneous surface. Using the ICA, for each pixel, the energy of the specular and diffuse images is computed and the greater one may be considered for resolving the ambiguity, which further facilitates identifying damage on the homogeneous surface of the object (discussed below).
In an example implementation, referring to step 202 above and using the ICA on I_max = 120, I_min = 119 and Ø = 1.5 degrees, it may be noted that there is a dominance of the diffuse reflection and hence the angle of azimuth may be obtained as below:
azimuth angle=Ø+90 degrees=91.5 degrees equation (2)
In an example implementation, the ICA may be applied as below for determining whether there is a dominance of the specular reflection or the diffuse reflection:
Using a rotating polarizer, a series of M surface reflection images may be captured. Each of the M surface reflection images may be scanned and vectorized into a row vector. The observation matrix X is composed of these row vectors:
X = [x1, x2, x3, …, xM];
Let d and s be row vectors representing the diffuse reflection and the specular reflection images; the matrix S is composed of:
S=[s,d];
The matrix X may then be denoted as X=AS; where A is an integrated matrix (integration of the specular and diffuse reflections); and
A Singular Value Decomposition (SVD) technique may be used to calculate S to obtain specular and diffuse components.
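A rough sketch of the separation step outlined above is given below. It uses a plain singular value decomposition of the observation matrix X as a stand-in for the full ICA pipeline, so the function name, the choice of the two strongest components, and the per-pixel energy comparison are illustrative assumptions rather than the exact procedure of the disclosure.

```python
# A minimal sketch (illustrative only): separate two reflection components from
# M vectorized surface-reflection images via an SVD of the observation matrix X.
import numpy as np

def separate_components(stack):
    """stack: (H, W, M) array of images captured at M >= 2 polarizer angles."""
    h, w, m = stack.shape
    # Each row of X is one scanned and vectorized image (X = A S in the text).
    X = stack.reshape(h * w, m).T                      # shape (M, H*W)
    # An SVD yields candidate source images as the rows of Vt.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Keep the two strongest components as candidate specular/diffuse images.
    comp1 = Vt[0].reshape(h, w)
    comp2 = Vt[1].reshape(h, w)
    # Per-pixel energy comparison, used only to pick the dominant component.
    dominant_is_first = comp1 ** 2 >= comp2 ** 2
    return comp1, comp2, dominant_is_first
```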
According to an embodiment of the present disclosure, as discussed above, the angle of zenith is a function of the degree of polarization and the refractive index of the homogeneous surface. In an embodiment, the degree of polarization may be denoted by rho (ρ) and may be computed as:
$$\rho = \frac{I_{max}-I_{min}}{I_{max}+I_{min}} \qquad \text{equation (3);}$$
where ρ denotes the degree of polarization and, referring to step 202 again, I_max = 120 and I_min = 119.
Using equation (3), the degree of polarization is obtained as rho = (120 − 119)/(120 + 119) ≈ 0.0042.
In an example implementation, suppose the refractive index of the object is n; the angle of zenith, that is, theta (θ), may be calculated using equation (4) below:
$$\rho = \frac{2\,\sin\theta\,\tan\theta\,\sqrt{n^{2}-\sin^{2}\theta}}{n^{2}-\sin^{2}\theta+\sin^{2}\theta\,\tan^{2}\theta} \qquad \text{equation (4);}$$
where ρ is the degree of polarization obtained using equation (3) above; and
n is the refractive index, with n = 1.5.
Therefore, using equation (4) and the above set of input values, the angle of zenith, that is, θ, is as below:
θ = 35 degrees
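By way of illustration only, the degree of polarization of equation (3) and a numerical inversion of equation (4) for the zenith angle could be sketched as follows. The bracketing root finder and the upper bracket at arctan(n) (chosen as an assumption so that the relation stays monotonic over the bracket) are choices of the sketch, and the values obtained depend on the degree of polarization actually supplied.

```python
# A minimal sketch, assuming equations (3) and (4) as written above: compute the
# degree of polarization rho and invert equation (4) numerically for the zenith
# angle theta, given the refractive index n.
import numpy as np
from scipy.optimize import brentq

def degree_of_polarization(i_max, i_min):
    """Equation (3)."""
    return (i_max - i_min) / (i_max + i_min)

def rho_from_zenith(theta, n):
    """Equation (4), read as rho as a function of the zenith angle theta (radians)."""
    s, t = np.sin(theta), np.tan(theta)
    return (2.0 * s * t * np.sqrt(n ** 2 - s ** 2)) / (n ** 2 - s ** 2 + s ** 2 * t ** 2)

def zenith_from_rho(rho, n):
    # Bracket [0, arctan(n)]: an assumption that keeps rho_from_zenith monotonic.
    return brentq(lambda th: rho_from_zenith(th, n) - rho, 1e-9, np.arctan(n))

rho = degree_of_polarization(120.0, 119.0)
theta = zenith_from_rho(rho, n=1.5)
print(rho, np.rad2deg(theta))
```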
According to an embodiment of the present disclosure, at step 204, one or more gradients of pixels may be obtained based upon the angle of zenith and the angle of azimuth, wherein the gradients of pixels comprise variations in a depth of the set of images on the primary and secondary axes, and wherein the primary and secondary axes denote the horizontal and vertical axes corresponding to an image (from amongst the set of images) captured using the image capturing device. In an embodiment, the one or more gradients of pixels may be obtained, for example, by integrating the angle of zenith, the angle of azimuth and the degree of polarization. In an embodiment, the one or more gradients of pixels may be computed in the primary (x) and secondary (y) axes. The one or more gradients of pixels are computed for integrating the gradients obtained in both the x and y axes for further obtaining a complex number. An angle of gradient (discussed below) corresponding to the complex number may then be plotted to identify the damage on the object.
In an example implementation, the one or more gradients of pixels in the x and y directions may be represented as dzdx and dzdy and may be obtained using equations (5) and (6) as below:
$$dzdx = \frac{\tan\theta}{\sqrt{1+\tan^{2}\phi}} \qquad \text{equation (5); and}$$
$$dzdy = \frac{\tan\theta\,\tan\phi}{\sqrt{1+\tan^{2}\phi}} \qquad \text{equation (6)}$$
where θ denotes the angle of zenith and φ (Ø) denotes the angle of azimuth.
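A minimal sketch of the gradient computation of equations (5) and (6), applied per pixel over zenith and azimuth angle maps, might look as follows; the vectorized NumPy formulation is an implementation assumption.

```python
# A minimal sketch, assuming equations (5) and (6): per-pixel gradients along the
# primary (x) and secondary (y) axes from zenith and azimuth angle maps.
import numpy as np

def pixel_gradients(zenith, azimuth):
    """zenith, azimuth: arrays of angles in radians, one value per pixel."""
    denom = np.sqrt(1.0 + np.tan(azimuth) ** 2)
    dzdx = np.tan(zenith) / denom                      # equation (5)
    dzdy = np.tan(zenith) * np.tan(azimuth) / denom    # equation (6)
    return dzdx, dzdy
```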
According to an embodiment of the present disclosure, at step 205, a combined gradient value may be obtained based upon the one or more gradients of pixels to identify damage on the homogeneous surface of the object. In an embodiment, the one or more gradients of pixels may be combined to obtain the combined gradient value comprising of the complex number. The combined gradient value helps in identifying damage on the homogeneous surface of the object. The integration of the one or more gradients of pixels may be obtained as denoted by equation (7) below:
Z= dzdx+(-1i* dzdy) equation (7);
where Z denotes the combined gradient value (which is the complex number);
dzdx denotes a real part of the combined gradient value; and
-1i* dzdy denotes an imaginary part of the combined gradient value.
In an embodiment, the angle of gradient corresponding to the combined gradient value may be computed as below for finally identifying damage on the homogeneous surface of the object:
$$\text{angle of gradient} = \tan^{-1}\!\left(\frac{dzdx}{dzdy}\right) \qquad \text{equation (8)}$$
Taking dzdx=0.0813 and dzdy=0.6960 as inputs in equation (8) above, the angle of gradient=1.45 degrees. In an embodiment, based on the angle of gradient computed, damage on the homogeneous surface of the object may be identified. For example, if the angle of gradient for a pixel is 1.45 degrees (as obtained above), while for other pixels it is 1.0, 1.25 or 1.55 degrees, it may be concluded that the homogeneous surface is smooth with no damage, as the homogeneous surface has smooth variations corresponding to the angle of zenith and the angle of azimuth. Similarly, the angle of gradient may then be computed for each of the pixels corresponding to the set of images to identify damage. For example, for another pixel corresponding to the set of images, if the angle of gradient is 7.0 or 15.5 degrees, which is not close to 1.45 degrees, the homogeneous surface may have damage at that pixel. FIG. 7 shows the damage identified on the homogeneous surface using the proposed disclosure.
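By way of illustration only, the combined gradient value of equation (7), the angle of gradient of equation (8), and a simple deviation-based flagging of damaged pixels could be sketched as below. The use of the median angle as the reference and the tolerance value are hypothetical choices of the sketch, not values prescribed by the present disclosure.

```python
# A minimal sketch, assuming equations (7) and (8): combine the per-pixel gradients
# into a complex value, compute the angle of gradient, and flag pixels whose angle
# deviates strongly from the typical (median) angle of the surface.
import numpy as np

def angle_of_gradient(dzdx, dzdy):
    Z = dzdx + (-1j * dzdy)                 # equation (7): combined gradient value
    # Equation (8): tan^-1(dzdx / dzdy); arctan2 avoids division by zero and
    # agrees with the direct ratio whenever dzdy > 0.
    angle = np.arctan2(dzdx, dzdy)
    return angle, Z

def flag_damage(angle_map, tolerance=0.5):
    """Flag pixels whose angle of gradient deviates from the median by more than a
    hypothetical tolerance (in the same angular units as angle_map)."""
    reference = np.median(angle_map)
    return np.abs(angle_map - reference) > tolerance
```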
According to an embodiment of the present disclosure, technical advantages of the present disclosure may now be considered. As discussed above, the traditional systems and methods use a source to send light or electromagnetic waves onto the surface of the object, with appropriate mirrors to aid the reflection, to detect the damaged surface. Also, the traditional systems and methods focus on wafers, where defects are at the sub-micron level and cannot be detected by normal cameras, or provide for damage detection using transparent photographic material or films. The present disclosure provides for identifying damage on the homogeneous surface of the object using the three different polarization angles. Further, using the refractive index of the object, the present disclosure provides for computing the angle of zenith and the angle of azimuth of the surface normal for every pixel of the image. Damage results in a change in the direction of the surface normal, which in turn affects the angle of zenith and the angle of azimuth, as homogeneous surfaces have smooth variations corresponding to the angle of zenith and the angle of azimuth. The presence of damage changes the properties of the object and hence the angle of zenith and the angle of azimuth. The present disclosure provides for computing the combined gradient value and the angle of gradient corresponding to each pixel of the set of images captured, and thus provides for identification of even small damage on surfaces based upon the quantified values, which cannot be performed using the traditional systems and methods. Also, the present disclosure facilitates identifying damage by capturing the set of images using any normal image capturing device, such as a camera.
In an embodiment, the memory 102 can be configured to store any data that is associated with the identification of damage on the homogeneous surface of the object using the set of images obtained from the polarization angles. In an embodiment, the information pertaining to the set of images, the one or more intensities of pixels, the angle of zenith and the angle of azimuth of the surface normal computed, the one or more gradients of pixels etc. are stored in the memory 102. Further, all information (inputs, outputs and so on) pertaining to the identification of damage on the homogeneous surface of the object using the set of images obtained from the polarization angles may also be stored in the database, as history data, for reference purpose.
The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.
The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, BLU-RAYs, flash drives, disks, and any other known physical storage media.
It is intended that the disclosure and examples be considered as exemplary only, with a true scope and spirit of disclosed embodiments being indicated by the following claims.