Abstract: A system (100) for real-time automated fluorescence-based defect detection in an object (104) is presented. The system (100) includes a treatment unit (110) for preparing the object (104), an ultraviolet light imaging subsystem (116) for illuminating and capturing ultraviolet light images (202), a white light imaging subsystem (122) for illuminating and capturing white light images (204), and an image processing system (128) including an acquisition subsystem (206) and a processing subsystem (208) including a defect detection platform (210) configured to process a white light image to generate a geometric line drawing (418) of the object, process an ultraviolet light image to obtain fluorescence signals (422) in the image, perform an image processing operation on the white light and ultraviolet light images to identify non-overlapping fluorescent pixels, generate a filtered ultraviolet image (428), analyze the filtered ultraviolet image (428) to identify defects, and annotate the defects to create an output image (436) to facilitate analysis.
BACKGROUND
[0001] Embodiments of the present specification relate generally to inspection of components or products, and more particularly to systems and methods for automated real-time detection of defects on the surface of the components, while minimizing a number of false positives.
[0002] In the manufacturing industry, it is highly desirable that all the manufacturing units maximize yield, while minimizing discards due to defects. Some sources of the defects may include raw materials, processing equipment, processing technology, and the like. These defects may adversely impact the quality, appearance, corrosion resistance, and/or fatigue strength of the manufactured product. Accordingly, timely detection of any defects in the manufacturing process is essential to optimize yield while minimizing cost and rejects.
[0003] As will be appreciated, surface inspection plays a vital role in ensuring high quality in the manufacturing process. Furthermore, especially in the manufacturing industry, products must be inspected for defects before they can be dispatched as finished goods to reach target markets. By way of example, in products that include engineered components requiring a surface finish, the presence of any surface cracks, dents, rust, and the like is typically tagged as a defect in a given component, and the component is subject to rework or rejection, thereby increasing hidden costs.
[0004] Traditionally, inspectors manually examine the surface of the component or product to detect any defects. However, manual surface inspection entails arduous work by the inspectors. Also, the inspection process is highly dependent on the skill level of the inspector, leading to defects being missed. Additionally, manually inspecting the surface of the product in real-time during the manufacturing process is a challenging task. Moreover, manual inspection methods suffer from a higher rate of false positives, or false indications of defects, thereby disadvantageously impacting the quality of the inspection process.
[0005] Non-destructive testing (NDT) techniques such as fluorescent penetrant inspection (FPI) and magnetic particle inspection (MPI) have been extensively used in the manufacturing industry to detect surface cracks or defects. However, the FPI and MPI inspection techniques entail manual inspection, which in turn may lead to variable results influenced by human factors. Moreover, FPI inspections of components are typically carried out by inspectors under low light conditions and hence may lead to operator fatigue, which in turn may result in defective parts passing inspection undetected. Use of these defective parts may unfortunately result in catastrophic failures of the components and/or products using these components.
[0006] In light of the drawbacks of the manual inspection methods, automating the detection of defects in components is essential to ensure the reliability and safety of the final product. Implementing automated inspection systems during the manufacturing or quality control process enables early identification and prompt addressing of defects. In the recent past, several automated surface inspection methods have been developed to alleviate the issues with manual surface inspection and enhance the consistency, reliability, and productivity of the manufacturing processes. Some automated surface inspection methods entail use of computer vision approaches. However, non-uniform illumination conditions and similarities between certain surface textures and defects impact the efficiency of the computer vision approaches in the identification of the defects.
[0007] Moreover, in recent times, machine learning (ML)/artificial intelligence (AI) techniques have been used to aid in the automated identification of defects on the surface of the components/products. However, currently, identification of defects using these ML techniques entails processing a huge volume of data related to a surface area of the component being inspected, where the data set may include redundant information. Consequently, this processing disadvantageously results in an increase in computational load of the processing systems. Also, the increase in computational load of the processing systems adversely impacts real-time detection of surface defects on the components.
[0008] Furthermore, there have been attempts to automate the FPI defect detection process. Some automated defect detection systems entail the use of Random Forest algorithms that focus on statistical and pattern recognition techniques to identify defects. However, these techniques fail to adequately address the influence of other factors on the accuracy of defect detection, thereby leading to suboptimal performance. Also, various other factors, such as penetrant accumulating on the surface due to surface roughness, the geometry of the surface, insufficient wash off of the penetrant, and the like, present a challenge in training an automated FPI inspection system that can efficiently distinguish actual defects from indications caused by accumulated surface penetrant and other non-defective conditions.
[0009] A major drawback of most of the automated defect detection techniques is that these techniques call out non-defects or “false positives.” These false positives disadvantageously reverse any gain from automation, resulting in increased manual effort to validate that the false positives are not defects. Hence, there exists a constant need to enhance the accuracy and efficiency of automated defect detection, in real-time, while minimizing the call out of false positives.
BRIEF DESCRIPTION
[0010] In accordance with aspects of the present specification, a system for real-time automated defect detection in an object is presented. The system includes a treatment unit configured to prepare the object for fluorescence-based defect detection, where preparing the object includes applying a fluorescent penetrant dye or fluorescent magnetic particles to a surface of the object. Moreover, the system includes an ultraviolet light imaging subsystem configured to illuminate the surface of a prepared object with ultraviolet light and capture one or more ultraviolet light images of the prepared object in response to the ultraviolet illumination, where the ultraviolet light images include fluorescence signals caused by the fluorescent penetrant dye or the fluorescent magnetic particles applied to the prepared object. Furthermore, the system includes a white light imaging subsystem configured to illuminate the surface of the prepared object with white light and capture one or more white light images of the prepared object in response to the white light illumination. 
Additionally, the system includes an image processing system including an acquisition subsystem configured to receive the captured one or more ultraviolet light images and the one or more white light images of the prepared object and a processing subsystem in operative association with the acquisition subsystem and including a defect detection platform configured to process the captured one or more ultraviolet light images and the one or more white light images to detect, in real-time, one or more defects on the prepared object with a reduced number of false positives, where to detect, in real-time, the one or more defects the defect detection platform is configured to process a white light image to generate a geometric line drawing of the prepared object, where the geometric line drawing includes geometric lines representing geometric features of the prepared object, process an ultraviolet light image to obtain the fluorescence signals in the ultraviolet light image, where the fluorescence signals include fluorescent pixels corresponding to one or more defects, fluorescent pixels corresponding to one or more geometric artifacts, fluorescent pixels corresponding to one or more geometric features, or combinations thereof, perform an image processing operation on the white light image and the ultraviolet light image to identify non-overlapping fluorescent pixels, where the non-overlapping fluorescent pixels are representative of fluorescent indications corresponding to one or more defects on the prepared object, generate a filtered ultraviolet light image including the fluorescent indications corresponding to one or more defects on the prepared object, analyze the filtered ultraviolet light image to identify one or more defects on the prepared object, annotate the one or more defects to create an output image, and an interface unit configured to provide, in real-time, the output image, the one or more defects, or both to facilitate analysis, where the system is 
configured to enhance accuracy of the fluorescence-based defect detection by eliminating false positives caused by reflections from the geometric features on the prepared object.
[0011] In accordance with another aspect of the present specification, a method for real-time automated defect detection in an object is presented. The method includes treating the object by applying a fluorescent penetrant dye or fluorescent magnetic particles to a surface of the object to prepare the object for fluorescence-based defect detection. Furthermore, the method includes illuminating the surface of the prepared object with ultraviolet light and capturing one or more ultraviolet light images of the prepared object in response to the ultraviolet illumination, where the one or more ultraviolet light images include fluorescence signals caused by the fluorescent penetrant dye or the fluorescent magnetic particles applied to the prepared object. In addition, the method includes illuminating the surface of the prepared object with white light and capturing one or more white light images of the prepared object in response to the white light illumination. Moreover, the method includes processing the captured one or more ultraviolet light images and the one or more white light images to detect, in real-time, one or more defects on the prepared object with a reduced number of false positives, where detecting, in real-time, the one or more defects includes processing a white light image to generate a geometric line drawing of the prepared object, where the geometric line drawing includes geometric lines representing geometric features of the prepared object, processing an ultraviolet light image to obtain the fluorescence signals in the ultraviolet light image, where the fluorescence signals include fluorescent pixels corresponding to one or more defects, fluorescent pixels corresponding to one or more geometric artifacts, fluorescent pixels corresponding to one or more geometric features, or combinations thereof, performing an image processing operation on the white light image and the ultraviolet light image to identify non-overlapping fluorescent pixels, where the non-overlapping
fluorescent pixels are representative of fluorescent indications corresponding to one or more defects on the prepared object, generating a filtered ultraviolet light image including the fluorescent indications corresponding to one or more defects on the prepared object, analyzing the filtered ultraviolet light image to identify one or more defects on the prepared object, annotating the one or more defects to create an output image, and providing, in real-time, the output image, the one or more defects, or both to facilitate analysis, where the method is configured to enhance accuracy of the fluorescence-based defect detection by eliminating false positives caused by reflections from geometric features in the prepared object.
[0012] In accordance with yet another aspect of the present specification, an image processing system for real-time automated defect detection in an object is presented. The image processing system includes an acquisition subsystem configured to receive one or more ultraviolet light images and one or more white light images of a prepared object. Furthermore, the image processing system includes a processing subsystem in operative association with the acquisition subsystem and including a defect detection platform configured to process the one or more ultraviolet light images and the one or more white light images to detect, in real-time, one or more defects on the prepared object with a reduced number of false positives, where to detect, in real-time, the one or more defects the defect detection platform is configured to process a white light image to generate a geometric line drawing of the prepared object, where the geometric line drawing includes geometric lines representing geometric features of the prepared object, process an ultraviolet light image to obtain fluorescence signals in the ultraviolet light image, where the fluorescence signals include fluorescent pixels corresponding to one or more defects, fluorescent pixels corresponding to one or more geometric artifacts, fluorescent pixels corresponding to one or more geometric features, or combinations thereof, perform an image processing operation on the white light image and the ultraviolet light image to identify non-overlapping fluorescent pixels, where the non-overlapping fluorescent pixels are representative of fluorescent indications corresponding to one or more defects on the prepared object, generate a filtered ultraviolet light image including the fluorescent indications corresponding to one or more defects on the prepared object, analyze the filtered ultraviolet light image to identify one or more defects on the prepared object, annotate the one or more defects to create an output image, and
provide, in real-time, the identified one or more defects, the output image, or both to facilitate analysis, where the image processing system is configured to integrate white light imaging and ultraviolet light imaging with an intersection-based filtering technique to enhance accuracy of fluorescence-based defect detection and reduce a number of false positives in the object being inspected.
DRAWINGS
[0013] These and other features and aspects of embodiments of the present specification will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0014] FIG. 1 is a schematic representation of an exemplary system for real-time automated defect detection in an object, in accordance with aspects of the present specification;
[0015] FIG. 2 is a schematic representation of an exemplary embodiment of the system for real-time automated defect detection in an object of FIG. 1, in accordance with aspects of the present specification;
[0016] FIG. 3 is a schematic illustration of one example of an object being inspected, in accordance with aspects of the present specification;
[0017] FIG. 4 is a flow chart illustrating a method for real-time automated defect detection in an object, in accordance with aspects of the present specification; and
[0018] FIG. 5 is a schematic representation of one embodiment of a digital processing system implementing a defect detection platform for use in the system of FIG. 1, in accordance with aspects of the present specification.
DETAILED DESCRIPTION
[0019] The following description presents exemplary systems and methods for real-time automated defect detection in an object. Particularly, embodiments described hereinafter present exemplary systems and methods that facilitate enhanced automated surface defect detection and quality control of components in manufacturing industries using fluorescent penetrant inspection (FPI) and/or magnetic particle inspection (MPI). These systems and methods facilitate the automated detection of defects in the object, in real-time. In one example, the systems and methods presented hereinafter accurately and efficiently enable the automated detection of defects, in real-time, by integrating ultraviolet (UV) light imaging and white light imaging with an intersection-based filtering approach to significantly reduce false positives. More particularly, the systems and methods enhance the efficiency and accuracy of defect detection and contribute to maintaining a high pass rate of components by swiftly identifying and rectifying any potential issues related to scenarios where fluorescence is caused by factors other than defects, such as geometric reflections, surface roughness, residual penetrant dye, and the like.
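By way of a non-limiting illustration, the integration of UV light imaging, white light imaging, and intersection-based filtering described hereinabove may be sketched as a small image processing pipeline. The sketch assumes grayscale NumPy arrays for both images, a simple gradient-magnitude edge test in place of the platform's actual line-drawing operator, and illustrative threshold and dilation values; it is not the implementation of the defect detection platform itself.

```python
import numpy as np

def geometric_line_drawing(white_img, grad_thresh=50):
    """Approximate the geometric line drawing of the object from a white
    light image via a gradient-magnitude edge test (a stand-in for the
    platform's actual edge operator)."""
    gy, gx = np.gradient(white_img.astype(float))
    return np.hypot(gx, gy) >= grad_thresh

def fluorescent_pixels(uv_img, uv_thresh=128):
    """Fluorescence signals: pixels whose UV-channel intensity exceeds an
    assumed fluorescence threshold."""
    return uv_img >= uv_thresh

def intersection_filter(fluor_mask, line_mask, dilate_px=2):
    """Remove fluorescent pixels that overlap the (dilated) geometric
    lines; the remainder are the non-overlapping fluorescent indications."""
    geo = line_mask.copy()
    for _ in range(dilate_px):  # plus-shaped dilation, pure NumPy
        p = np.pad(geo, 1)
        geo = (p[:-2, 1:-1] | p[2:, 1:-1] | p[1:-1, :-2]
               | p[1:-1, 2:] | p[1:-1, 1:-1])
    return fluor_mask & ~geo

def detect_defects(white_img, uv_img):
    """Full pipeline: returns a filtered UV image containing only the
    fluorescent indications that do not align with the object geometry."""
    lines = geometric_line_drawing(white_img)
    fluor = fluorescent_pixels(uv_img)
    indications = intersection_filter(fluor, lines)
    return np.where(indications, uv_img, 0)
```

In this sketch, a fluorescent spot lying on (or near) a geometric edge of the object is nulled out as a likely geometric reflection, while a spot away from all edges survives into the filtered UV image as a candidate defect indication.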
[0020] Use of the present systems and methods presents significant advantages in revolutionizing the quality of inspection of manufactured components that use FPI or MPI processes for defect detection, thereby overcoming the drawbacks of currently available methods of inspection and detection of defects/anomalies in the manufacturing processes. More particularly, the systems and methods facilitate reduction in the number of false positives during defect detection without sacrificing accuracy of defect detection. This exemplary automation proposed by the systems and methods presented hereinafter streamlines the inspection process, leading to improved overall productivity in industrial quality control that focuses on optimizing post-treatment inspection of components using FPI and MPI.
[0021] For ease of understanding, the exemplary embodiments of the present systems and methods are described in the context of an inspection system configured to provide enhanced real-time automated fluorescence-based detection of defects on a component and/or a product. As will be appreciated, fluorescence-based inspection is a non-destructive testing method that uses fluorescent dyes to detect defects in materials. FPI and MPI processes find application in a variety of industries, including manufacturing, automotive, aerospace, environmental research, biomedical research, and the like.
[0022] Moreover, use of the exemplary embodiments of the present systems and methods presented hereinafter in other systems and applications such as, but not limited to, surface defect inspection and/or detection in automotive and aerospace components, and the like is also contemplated. An exemplary environment that is suitable for practicing various implementations of the present systems and methods is discussed in the following sections with reference to FIG. 1.
[0023] As used herein, the term “object” or “component” refers to an element that combines with other elements or parts to form a bigger entity such as a mechanical assembly or product. Also, as used herein, the term “object” or “component” refers to objects or components being inspected post treatment using FPI or MPI. Also, as used herein, “fluorescence-based inspection” is used to represent a non-destructive testing method that uses fluorescent dyes or fluorescent magnetic particles to detect defects in materials. Fluorescence-based inspection finds application in a wide variety of industries, such as manufacturing, environmental research, biomedical research, automotive, aerospace, and the like. Furthermore, as used herein, the term “false positive” or “false indication” is used to refer to an erroneous situation where a test result incorrectly indicates the presence of a condition. Moreover, as used herein, the term “real-time” is used to refer to imperceptible delays in user experience. In addition, the term “real-time” processing may also be used to encompass “near real-time” processing. As will be appreciated, the real-time processing is typically dependent upon the application.
[0024] Referring now to the drawings, FIG. 1 illustrates an exemplary system 100 for real-time inspection of an object 104 to detect one or more defects 108 on a surface 106 of the object 104, in accordance with aspects of the present specification. The system 100 is configured to facilitate real-time automated fluorescence-based defect detection of the object 104, for example. In one example, the object 104 may be representative of an engineered component for use in a product. The terms “object” and “component” may be used interchangeably.
[0025] It may be noted that, for ease of explanation, the system 100 is described with reference to the real-time inspection of a single object 104. However, the system 100 may be configured to simultaneously inspect multiple objects, in real-time. Reference numeral 104 generally refers to an object being monitored or inspected. In the present example, the object 104 is representative of an engineered component that is being inspected via use of FPI or MPI processes. However, other objects may also be inspected using the system 100.
[0026] In a presently contemplated configuration, the system 100 includes an inspection system 102. As will be appreciated, various defects such as small cracks may be caused by processes used to shape and form a material such as a metal into the object 104. The inspection system 102 may be configured to monitor and inspect the object 104 to automatically detect, in real-time, one or more defects 108 on the surface 106 of the object 104. More particularly, the inspection system 102 may be configured to facilitate the automated fluorescence-based detection of the defects 108 on the surface of the object 104 being inspected, in real-time. In one example, fluorescence-based inspection methods such as FPI or MPI may be employed to aid in the automated defect detection due to the sensitivity of FPI or MPI to small defects.
[0027] In accordance with exemplary aspects of the present specification, the inspection system 102 may be configured to provide the automated fluorescence-based defect detection of the object 104, in real-time, by integrating white light and UV light imaging with an intersection-based filtering approach to significantly reduce false positives in the object 104 being inspected.
[0028] As noted hereinabove, the system 100 facilitates the automated real-time fluorescence-based defect detection in the object 104. Hence, it is desirable to prepare the object 104 being inspected for fluorescence-based defect detection. To that end, in one embodiment, the system 100 and the inspection system 102 in particular may include a treatment unit 110. The treatment unit 110 is configured to prepare or treat the object 104 to render the object 104 suitable for fluorescence-based inspection for defects. In one embodiment, the treatment unit 110 may be configured to prepare the object 104 for inspection via use of an FPI process. In this example, the object 104 may be treated via the FPI process to aid in exhibiting any defect areas on the surface 106 of the object 104 using physical aspects like fluorescence.
[0029] As noted hereinabove, FPI is a non-destructive testing method that uses a fluorescent penetrant such as a dye to detect surface defects 108, if any, on the object 104. To prepare the object 104 for fluorescence-based inspection, the treatment unit 110 may be configured to pre-clean the object 104 thoroughly. The surface 106 of the object 104 is thoroughly cleaned prior to the application of a penetrant to ensure that the surface 106 is free of any contamination such as, but not limited to, oil, dirt, paint, and the like that may penetrate a defect or falsely indicate a defect.
[0030] Further, the treatment unit 110 may be configured to dry the object 104 being inspected. Subsequently, the treatment unit 110 may be configured to apply a penetrant dye to the surface 106 of the object 104. In one example, the penetrant dye may include a liquid fluorescent penetrant dye that is configured to highlight any surface defects. The treatment unit 110 may be configured to allow the penetrant dye to seep into or penetrate any surface discontinuities or surface breaking defects 108 on the object 104. This time of waiting for the penetrant dye to seep into the defects is generally referred to as dwell time. The dwell time varies based on the material of the object 104 and/or size of potential flaws and is typically in the range of ten minutes to about an hour.
[0031] Moreover, subsequent to the passage of the dwell time, the treatment unit 110 may be configured to process the object 104 to remove any excess penetrant dye from the surface 106 of the object 104. It may be noted that excess penetrant dye is removed only from the surface 106 of the object 104. Also, following the removal of any excess penetrant dye, the treatment unit 110 may be configured to apply a developer to the surface 106 of the object 104 to enhance the visibility of the entrapped penetrant dye. The developer provides a contrasting background to facilitate enhanced detection of defects. In addition, the developer also causes the penetrant dye present in the defects to surface and bleed, thereby enabling better detection of defects during the inspection process. The treatment unit 110 may further be configured to allow dwell time or contact time for the developer to act. Consequent to the processing by the treatment unit 110, a prepared or treated object 112 is produced. The treated object 112 may be inspected to detect presence of defects, if any, on the surface of the treated object 112. Reference numeral 114 is used to generally refer to the defect 108 that has been treated by the treatment unit 110.
[0032] Traditionally, in fluorescence-based defect detection, the treated object 112 is inspected in a dark room under ultraviolet (UV) light. The object illuminated by the UV light is then manually inspected by an inspector. The inspector typically manually identifies and marks any areas in question to allow the location of these indications to be subsequently identified easily without the use of the UV lighting. Unfortunately, the identification of defects employing the FPI process is highly dependent on the experience and skill level of the inspector and other environmental factors.
[0033] While automated methods for preparing an object for FPI processing are available, automated defect detection via FPI processing is still a nascent area. Currently available automated defect detection methods often suffer from high rates of false positives, especially in scenarios where fluorescence is caused by factors other than defects, such as geometric reflections or residual dye. Thresholding techniques and Random Forest algorithms have been employed to facilitate automated fluorescence-based defect detection. However, these techniques fail to adequately address geometric influences on fluorescence, leading to suboptimal performance due to a higher number of false positives.
[0034] In some embodiments, an MPI process may be employed to prepare or treat the object 104 being inspected. As will be appreciated, MPI is a non-destructive testing process where a magnetic field is used for detecting surface and shallow subsurface discontinuities in ferromagnetic materials. In this process, the object 104 may be magnetized. Subsequently, finely milled iron particles that are coated in fluorescent dye may be applied to the surface 106 of the object 104. The presence of any surface or subsurface discontinuities such as cracks causes magnetic flux leakage. The fluorescent iron particles are attracted to these areas of flux leakage and cluster to form an indication over the discontinuity. These indications may then be evaluated to determine further course of action. In this example, the object 104 may be treated via the MPI process to aid in exhibiting any defect areas on the surface 106 of the object 104 using physical aspects like magnetism.
[0035] It may be noted that for ease of explanation, the system 100 is described with reference to FPI based fluorescence defect detection. However, use of MPI based fluorescence defect detection is also contemplated.
[0036] In accordance with aspects of the present specification, an exemplary system 100 for the automated fluorescence-based detection of defects that is configured to efficiently distinguish actual defects from false indications, without sacrificing accuracy, is presented. More particularly, the exemplary system 100 is configured to overcome the shortcomings of the currently available methods by integrating white light and UV light imaging with an intersection-based filtering approach to significantly reduce false positives. Specifically, in accordance with aspects of the present specification, the system 100 is configured to recognize that physical defects rarely occur along the geometry of the component or object 112, such as along edges and/or boundaries of the physical object 112. Accordingly, fluorescent indications that do not align with the component geometry are representative of residual fluorescence from defects or deviations that are away from the geometry of the object 112. Conversely, fluorescent indications that align with the geometry of the object 112 may result from UV light reflections due to fluorescence from geometric outlines of the object 112 and/or other geometric artifacts on the object 112 and hence may be identified as false positives and removed from consideration in the defect detection process.
[0037] Hence, in accordance with aspects of the present specification, any fluorescent indications that align with the geometry of the object 112 may be identified as false positives and removed or nulled out, thereby effectively reducing the call out count of any false positives. Furthermore, any fluorescent indications that do not align with the geometric outline of the object 112 may be identified as true or actual defects on the surface of the object 112.
[0038] The system 100 is configured to leverage this insight to effectively reduce the call out count of any false positives. Consequently, the system 100 is configured to advantageously automate the fluorescence-based defect detection in the object 112, thereby enhancing the accuracy of true defect detection, reducing the number of false positives, and minimizing dependencies on the geometry of the object 112. This automation of the fluorescence-based defect detection in the object 112 facilitates the streamlining of the FPI inspection process, thereby leading to improvement in overall performance and productivity in the industrial manufacturing process and quality control process.
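The alignment test described in paragraphs [0036]–[0038] can also be applied per connected fluorescent region rather than per pixel. The following is a minimal sketch, assuming boolean NumPy masks for the fluorescence signals and the geometric line drawing; the 4-connected flood fill and the 50% overlap cutoff are illustrative assumptions, not the platform's actual criteria.

```python
import numpy as np

def classify_indications(fluor_mask, line_mask, overlap_cutoff=0.5):
    """Label each connected fluorescent region as a true defect or a false
    positive, depending on how much of it lies on the geometric lines.

    Returns a list of (pixel_coordinates, is_defect) pairs, in row-major
    scan order of the regions' first pixels.
    """
    visited = np.zeros_like(fluor_mask, dtype=bool)
    results = []
    rows, cols = fluor_mask.shape
    for r in range(rows):
        for c in range(cols):
            if fluor_mask[r, c] and not visited[r, c]:
                # Flood-fill one 4-connected fluorescent region.
                stack, region = [(r, c)], []
                visited[r, c] = True
                while stack:
                    y, x = stack.pop()
                    region.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and fluor_mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                # Fraction of the region coinciding with object geometry;
                # regions mostly on geometry are nulled out as false positives.
                on_geometry = sum(line_mask[y, x] for y, x in region) / len(region)
                results.append((region, on_geometry < overlap_cutoff))
    return results
```

A region-level decision of this kind tolerates a few pixels of incidental overlap, whereas a strict per-pixel intersection would fragment an indication that merely touches an edge.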
[0039] Furthermore, to facilitate the automated fluorescence-based defect detection in the object 112, the inspection system 102 is configured to capture images of the treated object 112. In particular, the inspection system 102 is configured to capture images of the object 112 under UV light and white light.
[0040] In a presently contemplated configuration, the inspection system 102 includes a UV imaging subsystem 116 and a white light imaging subsystem 122. In one embodiment, the UV imaging subsystem 116 may include a UV light illumination unit 118 and a UV light camera unit 120. The UV light illumination unit 118 is configured to illuminate the object 112 with UV light. In one embodiment, the UV light illumination unit 118 may be strategically positioned to illuminate the object 112 so as to create reflections on the surface 106 of the object 112. Additionally, in some embodiments, the orientation and/or intensity of the UV light illumination unit 118 may be configurable to optimize the reflections from the object 112 based on a material of the object 112 and/or surface characteristics of the object 112. Some non-limiting examples of an illumination source for use in the UV light illumination unit 118 include sources of radiation that emit UV light, such as gas discharge lamps like mercury high-pressure lamps and low-pressure lamps, fluorescent tubes, metal halide lamps, xenon arc lamps, and the like.
[0041] Further, the UV light camera unit 120 may be configured to capture one or more UV light images of the object 112 in response to the UV light illumination. In one embodiment, the UV light camera unit 120 may be configured to capture the reflections from the surface 106 of the object 112 being inspected. In one example, the UV light camera unit 120 may be configured to capture the reflections from the surface 106 of the object 112 based on light reflected from the surface 106 and directed towards a field of view (FOV) of the UV light camera unit 120 and generate one or more UV light images. Further, the UV light images are captured under UV light to facilitate identification of areas of fluorescence in the UV light images.
[0042] Moreover, in one embodiment, the white light imaging subsystem 122 may include a white light illumination unit 124 and a white light camera unit 126. The white light illumination unit 124 is configured to illuminate the object 112 with white light. In one embodiment, the white light illumination unit 124 may be strategically positioned to illuminate the object 112 so as to create reflections on the surface 106 of the object 112. Furthermore, in some embodiments, the orientation and/or intensity of the white light illumination unit 124 may be configurable to optimize the reflections from the object 112 based on a material of the object 112 and/or surface characteristics of the object 112. Some non-limiting examples of an illumination source for use in the white light illumination unit 124 include fluorescence light bulbs, white light emitting diodes (LEDs), and the like.
[0043] Also, the white light camera unit 126 may be configured to capture one or more white light images of the object 112 in response to the white light illumination. The white light camera unit 126 may be configured to capture the reflections from the surface 106 of the object 112 being inspected, in one embodiment. Moreover, in one example, the white light camera unit 126 may be configured to capture the reflections from the surface 106 of the object 112 based on light reflected from the surface 106 and directed towards a field of view (FOV) of the white light camera unit 126 and generate one or more white light images. The white light images are captured under white light to facilitate identification of geometric features in the white light images.
[0044] It may be noted that the UV light images and the white light images of the object 112 may be captured sequentially to maintain spatial alignment of detected features in the UV light images and the white light images. Also, sequentially imaging the same object 112 under white light to obtain the white light images to capture geometric features of the object 112 subsequent to imaging the object 112 under UV light to capture the UV light images ensures that fluorescent areas on the surface 106 of the object 112 are clearly defined for further analysis.
[0045] In accordance with aspects of the present application, the inspection system 102 may include an image processing system 128 that may be configured to receive the UV light images and the white light images from the UV light imaging subsystem 116 and the white light imaging subsystem 122 respectively. Additionally, the image processing system 128 may be configured to process the UV light images and the white light images to detect, in real-time, one or more defects on the surface 106 of the prepared object 112 with a reduced number of false positives.
[0046] The image processing system 128 may be configured to process a white light image to generate a geometric line drawing of the object 112 being inspected. It may be noted that the geometric line drawing includes line pixels representing the geometric features of the object 112. In one example, line pixels representing an “edge” or “outline” are set to a non-zero value (typically with a value of “1”), while all other pixels (“non-edge”) are set to zero.
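The binary encoding described above may be sketched as follows. This is a minimal illustration using a hypothetical NumPy array; the specification does not prescribe any particular data representation, and the array contents are invented for the example:

```python
import numpy as np

# Hypothetical 5x5 geometric line drawing: line pixels representing an
# "edge" or "outline" are set to "1", all other ("non-edge") pixels to "0".
line_drawing = np.zeros((5, 5), dtype=np.uint8)
line_drawing[0, :] = 1   # top outline of the object
line_drawing[:, 0] = 1   # left outline of the object

# 9 line pixels in total (5 + 5, minus the shared corner pixel)
print(int(line_drawing.sum()))  # prints 9
```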
[0047] In addition, the image processing system 128 may be configured to process a UV light image to obtain the fluorescence in the UV light image. It may be noted that the fluorescence in the UV light image includes fluorescent pixels corresponding to one or more defects, fluorescent pixels corresponding to geometric artifacts, fluorescent pixels corresponding to one or more geometric features of the object 112, or combinations thereof. The geometric artifacts may include visible imperfections or anomalies present on the surface 106 of the object 112. These anomalies may be caused by the manufacturing process, the FPI or MPI preparation process, or other external factors. Some non-limiting examples of the geometric artifacts include marks, blemishes, uneven textures, visible seams, discolorations, or small irregularities that may potentially impact the appearance or functionality of the object 112. Also, the geometric features may include outlines, edges, embossed attributes, engraved attributes, and other such characteristics of the object 112.
[0048] Moreover, it may be noted that the fluorescence may include overlapping fluorescent pixels and non-overlapping fluorescent pixels. The overlapping or intersecting fluorescent pixels are those fluorescent pixels in the UV light image that align with the geometric line drawing of the object 112 and are caused by UV light reflections due to fluorescence along geometric edges or lines of the object 112. The overlapping fluorescent pixels are generally representative of false positives or false indicators of defects on the object 112. Also, the non-overlapping or non-intersecting fluorescent pixels are those fluorescent pixels that do not align with the geometry of the object 112.
[0049] In accordance with aspects of the present specification, it is desirable to identify the non-overlapping fluorescent pixels to enable accurate identification of the true defects on the object 112. Accordingly, the image processing system 128 may be configured to process the geometric line drawing of the object 112 and the UV light image to identify one or more non-overlapping or non-intersecting fluorescent pixels in the UV light image, where the non-overlapping fluorescent pixels are representative of fluorescent indications corresponding to defects on the object 112.
[0050] In one example, the image processing system 128 may be configured to process the geometric line drawing corresponding to the white light image and the UV light image via a pixel-by-pixel exclusive OR (XOR) operation. Specifically, processing the geometric line drawing and the UV light image via the pixel-by-pixel XOR operation aids in outputting only the non-overlapping fluorescent pixels that are present in the UV light image and are absent in the white light image. The non-overlapping fluorescent pixels are generally representative of fluorescent indications corresponding to one or more defects on the object 112.
[0051] Additionally, as a consequence of the processing of the geometric line drawing and the UV light image via the pixel-by-pixel XOR operation, any intersecting or overlapping fluorescent pixels in the UV light image are excluded. The intersecting or overlapping fluorescent pixels are representative of fluorescent pixels that are present both in the UV light image and the white light image. Further, these overlapping fluorescent pixels are generally caused by reflections from the geometric lines or other geometric features of the object 112 and hence are representative of false positives or false indicators of defects on the object 112.
[0052] Subsequently, the image processing system 128 is configured to generate a filtered UV light image. This filtered UV light image includes only the non-overlapping fluorescent pixels that are present in the UV light image and are absent in the white light image. Additionally, the filtered UV light image has a reduced number of false positives due to the exclusion of the overlapping fluorescent pixels.
[0053] As previously noted, the non-overlapping fluorescent pixels in the output image are representative of fluorescent indications corresponding to one or more defects on the object 112. The image processing system 128 may be configured to analyze the filtered UV light image to identify one or more defects 114 on the surface 106 of the object 112, in real-time. The identification of the defects 114 will be described in greater detail with reference to FIGs. 2-4. Subsequent to the identification of the defects 114, the image processing system 128 may be configured to generate an output image indicating the detected defects 114.
[0054] The identified defects 114, the output image, other annotations or information may be provided to the inspector and/or other processing systems to facilitate any further analysis or for storage and further processing to a computer 130 and/or a data repository. The processing of the UV light images and the white light images by the image processing system 128 advantageously enhances the accuracy of the fluorescence-based defect detection by retaining only the fluorescent pixels that are true indicators of one or more defects on the object 112, while eliminating or excluding the false positives caused by the reflections from the geometric lines of the object 112.
[0055] Additionally, the inspection system 102 may be plugged in between the camera feed and the workstation such as the computer 130 and may be configured to handle processing of the feed data on a real-time basis. The automated fluorescence-based detection, in real-time, of one or more defects 114 on the surface 106 of the object 104 using the exemplary system 100 will be described in greater detail with reference to FIGs. 2-5.
[0056] As previously noted, currently, traditional methods of fluorescence-based inspection of an object typically rely on manual processes, thereby disadvantageously resulting in time-consuming and error-prone processes. In accordance with aspects of the present specification, the shortcomings of the presently available techniques are circumvented by processing the UV light images and the white light images to enhance the accuracy of the fluorescence-based defect detection by eliminating the false positives caused by the reflections from the geometric lines of the object.
[0057] Turning now to FIG. 2, one embodiment 200 of the system 100 of FIG. 1, in accordance with aspects of the present specification, is presented. In a presently contemplated configuration, the system 200 is configured to provide real-time automated fluorescence-based defect detection in an object such as the treated object 112. FIG. 2 is described with reference to the components of FIG. 1.
[0058] As previously described with reference to FIG. 1, the object 104 being inspected is subject to treatment by the treatment unit 110 using an FPI or an MPI process to create the treated object 112. Treating the object 104 via the FPI process or the MPI process aids in enhancing defect visibility in the treated object 112 under UV light. By way of example, in the FPI process, a fluorescent dye is applied to the surface of the object 104 to highlight any surface anomalies/defects by fluorescing under UV light. In another example, in the MPI process, one or more fluorescent magnetic particles may be applied to the surface of the object 104 to highlight any surface anomalies/defects by fluorescing under UV light. In the example of FIG. 2, the object 112 is described as being treated with a fluorescent dye during the FPI process to highlight any surface anomalies.
[0059] In accordance with aspects of the present specification, the fluorescence-based defect detection process facilitated by the inspection system 200 is configured to advantageously highlight fluorescent indications corresponding to one or more defects on the object and reduce the number of false positives by integrating white light imaging and UV light imaging with an intersection-based filtering approach, thereby significantly enhancing the accuracy and efficiency of the detection of defects by the inspection system 200. To that end, the inspection system 200 is configured to capture images of the treated object 112. In particular, the inspection system 200 is configured to capture images of the object 112 under UV light and white light. Accordingly, one or more UV light images and one or more white light images of the treated object 112 may be captured. Capturing the images of the object 112 under both white light and UV light ensures a comprehensive evaluation of the surface 106 of the object 112.
[0060] As previously described with reference to FIG. 1, the UV light imaging subsystem 116 includes the UV light illumination unit 118 and the UV light camera unit 120. The UV light illumination unit 118 is configured to illuminate the object 112 with UV light and the UV light camera unit 120 is configured to capture one or more UV light images of the object 112 in response to the UV light illumination to capture and highlight fluorescent regions on the surface of the object 112. Reference numeral 202 is generally used to represent the UV light images.
[0061] The UV light images 202 are captured under UV light to identify areas of fluorescence. Capturing images under the UV light aids in highlighting any remaining fluorescent stains on the surface 106 of the object 112. These fluorescent stains may be indicative of defects or irregularities that may not be visible under normal lighting conditions. Illuminating the object 112 with UV light and capturing the UV light images 202 ensures a thorough examination of the object 112 and facilitates identification of potential flaws/defects that might compromise the functionality of the object 112. By way of example, UV light reflections that are captured by the UV light images 202 may include fluorescence due to defects having residual fluorescence dye, fluorescence due to geometric artifacts, and false positives due to fluorescence from reflections due to geometric features on the object 112. Hence, it may be desirable to accurately identify the fluorescence due to defects, while eliminating any other sources of fluorescence.
[0062] Similarly, the white light imaging subsystem 122 includes the white light illumination unit 124 and the white light camera unit 126. The white light illumination unit 124 is configured to illuminate the object 112 with white light and the white light camera unit 126 is configured to capture one or more white light images of the object 112 in response to the white light illumination to capture geometric outlines of the object 112. The white light images are generally represented by reference numeral 204.
[0063] Moreover, the same object 112 may be illuminated with white light having a desirable luminosity and the white light images 204 of the object 112 may be captured such that UV reflections are not visible in the white light images 204. In accordance with aspects of the present specification, white light imaging plays a pivotal role in providing an edge drawing or geometric outline of the object 112. This technique utilizes color, brightness, and sharpness to create a detailed representation of the surface. Information related to the geometric outlines and/or geometric features obtained from the white light images 204 may be used as a baseline for understanding the overall condition and characteristics of the object 112.
[0064] Additionally, in accordance with aspects of the present specification, the UV light images 202 and the white light images 204 of the object 112 may be captured sequentially to maintain spatial alignment of detected features in the images 202, 204. In particular, sequentially imaging the same object 112 under white light to obtain the white light images 204 to capture geometric features of the object 112 subsequent to imaging the object 112 under UV light to capture the UV light images 202 ensures that fluorescent areas on the surface 106 of the object 112 are clearly defined for further analysis.
[0065] For ease of illustration and explanation, FIG. 2 will be described with reference to the use of a single UV light image 202 and a single white light image 204.
[0066] In a presently contemplated configuration, as depicted in the embodiment of FIG. 2, the image processing system 128 is configured to receive as input the UV light image 202 and the white light image 204 of the object 112 being inspected and process the images 202, 204 to detect, in real-time, presence of one or more defects 114 on the surface 106 of the object 112. In a presently contemplated configuration, the image processing system 128 includes an acquisition subsystem 206 and a processing subsystem 208 that is operatively and/or communicatively coupled to the acquisition subsystem 206.
[0067] The acquisition subsystem 206 is configured to receive the UV light image 202 and the white light image 204 of the object 112 from the UV light camera unit 120 and the white light camera unit 126 respectively. It may be noted that in one embodiment, the acquisition subsystem 206 may be configured to directly obtain the images 202, 204 from the camera units 120, 126. However, in certain other embodiments, the acquisition subsystem 206 may obtain the images 202, 204 from a storage such as a data repository 216, an optical data storage article such as a compact disc (CD), a digital versatile disc (DVD), a Blu-ray disc, and the like. Further, the acquisition subsystem 206 is configured to communicate the images 202, 204 or image frames to the processing subsystem 208 for further processing.
[0068] Subsequent to receiving the UV light image 202 and the white light image 204 from the acquisition subsystem 206, the processing subsystem 208 is configured to process the images 202, 204 to facilitate automated fluorescence-based detection of defects on the surface 106 of the object 112. In alternative embodiments, the processing subsystem 208 may be configured to retrieve the image frames/video capture from the data repository 216.
[0069] Furthermore, in a non-limiting example, the processing subsystem 208 may include one or more application-specific processors, digital signal processors, microcomputers, graphics processing units, microcontrollers, Application Specific Integrated Circuits (ASICs), Programmable Logic Arrays (PLAs), Field Programmable Gate Arrays (FPGAs), and/or any other suitable processing devices. Also, the data repository 216 may include a hard disk drive, a floppy disk drive, a read/write CD, a DVD, a Blu-ray disc, a flash drive, a solid-state storage device, a local database, and the like.
[0070] In addition, the examples, demonstrations, and/or process steps performed by certain components of the system 200 such as the processing subsystem 208 may be implemented by suitable code on a processor-based system, where the processor-based system may include a general-purpose computer or a special-purpose computer. Also, different implementations of the present specification may perform some or all of the steps described herein in different orders or substantially concurrently.
[0071] The processing subsystem 208 is configured to process the UV light image 202 and the white light image 204 to enhance the accuracy of the fluorescence-based defect detection. In particular, the processing subsystem 208 is configured to eliminate false positives caused by reflections from the geometric lines and/or edges of the object 112 and highlight fluorescent indications corresponding to one or more defects on the object 112, thereby improving the efficiency of fluorescent-based defect detection.
[0072] In a presently contemplated configuration, the processing subsystem 208 is depicted as including a defect detection platform 210. Although the embodiment depicted in FIG. 2 depicts the processing subsystem 208 as including the defect detection platform 210, in some embodiments, the defect detection platform 210 may be employed as a standalone unit that is physically separate from the processing subsystem 208 and/or the image processing system 128. Also, in some embodiments, the defect detection platform 210 may be integrated into end user systems such as, but not limited to, an edge device, such as a phone or a tablet. It may be noted that other implementations of the processing subsystem 208 are also contemplated.
[0073] In accordance with aspects of the present specification, the defect detection platform 210 is configured to accurately and efficiently detect, in real-time, one or more defects, if any, on the surface 106 of the object 112 by processing/analyzing the UV light image 202 and the white light image 204 of the object 112 being inspected. As used hereinafter, the term “object” refers to the treated object 112 that has been prepared for inspection via the FPI process or the MPI process.
[0074] As will be appreciated, it is desirable to automate industrial inspections and quality control processes, particularly in manufacturing environments. In accordance with aspects of the present specification, the image processing system 128 and the defect detection platform 210 in particular may be configured to optimize post-treatment inspection of the object 112, where the object 112 has been treated using FPI or MPI processes. Specifically, the defect detection platform 210 is configured to minimize false positive identifications associated with the current manual processes that disadvantageously cause inefficiencies in defect identification, inefficient utilization of human resources, prolonged time consumption, and low productivity outcomes.
[0075] The defect detection platform 210 is configured to facilitate the automated fluorescence-based detection of defects by efficiently distinguishing actual defects from false indications without sacrificing accuracy. In one embodiment, the defect detection platform 210 is configured to integrate white light and UV light imaging with an intersection-based filtering approach to significantly reduce false positives. In accordance with exemplary aspects of the present specification, the defect detection platform 210 is configured to recognize that physical defects rarely occur along the geometry of the component or object 112, such as along edges and/or boundaries of the object 112.
[0076] As will be appreciated, any fluorescence from the object 112 may include fluorescence due to presence of true defects that do not align with the geometry of the object, fluorescence due to geometric artifacts, and/or fluorescence due to geometric lines and other features of the object 112. By way of example, any surface anomalies that do not align with the geometry of the object 112 may result in UV light reflections that are representative of residual fluorescence from surface defects that are away from the geometry of the object 112. In a similar fashion, anomalies that align with the geometry of the object 112 may result in UV light reflections due to fluorescence from geometric outlines of the object 112 and hence need to be identified as false positives.
[0077] In accordance with aspects of the present specification, any surface defects that do not align with the geometric outline of the object 112 may be identified as probable or true defects on the surface of the object 112, while any surface defects that align with the geometry of the object 112 may be identified as false positives, thereby effectively reducing the call out count of any false positives.
[0078] Accordingly, the defect detection platform 210 may be configured to process the white light image 204 to obtain a geometric line drawing of the object 112. In one embodiment, the defect detection platform 210 may be configured to process the white light image 204 via an edge detection technique to generate the geometric line drawing of the object 112. In one non-limiting example, the edge detection technique may include techniques such as, but not limited to, a Sobel edge detection technique, a Canny edge detection technique, or other edge detection techniques. In other embodiments, the defect detection platform 210 may be configured to employ a thresholding operation to process the white light image 204 to generate the geometric line drawing. The thresholding operation may include an adaptive thresholding technique or a fixed thresholding technique.
[0079] The geometric line drawing of the object 112 depicts geometric features such as geometric lines, outer boundaries, outlines, edges, embossed attributes, engraved attributes, and any other features of the object 112. Consequent to processing the white light image 204 via the edge detection technique or the thresholding technique, a binary image including the geometric line drawing of the object 112 is obtained. The geometric line drawing includes line pixels representing the geometric features of the object 112. In one example, line pixels representing an “edge” or “outline” are set to a non-zero value (typically with a value of “1”), while all other pixels (“non-edge”) are set to zero.
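As a non-limiting sketch of the Sobel option named above, the following Python/NumPy function computes gradient magnitudes over a grayscale white light image and thresholds them into a binary line drawing. The function name, the threshold value of 0.5, and the synthetic bright-square image are assumptions for illustration only, not part of the specification:

```python
import numpy as np

def sobel_line_drawing(gray, threshold=0.5):
    """Generate a binary geometric line drawing from a grayscale white
    light image via Sobel gradients (illustrative sketch only)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    p = np.pad(gray.astype(float), 1, mode="edge")  # keep output size equal to input
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = p[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    mag = np.hypot(gx, gy)
    if mag.max() > 0:
        mag /= mag.max()  # normalize gradient magnitude to [0, 1]
    # Line ("edge") pixels are set to 1, all other ("non-edge") pixels to 0.
    return (mag > threshold).astype(np.uint8)

# Synthetic white light image: a bright square on a dark background.
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0
lines = sobel_line_drawing(img)
```

Only the boundary of the square survives the thresholding; the flat interior and the flat background produce zero gradient and are marked non-edge.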
[0080] Subsequently, the defect detection platform 210 may be configured to process the UV light image 202 to obtain or extract the fluorescence or fluorescence signals in the UV light image 202. In one embodiment, the defect detection platform 210 may be configured to process the UV light image 202 via an intensity threshold technique to obtain the fluorescence signals in the UV light image 202. In some embodiments, prior to obtaining the fluorescence, the defect detection platform 210 may be configured to pre-process the fluorescence signals in the UV light image 202 to enhance the fluorescence signals. By way of example, the defect detection platform 210 may be configured to pre-process the fluorescence signals via a noise reduction technique, an intensity normalization technique, a contrast adjustment technique, or combinations thereof.
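The intensity threshold step, combined with the intensity normalization pre-processing option mentioned above, may be sketched as follows. The function name, the relative threshold of 0.6, and the tiny synthetic UV image are illustrative assumptions, not values mandated by the specification:

```python
import numpy as np

def extract_fluorescence(uv_gray, rel_threshold=0.6):
    """Extract fluorescent pixels from a grayscale UV light image:
    min-max intensity normalization, then a relative intensity
    threshold (illustrative sketch only)."""
    img = uv_gray.astype(float)
    lo, hi = img.min(), img.max()
    if hi > lo:
        img = (img - lo) / (hi - lo)  # intensity normalization to [0, 1]
    # Fluorescent pixels are set to 1, all other pixels to 0.
    return (img > rel_threshold).astype(np.uint8)

# Synthetic UV image: three bright fluorescent pixels on a dim background.
uv = np.array([[10, 10, 10],
               [10, 200, 250],
               [10, 10, 180]])
fluor = extract_fluorescence(uv)
```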
[0081] It may be noted that the obtained fluorescence may include fluorescent pixels corresponding to one or more defects, fluorescent pixels corresponding to one or more geometric artifacts, fluorescent pixels corresponding to one or more geometric features, or combinations thereof. As previously noted, the geometric artifacts may include visible imperfections or anomalies present on the surface of the object 112 that do not align with the geometry of the object 112. Some non-limiting examples of the geometric artifacts may include marks, blemishes, uneven textures, visible seams, discolorations, or small irregularities that may potentially impact the appearance or functionality of the object 112. Also, the geometric features may include geometric lines, outer boundaries, outlines, edges, embossed attributes, engraved attributes, and the like.
[0082] To enhance the accuracy and efficiency of the fluorescence-based defect detection process, it may be desirable to identify the fluorescent pixels that correspond to one or more defects and eliminate other fluorescent pixels. Accordingly, the defect detection platform 210 may be configured to process the line pixels in the geometric line drawing of the object 112 and the fluorescent pixels in the UV light image 202 to identify one or more non-overlapping or non-intersecting fluorescent pixels in the UV light image 202. The non-overlapping fluorescent pixels are representative of fluorescent pixels in the UV light image 202 that do not align with the geometry of the object 112 and are representative of fluorescent indications corresponding to defects on the object 112.
[0083] Furthermore, it may be noted that fluorescent pixels in the UV light image 202 that align with the line pixels in the geometric line drawing of the object 112 are caused by UV light reflections due to fluorescence along geometric edges or lines of the object 112 and are generally representative of false positives or false indicators of defects on the object 112. These fluorescent pixels may be referred to as overlapping or intersecting fluorescent pixels.
[0084] In accordance with aspects of the present specification, it is desirable to identify and isolate the non-overlapping fluorescent pixels in the UV light image 202 to facilitate the accurate identification of true defects on the object 112. Accordingly, the defect detection platform 210 may be configured to process the geometric line drawing corresponding to the white light image 204 and the fluorescence corresponding to the UV light image 202 via an image processing operation to identify the non-overlapping fluorescent pixels in the UV light image 202. In one example, the defect detection platform 210 may be configured to process the line pixels in the geometric line drawing and the fluorescent pixels in the fluorescence corresponding to the UV light image 202 via a pixel-by-pixel exclusive OR (XOR) operation to identify the non-overlapping fluorescent pixels. As will be appreciated, the XOR operator typically takes two binary or gray level images as input, and outputs a third image whose pixel values are just those of the first image XORed with corresponding pixels from the second image. Specifically, processing the line pixels and the fluorescent pixels via the pixel-by-pixel XOR operation aids in outputting only the non-overlapping fluorescent pixels that are present in the UV light image 202 and are absent in the white light image 204. The non-overlapping fluorescent pixels are generally representative of fluorescent indications corresponding to one or more defects on the object 112, as previously noted.
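The pixel-by-pixel XOR operation may be sketched as follows on two hypothetical binary masks. Note that the raw XOR output also contains line-only pixels, so the sketch intersects the XOR result with the fluorescence mask to retain only the non-overlapping fluorescent pixels; the mask contents are invented for the example:

```python
import numpy as np

# Illustrative binary masks (True = set pixel), not real images.
fluor = np.array([[0, 1, 0],
                  [0, 1, 0],
                  [1, 0, 0]], dtype=bool)  # fluorescence from the UV light image
lines = np.array([[0, 1, 1],
                  [0, 0, 0],
                  [0, 0, 0]], dtype=bool)  # geometric line drawing

# Pixel-by-pixel XOR: set where a pixel is present in exactly one image.
xor = np.logical_xor(fluor, lines)

# Intersecting the XOR output with the fluorescence mask drops the
# line-only pixels, leaving only the non-overlapping fluorescent pixels,
# i.e. the filtered UV light image.
filtered = np.logical_and(fluor, xor)
```

The overlapping fluorescent pixel at position (0, 1), which appears in both masks, is excluded as a false positive, while the two non-overlapping fluorescent pixels survive.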
[0085] Additionally, as a consequence of the processing of the white light image 204 and the UV light image 202 via the pixel-by-pixel XOR operation, any intersecting or overlapping fluorescent pixels in the UV light image 202 are excluded. The intersecting or overlapping fluorescent pixels are representative of fluorescent pixels that are present both in the UV light image 202 and the white light image 204. More particularly, the overlapping fluorescent pixels include fluorescent pixels in the UV light image 202 that overlap with the line pixels in the geometric line drawing. Further, these overlapping fluorescent pixels are generally caused by UV reflections from the geometric lines or other geometric features of the object 112 and hence are representative of false positives or false indicators of defects on the object 112.
[0086] Subsequent to this processing of the white light image 204 and the UV light image 202 via the XOR operation, the defect detection platform 210 is configured to generate a filtered UV light image. This filtered UV light image includes only the non-overlapping fluorescent pixels that are present in the UV light image 202 but are absent in the white light image 204. Additionally, consequent to the exclusion of the overlapping fluorescent pixels, the filtered UV light image has a reduced number of false positives.
[0087] As noted hereinabove, the non-overlapping fluorescent pixels in the filtered UV light image are representative of fluorescent indications corresponding to one or more defects on the object 112. In one embodiment, the defect detection platform 210 may be configured to analyze the filtered UV light image to identify one or more defects 114 on the surface 106 of the object 112, in real-time.
[0088] It may be noted that in one example, some of these non-overlapping fluorescent pixels may occur due to non-defects or geometric artifacts on the object 112, while other non-overlapping fluorescent pixels may occur due to actual defects on the surface of the object 112. By way of example, non-defects such as geometric artifacts that include marks, blemishes, uneven textures, visible seams, discolorations, or small irregularities on the surface of the object 112 may result in the occurrence of certain non-overlapping fluorescent pixels. These non-overlapping fluorescent pixels that occur due to the geometric artifacts may adversely impact the accuracy of defect detection. To enhance the efficiency of the fluorescence-based defect detection in real-time, it may be desirable to null out the non-overlapping fluorescent pixels that occur as a result of the geometric artifacts.
[0089] Accordingly, in one embodiment, the defect detection platform 210 may be configured to process the filtered UV light image via an image processing technique to identify and eliminate the non-overlapping fluorescent pixels that occur as a result of the geometric artifacts. In one example, the defect detection platform 210 may be configured to process the non-overlapping fluorescent pixels in the filtered UV light image via an intensity thresholding technique to eliminate the non-overlapping fluorescent pixels that occur due to the geometric artifacts.
[0090] Additionally or alternatively, in some embodiments, the defect detection platform 210 may be configured to process the filtered UV light image to identify and segregate groups of non-overlapping pixels that occur due to the one or more defects from other groups of non-overlapping pixels that occur due to the geometric artifacts. To that end, the defect detection platform 210 may be configured to process the filtered UV light image to identify one or more groups of contiguous non-overlapping pixels in the filtered UV light image. As will be appreciated, contiguous pixels are pixels in an image that are connected to each other. Techniques such as object detection models, morphological operations such as dilation and erosion, and other available contouring algorithms and libraries may be employed to identify the contiguous pixels.
[0091] Furthermore, in accordance with aspects of the present specification, it may be desirable to identify a subset or one or more sub-groups of contiguous non-overlapping pixels of the groups of contiguous non-overlapping pixels that occur due to the one or more defects and exclude the other groups of contiguous non-overlapping pixels that occur due to the geometric artifacts. To that end, in one embodiment, the defect detection platform 210 may be configured to process the groups of contiguous non-overlapping pixels via a thresholding operation to identify the one or more sub-groups of contiguous non-overlapping pixels of the groups of contiguous non-overlapping pixels that occur due to the one or more defects. In one example, the defect detection platform 210 may be configured to use a size threshold that is representative of a number of contiguous non-overlapping fluorescent pixels to identify the groups of contiguous non-overlapping pixels that occur due to the one or more geometric artifacts. Accordingly, the defect detection platform 210 may be configured to process these groups of contiguous non-overlapping pixels via the size threshold to identify and eliminate any groups of contiguous non-overlapping pixels that are smaller than the desired size threshold. By way of a non-limiting example, groups of contiguous non-overlapping pixels that have fewer than five (5) contiguous non-overlapping pixels may be identified as occurring due to the geometric artifacts and may be eliminated.
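The grouping and size-thresholding steps above may be sketched, purely for illustration, as a 4-connected flood fill over a binary mask followed by removal of groups below a size threshold. The function name, the flood-fill approach, and the threshold of five pixels (taken from the non-limiting example above) are assumptions; a production system might instead use a library routine such as a connected-components operation.

```python
# Illustrative sketch: group contiguous non-overlapping fluorescent
# pixels via 4-connected breadth-first search, then eliminate groups
# smaller than a size threshold (artifact-sized specks).
import numpy as np
from collections import deque

def size_filter(mask: np.ndarray, min_size: int = 5) -> np.ndarray:
    """Return a copy of `mask` with connected groups < min_size removed."""
    mask = mask.astype(bool)
    out = np.zeros_like(mask)
    seen = np.zeros_like(mask)
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                # Breadth-first search collects one contiguous group.
                group, queue = [], deque([(r, c)])
                seen[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    group.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(group) >= min_size:  # keep likely defect groups
                    for y, x in group:
                        out[y, x] = True
    return out

# A 6-pixel blob (kept as a candidate defect) and an isolated
# 2-pixel speck (eliminated as a geometric artifact).
m = np.zeros((5, 5), dtype=bool)
m[1:3, 1:4] = True        # 6 contiguous pixels
m[4, 0] = m[4, 1] = True  # 2 contiguous pixels
kept = size_filter(m)
```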
[0092] Subsequently, the defect detection platform 210 may be configured to identify or recognize the remaining groups of contiguous non-overlapping fluorescent pixels as being indicative of actual or true defect regions on the surface of the object 112. Moreover, to enhance the detection of the defects, in one embodiment, the defect detection platform 210 may be configured to process an image having the remaining groups of non-overlapping fluorescent pixels via an image contouring technique to efficiently identify defect regions on the surface of the object 112. As will be appreciated, image contouring is a process of detecting and extracting boundaries or outlines of objects in an image. Image contouring entails identifying points of similar intensity or color that form a continuous curve, thereby outlining the shape of objects in an image. Also, a contour is a continuous line of pixels separating its interior from the rest of the image. Processing the image having the remaining groups of non-overlapping fluorescent pixels via the image contouring technique may entail use of edge detection techniques, thresholding techniques, contour approximation techniques, and the like. Consequent to the processing of the image having the remaining groups of non-overlapping fluorescent pixels by the image contouring technique, a plurality of contours of the remaining groups of non-overlapping fluorescent pixels may be identified and/or annotated to generate an output image.
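As one minimal sketch of the contouring step, a pixel may be treated as lying on a contour if it is set but is not fully surrounded by set 4-neighbours, so that the contour separates each group's interior from the rest of the image, as described above. This boundary-based formulation is an assumption for illustration; the platform could equally employ library contour-extraction routines.

```python
# Illustrative contour sketch over a binary defect mask: a set pixel
# belongs to the contour unless all four of its neighbours are also set.
import numpy as np

def boundary_pixels(mask: np.ndarray) -> np.ndarray:
    """Outline each group of set pixels, separating its interior
    from the rest of the image."""
    mask = mask.astype(bool)
    padded = np.pad(mask, 1, constant_values=False)
    interior = (padded[2:, 1:-1] & padded[:-2, 1:-1] &
                padded[1:-1, 2:] & padded[1:-1, :-2])
    return mask & ~interior  # set pixels that are not fully surrounded

# A filled 4x4 square: its 12 edge pixels form the contour,
# while the inner 2x2 interior is excluded.
square = np.zeros((6, 6), dtype=bool)
square[1:5, 1:5] = True
contour = boundary_pixels(square)
```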
[0093] The defect detection platform 210 may be configured to classify the contours of non-overlapping fluorescent pixels on the output image as true positives or actual defects. Hence, the output image includes contours of the non-overlapping fluorescent pixels that are generally representative of true defects on the object 112. Moreover, the output image includes a reduced number of false positives due to the exclusion of contours of groups of contiguous non-overlapping pixels identified as occurring due to the geometric artifacts. Subsequently, the output image having the true or actual defect regions annotated thereon may be presented for further analysis.
[0094] It may be noted that the remaining contours of fluorescent pixels in the output image may include fluorescent pixels of varying shades of the color of fluorescence. For example, if the color of fluorescence is green, the fluorescent pixels in the remaining contours of fluorescent pixels may exhibit various shades of green and hence adversely impact the visibility of the true defects. In accordance with aspects of the present specification, the color of the remaining contours of fluorescent pixels may be uniformly changed to or be filled with a single shade of a different color to enhance the visibility or presentation of the remaining contours of fluorescent pixels in the output image. By way of example, the various shades of green color of the remaining contours of fluorescent pixels may be uniformly changed to or filled with a red color in the output image to improve the visibility or presentation of the true defects in the object 112. The output image highlighting the true defect regions may be presented for further analysis or decision making.
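The uniform recoloring described above may be sketched as a masked fill in an RGB image, replacing every shade of green inside the detected defect regions with a single red shade. The array names and the exact fill color are illustrative assumptions, not requirements of the specification.

```python
# Illustrative sketch: fill all defect pixels with one uniform red
# shade, regardless of their original fluorescence intensity, to
# enhance the visibility of true defect regions in the output image.
import numpy as np

def highlight_defects(image_rgb: np.ndarray,
                      defect_mask: np.ndarray) -> np.ndarray:
    """Return a copy of the image with defect pixels filled red."""
    out = image_rgb.copy()
    out[defect_mask.astype(bool)] = (255, 0, 0)  # uniform red fill
    return out

# Two defect pixels with different green shades both become pure red.
img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (0, 180, 0)   # bright green fluorescence
img[0, 1] = (0, 90, 0)    # dim green fluorescence
mask = np.array([[True, True], [False, False]])
annotated = highlight_defects(img, mask)
```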
[0095] Moreover, in certain embodiments, the defect detection platform 210 may be configured to process the output image to identify one or more defects 114 on the surface of the object 112, in real-time. In one example, the output image may be superimposed or overlaid on the white light image 204 to facilitate identification of the defects 114 on the object 112.
[0096] The identified defects 114, the output image, and other annotations or information may be provided to the inspector and/or other processing systems to facilitate any further analysis, or may be conveyed to a computer 130 and/or a data repository 216 for storage and further processing. The processing of the UV light images 202 and the white light images 204 by the defect detection platform 210 as described hereinabove advantageously enhances the accuracy of the fluorescence-based defect detection by retaining only the fluorescent pixels that are true indicators of one or more defects on the object 112, while eliminating or excluding the false positives caused by the reflections from the geometric outlines of the object 112.
[0097] With continuing reference to FIG. 2, the image processing system 128 may include a display 212 and a user interface 214. The display 212 and the user interface 214 may overlap in some embodiments, such as in a touch screen. Further, in some embodiments, the display 212 and the user interface 214 may include a common area. The display 212 may be configured to visualize or present the identified defects 114, the output image, any other annotations, and the like. In certain other embodiments, the identified defects 114, the output image, and any other annotations may be stored in a local data repository 216, a remote data repository, a cloud, and the like.
[0098] The user interface 214 of the image processing system 128 may include a human interface device (not shown) that is configured to aid a user such as an inspector in providing inputs or manipulating the identified defects 114, the output image, and any other annotations visualized on the display 212. The user interface 214 may be used to add labels and/or annotations to the information visualized on the display 212. In certain embodiments, the human interface device may include a trackball, a joystick, a stylus, a mouse, or a touch screen. It may be noted that the user interface 214 may be configured to aid the user in navigating through the inputs and/or outcomes/indicators generated by the image processing system 128.
[0099] The systems 100, 200 as described hereinabove provide a robust framework for the automated fluorescence-based defect detection in an object, in real-time. Particularly, the systems 100, 200 enable the real-time automation of surface defect detection and quality control of components in manufacturing industries using FPI or MPI. In one example, the systems 100, 200 presented hereinabove accurately and efficiently enable the automated detection of defects, in real-time, by integrating UV light imaging and white light imaging with an intersection-based filtering approach to significantly reduce false positives, thereby circumventing the shortcomings of the currently available methods.
[0100] Additionally, the systems 100, 200 present a unique technique of analyzing the UV light images and white light images to accurately interpret defect detection and contribute to maintaining a high pass rate of components by swiftly identifying and rectifying any potential issues related to scenarios where fluorescence is caused by factors other than defects, such as geometric reflections, surface roughness, residual penetrant dye, and the like. Moreover, the systems 100, 200 present a geometry-based filtering mechanism to reduce false positives. Specifically, the systems 100, 200 employ physical alignment and intersection analysis between fluorescent regions in the UV light images 202 and the geometric outlines in the white light images 204 to achieve superior accuracy and reduced manual validation efforts.
[0101] Furthermore, the systems 100, 200 as described herein are specifically designed to address the unique requirement of reducing false positives in surface inspections, while maintaining high defect detection accuracy. Specifically, the automated framework for real-time fluorescence-based defect detection in the object provided by the systems 100, 200 revolutionizes the automated fluorescence-based defect detection and quality control technologies by significantly enhancing the accuracy of defect identification, reducing false positives, and minimizing dependencies on the geometry of the object being inspected. The automation of defect detection provided by the systems 100, 200 advantageously streamlines the inspection process, leading to improved overall productivity in industrial quality control.
[0102] FIG. 3 is a schematic illustration 300 of one example 302 of an object 304 being inspected, in accordance with aspects of the present specification. The object 304 may be representative of the treated object 112 of FIG. 2. Reference numeral 306 is generally representative of a geometric outline of the object 304. FIG. 3 is described with reference to the components of FIGs. 1-2.
[0103] As previously noted, the object 304 is treated using an FPI process or an MPI process in preparation for fluorescence-based defect detection. Accordingly, when the object 304 is subject to UV light illumination, various regions on the surface of the object 304 may fluoresce when the object 304 is imaged under UV light. Fluorescence may be exhibited by UV light reflections from true defects having residual fluorescent dye such as a defect 308 on the surface of the object 304. However, as will be appreciated, fluorescence may also be exhibited by UV light reflections from geometric outlines 306 such as edges 310 of the object 304. Additionally, other features on the object 304 such as engraved or embossed text, for example a serial number 312, may also result in fluorescence being exhibited by UV light reflections. Hence, the fluorescence due to the geometric outlines 306 such as edges 310 and other features such as the engraved serial number 312 may be representative of UV light reflections from non-defects and are generally indicative of false positives. These false positives adversely impact the efficiency and accuracy of the automated fluorescence-based defect detection process.
[0104] In accordance with aspects of the present specification, the systems 100, 200 overcome the inefficiencies and adverse impact of the false positives on the defect detection process by integrating white light and UV light imaging with an intersection-based filtering approach to significantly reduce false positives. In particular, a white light image of the object 304 is acquired and processed to obtain a geometric line drawing of the object 304, where the geometric line drawing includes line pixels. Fluorescence is obtained by processing a UV light image of the object 304, where the obtained fluorescence may include fluorescent pixels corresponding to defects, fluorescent pixels corresponding to geometric artifacts, fluorescent pixels corresponding to geometric features of the object, or combinations thereof.
[0105] As previously noted, the non-overlapping fluorescent pixels include fluorescent pixels that are present in the UV light image but are absent in the geometric line drawing. Similarly, the overlapping fluorescent pixels include fluorescent pixels that are present both in the UV light image and the geometric line drawing.
[0106] The line pixels and the fluorescent pixels are processed via an XOR operation, for example, to identify the non-overlapping fluorescent pixels. Simultaneously, the overlapping fluorescent pixels in the UV light image that overlap with line pixels in the geometric line drawing may be identified as false positives and excluded. Exclusion of these overlapping fluorescent pixels or false positives reveals any remaining non-overlapping fluorescent pixels, which may signify a defect or irregularity in the object 304. The remaining non-overlapping fluorescent pixels may be presented as actual or true defect areas. Integrating white light and UV light imaging with a fluorescent intersection-based filtering approach as described hereinabove significantly reduces the number of false positives, while maintaining high defect detection accuracy.
[0107] Embodiments of the exemplary method of FIG. 4 may be described in a general context of computer executable instructions on computing systems or a processor. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types.
[0108] Moreover, the embodiments of the exemplary method may be practiced in a distributed computing environment where optimization functions are performed by remote processing devices that are linked through a wired and/or wireless communication network. In the distributed computing environment, the computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0109] Additionally, in FIG. 4, the exemplary method is illustrated as a collection of blocks in a logical flow chart, which represents operations that may be implemented in hardware, software, firmware, or combinations thereof. It may be noted that the various operations are depicted in the blocks to illustrate the functions that are performed. In the context of software, the blocks represent computer instructions that, when executed by one or more processing subsystems, perform the recited operations.
[0110] Furthermore, the order in which the exemplary method is described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order to implement the exemplary method disclosed herein, or equivalent alternative methods. Further, certain blocks may be deleted from the exemplary method or augmented by additional blocks with added functionality without departing from the spirit and scope of the subject matter described herein.
[0111] Turning to FIG. 4, a flow chart 400 of a method for real-time automated defect detection in an object, in accordance with aspects of the present specification, is presented. In particular, the method 400 entails real-time monitoring and inspection of the object to facilitate fluorescence-based defect detection on a surface of the object. The method 400 of FIG. 4 is described with reference to the components of FIGs. 1-3. Moreover, in certain embodiments, the method 400 may be performed by the inspection system 100 and the defect detection platform 210 in particular.
[0112] The method starts at step 402, where an object such as the object 104 being inspected is treated in preparation for fluorescence-based defect detection. In one embodiment, the treatment unit 110 is configured to prepare or treat the object 104 to render the object 104 suitable for fluorescence-based inspection for defects. Also, the object 104 may be prepared for inspection via use of an FPI process or an MPI process. In one example the object 104 may be treated via the FPI process to aid in exhibiting any defect areas on the surface 106 of the object 104 using physical aspects like fluorescence. In another example, the object 104 may be treated via the MPI process to aid in exhibiting any defect areas on the surface 106 of the object 104 using physical aspects like magnetism. For ease of explanation, the object 104 is described as being treated with a fluorescent dye during the FPI process to highlight any surface anomalies.
[0113] Accordingly, at step 402, the surface of the object 104 may be thoroughly cleaned. Once the object 104 is dried, a penetrant such as a fluorescent dye may be applied to the surface 106 of the object 104. Subsequent to the dwell time of the fluorescent dye, any excess penetrant dye may be removed from the surface 106 of the object 104. Furthermore, a developer may be applied to the surface 106 of the object 104 to enhance the visibility of the entrapped penetrant dye. Consequent to the processing by step 402, a prepared or treated object 112 is produced. The treated object 112 may be inspected to detect presence of defects 114, if any, on the surface of the treated object 112.
[0114] In accordance with aspects of the present specification, the method 400 is configured to advantageously reduce the number of false positives in the fluorescence-based defect detection process by integrating white light imaging and UV light imaging with an intersection-based filtering approach, thereby significantly enhancing the accuracy and efficiency of the detection of defects. To that end, one or more UV light images and one or more white light images of the object 112 may be acquired.
[0115] At step 404, the object 112 may be illuminated via a UV light illumination source such as the UV light illumination unit 118 in the UV light illumination subsystem 116. Moreover, as indicated by step 406, one or more UV light images of the object 112 in response to the UV light illumination may be captured by the UV light camera unit 120 in the UV light illumination subsystem 116. Reference numeral 408 is generally used to represent the UV light images. These UV light images 408 are captured under UV light to identify areas of fluorescence. Furthermore, capturing images under the UV light aids in highlighting any fluorescent regions on the surface of the object 112, where the fluorescent regions may be indicative of defects or irregularities that may not be visible under normal lighting conditions. It may be noted that the UV light images 408 may include fluorescence due to defects having residual fluorescence dye, fluorescence due to geometric artifacts, and false positives due to fluorescence from reflections due to geometric features on the object 112.
[0116] Furthermore, as depicted by step 410, the same object 112 may be illuminated via a white light illumination source such as the white light illumination unit 124 in the white light illumination subsystem 122. Additionally, at step 412, one or more white light images of the object 112 in response to the white light illumination may be captured by the white light camera unit 126 in the white light illumination subsystem 122. Reference numeral 414 is generally used to represent the white light images. The white light images 414 capture geometric outlines of the object 112. Also, UV reflections are not visible in the white light images 414. Moreover, information related to the geometric outlines and/or geometric features obtained from the white light images 414 may be used as a baseline for understanding the overall condition and characteristics of the object 112.
[0117] In accordance with aspects of the present specification, the UV light images 408 and the white light images 414 of the object 112 may be captured sequentially to maintain spatial alignment of detected features in the images 408, 414. Specifically, sequentially imaging the same object 112 under white light to obtain the white light images 414 to capture geometric features of the object 112 subsequent to imaging the object 112 under UV light to capture the UV light images 408 ensures that fluorescent areas on the surface 106 of the object 112 are clearly defined for further analysis.
[0118] Subsequently, the UV light images 408 and the white light images 414 may be processed and analyzed to facilitate the automated fluorescence-based defect detection in the object 112, in real-time. In one example, the defect detection platform 210 in the image processing system 128 may be configured to process UV light images 408 and the white light images 414 to accurately and efficiently detect, in real-time, one or more defects on the surface 106 of the object 112.
[0119] Further, to optimize post-treatment inspection of the object 112, where the object 112 has been treated using FPI or MPI processes, the method 400 is configured to minimize false positive identifications associated with the current manual processes that disadvantageously cause inefficiencies in defect identification, inefficient utilization of human resources, prolonged time consumption, and low productivity outcomes. Specifically, the method 400 entails overcoming the inefficiencies and adverse impact of the false positives on the defect detection process by integrating white light imaging and UV light imaging with an intersection-based filtering approach to significantly reduce false positives.
[0120] In accordance with exemplary aspects of the present specification, method 400 is configured to recognize that physical defects rarely occur along the geometry of the component or object 112, such as along edges and/or boundaries of the object 112.
[0121] As will be appreciated, any fluorescence from the object 112 may include fluorescence due to presence of true defects that do not align with the geometry of the object and fluorescence due to geometric lines and other features of the object 112. Specifically, any surface anomalies that do not align with the geometry of the object 112 may result in UV light reflections that are representative of residual fluorescence from surface defects that are away from the geometry of the object 112. In a similar fashion, anomalies that align with the geometry of the object 112 may result in UV light reflections due to fluorescence from geometric outlines of the object 112 and hence may be identified as false positives.
[0122] Hence, in accordance with aspects of the present specification, any surface defects that do not align with the geometric outline of the object 112 may be identified as probable or true defects on the surface of the object 112, while any surface defects that align with the geometry of the object 112 may be identified as false positives, thereby effectively reducing the call out count of any false positives.
[0123] To facilitate the enhanced accuracy of fluorescence-based defect detection, a white light image 414 may be processed to obtain a geometric line drawing 418 of the object 112, as indicated by step 416. In one embodiment, the white light image 414 may be processed via an edge detection technique to generate the geometric line drawing 418 of the object 112. Some non-limiting examples of the edge detection technique may include techniques such as, but not limited to, a Sobel edge detection technique, a Canny edge detection technique, or other edge detection techniques. In other embodiments, a thresholding operation may be utilized to process the white light image 414 to generate the geometric line drawing 418. The thresholding operation may include an adaptive thresholding technique or a fixed thresholding technique. The geometric line drawing 418 of the object 112 depicts geometric features such as geometric lines, outer boundaries, outlines, edges, embossed attributes, engraved attributes, and any other features or characteristics of the object 112.
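Step 416 may be sketched, for illustration under assumed inputs, as a Sobel gradient computation over a grayscale white light image whose magnitude is thresholded into a binary geometric line drawing. The threshold value and the hand-rolled convolution are illustrative; a Canny detector or a library Sobel routine could equally be used, as noted above.

```python
# Illustrative Sobel sketch of step 416: gradient magnitude of a
# grayscale white light image, thresholded into a binary line drawing
# in which edge pixels are set to 1 and all other pixels to 0.
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def conv3(img: np.ndarray, k: np.ndarray) -> np.ndarray:
    """3x3 sliding-window correlation with zero padding."""
    p = np.pad(img, 1)
    out = np.zeros_like(img, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def line_drawing(gray: np.ndarray, thresh: float = 1.0) -> np.ndarray:
    """Binary edge map: 1 at geometric lines, 0 elsewhere.
    The threshold value is an illustrative assumption."""
    mag = np.hypot(conv3(gray, SOBEL_X), conv3(gray, SOBEL_Y))
    return (mag > thresh).astype(np.uint8)

# A vertical step edge yields line pixels along the intensity boundary.
gray = np.zeros((5, 6))
gray[:, 3:] = 1.0
edges = line_drawing(gray)
```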
[0124] Consequent to processing the white light image 414 via the edge detection technique, a binary image including the geometric line drawing 418 of the object 112 is obtained. Also, the geometric line drawing 418 includes line pixels representing the geometric features of the object 112. In this binary image, line pixels representing an “edge” or “outline” are set to a non-zero value (typically a value of “1”), while all other pixels (“non-edge”) are set to zero.
[0125] Additionally, at step 420, the UV light image 408 may be processed to obtain or extract the fluorescence or fluorescence signals 422 in the UV light image 408. In one embodiment, an intensity thresholding technique may be employed to process the UV light image 408 to obtain the fluorescence signals 422. Further, in some embodiments, prior to extracting the fluorescence signals, the UV light image 408 may be pre-processed to enhance the fluorescence signals 422. In one example, techniques such as a noise reduction technique, an intensity normalization technique, a contrast adjustment technique, or combinations thereof may be employed in the pre-processing.
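The intensity thresholding of step 420 may be sketched as follows, under the assumption of an RGB UV light image in which the penetrant fluoresces predominantly green; the channel choice and threshold value are illustrative assumptions, not requirements of the specification.

```python
# Illustrative sketch of step 420: extract a binary fluorescence mask
# from a UV light image by thresholding the green channel intensity.
import numpy as np

def fluorescence_mask(uv_rgb: np.ndarray, threshold: int = 100) -> np.ndarray:
    """Binary mask of pixels whose green intensity exceeds the threshold."""
    green = uv_rgb[..., 1].astype(int)
    return green > threshold

uv = np.zeros((2, 2, 3), dtype=np.uint8)
uv[0, 0] = (10, 200, 10)  # strong green fluorescence -> in mask
uv[1, 1] = (10, 40, 10)   # dim background glow -> excluded
mask = fluorescence_mask(uv)
```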
[0126] As will be appreciated, the obtained fluorescence signals 422 may include fluorescent pixels corresponding to one or more defects, fluorescent pixels corresponding to one or more geometric artifacts, fluorescent pixels corresponding to one or more features, or combinations thereof. As previously noted, the geometric artifacts may include visible imperfections or anomalies present on the surface 106 of the object 112 that may not align with the geometry of the object 112. Further, as previously noted, some non-limiting examples of the geometric artifacts include marks, blemishes, uneven textures, visible seams, discolorations, or small irregularities that may potentially impact the appearance or functionality of the object 112. Also, the geometric features of the object 112 may include boundaries, edges, outlines, embossed features, engraved features, and the like.
[0127] In addition, it may be desirable to identify and segregate the fluorescent pixels corresponding to one or more defects and eliminate other fluorescent pixels to enhance the accuracy and efficiency of the defect detection process. Accordingly, the line pixels in the geometric line drawing 418 of the object 112 and the fluorescent pixels in the UV light image 408 may be processed to identify one or more non-overlapping or non-intersecting fluorescent pixels in the UV light image 408, as depicted by step 424. These non-overlapping fluorescent pixels are representative of fluorescent indications corresponding to defects on the object 112. Additionally, the non-overlapping fluorescent pixels are representative of fluorescent pixels in the UV light image 408 that do not align with the geometry of the object 112.
[0128] With continuing reference to step 424, it may be noted that fluorescent pixels in the UV light image 408 that align with the line pixels in the geometric line drawing 418 of the object 112 are caused by UV light reflections due to fluorescence along geometric edges or lines of the object 112 and are generally representative of false positives or false indicators of defects on the object 112. These fluorescent pixels may be referred to as overlapping or intersecting fluorescent pixels.
[0129] In accordance with aspects of the present specification, it is desirable to identify the non-overlapping fluorescent pixels in the UV light image 408 to facilitate the accurate identification of true defects on the object 112. Accordingly, at step 424, the line pixels in the geometric line drawing 418 corresponding to the white light image 414 and the fluorescence 422 corresponding to the UV light image 408 may be processed via an image processing operation to identify the non-overlapping fluorescent pixels in the UV light image 408. In one example, the line pixels in the geometric line drawing 418 and the fluorescent pixels in the fluorescence 422 corresponding to the UV light image 408 may be processed via a pixel-by-pixel exclusive OR (XOR) operation to identify the non-overlapping fluorescent pixels. As previously noted, the XOR operator typically takes two binary or gray level images as input, and outputs a third image whose pixel values are just those of the first image XORed with corresponding pixels from the second image. Processing the line pixels and the fluorescent pixels via the pixel-by-pixel XOR operation aids in outputting only the non-overlapping fluorescent pixels that are present in the UV light image 408 and are absent in the white light image 414. These non-overlapping fluorescent pixels are generally representative of fluorescent indications corresponding to one or more defects on the object 112, as previously noted.
[0130] Moreover, consequent to the processing of the white light image 414 and the UV light image 408 via the pixel-by-pixel exclusive OR (XOR) operation, any intersecting or overlapping fluorescent pixels in the UV light image 408 are excluded. The intersecting or overlapping fluorescent pixels are representative of fluorescent pixels that are present both in the UV light image 408 and the white light image 414. In particular, the overlapping fluorescent pixels include fluorescent pixels in the UV light image 408 that overlap with the line pixels in the geometric line drawing 418. Also, these overlapping fluorescent pixels are generally caused by UV reflections from the geometric lines or other geometric features of the object 112 and hence are representative of false positives or false indicators of defects on the object 112.
[0131] Subsequent to this processing of the white light image 414 and the UV light image 408 via the XOR operation, a filtered UV light image 428 is generated, as indicated by step 426. This filtered UV light image 428 includes only the non-overlapping fluorescent pixels that are present in the UV light image 408 but are absent in the white light image 414. Moreover, excluding the overlapping fluorescent pixels results in the filtered UV light image 428 having a reduced number of false positives.
[0132] Furthermore, at step 430, the filtered UV light image 428 may be analyzed to identify one or more defects 114 on the surface of the object 112, in real-time. As noted hereinabove, the non-overlapping fluorescent pixels in the filtered UV light image 428 are representative of fluorescent indications corresponding to one or more defects on the object 112. Moreover, as previously noted, some of these non-overlapping fluorescent pixels may occur due to non-defects or geometric artifacts on the object 112, while other non-overlapping fluorescent pixels may occur due to actual defects on the surface of the object 112. By way of example, non-defects such as geometric artifacts that include marks, blemishes, uneven textures, visible seams, discolorations, or small irregularities on the surface of the object 112 may result in the occurrence of some non-overlapping fluorescent pixels. These non-overlapping fluorescent pixels that occur due to the geometric artifacts may adversely impact the accuracy of defect detection. To enhance the efficiency of the fluorescence-based defect detection in real-time, it may be desirable to null out the non-overlapping fluorescent pixels that occur as a result of the geometric artifacts.
[0133] Accordingly, at step 430, the filtered UV light image 428 may be processed via an image processing technique to identify and eliminate the non-overlapping fluorescent pixels that occur as a result of the geometric artifacts. In one example, an intensity thresholding technique may be employed to eliminate the non-overlapping fluorescent pixels that occur due to the geometric artifacts.
[0134] With continuing reference to step 430, additionally or alternatively, in some embodiments, the filtered UV light image 428 may be processed to identify and segregate groups of non-overlapping pixels that occur due to the one or more defects from other groups of non-overlapping pixels that occur due to the geometric artifacts. In particular, the filtered UV light image 428 may be processed to identify one or more groups of contiguous non-overlapping pixels in the filtered UV light image 428. As will be appreciated, contiguous pixels are pixels in an image that are connected to each other. Techniques such as object detection models, morphological operations such as dilation and erosion, and other available contouring algorithms and libraries may be employed to identify the contiguous pixels.
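The identification of contiguous pixels can be sketched as a connected-component search. The following is an illustrative Python sketch using a breadth-first flood fill over 4-connected neighbors; a production system would more likely use the morphological operations or contouring libraries named above:

```python
from collections import deque

def label_contiguous_groups(mask):
    """Group 4-connected foreground pixels via breadth-first flood fill.
    Returns a list of groups, each a list of (row, col) coordinates."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    groups = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                group, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    group.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                groups.append(group)
    return groups

# Two separate contiguous groups in a toy binary mask.
mask = [
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]
groups = label_contiguous_groups(mask)
```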
[0135] Furthermore, in accordance with aspects of the present specification, it may be desirable to identify one or more sub-groups of contiguous non-overlapping pixels that occur due to the one or more defects and exclude the other groups of contiguous non-overlapping pixels that occur due to the geometric artifacts. To that end, in one embodiment, the groups of contiguous non-overlapping pixels may be processed via a thresholding operation to distinguish the one or more sub-groups of contiguous non-overlapping pixels that occur due to the one or more defects from the other groups of contiguous non-overlapping pixels that occur due to the geometric artifacts. In one example, a size threshold that is representative of a number of contiguous non-overlapping fluorescent pixels may be utilized to identify the groups of contiguous non-overlapping pixels that occur due to the geometric artifacts. Accordingly, these groups of contiguous non-overlapping pixels may be processed via the size threshold to identify and eliminate any groups of contiguous non-overlapping pixels that are smaller than the desired size threshold. By way of a non-limiting example, groups having fewer than five (5) contiguous non-overlapping pixels may be identified as occurring due to the geometric artifacts and may be eliminated. In addition, at step 430, the remaining groups of contiguous non-overlapping fluorescent pixels 432 may be recognized as being indicative of actual or true defect regions on the surface of the object 112.
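The size-threshold screening described above might be sketched as follows; the five-pixel threshold mirrors the non-limiting example in the text, and the group representation (lists of pixel coordinates) is an assumption for illustration:

```python
def filter_groups_by_size(groups, min_size=5):
    """Discard groups smaller than the size threshold; small groups are
    treated as geometric artifacts rather than defect indications."""
    return [group for group in groups if len(group) >= min_size]

groups = [
    [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)],  # likely defect
    [(4, 4), (4, 5)],                                   # likely artifact
]
defect_groups = filter_groups_by_size(groups, min_size=5)
```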
[0136] Moreover, to enhance the detection of the defects, in one embodiment, an image having the remaining groups of non-overlapping fluorescent pixels 432 may be processed via an image contouring technique to efficiently identify defect regions on the surface of the object 112. As will be appreciated, image contouring is a process of detecting and extracting boundaries or outlines of objects in an image. Image contouring entails identifying points of similar intensity or color that form a continuous curve, thereby outlining the shape of objects in an image. Also, a contour is a continuous line of pixels separating its interior from the rest of the image. Processing the image having the remaining groups of non-overlapping fluorescent pixels 432 via the image contouring technique may entail use of edge detection techniques, thresholding techniques, contour approximation techniques, and the like. Consequent to the processing of the image having the remaining groups of non-overlapping fluorescent pixels 432 by the image contouring technique, a plurality of contours of the remaining groups of non-overlapping fluorescent pixels may be identified. Subsequently, at step 434, these contours may be annotated to generate an output image 436.
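A minimal sketch of the contouring idea, for illustration only: a foreground pixel is treated as part of a contour when at least one of its 4-neighbors is background (or lies outside the image), which yields the boundary pixels separating a region's interior from the rest of the image. Practical implementations would typically rely on dedicated contour-approximation routines rather than this naive scan:

```python
def trace_contour_pixels(mask):
    """Mark foreground pixels that border the background (or the image
    edge); together these pixels outline each defect region."""
    rows, cols = len(mask), len(mask[0])

    def is_background(y, x):
        return not (0 <= y < rows and 0 <= x < cols) or not mask[y][x]

    return [
        (r, c)
        for r in range(rows)
        for c in range(cols)
        if mask[r][c] and any(
            is_background(r + dy, c + dx)
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
        )
    ]

# A solid 3x3 region: the eight border pixels form the contour and
# the interior pixel (1, 1) is excluded.
mask = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
contour = trace_contour_pixels(mask)
```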
[0137] In one embodiment, the annotated contours of non-overlapping fluorescent pixels on the output image 436 may be classified as true positives or actual defects 438. Hence, the output image 436 includes contours of the non-overlapping fluorescent pixels 432 that are generally representative of true defects 438 on the object 112. Moreover, the output image 436 includes a reduced number of false positives due to the exclusion of contours of groups of contiguous non-overlapping pixels identified as occurring due to the geometric artifacts. Subsequently, the output image 436 having the true or actual defect regions 438 annotated thereon may be presented for further analysis.
[0138] Subsequently, at step 440, the remaining contours of fluorescent pixels in the output image 436 may be presented as true or actual defect regions 438. In particular, the remaining contours of fluorescent pixels in the output image 436 may signify defects or irregularities 438 on the object 112.
[0139] It may be noted that the remaining contours of fluorescent pixels in the output image 436 may include fluorescent pixels of varying shades of the color of fluorescence. For example, if the color of fluorescence is green, the fluorescent pixels in the remaining contours of fluorescent pixels may exhibit various shades of green and hence may adversely impact the visibility of the true defects. In some embodiments, at step 440, the color of the remaining contours of fluorescent pixels may be uniformly changed to or be filled with a single shade of a different color to enhance the visibility or presentation of the remaining contours of fluorescent pixels in the output image 436. By way of example, the various shades of green color of the remaining contours of fluorescent pixels may be uniformly changed to or filled with a single shade of red color in the output image 436 to improve the visibility or presentation of the true defects 438 in the object 112. The output image 436 highlighting the true defect regions 438 may be presented for further analysis or decision making.
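The recoloring step can be sketched as filling the defect pixels with one uniform highlight color. This is an illustrative sketch assuming an RGB-tuple image representation; the single shade of red (255, 0, 0) follows the example in the text:

```python
def recolor_defect_pixels(rgb_image, defect_pixels, color=(255, 0, 0)):
    """Fill every defect pixel with a single highlight color so that
    varying shades of green fluorescence do not obscure the true
    defects. Returns a new image; the input is left unchanged."""
    out = [row[:] for row in rgb_image]
    for y, x in defect_pixels:
        out[y][x] = color
    return out

# Varying shades of green fluorescence; two pixels belong to defects.
image = [[(0, 120, 0), (0, 80, 0)], [(0, 0, 0), (0, 200, 0)]]
highlighted = recolor_defect_pixels(image, [(0, 1), (1, 1)])
```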
[0140] Moreover, in some embodiments, the output image 436 may be further processed to identify one or more defects 438 on the surface of the object 112, in real-time. In one example, the output image 436 may be superimposed or overlaid on the white light image 414 to facilitate identification of the defects 438 on the object 112.
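The superimposition mentioned above can be sketched as simple alpha blending of the output image over the white light image; the blending weight and the RGB-tuple representation are assumptions for illustration:

```python
def overlay(base, annotated, alpha=0.6):
    """Blend the annotated output image over the white light image so
    that defect annotations appear in the context of the object's
    geometry. alpha weights the annotated image."""
    return [
        [
            tuple(round(alpha * a + (1 - alpha) * b)
                  for a, b in zip(annotated_px, base_px))
            for annotated_px, base_px in zip(annotated_row, base_row)
        ]
        for annotated_row, base_row in zip(annotated, base)
    ]

base = [[(100, 100, 100)]]       # white light image pixel
annotated = [[(255, 0, 0)]]      # red defect annotation
blended = overlay(base, annotated, alpha=0.6)
```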
[0141] The identified defects 438, the output image 436, and other annotations or information may be provided to the inspector and/or other processing systems to facilitate further analysis, or may be conveyed to a computer 130 and/or a data repository 216 for storage and further processing. The processing of the UV light images 404 and the white light images 414 by steps 402-440 of the method 400 as described hereinabove advantageously enhances the accuracy of the fluorescence-based defect detection by retaining only the fluorescent pixels that are true indicators of one or more defects 438 on the object 112, while eliminating or excluding the false positives caused by the reflections from the geometric outlines of the object 112.
[0142] The method 400 for real-time automated defect detection in an object as described hereinabove facilitates real-time automated fluorescence-based defect detection in objects and quality control in manufacturing industries using FPI or MPI. Additionally, the method 400 presents a unique technique of analyzing the UV light images and white light images to accurately detect defects and contributes to maintaining a high pass rate of components by swiftly identifying and rectifying any potential issues related to scenarios where fluorescence is caused by factors other than defects, such as geometric reflections, surface roughness, residual penetrant dye, and the like. Specifically, the method 400 revolutionizes the automated fluorescence-based defect detection and quality control technologies, significantly enhancing the accuracy of defect identification, reducing false positives, and minimizing dependencies on the geometry of the object being inspected. The automation of defect detection provided by the method 400 advantageously streamlines the inspection process, leading to improved overall productivity in industrial quality control.
[0143] Referring now to FIG. 5, a schematic representation 500 of one embodiment 502 of a digital processing system implementing the image processing system 128 (see FIG. 1), in accordance with aspects of the present specification, is depicted. Also, FIG. 5 is described with reference to the components of FIGs. 1-4.
[0144] It may be noted that while, in FIG. 1, the defect detection platform 210 is shown as being a part of the image processing system 128, in certain embodiments, the defect detection platform 210 may also be integrated into end user systems such as, but not limited to, an edge device, such as a phone or a tablet. Moreover, the example of the digital processing system 502 presented in FIG. 5 is for illustrative purposes. Other designs are also anticipated.
[0145] The digital processing system 502 may contain one or more processors such as a central processing unit (CPU) 504, a random access memory (RAM) 506, a secondary memory 508, a graphics controller 510, a display unit 512, a network interface 514, and an input interface 516. It may be noted that the components of the digital processing system 502 except the display unit 512 may communicate with each other over a communication path 518. In certain embodiments, the communication path 518 may include several buses, as is well known in the relevant arts.
[0146] The CPU 504 may execute instructions stored in the RAM 506 to provide several features of the present specification. Moreover, the CPU 504 may include multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, the CPU 504 may include only a single general-purpose processing unit.
[0147] Furthermore, the RAM 506 may receive instructions from the secondary memory 508 using the communication path 518. Also, in the embodiment of FIG. 5, the RAM 506 is shown as including software instructions constituting a shared operating environment 520 and/or other user programs 522 (such as other applications, DBMS, and the like). In addition to the shared operating environment 520, the RAM 506 may also include other software programs such as device drivers, virtual machines, and the like, which provide a (common) run time environment for execution of other/user programs.
[0148] With continuing reference to FIG. 5, the graphics controller 510 is configured to generate display signals (e.g., in RGB format) for display on the display unit 512 based on data/instructions received from the CPU 504. The display unit 512 may include a display screen to display images defined by the display signals. Furthermore, the input interface 516 may correspond to a keyboard and a pointing device (e.g., a touchpad, a mouse, and the like) and may be used to provide inputs. In addition, the network interface 514 may be configured to provide connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems connected to a network, for example.
[0149] Moreover, the secondary memory 508 may include a hard drive 526, a flash memory 528, and a removable storage drive 530. The secondary memory 508 may store data generated by the system 100 (see FIG. 1) and software instructions (for example, for implementing the various features of the present specification), which enable the digital processing system 502 to provide several features in accordance with the present specification. The code/instructions stored in the secondary memory 508 may either be copied to the RAM 506 prior to execution by the CPU 504 for higher execution speeds or may be directly executed by the CPU 504.
[0150] Some or all of the data and/or instructions may be provided on a removable storage unit 532, and the data and/or instructions may be read and provided by the removable storage drive 530 to the CPU 504. Further, the removable storage unit 532 may be implemented using medium and storage format compatible with the removable storage drive 530 such that the removable storage drive 530 can read the data and/or instructions. Thus, the removable storage unit 532 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can also be in other forms (e.g., non-removable, random access, and the like).
[0151] It may be noted that as used herein, the term “computer program product” is used to generally refer to the removable storage unit 532 or a hard disk installed in the hard drive 526. These computer program products are means for providing software to the digital processing system 502. The CPU 504 may retrieve the software instructions and execute the instructions to provide various features of the present specification.
[0152] Also, the term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may include non-volatile media and/or volatile media. Non-volatile media include, for example, optical disks, magnetic disks, or solid-state drives, such as the secondary memory 508. Volatile media include dynamic memory, such as the RAM 506. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
[0153] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, the transmission media may include coaxial cables, copper wire, and fiber optics, including the wires that include the communication path 518. Moreover, the transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infrared data communications.
[0154] Reference throughout this specification to “one embodiment,” “an embodiment,” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present specification. Thus, appearances of the phrases “in one embodiment,” “in an embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0155] Furthermore, the described features, structures, or characteristics of the specification may be combined in any suitable manner in one or more embodiments. In the description presented hereinabove, numerous specific details are provided such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, and the like, to provide a thorough understanding of embodiments of the specification.
[0156] The aforementioned components may be dedicated hardware elements such as circuit boards with digital signal processors or may be software running on a general-purpose computer or processor such as a commercial, off-the-shelf personal computer (PC). The various components may be combined or separated according to various embodiments of the invention.
[0157] Moreover, the foregoing examples, demonstrations, and process steps such as those that may be performed by the system may be implemented by suitable code on a processor-based system, such as a general-purpose or special-purpose computer. It should also be noted that different implementations of the present specification may perform some or all of the steps described herein in different orders or substantially concurrently, that is, in parallel. Furthermore, the functions may be implemented in a variety of programming languages, including but not limited to C++, Python, and Java. Such code may be stored or adapted for storage on one or more tangible, machine readable media, such as on data repository chips, local or remote hard disks, optical disks (that is, CDs or DVDs), memory or other media, which may be accessed by a processor-based system to execute the stored code. Note that the tangible media may include paper or another suitable medium upon which the instructions are printed. For instance, the instructions may be electronically captured via optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in the data repository or memory.
[0158] Embodiments of the systems and methods for real-time automated defect detection in an object presented hereinabove provide a robust framework for automated fluorescence-based defect detection in an object, in real-time, that facilitates defect detection and quality control of components in manufacturing industries using FPI or MPI. More particularly, implementing the systems and methods as described hereinabove accurately and efficiently enables the automated detection of defects, in real-time, by integrating UV light imaging and white light imaging with an intersection-based filtering approach to significantly reduce false positives, thereby circumventing the shortcomings of the currently available methods.
[0159] Moreover, the systems and methods present a unique technique of analyzing the UV light images and white light images to accurately detect defects and contribute to maintaining a high pass rate of components by swiftly identifying and rectifying any potential issues related to scenarios where fluorescence is caused by factors other than defects, such as geometric reflections, surface roughness, residual penetrant dye, and the like. Specifically, the systems and methods present a geometry-based filtering that employs physical alignment and intersection analysis between fluorescent regions in the UV light images and the geometric outlines in the white light images to achieve superior defect detection accuracy, minimized false positives, and reduced manual validation efforts.
[0160] Also, the systems and methods described herein are specifically designed to address the unique requirement of reducing false positives in surface inspections, while maintaining high defect detection accuracy. Use of the automated framework for real-time automated fluorescence-based defect detection in the object provided by the systems and methods presented herein provides significant advantages and revolutionizes the automated fluorescence-based defect detection and quality control technologies by enhancing the accuracy of defect identification, reducing false positives, and minimizing dependencies on the geometry of the object being inspected. Moreover, the automation of defect detection provided by the systems and methods advantageously streamlines the inspection process, leading to improved overall productivity in industrial quality control.
[0161] In addition, the systems and methods for real-time automated defect detection in an object present a targeted and comprehensive approach to automating industrial inspections and quality control processes, particularly in manufacturing environments, and focus on optimizing post-treatment inspection of components using FPI and MPI, thereby circumventing the shortcomings associated with current manual processes that cause inefficiencies in defect identification, inefficient utilization of human resources, prolonged time consumption, and low productivity outcomes. Moreover, the systems and methods presented hereinabove advantageously provide varying degrees of automation for defect detection.
[0162] Although specific features of embodiments of the present specification may be shown in and/or described with respect to some drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics may be combined and/or used interchangeably in any suitable manner in the various embodiments.
[0163] While only certain features of the present specification have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the present specification is intended to cover all such modifications and changes as fall within the true spirit of the invention.
CLAIMS:

1. A system (100, 200) for real-time automated defect detection in an object (104), the system (100, 200) comprising:
a treatment unit (108) configured to prepare the object (104) for fluorescence-based defect detection, wherein preparing the object (104) comprises applying a fluorescent penetrant dye or fluorescent magnetic particles to a surface (106) of the object (104);
an ultraviolet light imaging subsystem (116) configured to:
illuminate the surface (106) of a prepared object (112) with ultraviolet light;
capture one or more ultraviolet light images (202, 408) of the prepared object (112) in response to the ultraviolet light illumination, wherein the ultraviolet light images (202, 408) comprise fluorescence signals (422) caused by the fluorescent penetrant dye or the fluorescent magnetic particles applied to the prepared object (112);
a white light imaging subsystem (122) configured to:
illuminate the surface (106) of the prepared object (112) with white light;
capture one or more white light images (204, 414) of the prepared object (112) in response to the white light illumination;
an image processing system (128) comprising:
an acquisition subsystem (206) configured to receive the captured one or more ultraviolet light images (202, 408) and the one or more white light images (204, 414) of the prepared object (112);
a processing subsystem (208) in operative association with the acquisition subsystem (206) and comprising:
a defect detection platform (210) configured to process the captured one or more ultraviolet light images (202, 408) and the one or more white light images (204, 414) to detect, in real-time, one or more defects (114, 438) on the prepared object (112) with a reduced number of false positives, wherein to detect, in real-time, the one or more defects (114, 438) the defect detection platform (210) is configured to:
process a white light image (204, 414) to generate a geometric line drawing (418) of the prepared object (112), wherein the geometric line drawing (418) comprises geometric lines representing geometric features of the prepared object (112);
process an ultraviolet light image (202, 408) to obtain the fluorescence signals (422) in the ultraviolet light image (202, 408), wherein the fluorescence signals (422) comprise fluorescent pixels corresponding to one or more defects, fluorescent pixels corresponding to one or more geometric artifacts, fluorescent pixels corresponding to one or more geometric features, or combinations thereof;
perform an image processing operation on the white light image (204, 414) and the ultraviolet light image (202, 408) to identify non-overlapping fluorescent pixels, wherein the non-overlapping fluorescent pixels are representative of fluorescent indications corresponding to one or more defects (114, 438) on the prepared object (112);
generate a filtered ultraviolet light image (428) comprising the fluorescent indications corresponding to defects (114, 438) on the prepared object (112);
analyze the filtered ultraviolet light image (428) to identify one or more defects (114, 438) on the prepared object (112);
annotate the one or more defects (114, 438) to create an output image (436); and
an interface unit (224, 226) configured to provide, in real-time, the output image (436), the one or more defects (114, 438), or both to facilitate analysis,
wherein the system (100, 200) is configured to enhance accuracy of the fluorescence-based defect detection by eliminating false positives caused by reflections from the geometric features on the prepared object (112).
2. The system (100, 200) of claim 1, wherein the object (104, 112, 304) comprises a component or a product being inspected.
3. The system (100, 200) of claim 1, wherein the defect detection platform (210) is configured to integrate white light imaging and ultraviolet light imaging with an intersection-based filtering technique to provide enhanced accuracy of fluorescence-based defect detection and reduce a number of false positives in the object (112, 304) being inspected.
4. The system (100, 200) of claim 1, wherein the treatment unit (110) is configured to prepare the object (104) using a Fluorescence Penetrant Inspection (FPI) process or a Magnetic Particle Inspection (MPI) process.
5. The system (100, 200) of claim 4, wherein to prepare the object (104) for inspection by fluorescent-based defect detection, the treatment unit (108) is configured to:
pre-clean and dry the object (104);
apply a fluorescent penetrant dye or fluorescent magnetic particles to the surface (106) of the object (104);
remove excess fluorescent penetrant dye or fluorescent magnetic particles from the surface (106) of the object (104); and
apply a developer to the surface (106) of the object (104) to enhance defect visibility under ultraviolet light.
6. The system (100, 200) of claim 1, wherein the system (100, 200) is configured to capture the one or more ultraviolet light images (202, 408) and the one or more white light images (204, 414) of the prepared object (112) sequentially to maintain spatial alignment of detected features.
7. The system (100, 200) of claim 1, wherein the ultraviolet light imaging subsystem (116) is configured to capture the one or more ultraviolet light images (202, 408) under ultraviolet light to facilitate identification of areas of fluorescence in the one or more ultraviolet light images (202, 408).
8. The system (100, 200) of claim 1, wherein the white light imaging subsystem (122) is configured to capture the one or more white light images (204, 414) under white light to facilitate identification of geometric features in the one or more white light images (204, 414).
9. The system (100, 200) of claim 1, wherein to generate the geometric line drawing (418) of the prepared object (112), the defect detection platform (210) is configured to process the white light image (204, 414) via a thresholding operation, an edge detection technique, or a combination thereof to identify the geometric lines representing structural features of the prepared object (112).
10. The system (100, 200) of claim 9, wherein the thresholding operation comprises an adaptive thresholding technique or a fixed thresholding technique to isolate the geometric lines of the prepared object (112), and wherein the edge detection technique comprises a Sobel edge detection technique, a Canny edge detection technique, or other edge detection techniques.
11. The system (100, 200) of claim 1, wherein the defect detection platform (210) is configured to pre-process the fluorescence signals (422) in the ultraviolet light image (202, 408) to enhance the fluorescence signals (422), and wherein pre-processing the fluorescence signals (422) comprises processing the fluorescence signals (422) via a noise reduction technique, an intensity normalization technique, a contrast adjustment technique, or combinations thereof.
12. The system (100, 200) of claim 1, wherein to perform the image processing operation on the white light image (204, 414) and the ultraviolet light image (202, 408) to identify the non-overlapping fluorescent pixels, the defect detection platform (210) is configured to perform a pixel-by-pixel exclusive OR (XOR) operation between the white light image (204, 414) and the ultraviolet light image (202, 408) to output only the non-overlapping fluorescent pixels that are present in the ultraviolet light image (202, 408) and are absent in the white light image (204, 414).
13. The system (100, 200) of claim 12, wherein the pixel-by-pixel XOR operation between the white light image (204, 414) and the ultraviolet light image (202, 408) is configured to exclude overlapping fluorescent pixels that are present in the ultraviolet light image (202, 408) and the white light image (204, 414), and wherein the overlapping fluorescent pixels are representative of false positives caused by reflections due to geometric features on the prepared object (112).
14. The system (100, 200) of claim 1, wherein to analyze the filtered ultraviolet light image (428), the defect detection platform (210) is configured to:
identify one or more groups of contiguous non-overlapping pixels in the filtered ultraviolet light image (428);
process the one or more groups of contiguous non-overlapping pixels via a thresholding operation to identify one or more sub-groups of contiguous non-overlapping pixels (432), wherein the one or more sub-groups of contiguous non-overlapping pixels (432) are representative of one or more defects (114, 438) on the surface of the prepared object (112);
process the one or more sub-groups of contiguous non-overlapping pixels (432) via an image contouring technique to generate one or more contours corresponding to the one or more sub-groups of contiguous non-overlapping pixels (432), wherein the one or more contours correspond to one or more true defects (114, 438) on the surface of the prepared object (112);
annotate the one or more contours representative of the one or more defects (114, 438) to generate an output image (436) to enhance visualization of the one or more defects (114, 438); and
process the output image (436) to identify the one or more defects (114, 438) on the surface (106) of the object (112, 304).
15. A method (400) for real-time automated defect detection in an object (104), the method (400) comprising:
treating (402) the object (104) by applying a fluorescent penetrant dye or fluorescent magnetic particles to a surface (106) of the object (104) to prepare the object (104) for fluorescence-based defect detection;
illuminating (404) the surface (106) of the prepared object (112) with ultraviolet light;
capturing (406) one or more ultraviolet light images (202, 408) of the prepared object (112) in response to the ultraviolet light illumination, wherein the one or more ultraviolet light images (202, 408) comprise fluorescence signals (422) caused by the fluorescent penetrant dye or the fluorescent magnetic particles applied to the prepared object (112);
illuminating (410) the surface (106) of the prepared object (112) with white light;
capturing (412) one or more white light images (204, 414) of the prepared object (112) in response to the white light illumination;
processing the captured one or more ultraviolet light images (202, 408) and the one or more white light images (204, 414) to detect, in real-time, one or more defects (114, 438) on the prepared object (112) with a reduced number of false positives, wherein detecting, in real-time, the one or more defects (114, 438) comprises:
processing (416) a white light image (204, 414) to generate a geometric line drawing (418) of the prepared object (112), wherein the geometric line drawing (418) comprises geometric lines representing geometric features of the prepared object (112);
processing (420) an ultraviolet light image (202, 408) to obtain the fluorescence signals (422) in the ultraviolet light image (202, 408), wherein the fluorescence signals (422) comprise fluorescent pixels corresponding to one or more defects, fluorescent pixels corresponding to one or more geometric artifacts, fluorescent pixels corresponding to one or more geometric features, or combinations thereof;
performing (424) an image processing operation on the white light image (204, 414) and the ultraviolet light image (202, 408) to identify non-overlapping fluorescent pixels, wherein the non-overlapping fluorescent pixels are representative of fluorescent indications corresponding to one or more defects (114, 438) on the prepared object (112);
generating (426) a filtered ultraviolet light image (428) comprising the fluorescent indications corresponding to one or more defects (114, 438) on the prepared object (112);
analyzing (430) the filtered ultraviolet light image (428) to identify one or more defects (114, 438) on the prepared object (112);
annotating (434) the one or more defects (114, 438) to create an output image (436); and
providing (440), in real-time, the output image (436), the one or more defects (114, 438), or both to facilitate analysis,
wherein the method (400) enhances accuracy of the fluorescence-based defect detection by eliminating false positives caused by reflections from geometric features of the prepared object (112).
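The filtering step at the heart of the claimed method can be sketched as follows. This is an illustrative pure-Python sketch on small binary masks, not the patented implementation; the function name and mask values are hypothetical. It implements the selection criterion stated in the claim: keep fluorescent pixels present in the ultraviolet light image but absent from the geometric line drawing derived from the white light image.

```python
# Minimal illustration of the claimed filtering: fluorescent pixels that
# coincide with geometric lines from the white-light image are treated as
# reflections (false positives) and removed; the remainder are kept as
# candidate defect indications. Masks are small binary 2D lists; all names
# are hypothetical.

def filter_fluorescence(uv_mask, geometry_mask):
    """Keep only fluorescent pixels absent from the geometric line drawing."""
    rows, cols = len(uv_mask), len(uv_mask[0])
    return [
        [1 if uv_mask[r][c] and not geometry_mask[r][c] else 0
         for c in range(cols)]
        for r in range(rows)
    ]

# Fluorescence along an edge (row 0) overlaps the geometry and is filtered
# out; the isolated indication at (2, 2) survives as a candidate defect.
uv       = [[1, 1, 1],
            [0, 0, 0],
            [0, 0, 1]]
geometry = [[1, 1, 1],
            [0, 0, 0],
            [0, 0, 0]]

filtered = filter_fluorescence(uv, geometry)
print(filtered)  # → [[0, 0, 0], [0, 0, 0], [0, 0, 1]]
```

The surviving pixels would then populate the filtered ultraviolet light image (428) that the subsequent analysis step consumes.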
16. The method (400) of claim 15, comprising integrating white light imaging and ultraviolet light imaging with an intersection-based filtering technique to provide enhanced accuracy of fluorescence-based defect detection and reduce a number of false positives in the object (112, 304) being inspected.
17. The method (400) of claim 15, wherein treating (402) the object (104) comprises preparing the object (104) for inspection using a Fluorescence Penetrant Inspection (FPI) process or a Magnetic Fluorescence Penetrant Inspection (MPI) process, and wherein preparing the object (104) for inspection by fluorescence-based defect detection comprises:
pre-cleaning and drying the object (104);
applying a fluorescent penetrant dye or fluorescent magnetic particles to the surface (106) of the object (104);
removing excess fluorescent penetrant dye or fluorescent magnetic particles from the surface (106) of the object (104); and
applying a developer to the surface (106) of the object (104) to enhance defect visibility under ultraviolet light.
18. The method (400) of claim 15, comprising capturing the one or more ultraviolet light images (202, 408) and the one or more white light images (204, 414) of the prepared object (112) sequentially to maintain spatial alignment of detected features.
19. The method (400) of claim 15, wherein capturing (406) the one or more ultraviolet light images (202, 408) comprises acquiring the one or more ultraviolet light images (202, 408) under ultraviolet light to facilitate identification of areas of fluorescence (422) in the one or more ultraviolet light images (202, 408), and wherein capturing the one or more white light images (204, 414) comprises acquiring the one or more white light images (204, 414) under white light to facilitate identification of geometric features in the one or more white light images (204, 414).
20. The method (400) of claim 15, wherein processing (416) the white light image (204, 414) to generate the geometric line drawing (418) of the prepared object (112) comprises processing the white light image (204, 414) via a thresholding operation, an edge detection technique, or a combination thereof to identify the geometric lines representing structural features of the prepared object (112).
21. The method (400) of claim 20, wherein the thresholding operation comprises an adaptive thresholding technique or a fixed thresholding technique to isolate the geometric lines of the prepared object (112), and wherein the edge detection technique comprises a Sobel edge detection technique, a Canny edge detection technique, or other edge detection techniques.
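The edge detection options named in claims 20–21 can be illustrated with a toy Sobel pass. This is a hedged sketch, not the claimed implementation: a pure-Python Sobel gradient on a small grayscale grid, with a fixed threshold on the gradient magnitude producing the binary geometric line drawing. The image values and the threshold are illustrative.

```python
# Toy Sobel edge-magnitude pass of the kind claim 21 mentions for
# extracting geometric lines from the white-light image. Thresholding the
# gradient magnitude yields the binary line drawing.

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_edges(img, threshold):
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = sum(SOBEL_X[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            gy = sum(SOBEL_Y[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            # Fixed thresholding on gradient magnitude marks edge pixels.
            out[r][c] = 1 if (gx * gx + gy * gy) ** 0.5 >= threshold else 0
    return out

# A vertical brightness step produces a vertical geometric line.
img = [[0, 0, 9, 9],
       [0, 0, 9, 9],
       [0, 0, 9, 9],
       [0, 0, 9, 9]]
print(sobel_edges(img, threshold=10))
# → [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
```

A Canny detector or an adaptive threshold, also recited in claim 21, would replace the fixed threshold with hysteresis or a locally computed one, respectively.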
22. The method (400) of claim 15, comprising pre-processing the fluorescence signals (422) to enhance the fluorescence signals (422) in the ultraviolet light image (202, 408), wherein pre-processing the fluorescence signals (422) comprises processing the fluorescence signals (422) via a noise reduction technique, an intensity normalization technique, a contrast adjustment technique, or combinations thereof.
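Of the pre-processing options recited in claim 22, intensity normalization is the simplest to sketch. The following is an illustrative min-max normalization in pure Python (the function name and output range are assumptions, not taken from the specification); noise reduction and contrast adjustment follow the same element-wise or neighborhood pattern.

```python
# Min-max intensity normalization, one of the pre-processing options in
# claim 22, stretching fluorescence intensities to a full 0..out_max range
# so that weak indications become easier to threshold later.

def normalize_intensity(img, out_max=255):
    flat = [v for row in img for v in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:
        # Flat image: nothing to stretch.
        return [[0] * len(row) for row in img]
    return [[round((v - lo) * out_max / (hi - lo)) for v in row]
            for row in img]

print(normalize_intensity([[10, 20], [30, 50]]))  # → [[0, 64], [128, 255]]
```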
23. The method (400) of claim 15, wherein performing (424) the image processing operation on the white light image (204, 414) and the ultraviolet light image (202, 408) to identify non-overlapping fluorescent pixels comprises performing a pixel-by-pixel exclusive OR (XOR) operation between the white light image (204, 414) and the ultraviolet light image (202, 408) to output only the non-overlapping fluorescent pixels that are present in the ultraviolet light image (202, 408) and are absent in the white light image (204, 414).
24. The method (400) of claim 23, wherein performing the pixel-by-pixel exclusive OR (XOR) operation between the white light image (204, 414) and the ultraviolet light image (202, 408) comprises excluding overlapping fluorescent pixels that are present in both the ultraviolet light image (202, 408) and the white light image (204, 414), and wherein the overlapping fluorescent pixels are representative of false positives caused by reflections due to the geometric features of the prepared object (112).
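The XOR semantics of claims 23–24 can be shown on binarized pixel rows. This is an illustrative sketch with made-up mask values: pixels lit in both images cancel, which is how reflections from geometric features are excluded, leaving only the non-overlapping fluorescent pixels.

```python
# Pixel-by-pixel XOR of binarized image rows, as claims 23-24 describe.
# Overlapping pixels (set in both images) cancel; only pixels present in
# exactly one image survive. Values are illustrative.
uv    = [1, 1, 0, 1]   # fluorescent pixels in the UV image
white = [1, 1, 0, 0]   # geometric-line pixels in the white-light image

xor = [u ^ w for u, w in zip(uv, white)]
print(xor)  # → [0, 0, 0, 1]  only the non-overlapping pixel survives
```

Note that a plain XOR would also pass pixels present only in the white light image; in the claimed context the comparison is applied to fluorescent pixels, so the surviving pixels are those of the ultraviolet light image alone.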
25. The method (400) of claim 15, wherein analyzing (430) the filtered ultraviolet light image (428) to identify one or more defects (114, 438) comprises:
identifying one or more groups of contiguous non-overlapping pixels in the filtered ultraviolet light image (428);
processing the one or more groups of contiguous non-overlapping pixels via a thresholding operation to identify one or more sub-groups of contiguous non-overlapping pixels (432), wherein the one or more sub-groups of contiguous non-overlapping pixels (432) are representative of one or more defects (114, 438) on the surface of the prepared object (112);
processing the one or more sub-groups of contiguous non-overlapping pixels (432) via an image contouring technique to generate one or more contours corresponding to the one or more sub-groups of contiguous non-overlapping pixels (432), wherein the one or more contours correspond to one or more true defects (114, 438) on the surface of the prepared object (112);
annotating the one or more contours representative of the one or more defects (114, 438) to generate an output image (436) to enhance visualization of the one or more defects (114, 438); and
processing the output image (436) to identify the one or more defects (114, 438) on the surface (106) of the object (112, 304).
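The analysis recited in claim 25 can be sketched as connected-component grouping followed by a size threshold. This is an illustrative pure-Python sketch (the function name, 4-connectivity choice, and threshold value are assumptions): contiguous filtered pixels are flood-filled into groups, and only sub-groups clearing the threshold are reported as defect indications, ready for contouring and annotation.

```python
# Sketch of claim 25's analysis: group contiguous filtered pixels
# (4-connectivity flood fill), then apply a size threshold so that tiny
# isolated groups are dropped and the surviving sub-groups are reported
# as defect indications.
from collections import deque

def find_defect_groups(mask, min_size=2):
    rows, cols = len(mask), len(mask[0])
    seen, groups = set(), []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                # Flood-fill one group of contiguous fluorescent pixels.
                queue, group = deque([(r, c)]), []
                seen.add((r, c))
                while queue:
                    y, x = queue.popleft()
                    group.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                if len(group) >= min_size:   # size-thresholding step
                    groups.append(sorted(group))
    return groups

mask = [[1, 1, 0],
        [0, 0, 0],
        [0, 0, 1]]   # the lone pixel at (2, 2) falls below the threshold
print(find_defect_groups(mask))  # → [[(0, 0), (0, 1)]]
```

Each surviving group would then be traced by an image contouring technique and the resulting contours annotated onto the output image (436).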
26. An image processing system (128) for real-time automated defect detection in an object (104), the image processing system (128) comprising:
an acquisition subsystem (206) configured to receive one or more ultraviolet light images (202, 408) and one or more white light images (204, 414) of a prepared object (112);
a processing subsystem (208) in operative association with the acquisition subsystem (206) and comprising:
a defect detection platform (210) configured to process the one or more ultraviolet light images (202, 408) and the one or more white light images (204, 414) to detect, in real-time, one or more defects (114, 438) on the prepared object (112) with a reduced number of false positives, wherein to detect, in real-time, the one or more defects (114, 438) the defect detection platform (210) is configured to:
process a white light image (204, 414) to generate a geometric line drawing (418) of the prepared object (112), wherein the geometric line drawing (418) comprises geometric lines representing geometric features of the prepared object (112);
process an ultraviolet light image (202, 408) to obtain fluorescence signals (422) in the ultraviolet light image (202, 408), wherein the fluorescence signals (422) comprise fluorescent pixels corresponding to one or more defects, fluorescent pixels corresponding to one or more geometric artifacts, fluorescent pixels corresponding to one or more geometric features, or combinations thereof;
perform an image processing operation on the white light image (204, 414) and the ultraviolet light image (202, 408) to identify non-overlapping fluorescent pixels, wherein the non-overlapping fluorescent pixels are representative of fluorescent indications corresponding to one or more defects (114, 438) on the prepared object (112);
generate a filtered ultraviolet light image (428) comprising the fluorescent indications corresponding to one or more defects (114, 438) on the prepared object (112);
analyze the filtered ultraviolet light image (428) to identify one or more defects (114, 438) on the prepared object (112);
annotate the one or more defects (114, 438) to create an output image (436); and
provide, in real-time, the identified one or more defects (114, 438), the output image (436), or both to facilitate analysis,
wherein the image processing system (128) is configured to integrate white light imaging and ultraviolet light imaging with an intersection-based filtering technique to enhance accuracy of the fluorescence-based defect detection and reduce a number of false positives in the object (112, 304) being inspected.
| # | Name | Date |
|---|---|---|
| 1 | 202441008529-PROVISIONAL SPECIFICATION [08-02-2024(online)].pdf | 2024-02-08 |
| 2 | 202441008529-POWER OF AUTHORITY [08-02-2024(online)].pdf | 2024-02-08 |
| 3 | 202441008529-FORM FOR STARTUP [08-02-2024(online)].pdf | 2024-02-08 |
| 4 | 202441008529-FORM FOR SMALL ENTITY(FORM-28) [08-02-2024(online)].pdf | 2024-02-08 |
| 5 | 202441008529-FORM 1 [08-02-2024(online)].pdf | 2024-02-08 |
| 6 | 202441008529-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [08-02-2024(online)].pdf | 2024-02-08 |
| 7 | 202441008529-EVIDENCE FOR REGISTRATION UNDER SSI [08-02-2024(online)].pdf | 2024-02-08 |
| 8 | 202441008529-Request Letter-Correspondence [26-07-2024(online)].pdf | 2024-07-26 |
| 9 | 202441008529-Power of Attorney [26-07-2024(online)].pdf | 2024-07-26 |
| 10 | 202441008529-FORM28 [26-07-2024(online)].pdf | 2024-07-26 |
| 11 | 202441008529-Form 1 (Submitted on date of filing) [26-07-2024(online)].pdf | 2024-07-26 |
| 12 | 202441008529-Covering Letter [26-07-2024(online)].pdf | 2024-07-26 |
| 13 | 202441008529-FORM 3 [05-08-2024(online)].pdf | 2024-08-05 |
| 14 | 202441008529-DRAWING [07-02-2025(online)].pdf | 2025-02-07 |
| 15 | 202441008529-CORRESPONDENCE-OTHERS [07-02-2025(online)].pdf | 2025-02-07 |
| 16 | 202441008529-COMPLETE SPECIFICATION [07-02-2025(online)].pdf | 2025-02-07 |
| 17 | 202441008529-FORM-9 [19-02-2025(online)].pdf | 2025-02-19 |
| 18 | 202441008529-STARTUP [20-02-2025(online)].pdf | 2025-02-20 |
| 19 | 202441008529-FORM28 [20-02-2025(online)].pdf | 2025-02-20 |
| 20 | 202441008529-FORM 18A [20-02-2025(online)].pdf | 2025-02-20 |
| 21 | 202441008529-FER.pdf | 2025-04-01 |
| 22 | 202441008529-FORM 3 [29-04-2025(online)].pdf | 2025-04-29 |
| 23 | 202441008529-Proof of Right [23-09-2025(online)].pdf | 2025-09-23 |
| 24 | 202441008529-PETITION UNDER RULE 137 [23-09-2025(online)].pdf | 2025-09-23 |
| 25 | 202441008529-OTHERS [23-09-2025(online)].pdf | 2025-09-23 |
| 26 | 202441008529-FER_SER_REPLY [23-09-2025(online)].pdf | 2025-09-23 |
| 27 | 202441008529-CLAIMS [23-09-2025(online)].pdf | 2025-09-23 |
| 28 | 202441008529-US(14)-HearingNotice-(HearingDate-28-10-2025).pdf | 2025-09-30 |
| 29 | 202441008529-Correspondence to notify the Controller [07-10-2025(online)].pdf | 2025-10-07 |
| 30 | 202441008529-FORM 3 [11-11-2025(online)].pdf | 2025-11-11 |
| 31 | 202441008529-Written submissions and relevant documents [12-11-2025(online)].pdf | 2025-11-12 |
| 32 | 202441008529-MARKED COPIES OF AMENDEMENTS [12-11-2025(online)].pdf | 2025-11-12 |
| 33 | 202441008529-FORM 13 [12-11-2025(online)].pdf | 2025-11-12 |
| 34 | 202441008529-Annexure [12-11-2025(online)].pdf | 2025-11-12 |
| 35 | 202441008529-AMMENDED DOCUMENTS [12-11-2025(online)].pdf | 2025-11-12 |
| 36 | 202441008529-PatentCertificate17-11-2025.pdf | 2025-11-17 |
| 37 | 202441008529-IntimationOfGrant17-11-2025.pdf | 2025-11-17 |
| 1 | 202441008529_SearchStrategyNew_E_SearchHistory(1)E_28-03-2025.pdf | |