
System And Method For Defect Inspection And Measurement Of Parallelepiped 3D Object

Abstract: A real time method and system is provided for inspection and measurement of defects in a 3D parallelepiped object. The 3D model of the 3D object under inspection and the 3D base model are generated either using CAD techniques or any high precision 3D imaging technique. 3D surfaces of the base model and the model under inspection are aligned in a single coordinate system and matched using a first transformation. Matched surfaces and deformed surface areas are identified on the 3D model under inspection. The 3D base model is then synthetically deformed as per the 3D model under inspection. The deviation of the points is measured on the 3D base model and represents the deformation in a metric form. Finally, the 3D model under inspection is accepted or rejected depending on the measured deviation values.


Patent Information

Filing Date: 24 December 2015
Publication Number: 42/2017
Publication Type: INA
Invention Field: PHYSICS
Grant Date: 2023-12-14

Applicants

TATA CONSULTANCY SERVICES LIMITED
Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

Inventors

1. SAHA, Arindam
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP Kolkata, West Bengal – 700156, India
2. DEWANGAN, Keshaw
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP Kolkata, West Bengal – 700156, India
3. DASGUPTA, Ranjan
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP Kolkata, West Bengal – 700156, India
4. PAL, Arpan
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP Kolkata, West Bengal – 700156, India

Specification

FORM 2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENTS RULES, 2003
COMPLETE SPECIFICATION (See section 10, rule 13) 1. Title of the invention: SYSTEM AND METHOD FOR DEFECT INSPECTION AND
MEASUREMENT OF PARALLELEPIPED 3D OBJECT
2. Applicant(s)
NAME: TATA CONSULTANCY SERVICES LIMITED
NATIONALITY: Indian
ADDRESS: Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.

FIELD OF THE INVENTION
The present application generally relates to measurement and inspection of defects in a three dimensional (3D) object. More particularly, but not specifically, the invention provides a real time method and system for defect inspection in a 3D parallelepiped object using vision based inspection and measurement.
BACKGROUND OF THE INVENTION
[001] In the manufacturing industry, many three dimensional (3D) objects are needed in the production process to manufacture a product. These 3D objects, specifically parallelepiped 3D objects, must be manufactured without any surface or interior flaws. Even a very slight defect in a 3D object may cause a large scale defect in the manufactured product. Therefore, it is very important to measure and inspect defects at a very early stage of the production process. It is also important to identify defects in real time in the production process.
[002] Conventionally, the inspection of defects or flaws on 3D objects is conducted by human visual sight. Defect inspection under conventional methods suffers from several issues: skilled operators are required, which makes the approach inadequate for automatic product inspection, and inspection dependent upon human visual sight suffers from problems of accuracy. Therefore, there is a clear need for automated processes with better accuracy of defect inspection.
[003] Various automated methods have been used in the past for the inspection and measurement of defects in 3D objects with a parallelepiped shape. A few of the automated methods are designed only for a larger magnitude of deformation in the 3D objects. The existing methods and systems measure the amount of deformation after registration of the model under inspection with the base model. This registration produces the expected result only if the amount of deformation is very small. No method is available in the prior art which can be used for varying magnitudes of defects in 3D objects.

[004] A few other automated methods compare a 3D model under inspection with a base model. In these methods, the system calculates a point to point distance between the base model and the model under inspection, where the basic assumption is that both models have the same number of points. This assumption does not hold in a practical scenario, because the surface areas of the base model and the model under inspection are not the same in practice. So the distance calculation between the base model and the model under inspection to detect the defect is not always correct.
[005] In another technique, the entire model under inspection is aligned with the base model, but the alignment is not perfect, especially for a heavily deformed model.
[006] Traditionally, deformation measurement is carried out by measuring the distance between two points on the two surfaces. But in most cases this does not give the exact deformation length because of point mismatch, since the number of points in the base model and the model under inspection are not equal.
[007] In addition, the existing systems and methods are specific to a single type of object, rather than a generalized solution that would work for any shape of object and any type of deformation. Further, the existing techniques may also increase the overall cost of inspection of 3D objects.
[008] There is currently no automated method that inspects and measures defects in three dimensional parallelepiped objects in real time. Therefore, there is a need to provide a method and system for accurate and real time inspection and measurement of defects in 3D parallelepiped objects.
OBJECTIVE OF THE INVENTION
[009] In accordance with the present invention, the primary objective is to provide a system and method to inspect and measure the defect in a three dimensional parallelepiped object.

[0010] Another objective of the invention is to provide an automated system using advanced and low cost 3D capturing technique and automatic defect measurement methodology.
[0011] Yet another objective of the invention is to achieve accurate estimation of deformed surface areas even for a high magnitude of deformation.
[0012] Other objects and advantages of the present invention will be more apparent from the following description when read in conjunction with the accompanying figures, which are not intended to limit the scope of the present disclosure.
SUMMARY OF THE INVENTION
[0013] Before the present methods, systems, and hardware enablement are described, it is to be understood that this invention is not limited to the particular systems, and methodologies described, as there can be multiple possible embodiments of the present invention which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present invention which will be limited only by the appended claims.
[0014] The present application provides a system and method for real time inspection and measurement of defect in a 3D object, the system comprises a capturing tool, a memory and a processor. The capturing tool captures a 3D structure of the 3D object. The 3D structure is captured in terms of point cloud and is made of a plurality of points having outlier points and inlier points. The processor is coupled to the memory. The processor executes computer readable instructions stored in the memory to perform various steps as follows. Initially, a 3D base model of a base non-deformed object is generated either using a computer aided design (CAD) modelling technique or any high precision 3D imaging technique. The 3D model of the 3D structure is also generated using any 3D imaging technique. The generated 3D model is the model under inspection. A first transformation is then performed on the 3D model. The first transformation results in alignment of the 3D model with the 3D base model. A dynamic

threshold value is calculated for each of the plurality of points from the respective point to point distance between the 3D base model and the 3D model. One or more outlier points are then removed from the 3D structure by comparing the dynamic threshold value with a predefined threshold value. A second transformation is then performed on the 3D model. A combined transformation matrix is then obtained by combining the output obtained from the first transformation and the second transformation. The 3D model obtained from the combined transformation matrix is segmented using a segmentation technique. A final dynamic threshold is performed to generate a refined 3D model with segregated inlier points and outlier points. The 3D base model is deformed based on the matching comparison of the 3D base model with the refined 3D model. The deforming is done only on the unmatched portion of the 3D base model. A distance is calculated on the 3D base model for the points which are moved to a new position after deforming. The distance indicates the error for each point in the point cloud. Finally, a defect is detected if the distance is out of a predefined deformation distance.
[0015] According to an embodiment of the invention, the present invention also provides a method for real time inspection and measurement of defect in a 3D object in the same way as mentioned above.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The foregoing summary, as well as the following detailed description of preferred embodiments, are better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, there is shown in the drawings exemplary constructions of the invention; however, the invention is not limited to the specific methods and system disclosed. In the drawings:
[0017] Fig. 1 shows a block diagram of a system for inspection and measurement of defects in a three dimensional (3D) object in accordance with an embodiment of the invention;
[0018] Fig. 2 shows a 3D model of a Rubik's cube used for defect inspection in accordance with an embodiment of the invention; and

[0019] Fig. 3 shows a flow chart illustrating steps involved in inspection and measurement of defects in a three dimensional (3D) object in accordance with an embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION
[0020] Some embodiments of this invention, illustrating all its features, will now be discussed in detail.
[0021] The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.
[0022] It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present invention, the preferred systems and methods are now described. In the following description, for the purpose of explanation and understanding, reference has been made to numerous embodiments, for which the intent is not to limit the scope of the invention.
[0023] One or more components of the invention are described as modules for the understanding of the specification. For example, a module may include a self-contained component in a hardware circuit comprising logic gates, semiconductor devices, integrated circuits or any other discrete components. The module may also be a part of any software program executed by any hardware entity, for example a processor. The implementation of a module as a software program may include a set of logical instructions to be executed by a processor or any other hardware entity.
[0024] The disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms.

[0025] The elements illustrated in the Figures interoperate as explained in more detail below. Before setting forth the detailed explanation, however, it is noted that all of the discussion below, regardless of the particular implementation being described, is exemplary in nature, rather than limiting. For example, although selected aspects, features, or components of the implementations are depicted as being stored in memories, all or part of the systems and methods consistent with the defect inspection system and method may be stored on, distributed across, or read from other machine-readable media.
[0026] Method steps of the invention may be performed by one or more computer processors executing a program tangibly embodied on a computer-readable medium to perform functions of the invention by operating on input and generating output. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, the processor receives (reads) instructions and data from a memory (such as a read-only memory and/or a random access memory) and writes (stores) instructions and data to the memory. Storage devices suitable for tangibly embodying computer program instructions and data include, for example, all forms of non-volatile memory, such as semiconductor memory devices, including EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROMs. Any of the foregoing may be supplemented by, or incorporated in, specially-designed ASICs (application-specific integrated circuits) or FPGAs (Field-Programmable Gate Arrays). A computer can generally also receive (read) programs and data from, and write (store) programs and data to, a non-transitory computer-readable storage medium such as an internal disk (not shown) or a removable disk.
[0027] The present application provides a system and method for real time inspection and measurement of defect in a 3D object, the system comprises a capturing tool, a memory and a processor. The capturing tool captures a 3D structure of the 3D object. The 3D structure is captured in terms of point cloud and is made of a plurality of points having outlier points and inlier points. The processor is coupled to the memory. The processor executes computer readable instructions stored in the memory to perform various steps as follows. Initially, a 3D base model

of a base non-deformed object is generated either using a computer aided design (CAD) modelling technique or any high precision 3D imaging technique. The 3D model of the 3D structure is also generated using any 3D imaging technique. The generated 3D model is the model under inspection. A first transformation is then performed on the 3D model. The first transformation results in alignment of the 3D model with the 3D base model. A dynamic threshold value is calculated for each of the plurality of points from the respective point to point distance between the 3D base model and the 3D model. One or more outlier points are then removed from the 3D structure by comparing the dynamic threshold value with a predefined threshold value. A second transformation is then performed on the 3D model. A combined transformation matrix is then obtained by combining the output obtained from the first transformation and the second transformation. The 3D model obtained from the combined transformation matrix is segmented using a segmentation technique. A final dynamic threshold is performed to generate a refined 3D model with segregated inlier points and outlier points. The 3D base model is deformed based on the matching comparison of the 3D base model with the refined 3D model. The deforming is done only on the unmatched portion of the 3D base model. A distance is calculated on the 3D base model for the points which are moved to a new position after deforming. The distance indicates the error for each point in the point cloud. Finally, a defect is detected if the distance is out of a predefined deformation distance.
[0028] Fig. 1 illustrates a schematic block diagram of a system 100 for real time inspection and measurement of defects in three dimensional (3D) objects according to an illustrative embodiment of the present invention. The system 100 is configured to provide automatic monitoring of 3D parallelepiped objects in real time and to identify and measure defects in their shape and surface. The 3D parallelepiped object could be any object used in a large scale manufacturing process. The system 100 uses a vision based inspection and measurement method for defect detection. The system 100 compares the 3D object with the non-deformed base model of a similar 3D object.
[0029] The system 100 includes a capturing tool 102, a memory 104 and a processor 106. The processor 106 further includes a plurality of modules for performing various functions. Initially, the system 100 captures a 3D structure of the 3D object under inspection using the capturing tool

102. It should be appreciated that the 3D object under inspection will be referred to as the 3D object in the remainder of this disclosure, and the two terms will be used interchangeably. The captured 3D parallelepiped structure of the object under inspection is in the form of a point cloud in real time. The captured 3D structure is then converted into a 3D model. This 3D model will be used for inspection and measurement of defects.
[0030] The capturing tool 102 can include any existing data capturing method, chosen based on the environment and other parameters such as cost affordability. For example, a high precision laser scanner can be used for a higher accuracy result, but at high cost. In another example, infrared sensors (like Kinect) can be used if no other infrared source is present in the environment. LSD-SLAM or multi-view 3D reconstruction can also be used if other infrared sources are present in the environment.
[0031] According to an embodiment of the invention, based on the 3D model under inspection, a similar non-deformed 3D object is also taken as an input to the system 100. The system 100 then generates a base 3D model of the base non-deformed 3D object. The non-deformed 3D object is referred to as the base for performing the comparison. The same has been explained with the example of a Rubik's cube in a later part of this disclosure.
[0032] According to an embodiment of the invention, the system 100 also includes a noise filtering module 108. The noise filtering module 108 is applied on the 3D model and the base 3D model to remove noise from the 3D model and the base 3D model. The noise filtering module 108 filters the noise in such a way that the gross structure such as edges, corners etc. remain unaffected. It should be appreciated that the use of any existing noise filtering technique is well within the scope of this disclosure.
[0033] The 3D model under inspection and the base 3D model are then provided to the processor 106. The processor 106 then performs a first level registration of the 3D model. A simple rigid transformation on the entire point cloud of the 3D model is performed using a local feature descriptor. The result registers the 3D model under inspection with the 3D base model in the same coordinate system.

[0034] According to an embodiment of the invention, a three-dimensional Fast Point Feature Histogram (FPFH) is selected as the local feature descriptor. Here, classifying includes calculating a Fast Point Feature Histogram (FPFH) at each point in the plurality of points in the 3D model. The principle for the selection of descriptor for classification can be depicted as: everything should be as simple as possible using simple geometry, but not simpler. By simple, it is meant that the calculation is not expensive, and the dimension is small. One of the simplest features is Shape Index (SI). However, SI only considers the principal curvatures, which is not distinctive enough for multiple categories in this application. On the other hand, brief experiments on simple shape classification show that other descriptors such as Spin Image and 3D Shape Context are outperformed by FPFH in both accuracy and efficiency. It is noted, however, the performance of the descriptors can be different when the descriptors are used in, for example, complex shape matching. FPFH is an approximate and accelerated version of Point Feature Histogram (PFH). PFH uses a histogram to encode the geometrical properties of a point's neighborhood by generalizing the mean curvature around the point. The histogram representation is quantized based on all relationships between the points and their estimated surface normal within the neighborhood.
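The geometric core of the PFH features that FPFH accelerates can be sketched in a few lines. The function below computes the three angular pair features (alpha, phi, theta) plus the point-pair distance from a Darboux frame anchored at a source point; it is a minimal illustration of the pair-feature step only, not the full histogram binning or FPFH's weighted-neighbourhood scheme, and the function name is illustrative.

```python
import numpy as np

def pair_features(p_s, n_s, p_t, n_t):
    """Sketch of the PFH pair features for a source point (p_s, normal n_s)
    and a target point (p_t, normal n_t). Assumes n_s is a unit vector not
    parallel to the direction p_s -> p_t."""
    d = p_t - p_s
    dist = np.linalg.norm(d)
    u = n_s                        # Darboux frame axis 1: the source normal
    v = np.cross(u, d / dist)      # axis 2: perpendicular to u and the line
    w = np.cross(u, v)             # axis 3: completes the frame
    alpha = np.dot(v, n_t)                 # cosine angle between v and n_t
    phi = np.dot(u, d / dist)              # cosine angle between u and the line
    theta = np.arctan2(np.dot(w, n_t), np.dot(u, n_t))
    return alpha, phi, theta, dist
```

In PFH these four values are computed for every point pair in a neighbourhood and quantized into a histogram; FPFH approximates this by combining each point's own simplified histogram with weighted histograms of its neighbours.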
[0035] According to an embodiment of the disclosure, the processor 106 further calculates a dynamic threshold value from the point to point distance between the 3D base model and the 3D model under inspection. The outliers are then identified using the dynamic threshold value. Any point on the 3D model under inspection whose distance to its nearest point on the 3D base model is more than the dynamic threshold value is considered an outlier and is removed temporarily from the 3D model under inspection. It should be appreciated that the first level registration may not be perfect, and for this reason some inliers can be rejected as outliers. It should also be appreciated that the first level registration segments the 3D model from the environment.
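The dynamic-threshold outlier removal described above can be sketched as follows. The patent does not fix a specific threshold formula, so the rule used here (mean plus k standard deviations of the nearest-neighbour distances) is an illustrative assumption, as are the function and parameter names; the brute-force nearest-neighbour search would be replaced by a k-d tree for real point clouds.

```python
import numpy as np

def flag_outliers(model, base, k=2.0):
    """Flag points of the (N, 3) model under inspection whose nearest-
    neighbour distance to the (M, 3) base model exceeds a dynamic threshold.
    Threshold rule mean + k * std is an illustrative assumption."""
    # nearest-neighbour distance from every model point to the base cloud
    diff = model[:, None, :] - base[None, :, :]
    nn_dist = np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)
    threshold = nn_dist.mean() + k * nn_dist.std()   # dynamic threshold
    outlier_mask = nn_dist > threshold               # True = temporary outlier
    return nn_dist, threshold, outlier_mask
```

Because the threshold is derived from the distance distribution itself, a grossly deformed region stands out against well-aligned surfaces, which matches the "may reject some inliers after an imperfect first registration" caveat above.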
[0036] The processor 106 is further configured to apply a second level registration. The remaining inliers should ideally match the base model, but the matching may not be perfect due to alignment error. The inliers are therefore used as a template, and the template is matched with the 3D base model to achieve the second level transformation.

[0037] According to an embodiment of the invention, the Iterative Closest Point (ICP) algorithm is used to achieve the second level transformation. ICP is an algorithm employed to minimize the difference between two clouds of points. Broadly speaking, the ICP algorithm is an iterative process that determines a set of corresponding points between the two point clouds, determines a transformation that minimizes an error metric over the corresponding points, and repeats the process using the determined transformation until convergence is reached.
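A minimal point-to-point ICP of the kind described in [0037] can be sketched as below, solving each iteration's rigid transform with the SVD-based (Kabsch) method. This is a brute-force illustration under simplifying assumptions (exhaustive nearest-neighbour search, fixed iteration count, no convergence test or outlier rejection), not a production implementation; the function name is illustrative.

```python
import numpy as np

def icp_point_to_point(src, dst, iters=20):
    """Align (N, 3) cloud src to (M, 3) cloud dst.
    Returns the aligned cloud and the accumulated 4x4 transform."""
    T = np.eye(4)
    cur = src.copy()
    for _ in range(iters):
        # 1. correspondences: nearest dst point for every current src point
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
        match = dst[d2.argmin(axis=1)]
        # 2. rigid transform minimising squared error over correspondences
        mu_c, mu_m = cur.mean(axis=0), match.mean(axis=0)
        H = (cur - mu_c).T @ (match - mu_m)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:        # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_c
        # 3. apply the step and fold it into the total transform
        cur = cur @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T
    return cur, T
```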
[0038] According to an embodiment of the invention, the processor 106 is further configured to calculate the combined transformation matrix from the output of the first transformation and the second transformation. This can also be referred to as the final transformation. The calculated combined transformation matrix is used to achieve the final transformation, which registers the 3D model with the base 3D model.
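Combining the first and second level transformations into a single matrix, as in [0038], is plain composition of 4x4 homogeneous transforms. A sketch with illustrative helper names:

```python
import numpy as np

def make_transform(R, t):
    """Pack a 3x3 rotation and a 3-vector translation into a 4x4 matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def apply_transform(T, points):
    """Apply a 4x4 homogeneous transform to an (N, 3) point cloud."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (homo @ T.T)[:, :3]

# Applying the second transformation T2 after the first transformation T1
# is equivalent to applying the single combined matrix T2 @ T1 once.
```

Because composition is a single matrix product, the full registration can be replayed on any point cloud in one step instead of two.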
[0039] According to an embodiment of the disclosure, the processor 106 is further configured to segment the 3D model from the environment using state-of-the-art model matching and segmentation techniques. The use of any existing model matching and segmentation technique is well within the scope of this disclosure. The previous steps are then repeated: the final dynamic threshold and outliers are calculated in the same way as in the previous steps. Finally, the 3D model is obtained with the inliers and outliers segregated, so the unmatched and deformed structures are clearly identified. The final dynamic threshold also results in generation of a refined 3D model.
[0040] Further, the processor 106 is configured to synthetically deform the unmatched portion of the 3D base model in accordance with the 3D model under inspection. Many new points are interpolated during this modification, so that finally the 3D base model exactly matches the 3D model under inspection. Deforming of the 3D base model is done based on the matching comparison of the 3D base model with the refined 3D model. The processor 106 is also configured to calculate a distance on the 3D base model for the points which are moved to a new position after deforming. The calculated distance gives the error in distance for each point. The acceptance or rejection of the object under inspection is decided based on a predefined deformation distance: a defect is detected if the distance exceeds the predefined deformation distance.
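The deviation measurement and the accept/reject decision of paragraph [0040] can be sketched as below. Here `max_deviation` stands in for the predefined deformation distance; the function and parameter names are illustrative, and the two input arrays are assumed to be in point-wise correspondence, which is exactly what the synthetic deformation step produces.

```python
import numpy as np

def inspect(base_points, deformed_points, max_deviation):
    """Measure the per-point deviation between the original base model and
    its synthetically deformed copy, then accept or reject the object.
    A point counts as defective if it moved further than max_deviation."""
    deviation = np.linalg.norm(deformed_points - base_points, axis=1)
    defect_mask = deviation > max_deviation
    accepted = not defect_mask.any()
    return deviation, defect_mask, accepted
```

The deviation array is the "measurement in a metric form of deformation" of the abstract: each entry is the distance a base-model point had to travel to match the model under inspection.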
[0041] The system 100 is explained with the example of defect inspection in a Rubik's cube 110 as shown in Fig. 2, according to an embodiment of the disclosure. Fig. 2A shows a normal non-deformed sample base model of the Rubik's cube 110, while Fig. 2B shows a deformed sample model of the Rubik's cube 110. Further, Fig. 2C shows the base model of the Rubik's cube aligned with the deformed model of the Rubik's cube 110. And finally, Fig. 2D shows defective points on the deformed model of the Rubik's cube 110 after inspection of the deformed Rubik's cube as per the method provided above.
[0042] Fig. 3 shows a flowchart 200 showing the steps involved in the measurement and inspection of defects in the 3D object. Initially at step 202, the 3D base model of the base non-deformed 3D object is generated. The 3D base model is generated either using a computer aided design (CAD) modelling technique or any high precision 3D imaging technique. At step 204, the noise from the 3D base model is filtered using the noise filtering module 108.
[0043] At step 206, the 3D structure of the 3D object is captured. The 3D structure is captured in terms of a point cloud. The point cloud is made of a plurality of points having outlier points and inlier points. In the next step 208, the 3D model of the 3D structure is generated using any 3D imaging technique. It should be appreciated that the 3D model is the 3D model under inspection, and the two terms will be used interchangeably in the present disclosure. At step 210, the noise from the 3D model under inspection is filtered using the noise filtering module 108. It should be appreciated that in another embodiment steps 206-210 can be performed before steps 202-204. The 3D base model and the 3D model under inspection will now be compared to detect and measure the presence of defects.
[0044] In the next step 212, the first transformation is performed on the 3D model under inspection. The first transformation results in alignment of the 3D model with the 3D base model. In step 214, the dynamic threshold value for each of the plurality of points with respective point to point distance between 3D base model and the 3D model is calculated. At

step 216, one or more outlier points from the 3D model are removed by comparing the dynamic threshold value with a predefined threshold value.
[0045] In the next step 218, the second transformation is performed on the 3D model. In the next step 220, the combined transformation matrix is obtained by combining the output obtained from the first transformation and the second transformation. At step 222, the 3D model obtained from the combined transformation matrix is segmented using a segmentation technique. At step 224, the final dynamic threshold is performed to generate a refined 3D model with segregated inlier points and outlier points.
[0046] At step 226, the 3D base model is deformed based on the matching comparison of the 3D base model with the refined 3D model. The deforming is done only on the unmatched portion of the 3D base model. In the next step 228, the distance on the 3D base model for the points which are moved to a new position after deforming is calculated. The distance indicates an error for each point in the point cloud. Finally, at step 230, a defect is detected if the distance is out of a predefined deformation distance, and the acceptance or rejection of the 3D model under inspection is decided based on that predefined deformation distance.
[0047] In view of the foregoing, it will be appreciated that the present invention provides a method and system for real time inspection and measurement of defects in a 3D parallelepiped object. Still, it should be understood that the foregoing relates only to the exemplary embodiments of the present invention, and that numerous changes may be made thereto without departing from the spirit and scope of the invention as defined by the following claims.

I/We Claim:
1. A real time method for inspection and measurement of defects in a three dimensional (3D) object, the method comprising:
generating by a processor, a 3D base model of a base non-deformed 3D object using a high precision 3D imaging technique;
capturing a 3D structure of the 3D object, wherein the 3D structure is captured in terms of a point cloud made of a plurality of points having outlier points and inlier points;
generating by the processor, a 3D model of the 3D structure using a 3D imaging technique, wherein the 3D model is the model under inspection;
performing by the processor, a first transformation on the 3D model, wherein the first transformation results in alignment of the 3D model with the 3D base model;
calculating by the processor, a dynamic threshold value for each of the plurality of points with respective point to point distance between 3D base model and the 3D model;
removing by the processor, one or more outlier points from the 3D model by comparing the dynamic threshold value with a predefined threshold value;
performing by the processor, a second transformation on the 3D model;
calculating by the processor, a combined transformation matrix by combining the output obtained from the first transformation and the second transformation;
segmenting by the processor, the 3D model obtained from the combined transformation matrix using a segmentation technique;
performing by the processor, a final dynamic threshold to generate a refined 3D model with segregated inlier points and outlier points;
deforming by the processor, the 3D base model based on the matching comparison of the 3D base model with the refined 3D model, wherein the deforming is done only on the unmatched portion of the 3D base model;
calculating by the processor, a distance on the 3D base model for the points which are moved to a new position after deforming, wherein the distance indicates the error for each point in the point cloud; and

detecting by the processor, the defect if the distance is out of a predefined deformation distance.
2. The method of claim 1 further includes the step of providing a noise cleaning function on the 3D object.
3. The method of claim 1, wherein the high precision 3D imaging technique is a computer aided design (CAD) modelling technique.
4. The method of claim 1 wherein the first transformation is performed using surface normal and fast point feature histogram.
5. The method of claim 1 wherein, the second level transformation is performed using iterative closest point method.
6. The method of claim 1, wherein the step of removing outliers further includes: any point on the 3D model having a distance to its nearest point on the base model of more than the calculated threshold value is considered an outlier and is removed temporarily from the 3D model.
7. The method of claim 1, further including interpolating one or more new points into the 3D base model so that the 3D base model exactly matches the 3D model.
8. The method of claim 1 wherein the 3D object is a parallelepiped object.
9. A system for real time inspection and measurement of defect in a 3D object, the system comprising:
a capturing tool for capturing a 3D structure of the 3D object, wherein the 3D structure is captured as a point cloud made of a plurality of points comprising outlier points and inlier points;

a memory; and
a processor coupled to the memory, wherein the processor executes computer readable instructions stored in the memory to:
generate, by the processor, a 3D base model of a base non-deformed object using a high precision 3D imaging technique;
generate, by the processor, the 3D model of the 3D structure using a 3D imaging technique, wherein the 3D model is the model under inspection;
perform, by the processor, a first transformation on the 3D model, wherein the first transformation results in alignment of the 3D model with the 3D base model;
calculate, by the processor, a dynamic threshold value for each of the plurality of points using the respective point-to-point distance between the 3D base model and the 3D model;
remove, by the processor, one or more outlier points from the 3D model by comparing the dynamic threshold value with a predefined threshold value;
perform, by the processor, a second transformation on the 3D model;
calculate, by the processor, a combined transformation matrix by combining the output obtained from the first transformation and the second transformation;
segment, by the processor, the 3D model obtained from the combined transformation matrix using a segmentation technique;
perform, by the processor, a final dynamic thresholding to generate a refined 3D model with segregated inlier points and outlier points;
deform, by the processor, the 3D base model based on the matching comparison of the 3D base model with the refined 3D model, wherein the deforming is done only on the unmatched portion of the 3D base model;
calculate, by the processor, a distance on the 3D base model for the points which have moved to a new position after deforming, wherein the distance indicates the error for each point in the point cloud; and
detect, by the processor, the defect if the distance is out of a predefined deformation distance.
10. The system of claim 9, wherein the capturing tool comprises at least one of a high precision laser scanner, an IR sensor, and multi-view 3D reconstruction.
11. The system of claim 9, wherein the 3D object is a parallelepiped object.
12. The system of claim 9, wherein the high precision 3D imaging technique is a computer aided design (CAD) modelling technique.
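The claimed pipeline — per-point dynamic thresholding against the base model, combination of the two transformation matrices, and defect detection from the post-deformation displacement — can be illustrated with a minimal sketch. This is an illustrative reading of the claims, not the granted implementation: the function names and the brute-force nearest-neighbour search are assumptions (a KD-tree and a proper ICP/FPFH registration library would be used in practice), and the thresholds shown are placeholders.

```python
import numpy as np

def nearest_distances(model, base):
    """Distance from each point of the model under inspection to its
    nearest point on the base model (brute force over (N, 3) arrays)."""
    diffs = model[:, None, :] - base[None, :, :]
    return np.linalg.norm(diffs, axis=2).min(axis=1)

def remove_outliers(model, base, predefined_threshold):
    """Per-point dynamic threshold = nearest-neighbour distance to the
    base model; points beyond the predefined threshold are set aside
    temporarily as outliers (claim 6)."""
    dist = nearest_distances(model, base)
    keep = dist <= predefined_threshold
    return model[keep], model[~keep]

def combine_transforms(t_first, t_second):
    """Combined 4x4 transformation matrix: the second (ICP-style fine)
    transform applied after the first (coarse feature-based) transform."""
    return t_second @ t_first

def detect_defects(base, deformed_base, max_deformation):
    """Per-point displacement of the synthetically deformed base model;
    points that moved farther than the allowed deformation distance
    are flagged as defective."""
    displacement = np.linalg.norm(deformed_base - base, axis=1)
    return displacement > max_deformation

if __name__ == "__main__":
    base = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0]])
    # Model under inspection: one corner point pushed 2 units out.
    model = base + np.array([[0., 0, 0], [0, 0, 0], [0, 0, 2]])
    inliers, outliers = remove_outliers(model, base, 0.5)
    flags = detect_defects(base, model, 0.5)  # third point flagged
```

Combining the transforms as a single matrix lets the raw captured cloud be re-aligned in one multiplication, which is why the claims compute the combined matrix before segmentation rather than applying the two transforms separately.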

Documents

Application Documents

# Name Date
1 Form 5 [24-12-2015(online)].pdf 2015-12-24
2 Form 3 [24-12-2015(online)].pdf 2015-12-24
3 Form 18 [24-12-2015(online)].pdf 2015-12-24
4 Drawing [24-12-2015(online)].pdf 2015-12-24
5 Description(Complete) [24-12-2015(online)].pdf 2015-12-24
6 4853-MUM-2015-POWER OF AUTHORITY-(03-03-2016).pdf 2016-03-03
7 4853-MUM-2015-CORRESPONDENCE-(03-03-2016).pdf 2016-03-03
8 ABSTRACT1.jpg 2018-08-11
9 4853-MUM-2015-Form 1-210116.pdf 2018-08-11
10 4853-MUM-2015-Correspondence-210116.pdf 2018-08-11
11 4853-MUM-2015-FER.pdf 2019-01-16
12 4853-MUM-2015-ABSTRACT [15-07-2019(online)].pdf 2019-07-15
13 4853-MUM-2015-CLAIMS [15-07-2019(online)].pdf 2019-07-15
14 4853-MUM-2015-COMPLETE SPECIFICATION [15-07-2019(online)].pdf 2019-07-15
15 4853-MUM-2015-DRAWING [15-07-2019(online)].pdf 2019-07-15
16 4853-MUM-2015-FER_SER_REPLY [15-07-2019(online)].pdf 2019-07-15
17 4853-MUM-2015-OTHERS [15-07-2019(online)].pdf 2019-07-15
18 4853-MUM-2015-PatentCertificate14-12-2023.pdf 2023-12-14
19 4853-MUM-2015-IntimationOfGrant14-12-2023.pdf 2023-12-14

Search Strategy

1 4853_06-03-2018.pdf

ERegister / Renewals

3rd: 11 Jan 2024

From 24/12/2017 - To 24/12/2018

4th: 11 Jan 2024

From 24/12/2018 - To 24/12/2019

5th: 11 Jan 2024

From 24/12/2019 - To 24/12/2020

6th: 11 Jan 2024

From 24/12/2020 - To 24/12/2021

7th: 11 Jan 2024

From 24/12/2021 - To 24/12/2022

8th: 11 Jan 2024

From 24/12/2022 - To 24/12/2023

9th: 11 Jan 2024

From 24/12/2023 - To 24/12/2024

10th: 23 Dec 2024

From 24/12/2024 - To 24/12/2025