Abstract: A shape measuring method comprises: a step for creating a surface (C) representing the surface shape of an object to be measured as an implicit function on the basis of measurement point group data; a step for dividing the entire measurement region in which the surface (C) is present into tetrahedral small regions (hereinafter referred to as cells) laid closely without overlapping by division processing using a three-dimensional Delaunay diagram; a step for classifying the vertices (4) of the cells into inner points (5) present inside the surface (C) and outer points (6) present outside; a step for extracting boundary cells; a step for calculating intersection points (7) of the boundary cells and the surface (C); a step for finding triangular or quadrangular faces (8) by connecting the intersection points (7) of the boundary cells; and a step for bonding all the faces (8). Consequently, closed polyhedron data that is a manifold of the object to be measured and contains no self-intersections can be automatically created.
FORM 2
THE PATENT ACT 1970 (39 of 1970)
&
The Patents Rules, 2003
COMPLETE SPECIFICATION
(See Section 10 and rule 13)
1. TITLE OF INVENTION SHAPE MEASURING METHOD
2. APPLICANT(S)
a) Name : MITSUBISHI HEAVY INDUSTRIES, LTD.
b) Nationality : JAPANESE Company
c) Address : 16-5, KONAN 2-CHOME,
MINATO-KU, TOKYO 1088215, JAPAN
3. PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed : -
TECHNICAL FIELD
The present invention relates to a shape measuring method utilized in machines such as industrial machines controlled by numerical control devices.
BACKGROUND ART
As a shape measuring method utilized for a collision preventing device of a machine tool, a light-section method is known in which a target object is illuminated by a slit light to obtain an optical image along the shape of the target object and the optical image is captured by a CCD camera (Patent Document 1).
Moreover, methods of creating a three-dimensional model of a target object in a CAD system include a method in which first point cloud data of the target object placed on a reference plane and second point cloud data of the target object placed on the reference plane in a different posture are obtained, and the two pieces of point cloud data are combined into a single piece of combined point cloud data (Patent Document 2).
Furthermore, shape measuring methods utilized for a collision preventing device include the following method. A three-dimensional mesh structure formed by dividing a space into polyhedron shapes is created and measurement point coordinates of a measured workpiece are calculated based on information on the distance to the workpiece. Positions in the workpiece are scanned which correspond to one unit of the three-dimensional mesh structure (hereafter, referred to as voxel). When a ratio of the number of times a calculated measurement point is included in the voxel to the number of times of the scanning is equal to or more than a predetermined threshold, the voxel is assumed to be part of the shape of the workpiece. In this way, a measured shape map is created (Patent Document 3).
PRIOR ART DOCUMENT PATENT DOCUMENT
Patent Document 1: Japanese Patent No. 2895316
Patent Document 2: Japanese Patent Application Publication No. 2003-345840
Patent Document 3: Japanese Patent Application Publication No. 2009-265023
SUMMARY OF THE INVENTION
PROBLEMS TO BE SOLVED BY THE INVENTION
When STL data is to be generated from the point cloud data, utilization of, for example, a two-dimensional Delaunay diagram shown in Fig. 26 is conceivable. Here, STL stands for Standard Triangulated Language (also referred to as Stereo Lithography), which is an industry-standard file format for three-dimensional CAD systems developed by 3D systems, Inc., and represents a three-dimensional shape as a group of small triangles. Moreover, the two-dimensional Delaunay diagram is "space division where the circumscribed circle of each triangle (cell) includes no other vertices therein".
The STL data generated by the two-dimensional Delaunay diagram is a polyhedron which is not closed. Specifically, three-dimensional measurement is performed from multiple directions and face generation results from the measurement directions are matched and combined in a coordinate system to create shape data. The thus-generated shape data is a group of surfaces (faces) and does not strictly define a solid shape having volume elements. In Patent Document 2, when there is a missing portion in the combining of the first and second point cloud data, the created single piece of combined point cloud data may not be a solid shape and thus not be a closed shape model.
The conventional surface shape is sufficient in some utilization methods of the measured data. However, in some applications, it is a prerequisite that inputted shape data is a closed polyhedron.
This is because, when geometry calculation such as interference calculation is performed on a program, the easiness of implementation of an application and the robustness thereof can be improved by using a closed polyhedron strictly defining volume elements of an object as a starting point of the calculation.
The problem described above can also be dealt with by utilizing a discrete shape model such as the voxel shape representation shown in Fig. 27 to generate closed STL data from a measured point cloud. In Patent Document 3, a measured shape map which is three-dimensional data of a workpiece is used. Since the measured shape map is voxels of a three-dimensional mesh structure, the measured shape map is a discrete shape model.
However, in the discrete voxel shape representation, when the resolution is improved, the amount of used memory and the processing time of algorithms increase in proportion to the cube of the resolution, and shape representation at a high resolution is extremely difficult.
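The cubic growth described above can be illustrated with a simple calculation (an illustrative sketch only, not part of the claimed method): halving the voxel size doubles the number of voxels per side and multiplies the total voxel count, and hence the memory, by eight.

```python
# Illustrative only: voxel count grows with the cube of the resolution.
# Doubling the number of voxels per side multiplies the total by eight.
for n in (64, 128, 256, 512):
    print(f"{n} voxels per side -> {n ** 3:,} voxels in total")
```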
Moreover, there is a high possibility that the closed polyhedron generated from the voxel shape is a non-manifold polyhedron. In some cases, an application handling the geometry calculation has a limiting condition that inputted shape data should be a manifold polyhedron like the closed polyhedron, and this is a problem to be solved.
Here, the "manifold" polyhedron means a polyhedron which is not "non-manifold" such that three or more faces share one edge as shown in Figs. 23 to 25, and which has geometrically good characteristics, when attention is given to connection relationships among edges and faces of the polyhedron. Fig. 23 shows an example in which four flat faces share one edge, Fig. 24 shows an example in which four faces of two cubes share one edge, and Fig. 25 shows an example in which upper ends of four inclined faces share one edge.
Furthermore, a characteristic that a polyhedron includes no self-intersection can be considered as an important characteristic of a polyhedron. The self-intersection refers to a phenomenon in which triangles forming a polyhedron overlap one another. Since the polyhedron including the self-intersection is difficult to handle in the application on the geometry calculation, it is important that the polyhedron has the characteristic of including no self-intersection. Fig. 28 shows examples of polyhedrons including the self-intersections. Fig. 28(a) shows the self-intersection in two dimensions while Figs. 28(b) and 28(c) show examples (1) and (2) of the self-intersections in three dimensions (portions circled by broken lines in the drawings are the self-intersections).
The present invention has been made in view of the conventional techniques described above and an object thereof is to generate closed polyhedron data of a measurement-target object which is manifold and which includes no self-intersection, from point cloud data measured by a three-dimensional measurement unit.
MEANS FOR SOLVING THE PROBLEMS
A shape measuring method according to claim 1 of the present invention which solves the problems described above comprises the following steps:
(1) scanning a measurement-target object with a three-dimensional measurement unit from a plurality of measurement directions to obtain, for each of the measurement directions, measured point cloud data of the measurement-target object; (2) generating an implicit function representing a shape of the measurement-target object on the basis of the measured point cloud data; (3) generating polyhedron data on the basis of the implicit function representing the shape of the measurement-target object.
Furthermore, the step (3) includes the following sub-steps:
(a) dividing an entire measurement region in which the measurement-target object exists into tetrahedral small regions (hereafter, referred to as cells) by performing space division processing using a three-dimensional Delaunay diagram on the basis of the measured point cloud data, the tetrahedral small regions filling the measurement region without a gap; (b) classifying vertices of the cells in the three-dimensional Delaunay diagram into an inner point existing inside the measurement-target object and an outer point existing outside the measurement-target object by using the implicit function; (c) extracting boundary cells having four vertices including both of the inner point and the outer point, from all of the cells in the three-dimensional Delaunay diagram; (d) calculating an intersection between a surface of the measurement-target object and each of edges having a combination of the inner point and the outer point located at the respective two ends, among edges of the boundary cells; (e) obtaining a triangular or quadrilateral face by connecting three or four of the intersections included in each of the boundary cells; and (f) generating closed polyhedron data which is manifold and which includes no self-intersection, by connecting all of the triangular or quadrilateral faces.
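Sub-steps (b) to (e) above can be sketched for a single tetrahedral cell as follows. This is an illustrative sketch only: the implicit function `inside` (here a unit sphere) and the bisection-based intersection search are assumptions made for the example, not the formulation of the invention.

```python
# Hypothetical sketch of sub-steps (b)-(e) for one tetrahedral cell.
# `inside` stands in for the implicit (inside/outside) function of the
# invention; a unit sphere at the origin is assumed for illustration.

def inside(p):
    x, y, z = p
    return x * x + y * y + z * z < 1.0

def surface_point(p_in, p_out, steps=32):
    # Bisect the edge between an inner point and an outer point to
    # approximate the intersection with the surface (sub-step (d)).
    for _ in range(steps):
        mid = tuple((a + b) / 2 for a, b in zip(p_in, p_out))
        if inside(mid):
            p_in = mid
        else:
            p_out = mid
    return p_in

def slice_cell(cell):
    # (b) classify the four vertices into inner points and outer points
    inner = [v for v in cell if inside(v)]
    outer = [v for v in cell if not inside(v)]
    # (c) a boundary cell has both inner and outer vertices
    if not inner or not outer:
        return None
    # (d)/(e) one intersection per inner-outer edge: a 1-3 or 3-1 split
    # gives 3 points (triangle), a 2-2 split gives 4 (quadrilateral)
    return [surface_point(i, o) for i in inner for o in outer]

# A boundary cell straddling the unit sphere yields a triangular face.
cell = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 2.0)]
face = slice_cell(cell)
print(len(face))  # 3 intersections -> a triangular slice cross-section
```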
EFFECT OF THE INVENTION
In the present invention, the entire measurement region in which a surface representing the surface shape of the measurement-target object exists is divided into the tetrahedral cells filling the measurement region without a gap and without overlapping each other, by performing the division processing using the three-dimensional Delaunay diagram. Hence, the number of the intersections between the surface and each boundary cell is inevitably three or four. Accordingly, the triangular or quadrilateral face obtained by connecting the intersections can be regarded as a face obtained by slicing (cutting) the boundary cell along the surface
(hereafter, this face is referred to as slice cross-section). Thus, the closed polyhedron data can be generated by connecting the slice cross-sections of all of the boundary cells. Moreover, since two faces share each of the edges of the slice cross-sections, it can be said that the closed polyhedron data is not non-manifold where three or more faces share one edge, and is thus manifold. Furthermore, the cells in the three-dimensional Delaunay diagram do not overlap each other due to the definition of the three-dimensional Delaunay diagram. Accordingly, the polyhedron data generated by connecting the slice cross-sections of the cells can be guaranteed to include no self-intersection.
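The manifold property stated above, that every edge of the closed polyhedron is shared by exactly two faces, can be verified with a simple edge count. The sketch below is illustrative only, using a tetrahedron as a minimal closed polyhedron with faces listed by vertex index.

```python
# Illustrative check of the manifold condition: in a closed manifold
# polyhedron, every undirected edge is referenced by exactly two faces.
from collections import Counter

faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]  # a tetrahedron

def edge_counts(faces):
    counts = Counter()
    for f in faces:
        n = len(f)
        for i in range(n):
            # undirected edge between consecutive vertices of the face
            e = tuple(sorted((f[i], f[(i + 1) % n])))
            counts[e] += 1
    return counts

counts = edge_counts(faces)
print(all(c == 2 for c in counts.values()))  # True -> closed and manifold
```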
In other words, the present invention can surely generate the closed polyhedron data which is manifold and which includes no self-intersection in a simple way.
In conventional techniques, it is difficult to generate polyhedron data in a format easy to handle in subsequent processing, such as the polyhedron data which is manifold and which includes no self-intersection. Accordingly, when the shape data having the characteristics described above is required, the generation of such data relied on manual shape correction works.
Since continuous coordinate values can be employed in the intersection coordinate calculation, the approximation accuracy of the shape in the present invention is higher than that in a conventional discrete shape representation method such as voxels. Accordingly, it is possible to suppress loss of information such as a normal vector of each triangular face.
Note that, in the description following "EFFECT OF THE INVENTION", "surface" is an abstract representation of the measurement-target object obtained by abstracting matters other than the surface shape from the measurement-target object. Generally, "surface" means the measurement-target object itself observed from the outside as described in the "MEANS FOR SOLVING THE PROBLEMS" and "CLAIM".
BRIEF DESCRIPTION OF THE DRAWINGS
[Fig. 1] Fig. 1 is an explanatory view showing how a measurement-target object is scanned with a three-dimensional measurement sensor.
[Fig. 2a] Fig. 2a is an explanatory view of two-dimensional data in which point cloud data is projected in a measurement direction.
[Fig. 2b] Fig. 2b is an explanatory view of a two-dimensional triangular mesh.
[Fig. 3] Fig. 3 is an explanatory view of a three-dimensional triangular mesh.
[Fig. 4] Fig. 4 is an explanatory view showing a relationship between the point cloud data and a surface.
[Fig. 5] Fig. 5 is an explanatory view showing a relationship between the measurement-target object and three measurement directions.
[Fig. 6] Fig. 6 is an explanatory view of a region including the target object and a region other than that when the measurement direction is an upper face measurement direction.
[Fig. 7] Fig. 7 is an explanatory view of the region including the target object and the region other than that when the measurement direction is a left face measurement direction.
[Fig. 8] Fig. 8 is an explanatory view of the region including the target object and the region other than that when the measurement direction is a right face measurement direction.
[Fig. 9] Fig. 9 is an explanatory view showing a logical conjunction of the implicit functions of the measurement directions.
[Fig. 10] Fig. 10 is an explanatory view of a final implicit function.
[Fig. 11] Fig. 11 is a three-dimensional Delaunay diagram showing the surface of the measurement-target object in a measurement region.
[Fig. 12] Fig. 12(a) is a perspective view showing one unit of a polyhedron cell, Fig. 12(b) is a perspective view of a sphere formed by the polyhedron cells, and Fig. 12(c) is a perspective view of a rectangular solid formed by the polyhedron cells.
[Fig. 13] Fig. 13 is a three-dimensional Delaunay diagram showing outer points and inner points of the surface.
[Fig. 14] Fig. 14 is a three-dimensional Delaunay diagram showing intersections between the surface and cells.
[Fig. 15] Fig. 15 is a perspective view (part 1) showing a slice cross-section of a cell sliced along the surface.
[Fig. 16] Fig. 16 is a perspective view (part 2) showing a slice cross-section of a cell sliced along the surface.
[Fig. 17] Fig. 17 is a perspective view (part 3) showing a slice cross-section of a cell sliced along the surface.
[Fig. 18] Fig. 18 is an explanatory view showing closed polyhedron data which is manifold.
[Fig. 19] Fig. 19 is a system configuration diagram showing an embodiment in which a shape measuring method of the present invention is applied to an NC device.
[Fig. 20] Fig. 20 is an explanatory view of a machine tool used in the shape measuring method of the present invention.
[Fig. 21] Fig. 21 is a flowchart showing a work procedure of the shape measurement.
[Fig. 22] Fig. 22 is a flowchart showing STL processing.
[Fig. 23] Fig. 23 is an explanatory view showing an example (part 1) of a non-manifold polyhedron.
[Fig. 24] Fig. 24 is an explanatory view showing an example (part 2) of a non-manifold polyhedron.
[Fig. 25] Fig. 25 is an explanatory view showing an example (part 3) of a non-manifold polyhedron.
[Fig. 26] Fig. 26 is a two-dimensional Delaunay diagram.
[Fig. 27] Fig. 27 is an explanatory diagram of a voxel shape model.
[Fig. 28] Fig. 28(a) is an explanatory diagram showing an example of a polyhedron including a self-intersection in two dimensions, Fig. 28(b) is an explanatory diagram showing an example (1) of a polyhedron including the self-intersections in three dimensions, and Fig. 28(c) is an explanatory diagram showing an example (2) of a polyhedron including the self-intersection in three dimensions.
[Fig. 29] Fig. 29 is a schematic view of a side-surface machining attachment.
[Fig. 30] Fig. 30 is a schematic view of a sensor-dedicated attachment.
[Fig. 31] Fig. 31 is an explanatory view showing an example in which a laser beam does not return to a measurement device.
[Fig. 32] Fig. 32 is an explanatory view showing an example in which the laser beam returns to the measurement device.
[Fig. 33] Fig. 33 is an explanatory view showing a lattice-pattern measurement route.
[Fig. 34] Fig. 34 is a schematic view of an attachment whose inclination angle can be arbitrarily changed.
MODE FOR CARRYING OUT THE INVENTION
In the present invention, first, measured point cloud data which is three-dimensional coordinates of a measured point cloud in a measurement-target object is obtained by scanning the measurement-target object with a three-dimensional measurement unit.
Specifically, in a mode of embodiment, as shown in Fig. 1, a three-dimensional measurement sensor 1 is attached to a main spindle 2 of a machine tool and measures a measurement-target object 3 while the main spindle 2 is moved in a main-spindle moving direction (horizontal direction) B orthogonal to a measurement direction (downward direction) A. The measurement-target object 3 is placed on a table 9 and the main spindle 2 of the machine tool is movable in three-dimensional directions.
For example, when a laser sensor configured to measure the distance to the measurement-target object 3 is used as the three-dimensional measurement sensor 1, the measurement-target object 3 can be three-dimensionally measured by repeating the following operations. After being moved in the main-spindle moving direction B, the three-dimensional measurement sensor 1 is moved in a direction perpendicular to the sheet surface of the drawing by a constant distance and thereafter moved in the main-spindle moving direction B.
Alternatively, when a line sensor in which laser sensors configured to measure the distance to the measurement-target object 3 are arranged linearly in the direction perpendicular to the sheet surface of the drawing is used as the three-dimensional measurement sensor 1, the measurement-target object 3 can be three-dimensionally measured by moving the three-dimensional measurement sensor 1 once in the main-spindle moving direction B.
Measured points a which are points measured with the three-dimensional measurement sensor 1 are points located below the three-dimensional measurement sensor 1 in the measurement direction A which is the downward direction. The measured points a exist not only on the measurement-target object 3 but also on the table 9.
The measured point cloud data measured for the measured points a includes three-dimensional coordinates of the main spindle 2, i.e. three-dimensional coordinates of the three-dimensional measurement sensor 1 and the distances from the three-dimensional measurement sensor 1 to the measurement-target object 3. The point cloud data includes a range visible in a view in the measurement direction A and does not include a portion of the target object in a blind area (hidden area) which is not visible. In other words, the measured point cloud data is point cloud data with no overlapping portion in the view in the measurement direction.
Since the point cloud data has no overlapping portion in the view in the measurement direction as described above, the point cloud data includes the distance from the three-dimensional measurement sensor 1 to the measurement-target object 3, i.e. height information. Accordingly, the point cloud data can be considered as "two-dimensional point cloud data having height information (= three-dimensional point cloud data)".
Specifically, the measured point cloud data is considered as two-dimensional data of projection in the measurement direction as shown in Fig. 2a and a two-dimensional triangular mesh is created from the two-dimensional point cloud data as shown in Fig. 2b. In Fig. 2a, the point cloud data is points of two-dimensional point cloud arranged at constant intervals in horizontal and vertical directions. Moreover, the two-dimensional triangular mesh is a mesh in which points of the two-dimensional point cloud are connected by edges to form vertices of triangles as shown in Fig. 2b.
In the mode of embodiment, the two-dimensional triangular mesh is generated from the two-dimensional point cloud data by utilizing the Delaunay triangulation.
Here, the Delaunay triangulation is a method of creating a triangular mesh to represent a surface shape when the surface shape is to be restored from a measured point cloud which is a group of three-dimensional coordinates, and refers to division of a space by triangles satisfying the following conditions.
(1) Points in the point cloud are vertices
(2) No points in the point cloud are included in circumscribed circles of the triangles.
Note that the method used herein is not limited to the Delaunay triangulation as long as the surface shape can be restored from the measured point cloud which is the group of three-dimensional coordinates.
Since the vertices (point cloud data) of the two-dimensional triangular mesh generated in the procedure described above have the height information, a three-dimensional triangular mesh (= surface C) representing the surface shape of the measurement-target object can be obtained by adding the height information as shown in Fig. 3.
Specifically, in the mode of embodiment, the surface C representing the surface shape of the measurement-target object with a three-dimensional triangular mesh is created from the "two-dimensional point cloud data having the height information", as surface data (hereafter referred to as triangular mesh data). The three-dimensional triangular mesh is also a type of surface representing the surface shape of the measurement-target object, and the triangular mesh data is also a type of surface data. Note that Fig. 3 shows an image of the surface C represented with the three-dimensional triangular mesh and is a shape different from the surface shape of the measurement-target object in Fig. 1. Moreover, the surface C shown in Fig. 4
(shown by a broken line in the drawing) corresponds to the measurement-target object 3 shown in Fig. 1.
As described above, the surface C representing the surface shape of the measurement-target object 3 with the three-dimensional triangular mesh can be obtained as the triangular mesh data on the basis of the point cloud data obtained by scanning the measurement-target object 3 with the three-dimensional measurement sensor 1 from one measurement direction. However, this data is merely a result obtained from one measurement direction.
Thus, in the embodiment, results obtained from multiple directions are combined to generate the following implicit function representing the measurement-target object. Generation processing of this implicit function can convert the surface data which is a group of pieces of face data and which does not strictly define volume elements of an object to solid data which strictly defines the volume elements of the object, and represent the measurement target object. Specifically, the implicit function of the mode of embodiment is an inside-outside determination function which receives a certain three-dimensional coordinate value and which outputs 1 if the designated coordinate value is inside the measurement-target object and outputs 0 if the designated coordinate value is outside the measurement-target object.
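Such an inside-outside determination function for a single downward measurement direction can be sketched as a height-map comparison. The grid and height values below are assumptions made for illustration, not measured values or the formulation of the embodiment.

```python
# Illustrative inside/outside implicit function for one (downward)
# measurement direction: a point is "inside" (output 1) when it lies
# at or below the measured surface height above it.

heights = {                  # assumed surface height per (x, y) grid cell
    (0, 0): 2.0, (0, 1): 2.0,
    (1, 0): 3.0, (1, 1): 3.0,
}

def implicit(x, y, z):
    # Output 1 inside the measurement-target object, 0 outside.
    h = heights.get((int(x), int(y)))
    if h is None:            # outside the measured region
        return 0
    return 1 if z <= h else 0

print(implicit(0, 0, 1.0))   # below the surface -> 1 (inside)
print(implicit(0, 0, 2.5))   # above the surface -> 0 (outside)
```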
Specifically, the implicit function is created which separates a measurement region into a region (1) including no target object in a view in one measurement direction and a region (2) other than the region (1) by utilizing the triangular mesh data representing the surface shape of the measurement-target object with the three-dimensional triangular mesh.
The region (1) is a region on the three-dimensional measurement sensor side of the surface C while the region (2) is a region on the measurement-target object side of the surface C. Accordingly, a boundary plane between the region (1) and the region (2) is the surface C (see Fig. 4).
In summary, the implicit function separates the measurement region into the region (1) including no target object and the region (2) other than the region (1) with the surface C being the boundary plane. In other words, the implicit function can be considered as the surface data which defines the surface C representing the surface shape of the measurement-target object 3, by using the boundary plane between the region (1) including no target object and the region (2) other than the region (1).
Here, the region (2) is a region in which the measurement-target object may exist and cannot be immediately determined as a region in which the measurement-target object exists. This is because there may be, for example, a region which is determined to be the region (2) when the measurement direction A is the downward direction as shown in Fig. 1, but is determined to be the region (1) including no target object when the measurement direction is a different direction such as a horizontal direction.
As described above, the region in which the target object exists is overestimated when the implicit function of only one face measurement is used. In other words, in some cases, the region in which no measurement-target object actually exists is erroneously determined as the region in which the measurement-target object exists.
To counter this, the measurement-target object is measured from multiple directions, multiple pieces of measured point cloud data are obtained, multiple pieces of triangular mesh data representing the surface shape of the measurement-target object with the three-dimensional triangular mesh are created, the implicit function is created for each piece of triangular mesh data, and the created multiple implicit functions are combined to obtain a final implicit function. This can suppress the overestimation as much as possible.
Ideally, it is desirable to measure the measurement-target object from six directions of a plan view, a right face view, a front view, a left face view, a back view, and a bottom view. However, the surface representing the surface shape of the entire region of the
measurement-target object can be calculated to some extent by combining (calculating logical conjunction of) the implicit functions based on pieces of point cloud data obtained by measuring the measurement-target object from five directions excluding the bottom view.
This will be briefly described below. As shown in Fig. 5, three-dimensional measurement is performed on the measurement-target object 3 from the upper face measurement direction A, a left face measurement direction D, and a right face measurement direction E. Note that, since three dimensions cannot be expressed in the drawing of Fig. 5, description of the measurement from the two directions perpendicular to the sheet surface of the drawing is omitted and description is given mainly in two dimensions.
When the measurement direction is the upper face measurement direction A, as shown in Fig. 6, an implicit function is created in which a surface C1 (shown by a broken line in the drawing) representing the shape of an upper face of the measurement-target object 3 is set as the boundary plane and in which a region above the surface C1 is defined as the region (1) including no target object while a region below the surface C1 is defined as the region (2) including the target object. The surface C1 has a shape of a bowl provided with a protruding portion in a bottom portion. The region (2) including the target object is hatched by diagonal lines in the drawings.
Similarly, when the measurement direction is the left face measurement direction D, as shown in Fig. 7, an implicit function is created in which a surface C2 (shown by a broken line in the drawing) representing the shape of a left face of the measurement-target object 3 is set as the boundary plane and in which a region on the left side of the surface C2 is defined as the region (1) including no target object while a region on the right side of the surface C2 is defined as the region (2) including the target object. However, in the actual measurement, a measurement range is limited to be equal to or higher than a constant height from the table 9 in order to avoid interference
between the three-dimensional measurement sensor 1 and the table 9, and a portion of the measurement-target object 3 in contact with the table 9 cannot be measured. The region which cannot be measured is the region (1) in which no target object exists.
Similarly, when the measurement direction is the right face measurement direction E, as shown in Fig. 8, an implicit function is created in which a surface C3 (shown by a broken line in the drawing) representing the shape of a right face of the measurement-target object 3 is set as the boundary plane and in which a region on the right side of the surface C3 is defined as the region (1) including no target object while a region on the left side of the surface C3 is defined as the region (2) including the target object. However, in the actual measurement, the measurement range is limited to be equal to or higher than the constant height from the table 9 in order to avoid interference between the three-dimensional measurement sensor 1 and the table 9, and the portion of the measurement-target object 3 in contact with the table 9 cannot be measured. The region which cannot be measured is the region (1) in which no target object exists.
When the logical conjunction of the implicit functions corresponding to the surfaces C1, C2, C3 obtained from the measurement from the three directions as described above is calculated, as shown in Fig. 9, some of the regions defined as the regions (2) including the target object in the measurement from each of the measurement directions A, D, E are defined as the regions (1) including no target object in the measurement from the other measurement directions. Since the measurement is actually performed three-dimensionally, the logical conjunction of the implicit functions obtained from the measurement from the two directions perpendicular to the sheet surface of the drawing is also calculated. Here, "logical conjunction" refers to determination in which a region defined as the region (2) including the target object in the measurement from one measurement direction is determined to be the region (1) including no target object when the region is defined as the region (1) including no target object in the measurement from another measurement direction.
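The logical conjunction of the per-direction implicit functions can be sketched as follows. The three half-space functions standing in for the measurements from directions A, D, and E are illustrative assumptions; a point belongs to the final shape only when every direction classifies it as the region (2) (output 1).

```python
# Illustrative logical conjunction of per-direction implicit functions.
# Each function outputs 1 for region (2) (may contain the object) and
# 0 for region (1) (no object); the final function is their conjunction.

def from_above(x, y, z):   # stands in for upper face measurement A
    return 1 if z <= 1.0 else 0

def from_left(x, y, z):    # stands in for left face measurement D
    return 1 if x >= -1.0 else 0

def from_right(x, y, z):   # stands in for right face measurement E
    return 1 if x <= 1.0 else 0

def final_implicit(x, y, z):
    # Logical conjunction of all per-direction implicit functions.
    return min(f(x, y, z) for f in (from_above, from_left, from_right))

print(final_implicit(0.0, 0.0, 0.0))   # inside from every direction -> 1
print(final_implicit(2.0, 0.0, 0.0))   # outside in direction E      -> 0
```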
For the bottom face, as shown in Fig. 10, the shape can be obtained from the side face measurement from the measurement directions D, E (including measurement from the other two directions perpendicular to the sheet surface of the drawing), although the obtained shape is inaccurate. Specifically, since the region which cannot be measured is defined as the region (1) in which no target object exists in the measurement from the side face measurement directions D, E, the shape of the bottom surface can be determined as the shape cut horizontally at a lower end of the measurement range. Moreover, the shape of the bottom surface can be determined similarly in the measurement from the two directions perpendicular to the sheet surface of the drawing.
However, this is considered to be "inaccurate" because the measurement from the bottom is not performed and the overestimation actually remains. Specifically, although the logical conjunction of the implicit functions obtained from the measurement from the measurement directions is the final implicit function and is the surface data defining the surface representing the surface shape of the measurement-target object, as shown in Fig. 10, an accurate shape cannot be obtained for a recessed portion 3a formed in a center portion of the bottom face, which remains overestimated. Similarly, an accurate shape cannot be obtained for the portion of the measurement-target object 3 in contact with the table 9.
If the measurement from the bottom face direction is performed, the final implicit function can accurately represent the surface shape of the entire region of the measurement-target object.
As shown in Fig. 11, the thus-calculated final implicit function is surface data defining the surface C representing the surface shape of the measurement-target object in the measurement region. The region outside the surface C is the region (1) including no target object, and the region inside the surface C is the region (2) other than the region (1), i.e. the region including the target object. Moreover, the surface C defined by the final implicit function does not coincide with the surface C represented by the three-dimensional triangular mesh in a precise sense. However, since the values roughly coincide with one another when the intervals in the point cloud data are sufficiently small, the surfaces are denoted by the same reference sign C in the mode of embodiment. Note that the measurement region is not a region in which the measurement-target object physically exists but a region in which the surface C representing the surface shape of the measurement-target object is virtually assumed to exist in a computer.
In the mode of embodiment "surface data" of the present invention is the final implicit function defining the surface C in the measurement region.
Next, description is given of a method of generating polyhedron data from the implicit function which is generated by the procedure described above and which represents the measurement-target object.
In the present invention, as shown in Fig. 11, the entire measurement region is divided into a group of tetrahedral small regions (cells) by division processing using a three-dimensional Delaunay diagram, the tetrahedral small regions filling the measurement region without a gap.
Here, the division processing using the three-dimensional Delaunay diagram refers to processing in which points of a point cloud 4 (shown by black circles in the drawing) randomly arranged in the measurement region are connected to be vertices of the small tetrahedral regions (cells).
Fig. 11 is a three-dimensional Delaunay diagram showing the measurement region that is a space in which the surface C of the measurement-target object exists. However, owing to the limitations of a two-dimensional drawing, triangles are used instead to represent the tetrahedra.
In Fig. 11, the points of the point cloud 4 are randomly arranged in the measurement region. However, the arrangement is not limited to this and the points may be regularly arranged in a lattice pattern.
Note that the points of the point cloud 4 in the Delaunay diagram are assumed to be arranged both inside and outside the surface C of the measurement-target object, and to be always arranged near a boundary of the measurement region.
Here, the three-dimensional Delaunay diagram is a diagram in which the two-dimensional Delaunay diagram is expanded. The two-dimensional Delaunay diagram is "space division where the circumscribed circle of each triangle (cell) includes no other vertices therein". In contrast, the three-dimensional Delaunay diagram is "space division where the circumscribed sphere of each tetrahedron (cell) includes no other vertices therein".
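The empty-circumsphere property that defines the three-dimensional Delaunay division can be checked directly. The following is a minimal illustrative sketch, not part of the claimed method; the regular tetrahedron used as an example is hypothetical.

```python
# Sketch of the empty-circumsphere test: a tetrahedron is a valid
# Delaunay cell only if its circumscribed sphere contains no other vertex.
import math

def circumsphere(a, b, c, d):
    # Solve |x-a|^2 = |x-b|^2 = |x-c|^2 = |x-d|^2 for the centre x
    # via the equivalent 3x3 linear system, using Cramer's rule.
    def row(p, q):
        return ([2 * (q[i] - p[i]) for i in range(3)],
                sum(q[i] ** 2 - p[i] ** 2 for i in range(3)))
    (r1, k1), (r2, k2), (r3, k3) = row(a, b), row(a, c), row(a, d)
    det = lambda m: (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                   - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                   + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    M = [r1, r2, r3]
    D = det(M)
    centre = []
    for col in range(3):
        Mc = [r[:] for r in M]
        for i, k in enumerate((k1, k2, k3)):
            Mc[i][col] = k
        centre.append(det(Mc) / D)
    return centre, math.dist(centre, a)

def empty_circumsphere(tet, others, eps=1e-9):
    centre, r = circumsphere(*tet)
    return all(math.dist(p, centre) > r + eps for p in others)

# Hypothetical example: a regular tetrahedron whose circumsphere is
# centred at the origin with radius sqrt(3).
tet = [(1.0, 1.0, 1.0), (1.0, -1.0, -1.0), (-1.0, 1.0, -1.0), (-1.0, -1.0, 1.0)]
```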
Examples of the tetrahedral cells are shown in Fig. 12. Fig. 12(a) shows one unit of the tetrahedral cell, and Figs. 12(b) and 12(c) each show an example of a sphere or a rectangular solid formed by connecting multiple tetrahedral cells.
Subsequently, as shown in Fig. 13, the points of the point cloud 4 in the three-dimensional Delaunay diagram, i.e. the vertices of the cells are classified into an inner point 5 (shown by triangles in the drawing) existing inside the surface C of the measurement-target object and an outer point 6 (shown by squares in the drawing) existing outside the surface C.
Then, as shown in Fig. 13, cells whose four vertices include both of the inner point 5 and the outer point 6 are extracted from all of the cells in the three-dimensional Delaunay diagram.
Such cells can be considered as cells near the surface C of the measurement-target object and are thus called boundary cells.
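The classification of vertices and the extraction of boundary cells can be sketched as below, again assuming a signed implicit function (negative inside the surface C); the unit-sphere function is a hypothetical stand-in for the final implicit function.

```python
# Sketch of classifying cell vertices into inner points (5) and
# outer points (6), and extracting boundary cells.

def classify(f, vertices):
    inner = [v for v in vertices if f(v) < 0]   # inner points (5)
    outer = [v for v in vertices if f(v) >= 0]  # outer points (6)
    return inner, outer

def is_boundary_cell(f, tet):
    # A boundary cell has both inner and outer points among its 4 vertices.
    inner, outer = classify(f, tet)
    return bool(inner) and bool(outer)

# Hypothetical implicit function: unit sphere, negative inside.
f = lambda p: p[0] ** 2 + p[1] ** 2 + p[2] ** 2 - 1.0
```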
Furthermore, as shown in Fig. 14, attention is given to edges each having a combination of the inner point 5 and the outer point 6 located at the respective two ends among the edges of the boundary cells, and intersections 7 (boundary coordinates, shown by double circles in the drawing) between the surface C of the measurement-target object and each boundary cell are calculated by a bisection method.
The bisection method is a method as follows. Whether an obtained midpoint is the inner point or the outer point is determined by using the implicit function. When the midpoint is the inner point, a midpoint between the last-obtained midpoint and the outer point of the boundary cell is obtained. When the midpoint is the outer point, a midpoint between the last-obtained midpoint and the inner point of the boundary cell is obtained. This is repeated to obtain the intersections 7 between the surface C and the boundary cell.
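The bisection method just described can be sketched as follows, assuming a signed implicit function that is negative inside the surface C and positive outside; the sphere used for illustration is hypothetical.

```python
# Sketch of the bisection method for locating the intersection (7)
# on an edge running from an inner point to an outer point.

def bisect_intersection(f, inner, outer, iters=50):
    """Locate the surface crossing on the edge from `inner` to `outer`."""
    for _ in range(iters):
        mid = tuple((a + b) / 2 for a, b in zip(inner, outer))
        if f(mid) < 0:      # midpoint is an inner point: move the inner end
            inner = mid
        else:               # midpoint is an outer point: move the outer end
            outer = mid
    return tuple((a + b) / 2 for a, b in zip(inner, outer))

# Hypothetical implicit function: unit sphere centred at the origin.
f = lambda p: p[0] ** 2 + p[1] ** 2 + p[2] ** 2 - 1.0
hit = bisect_intersection(f, (0.0, 0.0, 0.0), (2.0, 0.0, 0.0))
# hit converges to the point where the edge crosses the sphere
```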
As shown in Figs. 15 to 17, the number of the intersections 7 between the surface C and each boundary cell is inevitably three or four and a triangular or quadrilateral face 8 can be obtained by appropriately connecting the intersections 7.
Specifically, Fig. 15 shows a case where the four vertices of the cell include three inner points 5 and one outer point 6. Since each of three edges having the inner point 5 and the outer point 6 at both ends has an intersection 7, a triangular face 8 can be formed by connecting the three intersections 7.
Moreover, Fig. 16 shows a case where the four vertices of the cell include two inner points 5 and two outer points 6. Since each of four edges having the inner point 5 and the outer point 6 at both ends has an intersection 7, a quadrilateral face 8 can be formed by connecting the four intersections 7.
Furthermore, Fig. 17 shows a case where the four vertices of the cell include one inner point 5 and three outer points 6. Since each of three edges having the inner point 5 and the outer point 6 at both ends has an intersection 7, a triangular face 8 can be formed by connecting the three intersections 7.
Each of the triangular and quadrilateral faces 8 obtained in the aforementioned procedure can be considered as a slice cross-section obtained by slicing (cutting) the boundary cell along the surface C of the measurement-target object. A quadrilateral slice cross-section 8 corresponds to two triangular slice cross-sections 8 when divided into two.
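The case analysis of Figs. 15 to 17 can be reproduced by counting the edges of a tetrahedral cell whose two ends differ in inside/outside classification; this is an illustrative sketch only.

```python
# Sketch of the case analysis: the number of edges with one inner and
# one outer end equals the number of intersections (7), which in turn
# determines whether the face (8) is a triangle or a quadrilateral.
from itertools import combinations

def face_shape(inside_flags):
    """inside_flags: 4 booleans, one per tetrahedron vertex."""
    crossing = sum(1 for i, j in combinations(range(4), 2)
                   if inside_flags[i] != inside_flags[j])
    return {3: "triangle", 4: "quadrilateral"}.get(crossing)

# 3 inner / 1 outer -> 3 crossing edges -> triangular face (Fig. 15)
# 2 inner / 2 outer -> 4 crossing edges -> quadrilateral face (Fig. 16)
# 1 inner / 3 outer -> 3 crossing edges -> triangular face (Fig. 17)
```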
Accordingly, it is possible to generate the polyhedron data (STL format) defined as a closed polyhedron which is manifold and which includes no self-intersection, by connecting the slice cross-sections 8 of all of the boundary cells.
Specifically, as shown in Fig. 18, since the slice cross-sections 8 of all of the boundary cells are connected, the generated polyhedron data is a closed polyhedron 10. Moreover, since two slice cross-sections 8 share each of the edges of the slice cross-sections 8, it can be said that the polyhedron is not non-manifold where three or more faces share one edge, and is thus manifold.
Furthermore, the cells in the three-dimensional Delaunay diagram do not overlap each other due to the definition of the three-dimensional Delaunay diagram. Accordingly, the polyhedron data generated by connecting the slice cross-sections of the boundary cells can be guaranteed to include no self-intersection.
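The manifold property stated above, that every edge of the closed polyhedron is shared by exactly two faces, can be verified mechanically. The sketch below is illustrative; the tetrahedron's four faces serve as a minimal hypothetical closed surface.

```python
# Sketch of the manifold check: in a closed manifold triangle mesh,
# every edge is shared by exactly two faces (never one, never three).
from collections import Counter

def is_manifold(triangles):
    edges = Counter()
    for a, b, c in triangles:
        for e in ((a, b), (b, c), (c, a)):
            edges[tuple(sorted(e))] += 1
    return all(n == 2 for n in edges.values())

# The four faces of a tetrahedron form the smallest closed manifold
# surface; removing one face leaves edges shared by only one face.
tet_faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
```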
Moreover, the STL format is a format in which a three-dimensional shape is represented as a group of small triangles. The shape of each slice cross-section 8 is a triangle or a quadrilateral, and each quadrilateral becomes two triangles when divided. Accordingly, the polyhedron data obtained by connecting the slice cross-sections 8 is generated in the STL format as a group of triangles.
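Emitting the slice cross-sections as triangle-only STL can be sketched as follows. This is an illustrative sketch of the ASCII variant of the format; the zero normals and the example quadrilateral are simplifying assumptions.

```python
# Sketch of writing slice cross-sections as ASCII STL: each
# quadrilateral is split into two triangles so the output is a pure
# triangle list, as the STL format requires.

def quad_to_triangles(quad):
    a, b, c, d = quad
    return [(a, b, c), (a, c, d)]

def to_ascii_stl(faces, name="shape"):
    lines = [f"solid {name}"]
    for tri in faces:
        # Normals are written as zero in this sketch; real STL writers
        # compute them from the vertex winding.
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in tri:
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# Hypothetical quadrilateral slice cross-section in the z = 0 plane:
tris = quad_to_triangles([(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)])
stl_text = to_ascii_stl(tris)
```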
EMBODIMENT 1
An embodiment in which the shape measuring method of the present invention is applied to an NC device is described below in detail with reference to the drawings.
As shown in Fig. 19, the embodiment includes an NC device 100 and a measurement system 200 and is used for a machine tool 140 shown in Fig. 20.
The NC device 100 includes: an NC program storage part 110 configured to store an NC program describing a movement route of a tool used to perform machining of a workpiece; an NC program analysis part 120 configured to create information related to the movement amount and the movement speed of the tool on the basis of the NC program read from the NC program storage part 110; a movement control part 130 configured to control movement of the machine tool 140 including a table, a saddle, and a ram, on the basis of the information created by the NC program analysis part 120; and the machine tool 140 including the table, the ram, and the saddle.
Fig. 20 shows the machine tool 140 including the table, the ram, and the saddle. As shown in Fig. 20, the machine tool 140 includes: a table 141 on which the workpiece (measurement-target object) is placed and which moves in the X direction; a supporting part 142 which is formed in a gate shape to straddle the table 141; a beam part 143 which extends in the Y direction in an upper part of the supporting part 142; a saddle 144 which is provided on the beam part 143 to be movable in the Y direction; and a ram (main spindle) 145 which is movable in the Z direction on the saddle 144.
Accordingly, the three-dimensional coordinates of the ram 145 with respect to the workpiece on the table 141 can be obtained from the movement amounts of the table 141, the saddle 144, and the ram 145. A three-dimensional measurement unit 210 configured to measure the distance to the workpiece is attached to the ram 145 in measurement, while the tool is attached to the ram 145 in machining.
The measurement system 200 includes: the three-dimensional measurement unit 210 configured to measure the distance to a measurement-target object 300 which is the workpiece; a measurement unit control part 220 configured to control the three-dimensional measurement unit 210; a measured point storage part 230 configured to store point cloud data which include the distance to the workpiece measured by the three-dimensional measurement unit 210 and the three-dimensional coordinates (X, Y, Z) of the ram 145 at the time of the measurement; and an STL generation part 240 configured to generate polyhedron data (STL data) on the basis of the point cloud data stored in the measured point storage part 230, the polyhedron data defined as a closed polyhedron which is "manifold" and which includes no self-intersection.
For example, a laser distance sensor used for distance measurement can be used as the three-dimensional measurement unit 210.
Fig. 21 shows a flowchart of shape measurement by the measurement system 200.
First, a worker places the measurement-target object 300 on the table 141 (step S1).
Next, the worker attaches the three-dimensional measurement unit 210 to the ram 145 (step S2).
Then, the distance from the three-dimensional measurement unit 210 to the measurement-target object is measured and the three-dimensional coordinates (X, Y, Z) of the ram 145 at the time of the measurement are obtained (step S3). This processing is performed multiple times while the position of the three-dimensional measurement unit 210 relative to the measurement-target object 300 is changed by moving the table 141 in the X direction and by moving the saddle 144 in the Y direction. In other words, the measurement-target object 300 is scanned by the three-dimensional measurement unit 210.
Subsequently, the three-dimensional coordinates (point cloud data) of each measured point are calculated based on the distance from the three-dimensional measurement unit 210 to the measurement-target object 300 and the three-dimensional coordinates (X, Y, Z) of the ram 145 at the time of the measurement (step S4).
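Step S4 can be sketched in code. The sensor geometry here is an assumption not stated in the specification: the laser is taken to point straight down (-Z), so the measured point lies the measured distance below the sensor position; the numeric readings are hypothetical.

```python
# Sketch of step S4: combining a sensor distance reading with the
# machine-tool coordinates of the ram at the time of measurement.
# Assumed geometry: the sensor measures straight down along -Z.

def measured_point(ram_xyz, distance):
    x, y, z = ram_xyz
    return (x, y, z - distance)

# Hypothetical (ram position, distance) pairs from one scan pass:
readings = [((0.0, 0.0, 500.0), 120.0),
            ((10.0, 0.0, 500.0), 118.5)]
cloud = [measured_point(p, d) for p, d in readings]
```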
Here, specific operation contents of "attaching the three-dimensional measurement unit (step S2)" to "calculating the coordinates of each measured point (step S4)" are additionally described below in sections (1) to (4).
(1) Since it is necessary to measure five faces (in a case of a vertical-type machine, an upper face and four side faces) of the measurement-target object 300 placed on the table 141, a measurement route of the three-dimensional measurement unit 210 is determined. Since the three-dimensional measurement unit 210 is attached to the ram (main spindle) 145 of the machine tool, movement along the measurement route of the three-dimensional measurement unit 210 can be achieved by axial motion of the machine tool.
(2) Each face of the measurement-target object 300 is measured (scanned) by moving the three-dimensional measurement unit 210 while maintaining it at a constant height. Then, the distance to the measurement-target object 300, which is the information obtained by the three-dimensional measurement unit 210, and the machine tool coordinate information (i.e. the position coordinates of the three-dimensional measurement unit 210) at each time of measurement are combined. The machine tool coordinate data (point cloud data) of the target object can thereby be calculated.
(3) Since the five faces of the measurement-target object 300 placed on the table 141 are sequentially measured, it is possible to determine, for the point cloud data calculated for each face, the direction (view point) from which the measurement-target object 300 has been measured. In other words, the outside and the inside of the measurement-target object can be determined if the point cloud data and view point information (measurement direction) are known. To put it differently, as described in Figs. 5 to 10, the measurement region can be separated into the region (1) including no target object and the region (2) other than the region (1) by using the logical conjunction of the implicit functions.
(4) The point cloud data calculated for each face is subjected to the following operations. As described above in the mode of embodiment, "the two-dimensional point cloud data having the height information" is created. Then, the three-dimensional mesh data representing the shape with the three-dimensional triangular mesh is created. Next, the implicit function is created from the triangular mesh data. After these operations are repeated for each face, the logical conjunction of the multiple implicit functions created for the faces is calculated to obtain the final implicit function.
Thereafter, a shape model (STL data) representing the surface shape of the measurement-target object is generated from the final implicit function (step S5).
Step S5 is performed by the STL generation part 240 according to the flowchart shown in Fig. 22, as described below specifically.
First, as shown in Fig. 11, the STL generation part 240 generates a group of cells filling the entire measurement region in which the surface defined by the final implicit function exists (step S6).
Next, as shown in Fig. 13, the STL generation part 240 determines whether each of the vertices of the cells is inside or outside the surface (step S7).
Then, as shown in Fig. 14, the STL generation part 240 extracts the boundary cells having both the inner point and the outer point from all of the cells (step S8).
Thereafter, as shown in Fig. 14, the (three or four) intersection coordinates between the surface and the edges of each boundary cell are calculated (step S9).
Furthermore, as shown in Figs. 15 to 17, the (triangular or quadrilateral) slice cross-section is generated by connecting the intersections of each boundary cell (step S10). The quadrilateral slice cross-section is divided into two to be triangles.
Next, as shown in Fig. 18, the slice cross-sections of all of the boundary cells are connected to generate the STL data defined as a closed polyhedron which is "manifold" and which includes no self-intersection (step S11).
EMBODIMENT 2
An embodiment of an attachment for attaching a three-dimensional measurement sensor (hereafter referred to as a measurement device) used in the shape measuring method of the present invention to a main spindle is described with reference to Figs. 29 and 30.
As described above, ideally, it is desirable to perform measurement of six faces because a region in which no measurement-target object exists may be erroneously determined as a region in which the measurement-target object exists from a result of measurement of only one face. A surface representing a surface shape of the entire region of the measurement-target object can be obtained to some extent by measuring five faces (in a case of a vertical-type machine, the upper face and the four side faces) of the target object placed on a table of a machine tool.
Here, as described in Embodiment 1, since a three-dimensional measurement unit can be attached to a main spindle of the machine tool, movement along a measurement route can be achieved by axial motion of the machine tool.
Furthermore, in the embodiment, the measurement is performed by using a side-surface machining attachment attached to the machine tool to optimize the measurement of the side surfaces, an inclined-surface machining attachment, or an attachment having a tilt mechanism dedicated to the sensor.
Specifically, regarding the side-surface machining attachment, a measurement device 22 is attached to a main spindle 20 via a 90-degrees inclined attachment 21 as shown in Fig. 29.
The 90-degrees inclined attachment 21 is one type of side-surface machining attachment and a rotating axis thereof is inclined with respect to a rotating axis of the main spindle 20 at 90 degrees.
For example, when the rotating axis of the main spindle 20 extends in a vertical direction as shown in the drawing, the rotating axis of the 90-degrees inclined attachment 21 extends in a horizontal direction. Accordingly, the distance to a measurement target 23 can be measured by causing a laser beam outputted from the measurement device 22 to reflect on the measurement target 23 while the measurement device 22 is rotated about the two rotating axes.
Moreover, the inclined-surface machining attachment refers to an attachment in which the inclination angle is not limited to 90 degrees as in the 90-degrees inclined attachment 21 described above but can be arbitrarily changed as shown in Fig. 34. The measurement of the five faces is made possible by using this attachment.
Using the attachment for machining as shown in Fig. 29 thus has an advantage that no dedicated attachment is required.
Meanwhile, regarding the sensor-dedicated attachment, as shown in Fig. 30, the measurement device 22 is attached to the main spindle 20 via a rotating mechanism (tilt mechanism) 24 having two rotating axes.
The rotating mechanism 24 is one type of attachment dedicated to the measurement device 22 and has two rotating axes in addition to the rotating axis of the main spindle 20, for example, rotating axes extending in directions orthogonal to each other and intersecting the main spindle at arbitrary angles. Hence, rotation about the three axes in total, including the main spindle, is made possible. This has an advantage that the freedom of the measurement device 22 with respect to the measurement target 23 on a table 25 can be increased.
As shown in Fig. 30, using the attachment dedicated to the sensor thus enables measurement even when there is no attachment for machining.
Moreover, in the sensor-dedicated attachment, it is possible to reduce the size of the attachment and increase the measurement range compared to the attachment for machining.
Particularly, when a line laser sensor is used as the measurement device 22, the line of the laser and the sending direction in the measurement can be adjusted to be perpendicular to each of the measured faces. Freedom in the sending direction is thereby increased even when the line laser is used.
EMBODIMENT 3
An embodiment of the measurement route in the shape measuring method of the present invention is described with reference to Figs. 31 to 33.
For example, as shown in Fig. 31, assume a case where a measurement target 23 of a certain shape is measured by triangulation only from one direction shown by the arrow in the drawing. When the measurement target 23 has a step, a laser beam from a light projecting portion (not illustrated) in a measurement device 22 is blocked and does not return to a light receiving portion (not illustrated) in the measurement device 22 in some cases. Accordingly, missing portions (so-called absences) occur in the three-dimensional shape data.
As shown in Fig. 32, this can be improved in some cases by reversing the arrangement direction of the light projecting portion and the light receiving portion in the measurement device 22 with respect to a measurement direction shown by the arrow in the drawing. However, it is difficult to perform this reversal automatically unless the shape of the measured object is recognized in advance.
Accordingly, in the embodiment, when the measurement target 23 of a certain shape is measured, five faces thereof are each measured along a lattice-pattern measurement route to create the three-dimensional shape data.
Specifically, as shown in Fig. 33, one face of the measurement target 23 is first measured while moving the measurement device 22 from an upper-left position shown by reference numeral 22a along the arrow in a horizontal rightward direction. Thereafter, the following operations are repeated. The measurement device 22 is shifted downward by a constant distance to a position shown by reference numeral 22b or 22c and the measurement is performed while moving the measurement device 22 along the arrow in the horizontal rightward direction. In other words, the measurement is performed along a horizontal direction measurement route.
Then, the measurement is performed while moving the measurement device 22 from a lower-left position shown by reference numeral 22d in Fig. 33 along the arrow in a vertical upward direction. Thereafter, the following operations are repeated. The measurement device 22 is shifted rightward by a constant distance to a position shown by reference numeral 22e or 22f and the measurement is performed while moving the measurement device 22 along the arrow in the vertical upward direction. In other words, the measurement is performed along a vertical direction measurement route.
When the measurement of the one face of the measurement target 23 along the lattice-pattern route including the horizontal direction measurement route and the vertical direction measurement route is completed as described above, similar measurement is performed continuously for the other four faces.
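The lattice-pattern route of Fig. 33 can be sketched as a list of pass start and end points. This is an illustrative sketch; the coordinate convention (x rightward, y downward from the top of the face) and the dimensions are assumptions.

```python
# Sketch of the lattice-pattern measurement route of Fig. 33:
# horizontal passes shifted downward (22a, 22b, 22c, ...), then
# vertical passes shifted rightward (22d, 22e, 22f, ...).
# Convention: x increases rightward, y increases downward from the top.

def lattice_route(width, height, pitch):
    """Return the (start, end) points of each pass over one face."""
    passes = []
    # Horizontal passes: move rightward, then shift down by `pitch`.
    y = 0.0
    while y <= height:
        passes.append(((0.0, y), (width, y)))
        y += pitch
    # Vertical passes: move upward from the bottom, then shift right.
    x = 0.0
    while x <= width:
        passes.append(((x, height), (x, 0.0)))
        x += pitch
    return passes

route = lattice_route(2.0, 2.0, 1.0)
```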
In the embodiment, even when the shape of the measurement target 23 is not recognized in advance (Note that recognition of a rough shape thereof such as recognition that the measurement target 23 has five faces is required), measuring the five faces of the measurement target 23 along the lattice-pattern route can reduce the missing of data due to blocking of the laser beam, compared to the case where the measurement is performed only from one direction.
Furthermore, attaching the measurement device 22 to the main spindle via the attachment having two rotating axes enables measurement of the five faces of the measurement target in one setup. This has an advantage that the three-dimensional shape data of the rough shape of the entire measurement target can be created.
INDUSTRIAL APPLICABILITY
The shape measuring method of the present invention has wide industrial usages for machine tools such as industrial machines controlled by numerical control devices.
EXPLANATION OF THE REFERENCE NUMERALS
1 three-dimensional measurement sensor
2 main spindle of machine tool
3 measurement-target object
4 point cloud
5 inner point
6 outer point
7 intersection
8 slice cross-section
9 table
10 closed polyhedron which is "manifold" and which includes no self-intersection
A, D, E measurement direction
B moving direction
C surface
WE CLAIM:
1. A shape measuring method characterized in that the method comprises the steps of:
scanning a measurement-target object with a three-dimensional measurement unit from a plurality of measurement directions to obtain, for each of the measurement directions, measured point cloud data which is three-dimensional coordinates of measured point cloud in the measurement-target object, and generating an implicit function representing the measurement-target object on the basis of pieces of the measured point cloud data;
dividing an entire measurement region in which the measurement-target object exists into tetrahedral small regions (hereafter, referred to as cells) by performing division processing using a three-dimensional Delaunay diagram on the basis of the implicit function, the tetrahedral small regions filling the measurement region without a gap and without overlapping each other;
classifying vertices of the cells in the three-dimensional Delaunay diagram into an inner point existing inside the measurement-target object and an outer point existing outside the measurement-target object by using the implicit function;
extracting boundary cells having four vertices including both of the inner point and the outer point, from all of the small regions in the three-dimensional Delaunay diagram;
calculating an intersection between a surface of the measurement-target object and each of edges having a combination of the inner point and the outer point located at the respective two ends, among edges of the boundary cells;
obtaining a triangular or quadrilateral face by connecting three or four of the intersections included in each of the boundary cells; and
generating closed polyhedron data which is manifold and which includes no self-intersection, by connecting all of the triangular or quadrilateral faces.