Abstract: [Problem] To provide a shape measurement apparatus capable of accurately measuring the shape of a measurement object, even when the distance between the measurement object and an image capturing apparatus fluctuates, when the unevenness shape of the measurement object is measured by a light-section method. [Solution] This shape measurement apparatus includes: a linear light position detection unit that detects a linear light position of linear light from a captured image, captured by an image capturing apparatus, of the linear light applied to a measurement object by a linear light irradiation unit; a distance computation unit that computes the distance from the image capturing apparatus to the measurement object on the basis of a distance difference between the linear light position detected by the linear light position detection unit and a reference linear light position detected by the linear light position detection unit when the measurement object is located at a position a reference distance away from the image capturing apparatus, the reference distance, and the angle formed by the optical axis of the image capturing apparatus and the emission direction of the linear light; a focus adjustment unit that adjusts the focus of the image capturing apparatus on the basis of the distance from the image capturing apparatus to the measurement object; and a shape computation unit that computes the shape of the measurement object on the basis of the captured image.
The present invention relates to a shape measurement apparatus and a shape
measurement method that measure the shape of a measurement object by a light-section
method.
Background Art
[0002]
A light-section method is a technique of capturing, by an image capturing
apparatus, an image of a line of light applied to a measurement object using a laser or
the like, and measuring the unevenness shape of the measurement object from the
degree of bend of the line of light detected from the captured image. For example,
Patent Literature 1 discloses a technique of capturing an image of laser light applied
to a measurement object using a time delay integration (TDI) camera, and measuring
the shape of the measurement object on the basis of the obtained striped image.
[0003]
The light-section method will be described in detail. As illustrated in FIG.
7, first, a linear light irradiation apparatus 10 that applies linear light, such as line
laser or slit light, irradiates a measurement object S with linear light. Then, an
image capturing apparatus 20 captures an image of linear light applied to the
measurement object S, and outputs a captured image A to an image processing
apparatus 50. For example, when a measurement surface Sa of the measurement
object S, which is an irradiation surface irradiated with linear light, is flat, straight
linear light appears in the captured image A. However, when the measurement
surface Sa has a depression, linear light 12 that includes a bent part 12b due to the
PCT/JP2016/062806
depression in a straight part 12a appears in the captured image A, as illustrated in FIG.
7. Thus, the shape of the measurement surface Sa can be measured on the basis of
the degree of bend of the linear light 12 included in the captured image A, which is
acquired by capturing an image of the measurement surface Sa of the measurement
object S irradiated with the linear light 12.
[0004]
In measuring the shape of the measurement object S from the captured
image A by such a technique, in order to accurately find the degree of bend of the
linear light 12 in the captured image and maintain the precision of shape measurement, it
is necessary to achieve focus so that the linear light 12 appears thin and clear in the
captured image A. This requires the focus of the image capturing apparatus 20 to be
accurately adjusted to be set on the measurement surface Sa of the measurement
object S. For example, in the case of measuring the shape of a side surface or a top
surface of the measurement object S moving on a conveyance line, the image
capturing apparatus 20 needs to be accurately focused on the side surface or the top
surface of the measurement object S. However, the shape of the measurement
object S is not constant because, for example, specifications of products differ in a
production line; for example, in the case where the measurement object S is a
rectangular parallelepiped, sizes such as width and height differ.
[0005]
If the width and height of the measurement object S are known before shape
measurement, the distance from the installation position of the image capturing
apparatus 20 to the measurement surface Sa of the measurement object S can be
calculated, and the focus of the image capturing apparatus 20 can be adjusted in accordance
with the distance; thus, a clear image can be acquired. For example, in the case of
measuring the shape of a side surface of the measurement object S, as illustrated in
FIG. 8, assume that a control apparatus 60 is notified of the width of the
measurement object S before the start of measurement, and a distance D from the
image capturing apparatus 20 to the measurement surface Sa is known. In addition,
a focus ring 24 of the image capturing apparatus 20 that adjusts the position of a
focus lens 22 is configured to be rotatable by a drive device, such as a motor. Thus,
the control apparatus 60 drives the motor in accordance with the distance D from the
installation position of the image capturing apparatus 20 to the measurement surface
Sa of the measurement object S to rotate the focus ring 24 of the focus lens 22,
thereby focusing the image capturing apparatus 20 on the measurement surface Sa.
Alternatively, if the depth of field of the image capturing apparatus 20 is sufficiently
deep, a clear image can be obtained without adjustment of focus in some cases.
Citation List
Patent Literature
[0006]
Patent Literature 1: JP 2004-3930A
Summary of Invention
Technical Problem
[0007]
However, the size (e.g., width) of a measurement object is not found
beforehand in some cases. Alternatively, even if the size of a measurement object is
found beforehand, in the event of slanted movement in which the measurement
object S moves in a state of being inclined with respect to a conveyance direction as
illustrated in FIG. 9, or position deviation in which the width center of the
measurement object S is deviated from the center C in the width direction of the
conveyance line as illustrated in FIG. 10, the focus of the image capturing apparatus
20 is off the measurement surface Sa of the measurement object S, resulting in a
blurred, unclear image.
[0008]
As a coping method for the focus of the image capturing apparatus 20 being
off the measurement surface Sa of the measurement object S, for example, it is
possible to install a distance sensor, and adjust focus on the basis of the distance
between the image capturing apparatus 20 and the measurement surface Sa
measured by the distance sensor. However, it is necessary to additionally install a
distance sensor, which complicates device configuration. In addition, as a method
not using a distance sensor, it is possible to calculate the contrast of luminance from
images continuously captured while moving a focus lens to and fro in the optical axis
direction of the image capturing apparatus, and adjust focus by searching for a
position with high contrast. However, this method takes time until focus is
achieved and leads to poor responsivity, and thus is difficult to apply to a
measurement object that is being conveyed.
[0009]
Hence, the present invention is made in view of the above problems, and an
object of the present invention is to provide a novel and improved shape
measurement apparatus and shape measurement method that, in measuring the
unevenness shape of a measurement object by a light-section method, enable the
shape of the measurement object to be measured precisely even when the distance
between the measurement object and an image capturing apparatus fluctuates.
Solution to Problem
[0010]
According to an aspect of the present invention in order to achieve the
above-mentioned object, there is provided a shape measurement apparatus including:
a linear light position detection unit that detects, from a captured image of linear light
applied to a measurement object by a linear light irradiation apparatus that is
captured by an image capturing apparatus, a linear light position of the linear light; a
distance computation unit that computes a distance from the image capturing
apparatus to the measurement object, on the basis of a distance difference between a
reference linear light position detected by the linear light position detection unit
when the measurement object is positioned at a position of a predetermined reference
distance from the image capturing apparatus and the linear light position detected by
the linear light position detection unit, the reference distance, and an angle formed by
an optical axis of the image capturing apparatus and an emission direction of the
linear light; a focus adjustment unit that adjusts focus of the image capturing
apparatus on the basis of the distance from the image capturing apparatus to the
measurement object; and a shape computation unit that computes a shape of the
measurement object on the basis of the captured image.
[0011]
The distance computation unit may compute the distance from the image
capturing apparatus to the measurement object on the basis of a distance function
expressed using an image capturing resolution of the image capturing apparatus.
[0012]
For example, the distance computation unit may compute a distance D from
the image capturing apparatus to the measurement object on the basis of Formula (A)
below. Alternatively, the distance computation unit may compute a distance D from
the image capturing apparatus to the measurement object on the basis of Formula (B)
below.
[0013]
[Math. 1]
D = D0 + (Xc·r0/tanθ) / (1 − Xc·r0/(tanθ·D0)) ... (A)
D = D0 + Xc·r0/tanθ ... (B)
Here, D0 is the reference distance, r0 is an image capturing resolution at the
reference distance, Xc is a distance difference between the linear light position and
the reference linear light position in units of pixels of the captured image, and θ is an
angle formed by the optical axis of the image capturing apparatus and the emission
direction of the linear light.
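As an illustrative sketch (not part of the claimed apparatus), Formulas (A) and (B) can be evaluated as follows; the function names and the numeric values used below are invented for illustration, not taken from the document.

```python
import math

def distance_formula_a(d0, r0, xc, theta_deg):
    """Formula (A): distance from the image capturing apparatus to the
    measurement object, with a correction for the change of image
    capturing resolution with distance.

    d0: reference distance D0; r0: image capturing resolution at D0
    (length per pixel); xc: pixel offset Xc of the detected linear light
    position from the reference linear light position; theta_deg: angle
    between the optical axis and the linear light emission direction.
    """
    shift = xc * r0 / math.tan(math.radians(theta_deg))
    return d0 + shift / (1.0 - shift / d0)

def distance_formula_b(d0, r0, xc, theta_deg):
    """Formula (B): first-order approximation of Formula (A)."""
    return d0 + xc * r0 / math.tan(math.radians(theta_deg))
```

For example, with θ = 45° and r0 = 0.5 length units per pixel, a 20-pixel offset moves the estimate from D0 = 1000 to about 1010; Formula (A) adds only a small further correction on top of Formula (B) when the offset is small relative to D0.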
[0014]
The linear light position detection unit may calculate a projection waveform
expressing a sum of luminance values of pixels aligned in a straight-line direction of
linear light at each position in a direction orthogonal to the straight-line direction of
the linear light in the captured image, and set a peak position of the projection
waveform as the linear light position.
[0015]
Alternatively, the linear light position detection unit may calculate a
projection waveform expressing a sum of luminance values of pixels aligned in a
straight-line direction of linear light at each position in a direction orthogonal to the
straight-line direction of the linear light in the captured image, and set a center-of-gravity
position of the projection waveform as the linear light position.
[0016]
The shape computation unit may compute the shape of the measurement
object on the basis of a maximum luminance position in a direction orthogonal to a
straight-line direction of the linear light that is calculated for each position in the
straight-line direction in the captured image.
[0017]
Alternatively, the shape computation unit may compute the shape of the
measurement object on the basis of a center-of-gravity position of luminance in a
direction orthogonal to a straight-line direction of the linear light that is calculated
for each position in the straight-line direction in the captured image.
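The two alternatives above — the maximum-luminance position and the center-of-gravity position of luminance, calculated for each position in the straight-line direction — can be sketched as follows, assuming the captured image is available as a 2-D luminance array indexed img[y][x]; the function name and array layout are assumptions for illustration.

```python
import numpy as np

def shape_profile(img, method="max"):
    """For each position y in the straight-line direction of the linear
    light (image rows), locate the light along the orthogonal direction
    (image columns), either as the maximum-luminance position or as the
    center-of-gravity position of luminance.

    img: 2-D luminance array img[y][x]. Returns one x position per row.
    """
    img = np.asarray(img, dtype=float)
    if method == "max":
        # position of the maximum luminance in each row
        return np.argmax(img, axis=1).astype(float)
    # luminance-weighted center of gravity along each row
    x = np.arange(img.shape[1], dtype=float)
    return (img * x).sum(axis=1) / img.sum(axis=1)
```

The center-of-gravity variant gives sub-pixel positions, which is why it can be preferred when the line of light spans several pixels.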
[0018]
According to another aspect of the present invention in order to achieve the
above-mentioned object, there is provided a shape measurement method including: a
linear light position detection step of detecting, from a captured image of linear light
applied to a measurement object by a linear light irradiation apparatus that is
captured by an image capturing apparatus, a linear light position of the linear light; a
distance computation step of computing a distance from the image capturing
apparatus to the measurement object, on the basis of a distance difference between a
reference linear light position detected when the measurement object is positioned at
a position of a predetermined reference distance from the image capturing apparatus
and the linear light position, the reference distance, and an angle formed by an
optical axis of the image capturing apparatus and an emission direction of the linear
light; a focus adjustment step of adjusting focus of the image capturing apparatus on
the basis of the distance from the image capturing apparatus to the measurement
object; and a shape computation step of computing a shape of the measurement
object on the basis of the captured image.
Advantageous Effects of Invention
[0019]
As described above, according to the present invention, in measuring the
unevenness shape of a measurement object by a light-section method, the shape of
the measurement object can be measured precisely even when the distance between
the measurement object and an image capturing apparatus fluctuates.
Brief Description of Drawings
[0020]
[FIG. 1] FIG. 1 is an explanatory diagram illustrating a schematic configuration of a
shape measurement system that measures the shape of a measurement object by a
light-section method.
[FIG. 2] FIG. 2 is a functional block diagram illustrating a functional configuration of
a shape measurement apparatus according to an embodiment of the present invention.
[FIG. 3] FIG. 3 is a flowchart illustrating processing performed by a shape
measurement apparatus according to the embodiment.
[FIG. 4] FIG. 4 is an explanatory diagram for explaining a method for calculating a
linear light position in a captured image in step S110.
[FIG. 5] FIG. 5 is an explanatory diagram for explaining a method for calculating a
distance between an image capturing apparatus and a measurement object in step
S120.
[FIG. 6] FIG. 6 is an explanatory diagram illustrating an example of a captured image
of a measurement object having a convex shape on a measurement surface.
[FIG. 7] FIG. 7 is an explanatory diagram for explaining the principle of a light-section
method.
[FIG. 8] FIG. 8 is an explanatory diagram illustrating an example of a coping method
for a change in width of a measurement object.
[FIG. 9] FIG. 9 is an explanatory diagram for explaining slanted movement of a
measurement object that serves as a cause of defocus.
[FIG. 10] FIG. 10 is an explanatory diagram for explaining position deviation of a
measurement object that serves as a cause of defocus.
Description of Embodiments
[0021]
Hereinafter, (a) preferred embodiment(s) of the present invention will be
described in detail with reference to the appended drawings. In this specification
and the appended drawings, structural elements that have substantially the same
function and structure are denoted with the same reference numerals, and repeated
explanation of these structural elements is omitted.
[0022]
<1. Configuration>
First, a configuration of a shape measurement apparatus according to an
embodiment of the present invention is described with reference to FIGS. 1 and 2.
FIG. 1 is an explanatory diagram illustrating a schematic configuration of a shape
measurement system that measures the shape of a measurement object S by a light-section
method. FIG. 2 is a functional block diagram illustrating a functional
configuration of a shape measurement apparatus according to the present
embodiment. Note that FIG. 1 illustrates a state of viewing the measurement object
S in a plan view, and one side surface of the measurement object S that is a
rectangular parallelepiped as illustrated in FIG. 7 serves as a measurement surface Sa.
[0023]
[1-1. Schematic configuration of shape measurement system]
A shape measurement system is a system that measures the shape of the
measurement object S by a light-section method. As illustrated in FIG. 1, the shape
measurement system includes a linear light irradiation apparatus 10 that irradiates the
measurement object S with linear light, an image capturing apparatus 20 that captures
an image of linear light applied to the measurement object S, and a shape
measurement apparatus 100 that specifies the unevenness shape of the measurement
surface Sa of the measurement object S, on the basis of a captured image captured by
the image capturing apparatus 20. The linear light irradiation apparatus 10 is an
apparatus capable of outputting linear light, such as line laser or slit light. As the
image capturing apparatus 20, an area camera can be used, for example.
[0024]
The shape measurement apparatus 100 according to the present embodiment
adjusts focus of the image capturing apparatus 20 in accordance with the distance
between the measurement object S and the image capturing apparatus 20. Thus,
even when the distance between the measurement object S and the image capturing
apparatus 20 fluctuates, the position of a focus lens 22 of the image capturing
apparatus 20 is controlled so that a clear image is acquired, which enables the shape
of the measurement object S to be measured precisely. In the present embodiment,
the image capturing apparatus 20 includes the focus lens 22 having a focus ring 24
that is rotated by a drive device such as a motor. That is, the shape measurement
apparatus 100 drives the motor in accordance with the distance from the installation
position of the image capturing apparatus 20 to the measurement surface Sa of the
measurement object S to rotate the focus ring 24 of the focus lens 22, thereby
achieving focus.
[0025]
The shape measurement apparatus 100 according to such an embodiment
performs shape measurement processing of specifying the shape of the measurement
object S on the basis of a captured image, and focus adjustment processing of
adjusting focus of the image capturing apparatus 20 on the basis of a captured image.
[0026]
In shape measurement processing, by a light-section method, an image of a
line of light applied to a measurement object is captured by an image capturing
apparatus, and the unevenness shape of the measurement object is measured from the
degree of bend of linear light detected from the captured image. As illustrated in
FIG. 1, when the linear light irradiation apparatus 10 irradiates the measurement
object S with linear light, an image of linear light applied to the measurement object
S is captured by the image capturing apparatus 20, and a captured image is output to
the shape measurement apparatus 100. The shape measurement apparatus 100
measures the shape of the measurement surface Sa on the basis of the degree of bend
of the linear light included in the captured image, which is acquired by capturing an
image of the measurement surface Sa of the measurement object S irradiated with the
linear light 12.
[0027]
In focus adjustment processing, focus is set on the measurement surface Sa
in accordance with fluctuation of the distance between the image capturing apparatus
20 and the measurement surface Sa of the measurement object S. In the present
embodiment, the distance between the image capturing apparatus 20 and the
measurement surface Sa of the measurement object S is acquired on the basis of a
captured image acquired by the image capturing apparatus 20, and focus of the image
capturing apparatus 20 is adjusted by the shape measurement apparatus 100.
Executing the focus adjustment processing in parallel with the shape measurement
processing or executing them alternately enables the shape of the measurement
object to be measured precisely even when the distance between the measurement
object and the image capturing apparatus fluctuates.
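The parallel-or-alternate execution described above can be sketched as a simple frame loop; `grab_image`, `adjust_focus`, and `compute_shape` are hypothetical stand-ins for the image capturing apparatus and the processing units of the shape measurement apparatus 100, not names from the document.

```python
def measurement_loop(grab_image, adjust_focus, compute_shape, n_frames):
    """Alternate the two processing paths frame by frame: each captured
    image is used both to re-adjust focus and to compute the shape."""
    shapes = []
    for _ in range(n_frames):
        image = grab_image()
        adjust_focus(image)                  # focus adjustment processing
        shapes.append(compute_shape(image))  # shape measurement processing
    return shapes
```

In a real system the two paths could equally run in parallel threads; the alternating loop is shown only because it is the simplest form of the interleaving described above.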
[0028]
[1-2. Configuration of shape measurement apparatus]
The shape measurement apparatus 100 will be described in detail. As
illustrated in FIG. 2, the shape measurement apparatus 100 includes an image
acquisition unit 110, a linear light position detection unit 120, a distance computation
unit 130, a focus adjustment unit 140, a shape computation unit 150, and a result
output unit 160. Of these, the linear light position detection unit 120, the distance
computation unit 130, and the focus adjustment unit 140 are functional units that
execute focus adjustment processing of adjusting the focus of the image capturing
apparatus 20. The shape computation unit 150 and the result output unit 160 are
functional units that execute shape specifying processing of specifying the shape of
the measurement object S.
[0029]
The image acquisition unit 110 is an interface unit that acquires a captured
image captured by the image capturing apparatus 20. The image captured by the
image capturing apparatus 20 is sequentially input to the image acquisition unit 110.
The image acquisition unit 110 outputs the input captured image to the linear light
position detection unit 120 and the shape computation unit 150.
[0030]
The linear light position detection unit 120 detects a linear light position of
linear light in the captured image by arithmetic processing. For example, in the
captured image, the straight-line direction of linear light is set as a vertical direction,
and a direction orthogonal to the straight-line direction of linear light is set as a
horizontal direction, and the linear light position detection unit 120 first takes the
sum of luminance values of pixels aligned in the vertical direction at each position in
the horizontal direction of the captured image, and acquires a projection in the
vertical direction (hereinafter also referred to as a "projection waveform"). Then,
the linear light position detection unit 120 specifies the linear light position in the
captured image on the basis of the projection waveform. The linear light position in
the captured image may be a peak position or a center-of-gravity position of the
projection waveform, for example. The linear light position detection unit 120
outputs the calculated linear light position in the captured image to the distance
computation unit 130.
[0031]
The distance computation unit 130 calculates the distance between the
image capturing apparatus 20 and the measurement object S, on the basis of the
linear light position in the captured image calculated by the linear light position
detection unit 120. The distance computation unit 130 geometrically calculates the
distance between the image capturing apparatus 20 and the measurement object S on
the basis of the linear light position in the captured image and installation positions
of the linear light irradiation apparatus 10 and the image capturing apparatus 20 with
respect to a reference plane that is away from the image capturing apparatus 20 by a
reference distance decided in advance. Note that details of calculation processing
of the distance between the image capturing apparatus 20 and the measurement
object S by the distance computation unit 130 are described later. The distance
computation unit 130 outputs the calculated distance between the image capturing
apparatus 20 and the measurement object S to the focus adjustment unit 140.
[0032]
The focus adjustment unit 140 adjusts the focus position of the focus lens 22
of the image capturing apparatus 20 on the basis of the distance between the image
capturing apparatus 20 and the measurement object S calculated by the distance
computation unit 130. As illustrated in FIG. 1, the focus lens 22 according to the
present embodiment is a motor drive lens including a motor 26 that rotates the focus
ring 24. The focus adjustment unit 140 outputs, to the motor 26, a command to
move the focus lens 22 so that focus is set on the measurement surface Sa, on the
basis of the distance between the image capturing apparatus 20 and the measurement
object S. The motor 26 is a stepping motor, for example. The focus adjustment
unit 140 adjusts focus by, for example, causing the motor 26 to rotate the focus ring
24 so that the lens is positioned at a distance position where focus is achieved, which
is away from the measurement surface Sa of the measurement object S by a
predetermined distance. The focus adjustment unit 140 may keep, in advance, a
correspondence relationship between the distance from the image capturing
apparatus 20 to the measurement surface Sa and a rotation angle of the focus ring 24
at which focus is achieved. For example, this correspondence relationship may be
obtained by a technique such as setting a plurality of distances from the image
capturing apparatus 20, capturing an image of a sample at each distance, and
acquiring, in advance, the rotation angle of the focus ring 24 at which focus is set on
the sample at each distance.
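The correspondence relationship described above can be held as a small calibration table with interpolation between the sampled distances. The sketch below assumes such a table; the class name, method names, and table values are illustrative assumptions, not from the document.

```python
import bisect

class FocusRingCalibration:
    """Maps a camera-to-surface distance to a focus ring rotation angle,
    using a calibration table acquired in advance (one in-focus ring
    angle recorded per sample distance), with linear interpolation
    between calibration points."""

    def __init__(self, distances, angles):
        pairs = sorted(zip(distances, angles))
        self.d = [p[0] for p in pairs]
        self.a = [p[1] for p in pairs]

    def angle_for(self, distance):
        # clamp distances outside the calibrated range
        if distance <= self.d[0]:
            return self.a[0]
        if distance >= self.d[-1]:
            return self.a[-1]
        # linear interpolation between the two bracketing samples
        i = bisect.bisect_left(self.d, distance)
        f = (distance - self.d[i - 1]) / (self.d[i] - self.d[i - 1])
        return self.a[i - 1] + f * (self.a[i] - self.a[i - 1])
```

The interpolated angle would then be converted into stepping-motor pulses for the motor 26; that conversion is hardware-specific and is not shown here.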
[0033]
The shape computation unit 150 calculates the unevenness shape of the
measurement surface Sa of the measurement object S on the basis of the degree of
bend of the linear light in the captured image. The shape computation unit 150
specifies a position in the horizontal direction that exhibits the maximum luminance
at each position in the vertical direction of the captured image, and calculates the
unevenness shape of the measurement surface Sa of the measurement object S.
Note that details of calculation processing of the shape of the measurement object S
by the shape computation unit 150 are described later. The shape computation unit
150 outputs the calculated shape of the measurement object S to the result output unit
160.
[0034]
The result output unit 160 outputs the shape of the measurement surface Sa
of the measurement object S calculated by the shape computation unit 150 to a
display apparatus 30 and a storage unit 40. The display apparatus 30 may be a
display provided for the shape measurement apparatus 100, or may be a display
capable of outputting also display information from a device other than the shape
measurement apparatus 100. Displaying the calculated shape of the measurement
surface Sa of the measurement object S on the display apparatus 30 enables an
operator to be notified of the shape of the measurement surface Sa of the
measurement object S. In addition, storing the shape of the measurement surface Sa
of the measurement object S in the storage unit 40 makes it possible to specify a
position having an unevenness shape on the measurement surface Sa of the
measurement object S, for example.
[0035]
The functional configuration of the shape measurement apparatus 100
according to the present embodiment has been described.
[0036]
<2. Processing by shape measurement apparatus>
Next, processing performed by the shape measurement apparatus 100
according to the present embodiment is described on the basis of FIGS. 3 to 6. The
shape measurement apparatus 100 according to the present embodiment performs
shape measurement processing of specifying the shape of the measurement surface
Sa of the measurement object S on the basis of a captured image, and focus
adjustment processing of adjusting focus of the image capturing apparatus 20 on the
basis of a captured image; thus, the shape of the measurement surface Sa of the
measurement object can be measured precisely even when the distance between the
measurement object and the image capturing apparatus fluctuates.
[0037]
First, an image of the measurement surface Sa of the measurement object S
irradiated with linear light is captured by the image capturing apparatus 20, and the
captured image captured by the image capturing apparatus 20 is output to the shape
measurement apparatus 100 at a predetermined timing. As illustrated in FIG. 3,
when the image acquisition unit 110 acquires the captured image captured by the
image capturing apparatus 20 (S100), the shape measurement apparatus 100 starts
focus adjustment processing (S110 to S130) and shape measurement processing
(S140, S150). The focus adjustment processing and the shape measurement
processing may be executed in parallel or may be executed alternately. The
processing will be described in detail.
[0038]
[2-1. Focus adjustment processing]
In the focus adjustment processing, first, the linear light position detection
unit 120 calculates the linear light position of linear light in the captured image
(S110). A method for calculating the linear light position in the captured image will
be described on the basis of FIG. 4. The captured image A illustrated on the upper
side of FIG. 4 is an example of an image of the measurement surface Sa of the
measurement object S captured by the image capturing apparatus 20 in the shape
measurement system with the configuration illustrated in FIG. 1. In the captured
image A, the conveyance direction of the measurement object S is set as an X
direction, and the straight-line direction of the linear light 12 orthogonal to the X
direction is set as a Y direction. The captured image A is an image I(x, y) composed
of N×M pixels (0 ≤ x ≤ N−1, 0 ≤ y ≤ M−1). Here, x is the X-direction position of
each pixel, and y is the Y-direction position of each pixel.
[0039]
The linear light position detection unit 120 takes the sum (cumulative
luminance value) of luminance values of pixels aligned in the straight-line direction
of the linear light 12 (the vertical direction, the Y direction) at each position in the
horizontal direction (the X direction) of the captured image A of FIG. 4, on the basis
of Formula (1) below, to acquire a waveform expressing the cumulative luminance
value at each position in the horizontal direction as illustrated on the lower side of
FIG. 4. This waveform is referred to as a projection waveform. Since the linear
light 12 extends in the vertical direction, the position of the linear light 12 appears as
a peak in the projection waveform. The linear light position detection unit 120
specifies the linear light position in the captured image A on the basis of such a
projection waveform.
[0040]
More specifically, the linear light position appears in the captured image A
with a luminance value different from that of a portion not irradiated with the linear
light 12. Consequently, also in the projection waveform, the cumulative luminance
value at a position irradiated with the linear light 12 is significantly higher than the
cumulative luminance value at another position. Hence, the linear light position
detection unit 120 detects a position with a significantly high cumulative luminance
value in the projection waveform as a linear light position. The linear light position
may be a peak position of the projection waveform as expressed by Formula (2)
below, or may be a center-of-gravity position of the projection waveform as
expressed by Formula (3) below, for example. Note that even if the captured image
A from which the projection waveform is calculated is not focused on the
measurement object S, thus being unclear, the linear light position detection unit 120
can specify the linear light position as long as a peak appears in the projection
waveform.
[0041]
[Math. 2]
Proj(x) = Σ{y=0 to M−1} I(x, y) ... (1)
Peak position = argmax_x Proj(x) ... (2)
Center-of-gravity position = Σ{x=0 to N−1} x·Proj(x) / Σ{x=0 to N−1} Proj(x) ... (3)
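Formulas (1) to (3) can be sketched with NumPy as follows, assuming the captured image is stored as a 2-D array img[y][x] of luminance values; the function names are invented for illustration.

```python
import numpy as np

def projection_waveform(img):
    """Formula (1): Proj(x) = sum over y of I(x, y), for an image of
    N x M pixels stored as img[y][x] (M rows, N columns)."""
    return np.asarray(img, dtype=float).sum(axis=0)

def peak_position(proj):
    """Formula (2): the x maximizing Proj(x)."""
    return int(np.argmax(proj))

def center_of_gravity_position(proj):
    """Formula (3): sum of x * Proj(x) divided by sum of Proj(x)."""
    x = np.arange(len(proj), dtype=float)
    return float((x * proj).sum() / proj.sum())
```

Because the vertical sum accumulates the line's luminance over all rows, the peak of Proj(x) remains detectable even in a somewhat defocused image, which matches the note above that the linear light position can be specified from an unclear image.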
[0042]
When the linear light position is specified by the linear light position
detection unit 120, next, the distance computation unit 130 calculates the distance
between the image capturing apparatus 20 and the measurement object S at the time
of acquiring the captured image, on the basis of the linear light position (S120). A
method for calculating the distance between the image capturing apparatus 20 and
the measurement object S will be described on the basis of FIG. 5.
[0043]
FIG. 5 is a schematic diagram illustrating, in regard to the linear light
irradiation apparatus 10 and the image capturing apparatus 20, a positional
relationship between the measurement surface 5a of the measurement object 5 and a
reference plane B that is away from the image capturing apparatus 20 by a reference
distance D0 in the optical axis direction of the image capturing apparatus 20. The
reference distance D0 is a fixed value set in advance for calculating a distance D from
the image capturing apparatus 20 to the measurement surface 5a. For example, in
the case where one side surface of the measurement object 5 serves as the
measurement surface 5a as illustrated in FIG. 1, the distance between the image
capturing apparatus 20 and a planned position where the measurement surface 5a
originally is to be placed may be set as the reference distance D0. Note that the
planned position where the measurement surface 5a originally is to be placed is, for
example, a position such that the width center of the measurement object 5 coincides
with the center C in the width direction of the conveyance line. In addition, for
example, in the case where the top surface of the measurement object 5 serves as the
measurement surface 5a, the distance between the image capturing apparatus 20 and
a planned position where the top surface originally is to be placed may be set as the
reference distance D0, as in the case where one side surface serves as the
measurement surface 5a.
[0044]
As illustrated in FIG. S, the reference plane B positioned away from the
20 image capturing apparatus 20 by the reference distance Do orthogonally intersects the
optical axis of the image capturing apparatus 20 at its center. In the shape
measurement system, the image capturing apparatus 20 is placed to be able to be
focused on this reference plane B. In addition, the linear light irradiation apparatus
10 emits the linear light 12 from a direction inclined by an angle 8 fi·om the optical
25 axis of the image capturing apparatus 20. On this occasion, the linear light
irradiation apparatus 10 is placed in a manner that the linear light 12 intersects the
optical axis of the image capturing apparatus 20 at the reference plane B. In this
manner, the shape measurement system is configured in a manner that a clear image
of the linear light 12 can be captured when the measurement surface Sa of the
30 measurement objectS is at the reference plane B.
[0045]
Here, assume that the measurement surface 5a of the measurement object 5
is deviated from the position of the reference plane B in a direction going away from
the image capturing apparatus 20. On this occasion, since focus is not set on the
measurement surface 5a, the captured image A of the image capturing apparatus 20 is
an unclear image. Hence, to move the focus lens 22 of the image capturing
apparatus 20 to a position where focus is set on the measurement surface 5a, the
distance computation unit 130 calculates the distance D from the image capturing
apparatus 20 to the measurement surface 5a.
[0046]
The distance D from the image capturing apparatus 20 to the measurement
surface 5a is expressed by Formula (4) below. In Formula (4), d is the distance
[mm] between the reference plane B and the measurement surface 5a, and is
expressed by Formulas (5) and (6) below. In Formula (5), X0 is a linear light
position on the reference plane B (hereinafter also referred to as a "reference linear
light position"), and X is an irradiation position of the linear light 12 that appears in
the captured image A. For example, when the measurement object 5 is farther from
the image capturing apparatus 20 than the reference plane B is, as illustrated in FIG. 5,
the distance D from the image capturing apparatus 20 to the measurement surface 5a
is larger than the reference distance D0. On this occasion, in the captured image A,
the linear light position X appears on the right side of the drawing (the side opposite to
the linear light irradiation apparatus 10) with respect to the reference linear light
position X0. When the measurement object 5 is closer to the image capturing
apparatus 20 than the reference plane B is, the distance D from the image capturing
apparatus 20 to the measurement surface 5a is smaller than the reference distance D0.
On this occasion, in the captured image A, the linear light position X appears on the
left side of the drawing (the linear light irradiation apparatus 10 side) with respect to
the reference linear light position X0. Thus, in accordance with the distance D from
the image capturing apparatus 20 to the measurement surface 5a, a deviation (a
distance difference Xe [pixel]) occurs between the reference linear light position X0
and the linear light position X detected in step S110. The distance difference in real
space corresponding to this Xe is Xe·r, where r [mm/pixel] is the shooting resolution
at the distance D, and d is expressed by Formula (5) on the basis of this geometric
relationship. In addition, the shooting resolution r [mm/pixel] at the distance D is
expressed by Formula (6), where W [mm] is the width of the field-of-view of the
image capturing apparatus 20 at the distance D.
[0047]
[Math. 3]
D = D0 + d   ... (4)
d = (X − X0)·r / tanθ = Xe·r / tanθ   ... (5)
r = W / N   ... (6)
[0048]
On the other hand, on the basis of a proportional relationship, the relation of
Formula (7) below holds, where W0 [mm] is the width of the field-of-view of the
image capturing apparatus 20 at the reference plane B (the reference distance D0
[mm]). In addition, the image capturing resolution r0 at the reference plane B is
W0/N; hence, r0 and r satisfy the relation of Formula (8) below.
[0049]
[Math. 4]
W = (D / D0)·W0   ... (7)
r = (D / D0)·r0   ... (8)
[0050]
Hence, the distance D is expressed by Formula (9) below on the basis of
Formulas (4), (5), and (7).
[0051]
[Math. 5]
D = D0 + (Xe·r0 / tanθ) / (1 − Xe·r0 / (tanθ·D0))   ... (9)
[0052]
Here, the image capturing resolution r0 at the reference distance D0 is
obtained on the basis of Formula (6); in the case where the reference distance D0 is
sufficiently larger than Xe·r0/tanθ derived from the distance difference Xe, the
denominator of the second term of Formula (9) can be regarded as 1. Consequently,
the distance D can be calculated using Formula (10) below, obtained by simplifying
Formula (9). That is, the distance D can be expressed by the sum of the reference
distance D0 and the distance difference Xe·r0/tanθ. In the present embodiment,
Formula (9) or Formula (10) is defined as a distance function.
[0053]
[Math. 6]
D = D0 + Xe·r0 / tanθ   ... (10)
[0054]
The distance computation unit 130 calculates the distance D from the image
capturing apparatus 20 to the measurement surface 5a on the basis of Formula (10),
which is a distance function, for example. Then, the distance computation unit 130
outputs the calculated distance D to the focus adjustment unit 140.
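As an illustrative sketch only (not part of the specification), the distance functions of Formulas (9) and (10) can be implemented directly; the numeric values below are hypothetical.

```python
import math

def distance_exact(x_e, d0, r0, theta_deg):
    # Formula (9): exact distance from the distance difference Xe [pixel]
    k = x_e * r0 / math.tan(math.radians(theta_deg))
    return d0 + k / (1.0 - k / d0)

def distance_approx(x_e, d0, r0, theta_deg):
    # Formula (10): simplification valid when D0 >> Xe*r0/tan(theta)
    return d0 + x_e * r0 / math.tan(math.radians(theta_deg))

# Hypothetical setup: D0 = 1000 mm, r0 = 0.5 mm/pixel, theta = 45 deg, Xe = 20 pixels
d_approx = distance_approx(20, 1000.0, 0.5, 45.0)
d_exact = distance_exact(20, 1000.0, 0.5, 45.0)
print(round(d_approx, 3))   # -> 1010.0
print(d_exact > d_approx)   # the exact form is slightly larger here -> True
```

This also illustrates why the simplification holds: with D0 = 1000 mm the correction term Xe·r0/tanθ = 10 mm, so the denominator of Formula (9) is 0.99, very close to 1.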
[0055]
After that, the focus adjustment unit 140 adjusts the position of the focus
lens 22 of the image capturing apparatus 20 on the basis of the distance D from the
image capturing apparatus 20 to the measurement surface 5a calculated in step S120
(S130). In the example illustrated in FIG. 5, when the measurement surface 5a of
the measurement object 5 is deviated from the reference plane B, the linear light
position X that appears in the captured image A is deviated from the reference linear
light position X0 by Xe in the X direction. As described above, in the case where
the measurement surface 5a of the measurement object 5 is deviated from the
reference plane B in a direction going away from the image capturing apparatus 20,
the linear light position X is deviated to the right side of the drawing (i.e., the side
opposite to the linear light irradiation apparatus 10) with respect to the reference
linear light position X0, as illustrated on the lower side of FIG. 5. In the case where
the measurement surface 5a of the measurement object 5 is deviated from the
reference plane B in a direction approaching the image capturing apparatus 20, the
linear light position X is deviated to the left side of the drawing (the linear light
irradiation apparatus 10 side) with respect to the reference linear light position X0.
When the linear light position X is thus deviated from the reference linear light
position X0, the focus lens 22 of the image capturing apparatus 20 is not focused on
the measurement surface 5a. A captured image A acquired in a state where focus of
the focus lens 22 of the image capturing apparatus 20 is not achieved is unclear, and
when the shape measurement processing described later is executed on the basis of
the unclear captured image A, the linear light appears thick in the shot image, which
leads to a decrease in the shape measurement precision of the measurement object 5.
[0056]
Hence, the shape measurement apparatus 100 according to the present
embodiment adjusts the focus position of the focus lens 22 of the image capturing
apparatus 20 by the focus adjustment unit 140, on the basis of the distance D between
the image capturing apparatus 20 and the measurement object 5. For example, in
the case where the focus lens 22 according to the present embodiment is a
motor-driven lens including the motor 26 that rotates the focus ring 24, the focus
adjustment unit 140 outputs, to the motor 26, a command to move the focus lens 22
to a predetermined distance position, on the basis of the distance D between the
image capturing apparatus 20 and the measurement object 5. The predetermined
distance position is a position such that focus is set on the measurement surface 5a of
the measurement object 5 when the captured image A is acquired. This enables the
image capturing apparatus 20 to acquire a clear captured image A. The focus
adjustment unit 140 adjusts focus by causing the motor 26 to rotate the focus ring 24,
on the basis of a correspondence relationship, acquired in advance, between the
distance from the image capturing apparatus 20 to the measurement surface 5a and
the rotation angle of the focus ring 24 at which focus is achieved.
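The distance-to-rotation-angle correspondence acquired in advance could, for example, be realized as linear interpolation over a calibration table; the table values and function name below are hypothetical, not part of the specification.

```python
import bisect

# Hypothetical calibration table acquired in advance:
# (distance to measurement surface [mm], focus ring angle [deg] at which focus is achieved)
CALIBRATION = [(900.0, 10.0), (1000.0, 25.0), (1100.0, 38.0), (1200.0, 49.0)]

def focus_ring_angle(distance_mm):
    """Linearly interpolate the focus ring angle for a measured distance D."""
    dists = [d for d, _ in CALIBRATION]
    i = bisect.bisect_left(dists, distance_mm)
    if i == 0:
        return CALIBRATION[0][1]       # clamp below the table
    if i == len(CALIBRATION):
        return CALIBRATION[-1][1]      # clamp above the table
    (d0, a0), (d1, a1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (distance_mm - d0) / (d1 - d0)
    return a0 + t * (a1 - a0)

print(focus_ring_angle(1000.0))  # -> 25.0
print(focus_ring_angle(1050.0))  # midway between 25.0 and 38.0 -> 31.5
```

The angle returned here would then be sent as the command to the motor 26, closing the loop from measured distance D to an in-focus lens position.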
[0057]
In this manner, the shape measurement apparatus 100 repeatedly performs
the processing of steps S110 to S130 each time a captured image A is acquired from
the image capturing apparatus 20, thereby keeping a state where focus is set on the
measurement surface 5a so that a clear image can be acquired.
[0058]
[2-2. Shape measurement processing]
The shape measurement apparatus 100 executes shape measurement
processing (S140, S150) as well as the focus adjustment processing (S110 to S130).
[0059]
First, the shape computation unit 150 calculates the unevenness shape of the
measurement surface 5a of the measurement object 5, on the basis of the degree of
bend of the linear light in the captured image (S140). Here, FIG. 6 illustrates an
example of the captured image A of the measurement object 5 having a convex shape
on the measurement surface 5a. In the case where the measurement surface 5a is a
flat surface without unevenness, straight linear light appears in the captured image A,
whereas when there is a convex shape on the measurement surface 5a, linear light 12
including a straight part 12a and a bent part 12b caused by the convex shape appears
in the captured image A, as illustrated in FIG. 6.
[0060]
Here, the captured image A composed of N×M pixels captured at a time t is
an image I(x,y|t) (0 ≤ x ≤ N−1, 0 ≤ y ≤ M−1). The shape computation unit 150
specifies the position in the horizontal direction (X direction) that exhibits the
maximum luminance at each position in the vertical direction (Y direction) of the
captured image A. That is, the shape computation unit 150 calculates an X
coordinate Xmax(y|t) that gives the maximum luminance at each position in the
vertical direction (Y direction) of the captured image A, on the basis of Formula (11)
below.
[0061]
[Math. 7]
Xmax(y|t) = argmax_x I(x,y|t)   ... (11)
[0062]
A value (hereinafter also referred to as a "shape value") Z indicating the
unevenness shape of the measurement object 5 measured at this time is acquired as a
discrete value as in Formula (12) below, where the reference distance D0 serves as
the origin point of the shape.
[0063]
[Math. 8]
Z(y|t) = (Xmax(y|t) − X0)·r / tanθ   ... (12)
[0064]
Note that the angle θ between the optical axis of the image capturing
apparatus 20 and the emission direction of the linear light 12 of the linear light
irradiation apparatus 10 is set to a value of 30° to 45°, and is normally set to 45°.
The shape computation unit 150 finds the shape value Z for each of images
continuously captured in a temporal direction, on the basis of Formula (12), thereby
calculating the shape of the entire measurement surface 5a of the measurement
object 5.
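As an illustrative sketch only (not part of the specification), Formulas (11) and (12) amount to a per-row maximum search followed by a pixel-to-millimeter conversion; the image contents and the values of X0, r, and θ below are hypothetical.

```python
import math
import numpy as np

def shape_values(img, x0, r, theta_deg):
    # Formula (11): X coordinate of maximum luminance at each vertical position y
    x_max = img.argmax(axis=1)
    # Formula (12): convert the pixel deviation from X0 into a height value [mm]
    return (x_max - x0) * r / math.tan(math.radians(theta_deg))

# Synthetic 4x8 image: the line sits at x = 3, except a bump shifts row 2 to x = 5
img = np.zeros((4, 8))
img[:, 3] = 255.0
img[2, 3] = 0.0
img[2, 5] = 255.0
z = shape_values(img, x0=3, r=0.5, theta_deg=45.0)
print(np.round(z, 3))  # rows without the bump give 0.0 mm; row 2 gives 1.0 mm
```

Applying `shape_values` to each frame of a temporally continuous sequence yields the shape over the entire measurement surface, as the paragraph above describes.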
[0065]
In addition, the shape computation unit 150 can calculate the discrete shape
of the measurement surface 5a of the measurement object 5 on the basis of the shape
value Z expressed by Formula (13) below, where Δt [sec] is an image capturing
interval and v [mm/sec] is the movement speed of the measurement object 5. Note
that u is a discrete value (u = 0, 1, 2, ...). The movement direction of the
measurement object 5 is set as a u direction (the same direction as the X direction),
and a direction orthogonal to this is set as a v direction (the same direction as the Y
direction).
[0066]
[Math. 9]
Z(u,v) = (Xmax(v|u·Δt) − X0)·r / tanθ   ... (13)
[0067]
Furthermore, in the present embodiment, the shape computation unit 150
acquires the discrete shape, in units of pixels of a captured image, of the measurement
surface 5a of the measurement object 5, on the basis of the X coordinate that gives
the maximum luminance at each position in the vertical direction (Y direction) of the
captured image A, which is obtained using Formula (11); however, the present
invention is not limited to this example. For example, instead of the X coordinate
Xmax(y|t) that gives the maximum luminance at each position in the vertical direction
(Y direction) of the captured image A, a center-of-gravity position Xg(y|t) expressed
by Formula (14) below may be used. Using the center-of-gravity position Xg(y|t)
makes it possible to obtain a continuous value of the shape in the Y direction (v
direction), which is not limited by the pixel resolution of a captured image.
[0068]
[Math. 10]
Xg(y|t) = Σ_{x=0}^{N-1} I(x,y)·x / Σ_{x=0}^{N-1} I(x,y)   ... (14)
[0069]
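As an illustrative sketch only (not part of the specification), the per-row center-of-gravity of Formula (14) yields a subpixel line position, in contrast to the integer result of the argmax in Formula (11); the image below is a hypothetical single-row example.

```python
import numpy as np

def centroid_positions(img):
    # Formula (14): luminance-weighted X position for each row y
    x = np.arange(img.shape[1])
    return (img * x).sum(axis=1) / img.sum(axis=1)

# Synthetic row: luminance split between x = 3 and x = 4 gives a subpixel position
img = np.zeros((1, 8))
img[0, 3] = 100.0
img[0, 4] = 300.0
print(centroid_positions(img))  # -> [3.75]
```

Feeding these fractional positions into Formula (12) in place of Xmax(y|t) gives shape values that are not quantized to the pixel grid.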
In this manner, the shape computation unit 150 calculates the shape value Z,
which is a variable indicating the shape of the measurement surface 5a of the
measurement object 5. The shape computation unit 150 outputs the calculated
shape value Z to the result output unit 160.
[0070]
When the shape value Z indicating the shape of the measurement object 5 is
received from the shape computation unit 150, the result output unit 160 outputs this
calculation result to the display apparatus 30 or the storage unit 40 (S150). The
display apparatus 30 displays the shape of the measurement object 5 on the basis of
the shape value Z to notify an operator of the shape of the measurement object 5. In
addition, the shape of the measurement object 5 stored in the storage unit 40 can be
used as, for example, information for specifying a position having an unevenness
shape on the measurement surface 5a of the measurement object 5.
[0071]
The shape measurement apparatus 100 repeatedly performs the processing of
steps S140 and S150 each time a captured image A is acquired from the image
capturing apparatus 20, to specify the shape of the measurement surface 5a of the
measurement object 5. The captured image A used in the shape measurement
processing is an image acquired through the focus adjustment processing described
above. By calculating the shape of the measurement object 5 using a clear captured
image, the shape measurement apparatus 100 can specify the shape of the
measurement object 5 with higher precision.
[0072]
As described above, when the image acquisition unit 110 acquires the
captured image captured by the image capturing apparatus 20 (S100), the shape
measurement apparatus 100 may execute the focus adjustment processing (S110 to
S130) and the shape measurement processing (S140, S150) in parallel or alternately.
For example, in the case of executing them alternately, focus is adjusted by the focus
adjustment processing (S110 to S130), and next, the shape measurement processing
(S140, S150) is executed on the same captured image as the shot image used for the
focus adjustment processing.
[0073]
Description has been given of the focus adjustment processing of the image
capturing apparatus 20 and the shape measurement processing performed by the
shape measurement apparatus 100 according to the present embodiment. According
to the present embodiment, the distance between the image capturing apparatus 20
and the measurement object 5 is calculated from the linear light position of the linear
light 12 that appears in the captured image A, without additional installation of a
distance sensor, and the focus lens 22 is moved so that focus is set on a measurement
surface at the position of the calculated distance. Thus, focus can be adjusted on the
basis of a captured image acquired by the image capturing apparatus 20, without
performing repeated processing such as sweeping the position of the focus lens 22 in
the optical axis direction, and a clear captured image can be obtained by the image
capturing apparatus 20 without time delay. As a result, even when the distance
between the image capturing apparatus 20 and the measurement object 5 changes, the
position of the focus lens 22 can be adjusted in accordance with the change, which
makes it possible to prevent the linear light in a captured image from being blurred
and unclear, and to maintain high precision of shape measurement.
[0074]
The preferred embodiment(s) of the present invention has/have been
described above with reference to the accompanying drawings, whilst the present
invention is not limited to the above examples. A person skilled in the art may find
various alterations and modifications within the scope of the appended claims, and it
should be understood that they will naturally come under the technical scope of the
present invention.
Reference Signs List
[0075]
5 measurement object
5a measurement surface
10 linear light irradiation apparatus
12 linear light
12a straight part
12b bent part
20 image capturing apparatus
22 focus lens
24 focus ring
26 motor
30 display apparatus
40 storage unit
100 shape measurement apparatus
110 image acquisition unit
120 linear light position detection unit
130 distance computation unit
140 focus adjustment unit
150 shape computation unit
160 result output unit
A captured image
B reference plane
PCT/JP2016/062806
English translation of Amendments under PCT Art.19
CLAIMS

Claim 1
A shape measurement apparatus comprising:
a linear light position detection unit that detects, from a captured image of
linear light applied to a measurement object by a linear light irradiation apparatus
that is captured by an image capturing apparatus, a linear light position of the linear
light;
a distance computation unit that computes a distance from the image
capturing apparatus to the measurement object, on the basis of a distance difference
between a reference linear light position detected by the linear light position
detection unit when the measurement object is positioned at a position of a
predetermined reference distance from the image capturing apparatus and the linear
light position detected by the linear light position detection unit, the reference
distance, and an angle formed by an optical axis of the image capturing apparatus
and an emission direction of the linear light;
a focus adjustment unit that adjusts focus of the image capturing apparatus
on the basis of the distance from the image capturing apparatus to the measurement
object; and
a shape computation unit that computes a shape of the measurement object
on the basis of the captured image.
Claim 2
The shape measurement apparatus according to claim 1,
wherein the distance computation unit computes the distance from the
image capturing apparatus to the measurement object on the basis of a distance
function expressed using an image capturing resolution of the image capturing
apparatus.
Claim 3
The shape measurement apparatus according to claim 2,
wherein the distance computation unit computes a distance D from the
image capturing apparatus to the measurement object on the basis of Formula (A)
below,
[Math. 11]
D = D0 + (Xe·r0 / tanθ) / (1 − Xe·r0 / (tanθ·D0))   ... (A),
where D0 is the reference distance, r0 is an image capturing resolution at the
reference distance, Xe is a distance difference between the linear light position and
the reference linear light position in units of pixels of the captured image, and θ is an
angle formed by the optical axis of the image capturing apparatus and the emission
direction of the linear light.
Claim 4 (Amended)
The shape measurement apparatus according to claim 2,
wherein the distance computation unit computes a distance D from the
image capturing apparatus to the measurement object on the basis of Formula (B)
below,
[Math. 12]
D = D0 + Xe·r0 / tanθ   ... (B),
where D0 is the reference distance, r0 is an image capturing resolution at the
reference distance, Xe is a distance difference between the linear light position and
the reference linear light position in units of pixels of the captured image, and θ is an
angle formed by the optical axis of the image capturing apparatus and the emission
direction of the linear light.
Claim 5
The shape measurement apparatus according to any one of claims 1 to 4,
wherein the linear light position detection unit
calculates a projection waveform expressing a sum of luminance values of
pixels aligned in a straight-line direction of linear light at each position in a direction
orthogonal to the straight-line direction of the linear light in the captured image, and
sets a peak position of the projection waveform as the linear light position.
Claim 6
The shape measurement apparatus according to any one of claims 1 to 4,
wherein the linear light position detection unit
calculates a projection waveform expressing a sum of luminance values of
pixels aligned in a straight-line direction of linear light at each position in a direction
orthogonal to the straight-line direction of the linear light in the captured image, and
sets a center-of-gravity position of the projection waveform as the linear
light position.
Claim 7
The shape measurement apparatus according to any one of claims 1 to 6,
wherein the shape computation unit computes the shape of the measurement
object on the basis of a maximum luminance position in a direction orthogonal to a
straight-line direction of the linear light that is calculated for each position in the
straight-line direction in the captured image.
Claim 8
The shape measurement apparatus according to any one of claims 1 to 6,
wherein the shape computation unit computes the shape of the measurement
object on the basis of a center-of-gravity position of luminance in a direction
orthogonal to a straight-line direction of the linear light that is calculated for each
position in the straight-line direction in the captured image.
Claim 9
A shape measurement method comprising:
a linear light position detection step of detecting, from a captured image of
linear light applied to a measurement object by a linear light irradiation apparatus
that is captured by an image capturing apparatus, a linear light position of the linear
light;
a distance computation step of computing a distance from the image
capturing apparatus to the measurement object, on the basis of a distance difference
between a reference linear light position detected when the measurement object is
positioned at a position of a predetermined reference distance from the image
capturing apparatus and the linear light position, the reference distance, and an angle
formed by an optical axis of the image capturing apparatus and an emission direction
of the linear light;
a focus adjustment step of adjusting focus of the image capturing apparatus
on the basis of the distance from the image capturing apparatus to the measurement
object; and
a shape computation step of computing a shape of the measurement object
on the basis of the captured image.