
Inspection System And Inspection Method

Abstract: To provide a storage unit that stores an attention part mask indicating which part of an inspection target object to pay attention to, a calculation unit that calculates a distance between an image of an attention part mask of the object in an image of an object photographed by a photographing unit at a first time point and an image of the attention part mask in an image of the object photographed by a photographing unit at a second time point that is after the first time point, and a distance between the image and an image of the attention part mask in a target image, which is an image when the object is in a normal state, and a determination unit that compares two distances calculated by the calculation unit, and determines whether or not an object is in a normal state.


Patent Information

Filing Date
31 August 2021
Publication Number
13/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE

Applicants

Hitachi Systems, Ltd.
2-1 Osaki 1-chome, Shinagawa-ku, Tokyo 141-8672, Japan

Inventors

1. Yuki INOUE
c/o Hitachi, Ltd., 6-6, Marunouchi 1-chome, Chiyoda-ku, Tokyo 100-8280, Japan
2. Takayuki AKIYAMA
c/o Hitachi, Ltd., 6-6, Marunouchi 1-chome, Chiyoda-ku, Tokyo 100-8280, Japan

Specification

BACKGROUND OF THE INVENTION
1. Field of the Invention
[0001]
The present invention relates generally to inspection of an object.
2. Description of the Related Art
[0002]
In recent years, automation of work support has been required from the viewpoint of reducing the number of workers, preventing mistakes, and so on. As a part of this, there is an increasing need for technology of automatically determining a situation from a video by a wearable camera mounted on a worker, a fixed camera installed on site, and the like. [0003]
Examples of a system that determines a work situation from a video include an inspection system that determines whether an object is normal by comparing the object on site with an image registered in advance. This is used, for example, for finding a defective product at a product inspection site, or for inspection of checking whether or not a lamp, a meter, or the like of a work
target component is normally operating at a maintenance work site.
[0004]
Prior art related to the above includes a technology related to inspection of a product (see JP 2018-54438 A). With the technology disclosed in JP 2018-54438 A, it is checked whether a product to be inspected is defective by aligning (correcting inclination and misalignment) an image registered in a database with an image appearing on a wearable camera.
SUMMARY OF THE INVENTION
[0005]
In the technology described in JP 2018-54438 A, the inspection target product is compared with the registered image for the entire image. Hence, if this technology is used as it is for appearance inspection, there is a problem that an erroneous determination is caused when a change is made at a part irrelevant to determination of the end of work, such as a color change due to aging deterioration of equipment, affixing such as a seal or a sticker, or the like.
[0006]
The present invention has been made in view of the above points, and an object is to propose an inspection
system or the like capable of inspecting an object more appropriately.
[0007]
In order to solve such a problem, the present invention is provided with: a storage unit that stores an attention part mask indicating which part of an inspection target object to pay attention to; a calculation unit that calculates a distance between an image of an attention part mask of the object in an image of an object photographed by a photographing unit at a first time point and an image of the attention part mask in an image of the object photographed by a photographing unit at a second time point that is after the first time point, and a distance between the image and an image of the attention part mask in a target image, which is an image when the object is in a normal state; and a determination unit that compares two distances calculated by the calculation unit, and determines whether or not an object is in a normal state.
[0008]
In the above configuration, since the distance between the image of the attention part mask in the image of the object at the first time point and the image of the attention part mask in the image of the object at the second time point is compared with the distance between the image and the image of the attention part mask in the
target image to determine whether or not the object is in a normal state, it is possible to inspect the object more accurately than, for example, when comparing the entire image.
[0009]
According to the present invention, it is possible to inspect the object more appropriately.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010]
Fig. 1 is a diagram illustrating an example of a configuration related to an inspection system according to a first embodiment;
Fig. 2 is a diagram illustrating an example of processing related to the inspection system according to the first embodiment;
Fig. 3 is a diagram illustrating an example of a data configuration accumulated in a DB unit according to the first embodiment;
Fig. 4 is a diagram illustrating an example of a configuration of an attention part mask creation unit according to the first embodiment;
Fig. 5 is a diagram illustrating an example of a configuration of a combination ratio preparation unit according to the first embodiment;
Fig. 6 is a diagram illustrating an example of a configuration of an image comparison unit according to the first embodiment;
Fig. 7 is a diagram illustrating an example of a configuration of a GUI unit according to the first embodiment;
Fig. 8 is a diagram illustrating an example of a GUI screen according to the first embodiment;
Fig. 9 is a diagram illustrating an example of processing related to an inspection system according to a second embodiment;
Fig. 10 is a diagram illustrating an example of a configuration related to an inspection system according to a third embodiment;
Fig. 11 is a diagram illustrating an example of processing performed after work determination according to the third embodiment;
Fig. 12 is a diagram illustrating an example of a configuration of a GUI unit according to the third embodiment;
Fig. 13 is a diagram illustrating an example of a GUI screen according to the third embodiment; and
Fig. 14 is a diagram illustrating an example of the GUI screen according to the third embodiment.
DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0011]
(I) First embodiment
An embodiment of the present invention will be described below in detail. In the present embodiment, an inspection technology of automatically determining whether the appearance of an object coincides with a registered image will be described. The inspection system of the present embodiment determines whether or not an object on site is normal by using an image in which parts not relevant to the inspection are excluded from the entire image. The object is an object related to an inspection target that may be a product, a machine (belt conveyor, manufacturing device, and the like) used for manufacturing of a product, or equipment (magnet, whiteboard, and so on indicating progress of manufacturing) for managing manufacturing of a product.
[0012]
Here, in order to exclude a point not relevant to the inspection, it is conceivable to manually add in advance an attention part in a full image and register it in a database (DB). This has a problem that, in a case of performing work for a large number of pieces of equipment such as maintenance work, it is necessary to manually register an attention part for each piece of equipment, which increases man-hours.

[0013]
In this regard, in the present embodiment, a configuration capable of determining whether or not the object is normal by automatically determining the attention part even without manual registration work of an attention part will also be described. [0014]
Next, embodiments of the present invention will be described with reference to the drawings. However, the present invention is not limited to the embodiments. The following description and drawings are illustrative of the present invention and are omitted and simplified as appropriate for a clearer description. The present invention can also be carried out in various other forms. Unless otherwise specified, each component may be singular or plural. [0015]
In the following description, various types of information are sometimes described by expressions such as "table" and "list", but various types of information may be expressed by data structures other than these. In order to indicate that they do not depend on the data structure, "XX table", "XX list", and the like are sometimes called "XX information". In describing identification information, expressions such as "identification information",
"identifier", "name", "ID", and "number" are used, and these can be replaced with one another.
[0016]
In the following description, the same elements in the drawings are denoted by the same reference numerals, and description thereof is sometimes omitted as appropriate. A common part (part excluding a branch number) of reference numerals including the branch number is used when the same type of elements are sometimes explained without any distinction, and a reference numeral including the branch number is sometimes used when the same type of elements are explained with distinction. For example, when an image pair selection unit is sometimes described without any particular distinction, it is described as an "image pair selection unit 101". Meanwhile, when individual image pair selection units are described with distinction, they are sometimes described as "attention part mask image pair selection unit 101-1" and "image comparison image pair selection unit 101-2".
[0017]
In Fig. 1, the reference numeral 100 denotes an inspection system according to the first embodiment as a whole.
[0018]
Fig. 1 is a diagram illustrating an example of the
configuration related to the inspection system 100.
[0019]
The inspection system 100 includes a photographing unit 110, a DB unit 120, an alignment unit 130, an attention part mask creation unit 140, a cropping processing unit 150, a combination ratio preparation unit 160, an image comparison unit 170, an appearance normality determination unit 180, and a GUI unit 190.
[0020]
The photographing unit 110 is, for example, a camera and captures an image of work target equipment (work target component). The DB unit 120 is a DB that retains an image
(target image) of a work target component in a normal state photographed in advance. The alignment unit 130 aligns, on a pixel-by-pixel basis, corresponding parts between an image of a work target component photographed by the photographing unit 110 and an image retained in the DB unit 120. The attention part mask creation unit 140 automatically creates a mask (attention part mask) describing which part of the image aligned by the alignment unit 130 to use for appearance normality determination. The cropping processing unit 150 performs processing
(cropping processing) of cutting out the aligned image for a region indicated by the attention part mask. The combination ratio preparation unit 160 determines a
relative importance of a calculation method used for calculation of the degree of difference (distance) between images. The image comparison unit 170 calculates the distance between images based on the importance prepared by the combination ratio preparation unit 160. The appearance normality determination unit 180 performs appearance normality determination (work determination) as to whether or not the work has normally ended from the distance between the images. The GUI unit 190 displays the result of the processing to the user. The user may be a worker who performs the work or may be a person different from the worker. [0021]
The inspection system 100 includes, as the image pair selection unit 101, an attention part mask image pair selection unit 101-1, an image comparison image pair selection unit 101-2, and a combination ratio image pair selection unit 101-3. Each image pair selection unit 101 selects an appropriate pair of images (set of two images) from the input image group and outputs the image pair to subsequent processing. In the present embodiment, the attention part mask image pair selection unit 101-1 selects a pair of an image (start image) of a work target component at the time of start of the work and an image (end image) of the work target component at the
time of end of the work. The image comparison image pair selection unit 101-2 selects a pair of a start image and an end image and a pair of a target image and an end image. The combination ratio image pair selection unit 101-3 selects a pair of a start image and a target image. [0022]
Details of the DB unit 120 will be described with reference to Fig. 3. Details of the attention part mask creation unit 140 will be described later with reference to Fig. 4. Details of the combination ratio preparation unit 160 will be described with reference to Fig. 5. Details of the image comparison unit 170 will be described later with reference to Fig. 6. Details of the GUI unit 190 will be described with reference to Fig. 7. [0023]
The alignment unit 130 receives, as input, the target image, the start image, and the end image. The same object (work target component) is in all of these images. However, the work target component in the images looks different because the photographing conditions such as the angle and distance from the subject are different. In order to compare images, it is preferable that the work target components are on equal terms at the pixel coordinate level. Therefore, the alignment unit 130 geometrically deforms (aligns) the other images so that
they appear on equal terms to those of the work target components in the target image.
[0024]
As the alignment calculation method, existing feature amount matching methods such as template matching, AKAZE, and SURF may be used. By performing the alignment processing, it is possible to know the correspondence of pixels that have captured the same part of the input image, and to perform image comparison downstream in the processing. In a case where the DB unit 120 retains the attention part mask, the alignment unit 130 also receives the attention part mask as input. In this case, it is not necessary to perform alignment (geometric deformation) between the attention part mask and the target image, and the alignment unit 130 outputs the attention part mask as it is.
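As a concrete illustration of this step, the following is a minimal sketch of feature-based alignment using AKAZE, one of the methods named above. It assumes OpenCV and 8-bit grayscale NumPy images; the function name, ratio-test constant, and RANSAC threshold are illustrative choices rather than values given in this specification.

```python
import cv2
import numpy as np

def align_to_target(moving_img, target_img, min_matches=10):
    """Geometrically deform moving_img so it lines up with target_img."""
    akaze = cv2.AKAZE_create()
    kp1, des1 = akaze.detectAndCompute(moving_img, None)
    kp2, des2 = akaze.detectAndCompute(target_img, None)
    # AKAZE descriptors are binary, so Hamming distance is appropriate.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    pairs = matcher.knnMatch(des1, des2, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) < min_matches:
        raise ValueError("not enough feature matches for alignment")
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    homography, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = target_img.shape[:2]
    return cv2.warpPerspective(moving_img, homography, (w, h))
```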
[0025]
The cropping processing unit 150 performs processing called cropping of cutting out a part of an input image. In order to designate a part to perform cropping, the cropping processing unit 150 receives also an attention part mask as input.
[0026]
The appearance normality determination unit 180 receives, from an image comparison unit 170, the distance
(distance 1) between the start image and the end image and the distance (distance 2) between the end image and the target image as input and compares those distances, thereby determining whether the appearance of the work target component appearing in the end image is normal. As an example of the determination logic, in a case where the distance 2 is smaller than the distance 1, the end image is more similar to the target image than the start image is, and therefore the appearance is determined to be normal.
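A minimal sketch of this determination logic, assuming distance 1 and distance 2 have already been computed by the image comparison unit 170; the rule shown is the example given above, not the only possible logic.

```python
def is_appearance_normal(distance_1, distance_2):
    """distance_1: start image vs. end image;
    distance_2: end image vs. target image."""
    # If the end image is closer to the normal-state target image than
    # to the start image, the appearance is determined to be normal.
    return distance_2 < distance_1
```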
[0027]
Fig. 2 is a diagram illustrating an example of the processing related to the inspection system 100.
[0028]
In S200, at the time of start of the work, the user photographs the work target component by using the photographing unit 110, and the photographing unit 110 acquires (captures) an image (start image) of the work target component.
[0029]
In S201, the worker performs the work. For example, while the worker performs the work in accordance with a predetermined procedure or waits until the operation of the equipment ends, time elapses and the appearance of the work target component changes.
[0030]
In S202, at the time of end of the work, the user photographs the work target component by using the photographing unit 110, and the photographing unit 110 acquires an image (end image) of the work target component. The start image and the end image are input to the alignment unit 130. [0031]
In S203, the start image and the end image that have been aligned by the alignment unit 130 are input to the attention part mask creation unit 140, and the attention part mask creation unit 140 creates an attention part mask. [0032]
In S204, the start image and the target image that have been aligned by the alignment unit 130 are input to the combination ratio preparation unit 160, and the combination ratio preparation unit 160 decides the importance of the calculation method used to calculate the distance between the images. [0033]
In S205, the image comparison unit 170 calculates the distance (distance 1) between the start image and the end image and the distance (distance 2) between the end image and the target image. The importance of the calculation method used by the image comparison unit 170 at this time conforms to that decided in S204.

[0034]
In S206, the appearance normality determination unit 180 compares the distances between the images calculated in S205, and performs work determination as to whether or not the work target component is in a normal state (appearance inspection of whether the appearance of the work target component appearing in the end image is normal). When the appearance normality determination unit 180 determines that the work target component is not in a normal state, it notifies the GUI unit 190 of information indicating that the work target component is not in a normal state (in an abnormal state) (S207). When the appearance normality determination unit 180 determines that the work target component is in a normal state, it notifies the GUI unit 190 of information indicating that the work target component is in a normal state (S208).
[0035]
Fig. 3 is a diagram illustrating an example of the data configuration accumulated in the DB unit 120. The DB unit 120 accumulates data in a form where a work process and an image used for work determination are associated. The DB unit 120 stores information in which a work ID identifiable of the work, a target image, a reference image, and an attention part mask are associated with one another. Of these pieces of information, the reference
image and the attention part mask may or may not actually retain data.
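For illustration, one record of the DB unit 120 could be modeled as below; the dataclass and field names are assumptions for this sketch, and the optional fields reflect that the reference image and attention part mask may or may not hold data.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class WorkRecord:
    work_id: str                                      # identifies the work process
    target_image: np.ndarray                          # normal-state image
    reference_image: Optional[np.ndarray] = None      # may be absent
    attention_part_mask: Optional[np.ndarray] = None  # may be absent
```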
[0036]
Fig. 4 is a diagram illustrating an example of the configuration of the attention part mask creation unit 140 and the relationship with other units. [0037]
The attention part mask creation unit 140 receives a set of two images from the attention part mask image pair selection unit 101-1 as an input, and creates an attention part mask from the difference between those images. When the DB unit 120 retains the attention part mask, the attention part mask creation unit 140 receives the attention part mask as input from the alignment unit 130. [0038]
The attention part mask creation unit 140 includes a pixel difference unit 400, a threshold processing unit 410, a mask part decision unit 420, and an attention part mask integration unit 430. [0039]
First, the pixel difference unit 400 calculates a difference in each pixel of the two input images. Next, using a preset threshold, the threshold processing unit 410 performs threshold processing on the obtained difference. Since the part with a large difference between the two images is a part changed during the work, these parts are calculated by the processing of the pixel difference unit 400 and the threshold processing unit 410. In the present embodiment, the output of the threshold processing unit 410 is referred to as a mask candidate image, and pixels with values equal to or greater than the threshold in the mask candidate image are referred to as mask candidate pixels.
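A sketch of the pixel difference and threshold processing, assuming aligned 8-bit grayscale images; the threshold value is an illustrative assumption.

```python
import cv2

def make_mask_candidate_image(img_a, img_b, threshold=30):
    # Pixel difference unit 400: per-pixel absolute difference.
    diff = cv2.absdiff(img_a, img_b)
    # Threshold processing unit 410: binarize with a preset threshold.
    _, candidate = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    return candidate  # nonzero pixels are the mask candidate pixels
```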
[0040]
Finally, the mask part decision unit 420 decides the attention part mask. The mask part decision unit 420 includes a regionalization unit 421, a noise removal unit 422, a vicinity region integration unit 423, and a DB information output unit 424. The regionalization unit 421 groups mask candidate pixels and recognizes them as a mask candidate region. The noise removal unit 422 removes a mask candidate region smaller than a preset threshold
(area). In a case where two mask candidate regions exist nearby (in a case where they are smaller than a preset threshold (number of pixels)), the vicinity region integration unit 423 integrates them and treats them as one mask candidate region. When the DB unit 120 retains an attention part mask, the DB information output unit 424 outputs the attention part mask.
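A sketch of the mask part decision, assuming OpenCV connected components for the regionalization unit 421; morphological closing is used here as one way to approximate the vicinity region integration unit 423, and the area and gap thresholds are illustrative.

```python
import cv2
import numpy as np

def decide_attention_mask(candidate, min_area=50, max_gap=10):
    # Regionalization unit 421: group mask candidate pixels into regions.
    n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(candidate)
    mask = np.zeros_like(candidate)
    for i in range(1, n_labels):  # label 0 is the background
        # Noise removal unit 422: drop regions below the area threshold.
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            mask[labels == i] = 255
    # Vicinity region integration unit 423: merge regions closer than
    # max_gap pixels by morphological closing.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (max_gap, max_gap))
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
```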
[0041]
As described with reference to Fig. 1, in the present embodiment, the attention part mask image pair selection unit 101-1 selects one pair (pair of a start image and an end image) and inputs it to the attention part mask creation unit 140. However, in other embodiments, a plurality of pairs may be input. In this case, since the attention part mask is created for each image pair, the attention part mask integration unit 430 integrates (e.g., OR operation) and outputs a plurality of mask candidate regions as one attention part mask.
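For 0/255 binary masks, the OR integration performed by the attention part mask integration unit 430 could look like this sketch.

```python
import numpy as np

def integrate_masks(masks):
    """Pixelwise OR over the attention part masks of all image pairs."""
    combined = np.zeros_like(masks[0])
    for m in masks:
        combined = np.maximum(combined, m)
    return combined
```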
[0042]
Fig. 5 is a diagram illustrating an example of the configuration of the combination ratio preparation unit 160 and the relationship with other units.
[0043]
The combination ratio preparation unit 160 receives, as input, a set of two images from the combination ratio image pair selection unit 101-3, and outputs, as a value, the importance of the calculation method (distance calculation unit 600 described later) used by the image comparison unit 170 from those images. The combination ratio preparation unit 160 includes, as weighting units, a fixed weighting unit 500, a distance proportional weighting unit 510, a distance inversely proportional weighting unit 520, and a weight setting unit 530. The combination ratio
preparation unit 160 may use any weighting unit for preparing the importance. [0044]
The fixed weighting unit 500 determines that all the distance calculation units 600 have the same importance, and outputs the i-th importance as I_i = 1/N, where N is the number of distance calculation units 600.
[0045]
The distance proportional weighting unit 510 calculates the distance of the input image pair by using the distance calculation unit 600, and prepares the importance in a form proportional to the distance. That is, if the distance output of the i-th distance calculation unit 600 is D_i, the i-th importance is output as I_i = D_i / Σ_j D_j.
[0046]
The distance inversely proportional weighting unit 520 calculates the distance of the input image pair by using the distance calculation unit 600, and prepares the importance in a form inversely proportional to the distance. That is, if the distance output of the i-th distance calculation unit 600 is D_i, the i-th importance is output as I_i = (1/D_i) / Σ_j (1/D_j).
[0047]
The weight setting unit 530 outputs the preset
importance as it is.
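The three automatic weighting schemes above can be sketched as follows, where d holds the distance outputs D_i of the distance calculation units 600 for the selected image pair; the function names are illustrative.

```python
import numpy as np

def fixed_weights(n):
    return np.full(n, 1.0 / n)                # I_i = 1/N

def distance_proportional_weights(d):
    d = np.asarray(d, dtype=float)
    return d / d.sum()                        # I_i = D_i / sum_j D_j

def distance_inverse_weights(d):
    inv = 1.0 / np.asarray(d, dtype=float)
    return inv / inv.sum()                    # I_i = (1/D_i) / sum_j (1/D_j)
```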
[0048]
The combination ratio preparation unit 160 may include a combination ratio integration unit 540.
[0049]
As described with reference to Fig. 1, in the present embodiment, the combination ratio image pair selection unit 101-3 selects one pair (pair of a start image and a target image) and inputs it to the combination ratio preparation unit 160. However, in other embodiments, a plurality of pairs may be input. In this case, since the importance is prepared for each image pair, the combination ratio integration unit 540 integrates (e.g., calculates a mean value) the importance.
[0050]
Fig. 6 is a diagram illustrating an example of the configuration of the image comparison unit 170 and the relationship with other units.
[0051]
The image comparison unit 170 calculates a distance
(degree of difference) between image groups with respect to an image group in which a part important in work determination is cut out, output from the cropping processing unit 150. Since the distance between images can be calculated for any two images, the image
comparison image pair selection unit 101-2 selects an image pair from the image group output from the cropping processing unit 150.
[0052]
The image pair to be selected is a pair of an end image and a target image in addition to a pair of a start image and an end image as described with reference to Fig. 1. Since the pairs selected by the image comparison image pair selection unit 101-2 are each input to the image comparison unit 170, the output of the image comparison unit 170 in the present embodiment is the two distances corresponding to the two pairs.
[0053]
Examples of the processing unit that calculates the distance between images include various distance calculation units 600. The distance calculation units 600 each receive two images as input and output one value representing the distance between the images. The image comparison unit 170 may perform preprocessing, such as converting an image into a monochrome image or into another color space such as HSV or HSL, before inputting the image into the distance calculation unit 600.
[0054]
The image comparison unit 170 includes, as the
distance calculation unit 600, a histogram distance calculation unit 600-1, a pixel mean distance calculation unit 600-2, a pixel median distance calculation unit 600-3, a pixel difference sum distance calculation unit 600-4, and a learning model type distance calculation unit 600-5.
[0055]
The histogram distance calculation unit 600-1 creates a histogram of pixel values for each image and calculates the distance between those histograms. The distance may be calculated by absolute value comparison for each bin of the histogram or by a method such as the earth mover's distance.
[0056]
The pixel mean distance calculation unit 600-2 calculates the mean value of pixel values for each image and calculates the distance between those values. The distance may be calculated by comparison of absolute values or by a comparison such as an L2 distance.
[0057]
The pixel median distance calculation unit 600-3 calculates the median of pixel values for each image and calculates the distance between those values. The distance may be calculated by comparison of absolute values or by a comparison such as an L2 distance.
[0058]
The pixel difference sum distance calculation unit 600-4 calculates a difference of pixels having the same position between two images, and uses the sum of the difference values as the distance. The pixel difference may be calculated by comparison of absolute values or by a comparison such as an L2 distance.
[0059]
The learning model type distance calculation unit 600-5 calculates the distance of two images by using a model (machine learning model, AI model, and the like) adjusted based on learning data. [0060]
The image comparison unit 170 includes a distance result combination unit 610 for bringing the outputs of the distance calculation units 600 together. The distance result combination unit 610 refers to the importance of the distance calculation unit 600 received from the combination ratio preparation unit 160, and brings the distance outputs of the distance calculation units 600 into one value. More specifically, the distance result combination unit 610 includes a weight multiplication unit 611 and a threshold processing unit 612. The weight multiplication unit 611 multiplies the importance by the corresponding distance output to calculate the sum thereof. The threshold processing unit 612 sets a threshold in advance for each
distance calculation unit 600, sets "1" when the distance output exceeds that threshold and "0" otherwise, and brings the results together using the importance as described for the weight multiplication unit 611.
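Minimal sketches of four of the distance calculation units 600 and of the distance result combination unit 610, assuming grayscale NumPy images, precomputed importance weights w, and per-unit thresholds t; the bin count and the use of absolute values are illustrative choices (the learning model type unit 600-5 is omitted because it depends on a trained model).

```python
import numpy as np

def histogram_distance(a, b, bins=32):
    # Unit 600-1: per-bin absolute comparison of pixel-value histograms.
    ha, _ = np.histogram(a, bins=bins, range=(0, 256), density=True)
    hb, _ = np.histogram(b, bins=bins, range=(0, 256), density=True)
    return float(np.abs(ha - hb).sum())

def pixel_mean_distance(a, b):
    # Unit 600-2: absolute comparison of mean pixel values.
    return abs(float(a.mean()) - float(b.mean()))

def pixel_median_distance(a, b):
    # Unit 600-3: absolute comparison of median pixel values.
    return abs(float(np.median(a)) - float(np.median(b)))

def pixel_difference_sum_distance(a, b):
    # Unit 600-4: sum of absolute differences of same-position pixels.
    return float(np.abs(a.astype(float) - b.astype(float)).sum())

def combine_weighted(d, w):
    # Weight multiplication unit 611: importance-weighted sum of distances.
    return float(np.dot(np.asarray(w, float), np.asarray(d, float)))

def combine_thresholded(d, w, t):
    # Threshold processing unit 612: binarize each distance, then weight.
    bits = (np.asarray(d, float) > np.asarray(t, float)).astype(float)
    return float(np.dot(np.asarray(w, float), bits))
```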
[0061]
Fig. 7 is a diagram illustrating an example of the configuration of the GUI unit 190 and the relationship with other units.
[0062]
In addition to the image group to be output from the alignment unit 130, the GUI unit 190 receives, as input, the result of the work determination to be output from the appearance normality determination unit 180. The GUI unit 190 includes a determination basis GUI unit 700. The determination basis GUI unit 700 takes a form of a graphical user interface (GUI), and includes the processing units (1) to (3) described below for the person (user) who operates the GUI to efficiently browse the basis of the work determination.
[0063]
(1) An image display unit 701 displays a start image, an end image, a target image, an attention part mask, and an image cropped by the cropping processing unit 150.
(2) A determination result display unit 702 displays
the result of the work determination to be output from the appearance normality determination unit 180.
(3) A determination approval unit 703 confirms with the user whether or not the result of the work determination is correct. [0064]
Fig. 8 is a diagram illustrating an example of a GUI screen visualized by the determination basis GUI unit 700. The GUI screen includes a component 800 that displays an image used for work determination, a component 810 that displays a result of the work determination, and a component 820 that confirms with the user whether or not the result of the work determination is correct.
[0065]
Thus, by performing the processing of the present embodiment, it is possible to perform a work determination that is robust to aged deterioration of the work target component and to changes at parts not relevant to the work determination. It becomes possible to automatically determine the attention part, and thus to perform work determination without manually registering the attention part.
[0066]
(II) Second embodiment
The configuration related to the inspection system 100 of the present embodiment is the same as that illustrated in Fig. 1, and thus illustration and description thereof are omitted. As a difference from the first embodiment, the DB unit 120 retains, in addition to the target image, a reference image that is an image when the appearance of the work target component is abnormal, and outputs it to the alignment unit 130. Note that a plurality of reference images may be present. [0067]
An additional image pair is also selected in the present embodiment, in addition to the image pairs selected by the attention part mask image pair selection unit 101-1, the image comparison image pair selection unit 101-2, and the combination ratio image pair selection unit 101-3 in the first embodiment. The attention part mask image pair selection unit 101-1 selects a pair of a target image and a reference image. The image comparison image pair selection unit 101-2 selects a pair of an end image and a reference image and a pair of an end image and a target image. The combination ratio image pair selection unit 101-3 selects a pair of a target image and a reference image. If a plurality of reference images are present, all combinations are considered when creating a pair. [0068]

The appearance normality determination unit 180 in the present embodiment receives, as input, the distances (distances 3 to N) between the end image and the reference images from the image comparison unit 170, in addition to the distance (distance 1) between the start image and the end image and the distance (distance 2) between the end image and the target image, and performs work determination by comparing those distances. Examples of the determination logic include a method of determining that the work target component is normal if distance 2 is smaller than distance 1, as in the first embodiment, and a method of determining that the work target component is in a normal state when distance 2 is the minimum among distances 2 to N, because the end image is then more similar to the target image than to any reference image. [0069]
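A sketch of the reference-image variant of the logic, assuming distance_2 is the end-to-target distance and ref_distances holds the end-to-reference distances (distances 3 to N).

```python
def is_normal_with_references(distance_2, ref_distances):
    # Normal if the end image is more similar to the target image than
    # to every abnormal-state reference image.
    return all(distance_2 < d for d in ref_distances)
```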
Fig. 9 is a diagram illustrating an example of the processing related to the inspection system 100 in the present embodiment. [0070]
In S900, at the time of start of the work, the user photographs the work target component by using the photographing unit 110, and the photographing unit 110 acquires (captures) an image (start image) of the work target component.

[0071]
In S901, the worker performs the work. For example, while the worker performs the work in accordance with a predetermined procedure or waits until the operation of the equipment ends, time elapses and the appearance of the work target component changes. [0072]
In S902, at the time of end of the work, the user photographs the work target component by using the photographing unit 110, and the photographing unit 110 acquires an image (end image) of the work target component. The start image and the end image are input to the alignment unit 130. [0073]
In S903, the start image, the end image, the target image, and the reference image that have been aligned by the alignment unit 130 are input to the attention part mask creation unit 140, and the attention part mask creation unit 140 creates an attention part mask. Note that the processing of creating the attention part mask from the target image and the reference image is the same as the processing of creating the attention part mask from the start image and the end image, and therefore the description thereof is omitted. [0074]

In S904, the start image, the end image, the target image, and the reference image that have been aligned by the alignment unit 130 are input to the combination ratio preparation unit 160, and the combination ratio preparation unit 160 decides the importance of the calculation method used to calculate the distance between the images. [0075]
In S905, the image comparison unit 170 calculates the distance (distance 1) between the start image and the end image, the distance (distance 2) between the end image and the target image, and the distance (distance 3... N) between the end image and the reference image. The importance of the calculation method used by the image comparison unit 170 at this time conforms to that determined in S904. [0076]
In S906, the appearance normality determination unit 180 compares the distances between the images calculated in S905, and performs work determination as to whether or not the work target component is in a normal state (appearance inspection of whether the appearance of the work target component appearing in the end image is normal). When the appearance normality determination unit 180 determines that the work target component is not in a normal state, it notifies the GUI unit 190 of information indicating that
the work target component is not in a normal state (in an abnormal state) (S907). When the appearance normality determination unit 180 determines that the work target component is in a normal state, it notifies the GUI unit 190 of information indicating that the work target component is in a normal state (S908).
[0077]
The GUI of the present embodiment is substantially similar to the GUI in the first embodiment, and thus illustration and description thereof are omitted. The difference lies in that the reference image is also displayed in the component 800, in addition to the start image, the end image, and the target image.
[0078]
Thus, by performing the processing of the present embodiment, it becomes possible to perform the work determination not only by using a normal appearance (target image) of the work target component but also by using an appearance (reference image) at the time of abnormality, and it becomes possible to perform the work determination more robustly.
[0079]
(III) Third embodiment
Fig. 10 is a diagram illustrating an example of a configuration related to the inspection system 100 in the
present embodiment. In the present embodiment, a configuration of performing feedback from the GUI unit 190 to the DB unit 120 is added to the inspection system 100 described in the first embodiment. In the present embodiment, not only the target image but also the reference image may be used as in the second embodiment. [0080]
A feedback mechanism added in the present embodiment will be described with reference to the flowchart of Fig. 11. [0081]
Fig. 11 is a diagram illustrating an example of processing performed after the appearance normality determination unit 180 performs work determination (S207 and S208 in Fig. 2 and S907 and S908 in Fig. 9). [0082]
In S1101, the GUI unit 190 asks the user whether to present the basis of work determination. [0083]
In S1102, the GUI unit 190 presents a GUI screen indicating the basis of the work determination as illustrated in Fig. 13. [0084]
In S1103, the GUI unit 190 asks the user whether to edit the current work content and update the DB unit 120.
[0085]
In S1104, the GUI unit 190 presents a GUI screen capable of editing the attention part mask as illustrated in Fig. 14, and causes the user to edit the attention part mask.
[0086]
In S1105, the GUI unit 190 updates the DB unit 120 based on the attention part mask and the end image edited in S1104.
[0087]
Fig. 12 is a diagram illustrating an example of the configuration of the GUI unit 190 according to the present embodiment and the relationship with other units.
[0088]
In addition to the image group to be output from the alignment unit 130, the GUI unit 190 receives, as input, the result of the work determination to be output from the appearance normality determination unit 180. The GUI unit 190 includes the determination basis GUI unit 700 and an edit GUI unit 1200.
[0089]
First, the determination basis GUI unit 700 includes the processing units (1) to (4) described below for the person (user) who operates the GUI to efficiently browse
the basis of the work determination.
(1) An image display unit 701 displays a start image, an end image, a target image, an attention part mask, and an image cropped by the cropping processing unit 150.
(2) A determination result display unit 702 displays the result of the work determination to be output from the appearance normality determination unit 180.
(3) A determination approval unit 703 confirms with the user whether or not the result of the work determination is correct.
(4) An edit confirmation unit 1201 edits the attention part mask and confirms with the user whether to update the DB unit 120.
[0090]
The edit GUI unit 1200 includes the processing units
(1) to (11) described below for the person (user) who operates the GUI to efficiently edit the attention part mask.
(1) An image display unit 1211 displays a start image, an end image, a target image, an attention part mask, and an image cropped by the cropping processing unit 150.
(2) A region selection unit 1212 defines, as an attention region, adjacent pixels detected as an attention
part in the attention part mask. The region selection unit 1212 selects an attention region in response to a user input.
(3) A region expansion/reduction unit 1213 changes, by user input, the size of the attention region selected by the region selection unit 1212.
(4) A region movement unit 1214 translates, by user input, the attention region selected by the region selection unit 1212.
(5) A region rotation unit 1215 rotates, by user input, the attention region selected by the region selection unit 1212.
(6) A region joint unit 1216 joins, by user input, the plurality of attention regions selected by the region selection unit 1212.
(7) A region deletion unit 1217 deletes, by user input, the attention region selected by the region selection unit 1212.
(8) A pixel addition/deletion unit 1218 adds or deletes, by user input, a discretionary pixel to or from the attention part mask.
(9) A distance result combination ratio change unit 1219 sets, by user input, the output value of the weight setting unit 530.
(10) A tentative determination result display unit
1220 displays the result of the work determination when the attention part mask under edit is used.
(11) An edit end unit 1221 ends the edit work.
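As an illustration of two of the listed operations, region movement and region rotation on a binary region mask could be sketched as below, assuming OpenCV; the translation and rotation parameters would come from user input.

```python
import cv2
import numpy as np

def move_region(region_mask, dx, dy):
    # Region movement unit 1214: translate the selected region.
    m = np.float32([[1, 0, dx], [0, 1, dy]])
    h, w = region_mask.shape[:2]
    return cv2.warpAffine(region_mask, m, (w, h))

def rotate_region(region_mask, angle_deg):
    # Region rotation unit 1215: rotate the region about the mask center.
    h, w = region_mask.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
    return cv2.warpAffine(region_mask, m, (w, h))
```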
[0091]
Fig. 13 is a diagram illustrating an example of a GUI screen visualized by the determination basis GUI unit 700.
[0092]
The GUI screen is defined by the component 800 that displays an image used for work determination, the component 810 that displays a result of the work determination, the component 820 that confirms with the user whether or not the result of the work determination is correct, and a component 1300 that edits the attention part mask and confirms with the user whether to update the DB unit 120.
[0093]
Fig. 14 is a diagram illustrating an example of a GUI screen visualized by the edit GUI unit 1200.
[0094]
The GUI screen is defined by a component 1400 that displays an image used for work determination, a component 1401 that selects an attention region, a component 1402 that changes the size of the selected attention region, a component 1403 that translates the selected attention
region, a component 1404 that rotates the selected attention region, a component 1405 that joins a plurality of selected attention regions, a component 1406 that deletes the selected attention region from the mask, a component 1407 that adds or deletes a discretionary pixel to or from the attention part mask, a component 1408 that sets the output value of the weight setting unit 530, a component 1409 that displays the result of work determination when the attention part mask under edit is used, and a component 1410 that ends the edit work. [0095]
Thus, by performing the processing of the present embodiment, it becomes possible to perform manual edit even when the automatic generation of the attention part mask fails, and it becomes possible to perform the work determination more robustly. [0096]
A program according to the present invention is a program incorporated in a computer and operates the computer as the inspection system 100. By incorporating the program of the present invention into a computer, the inspection system 100 illustrated in the block diagram of Fig. 1 or the like is configured. [0097]
(IV) Supplementary
The above-described embodiments include, for example, the following contents. [0098]
In the above embodiments, a case of applying the present invention to an inspection system has been described. However, the present invention is not limited to this, and can be widely applied to various other systems, devices, methods, and programs. [0099]
The program is executed by a processor (e.g., CPU and GPU), and thus the determined processing is performed while appropriately using a storage resource (e.g., memory) and/or an interface device (e.g., communication port). Hence, the agent of the processing may be a processor. Similarly, the agent of processing that executes and performs the program may be a computer having a processor, a controller, a device, a system, a calculator, or a node. The agent of processing that executes and performs the program is only required to be an arithmetic operation unit, and may also include a dedicated circuit (e.g., FPGA and ASIC) that performs specific processing. [0100]
The program may be installed from a program source to a device such as a calculator. The program source may be a storage medium readable by a program distribution
server or a calculator, for example. If the program source is a program distribution server, the program distribution server may include a processor and a storage resource that stores a distribution target program, and the processor of the program distribution server may distribute the distribution target program to another calculator. Two or more programs may be implemented as one program, or one program may be implemented as two or more programs.
[0101]
The inspection system 100 may be an inspection device such as a personal computer, and the functions of the inspection device (DB unit 120, alignment unit 130, attention part mask creation unit 140, cropping processing unit 150, combination ratio preparation unit 160, image comparison unit 170, appearance normality determination unit 180, GUI unit 190, and the like) may be implemented, for example, by the CPU reading a program stored in the ROM into the RAM and executing it (software), may be implemented by hardware such as a dedicated circuit, or may be implemented by a combination of software and hardware. Some of the functions of the inspection device may be implemented by another computer capable of communicating with the inspection device.
[0102]
One function of the inspection device may be divided
into a plurality of functions, or the plurality of functions may be brought into one function. Some of the functions of the inspection device may be provided as another function or may be included in another function. Some of the functions of the inspection device may be implemented by another computer capable of communicating with the inspection device.
[0103]
In the above-described embodiments, the illustrated and explained screens are only examples, and any design may be used as long as the received information is the same.
[0104]
In the above embodiments, a case where the mean value is used in various calculations has been described. However, it is not limited to the mean value but may be a maximum value, a minimum value, a difference between the maximum value and the minimum value, a mode, a median, a standard deviation, and the like.
[0105]
In the above-described embodiments, the output of information is not limited to display on a display. The output of information may be an audio output by a speaker, an output to a file, a print on a paper medium or the like by a printer device, a projection onto a screen or the like by a projector, or another aspect.

[0106]
In the above description, information such as a program, a table, and a file that implements each function can be stored in a storage device such as a memory, a hard disk, a solid state drive (SSD), or a recording medium such as an IC card, an SD card, or a DVD.
[0107]
The above-described embodiments have, for example, the following characteristic configuration.
[0108]
An inspection system (e.g., inspection system 100) includes: a storage unit (e.g., DB unit 120, attention part mask creation unit 140, computer, and circuit) that stores
(e.g., stores in a storage device) an attention part mask indicating which part of an inspection target object (e.g., work target component, component) to pay attention to; a calculation unit (e.g., image comparison unit 170, computer, and circuit) that calculates a distance between an image (e.g., part image) of an attention part mask of the object in an image (e.g., entire image) of an object photographed by a photographing unit (e.g., photographing unit 110) at a first time point (may be at the time of start of the work or at the time of installation of a component) and an image of the attention part mask in an image of the object photographed by a photographing unit (e.g., photographing unit 110, and may be the same as or may be different from that used for photographing at the first time point) at a second time point (may be at the time of end of the work or at the time of inspection of a component) that is after the first time point, and a distance between the image and an image of the attention part mask in a target image, which is an image when the object is in a normal state; and a determination unit (e.g., appearance normality determination unit 180, computer, and circuit) that compares two distances calculated by the calculation unit, and determines (may be work determination as to whether or not the work has been performed correctly, or may be defective determination as to whether or not the component is in a normal state) whether or not an object is in a normal state.
[0109]
In the above configuration, since the distance between the image of the attention part mask in the image of the object at the first time point and the image of the attention part mask in the image of the object at the second time point is compared with the distance between the image and the image of the attention part mask in the target image to determine whether or not the object is in a normal state, it is possible to inspect the object more accurately than, for example, when comparing the entire
image.
[0110]
The inspection system includes an alignment unit
(e.g., alignment unit 130, computer, and circuit) that aligns an image of an object photographed by a photographing unit with a target image of the object, in which the alignment unit makes an image of the object photographed by a photographing unit before a start of a work related to an object a first image aligned with a target image of the object, and makes an image of the object photographed by a photographing unit after an end of the work a second image aligned with the target image, and the storage unit generates and stores an attention part mask of the object from a difference between the first image and the second image.
[0111]
In the above configuration, since the attention part mask of the target object is automatically generated and stored from the image of the object before the work and the image of the object after the work, it is possible to reduce the man-hour of the registration work of the attention part mask, for example.
[0112]
The inspection system includes: a processing unit
(cropping processing unit 150, computer, and circuit) that
performs processing of cutting out a part of an attention part mask from a predetermined image, in which the processing unit cuts out a part of the attention part mask from the first image to make a first part image (e.g., attention part image of a start image), cuts out a part of the attention part mask from the second image to make a second part image (e.g., attention part image of an end image), and cuts out a part of the attention part mask from the target image to make a target part image (e.g., attention part image of a target image), the calculation unit calculates a first distance between the first part image and the second part image and a second distance between the second part image and the target part image, and the determination unit compares the first distance with the second distance that are calculated by the calculation unit, and determines whether or not the object is in a normal state. [0113]
The inspection system includes a decision unit (e.g., combination ratio preparation unit 160, computer, and circuit) that decides the importance of each of a plurality of calculation methods used by the calculation unit for calculation of the first distance and the second distance. [0114]

According to the above configuration, it is possible to automatically select a calculation method that can calculate correctly, for example, even when a calculation method that can correctly calculate the distance between images varies depending on the image pair. [0115]
The calculation unit calculates (e.g., histogram distance calculation unit 600-1) a distance by comparing histograms created from the first part image and the second part image, calculates (e.g., pixel mean distance calculation unit 600-2) a distance by comparing mean values of pixel values of the first part image and the second part image, calculates (e.g., pixel median distance calculation unit 600-3) a distance by comparing medians of pixel values of the first part image and the second part image, calculates (e.g., pixel difference sum distance calculation unit 600-4), as a distance, a sum of difference values obtained by calculating a difference of pixels having a same position between the first part image and the second part image, calculates (e.g., learning model type distance calculation unit 600-5) a distance in the first part image and the second part image using a model adjusted based on learning data, and defines (e.g., distance result combination unit 610), as the first distance, a value of a sum obtained by multiplying each calculated distance by an
importance decided by the decision unit and adding products, and calculates (e.g., histogram distance calculation unit 600-1) a distance by comparing histograms created from the second part image and the target part image, calculates (e.g., pixel mean distance calculation unit 600-2) a distance by comparing mean values of pixel values of the second part image and the target part image, calculates (e.g., pixel median distance calculation unit 600-3) a distance by comparing medians of pixel values of the second part image and the target part image, calculates
(e.g., pixel difference sum distance calculation unit 600-4), as a distance, a sum of difference values obtained by calculating a difference of pixels having a same position between the second part image and the target part image, calculates (e.g., learning model type distance calculation unit 600-5) a distance in the second part image and the target part image using a model adjusted based on learning data, and defines (e.g., distance result combination unit 610), as the second distance, a value of a sum obtained by multiplying each calculated distance by an importance decided by the decision unit and adding products.
[0116]
The inspection system includes a user interface unit
(e.g., GUI unit 190, computer, and circuit) that outputs the first image, the second image, the target image, the
attention part mask, the first part image, the second part image (e.g., component 800), a result (e.g., component 810) of determination by the determination unit, and component information (e.g., component 820) for receiving a result of confirmation whether the result is correct (e.g., GUI screen illustrated in Fig. 8).
[0117]
According to the above configuration, for example, the user can judge whether or not an attention part mask is correct based on each displayed image, and can judge whether or not the result of the determination is correct.
[0118]
The storage unit stores a target image that is an image when an object is in a normal state and one or more reference images that are images when the object is not in a normal state, and generates and stores an attention part mask of the object from a difference between the target image and the reference image (see Fig. 9, for example).
[0119]
In the above configuration, since an attention part mask is generated from the target image and the reference image, for example, as the number of reference images increases, noise in the attention part mask can be reduced and the inspection accuracy can be improved.

[0120]
The inspection system includes a user interface unit (e.g., GUI unit 190, computer, and circuit) that enables edit of an attention part mask stored by the storage unit, and the storage unit stores an attention part mask edited by the user interface unit. [0121]
According to the above configuration, for example, since the user can manually edit an attention part mask to correct the contents, it is possible to reduce erroneous determination of inspection. [0122]
The inspection system includes: an alignment unit (e.g., alignment unit 130, computer, and circuit) that aligns an image of an object photographed by a photographing unit with a target image of the object; and a processing unit (cropping processing unit 150, computer, and circuit) that performs processing of cutting out a part of an attention part mask from a predetermined image, in which the alignment unit makes an image of an object photographed by a photographing unit at a first time point a first image aligned with a target image of the object, and makes an image of the object photographed by a photographing unit at a second time point that is after the first time point a second image aligned with the target
image, the processing unit cuts out a part of the attention part mask from the first image to make a first part image (e.g., attention part image of a start image), cuts out a part of the attention part mask from the second image to make a second part image (e.g., attention part image of an end image), and cuts out a part of the attention part mask from the target image to make a target part image (e.g., attention part image of a target image), the user interface unit outputs the first image, the second image, the target image, the attention part mask, the first part image, the second part image (e.g., component 800), a result (e.g., component 810) of determination by the determination unit, first component information (e.g., component 820) for receiving a result of confirmation as to whether the result is correct, and second component information (e.g., component 1300) for receiving edit of the attention part mask (e.g., screen illustrated in Fig. 13), and when receiving edit of the attention part mask from the second component information, performs at least one processing of processing (e.g., processing performed by region expansion/reduction unit 1213) of changing a size of a region of the attention part mask, processing (e.g., processing performed by region movement unit 1214) of changing a place of a region of the attention part mask, processing (e.g., processing performed by region rotation

unit 1215) of changing an angle of a region of the attention part mask, processing (e.g., processing performed by region joint unit 1216) of joining a plurality of regions when there are a plurality of regions of the attention part mask, processing (e.g., processing performed by region deletion unit 1217) of deleting a region of the attention part mask, processing (e.g., processing performed by pixel addition/deletion unit 1218) of adding or deleting a discretionary pixel in the attention part mask, processing (e.g., processing performed by distance result combination ratio change unit 1219) of changing an importance of a plurality of calculation methods used by the calculation unit for calculation of a distance, processing (e.g., processing performed by tentative determination result display unit 1220) of displaying a result of determination by the determination unit when using an attention part mask under edit, and processing (e.g., processing performed by edit end unit 1221) of ending edit. [0123]
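The paragraph above ties each editing operation to a GUI unit (1213 through 1221) without fixing implementations. As a rough sketch of a few of the region operations, assuming the attention part mask is a binary 2-D numpy array (all function names and the choice of scipy.ndimage primitives are hypothetical, not the disclosed implementations):

```python
import numpy as np
from scipy import ndimage

def expand_region(mask, iterations=1):
    """Grow mask regions (expansion case of region expansion/reduction)."""
    return ndimage.binary_dilation(mask, iterations=iterations).astype(mask.dtype)

def shrink_region(mask, iterations=1):
    """Shrink mask regions (reduction case)."""
    return ndimage.binary_erosion(mask, iterations=iterations).astype(mask.dtype)

def move_region(mask, dy, dx):
    """Translate the mask (region movement); pixels shifted off-image are dropped."""
    shifted = np.zeros_like(mask)
    h, w = mask.shape
    ys, xs = np.nonzero(mask)
    ys2, xs2 = ys + dy, xs + dx
    keep = (ys2 >= 0) & (ys2 < h) & (xs2 >= 0) & (xs2 < w)
    shifted[ys2[keep], xs2[keep]] = mask[ys[keep], xs[keep]]
    return shifted

def rotate_region(mask, angle_deg):
    """Rotate the mask about its center (region rotation), nearest-neighbor."""
    return ndimage.rotate(mask, angle_deg, reshape=False, order=0)

def delete_region(mask, label_id):
    """Remove one connected region of the mask (region deletion)."""
    labels, _ = ndimage.label(mask)
    out = mask.copy()
    out[labels == label_id] = 0
    return out
```
[0123]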
The above-described configuration may be changed, rearranged, combined, or omitted as appropriate without departing from the gist of the present invention.
[0124]

It should be understood that items included in a list in the form "at least one of A, B, and C" can mean (A), (B), (C), (A and B), (A and C), (B and C), or (A, B, and C). Similarly, items listed in the form "at least one of A, B, or C" can mean (A), (B), (C), (A and B), (A and C), (B and C), or (A, B, and C).

What is claimed is:
1. An inspection system comprising:
a storage unit that stores an attention part mask indicating which part of an inspection target object to pay attention to;
a calculation unit that calculates a distance between an image of the attention part mask in an image of the object photographed by a photographing unit at a first time point and an image of the attention part mask in an image of the object photographed by the photographing unit at a second time point that is after the first time point, and a distance between the image at the second time point and an image of the attention part mask in a target image, which is an image when the object is in a normal state; and
a determination unit that compares two distances calculated by the calculation unit, and determines whether or not the object is in a normal state.
2. The inspection system according to claim 1 comprising:
an alignment unit that aligns an image of an object photographed by a photographing unit with a target image of the object, wherein
the alignment unit makes an image of the object photographed by a photographing unit before a start of work related to the object a first image aligned with a target image of the object, and makes an image of the object photographed by a photographing unit after an end of the work a second image aligned with the target image, and
the storage unit generates and stores an attention part mask of the object from a difference between the first image and the second image.
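Claim 2 recites alignment with the target image but does not fix an algorithm. One common realization, shown only as a hedged sketch, is feature-based registration with OpenCV (ORB keypoints plus a RANSAC homography); the function name and parameter values below are assumptions, not the applicant's method:

```python
import cv2
import numpy as np

def align_to_target(image, target):
    """Warp `image` into the coordinate frame of `target` using ORB
    feature matches and a RANSAC-estimated homography."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(image, None)
    kp2, des2 = orb.detectAndCompute(target, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:100]
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = target.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))
```

Applying such a function to the photographs taken before the start and after the end of the work would yield the first image and the second image recited in the claim.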
3. The inspection system according to claim 2 comprising:
a processing unit that performs processing of cutting out a part of an attention part mask from a predetermined image, wherein
the processing unit cuts out a part of the attention part mask from the first image to make a first part image, cuts out a part of the attention part mask from the second image to make a second part image, and cuts out a part of the attention part mask from the target image to make a target part image,
the calculation unit calculates a first distance between the first part image and the second part image and a second distance between the second part image and the target part image, and
the determination unit compares the first distance with the second distance calculated by the calculation unit, and determines whether or not the object is in a normal state.
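Claim 3 recites cutting out part images and comparing the first and second distances but leaves the decision rule open. One plausible reading, sketched below with hypothetical names, is that the object is judged normal when the end-of-work part image is closer to the target part image than to the start-of-work part image:

```python
import numpy as np

def crop_part(image, mask):
    """Cropping step: keep only the pixels inside the attention part
    mask, returned as a 1-D vector (positions correspond across images
    because the same mask is applied to each)."""
    return image[mask.astype(bool)]

def inspect(first_img, second_img, target_img, mask, distance_fn):
    """Sketch of the determination step of claim 3 (decision rule is
    an assumption, not recited in the claim)."""
    first_part = crop_part(first_img, mask)    # start-of-work part image
    second_part = crop_part(second_img, mask)  # end-of-work part image
    target_part = crop_part(target_img, mask)  # normal-state part image
    first_distance = distance_fn(first_part, second_part)
    second_distance = distance_fn(second_part, target_part)
    return second_distance < first_distance    # True -> "normal"
```

Here `distance_fn` is any of the distance calculations the claims contemplate, such as the weighted combination of claim 5 sketched after that claim.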
4. The inspection system according to claim 3 comprising:
a decision unit that decides an importance of each of a plurality of calculation methods used by the calculation unit for calculation of the first distance and the second distance.
5. The inspection system according to claim 4, wherein
the calculation unit
calculates a distance by comparing histograms created from the first part image and the second part image, calculates a distance by comparing mean values of pixel values of the first part image and the second part image, calculates a distance by comparing medians of pixel values of the first part image and the second part image, calculates, as a distance, a sum of difference values obtained by calculating a difference between pixels having a same position in the first part image and the second part image, calculates a distance between the first part image and the second part image using a model adjusted based on learning data, and defines, as the first distance, a value of a sum obtained by multiplying each calculated distance by an importance decided by the decision unit and adding products, and
calculates a distance by comparing histograms created from the second part image and the target part image, calculates a distance by comparing mean values of pixel values of the second part image and the target part image, calculates a distance by comparing medians of pixel values of the second part image and the target part image, calculates, as a distance, a sum of difference values obtained by calculating a difference between pixels having a same position in the second part image and the target part image, calculates a distance between the second part image and the target part image using a model adjusted based on learning data, and defines, as the second distance, a value of a sum obtained by multiplying each calculated distance by an importance decided by the decision unit and adding products.
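Claim 5 effectively defines each distance as an importance-weighted sum of several per-method distances. A minimal Python sketch of that combination follows; the histogram bin count, the absence of metric normalization, and the function names are hypothetical choices, and the learned-model distance is omitted:

```python
import numpy as np

def histogram_distance(a, b, bins=32):
    """Distance by comparing histograms of two part-image pixel vectors."""
    ha, _ = np.histogram(a, bins=bins, range=(0, 255), density=True)
    hb, _ = np.histogram(b, bins=bins, range=(0, 255), density=True)
    return float(np.abs(ha - hb).sum())

def combined_distance(part_a, part_b, weights):
    """Weighted sum over the per-method distances recited in claim 5;
    `weights` plays the role of the importances decided by the
    decision unit."""
    metrics = [
        histogram_distance(part_a, part_b),                         # histograms
        abs(float(part_a.mean()) - float(part_b.mean())),           # mean values
        abs(float(np.median(part_a)) - float(np.median(part_b))),   # medians
        float(np.abs(part_a.astype(np.int16)
                     - part_b.astype(np.int16)).sum()),             # same-position
                                                                    # difference sum
    ]
    return sum(w * d for w, d in zip(weights, metrics))
```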
6. The inspection system according to claim 3 comprising:
a user interface unit that outputs the first image, the second image, the target image, the attention part mask, the first part image, the second part image, a result of determination by the determination unit, and component information for receiving a result of confirmation as to whether the result is correct.
7. The inspection system according to claim 1, wherein
the storage unit stores a target image that is an image when the object is in a normal state and one or more reference images that are images when the object is not in a normal state, and generates and stores an attention part mask of the object from a difference between the target image and each reference image.
8. The inspection system according to claim 1 comprising:
a user interface unit that enables edit of an attention part mask stored by the storage unit, wherein
the storage unit stores an attention part mask edited by the user interface unit.
9. The inspection system according to claim 8 comprising:
an alignment unit that aligns an image of an object photographed by a photographing unit with a target image of the object; and
a processing unit that performs processing of cutting out a part of an attention part mask from a predetermined image, wherein
the alignment unit makes an image of the object photographed by a photographing unit at a first time point a first image aligned with a target image of the object, and makes an image of the object photographed by a photographing unit at a second time point that is after the first time point a second image aligned with the target image,
the processing unit cuts out a part of the attention part mask from the first image to make a first part image, cuts out a part of the attention part mask from the second image to make a second part image, and cuts out a part of the attention part mask from the target image to make a target part image,
the user interface unit
outputs the first image, the second image, the target image, the attention part mask, the first part image, the second part image, a result of determination by the determination unit, first component information for receiving a result of confirmation as to whether the result is correct, and second component information for receiving edit of the attention part mask, and
when receiving edit of the attention part mask from the second component information, performs at least one processing of: processing of changing a size of a region of the attention part mask; processing of changing a place of a region of the attention part mask; processing of changing an angle of a region of the attention part mask; processing of joining a plurality of regions when there are a plurality of regions of the attention part mask; processing of deleting a region of the attention part mask; processing of adding or deleting a discretionary pixel in the attention part mask; processing of changing an importance of a plurality of calculation methods used by the calculation unit for calculation of a distance; processing of displaying a result of determination by the determination unit when using an attention part mask under edit; and processing of ending edit.
10. An inspection method comprising:
storing, by a storage unit, an attention part mask indicating a part of an inspection target object to pay attention to;
calculating, by a calculation unit, a distance between an image of the attention part mask in an image of the object photographed by a photographing unit at a first time point and an image of the attention part mask in an image of the object photographed by the photographing unit at a second time point that is after the first time point, and a distance between the image at the second time point and an image of the attention part mask in a target image that is an image when the object is in a normal state; and
determining, by a determination unit, whether or not the object is in a normal state by comparing the two distances calculated by the calculation unit.

Documents

Application Documents

# Name Date
1 202114039437-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [31-08-2021(online)].pdf 2021-08-31
2 202114039437-STATEMENT OF UNDERTAKING (FORM 3) [31-08-2021(online)].pdf 2021-08-31
3 202114039437-REQUEST FOR EXAMINATION (FORM-18) [31-08-2021(online)].pdf 2021-08-31
4 202114039437-PROOF OF RIGHT [31-08-2021(online)].pdf 2021-08-31
5 202114039437-POWER OF AUTHORITY [31-08-2021(online)].pdf 2021-08-31
6 202114039437-JP 2020-164952-DASCODE-C42D [31-08-2021].pdf 2021-08-31
7 202114039437-FORM 18 [31-08-2021(online)].pdf 2021-08-31
8 202114039437-FORM 1 [31-08-2021(online)].pdf 2021-08-31
9 202114039437-DRAWINGS [31-08-2021(online)].pdf 2021-08-31
10 202114039437-DECLARATION OF INVENTORSHIP (FORM 5) [31-08-2021(online)].pdf 2021-08-31
11 202114039437-COMPLETE SPECIFICATION [31-08-2021(online)].pdf 2021-08-31
12 202114039437-FORM 3 [23-02-2022(online)].pdf 2022-02-23
13 202114039437-Others-200422.pdf 2022-04-22
14 202114039437-Others-200422-2.pdf 2022-04-22
15 202114039437-Others-200422-1.pdf 2022-04-22
16 202114039437-GPA-200422.pdf 2022-04-22
17 202114039437-Correspondence-200422.pdf 2022-04-22
18 202114039437-FER.pdf 2022-05-26
19 202114039437-OTHERS [14-09-2022(online)].pdf 2022-09-14
20 202114039437-FORM 3 [14-09-2022(online)].pdf 2022-09-14
21 202114039437-FER_SER_REPLY [14-09-2022(online)].pdf 2022-09-14
22 202114039437-DRAWING [14-09-2022(online)].pdf 2022-09-14
23 202114039437-COMPLETE SPECIFICATION [14-09-2022(online)].pdf 2022-09-14
24 202114039437-CLAIMS [14-09-2022(online)].pdf 2022-09-14
25 202114039437-ABSTRACT [14-09-2022(online)].pdf 2022-09-14
26 202114039437-Response to office action [02-05-2025(online)].pdf 2025-05-02

Search Strategy

1 SearchStrategyMatrix202114039437E_26-05-2022.pdf