
Deterioration Diagnosis System Using Aircraft

Abstract: This aircraft-based deterioration diagnosis system improves the efficiency and accuracy of diagnosing the deterioration state of an object through comparison of past and present aerial images. The system comprises an aircraft (1) that travels along a route around an object (5) and carries a camera for photographing the object (5), and a computer system (100) that controls the travel of the aircraft (1) and the photography of the camera. The computer system acquires from the aircraft (1) data including image groups obtained by consecutively photographing the object (5) at each prescribed date and time. For a single object (5), based on diagnosis data including a diagnostic image group photographed at the current date and time and reference data including a reference image group from a past date and time, it associates images containing the same location as comparison target images, detects a deteriorated location by comparing the past and present comparison target images, and presents to a user a screen in which the deteriorated location in the image is converted so as to be positioned on a three-dimensional model of the object and thereby made visible.


Patent Information

Filing Date: 12 March 2020
Publication Number: 24/2020
Publication Type: INA
Invention Field: PHYSICS
Email: lsdavar@vsnl.com
Grant Date: 2023-08-25

Applicants

HITACHI SYSTEMS, LTD.
2-1, Osaki 1-chome, Shinagawa-ku, Tokyo 1418672

Inventors

1. CYOU, Yuu
c/o Hitachi, Ltd., 6-6, Marunouchi 1-chome, Chiyoda-ku, Tokyo 1008280
2. WATANABE, Junichirou
c/o Hitachi, Ltd., 6-6, Marunouchi 1-chome, Chiyoda-ku, Tokyo 1008280
3. NAKAMURA, Masato
c/o Hitachi Systems, Ltd., 2-1, Osaki 1-chome, Shinagawa-ku, Tokyo 1418672
4. UEDA, Ryouichi
c/o Hitachi Systems, Ltd., 2-1, Osaki 1-chome, Shinagawa-ku, Tokyo 1418672
5. OONISHI, Kentarou
c/o Hitachi Systems, Ltd., 2-1, Osaki 1-chome, Shinagawa-ku, Tokyo 1418672
6. NARITA, Yoshihito
c/o Hitachi Systems, Ltd., 2-1, Osaki 1-chome, Shinagawa-ku, Tokyo 1418672

Specification

Title of invention: Aircraft-based deterioration diagnosis system
Technical field
[0001]
 The present invention relates to information processing technology that uses a flying vehicle such as a drone or an unmanned aerial vehicle (UAV). In particular, the present invention relates to a technique for photographing a target structure with a camera mounted on the flying object and diagnosing and inspecting its state, including deterioration.
Background art
[0002]
 An air vehicle such as a drone or a UAV can navigate autonomously and unmanned based on remote control, automatic control, and wireless communication, and can perform aerial photography using a camera. Systems have been proposed that realize various services by utilizing such flying objects and aerial images. One of them is a diagnostic system (sometimes referred to as a flying-object-based deterioration diagnosis system or the like) that images a target structure with the camera of a flying object and diagnoses its state, including deterioration. Such a diagnostic system is expected to be effective against social issues such as the deterioration of structures, including buildings and infrastructure equipment, the shrinking workforce for countermeasure work such as inspection and repair, and the high cost of that work.
[0003]
 In this diagnostic system, for example, aerial images of a target object are taken by the camera while the drone navigates autonomously, so that images covering a wide range of the target object can be collected with a minimum of manual work. The computer performs image analysis processing and the like on the image group to diagnose and detect locations of deterioration, abnormality, or change on the object. As a result, it is possible to find, for example, deteriorated portions such as cracks, rust, corrosion, and peeling on the surface of the object, or changed portions such as extensions, remodeling, or adhering foreign matter. The work of diagnosis and inspection by humans can thereby be supported and made more efficient.
[0004]
 Incidentally, when a wall surface at a high place of a building, for example, is diagnosed without using an air vehicle, dedicated equipment is required, which leads to problems such as high cost and danger to the operator.
[0005]
 Japanese Patent Application Laid-Open No. 2017-78575 (Patent Document 1) is an example of prior art relating to a diagnostic system using a flying object. Patent Document 1 describes an inspection system and the like that captures images of an inspection object by remotely operating a drone and inspects the inspection object for defects based on the imaging information quickly and at low cost. Patent Document 1 describes that a server device receives the imaging information and the position information of the drone, analyzes the imaging information, and identifies defects of the inspection object.
Prior art documents
Patent literature
[0006]
Patent Document 1: JP-A-2017-78575
Summary of the invention
Problems to be Solved by the Invention
[0007]
 A prior art example such as Patent Document 1 does not describe the details of the process by which a computer identifies defects, deterioration, and the like of an object from the aerial images of a flying body.
[0008]
 In the conventional air-vehicle-based deterioration diagnosis system, finding a deteriorated part in a large amount of aerial images basically relies on a person (a diagnostician) visually confirming the images. Such work is therefore inefficient, requires considerable labor and cost, and makes it difficult to quickly detect deteriorated portions.
[0009]
 Further, in the deterioration diagnosis system using a flying object of the prior art example, when deterioration is detected by comparing current aerial images with past aerial images, the computer must process a large amount of aerial images. The processing itself is difficult, requires a long processing time, and makes efficient diagnosis hard. Because the situation at the time of shooting differs, the contents of past and present aerial images differ. The greater the difference between the two, the more difficult the process of associating and comparing the past and present images becomes, and the harder it is to maintain the accuracy of the diagnostic processing.
[0010]
 An object of the present invention is, with respect to the technology of deterioration diagnosis systems using an air vehicle, to provide a technique that improves the efficiency and accuracy of diagnosis when diagnosing conditions such as deterioration of an object by comparing past and present aerial images.
Means for solving the problem
[0011]
 A representative embodiment of the present invention is a flying-object-based deterioration diagnosis system characterized by the following configuration.
[0012]
 A flying-object-based deterioration diagnosis system of one embodiment diagnoses a state including deterioration of an object by using imaging by an air vehicle. It comprises an air vehicle that navigates a route around the object and has a camera for imaging the object, and a computer system for controlling the navigation of the air vehicle and the imaging of the camera. The computer system acquires from the air vehicle, at each predetermined date and time, data including an image group in which the object is continuously photographed while traveling on the route, and stores it in a DB. For the same object, based on diagnostic data including a diagnostic image group imaged at the current date and time and reference data including a reference image group imaged at a past date and time and referred from the DB, the computer system associates, as comparison target images, a current image and a past image that include the same portion. By comparing the current image and the past image of the comparison target images to determine the difference, the computer system detects from the current image a deterioration point in a state including deterioration, converts the two-dimensional coordinates representing the deterioration point in the current image into three-dimensional coordinates in a space including a three-dimensional model of the object, and generates and displays to the user a screen visualizing diagnosis result information including the deteriorated portion positioned in the space after the conversion.
Effects of the invention
[0013]
 According to a representative embodiment of the present invention, with regard to the technology of deterioration diagnosis systems using an air vehicle, the efficiency and accuracy of diagnosis can be improved when diagnosing a condition such as deterioration of an object by comparing past and present aerial images.
Brief description of the drawings
[0014]
FIG. 1 is a diagram showing a configuration of a flight object utilization deterioration diagnosis system according to an embodiment of the present invention.
FIG. 2 is a diagram showing a configuration example of a drone and a computer system in the embodiment.
FIG. 3 is a diagram showing a configuration example of a computer system in the embodiment.
FIG. 4 is a diagram showing an outline in the embodiment.
FIG. 5 is a diagram showing a basic processing flow in the embodiment.
FIG. 6 is a diagram showing an outline flow of a plane matching type diagnostic process in the embodiment.
FIG. 7 is a diagram showing a diagnosis process of a plane matching method in the embodiment.
FIG. 8 is a diagram showing an outline flow of coordinate conversion processing in the embodiment.
FIG. 9 is a diagram showing an outline of coordinate conversion processing in the embodiment.
FIG. 10 is a diagram showing a flow of coordinate conversion processing of a plane conversion method in a modification of the embodiment.
FIG. 11 is a diagram showing a coordinate conversion process of a plane conversion method in a modification of the embodiment.
FIG. 12 is a diagram showing a configuration example of deterioration point information in the embodiment.
FIG. 13 is a diagram showing a first configuration example of a positioning and visualization screen for a three-dimensional model in the embodiment.
FIG. 14 is a diagram showing a second configuration example of a positioning and visualization screen for a three-dimensional model in the embodiment.
FIG. 15 is a diagram showing a route setting method in a modification of the embodiment.
FIG. 16 is a diagram showing a processing flow of a route setting function in a modification of the embodiment.
FIG. 17 is a diagram showing a camera adjustment method in a modification of the embodiment.
FIG. 18 is a diagram showing a processing flow of a camera adjustment function in a modification of the embodiment.
FIG. 19 is a diagram showing a stepwise association method in a modified example of the embodiment.
FIG. 20 is a diagram showing a partial SFM processing method in a modification of the embodiment.
FIG. 21 is a diagram showing a priority aerial photography method in a modified example of the embodiment.
FIG. 22 is a diagram showing a case where an object is a utility pole and an electric wire in another embodiment.
FIG. 23 is a diagram showing a comparative example with respect to the embodiment: a flying-object-based deterioration diagnosis system in which a person performs the diagnosis.
FIG. 24 is a diagram showing a comparative example with respect to the embodiment: a flying-object-based deterioration diagnosis system in which a computer performs the diagnosis.
FIG. 25 is an explanatory diagram showing association and comparison of past and present image groups in a comparative example.
MODE FOR CARRYING OUT THE INVENTION
[0015]
 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In all the drawings for explaining the embodiments, the same parts are designated by the same reference numerals in principle, and the repeated description thereof will be omitted.
[0016]
 [Problems, etc.]
 Supplementary explanations of the problems in the conventional aircraft-based deterioration diagnosis system are given below.
[0017]
 FIG. 23 shows an example of aerial photography of an object 5 in a first configuration example of the comparative deterioration diagnosis system using an air vehicle, in which a person detects deteriorated portions by visually confirming images. The object 5 is a structure to be diagnosed for deterioration, such as a predetermined building or infrastructure facility. In this example, the side wall surface of the object 5 is set as the diagnosis target area. In this method, a flying body such as a drone 91 is caused to autonomously fly on a predetermined route in the space around the object 5. Under the control of a control device such as the computer 92, via wireless communication, the drone 91 autonomously travels on the set route. As the drone 91 travels, a camera mounted on the drone 91 continuously captures a predetermined area (the diagnosis target area) on the surface of the object 5. The camera operates under predetermined shooting settings and shooting controls. The predetermined shooting settings include the shooting direction of the camera, the shooting timing, and the shooting conditions. The shooting direction is the direction from the position of the drone 91 and the camera toward the shooting location. The shooting timing is, for example, the timing of continuous shooting of still images at a predetermined time interval (it may also be regarded as the shooting timing of a moving image at a predetermined frame rate). The shooting conditions are defined by the setting values of various known camera parameters (focal length, etc.). The route and the shooting settings are set in advance so that a predetermined region of the object 5 can be photographed and diagnosed.
[0018]
 By the above-mentioned aerial photography, continuous images 901, which are a plurality of images continuous in a spatiotemporal sequence, are obtained. The computer 92 obtains data (which may be referred to as diagnostic data) including the image group of the continuous images 901 from the drone 91. A diagnostician, who is a user, performs diagnostic work for detecting a deteriorated portion 902 or the like of the object 5 by visually confirming the image group of the diagnostic data. It takes labor, time, and cost for a person to find a deteriorated portion in a large amount of aerial images. It is desirable to support or automate such diagnostic work and reduce its cost. The following is therefore given as another example of the aircraft-based deterioration diagnosis system of the comparative example.
[0019]
 FIG. 24 shows an example of aerial photography of the object 5 in a second configuration example of the comparative deterioration diagnosis system, in which the computer 92 assists or automates the diagnostic work. FIG. 24A shows the situation during aerial photography of the object 5 at a past date and time, and FIG. 24B shows the situation during aerial photography of the same object 5 at the current date and time. In both cases, the route setting of the drone 91 is the same. In the method of the diagnostic system of this comparative example, as described below, the computer 92 performs diagnostic processing that automatically diagnoses and detects a deteriorated portion or the like based on a comparison between current and past aerial images.
[0020]
 At the date and time of the past diagnosis and aerial photography in (A), the computer 92 obtains data (which may be referred to as reference data) including the image group of the continuous images 901A from the drone 91. The computer 92 stores the reference data, performs the diagnosis process, and stores the diagnosis result information. Further, thereafter, at the current date and time of the diagnosis and aerial photography in (B), the computer 92 obtains data (diagnosis data) including the image group of the continuous images 901B from the drone 91. The computer 92 saves diagnostic data, performs diagnostic processing, and saves diagnostic result information.
[0021]
 The aerial photography and diagnosis are performed at predetermined dates and times according to the object 5. For example, when the purpose is to diagnose the deterioration state, aerial photography is performed on a schedule with a predetermined time unit, such as yearly or monthly, according to that purpose. As a result, continuous images, diagnostic result information, and the like are obtained for each date and time of that time unit and are accumulated and saved in the DB or the like.
[0022]
 Further, the computer 92 acquires not only aerial image data and shooting setting information but also sensor data of various sensors from the drone 91 during each aerial shooting. Examples of the sensor data include position information based on GPS and the like, and information such as azimuth, speed, and acceleration based on electronic compass, gyro sensor, acceleration sensor, and the like. Such information is also used when making a diagnosis.
[0023]
 At the time of the diagnostic processing of the object 5 at a certain diagnosis date and time, the computer 92 compares the image group of the current diagnostic data (the diagnostic image group) with the corresponding image group of the past reference data (the reference image group) to determine and detect deteriorated portions and the like. At that time, the computer 92 associates, as comparison target images, images in the past image group and the current image group whose contents correspond to each other, i.e., that include the same shooting location (this is sometimes referred to as matching). That is, the computer 92 selects one or more candidate images from the past and present image groups and associates the selected set of images as comparison target images. The computer 92 then performs image-level analysis processing between the past and present images of the comparison target images to compare their contents and determine the difference between the two. As a result, the computer 92 determines and detects, from the aerial images, locations in a state of deterioration, abnormality, or change in the surface area of the object 5 (generically referred to as deterioration locations).
[0024]
 Further, the computer 92 determines the type of deterioration (for example, cracking, rust, corrosion, or peeling) and its degree by predetermined processing. The computer 92 detects, for example, a deteriorated portion whose degree of deterioration is larger than a threshold value. The computer 92 stores the diagnosis result information including the detected deterioration locations in the DB or the like and outputs it to the user on a screen. The user confirms the images and other information including the deteriorated portions on the screen.
[0025]
 The diagnostic system of the comparative example requires processing that associates and compares a large amount of past and present aerial images. Such association and comparison processing is itself difficult, and there are problems with the efficiency and accuracy of the diagnostic processing. The computer 92 needs to select the images to be associated with each other from the past and present image groups. For that purpose, it must judge the image contents, e.g., whether the images contain the same portion, and image-level analysis is necessary for that judgment. Such processing must be performed on many past and present images, and processing a large amount of image data takes time. As a result, the diagnostic processing requires a long time and is not efficient, and it is difficult to obtain diagnostic results immediately after aerial photography.
[0026]
 Further, at each diagnosis date and time, the route, the shooting settings, and the like are basically controlled so that the same region of the same object 5 is aerially photographed. However, due to differences in the situation at each time, differences arise between past and present aerial images, such as shake or shift in the image content, even for images of the same location. The larger the difference, the more difficult it is to associate and compare the images, and the harder it is to improve the accuracy of diagnosis.
[0027]
 As an example of the difference in the situation at each date and time, the actual navigation path of the drone 91 may sway or deviate from the set route. For example, wind speed and similar conditions vary with the season and the weather, so the position, speed, and direction of the drone 91 sway or deviate. This causes differences in the aerial image content. The drone 91 and the camera are not always in the same position even at the same elapsed time from the start of navigation, so images taken at the same time point differ in content; it is therefore not enough simply to associate images at the same time point with each other. In addition, even with the same shooting settings (shooting direction, shooting timing, shooting conditions, etc.) along the set route, the state of light in the area of the object 5 at the time of actual shooting differs due to the influence of sunlight, shadows, and the like. The past and present aerial image contents therefore also differ in sharpness.
[0028]
 FIG. 25 is an explanatory diagram showing the difficulty of the processing for associating past and present image groups and for comparing images in the diagnostic system of the comparative example. FIG. 25A shows an example of association and comparison between an image group of reference data at a past date and time and an image group of diagnostic data at the current date and time. Continuous images are shown with the horizontal direction as the time series; each square represents one image. The star mark indicates, for explanation, a certain identical portion in the area of the object 5. As described above, the association processing is image-level matching processing and is difficult. The computer 92 associates, in the past image group and the current image group, one or more images each including the same portion as comparison target images. This example shows a case where three past images and two current images can be associated with each other. The computer 92 performs comparison processing on each associated pair of a past image and a current image, and through the comparison processing determines and detects deteriorated portions based on the differences between the past and the present.
[0029]
 FIGS. 25B and 25C show examples of comparison target images. FIG. 25B shows a simplified example of the content of one image at a past date and time (for example, January 1, 2016). FIG. 25C shows an example of the content of one image at the current date and time (for example, January 1, 2017) covering the same portion of the same object 5. The two images have different contents because the drone 91 and the camera had different positions and shooting directions during aerial shooting. Point 905 indicates an identical position on the wall in both images. The current image contains a deterioration portion 906 (for example, a crack), shown by a broken-line frame; in the past image, the portion 907 corresponding to the deteriorated portion 906 is also indicated by a broken-line frame. In the association and comparison processing, such images must be associated with each other and their contents compared. In the comparison processing, the deterioration point 906 can be detected from the difference between the deterioration point 906 and the point 907. The greater the difference between the past and present image groups, the more difficult the association and comparison processing becomes.
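 As an illustration of the comparison step just described, the following is a minimal sketch of difference-based change detection, assuming the past and current images have already been aligned and resized to the same dimensions (that alignment is exactly the hard part discussed in this section). It uses OpenCV; the function name, threshold, and minimum-area values are illustrative assumptions, not values from the specification.

```python
import cv2
import numpy as np

def detect_changes(past_img, current_img, diff_threshold=40, min_area=25):
    """Return bounding boxes of regions that differ between two aligned images."""
    past_gray = cv2.cvtColor(past_img, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(current_img, cv2.COLOR_BGR2GRAY)
    # Absolute per-pixel difference, then binarize.
    diff = cv2.absdiff(past_gray, curr_gray)
    _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
    # Suppress isolated noise pixels before extracting regions.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```

 Each returned box, such as the region around the deterioration portion 906, would then be scored and filtered by the degree-of-deterioration threshold described in [0024].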
[0030]
 In contrast to the diagnostic system of the comparative example, the diagnostic system of the embodiment incorporates measures for facilitating and increasing the efficiency of the diagnostic processing, including the association and comparison of past and present images, and for improving the accuracy of diagnosis.
[0031]
 (Embodiment)
 With reference to FIGS. 1 to 14, a deterioration diagnosis system using a flying object according to an embodiment of the present invention will be described.
[0032]
 [Outline]
 (1) The diagnostic system according to the embodiment is a system that diagnoses deterioration using a flying object and visualizes the deteriorated portions. This diagnostic system is based on the method of the second configuration example of the comparative example. That is, the present diagnostic system has a deterioration diagnosis function that automatically determines and detects deteriorated portions by performing computer-based diagnostic processing on past and present aerial images of the object. The diagnostic system stores the diagnostic result information in a DB and visualizes and displays it on a screen for the diagnostician, who is the user. The present diagnostic system is devised in its data processing method for the diagnostic processing, the association between images, the comparison processing method, and the like. As a result, the user can perform the diagnostic work efficiently, and diagnosis can be realized at low cost.
[0033]
 In this diagnostic system, as a premise, the images to be processed (two-dimensional images) are continuous images taken by the camera of the air vehicle, that is, a group of still images in the spatiotemporal sequence of navigation and shooting. In this diagnostic system, diagnostic processing for detecting deteriorated portions is performed based on the association and comparison between the diagnostic image group of the current diagnostic data and the reference image group of the past reference data. The diagnostic system of the comparative example described above uses an image-level matching method in the diagnostic processing including the association and comparison. The diagnostic system of the embodiment, by contrast, uses a plane matching method in this diagnostic processing (FIG. 7 and the like described later).
[0034]
 In this plane matching method, the association and comparison processing is performed at the level of planes detected from the images, not at the image level. The diagnostic system detects a plane from each of the current and past images, associates the detected planes with each other, and compares the associated current and past planes. The diagnostic system determines and detects deteriorated portions from the differences obtained by comparing the planes. Using the plane matching method facilitates and speeds up the diagnostic processing, including the association and comparison of large numbers of past and present images, reduces the computational load, and allows deteriorated portions to be detected faster.
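 A minimal sketch of one way such plane-level association could be realized, assuming that a wall plane visible in both images induces a homography between the two views; the function name and parameter values are illustrative assumptions, not the specification's method. The warped past plane can then be compared pixelwise with the current image, for example with the detect_changes sketch shown earlier.

```python
import cv2
import numpy as np

def align_plane(past_img, current_img, min_matches=10):
    """Estimate the homography of the dominant plane and warp the past image
    into the current image's frame so the two planes can be compared."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(past_img, None)
    kp2, des2 = orb.detectAndCompute(current_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None  # the planes could not be associated
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC keeps only the matches consistent with a single plane.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = current_img.shape[:2]
    return cv2.warpPerspective(past_img, H, (w, h))
```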
[0035]
 (2) In addition, the diagnostic system has a function of providing the user with a screen that visualizes, in a space including the three-dimensional object model, the deteriorated portions detected from the images in the diagnostic processing. For this purpose, the present diagnostic system performs conversion processing (coordinate conversion from two dimensions to three dimensions) that converts the two-dimensional coordinate information representing a deteriorated portion in an image into three-dimensional coordinate information on the three-dimensional object model (FIG. 9 described later, etc.). This conversion uses known SFM (Structure from Motion) processing or the like. During conversion, this diagnostic system calculates the three-dimensional coordinates (X, Y, Z) on the three-dimensional model of the object by using perspective transformation matrices from the two-dimensional coordinates (x, y) that represent the deteriorated location in each of at least two of the plurality of consecutive images. The perspective transformation matrices are calculated in advance using SFM processing or the like.
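 A minimal sketch of the two-view triangulation underlying this conversion, assuming the 3x4 perspective projection matrices P1 and P2 of the two images have already been recovered by the SFM step; OpenCV's triangulation routine is used for illustration.

```python
import cv2
import numpy as np

def to_3d(pt1, pt2, P1, P2):
    """Triangulate the 3-D point seen at pixel pt1 in image 1 and pt2 in image 2.

    P1, P2: 3x4 perspective projection matrices from the SFM step.
    """
    a = np.float64(pt1).reshape(2, 1)
    b = np.float64(pt2).reshape(2, 1)
    Xh = cv2.triangulatePoints(P1, P2, a, b)  # homogeneous (4, 1) result
    return (Xh[:3] / Xh[3]).ravel()           # (X, Y, Z) on the object model
```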
[0036]
 In the diagnostic system of the comparative example, even if the diagnostician or the computer can detect a deteriorated part in an aerial image, it may be difficult to understand from that image alone where the deteriorated part is located in the space including the object. In the present diagnostic system, by contrast, the user can easily recognize on the screen where the deteriorated portion is located in the space including the object, and can find deteriorated portions efficiently and intuitively.
[0037]
 (3) Furthermore, in addition to the plane matching type diagnosis function and the deterioration location visualization function described above, the present diagnostic system can provide various additional functions as modifications, which can be used in combination. The additional functions include a route setting method, a camera adjustment method, a stepwise association method, and a partial SFM processing method, which will be described later. The user can enable each additional function through the user settings of the diagnostic system. These functions can further shorten the calculation time and improve the accuracy of the basic diagnostic processing.
[0038]
 [Aircraft Utilization Deterioration Diagnosis System (1)]
 FIG. 1 shows an overall configuration of a diagnostic system which is a flight vehicle utilization deterioration diagnosis system of an embodiment. The diagnostic system includes an aircraft that is the drone 1 and a computer system 100, which are connected by wireless communication. The computer system 100 includes, for example, a PC 2 and a server 3, which are connected via a communication network. The PC 2 is a drone control device and a client terminal device used by each user (diagnostic person). The server 3 is, for example, a server device in a cloud computing system or a data center of a business operator, and is a calculation processing device that cooperates with the PC 2.
[0039]
 The object 5 is the structure to be diagnosed and the subject of the camera 4. The object 5 is, for example, a building or infrastructure equipment. Buildings include general buildings, houses, public buildings, and the like. Examples of infrastructure equipment include electric power equipment (equipment for thermal, wind, or hydroelectric power generation, etc.), road transportation equipment, communication equipment, bridges, and the like. Predetermined areas on the surface of the object 5 serve as the diagnosis target area and the imaging target area. The navigation route of the drone 1, the date and time, the shooting setting information of the camera 4, and the like are set in advance so that the predetermined area can be aerially photographed.
[0040]
 The drone 1 is a flying body that performs autonomous navigation based on remote control by wireless communication from the PC 2. As a modified example, a mode in which the user operates the navigation of the drone 1 from the PC 2 is also possible. The drone 1 autonomously travels on a set route in the space around a predetermined object. The drone 1 is equipped with the camera 4 and various sensors. While navigating the route, the drone 1 takes aerial images of the object 5 with the camera 4. The drone 1 transmits the captured image data, the sensor data at the time of capture, and the like to the PC 2 by wireless communication.
[0041]
 The known sensor group of the drone 1 can detect the position, azimuth (direction), speed, acceleration, etc. of the drone 1 and the camera 4 as sensor data. The position includes three-dimensional coordinates (X, Y, Z). The position is obtained as latitude, longitude, and altitude (height from the ground surface) based on, for example, GPS, an altitude sensor, or another positioning system. When GPS is used, sufficient positioning accuracy is assumed.
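 Where the local three-dimensional coordinates (X, Y, Z) around the object are needed, the GPS readings must be mapped into that coordinate system. The following is a minimal sketch under an equirectangular approximation around a reference point, which is adequate over the few hundred metres around a single structure; the approximation is an assumption of this example, not the specification's method.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def gps_to_local_xyz(lat, lon, alt, ref_lat, ref_lon, ref_alt=0.0):
    """(X east, Y north, Z up) in metres relative to a reference point."""
    x = math.radians(lon - ref_lon) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    z = alt - ref_alt
    return x, y, z
```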
[0042]
 The camera 4 includes shooting direction, shooting timing, shooting conditions (camera parameters), and the like as shooting setting information. The shooting direction is a direction toward the shooting location of the object 5 with the positions of the drone 1 and the camera 4 as a reference. The shooting timing is a timing at which a plurality of continuous images (still images) are picked up. The shooting conditions are defined by known camera parameter settings such as the focal length and angle of view of the lens.
[0043]
 The PC 2 performs navigation control of the drone 1 and shooting control of the camera 4 by wireless communication. The PC 2 transmits known navigation control parameters, shooting setting information, etc. to the drone 1. A diagnostician who is a user operates the PC 2 to use the diagnostic system. On the screen of the PC 2, the user can input instructions to the diagnostic system and make user settings, and can confirm the setting status, diagnostic result information, and the like. Note that a plurality of PCs 2 of a plurality of users may be similarly connected to the server 3.
[0044]
 The PC 2 has a drone control function 21, a diagnostic client program 22, a storage unit 23, and the like. The drone control function 21 is a known function for controlling the navigation of the drone 1 and the shooting of the camera 4. The diagnosis client program 22 is a client program of the air vehicle use deterioration diagnosis and deterioration part visualization software 200. The diagnostic client program 22 of the PC 2 cooperates with the diagnostic server program 32 of the server 3 in client-server communication to perform processing. The diagnostic client program 22 controls the drone control function 21. The diagnosis client program 22 is particularly in charge of cooperation with the drone 1 and screen display processing.
[0045]
 The storage unit 23 of the PC 2 stores various data / information used by the diagnostic client program 22 and the like for processing. The storage unit 23 stores captured image data acquired from the drone 1, sensor data, shooting setting information set in the drone 1, and the like. Each data acquired from the server 3 is also stored in the storage unit 23.
[0046]
 The server 3 has a diagnostic server program 32, a DB 33, and the like. The diagnosis server program 32 is a server program of the flight vehicle utilization deterioration diagnosis and deterioration portion visualization software 200. The diagnostic server program 32 is particularly in charge of processing with a high calculation processing load such as diagnostic processing. The diagnostic server program 32 executes predetermined processing in response to a request from the diagnostic client program 22, and responds with processing result information.
[0047]
 The DB 33 of the server 3 stores various data used by the diagnostic server program 32 and the diagnostic client program 22 for processing. The DB 33 may be realized by a DB server or the like. In the DB 33, object data, diagnosis result information, and the like are stored in addition to each data acquired from the PC 2. The target data is data including basic information about the target 5, three-dimensional target model data, and the like. The object three-dimensional model data is data in any format, and for example, data created by an existing CAD system can be used. Alternatively, as the object three-dimensional model data, data obtained as a result of reconstructing a three-dimensional structure using a known SFM process based on an aerial image may be used. The diagnosis result information is the result of the diagnosis processing of the flight object deterioration diagnosis and deterioration part visualization software 200, and includes an image including the deterioration part, information in which the deterioration part is located on the three-dimensional object model, and the like.
[0048]
 The flight vehicle utilization deterioration diagnosis and deterioration part visualization software 200 realizes each function including a deterioration diagnosis function and a deterioration part visualization function. The deterioration diagnosis function is a function of performing a diagnosis process based on the correspondence and comparison between the current and past aerial images to detect a deteriorated portion of the object 5. In this diagnostic processing, the plane matching method is used in particular. The deterioration point visualization function is a function of providing a screen for visualizing the deterioration point in the image detected by the deterioration diagnosis function so as to be positioned on the three-dimensional object model.
[0049]
 The implementation of the computer system 100 is not limited to the above configuration; various configurations are possible. For example, the PC 2 and the server 3 may be integrated into one device, or separated into a plurality of devices by function. The drone control device and the PC 2 may also be separated. A known drone or UAV can be applied as the flying body, but a dedicated flying body with specific additional functions for the diagnostic system may also be used.
[0050]
 [Aircraft Utilization Deterioration Diagnosis System (2)]
 FIG. 2 shows a schematic functional block configuration of the drone 1 and the computer system 100 of the present diagnostic system.
[0051]
 The drone 1 has a propeller drive unit 11, a navigation control unit 12, a sensor 13, a gimbal 14, the camera 4, an image storage unit 15, a wireless communication unit 16, a battery 17, and the like. The propeller drive unit 11 drives a plurality of propellers. The navigation control unit 12 controls the navigation of the drone 1 according to the navigation control information from the navigation control unit 102 of the PC 2; for this purpose, the navigation control unit 12 drives and controls the propeller drive unit 11 while using the detection information of the sensor 13. The sensor 13 is a group of sensors such as a known GPS receiver, electronic compass, gyro sensor, and acceleration sensor, and outputs predetermined sensor data. The gimbal 14 is a known mechanism that holds the camera 4 and automatically keeps the camera 4 steady, without shaking, during navigation. The camera 4 takes aerial images of the object 5 according to the shooting control information and shooting setting information from the shooting control unit 104 of the PC 2, and outputs the captured image data. The image storage unit 15 stores the captured image data and the like. The wireless communication unit 16 includes a wireless communication interface device and performs wireless communication with the computer system 100 using a predetermined wireless communication interface. The battery 17 supplies electric power to each unit.
[0052]
 The computer system 100 includes a GUI unit 101, a navigation control unit 102, a shooting control unit 104, a storage unit 105, a wireless communication unit 106, a diagnosis unit 107, and a visualization unit 108.
[0053]
 The GUI unit 101 forms a screen serving as a GUI (Graphical User Interface) for the user and displays it on the display. The user can make user settings and input instructions on the screen, and can check the setting status, diagnostic result information, and the like. As the user setting, it is possible to set whether or not to use each function provided by the diagnostic system, and to set a control threshold value for each function. Further, as the route setting, a basic route of the drone 1 (including the departure point), a schedule including date and time of aerial photography and diagnosis, and the like can be set. Also, as the shooting settings, basic camera parameters of the camera 4 and the like can be set.
[0054]
 The navigation control unit 102 controls the navigation of the drone 1 based on the setting of the route and the like. The navigation control unit 102 transmits navigation control information to the drone 1 through wireless communication and receives sensor data or the like indicating the navigation state from the drone 1.
[0055]
 The shooting control unit 104 controls shooting by the camera 4 based on the shooting setting information. The shooting control unit 104 transmits shooting control information based on the shooting setting information to the drone 1 via wireless communication, and receives shot image data and the like from the drone 1.
[0056]
 The storage unit 105 includes a diagnostic data storage unit 105A and a reference data storage unit 105B. The diagnostic data storage unit 105A stores a diagnostic image group at the time of the current aerial shooting. The reference data storage unit 105B stores a reference image group at the time of past aerial shooting. Each image data in the storage unit 105 is managed under the condition that the date and time at the time of shooting, sensor data, shooting setting information and the like are associated with each other. In addition, the storage unit 105 also stores user setting information, object data in FIG. 1, diagnosis result information, and the like.
[0057]
 The wireless communication unit 106 includes a wireless communication interface device, and performs wireless communication with the drone 1 using a predetermined wireless communication interface.
[0058]
 The diagnosis unit 107 receives the present diagnostic data and the past reference data, performs diagnostic processing, and outputs diagnostic result information. The diagnosis unit 107 includes a matching unit 107A, a comparison unit 107B, a conversion unit 107C, and an SFM processing unit 107D. The matching unit 107A performs matching processing on the diagnostic image group of the diagnostic data and the reference image group of the reference data, in particular the matching processing of the plane matching method. The comparison unit 107B detects locations (deterioration locations) in a state of deterioration, abnormality, change, etc. by comparing the past image and the current image of the associated comparison target images and determining the difference; the comparison unit 107B in particular performs the comparison processing of the plane matching method. Further, the comparison unit 107B determines the type and degree of deterioration.
[0059]
 As the processing for positioning the deteriorated portion detected from the image on the three-dimensional object model, the conversion unit 107C performs conversion processing that converts the two-dimensional coordinate information representing the deteriorated portion into three-dimensional coordinate information on the three-dimensional object model. The conversion unit 107C uses the SFM processing of the SFM processing unit 107D when performing the conversion.
[0060]
 The SFM processing unit 107D performs known SFM processing. The SFM processing unit 107D applies SFM processing to a plurality of input images, restores the three-dimensional structure, and outputs the result information. In the SFM processing, based on the two-dimensional coordinates of feature points in two or more consecutive images, the three-dimensional structure of the surface of the object 5 (represented by the three-dimensional coordinates of a plurality of feature points) and the viewpoint positions (positions of the camera 4) are restored.
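 A minimal sketch of the two-view core of such an SFM step, assuming the camera intrinsic matrix K is known from the shooting conditions and that matched feature points between two consecutive images are available; a full SFM pipeline extends this over many images with bundle adjustment.

```python
import cv2
import numpy as np

def recover_relative_pose(pts1, pts2, K):
    """pts1, pts2: (N, 2) arrays of matched pixel coordinates in two images."""
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    # Projection matrices usable for triangulating surface feature points.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    return P1, P2
```

 The returned P1 and P2 are perspective projection matrices of the kind consumed by the to_3d triangulation sketch shown earlier.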
[0061]
 The visualization unit 108 performs a process of visualizing a deteriorated portion or the like on the screen based on the diagnosis result information of the diagnosis unit 107. The visualization unit 108 provides a screen in which the deteriorated position is located on the three-dimensional model of the object. The visualization unit 108 displays an image and information of a deteriorated portion according to a user's input operation on the screen.
[0062]
 [Aircraft Utilization Deterioration Diagnosis System (3)]
 FIG. 3 shows a configuration example of the hardware, programs, and data in the computer system 100 of the present diagnostic system. The computer system 100 includes an arithmetic unit 111, an input unit 112, a display unit 113, a wireless communication unit 106, a program storage unit 114, a data storage unit 115, and the like, which are connected by a bus or the like.
[0063]
 The arithmetic unit 111 includes a CPU, a ROM, a RAM, and the like, and realizes each processing unit, such as the diagnosis unit 107, by executing processing according to the programs read from the program storage unit 114. The input unit 112 includes input devices such as a keyboard and a mouse and receives input from the user. The display unit 113 displays screens for the user. Another output device such as a printer may also be provided.
[0064]
 The program storage unit 114 is composed of a non-volatile memory or the like, and stores a program for realizing the function of the diagnostic system. The programs include a two-dimensional image deterioration diagnosis program 401, a two-dimensional image deterioration place three-dimensional positioning program 402, a three-dimensional model creation program 403, a route setting program 404, a camera adjustment program 405, and the like.
[0065]
 The two-dimensional image deterioration diagnosis program 401 realizes the diagnostic processing (the matching unit 107A and comparison unit 107B in FIG. 2), including the association and comparison of past and present image groups, and includes a program that realizes the plane matching processing.
[0066]
 The two-dimensional image deterioration location three-dimensional positioning program 402 realizes the coordinate conversion processing and the like (the conversion unit 107C in FIG. 2) that positions the deterioration locations detected from the images on the three-dimensional object model. This program includes processing that selects which two-dimensional coordinates to use when converting the two-dimensional coordinates representing a deteriorated portion into three-dimensional coordinates on the object.
[0067]
 The three-dimensional model creation program 403 is a program that realizes SFM processing and the like (SFM processing unit 107D in FIG. 2) that restores a three-dimensional structure on a three-dimensional object model from a continuous image. An external program may be used as the three-dimensional model creation program 403 (SFM processing unit 107D). For example, an external server for SFM processing may exist separately from the server 3, and the server 3 may communicate with the external server and cooperate with each other to realize the function.
[0068]
 The route setting program 404 is a program that realizes processing for setting the route of the drone 1 and the like. The method of setting the basic route is not particularly limited: the route may be set based on data obtained by actually operating the drone 1, or it may be set manually on the setting screen based on the three-dimensional model of the object, without operating the drone 1.
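 For illustration, a minimal sketch of generating a serpentine route in front of a rectangular wall face from its dimensions; the stand-off distance and row spacing are illustrative assumptions, not values from the specification.

```python
def wall_route(width_m, height_m, standoff_m=5.0, row_spacing_m=3.0):
    """Yield (x, y, z) waypoints sweeping a wall left-right, bottom-up.

    The wall lies in the x-z plane; y is the stand-off distance from it.
    """
    z = 0.0
    left_to_right = True
    while z <= height_m:
        xs = (0.0, width_m) if left_to_right else (width_m, 0.0)
        for x in xs:
            yield (x, standoff_m, z)
        left_to_right = not left_to_right
        z += row_spacing_m
```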
[0069]
 The camera adjustment program 405 is a program that realizes a process of setting and adjusting shooting setting information including camera parameters of the camera 4.
[0070]
 The computer system 100 acquires the captured image data 151 and the sensor data 152 (position, azimuth, speed, acceleration, etc.) from the drone 1 based on wireless communication and stores them in the data storage unit 115.
[0071]
 The data storage unit 115 can be configured by a buffer memory, a storage device, a DB server, etc., and stores various data/information used for processing. The storage units and DBs for the various data/information may be configured by separate storage devices, DB servers, and the like. The data storage unit 115 includes the diagnostic data storage unit 105A, the reference data storage unit 105B, a three-dimensional model storage unit 105C, and a diagnostic result information storage unit 105D. The diagnostic data storage unit 105A includes a diagnostic image DB 161 and a diagnostic sensor data DB 162. In the diagnostic image DB 161, the image group to be diagnosed (the diagnostic image group) is stored in time-series order. In the diagnostic sensor data DB 162, the sensor data group related to the diagnostic image group is stored in time-series order. The reference data storage unit 105B includes a reference image DB 171 and a reference sensor data DB 172. In the reference image DB 171, the image group referred to as the comparison target (the reference image group) is stored in time-series order. In the reference sensor data DB 172, the sensor data group related to the reference image group is stored in time-series order.
[0072]
 The latest diagnostic data is stored in the diagnostic data storage unit 105A when the diagnostic data of the latest date and time is generated. The diagnostic data of earlier dates and times already stored at that point is sequentially moved to the reference data storage unit 105B and becomes reference data.
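 A minimal sketch of that rotation, modelling the two storage units as dicts keyed by shooting date; the data layout is an assumption for illustration.

```python
def store_new_diagnosis(diagnostic_store, reference_store, run_date, new_data):
    """Demote earlier diagnostic runs to reference data, then store the new run."""
    for date in list(diagnostic_store):
        reference_store[date] = diagnostic_store.pop(date)  # becomes reference data
    diagnostic_store[run_date] = new_data
```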
[0073]
 The object 3D model data 181, the restored 3D structure data 182, and the like are stored in the 3D model storage unit 105C. The object 3D model data 181 is current and past 3D model data related to the object 5, and data such as CAD can be used. The restored three-dimensional structure data 182 is three-dimensional structure data restored by the SFM process.
[0074]
 The diagnostic result information storage unit 105D stores diagnostic result information. The diagnosis result information includes two-dimensional coordinate information indicating the detected deterioration point and the corresponding aerial image, and three-dimensional coordinate information indicating the deterioration point positioned on the three-dimensional object model.
[0075]
 At the time of diagnosis, the computer system 100 (the PC 2 or the server 3) temporarily stores the diagnostic data acquired from the drone 1, such as the diagnostic image group and sensor data, in the diagnostic data storage unit 105A. The computer system 100 reads the necessary amount of diagnostic data from the diagnostic data storage unit 105A into a processing memory (a memory of the arithmetic unit 111 or another memory) as appropriate. The computer system 100 also reads from the reference data storage unit 105B an appropriate amount of reference data, such as the reference image group and sensor data, regarding the same object 5 as the diagnostic data into the memory. In addition, the computer system 100 reads the object three-dimensional model data 181 from the three-dimensional model storage unit 105C into the memory. The computer system 100 performs the diagnostic processing and the like using the data read into the memory.
[0076]
 During the diagnostic processing, the computer system 100 restores the three-dimensional structure of the target object including the deteriorated portion by the conversion processing using the SFM processing based on the diagnostic image group, and stores it as the restored three-dimensional structure data 182. The computer system 100 uses the restored three-dimensional structure data 182 to position the deteriorated portion on the target three-dimensional model.
[0077]
 The configuration example of the sharing of processing between the PC 2 and the server 3 is as follows. The PC 2 acquires the captured image data 151 and the sensor data 152 from the drone 1, and sends a processing request to the server 3 together with these data. In response to the processing request, the server 3 performs the diagnostic processing of the diagnostic unit 107 of FIG. 2, detects the deteriorated portion, and performs conversion to position the deteriorated portion on the three-dimensional object model. The visualization unit 108 generates screen data of a screen that visualizes a deteriorated portion on the target three-dimensional model. The server 3 transmits screen data including such diagnosis result information to the PC 2. The PC 2 displays a screen based on the screen data.
[0078]
 [Aircraft Utilization Deterioration Diagnosis System (4)]
 FIG. 4 shows, as an outline of the embodiment, aerial photography of the surroundings of the object 5 and the processing for associating and comparing past and present aerial images.
[0079]
 FIG. 4A shows the route around the object 5 and continuous images 401 at the time of a past aerial shoot (for example, January 2016). In this example, the side wall surface of a building or the like serves as the diagnosis target area of the object 5. The drone 1 takes aerial images of the side wall surface from above the object 5, looking obliquely downward. The drone 1 navigates a route in a feasible space that satisfies predetermined rules. Generally, the wall surface of a building has a predetermined structure and wall surface design (for example, a design corresponding to windows and pillars). Each point on the route indicates the position of the drone 1 and the camera 4 and an image capturing time; the positions of the drone 1 and the camera 4 are treated as substantially the same. The one-dot chain line arrow from each point indicates the shooting direction of the camera 4, and the point at the tip of the arrow indicates the shooting location, i.e., the center point of the image. By this aerial photography, the continuous images 401 are obtained from the drone 1 and stored as the reference image group of the reference data.
[0080]
 Similarly, FIG. 4B shows the route and continuous images 402 for the same object 5 at the time of the current aerial shoot (for example, January 2017). The set route is the same in (A) and (B), but the actual trajectory of the drone 1 sways and deviates from the set route. An example is shown in which a deteriorated portion 403 has occurred on part of the side wall surface of the object 5. By this aerial photography, the continuous images 402 are obtained from the drone 1 and stored as the diagnostic image group of the diagnostic data. Since the wind and light conditions differ depending on the date and time of aerial photography, the content of each set of continuous images also differs.
[0081]
 The computer (PC 2 and server 3) of the computer system 100 detects the deteriorated portion by inputting the diagnostic image group of the current diagnostic data and the reference image group of the past reference data and associating and comparing them. The computer then uses the plane matching method to associate and compare the planes in each image. This facilitates and improves the processing of association and comparison. The computer can detect, for example, the deterioration point 403 as a result of the diagnosis process. The computer provides a screen that visualizes the detected deterioration point 403 so as to be located on the three-dimensional model of the object 5.
[0082]
 Further, in the present diagnostic system, a route setting method, a camera adjustment method, a stepwise association method, a partial SFM processing method, etc., which will be described later, can be used together as additional functions. The efficiency can be further improved by using these additional functions together.
[0083]
 [DB data / information]
 Various data / information such as reference data and diagnostic data acquired and stored in the DB 33 of the computer system 100 include the following.
[0084]
 (A) Shooting time information: The date and time of aerial photography and the time of each imaging point in the time series, for example, year, month, day, hour, minute, and second.
[0085]
 (B) Position information: Position information of the drone 1 and the camera 4, for example, latitude, longitude, and altitude (height from the ground surface) measured by GPS. The position can be represented by three-dimensional coordinates (X, Y, Z) in the three-dimensional space including the three-dimensional object model. A separate sensor (height sensor) may be used for the height position (Z), and a positioning system other than GPS may be used.
[0086]
 (C) Camera shooting direction: The shooting direction of the camera 4, that is, the direction from the position of the drone 1 and the camera 4 to the shooting location. The camera shooting direction can be controlled through the gimbal 14 of the drone 1.
[0087]
 (D) Camera parameters (shooting conditions): Various parameters that can be set as basic functions of the camera 4, for example, setting values for the aperture, lens, shutter, and flash, as well as the focal length, angle of view, and the like.
[0088]
 (E) Deterioration location information and estimated deterioration possibility: The deterioration location information includes two-dimensional coordinate information of the deterioration location in the image and three-dimensional coordinate information on the three-dimensional object model. The deterioration location information may be one or more feature points (change points) representing the deterioration location, or a two-dimensional region formed by a collection of such points. It also includes an estimated deterioration possibility: in the diagnostic processing, the possibility that deterioration has occurred at the three-dimensional coordinates of the deterioration location is estimated and expressed as a probability value. The deterioration location information may further include information such as the deterioration type and the deterioration degree.
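 As a minimal sketch, the items (A) to (E) above could be held in records such as the following; all class and field names here are illustrative assumptions, not identifiers from the specification.

from dataclasses import dataclass
from datetime import datetime
from typing import Dict, Optional, Tuple

@dataclass
class ShotRecord:
    captured_at: datetime                   # (A) year, month, day, hour, minute, second
    position: Tuple[float, float, float]    # (B) (X, Y, Z) from GPS / height sensor
    direction: Tuple[float, float, float]   # (C) camera shooting direction
    camera_params: Dict[str, float]         # (D) aperture, shutter, focal length, ...
    image_file: str                         # the aerial image itself

@dataclass
class DeteriorationRecord:
    image_xy: Tuple[float, float]                    # (E) 2-D coordinates in the image
    model_xyz: Optional[Tuple[float, float, float]]  # (E) 3-D coordinates on the model
    probability: float                               # estimated deterioration possibility
    kind: str = ""                                   # deterioration type (crack, rust, ...)
    degree: int = 0                                  # deterioration degree (level)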
[0089]
 [Processing Flow]
 FIG. 5 shows a basic processing flow of the computers (PC 2 and server 3) of the computer system 100 of the present diagnostic system. FIG. 5 has steps S1 to S9, which are described below in order. Note that this flow corresponds to a method of performing the diagnostic processing in real time along with the navigation and aerial photography of the drone 1. The present invention is not limited to this; a method may also be used in which the drone 1 first completes its navigation and aerial photography, and the captured image data and the like are then acquired for diagnostic processing.
[0090]
 (S1) The computer displays a setting screen and performs basic settings based on the user's input operations. As basic settings, the diagnosis target area of the object 5 (three-dimensional object model data), the date and time of diagnosis, the route, the imaging setting information, and the like are set.
[0091]
 (S2) At the date and time of diagnosis, the drone 1 is caused to navigate autonomously along the route under the control of the PC 2, and the area of the object 5 is aerially photographed by the camera 4. The drone 1 transmits the captured image data and the sensor data at that time to the PC 2, which acquires those data as diagnostic data.
[0092]
 (S3) The diagnostic unit 107 of FIG. 2 inputs the diagnostic data and refers to the reference data for the same object 5. Specifically, for example, the PC 2 transmits the diagnostic data (diagnostic image group and diagnostic sensor data) to the server 3. The server 3 inputs the diagnostic data and reads out the past reference data (reference image group and reference sensor data) for the same object 5 from the DB 33. At this time, the reference data whose diagnosis date and time is a predetermined time before the current diagnosis date and time is referred to as a candidate. The predetermined time is set in advance together with the schedule of diagnosis dates and times according to the object 5 and the like; for example, one year, half a year, one month, or one week before can be set arbitrarily.
[0093]
 (S4) The diagnostic unit 107 performs diagnostic processing on the input diagnostic data and reference data to determine and detect deteriorated portions. The matching unit 107A performs matching processing on the diagnostic image group and the reference image group by the plane matching method to obtain comparison target images; at that time, it detects planes in the images and performs matching using the detected planes. The comparison unit 107B compares the image contents of the past image and the current image in the comparison target images to determine and detect a deteriorated portion; at that time, it compares the planes detected in the images. The comparison unit 107B may also determine the deterioration type, the deterioration degree, and the like.
[0094]
 (S5) When a deteriorated portion is detected in a two-dimensional image of the diagnostic data, the diagnostic unit 107 stores the two-dimensional coordinate information representing the detected deteriorated portion as part of the diagnosis result information.
[0095]
 (S6) The diagnostic unit 107 performs processing for positioning the detected deteriorated portion on the three-dimensional object model. For that purpose, the conversion unit 107C performs coordinate conversion processing for converting the two-dimensional coordinate information representing the deteriorated portion into three-dimensional coordinate information on the three-dimensional object model. At that time, the diagnostic unit 107 causes the SFM processing unit 107D to perform SFM processing: the SFM processing unit 107D performs SFM processing on the continuous images in advance to restore the three-dimensional structure and obtain the perspective transformation matrix P. Based on the perspective transformation matrix P, the conversion unit 107C converts the deteriorated portion contained in at least two consecutive images of the diagnostic image group into the three-dimensional coordinates representing its position.
[0096]
 (S7) The diagnostic unit 107 stores the three-dimensional coordinate information of the deteriorated portion obtained in S6, the restored three-dimensional structure data 182, and the like as a part of the diagnostic result information.
[0097]
 (S8) The diagnostic unit 107 confirms whether the input of the diagnostic image group has been completed (whether there are further input images). If not completed (N), the process returns to S2 and is repeated in the same manner.
[0098]
 (S9) The visualization unit 108 performs deterioration location visualization processing. Using the diagnosis result information, it constructs screen data (for example, Web page data) in which the deteriorated portion is located on the three-dimensional object model. The server 3 transmits the screen data to the PC 2, and the PC 2 displays a screen on its display based on the screen data. The user can confirm the deteriorated portion on the three-dimensional model of the object on the screen.
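 A condensed sketch of the S1 to S9 loop follows. Every callable here is an injected stand-in for the units described above (reference lookup, plane matching, comparison, 2-D to 3-D conversion, visualization); none of these names comes from the specification.

def run_diagnosis(diagnostic_stream, lookup, match_planes, compare_planes,
                  to_model, build_screen):
    """Sketch of the real-time flow; diagnostic_stream yields (images, sensors) per S2."""
    results = []
    for diag_images, sensor_data in diagnostic_stream:       # S2 (repeated via S8)
        ref_images = lookup(diag_images)                     # S3: past reference data
        for current, past in match_planes(diag_images, ref_images):   # S4
            for xy in compare_planes(current, past):         # S4-S5: 2-D detection
                results.append((xy, to_model(xy, current)))  # S6-S7: 2-D -> 3-D, store
    return build_screen(results)                             # S9: screen data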
[0099]
 [Diagnostic Process-Plane Matching Method (1)]
 FIG. 6 shows a process flow including the plane matching process in the diagnostic process (S4) of the diagnostic unit 107. FIG. 6 has steps S41 to S43.
[0100]
 (S41) The diagnostic unit 107 detects flat surface portions in each image of the input diagnostic image group and reference image group (FIG. 7 described later). This plane detection processing can be realized based on processing that detects feature points or edge lines in the image: a continuous region in which the pixels have substantially the same color can be detected as a plane (plane portion).
[0101]
 (S42) The matching unit 107A of the diagnostic unit 107 uses the detected plane portions to associate the diagnostic image group with the reference image group and obtain comparison target images. At this time, for example, the planes of the images can be compared with each other, and images having substantially the same planes can be associated with each other.
[0102]
 (S43) The comparison unit 107B of the diagnostic unit 107 compares, at the plane level, the planes of the past image with the planes of the current image in the associated comparison target images. If an image contains multiple planes, a comparison is made for each plane. The diagnostic unit 107 detects the deteriorated portion by determining and extracting the difference between the past and the present through this plane comparison, and thereby obtains the two-dimensional coordinate information representing the deteriorated portion in the image. The two-dimensional coordinate information may be information on one representative feature point, or information consisting of a plurality of feature points according to the shape and size of the region including the deteriorated portion.
[0103]
 Since the above association and comparison are performed at the plane level rather than the image level, the processing is relatively easy, the efficiency of the diagnostic processing can be improved, and erroneous detection can be reduced.
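 A minimal OpenCV sketch of one way to realize the plane detection of S41 (uniform-color regions bounded by edge lines) is given below; the Canny thresholds, the dilation, and the area cutoff are illustrative choices, not the specified method.

import cv2
import numpy as np

def detect_planes(image_bgr, min_area=500):
    """Return contours of roughly uniform-color regions (candidate plane portions)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                      # edge lines separating planes
    edges = cv2.dilate(edges, np.ones((3, 3), np.uint8))  # close small gaps in the edges
    regions = cv2.bitwise_not(edges)                      # interiors between edge lines
    contours, _ = cv2.findContours(regions, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) >= min_area]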
[0104]
 Further, when the plane matching method is used, the subsequent coordinate conversion processing in the conversion unit 107C is also performed at the plane level using the per-plane information.
[0105]
 [Diagnosis Processing-Plane Matching Method (2)]
 FIG. 7 shows the plane matching method in the above diagnostic processing. Examples of reference images and diagnostic images are shown on the upper side of FIG. 7. In the first example, the diagnostic image 601 and the reference image 602 are images of the same location (example: the side wall surface of the building in FIG. 4) of the same object 5. The first example is a case where the predetermined wall surface design includes a plurality of plane regions separated by lines. Since the situations at the times of shooting differ, the contents of the two images differ. In this example, the diagnostic image 601 contains a deteriorated portion 701 such as a crack.
[0106]
 Similarly, in the second example, the diagnostic image 603 and the reference image 604 are images obtained by aerial photography of the same location (example: a three-dimensional structure protruding from the side wall surface) of the same object 5. In this example, the diagnostic image 603 has a deteriorated portion 702 such as a crack.
[0107]
 In the diagnostic system of the comparative example, such images are associated and compared at the image level. Each image includes planes (plane portions) having different colors depending on the structure and design of the surface of the object 5, and the plane matching method detects and uses such planes. The diagnostic unit 107 detects feature points or edge lines in the image and detects regions having substantially the same color as planes. In this example, three plane portions (planes p11 to p13) are detected from a certain diagnostic image 603, and three plane portions (planes p21 to p23) are detected from a certain reference image 604.
[0108]
 The diagnostic unit 107 uses the planes detected in the images to associate and compare the images (plane-level matching). The diagnostic unit 107 first uses the planes to associate one or more past images with one or more current images as comparison target images. Next, the diagnostic unit 107 compares the planes between the images of the comparison target images; for example, the planes p11 to p13 of the diagnostic image 603 are compared with the planes p21 to p23 of the reference image 604. The diagnostic unit 107 estimates, for example, that the plane p11 and the plane p21 correspond based on the positional relationship and shape similarity of the planes, and associates them as comparison target planes. Similarly, the plane p12 is associated with the plane p22, and the plane p13 with the plane p23. Then, the diagnostic unit 107 detects the deteriorated portion by comparing each pair of comparison target planes and determining the difference between them. For example, the deteriorated portion 702 can be detected by comparing the plane p12 and the plane p22. When a deteriorated portion extends over a plurality of planes, it can be detected as separate deteriorated portions from the respective comparison target planes and then integrated into one. The diagnostic unit 107 obtains the two-dimensional coordinates representing the deteriorated portion 702 in the image.
[0109]
 Plane-level matching requires easier image analysis processing and has a lower processing load than conventional image-level matching, so the diagnostic processing can be made more efficient.
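 The following sketch pairs detected planes by positional relationship and shape similarity, as described above; the greedy scoring and its weight are illustrative assumptions.

import cv2

def match_planes(planes_cur, planes_ref, shape_weight=100.0):
    """Greedily pair plane contours, e.g. p12 with p22, by centroid distance and shape."""
    def centroid(c):
        m = cv2.moments(c)
        return m["m10"] / m["m00"], m["m01"] / m["m00"]
    pairs, used = [], set()
    for pc in planes_cur:
        cx, cy = centroid(pc)
        best, best_score = None, float("inf")
        for j, pr in enumerate(planes_ref):
            if j in used:
                continue
            rx, ry = centroid(pr)
            dist = ((cx - rx) ** 2 + (cy - ry) ** 2) ** 0.5            # position
            shape = cv2.matchShapes(pc, pr, cv2.CONTOURS_MATCH_I1, 0)  # shape similarity
            score = dist + shape_weight * shape
            if score < best_score:
                best, best_score = j, score
        if best is not None:
            used.add(best)
            pairs.append((pc, planes_ref[best]))
    return pairs  # comparison target planes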
[0110]
 [Degradation Location Detection Processing]
 The diagnostic unit 107 may perform the following processing when detecting a degradation location from an image. The diagnostic unit 107 may determine the deterioration type and deterioration degree of the deteriorated portion 702 by predetermined processing. For example, cracks, rust, corrosion, peeling, and the like are defined as deterioration types. When a crack is detected, for example, the position and area of the crack, the size of the crack region, the number of crack lines, their length, width, and the like are calculated, and the degree of crack deterioration is determined from these quantified values, for example, as one of several levels based on comparison with reference threshold values for cracking.
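 As a small sketch of this level determination, the quantified crack values could be mapped to a degree as follows; the thresholds are invented placeholders, not values from the specification.

def crack_degree(num_lines, max_length_mm, max_width_mm):
    """Map quantified crack values to a coarse deterioration level (1 to 3)."""
    if max_width_mm >= 1.0 or max_length_mm >= 500 or num_lines >= 10:
        return 3  # severe: repair recommended
    if max_width_mm >= 0.3 or max_length_mm >= 200 or num_lines >= 3:
        return 2  # moderate: monitor
    return 1      # minor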
[0111]
 [Conversion Process]
 FIG. 8 shows a flow of the conversion process (S6) of the conversion unit 107C of the diagnosis unit 107. FIG. 8 has steps S61 to S63.
[0112]
 (S61) The conversion unit 107C inputs, from the diagnostic image group, at least two consecutive images in which the deteriorated portion has been detected (FIG. 9 described later). If at least two such images are not available, the conversion cannot be performed and this processing flow does not apply.
[0113]
 (S62) From the two-dimensional coordinates (x1, y1) indicating the deteriorated portion in the first image of the input continuous images and the two-dimensional coordinates (x2, y2) indicating the corresponding deteriorated portion in the second image, the conversion unit 107C obtains the corresponding three-dimensional coordinates (X1, Y1, Z1) by coordinate conversion using the known perspective transformation matrix P. The perspective transformation matrix P is obtained in advance by separate SFM processing. The three-dimensional coordinates obtained by the conversion represent the position of the deteriorated portion in the space including the three-dimensional object model. The same processing can be performed when there are three or more continuous images, in which case the conversion accuracy can be improved.
[0114]
 (S63) The conversion unit 107C positions the obtained three-dimensional coordinates as the position of the deterioration location on the three-dimensional object model. This positioning can be realized as an association process in information processing (FIG. 12 described later). In addition, the conversion unit 107C sets a deteriorated-portion image, in a predetermined color or style, for highlighting the deteriorated portion on the three-dimensional model of the object at the time of visualization.
[0115]
 [Coordinate Transformation]
 FIG. 9 shows, as an outline of the above conversion processing, the coordinate transformation for obtaining three-dimensional coordinate information from two continuous images including a deteriorated portion. The three-dimensional coordinate system of the space including the camera 4 is denoted by (X, Y, Z), O denotes the camera position, and the two-dimensional coordinate system in the image is denoted by (x, y).
[0116]
 A first image g1 and a second image g2 are given as two continuous two-dimensional images. The two-dimensional coordinate system of the first image g1 contains the feature point f1 (indicated by a black circle) corresponding to the deteriorated portion, with two-dimensional coordinates (x1, y1). Similarly, the second image g2 contains the feature point f2, with two-dimensional coordinates (x2, y2). For simplicity of description, a case where one feature point is included in each image is shown, but the same applies when a plurality of feature points are included. The conversion unit 107C associates the feature point f1 with the feature point f2.
[0117]
 The feature point corresponding to the deteriorated portion in the three-dimensional space including the object is defined as the feature point F1, with three-dimensional coordinates (X1, Y1, Z1). The conversion unit 107C performs coordinate conversion from the two images g1 and g2 using the perspective transformation matrix P, and thereby obtains the three-dimensional coordinates (X1, Y1, Z1) of the feature point F1 of the deteriorated portion in the three-dimensional space.
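 A minimal sketch of this conversion with OpenCV follows, assuming the per-image perspective (projection) matrices P1 and P2 obtained by the SFM processing are available as 3x4 arrays.

import cv2
import numpy as np

def to_3d(P1, P2, xy1, xy2):
    """Recover (X1, Y1, Z1) of F1 from f1 = (x1, y1) in g1 and f2 = (x2, y2) in g2."""
    pts1 = np.float64([[xy1[0]], [xy1[1]]])          # 2x1 column for image g1
    pts2 = np.float64([[xy2[0]], [xy2[1]]])          # 2x1 column for image g2
    Xh = cv2.triangulatePoints(P1, P2, pts1, pts2)   # homogeneous 4x1 result
    X1, Y1, Z1 = Xh[:3, 0] / Xh[3, 0]                # de-homogenize
    return float(X1), float(Y1), float(Z1)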
[0118]
 [Conversion processing-plane conversion method (1)]
 Efficiency can be improved by devising the way two-dimensional coordinates are selected in the above coordinate conversion (FIGS. 8 and 9). The diagnostic system of this modified example uses the following plane conversion method: the two-dimensional coordinates are selected using the plane information of the plane matching method described above. As a result, the conversion processing can be made more efficient.
[0119]
 FIG. 10 shows, as a modification, a processing flow when the plane conversion method is applied during the conversion processing. FIG. 10 includes steps S71 to S76.
[0120]
 (S71) The conversion unit 107C inputs at least two of the continuous images in which the deteriorated portion is detected.
[0121]
 (S72) The conversion unit 107C inputs information about each plane detected by the above-described plane detection in the input continuous image.
[0122]
 (S73) The conversion unit 107C associates the plane (first plane) in which the deteriorated portion (feature point) is detected in a certain diagnostic image (first image) with the corresponding plane (second plane) of the second image associated with the first image. For example, in FIG. 7, the plane p12 and the plane p22 correspond to each other as comparison target planes.
[0123]
 (S74) The conversion unit 107C calculates the plane conversion coefficient between the associated planes (first plane, second plane).
[0124]
 (S75) Using the plane conversion coefficient, the conversion unit 107C calculates, from the two-dimensional coordinates (x1, y1) of the deteriorated portion on the first plane of the first image, the corresponding two-dimensional coordinates (x2, y2) of the deteriorated portion on the associated second plane of the second image.
[0125]
 (S76) The conversion unit 107C calculates the three-dimensional coordinates (X1, Y1, Z1) of the deteriorated portion from the two-dimensional coordinates (x1, y1) and (x2, y2) using the perspective transformation matrix P.
[0126]
 [Conversion processing-plane conversion method (2)]
 FIG. 11 shows the coordinate conversion calculation in the plane conversion method. The figure shows the first plane p101 in the first image and the second plane p102 in the second image, both containing the deteriorated portion, in the continuous images. The first plane p101 and the second plane p102 are comparison target planes that are associated and compared by plane matching. The first plane p101 contains the first coordinates (x1, y1) of the feature point f1 representing the deteriorated portion, and the second plane p102 contains the second coordinates (x2, y2) of the corresponding feature point f2. The computer system 100 calculates a plane conversion coefficient C from the first plane p101 to the second plane p102, and calculates the second coordinates (x2, y2) of the feature point f2 from the first coordinates (x1, y1) of the feature point f1 by coordinate calculation using the plane conversion coefficient C.
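 A sketch of the plane conversion coefficient C realized as a homography between the associated planes follows, assuming corresponding corner points of p101 and p102 (four or more each) are known; treating C as a homography is an interpretation made by this sketch.

import cv2
import numpy as np

def transfer_point(corners_p101, corners_p102, xy1):
    """Compute C (p101 -> p102) and map f1 = (x1, y1) to f2 = (x2, y2)."""
    C, _ = cv2.findHomography(np.float32(corners_p101), np.float32(corners_p102))
    src = np.float32([[xy1]])                        # shape (1, 1, 2) as OpenCV expects
    x2, y2 = cv2.perspectiveTransform(src, C)[0, 0]
    return float(x2), float(y2)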
[0127]
 As described above, the coordinate conversion from the two-dimensional coordinates of the deteriorated portion to the three-dimensional coordinates requires two-dimensional coordinate information from a plurality of viewpoints, as shown in FIG. 9. To align the information of the plurality of viewpoints, association processing at the image level would normally be required, and errors in the correspondence between feature points may occur during that processing. Such errors affect the accuracy of positioning on the three-dimensional object model. Therefore, this modification uses the plane conversion method to calculate the two-dimensional coordinates of the deteriorated portion (feature point); with this method, feature point association at the image level is unnecessary, and the positioning accuracy on the three-dimensional object model can be improved.
[0128]
 [Deterioration Point Information]
 FIG. 12 shows an example of the data storage configuration for the deterioration location information of the diagnosis result information. The deterioration location information is stored in DB tables, with the same structure for past reference data and current diagnostic data. Although not shown, the data is managed in time series using information such as the date and time of imaging.
[0129]
 FIG. 12A shows the two-dimensional image data table, which stores the two-dimensional information obtained from the processing of the two-dimensional images. This table has as columns the image file, the image size, the deterioration point ID, and the deterioration point two-dimensional coordinates. The image file identifies each still image in the continuous images. The image size is represented by the numbers of vertical and horizontal pixels of the image. The deterioration point ID is an identifier of a deterioration point (a point with a high possibility of deterioration) detected in the image. The deterioration point two-dimensional coordinates are the two-dimensional coordinates (x, y) representing the position of the deterioration point in the image.
[0130]
 FIG. 12B shows the three-dimensional model data table, which stores the three-dimensional information obtained by the two-dimensional to three-dimensional positioning conversion described above. This table has as columns the deterioration point ID, the deterioration point three-dimensional coordinates (X, Y, Z), and the corresponding two-dimensional image file. The deterioration point ID is generated from the deterioration point ID in the two-dimensional image data table. The deterioration point three-dimensional coordinates (X, Y, Z) are the three-dimensional coordinates of the deterioration point positioned on the three-dimensional object model. The corresponding two-dimensional image file identifies the file of the two-dimensional image containing the deteriorated portion, and is associated with the image file column of the two-dimensional image data table.
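 For illustration, the two tables of FIG. 12 could be rendered in SQLite as follows; the table and column names are illustrative renderings of the columns described above, not identifiers from the specification.

import sqlite3

conn = sqlite3.connect("diagnosis.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS image_2d (   -- FIG. 12A: two-dimensional image data table
  image_file TEXT,                      -- still image in the continuous images
  width_px   INTEGER,                   -- image size (horizontal pixels)
  height_px  INTEGER,                   -- image size (vertical pixels)
  spot_id    TEXT,                      -- deterioration point ID
  x REAL, y REAL                        -- deterioration point 2-D coordinates
);
CREATE TABLE IF NOT EXISTS model_3d (   -- FIG. 12B: three-dimensional model data table
  spot_id    TEXT,                      -- generated from image_2d.spot_id
  X REAL, Y REAL, Z REAL,               -- 3-D coordinates on the object model
  image_file TEXT                       -- corresponding two-dimensional image file
);
""")
conn.close()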
[0131]
 [Degradation part visualization function]
 FIG. 13 shows the deteriorated-part visualization function, which locates a deteriorated part from two dimensions to three dimensions, and an example of its screen.
[0132]
 As an example of the diagnostic image 1301, FIG. 13A shows a case where a deteriorated portion 1303 is detected on a part of the three-dimensional structure portion 1302 on the side wall of the object 5, with two-dimensional coordinates (x1, y1). As described above, the conversion unit 107C of the computer system 100 performs conversion processing of the two-dimensional coordinates (x1, y1) of the deteriorated portion 1303 of the diagnostic image 1301 into the three-dimensional coordinates (X1, Y1, Z1) on the three-dimensional object model. Then, based on the conversion information, the visualization unit 108 generates a screen that locates and visualizes the deteriorated part on the three-dimensional model of the object.
[0133]
 FIG. 13B shows a configuration example of the visualization screen. On this screen, the three-dimensional object model 1311, viewed from a predetermined set viewpoint, is displayed in the background. The three-dimensional object model 1311 representing the object 5 is expressed as a three-dimensional point group. The viewpoint, the enlargement/reduction ratio, and the like can be changed by user operations. Around the three-dimensional object model 1311, basic information of the object 5, the route, the departure point, and the like may be displayed. The three-dimensional coordinates (X1, Y1, Z1) of the deteriorated portion 1313 are located on the surface of the three-dimensional object model 1311; when the deterioration location is represented as a two-dimensional area, it also has information such as its size. The deteriorated part is highlighted as an image by the conversion processing described above. On this screen, the user can thus easily confirm where in the entire object 5 the deteriorated portion is. For example, when the structure of the surface of the object 5 is complicated and there are many places with similar structures, it is difficult to recognize the deteriorated part from the image alone, but locating it on the three-dimensional model makes it easy to recognize.
[0134]
 FIG. 13C shows a configuration example of another visualization screen reached from the screen of FIG. 13B. When the user wants to check the details of a deteriorated part on the screen of (B), he or she selects the deteriorated part (for example, clicks or taps it). As a result, the visualization unit 108 constructs and displays the screen shown in (C). The screen of (B) may transition entirely to the screen of (C), or the screen of (C) may be superimposed on the screen of (B); it is also possible to return from (C) to (B) by a predetermined user operation. On the screen of (C), the deteriorated part is displayed enlarged, together with various information about it (diagnosis date and time, object name, deterioration type, deterioration degree, etc.). In addition, a two-dimensional image (a part of the diagnostic image group) including the deteriorated portion may be displayed in association with it based on a predetermined user operation. On the screens of (B) and (C), the deteriorated portion is highlighted; for example, the two-dimensional area including it may be highlighted with a frame colored according to the deterioration type and degree. On the screen of (C), current and past images of the deteriorated portion can also be compared and displayed based on a predetermined user operation (described below).
[0135]
 FIG. 14 shows an image comparison display screen as another visualization screen example. From the screens of FIGS. 13B and 13C, the user can designate a deteriorated portion by a predetermined operation and transition to the screen of FIG. 14. On this screen, the visualization unit 108 displays a past and present time-series image comparison for the deterioration location designated by the user. The screen displays GUI parts such as a bar 1401 for time-series selection, a current image 1402, and a past image 1403. First, the current image 1402 (the diagnostic image including the deteriorated portion) is displayed for the designated deteriorated portion; for example, the current date and time is January 2017. When a reference image of the corresponding reference data exists at a past date and time a predetermined time (for example, one year) before the current date and time, it is displayed as the past image 1403. When corresponding data exists, the user can arbitrarily change, via the bar 1401, the two past and present dates and times to be compared, and the images of the two selected dates and times are displayed as the current image 1402 and the past image 1403. Information such as the deterioration degree may be displayed together with each image.
[0136]
 As another screen display example, a plurality of images of three or more dates and times may be compared and displayed, or an animation may be displayed while switching a plurality of images in the same area.
[0137]
 As described above, with the deterioration visualization function, the user can confirm the details of a detected deteriorated portion on the screen by viewing the corresponding image content together with its position on the three-dimensional object model. The user can easily confirm the occurrence and progress of the deteriorated portion, such as changes in the deterioration degree, in time series from the past to the present. This can also contribute to the planning of inspection and repair work.
[0138]
 [Degradation Location Visualization Function - Variation Example]
 The following variations of the degradation location visualization function and screen are also possible. The visualization unit 108 first displays the three-dimensional model of the object on the screen as in FIG. 13B, without displaying the deteriorated portions (diagnosis result information). Based on knowledge of the structure of the object 5 and past deterioration records, the user selects a desired part on the three-dimensional model of the object on the screen. For example, when the object 5 is a bridge and the structure of the bridge makes certain places prone to deterioration, such a place is selected. When past reference data and current diagnostic data exist for the selected location (point or area), the visualization unit 108 refers to the diagnosis result information for that portion, and from it generates and displays a screen in which the deterioration location is located on the three-dimensional model of the object, a comparison display screen of past and present images, or the like.
[0139]
 As another modified example, the following may be adopted. The computer system 100 does not execute the diagnostic processing immediately after acquiring the aerial images and the like from the drone 1. The visualization unit 108 displays the three-dimensional object model on the screen, and the user selects a desired portion (point or area) to be the target of diagnostic processing. The computer system 100 confirms the presence of the corresponding past reference data and current diagnostic data for the designated location, reads them, and executes the diagnostic processing for the selected location. From the diagnosis result information, the visualization unit 108 generates and displays a screen in which the deteriorated portion is located on the three-dimensional model of the object, a comparison display screen of past and present images, and the like.
[0140]
 In this modified example, since the diagnostic processing is performed on only a part of the image data, the entire region of the object 5 cannot be diagnosed, but the processing time is correspondingly short.
[0141]
 [Effects, etc.]
 As described above, according to the flight-object-based deterioration diagnosis system of the embodiment, both the efficiency and the accuracy of diagnosis can be improved when a condition such as deterioration of an object is diagnosed by comparing past and present aerial images. This diagnosis system can support and streamline deterioration diagnosis work based on aerial images from a flying object, and can further automate the deterioration diagnosis. It can reduce the work of visually confirming images by a person, realizing deterioration diagnosis at low cost. Since a deteriorated portion can be visualized on the three-dimensional object model on a screen, a diagnostician can easily recognize it, and work such as inspection and repair becomes easier.
[0142]
 [Modification (1)]
 The following describes diagnosis systems according to modifications of the embodiment. As one modification, known machine learning may be applied to the diagnostic processing (S4 in FIG. 5, FIG. 6) in which the diagnostic unit 107 associates and compares the past and present image groups by the plane matching method (or the image-level matching method). A deep learning method may be applied as the machine learning method. For example, by performing the plane-level association and comparison processing of FIG. 7 by machine learning, deteriorated portions can be detected automatically. The diagnostic unit 107 detects deteriorated portions by applying machine learning (for example, deep learning) processing to each image of the input diagnostic image group and reference image group. When machine learning is used, images including deteriorated portions are input in advance as training data and learned.
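 Purely as an illustration of this modification, one conceivable form is a small siamese scorer over associated plane (or image) pairs; the architecture below is an assumption of this sketch, since the specification only states that machine learning, for example deep learning, may be applied.

import torch
import torch.nn as nn

class PairScorer(nn.Module):
    """Tiny siamese scorer: probability that a past/present pair shows deterioration."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.head = nn.Linear(64, 1)

    def forward(self, past, present):            # tensors of shape (B, 3, H, W)
        feats = torch.cat([self.encoder(past), self.encoder(present)], dim=1)
        return torch.sigmoid(self.head(feats))   # (B, 1) deterioration probability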
[0143]
 [Modification (2)]
 In the above-described embodiment, a method is used in which captured image data is obtained during aerial photography and diagnostic processing is started, with similar processing applied to the diagnostic image group and the reference image group; however, the embodiment is not limited to this. As a modification, prior to the diagnosis date and time, processing such as plane detection by the plane matching method is performed on the image group of the reference data, and the processing result information is stored in the DB. As a result, at the time of the diagnostic processing, the stored processing result information of the reference data can simply be read out and used, so the overall processing time can be shortened.
[0144]
 [Modification (3)]
 When applying the association and comparison processing of the plane matching method to the above past and present image groups, it may be difficult to detect planes depending on the image. For example, when the structure of the surface of the object 5 is complicated, noise may increase, and either no plane may be detected or only a large number of fine planes may be detected. In the diagnostic system of this modified example, when the input image is one in which planes are difficult to detect, it is treated as an exception: the plane matching processing is not applied, and other processing such as image-level matching processing is applied instead.
[0145]
 [Modification (4)]
 In the above-described diagnostic processing, the associated images are compared with each other, and the deteriorated portion is detected from the difference between them. Although it depends on the processing method, noise is generally involved, and reducing the noise is effective for improving diagnostic accuracy. Therefore, the diagnostic system of this modified example applies two-stage noise removal processing. As the first stage, the computer system 100 applies predetermined noise removal processing, for example predetermined filter processing, to the entire image data of the comparison target images. After the first-stage noise removal, the computer system 100 evaluates, by predetermined evaluation processing, the degree of noise remaining in the image. The computer system 100 compares the obtained noise degree with a predetermined threshold value, and applies second-stage noise removal processing to images whose noise degree exceeds the threshold. The second-stage noise removal is, for example, predetermined filter processing different from that of the first stage. As a result, erroneous detection of deteriorated portions during the diagnostic processing can be reduced.
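 A sketch of this two-stage noise removal follows; the choice of Gaussian and median filters and the residual-standard-deviation evaluation of the noise degree are illustrative assumptions, not the specified processing.

import cv2

def denoise_two_stage(image, threshold=25.0):
    """Apply the first-stage filter, evaluate remaining noise, add a second stage if needed."""
    stage1 = cv2.GaussianBlur(image, (5, 5), 0)   # first-stage noise removal
    residual = cv2.absdiff(image, stage1)
    noise_degree = float(residual.std())          # evaluation of the remaining noise
    if noise_degree > threshold:                  # compare with the predetermined threshold
        return cv2.medianBlur(stage1, 5)          # second stage: a different filter
    return stage1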
[0146]
 [Modification (5) - Route Setting Method]
 As a modification, the route setting method among the additional functions will be described. With this route setting function, based on the past route, reference data, and the like, a suitable route of the drone 1 for the present or future diagnosis, together with the diagnosis date and time, shooting setting information, and the like, is automatically generated and preset. A suitable route, diagnosis date and time, and shooting setting information are those that can reduce the sway or shift of the current image content with respect to the past image content. Specifically, the route, the diagnosis date and time, and the shooting setting information are set so that the diagnosis target area is captured as clearly as possible, taking into consideration the light and wind conditions in the environment of the object 5, the time, the weather, and the like. In other words, the current suitable route is generated by predetermined correction based on the past set route. With this function, aerial photography is actually performed according to the preset route, diagnosis date and time, and shooting setting information. As a result, the difference between the past and present image contents is small, and a diagnostic image group in which the same diagnosis target region is clearly captured is easily obtained. Therefore, the association and comparison processing is facilitated, and the diagnostic accuracy can be improved.
[0147]
 FIG. 15 shows an example of a route around the object 5 and shooting setting information for the route setting function. FIG. 15A shows the basic route and the shooting setting information set at a certain date and time in the past. The basic route and the shooting setting information are set so as to cover, for example, the diagnosis target region on the side wall surface of the object 5. On the basic route, the shooting direction, the shooting conditions, and the like are set for each position (three-dimensional coordinates) of the drone 1 and the camera 4 and for each shooting time.
[0148]
 FIG. 15B shows the suitable route (corrected route), shooting setting information, and the like generated by the route setting function based on the basic route and shooting setting information of FIG. 15A and the reference data resulting from the actual aerial photography.
[0149]
 In this route setting method, when setting a suitable route, the part that determines the spatial information (positions on the route, etc.) is determined based on the past route and the actual navigation result information on that route. The part that determines the temporal information (diagnosis date and time, imaging times on the route, etc.) is determined based on the time information of the past route, taking into consideration the time, the weather, sunlight and shadow, the wind direction and wind speed, the positional relationship between the drone 1 and the object 5, the shooting direction of the camera 4, and the like.
[0150]
 The computer system 100 obtains the corrected route and shooting setting information by correcting the basic route and shooting setting information through correction calculations that take the wind and light conditions into consideration. The computer system 100 performs the correction calculations in consideration of the wind direction and speed, the direction and amount of sunlight, the positional relationship between the object 5 and the drone 1, and the like, according to the time, weather, and so on. By the correction calculations, the positions on the route, the image capturing times, the shooting directions, the shooting conditions, and the like are corrected.
[0151]
 The following are examples of corrections made by this function.
[0152]
 - Set a recommended diagnosis date and time according to the weather conditions.
[0153]
 - Correct the shooting direction and shooting conditions for areas where light is relatively hard to reach.
[0154]
 - Correct the image capturing times and time intervals according to the wind direction.
[0155]
 In addition, from the reference image group taken during aerial photography of the basic route of (A), the computer system 100 can evaluate the image content in consideration of the wind and light conditions, and detect, for example, locations where the deviation from the diagnosis target area is large or where the sharpness is insufficient. Further, the computer system 100 may measure the wind speed, wind direction, temperature, and the like with sensors at the diagnosis date and time, and correct the route and shooting setting information in consideration of the measured values.
[0156]
 According to this function, setting a suitable route and the like reduces the differences in image content between temporally different images, facilitates the association and comparison processing, and improves diagnostic accuracy.
[0157]
 In the diagnostic system of the embodiment, the user sets the basic route of the drone 1, and the drone navigates the route autonomously at the diagnosis date and time; the user does not need to control the drone 1 at that time. The route is set to comply with predetermined rules. Examples of such rules are the times, time zones, and places where navigation is permitted; restrictions on the altitude and weight of the flying object during navigation; securing a predetermined distance between the flying object and the object; and not flying when there are many people below the flying object.
[0158]
 [Route Setting Processing]
 FIG. 16 shows a processing flow of the route setting function of the computer system 100. FIG. 16 has steps S101 to S105.
[0159]
 (S101) When route setting is instructed by a user operation on the setting screen, the computer system 100 reads and inputs the past route setting information, reference data, diagnosis result information, and the like. The reference data includes the reference image group, reference sensor data, shooting information, and the like. The shooting information includes the date and time of diagnosis and aerial photography, the shooting timings (imaging times on the route), the shooting setting information of the camera 4, and the like.
[0160]
 (S102) Based on the past set route and the reference data, which are the input data, the computer system 100 generates a spatially suitable route for the diagnosis at the current date and time. A spatially suitable route is one from which the diagnosis target region of the object 5 can be aerially photographed with little displacement. Since the past navigated route deviates from the set route, the computer system 100 generates the suitable route so as to reduce that difference (one possible spatial correction is sketched below, after S105).
[0161]
 (S103) Based on the shooting information of the past date and time, which is the input data, the computer system 100 generates suitable shooting setting information of the camera 4 for the diagnosis at the current date and time: the shooting direction, shooting timing, shooting conditions, and the like with which the diagnosis target region of the object 5 can be photographed with little displacement.
[0162]
 (S104) Based on the past diagnosis result information, which is the input data, the computer system 100 generates a suitable diagnosis date and time for the current diagnosis and sets it as the recommended diagnosis date and time. A suitable diagnosis date and time is a time and time zone at which the diagnosis target area of the object 5 can be photographed as clearly as possible.
[0163]
 (S105) The computer system 100 displays the generated route, diagnosis date and time, imaging setting information, and the like on the setting screen for user confirmation. The user confirms each item on the setting screen and presses the confirm button to adopt it; the user can also partially revise the presented information before adopting it. As a result, the computer system 100 presets a suitable route, diagnosis date and time, shooting setting information, and the like for the present or future diagnosis.
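 As a sketch of one spatial correction in S102, the set waypoints could be pre-compensated by the mean deviation between the past set route and the actually flown trajectory; this particular correction rule is an assumption, as the specification only requires that the difference be reduced.

def correct_route(set_route, flown_route):
    """set_route, flown_route: equal-length lists of (X, Y, Z) waypoints."""
    n = len(set_route)
    drift = [sum(f[i] - s[i] for s, f in zip(set_route, flown_route)) / n
             for i in range(3)]   # mean past deviation per axis
    # Command points offset against the observed drift so that the flown
    # trajectory comes out closer to the originally intended route.
    return [(s[0] - drift[0], s[1] - drift[1], s[2] - drift[2]) for s in set_route]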
[0164]
 [Modification (6) - Camera Adjustment Method]
 As a modification, the camera adjustment method among the additional functions will be described. With this camera adjustment function, shooting control is performed so as to adjust the shooting settings of the camera 4 in real time while the drone 1 navigates its route. This adjustment changes the shooting settings so as to reduce the sway or shift of the current image content with respect to the past images. Specifically, during aerial photography, the shooting direction, shooting timing, shooting conditions, and the like of the camera 4 are corrected at every predetermined shooting time point on the route so that the degree of overlap between the current and past images becomes as large as possible.
[0165]
 FIG. 17 shows an example of camera adjustment during aerial photography in the camera adjustment method of the modification. With this function, the shooting direction and the like of the camera 4 are adjusted in real time during the aerial navigation of the drone 1, at each control time point of a predetermined time interval, so that the diagnosis target region of the object 5 is imaged with as little displacement as possible from the past images. As a result, the content of the aerial images at the current diagnosis date and time is made as close as possible to the content of the past reference images, which facilitates the association and comparison processing in the diagnosis and can improve diagnostic accuracy.
[0166]
 Specifically, the computer system 100 calculates the degree of overlap between the current image and the past image at each control time point (for example, time points t1, t2, t3) at a predetermined time interval. The region where the current image and the past image overlap is indicated by shading. The computer system 100 adjusts the shooting setting information, such as the shooting direction of the camera 4, in real time so that the degree of overlap between control time points becomes as large as possible; for example, the shooting setting information at the next time point t2 is adjusted based on the state at the time point t1.
[0167]
 The adjustment (correction) processing of the camera 4 in this method is performed not at every image capturing time point but at control time points at a coarser predetermined time interval, thereby reducing the processing load. In this method the correction processing is performed per image at the predetermined time interval, but the adjustment may instead be performed per group (for example, per representative image in a group) by using the grouping setting described later. Further, as a modified example, not only the shooting setting information of the camera 4 but also the navigation parameters of the drone 1 may be adjusted for the same purpose.
[0168]
 [Camera Adjustment Processing]
 FIG. 18 shows a processing flow of the camera adjustment function. FIG. 18 has steps S201 to S205.
[0169]
 (S201) The computer system 100 causes the drone 1 to perform aerial photography at the diagnosis date and time based on the settings, and receives aerial image data from the drone 1 (camera 4) in real time. From the time-series continuous images of the input diagnostic image group, the computer system 100 sequentially extracts one image (extracted diagnostic image) at each control time point of a predetermined time interval.
[0170]
 (S202) Based on the extracted diagnostic image of the input diagnostic image group, the computer system 100 refers to the corresponding reference image group and selects a corresponding image (extracted reference image) from it by association using the shooting information of each image. The computer system 100 associates the extracted reference image with the extracted diagnostic image as a comparison target image.
[0171]
 (S203) The computer system 100 calculates, as an overlap rate, the degree of overlap between the past image and the current image in the associated comparison target images. In addition, the computer system 100 calculates the direction of the shift between the two images (the shift direction in FIG. 17), obtained, for example, as the vector connecting the center points of the two images. Not only the overlap rate but also the overlap area may be calculated.
[0172]
 The computer system 100 may also use the above-described plane matching processing when calculating the overlap rate of the two images; in that case, the overlap rate between the planes in each image is calculated, which makes the adjustment processing more efficient.
[0173]
 (S204) Based on the overlap rate and the shift direction, the computer system 100 calculates the adjustment amount for adjusting the shooting setting information between control time points of the predetermined time interval. Suppose here that the shooting direction of the camera 4 is adjusted. The computer system 100 calculates the adjustment amount from the current shooting direction of the camera 4 (for example, the shooting direction 1701 in FIG. 17) to the shooting direction at the next time point (for example, the shooting direction 1702). It first calculates, as a pixel displacement, the number of pixels to move in the direction that increases the overlap rate of the two images (the direction opposite to the shift direction), and then calculates the corresponding adjustment amount of the shooting direction of the camera 4 (a sketch of this computation follows after S205).
[0174]
 (S205) The computer system 100 controls in real time so as to change the shooting setting information of the camera 4 for the next time point according to the adjustment amount obtained above. The above-described processing is similarly repeated at each time point.
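 The following sketch condenses S203 and S204 on binary coverage masks of the imaged regions; representing the footprints as masks and returning a raw pixel move (rather than a camera angle) are simplifying assumptions of this sketch.

import numpy as np

def overlap_and_adjust(mask_cur, mask_past, center_cur, center_past):
    """Overlap rate of the two images and the pixel move that opposes the shift."""
    inter = np.logical_and(mask_cur, mask_past).sum()
    union = np.logical_or(mask_cur, mask_past).sum()
    overlap_rate = float(inter) / float(union) if union else 0.0   # S203
    shift = (center_cur[0] - center_past[0],                       # S203: vector
             center_cur[1] - center_past[1])                       # between centers
    move_px = (-shift[0], -shift[1])   # S204: move opposite to the shift direction
    return overlap_rate, move_px       # convert move_px to a camera angle downstream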
[0175]
 [Modification (7) - Stepwise Association Method]
 As a modification, the stepwise association method among the additional functions will be described. As described above, the association and comparison processing is fundamentally difficult and imposes a high processing load; association is difficult unless the past and present image contents are highly similar. In the stepwise association method of this modified example, the association processing of the past and present image groups is realized in two steps using grouping.
[0176]
 The computer of the computer system 100 divides the entire space including the object 5 into a plurality of groups (also referred to as spatial parts) in advance, based on information such as the three-dimensional structure of the three-dimensional object model data, the route, and the shooting setting information (for example, the camera shooting direction). A group corresponds to a rough spatial division in space-time. The method of grouping is not particularly limited; for example, the space may be divided into groups according to the structural parts or wall surface parts of a building, or according to the similarity of the camera shooting direction (or the traveling direction of the drone 1).
[0177]
 At the time of the association processing, the computer first performs rough association in group units as the first association. The computer divides the diagnostic image group into a plurality of image groups corresponding to the grouping, and associates a certain past image group with a certain present image group as a comparison target group. The first association can be performed relatively easily by using the per-group information.
[0178]
 Next, after the first association, the computer performs the second association for the individual images within each comparison target group. The computer performs the above-described plane-level matching or image-level matching between the past and present images in the group to obtain comparison target images, then performs comparison processing between the associated images to determine and detect deteriorated portions. When image-level matching is used, the matching is performed based on a plurality of feature points detected in each image.
[0179]
 After the two-step association processing, the computer obtains comprehensive diagnosis result information by connecting and integrating the per-group diagnosis result information of the comparison processing into one.
[0180]
 FIG. 19 shows the stepwise association method of the modified example. FIG. 19A shows an example of the reference image group and its groups: groups G1, G2, G3, and so on in the space, each with a corresponding image group. FIG. 19B shows an example of the diagnostic image group and its groups: groups G11, G12, G13, and so on in the space, each with a corresponding image group. In the first association, the computer associates the image groups with each other; for example, the group G1 is associated with the group G11. The computer then performs the second association, for example, the association between the plurality of images in the group G1 and the plurality of images in the group G11, using the plane matching method or the like.
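 A sketch of one grouping rule named above (similarity of the camera shooting direction) and of the resulting first, group-level association follows; the 45-degree sector size is an arbitrary choice of this sketch.

from collections import defaultdict

def group_by_direction(shots, sector_deg=45.0):
    """shots: iterable of (image_file, camera_yaw_deg); bucket images by direction sector."""
    groups = defaultdict(list)
    for image_file, yaw in shots:
        groups[int((yaw % 360.0) // sector_deg)].append(image_file)
    return groups

def first_association(ref_shots, diag_shots):
    """First association: pair reference and diagnostic image groups per sector key."""
    ref_groups = group_by_direction(ref_shots)
    diag_groups = group_by_direction(diag_shots)
    return {k: (ref_groups[k], diag_groups[k])
            for k in ref_groups.keys() & diag_groups.keys()}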
[0181]
 According to the above method, the association processing can be made efficient and the processing time shortened. The first, group-level association processing may further be performed as follows. The computer system 100 performs the following processing in advance on the plurality of groups (image groups) of the reference image group. For each group, it selects one image (representative image) from the plurality of images, for example, an image captured at a predetermined time interval, or an image corresponding to a specific location on the three-dimensional object model. In FIG. 19, representative images are indicated by double-framed squares. In the reference data, the computer stores, as detailed information for each selected representative image, the position coordinate information of the feature points in the image and the like. At the time of the first association processing, the computer uses the detailed information of the representative images in each group and compares them with the images of the diagnostic image group to perform the association in group units.
[0182]
 [Modification Example - Partial SFM Processing Method]
 As a modification example, the partial SFM processing method among the additional functions will be described. With this partial SFM processing method, when there is a large amount of aerial image data, SFM processing is performed on a subset of the images instead of on all of them.
[0183]
 Known SFM (structure from motion) processing restores the three-dimensional structure of an object (feature points and the like representing its surface) and the camera positions from a plurality of two-dimensional images. In the present diagnostic system, for example, the SFM processing is used at the time of the above-mentioned conversion to restore the three-dimensional structure, that is, the three-dimensional coordinates of the deteriorated portion. However, since known SFM processing has a relatively high processing load, it takes a long time and is inefficient when executed on a large amount of aerial image data.
[0184]
 Therefore, in this partial SFM processing method, SFM processing is performed on a selected part of the many candidate aerial images to obtain three-dimensional information (feature points and three-dimensional coordinates) representing the deteriorated portion.
[0185]
 As a method of selecting the SFM processing target, the entire space including the target object 5 is divided in advance into a plurality of space portions (which may be the above-described groups), and the diagnostic image group is divided into a plurality of image groups accordingly. The computer then selects image groups, in units of space portions, for SFM processing.
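 The division into space portions can be pictured with the following sketch, which buckets images by the recorded camera position; the grid-cell size and the metadata field names are assumptions.

from collections import defaultdict

def group_by_space(images, cell=10.0):
    # Bucket images into horizontal grid cells (cell size in metres) using
    # the camera position recorded with each image; each bucket becomes the
    # unit handed to one partial SFM run.
    groups = defaultdict(list)
    for img in images:
        x, y = img["camera_x"], img["camera_y"]  # assumed metadata fields
        groups[(int(x // cell), int(y // cell))].append(img)
    return dict(groups)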
[0186]
 Also, in this partial SFM processing method, SFM processing is executed in parallel for the plurality of image groups of the respective space portions in the diagnostic image group, and the result information of the plural SFM processes is integrated into one set of result information. With this function, the diagnostic processing load can be reduced and the processing time can be shortened.
[0187]
 FIG. 20 shows the partial SFM processing method in the modified example. The computer system 100 selects, for example, image groups of the diagnostic image group in units of the space portions and applies SFM processing to them. In FIG. 20, groups G21, G22, and G23 are provided as the plurality of space portions; the groups can be set in advance. The computer divides the diagnostic image group into a plurality of image groups corresponding to these groups and selects the groups to be SFM-processed; in this example, all of the groups G21 to G23 are targeted. The computer executes SFM processing for each of the selected groups G21 to G23; for example, the SFM processing unit 107D may process the plurality of groups by parallel calculation. The computer system 100 obtains SFM processing result information for each group and then integrates it into one piece of information.
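 The parallel execution and integration might look like the following sketch; run_sfm is a stub standing in for a real SFM pipeline, and the merged-result layout is an assumption.

from concurrent.futures import ProcessPoolExecutor

def run_sfm(image_group):
    # Stub for a real SFM pipeline: would restore feature points with
    # three-dimensional coordinates and camera poses for one image group.
    return {"points": [], "cameras": []}

def partial_sfm(groups):
    # Run SFM on each spatial group in parallel, then merge the results.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_sfm, groups.values()))
    merged = {"points": [], "cameras": []}
    for res in results:
        # Real code would also align coordinate frames at group boundaries,
        # where the specification recommends keeping little information.
        merged["points"].extend(res["points"])
        merged["cameras"].extend(res["cameras"])
    return merged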
[0188]
 Note that, depending on how the space portions (groups) are divided, a deviation or the like may occur during integration. It is therefore desirable to set suitable space portions (groups) in advance, for example so that the amount of information is small at the connecting portions between the space portions (groups). A deviation is then less likely to occur during integration, and the SFM processing accuracy can be increased.
[0189]
 [Modification - Priority Aerial Photography Method]
 As a modification, a priority aerial photography method will be described. In this method, attention is paid to a specific place or area of the object 5, and the route and the shooting setting information of the camera 4 are set and controlled so that the specific place is preferentially captured in the aerial images. For example, on the setting screen, the user sets a specific location or area in advance as a priority location on the three-dimensional object model. The specific location is, for example, a location known to have a high possibility of deterioration, based on knowledge of the structure of the target object 5 and on past diagnosis results. A previously detected deteriorated point may also be set as a priority point based on the past reference data.
[0190]
 FIG. 21 shows the priority aerial photography method. In this example, the object 5 is a bridge. On the basis of the three-dimensional model of the bridge and the like, a place with a relatively high possibility of deterioration (for example, cracks due to stress concentration or corrosion due to chemical reaction) is set as a priority place; priority place 2101 is shown in this example. Based on the setting information of the priority place, the computer system 100 sets a route and shooting setting information of the camera 4 suitable for preferentially photographing that place. Here, a case is shown in which the shooting direction and shooting conditions of the camera 4 are corrected relative to the basic route: the shooting direction of the camera 4 is set to face the same priority point at a plurality of consecutive predetermined imaging times and positions on the route. During the actual aerial photography, the shooting direction of the camera 4 is controlled according to this shooting setting information.
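 The correction of the shooting direction can be sketched as follows: for each waypoint on the basic route, compute the yaw and pitch that aim the camera at the priority point. The coordinate conventions and field names are assumptions.

import math

def aim_at_priority(waypoints, priority_xyz):
    # For each (x, y, z) waypoint, compute gimbal yaw/pitch (degrees) that
    # point the camera at the priority location.
    px, py, pz = priority_xyz
    settings = []
    for (x, y, z) in waypoints:
        dx, dy, dz = px - x, py - y, pz - z
        yaw = math.degrees(math.atan2(dy, dx))                     # horizontal heading
        pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # up/down tilt
        settings.append({"yaw": yaw, "pitch": pitch})
    return settings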
[0191]
 As a result, a plurality of images including the priority place are obtained in the diagnostic image group, giving multiple candidate images of the same priority area. The computer system 100 can therefore perform the association and comparison processing for the priority point more easily, which improves the diagnostic accuracy.
[0192]
 (Other Embodiments)
 FIG. 22 shows an aerial image including a target object and a three-dimensional model of that object in a diagnostic system according to another embodiment of the present invention, here applied to utility poles and electric wires. In this form, the diagnosis process detects, as the state and place corresponding to the above-mentioned deteriorated place, a contact place where the object (utility pole and electric wire) is in a contact state with other surrounding objects (trees and the like).
[0193]
 FIG. 22A shows an example of an aerial image including the object. In this example, the image contains two utility poles near a road and electric wires strung between them; a tree is included as another object, and the wires may be in contact with it. The computer system 100 estimates and detects such a contact point through diagnosis processing including association and comparison processing similar to that of the above-described embodiment. When the computer system 100 detects a contact point candidate in the two-dimensional image, it positions the candidate on the three-dimensional model of the object from the two-dimensional information and displays a visualization screen.
[0194]
 During diagnosis processing, the computer system 100 may have difficulty detecting an electric wire in an image, for example because the wire is thin, and it may likewise be difficult to judge contact between the wire and a tree. In that case, the computer system 100 handles the processing as follows.
[0195]
 The computer system 100 detects the predetermined target object and other objects in the past and present aerial images by the above-described comparison processing, SFM processing, machine learning, or the like, and detects candidates for contact points between the target object and the other objects. From the two-dimensional coordinates of the contact point candidates, the computer system 100 calculates the corresponding three-dimensional coordinates on the three-dimensional object model and positions the candidates there. At that time, each contact point candidate is visualized with predetermined highlighting.
[0196]
 If the electric wire cannot be detected in the aerial image, the computer system 100 detects the two nearby utility poles in the aerial image or in a past three-dimensional object model and obtains their three-dimensional coordinates. From the two poles, the computer system 100 tentatively sets, as an estimated electric wire, a wire connecting the tip portions of the two poles in three-dimensional space as the wire's position. The computer system 100 positions and displays the estimated electric wire on the three-dimensional object model, visualized with a predetermined expression. As a result, the user can easily recognize the contact point candidate on the screen, which makes it easier to plan work such as inspection of the electrical equipment.
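 The estimated electric wire could be constructed as in this sketch: a straight segment sampled between the two detected pole tips. A real wire sags, so the straight line is only a rough placeholder; the function name is an assumption.

def estimate_wire(tip_a, tip_b, samples=20):
    # Linearly interpolate 'samples' points between two pole-tip (x, y, z)
    # coordinates; the result is drawn on the three-dimensional model.
    return [tuple(a + (b - a) * t / (samples - 1) for a, b in zip(tip_a, tip_b))
            for t in range(samples)]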
[0197]
 Further, for the contact point candidate on the three-dimensional object model, the computer system 100 determines contact between the estimated electric wire and an object such as a tree. The computer system 100 may calculate the contact possibility as a numerical value and display it to the user as a determination result. For example, the computer system 100 roughly detects the height of the other object, such as a tree, from the aerial image and compares it with the estimated height of the electric wire to determine the contact state. Alternatively, the computer system 100 may restore the three-dimensional structure (feature points) of an object such as a tree from the aerial images by SFM processing or the like and determine the contact state by comparing it with the estimated electric wire on the object three-dimensional model. This allows the user to confirm, on the screen, the possibility of contact at the contact point candidate.
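 The numerical contact possibility from the height comparison could be computed as in this sketch; the clearance margin and the 0-to-1 scoring are illustrative assumptions.

def contact_possibility(wire_points, tree_top_xyz, margin=0.5):
    # Score 0..1 from the vertical clearance (metres) between the estimated
    # wire and the tree top: 1 when the tree reaches the wire height, 0 when
    # the clearance exceeds the margin.
    tx, ty, tz = tree_top_xyz
    nearest = min(wire_points, key=lambda p: (p[0] - tx) ** 2 + (p[1] - ty) ** 2)
    clearance = nearest[2] - tz
    return min(1.0, max(0.0, 1.0 - clearance / margin))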
[0198]
 Although the present invention has been specifically described above based on the embodiments, the present invention is not limited to the above-described embodiments and can be variously modified without departing from the scope of the invention.
Explanation of symbols
[0199]
 1 ... Drone, 2 ... PC, 3 ... Server, 4 ... Camera, 5 ... Object, 21 ... Drone control function, 22 ... Diagnostic client program, 23 ... Storage unit, 32 ... Diagnostic server program, 33 ... DB, 100 ... Computer system, 200 ... Deterioration diagnosis and deterioration location visualization software using a flying object.
The scope of the claims
[Claim 1]
 An aircraft-based deterioration diagnosis system for diagnosing a state, including deterioration, of an object by using imaging from a flying body, comprising:
 a flying body that navigates a route around the object and has a camera for photographing the object; and
 a computer system for controlling the navigation of the flying body and the photographing by the camera,
 wherein the computer system
 causes the flying body to navigate the route at a predetermined date and time and continuously photograph the object, and stores data including the image group acquired from the flying body in a DB,
 associates, for the same object, images including the same portion in the current image and the past image as comparison target images, on the basis of diagnostic data including a diagnostic image group photographed at the current date and time and reference data including a reference image group of a past date and time referred to from the DB,
 detects a deterioration point in a state including the deterioration from the current image by comparing the current image and the past image of the comparison target images and determining the difference,
 converts the two-dimensional coordinates representing the deterioration point in the current image so as to position them at three-dimensional coordinates in a space including a three-dimensional model of the object, and
 generates and displays to a user a screen visualizing diagnosis result information including the deterioration point in the converted space.
[Claim 2]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system, at the time of the association, detects a plane portion in each image of the diagnostic image group and the reference image group and associates the plane portion of the current image with the plane portion of the past image, and,
 at the time of the comparison, detects the deterioration point from the current image by comparing the plane portion of the current image with the plane portion of the past image and determining the difference.
[Claim 3]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system, at the time of the conversion, converts the two-dimensional coordinates representing the deterioration point in each of the plurality of images in the diagnostic image group into three-dimensional coordinates on the three-dimensional model of the object on the basis of a perspective transformation matrix.
[Claim 4]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system, in the conversion, calculates a plane transformation coefficient for converting a first plane portion detected in a first image of the plurality of images in the diagnostic image group into a second plane portion detected in a second image, and uses the plane transformation coefficient to calculate, from the two-dimensional coordinates representing the deterioration point in the first plane portion, the two-dimensional coordinates representing the deterioration point in the second plane portion.
[Claim 5]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system generates a screen visualizing a time-series change of the deterioration point of the object, including a comparison between the past image and the current image.
[Claim 6]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system causes the flying body to navigate the route at the predetermined date and time and stores in the DB data including sensor data, which contains the date and time, the route, and position information of the flying body at the time of imaging, and the shooting setting information of the camera, and,
 based on the data of past dates and times and in consideration of the light and wind conditions around the object, generates and presets data including the route or the shooting setting information for a current or future date and time so that the target area of the object can be covered and imaged.
[Claim 7]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system judges a degree of overlap between the current image and the past image at each control time point when the flying body is navigated along the route and images are taken, and performs control to correct the shooting setting information of the camera or a navigation control parameter of the flying body so that the degree of overlap becomes as large as possible.
[Claim 8]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system divides the diagnostic image group and the reference image group into a plurality of groups in spatiotemporal space according to the three-dimensional model of the object, the route, or a division of the shooting directions of the camera, and,
 at the time of the association, performs a first association between the image groups in group units and, after the first association, performs a second association in image units within each group.
[Claim 9]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system divides the diagnostic image group into a plurality of parts in spatiotemporal space in accordance with the three-dimensional model of the object, the route, or a division of the shooting directions of the camera, converts, for each part, the two-dimensional coordinates representing the deterioration point in each image of the diagnostic image group into three-dimensional coordinates on the three-dimensional model of the object on the basis of a perspective transformation matrix, and integrates the conversion results of the respective parts.
[Claim 10]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system sets a specific region of the object as a priority shooting region, sets shooting setting information including the shooting direction of the camera according to the setting of the priority shooting region, and controls the shooting direction of the camera so as to face the priority shooting region at a plurality of consecutive points when the flying body is navigated along the route and the photographing is performed.
[Claim 11]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system uses machine learning in the association and the comparison.
[Claim 12]
 The aircraft-based deterioration diagnosis system according to claim 1,
 wherein the computer system determines, at the time of the comparison, a contact state between the object and another object and detects, from the current image, a contact point in the contact state,
 at the time of the conversion, converts the two-dimensional coordinates representing the contact point in the current image so as to position them at three-dimensional coordinates in the space including the three-dimensional model of the object, and,
 after the conversion, generates a screen visualizing diagnosis result information including the contact point in the space and displays the screen to the user.

Documents

Application Documents

# Name Date
1 202037010517.pdf 2020-03-12
2 202037010517-FORM 1 [12-03-2020(online)].pdf 2020-03-12
3 202037010517-STATEMENT OF UNDERTAKING (FORM 3) [12-03-2020(online)].pdf 2020-03-12
4 202037010517-DECLARATION OF INVENTORSHIP (FORM 5) [12-03-2020(online)].pdf 2020-03-12
5 202037010517-COMPLETE SPECIFICATION [12-03-2020(online)].pdf 2020-03-12
6 202037010517-DRAWINGS [12-03-2020(online)].pdf 2020-03-12
7 202037010517-FIGURE OF ABSTRACT [12-03-2020(online)].pdf 2020-03-12
8 202037010517-Proof of Right [17-03-2020(online)].pdf 2020-03-17
9 202037010517-Information under section 8(2) [24-07-2020(online)].pdf 2020-07-24
10 202037010517-FORM-26 [28-09-2020(online)].pdf 2020-09-28
11 202037010517-FORM 18 [21-06-2021(online)].pdf 2021-06-21
12 202037010517-FER.pdf 2022-05-09
13 202037010517-Certified Copy of Priority Document [24-06-2022(online)].pdf 2022-06-24
14 202037010517-FER_SER_REPLY [29-09-2022(online)].pdf 2022-09-29
15 202037010517-COMPLETE SPECIFICATION [29-09-2022(online)].pdf 2022-09-29
16 202037010517-CLAIMS [29-09-2022(online)].pdf 2022-09-29
17 202037010517-ABSTRACT [29-09-2022(online)].pdf 2022-09-29
18 202037010517-FORM 3 [29-09-2022(online)].pdf 2022-09-29
19 202037010517-Information under section 8(2) [29-09-2022(online)].pdf 2022-09-29
20 202037010517-IntimationOfGrant25-08-2023.pdf 2023-08-25
21 202037010517-PatentCertificate25-08-2023.pdf 2023-08-25

Search Strategy

1 202037010517searchE_06-05-2022.pdf

ERegister / Renewals

3rd: 06 Nov 2023 (From 24/08/2020 to 24/08/2021)
4th: 06 Nov 2023 (From 24/08/2021 to 24/08/2022)
5th: 06 Nov 2023 (From 24/08/2022 to 24/08/2023)
6th: 06 Nov 2023 (From 24/08/2023 to 24/08/2024)
7th: 10 Aug 2024 (From 24/08/2024 to 24/08/2025)
8th: 04 Aug 2025 (From 24/08/2025 to 24/08/2026)