Abstract: An automated aircraft inspection system for maintenance, repair and overhaul operations. The present invention discloses an automated inspection system for performing maintenance, repair and overhaul operations, wherein the system (100) comprises a user interface device (101) through which a user provides commands to perform an activity. The system (100) further comprises a central data processing unit (102) for allocating at least one data acquisition unit (103) for data collection, and at least one data acquisition unit (103) for acquiring data from at least a portion of the surface of an aircraft. The system (100) comprises a knowledge repository unit (104) for storing the operational activity details, wherein detailed archives of all the activities conducted through the system (100) are recorded in real-time in the knowledge repository unit (104). (Figure 1)
Description: PREAMBLE TO THE DESCRIPTION:
[0001] The following specification particularly describes the invention, and the manner in which it has to be performed:
DESCRIPTION OF THE INVENTION
Technical field of the invention
[0002] The present invention relates to an automatic surface inspection system. The invention particularly relates to an intelligent inspection system for controlling the automated surface inspection of aircraft.
Background of the invention
[0003] The inspection of large objects such as aircraft, wind turbines, ships, bridges, structures, buildings, etc. is often difficult, as it is practically impossible to detect all possible damage on their surfaces. Such surface damage includes impacts due to lightning strikes, corrosion, and cracks.
[0004] The existing techniques used for visual inspection of large objects such as aircraft rely on manual, human operation of the hardware or on human vision for data acquisition, while a separate, decoupled system is used for analysis of the acquired data. These approaches are prone to human error and human safety issues, such as traversing the top of an airplane fuselage to acquire data, resulting in delays and longer Aircraft on Ground (AOG) times, reduced on-time performance (OTP), expensive operational costs, and increased passenger safety risks for Maintenance, Repair and Overhaul (MRO) operation providers and airlines.
[0005] Additionally, the inspection of aircraft may involve uncertainties, wherein unexpected damage or challenging time constraints for repairs may arise due to continuous flight schedules. Thus, it is highly important for inspection systems to be adaptive to the uncertainties that are frequently encountered during inspections. The existing systems require human intervention to adapt to new situations, and there exists no methodology or framework to handle such unpredictability of operations.
[0006] Further, the Maintenance, Repair and Overhaul (MRO) operations carried out through existing systems generally attempt to automate the data acquisition process; however, these systems are able to automate only a few minor subtasks and are not able to adapt to the new data being gathered. Most of the inspections are only point solutions, i.e., they address only the current need at hand and operate in a reactive mode rather than a preemptive mode.
[0007] There are several drawbacks to the existing technologies, including limited automation and an increased need for human intervention in the data acquisition process, wherein human intervention can pose safety risks. Further, limited automation implies manual and time-consuming processes, which may lead to longer Aircraft on Ground (AOG) or parking times, missed contracts, delayed project delivery, etc. Such systems are highly prone to human error. The lack of in-depth traceability of the ongoing process and previous missions leads to a lack of accountability. Further, the lack of automation and the reliance on human involvement for inspection imply human errors such as missed defect detections, and make it hard to quantify a time estimate for performing the inspections, unlike automated programmatic approaches.
[0008] Further, the data acquisition methodology is decoupled from the data analysis, wherein the data is gathered in a separate process from the detailed analysis phase; thus, during the analysis phase, the inspectors may encounter corrupt, insufficient, or missing data. In order to remedy such missing, insufficient and/or corrupt data, a follow-up data acquisition task would need to be relaunched, which proves to be highly time-consuming. Further, the existing systems are not adaptable and their results are not holistic. The existing methods are neither modular nor versatile, and thus the hardware systems can tackle only a limited range of missions, none of which are cross-disciplinary.
[0009] In order to overcome the drawbacks of the existing systems, several technologies have been developed over the decades toward automated inspection systems for inspecting large objects.
[0010] The Patent Application No. CN109379564A entitled “Unmanned aerial vehicle inspection device and method for fuel gas pipeline” discloses an unmanned aerial vehicle inspection device and method for a fuel gas pipeline, relates to the field of fuel gas pipelines, and aims to solve the problems of low efficiency, difficulty in realizing inspection, high cost and susceptibility to weather environment existing in an existing natural gas pipeline inspection method. The method comprises the following steps that: a flight control module sends a control command to an unmanned aerial vehicle module, and a multi-rotor unmanned aerial vehicle flies according to the control command; a video recording module transmits a collected digital image signal to a data receiving module through a wireless image transmission module, and transmits the collected digital image signal to an image processing module and the flight control module; the image processing module splices and fuses received images, and transmits the processed images to an image analyzing module, a display module and the flight control module through data buses; the image analyzing module analyses suspectable accident points, annotates the suspectable accident points on the images, broadcasts the confirmed accident points through an alarm module, and transmits the processed images to the display module for displaying; and the flight control module transmits the received image data to a data service center.
[0011] The Patent Application No. WO2018229391A1 entitled “Platform for controlling and tracking inspections of surfaces of objects by inspection robots and inspection system implementing such a platform” discloses a remote platform for controlling and tracking inspections of surfaces of predetermined objects by inspection robots, comprising a reference database of said predetermined objects; a module for setting the parameters of said inspection robots; a module for collecting, from an inspection database, surface data provided by said inspection robots; a module for processing said surface data from said inspection database, configured to be able to detect surface defects on said inspected objects, to save them in a defect database, and to provide said parameter-setting module, from said inspection database and from said defect database, with an optimized parameter setting for said inspection robots; and a system for wireless communication with said inspection robots.
[0012] The Patent Application No. WO2021088311A1 entitled “Multi-unmanned aerial vehicle collaborative operation-based automatic inspection method and system for bridges” discloses an automatic inspection method and system for bridges, wherein the multi-unmanned aerial vehicle collaborative operation-based automatic inspection method comprises the steps of: analyzing bridge environment information and dividing a bridge inspection task into a plurality of sub-tasks; acquiring a task grouping optimization function, and dividing the sub-tasks into sub-task groups; planning the flight paths of unmanned aerial vehicles corresponding to the sub-task groups respectively, and allocating the sub-task groups to corresponding unmanned aerial vehicles; transmitting photos photographed by the unmanned aerial vehicles to an image processing unit to extract damage features of the bridge, and generating a bridge damage report. In the multi-unmanned aerial vehicle collaborative operation-based automatic inspection method and system for bridges, different photographing schemes are designed according to different bridge tasks, bridge inspection is performed by means of the mutual collaboration of a plurality of unmanned aerial vehicles, and the unmanned aerial vehicles simultaneously execute different sub-tasks and use redundancy fault-tolerant technology to remediate abnormalities. The method and system are highly efficient, have high task continuity and completeness, are highly accurate, have a low error rate, and can solve the problems in which traditional bridge inspection operations have a high degree of difficulty, a single unmanned aerial vehicle takes a long time, and the error rate is high.
[0013] Hence, there is a need for an automated aircraft inspection system and method for aircraft Maintenance, Repair and Overhaul (MRO) operations.
Summary of the invention
[0014] The present invention discloses an automated and intelligent aircraft inspection system and a method for performing maintenance, repair and overhaul operations wherein, the system automates the inspection process and analysis of the collected data. The system comprises a user interface device for allowing a user to assign a task to a central data processing unit, wherein the central data processing unit receives the assigned task from the user interface device and allocates at least one data acquisition unit for data collection from at least a portion of the surface of an aircraft under inspection.
[0015] The system further comprises a central data processing unit for allocating at least one data acquisition unit for data collection from at least a portion of the surface of the object. The central data processing unit receives commands from the user through the user interface device. The system comprises a knowledge repository unit for storing the operational activity details and history, wherein the details from the user interface device, the plurality of data acquisition units and the central data processing unit are recorded in the knowledge repository unit in real-time.
[0016] The present invention discloses a method of performing aircraft inspection for maintenance, repair and overhaul operations, wherein the method comprises the steps of: providing at least one input command to the central data processing unit by the user through the user interface device in order to perform at least one activity. Further, the central data processing unit allocates at least one data acquisition unit to perform the activities provided by the user in order to inspect the aircraft.
[0017] Upon allocation of at least one data acquisition unit to perform the activities provided by the user, the data acquisition unit collects the data using the plurality of sensors. The data acquisition unit collects the data from at least a portion of the surface of the aircraft. The data collected by the data acquisition unit is processed by the central data processing unit to identify the damage in the portion of the surface of the aircraft.
[0018] Subsequently, upon processing the data received from the data acquisition unit, the central data processing unit evaluates the processed data to determine if the activity is completed. In case of any discrepancies in the data obtained by the data acquisition unit, the central data processing unit allocates at least one data acquisition unit with advanced sensors for data collection and the collected data is processed and evaluated to identify the type of damage in the portion of the surface of the aircraft. Further, the knowledge repository unit stores the second-by-second details of the activities performed by the central data processing unit and the data acquisition unit and the data is recorded in the knowledge repository unit.
[0019] The present invention is advantageous as it is adaptable regardless of the situation at hand. As the data acquisition unit is tightly coupled with the central data processing unit, the automated inspection system automatically tasks a data acquisition unit with advanced sensors for inspecting the aircraft, based on the quality of the previously obtained data. Further, the automated inspection system is dynamic, as the system adapts to the uncertain situations encountered during aircraft inspection and the central data processing unit automatically allocates the required data acquisition unit for inspection, thus adapting to the uncertain situations.
[0020] Further, based on the time constraints and depth of analysis requirements, the system deploys a plurality of data acquisition units or a single data acquisition unit for data collection. Further, the automated inspection system is modular as the central data processing unit allows various types of data acquisition unit with sensors for aircraft inspection, wherein the data collected, analyzed and evaluated is stored in the knowledge repository unit. Further, the automated inspection system facilitates virtual inspection wherein the system carries out virtual inspection on the virtual representation of aircraft models in order to provide insights such as probable regions with defects based on the previous data from the knowledge repository unit.
Brief description of the drawings
[0021] The foregoing and other features of embodiments will become more apparent from the following detailed description of embodiments when read in conjunction with the accompanying drawings. In the drawings, like reference numerals refer to like elements.
[0022] Figure 1 illustrates a block diagram representation of the automated aircraft inspection system.
[0023] Figure 2 illustrates a block diagram representation of the data acquisition unit of the automated aircraft inspection system.
[0024] Figure 3 illustrates a flow diagram for the method involved in the automated inspection of the aircrafts.
[0025] Figure 4 illustrates a block diagram representation of the first use case of the present invention.
[0026] Figure 5 illustrates a block diagram representation of the second use case of the present invention.
Detailed description of the invention
[0027] Reference will now be made in detail to the description of the present subject matter, which is shown in the illustrations. Various changes and modifications obvious to one skilled in the art to which the invention pertains are deemed to be within the spirit, scope, and contemplation of the invention.
[0028] In order to more clearly and concisely describe and point out the subject matter of the claimed invention, the following definitions are provided for specific terms, which are used in the following written description.
[0029] The term "automated and intelligent aircraft inspection system", according to the present invention, means an aircraft inspection system that is automated and intelligent in that it is adaptive, modular, integrated and data-driven, and adapts itself to various uncertain situations without any manual intervention.
[0030] The term “knowledge repository unit”, according to the present invention, means an individual hardware entity to record and archive real-time activity of the system. This unit also houses and archives models (AI, machine learning, heuristic, procedural, etc., models) and any other data processing algorithms (e.g. filtering of sensor data, de-pixelation, etc.) to be used for prediction and analysis by the connected data processing unit.
[0031] The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive way, simply because it is being utilized in conjunction with detailed description of certain specific embodiments of the invention. The terms “automated and intelligent aircraft inspection system” and “automated aircraft inspection system” may be interchangeably used.
[0032] The present invention discloses an automated and intelligent aircraft inspection system for performing maintenance, repair and overhaul operations wherein, the system automates the inspection process and analysis of the collected data.
[0033] Figure 1 illustrates a block diagram representation of the automated aircraft inspection system for performing maintenance, repair and overhaul operations, wherein the system (100) comprises a user interface device (101) for allowing a user to assign a task to a central data processing unit (102), wherein the central data processing unit (102) receives the assigned task from the user interface device (101). The user interface device (101) facilitates interaction between the user and the system (100), wherein the user interface device (101) receives instructions from the user and reports back to the user with the results and in-depth live operation details of the system. The user interface device (101) facilitates connection between various users onto a common collaborative framework.
[0034] The system (100) further comprises a central data processing unit (102) for allocating at least one data acquisition unit (103) for data collection from different parts of the exterior surface of an aircraft under inspection. The data acquisition unit (103) acquires data from a portion of the surface of the aircraft. The central data processing unit (102) receives commands from the user through the user interface device (101), wherein the central data processing unit (102) facilitates dynamic configuration of the system (100). The central data processing unit (102) controls the data acquisition unit (103) and the knowledge repository unit (104), wherein the central data processing unit (102) manages the data acquisition by interfacing with the data acquisition unit (103) and provides the data acquisition unit (103) with instructions, commands and feedback on the acquired data. The central data processing unit (102) evaluates the data acquired by the data acquisition unit (103) and the operation of the data acquisition unit (103), and facilitates monitoring of the activity and the dynamic adaptation of the system (100) to uncertainties.
[0035] The system (100) further comprises at least one data acquisition unit (103) for acquiring data from at least a portion of the surface of the aircraft. According to an embodiment of the invention, the data acquisition unit (103) may comprise a localized data processing unit (105) for communicating with the central data processing unit (102) and processing the data acquired by the data acquisition unit (103). The data acquisition unit (103) comprises a knowledge repository sub-unit (106) for storing the data captured by the data acquisition unit (103), at least one sensor (107) for gathering the data, and a positioning module (108) comprising hardware to position and orient all the sensors (107) to capture the data with the required precision and accuracy.
[0036] According to an embodiment of the invention, the sensor (107) may facilitate capturing a plurality of images of the surface of the aircraft, wherein the sensors (107) facilitate data collection. The sensors (107) may include image capturing devices, Light Detection and Ranging (LiDAR) sensors, thermal sensors, ultrasonic sensors, etc., according to an embodiment of the invention. Further, the data acquisition unit (103) includes a positioning module (108) to facilitate orienting and positioning of the sensors (107) at an optimal location, orientation and time to capture the data, and a plurality of control systems for streaming the data, processing the data, signal filtering, etc., according to an embodiment of the invention.
[0037] The system (100) comprises a knowledge repository unit (104) for storing the operational activity details, wherein the knowledge repository unit (104) receives the details from the user interface device (101), the plurality of data acquisition units (103) and the central data processing unit (102). The knowledge repository unit (104) comprises detailed archives of all the activities conducted by the user through the system (100). According to an embodiment of the invention, the data recorded in the knowledge repository unit (104) may include the inputs provided by the user, such as details of the activity, constraints, etc., the data acquired by the data acquisition unit (103), and the communication logs between the central data processing unit (102), the data acquisition unit (103) and the knowledge repository unit (104).
[0038] The knowledge repository unit (104) further stores the telemetry of the data acquisition unit (103) hardware, the status of the sensors (107), and details of the analysis performed by the data acquisition unit (103). According to an embodiment of the invention, the knowledge repository unit (104) may include any external inputs provided by the user, such as tables, logs, files, readings, etc. Further, the knowledge repository unit (104) archives all the specific data of the activities and storage systems, and also translates all the archived information into agnostic knowledgeware such as machine learning models, control system models, heuristic models, combinations of heuristic, empirical and machine learning models, etc., that can be applied and used for any future activities, according to an embodiment of the invention. The data in the knowledge repository unit (104) may be stored in physical storage devices, according to an embodiment of the invention.
[0039] Figure 2 illustrates a block diagram representation of the automated inspection system disclosing the components of the data acquisition unit (103), wherein the data acquisition unit (103) comprises a knowledge repository sub-unit (106) for storing the data captured by the data acquisition unit (103) and at least one sensor (107) for capturing the data. According to an embodiment of the invention, the sensor (107) may be an image capturing device for capturing a plurality of images of at least a portion of the surface of the aircraft to be inspected. According to an embodiment of the invention, the data acquisition unit (103) may comprise a localized data processing unit (105) to process the data captured by the data acquisition unit (103).
[0040] According to an embodiment of the invention, the user provides commands to the system (100) through the user interface device (101), wherein various sensors (107) capture the data from the surface of the aircraft and the captured data is processed by the localized data processing unit (105) located in the data acquisition unit (103). The user commands and interactions, along with the captured and processed data, are stored in the knowledge repository sub-unit (106). The data generated by the data acquisition unit (103) is evaluated by the knowledge repository unit (104) to assess the damage on the inspected aircraft.
[0041] The central data processing unit (102) and the knowledge repository unit (104) are coupled together, wherein the data acquisition unit (103) determines that optical readings of the surface of the aircraft are required, acquires the data from a certain distance from the surface, and undertakes a unique scanning pattern to cover the entire surface, according to an embodiment of the invention. The distance from the surface of the aircraft and the scanning pattern are determined by the data acquisition unit (103) using the data and experience from previously carried out activities, by means of the mission-agnostic models formulated and archived in the knowledge repository unit (104).
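By way of illustration only, and not as part of the disclosed embodiment, the generation of a surface-covering scanning pattern at a fixed standoff distance may be sketched as follows; all function names and parameters are hypothetical, and the actual pattern and standoff would be supplied by the archived mission-agnostic models:

```python
# Illustrative sketch: a boustrophedon ("lawnmower") scan pattern over a
# rectangular surface patch, held at a fixed standoff distance from the
# surface. Hypothetical names; not the claimed implementation.

def scan_pattern(width_m, length_m, standoff_m, footprint_m):
    """Yield (x, y, z) waypoints covering a width_m x length_m patch.

    standoff_m  -- distance held from the surface by the acquisition unit
    footprint_m -- lateral sensor coverage per pass (sets pass spacing)
    """
    waypoints = []
    y = 0.0
    direction = 1  # alternate the sweep direction on each pass
    while y <= length_m:
        xs = [0.0, width_m] if direction == 1 else [width_m, 0.0]
        for x in xs:
            waypoints.append((x, y, standoff_m))
        y += footprint_m
        direction *= -1
    return waypoints
```

In practice the standoff and footprint would be derived from the sensor model and prior missions rather than fixed constants.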
[0042] According to an embodiment of the invention, the central data processing unit (102) and the knowledge repository unit (104) are coupled together wherein, during inspection of the surface of the aircraft by the data acquisition unit (103), various parts of the aircraft may require further investigation. Subsequently, the central data processing unit (102) communicates with the user interface device (101) and, using the archived mission-agnostic models, determines that advanced sensors such as LiDAR sensors are required on some portions of the aircraft at certain time intervals, and that high-resolution optical images of some portions of the aircraft are required to detect any damage or wear caused by weather conditions, object strikes, etc.
[0043] Figure 3 illustrates a flow diagram for the method of automated aircraft inspection for performing maintenance, repair and overhaul operations, wherein the method (200) comprises the steps of providing at least one input command to the central data processing unit (102) by the user through the user interface device (101) in order to perform at least one activity, in step (201). Further, the central data processing unit (102) allocates at least one data acquisition unit (103) to perform the activities provided by the user, in step (202).
[0044] Upon allocation of at least one data acquisition unit (103) to perform the activities provided by the user, the data acquisition unit (103) collects the data using the plurality of sensors (107) and positioning module (108), in step (203). The data acquisition unit (103) collects the data from at least a portion of the surface of the aircraft. Further, in step (204), the data collected from the data acquisition unit (103) is processed by the central data processing unit (102) to identify the damages in the portion of the surface of the aircraft.
[0045] Subsequently, the central data processing unit (102) evaluates the data from the data acquisition unit (103), wherein the central data processing unit (102) determines the damage incurred by the aircraft and determines the operational capability of the aircraft. The central data processing unit (102) determines whether the activity provided by the user is completed. In step (205), the central data processing unit (102) allocates at least one data acquisition unit (103) with advanced sensors for data collection in case of any discrepancies in the data obtained by the data acquisition unit (103), and the collected data is processed and evaluated by the central data processing unit (102) in order to identify the type of damage in the portion of the surface of the aircraft. Further, in step (206), the knowledge repository unit (104) archives all the recorded second-by-second details of the activities performed by the central data processing unit (102) and the data acquisition unit (103), and the data is stored in the knowledge repository unit (104). Further, the data acquisition unit (103) relays the acquired data to the central data processing unit (102) and the knowledge repository unit (104). The central data processing unit (102) analyzes the data relayed from the data acquisition unit (103) and sends the required instructions and commands to the data acquisition unit (103).
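Purely as an illustrative sketch (not the claimed implementation), the control flow of steps (202) through (206) can be expressed as a loop that escalates to an advanced-sensor unit when a discrepancy is found; all names below are hypothetical placeholders:

```python
# Illustrative sketch of the inspection loop in steps (202)-(206).
# Hypothetical names; the real units, processing and repository are
# hardware entities described in the specification.

def run_inspection(task, units, repository, process, max_retries=1):
    """Allocate a unit, collect and evaluate data, escalate on discrepancy.

    units      -- {"standard": unit, "advanced": unit}; a unit exposes collect(task)
    repository -- exposes archive(task, data, findings)
    process    -- callable returning {"discrepancy": bool, ...} (step 204)
    """
    unit = units["standard"]                      # step (202): allocate a unit
    findings = None
    for _ in range(max_retries + 1):
        data = unit.collect(task)                 # step (203): acquire data
        findings = process(data)                  # step (204): identify damage
        repository.archive(task, data, findings)  # step (206): record details
        if not findings["discrepancy"]:           # evaluation: activity complete
            break
        unit = units["advanced"]                  # step (205): advanced sensors
    return findings
```

Note that archiving happens on every pass, matching the real-time, second-by-second recording described for the knowledge repository unit (104).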
[0046] According to an embodiment of the invention, the system (100) adapts dynamically and automatically reconfigures itself during the course of the activity, in case of any unexpected damage on the surface of the aircraft being inspected. In such case, the central data processing unit (102) allocates another advanced data acquisition unit (103) to perform the activity, wherein the data acquired by the data acquisition unit (103) is processed by the central data processing unit (102). The central data processing unit (102), data acquisition unit (103) and the knowledge repository unit (104) are integrated with one another, wherein they are able to communicate data, instructions, and analysis results seamlessly between one another through a common user interface device and a communication protocol.
[0047] The user interface device (101), the central data processing unit (102), the plurality of data acquisition units (103) and the knowledge repository unit (104) are modular and tightly integrated with each other using a common application programming interface and communication protocol platform, according to an embodiment of the invention. Thus, the system (100) allows quick, dynamic configuration/re-configuration of the components of the system (100) in order to successfully complete the activities.
[0048] According to an embodiment of the invention, the central data processing unit (102) and the data acquisition unit (103) can handle several instances of uncertainty without any manual intervention, wherein the central data processing unit (102) may launch new processing sessions concurrently on the data acquisition units (103) to assist in parallel processing of data; one processing session may analyze the captured optical images of the surface of an object, while another processing session analyzes the LiDAR readings.
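As an illustrative sketch only, concurrent per-modality processing sessions of this kind might look as follows; the analysis functions are hypothetical stand-ins for the real optical and LiDAR models:

```python
# Illustrative sketch: one concurrent processing session per data
# modality (optical images vs. LiDAR readings). The two analyses are
# placeholder stubs, not the disclosed damage-detection models.
from concurrent.futures import ThreadPoolExecutor

def analyze_optical(images):
    # placeholder: count frames flagged as anomalous
    return sum(1 for img in images if img.get("anomaly"))

def analyze_lidar(readings):
    # placeholder: count out-of-tolerance surface-depth readings
    return sum(1 for r in readings if abs(r) > 0.5)

def run_sessions(images, readings):
    # launch both sessions concurrently and gather their results
    with ThreadPoolExecutor(max_workers=2) as pool:
        optical = pool.submit(analyze_optical, images)
        lidar = pool.submit(analyze_lidar, readings)
        return {"optical": optical.result(), "lidar": lidar.result()}
```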
[0049] According to an embodiment of the invention, upon detection of potential damage on a portion of the aircraft, the central data processing unit (102) identifies the captured image as damage of a certain type, for example a dent, and determines the probability of the damage being a dent. According to an embodiment of the invention, an ambiguous range is set between 15% and 50%: in case the probability is less than 15%, the central data processing unit (102) infers that there is no damage on the surface, and in case the probability is higher than 50%, the inspected portion is determined to be damaged. When the probability falls within the ambiguous range, the central data processing unit (102) deploys the data acquisition unit (103) to capture higher resolution images to improve the detection probability. The higher resolution images captured by the data acquisition unit (103) help in resolving the ambiguity.
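The three-way decision rule described above can be sketched directly; this is an illustration of the stated thresholds only, with hypothetical names:

```python
# Illustrative sketch of the ambiguity-handling rule: below 15% implies
# no damage, above 50% implies damage, and the ambiguous range in
# between triggers re-acquisition at higher resolution.

NO_DAMAGE_MAX = 0.15  # below this probability: no damage inferred
DAMAGE_MIN = 0.50     # above this probability: portion deemed damaged

def classify(dent_probability):
    if dent_probability < NO_DAMAGE_MAX:
        return "no_damage"
    if dent_probability > DAMAGE_MIN:
        return "damaged"
    # ambiguous range: deploy the unit for higher-resolution capture
    return "reacquire_high_resolution"
```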
[0050] Figure 4 illustrates a block diagram representation of the first use case of the present invention, wherein the system (100) is configured to inspect one specific aircraft using a plurality of data acquisition units (103). Each data acquisition unit (103) comprises at least one sensor or a set of sensors (107), and in this modular setup the knowledge repository unit (104) and the plurality of data acquisition units (103), i.e., data acquisition unit ‘A’ to data acquisition unit ‘Z’, are coordinated by the central data processing unit (102). The central data processing unit (102) is embodied on a central ground station electronic device that interfaces with each data acquisition unit (103) via radio signals. The knowledge repository unit (104) facilitates storage of data on a cloud server that the ground station has access to. The central data processing unit (102) allocates the plurality of data acquisition units (103) to inspect the aircraft, the localized data processing unit (105) located on each data acquisition unit (103) processes the data acquired by the respective data acquisition unit (103), and the data is sent to the central data processing unit (102) for evaluation, where the system (100) detects the type of damage on the surface of the aircraft, which is communicated to the user through the user interface device (101).
[0051] For example, the user provides commands to the system (100) to inspect 15 aircraft within 3 hours with the fidelity of A-checks in a quick inspection mode. The central data processing unit (102) allocates 40 data acquisition units (103), including 33 air-based drones and 7 land-based rovers. The central data processing unit (102) allocates, for example, 1 robot for inspecting narrow-body aircraft and 2 robots for wide-body aircraft, interfaces with the processing session of the central data processing unit (102) running on each robot, and watches for any anomalies or uncertainties. Upon completion of inspecting the narrow-body aircraft, the central data processing unit (102) re-allocates the data acquisition units (103) to assist in the inspection of wide-body aircraft. Each localized data processing unit (105) located on a data acquisition unit (103) is loaded with models to identify the defects, determine an optimal scanning pattern and relay the findings to the central data processing unit (102). The central data processing unit (102) queries the knowledge repository unit (104) for optimal scheduling and resource management models in order to optimize the usage of the data acquisition units (103) given time and fidelity constraints. The localized data processing unit (105) located on the data acquisition unit (103) queries the knowledge repository unit (104) to obtain any historical data and to identify the defect detection models best suited for the desired turn-around time.
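As an illustrative sketch of the allocation rule in the example above (1 unit per narrow-body, 2 per wide-body), and not the disclosed scheduling models, a simple greedy assignment could look like this; all names are hypothetical:

```python
# Illustrative sketch: greedy assignment of data acquisition units to
# aircraft by body type, as in the worked example (1 unit per
# narrow-body, 2 per wide-body). Hypothetical names only.

def allocate(units, aircraft):
    """Map each aircraft tail number to a list of unit ids.

    units    -- list of available unit ids
    aircraft -- list of (tail_number, body_type) pairs
    """
    pool = list(units)
    plan = {}
    for tail, body in aircraft:
        need = 2 if body == "wide" else 1
        if len(pool) < need:
            break  # not enough units; remaining aircraft await re-allocation
        plan[tail] = [pool.pop(0) for _ in range(need)]
    return plan
```

In the described system this assignment would come from the scheduling and resource management models queried from the knowledge repository unit (104), with re-allocation as units free up.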
[0052] For example, the central data processing unit (102) encounters an unexpected event during inspection, such as a massive dent on the upper fuselage of an aircraft. The central data processing unit (102) deploys a new data acquisition unit (103) comprising the localized data processing unit (105), and the data acquisition unit (103), coupled with the knowledge repository unit (104), dynamically configures a new optimal scanning pattern to cover the unexpected damage. The data acquisition unit (103) coupled with the knowledge repository unit (104) also determines the best sensor to be used to accurately quantify the unexpected damage, for example a LiDAR sensor. The processing session of the localized data processing unit (105), interfaced with the knowledge repository unit (104), is launched on the data acquisition unit (103) and on the user's electronic device in order to load the machine learning models best suited for dent analysis.
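The re-tasking step can be sketched as a lookup of the best sensor per defect type followed by deployment of an idle unit. The table and policy are hypothetical; the patent only states that the knowledge repository unit (104) informs the choice, naming LiDAR as suitable for dent quantification.

```python
# Hypothetical defect-to-sensor table; only the dent/LiDAR pairing is
# taken from the specification, the rest is illustrative.
SENSOR_FOR_DEFECT = {
    "dent": "lidar",
    "corrosion": "thermal",
    "crack": "imaging_camera",
}

def retask(defect_type: str, idle_units: list) -> tuple:
    """On an unexpected finding, pick the best sensor for the defect and
    deploy the first idle data acquisition unit carrying it."""
    sensor = SENSOR_FOR_DEFECT.get(defect_type, "imaging_camera")  # safe default
    unit = idle_units.pop(0) if idle_units else None
    return unit, sensor

unit, sensor = retask("dent", ["drone-7"])
# deploys "drone-7" with the "lidar" sensor
```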
[0053] According to an embodiment of the invention, the modular and adaptable nature of the central data processing unit (102) and the data acquisition unit (103) allows data collection to be performed by a single data acquisition unit (103) carrying sensors such as a thermal camera and an imaging camera, or by a swarm of data acquisition units (103), some of which may carry thermal cameras and others imaging cameras. Based on the data acquisition units (103) allocated by the central data processing unit (102) and the commands provided by the user, the optimal configuration is dynamically determined. The data acquisition unit (103) enables custom configuration of the design of the system (100), and the configuration may be adapted dynamically in an automated manner. For example, when the user commands the central data processing unit (102) through the user interface device (101) to inspect several taxied aircraft of an airline carrier, the system may use various strategies to complete the activity: a single data acquisition unit (103) may interface with the central data processing unit (102) for data acquisition; a single data acquisition unit (103) embodied in a ground station may interface with the central data processing unit (102), wherein multiple data acquisition units (103) with various sensors are used for acquiring data for all the aircraft; or several data acquisition units (103) may be embodied in multiple drones that interface with the central data processing unit (102) independently, streaming data and providing instructions. According to an embodiment of the invention, the configuration of the system (100) is automatically chosen based on time constraints, user constraints, etc., and the optimal system setup is dynamically configured.
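The constraint-driven configuration choice described above can be sketched as a small heuristic. The function, its thresholds and the sensor mix are all assumptions for illustration; the patent states only that the optimal setup is determined dynamically from the constraints, not how.

```python
def choose_configuration(time_budget_h: float, n_aircraft: int, fidelity: str) -> dict:
    """Pick a swarm size and sensor mix from user constraints.

    Illustrative heuristic only: higher fidelity doubles the per-aircraft
    effort, and a tighter time budget demands more concurrent units.
    """
    effort = 2 if fidelity == "high" else 1
    units = max(1, round(n_aircraft * effort / max(time_budget_h, 1)))
    sensors = ["imaging_camera"]
    if fidelity == "high":
        sensors += ["thermal", "lidar"]   # richer sensing for deep inspections
    return {"units": units, "sensors": sensors}

cfg = choose_configuration(time_budget_h=3, n_aircraft=15, fidelity="high")
# 15 aircraft at double effort over 3 hours -> 10 concurrent units
```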
[0054] Figure 5 illustrates a block diagram representation of the second use case of the present invention, wherein the system (100) performs design iterations to lower the maintenance and repair costs and the frequency of maintenance. The central data processing unit (102) carries out a virtual inspection of the aircraft and predicts the locations of the aircraft most likely to require frequent maintenance and repair, based on the data located in the knowledge repository unit (104). Further, the system (100) suggests modifications to the design of the aircraft in order to lower the overall defect possibility score. The defect possibility score indicates the likelihood of encountering defects in the aircraft, together with their intensity and frequency, for various operational parameters. According to an embodiment of the invention, the operational parameters may include weather conditions, operation duration of the aircraft, etc.
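The defect possibility score can be sketched as a weighted combination of its three stated components. The patent defines what the score reflects (likelihood, intensity, frequency) but not a formula, so the weights below are purely illustrative assumptions.

```python
def defect_possibility_score(likelihood: float, intensity: float,
                             frequency: float,
                             weights: tuple = (0.5, 0.3, 0.2)) -> float:
    """Illustrative composite score in [0, 1] combining the three factors
    named in paragraph [0054]; the weighting scheme is an assumption."""
    w_l, w_i, w_f = weights
    return w_l * likelihood + w_i * intensity + w_f * frequency

# Score for a candidate region under given operational parameters:
score = defect_possibility_score(likelihood=0.8, intensity=0.6, frequency=0.4)
# 0.5*0.8 + 0.3*0.6 + 0.2*0.4 = 0.66
```

A design iteration would then compare this score before and after a proposed modification and keep the change if the score drops.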
[0055] According to an embodiment of the invention, the system (100) facilitates virtual inspection of large objects such as aircraft, wherein inspection data, such as aircraft inspection data provided by the companies, may be stored in the knowledge repository unit (104) and used to generate machine learning models. According to an embodiment of the invention, the knowledge repository unit (104) may store data including, but not limited to, ship inspection data, bridge inspection data, etc., which may be used for virtual inspection of watercraft, bridges, etc. The machine learning models may suggest the required changes in the design of the aircraft in order to minimize the cost incurred in maintenance. Further, the machine learning models may indicate the regions with high, moderate and low possibility of defects in contour form, according to an embodiment of the invention.
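The high/moderate/low banding of defect probability per region can be sketched as follows. The thresholds are illustrative assumptions; the patent only states that the models present the three bands, which a visualization layer could then render as contours.

```python
def risk_band(p: float) -> str:
    """Map a predicted defect probability to the high/moderate/low bands
    of paragraph [0055]; the 0.7 and 0.3 cut-offs are assumptions."""
    if p >= 0.7:
        return "high"
    if p >= 0.3:
        return "moderate"
    return "low"

# Hypothetical per-region predictions for one aircraft:
predictions = {"upper fuselage": 0.82, "wing root": 0.45, "tail cone": 0.10}
bands = {region: risk_band(p) for region, p in predictions.items()}
# {"upper fuselage": "high", "wing root": "moderate", "tail cone": "low"}
```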
[0056] According to an embodiment of the invention, the data acquisition units (103) may be physical data collection hardware devices, including drones operating in the air, on land, underwater or in space, human beings equipped with image capturing devices, etc. The localized data processing unit (105) located on the data acquisition unit (103) may include, but is not limited to, computer systems and embedded computer systems onboard the data acquisition unit (103) and/or remotely located, and is solely responsible for the automation, guidance, navigation, control and locomotion of the data acquisition unit (103) hardware in order to position the sensors (107) for accurate data collection. The sensors (107) may be the physical hardware devices responsible for accurately measuring the data and information about the surface of the aircraft. The sensors (107) may include, but are not limited to, thermal sensors, cameras (imaging sensors), LiDAR, radar, etc.
[0057] Further, the knowledge repository unit (104) may be, but is not limited to, physical data storage devices that facilitate archiving past activity data, capturing live activity data, and formulating and archiving mission-agnostic models to improve the quality of the data acquisition unit (103) and the localized data processing unit (105), according to an embodiment of the invention. Further, according to an embodiment of the invention, the central data processing unit (102) may include, but is not limited to, cloud computing systems that perform computations for the analysis of the acquired data and determine the need for adaptations in the existing system configuration. Further, the central data processing unit (102) facilitates performing live dynamic adaptations of the system (100), determining the reliability of the acquired data, and performing analysis on the acquired data to determine the status of the inspected aircraft.
[0058] The advantages of the system (100) and the method (200) of the present invention include adaptability, as the central data processing unit (102) adapts to any uncertain situation dynamically. The central data processing unit (102) automatically allocates data acquisition units (103) with advanced sensors (107) based on the quality of the previously acquired data. Further, the central data processing unit (102) allocates a single data acquisition unit (103) or a plurality of data acquisition units (103) for data collection based on the time and depth-of-analysis requirements. Further, the modular nature of the central data processing unit (102) facilitates data collection using various types of sensors for aircraft inspection, and the data collected, analyzed and evaluated is stored in the knowledge repository unit (104). Furthermore, the central data processing unit (102) facilitates virtual inspection of the aircraft and provides insights, such as probable regions with defects, based on the previous data from the knowledge repository unit (104).
[0059] Further, the automated inspection system (100) facilitates inspection of large objects, including but not limited to aircraft, bridges, roads, pipelines, skyscrapers, buildings, and water bodies such as lakes and rivers. The automated inspection system (100) may be used for inspecting bridges, where data acquisition units (103) such as air-based drones may be used. The data acquisition units (103) may fly over and under the bridge for data acquisition. Further, the automated inspection system (100) may be used for the inspection of forests and wildlife, for animal tracking, detecting forest fires, locating water resources, locating intruders/poachers, etc.
[0060] Further, the automated inspection system (100) may be used for the inspection of watercraft, such as ship hull inspection, offshore platform inspection, oil and natural gas extraction rig inspection, etc., wherein air-based robots and land-based rovers may be used for data acquisition. Further, the automated inspection system (100) facilitates inspection in the energy sector, such as power plants, pipelines, windmills, wind turbines, solar panel arrays, solar farms, etc. The automated inspection system (100) may be helpful in surveying lands such as agricultural lands, mountain terrain, and plantations, based on the requirements. The agricultural lands and plantations may be inspected for the quality of the crops being cultivated, wherein air-based drones and land-based rovers may be used for data acquisition.
[0061] Furthermore, the automated inspection system (100) facilitates automated inspection in the defense sector, including commercial aircraft, fighter jets, commuter-class airplanes, rockets, launchpads, military air transport vehicles, helicopters, cargo, etc. Further, the system facilitates automotive inspections, including but not limited to parking lot inspections, vehicle inventory inspections, and rental car delivery and return inspections; industrial equipment inspections such as heavy machinery, earth-moving vehicles, generators and heat exchangers; network antenna inspections; and archaeological inspections such as old monuments, heritage sites like temples, and excavations. Further, the automated inspection system (100) facilitates automated inspection in law enforcement, such as real-time surveillance, thermal camera-based tracking, etc., which may help in solving crimes.
Reference numbers

| Components | Reference Numbers |
|---|---|
| System | 100 |
| User interface device | 101 |
| Central data processing unit | 102 |
| Data acquisition unit | 103 |
| Knowledge repository unit | 104 |
| Localized data processing unit | 105 |
| Knowledge repository sub-unit | 106 |
| Sensors | 107 |
| Positioning module | 108 |
Claims:
We Claim:
1. An automated aircraft inspection system for performing maintenance, repair and overhaul operations, the system (100) comprising:
a. a user interface device (101) for allowing a user to assign a task to a central data processing unit (102), wherein the central data processing unit (102) receives the assigned task from the user interface device (101) and allocates at least one data acquisition unit (103) for data collection from different parts of the exterior surface of an aircraft under inspection;
b. a knowledge repository unit (104) for recording and storing the operational activity details received from the user interface device (101), the central data processing unit (102) and the data acquisition unit (103).
2. The system (100) as claimed in claim 1, wherein the data acquisition unit (103) further comprises:
a. a localized data processing unit (105) for communicating with the central data processing unit (102) and processing the data acquired by the data acquisition unit (103);
b. a knowledge repository sub-unit (106) for storing the data captured by the data acquisition unit (103) and the data processed by the localized data processing unit (105);
c. at least one sensor (107) for facilitating capturing of data of at least a portion of the surface of the aircraft; and
d. a positioning module (108) for orienting and positioning the sensor (107).
3. The system (100) as claimed in claim 1, wherein the central data processing unit (102) provides commands to the data acquisition unit (103) for data acquisition.
4. The system (100) as claimed in claim 1, wherein the data acquired by the data acquisition unit (103) is evaluated by the central data processing unit (102).
5. The system (100) as claimed in claim 1, wherein the inspection data stored in the knowledge repository unit (104) facilitates detection of probable damages through virtual inspection and analysis.
6. The system (100) as claimed in claim 1, wherein the central data processing unit (102) facilitates virtual inspection of the aircraft and provides insights such as probable regions with defects based on the stored inspection data from the knowledge repository unit (104).
7. A method of performing aircraft inspection for maintenance, repair and overhaul operations, the method (200) comprising the steps of:
a. providing at least one input command to the central data processing unit (102) by the user through the user interface device (101) in order to perform at least one task;
b. allocating at least one data acquisition unit (103) to complete the task by the central data processing unit (102);
c. collecting data by at least one data acquisition unit (103), wherein the data acquisition unit (103) uses plurality of sensors (107) and the positioning module (108) to inspect at least a portion of the surface to be inspected;
d. processing the data by the central data processing unit (102) to identify the defects in the inspected surface of the portion;
e. allocating at least one data acquisition unit (103) for acquiring the data from at least a portion of the surface of the aircraft in case the previously obtained data is erroneous or inconclusive, wherein a data acquisition unit (103) comprising advanced sensors is deployed to the aircraft surface for data acquisition; and
f. storing the data in the knowledge repository unit (104), wherein the data stored in the knowledge repository unit (104) is used for further and/or future analysis.
8. The method (200) as claimed in claim 7, wherein the central data processing unit (102) uses the archived and real-time data stored in the knowledge repository (104) to provide commands to the data acquisition unit (103) for acquiring required data.
9. The method (200) as claimed in claim 7, wherein the user commands and the computations performed by the central data processing unit (102) are recorded in the knowledge repository (104).