Abstract: Disclosed herein is an SVDGS (100) for providing advanced aircraft docking guidance assistance to an aircraft. The SVDGS (100) includes a processor (101) configured to determine, based on real time data of the aircraft, using an ML model (107) stored in a memory (103) of the SVDGS (100), whether the aircraft is arriving towards a designated parking bay. Further, the processor is configured to determine, using the ML model, a type of the aircraft based on aircraft information and the real time data. Furthermore, the processor is configured to determine, using the ML model, whether the aircraft is following a centre line marked on the designated parking bay based on the real time data. Furthermore, the processor is configured to provide, based on a result of the determination that the aircraft is deviating from the centre line, guidance instructions to the aircraft to align the aircraft with the centre line.
DESC:FIELD OF THE INVENTION
[0001] The present disclosure relates to an aircraft docking system and more particularly relates to a method and a system for providing advanced aircraft docking guidance assistance to an aircraft using Artificial Intelligence (AI) and Machine Learning (ML) algorithms.
BACKGROUND
[0002] Aircraft docking systems have been an important part of aviation for many years. The aircraft docking systems are used to guide aircraft safely and accurately to the terminal or gate and to ensure that the aircraft is securely connected to ground power and other services. With the increasing demand for air travel, and the need to improve airport efficiency and safety, the development of new and innovative aircraft docking systems has become a priority for the aviation industry.
[0003] A conventional aircraft docking system includes a laser transceiver, camera, and/or radar, which are utilized to acquire real time data of the docking aircraft. Additionally, the system includes an onboard computer (CPU) for processing the real time data and a display unit to provide visual assistance and guidance to the pilots of the docking aircraft. During an active docking operation, the aircraft docking system usually displays information related to the guidance of the aircraft's docking procedure.
[0004] Further, conventional aircraft docking systems use fundamental software for performing calculations of azimuth and range to offer visual docking guidance assistance to approaching aircraft. The conventional aircraft docking guidance system assists pilots in accurately positioning the aircraft at a gate or a parking spot. The assistance involves the use of visual aids such as colored lights, markers, and laser-guided systems to guide the pilot into the correct position.
[0005] In recent years, several aircraft docking systems have been developed that are more efficient, reliable, and user-friendly than earlier systems. Some aircraft docking guidance systems include sensors that can detect the distance between the aircraft and the docking station, as well as the angle of approach. This information can be displayed on the pilot's instrument panel, allowing them to adjust their position and ensure safe and accurate docking. These aircraft docking systems use sensors and computer algorithms to guide the aircraft to the gate. These aircraft docking systems also use advanced communication systems, which enable ground crew and pilots to communicate more effectively during the docking procedure.
[0006] However, there are various factors such as weather conditions, operational activities, personnel and objects at an apron region, or other real-time conditions that could affect aircraft docking.
[0007] Hence, there is a need to provide advanced techniques that help to optimize safety and accuracy during aircraft docking, reducing the risk of collisions or damage to the aircraft.
SUMMARY
[0008] This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention nor is it intended for determining the scope of the invention.
[0009] In an embodiment, a smart visual docking guiding system (SVDGS) for providing advanced aircraft docking guidance assistance to an aircraft is disclosed. The SVDGS includes a memory, one or more sensors, and a processor communicatively coupled with the memory and the one or more sensors. The one or more sensors are configured to acquire real time data associated with the aircraft approaching a parking space. The processor is configured to determine, based on the acquired real time data using a Machine Learning (ML) model stored in the memory, whether the aircraft is arriving towards a designated parking bay in the parking space designated for the aircraft. Further, the processor is configured to determine, using the ML model, a type of the aircraft based on aircraft information, the acquired real time data, and a result of determination that the aircraft is arriving towards the designated parking bay. Furthermore, the processor is configured to determine, using the ML model, whether the aircraft is following a centre line marked on the designated parking bay based on the acquired real time data and a result of the determination that the type of the aircraft is correct. Furthermore, the processor is configured to provide, based on a result of the determination that the aircraft is deviating from the centre line, guidance instructions to the aircraft to align the aircraft with the centre line.
[0010] Also disclosed herein is a method for providing advanced aircraft docking guidance assistance to an aircraft. The method is implemented in a smart visual docking guiding system (SVDGS) including a memory, one or more sensors, and a processor communicatively coupled with the memory and the one or more sensors. The method includes determining, by the processor using a Machine Learning (ML) model stored in the memory, whether the aircraft is arriving towards a designated parking bay in a parking space designated for the aircraft. The determination of whether the aircraft is arriving towards the designated parking bay is based on real time data of the aircraft acquired by the one or more sensors. Further, the method includes determining, by the processor using the ML model, a type of the aircraft based on aircraft information, the acquired real time data, and a result of determination that the aircraft is arriving towards the designated parking bay. Furthermore, the method includes determining, by the processor using the ML model, whether the aircraft is following a centre line marked on the designated parking bay based on the acquired real time data and a result of the determination that the type of the aircraft is correct. Furthermore, the method includes providing, by the processor, guidance instructions to the aircraft to align the aircraft with the centre line based on a result of the determination that the aircraft is deviating from the centre line.
[0011] To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail in the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
FIG. 1 illustrates an exemplary environment depicting a Smart Visual Docking Guidance System (SVDGS), in accordance with an embodiment of the present disclosure;
FIG. 2 illustrates a block diagram of the SVDGS for providing advanced aircraft docking guidance assistance to an aircraft, in accordance with an embodiment of the present disclosure;
FIG. 3 illustrates a functional diagram of the SVDGS for providing the advanced aircraft docking guidance assistance to the aircraft, in accordance with an embodiment of the present disclosure; and
FIG. 4 illustrates a flowchart of a method for providing the advanced aircraft docking guidance assistance to the aircraft, in accordance with an embodiment of the present disclosure.
[0013] Further, skilled artisans will appreciate that those elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help to improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0014] For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.
[0015] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are explanatory of the invention and are not intended to be restrictive thereof.
[0016] Reference throughout this specification to “an aspect”, “another aspect” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrase “in an embodiment”, “in another embodiment”, “in one or more embodiments”, and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0017] The terms “comprise”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by “comprises... a” does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
[0018] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term “or” as used herein, refers to a non-exclusive or unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those skilled in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0019] As is traditional in the field, embodiments may be described and illustrated in terms of modules that carry out a described function or functions. These modules, which may be referred to herein as units or blocks or the like, or may include blocks or units, are physically implemented by analog or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits, or the like, and may optionally be driven by firmware and software. The circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like. The circuits constituting a block may be implemented by dedicated hardware, by a processor (e.g., one or more programmed microprocessors and associated circuitry), or by a combination of dedicated hardware to perform some functions of the block and a processor to perform other functions of the block. Each block of the embodiments may be physically separated into two or more interacting and discrete blocks without departing from the scope of the invention. Likewise, the blocks of the embodiments may be physically combined into more complex blocks without departing from the scope of the invention.
[0020] The accompanying drawings are used to help easily understand various technical features and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed to extend to any alterations, equivalents, and substitutes in addition to those which are particularly set out in the accompanying drawings. Although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another.
[0021] Embodiments will be described below in detail with reference to the accompanying drawings.
[0022] FIG. 1 illustrates an exemplary environment depicting a Smart Visual Docking Guidance System (SVDGS) 100, in accordance with an embodiment of the present disclosure.
[0023] The SVDGS 100 may provide assistance to a pilot of an aircraft during the docking process of the aircraft at a parking bay designated for the aircraft. The SVDGS 100 assists the pilot in docking the aircraft accurately in the parking bay. In one or more embodiments, the SVDGS 100 may provide the assistance to multiple aircraft in multiple parking bays simultaneously.
[0024] As shown in FIG. 1, the SVDGS 100 is installed at an airport with the aircraft approaching the parking bay. The SVDGS 100 may include one or more sensors to scan an apron region of the airport at a predefined interval. The SVDGS 100 may determine that the aircraft is approaching the parking bay based on sensor data of the one or more sensors. Thereafter, the SVDGS 100 may provide guidance instructions to the aircraft to provide enhanced safety, accuracy, and efficiency during docking of the aircraft. The SVDGS 100 may provide real-time guidance to the pilot based on Artificial Intelligence (AI) and Machine Learning (ML) based methods.
[0025] The SVDGS 100 may be adapted to be configured with a plurality of elements of the airport. The plurality of elements may include, but is not limited to, a plurality of runways, a plurality of parking bays, a plurality of lines marked in the parking bays such as a plurality of centre lines and a plurality of stop lines, stopping points of the stop lines, an overrun portion, an underrun portion, and an apron area of the airport.
[0026] FIG. 2 illustrates a block diagram of the SVDGS 100 for providing advanced aircraft docking guidance assistance to the aircraft, in accordance with an embodiment of the present disclosure.
[0027] The SVDGS 100 includes a processor 101, a memory 103, a database 105, an AI/ML model 107, a communication unit 109, an Input/Output (I/O) interface 111, a display unit 113, and a sensor unit 115.
[0028] The processor(s) 101 can be a single processing unit or several units, all of which could include multiple computing units. The processor 101 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 101 is configured to fetch and execute computer-readable instructions and data stored in the memory 103.
[0029] The memory 103 includes one or more computer-readable storage media. The memory 103 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In addition, the memory may, in some examples, be considered a non-transitory storage medium. The term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted to mean that the memory is non-movable. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in Random Access Memory (RAM) or cache).
[0030] The memory 103 may further include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 103 may store one or more machine learning models for performing operations as disclosed throughout the disclosure.
[0031] The database 105 is configured to be accessed by the processor 101 and stores information required by the processor 101 to perform one or more functions. The information includes preconfigured aircraft information, such as wingspan, nose dimensions, fuselage size/shape, and nose-to-nose wheel distance.
[0032] The AI/ML model 107 may be implemented with an AI module that may include a plurality of neural network layers. Examples of neural networks include, but are not limited to, a convolutional neural network (CNN), a deep neural network (DNN), a recurrent neural network (RNN), and a Restricted Boltzmann Machine (RBM). The learning technique for training the AI/ML model 107 uses a plurality of learning data to cause, allow, or control the SVDGS 100 to make a determination or prediction. Examples of learning techniques include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning. At least one of a plurality of CNN, DNN, RNN, RBM models, and the like may be implemented to thereby achieve execution of the present subject matter’s mechanism through the AI/ML model 107. A function associated with the AI/ML model 107 may be performed through the non-volatile memory, the volatile memory, and the processor 101.
[0033] The communication unit 109 is configured to communicate voice, video, audio, images, or any other content over a communication network. Further, the communication unit 109 may include a communication port or a communication interface for sending and receiving signals from the SVDGS 100 via the communication network. The communication port or the communication interface may be a part of the processor 101 or may be a separate component. The communication port may be created in software or may be a physical connection in hardware. The communication port may be configured to connect with the communication network, external media, the display, or any other components in the SVDGS 100, or combinations thereof. The connection with the communication network may be a physical connection, such as a wired Ethernet connection, or may be established wirelessly as discussed above. Likewise, the additional connections with other components of the SVDGS 100 may be physical or may be established wirelessly. The communication unit 109 may include the Wi-Fi module or Bluetooth module for enabling wireless communication capability and data exchange capability between the SVDGS 100 and the network.
[0034] The I/O interface 111 refers to hardware or software components that enable communication between the SVDGS 100 and other devices or systems at the airport. The I/O interface 111 serves as a communication medium for exchanging information, commands, signals, or query responses with other devices or systems. The I/O interface 111 may be a part of the processor 101 or may be a separate component. The I/O interface 111 may be created in software or may be a physical connection in hardware. The I/O interface 111 may be configured to connect with an external network, external media, the display, or any other components, or combinations thereof. The connection with the external network may be a physical connection, such as a wired Ethernet connection, or may be established wirelessly.
[0035] The display unit 113 is configured to display the guidance instructions to the pilot of the aircraft. The display unit 113 may include a display screen. As a non-limiting example, the display screen may be a Light Emitting Diode (LED), Liquid Crystal Display (LCD), Organic Light Emitting Diode (OLED), Active Matrix Organic Light Emitting Diode (AMOLED), or Super AMOLED screen. The display screen may be of varied resolutions. In one or more embodiments, the display unit 113 may be implemented as a plurality of display units. Each display unit of the plurality of display units is coupled with the processor 101 and installed in a respective parking bay to provide guidance instructions to the aircraft.
[0036] The sensor unit 115 includes the one or more sensors configured to capture data associated with the aircraft. The one or more sensors may include a Light Detection and Ranging (LIDAR) sensor 301 and a plurality of image sensors 303.
[0037] The LIDAR sensor 301 is configured to scan an apron region of the airport at a predefined interval. The scanning of the apron region of the airport facilitates the generation of lidar point cloud data of the aircraft space. The lidar point cloud data is three-dimensional (3D) point cloud data that includes a plurality of 3D attributes and a plurality of two-dimensional (2D) attributes of the aircraft in the parking bay of the airport. In one or more embodiments, the LIDAR sensor 301 may generate the lidar point cloud data in real time. For example, the LIDAR sensor 301 may facilitate a 360-degree horizontal view by performing multiple scans of the apron area per second. In one or more embodiments, the SVDGS 100 may include a plurality of LIDAR sensors to facilitate a more detailed view of the airport and the airspace.
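In a non-limiting, simplified sketch (the function name, cell size, and extent below are illustrative assumptions, not parameters of the disclosed system), 2D attributes may be derived from the 3D point cloud by projecting the returns onto a top-down grid of the apron plane:

```python
def to_top_down_grid(points, cell_size=0.5, extent=50.0):
    """Project 3D lidar points onto a 2D top-down occupancy grid.

    points: iterable of (x, y, z) coordinates in metres, sensor at the
    origin. cell_size and extent (half-width of the grid, in metres)
    are illustrative values only.
    """
    n = int(2 * extent / cell_size)
    grid = [[0] * n for _ in range(n)]
    for x, y, z in points:
        # Keep only returns inside the gridded apron region.
        if -extent <= x < extent and -extent <= y < extent:
            col = int((x + extent) / cell_size)
            row = int((y + extent) / cell_size)
            grid[row][col] += 1  # count of returns per apron cell
    return grid
```

Such a projection gives a compact 2D footprint of the aircraft and apron objects that downstream steps (approach detection, obstacle checks) can consume at scan rate.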
[0038] The plurality of image sensors 303 may include one or more cameras, one or more laser sensors, and one or more thermal sensors. The plurality of image sensors 303 generates image data to facilitate real-time tracking of the apron area of the airport and the airspace.
[0039] The data associated with the aircraft may include the lidar point cloud data obtained from the LIDAR sensor 301 and image data obtained from the plurality of image sensors 303. The one or more sensors may capture the data in real time.
[0040] FIG. 3 illustrates a functional diagram of the SVDGS 100 for providing the advanced aircraft docking guidance assistance to the aircraft, in accordance with an embodiment of the present disclosure.
[0041] The AI/ML model 107 receives the lidar point cloud data from the LIDAR sensor 301 and the image data from the plurality of image sensors 303. The AI/ML model 107 further receives the preconfigured aircraft information of the aircraft from the database 105. The AI/ML model 107 predicts a position of the aircraft and a movement of the aircraft based on at least one of the received lidar point cloud data, the image data, and the aircraft information.
[0042] The AI/ML model 107 is trained on large datasets of aircraft docking scenarios, allowing the SVDGS 100 to accurately predict the position of the aircraft and the movement of the aircraft based on inputs from the one or more sensors. The SVDGS 100 may also learn from previous docking experiences to improve its accuracy and provide more personalized guidance to the pilot.
[0043] The SVDGS 100 tracks the approaching aircraft based on the sensor data and determines whether or not the approaching aircraft is arriving at its designated bay. For example, the SVDGS 100 acquires real time data associated with the aircraft approaching the parking space from the one or more sensors. Further, the SVDGS 100 determines whether the aircraft is arriving towards a designated parking bay in the parking space designated for the aircraft. The determination of whether the aircraft is arriving towards the designated parking bay may be based on the acquired real time data using the AI/ML model 107.
[0044] For instance, the AI/ML model 107 may track the turning of the aircraft from a taxiway towards the parking space. Then, the AI/ML model 107 identifies the aircraft nose once completion of the aircraft turn is detected. Thereafter, the AI/ML model 107 determines a distance of the aircraft from the designated parking bay. Thereafter, the AI/ML model 107 determines that the aircraft is arriving towards the designated parking bay if the distance of the aircraft from the designated parking bay is within an acceptable preconfigured distance. The above sequence of steps ensures that the aircraft is approaching the designated parking bay. Also, the above sequence of steps helps in avoiding false detection of an approaching aircraft at the designated bay, unlike prior art systems.
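In a non-limiting, simplified sketch (the function name, coordinate convention, and distance values below are illustrative assumptions), the above sequence of checks may be expressed as:

```python
import math

# Hypothetical preconfigured values; actual values depend on the airport
# layout and the aircraft type assigned to the bay.
ACCEPTABLE_APPROACH_DISTANCE_M = 60.0  # max nose-to-bay distance to confirm approach
DISTANCE_TOLERANCE_M = 5.0

def is_arriving_at_designated_bay(nose_position, bay_position, turn_complete):
    """Confirm that an aircraft is approaching its designated bay.

    nose_position, bay_position: (x, y) coordinates in metres, derived
    from the fused lidar/image data (assumed already available).
    turn_complete: True once the turn from the taxiway has been detected.
    """
    if not turn_complete:
        return False  # nose identification only starts after the turn
    dx = nose_position[0] - bay_position[0]
    dy = nose_position[1] - bay_position[1]
    distance = math.hypot(dx, dy)
    # The aircraft is treated as arriving only if its distance from the
    # bay is within the acceptable preconfigured distance, avoiding false
    # detections of aircraft merely passing on the taxiway.
    return distance <= ACCEPTABLE_APPROACH_DISTANCE_M + DISTANCE_TOLERANCE_M
```

Gating the distance check on completion of the turn is what suppresses false detections of aircraft taxiing past the bay.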
[0045] Further, the SVDGS 100 determines a type of the aircraft based on the preconfigured aircraft information, the acquired real time data, and a result of a determination that the aircraft is arriving towards the designated parking bay. The SVDGS 100 may determine the type of the aircraft using the AI/ML model 107. If the approaching aircraft matches with the preconfigured aircraft information, a docking handler 305 of the SVDGS 100 starts to provide guidance instructions, such as closing distance and azimuth guidance, to the pilot of the aircraft.
[0046] If the approaching aircraft does not match with the preconfigured aircraft information, the SVDGS 100 determines that the type of the aircraft is incorrect. At this time, the docking handler 305 provides guidance instructions to the aircraft to abort the docking procedure. In a non-limiting example, the docking handler 305 may send a notification to display the message “ID FAIL” at the display unit 113. The docking handler 305 may also send a notification to an operator panel 309 to inform that the type of the aircraft is incorrect and that the guidance instruction to abort the docking procedure has been sent to the aircraft. The docking handler 305 may also store this information in the S-VDGS server 307.
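In a non-limiting, simplified sketch (the record fields, tolerance, and return strings other than “ID FAIL” are illustrative assumptions), the type check may compare geometry estimated by the ML model against the preconfigured record from the database 105:

```python
# Hypothetical preconfigured aircraft information record (database 105).
EXPECTED = {"type": "A320", "wingspan_m": 35.8, "nose_to_nose_wheel_m": 5.1}

def check_aircraft_type(detected, expected=EXPECTED, tolerance=0.05):
    """Compare detected aircraft geometry against the preconfigured record.

    detected: dict of measurements estimated from the real-time
    lidar/image data. Returns the message the docking handler would act
    on: docking guidance continues only when the geometry matches.
    """
    for key in ("wingspan_m", "nose_to_nose_wheel_m"):
        expected_val = expected[key]
        # A 5% relative tolerance is assumed for measurement noise.
        if abs(detected[key] - expected_val) > tolerance * expected_val:
            return "ID FAIL"   # abort docking, notify operator panel 309
    return "CONTINUE"          # start closing-distance/azimuth guidance
```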
[0047] In one or more embodiments, after determining that the type of the aircraft is correct, the SVDGS 100 may create a safe zone area for the approaching aircraft using the preconfigured aircraft information. The safe zone area helps to identify one or more obstacles in the path of the approaching aircraft to ensure the safety and integrity of the aircraft as well as the ground staff. If an obstacle is detected, the docking handler 305 may provide guidance instructions to the pilot of the aircraft to abort the docking procedure temporarily. In a non-limiting example, the docking handler 305 may send a notification to display the messages “WAIT” and “STOP” at the display unit 113. Once the obstacle has been removed from the path, the SVDGS 100 continues with the docking guidance. The SVDGS 100 may also identify any hazardous events detrimental to smooth airport operations.
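In a non-limiting, simplified sketch (a rectangular zone and the function names are illustrative assumptions; a real safe zone would follow the aircraft outline from the preconfigured wingspan and fuselage dimensions), the obstacle check may be expressed as:

```python
def obstacles_in_safe_zone(points, zone_min, zone_max):
    """Return ground-plane lidar points falling inside the safe zone.

    points: iterable of (x, y) points with aircraft returns already
    removed; zone_min/zone_max: opposite corners of the safe zone.
    """
    return [
        (x, y) for (x, y) in points
        if zone_min[0] <= x <= zone_max[0] and zone_min[1] <= y <= zone_max[1]
    ]

def guidance_message(points, zone_min, zone_max):
    # Any residual return inside the zone pauses docking (display cycles
    # "WAIT"/"STOP") until the obstacle clears and guidance resumes.
    if obstacles_in_safe_zone(points, zone_min, zone_max):
        return "STOP"
    return "CONTINUE"
```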
[0048] In one or more embodiments, the SVDGS 100 provides accurate azimuth guidance to the pilot using the AI/ML model 107 and is compliant with International Civil Aviation Organization (ICAO) safety standards. The SVDGS 100 provides the azimuth guidance to the aircraft in real time until the docking procedure is complete.
[0049] For example, at a distance of 25 meters prior to an aircraft stopping position, the SVDGS 100 tracks the aircraft nose centre point and calculates a relative angle from a centre line marked on the designated parking bay. For instance, the SVDGS 100 may determine, using the AI/ML model 107, whether the aircraft is following the centre line marked on the designated parking bay. The docking handler 305 may provide guidance instructions to the pilot of the aircraft to align the aircraft with the centre line if it is determined that the aircraft is deviating from the centre line. For instance, the docking handler 305 may display left/right alignment arrows on the pilot display for pilots to rectify their approach azimuth errors.
[0050] In a non-limiting example, when the aircraft reaches 25 meters from the aircraft stopping position, the SVDGS 100 starts tracking the nose wheel accurately in parallel with the aircraft nose centre point, determines the deviation angle from the centre line, and provides accurate azimuth guidance to the pilot in real time.
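In a non-limiting, simplified sketch (the sign convention, 1-degree alignment tolerance, and function name are illustrative assumptions), the relative angle from the centre line and the corresponding alignment arrow may be computed as:

```python
import math

def azimuth_guidance(nose_point, centreline_origin, centreline_dir):
    """Compute the signed deviation angle of the aircraft nose from the
    centre line and the corresponding alignment arrow.

    centreline_dir: unit vector along the marked centre line.
    By the convention assumed here, a positive angle means the nose is
    left of the line, so a right arrow is displayed.
    """
    vx = nose_point[0] - centreline_origin[0]
    vy = nose_point[1] - centreline_origin[1]
    # Signed perpendicular offset via the 2D cross product, longitudinal
    # progress via the dot product; atan2 of the two gives the angle.
    cross = centreline_dir[0] * vy - centreline_dir[1] * vx
    along = centreline_dir[0] * vx + centreline_dir[1] * vy
    angle = math.degrees(math.atan2(cross, along))
    if abs(angle) < 1.0:        # hypothetical alignment tolerance
        return angle, "ALIGNED"
    return angle, "RIGHT" if angle > 0 else "LEFT"
```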
[0051] In one or more embodiments, the SVDGS 100 calculates a closing rate and a distance from the aircraft stopping position using the lidar point cloud data obtained from the LIDAR sensor 301. If the closing rate of the approaching aircraft is higher than a predefined closing rate for that aircraft on that bay, the SVDGS 100 may notify the pilots by displaying “SLOW” on the display unit 113.
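In a non-limiting, simplified sketch (the function names and sampling scheme are illustrative assumptions), the closing rate may be estimated from successive lidar range readings and compared against the predefined limit for the bay:

```python
def closing_rate(range_samples, sample_interval_s):
    """Estimate the closing rate (m/s) from successive lidar ranges.

    range_samples: distances from the stopping position, newest last,
    taken at a fixed sample interval (seconds).
    """
    if len(range_samples) < 2:
        return 0.0
    # Positive when the aircraft is closing on the stopping position.
    return (range_samples[-2] - range_samples[-1]) / sample_interval_s

def speed_message(rate, max_rate):
    # max_rate is the predefined closing rate for this aircraft/bay pair.
    return "SLOW" if rate > max_rate else "OK"
```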
[0052] The SVDGS 100 uses the AI/ML model 107 to identify the aircraft nose and capture the maximum number of data points from the aircraft nose to calculate the distance with an accuracy of 0.1 m. The SVDGS 100 signals the aircraft to stop at a 0.1-meter distance from its stopping position to prevent overshooting of the aircraft and to ensure the safety of both the aircraft and the ground staff. After signalling the pilot to stop the aircraft, the SVDGS 100 waits for a few seconds and then checks the speed of the aircraft to determine whether the aircraft has come to a complete halt. If the aircraft has stopped, the SVDGS 100 recalculates the range and verifies the final stop position. If the aircraft nose wheels are within the range permitted by the ICAO safety standards, the SVDGS 100 displays “OK”; otherwise, it displays “Too Far”.
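In a non-limiting, simplified sketch (the speed threshold, the ICAO position tolerance value, and the function name are illustrative assumptions; only the 0.1 m signalling margin comes from the description above), the stop-and-verify logic may be expressed as:

```python
STOP_SIGNAL_MARGIN_M = 0.1   # signal STOP 0.1 m before the stop position
STOP_POSITION_TOLERANCE_M = 0.5  # assumed tolerance on the final position
HALT_SPEED_MPS = 0.05        # assumed threshold for "complete halt"

def stop_state(distance_to_stop_m, speed_mps):
    """Decide the display message near the stopping position.

    distance_to_stop_m: nose distance from the stopping position
    (negative if overshot), computed from the nose data points.
    speed_mps: current aircraft speed.
    """
    if distance_to_stop_m > STOP_SIGNAL_MARGIN_M:
        return "CLOSING"
    if speed_mps > HALT_SPEED_MPS:
        return "STOP"            # signalled, not yet at a complete halt
    # Aircraft halted: recalculate range and verify the final position.
    if abs(distance_to_stop_m) <= STOP_POSITION_TOLERANCE_M:
        return "OK"
    return "TOO FAR"
```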
[0053] In one or more embodiments, the SVDGS 100 may provide accurate chocks ON timing for billing purposes using the AI/ML model 107. With the help of the real time lidar point cloud data and the image data, the SVDGS 100 detects an event of physical chocks being applied under the wheels of the aircraft by airport operations personnel. The AI/ML model 107 provides accurate chocks ON/OFF timing over existing manual/automatic procedures of the SVDGS 100.
[0054] In one or more embodiments, the SVDGS 100 may track a movement of a Passenger Boarding Bridge (PBB) using the AI/ML model 107 to determine engagement/disengagement of the PBB. The AI/ML model 107 visually tracks, with the help of the lidar point cloud data and the image data, the PBB movement, a distance of the PBB front from the aircraft door, and the canopy position to accurately identify the PBB engagement and disengagement. This helps to avoid any physical integration requirement of the SVDGS 100 with the PBB using interfacing electrical cables and additional hardware/software systems. The SVDGS 100 may also detect engagement/disengagement of multiple PBBs at multiple entry/exit gates of the aircraft simultaneously.
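In a non-limiting, simplified sketch (the engagement threshold and function name are illustrative assumptions), combining the tracked PBB-front distance with the canopy position may be expressed as:

```python
def pbb_state(front_to_door_m, canopy_raised, engage_threshold_m=0.3):
    """Classify Passenger Boarding Bridge engagement from visual tracking.

    front_to_door_m: distance of the PBB front from the aircraft door,
    estimated from the lidar/image data; canopy_raised: canopy position
    flag from the same visual tracking. The threshold is illustrative.
    """
    if front_to_door_m <= engage_threshold_m and canopy_raised:
        return "ENGAGED"
    return "DISENGAGED"
```

Because both inputs come from the sensors already present, no electrical interface to the PBB is needed; the same check can run per gate for multiple PBBs.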
[0055] In one or more embodiments, the SVDGS 100 may determine a status of one or more operational equipment at the airport using the AI/ML model 107. For example, the SVDGS 100 may determine the status of the Ground Power Unit (GPU) and the Pre-Conditioned Air (PCA) unit with a visual object recognition approach. The SVDGS 100 may also determine the status of other airport operational equipment, such as baggage trolley attachment, cargo door opened or closed, baggage loading/unloading status, and aircraft refueling status, among other events. The SVDGS 100 may provide the docking guidance to the aircraft based on the identified status of the one or more operational equipment. This information greatly enhances the efficiency of airport turnaround times and contributes to a better passenger experience and increased revenue from aircraft operations.
[0056] In one or more embodiments, the AI/ML model 107 filters out LiDAR reflections from snow and fog particles in harsh weather conditions, such as heavy snow and thick fog, thereby providing uncompromised ranging performance for the aircraft approaching the parking bay. The AI/ML model 107 also filters out spurious laser reflections from other objects in the field of view of the aircraft path, thereby providing accurate, error-free information for safely docking the aircraft.
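By way of illustration only, one classical heuristic underlying such filtering is that returns from snow and fog particles tend to be isolated points, whereas returns from the aircraft form dense clusters. The following sketch applies a simple neighbourhood-density filter; it is a stand-in under that assumption, not the trained AI/ML model 107 itself.

```python
# Illustrative weather-noise filter (an assumption, not the actual model):
# keep only points whose local neighbourhood is sufficiently dense, which
# discards isolated returns typical of snow and fog particles.
from math import dist

def filter_weather_noise(points, radius=0.5, min_neighbors=2):
    """points: list of (x, y, z) tuples; returns the densely clustered subset."""
    kept = []
    for i, p in enumerate(points):
        neighbors = sum(
            1 for j, q in enumerate(points)
            if i != j and dist(p, q) <= radius
        )
        if neighbors >= min_neighbors:
            kept.append(p)
    return kept
```

For example, four points clustered within a few centimetres of each other survive the filter, while a lone return ten metres away is discarded.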
[0057] In one or more embodiments, the SVDGS 100 may capture all the docking events and provide the captured information to the pilot along with the guidance instructions. The captured information may include the approaching speed and distance, along with the azimuth guidance provided to the pilots.
[0058] In one or more embodiments, the docking handler 305 of the SVDGS 100 may pass information derived from the AI/ML model 107 to the S-VDGS server 307 for advanced AI/ML analytics to determine the pilot performance during the docking process. This ensures safe docking by the pilots and creates a performance monitoring system to reduce any aircraft parking related incidents at the airports.
[0059] In one or more embodiments, the SVDGS 100 stores HD video logs of all active aircraft docking periods at the S-VDGS server 307 for investigative purposes during any parking incidents at the parking bay.
[0060] FIG. 4 illustrates a flowchart of a method 400 for providing the advanced aircraft docking guidance assistance to the aircraft, in accordance with an embodiment of the present disclosure. The method 400 includes a series of operation steps 401 through 417 performed by the processor 101 of the SVDGS 100.
[0061] At step 401, the processor 101 determines whether the aircraft is approaching the parking space. For instance, the processor 101 controls the LIDAR sensor 301 to scan the air space at the predefined interval. The scanning generates the lidar point cloud data comprising a plurality of 3D attributes and a plurality of 2D attributes of the parking space and the air space of the approaching aircraft. The processor 101 may determine that the aircraft is approaching the parking space based on the lidar point cloud data. The flow of the method 400 then proceeds to step 403.
[0062] At step 403, the processor 101 determines whether the aircraft is arriving towards the designated parking bay in the parking space designated for the aircraft. The processor 101 may determine whether the aircraft is arriving towards the designated parking bay based on the real time data acquired by the one or more sensors, using the AI/ML model 107. If it is determined that the aircraft is arriving towards the designated parking bay, the flow of the method 400 proceeds to step 405. If it is determined that the aircraft is not arriving towards the designated parking bay, the flow of the method 400 returns to step 401.
[0063] At step 405, the processor 101 determines the type of the aircraft using the AI/ML model 107 based on the aircraft information and the acquired real time data. The type of the aircraft is determined to be correct if the approaching aircraft matches the preconfigured aircraft information, in which case the flow of the method 400 proceeds to step 407. If the approaching aircraft does not match the preconfigured aircraft information, the processor 101 determines that the type of the aircraft is incorrect, and the flow of the method 400 proceeds to step 409.
[0064] At step 407, the processor 101 determines, using the AI/ML model 107, whether the aircraft is following the centre line marked on the designated parking bay based on the acquired real time data. If it is determined that the aircraft is deviating from the centre line, the flow of the method 400 proceeds to step 411. If it is determined that the aircraft is aligned with the centre line, the flow of the method 400 proceeds to step 413.
[0065] At step 409, the processor 101 aborts the docking procedure and the flow of the method 400 goes back to step 401.
[0066] At step 411, the processor 101 provides the guidance instructions to the aircraft to align the aircraft with the centre line. For example, the processor 101 controls the display unit 113 to display left/right alignment arrows for pilots to rectify the alignment of the aircraft. The flow of the method 400 now proceeds to step 413.
[0067] At step 413, the processor 101 determines whether the one or more obstacles are present on the designated parking bay based on the acquired real time data. If the one or more obstacles are present on the designated parking bay, the flow of the method 400 proceeds to step 409. Alternatively, the processor 101 may send a notification to display a “WAIT” or “STOP” message at the display unit 113. Once the one or more obstacles have been removed from the path, the processor 101 continues with the docking guidance, and the flow of the method 400 proceeds to step 415. Further, if it is determined that no obstacle is present on the designated parking bay, the flow of the method 400 directly proceeds to step 415.
[0068] At step 415, the processor 101 determines whether the aircraft has reached the aircraft stopping position. If it is determined that the aircraft has not reached the aircraft stopping position, the flow of the method 400 proceeds to step 417. If it is determined that the aircraft has reached the aircraft stopping position, the flow of the method 400 ends or restarts from step 401.
[0069] At step 417, the processor 101 continues to provide the guidance instructions to the aircraft till the aircraft has reached the aircraft stopping position. The guidance instructions may include azimuth guidance to the aircraft in real time till the docking procedure is completed based on at least one of the acquired real time data, the status of the passenger boarding bridge, the status of one or more operational equipment at the airport, or the parking time of the aircraft.
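As a non-limiting illustration, the decision flow of steps 401 through 417 may be sketched as a single per-scan-cycle function. The predicate arguments stand in for the determinations made by the AI/ML model 107; their names are assumptions, and the checks are ordered safety-first, which simplifies the branching of FIG. 4 into one returned action per cycle.

```python
# Illustrative per-cycle sketch of method 400 (names are assumptions). Each
# call evaluates one scan cycle and returns the action the SVDGS would take.
def guide_docking(approaching: bool, correct_type: bool, on_centre_line: bool,
                  obstacle_present: bool, at_stop_position: bool) -> str:
    if not approaching:            # steps 401/403: keep scanning for an arrival
        return "SCAN"
    if not correct_type:           # step 405 -> 409: wrong aircraft type
        return "ABORT"
    if obstacle_present:           # step 413: hold until the path is clear
        return "WAIT / STOP"
    if not on_centre_line:         # step 407 -> 411: azimuth correction needed
        return "SHOW ALIGNMENT ARROWS"
    if at_stop_position:           # step 415: docking complete
        return "STOP"
    return "CONTINUE GUIDANCE"     # step 417: guide until the stop position
```

For instance, an approaching aircraft of the correct type that drifts off the centre line on a clear bay yields “SHOW ALIGNMENT ARROWS”, matching the left/right arrow display of step 411.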
[0070] In an example, the module(s) and/or the unit(s) and/or model(s) may include a program, a subroutine, a portion of a program, a software component, or a hardware component capable of performing a stated task or function. As used herein, the module(s) and/or the unit(s) and/or model(s) may be implemented on a hardware component such as a server independently of other modules, or a module can exist with other modules on the same server, or within the same program. The module(s) and/or unit(s) and/or model(s) may be implemented on a hardware component such as a processor, including one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. The module(s) and/or unit(s) and/or model(s), when executed by the processor(s), may be configured to perform any of the described functionalities.
[0071] The method disclosed herein in one or more embodiments provides various technical benefits and advantages. The AI/ML based aircraft docking system provides real-time guidance to the pilot, helping to reduce the risk of collisions and damage to the aircraft. The disclosed system also incorporates data on weather conditions, traffic patterns, and other factors that could affect the aircraft's docking. Overall, the AI/ML based aircraft docking system provides enhanced safety, accuracy, and efficiency during the aircraft docking. The disclosed system provides a significant advancement in the aviation industry and may transform the way of parking and docking of the aircraft.
[0072] The various actions, acts, blocks, steps, or the like in the flow diagrams may be performed in the order presented, in a different order, or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, added, modified, skipped, or the like without departing from the scope of the invention.
[0073] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
[0074] While specific language has been used to describe the present subject matter, no limitations arising on account thereof are intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method to implement the inventive concept as taught herein. The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment.
[0075] The embodiments disclosed herein can be implemented using at least one hardware device and performing network management functions to control the elements.
[0076] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the scope of the embodiments as described herein.
CLAIMS:
1. A smart visual docking guiding system (SVDGS) (100) for providing advanced aircraft docking guidance assistance to an aircraft, the SVDGS (100) comprising:
a memory (103);
one or more sensors configured to acquire real time data associated with the aircraft approaching a parking space; and
a processor (101) communicatively coupled with the memory (103) and the one or more sensors, wherein the processor (101) is configured to:
determine, based on the acquired real time data using a Machine Learning (ML) model (107) stored in the memory (103), whether the aircraft is arriving towards a designated parking bay in the parking space designated for the aircraft;
determine, using the ML model (107), a type of the aircraft based on aircraft information, the acquired real time data, and a result of determination that the aircraft is arriving towards the designated parking bay;
determine, using the ML model (107), whether the aircraft is following a centre line marked on the designated parking bay based on the acquired real time data and a result of the determination that the type of the aircraft is correct; and
provide, based on a result of the determination that the aircraft is deviating from the centre line, guidance instructions to the aircraft to align the aircraft with the centre line.
2. The SVDGS (100) as claimed in claim 1, wherein the processor (101) is further configured to:
determine whether one or more obstacles are present on the designated parking bay based on the acquired real time data;
provide, based on a result of the determination that the one or more obstacles are present on the designated parking bay, the guidance instructions to the aircraft to abort docking procedure.
3. The SVDGS (100) as claimed in claim 2, wherein the processor (101) is further configured to create a safe zone area for the arriving aircraft using the aircraft information, wherein the safe zone area is created to identify the one or more obstacles on the designated parking bay to ensure safety of aircraft and ground staff.
4. The SVDGS (100) as claimed in claim 1, wherein the processor (101) is further configured to provide, using the ML model (107), accurate azimuth guidance to the aircraft in real time till docking procedure completes based on at least one of the acquired real time data, a status of passenger boarding bridge, status of one or more operational equipment at the airport, or a parking time of the aircraft.
5. The SVDGS (100) as claimed in claim 1, wherein
the one or more sensors include a Light Detection and Ranging (LIDAR) sensor (301) and a plurality of image sensors (303),
the real time data associated with the aircraft includes real time lidar point cloud data obtained from the LIDAR sensor (301) and image data obtained from the plurality of image sensors (303).
6. The SVDGS (100) as claimed in claim 1, wherein the ML model (107) is trained on large datasets of aircraft docking scenarios to accurately predict a position of the aircraft and movement of the aircraft, based on the acquired real time data.
7. The SVDGS (100) as claimed in claim 1, wherein, to determine whether the aircraft is arriving towards the designated parking bay, the processor (101) is configured to:
track turning of the aircraft from a taxiway towards the parking space;
identify the aircraft nose once the turning of the aircraft is detected to be complete;
determine a distance of the aircraft at the designated parking bay; and
determine that the aircraft is arriving towards the designated parking bay if the distance of the aircraft at the designated parking bay matches the acceptable preconfigured distance.
8. The SVDGS (100) as claimed in claim 1, further comprising a plurality of display units communicatively coupled with the processor (101) and installed in all parking bays to provide the guidance instructions to the aircraft.
9. The SVDGS (100) as claimed in claim 1, wherein the processor (101) is further configured to provide, based on a result of the determination that the type of the aircraft is incorrect, the guidance instructions to the aircraft to abort docking procedure.
10. The SVDGS (100) as claimed in claim 1, wherein the processor (101) is configured to monitor, using the ML model (107), parking performance of a pilot associated with the aircraft during docking procedure.
11. A method (400) for providing advanced aircraft docking guidance assistance to an aircraft, the method (400) comprising:
in a smart visual docking guiding system (SVDGS) (100) including a memory (103), one or more sensors, and a processor (101) communicatively coupled with the memory (103) and the one or more sensors:
determining (403), by the processor (101) using a Machine Learning (ML) model (107) stored in the memory (103), whether the aircraft is arriving towards a designated parking bay in a parking space designated for the aircraft, wherein the determination of whether the aircraft is arriving towards the designated parking bay is based on real time data associated with the aircraft acquired by the one or more sensors;
determining (405), by the processor (101) using the ML model (107), a type of the aircraft based on aircraft information, the acquired real time data, and a result of determination that the aircraft is arriving towards the designated parking bay;
determining (407), by the processor (101) using the ML model (107), whether the aircraft is following a centre line marked on the designated parking bay based on the acquired real time data and a result of the determination that the type of the aircraft is correct; and
providing (411), by the processor (101), guidance instructions to the aircraft to align the aircraft with the centre line based on a result of the determination that the aircraft is deviating from the centre line.
12. The method (400) as claimed in claim 11, further comprising:
determining, by the processor (101), whether one or more obstacles are present on the designated parking bay based on the acquired real time data;
providing, by the processor (101) based on a result of the determination that the one or more obstacles are present on the designated parking bay, the guidance instructions to the aircraft to abort docking procedure.
13. The method (400) as claimed in claim 12, further comprising:
creating, by the processor (101), a safe zone area for the arriving aircraft using the aircraft information, wherein the safe zone area is created to identify the one or more obstacles on the designated parking bay to ensure safety of aircraft and ground staff.
14. The method (400) as claimed in claim 11, further comprising:
providing, by the processor (101) using the ML model (107), accurate azimuth guidance to the aircraft in real time till docking procedure is completed based on at least one of the acquired real time data, a status of passenger boarding bridge, status of one or more operational equipment at the airport, or a parking time of the aircraft.
15. The method (400) as claimed in claim 11, wherein
the one or more sensors include a Light Detection and Ranging (LIDAR) sensor (301) and a plurality of image sensors (303),
the real time data associated with the aircraft includes real time lidar point cloud data obtained from the LIDAR sensor (301) and image data obtained from the plurality of image sensors (303).
16. The method (400) as claimed in claim 11, wherein, for determining whether the aircraft is arriving towards the designated parking bay, the method (400) comprises:
tracking turning of the aircraft from a taxiway towards the parking space;
identifying the aircraft nose once the turning of the aircraft is detected to be complete;
determining a distance of the aircraft at the designated parking bay; and
determining that the aircraft is arriving towards the designated parking bay if the distance of the aircraft at the designated parking bay matches the acceptable preconfigured distance.