
A Smart Advanced Visual Docking Guidance System

Abstract: The present disclosure relates to an SVDGS (100) for an airport having a plurality of parking bays. The SVDGS (100) includes a 3-dimensional (3D) sensor (102) adapted to scan an aircraft (500) in a space at a predefined interval. The 3D sensor (102) is adapted to generate 3D point cloud data. The 3D point cloud data comprises a plurality of 3D attributes and a plurality of 2-dimensional (2D) attributes in the parking space. The SVDGS (100) includes a control unit (104) equipped with an advanced Artificial Intelligence (AI) engine coupled with the 3D sensor (102) and configured to receive the 3D point cloud data from the 3D sensor (102). The control unit (104) is configured to process the 3D point cloud data to detect the aircraft (500) in the parking space covering all parking bays and a movement thereof, compare the detected aircraft (500) and the movement thereof with a designated parking bay, and generate a guidance instruction based on the comparison.


Patent Information

Application #
Filing Date
09 March 2023
Publication Number
37/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

INXEE SYSTEMS PRIVATE LIMITED
144 Udyog Vihar Phase 1, Sector 20, Gurgaon, 122 016, Haryana, India

Inventors

1. NUDURUPATI, Srinath
364 FF, Sector 23, Gurgaon 122017, Haryana, India

Specification

DESC:FIELD OF THE INVENTION

[0001] The present disclosure relates to an aircraft docking system. More particularly, the present disclosure relates to a Smart Visual Docking Guiding System (SVDGS) for the automation of airport operations and for monitoring pilot performance.

BACKGROUND

[0002] With the consistent rise of the economy, air travel is becoming an essential mode of transportation. To maintain flight schedules and operations with minimal delay, safe and timely automated docking/parking of aircraft is essential and desired. In addition, with the rising number of aircraft and increasing safety requirements, a proper system for monitoring aircraft pilot performance is also desired.

[0003] Currently, aircraft docking systems are limited to handling a single parking bay having a single center line. In addition, presently available docking guidance systems are not able to detect errors committed by pilots related to parking or during the parking of the aircraft. Moreover, any error monitoring is performed manually, and no processing or course correction is carried out for future improvement. This causes problems for other aircraft and pilots at the docking stations, and errors committed by a pilot go unnoticed.

SUMMARY

[0004] This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention nor is it intended for determining the scope of the invention.

[0005] The present disclosure relates to a smart visual docking guiding system (SVDGS) for an airport having a plurality of parking bays. The SVDGS includes a 3-dimensional (3D) sensor adapted to scan an aircraft in a space at a predefined interval. The 3D sensor may be a 3D semiconductor LiDAR sensor that is adapted to generate 3D point cloud data. The 3D point cloud data comprises a plurality of 3D attributes and a plurality of 2-dimensional (2D) attributes in the parking space. In addition, the SVDGS includes a control unit equipped with an advanced Artificial Intelligence (AI) engine coupled with the 3D sensor and configured to receive the 3D point cloud data from the 3D sensor. The control unit is configured to process the 3D point cloud data to detect an aircraft in the parking space covering all parking bays and a movement thereof, compare the detected aircraft and the movement thereof with a designated parking bay based on the detection of the aircraft and the movement, and generate a guidance instruction based on the comparison.

[0006] The present disclosure also relates to a method of parking an aircraft in an airport having a plurality of parking bays. The method includes scanning an aircraft space at a predefined interval by a 3-dimensional (3D) sensor of a smart visual docking guiding system (SVDGS) to generate 3D point cloud data. The 3D point cloud data comprises a plurality of 3D attributes and a plurality of 2-dimensional (2D) attributes in the parking space. In addition, the method includes receiving the 3D point cloud data from the 3D sensor by a control unit communicatively coupled with the 3D sensor. The control unit is configured to process the 3D point cloud data using advanced artificial intelligence (AI) techniques to detect the aircraft in the parking bay and a movement thereof, compare the detected aircraft and the movement with a designated parking bay, and generate a guidance instruction based on the comparison. Further, the method includes displaying the guidance instruction to the aircraft by a plurality of display units communicatively coupled with the control unit and installed in each of the parking bays. The 3D sensor may be a 3D semiconductor LiDAR sensor adapted to generate the 3D point cloud data.

[0007] The SVDGS of the present disclosure precisely monitors the position of the aircraft with reference to the elements of the airport and facilitates the docking of one or more aircraft at their designated docking stations simultaneously. The SVDGS provides real-time feedback on the pilot's performance to airport operators and to the pilot. In addition, the parking time of the aircraft is reduced, thereby reducing the overall operating cost of the airport. Further, the SVDGS is universal and automatic, requires no human intervention, and may be incorporated/installed at any airport irrespective of jurisdiction, thereby reducing the docking time and eliminating the need for skilled personnel. In this manner, the SVDGS reduces the overall cost of the system for parking assistance and provides feedback to airport operators and pilots in case of any errors committed by the pilots.

[0008] To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[0009] These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:

[0010] Figure 1 illustrates an exemplary environment including a Smart Visual Docking Guiding System (SVDGS) with a plurality of aircraft parked and approaching the airport, in accordance with an embodiment of the present disclosure;

[0011] Figure 2 illustrates the SVDGS system installed at environment of the airport with the aircraft being parked in respective parking bays, in accordance with an embodiment of the present disclosure;

[0012] Figure 3 illustrates a block diagram of the SVDGS, in accordance with an embodiment of the present disclosure;

[0013] Figure 4 illustrates a method for parking, monitoring, and docking the aircraft by the SVDGS, in accordance with an embodiment of the present disclosure;

[0014] Figure 5 illustrates a flow chart depicting an embodiment of a process of operation of the SVDGS for parking the aircraft, in accordance with an embodiment of the present disclosure; and

[0015] Figure 6 illustrates a flow chart depicting an embodiment of a process of operation of the SVDGS for informing the airport operators of the errors committed by pilots that occurred during parking, in accordance with an embodiment of the present disclosure.

[0016] Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale.

[0017] Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.

DETAILED DESCRIPTION OF FIGURES

[0018] For the purpose of promoting an understanding of the principles of the present disclosure, reference will now be made to the various embodiments and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the present disclosure is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the present disclosure as illustrated therein being contemplated as would normally occur to one skilled in the art to which the present disclosure relates.

[0019] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are explanatory of the present disclosure and are not intended to be restrictive thereof.

[0020] Whether or not a certain feature or element is limited to being used only once, it may still be referred to as “one or more features” or “one or more elements” or “at least one feature” or “at least one element.” Furthermore, the use of the terms “one or more” or “at least one” feature or element does not preclude there being none of that feature or element, unless otherwise specified by limiting language including, but not limited to, “there needs to be one or more…” or “one or more elements is required.”

[0021] Reference is made herein to some “embodiments.” It should be understood that an embodiment is an example of a possible implementation of any features and/or elements of the present disclosure. Some embodiments have been described for the purpose of explaining one or more of the potential ways in which the specific features and/or elements of the proposed disclosure fulfil the requirements of uniqueness, utility, and non-obviousness.

[0022] Use of the phrases and/or terms including, but not limited to, “a first embodiment,” “a further embodiment,” “an alternate embodiment,” “one embodiment,” “an embodiment,” “multiple embodiments,” “some embodiments,” “other embodiments,” “further embodiment”, “furthermore embodiment”, “additional embodiment” or other variants thereof do not necessarily refer to the same embodiments. Unless otherwise specified, one or more particular features and/or elements described in connection with one or more embodiments may be found in one embodiment, or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments. Although one or more features and/or elements may be described herein in the context of only a single embodiment, or in the context of more than one embodiment, or in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all. Conversely, any features and/or elements described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.

[0023] Any particular and all details set forth herein are used in the context of some embodiments and therefore should not necessarily be taken as limiting factors to the proposed disclosure.

[0024] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by “comprises... a” does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.

[0025] Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.

[0026] Referring to Figure 1 and Figure 2, a Smart Visual Docking Guiding System (SVDGS) 100 for a parking space of an airport is shown. The SVDGS 100 may assist pilots during the docking/undocking process to safely maneuver a plurality of aircraft 500 into the designated parking positions at the airports. The SVDGS 100 may assist in parking multiple aircraft 500 in multiple parking bays simultaneously. In addition, the SVDGS 100 facilitates notifying the airport operators and the pilot about potential errors committed by the pilots while parking the aircraft 500 in the respective parking bays.

[0027] Specifically, Figure 1 illustrates an exemplary environment in an airport installed with the SVDGS 100 with the aircraft 500 parked and approaching the airport, in accordance with the embodiment of the present disclosure. Figure 2 illustrates the SVDGS 100 system installed at the environment of the airport with the aircraft 500 being parked in respective parking bays, in accordance with the embodiment of the present disclosure.

[0028] The SVDGS 100 may be adapted to be installed in the airport to facilitate the parking of the aircraft 500 and inform the pilots and other operators about any errors committed by the pilots while parking their aircraft 500. In addition, the SVDGS 100 may be deployed for facilitating an automatic docking of a plurality of aircraft 500 at a plurality of parking bays.

[0029] The SVDGS 100 may be adapted to be configured with a plurality of elements of the airport. The plurality of elements may include, but is not limited to, a plurality of runways, a plurality of parking bays, a plurality of lines in the parking bays such as a plurality of center lines, a plurality of stop lines, stopping points of the stop lines, an overrun portion, an underrun portion, the aircraft, and an apron area of the airport. Further, the SVDGS 100 is adapted to capture approach data of the aircraft 500, its approach speed toward the stopping position, and the approach angle of the aircraft 500.

[0030] As shown in Figure 1 and Figure 2, the SVDGS 100 includes a 3-dimensional (3D) sensor adapted to scan an apron region of the airport at a predefined interval. The 3D sensor may be a 3D semiconductor LiDAR sensor to facilitate the capturing of 3D point cloud data of the apron region. The scanning of the aircraft space facilitates the generation of 3D point cloud data of the aircraft space. The 3D point cloud data generated via the 3D sensor includes a plurality of 3D attributes and a plurality of 2-dimensional (2D) attributes in a parking bay of the airport.

[0031] In an embodiment, the 3D sensors 102 may include, but are not limited to, a 3D solid-state semiconductor Light Detection and Ranging (LiDAR) sensor. The field of view of the 3D LiDAR sensor covers the complete parking space of the airport. The 3D LiDAR sensor is adapted to generate 3D point cloud data associated with the apron area of the airport. In addition, the 3D LiDAR sensor may facilitate the generation of the data on a real-time basis. As an example, the 3D LiDAR sensor facilitates a 360-degree horizontal view by performing multiple scans of the apron area per second.
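As an illustration of the kind of data such a scan produces, the following Python sketch models one LiDAR scan as a set of (x, y, z) points and filters the points falling inside a single parking bay's footprint. The type names, coordinates, and bay bounds are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PointCloud:
    """One LiDAR scan of the apron: a list of (x, y, z) points in metres."""
    points: List[Tuple[float, float, float]]

def points_in_bay(cloud, x_range, y_range):
    """Keep only the points that fall inside a rectangular bay footprint."""
    (x0, x1), (y0, y1) = x_range, y_range
    return [(x, y, z) for (x, y, z) in cloud.points
            if x0 <= x <= x1 and y0 <= y <= y1]

# A toy scan: two points inside a 10 m x 5 m bay, one far outside it
scan = PointCloud(points=[(5.0, 2.0, 1.2), (40.0, 3.0, 2.5), (6.5, 2.5, 0.9)])
bay_points = points_in_bay(scan, x_range=(0.0, 10.0), y_range=(0.0, 5.0))
```

In a deployed system, each scan would carry many thousands of points per second; the same per-bay filtering step lets one sensor serve all parking bays in its field of view.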

[0032] In another embodiment, the SVDGS 100 may include a plurality of 3D LiDAR sensors to facilitate a more detailed view of the airport and the airspace. In an embodiment, the plurality of sensors 102 may include a video camera (not shown) to facilitate the generation of video data for real-time tracking of the apron area of the airport and the airspace.

[0033] In another embodiment, the plurality of 3D sensors 102 may include one or more LiDAR sensors, one or more thermal sensors, or one or more camera sensors. The one or more LiDAR sensors, the one or more thermal sensors, and the one or more camera sensors are adapted to generate data associated with an orientation of the aircraft 500 in the space and on the ground. The orientation of the aircraft 500 may include the direction of travel of the aircraft 500 on the apron, the offset of the aircraft 500 from the center line, and the positioning relative to the stop line.

[0034] In addition, the data associated with the orientation of the aircraft 500 may be captured in real-time. As an example, the data associated with the orientation of the aircraft 500 may include the data associated with all three axes of the aircraft 500, their positioning and orientation, and the point cloud data of the aircraft 500 while approaching the parking bay.

[0035] Further, the SVDGS 100 may include a control unit 104 coupled with the 3D sensors 102, such as but not limited to the 3D LiDAR sensor, the video camera, the one or more thermal sensors, and the one or more camera sensors. The control unit 104 is configured to receive the 3D point cloud data from the 3D sensors 102 via a network to facilitate the interconnection of the control unit 104 and the 3D sensors 102. In an example, the network may be a wired network or a wireless network. The network may include, but is not limited to, a mobile network, a broadband network, a Wide Area Network (WAN), a Local Area Network (LAN), and a Personal Area Network.

[0036] In an embodiment, the control unit 104 may be a single processing unit or a number of units, all of which may include multiple computing units. In another embodiment, the control unit 104 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, Arduino, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the control unit 104 is configured to fetch and execute computer-readable instructions and data stored in a memory.

[0037] The control unit 104 of the SVDGS 100 is configured to process the 3D point cloud data to detect the aircraft 500 in the parking space covering all parking bays and movement thereof. In addition, the control unit 104 of the SVDGS 100 is configured to compare the detected aircraft 500 and the movement thereof with a designated parking bay based on the detection of the aircraft 500 and the movement of the aircraft 500. Further, the control unit 104 of the SVDGS 100 is configured to generate a guidance instruction based on the comparison. The control unit 104 of the SVDGS 100 is configured with an advanced Artificial Intelligence (AI) engine to facilitate the automation of the processes carried out by the SVDGS 100.

[0038] In an embodiment, the 3D point cloud data generated from the 3D sensors 102, such as but not limited to the 3D LiDAR sensor, the video camera, one or more laser sensors, the one or more thermal sensors, and the one or more camera sensors, is visualized using the advanced AI engine of the control unit 104. The control unit 104 also visualizes the calibrations and the markings related to the position of the center line, the multiple positions of the stop line, different aircraft types and sizes, the position of the parking bay, the data of the aircraft taxiing into the parking bay, the time of the aircraft approaching the parking bay, and the data of the parking bay assigned to the aircraft 500.

[0039] Further, the control unit 104 of the SVDGS 100 may be adapted to store the plurality of data or the point cloud data generated from the 3D sensors 102. In an example, the control unit 104 of the SVDGS 100 may include a data storage unit or a data repository to facilitate the storage of the data generated and received from at least one of the 3D sensors 102. In addition, the data generated from the plurality of the 3D sensors 102 may be fetched into the control unit 104. In an embodiment, the control unit 104 may include predefined data related to the airport dimensions and spatial data of the airport to determine the relative positioning of the aircraft 500 with reference to the predefined data sets.

[0040] In an embodiment, the control unit 104 is configured to store data related to a physical profile and a dimension of different types of the aircraft 500, trace a position of the center line for the approaching aircraft type via the prestored data in the control unit 104, and store a predetermined position of the stop line of the approaching aircraft 500. In addition, the control unit 104 is configured to detect the front wheel of the aircraft 500 via the point cloud data generated from the 3D sensor and compare a position of the front wheel of the aircraft 500 relative to the center line of the designated parking bay to determine an offset of the aircraft 500.
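The front-wheel offset described above amounts to a signed point-to-line distance. A minimal sketch, assuming the center line is given by two endpoints on the apron plane (the function name and sign convention are illustrative, not from the disclosure):

```python
import math

def centerline_offset(wheel_xy, line_start, line_end):
    """Signed perpendicular distance (m) of the nose wheel from the center
    line; positive means left of the taxi direction line_start -> line_end."""
    (px, py), (ax, ay), (bx, by) = wheel_xy, line_start, line_end
    dx, dy = bx - ax, by - ay
    # 2-D cross product of the line direction with the wheel vector,
    # normalised by the line length, gives the signed distance.
    return (dx * (py - ay) - dy * (px - ax)) / math.hypot(dx, dy)
```

For a center line running along the x-axis, a wheel at (5, 1) yields an offset of 1 m to the left; the sign tells the guidance logic which side-correction to display.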

[0041] In an embodiment, the SVDGS 100 is adapted to capture the 3D coordinates of the aircraft 500 approaching the parking space. The capturing of the coordinates may be performed via object detection using the 3D sensor 102, such as the 3D semiconductor-based LiDAR sensors.

[0042] In another embodiment, the control unit 104 is configured to compare the approach speed of the aircraft 500 with a prestored threshold value of the approach speed in the control unit 104 and detect the offset of the aircraft 500 from the center line to detect misalignment of the aircraft 500. In this manner, the control unit 104 detects the misalignment of the aircraft 500 and determines the relative positioning of the aircraft 500 with reference to the center line and the sideline on the parking bay. In addition, if the approach speed is greater than the prestored threshold value of the approach speed, the SVDGS 100 may send a notification to the operator and the pilot to reduce the speed.
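The approach-speed comparison above can be sketched from two consecutive position fixes. In the toy function below, the names, units, and threshold are assumptions for illustration; the disclosure does not specify the estimation method.

```python
import math

def check_approach_speed(prev_pos, curr_pos, dt_s, threshold_mps):
    """Estimate ground speed from two consecutive nose-wheel fixes (metres)
    taken dt_s seconds apart, and flag a breach of the prestored threshold."""
    dist = math.hypot(curr_pos[0] - prev_pos[0], curr_pos[1] - prev_pos[1])
    speed = dist / dt_s
    # A True flag would trigger the slow-down notification to the
    # operator and the pilot described in the text.
    return speed, speed > threshold_mps
```

A real system would smooth the estimate over several scans to reject LiDAR noise before raising a notification.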

[0043] In an embodiment, the control unit 104 of the SVDGS 100 is configured to process the 3D point cloud data from the 3D sensors 102 via an image processing technique by a trained artificial intelligence model. The control unit 104 facilitates determining the type of the aircraft 500, the entry of the aircraft 500 into the parking space, an approach speed of the aircraft 500, the offset of the aircraft 500 from the center line to detect a misalignment, and an exit of the aircraft 500 from the parking bay via analyzing the 3D point cloud data from the 3D sensors 102. The 3D coordinates of the approaching aircraft 500 are compared with the coordinates of previously detected attributes, such as the center line and the stop line, to determine the relative positioning of the aircraft 500.

[0044] Furthermore, the control unit 104 may include software or a set of protocols to facilitate the monitoring of at least one of the elements of the airport. In an embodiment, the software or the set of protocols may include Artificial Intelligence (AI) and Machine Learning (ML) modules to facilitate the automation of the monitoring process. In addition, the software or the set of protocols may facilitate the parallel monitoring of the plurality of center lines, the plurality of stop lines, the aircraft 500, the stopping points, the overrun portion of the aircraft 500, the approach speed and the approach angle of the aircraft 500, and other elements of the airport.

[0045] In addition, the control unit 104 may facilitate the generation of a tentative path for the aircraft 500 to be parked at a designated docking station, and determine the errors committed by the pilot while parking the aircraft 500 at the designated docking station. The tentative path is determined based on the 3D coordinates of the designated space at the parking bay for the approaching aircraft 500 and the 3D coordinates extracted from the aircraft 500 while the aircraft 500 is moving. The path is determined, and any change in relative positioning is communicated to the pilot.
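A tentative path of the kind described can be sketched, in its simplest form, as evenly spaced waypoints between the aircraft's current position and the designated stop point. This straight-line 2-D version is a deliberate simplification of the 3-D path in the text; names and the waypoint count are illustrative assumptions.

```python
def tentative_path(start, stop, n_segments=5):
    """Straight-line waypoints from the aircraft's current position to the
    designated stop point -- a 2-D simplification of the tentative path."""
    (x0, y0), (x1, y1) = start, stop
    return [(x0 + (x1 - x0) * i / n_segments,
             y0 + (y1 - y0) * i / n_segments)
            for i in range(n_segments + 1)]
```

At each new scan, the aircraft's extracted coordinates would be compared against the nearest waypoint, and any growing deviation reported to the pilot as a course correction.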

[0046] Further, the SVDGS 100 includes a plurality of display units 106 communicatively coupled with the control unit 104 and installed in the respective parking bays to provide the visual guidance instructions to the aircraft 500 and the pilots within the aircraft 500. The display units 106 may be operatively coupled with the control unit 104 and adapted to receive a plurality of signals from the control unit 104.

[0047] In addition, the display units 106 may display a plurality of signals to assist the pilots while guiding and maneuvering the aircraft 500 for parking at the designated parking bay. In an example, the plurality of signals may include signals for a stopping-point overrun, a stopping-point underrun, over-speeding of the aircraft 500, under-speeding of the aircraft 500, an unauthorized approach speed of the aircraft 500, a right-side overrun of the aircraft 500, a left-side overrun of the aircraft 500, and other unauthorized parking errors committed by the pilot of the aircraft 500.
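The signals listed above can be modeled as an enumeration, with a small dispatcher mapping a signed center-line offset to a side-overrun signal. The enum member names, the positive-means-left convention, and the 0.5 m tolerance are illustrative assumptions, not part of the disclosure.

```python
from enum import Enum, auto

class GuidanceSignal(Enum):
    """Display-unit signals enumerated in the description."""
    STOP_POINT_OVERRUN = auto()
    STOP_POINT_UNDERRUN = auto()
    OVER_SPEEDING = auto()
    UNDER_SPEEDING = auto()
    UNAUTHORIZED_APPROACH_SPEED = auto()
    RIGHT_SIDE_OVERRUN = auto()
    LEFT_SIDE_OVERRUN = auto()

def lateral_signal(offset_m, tolerance_m=0.5):
    """Map a signed center-line offset (positive = left) to a side-overrun
    signal, or None when the aircraft is within tolerance."""
    if offset_m > tolerance_m:
        return GuidanceSignal.LEFT_SIDE_OVERRUN
    if offset_m < -tolerance_m:
        return GuidanceSignal.RIGHT_SIDE_OVERRUN
    return None
```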

[0048] The control unit 104 of the SVDGS 100 is further configured to provide the guidance instruction, indicating how to correct the offset of the aircraft 500, to the pilots via the display unit 106 by displaying the predetermined signals. In addition, the control unit 104 of the SVDGS 100 is configured to determine the corrective actions performed by the pilot in response to the signals displayed on the display unit 106 and to generate, within the control unit 104, a rating indicating the pilot's performance.
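One way such a performance rating could be derived is from how many of the flagged errors the pilot actually corrected. The scheme below is a toy assumption for illustration; the disclosure does not specify a rating formula.

```python
def rate_pilot(errors_flagged, errors_corrected, max_rating=5.0):
    """Toy rating: full marks when no errors were flagged, otherwise scaled
    by the fraction of flagged errors the pilot corrected (assumed scheme)."""
    if errors_flagged == 0:
        return max_rating
    corrected_fraction = errors_corrected / errors_flagged
    # Floor of half marks for responding at all; rises to full marks
    # as every flagged error gets corrected.
    return round(max_rating * (0.5 + 0.5 * corrected_fraction), 2)
```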

[0049] Referring to Figure 3, a block diagram of the SVDGS 100 is shown, in accordance with the embodiment of the present disclosure. The SVDGS 100 may include the control unit 104, which may further include a processor/controller 204, a memory 206, and module(s) 208. The memory 206, in one example, may store the instructions to carry out the operations of the modules 208. The modules 208 and the memory 206 may be coupled to the processor 204.

[0050] The processor 204 can be a single processing unit or several units, all of which could include multiple computing units. The processor 204 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 204 is configured to fetch and execute computer-readable instructions and data stored in the memory 206. The processor 204 may include one or a plurality of processors. The one or more processors may be a general-purpose processor, such as a central processing unit (CPU), an application processor (AP), or the like; a graphics-only processing unit such as a graphics processing unit (GPU) or a visual processing unit (VPU); and/or an AI-dedicated processor such as a neural processing unit (NPU). The one or more processors control the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory. The predefined operating rule or machine-learning model is provided through training or learning.

[0051] The memory 206 may include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.

[0052] The modules 208, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement data types. The modules 208 may also be implemented as, signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.

[0053] Further, the modules 208 can be implemented in hardware, instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, such as the processor 204, a state machine, a logic array, or any other suitable devices capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the required tasks or, the processing unit can be dedicated to performing the required functions. In another embodiment of the present disclosure, the modules 208 may be machine-readable instructions (software) which, when executed by the processor 204/processing unit, perform any of the described functionalities. Further, the data serves, amongst other things, as a repository for storing data processed, received, and generated by one or more of the modules 208.

[0054] The modules 208 may perform different functionalities, which may include: an analysing module 210 for analysing the speed, direction, and coordinates of the aircraft 500; a comparing module 212 for comparing the positioning of the aircraft 500 with reference to the other elements of the airport and with the prestored data of the aircraft 500 characteristics; and a tracking module 214 to enable the tracking of the positioning of the aircraft 500.

[0055] The present disclosure also relates to a method 300 for parking, monitoring, and docking of the aircraft 500 by the SVDGS 100 as shown in Figure 4. The order in which the method steps are described below is not intended to be construed as a limitation, and any number of the described method steps can be combined in any appropriate order to execute the method or an alternative method. Additionally, individual steps may be deleted from the method without departing from the spirit and scope of the subject matter described herein.

[0056] The method 300 can be performed by programmed computing devices, for example, based on instructions retrieved from non-transitory computer-readable media. The computer-readable media can include machine-executable or computer-executable instructions to perform all or portions of the described method. The computer-readable media may be, for example, digital memories; magnetic storage media, such as magnetic disks and magnetic tapes; hard drives; or optically readable data storage media.

[0057] Referring to Figure 4, a method 300 for parking, monitoring, and docking of the aircraft 500 by the SVDGS 100 is shown. At step 302, scanning an aircraft space in the predefined interval by the 3D sensor 102 of the SVDGS 100 is performed. The scanning facilitates the generation of the 3D point cloud data from the 3D sensors 102. The 3D point cloud data comprises a plurality of 3D attributes and a plurality of 2D attributes in the parking space and the air space of the approaching aircraft 500.

[0058] In an embodiment, the 3D point cloud data includes the position of the center line, the position of the stop line, the position of the parking bay, the data of the aircraft 500 taxiing into the parking bay, the time of the aircraft 500 approaching the parking bay, and the data of the parking bay assigned to the aircraft 500. In addition, the data may include the real-time data associated with the center lines, the stop lines, and the approach characteristics of the aircraft 500, including its speed and distance from its stopping position.

[0059] At step 304, receiving the 3D point cloud data from the 3D sensors 102 by the control unit 104, which is communicatively coupled with the 3D sensors 102, is performed. For receiving the data from the 3D sensors 102, a network may be used to enable the communication between the control unit 104 and the 3D sensors 102. In an example, the network may be a wired network or a wireless network. The network may include, but is not limited to, a mobile network, a broadband network, a Wide Area Network (WAN), a Local Area Network (LAN), and a Personal Area Network (PAN).

[0060] At step 306, the control unit 104 processes the 3D point cloud data generated from the 3D sensors 102 to detect the aircraft 500 in the parking space and a movement thereof. The control unit 104 may include a software or a set of protocols to facilitate the processing of the 3D point cloud data generated from the 3D sensors 102 and to facilitate the monitoring of the elements of the airport. In an embodiment, the software or the set of protocols may include AI-ML modules to facilitate the automation of the monitoring process. In addition, the software or the set of protocols may facilitate the parallel monitoring of the plurality of center lines, the plurality of stop lines, and the approach speed of the aircraft 500.
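
Detecting an aircraft in point cloud data is commonly approached by grouping nearby points into clusters, with a sufficiently large cluster in the parking volume treated as a candidate aircraft. The sketch below is a naive O(n^2) Euclidean clustering, shown only to illustrate the idea; the specification does not disclose a particular detection algorithm, and the radius and size thresholds are assumed.

```python
from collections import deque

def cluster_points(points, radius=1.5, min_size=5):
    """Group 3D points into clusters of mutually reachable neighbors.
    A large cluster in the parking volume is a candidate aircraft detection.
    Naive breadth-first Euclidean clustering, for illustration only."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            # Find all still-unvisited points within `radius` of point i
            near = [j for j in unvisited
                    if sum((points[i][k] - points[j][k]) ** 2
                           for k in range(3)) <= radius ** 2]
            for j in near:
                unvisited.remove(j)
                queue.append(j)
                cluster.append(j)
        if len(cluster) >= min_size:   # discard sparse noise
            clusters.append(cluster)
    return clusters
```

A production system would instead use an optimized spatial index (e.g. a k-d tree) over the LIDAR frame, but the grouping principle is the same.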

[0061] At step 308, the detected aircraft 500 and the movement thereof are compared with a designated parking bay based on the detection of the aircraft 500 and the movement. This is done to recommend a pathway along the one or more center lines and the stop lines. The recommendation process is performed by the control unit 104. The recommendation and the comparison of the aircraft 500 with the predefined instructions may be performed by implementing AI-ML modules. In an embodiment, the comparing and the recommending of the pathway to the aircraft 500 is as per the predetermined location of the aircraft 500 in the parking bay. In addition, after the comparison, the control unit 104 calculates the real-time errors while the aircraft 500 is parked in the parking bay.
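
The real-time error calculation against the designated bay can be reduced to two numbers: lateral offset from the center line and longitudinal distance past (or short of) the stop line. The sketch below assumes a bay-local coordinate frame and a detected nose-wheel position; both are illustrative assumptions, not details from the specification.

```python
def docking_errors(nose_wheel_xy, center_line_x, stop_line_y):
    """Illustrative docking-error calculation, assuming a bay-local frame
    where x is lateral position across the center line and y is distance
    along it. All names and conventions are assumptions for this sketch."""
    x, y = nose_wheel_xy
    lateral = x - center_line_x        # positive = right of the center line
    longitudinal = y - stop_line_y     # positive = past the stop line
    return {
        "lateral_offset_m": round(lateral, 2),
        "overrun_m": round(max(longitudinal, 0.0), 2),
        "underrun_m": round(max(-longitudinal, 0.0), 2),
    }
```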

[0062] At step 310, any corrective actions performed by the pilot are detected. The control unit 104 may facilitate recommending and generating a pathway along the one or more center lines and the stop lines to the pilot via the display unit 106. The recommendation process is performed by the control unit 104 by sending the plurality of instructions to the display unit 106.

[0063] At step 312, displaying the determined guidance instruction by the control unit 104 via the display unit 106 is performed. On the display unit 106, the plurality of signals informs the pilot to maneuver the aircraft 500 to park it at the designated parking bay. In an example, the plurality of signals may include a signal for a stopping point overrun, a stopping point underrun, over speeding of the aircraft 500, under speeding of the aircraft 500, an unauthorized approach speed of the aircraft 500, a right side overrun of the aircraft 500, a left side overrun of the aircraft 500, and other unauthorized parking errors committed by the pilot of the aircraft 500.
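
The mapping from measured errors to the displayed signals listed above can be sketched as a simple priority check. The threshold values and signal names below are assumptions for illustration; the specification does not state concrete limits.

```python
def guidance_signal(speed_mps, longitudinal_m, lateral_m,
                    max_speed_mps=5.0, stop_tol_m=0.5, lat_tol_m=0.3):
    """Map measured errors to one of the display signals of [0063].
    Thresholds and signal names are illustrative assumptions.
    longitudinal_m: + past the stop line, - short of it.
    lateral_m: + right of the center line, - left of it."""
    if speed_mps > max_speed_mps:
        return "OVER_SPEED"
    if longitudinal_m > stop_tol_m:
        return "STOP_POINT_OVERRUN"
    if longitudinal_m < -stop_tol_m:
        return "STOP_POINT_UNDERRUN"
    if lateral_m > lat_tol_m:
        return "RIGHT_SIDE_OVERRUN"
    if lateral_m < -lat_tol_m:
        return "LEFT_SIDE_OVERRUN"
    return "OK"
```

Checking speed first reflects that an unsafe approach speed should override positional advisories; a real system would likely display several signals at once rather than one.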

[0064] Figure 5 illustrates a flow chart depicting an embodiment of a process 400 of operation of the SVDGS 100 for parking of the aircraft 500, in accordance with the embodiment of the present disclosure. At step 402, the process 400 is initiated from a standby or a stop stage.

[0065] At step 404, the process 400 includes sensing the arrival of the aircraft 500 by the SVDGS 100. In an embodiment, the arrival of the aircraft 500 is sensed by the 3D sensors 102 of the SVDGS 100. In case the aircraft 500 is not arriving, the process returns to step 402. If the aircraft 500 is arriving, the process 400 may move to step 406.

[0066] At step 406, checking of the multiple parking bays is performed. For checking the multiple bay support, the plurality of parking bays is scanned and the presence of any aircraft 500 is confirmed by the control unit 104 via the data generated from the 3D sensors 102. If a single aircraft 500 is sensed and the availability of the parking bay is confirmed by the control unit 104, the process 400 moves to step 408.

[0067] At step 408, the parking of the aircraft 500 is performed by sending signals to the display unit 106 by the control unit 104. The display unit 106 may guide the pilot and the aircraft 500 to the designated parking bay under the control of the control unit 104. However, if more than one aircraft 500 is sensed, the process moves to step 410.

[0068] At step 410, sensing of the aircraft 500 is performed. In addition, the positioning of the aircraft 500 is determined by the control unit 104 via the data obtained from the 3D sensors 102. Further, if a single aircraft 500 is sensed, or a last aircraft 500 is left after the parking of the remaining aircraft 500, the process moves to step 408 to facilitate the normal parking of the aircraft 500. However, if more than one aircraft 500 is sensed, the process moves to step 412.

[0069] At step 412, scanning of the airport area is performed by the 3D sensor 102. The data generated from the 3D sensor 102 is stored within the control unit 104. In addition, the scanning of the plurality of center lines and the stop lines is performed in parallel, and a route is determined by the control unit 104 for the parking of the aircraft 500. The determined route is displayed on the display unit 106 to facilitate guiding the pilots to their parking bays. After the parking, the process 400 moves to step 414.
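
When several aircraft must be routed at once, the route determination of step 412 implies some assignment of aircraft to free bays. As a stand-in, the sketch below greedily assigns each aircraft to its nearest free bay; this simplification, along with all identifiers, is an assumption for illustration, since the specification does not disclose the routing algorithm.

```python
def assign_bays(aircraft_positions, free_bays):
    """Greedy nearest-free-bay assignment, a simple illustrative stand-in
    for the multi-aircraft route determination of step 412.
    aircraft_positions: {aircraft_id: (x, y)}, free_bays: {bay_id: (x, y)}."""
    remaining = dict(free_bays)          # bays not yet assigned
    assignment = {}
    for ac_id, (ax, ay) in sorted(aircraft_positions.items()):
        if not remaining:
            break                        # more aircraft than free bays
        # Pick the free bay with the smallest squared distance to this aircraft
        bay_id = min(remaining,
                     key=lambda b: (remaining[b][0] - ax) ** 2
                                   + (remaining[b][1] - ay) ** 2)
        assignment[ac_id] = bay_id
        del remaining[bay_id]
    return assignment
```

A real system would also honor bay/aircraft-type compatibility and taxiway conflicts, which a greedy distance rule ignores.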

[0070] At step 414, the control unit 104 scans all the aircraft 500 parked in the parking bays along their respective stop lines and center lines. In this manner, at step 414, the process 400 of docking is completed and the aircraft 500 are parked in the docking stations. The process 400 may move back to the starting step 402 after the termination of step 414.

[0071] Figure 6 illustrates a flow chart depicting an embodiment of a process 600 of operation of the SVDGS 100 for informing the airport operator and the pilot of the errors that occurred during parking, in accordance with the embodiment of the present disclosure.

[0072] At step 504, the process 600 includes sensing the arrival of the aircraft 500 by the SVDGS 100. In case the aircraft 500 does not arrive, the process returns to step 502. If the aircraft 500 is arriving, the process 600 may move to step 506.

[0073] At step 506, checking the angle of approach of the aircraft 500 is performed by the SVDGS 100. For checking the angle of approach, the aircraft 500 approaching the parking bay is scanned. The angle of approach of the aircraft 500 is fed into the control unit 104 for calculating the errors. If the approach angle of the aircraft 500 is determined to be correct by the control unit 104, the process 600 moves to step 508. However, if the approach angle of the aircraft 500 is determined to have errors, the process 600 moves to step 512.

[0074] At step 508, the positioning of the aircraft 500 is sensed by the 3D sensors 102. The sensing is performed after the stopping of the aircraft 500. The sensed data includes the positioning of the aircraft 500 with reference to the stop line and the center lines, the overrun and underrun portions of the aircraft 500, the right overrun, the right underrun, the left overrun, the left underrun, and other parameters of the aircraft 500. The data is fed into the control unit 104 and the calculation is performed. If the control unit 104 determines the data to be correct, the process 600 moves to step 514. If the control unit 104 determines some error in the data, the process 600 moves to step 512.

[0075] At step 512, the errors diagnosed by the control unit 104 are compiled and a report is generated. The report includes all the errors committed by the pilot while parking the aircraft 500, including data related to the positioning of the aircraft 500 with reference to the stop line and the center lines, the overrun and underrun portions of the aircraft 500, the right overrun, the right underrun, the left overrun, the left underrun, and other orientation parameters of the aircraft 500, along with the approach speed of the aircraft 500 and whether it was below the maximum allowed approach speed. Further, the process moves to step 514.
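
The compiled report of step 512 can be pictured as a small structured summary of nonzero errors plus the speed check. The field names and speed limit below are illustrative assumptions only.

```python
def parking_report(bay_id, errors, approach_speed_mps, max_speed_mps=5.0):
    """Compile an illustrative pilot-error report per step 512.
    `errors` maps error names (e.g. "lateral_offset_m") to measured values;
    all names, fields, and the speed limit are assumptions for this sketch."""
    # Keep only the errors the pilot actually committed (nonzero magnitude)
    committed = {name: val for name, val in errors.items() if abs(val) > 0.0}
    return {
        "bay": bay_id,
        "errors": committed,
        "approach_speed_mps": approach_speed_mps,
        "within_speed_limit": approach_speed_mps <= max_speed_mps,
        "clean_parking": not committed,
    }
```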

[0076] At step 514, the docking/parking of the aircraft 500 is performed. In addition, the aircraft 500 are parked at their respective stations, and the process 600 moves to step 510.

[0077] At step 510, a final check is run by the control unit 104 to verify all the parameters of the docking/parking of the aircraft 500. If all the parameters of the aircraft 500 are calculated correctly by the control unit 104, the process 600 is terminated. After the termination of the process 600, the process returns to step 502, and the process 600 is repeated for other aircraft 500 approaching for parking. However, if any parameter of the aircraft 500 depicts the aircraft 500 in an inappropriate parking position, the process moves to step 514. In this manner, the process 600 facilitates the proper parking of the aircraft 500 in their respective stands/bays/stations.

[0078] The advantages of the SVDGS 100 are now explained. The SVDGS 100 is precise and facilitates the docking of one or more aircraft 500 at their determined docking stations simultaneously. The SVDGS 100 provides real-time feedback to the pilots and the operator. In addition, the parking time of the aircraft 500 is reduced, thereby reducing the overall cost of the airport. The SVDGS 100 requires no human intervention and may be incorporated/installed in all airports irrespective of their jurisdiction. Further, the SVDGS 100 is universal and automatic, thereby reducing the docking time and eliminating the need for skilled personnel. In this manner, the SVDGS 100 reduces the overall cost of the system for parking assistance and provides feedback to the pilots and airport operators in case of any errors committed by the pilots.

[0079] While specific language has been used to describe the present disclosure, any limitations arising on account thereof are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein. The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment.

CLAIMS:

1. A smart visual docking guiding system (SVDGS) (100) for an airport having a plurality of parking bays, the SVDGS (100) comprising:
a 3-dimensional (3D) sensor (102) adapted to scan an aircraft (500) in a space in a predefined interval, the 3D sensor (102) is adapted to generate a 3D point cloud data, wherein the 3D point cloud data comprises a plurality of 3D attributes and a plurality of 2-dimensional (2D) attributes in the parking space; and
a control unit (104) equipped with an advanced artificial Intelligence (AI) engine coupled with the 3D sensor (102) and configured to receive the 3D point cloud data from the 3D sensors (102), wherein the control unit (104) is configured to:
process the 3D point cloud data to detect the aircraft (500) in the parking space covering all parking bays and a movement thereof;
compare the detected aircraft (500) and the movement thereof with a designated parking bay based on the detection of the aircraft (500) and the movement; and
generate a guidance instruction based on the comparison.

2. The SVDGS (100) as claimed in claim 1, comprising a plurality of display units (106) communicatively coupled with the control unit (104) and installed in the respective parking bays to provide the guidance instructions to the aircraft (500).

3. The SVDGS (100) as claimed in claim 2, wherein the 3D point cloud data is visualized using the advanced AI engine of the control unit (104) with calibrations and markings of the position of a center line, multiple positions of the stop line, different aircraft types and sizes, a position of the parking bay, a data of the aircraft taxiing into the parking bay, a time of the aircraft (500) approaching the parking bay, and a data of the parking bay assigned to the aircraft (500).

4. The SVDGS (100) as claimed in claims 1 and 3, wherein the control unit (104) is configured to:
store a data related to a physical profile and dimensions of different types of aircraft (500);
trace a position of the center line for the approaching aircraft (500) type;
store a predetermined position of the stop line of the approaching aircraft (500);
detect a front wheel of the aircraft (500);
compare a position of the front wheel relative to the center line of the designated parking bay to determine an offset; and
provide the guidance instruction to correct the offset.

5. The SVDGS (100) as claimed in claim 1, wherein the 3D sensor (102) is a 3D Solid State semiconductor Light Detection and Ranging (LIDAR) sensor, wherein a field of view of the 3D semiconductor LIDAR sensor covers the complete parking bay.

6. The SVDGS (100) as claimed in claim 1, wherein the control unit (104) includes a predefined data related to the airport dimensions, a spatial data of the airport.

7. The SVDGS (100) as claimed in claim 3, wherein the control unit (104) is configured to process the 3D point cloud data using an image processing technique by a trained artificial intelligence model to determine at least one of a type of the aircraft (500), entry of the aircraft (500) in the parking bay, an approach speed of the aircraft (500), an offset of the aircraft (500) from the center line to detect a misalignment and an exit of the aircraft (500) from the parking bay.

8. The SVDGS (100) as claimed in claim 7, wherein the control unit (104) is configured to:
compare the approach speed of the aircraft (500) with a prestored value of an approach speed;
detect the offset of the aircraft (500) from the center line to detect misalignment;
detect any corrective actions performed by a pilot; and
generate a rating to indicate the pilot's performance.

9. A method (300) of parking an aircraft (500) in a parking bay having a plurality of parking bays, the method comprising:
scanning an aircraft (500) in a space in a predefined interval by a 3-dimensional (3D) sensor (102) of a smart visual docking guiding system (SVDGS) (100) to generate a 3D point cloud data, wherein the 3D point cloud data comprises a plurality of 3D attributes and a plurality of 2-dimensional (2D) attributes in the parking space;
receiving the 3D point cloud data from the 3D sensors (102) by a control unit (104) communicatively coupled with the 3D sensor (102), the control unit (104) is configured to:
process the 3D point cloud data using advanced artificial intelligence (AI) techniques to detect the aircraft (500) in the parking bay and a movement thereof; and
generate a guidance instruction based on the comparison; and
displaying a guidance instruction to the aircraft (500) by a plurality of display units (106) communicatively coupled with the control unit (104) and installed in each of the parking bays.

10. The method (300) as claimed in claim 9, wherein the 3D point cloud data includes a position of a center line, a position of the stop line, a position of the parking bay, a data of the aircraft taxiing into the parking bay, a time of the aircraft (500) approaching the parking bay, and a data of the parking bay assigned to the aircraft (500).

11. The method (300) as claimed in claim 9, wherein the control unit (104) simultaneously processes the point cloud data of multiple approaching aircraft (500) or docked aircraft (500) from multiple parking bays to provide simultaneous parking guidance assistance.
