
Drone Pilot System

Abstract: Disclosed is an aerial vehicle (104) including a sensing unit (120) and processing circuitry (124). The sensing unit (120) includes first and second sets of sensors (120a-120b) configured to capture one or more images of a GPS-denied environment of the aerial vehicle (104) and to sense one or more parameters associated with an orientation of the aerial vehicle (104), respectively. The processing circuitry (124), which is coupled to the sensing unit (120), is configured to determine a state of the aerial vehicle (104) based on the one or more images and the one or more orientation parameters, and to determine, based on the state of the aerial vehicle (104), a set of control configurations for stable altitude and angular velocity control of the aerial vehicle (104) using at least one of one or more machine learning techniques, one or more artificial intelligence techniques, or a combination thereof. FIG. 1 is the reference figure.


Patent Information

Application #
202211035946
Filing Date
23 June 2022
Publication Number
34/2024
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Parent Application

Applicants

Enord Private Limited
24-B Second Floor, Okhla Village Okhla, New Delhi, South Delhi, Delhi-110025, India

Inventors

1. ANAS, Muhammad
D-50, Near God’s Grace School, Okhla Vihar Metro Station, Jamia Nagar, Okhla, New Delhi - 110025, India

Specification

TECHNICAL FIELD
The present disclosure relates generally to unmanned aerial vehicles. More particularly, the present disclosure relates to a drone pilot system.
BACKGROUND
Inspection and surveillance are critical aspects of almost every sector today. Surveillance and inspection systems bring to notice any minor defect (abnormality) or change in the usual conditions of a system and keep track of the abnormality or change until the issue is resolved. Traditional systems and tools for inspection lack spontaneity as well as dynamic range of coverage and are limited to areas with ease of access.
Recently, the use of unmanned aerial vehicles (UAVs), commonly known as drones, has increased not just for inspection and surveillance but also for a wide variety of tasks at places with almost zero human accessibility. Using drones for surveillance can provide access to areas that may be difficult or impossible to reach by humans on foot or in land vehicles. Further, drones produce less noise and are therefore less likely to be noticed, and thus find applications in military security and vigilance services.
Most present-day drones use the global positioning system (GPS) to find a path for their commute; however, there are a few challenges faced by drones using GPS. GPS-aided drones are quite susceptible to loss of the GPS signal. Further, localization and mapping algorithms in GPS-denied or intermittent conditions have limitations based on environment texture, reflectivity, and the types and quality of sensors used. Furthermore, GPS is not completely accurate, and thus GPS-aided drones may lead to critical situations that defeat the entire purpose for which the drones are deployed.
Thus, an improved system with better reliability and accuracy of localization for UAV systems is an ongoing effort, and there is a need for an improved technical solution that overcomes the aforementioned problems.
SUMMARY
In an aspect of the present disclosure, an aerial vehicle includes a sensing unit and processing circuitry coupled to the sensing unit. The sensing unit includes first and second sets of sensors that are configured to capture one or more images of a GPS-denied environment of the aerial vehicle, and to sense one or more parameters associated with an orientation of the aerial vehicle, respectively. The processing circuitry is configured to determine a state of the aerial vehicle based on the one or more images and the one or more orientation parameters. The processing circuitry is further configured to determine, based on the state of the aerial vehicle, a set of control configurations for stable altitude and angular velocity control of the aerial vehicle using at least one of one or more machine learning techniques, one or more artificial intelligence techniques, or a combination thereof.
In some aspects, to determine the set of control configurations, the processing circuitry is configured to train a set of aviation weights by way of a custom ground-truth dataset, and to fine-tune the set of aviation weights by way of one or more mixed precision techniques.
In some aspects, to fine-tune the set of aviation weights, the processing circuitry is configured to iteratively update, by way of a set of pre-defined lower-precision floating-point numbers, a forward pass and a backward pass of the set of aviation weights, and to iteratively update, by way of a set of pre-defined higher-precision numbers, the set of aviation weights.
In some aspects, the processing circuitry is configured to combine the one or more images captured by the first set of sensors to generate a depth map, using one or more artificial intelligence techniques.
In some aspects, the processing circuitry is further configured to determine one or more trajectories of flight from a source point to a destination point based on the depth map.
In some aspects, the processing circuitry is configured to identify a numerical count of occlusions in each trajectory of the one or more trajectories of flight by way of an object detection technique, and determine an optimized trajectory of the one or more trajectories of flight based on the numerical count of occlusions in each trajectory, using one or more artificial intelligence techniques.
In some aspects, the sensing unit further includes a third set of sensors that are configured to capture a plurality of high-resolution images of the destination point.
In some aspects, the processing circuitry is configured to identify an object of interest from the plurality of high-resolution images of the destination point using at least one of, one or more object detection techniques and one or more artificial intelligence techniques. The processing circuitry is further configured to identify one or more high resolution images of the plurality of high-resolution images in which the object of interest is identified.
In some aspects, the processing circuitry is further configured to send the one or more high resolution images to a user device.
In another aspect of the present disclosure, a system includes a user device and an aerial vehicle that is coupled to the user device. The user device is configured to enable a user to select a source point and a destination point. The aerial vehicle includes a sensing unit and processing circuitry coupled to the sensing unit. The sensing unit includes first and second sets of sensors that are configured to capture one or more images of a GPS-denied environment of the aerial vehicle, and to sense one or more parameters associated with an orientation of the aerial vehicle, respectively. The processing circuitry is configured to determine a state of the aerial vehicle based on the one or more images and the one or more orientation parameters. The processing circuitry is further configured to determine, based on the state of the aerial vehicle, a set of control configurations for stable altitude and angular velocity control of the aerial vehicle using at least one of one or more machine learning techniques, one or more artificial intelligence techniques, or a combination thereof.
In yet another aspect of the present disclosure, a method includes enabling, by way of a user device, a user to select a source point and a destination point. The method further includes capturing, by way of a first set of sensors of a sensing unit, one or more images of a GPS-denied environment of an aerial vehicle. Furthermore, the method includes sensing, by way of a second set of sensors of the sensing unit, one or more parameters associated with an orientation of the aerial vehicle. Furthermore, the method includes determining, by way of processing circuitry, a state of the aerial vehicle based on the one or more images and the one or more orientation parameters. Furthermore, the method includes determining, by way of the processing circuitry, a set of control configurations for stable altitude and angular velocity control of the aerial vehicle based on the state of the aerial vehicle, using at least one of one or more machine learning techniques, one or more artificial intelligence techniques, or a combination thereof.
BRIEF DESCRIPTION OF DRAWINGS
The above and still further features and advantages of aspects of the present disclosure become apparent upon consideration of the following detailed description of aspects thereof, especially when taken in conjunction with the accompanying drawings, wherein:
FIG. 1 illustrates a block diagram of a system, in accordance with an exemplary aspect of the present disclosure;
FIG. 2 illustrates a block diagram of an aerial vehicle of the system of FIG. 1, in accordance with an exemplary aspect of the present disclosure; and
FIG. 3 illustrates a flow chart of a method, in accordance with an exemplary aspect of the present disclosure.
To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures.
DETAILED DESCRIPTION
Various aspects of the present disclosure provide an aerial vehicle, a method, and a drone pilot system. The following description provides specific details of certain aspects of the disclosure illustrated in the drawings to provide a thorough understanding of those aspects. It should be recognized, however, that the present disclosure can be reflected in additional aspects and may be practiced without some of the details in the following description.
The various aspects including the example aspects are now described more fully with reference to the accompanying drawings, in which the various aspects of the disclosure are shown. The disclosure may, however, be embodied in different forms and should not be construed as limited to the aspects set forth herein. Rather, these aspects are provided so that this disclosure is thorough and complete, and fully conveys the scope of the disclosure to those skilled in the art. In the drawings, the sizes of components may be exaggerated for clarity.
It is understood that when an element is referred to as being “on,” “connected to,” or “coupled to” another element, it can be directly on, connected to, or coupled to the other element or intervening elements that may be present. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
The subject matter of example aspects, as disclosed herein, is described with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventor/inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different features or combinations of features similar to the ones described in this document, in conjunction with other technologies. Generally, the various aspects including the example aspects relate to an aerial vehicle, a method, and a drone pilot system.
The aspects herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting aspects that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the aspects herein. The examples used herein are intended merely to facilitate an understanding of ways in which the aspects herein may be practiced and to further enable those of skill in the art to practice the aspects herein. Accordingly, the examples should not be construed as limiting the scope of the aspects herein.
FIG. 1 illustrates a block diagram of a system 100, in accordance with an exemplary aspect of the present disclosure. The system 100 may include a user device 102 and an aerial vehicle 104. In some aspects of the present disclosure, the user device 102 may be communicatively coupled to the aerial vehicle 104 by way of either of a first wired communication medium and a first wireless communication medium. In some aspects of the present disclosure, the user device 102 may be communicatively coupled to the aerial vehicle 104 by way of a communication network 106.
In some aspects of the present disclosure, the user device 102 may include a user interface 110, a processing unit 112, a memory unit 114, a drone console 116, and a communication interface 118.
The user interface 110 may include an input interface (not shown) for receiving inputs from a user. The input interface may further be configured to facilitate the user to input data for registration, authentication and/or logging-in to the system 100. Examples of the input interface may include, but are not limited to, a touch interface, a mouse, a keyboard, a motion recognition unit, a gesture recognition unit, a voice recognition unit, or the like. Aspects of the present disclosure are intended to include or otherwise cover any type of the input interface including known, related art, and/or later developed technologies. The user interface 110 may further include an output interface (not shown) for displaying (or presenting) an output to the user. In some aspects of the present disclosure, the output interface may be configured to facilitate the user to receive, present and/or display one or more notifications from the system 100. Examples of the output interface may include, but are not limited to, a digital display, an analog display, a touch screen display, a graphical user interface, a website, a webpage, a keyboard, a mouse, a light pen, an appearance of a desktop, and/or illuminated characters. Aspects of the present disclosure are intended to include and/or otherwise cover any type of the output interface including known and/or related, or later developed technologies.
The processing unit 112 may include suitable logic, instructions, circuitry, interfaces, and/or codes for executing various operations, such as the operations associated with the user device 102. In some aspects of the present disclosure, the processing unit 112 may utilize one or more processors such as an Arduino, a Raspberry Pi, or the like. Further, the processing unit 112 may be configured to control one or more operations executed by the user device 102, in response to the input received at the user interface 110 from the user. Examples of the processing unit 112 may include, but are not limited to, an application-specific integrated circuit (ASIC) processor, a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, a field-programmable gate array (FPGA), a Programmable Logic Control unit (PLC), and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of the processing unit 112 including known, related art, and/or later developed processing units.
The memory unit 114 may be configured to store the logic, instructions, circuitry, interfaces, and/or codes of the processing unit 112, data associated with the user devices 102, and/or data associated with the system 100. Examples of the memory unit 114 may include, but are not limited to, a Read-Only Memory (ROM), a Random-Access Memory (RAM), a flash memory, a removable storage drive, a hard disk drive (HDD), a solid-state memory, a magnetic storage drive, a Programmable Read Only Memory (PROM), an Erasable PROM (EPROM), and/or an Electrically EPROM (EEPROM). Aspects of the present disclosure are intended to include or otherwise cover any type of the memory unit 114 including known, related art, and/or later developed memories.
The drone console 116 may be configured as a computer-executable application, to be executed by the processing unit 112. The drone console 116 may include suitable logic, instructions, and/or codes for executing various operations and may be controlled by the aerial vehicle 104. The one or more computer-executable applications corresponding to the drone console 116 may be stored in the memory unit 114. Examples of the one or more computer-executable applications may include, but are not limited to, an audio application, a video application, a social media application, a navigation application, or the like. Aspects of the present disclosure are intended to include or otherwise cover any type of the computer-executable application including known, related art, and/or later developed computer-executable applications.
The communication interface 118 may be configured to enable the user device 102 to communicate with the aerial vehicle 104. Examples of the communication interface 118 may include, but are not limited to, a modem, a network interface such as an Ethernet card, a communication port, and/or a Personal Computer Memory Card International Association (PCMCIA) slot and card, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and a local buffer circuit. It will be apparent to a person of ordinary skill in the art that the communication interface 118 may include any device and/or apparatus capable of providing wireless or wired communications of the user device 102 with the aerial vehicle 104.
In some aspects of the present disclosure, the aerial vehicle 104 may include a sensing unit 120, an aviation unit 122, processing circuitry 124 and a local database 126.
The sensing unit 120 may be configured to capture one or more images of a GPS-denied environment of the aerial vehicle 104 (hereinafter interchangeably referred to as environment of the aerial vehicle 104). The sensing unit 120 may further be configured to sense one or more parameters associated with an orientation of the aerial vehicle 104. Furthermore, the sensing unit 120 may be configured to capture a plurality of high-resolution images. The sensing unit 120 may include first through third sets of sensors (shown as 120a-120c later in FIG. 2). Examples of the first through third sets of sensors 120a-120c of the sensing unit 120 may include, but not limited to, one or more navigation cameras, one or more payload cameras, one or more orientation sensors, gyroscope sensor, accelerometer sensor, and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of sensors including known, related art, and/or later developed sensors.
The aviation unit 122 may include one or more entities that may be configured to enable the aerial vehicle 104 to move (or fly). Aspects of the present disclosure are intended to include or otherwise cover any type of the aviation unit 122 including known, related art, and/or later developed aviation units.
In some aspects of the present disclosure, the processing circuitry 124 may include suitable logic, instructions, circuitry, interfaces, and/or codes for executing various operations of the system 100. The processing circuitry 124 may be configured to host and enable the drone console 116 running on (or installed on) the user device 102, to execute the operations associated with the system 100 by communicating one or more commands and/or instructions over the communication network 106.
The local database 126 may be configured to store the logic, instructions, circuitry, interfaces, and/or codes of the processing circuitry 124 for executing a number of operations. The local database 126 may be further configured to store therein, data associated with users registered with the system 100. Some aspects of the present disclosure are intended to include and/or otherwise cover any type of the data associated with the users registered with the system 100. Examples of the local database 126 may include but are not limited to, a ROM, a RAM, a flash memory, a removable storage drive, a HDD, a solid-state memory, a magnetic storage drive, a PROM, an EPROM, and/or an EEPROM. In some aspects of the present disclosure, the local database 126 may be configured to store one or more of, user data, instructions data, and the like corresponding to the system 100.
The communication network 106 may include suitable logic, circuitry, and interfaces that may be configured to provide a number of network ports and a number of communication channels for transmission and reception of data related to operations of various entities (such as the user device 102 and the aerial vehicle 104) of the system 100. Each network port may correspond to a virtual address (or a physical machine address) for transmission and reception of the communication data. For example, the virtual address may be an Internet Protocol Version 4 (IPV4) (or an IPV6) address and the physical address may be a Media Access Control (MAC) address. The communication network 106 may be associated with an application layer for implementation of communication protocols based on one or more communication requests from the user device 102 and the aerial vehicle 104. The communication data may be transmitted or received via the communication protocols. Examples of the communication protocols may include, but are not limited to, Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Mail Transfer Protocol (SMTP), Domain Name System (DNS) protocol, Common Management Interface Protocol (CMIP), Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Long Term Evolution (LTE) communication protocols, or any combination thereof.
In some aspects of the present disclosure, the communication data may be transmitted or received via at least one communication channel of a number of communication channels in the communication network 106. The communication channels may include, but are not limited to, a wireless channel, a wired channel, a combination of wireless and wired channel thereof. The wireless or wired channel may be associated with a data standard which may be defined by one of a Local Area Network (LAN), a Personal Area Network (PAN), a Wireless Local Area Network (WLAN), a Wireless Sensor Network (WSN), Wireless Area Network (WAN), Wireless Wide Area Network (WWAN), a metropolitan area network (MAN), a satellite network, the Internet, an optical fiber network, a coaxial cable network, an infrared (IR) network, a radio frequency (RF) network, and a combination thereof. Aspects of the present disclosure are intended to include or otherwise cover any type of communication channel, including known, related art, and/or later developed technologies.
In operation, the system 100 by way of the sensing unit 120 may be configured to capture one or more images of a GPS denied environment of the aerial vehicle 104, and sense one or more parameters associated with an orientation of the aerial vehicle 104. The system 100 by way of the processing circuitry 124 may be configured to determine a state of the aerial vehicle 104 based on the one or more images and the one or more orientation parameters. The system 100 by way of the processing circuitry 124, based on the state of the aerial vehicle 104 may further be configured to determine a set of control configurations for a stable altitude and angular velocity control of the aerial vehicle 104. The system 100 by way of the processing circuitry 124 may further be configured to combine the one or more images captured by the sensing unit 120 to generate a depth map. The system 100, by way of the processing circuitry 124, based on the depth map, may be configured to determine one or more trajectories of flight from a source point to a destination point. The system 100 by way of the processing circuitry 124, based on the one or more images of the GPS denied environment captured by the sensing unit 120, may be configured to identify a numerical count of occlusions in each trajectory of the one or more trajectories of flight by way of an object detection technique. The system 100, by way of the processing circuitry 124 may further be configured to determine an optimized trajectory of the one or more trajectories of flight based on the numerical count of occlusions in each trajectory of the one or more trajectories of flight. The system 100 by way of the aviation unit 122, may be configured to move (or fly) the aerial vehicle 104 from the source point to the destination point on the determined optimized trajectory. When the aerial vehicle reaches the destination point, the system 100, by way of the sensing unit 120, may be configured to capture a plurality of high-resolution images of the destination point. The system 100, by way of the processing circuitry 124 may further be configured to identify an object of interest from the plurality of high-resolution images of the destination point using at least one of, one or more object detection techniques and one or more artificial intelligence techniques. Furthermore, the system 100, by way of the processing circuitry 124, may be configured to identify one or more high resolution images of the plurality of high-resolution images in which the object of interest is identified. Upon identification of the one or more high resolution images in which the object of interest is identified, the system 100, by way of the processing circuitry 124, may be configured to send the one or more high resolution images to the user device 102 associated with the user. Upon sending the one or more high resolution images of the object of interest to the user, the system 100, by way of the aviation unit 122, may be configured to enable the aerial vehicle 104 to move (or fly) back to the source point.
FIG. 2 is a block diagram that illustrates the aerial vehicle 104 of FIG. 1, in accordance with an exemplary aspect of the present disclosure. The aerial vehicle 104 may include the sensing unit 120, the aviation unit 122, the processing circuitry 124, and the local database 126. The aerial vehicle 104 may further include a network interface 200 and an input/output (I/O) interface 202. The sensing unit 120, the aviation unit 122, the processing circuitry 124, the local database 126, the network interface 200, and the I/O interface 202 may communicate with each other by way of a first communication bus 203. It will be apparent to a person having ordinary skill in the art that the aerial vehicle 104 is shown for illustrative purposes and is not limited to any specific combination of hardware circuitry and/or software.
The network interface 200 may be implemented by use of various known technologies to support wired or wireless communication of the aerial vehicle 104 with the communication network 106. The network interface 200 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and a local buffer circuit.
The I/O interface 202 may include suitable logic, circuitry, interfaces, and/or code that may be configured to receive inputs (e.g., orders) and transmit outputs via a plurality of data ports (not shown) in the aerial vehicle 104. The I/O interface 202 may include various input and output data ports for different I/O devices. Examples of such I/O devices may include, but are not limited to, a touch screen, a keyboard, a mouse, a joystick, a projector audio output, a microphone, an image-capture device, a liquid crystal display (LCD) screen and/or a speaker.
The sensing unit 120 may include first through third sets of sensors 120a-120c. The first set of sensors 120a may include one or more first sensors (hereinafter interchangeably referred to as “navigation cameras”) that may be configured to capture the one or more images of an environment of the aerial vehicle 104. Examples of the navigation cameras may include stationary cameras, Pan-Tilt-Zoom (PTZ) cameras, depth-sensing camera pairs, and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of navigation cameras as the first set of sensors 120a including known, related art, and/or later developed navigation cameras. Specifically, the first set of sensors 120a may have two sensors (i.e., a first sensor 120aa and a second sensor 120ab) that may be configured to capture the one or more images of the environment of the aerial vehicle 104, which may be used by the processing circuitry 124 to determine a depth of the aerial vehicle 104. Although FIG. 2 illustrates that the first set of sensors 120a includes two sensors (i.e., the first sensor 120aa and the second sensor 120ab), it will be apparent to a person skilled in the art that the scope of the present disclosure is not limited to it. In various other aspects, the first set of sensors 120a may have any number of sensors, without deviating from the scope of the present disclosure. In such a scenario, each sensor of the first set of sensors 120a is adapted to serve one or more functionalities in a manner similar to the functionalities of the first and second sensors 120aa-120ab as described above.
The sensing unit 120 may further include a second set of sensors 120b. Examples of the second set of sensors 120b may include accelerometer sensor, gyroscope sensor, and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of orientation sensors as the second set of sensors 120b including known, related art, and/or later developed orientation sensors. The second set of sensors 120b may be configured to sense the one or more parameters associated with an orientation of the aerial vehicle 104. Examples of the one or more parameters associated with the orientation of the aerial vehicle 104 may include geographical location, altitude, longitude, latitude, perspective localization parameters, and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of orientation parameters including known, related art, and/or later developed orientation parameters.
Specifically, the second set of sensors 120b may have two sensors (i.e., a third sensor 120ba, and a fourth sensor 120bb) that may be configured to determine the orientation parameters of the aerial vehicle 104. Although FIG. 2 illustrates that the second set of sensors 120b includes two sensors (i.e., the third sensor 120ba and the fourth sensor 120bb), it will be apparent to a person skilled in the art that the scope of the present disclosure is not limited to it. In various other aspects, the second set of sensors 120b may have any number of sensors, without deviating from the scope of the present disclosure. In such a scenario, each sensor of the second set of sensors 120b is adapted to serve one or more functionalities in a manner similar to the functionalities of the third and fourth sensors 120ba-120bb as described above.
Furthermore, the sensing unit 120 may include a third set of sensors 120c. The third set of sensors 120c may include one or more third sensors (hereinafter interchangeably referred to as “payload cameras”) that may be configured to capture the one or more high-resolution images of the object of interest. Examples of the payload cameras may include high-definition stationary cameras, high-definition PTZ cameras, and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of payload cameras as the third set of sensors 120c including known, related art, and/or later developed payload cameras. Specifically, the third set of sensors 120c may have two sensors (i.e., a fifth sensor 120ca and a sixth sensor 120cb) that may be configured to capture the one or more high-resolution images of the object of interest. Although FIG. 2 illustrates that the third set of sensors 120c includes two sensors (i.e., the fifth sensor 120ca and the sixth sensor 120cb), it will be apparent to a person skilled in the art that the scope of the present disclosure is not limited to it. In various other aspects, the third set of sensors 120c may have any number of sensors, without deviating from the scope of the present disclosure. In such a scenario, each sensor of the third set of sensors 120c is adapted to serve one or more functionalities in a manner similar to the functionalities of the fifth and sixth sensors 120ca-120cb as described above.
The aviation unit 122 may include the one or more entities that may be configured to enable the aerial vehicle 104 to move (or fly). Aspects of the present disclosure are intended to include or otherwise cover any type of the aviation unit 122 including known, related art, and/or later developed aviation units.
In some aspects of the present disclosure, the aviation unit 122 may include a set of propellers (not shown), one or more motors (not shown), one or more battery units (not shown), and an aviation control unit (not shown) coupled to each other. The aviation unit 122 may be configured to control operation of the one or more motors to provide rotation of the propellers at a desired rotational speed. The one or more battery units may be configured to provide electrical power to the various entities of the aviation unit 122. In some aspects of the present disclosure, the one or more battery units may be configured to provide electrical power to various entities of the aerial vehicle 104.
In some other aspects of the present disclosure, the aviation unit 122 may include the set of propellers, a fuel combustion engine (not shown), a fuel storage unit (not shown), and the aviation control unit coupled to each other. The fuel combustion engine may be configured to receive fuel from the fuel storage unit. The fuel combustion engine may further be configured to generate electrical power for various entities of the aviation unit 122. In some aspects of the present disclosure, the fuel combustion engine may be configured to provide electrical power to the various entities of the aerial vehicle 104.
In some aspects of the present disclosure, the processing circuitry 124 may include a data exchange engine 204, a registration engine 206, an authentication engine 208, an aviation control engine 210, a data perception engine 212, a path planning engine 214, an object detection engine 216, and a notification engine 218, coupled to each other by way of a second communication bus 222.
The data exchange engine 204 may be configured to receive one or more inputs from the user device 102 associated with the user. The data exchange engine 204 may further be configured to enable an exchange of data and/or instructions between various engines of the processing circuitry 124. In some aspects of the present disclosure, the data exchange engine 204 may be configured to receive data from the sensing unit 120. The data exchange engine 204 may further be configured to send the data received from the sensing unit 120 to one or more engines of the various engines of the processing circuitry 124, based on a request from the one or more engines of the various engines.
The registration engine 206 may be configured to enable the user to register into the system 100 by providing registration data through a registration menu (not shown) of the drone console 116 that may be displayed by way of the user device 102.
The authentication engine 208 may be configured to fetch the registration data of the user and authenticate the registration data of the user. The authentication engine 208, upon successful authentication of the registration data of the user, may be configured to enable the user to log-in or sign up to the system 100.
The aviation control engine 210 may be configured to determine the state of the aerial vehicle 104 based on the one or more images, and the one or more orientation parameters sensed by the sensing unit 120. The aviation control engine 210 may further be configured to determine, based on the state of the aerial vehicle 104, the set of control configurations for the stable altitude and angular velocity control of the aerial vehicle 104. The aviation control engine 210 may be configured to determine the set of control configurations for the stable altitude and angular velocity control of the aerial vehicle 104 using at least one of, one or more machine learning techniques, one or more artificial intelligence techniques, or a combination thereof.
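By way of a non-limiting illustration only, the following Python sketch shows one possible way a state-to-control mapping of the kind described above could be organized as a small learned model that fuses camera frames and orientation readings. The layer sizes, input shapes, and the four-value control output (thrust plus three angular rates) are assumptions of the sketch and are not taken from the disclosure.

```python
# A minimal sketch (not the claimed implementation) of an aviation control
# model that fuses navigation-camera images and orientation parameters into a
# state vector and maps it to altitude / angular-velocity control outputs.
import torch
import torch.nn as nn

class AviationControlNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Tiny CNN encoder for images from the first set of sensors 120a.
        self.image_encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP for orientation parameters from the second set of sensors 120b
        # (e.g., gyroscope and accelerometer readings), assumed to be 6 values.
        self.imu_encoder = nn.Sequential(nn.Linear(6, 32), nn.ReLU())
        # Head producing a control configuration: thrust plus three angular rates.
        self.control_head = nn.Sequential(
            nn.Linear(32 + 32, 64), nn.ReLU(), nn.Linear(64, 4),
        )

    def forward(self, image, imu):
        state = torch.cat([self.image_encoder(image), self.imu_encoder(imu)], dim=-1)
        return self.control_head(state)  # [thrust, roll_rate, pitch_rate, yaw_rate]

if __name__ == "__main__":
    net = AviationControlNet()
    frame = torch.rand(1, 3, 120, 160)   # one navigation-camera image
    imu = torch.rand(1, 6)               # one set of orientation parameters
    print(net(frame, imu))               # set of control configurations
```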
In some aspects of the present disclosure, the aviation control engine 210 may be configured to receive the plurality of high-resolution images of the destination point from the sensing unit 120. The aviation control engine 210 may further be configured to determine a distance between the aerial vehicle 104 and the object of interest by using one or more image processing techniques on the plurality of high-resolution images of the destination point. Furthermore, the aviation control engine 210 may be configured to generate a restriction signal for the aviation unit 122 to restrict the movement of the aerial vehicle 104 at a pre-defined distance from the object of interest.
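As a hedged illustration of the distance-based restriction, the sketch below estimates a distance from an image using the pinhole camera model and raises a restriction flag at a stand-off threshold. The disclosure only names "one or more image processing techniques"; the pinhole model, focal length, object width, and threshold used here are assumptions of the sketch.

```python
# A minimal sketch, under assumed values, of estimating the distance to the
# object of interest from a high-resolution image and generating a restriction
# signal when the aerial vehicle comes within a pre-defined distance.
FOCAL_LENGTH_PX = 1400.0   # assumed payload-camera focal length in pixels
OBJECT_WIDTH_M = 0.5       # assumed real-world width of the object of interest
MIN_DISTANCE_M = 5.0       # assumed pre-defined stand-off distance

def estimate_distance(bbox_width_px: float) -> float:
    """Pinhole model: distance = focal_length * real_width / pixel_width."""
    return FOCAL_LENGTH_PX * OBJECT_WIDTH_M / bbox_width_px

def restriction_signal(bbox_width_px: float) -> bool:
    """True when the vehicle should stop approaching the object of interest."""
    return estimate_distance(bbox_width_px) <= MIN_DISTANCE_M

print(estimate_distance(140.0))   # -> 5.0 metres
print(restriction_signal(140.0))  # -> True: hold position
```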
In some aspects of the present disclosure, to determine the set of control configurations, the aviation control engine 210 may be configured to train a set of aviation weights by way of a custom ground-truth dataset. The aviation control engine 210 may further be configured to fine-tune (or iteratively update) the set of aviation weights by way of one or more mixed precision techniques.
In some aspects of the present disclosure, to fine-tune the set of aviation weights, the aviation control engine 210 may be configured to iteratively update a forward pass and a backward pass of the set of aviation weights by way of a set of pre-defined lower-precision floating-point numbers. The aviation control engine 210 may further be configured to iteratively update the set of aviation weights by way of a set of pre-defined higher-precision numbers.
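The following sketch is one common way such a scheme is realized in practice, using PyTorch automatic mixed precision: forward and backward passes run in lower precision while the master weights are stepped in higher precision. The model, optimizer, and data below are placeholders, not the disclosed aviation weights or dataset.

```python
# A minimal mixed-precision fine-tuning sketch (an assumption about what the
# lower-precision forward/backward, higher-precision weight-update scheme
# could look like), not the disclosed training procedure.
import torch
import torch.nn as nn

model = nn.Linear(10, 4)                        # stand-in for the aviation weights
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

for _ in range(3):                              # iterative fine-tuning loop
    states = torch.rand(8, 10, device=device)   # placeholder ground-truth batch
    targets = torch.rand(8, 4, device=device)
    optimizer.zero_grad()
    # Forward and backward passes run in lower precision (float16) where possible.
    with torch.autocast(device_type=device, enabled=torch.cuda.is_available()):
        loss = nn.functional.mse_loss(model(states), targets)
    scaler.scale(loss).backward()
    # The master copy of the weights is updated in higher precision (float32).
    scaler.step(optimizer)
    scaler.update()
```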
The data perception engine 212 may be configured to combine the one or more images captured by the first set of sensors 120a to generate a depth map, using one or more artificial intelligence techniques. In some aspects of the present disclosure, prior to generation of the depth map, the data perception engine 212 may utilize the one or more orientation parameters sensed by the second set of sensors 120b to generate a point cloud based on the one or more images captured. The data perception engine 212 may further be configured to update the point cloud based on one or more temporal images of the environment of the aerial vehicle 104 that are captured by the first set of sensors 120a over a period of time. Furthermore, the data perception engine 212 may be configured to generate the depth map based on the updated point cloud.
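For orientation, the sketch below shows one way a depth map could be obtained from the two navigation cameras; it uses a classical OpenCV stereo matcher as a stand-in for the AI-based technique named in the disclosure, and the focal length and baseline are assumed values.

```python
# A minimal stereo-depth sketch: combine the left/right navigation-camera
# images (e.g., sensors 120aa and 120ab) into a per-pixel depth map.
import numpy as np
import cv2

def depth_map(left_gray: np.ndarray, right_gray: np.ndarray,
              focal_px: float = 700.0, baseline_m: float = 0.12) -> np.ndarray:
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan           # mask invalid matches
    return focal_px * baseline_m / disparity     # depth in metres per pixel

left = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
right = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
print(np.nanmean(depth_map(left, right)))
```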
The path planning engine 214 may be configured to determine one or more trajectories of flight from a source point to a destination point based on the depth map. The path planning engine 214 may further be configured to identify a numerical count of occlusions in each trajectory of the one or more trajectories of flight. In some aspects of the present disclosure, the path planning engine 214 may be configured to determine the occlusions in the one or more trajectories of flight by way of an object detection technique. The path planning engine 214 may further be configured to determine an optimized trajectory of the one or more trajectories of flight based on the numerical count of occlusions in each trajectory, using one or more artificial intelligence techniques.
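A minimal sketch of this selection step follows: count occlusions along each candidate trajectory and keep the trajectory with the fewest. The detect_occlusions helper is a hypothetical stand-in for whatever object detection technique the path planning engine 214 applies.

```python
# Trajectory selection by smallest numerical count of occlusions (sketch only).
from typing import Callable, List, Sequence, Tuple

Waypoint = Tuple[float, float, float]  # (x, y, z) in the depth-map frame

def select_trajectory(
    trajectories: Sequence[List[Waypoint]],
    detect_occlusions: Callable[[List[Waypoint]], int],
) -> List[Waypoint]:
    """Return the trajectory with the smallest count of detected occlusions."""
    counts = [detect_occlusions(t) for t in trajectories]
    return trajectories[counts.index(min(counts))]

# Usage with a toy detector that flags waypoints below 2 m altitude as occluded.
toy_detector = lambda traj: sum(1 for (_, _, z) in traj if z < 2.0)
candidates = [
    [(0, 0, 5), (5, 0, 1), (10, 0, 5)],   # one low, "occluded" waypoint
    [(0, 0, 5), (5, 2, 6), (10, 0, 5)],   # clear trajectory
]
print(select_trajectory(candidates, toy_detector))
```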
The object detection engine 216 may be configured to identify the object of interest from the plurality of high-resolution images of the destination point using at least one of one or more object detection techniques and one or more artificial intelligence techniques. The object detection engine 216 may further be configured to identify one or more high-resolution images of the plurality of high-resolution images in which the object of interest is identified. Furthermore, the object detection engine 216 may be configured to generate transmission instructions for the sensing unit 120 to send the one or more high-resolution images of the object of interest to the user device 102. The notification engine 218 may be configured to generate one or more notifications corresponding to one or more activities performed by the various entities of the system 100.
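By way of illustration, the sketch below shows the filtering step: run a detector over the high-resolution images and keep only those in which the object of interest appears. The detector here is a hypothetical placeholder; the disclosure does not name a specific detection model.

```python
# Keep only the images in which the object of interest is identified (sketch).
from typing import Callable, Dict, List
import numpy as np

def images_with_object(
    images: Dict[str, np.ndarray],
    detect: Callable[[np.ndarray], List[str]],
    object_of_interest: str,
) -> List[str]:
    """Names of the images in which the object of interest is identified."""
    return [name for name, image in images.items()
            if object_of_interest in detect(image)]

# Usage with toy data: the stand-in detector "finds" an insulator in one image.
frames = {"dest_001.jpg": np.zeros((8, 8)), "dest_002.jpg": np.ones((8, 8))}
toy_detect = lambda image: ["insulator"] if image.mean() > 0 else []
print(images_with_object(frames, toy_detect, "insulator"))  # ['dest_002.jpg']
```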
In some aspects of the present disclosure, the local database 126 may include an instructions repository 224, a user data repository 226, a training data repository 228, a depth map repository 230, a trajectory repository 232, and an image repository 234. The instructions repository 224 may be configured to store one or more instructions of the aerial vehicle 104. The user data repository 226 may be configured to store data and/or metadata of the data associated with the user of the system 100. The training data repository 228 may be configured to store one or more datasets (including the custom dataset) that may be used for training of at least one of the one or more machine learning techniques, the one or more artificial intelligence techniques, or a combination thereof. The depth map repository 230 may be configured to store therein the depth maps generated by the data perception engine 212. The trajectory repository 232 may be configured to store therein the one or more trajectories from the source point to the destination point. The image repository 234 may be configured to store the plurality of high-resolution images captured by the third set of sensors 120c. The image repository 234 may further be configured to store one or more images of the environment of the aerial vehicle 104.
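A minimal sketch of an in-memory layout for these repositories is given below; the field types and structure are assumptions for illustration, not the disclosed storage schema.

```python
# Assumed in-memory layout of the local database 126 repositories (sketch).
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class LocalDatabase:
    instructions: List[str] = field(default_factory=list)    # instructions repository 224
    user_data: Dict[str, Any] = field(default_factory=dict)  # user data repository 226
    training_data: List[Any] = field(default_factory=list)   # training data repository 228
    depth_maps: List[Any] = field(default_factory=list)      # depth map repository 230
    trajectories: List[Any] = field(default_factory=list)    # trajectory repository 232
    images: List[Any] = field(default_factory=list)          # image repository 234

db = LocalDatabase()
db.trajectories.append([(0.0, 0.0, 5.0), (10.0, 0.0, 5.0)])
print(db)
```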
FIG. 3 illustrates a flow chart of the method 300, in accordance with an exemplary aspect of the present disclosure.
At step 302, the system 100 may capture the one or more images of the environment of the aerial vehicle 104, and sense the one or more parameters associated with the orientation of the aerial vehicle 104.
At step 304, the system 100 may determine the state of the aerial vehicle 104 based on the one or more images and the one or more orientation parameters.
At step 306, the system 100, based on the state of the aerial vehicle 104 may determine the set of control configurations for the stable altitude and angular velocity control of the aerial vehicle 104.
At step 308, the system 100 may combine the one or more images captured by the sensing unit 120 to generate the depth map.
At step 310, the system 100, based on the depth map, may determine the one or more trajectories of flight from the source point to the destination point.
At step 312, the system 100, based on the one or more images of the GPS denied environment captured by the sensing unit 120, may identify the numerical count of occlusions in each trajectory of the one or more trajectories of flight by way of the object detection technique.
At step 314, the system 100 may further determine the optimized trajectory of the one or more trajectories of flight based on the numerical count of occlusions in each trajectory of the one or more trajectories of flight.
At step 316, the system 100 may enable the aerial vehicle 104 to move (or fly) from the source point to the destination point on the determined optimized trajectory.
At step 318, the system 100 may capture the plurality of high-resolution images of the destination point.
At step 320, the system 100 may further identify the object of interest from the plurality of high-resolution images of the destination point using at least one of, the one or more object detection techniques and the one or more artificial intelligence techniques.
At step 322, the system 100 may identify the one or more high-resolution images of the plurality of high-resolution images in which the object of interest is identified.
At step 324, the system 100 may send the one or more high resolution images to the user device 102 associated with the user.
At step 326, the system 100 may enable the aerial vehicle 104 to move (or fly) back to the source point.
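To tie steps 302 to 326 together, the following end-to-end sketch mirrors the ordering of the flow chart of FIG. 3. Every helper it calls is a hypothetical stub standing in for the corresponding engine of the processing circuitry 124; it is intended only to show the sequence of the method 300, not an actual flight stack.

```python
# End-to-end ordering of the method 300 (sketch with hypothetical stubs).
def run_mission(source, destination, sensing_unit, engines, user_device):
    images, imu = sensing_unit.capture(), sensing_unit.orientation()          # step 302
    state = engines.aviation.estimate_state(images, imu)                      # step 304
    controls = engines.aviation.control_configurations(state)                 # step 306
    depth = engines.perception.depth_map(images)                              # step 308
    trajectories = engines.planning.trajectories(depth, source, destination)  # step 310
    counts = [engines.planning.count_occlusions(t) for t in trajectories]     # step 312
    best = trajectories[counts.index(min(counts))]                            # step 314
    engines.aviation.fly(best, controls)                                      # step 316
    photos = sensing_unit.capture_high_res()                                  # step 318
    hits = [p for p in photos if engines.detection.contains_object(p)]        # steps 320-322
    user_device.send(hits)                                                    # step 324
    engines.aviation.fly(list(reversed(best)), controls)                      # step 326
```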
As mentioned, there is a need for an improved system with better reliability and accuracy of localization for UAVs. The present disclosure, therefore, provides the system 100, the aerial vehicle 104, and the method 300 for more reliable and accurate surveillance. The system 100, by way of the aerial vehicle 104, enables surveillance of GPS-denied areas where GPS-based sensors cannot be relied upon for localization. The system 100 further uses custom datasets for training the one or more artificial intelligence and machine learning based techniques for enhanced accuracy. Furthermore, the system 100, by way of the processing circuitry 124, restricts the movement of the aerial vehicle 104 to a pre-defined distance for inspection of the object of interest. Therefore, the system 100 provides an efficient and reliable solution for surveillance in critical conditions, such as inspection and surveillance of high-power electric cables, tunnels, and the like.
The foregoing discussion of the present disclosure has been presented for purposes of illustration and description. It is not intended to limit the present disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the present disclosure are grouped together in one or more aspects or configurations for the purpose of streamlining the disclosure. The features of the aspects or configurations may be combined in alternate aspects or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the present disclosure requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate aspect of the present disclosure.
Moreover, though the description of the present disclosure has included description of one or more aspects or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the present disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.
As one skilled in the art will appreciate, the system 100 includes a number of functional blocks in the form of a number of units and/or engines. The functionality of each unit and/or engine goes beyond merely finding one or more computer algorithms to carry out one or more procedures and/or methods in the form of a predefined sequential manner, rather each engine explores adding up and/or obtaining one or more objectives contributing to an overall functionality of the system 100. Each unit and/or engine may not be limited to an algorithmic and/or coded form, rather may be implemented by way of one or more hardware elements operating together to achieve one or more objectives contributing to the overall functionality of the system 100. Further, as it will be readily apparent to those skilled in the art, all the steps, methods and/or procedures of the system 100 are generic and procedural in nature and are not specific and sequential.
Certain terms are used throughout the following description and claims to refer to particular features or components. As one skilled in the art will appreciate, different persons may refer to the same feature or component by different names. This document does not intend to distinguish between components or features that differ in name but not structure or function. While various aspects of the present disclosure have been illustrated and described, it will be clear that the present disclosure is not limited to these aspects only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the present disclosure, as described in the claims.

CLAIMS

1. An aerial vehicle (104) comprising:
a sensing unit (120) comprising:
first and second sets of sensors (120a-120b) configured to capture one or more images of a GPS-denied environment of the aerial vehicle (104), and sense one or more parameters associated with an orientation of the aerial vehicle (104), respectively; and
processing circuitry (124) that is coupled to the sensing unit (120), and configured to:
determine a state of the aerial vehicle (104) based on the one or more images and the one or more orientation parameters; and
determine, based on the state of the aerial vehicle (104), a set of control configurations for a stable altitude and angular velocity control of the aerial vehicle (104) using at least one of, one or more machine learning techniques, one or more artificial intelligence techniques, or a combination thereof.

2. The aerial vehicle (104) as claimed in claim 1, wherein, to determine the set of control configurations, the processing circuitry (124) is configured to (i) train a set of aviation weights by way of a custom ground-truth dataset, and (ii) fine-tune the set of aviation weights by way of one or more mixed precision techniques.

3. The aerial vehicle (104) as claimed in claim 2, wherein, to fine-tune the set of aviation weights, the processing circuitry (124) is configured to (i) iteratively update, by way of a set of pre-defined lower-precision floating-point numbers, a forward pass and a backward pass of the set of aviation weights, and (ii) iteratively update, by way of a set of pre-defined higher-precision numbers, the set of aviation weights.

4. The aerial vehicle (104) as claimed in claim 1, wherein the processing circuitry (124), is configured to combine the one or more images captured by the first set of sensors (120a) to generate a depth map, using one or more artificial intelligence techniques.

5. The aerial vehicle (104) as claimed in claim 1, wherein the processing circuitry (124) is further configured to determine one or more trajectories of flight from a source point to a destination point based on the depth map.

6. The aerial vehicle (104) as claimed in claim 5, wherein the processing circuitry (124) is configured to (i) identify a numerical count of occlusions in each trajectory of the one or more trajectories of flight by way of an object detection technique, and (ii) determine an optimized trajectory of the one or more trajectories of flight based on the numerical count of occlusions in each trajectory, using one or more artificial intelligence techniques.

7. The aerial vehicle (104) as claimed in claim 1, wherein the sensing unit (120) further comprises a third set of sensors (120c) configured to capture a plurality of high-resolution images of the destination point.

8. The aerial vehicle (104) as claimed in claim 7, wherein the processing circuitry (124) is configured to (i) identify an object of interest from the plurality of high-resolution images of the destination point using at least one of, one or more object detection techniques and one or more artificial intelligence techniques, and (ii) identify one or more high resolution images of the plurality of high-resolution images in which the object of interest is identified.

9. The aerial vehicle (104) as claimed in claim 8, wherein the processing circuitry (124) is further configured to send the one or more high resolution images to a user device (102).

10. A system (100) comprising:
a user device (102) configured to enable a user to select a source point and a destination point; and
an aerial vehicle (104) comprising:
a sensing unit (120) comprising:
first and second sets of sensors configured to capture one or more images of a GPS-denied environment of the aerial vehicle (104), and sense one or more parameters associated with an orientation of the aerial vehicle (104), respectively; and
processing circuitry (124) that is coupled to the sensing unit (120), and configured to:
determine a state of the aerial vehicle (104) based on the one or more images and the one or more orientation parameters; and
determine, based on the state of the aerial vehicle (104), a set of control configurations for a stable altitude and angular velocity control of the aerial vehicle (104) using at least one of, one or more machine learning techniques, one or more artificial intelligence techniques, or a combination thereof.

11. The system (100) as claimed in claim 10, wherein, to determine the set of control configurations, the processing circuitry (124) is configured to (i) train a set of aviation weights by way of a custom ground-truth dataset, and (ii) fine-tune the set of aviation weights by way of one or more mixed precision techniques.

12. The system (100) as claimed in claim 11, wherein, to fine-tune the set of aviation weights, the processing circuitry (124) is configured to (i) iteratively update, by way of a set of pre-defined lower-precision floating-point numbers, a forward pass and a backward pass of the set of aviation weights, and (ii) iteratively update, by way of a set of pre-defined higher-precision numbers, the set of aviation weights.

13. The system (100) as claimed in claim 10, wherein the processing circuitry (124), is configured to combine the one or more images captured by the first set of sensors (120a) to generate a depth map, using one or more artificial intelligence techniques.

14. The system (100) as claimed in claim 13, wherein the processing circuitry (124), based on the depth map, is further configured to determine one or more trajectories of flight from the source point to the destination point.

15. The system (100) as claimed in claim 14, wherein the processing circuitry (124) is configured to (i) identify a numerical count of occlusions in each trajectory of the one or more trajectories of flight by way of an object detection technique, and (ii) determine an optimized trajectory of the one or more trajectories of flight based on the numerical count of occlusions in each trajectory, using one or more artificial intelligence techniques.

16. The system (100) as claimed in claim 10, wherein the sensing unit (120) further comprises a third set of sensors (120c) configured to capture a plurality of high-resolution images of the destination point.

17. The system (100) as claimed in claim 16, wherein the processing circuitry (124) is configured to (i) identify an object of interest from the plurality of high-resolution images of the destination point using at least one of, one or more object detection techniques and one or more artificial intelligence techniques, and (ii) identify one or more high resolution images of the plurality of high-resolution images in which the object of interest is identified.

18. The system (100) as claimed in claim 17, wherein the processing circuitry (124) is further configured to send the one or more high resolution images to the user device (102).

19. A method (300) comprising:
enabling, by way of a user device (102), a user to select a source point and a destination point;
capturing, by way of a first set of sensors (120a) of a sensing unit (120), one or more images of a GPS-denied environment of an aerial vehicle (104);
sensing, by way of a second set of sensors (120b) of the sensing unit (120), one or more parameters associated with an orientation of the aerial vehicle (104);
determining, by way of processing circuitry (124), a state of the aerial vehicle (104) based on the one or more images and the one or more orientation parameters; and
determining, by way of the processing circuitry (124), a set of control configurations for a stable altitude and angular velocity control of the aerial vehicle (104) based on the state of the aerial vehicle (104), using at least one of, one or more machine learning techniques, one or more artificial intelligence techniques, or a combination thereof.

20. The method (300) as claimed in claim 19, wherein, for determining the set of control configurations, the method (300) comprises (i) training, by way of the processing circuitry (124), a set of aviation weights by way of a custom ground-truth dataset, and (ii) fine-tuning, by way of the processing circuitry (124), the set of aviation weights by way of one or more mixed precision techniques.

21. The method (300) as claimed in claim 19, wherein, for fine-tuning the set of aviation weights, the method (300) comprises (i) iteratively updating, by way of the processing circuitry (124), a forward pass and a backward pass of the set of aviation weights by way of a set of pre-defined lower-precision floating-point numbers, and (ii) iteratively updating, by way of the processing circuitry (124), the set of aviation weights by way of a set of pre-defined higher-precision numbers.

22. The method (300) as claimed in claim 19, wherein, to generate a depth map, the method (300) comprises combining the one or more images captured by the first set of sensors (120a) using one or more artificial intelligence techniques.

23. The method (300) as claimed in claim 22, wherein, upon generating the depth map, the method (300) further comprises determining one or more trajectories of flight from the source point to the destination point based on the depth map.

24. The method (300) as claimed in claim 23, wherein, upon determining the one or more trajectories of flight, the method (300) further comprises (i) identifying, by way of the processing circuitry (124), a numerical count of occlusions in each trajectory of the one or more trajectories of flight by way of an object detection technique, and (ii) determining, by way of the processing circuitry (124), an optimized trajectory of the one or more trajectories of flight based on the numerical count of occlusions in each trajectory, using one or more artificial intelligence techniques.

25. The method (300) as claimed in claim 19, wherein the method (300) further comprises capturing, by way of a third set of sensors (120c) of the sensing unit (120), a plurality of high-resolution images of the destination point.

26. The method (300) as claimed in claim 25, wherein, upon capturing the plurality of high-resolution images of the destination point, the method (300) comprises (i) identifying, by way of the processing circuitry (124), an object of interest from the plurality of high-resolution images of the destination point using at least one of, one or more object detection techniques and one or more artificial intelligence techniques, and (ii) identifying, by way of the processing circuitry (124), one or more high-resolution images of the plurality of high-resolution images in which the object of interest is identified.

27. The method (300) as claimed in claim 26, wherein, upon identifying the one or more high-resolution images, the method (300) comprises sending, by way of the processing circuitry (124), the one or more high-resolution images to the user device (102).

Documents

Application Documents

# Name Date
1 202211035946-STATEMENT OF UNDERTAKING (FORM 3) [23-06-2022(online)].pdf 2022-06-23
2 202211035946-PROVISIONAL SPECIFICATION [23-06-2022(online)].pdf 2022-06-23
3 202211035946-FORM FOR STARTUP [23-06-2022(online)].pdf 2022-06-23
4 202211035946-FORM FOR SMALL ENTITY(FORM-28) [23-06-2022(online)].pdf 2022-06-23
5 202211035946-FORM 1 [23-06-2022(online)].pdf 2022-06-23
6 202211035946-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [23-06-2022(online)].pdf 2022-06-23
7 202211035946-EVIDENCE FOR REGISTRATION UNDER SSI [23-06-2022(online)].pdf 2022-06-23
8 202211035946-DRAWINGS [23-06-2022(online)].pdf 2022-06-23
9 202211035946-DECLARATION OF INVENTORSHIP (FORM 5) [23-06-2022(online)].pdf 2022-06-23
10 202211035946-FORM-26 [01-08-2022(online)].pdf 2022-08-01
11 202211035946-Proof of Right [08-12-2022(online)].pdf 2022-12-08
12 202211035946-DRAWING [23-06-2023(online)].pdf 2023-06-23
13 202211035946-COMPLETE SPECIFICATION [23-06-2023(online)].pdf 2023-06-23
14 202211035946-Defence-25-08-2023.pdf 2023-08-25
15 202211035946-FORM28 [26-10-2023(online)].pdf 2023-10-26
16 202211035946-Covering Letter [26-10-2023(online)].pdf 2023-10-26
17 202211035946-FORM 3 [07-11-2023(online)].pdf 2023-11-07
18 202211035946-Defence-25-01-2024.pdf 2024-01-25
19 202211035946-Defence-16-05-2024.pdf 2024-05-16
20 202211035946-Covering Letter [28-06-2024(online)].pdf 2024-06-28
21 202211035946-REPLY FORM DRDO-051023.pdf 2024-08-20