Gesture Controlled Control Drone

Abstract: A gesture-controlled control drone comprises a UAV 101 configured with propellers 102 and an imaging unit 103 connected with the UAV 101; an eyewear 104 comprising a pair of lenses 105 mounted within a rim 106 and a pair of extendable temples 107 pivotally attached to lateral portions of the rim 106 to stabilise the rim 106 against the eyes of the user; a communication unit to establish an operative connection with the UAV 101; an IMU 108 connected with a control unit to detect and track the head movements of the user; an IR (infrared) sensor to capture and track the eye movements of the user; a camera 109 attached to the rim 106 to capture hand movements of the user; and a HUD (heads-up display) 110 to display a live video feed captured by the imaging unit 103 of the UAV 101 and received via the communication unit.

Patent Information

Application #: 202521050919
Filing Date: 27 May 2025
Publication Number: 25/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

Marwadi University
Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Inventors

1. Ronit Motivaras
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
2. Chandrasinh D Parmar
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
3. Nishith Kotak
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Specification

Description:

FIELD OF THE INVENTION

[0001] The present invention relates to a gesture-controlled control drone that is capable of detecting the instant orientation of the drone along with the forces operating over the drone and taking the necessary steps for correcting the position of the drone thereby enhancing the drone's flight stability and safety. The present invention is also capable of detecting the presence of obstacles in front of the drone and taking the necessary steps for preventing the collision between the drone and the obstacle, thereby reducing the risk of damage to the drone.

BACKGROUND OF THE INVENTION

[0002] Unmanned Aerial Vehicles (UAVs), commonly known as drones, have become increasingly essential across various sectors due to their versatility, efficiency and cost-effectiveness. They play a crucial role in applications such as aerial photography, agriculture, disaster management, surveillance, and military operations. UAVs enable rapid, real-time data collection from hard-to-reach or hazardous areas, enhancing decision-making and operational safety. Their ability to access remote locations with minimal human risk makes them invaluable for environmental monitoring, infrastructure inspection and search-and-rescue missions. The importance of UAVs continues to grow, transforming industries and contributing to innovations in data acquisition, transportation, and security.

[0003] Traditional methods of controlling drones typically involve manual operation using remote controllers, joysticks, or touchscreen interfaces. These methods require physical input devices and direct line-of-sight, often resulting in limited flexibility and responsiveness. Some advanced drones utilize pre-programmed flight paths or GPS-based navigation. Traditional control methods have several drawbacks, including limited intuitiveness and the need for direct physical interaction, which is inconvenient in certain environments. They often demand precise manual operation and are challenging for users without extensive training. Additionally, these methods lack flexibility in dynamic situations, are susceptible to signal interference, and do not allow for hands-free or natural gesture-based control, reducing overall efficiency and user convenience.

[0004] US20170235308A1 discloses a method, system, and/or computer program product that controls movement and adjusts operations of an aerial drone. A drone camera observes an aerial maneuver physical gesture by a user. The aerial drone then performs an aerial maneuver that correlates to the gesture. The drone camera also observes the user performing a physical action, and one or more processors associate the physical action with a particular type of activity. A drone on-board computer adjusts an operation of the aerial drone based on that type of activity.

[0005] US11755041B2 discloses techniques for controlling an autonomous vehicle such as an unmanned aerial vehicle (UAV) using objective-based inputs. In an embodiment, the underlying functionality of an autonomous navigation system is exposed via an application programming interface (API), allowing the UAV to be controlled by specifying a behavioral objective, for example, using a call to the API to set parameters for the behavioral objective. The autonomous navigation system can then incorporate perception inputs, such as sensor data from sensors mounted to the UAV, and the set parameters using a multi-objective motion planning process to generate a proposed trajectory that most closely satisfies the behavioral objective in view of certain constraints. In some embodiments, developers can utilize the API to build customized applications for the UAV. Such applications, also referred to as “skills,” can be developed, shared, and executed to control the behavior of an autonomous UAV and aid in overall system improvement.

[0006] Conventionally, many drones have been developed to operate under the control of the user's gestures, but they are unable to detect the instant orientation of the drone, along with the forces acting on the drone, and to correct the drone's position so as to enhance flight stability and safety. They are also unable to detect the presence of obstacles in front of the drone and to prevent a collision between the drone and the obstacle, thereby reducing the risk of damage to the drone.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a drone that is capable of detecting the instant orientation of the drone, along with the forces acting on the drone, and correcting the position of the drone to enhance flight stability and safety. Additionally, the drone needs to be capable of detecting the presence of obstacles in front of the drone and preventing a collision between the drone and the obstacle, thereby reducing the risk of damage to the drone.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a drone that is capable of detecting the instant orientation of the drone along with the forces operating over the drone and taking the necessary steps for correcting the position of the drone thereby enhancing the drone's flight stability and safety.

[0010] Another object of the present invention is to develop a drone that is capable of detecting the instant location of the drone for preventing the drone from traversing outside a predefined geo-fence, thereby enhancing safety and operational control.

[0011] Yet another object of the present invention is to develop a drone that is capable of detecting the obstacles in front of the drone and taking the necessary steps for preventing the collision between the drone and the obstacle, thereby reducing the risk of damage to the drone.

[0012] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0013] The present invention relates to a gesture-controlled control drone that is capable of detecting the instant location of the drone for preventing the drone from traversing outside a predefined geo-fence, thereby enhancing safety and operational control.

[0014] According to an embodiment of the present invention, a gesture-controlled control drone comprises a UAV (unmanned aerial vehicle) configured with propellers and an imaging unit connected with the UAV by means of a ball and socket joint; an eyewear comprising a pair of lenses mounted within a rim and a pair of extendable temples pivotally attached to lateral portions of the rim to stabilise the rim against the eyes of the user; a communication unit installed with the rim to establish an operative connection with the UAV; an IMU (inertial measurement unit) installed with the rim and connected with a control unit to detect and track the head movement of the user and accordingly actuate the communication unit to transmit a command to the UAV relating to the speed of the UAV; an IR (infrared) sensor installed with the rim to capture and track the eye movements of the user and accordingly actuate the communication unit to transmit a command to the UAV relating to the direction of the UAV; a camera attached to the rim to capture hand movements of the user and actuate the communication unit to transmit a command to the UAV relating to the direction of the imaging unit; and a HUD (heads-up display) embedded in the lenses to display a live video feed captured by the imaging unit of the UAV and received via the communication unit.

[0015] According to another embodiment of the present invention, the drone further comprises a gyroscope and an accelerometer installed with the UAV to detect an instant orientation of the UAV, along with the forces acting on the UAV, to dynamically actuate the propellers to correct the position of the UAV; an AI-based filtering module configured with the control unit to detect unintentional head, eye and hand movements of the user to prevent transmission of accidental commands to the UAV; a stress detection module configured with the control unit that receives captured data relating to the head, eyes and hands of the user to detect a condition of stress and initiate an automated landing of the UAV; a detection module configured with the control unit that receives prior footage captured by the imaging unit to determine objects of interest for the user and actuate the ball and socket joint to rotate the imaging unit in the direction of the object of interest while capturing footage; a GPS (Global Positioning System) unit installed with the UAV to detect an instant location of the UAV to prevent the UAV from traversing outside a predefined geo-fence; a laser sensor embedded in the UAV, in synchronisation with the imaging unit, to detect obstacles in front of the UAV and actuate the propellers to change the direction of the UAV to prevent collision; a microphone attached to the temple to receive voice commands from the user regarding initiating, pausing and halting the capturing of visuals via the imaging unit; and a frame attached over the imaging unit, having an extendable plate to cover the imaging unit to protect it from damage.

[0016] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0017] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a gesture-controlled control drone.

DETAILED DESCRIPTION OF THE INVENTION

[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description that the invention is not limited to these illustrated embodiments and that it also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0019] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having" and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0020] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0021] The present invention relates to a gesture-controlled control drone that is capable of detecting the presence of obstacles in front of the drone and taking the necessary steps for preventing the collision between the drone and the obstacle, thereby reducing the risk of damage to the drone.

[0022] Referring to Figure 1, an isometric view of a gesture-controlled control drone is illustrated, comprising a UAV (unmanned aerial vehicle) 101 configured with propellers 102, an imaging unit 103 connected with the UAV 101, an eyewear 104 comprising a pair of lenses 105 mounted within a rim 106 and a pair of extendable temples 107, an IMU (inertial measurement unit) 108 installed with the rim 106, a camera 109 attached to the rim 106, a HUD (heads-up display) 110 embedded in the lenses 105, a frame 111 attached over the imaging unit 103 and having an extendable plate 112, and a microphone 113 attached to the temple 107.

[0023] The drone disclosed herein employs a UAV (unmanned aerial vehicle) 101. This UAV 101 is configured with propellers 102. The UAV 101 is preferably constructed using lightweight, durable materials to ensure optimal flight performance and safety. Common materials include composites such as carbon fiber-reinforced polymers, which offer high strength-to-weight ratios, rigidity and resistance to environmental stresses. The propellers 102 of the UAV 101 are essential components responsible for generating lift and enabling movement. The propellers 102 are preferably made of, but not limited to, materials such as plastic, carbon fiber or composites. When powered, the propellers 102 spin rapidly to create airflow, producing thrust that lifts the drone into the air. An imaging unit 103 is connected with the UAV 101 by means of a ball and socket joint.

[0024] For activating the drone, the user needs to press a push button arranged on the UAV 101, which in turn activates all the related components for performing the desired task. After the button is pressed, a closed electrical circuit is formed and current flows to power an inbuilt control unit, allowing all the linked components to perform their respective tasks upon actuation.

[0025] An eyewear 104 is wirelessly connected with the UAV 101, comprising a pair of lenses 105 mounted within a rim 106 and a pair of extendable temples 107. These temples 107 are pivotally attached to the lateral portions of the rim 106 to stabilise the rim 106 against the eyes of the user. The extendable temples 107 extend and retract using nested sections that slide within each other, driven by a pneumatic unit. The pneumatic unit operates using compressed air to drive a piston inside a cylinder: when air is supplied to one side of the piston, the pressure pushes the piston rod outward, causing extension; to retract, air is supplied to the opposite side while the initial chamber is vented, pulling the piston rod back.

[0026] For establishing an operative connection with the UAV 101, a communication unit is configured with the rim 106. The communication unit includes, but is not limited to, a Wi-Fi (Wireless Fidelity) module, a Bluetooth module, or a GSM (Global System for Mobile Communication) module. The communication unit used in the present invention is preferably a Wi-Fi module that contains transmitters and receivers using radio-frequency signals to transmit data wirelessly to the microcontroller. The wireless module typically includes components such as antennas, amplifiers and processors to facilitate communication, and is further connected to networks such as Wi-Fi, Bluetooth, or cellular networks, allowing drones to exchange information over short or long distances.
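
As a minimal sketch of such an operative connection (the application does not specify a wire protocol, so the UDP transport, the JSON message format and the address below are illustrative assumptions), the eyewear's communication unit could transmit control commands to the UAV 101 as follows:

```python
import json
import socket

# Hypothetical UAV address and port; not specified in this application.
UAV_ADDR = ("192.168.4.1", 5005)

def send_command(sock: socket.socket, command: str, value: float) -> None:
    """Encode a control command as JSON and send it to the UAV over UDP."""
    payload = json.dumps({"cmd": command, "value": value}).encode("utf-8")
    sock.sendto(payload, UAV_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_command(sock, "speed", 2.5)      # head movement -> speed command
send_command(sock, "heading", 90.0)   # eye movement -> direction command
```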

[0027] An IMU (inertial measurement unit) 108 is installed with the rim 106 and connected with the control unit to detect and track the head movement of the user. The IMU 108 functions by integrating multiple sensors, primarily an accelerometer, a gyroscope and a magnetometer, to precisely detect and quantify motion and orientation. The accelerometer operates on the principle of measuring linear acceleration by detecting the displacement of a suspended mass. Internally, it typically consists of a micro-machined proof mass attached to a spring-like structure. When the IMU 108 experiences acceleration along a particular axis, the inertial forces cause the proof mass to shift relative to the housing. This displacement alters the capacitance, which is then converted into an electrical signal proportional to the acceleration.
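
As a worked illustration of this principle (a textbook spring-mass model of a capacitive MEMS accelerometer, not taken from this application), a proof mass m on a spring of stiffness k displaces by x under acceleration a, and the differential capacitance change is approximately proportional to that displacement:

```latex
% Spring-mass model of a capacitive MEMS accelerometer (illustrative)
F = ma = kx \;\Rightarrow\; x = \frac{ma}{k}, \qquad
\Delta C \approx \frac{2\varepsilon A}{d^{2}}\, x
\;\Rightarrow\; a \approx \frac{k d^{2}}{2\varepsilon A m}\,\Delta C
```

Here \varepsilon is the permittivity of the capacitor gap, A the plate area, and d the nominal plate separation; for small displacements the measured \Delta C is therefore linear in the acceleration a.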

[0028] The gyroscope measures angular velocity by detecting changes in orientation over time. Internally, micro-electromechanical systems (MEMS) gyroscopes often utilize the Coriolis effect. They contain a vibrating structure that oscillates at a known frequency. When the drone rotates, the Coriolis force acts on the vibrating element, causing a measurable change in the vibration characteristics, such as amplitude, phase, or frequency shift. This change is detected by the capacitive sensor within the gyroscope. The circuitry interprets these variations to compute angular velocity around each axis, providing real-time rotational data.
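
The relation underlying this measurement is the textbook Coriolis force on the vibrating proof mass, stated here for clarity:

```latex
% Coriolis force on a proof mass of mass m vibrating with velocity v
\vec{F}_{c} = -2m\,\vec{\Omega} \times \vec{v}
```

Because the force, and hence the induced change in the vibration, scales linearly with the angular velocity \vec{\Omega}, the detected shift yields the rotation rate directly.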

[0029] The magnetometer detects the magnetic field vector around the drone to determine the orientation relative to Earth's magnetic field. Internally, it often uses an anisotropic magnetoresistance (AMR) sensor. These sensors consist of materials whose electrical resistance varies with the magnetic field. When exposed to an external magnetic field, the magnetometer produces a voltage proportional to the magnitude and direction of the magnetic flux passing through the sensing element. The circuitry then converts these voltages into digital signals representing the magnetic field's vector components, which are used to determine heading or orientation relative to magnetic north. This helps correct drift errors from the gyroscope and provides an absolute heading reference. In accordance with the detected head movement of the user, the communication unit is actuated to transmit a command to the UAV 101 relating to the speed of the UAV 101.
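
One common way to turn these vector components into a heading is a tilt-compensated computation using the pitch and roll reported by the IMU 108; the sketch below is illustrative only, since axis and sign conventions vary between sensor mountings and this application does not specify one:

```python
import math

def tilt_compensated_heading(mx, my, mz, pitch, roll):
    """Heading in degrees from magnetic north, computed from magnetometer
    components (mx, my, mz) after compensating for pitch and roll (radians).
    One common axis convention; actual signs depend on sensor mounting."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-yh, xh)) % 360.0

print(tilt_compensated_heading(0.3, 0.1, 0.4, pitch=0.05, roll=0.02))
```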

[0030] For capturing and tracking the eye movements of the user, an IR (infrared) sensor is installed with the rim 106. The IR (infrared) sensor functions by emitting infrared light toward the eyes and detecting the reflected signals to determine gaze direction. Internally, the sensor typically comprises an IR light-emitting diode (LED) that projects a focused infrared beam onto the eye and an array of photodetectors that receive the reflected IR light from the cornea, pupil and surrounding eye features. The intensity and pattern of the reflected IR light vary depending on the position and movement of the eye, such as eye rotation. The photodetectors convert these reflected IR signals into electrical signals, which are then processed by the sensor’s internal circuitry.

[0031] The signal processing protocol analyzes these electrical signals to identify features like corneal reflections. By tracking the relative positions and movements of these features over time, the eye movement patterns are calculated. This information is transmitted to the control unit for real-time eye-tracking. In accordance with the detected eye movement of the user, the communication unit is actuated to transmit a command to the UAV 101 relating to the direction of the UAV 101. The IR sensor also detects the blinks of the user’s eyes to actuate the communication unit to transmit a command to the UAV 101 relating to the hovering of the UAV 101.
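
A minimal sketch of such a mapping (the deadband, command names and pixel units are illustrative assumptions, not specified in this application): the offset between the pupil centre and the corneal reflection is converted into a direction command, and a detected blink maps to a hover command.

```python
def gaze_to_command(pupil, glint, blink_detected, deadband=5.0):
    """Map the pupil-to-corneal-reflection offset (pixels) to a UAV
    direction command; a blink maps to hovering. Thresholds are illustrative."""
    if blink_detected:
        return "hover"
    dx = pupil[0] - glint[0]
    dy = pupil[1] - glint[1]
    if abs(dx) < deadband and abs(dy) < deadband:
        return "hold_course"
    if abs(dx) >= abs(dy):
        return "turn_right" if dx > 0 else "turn_left"
    return "ascend" if dy < 0 else "descend"

print(gaze_to_command(pupil=(320, 240), glint=(300, 238), blink_detected=False))
```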

[0032] A camera 109 is attached to the rim 106 to capture the hand movements of the user and actuate the communication unit to transmit a command to the UAV 101 relating to the direction of the imaging unit 103. The camera 109 functions by capturing real-time visual data of the user's hand movements through its internal imaging components. The camera 109 comprises an image sensor, such as a charge-coupled device (CCD), which converts incoming light reflected from the user's hand into electrical signals. The lens arrangement focuses the scene onto the sensor, ensuring clear and detailed images. The sensor then captures a sequence of still images, which are processed by the internal circuitry to identify key features.
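
As a schematic of this processing step (a deliberately simple frame-differencing approach, assumed here because the application does not disclose its gesture-recognition method), the position of the moving hand relative to the frame centre can be turned into a pan/tilt command for the imaging unit 103:

```python
import numpy as np

def hand_motion_direction(prev_frame, curr_frame, threshold=30):
    """Locate changed pixels between two grayscale frames and map the
    centroid's offset from the frame centre to a gimbal command."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int)) > threshold
    ys, xs = np.nonzero(diff)
    if xs.size == 0:
        return None  # no hand movement detected
    h, w = curr_frame.shape
    dx = xs.mean() - w / 2
    dy = ys.mean() - h / 2
    if abs(dx) >= abs(dy):
        return "pan_right" if dx > 0 else "pan_left"
    return "tilt_down" if dy > 0 else "tilt_up"

prev = np.zeros((480, 640), dtype=np.uint8)
curr = prev.copy()
curr[200:260, 500:560] = 255              # bright patch right of centre
print(hand_motion_direction(prev, curr))  # pan_right
```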

[0033] For displaying a live video feed captured from the imaging unit 103 of the UAV 101, a HUD (heads-up display) 110 is embedded in the lenses 105. This live video feed is received via the communication unit. The imaging unit 103 comprises an image-capturing arrangement including a set of lenses that captures the live video feed in the vicinity of the UAV 101; the captured video feeds are stored within a memory of the imaging unit 103 in the form of optical data. The HUD 110 functions by receiving the live video feed transmitted from the UAV's imaging unit 103 through the communication unit.

[0034] The HUD 110 comprises a transparent display panel, such as an LCD that is positioned within the user's line of sight. This display is integrated with the lenses 105, allowing the live video feed to be projected directly into the user's view without obstructing their natural vision. The received video data, transmitted wirelessly via radio frequency, is processed by an onboard display controller or microprocessor that converts the incoming data into a suitable visual format. The processed video signal is then sent to the microdisplay, which overlays the real-time video feed onto the transparent lens surface. The system ensures proper alignment and brightness adjustment so that the live feed appears as an augmented overlay within the user’s natural field of vision.

[0035] A gyroscope and an accelerometer are installed with the UAV 101 to detect an instant orientation of the UAV 101, along with the forces acting on the UAV 101, to dynamically actuate the propellers 102 to correct the position of the UAV 101. The gyroscope installed on the UAV 101 functions by detecting angular velocity, that is, how quickly the UAV 101 is rotating around its axes, using the principles of angular momentum. Internally, the gyroscope typically comprises a vibrating mass that oscillates at a known frequency. When the UAV 101 rotates, Coriolis forces act upon the vibrating elements, causing a shift in their oscillation frequency. These changes are detected by the sensor and converted into electrical signals, which represent the rate of rotation about each axis. The gyroscope's real-time angular velocity data is then processed by the control system to determine the UAV's current orientation and rotational movements.

[0036] The accelerometer on the UAV 101 detects linear acceleration forces acting upon it along the axes, including gravity and movement-induced accelerations. The accelerometer consists of a tiny mass-spring arrangement that responds to acceleration by displacing slightly from its resting position. This displacement alters the capacitance within the sensor, generating an electrical signal proportional to the magnitude and direction of the applied force. The accelerometer's embedded circuitry processes these signals to quantify the instantaneous acceleration along each axis. During flight, this data helps determine changes in velocity and orientation, such as tilting or acceleration due to external forces.
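
The application does not name a method for fusing the two sensors' outputs, so the following is only a sketch of one common choice, a complementary filter, which blends the integrated gyroscope rate with the accelerometer's gravity-referenced angle:

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate.
    angle: previous pitch (rad); gyro_rate: angular velocity (rad/s);
    accel_x, accel_z: accelerations (m/s^2); dt: timestep (s)."""
    gyro_angle = angle + gyro_rate * dt          # integrate rotation rate
    accel_angle = math.atan2(accel_x, accel_z)   # gravity-referenced pitch
    return alpha * gyro_angle + (1 - alpha) * accel_angle

pitch = 0.0
pitch = complementary_filter(pitch, gyro_rate=0.05, accel_x=0.4,
                             accel_z=9.7, dt=0.01)
print(pitch)
```

The gyroscope dominates over short timescales (smooth but drifting), while the accelerometer's absolute gravity reference slowly corrects that drift.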

[0037] With the control unit, an AI-based filtering module is configured to detect unintentional head, eye and hand movements of the user to prevent the transmission of accidental commands to the UAV 101. The AI-based filtering module functions by continuously analyzing real-time sensor data capturing the user's head, eye, and hand movements. By utilizing machine learning protocols, such as deep neural networks trained on extensive datasets of intentional versus unintentional movements, the module learns to distinguish between deliberate commands and accidental gestures. It processes temporal and spatial features, such as movement speed, trajectory and context, to assess the likelihood that a detected gesture is intentional. When an unintentional movement, such as a slight head tilt or minor eye twitch, is identified, the command is filtered out, preventing its transmission to the UAV 101. Conversely, when a movement matches learned patterns of intentional interaction, such as a deliberate hand signal or gaze shift, the module allows the command to pass through for execution. This filtering ensures that only purposeful user inputs influence the UAV's behavior, enhancing operational safety and control accuracy.
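
Since the application does not disclose a specific model, the following is only a schematic of the gating step: a learned intent score over simple temporal features decides whether a gesture is forwarded as a command.

```python
def filter_command(gesture, command, score_fn, threshold=0.5):
    """Forward the command only when the trained model (stood in for by
    score_fn) judges the gesture intentional; the features and threshold
    are illustrative assumptions."""
    if score_fn(gesture["speed"], gesture["duration"]) >= threshold:
        return command
    return None  # accidental movement: suppress transmission

def toy_model(speed, duration):
    # Stand-in for a trained classifier: slow, sustained movements
    # score higher than fast, fleeting ones.
    return min(1.0, duration / (1.0 + speed))

print(filter_command({"speed": 0.2, "duration": 1.5}, "turn_left", toy_model))
```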

[0038] A stress detection module is configured with the control unit and receives captured data relating to the head, eyes and hands of the user to detect a condition of stress and initiate an automated landing of the UAV 101. The stress detection module functions by analyzing real-time physiological and behavioral data captured from the user's head, eyes, and hands through integrated sensors and cameras within the control unit. This module employs machine learning protocols trained to recognize physiological indicators of stress, such as rapid eye movements, increased head motion, muscle tension and changes in facial expressions or eye-blink rates. The module processes this multimodal data to identify patterns consistent with elevated stress levels, such as heightened heart-rate proxies, gaze aversion or involuntary gestures. When the module detects significant signs of stress, it triggers an automated response, such as initiating a safe landing of the UAV 101, to mitigate potential risks caused by user impairment.
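
A schematic of the decision logic (the indicators, weights and threshold below are illustrative assumptions; the application does not specify how the stress condition is computed):

```python
def stress_score(blink_rate, head_motion, hand_tremor,
                 weights=(0.4, 0.3, 0.3)):
    """Combine normalized stress indicators (each scaled to [0, 1])
    into a single weighted score."""
    w_blink, w_head, w_hand = weights
    return w_blink * blink_rate + w_head * head_motion + w_hand * hand_tremor

def check_stress(blink_rate, head_motion, hand_tremor, limit=0.7):
    """Trigger an automated landing when the stress score exceeds the limit."""
    if stress_score(blink_rate, head_motion, hand_tremor) > limit:
        return "initiate_automated_landing"
    return "continue_flight"

print(check_stress(blink_rate=0.9, head_motion=0.8, hand_tremor=0.6))
```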

[0039] The prior footage captured by the imaging unit 103 is received by a detection module that is configured with the control unit to determine objects of interest for the user. The detection module operates by processing the prior footage captured by the imaging unit 103 through computer vision protocols. The module begins by analyzing the stored video data to identify and classify objects within the scene using methods such as object detection models. These models examine features like shape, color, texture and movement patterns to distinguish objects of interest, such as other vehicles, obstacles or specific targets relevant to the user's mission. The module also tracks the location, size and movement of these objects over time to assess their relevance and potential threat levels.

[0040] In accordance with the detected objects of interest, the ball and socket joint is actuated to rotate the imaging unit 103 in the direction of the object of interest while capturing footage. The ball and socket joint enables precise rotational movement in multiple directions by integrating an electric motor. The ball, typically attached to a shaft, fits into the socket, allowing the imaging unit 103 to rotate freely around several axes. The motor is responsible for rotating the ball within the socket, providing controlled movement along different planes.
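
As a sketch of how the joint's motor might be driven to keep a detected object centred (a simple proportional controller; the gains and the bounding-box format are assumptions, since the application does not specify a control law):

```python
def center_object(bbox, frame_w, frame_h, k_pan=0.05, k_tilt=0.05):
    """Return pan/tilt corrections (degrees) that steer the imaging unit
    toward the centre of a detected bounding box (x, y, w, h in pixels)."""
    x, y, w, h = bbox
    err_x = (x + w / 2) - frame_w / 2   # horizontal offset from frame centre
    err_y = (y + h / 2) - frame_h / 2   # vertical offset from frame centre
    return k_pan * err_x, k_tilt * err_y

pan_deg, tilt_deg = center_object(bbox=(500, 200, 80, 60),
                                  frame_w=1280, frame_h=720)
print(pan_deg, tilt_deg)  # negative pan: rotate left toward the object
```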

[0041] In synchronisation with the imaging unit 103, a laser sensor embedded in the UAV 101 detects obstacles in front of the UAV 101 to actuate the propellers 102 to change the direction of the UAV 101 and prevent collision. The laser sensor functions by emitting rapid, focused laser pulses, typically via a laser diode, towards the environment in front of the UAV 101. These pulses travel outward until they encounter an obstacle, reflecting back to the sensor's photodetector component, such as a photodiode. The sensor then measures the time taken for the laser pulses to return, known as the Time-of-Flight (ToF), which is used to calculate the precise distance to the obstacle. The internal circuitry processes this data in real time, continuously monitoring the proximity of objects ahead of the UAV 101. When an obstacle is detected within a predefined safety range, the sensor's embedded control unit triggers the UAV 101 to adjust the propellers' direction, causing the UAV 101 to change course and avoid the collision.
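
The distance computation follows directly from the round-trip time; a minimal sketch (the safety range is an assumed parameter):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_s):
    """Distance to the obstacle from the pulse's round-trip time: the
    pulse travels out and back, so halve the product."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def avoidance_needed(round_trip_s, safety_range_m=5.0):
    """True when the obstacle lies inside the (assumed) safety range."""
    return tof_distance(round_trip_s) < safety_range_m

print(tof_distance(33e-9))        # ~4.95 m for a 33 ns round trip
print(avoidance_needed(33e-9))    # True: trigger a course change
```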

[0042] For preventing the UAV 101 from traversing outside a predefined geo-fence, a GPS (Global Positioning System) unit is mounted with the UAV 101 and detects an instant location of the UAV 101. The GPS unit functions by receiving signals from a constellation of satellites orbiting the Earth. The GPS unit processes these signals to determine the precise instantaneous location of the UAV 101 through trilateration, which involves calculating the distance from multiple satellites based on the time delay of the received signals. The GPS receiver then translates this data into accurate geographic coordinates (latitude, longitude, and altitude). This real-time location information is continuously transmitted to the control unit, enabling the UAV 101 to monitor its position relative to a predefined geo-fence boundary. If the UAV 101 approaches the geo-fence limits, the control system initiates corrective actions, such as navigation adjustments, to prevent the UAV 101 from traversing outside the designated operational area, thereby ensuring operational safety.
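
One simple geo-fence check, assuming a circular fence around a home point (the application does not specify the fence geometry), compares the haversine distance of the current fix against the fence radius:

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) fixes."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def outside_geofence(lat, lon, home_lat, home_lon, radius_m):
    """True when the UAV's GPS fix lies outside the circular geo-fence."""
    return haversine_m(lat, lon, home_lat, home_lon) > radius_m

print(outside_geofence(22.2960, 70.7930, 22.2950, 70.7920, radius_m=100.0))
```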

[0043] A microphone 113 is attached to the temple 107 to receive voice commands from the user regarding initiating, pausing and halting the capturing of visuals via the imaging unit 103. The microphone 113 processes the user's voice commands by converting sound waves into electrical signals, which are analog in nature. These analog signals are digitized using an analog-to-digital converter (ADC) for further processing. The digital data undergoes pre-processing, including noise reduction and filtering, to improve clarity by eliminating background noise. The cleaned signal is passed to a speech-recognition stage powered by artificial intelligence, which analyzes the input to detect keywords or phrases. Once a command is recognized, the microcontroller maps it to an action and triggers the initiating, pausing or halting of the capture of visuals.
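
Downstream of the speech-recognition stage, the keyword-to-action mapping could be as simple as the following sketch (the recognizer is treated as a black box that emits a transcript string; the phrases and action names are illustrative):

```python
# Illustrative keyword-to-action table; not specified in this application.
COMMANDS = {
    "start recording": "imaging_unit.start",
    "pause recording": "imaging_unit.pause",
    "stop recording": "imaging_unit.halt",
}

def map_voice_command(transcript):
    """Return the imaging-unit action for a recognized phrase, if any."""
    for phrase, action in COMMANDS.items():
        if phrase in transcript.lower():
            return action
    return None

print(map_voice_command("Please start recording now"))  # imaging_unit.start
```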

[0044] For covering the imaging unit 103 to protect it from damage, a frame 111 having an extendable plate 112 is attached over the imaging unit 103. The extendable plate 112 operates using a telescopic method, where the plate 112 extends and retracts using nested sections that slide within each other, driven by the pneumatic unit. The pneumatic unit works in a similar manner to that explained above, thereby protecting the imaging unit 103 from external damage.

[0045] The present invention works best in the following manner. The UAV (unmanned aerial vehicle) 101 is configured with propellers 102, and the imaging unit 103 is connected with the UAV 101 by means of the ball and socket joint. The eyewear 104 comprises the pair of lenses 105 mounted within the rim 106 and the pair of extendable temples 107 pivotally attached to lateral portions of the rim 106 to stabilise the rim 106 against the eyes of the user. The communication unit establishes the operative connection with the UAV 101. The IMU (inertial measurement unit) 108, connected with the control unit, detects and tracks the head movement of the user to accordingly actuate the communication unit to transmit a command to the UAV 101 relating to the speed of the UAV 101. The IR (infrared) sensor captures and tracks the eye movements of the user to accordingly actuate the communication unit to transmit a command to the UAV 101 relating to the direction of the UAV 101. The IR sensor also detects blinks of the user's eyes to actuate the communication unit to transmit a command to the UAV 101 relating to the hovering of the UAV 101. The camera 109 captures hand movements of the user to actuate the communication unit to transmit a command to the UAV 101 relating to the direction of the imaging unit 103. The HUD (heads-up display) 110 displays the live video feed captured by the imaging unit 103 of the UAV 101 and received via the communication unit. The gyroscope and accelerometer detect the instant orientation of the UAV 101, along with the forces acting on the UAV 101, to dynamically actuate the propellers 102 to correct the position of the UAV 101.

[0046] In continuation, the AI-based filtering module detects unintentional head, eye and hand movements of the user to prevent transmission of accidental commands to the UAV 101. The stress detection module receives captured data relating to the head, eyes and hands of the user to detect a condition of stress and initiate the automated landing of the UAV 101. The detection module receives prior footage captured by the imaging unit 103 to determine objects of interest for the user and actuate the ball and socket joint to rotate the imaging unit 103 in the direction of the object of interest while capturing footage. The GPS (Global Positioning System) unit detects an instant location of the UAV 101 to prevent the UAV 101 from traversing outside the predefined geo-fence. The laser sensor, in synchronisation with the imaging unit 103, detects obstacles in front of the UAV 101 to actuate the propellers 102 to change the direction of the UAV 101 and prevent collision. The microphone 113 receives voice commands from the user regarding initiating, pausing and halting the capturing of visuals via the imaging unit 103. The frame 111 having the extendable plate 112 covers the imaging unit 103 to protect it from damage.

[0047] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) A gesture-controlled control drone, comprising:

i) a UAV (unmanned aerial vehicle) 101 configured with propellers 102 and an imaging unit 103 connected with the UAV 101 by means of a ball and socket joint;
ii) an eyewear 104 comprising a pair of lenses 105 mounted within a rim 106, and a pair of extendable temples 107 pivotally attached with lateral portions of the rim 106 to stabilise the rim 106 against eyes of the user;
iii) a communication unit installed with the rim 106 to establish an operative connection with the UAV 101;
iv) an IMU (inertial measurement unit) 108 installed with the rim 106 and connected with a control unit to detect and track the head movement of the user, to accordingly actuate the communication unit to transmit a command to the UAV 101 relating to the speed of the UAV 101;
v) an IR (infrared) sensor installed with the rim 106 to capture and track the eye movements of the user, to accordingly actuate the communication unit to transmit a command to the UAV 101 relating to the direction of the UAV 101;
vi) a camera 109 attached with the rim 106 to capture hand movements of the user, to actuate the communication unit to transmit a command to the UAV 101 relating to the direction of the imaging unit 103; and
vii) a HUD (heads up display) 110 embedded in the lenses 105 to display a live video feed captured from the imaging unit 103 of the UAV 101, received via the communication unit.

2) The drone as claimed in claim 1, wherein a gyroscope and an accelerometer are installed with the UAV 101 to detect an instant orientation of the UAV 101, along with the forces acting on the UAV 101, to dynamically actuate the propellers 102 to correct the position of the UAV 101.

3) The drone as claimed in claim 1, wherein the IR sensor detects blinks of the user’s eyes to actuate the communication unit to transmit a command to the UAV 101 relating to the hovering of the UAV 101.

4) The drone as claimed in claim 1, wherein an AI-based filtering module is configured with the control unit to detect unintentional head, eye and hand movements of the user to prevent transmission of accidental commands to the UAV 101.

5) The drone as claimed in claim 1, wherein a stress detection module is configured with the control unit to receive captured data relating to the head, eyes and hands of the user to detect a condition of stress, to initiate an automated landing of the UAV 101.

6) The drone as claimed in claim 1, wherein a detection module is configured with the control unit to receive prior footage captured by the imaging unit 103 to determine objects of interest for the user, to actuate the ball and socket joint to rotate the imaging unit 103 in the direction of the object of interest while capturing footage.

7) The drone as claimed in claim 1, wherein a GPS (Global Positioning System) unit is installed with the UAV 101 to detect an instant location of the UAV 101 to prevent the UAV 101 from traversing outside a predefined geo-fence.

8) The drone as claimed in claim 1, wherein a laser sensor is embedded in the UAV 101, in synchronisation with the imaging unit 103, to detect obstacles in front of the UAV 101 to actuate the propellers 102 to change direction of the UAV 101 to prevent collision.

9) The drone as claimed in claim 1, wherein a microphone 113 is attached with the temple 107 to receive voice commands from the user regarding initiating, pausing and halting capturing visuals via the imaging unit 103.

10) The drone as claimed in claim 1, wherein a frame 111 is attached over the imaging unit 103, having an extendable plate 112 to cover the imaging unit 103 to protect from damage.

Documents

Application Documents

# Name Date
1 202521050919-STATEMENT OF UNDERTAKING (FORM 3) [27-05-2025(online)].pdf 2025-05-27
2 202521050919-REQUEST FOR EXAMINATION (FORM-18) [27-05-2025(online)].pdf 2025-05-27
3 202521050919-REQUEST FOR EARLY PUBLICATION(FORM-9) [27-05-2025(online)].pdf 2025-05-27
4 202521050919-PROOF OF RIGHT [27-05-2025(online)].pdf 2025-05-27
5 202521050919-POWER OF AUTHORITY [27-05-2025(online)].pdf 2025-05-27
6 202521050919-FORM-9 [27-05-2025(online)].pdf 2025-05-27
7 202521050919-FORM FOR SMALL ENTITY(FORM-28) [27-05-2025(online)].pdf 2025-05-27
8 202521050919-FORM 18 [27-05-2025(online)].pdf 2025-05-27
9 202521050919-FORM 1 [27-05-2025(online)].pdf 2025-05-27
10 202521050919-FIGURE OF ABSTRACT [27-05-2025(online)].pdf 2025-05-27
11 202521050919-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [27-05-2025(online)].pdf 2025-05-27
12 202521050919-EVIDENCE FOR REGISTRATION UNDER SSI [27-05-2025(online)].pdf 2025-05-27
13 202521050919-EDUCATIONAL INSTITUTION(S) [27-05-2025(online)].pdf 2025-05-27
14 202521050919-DRAWINGS [27-05-2025(online)].pdf 2025-05-27
15 202521050919-DECLARATION OF INVENTORSHIP (FORM 5) [27-05-2025(online)].pdf 2025-05-27
16 202521050919-COMPLETE SPECIFICATION [27-05-2025(online)].pdf 2025-05-27
17 Abstract.jpg 2025-06-13