
Unmanned Aerial Vehicle (UAV) Based Imaging System

Abstract: The present disclosure provides an Unmanned Aerial Vehicle (UAV) based imaging system 100 for monitoring and forecasting hazards and disasters, such as landslides, floods, droughts, and the like. The UAV based imaging system 100 can include imaging units 102, such as a tracking camera 102-1 and a main camera 102-2 that can be operatively coupled to a UAV 110, a processing unit 104, a positioning unit 106, a driving unit 108, an altimeter 112, and a magnetometer 114. The UAV based imaging system 100 can obtain one or more images through the tracking camera 102-1 and can detect a region of interest (ROI) from the obtained images. Further, the UAV 110 moves towards the detected ROI to capture and analyze HD images of the ROI through the main camera 102-2 to monitor and forecast hazards and disasters.


Patent Information

Application #:
Filing Date: 19 June 2020
Publication Number: 52/2021
Publication Type: INA
Invention Field: MECHANICAL ENGINEERING
Status:
Email: info@khuranaandkhurana.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-03-15
Renewal Date:

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector - 9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. SOOD, Vishakha
PhD Scholar, Department of Electronics and Communication Engineering, Chitkara University, Punjab-140401, India.
2. GUSAIN, Hemendra Singh
H. No. 3177B, Sector 24D, Chandigarh - 160023, India.
3. GUPTA, Sheifali
Professor, Department of Electronics and Communication Engineering, Chitkara University, Punjab-140401, India.
4. SINGH, Sartajvir
Associate Professor, Department of Electronics and Communication Engineering, Chitkara University, Himachal Pradesh-174103, India.

Specification

[0001] The present disclosure relates to the field of image processing. More
particularly, the present disclosure relates to an unmanned aerial vehicle (UAV) based
imaging system.
BACKGROUND
[0002] Background description includes information that may be useful in
understanding the present invention. It is not an admission that any of the information
provided herein is prior art or relevant to the presently claimed invention, or that any
publication specifically or implicitly referenced is prior art.
[0003] In the field of remote sensing, peripheral data associated with landscapes of
earth is generally acquired by satellite earth observation sensors such as AWiFS and Land
sat. Such sensors are configured to acquire image of the earth during day-time. Moreover,
such sensors take a number of days to cover the entire periphery of the earth, which may
depend on spatial resolution of said sensors, where the spatial resolution can be defined as
area covered per pixel.
[0004] For instance, AWiFS has a 56-metre spatial resolution with a 5-day revisit
period, from which it can be inferred that AWiFS can acquire an image of the entire earth
in 5 days and the area covered by a pixel is 56 × 56 metres. In the case of Landsat-8, the
spatial resolution is 30 metres with a 16-day revisit period, which means Landsat-8 can
acquire an image of the entire earth in 16 days and the area covered by a pixel is 30 × 30
metres. It can be observed that as the spatial resolution improves from 56 metres to 30
metres, the revisit period increases from 5 days to 16 days; equivalently, as the revisit
period improves, the spatial resolution degrades. Hence, it can be inferred that the spatial
resolution of said sensors is inversely related to their revisit period.
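The coverage arithmetic above can be sketched as follows; the figures are those quoted for AWiFS and Landsat-8, while the helper function and dictionary layout are illustrative and not part of the disclosure.

```python
# Illustrative arithmetic for the spatial-resolution / revisit-period trade-off.
# Sensor figures are the ones quoted in the text above.

def pixel_area_m2(resolution_m):
    """Ground area covered by one square pixel, in square metres."""
    return resolution_m * resolution_m

sensors = {
    "AWiFS":     {"resolution_m": 56, "revisit_days": 5},
    "Landsat-8": {"resolution_m": 30, "revisit_days": 16},
}

for name, s in sensors.items():
    area = pixel_area_m2(s["resolution_m"])
    # Finer pixels (smaller area) come with a longer revisit period.
    print(f"{name}: {area} m^2 per pixel, full coverage every {s['revisit_days']} days")
```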
[0005] In order to observe changes in the surface of the earth caused by disasters
and hazards such as landslides, floods, and the like, earth-observation sensors such as
MODIS and SCATSAT-1 can be used. MODIS and SCATSAT-1 have revisit periods of
1-2 days and 1 day, at compromised spatial resolutions of 250 metres to 1 kilometre and
approximately 2 kilometres, respectively.
[0006] However, changes in the surface of the earth need to be monitored
continuously and precisely to forecast disasters and natural hazards and to mitigate their
effects. Therefore, sensors of low spatial resolution have to be used for such applications.
[0007] Image classifiers group pixels into classes based on similarity scores and
can be broadly categorized as per-pixel classifiers and subpixel classifiers. A per-pixel
classifier operates at the pixel level, whereas a subpixel classifier operates at the sub-pixel
level. Subpixel techniques, such as linear spectral mixing (LSM) and fuzzy classifiers,
deliver information about the different class categories within a specific pixel as fractions,
for example 10% snow, 40% barren, and 50% ice.
To validate a mixed-pixel classifier that assigns more than one class in a subpixel-classified
image, endmembers must be selected; these can be earth-surface components, such as
snow, barren land, and ice, within the area of an image pixel, for instance the 56 m × 56 m
footprint acquired by the AWiFS sensor. The validation process can be accomplished via
field observations and/or high-quality image acquisition with drones or UAVs (unmanned
aerial vehicles). However, field observations are sometimes not possible over inaccessible
areas such as the Himalayas or under extreme climate conditions. On the other hand,
high-quality image acquisition via UAV becomes costly for an image of 200 × 200 pixels.
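As a hedged illustration of the subpixel idea discussed above, the sketch below unmixes a two-endmember pixel under the linear spectral mixing (LSM) assumption; the endmember spectra and band values are invented for the example and are not data from the disclosure.

```python
# Minimal linear spectral mixing (LSM) sketch for a two-endmember mixed pixel.
# All reflectance values below are illustrative.

def unmix_two_endmembers(pixel, em_a, em_b):
    """Estimate the fraction of endmember A in a mixed pixel.

    Assumes pixel = f * em_a + (1 - f) * em_b in every band; the
    per-band estimates are averaged for a simple solution.
    """
    fractions = [(p - b) / (a - b) for p, a, b in zip(pixel, em_a, em_b) if a != b]
    return sum(fractions) / len(fractions)

snow = [0.9, 0.8, 0.7]    # illustrative endmember spectrum (3 bands)
barren = [0.3, 0.4, 0.5]  # illustrative endmember spectrum

mixed = [0.6, 0.6, 0.6]   # observed mixed-pixel reflectance
f_snow = unmix_two_endmembers(mixed, snow, barren)
print(f"snow fraction = {f_snow:.2f}, barren fraction = {1 - f_snow:.2f}")
```

With more than two endmembers, the same idea becomes a constrained least-squares problem rather than a per-band closed form.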
[0008] There is therefore a need in the art for a robust, accurate, fast, cost-effective,
efficient, and simple system that overcomes the above-mentioned and other limitations of
the existing solutions and acquires land-use and land-cover information over the transition
zones (mixed pixels) between two class categories.
OBJECTS OF THE PRESENT DISCLOSURE
[0009] Some of the objects of the present disclosure, which at least one embodiment
herein satisfies are as listed herein below.
[00010] It is an object of the present disclosure to provide an UAV based imaging
system for sub-pixel based classification of images.
[00011] It is another object of the present disclosure to provide an UAV based imaging
system for per-pixel based classification of an image.
[00012] It is another object of the present disclosure to provide an UAV based imaging
system for determining a region of interest based on edge detection.
[00013] It is another object of the present disclosure to provide an UAV based imaging
system for facilitating HD imaging of the determined region of interest.
[00014] It is another object of the present disclosure to provide a smart, efficient, and
cost effective UAV based imaging system.
SUMMARY
[00015] The present disclosure relates to the field of image processing. More
particularly, the present disclosure relates to an unmanned aerial vehicle (UAV) based
imaging system.
[00016] An aspect of the present disclosure pertains to an unmanned aerial vehicle
(UAV) based imaging system comprising: one or more imaging units operatively coupled to
a UAV, and configured to capture one or more images of a region; and a processing unit
operatively coupled to the one or more imaging units and the UAV, the processing unit
comprising one or more processors, and coupled with a memory, the memory storing
instructions executable by the one or more processors and configured to: pre-process the one
or more images captured by at least one of the imaging units, wherein the pre-processing
comprises detecting and extracting of peripheral attributes from the one or more captured
images; determine a Region of Interest (ROI) based on each of the pre-processed one or more
images; and delineate pixels associated with the determined ROI from pixels associated with
each of the pre-processed one or more images.
[00017] In an aspect, the peripheral attributes comprise any or a combination of colour,
texture, albedo, and variation associated with the region.
[00018] In an aspect, the processing unit may be configured to modify the extracted
peripheral attributes of each of the one or more pre-processed images.
[00019] In an aspect, the processing unit may be configured to detect edges associated
with the region based on the modified peripheral attributes associated with each of the
pre-processed one or more images.
[00020] In an aspect, the determination of the ROI may be associated with the detected
edges.
[00021] In an aspect, the UAV based imaging system comprises a positioning unit
operatively coupled to the processing unit, and configured to identify position coordinates of
the determined ROI and transmit the identified position coordinates to the processing unit.
[00022] In an aspect, the positioning unit comprises any or a combination of GPS,
GLONASS, and NAVIC.
[00023] In an aspect, the UAV based imaging system comprises a driving unit
operatively coupled to the processing unit and configured to control kinesiological
parameters of the UAV based on the identified position coordinates of the determined ROI.
[00024] In an aspect, the kinesiological parameters of the drone comprise any or a
combination of direction, velocity, acceleration, angle of rotation, pitch, roll, and yaw.
[00025] In an aspect, the processing unit may be configured to actuate at least one of
the one or more imaging units to obtain one or more high resolution images of the determined
ROI.
BRIEF DESCRIPTION OF DRAWINGS
[00026] The accompanying drawings are included to provide a further understanding
of the present disclosure, and are incorporated in and constitute a part of this specification.
The drawings illustrate exemplary embodiments of the present disclosure and, together with
the description, serve to explain the principles of the present disclosure. The diagrams are for
illustration only, which thus is not a limitation of the present disclosure.
[00027] FIGs. 1A and 1B illustrate block diagrams of the proposed unmanned aerial
vehicle (UAV) based imaging system to illustrate its overall working, in accordance with an
embodiment of the present disclosure.
[00028] FIG. 2 illustrates exemplary engines of a processing unit of the proposed
system, in accordance with an exemplary embodiment of the present disclosure.
[00029] FIGs. 3A to 3C illustrate diagrams associated with the overall working of the
UAV based imaging system 100, in accordance with an embodiment of the present
disclosure.
DETAILED DESCRIPTION
[00030] The following is a detailed description of embodiments of the disclosure
depicted in the accompanying drawings. The embodiments are in such detail as to clearly
communicate the disclosure. However, the amount of detail offered is not intended to limit
the anticipated variations of embodiments; on the contrary, the intention is to cover all
modifications, equivalents, and alternatives falling within the spirit and scope of the present
disclosure as defined by the appended claims.
[00031] In the following description, numerous specific details are set forth in order to
provide a thorough understanding of embodiments of the present invention. It will be
apparent to one skilled in the art that embodiments of the present invention may be practiced
without some of these specific details.
[00032] Embodiments of the present invention include various steps, which will be
described below. The steps may be performed by hardware components or may be embodied
in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps
may be performed by a combination of hardware, software, and firmware and/or by human
operators.
[00033] Various methods described herein may be practiced by combining one or more
machine-readable storage media containing the code according to the present invention with
appropriate standard computer hardware to execute the code contained therein. An apparatus
for practicing various embodiments of the present invention may involve one or more
computers (or one or more processors within a single computer) and storage systems
containing or having network access to computer program(s) coded in accordance with
various methods described herein, and the method steps of the invention could be
accomplished by modules, routines, subroutines, or subparts of a computer program product.
[00034] If the specification states a component or feature “may”, “can”, “could”, or
“might” be included or have a characteristic, that particular component or feature is not
required to be included or have the characteristic.
[00035] As used in the description herein and throughout the claims that follow, the
meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates
otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on”
unless the context clearly dictates otherwise.
[00036] Exemplary embodiments will now be described more fully hereinafter with
reference to the accompanying drawings, in which exemplary embodiments are shown. These
exemplary embodiments are provided only for illustrative purposes and so that this disclosure
will be thorough and complete and will fully convey the scope of the invention to those of
ordinary skill in the art. The invention disclosed may, however, be embodied in many
different forms and should not be construed as limited to the embodiments set forth herein.
Various modifications will be readily apparent to persons skilled in the art. The general
principles defined herein may be applied to other embodiments and applications without
departing from the spirit and scope of the invention. Moreover, all statements herein reciting
embodiments of the invention, as well as specific examples thereof, are intended to
encompass both structural and functional equivalents thereof. Additionally, it is intended that
such equivalents include both currently known equivalents as well as equivalents developed
in the future (i.e., any elements developed that perform the same function, regardless of
structure). Also, the terminology and phraseology used is for the purpose of describing
exemplary embodiments and should not be considered limiting. Thus, the present invention is
to be accorded the widest scope encompassing numerous alternatives, modifications and
equivalents consistent with the principles and features disclosed. For purpose of clarity,
details relating to technical material that is known in the technical fields related to the
invention have not been described in detail so as not to unnecessarily obscure the present
invention.
[00037] Thus, for example, it will be appreciated by those of ordinary skill in the art
that the diagrams, schematics, illustrations, and the like represent conceptual views or
processes illustrating systems and methods embodying this invention. The functions of the
various elements shown in the figures may be provided through the use of dedicated
hardware as well as hardware capable of executing associated software. Similarly, any
switches shown in the figures are conceptual only. Their function may be carried out through
the operation of program logic, through dedicated logic, through the interaction of program
control and dedicated logic, or even manually, the particular technique being selectable by
the entity implementing this invention. Those of ordinary skill in the art further understand
that the exemplary hardware, software, processes, methods, and/or operating systems
described herein are for illustrative purposes and, thus, are not intended to be limited to any
particular named element.
[00038] Embodiments of the present invention may be provided as a computer
program product, which may include a machine-readable storage medium tangibly
embodying thereon instructions, which may be used to program a computer (or other
electronic devices) to perform a process. The term “machine-readable storage medium” or
“computer-readable storage medium” includes, but is not limited to, fixed (hard) drives,
magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs),
and magneto-optical disks, semiconductor memories, such as ROMs, PROMs, random access
memories (RAMs), programmable read-only memories (PROMs), erasable PROMs
(EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical
cards, or other type of media/machine-readable medium suitable for storing electronic
instructions (e.g., computer programming code, such as software or firmware). A machine-readable medium may include a non-transitory medium in which data may be stored and that
does not include carrier waves and/or transitory electronic signals propagating wirelessly or
over wired connections. Examples of a non-transitory medium may include, but are not
limited to, a magnetic disk or tape, optical storage media such as compact disk (CD) or digital
versatile disk (DVD), flash memory, memory or memory devices. A computer-program
product may include code and/or machine-executable instructions that may represent a
procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software
package, a class, or any combination of instructions, data structures, or program statements.
A code segment may be coupled to another code segment or a hardware circuit by passing
and/or receiving information, data, arguments, parameters, or memory contents. Information,
arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable
means including memory sharing, message passing, token passing, network transmission, etc.
[00039] Furthermore, embodiments may be implemented by hardware, software,
firmware, middleware, microcode, hardware description languages, or any combination
thereof. When implemented in software, firmware, middleware or microcode, the program
code or code segments to perform the necessary tasks (e.g., a computer-program product)
may be stored in a machine-readable medium. A processor(s) may perform the necessary
tasks.
[00040] Systems depicted in some of the figures may be provided in various
configurations. In some embodiments, the systems may be configured as a distributed system
where one or more components of the system are distributed across one or more networks in
a cloud computing system.
[00041] Each of the appended claims defines a separate invention, which for
infringement purposes is recognized as including equivalents to the various elements or
limitations specified in the claims. Depending on the context, all references below to the
"invention" may in some cases refer to certain specific embodiments only. In other cases it
will be recognized that references to the "invention" will refer to subject matter recited in one
or more, but not necessarily all, of the claims.
[00042] All methods described herein may be performed in any suitable order unless
otherwise indicated herein or otherwise clearly contradicted by context. The use of any and
all examples, or exemplary language (e.g., “such as”) provided with respect to certain
embodiments herein is intended merely to better illuminate the invention and does not pose a
limitation on the scope of the invention otherwise claimed. No language in the specification
should be construed as indicating any non-claimed element essential to the practice of the
invention.
[00043] Various terms as used herein are shown below. To the extent a term used in a
claim is not defined below, it should be given the broadest definition persons in the pertinent
art have given that term as reflected in printed publications and issued patents at the time of
filing.
[00044] The present disclosure relates to the field of image processing. More
particularly, the present disclosure relates to an unmanned aerial vehicle (UAV) based
imaging system.
[00045] According to an aspect of the present disclosure, an unmanned aerial vehicle
(UAV) based imaging system includes: one or more imaging units operatively coupled to a
UAV, and configured to capture one or more images of a region; and a processing unit
operatively coupled to the one or more imaging units and the UAV, the processing unit
including one or more processors, and coupled with a memory, the memory storing
instructions executable by the one or more processors and configured to: pre-process the one
or more images captured by at least one of the imaging units, wherein the pre-processing
comprises detecting and extracting of peripheral attributes from the one or more captured
images; determine a Region of Interest (ROI) based on each of the pre-processed one or more
images; and delineate pixels associated with the determined ROI from pixels associated with
each of the pre-processed one or more images.
[00046] In an embodiment, the peripheral attributes include any or a combination of
colour, texture, albedo, and variation associated with the region.
[00047] In an embodiment, the processing unit can be configured to modify the
extracted peripheral attributes of each of the one or more pre-processed images.
[00048] In an embodiment, the processing unit can be configured to detect edges
associated with the region based on the modified peripheral attributes associated with each of
the pre-processed one or more images.
[00049] In an embodiment, the determination of the ROI can be associated with the
detected edges.
[00050] In an embodiment, the UAV based imaging system includes a positioning unit
operatively coupled to the processing unit, and configured to identify position coordinates of
the determined ROI and transmit the identified position coordinates to the processing unit.
[00051] In an embodiment, the positioning unit includes any or a combination of GPS,
GLONASS, and NAVIC.
[00052] In an embodiment, the UAV based imaging system includes a driving unit
operatively coupled to the processing unit and configured to control kinesiological
parameters of the UAV based on the identified position coordinates of the determined ROI.
[00053] In an embodiment, the kinesiological parameters of the drone include any or a
combination of direction, velocity, acceleration, angle of rotation, pitch, roll, and yaw.
[00054] In an embodiment, the processing unit can be configured to actuate at least one
of the one or more imaging units to obtain one or more high resolution images of the
determined ROI.
[00055] FIGs. 1A and 1B illustrate block diagrams of the proposed unmanned aerial
vehicle (UAV) based imaging system 100 to illustrate its overall working in accordance with
an embodiment of the present disclosure.
[00056] As illustrated in FIGs. 1A and 1B, in an embodiment, the proposed unmanned
aerial vehicle (UAV) based imaging system 100 (interchangeably referred to as UAV based
imaging system 100 and system 100, hereinafter) can include one or more imaging units 102-
1, 102-2… 102-N (also, collectively referred to as imaging units 102 and individually
referred to as imaging unit 102, hereinafter) that can be operatively coupled to unmanned
aerial vehicle 110 (also referred to UAV 110, herein). In an exemplary embodiment, the
imaging units 102 can be, but not limited to, any or a combination of camera, scanner, high
definition (HD) camera, and the like.
[00057] In an embodiment, the UAV based imaging system 100 can include a
processing unit 104 that can be operatively coupled to the imaging units 102 and the UAV
110. In an exemplary embodiment, the processing unit 104 can include one or more
processors coupled with a memory, the memory storing instructions executable by the one or
more processors.
[00058] In an embodiment, at least one of the imaging units 102, for example, a
tracking camera 102-1, can capture one or more images (also referred to as images,
hereinafter) of a region that can be transmitted to the processing unit 104. The processing unit 104 can
receive the captured images and can further pre-process the received images. In an exemplary
embodiment, the pre-processing can include detecting and extracting of peripheral attributes
from the images, such as, but not limited to colour, texture, albedo, variation associated with
the region, and the like.
[00059] In an embodiment, based on each of the pre-processed image, the processing
unit 104 can determine a Region of Interest (ROI) from the region. In an exemplary
embodiment, the processing unit 104 can delineate pixels associated with the determined ROI
from pixels associated with each of the pre-processed images.
[00060] In an embodiment, the processing unit can be configured to modify the
extracted peripheral attributes of each of the pre-processed images, such as, conversion of a
coloured image that has been captured by the tracking camera 102-1 to greyscale. In another
embodiment, the processing unit 104 can be configured to detect edges associated with the
region based on the modified peripheral attributes associated with each of the pre-processed
images, and further the ROI can be determined, where the determination of the ROI can be
associated with the detected edges.
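The greyscale conversion and edge detection steps described above might be sketched as follows; the BT.601 luminance weights and the Sobel operator are standard choices assumed here, since the disclosure does not mandate a particular conversion formula or edge detector.

```python
# Sketch of greyscale conversion followed by Sobel edge detection.
# The image is represented as rows of (r, g, b) tuples.

def to_greyscale(rgb_image):
    """Convert an RGB image to greyscale using ITU-R BT.601 luminance weights."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb_image]

def sobel_edges(grey, threshold):
    """Return (row, col) pixels whose Sobel gradient magnitude exceeds threshold."""
    gx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient kernel
    gy = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical-gradient kernel
    edges = []
    for i in range(1, len(grey) - 1):
        for j in range(1, len(grey[0]) - 1):
            sx = sum(gx[a][b] * grey[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            sy = sum(gy[a][b] * grey[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            if (sx * sx + sy * sy) ** 0.5 > threshold:
                edges.append((i, j))
    return edges
```

The ROI can then be taken from the pixels flagged as edges, consistent with the edge-based ROI determination described above.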
[00061] In an embodiment, the UAV based imaging system 100 can include a
positioning unit 106 that can be operatively coupled to the processing unit 104, where the
positioning unit can include any or a combination of GPS, GLONASS, NAVIC, and the like.
The positioning unit 106 can be configured to identify position coordinates of the determined
ROI and accordingly transmit the identified position coordinates to the processing unit 104.
[00062] In an embodiment, the UAV based imaging system 100 can include a driving
unit 108 that can be operatively coupled to the processing unit 104. In an embodiment, based
on the identified position coordinates of the determined ROI, the driving unit 108 can be
configured to control kinesiological parameters of the UAV 110, such as, but not limited to,
direction, velocity, acceleration, angle of rotation, pitch, roll, yaw, and the like.
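As an illustration of one kinesiological parameter the driving unit 108 might derive, the sketch below computes the initial great-circle bearing from the UAV's current position to the ROI coordinates; the function and its use are assumptions of ours, not a method fixed by the disclosure.

```python
import math

def bearing_to_roi(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing, in degrees clockwise from north,
    from (lat1, lon1) to (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0
```

For example, a bearing of 90 degrees would correspond to a due-east heading command towards the ROI.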
[00063] In an embodiment, the processing unit 104 can be configured to actuate an HD
camera 102-2 to obtain one or more high resolution images of the determined ROI.
[00064] In an embodiment, the UAV based imaging system 100 can further include an
altimeter 112 operatively coupled to the UAV 110, and configured to measure altitude of the
UAV 110 over a fixed level, for example, sea level.
[00065] In an embodiment, the UAV based imaging system 100 can further include a
magnetometer 114 operatively coupled to the UAV 110, and configured to measure
magnetism, direction, strength, relative change of magnetic field of a particular region, and
the like.
[00066] In an embodiment, the altimeter 112 and the magnetometer 114 can also aid in
determining geo-coordinates of the said region, distance of the UAV 110 from the said
region, direction of the UAV 110 with respect to the said region, and the like.
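One way the altimeter and magnetometer readings could combine to locate the viewed region is sketched below, under the simplifying assumptions of flat terrain and a known camera depression angle, neither of which the disclosure specifies.

```python
import math

def ground_offset(altitude_m, depression_deg, heading_deg):
    """Return the (north_m, east_m) offset of the viewed point from the
    UAV's nadir, given altimeter altitude, camera depression angle below
    the horizon, and magnetometer heading (degrees clockwise from north)."""
    # Horizontal ground distance from simple trigonometry (flat terrain assumed).
    dist = altitude_m / math.tan(math.radians(depression_deg))
    h = math.radians(heading_deg)
    return dist * math.cos(h), dist * math.sin(h)
```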
[00067] In an embodiment, the UAV based imaging system 100 can further include a
database 116 where all the data regarding said region, such as, but not limited to,
corresponding images, altitude, geo-coordinates, and the like, can be stored.
[00068] In an embodiment, the UAV based imaging system 100 can include a
transmitter and receiver 118 (interchangeably referred to as transceiver 118, hereinafter) and
a base station controller 120, where the transceiver 118 can facilitate in establishment of a
communication channel between the UAV 110 and the base station controller 120, thereby
enabling data exchange between the UAV 110 and the base station controller 120.
[00069] In an exemplary embodiment, said system 100 can enable monitoring of
disaster-prone areas and forecasting of disasters, such as, but not limited to, landslide, flood,
drought, and the like.
[00070] In another exemplary embodiment, said system 100 can aid in acquiring
useful information in a short time period by acquiring only a small number of images
associated with the determined ROI itself, whereby less memory, processing, and storage
space is required.
[00071] FIG. 2 illustrates exemplary engines of a processing unit of the proposed
system, in accordance with an exemplary embodiment of the present disclosure.
[00072] In an aspect, the processing unit 104 can include one or more processor(s) 202.
The one or more processor(s) 202 may be implemented as one or more microprocessors,
microcomputers, microcontrollers, digital signal processors, central processing units, logic
circuitries, and/or any devices that manipulate data based on operational instructions. Among
other capabilities, the one or more processor(s) 202 are configured to fetch and execute
computer-readable instructions stored in a memory 204 of the processing unit 104. The
memory 204 may store one or more computer-readable instructions or routines, which may
be fetched and executed to create or share the data units over a network service. The memory
204 may comprise any non-transitory storage device including, for example, volatile memory
such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[00073] The processing unit 104 may also comprise an interface(s) 206. The
interface(s) 206 may comprise a variety of interfaces, for example, interfaces for data input
and output devices, referred to as I/O devices, storage devices, and the like. The interface(s)
206 may facilitate communication of the processing unit 104 with various devices coupled to
the processing unit 104. The interface(s) 206 may also provide a communication pathway for
one or more components of the processing unit 104. Examples of such components include,
but are not limited to, processing engine(s) 208 and database 210.
[00074] The processing engine(s) 208 may be implemented as a combination of
hardware and programming (for example, programmable instructions) to implement one or
more functionalities of the processing engine(s) 208. In examples described herein, such
combinations of hardware and programming may be implemented in several different ways.
For example, the programming for the processing engine(s) 208 may be processor executable
instructions stored on a non-transitory machine-readable storage medium and the hardware
for the processing engine(s) 208 may comprise a processing resource (for example, one or
more processors), to execute such instructions. In the present examples, the machine-readable
storage medium may store instructions that, when executed by the processing resource,
implement the processing engine(s) 208. In such examples, the processing unit 104 can
include the machine-readable storage medium storing the instructions and the processing
resource to execute the instructions, or the machine-readable storage medium may be
separate but accessible to the processing unit 104 and other processing resources. In other
examples, the processing engine(s) 208 may be implemented by electronic circuitry.
[00075] The database 210 may comprise data that is either stored or generated as a
result of functionalities implemented by any of the components of the processing engine(s)
208.
[00076] In an exemplary embodiment, the processing engine(s) 208 can include a pre-processing unit 212, a ROI determining unit 214, a control unit 216, and other unit(s) 220.
[00077] It would be appreciated that the modules described are only exemplary
engines, and any other engine or sub-engine can be included as part of the processing unit
104 or the processing engine(s) 208. These engines too may be merged or divided into super-engines or sub-engines as may be configured.
[00078] In an embodiment, the pre-processing unit 212 associated with the processing
unit 104 can facilitate pre-processing of images of a region captured through tracking camera
102-1, where the images can be pre-processed by applying a sub-pixel classifier technique, a
per-pixel classifier technique, and the like. In an exemplary embodiment, the pre-processing can
include detecting and extracting peripheral attributes from the images, such as, but not
limited to, colour, texture, albedo, and variation associated with the region. In an
embodiment, the captured images, which can be of RGB format, can be converted to
greyscale.
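The pre-processing described above can be sketched as follows. This is a minimal, hypothetical illustration: the specification does not fix the exact formulas for the peripheral attributes, so the luma weights (ITU-R BT.601) and the proxies chosen for albedo, texture, and variation are assumptions.

```python
import numpy as np

def preprocess(image_rgb):
    """Convert an RGB image (H x W x 3, uint8) to greyscale and extract
    simple peripheral attributes. Sketch only; attribute formulas are
    illustrative assumptions, not the patented classifier."""
    img = image_rgb.astype(np.float64)
    # ITU-R BT.601 luma weights for RGB -> greyscale conversion
    grey = img[..., 0] * 0.299 + img[..., 1] * 0.587 + img[..., 2] * 0.114
    attributes = {
        "colour": img.mean(axis=(0, 1)),                   # mean R, G, B
        "albedo": grey.mean() / 255.0,                     # crude reflectance proxy
        "variation": grey.std(),                           # spatial variability
        "texture": np.abs(np.diff(grey, axis=1)).mean(),   # gradient-based proxy
    }
    return grey, attributes
```

A uniform white frame, for example, yields a greyscale value of 255 everywhere, an albedo proxy of 1.0, and zero variation.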
[00079] In an embodiment, the ROI determining unit 214 associated with the
processing unit 104 can aid in determining a ROI by using edge detecting techniques. In an
embodiment, edges associated with the region can be identified and accordingly ROI can be
determined. In an exemplary embodiment, pixels corresponding to the identified edges can be
segmented from pixels of each of the images. In another exemplary embodiment, position
coordinates of the determined ROI can further be identified via positioning unit 106, altimeter
112, magnetometer 114, and the like.
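One way to realise the edge-based ROI determination above is to compute gradient magnitudes with a Sobel operator and take the bounding box of the strong-edge pixels. The operator choice and the threshold are assumptions; the specification only states that edge detecting techniques are used and that edge pixels are segmented.

```python
import numpy as np

def detect_roi(grey, threshold=50.0):
    """Return the ROI as a (top, left, bottom, right) bounding box of
    strong Sobel edges in a greyscale image, or None if no edge exceeds
    the (assumed) threshold."""
    g = grey.astype(np.float64)
    # 3x3 Sobel kernels for horizontal (kx) and vertical (ky) gradients
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T
    h, w = g.shape
    mag = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = g[i - 1:i + 2, j - 1:j + 2]
            mag[i, j] = np.hypot((win * kx).sum(), (win * ky).sum())
    ys, xs = np.nonzero(mag > threshold)
    if ys.size == 0:
        return None  # no edges strong enough -> no ROI in this frame
    return ys.min(), xs.min(), ys.max(), xs.max()
```

On a frame split into a dark left half and a bright right half, the returned box hugs the vertical boundary between the two halves.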
[00080] In an embodiment, the control unit 216 associated with the processing unit 104
can facilitate control of the driving unit 108 of the UAV 110. In an exemplary
embodiment, based on the identified position coordinates of the determined ROI, the control
unit 216 can facilitate control of kinesiological parameters of the UAV 110 by
controlling the driving unit 108, where the kinesiological parameters can include, but are not
limited to, direction, velocity, acceleration, angle of rotation, pitch, roll, yaw, and the like.
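A simple form of this control step is turning the ROI's geo coordinates into a bearing and distance for the driving unit. The sketch below uses an equirectangular approximation; the actual driving-unit interface, the command fields, and the fixed speed are hypothetical.

```python
import math

def steering_command(uav_lat, uav_lon, roi_lat, roi_lon, speed_mps=5.0):
    """Derive a yaw (bearing, 0 deg = north) and velocity command steering
    the UAV toward the ROI coordinates. Equirectangular approximation,
    adequate for short hops; command schema is an assumption."""
    r_earth = 6371000.0  # mean Earth radius, metres
    north = math.radians(roi_lat - uav_lat) * r_earth
    east = (math.radians(roi_lon - uav_lon)
            * math.cos(math.radians(uav_lat)) * r_earth)
    bearing = math.degrees(math.atan2(east, north)) % 360.0
    distance = math.hypot(north, east)
    return {"yaw_deg": bearing, "distance_m": distance, "velocity_mps": speed_mps}
```

A target 0.001 degrees of longitude due east of the UAV at the equator, for instance, produces a yaw of 90 degrees and a distance of roughly 111 metres.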
[00081] FIGs. 3A-3C illustrate diagrams associated with working of the UAV based
imaging system 100, to illustrate its overall working in accordance with an embodiment of
the present disclosure.
[00082] As illustrated in FIG. 3A, in an embodiment, in step 302, images captured by
the tracking camera 102-1 can be inputted into the UAV based imaging system
100 for pre-processing.
[00083] In an embodiment, in step 304, the images can be converted from colored
form, such as, RGB format, to greyscale.
[00084] In an embodiment, in step 306, edge detector operators can be applied to the
greyscale format of the images to identify edges in the images, and accordingly an ROI can
be determined.
[00085] In an embodiment, in step 308, the pre-processed images associated with the
determined ROI can be obtained as output from the processing unit 104.
[00086] In an embodiment, in step 310, position coordinates (also referred to as geo
coordinates, herein) can be extracted from the pre-processed images via positioning unit 106,
altimeter 112, and magnetometer 114.
[00087] In an embodiment, in step 312, the extracted geo coordinates can be assigned
to the driving unit 108 (also referred to as UAV driving module, hereinafter), and
correspondingly the UAV driving module 108 can aid in driving the UAV 110 towards the
determined ROI, and thereby the UAV 110 can take HD images of the determined ROI.
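Steps 302 through 312 above can be threaded together as one pipeline. The function below is a hypothetical orchestration sketch: the five callables stand in for the pre-processing unit 212, the ROI determining unit 214, the positioning chain (106, 112, 114), the driving unit 108, and the main camera 102-2, whose real interfaces the specification does not define.

```python
def imaging_pipeline(frame, preprocess, find_roi, to_geo, drive, capture_hd):
    """Run steps 302-312 on one tracking-camera frame. All five callables
    are assumed stand-ins for the units described in the disclosure."""
    grey = preprocess(frame)     # steps 302-304: input frame, RGB -> greyscale
    roi = find_roi(grey)         # step 306: edge-based ROI determination
    if roi is None:
        return None              # no ROI detected in this frame
    geo = to_geo(roi)            # step 310: ROI pixels -> geo coordinates
    drive(geo)                   # step 312: hand coordinates to the driver
    return capture_hd(geo)       # HD imaging of the ROI via the main camera
```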
[00088] FIGs. 3B and 3C represent a captured image and a modified image, respectively.
In an embodiment, FIG. 3B represents a classified map where “R” and “B” can represent pure
pixels and “RB” can represent mixed pixels. In an embodiment, FIG. 3C represents a processed
(edge detected) image with geographic coordinates, where “LL” can represent any or a
combination of latitudes and longitudes associated with a specific pixel and “black” colour can
represent an actual zone where images are required to be obtained through the main camera
102-2.
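The pure/mixed split of FIG. 3B can be illustrated with a fraction-based rule: a pixel whose class fraction for one class dominates is labelled pure, otherwise mixed. This is a hypothetical sketch of the sub-pixel idea only; the purity threshold and the two-class (r, b) inputs are assumptions, not the disclosed classifier.

```python
def classify_pixel(r, b, purity=0.9):
    """Label a pixel 'R' or 'B' when one class fraction dominates (pure
    pixel), else 'RB' (mixed pixel). Threshold `purity` is an assumption."""
    total = r + b
    if total == 0:
        return "RB"  # no signal: treat as mixed/unknown
    frac_r = r / total
    if frac_r >= purity:
        return "R"
    if frac_r <= 1.0 - purity:
        return "B"
    return "RB"
```

For example, a pixel with class responses (95, 5) is labelled "R", (5, 95) is "B", and (50, 50) is the mixed label "RB".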
[00089] Thus, it will be appreciated by those of ordinary skill in the art that the
diagrams, schematics, illustrations, and the like represent conceptual views or processes
illustrating systems and methods embodying this invention. The functions of the various
elements shown in the figures may be provided through the use of dedicated hardware as well
as hardware capable of executing associated software. Similarly, any switches shown in the
figures are conceptual only. Their function may be carried out through the operation of
program logic, through dedicated logic, through the interaction of program control and
dedicated logic, or even manually, the particular technique being selectable by the entity
implementing this invention. Those of ordinary skill in the art further understand that the
exemplary hardware, software, processes, methods, and/or operating systems described
herein are for illustrative purposes and, thus, are not intended to be limited to any particular
named.
[00090] While embodiments of the present invention have been illustrated and
described, it will be clear that the invention is not limited to these embodiments only.
Numerous modifications, changes, variations, substitutions, and equivalents will be apparent
to those skilled in the art, without departing from the spirit and scope of the invention, as
described in the claims.
[00091] In the foregoing description, numerous details are set forth. It will be apparent,
however, to one of ordinary skill in the art having the benefit of this disclosure, that the
present invention may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, to avoid
obscuring the present invention.
[00092] As used herein, and unless the context dictates otherwise, the term "coupled
to" is intended to include both direct coupling (in which two elements that are coupled to
each other contact each other) and indirect coupling (in which at least one additional element
is located between the two elements). Therefore, the terms "coupled to" and "coupled with"
are used synonymously. Within the context of this document terms "coupled to" and "coupled
with" are also used euphemistically to mean “communicatively coupled with” over a
network, where two or more devices are able to exchange data with each other over the
network, possibly via one or more intermediary devices.
[00093] It should be apparent to those skilled in the art that many more modifications
besides those already described are possible without departing from the inventive concepts
herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of
the appended claims. Moreover, in interpreting both the specification and the claims, all
terms should be interpreted in the broadest possible manner consistent with the context. In
particular, the terms “comprises” and “comprising” should be interpreted as referring to
elements, components, or steps in a non-exclusive manner, indicating that the referenced
elements, components, or steps may be present, or utilized, or combined with other elements,
components, or steps that are not expressly referenced. Where the specification or claims refer
to at least one of something selected from the group consisting of A, B, C ….N, the text
should be interpreted as requiring only one element from the group, not A plus N, or B plus
N, etc.
[00094] While the foregoing describes various embodiments of the invention, other
and further embodiments of the invention may be devised without departing from the basic
scope thereof. The scope of the invention is determined by the claims that follow. The
invention is not limited to the described embodiments, versions or examples, which are
included to enable a person having ordinary skill in the art to make and use the invention
when combined with information and knowledge available to the person having ordinary skill
in the art.
ADVANTAGES OF THE INVENTION
[00095] The present disclosure provides an UAV based imaging system for sub-pixel
based classification of images.
[00096] The present disclosure provides an UAV based imaging system for per-pixel
based classification of an image.
[00097] The present disclosure provides an UAV based imaging system for
determining a region of interest based on edge detection.
[00098] The present disclosure provides an UAV based imaging system for facilitating
HD imaging of the determined region of interest.
[00099] The present disclosure provides a smart, efficient, and cost effective UAV
based imaging system.

We Claim:

1. An unmanned aerial vehicle (UAV) based imaging system comprising:
one or more imaging units operatively coupled to a UAV, and configured to
capture one or more images of a region; and
a processing unit operatively coupled to the one or more imaging units and the
UAV, the processing unit comprising one or more processors, and coupled with a
memory, the memory storing instructions executable by the one or more processors
and configured to:
pre-process the one or more images captured by at least one of the
imaging units, wherein the pre-processing comprises detecting and extracting
of peripheral attributes from the one or more captured images;
determine a Region of Interest (ROI) based on each of the pre-processed one or more images; and
delineate pixels associated with the determined ROI from pixels
associated with each of the pre-processed one or more images.
2. The unmanned aerial vehicle based imaging system as claimed in claim 1, wherein the
peripheral attributes comprise any or a combination of colour, texture, albedo, and
variation associated with the region.
3. The unmanned aerial vehicle based imaging system as claimed in claim 1, wherein the
processing unit is configured to modify the extracted peripheral attributes of each of
the one or more pre-processed images.
4. The unmanned aerial vehicle based imaging system as claimed in claim 3, wherein the
processing unit is configured to detect edges associated with the region based on the
modified peripheral attributes associated with each of the pre-processed one or more
images.
5. The unmanned aerial vehicle based imaging system as claimed in claim 4, wherein the
determination of the ROI is associated with the detected edges.
6. The unmanned aerial vehicle based imaging system as claimed in claim 1, wherein the
UAV based imaging system comprises a positioning unit operatively coupled to the
processing unit, and configured to identify position coordinates of the determined ROI
and transmit the identified position coordinates to the processing unit.
7. The unmanned aerial vehicle based imaging system as claimed in claim 6, wherein the
positioning unit comprises any or a combination of GPS, GLONASS, and NAVIC.
8. The unmanned aerial vehicle based imaging system as claimed in claim 1, wherein the
UAV based imaging system comprises a driving unit operatively coupled to the
processing unit and configured to control kinesiological parameters of the UAV based
on the identified position coordinates of the determined ROI.
9. The unmanned aerial vehicle based imaging system as claimed in claim 8, wherein the
kinesiological parameters of the UAV comprise any or a combination of direction,
velocity, acceleration, angle of rotation, pitch, roll, and yaw.
10. The unmanned aerial vehicle based imaging system as claimed in claim 1, wherein the
processing unit is configured to actuate at least one of the one or more imaging units
to obtain one or more high resolution images of the determined ROI.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
# Name Date
1 202011025948-IntimationOfGrant15-03-2024.pdf 2024-03-15
2 202011025948-PatentCertificate15-03-2024.pdf 2024-03-15
3 202011025948-Annexure [09-02-2024(online)].pdf 2024-02-09
4 202011025948-Written submissions and relevant documents [09-02-2024(online)].pdf 2024-02-09
5 202011025948-Correspondence to notify the Controller [24-01-2024(online)].pdf 2024-01-24
6 202011025948-FORM-26 [24-01-2024(online)].pdf 2024-01-24
7 202011025948-US(14)-HearingNotice-(HearingDate-25-01-2024).pdf 2023-12-27
8 202011025948-CLAIMS [02-12-2022(online)].pdf 2022-12-02
9 202011025948-CORRESPONDENCE [02-12-2022(online)].pdf 2022-12-02
10 202011025948-DRAWING [02-12-2022(online)].pdf 2022-12-02
11 202011025948-FER_SER_REPLY [02-12-2022(online)].pdf 2022-12-02
12 202011025948-FER.pdf 2022-06-03
13 202011025948-FORM 18 [07-02-2022(online)].pdf 2022-02-07
14 202011025948-Proof of Right [23-09-2020(online)].pdf 2020-09-23
15 202011025948-FORM-26 [01-09-2020(online)].pdf 2020-09-01
16 202011025948-STATEMENT OF UNDERTAKING (FORM 3) [19-06-2020(online)].pdf 2020-06-19
17 202011025948-FORM FOR STARTUP [19-06-2020(online)].pdf 2020-06-19
18 202011025948-FORM FOR SMALL ENTITY(FORM-28) [19-06-2020(online)].pdf 2020-06-19
19 202011025948-FORM 1 [19-06-2020(online)].pdf 2020-06-19
20 202011025948-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [19-06-2020(online)].pdf 2020-06-19
21 202011025948-EVIDENCE FOR REGISTRATION UNDER SSI [19-06-2020(online)].pdf 2020-06-19
22 202011025948-DRAWINGS [19-06-2020(online)].pdf 2020-06-19
23 202011025948-DECLARATION OF INVENTORSHIP (FORM 5) [19-06-2020(online)].pdf 2020-06-19
24 202011025948-COMPLETE SPECIFICATION [19-06-2020(online)].pdf 2020-06-19

Search Strategy

1 202011025948aAE_05-12-2022.pdf
2 202011025948E_03-06-2022.pdf

ERegister / Renewals