Abstract: A system and a method for landing of an Unmanned Aerial Vehicle (UAV) are provided. The method includes estimating a 3-dimensional (3D) location of at least one media sensor mounted on the UAV relative to a marker representative of a landing location of the UAV. The marker comprises a recursive geometrical pattern. The landing of the UAV on the marker at the landing location is facilitated based on the 3D location of the at least one media sensor mounted on the UAV relative to the marker.
CLAIMS
WE CLAIM:
1. A computer-implemented method for landing of an Unmanned Aerial Vehicle (UAV), the method comprising:
estimating a 3-dimensional (3D) location of at least one media sensor mounted on the UAV relative to a marker representative of a landing location of the UAV, the marker comprising a recursive geometrical pattern; and
facilitating landing of the UAV on the marker at the landing location based on the 3D location of the at least one media sensor mounted on the UAV relative to the marker.
2. The method as claimed in claim 1, wherein the recursive geometrical pattern comprises a recursive fractal pattern.
3. The method as claimed in claim 2, wherein the recursive fractal pattern comprises a Sierpinski fractal pattern.
4. The method as claimed in claim 3, wherein the recursive fractal pattern comprises a plurality of polygons, the plurality of polygons being self-similar, wherein size of each polygon of the plurality of polygons is from among a set of preconfigured sizes.
5. The method as claimed in claim 4, further comprising performing a scanning, by the at least one media sensor mounted on the UAV, to detect at least one portion of the marker, wherein detecting the at least one portion of the marker comprises capturing an image of the at least one portion of the marker containing at least one complete polygon.
6. The method as claimed in claim 5, wherein the at least one portion of the marker comprises a complete marker pattern, and the image of the at least one portion of the marker comprises an image of the complete marker pattern, and wherein estimating the 3D location of the at least one media sensor relative to the marker comprises:
determining a centroid of the complete marker pattern based on the size of the plurality of polygons in the complete image;
determining a 3D orientation of the at least one media sensor relative to the centroid of the complete marker pattern; and
estimating an altitude of the UAV based on the 3D orientation of the at least one media sensor.
7. The method as claimed in claim 5, further comprising assigning a distinct color code to each polygon of the plurality of polygons, the color code being indicative of an order and relative orientation of the polygon with respect to a centroid of the marker.
8. The method as claimed in claim 7, wherein the at least one portion of the marker comprises a partial marker pattern, and the image of the at least one portion of the marker comprises a partial image of the marker having a set of polygons, and wherein estimating the 3D location of the at least one media sensor relative to the marker comprises:
extrapolating the partial image of the marker to generate the complete marker image based on the distinct color code assigned to each polygon within the set of polygons;
determining a centroid of the complete marker image based on the size of the plurality of polygons in the complete image;
determining a 3D orientation of the at least one media sensor relative to the centroid of the complete marker pattern; and
estimating an altitude of the UAV based on the 3D orientation of the at least one media sensor.
9. The method as claimed in claim 1, wherein the landing location comprises a static surface.
10. The method as claimed in claim 1, wherein the marker is located on a ground vehicle (GV), and the GV is in one of a static condition and a mobile condition.
11. A computer implemented system for landing of an Unmanned Aerial Vehicle (UAV), the system comprising:
at least one media sensor;
at least one memory; and
at least one processor, the at least one memory coupled to the at least one processor, wherein the at least one processor is capable of executing programmed instructions stored in the at least one memory to:
estimate a 3-dimensional (3D) location of the at least one media sensor mounted on the UAV relative to a marker representative of a landing location of the UAV, the marker comprising a recursive geometrical pattern; and
facilitate landing of the UAV on the marker at the landing location based on the 3D location of the at least one media sensor mounted on the UAV relative to the marker.
12. The system as claimed in claim 11, wherein the recursive geometrical pattern comprises a recursive fractal pattern.
13. The system as claimed in claim 12, wherein the recursive fractal pattern comprises a Sierpinski fractal pattern.
14. The system as claimed in claim 13, wherein the Sierpinski fractal pattern comprises a plurality of polygons, the plurality of polygons being self-similar, wherein size of each polygon of the plurality of polygons is from among a set of preconfigured sizes.
15. The system as claimed in claim 14, wherein the at least one processor is further configured by the instructions to perform a scanning, by the at least one media sensor mounted on the UAV, to detect at least one portion of the marker, wherein for detecting the at least one portion of the marker, the at least one processor is further configured by the instructions to capture an image of the at least one portion of the marker containing at least one complete polygon.
16. The system as claimed in claim 15, wherein the at least one portion of the marker comprises a complete marker pattern, and the image of the at least one portion of the marker comprises an image of the complete marker pattern, and wherein for estimating the 3D location of the at least one media sensor relative to the marker, the at least one processor is further configured by the instructions to:
determine a centroid of the complete marker pattern based on the size of the plurality of polygons in the complete image;
determine a 3D orientation of the at least one media sensor relative to the centroid of the complete marker pattern; and
estimate an altitude of the UAV based on the 3D orientation of the at least one media sensor.
17. The system as claimed in claim 15, wherein the at least one processor is further configured by the instructions to assign a distinct color code to each polygon of the plurality of polygons, the color code being indicative of an order and relative orientation of the polygon with respect to a centroid of the marker.
18. The system as claimed in claim 17, wherein the at least one portion of the marker comprises a partial marker pattern, and the image of the at least one portion of the marker comprises a partial image of the marker having a set of polygons, and wherein for estimating the 3D location of the at least one media sensor relative to the marker, the at least one processor is further configured by the instructions to:
extrapolate the partial image of the marker to generate the complete marker image based on the distinct color code assigned to each polygon within the set of polygons;
determine a centroid of the complete marker image based on the size of the plurality of polygons in the complete image;
determine a 3D orientation of the at least one media sensor relative to the centroid of the complete marker pattern; and
estimate an altitude of the UAV based on the 3D orientation of the at least one media sensor.
19. The system as claimed in claim 11, wherein the marker is located on a ground vehicle (GV), and the GV is in one of a static condition and a mobile condition.
20. A non-transitory computer-readable medium having embodied thereon a computer program for executing a method for landing of an Unmanned Aerial Vehicle (UAV), the method comprising:
estimating a 3-dimensional (3D) location of at least one media sensor mounted on the UAV relative to a marker representative of a landing location of the UAV, the marker comprising a recursive geometrical pattern; and
facilitating landing of the UAV on the marker at the landing location based on the 3D location of the at least one media sensor mounted on the UAV relative to the marker.
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
Title of Invention:
METHODS AND SYSTEMS FOR LANDING OF UNMANNED AERIAL VEHICLE
Applicant:
Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
[001] The embodiments herein generally relate to landing of unmanned aerial vehicle (UAV), and, more particularly, to vision-based landing of UAV based on a recursive marker.
BACKGROUND
[002] In general, a UAV is an aircraft which flies without a human pilot on-board. A UAV recognizes its path of flight based on programmed instructions provided to the UAV by a remote control station or by UAV-embedded controllers. With advancements in aviation technology, the utilization of UAVs, also known as “drones”, for various urban and rural civilian applications is increasing. For example, UAVs are being utilized for urban civilian applications such as surveillance, fire brigades, disaster control, and emergency response crews, while remote rural civilian applications include periodic monitoring of long linear infrastructures carrying critical utilities, such as power lines, oil/gas pipelines, and so on. Other advanced applications of UAVs include weather observation, topographical surveys, and various military applications.
[003] In order to assist a UAV in landing, various techniques have been implemented. For example, an external controller may be employed to assist the UAV to land at a particular point or location. In certain scenarios, such a point/location is identified by implementing a marker-based technique, wherein a marker such as an “H” shaped marker or a marker having concentric rings may be placed at the landing location. The UAV includes a camera that is configured to identify the marker and land thereon. However, the existing techniques for marker-based landing of a UAV have certain limitations. For instance, for the camera to accurately detect the marker, the marker should be fully within the camera field of view (FOV) so that it can be fully imaged and identified. Moreover, when the UAV is on a descent path, the image of the marker as captured by the camera keeps changing, thereby rendering the detection of the marker inefficient.
SUMMARY
[004] The following presents a simplified summary of some embodiments of the disclosure in order to provide a basic understanding of the embodiments. This summary is not an extensive overview of the embodiments. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the embodiments. Its sole purpose is to present some embodiments in a simplified form as a prelude to the more detailed description that is presented below.
[005] In view of the foregoing, an embodiment herein provides methods and systems for landing of an Unmanned Aerial Vehicle (UAV). In one aspect, a system for landing of a UAV is provided, where the system includes at least one media sensor; at least one memory; and at least one processor, the at least one memory coupled to the at least one processor, wherein the at least one processor is capable of executing programmed instructions stored in the at least one memory to: estimate a 3D location of at least one media sensor mounted on the UAV relative to a marker representative of a landing location of the UAV, the marker comprising a recursive geometrical pattern; and facilitate landing of the UAV on the marker based on the 3D location of the at least one media sensor mounted on the UAV relative to the marker.
[006] In another aspect, a method for landing of a UAV is provided. The method includes estimating a 3D location of at least one media sensor mounted on the UAV relative to a marker representative of a landing location of the UAV, the marker comprising a recursive geometrical pattern; and facilitating landing of the UAV on the marker based on the 3D location of the at least one media sensor mounted on the UAV relative to the marker.
[007] In yet another aspect, a non-transitory computer-readable medium having embodied thereon a computer program for executing a method for landing of a UAV is provided. The method includes estimating a 3D location of at least one media sensor mounted on the UAV relative to a marker representative of a landing location of the UAV, the marker comprising a recursive geometrical pattern; and facilitating landing of the UAV on the marker at the landing location based on the 3D location of the at least one media sensor mounted on the UAV relative to the marker.
BRIEF DESCRIPTION OF THE FIGURES
[008] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.
[009] FIG. 1 illustrates an example representation of landing of a UAV at a landing location, in accordance with an embodiment of the present disclosure;
[0010] FIG. 2 illustrates a block diagram of a system for facilitating landing of a UAV, in accordance with an embodiment of the present disclosure;
[0011] FIG. 3 illustrates an example marker for facilitating landing of a UAV, in accordance with an embodiment of the present disclosure;
[0012] FIG. 4 illustrates an example representation of a marker for facilitating landing of a UAV, in accordance with an example embodiment;
[0013] FIG. 5 illustrates an example representation of estimation of an altitude of a UAV, in accordance with an example embodiment; and
[0014] FIG. 6 illustrates a flow diagram of a method for landing of a UAV, in accordance with the present disclosure.
[0015] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems and devices embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0016] The present disclosure relates to systems and methods for landing of an Unmanned Aerial Vehicle (UAV). Herein, the term ‘UAV’ may refer to an aircraft which flies without a human pilot on-board, also known as a ‘drone’, that may be utilized for applications such as surveillance, disaster control, and monitoring of infrastructure. Various embodiments herein disclose systems and methods for vision-based landing of the UAV using a marker comprising a recursive geometrical pattern.
[0017] Unless specifically stated otherwise as apparent from the following discussions, it is to be appreciated that throughout the present disclosure, discussions utilizing terms such as “determining” or “generating” or “comparing” or the like, refer to the action and processes of a computer system, or similar electronic activity detection device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0018] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0019] The methods and systems are not limited to the specific embodiments described herein. In addition, the method and system can be practiced independently and separately from other modules and methods described herein. Each device element/module and method can be used in combination with other elements/modules and other methods.
[0020] Throughout the description and claims of this complete specification, the word “comprise” and variations of the word, such as “comprising” and “comprises,” means “including but not limited to,” and is not intended to exclude, for example, other additives, components, integers or steps. “Exemplary” means “an example of” and is not intended to convey an indication of a preferred or ideal embodiment. “Such as” is not used in a restrictive sense, but for explanatory purposes.
[0021] For a firmware and/or software implementation, the methodologies can be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine readable medium tangibly embodying instructions can be used in implementing the methodologies described herein. For example, software codes and programs can be stored in a memory and executed by a processing unit.
[0022] In another firmware and/or software implementation, the functions may be stored as one or more instructions or code on a non-transitory computer-readable medium. Examples include computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. The computer-readable media may take the form of an article of manufacture. The computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer; disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
[0023] It should be noted that the description merely illustrates the principles of the present subject matter. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present subject matter and are included within its spirit and scope. Furthermore, all examples recited herein are principally intended expressly to be only for pedagogical purposes to aid the reader in understanding the principles of the invention and the concepts contributed by the inventor(s) to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0024] The manner in which the systems and methods for landing of the UAV shall be implemented has been explained in detail with respect to FIGS. 1 through 6. While aspects of the described methods and systems for landing of the UAV can be implemented in any number of different systems, utility environments, and/or configurations, the embodiments are described in the context of the following exemplary system(s).
[0025] FIG. 1 illustrates an example representation 100 of landing of a UAV 102, in accordance with an embodiment of the present disclosure. The representation of landing of the UAV 102 is shown to include the UAV 102 that is capable of detecting a marker, for example a marker 104. In an embodiment, the UAV detects the marker by utilizing vision-based techniques. In an embodiment, the UAV includes at least one media sensor (or a camera embodying at least one media sensor) for detecting the marker. For example, as illustrated in FIG. 1, the UAV 102 is shown to include at least one media sensor 106. In an embodiment, the at least one media sensor 106 (or the camera) is mounted completely downward-looking, so as to effectively image the marker.
[0026] The marker herein is representative of a landing location of the UAV 102. In an embodiment, the landing location of the UAV may include a static surface such as a plane ground surface on which the marker may be configured. For example, a patch of land may be cleared and the marker may be printed on the cleared patch. In another embodiment, the landing location of the UAV may be on a vehicle. For instance, the marker may be configured on a platform located on a vehicle. Such a vehicle that holds a marker indicative of the landing point of the UAV is known as a ground vehicle (GV). An example GV, for example, a GV 108 holding a marker 104, is illustrated in FIG. 1. The GV may be in a moving condition or a static condition. Such a system having the GV (comprising the marker) along with the UAV is referred to as a hybrid platform. The hybrid platform is useful in scenarios where, for example, the ground vehicle can carry many accessories which may be required for various types of recoveries, while the UAV may be limited in terms of the amount of payload it can carry.
[0027] In an embodiment, the marker may include a pattern that can be detected by the UAV. In an embodiment, the marker includes a recursive geometrical pattern that is self-similar. Herein, the term ‘self-similar pattern’ may refer to a pattern that is similar to itself at any scale, as well as to a part of itself. The similarity of the pattern to itself at any scale is in the form of geometrical properties being exactly or approximately the same in terms of congruence. In an embodiment, the recursive geometrical pattern includes a recursive fractal pattern. A fractal is a mathematical set that exhibits a repeating pattern in a self-similar manner, displayed at every scale. Different types of fractals may either be exactly or nearly the same at different levels. An example of the recursive fractal pattern is a Sierpinski fractal pattern. The Sierpinski fractal pattern includes a plurality of polygons that are self-similar or similar in shape. For example, the plurality of polygons may be of square shape, triangle shape, and so on. It will be understood that a polygon included in the recursive geometrical shape may be of any shape that has all sides of equal length. In order to configure a Sierpinski fractal pattern marker, a polygon shaped object is embedded multiple times at appropriate locations at various scales within a topmost object in the marker. Herein, the term ‘scale’ refers to the size of the polygon. As such, “the size of the plurality of polygons belonging to various distinct scales” refers to a set of distinct sizes of the polygons. The number of scales at which the object is embedded is known as the order of the pattern. An example of a Sierpinski fractal pattern marker with a square as the polygon is illustrated and explained further with reference to FIG. 3.
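By way of a non-limiting illustration, the following Python sketch shows one way such an order-n Sierpinski carpet pattern could be rendered as a binary grid; the function name and the rendering convention (embedded squares marked 0 on a filled background) are illustrative assumptions and not part of the claimed subject matter.

    import numpy as np

    def sierpinski_carpet(order: int) -> np.ndarray:
        # Starting from a single filled cell, each recursion scales the
        # pattern 3x and clears the central ninth, thereby embedding the
        # self-similar square at one additional scale.
        carpet = np.ones((1, 1), dtype=np.uint8)
        for _ in range(order):
            n = carpet.shape[0]
            carpet = np.tile(carpet, (3, 3))  # replicate the pattern 3 x 3
            carpet[n:2 * n, n:2 * n] = 0      # embed the central square
        return carpet

    marker = sierpinski_carpet(order=3)  # 27 x 27 grid; zeros mark the squares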
[0028] The resemblance of a fractal to a part of itself facilitates locating and recognizing the marker when only a part of the marker is detected/imaged during flight/descent of the UAV. In addition, the similarity of the geometrical polygonal shape at arbitrary scales to itself in the marker facilitates capturing frames that have a fixed-width pattern inside, during descent of the UAV at a certain fixed discrete set of altitudes. Due to these advantages, the fractal pattern can be utilized for facilitating landing of the UAV even when the GV on which the marker is disposed is in motion (or is mobile).
[0029] In an embodiment, the descending of the UAV is controlled by continuously (at a particular frequency) estimating the 3D location of the UAV with respect to the marker. In an embodiment, the UAV 102 is configured to detect the marker by using vision-based sensing techniques. In an embodiment, the UAV includes at least one media sensor configured thereon for detecting the marker by capturing images/frames of the marker. The captured frames can be utilized for estimating the 3D location of the UAV with respect to a centroid of the marker during descent. Hereinafter, the 3D location of the UAV with respect to the centroid of the marker during descent may be referred to as the ‘relative 3D location’. In an embodiment, the relative 3D location may be used to control a speed of the UAV, an immediate heading position of the UAV during the flight/descent, and other such factors, thereby facilitating landing of the UAV on the marker (even when the marker is configured on a moving GV).
[0030] In an embodiment, the 3D location of the UAV at any instant may be used to determine/estimate the altitude of the UAV at that instant. The estimated altitude may further be utilized for determining a descent path of the UAV towards the marker. For example, the 3D position of the UAV may further facilitate determining the altitude of the UAV in real-time, thereby enabling estimation of an immediate heading position (or the descent path) of the UAV. In an embodiment, the estimation of the 3D position of the UAV can be utilized in calculating controlled instantaneous speeds of the UAV during descent. In an example embodiment, a system is provided for facilitating the landing of the UAV at the landing point. An example implementation of the system for estimating the relative 3D location of the UAV and facilitating the landing of the UAV at the landing point is explained further with reference to FIG. 2.
[0031] FIG. 2 illustrates a block diagram of a system 200 for facilitating landing of a UAV, in accordance with an embodiment of the present disclosure. In an embodiment, the system 200 facilitates landing of the UAV by utilizing vision-based techniques. For example, the system stores an image of a marker that is representative of a landing location of the UAV. Such a stored image of the marker may hereinafter be referred to as the ‘stored marker image’. The system 200 may further facilitate capturing an image (hereinafter referred to as the ‘captured marker image’) of the marker at any instant from an altitude, and aligning the stored marker image with the captured marker image based on the respective centroids of the captured marker image and the stored marker image. Once the stored marker image is aligned with the captured marker image, the system 200 may be caused to determine the 3D location and/or altitude of the UAV relative to the marker, and based on the relative altitude, the system 200 may initiate and/or facilitate descent of the UAV.
[0032] The system 200 includes or is otherwise in communication with at least one processor such as a processor 202, at least one memory such as a memory 204, and at least one media sensor such as a media sensor 206. In an embodiment, the processor 202, the memory 204, and the media sensor 206 may be coupled by a system bus such as a system bus 208 or a similar mechanism.
[0033] The processor 202 may include circuitry implementing, among others, audio and logic functions associated with the communication. For example, the processor 202 may include, but is not limited to, one or more digital signal processors (DSPs), one or more microprocessors, one or more special-purpose computer chips, one or more field-programmable gate arrays (FPGAs), one or more application-specific integrated circuits (ASICs), one or more computer(s), various analog-to-digital converters, digital-to-analog converters, and/or other support circuits. The processor 202 thus may also include the functionality to encode messages and/or data or information. The processor 202 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 202. Further, the processor 202 may include functionality to execute one or more software programs, which may be stored in the memory 204 or otherwise accessible to the processor 202.
[0034] The at least one memory such as the memory 204 may store any number of pieces of information and data used by the system to implement the functions of the system. For example, in an example embodiment, the memory 204 is configured to store an image (hereinafter referred to as the ‘stored marker image’) of the marker. The memory 204 may include, for example, volatile memory and/or non-volatile memory. Examples of the volatile memory may include, but are not limited to, volatile random access memory (RAM). The non-volatile memory may additionally or alternatively comprise an electrically erasable programmable read only memory (EEPROM), flash memory, hard drive, or the like. Some examples of the volatile memory include, but are not limited to, random access memory, dynamic random access memory, static random access memory, and the like. Some examples of the non-volatile memory include, but are not limited to, hard disks, magnetic tapes, optical disks, programmable read only memory, erasable programmable read only memory, electrically erasable programmable read only memory, flash memory, and the like. The memory 204 may be configured to store information, data, applications, instructions or the like for enabling the UAV to carry out various functions in accordance with various example embodiments. Additionally or alternatively, the memory 204 may be configured to store instructions which when executed by the processor 202 cause the UAV to behave in a manner as described in various embodiments.
[0035] In an embodiment, the media sensor 206 may include an image capturing module, such as a camera, video and/or audio module, in communication with the processor 202. The media sensor 206 may be any means for facilitating the capture of images, video and/or audio for storage, display, or transmission. For example, in an exemplary embodiment, the media sensor 206 may be embodied in a camera, and the camera may be configured to form and save a digital image file from an image of the marker captured by the camera. The media sensor 206 may include hardware such as a CMOS/CCD (complementary metal-oxide semiconductor/charge-coupled device) sensor configured for capturing images. In an embodiment, the media sensor may be configured to capture media items in accordance with a number of capture settings such as focal length, zoom level, lens type, aperture, shutter timing, white balance, color, style (e.g., black and white, sepia, or the like), picture quality (e.g., pixel count), flash, date, time, or the like. In some embodiments, the values of the capture settings (e.g., degree of zoom) may be obtained at the time a media item comprising the marker image is captured and stored in association with the captured media item in a memory device, such as the memory 204. The media sensor 206 can include all hardware, such as circuitry, a lens or other optical component(s), and software for creating a digital image file from a captured image.
[0036] In some example embodiments, the image sensor may include only the hardware needed to view an image, while a memory device, such as the memory device of the system 200, stores instructions for execution by the processor 202 in the form of software to create a digital image file from a captured image. In an exemplary embodiment, the media sensor 206 may further include a processor or co-processor which assists the processor 202 in processing image data, and an encoder and/or decoder for compressing and/or decompressing image data. The encoder and/or decoder may encode and/or decode according to, for example, a joint photographic experts group (JPEG) standard or other format.
[0037] In an example embodiment, the system 200 may be embodied in the UAV. In another embodiment, the system 200 may partially be implemented in the UAV and partially at the landing location. For example, in case the landing location of the UAV is on a GV (as is illustrated in FIG. 1), the measurements performed by the sensors configured on the GV as well as the captured images of the marker may be transmitted wirelessly to a Ground Control Station (GCS) configured on/within the GV, where the estimation is performed and transmitted back to the UAV in real-time. In this embodiment, the system 200 may include a communication interface element 210 to facilitate communication between the UAV and the GCS. The communication interface element 210 may be in the form of a wireless connection or a wired connection. Examples of the communication interface element 210 may include, but are not limited to, IEEE 802.11 (Wifi), BLUETOOTH®, or a wide-area wireless connection. An example of a wired network interface element 210 includes, but is not limited to, Ethernet. The processor 202 may also be configured to facilitate communications via the communication interface element 210 by, for example, controlling hardware included within the communication interface element 210. In this regard, the communication interface element 210 may include, for example, communications driver circuitry (e.g., circuitry that supports wired communications via, for example, fiber optic connections), one or more antennas, a transmitter, a receiver, a transceiver and/or supporting hardware, including, for example, a processor for enabling communications.
[0038] In an embodiment, for a UAV to be able to start its descent from a comfortable altitude, the marker should be properly imaged by the media sensor, and the image of the marker should be provided to the processor for enabling the descent of the UAV. Also, for the media sensor to properly image the marker from that altitude, the marker should be captured within a predetermined threshold size of a bounding box by the media sensor. Similarly, given the size of a landing pad/location and the marker pattern within the landing location, there is a maximum altitude up to which useful estimation of the location can be performed.
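As a hedged illustration of this constraint, assuming a simple pinhole-camera model, the maximum useful altitude may be approximated as sketched below in Python; the function and parameter names are hypothetical, and the thin-lens approximation is an assumption rather than a limitation of the embodiments.

    def max_useful_altitude(marker_size_m: float,
                            focal_length_px: float,
                            min_bbox_px: float) -> float:
        # A marker of physical width marker_size_m projects to roughly
        # focal_length_px * marker_size_m / H pixels at altitude H, so the
        # highest altitude at which it still spans min_bbox_px pixels is:
        return focal_length_px * marker_size_m / min_bbox_px

    # e.g. a 1.5 m marker, an 800 px focal length, a 60 px detection threshold:
    print(max_useful_altitude(1.5, 800.0, 60.0))  # 20.0 (metres)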
[0039] In an example embodiment, the processor 202 is configured to, with the content of the memory 204, and optionally with other components described herein, cause the system 200 to detect the marker representative of the landing location of the UAV by scanning the landing location. In an embodiment, detecting at least one portion of the marker includes capturing an image of the at least one portion of the marker containing at least one complete polygon of the marker. In an embodiment, the camera (or the at least one media sensor) may be installed downward-looking on the UAV so as to properly scan and capture an image of the marker.
[0040] The system 200 may be caused to scan the landing location and detect an object that may potentially be the marker. In order to ascertain that the detected object is the marker, the system 200 is caused to perform a connected component analysis (CCA) on the detected object. In an embodiment, for performing the CCA, distinct polygons of a plurality of self-similar polygons of the marker may be assumed to be the components. An example of detecting the marker is explained below by considering the marker as a Sierpinski fractal carpet pattern having a plurality of squares. Also, an example of the Sierpinski fractal carpet is illustrated and described with reference to FIG. 3.
[0041] In an embodiment, for detecting the Sierpinski fractal carpet pattern marker, a CCA is performed by the system 200 on a detected object which may potentially include the pattern. The system 200 is caused to perform a ‘squareness check’ on the components of the detected object. Herein, the term ‘squareness check’ may refer to determining and ascertaining whether the components of the detected object are squares or not. In an example embodiment, the squareness check may be performed for each component by detecting corners of the component, determining the number of corners of the component, checking a ratio of the blob area to the blob boundary for the component, determining whether the angles near the corners of a bounding box of the component are around 90° or not, and so on. Herein, a blob may refer to a region emerging within the image after the CCA, corresponding to an object. It will be noted that only a few of the criteria for determining the squareness of the components are disclosed herein; however, the squareness of the components may be determined by various other parameters, or combinations thereof with the mentioned parameters, without limiting the scope of various embodiments. The components meeting the squareness criteria may be stored in a sequence, and the sequence may be sorted by decreasing component area.
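A minimal Python/OpenCV sketch of such a squareness check is given below, assuming OpenCV contours as the connected components; the corner-angle tolerance and the area-to-perimeter criterion are illustrative thresholds, not prescribed values.

    import cv2
    import numpy as np

    def is_square(contour, angle_tol_deg: float = 15.0) -> bool:
        # Approximate the component boundary and require four convex corners.
        peri = cv2.arcLength(contour, True)
        approx = cv2.approxPolyDP(contour, 0.04 * peri, True)
        if len(approx) != 4 or not cv2.isContourConvex(approx):
            return False
        # Require each corner angle to be within angle_tol_deg of 90 degrees.
        pts = approx.reshape(4, 2).astype(np.float64)
        for i in range(4):
            a, b, c = pts[i - 1], pts[i], pts[(i + 1) % 4]
            v1, v2 = a - b, c - b
            cos = abs(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))
            if cos > np.sin(np.radians(angle_tol_deg)):
                return False
        # Blob-area to boundary check: a square of perimeter p has area (p/4)^2.
        area = cv2.contourArea(approx)
        side = peri / 4.0
        return abs(area - side * side) < 0.2 * side * side

Components passing the check may then be sorted by decreasing contour area, as described above.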
[0042] As an example, a blob in the detected marker is considered and an orientation of the blob is determined. Also, an x-y axis is defined from the centroid of the blob. If, in the sorted sequence, 8 more squares are found as the next successive items of the list, each having an area around 1/9th of this blob, then for each such square it is determined whether the centroid of that square lies along a direction defined by the extended 8-neighborhood of the centroid of this blob, at the same distance from the current blob’s centroid as the distance noted for the neighboring squares; the squares so considered are processed till all the iterations are complete. In an embodiment, all the iterations are complete once the count of the detected squares is at most (8^n + 8^(n-1) + … + 1) and at least (3^n + 3^(n-1) + … + 1). Herein, the least number corresponds to detecting all squares along the boundary of the marker, plus a central square which may be the biggest in size.
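Assuming the bounds above are the geometric sums they appear to be, a short helper can compute them for a marker of a given order; this is a hedged reading of the reconstructed expressions, not a definitive count.

    def square_count_bounds(order: int) -> tuple:
        # At most, every embedded square is visible: 8^n + 8^(n-1) + ... + 1.
        most = sum(8 ** k for k in range(order + 1))
        # At least, the boundary squares plus the central square:
        # 3^n + 3^(n-1) + ... + 1.
        least = sum(3 ** k for k in range(order + 1))
        return least, most

    print(square_count_bounds(2))  # (13, 73)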
[0043] In an embodiment, once the marker is detected by the system 200, the processor 202 is further configured to, with the content of the memory 204, and optionally with other components described herein, cause the system 200 to estimate a 3D location of the UAV relative to the marker. In particular, the system 200 is caused to determine the 3D location of the UAV relative to the centroid of the marker at the landing location. In an embodiment, the centroid may be determined based on the size of the plurality of polygons in the recursive geometrical pattern of the marker. In particular, the centroid may be determined as the central portion of the single biggest polygon detected while detecting the marker. For example, in case of a Sierpinski fractal pattern marker, the centroid may be the central portion of the single biggest square.
[0044] In an example embodiment, the processor 202 is further configured to, with the content of the memory 204, and optionally with other components described herein, cause the system 200 to estimate a 3D orientation of the at least one media sensor with respect to the centroid of the marker. In an embodiment, the 3D orientation of the at least one media sensor with respect to the centroid of the marker may be determined by utilizing vision-based techniques. In an embodiment, for determining the orientation of the at least one media sensor with respect to the centroid of the marker, the system 200 may be caused to align and compare the stored marker image in the system 200 with the captured marker image to determine a correspondence between the captured marker image and the stored marker image. In an example embodiment, the system 200 is further caused to compute an altitude of the UAV based on the correspondence between the images. In an embodiment, a plurality of altitude values may be computed from each individual correspondence between different polygons of the plurality of polygons of the stored marker image and the captured marker image, and a mean/median/mode of the plurality of altitude values may be calculated to determine the altitude of the UAV in a robust manner.
[0045] In certain scenarios, for example due to UAV motion or other noise conditions, the media sensor may not be able to capture the image of the marker completely. In an example embodiment, an image of the marker may be deemed a partial marker image if the central, biggest polygon cannot be detected. In such a scenario, the system 200 may be caused to track only a partial image of the marker, and extrapolate the partially detected marker image into the complete marker image, so that the centroid of the marker can be estimated. In an example embodiment, the system may be caused to extrapolate the partially detected marker image based on a color code associated with the polygons detected in the partially detected marker image. In particular, for determining the level information and orientation information associated with each of the plurality of polygons of the partially captured marker image, each polygon may be color-coded with a unique color and a corresponding color value, such that the color and the corresponding color value of each polygon of the partially captured marker image uniquely encode the orientation information and the order information associated with that polygon. In an embodiment, the color and the corresponding color value of each of the polygons may be stored in a pre-established list of colors. In an embodiment, the pre-established list of colors may be stored in a memory, for example the memory 204 of the system 200. Accordingly, even if one polygon of the plurality of polygons is detected in the imaged marker and the system 200 is caused to correspond the detected polygon to one unique polygon in the stored marker image, the polygon can be extrapolated to generate the complete marker image. It will be noted herein that the more polygons are detected and the more correspondences are established, the lower is the estimation error of the centroid. An example of detecting the partial marker image and extrapolating the partial marker image based on the color coding of the individual polygons of the marker is described below.
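Purely by way of illustration, such a pre-established list of colors might be realized as a lookup table keyed by color value; the colors, tolerance, and field names below are hypothetical placeholders.

    # Hypothetical color-code table: each polygon's unique (B, G, R) color
    # maps to its order and its orientation relative to the marker centroid.
    COLOR_CODE_TABLE = {
        (255, 0, 0): {"order": 1, "orientation_deg": 0.0},
        (0, 255, 0): {"order": 2, "orientation_deg": 45.0},
        (0, 0, 255): {"order": 2, "orientation_deg": 135.0},
        # ... one entry per polygon in the pre-established list of colors
    }

    def lookup_polygon(mean_bgr, tol=30):
        # Match a detected polygon's mean color against the stored list.
        for color, info in COLOR_CODE_TABLE.items():
            if all(abs(int(c) - int(m)) <= tol for c, m in zip(color, mean_bgr)):
                return info
        return None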
[0046] In an embodiment, for extrapolating the partially captured marker image, the system performs a CCA on a set of polygons captured in the partially captured marker image. Herein, the individual polygons of the set of polygons may be referred to as components. The system 200 is caused to perform a ‘check’ on the individual components to determine whether an individual component is a polygon or not. In an example embodiment, the check may be performed for each component by detecting corners of the component, determining the number of corners of the component, checking a ratio of the blob area to the blob boundary for the component, determining the angles near the corners of a bounding box of the component, and so on. It will be noted that only a few of the criteria for checking the shape of the components are disclosed herein; however, the check of the components may be performed using various other parameters, or combinations thereof with the mentioned parameters, without limiting the scope of various embodiments. The components meeting the criteria of the check may be stored in a sequence, and the sequence may be sorted by decreasing component area. For each polygon that is stored in the sequence, a color value associated with a color of the polygon is determined, and a search is performed for determining a mapping between the determined color value and the color values stored in the pre-established list of colors. In an embodiment, on detecting a mapping, the polygon corresponding to the mapped color value, along with the corresponding color information (having the order information and the orientation information), is identified in the stored marker image. Based on the order information and the orientation information of the polygon in the stored marker image, a location of the centroid of the stored marker image is estimated. In an embodiment, the location of the centroid is estimated in the stored marker image based on the orientation (and/or location) of the polygon with respect to the marker and the order of the polygon. In an embodiment, the location of the centroid may be determined based on the following expression:
Location of the centroid of a polygon = i × unit distance, in each of (K·n/2) directions,
where n is the known order number of the marker, K is the number of sides of the polygon, and i is the order of the polygon under consideration.
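Under one hedged reading of this expression, in which the K·n/2 candidate directions are taken to be uniformly spaced, the candidate centroid locations implied by a single detected polygon could be enumerated as follows; the uniform spacing is an assumption for illustration only.

    import numpy as np

    def candidate_centroids(i: int, K: int, n: int, unit: float) -> np.ndarray:
        # The marker centroid is assumed to lie i * unit away from the
        # detected polygon's centre, along one of K*n/2 candidate directions.
        m = K * n // 2
        angles = np.arange(m) * (2 * np.pi / m)
        r = i * unit
        # Offsets relative to the detected polygon's centre, one per direction.
        return np.stack([r * np.cos(angles), r * np.sin(angles)], axis=1)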
[0047] The color-mapped polygons may be stored in a sequence, and the sequence may be sorted to store the color-mapped polygons in order of decreasing area. A blob is considered and an orientation of the blob is determined, and an x-y axis is defined from the centroid of the blob. If, in the sorted list, a few more polygons are found as the next successive items of the list, having an area around 1/(9i)th of this blob (where i = 1 to n, and where n is the known order of the marker), then for each such polygon it is determined whether the centroid of that polygon lies along a direction defined by the extended k-neighborhood of the centroid of this blob, at the same distance from the current blob’s centroid as the distance noted for the neighboring polygons, the polygons being considered till all the iterations are complete. In an embodiment, all the iterations are complete once the count of the detected polygons is at most (8^n + 8^(n-1) + … + 1), and a location of the centroid is estimated in each of the iterations. In an embodiment, the location of the centroid is computed by taking a mean/median/mode of the values of the centroid location estimated from all the iterations. The location of the centroid that is estimated from all the iterations is utilized for detecting the relative 3D location and orientation of the camera (media sensors) and/or the UAV with respect to the marker at the landing location. In addition, the estimated location of the centroid facilitates determining the altitude of the UAV. An example of determining the altitude of the UAV based on the estimated location of the centroid is explained further with reference to FIG. 5.
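The per-iteration estimates can be fused robustly, as suggested above; a coordinate-wise median, for example, is a one-line sketch in Python.

    import numpy as np

    def robust_centroid(estimates: np.ndarray) -> np.ndarray:
        # estimates: one (x, y) centroid-location estimate per iteration.
        return np.median(estimates, axis=0)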
[0048] In an example embodiment, the processor 202 is further configured to, with the content of the memory 204, and optionally with other components described herein, cause the system 200 to determine a landing path of the UAV based on the altitude of the camera (media sensors). In an embodiment, determining the landing path includes determining an immediate subsequent landing position of the UAV.
[0049] FIG. 3 illustrates a marker 300 for facilitating landing of a UAV, in accordance with an embodiment of the present disclosure. The marker illustrated in FIG. 3 is a recursive self-similar fractal pattern marker. In particular, the recursive self-similar fractal pattern marker 300 is a Sierpinski Carpet marker pattern. The Sierpinski Carpet marker 300 is a recursive geometrical pattern that is self-similar in nature. In an embodiment, the similarity is in the form of geometrical properties being exactly or approximately the same. A self-similar pattern, by its definition, is similar to itself at any scale, as well as to a part of itself. For example, as illustrated in FIG. 3, the marker 300 includes a polygonal shape that repeats itself at multiple scales throughout the marker. In the case of the Sierpinski Carpet marker 300, the polygonal shape is a square. The square is embedded multiple times at appropriate locations at various scales within a topmost object of the Sierpinski Carpet marker. The Sierpinski Carpet marker 300 includes an arrangement of the square shapes such that the central portion of the marker includes the square 302 of the biggest size as compared to the rest of the squares. The number of scales at which the squares are embedded in the object is known as the ‘order’ of the pattern. For example, in the Sierpinski Carpet 300 illustrated in FIG. 3, the order is 5 since the marker includes squares of 5 different sizes, for instance squares 302, 304, 306, 308, and 310.
[0050] As the Sierpinski Carpet marker 300 includes a self-similar pattern repeating itself at different scales (meaning squares of different sizes), the Sierpinski Carpet marker 300 enables the at least one media sensor to capture an image of the pattern effectively from different altitudes. For example, when the UAV having the at least one media sensor is at a certain altitude, it may be able to capture the image of the Sierpinski Carpet marker 300; however, while descending, the UAV may deviate from a landing path due to environmental conditions, or in some scenarios the at least one media sensor may not be able to capture the complete marker. In such scenarios, if the media sensors are able to detect even a portion (one or two squares) of the Sierpinski Carpet marker 300, the system (for example, the system 200) is able to extrapolate the rest of the Sierpinski Carpet marker pattern and detect the centroid thereof (for example, a centroid 312 of the Sierpinski Carpet marker pattern). In addition, as the UAV starts the descent, then due to its similarity at arbitrary scales to itself (also referred to as self-similarity), it may be possible, at a certain fixed discrete set of altitudes, to capture frames that have a fixed-width pattern within themselves.
[0051] As explained with reference to FIGS. 1 and 2, the system utilizes the centroid of the marker for detecting the 3D location of the camera mounted on the UAV relative to the location of the marker, which in turn is used to determine the immediate heading of the UAV. In addition, detecting the 3D location of the camera mounted on the UAV relative to the location of the marker facilitates continuous control of the descending of the UAV. Herein, the descending of the UAV is continuously controlled by controlling various kinematics parameters such as the speed, the acceleration, and the angle of descent of the UAV, and consequently the landing path followed by the UAV. In an embodiment, on detecting the centroid of the marker (using the captured marker image), the system may facilitate tracking and aligning the centroid of the stored marker image with the centroid of the captured marker image. In an embodiment, the tracking and aligning of the respective centroids of the captured marker image and the stored marker image may be performed by a Kalman filter embodied in the system (for example, the system 200 of FIG. 2).
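As a non-limiting sketch, a constant-velocity Kalman filter over the centroid's image coordinates could serve this tracking step; the motion model, update rate, and noise levels below are illustrative assumptions rather than the claimed filter design.

    import numpy as np

    class CentroidTracker:
        # Tracks the marker centroid (u, v) with a constant-velocity model.
        def __init__(self, dt=0.05, q=1.0, r=4.0):
            self.x = np.zeros(4)                  # state: [u, v, du, dv]
            self.P = np.eye(4) * 100.0            # state covariance
            self.F = np.eye(4)
            self.F[0, 2] = self.F[1, 3] = dt      # constant-velocity motion
            self.H = np.zeros((2, 4))
            self.H[0, 0] = self.H[1, 1] = 1.0     # only (u, v) is observed
            self.Q = np.eye(4) * q                # process noise (assumed)
            self.R = np.eye(2) * r                # measurement noise (assumed)

        def step(self, z):
            # Predict with the motion model.
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            # Update with the measured centroid of the captured marker image.
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ (np.asarray(z, dtype=float) - self.H @ self.x)
            self.P = (np.eye(4) - K @ self.H) @ self.P
            return self.x[:2]                     # filtered centroid (u, v)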
[0052] FIG. 4 illustrates an example representation of a marker 400 for facilitating landing of a UAV, in accordance with an example embodiment. In an embodiment, the marker 400 includes a Sierpinski fractal pattern having a plurality of squares with different orders and orientations relative to a centroid 402 of the marker 400. Herein, the centroid of the Sierpinski fractal pattern may refer to a central point of the Sierpinski fractal pattern, which also forms the centre of the biggest square of the marker.
[0053] As illustrated, various squares have various orders: a square 404 is of the order 1, a square 406 and other squares having the same size as that of the square 406 are of the order 2, a square 408 and other squares having the same size as that of the square 408 are of the order 3, and so on. The orientation of any square from among the plurality of squares is determined by an orientation of a central point (or centroid) of that square relative to the centroid 402 (of the biggest size square) of the marker 400. For example, the orientation of the square 406 may be determined along a line 410 connecting the centre of the square 406 and the centroid 402.
[0054] In an embodiment, the “unit distance” of any square may refer to a distance between the central point of that square and the centroid 402. For example, the unit distance of the square 406 is the length of the line 410 connecting the centre of the square 406 and the centroid 402. Since the distance of the respective central point of any square with reference to the centroid is different for different squares, the unit distance corresponding to each square may be distinct.
[0055] FIG. 5 illustrates an example representation of estimation of an altitude of a UAV, in accordance with an example embodiment. In an embodiment, the altitude of the UAV may be estimated based on a determination of, for at least one square, the values of the unit distance in the stored marker image and the captured marker image, and a focal length of the at least one media sensor utilized for capturing the image of the marker.
[0056] Referring to FIG. 5, an example of corresponding squares in a captured marker image and a stored marker image is illustrated. For example, a square 502 may be a square in the stored marker image, and a square 504 may be the corresponding square in the captured marker image (or the projected marker image on the camera image plane). Also, consistent with equation (1) below, the dimensions of the square 504 on the image plane are represented by ‘dip’, and the known physical dimensions of the square 502 are represented by ‘B’. In an embodiment, the altitude (H) 506 of the UAV may be determined based on the following equation (1):
dip/f = B/H          (1)
where f (represented as 508) is the focal length of the camera (or a device) embodying the at least one media sensor.
[0057] As is seen in equation (1), all the quantities dip, B, and f are known, and thus the value of the altitude (H) of the UAV at any instant can be determined as H = (f × B)/dip.
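A direct one-function sketch of this computation, under the same pinhole assumption, is as follows; the units are assumed consistent (dip and f in the same image-side units, B in metres).

    def altitude_from_marker(dip: float, B: float, f: float) -> float:
        # Equation (1): dip / f = B / H  =>  H = f * B / dip.
        return f * B / dip

    print(altitude_from_marker(dip=30.0, B=0.5, f=800.0))  # ~13.33 (metres)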
[0058] As is seen above, for determination of the altitude of the UAV in case of a partial image capture, a unit distance associated with at least one polygon is required, meaning thereby that, during the partial marker detection, at least one polygon should be detected.
[0059] FIG. 6 illustrates a flow diagram of a method 600 for landing of a UAV, in accordance with the present disclosure.
[0060] At 602, the method includes scanning, by at least one media sensor mounted on the UAV, for detecting a marker representative of a landing location of the UAV, the marker comprising a recursive geometrical pattern. Herein, the scanning may be performed at a geographical area where the marker may be present. For example, in case of a hybrid platform, the scanning may be performed at the top of a GV, which is the location of disposing the marker. In an embodiment, during the scanning, the at least one sensor is configured to capture an image of the at least one portion of the detected marker.
[0061] At 604, it is determined whether at least one portion of the marker is detected. In an embodiment, the scanning for detecting the at least one portion of the marker is performed at 602 till it is determined at 604 that the at least one portion of the marker is detected. At 606, it may be determined that the at least one portion of the marker is detected. In particular, at 606, the detected at least one portion of the marker includes the complete marker, meaning thereby that during the scanning the complete marker is detected. In an embodiment, the scanning of the complete marker may include imaging the complete marker pattern (for example, the Sierpinski fractal pattern, as described with reference to FIG. 3) to generate a complete marker image or captured marker image.
[0062] In another embodiment, it may be determined that only a portion of the marker is detected during the scanning. For example, at 608, it may be determined that a partial marker is detected during scanning. The partial marker pattern may be detected, for example, during noisy conditions or bad weather conditions. In an embodiment, the scanning of the partial marker may include imaging the partial marker to capture a partial marker image. In an embodiment, the partial marker image may be extrapolated to generate the complete marker image, at 610. In an embodiment, the partial marker image may be extrapolated by determining correspondences between the polygons of the captured partial marker image and the polygons of a pre-stored marker image. The stored marker image is the image of the complete marker, and can be stored in the memory of the system (for example, the system 200 shown in FIG. 2).
[0063] In an example embodiment, the partial marker image may be extrapolated by determining the color coding assigned to the polygons of the partial marker image and, based on the color coding, determining the orientation information and order information associated with the polygons of the partial marker image. The orientation information and the order information, along with a scaling factor, may be utilized for extrapolating the partial marker image to generate the complete marker image. The generation of the complete marker image by extrapolating the partial marker image is explained in detail with reference to FIGS. 2-5.
[0064] At 612, a centroid of the captured marker image is determined. In an example embodiment, the centroid of the captured marker image may correspond to the central portion of the biggest-size polygon of the captured marker image. At 614, a 3D location of the at least one media sensor mounted on the UAV is estimated based at least on the centroid of the complete marker image. In an embodiment, for estimating the 3D location of the at least one media sensor, the centroid of the complete marker image may be aligned with the centroid of the pre-stored marker image, and based on the alignment, an altitude of the at least one media sensor is determined. An example of determining the altitude of the at least one media sensor (and hence the altitude of the UAV) is described with reference to FIG. 5.
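A minimal sketch of steps 612-614, under the same assumptions as the earlier sketches, is given below: the centroid is taken as the center of the biggest detected polygon (via image moments), and the altitude follows by re-applying the equation (1) relation at the scale recovered from aligning the captured marker with the pre-stored marker image. The parameter names are illustrative.

```python
import cv2
import numpy as np


def marker_centroid(polygons) -> tuple:
    """Centroid of the captured marker, taken as the central point of its
    biggest detected polygon (step 612)."""
    biggest = max(polygons, key=cv2.contourArea)
    m = cv2.moments(biggest.astype(np.float32))
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])


def altitude_from_alignment(scale: float, b_metres: float,
                            stored_side_px: float, f_pixels: float) -> float:
    """Altitude once the captured marker is aligned with the stored marker
    image (step 614); re-uses the equation (1) relation H = B * f / dip.

    scale          -- uniform captured-to-stored scale recovered from alignment
    b_metres       -- real-world side length of the largest marker polygon
    stored_side_px -- side length of that polygon in the stored marker image
    f_pixels       -- focal length of the media sensor, in pixels
    """
    observed_side_px = scale * stored_side_px  # apparent size dip in the capture
    return b_metres * f_pixels / observed_side_px
```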
[0065] At 616, landing of the UAV on the marker at the landing location is facilitated based on the estimation of the 3D location of the at least one media sensor mounted on the UAV.
[0066] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[0067] Various embodiments of the disclosure provide a method and a system for facilitating landing of a UAV using a marker-based landing approach. In particular, the landing of the UAV is facilitated by a marker that is self-similar and repetitive. The system provided herein enables detection of the self-similar marker using vision-based techniques. The system for detecting the marker may be configured within the UAV or at a landing location (such as a GV). The self-similarity feature of the marker is highly advantageous in detecting the marker and landing the UAV, since this feature enables robust detection of the marker/pattern even when only a partial image of the marker/pattern is captured. Particularly, even though the altitude of the UAV varies during the descent of the UAV, and the camera mounted on the UAV is able to capture only a portion of the pattern, the system can extrapolate the pattern because the pattern is self-similar and recursive. In addition, the self-similarity of the recursive pattern enables robust detection of the marker and a smooth landing of the UAV even when the GV is in motion (or in a mobile condition) and presents a size-constrained surface such as a truck-top. Moreover, the disclosed system takes care of false positives, i.e., a detected polygon that does not form part of the pattern/marker is rejected.
[0068] In certain embodiments, in order to facilitate location of the pattern under arbitrary outdoor lighting/photogrammetric conditions, uniform illumination may be provided beneath the pattern/marker imprinted on a glass surface. For example, square LED illumination can be used to enhance the luminance of the pattern.
[0070] It is, however, to be understood that the scope of the protection extends to such a program and, in addition, to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed, including, e.g., any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be, e.g., hardware means like an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g., an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g., using a plurality of CPUs.
[0071] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[0072] The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
[0073] A data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
[0074] Input/output (I/O) devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
[0075] A representative hardware environment for practicing the embodiments may include a hardware configuration of an information handling/computer system in accordance with the embodiments herein. The system herein comprises at least one processor or central processing unit (CPU). The CPUs are interconnected via system bus to various devices such as a random access memory (RAM), read-only memory (ROM), and an input/output (I/O) adapter. The I/O adapter can connect to peripheral devices, such as disk units and tape drives, or other program storage devices that are readable by the system. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments herein.
[0076] The system further includes a user interface adapter that connects a keyboard, mouse, speaker, microphone, and/or other user interface devices such as a touch screen device (not shown) to the bus to gather user input. Additionally, a communication adapter connects the bus to a data processing network, and a display adapter connects the bus to a display device which may be embodied as an output device such as a monitor, printer, or transmitter, for example.
[0077] The foregoing description of the specific implementations and embodiments will so fully reveal the general nature of the implementations and embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
[0078] The preceding description has been presented with reference to various embodiments. Persons having ordinary skill in the art and technology to which this application pertains will appreciate that alterations and changes in the described structures and methods of operation can be practiced without meaningfully departing from the principle, spirit and scope.
| # | Name | Date |
|---|---|---|
| 1 | REQUEST FOR CERTIFIED COPY [16-03-2016(online)].pdf | 2016-03-16 |
| 2 | Form 3 [21-07-2016(online)].pdf | 2016-07-21 |
| 3 | 2721-MUM-2015-Correspondence-190815.pdf | 2018-08-11 |
| 4 | 2721-MUM-2015-Form 1-190815.pdf | 2018-08-11 |
| 5 | 2721-MUM-2015-Correspondence-201015.pdf | 2018-08-11 |
| 6 | 2721-MUM-2015-Power of Attorney-201015.pdf | 2018-08-11 |
| 7 | 2721-MUM-2015-DEFENCE OF R & D ORGANISATION-11-01--2017.pdf | 2018-08-11 |
| 8 | Form 2.pdf | 2018-08-11 |
| 9 | Form 3.pdf | 2018-08-11 |
| 10 | Figure for Abstract.jpg | 2018-08-11 |
| 11 | ABSTRACT1.jpg | 2018-08-11 |
| 12 | Drawings.pdf | 2018-08-11 |
| 13 | Request For Certified Copy-Online.pdf | 2018-08-11 |
| 14 | 2721-MUM-2015-FER.pdf | 2019-07-30 |
| 15 | 2721-MUM-2015-FER_SER_REPLY [30-01-2020(online)].pdf | 2020-01-30 |
| 16 | 2721-MUM-2015-COMPLETE SPECIFICATION [30-01-2020(online)].pdf | 2020-01-30 |
| 17 | 2721-MUM-2015-CLAIMS [30-01-2020(online)].pdf | 2020-01-30 |
| 18 | 2721-MUM-2015-OTHERS [30-01-2020(online)].pdf | 2020-01-30 |
| 19 | 2721-MUM-2015-US(14)-HearingNotice-(HearingDate-23-02-2022).pdf | 2022-02-01 |
| 20 | 2721-MUM-2015-Correspondence to notify the Controller [02-02-2022(online)].pdf | 2022-02-02 |
| 21 | 2721-MUM-2015-FORM-26 [02-02-2022(online)].pdf | 2022-02-02 |
| 22 | 2721-MUM-2015-FORM-26 [02-02-2022(online)]-1.pdf | 2022-02-02 |
| 23 | 2721-MUM-2015-US(14)-ExtendedHearingNotice-(HearingDate-25-02-2022).pdf | 2022-02-23 |
| 24 | 2721-MUM-2015-Correspondence to notify the Controller [23-02-2022(online)].pdf | 2022-02-23 |
| 25 | 2721-MUM-2015-FORM-26 [23-02-2022(online)].pdf | 2022-02-23 |
| 26 | 2721-MUM-2015-Written submissions and relevant documents [08-03-2022(online)].pdf | 2022-03-08 |
| 27 | 2721-MUM-2015-RELEVANT DOCUMENTS [08-03-2022(online)].pdf | 2022-03-08 |
| 28 | 2721-MUM-2015-RELEVANT DOCUMENTS [08-03-2022(online)]-1.pdf | 2022-03-08 |
| 29 | 2721-MUM-2015-PETITION UNDER RULE 137 [08-03-2022(online)].pdf | 2022-03-08 |
| 30 | 2721-MUM-2015-PETITION UNDER RULE 137 [08-03-2022(online)]-1.pdf | 2022-03-08 |
| 31 | 2721-MUM-2015-IntimationOfGrant27-06-2022.pdf | 2022-06-27 |
| 32 | 2721-MUM-2015-PatentCertificate27-06-2022.pdf | 2022-06-27 |

| # | Name |
|---|---|
| 1 | SearchStrategy_29-03-2019.pdf |
| 2 | SearchStrategy_29-07-2019.pdf |