
System, Apparatus, And Method For Polygon Survey

Abstract: Disclosed is a data processing apparatus (104) including a processing circuitry (120). The processing circuitry (120) is configured to receive a set of inputs from a user device (102) for selection of a polygon (302), determine a center point (304) of the polygon (302), determine a circle (310) with a radius (r) from the center point (304), generate a rectangular bounding box (312) encompassing the circle (310), segregate the rectangular bounding box (312) into a plurality of secondary boxes (402) based on a footprint area, determine a set of secondary center points (404), determine one or more secondary center points of the set of secondary center points (404) inside the polygon (302), and generate a survey pattern (406) by joining adjacent center points of the one or more secondary center points based on a pre-defined heading map.


Patent Information

Filing Date: 01 August 2023
Publication Number: 12/2024
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Grant Date: 2025-10-09

Applicants

PASSENGER DRONE RESEARCH PRIVATE LIMITED
04, Balnandan Row House, Jachak Nagar, Jai Bhavani Road, Nashik Road, Nashik, Maharashtra, 422101, India

Inventors

1. Kanchan Prakash Borade
Flat-03, Shree Arpan Residency, opposite to Gaikwad Petroleum, Jai Bhavani Road, Nashik Road, Nashik, Maharashtra, 422101, India

Specification

Description:

TECHNICAL FIELD
The present disclosure relates generally to geographical surveys. More particularly, the present disclosure relates to a system, an apparatus, and a method for polygon survey.
BACKGROUND
Surveillance systems conduct surveys of a desired geographical region for a variety of reasons such as monitoring of behavior, detection of activities, gathering information to perform one or more actions, and the like. A variety of surveys exist today such as polygon survey, circular survey, and the like. The type of survey is dependent on an application of the survey system.
Applications such as road surveys, pipeline (gas, water) surveys, and the like require a precise selection of the area to be surveyed along with a capability to accurately detect the turns of the roads or the pipelines. Polygon surveys are commonly used for such applications; however, the polygon to be surveyed can be of any size and shape, and thus it becomes very difficult to plot the waypoints over the polygon.
Thus, there is a need for an automated system, an apparatus, and a method for accurate and precise determination of a polygon survey that efficiently utilizes the available resources and minimizes the amount of irrelevant data captured by the system while covering the desired area, which demands an improved technical solution that overcomes the aforementioned problems.
SUMMARY
In an aspect of the present disclosure, a data processing apparatus includes processing circuitry. The processing circuitry is configured to receive a set of inputs from a user device for selection of a polygon, determine a center point of the polygon, determine a circle with a radius from the center point, and generate a rectangular bounding box encompassing the circle. The processing circuitry is further configured to segregate the rectangular bounding box into a plurality of secondary boxes based on a footprint area, and determine a set of secondary center points such that each center point of the set of secondary center points corresponds to a box of the plurality of secondary boxes. Furthermore, the processing circuitry is configured to determine one or more secondary center points of the set of secondary center points inside the polygon, and generate a survey pattern by joining adjacent center points of the one or more secondary center points based on a pre-defined heading map.
In some aspects, to determine the center point of the polygon, the processing circuitry is configured to identify a first set of edge points of the polygon, determine a plurality of opposite point pairs of the first set of edge points, determine a first set of lines such that each line of the first set of lines connects a point pair of the plurality of opposite point pairs, and identify an intersection of the first set of lines.
In some aspects, to determine the radius of the circle from the center point, the processing circuitry is configured to determine a first set of distances between the first set of lines, determine a maximum distance value of the first set of distances between the first set of lines, and divide the maximum distance value by two.
In some aspects, to generate the rectangular bounding box, the processing circuitry is configured to determine a second set of lines from the circle, determine a set of mid points on the second set of lines, determine a third set of lines by joining the set of mid points with the center, and determine a second set of edge points at a distance from the center on the third set of lines.
In some aspects, to determine the second set of lines, the processing circuitry is configured to determine first and second secants on horizontal and vertical axes from the center point, respectively, and determine first and second pairs of the secant points that correspond to intersection of the first and second secants with the circle, respectively.
In some aspects, the processing circuitry is configured to determine the adjacent center points of the one or more secondary center points based on a heading and a distance between each point of the one or more secondary center points.
In some aspects, the processing circuitry is configured to receive one or more parameters from an imaging unit, and determine the footprint area based on the one or more parameters of the imaging unit.
In some aspects, upon the generation of the survey pattern, the processing circuitry is configured to receive one or more images captured along the survey pattern from the imaging unit, determine a relative orientation of the one or more images, and generate a survey map by combining the one or more images based on the relative orientation of the one or more images.
In another aspect of the present disclosure, a system includes a user device and a data processing apparatus. The user device is configured to enable a user to provide a set of inputs for selection of a polygon. The data processing apparatus further includes processing circuitry. The processing circuitry is configured to receive a set of inputs from the user device for the selection of the polygon, determine a center point of the polygon, determine a circle with a radius from the center point, and generate a rectangular bounding box encompassing the circle. The processing circuitry is further configured to segregate the rectangular bounding box into a plurality of secondary boxes based on a footprint area, and determine a set of secondary center points such that each center point of the set of secondary center points corresponds to a box of the plurality of secondary boxes. Furthermore, the processing circuitry is configured to determine one or more secondary center points of the set of secondary center points inside the polygon, and generate a survey pattern by joining adjacent center points of the one or more secondary center points based on a pre-defined heading map.
In yet another aspect of the present disclosure, a method includes receiving, by way of processing circuitry, a set of inputs from the user device for selection of a polygon, determining, by way of the processing circuitry, a center point of the polygon, determining, by way of the processing circuitry, a circle with a radius from the center point, and generating, by way of the processing circuitry, a rectangular bounding box encompassing the circle. The method further includes segregating, by way of the processing circuitry, the rectangular bounding box into a plurality of secondary boxes based on a footprint area, and determining, by way of the processing circuitry, a set of secondary center points such that each center point of the set of secondary center points corresponds to a box of the plurality of secondary boxes. Furthermore, the method includes determining, by way of the processing circuitry, one or more secondary center points of the set of secondary center points inside the polygon and generating, by way of the processing circuitry, a survey pattern by joining adjacent center points of the one or more secondary center points based on a pre-defined heading map.
BRIEF DESCRIPTION OF DRAWINGS
The above and still further features and advantages of aspects of the present disclosure become apparent upon consideration of the following detailed description of aspects thereof, especially when taken in conjunction with the accompanying drawings, wherein:
FIG. 1 illustrates a block diagram of a system for polygon survey, in accordance with an exemplary aspect of the present disclosure;
FIG. 2 illustrates a block diagram of a data processing apparatus of FIG. 1, in accordance with an exemplary aspect of the present disclosure;
FIG. 3 illustrates a schematic representation of a rectangle encompassing a circle around a polygon selected by a user for a polygon survey;
FIG. 4 illustrates a schematic representation of a survey map for polygon survey, in an exemplary aspect of the present disclosure; and
FIG. 5 illustrates a flow chart of a method for the polygon survey, in accordance with an exemplary aspect of the present disclosure.
To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures.
DETAILED DESCRIPTION
Various aspects of the present disclosure provide a system, an apparatus, and a method for polygon survey. The following description provides specific details of certain aspects of the disclosure illustrated in the drawings to provide a thorough understanding of those aspects. It should be recognized, however, that the present disclosure can be reflected in additional aspects and the disclosure may be practiced without some of the details in the following description.
The various aspects including the example aspects are now described more fully with reference to the accompanying drawings, in which the various aspects of the disclosure are shown. The disclosure may, however, be embodied in different forms and should not be construed as limited to the aspects set forth herein. Rather, these aspects are provided so that this disclosure is thorough and complete, and fully conveys the scope of the disclosure to those skilled in the art. In the drawings, the sizes of components may be exaggerated for clarity.
It is understood that when an element is referred to as being “on,” “connected to,” or “coupled to” another element, it can be directly on, connected to, or coupled to the other element or intervening elements that may be present. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
The subject matter of example aspects, as disclosed herein, is described with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventor/inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different features or combinations of features similar to the ones described in this document, in conjunction with other technologies. Generally, the various aspects including the example aspects relate to the system, and the method for polygon survey.
As mentioned, there is a need for an automated system, an apparatus, and a method for accurate and precise determination of a polygon for survey that efficiently utilizes the available resources and minimizes the amount of irrelevant data captured by the system to cover the desired area. The present aspects therefore provide a system 100, a data processing apparatus 104, and a method 500 that offer an improved technical solution that overcomes the aforementioned problems.
The aspects herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting aspects that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the aspects herein. The examples used herein are intended merely to facilitate an understanding of ways in which the aspects herein may be practiced and to further enable those of skill in the art to practice the aspects herein. Accordingly, the examples should not be construed as limiting the scope of the aspects herein.
FIG. 1 illustrates a block diagram of the system 100 for polygon survey, in accordance with an exemplary aspect of the present disclosure. The system 100 may include a user device 102, a data processing apparatus 104, and an imaging unit 106. In some aspects of the present disclosure, the user device 102 and the imaging unit 106 may be communicatively coupled to the data processing apparatus 104 by way of either a first wired communication medium or a first wireless communication medium. In some aspects of the present disclosure, the user device 102, the data processing apparatus 104, and the imaging unit 106 may be communicatively coupled to each other by way of a communication network 108.
The user device 102 may be configured to enable a user to submit a set of inputs for selection of a polygon (shown as 302 later in FIG. 3). The user device 102 may further be configured to enable the user to select and/or input one or more parameters associated with the imaging unit 106. In some aspects of the present disclosure, the user device 102 may be configured to facilitate the user to provide input(s) to register on the system 100. Furthermore, the user device 102 may facilitate the user to enable a password protection for logging-in (i.e., user authentication) to the system 100.
In some aspects of the present disclosure, the user device 102 may include a first user interface 110, a first processing unit 112, a first memory 114, a survey console 116, and a first communication interface 118.
The first user interface 110 may include a first input interface (not shown) for receiving inputs from the user. In some aspects of the present disclosure, the first input interface may be configured to enable the user to submit the set of inputs for selection of the polygon 302. The first input interface may further be configured to enable the user to select and/or input the one or more parameters associated with the imaging unit 106. Furthermore, the first input interface may be configured to enable the user to select and/or provide inputs for registration and/or authentication of the user to use one or more functionalities of the system 100. In some aspects of the present disclosure, the first input interface may be configured to enable the user to provide inputs to enable password protection for logging-in to the system 100. Examples of the first input interface may include, but are not limited to, a touch interface, a mouse, a keyboard, a motion recognition unit, a gesture recognition unit, a voice recognition unit, or the like. Aspects of the present disclosure are intended to include or otherwise cover any type of the first input interface including known, related art, and/or later developed technologies. The first user interface 110 may further include a first output interface (not shown) for displaying (or presenting) an output to the user. In some aspects of the present disclosure, the first output interface may be configured to display or present either of, survey pattern (shown later in FIG. 4 as 406) and/or a survey map (shown later in FIG. 4 as 400) generated by the system 100 to the user. Examples of the first output interface may include, but are not limited to, a digital display, an analog display, a touch screen display, a graphical user interface, a website, a webpage, a keyboard, a mouse, a light pen, an appearance of a desktop, and/or illuminated characters. Aspects of the present disclosure are intended to include and/or otherwise cover any type of the first output interface including known and/or related, or later developed technologies.
The first processing unit 112 may include suitable logic, instructions, circuitry, interfaces, and/or codes for executing various operations, such as the operations associated with the user device 102, and/or the like. In some aspects of the present disclosure, the first processing unit 112 may utilize one or more processors such as an Arduino, a Raspberry Pi, or the like. Further, the first processing unit 112 may be configured to control one or more operations executed by the user device 102 in response to the input received at the first user interface 110 from the user. Examples of the first processing unit 112 may include, but are not limited to, an application-specific integrated circuit (ASIC) processor, a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, a field-programmable gate array (FPGA), a Programmable Logic Control unit (PLC), and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of first processing unit 112 including known, related art, and/or later developed processing units.
The first memory 114 may be configured to store the logic, instructions, circuitry, interfaces, and/or codes of the first processing unit 112, data associated with the user device 102, and/or data associated with the system 100. In some aspects of the present disclosure, the first memory 114 may be configured to store a variety of inputs received from the user. Examples of the first memory 114 may include, but are not limited to, a Read-Only Memory (ROM), a Random-Access Memory (RAM), a flash memory, a removable storage drive, a hard disk drive (HDD), a solid-state memory, a magnetic storage drive, a Programmable Read Only Memory (PROM), an Erasable PROM (EPROM), and/or an Electrically EPROM (EEPROM). Aspects of the present disclosure are intended to include or otherwise cover any type of first memory 114 including known, related art, and/or later developed memories.
The survey console 116 may be configured as a computer-executable application, to be executed by the first processing unit 112. The survey console 116 may include suitable logic, instructions, and/or codes for executing various operations and may be controlled by the data processing apparatus 104. The one or more computer executable applications may be stored in the first memory 114. Examples of the one or more computer executable applications may include, but are not limited to, an audio application, a video application, a social media application, a navigation application, or the like. Aspects of the present disclosure are intended to include or otherwise cover any type of the computer executable application including known, related art, and/or later developed computer executable applications.
The first communication interface 118 may be configured to enable the user device 102 to communicate with the data processing apparatus 104 and the imaging unit 106 via the data processing apparatus 104. Examples of the first communication interface 118 may include, but are not limited to, a modem, a network interface such as an Ethernet card, a communication port, and/or a Personal Computer Memory Card International Association (PCMCIA) slot and card, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and a local buffer circuit. It will be apparent to a person of ordinary skill in the art that the first communication interface 118 may include any device and/or apparatus capable of providing wireless or wired communications between the user device 102, the data processing apparatus 104 and the imaging unit 106.
The data processing apparatus 104 may be a network of computers, a software framework, or a combination thereof, that may provide a generalized approach to create the server implementation. Examples of the data processing apparatus 104 may include, but are not limited to, personal computers, laptops, mini-computers, mainframe computers, any non-transient and tangible machine that can execute a machine-readable code, cloud-based servers, distributed server networks, or a network of computer systems. The data processing apparatus 104 may be realized through various web-based technologies such as, but not limited to, a Java web-framework, a .NET framework, a personal home page (PHP) framework, or any web-application framework. The data processing apparatus 104 may include processing circuitry 120 and one or more memory units (hereinafter, collectively referred to and designated as “Database 122”).
The processing circuitry 120 may include suitable logic, instructions, circuitry, interfaces, and/or codes for executing various operations of the system 100. The processing circuitry 120 may be configured to host and enable the survey console 116 running on (or installed on) the user device 102 to execute the operations associated with the system 100 by communicating one or more commands and/or instructions over the communication network 108. The processing circuitry 120 may be configured to generate the survey pattern 406 and/or the survey map 400 based on the polygon 302 selected through the one or more inputs of the user.
The database 122 may be configured to store the logic, instructions, circuitry, interfaces, and/or codes of the processing circuitry 120 for executing a number of operations. The database 122 may be further configured to store therein, data associated with users registered with the system 100. Some aspects of the present disclosure are intended to include and/or otherwise cover any type of the data associated with the users registered with the system 100. Examples of the database 122 may include, but are not limited to, a ROM, a RAM, a flash memory, a removable storage drive, a HDD, a solid-state memory, a magnetic storage drive, a PROM, an EPROM, and/or an EEPROM. In some aspects of the present disclosure, the database 122 may be configured to store one or more of user data, instructions data, survey maps, one or more configuration parameters of the imaging unit 106, and the like corresponding to the system 100.
The imaging unit 106 may be configured to move along the survey map 400 generated by the processing circuitry 120. The imaging unit 106 may further be configured to capture a plurality of images while following the survey pattern 406 of the survey map 400.
In some aspects of the present disclosure, the imaging unit 106 may include an aviation unit 124, a power supply 126, an imaging unit 128, a second processing unit 130, a second memory 132, and a second communication interface 134. In some aspects of the present disclosure, various components of the imaging unit 106 may be coupled to each other by way of one or more wired or wireless communication mediums (not shown).
In an exemplary aspect of the present disclosure, the aviation unit 124 may include one or more propellers (not shown), one or more motors (not shown) coupled to the one or more propellers, and an aviation control unit (not shown) coupled to the one or more motors and configured to control a rotational speed of each motor of the one or more motors. The power supply 126 may be coupled to various components of the imaging unit 106 (i.e., the aviation unit 124, the imaging unit 128, the second processing unit 130, the second memory 132, and the second communication interface 134), and may be configured to provide electrical energy to the various components of the imaging unit 106. The imaging unit 128 may include one or more camera sensors (not shown) configured to capture the plurality of images of the survey map 400 along the survey pattern 406. Examples of the one or more camera sensors of the imaging unit 128 may include, but are not limited to, a stationary camera, a Pan-Tilt-Zoom (PTZ) camera, a depth sensing camera pair, and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of camera sensor including known, related art, and/or later developed camera sensors.
The second processing unit 130 may include suitable logic, instructions, circuitry, interfaces, and/or codes for executing various operations, such as the operations associated with the imaging unit 106, or the like. In some aspects of the present disclosure, the second processing unit 130 may utilize one or more processors such as an Arduino, a Raspberry Pi, or the like. Further, the second processing unit 130 may be configured to control one or more operations executed by the imaging unit 106 in response to the input received at the second communication interface 134 from the data processing apparatus 104. In some aspects of the present disclosure, the second processing unit 130 may be configured to combine the plurality of images captured by the one or more camera sensors to generate a cumulative plan of the survey map(s). Examples of the second processing unit 130 may include, but are not limited to, an application-specific integrated circuit (ASIC) processor, a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, a field-programmable gate array (FPGA), a Programmable Logic Control unit (PLC), and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of second processing unit 130 including known, related art, and/or later developed processing units.
The second memory 132 may be configured to store the logic, instructions, circuitry, interfaces, and/or codes of the second processing unit 130, data associated with the imaging unit 106, and/or data associated with the system 100. In some aspects of the present disclosure, the second memory 132 may be configured to store a variety of inputs received from the data processing apparatus 104. In some aspects of the present disclosure, the second memory 132 may be configured to temporarily store the plurality of images of the survey map 400 along the survey pattern 406. Examples of the second memory 132 may include, but are not limited to, a Read-Only Memory (ROM), a Random-Access Memory (RAM), a flash memory, a removable storage drive, a hard disk drive (HDD), a solid-state memory, a magnetic storage drive, a Programmable Read Only Memory (PROM), an Erasable PROM (EPROM), and/or an Electrically EPROM (EEPROM). Aspects of the present disclosure are intended to include or otherwise cover any type of second memory 132 including known, related art, and/or later developed memories.
The second communication interface 134 may be configured to enable the imaging unit 106 to communicate with the data processing apparatus 104 and the user device 102 via the data processing apparatus 104. Examples of the second communication interface 134 may include, but are not limited to, a modem, a network interface such as an Ethernet card, a communication port, and/or a Personal Computer Memory Card International Association (PCMCIA) slot and card, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and a local buffer circuit. It will be apparent to a person of ordinary skill in the art that the second communication interface 134 may include any device and/or apparatus capable of providing wireless or wired communications between the data processing apparatus 104 and the imaging unit 106.
The communication network 108 may include suitable logic, circuitry, and interfaces that may be configured to provide a number of network ports and a number of communication channels for transmission and reception of data related to operations of various entities (such as the user device 102, the data processing apparatus 104, and the imaging unit 106) of the system 100. Each network port may correspond to a virtual address (or a physical machine address) for transmission and reception of the communication data. For example, the virtual address may be an Internet Protocol Version 4 (IPv4) address (or an IPv6 address) and the physical address may be a Media Access Control (MAC) address. The communication network 108 may be associated with an application layer for implementation of communication protocols based on one or more communication requests from the user device 102, the data processing apparatus 104, and the imaging unit 106. The communication data may be transmitted or received, via the communication protocols. Examples of the communication protocols may include, but are not limited to, Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Mail Transfer Protocol (SMTP), Domain Name System (DNS) protocol, Common Management Interface Protocol (CMIP), Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Long Term Evolution (LTE) communication protocols, or any combination thereof.
In some aspects of the present disclosure, the communication data may be transmitted or received via at least one communication channel of a number of communication channels in the communication network 108. The communication channels may include, but are not limited to, a wireless channel, a wired channel, or a combination thereof. The wireless or wired channel may be associated with a data standard which may be defined by one of a Local Area Network (LAN), a Personal Area Network (PAN), a Wireless Local Area Network (WLAN), a Wireless Sensor Network (WSN), a Wide Area Network (WAN), a Wireless Wide Area Network (WWAN), a metropolitan area network (MAN), a satellite network, the Internet, an optical fiber network, a coaxial cable network, an infrared (IR) network, a radio frequency (RF) network, and a combination thereof. Aspects of the present disclosure are intended to include or otherwise cover any type of communication channel, including known, related art, and/or later developed technologies.
In operation, the system 100, by way of the processing circuitry 120, may be configured to receive the set of inputs from the user device 102 for selection of the polygon 302. The system 100, by way of the processing circuitry 120, may further be configured to determine a center point (shown later in FIG. 3 as 304) of the polygon 302. Furthermore, the system 100, by way of the processing circuitry 120, may be configured to determine a circle (shown later in FIG. 3 as 310) with a radius (shown later in FIG. 3 as ‘r’) from the center point 304. Furthermore, the system 100, by way of the processing circuitry 120, may be configured to generate a rectangular bounding box (shown later in FIG. 3 as 312) encompassing the circle 310. Furthermore, the system 100, by way of the processing circuitry 120, may be configured to segregate the rectangular bounding box 312 into a plurality of secondary boxes (shown later in FIG. 4 as 402a-402n, and cumulatively referred to as the plurality of secondary boxes 402) based on a footprint area specific to the one or more camera sensors of the imaging unit 106, which may be determined by the processing circuitry 120 based on the one or more parameters of the imaging unit 106 selected by the user. Furthermore, the system 100, by way of the processing circuitry 120, may be configured to determine a set of secondary center points (shown later in FIG. 4 as 404a-404n, and cumulatively referred to as the set of secondary center points 404) such that each center point of the set of secondary center points 404 corresponds to a box of the plurality of secondary boxes 402. Furthermore, the system 100, by way of the processing circuitry 120, may be configured to determine one or more secondary center points of the set of secondary center points 404 inside the polygon 302. Furthermore, the system 100, by way of the processing circuitry 120, may be configured to generate the survey pattern 406 by joining adjacent center points of the one or more secondary center points based on a pre-defined heading map.
In some aspects of the present disclosure, upon generation of the survey pattern 406, the system 100, by way of the imaging unit 106 may be configured to capture the one or more images along the survey pattern 406.
In some aspects of the present disclosure, upon capturing of the one or more images along the survey pattern, the system 100, by way of the processing circuitry 120 may be configured to receive the one or more images captured along the survey pattern 406 from the imaging unit 106. The system 100, by way of the processing circuitry 120, may further be configured to determine a relative orientation of the one or more images. Furthermore, the system 100, by way of the processing circuitry 120 may be configured to generate the survey map 400 by combining the one or more images based on the relative orientation of the one or more images.
FIG. 2 is a block diagram that illustrates the data processing apparatus 104 of FIG. 1, in accordance with an exemplary aspect of the present disclosure. The data processing apparatus 104 may include the processing circuitry 120 and the database 122. The data processing apparatus 104 may further include a network interface 200 and an input/output (I/O) interface 202. The processing circuitry 120, the database 122, the network interface 200, and the input/output (I/O) interface 202 may be configured to communicate with each other by way of a first communication bus 203.
In an exemplary aspect of the present disclosure, the processing circuitry 120 may include a data exchange engine 204, a registration engine 206, an authentication engine 208, a data processing engine 210, a footprint engine 212, a survey pattern engine 214, a map generation engine 216, and a notification engine 218 communicatively coupled to each other by way of a second communication bus 220. It will be apparent to a person having ordinary skill in the art that the data processing apparatus 104 is for illustrative purposes and not limited to any specific combination of hardware circuitry and/or software.
The data exchange engine 204 may be configured to enable transfer of data from the database 122 to various engines of the processing circuitry 120. The data exchange engine 204 may further be configured to enable transfer of data and/or instructions from the user device 102 and/or the imaging unit 106 to the data processing apparatus 104.
The registration engine 206 may be configured to enable the user to register into the system 100 by providing registration data through a registration menu (not shown) of the survey console 116 that may be displayed by way of the user device 102.
The authentication engine 208, by way of the data exchange engine 204, may be configured to fetch the registration data of the user and authenticate the registration data of the user. The authentication engine 208, upon successful authentication of the registration data of the user, may be configured to enable the user to log in or sign up to the system 100. In some aspects of the present disclosure, the authentication engine 208 may enable the user to set the password protection for logging-in to the system 100. In such a scenario, the authentication engine 208 may be configured to verify a password entered by the user for logging-in to the system 100 by comparing the password entered by the user with the stored password. In some aspects, when the password entered by the user is verified by the authentication engine 208, the authentication engine 208 may enable the user to log in to the system 100. In some other aspects of the present disclosure, when the password entered by the user is not verified by the authentication engine 208, the authentication engine 208 may generate a signal for the notification engine 218 to generate a login failure notification for the user.
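As a non-limiting illustration, the following Python sketch shows one way the password verification described above may be realized. Storing a salted hash rather than the plain password is an assumption of this example; the disclosure only states that the entered password is compared with the stored one.

# Minimal sketch of the password check performed by the authentication engine.
# Salted PBKDF2 hashing is an illustrative assumption, not prescribed by the disclosure.
import hashlib
import hmac
import os

def set_password(password: str) -> tuple[bytes, bytes]:
    """Store a salt and a salted hash instead of the plain password (hypothetical choice)."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored_digest: bytes) -> bool:
    """Return True when the entered password matches the stored record."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = set_password("survey-console-login")
assert verify_password("survey-console-login", salt, digest)
assert not verify_password("wrong-password", salt, digest)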
The data processing engine 210 may be configured to receive the set of inputs from the user device 102 for selection of the polygon 302. The data processing engine 210 may further be configured to determine the center point 304 of the polygon 302. In some aspects of the present disclosure, to determine the center point of the polygon 302, the data processing engine 210 may be configured to identify a first set of edge points (shown later in FIG. 3 as 306a-306d, and cumulatively referred to as the first set of edge points 306) of the polygon 302. The data processing engine 210 may further be configured to determine a plurality of opposite point pairs of the first set of edge points 306. Furthermore, the data processing engine 210 may be configured to determine a first set of lines (shown later in FIG. 3 as 308a and 308b, and cumulatively referred to as the first set of lines 308) such that each line of the first set of lines 308 connects a point pair of the plurality of opposite point pairs. Furthermore, the data processing engine 210 may be configured to identify an intersection of the first set of lines 308.
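As a non-limiting illustration, the following Python sketch shows the center-point step for a four-cornered polygon under one reading of the above description: the opposite point pairs are joined by two diagonal lines 308, and the center point 304 is taken as their intersection. The coordinates used here are illustrative only.

# Minimal sketch of the center-point determination (one reading of the disclosure).
from typing import Tuple

Point = Tuple[float, float]

def line_intersection(p1: Point, p2: Point, p3: Point, p4: Point) -> Point:
    """Intersect line p1-p2 with line p3-p4 (assumes the lines are not parallel)."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-12:
        raise ValueError("Lines are parallel; no unique intersection.")
    det1 = x1 * y2 - y1 * x2
    det2 = x3 * y4 - y3 * x4
    x = (det1 * (x3 - x4) - (x1 - x2) * det2) / denom
    y = (det1 * (y3 - y4) - (y1 - y2) * det2) / denom
    return (x, y)

# First set of edge points 306a-306d (illustrative quadrilateral)
edge_points = [(0.0, 0.0), (10.0, 1.0), (11.0, 8.0), (1.0, 9.0)]
# Opposite point pairs -> first set of lines 308a and 308b (the diagonals)
center_304 = line_intersection(edge_points[0], edge_points[2],
                               edge_points[1], edge_points[3])
print("center point 304:", center_304)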
The data processing engine 210 may further be configured to determine the circle 310 with the radius ‘r’ from the center point 304. In some aspects of the present disclosure, to determine the radius ‘r’ of the circle 310 from the center point 304, the data processing engine 210 may be configured to determine a first set of distances between the first set of lines 308. The data processing engine 210 may further be configured to determine a maximum distance value of the first set of distances between the first set of lines 308. Furthermore, the data processing engine 210 may be configured to divide the maximum distance value by two.
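As a non-limiting illustration, the following sketch reads the "first set of distances" as the lengths of the lines 308 and takes the radius ‘r’ as half of the longest of them; this interpretation is an assumption of the example, and the coordinates are the same illustrative ones used above.

# Minimal sketch of the radius step, assuming the distances are the diagonal lengths.
import math

def radius_from_diagonals(edge_points):
    d1 = math.dist(edge_points[0], edge_points[2])  # length of line 308a
    d2 = math.dist(edge_points[1], edge_points[3])  # length of line 308b
    return max(d1, d2) / 2.0                         # maximum distance divided by two

r = radius_from_diagonals([(0.0, 0.0), (10.0, 1.0), (11.0, 8.0), (1.0, 9.0)])
print("radius r:", r)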
Furthermore, the data processing engine 210 may be configured to generate the rectangular bounding box 312 encompassing the circle 310. In some aspects of the present disclosure, to generate the rectangular bounding box 312, the data processing engine 210 may be configured to determine a second set of lines (shown later in FIG. 3 as 314a and 314b, and cumulatively referred to as the second set of lines 314) from the circle 310. The data processing engine 210 may further be configured to determine a set of mid points (shown later in FIG. 3 as 320a and 320b, and cumulatively referred to as the set of mid points 320) on the second set of lines 314. Furthermore, the data processing engine 210 may be configured to determine a third set of lines (shown later in FIG. 3 as 322a and 322b, and cumulatively referred to as the third set of lines 322) by joining the set of mid points 320 with the center 304. Furthermore, the data processing engine 210 may be configured to determine a second set of edge points (shown later in FIG. 3 as 324a-324d, and cumulatively referred to as the second set of edge points 324) at a distance ‘d’ from the center 304 on the third set of lines 322. Preferably, the distance ‘d’ may be equal to √2 times the radius ‘r’ of the circle 310. In some other aspects of the present disclosure, to determine the second set of lines 314, the data processing engine 210 may be configured to determine first and second secants (316a and 316b, cumulatively designated as 316) on horizontal and vertical axes from the center point 304, respectively. The data processing engine 210 may further be configured to determine first and second pairs of the secant points 318 that correspond to the intersection of the first and second secants 316 with the circle 310, respectively.
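As a non-limiting illustration, the following sketch reproduces the geometric outcome of the above steps without re-tracing every secant construction: the second set of edge points 324a-324d are placed at a distance d = √2 × r from the center 304 along the diagonal directions of the third set of lines 322, which yields a square bounding box that just encloses the circle 310. The numeric values are illustrative, in the spirit of the earlier sketches.

# Minimal sketch of the bounding-box corners 324a-324d around the circle 310.
import math

def bounding_box_corners(center, r):
    cx, cy = center
    d = math.sqrt(2.0) * r                          # distance 'd' from the center 304
    corners = []
    for angle_deg in (45.0, 135.0, 225.0, 315.0):   # directions of the third set of lines 322
        a = math.radians(angle_deg)
        corners.append((cx + d * math.cos(a), cy + d * math.sin(a)))
    return corners

print(bounding_box_corners((6.1, 4.5), 6.8))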
The footprint engine 212 may be configured to receive the one or more parameters from the imaging unit 106. The footprint engine 212 may further be configured to determine a footprint area based on the one or more parameters of the imaging unit 106. Preferably, the one or more parameters may include x-coordinates (X-sensor) of the one or more cameras of the imaging unit 106, y-coordinates (Y-sensor) of the one or more cameras of the imaging unit 106, focal length of the one or more cameras of the imaging unit 106, altitude of the one or more cameras of the imaging unit 106, gimbal in x-direction (X-gimbal) of the one or more cameras of the imaging unit 106, and gimbal in y-direction (Y-gimbal) of the one or more cameras of the imaging unit 106. In some aspects of the present disclosure, to determine the footprint area, the footprint engine 212 may be configured to determine a field of view of each camera of the one or more cameras of the imaging unit 106. The footprint engine 212 may further be configured to determine a height of the footprint and a width of the footprint. Furthermore, the footprint engine 212 may be configured to determine the footprint area based on the height and width of the footprint.
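As a non-limiting illustration, the following sketch computes a footprint from the listed parameters under the assumption of a nadir-pointing camera (zero gimbal angles) and the standard pinhole relation; the disclosure names the parameters but does not prescribe this exact formula, and the sensor, lens, and altitude values are illustrative.

# Minimal sketch of the footprint computation (nadir camera, pinhole model assumed).
def footprint_area(sensor_w_mm, sensor_h_mm, focal_mm, altitude_m):
    """Ground footprint of a single image: width (m), height (m), and area (m^2)."""
    width_m = altitude_m * sensor_w_mm / focal_mm     # footprint width on the ground
    height_m = altitude_m * sensor_h_mm / focal_mm    # footprint height on the ground
    return width_m, height_m, width_m * height_m

# Illustrative values: 13.2 x 8.8 mm sensor, 8.8 mm lens, 100 m altitude
w, h, area = footprint_area(13.2, 8.8, 8.8, 100.0)
print(f"footprint: {w:.1f} m x {h:.1f} m = {area:.0f} m^2")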
The survey pattern engine 214 may be configured to segregate the rectangular bounding box 312 into the plurality of secondary boxes 402 based on the footprint area. The survey pattern engine 214 may further be configured to determine the set of secondary center points 404 such that each center point of the set of secondary center points 404 may correspond to a box of the plurality of secondary boxes 402. Furthermore, the survey pattern engine 214 may be configured to determine the one or more secondary center points of the set of secondary center points 404 inside the polygon 302. Furthermore, the survey pattern engine 214 may be configured to generate the survey pattern 406 by joining the adjacent center points of the one or more secondary center points based on the pre-defined heading map that may be stored in the database 122. In some aspects of the present disclosure, the survey pattern engine 214 may be configured to determine a heading and a distance between each point of the one or more secondary center points to determine the adjacent center points.
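As a non-limiting illustration, the following sketch segregates the bounding box into footprint-sized secondary boxes 402, keeps the secondary center points 404 that fall inside the polygon 302 using a ray-casting test, and joins the kept points in a row-by-row serpentine order. The serpentine ordering is one plausible reading of the pre-defined heading map and, like the numeric values, is an assumption of the example.

# Minimal sketch of the survey-pattern step performed by the survey pattern engine 214.
def point_in_polygon(pt, polygon):
    """Ray-casting test: True if pt lies inside the polygon."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def survey_pattern(box_min, box_max, footprint_w, footprint_h, polygon):
    """Grid the bounding box by footprint size and keep serpentine-ordered inside centers."""
    (xmin, ymin), (xmax, ymax) = box_min, box_max
    waypoints = []
    row = 0
    y = ymin + footprint_h / 2
    while y < ymax:
        x_values = []
        x = xmin + footprint_w / 2
        while x < xmax:
            x_values.append(x)
            x += footprint_w
        if row % 2 == 1:          # alternate the heading on every other row
            x_values.reverse()
        for x in x_values:
            if point_in_polygon((x, y), polygon):
                waypoints.append((x, y))
        y += footprint_h
        row += 1
    return waypoints

polygon_302 = [(0.0, 0.0), (10.0, 1.0), (11.0, 8.0), (1.0, 9.0)]
pattern_406 = survey_pattern((-0.7, -2.3), (12.9, 11.3), 1.5, 1.0, polygon_302)
print(len(pattern_406), "waypoints")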
The map generation engine 216 may be configured to receive the plurality of images from the imaging unit 106. The map generation engine 216 may further be configured to generate the survey map 400 based on the plurality of images received from the imaging unit 106. In some aspects of the present disclosure, the map generation engine 216 may be configured to receive the one or more images captured along the survey pattern 406 from the imaging unit 106. The map generation engine 216 may further be configured to determine the relative orientation of the one or more images. Furthermore, the map generation engine 216 may be configured to generate the survey map 400 by combining the one or more images based on the relative orientation of the one or more images.
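As a non-limiting illustration, the following sketch estimates the relative orientation between two consecutive images using OpenCV feature matching. The disclosure does not name a specific technique, so this choice, and the hypothetical file names in the usage comment, are assumptions of the example.

# Minimal sketch of the relative-orientation step (OpenCV feature matching assumed).
import math
import cv2
import numpy as np

def relative_orientation(img_a, img_b):
    """Estimate the in-plane rotation (degrees) of img_b relative to img_a."""
    orb = cv2.ORB_create(2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)[:200]
    src = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    affine, _ = cv2.estimateAffinePartial2D(src, dst)
    return math.degrees(math.atan2(affine[1, 0], affine[0, 0]))

# Usage (hypothetical file names):
# a = cv2.imread("frame_001.jpg", cv2.IMREAD_GRAYSCALE)
# b = cv2.imread("frame_002.jpg", cv2.IMREAD_GRAYSCALE)
# print(relative_orientation(a, b))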
The notification engine 218 may be configured to generate one or more notifications corresponding to the system 100 that may be presented to the user by way of the user device 102. It will be apparent to a person skilled in the art that the aspects of the present disclosure are intended to include or cover any type of notification generated by the system 100 and/or presented to the user by the system 100.
The database 122 may be configured to store data corresponding to the system 100. In some aspects of the present disclosure, the database 122 may be segregated into one or more repositories that may be configured to store a specific type of data. In an exemplary aspect of the present disclosure, the database 122 may include an instructions repository 222, a user data repository 224, an image repository 226, a survey pattern repository 228, and a survey map repository 230.
The instructions repository 222 may be configured to store instructions data corresponding to the data processing apparatus 104. The instructions data may include data and metadata of one or more instructions corresponding to the various entities of the data processing apparatus 104 such as the processing circuitry 120, the network interface 200, and/or the I/O interface 202. It will be apparent to a person skilled in the art that the aspects of the present disclosure are intended to include or cover any type of instructions data of the data processing apparatus 104, and thus must not be considered as a limitation of the present disclosure.
The user data repository 224 may be configured to store user data of the system 100. The user data may include data and metadata of the data of authenticated users that are registered on the system 100. In some aspects of the present disclosure, the user data repository 224 may further be configured to store partial data and/or partial metadata of the user data corresponding to users that fail to register and/or authenticate on the system 100. Furthermore, the user data repository 224 may be configured to store the set of inputs received from the user by way of the user device 102. It will be apparent to a person skilled in the art that the aspects of the present disclosure are intended to include or cover any type of user data and/or metadata of the user data of the system 100, and thus must not be considered as a limitation of the present disclosure.
The image repository 226 may be configured to store the plurality of images captured by the imaging unit 106. The image repository 226 may further be configured to store a combined (or fused) image that may be generated by combining the plurality of images by the processing circuitry 120. The survey pattern repository 228 may be configured to store data of the survey pattern 406 of the survey map 400 generated by the processing circuitry 120. The survey map repository 230 may be configured to store data of the survey map 400 generated by the processing circuitry 120.
FIG. 3 illustrates a schematic representation 300 of the rectangle 312 encompassing the circle 310 around the polygon 302 selected by a user for the polygon survey. The system 100, by way of the processing circuitry 120 may generate the rectangle 312 by one or more operations of the processing circuitry 120 as described in the detailed description of FIG. 1 and FIG. 2.
FIG. 4 illustrates a schematic representation of the survey map 400 for the polygon survey, in an exemplary aspect of the present disclosure. The survey map 400 may include the survey pattern 406. The system 100, by way of the processing circuitry 120, may generate the survey pattern 406 by one or more operations of the processing circuitry 120 as described in the detailed description of FIG. 1 and FIG. 2. The system 100, by way of the imaging unit 106, may capture the plurality of images along the survey pattern 406. Furthermore, the system 100, by way of the processing circuitry 120, may generate the survey map 400 by combining the plurality of images captured by the imaging unit 106.
FIG. 5 illustrates a flow chart of a method 500 for the polygon survey, in accordance with an exemplary aspect of the present disclosure.
At step 502, the system 100 may determine the center point 304 of the polygon 302 based on the received set of inputs from the user device 102 for selection of the polygon 302.
In some aspects of the present disclosure, to determine the center point of the polygon 302, the system 100 may identify the first set of edge points 306 of the polygon 302. The system 100 may further determine the plurality of opposite point pairs of the first set of edge points 306. Furthermore, the system 100 may determine the first set of lines 308 such that each line of the first set of lines 308 connects a point pair of the plurality of opposite point pairs. Furthermore, the system 100 may identify the intersection of the first set of lines 308.
At step 504, the system 100 may determine the circle 310 with the radius ‘r’ from the center point 304.
In some aspects of the present disclosure, to determine the radius ‘r’ of the circle 310 from the center point 304, the system 100 may determine the first set of distances between the first set of lines 308. The system 100 may further determine the maximum distance value of the first set of distances between the first set of lines 308. Furthermore, the system 100 may divide the maximum distance value by two.
At step 506, the system 100 may generate the rectangular bounding box 312 encompassing the circle 310.
In some aspects of the present disclosure, to generate the rectangular bounding box 312, the system 100 may determine the second set of lines 314 from the circle 310. The system 100 may further determine the set of mid points 320 on the second set of lines 314. Furthermore, the system 100 may determine the third set of lines 322 by joining the set of mid points 320 with the center 304. Furthermore, the system 100 may determine the second set of edge points 324 at the distance ‘d’ from the center 304 on the third set of lines 322.
In some other aspects of the present disclosure, to determine the second set of lines 314, the system 100 may determine the first and second secants 316 on the horizontal and vertical axes from the center point 304, respectively. The system 100 may determine the first and the second pairs of the secant points 318 that correspond to the intersection of the first and second secants 316 with the circle 310, respectively.
At step 508, the system 100 may receive the one or more parameters from the imaging unit 106, and may determine the footprint area based on the one or more parameters of the imaging unit 106.
In some aspects of the present disclosure, to determine the footprint area, the system 100 may determine the field of view of each camera of the one or more cameras of the imaging unit 106. The system 100 may further determine a height of the footprint and a width of the footprint. Furthermore, the system 100 may determine the footprint area based on the height and width of the footprint.
At step 510, the system 100 may segregate the rectangular bounding box 312 into the plurality of secondary boxes 402 based on the footprint area.
At step 512, the system 100 may determine the set of secondary center points 404 such that each center point of the set of secondary center points 404 may correspond to a box of the plurality of secondary boxes 402.
At step 514, the system 100 may determine the one or more secondary center points of the set of secondary center points 404 inside the polygon 302.
At step 516, the system 100 may generate the survey pattern 406 by joining the adjacent center points of the one or more secondary center points based on the pre-defined heading map that may be stored in the database 122.
In some aspects of the present disclosure, the system 100 may determine the heading and the distance between each point of the one or more secondary center points to determine the adjacent center points.
At step 518, the system 100 may capture, by way of the imaging unit 106, the plurality of images along the survey pattern 406, and receive the plurality of images from the imaging unit 106.
At step 520, the system 100 may generate the survey map 400 based on the plurality of images received from the imaging unit 106.
In some aspects of the present disclosure, the system 100 may receive the one or more images captured along the survey pattern 406 from the imaging unit 106. The system 100 may further determine the relative orientation of the one or more images. Furthermore, the system 100 may generate the survey map 400 by combining the one or more images based on the relative orientation of the one or more images.
As discussed earlier, there is a need for an automated system, an apparatus, and a method for accurate and precise determination of a polygon for survey that efficiently utilizes the available resources and minimizes the amount of irrelevant data captured by the system to cover the desired area. As the method 500 involves selection of a precise survey area based on the user's inputs, the system 100, by way of the data processing apparatus 104 and through the method 500, provides accurate and precise determination of the survey map 400 for the polygon survey that efficiently utilizes the available resources and minimizes the amount of irrelevant data captured by the system to cover the desired area.
The foregoing discussion of the present disclosure has been presented for purposes of illustration and description. It is not intended to limit the present disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the present disclosure are grouped together in one or more aspects or configurations for the purpose of streamlining the disclosure. The features of the aspects or configurations may be combined in alternate aspects or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the present disclosure requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate aspect of the present disclosure.
Moreover, though the description of the present disclosure has included description of one or more aspects or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the present disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.
As one skilled in the art will appreciate, the system 100 includes a number of functional blocks in the form of a number of units and/or engines. The functionality of each unit and/or engine goes beyond merely finding one or more computer algorithms to carry out one or more procedures and/or methods in the form of a predefined sequential manner, rather each engine explores adding up and/or obtaining one or more objectives contributing to an overall functionality of the system 100. Each unit and/or engine may not be limited to an algorithmic and/or coded form, rather may be implemented by way of one or more hardware elements operating together to achieve one or more objectives contributing to the overall functionality of the system 100. Further, as it will be readily apparent to those skilled in the art, all the steps, methods and/or procedures of the system 100 are generic and procedural in nature and are not specific and sequential.
Certain terms are used throughout the following description and claims to refer to particular features or components. As one skilled in the art will appreciate, different persons may refer to the same feature or component by different names. This document does not intend to distinguish between components or features that differ in name but not structure or function. While various aspects of the present disclosure have been illustrated and described, it will be clear that the present disclosure is not limited to these aspects only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the present disclosure, as described in the claims.
Claims:

1. A data processing apparatus (104) comprising:
processing circuitry (120) configured to (i) receive a set of inputs from a user device (102) for selection of a polygon (302), (ii) determine a center point (304) of the polygon (302), (iii) determine a circle (310) with a radius (r) from the center point (304), (iv) generate a rectangular bounding box (312) encompassing the circle (310), (v) segregate the rectangular bounding box (312) into a plurality of secondary boxes (402) based on a footprint area, (vi) determine a set of secondary center points (404), wherein each center point of the set of secondary center points (404) corresponds to a box of the plurality of secondary boxes (402), (vii) determine one or more secondary center points of the set of secondary center points (404) inside the polygon (302), and (viii) generate a survey pattern (406) by joining adjacent center points of the one or more secondary center points based on a pre-defined heading map.

2. The data processing apparatus (104) as claimed in claim 1, wherein, to determine the center point (304) of the polygon (302), the processing circuitry (120) is configured to (i) identify a first set of edge points (306) of the polygon (302), (ii) determine a plurality of opposite point pairs of the first set of edge points (306), (iii) determine a first set of lines (308), wherein each line of the first set of lines (308) connects a point pair of the plurality of opposite point pairs, and (iv) identify an intersection of the first set of lines (308).
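
A minimal sketch of the centre-point construction in claim 2, assuming the "first set of edge points" are the polygon vertices and that, for an even vertex count, vertex i is paired with vertex i + n/2. For an arbitrary polygon these connecting lines need not meet at a single point, so the sketch simply intersects the first two of them; centre_from_opposite_pairs and line_intersection are illustrative names.

# Editor's sketch of the claim 2 construction (illustrative only).
def line_intersection(p1, p2, p3, p4):
    """Intersection of the infinite lines p1-p2 and p3-p4 (None if parallel)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-12:
        return None
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def centre_from_opposite_pairs(vertices):
    n = len(vertices)
    assert n % 2 == 0, "pairing opposite vertices assumes an even vertex count"
    pairs = [(vertices[i], vertices[i + n // 2]) for i in range(n // 2)]
    # intersect the first two lines connecting opposite point pairs
    return line_intersection(*pairs[0], *pairs[1])

print(centre_from_opposite_pairs([(0, 0), (4, 0), (4, 2), (0, 2)]))  # (2.0, 1.0)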

3. The data processing apparatus (104) as claimed in claim 2, wherein, to determine the radius (r) of the circle (310) from the center point (304), the processing circuitry (120) is configured to (i) determine a first set of distances between the first set of lines (308), (ii) determine a maximum distance value of the first set of distances, and (iii) divide the maximum distance value by two.

4. The data processing apparatus (104) as claimed in claim 1, wherein, to generate the rectangular bounding box (312), the processing circuitry (120) is configured to (i) determine a second set of lines (314) from the circle (310), (ii) determine a set of mid points (320) on the second set of lines (314), (iii) determine a third set of lines (322) by joining the set of mid points (320) with the center point (304), and (iv) determine a second set of edge points (324) at a distance (d) from the center point (304) on the third set of lines (322).

5. The data processing apparatus (104) as claimed in claim 4, wherein, to determine the second set of lines (314), the processing circuitry (120) is configured to (i) determine first and second secants (316) on horizontal and vertical axes from the center point (304), respectively, and (ii) determine first and second pairs of secant points (318) that correspond to intersections of the first and second secants (316) with the circle (310), respectively.
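
A short sketch covering claims 3 to 5, under the reading that the radius is half of the largest measured span and that the secant/mid-point construction yields an axis-aligned square of side 2r around the centre point, whose corners lie at the distance d = r·√2; the function names are the editor's.

# Editor's sketch of claims 3-5 (illustrative only, planar coordinates).
from math import hypot, sqrt

def radius_from_spans(spans):
    """Claim 3: half of the maximum distance value."""
    return max(spans) / 2.0

def bounding_box(centre, r):
    """Claims 4-5: corners of the square that encloses the circle of radius r."""
    cx, cy = centre
    d = r * sqrt(2)                      # distance from centre to each corner
    corners = [(cx - r, cy - r), (cx + r, cy - r),
               (cx + r, cy + r), (cx - r, cy + r)]
    assert all(abs(hypot(x - cx, y - cy) - d) < 1e-9 for x, y in corners)
    return corners

print(bounding_box((10.0, 5.0), radius_from_spans([6.0, 8.0, 7.5])))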

6. The data processing apparatus (104) as claimed in claim 1, wherein the processing circuitry (120) is configured to determine the adjacent center points of the one or more secondary center points based on a heading and a distance between each point of the one or more secondary center points.
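
Claim 6 relies on a heading and a distance between waypoints; one conventional way to obtain both for latitude/longitude waypoints is the initial-bearing and haversine formulas sketched below, assuming a spherical Earth of radius 6371000 m (an assumption, not a value from the specification).

# Editor's sketch: heading and distance between two waypoints.
from math import radians, degrees, sin, cos, atan2, sqrt

EARTH_RADIUS_M = 6371000.0  # assumed spherical-Earth radius

def heading_and_distance(lat1, lon1, lat2, lon2):
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    # great-circle distance (haversine)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    distance = 2 * EARTH_RADIUS_M * atan2(sqrt(a), sqrt(1 - a))
    # initial bearing, degrees clockwise from true north
    y = sin(dlmb) * cos(phi2)
    x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dlmb)
    heading = (degrees(atan2(y, x)) + 360.0) % 360.0
    return heading, distance

print(heading_and_distance(19.95, 73.84, 19.96, 73.84))  # ~0 deg, ~1.1 km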

7. The data processing apparatus (104) as claimed in claim 1, wherein the processing circuitry (120) is configured to receive one or more parameters from an imaging unit (106), and determine the footprint area based on the one or more parameters of the imaging unit (106).
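
One plausible way to derive the footprint area of claim 7 from imaging-unit parameters is the pinhole-camera relation sketched below; the parameter names (sensor size, focal length, flying height) and the sample values are the editor's assumptions.

# Editor's sketch: ground footprint of one frame from camera parameters.
def ground_footprint(sensor_w_mm, sensor_h_mm, focal_mm, height_m):
    """Return (width_m, height_m, area_m2) of the ground area covered by one frame."""
    scale = height_m / focal_mm            # metres on the ground per mm on the sensor
    width_m = sensor_w_mm * scale
    height_m_ground = sensor_h_mm * scale
    return width_m, height_m_ground, width_m * height_m_ground

# e.g. a 13.2 x 8.8 mm sensor with an 8.8 mm lens flown at 100 m
print(ground_footprint(13.2, 8.8, 8.8, 100.0))   # (150.0, 100.0, 15000.0)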

8. The data processing apparatus (104) as claimed in claim 7, wherein, upon the generation of the survey pattern (406), the processing circuitry (120) is configured to (i) receive one or more images captured along the survey pattern (406) from the imaging unit (106), (ii) determine a relative orientation of the one or more images, and (iii) generate a survey map (400) by combining the one or more images based on the relative orientation of the one or more images.
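
Claim 8 combines overlapping images into a survey map based on their relative orientation. The sketch below uses OpenCV's high-level Stitcher (feature matching, pairwise orientation estimation, and warping) as a stand-in for the claimed procedure; the file names and the scan mode are illustrative assumptions, not the applicant's implementation.

# Editor's sketch: assembling a survey map from overlapping images with OpenCV.
import cv2

def build_survey_map(image_paths, out_path="survey_map.jpg"):
    images = [cv2.imread(p) for p in image_paths]
    if any(img is None for img in images):
        raise FileNotFoundError("one or more survey images could not be read")
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)  # planar/scan mode suits nadir imagery
    status, survey_map = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    cv2.imwrite(out_path, survey_map)
    return survey_map

# build_survey_map(["wp_001.jpg", "wp_002.jpg", "wp_003.jpg"])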

9. A system (100) comprising:
a user device (102) configured to enable a user to provide a set of inputs for selection of a polygon (302); and
a data processing apparatus (104) comprising:
processing circuitry (120) configured to (i) receive the set of inputs from the user device (102) for the selection of the polygon (302), (ii) determine a center point (304) of the polygon (302), (iii) determine a circle (310) with a radius (r) from the center point (304), (iv) generate a rectangular bounding box (312) encompassing the circle (310), (v) segregate the rectangular bounding box (312) into a plurality of secondary boxes (402) based on a footprint area, (vi) determine a set of secondary center points (404), wherein each center point of the set of secondary center points (404) corresponds to a box of the plurality of secondary boxes (402), (vii) determine one or more secondary center points of the set of secondary center points (404) inside the polygon (302), and (viii) generate a survey pattern (406) by joining adjacent center points of the one or more secondary center points based on a pre-defined heading map.

10. The system (100) as claimed in claim 9, wherein, to determine the center point (304) of the polygon (302), the processing circuitry (120) is configured to (i) identify a first set of edge points (306) of the polygon (302), (ii) determine a plurality of opposite point pairs of the first set of edge points (306), (iii) determine a first set of lines (308), wherein each line of the first set of lines (308) connects a point pair of the plurality of opposite point pairs, and (iv) identify an intersection of the first set of lines (308).

11. The system (100) as claimed in claim 10, wherein, to determine the radius (r) of the circle (310) from the center point (304), the processing circuitry (120) is configured to (i) determine a first set of distances between the first set of lines (308), (ii) determine a maximum distance value of the first set of distances, and (iii) divide the maximum distance value by two.

12. The system (100) as claimed in claim 9, wherein, to generate the rectangular bounding box (312), the processing circuitry (120) is configured to (i) determine a second set of lines (314) from the circle (310), (ii) determine a set of mid points (320) on the second set of lines (314), (iii) determine a third set of lines (322) by joining the set of mid points (320) with the center point (304), and (iv) determine a second set of edge points (324) at a distance (d) from the center point (304) on the third set of lines (322).

13. The system (100) as claimed in claim 12, wherein, to determine the second set of lines (314), the processing circuitry (120) is configured to (i) determine first and second secants (316) on horizontal and vertical axes from the center point (304), respectively, and (ii) determine first and second pairs of secant points (318) that correspond to intersections of the first and second secants (316) with the circle (310), respectively.

14. The system (100) as claimed in claim 9, wherein the processing circuitry (120) is configured to determine the adjacent center points of the one or more secondary center points based on a heading and a distance between each point of the one or more secondary center points.

15. The system (100) as claimed in claim 9, wherein the system (100) further comprises an imaging unit (106) configured to capture one or more images along the survey pattern (406).

16. The system (100) as claimed in claim 15, wherein the processing circuitry (120) is configured to receive one or more parameters from the imaging unit (106), and determine the footprint area based on the one or more parameters of the imaging unit (106).

17. The system (100) as claimed in claim 15, wherein, upon the generation of the survey pattern (406), the processing circuitry (120) is configured to (i) receive the one or more images captured along the survey pattern (406) from the imaging unit (106), (ii) determine a relative orientation of the one or more images, and (iii) generate a survey map (400) by combining the one or more images based on the relative orientation of the one or more images.

18. A method (500) comprising:
receiving, by way of processing circuitry (120), a set of inputs from a user device (102) for selection of a polygon (302);
determining, by way of the processing circuitry (120), a center point (304) of the polygon (302);
determining, by way of the processing circuitry (120), a circle (310) with a radius (r) from the center point (304);
generating, by way of the processing circuitry (120), a rectangular bounding box (312) encompassing the circle (310);
segregating, by way of the processing circuitry (120), the rectangular bounding box (312) into a plurality of secondary boxes (402) based on a footprint area;
determining, by way of the processing circuitry (120), a set of secondary center points (404), wherein each center point of the set of secondary center points (404) corresponds to a box of the plurality of secondary boxes (402);
determining, by way of the processing circuitry (120), one or more secondary center points of the set of secondary center points (404) inside the polygon (302); and
generating, by way of the processing circuitry (120), a survey pattern (406) by joining adjacent center points of the one or more secondary center points based on a pre-defined heading map.

19. The method (500) as claimed in claim 18, wherein, for determining the center point (304) of the polygon (302), the method (500) comprises (i) identifying, by way of the processing circuitry (120), a first set of edge points (306) of the polygon (302), (ii) determining, by way of the processing circuitry (120), a plurality of opposite point pairs of the first set of edge points (306), (iii) determining, by way of the processing circuitry (120), a first set of lines (308), wherein each line of the first set of lines (308) connects a point pair of the plurality of opposite point pairs, and (iv) identifying, by way of the processing circuitry (120), an intersection of the first set of lines (308).

20. The method (500) as claimed in claim 19, wherein, for determining the radius (r) of the circle (310) from the center point (304), the method (500) comprises (i) determining, by way of the processing circuitry (120), a first set of distances between the first set of lines (308), (ii) determining, by way of the processing circuitry (120), a maximum distance value of the first set of distances, and (iii) dividing, by way of the processing circuitry (120), the maximum distance value by two.

21. The method (500) as claimed in claim 18, wherein, for generating the rectangular bounding box (312), the method (500) comprises (i) determining, by way of the processing circuitry (120), a second set of lines (314) from the circle (310), (ii) determining, by way of the processing circuitry (120), a set of mid points (320) on the second set of lines (314), (iii) determining, by way of the processing circuitry (120), a third set of lines (322) by joining the set of mid points (320) with the center point (304), and (iv) determining, by way of the processing circuitry (120), a second set of edge points (324) at a distance (d) from the center point (304) on the third set of lines (322).

22. The method (500) as claimed in claim 21, wherein, for determining the second set of lines (314), the method (500) comprises (i) determining, by way of the processing circuitry (120), first and second secants (316) on horizontal and vertical axes from the center point (304), respectively, and (ii) determining, by way of the processing circuitry (120), first and second pairs of secant points (318) that correspond to intersections of the first and second secants (316) with the circle (310), respectively.

23. The method (500) as claimed in claim 18, wherein the method (500) further comprises determining, by way of the processing circuitry (120), the adjacent center points of the one or more secondary center points based on a heading and a distance between each point of the one or more secondary center points.

24. The method (500) as claimed in claim 18, wherein, prior to segregating the rectangular bounding box (312), the method (500) further comprises receiving, by way of the processing circuitry (120), one or more parameters from an imaging unit (106), and determining the footprint area based on the one or more parameters of the imaging unit (106).

25. The method (500) as claimed in claim 18, wherein, upon generating the survey pattern (406), the method (500) comprises (i) receiving, by way of the processing circuitry (120), one or more images captured along the survey pattern (406) from the imaging unit (106), (ii) determining, by way of the processing circuitry (120), a relative orientation of the one or more images, and (iii) generating, by way of the processing circuitry (120), a survey map (400) by combining the one or more images based on the relative orientation of the one or more images.

Documents

Application Documents

# Name Date
1 202321051580-STATEMENT OF UNDERTAKING (FORM 3) [01-08-2023(online)].pdf 2023-08-01
2 202321051580-FORM FOR STARTUP [01-08-2023(online)].pdf 2023-08-01
3 202321051580-FORM FOR SMALL ENTITY(FORM-28) [01-08-2023(online)].pdf 2023-08-01
4 202321051580-FORM 1 [01-08-2023(online)].pdf 2023-08-01
5 202321051580-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [01-08-2023(online)].pdf 2023-08-01
6 202321051580-EVIDENCE FOR REGISTRATION UNDER SSI [01-08-2023(online)].pdf 2023-08-01
7 202321051580-DRAWINGS [01-08-2023(online)].pdf 2023-08-01
8 202321051580-DECLARATION OF INVENTORSHIP (FORM 5) [01-08-2023(online)].pdf 2023-08-01
9 202321051580-COMPLETE SPECIFICATION [01-08-2023(online)].pdf 2023-08-01
10 202321051580-FORM-26 [01-11-2023(online)].pdf 2023-11-01
11 Abstract.1.jpg 2024-01-08
12 202321051580-Proof of Right [30-01-2024(online)].pdf 2024-01-30
13 202321051580-FORM 3 [01-02-2024(online)].pdf 2024-02-01
14 202321051580-FORM-9 [19-03-2024(online)].pdf 2024-03-19
15 202321051580-STARTUP [20-03-2024(online)].pdf 2024-03-20
16 202321051580-FORM28 [20-03-2024(online)].pdf 2024-03-20
17 202321051580-FORM 18A [20-03-2024(online)].pdf 2024-03-20
18 202321051580-FER.pdf 2024-05-07
19 202321051580-FORM 3 [27-05-2024(online)].pdf 2024-05-27
20 202321051580-FER_SER_REPLY [03-09-2024(online)].pdf 2024-09-03
21 202321051580-DRAWING [03-09-2024(online)].pdf 2024-09-03
22 202321051580-CLAIMS [03-09-2024(online)].pdf 2024-09-03
23 202321051580-US(14)-HearingNotice-(HearingDate-20-08-2025).pdf 2025-08-01
24 202321051580-FORM-26 [08-08-2025(online)].pdf 2025-08-08
25 202321051580-Correspondence to notify the Controller [08-08-2025(online)].pdf 2025-08-08
26 202321051580-Written submissions and relevant documents [04-09-2025(online)].pdf 2025-09-04
27 202321051580-PatentCertificate09-10-2025.pdf 2025-10-09
28 202321051580-IntimationOfGrant09-10-2025.pdf 2025-10-09

Search Strategy

1 202321051580E_30-04-2024.pdf

ERegister / Renewals