
System And Method For Analysis Of Transmission Towers Using Aerial Imagery

Abstract: The present disclosure relates to a system and method for faster and better analysis of transmission towers present in an electricity infrastructure using an unmanned aerial vehicle (UAV). In one embodiment, the present system enables the Unmanned Aerial Vehicle (UAV) to capture images of the transmission towers present in the electricity infrastructure and accordingly generate image data related to the transmission towers. The image data is further processed to reduce the manual effort required to analyze the large volume of image data. The system enables an automatic selection of specific frames from the image data, wherein the specific frames contain regions related to the transmission towers that are essential for an expert to make a final assessment of the damage caused to the transmission tower.


Patent Information

Application #
Filing Date
13 February 2015
Publication Number
36/2016
Publication Type
INA
Invention Field
COMMUNICATION
Status
Parent Application
Patent Number
Legal Status
Grant Date
2023-09-14
Renewal Date

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

Inventors

1. V, Adithya
Tata Consultancy Services Limited, Abhilash Building, Plot No. 96, EP-IP Industrial Area, Whitefield Road, Bangalore, Karnataka, India 560 066
2. SHARMA, Hrishikesh
Tata Consultancy Services Limited, Abhilash Building, Plot No. 96, EP-IP Industrial Area, Whitefield Road, Bangalore, Karnataka, India 560 066
3. PURUSHOTHAMAN, Balamuralidhar
Tata Consultancy Services Limited, Abhilash Building, Plot No. 96, EP-IP Industrial Area, Whitefield Road, Bangalore, Karnataka, India 560 066
4. DUTTA, Tanima
Tata Consultancy Services Limited, Abhilash Building, Plot No. 96, EP-IP Industrial Area, Whitefield Road, Bangalore, Karnataka, India 560 066

Specification

DESC:
FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION
(See Section 10 and Rule 13)

Title of invention:
SYSTEM AND METHOD FOR ANALYSIS OF TRANSMISSION TOWERS USING AERIAL IMAGERY

Applicant:
Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman point, Mumbai 400021,
Maharashtra, India

The following specification particularly describes the embodiments and the manner in which they are to be performed.

CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[0001] The present application claims priority to Indian Patent Application No. 474/MUM/2015, filed on February 13, 2015, the entirety of which is hereby incorporated by reference.
TECHNICAL FIELD
[0002] The embodiments herein generally relate to image processing, and, more particularly, to a system and method for analysis of transmission towers using aerial imagery.

BACKGROUND
[0003] Nowadays, electricity is supplied from one location to another through a network of transmission towers and transmission lines. These transmission towers and transmission lines need to be checked regularly for any kind of damage to avoid accidents. Most of the gravitational and atmospheric stress on the grid is concentrated on the transmission tower, also called a pylon. Over time, the transmission towers age due to multiple reasons, which generates unbalanced internal stress at various points of the transmission towers, especially the joints. For cost reasons, it is expected that, once installed, a transmission tower must serve for many decades with a minimal amount of maintenance cost, rather than re-installation. Hence, after the initial few years, it is critical to carry out inspection of the transmission towers within the vast power grid corridor. For the purpose of inspection, an aerial imagery method is typically used.
[0004] Traditional aerial imagery methods for monitoring the vastly spread electricity infrastructure include satellite image analysis and analysis of images captured through an Unmanned Aerial Vehicle (UAV). Another method for inspection of electricity infrastructure involves manual or helicopter inspection, which requires an expert to carry out damage assessment in the field. Satellite image analysis involves capturing images from a satellite, which allows remote assessment of transmission towers by an expert. However, the images captured by a satellite lack fine details and cannot be used to locate or identify damage to the transmission tower structure. Further, compared to manual or helicopter inspection, deployment of Unmanned Aerial Vehicles (UAVs) for rapid inspection of transmission towers is quicker and easier than hiring a helicopter or performing a manual inspection, especially during emergencies such as cyclones, earthquakes, and landslides. Typically, in this kind of video-based surveillance, the amount of video or image data acquired is huge. Thus, there is a need for a system and method for faster and more efficient analysis of the electricity infrastructure.

SUMMARY
[0005] The following presents a simplified summary of some embodiments of the disclosure in order to provide a basic understanding of the embodiments. This summary is not an extensive overview of the embodiments. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the embodiments. Its sole purpose is to present some embodiments in a simplified form as a prelude to the more detailed description that is presented below.
In view of the foregoing, an embodiment herein provides a system and method for analysis of transmission towers using aerial imagery.
[0006] In an aspect, a processor implemented method is provided. The method, comprising: obtaining, by one or more hardware processors, one or more aerial images pertaining to a transmission tower; performing an optimized mean shift segmentation on the one or more aerial images; identifying one or more candidate blocks present in the one or more aerial images based on the optimized mean shift segmentation being performed; performing boundary growing and region merging operation over the one or more candidate blocks in each of the one or more aerial images to obtain boundary sharing conditions and one or more clusters associated with the boundary sharing conditions; generating a super clustered image on the one or more clusters being merged based on the boundary sharing conditions; and performing context based pylon detection operation on the super clustered image to identify a super cluster corresponding to the transmission tower such that the identified super cluster is distinguishable from one or more false positive super clusters in the one or more aerial images.
[0007] In an embodiment, the step of identifying one or more candidate blocks present in the one or more aerial images is based on at least one of a gradient magnitude density and a cluster density. In another embodiment, one or more blocks having a value equal to, or greater than a threshold value in a two dimensional vector of gradient magnitude density and cluster density are identified as the one or more candidate blocks. In an embodiment, bottom edge of an aerial image from the one or more aerial images is indicative of proximity of the transmission tower from the aerial image to eliminate false positive detection of the transmission tower.
[0008] In another aspect, a system is provided. The system comprising: a memory storing instructions; one or more communication interfaces; one or more hardware processors coupled to the memory via the one or more communication interfaces, wherein the one or more hardware processors are configured by the instructions to execute: a filtration module that is configured to obtain one or more aerial images pertaining to a transmission tower, and perform an optimized mean shift segmentation on the one or more aerial images; a search space identification module that is configured to identify one or more candidate blocks present in the one or more aerial images based on the optimized mean shift segmentation being performed; a boundary analysis module that is configured to perform boundary growing and region merging operation over the one or more candidate blocks in each of the one or more aerial images to obtain boundary sharing conditions and one or more clusters associated with the boundary sharing conditions, and generate a super clustered image on the one or more clusters being merged based on the boundary sharing conditions; and a tower detection module that is configured to perform context based pylon detection operation on the super clustered image to identify a super cluster corresponding to the transmission tower such that the identified super cluster is distinguishable from one or more false positive super clusters in the one or more aerial images.
[0009] In an embodiment, the search space identification module identifies one or more candidate blocks present in the one or more aerial images based on at least one of a gradient magnitude density and a cluster density. In an embodiment, the search space identification module is configured to identify one or more blocks having a value equal to, or greater than a threshold value in a two dimensional vector of gradient magnitude density and cluster density as the one or more candidate blocks. In an embodiment, bottom edge of an aerial image from the one or more aerial images is indicative of proximity of the transmission tower from the aerial image to eliminate false positive detection of the transmission tower.
[0010] It should be appreciated by those skilled in the art that any block diagram herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computing device or processor, whether or not such computing device or processor is explicitly shown.

BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0012] Figure 1 illustrates a network implementation of a system for processing image data received from Unmanned Aerial Vehicle (UAV), in accordance with an embodiment of the present disclosure;
[0013] Figure 2 illustrates the system, in accordance with an embodiment of the present disclosure;
[0014] Figure 3 illustrates a method for processing the image data received from Unmanned Aerial Vehicle (UAV), in accordance with an embodiment of the present disclosure;
[0015] Figure 4A illustrates one or more aerial images pertaining to a transmission tower using UAV according to an embodiment of the present disclosure;
[0016] Figure 4B illustrates an output of Background Suppression upon an optimized mean shift segmentation being performed on the one or more aerial images of Figure 4A according to an embodiment of the present disclosure;
[0017] Figures 5A-5B illustrate a depiction of gradient magnitude density and cluster density respectively for identification of one or more candidate blocks in the one or more aerial images according to an embodiment of the present disclosure;
[0018] Figure 6A illustrates identifying the one or more candidate blocks in the one or more aerial images according to an embodiment of the present disclosure;
[0019] Figure 6B depicts an output of Region Merging and Boundary Growing according to an embodiment of the present disclosure;
[0020] Figure 7 depicts an Output of Foreground Object Detection based on context based pylon detection using a tower detection module of Figure 2 according to an embodiment of the present disclosure;
[0021] Figure 8 depicts a partial false positive that occurred when a neighboring region other than the actual pylon region is present in the final detected pylon region according to an example embodiment of the present disclosure;
[0022] Figure 9 depicts a partial false negative that occurred when a part of actual pylon region is not present within the boundary of the detected pylon region according to an example embodiment of the present disclosure; and
[0023] Figure 10 depicts ground truth images illustrating accuracy of pylon detection measurement according to an example embodiment of the present disclosure.

DETAILED DESCRIPTION OF EMBODIMENTS
[0024] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0025] This summary is provided to introduce concepts related to systems and methods for faster and better analysis of transmission towers of an electricity infrastructure using unmanned aerial vehicles (UAVs), and the concepts are further described below in the detailed description. The present disclosure relates to a system and method for faster and better analysis of transmission towers present in an electricity infrastructure using unmanned aerial vehicles (UAVs). In one embodiment, the present system enables an Unmanned Aerial Vehicle (UAV) to capture images of the transmission towers present in the electricity infrastructure and accordingly generate image data associated with each of the transmission towers. The image data is further processed to reduce the manual effort spent in analysing this huge volume of image data. For this purpose, the system enables an automatic selection of specific frames from the image data, wherein the specific frames contain regions related to the transmission towers that are essential for an expert to make a final assessment of the damage caused to the transmission towers.
[0026] Referring now to the drawings, and more particularly to Figures 1 through 10, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[0027] Figure 1 illustrates a network implementation of a system for processing image data received from an Unmanned Aerial Vehicle (UAV), in accordance with an embodiment of the present disclosure. Referring now to Figure 1, a network implementation of an image acquisition system 100 is shown, wherein the image acquisition system 100 enables a Ground Control Station (GCS), hereafter referred to as the system 102. The system 102 is configured for analysis of image data related to transmission towers present in an electricity infrastructure, as well as of the sensing platform using an Unmanned Aerial Vehicle (UAV) 108. Although the present disclosure is explained by considering that the video analysis function of the system 102 is implemented as a software application on a server, it may be understood that the video analysis function of the system 102 may also be remotely implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, cloud, hand-held device and the like. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2…104-N, collectively referred to as user devices 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a hand-held device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106. Further, the system 102 may be connected to an Unmanned Aerial Vehicle (UAV) through the network 106, wherein the UAV is configured to capture images and videos of the transmission towers present in the electrical infrastructure using a wireless network of cameras present on the UAV.
[0028] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0029] Referring now to Figure 2, the system 102 is illustrated in accordance with an embodiment of the present disclosure. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
[0030] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with a user directly or through the user devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0031] The memory 206 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and system data 230. The memory 206 may comprise one or more aerial images obtained from one or more sources (e.g., sensors, or external storage devices, and the like).
[0032] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a reception module 210, a displaying module 212, a filtration module 214, a search space identification module 216, a boundary analysis module 218, a tower detection module 220 and other modules 222. The other modules 222 may include programs or coded instructions that supplement applications and functions of the system 102.
[0033] The system data 230, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The system data 230 may also include a system database 232 and other data 234. The other data 234 may include data generated as a result of the execution of one or more modules in the other modules 222. In one implementation, the user may use the client device 104 to access the system 102 via the I/O interface 204.
[0034] In one embodiment, the system 102 is configured for detection of electricity transmission towers from aerial images or videos. The background of such transmission towers is varying and complex, with different artefacts such as trees, barren patches, unpaved roads, occasional hutments, sky beyond the horizon, etc. To process such aerial images for identification of a transmission tower, the system 102 enables four modules, namely the filtration module 214, the search space identification module 216, the boundary analysis module 218, and the tower detection module 220. In one embodiment, the filtration module 214 is configured to filter the aerial images (obtained from the memory 206) via optimized mean shift segmentation. The process of optimized mean shift segmentation over the aerial images results in an isotropic diffusion, for edge preserving and smoothing of the aerial images. Further, the search space identification module 216 is configured to identify a set of candidate blocks present in the aerial images. For this purpose, the aerial images are divided into equally sized blocks. Further, the search space identification module 216 selects a set of candidate blocks from each of the aerial images based on two features, namely gradient and cluster density.
[0035] In the next step, the boundary analysis module 218 performs the operation of boundary growing and region merging over the set of candidate blocks in each of the images. For this purpose, clusters within the set of candidate blocks are merged based on boundary sharing conditions to generate a super clustered image. Once the clusters are merged, the tower detection module 220 performs a context based pylon detection operation on the super clustered image to differentiate the super cluster corresponding to the transmission tower from the other false positive super clusters in the aerial image. The tower detection module 220 assumes the default UAV path planning so as to identify an aerial image with the transmission tower close to the bottom edge of the aerial image, to eliminate false positive detection of the transmission tower. The region corresponding to the final cluster, identified as the transmission tower, is extracted from the initial input image and presented to the expert using the I/O interface 204 for the purpose of inspection. The extracted object corresponding to the tower region from the final image occupies marginally less memory as compared to the original aerial image, thus solving storage issues, and provides a clear picture of the transmission tower for the purpose of inspection. The components of the system 102 are further explained with respect to the flowchart as illustrated in Figure 3.
[0036] Figure 3 represents a method 300 for processing aerial images captured from the UAV 108 in order to detect the transmission tower. At block 302, the filtration module 214 is configured to filter the aerial image via optimized mean shift segmentation. The process of optimized mean shift segmentation over the aerial images results in an isotropic diffusion, for edge preserving and smoothing of the aerial images.
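By way of a non-limiting illustration, a minimal Python sketch of this filtering step is given below. OpenCV's pyrMeanShiftFiltering is used here as a stand-in for the optimized mean shift segmentation described in the disclosure, and the spatial and color window radii, as well as the file names, are illustrative assumptions rather than values prescribed by the specification.

# Illustrative sketch only: cv2.pyrMeanShiftFiltering stands in for the optimized
# mean shift segmentation; the radii and file names are assumed values.
import cv2

def filter_aerial_image(image_path, spatial_radius=12, color_radius=24):
    # Load an aerial frame and apply edge-preserving mean shift filtering.
    frame = cv2.imread(image_path)              # BGR aerial image
    if frame is None:
        raise FileNotFoundError(image_path)
    frame = cv2.resize(frame, (1280, 960))      # working resolution reported in the experiments
    # Mean shift in the joint spatial-color domain smooths homogeneous background
    # regions while preserving the strong edges of the pylon truss.
    return cv2.pyrMeanShiftFiltering(frame, spatial_radius, color_radius)

if __name__ == "__main__":
    smoothed = filter_aerial_image("pylon_frame.jpg")   # hypothetical file name
    cv2.imwrite("pylon_frame_smoothed.jpg", smoothed)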
[0037] At block 304, the search space identification module 216 is configured to identify a set of candidate blocks present in the aerial images. For this purpose, the aerial images are divided into equally sized blocks based on the size of the aerial image, the size of the transmission tower, and the imaging distance. Further, an image gradient is calculated at every pixel of the aerial image, and pixel clustering is done on the image based on colour similarity and boundary sharing conditions. Further, the search space identification module 216 selects a set of candidate blocks from each of the aerial images based on two features, namely gradient (mean gradient value in the block) and cluster density (number of clusters in the block).
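A minimal sketch of how the per-block (granule) gradient feature could be computed is given below, using a Sobel operator and the 128x96 granule grid over a 1280x960 frame described later in the specification. The block tiling itself is an illustrative assumption; the cluster density feature is not shown, since it depends on the cluster labels produced by the segmentation stage.

# Sketch of per-granule gradient magnitude density, assuming a 1280x960 frame
# tiled into 128x96 granules; not the exact implementation of the disclosure.
import cv2
import numpy as np

GRANULE_W, GRANULE_H = 128, 96

def gradient_density_per_granule(smoothed_bgr):
    gray = cv2.cvtColor(smoothed_bgr, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)
    rows, cols = gray.shape[0] // GRANULE_H, gray.shape[1] // GRANULE_W
    density = np.zeros((rows, cols), dtype=np.float32)
    # Average gradient magnitude inside each granule (block) of the grid.
    for r in range(rows):
        for c in range(cols):
            block = mag[r * GRANULE_H:(r + 1) * GRANULE_H,
                        c * GRANULE_W:(c + 1) * GRANULE_W]
            density[r, c] = float(block.mean())
    return density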
[0038] At block 306, the boundary analysis module 218 performs the operation of boundary growing and region merging over the set of candidate blocks in each of the images. For this purpose, clusters within the set of candidate blocks are merged based on boundary sharing conditions to generate a super clustered image.
[0039] At block 308, the tower detection module 220 performs a context based pylon detection operation on the super clustered image to differentiate the super cluster corresponding to the transmission tower from the other super clusters present in the aerial image. The tower detection module 220 assumes the default UAV path planning so as to identify an aerial image with the transmission tower close to the bottom edge of the aerial image, to eliminate false positive detection of the transmission tower. The region corresponding to the final cluster, identified as the transmission tower, is extracted from the initial input image and presented to the expert using the I/O interface 204 for the purpose of inspection.
[0040] In one embodiment, some of the other advantages provided by the system 102 include faster processing of aerial images by identifying key image blocks from the aerial image to grow the desired transmission tower region. For this purpose, the system 102 identifies candidate clusters along the transmission tower within key blocks, which are then extended along the boundary on either side, using total covered area and boundary considerations. The system 102 further enables determining cluster density, wherein the cluster density is used to identify porous objects, especially porous objects having the cage-like structure of the transmission tower. Furthermore, in an alternative embodiment, the system 102 can also define a shape density in order to detect shapes, such as irregular polygons, that are present in the transmission towers.
[0041] Figure 4A, with reference to Figures 1 through 3, illustrates one or more aerial images pertaining to a transmission tower captured using the UAV according to an embodiment of the present disclosure. Figure 4B, with reference to Figures 1 through 4A, illustrates an output of Background Suppression upon an optimized mean shift segmentation being performed on the one or more aerial images of Figure 4A according to an embodiment of the present disclosure. Due to complex outdoor surroundings, various edge detection algorithms give a number of edges in the background, along with those in the foreground. At the same time, due to the presence of numerous beams in its truss, a striking feature of a pylon is the presence of multiple linear features in its corresponding image. Distinguishing the subset of linear features of the pylon from those of the background is a complex task. Therefore, to reduce background clutter and simultaneously accentuate the foreground, in the embodiments of the present disclosure the one or more aerial images are filtered. The mean shift based image segmentation is a straightforward extension of the discontinuity preserving smoothing technique. The optimized mean shift segmentation performed has the following features: (a) mean shift in the spatial and color dimensions, (b) anisotropic diffusion for edge preserving and smoothing, and (c) joint bilateral filtering of both the intensity and position of each pixel, by replacing each pixel with the weighted average of its neighbors.
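For readability, feature (c) corresponds to the standard weighted-average form of bilateral filtering shown below. This textbook formulation is supplied as an aid and is not an equation reproduced from the original specification.

% Standard bilateral filter: each pixel p is replaced by a weighted average of its
% neighbours q, with weights depending on both spatial and intensity distance.
\[
I'(p) = \frac{1}{W_p} \sum_{q \in \mathcal{N}(p)}
        G_{\sigma_s}\!\left(\lVert p - q \rVert\right)\,
        G_{\sigma_r}\!\left(\lvert I(p) - I(q) \rvert\right)\, I(q),
\qquad
W_p = \sum_{q \in \mathcal{N}(p)}
        G_{\sigma_s}\!\left(\lVert p - q \rVert\right)\,
        G_{\sigma_r}\!\left(\lvert I(p) - I(q) \rvert\right)
\]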
[0042] Figures 5A-5B, with reference to Figures 1 through 4B, illustrate a depiction of gradient magnitude density and cluster density, respectively, for identification of one or more candidate blocks in the one or more aerial images according to an embodiment of the present disclosure. The search space identification module 216 generates a search space of key granules (identified candidate blocks). To narrow down on the probable pylon region, 128*96 sized granules are considered, organized as a grid within 1280*960 pixel-sized images. The granule size is configurable, and may be decided based on an estimation of the height of the pylon projection. This in turn depends on the distance from which the pylon is imaged, the UAV flight orientation, as well as the pylon size. Granules that are too small give many false candidate regions, which further increases computation time, at the very least. Granules that are too big may not contain enough chunks of the pylon projection to arrive at a meaningful conclusion.
[0043] To identify one or more candidate blocks (or granules from the granule grid), two specific, granule-level distinguishing features are considered. Intuitively, it is expected that the granules overlapping with the pylon projection exhibit both high gradient density and high cluster density when compared to the background in the smoothened image. The former is an effect of the dominant amount of pylon beams as edges, while the latter is an effect of the cage-like structure of the pylon. The perspective view of the pylon from the down-looking camera during sideways tracking results in lines of varying thickness across the projection. In addition, since aerial cameras are wide-lens cameras, the images have a fish-eye effect. Due to the non-parallel and jittery trajectory of the UAV with respect to the pylon, the pylon can have any orientation in the image plane. Also, the imaging distance can limit the height of the pylon that is captured within the camera's field of view. Irrespective of these variabilities, the cage-like structure is always present in the pylon projection. Consequently, the high linearity and cluster density features remain consistent in all kinds of pylon projections in all imaging scenarios. The embodiments of the present disclosure enable the search space identification module 216 to estimate the gradient using a Sobel function on the filtered image, while clustering is done by imposing similar color and boundary-sharing conditions on the filtered image. The gradient density GDi of a granule i is then calculated as the average gradient magnitude within that granule, as shown in Figure 5A. Similarly, the cluster density CDi of a granule i is the number of clusters in that granule, subject to the condition that they have more than x% (e.g., 80%) of their area within that granule, as shown in Figure 5B.
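In symbols, one way to write the two granule-level features just described is the following. This formulation is supplied for readability as an interpretation of the text, not an equation reproduced from the original specification; B_i denotes the pixel set of granule i, I the filtered image, and C_k ranges over the clusters of the segmented image.

\[
GD_i = \frac{1}{|B_i|} \sum_{p \in B_i} \lVert \nabla I(p) \rVert,
\qquad
CD_i = \left| \left\{ C_k \;:\; \frac{\operatorname{area}(C_k \cap B_i)}{\operatorname{area}(C_k)} \geq \frac{x}{100} \right\} \right|
\]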
[0044] The one or more candidate blocks (or granules) are identified (or shortlisted) by arranging the density values in descending order and omitting the tail of the sorted distribution. More specifically, this is done by dropping out (or disregarding) granules (blocks) that have densities less than 30% of the peak density values in both distributions. In other words, one or more blocks having a value equal to or greater than a threshold value (for example, 70% of the maximum gradient density or cluster density, across all image blocks) in a two dimensional vector of gradient magnitude density and cluster density are identified (e.g., using the search space identification module 216) as the one or more candidate blocks. The disregarding of blocks and identification of the one or more candidate blocks is illustrated by way of the following expression:
(1)
The union of candidate blocks, after disregarding the remaining blocks, comprises the foreground search space.
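Since expression (1) is not reproduced above, the following is one plausible, hedged reading of the selection rule it denotes, with tau as the configurable threshold fraction (e.g., 0.3, per the 30% tail-discarding figure). The text leaves open whether the two conditions are combined conjunctively or disjunctively; the conjunctive form is shown.

\[
KG = \left\{ i \;:\; GD_i \geq \tau \cdot \max_j GD_j \;\;\text{and}\;\; CD_i \geq \tau \cdot \max_j CD_j \right\}
\]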
[0045] Figure 6A, with reference to Figures 1 through 5B, illustrates identifying the one or more candidate blocks in the one or more aerial images according to an embodiment of the present disclosure. The one or more candidate blocks selected for a background-suppressed frame are depicted in Figure 6A. After the previous stage, blocks that contain the pylon and similar object regions in the image are obtained. To get a snug-fit pylon region, the contiguous clusters in these connected granules (or blocks) are to be merged. Cluster merging simultaneously entails extracting the prominent pylon structure in the foreground. A boundary growing and region merging operation is performed over the one or more candidate blocks in each of the one or more aerial images to obtain boundary sharing conditions and one or more clusters associated with the boundary sharing conditions. Figure 6B, with reference to Figures 1 through 6A, depicts an output of Region Merging and Boundary Growing according to an embodiment of the present disclosure.
[0046] For merging, first a set G of granule sets, each comprising two or more connected granules (candidate blocks), is defined. Next, for each granule set Gi, a set of member clusters Cij is identified. Member clusters are those having 'y%' (e.g., 80% or more) of their area within the granules of the set Gi. A seed cluster Cis, which is the largest cluster in the set of member clusters Cij, is also identified, and is illustrated by way of example in the below expression:
(2)
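Expression (2) is not reproduced above; a plausible formulation of the seed-cluster choice it describes, consistent with the preceding sentence, is:

\[
C_i^{s} = \underset{C_{ij} \in \mathcal{C}_i}{\arg\max}\; \operatorname{area}(C_{ij}),
\qquad
\mathcal{C}_i = \left\{ C_{ij} \;:\; \frac{\operatorname{area}(C_{ij} \cap G_i)}{\operatorname{area}(C_{ij})} \geq \frac{y}{100} \right\}
\]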

[0047] Once the seed cluster is identified, the region is grown iteratively by combining member clusters that share boundaries. The order of combining them is based on maximum boundary sharing with the current merged cluster. This is repeated for all the granule sets Gi for i = 1, 2, 3, ..., n, 'n' being a positive integer. The output of this iterative process is a "forest" of a few merged segments, as shown in Figure 6B. The iteration step is illustrated by way of the example expression below:
(3)
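Expression (3) is likewise not reproduced in this text. The following minimal Python sketch illustrates one way the merge-by-maximum-boundary-sharing iteration could be realised, operating on explicit pixel sets; the set representation and the boundary_length helper are illustrative assumptions, not the implementation of the disclosure.

# Illustrative sketch of the iterative region-growing step: starting from the seed
# cluster, repeatedly merge the member cluster that shares the longest boundary
# with the region merged so far.
def boundary_length(region_a, region_b):
    # Count 4-connected adjacencies between two regions given as sets of (row, col) pixels.
    return sum(1 for (r, c) in region_a
                 for (dr, dc) in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if (r + dr, c + dc) in region_b)

def grow_region(seed_pixels, member_clusters):
    merged = set(seed_pixels)
    remaining = [set(c) for c in member_clusters if set(c) != set(seed_pixels)]
    while remaining:
        # Pick the candidate with maximum boundary sharing with the current region.
        best = max(remaining, key=lambda cluster: boundary_length(merged, cluster))
        if boundary_length(merged, best) == 0:
            break            # no remaining member cluster touches the grown region
        merged |= best
        remaining.remove(best)
    return merged

A real implementation would operate on the cluster label map produced by the segmentation stage rather than on explicit pixel sets, but the merge order is the same.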

[0048] A super clustered image is generated upon the one or more clusters being merged based on the boundary sharing conditions.
[0049] Figure 7, with reference to Figures 1 through 6, depicts an Output of Foreground Object Detection based on context based pylon detection using the tower detection module 220 of Figure 2 according to an embodiment of the present disclosure. For all inspections, the UAV path planning is always done so that the pylon and all other components of the power grid are the physically closest objects to be imaged. Such context-based sensory information is assumed to be fed as prior knowledge to the detection framework. The embodiments of the present disclosure use this information to select one out of many blobs, namely the one whose bounding box bottom-left corner is closest to the bottom-left corner of the image frame, i.e., closest during projection. In other words, the bottom edge of the aerial image is indicative of the proximity of the transmission tower in the aerial image, to eliminate false positive detection of the transmission tower. The output of this stage is shown in Figure 7. The tower detection module 220 performs the context based pylon detection operation on the super clustered image to identify a super cluster corresponding to the transmission tower such that the identified super cluster is distinguishable from one or more false positive super clusters in the one or more aerial images; in other words, the identified super cluster is not a false positive super cluster. This operation thereby identifies the appropriate transmission tower (the tower space and the exact location of the transmission tower) in the one or more aerial images.
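A minimal sketch of this context-based selection is shown below, assuming the candidate blobs are available as OpenCV-style (x, y, w, h) bounding boxes; that representation is an assumption for illustration, not the format used by the disclosure.

# Sketch of the context prior: among candidate blobs, keep the one whose bounding
# box bottom-left corner is closest to the bottom-left corner of the frame.
import math

def select_pylon_blob(bounding_boxes, frame_height):
    if not bounding_boxes:
        return None
    def distance_to_frame_corner(box):
        x, y, w, h = box
        # Image coordinates: y grows downward, so the frame's bottom-left corner is (0, frame_height).
        return math.hypot(x - 0, (y + h) - frame_height)
    return min(bounding_boxes, key=distance_to_frame_corner)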
[0050] Experimental Results:
[0051] The image data was captured using an 11 MP f/2.8 wide-lens RGB camera (GoPro Hero3) mounted on a mini-UAV. The captured image size was 3000*2250, which was resized to 1280*960 for testing purposes. Two test sites provided by the Hot Line Training Center, outside Bangalore, were used for imaging two power grids. A quadcopter provided by one or more collaborators was flown so as to have a sideways view of the power grids. For such a view, the pitch of the camera mount was fixed to around 60 degrees, while the yaw was azimuth-facing and the roll angle was towards the horizon. A sufficient length of the power grids was imaged, giving around 115 different pylon frames within the video. The background of the power lines was varying, and typically consisted of vegetation, sky, a few unpaved roads and houses.
[0052] Results and Performance Analysis:
[0053] While the testing and analysis have been carried out on all 115 pylon frames, results are shown only for two sample images for the sake of brevity. The two images have been chosen with different angles of view, for different sized pylons. The configurable thresholds, e.g., 80% area overlap, 30% tail discarding, etc., while making intuitive sense, were hand-tuned so as to minimize partial false positives as well as partial false negatives. A partial false positive occurs when a neighboring region other than the actual pylon region is present in the final detected pylon region. Figure 8, with reference to Figures 1 through 7, depicts a partial false positive that occurred when a neighboring region other than the actual pylon region is present in the final detected pylon region according to an example embodiment of the present disclosure. Similarly, a partial false negative occurs when a part of the actual pylon region is not present within the boundary of the detected pylon region. Figure 9, with reference to Figures 1 through 8, depicts a partial false negative that occurred when a part of the actual pylon region is not present within the boundary of the detected pylon region according to an example embodiment of the present disclosure. The embodiments of the present disclosure also considered full false positives, i.e., detection of a non-pylon region as a pylon region, and full false negatives, i.e., complete non-detection of a pylon region, though present.
[0054] The accuracy of pylon detection is measured based on the ground truth results, where pylons are marked manually using the GIMP toolbox. Figure 10, with reference to Figures 1 through 9, depicts ground truth images illustrating the accuracy of pylon detection measurement according to an example embodiment of the present disclosure. Overlap analysis of the ground truth region and the detected region, as shown for two example images in Figure 8 and Figure 9, reveals that the partial false positives are mostly limited to co-detection of small lengths of power lines, insulators and vegetation whose boundary intersects with the pylon boundary in the projection. Since the target is semi-automated processing, the presence of these minor protrusions in the desired region is not a serious problem. Partial false negatives were present in 8% of the frames, due to the fact that the pylon projection is slanted at times, resulting in some granules (or blocks) having a limited part of the pylon getting discarded as non-key granules. Such losses are limited to up to 2 out of a maximum of 10 key granules (candidate blocks) considered per frame, i.e., < 20%, so such a "dent" is not widespread. Full false negatives are limited to just 1 frame, since a very small part of the pylon (< 10%) is all that got captured in the image. Full false positives are typically restricted to the presence of truss-like structures in the background, e.g., a power station. However, the final stage of the technique proposed by the embodiments of the present disclosure removes them all.
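By way of illustration, the overlap analysis against the manually marked ground truth could be quantified as sketched below. The specific ratios shown (intersection-over-union and partial false positive/negative fractions) are one plausible reading of the comparison described above, not metrics reproduced from the specification.

# Sketch of overlap analysis between a binary ground-truth pylon mask and a binary
# detected-region mask; the metric definitions are illustrative assumptions.
import numpy as np

def overlap_statistics(ground_truth_mask, detected_mask):
    gt = ground_truth_mask.astype(bool)
    det = detected_mask.astype(bool)
    intersection = np.logical_and(gt, det).sum()
    union = np.logical_or(gt, det).sum()
    # Fraction of the detected region that lies outside the true pylon region.
    partial_false_positive = np.logical_and(det, ~gt).sum() / max(int(det.sum()), 1)
    # Fraction of the true pylon region missed by the detected region.
    partial_false_negative = np.logical_and(gt, ~det).sum() / max(int(gt.sum()), 1)
    iou = intersection / max(int(union), 1)
    return {"iou": iou,
            "partial_false_positive": partial_false_positive,
            "partial_false_negative": partial_false_negative}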
[0055] The following is an illustrative technique for detection of a transmission tower/pylon in aerial images, as used by the embodiments of the present disclosure:
Optimized mean shift segmentation to:
    Find the peak of a confidence map using the color histogram of the image.
    Cluster the image into mean-color-based clusters.
Gradient magnitude density estimation for all image granules (image blocks).
Cluster density estimation for all granules.
Select all granules which have a higher value in the 2D vector of gradient magnitude and cluster density.
For each key granule, in descending order of vector values:
    Pick a cluster in that granule which has maximum area within that granule.
    Merge the selected cluster with the neighboring clusters that have maximum boundary sharing with the selected cluster, and maximum area within that granule, in an iterative manner.
    Detection of the foreground cluster as pylon based on context knowledge.
EndFor.
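To tie the listed steps together, the following self-contained Python sketch approximates the pipeline with OpenCV building blocks: pyrMeanShiftFiltering stands in for the optimized mean shift segmentation, connected components on an edge mask stand in for the full cluster merging and boundary growing, and the 500-pixel minimum area and filter radii are assumed values. It is a demonstrative approximation of the technique, not the patented implementation.

# Simplified end-to-end sketch: smoothing, per-granule gradient density, key-granule
# selection, a connected-components stand-in for cluster merging, and the bottom-left
# context prior for picking the pylon blob.
import cv2
import numpy as np

def detect_pylon_region(frame_bgr, granule=(96, 128), tail_fraction=0.3):
    smoothed = cv2.pyrMeanShiftFiltering(frame_bgr, 12, 24)
    gray = cv2.cvtColor(smoothed, cv2.COLOR_BGR2GRAY)
    mag = cv2.magnitude(cv2.Sobel(gray, cv2.CV_32F, 1, 0),
                        cv2.Sobel(gray, cv2.CV_32F, 0, 1))

    gh, gw = granule
    rows, cols = gray.shape[0] // gh, gray.shape[1] // gw
    cropped = mag[:rows * gh, :cols * gw]
    density = cropped.reshape(rows, gh, cols, gw).mean(axis=(1, 3))
    key = (density >= tail_fraction * density.max()).astype(np.uint8)   # drop the low-density tail

    # Binary edge mask restricted to key granules; connected components approximate
    # the boundary-growing / region-merging stage of the boundary analysis module.
    mask = (cropped > cropped.mean()).astype(np.uint8)
    mask *= np.kron(key, np.ones((gh, gw), dtype=np.uint8))
    num, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)

    # Context prior: component whose bounding box bottom-left corner is closest to
    # the bottom-left corner of the frame.
    frame_h = mask.shape[0]
    best_box, best_dist = None, float("inf")
    for i in range(1, num):                               # label 0 is the background
        x, y, w, h, area = stats[i]
        dist = float(np.hypot(x, frame_h - (y + h)))
        if area > 500 and dist < best_dist:               # area threshold is an assumed value
            best_box, best_dist = (int(x), int(y), int(w), int(h)), dist
    return best_box                                        # (x, y, w, h) or None

In practice, the connected-component stand-in would be replaced by the cluster merging performed by the boundary analysis module 218 described above, with the tower detection module 220 applying the context prior to the resulting super clusters.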
[0056] Although implementations of system and method for analysis of transmission towers using aerial imagery have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features of systems and methods are disclosed as examples of implementations for analysis of transmission towers using aerial imagery.
[0057] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[0058] Automatic detection of electricity towers in aerial images is a practically useful research problem that requires attention from the computer vision research community. The embodiments of the present disclosure describe a technique for detecting transmission towers in complex and heterogeneous outdoor surroundings, which is achieved by suppressing the background clutter first, then selecting one or more candidate blocks based on at least one of a gradient magnitude density or a cluster density, and finally applying a region merging technique to detect the tower region. The cage-like structure and linearity features of towers are exploited during detection. The proposed technique was tested on 100+ tower images collected using a small quadcopter mini-UAV, over two grid corridors. It was found to exhibit minimal presence of both false positives and false negatives. This makes the technique for tower detection robust with good performance, and hence it can be applicable for detecting all standard types of towers in other outdoor surroundings as well.
[0060] It is, however, to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[0061] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[0062] The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W), BLU-RAY, and DVD.
[0063] A data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
[0064] Input/output (I/O) devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
[0065] A representative hardware environment for practicing the embodiments may include a hardware configuration of an information handling/computer system in accordance with the embodiments herein. The system herein comprises at least one processor or central processing unit (CPU). The CPUs are interconnected via system bus to various devices such as a random access memory (RAM), read-only memory (ROM), and an input/output (I/O) adapter. The I/O adapter can connect to peripheral devices, such as disk units and tape drives, or other program storage devices that are readable by the system. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments herein.
[0066] The system further includes a user interface adapter that connects a keyboard, mouse, speaker, microphone, and/or other user interface devices such as a touch screen device (not shown) to the bus to gather user input. Additionally, a communication adapter connects the bus to a data processing network, and a display adapter connects the bus to a display device which may be embodied as an output device such as a monitor, printer, or transmitter, for example.
[0067] The preceding description has been presented with reference to various embodiments. Persons having ordinary skill in the art and technology to which this application pertains will appreciate that alterations and changes in the described structures and methods of operation can be practiced without meaningfully departing from the principle, spirit and scope.
CLAIMS:
1. A processor implemented method, comprising:
obtaining, by one or more hardware processors, one or more aerial images pertaining to a transmission tower;
performing an optimized mean shift segmentation on said one or more aerial images;
identifying one or more candidate blocks present in said one or more aerial images based on said optimized mean shift segmentation being performed;
performing boundary growing and region merging operation over said one or more candidate blocks in each of said one or more aerial images to obtain boundary sharing conditions and one or more clusters associated with said boundary sharing conditions;
generating a super clustered image on said one or more clusters being merged based on said boundary sharing conditions; and
performing context based pylon detection operation on said super clustered image to identify a super cluster corresponding to said transmission tower such that said identified super cluster is distinguishable from one or more false positive super clusters in said one or more aerial images.

2. The processor implemented method of claim 1, wherein identifying one or more candidate blocks present in said one or more aerial images is based on at least one of a gradient magnitude density and a cluster density.

3. The processor implemented method of claim 1, wherein bottom edge of an aerial image from said one or more aerial images is indicative of proximity of said transmission tower from said aerial image to eliminate false positive detection of said transmission tower.

4. The processor implemented method of claim 1, wherein one or more blocks having a value equal to, or greater than a threshold value in a two dimensional vector of gradient magnitude density and cluster density are identified as said one or more candidate blocks.

5. A system comprising:
a memory storing instructions;
one or more communication interfaces;
one or more hardware processors coupled to said memory via said one or more communication interfaces, wherein said one or more hardware processors are configured by said instructions to execute:
a filtration module that is configured to obtain one or more aerial images pertaining to a transmission tower, and perform an optimized mean shift segmentation on said one or more aerial images;
a search space identification module that is configured to identify one or more candidate blocks present in said one or more aerial images based on said optimized mean shift segmentation being performed;
a boundary analysis module that is configured to perform boundary growing and region merging operation over said one or more candidate blocks in each of said one or more aerial images to obtain boundary sharing conditions and one or more clusters associated with said boundary sharing conditions, and generate a super clustered image on said one or more clusters being merged based on said boundary sharing conditions; and
a tower detection module that is configured to perform context based pylon detection operation on said super clustered image to identify a super cluster corresponding to said transmission tower such that said identified super cluster is distinguishable from one or more false positive super clusters in said one or more aerial images.

6. The system of claim 5, wherein said search space identification module identifies one or more candidate blocks present in said one or more aerial images based on at least one of a gradient magnitude density and a cluster density.

7. The system of claim 5, wherein said tower detection module is configured to identify an aerial image from said one or more aerial images such that said transmission tower is in close proximity of bottom edge of said aerial image to eliminate false positive detection of said transmission tower.

8. The system of claim 5, wherein said search space identification module is configured to identify one or more blocks having a value equal to, or greater than a threshold value in a two dimensional vector of gradient magnitude density and cluster density as said one or more candidate blocks.

Documents

Application Documents

# Name Date
1 474-MUM-2015-FORM 26-(27-04-2015).pdf 2015-04-27
2 474-MUM-2015-CORRESPONDENCE-(27-04-2015).pdf 2015-04-27
3 Drawing [11-02-2016(online)].pdf 2016-02-11
4 Description(Complete) [11-02-2016(online)].pdf 2016-02-11
5 Form 2.pdf ONLINE 2018-08-11
6 Form 2.pdf 2018-08-11
7 474-MUM-2015-Form 1-250215.pdf 2018-08-11
8 474-MUM-2015-Correspondence-250215.pdf 2018-08-11
9 474-MUM-2015-FER.pdf 2019-05-01
10 474-MUM-2015-CLAIMS [01-11-2019(online)].pdf 2019-11-01
11 474-MUM-2015-COMPLETE SPECIFICATION [01-11-2019(online)].pdf 2019-11-01
12 474-MUM-2015-DRAWING [01-11-2019(online)].pdf 2019-11-01
13 474-MUM-2015-FER_SER_REPLY [01-11-2019(online)].pdf 2019-11-01
14 474-MUM-2015-OTHERS [01-11-2019(online)].pdf 2019-11-01
15 474-MUM-2015-IntimationOfGrant14-09-2023.pdf 2023-09-14
16 474-MUM-2015-PatentCertificate14-09-2023.pdf 2023-09-14

Search Strategy

1 474_MUM_2015_Search_Strategy_19-12-2018.pdf
2 474_MUM_2015_Search_Strategy_Modified_26-04-2019.pdf

ERegister / Renewals

3rd: 14 Dec 2023

From 13/02/2017 - To 13/02/2018

4th: 14 Dec 2023

From 13/02/2018 - To 13/02/2019

5th: 14 Dec 2023

From 13/02/2019 - To 13/02/2020

6th: 14 Dec 2023

From 13/02/2020 - To 13/02/2021

7th: 14 Dec 2023

From 13/02/2021 - To 13/02/2022

8th: 14 Dec 2023

From 13/02/2022 - To 13/02/2023

9th: 14 Dec 2023

From 13/02/2023 - To 13/02/2024

10th: 12 Feb 2024

From 13/02/2024 - To 13/02/2025

11th: 13 Feb 2025

From 13/02/2025 - To 13/02/2026