
System And Method For Modelling Of Underwater Topography

Abstract: The present disclosure relates to a system and method for facilitating modelling of underwater topography. The system includes a modelling unit operatively coupled to a side scan SONAR system and comprising one or more processors configured to: obtain a set of data packets corresponding to underwater environment parameters; extract a first set of attributes corresponding to underwater floor terrain topology parameters and a second set of attributes pertaining to underwater object parameters; generate a reference image based on the extracted first and second sets of attributes; and map an underwater floor terrain model based on any or a combination of the extracted first and second sets of attributes on the generated reference image.


Patent Information

Application #
Filing Date
21 December 2020
Publication Number
25/2022
Publication Type
INA
Invention Field
PHYSICS
Status
Parent Application

Applicants

Bharat Electronics Limited
Corporate Office, Outer Ring Road, Nagavara, Bangalore - 560045, Karnataka, India.

Inventors

1. BIVIN GEORGE
Electro Optic Laser & Electronic Warfare / CRL, Bharat Electronics Limited, Jalahalli, Bangalore - 560013, Karnataka, India.
2. JISHA G
Sonar System / PDIC, Bharat Electronics Limited, Jalahalli, Bangalore - 560013, Karnataka, India.
3. NIDHAL M MANSOOR
Sonar System / PDIC, Bharat Electronics Limited, Jalahalli, Bangalore - 560013, Karnataka, India.

Specification

Claims:
1. A system for facilitating modelling of underwater topography, said system comprising:
a modelling unit operatively coupled to a side scan SONAR system and comprising one or more processors, wherein the one or more processors are operatively coupled with a memory, the memory storing instructions executable by the one or more processors to:
obtain a set of data packets from a first computing device amongst the one or more computing devices associated with the side scan SONAR system, wherein the set of data packets correspond to underwater environment parameters;
extract a first set of attributes from the set of data packets, wherein the first set of attributes correspond to underwater-floor terrain topology parameters;
extract a second set of attributes from the set of data packets, wherein the second set of attributes correspond to attributes associated with underwater objects;
generate a reference image based on the extracted first and second set of attributes; and
map an underwater floor terrain model based on any or a combination of the extracted first and second set of attributes on the generated reference image.
2. The system as claimed in claim 1, wherein the first set of attributes corresponding to underwater floor terrain topology parameters comprise any or a combination of maximum achievable swath range, ship speed, tow speed, survey time, resolution along track, resolution across track, mine data set, altitude of object from underwater floor and transducer depression angle.
3. The system as claimed in claim 1, wherein the second set of attributes comprise any or a combination of object dimension coefficient, object reflection coefficient, object height from the underwater floor, object position on the underwater floor, position and height of the transducer, beam angle of the transducer, and elevation of the underwater floor.
4. The system as claimed in claim 1, wherein the modelling unit is configured to generate a plurality of contour maps based on the extracted first and second set of attributes.
5. The system as claimed in claim 1, wherein the modelling unit is configured to generate a shadow of underwater objects based on the extracted first and second set of attributes.
6. The system as claimed in claim 1, wherein the generated underwater floor terrain model comprises any or a combination of an underwater floor terrain model based on a plurality of elevation contour maps, a plurality of reflectance contour maps, and a shadow of the underwater floor terrain based on elevation, position and height of the SONAR transducer, and wherein the pixelated underwater floor terrain model is mapped onto the reference image.
7. The system as claimed in claim 1, wherein the underwater terrain pertains to any or a combination of lake floor terrain, river floor terrain, sea floor terrain, and ocean floor terrain.
8. A method for facilitating mapping of underwater topography, said method being executed by a set of instructions at one or more processors, and comprising:
obtaining a set of data packets from a first computing device amongst the one or more computing devices associated with a side scan SONAR system at a processor, wherein the set of data packets corresponds to underwater environment parameters;
extracting a first set of attributes from the set of data packets, wherein the first set of attributes correspond to underwater floor terrain topology parameters;
extracting a second set of attributes from the set of data packets, wherein the second set of attributes correspond to attributes associated with underwater objects;
generating a reference image based on the extracted first and second set of attributes; and
mapping an underwater floor terrain model based on any or a combination of the extracted first and second set of attributes on the generated reference image.
9. The method as claimed in claim 8, wherein the generated shadow of the underwater objects and the generated plurality of contour maps are mapped on the underwater floor terrain model.
10. The method as claimed in claim 8, wherein the shadow of underwater objects provides for a shadow signature mask corresponding to any or a combination of a dead zone, highlight and shadow ratio pertaining to the extracted first and second set of attributes.
Description:
TECHNICAL FIELD
[1] The present disclosure relates to a system and method for underwater survey, and in particular modelling of underwater topography.

BACKGROUND
[2] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[3] Existing underwater environment models provide simulated, acoustically realistic active sonar signals; they simulate a realistic-sounding active sonar signal originating from a simulated sonar transmitter and provide simulated ocean effects that incorporate realistic-sounding simulated reverberation. Another technique relates generally to the field of quantitative sedimentologic and stratigraphic prediction, where the model output consists of a three-dimensional model of the sedimentologic and stratigraphic attributes for a specified basin volume. Yet another technique relates to methods and systems for performing underwater surveys, in particular on sub-sea installations such as oil and gas pipelines, risers, well-heads and so on, and provides an augmented underwater image of a scene for use in an underwater survey.
[4] Such models, however, predominantly use complex hardware-based approaches and complex computer-implemented methods, resulting in high-end hardware systems that are not only expensive but also pose maintenance and installation problems. Hence, there is a requirement in the art for a system and a method that facilitate efficient underwater environment modelling for a side scan sonar using cost-effective hardware modules and systems.

OBJECTS OF THE PRESENT DISCLOSURE
[5] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[6] An object of the present disclosure is to provide for a system and method to synthesize underwater environment model for a side scan sonar.
[7] An object of the present disclosure is to provide for a system and method to facilitate synthesizing of base underwater floor image, using reference image based on spatio-temporal domain specification and generate an underwater floor model.
[8] An object of the present disclosure is to provide for a system and method to facilitate synthesizing of underwater floor terrain using contour maps having elevation and reflectance data with respect to reference image and generate an underwater floor model.
[9] An object of the present disclosure is to provide for a system and method to facilitate synthesizing of mine like objects based on object dimension and reflectance details provided in dataset and generate an underwater floor model.
[10] An object of the present disclosure is to provide for a system and method to facilitate synthesizing of shadow of underwater floor terrain based on terrain elevation data and object dimension and position and generate an underwater floor model.
[11] An object of the present disclosure is to provide for a system and method to facilitate addition of features such as faults, craters, landslides, sediment paths, rocks from underwater terrain images and the like to incorporate in the generated underwater model.
[12] An object of the present disclosure is to provide for a system and method to facilitate synthesizing of three-dimensional to two-dimensional underwater floor terrain, and generate an underwater floor model.

SUMMARY
[13] The present disclosure provides for a system and a method for facilitating modelling of underwater topography.
[14] In an aspect, the present disclosure provides for a system for facilitating modelling of underwater topography. The system may include a modelling unit operatively coupled to a side scan SONAR system and comprising one or more processors operatively coupled with a memory, the memory storing instructions executable by the one or more processors to obtain a set of data packets from a first computing device amongst the one or more computing devices associated with the side scan SONAR system, where the set of data packets can correspond to underwater environment parameters. The system can then extract a first set of attributes from the obtained set of data packets, where the first set of attributes may correspond to underwater topology parameters, and also extract a second set of attributes from the obtained set of data packets, where the second set of attributes may correspond to attributes associated with underwater objects. The system can further generate a reference image based on the extracted first and second set of attributes, and map an underwater terrain model based on any or a combination of the extracted first and second set of attributes on the generated reference image.
[15] In an embodiment, the first set of attributes corresponding to underwater topology parameters may include any or a combination of maximum achievable swath range, ship speed, tow speed, survey time, resolution along track, resolution across track, mine data set, altitude of object from underwater bed, a transducer depression angle, and the like.
[16] In an embodiment, the second set of attributes can include any or a combination of object dimension coefficient, object reflection coefficient, object height from underwater floor, object position on underwater floor, position and height of transducer, beam angle of transducer, elevation of underwater floor.
[17] In an embodiment, the modelling unit may be configured to generate a plurality of contour maps based on the extracted first and second set of attributes.
[18] In another embodiment, the modelling unit may be configured to generate a shadow of underwater objects based on the extracted first and second set of attributes.
[19] In yet another embodiment, the generated underwater terrain model can be based on any or a combination of a plurality of elevation contour maps, a plurality of reflectance contour maps, a shadow of the underwater terrain based on elevation, position and height of the transducer, and the like. The underwater terrain model may be pixelated and mapped onto the reference image.
[20] In an embodiment, the underwater terrain may include any or a combination of lake floor terrain, river floor terrain, sea floor terrain, and ocean floor terrain.
[21] In an embodiment, the present disclosure provides for a method for facilitating modelling of underwater topography, the method being executed by a set of instructions executable at one or more processors. The method may include the steps of: obtaining a set of data packets from a first computing device amongst the one or more computing devices associated with a side scan SONAR system, where the set of data packets may correspond to underwater environment parameters; extracting a first set of attributes from the obtained set of data packets, where the first set of attributes may correspond to underwater floor terrain topology parameters; extracting a second set of attributes from the obtained set of data packets, where the second set of attributes may correspond to attributes associated with underwater objects; generating a reference image based on the extracted first and second set of attributes; and mapping an underwater terrain model based on any or a combination of the extracted first and second set of attributes on the generated reference image.
[22] In an embodiment, any or a combination of generated shadow of the underwater objects and plurality of contour maps can be mapped on the underwater terrain model.
[23] In an embodiment, shadow of underwater objects can provide for a shadow signature mask corresponding to any or a combination of a dead zone, highlight and shadow ratio. The shadow mask signature can be based on the extracted first and second set of attributes.

BRIEF DESCRIPTION OF THE DRAWINGS
[24] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[25] FIG. 1 illustrates an exemplary architecture in which or with which the proposed system can be implemented in accordance with an embodiment of the present disclosure.
[26] FIG. 2 illustrates an exemplary architecture of a processor in accordance with an embodiment of the present disclosure.
[27] FIG. 3 illustrates an exemplary representation of a flow diagram associated with the method for facilitating mapping of underwater topography in accordance with an embodiment of the present disclosure.
[28] FIG. 4 illustrates a generic block diagram representation of the process flow overview of the method in accordance with an embodiment of the present disclosure.
[29] FIG. 5 illustrates an exemplary flow diagram illustrating a working example of shadow generation in accordance with an embodiment of the present disclosure.
[30] FIG. 6 illustrates an exemplary representation of shadow generation in accordance with an embodiment of the present disclosure.
[31] FIGs. 7A-7B illustrate exemplary representations of contour maps in accordance with an embodiment of the present disclosure.
[32] FIGs. 8A-8B illustrate exemplary representations of object and shadow model generation in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[33] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[34] The present disclosure provides for a system and a method for facilitating modelling of underwater topography.
[35] Embodiments of the present invention may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).
[36] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present invention with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present invention may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the invention could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
[37] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. These exemplary embodiments are provided only for illustrative purposes and so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art. The invention disclosed may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Various modifications will be readily apparent to persons skilled in the art. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed.
[38] In an aspect, the present disclosure provides for a system for facilitating modelling of underwater topography. The system may include a modelling unit operatively coupled to a side scan SONAR system. The modelling unit can include one or more processors operatively coupled with memory, the memory storing instructions executable by the one or more processors to obtain a set of data packets from a first computing device amongst the one or more computing devices associated with the side scan SONAR system. The set of data packets can include underwater environment parameters. The processor can extract a first set of attributes from the obtained set of data packets. The first set of attributes may correspond to underwater topology parameters. The processor can then be configured to extract a second set of attributes from the obtained set of data packets. The second set of attributes may correspond to attributes associated with underwater objects, and the processor can then generate a reference image based on the extracted first and second set of attributes. The processor can also be configured to map an underwater terrain model based on any or a combination of the extracted first and second set of attributes on the generated reference image.
[39] In an embodiment, the first set of attributes corresponding to underwater topology parameters may include any or a combination of maximum achievable swath range, ship speed, tow speed, survey time, resolution along track, resolution across track, mine data set, altitude of system from underwater floor, a transducer depression angle, and the like.
[40] In an embodiment, the second set of attributes can include any or a combination of object dimension coefficient, object reflection coefficient, object height from underwater floor, object position on underwater floor, position and height of transducer, beam angle of transducer, elevation of underwater floor.
[41] In an embodiment, the modelling unit may be configured to generate a plurality of contour maps based on the extracted first and second set of attributes. In another embodiment, the modelling unit may be configured to generate a shadow of underwater objects based on the extracted first and second set of attributes.
[42] In yet another embodiment, the generated underwater terrain model can be based on any or a combination of a plurality of elevation contour maps, a plurality of reflectance contour maps, a shadow of the underwater terrain based on elevation, position and height of the transducer, and the like. The underwater terrain model may be pixelated and mapped onto the reference image.
[43] In an embodiment, the underwater terrain can include any or a combination of lake floor terrain, river floor terrain, sea floor terrain, ocean floor terrain and the like.
[44] In an embodiment, the present disclosure provides for a method for facilitating modelling of underwater topography. The method may be executed by a set of instructions executable at one or more processors and may include the steps of: obtaining a set of data packets from a first computing device amongst the one or more computing devices associated with a side scan SONAR system; extracting a first set of attributes from the set of data packets; extracting a second set of attributes from the set of data packets; generating a reference image based on the extracted first and second set of attributes; and mapping an underwater terrain model based on any or a combination of the extracted first and second set of attributes on the generated reference image.
[45] In an embodiment, any or a combination of generated shadow of the underwater objects and plurality of contour maps can be mapped on the underwater terrain model.
[46] In an embodiment, shadow of underwater objects can provide for a shadow signature mask corresponding to any or a combination of a dead zone, highlight and shadow ratio. The shadow mask signature can be based on the extracted first and second set of attributes.
[47] FIG. 1 illustrates an exemplary network architecture in which or with which the proposed system can be implemented in accordance with an embodiment of the present disclosure.
[48] As illustrated in FIG. 1, according to an aspect of the present disclosure, an underwater terrain modelling system 100 (also referred to as the system 100 hereinafter) can provide modelling of underwater data related to a set of data packets obtained from side scan SONAR systems 108. In another embodiment, the set of data packets can be user defined through input devices associated with the system 100.
[49] In an embodiment, the system 100 can include a modelling unit 102, one or more input devices, one or more output devices, one or more power devices, and a network 104 operatively coupled to the modelling unit 102. In an exemplary embodiment, the one or more input devices can include a keypad 114, a touchpad, and the like. The keypad 114 can be configured to acquire one or more attributes of a user and other state parameters associated with SONAR data packets. The one or more output devices can include a display unit 116. The display unit 116 can be used to provide a visual model to the user.
[50] In an embodiment, the input devices 114 and display units 116 can be associated with one or more computing devices 106. In an embodiment, the system 100 can be implemented using any or a combination of hardware components and software components such as a cloud, a server, a computing device, a network device, and the like. The modelling unit 102 may be configured to exchange the set of data packets with the one or more computing devices 106, which can receive the set of data packets via the network 104 from the side scan SONAR systems 108.
[51] In another embodiment, the computing device 106 can include a datasheet having underwater environment parameters. The datasheet can be stored in the memory of the computing device 106. In yet another embodiment, the datasheet associated with underwater environment parameters can be user defined.
[52] In an implementation, the system 100 can be accessed by the one or more computing devices 106 through a website or application that can be configured with any operating system, including but not limited to Android™, iOS™, Kai-OS™ and the like.
[53] Further, the system 100 can also be configured to extract a first and a second set of attributes from the received data packets. The first set of attributes may correspond to underwater environment parameters such as maximum achievable swath range, ship speed, tow speed, survey time, resolution along track, resolution across track, mine data set, altitude of system from underwater floor, a transducer depression angle, and the like. The second set of attributes may correspond to attributes associated with underwater objects such as object dimension coefficient, object reflection coefficient, object height from underwater floor, object position on underwater floor, position and height of transducer, beam angle of transducer, elevation of underwater floor, and the like.
[54] Subsequently, in an embodiment, the system 100 can generate a reference image based on the extracted first and second set of attributes and map an underwater terrain model based on any or a combination of the extracted first and second set of attributes on the generated reference image.
[55] Examples of the computing devices can include, but are not limited to, a computing device 106, a smart phone, a portable computer, a laptop, a handheld device, a workstation and the like.
[56] In an embodiment, the system 100 can be communicatively coupled to the modelling unit 102 through a communication unit, wherein the communication unit is a network 104 that can include any or a combination of a wireless network module, a wired network module, a dedicated network module and a shared network module.
[57] In an embodiment, the one or more computing devices 106 with the datasheet having the underwater environment parameters can be configured to capture, sample and map the underwater environment parameters onto a set of data packets having a series of across track slices.
[58] FIG. 2 illustrates an exemplary architecture of a processor 202 coupled to the system (100) in accordance with an embodiment of the present disclosure.
[59] As illustrated, the modelling unit 102 can include one or more processor(s) 202. The one or more processor(s) 202 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute computer-readable instructions stored in a memory 204 of the modelling unit 102. The memory 204 can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 204 can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[60] The modelling unit 102 can also include an interface(s) 206. The interface(s) 206 may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 can facilitate communication of the modelling unit 102 with various devices coupled to the modelling unit 102. The interface(s) 206 can also provide a communication pathway for one or more components of the modelling unit 102. Examples of such components include, but are not limited to, processing units 208 and database 210. In another exemplary embodiment, the set of data packets pertaining to underwater environment parameters can be stored in the database 210. Herein, the database 210, can be configured and developed through the interface 206 that can sort set of data packets pertaining to underwater environment parameters according to the nature of set of data packets.
[61] Further, the processing units 208 can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing units 208. The database 210 can include data that is either stored or generated because of functionalities implemented by any of the components of the processing units 208.
[62] In an example, the processing units 208 can include an attribute extraction unit 212, a reference image generation unit 214, a contour map generation unit 216, an object synthesising unit 218, an object shadow model generation unit 220, a shadow signature mask generation unit 222, an underwater floor map modelling unit 224 and other unit(s). The other unit(s) can implement functionalities that supplement applications or functions performed by the modelling unit 102 or the processing units 208.
[63] In an embodiment, the attribute extraction unit 212 can obtain a set of data packets from the computing device 106 associated with the side scan SONAR system 108 and extract a first and a second set of attributes from the set of data packets. For example, the obtained set of data packets can include data pertaining to underwater environment parameters. The attribute extraction unit 212 can extract the first set of attributes pertaining to underwater environment parameters, which can include maximum achievable swath range, ship speed, tow speed, survey time, resolution along track, resolution across track, mine data set, altitude of system from underwater floor, a transducer depression angle, and the like. The attribute extraction unit 212 can then extract the second set of attributes corresponding to underwater object parameters such as object dimension coefficient, object reflection coefficient, object height from underwater floor, object position on underwater floor, position and height of transducer, beam angle of transducer, elevation of underwater floor, and the like. The first and the second set of attributes can be stored in the database 210 for further processing, as sketched below.
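By way of illustration, the two attribute sets could be carried as simple structured records once the data packets are decoded. The following is a minimal Python sketch under that assumption; the field names, the packet layout and the extract_attributes helper are hypothetical and not prescribed by this disclosure.

from dataclasses import dataclass

@dataclass
class TerrainAttributes:              # first set: underwater floor terrain topology parameters
    swath_range_m: float              # maximum achievable swath range (Ls)
    tow_speed_mps: float              # ship / tow speed (Vs)
    survey_time_s: float              # survey time (Ts)
    res_along_track_m: float          # resolution along track (Ral)
    res_across_track_m: float         # resolution across track (Rac)
    transducer_depression_deg: float  # transducer depression angle

@dataclass
class ObjectAttributes:               # second set: underwater object parameters
    dimension_coeff: float
    reflection_coeff: float
    height_from_floor_m: float
    position_across_track_m: float

def extract_attributes(packet: dict) -> tuple[TerrainAttributes, ObjectAttributes]:
    """Split one decoded data packet into the two attribute sets."""
    terrain = TerrainAttributes(**packet["terrain"])
    objects = ObjectAttributes(**packet["object"])
    return terrain, objects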
[64] In an embodiment, the modelling unit 102 can include a reference image generation unit 214 which can be configured to generate, through various image processing techniques, a 2D image of the underwater floor terrain based on the extracted first and second set of attributes stored in the database 210 and received from the attribute extraction unit 212.
[65] In an exemplary embodiment, the reference generation unit 214 can generate an image of the underwater terrain based on the first and the second attributes received.
[66] In another embodiment, the modelling unit 102 can include a contour map generation unit 216 that can generate a map with contour lines showing any or a combination of valleys and hills, elevation and reflection of objects, and steepness and gentleness of slopes, with a contour interval defined as the difference in elevation between successive contour lines. Contour lines can be any or a combination of curved, straight or mixed lines on a map describing the intersection of a real or hypothetical surface with one or more horizontal planes. The configuration of these contours can allow map readers to infer the relative gradient of a parameter and estimate that parameter at specific places. In an exemplary implementation, the contour map generation unit 216 can generate any or a combination of an elevation contour map, a reflectance contour map and the like of the underwater terrain, and then map the contour maps on the reference image produced by the reference image generation unit 214, as sketched below.
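A minimal sketch, assuming numpy/matplotlib and a synthetic elevation and reflectance grid, of the kind of elevation and reflectance contour maps the contour map generation unit 216 could draw over the reference image grid; the actual contouring method used by the disclosure is not specified, and the terrain functions below are purely illustrative.

import numpy as np
import matplotlib.pyplot as plt

# Reference-image-sized grid: x = across track, y = along track (metres).
x = np.linspace(0, 200, 400)
y = np.linspace(0, 500, 1000)
X, Y = np.meshgrid(x, y)

# Hypothetical floor elevation (m) and reflectance fields.
elevation = 2.0 * np.sin(X / 30.0) + 1.5 * np.cos(Y / 60.0)
reflectance = 0.5 + 0.4 * np.exp(-((X - 120) ** 2 + (Y - 250) ** 2) / 5000.0)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.contour(X, Y, elevation, levels=10)      # elevation contour map (level spacing = contour interval)
ax1.set_title("Elevation contours")
ax2.contourf(X, Y, reflectance, levels=10)   # reflectance contour map
ax2.set_title("Reflectance contours")
plt.show()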
[67] In another embodiment, the modelling unit 102 can include an object synthesizing unit 218 that can create underwater objects, including but not limited to mine-like objects, from the second set of attributes extracted by the attribute extraction unit 212. In an exemplary embodiment, the mine-like objects can be from an underwater terrain. The object synthesising unit 218 can further create the mine-like objects using a specified dimension coefficient and a specified reflection coefficient extracted from the second set of attributes.
[68] In another embodiment, the modelling unit 102 can include an object shadow model generation unit 220 that can generate shadows of underwater objects, including but not limited to mine-like objects, from the second set of attributes extracted by the attribute extraction unit 212. In an exemplary embodiment, the object shadow model generation unit 220 can generate the shadow of a mine-like object and can map the shadow on the reference image generated by the reference image generation unit 214, based on object height from the underwater floor, object position on the underwater floor, position, height and beam angle of the transducer, elevation of the underwater floor, and the like, extracted from the second set of attributes by the attribute extraction unit 212. The object shadow model generation unit can also darken the region that is out of focus with respect to the across-track beam angle.
[69] In another embodiment, the modelling unit 102 can include a shadow signature mask generation unit 222 that can generate shadow signature mask with any or a combination of dead zone, highlight and shadow ratio based on any or a combination of underwater floor elevation, object dimension, and object position with respect to the transducer obtained from the extracted second set of attributes. The shadow signature mask generation unit 222 can generate shadow mask for objects falling in shadow zone of another object based on a simple trigonometric relation in terms of object dimension, object position with respect to transducer and reflectance.
[70] In yet another embodiment, the modelling unit 102 can include an underwater floor map modelling unit 224 that can synthesize an underwater floor terrain and map it on the reference image generated by the reference image generation unit 214. In an exemplary embodiment, the underwater floor map modelling unit 224 can synthesize an underwater terrain model using any or a combination of the elevation contour maps and the reflectance contour maps generated by the contour map generation unit 216 and the shadows of terrain generated by the object shadow model generation unit 220; the terrain model can be pixelated and mapped onto the reference image generated by the reference image generation unit 214.
[71] FIG. 3 illustrates an exemplary representation of a flow diagram associated with the method 300 for facilitating mapping of underwater topography in accordance with an embodiment of the present disclosure.
[72] The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method or alternate methods. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method may be considered to be implemented in the above described system.
[73] In an embodiment, in a method for facilitating modelling of underwater topography, the method 300 may include, at block 302, a step of obtaining a set of data packets from a first computing device amongst the one or more computing devices associated with a side scan SONAR system, where the set of data packets correspond to underwater environment parameters; and, at block 304, a step of extracting a first set of attributes from the obtained set of data packets, where the first set of attributes correspond to underwater floor terrain topology parameters.
[74] Further, the method may include, at block 306, a step of extracting a second set of attributes from the set of data packets, where the second set of attributes correspond to attributes associated with underwater objects; at block 308, a step of generating a reference image based on the extracted first and second set of attributes; and, at block 310, a step of mapping an underwater terrain model based on any or a combination of the extracted first and second set of attributes on the generated reference image.
[75] FIG. 4 illustrates a generic block diagram representation of the process flow overview of the method in accordance with an embodiment of the present disclosure.
[76] As illustrated, in an implementation, the process can include reference image generation at block 402, underwater terrain synthesis at block 404, and mine-like object with shadow synthesis at block 406. Reference image generation at block 402 can generate a reference image (referred to as IMr hereinafter) whose dimensions are based on any or a combination of resolution along track (referred to as Ral hereinafter), resolution across track (referred to as Rac hereinafter), maximum achievable swath range (referred to as Ls hereinafter), ship, vessel or tow speed (referred to as Vs hereinafter) and survey time (referred to as Ts hereinafter). The generated reference image IMr can be Ls wide and (Vs*Ts) long, with the number of pixels based on Ral and Rac, as sketched below. Underwater terrain synthesis at block 404 can generate an underwater terrain image (referred to as IMsf hereinafter) along with a shadow elevation map (referred to as IMsd hereinafter) and can map it on IMr.
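A minimal sketch of the reference image sizing described above: IMr is Ls wide and (Vs*Ts) long, with pixel counts set by the across- and along-track resolutions Rac and Ral. Variable and function names are illustrative, not taken from the disclosure.

import numpy as np

def make_reference_image(Ls_m, Vs_mps, Ts_s, Ral_m, Rac_m):
    width_m = Ls_m                            # swath width illuminated across track
    length_m = Vs_mps * Ts_s                  # distance covered along track during the survey
    n_across = int(round(width_m / Rac_m))    # pixels across track
    n_along = int(round(length_m / Ral_m))    # pixels along track
    return np.zeros((n_along, n_across), dtype=np.float32)  # blank IMr

IMr = make_reference_image(Ls_m=200.0, Vs_mps=2.0, Ts_s=600.0, Ral_m=0.5, Rac_m=0.25)
print(IMr.shape)  # (2400, 800)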
[77] FIG. 5 illustrates an exemplary flow diagram illustrating a working example of shadow generation in accordance with an embodiment of the present disclosure.
[78] As illustrated, in an implementation, the exemplary flow can include pixel elevation estimation at block 502, which can be provided to shadow map estimation at block 504. If the object is found to fall in a shadow zone at block 506, height difference estimation can be done at block 508. If the height difference is negative at block 510, it can be concluded at block 512 that neither shadow nor object is generated. If the height difference is positive at block 514, the shadow can be generated at block 516. If the shadow map estimation at block 504 determines that the object is not in a shadow zone, the shadow is likewise generated at block 516. A sketch of this decision flow is given below.
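A minimal sketch of the FIG. 5 decision flow: an object lying in another object's shadow zone only has its shadow (and the object itself) generated when its height difference hd over the occluding shadow is positive. Function and parameter names are hypothetical.

def generate_object_shadow(obj_height, shadow_height_at_obj, in_shadow_zone):
    """Return True if the object and its shadow should be rendered."""
    if not in_shadow_zone:                    # block 504: object not in any shadow zone
        return True                           # block 516: generate shadow
    hd = obj_height - shadow_height_at_obj    # block 508: height difference estimation
    if hd <= 0:                               # block 510: negative hd
        return False                          # block 512: no shadow and no object generated
    return True                               # blocks 514/516: positive hd, generate shadow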
[79] FIG. 6 illustrates an exemplary representation of shadow generation in accordance with an embodiment of the present disclosure.
[80] As illustrated, in an exemplary implementation, a transducer 604 can be attached to a towfish 602 with a depression angle 610 (referred to as Da 610 hereinafter) and an across-track beamwidth 612 (referred to as ACbw 612 hereinafter). The transducer 604 can illuminate a maximum swath Ls 614. The region beyond Ls 642 can be referred to as IMmn and can be darkened. The nadir region can be referred to as Ln 646, which can also be dark and can be synthesized separately in a later stage. A first ray 620 illustrated in FIG. 6 can make an angle perpendicular to the transducer 604. For a first object 622 illustrated in FIG. 6, the total shadow length can be derived from the geometry of the transducer and the object, as outlined below.
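The equation referenced above is not reproduced in the text; as an assumption, for a flat floor the standard side scan shadow geometry follows from similar triangles. With transducer altitude $H_t$ above the floor, object height $h_o$, and horizontal range $r_o$ from the transducer nadir to the far edge of the object, the ray grazing the top of the object reaches the floor so that

$L_{shadow} = \dfrac{h_o \, r_o}{H_t - h_o}$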
[81] In another implementation, the modelling unit 102 can synthesize shadows of objects which fall in the shadow of other objects, based on object height and on the shadow height of the terrain determined by the height and position of the terrain feature with respect to the position of the transducer (referred to as IMsd hereinafter). As illustrated, in an exemplary implementation, shadow generation is shown for a third object 630 which can fall in the shadow zone 638 of a second object 636 but has a height difference 626 (referred to as hd 626 hereinafter), decided by the height of the third object 630 and the position of the third object 630 with respect to the shadow zone 638 of the second object 636. For the third object 630, hd can be calculated from these heights and positions, as sketched below.
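The equation referenced above is likewise not reproduced here; under the same flat-floor assumption, the ray grazing the top of the second object 636 (height $h_2$ at range $r_2$) has, at the range $r_3$ of the third object 630, a height above the floor of $H_t - (H_t - h_2)\,r_3 / r_2$, so one plausible form of the height difference is

$h_d = h_3 - \left( H_t - \dfrac{(H_t - h_2)\, r_3}{r_2} \right)$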
[82] In yet another implementation, if hd is positive, the shadow for the third object 630 can be generated based on the height and position of the third object 630; otherwise, neither the third object 630 nor the shadow 640 of the third object 630 is generated. A fourth object 632 can fall in the shadow zone 640 of the third object 630, and if its hd is negative, neither the fourth object 632 nor its shadow is generated.
[83] FIGs. 7A-7B illustrate exemplary representations of contour maps in accordance with an embodiment of the present disclosure.
[84] As illustrated, in an implementation, the elevation of each pixel in IMsf can be estimated with respect to the elevation map, and a shadow elevation map (referred to as IMsd hereinafter) can be generated. IMsd can contain the shadow height of the terrain based on the height and position of each terrain feature with respect to the position of the transducer. As illustrated in FIG. 7A, IMsf, along with the shadow elevation map IMsd, is therefore generated over IMr based on the elevation contour map. In yet another implementation, the reflectance of each pixel in IMsf can be estimated with respect to the reflectance map, as illustrated in FIG. 7B.
[85] FIGs. 8A-8B illustrate exemplary representations of object and shadow model generation in accordance with an embodiment of the present disclosure.
[86] The modelling unit 102 can synthesize mine-like objects with shadows based on the details of the mine, the mine location and the mine reflectance with respect to IMsf (referred to as Mds hereinafter) and IMsd, and incorporate Mds and IMsd to generate IMmn. The shadow of each individual object can be generated based on the position and height of the plurality of objects with respect to the acoustic source. FIG. 8A depicts the variation in shadow dimension in terms of the position and dimension of the object. In an implementation, a mine-like object 802 (referred to as M1 802 hereinafter) can have a reflectance lower than that of a second mine-like object 804 (referred to as M2 804 hereinafter), but shadows can be generated for both, which makes detection of objects possible using shadow dimension. A third mine-like object 806 (referred to as M3 806 hereinafter) and a fourth mine-like object 808 (referred to as M4 808 hereinafter) can be located close together, but M3 806 has a longer shadow than M4 808 due to the difference in height dimension. Shadow generation for each object consists of shadow mask generation. As illustrated in FIG. 8B, a typical shadow signature mask generated for a mine-like object is shown. In an implementation, the shadow signature mask can include at least three zones: a highlight zone 812, a dead zone 816 and a shadow zone 814. The dimensions of these zones can be based on underwater floor elevation, object dimension, and object position with respect to the transducer 604. The highlight zone 812 can reflect the reflectance of the object, so it can have the same dimension as the object. In this model, one-fifth of the shadow length nearest the object can be considered the dead zone and the rest the shadow zone, as sketched below.
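A minimal sketch of the shadow signature mask partition described above: the highlight zone spans the object's own extent, the first fifth of the shadow length (nearest the object) is treated as the dead zone, and the remainder as the shadow zone. Lengths are in across-track pixels; the function and label names are illustrative, not from the disclosure.

def shadow_signature_mask(object_len_px, shadow_len_px):
    dead_len = shadow_len_px // 5                 # one-fifth of the shadow nearest the object
    mask = (["highlight"] * object_len_px         # highlight zone: same extent as the object
            + ["dead"] * dead_len                 # dead zone
            + ["shadow"] * (shadow_len_px - dead_len))  # remaining shadow zone
    return mask

print(shadow_signature_mask(object_len_px=4, shadow_len_px=10))
# ['highlight', 'highlight', 'highlight', 'highlight', 'dead', 'dead',
#  'shadow', 'shadow', 'shadow', 'shadow', 'shadow', 'shadow', 'shadow', 'shadow']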
[87] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE PRESENT DISCLOSURE
[88] Some of the advantages of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[89] The present disclosure provides for a system and method to synthesize underwater environment model for a side scan sonar.
[90] The present disclosure provides for a system and method to facilitate synthesizing of base underwater floor image, using reference image based on spatio-temporal domain specification and generate an underwater floor model.
[91] The present disclosure provides for a system and method to facilitate synthesizing of underwater floor terrain using contour maps having elevation and reflectance data with respect to reference image and generate an underwater floor model.
[92] The present disclosure provides for a system and method to facilitate synthesizing of mine like objects based on object dimension and reflectance details provided in dataset and generate an underwater floor model.
[93] The present disclosure provides for a system and method to facilitate synthesizing of shadow of underwater floor terrain based on terrain elevation data and object dimension and position and generate an underwater floor model.
[94] The present disclosure provides for a system and method to facilitate addition of features such as faults, craters, landslides, sediment paths, rocks from underwater terrain images and the like to incorporate in the generated underwater model.
[95] The present disclosure provides for a system and method to facilitate synthesizing of three-dimensional to two-dimensional underwater floor terrain, and generate an underwater floor model.

Documents

Application Documents

# Name Date
1 202041055599-STATEMENT OF UNDERTAKING (FORM 3) [21-12-2020(online)].pdf 2020-12-21
2 202041055599-POWER OF AUTHORITY [21-12-2020(online)].pdf 2020-12-21
3 202041055599-FORM 1 [21-12-2020(online)].pdf 2020-12-21
4 202041055599-DRAWINGS [21-12-2020(online)].pdf 2020-12-21
5 202041055599-DECLARATION OF INVENTORSHIP (FORM 5) [21-12-2020(online)].pdf 2020-12-21
6 202041055599-COMPLETE SPECIFICATION [21-12-2020(online)].pdf 2020-12-21
7 202041055599-Proof of Right [17-03-2021(online)].pdf 2021-03-17
8 202041055599-POA [15-10-2024(online)].pdf 2024-10-15
9 202041055599-FORM 13 [15-10-2024(online)].pdf 2024-10-15
10 202041055599-AMENDED DOCUMENTS [15-10-2024(online)].pdf 2024-10-15
11 202041055599-FORM 18 [04-12-2024(online)].pdf 2024-12-04