
System And Method For Maritime Surveillance Using Electro Optic Sensors

Abstract: The present disclosure relates to a system (100) for monitoring the maritime environment. The system includes a plurality of electro-optic sensors (110) with a pedestal installed on a tower to capture video sequences of the maritime environment. A plurality of remote stations (102) is coupled to the electro-optic sensors through a network and obtains the video sequences from them. A video processing unit (116) is configured to process the video sequence by applying video enhancement, video stabilization, single target tracking and panorama generation on the captured video sequence based on the selection of the operator from the command and control center, and to transmit the processed output video sequence through a communication link (118) to a video server located at the command and control center (104, 106, 108).


Patent Information

Application #: 202341024948
Filing Date: 31 March 2023
Publication Number: 40/2024
Publication Type: INA
Invention Field: ELECTRONICS

Applicants

Bharat Electronics Limited
Corporate Office, Outer Ring Road, Nagavara, Bangalore - 560045, Karnataka, India.

Inventors

1. JYOTHESWAR JETTIPALLI
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
2. CHAVELI RAMESH
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
3. RAGHU M
D&E CSS RFCS NS-R&FCS, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
4. SRIHARI B R
D&E CSS RFCS NS-R&FCS, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
5. SHRIKANT TIRKI
BSTC Business R&D, Software Technology Center, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
6. NIKHIL NANDAN VERMA
D&E Eng Unmanned System, Strategic Communication and Unmanned System, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.
7. SRAVANTHY E
BSTC Business R&D, R&D, Software Technology Center, Bharat Electronics Limited, Jalahalli P.O., Bangalore - 560013, Karnataka, India.

Specification

Description:

TECHNICAL FIELD
[0001] The present disclosure relates, in general, to maritime monitoring, and more specifically, relates to a system and method for maritime surveillance using electro-optic sensors.

BACKGROUND
[0002] The prior art discloses different methods and systems for maritime surveillance, including threat detection, warning for detected threats to maritime commercial assets, and maritime security; the infrastructure for a complete system is also described. An example of such a system is recited in the patent US8581688B2 titled “Coastal Monitoring Techniques”, which discloses a method and system for monitoring coastlines that includes a sensor arrangement along the coastline, data collection for monitoring via the sensors, analysis of the collected data to detect the presence of any threat near the coastline, and transmission of the analysed data or signal to a command and monitoring centre with sensor identification and location. Another example is recited in the patent US10121078B2 titled “Methods and System for detection of foreign objects in maritime environments”, which discloses techniques for detecting foreign objects in a region of interest in maritime environments.
[0003] Another example is recited in WO 2009/066988A2 titled “Device and method for surveillance system”, which discloses a device and method for a surveillance system utilizing a wide-view camera to serve as a guiding system for determining the area of interest. Yet another example is recited in the patent US 2014/0314270 A1 titled “Detection of floating objects in maritime video using a mobile camera”, which discloses a method and system for detecting floating objects in maritime video. The horizon is detected within the video, the sky and water are modelled, and objects that are neither water nor sky are then detected within the video.
[0004] Therefore, it is desired to overcome the drawbacks, shortcomings, and limitations associated with existing solutions, and develop a maritime surveillance system with video processing methods for better viewing, monitoring, and controlling from remote operating locations.

OBJECTS OF THE PRESENT DISCLOSURE
[0005] An object of the present disclosure is to provide a system and method for maritime surveillance using electro-optic sensors.
[0006] Another object of the present disclosure is to provide a system that is robust to intensity variations.
[0007] Another object of the present disclosure is to provide a system that realizes a maritime surveillance system with video processing methods for better viewing, monitoring, and controlling from remote operating locations.
[0008] Another object of the present disclosure is to provide a system with an accurate and highly scalable architecture.
[0009] Another object of the present disclosure is to provide a system with low complexity and fast processing.
[0010] Another object of the present disclosure is to provide a system having modules that can be used interchangeably as per operator selection.
[0011] Yet another object of the present disclosure is to provide a system that has application in maritime security monitoring.

SUMMARY
[0012] The present disclosure relates, in general, to maritime monitoring, and more specifically, relates to a system and method for maritime surveillance using electro-optic sensors. The main objective of the present disclosure is to overcome the drawbacks, limitations, and shortcomings of existing systems and solutions by providing a maritime surveillance system with video processing methods for better viewing, monitoring, and controlling from remote operating locations.
[0013] The present disclosure relates to a system and method for maritime surveillance of coastal regions using electro-optic sensors, together with a framework of hierarchical control and data flow architecture. A chain of electro-optic (EO) sensors is installed along the coastline to provide coverage and enable surveillance of high-risk areas, areas of high traffic density and areas of fishing-vessel density. The EO system is capable of being remotely controlled and operated from command and control centers through sensor management software. The EO system video output is taken through a video analytics module, wherein video processing operations such as contrast improvement, haze removal, video stabilization, panorama generation, naval target tracking and video encoding & streaming are performed. The architecture to realize a maritime surveillance system is disclosed with video processing methods for better viewing, monitoring, and controlling from remote operating locations.
[0014] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The following drawings form part of the present specification and are included to further illustrate aspects of the present disclosure. The disclosure may be better understood by reference to the drawings in combination with the detailed description of the specific embodiments presented herein.
[0016] FIG. 1 illustrates an exemplary conceptual view of maritime monitoring system, in accordance with an embodiment of the present disclosure.
[0017] FIG. 2 illustrates an exemplary block diagram of maritime monitoring system at remote station, in accordance with an embodiment of the present disclosure.
[0018] FIG. 3 illustrates an exemplary flow chart of sensor management software application, in accordance with an embodiment of the present disclosure.
[0019] FIG. 4 illustrates an exemplary flow chart of a method for monitoring the maritime environment, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[0020] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0021] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0022] The present disclosure relates, in general, to maritime monitoring, and more specifically, relates to a system and method for maritime surveillance using electro-optic sensors.
[0023] The present disclosure relates to a system and method for maritime monitoring using electro-optic sensors and also discloses the video processing, encoding and streaming, and the architecture for data flow and control of the system between remote stations (RS), remote operating stations (ROS), regional operating centres (ROC) and the control centre (CC). An electro-optic sensor (CCD and TI) mounted on the pedestal captures the video. The captured video is processed in a computing device called a video processing unit (VP). The electro-optic sensor with pedestal and the computing device are located at the remote station.
[0024] The maritime monitoring system is capable of being remotely controlled and operated using a sensor management software application from command & control centers located at District Headquarters, which are called Remote Operating Stations (ROS), and regional headquarters of the Coast Guard, which are known as Regional Operating Centres (ROC). Finally, video streams from all networked cameras may be available at the Control Centre (CC), which is the National Headquarters. The commands for selection of video processing elements are issued from the command & control centers situated at ROS, ROC and CC using the sensor management software application. Based on the mode of command, the video processing element performs the operation and transmits the processed video data to all command and control centers over a communication link. The processed/improved video is updated on the screen, with enhancements and target information, for the operators at ROS, ROC, and CC.
[0025] The present disclosure relates to a system and method for maritime surveillance using video processing-based techniques. This method and system include video processing elements such as video-based stabilization, video enhancement by improving contrast and removing haze using atmospheric corrections in an image scene/sequence, and single target tracking for CCD and thermal imager (TI) videos. Each module works independently and can be interchanged by module selection by the operator at ROS/ROC/CC. Apart from these methods, the panorama generation method aligns images using an image blending approach in sector view mode or 360° view mode. A video encoder compresses the processed video data and transmits it through the real-time streaming protocol to ROS/ROC/CC over Ethernet for monitoring.
[0026] The present disclosure relates to a system and method for maritime surveillance using video processing-based techniques. This method and system also include command and control centers located at ROS, ROC, and CC. These centers have a similar set-up, which includes a video decoder module, a command & control module and a user interface. The Remote Stations (RS), Remote Operating Stations (ROS) and Regional Operating Centres (ROC) are scalable in the networked architecture. Remote Station (RS) sensors are controlled from command and control centers based on the authority level of the control module. The control center (CC) has access to video data from all RSs, and operators at the CC can control all EO sensors. ROC and ROS operators have access to data and control of EO sensors only from RSs within their reporting hierarchy. A group of ROSs reports to a ROC and all ROCs report to the CC as per the hierarchical reporting architecture.
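The following is only an illustrative sketch of the hierarchical access rule described above; the station identifiers, reporting chain and function names are assumptions for the sketch and not part of the disclosure.

    # Hypothetical sketch of the hierarchical control-authority rule described above.
    # A CC controls every RS; a ROC or ROS controls only RSs that report up to it.

    REPORTS_TO = {            # hypothetical reporting chain: RS -> ROS -> ROC -> CC
        "RS-1": "ROS-A", "RS-2": "ROS-A", "RS-3": "ROS-B",
        "ROS-A": "ROC-1", "ROS-B": "ROC-1",
        "ROC-1": "CC",
    }

    def chain_of(node):
        """Return the list of centers above a remote station, nearest first."""
        chain = []
        while node in REPORTS_TO:
            node = REPORTS_TO[node]
            chain.append(node)
        return chain

    def may_control(center, remote_station):
        """A center may control an RS only if it appears in that RS's reporting chain."""
        return center == "CC" or center in chain_of(remote_station)

    if __name__ == "__main__":
        print(may_control("ROS-A", "RS-1"))  # True: RS-1 reports to ROS-A
        print(may_control("ROS-B", "RS-1"))  # False: outside the reporting hierarchy
        print(may_control("CC", "RS-3"))     # True: CC has access to all RSs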
[0027] According to one embodiment of the invention, video enhancement consists of three modules: haze/fog removal, contrast improvement using adaptive histogram equalization, and gamma correction. These modules are applied sequentially to an input image sequence. A modified dark channel prior approach is used to remove the haze/fog in a single image, and a contrast improvement approach is then applied to improve the contrast of the image. Finally, a gamma correction method is applied to enhance the color information in the image. The chain is therefore suitable for improving the local contrast and enhancing the definition of edges in each region of an image. These operations are controlled by the operator with the help of a threshold control scroll bar from the command and control centers (ROS/ROC/CC), and the improved/enhanced video is updated on the screen accordingly.
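As an illustrative sketch only, the three-stage enhancement chain could be realized along the following lines using OpenCV and NumPy; the simplified dark-channel dehazing shown here stands in for the disclosed modified dark channel prior, and all parameter values are placeholder assumptions rather than the disclosed settings.

    import cv2
    import numpy as np

    def dehaze(img, patch=15, omega=0.95, t_min=0.1):
        """Simplified dark-channel-prior dehazing (illustrative, not the modified method)."""
        dark = cv2.erode(img.min(axis=2), np.ones((patch, patch), np.uint8))
        # Atmospheric light: mean colour of the brightest 0.1% dark-channel pixels.
        n = max(1, dark.size // 1000)
        idx = np.unravel_index(np.argsort(dark, axis=None)[-n:], dark.shape)
        A = img[idx].mean(axis=0)
        t = 1.0 - omega * (img / A).min(axis=2)          # transmission estimate
        t = np.clip(t, t_min, 1.0)[..., None]
        return np.clip((img - A) / t + A, 0, 255).astype(np.uint8)

    def enhance_contrast(img, clip=2.0, grid=(8, 8)):
        """Adaptive histogram equalization (CLAHE) on the luminance channel."""
        l, a, b = cv2.split(cv2.cvtColor(img, cv2.COLOR_BGR2LAB))
        l = cv2.createCLAHE(clipLimit=clip, tileGridSize=grid).apply(l)
        return cv2.cvtColor(cv2.merge((l, a, b)), cv2.COLOR_LAB2BGR)

    def gamma_correct(img, gamma=1.2):
        """Gamma correction via a lookup table."""
        lut = ((np.arange(256) / 255.0) ** (1.0 / gamma) * 255).astype(np.uint8)
        return cv2.LUT(img, lut)

    def enhance_frame(frame):
        """Sequential chain: haze removal -> contrast improvement -> gamma correction."""
        return gamma_correct(enhance_contrast(dehaze(frame.astype(np.float64))))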
[0028] According to another embodiment of the invention, the video stabilization process is developed for the naval scenario. The global motion of consecutive frames is estimated in terms of motion vectors, and a novel filter-based correction method is implemented to correct the trajectory of the generated motion vectors. An image warping or image transformation process provides a stable view of the scene. It compensates for camera platform vibration due to wind and camera pedestal movement. This process ensures the robust and real-time performance of the video stabilization process.
[0029] According to another exemplary embodiment of the invention, single target tracking is implemented to track naval targets like ships, vessels, and boats. The tracking module generates the target position or the pixel error and passes it to the sensor management software to move the pedestal accordingly. A target-matching algorithm and a prediction filter are used to find the target direction and track positions. Strong features of the target are determined and used as flow points to implement target tracking on the Video Processing (VP) unit. Based on directed pan/tilt motion and the pixel-to-pixel relationship, an estimate is computed of whether target motion points fit the background motion, dynamic motion, or noise.
[0030] In order to view smooth variation in the video scene, a two-dimensional position and velocity vector is used with a prediction filter to predict the next required position of the camera, so that the object of interest stays in the centre of the presented video. Acquired images are processed in parallel, achieving high-frame-rate processing; the result is real-time tracking of generic moving targets on a coastline monitoring system using a pan & tilt unit, where no training is required and camera motion is estimated with higher accuracy compared with image correlation.
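As an illustration of the two-dimensional position/velocity prediction described above, a constant-velocity Kalman filter can be sketched as follows; the state model, noise values and class name are assumptions for this sketch rather than the disclosed prediction filter.

    import numpy as np

    class ConstantVelocityPredictor:
        """Illustrative 2-D position/velocity Kalman filter (state: x, y, vx, vy)."""

        def __init__(self, dt=1.0, process_var=1e-2, meas_var=1.0):
            self.x = np.zeros(4)                      # state estimate
            self.P = np.eye(4) * 1e3                  # state covariance
            self.F = np.array([[1, 0, dt, 0],         # constant-velocity transition
                               [0, 1, 0, dt],
                               [0, 0, 1,  0],
                               [0, 0, 0,  1]], dtype=float)
            self.H = np.array([[1, 0, 0, 0],          # only position is measured
                               [0, 1, 0, 0]], dtype=float)
            self.Q = np.eye(4) * process_var
            self.R = np.eye(2) * meas_var

        def predict(self):
            """Predict the next target centroid so the camera can be steered ahead of it."""
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            return self.x[:2]

        def update(self, measured_xy):
            """Correct the estimate with the centroid measured by the tracking module."""
            z = np.asarray(measured_xy, dtype=float)
            y = z - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(4) - K @ self.H) @ self.P
            return self.x[:2]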
[0031] According to another exemplary embodiment of the invention, the panorama generation module forms a panoramic view of captured images in sector view mode or 360° view mode. The camera mechanism is mounted on a fixed pedestal, which may rotate in clockwise and anti-clockwise directions with a fixed rotating speed (rpm) and a fixed zoom level during panorama generation. The rotation speed and zoom level can be changed by the operator controls. Once the speed and zoom are set, panorama generation is initiated as per the mode selection (sector mode/360° view mode). This module generates a pixel shift parameter for each scenario of pedestal speed and camera zoom level. With the generated pixel shift, an image blending approach is used to stitch the captured images to form a panorama in sector view mode or 360° view mode. The present disclosure is described in enabling detail in the following examples, which may represent more than one embodiment of the present disclosure.
[0032] The advantages achieved by the system of the present disclosure will be clear from the embodiments provided herein. For example, the system is robust to intensity variations and realizes a maritime surveillance system with video processing methods for better viewing, monitoring, and controlling from remote operating locations. The system has an accurate and highly scalable architecture, low complexity and fast processing, and its modules can be used interchangeably as per operator selection. The description of terms and features related to the present disclosure shall be clear from the embodiments that are illustrated and described; however, the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents of the embodiments are possible within the scope of the present disclosure. Additionally, the invention can include other embodiments that are within the scope of the claims but are not described in detail in the following description.
[0033] FIG. 1 illustrates an exemplary conceptual view of maritime monitoring system, in accordance with an embodiment of the present disclosure.
[0034] Referring to FIG. 1, the maritime monitoring system 100 (also referred to as system 100 herein) is shown. The system 100 can include remote stations 102, Remote Operating Stations (ROS) 104, Regional Operating Centres (ROC) 106, a Control Center (CC) 108, imaging sensors 110, multiple ships 112, a tower 114, a video processing unit 116, a communication link 118, a sensor manager 120, a coastline 122, video servers (124-1 to 124-3, which are collectively referred to as video server 124 herein) and a water body 126.
[0035] The imaging sensors 110 (also referred to as a plurality of electro-optic sensors 110 herein) with a pedestal are installed on the tower to capture video sequences of the maritime environment. The remote stations 102 are coupled to the electro-optic sensors 110 through a network and obtain the video sequences from the electro-optic sensors. The video processing unit 116 is positioned at the remote station 102 and is configured to process the video sequence by applying video enhancement, video stabilization, single target tracking and panorama generation on the captured video sequence based on the selection of the operator from the command and control center.
[0036] The video processing unit 116 can perform the selected video processing on the input video sequence sequentially and interchangeably based on the choice of the operator and transmit the processed output video sequence through the communication link to a video server located at the command and control center.
[0037] The video processing unit 116 can include a video enhancement unit 206, a video stabilization unit 208, a single target tracking unit 210, a panorama generation unit 212 and a video encoder 214, which are configured to perform their operations on an input video sequence independently. The video enhancement unit 206, video stabilization unit 208, single target tracking unit 210, panorama generation unit 212 and video encoder 214 do not depend on each other to perform their functions on the input video sequence.
[0038] The video processing unit 116 applies video enhancement, video stabilization, single target tracking and panorama generation on the captured input video sequence using the video enhancement unit 206, video stabilization unit 208, single target tracking unit 210 and panorama generation unit 212, respectively. The video enhancement, video stabilization, single target tracking and panorama generation are configured to be performed in an interchangeable manner, as per the operator’s preference, by cascading one another while performing their functionalities independently of each other, to facilitate flexibility in the selection of the video processing to apply on the input video sequence.
[0039] The video processing unit 116 is coupled to the video encoder 214, which compresses the video stream and transmits it to the command and control centers over the communication link 118. The command and control centers can include a sensor management application to decode and display the processed video sequence as an output to the operator on a video wall, wherein the video processing unit 116 is initiated and controlled by the operator from the command and control center. The command and control centers can include the remote operating station 104, the regional operating center 106 and the control center 108.
[0040] The system 100 can include two or more imaging sensors 110 for tracking of multiple ships 112. Several remote stations 102 are connected to the network and to the two or more imaging sensors 110 (also referred to as electro-optic sensors herein) for obtaining remote site video data from the two or more imaging sensors 110. The processor is configured to process the remote site video data and input the processed video data to the network for transmission. In an exemplary embodiment, the two or more imaging sensors 110, such as a charge-coupled device (CCD) camera for day vision and a thermal imager (TI) camera for night vision, are mounted on a pedestal, which is installed on a tower 114.
[0041] The camera feeds and pedestal controls are connected to a control unit and the video processing unit 116 present at the remote station 102. The processed video information is transmitted over the communication link 118 to the sensor management software application residing at the command and control centers, namely the remote operating stations (ROS) 104, regional operating centres (ROC) 106 and control center (CC) 108.
[0042] The referred diagram includes the sensor manager 120, the coastline 122 and naval targets 112 like ships, boats, and vessels in the water body 126. Although the exemplary embodiment shows a complete view of the video processing elements for the maritime monitoring system, one of the key features of this system is that its processing is realistic in terms of software reliability and robustness.
[0043] FIG. 1 depicts the data and control flow diagram of the embodiments between the sensors and the command and control centres. In this figure, imaging sensors such as a thermal imager and a CCD sensor are mounted on a pan and tilt pedestal. The data captured by the imaging sensors 110 are transmitted to the video processing unit 116 through the control unit. The imaging sensors 110 with pedestal and the video processing unit 116 are installed at the remote station 102.
[0044] The video processing unit 116 processes the captured video data and transmits it through the Ethernet communication link 118 to the sensor management software application running on the sensor manager 120 residing at the Remote Operating Stations (ROS) 104. The same video streams are transmitted to the Regional Operating Centres (ROC) 106 and the Control Center (CC) 108. The sensor management software application sends the commands to move the pedestal in the pan and tilt directions. The pedestal controls are also communicated over the Ethernet communication link 118 as shown in FIG. 1.
[0045] Thus, the present invention overcomes the drawbacks, shortcomings, and limitations associated with existing solutions, and provides a system that is robust to intensity variations and realizes a maritime surveillance system with video processing methods for better viewing, monitoring, and controlling from remote operating locations. The system has an accurate and highly scalable architecture, low complexity and fast processing, and its modules can be used interchangeably as per operator selection.
[0046] FIG. 2 illustrates an exemplary block diagram of maritime monitoring system at remote station, in accordance with an embodiment of the present disclosure.
[0047] The disclosed system 100 can include a capturing element which is mounted on the pedestal 202 to acquire the video sequence. The control unit 204 acquires the video from the sensor, and pedestal controls are transmitted to the pedestal 202. The captured video sequence is fed to the video processing module/unit 116, which in turn performs video enhancement 206, video stabilization 208, single target tracking 210 and panorama generation 212. The modules are initiated as per the operator’s selection using the sensor management software application from the command and control centers located at ROS/ROC/CC (104, 106, 108). The video processing operations can be performed in an interchangeable manner according to the commands received from the sensor management software application over the Ethernet communication link 118. The system 100 has one exception: only one module among video stabilization and single target tracking should be performed on an input video sequence at the same time. This way of operation provides more robustness and stability to the system while operating the video processing modules in an interchangeable manner.
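A minimal sketch of how operator-selected modules might be chained while enforcing this exception is given below, assuming each module is a callable that takes a frame and returns a frame; the module names are illustrative only, not the disclosed interfaces.

    # Illustrative module chaining for the video processing unit, assuming each
    # selected module is a callable of the form frame -> frame.

    def build_pipeline(selections, modules):
        """selections: ordered list of module names chosen by the operator.
        modules: dict mapping names to frame-processing callables.
        Enforces the stated exception: video stabilization and single target
        tracking must not be active on the same input sequence at the same time."""
        if "stabilization" in selections and "tracking" in selections:
            raise ValueError("Only one of stabilization or tracking may be enabled at a time")
        return [modules[name] for name in selections]

    def process_frame(frame, pipeline):
        """Apply the selected modules sequentially (and hence interchangeably)."""
        for module in pipeline:
            frame = module(frame)
        return frame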
[0048] In an aspect, the image capturing element on the pedestal 202 can be configured to capture the frames from the thermal imager or the CCD sensor based on the input from the sensor management software application at the command and control centers (ROS/ROC/CC) (104, 106, 108). The selected sensor video signal data is transmitted to the sensor management software using the Ethernet network link 118. Based on the inputs received from the sensor management software, the selected video processing modules, such as video enhancement 206, video stabilization 208, single target tracking 210 and panorama generation 212, may be performed on the input video sequence captured from the sensor at the remote station (RS) 102. These video processing modules 116 perform their operations sequentially as selected by the operator.
[0049] Once the selected operations are performed on the video sequence, the video encoder 214 compresses the processed video sequence and streams it to the command and control centers located at ROS/ROC/CC (104, 106, 108). While generating a panorama, the video enhancement module can be enabled by the operator to view a better-quality video sequence on the sensor management software application terminal.
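For illustration only, encoding and real-time-streaming-protocol (RTSP) publication of processed frames could be sketched as follows using ffmpeg as the encoder; the tool choice, URL, port and codec settings are assumptions rather than the disclosed video encoder 214, and an RTSP server must already be listening at the given URL.

    import subprocess
    import cv2

    def open_rtsp_encoder(width, height, fps=25, url="rtsp://127.0.0.1:8554/eo_stream"):
        """Start an ffmpeg process that H.264-encodes raw BGR frames from stdin
        and publishes them over RTSP (an RTSP server must be listening at `url`)."""
        cmd = [
            "ffmpeg", "-loglevel", "error",
            "-f", "rawvideo", "-pix_fmt", "bgr24",
            "-s", f"{width}x{height}", "-r", str(fps), "-i", "-",
            "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
            "-f", "rtsp", url,
        ]
        return subprocess.Popen(cmd, stdin=subprocess.PIPE)

    if __name__ == "__main__":
        cap = cv2.VideoCapture(0)                      # stand-in for the EO sensor feed
        ok, frame = cap.read()
        if ok:
            h, w = frame.shape[:2]
            enc = open_rtsp_encoder(w, h)
            while ok:
                # the processed frame (enhancement/stabilization/...) would be written here
                enc.stdin.write(frame.tobytes())
                ok, frame = cap.read()
            enc.stdin.close()
            enc.wait()
        cap.release()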
[0050] In an aspect of FIG. 2, the embodiment of the invention is capable of processing the captured video in an interchangeable manner. The input to the video processing modules (video enhancement 206, video stabilization 208, single target tracking 210 and panorama generation 212) is termed the captured/processed input video sequence. The output of the video processing modules (video enhancement 206, video stabilization 208, single target tracking 210 and panorama generation 212) is termed the output video sequence. The video enhancement module 206 comprises three functionalities, namely haze removal, contrast enhancement and gamma correction; these methods are performed on the input video sequence sequentially. The result of the video enhancement module 206 is an improved video sequence, which acts as input to the other video processing functionalities such as video stabilization 208, single target tracking 210 and panorama generation 212, and the same improved video sequence is transmitted to the sensor management software application located at ROS/ROC/CC.
[0051] In an aspect of FIG. 2, one of the embodiments of the invention is the video stabilization module 208, which provides a stable video sequence by eliminating unwanted motion in the input video sequence. This module finds the features/corners in the present image, and these feature/corner points are used to track the features in the consecutive/next frame. The apparent motion between consecutive frames is computed using the detected feature/corner points in the previous frame and the present frame. The motion flow transformation parameters (dx, dy and da (angle)) between consecutive frames are computed using hybrid motion estimation. These trajectories are smoothed using a predictive filtering approach to find the global transformation parameters. Image warping is applied to the current frame using the global transformation parameters. This process gives a robust and stable video for further video processing modules like video enhancement 206 and panorama generation 212, or the stabilized video sequence is transmitted to the sensor management software application located at ROS/ROC/CC.
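A sketch of such feature-based stabilization, assuming OpenCV, might look as follows; a simple moving-average smoother is used here in place of the disclosed predictive filter and hybrid motion estimation, and the parameter values are placeholders.

    import cv2
    import numpy as np

    def stabilize(frames, smoothing_radius=15):
        """Illustrative feature-based stabilization: estimate per-frame dx, dy, da,
        smooth the accumulated trajectory (moving average here, standing in for the
        disclosed predictive filter), and warp each frame by the correction."""
        transforms = []                              # (dx, dy, da) per frame pair
        prev_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
        for frame in frames[1:]:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                          qualityLevel=0.01, minDistance=20)
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            good_prev, good_next = pts[status.ravel() == 1], nxt[status.ravel() == 1]
            m, _ = cv2.estimateAffinePartial2D(good_prev, good_next)
            dx, dy, da = m[0, 2], m[1, 2], np.arctan2(m[1, 0], m[0, 0])
            transforms.append((dx, dy, da))
            prev_gray = gray

        trajectory = np.cumsum(np.array(transforms), axis=0)
        kernel = np.ones(2 * smoothing_radius + 1) / (2 * smoothing_radius + 1)
        smoothed = np.vstack([np.convolve(trajectory[:, i], kernel, mode="same")
                              for i in range(3)]).T
        corrections = np.array(transforms) + (smoothed - trajectory)

        h, w = frames[0].shape[:2]
        out = [frames[0]]
        for frame, (dx, dy, da) in zip(frames[1:], corrections):
            m = np.array([[np.cos(da), -np.sin(da), dx],
                          [np.sin(da),  np.cos(da), dy]])
            out.append(cv2.warpAffine(frame, m, (w, h)))
        return out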
[0052] In an aspect of FIG. 2, another embodiment of the disclosed video processing elements is a single target tracking module for the maritime monitoring system. The output of this module controls the pedestal 202 on which the imaging sensors are installed and ensures that the selected moving target is in the center of the frame. The single target tracking module 210 is enabled by the operator through the sensor management software application from one of the command and control centers (ROS/ROC/CC); this initiates the tracking module in the video processing unit located at the remote station (RS) 102.
[0053] This module captures and detects good feature points, like corners, edges and texture, of the selected target in the initial captured image/previous image; using these target feature points, it finds the match in the current frame. The module finds the moving target feature points’ displacement in the current image by eliminating the outliers. Then the centroid of the moving target is obtained, which in turn gives the moving target positions. The centroid location of the moving target is predicted using a predictive filtering approach; this process ensures the accuracy of the tracking approach. The process further directs the pedestal 202 to move the electro-optic sensor 110 in the direction of the moving target. The single target tracking module works iteratively to track the moving target automatically without manual intervention. The single target tracking functionality works in a closed-loop manner to track the selected target continuously by directing the pedestal. Finally, the output video sequence with a bounding box around the target of interest is transmitted to the command and control center (ROS/ROC/CC) to monitor the moving target continuously.
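An illustrative single iteration of such closed-loop tracking, assuming OpenCV's pyramidal Lucas-Kanade flow for the feature matching, is sketched below; the pedestal interface, gains and thresholds are hypothetical, and the initial feature points are assumed to come from a corner detector applied to the operator-selected target region.

    import cv2
    import numpy as np

    class PedestalStub:
        """Stand-in for the pan/tilt pedestal interface (hypothetical)."""
        def move(self, pan_rate, tilt_rate):
            print(f"pedestal command: pan={pan_rate:+.3f}, tilt={tilt_rate:+.3f}")

    def track_step(prev_gray, curr_gray, prev_pts, frame_center, pedestal):
        """One tracking iteration: match the target feature points in the current
        frame, reject outliers, compute the centroid, and derive the pixel error
        used to steer the (hypothetical) pedestal toward the moving target.
        prev_pts is an Nx1x2 float32 array, e.g. from cv2.goodFeaturesToTrack
        applied to the operator-selected target region."""
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, prev_pts, None)
        good = nxt[status.ravel() == 1].reshape(-1, 2)

        # Outlier rejection: drop points far from the median displacement.
        disp = good - prev_pts[status.ravel() == 1].reshape(-1, 2)
        dist = np.linalg.norm(disp - np.median(disp, axis=0), axis=1)
        good = good[dist < 3.0 * (np.median(dist) + 1e-6)]

        centroid = good.mean(axis=0)                   # target position in the image
        pixel_error = centroid - np.asarray(frame_center, dtype=float)

        # Hypothetical pedestal interface: pan/tilt rates proportional to pixel error.
        pedestal.move(pan_rate=0.01 * pixel_error[0], tilt_rate=0.01 * pixel_error[1])
        return good.reshape(-1, 1, 2).astype(np.float32), centroid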
[0054] The panorama generation module 212, as referred to in another embodiment, provides the sector/360-degree view around the remote station 102. With reference to FIG. 2, the panorama generation approach maps electro-optic sensor movement, in the form of sector coverage, to pixel coverage using pixel shifts in a look-up table (LUT). The pre-loaded look-up table is used during the stitching process to avoid overlapped areas. Gradient-domain image stitching is employed for panorama generation. The panorama is formed with respect to the first frame in the clockwise or anti-clockwise direction. These panorama formations are available to the operator either in sector mode or 360-degree mode. A seamless panorama is generated using an image blending approach with the look-up-table parameters based on each mode of operation. At the end of every pedestal position, the generated panorama is updated on the sensor management software application installed at ROS/ROC/CC.
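A simplified sketch of LUT-driven stitching is given below; the pixel-shift table values are hypothetical, and a linear blend is used in place of the gradient-domain blending referred to above. It assumes consecutive captures of equal size that overlap by more than the blend width.

    import numpy as np

    # Hypothetical look-up table: pixel shift between consecutive captures for a
    # given (pedestal speed in rpm, zoom level) combination.
    PIXEL_SHIFT_LUT = {(1, 1): 480, (1, 2): 320, (2, 1): 560}

    def stitch_panorama(images, speed=1, zoom=1, blend_width=60):
        """Illustrative panorama stitching: place each image at its LUT-given shift
        and linearly blend the overlapping band (a simple stand-in for the
        gradient-domain stitching referred to above)."""
        shift = PIXEL_SHIFT_LUT[(speed, zoom)]
        h, w = images[0].shape[:2]
        pano = np.zeros((h, w + shift * (len(images) - 1), 3), np.float32)
        pano[:, :w] = images[0]
        alpha = np.linspace(0.0, 1.0, blend_width)[None, :, None]   # blending ramp
        for i, img in enumerate(images[1:], start=1):
            x = i * shift
            overlap = w - shift                      # width of the overlapping band
            # Blend the last `blend_width` columns of the overlap, copy the rest.
            pano[:, x + overlap - blend_width:x + overlap] = (
                (1 - alpha) * pano[:, x + overlap - blend_width:x + overlap]
                + alpha * img[:, overlap - blend_width:overlap])
            pano[:, x + overlap:x + w] = img[:, overlap:]
        return pano.astype(np.uint8)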
[0055] FIG. 3 illustrates an exemplary flow chart of the sensor management software application, in accordance with an embodiment of the present disclosure. FIG. 3 represents the functional blocks of the sensor management software application. This module consists of the video decoder 302, command and control module 304, and user interface 306 modules to perform command initiation and viewing of live video streams received from the remote stations 102 located across the coastline. The video decoder 302 decompresses the received video stream and displays it on the screen through the sensor management software application. The sensor management software application controls the sensors installed at the remote stations remotely and establishes connections between all the server applications located at the Remote Operating Stations (ROS) 104, Regional Operating Centres (ROC) 106 and Control Center (CC) 108. This module also shows the health status of the electro-optic sensors and the network node configuration of the overall architecture. The software has an authorization feature linked to the locations; this governs the operator’s access to the features available in the sensor management software application. Only one command and control center can take control of the electro-optic sensors at a time. If one command and control center controls the sensors, the others need to request the controlling center, over the phone network, to release the sensors in order to take control. This architecture is scalable in terms of RSs, ROSs and ROCs. Either the CCD or the TI electro-optic sensor can be selected from the software application, and if the TI sensor is selected, the polarity of the sensor, either White Hot or Black Hot, can be chosen by the operator.
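On the receiving side, the decode-and-display step performed by the video decoder 302 could be sketched as follows, assuming the remote station publishes an RTSP stream at a hypothetical URL; the command and control, authorization and health-monitoring features are not reproduced here.

    import cv2

    def view_remote_station(url="rtsp://remote-station-1:8554/eo_stream"):
        """Decode the RTSP stream from a remote station and display it, as the
        sensor management application's video decoder would (URL is hypothetical)."""
        cap = cv2.VideoCapture(url)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            cv2.imshow("EO sensor feed", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):      # press 'q' to stop viewing
                break
        cap.release()
        cv2.destroyAllWindows()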
[0056] In an aspect of FIG. 3, the sensor management software embodiment of the invention is capable of decoding the received video feed and displaying it on the video wall, and it allocates the network node addresses to RS, ROS, ROC and CC so that communication is established between the sensors and the command and control centers. Multiprotocol label switching (MPLS) is adopted as the networking technology, which routes traffic using the shortest path based on “labels”, rather than network addresses, to handle forwarding over private wide area networks.
[0057] The foregoing exemplary embodiments are presented as teaching examples related to the invention. Those of regular skill in the art will understand that various changes in form and details may be made to the exemplary embodiments without departing from the scope of the invention as defined by the claims that follow.
[0058] FIG. 4 illustrates an exemplary flow chart of a method for monitoring the maritime environment, in accordance with an embodiment of the present disclosure.
[0059] Referring to FIG. 4, in the method 400, at block 402, the electro-optic sensors can capture video sequences of the maritime environment, the plurality of electro-optic sensors with a pedestal being installed on a tower. At block 404, the remote stations can obtain the video sequence from the plurality of electro-optic sensors, the plurality of remote stations being coupled to the plurality of electro-optic sensors through a network.
[0060] At block 406, the video processing unit can process the video sequence by applying video enhancement, video stabilization, single target tracking and panorama generation on the captured video sequence based on the selection of the operator from the command and control center, the video processing unit positioned at the remote station.
[0061] At block 408, the video processing unit can perform selected video processing on the input video sequence sequentially and interchangeably based on the choice of the operator and at block 410, the video processing unit can transmit the processed output video sequence through a communication link to a video server located at the command and control center.
[0062] It will be apparent to those skilled in the art that the system 100 of the disclosure may be provided using some or all of the mentioned features and components without departing from the scope of the present disclosure. While various embodiments of the present disclosure have been illustrated and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.

ADVANTAGES OF THE PRESENT INVENTION
[0063] The present invention provides a system that is robust to intensity variations.
[0064] The present invention provides a system that realizes a maritime surveillance system with video processing methods for better viewing, monitoring, and controlling from remote operating locations.
[0065] The present invention provides a system that has an accurate and highly scalable architecture.
[0066] The present invention provides a system with low complexity and fast processing.
[0067] The present invention provides a system having modules that are interchangeably used as per operator selection.
[0068] The present invention provides a system that has applications in maritime security monitoring.
Claims:
1. A system (100) for monitoring the maritime environment, the system comprising:
a plurality of electro optic sensors (110) with pedestal installed on a tower to capture video sequence of maritime environment;
a plurality of remote stations (102) coupled to the plurality of electro optic sensors through a network, the plurality of remote stations obtains the video sequence from the plurality of electro optic sensors; and
a video processing unit (116) positioned at the remote station, the video processing unit configured to:
process the video sequence by applying video enhancement, video stabilization, single target tracking and panorama generation on the captured video sequence based on the selection of the operator from command and control center;
perform selected video processing on the input video sequence sequentially and interchangeably based on the choice of the operator; and
transmit processed output video sequence through communication link (118) to a video server located at the command and control center (104, 106, 108).
2. The system as claimed in claim 1, wherein the video processing unit (116) comprises video enhancement unit (206), video stabilization unit (208), single target tracking unit (210), panorama generation unit (212) and video encoder (214) that are configured to perform the operation on an input video sequence independently, wherein the video enhancement unit, video stabilization unit, single target tracking unit, panorama generation unit and video encoder are not dependent on each other to perform their functions on the input video sequence.
3. The system as claimed in claim 1, wherein the video processing unit (116) applies video enhancement, video stabilization, single target tracking and panorama generation on the captured input video sequence.
4. The system as claimed in claim 1, wherein the video enhancement, the video stabilization, the single target tracking, and the panorama generation are configured to perform in an interchangeable manner by the operator’s preference by cascading each other and perform functionalities independently without depending on each other to facilitate flexibility in selection of video processing unit to apply on the input video sequence.
5. The system as claimed in claim 1, wherein the video processing unit (116) coupled to a video encoder (214) that compresses the video stream and transmits to the command and control centers over the communication link (118).
6. The system as claimed in claim 1, wherein the command and control centers comprise remote operating station (104), regional operating center (106) and control center (108).
7. The system as claimed in claim 1, wherein the command and control centers comprise a sensor management application to decode and display the processed video sequence as an output to the operator on a video wall, wherein the video processing unit (116) is initiated and controlled by the operator from the command and control center.
8. The system as claimed in claim 1, wherein the video enhancement unit (206) comprises haze removal, contrast enhancement and gamma correction approaches, wherein the video enhancement operations are performed on the input video sequence sequentially.
9. The system as claimed in claim 1, wherein the resultant of the video enhancement unit (206) is improved video sequence which acts as input to the video stabilization unit (208), single target tracking unit(210) and panorama generation unit(212) and the same improved video sequence is transmitted to the sensor management software application located at the command and control centers.
10. A method (400) for monitoring the maritime environment, the method comprising:
capturing (402), by a plurality of electro optic sensors, video sequence of maritime environment, the plurality of electro optic sensors with pedestal installed on a tower;
obtaining (404), by a plurality of remote stations, the video sequence from the plurality of electro optic sensors, the plurality of remote stations coupled to the plurality of electro optic sensors through a network;
processing (406), at a video processing unit, the video sequence by applying video enhancement, video stabilization, single target tracking and panorama generation on the captured video sequence based on the selection of the operator from command and control center, the video processing unit positioned at the remote station;
performing (408), at the video processing unit, selected video processing on the input video sequence sequentially and interchangeably based on the choice of the operator; and
transmitting (410), at the video processing unit, processed output video sequence through communication link to a video server located at the command and control center.

Documents

Application Documents

# Name Date
1 202341024948-STATEMENT OF UNDERTAKING (FORM 3) [31-03-2023(online)].pdf 2023-03-31
2 202341024948-FORM 1 [31-03-2023(online)].pdf 2023-03-31
3 202341024948-DRAWINGS [31-03-2023(online)].pdf 2023-03-31
4 202341024948-DECLARATION OF INVENTORSHIP (FORM 5) [31-03-2023(online)].pdf 2023-03-31
5 202341024948-COMPLETE SPECIFICATION [31-03-2023(online)].pdf 2023-03-31
6 202341024948-ENDORSEMENT BY INVENTORS [10-04-2023(online)].pdf 2023-04-10
7 202341024948-FORM-26 [13-05-2023(online)].pdf 2023-05-13
8 202341024948-Proof of Right [16-06-2023(online)].pdf 2023-06-16
9 202341024948-POA [04-10-2024(online)].pdf 2024-10-04
10 202341024948-FORM 13 [04-10-2024(online)].pdf 2024-10-04
11 202341024948-AMENDED DOCUMENTS [04-10-2024(online)].pdf 2024-10-04
12 202341024948-Response to office action [01-11-2024(online)].pdf 2024-11-01