Auto Navigation Of Autonomous Underwater Vehicle

Abstract: TITLE OF INVENTION: AUTO-NAVIGATION OF AUTONOMOUS UNDERWATER VEHICLE. ABSTRACT: The present invention relates to a system for enabling an Autonomous Underwater Vehicle (AUV) 100 to navigate underwater terrain in an autonomous manner. The present invention more specifically relates to a navigation sensor suite 200 having at least one inertial measurement system 041, at least one depth sensor 042, at least one acoustic-based Doppler velocity log 043, at least one magnetometer 044, at least one positioning system 045, at least one imaging sensor 046 and a sensor fusion unit 047 to compute the required parameters for navigation of the vehicle and generate control signals for the actuators 101 of the AUV 100. [Figure 6]


Patent Information

Application #
Filing Date
08 August 2022
Publication Number
24/2023
Publication Type
INA
Invention Field
PHYSICS
Status
Email
Parent Application

Applicants

Indian Institute of Information Technology
630 Gnan Marg, Sri City, Chittoor - 517646, Andhra Pradesh, India
Hrishikesh Venkataraman
Faculty Block 301, IIIT Sri City, 630 Gnan Marg, Sri City, Chittoor – 517646, Andhra Pradesh, India

Inventors

1. HRISHIKESH VENKATARAMAN
Faculty Block 301, IIIT Sri City, 630 Gnan Marg, Sri City, Chittoor - 517646, Andhra Pradesh, India
2. MAHESH NAGARAPPU
Quarter No: B1, N.S.P Colony, Narasaraopet - 522601, Andhra Pradesh, India
3. SHAKEERA SK
25-12-215, Police Colony, Nellore – 524004, Andhra Pradesh, India
4. BALA NAGA JYOTHI
D4, Block3, Ceebros Mayfair Apt, 2A, LIC Colony Main Road, Chennai – 600042, Tamil Nadu, India

Specification

FIELD OF INVENTION:

The present invention relates to the field of underwater vehicles which involves implementation of technologies related to robotics, electronics, navigation sensors, autonomous algorithms, machine learning, and artificial intelligence.

BACKGROUND OF INVENTION:

The Autonomous Underwater Vehicle, abbreviated AUV and also known as an Uncrewed Underwater Vehicle, has many important applications, such as conducting survey missions for the detection and mapping of submerged wrecks, landforms, rocks and other obstructions that can hinder the navigation of commercial and recreational vessels in rivers and seas. AUVs are also utilised to conduct reconnaissance surveys in the field of defence. There are many natural resources below the surface of the Earth, made difficult to access by water bodies, which can be searched for and uncovered by AUVs.

Historically, Remotely Operated Vehicles (ROVs) have been used for exploring the depths of water bodies. As suggested by the name, these vehicles are remotely operated by humans and cannot accomplish any task without human intervention. The communication between an ROV and its operator is typically established via electro-mechanical tether cables. The use of a tether with underwater vehicles creates a number of considerable problems, such as drag forces. The long, bulky tether cable disturbs the life present within the water body being explored. A more critical issue arising from the use of tether cables is the risk of the cable getting caught on underwater objects or even becoming entangled. If the tether cable is damaged significantly, the operator could lose control of the ROV and lose the vehicle in the water body, leading to great financial and scientific loss, including the loss of the sensors, the actuators and the data collected by the ROV.

Hence, researchers have been working on Autonomous Underwater Vehicles (AUVs), which are designed to negate the disadvantages associated with ROVs, so that these vehicles can be deployed autonomously for exploring different ocean locations without the risk of losing connection with the vehicle and losing it.

To begin with, AUVs need to decide their own travel path from source to destination. This involves identification of the presence of obstacles, including their dimensions. Furthermore, the vehicle has to be aware of the water conditions including spatio-temporal current velocities, salinity, buoyancy and the energy required to move the vehicle per unit distance. In this regard, the energy profile of the AUV is an important aspect to be considered for AUV endurance, navigation and control.

The US patent US8880275B1 by James A. Del Savio, Richard P. Berube, Stuart K. Beazley, Ryan K. Miller, Peter Licis and Alberico Menozzi, titled “Autonomous underwater vehicle control system and method” and published on 04/11/2014, discloses a vehicle control system that includes an internal communications system. The vehicle control system further includes a controller configured to communicate with a plurality of independent vehicle systems via the internal communications system. The controller stores and accesses a plurality of libraries of system processes having data associated with the plurality of vehicle components. The controller maintains an operational state for the vehicle during an operational failure of at least one of the plurality of independent vehicle systems.

The US patent US20110288714A1 by Myriam Flohr, Gilad Steiner and Inon Ben Zur, titled “Autonomous navigation system and method for a maneuverable platform,” and published on 24/11/2011, discloses an automated method for autonomous navigation of a maneuverable platform. The method includes providing an autonomous navigation system that includes a situation awareness module to receive data from one or more sensors on one or more identifying parameters, selected from the group of identifying parameters that includes position, course and speed, relating to the platform and obstacles in the vicinity of the platform. The platform also includes a decision module to choose course and speed for the platform based on the identifying parameters of the obstacles in the vicinity of the platform and the data on the position of the platform. The method further includes providing the decision module with information on a mission that includes at least one task assigned to the platform, and periodically obtaining the data and choosing a preferred option using the decision module, based on the identifying parameters, by assigning, for each option from a set of options, each option defining a distinct combination of course and speed, a grade which is indicative of the desirability of that option with respect to each of the obstacles and with respect to each of a plurality of objectives, and, for each option, summing the grades assigned to that option with respect to all obstacles, wherein the preferred option is the option whose summed grades are indicative of the greatest desirability of that option.

The US patent US10955523B1 by Peter N. Mikhalevsky, titled “Deep ocean long range underwater navigation algorithm (UNA) for determining the geographic position of underwater vehicles,” and published on 23/03/2021, discloses an underwater navigation algorithm (UNA) that uses acoustic signals transmitted in the ocean from sources at known positions to compute an underwater position for a vehicle, requiring no initial a priori position, no ocean sound speed information and no initial GPS position that would require surfacing. The UNA consists of two parts: (1) the Cold Start Algorithm (CSA) and (2) the CSA with Modeling (CSAM). The underwater vehicle needs to be equipped with only a single hydrophone acoustic receiver and an onboard processor. The CSA requires only measuring the travel time of the end of the arrival coda (EOC) from each of the sources to compute a position. The CSAM is a post-CSA procedure to calculate a higher-accuracy position using the CSA position. CSAM utilizes the CSA position and a 4D sound speed field derived from an ocean 4D General Circulation Model (GCM) constrained using Ocean Acoustic Tomography (OAT) in the ocean area of operation, and further includes (1) computing a modeled result with an acoustic propagation modeling code that is compared with the received acoustic data using a known procedure called “bulk shifting” and/or (2) a newly proposed procedure that uses group speeds calculated from the 4D sound speed field to provide a higher-accuracy estimate of the receiver position.

S. T. Havenstrøm, A. Rasheed and O. San, “Deep Reinforcement Learning Controller for 3D Path Following and Collision Avoidance by Autonomous Underwater Vehicles”, Front. Robot. AI, 7:566037, 2021, discloses Deep Reinforcement Learning (DRL) techniques to develop autonomous agents capable of achieving this hybrid objective without any a priori knowledge about the goal or the environment. However, DRL alone is not enough for precise navigation; feedback from real-time sensors is also important. The presently proposed solution considers both real-time sensor data and AI/ML intelligence for precise navigation.

Patent publication US8880275B1 by James A. Del Savio, Richard P. Berube, Stuart K. Beazley, Ryan K. Miller, Peter Licis and Alberico Menozzi, “Autonomous underwater vehicle control system and method”, 2014, discloses a vehicle control system that includes an internal communications system. The vehicle control system further includes a controller configured to communicate with a plurality of independent vehicle systems via the internal communications system. The controller stores and accesses a plurality of libraries of system processes having data associated with the plurality of vehicle components. The said approach is computationally highly demanding. In the presently proposed mechanism, intelligence is added using vision and ML models to reduce the computational burden.

Or, Barak, and Itzik Klein, "A Hybrid Model and Learning-Based Adaptive Navigation Filter", IEEE Transactions on Instrumentation and Measurement 71 (2022): 1-11, discloses a model-based Kalman filter for sensor fusion of INS sensors in which a deep neural network model is used to tune the momentary system noise covariance matrix based on the inertial sensor readings. Once the process noise covariance is learned, it is plugged into the well-established, model-based Kalman filter. A field experiment was performed with a quadrotor and showed a 25% improvement in position error in comparison with adaptive model-based methods. The said solution is for a quadrotor, and obstacle avoidance is still required. The presently proposed solution includes obstacle avoidance through a hybrid model combining sonar and vision.

Ma, Hui, Xiaokai Mu, and Bo He, "Adaptive Navigation Algorithm with Deep Learning for Autonomous Underwater Vehicle", Sensors 21.19 (2021): 6406, uses deep learning to generate low-frequency position information to correct the error accumulation of the navigation system. The χ² rule is selected to judge whether the Doppler velocity log (DVL) measurement has failed, which avoids interference from DVL outliers. An adaptive filter based on the variational Bayesian (VB) method is employed to estimate the navigation information simultaneously with the measurement covariance to improve the navigation accuracy. The said technique ignores obstacle identification and avoidance. The proposed technique of the present disclosure includes obstacle avoidance through a hybrid model combining sonar and vision.

Chen, Hua, et al., "Improving inertial sensor by reducing errors using deep learning methodology", NAECON 2018 - IEEE National Aerospace and Electronics Conference, IEEE, 2018, discloses that in GPS-denied environments like dense urban places, multi-level parking structures, and areas with thick tree coverage, the INS unit applies the dead-reckoning principle via linear acceleration and angular velocity data. Microelectromechanical Systems (MEMS) based IMU sensors are preferred due to their low cost and resistance to shock and vibration. However, MEMS inertial sensors are prone to various errors, so the development of calibration and compensation techniques for reducing both systematic and stochastic/random sensor errors is essential. A CNN-based deep learning methodology is used to remove many error sources in the sensor signals simultaneously. The method's performance was tested against traditional techniques like the Six-Position Static Test and Rate Test and was found to be 80% more accurate. The said methodology is used for human motion tracking and ignores obstacle identification and avoidance. The present methodology includes obstacle avoidance through a hybrid model combining sonar and vision.

SUMMARY OF THE INVENTION:

The primary objective of the present invention is to put forth an apparatus and method to enable an Underwater Vehicle to navigate efficiently in underwater terrain autonomously.

The said method of autonomous navigation broadly consists of four steps including sea-bed mapping, training & validation, obstacle identification and obstacle avoidance.

Another embodiment of the present invention is to provide a mechanism for planning the path an underwater vehicle is supposed to follow.

Another embodiment of the present invention is to provide a mechanism to allow communication between multiple Autonomous Underwater Vehicles.

The principal aspect of the present invention is to provide a system for the navigation of an autonomous underwater vehicle 100 propelled by actuators 101 comprising a combination of at least one thruster 101a and at least one control fin 101b, or a plurality of thrusters 101a, characterized by a navigation sensor suite 200 comprising at least one inertial measurement system 041 capturing the real-time data and orientation data from the Inertial Measurement Unit (IMU) and, using a position estimation method, estimating the vehicle parameters such as position, attitude, velocity and depth from the real-time data,
at least one depth sensor 042 recording the real-time depth value of the vehicle by measuring the in-situ pressure; at least one acoustic-based Doppler velocity log 043 recording the velocity of the vehicle with respect to the seabed in three axes using the frequency shift between the transmitted and reflected acoustic signals; at least one magnetometer 044 providing the attitude and true-North seeking of the vehicle during underwater operations; at least one positioning system 045 providing the real-time delayed location of the vehicle in geo-coordinates with respect to the autonomous underwater vehicle 100; at least one long range imaging sensor 046a for obstacle identification at long distance; at least one short range imaging sensor 046b for obstacle identification at short distance; wherein a sensor fusion unit 047 receives the data collected by the said navigation sensor suite 200 and fuses the data from the various sensors for the respective real-time instance, associating the inertial measurement data from the inertial measurement system 041, the depth of the underwater vehicle 100 from the depth sensor 042, the velocity of the vehicle 100 from the Doppler velocity log 043 and the attitude of the vehicle from the magnetometer 044 with the location of the underwater vehicle 100 as detected by the positioning system 045 in order to dynamically model the environment and determine the path followed by the said vehicle 100; the imaging sensors 046 detect the obstacles in the path of the vehicle 100, determine the category and the dimensions of the obstacles and relay the same to the sensor fusion unit 047; and the sensor fusion unit 047 continuously computes the required parameters for the navigation of the vehicle and generates control signals for the actuators 101 as per a selected control method to continuously update the path of the underwater vehicle 100 to avoid the obstacles.

Another aspect of the present invention is to provide a method for the determination of a path for the navigation of an autonomous underwater vehicle comprising steps:
i. Scanning a seabed to collect data regarding the said seabed to create a database of obstacles in the said seabed as prior bathymetry done by an acoustic Imaging Sensor;
ii. Using 80% of the said collected data regarding the seabed for training the navigation system and 20% of the data for validating the said navigation system;
iii. Categorizing the obstacles found in the seabed obstacle database as known obstacles and categorizing the obstacles not found in the said seabed database as unknown obstacles;
iv. Estimating the dimensions of the unknown obstacles;
v. Calculating vehicle real time parameters including position, speed and attitude or orientation in north-east-down frame, with reference to the True North;
vi. Fusing the calculated vehicle parameters by a sensor fusion technique in order to estimate the optimized vehicle position, speed and attitude outputs;
vii. Determining a reference path by considering the known obstacles, unknown obstacles, ocean environmental and vehicle system parameters;
viii. Implementing an adaptive control method for following the determined path for the underwater vehicle;
ix. Continuously updating the determined path during mission.
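By way of a non-limiting illustration, steps iii and iv above may be sketched as follows; the function names, the set-based seabed database and the small-angle width geometry are illustrative assumptions, not part of the specification:

```python
import math

def categorize_obstacles(detections, seabed_db):
    """Step iii: detections present in the seabed obstacle database are
    'known'; all others are 'unknown'."""
    known = [d for d in detections if d in seabed_db]
    unknown = [d for d in detections if d not in seabed_db]
    return known, unknown

def estimate_width(range_m, angular_width_rad):
    """Step iv: rough width of an unknown obstacle from its range and the
    angular width it subtends in the sonar/camera view."""
    return 2.0 * range_m * math.tan(angular_width_rad / 2.0)

# Illustrative values: one obstacle already in the database, one not.
known, unknown = categorize_obstacles(["rock_17", "fish"], {"rock_17"})
width_m = estimate_width(range_m=10.0, angular_width_rad=0.2)
```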

BRIEF DESCRIPTION OF DRAWINGS:

The drawings constitute a part of this invention and include exemplary embodiments of the present invention illustrating various objects and features thereof.

Figure 1: Schematic illustration of the Navigation and control of underwater vehicles using four layer technology

Figure 2: Schematic illustration of the Training and validation methods

Figure 3: Schematic illustration of the Factors considered in mission planning

Figure 4: Schematic illustration of the Navigation Sensor Suite of AUV

Figure 5: Schematic illustration of the Control methods

Figure 6: Schematic illustration of the execution navigation of an AUV 100

Figure 7: Schematic illustration of the present navigation system in operation

DETAILED DESCRIPTION OF THE INVENTION:

For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated device, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates.

Reference herein to “one embodiment” or “another embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Further, the diagrams representing one or more embodiments of the invention do not inherently indicate any particular order nor imply any limitations in the invention.
As used herein, the term “plurality” refers to the presence of more than one of the referenced items, and the terms “a”, “an”, and “at least” do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item.
The term ‘AUV’ represents the ‘Autonomous Underwater Vehicle’ in the following description.

It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps, or components, but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.

The objective of the invention is to develop an autonomous navigation system that would be based on several factors, including both in-vehicle and out-vehicle factors.
1. The in-vehicle factors include (but are not limited to):

a. Vehicle endurance/ Residual energy of the vehicle
b. Sensors/ Battery management system
c. Communication capability with other AUVs/underwater vehicles
d. Distance to be traversed between the source and planned destination

2. The out-vehicle factors include (but are not limited to):

a. Presence of obstacles in the route and the dimensions of the obstacle
b. Buoyancy of the water

Of these, the most important factors to be considered are the presence and dimensions of the obstacle and the endurance to travel. This invention proposes different factors to be considered for calculating the navigation route for an AUV operating up to 100 m depth.

Fig. 1 shows the navigation and control of autonomous vehicles using a four-layer technology. The first layer is seabed mapping, training and validation. The second layer is path planning. The third layer is obstacle identification, and the fourth layer is obstacle avoidance.

In training and validation, block 001 represents the seabed map, where the seabed map data is collected using acoustic-based multi-beam SONAR, echo sounders, imaging sonar or side-scan sonar integrated with geo-referenced position data.

Block 002 is training and validation: 80% of the collected data is used to train the model and 20% for model validation. The detailed process of training and validation is shown in Fig. 2. After training and validation, the second layer, path planning, is as follows.

Block 003 represents the path planning layer, where the reference path is defined as a mission plan by considering the ocean environmental and vehicle system parameters shown in Fig. 3.

Block 004 is the navigation sensor suite, which is used to calculate the vehicle's real-time parameters like position, speed and attitude/orientation in the NED (north-east-down) frame, with reference to True North. The details of the sensors are furnished in Fig. 4; classical sensor fusion algorithms like the Kalman or Extended Kalman filter shall be used for estimating the optimized vehicle position, speed and attitude outputs.
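By way of a non-limiting illustration, the predict/update structure of the Kalman fusion referred to in block 004 may be sketched in one dimension, with IMU dead reckoning as the prediction and the DVL velocity as the measurement; a real AUV filter is multi-state and typically an Extended Kalman filter, and all values here are illustrative:

```python
class Kalman1D:
    def __init__(self, x0, p0, q, r):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process and measurement noise variances

    def predict(self, u):
        # Prediction step: propagate the state with the dead-reckoned
        # increment u (e.g., from integrated IMU acceleration).
        self.x += u
        self.p += self.q

    def update(self, z):
        # Update step: correct the prediction with a DVL measurement z.
        k = self.p / (self.p + self.r)    # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

kf = Kalman1D(x0=0.0, p0=1.0, q=0.01, r=0.1)
kf.predict(u=1.0)      # IMU dead reckoning suggests ~1.0 m/s
v = kf.update(z=1.2)   # DVL measures 1.2 m/s; estimate lands in between
```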

The computed parameters are passed to control methods block 005 to follow the specified desired path in an efficient way. Some of the available conventional and adaptive control methods are shown in Fig. 5 which are enabled for efficient AUV control by actuating the thrusters and surface control fins and rudders according to the vehicle configuration and manoeuvring capabilities.

Block 006 is the obstacle identification block that uses acoustic-based sensors like multi-beam sonars, and optical-based sensors like cameras, to identify the obstacle distance, angle, size and shape. Further, these inputs are passed to the obstacle avoidance layer, where these parameters are used to classify the object as a known or an unknown obstacle.

Block 007 refers to known obstacles, i.e., obstacle data already existing in the seabed database, on which the model has been trained.

Block 008 is an unknown obstacle block which refers to the obstacle details not available in the seabed database and not trained to the model.

Block 009 refers to the estimation of the dimensions of unknown obstacles which are given as inputs to the control algorithm.

Block 010 shows the control algorithm block which specifies the intelligence of avoiding the unknown obstacle.

Block 011 represents avoiding known or unknown obstacles using machine learning techniques or artificial intelligence or heuristic techniques.

Block 012 refers to the updated path after avoiding the obstacle using navigation and control. This updated path is fed to the mission planning block for completion of mission.

Training and Validation

The term “training” refers to the process of feeding an actual dataset to an ML model so that the model can observe and learn from the said dataset, which enables it to predict outcomes as per the fed dataset in order to make the right decision. In the present invention the actual dataset is divided into two parts, wherein 80% of the dataset is used for training and the remaining 20% is used for testing the trained model.

The term “validation” refers to the process of fine-tuning the model parameters with independent data as a part of the training of the said model. The model evaluates this data and provides an unbiased evaluation. This validation is mainly used for finding the accuracy and loss of the trained model. This data is approximately 20% of the whole dataset.
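By way of a non-limiting illustration, the 80%/20% split described above may be sketched as follows, assuming the seabed dataset is a plain list of samples; the fixed seed is only there to make the split reproducible:

```python
import random

def train_validation_split(dataset, train_frac=0.8, seed=42):
    """Shuffle the dataset and split it: the first 80% for training,
    the remaining 20% for validation/testing."""
    samples = list(dataset)
    random.Random(seed).shuffle(samples)
    cut = int(len(samples) * train_frac)
    return samples[:cut], samples[cut:]

# Illustrative dataset of 100 seabed samples.
train, val = train_validation_split(range(100))
```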

Fig. 2 illustrates that training and validation can be done with only the recorded path data of the seabed map. Block 021 represents the same.

To increase the accuracy of the training models, in block 022 the obstacle data also needs to be trained along with the path data.

Further, in block 023, to capture the behaviour of the ocean, the spatio-temporal ocean currents are also trained into the models along with the path data and obstacle data, making the models more efficient.

Mission Planning
Fig. 3 illustrates the factors considered in mission planning.

Block 031 refers to battery endurance; the endurance of the battery is estimated using the residual energy of the battery. Depending on the battery energy, the mission endurance shall be updated in real time.

Block 032 refers to long range communication challenges such as acoustic communication channel path loss due to multiple reflections, refractions, attenuation, and shadow zones in the propagation path between the vehicles.

Environmental variables in block 033 refer to factors like spatio-temporal ocean currents which must be accounted for.

Block 034 refers to path parameters such as the distance between the source and destination of the mission and the number of known obstacles present in the reference path. These parameters are considered while choosing the reference path.

Navigation Sensor Suite
Fig. 4 shows the navigation sensor suite of the AUV.

Block 041 refers to the inertial navigation system, which takes the real-time data and orientation data from the Inertial Measurement Unit (IMU) and, using a position estimation algorithm, estimates the vehicle parameters such as position, attitude, velocity and depth from the real-time sensor data.

Block 042 is a depth sensor which provides the real time depth value of the vehicle by measuring the in-situ pressure.

Block 043 refers to the Doppler Velocity Log (DVL), an acoustic-based sensor which provides the velocity of the vehicle with respect to the seabed in three axes using the frequency shift between the transmitted and reflected acoustic signals.
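By way of a non-limiting illustration, the Doppler relation underlying a single DVL beam may be sketched as follows: for a transmit frequency f0 and measured frequency shift df, the along-beam velocity is v = c·df/(2·f0), with c the speed of sound in water (approximately 1500 m/s). A real DVL combines several slanted beams to resolve the velocity in three axes; the numbers below are illustrative:

```python
def along_beam_velocity(f0_hz, df_hz, c_mps=1500.0):
    """Along-beam velocity from the two-way Doppler shift of an acoustic
    beam reflected off the seabed: v = c * df / (2 * f0)."""
    return c_mps * df_hz / (2.0 * f0_hz)

# A 600 kHz beam observing an 800 Hz shift corresponds to 1 m/s.
v_mps = along_beam_velocity(f0_hz=600e3, df_hz=800.0)
```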

Block 044 refers to the magnetometer or gyrocompass, which provides the attitude and true-North seeking of the vehicle during underwater operations. The Global Positioning System (GPS) provides position fixes initially and whenever the vehicle surfaces.

Block 045 refers to the Acoustic Positioning System (APS), which provides the real-time relative position of the vehicle in geo-coordinates with respect to the ship.

The camera or sonar in block 046 provides the range and bearing/heading angle, in polar coordinates, between the vehicle and the obstacle.
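By way of a non-limiting illustration, the polar range/bearing output described above can be converted into a relative offset in the vehicle frame; the frame convention used here (x ahead, y to starboard) is an assumption, not part of the specification:

```python
import math

def polar_to_relative_xy(range_m, bearing_rad):
    """Convert the sensor's range and bearing (polar coordinates) into a
    relative x (ahead) / y (starboard) offset in the vehicle frame."""
    return range_m * math.cos(bearing_rad), range_m * math.sin(bearing_rad)

# An obstacle 10 m away, directly abeam to starboard.
x, y = polar_to_relative_xy(10.0, math.pi / 2)
```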

All the sensor data, along with the inertial navigation data, is given to the sensor fusion block 047, which computes the required parameters for the navigation of the vehicle. Finally, the estimated parameters are furnished in block 048 and transmitted to the control methods block for control and vehicle manoeuvring.

Control Methods
Control methods are shown in Fig. 5. The first one is the Lyapunov method, illustrated as block 051, which is used for tracking control where vision is used as the primary sensing element for navigation.

Block 052 is a conventional PID (proportional–integral–derivative) based method where position is tracked by calculating a steering angle proportional to the Cross Track Error (CTE) and Along Track Error (ATE), which are the lateral and vertical distances between the vehicle and the reference trajectory, respectively.
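By way of a non-limiting illustration, the PID steering idea of block 052 may be sketched as follows; the gains and time step are illustrative, and only the cross-track error channel is shown:

```python
class PIDSteering:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, cte):
        """Compute a steering command from the cross-track error (CTE),
        the lateral distance between vehicle and reference trajectory."""
        self.integral += cte * self.dt
        derivative = (cte - self.prev_error) / self.dt
        self.prev_error = cte
        # Negative sign: a positive CTE steers the vehicle back to the path.
        return -(self.kp * cte + self.ki * self.integral + self.kd * derivative)

pid = PIDSteering(kp=1.0, ki=0.0, kd=0.2, dt=0.1)
angle = pid.step(cte=0.5)   # vehicle 0.5 m off-track; steer back
```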

Block 053 indicates the Kinematic control method that is used for the motion control of the vehicle where position and orientation of the vehicle is required.

Block 054 details the adaptive sliding mode control system which controls based on input-output signals in terms of dive-plane command and depth measurement.

Block 055 is fuzzy logic, which is used to take intelligent decisions using the real-time sensor data of the environmental conditions in the navigation and control of the AUV.

Block 056 is adaptive control, which is used to update parameters like vehicle velocity, distance, motion, orientation, etc. using the real-time sensor data.

Figure 6 illustrates an embodiment of the system for the execution of the method for auto navigation of an autonomous underwater vehicle. The AUV 100 in the said embodiment is actuated by a set of actuators 101, wherein the said actuators comprise at least one thruster 101a and at least one control fin 101b.

In the said preferred embodiment of the present invention the navigation sensor suite 200, as illustrated in figure 4, comprises an inertial measurement unit (IMU) 041, a depth sensor 042, a Doppler velocity log (DVL) 043, a magnetometer 044, a positioning system 045 and an imaging sensor 046, which may be selected from a SONAR (sound navigation and ranging) as a long range imaging sensor 046a and/or a camera as a short range imaging sensor 046b. The positioning system 045 may be selected from a GPS (Global Positioning System) or an APS (Acoustic Positioning System). The magnetometer 044 may be substituted with a gyrocompass.

An ML (Machine Learning) deployable high-processing unit is used for training and validation of machine learning models for prediction and classification. The predicted and classified output is passed to the controller, and sensor values from the navigation sensor suite 200 are processed in sensor fusion mode to generate control signals and feed the thrusters 101a for mobility. In the said embodiment of the present invention, the rotation of the AUV is achieved using the control fins 101b.

The said machine learning model deployable hardware or board consists of a powerful board equipped with a microcontroller and a wide variety of sensors, for example an Intel BLE Sense or speedboat board, etc. The board can sense movement, acceleration, rotation, barometric pressure, sounds, gestures, proximity, colour, and light intensity. The kit also includes a camera module, specifically the OV7675, and a high-end ML deployable processor to create AI-ML (artificial intelligence-machine learning) based projects. One is able to explore practical ML use cases using classical algorithms as well as deep neural networks powered by TensorFlow Lite Micro.

In conventional auto navigation systems, the reference trajectory is loaded into the AUV. While loading the trajectory, the mission planner typically has to provide all the waypoints between the source and the destination. However, in the present system, a mission planner is developed wherein the reference trajectory is loaded and used to automatically generate the waypoints. Generally, ocean modelling involves solving Navier–Stokes equations, which is computationally very complex. In the proposed method, AI-ML based regression methods are used, which are significantly less complex. Further, based on the estimation of relative speed and orientation, the waypoints are adjusted and sent to the navigation sensor suite 200. Herein, all the sensor outputs are combined using popular filtering techniques such as the Bayesian or Extended Kalman filter. While combining these data, each sensor of the sensor suite 200 generates some errors. In such scenarios, precise estimation of the sensor error covariance matrix is very important for precise navigation. However, using conventional equations for the estimation of the error matrix is computationally complex. Hence, ML (machine learning) based models are proposed in this work to estimate the error without being computationally expensive.
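By way of a non-limiting illustration, the proposal to estimate the measurement-noise variance with an ML model rather than closed-form equations may be sketched with a trivial one-parameter least-squares regressor; a real system would use a trained deep model, and the feature choice (e.g., recent innovation magnitude) is an assumption, not part of the specification:

```python
def fit_noise_model(features, variances):
    """One-parameter least-squares fit of variance ≈ w * feature, standing
    in for a learned mapping from sensor features to noise variance."""
    num = sum(f * v for f, v in zip(features, variances))
    den = sum(f * f for f in features)
    return num / den

def predict_variance(w, feature):
    """Predict the measurement-noise variance to plug into the filter."""
    return w * feature

# Illustrative training pairs: (feature, observed noise variance).
w = fit_noise_model([1.0, 2.0, 3.0], [0.5, 1.0, 1.5])
r = predict_variance(w, 2.5)
```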

The estimated parameters are passed to the control methods to generate the control signal, which is further passed to the actuators 101. As the ocean environment is dynamic, there may be many situations where an obstacle needs to be identified dynamically in real time. In conventional SONAR-based systems, the obstacle avoidance mechanism can estimate the obstacle up to 50 m in a 30-degree single-beam measurement. Using SONAR, the distance and orientation of the obstacle can be estimated. However, it does not estimate the dimensions. If the obstacle is unknown, then for precise avoidance it is essential to identify how big or small the obstacle is.

Hence, in the proposed method, a multi-model two-level obstacle avoidance mechanism is used. In the first level, using SONAR, the obstacle is identified from a long range. The vehicle slows down and approaches very slowly. Further, in the second level, in order to obtain the precise turn, a vision-based mechanism is used from a small distance (say up to 5 m) to estimate the obstacle dimensions. The vehicle's relative distance, orientation and the obstacle dimensions are then passed as inputs to the AI-ML based model in order to estimate the turning value and turning direction. The block diagram of the proposed method has been provided in figure 7.
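The two-level decision logic above can be sketched as follows. The 50 m SONAR range and 5 m vision range come from the text; the turn-angle rule is a hypothetical placeholder standing in for the AI-ML model:

```python
def plan_avoidance(sonar_range_m, vision_dims_m=None,
                   long_range_m=50.0, vision_range_m=5.0):
    """Two-level obstacle avoidance sketch.
    Level 1: SONAR detects an obstacle within long range -> slow approach.
    Level 2: within vision range, the camera supplies obstacle dimensions
    (width, height in metres) -> compute a turn (placeholder rule)."""
    if sonar_range_m > long_range_m:
        return {"action": "cruise"}
    if sonar_range_m > vision_range_m or vision_dims_m is None:
        return {"action": "slow_approach"}
    width, height = vision_dims_m
    # Hypothetical rule: larger obstacles demand a wider turn, capped at 90 deg.
    turn_deg = min(90.0, 10.0 + 5.0 * max(width, height))
    return {"action": "turn", "turn_deg": turn_deg}
```

In the actual system the turn value and direction are produced by the trained AI-ML model from relative distance, orientation and dimensions; the fixed formula here only demonstrates the control flow between the two levels.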

In the said figure 7, the data collected by the navigation sensor suite 200 is fused by the sensor fusion unit 047 to determine the real-time position and velocity of the underwater vehicle 100 and dynamically model the environment. The imaging sensors 046 continuously scan the seabed to detect obstacles at short range and long range, and as per the dimensions and the category of the obstacle, i.e., known obstacle or unknown obstacle, the path of the vehicle 100 is updated accordingly in order to avoid the said detected obstacles. In the illustrated instance, the rocks are categorised as known obstacles; the sensor fusion unit 047 has accounted for them from the database of obstacles and has prepared a path accordingly. As the fish is an unknown obstacle, the navigation sensor suite 200 detects it, and the sensor fusion unit 047 updates the path to avoid the said unknown obstacle and orders the actuators 101 of the vehicle 100 to follow the newly determined path.
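The known/unknown categorisation against the obstacle database can be sketched as a simple position lookup. The matching tolerance and flat (x, y) coordinates are illustrative assumptions; a real bathymetric database would carry richer descriptors:

```python
import math

def categorize_obstacle(detection_pos, known_obstacles, tol_m=2.0):
    """Label a detection 'known' if it lies within tol_m of an entry in the
    prior-bathymetry obstacle database, otherwise 'unknown' (which triggers
    dimension estimation and a path update, as for the fish in figure 7)."""
    for pos in known_obstacles:
        if math.dist(detection_pos, pos) <= tol_m:
            return "known"
    return "unknown"

# Assumed database of rock positions from the prior bathymetry survey.
rock_db = [(9.0, 10.0), (42.0, 7.5)]
```

Unknown detections would then be appended to the database after their dimensions are estimated, so a repeat encounter on a later mission is treated as known.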

In the said embodiment of the present invention, the training and validation unit 300 comprises a high-end processor and a microcontroller or a microcomputer.

In an embodiment of the present invention the navigation sensor suite 200 comprises a global positioning system, an inertial measurement unit, a Doppler velocity log, a microcontroller and at least one camera for the capturing of images.

The control method of the present invention, as represented in block 005, is executed by a microcontroller or a microcomputer which controls signal generation, and an Electronic Speed Controller (ESC) which converts a pulse-width modulation (PWM) signal into an electrical signal to control the speed and the thrusters of the AUV.
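The PWM interface between the controller and the ESC can be sketched as a mapping from a normalized thrust command to a pulse width. The 1100–1900 µs range with a 1500 µs neutral point is a common hobby-ESC convention assumed here for illustration, not a value stated in the specification:

```python
def thrust_to_pwm_us(thrust_cmd, min_us=1100.0, max_us=1900.0, neutral_us=1500.0):
    """Map a normalized thrust command in [-1, 1] to an ESC PWM pulse width
    in microseconds. Negative commands reverse the thruster; 0 is neutral.
    Pulse-width limits are assumed, hobby-ESC-style defaults."""
    thrust_cmd = max(-1.0, min(1.0, thrust_cmd))   # clamp the command
    if thrust_cmd >= 0.0:
        return neutral_us + thrust_cmd * (max_us - neutral_us)
    return neutral_us + thrust_cmd * (neutral_us - min_us)
```

The ESC then translates this pulse width into the phase currents that set the thruster speed, which is why the controller only needs to emit a PWM signal rather than drive the motor directly.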
In a preferred embodiment of the present invention, the method for the determination of a path for the navigation of an autonomous underwater vehicle comprises the following steps:
i. Scanning a seabed to collect data regarding the said seabed to create a database of obstacles in the said seabed as prior bathymetry done by an acoustic Imaging Sensor. Bathymetry is the measurement of depth of water in oceans, seas, or lakes.
ii. Using 80% of the said collected data regarding the seabed for training the navigation system and 20% of the data for validating the said navigation system;
iii. Categorizing the obstacles found in the seabed obstacle database as known obstacles and categorizing the obstacles not found in the said seabed database as unknown obstacles;
iv. Estimating the dimensions of the unknown obstacles;
v. Calculating vehicle real time parameters including position, speed and attitude or orientation in north-east-down frame, with reference to the True North;
vi. Fusing the calculated vehicle parameters by a sensor fusion technique in order to estimate the optimized vehicle position, speed and attitude outputs;
vii. Determining a reference path by considering the known obstacles, unknown obstacles, ocean environmental and vehicle system parameters;
viii. Implementing an adaptive control method for following the determined path for the underwater vehicle;
ix. Continuously updating the determined path during mission.
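Step ii above (the 80%/20% split of the bathymetry data into training and validation sets) can be sketched as follows. The shuffling seed and list-based sample representation are illustrative assumptions:

```python
import random

def split_dataset(samples, train_frac=0.8, seed=42):
    """Split collected seabed-scan samples into a training set (80%) and a
    validation set (20%), per step ii of the method. Shuffling avoids any
    ordering bias from the scan trajectory; the seed keeps the split
    reproducible between training runs."""
    rng = random.Random(seed)
    idx = list(range(len(samples)))
    rng.shuffle(idx)
    cut = int(len(samples) * train_frac)
    train = [samples[i] for i in idx[:cut]]
    val = [samples[i] for i in idx[cut:]]
    return train, val

train, val = split_dataset(list(range(100)))
```

The validation portion is held out entirely from training so that the fine-tuning of model parameters in the training and validation unit 300 is measured on data the model has not observed.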
CLAIMS:
I/We Claim:
1. A system for the navigation of autonomous underwater vehicle 100 propelled by actuators 101 comprising a combination of at least one thruster 101a and at least one control fin 101b, or a plurality of thrusters 101a characterized by:
a navigation sensor suite 200 comprising at least inertial measurement system 041 capturing the real time data, orientation data from the Inertial Measurement Unit (IMU) and using position estimation method estimating the vehicle parameters such as position, attitude, velocity and depth with the real time data,
at least one depth sensor 042 recording the real time depth value of the vehicle by measuring the in-situ pressure,
at least one acoustic based doppler velocity log 043 recording the velocity of the vehicle with respect to sea bed in 3-axis using frequency shift of the transmission and reflected acoustic signals,
at least one magnetometer 044 providing the attitude and true North seeking of the vehicle during underwater operations,
at least one positioning system 045 provides the real-time delayed location of the vehicle in geo-coordinates with respect to the autonomous underwater vehicle 100,
at least one long range imaging sensor 046a for obstacle identification at long distance;
at least one short range imaging sensor 046b for obstacle identification at short distance;
wherein,
a sensor fusion unit 047 receives the data collected by the said navigation sensor suite 200 to fuse the data from the various sensors for the respective real time instance to associate the inertial measurement data from the inertial measurement system 041, the depth of the underwater vehicle 100 from the depth sensor 042, the velocity of the vehicle 100 from the Doppler velocity log 043, the attitude of the vehicle from the magnetometer 044 to the location of the underwater vehicle 100 as detected by the positioning system 045 in order to dynamically model the environment and determine the path followed by the said vehicle 100;
the imaging sensors 046 detect the obstacles in the path of the vehicle 100 to detect the category and the dimensions of the obstacles and relays the same to the sensor fusion unit 047;
the sensor fusion unit 047 continuously computes the required parameters for navigation of the vehicle and generates control signals for the actuators 101 as per a selected control method to continuously update the path of the underwater vehicle 100 to avoid the obstacles.

2. The system as claimed in claim 1, wherein the positioning system 045 is selected from an Acoustic Positioning System (APS) or a Global Positioning System (GPS).

3. The system as claimed in claim 1, wherein the sensor fusion module 047 generates control signals for the actuators 101 by a method selected from the Lyapunov method, the proportional–integral–derivative method, kinematic control, adaptive sliding mode, fuzzy logic, and adaptive control.

4. The system as claimed in claim 1, wherein the long range imaging sensor 046a is a SONAR.

5. The system as claimed in claim 1, wherein the short range imaging sensor 046b is a camera.

6. The system as claimed in claim 1, wherein a training and validation unit 300 comprises a high-end processor and a microcontroller or microcomputer.

7. The system as claimed in claim 1, wherein the training and validation unit 300 feeds an actual dataset to a machine learning model to enable the machine learning model to observe and learn from the said dataset and predict outcomes and fine-tunes the model parameters with independent data.

8. The system as claimed in claim 1, wherein the sensor fusion and the computation of the sensor fusion module 047 are executed on a Raspberry Pi controller.

9. A method for the determination of a path for the navigation of an autonomous underwater vehicle comprising:
x. Scanning a seabed to collect data regarding the said seabed to create a database of obstacles in the said seabed as prior bathymetry done by an acoustic Imaging Sensor;
xi. Using 80% of the said collected data regarding the seabed for training the navigation system and 20% of the data for validating the said navigation system;
xii. Categorizing the obstacles found in the seabed obstacle database as known obstacles and categorizing the obstacles not found in the said seabed database as unknown obstacles;
xiii. Estimating the dimensions of the unknown obstacles;
xiv. Calculating vehicle real time parameters including position, speed and attitude or orientation in north-east-down frame, with reference to the True North;
xv. Fusing the calculated vehicle parameters by a sensor fusion technique in order to estimate the optimized vehicle position, speed and attitude outputs;
xvi. Determining a reference path by considering the known obstacles, unknown obstacles, ocean environmental and vehicle system parameters;
xvii. Implementing an adaptive control method for following the determined path for the underwater vehicle;
xviii. Continuously updating the determined path during mission.

10. The method for the determination of a path for the navigation of an autonomous underwater vehicle, as claimed in claim 9, wherein the sensor fusion technique is selected from the Bayesian filter, the Kalman filter and the Extended Kalman filter.

Dated this 26th Day of May 2023
Signature:
Name: Bhavik Patel
Applicant’s Agent: IN/PA-1379
INFINVENT IP

Documents

Application Documents

# Name Date
1 202241045253-STATEMENT OF UNDERTAKING (FORM 3) [08-08-2022(online)].pdf 2022-08-08
2 202241045253-PROVISIONAL SPECIFICATION [08-08-2022(online)].pdf 2022-08-08
3 202241045253-POWER OF AUTHORITY [08-08-2022(online)].pdf 2022-08-08
4 202241045253-FORM 1 [08-08-2022(online)].pdf 2022-08-08
5 202241045253-FIGURE OF ABSTRACT [08-08-2022(online)].pdf 2022-08-08
6 202241045253-DRAWINGS [08-08-2022(online)].pdf 2022-08-08
7 202241045253-DECLARATION OF INVENTORSHIP (FORM 5) [08-08-2022(online)].pdf 2022-08-08
8 202241045253-RELEVANT DOCUMENTS [04-05-2023(online)].pdf 2023-05-04
9 202241045253-POA [04-05-2023(online)].pdf 2023-05-04
10 202241045253-PA [04-05-2023(online)].pdf 2023-05-04
11 202241045253-MARKED COPIES OF AMENDEMENTS [04-05-2023(online)].pdf 2023-05-04
12 202241045253-FORM 13 [04-05-2023(online)].pdf 2023-05-04
13 202241045253-ASSIGNMENT DOCUMENTS [04-05-2023(online)].pdf 2023-05-04
14 202241045253-Annexure [04-05-2023(online)].pdf 2023-05-04
15 202241045253-AMMENDED DOCUMENTS [04-05-2023(online)].pdf 2023-05-04
16 202241045253-8(i)-Substitution-Change Of Applicant - Form 6 [04-05-2023(online)].pdf 2023-05-04
17 202241045253-ENDORSEMENT BY INVENTORS [06-06-2023(online)].pdf 2023-06-06
18 202241045253-DRAWING [06-06-2023(online)].pdf 2023-06-06
19 202241045253-COMPLETE SPECIFICATION [06-06-2023(online)].pdf 2023-06-06
20 202241045253-FORM-9 [14-06-2023(online)].pdf 2023-06-14
21 202241045253-FORM 18 [14-06-2023(online)].pdf 2023-06-14