
Method For Tank Cleaning And Devices Thereof

Abstract: The disclosure provides a method (100) and a robotic device (200) for performing tank cleaning and cleanliness evaluation of the tank. The method includes placing (102) the device (200) at an end of the tank, receiving a destination point (104), receiving a plurality of data sets (106) from a plurality of sensors (208), receiving images (108) from a plurality of cameras (216), initializing (110) a cleaning module for performing floor cleaning followed by predicting the device's new position, and incrementing (112) the counter corresponding to the new position. The next step includes redirecting (114) the robotic device to perform wall cleaning, followed by determining (116) cleaning efficiency by evaluating the cleanliness parameters in a cleanliness evaluation unit. The method includes performing resizing and augmentation of the received images to provide a hybrid dataset, and image segmentation. The method of the present disclosure assesses the robot's performance in terms of speed, efficiency, cleaning effectiveness, safety, environmental impact, and cost of operation and maintenance. FIG. 1


Patent Information

Application #
202541006851
Filing Date
28 January 2025
Publication Number
06/2025
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Parent Application

Applicants

AMRITA VISHWA VIDYAPEETHAM
Amritapuri Campus, Amritapuri, Clappana PO, Kollam - 690525, Kerala, India

Inventors

1. MEGALINGAM RAJESH KANNAN
306, South Block, M A Math, Amritapuri, Kerala 690525, India
2. KUTTANKULANGARA MANOHARAN SAKTHIPRASAD
AYYAMPARAMBU,KUTTANKULANGARA, AKATHIYUR P.O, PORKULAM, Thrissur, Kerala 680519, India

Specification

Description: FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
COMPLETE SPECIFICATION
(See section 10 and rule 13)

TITLE: METHOD FOR TANK CLEANING AND DEVICES THEREOF

APPLICANT
Amrita Vishwa Vidyapeetham
Amritapuri Campus,
Amritapuri, Clappana PO,
Kollam - 690525, Kerala, India

THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED

METHOD FOR TANK CLEANING AND DEVICES THEREOF
CROSS-REFERENCES TO RELATED APPLICATIONS
[0001] None.
FIELD OF THE INVENTION
[0002] The present invention generally relates to surface cleaning, and more particularly relates to a method and device for dirt elimination and cleanliness evaluation.
BACKGROUND OF THE RELATED ART
[0003] Improper maintenance of water tanks may lead to water contamination. The use of contaminated water for various household needs, such as bathing, brushing teeth, and so on, may result in harmful diseases. Contaminated water is one of the main contributors to skin and hair diseases. Hence, it is crucial to ensure that water tanks are kept clean and free of contaminants, especially in residential and commercial settings. Despite the importance of water tank cleaning, manual cleaning methods are still widely used, and manual cleaning of household water tanks is a tiresome and time-consuming process. To address these challenges, the development of autonomous water tank cleaning robots has been suggested. These robots can perform the cleaning process with minimal human intervention, reducing the time, effort, and risks associated with manual cleaning methods.
[0004] Researchers from all across the world have invented a variety of water tank cleaning robots. For example, US patent US11348269B1 relates to a robot for perceiving a spatial model of an environment that includes an actuator, a processor, and memory storing instructions that, when executed by the processor, effectuate various operations. Another US patent, US8457789B2, provides a wall-following robot cleaner that cleans a cleaning region while traveling within it, and a method to control the same. Further, patent JP5844941B2 discloses a method of operating a mobile floor cleaning robot that includes a robot body, a cleaning system, an imaging sensor, and a controller. The controller receives a sequence of images of the floor surface and segments each image into color blobs by color-quantizing its pixels.
[0005] However, the above-mentioned cleaning devices are bulky and require complex analytical modules for cleaning purposes. Further, these cleaning robots are designed to enhance cleaning performance but typically lack a crucial feature that provides quantitative feedback on the status of the area being cleaned. Such feedback offers valuable information to the robot, allowing it to prioritise cleaning efforts in areas with the most accumulated dirt. Hence, there has been a need in the art for a method and device that particularly tackles issues like dirt removal, bulkiness and cleanliness quality evaluation. In this regard, the method and device to clean the tank according to the present invention substantially depart from the conventional concepts and designs of the prior art.
[0006] These and other advantages will be more readily understood by referring to the following detailed description, disclosed hereinafter with reference to the accompanying drawings, and which is generally applicable to other systems and methods for performing tank cleaning and cleanliness evaluation of the tank for the particular applications illustrated hereinafter.
SUMMARY OF THE INVENTION
[0007] According to one embodiment of the present subject matter, a method for performing tank cleaning and cleanliness evaluation of the tank is disclosed. The method includes the steps of placing a robotic device at a first end of the tank, followed by receiving a destination point to be reached by the robotic device. The next step includes receiving a plurality of data sets from a plurality of sensors, wherein the sensors are configured to obtain data for a floor or a wall region of the tank, followed by receiving images of the tank surface from a plurality of cameras. The next step includes initializing a cleaning module for performing floor cleaning within the floor region of the tank based on the current position of the robotic device and activating a counter, followed by predicting the robotic device's new position and incrementing the counter corresponding to the new position. Redirecting the robotic device to perform wall cleaning by the cleaning module when the floor cleaning is complete takes place in the next step. This is followed by determining the cleaning efficiency of the robotic device by evaluating the cleanliness parameters in a cleanliness evaluation unit. The cleanliness evaluation includes the steps of receiving images of the tank's wall and floor surfaces from the plurality of cameras, followed by performing resizing and augmentation of the received images, wherein the images form a hybrid dataset. The next step includes providing the hybrid dataset to deep learning models for training, followed by providing training data from the deep learning models to a modified U-Net model. This is followed by segmenting the images into black and white pixels, wherein the dirt particles are segmented into white pixels and the remaining area into black pixels, followed by determining the ratio of the number of white pixels to the total number of pixels. The next step includes comparing the ratio with a predetermined threshold value, followed by positioning the robotic device based on the presence of the detected white pixels.
[0008] In various embodiments, initializing floor cleaning by the cleaning unit includes determining a position of the robotic device in the tank, initiating floor cleaning and setting the counter as nil, followed by incrementing the counter by 1 if a wall is detected and turning the robotic device by 90°, and performing floor cleaning until the counter reaches a predefined threshold.
[0009] In various embodiments, redirecting the robotic device to perform wall cleaning by the cleaning unit includes the steps of determining the position of the robotic device in the tank from the wall and setting the counter as nil, followed by moving the robotic device until the presence of the wall is detected and turning the device by 90°. In the next step, determining the position of a wall cleaning brush from the wall and aligning the robotic device to the proximity of the wall until the wall cleaning brush touches the wall takes place. In various embodiments, aligning the robotic device to the wall's proximity until the wall cleaning brush touches the wall is performed by a lead screw coupled with a stepper motor. This is followed by incrementing the counter by 1 and performing wall cleaning until the counter reaches a predefined threshold.
[0010] According to another embodiment of the present subject matter, a robotic device for performing tank cleaning and cleanliness evaluation for a tank is disclosed. The robotic device includes a body attached with a mobility unit and a lead screw coupled with a stepper motor. The device further includes a plurality of sensors mounted on the body to measure a plurality of sensor data in the robotic device's surrounding environment. In various embodiments, the plurality of sensors includes an inertial measurement unit, an ultrasonic sensor, a current sensor or a combination thereof. In various embodiments, a current sensor determines the position of a wall cleaning brush from the wall of a water tank. The device includes a cleaning module for performing wall cleaning and floor cleaning, the module comprising a floor cleaning unit and a wall cleaning unit, positioned on the body to clean the tank. In various embodiments, the floor cleaning unit is mounted on the body and comprises a motor coupled with a driver to perform horizontal cleaning with a floor cleaning brush. In various embodiments, the wall cleaning unit is mounted on the lead screw coupled with the stepper motor and has a motor coupled with a driver connected to a wall cleaning brush to perform vertical cleaning. The device also includes a plurality of cameras mounted on the cleaning module, adapted to receive images of the surrounding environment. Further, the device includes a processing unit for facilitating movement and performing cleanliness evaluation. The processing unit further includes a locomotion unit adapted to control movement of the mobility unit of the device in the surrounding environment and a cleanliness evaluation unit for determining the efficiency of the robotic device. In various embodiments, the mobility unit includes one or more wheels for navigation. The cleanliness evaluation unit is configured to receive images of the tank's wall and floor surfaces from the plurality of cameras; perform resizing and augmentation of the received images, wherein the images form a hybrid dataset; provide the hybrid dataset to deep learning models for training; provide training data from the deep learning models to a modified U-Net model; segment the images into black and white pixels, wherein the dirt particles are segmented into white pixels and the remaining area into black pixels; determine the ratio of the number of white pixels to the total number of pixels; compare the ratio with a predetermined threshold value; and position the robotic device based on the presence of the detected white pixels.
[0011] In various embodiments, the device includes a communication interface for wired or wireless communication, configured to communicate in real-time with a remote host.
[0012] This and other aspects are disclosed herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] The invention has other advantages and features which will be more readily apparent from the following detailed description of the invention and the appended claims, when taken in conjunction with the accompanying drawings, in which:
[0014] FIG. 1 represents a method for performing effective dirt detection and cleanliness evaluation of the walls and floor of a water tank.
[0015] FIG. 2 represents a block diagram of a robotic device for cleaning a water tank using the method of the invention.
[0016] FIG. 3A illustrates a side view of various components of the robotic device.
[0017] FIG. 3B illustrates a top view of various components of the robotic device.
[0018] FIG. 4A shows a flow diagram for initializing floor cleaning by the cleaning unit, and FIG. 4B shows the path of the device for floor cleaning.
[0019] FIG. 5A shows a flow diagram for redirecting the robotic device to perform wall cleaning by the cleaning unit, and FIG. 5B shows the path of the device for wall cleaning.
[0020] FIG. 6 shows a schematic for determining the cleaning efficiency of the robotic device in a cleanliness evaluation unit.
[0021] Referring to the figures, like numbers indicate like parts throughout the various views.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0022] While the invention has been disclosed with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt to a particular situation or material to the teachings of the invention without departing from its scope.
[0023] Throughout the specification and claims, the following terms take the meanings explicitly associated herein unless the context clearly dictates otherwise. The meaning of "a", "an", and "the" include plural references. The meaning of "in" includes "in" and "on." Referring to the drawings, like numbers indicate like parts throughout the views. Additionally, a reference to the singular includes a reference to the plural unless otherwise stated or inconsistent with the disclosure herein.
[0024] The present subject matter describes a method and device for performing tank cleaning and cleanliness evaluation of the tank.
[0025] A method 100 for performing tank cleaning and cleanliness evaluation of the tank is illustrated in FIG. 1 according to one embodiment of the present subject matter. The method 100 involves various steps of performing tank cleaning and cleanliness evaluation. In step 102, a robotic device 200 is placed at a first end of the tank. This is followed by receiving a destination point to be reached by the robotic device in step 104. Next, a plurality of data sets 106 is received from a plurality of sensors 208. In various embodiments, the sensors are configured to obtain data for a floor or a wall region of the tank. In step 108, images of the tank surface are received from a plurality of cameras 216. Further, initializing of a cleaning module 210 for performing floor cleaning within the floor region of the tank based on the current position of the robotic device and activating a counter takes place in step 110. In the next step 112, predicting the robotic device's new position and incrementing the counter corresponding to the new position takes place. This step is followed by redirecting the robotic device to perform wall cleaning by the cleaning module when the floor cleaning is complete in step 114.
[0026] The invention in various embodiments includes a robotic device 200 for performing tank cleaning and cleanliness evaluation for a tank using the method 100. A block diagram of the robotic device 200 is illustrated in FIG. 2, according to one embodiment of the present subject matter. The robotic device 200 may primarily include a body 202, a plurality of sensors 208, a cleaning module 210, a plurality of cameras 216 and a processing unit. In various embodiments, the device comprises a communication interface for wired or wireless communication, configured to communicate in real-time with a remote host.
[0027] In various embodiments, the body 202 of the device may be attached with a mobility unit 204. In various embodiments, the mobility unit 204 may include one or more wheels powered by DC motors for navigation. In one embodiment, the one or more wheels may include a ball caster wheel. In various embodiments, the body 202 may be attached with a lead screw coupled with a stepper motor 206.
[0028] In various embodiments, the plurality of sensors 208 may be mounted on the body 202 to measure a plurality of sensor data in the device’s surrounding environment. In various embodiments, the plurality of sensors 208 may include an inertial measurement unit, an ultrasonic sensor, a current sensor or a combination thereof. In one embodiment, the current sensor determines the position of a wall cleaning brush from the wall of a water tank.
[0029] In various embodiments, the cleaning module 210 may be positioned on the body of the device. The cleaning module 210 may include a floor cleaning unit 212 adapted to perform floor cleaning of the tank. In various embodiments, the floor cleaning unit 212 may be mounted on the body 202 and comprises a motor 224 coupled with a driver 226 to perform horizontal cleaning with a floor cleaning brush 228. In various embodiments, the cleaning module 210 may further include a wall cleaning unit 214 adapted to perform wall cleaning of the tank. The wall cleaning unit 214 may be mounted on the lead screw coupled with the stepper motor 206 and has a motor 230 coupled with a driver 232 connected to a wall cleaning brush 234 to perform vertical cleaning. In various embodiments, the plurality of cameras 216 may be mounted on the cleaning module 210. The cameras 216 may be adapted to receive images of the surrounding environment.
[0030] In various embodiments, the processing unit 218 is adapted for facilitating movement and performing cleanliness evaluation. In various embodiments, the processing unit includes a locomotion unit 220 and a cleanliness evaluation unit 222. The locomotion unit 220 is adapted to control movement of the mobility unit 204 of the device in the surrounding environment. The cleanliness evaluation unit 222 determines the efficiency of the robotic device. The cleanliness evaluation unit 222 is configured to receive images of the tank's wall and floor surfaces from the plurality of cameras. The cleanliness evaluation unit 222 is adapted to perform resizing and augmentation of the received images, wherein the images form a hybrid dataset. The cleanliness evaluation unit 222 is also adapted to provide the hybrid dataset to deep learning models for training and to provide training data from the deep learning models to a modified U-Net model. In various embodiments, the deep learning models include a basic CNN, SegNet, Fast FCN and U-Net. The cleanliness evaluation unit 222 is configured to segment the images into black and white pixels, wherein the dirt particles are segmented into white pixels and the remaining area into black pixels, and to determine the ratio of the number of white pixels to the total number of pixels. The cleanliness evaluation unit 222 also compares the ratio with a predetermined threshold value and positions the robotic device based on the presence of the detected white pixels. In various embodiments, the predetermined threshold value may be in the range of 0 to 3.
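As a concrete illustration of this white-pixel check, the following Python sketch computes the dirt ratio from a binary segmentation mask and compares it with a threshold. It is illustrative only: the mask stands in for the modified U-Net output, and the default threshold is a placeholder rather than a value taken from this specification.

```python
import numpy as np

def dirt_ratio(mask: np.ndarray) -> float:
    """mask: binary image where dirt pixels are 1 (white) and clean pixels 0 (black)."""
    return float(np.count_nonzero(mask)) / mask.size

def needs_recleaning(mask: np.ndarray, threshold: float = 0.05) -> bool:
    # Compare the white-pixel ratio with a predetermined threshold value;
    # 0.05 is a placeholder default, not a figure from the specification.
    return dirt_ratio(mask) > threshold

# Example: a 128 x 128 mask with a 16 x 16 patch of dirt (white pixels).
mask = np.zeros((128, 128), dtype=np.uint8)
mask[56:72, 56:72] = 1
print(dirt_ratio(mask))        # 0.015625
print(needs_recleaning(mask))  # False under the placeholder threshold
```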
[0031] In various embodiments, as shown in FIG. 4A, initializing floor cleaning 110 by the cleaning unit includes determining a position of the robotic device in the tank, initiating floor cleaning and setting the counter as nil, followed by incrementing the counter by 1 if a wall is detected and turning the robotic device by 90°, in accordance with the control flow. The actual path taken by the device is shown in FIG. 4B. This is followed by performing floor cleaning until the counter reaches a predefined threshold. In one embodiment, the counter is in the range of 0 to 3. In one embodiment, the predefined threshold is 3.
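The control flow of FIG. 4A can be summarised in the following minimal Python sketch, with the wall detection, turning and driving calls mocked out as function arguments; only the counter logic and the threshold of 3 come from the embodiment above.

```python
def floor_cleaning(detect_wall, drive_forward, turn_90, threshold=3):
    counter = 0                    # counter set as nil at the start
    while counter < threshold:     # clean until the predefined threshold is reached
        if detect_wall():          # ultrasonic sensor reports a wall ahead
            turn_90()              # turn the robotic device by 90 degrees
            counter += 1           # increment the counter for the new position
        else:
            drive_forward()        # keep brushing along the current leg

# Tiny simulated run: a "wall" shows up every third reading.
readings = iter([False, False, True] * 3)
floor_cleaning(lambda: next(readings), lambda: None, lambda: None)
```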
[0032] FIG. 5A shows the control flow, while FIG. 5B shows the path of the device for wall cleaning. In various embodiments, as shown in FIG. 5A, redirecting 114 the robotic device to perform wall cleaning by the cleaning unit includes determining the position of the robotic device in the tank from the wall and setting the counter as nil, followed by moving the robotic device until the presence of the wall is detected and turning the device by 90°. This is followed by determining the position of a wall cleaning brush from the wall and aligning the robotic device to the proximity of the wall until the wall cleaning brush touches the wall. In various embodiments, aligning the robotic device to the wall's proximity until the wall cleaning brush touches the wall is performed by a lead screw coupled with a stepper motor 206. The next step includes incrementing the counter by 1 and performing wall cleaning until the counter reaches a predefined threshold. In one embodiment, the counter is in the range of 0 to 3. In one embodiment, the predefined threshold is 3.
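A matching Python sketch for the FIG. 5A flow is given below; the hardware interactions (wall detection, brush advance, current-sensor contact) are again mocked, and the loop bound reflects the four side walls of the embodiment, where a counter value greater than 3 ends the routine.

```python
def wall_cleaning(detect_wall, drive_forward, turn_90,
                  advance_brush, brush_touching, clean_wall, threshold=3):
    counter = 0
    while counter <= threshold:      # one pass per side wall: counter 0..3
        while not detect_wall():     # move until the presence of the wall is detected
            drive_forward()
        turn_90()                    # align the side-mounted brush with the wall
        while not brush_touching():  # the lead screw pushes the brush out until
            advance_brush()          # the current sensor reports contact
        clean_wall()                 # stepper sweeps the brush vertically
        counter += 1                 # increment and proceed to the next wall
```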
[0033] In the next step 116, the cleaning efficiency of the robotic device is determined by evaluating the cleanliness parameters in a cleanliness evaluation unit. In various embodiments, determining the cleaning efficiency of the robotic device includes receiving images of the tank's wall and floor surfaces from the plurality of cameras, followed by performing resizing and augmentation of the received images, wherein the images form a hybrid dataset. Further, providing the hybrid dataset to deep learning models for training is followed by providing training data from the deep learning models to a modified U-Net model, wherein the deep learning models include a basic CNN, SegNet, Fast FCN and U-Net. This is followed by segmenting the images into black and white pixels, wherein the dirt particles are segmented into white pixels and the remaining area into black pixels. The next step includes determining the ratio of the number of white pixels to the total number of pixels, followed by comparing the ratio with a predetermined threshold value. In various embodiments, the predetermined threshold is in the range of 0 to 3. The final step includes positioning the robotic device based on the presence of the detected white pixels.
EXAMPLES
[0034] Example-1: Robotic Device Architecture:
[0035] Fabrication of Robotic Device: The Computer-Aided Design (CAD) of the water tank cleaning robot and the labelling of the mechanical parts are shown in FIG. 3A and FIG. 3B. The robot was designed to efficiently clean both the floor and the walls of a concrete water tank. The device's dimensions were 38 cm x 38 cm x 20 cm (L x B x H), wherein the height excluded the lead screw and supporting rod. For cleaning the walls of the water tank, a wall cleaning module was placed on the right side of the robot. The vertical and horizontal motion of the wall cleaning module was facilitated with the help of a lead screw based mechanism. The device used a differential drive for mobility with two 100 mm wheels and a ball caster wheel for support. The wheels were powered by two 12-volt DC motors with 5.7 kg-cm torque. An inertial measurement unit (IMU) was added to the robot to control the orientation. The robotic device's front and right plates both included an ultrasonic sensor to aid wall detection. Other electronics in the robot were DC motor drivers, a stepper motor driver, a Bluetooth module and a current sensor. All the electronics and components were connected to an Arduino microcontroller.
[0036] Floor Cleaning Module and Wall Cleaning Module of the Robotic Device: The floor cleaning module was directly attached to the robot's main body. The floor cleaning module included a DC motor and a DC motor driver. Unlike the floor cleaning module, the wall cleaning module was mounted on the nut of the lead screw coupled with the stepper motor, to enable the module to slide vertically up and down. The wall cleaning module included several components, such as a lead screw, DC motors, a DC motor driver, and a current sensor. The lead screw was coupled with a DC motor for horizontal actuation. A DC motor connected with a cleaning brush was mounted on the lead screw nut. The purpose of this lead screw was to make the cleaning brush come into contact with the walls of the water tank. To determine whether the cleaning brush was touching the wall or not, a current sensor was connected to the DC motor coupled with the lead screw. The load on the DC motor responsible for horizontal actuation increased when the brush was in contact with the wall, as the motion got blocked. The current sensor detected this change, and when the load increased, the DC motor stopped moving the cleaning brush in the direction of the wall. To clean the wall from top to bottom, the stepper motor placed inside the chassis of the robot started rotating the lead screw with which it was connected.
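The brush-contact logic described here can be sketched as follows. The current values are illustrative placeholders rather than calibrated ACS712 readings, and the motor calls are mocks.

```python
def advance_until_contact(read_current_amps, step_toward_wall, stop_motor,
                          trip_amps=1.5):
    # While the horizontal-actuation DC motor draws only its free-running
    # current, keep moving the brush toward the wall; a jump in current
    # means the brush is blocked by the wall, i.e. contact is made.
    while read_current_amps() < trip_amps:
        step_toward_wall()
    stop_motor()  # hand over to the stepper motor for the vertical sweep

# Simulated readings: ~0.4 A free running, then a jump on contact.
readings = iter([0.40, 0.42, 0.41, 1.80])
advance_until_contact(lambda: next(readings), lambda: None, lambda: None)
print("brush in contact with the wall")
```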
[0037] Hardware Architecture of the Robotic Device: The processing unit for the water tank cleaning robot was an Arduino UNO microcontroller. The Arduino microcontroller was responsible for executing tasks like locomotion, floor cleaning, and wall cleaning. Ultrasonic sensors and an IMU (MPU-6050) aided in the motion of the robot. The current sensor (ACS712) played a crucial role during wall cleaning: it helped in sensing whether the cleaning brush was touching the wall of the water tank. The hardware architecture of the water tank cleaning robot is elucidated in FIG. 2.
[0038] Software Architecture of the Robotic Device: The programming for the robot was done using the Arduino programming language. The software architecture for the water tank cleaning robot was divided into two parts: one for floor cleaning, and another for wall cleaning.
[0039] Example-2: Movement of the Robotic Device:
[0040] Floor Cleaning: The fabricated water tank cleaning robot made use of an ultrasonic sensor placed in the front plate of the robot and an IMU sensor for path planning during the floor cleaning operation. The robot travelled in a spiral path to clean the floor of a water tank. For floor cleaning, the robot was to be positioned in a corner of the water tank with the wall cleaning brush facing away from the closest wall. During the motion of the robot in the water tank, the cleaning brush attached to the DC motor fixed to the robot's base plate started spinning. The implementation of floor cleaning is shown in FIG. 4A.
[0041] Wall Cleaning: During the wall cleaning, the robot travelled towards the wall and turned left once a wall was detected; then, the ultrasonic sensor placed at the right plate of the robot measured the distance between the robot and the wall. Depending on the distance measured, the robot aligned itself close to the wall using the combination of feedback received from the ultrasonic sensor and the IMU. After the robot was aligned properly with the wall, the DC motor coupled with the lead screw started moving the cleaning brush towards the wall, and when the brush touched the wall, the load on the DC motor measured by the current sensor increased. This change caused the horizontal actuation to stop and caused the stepper motor to turn the lead screw with which it was coupled. The lead screw connected to the stepper motor carried the wall cleaning module, enabling the module to clean the wall uniformly as it moved vertically up and down. The robot advanced a short distance after cleaning the portion of the wall that the brush could cover. This procedure continued until the value of the counter variable became greater than 3, indicating that the water tank's four side walls were cleaned.
Example-3: Determining Cleaning Efficiency of the Robotic Device
[0042] An autonomous water tank cleaning robot was designed, and its performance in domestic water tank cleaning was checked using the test setup created. The design allowed the robot to clean both the floor and walls of the domestic water tank. A deep learning-based model for dirt detection and cleanliness evaluation was developed in conjunction with the fabricated robot.
[0043] Deep Learning-Based Dirt Detection and Evaluation: This section presents the experiments and results of using custom-trained deep learning models for segmenting dirt from images. It also discusses how these models were evaluated to determine their cleaning performance. The quality of the segmentation depended on the specific application and the accuracy of the algorithm used. Several advanced models, such as SegNet, U-Net, a Fully Convolutional Network and a simple base model built from convolutional layers, were experimented with to find a suitable model for the segmentation task. The experiments were conducted using the free GPU of the Google Colab platform, and all the models were trained on this configuration.
[0044] For developing a cleanliness evaluation model, a hybrid dataset was created using images from the Automation and Control Institute (ACIN) dataset and the water tank images captured by the cameras on the robotic device for the purpose of this study. These images were augmented to make a final dataset of about 7,300 images, split into 5,110 images for training and 2,190 images for testing. The training parameters were: number of epochs = 25, batch size = 16, learning rate = 1e-3, Adam as the optimizer and binary cross-entropy as the loss function. The Dataset Preparation section contains all the information regarding the preparation of the dataset. After the training was completed, the test dataset was used to evaluate the generalization ability of the model.
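These hyperparameters map directly onto a few lines of Keras. The sketch below mirrors the stated setup (25 epochs, batch size 16, learning rate 1e-3, Adam, binary cross-entropy); the model and the dataset arrays are placeholders, since the hybrid ACIN/water-tank data is not reproduced here.

```python
import tensorflow as tf

def compile_and_train(model: tf.keras.Model, x_train, y_train, x_test, y_test):
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
        loss="binary_crossentropy",  # per-pixel dirt vs. clean classification
        metrics=["accuracy"],
    )
    model.fit(x_train, y_train, epochs=25, batch_size=16)
    # Evaluate generalization on the held-out test split (2,190 images).
    return model.evaluate(x_test, y_test)
```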
[0045] A comparative analysis of various deep learning models to evaluate the cleanliness status of tanks was performed. The hybrid dataset was trained with various image segmentation models like Fast FCN, SegNet, and U-Net. To improve the accuracy of cleanliness evaluation, a novel modified U-Net model was proposed and evaluated. The ReLU (Rectified Linear Unit) activation function was used in the convolutional layers of the encoder part of this architecture. This analysis aimed to identify the most effective model that could provide accurate and reliable feedback to enhance the cleaning performance of the autonomous robotic device.
[0046] Models in the Benchmark: Base Model: To compare the outcomes of the different models that were trained, a basic base model was developed for the study. With three convolutional blocks in the encoding path and two upsampling blocks in the decoding path, this model employed a straightforward encoder-decoder architecture. The probability that each pixel in the input image was black could be calculated from the single-channel output with sigmoid activation produced by the final convolutional layer. TABLE-1 below provides details on the parameters, training accuracy and testing accuracy for the Base Model.
[0047] The ReLU (Rectified Linear Unit) activation function was used in the model. ReLU is a commonly used activation function that is applied element-wise to the output of a neural network layer. ReLU returns the maximum of 0 and the input value, and is known for its simplicity and effectiveness in reducing the vanishing gradient problem.
[0048] Softmax was another activation function used in the neural networks. The output values of the softmax function add up to 1, allowing for easy interpretation of the results. In binary classification, it predicts the probability of an input belonging to one of two classes: 0 or 1.
TABLE-1: Parameters, Training and Testing Accuracy for the Base Model
Parameters | Training Accuracy | Testing Accuracy
1,454,791 | 78.24% | 75.9%
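One plausible reading of the base model described in [0046], sketched in Keras, is shown below. The filter counts and input size are assumptions (neither is specified above), and only two of the three encoder blocks downsample so that the two upsampling blocks restore full resolution.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_base_model(input_shape=(128, 128, 3)) -> tf.keras.Model:
    inputs = tf.keras.Input(shape=input_shape)
    # Encoding path: three convolutional blocks with ReLU activations.
    x = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    # Decoding path: two upsampling blocks.
    x = layers.UpSampling2D()(x)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    x = layers.UpSampling2D()(x)
    x = layers.Conv2D(16, 3, padding="same", activation="relu")(x)
    # Single-channel sigmoid output: a per-pixel probability map.
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)
    return tf.keras.Model(inputs, outputs)
```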
[0049] SegNet was used for a variety of image segmentation tasks, including road detection, object tracking, and medical image segmentation. ReLU (Rectified Linear Unit) and sigmoid functions were used as activation functions in this model, along with an Adam optimizer. During this study, 5 encoder layers and 5 decoder layers were used. TABLE-2 provides details on the parameters, training accuracy and testing accuracy for SegNet.
TABLE-2: Parameters, Training and Testing Accuracy for SegNet
Parameters | Training Accuracy | Testing Accuracy
3,545,469 | 88.46% | 80.41%
[0050] Fully Convolutional Network: Fast FCN, or Fast Fully Convolutional Network, was used for a variety of image segmentation tasks, including semantic segmentation, instance segmentation, and medical image analysis. Overall, Fast FCN was a powerful tool for image segmentation that could handle large and complex datasets while still maintaining fast processing times. ReLU (Rectified Linear Unit) and softmax functions were used as activation functions in this model. Here, 4 encoder layers and 4 decoder layers were used. TABLE-3 provides details on the parameters, training accuracy and testing accuracy for the Fast FCN.
TABLE-3: Parameters, Training and Testing Accuracy for the Fast FCN
Parameters | Training Accuracy | Testing Accuracy
1,925,601 | 97.31% | 89.39%

U-Net: For the purpose of segmenting images, the U-Net convolutional neural network architecture was used. The model trained during the study included 10 convolution layers in the encoding path and 3 max pooling layers. The decoding path consisted of 9 convolution layers and 2 convolution transpose layers. Unlike a straightforward upsampling layer, the Conv2DTranspose, or transpose convolutional, layer is more complicated: while upsampling, U-Net executes the upsample operation and learns from the coarse input data how to fill in the information. TABLE-4 provides details on the parameters, training accuracy and testing accuracy for U-Net.

TABLE-4: Parameters, Training and Testing Accuracy for U-Net
Parameters | Training Accuracy | Testing Accuracy
1,944,049 | 98.4% | 87.6%
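The difference between plain upsampling and the learned Conv2DTranspose layer mentioned above can be seen in this small shape-only Keras comparison (untrained weights, arbitrary filter count, both assumptions for illustration):

```python
import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal((1, 16, 16, 64))  # a coarse feature map

fixed = layers.UpSampling2D()(x)       # fixed nearest-neighbour copy: no weights
learned = layers.Conv2DTranspose(32, kernel_size=2, strides=2,
                                 padding="same")(x)  # learned 2x upsampling

print(fixed.shape)    # (1, 32, 32, 64): values repeated, channels unchanged
print(learned.shape)  # (1, 32, 32, 32): trainable filters fill in the detail
```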
[0051] Cleaning Performance Analysis with Modified U-Net Model: It was clearly observed that the previously trained models were not able to generalise and produce good results for the test data, which means that the models were overfitting. To eliminate overfitting, a new model was proposed by adding dropout and batch normalisation layers to the U-Net architecture.
[0052] Modified U-Net Architecture: The architecture included two main blocks: dataset preparation and dirt segmentation using the modified U-Net. FIG. 6 represents the schematic for determining the cleaning efficiency of the robotic device in the cleanliness evaluation unit used for this study. From TABLE-5 it is clear that the modified U-Net model, trained with 25 epochs, a batch size of 16 and a learning rate of 0.001, achieved the highest training accuracy of 99.42% and testing accuracy of 96% compared to all four other models used during the study.
TABLE-5: Comparison of the Training and Testing Accuracy for the Various Trained Models
Model Name | Training Accuracy (%) | Testing Accuracy (%)
Base Model | 78.24 | 75.9
SegNet | 88.46 | 80.41
Fast FCN | 97.31 | 89.39
U-Net | 98.4 | 87.6
Modified U-Net | 99.42 | 96
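As an illustration of the overfitting fix described in [0051], a U-Net-style encoder block with batch normalisation and dropout added could look like the Keras sketch below. The exact placement of these layers, the dropout rate and the filter counts in the modified U-Net are not disclosed above, so all of them are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers

def encoder_block(x, filters: int, dropout_rate: float = 0.3):
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.BatchNormalization()(x)   # stabilises activations across batches
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.BatchNormalization()(x)
    x = layers.Dropout(dropout_rate)(x)  # regularisation against overfitting
    skip = x                             # skip connection kept for the decoder
    return layers.MaxPooling2D()(x), skip
```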
[0053] Example-4: Cleanliness Performance Evaluation:
[0054] Two cameras were attached, one to the floor cleaning module and the other to the wall cleaning module. Based on whether the wall cleaning or the floor cleaning evaluation mode was active, the corresponding camera was activated. The live feed coming from the activated camera was given to the controller board, where the resizing and the segmentation were carried out. Dirt particles were segmented into white pixels and the remaining area was segmented as black pixels.
[0055] Wall cleaning: For the wall cleanliness evaluation mode, the camera was aligned to the area where the wall cleaning brush points. Once any white pixels were detected, the robot stopped its movement and thoroughly cleaned that area for 2 minutes. This duration was chosen based on experimental results, which showed that 2 minutes was the optimal time for effective cleaning with this robot; it allowed the brush to adequately scrub the area, ensuring all dirt was removed without overusing resources or time. Furthermore, once the cleaning started, it triggered the submersible water pump, which pumped water to the cleaning area at intervals of 2 minutes.
[0056] Floor cleaning: For the floor cleanliness evaluation mode, the camera was placed on the front plate of the robot and was aligned to point towards the floor. Once the evaluation mode for the floor cleaning module was activated, the live feed from the camera on the front was sent to the controller board. When white pixels were detected, the robot moved 30 cm forward and cleaned the area for the next 2 minutes. The 30 cm movement was chosen because the camera was mounted on the front plate, while the brush was attached to the back plate, 36 cm apart. With the brush extending 6 cm inward, a 30 cm advance ensured that the robot reached the correct location for thorough cleaning. This distance accounted for both the placement of the camera and the brush's position, ensuring accurate dirt patch detection and effective cleaning.
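The 30 cm figure is simple offset arithmetic, as the short sketch below makes explicit; the movement and cleaning calls are mocks.

```python
CAMERA_TO_BRUSH_CM = 36  # front plate (camera) to back plate (brush)
BRUSH_INSET_CM = 6       # brush extends 6 cm inward from the back plate
ADVANCE_CM = CAMERA_TO_BRUSH_CM - BRUSH_INSET_CM  # 30 cm, as in the text

def on_dirt_detected(move_forward_cm, clean_for_minutes):
    move_forward_cm(ADVANCE_CM)  # bring the brush over the detected dirt patch
    clean_for_minutes(2)         # experimentally chosen 2-minute dwell time

on_dirt_detected(lambda cm: print(f"advance {cm} cm"),
                 lambda m: print(f"clean for {m} min"))
```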
Claims: WE CLAIM:
1. A method (100) for performing tank cleaning and cleanliness evaluation of the tank, the method comprising:
placing (102) a robotic device (200) at a first end of the tank;
receiving a destination point (104) to be reached by the robotic device (200);
receiving a plurality of data sets (106) from a plurality of sensors (208), wherein the sensors are configured to obtain data for a floor or a wall region of the tank;
receiving images (108) of the tank surface from a plurality of cameras (216);
initializing (110) a cleaning module for performing floor cleaning within the floor region of the tank based on current position of the robotic device and activating a counter;
predicting (112) the robotic device’s new position and incrementing the counter corresponding to the new position;
redirecting (114) the robotic device to perform wall cleaning by the cleaning module when the floor cleaning is complete;
determining (116) cleaning efficiency of the robotic device by evaluating the cleanliness parameters in a cleanliness evaluation unit, wherein the cleanliness evaluation comprises:
receiving images of the tank’s wall and floor surfaces from the plurality of cameras;
performing resizing and augmentation of the received images, wherein the images form a hybrid dataset;
providing the hybrid dataset to deep learning models for training;
providing training data from the deep learning models to a modified U-net model;
segmenting the images into black and white pixels, wherein the dirt particles are segmented into white pixels and the remaining area into black pixels;
determining the ratio of the number of white pixels to the total number of pixels;
comparing the ratio with a predetermined threshold value; and
positioning the robotic device based on the presence of the detected white pixels.

2. The method as claimed in claim 1, wherein initializing floor cleaning (110) by the cleaning unit comprises:
determining a position of the robotic device in the tank, initiating floor cleaning and setting the counter as nil;
incrementing the counter by 1 if a wall is detected and turning the robotic device by 90°; and
performing floor cleaning until the counter reaches a predefined threshold.

3. The method as claimed in claim 1, wherein redirecting (114) the robotic device to perform wall cleaning by the cleaning unit comprises:
determining the position of the robotic device in the tank from the wall and setting the counter as nil;
moving the robotic device until the presence of the wall is detected and turning the device by 90°;
determining the position of a wall cleaning brush from the wall and aligning the robotic device to the proximity of the wall until the wall cleaning brush touches the wall; and
incrementing the counter by 1 and performing wall cleaning until the counter reaches a predefined threshold.

4. The method (100) as claimed in claim 3, wherein aligning the robotic device to the wall's proximity until the wall cleaning brush touches the wall is performed by a lead screw coupled with a stepper motor (206).

5. The robotic device (200) for performing tank cleaning and cleanliness evaluation for a tank as claimed in claim 1, the device comprising:
a body (202) attached with a mobility unit (204) and a lead screw coupled with stepper motor (206);
a plurality of sensors (208) mounted on the body (202) to measure a plurality of sensor data in the robotic device’s surrounding environment;
a cleaning module (210) for performing wall cleaning and floor cleaning, the module comprising a floor cleaning unit (212) and a wall cleaning unit (214), positioned on the body (202) to clean the tank;
a plurality of cameras (216) mounted on the cleaning module (210), adapted to receive images of the surrounding environment;
a processing unit (218) for facilitating movement and performing cleanliness evaluation, the processing unit comprises:
a locomotion unit (220) adapted to control movement of the mobility unit (204) of the device in surrounding environment;
a cleanliness evaluation unit (222) for determining the efficiency of the robotic device, the unit is configured to:
receive images of the tank’s wall and floor surfaces from the plurality of cameras;
perform resizing and augmentation of the received images, wherein the images form a hybrid dataset;
provide the hybrid dataset to deep learning models for training;
provide training data from the deep learning models to a modified U-net model;
segment the images into black and white pixels, wherein the dirt particles are segmented into white pixels and the remaining area into black pixels;
determine the ratio of the number of white pixels to the total number of pixels;
compare the ratio with a predetermined threshold value; and
position the robotic device based on the presence of the detected white pixels.

6. The device (200) as claimed in claim 5, wherein the floor cleaning unit (212) is mounted on the body (202) and comprises a motor (224) coupled with a driver (226) to perform horizontal cleaning with a floor cleaning brush (228).

7. The device (200) as claimed in claim 5, wherein the wall cleaning unit (214) is mounted on the lead screw coupled with stepper motor (206) having a motor (230) coupled with a driver (232) connected to a wall cleaning brush (234) to perform vertical cleaning.

8. The device (200) as claimed in claim 5, wherein the mobility unit (204) comprises one or more wheels for navigation.

9. The device (200) as claimed in claim 5, wherein the plurality of sensors (208) comprises an inertial measurement unit, an ultrasonic sensor, a current sensor or a combination thereof.

10. The device(200) as claimed in claim 5, wherein a current sensor determines the position of a wall cleaning brush from the wall of a water tank.

11. The device(200) as claimed in claim 6, wherein the device comprises a communication interface for wired or wireless communication, configured to communicate in real-time with a remote host.


Dr V. SHANKAR
IN/PA-1733
For and on behalf of the Applicants

Documents

Application Documents

# Name Date
1 202541006851-STATEMENT OF UNDERTAKING (FORM 3) [28-01-2025(online)].pdf 2025-01-28
2 202541006851-REQUEST FOR EXAMINATION (FORM-18) [28-01-2025(online)].pdf 2025-01-28
3 202541006851-REQUEST FOR EARLY PUBLICATION(FORM-9) [28-01-2025(online)].pdf 2025-01-28
4 202541006851-OTHERS [28-01-2025(online)].pdf 2025-01-28
5 202541006851-FORM-9 [28-01-2025(online)].pdf 2025-01-28
6 202541006851-FORM-8 [28-01-2025(online)].pdf 2025-01-28
7 202541006851-FORM FOR SMALL ENTITY(FORM-28) [28-01-2025(online)].pdf 2025-01-28
8 202541006851-FORM 18 [28-01-2025(online)].pdf 2025-01-28
9 202541006851-FORM 1 [28-01-2025(online)].pdf 2025-01-28
10 202541006851-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [28-01-2025(online)].pdf 2025-01-28
11 202541006851-EDUCATIONAL INSTITUTION(S) [28-01-2025(online)].pdf 2025-01-28
12 202541006851-DRAWINGS [28-01-2025(online)].pdf 2025-01-28
13 202541006851-DECLARATION OF INVENTORSHIP (FORM 5) [28-01-2025(online)].pdf 2025-01-28
14 202541006851-COMPLETE SPECIFICATION [28-01-2025(online)].pdf 2025-01-28