Abstract: Systems and methods for removing rain streak distortion from a distorted video are described. The system receives sample non-distorted images and sample distorted images of a video. The sample non-distorted images are indicative of a non-raining condition and the sample distorted images are indicative of a raining condition in the video. The system further determines first temporal information from the sample distorted images and second temporal information from the sample non-distorted images. The first temporal information is indicative of a change in the rain streak distortion pattern, and the second temporal information is indicative of a change in a non-rain streak distortion pattern. Further, the system correlates the first temporal information with the second temporal information to generate a training model comprising one or more trained weights. Further, the system removes the rain streak distortion from a real-time distorted video by applying the training model, which results in the generation of a non-distorted video. FIG. 1
Claims:
We claim:
1. A method of removing rain streak distortion from a distorted video, the method comprising:
receiving, by a model generator (206), a plurality of sample non-distorted images (106) and a plurality of sample distorted images (108) of a video (104), wherein the plurality of sample non-distorted images (106) are indicative of a non-raining condition in the video (104), and wherein the plurality of sample distorted images (108) are indicative of a raining condition in the video (104);
determining, by the model generator (206), a first temporal information (222) from the plurality of sample distorted images (108) and a second temporal information (224) from the plurality of sample non-distorted images (106), wherein the first temporal information (222), indicative of a change in the rain streak distortion pattern, comprises a plurality of first set of pixel values corresponding to the plurality of sample distorted images (108), and wherein the second temporal information (224), indicative of a change in a non-rain streak distortion pattern, comprises a plurality of second set of pixel values corresponding to the plurality of sample non-distorted images (106);
correlating, by the model generator (206), the first temporal information (222) with the second temporal information (224) based on the plurality of first set of pixel values and the plurality of second set of pixel values;
generating, by the model generator (206), a training model (110) comprising one or more trained weights (218) based on the correlation; and
removing, by a video converter, the rain streak distortion from a real-time distorted video (112) by applying the training model (110), wherein removal of the rain streak distortion results in the generation of a non-distorted video (116).
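The training flow recited in claim 1 can be illustrated with a minimal numerical sketch. This is an illustrative toy, not the claimed model generator (206): it assumes the rain layer is a single static streak and uses a per-pixel linear weight array as a stand-in for the one or more trained weights (218); the function names are hypothetical.

```python
import numpy as np

def temporal_info(frames):
    # Frame-to-frame pixel differences: the change pattern over time
    # (the first/second temporal information of the claims).
    return np.diff(frames, axis=0)

def train_derain_weights(distorted, clean, lr=0.1, epochs=200):
    # Learn per-pixel weights that remove the rain layer by shrinking
    # the similarity error between reconstructed and clean frames.
    w = np.zeros_like(distorted[0])
    for _ in range(epochs):
        reconstructed = distorted - w      # apply current weights
        error = reconstructed - clean      # reconstruction error
        w += lr * error.mean(axis=0)       # step toward the rain layer
    return w

# Toy data: 4 frames of 3x3 pixels; "rain" is a fixed vertical streak.
rng = np.random.default_rng(0)
clean = rng.uniform(0.0, 1.0, (4, 3, 3))
rain = np.zeros((3, 3))
rain[:, 1] = 0.5
distorted = clean + rain
weights = train_derain_weights(distorted, clean)
```

In the disclosure a training model learns the weights from both image sets; here plain gradient descent on a mean-squared similarity error plays that role. Note that, because the toy streak is static, the temporal differences of the distorted and clean sequences coincide.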
2. The method as claimed in claim 1, wherein the removing of the rain streak distortion from the real-time distorted video (112) comprises:
receiving, by a video capturing unit (208), the real-time distorted video (112) comprising the rain streak distortion;
converting, by a real-time image sequence generator (210), the real-time distorted video (112) into a plurality of real-time distorted images (114);
determining, by an image generating unit (212), a real-time rain streak distortion pattern in the plurality of real-time distorted images (114) by applying the one or more trained weights (218) on the plurality of real-time distorted images (114);
generating, by the image generating unit (212), a plurality of real-time non-distorted images corresponding to the plurality of real-time distorted images (114) by removing the real-time rain streak distortion pattern; and
converting, by the video converter, the plurality of real-time non-distorted images into the non-distorted video (116).
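The per-frame removal steps of claim 2 (split the video, apply the trained weights, reassemble) can be sketched as follows. The video is modeled as a NumPy array of frames, and `derain_video` is a hypothetical stand-in for the image generating unit (212) and video converter, assuming the simple subtractive weight model of the earlier toy.

```python
import numpy as np

def derain_video(frames, trained_w):
    # video -> image sequence, apply the trained weights per frame,
    # then image sequence -> video (stacked frames).
    restored = [np.clip(f - trained_w, 0.0, 1.0) for f in frames]
    return np.stack(restored)

# Toy usage: a static vertical streak assumed to have been learned
# earlier as the trained weights.
rng = np.random.default_rng(1)
clean = rng.uniform(0.0, 1.0, (5, 4, 4))
streak = np.zeros((4, 4))
streak[:, 2] = 0.4
distorted = clean + streak
restored = derain_video(distorted, streak)
```

Clipping to the valid pixel range is an implementation choice here, not something the claims require.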
3. The method as claimed in claim 1, wherein the correlating of the first temporal information (222) and the second temporal information (224) comprises:
comparing the plurality of first set of pixel values with the plurality of second set of pixel values to determine a difference between pixel values;
generating a plurality of reconstructed images based on the comparing; and
comparing the plurality of reconstructed images with the plurality of sample non-distorted images (106) to determine a correlation factor indicating a similarity error between the plurality of reconstructed images and the plurality of sample non-distorted images (106), wherein the correlation factor is minimized by applying the one or more trained weights (218) such that the plurality of reconstructed images becomes similar to the plurality of sample non-distorted images (106).
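The correlation factor of claim 3 can be read as a similarity error to be driven to zero during training. A mean squared difference is one plausible metric (an assumption; the claims do not fix the metric, and the name `correlation_factor` is illustrative):

```python
import numpy as np

def correlation_factor(reconstructed, clean):
    # Similarity error between the reconstructed images and the
    # sample non-distorted images: mean squared pixel difference.
    diff = np.asarray(reconstructed) - np.asarray(clean)
    return float(np.mean(diff ** 2))

clean = np.full((2, 3, 3), 0.5)
good = clean.copy()        # weights fully removed the rain
bad = clean + 0.2          # residual rain left behind
```

A perfect reconstruction yields a factor of zero, while any residual rain layer keeps it strictly positive, which is what makes it usable as a training objective.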
4. The method as claimed in claim 1, wherein the plurality of first set of pixel values, of the first temporal information (222), indicates a change in the rain streak distortion pattern across the plurality of sample distorted images (108) over a time interval.
5. The method as claimed in claim 1, wherein the plurality of second set of pixel values, of the second temporal information (224), indicates a change in non-rain streak distortion pattern across the plurality of sample non-distorted images (106) over a time interval.
6. The method as claimed in claim 1, wherein the training model (110) learns, from the plurality of sample distorted images (108) and the plurality of sample non-distorted images (106), about the change in the rain streak distortion pattern and the change in the non-rain streak distortion pattern respectively.
7. A system (102) for removing rain streak distortion from a distorted video, wherein the system (102) comprises:
a processor (204); and
a memory (216) communicatively coupled to the processor (204), wherein the memory (216) stores processor instructions, which, on execution, cause the processor (204) to:
receive a plurality of sample non-distorted images (106) and a plurality of sample distorted images (108) of a video (104), wherein the plurality of sample non-distorted images (106) are indicative of a non-raining condition in the video (104), and wherein the plurality of sample distorted images (108) are indicative of a raining condition in the video (104),
determine a first temporal information (222) from the plurality of sample distorted images (108) and a second temporal information (224) from the plurality of sample non-distorted images (106), wherein the first temporal information (222), indicative of a change in the rain streak distortion pattern, comprises a plurality of first set of pixel values corresponding to the plurality of sample distorted images (108), and wherein the second temporal information (224), indicative of a change in a non-rain streak distortion pattern, comprises a plurality of second set of pixel values corresponding to the plurality of sample non-distorted images (106),
correlate the first temporal information (222) with the second temporal information (224) based on the plurality of first set of pixel values and the plurality of second set of pixel values, and
generate a training model (110) comprising one or more trained weights (218) based on the correlation; and
remove the rain streak distortion from a real-time distorted video (112) by applying the training model (110), wherein the removal of the rain streak distortion results in the generation of a non-distorted video (116).
8. The system (102) as claimed in claim 7, wherein the system removes the rain streak distortion from the real-time distorted video (112) by:
receiving the real-time distorted video (112) comprising the rain streak distortion;
converting the real-time distorted video (112) into a plurality of real-time distorted images (114);
determining a real-time rain streak distortion pattern in the plurality of real-time distorted images (114) by applying the one or more trained weights (218) on the plurality of real-time distorted images (114);
generating a plurality of real-time non-distorted images corresponding to the plurality of real-time distorted images (114) by removing the real-time rain streak distortion pattern; and
converting the plurality of real-time non-distorted images into the non-distorted video (116).
9. The system (102) as claimed in claim 7, wherein the system performs correlating of the first temporal information (222) and the second temporal information (224) by:
comparing the plurality of first set of pixel values with the plurality of second set of pixel values to determine a difference between pixel values;
generating a plurality of reconstructed images based on the comparing; and
comparing the plurality of reconstructed images with the plurality of sample non-distorted images (106) to determine a correlation factor indicating a similarity error between the plurality of reconstructed images and the plurality of sample non-distorted images (106), wherein the correlation factor is minimized by applying the one or more trained weights (218) such that the plurality of reconstructed images becomes similar to the plurality of sample non-distorted images (106).
10. The system (102) as claimed in claim 7, wherein the plurality of first set of pixel values, of the first temporal information (222), indicates a change in the rain streak distortion pattern across the plurality of sample distorted images (108) over a time interval.
11. The system (102) as claimed in claim 7, wherein the plurality of second set of pixel values, of the second temporal information (224), indicates a change in non-rain streak distortion pattern across the plurality of sample non-distorted images (106) over a time interval.
12. The system (102) as claimed in claim 7, wherein the training model (110) learns, from the plurality of sample distorted images (108) and the plurality of sample non-distorted images (106), about the change in the rain streak distortion pattern and the change in the non-rain streak distortion pattern respectively.
Dated this 10th day of August, 2017
Swetha S N
Of K&S Partners
Agent for the Applicant
Description:
TECHNICAL FIELD
The present disclosure relates in general to image processing and deep learning. More particularly, but not exclusively, the present disclosure discloses a method and system for removing rain streak distortion from a distorted video.
| # | Name | Date |
|---|---|---|
| 1 | 201741028472-STATEMENT OF UNDERTAKING (FORM 3) [10-08-2017(online)].pdf | 2017-08-10 |
| 2 | 201741028472-REQUEST FOR EXAMINATION (FORM-18) [10-08-2017(online)].pdf | 2017-08-10 |
| 3 | 201741028472-POWER OF AUTHORITY [10-08-2017(online)].pdf | 2017-08-10 |
| 4 | 201741028472-FORM 18 [10-08-2017(online)].pdf | 2017-08-10 |
| 5 | 201741028472-FORM 1 [10-08-2017(online)].pdf | 2017-08-10 |
| 6 | 201741028472-DRAWINGS [10-08-2017(online)].pdf | 2017-08-10 |
| 7 | 201741028472-DECLARATION OF INVENTORSHIP (FORM 5) [10-08-2017(online)].pdf | 2017-08-10 |
| 8 | 201741028472-COMPLETE SPECIFICATION [10-08-2017(online)].pdf | 2017-08-10 |
| 9 | 201741028472-REQUEST FOR CERTIFIED COPY [11-08-2017(online)].pdf | 2017-08-11 |
| 10 | 201741028472-Proof of Right (MANDATORY) [11-10-2017(online)].pdf | 2017-10-11 |
| 11 | Correspondence By Agent_Form1,30_13-10-2017.pdf | 2017-10-13 |
| 12 | 201741028472-FER.pdf | 2020-05-15 |
| 13 | 201741028472-Information under section 8(2) [04-09-2020(online)].pdf | 2020-09-04 |
| 14 | 201741028472-FORM 3 [04-09-2020(online)].pdf | 2020-09-04 |
| 15 | 201741028472-PETITION UNDER RULE 137 [07-09-2020(online)].pdf | 2020-09-07 |
| 16 | 201741028472-FER_SER_REPLY [07-09-2020(online)].pdf | 2020-09-07 |
| 17 | 201741028472-US(14)-HearingNotice-(HearingDate-19-07-2022).pdf | 2022-06-22 |
| 18 | 201741028472-POA [01-07-2022(online)].pdf | 2022-07-01 |
| 19 | 201741028472-FORM 13 [01-07-2022(online)].pdf | 2022-07-01 |
| 20 | 201741028472-Correspondence to notify the Controller [01-07-2022(online)].pdf | 2022-07-01 |
| 21 | 201741028472-AMENDED DOCUMENTS [01-07-2022(online)].pdf | 2022-07-01 |
| 22 | 201741028472-Written submissions and relevant documents [02-08-2022(online)].pdf | 2022-08-02 |
| 23 | 201741028472-PatentCertificate05-12-2022.pdf | 2022-12-05 |
| 24 | 201741028472-IntimationOfGrant05-12-2022.pdf | 2022-12-05 |