
A Method For Providing Color Vision Deficiency Assistance

Abstract: The present invention discloses a system and a method for providing color vision deficiency assistance by selectively recoloring pixels of an image frame including unperceivable colors with perceivable colors in real time. In particular, the present invention provides for rescaling the image frame using a first set of rules. Further, a perceivable color space is selected based on one or more parameters. Furthermore, one or more pixels associated with unperceivable colors are identified using a second set of rules. Yet further, the identified one or more pixels are recolored using the selected perceivable color space and a third set of rules. Finally, a corrected image frame comprising perceivable colors is provided.

Patent Information

Application #: 201941017347
Filing Date: 01 May 2019
Publication Number: 45/2020
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status: Granted
Email: dev.robinson@amsshardul.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-05-28
Renewal Date:

Applicants

Cognizant Technology Solutions India Pvt. Ltd.
Techno Complex, No. 5/535, Old Mahabalipuram Road, Okkiyam Thoraipakkam, Chennai 600 097, Tamil Nadu, India

Inventors

1. Avinandan Bandyopadhyay
10/1 Ghoshal Para Road, PO: Barasat, Kolkata – 700124, West Bengal, India
2. Subhas Chakraborty
15/B/1 Moore Avenue, PS – Regent Park, Kolkata – 700040, West Bengal, India
3. Ramesh Yechangunja
204 Esteem Classic, #25/2 Industrial Suburb, I Stage Rajajinagar, Bangalore – 560022, Karnataka, India

Specification

We claim:
1. A method for providing color vision deficiency assistance by selectively recoloring pixels of an image frame including unperceivable colors, wherein the method is implemented by at least one processor executing program instructions stored in a memory, the method comprising:
rescaling the image frame using a first set of rules, wherein the first set of rules comprises converting the retrieved image of resolution M by N into a resolution P by Q;
selecting a predefined perceivable color space based on one or more parameters;
identifying one or more pixels of the rescaled image frame associated with respective unperceivable colors for recoloring using a second set of rules; and
recoloring the identified one or more pixels using the selected perceivable color space and a third set of rules.
2. The method as claimed in claim 1, wherein the first set of rules comprises converting the retrieved image frame of resolution M by N into a desired resolution P by Q using the equation P / X = 2^i, where (2^i x X) = M, P / X >= 1 and Q / Y = 2^j, where (2^j x Y) = N, Q / Y >= 1, wherein M, N, P, Q, X, Y, i and j are integers.
3. The method as claimed in claim 1, wherein selecting a predefined color space comprises:
converting an initial color space of the rescaled image frame into a conversion color space based on the one or more parameters including closeness of the color space with human perception of colors, wherein conversion color space values for each pixel of the rescaled image frame are derived from the corresponding initial color space values;
categorizing the conversion color space into a perceivable color space and an unperceivable color space based on one or more unperceivable colors in the rescaled image, wherein the one or more unperceivable colors are detected based on the one or more parameters including type of color vision deficiency and one or more inputs received from the color deficient individual; and
selecting the perceivable color space for recoloring.
4. The method as claimed in claim 3, wherein the initial color space is RGB (Red, Green, Blue) color space and the conversion color space is selected from HSL (Hue, Saturation, Lightness), HSV (Hue, Saturation, Value), LMS (Long-range, middle-range, short-range wavelength) color space or any other color space which aligns closely with human perception of colors.
5. The method as claimed in claim 3, wherein the one or more inputs from the color deficient individual include fine tuning an extent to which the unperceivable colors are not visible based on a degree of severity of color vision deficiency.
6. The method as claimed in claim 1, wherein the second set of rules includes:
constructing a quasi-region based quad-tree data structure on the rescaled image frame, wherein the rescaled image frame is representative of a root node comprising a plurality of pixels, wherein further the root node is recursively split into four, two or zero equal smaller nodes upto a plurality of leaf nodes, each smaller node and each leaf node comprising one or more center pixel(s) and a plurality of border pixels;
identifying each leaf node comprising one or more center pixels and a plurality of border pixels having one or more unperceivable colors, wherein the identification is performed by analyzing the color space values of each leaf node and rejecting the leaf nodes with border pixels and center pixels in the perceivable color space;
determining one or more contender nodes comprising one or more center pixels having unperceivable colors different from each of the corresponding plurality of border pixels, wherein the determination is performed by recursively analyzing color space values of the one or more center pixels and the plurality of border pixels of each of the identified leaf node until one or more center pixels are found to be having unperceivable colors different from each of the corresponding plurality of border pixels;
selecting the one or more center pixels of each of the contender nodes for recoloring; and
repeating the step of determining contender nodes and selecting the center pixels of the contender node for recoloring, wherein the determination is performed on the higher nodes of the identified leaf node up to the root node if the one or more unperceivable colors of the center pixels of the leaf nodes are not found to be different from the corresponding border pixels.
7. The method as claimed in claim 6, wherein the third set of rules comprises:
identifying a color mapping between an unperceivable color associated with one of the identified one or more pixels and a perceivable color in a color compensator table for each of the identified one or more pixels based on determination of availability of a color mapping in the color compensator table; and
recoloring one of the identified one or more pixels with the identified color mapping if color of each of the corresponding border pixels is found to be different from the perceivable color mapped in the color compensator table, and wherein the step of identifying and recoloring is repeated for the other identified pixels.
8. The method as claimed in claim 1, wherein the third set of rules comprises choosing a new color from the selected perceivable color space using a fourth set of rules for recoloring the identified one or more pixels if a color compensator table is empty, further wherein the color compensator table is a dynamic table populated with color mapping information to maintain a record of each perceivable color mapped with one or more unperceivable colors.
9. The method as claimed in claim 7, wherein the third set of rules comprises choosing a new color from the selected perceivable color space using a fourth set of rules for recoloring the identified one or more pixels associated with the unperceivable color if color of any of the corresponding border pixels is same as the perceivable color mapped in the color compensator table.
10. The method as claimed in claim 9, wherein the fourth set of rules comprises selecting the new color for one of the identified pixels associated with the unperceivable color by computing minimum distance between color space value associated with the unperceivable color and color space values of each of the corresponding border pixel colors, wherein based on the computation a Position (P',Q') of the corresponding border pixels with minimum distance is determined, and wherein the new color having a distance nearest to the computed minimum distance is chosen from the selected perceivable color space if the border pixel at Position (P',Q') is found to be recolored, and wherein the step of selecting the new color is repeated for the other identified pixels.
11. The method as claimed in claim 10, wherein the new color is chosen from the selected perceivable color space for recoloring one of the identified pixels if the pixel at Position (P',Q') is not recolored, wherein the new color is chosen based on a validation that the new color is different from each of the corresponding border pixels, and wherein the step of selecting the new color from the selected perceivable color space is repeated for the other identified pixels, if the pixel at Position (P',Q') is not recolored.
12. The method as claimed in claim 1, wherein a percentage of colors available in the selected perceivable color space are replenished if no new color is available for selection.
13. The method as claimed in claim 1, wherein a machine learning model is generated based on the recolored image frames and the corresponding input image frames, wherein further the machine learning model selectively recolors pixels of incoming image frames including unperceivable colors.
14. A system for providing color vision deficiency assistance in real time, the system comprising:
a memory storing program instructions; a processor configured to execute program instructions stored in the memory; and a color correction engine in communication with the processor and configured to:
rescale a retrieved image frame using a first set of rules, wherein the first set of rules comprises converting the retrieved image of resolution M by N into a resolution P by Q;
select a predefined perceivable color space based on one or more parameters;
identify one or more pixels of the rescaled image frame associated with one or more unperceivable colors for recoloring using a second set of rules; and
recolor the identified one or more pixels using the selected perceivable color space and a third set of rules.
15. The system as claimed in claim 14, wherein the color correction engine comprises a scaling unit in communication with the processor, said scaling unit interfaces with an image subsystem and is configured to rescale the image frame retrieved from the image subsystem using the first set of rules, wherein the first set of rules comprises converting the retrieved image frame of resolution M by N into a desired resolution P by Q using the equation P / X = 2^i, where (2^i x X) = M, P / X >= 1 and Q / Y = 2^j, where (2^j x Y) = N, Q / Y >= 1, wherein M, N, P, Q, X, Y, i and j are integers.
16. The system as claimed in claim 14, wherein the color correction engine comprises a color compensator unit in communication with the processor, said color compensator unit configured to select the predefined color space by:
converting an initial color space of the rescaled image frame into a conversion color space based on the one or more parameters including closeness of the color space with human perception of colors, wherein conversion color space values for each pixel of the rescaled image frame are derived from the corresponding initial color space values;
categorizing the conversion color space into a perceivable color space and an unperceivable color space based on one or more unperceivable colors in the rescaled image, wherein the one or more unperceivable colors are detected based on the one or more parameters including type of color vision deficiency and one or more inputs received from the color deficient individual; and
selecting the perceivable color space for recoloring.
17. The system as claimed in claim 16, wherein the initial color space is RGB (Red, Green, Blue) color space and the conversion color space is selected from HSL (Hue, Saturation, Lightness), HSV (Hue, Saturation, Value), LMS (Long-range, middle-range, short-range wavelength) color space or any other color space which aligns closely with human perception of colors.
18. The system as claimed in claim 16, wherein the one or more inputs from the color deficient individual include fine tuning an extent to which the unperceivable colors are not visible based on a degree of severity of color vision deficiency.
19. The system as claimed in claim 14, wherein the color correction engine comprises a recoloring unit in communication with the processor, said recoloring unit configured to identify the one or more pixels of the rescaled image frame for recoloring using the second set of rules, wherein the second set of rules comprises:
constructing a quasi-region based quad-tree data structure on the rescaled image frame, wherein the rescaled image frame is representative of a root node comprising a plurality of pixels, wherein further the root node is recursively split into four, two or zero equal smaller nodes upto a plurality of leaf nodes, each smaller node and each leaf node comprising one or more center pixel(s) and a plurality of border pixels;
identifying each leaf node comprising one or more center pixels and a plurality of border pixels having one or more unperceivable colors, wherein the identification is performed by analyzing the color space values of each leaf node and rejecting the leaf nodes with border pixels and center pixels in the perceivable color space;
determining one or more contender nodes comprising one or more center pixels having unperceivable colors different from each of the corresponding plurality of border pixels, wherein the determination is performed by recursively analyzing color space values of the one or more center pixels and the plurality of border pixels of each of the identified leaf node until one or more center pixels are found to be having unperceivable colors different from each of the corresponding plurality of border pixels;
selecting the one or more center pixels of each of the contender nodes for recoloring; and
repeating the step of determining contender nodes and selecting the center pixels of the contender node for recoloring, wherein the determination is performed on the higher nodes of the identified leaf node up to the root node if the one or more unperceivable colors of the center pixels of the leaf nodes are not found to be different from the corresponding border pixels.
20. The system as claimed in claim 19, wherein the recoloring unit is configured to recolor the identified one or more pixels using the third set of rules and the selected perceivable color space, wherein the third set of rules comprises:
identifying a color mapping between an unperceivable color associated with one of the identified one or more pixels and a perceivable color in a color compensator table for each of the identified one or more pixels based on determination of availability of a color mapping in the color compensator table; and
recoloring one of the identified one or more pixels with the identified color mapping if color of each of the corresponding border pixels is found to be different from the perceivable color mapped in the color compensator table, and wherein the step of identifying and recoloring is repeated for the other identified pixels.
21. The system as claimed in claim 14, wherein the third set of rules comprises choosing a new color from the selected perceivable color space using a fourth set of rules for recoloring the identified one or more pixels if a color compensator table is empty, further wherein the color compensator table is a dynamic table populated with color mapping information to maintain a record of each perceivable color mapped with one or more unperceivable colors.
22. The system as claimed in claim 20, wherein the third set of rules comprises choosing a new color from the selected perceivable color space using a fourth set of rules for recoloring the identified one or more pixels associated with the unperceivable color if color of any of the corresponding border pixels is same as the perceivable color mapped in the color compensator table.
23. The system as claimed in claim 22, wherein the fourth set of rules comprises selecting the new color for one of the identified pixels associated with the unperceivable color by computing minimum distance between the color space value associated with the unperceivable color and the color space values of each of the corresponding border pixel colors, wherein based on the computation a Position (P',Q') of the corresponding border pixels with minimum distance is determined, and wherein the new color having a distance nearest to the computed minimum distance is chosen from the selected perceivable color space if the border pixel at Position (P',Q') is found to be recolored, and wherein the step of selecting the new color is repeated for the other identified pixels.
24. The system as claimed in claim 23, wherein the new color is chosen from the selected perceivable color space for recoloring one of the identified pixels if the pixel at Position (P',Q') is not recolored, wherein the new color is chosen based on a validation that the new color is different from each of the corresponding border pixels, and wherein the step of selecting the new color from the selected perceivable color space is repeated for the other identified pixels, if the pixel at Position (P',Q') is not recolored.
25. The system as claimed in claim 14, wherein a percentage of colors available in the selected perceivable color space are replenished based on determination of unavailability of any new color for selection.
Dated this 1st day of May, 2019.
Cognizant Technology Solutions India Pvt. Ltd.
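
As an illustration of claims 1-2 and 14-15, the rescaling step maps an M x N frame to a resolution P x Q tied to powers of two (P / X = 2^i, Q / Y = 2^j). The following is a minimal sketch, assuming the target side is the largest 2^i times a small base factor that still fits in the original side; the base factors and the stopping rule are illustrative assumptions, not values taken from the specification.

```python
# Assumed interpretation of the rescaling rule: pick the largest P = base * 2^i
# (and Q = base * 2^j) that does not exceed the original dimension.

def rescaled_side(side: int, base: int) -> int:
    """Largest base * 2^i that does not exceed `side` (assumed rule)."""
    target = base
    while target * 2 <= side:
        target *= 2
    return target

def rescaled_resolution(m: int, n: int, x: int = 3, y: int = 3) -> tuple[int, int]:
    """Target resolution P x Q for an input frame of resolution M x N.
    X = Y = 3 is only an illustrative base factor."""
    return rescaled_side(m, x), rescaled_side(n, y)

print(rescaled_resolution(1920, 1080))  # (1536, 768) with the illustrative bases
```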
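Claims 3-5 (and 16-18) convert RGB into a color space that tracks human perception more closely and split it into perceivable and unperceivable colors for a given deficiency type. A hedged sketch using HSV via Python's standard colorsys module is shown below; the hue windows are placeholders standing in for the deficiency-type parameters and the user fine-tuning of claim 5, not values from the specification.

```python
import colorsys

# Assumed hue windows (degrees) that the viewer cannot reliably distinguish;
# a real system would derive these from the deficiency type and user input.
UNPERCEIVABLE_HUES = {
    "protanopia":   [(0, 40), (90, 160)],
    "deuteranopia": [(60, 170)],
}

def to_hsv(rgb: tuple[int, int, int]) -> tuple[float, float, float]:
    """RGB (0-255) to HSV with hue in degrees."""
    r, g, b = (c / 255 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return h * 360, s, v

def is_unperceivable(rgb: tuple[int, int, int], deficiency: str) -> bool:
    hue, _, _ = to_hsv(rgb)
    return any(lo <= hue <= hi for lo, hi in UNPERCEIVABLE_HUES[deficiency])

print(is_unperceivable((200, 30, 30), "protanopia"))  # True: red hue falls in an assumed window
print(is_unperceivable((30, 30, 200), "protanopia"))  # False: blue hue stays perceivable
```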
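Claims 6 and 19 select pixels for recoloring by building a quasi-region quad-tree and looking for contender nodes whose center pixel carries an unperceivable color differing from every border pixel of the node. The sketch below is a simplification: it works on a grid of boolean "unperceivable" flags, implements only the plain four-way split and the leaf test, and omits the two-way/zero-way splits and the climb back toward the root described in the claim.

```python
def node_center_and_border(x0, y0, w, h):
    """Center pixel and border pixels of a node (assumed layout)."""
    center = (x0 + w // 2, y0 + h // 2)
    border = ([(x, y0) for x in range(x0, x0 + w)]
              + [(x, y0 + h - 1) for x in range(x0, x0 + w)]
              + [(x0, y) for y in range(y0, y0 + h)]
              + [(x0 + w - 1, y) for y in range(y0, y0 + h)])
    return center, border

def contender_centers(flags, x0=0, y0=0, w=None, h=None, min_size=4):
    """Collect node centers whose flag is set while no border pixel's flag is set."""
    if w is None:
        h, w = len(flags), len(flags[0])
    (cx, cy), border = node_center_and_border(x0, y0, w, h)
    found = []
    if flags[cy][cx] and not any(flags[y][x] for x, y in border):
        found.append((cx, cy))
    if w > min_size and h > min_size:          # recurse into four equal children
        hw, hh = w // 2, h // 2
        for dx, dy in ((0, 0), (hw, 0), (0, hh), (hw, hh)):
            found += contender_centers(flags, x0 + dx, y0 + dy, hw, hh, min_size)
    return found

# Tiny example: one unperceivable pixel sitting at the center of a 4 x 4 leaf node.
flags = [[False] * 8 for _ in range(8)]
flags[2][2] = True
print(contender_centers(flags))   # [(2, 2)]
```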
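Claims 7-9 (and 20-22) maintain a color compensator table recording which perceivable color replaced each unperceivable color, so the same substitution can be reused unless a border pixel of the current pixel already shows that replacement. A minimal sketch, assuming a plain dictionary for the table and delegating the choice of a fresh color to a caller-supplied function (the "fourth set of rules" is sketched separately below).

```python
# unperceivable colour -> perceivable replacement colour (claim 8: a dynamic table)
compensator_table: dict[tuple, tuple] = {}

def recolor_pixel(unperceivable, border_colors, pick_new_color):
    """Reuse the recorded mapping when no border pixel already wears it,
    otherwise ask for a fresh colour and record the new mapping."""
    mapped = compensator_table.get(unperceivable)
    if mapped is not None and mapped not in border_colors:
        return mapped
    new_color = pick_new_color(unperceivable, border_colors)
    compensator_table[unperceivable] = new_color
    return new_color

# Example with a trivial picker that always proposes pure blue.
print(recolor_pixel((255, 0, 0), {(0, 255, 0)}, lambda c, b: (0, 0, 255)))  # fresh mapping
print(recolor_pixel((255, 0, 0), {(0, 255, 0)}, lambda c, b: (0, 0, 255)))  # mapping reused
```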
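Claims 10-11 (and 23-24) pick the fresh color by measuring the distance between the unperceivable color and each border pixel color, locating the border position (P',Q') at minimum distance, and then either matching that distance within the perceivable color space (when the pixel at (P',Q') was itself recolored) or taking any perceivable color that differs from every border pixel. The sketch below assumes Euclidean distance in the conversion color space; the claims do not name a specific metric.

```python
import math

def pick_new_color(unperceivable, border, perceivable_palette, recolored_positions):
    """border: {position: colour}; recolored_positions: border positions already recoloured."""
    nearest_pos = min(border, key=lambda pos: math.dist(unperceivable, border[pos]))
    min_d = math.dist(unperceivable, border[nearest_pos])
    if nearest_pos in recolored_positions:
        # colour whose distance from the unperceivable colour best matches min_d
        return min(perceivable_palette,
                   key=lambda c: abs(math.dist(unperceivable, c) - min_d))
    # otherwise any perceivable colour different from every border pixel will do
    return next(c for c in perceivable_palette if c not in border.values())

border = {(0, 0): (90.0, 0.5, 0.5), (0, 1): (200.0, 0.4, 0.6)}
palette = [(210.0, 0.5, 0.5), (250.0, 0.6, 0.4)]
print(pick_new_color((10.0, 0.8, 0.5), border, palette, recolored_positions=set()))
```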

Documents

Application Documents

# Name Date
1 201941017347-STATEMENT OF UNDERTAKING (FORM 3) [01-05-2019(online)].pdf 2019-05-01
2 201941017347-PROOF OF RIGHT [01-05-2019(online)].pdf 2019-05-01
3 201941017347-POWER OF AUTHORITY [01-05-2019(online)].pdf 2019-05-01
4 201941017347-FORM 1 [01-05-2019(online)].pdf 2019-05-01
5 201941017347-DRAWINGS [01-05-2019(online)].pdf 2019-05-01
6 201941017347-COMPLETE SPECIFICATION [01-05-2019(online)].pdf 2019-05-01
7 201941017347-FORM 18 [03-05-2019(online)].pdf 2019-05-03
8 Correspondence by Agent_Form1-Power of Attorney_06-05-2019.pdf 2019-05-06
9 201941017347-Request Letter-Correspondence [17-05-2019(online)].pdf 2019-05-17
10 201941017347-Form 1 (Submitted on date of filing) [17-05-2019(online)].pdf 2019-05-17
11 201941017347-FORM 3 [13-09-2019(online)].pdf 2019-09-13
12 201941017347-Information under section 8(2) [21-09-2021(online)].pdf 2021-09-21
13 201941017347-FORM 3 [21-09-2021(online)].pdf 2021-09-21
14 201941017347-FER_SER_REPLY [21-09-2021(online)].pdf 2021-09-21
15 201941017347-DRAWING [21-09-2021(online)].pdf 2021-09-21
16 201941017347-COMPLETE SPECIFICATION [21-09-2021(online)].pdf 2021-09-21
17 201941017347-CLAIMS [21-09-2021(online)].pdf 2021-09-21
18 201941017347-ABSTRACT [21-09-2021(online)].pdf 2021-09-21
19 201941017347-FER.pdf 2021-10-17
20 201941017347-US(14)-HearingNotice-(HearingDate-13-05-2024).pdf 2024-04-16
21 201941017347-FORM-26 [24-04-2024(online)].pdf 2024-04-24
22 201941017347-Correspondence to notify the Controller [24-04-2024(online)].pdf 2024-04-24
23 201941017347-Written submissions and relevant documents [28-05-2024(online)].pdf 2024-05-28
24 201941017347-PatentCertificate28-05-2024.pdf 2024-05-28
25 201941017347-IntimationOfGrant28-05-2024.pdf 2024-05-28

Search Strategy

1 searchstrategyE_18-03-2021.pdf

E-Register / Renewals

3rd: 14 Jun 2024 (for 01/05/2021 to 01/05/2022)
4th: 14 Jun 2024 (for 01/05/2022 to 01/05/2023)
5th: 14 Jun 2024 (for 01/05/2023 to 01/05/2024)
6th: 14 Jun 2024 (for 01/05/2024 to 01/05/2025)
7th: 14 Jun 2024 (for 01/05/2025 to 01/05/2026)
8th: 14 Jun 2024 (for 01/05/2026 to 01/05/2027)
9th: 14 Jun 2024 (for 01/05/2027 to 01/05/2028)
10th: 14 Jun 2024 (for 01/05/2028 to 01/05/2029)
11th: 14 Jun 2024 (for 01/05/2029 to 01/05/2030)
12th: 14 Jun 2024 (for 01/05/2030 to 01/05/2031)
13th: 14 Jun 2024 (for 01/05/2031 to 01/05/2032)
14th: 14 Jun 2024 (for 01/05/2032 to 01/05/2033)
15th: 14 Jun 2024 (for 01/05/2033 to 01/05/2034)