
"Automatic Font Color Selection."

Abstract: A method for automating font color selection and placing a human understandable object, such as a caption, text or a figurative element, in an image, comprising the steps of: selecting the area in which the human understandable object is to be inserted; converting the RGB values into HSL values and forming a histogram of the pixels in the area in which the said human understandable object is to be inserted; identifying the dominating colors of the selected area from the said histogram; removing the said dominating colors from the set of candidate colors; giving each remaining candidate color a rating depending on the weighted color and brightness difference between the said candidate and the non-candidate colors; selecting the color with the highest rating as the font color; and rendering the digital image with the human understandable object overlaid substantially at the location selected by the user.


Patent Information

Application #
Filing Date: 07 November 2006
Publication Number: 20/2008
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status
Parent Application

Applicants

SAMSUNG INDIA ELECTRONICS PVT. LTD
B-1, SECTOR-81, PHASE II NOIDA-201305, INDIA

Inventors

1. PANKAJ MISHRA
C/O SAMSUNG INDIA ELECTRONICS PVT. LTD., OF B-1, SECTOR-81, PHASE II NOIDA-201305, INDIA
2. SAURABH JAIN
C/O SAMSUNG INDIA ELECTRONICS PVT. LTD., OF B-1, SECTOR-81, PHASE II NOIDA-201305, INDIA
3. GEETIKA SHARMA
C/O SAMSUNG INDIA ELECTRONICS PVT. LTD., OF B-1, SECTOR-81, PHASE II NOIDA-201305, INDIA
4. SURESH VARIGANJI
C/O SAMSUNG INDIA ELECTRONICS PVT. LTD., OF B-1, SECTOR-81, PHASE II NOIDA-201305, INDIA

Specification

Title
Automatic font color selection
Field of the invention
The invention relates to a method for automatic font color selection for overlaying a human understandable object on a user-defined portion of a digital image.
Background of the Invention:
In many televisions or displays, text is combined with an image so that the two are displayed together. In these cases, the text or graphic object constitutes a foreground object whereas the image is the background. A combination of text and image is used, for example, to illustrate the text by means of the background image or vice versa. The overlaid text sometimes becomes illegible or difficult to read when superimposed on a background. This occurs when the contrast between the superimposed information and the background is diminished due to excessive brightness or other display problems.
In television sets a user is often confronted with the problem of matching a foreground text object to a given background image in such a way that the text is clearly legible. In such a case, the selection of a suitable color is difficult owing to the usually large number of possible and available colors for the text object.
US Patent No. 6711291 describes a method for automatically determining a region of interest in an image and placing a human understandable item, such as a caption, text or a figurative element, in a digital image. The method mainly comprises two steps: first, digitally processing the digital image to recognize and identify an optimal location in the digital image for placing the item; and second, modifying the placement of the item in relation to human understandable image content so as to minimally obscure such content in the digital image. The recognition and identification of an optimal location further includes identifying an optimal open-space region and then finding an optimal location for the item within that region. In the aforementioned method an optimal location is selected on the basis of smooth, open spaces in the image. It is possible that no such space exists in the image at hand; for example, the user may provide an image of a natural scene with large color and texture variations all over it. In short, the method does not work on a large class of images and therefore has limited applicability. Even if an open region is determined by the first step, it may not be acceptable to the user, as the user may want to insert the text at a particular location. This location may not satisfy the criteria used to determine an open region and so will not be identified by the algorithm as a candidate region.
US Patent Application No. 20050157926 describes an apparatus, method and article of manufacture therefor, for automatically determining a foreground color for a digital image. The apparatus includes a color clustering module and a color selection module. The color clustering module automatically divides the colors of the pixels of at least a part of the digital image into a number of color clusters in a color space. The color selection module automatically selects, for at least one color cluster, a color that is related to that cluster according to predetermined criteria. The above patent does not mention whether the text is predefined or user defined.
US Patent Application No. 20030097475 relates to an invention which provides color translation between a server-side application and a web browser application. A color object created on the server side is reduced into the individual color components defining the color property for the object. These discrete color components are then converted into numerical values representing the intensity of the respective components. For objects including text information, the color property for the text is automatically determined, based on the translated color object, to achieve a high contrast between the background color and the foreground or text color. The above patent explains a web-server system where an auto text color function can change the color of the text by comparing the RGB values of the text with the background color. Also, the system is described for use in HTML applications.
US Patent No 7064759 relates to a method for displaying text over a background, which in turn improves the legibility of the text that is displayed against the background. The term background is used to refer to any two-dimensional surface or three-dimensional object. The same method is used to improve the contrast between an image and a background over which the image is displayed. Selecting one or more color component values for use in generating the text increases the contrast between the text and background. One of the drawbacks of the aforementioned method is that it does not provide for changing the attributes of the font other than color.
US Patent No. 5721792 discloses an apparatus and method for superimposing text information on a video background signal. The apparatus includes means to input the video background signal, a text generator for generating a text signal containing the text information to be superimposed on the video background signal, and automatic contrast circuit means for selectively controlling the brightness of the text signal so as to maintain at least a predetermined minimum contrast between the brightness of the text signal and the brightness of the video background signal, even as the latter varies. This patent describes a process of changing the brightness value of text overlaid on an image and does not relate to changing the font color or other attributes.
One of the drawbacks of the discussed prior art is that the automatic color selection strategy relies heavily on the region selection module to discover an appropriate region in which to insert the item. Such a region should have little or no color variation. Once such a location has been found, it is easy to find a contrasting color, as there would be only a few colors or different shades of a single color appearing in the selected region. The algorithm will fail if there is no such region occurring in the image, as the set of candidate colors would become empty.

Summary of the Invention:
To overcome the abovementioned drawbacks the invention provides an automated method of font color selection for overlaying text on a user-defined portion of a digital image.
It is one of the objects of the invention to provide for automated font color selection for a string of text to be inserted at a particular location in an image.
It is one of the aspects of the invention to allow users to select a suitable location within the image for overlaying the item.
It is one of the aspects of the invention to provide for improved contrast ratio between the overlaying item and the background.
It is further one of the objects of the invention to allow users to select the item to be superimposed on the image.
It is further one of the objects of the invention to provide for improved visibility of the overlaying item against the background.
To achieve the aforementioned objects the present invention provides a method for automating font color selection and placing a human understandable object, such as a caption, text or a figurative element, in an image, comprising the steps of:
• selecting the area in which the human understandable object is to be inserted,
• converting the RGB values into HSL values and forming a histogram of the pixels in the area in which the said human understandable object is to be inserted,
• identifying the dominating colors of the selected area from the said histogram,
• removing the said dominating colors from the set of candidate colors,
• giving each remaining candidate color a rating depending on the weighted color and brightness difference between the said candidate and the non-candidate colors,
• selecting the color with the highest rating as the font color,
• rendering the digital image with the human understandable object overlaid substantially at the location selected by the user.
The present invention allows a user to select a location in an image as well as the text to be inserted at the desired location. The font color of the text is then chosen automatically to contrast all the colors that occur in the region occupied by the text. The invention has primarily been made for a digital television, which allows users to store images in it. The images may additionally be annotated with a string of text, which is overlaid on the image. The present invention may also be included with the software for a digital camera or digital camcorder, providing users the additional functionality of labeling their photos and videos.
Brief Description of the accompanying drawings:
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated device, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates.
Figure 1 is a diagram of the steps involved in selecting a location and font color of the text in an image.
Figure 2 illustrates an example of the background image.

Detailed description of the invention:
The invention comprises a single module, which takes as input an image, the string of text to be overlaid, and the location at which the text is to be inserted. The output is a color which maximally contrasts the colors that occur in the area of the image covered by the text.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
Throughout the patent specification, the convention employed is that in the appended drawings like numerals denote like components.
Figure 1 illustrates the steps involved in selecting a suitable font color for a human understandable object to be overlaid on an image. (101) First, the user selects the location within the image where the said human understandable object is to be superimposed. (102) The area of the image that will be covered by the text is determined, and the RGB (Red Green Blue) values of the pixels within that area are read and converted into HSL (Hue Saturation Luminance) co-ordinates. (103) The HSL color space is quantized. Each of the three dimensions may be quantized differently; for example, 36 divisions in H, 4 in S and 10 in L provide 1440 bins for the entire space. (104) A list of non-candidate colors is created with respect to the location selected by the user for the placement of the text string. (105) The rest of the colors in the quantized HSL space are marked as candidate colors. (106) Each candidate color is rated based on the color difference criterion and the brightness difference criterion given below, i.e.
Color difference = (max(R1, R2) - min(R1, R2)) + (max(G1, G2) - min(G1, G2)) + (max(B1, B2) - min(B1, B2))
and brightness difference = | (299*R1 + 587*G1 + 114*B1) - (299*R2 + 587*G2 + 114*B2) | / 1000
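A minimal Python sketch of the two criteria follows, with a small worked example. The function names are illustrative only, colors are assumed to be (R, G, B) tuples in the 0-255 range, and the weighting used to combine the two criteria into a single rating is not specified above, so it is not fixed here.

```python
# Sketch of the color-difference and brightness-difference criteria above.
def color_difference(c1, c2):
    (r1, g1, b1), (r2, g2, b2) = c1, c2
    return (max(r1, r2) - min(r1, r2)) + (max(g1, g2) - min(g1, g2)) + (max(b1, b2) - min(b1, b2))

def brightness_difference(c1, c2):
    (r1, g1, b1), (r2, g2, b2) = c1, c2
    return abs((299 * r1 + 587 * g1 + 114 * b1) - (299 * r2 + 587 * g2 + 114 * b2)) / 1000

# Worked example: white text against a mid-grey background color.
print(color_difference((255, 255, 255), (128, 128, 128)))       # 381
print(brightness_difference((255, 255, 255), (128, 128, 128)))  # 127.0
```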

(107) All the candidate colors are sorted in order of decreasing rating. (108) The color having the highest rating is chosen as the color of the overlaying text or item.
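The following end-to-end sketch, assuming Python and its standard colorsys module, illustrates steps 101 to 108. All function names are illustrative; the 1% dominance threshold, the equal weighting of the two difference criteria, and the use of the worst-case rating against the dominant colors are assumptions, since the specification leaves these details open.

```python
import colorsys
from itertools import product

H_BINS, S_BINS, L_BINS = 36, 4, 10     # step 103: 36 x 4 x 10 = 1440 bins
DOMINANCE_THRESHOLD = 0.01             # assumed: bins holding >= 1% of pixels are "dominant"

def quantize(r, g, b):
    """Map an RGB pixel (0-255) to a quantized (H, S, L) bin index (steps 102-103)."""
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    return (min(int(h * H_BINS), H_BINS - 1),
            min(int(s * S_BINS), S_BINS - 1),
            min(int(l * L_BINS), L_BINS - 1))

def bin_to_rgb(bin_idx):
    """Representative RGB color (0-255) at a bin centre, used when rating candidates."""
    hb, sb, lb = bin_idx
    h, s, l = (hb + 0.5) / H_BINS, (sb + 0.5) / S_BINS, (lb + 0.5) / L_BINS
    r, g, b = colorsys.hls_to_rgb(h, l, s)
    return int(r * 255), int(g * 255), int(b * 255)

def rating(c1, c2):
    """Color-difference plus brightness-difference criteria (step 106); equal weights assumed."""
    (r1, g1, b1), (r2, g2, b2) = c1, c2
    color_diff = abs(r1 - r2) + abs(g1 - g2) + abs(b1 - b2)
    bright_diff = abs((299 * r1 + 587 * g1 + 114 * b1) - (299 * r2 + 587 * g2 + 114 * b2)) / 1000
    return color_diff + bright_diff

def select_font_color(region_pixels):
    """region_pixels: iterable of (R, G, B) tuples from the area the text will cover."""
    # Steps 102-103: histogram of the covered area in the quantized HSL space.
    pixels = list(region_pixels)
    histogram = {}
    for px in pixels:
        bin_idx = quantize(*px)
        histogram[bin_idx] = histogram.get(bin_idx, 0) + 1

    # Steps 104-105: dominant bins become non-candidates; every other bin is a candidate.
    non_candidates = {b for b, n in histogram.items()
                      if n / len(pixels) >= DOMINANCE_THRESHOLD}
    all_bins = product(range(H_BINS), range(S_BINS), range(L_BINS))
    candidates = [b for b in all_bins if b not in non_candidates]

    # Steps 106-108: rate each candidate against the dominant (non-candidate) colors and
    # pick the best; the worst case against the dominant set is used as the score here.
    def score(candidate_bin):
        c_rgb = bin_to_rgb(candidate_bin)
        return min(rating(c_rgb, bin_to_rgb(nc)) for nc in non_candidates)

    return bin_to_rgb(max(candidates, key=score))
```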
Figure 2 shows an input image and the text string to be overlaid on the image at different locations selected by the user. The abovementioned steps are applied to the selected image, and the user decides the location within the image where the text string is to be displayed. The sequence of steps provides for selecting a suitable contrasting font color.
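For illustration only, the following sketch uses the Pillow imaging library to show how a chosen color might be applied when annotating a stored photo: the area the caption will cover is taken as the text bounding box, and the caption is drawn in the selected color. The file name, caption and anchor point are hypothetical, and the simple black-or-white chooser is a stand-in for the full candidate-rating procedure sketched above.

```python
from PIL import Image, ImageDraw

def pick_simple_color(pixels):
    # Simplified stand-in chooser: black or white against the mean region brightness.
    avg = sum(299 * r + 587 * g + 114 * b for r, g, b in pixels) / (1000 * len(pixels))
    return (0, 0, 0) if avg > 127 else (255, 255, 255)

img = Image.open("photo.jpg").convert("RGB")     # hypothetical input image
draw = ImageDraw.Draw(img)
caption, anchor = "Holiday 2006", (40, 300)      # user-chosen text and location

# Area of the image the text will cover (step 102), then the font color for it.
left, top, right, bottom = draw.textbbox(anchor, caption)
region_pixels = list(img.crop((left, top, right, bottom)).getdata())
color = pick_simple_color(region_pixels)

draw.text(anchor, caption, fill=color)           # overlay at the user-selected location
img.save("photo_annotated.jpg")
```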

We claim:
1) A method for automating font color selection and placing a human understandable object, such as a caption, text or a figurative element, in an image, comprising the steps of:
• selecting the area in which the human understandable object is to be inserted,
• converting the RGB values into HSL values and forming a histogram of the pixels in the area in which the said human understandable object is to be inserted,
• identifying the dominating colors of the selected area from the said histogram,
• removing the said dominating colors from the set of candidate colors,
• giving each remaining candidate color a rating depending on the weighted color and brightness difference between the said candidate and the non-candidate colors,
• selecting the color with the highest rating as the font color,
• rendering the digital image with the human understandable object overlaid substantially at the location selected by the user.
2) The method as claimed in claim 1, wherein the said human understandable object is of any given length on a given image at a user specified location.
3) The method as claimed in claim 1, wherein the chosen color for the said human understandable object maximally contrasts the colors occurring in the area of the image covered by the text.

4) The method as claimed in claim 1, wherein a color for the said human understandable object is selected according to the legibility criterion for a background image.
5) The method as claimed in claim 1, wherein the said method is used for television sets, camcorders, digital cameras and other devices with a display.
6) The method as claimed in claim 1, wherein the said method can be applied on any type of given image.
7) The method as claimed in claim 1, wherein the said method allows the user to select the image, the text, and the location where the text is to be inserted.

Documents

Orders

Section | Controller | Decision Date
Section 15 | B P SINGH | 2019-05-31

Application Documents

# Name Date
1 2416-DEL-2006-Correspondence-120719.pdf 2019-07-22
2 2416-del-2006-Form-18-(24-11-2008).pdf 2008-11-24
3 2416-del-2006-Correspondence Others-(24-11-2008).pdf 2008-11-24
4 2416-DEL-2006-OTHERS-120719.pdf 2019-07-22
5 2416-del-2006-gpa.pdf 2011-08-21
6 2416-DEL-2006-8(i)-Substitution-Change Of Applicant - Form 6 [27-06-2019(online)].pdf 2019-06-27
7 2416-DEL-2006-Form-5.pdf 2011-08-21
8 2416-DEL-2006-ASSIGNMENT DOCUMENTS [27-06-2019(online)].pdf 2019-06-27
9 2416-DEL-2006-PA [27-06-2019(online)].pdf 2019-06-27
10 2416-DEL-2006-Form-3.pdf 2011-08-21
11 2416-DEL-2006-Form-1.pdf 2011-08-21
12 2416-DEL-2006-Changing Name-Nationality-Address For Service [05-03-2018(online)].pdf 2018-03-05
13 2416-DEL-2006-RELEVANT DOCUMENTS [05-03-2018(online)].pdf 2018-03-05
14 2416-del-2006-correspondence-other.pdf 2011-08-21
15 Written submissions and relevant documents [27-06-2017(online)].pdf 2017-06-27
16 2416-del-2006- form-2.pdf 2011-08-21
17 2416-del-2006- drawings.pdf 2011-08-21
18 2416-DEL-2006_EXAMREPORT.pdf 2016-06-30
19 2416-del-2006- description (complete).pdf 2011-08-21
20 Abstract [07-11-2015(online)].pdf 2015-11-07
21 2416-del-2006- claims.pdf 2011-08-21
22 Claims [07-11-2015(online)].pdf 2015-11-07
23 2416-del-2006- abstract.pdf 2011-08-21
24 Description(Complete) [07-11-2015(online)].pdf 2015-11-07
25 Description(Complete) [07-11-2015(online)].pdf_20.pdf 2015-11-07
26 Examination Report Reply Recieved [07-11-2015(online)].pdf 2015-11-07
27 GPOA.pdf 2014-05-20
28 Form 13 [07-11-2015(online)].pdf 2015-11-07
29 Form 13_Address for service.pdf 2014-05-20
30 Amended Form 1.pdf 2014-05-20
31 OTHERS [07-11-2015(online)].pdf 2015-11-07