
Image Processing

Abstract: Systems and methods for image processing are described. In one embodiment, a method for image processing comprises obtaining an image having specular reflection. The image comprises a plurality of pixels, each of the plurality of pixels including a luminance component and a chrominance component. Further, the method comprises identifying specular pixels amongst the plurality of pixels in the image. Each of the identified specular pixels is enhanced based on at least one image enhancement technique, wherein the at least one image enhancement technique is applied on the luminance component of the specular pixels to generate at least one enhanced image.


Patent Information

Application #
Filing Date
18 August 2012
Publication Number
10/2014
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application
Patent Number
Legal Status
Grant Date
2021-04-29
Renewal Date

Applicants

TATA CONSULTANCY SERVICES LIMITED
Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021

Inventors

1. CHATTOPADHYAY, Tanushyam
Plot A2, M2 & N2, Sector V, Block GP, Salt Lake Electronics Complex, Kolkata 700091, West Bengal
2. VISVANATHAN, Aishwarya
Tata Consultancy Services Ltd, Abhilesh Building, Plot #96, EPIP Industrial Area, Whitefield Road, Whitefield, Bangalore - 560017
3. BHATTACHARYA, Ujjwal
CVPR Unit, Indian Statistical Institute, 203 B.T. Road, Kolkata 700 108

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
1. Title of the invention: IMAGE PROCESSING
2. Applicant(s)
NAME: TATA CONSULTANCY SERVICES LIMITED
NATIONALITY: Indian
ADDRESS: Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.

TECHNICAL FIELD
[0001] The present subject matter, in general, relates to image processing and, in particular, to systems and methods for processing images having specular reflection.
BACKGROUND
[0002] Specular reflection is a mirror-like reflection of light from a surface. The specular reflection typically occurs when substantially unidirectional light from a source, falling on a surface of an object whose image is to be captured, is reflected in substantially the same direction causing the surface to appear shiny. The occurrence of specular reflection depends on the properties of the surface of the captured object and characteristics as well as position of the light source relative to the surface of the captured object. The specular reflection may cause the occurrence of glare patches or bright spots within a captured image. Such bright spots mask the structure of the captured image, resulting in loss of details and thus in a reduction of the visibility or clarity of such a captured image.
[0003] There are many examples where the specular reflection deteriorates the quality of the captured images. The effect of specular reflection may be observed, for instance, on the surface of water reflecting the light coming from the sun. Specular reflection also occurs in the photographing of documents printed on glossy paper, or when photographing objects including glass surfaces, such as faces wearing glasses, windows or mirrors. An unpleasant consequence of the specular reflection is the loss of information occluded by the bright spots. Such loss of information may be annoying for the persons viewing the image. Moreover, Optical Character Recognition (OCR) tools fail to recognize or interpret text contained in the images occluded by the specular reflection.
SUMMARY
[0004] This summary is provided to introduce concepts related to image processing. These concepts are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.

[0005] In one embodiment, a method for image processing comprises obtaining an image having specular reflection. The image comprises a plurality of pixels, each of the plurality of pixels including a luminance component and a chrominance component. Further, the method comprises identifying specular pixels amongst the plurality of pixels in the image. Each of the identified specular pixels is enhanced based on at least one image enhancement technique, wherein the at least one image enhancement technique is applied on the luminance component of the specular pixels to generate at least one enhanced image.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] The detailed description is described with reference to the accompanying figure(s). In the figure(s), the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figure(s) to reference like features and components. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figure(s), in which:
[0007] Fig. 1 illustrates a network environment implementing an image processing system, in accordance with an embodiment of the present subject matter.
[0008] Fig. 2(a) illustrates components of the image processing system, in accordance with an embodiment of the present subject matter.
[0009] Fig. 2(b) illustrates an exemplary image having specular reflection provided as an input to the image processing system.
[0010] Figs. 2(c)-2(d) illustrate enhanced images obtained as an output from the image processing system, in accordance with an embodiment of the present subject matter.
[0011] Fig. 3 illustrates a method for image processing, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0012] Specular reflection deteriorates the quality of captured images, resulting in loss of the information occluded by the bright spots. The specular reflection may cause problems in applications such as document photographing. The text or pictures of a photographed document may be occluded by the specular reflection, causing the photographed document to be incomplete. Further, the specular reflection causes problems when the captured images contain textual information affected by the presence of bright spots. Such textual information is often difficult to interpret. Also, an OCR tool fails to recognize such textual information, thereby resulting in errors in processing the images.
[0013] Conventional systems and methods of processing images with the specular reflection are limited for use with a particular type of image. Such systems and methods typically fail to process other types of images. As an instance, a conventional system designed to process images with only a white background fails to process images having a non-white background. Further, such systems and methods attempt to reduce the specular reflection by adjusting the color and contrast of the object/text within the image and its background, which impacts the quality of the entire image. Thus, such conventional systems and methods achieve a reduction in specular reflection at the cost of compromising the quality of the image. Moreover, such conventional techniques fail to process images containing text. Such techniques, when applied for processing text images in order to enable an OCR tool to read the text in the image, often produce unsatisfactory results. For example, when a credit card image is processed for OCR using conventional techniques, the result is of inferior quality, and the processing/reading of the text occluded by the specular reflection becomes difficult for the OCR tool.
[0014] In accordance with the present subject matter, systems and methods for image processing are described. According to one embodiment, an image, for example, a camera captured image, affected by the specular reflection is received as an input. As known, an image is composed of various pixels. When the image is said to be affected by the specular reflection, it is apparent that either some or all of the pixels of the image are under the influence of the bright spots. Such pixels affected by the specular reflection or bright spots are hereinafter referred to as specular pixels, while the remaining pixels of the image falling under the influence of usual lighting are hereinafter referred to as diffused pixels.
[0015] The systems and methods according to the present subject matter enhance the image having specular reflection, which thereby improves clarity and recognition accuracy of the image. In one implementation, enhancement is carried out on the specular pixels of the image. The specular pixels, when taken collectively, define one or more specular regions in the image. Such specular pixels amongst multiple pixels of the image are identified based on a localization technique. A localization technique, when applied on the input image, segregates the pixels of the image into specular pixels and diffused pixels. The specular pixels thus identified are then picked up for the enhancement.
[0016] The enhancement of the specular pixels is carried out using one or more image enhancement techniques. The image enhancement techniques may include a YUV technique, a color clustering technique, a diffused pixels adjacency technique, a histogram equalization technique, a gamma correction technique, and a laplacian technique.
[0017] Each pixel of the image, irrespective of whether the pixel is a specular pixel or a diffused pixel, comprises a luminance component indicative of the intensity of the light emitted from the pixel, and a chrominance component indicative of chromaticity of the pixel. According to an embodiment of the present subject matter, the image enhancement technique is applied on the luminance component of the specular pixels. As a result of the application, the specular pixels of the image are enhanced resulting in an enhanced image, which is substantially free from the specular reflection. The text contained in the enhanced image, if any, can be easily recognized by an OCR tool.
[0018] There could be several ways in which the image enhancement techniques can be applied on the specular pixels of the image. A few of such implementations are described below. It is to be understood that such implementations should not be construed as a limitation; various other implementations of the application of the image enhancement techniques are possible without deviating from the scope of the present subject matter.
[0019] According to one implementation, one of the image enhancement techniques is applied for enhancement of the image to be processed, which is hereinafter referred to as an input image. According to another implementation, a combination of image enhancement techniques is applied on the input image. According to yet another implementation, multiple copies of the input image are produced. Subsequently, one image enhancement technique or combinations of two or more image enhancement techniques are applied to each copy of the image separately, resulting in generation of multiple enhanced images. The manner in which image enhancement technique is to be applied can be preconfigured by the user or can be manually specified by the user at the time of application.

[0020] The enhanced image(s) thus generated can be binarized for further enhancement of the specular pixels. The binarization can be performed using a conventionally known binarization technique. Binarization further improves the quality of the enhanced image(s). An image obtained as a result of binarization is hereinafter referred to as a binarized enhanced image. In one implementation, the choice of binarizing an enhanced image is left to the user, while in another implementation, the manner in which binarization is to be applied can be preconfigured by the user.
[0021] Unlike conventional systems that are limited to processing a particular type of image, for example, images having a white background, the systems and methods according to the present subject matter are capable of processing any type of image irrespective of its color and background. Thus, even images having a non-white background can be efficiently processed by the present systems and methods. Further, the enhancement process according to the present subject matter is carried out pixel-wise, wherein the enhancement takes place only on the specular regions of the image without affecting the quality of the remaining regions of the image. Thus, the resulting enhanced image provides better quality. Also, text within the enhanced image, if any, can be easily interpreted and recognized by the OCR tool.
[0022] The above methods and systems are further described in conjunction with the following figures. It should be noted that the description and figures merely illustrate the principles of the present subject matter. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the present subject matter and are included within its spirit and scope. Furthermore, all examples recited herein are principally intended expressly to be only for pedagogical purposes to aid the reader in understanding the principles of the present subject matter and the concepts contributed by the inventor(s) to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0023] The manner in which the image processing takes place shall be explained in detail with respect to Figs. 1-3. While aspects of the systems and methods can be implemented in any number of different computing system environments and/or configurations, the embodiments are described in the context of the following exemplary system architecture(s).
[0024] Fig. 1 illustrates a network environment 100 implementing an image processing system 102, in accordance with an embodiment of the present subject matter. In one implementation, the network environment 100 can be a company network, including thousands of office personal computers, laptops, various servers, such as blade servers, and other computing devices connected over a network 106.
[0025] The image processing system 102 is connected to a plurality of user devices 104-1, 104-2, 104-3,...104-N, collectively referred to as the user devices 104 and individually referred to as a user device 104, through the network 106. A plurality of users may use the user devices 104 to access the image processing system 102 for processing the images. The image processing system 102 may be implemented in a variety of computing systems, such as servers, a desktop personal computer, a notebook or portable computer, a workstation, a mainframe computer, and a laptop. The user devices 104 may include, without limitation, desktop computers, cellular phones, laptops or other portable computing devices, and network computing devices. In one implementation, the image processing system 102 can be integrated within the user devices 104, and the plurality of users may process the images using the image processing system/functionality provided within their respective user devices 104.
[0026] The network 106 may be a wireless network, wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and such. The network 106 may either be a dedicated network or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), to communicate with each other. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices.
[0027] According to one embodiment, the image processing system 102 receives an image, such as a camera captured image, having specular reflection, from the user devices 104. In one implementation, the image processing system 102 may receive the image from an external memory/repository associated with the image processing system 102 via the network 106, or from an internal memory of the image processing system 102. Upon receiving the image, the image processing system 102 localizes pixels of the image into specular pixels and diffused pixels. The image processing system 102, according to said embodiment, includes an image enhancement module 108 configured to process the specular pixels of the image so as to reduce the specular reflection and improve the interpretation and recognition accuracy of the image. For images containing text, the enhanced image produced by the image enhancement module 108 can be fed to an OCR system 110 for recognizing the text contained in the enhanced image. The text, for example, may be comprised of alpha-numeric characters. For example, an image of a credit card containing text can be processed by the image processing system 102, resulting in an enhanced image that can be fed to the OCR system 110 to read the text contained therein.
[0028] According to the embodiment depicted in the fig. 1, the OCR system 110 is connected to the image processing system 102 through the network 106. In another embodiment, the OCR system 110 can be integrated within the image processing system 102.
[0029] Fig. 2(a) illustrates components of the image processing system 102, according to an embodiment of the present subject matter.
[0030] In said embodiment, the image processing system 102 includes one or more processor(s) 202, a memory 204 coupled to the processor(s) 202, and interface(s) 206. The processor(s) 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 202 is configured to fetch and execute computer-readable instructions and data stored in the memory 204. The interface(s) 206 may include a variety of software and hardware interfaces, for example, a web interface and a graphical user interface, allowing the image processing system 102 to interact with the user devices 104. Further, the interface(s) 206 may enable the image processing system 102 to communicate with other computing devices, such as web servers and external data servers (not shown in figure). The interface(s) 206 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN and cable, and wireless networks, such as WLAN, cellular, or satellite. The interface(s) 206 may include one or more ports for connecting the image processing system 102 to a number of other devices or to another server.
[0031] The memory 204 may include any computer-readable medium known in the art including, for example, volatile memory such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 204 also includes module(s) 208 and data 210.
[0032] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. The modules 208 further include a localization module 212, the image enhancement module 108, a binarization module 214, and other modules 216. The other modules 216 may include programs or coded instructions that supplement applications and functions on the system 102. In one implementation, the image enhancement module 108 may further include various sub-modules, namely, a YUV module 218, a diffused pixel adjacency module 220, a color clustering module 222, a laplacian module 224, a histogram equalization module 226, and a gamma correction module 228.
[0033] The data 210, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the module(s) 208. The data 210 includes input image 230, enhanced image(s) 232, binarized enhanced image(s) 234, and other data 236. The other data 236 may include data generated as a result of the execution of one or more modules in the other modules 216.
[0034] According to the present subject matter, the image processing system 102 receives an image affected by the specular reflection as an input. Such an image is hereinafter referred to as input image 230. The input image 230 is composed of a plurality of pixels. The localization module 212 of the image processing system 102 receives the input image 230 and subsequently segregates the pixels of the image into specular pixels and diffused pixels. The localization module 212 can be configured to carry out the localization using any conventionally known localization technique.
[0035] In one implementation, the localization module 212 is configured to carry out the localization based on a conventionally known specular-to-diffuse localization technique. According to this technique, a specular component and a diffuse component are calculated based on the chromaticity and intensity of the pixels. A predefined threshold value is then applied to the diffuse component to coarsely segment the specular regions of the input image 230. A pixel is labeled as a specular pixel if the diffuse component of the pixel is greater than the threshold value, while, if the diffuse component of the pixel is less than the threshold value, the pixel is labeled as a diffused pixel. Thus, based on the localization, specular pixels and diffused pixels are identified.
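By way of illustration only, a minimal Python/NumPy sketch of such threshold-based localization is given below. The function name localize_specular_pixels, the use of the per-pixel channel minimum and residual as stand-ins for the diffuse and specular components, and the default threshold value are assumptions made for this example and are not prescribed by the present description.

    import numpy as np

    def localize_specular_pixels(rgb, threshold=210.0):
        """Label pixels whose diffuse component exceeds a preset threshold as specular.

        rgb: H x W x 3 array of R, G, B values in the range [0, 255].
        Returns (specular_mask, diffuse, specular); diffused pixels are ~specular_mask.
        """
        rgb = rgb.astype(np.float32)
        diffuse = rgb.min(axis=2)             # crude per-pixel stand-in for the diffuse component
        specular = rgb.max(axis=2) - diffuse  # residual treated as the specular component
        specular_mask = diffuse > threshold   # labeled specular when the diffuse component is large
        return specular_mask, diffuse, specular

The returned mask and the two per-pixel components are reused by the enhancement sketches that follow.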
[0036] The specular pixels are then picked up for the enhancement. The image enhancement module 108 enhances the specular pixels of the input image 230 based on one or more image enhancement techniques. The image enhancement techniques referred to herein include a YUV technique, a color clustering technique, a diffused pixels adjacency technique, a histogram equalization technique, a gamma correction technique, and a laplacian technique. In one implementation, the image enhancement module 108 comprises various sub-modules, namely, the YUV module 218 configured to apply the YUV technique, the diffused pixels adjacency module 220 configured to apply the diffused pixels adjacency technique, the color clustering module 222 configured to apply the color clustering technique, the laplacian module 224 configured to apply the laplacian technique, the histogram equalization module 226 configured to apply the histogram equalization technique, and the gamma correction module 228 configured to apply the gamma correction technique.
[0037] The image enhancement module 108 may be configured to invoke one or more of such sub-modules according to a predefined pattern to enhance the input image 230. Such a predefined pattern may be configured by a user. In one example, the predefined pattern may include a default pattern that is applicable to all the images. Further, the predefined pattern may be a user-defined pattern that is created by the user and applied to the images. Such a user-defined pattern can be saved and re-applied to the input images. The user may select the desired predefined pattern to be applied. In case no pattern is selected by the user, the image enhancement module 108 is configured to apply the default pattern for enhancement. In one implementation, patterns can be created, modified and/or deleted by the users.
[0038] As an example, the pattern may be applying the YUV technique to the input image 230. In said example, the image enhancement module 108 invokes the YUV module 218 that applies the YUV technique on the specular pixels to enhance the input image 230 at the specular region(s). In another example, the pattern may be applying a laplacian technique and then the diffused pixel adjacency technique. According to said example, the image enhancement module 108 invokes the laplacian module 224 that applies the laplacian technique on the specular pixels of the input image 230, and then invokes the diffused pixel adjacency module 220 to apply the diffused pixel adjacency technique on the input image 230.
[0039] In one implementation, the image enhancement module 108 is configured to create multiple copies of the input image 230 and apply the image enhancement techniques on each copy of the input image 230 according to the predefined pattern. The number of copies to be created can be preconfigured into the image processing system 102 or can be manually provided by the user at the time of carrying out the enhancement.
[0040] In one example, the image enhancement module 108 is configured to create six copies of the input image 230. In said example, the image enhancement module 108 is configured according to predefined pattern to invoke each of the six sub-modules, i.e., the YUV module 218, the diffused pixels adjacency module 220, the color clustering module 222, the laplacian module 224, the histogram equalization module 226, and the gamma correction module 228, either in parallel or sequential order to separately apply the corresponding image enhancement technique on each copy of the input image 230. As a result, six enhanced images 232 are obtained. A user may then select a best enhanced image 232 out of the six enhanced images 232.
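By way of illustration only, the following Python sketch shows one way such a predefined pattern could drive the enhancement of separate copies of the input image; the function enhance_with_pattern, the dictionary of technique callables and the pattern names are illustrative assumptions and not part of the described embodiments.

    import numpy as np

    def enhance_with_pattern(input_image, specular_mask, techniques, pattern):
        """Apply enhancement techniques to separate copies of the input image.

        techniques: dict mapping a technique name to a callable f(image, mask) -> image.
        pattern:    list of technique-name sequences; each sequence is applied, in order,
                    to its own copy of the input image, yielding one enhanced image per copy.
        """
        enhanced_images = []
        for sequence in pattern:
            copy = np.array(input_image, copy=True)   # fresh copy per sequence
            for name in sequence:
                copy = techniques[name](copy, specular_mask)
            enhanced_images.append(copy)
        return enhanced_images

    # Example pattern for the six-copy case described above (names are illustrative):
    # pattern = [["yuv"], ["diffused_adjacency"], ["color_clustering"],
    #            ["laplacian"], ["histogram_equalization"], ["gamma_correction"]]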
[0041] The manner in which the aforementioned image enhancement techniques are applied is described in detail in the following sub-sections, as YUV technique, diffused pixels adjacency technique, color clustering technique, laplacian technique, histogram equalization technique, and gamma correction technique.
YUV Technique
[0042] The image enhancement module 108 may invoke the YUV module 218 to apply the YUV technique. According to the YUV technique, a YUV value of each specular pixel is computed from its R, G and B values, as given by the equations (1), (2) and (3):
Y = CLIP((0.257 × R) + (0.504 × G) + (0.098 × B) + 16) ...(1)
U = CLIP(-(0.148 × R) - (0.291 × G) + (0.439 × B) + 128) ...(2)
V = CLIP((0.439 × R) - (0.368 × G) - (0.071 × B) + 128) ...(3)
Where Y = luminance component of a pixel; and
U and V = chrominance components of a pixel
[0043] Once the YUV values are computed, the YUV module 218 replaces the Y value of each of the specular pixels by an average value of the diffused component and the specular component, as provided by the equation (4). As described previously, the values of the diffused component and the specular component are obtained during the localization.
Y = (diffused component + specular component) / 2 ...(4)
[0044] The YUV module 218 then recalculates the RGB values of the specular pixels based on the newly obtained Y value, as shown in the equations (5), (6) and (7).
R = CLIP((1.164 × (Y - 16)) + (1.596 × (V - 128))) ...(5)
G = CLIP((1.164 × (Y - 16)) - (0.813 × (V - 128)) - (0.391 × (U - 128))) ...(6)
B = CLIP((1.164 × (Y - 16)) + (2.018 × (U - 128))) ...(7)
[0045] By calculating the new Y value that represents the luminance component of the pixel, and substituting the newly computed Y value into the specular pixels, an enhanced image 232 is obtained. As described above, the enhancement is carried out at the specular pixels of the input image 230; thus, the quality of the entire input image 230 is not affected.
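By way of illustration only, a Python/NumPy sketch of the YUV technique as reconstructed above is given below. The function name yuv_enhance is an assumption, and the diffuse and specular arguments are assumed to be the per-pixel components obtained during the localization described earlier.

    import numpy as np

    def _clip(x):
        return np.clip(x, 0.0, 255.0)

    def yuv_enhance(rgb, specular_mask, diffuse, specular):
        """Enhance specular pixels per equations (1)-(7): convert to YUV, replace the Y
        value of the specular pixels by the average of the diffuse and specular
        components (equation (4)), then convert back to RGB."""
        rgb = rgb.astype(np.float32)
        R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        Y = _clip(0.257 * R + 0.504 * G + 0.098 * B + 16)         # equation (1)
        U = _clip(-0.148 * R - 0.291 * G + 0.439 * B + 128)       # equation (2)
        V = _clip(0.439 * R - 0.368 * G - 0.071 * B + 128)        # equation (3)
        Y = np.where(specular_mask, _clip((diffuse + specular) / 2.0), Y)     # equation (4)
        R2 = _clip(1.164 * (Y - 16) + 1.596 * (V - 128))                      # equation (5)
        G2 = _clip(1.164 * (Y - 16) - 0.813 * (V - 128) - 0.391 * (U - 128))  # equation (6)
        B2 = _clip(1.164 * (Y - 16) + 2.018 * (U - 128))                      # equation (7)
        return np.stack([R2, G2, B2], axis=-1).astype(np.uint8)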
Diffused Pixel Adjacency Technique
[0046] The image enhancement module 108 may trigger the diffused pixel adjacency module 220 to apply the diffused pixel adjacency technique. According to the diffused pixel adjacency technique, the diffused pixel adjacency module 220 uses diffused pixels that are in proximity to the specular pixels to enhance the specular pixels. In one implementation, eight neighboring diffused pixels are used to enhance the specular pixels, as depicted below.
X X X
X 0 X
X X X
[0047] As shown above, 0 refers to the specular pixel under consideration and X refers to the neighboring diffused pixels. The diffused pixel adjacency module 220 obtains the Y component value of the eight adjacent diffused pixels, and subsequently computes a mode value of obtained Y values. The diffused pixel adjacency module 220 then replaces the Y component value of the specular pixels by the mode value, to obtain the enhanced image 232 that is substantially free from specular reflection.
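By way of illustration only, a Python/NumPy sketch of the diffused pixel adjacency technique is given below; the function name and the decision to skip neighbours that are themselves specular are assumptions made for this example.

    import numpy as np

    def diffused_adjacency_enhance(y_channel, specular_mask):
        """Replace the Y value of each specular pixel by the mode of the Y values of its
        eight neighbouring pixels, using only neighbours that are diffused pixels."""
        y_out = y_channel.copy()
        h, w = y_channel.shape
        for r, c in zip(*np.nonzero(specular_mask)):
            neighbours = []
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (dr or dc) and 0 <= rr < h and 0 <= cc < w and not specular_mask[rr, cc]:
                        neighbours.append(y_channel[rr, cc])
            if neighbours:
                values, counts = np.unique(neighbours, return_counts=True)
                y_out[r, c] = values[np.argmax(counts)]   # mode of the adjacent diffused Y values
        return y_out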
Color Clustering Technique
[0048] The image enhancement module 108 may invoke the color clustering module 222 to apply the color clustering technique. According to the color clustering technique, a set of observations is partitioned into 'K' clusters in which each observation in the set of observations belongs to the cluster with the nearest mean. The basic principle of the color clustering technique is to define 'K' centroids, one for each of the clusters.
[0049] The color clustering module 222 first converts the pixels in the input image 230 into the YUV format. After the conversion, with the value K=2, the color clustering module 222 obtains two centroids, i.e., c1(y1, u1, v1) and c2(y2, u2, v2), with respect to the U and V components for each row, taking the diffused pixels into consideration. The distances d1 and d2 of each specular pixel's U and V values from the two centroids of the corresponding row are computed. If d1 < d2, the color clustering module 222 replaces the Y component of the specular pixel by y1; otherwise, if d1 > d2, it replaces the Y component of the specular pixel by y2. In other words, the cluster centroid at the least distance from the specular pixel is selected for replacement of the Y component of the specular pixel. As a result, an enhanced image 232 is produced.
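By way of illustration only, a Python/NumPy sketch of the row-wise K=2 color clustering described above is given below; the centroid initialisation, the fixed number of iterations and the function name are assumptions made for this example.

    import numpy as np

    def color_cluster_enhance(yuv, specular_mask, iters=10):
        """Row-wise K-means (K=2) on the (U, V) values of diffused pixels; the Y value of
        each specular pixel is replaced by the Y centroid of its nearer (U, V) cluster."""
        out = yuv.astype(np.float32).copy()
        for r in range(yuv.shape[0]):
            diffused = ~specular_mask[r]
            if diffused.sum() < 2 or not specular_mask[r].any():
                continue
            uv = out[r, diffused, 1:3]
            y_vals = out[r, diffused, 0]
            centers = uv[:2].copy()                       # simple initialisation
            for _ in range(iters):                        # standard K-means updates
                d = np.linalg.norm(uv[:, None, :] - centers[None, :, :], axis=2)
                labels = d.argmin(axis=1)
                for k in range(2):
                    if (labels == k).any():
                        centers[k] = uv[labels == k].mean(axis=0)
            y_centroids = np.array([y_vals[labels == k].mean() if (labels == k).any()
                                    else y_vals.mean() for k in range(2)])
            spec_cols = np.nonzero(specular_mask[r])[0]
            d_spec = np.linalg.norm(out[r, spec_cols, 1:3][:, None, :] - centers[None, :, :], axis=2)
            out[r, spec_cols, 0] = y_centroids[d_spec.argmin(axis=1)]
        return out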
Laplacian Technique
[0050] The image enhancement module 108 may invoke the laplacian module 224 to apply the laplacian technique. The laplacian is a two-dimensional measure of the second spatial derivative of an image. The laplacian of an image highlights regions of rapid intensity change. The laplacian L(x, y) of an image with pixel intensity value I(x, y) is given by the equation (8):
L(x, y) = ∂²I/∂x² + ∂²I/∂y² ...(8)
[0051] The laplacian L(x, y) can be calculated using a convolution filter. Since the input image is represented as a set of discrete pixels, a discrete convolution kernel that can approximate the second derivatives is determined. In one implementation, the following 3×3 kernels may be used to approximate the second derivatives.

0 -1 0
-1 4 -1
0 -1 0

-1 -1 -1
-1 8 -1
-1 -1 -1

[0052] According to the laplacian technique, one of the kernels is used to calculate the laplacian using standard convolution methods. The Y component of each specular pixel, along with that of its eight neighboring pixels, is multiplied with the 3×3 kernel to eliminate the specular effect. As a result, an enhanced image 232 is obtained.
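By way of illustration only, a Python/NumPy sketch of applying the first 3×3 kernel at the specular pixels is given below; replacing the Y value directly with the clipped kernel response, rather than adding the response back to the pixel, is an assumption made for this example.

    import numpy as np

    LAPLACIAN_KERNEL = np.array([[ 0, -1,  0],
                                 [-1,  4, -1],
                                 [ 0, -1,  0]], dtype=np.float32)

    def laplacian_enhance(y_channel, specular_mask, kernel=LAPLACIAN_KERNEL):
        """Apply the 3x3 laplacian kernel to the Y values of specular pixels only."""
        y = y_channel.astype(np.float32)
        padded = np.pad(y, 1, mode="edge")
        out = y.copy()
        for r, c in zip(*np.nonzero(specular_mask)):
            window = padded[r:r + 3, c:c + 3]             # the pixel and its 8 neighbours
            out[r, c] = np.clip((window * kernel).sum(), 0, 255)
        return out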
Histogram Equalization
[0053] The image enhancement module 108 may trigger the histogram equalization module 226 to apply the histogram equalization technique on the input image 230. The histogram equalization technique involves adjusting the contrast of the input image 230 using a histogram. When histogram equalization is applied to an image, regions of the image having lower local contrast gain a higher contrast. Histogram equalization accomplishes this by effectively spreading out the most frequent intensity values of the pixels in the image.
[0054] In one implementation, the histogram equalization module 226 computes a normalized histogram for the luminance component (Y) of the specular pixels according to the equation (9):
MUhist(i) = n(i) / count ...(9)
Where n(i) = the number of pixels with luminance value i in the image, wherein the value of i satisfies the criteria 0 <= i <= 255;
MUhist(i) = the normalized histogram for the Y value i; and
count = the total number of pixels in the image.
[0055] Based on the histogram equalization, a histogram equalized image (g) is obtained, which is defined by the equation (10).
g(i) = floor(255 × ∑(j = 0 to i) MUhist(j)) ...(10)
[0056] Such a histogram equalized image is an enhanced image 232 that is substantially free from specular reflection.
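By way of illustration only, a Python/NumPy sketch of equations (9) and (10), as reconstructed above, applied to the Y values of the specular pixels is given below; the function name and the choice to build the histogram over the specular pixels alone are assumptions made for this example.

    import numpy as np

    def histogram_equalize_specular(y_channel, specular_mask):
        """Equalize the Y values of specular pixels: normalized histogram (equation (9))
        followed by the cumulative mapping of equation (10)."""
        y = y_channel.astype(np.int32)
        spec_y = y[specular_mask]
        count = spec_y.size
        if count == 0:
            return y_channel.copy()
        hist = np.bincount(spec_y, minlength=256).astype(np.float64)
        mu_hist = hist / count                            # equation (9)
        mapping = np.floor(255.0 * np.cumsum(mu_hist))    # equation (10): g(i)
        out = y_channel.copy()
        out[specular_mask] = mapping[spec_y].astype(y_channel.dtype)
        return out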
Gamma Correction Technique
[0057] The image enhancement module 108 may invoke the gamma correction module 228 to apply the gamma correction technique. Gamma correction is a non-linear operation used to enhance images. Gamma correction, in the simplest cases, is defined by the following equation (11):
Vout = A × Vin^γ ...(11)
Where A is a constant, γ is the gamma value, and Vin and Vout are the input and output values that are non-negative real values.
[0058] The gamma correction module 228 performs the gamma correction using the above equation, wherein a gamma value of 0.8 is used. As a result of the gamma correction, an enhanced image 232 is obtained.
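By way of illustration only, a Python/NumPy sketch of equation (11) applied to the specular pixels with the gamma value of 0.8 is given below; normalising the Y values to [0, 1] and taking A = 1 are assumptions made for this example.

    import numpy as np

    def gamma_correct_specular(y_channel, specular_mask, gamma=0.8, A=1.0):
        """Apply Vout = A * Vin^gamma (equation (11)) to the Y values of specular pixels."""
        y = y_channel.astype(np.float32) / 255.0          # non-negative input values in [0, 1]
        corrected = A * np.power(y, gamma)
        out = y_channel.astype(np.float32).copy()
        out[specular_mask] = np.clip(corrected[specular_mask] * 255.0, 0, 255)
        return out.astype(y_channel.dtype)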
[0059] As mentioned in the foregoing description, the image enhancement module 108 may invoke its corresponding sub-modules according to a predefined pattern to perform the enhancement. As a result of the enhancement, one or more enhanced images 232 are produced. The predefined pattern as referred to herein can be stored within the other data 236.
[0060] In one implementation, the enhanced image(s) 232 thus generated can be binarized by the binarization module 214 for further enhancement of the specular pixels. The binarization can be performed using a conventionally known binarization technique. In one implementation, the binarization is performed using the conventionally known Niblack binarization technique. As a result of the binarization, a binarized enhanced image 234 is obtained. In one example, users may be provided with the flexibility of choosing the enhanced image 232 for binarization. As an instance, the user may choose not to binarize the image if the enhanced image 232 is of the desired quality for its end-use. In another example, the binarization module 214 can be configured to automatically perform the binarization based on predefined binarization rules. As an instance, one such rule may be performing the binarization when the enhanced image 232 is obtained as a result of applying the laplacian technique. Another rule may instruct the binarization module 214 to perform the binarization on every enhanced image 232. The binarization rules described herein may be stored in the other data 236.
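By way of illustration only, an unoptimised Python/NumPy sketch of the conventional Niblack local-threshold binarization is given below; the window size and the weight k are typical default values and are not specified by the present description.

    import numpy as np

    def niblack_binarize(y_channel, window=25, k=-0.2):
        """Binarize a grayscale (Y) image with Niblack's local threshold
        T = local_mean + k * local_std, computed over a square window."""
        y = y_channel.astype(np.float64)
        pad = window // 2
        padded = np.pad(y, pad, mode="edge")
        h, w = y.shape
        out = np.zeros((h, w), dtype=np.uint8)
        for r in range(h):
            for c in range(w):
                patch = padded[r:r + window, c:c + window]
                threshold = patch.mean() + k * patch.std()
                out[r, c] = 255 if y[r, c] > threshold else 0
        return out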
[0061] In one example, when multiple enhanced images 232 generated by the image enhancement module 108 are binarized, multiple binarized enhanced images 234 are produced by the binarization module 214. The users may then select a best binarized enhanced image 234 therefrom.
[0062] The enhanced image 232 or binarized enhanced image 234 obtained from the image processing system 102 is substantially free from specular reflection. Any text contained in such images can be easily recognized by the OCR tools. Since the enhancement is carried out at the affected regions of the input image 230, quality of the input image 230 at remaining regions is maintained.
[0063] Fig. 2(b) illustrates an exemplary input image to the image processing system 102, and Figs. 2(c)-2(d) illustrate enhanced images obtained as an output from the image processing system 102, in accordance with an embodiment of the present subject matter. As shown in Fig. 2(b), the input image 230 is a camera captured text image affected by the specular reflection. This input image 230 is processed by the image processing system 102 to generate the enhanced image 232. The enhanced image 232 shown in Fig. 2(c) is generated by applying the laplacian image enhancement technique. Such an enhanced image 232 is then processed further, for example, binarized using the Niblack binarization technique, to generate the binarized enhanced image 234 depicted in Fig. 2(d). It is clear from Figs. 2(c) and 2(d) that the enhanced images 232 and 234 are substantially free from specular reflection. Such images can be easily recognized by an OCR tool.

[0064] Fig. 3 illustrates a method 300 for image processing, in accordance with an embodiment of the present subject matter. The method may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0065] The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or an alternative method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0066] At block 302, an image having specular reflection is received. The image referred herein may be a camera captured image or a scanned image affected by the specular reflection. The image may contain captured objects and/or text. In one implementation, the localization module 212 of the image processing system 102 receives the image affected by the specular reflection.
[0067] At block 304, the image is localized into specular pixels and diffused pixels. In one implementation, localization is performed using a conventionally known localization technique. The localization segregates the pixels in the image into diffused pixels and specular pixels. Based on the localization, specular pixels are identified for further processing. In one implementation, the localization module 212 localizes the pixels in the image into specular pixels and diffused pixels. In one example, the localization module 212 performs the localization based on a conventionally known specular-to-diffuse localization technique.
[0068] At block 306, the specular pixels are enhanced based on at least one image enhancement technique to generate an enhanced image. It is to be understood that each pixel of the image, whether specular or diffused, includes a luminance component and a chrominance component. The luminance component is indicative of the intensity of the light emitted from the pixel, and the chrominance component is indicative of the chromaticity of the pixel. The image enhancement according to the present subject matter is carried out on the luminance component of each specular pixel. In one embodiment, the image enhancement module 108 performs the image enhancement based on at least one image enhancement technique. The image enhancement techniques may include a YUV technique, a color clustering technique, a diffused pixels adjacency technique, a histogram equalization technique, a gamma correction technique, and a laplacian technique. The image enhancement module 108 may include various sub-modules configured to apply such image enhancement techniques on the specular pixels of the image. According to one implementation, the image enhancement module 108 applies the image enhancement techniques according to a predefined pattern.
[0069] At block 308, the at least one enhanced image is binarized to generate at least one binarized enhanced image. The binarization can be performed using a conventionally known binarization technique. The binarization further improves the quality of the enhanced image(s). In one implementation, the binarization module 214 of the image processing system 102 performs the binarization. In one example, the binarization is performed based on predefined binarization rules. The enhanced image obtained as a result of the binarization is referred to as a binarized enhanced image. For input images containing text, such a binarized enhanced image can be provided to an OCR tool for recognizing the text contained therein.
[0070] According to the present subject matter, any type of image, irrespective of its color and background, can be processed for enhancement. Further, an input image can be enhanced at the specular regions without compromising the quality of the entire image. Also, images containing text can be efficiently processed by the systems and methods described herein. The resulting processed images are substantially free from specular reflection and can be provided to an OCR tool for recognizing the text contained therein.
[0071] Although embodiments for image processing have been described in language specific to structural features and/or methods, it is to be understood that the invention is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations for systems and methods for image processing.

I/We claim:
1. A method for image processing, the method comprising:
obtaining an image having specular reflection, wherein the image comprises a plurality of pixels, each of the plurality of pixels including a luminance component and a chrominance component;
identifying specular pixels amongst the plurality of pixels in the image; and
enhancing each of the specular pixels based on at least one image enhancement technique, wherein the at least one image enhancement technique is applied on the luminance component of the specular pixels to generate at least one enhanced image.
2. The method as claimed in claim 1, wherein the image enhancement technique is one of a laplacian technique, a gamma correction technique, a YUV technique, a color clustering technique, a diffused pixel adjacency technique, and a histogram equalization technique.
3. The method as claimed in claim 1 further comprising localizing the plurality of pixels in the image into the specular pixels and diffused pixels based on a localization technique.
4. The method as claimed in claim 1, wherein the enhancing each of the specular pixels is further based on diffused pixels.
5. The method as claimed in claim 1 further comprising binarizing the at least one enhanced image to generate at least one binarized enhanced image.
6. The method as claimed in claim 5 further comprising providing the at least one binarized enhanced image to an Optical Character Recognition (OCR) system.
7. An image processing system (102) comprising:
a processor (202); and
a memory (204) coupled to the processor (202), the memory (204) comprising:
an image enhancement module (108) configured to:
identify specular pixels amongst a plurality of pixels in an image (230), wherein the specular pixels comprises at least a luminance component;

enhance each of the specular pixels based on at least one image enhancement technique, wherein the at least one image enhancement technique is applied on the luminance component of the specular pixels; and
generate at least one enhanced image (232) based on the enhancement.
8. The image processing system (102) as claimed in claim 7 further comprises a localization module (212) configured to localize the pixels in the image (230) into the specular pixels and diffused pixels.
9. The image processing system (102) as claimed in claim 7, wherein the image enhancement technique is one of a laplacian technique, a gamma correction technique, a YUV technique, a color clustering technique, a diffused pixel adjacency technique, and a histogram equalization technique.
10. The image processing system (102) as claimed in claim 7, wherein the image enhancement module (108) is configured to enhance the image (230) based on diffused pixels.
11. The image processing system (102) as claimed in claim 7, wherein the at least one image enhancement technique is applied to the specular pixels of the image (230) according to a predefined pattern.
12. The image processing system (102) as claimed in claim 7 further comprises a binarization module (214) configured to binarize the at least one enhanced image (232) to generate at least one binarized enhanced image (234).
13. The image processing system (102) as claimed in claim 7, wherein the image processing system (102) is configured to interface to an OCR system (110), wherein the at least one enhanced image (232) is provided as an input to the OCR system (110) for recognizing text contained within the at least one enhanced image (232).
14. The image processing system (102) as claimed in claim 12, wherein the image processing system (102) is configured to interface to an OCR system (110), wherein the at least one binarized enhanced image (234) is provided as an input to the OCR system (110) for recognizing text contained within the at least one binarized enhanced image (234).

15. A computer-readable medium having embodied thereon a computer program for executing a method comprising:
obtaining an image having specular reflection, wherein the image comprises a plurality of pixels, each of the plurality of pixels including a luminance component and a chrominance component;
identifying specular pixels amongst the plurality of pixels in the image; and
enhancing each of the specular pixels based on at least one image enhancement technique, wherein the at least one image enhancement technique is applied on the luminance component of the specular pixels to generate at least one enhanced image.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 2415-MUM-2012-FORM 1(10-10-2012).pdf 2012-10-10
1 2415-MUM-2012-RELEVANT DOCUMENTS [26-09-2023(online)].pdf 2023-09-26
2 2415-MUM-2012-CORRESPONDENCE(10-10-2012).pdf 2012-10-10
2 2415-MUM-2012-US(14)-HearingNotice-(HearingDate-13-04-2021).pdf 2021-10-03
3 2415-MUM-2012-IntimationOfGrant29-04-2021.pdf 2021-04-29
3 2415-MUM-2012-FORM 3(18-12-2013).pdf 2013-12-18
4 2415-MUM-2012-PatentCertificate29-04-2021.pdf 2021-04-29
4 2415-MUM-2012-CORRESPONDENCE(18-12-2013).pdf 2013-12-18
5 ABSTRACT.jpg 2018-08-11
5 2415-MUM-2012-Written submissions and relevant documents [27-04-2021(online)].pdf 2021-04-27
6 2415-MUM-2012-POWER OF ATTORNEY(21-9-2012).pdf 2018-08-11
6 2415-MUM-2012-Correspondence to notify the Controller [08-04-2021(online)].pdf 2021-04-08
7 2415-MUM-2012-FORM 18(22-8-2012).pdf 2018-08-11
7 2415-MUM-2012-ABSTRACT [15-03-2019(online)].pdf 2019-03-15
8 2415-MUM-2012-CORRESPONDENCE(22-8-2012).pdf 2018-08-11
8 2415-MUM-2012-CLAIMS [15-03-2019(online)].pdf 2019-03-15
9 2415-MUM-2012-COMPLETE SPECIFICATION [15-03-2019(online)].pdf 2019-03-15
9 2415-MUM-2012-CORRESPONDENCE(21-9-2012).pdf 2018-08-11
10 2415-MUM-2012-CORRESPONDENCE [15-03-2019(online)].pdf 2019-03-15
10 2415-MUM-2012-FORM 2.pdf 2018-08-21
11 2415-MUM-2012-FER.pdf 2018-09-26
11 2415-MUM-2012-FER_SER_REPLY [15-03-2019(online)].pdf 2019-03-15
12 2415-MUM-2012-FORM 3 [14-03-2019(online)].pdf 2019-03-14
12 2415-MUM-2012-OTHERS [15-03-2019(online)].pdf 2019-03-15

Search Strategy

1 2415MUM2012searchstrategy_25-09-2018.pdf

ERegister / Renewals

3rd: 30 Apr 2021

From 18/08/2014 - To 18/08/2015

4th: 30 Apr 2021

From 18/08/2015 - To 18/08/2016

5th: 30 Apr 2021

From 18/08/2016 - To 18/08/2017

6th: 30 Apr 2021

From 18/08/2017 - To 18/08/2018

7th: 30 Apr 2021

From 18/08/2018 - To 18/08/2019

8th: 30 Apr 2021

From 18/08/2019 - To 18/08/2020

9th: 30 Apr 2021

From 18/08/2020 - To 18/08/2021

10th: 30 Apr 2021

From 18/08/2021 - To 18/08/2022

11th: 01 Aug 2022

From 18/08/2022 - To 18/08/2023

12th: 08 Aug 2023

From 18/08/2023 - To 18/08/2024

13th: 13 Aug 2024

From 18/08/2024 - To 18/08/2025

14th: 06 Aug 2025

From 18/08/2025 - To 18/08/2026