
An Imaging System And A Method For Image Quality Enhancement

Abstract: Disclosed herein is a system (104) and a method (300) for receiving (302) a pre-processed image having a plurality of pixel values in Red Green Blue (RGB) color space and a current contrast factor, identifying (304) a set of pixel values, among the plurality of pixel values, pertaining to a common color component in the RGB color space, and modifying (306) remaining pixel values of the plurality of pixel values based on the set of pixel values in such a manner that the remaining pixel values also attain the common color component. The method (300) further comprises determining (308) a new contrast factor based on the set of pixel values and the remaining pixel values after being modified, and applying (310) the new contrast factor upon the pre-processed image having a plurality of pixel values in RGB color space in order to obtain a contrast enhanced image. [FIG. 1]


Patent Information

Application #
Filing Date
28 July 2022
Publication Number
05/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

ZENSAR TECHNOLOGIES LIMITED
Plot#4 Zensar Knowledge Park, MIDC, Kharadi, Off Nagar Road, Pune, Maharashtra – 411014, India

Inventors

1. Sridhar Gadi
Zensar Technologies Ltd., Zensar Knowledge Park, Plot#4, MIDC, Kharadi, Off Nagar Road, Maharashtra – 411014, India
2. Varsha Vishwakarma
Zensar Technologies Ltd., Zensar Knowledge Park, Plot#4, MIDC, Kharadi, Off Nagar Road, Maharashtra – 411014, India
3. Manish Kumar
Zensar Technologies Ltd., Zensar Knowledge Park, Plot#4, MIDC, Kharadi, Off Nagar Road, Maharashtra – 411014, India
4. Pavan Jakati
Zensar Technologies Ltd., Zensar Knowledge Park, Plot#4, MIDC, Kharadi, Off Nagar Road, Maharashtra – 411014, India
5. Ankit Gupta
Zensar Technologies Ltd., Zensar Knowledge Park, Plot#4, MIDC, Kharadi, Off Nagar Road, Maharashtra – 411014, India

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION (See section 10, rule 13)
“AN IMAGING SYSTEM AND A METHOD FOR IMAGE QUALITY ENHANCEMENT”
ZENSAR TECHNOLOGIES LIMITED, of Plot#4 Zensar Knowledge Park, MIDC, Kharadi, Off Nagar Road, Pune, Maharashtra – 411014, India
The following specification particularly describes the invention and the manner in which it is to be performed.

TECHNICAL FIELD
The present invention relates to the field of image processing, and more particularly to an image quality enhancement technique.
BACKGROUND OF INVENTION
Optical character recognition (OCR) is a technique used for text recognition. An OCR system identifies letters on camera images, image-only pdfs, scanned documents, etc. It creates words based on identified letters and then arranges the words into sentences. This technique enables various functionalities for example, copying, editing, etc., of the original content.
However, the accuracy of OCR is often tainted by the poor quality of the input document images. Generally, the performance degradation of the OCR is attributed to the resolution and quality of scanning. It has been observed that, if a high-resolution image is input to the OCR system, the OCR is more accurate and provides better performance in character recognition. A high-resolution image offers a high pixel density and more details about the original scene. However, high-resolution images are not always available. Since the setup for high-resolution imaging proves expensive, it may not always be feasible due to the inherent limitations of sensors, optics manufacturing technology, and environment.
Another option is super-resolving low-resolution document images by applying a super resolution imaging technique prior to passing them to the OCR system or engine. However, this technique may create artifacts and/or unwanted patterns which may not be present in the original document.
Thus, there exists a need for a technique to enhance low-quality document images before passing them to the OCR system or engine in a cost-effective manner.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.

SUMMARY OF THE INVENTION
The present disclosure overcomes one or more shortcomings of the prior art and provides additional advantages discussed throughout the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
In one embodiment of the present disclosure, an imaging method is disclosed. The method comprises receiving a pre-processed image having a plurality of pixel values in Red Green Blue (RGB) color space and a current contrast factor. The method further comprises identifying a set of pixel values, among the plurality of pixel values, pertaining to a common color component in the RGB color space. The method further comprises modifying remaining pixel values of the plurality of pixel values based on the set of pixel values in such a manner that the remaining pixel values also attain the common color component. The method further comprises determining a new contrast factor based on the set of pixel values and the remaining pixel values after being modified. The method further comprises applying the new contrast factor upon the pre-processed image having a plurality of pixel values in RGB color space in order to obtain a contrast enhanced image.
In one embodiment of the present disclosure, an imaging system is disclosed. The imaging system comprises a memory and a processing unit operationally coupled with the memory. The processing unit is configured to receive a pre-processed image having a plurality of pixel values in RGB color space and a current contrast factor. The processing unit is further configured to identify a set of pixel values, among the plurality of pixel values, pertaining to a common color component in the RGB color space. The processing unit is further configured to modify remaining pixel values of the plurality of pixel values based on the set of pixel values in such a manner that the remaining pixel values also attain the common color component. Further, the processing unit is configured to determine a new contrast factor based on the set of pixel values and the remaining pixel values after being modified. Further, the processing unit is configured to apply the new contrast factor upon the pre-processed image having a plurality of pixel values in RGB color space in order to obtain a contrast enhanced image.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF DRAWINGS
The embodiments of the disclosure itself, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings. One or more embodiments are now described, by way of example only, with reference to the accompanying drawings in which:
FIG. 1 shows an exemplary environment 100 for providing image quality enhancement, in accordance with an embodiment of the present disclosure;
FIG. 2 shows a block diagram 104 illustrating a system for providing image quality enhancement, in accordance with an embodiment of the present disclosure;
FIGs. 3A-3B show a method 300 for providing image quality enhancement, in accordance with an embodiment of the present disclosure.
The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION OF THE INVENTION
In the present disclosure, the term "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a device that comprises a list of components does not include only those components but may include other components not expressly listed or inherent to such setup or device. In other words, one or more elements in a system or method preceded by "comprises... a" does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
The terms like "at least one" and "one or more" may be used interchangeably or in combination throughout the description.
While the present disclosure is illustrated in the context of OCR, the imaging method and system, and aspects and features thereof, can also be used for any other application which requires image quality enhancement, for example, generating a high-resolution image from a lower resolution image.
Reference will now be made to the exemplary embodiments of the disclosure, as illustrated in the accompanying drawings. Wherever possible, same numerals will be used to refer to the same or like parts. Embodiments of the disclosure are described in the following paragraphs with reference to FIGs. 1 to 3A-3B.
FIG. 1 shows an exemplary environment 100 for enhancing the image, and more particularly for generating a higher resolution image from a lower resolution image, in accordance with an embodiment of the present disclosure. It must be understood by a person skilled in the art that the present disclosure may also be implemented in various environments, other than as shown in FIG. 1. As shown in FIG. 1, the system 104 receives a pre-processed image 102 having a current contrast factor. The pre-processed image 102 is a low-quality or lower resolution image which needs image enhancement. The system 104 further processes the pre-processed image using the imaging techniques disclosed in the present disclosure to convert it into a higher resolution image. The imaging technique will now be explained in conjunction with FIG. 2.

FIG. 2 shows a block diagram of a system 104 for image quality enhancement, in accordance with an embodiment of the present disclosure. Although the present disclosure is explained considering that the system 104 is implemented on a computer system, it may be understood that the system 104 may be implemented in a variety of computing systems, such as a laptop computer, a notebook, a workstation, a mainframe computer, a server, a network server, a cloud-based computing environment, etc.
In one implementation, the system 104 may comprise a processing unit 202, an I/O interface 210 and a memory 212. The memory 212 may be communicatively coupled to the processing unit 202 and the I/O interface 210. In some embodiments, the processing unit 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processing unit 202 is configured to fetch and execute computer-readable instructions stored in the memory 212. The I/O interface 210 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 210 may enable the system 104 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 210 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 210 may include one or more ports for connecting many devices to one another or to another server.
Along with the processing unit 202, the system 104 further includes a Dominant Level Extractor (DLE) unit 204, an Image Channel Aggregator (ICA) unit 206 and an Image Channel Variance Router (ICVR) unit 208. According to embodiments of the present disclosure, these units 202-208 may comprise hardware components like processors, microprocessors, microcontrollers, or application-specific integrated circuits for performing various operations of the system 104. In one embodiment, the units 202-208 may be dedicated hardware units capable of executing one or more instructions stored in the memory 212 for performing various operations of the system 104. In another embodiment, the units 202-208 may be software modules stored in the memory 212 which may be executed by a processor. It must be understood by a person skilled in the art that the processor may perform all the functions of the units 204-208 according to various embodiments of the present disclosure.
Now referring back to FIG. 1, the environment 100 shows the system 104 receiving the pre-processed image 102. The pre-processed image 102 comprises a plurality of pixel values in RGB color space and has a current contrast factor. In other words, the processing unit 202 of the system 104 may be configured to receive the pre-processed image 102. In one implementation, the pre-processed image 102 may be obtained after passing a lower-resolution image through one or more stages of image processing. Firstly, the lower-resolution image may be passed through a Super Resolution Imaging (SRI) stage where SRI techniques may be applied on the lower-resolution image. SRI techniques are mainly used for recovering information that is not explicitly present in lower resolution images. For example, the letter ‘O’ and the number ‘0’ (zero) may not be clearly distinguishable; however, the SRI technique can recover such details from the lower resolution images to distinguish the letter ‘O’ from the number ‘0’ (zero).
Secondly, an output image obtained after the SRI stage is subjected to a sharpness enhancement technique at a Sharpness Enhancement stage. The sharpness enhancement technique measures image sharpness and facilitates automatic image sharpness enhancement. This way, blurry images will be sharpened more, whereas sufficiently sharp images will not be sharpened at all.
Thirdly, the sharpness enhanced image is subjected to an Image Channel Intensity Router (ICIR) technique at an ICIR stage. The ICIR technique balances the brightness of the sharpness enhanced image to avoid solid white pixels. It measures the intensity of brightness present in an image by computing the Root Mean Square (RMS) of the 3-slice stack of a color image (with each slice being one color channel of the RGB channels) and aggregating the measured values.
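The RMS-based brightness measurement described above can be illustrated with the following minimal sketch. The function name is illustrative, and averaging the three per-channel RMS values is an assumption, since the specification does not state the exact aggregation used:

```python
import numpy as np

def channel_rms_brightness(image):
    """Measure brightness as the RMS of each slice of the 3-slice
    (R, G, B) stack of a colour image, then aggregate the three
    per-channel measurements by averaging them (an assumption)."""
    img = np.asarray(image, dtype=np.float64)  # H x W x 3, values in [0, 255]
    per_channel = np.sqrt((img ** 2).mean(axis=(0, 1)))  # RMS of each colour slice
    return per_channel, float(per_channel.mean())
```

For a uniform image with every channel at intensity 100, each per-channel RMS and the aggregate are exactly 100, which matches the intuition that RMS reduces to the constant intensity for a flat image.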
Now, once the pre-processed image 102 having the plurality of pixel values and the current contrast factor is received after applying the above-mentioned techniques, the system 104 further applies the imaging techniques of the present disclosure upon the pre-processed image 102.

In the first step, the system 104 applies a Dominant Level Extraction (DLE) technique on the pre-processed image 102 to identify a set of pixel values, among the plurality of pixel values, pertaining to a common color component 106 in the RGB color space. As can be seen in FIG. 1, the pixels pertaining to the common color component 106 are highlighted using a dotted square. In the next step, the system 104 applies an Image Channel Aggregator (ICA) technique to modify remaining pixel values 108 of the plurality of pixel values based on the set of pixel values in such a manner that the remaining pixel values also attain the common color component. As can be seen from FIG. 1, the remaining pixels modified based on the set of pixel values (of the common color component) are indicated by “L”-shaped dotted lines. Once all the pixel values attain the common color component, in the next step, the system 104 applies an Image Channel Variance Router (ICVR) technique to determine a new contrast factor. Here, it may be noted that the new contrast factor is an improved contrast factor compared to the current contrast factor initially received along with the pre-processed image 102. Finally, the system 104 applies the new contrast factor upon the pre-processed image 102 to obtain a contrast enhanced image.
The above steps will now be explained using an example. It is conventionally known that an image consists of pixels, and each pixel represents a dot in the image. Further, a pixel contains three values, each lying between 0-255, representing the amounts of the red, green and blue (RGB) color components, also known as colour channels. The combination of these RGB components shapes the actual colour of the pixel. The RGB components of the pixels of an image are represented as an M x N x 3 order matrix. In one implementation, when the system 104 obtains the pre-processed image 102 with the current contrast factor, the DLE Unit 204 extracts the dominant level present in each colour channel (i.e., RGB colour channels or space) of the pre-processed image 102. The DLE Unit 204 is configured to identify a set of pixel values, among the plurality of pixel values, pertaining to a common colour component in the RGB colour space. The common colour component represents the dominant level present in the RGB colour channels or space.
In one implementation, the DLE Unit 204 may apply the K-means clustering technique to identify the set of pixel values pertaining to the common colour component. Before applying the K-means clustering technique on the pixel values, the required number of clusters may be determined for the pre-processed image 102 with the current contrast factor, because the number of dominant colours may vary between images. In one implementation, the Elbow method may be used to determine how many dominant colours exist in the pre-processed image 102 with the current contrast factor. Further, the M x N x 3 order matrix of pixel values of the pre-processed image 102 with the current contrast factor may be transformed into three individual lists, which contain the respective Red, Green and Blue (RGB) values. Thereafter, the maximum values associated with the three individual lists are identified to determine the dominant colour, i.e., the quantized colour space of each pixel value. Further, each of the pixel values of the pre-processed image 102 with the current contrast factor is mapped to the corresponding dominant colour space. Thereafter, the K-means clustering technique clusters each of the pixels into a list of three colour stacks, i.e., Red, Green and Blue (RGB). The colour in each cluster centre reflects the average of the attributes of all cluster members, i.e., pixels. The determination of the dominant level or the common colour component of the pre-processed image 102 with the current contrast factor is based on the colour in each cluster centre.
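The K-means-based dominant colour extraction described above can be sketched as follows. This is a deliberately minimal illustration: the function name is hypothetical, the Elbow-based choice of k is omitted (k is taken as a parameter), and the clustering loop is a bare-bones k-means rather than a production implementation:

```python
import numpy as np

def dominant_colour(image, k=3, iters=20, seed=0):
    """Estimate the dominant (common) colour component of an RGB image
    with a minimal k-means clustering over its pixels. The dominant
    level is taken as the centre of the most populated cluster."""
    pixels = np.asarray(image, dtype=np.float64).reshape(-1, 3)
    uniq = np.unique(pixels, axis=0)
    k = min(k, len(uniq))  # cannot have more clusters than distinct colours
    rng = np.random.default_rng(seed)
    centres = uniq[rng.choice(len(uniq), size=k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to its nearest cluster centre.
        d = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each centre as the mean colour of its members.
        for j in range(k):
            members = pixels[labels == j]
            if len(members):
                centres[j] = members.mean(axis=0)
    counts = np.bincount(labels, minlength=k)
    return np.rint(centres[counts.argmax()]).astype(int)
```

On an image dominated by one colour (say, mostly (200, 190, 180) with a few dark outliers), the most populated cluster centre converges to that dominant colour, mirroring how the DLE step yields (dlr, dlg, dlb).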
In an example implementation, the DLE technique is applied on a pre-processed image 102 (obtained after the ICIR stage) with pixel values in the range of 0-255 as shown in Table 1 below:

xr              | xg              | xb
70 80 78 72     | 58 68 66 60     | 47 57 55 49
156 161 160 157 | 146 151 150 147 | 133 138 137 134
204 194 196 203 | 192 183 185 191 | 180 171 173 179
204 205 197 201 | 193 193 185 190 | 181 181 173 178
204 205 198 201 | 192 193 186 190 | 180 181 174 178
205 205 190 202 | 193 193 179 190 | 181 181 167 178
205 205 199 204 | 193 193 187 192 | 181 181 175 180
204 205 196 203 | 192 193 184 191 | 180 181 172 179
205 205 205 202 | 193 193 193 191 | 181 181 181 179
Table 1
where:
xr, xg, xb represent the pixel values of the red, green and blue colour space of the image obtained after the ICIR stage.
Further, applying the K-means clustering technique to the pixel values of Table 1 in the manner explained above may yield the dominant colour or the common colour component, represented as (dlr, dlg, dlb) = [203, 192, 180] in the RGB colour space.
In one implementation, after the DLE Unit 204 identifies the set of pixel values 106 pertaining to the common colour component in the RGB colour space, in the next step, the Image Channel Aggregator (ICA) Unit 206 modifies the remaining pixel values 108 of the pre-processed image 102 with the current contrast factor in such a manner that the remaining pixel values also attain the common colour component as identified by the DLE Unit 204. Firstly, to modify the remaining pixel values, the ICA Unit 206 normalizes each of the values of the dominant or common colour component (dlr, dlg, dlb), defined by:
nr = dlr / 255
ng = dlg / 255
nb = dlb / 255
Where:
nr, ng, nb represent normalized colour features. The values of (nr, ng, nb) determined
based on the values of (dlr, dlg, dlb) = [203, 192, 180] are [0.80, 0.75, 0.71].
In one implementation, the values of nr, ng, nb are further used to determine the values of the remaining pixels 108, and the determination of these values is defined by:
Xr = xr x nr^2
Xg = xg x ng^2
Xb = xb x nb^2
Where:
Xr, Xg, Xb represent the RGB channel pixels after modifying the remaining pixel values
based on the common colour component,
nr, ng, nb represent normalized colour features, and
xr, xg, xb represent the pixel values of the red, green and blue colour space of the image
obtained after the ICIR stage.

Based on the above, the normalized dominant level lies between zero and one, since outliers strongly influence the scale otherwise. Further, each RGB channel is multiplied by the squared dominant colour component scaler to spread the common colour across the entire image, reducing the intensity of the common colour and balancing the colour in the whole image. Advantageously, this reduces the distance between the minimum and maximum intensity levels to treat the saturation problem, and avoids bias towards the dominant or common colour intensity while calculating a new contrast factor.
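The ICA normalization and squared scaling above can be sketched as follows. The function name is illustrative, and rounding to the nearest integer is an assumption inferred from the integer tables in the specification:

```python
import numpy as np

def aggregate_channels(image, dominant):
    """ICA step: normalize the dominant colour component (dlr, dlg, dlb)
    to (nr, ng, nb) = dominant / 255, then multiply each RGB channel by
    the squared normalized value so the remaining pixels attain the
    common colour component."""
    n = np.asarray(dominant, dtype=np.float64) / 255.0   # (nr, ng, nb)
    scaled = np.asarray(image, dtype=np.float64) * n ** 2  # per-channel scale
    return np.rint(scaled).astype(int)  # integer rounding is an assumption
```

Applied to the first row of Table 1 with (dlr, dlg, dlb) = (203, 192, 180), this reproduces the first row of Table 2 (for example, 70 → 44, 58 → 33 and 47 → 23).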
In an example implementation, Table 2 below shows the values of Xr, Xg, Xb when the ICA Unit 206 determines these values based on the values of xr, xg, xb of Table 1 and the values of nr, ng, nb determined above:

Xr = xr x nr^2
Xg = xg x ng^2
Xb = xb x nb^2

Xr              | Xg              | Xb
44 51 49 46     | 33 39 37 34     | 23 28 27 24
99 102 101 99   | 83 86 85 83     | 66 69 68 67
129 123 124 129 | 109 104 105 108 | 90 85 86 89
129 130 125 127 | 109 109 105 108 | 90 90 86 89
129 130 125 127 | 109 109 105 108 | 90 90 87 89
130 130 120 128 | 109 109 101 108 | 90 90 83 89
130 130 126 129 | 109 109 106 109 | 90 90 87 90
129 130 124 129 | 109 109 104 108 | 90 90 86 89
130 130 130 128 | 109 109 109 108 | 90 90 90 89
Table 2
After determining the values of Xr, Xg, Xb, the ICA unit 206 merges the values of Xr, Xg, Xb to normalize all three colour channels (RGB colour space) so that all the remaining pixels of the image attain the common colour component. The merging of the values of Xr, Xg, Xb by the ICA Unit 206 may be defined by:
Xc = merge(Xr, Xg, Xb)
Where:
Xc : represents the modified pixel values of the whole image (i.e., Normalized Channel
Aggregator).
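The merge operation can be sketched as a simple depth-wise stack of the three channel planes. The function name is illustrative:

```python
import numpy as np

def merge_channels(xr, xg, xb):
    """Stack the three H x W channel planes back into one H x W x 3
    image, mirroring Xc = merge(Xr, Xg, Xb)."""
    return np.dstack([xr, xg, xb])
```

For 1 x 1 planes holding 1, 2 and 3, the merged result is a single pixel (1, 2, 3) of shape (1, 1, 3).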
When the ICA unit 206 outputs an image (hereinafter “ICA image”) with modified pixel values as explained above, the ICA image is obtained by the ICVR Unit 208. An ICVR technique applied by the ICVR Unit 208 enhances the contrast of the ICA image. Thus, the ICVR Unit 208 determines a new contrast factor (vf) based on the set of pixel values (having the common color component) and the remaining pixel values after being modified (which also attain the common color component). Further, the ICVR Unit 208 applies the new contrast factor (vf) upon the ICA image having a plurality of pixel values in RGB color space in order to obtain a contrast enhanced image.
In one implementation, the determination of the new contrast factor (vf) may be defined by following technique:
σ^2 = ∑_{i=1}^{n} (Xci - x̄)^2 / n
σ^2 = log(abs(σ^2) + 1)
s = √σ^2
vf = ∑_{i=1}^{n} si / n
Xr = vf (Xr - 100) + 100
Xg = vf (Xg - 100) + 100
Xb = vf (Xb - 100) + 100
Where:
σ^2 : represents variance,
s : represents the standard deviation of the ICA image,
Xc : represents the modified pixel values of the whole image (i.e., Normalized Channel
Aggregator),
vf : represents the new contrast factor,
vf > 1 represents more contrast, 0 < vf < 1 represents less contrast, and vf = 0 gives a
solid grey image,
Xr, Xg, Xb : represent the RGB colour channels of the contrast enhanced image,
Xi : represents the contrast enhanced image (the result is rounded to an integer and
clamped to the range [0, 255] if pixel values exceed the range).

Now, the sequence of functions applied to determine the new contrast factor (vf) in accordance with the above implementation may be represented as: variance, followed by the log transform, the square root, and averaging into vf.

In an example implementation, the new contrast factor (vf) may be determined using values of Table 2 as shown below in Table 3:

σ^2 = ∑_{i=1}^{n} (Xci - x̄)^2 / n:
987  988  989  992
998  999  1000 1003
1007 1009 1010 825
828  829  831  833
832  834  835  837
832  833  835  837
826  827  828  831
816  817  819  821
995  997  998  813

σ^2 = log(abs(σ^2) + 1):
6.89568 6.89669 6.89771 6.90073
6.90675 6.90776 6.90875 6.91175
6.91572 6.91771 6.9187  6.71659
6.72022 6.72143 6.72383 6.72623
6.72503 6.72743 6.72863 6.73102
6.72503 6.72623 6.72863 6.73102
6.7178  6.71901 6.72022 6.72383
6.70564 6.70686 6.7093  6.71174
6.90375 6.90575 6.90675 6.70196

s = √σ^2:
2.62596 2.62616 2.62635 2.62692
2.62807 2.62826 2.62845 2.62902
2.62978 2.63015 2.63034 2.59164
2.59234 2.59257 2.59304 2.5935
2.59327 2.59373 2.59396 2.59442
2.59327 2.5935  2.59396 2.59442
2.59187 2.59211 2.59234 2.59304
2.58952 2.58976 2.59023 2.5907
2.6275  2.62788 2.62807 2.58881
Table 3
When the values are determined as mentioned in Table 3, the value of the new contrast factor is determined as below:
vf = ∑_{i=1}^{n} si / n = 2.54
Further, the value of the new contrast factor (vf) i.e., 2.54 as determined above is applied to all the pixels of the ICA image to obtain a contrast enhanced image with the values of pixels in RGB colour channel or space as shown below in Table 4:

Xr              | Xg              | Xb
122 134 132 124 | 96 109 106 99   | 63 76 73 66
244 218 223 241 | 213 190 195 211 | 183 160 165 180
244 246 226 236 | 216 216 195 208 | 185 185 167 178
244 246 228 236 | 213 216 198 208 | 183 185 150 178
246 246 108 238 | 216 216 180 208 | 185 185 170 178
246 246 231 244 | 216 216 200 213 | 185 185 162 183
244 246 223 241 | 213 216 193 211 | 183 185 185 180
246 246 246 238 | 216 216 216 211 | 185 162 167 183
246 221 226 244 | 216 193 198 213 | 185 162 167 183
Table 4
Further, the values of Xr, Xg, Xb as shown in Table 4 are merged to obtain the contrast enhanced image with the new contrast factor 110 and may be defined as:
Xi = merge(Xr, Xg, Xb)
Where, Xi represents the contrast enhanced image with new contrast factor.
Further, the contrast enhanced image 110 is subjected to an RGB noise removal technique. Noise usually accompanies images during acquisition or transmission, resulting in contrast reduction, colour shift, and poor visual quality. The presence of noise damages the precision of several computer-vision-based applications, such as semantic segmentation, motion tracking, object detection and action recognition, and contaminates the authenticity of an image. According to an embodiment of the present invention, the system 104 applies one or more filtering techniques upon the contrast enhanced image 110 in order to remove the one or more pixel values pertaining to the artifacts, thereby denoising the contrast enhanced image 110.
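The specification speaks only of "one or more filtering techniques" without naming one; a median filter is one common choice for removing impulse-like artifact pixels, and can be sketched per channel as follows (function name and window size are illustrative):

```python
import numpy as np

def median_denoise(channel, k=3):
    """Replace each pixel of a single channel with the median of its
    k x k neighbourhood (edge-padded), suppressing impulse artifacts."""
    img = np.asarray(channel, dtype=np.float64)
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out
```

A lone outlier pixel of 255 in a flat patch of 10s, for example, is restored to 10 by the neighbourhood median.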
In one implementation, the denoised contrast enhanced image is subjected to a binarization technique to obtain a binarized image before implementing the Optical Character Recognition (OCR) process. Binarization of the denoised contrast enhanced image is a segmentation technique used for separating an object, considered as a foreground, from its background. By applying the binarization technique on the denoised contrast enhanced image, all foreground pixel values are converted to black and the rest into white, which results in a binarized image.
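The binarization step above can be sketched as a thresholding pass over a greyscale image. The specification does not name a thresholding scheme; Otsu's method, used here to pick the threshold automatically, is one common choice, and the function name is illustrative:

```python
import numpy as np

def binarize(gray, threshold=None):
    """Binarize a greyscale image: pixels at or below the threshold
    become black (0, foreground) and the rest white (255)."""
    g = np.asarray(gray, dtype=np.uint8)
    if threshold is None:
        # Otsu's method: pick the t that maximizes between-class variance.
        hist = np.bincount(g.ravel(), minlength=256).astype(np.float64)
        p = hist / hist.sum()
        omega = np.cumsum(p)                    # class-0 probability up to t
        mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
        mu_t = mu[-1]
        with np.errstate(divide="ignore", invalid="ignore"):
            sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
        threshold = int(np.nanargmax(sigma_b))
    return np.where(g <= threshold, 0, 255).astype(np.uint8)
```

On a bimodal strip of dark (10) and bright (200) pixels, the automatically chosen threshold cleanly separates the two classes into black and white.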
Further, the Optical Character Recognition (OCR) technique is applied on the binarized image for extracting text from printed or scanned digital/handwritten document images and converting it into an editable digital text format.
FIG. 3A illustrates a flow chart of a method 300 for enhancing the image quality, and more particularly for generating a higher resolution image from a lower resolution image. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform specific functions or implement specific abstract data types.
The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described.
At step 302, the method includes receiving a pre-processed image having a plurality of pixel values in RGB color space and a current contrast factor. In one implementation, the pre-processed image may be obtained after passing a lower-resolution image through one or more stages of image processing which includes a Super Resolution Imaging (SRI) stage, Sharpness Enhancement stage, and Image Channel Intensity Router (ICIR) stage as explained in above paragraphs.
At step 304, the method 300 may include identifying a set of pixel values, among the plurality of pixel values, pertaining to a common colour component in the RGB colour space.
At step 306, the method 300 may include modifying remaining pixel values 108 of the plurality of pixel values based on the set of pixel values in such a manner that the remaining pixel values also attain the common color component.
At step 308, the method 300 includes determining a new contrast factor (vf) based on the set of pixel values 106 and the remaining pixel values 108 after being modified.
At step 310, the method 300 includes applying the new contrast factor (vf) upon the pre-processed image having a plurality of pixel values in RGB color space in order to obtain a contrast enhanced image 112.
Now referring to FIG. 3B, at step 312, the method 300 includes detecting one or more pixel values pertaining to artifacts present in the contrast enhanced image.

At step 314, the method 300 includes applying one or more filtering techniques upon the contrast enhanced image in order to replace the one or more pixel values pertaining to the artifacts, thereby denoising the contrast enhanced image 110.
At step 316, the method 300 includes applying an Optical Character Recognition (OCR) technique upon the contrast enhanced image after being denoised.
The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.
Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., it is non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a graphic processing unit (GPU), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), and/or a state machine.
Advantages of the embodiments of the present disclosure are illustrated herein.
[0058] In an embodiment, the present disclosure provides improved character level accuracy for better quality OCR output in a cost-effective manner.
[0059] In an embodiment, the present disclosure provides the techniques to generate higher-resolution images from lower-resolution images.
REFERENCE NUMERALS
[0060] Environment 100
[0061] Pre-Processed Image with current Contrast Factor 102
[0062] System 104
[0063] Pixel values with common color components 106
[0064] Modified remaining Pixel values 108
[0065] Contrast Enhanced Image with new contrast Factor 110
[0066] Processing Unit 202
[0067] Dominant Level Extraction (DLE) Unit 204
[0068] Image Channel Aggregator (ICA) Unit 206
[0069] Image Channel Variance Router (ICVR) Unit 208
[0070] I/O Interface 210
[0071] Memory 212
[0072] Method 300
[0073] The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.

The terms "including", "comprising", “having” and variations thereof mean "including but not limited to", unless expressly specified otherwise.
The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.
The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be clear that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be clear that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

We Claim:
1. An imaging method comprising:
receiving (302) a pre-processed image having a plurality of pixel values in RGB color space and a current contrast factor;
identifying (304) a set of pixel values, among the plurality of pixel values, pertaining to a common color component in the RGB color space;
modifying (306) remaining pixel values of the plurality of pixel values based on the set of pixel values in such a manner that the remaining pixel values also attain the common color component;
determining (308) a new contrast factor based on the set of pixel values and the remaining pixel values after being modified; and
applying (310) the new contrast factor upon the pre-processed image having a plurality of pixel values in RGB color space in order to obtain a contrast enhanced image.
2. The method as claimed in claim 1, further comprising:
detecting (312) one or more pixel values pertaining to artifacts present in the contrast enhanced image; and
applying (314) one or more filtering techniques upon the contrast enhanced image in order to replace the one or more pixel values pertaining to the artifacts, thereby denoising the contrast enhanced image.
3. The method as claimed in claim 2, further comprising applying (316) an Optical Character Recognition (OCR) technique upon the contrast enhanced image after being denoised.
4. The method as claimed in claim 1, wherein the set of pixel values, among the plurality of pixel values, pertaining to the common color component in the RGB color space is determined by using a k-means clustering technique.
5. An imaging system (104) comprising:
a memory (212);
a processing unit (202) operationally coupled with the memory (212), the processing unit (202) configured to:

receive a pre-processed image having a plurality of pixel values in RGB color space and a current contrast factor;
identify a set of pixel values, among the plurality of pixel values, pertaining to a common color component in the RGB color space;
modify remaining pixel values of the plurality of pixel values based on the set of pixel values in such a manner that the remaining pixel values also attain the common color component;
determine a new contrast factor based on the set of pixel values and the remaining pixel values after being modified; and
apply the new contrast factor upon the pre-processed image having a plurality of pixel values in RGB color space in order to obtain a contrast enhanced image.
6. The imaging system (104) as claimed in claim 5, wherein the processing unit is further configured to:
detect one or more pixel values pertaining to artifacts present in the contrast enhanced image; and
apply one or more filtering techniques upon the contrast enhanced image in order to replace the one or more pixel values, thereby denoising the contrast enhanced image.
7. The system (104) as claimed in claim 6, wherein the processing unit is further configured to:
apply an Optical Character Recognition technique upon the contrast enhanced image after being denoised.
8. The system (104) as claimed in claim 5, wherein the set of pixel values, among the plurality of pixel values, pertaining to the common color component in the RGB color space is determined by using a k-means clustering technique.

Documents

Application Documents

# Name Date
1 202221043339-STATEMENT OF UNDERTAKING (FORM 3) [28-07-2022(online)].pdf 2022-07-28
2 202221043339-REQUEST FOR EXAMINATION (FORM-18) [28-07-2022(online)].pdf 2022-07-28
3 202221043339-POWER OF AUTHORITY [28-07-2022(online)].pdf 2022-07-28
4 202221043339-FORM 18 [28-07-2022(online)].pdf 2022-07-28
5 202221043339-FORM 1 [28-07-2022(online)].pdf 2022-07-28
6 202221043339-DRAWINGS [28-07-2022(online)].pdf 2022-07-28
7 202221043339-DECLARATION OF INVENTORSHIP (FORM 5) [28-07-2022(online)].pdf 2022-07-28
8 202221043339-COMPLETE SPECIFICATION [28-07-2022(online)].pdf 2022-07-28
9 202221043339-Proof of Right [29-07-2022(online)].pdf 2022-07-29
10 Abstract1.jpg 2022-09-30
11 202221043339-FER.pdf 2025-05-19
12 202221043339-FORM-26 [13-06-2025(online)].pdf 2025-06-13

Search Strategy

1 SearchHistoryE_12-09-2024.pdf