Abstract: The present disclosure relates to a method and a system for recognizing characters. In one embodiment, an input image comprising one or more characters to be recognized is received and processed to extract one or more nodes and edges of each character in the input image. Using the extracted nodes and edges, a graphical representation and an adjacency matrix of each character are generated and compared with a predetermined graphical representation and adjacency matrix to determine a match. Based on the comparison, a matching probability is determined, based on which one or more characters in the input image are recognized and displayed as output. The proposed recognition method and system recognize characters with greater accuracy and speed. Further, the present disclosure is simple, cost-effective, and reduces the complexity involved in automatic recognition of characters. FIG. 2
CLAIMS: We Claim:
1. A recognition method, comprising:
receiving, from an image sensor of a recognition system, an input image comprising one or more characters;
extracting, by a processor of the recognition system, one or more nodes and edges of each character in the input image;
generating, by the processor, a graphical representation of each character based on the one or more edges;
comparing, by a comparison unit of the recognition system, the generated graphical representation of each character with a predetermined graphical representation of each reference character stored in a reference repository; and
recognizing, by a validation unit of the recognition system, the reference character as one of the characters in the input image based on the comparing.
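The node-and-edge extraction of claim 1 can be illustrated with a minimal sketch: a character skeleton treated as an undirected graph whose nodes are stroke end points and junctions, reduced to an adjacency matrix for later comparison. The function name `build_adjacency` and the sample "T" glyph are hypothetical illustrations, not taken from the specification.

```python
# Minimal sketch (hypothetical names): a character skeleton as an
# undirected graph of nodes (end points and junctions) and edges
# (strokes), reduced to an adjacency matrix.

def build_adjacency(num_nodes, edges):
    """Return the num_nodes x num_nodes adjacency matrix for undirected edges."""
    matrix = [[0] * num_nodes for _ in range(num_nodes)]
    for a, b in edges:
        matrix[a][b] = 1
        matrix[b][a] = 1
    return matrix

# Example: a capital "T" skeleton -- three stroke end points (nodes 0-2)
# meeting at a single junction (node 3), one edge per stroke.
t_matrix = build_adjacency(4, [(0, 3), (1, 3), (2, 3)])
```

Because the matrix is symmetric with one row and column per node, two skeletons with the same stroke topology yield identical matrices regardless of scale, which is what makes it a useful comparison key.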
2. The method as claimed in claim 1, further comprising pre-processing of the received input image before extracting the one or more nodes and edges of each character from the received input image, the pre-processing comprising the steps of:
detecting a boundary of the input image;
identifying the location of each character based on the boundary of the input image;
segmenting the image of each character into one or more image segments from the location; and
skeletonizing the one or more image segments of each character to generate one or more features representing a general form of the character.
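The pre-processing steps of claim 2 can be sketched on a binary raster: locate each character by the bounding box of its foreground pixels, then crop that region out as a segment. The helper names (`character_bounds`, `segment`) are hypothetical; skeletonization (thinning each segment to a one-pixel-wide form) would follow but is omitted for brevity.

```python
# Sketch of the claimed pre-processing (hypothetical helper names):
# boundary detection and segmentation on a binary image, where 1 marks
# a foreground (ink) pixel and 0 marks background.

def character_bounds(image):
    """Return (top, left, bottom, right) of the foreground pixels."""
    rows = [r for r, row in enumerate(image) if any(row)]
    cols = [c for c in range(len(image[0])) if any(row[c] for row in image)]
    return rows[0], cols[0], rows[-1], cols[-1]

def segment(image):
    """Crop the image to the bounding box of its foreground pixels."""
    top, left, bottom, right = character_bounds(image)
    return [row[left:right + 1] for row in image[top:bottom + 1]]

glyph = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
cropped = segment(glyph)  # the 2x2 region containing the strokes
```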
3. The method as claimed in claims 1 and 2, further comprising creating the reference repository, wherein creating the reference repository comprises the steps of:
pre-processing an image of one or more characters;
extracting one or more nodes and edges of each skeletonized character;
generating the graphical representation and adjacency matrix for each skeletonized character using the one or more nodes and edges; and
storing the graphical representation and the adjacency matrix in a memory coupled with the processor.
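The reference repository of claim 3 can be pictured as a mapping from each reference character to its stored graphical data. The structure below is a hypothetical illustration: real entries would be generated from pre-processed, skeletonized template images, and the angle values shown are placeholders.

```python
# Hypothetical shape of the reference repository: one entry per
# reference character, holding its adjacency matrix and graphical
# wave ending position angles (placeholder values).

reference_repository = {
    "T": {
        "adjacency": [[0, 0, 0, 1],
                      [0, 0, 0, 1],
                      [0, 0, 0, 1],
                      [1, 1, 1, 0]],
        "wave_ending_angles": [90, 180, 0],
    },
}

def lookup(character):
    """Fetch the stored representation of a reference character, if any."""
    return reference_repository.get(character)
```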
4. The method as claimed in claims 1 and 2, wherein generating the graphical representation of each character comprises the steps of:
generating the graphical representation using the edges of each skeletonized character; and
determining the graphical wave ending position angle of each skeletonized character from the respective graphical representation thus generated.
5. The method as claimed in claims 1 and 3, wherein comparing the generated graphical representation of each character with the predetermined graphical representation of the reference character comprises the steps of:
comparing the graphical wave ending position angle of each skeletonized character with the graphical wave ending position angle of each reference character;
determining an adjacency matrix for each skeletonized character based on the one or more nodes; and
comparing the adjacency matrix of each skeletonized character with the adjacency matrix of the reference character.
6. The method as claimed in claims 1 and 5, wherein recognizing characters of the input image comprises the steps of:
selecting one or more skeletonized characters matching with each reference character as recognized characters of the input image upon comparing the graphical wave ending position angle and adjacency matrix of each skeletonized character with the graphical wave ending position angle and adjacency matrix of the reference character;
determining at least one unmatched character and determining matching probability data of the at least one unmatched character;
comparing the matching probability data of the at least one unmatched character with a predetermined matching probability threshold; and
selecting the reference character as a recognized character in the input image based on the comparing.
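The two-stage recognition of claims 5 and 6 can be sketched as follows: an exact adjacency-matrix match is accepted outright, and an unmatched character falls back to a matching probability compared against a predetermined threshold. The scoring used here, the fraction of agreeing matrix cells, is an assumed stand-in; the specification does not define how the probability is computed, and the function names are hypothetical.

```python
# Sketch of threshold-based recognition (assumed scoring): exact
# matches pass immediately; otherwise the fraction of agreeing
# adjacency-matrix cells is tested against a threshold.

def matching_probability(candidate, reference):
    """Fraction of adjacency-matrix cells on which the two graphs agree."""
    cells = [(a, b)
             for cand_row, ref_row in zip(candidate, reference)
             for a, b in zip(cand_row, ref_row)]
    return sum(1 for a, b in cells if a == b) / len(cells)

def recognize(candidate, reference, threshold=0.9):
    """Accept an exact match, else fall back to the probability threshold."""
    if candidate == reference:
        return True
    return matching_probability(candidate, reference) >= threshold

reference = [[0, 1], [1, 0]]
noisy = [[0, 1], [0, 0]]  # one stroke edge lost to noise
```

With the default threshold of 0.9, `recognize(reference, reference)` accepts and `recognize(noisy, reference)` rejects, since only 3 of 4 cells agree (0.75).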
7. A recognition system comprising:
an image sensor;
a processor coupled with the image sensor;
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which, on execution, cause the processor to:
receive, from the image sensor, an input image comprising one or more characters;
extract one or more nodes and edges of each character from the input image; and
generate a graphical representation of each character based on the one or more edges;
a comparison unit coupled with the processor and configured to compare the graphical representation of each character with a predetermined graphical representation of each reference character stored in a reference repository; and
a validation unit coupled with the comparison unit and configured to recognize the reference character as one of the characters in the input image based on the comparing.
8. The system as claimed in claim 7, wherein the instructions, on execution, further cause the processor to pre-process the received input image before extracting the one or more nodes and edges of each character from the input image, the pre-processing comprising the steps of:
detecting a boundary of the input image;
identifying the location of each character based on the boundary of the input image;
segmenting the image of each character into one or more image segments from the location; and
skeletonizing the one or more image segments of each character to generate one or more features representing the general form of the character.
9. The system as claimed in claims 7 and 8, wherein the instructions, on execution, further cause the processor to create the reference character repository by:
pre-processing an image of one or more characters including at least alphabets, numbers and special characters;
extracting one or more nodes and edges of each skeletonized character;
generating the graphical representation and adjacency matrix for each skeletonized character using the one or more extracted nodes and edges; and
storing the graphical representation and the adjacency matrix in the memory.
10. The system as claimed in claims 7 and 8, wherein the instructions, on execution, further cause the processor to generate the graphical representation of each character by:
generating the graphical representation using the edges of each skeletonized character; and
determining the graphical wave ending position angle of each skeletonized character from the respective graphical representation thus generated.
11. The system as claimed in claims 7 and 9, wherein the instructions, on execution, cause the comparison unit to compare the generated graphical representation of each character with the predetermined graphical representation of the reference character by:
comparing the graphical wave ending position angle of each skeletonized character with the graphical wave ending position angle of each reference character;
determining an adjacency matrix for each skeletonized character based on the one or more nodes; and
comparing the adjacency matrix of each skeletonized character with that of the reference character.
12. The system as claimed in claims 7 and 11, wherein the instructions, on execution, further cause the validation unit to recognize the one or more characters of the input image by:
selecting one or more skeletonized characters matching with each reference character as recognized characters of the input image upon comparing the graphical wave ending position angle and adjacency matrix of each skeletonized character with the graphical wave ending position angle and adjacency matrix of the reference character;
determining at least one unmatched character and determining matching probability data of the at least one unmatched character;
comparing the matching probability data of the at least one unmatched character with a predetermined matching probability threshold; and
selecting the reference character as a recognized character in the input image based on the comparing.
13. A non-transitory computer readable medium including instructions stored thereon that, when processed by at least one processor, cause a system to perform acts of:
receiving an input image comprising one or more characters;
extracting one or more nodes and edges of each character from the input image;
generating a graphical representation of each character based on the one or more edges;
comparing the generated graphical representation of each character with the predetermined graphical representation of each reference character stored in a reference repository; and
recognizing the reference character as one of the characters in the input image based on the comparing.
Dated this 24th day of December 2014
M.S. Devi
Of K& S Partners
Agent for the Applicant
FIELD OF THE DISCLOSURE
The present subject matter relates, in general, to a recognition method and a recognition system, and more particularly, but not exclusively, to a method and system for recognizing characters.
| # | Name | Date |
|---|---|---|
| 1 | 6520-CHE-2014 FORM-9 24-12-2014.pdf | 2014-12-24 |
| 1 | 6520-CHE-2014-PROOF OF ALTERATION [08-06-2023(online)].pdf | 2023-06-08 |
| 2 | 6520-CHE-2014 FORM-18 24-12-2014.pdf | 2014-12-24 |
| 2 | 6520-CHE-2014-IntimationOfGrant23-02-2023.pdf | 2023-02-23 |
| 3 | IP29064-spec.pdf | 2014-12-30 |
| 3 | 6520-CHE-2014-PatentCertificate23-02-2023.pdf | 2023-02-23 |
| 4 | IP29064-fig.pdf | 2014-12-30 |
| 4 | 6520-CHE-2014-CLAIMS [27-01-2020(online)].pdf | 2020-01-27 |
| 5 | FORM 5-IP29064.pdf | 2014-12-30 |
| 5 | 6520-CHE-2014-COMPLETE SPECIFICATION [27-01-2020(online)].pdf | 2020-01-27 |
| 6 | FORM 3-IP29064.pdf | 2014-12-30 |
| 6 | 6520-CHE-2014-CORRESPONDENCE [27-01-2020(online)].pdf | 2020-01-27 |
| 7 | 6520CHE2014_CertifiedCopyRequest.pdf | 2014-12-30 |
| 7 | 6520-CHE-2014-DRAWING [27-01-2020(online)].pdf | 2020-01-27 |
| 8 | 6520-CHE-2014-Request For Certified Copy-Online(30-12-2014).pdf | 2014-12-30 |
| 8 | 6520-CHE-2014-FER_SER_REPLY [27-01-2020(online)].pdf | 2020-01-27 |
| 9 | 6520-CHE-2014-OTHERS [27-01-2020(online)].pdf | 2020-01-27 |
| 9 | abstract 6520-CHE-2014.jpg | 2015-01-03 |
| 10 | 6520-CHE-2014 POWER OF ATTORNEY 22-05-2015.pdf | 2015-05-22 |
| 10 | 6520-CHE-2014-PETITION UNDER RULE 137 [14-01-2020(online)].pdf | 2020-01-14 |
| 11 | 6520-CHE-2014 FORM-1 22-05-2015.pdf | 2015-05-22 |
| 11 | 6520-CHE-2014-FORM 3 [11-01-2020(online)].pdf | 2020-01-11 |
| 12 | 6520-CHE-2014 CORRESPONDENCE OTHERS 22-05-2015.pdf | 2015-05-22 |
| 12 | 6520-CHE-2014-FER.pdf | 2019-07-26 |
| 1 | 6520_CHE_2014_Search_Strategy_25-07-2019.pdf | |