
A Method And A System For Recognition Of Data In One Or More Images

Abstract: The present disclosure relates to a method and system for recognition of data in one or more images. The method receives and segments the one or more images to identify segmented objects. Further, the method generates object relationship data for each of the segmented objects and determines a knowledge base representation of the object relationship data based on defined features. Furthermore, a Recurrent Neural Network (RNN) is trained on the knowledge base representation to determine an appropriate Neural Network (NN) having an optimum confidence score. Based on the appropriate NN selected, the objects in the input image are predicted and transmitted to external systems for decision making. This enables accurate text/object identification in images having different background variations through dynamic selection of the NN, thereby facilitating more effective decision making. Figure 3A


Patent Information

Application #:
Filing Date: 12 February 2018
Publication Number: 33/2019
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: bangalore@knspartners.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2023-12-08
Renewal Date:

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. BALAJI GOVINDARAJ
No.17 Balaraman Street, Guduvancheri, Chennai 603-202, Tamil Nadu, India.
2. MOHD ZAID
E 150 Shaheen Bagh, Jamia Nagar Okhla, Delhi
3. SUJATHA J
O-103, HMT Township, Sector 1, Jalahalli, Bangalore-560013, Karnataka, India.
4. RAGHOTTAM MANNOPANTAR
Pristine Paradise, #105, Near Shantiniketan School, Bilekahalli, Bangalore -560076, Karnataka, India.

Specification

Claims:

We claim:
1. A method for recognition of data in one or more images, the method comprising:
receiving, by a processor of a data recognition system, the one or more images from an image sensor coupled with the processor;
segmenting, by the processor, the one or more images to identify one or more segmented objects in the one or more images;
generating, by the processor, an object relationship data for each of the segmented objects, each of the segmented objects comprising at least one or more defined features and a confidence score;
determining, by the processor, a knowledge base representation of the object relationship data based on the one or more defined features of the one or more segmented objects, wherein the knowledge base representation comprises one or more defined feature nodes, one or more indent nodes and a plurality of links defining a relationship between the one or more defined feature nodes and at least one indent node;
training, by the processor, a Recurrent Neural Network (RNN) based on the knowledge base representation to generate a trained RNN; and
determining, by the processor, an appropriate Neural Network (NN) based on selection of the at least one indent node having optimum confidence score and the trained RNN for data prediction and recognition.
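
The claimed method can be illustrated end to end with stand-in components. In the sketch below, every name is an illustrative assumption rather than something taken from the specification: "segmentation" simply splits a string into tokens, the candidate networks are score/predict pairs, and the knowledge base links each segmented object to the network with the optimum confidence score. The RNN training step is elided.

```python
def segment(image_text):
    """Stand-in segmentation: one 'object' per whitespace-separated token."""
    return image_text.split()

def recognize(image_text, networks):
    """networks: name -> {'score': fn, 'predict': fn} (illustrative)."""
    objects = segment(image_text)
    # Object relationship data: confidence score of each network per object.
    rel = {obj: {name: nn["score"](obj) for name, nn in networks.items()}
           for obj in objects}
    # Knowledge base: link each object to the network with the optimum score.
    kb = {obj: max(scores, key=scores.get) for obj, scores in rel.items()}
    # Predict each object with its selected network.
    return {obj: networks[kb[obj]]["predict"](obj) for obj in objects}
```
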

2. The method as claimed in claim 1, wherein segmenting the one or more images comprises the steps of:
identifying one or more trained objects and text data in the one or more images;
generating, by the processor, at least one boundary for the one or more trained objects and text data identified in the one or more images; and
cropping, by the processor, the identified trained objects and text data along the at least one boundary to determine the one or more segmented objects in the one or more images.
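
A minimal sketch of the claimed segmentation step: draw a bounding box around each detected region and crop along it. The region detection here (pixels equal to a label in a 2D grid) is a stand-in for a real trained detector; all function names are illustrative.

```python
def bounding_box(image, label):
    """Return (top, left, bottom, right) enclosing all pixels == label."""
    rows = [r for r, row in enumerate(image) for v in row if v == label]
    cols = [c for row in image for c, v in enumerate(row) if v == label]
    return min(rows), min(cols), max(rows), max(cols)

def crop(image, box):
    """Crop the grid along the given boundary (inclusive corners)."""
    top, left, bottom, right = box
    return [row[left:right + 1] for row in image[top:bottom + 1]]

def segment(image, labels):
    """Return one cropped segmented object per detected label."""
    return {lab: crop(image, bounding_box(image, lab)) for lab in labels}
```
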

3. The method as claimed in claim 1, wherein generating the object relationship data comprises the steps of:
training a plurality of NNs with the one or more segmented objects;
identifying at least one defined feature for each of the one or more segmented objects; and
determining the confidence score for the one or more defined features of one or more segmented objects associated with the plurality of NNs.
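
In this hedged sketch of claim 3, each candidate network scores every defined feature of a segmented object, and the object relationship data records those confidence scores per network. The "networks" are stand-in scoring functions, not trained models, and all names are assumptions.

```python
def build_object_relationship_data(segmented_objects, networks):
    """segmented_objects: object id -> list of defined features.
    networks: network name -> scoring function over a feature.
    Returns object id -> network name -> feature -> confidence score."""
    return {
        obj_id: {name: {feat: score(feat) for feat in features}
                 for name, score in networks.items()}
        for obj_id, features in segmented_objects.items()
    }
```
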

4. The method as claimed in claim 1, wherein the step of determining the knowledge base representation of the object relationship data comprises the steps of:
identifying the one or more indent nodes for each segmented object in the object relationship data, wherein an indent node represents a NN;
determining at least one indent node with optimum confidence score associated with each segmented object; and
mapping the at least one indent node with the one or more defined features to generate a relationship between the one or more defined feature nodes and at least one indent node, wherein a defined feature node represents a defined feature in the knowledge base representation.
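
The mapping in claim 4 could be sketched as follows: indent nodes stand for candidate networks, defined-feature nodes stand for features, and each feature node is linked to the indent node whose confidence score for that feature is optimal (here taken to mean highest). The data layout matches the object relationship data sketch above and is an illustrative assumption.

```python
def build_knowledge_base(object_relationship_data):
    """Input: object id -> network name -> feature -> confidence score.
    Output: feature node -> indent node (network) with the best score."""
    best = {}  # feature -> (network name, score)
    for per_network in object_relationship_data.values():
        for nn_name, feature_scores in per_network.items():
            for feature, score in feature_scores.items():
                if feature not in best or score > best[feature][1]:
                    best[feature] = (nn_name, score)
    return {feature: nn for feature, (nn, _score) in best.items()}
```
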

5. The method as claimed in claim 1, wherein determining the appropriate NN comprises selecting the NN by the RNN based on at least one defined feature node linked with the indent node representing the selected NN.
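
Claim 5's selection step could be sketched as a vote over the knowledge base links: given the feature nodes observed in a new image, pick the network whose indent node is linked to the most of them. This voting rule is an assumption; the patent only states that selection is based on linked feature nodes.

```python
from collections import Counter

def select_network(knowledge_base, observed_features):
    """knowledge_base: feature node -> network name (indent node).
    Returns the network linked to the most observed features, or None."""
    votes = Counter(knowledge_base[f] for f in observed_features
                    if f in knowledge_base)
    return votes.most_common(1)[0][0] if votes else None
```
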

6. The method as claimed in claim 1, further comprising:
automatically updating the knowledge base representation of the object relationship data with a new NN based on the one or more defined features; and
transmitting the data predicted and recognized from the selected NN to a decision-making module for further processing.
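
A hedged sketch of the update in claim 6: when a new network scores a defined feature better than the currently linked network, the feature node is re-linked to the new network's indent node. The dictionaries and the "better score wins" rule are illustrative assumptions.

```python
def update_knowledge_base(kb, best_scores, new_nn, new_scores):
    """kb: feature node -> network name; best_scores: feature -> score.
    Re-links a feature to new_nn whenever new_scores beats the current best."""
    for feature, score in new_scores.items():
        if score > best_scores.get(feature, 0.0):
            kb[feature] = new_nn
            best_scores[feature] = score
    return kb
```
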

7. A system for recognition of data in one or more images, the system comprising:
at least one image sensor for capturing the one or more images;
a processor; and
a memory, communicatively coupled with the processor, wherein the memory stores processor-executable instructions, which on execution cause the processor to:
receive the one or more images from the image sensor;
segment the one or more images to identify one or more segmented objects;
generate an object relationship data for each of the segmented objects, each segmented object comprises one or more defined features and a confidence score;
determine a knowledge base representation of the object relationship data based on the one or more defined features of the one or more segmented objects, wherein the knowledge base representation comprises the one or more defined feature nodes, one or more indent nodes and a plurality of links defining the relationship between the one or more defined feature nodes and at least one indent node;
train a Recurrent Neural Network (RNN) based on the knowledge base representation to generate a trained RNN; and
determine an appropriate Neural Network (NN) based on selection of the at least one indent node having optimum confidence score and the trained RNN for data prediction and recognition.

8. The system as claimed in claim 7, wherein the processor is configured to segment the one or more images by steps comprising:
identifying one or more trained objects and text data in the one or more images;
generating at least one boundary for the one or more trained objects and text data identified in the one or more images; and
cropping the identified trained objects and text data along the at least one boundary to determine the one or more segmented objects in the one or more images.

9. The system as claimed in claim 7, wherein the processor is configured to generate the object relationship data by performing steps of:
training a plurality of NNs with the one or more segmented objects;
identifying at least one defined feature for each of the one or more segmented objects; and
determining the confidence score for the one or more defined features of one or more segmented objects associated with the plurality of NNs.

10. The system as claimed in claim 7, wherein the processor is configured to determine the knowledge base representation of the object relationship data by performing steps of:
identifying the one or more indent nodes for each segmented object in the object relationship data, wherein an indent node represents a NN;
determining at least one indent node with optimum confidence score associated with each segmented object; and
mapping the at least one indent node with the one or more defined features to generate a relationship between the one or more defined feature nodes and at least one indent node, wherein a defined feature node represents a defined feature in the knowledge base representation.

11. The system as claimed in claim 7, wherein the processor is configured to determine the appropriate NN by selecting the NN, by the RNN, based on at least one defined feature node linked with the indent node representing the selected NN.

12. The system as claimed in claim 7, wherein the processor is further configured to:
automatically update the knowledge base representation of the object relationship data with a new NN based on the one or more defined features; and
transmit the data predicted and recognized from the selected NN to a decision-making module for further processing.

Dated this 12th day of February, 2018

SWETHA S. N
OF K&S PARTNERS
ATTORNEY FOR THE APPLICANT
Description:

TECHNICAL FIELD
The present subject matter is related, in general, to image processing and, more particularly but not exclusively, to a method and a system for recognition of data in images.

Documents

Application Documents

# Name Date
1 201841005276-STATEMENT OF UNDERTAKING (FORM 3) [12-02-2018(online)].pdf 2018-02-12
2 201841005276-REQUEST FOR EXAMINATION (FORM-18) [12-02-2018(online)].pdf 2018-02-12
3 201841005276-POWER OF AUTHORITY [12-02-2018(online)].pdf 2018-02-12
4 201841005276-FORM 18 [12-02-2018(online)].pdf 2018-02-12
5 201841005276-FORM 1 [12-02-2018(online)].pdf 2018-02-12
6 201841005276-DRAWINGS [12-02-2018(online)].pdf 2018-02-12
7 201841005276-DECLARATION OF INVENTORSHIP (FORM 5) [12-02-2018(online)].pdf 2018-02-12
8 201841005276-COMPLETE SPECIFICATION [12-02-2018(online)].pdf 2018-02-12
9 201841005276-REQUEST FOR CERTIFIED COPY [01-03-2018(online)].pdf 2018-03-01
10 201841005276-Proof of Right (MANDATORY) [24-04-2018(online)].pdf 2018-04-24
11 Correspondence by Agent_Form 1_01-05-2018.pdf 2018-05-01
12 201841005276-OTHERS [29-01-2021(online)].pdf 2021-01-29
13 201841005276-Information under section 8(2) [29-01-2021(online)].pdf 2021-01-29
14 201841005276-FORM 3 [29-01-2021(online)].pdf 2021-01-29
15 201841005276-FER_SER_REPLY [29-01-2021(online)].pdf 2021-01-29
16 201841005276-DRAWING [29-01-2021(online)].pdf 2021-01-29
17 201841005276-CLAIMS [29-01-2021(online)].pdf 2021-01-29
18 201841005276-ABSTRACT [29-01-2021(online)].pdf 2021-01-29
19 201841005276-FER.pdf 2021-10-17
20 201841005276-PatentCertificate08-12-2023.pdf 2023-12-08
21 201841005276-IntimationOfGrant08-12-2023.pdf 2023-12-08
22 201841005276-PROOF OF ALTERATION [10-04-2024(online)].pdf 2024-04-10

Search Strategy

1 search_stratE_21-08-2020.pdf

E-Register / Renewals

3rd: 07 Mar 2024

From 12/02/2020 - To 12/02/2021

4th: 07 Mar 2024

From 12/02/2021 - To 12/02/2022

5th: 07 Mar 2024

From 12/02/2022 - To 12/02/2023

6th: 07 Mar 2024

From 12/02/2023 - To 12/02/2024

7th: 07 Mar 2024

From 12/02/2024 - To 12/02/2025

8th: 05 Feb 2025

From 12/02/2025 - To 12/02/2026