Abstract: This disclosure relates to a method and system for classifying an object in input data using an artificial neural network (ANN) model. The method may include extracting positive features and orthogonal features associated with the object in the input data, performing a partial classification of the object based on the positive features by a first part of the ANN model, and determining an accuracy of the classification of the object based on the orthogonal features by a second part of the ANN model. The positive features are features uniquely contributing to identification of a class for the object, while the orthogonal features are features not contributing to identification of the class but contributing to identification of one or more of the remaining classes. Figure 2
Claims:
WE CLAIM:
1. A method of classifying an object in input data using an artificial neural network (ANN) model, the method comprising:
extracting, by an object classification device, one or more positive features and one or more orthogonal features associated with the object in the input data, wherein the one or more positive features are features uniquely contributing to identification of a class for the object, and wherein the one or more orthogonal features are features not contributing to identification of the class but contributing to identification of one or more of remaining classes;
performing, by the object classification device, a partial classification of the object based on the one or more positive features by a first part of the ANN model, wherein the first part of the ANN model detects presence of a pattern in the input data to arrive at the class of the object; and
determining, by the object classification device, an accuracy of the classification of the object based on the one or more orthogonal features by a second part of the ANN model, wherein the second part of the ANN model detects absence of a pattern in the input data to arrive at the accuracy of the class of the object.
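As an illustration only, and not the claimed ANN implementation, the two-part arrangement of claim 1 can be sketched with a linear stand-in: the first part scores the presence of each class's positive features to propose a class, and the second part scores the absence of that class's orthogonal features to estimate accuracy. All feature masks, class names, and values below are hypothetical.

```python
import numpy as np

# Hypothetical per-class masks (not taken from the claims):
# POSITIVE[c] marks features unique to class c; ORTHOGONAL[c] marks
# features that identify the remaining classes but not c.
POSITIVE = {"cat": np.array([1.0, 1.0, 0.0, 0.0]),
            "dog": np.array([0.0, 0.0, 1.0, 1.0])}
ORTHOGONAL = {"cat": np.array([0.0, 0.0, 1.0, 1.0]),
              "dog": np.array([1.0, 1.0, 0.0, 0.0])}

def partial_classify(x):
    """First part: detect presence of positive-feature patterns."""
    scores = {c: float(x @ m) for c, m in POSITIVE.items()}
    return max(scores, key=scores.get)

def classification_accuracy(x, cls):
    """Second part: detect absence of orthogonal-feature patterns.

    Low activation on the orthogonal features of `cls` means the
    evidence for the remaining classes is absent, so confidence in
    `cls` is high.
    """
    leakage = float(x @ ORTHOGONAL[cls]) / (float(x.sum()) + 1e-9)
    return 1.0 - leakage

x = np.array([0.9, 0.8, 0.1, 0.0])    # feature vector for one object
cls = partial_classify(x)             # proposes "cat"
conf = classification_accuracy(x, cls)
```

In this toy setting the orthogonal check acts as an independent verifier: a high positive-feature score paired with low orthogonal-feature activation yields high confidence.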
2. The method of claim 1, further comprising:
determining a plurality of positive features and a plurality of orthogonal features for each of a plurality of classes corresponding to a plurality of objects using training data by a multi-stage classifier, wherein determining the plurality of positive features and the plurality of orthogonal features comprises determining, for each of at least two features from among a plurality of features, at least one of a ratio of cross correlation, a ratio of auto correlation, or a Kullback–Leibler (KL) divergence; and
storing the plurality of positive features and the plurality of orthogonal features for each of the plurality of classes in a database.
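The KL-divergence criterion of claim 2 can be illustrated with a minimal sketch, not the claimed multi-stage classifier: a feature whose activation histogram diverges strongly between two classes is marked positive for the class where it fires and orthogonal for the other. The histograms and the threshold are hypothetical.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-9):
    """KL divergence D(p || q) between two normalized histograms."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Hypothetical histograms of one feature's activations per class.
hist_a = np.array([8.0, 1.0, 1.0])   # the feature fires mostly in class A
hist_b = np.array([1.0, 1.0, 8.0])   # ...and differently in class B

# A large symmetric divergence marks the feature as discriminative:
# positive for the class where it fires, orthogonal for the other.
d = kl_divergence(hist_a, hist_b) + kl_divergence(hist_b, hist_a)
is_positive_for_a = d > 1.0          # the threshold 1.0 is an assumption
```

The results would then be stored per class, as in the second limb of the claim, so that extraction at inference time (claim 3) is a lookup rather than a recomputation.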
3. The method of claim 2, wherein extracting the one or more positive features and the one or more orthogonal features associated with the object comprises employing the plurality of positive features and the plurality of orthogonal features for each of the plurality of classes stored in the database.
4. The method of claim 1, wherein the one or more positive features further comprise features common with one or more of the remaining classes but contributing to identification of the class for the object, and wherein a lower weightage is assigned to the features common with one or more of the remaining classes.
5. The method of claim 1, wherein the input data comprises one of image data, textual data, audio data, or a haptic signal, and wherein the first part of the ANN model comprises a convolutional neural network (CNN) and the second part of the ANN model comprises a long short-term memory (LSTM).
6. The method of claim 1, further comprising:
receiving user input with respect to at least one of the class, a plurality of classes, the one or more positive features, or the one or more orthogonal features; and
re-training the ANN model based on the user input.
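The feedback loop of claim 6 can be sketched with a linear stand-in for the ANN, purely as an assumption-laden illustration: the user corrects a predicted class, and the model weights are nudged toward the user-supplied class with a perceptron-style update. The names and learning rate are hypothetical.

```python
import numpy as np

w = np.zeros((2, 4))                  # 2 classes x 4 features (toy model)

def predict(x):
    """Stand-in for the ANN's class prediction."""
    return int(np.argmax(w @ x))

def retrain_on_feedback(x, user_class, lr=0.5):
    """Re-train on user input: move weights toward the corrected class."""
    pred = predict(x)
    if pred != user_class:
        w[user_class] += lr * x       # reinforce the user-supplied class
        w[pred] -= lr * x             # penalize the wrong prediction

x = np.array([1.0, 0.0, 1.0, 0.0])
retrain_on_feedback(x, user_class=1)  # user says the object is class 1
```

In the claimed system the same feedback could equally target the stored positive and orthogonal feature sets, not only the model weights.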
7. An object classification device for classifying an object in input data using an artificial neural network (ANN) model, the object classification device comprising:
at least one processor and a computer-readable medium storing instructions that, when executed by the at least one processor, cause the at least one processor to perform operations comprising:
extracting one or more positive features and one or more orthogonal features associated with the object in the input data, wherein the one or more positive features are features uniquely contributing to identification of a class for the object, and wherein the one or more orthogonal features are features not contributing to identification of the class but contributing to identification of one or more of remaining classes;
performing a partial classification of the object based on the one or more positive features by a first part of the ANN model, wherein the first part of the ANN model detects presence of a pattern in the input data to arrive at the class of the object; and
determining an accuracy of the classification of the object based on the one or more orthogonal features by a second part of the ANN model, wherein the second part of the ANN model detects absence of a pattern in the input data to arrive at the accuracy of the class of the object.
8. The object classification device of claim 7, wherein the operations further comprise:
determining a plurality of positive features and a plurality of orthogonal features for each of a plurality of classes corresponding to a plurality of objects using training data by a multi-stage classifier, wherein determining the plurality of positive features and the plurality of orthogonal features comprises determining, for each of at least two features from among a plurality of features, at least one of a ratio of cross correlation, a ratio of auto correlation, or a Kullback–Leibler (KL) divergence; and
storing the plurality of positive features and the plurality of orthogonal features for each of the plurality of classes in a database.
9. The object classification device of claim 8, wherein extracting the one or more positive features and the one or more orthogonal features associated with the object comprises employing the plurality of positive features and the plurality of orthogonal features for each of the plurality of classes stored in the database, wherein the one or more positive features further comprise features common with one or more of the remaining classes but contributing to identification of the class for the object, and wherein a lower weightage is assigned to the features common with one or more of the remaining classes.
10. The object classification device of claim 7, wherein the input data comprises one of image data, textual data, audio data, or a haptic signal, wherein the first part of the ANN model comprises a convolutional neural network (CNN) and the second part of the ANN model comprises a long short-term memory (LSTM), and wherein the ANN model is re-trained based on user input comprising at least one of the class, a plurality of classes, the one or more positive features, or the one or more orthogonal features.
Dated this 12th day of June, 2019
R Ramya Rao
Of K&S Partners
Agent for the Applicant
IN/PA-1607
Description:
Technical Field
[001] This disclosure relates generally to artificial neural networks (ANNs), and more particularly to a method and system for classifying an object in input data using an ANN model.
| # | Name | Date |
|---|---|---|
| 1 | 201941023316-FORM-26 [24-05-2024(online)].pdf | 2024-05-24 |
| 2 | 201941023316-STATEMENT OF UNDERTAKING (FORM 3) [12-06-2019(online)].pdf | 2019-06-12 |
| 3 | 201941023316-IntimationOfGrant24-05-2024.pdf | 2024-05-24 |
| 4 | 201941023316-Request Letter-Correspondence [12-06-2019(online)].pdf | 2019-06-12 |
| 5 | 201941023316-REQUEST FOR EXAMINATION (FORM-18) [12-06-2019(online)].pdf | 2019-06-12 |
| 6 | 201941023316-PatentCertificate24-05-2024.pdf | 2024-05-24 |
| 7 | 201941023316-Written submissions and relevant documents [24-05-2024(online)].pdf | 2024-05-24 |
| 8 | 201941023316-POWER OF AUTHORITY [12-06-2019(online)].pdf | 2019-06-12 |
| 9 | 201941023316-Power of Attorney [12-06-2019(online)].pdf | 2019-06-12 |
| 10 | 201941023316-AMENDED DOCUMENTS [23-04-2024(online)].pdf | 2024-04-23 |
| 11 | 201941023316-FORM 18 [12-06-2019(online)].pdf | 2019-06-12 |
| 12 | 201941023316-Correspondence to notify the Controller [23-04-2024(online)].pdf | 2024-04-23 |
| 13 | 201941023316-FORM 13 [23-04-2024(online)].pdf | 2024-04-23 |
| 14 | 201941023316-FORM 1 [12-06-2019(online)].pdf | 2019-06-12 |
| 15 | 201941023316-POA [23-04-2024(online)].pdf | 2024-04-23 |
| 16 | 201941023316-Form 1 (Submitted on date of filing) [12-06-2019(online)].pdf | 2019-06-12 |
| 17 | 201941023316-DRAWINGS [12-06-2019(online)].pdf | 2019-06-12 |
| 18 | 201941023316-US(14)-HearingNotice-(HearingDate-09-05-2024).pdf | 2024-04-15 |
| 19 | 201941023316-DECLARATION OF INVENTORSHIP (FORM 5) [12-06-2019(online)].pdf | 2019-06-12 |
| 20 | 201941023316-FER.pdf | 2021-10-17 |
| 21 | 201941023316-CLAIMS [21-09-2021(online)].pdf | 2021-09-21 |
| 22 | 201941023316-COMPLETE SPECIFICATION [12-06-2019(online)].pdf | 2019-06-12 |
| 23 | 201941023316-COMPLETE SPECIFICATION [21-09-2021(online)].pdf | 2021-09-21 |
| 24 | 201941023316-Proof of Right (MANDATORY) [18-11-2019(online)].pdf | 2019-11-18 |
| 25 | 201941023316-DRAWING [21-09-2021(online)].pdf | 2021-09-21 |
| 26 | 201941023316-FORM 3 [20-09-2021(online)].pdf | 2021-09-20 |
| 27 | 201941023316-FER_SER_REPLY [21-09-2021(online)].pdf | 2021-09-21 |
| 28 | 201941023316-PETITION UNDER RULE 137 [21-09-2021(online)].pdf | 2021-09-21 |
| 29 | 201941023316-OTHERS [21-09-2021(online)].pdf | 2021-09-21 |
| 30 | searchstrategyE_16-03-2021.pdf | |