Abstract: Embodiments of the present disclosure teach automatic extraction and verification of signatures through artificial neural networks. An embodiment of the present disclosure refers to a system for automatic extraction and verification of signatures comprising a signature extractor configured to extract a user signature image from scanned cheque images, a signature database coupled to the signature extractor configured to store actual reference data and fake reference data, a signature processor coupled to the signature extractor configured to process the extracted user signature image and identify features of the extracted user signature image, at least one neural network dedicated to each actual reference feature of the stored actual reference signatures coupled to the signature processor configured to verify features of the extracted user signature image against the actual reference as well as fake reference data stored in the signature database, and an integrated neural network interface coupled to the at least one neural network dedicated to each actual reference feature. Ref: Fig. 1
THE PATENTS ACT, 1970
COMPLETE SPECIFICATION
Section 10
"AUTOMATIC EXTRACTION AND VERIFICATION OF SIGNATURES"
Newgen Software Technologies Limited, a corporation organized and existing under the laws of India, of Brooklyn Business Centre, 5th Floor, East Wing, 103-105, Periyar EVR Road, Chennai - 600084 Tamil Nadu, India.
The following specification particularly describes the nature of this invention and the manner in which it is to be performed:
AUTOMATIC EXTRACTION AND VERIFICATION OF SIGNATURES
Field of Technology:-
The present disclosure refers to automatic extraction and verification of signatures and more particularly, but not limited to, automatic extraction and verification of signatures through artificial neural networks.
Background: -
In industries where payments and money transfers are involved, such as the banking industry, verification of the user's identity is required, which is usually in the form of signatures. For example, in the banking industry, for availing any kind of service like opening an account or applying for a credit card, the customer is required to fill in a form with his personal information and three or more reference signatures. These signatures serve the purpose of verification of all his future transactions.
Previously such verification was carried out manually; however, it was prone to human error, costly and time intensive. Currently, various mechanisms offer automated signature verification based on the mode of acquisition of the signature data, i.e. on-line and off-line. The existing systems follow either an off-line or an on-line system of verification, or a combination of both. The on-line process records the motion of the digital stylus/pen while the signature is produced on a transducer pad, and captures dynamic characteristics like location, pen direction, stroke direction, acceleration and pen pressure. These characteristics are specific to each individual and are sufficiently stable as well as repetitive. In the off-line process, a 2-D scanned digital image of the signature is used for verification. The processing of off-line data is relatively complex due to many different factors, such as the absence of stable dynamic characteristics, highly stylish and unconventional writing styles, pen tip thickness, orientation, size and position of the signature, space availability on paper, and variation of the signatures due to age, illness, geographic location and, to some extent, the emotional state of the person. Moreover, in some scenarios extraction of off-line signatures presents an extreme challenge, for example in bank cheques of south-east Asia, where signatures are overlapped with stamps, with another signature, with MICR data and with other static/dynamic cheque information. These factors coupled together cause a lot of intra-personal variation and make the verification process much more complex.
Summary: -
Embodiments of the present disclosure teach automatic extraction and verification of signatures through artificial neural networks. An embodiment of the present disclosure refers to a system for automatic extraction and verification of signatures comprising a signature extractor configured to extract user signature image from scanned cheque images, a signature database coupled to the signature extractor configured to store actual reference data and fake reference data, a signature processor coupled to the signature extractor configured to process extracted user signature image and identify features of the extracted user signature image, at least one neural network dedicated to each actual reference feature of stored actual reference signatures coupled to signature processor configured to verify features of extracted user signature image against the actual reference as well as fake reference data stored in the signature database and an integrated neural network interface coupled to the at least one neural network dedicated to each actual reference feature.
According to an embodiment of the present disclosure, the signature extractor is configured to extract user signature image from scanned cheque images by calculating Most Probable Region (MPR).
According to another embodiment of the present disclosure, the signature database comprises an actual reference signature storage unit and a fake reference signature storage unit. The actual reference signature storage unit is configured to store actual reference signature images, their actual reference features and their actual reference data, while the fake reference signature storage unit is configured to store fake reference signature images and fake reference data similar to the actual reference signature images. The actual reference data is a formula-based ratio of values of the actual reference features, and the fake reference data is a formula-based ratio of values of the authentic and fake reference features. Alternatively, the fake reference data for one user can also be the authentic reference data of another user.
According to an embodiment of the present disclosure, the actual and fake reference features are such as baseline, critical points, cross points, curve comparison, directional probability density function, gradient of image, gradient computation, graph matching, grid, height, holes, novel feature, projection feature, profiling, shape, and critical points and graph matching hybrid.
According to an embodiment of the present disclosure, the signature database is dynamically updated with reference features and reference data of signatures of new users.
According to an embodiment of the present disclosure, the at least one neural network dedicated to each actual reference feature is trained for signatures of new users.
According to another embodiment of the present disclosure, the integrated neural network interface provides an output if the considered features of the extracted user signature verify; otherwise the user signature is sent for manual verification.
Another embodiment of the present disclosure refers to a signature verification device wherein said device comprises a signature extractor, a signature database coupled to the signature extractor, a signature processor coupled to the signature extractor, at least one neural network dedicated to
each actual reference feature coupled to the signature processor, and an integrated neural network interface coupled to the at least one neural network for each feature. The signature extractor is configured to extract a user signature image from scanned cheque images while the signature database is configured to store actual reference and fake reference data. The signature processor is configured to process the extracted user signature image and identify features of the extracted user signature image, while the at least one neural network is configured to verify features of the extracted user signature image against the actual reference as well as fake reference data stored in the signature database. The integrated neural network interface is configured to provide an output if all features of the extracted user signature image verify; otherwise the user signature is sent for manual verification.
Yet another embodiment of the present disclosure refers to a method for automatic extraction and verification of signatures comprising capturing a scanned image of a cheque, extracting user signature image from the scanned image, processing extracted user signature image to identify features of the extracted user signature image, dedicating at least one neural network to each actual reference feature of stored actual reference signature images, verifying each feature of the extracted user signature image against actual reference data and fake reference data of stored actual reference signature images and fake reference signature images respectively and integrating output of each at least one neural network to provide an output through an integrated neural network interface.
According to an embodiment of the present disclosure, the method comprises dynamically updating and storing reference features and reference data of signatures of new users.
According to an embodiment of the present disclosure, extracting user signature image from the scanned image comprises extracting Most Probable Region (MPR) from the scanned image of the cheque and regenerating user
signature image by extracting Black and White and/or Gray scale signature image from the extracted Most Probable Region (MPR).
According to another embodiment of the present disclosure, processing the extracted user signature image to identify its features comprises removing noise components from the extracted user signature image, detecting components in the MPR after removal of the noise components, and filtering the detected components to segregate those that are likely to be signature components.
According to another embodiment of the present disclosure, dedicating at least one neural network to each actual reference feature of actual reference signature image comprises training each at least one neural network if signatures are of a new user.
According to another embodiment of the present disclosure, the output through the integrated neural network interface is provided if each feature of the extracted user signature verifies against the stored actual reference data and fake reference data; otherwise the user signature is sent for manual verification.
Brief Description of drawings: -
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
Figure 1 illustrates a block diagram representation of a system for automatic extraction and verification of signatures according to an embodiment of the present disclosure.
Figure 2 illustrates a flow diagrammatic representation of a method for automatic extraction and verification of signatures in accordance with an embodiment of the present disclosure.
Figure 3 illustrates a flow diagrammatic representation of extraction of Most Probable Region (MPR) in accordance with an embodiment of the present disclosure.
Figure 4 illustrates a flow diagrammatic representation of a method for automatic extraction and verification of signatures based on number of signatories on a particular cheque in accordance with an embodiment of the present disclosure.
Figure 4a illustrates a flow diagrammatic representation of continuation of a method for automatic extraction and verification of signatures based on number of signatories on a particular cheque in accordance with an embodiment of the present disclosure.
Figure 5 illustrates a flow diagrammatic representation of extraction and regeneration of signature image in accordance with an embodiment of the present disclosure.
Figure 6 illustrates a flow diagrammatic representation of extraction of Black and White signature image in accordance with an embodiment of the present disclosure.
Figure 7 illustrates a flow diagrammatic representation of extraction of Gray Scale signature image in accordance with an embodiment of the present disclosure.
Figure 8 illustrates a flow diagrammatic representation of training an artificial neural network in accordance with an embodiment of the present disclosure.
Figures 9a-9l illustrate diagrammatic representations of features of an extracted user signature image in accordance with embodiments of the present disclosure.
Detailed Description: -
The following discussion provides a brief, general description of a suitable computing environment in which various embodiments of the present disclosure can be implemented. The aspects and embodiments are described in the general context of computer executable mechanisms such as routines executed by a general purpose computer, e.g. a server or personal computer. The embodiments described herein can be practiced with other system configurations, including Internet applications, hand held devices, multiprocessor systems, microprocessor based or programmable consumer electronics, network PCs, minicomputers, mainframe computers and the like. The embodiments can be embodied in a special purpose computer or data processor that is specifically programmed, configured or constructed to perform one or more of the computer executable mechanisms explained in detail below.
Exemplary embodiments now will be described with reference to the accompanying drawings. The disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey its scope to those skilled in the art. The terminology used in the detailed description of the particular exemplary embodiments illustrated in the accompanying drawings is not intended to be limiting. In the drawings, like numbers refer to like elements.
The specification may refer to "an", "one" or "some" embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment.
Single features of different embodiments may also be combined to provide other embodiments.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Furthermore, "connected" or "coupled" as used herein may include wirelessly connected or coupled. As used herein, the term "and/or" includes any and all combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The figures depict a simplified structure only showing some elements and functional entities, all being logical units whose implementation may differ from what is shown. The connections shown are logical connections; the actual physical connections may be different. It is apparent to a person skilled in the art that the structure may also comprise other functions and structures. It should be appreciated that the functions, structures, elements and the protocols used in communication are irrelevant to the present disclosure. Therefore, they need not be discussed in more detail here.
Also, all logical units described and depicted in the figures include the software and/or hardware components required for the unit to function. Further, each unit may comprise within itself one or more components which are implicitly understood. These components may be operatively coupled to each other and be configured to communicate with each other to perform the function of the said unit.
Figure 1 illustrates a block diagram representation of a system for automatic extraction and verification of signatures according to an embodiment of the present disclosure. A signature extractor 102 receiving scanned cheque images 101a, 101b, 101c…101n is configured to extract user signature images from said scanned cheque images. A signature processor 103 coupled to the signature extractor 102 is configured to process the user signature image extracted by the signature extractor 102 and detect its features. A signature database 106 is coupled to the signature extractor 102, wherein the signature database is configured to store actual reference data and fake reference data.
According to an embodiment of the present disclosure, the signature database 106 comprises an actual reference storage unit and a fake reference storage unit, where the actual reference storage unit stores actual reference signature images, their actual reference features and their actual reference data, while the fake reference storage unit stores fake reference signature images, fake reference features and fake reference data corresponding to actual reference signatures. When the signature extractor 102 encounters a signature of a new user, it extracts the signature image and stores the signature as an actual reference signature image. The signature processor 103 processes the same and identifies the features so as to store them as actual reference features in the signature database. Actual reference data is also calculated and stored as ratios of values of the actual reference features. Similarly, according to an embodiment of the present disclosure, fake reference signature images are also generated by a fake signature generator 107. The fake signature generator 107 is coupled to the signature database 106, wherein the generator 107 uses actual reference signature images of users stored in the database 106 to generate the fake reference signature images and accordingly stores the generated fake signatures in the signature database 106. Therefore, such fake reference signature images correspond to respective actual reference signature images.
According to yet another embodiment of the present disclosure, fake reference signatures are generated using formulas.
According to an embodiment of the present disclosure, the signature database 106 is dynamically updated whenever a signature of a new user is encountered. The reference features as well as the reference data are automatically updated in the signature database 106 for future reference.
Artificial neural networks 104a, 104b, 104c…104n dedicated to each actual reference feature are coupled to the output of the signature processor 103 and collectively coupled to the signature database 106. These artificial neural networks are trained to verify features of the extracted user signature against the actual reference data and fake reference data stored in the signature database 106. According to an embodiment of the present disclosure, in case a signature of a new user is encountered, the signature is extracted, features are extracted and, accordingly, each artificial neural network 104a, 104b, 104c…104n is trained for each extracted feature. According to an embodiment of the present disclosure, each neural network is trained using formula-based ratios of the extracted features and not on the actual features themselves. According to yet another embodiment of the present disclosure, the normalized values are within the range of -1 to +1.
The output of each artificial neural network is integrated to provide an integrated neural network interface 105. The integrated neural network interface 105 provides an output only if all features of the extracted user signature verify; otherwise the signature is sent for manual verification.
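The integration rule described above may be sketched as follows. This is an illustrative sketch only: the function name, the score convention and the threshold are hypothetical and are not part of the disclosed system, which only requires that every per-feature network verifies before an automatic output is given.

```python
def integrated_decision(feature_scores, threshold=0.5):
    """Accept the signature only when every per-feature neural network
    verifies its feature; otherwise route the signature for manual
    verification (hypothetical score convention: higher means verified)."""
    if all(score >= threshold for score in feature_scores):
        return "verified"
    return "manual verification"
```

A single failing feature network is enough to divert the signature to a human reviewer, which matches the all-features-must-verify rule stated above.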
According to an embodiment of the present disclosure, actual reference features and fake reference features are such as, but not limited to, baseline, critical points, cross points, curve comparison, directional probability density function, gradient of image, gradient computation, graph matching, grid, height, holes, novel feature, projection feature, profiling, shape and critical points and graph matching hybrid. These are described in detail in the following description.
Baseline: The signature image is divided into two parts by splitting it along the centroid. The centroids of the left image and the right image are calculated. The line joining the left centroid and the right centroid is called the baseline, and the angle this line makes with the horizontal is called the base angle. The base angle is used for verification. (Illustrated in Figure 9a)
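The baseline computation may be sketched for a binary signature image as follows. This is a minimal sketch assuming a NumPy array with non-zero foreground pixels; the function names are hypothetical and this is not the claimed implementation.

```python
import numpy as np

def centroid(img):
    # Mean (row, col) of all foreground (non-zero) pixels.
    ys, xs = np.nonzero(img)
    return ys.mean(), xs.mean()

def base_angle(img):
    """Angle in degrees of the line joining the centroids of the
    left and right halves of a binary signature image."""
    _, cx = centroid(img)
    split = int(round(cx))
    left, right = img[:, :split], img[:, split:]
    ly, lx = centroid(left)
    ry, rx = centroid(right)
    rx += split                      # shift right-half x back to full-image coords
    # Image rows grow downwards, so negate dy for a conventional angle.
    return np.degrees(np.arctan2(-(ry - ly), rx - lx))
```

A purely horizontal signature stroke yields a base angle of zero; a stroke rising from left to right yields a positive angle.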
Critical Points: The set of minimum points required to trace the outer boundary of the complete signature are called critical points. These critical points are extracted and classified according to their coordinates and direction. These classifications of points and the coordinates are then compared using correlation and longest common substring algorithms. (Illustrated in Figure 9b)
Cross Points: In each signature there are points where the signature strokes cross themselves. These points of crossover are known as cross points. Certain cross points show a degree of invariance from one sample signature to another. The cross points are extracted and then finally used for verification using Hungarian Matching. (Illustrated in Figure 9c)
Curve Comparison: Each signature has its characteristic curves as shown in Figure 9d. The outer boundary of each of the curves is extracted and traversed in the clockwise direction. Information about each curve is gathered and stored in the form of Freeman chain codes in the signature database. These chain codes of the curves are then verified. (Illustrated in Figure 9d)
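The Freeman chain-code encoding mentioned above may be sketched as follows for an already-ordered boundary point sequence. This is an assumption-laden sketch: the boundary-tracing step itself is omitted, and the direction numbering below is the conventional 8-direction Freeman scheme, which the specification does not spell out.

```python
# Freeman 8-direction codes in (row, col) offsets; rows grow downwards,
# so (-1, 0) is "up" on the page. Index i is the code for offset DIRS[i].
DIRS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]
CODE = {d: i for i, d in enumerate(DIRS)}

def chain_code(points):
    """Freeman chain code of an ordered boundary point sequence.
    Consecutive points must be 8-connected (offsets in DIRS)."""
    return [CODE[(p2[0] - p1[0], p2[1] - p1[1])]
            for p1, p2 in zip(points, points[1:])]
```

The resulting code list is what would be stored per curve in the signature database and later compared against the reference chain codes.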
Directional Probability Density Function: The gradient of the gray-level signature image is computed using the Sobel operator. This is illustrated below:
- Gradient of Image: The gradient of an image measures the variation of its gray levels. It provides two pieces of information: the magnitude of the gradient tells us how quickly the gray levels are varying, while the direction of the gradient tells us the direction in which the gray levels are changing most rapidly.
- Gradient Computation: The gradient of an image is, at each image point (pixel), a 2D vector with components obtained by the horizontal and vertical Sobel operators. The first component gives the intensity data, and the second component gives the directional data. The directional data is further normalized by dividing it by the sum of the values. The directional data is fit into a binomial distribution with seven channels (coefficients). Signatures are classified by using a cross-correlation function. (Further illustration in Figure 9e)
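The Sobel gradient and the normalized directional density may be sketched as follows. This is an illustrative sketch assuming a NumPy gray-level array; it bins directions into a simple magnitude-weighted histogram with seven channels rather than the binomial fit described above, which is a simplification.

```python
import numpy as np

def sobel_gradient(img):
    """Gradient magnitude and direction of a gray-level image using
    the horizontal and vertical Sobel operators (valid region only)."""
    img = img.astype(float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(3):                      # correlate with the 3x3 kernels
        for j in range(3):
            patch = img[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy), np.arctan2(gy, gx)

def direction_histogram(direction, magnitude, bins=7):
    """Magnitude-weighted direction histogram, normalized so the seven
    channel values sum to 1 (a directional probability density)."""
    hist, _ = np.histogram(direction, bins=bins, range=(-np.pi, np.pi),
                           weights=magnitude)
    total = hist.sum()
    return hist / total if total else hist
```

The normalized seven-channel vector would then be compared between reference and test signatures, for example with a cross-correlation function.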
Graph Matching: Graph matching analyses the geometrical shape of the signature. The sample signature and the test signature data points are used to construct a distance matrix, and the cost is calculated using an optimized variant of Hungarian Matching. If the cost is within the threshold pre-calculated using reference signatures, the signature is said to be authentic. (Illustrated in Figure 9f)
Grid Analysis: This feature divides the signature into grids of similar density based on pixel count, i.e. each grid will have approximately the same number of pixels. From each of the given grids a number of smaller sub-features are extracted, namely:
- 2nd order moment about the x-axis
- 2nd order moment about the y-axis
- Average angle made by each pixel with the bottom-left corner
- Ratio of the distance of the centre of gravity from the bottom-left corner to the length of the diagonal of the grid
- Pixel density of the grid
All the above-mentioned values are verified using a weighted vector comparison model. (Illustrated in Figure 9g)
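The sub-features listed above may be sketched for a single grid cell as follows. This is a sketch under stated assumptions: the cell is a binary NumPy array, coordinates are measured from the bottom-left corner, and the exact moment normalization is not fixed by the specification, so per-pixel means are used here.

```python
import numpy as np

def grid_features(cell):
    """Sub-features of one grid cell of a binary signature image:
    2nd-order moments about the x- and y-axes, mean angle of each
    pixel from the bottom-left corner, centre-of-gravity distance
    ratio, and pixel density (a sketch of the grid-analysis feature)."""
    h, w = cell.shape
    ys, xs = np.nonzero(cell)
    if ys.size == 0:
        return np.zeros(5)
    # Measure from the bottom-left corner (image rows grow downwards).
    y = (h - 1) - ys
    x = xs
    m_x = np.mean(y ** 2)                 # 2nd-order moment about the x-axis
    m_y = np.mean(x ** 2)                 # 2nd-order moment about the y-axis
    ang = np.mean(np.arctan2(y, x))       # average angle with bottom-left corner
    cg = np.hypot(x.mean(), y.mean())     # centre-of-gravity distance
    diag = np.hypot(w - 1, h - 1)         # length of the grid diagonal
    density = ys.size / cell.size
    return np.array([m_x, m_y, ang, cg / diag, density])
```

One such vector per grid cell would feed the weighted vector comparison model mentioned above.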
Height Feature: In this feature the signature is divided into 10 bins based on horizontal pixel density. Each bin has approximately the same number of pixels. After division into bins, the mean height of each bin (based on the distance of each pixel from the bottom) is calculated. This vector of mean heights is used to verify signatures. (Illustrated in Figure 9h)
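The height feature may be sketched as follows. This is an assumption-laden sketch: pixels are swept left to right and split into ten roughly equal-count groups, which is one plausible reading of "bins based on horizontal pixel density".

```python
import numpy as np

def height_feature(img, bins=10):
    """Vector of mean pixel heights over `bins` groups of foreground
    pixels, each group holding roughly the same number of pixels.
    Height is measured as the distance from the bottom row."""
    ys, xs = np.nonzero(img)
    order = np.argsort(xs, kind="stable")        # sweep left to right
    heights = (img.shape[0] - 1) - ys[order]     # distance from the bottom
    return np.array([chunk.mean() for chunk in np.array_split(heights, bins)])
```

For a signature whose ink all sits at one height, every bin mean is that same height, which is a quick sanity check on the construction.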
Holes: Each signature has distinctive loops or holes present. The size, position and relative position of the loops/holes are used to verify the authenticity of the signature. (Illustrated in Figure 9i)
Novel Feature: In this feature the signature image is split into two parts along its centroid, once vertically and then horizontally. This process is repeated for each of the split parts twice, thus obtaining 13 localized centroids. These 13 points act as a feature vector, which is used for verification. (Illustrated in Figure 9j)
Projection Feature: The signature is rotated in the clockwise and anticlockwise directions in steps of 5 degrees, forming a range from -30 degrees to +30 degrees. The histogram of the signature is taken at each step, forming a feature matrix. This feature matrix is verified against the feature matrices of the sample signatures obtained by this process using cross-correlation. (Illustrated in Figure 9k)
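The projection feature may be sketched as follows. This is a coordinate-rotation sketch under stated assumptions: foreground pixel coordinates are rotated directly and the horizontal projection is taken as a 16-bin histogram at each step; a production version would rotate the image with a proper resampling routine, and the specification does not fix the histogram bin count.

```python
import numpy as np

def projection_feature(img, step=5, max_deg=30, bins=16):
    """Feature matrix of horizontal-projection histograms taken as the
    signature is rotated from -30 to +30 degrees in 5-degree steps."""
    ys, xs = np.nonzero(img)
    ys = ys - ys.mean()                        # rotate about the centroid
    xs = xs - xs.mean()
    rows = []
    for deg in range(-max_deg, max_deg + step, step):
        t = np.radians(deg)
        xr = xs * np.cos(t) - ys * np.sin(t)   # rotated x-coordinates
        hist, _ = np.histogram(xr, bins=bins)
        rows.append(hist)
    return np.array(rows)                      # shape (13, bins)
```

Each of the 13 rows is the projection histogram at one rotation step; the matrix as a whole is what gets cross-correlated against the reference matrices.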
Profiling: This feature matches the left, right, top and bottom profiles of the test signature with those of the sample signatures. A profile denotes a vector containing the index of the first pixel in each direction. If the correlation of the profiles of the test signature is greater than the intra-personal correlation of the corresponding profiles of the reference signatures, then the score is incremented. If this score exceeds a particular threshold, then the test signature is accepted as an authentic signature by this feature, otherwise not.
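The four profiles may be sketched as follows. This is a minimal sketch assuming a binary NumPy image; empty rows or columns are marked with -1, a convention the specification does not prescribe.

```python
import numpy as np

def profiles(img):
    """Left, right, top and bottom profiles of a binary image: for each
    row (or column), the index of the first foreground pixel seen from
    that side, counted from that side (-1 where the row/column is empty)."""
    def first_hit(a):                  # scan each row of `a` left to right
        hit = a.any(axis=1)
        idx = a.argmax(axis=1)
        return np.where(hit, idx, -1)
    return {"left": first_hit(img),
            "right": first_hit(img[:, ::-1]),
            "top": first_hit(img.T),
            "bottom": first_hit(img.T[:, ::-1])}
```

Each of the four vectors would then be correlated against the corresponding reference profile, as described above.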
Shape: The Shape Feature is a combination of 8 sub-features, which are:
- Aspect ratio: Width of image to Height of image
- Area 1 ratio: Number of black pixels in thinned image / Area of Bounding Box
- Area 2 ratio: Number of black pixels in waterfall image / Area of Bounding Box
- Area 3 ratio: Area of convex hull / Area of bounding box
- Average Stroke Width: Number of black pixels in thinned image / Number of black pixels in the original image
- Relative centroid along X-axis: X-coordinate of centroid / bounding box width
- Relative centroid along Y-axis: Y-coordinate of centroid / bounding box height
- Maximum Horizontal Histogram and its position
- Maximum Vertical Histogram and its position
- Gravity Distance ratio: Distance of centroid from bottom-left corner / length of diagonal of bounding box
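A few of the simpler shape sub-features may be sketched as follows. This is an illustrative sketch over a binary NumPy image; the thinned-image, waterfall-image and convex-hull ratios are omitted because they depend on preprocessing steps not defined here.

```python
import numpy as np

def shape_subfeatures(img):
    """A sketch of a few shape sub-features: aspect ratio of the
    bounding box, black-pixel density inside it, and the relative
    centroid along both axes."""
    ys, xs = np.nonzero(img)
    y0, y1, x0, x1 = ys.min(), ys.max(), xs.min(), xs.max()
    h, w = y1 - y0 + 1, x1 - x0 + 1            # bounding-box size
    return {
        "aspect_ratio": w / h,                  # width to height
        "area_ratio": ys.size / (h * w),        # black pixels / bounding box
        "rel_centroid_x": (xs.mean() - x0) / w, # centroid x / box width
        "rel_centroid_y": (ys.mean() - y0) / h, # centroid y / box height
    }
```

These ratios are scale-invariant by construction, which is why ratios rather than raw pixel counts are the quantities stored and compared.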
Critical Points and Graph Matching Hybrid: This feature uses both critical points and graph matching. In the first step the critical points are calculated in the test as well as the sample signature. Then the critical points are mapped from the specimen to the sample signature, and the immediate region surrounding each critical point (the critical region) is matched using graph matching. This process is repeated for all the critical points, and the total cost is verified against a pre-calculated threshold. (Illustrated in Figure 9l)
According to an embodiment of the present disclosure, values of the above-mentioned features are normalized such that the values fall within the range -1 to +1. Each artificial neural network is trained with these normalized values of the features. These normalized values are not direct values of the features; instead, they are algebraic relational data derived from the normalized values. These values are used as input for each artificial neural network.
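The normalization to the range -1 to +1 may be sketched as follows. This is a common min-max scheme assuming a known feature range; the specification does not fix the exact formula, so this is an assumption.

```python
import numpy as np

def normalize(values, lo, hi):
    """Linearly map feature values from a known range [lo, hi] onto
    [-1, +1] before they are fed to a neural network (a common min-max
    scheme; illustrative only)."""
    values = np.asarray(values, dtype=float)
    return 2 * (values - lo) / (hi - lo) - 1
```

Keeping all inputs in a bounded symmetric range is standard practice for neural network training, as it keeps the activations of the input layer well conditioned.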
Embodiments of the method for automatic extraction and verification of signatures according to the present disclosure are described in Figure 2, 3, 4, 4a, 5, 6, 7 and 8. The method is illustrated as a collection of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. The order in which the process is described is not intended to be construed as a limitation, and any number of the described blocks can be combined in different orders to implement the process, or an alternate process.
Figure 2 illustrates a flow diagrammatic representation of a method for automatic extraction and verification of signatures in accordance with an embodiment of the present disclosure. A scanned image of a cheque sent for signature verification is captured 201 whereafter the signature image is extracted from the scanned image 202. The extracted user signature image is then processed to extract its features 203. At least one neural network is then dedicated to each stored actual reference data and fake reference data 204 so as to verify each feature of the extracted user signature against stored actual reference data and fake reference data 205. The output of the neural networks is then integrated to provide an output through an integrated neural network interface 206.
Figure 3 illustrates a flow diagrammatic representation of extraction of Most Probable Region (MPR) in accordance with an embodiment of the present disclosure which also elaborates further on extracting signature image from the scanned image and processing extracted user signature image to identify features as referred to in the previous embodiment. According to an embodiment of the present disclosure, extracting signature image from the scanned image 301 comprises extracting Most Probable Region (MPR) from
the scanned image of the cheque 302 and regenerating the signature image by extracting a Black and White and/or Gray Scale signature image from the extracted Most Probable Region (MPR) 303. According to an embodiment of the present disclosure, processing the extracted user signature image to identify its features comprises removing noise components from the extracted user signature image 304, detecting components in the MPR after removal of the noise components 305, and filtering the detected components to segregate those that are likely to be signature components 306.
Figure 4 illustrates a flow diagrammatic representation of a method for automatic extraction and verification of signatures based on the number of signatories on a particular cheque in accordance with an embodiment of the present disclosure. The current embodiment is to determine the number of signatories on the cheque and accordingly verify the signatures. For example, if in a joint account, cheques above a particular denomination are to be cleared only on the signatures of all holders of the account, then such verification can be done. The current embodiment is also implemented for single signatory accounts. According to the embodiment, the numbers of components in the reference signature image (Cr) and the sample signature image (Cs) are detected and calculated 401, where the sample signature image is the extracted user signature image. The numbers of components, Cr and Cs, are then compared to each other 402. If it is a single signatory account 403, it is checked whether the number of components of the reference signature image (Cr) is greater than or equal to the number of components of the sample signature image (Cs) 405. If yes, then the extracted user signature image is sent to the neural networks 406; else the first component of the reference signature is compared to all components of the sample signature 407. This holds for multiple signatories 404 as well, where the first component of the reference signature is compared to all components of the sample signature 407 and then, following an intermediary procedure A 408, it is verified whether the maximum value of overlapping pixels is greater than the threshold value of the overlapping pixels 409. If no, then the signature is sent for manual verification 410; else compare all components of the
reference signature to the sample signature with relative information 411 and proceed to intermediary procedure 412. It is then verified whether the combined maxima is greater than the threshold value 413. If yes, the signature image is extracted 414 and sent to the neural networks 406, else the signature is sent for manual verification.
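The routing decisions of the Figure 4 flow can be sketched as follows; a minimal illustration in Python (the function names, return labels, and threshold semantics are assumptions of this sketch, not terms from the specification):

```python
def route_by_component_count(cr, cs, single_signatory):
    """Decide the next step from the component counts of the reference (cr)
    and sample (cs) signature images.  A single-signatory account whose
    reference has at least as many components as the sample goes straight
    to the neural networks; every other case falls through to pairwise
    component matching (steps 407 onward)."""
    if single_signatory and cr >= cs:
        return "neural_networks"
    return "component_matching"

def decide_after_overlap(max_overlap, threshold):
    """After component matching: an overlap above the threshold proceeds;
    otherwise the signature is routed to manual verification (step 410)."""
    return "proceed" if max_overlap > threshold else "manual_verification"
```

For instance, a joint-account cheque always takes the component-matching path, whereas a single-signatory cheque with Cr >= Cs is sent directly to the dedicated neural networks.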
Figure 4a illustrates a flow diagrammatic representation of a continuation of a method for automatic extraction and verification of signatures based on the number of signatories on a particular cheque in accordance with an embodiment of the present disclosure, i.e. the intermediary procedure A 408, 412. Similar components are resized to the same size 408a, 412a and the margin of the sample components is increased 408b, 412b. The reference components are traversed over the sample components 408c, 412c and the number of overlapping pixels in the traversal is calculated 408d, 412d. A person skilled in the art would understand that the procedure described in Figure 4a is to be read in tandem with the description of Figure 4 and not otherwise.
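The margin increase and traversal of the intermediary procedure can be sketched as follows; an illustrative Python fragment (the binary list-of-lists representation and the exhaustive offset search are assumptions of this sketch, not details given in the specification):

```python
def pad_margin(image, margin):
    """Increase the margin of a binary component by `margin` background
    pixels on each side (step 408b/412b)."""
    w = len(image[0]) + 2 * margin
    blank = [0] * w
    return ([list(blank) for _ in range(margin)]
            + [[0] * margin + row + [0] * margin for row in image]
            + [list(blank) for _ in range(margin)])

def max_overlapping_pixels(reference, sample):
    """Traverse the reference component over every offset inside the sample
    component and return the maximum count of overlapping foreground pixels
    (steps 408c/412c and 408d/412d)."""
    rh, rw = len(reference), len(reference[0])
    sh, sw = len(sample), len(sample[0])
    best = 0
    for oy in range(sh - rh + 1):
        for ox in range(sw - rw + 1):
            overlap = sum(
                reference[y][x] and sample[oy + y][ox + x]
                for y in range(rh) for x in range(rw)
            )
            best = max(best, overlap)
    return best
```

The resulting maximum overlap is then compared against the pixel-overlap threshold of step 409.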
Figure 5 illustrates a flow diagrammatic representation of extraction and regeneration of a signature image in accordance with an embodiment of the present disclosure. According to the present embodiment, the scanned cheque image is referred to in Gray Scale as well as Black and White 501. According to an embodiment of the present disclosure, Horizontal Smearing is performed on the Black and White image of the signature image 502 and text lines with width X and Y are detected 503. If lines are detected, these are removed 504 and components of the scanned cheque image are analyzed 505. Accordingly, the Black and White signature image is then extracted 506. The Gray Scale image is then mapped per pixel from the extracted Black and White signature image 507 and hence, the signature image in Gray Scale is obtained 508.
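Horizontal smearing of the kind used in step 502 is commonly realized as run-length smoothing: short background gaps between foreground pixels in a row are filled so that long, thin text lines merge into solid bands that can be detected and removed. A minimal single-row sketch (the threshold value and function name are assumptions, not from the specification):

```python
def horizontal_smear(row, threshold):
    """Run-length smearing of one binary row: background runs shorter than
    `threshold` lying between two foreground pixels are filled with
    foreground, merging nearby strokes into continuous bands."""
    out = list(row)
    last_fg = None
    for i, v in enumerate(row):
        if v:
            # Fill the gap only if it is bounded by foreground on both sides
            # and shorter than the smearing threshold.
            if last_fg is not None and 0 < i - last_fg - 1 < threshold:
                for j in range(last_fg + 1, i):
                    out[j] = 1
            last_fg = i
    return out
```

Applying this row by row turns printed text lines into wide runs whose width can then be tested against the X and Y limits of step 503.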
Figure 6 illustrates a flow diagrammatic representation of extraction of Black and White signature image in accordance with an embodiment of the present disclosure. As performed by the signature extractor in previously described
embodiments, the Most Probable Region (MPR) is extracted from the scanned cheque image 601. The signature image is then extracted from the MPR 602 and Horizontal Smearing is performed on the extracted user signature image 603. The width and height of the extracted image are calculated 604 and it is checked whether the calculated width and height are greater than the threshold width and height 605. If so, the text line is removed 606 and the extracted Black and White signature image is obtained 607, else the extracted Black and White signature image is obtained directly.
Figure 7 illustrates a flow diagrammatic representation of extraction of a Gray Scale signature image in accordance with an embodiment of the present disclosure. According to the embodiment, as done by the signature extractor described in previous embodiments, the Most Probable Region (MPR) 701 is extracted and the signature region in Gray Scale is extracted 702. Horizontal Smearing is then performed on the Gray Scale image with the extracted Black and White image 703. If a pixel is present in the Black and White signature image, the corresponding pixel is retained in the Gray Scale image 704, and the extracted Gray Scale image is obtained 705.
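The per-pixel mapping of steps 704 and 705 (and step 507 of Figure 5) can be sketched as a simple mask operation; an illustrative Python fragment (the list-of-lists representation and the background value of 255 are assumptions of this sketch):

```python
def map_gray_from_bw(gray, bw, background=255):
    """Per-pixel mapping: keep the Gray Scale value only where the extracted
    Black and White signature image has a foreground pixel; everywhere else
    write the background value."""
    return [
        [g if fg else background for g, fg in zip(gray_row, bw_row)]
        for gray_row, bw_row in zip(gray, bw)
    ]
```

In effect the Black and White image acts as a stencil, so the regenerated Gray Scale signature keeps the original stroke intensities while the cheque background is blanked out.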
Figure 8 illustrates a flow diagrammatic representation of training an artificial neural network in accordance with an embodiment of the present disclosure. According to an embodiment, multiple reference signatures are obtained by extraction 801a and features from the multiple reference signatures are then extracted as well 802a. These extracted features are then used to determine the normalized ratio values 803a. Alternatively, according to another embodiment of the present disclosure, fake reference signatures are generated 801b and features from the fake reference signatures are extracted 802b. The normalized ratio values of the features of the fake reference signatures are determined with respect to the actual reference signatures 803c.
According to an embodiment of the present disclosure, the outputs of the two embodiments described above are used to train the artificial neural network 804 dedicated to each stored actual reference feature.
In another embodiment of the present disclosure, the values of the features are normalized such that each normalized value falls within the range of -1 to +1, and these normalized values are not direct values of the features but are instead algebraic relational data of the normalized values.
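The normalization and ratio computation described above can be sketched as follows. The linear min-max mapping and the simple sample-to-reference ratio shown here are assumptions of this sketch; the specification does not disclose its exact formulas:

```python
def normalize_feature(value, min_val, max_val):
    """Linearly map a raw feature value into the range [-1, +1], the range
    the disclosure states the network inputs are normalized into."""
    return 2.0 * (value - min_val) / (max_val - min_val) - 1.0

def feature_ratio(sample_value, reference_value):
    """One plausible 'formula based ratio' relating a sample feature to its
    reference counterpart (hypothetical: the actual formula is not given).
    A ratio near 1.0 suggests the sample feature matches the reference."""
    return sample_value / reference_value if reference_value else 0.0
```

Such relational values, rather than the raw feature values, would then form the training inputs of the network dedicated to each actual reference feature.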
Embodiments of the present disclosure are efficient as they generate more reference data out of the reference signatures in comparison to the prior art. Further, in view of at least one neural network dedicated to each actual reference feature, a system in accordance with embodiments of the present disclosure is convenient to use and maintain. Embodiments of the present disclosure perform signature extraction without the presence of a reference signature.
Owing to verification of the features of the extracted user signature against the actual reference features as well as fake reference features, embodiments of the present disclosure render higher confidence value in the output, which makes the system according to the embodiments robust and time efficient.
Embodiments of the present disclosure reduce the high level of inconvenience caused to a customer in case an authentic cheque is rejected by a bank's manual system due to a lack of reference signatures and skilled workers to perform the task. Also, due to the volume of work, the process becomes monotonous if done manually and is subject to errors due to fatigue.
As will be appreciated by one of skill in the art, the present invention may be embodied as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, a software embodiment or an embodiment combining software and hardware aspects all generally referred to herein as a "circuit" or "module." Furthermore,
the present invention may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
Furthermore, the present invention was described in part above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention.
It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and schematic diagrams of Figures 1-3 illustrate the architecture, functionality, and operations of some embodiments of methods, systems, and computer program products for automatic extraction and verification of signatures. In this regard, each block may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in other implementations, the function(s) noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending on the functionality involved.
In the drawings and specification, there have been disclosed exemplary embodiments of the invention. Although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation, the scope of the invention being defined by the following claims.
We claim : -
1. A system for automatic extraction and verification of signatures
comprising: -
a. a signature extractor configured to extract user signature image
from scanned cheque images;
b. a signature database coupled to the signature extractor
configured to store actual reference data and fake reference
data;
c. a signature processor coupled to the signature extractor
configured to process extracted user signature image and
extract features of the extracted user signature image;
d. at least one neural network dedicated to each actual reference
feature of stored actual reference signature images coupled to
signature processor and the signature database configured to
verify features of extracted user signature image against the
actual reference as well as fake reference data stored in the
signature database; and
e. an integrated neural network interface coupled to outputs of the
at least one neural network dedicated to each actual reference
feature.
2. A system as claimed in claim 1 wherein the signature extractor is configured to extract user signature image from scanned cheque images by calculating Most Probable Region (MPR).
3. A system as claimed in claim 1 wherein the signature database comprises: -
a. an actual reference signature storage unit configured to store actual reference signature images, their actua! reference features and their actual reference data; and
b. a fake reference signature storage unit configured to store fake reference signature images and fake reference data similar to actual reference signature images.
4. A system as claimed in claim 1 and 3 wherein actual reference data is formula based ratio of values of the actual reference features.
5. A system as claimed in claim 1 and 3 wherein fake reference data is formula based ratio of values of the fake reference features with actual reference data.
6. A system as claimed in claim 4 and 5 wherein actual and fake reference features are such as baseline, critical points, cross points, curve comparison, directional probability density function, gradient of image, gradient computation, graph matching, grid, height, holes, novel feature, projection feature, profiling, shape and critical points and graph matching hybrid.
7. A system as claimed in claim 1 wherein the signature database is dynamically updated with reference features and reference data of signatures of new users.
8. A system as claimed in claim 1 wherein the at least one neural network dedicated to each actual reference feature is trained for signatures of new users.
9. A system as claimed in claim 1 wherein said at least one neural network dedicated to each actual reference feature is common to all users.
10. A system as claimed in claim 1 wherein the integrated neural network interface provides an output if all features of the extracted user signature are authenticated, else the user signature is sent for manual verification.
11. A signature verification device comprising: -
a. a signature extractor configured to extract user signature image
from scanned cheque images;
b. a signature database coupled to the signature extractor
configured to store actual reference data and fake reference
data;
c. a signature processor coupled to the signature extractor
configured to process extracted user signature image and
extract features of the extracted user signature image;
d. at least one neural network dedicated to each actual reference
feature of stored actual reference signature images coupled to
signature processor and the signature database configured to
verify features of extracted user signature image against the
actual reference as well as fake reference data stored in the
signature database; and
e. an integrated neural network interface coupled to outputs of the
at least one neural network dedicated to each actual reference
feature.
12. A signature verification device as claimed in claim 11 wherein the signature extractor is configured to extract user signature image from scanned cheque images by calculating Most Probable Region (MPR).
13. A signature verification device as claimed in claim 11 wherein the signature database comprises: -
a. an actual reference signature storage unit configured to store actual reference signature images, their actual reference features and their actual reference data; and
b. a fake reference signature storage unit configured to store fake reference signature images and fake reference data similar to actual reference signature images.
14. A signature verification device as claimed in claim 11 and 13 wherein actual reference data is formula based ratio of values of the actual reference features.
15. A signature verification device as claimed in claim 11 and 13 wherein fake reference data is formula based ratio of values of the fake reference features with actual reference data.
16. A signature verification device as claimed in claim 14 and 15 wherein actual and fake reference features are such as baseline, critical points, cross points, curve comparison, directional probability density function, gradient of image, gradient computation, graph matching, grid, height, holes, novel feature, projection feature, profiling, shape and critical points and graph matching hybrid.
17. A signature verification device as claimed in claim 11 wherein the signature database is dynamically updated with reference features and reference data of signatures of new users.
18. A signature verification device as claimed in claim 11 wherein the at least one neural network dedicated to each actual reference feature is trained for signatures of new users.
19. A signature verification device as claimed in claim 11 wherein said at least one neural network dedicated to each actual reference feature is common to all users.
20. A signature verification device as claimed in claim 11 wherein the integrated neural network interface provides an output if all features of the extracted user signature are authenticated, else the user signature is sent for manual verification.
21. A method for automatic extraction and verification of signatures comprising: -
a. capturing a scanned image of a cheque;
b. extracting user signature image from the scanned image;
c. processing extracted user signature image to extract features of
the extracted user signature image;
d. dedicating at least one neural network to each actual reference
feature of stored actual reference signature images;
e. verifying each feature of the extracted user signature against
actual reference data and fake reference data of stored actual
reference signature images and fake reference signature
images respectively; and
f. integrating output of each at least one neural network to provide
an output through an integrated neural network interface.
22. A method as claimed in claim 21 wherein extracting signature image from the scanned image comprises: -
a. extracting Most Probable Region (MPR) from the scanned
image of the cheque; and
b. regenerating user signature image by extracting Black and
White and/or Gray scale signature image from the extracted
Most Probable Region (MPR).
23. A method as claimed in claim 21 wherein processing extracted user signature image to identify features of the extracted user signature image comprises: -
a. removing noise components from the extracted user signature image;
b. detecting components in the MPR after removal of noise
components; and
c. filtering to segregate such components from the detected
components that are probable of being signature components.
24. A method as claimed in claim 21 wherein said method comprises dynamically updating and storing reference features and reference data of signatures of new users.
25. A method as claimed in claim 21 wherein actual reference data is formula based ratio of values of actual reference features of actual reference signatures.
26. A method as claimed in claim 21 wherein fake reference data is formula based ratio of values of fake reference features of fake reference signatures with actual reference data.
27. A method as claimed in claim 21, 25 and 26 wherein actual and fake reference features are such as baseline, critical points, cross points, curve comparison, directional probability density function, gradient of image, gradient computation, graph matching, grid, height, holes, novel feature, projection feature, profiling, shape and critical points and graph matching hybrid.
28. A method as claimed in claim 21 wherein dedicating at least one neural network to each actual reference feature of stored actual reference signature images comprises training each at least one neural network if signatures are of a new user.
29. A method as claimed in claim 21 wherein said at least one neural network dedicated to each actual reference feature is common to all users.
30. A method as claimed in claim 21 wherein the output through the integrated neural network interface is provided if each feature of the extracted user signature verifies against stored actual reference data and fake reference data, else the user signature is sent for manual verification.
Dated this 3rd day of February 2010
Of Anand and Anand Advocates
Agents for the Applicant