
System And Method For Extracting Flowchart Information From Digital Images

Abstract: A system and method for extracting flowchart information from digital images is provided. The method includes converting the digital flowchart image into a grayscale image and then binarizing the image. The method further includes extracting and masking text data from the binarized image. Further, flow lines connecting geometric components within the flowchart image are extracted and masked. The geometric components are classified into one or more categories and the flow line relationships between the geometric components are extracted. Finally, the extracted text data, flow line relationship information and geometric component information is stored in a database.


Patent Information

Application #:
Filing Date: 17 February 2011
Publication Number: 42/2012
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: dev.robinson@amarchand.com
Parent Application:

Applicants

INFOSYS TECHNOLOGIES LIMITED
PLOT NO. 44, ELECTRONICS CITY, HOSUR ROAD, BANGALORE 560 100

Inventors

1. VASUDEVAN, BINTU GOPALAN
201 GAYATRI PEARL, BEHIND IIMB, SARVOBHOMA NAGAR, BILEKAHALLI BANNERGATTA ROAD, BANGALORE 560 076
2. DHANAPANICHKUL, SORAWISH
303/51, SOI SATHUPRADIT 19 SATHUPRADIT ROAD, CHONGNONSRI YANNAWA, BANGKOK 10 120
3. BALAKRISHNAN, RAJESH
325, PRESTIGE GREENWOODS NAGAVARAPALYA, C.V. RAMAN NAGAR P.O., BANGALORE 560 093

Specification

SYSTEM AND METHOD FOR EXTRACTING FLOWCHART INFORMATION FROM DIGITAL IMAGES

Field of invention

[0001] The present invention relates to the analysis and use of software artifacts. More particularly, the present invention provides for extracting flowchart information from digital images.

Background of the invention

[0002] Software engineering is the implementation of processes for the development, maintenance and operation of software used in any application. An important aspect of software engineering is reusing existing software for efficient operation of a software system. Software reuse also helps in accelerating the software development lifecycle.

[0003] One of the features of software reuse currently implemented in the industry is the reuse of information available in the form of software artifacts. A software artifact is a portion of a software development process containing useful information. Generally, software artifacts contain useful knowledge related to the features of a software system. Examples of software artifacts include use-cases, flowcharts, wireframe diagrams, activity diagrams, UML diagrams and the like.

[0004] A flowchart is a schematic representation of a process or an algorithm that illustrates the sequence of operations to be performed to obtain the solution to a problem. Nowadays, business organizations widely use software systems for implementing business processes. A majority of the artifacts of a software system of a business organization may exist in the form of flowcharts. Flowcharts may be used to represent essential functions of an organizational process. Examples of the essential functions represented by a flowchart may include the movement of materials through machinery in a manufacturing process, the flow of applicant information through a hiring process in a human resources department, etc.

[0005] In light of the above, there exists a need for extracting data from artifacts of a software system, such as flowcharts, and storing the data in a format such that the data can be efficiently reused.

Summary of the invention

[0006] A system and method for extracting flowchart information from digital images is provided. The digital flowchart image includes text data, geometric components and connecting flow lines. The method includes binarizing the digital flowchart image. Text data is then extracted from the binarized image using a rectangular region growing segmentation technique. The method then includes extracting and masking flow lines connecting geometric components within the digital flowchart image. After the extraction and masking of flow lines, geometric components are extracted and classified into one or more categories. Classifying the geometric components may include recognizing the components and arranging them into one or more shape categories. Flow line relationship information between the geometric components is also extracted. Thereafter, the extracted text data, flow line relationship information and geometric component information is stored in a database. In various embodiments of the present invention, the digital flowchart image may be a binary image, a color image, a grayscale image, a multispectral image or a thematic image.

[0007] In an embodiment of the present invention, prior to binarizing the digital flowchart image, the image is converted into a grayscale image.
[0008] In an embodiment of the present invention, one or more regions including text data are masked prior to extracting and masking flow lines connecting geometric components. Masking of the one or more regions includes converting pixels within the one or more bounded regions into the background color of the digital flowchart image.

[0009] In an embodiment of the present invention, extracting text data using the rectangular region growing segmentation technique includes marking rectangular boundaries around one or more regions bounded by clusters of connected pixels of text data. An iterative algorithm is then executed for extracting one or more segment blocks enclosing individual characters from the one or more regions. In an embodiment of the present invention, a heuristic algorithm is implemented for separating closely connected individual characters prior to executing the iterative algorithm. Characters are recognized in each of the one or more segmented blocks using a neural network based Optical Character Recognition algorithm. Thereafter, the characters are translated using a character encoding scheme.

[0010] In an embodiment of the present invention, recognition of geometric components is implemented using a back-propagation neural network technique. In another embodiment of the present invention, recognition of geometric components is implemented by comparing the geometric components with standard geometric shapes stored in a database. The comparison of geometric components is performed using a Dynamic Time Warping algorithm.

[0011] In an embodiment of the present invention, the standard geometric shapes are stored by representing the shapes using a boundary-based shape representation. Angular directions of pixel points along the boundary of a geometric shape are used for describing the shape, and the slope of a line traced along the boundary within a threshold limit is used to define and form the shape vectors.

[0012] In an embodiment of the present invention, the extracted text data is stored along with its location information. The location information indicates the location of the bounded geometric components within which the text data is enclosed.

[0013] In an embodiment of the present invention, the extracted geometric component information is stored along with location, height and width information.

[0014] In an embodiment of the present invention, the extracted text data, flow line relationship information and geometric component information is stored in XML format. In another embodiment of the present invention, the extracted text data, flow line relationship information and geometric component information is stored in Graph Exchange Language format.

Brief description of the accompanying drawings

[0015] The present invention is described by way of embodiments illustrated in the accompanying drawings, wherein:

[0016] FIG. 1 illustrates a flow diagram depicting a sample flowchart image;

[0017] FIG. 2 illustrates a representation of the processed sample flowchart image depicting the flowchart components of the flow diagram of FIG. 1 after character and line masking;

[0018] FIG. 3 illustrates a flow diagram depicting the method steps for extracting flowchart information from digital images;

[0019] FIG. 4 illustrates a shape descriptor for representing and describing geometric shapes extracted from a digital image;

[0020] FIG. 5 illustrates four sample shape descriptors for standard flowchart components;

[0021] FIG. 6 illustrates a dynamic time warping path used in recognizing an ellipse geometric shape;

[0022] FIG. 7 illustrates an exemplary neural network used in learning and recognition of flowchart shapes;

[0023] FIG. 8 illustrates a graph of theoretically calculated values of Root Mean Square (RMS) error versus the number of iterations of training data input to a neural network for flowchart component recognition; and

[0024] FIG. 9 illustrates a sample XML representation of a section of a flowchart image.

Detailed description of the invention

[0025] A system, method and computer program product for extracting information from software artifacts is provided. The present invention is more specifically directed towards extracting flowchart information from digital images. An exemplary scenario in which the present invention may be implemented is a software system in which information about the processes and functions of the system is stored in flowchart image files. In order to enable an efficient reuse of this information, the data in flowchart images is to be extracted and stored in a format that is widely used.

[0026] In an embodiment of the present invention, the system, method and computer program product disclosed provide for extracting data from flowchart image files. Data extracted from flowchart images includes text data, data describing geometric flowchart components, and flow lines connecting the geometric components. Text data is data located in the flowchart image. Text data may be enclosed within geometric flowchart components representing steps of the flow of a process, or it may be located outside the flowchart components.

[0027] In various embodiments of the present invention, the system, method and computer program product disclosed provide a technique for extracting text data from a flowchart image. The method includes converting the flowchart image into a grayscale image. Further, the method includes binarizing the image and extracting character segment blocks from the image using region growing segmentation. Thereafter, individual characters are recognized using neural network based Optical Character Recognition (OCR).

[0028] In an embodiment of the present invention, the system, method and computer program product disclosed provide for extracting and classifying flowchart components from the flowchart image. Prior to extracting flowchart components, text data as well as flow lines connecting the flowchart components are masked. Then, flowchart components are extracted using the region growing segmentation technique and the components are recognized using a back-propagation neural network. The neural network utilized for recognizing the geometric components is a network trained in recognizing geometric shapes. In various embodiments of the present invention, a Dynamic Time Warping (DTW) approach is used to recognize flowchart component shapes.

[0029] In yet another embodiment of the present invention, the system, method and computer program product disclosed provide for storing the extracted text data, data describing geometric components and flow lines in an Extensible Markup Language (XML) format.

[0030] Hence, the present invention enables an efficient reuse of information stored in flowcharts. The present invention also enables a proficient manner of exporting and using data across various software systems due to the data being stored in XML format.
[0031] The disclosure is provided in order to enable a person having ordinary skill in the art to practice the invention. Exemplary embodiments herein are provided only for illustrative purposes and various modifications will be readily apparent to persons skilled in the art. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. The terminology and phraseology used herein is for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed herein. For purposes of clarity, details relating to technical material that is known in the technical fields related to the invention have been briefly described or omitted so as not to unnecessarily obscure the present invention.

[0032] The present invention will now be discussed in the context of embodiments as illustrated in the accompanying drawings.

[0033] FIG. 1 illustrates a flow diagram depicting a sample flowchart image 100. In various embodiments of the present invention, the sample flowchart image 100 is a digital image comprising flowchart components arranged in a sequence to describe a process flow. A flowchart is a schematic representation of a process or an algorithm that illustrates the sequence of operations to be performed to obtain the solution to a problem. Flowcharts, which describe what operations are required, and in what sequence, to solve a given problem, are commonly used in business and economic presentations to help the audience visualize the content better or to find flaws in the process. Flowcharts are usually drawn using standard geometric symbols such as rectangles, hexagons, circles, ellipses and lines. The standard geometric symbols represent the start or end of a process, computational or processing steps, input or output operations, decision making/branching, and flow lines. Examples of digital formats of flowchart images are JPEG, GIF, TIFF, PNG and BMP. The geometric symbols enclose concise text which provides information about the process being modeled. As shown in the figure, the sample flowchart image 100 comprises geometric symbols 102, 104, 106 and 108, which are flowchart components representing "start of process", "process operation", "decision making" and "value preparation for subsequent step" respectively.

[0034] FIG. 2 illustrates a representation depicting the flowchart components of the sample flowchart image 100 depicted in FIG. 1 after the completion of character and line masking operations. Masking includes bitwise operations performed on the pixels of a digital image in order to hide certain portions of the image so that operations can be performed on other portions of the image. In various embodiments of the present invention, character and line masking is used in the process of extracting text and geometrical components from the flowchart image 100. Preprocessing is first performed on the flowchart image 100 to extract text (characters) located within the flowchart image 100. In an embodiment of the present invention, preprocessing involves converting the flowchart image 100 to a grayscale image. The grayscale image is processed and a cluster of binary pixels is generated based on a global contrast threshold technique, in which a threshold value for the entire image is ascertained based on the intensity histogram. Thereafter, empirical experiments on the grayscale image are performed to determine a set of threshold values yielding a binary image in the form of a cluster of black and white pixels. After the binarization of the image, the character segments are extracted. In various embodiments of the present invention, the character segments are extracted using the region growing segmentation technique. Following the extraction of character segments, the character segments and the flow lines between the flowchart components are masked before extracting the flowchart components. Flow lines are either horizontal or vertical lines connecting the flowchart components. In an embodiment of the present invention, line masking is done by detecting the horizontal lines first and the vertical lines second. The line detection is performed by processing the line pixels with the heuristic that the line pixels have a certain line width and that the line pixels are oriented in either the horizontal or the vertical direction. In various embodiments of the present invention, the arrow heads of the flow lines are not masked while performing the line masking. FIG. 2 illustrates the resultant image after the completion of binarization and line masking. The resultant image shows distinct components with geometrical shapes corresponding to the flowchart components of FIG. 1.

[0035] FIG. 3 illustrates a flow diagram depicting the method steps for automatically extracting flowchart information from a flowchart image. In various embodiments of the present invention, the flowchart image is a digital image having geometric components (symbols), text data and flow lines connecting the geometric components. A digital image is a binary representation of a two-dimensional image. Typically, a two-dimensional image is represented using pixels in a digital image, wherein a pixel is the smallest piece of information in a digital image comprising one or more bits. The one or more bits represent the color and intensity of the digital image. Examples of digital images include, but are not limited to, binary, color, grayscale, multi-spectral, thematic and the like. In an embodiment of the present invention, image processing techniques are used to extract text data, data describing geometric components, and flow lines connecting the geometric components. At step 302, the flowchart image is converted into a grayscale image. A grayscale image is a representation of a color two-dimensional image in which each pixel of the grayscale image represents the color of the corresponding pixel in the color image by a value signifying the intensity of the color "gray". In various embodiments of the present invention, a color image is converted to a grayscale image where each pixel in the Red-Green-Blue (RGB) format in the color image is converted into a corresponding gray pixel using the formula: GS = (0.299 x R) + (0.587 x G) + (0.114 x B), where R, G and B represent the level or magnitude of the red, green and blue colors in an RGB pixel in the color image and GS represents the corresponding pixel in the grayscale image.
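To make steps 302 and 304 concrete, a minimal sketch is given below: it converts an RGB image to grayscale with the weights recited above and then binarizes it with a single global threshold. The fixed threshold of 128 is a placeholder assumption; as described earlier, the specification ascertains suitable thresholds empirically from the intensity histogram.

```python
import numpy as np

def to_grayscale(rgb):
    """RGB (H x W x 3) array to grayscale using GS = 0.299R + 0.587G + 0.114B."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return 0.299 * r + 0.587 * g + 0.114 * b

def binarize(gray, threshold=128):
    """Global thresholding: pixels above the threshold become white (255),
    all other pixels become black (0). The threshold value is a placeholder;
    in practice it would be derived from the intensity histogram."""
    return np.where(gray > threshold, 255, 0).astype(np.uint8)

# Example usage on a flowchart image already loaded as a numpy array:
# binary = binarize(to_grayscale(flowchart_rgb))
```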
[0036] At step 304, the grayscale image is binarized. Image binarization comprises converting the grayscale image into a black-and-white image. Binarization is a process of simplifying the grayscale image in order to process it for information extraction, such as the extraction of text and geometric component information. In various embodiments of the present invention, thresholding techniques are used to binarize the grayscale image. A thresholding technique comprises choosing a threshold value and classifying all pixels in the grayscale image with values above the threshold value as white and all pixels with values below the threshold value as black. The thresholding technique can be applied by choosing two different threshold values: one threshold value results in an image with dark text on a lighter background and the other threshold value results in an image with light text on a dark background. Variations of the thresholding technique include choosing an optimal threshold value for each area of the grayscale image and then classifying the pixels accordingly. The resultant image is a binary image in the form of a cluster of black and white pixels. At step 306, text is extracted from the binarized image. In an embodiment of the present invention, the resultant binary image is processed for text extraction by using a rectangular region growing segmentation technique. The rectangular region growing segmentation technique is a block segmentation technique in which a rectangular boundary around a cluster of connected pixels of text is marked for detecting characters. In this technique, a region is allowed to grow in the forward, backward, upward and downward directions for marking the rectangular boundary. An algorithm checks from left to right and from top to bottom for the left, top, right and bottom boundaries of the cluster of connected pixels. While going from left to right, the first black pixel is the left boundary and the last black pixel is the right boundary of the cluster of pixels. Similarly, for top to bottom, the first black pixel is the top boundary and the last black pixel is the bottom boundary of the connected pixels. In an embodiment of the present invention, the procedure implemented by the algorithm includes marking a rectangular boundary that is one pixel larger than the region bounded by the cluster of connected pixels. The implementation of the algorithm yields a block segmented region with respect to a certain number of pixels and characters, along with their position and size information. The algorithm is further applied iteratively on the block segmented region in order to extract the smallest possible segment block enclosing an individual character. The algorithm is iteratively implemented by imposing geometrical constraints in order to sort out individual character blocks. Examples of geometrical constraints include imposing a threshold limit on the width-to-height ratio of an individual character segment block. In certain examples, 4 to 5 iterations are sufficient to extract individual character segment blocks when the characters are well separated from adjacent characters. In other embodiments, characters in text may not be well separated, such as in digital images stored as compressed BMP/JPG/GIF files. The compression may cause merging of the closest adjacent characters. In these cases, if a block width-to-height ratio is greater than the average character segmented block ratio, a heuristic algorithm is used to separate the closely connected characters at the point of minimum pixel joining.
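As a rough illustration of the rectangular block segmentation just described, the following sketch finds the bounding box of each cluster of connected black pixels in the binary image and grows it by one pixel. The aspect-ratio filter and its value stand in for the geometrical constraints mentioned above and are assumptions; the iterative refinement and the heuristic splitting of merged characters are not shown.

```python
from collections import deque
import numpy as np

def character_blocks(binary, max_aspect=1.5):
    """Mark a rectangular boundary (grown by one pixel) around every cluster
    of connected black pixels in a binarized image (0 = black, 255 = white).
    'max_aspect' is a hypothetical width-to-height limit used to keep blocks
    that plausibly enclose a single character."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    blocks = []
    for y in range(h):
        for x in range(w):
            if binary[y, x] != 0 or seen[y, x]:
                continue
            # Breadth-first search over the 8-connected cluster of black pixels.
            queue = deque([(y, x)])
            seen[y, x] = True
            top, bottom, left, right = y, y, x, x
            while queue:
                cy, cx = queue.popleft()
                top, bottom = min(top, cy), max(bottom, cy)
                left, right = min(left, cx), max(right, cx)
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny, nx] and binary[ny, nx] == 0:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
            # Boundary one pixel larger than the connected cluster, clipped to the image.
            box = (max(top - 1, 0), max(left - 1, 0), min(bottom + 1, h - 1), min(right + 1, w - 1))
            width, height = box[3] - box[1] + 1, box[2] - box[0] + 1
            if width / height <= max_aspect:
                blocks.append(box)
    return blocks
```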
[0037] The block segmented region is then processed through a character recognition phase for recognizing the character images and translating them into a standard encoding scheme such as ASCII or Unicode. The characters in the block segmented region are recognized by a neural network based Optical Character Recognition (OCR) algorithm. A neural network is an adaptive software system of interconnected mathematical processing elements that provides an optimal solution to a problem based on a learning phase and a solution phase. In an embodiment of the present invention, the neural network is a back-propagation neural network. A back-propagation neural network is described in conjunction with the description of FIG. 7. In an embodiment of the present invention, for training the neural network, a database of character images is generated from standard Windows system fonts. Font types such as Times New Roman, Arial and Courier are selected, including font styles such as bold, italic and normal. Each character of each font is then converted into a character image matrix of 26 x 26 pixels. The training database contains a set of input vectors of character image matrices and a set of output target vectors corresponding to character ASCII codes. The training database is applied to the neural network with various network configurations, such as by varying the number of layers, the number of neurons in the hidden layer, the activation function, the learning rate and the error limits. Subsequent to training, the neural network is used to recognize the characters in the individual character segment blocks and to translate the characters into corresponding ASCII codes.

[0038] At step 308, the text regions are masked. As described earlier, while using the block segmentation technique for text extraction, rectangular boundaries around groups of connected pixels of text are marked for text identification. Since the text boundaries are already known, all pixels within the boundary areas are converted into the background color of the image in order to mask the text regions. In an embodiment of the present invention, wherein the background color of an image is white as a result of image binarization, all pixels within the text boundary areas are converted into white. The resultant image includes the geometric components and connecting flow lines, which are illustrated by black-colored pixels. Thereafter, at step 310, the flow lines connecting the geometric components are extracted and masked. In an embodiment of the present invention, flow line masking is done by processing the pixels corresponding to the flow lines with the simple heuristic that the flow line pixels have a certain line width and are oriented in either the horizontal or the vertical direction. During the masking of flow lines, the lines are labeled and their extreme points information is stored in a database. The resultant image after binarization and flow line masking shows distinct geometrical components with connected arrow head components. At step 312, the geometric components are extracted and classified. In an embodiment of the present invention, the geometric components are extracted by identifying clusters of connected pixels representing geometric shapes. The geometric components extracted from the digital image are then recognized and classified into categories. The classification of geometric components includes arranging the components into particular categories of shapes such as oval, square, hexagon, diamond and the like. In an embodiment of the present invention, for the purpose of recognizing geometric component shapes, a back-propagation neural network technique is used. In another embodiment of the present invention, for the purpose of recognizing geometric component shapes, a DTW algorithm is used, wherein the extracted component is compared with standard geometric shapes stored in a database in order to determine a best match for recognition. As will be described further with reference to FIG. 4, the standard geometric shapes stored in the database which are used for recognition are represented using various representation techniques.

[0039] At step 314, flow line relationships between the geometric components are extracted. Extraction of flow line relationships is performed by tracing the flow lines, based on the simple heuristic of detecting all flow lines and arrow heads connected to the geometric components. Additional information used for identification includes the pixels representing arrow heads connected to the components. In various embodiments of the present invention, a simple region growing segmentation technique is used to mark and label segment blocks, with bounding box information, that represent geometrical shapes. The arrow head components are separated while segmenting the geometric components. In various embodiments of the present invention, the criterion for separating the arrow head components is to separate two components in the region with the minimal number of pixel links between the two regions. The filtration of arrow head components is done by comparing the geometric components and the arrow head components based on a threshold. In an embodiment of the present invention, the tracing is done by starting with the bounding box of the top first geometric component, expanding the box boundary by one pixel area and tracking the co-ordinates of any lines or arrow heads intersecting with the top first geometric component. The co-ordinates of the connected line are then used to trace the line to find an arrow head component connected to the other geometric component. The tracing of flow lines is performed for all the geometric components to trace all flow line relationships between the components.

[0040] Finally, at step 316, the extracted text data, data describing geometric component shapes and flow line relationships are stored in a database. In various embodiments of the present invention, the extracted text data, the geometric components and the flow lines are stored in an Extensible Markup Language (XML) format. XML is a markup language that provides a software and hardware independent manner of storing data so that the data can be shared across disparate software systems. In an embodiment of the present invention, the text data is stored along with its location information. The location information indicates the location of the bounded geometric component within which the characters are enclosed. The geometric components are stored with location, width and height information. In other embodiments of the present invention, the extracted text data, the geometric components and the flow lines are stored in a Graph Exchange Language (GXL) format. The GXL format is an XML meta-language which is a standard for describing graphs across standard graph-based tools.
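Returning to step 314, the sketch below shows one way the flow line relationships could be assembled once the component bounding boxes and the masked flow lines (with their stored extreme points) are available. The data layout and the one-pixel margin are assumptions for illustration; arrow-head handling and the pixel-level tracing described in the specification are omitted.

```python
def flow_line_relationships(components, lines):
    """Sketch of flow line relationship extraction under simplifying assumptions:
    'components' maps a component id to its bounding box (top, left, bottom, right);
    'lines' is a list of flow lines stored as their two extreme points
    ((y1, x1), (y2, x2)), as recorded during line masking. A relationship is
    recorded when one end of a line touches one component's box (expanded by a
    one-pixel margin) and the other end touches a different component's box."""
    def touches(box, point, margin=1):
        top, left, bottom, right = box
        y, x = point
        return top - margin <= y <= bottom + margin and left - margin <= x <= right + margin

    relationships = []
    for p1, p2 in lines:
        src = [cid for cid, box in components.items() if touches(box, p1)]
        dst = [cid for cid, box in components.items() if touches(box, p2)]
        for a in src:
            for b in dst:
                if a != b:
                    relationships.append((a, b))
    return relationships
```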
[0041] FIG. 4 illustrates a shape descriptor 400 for representing and describing geometric shapes extracted from a digital image. In various embodiments of the present invention, geometric shapes extracted from a digital image are classified by describing them and storing the description in a database such that the shapes can be recognized and restored later on. Pursuant to using the segmentation technique to track and extract geometric components, a boundary-based shape representation technique is used for representing geometric shapes. In boundary-based shape representation, the boundary outline of a geometric component is extracted by tracing the contour edges of the component. Following the tracing of the contour edges of the geometric component, the shape of the geometric component is represented by a sequence of values, each value corresponding to a segment direction. The segment direction corresponds to the direction of a straight line between two sample points on the contour edge of the geometric component. In an embodiment of the present invention, as shown in FIG. 4, the geometric component is traced by starting with an initial point P0. Thereafter, a step L between any two consecutive sample points along the contour of the geometric component is chosen for creating a geometric shape descriptor. Then, the boundary of the geometric component is traced by using a straight line fit from the initial point P0 to a sample point Pi along the contour as long as the slope of the line remains within a particular threshold limit. Once the particular threshold limit is reached, the sample point Pi is pivoted and the angular direction (θi) of Pi with respect to a horizontal line is calculated. In the figure, the sample point Pi which is pivoted is the point P4. The geometric shape descriptor is a set of vectors comprising values of angular directions of sample points along the contour with respect to a horizontal line. Thus the angular direction (θ2) of P4 is stored in the geometric shape vector. The geometric shape descriptor of the geometric component is thus constructed as follows: the slope of the line from P0 to a sample point Pi (P4 in the figure) along the contour of the geometric component is calculated while the contour is traced clockwise. The sample point Pi is pivoted as the consecutive sample point when the slope of the line between the sample point and the horizontal line is within the particular threshold limit. Then, the angular direction of all pixel points between P0 and Pi is assigned the same direction. Hence, for all sample points between the sample point P0 and the consecutive sample point Pi, the angular direction assigned to all the sample points is the angular direction (θi). Thus, consecutive sample points along the contour of the geometric component are traced in a clockwise direction, and angular directions for the sample points are calculated, assigned and stored in the geometric shape descriptor. The tracing is performed along the contour of the geometric component until the initial point P0 is reached.

[0042] In various embodiments of the present invention, the tracing is performed by a software algorithm. The length of the vectors of the geometric shape descriptor is standardized by selecting an average component segment size. A new component segment image is re-scaled to a standard segment image size before it is processed for creating a shape descriptor. The lengths of the vectors in a geometric shape descriptor are made equal by re-sampling the vectors, when required.
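A minimal sketch of such a boundary-based descriptor follows, assuming the contour has already been extracted as an ordered, clockwise list of boundary points. The fixed sampling step and output vector length are arbitrary assumptions, and the slope-threshold pivoting of FIG. 4 is simplified to the angular direction between consecutive samples.

```python
import math

def shape_descriptor(contour, step=5, vector_length=64):
    """'contour' is an ordered (clockwise) list of (y, x) boundary points.
    Every 'step'-th point is taken as a sample, the angular direction of the
    segment joining consecutive samples with respect to the horizontal is
    recorded, and the sequence is re-sampled to a fixed length so that
    descriptors of different components are comparable."""
    samples = contour[::step] + [contour[0]]          # close the loop
    angles = []
    for (y0, x0), (y1, x1) in zip(samples, samples[1:]):
        angles.append(math.degrees(math.atan2(y1 - y0, x1 - x0)))
    if not angles:
        return []
    # Re-sample to a fixed-length vector.
    return [angles[int(i * len(angles) / vector_length)] for i in range(vector_length)]
```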
[0043] FIG. 5 illustrates four sample shape descriptors for standard flowchart components. In an embodiment of the present invention, the shape descriptors 502, 504, 506 and 508 represent the descriptions of the geometric components hexagon, square, ellipse and diamond respectively. The shape descriptors 502, 504, 506 and 508 illustrate the direction of the pixels on the contours of the geometric shapes on the Y-axis and the number of sample pixel points, in serial order, on the X-axis. The shape descriptors represented and described for the various geometric shapes are stored in a database. During the recognition phase, a new shape descriptor is first re-sized to a standard size component and re-sampled to a fixed length vector size before classification. In an embodiment of the present invention, a standard bounded size for a flowchart component is ascertained to be 160 x 80 pixels. A component block greater than this matrix is scaled down and centered to 160 x 80 pixels, and a component smaller than this matrix is scaled up to this matrix size, while maintaining the aspect ratio of the flowchart component segments. Pursuant to representing the shapes of standard flowchart components using geometric shape descriptors, the flowchart components are classified. The classification of flowchart components is described in conjunction with the description of FIGS. 5 and 6.

[0044] FIG. 6 illustrates a dynamic time warping path used in recognizing an ellipse geometric shape. As described in conjunction with the description of FIG. 5, standard flowchart component shapes are classified and stored in a database. Subsequently, a flowchart component extracted and described by using the shape descriptor recited in the description of FIG. 5 is compared with the shapes stored in the database to determine a best match for component recognition.

[0045] In various embodiments of the present invention, a Dynamic Time Warping (DTW) approach is used to detect an optimal alignment between two flowchart components. DTW is an algorithm that detects the similarity between two sequences that are separated either in speed or in time. A classic DTW algorithm is explained as follows. Consider two time series Q and C, of length n and m respectively, where Q = q1, q2, ..., qi, ..., qn and C = c1, c2, ..., cj, ..., cm. In order to align the two sequences using DTW, an n x m matrix is constructed where the (i-th, j-th) element of the matrix contains the distance d(qi, cj) between the two points qi and cj. In an example, the distance between the two points qi and cj is the squared Euclidean distance, d(qi, cj) = (qi - cj)^2. Each matrix element (i, j) corresponds to the alignment between the points qi and cj. A warping path W is defined as a contiguous set of matrix elements that defines a mapping between Q and C. FIG. 6 illustrates an example W of a warping path for the ellipse shape vector. The k-th element of W is defined as wk = (i, j)k, so that W = w1, w2, ..., wk, ..., wK. The warping path is subject to several constraints such as boundary conditions, continuity and monotonicity. In various embodiments of the present invention, the constraints can be:
• Boundary conditions: w1 = (1, 1) and wK = (m, n). This condition requires the warping path to start and finish at diagonally opposite corner cells of the matrix.
• Continuity: Given wk = (a, b), then wk-1 = (a', b') where a - a' <= 1 and b - b' <= 1. This constraint restricts the allowable steps in the warping path to adjacent cells. In an example, adjacent cells include diagonally adjacent cells.
• Monotonicity: Given wk = (a, b), then wk-1 = (a', b') where a - a' >= 0 and b - b' >= 0. This constraint forces the points in the warping path W to be monotonically spaced in time.
There are exponentially many warping paths that satisfy the constraints. In an embodiment of the present invention, the warping path that minimizes the warping cost is used, where the warping cost is defined as DTW(Q, C) = min( sqrt( w1 + w2 + ... + wK ) ), the minimum being taken over all admissible warping paths and each wk denoting the distance value of the k-th element of the path. The length K of the warping path is bounded such that max(m, n) <= K < m + n - 1.
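As an illustration of the DTW comparison described above, a minimal sketch follows: it fills the accumulated cost matrix under the boundary, continuity and monotonicity constraints and returns the minimal warping cost between two shape descriptor vectors. Recognition would then pick the stored standard shape with the smallest DTW cost; that selection step and any normalization are assumptions not spelled out here.

```python
import math

def dtw_distance(q, c):
    """Classic DTW between two shape vectors q (length n) and c (length m):
    fill an n x m cost matrix with squared point distances, accumulate along
    paths that respect the boundary, continuity and monotonicity constraints,
    and return the minimal warping cost."""
    n, m = len(q), len(c)
    INF = float("inf")
    acc = [[INF] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            d = (q[i] - c[j]) ** 2                      # d(q_i, c_j)
            if i == 0 and j == 0:
                best = 0.0
            else:
                best = min(acc[i - 1][j] if i > 0 else INF,                 # step in q
                           acc[i][j - 1] if j > 0 else INF,                 # step in c
                           acc[i - 1][j - 1] if i > 0 and j > 0 else INF)   # diagonal step
            acc[i][j] = d + best
    return math.sqrt(acc[n - 1][m - 1])

# Example usage: compare a new descriptor against stored standard shapes and
# keep the category with the smallest cost.
# best_shape = min(standard_shapes, key=lambda name: dtw_distance(new_vec, standard_shapes[name]))
```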

Documents

Application Documents

# Name Date
1 0452-CHE-2011 FORM-1 17-02-2011.pdf 2011-02-17
2 0452-CHE-2011 FORM-2 17-02-2011.pdf 2011-02-17
3 0452-CHE-2011 FORM-3 17-02-2011.pdf 2011-02-17
4 0452-CHE-2011 ABSTRACT 17-02-2011.pdf 2011-02-17
5 0452-CHE-2011 CLAIMS 17-02-2011.pdf 2011-02-17
6 0452-CHE-2011 DESCRIPTION(COMPLETE) 17-02-2011.pdf 2011-02-17
7 0452-CHE-2011 DRAWINGS 17-02-2011.pdf 2011-02-17
8 0452-CHE-2011 CORRESPONDENCE OTHERS 17-02-2011.pdf 2011-02-17
9 452-CHE-2011 FORM-3 04-03-2013.pdf 2013-03-04
10 452-CHE-2011 FORM-18 01-04-2014.pdf 2014-04-01
11 452-CHE-2011 CORRESPONDENCE OTHERS 01-04-2014.pdf 2014-04-01
12 452-CHE-2011-FER.pdf 2019-10-25

Search Strategy

1 SearchStrategyMatrix_24-10-2019.pdf