Abstract: A method and a system for comparing images is disclosed. In one embodiment, the method may include receiving a grid size to determine a number of a plurality of grids, and dividing each of a first image and a second image into the plurality of grids based on a size of each of the first and the second image. The method may further include comparing each grid of the first image with the plurality of grids of the second image to detect a mismatch between each grid of the first image and the plurality of grids of the second image. The method may further include classifying content of each grid of the first image as one of a content of addition, a content of deletion, or a content of shift based upon the detected mismatched content.
Description: Technical Field
[001] This disclosure relates generally to comparing images, and more particularly to a system and method of comparing images for detecting and classifying mismatched content.
Background
[002] A product sketch, also called an object drawing or a mechanical drawing, is used in most product development processes. The object drawings may receive some amendments over time, for example, because of post-production alterations and/or image transformation operations. As such, eventually multiple versions of the object drawings may be created. Tracking amendments between two consecutive versions of drawings may be necessary for various reasons. However, this tracking is generally performed manually, which proves to be an effort-intensive and time-intensive process. Moreover, manual tracking is limited in terms of volume and accuracy.
[003] Some comparison tools are available which use a pixel-by-pixel comparison algorithm to find the differences (amendments). However, these tools fail to provide accurate results, especially in the case of shifted content.
[004] Accordingly, there is a need for a solution for effectively comparing two images (e.g., an original version and an amended version of an object drawing) and for detecting and classifying mismatched content between the two drawings.
SUMMARY
[005] In an embodiment, a method for comparing images is disclosed. The method may include receiving a grid size to determine a number of a plurality of grids for dividing each of a first image and a second image into the plurality of grids. The method may further include dividing each of the first image and the second image into the plurality of grids based on the size of each of the first image and the second image, and comparing each grid of the first image with the plurality of grids of the second image to detect a mismatch between each grid of the first image and the plurality of grids of the second image. The method may further include classifying the content of each grid of the first image as one of a content of addition, a content of deletion, or a content of shift based upon the detected mismatched content.
[006] In another embodiment, a system for comparing images is disclosed. The system includes a processor and a memory communicatively coupled to the processor. The memory stores processor-executable instructions, which, on execution, cause the processor to receive a grid size to determine a number of a plurality of grids for dividing each of a first image and a second image into the plurality of grids. The processor-executable instructions may further cause the processor to divide each of the first image and the second image into the plurality of grids based on the size of each of the first image and the second image. The processor-executable instructions may further cause the processor to compare each grid of the first image with the plurality of grids of the second image to detect a mismatch between each grid of the first image and the plurality of grids of the second image, and classify the content of each grid of the first image as one of a content of addition, a content of deletion, or a content of shift based upon the detected mismatched content.
[007] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles.
[009] FIG. 1 illustrates a block diagram of an exemplary computer system for image comparison, in accordance with some embodiments of the present disclosure;
[010] FIG. 2 illustrates a functional block diagram of an image comparison system, in accordance with some embodiments of the present disclosure;
[011] FIG. 3 illustrates a flowchart of a method of comparing images, in accordance with some embodiments of the present disclosure;
[012] FIG. 4 illustrates a flow diagram of a process of comparing images, in accordance with some embodiments of the present disclosure; and
[013] FIG. 5 illustrates a flowchart of another method of comparing images, in accordance with some embodiments of the present disclosure.
DETAILED DESCRIPTION
[014] Exemplary embodiments are described with reference to the accompanying drawings. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the following claims. Additional illustrative embodiments are listed below.
[015] In an embodiment, a system and a method of comparing images are disclosed. For example, different versions of a two-dimensional object drawing (also referred to as a mechanical drawing) can be compared to detect and classify mismatched content. As will be appreciated by those skilled in the art, content of the object drawing may vary based on the version of the object drawing, since some parts may be added or deleted, or changes may occur in an existing part itself. For example, a shifting of content in the existing object drawing may happen while adding new content, because of space constraints in the existing object drawing.
[016] The present disclosure provides one or more techniques capable of overcoming the challenges of existing solutions, which are not able to carry out accurate comparisons. The techniques provide for a grid-by-grid comparison for identifying mismatched content. These techniques are able to differentiate added, removed, or shifted content. Further, the techniques provide for highlighting the added, removed, or shifted content with different associated colors in an output image. This allows a user to treat the shifted content as existing content (as the content has only shifted in position but is the same across versions) rather than as different content. Furthermore, the techniques help to automate tracking of changes between different versions of object drawings. As such, the techniques provide a convenient solution for tracking version changes in object drawings.
[017] Referring now to FIG. 1, an exemplary computing system 100 that may be employed to implement processing functionality for various embodiments (e.g., as a SIMD device, client device, server device, one or more processors, or the like) is illustrated. Those skilled in the relevant art will also recognize how to implement the invention using other computer systems or architectures. The computing system 100 may represent, for example, a user device such as a desktop, a laptop, a mobile phone, a personal entertainment device, a DVR, and so on, or any other type of special or general-purpose computing device as may be desirable or appropriate for a given application or environment. The computing system 100 may include one or more processors, such as a processor 102 that may be implemented using a general or special purpose processing engine such as, for example, a microprocessor, microcontroller, or other control logic. In this example, the processor 102 is connected to a bus 104 or other communication media. In some embodiments, the processor 102 may be an Artificial Intelligence (AI) processor, which may be implemented as a Tensor Processing Unit (TPU), a graphics processing unit (GPU), or a custom programmable solution such as a Field-Programmable Gate Array (FPGA).
[018] The computing system 100 may also include a memory 106 (main memory), for example, Random Access Memory (RAM) or other dynamic memory, for storing information and instructions to be executed by the processor 102. The memory 106 also may be used for storing temporary variables or other intermediate information during the execution of instructions to be executed by processor 102. The computing system 100 may likewise include a read-only memory (“ROM”) or other static storage device coupled to bus 104 for storing static information and instructions for the processor 102.
[019] The computing system 100 may also include storage devices 108, which may include, for example, a media drive 110 and a removable storage interface. The media drive 110 may include a drive or other mechanism to support fixed or removable storage media, such as a hard disk drive, a floppy disk drive, a magnetic tape drive, an SD card port, a USB port, a micro-USB, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive. A storage media 112 may include, for example, a hard disk, magnetic tape, flash drive, or other fixed or removable media that is read by and written to by the media drive 110. As these examples illustrate, the storage media 112 may include a computer-readable storage medium having stored therein particular computer software or data.
[020] In alternative embodiments, the storage devices 108 may include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into the computing system 100. Such instrumentalities may include, for example, a removable storage unit 114 and a storage unit interface 116, such as a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, and other removable storage units and interfaces that allow software and data to be transferred from the removable storage unit 114 to the computing system 100.
[021] The computing system 100 may also include a communications interface 118. The communications interface 118 may be used to allow software and data to be transferred between the computing system 100 and external devices. Examples of the communications interface 118 may include a network interface (such as an Ethernet or other NIC card), a communications port (such as, for example, a USB port or a micro-USB port), Near Field Communication (NFC), etc. Software and data transferred via the communications interface 118 are in the form of signals which may be electronic, electromagnetic, optical, or other signals capable of being received by the communications interface 118. These signals are provided to the communications interface 118 via a channel 120. The channel 120 may carry signals and may be implemented using a wireless medium, wire or cable, fiber optics, or other communications medium. Some examples of the channel 120 may include a phone line, a cellular phone link, an RF link, a Bluetooth link, a network interface, a local or wide area network, and other communications channels.
[022] The computing system 100 may further include Input/Output (I/O) devices 122. Examples may include, but are not limited to a display, keypad, microphone, audio speakers, vibrating motor, LED lights, etc. The I/O devices 122 may receive input from a user and also display an output of the computation performed by the processor 102. In this document, the terms “computer program product” and “computer-readable medium” may be used generally to refer to media such as, for example, the memory 106, the storage devices 108, the removable storage unit 114, or signal(s) on the channel 120. These and other forms of computer-readable media may be involved in providing one or more sequences of one or more instructions to the processor 102 for execution. Such instructions, generally referred to as “computer program code” (which may be grouped in the form of computer programs or other groupings), when executed, enable the computing system 100 to perform features or functions of embodiments of the present invention.
[023] In an embodiment where the elements are implemented using software, the software may be stored in a computer-readable medium and loaded into the computing system 100 using, for example, the removable storage unit 114, the media drive 110 or the communications interface 118. The control logic (in this example, software instructions or computer program code), when executed by the processor 102, causes the processor 102 to perform the functions of the invention as described herein.
[024] Referring now to FIG. 2, a functional block diagram of an image comparison device 200 is illustrated, in accordance with some embodiments of the present disclosure. FIG. 2 is explained in conjunction with elements of the computing system 100. The image comparison device 200 may be configured to receive an input from a user. For example, the input may include a first document and a second document in a Portable Document Format (PDF). Further, for example, the first document and the second document may relate to engineering drawings, and in particular two different versions of an engineering drawing. The image comparison device 200 may be configured to convert the first document and the second document from the PDF format into an image format to obtain a first image and a second image.
[025] The image comparison device 200 may be further configured to receive an input from a user that may include a grid size for the first and the second image. The grid size may be used to determine a number of a plurality of grids into which each of the first image and the second image may be divided. The image comparison device 200 may be further configured to divide each of the first image and the second image into the plurality of grids based on a size of each of the first image and the second image, and further configured to compare each grid of the first image with the plurality of grids of the second image to detect a mismatch between each grid of the first image and the plurality of grids of the second image. Upon detecting the mismatch, the image comparison device 200 may be further configured to classify content of each grid of the first image or the second image as one of a content of addition, a content of deletion, or a content of shift.
[026] Further, the image comparison device 200 may be configured to present an output image including the content of each of the plurality of grids of the first image or the second image. The content of each of the plurality of grids of the first image may be represented in one of a first predefined color, a second predefined color, a third predefined color, and a fourth predefined color. For example, the first predefined color may correspond to the content of no-mismatch, the second predefined color may correspond to the content of shift, the third predefined color may correspond to the content of addition, and the fourth predefined color may correspond to the content of deletion.
[027] In order to perform the above functions, in some embodiments, the image comparison device 200 may include various modules, for example, an input receiving module 202, an image processing module 204, a mismatch detecting module 206, and a classification module 208. Additionally, the image comparison device 200 may include a format conversion module 210 and an output image presenting module 212.
[028] In some embodiments, the format conversion module 210 may be configured to receive a first document and a second document when the first document and the second document are not in an image format, for example, when they are in a PDF format. The format conversion module 210 may further convert each of the first document and the second document from that format (i.e., the PDF format) into an image format to obtain a first image and a second image.
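By way of illustration only, the following is a minimal Python sketch of how such a format conversion step might be implemented, assuming the third-party pdf2image library (a poppler wrapper) is available; the helper name, file names, and DPI value are illustrative assumptions and are not part of the disclosure.

```python
# Hedged sketch of a PDF-to-image conversion step (assumes pdf2image
# and its poppler dependency are installed; paths/DPI are illustrative).
from pdf2image import convert_from_path

def pdf_to_image(pdf_path, out_path, dpi=200):
    """Render the first page of a PDF document to a PNG image file."""
    pages = convert_from_path(pdf_path, dpi=dpi)  # list of PIL.Image pages
    pages[0].save(out_path, "PNG")
    return out_path

# Example (hypothetical file names):
# first_image_path = pdf_to_image("drawing_v1.pdf", "drawing_v1.png")
# second_image_path = pdf_to_image("drawing_v2.pdf", "drawing_v2.png")
```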
[029] The input receiving module 202 may be configured to receive the first image and the second image from the format conversion module 210. The input receiving module 202 may be further configured to receive an input from a user. The input may include a grid size. For example, the grid size may include a number of pixels along a horizontal and a vertical side of a rectangular shaped grid. It should be noted that the grid size may be used to determine a number of a plurality of grids into which each of the first image and the second image may be divided. Additionally or alternately, the input may simply include the number of the plurality of grids.
[030] The image processing module 204 may be configured to divide the first and the second image into the plurality of grids. As will be understood, dividing each of the first image and the second image into the plurality of grids may imply superimposing a grid-mesh over the first image and the second image. Further, this grid-mesh may be comprised of the plurality of grids, with each grid having the grid size received from the user.
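As an illustration only, such a grid-division step could look like the following sketch, assuming Pillow and NumPy; the helper name divide_into_grids and the grayscale conversion are assumptions made for this sketch.

```python
# Hedged sketch: divide an image into a mesh of equally sized grids,
# given a user-supplied grid size in pixels (grid_w x grid_h).
import numpy as np
from PIL import Image

def divide_into_grids(image_path, grid_w, grid_h):
    """Return a dict mapping (row, col) -> grid pixel array (grayscale)."""
    img = np.array(Image.open(image_path).convert("L"))
    rows = img.shape[0] // grid_h          # number of grids vertically
    cols = img.shape[1] // grid_w          # number of grids horizontally
    grids = {}
    for r in range(rows):
        for c in range(cols):
            grids[(r, c)] = img[r * grid_h:(r + 1) * grid_h,
                                c * grid_w:(c + 1) * grid_w]
    return grids
```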
[031] The mismatch detecting module 206 may be configured to compare the first image with the second image, once the first image and the second image have been divided into the plurality of grids. In particular, the mismatch detecting module 206 may compare each grid of the first image with the plurality of grids of the second image to detect a mismatch between each grid of the first image and the plurality of grids of the second image. It should be noted that the comparing may include performing a pixel match between a test grid of the first image and a corresponding grid of the second image, to determine one of a positive pixel match and a negative pixel match between the test grid of the first image and the corresponding grid of the second image. Additionally or alternately, the comparing may include performing a pattern match between the test grid of the first image and the remaining grids of the second image to determine one of a positive pattern match and a negative pattern match.
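A minimal sketch of these two comparison primitives, assuming NumPy and OpenCV, is given below; the tolerance and threshold values are illustrative assumptions, not values taken from the disclosure.

```python
# Hedged sketch of the two comparison primitives: an exact/near-exact
# pixel match of corresponding grids, and a pattern (template) match of
# a test grid against the whole of the other image.
import numpy as np
import cv2  # OpenCV, assumed available

def pixel_match(grid_a, grid_b, tolerance=0):
    """Positive pixel match if corresponding pixels are (nearly) identical."""
    if grid_a.shape != grid_b.shape:
        return False
    return int(np.max(np.abs(grid_a.astype(int) - grid_b.astype(int)))) <= tolerance

def pattern_match(test_grid, other_image, threshold=0.99):
    """Positive pattern match if the test grid's content occurs anywhere in the other image."""
    scores = cv2.matchTemplate(other_image, test_grid, cv2.TM_CCOEFF_NORMED)
    return float(scores.max()) >= threshold  # note: uniform (blank) grids need separate handling
```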
[032] The mismatch detecting module 206 may be further configured to detect a mismatch between each grid of the first image and the plurality of grids of the second image, based on the comparison. In other words, the mismatch detecting module 206 may detect mismatched content between the first image and the second image. For example, the mismatched content may be a content of shift, a content of addition, or a content of deletion. It should be noted that the terms first image and second image may be used interchangeably in this disclosure.
[033] The classification module 208 may classify content of each grid of the first image as one of the content of addition, the content of deletion, or the content of shift. By way of an example, the first and the second image may be a first and a second version of an engineering drawing. The content of shift may relate to a drawing or a part of a drawing that is present in both the first and the second image; however, the position of the drawing or the part of the drawing is not the same. The content of addition may relate to a drawing or a part of a drawing that is not present in the first image but is present in the second image. The content of deletion may relate to a drawing or a part of a drawing that is present in the first image but is not present in the second image. For example, upon a positive pixel match, the classification module 208 may classify the content of the test grid as content of no-mismatch. Upon a negative pixel match, the pattern match may be performed between the test grid of the first image and the remaining grids of the second image to determine one of a positive pattern match and a negative pattern match. Further, upon a positive pattern match, the classification module 208 may classify the content of the test grid as the content of shift. Furthermore, upon a negative pattern match, the classification module 208 may classify the content of the test grid as the content of addition or the content of deletion.
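Putting the two primitives together, the classification described above could be sketched as follows, reusing the hypothetical pixel_match and pattern_match helpers from the earlier sketch.

```python
# Hedged sketch of the classification decision for a single test grid
# of the first image: no-mismatch, shift, or addition/deletion.
def classify_grid(test_grid, corresponding_grid, other_image):
    if pixel_match(test_grid, corresponding_grid):
        return "no-mismatch"          # same content at the same position
    if pattern_match(test_grid, other_image):
        return "shift"                # same content found at a different position
    return "addition/deletion"        # content present in only one of the images
```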
[034] The output image presenting module 212 may present an output image which may include the content of each of the plurality of grids of the first image. Further, the content of each of the plurality of grids of the first image may be represented in one of a first predefined color, a second predefined color, a third predefined color, or a fourth predefined color. For example, the first predefined color corresponds to the content of no-mismatch, the second predefined color corresponds to the content of shift, the third predefined color corresponds to the content of addition, and the fourth predefined color corresponds to the content of deletion.
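A sketch of such an output-rendering step is shown below, assuming Pillow and NumPy; the concrete colour values follow the examples given later in this disclosure (black for no-mismatch, blue for shift, green for addition, red for deletion) and are otherwise illustrative.

```python
# Hedged sketch: compose a colour-coded output image from per-grid
# classification labels (assumes Pillow/NumPy; colours are examples).
import numpy as np
from PIL import Image

COLORS = {
    "no-mismatch": (0, 0, 0),     # black
    "shift":       (0, 0, 255),   # blue
    "addition":    (0, 128, 0),   # green
    "deletion":    (255, 0, 0),   # red
}

def render_output(gray_image, labels, grid_w, grid_h, out_path="diff.png"):
    """Recolour the dark (drawing) pixels of each grid according to its label."""
    h, w = gray_image.shape
    out = np.full((h, w, 3), 255, dtype=np.uint8)   # white background
    for (r, c), label in labels.items():
        ys = slice(r * grid_h, (r + 1) * grid_h)
        xs = slice(c * grid_w, (c + 1) * grid_w)
        region = out[ys, xs]
        region[gray_image[ys, xs] < 128] = COLORS[label]  # dark pixels = drawing content
    Image.fromarray(out).save(out_path)
    return out_path
```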
[035] Referring now to FIG. 3, a flowchart of a method 300 of comparing images is illustrated, in accordance with some embodiments of the present disclosure. In some embodiments, the method 300 may be performed by the image comparison device 200.
[036] In some embodiments, at step 302, a first document and a second document may be received. When each of the first document and the second document is not in an image format, for example, is in a PDF format, each of the first document and the second document may be converted from the PDF format into an image format, at step 302. As such, a first image and a second image may be obtained.
[037] At step 304, an input may be received from a user. For example, the input may include a grid size. For example, the grid size may include a number of pixels along a horizontal and a vertical side of a rectangular shaped grid. The grid size may be used to determine a number of a plurality of grids for dividing each of the first image and the second image into the plurality of grids. For example, the number of the plurality of grids may be determined based on pixel resolution of each of the first image and the second image. Additionally or alternately, the input may simply include the number of the plurality of grids. At step 306, each of the first image and the second image may be divided into the plurality of grids based on a size of each of the first image and the second image.
[038] At step 308, each of the plurality of grids of the first image may be compared with the plurality of grids of the second image to detect a mismatch between each grid of the first image and the plurality of grids of the second image. In some embodiments, the comparison may include performing a pixel match between a test grid of the first image and a corresponding grid of the second image, to determine one of a positive pixel match and a negative pixel match between the test grid of the first image and the corresponding grid of the second image. Additionally or alternatively, the comparison may further include performing a pattern match between the grids of the first image and the grids of the second image to determine one of a positive pattern match and a negative pattern match.
[039] At step 310, upon detecting the mismatch, the content of each grid of the first image or the second image may be classified as one of a content of addition, a content of deletion, or a content of shift. For example, upon a positive pixel match, the content of the test grid may be classified as a content of no-mismatch. Upon a negative pixel match, a pattern match may be performed between the test grid of the first image and the remaining grids of the second image to determine one of a positive pattern match and a negative pattern match. Furthermore, upon a positive pattern match, the content of the test grid may be classified as the content of shift. Upon a negative pattern match, the content of the test grid may be classified as one of the content of addition and the content of deletion.
[040] In some embodiments, at step 312, an output image may be presented that may include the content of each of the plurality of grids of the first image and the second image. Further, the content of the output image may be represented in one of a first predefined color, a second predefined color, a third predefined color, and a fourth predefined color. For example, the first predefined color may correspond to the content of no-mismatch, the second predefined color may correspond to the content of shift, the third predefined color may correspond to the content of addition, and the fourth predefined color may correspond to the content of deletion. At step 314, additionally, the output image may be converted into a PDF format. FIG. 3 is further explained by way of an example case scenario in conjunction with FIG. 4.
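The final conversion of the output image into a PDF could, for example, be a one-line Pillow operation, as in the following sketch (file names are illustrative assumptions):

```python
# Hedged sketch: save the colour-coded output image as a single-page PDF.
from PIL import Image

def image_to_pdf(image_path, pdf_path):
    Image.open(image_path).convert("RGB").save(pdf_path, "PDF")
    return pdf_path

# image_to_pdf("diff.png", "diff.pdf")  # hypothetical file names
```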
[041] Referring now to FIG. 4, a flow diagram of a method 400 of comparing images is illustrated, in accordance with some embodiments of the present disclosure. The method 400 is explained using snapshots of a first image 410 and a second image 420.
[042] At step 402, the first image 410 and the second image 420 may be received. By way of an example, the first image 410 may be a first version of a mechanical drawing that may include elements 410A, 410B, 410C, 410D. The second image 420 may be a second version of the mechanical drawing and may include elements 410A, 410C, 410D, 410E. As will be understood, the elements 410A, 410B, 410C, 410D and the elements 410A, 410C, 410D, 410E may refer to a sub-drawing or a part of a drawing within the first image 410 and the second image 420, respectively. The second image 420 may include some mismatched content, i.e. an added content, or a deleted content, or a shifted content.
[043] At step 404, each of the first image 410 and the second image 420 may be divided into a plurality of grids. In order to divide the first image 410 and the second image 420 into the plurality of grids, a grid size may be received from a user. For example, the grid size may include a number of pixels along a horizontal and a vertical side of a rectangular shaped grid. The grid size may be used to determine a number of a plurality of grids for dividing each of the first image and the second image into the plurality of grids. Additionally or alternately, in order to divide the first image 410 and the second image 420 into the plurality of grids, the number of the plurality of grids may be received from the user. Therefore, at step 404, each of the first image 410 and the second image 420 may be divided into the plurality of grids, based on a size of each of the first image and the second image. As shown in FIG. 4, at 404, upon dividing into the plurality of grids, the first image 410 and the second image 420 may be represented as a first grid image 412 and a second grid image 422, respectively (therefore, the terms “first image 410” and “first grid image 412” may have been used interchangeably; similarly, the terms “second image 420” and “second grid image 422” may have been used interchangeably in this disclosure). Further, as shown, each of the first grid image 412 and the second grid image 422 may be divided into 154 grids, with 11 grids defined vertically and 14 grids defined horizontally. It should be noted that any combination of these 154 grids can be treated as a grid for the above purpose.
[044] At steps 406A, 406B, the first grid image 412 and the second grid image 422 may be compared. In particular, each grid of the first grid image 412 may be compared with the plurality of grids of the second grid image 422 to detect a mismatch between each grid of the first grid image 412 and the plurality of grids of the second grid image 422. For example, at step 406A, a grid 2,2 of the first grid image 412 is compared with a corresponding grid 2,2 of the second grid image 422. In some embodiments, the comparison may include performing a pixel match between corresponding grids of the first grid image 412 and the second grid image 422 to determine one of a positive pixel match and a negative pixel match between the grid of the first grid image 412 and the corresponding grid of the second grid image 422. In the above example, a positive pixel match is found between the grid 2,2 of the first grid image 412 and the corresponding grid 2,2 of the second grid image 422, and as such, the content of the grid 2,2 of the first grid image 412 may be classified as content of no-mismatch.
[045] Similarly, a grid 8,6 of the first grid image 412 is compared with a corresponding grid 8,6 of the second grid image 422. The comparison may include performing a pixel match between corresponding grids 8,6 of the first grid image 412 and the second grid image 422 to determine one of a positive pixel match and a negative pixel match between the grid 8,6 of the first grid image 412 and the corresponding grid 8,6 of the second grid image 422. A negative pixel match is found between the grid 8,6 of the first grid image 412 and the corresponding grid 8,6 of the second grid image 422.
[046] Thereafter, at step 406B, upon the negative pixel match, a pattern match may be performed between the grid 8,6 of the first grid image 412 and the remaining grids of the second grid image 422 to determine one of a positive pattern match and a negative pattern match. As such, a positive pattern match is found between grid 8,6 of the first grid image 412 and a combination of grids 8,8 and 8,9 of the second grid image 422. Upon the positive pattern match, the content of the grid 8,6 of the first grid image 412 may be classified as the content of shift. In other words, the content of the grid 8,6 of the first grid image 412 exists in the second grid image 422, but its position is not the same as in the first grid image 412, i.e., it is at a shifted position.
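For illustration only, the following sketch shows how the location of such a shifted match could be recovered and mapped back to the grids of the second grid image that it overlaps (for example, the combination of grids 8,8 and 8,9 above); it assumes OpenCV, and the matching threshold is an assumption for this sketch.

```python
# Hedged sketch: locate where a shifted grid's content re-appears in the
# second grid image and report the (row, col) grids it overlaps.
import cv2

def locate_shifted_content(test_grid, other_image, grid_w, grid_h, threshold=0.99):
    scores = cv2.matchTemplate(other_image, test_grid, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, (x, y) = cv2.minMaxLoc(scores)   # top-left corner of best match
    if max_val < threshold:
        return None                                 # negative pattern match
    gh, gw = test_grid.shape
    rows = range(y // grid_h, (y + gh - 1) // grid_h + 1)
    cols = range(x // grid_w, (x + gw - 1) // grid_w + 1)
    return [(r, c) for r in rows for c in cols]     # grids covered by the match
```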
[047] Therefore, using the above comparison, each of the plurality of grids of the first grid image 412 may be compared with the plurality of grids of the second grid image 422, to classify content of each grid of the first grid image 412 as one of the content of no-mismatch, or the content of addition, or the content of deletion, or the content of shift.
[048] Furthermore, based on the classification of each grid, one or more elements of the first grid image 412 may be identified as elements of shift, elements of addition, or elements of deletion. For example, the elements 410A and 410C are found and classified as elements of no-mismatch, since both the elements 410A and 410C exist (i.e., no deletion or addition) in both the first grid image 412 and the second grid image 422 and at the same corresponding positions (i.e., no shift). The element 410B is found to exist only in the first grid image 412 and is absent in the second grid image 422, and therefore may be classified as an element of deletion. The element 410D is found to exist in both the first grid image 412 and the second grid image 422, however, at different positions. As such, the element 410D may be classified as an element of shift. The element 410E is found to exist only in the second grid image 422 and is absent in the first grid image 412. Therefore, the element 410E may be classified as an element of addition.
[049] It should be noted that the detection of the content of deletion or content of addition may be based on detection of content which is present or absent between the first grid image 412 and the second grid image 422. For example, if a content of a grid of the first grid image 412 is not found in the second grid image 422, that content may be classified as content of deletion. Similarly, if a content of a grid of the second grid image 422 is not found in the first grid image 412, that content may be classified as content of addition.
[050] At step 408, an output image 440 may be presented that may include the content of each of the plurality of grids of the first image 410 and the second image 420. Further, the contents of the output image 440 may be represented in one of a first predefined color, a second predefined color, a third predefined color, and a fourth predefined color. The first predefined color may correspond to the content of no-mismatch, the second predefined color may correspond to the content of shift, the third predefined color may correspond to the content of addition, and the fourth predefined color may correspond to the content of deletion.
[051] Therefore, the elements 410A and 410C, classified as elements of no-mismatch, may be represented in the first predefined color, for example, Black color. The element 410B, classified as the element of deletion, may be represented in the fourth predefined color, for example, Red color. The element 410D, classified as the element of shift, may be represented in the second predefined color, for example, Blue color. The element 410E, classified as the element of addition, may be represented in the third predefined color, for example, Green color.
[052] Referring now to FIG. 5, another flowchart of a method 500 of comparing images is illustrated, in accordance with some embodiments of the present disclosure. At step 502, a first image and a second image may be received. Further, at step 504, a grid size may be received from a user, that may be used to determine a number of a plurality of grids for dividing each of the first image and the second image into a plurality of grids. For example, the grid size may include a number of pixels along a horizontal and a vertical side of a rectangular shaped grid. The number of the plurality of grids may be determined based on pixel resolution of each of the first image and the second image.
[053] At step 504, each of the first image and the second image may be divided into the plurality of grids based on a size of each of the first image and the second image. At step 506, a pixel match may be performed between a test grid of the first image and a corresponding grid of the second image, to determine one of a positive pixel match and a negative pixel match between the test grid of the first image and the corresponding grid of the second image.
[054] At step 508, a check may be performed to determine whether the pixel match is a negative pixel match or not (i.e., whether it is a positive pixel match). If at step 508, it is determined that the pixel match is not a negative pixel match (i.e., is a positive pixel match), the method 500 may proceed to step 510 (“No” path), where the content of the test grid may be classified as content of no-mismatch, and further, the method 500 may continue to repeat the steps 506-508 for the remaining grids of the first image. However, if at step 508, it is determined that the pixel match is a negative pixel match, the method 500 may proceed to step 512 (“Yes” path). In other words, when a positive pixel match is not obtained, the method may proceed to take a next alternative approach.
[055] At step 512, a pattern match may be performed between the test grid of the first image and the remaining grids of the second image to determine one of a positive pattern match and a negative pattern match. At step 514, a check may be performed to determine whether the pattern match is a positive pattern match or not (i.e., whether it is a negative pattern match). If at step 514, it is determined that the pattern match is a positive pattern match, the method 500 may proceed to step 516 (“Yes” path), where the content of the test grid may be classified as the content of shift. If at step 514, it is determined that the pattern match is not a positive pattern match (i.e., is a negative pattern match), the method 500 may proceed to step 518 (“No” path), where the content of the test grid may be classified as one of the content of addition or the content of deletion.
[056] At step 520, an output image may be presented that may include the content of each of the plurality of grids of the first image and the second image. Further, the content of the output image may be represented in one of a first predefined color, a second predefined color, a third predefined color, and a fourth predefined color. For example, the first predefined color may correspond to the content of no-mismatch, the second predefined color may correspond to the content of shift, the third predefined color may correspond to the content of addition, and the fourth predefined color may correspond to the content of deletion.
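For completeness, an end-to-end sketch mirroring the overall flow of the method 500 is given below, tying together the hypothetical helpers from the earlier sketches (divide_into_grids, pixel_match, pattern_match, render_output); the default grid size and the handling of blank grids are simplifying assumptions made only for this illustration.

```python
# Hedged end-to-end sketch of the comparison flow. Only grids of the
# first image are classified here; a symmetric pass over the second
# image would mark content present only in the second image as "addition".
import numpy as np
from PIL import Image

def compare_images(first_path, second_path, grid_w=50, grid_h=50):
    first = np.array(Image.open(first_path).convert("L"))
    second = np.array(Image.open(second_path).convert("L"))
    grids_1 = divide_into_grids(first_path, grid_w, grid_h)
    grids_2 = divide_into_grids(second_path, grid_w, grid_h)
    labels = {}
    for key, test_grid in grids_1.items():
        if key in grids_2 and pixel_match(test_grid, grids_2[key]):
            labels[key] = "no-mismatch"     # positive pixel match
        elif pattern_match(test_grid, second):
            labels[key] = "shift"           # positive pattern match elsewhere
        else:
            labels[key] = "deletion"        # content missing from the second image
    return render_output(first, labels, grid_w, grid_h)

# output_path = compare_images("drawing_v1.png", "drawing_v2.png")  # hypothetical files
```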
[057] As will be appreciated by those skilled in the art, the techniques described in the various embodiments discussed above are not routine, or conventional, or well understood in the art. The techniques discussed above provide a way for comparing images. Further, the techniques may provide a way of automatic tracking of all the changes observed in the first image when compared with the second image. The techniques may also help in classifying the mismatched content.
[058] Thus, the disclosed method offers the ability to find mismatched content and differentiate between added/removed/shifted content. The disclosed method also enables the identification of the shifted content with changes and also the shifted content without changes.
[059] It is intended that the disclosure and examples be considered as exemplary only, with a true scope and spirit of disclosed embodiments being indicated by the following claims.
Claims: We claim:
1. A method of comparing images, the method comprising:
receiving, by a comparison device, a grid size to determine a number of a plurality of grids for dividing each of a first image and a second image into the plurality of grids;
dividing, by the comparison device, each of the first image and the second image into the plurality of grids based on a size of each of the first image and the second image;
comparing, by the comparison device, each grid of the first image with the plurality of grids of the second image to detect a mismatch between each grid of the first image and the plurality of grids of the second image; and
upon detecting the mismatch, classifying, by the comparison device, content of each grid of the first image or the second image as one of a content of addition, a content of deletion, or a content of shift.
2. The method as claimed in claim 1, wherein the number of the plurality of grids is determined based on pixel resolution of each of the first image and the second image.
3. The method as claimed in claim 1, wherein the comparing comprises:
performing a pixel match between a test grid of the first image and a corresponding grid of the second image, to determine one of a positive pixel match and a negative pixel match between the test grid of the first image and the corresponding grid of the second image.
4. The method as claimed in claim 3 further comprising:
upon a positive pixel match,
classifying the content of the test grid as content of no-mismatch; and
upon a negative pixel match,
performing a pattern match between the test grid of the first image and the remaining grids of the second image to determine one of a positive pattern match and a negative pattern match.
5. The method as claimed in claim 4 further comprising:
upon a positive pattern match,
classifying the content of the test grid as the content of shift; and
upon a negative pattern match,
classifying the content of the test grid as one of the content of addition and the content of deletion.
6. The method as claimed in claim 5 further comprising:
presenting an output image comprising the content of each of the plurality of grids of the first image and the second image,
wherein the content of the output image is represented in one of a first predefined color, a second predefined color, a third predefined color, and a fourth predefined color,
wherein the first predefined color corresponds to the content of no-mismatch,
wherein the second predefined color corresponds to the content of shift,
wherein the third predefined color corresponds to the content of addition, and
wherein the fourth predefined color corresponds to the content of deletion.
7. The method as claimed in claim 1 further comprising:
receiving at least one of a first document and a second document in a PDF format; and
converting the at least one of the first document and the second document from the PDF format into an image format to obtain the first image and the second image.
8. The method as claimed in claim 6 further comprising:
converting the output image into a PDF format.
9. A system for comparing images, the system comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions which, on execution by the processor, cause the processor to:
receive a grid size to determine a number of a plurality of grids for dividing each of a first image and a second image into the plurality of grids;
divide each of the first image and the second image into the plurality of grids based on a size of each of the first image and the second image;
compare each grid of the first image with the plurality of grids of the second image to detect a mismatch between each grid of the first image and the plurality of grids of the second image; and
upon detecting the mismatch, classify content of each grid of the first image or the second image as one of a content of addition, a content of deletion, or a content of shift.
10. The system as claimed in claim 9, wherein the comparing comprises:
performing a pixel match between a test grid of the first image and a corresponding grid of the second image, to determine one of a positive pixel match and a negative pixel match between the test grid of the first image and the corresponding grid of the second image.
11. The system as claimed in claim 10, wherein the processor-executable instructions further cause the processor to:
upon a positive pixel match,
classify the content of the test grid as content of no-mismatch;
upon a negative pixel match,
perform a pattern match between the test grid of the first image and the remaining grids of the second image to determine one of a positive pattern match and a negative pattern match;
upon a positive pattern match,
classify the content of the test grid as the content of shift; and
upon a negative pattern match,
classify the content of the test grid as one of the content of addition and the content of deletion.
12. The system as claimed in claim 11, wherein the processor-executable instructions further cause the processor to:
present an output image comprising the content of each of the plurality of grids of the first image and the second image,
wherein the content of the output image is represented in one of a first predefined color, a second predefined color, a third predefined color, and a fourth predefined color,
wherein the first predefined color corresponds to the content of no-mismatch,
wherein the second predefined color corresponds to the content of shift,
wherein the third predefined color corresponds to the content of addition, and
wherein the fourth predefined color corresponds to the content of deletion.