Abstract: A METHOD FOR IMPROVING ACCURACY OF IMAGE SENSORS. The disclosed method solves the problem of obtaining lower accuracy images from lower resolution image sensors by improving the processing accuracy of the image sensors. The method includes obtaining a target image; traversing the target image to obtain a plurality of target segments based on contour identification by edge detection; selecting a first segment of the plurality of segments and choosing a first test image corresponding to the first segment based on the area and edges of the contour; generating a Graded Similarity Index between the first segment and the first test image; determining the centre of the first segment based on the Graded Similarity Index; and executing sub-pixel interpolation at the centre to obtain sub-micron accuracy.
DESCRIPTION OF THE INVENTION
Technical Field of the Invention
[002] The present invention relates to hardware design coupled to image sensors in the image processing domain. More specifically, the invention relates to a method for improving the accuracy of an imaging sensor.
Background of the Invention
[003] Imaging sensors vary across a range of resolutions. Generally, imaging sensors of higher resolution provide better accuracy over the images obtained. For example, an image sensor of 64 MP resolution captures an image of 9216×6912 pixel resolution, which evidently carries more detail than an image obtained by a 12 MP image sensor, which has only 4032×3024 pixel resolution. Further, processing the image obtained by the 64 MP image sensor involves more computation than processing the image obtained by the 12 MP image sensor, assuming processing is executed by a processor of the same capacity. Increased computation leads to increased processing time and use of additional resources.
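The pixel arithmetic behind the resolution figures above can be checked directly; this short Python sketch is illustrative only and not part of the claimed method (the helper name is hypothetical).

```python
# Sanity-check the sensor resolutions quoted above.
def megapixels(width, height):
    """Total pixel count in millions for a given frame size."""
    return width * height / 1e6

print(round(megapixels(9216, 6912), 1))   # 63.7 -> marketed as 64 MP
print(round(megapixels(4032, 3024), 1))   # 12.2 -> marketed as 12 MP
# A 64 MP frame carries roughly 5.2x the pixels of a 12 MP frame,
# hence the proportionally higher processing cost.
print(round(9216 * 6912 / (4032 * 3024), 1))  # 5.2
```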
[004] Furthermore, to obtain higher accuracy images, in most cases a higher resolution sensor is used, accepting the increased computational time, additional use of resources, and so on. Hence, there is a need to obtain efficient, higher accuracy images from low resolution image sensors, such that computation and additional use of resources are minimized.
Object of the invention
[005] The principal object of the invention is to obtain higher accuracy images using lower resolution image sensors.
[006] Another object of the invention is to reduce computational time and computational complexity in image processing.
[007] Another object of the invention is to couple an image sensor with a semi-conductor assembly line to achieve high accuracy component placement.
[008] These and other objects and characteristics of the present invention will become apparent from the further disclosure to be made in the detailed description given below.
Summary of the invention
[009] In furtherance of the present disclosure and related ends, the at least one aspect comprises the features hereinafter fully described and particularly pointed out in the claims. The following drawings and description set forth in detail certain exemplary features of the at least one aspect. The described features are indicative, however, of but a few of the many ways in which the principles of various aspects may be employed, and this description is intended to include all such aspects and their equivalents.
[0010] In the present disclosure, a system and a method for enabling efficient image processing are disclosed. The disclosed system solves the problem of obtaining lower accuracy images from a lower resolution image sensor, and eliminates the need for higher resolution image sensors to obtain higher quality images. Further, the disclosed method solves this problem by improving the processing accuracy of the image sensors. The method includes obtaining a target image; traversing the target image to obtain a plurality of target segments based on contour identification by edge detection; selecting a first segment of the plurality of segments and choosing a first test image corresponding to the first segment based on the area and edges of the contour; generating a Graded Similarity Index between the first segment and the first test image; determining the centre of the first segment based on the Graded Similarity Index; and executing sub-pixel interpolation at the centre to obtain sub-micron accuracy.
[0011] This summary is provided to introduce a selection of concepts in a simplified form that are further described in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
[0012] The above summary is descriptive and exemplary only and is not intended to be restrictive in any way. In addition to the aspects, embodiments, and features described in the above summary, further features and embodiments will become apparent by reference to the accompanying drawings and the following detailed description.
Brief Description of Drawings
[0013] The foregoing and other features of embodiments will become more evident from the following detailed description of embodiments when read together with the associated drawings. In the drawings, like reference numerals refer to like elements.
[0014] In the following description, a number of specific details are set forth in order to enable a thorough understanding of various embodiments of the invention. However, it will be evident to one skilled in the art that the embodiments of the invention may be practised with equivalent arrangements or without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments of the invention.
[0015] FIGs. 1A-1B illustrate an application environment for enabling efficient image processing, according to one embodiment of the invention.
[0016] FIGs. 2A-2C illustrate a flowchart for improving the accuracy of image processing, according to one embodiment of the invention.
[0017] FIGs. 3A-3G illustrate a series of tables depicting the mathematical enablement for improving the accuracy of the image processing, according to one embodiment of the invention.
[0018] FIGs. 4A-4B illustrate graphs depicting the improvement observed in terms of computational complexity and computational time, according to one embodiment of the invention.
[0019] FIGs. 5A-5F illustrate images depicting the improving accuracy of image processing, according to one embodiment of the invention.
Detailed Description of Drawings
[0020] FIGs. 1A-1B illustrate an application environment for enabling efficient image processing, according to one embodiment of the invention. FIG. 1A indicates the use of a 64 MP image sensor to obtain an accuracy of 0.01 µm. Likewise, a 16 MP image sensor may be used to obtain an accuracy of 0.1 µm. In this case, accuracy is defined as the ability of an imaging sensor to accurately identify a particular shape/pattern with nearly nil tolerance. FIG. 1A further illustrates that a 16 MP image sensor may be used to obtain an accuracy of 0.01 µm. In order to obtain such a high degree of accuracy using lower configuration hardware, especially in an automated assembly line, a certain system and method are incorporated. Such a system and method to improve the accuracy of image processing are described in further parts of the disclosure.
[0021] FIG. 1B illustrates a system 100 for improving the accuracy of a target image, according to one embodiment of the invention. The system 100 includes a processor 101, a memory 103, and a communication interface 105. The processor 101 can be any processor, such as a 32-bit processor using a flat address space, for example a Hitachi SH1, an Intel i960, an Intel 80386, or a Motorola 68020 (or any other processor with a similar or larger address space). Processors other than those mentioned above, including processors that may be built in the future, are also suitable. The processor can include, but is not limited to, a general-purpose processor, an Application Specific Integrated Circuit (ASIC), a Digital Signal Processing (DSP) chip, an AT89S52 microcontroller with firmware, a Field Programmable Gate Array (FPGA), or a combination thereof.
[0022] Processors suitable for carrying out a computer program include, by way of example, both special and general-purpose microprocessors, and processors of any kind of digital computer. Generally, a processor obtains instructions and data from a read-only memory or a random-access memory (RAM), or both. The essential elements of a computer are a processor for carrying out instructions and one or more memory devices for storing data and instructions. Generally, a computer also includes, or is operatively coupled to transfer data to or receive data from, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a personal digital assistant (PDA), a mobile telephone, a GPS receiver, or a mobile audio player, to name a few. Computer readable media suitable for storing computer programs and data include all forms of non-volatile memory, media, and memory devices, including semiconductor memory devices, e.g., EEPROM, EPROM, and flash memory devices; magnetic disks, e.g., removable disks or internal hard disks; magneto-optical disks; and DVD-ROM and CD-ROM disks. The memory 103 can be of a non-transitory form such as RAM, ROM, flash memory, etc. The processor 101 along with the memory 103 can be supplemented by, or subsumed in, special purpose logic circuits.
[0023] In accordance with an example embodiment, the memory 103 includes both static memory (e.g., ROM, CD-ROM, etc.) and dynamic memory (e.g., RAM, magnetic disk, writable optical disk, etc.).
[0024] For example, consider that in a manufacturing facility an assembly of several components is to be executed on a silicon board, where the form factor of each of the components is in millimetres. Further, the placement of each component is subject to a high degree of accuracy, with a tolerance of 0.01 micrometre. A system may be dynamically configured to place several components on the silicon board based on the improved accuracy of the associated image processing. In this process, Graded Similarity Index technology is integrated with the image processing to achieve sub-micron accuracy. Consider two components that need to be placed on a silicon board. Each component is individually selected and processed under Graded Similarity Index technology to achieve 0.01 micrometre accuracy. The process of achieving sub-micron accuracy is explained in detail in a later part of the disclosure.
[0025] FIG. 2A illustrates a flowchart depicting a method for improving the accuracy of an imaging sensor, according to one embodiment of the invention. At step 201, the method includes obtaining an image of a target. In some example embodiments, the target may include a board or a wafer where one or more components need to be placed accurately. The target image may be obtained by an image sensor associated with an assembly system. The assembly system may also comprise a plurality of robotic arms to accurately place the one or more components.
[0026] At step 203, the method includes traversing the target image to obtain a plurality of segments based on contour identification by edge detection. For example, consider that the target image comprises certain markers where a specific component may be placed. Such markers are identified by contour identification, enabled via edge detection. The edge detection enables accurate identification of contours such that the markers are identified on the target image, where the markers form a plurality of target segments, thereby eliminating unnecessary details and saving a significant amount of processing time and resource usage.
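By way of a non-limiting illustration, step 203 may be sketched as follows. The gradient-threshold edge detector and the bounding-box helper are simplifying assumptions (a practical system might use a full detector such as Canny with proper contour extraction); the function names are hypothetical and not part of the claimed method.

```python
import numpy as np

def edge_mask(img, frac=0.5):
    """Binary edge map from finite-difference gradient magnitude -- a
    minimal stand-in for a full edge detector such as Canny."""
    f = img.astype(float)
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    gx[:, 1:] = np.diff(f, axis=1)   # horizontal intensity changes
    gy[1:, :] = np.diff(f, axis=0)   # vertical intensity changes
    mag = np.hypot(gx, gy)
    return mag > frac * mag.max()

def segment_bbox(edges):
    """Bounding box (top, left, bottom, right) of the edge pixels -- a
    crude proxy for one contour-delimited target segment."""
    rows, cols = np.nonzero(edges)
    return int(rows.min()), int(cols.min()), int(rows.max()), int(cols.max())
```

For a synthetic frame containing one bright marker, `edge_mask` followed by `segment_bbox` isolates the marker's region, which then becomes a candidate first target segment.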
[0027] At step 205, the method includes selecting a first segment of the plurality of segments and choosing a first test image corresponding to the first segment based on the area and edges of the contour. Consider that two target segments are identified, where the first is identified by a plus mark and the second by a circle. In this case, the first target segment (i.e. the plus mark) is identified and a test image corresponding to the plus mark is selected by the system. In some example embodiments, the system may select the right test image by considering the area and the edges of the identified contour. The selected test image corresponding to the first segment may be referred to as the first test image.
[0028] At step 207, the method includes generating a Graded Similarity Index between the first segment and the first test image. Generating the Graded Similarity Index is explained in a further part of the disclosure.
[0029] At step 209, the method includes determining the centre of the first segment based on the Graded Similarity Index. Determining the centre of the first segment based on the Graded Similarity Index is explained in a further part of the disclosure.
[0030] At step 211, the method includes executing sub-pixel interpolation at the determined centre to obtain sub-micron accuracy, thereby increasing the accuracy of the image sensor. The sub-pixel interpolation includes division of each pixel into multiple folds. In this case, sub-pixel interpolation around the centre enables the image processing to determine the centre at a multiple-fold finer resolution. Further, the sub-pixel-interpolated centre becomes a highly accurate position for placing any component.
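Step 211 does not fix a particular interpolation scheme; one common way to realise sub-pixel refinement of a discrete minimum is a three-point parabolic fit, sketched below under that assumption (the helper name is hypothetical and not part of the claimed method).

```python
def subpixel_offset(left, centre, right):
    """Offset in [-0.5, 0.5] of the true extremum relative to `centre`,
    from a parabola fitted through three neighbouring match scores.
    Applied along each axis, this refines the integer-pixel centre found
    in the Graded Similarity Index matrix to sub-pixel precision."""
    denom = left - 2.0 * centre + right
    if denom == 0.0:          # flat neighbourhood: no refinement possible
        return 0.0
    return 0.5 * (left - right) / denom
```

For scores (4, 1, 2) around a discrete minimum, the refined minimum lies 0.25 pixel toward the smaller neighbour; a symmetric neighbourhood such as (2, 1, 2) yields no offset.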
[0031] FIG. 2B illustrates a flowchart of a method for generating a Graded Similarity Index between the first segment and the first test image. At steps 207A and 207B, the method includes binaryfication and fuzzyfication of the first test image and the first target segment, respectively. At step 207C, the method includes determining the Sum of Absolute Differences between the first test image and the first target segment based on traversing of the first test image on the first target segment. In some example embodiments, the binarified first test image and first target segment are subjected to a distance transform, followed by normalization, in order to achieve fuzzyfication. More details, with an example, are provided in FIGs. 3A-3F.
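The binaryfication, distance-transform-plus-normalization fuzzyfication, and Sum of Absolute Differences steps may be sketched as below. The Euclidean distance metric, the brute-force transform, and the helper names are illustrative assumptions; the disclosure does not bind the method to a particular distance metric.

```python
import numpy as np

def binaryfy(img, thresh):
    """Steps 207A/207D: values at or above `thresh` become high (1)."""
    return (img >= thresh).astype(float)

def distance_transform(binary):
    """Euclidean distance from every pixel to the nearest high pixel.
    Brute force, adequate for the small matrices of FIGs. 3A-3G;
    assumes at least one high pixel is present."""
    ys, xs = np.nonzero(binary)
    out = np.empty(binary.shape)
    for i in range(binary.shape[0]):
        for j in range(binary.shape[1]):
            out[i, j] = np.sqrt(((ys - i) ** 2 + (xs - j) ** 2).min())
    return out

def fuzzyfy(dist):
    """Normalize the distance transform to [0, 1]."""
    m = dist.max()
    return dist / m if m > 0 else dist

def sad_matrix(segment, test):
    """Steps 207C/207G: Sum of Absolute Differences at every placement
    of the (fuzzyfied) test image over the (fuzzyfied) target segment."""
    th, tw = test.shape
    H, W = segment.shape
    out = np.empty((H - th + 1, W - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.abs(segment[i:i + th, j:j + tw] - test).sum()
    return out
```

Applying `binaryfy`, `distance_transform`, and `fuzzyfy` to both the test image and the target segment, then `sad_matrix` to the two fuzzyfied results, yields the Graded Similarity Index matrix.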
[0032] FIG. 2C illustrates a flowchart of a method for determining the centre of the first segment based on the Graded Similarity Index. At step 209A, the method includes identification of the smallest value associated with the Graded Similarity Index. At step 209B, the method includes determining a mapped region such that the initial pixel of the first test image is correspondingly placed on the identified smallest value associated with the Graded Similarity Index. At step 209C, the method includes determining the centre of the mapped region as the centre of the first target segment, based on the centre of the first test image. Throughout the disclosure, the Graded Similarity Index is always referred to with respect to a matrix. Therefore, the Graded Similarity Index and the Graded Similarity Index matrix may be used interchangeably.
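Steps 209A-209C may be sketched as follows; taking the "centre of the mapped region" as its geometric centre is an assumption here, as are the NumPy-based helper and its name.

```python
import numpy as np

def segment_centre(gsi, test_shape):
    """Locate the smallest Graded Similarity Index value (209A), map the
    test image's initial (top-left) pixel to that position (209B), and
    return the geometric centre of the mapped region (209C)."""
    i, j = np.unravel_index(np.argmin(gsi), gsi.shape)
    th, tw = test_shape
    return float(i + (th - 1) / 2.0), float(j + (tw - 1) / 2.0)
```

For a Graded Similarity Index matrix whose minimum sits at row 1, column 1, and a 2×2 test image, the mapped region covers rows 1-2 and columns 1-2, so the centre is (1.5, 1.5).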
[0033] FIG. 3A illustrates a target image represented in the form of an 8×16 matrix and FIG. 3B illustrates a test image represented in the form of a 2×2 matrix. Two target segments are identified based on contour identification by edge detection. FIG. 3C illustrates a selected target segment in the form of an 8×8 matrix. The objective is to identify a mark on the selected target segment that maps to the test image. To do so, the selected target segment and the test image are first subjected to binaryfication and fuzzyfication. FIG. 3D illustrates binaryfication and fuzzyfication of the test image. FIG. 3E illustrates binaryfication of the selected target segment, where any value less than 5 is considered low and any value equal to or greater than 5 is considered high. The binaryfied image is subjected to a distance transform. FIG. 3F illustrates normalization of the distance transform matrix, to obtain the fuzzyfied image of the selected target segment. FIG. 3G illustrates the Graded Similarity Index matrix obtained as the Sum of Absolute Differences between the test image and the selected target segment based on traversing of the test image on the selected target segment. It may be considered that the mark present on the target image is present where the smallest value is found in the Graded Similarity Index matrix. Therefore, a mapped region is determined such that the initial pixel of the test image is correspondingly placed on the identified smallest value associated with the Graded Similarity Index. The centre of the mapped region is considered the centre of the selected target segment, based on the centre of the test image, as shown in FIG. 3G.
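The FIG. 3A-3G walk-through can be reproduced end to end on synthetic data. The matrix values below are hypothetical stand-ins (the actual values appear only in the drawings); the threshold of 5 follows FIG. 3E, and representing the fuzzyfied test image as all-zero at the mark is an assumption about the drawings, not a statement of them.

```python
import numpy as np

# Hypothetical stand-ins for the FIG. 3C segment and FIG. 3B test image.
segment = np.zeros((8, 8))
segment[3:5, 3:5] = 9              # a small 'high' mark near the centre
test = np.zeros((2, 2))            # fuzzyfied test image: zero at the mark

# Binaryfication (threshold 5, per FIG. 3E).
binary = (segment >= 5).astype(float)

# Brute-force Euclidean distance transform of the binaryfied segment...
ys, xs = np.nonzero(binary)
dist = np.array([[np.sqrt(((ys - i) ** 2 + (xs - j) ** 2).min())
                  for j in range(8)] for i in range(8)])
# ...followed by normalization, giving the fuzzyfied segment (FIG. 3F).
fuzzy = dist / dist.max()

# Graded Similarity Index matrix (FIG. 3G): Sum of Absolute Differences
# at every placement of the test image over the segment.
gsi = np.array([[np.abs(fuzzy[i:i + 2, j:j + 2] - test).sum()
                 for j in range(7)] for i in range(7)])

# The smallest value marks the mapped region; its centre locates the mark.
i, j = np.unravel_index(np.argmin(gsi), gsi.shape)
i, j = int(i), int(j)
centre = (i + 0.5, j + 0.5)
print((i, j), centre)              # -> (3, 3) (3.5, 3.5)
```

The recovered centre (3.5, 3.5) is exactly the geometric centre of the synthetic mark, matching the behaviour described for FIG. 3G.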
[0034] FIG. 4A illustrates a graph showing accuracy obtained versus the number of computations. A 64 MP image sensor provides higher accuracy at the cost of significantly increased computations. With a 16 MP image sensor, the accuracy obtained is five times lower than with the 64 MP image sensor, while the computation is reduced four times. However, by using the method disclosed in the invention, a 16 MP image sensor may achieve nearly the accuracy of a 64 MP image sensor with almost 50 times fewer computations than the 64 MP image sensor.
[0035] FIG. 4B illustrates a graph showing accuracy obtained versus execution time. A 64 MP image sensor provides higher accuracy at the cost of significantly increased execution time. With a 16 MP image sensor, the accuracy obtained is five times lower than with the 64 MP image sensor, while the execution time is reduced four times. However, by using the method disclosed in the invention, a 16 MP image sensor may achieve nearly the accuracy of a 64 MP image sensor in almost 50 times less execution time than the 64 MP image sensor. From these graphs, it may be observed that, using the method disclosed in this application, the computational complexity and computational time associated with processing an image are effectively reduced while obtaining accurate images from lower resolution image sensors, thereby improving the accuracy of the image sensor.
[0036] FIGs. 5A-5F illustrate images depicting the improving accuracy of image processing, according to one embodiment of the invention. FIG. 5A illustrates a target image, out of which a target segment is selected. The target image is traversed to obtain a plurality of target segments based on contour identification by edge detection. FIG. 5B represents a selected target segment (500). The target image has multiple target segments; specifically, one required target segment is selected out of the multiple target segments.
[0037] FIG. 5C represents a test image; the test image is selected corresponding to the target segment based on the area and edges of the contour of the target segment. The target image and the test image are subjected to binaryfication followed by distance transform and fuzzyfication (normalization). FIG. 5D represents the distance-transformed and fuzzyfied target image. FIG. 5E represents an image subjected to the Graded Similarity Index operation, where the Sum of Absolute Differences between the fuzzyfied first test image and the fuzzyfied target segment is obtained based on traversing of the fuzzyfied first test image on the fuzzyfied target segment. A mapped region is determined such that the initial pixel of the test image is correspondingly placed on the identified smallest value associated with the Graded Similarity Index. The centre of the mapped region is determined as the centre of the target segment, based on the centre of the first test image, as shown in FIG. 5F. FIG. 5F also represents executing sub-pixel interpolation at the centre to obtain sub-micron accuracy.
[0038] The above detailed description describes the invention in connection with a number of embodiments and implementations. The invention is not limited to these embodiments and implementations but covers various obvious modifications and equivalent arrangements which lie within the purview of the appended claims. Although features of the invention are expressed in certain combinations among the claims, it is contemplated that these features may be arranged in any combination and order. Any element, step, or feature used in the detailed description should not be construed as essential to the invention unless explicitly described as such. All such possible modifications, together with the advantages of the present invention, are contemplated by the appended claims to fall within the scope and true spirit of the invention. Therefore, the specification and accompanying drawings are to be regarded in an illustrative and exemplary rather than a restrictive sense.
CLAIMS: We claim,
1. A method for improving accuracy of a target image of a target, the method comprising:
obtaining (201) the target image of the target;
traversing (203) the target image to obtain a plurality of target segments based on contour identification by edge detection;
selecting (205) a first target segment of the plurality of target segments;
selecting a first test image corresponding to the first target segment;
generating (207) a graded similarity index based on the first target segment and the first test image;
determining (209) a centre of the first target segment based on the graded similarity index; and
executing (211) sub-pixel interpolation to the centre of the first target segment to improve the accuracy of the target image.
2. The method of claim 1, wherein the first test image corresponding to the first target segment is selected based on an area and edges of contour of the first target segment.
3. The method of claim 1, wherein, to generate the graded similarity index based on the first target segment and the first test image, the method further comprises:
executing binaryfication (207A) of the first test image to determine a binarified first test image;
performing a distance transform (207B) of the binarified first test image to determine a distance transformed first test image; and
executing fuzzyfication (207C) of the distance transformed first test image to determine a fuzzyfied first test image.
4. The method of claim 1, wherein, to generate the graded similarity index based on the first target segment and the first test image, the method further comprises:
executing binaryfication (207D) of the first target segment to determine a binarified first target segment;
performing a distance transform (207E) of the binarified first target segment to determine a distance transformed first target segment; and
executing fuzzyfication (207F) of the distance transformed first target segment to determine a fuzzyfied first target segment.
5. The method of claims 3 and 4, wherein the method further comprises generating (207G) the graded similarity index by determining a sum of absolute differences between the fuzzyfied first test image and the fuzzyfied first target segment based on traversing of the fuzzyfied first test image on the fuzzyfied first target segment.
6. The method of claim 1, wherein, to determine the centre of the first target segment based on the graded similarity index, the method further comprises:
identifying (209A) a smallest value associated with the graded similarity index;
determining (209B) a mapped region such that an initial pixel of the first test image is correspondingly placed on the identified smallest value associated with the graded similarity index; and
determining (209C) a centre of the mapped region as the centre of the first target segment, based on the centre of the first test image.
7. The method of claim 1, wherein executing sub-pixel interpolation to the centre of the first target segment comprises division of the centre of the first target segment into multiple fold.
8. The method of claim 1, wherein the target corresponds to one of a board and a wafer.
9. A system (100) for improving accuracy of a target image of a target, the system comprising: at least one processor (101); and a memory (103) having instructions stored thereon that, when executed by the at least one processor (101), cause the system (100) to:
obtain the target image of the target;
traverse the target image to obtain a plurality of target segments based on contour identification by edge detection;
select a first target segment of the plurality of target segments;
select a first test image corresponding to the first target segment;
generate a graded similarity index based on the first target segment and the first test image;
determine a centre of the first target segment based on the graded similarity index; and
execute sub-pixel interpolation to the centre of the first target segment to improve the accuracy of the target image.
10. The system (100) of claim 9, wherein the first test image corresponding to the first target segment is selected based on an area and edges of contour of the first target segment.
11. The system (100) of claim 9, wherein, to generate the graded similarity index based on the first target segment and the first test image, the at least one processor (101) is further configured to:
execute binaryfication of the first test image to determine a binarified first test image;
perform a distance transform of the binarified first test image to determine a distance transformed first test image; and
execute fuzzyfication of the distance transformed first test image to determine a fuzzyfied first test image.
12. The system (100) of claim 9, wherein, to generate the graded similarity index based on the first target segment and the first test image, the at least one processor (101) is further configured to:
execute binaryfication of the first target segment to determine a binarified first target segment;
perform a distance transform of the binarified first target segment to determine a distance transformed first target segment; and
execute fuzzyfication of the distance transformed first target segment to determine a fuzzyfied first target segment.
13. The system (100) of claims 11 and 12, wherein the at least one processor (101) is further configured to generate the graded similarity index by determining a sum of absolute differences between the fuzzyfied first test image and the fuzzyfied first target segment based on traversing of the fuzzyfied first test image on the fuzzyfied first target segment.
14. The system (100) of claim 9, wherein, to determine the centre of the first target segment based on the graded similarity index, the at least one processor (101) is further configured to:
identify a smallest value associated with the graded similarity index;
determine a mapped region such that an initial pixel of the first test image is correspondingly placed on the identified smallest value associated with the graded similarity index; and
determine a centre of the mapped region as the centre of the first target segment, based on the centre of the first test image.
15. The system (100) of claim 9, wherein executing sub-pixel interpolation to the centre of the first target segment comprises division of the centre of the first target segment into multiple fold.