Abstract: A method and system (100) for inspection of a three dimensional object is provided. The system uses a vision based method for quality inspection. The system comprises a camera (102) for capturing a 2D image of the 3D object and a 3D model capturing tool (104) for capturing a 3D model of the 3D object. Two points are selected on the 3D object by the user. A Euclidean distance between the points is then measured by a processor (112) on both the 2D image and the 3D model of the 3D object. The Euclidean distance obtained on the 2D image is compared with the Euclidean distance measured from the 3D model by the processor. Based on the comparison, the 2D point selection is iteratively refined to achieve a predefined accuracy level. The system is also configured to identify whether the 3D object is defective or not.
FORM 2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
1. Title of the invention: METHOD AND SYSTEM FOR AUTOMATIC QUALITY
INSPECTION OF 3 DIMENSIONAL OBJECTS
2. Applicant(s)
NAME: TATA CONSULTANCY SERVICES LIMITED
NATIONALITY: Indian
ADDRESS: Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.
FIELD OF THE INVENTION
[0001] The present application generally relates to the field of quality check of three dimensional objects. More particularly, but not specifically, the invention provides a system and method for inspection of three dimensional objects using vision based measurement.
BACKGROUND OF THE INVENTION
[0002] In the manufacturing industry, a large number of three dimensional (3D) objects are needed in the production process to manufacture a product. These 3D objects must be manufactured without any defects. Even a very slight defect in a 3D object may cause a large scale defect in the manufacturing of the product. Therefore, it is very important to measure and inspect the quality of the 3D object at a very early stage of the production process. It is also important to identify defects in real time during the production process. In addition to quality monitoring, there is also a need to measure and validate the dimensions of the 3D object.
[0003] Conventionally, the inspection of defects or flaws on 3D objects is conducted by human visual sight. Defect inspection under such conventional methods suffers from several issues: skilled operators are required, which makes the conventional method inadequate for automatic product inspection. Further, inspection that depends upon human visual sight suffers from accuracy problems. The process involves considerable human involvement using hand held apparatus on a testing floor, which makes it prone to error, and the accuracy depends on the testing environment and on human error. Therefore, there is a clear need for automated quality inspection processes with better accuracy.
[0004] Various other automated methods have also been used in the past for the inspection and measurement of defects in 3D objects. One such method involves comparison of a 3D model under inspection with a base model. In these methods, the system calculates a point to point distance between the base model and the model under inspection, where the basic assumption is that both models have the same number of points. This assumption does not hold in practice, because the surfaces of the base model and the model under inspection are never sampled identically. Consequently, the distance calculated between the base model and the model under inspection to detect defects is not always correct.
[0005] In addition, the existing systems and methods are not robust enough and cannot adapt to different types of environments, such as a noisy shop floor. Further, the existing techniques may also increase the overall cost of inspection of 3D objects. Moreover, existing systems are designed for one specific object and cannot be used for other, different objects.
SUMMARY OF THE INVENTION
[0006] The following presents a simplified summary of some embodiments of the disclosure in order to provide a basic understanding of the embodiments. This summary is not an extensive overview of the embodiments. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the embodiments. Its sole purpose is to present some embodiments in a simplified form as a prelude to the more detailed description that is presented below.
[0007] In view of the foregoing, an embodiment herein provides a system for inspection of a 3 dimensional (3D) object. The system comprises a camera, a 3D model capturing tool, a plurality of noise filters, a memory and a processor. The camera captures a 2D image of the 3D object, wherein the 2D image comprises an image of the 3 dimensional object. The 3D model capturing tool captures a 3D model, wherein the 3D model comprises the 3 dimensional object. The plurality of noise filters remove noise from the 2D image and the 3D model. The processor is in communication with the memory. Initially, the processor receives the 2D image and the 3D model as input. The processor segments the image of the 3 dimensional object from the filtered 2D image. The processor further selects a first point and a second point on the image of the 3 dimensional object, wherein the first point and the second point are the endpoints of a section of the 3 dimensional object which needs to be inspected. Further, the processor calculates a Euclidean distance using the selected 2D points, the camera calibration and a known perpendicular distance from the camera centre towards the 3 dimensional object. Further, the processor segments the 3D model of the 3D object from the captured 3D model and measures the Euclidean distance between the first point and the second point on the 3D model of the 3D object. Finally, the processor compares the Euclidean distance calculated on the 2D image with the Euclidean distance measured from the 3D model and iteratively refines the 2D point selection based on the comparison to achieve a predefined accuracy level.
[0008] In another aspect, an embodiment provides a method for inspection of the 3 dimensional object. Initially, a 2D image is captured using a camera, wherein the 2D image comprises an image of the 3 dimensional object. At the same time, a 3D model is captured using a 3D model capturing tool, wherein the 3D model comprises the 3 dimensional object. The 2D image and the 3D model are provided to the processor as an input. Noise is removed from the 2D image and the 3D model using a plurality of noise filters. In the next step, the image of the 3 dimensional object is segmented from the filtered 2D image. A first point and a second point are selected on the 2D image of the 3 dimensional object, wherein the first point and the second point are the endpoints of a section of the 3 dimensional object which needs to be inspected. In the next step, a Euclidean distance on the 2D image is calculated using the selected 2D points, the camera's internal calibration matrices and a known perpendicular distance from the camera centre towards the 3 dimensional object. In the next step, the 3D model of the 3 dimensional object is segmented from the captured 3D model. The Euclidean distance is measured between the first point and the second point on the 3D model of the 3 dimensional object. In the next step, the Euclidean distance calculated on the 2D image of the 3 dimensional object is compared with the Euclidean distance measured from the 3D model of the 3 dimensional object. Finally, the 2D point selection is iteratively refined based on the comparison to achieve a predefined accuracy level.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0010] Fig. 1 illustrates a block diagram of a system for inspection of a 3D object, in accordance with an embodiment of the present disclosure;
[0011] Fig. 2 illustrates a set up for the implementation of the system for inspection of the 3D object, in accordance with an embodiment of the present disclosure; and
[0012] Fig. 3 is a flowchart illustrating the steps involved in the inspection of the 3D object, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE INVENTION
[0013] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0014] Referring now to the drawings, and more particularly to FIG. 1, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[0015] According to an embodiment of the disclosure, a system 100 for the inspection of a 3 dimensional object (3D object) is shown in the block diagram of Fig. 1. The system 100 is configured to inspect the quality of a 3 dimensional object in real time. The 3D object can be any object, such as a parallelepiped object used in the assembly line of a manufacturing plant. The system 100 can preferably be used in the automobile industry, though it should be appreciated that the use of the system 100 in any other industry is well within the scope of this disclosure. The system 100 provides an automated, real time, vision based method to inspect the quality of the 3D object.
[0016] According to an embodiment of the disclosure, the system 100 comprises a camera 102, a 3D model capturing tool 104, a user interface 106, a plurality of noise filters 108, a memory 110 and a processor 112. The memory 110 is in communication with the processor 112. The processor 112 further includes a plurality of modules such as a segmentation module 114, a rectification module 116, a conversion module 118, etc. The plurality of modules generally include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. The plurality of modules described herein may be implemented as software modules that may be executed in a cloud-based computing environment of the system 100. In another embodiment, the processor 112 may also include other modules performing various other functions of the system 100.
[0017] According to an embodiment of the disclosure, the camera 102 is configured to capture the 2D image of the 3D object. The camera 102 used here is a pre-calibrated camera. The 3D model capturing tool 104 is configured to capture the 3D model of the 3D object. The camera 102 and the 3D model capturing tool 104 should be paired together such that the 2D image and the 3D model are captured at the same time, and the two should be placed close to each other. In an embodiment, the camera 102 is placed right on top of the 3D model capturing tool 104, as shown in the setup of Fig. 2. It should be appreciated that the system 100 can use any standard RGB camera as the camera 102 for capturing the 2D image. It should also be appreciated that the system 100 can use any one of a Kinect sensor, an LSD-SLAM based reconstruction, or a multi-view 3D reconstruction camera as the 3D model capturing tool 104. The use of any other 3D model capturing tool 104 is well within the scope of this disclosure.
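By way of illustration only, the following is a minimal sketch of the paired capture described above, assuming OpenCV (cv2) drives the RGB camera 102; `depth_sensor` and its `capture()` method are hypothetical placeholders for whichever 3D model capturing tool 104 (Kinect, LSD-SLAM, multi-view reconstruction) is actually deployed.

```python
import cv2

def capture_pair(rgb_device_index, depth_sensor):
    """Capture the 2D image and the 3D model at (approximately) the same time,
    as described in paragraph [0017]. `depth_sensor` is a hypothetical wrapper
    around the 3D model capturing tool 104."""
    camera = cv2.VideoCapture(rgb_device_index)
    try:
        ok, image_2d = camera.read()       # 2D image from the camera 102
        if not ok:
            raise RuntimeError("RGB capture failed")
        model_3d = depth_sensor.capture()  # hypothetical call to the 3D tool 104
        return image_2d, model_3d
    finally:
        camera.release()
```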
[0018] According to an embodiment of the disclosure, the user interface 106 is configured to be used by a user to select a first point and a second point on the 3D object. In an example, the first point and the second point are chosen such that they are corner points of the 3D object and cover the whole object for the comparison. The user interface 106 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The user interface 106 may allow the system 100 to interact with the user directly or through the processor 112 and other devices in the system 100. Further,
the user interface 106 may enable the system 100 to communicate with other computing devices, such as web servers and external data servers (not shown). The user interface 106 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite.
[0019] The 2D image and the 3D model generally include a lot of noise, such as environmental noise, high frequency noise, Gaussian noise, etc. The 2D image and the 3D model may also include discontinuities. The noise and the discontinuities are removed by the plurality of noise filters 108. In an embodiment, a smoothing function is used to smooth the 3D model. In another embodiment, the system 100 may use a Gaussian filter, a bilateral filter, a smoothing filter, filters to remove discontinuity points, etc. for removing various types of noise.
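A minimal sketch of the 2D filtering stage described in paragraph [0019], assuming OpenCV is available; the particular filter sequence and kernel parameters are illustrative assumptions, not values prescribed by this disclosure.

```python
import cv2

def denoise_2d_image(image):
    """Remove high frequency and Gaussian-like noise from the captured 2D image.

    Filter types follow paragraph [0019]; kernel sizes are illustrative.
    """
    # Gaussian filter suppresses high frequency sensor noise.
    smoothed = cv2.GaussianBlur(image, (5, 5), 1.0)
    # Bilateral filter smooths surfaces while preserving object edges,
    # which matters for the later corner-point selection.
    return cv2.bilateralFilter(smoothed, 9, 75, 75)
```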
[0020] According to an embodiment of the disclosure, the system 100 also includes the segmentation module 114. The segmentation module 114 is configured to segment the 2D image and the 3D model. Normally the 2D image includes many things in the background which are of no use in the quality inspection of the object. The same is true of the 3D model, which may include many unwanted 3D objects in the background. The use of any existing algorithm for segmentation is well within the scope of this disclosure; the segmentation algorithm can be selected based on its running time. In an embodiment, the segmentation module 114 may use edge detection and some trivial geometrical shape detection techniques to find the pose and extract the required dimensions.
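A minimal sketch of the 2D segmentation step, again assuming OpenCV; Canny edge detection with a largest-contour heuristic stands in for the "edge detection and trivial geometrical shape detection" mentioned above, and the thresholds are assumptions.

```python
import cv2
import numpy as np

def segment_object_2d(filtered_image):
    """Isolate the inspected object from the background of the filtered 2D image.

    Edge detection plus a largest-contour heuristic; thresholds are illustrative.
    """
    gray = cv2.cvtColor(filtered_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return filtered_image  # nothing detected; fall back to the full frame
    # Assume the inspected object is the largest foreground contour.
    largest = max(contours, key=cv2.contourArea)
    mask = np.zeros_like(gray)
    cv2.drawContours(mask, [largest], -1, 255, thickness=cv2.FILLED)
    return cv2.bitwise_and(filtered_image, filtered_image, mask=mask)
```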
[0021] According to an embodiment of the disclosure, the first point and the second point are selected by the user on both the 2D image of the 3D object and the 3D model of the 3D object. The first point and the second point are the endpoints of a section of the 3D object which needs to be inspected. The processor 112 further includes the conversion module 118. The conversion module 118 is configured to calculate a Euclidean distance using the selected 2D points, the camera's internal calibration matrices and a known perpendicular distance between the centre point of the camera 102 and the 3D object. At the same time, the processor 112 is also configured to measure the Euclidean distance between the first point and the second point on the 3D model of the 3 dimensional object.
[0022] According to an embodiment of the disclosure, the processor 112 is also configured to compare the Euclidean distance obtained after conversion on the 2D image of the 3D object with the Euclidean distance measured from the 3D model of the 3D object. If the Euclidean distances on the 2D image and the 3D model are the same, it can be concluded that the object under inspection is of good quality. If they are not the same, the processor 112 iteratively refines the 2D point selection to achieve a predefined accuracy level, which can be decided by the user. It should be appreciated that the system 100 is also configured to identify the 3D object as defective if the difference between the Euclidean distance measured on the 2D image and the Euclidean distance measured from the 3D model of the 3D object is more than a predefined value.
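The defect decision described in paragraph [0022] reduces to a tolerance check on the two distances; a sketch, with `tolerance` standing in for the predefined value chosen by the user:

```python
def is_defective(dist_2d, dist_3d, tolerance):
    """Flag the 3D object as defective when the 2D- and 3D-derived Euclidean
    distances disagree by more than a predefined value (paragraph [0022])."""
    return abs(dist_2d - dist_3d) > tolerance
```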
[0023] According to another embodiment of the disclosure, the processor 112 also includes the rectification module 116. The rectification module 116 is configured to rectify an error measured during the inspection of the 3D object, using the proper point selection method.
[0024] In operation, according to an embodiment of the disclosure, a flowchart 200 illustrating the steps involved in the inspection of a 3 dimensional (3D) object is shown in Fig. 3. Initially, at step 202, a 2D image is captured using the camera 102. The 2D image normally comprises an image of the 3 dimensional object and other background objects. At step 204, the 3D model is captured using the 3D model capturing tool 104. The 3D model comprises the 3 dimensional object and any other objects present in the background. It should be appreciated that steps 202 and 204 are performed simultaneously to avoid variations between the captured 2D image and 3D model of the 3D object. Further, at steps 206 and 208, noise is removed from the 2D image and the 3D model respectively using the plurality of noise filters 108.
[0025] In the next step 210, the image of the 3 dimensional object is segmented from the filtered 2D image using the segmentation module 114 present in the processor 112. In the next step 212, the first point and the second point are selected on the 2D image of the 3 dimensional
object. The first point and the second point are selected by the user using the user interface 106. The first point and the second point are the endpoints of a section of the 3 dimensional object which needs to be inspected by the user. At step 214, the selected 2D points are converted into a Euclidean distance by the conversion module 118. The conversion module 118 uses the camera's internal calibration matrices and a known perpendicular distance from the centre of the camera 102 towards the 3 dimensional object. At step 216, the 3D model of the 3 dimensional object is segmented from the captured 3D model by the segmentation module 114 present in the processor 112. At step 218, the Euclidean distance is measured between the first point and the second point on the 3D model of the 3 dimensional object.
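A minimal sketch of the 3D measurement of step 218, assuming the segmented 3D model is available as an N x 3 NumPy point cloud; snapping the selected endpoints to their nearest model vertices is an assumed mapping strategy, not one mandated by the disclosure.

```python
import numpy as np

def euclidean_distance_3d(points, first_point, second_point):
    """Distance between the two selected endpoints on the segmented 3D model
    (step 218). `points` is an (N, 3) point cloud of the 3D model."""
    def nearest_vertex(p):
        # Snap the selected point to the closest vertex of the model.
        return points[np.argmin(np.linalg.norm(points - p, axis=1))]

    v1 = nearest_vertex(np.asarray(first_point, dtype=float))
    v2 = nearest_vertex(np.asarray(second_point, dtype=float))
    return float(np.linalg.norm(v1 - v2))
```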
[0026] In the next step 220, the Euclidean distance obtained from the 2D image of the 3 dimensional object is compared with the Euclidean distance measured from the 3D model of the 3 dimensional object. If the two distances are the same, it is concluded that the 3D object under consideration is of high quality. If they are not the same, then at step 222 the 2D point selection is iteratively refined to achieve a predefined accuracy level set by the user.
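A minimal sketch of the refinement loop of step 222, echoing claim 3: nearby pixel pairs around the selected points are tried, and the pair whose 2D-derived distance best matches the 3D measurement is kept. The search radius and stopping rule are assumptions, and `euclidean_distance_2d()` is the sketch given earlier.

```python
import itertools

def refine_2d_points(p1, p2, K, z, dist_3d, accuracy, radius=2):
    """Iteratively refine the 2D point selection (step 222): search a small
    pixel neighbourhood for the pair whose 2D distance is closest to the 3D
    measurement; `accuracy` is the user's predefined accuracy level."""
    offsets = range(-radius, radius + 1)
    best = (p1, p2, abs(euclidean_distance_2d(p1, p2, K, z) - dist_3d))
    for du1, dv1, du2, dv2 in itertools.product(offsets, repeat=4):
        c1 = (p1[0] + du1, p1[1] + dv1)
        c2 = (p2[0] + du2, p2[1] + dv2)
        err = abs(euclidean_distance_2d(c1, c2, K, z) - dist_3d)
        if err < best[2]:
            best = (c1, c2, err)
            if best[2] <= accuracy:
                break
    return best[:2]
```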
[0027] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims. The embodiments thus provide a system and method for automatic quality inspection of 3 dimensional objects.
[0028] It is, however, to be understood that the scope of the protection extends to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed,
including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[0029] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[0030] The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
[0031] A data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
[0032] Input/output (I/O) devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modems and Ethernet cards are just a few of the currently available types of network adapters.
[0033] A representative hardware environment for practicing the embodiments may include a hardware configuration of an information handling/computer system in accordance with the embodiments herein. The system herein comprises at least one processor or central processing unit (CPU). The CPUs are interconnected via system bus to various devices such as a random access memory (RAM), read-only memory (ROM), and an input/output (I/O) adapter. The I/O adapter can connect to peripheral devices, such as disk units and tape drives, or other program storage devices that are readable by the system. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments herein.
[0034] The system further includes a user interface adapter that connects a keyboard, mouse, speaker, microphone, and/or other user interface devices such as a touch screen device (not shown) to the bus to gather user input. Additionally, a communication adapter connects the bus to a data processing network, and a display adapter connects the bus to a display device which may be embodied as an output device such as a monitor, printer, or transmitter, for example.
[0035] The preceding description has been presented with reference to various embodiments. Persons having ordinary skill in the art and technology to which this application pertains will appreciate that alterations and changes in the described structures and methods of operation can be practiced without meaningfully departing from the principle, spirit and scope.
I/We Claim:
1. A method for inspection of a 3 dimensional object, the method comprising:
capturing a 2D image using a camera (102), wherein the 2D image comprises an image of the 3 dimensional object;
capturing a 3D model using a 3D model capturing tool (104), wherein the 3D model comprises the 3 dimensional object;
receiving, by a processor (112), the 2D image and the 3D model as an input;
removing, by the processor (112), noise from the 2D image and the 3D model using a plurality of noise filters (108);
segmenting, by the processor (112), the image of the 3 dimensional object from the filtered 2D image;
selecting, by the processor (112), a first point and a second point on the image of the 3 dimensional object, wherein the first point and the second point are the endpoints of a section of the 3 dimensional object which needs to be inspected;
calculating, by the processor (112), a Euclidean distance on the 2D image using the selected points, the camera's internal calibration matrices and a known perpendicular distance from the camera centre towards the 3 dimensional object;
segmenting, by the processor (112), the 3D model of the 3 dimensional object from the captured 3D model;
measuring, by the processor (112), the Euclidean distance between the first point and the second point on the 3D model of the 3 dimensional object;
comparing, by the processor (112), the Euclidean distance calculated on the 2D image of the 3 dimensional object and the Euclidean distance measured from the 3D model of the 3 dimensional object; and
iteratively refining, by the processor (112), the 2D point selection based on the comparison to achieve a predefined accuracy level.
2. The method of claim 1 further comprises identifying the 3D object as defective if the difference between the Euclidean distance calculated on the 2D image and the Euclidean distance measured from the 3D model of the 3D object is more than a predefined value.
3. The method of claim 1 further comprising finalizing the length by choosing the best available 2D points on the image which give a Euclidean measurement closest to the measurement from the 3D model.
4. The method of claim 1 further comprising the step of rectifying an error using proper point selection.
5. The method of claim 1, wherein the first point and the second point are the corner points in the 3 dimensional object.
6. A system for inspection of a 3 dimensional object, the system comprising:
a camera (102) for capturing a 2D image, wherein the 2D image comprises an image of the 3 dimensional object;
a 3D model capturing tool (104) for capturing a 3D model, wherein the 3D model comprises the 3 dimensional object;
a plurality of noise filters (108) for removing noise from the 2D image and the 3D model;
a memory (110); and
a processor (112) in communication with the memory (110), wherein the processor is configured to:
receive the 2D image and the 3D model as an input;
segment the image of the 3 dimensional object from the filtered 2D image using a segmentation module (114);
select a first point and a second point on the image of the 3 dimensional object using a user interface (106), wherein the first point and the second point are the endpoints of a section of the 3 dimensional object which needs to be inspected;
calculate a Euclidean distance on the 2D image using the selected 2D points, the camera calibration and a known perpendicular distance from the camera centre towards the 3 dimensional object, using a conversion module (118);
segment the 3D model of the 3 dimensional object from the captured 3D model using the segmentation module;
measure the Euclidean distance between the first point and the second point on the 3D model of the 3 dimensional object; and
compare the Euclidean distance calculated on the 2D image with the Euclidean distance measured from the 3D model and iteratively refine the 2D point selection based on the comparison to achieve a predefined accuracy level.
7. The system of claim 6, wherein the camera is a calibrated camera.
8. The system of claim 7, wherein the 3D object is a parallelepiped object.