
System And Method For An Automated Optical Inspection On Assembly Production Line

Abstract: This disclosure relates to a system and method for an automated optical inspection on an assembly production line through image analytics. The system performs automatic inspection of components on a conveyor belt of an assembly line. The system identifies a three-level hierarchy of defects based on an archived reference sample and other defect metadata at the sub-component level. The system uses a first array of cameras and a second array of cameras for identifying a component as a whole and for capturing different portions of the component with high pixel density for defect analysis. A CAD dimension mapping with the captured images specifies the distance of every sub-component location from an origin; an oriented FAST and rotated BRIEF (ORB) technique is applied for sharp features on components, and a correlation-based template matching for texture features of the components. [To be published with FIG. 3]


Patent Information

Application #:
Filing Date: 09 January 2020
Publication Number: 23/2022
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-06-03
Renewal Date:

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point Mumbai 400021 Maharashtra, India

Inventors

1. DAS, Apurba
Tata Consultancy Services Limited Salarpuria GR Tech Park, Dhara Block, Whitefield Bangalore 560066 Karnataka, India
2. GAURIAR, Anshuman Santosh
Tata Consultancy Services Limited Salarpuria GR Tech Park, Dhara Block, Whitefield Bangalore 560066 Karnataka, India
3. BOGIREDDY, Lahari
Tata Consultancy Services Limited Salarpuria GR Tech Park, Dhara Block, Whitefield Bangalore 560066 Karnataka, India
4. BHOLE, Nikhil
Tata Consultancy Services Limited Salarpuria GR Tech Park, Dhara Block, Whitefield Bangalore 560066 Karnataka, India
5. ZAIDI, Syed Sahil Abbas
Tata Consultancy Services Limited Salarpuria GR Tech Park, Dhara Block, Whitefield Bangalore 560066 Karnataka, India
6. VELUVOLU, Akhil
Tata Consultancy Services Limited Salarpuria GR Tech Park, Dhara Block, Whitefield Bangalore 560066 Karnataka, India

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION (See Section 10 and Rule 13)
Title of invention:
SYSTEM AND METHOD FOR AN AUTOMATED OPTICAL
INSPECTION ON ASSEMBLY PRODUCTION LINE
Applicant
Tata Consultancy Services Limited
A company incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
Preamble to the description
The following specification particularly describes the invention and the manner in which it is to be performed.

TECHNICAL FIELD [001] The disclosure herein generally relates to the field of automated optical inspection of products on an assembly production line and, more particularly, to a system and method for an automated optical inspection of products on an assembly production line through image analytics.
BACKGROUND [002] Automated optical inspection (AOI) is a key technique used in the manufacture and test of electronic printed circuit boards (PCBs). AOI enables fast and accurate inspection of electronic assemblies, and in particular PCBs, to ensure that the quality of product leaving the production line is high and that the items are built correctly and without manufacturing faults. Despite the major improvements that have been made, modern circuits are far more complicated than boards were even a few years ago. The introduction of surface mount technology, and the subsequent further reductions in size, means that boards are particularly compact. Even relatively average boards have thousands of soldered joints, and these are where the majority of problems are found.
[003] The existing automated optical inspection (AOI) systems are designed mostly for the PCB manufacturing and automotive manufacturing industries. These systems cannot be extended to other kinds of manufacturing with different kinds of conveyor belts. The entire conveyor belt is not monitored, and the inventory is not maintained without manual intervention.
SUMMARY [004] Embodiments of the present disclosure provide technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems.
[005] In one aspect, a processor-implemented method for an automated optical inspection on an assembly production line through image analytics is provided.

[006] The method comprises one or more steps as follows. Herein, a first array of a plurality of cameras is positioned to capture a video of one or more components on a running assembly line. It is to be noted that the plurality of cameras of the first array is characterized by a wide field of view (FOV). The one or more components captured in the video are tagged with a predefined identity to maintain an inventory record. A boundary of each of the one or more tagged components is extracted by creating a motion mask and using a Gaussian mixture model along with a Lab color mask. It would be appreciated that the PCB or an electrical panel may be of a contrasting color compared to the conveyor belt, and this color is segmented using the Lab color mask.
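The motion-mask step above can be sketched as follows. The specification names a Gaussian mixture model; the sketch below is a simplified stand-in that keeps a single Gaussian per pixel (a deployed system would maintain a full mixture, e.g. OpenCV's MOG2 background subtractor). The class name, learning rate, and threshold are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

class GaussianBackgroundModel:
    """Minimal per-pixel single-Gaussian background model for grayscale
    frames -- a simplified stand-in for the Gaussian mixture model named
    in the specification."""

    def __init__(self, first_frame, learning_rate=0.05, threshold=2.5):
        self.mean = first_frame.astype(np.float64)
        self.var = np.full(first_frame.shape, 15.0 ** 2)
        self.lr = learning_rate
        self.k = threshold  # Mahalanobis-distance cut-off

    def apply(self, frame):
        """Return a boolean motion mask and update the background model."""
        f = frame.astype(np.float64)
        d2 = (f - self.mean) ** 2
        mask = d2 > (self.k ** 2) * self.var       # foreground pixels
        bg = ~mask                                 # update only static pixels
        self.mean[bg] += self.lr * (f[bg] - self.mean[bg])
        self.var[bg] += self.lr * (d2[bg] - self.var[bg])
        return mask
```

Pixels deviating from the learned background by more than `threshold` standard deviations are marked as motion; the Lab color mask described above would then be combined with this mask to isolate the component.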
[007] Further, a second array of a plurality of cameras is positioned to capture one or more images of each section of the one or more components. At least one subject of interest of each of the one or more tagged components is located by localizing the one or more captured images based on the extracted boundary of each of the one or more tagged components. The located subject of interest is mapped with a predefined CAD dimension of each of the one or more components. It would be appreciated that a starting point of each of the one or more components is correlated with a starting point of the predefined CAD dimension for mapping. Further, the one or more mapped components are verified to identify at least one defect using an oriented FAST and rotated BRIEF (ORB) technique for a sharp feature. The sharp feature includes a logo or text in the case of a PCB integrated circuit and, in the case of electrical panel fabrication, the placement of the correct relay contactor on the one or more components.
[008] In another aspect, a system is configured for an automated optical inspection on an assembly production line through image analytics. The system comprises a first array of a plurality of cameras and a second array of a plurality of cameras, wherein the plurality of cameras of the first array is characterized by a wide field of view and the plurality of cameras of the second array is characterized by a narrow field of view.

[009] Further, the system comprises at least one memory storing a plurality of instructions and one or more hardware processors communicably coupled with the at least one memory. The one or more hardware processors are configured to execute one or more modules comprising a tagging module, an extraction module, a localization module, a mapping module, and a verification module.
[010] In one embodiment, the first array of the plurality of cameras is configured to capture a video of one or more components on a running assembly line. The tagging module of the system is configured to tag each of the one or more components in the captured video with a predefined identity to maintain an inventory record. The extraction module of the system is configured to extract a boundary of each of the one or more tagged components by creating a motion mask using a Gaussian mixture model along with the Lab color mask. Further, the second array of the plurality of cameras is positioned to capture one or more images of each section of the one or more tagged components. The localization module of the system is configured to locate at least one subject of interest of each of the one or more tagged components based on the extracted boundary of each of the one or more tagged components. The mapping module of the system is configured to map the located subject of interest with a predefined CAD dimension of each of the one or more components. It would be appreciated that a starting point of each of the one or more components is correlated with a starting point of the predefined CAD dimension. The verification module of the system is configured to verify each of the one or more mapped components to identify at least one defect using an oriented FAST and rotated BRIEF (ORB) technique for a sharp feature on the one or more components and a correlation-based template matching for a texture feature of the one or more components.
[011] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.

BRIEF DESCRIPTION OF THE DRAWINGS [012] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
[013] FIG. 1 illustrates a system for an automated optical inspection on an assembly production line through image analytics, in accordance with some embodiments of the present disclosure.
[014] FIG. 2 is a flow diagram illustrating a method for an automated optical inspection on an assembly production line through image analytics, in accordance with some embodiments of the present disclosure.
[015] FIG. 3 is a functional block diagram of the system for an automated optical inspection on an assembly production line through image analytics, in accordance with some embodiments of the present disclosure.
[016] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems and devices embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes, which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION OF EMBODIMENTS [017] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the following claims.
[018] The embodiments herein provide a method and a system for an automated optical inspection on an assembly production line through image analytics. The system performs automatic inspection of one or more components on a conveyor belt of an assembly line. The system uses a first array of cameras and a second array of cameras for identifying a component as a whole and for capturing different portions of the component with high pixel density for defect analysis purposes. It is to be noted that the system herein identifies a three-level hierarchy of defects based on an archived reference sample and other defect metadata at the sub-component level. At the component level, the defects include a missing sub-component, a rotational error, a skew error, a wrongly placed component, and a shifted component. The system is configured to verify that each sub-component of the one or more components is in its predefined location without any shift, using a predefined similarity score and an acceptable alignment in position from a CAD mapping. Further, to verify an orthogonal defect, a tile of the captured one or more images is rotated and a similarity score is measured. To verify a skew defect, the template is skewed at a predefined angle and then the similarity score is measured along with the predefined skew angle. A missing sub-component is found using a Lab color space comparison with the reference image.
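The per-sub-component verification described above can be summarised as a decision over precomputed measurements. The sketch below is a hypothetical illustration of that decision logic only; the function name, score names, thresholds, and the omission of the skew check are all assumptions, not the disclosed implementation.

```python
def classify_defect(present, similarity, best_rotation_deg,
                    position_error_mm, sim_ok=0.8, pos_tol_mm=1.0):
    """Map precomputed measurements for one sub-component to a defect label.

    present           -- Lab color comparison found the sub-component at all
    similarity        -- best template-matching similarity score in [0, 1]
    best_rotation_deg -- rotation of the tile that maximised the similarity
    position_error_mm -- offset from the CAD-mapped location

    The skew check (skewed template + similarity) is omitted for brevity.
    """
    if not present:
        return "missing sub-component"
    if similarity < sim_ok:
        return "wrong component placed"
    if best_rotation_deg % 360 != 0:
        return "rotational (orthogonal) defect"
    if position_error_mm > pos_tol_mm:
        return "shifted component"
    return "ok"
```

A correct but rotated part scores well only at a non-zero rotation, so it is distinguished from a genuinely wrong part, which scores poorly at every rotation.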
[019] Referring now to the drawings, and more particularly to FIG. 1 through FIG. 3, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.

[020] FIG. 1 illustrates a system (100) for an automated optical inspection on an assembly production line through image analytics. In the preferred embodiment, the system (100) comprises a first array of a plurality of cameras (102) and a second array of a plurality of cameras (104). Further, the system comprises at least one memory (106) with a plurality of instructions and one or more hardware processors (108) which are communicably coupled with the at least one memory (106) to execute modules therein.
[021] The hardware processor (108) may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuits, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the hardware processor (108) is configured to fetch and execute computer-readable instructions stored in the memory (106). Further, the system comprises a tagging module (110), an extraction module (112), a localization module (114), a mapping module (116) and a verification module (118).
[022] In the preferred embodiment of the disclosure, the first array of the plurality of cameras (102) of the system (100) is positioned to capture a video of one or more components on a running assembly line. Herein, the first array of the plurality of cameras is characterized by a wide field of view (FOV). The wide FOV of the first array of the plurality of cameras is used in tracking the one or more components on the running assembly line.
[023] In the preferred embodiment of the disclosure, the tagging module (110) of the system (100) is configured to tag each of the one or more components in the captured video with a predefined identity to maintain an inventory record. The tracking of the tagged one or more components is done to maintain the inventory record and to avoid duplicate counting of the same component.
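The tagging-and-tracking step can be sketched as a nearest-centroid association across frames, so a component detected in many frames keeps one identity and is counted once. This is an illustrative simplification; the class name, distance threshold, and greedy one-to-one matching are assumptions, not the disclosed tracker (a deployed system could also exploit the known belt speed).

```python
import math

class ComponentTagger:
    """Assign a persistent identity tag to each component seen by the
    wide-FOV array, so the inventory count is not inflated by repeated
    detections of the same component."""

    def __init__(self, max_jump=50.0):
        self.max_jump = max_jump   # max centroid movement between frames (px)
        self.tracks = {}           # tag -> last known centroid
        self._next_id = 0

    def update(self, centroids):
        """Return the identity tag for every detected centroid in a frame."""
        tags = []
        for c in centroids:
            best, best_d = None, self.max_jump
            for tid, prev in self.tracks.items():
                d = math.dist(c, prev)
                if d < best_d:
                    best, best_d = tid, d
            if best is None:               # unseen component: issue a new tag
                best = self._next_id
                self._next_id += 1
            self.tracks[best] = c
            tags.append(best)
        return tags
```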
[024] In the preferred embodiment of the disclosure, the extraction module (112) of the system (100) is configured to extract a boundary of each of the one or more tagged components by creating a motion mask using a Gaussian mixture model along with the Lab color mask. Herein, the Lab color mask uses the Lab color space to perform color-based segmentation of an object of interest. It would be appreciated that the PCB or an electrical panel may be of a contrasting color compared to the conveyor belt, and this color is segmented using the Lab color mask.
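The Lab color mask can be sketched as a standard sRGB-to-CIELAB conversion followed by a color-distance threshold. The conversion below uses the usual D65 formulas; the reference color and the tolerance are illustrative assumptions (in practice, e.g. OpenCV's `cvtColor` would do the conversion).

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert an sRGB uint8 image (H, W, 3) to CIE L*a*b* (D65 white)."""
    c = rgb.astype(np.float64) / 255.0
    lin = np.where(c > 0.04045, ((c + 0.055) / 1.055) ** 2.4, c / 12.92)
    m = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = lin @ m.T / np.array([0.95047, 1.0, 1.08883])  # D65 white point
    f = np.where(xyz > 0.008856, np.cbrt(xyz), 7.787 * xyz + 16.0 / 116.0)
    L = 116.0 * f[..., 1] - 16.0
    a = 500.0 * (f[..., 0] - f[..., 1])
    b = 200.0 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def lab_color_mask(rgb, reference_rgb, tol=25.0):
    """Boolean mask of pixels within `tol` (Euclidean Lab distance) of a
    reference color -- e.g. the board color against the belt color."""
    lab = srgb_to_lab(rgb)
    ref = srgb_to_lab(np.uint8(reference_rgb).reshape(1, 1, 3))[0, 0]
    return np.linalg.norm(lab - ref, axis=-1) < tol
```

Because Lab separates lightness (L) from chromaticity (a, b), a distance threshold in this space tracks perceived color difference better than one in RGB, which is why it suits segmenting a contrasting board from the belt.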
[025] In the preferred embodiment of the disclosure, the second array of the plurality of cameras (104) of the system (100) is positioned to capture one or more images of each section of the one or more tagged components, wherein the second array of the plurality of cameras is characterized by a narrow field of view (FOV). Further, the second array of the plurality of cameras is configured to capture a reference image. It is to be noted that the reference image is used for offline calibration; the capture process for each of the one or more components is exactly the same online and offline. Each of the captured one or more images of each section of the one or more tagged components is placed as a tile while mapping for each sub-component.
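The tiling of the narrow-FOV views can be sketched as follows; a sub-component near a view boundary is then reported by two adjacent tiles. The uniform stride and the overlap geometry are illustrative assumptions, not calibrated values from the disclosure.

```python
def tiles_covering(x_mm, component_width_mm, n_cameras, overlap_mm=10.0):
    """Indices of the narrow-FOV camera tiles whose view contains the point
    x_mm along the component. Adjacent views overlap slightly, so a point on
    a tile boundary is covered by two tiles."""
    stride = component_width_mm / n_cameras
    tiles = []
    for i in range(n_cameras):
        start = i * stride - overlap_mm / 2.0
        end = (i + 1) * stride + overlap_mm / 2.0
        if start <= x_mm < end:
            tiles.append(i)
    return tiles
```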
[026] In one example, a component is an entire PCB or an electrical panel box, and the sub-components of the PCB are resistors, transistors, capacitors, etc. The electrical panel box comprises sub-components such as relay contactors, PLCs, circuit breakers, etc. The second array of cameras sub-divides the PCB across the camera views, and multiple sub-components appear in each view. A sub-component at a boundary will be visible in the common section of two adjacent cameras.
[027] In one aspect, each sub-component is divided into one of two categories: a high-feature sub-component or a low-feature sub-component. A high-feature sub-component has written text or a logo, whereas a low-feature sub-component does not have any text or logo written on it; it has only color and line markings. Rotation-based color template matching correlation values are used for finding defects. The rotation-based template matching identifies components that are correct components but are wrongly placed in a rotated manner.
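The rotation-based template matching for low-feature sub-components can be sketched with orthogonal (90-degree) rotations of the template and a correlation score per rotation. Zero-normalised cross-correlation is used here as the correlation measure and square tiles are assumed; both choices are illustrative, not taken from the disclosure.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalised cross-correlation of two equal-sized tiles, in [-1, 1]."""
    a = a.astype(np.float64) - a.mean()
    b = b.astype(np.float64) - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_orthogonal_rotation(template, tile):
    """Try the template at 0/90/180/270 degrees against the captured tile
    (square tiles assumed) and return (best_angle, best_score). A non-zero
    best angle with a high score indicates a correct component placed in a
    rotated manner."""
    scores = {90 * k: zncc(np.rot90(template, k), tile) for k in range(4)}
    angle = max(scores, key=scores.get)
    return angle, scores[angle]
```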

[028] In the preferred embodiment of the disclosure, the localization module (114) of the system (100) is configured to localize one or more captured images to locate at least one subject of interest of each of the one or more tagged components based on the extracted boundary of each of the one or more tagged components.
[029] In the preferred embodiment of the disclosure, the at least one mapping module (116) of the system (100) is configured to map the located subject of interest with a predefined CAD dimension of each of the one or more components, wherein a starting point of each of the one or more components is correlated with a starting point of the predefined CAD dimension. Herein, a distance of each sub-component of each of the one or more components from an origin is specified for mapping each of the one or more components with the predefined CAD dimension.
[030] In one aspect, the CAD dimension information of each of the one or more components is predefined and provided to the system through an offline calibration. The CAD dimension is used to locate a precise location of the subject of interest of each sub-component of the one or more components. It would be appreciated that the one or more captured images from the second array of the plurality of cameras are tiled in a physical space next to each other and the same are mapped with the CAD dimension.
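The CAD mapping can be sketched as a coordinate transform from the calibrated CAD origin into pixel space, plus an alignment-tolerance check against the detected location. The field names, units, and tolerance below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class CadEntry:
    """Offline-calibrated CAD record: a sub-component's distance from the
    component origin, in millimetres (field names are illustrative)."""
    name: str
    x_mm: float
    y_mm: float

def locate_in_image(entry, origin_px, mm_per_px):
    """Map a CAD position to pixel coordinates once the component's starting
    point has been correlated with the CAD origin."""
    return (origin_px[0] + entry.x_mm / mm_per_px,
            origin_px[1] + entry.y_mm / mm_per_px)

def within_tolerance(expected_px, found_px, tol_px=5.0):
    """Accept the sub-component placement if the detected location is within
    an alignment tolerance of its CAD-mapped position."""
    dx = expected_px[0] - found_px[0]
    dy = expected_px[1] - found_px[1]
    return (dx * dx + dy * dy) ** 0.5 <= tol_px
```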
[031] In the preferred embodiment of the disclosure, the verification module (118) of the system (100) is configured to verify each of the one or more mapped components to identify at least one defect using an oriented FAST and rotated BRIEF (ORB) technique for a sharp feature on the one or more components and a correlation-based template matching for a texture feature of the one or more components.
[032] In one example, a printed circuit board (PCB) is taken under inspection. The PCB contains one or more components comprising an integrated circuit (IC), capacitors, resistors, transistors, and diodes. Herein, the one or more components are divided into high-feature components and low-feature components. The IC and the capacitors have written text or a logo on them, and they are defined as high-feature components. The ORB features can be used on high-feature components for identifying the kind of defect, such as a skew defect, an orthogonal rotation defect, or a wrong-component defect. Further, the low-feature components, such as resistors, transistors, and diodes, do not have any written text or logo on them; their only features are color and line markings. For the low-feature components, the color template matching correlation values are used to identify defects.
[033] In another example, in electrical panel fabrication, the placement of the correct relay contactors, PLCs, instrumentation meters, and circuit breakers is verified: the ORB features can be used for sharp features on the one or more components, and a correlation-based template matching for texture features.
[034] Referring now to FIG. 3, a processor-implemented method (300) for an automated optical inspection on an assembly production line through image analytics is described. The method comprises one or more steps as follows.
[035] Initially, at the step (302), a first array of a plurality of cameras is positioned to capture a video of one or more components on a running assembly line, wherein the plurality of cameras is characterized by a wide field of view (FOV).
[036] In the preferred embodiment of the disclosure, at the next step (304), each of the one or more components in the captured video is tagged with a predefined identity to maintain an inventory record.
[037] In the preferred embodiment of the disclosure, at the next step (306), a boundary of each of the one or more tagged components is extracted by creating a motion mask using a Gaussian mixture model along with the Lab color mask.
[038] In the preferred embodiment of the disclosure, at the next step (308), a second array of a plurality of cameras is positioned to capture one or more images of each section of the one or more tagged components, wherein the second array of the plurality of cameras is characterized by a narrow field of view (FOV).
[039] In the preferred embodiment of the disclosure, at the next step (310), localizing one or more captured images to locate at least one subject of interest of each of the one or more tagged components based on the extracted boundary of each of the one or more tagged components.
[040] In the preferred embodiment of the disclosure, at the next step (312), the located subject of interest is mapped with a predefined CAD dimension of each of the one or more components, wherein a starting point of each of the one or more components is correlated with a starting point of the predefined CAD dimension.
[041] In the preferred embodiment of the disclosure, at the last step (314), each of the one or more mapped components is verified to identify at least one defect using an oriented FAST and rotated BRIEF (ORB) technique for a sharp feature on the one or more components and a correlation-based template matching for a texture feature of the one or more components.
[042] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[043] The embodiments of the present disclosure herein address the unresolved problem associated with existing automated optical inspection (AOI) systems, which are designed mostly for the PCB manufacturing and automotive manufacturing industries. These existing systems cannot be extended to other kinds of manufacturing with different kinds of conveyor belts. The entire conveyor belt is not monitored, and the inventory is not maintained without manual intervention.
[044] It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device, which can be programmed including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software processing components located therein. Thus, the means can include both hardware means, and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[045] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various components described herein may be implemented in other components or combinations of other components. For the purposes of this description, a computer-usable or computer-readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[046] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development would change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope of the disclosed embodiments. Also, the words "comprising," "having," "containing," and "including," and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise.
[047] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term "computer-readable medium" should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[048] It is intended that the disclosure and examples be considered as exemplary only, with a true scope of disclosed embodiments being indicated by the following claims.

We Claim
1. A processor-implemented method comprising:
positioning a first array of a plurality of cameras to capture a video of one or more components on a running assembly line, wherein the plurality of cameras is characterized by a wide field of view (FOV);
tagging each of the one or more components in the captured video with a predefined identity to maintain an inventory record;
extracting a boundary of each of the one or more tagged components by creating a motion mask using a Gaussian mixture model along with a Lab color mask;
positioning a second array of the plurality of cameras to capture one or more images of a plurality of sections of the one or more tagged components, wherein the second array of the plurality of cameras is characterized by a narrow field of view (FOV);
localizing one or more captured images to locate at least one subject of interest of each of the one or more tagged components based on the extracted boundary of each of the one or more tagged components;
mapping the located subject of interest with a predefined CAD dimension of each of the one or more components, wherein a starting point of each of the one or more components is correlated with a starting point of the predefined CAD dimension; and
verifying each of the one or more mapped components to identify at least one defect using an oriented FAST and rotated BRIEF (ORB) technique for a sharp feature on the one or more components and a correlation-based template matching for a texture feature of the one or more components.
2. The processor-implemented method claimed in claim 1, further
comprising:
capturing a reference image of each of the one or more components using the second array of the plurality of cameras; and
specifying a distance of each sub-component of each of the one or more components from an origin for mapping each of the one or more components with the predefined CAD dimension.
3. The processor-implemented method claimed in claim 1, wherein the first array of the plurality of cameras is arranged to identify and tag the one or more components.
4. The processor-implemented method claimed in claim 1, wherein the second array of the plurality of cameras is used to identify one or more sub-components in each of the tagged one or more components.
5. The processor-implemented method claimed in claim 1, further comprising detecting a missing component by comparing the Lab color space of the one or more captured images with the reference image.
6. A system comprising:
at least one memory storing a plurality of instructions;
one or more hardware processors communicably coupled with the at least one memory, wherein the one or more hardware processors are configured to execute one or more modules;

a first array of a plurality of cameras configured to capture a video of one or more components on a running assembly line, wherein the plurality of cameras is characterized by a wide field of view (FOV);
a tagging module configured to tag each of the one or more components in the captured video with a predefined identity to maintain an inventory record;
an extraction module configured to extract a boundary of each of the one or more tagged components by creating a motion mask using a Gaussian mixture model along with a Lab color mask;
a second array of a plurality of cameras configured to capture one or more images of each section of the one or more tagged components, wherein the second array of the plurality of cameras is characterized by a narrow field of view (FOV);
a localization module configured to localize one or more captured images to locate at least one subject of interest of each of the one or more tagged components based on the extracted boundary of each of the one or more tagged components;
a mapping module configured to map the located subject of interest with a predefined CAD dimension of each of the one or more components, wherein a starting point of each of the one or more components is correlated with a starting point of the predefined CAD dimension; and
a verification module configured to verify each of the one or more mapped components to identify at least one defect using an oriented FAST and rotated BRIEF (ORB) technique for a sharp feature on the one or more components and a correlation-based template matching for a texture feature of the one or more components.

7. The system claimed in claim 6, wherein a reference image of each of the one or more components is captured using the second array of the plurality of cameras.
8. The system claimed in claim 6, wherein a distance of each sub-component of each of the one or more components from an origin is specified for mapping each of the one or more components with the predefined CAD dimension.
9. The system claimed in claim 6, wherein the first array of the plurality of cameras is arranged to identify and tag the one or more components.
10. The system claimed in claim 6, wherein the second array of the plurality of cameras is used to identify one or more sub-components in each of the tagged one or more components.
11. The system claimed in claim 6, wherein at least one missing sub-component from the identified one or more sub-components is detected by comparing the Lab color space of the one or more captured images with the reference image.

Documents

Application Documents

# Name Date
1 202021001025-STATEMENT OF UNDERTAKING (FORM 3) [09-01-2020(online)].pdf 2020-01-09
2 202021001025-REQUEST FOR EXAMINATION (FORM-18) [09-01-2020(online)].pdf 2020-01-09
3 202021001025-FORM 18 [09-01-2020(online)].pdf 2020-01-09
4 202021001025-FORM 1 [09-01-2020(online)].pdf 2020-01-09
5 202021001025-FIGURE OF ABSTRACT [09-01-2020(online)].jpg 2020-01-09
6 202021001025-DRAWINGS [09-01-2020(online)].pdf 2020-01-09
7 202021001025-DECLARATION OF INVENTORSHIP (FORM 5) [09-01-2020(online)].pdf 2020-01-09
8 202021001025-COMPLETE SPECIFICATION [09-01-2020(online)].pdf 2020-01-09
9 Abstract1.jpg 2020-01-13
10 202021001025-Proof of Right [17-06-2020(online)].pdf 2020-06-17
11 202021001025-FER.pdf 2022-09-07
12 202021001025-OTHERS [11-11-2022(online)].pdf 2022-11-11
13 202021001025-FER_SER_REPLY [11-11-2022(online)].pdf 2022-11-11
14 202021001025-COMPLETE SPECIFICATION [11-11-2022(online)].pdf 2022-11-11
15 202021001025-CLAIMS [11-11-2022(online)].pdf 2022-11-11
16 202021001025-US(14)-HearingNotice-(HearingDate-02-05-2024).pdf 2024-04-22
17 202021001025-FORM-26 [25-04-2024(online)].pdf 2024-04-25
18 202021001025-Correspondence to notify the Controller [25-04-2024(online)].pdf 2024-04-25
19 202021001025-FORM-26 [02-05-2024(online)].pdf 2024-05-02
20 202021001025-Written submissions and relevant documents [13-05-2024(online)].pdf 2024-05-13
21 202021001025-PETITION UNDER RULE 137 [13-05-2024(online)].pdf 2024-05-13
22 202021001025-PatentCertificate03-06-2024.pdf 2024-06-03
23 202021001025-IntimationOfGrant03-06-2024.pdf 2024-06-03

Search Strategy

1 SearchHistoryE_05-09-2022.pdf

ERegister / Renewals

3rd: 07 Jun 2024 (From 09/01/2022 - To 09/01/2023)
4th: 07 Jun 2024 (From 09/01/2023 - To 09/01/2024)
5th: 07 Jun 2024 (From 09/01/2024 - To 09/01/2025)
6th: 19 Dec 2024 (From 09/01/2025 - To 09/01/2026)