Abstract: AN AUTOMATED INSPECTION SYSTEM AND METHOD THEREOF The present invention discloses an automated inspection system and method thereof for inspecting objects to verify and maintain a level of quality. The system (102) includes at least one computing device and a control engine. The computing device includes a capture module to capture images of an area and objects in the area. The control engine includes a memory, a processor, a processing module, an inspection module, a database, a comparator, and an external output module. The processing module processes the captured images. The inspection module inspects attributes of the processed images. The database stores data related to a user, pre-defined standards of objects, images of objects, sub-assemblies, and pre-defined quality and/or assembly requirements. The comparator compares the inspected attributes with the stored data and generates compared data. The external output module determines one or more parts of the objects based on the compared data and generates an output.
DESC:FORM 2
THE PATENTS ACT, 1970
(39 OF 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[SEE SECTION 10, RULE 13]
AN AUTOMATED INSPECTION SYSTEM AND METHOD THEREOF
V-LINK AUTOMOTIVE SERVICES PRIVATE LIMITED
A COMPANY REGISTERED IN INDIA UNDER THE COMPANIES ACT, 1956 HAVING ITS REGISTERED OFFICE AT:
F-17, 04TH FLOOR, PINNACLE BUSINESS PARK, MAHAKALI CAVES ROAD, SHANTI NAGAR, ANDHERI (EAST), MUMBAI - 400 093, MH, INDIA
THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.
TECHNICAL FIELD
[001] The present invention relates to inspection systems and methods, and more specifically to an automated inspection system and method.
BACKGROUND
[002] A large part of the success of the manufacturing industry depends on its quality assurance and control function. Frequent measurement and inspection are undertaken to ensure that manufactured parts, or part quality attributes, conform to design specifications. The product and part quality attributes include the presence or absence of child parts such as plastic/metal sub-parts, screws, rivets, clips, washers, felt, and the like. Another element includes visual inspection of the surface for defects such as scratches or damage, and paint or plating defects such as seeds, pin holes, flow marks, and the like.
[003] However, there are various problems associated with both the manual and automatic approaches. First, in the case of manual inspection, human error makes visual inspection of products or of product part quality less than completely effective. Such manual inspection is also relatively slow, and is thus a relatively costly aspect of the manufacturing process. Furthermore, the pass-or-fail criteria of manual inspection do not generally provide any numeric dimensional data that would otherwise be useful for process control.
[004] To improve the effectiveness of inspections and to reduce manufacturing costs, automated inspection systems have been provided. Some automated inspection systems utilize machine vision, for example a camera, to inspect objects, particularly small component parts of products. Such systems typically employ optical equipment that receives light reflected from objects during inspection. Other traditional automated inspection systems include a plurality of sensors for various parts, working on electromagnetic, limit, or colour means, among others. These sensors are then connected to circuits (using relays, timers, etc.), programmable logic controllers, or CNC machines to provide automated inspection methods. Although some success has been achieved with such systems, in the longer run such systems could not sustain the demands of automated inspection, mainly because they require a substantially high-capacity camera and high-capacity processors for signal processing and computing in order to organize even simple configurations.
[005] Hence, there is a need for an invention which solves the above-defined problems and provides a system that operates autonomously to conduct quality inspections of parts or components and to determine whether those parts or components are acceptable according to the desired standards.
OBJECTS
[006] Some of the objects of the present invention, which are aimed at ameliorating one or more problems of the prior art or at least providing a useful alternative, are listed herein below.
[007] An object of the present invention is to provide a computer implemented platform to process the captured images and determine the quality of the product.
[008] Another object of the present invention is to provide low-end hardware to capture the images of the product.
[009] Another object of the present invention is to supervise and inspect the quality of the product precisely.
[0010] Another object of the present invention is to inspect large components with a single standard camera.
[0011] Another object of the present invention is to provide a platform to determine the attributes and further to determine whether a product or part is acceptable as per the required standards.
[0012] Another object of the present invention is to automate the inspection system so that it may check an item against certain parameters, to make sure that the item is what it should be as per the specification requirements.
[0013] Other objects and advantages of the present invention will be more apparent from the following description when read in conjunction with the accompanying figures, which are not intended to limit the scope of the present invention.
SUMMARY
[0014] This summary is provided to introduce concepts related to providing an automated inspection system. This summary is neither intended to identify essential features of the present invention nor is it intended for use in determining or limiting the scope of the present invention.
[0015] For example, various embodiments herein may include one or more automated inspection systems and methods.
[0016] In one of the embodiments, the present invention discloses a method for inspecting a plurality of objects in an area connected with a computing environment having a memory and processor. The method includes a step of capturing a plurality of images of the area, and one or more objects of the area. The method includes a step of storing, in a database, data related to a user, pre-defined standards of one or more objects, images of a plurality of objects, sub-assemblies of the plurality of objects, and pre-defined quality and/or assembly requirements of the plurality of objects. The method includes a step of processing the captured images. The method includes a step of inspecting one or more attributes of the processed images. The method includes a step of comparing the inspected attributes with the stored data. The method includes a step of generating compared data based on the comparing of inspected attributes and the stored data. The method includes a step of determining one or more parts of the objects, based on the compared data. The method includes a step of generating an output based on the determined parts.
[0017] In another implementation, an automated inspection system includes at least one computing device and a control engine. The computing device is associated with a user, which includes a capture module. The capture module is configured to capture a plurality of images of an area, and one or more objects of the area. The control engine includes a memory, a processor, a processing module, an inspection module, a database, a comparator, and an external output module. The memory is configured to store pre-defined rules. The processor is configured to generate system processing commands. The processing module is configured to process the captured images. The inspection module is configured to inspect one or more attributes of the processed images. The database is configured to store data related to a user, pre-defined standards of one or more objects, images of a plurality of objects, sub-assemblies of the plurality of objects, and pre-defined quality and/or assembly requirements of the plurality of objects. The comparator is configured to compare the inspected attributes with the stored data, and generate compared data. The external output module is configured to determine one or more parts of the objects based on the compared data, and generate an output.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0018] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.
[0019] Figure 1 illustrates a schematic diagram depicting a computer implemented system with a quality control technique that verifies and maintains a desired level of quality in an existing part or component or product by continued inspection, according to an exemplary implementation of the present invention.
[0020] Figure 2 illustrates a block diagram depicting an automated inspection system, according to an exemplary implementation of the present invention.
[0021] Figures 3A and 3B illustrate an exemplary implementation of the industrial usage of the automated inspection system, according to an exemplary implementation of the present invention.
[0022] Figure 4 illustrates an exemplary implementation of the training mechanism of the parts or components, according to an exemplary implementation of the present invention.
[0023] Figure 5 illustrates an exemplary implementation of the inspection of visual defects on a part or a component, according to an exemplary implementation of the present invention.
[0024] Figure 6 illustrates an exemplary implementation of the three stages of calibration, according to an exemplary implementation of the present invention.
[0025] Figures 7A and 7B illustrate an exemplary implementation of the inspection output on the screen, according to an exemplary implementation of the present invention.
[0026] Figures 8A and 8B illustrate a flowchart of a method of a plurality of steps that leads to a decision of conformance to requirement of a product or sub-assembly with the use of a plurality of images captured by the camera module, according to an exemplary implementation of the present invention.
[0027] Figures 9A and 9B illustrate a flowchart of a method of a plurality of steps that leads to a decision of conformance to requirement of a product or sub-assembly with the use of a live video stream captured by the camera module, according to an exemplary implementation of the present invention.
[0028] Figure 10 illustrates a flowchart depicting a method for inspecting a plurality of objects in an area, according to an exemplary implementation of the present invention.
[0029] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present invention. Similarly, it will be appreciated that any flowcharts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0030] In the following description, for the purpose of explanation, specific details are set forth in order to provide an understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these details. One skilled in the art will recognize that embodiments of the present invention, some of which are described below, may be incorporated into a number of systems.
[0031] The various embodiments of the present invention provide an automated inspection system and method thereof.
[0032] Furthermore, connections between components and/or modules within the figures are not intended to be limited to direct connections. Rather, these components and modules may be modified, re-formatted or otherwise changed by intermediary components and modules.
[0033] References in the present invention to “one embodiment” or “an embodiment” mean that a particular feature, structure, characteristic, or function described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
[0034] In one of the embodiments, the present invention discloses a method for inspecting a plurality of objects in an area connected with a computing environment having a memory and processor. The method includes a step of capturing a plurality of images of the area, and one or more objects of the area. The method includes a step of storing, in a database, data related to a user, pre-defined standards of one or more objects, images of a plurality of objects, sub-assemblies of the plurality of objects, and pre-defined quality and/or assembly requirements of the plurality of objects. The method includes a step of processing the captured images. The method includes a step of inspecting one or more attributes of the processed images. The method includes a step of comparing the inspected attributes with the stored data. The method includes a step of generating compared data based on the comparing of inspected attributes and the stored data. The method includes a step of determining one or more parts of the objects, based on the compared data. The method includes a step of generating an output based on the determined parts.
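By way of a non-limiting illustration, the sequence of steps above may be sketched as a simple pipeline. All function names, the mean-intensity attribute, and the tolerance-based pass/fail rule below are illustrative assumptions for exposition only; the specification does not prescribe any particular attribute or comparison rule.

```python
# Illustrative sketch of the claimed pipeline: process the captured
# image, inspect attributes, compare against stored standards, and
# generate an output. Images are modelled as lists of grayscale rows.

def process(image):
    """Normalize a captured image by clamping pixel values to 0-255."""
    return [[min(255, max(0, px)) for px in row] for row in image]

def inspect(processed):
    """Extract a coarse attribute: the mean intensity of the image."""
    total = sum(px for row in processed for px in row)
    count = sum(len(row) for row in processed)
    return {"mean_intensity": total / count}

def compare(attributes, standards, tolerance=10):
    """Compare inspected attributes with the stored standards and
    generate compared data (per-attribute conformance flags)."""
    return {key: abs(attributes[key] - standards[key]) <= tolerance
            for key in standards}

def generate_output(compared):
    """Accept the part only if every compared attribute conforms."""
    return "PASS" if all(compared.values()) else "FAIL"

captured = [[120, 130], [125, 135]]
standards = {"mean_intensity": 128}
result = generate_output(compare(inspect(process(captured)), standards))
print(result)  # PASS: mean intensity 127.5 is within tolerance of 128
```

In a practical deployment the inspected attributes would be far richer (child-part presence per RoI, surface-defect measures), but the capture → process → inspect → compare → output flow is the same.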
[0035] In another implementation, the captured images are used to recognize the one or more parts of the objects.
[0036] In another implementation, the step of inspecting includes checking the one or more parts of the objects in the processed images.
[0037] In another implementation, the step of inspecting includes identifying one or more region of interest (RoI) of the processed images, based on the pre-defined quality and/or assembly requirements.
[0038] In another implementation, the method includes determining whether the region of interest is acceptable according to the stored standards, based on the compared data.
[0039] In another implementation, the method includes transmitting the output to a computing device.
[0040] In another implementation, the step of inspecting includes verifying and maintaining a level of quality in said objects, wherein the level of quality is pre-defined and stored in the database.
[0041] In another implementation, the method includes capturing a plurality of images of an empty area, one or more parts of the objects without child parts of the one or more parts, and one or more parts of the objects with child parts of the one or more parts.
[0042] In another implementation, the step of inspecting includes computing a minimum bounding template for inspecting the attributes of the objects from the processed images.
[0043] In another implementation, the method includes identifying a difference image based on at least one structural similarity index between the one or more parts and the child parts.
[0044] In another implementation, the step of identifying the image includes binarizing the identified image and identifying contours in the binarized image with a region larger than a threshold value.
[0045] In another implementation, the method includes identifying the location of the one or more regions of interest (RoI) in the processed images by using a template location and RoI properties.
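The mapping from a template location and RoI properties to absolute image coordinates may be sketched as follows. The RoI property format (offsets `dx`, `dy` and size `w`, `h` relative to the template origin) is an assumption made here for illustration; the specification does not prescribe a storage format.

```python
# Illustrative sketch: translate RoI rectangles, stored relative to
# the trained template, into absolute coordinates in the processed
# image, given where the template was located in that image.

def locate_rois(template_origin, roi_properties):
    """Return each RoI as an absolute rectangle in image coordinates."""
    tx, ty = template_origin
    return [
        {"name": roi["name"],
         "x": tx + roi["dx"],   # template origin plus stored offset
         "y": ty + roi["dy"],
         "w": roi["w"], "h": roi["h"]}
        for roi in roi_properties
    ]

# If the template is detected at (40, 25), an RoI trained at offset
# (10, 5) with size 8x6 maps to an absolute rectangle at (50, 30).
rois = locate_rois((40, 25),
                   [{"name": "screw_1", "dx": 10, "dy": 5, "w": 8, "h": 6}])
print(rois)  # [{'name': 'screw_1', 'x': 50, 'y': 30, 'w': 8, 'h': 6}]
```

Storing RoIs relative to the template rather than in absolute coordinates lets the same trained requirements be applied even when the part is placed at a slightly different position on the workstation.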
[0046] In another implementation, an automated inspection system includes at least one computing device and a control engine. The computing device is associated with a user, which includes a capture module. The capture module is configured to capture a plurality of images of an area, and one or more objects of the area. The control engine includes a memory, a processor, a processing module, an inspection module, a database, a comparator, and an external output module. The memory is configured to store pre-defined rules. The processor is configured to generate system processing commands. The processing module is configured to process the captured images. The inspection module is configured to inspect one or more attributes of the processed images. The database is configured to store data related to a user, pre-defined standards of one or more objects, images of a plurality of objects, sub-assemblies of the plurality of objects, and pre-defined quality and/or assembly requirements of the plurality of objects. The comparator is configured to compare the inspected attributes with the stored data, and generate compared data. The external output module is configured to determine one or more parts of the objects based on the compared data, and generate an output.
[0047] In another implementation, the inspection module includes a checking module. The checking module is configured to check the one or more parts of the objects in the processed images.
[0048] In another implementation, the inspection module includes an identifier. The identifier is configured to identify one or more region of interest (RoI) of the processed images, based on the pre-defined quality and/or assembly requirements.
[0049] In another implementation, the identifier is configured to identify whether the region of interest is acceptable according to the stored standards, based on the compared data.
[0050] In another implementation, the control engine includes a transmitter. The transmitter is configured to transmit the output to the computing device.
[0051] In another implementation, the inspection module is configured to verify and maintain a level of quality in the objects by inspecting the objects, wherein the level of quality is pre-defined and stored in the database.
[0052] In another implementation, the capture module is configured to capture a plurality of images of an empty area, one or more parts of said objects without child parts of the one or more parts, and one or more parts of the objects with child parts of the one or more parts.
[0053] In another implementation, the inspection module includes a computation module. The computation module is configured to compute a minimum bounding template for inspecting the attributes of the objects from the processed images.
[0054] In another implementation, the computation module is configured to identify a difference image based on at least one structural similarity index between the one or more parts and the child parts, and is further configured to binarize the identified image and identify contours in the binarized image with a region larger than a threshold value.
[0055] In another implementation, the identifier is configured to identify the location of the one or more region of interest (RoI) in the processed images by using a template location and RoI properties.
[0056] In another implementation, the control engine includes a cooling module. The cooling module is configured to perform cooling in the system. The cooling module includes a heat sink and a fan. The heat sink is made of a thermally conductive metal to carry heat away from the processor into fins. The fan is controlled by the processing module, and is turned ON if the processor temperature rises beyond an operating threshold.
[0057] In another implementation, the fan is configured to release the hot air around the heat sink outside the system (102).
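The fan control described above may be sketched as a simple threshold rule. The threshold value and the hysteresis band below are assumptions added for illustration (the band prevents rapid on/off cycling near the threshold); the specification states only that the fan turns ON when the processor temperature rises beyond an operating threshold.

```python
# Illustrative sketch of the cooling module's fan control, with a
# hypothetical hysteresis band around the operating threshold.

def fan_should_run(temp_c, currently_on,
                   threshold_c=70.0, hysteresis_c=5.0):
    """Return True if the fan should run for the given processor
    temperature (degrees Celsius)."""
    if temp_c >= threshold_c:
        return True   # threshold exceeded: always run
    if currently_on and temp_c > threshold_c - hysteresis_c:
        return True   # keep running until the processor cools off
    return False

print(fan_should_run(75.0, False))  # True: threshold exceeded
print(fan_should_run(67.0, True))   # True: still within hysteresis band
print(fan_should_run(60.0, True))   # False: cooled below the band
```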
[0058] Figure 1 illustrates a schematic diagram depicting a computer implemented system with a quality control technique that verifies and maintains a desired level of quality in an existing part or component or product by continued inspection, according to an exemplary implementation of the present invention.
[0059] The computer implemented system (100) includes an automated inspection system (102), a network (104), a plurality of computing devices (106a, 106b, 106c, 106d), and a database (108). The automated inspection system (102) (hereinafter referred as “system”), includes a memory (110), a processor (112), I/O interfaces (114), a control engine (116), a plurality of modules (120), and a plurality of data (138).
[0060] The network (104) interconnects the computing devices (106a, 106b, 106c, and 106d) and the database (108) with the system (102). The network (104) includes wired and wireless networks. Examples of the wired networks include a Wide Area Network (WAN), a Local Area Network (LAN), a client-server network, a peer-to-peer network, and so forth. Examples of the wireless networks include Wi-Fi, a Global System for Mobile communications (GSM) network, a General Packet Radio Service (GPRS) network, an enhanced data GSM environment (EDGE) network, 802.5 communication networks, Code Division Multiple Access (CDMA) networks, and Bluetooth networks.
[0061] The database (108) may be implemented as enterprise database, remote database, local database, and the like. The database (108) may be located within the system (102) or may be located at different geographic locations as compared to that of the system (102).
[0062] In an embodiment, the computing devices (106a, 106b, 106c, and 106d) are user devices, and each of the user devices (106a, 106b, 106c, and 106d) is associated with respective users to access the system (102). In one embodiment, the user devices may be mobile devices, personal computers, laptops, PDAs, and the like.
[0063] The memory (110) may be coupled to the processor (112). The memory (110) can include any computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0064] The system (102) includes one or more processors (112). The processor (112) may be implemented as one or more microprocessors, microcomputers, micro-controllers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor (112) is configured to fetch and execute computer-readable instructions stored in the memory (110).
[0065] The system (102) includes the I/O interfaces (114). The I/O interfaces (114) may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface (114) may allow the system (102) to interact with a user directly or through the user devices (106). Further, the I/O interface (114) may enable the system (102) to communicate with other user devices or computing devices (106), such as web servers. The I/O interface (114) may include one or more ports for connecting a number of devices to one another or to another server.
[0066] The system (102) includes the control engine (116). The control engine (116) is configured to control the system (100) by performing various functionalities using one or more modules (120). More specifically, the control engine (116) is configured to receive inputs from the one or more computing devices (106), perform functionalities using the received inputs, and generate an output.
[0067] The modules (120) include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules (120) include a capture module (124), a processing module (126), an inspection module (128), a comparator (130), an external output module (132), a cooling module (134), and other modules (136). The other modules (136) may include programs or coded instructions that supplement applications and functions of the system (102).
[0068] In the present implementation, the capture module (124) is configured to capture the plurality of images of the parts or components of the product with the help of a camera mounted on a computing device (106) such as a mobile phone (106a). The processing module (126) is configured to process the captured images. The inspection module (128) is configured to identify the region of interest (henceforth referred to as RoI) for each child part from the plurality of captured images. The comparator (130) is configured to compare inspected attributes with the pre-defined data stored in the database (108). The external output module (132) is configured to determine one or more parts or components of the product, and generate an output. The cooling module (134) is configured to perform cooling on the system (100).
[0069] In the present implementation, the data (138) may include various data pertaining to the operation of the processor (112), and other data (144). The data (138), amongst other things, may serve as a database for storing data that is processed, received, or generated as a result of the execution of one or more modules in the other module(s) (136). Further, the data (138) broadly includes a user data (140) and a system data (142). The user data (140) includes all the data pertaining to the user such as registering to receive remote notifications and the like. The system data (142) includes all other data except the user data (140). The user data (140) and the system data (142), both may be stored in the database (108).
[0070] Figure 2 illustrates a block diagram (200) depicting an automated inspection system, according to an exemplary implementation of the present invention.
[0071] The automated inspection system (102) includes a computing device (106), and a control engine (116).
[0072] The computing device (106) further includes a capture module (124). The capture module (124) is configured to capture a plurality of images of an area, and one or more objects of the area. In an embodiment, the capture module (124) can be a camera. In another embodiment, the capture module (124) is configured to capture the plurality of images of the parts or components of a product with the help of a camera mounted on a computing device (106) such as a mobile phone (106a). The camera is configured to connect to the processor (112), which has a good memory-buffering capacity. Further, a plurality of images is captured using the camera. This plurality of images includes an image of an empty workstation, an image of the part to be inspected on the workstation without child parts, an image of the part to be inspected with all the child parts in place, and the like. In an embodiment, the captured images are used to recognize the one or more parts of the objects.
[0073] The control engine (116) is configured to cooperate with the computing device (106) to receive captured images. The control engine (116) includes a memory (110), a processor (112), a database (108), a processing module (126), an inspection module (128), a comparator (130), and an external output module (132).
[0074] The memory (110) is configured to store pre-defined rules related to inspection, comparison, and rules related to input/output generation. The processor (112) is configured to cooperate with the memory (110) to receive the pre-defined rules, and is further configured to generate system processing commands.
[0075] The processing module (126) is configured to process the captured images received from the capture module (124) via the network (104).
[0076] The inspection module (128) is configured to cooperate with the processing module (126) to receive the processed images. The inspection module (128) is further configured to inspect one or more attributes of the processed image. The inspection module (128) is configured to verify and maintain a level of quality in the objects by inspecting the objects, wherein the level of quality is pre-defined and stored in the database (108).
[0077] In an embodiment, the inspection module (128) includes a checking module (202), a computation module (204), and an identifier (206).
[0078] The checking module (202) is configured to check one or more parts of the objects in the processed image. The computation module (204) is configured to compute a minimum bounding template for inspecting the attributes of the objects from the processed images. In an embodiment, the computation module (204) is further configured to identify a difference image based on at least one structural similarity index between the one or more parts and the child parts, and is further configured to binarize the identified image and identify contours in the binarized image with a region larger than a threshold value. The identifier (206) is configured to identify one or more regions of interest (RoI) of the processed images, based on the pre-defined quality and/or assembly requirements stored in the database. Further, the identifier (206) is configured to identify the location of the one or more regions of interest (RoI) in the processed images by using a template location and RoI properties. The identifier (206) is further configured to determine whether the region of interest (RoI) is acceptable according to the stored standards.
[0079] In an embodiment, the inspection module (128) is configured to identify the region of interest for each child part from the plurality of captured images, by computing the structural similarity index between the image of the part to be inspected on the workstation without child parts and the image of the part to be inspected with all the child parts in place. Further, the resulting difference image is binarized, and the contours in the binary image with an area larger than a threshold and a location within the template are identified. Moreover, each RoI's location and dimensions can be manually calibrated, along with the deletion or addition of a new RoI, in an I/O interface (114).
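A minimal sketch of this difference-binarize-extract step is given below. To keep the sketch self-contained, a plain per-pixel absolute difference stands in for the structural similarity index, and a 4-connected region search stands in for library-style contour extraction; the thresholds are hypothetical values, not taken from the specification.

```python
from collections import deque

def detect_child_part_regions(without_parts, with_parts,
                              diff_threshold=40, min_area=3):
    """Find candidate child-part regions: binarize the difference of
    the two calibration images, then keep connected regions whose
    pixel area exceeds min_area. Returns (x, y, w, h) rectangles."""
    h, w = len(without_parts), len(without_parts[0])
    # Binarize the difference image (absolute difference as a simple
    # stand-in for a structural-similarity difference map).
    binary = [[abs(with_parts[y][x] - without_parts[y][x]) > diff_threshold
               for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    regions = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] and not seen[sy][sx]:
                # Flood-fill one connected region of changed pixels.
                queue, pixels = deque([(sy, sx)]), []
                seen[sy][sx] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(pixels) >= min_area:  # discard noise-sized regions
                    ys = [p[0] for p in pixels]
                    xs = [p[1] for p in pixels]
                    regions.append((min(xs), min(ys),
                                    max(xs) - min(xs) + 1,
                                    max(ys) - min(ys) + 1))
    return regions
```

For example, comparing an empty fixture image against one where a 2x2 bright child part appears at column 1, row 1 yields the single region `(1, 1, 2, 2)`, which would then become a trained RoI.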
[0080] The database (108) is configured to store data related to a user, pre-defined standards of one or more objects, images of a plurality of objects, sub-assemblies of said plurality of objects, and pre-defined quality and/or assembly requirements of said plurality of objects.
[0081] The comparator (130) is configured to cooperate with the inspection module (128) and the database (108) to receive the inspected attributes and the stored data. The comparator (130) is configured to compare the inspected attributes with the stored data, and generate compared data.
[0082] The external output module (132) is configured to cooperate with the comparator (130) to receive the compared data. The external output module (132) is further configured to determine one or more parts of the objects based on the compared data, and generate an output.
[0083] The control engine (116) further includes a transmitter (118) and a cooling module (134). The transmitter (118) is configured to cooperate with the external output module (132), and is further configured to transmit the generated output to the computing device (106). The cooling module (134) is configured to perform cooling in the system (102). The cooling module (134) includes a heat sink and a fan (not shown in figures). The heat sink is configured to cooperate with the processor (112), and is made of a thermally conductive metal to carry heat away from the processor (112) into fins. The fan is controlled by the processing module (126), and is turned ON if the processor (112) temperature rises beyond an operating threshold. The fan is configured to release the hot air around the heat sink outside the system (102).
[0084] Figures 3A and 3B illustrate an exemplary implementation (300) of the industrial usage of the automated inspection system, according to an exemplary implementation of the present invention.
[0085] Figures 3A and 3B illustrate the industrial usage of the system (102) using processing equipment (302) assembled along with a camera (304) that is mounted on a frame (306). Further, the frame (306) is attached to the work table (308) on which the assembly process on an assembly fixture (310) is carried out. The final component/sub-assembly (312) is assembled on the fixture (310) by fixing the child parts (314) onto it. Furthermore, a light (L) provides enough illumination for the system (102) to work seamlessly. The inspection output is thus delivered on one or more computing devices (106), such as a desktop (106c) or a laptop (106d) or the like. A vacuum suction cup (316) is connected to the processing equipment (302) via an I/O interface (114). When the final component/sub-assembly (312) is detected by the processing equipment (302), the processing equipment (302) switches the vacuum (316) on, thereby locking the final component/sub-assembly (312) to the fixture (310). After all the child parts (314) are assembled, the processing equipment (302) switches off the vacuum (316), thus releasing the final component/sub-assembly (312) from the fixture (310).
[0086] Figure 4 illustrates an exemplary implementation of the training mechanism (400) of the parts or components, according to an exemplary implementation of the present invention.
[0087] Figure 4 illustrates the training mechanism of a part (402), where a region of interest (RoI) has been identified based on the quality/assembly requirements of the parts or components. For each requirement, a region of interest (RoI) is identified, which can be adjusted depending on the component and the relative distance between assembly objects to optimize detection accuracy. The RoI of one part (404) demonstrates the presence of a child part in the required location, while the RoI of another part (406) demonstrates a defect, as the child part has not been assembled.
[0088] Figure 5 illustrates exemplary implementation of the inspection of the visual defects on a part or a component (500), according to an exemplary implementation of the present invention.
[0089] Figure 5 illustrates that visual defects on a part or component can also be inspected. The visual defects may include defects such as a scratch (502), a seed on a paint surface (504) and the like. Further, many other such visual defects found on aesthetic parts produced with a usable surface finish, or post-processed by painting, coating and the like, can also be inspected.
[0090] Figure 6 illustrates an exemplary implementation of the three stages of calibration (600), according to an exemplary implementation of the present invention.
[0091] Figure 6 demonstrates the three stages of calibration. A first step involves capturing the base table in the absence of any part (602). A second step shows a part that is placed on the table, and the images are recorded for calibration, forming the next stage of calibration (604). A third step shows that all the child parts are assembled, and a complete assembly stage of calibration (606) is carried out.
[0092] Figures 7A and 7B illustrate an exemplary implementation of the inspection output on the screen (700), according to an exemplary implementation of the present invention.
[0093] Figure 7A shows the assembled part on the jig for which the inspection is carried out by the system (102) and the result is displayed on a computing device such as a desktop (106c) or a laptop (106d) or the like. Further, the assembled part is ergonomically placed at the work station to minimize the physical effort and maximize the efficiency.
[0094] Figure 7B shows the missing or defective elements; hence, in case of an error, the user is immediately alerted about the missing parts. Additionally, the hardware used in the present invention allows the selection of the parts only when the assembly of the parts is done in a correct manner.
[0095] Figures 8A and 8B illustrate a flowchart (800) of a method of a plurality of steps that leads to a decision of conformance to requirement of a product or sub-assembly with the use of a plurality of images captured by the camera module, according to an exemplary implementation of the present invention.
[0096] At step (802), capturing an image. In an embodiment, a capture module (124) is configured to capture an image. The image is captured with a predefined resolution and brightness using the computing device (106), such as the mobile phone (106a), upon a button press.
[0097] At step (804), identifying a part template location. In an embodiment, an identifier (206) is configured to identify a part template location. The location of the part template in the captured image is identified using a template matching algorithm. In an embodiment, the template matching algorithm includes a deep neural network, large deformation diffeomorphic metric mapping (LDDMM), and the like.
[0098] At step (806), identifying the relative location of each Region of Interest (RoI). In an embodiment, the identifier (206) is configured to identify the relative location of each Region of Interest (RoI). In an embodiment, the relative location of each RoI is found using the template's location and RoI properties. Further, the mean and standard deviation for each of the BGR (Blue, Green, Red) color channels of the RoI in the image are computed.
[0099] At step (808), computing a three-dimensional (3D) histogram. In an embodiment, a computation module (204) is configured to compute the three-dimensional (3D) histogram. The 3D histogram for each RoI in the image is calculated.
[00100] At step (810), computing the Bhattacharya distance between the new histogram and the stored histograms. In an embodiment, the computation module (204) is configured to compute the Bhattacharya distance between the new histogram and the stored histograms. In an embodiment, the Bhattacharya distance is calculated between the 3D histogram of the current image and the 3D histogram of each image of the part to be inspected on the workstation without child parts (which may be referred to as case X), and between the 3D histogram of the current image and that of the image of the part to be inspected with all the child parts in place (which may be referred to as case Y). In one embodiment, the histograms are stored in a database (108).
[00101] At step (812), determining whether the difference between the distance without the part and the distance with the part is greater than a first threshold value (th1). In an embodiment, a computation module (204) is configured to identify an image having a difference between at least one structural similarity index of the one or more parts and the child parts. If the difference between the distance without the part and the distance with the part is greater than the first threshold value (th1), a child part is present, as shown in (814); else, determining whether the difference is greater than a second threshold value (th2), as shown in (816). In an embodiment, at step (814), it is determined that if the distance in case X is significantly lower than in case Y, then the child part in the RoI is termed as absent (818).
[00102] Further, if the distances in both cases X and Y are close, then a further comparison is done to conclude the presence of the child part. If the difference between the distance without the part and the distance with the part is greater than the second threshold value (th2), a child part is absent, as shown in (818); else, the sum of pixels in the adaptive threshold of the grayscale RoI is computed, as shown in (820). The sum of the pixels is calculated between the adaptive threshold of the grayscale RoI and the image of the part to be inspected on the workstation without child parts, and between the adaptive threshold of the grayscale RoI and the image of the part to be inspected with all the child parts in place.
[00103] At step (820), if the sum of pixels in the adaptive threshold of grayscale ROI is significantly closer to the sum of pixels in the adaptive threshold of grayscale image of the part to be inspected with all the child parts in place, then the child part in ROI is termed as present (822).
[00104] At step (824), if the sum of pixels in the adaptive threshold of grayscale ROI is significantly closer to the sum of pixels in the adaptive threshold of grayscale image of the part to be inspected on the workstation without child parts, then the child part in ROI is termed as absent (826).
[00105] At step (828), computing the histogram intersection, Chi-square distance and correlation between the new histogram and the stored histograms (with and without child parts). In an embodiment, the computation module (204) is configured to compute the histogram intersection, Chi-square distance and correlation between the new histogram and the stored histograms (with and without child parts).
[00106] At step (830), assigning equal weight to each method (with and without child parts). In an embodiment, the computation module (204) is configured to assign equal weight to each method. In an embodiment, each method is used to decide a temporary presence or absence of child part and equal weightage is given to each method.
[00107] At step (832), if the weight of the part to be inspected with all the child parts in place is significantly lower than the weight of the part to be inspected on the workstation without child parts then the child part is absent (834).
[00108] At step (836), if the weight of the part to be inspected with all the child parts in place is significantly greater than the weight of the part to be inspected on the workstation without child parts then the child part is present.
[00109] At step (838), if all the child parts in the RoIs are found to be present, the display is updated with the appropriate signal suggesting a successful assembly (842).
[00110] At step (840), if all the child parts in the RoIs are not found to be present, an indication is provided for the missing child part along with a sign for an overall failed assembly.
[00111] Figures 9A and 9B illustrate a flowchart of a method of a plurality of steps that leads to a decision of conformance to requirement of a product or sub-assembly with the use of a live video stream captured by the camera module, according to an exemplary implementation of the present invention.
[00112] At step (902), capturing a live stream. In an embodiment, a capture module (124) is configured to capture a live video stream. In an embodiment, the live stream is captured and processed with a predefined resolution and brightness using the computing device, such as the mobile phone (106a).
[00113] At step (904), matching a template of each frame of stream. In an embodiment, a comparator (130) is configured to match a template of each frame of stream.
[00114] At step (906), determining whether the template matching is greater than a threshold (th0). If the template matching is less than th0, the method repeats step (904); else, a relative location of each RoI in the image is identified, and a signal is transmitted to a relay for inspection initiation, as shown in (908). In an embodiment, the computation module (204) is configured to determine whether the template matching is greater than the threshold (th0). In another embodiment, the identifier (206) is configured to identify the relative location of each RoI in the image. In yet another embodiment, a transmitter (118) is configured to transmit the signal to the relay for inspection initiation. In an embodiment, the template matching is done, and if the matching value is above the threshold for n consecutive frames, then the part is detected, a signal is sent to the relay for handling locking of the part, the display is updated with an assembly-in-progress cue, and the template matching is stopped. The relative location of each RoI is found in the image using the template's location and RoI properties, and a signal is sent to the relay for starting the inspection. Further, the mean and standard deviation for each of the BGR color channels of the RoI in the frame are computed.
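The n-consecutive-frames rule described above could be sketched as a small state holder. The class name and default values are illustrative assumptions:

```python
class ConsecutiveFrameDetector:
    """Declare the part detected only after the template-matching score
    exceeds th0 for n frames in a row, filtering out one-frame flickers."""

    def __init__(self, th0=0.8, n=5):
        self.th0 = th0
        self.n = n
        self.streak = 0

    def update(self, match_score):
        """Feed one frame's matching score; return True once the part is detected."""
        if match_score >= self.th0:
            self.streak += 1
        else:
            self.streak = 0  # any weak frame resets the streak
        return self.streak >= self.n
```

Once `update` returns True, the relay-locking signal is sent and template matching can stop for the rest of the inspection cycle.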
[00115] At step (910), computing a three-dimensional (3D) histogram. In an embodiment, a computation module (204) is configured to compute the three-dimensional (3D) histogram. The 3D histogram for each RoI in the image is calculated.
[00116] At step (912), computing the Bhattacharya distance between the new histogram and the stored histograms. In an embodiment, the computation module (204) is configured to compute the Bhattacharya distance between the new histogram and the stored histograms. In an embodiment, the Bhattacharya distance is calculated between the 3D histogram of the current image and the 3D histogram of each image of the part to be inspected on the workstation without child parts (which may be referred to as case X), and between the 3D histogram of the current image and that of the image of the part to be inspected with all the child parts in place (which may be referred to as case Y). In one embodiment, the histograms are stored in a database (108).
[00117] At step (914), determining whether the difference between the distance without the part and the distance with the part is greater than a first threshold value (th1). In an embodiment, a computation module (204) is configured to identify an image having a difference between at least one structural similarity index of the one or more parts and the child parts. If the difference between the distance without the part and the distance with the part is greater than the first threshold value (th1), a child part is present, as shown in (916); else, determining whether the difference is greater than a second threshold value (th2), as shown in (918). If the difference is greater than the second threshold value (th2), a child part is absent, as shown in (920); else, the sum of pixels in the adaptive threshold of the grayscale RoI is computed, as shown in (922). In an embodiment, at step (922), the sum of the pixels is calculated between the adaptive threshold of the grayscale RoI and the image of the part to be inspected on the workstation without child parts, and between the adaptive threshold of the grayscale RoI and the image of the part to be inspected with all the child parts in place.
[00118] At step (924), if the sum of pixels in the adaptive threshold of the grayscale RoI is significantly closer to the sum of pixels in the adaptive threshold of the grayscale image of the part to be inspected with all the child parts in place, then the child part in the RoI is termed as present (926).
[00119] At step (928), if the sum of pixels in the adaptive threshold of grayscale ROI is significantly closer to the sum of pixels in the adaptive threshold of grayscale image of the part to be inspected on the workstation without child parts, then the child part in ROI is termed as absent (929).
[00120] At step (930), computing the histogram intersection, Chi-square distance and correlation between the new histogram and the stored histograms (with and without child parts). In an embodiment, the computation module (204) is configured to compute the histogram intersection, Chi-square distance and correlation between the new histogram and the stored histograms (with and without child parts).
[00121] At step (932), assigning equal weight to each method (with and without child parts). In an embodiment, the computation module (204) is configured to assign equal weight to each method. In an embodiment, each method is used to decide a temporary presence or absence of child part and equal weightage is given to each method.
[00122] At step (934), if the weight of the part to be inspected with all the child parts in place is significantly lower than the weight of the part to be inspected on the workstation without child parts then the child part is absent (936).
[00123] At step (938), if the weight of the part to be inspected with all the child parts in place is significantly greater than the weight of the part to be inspected on the workstation without child parts then the child part is present.
[00124] At step (940), if all the child parts in the RoIs are found to be present, the display is updated with the appropriate signal suggesting a successful assembly. Further, the relay is provided with a signal to stop, indicating that the inspection is over. If any of the child parts in the RoIs are found to be absent, the method returns to step (910).
[00125] Figure 10 illustrates a flowchart depicting a method for inspecting a plurality of objects in an area, according to an exemplary implementation of the present invention.
[00126] The flowchart (1000) starts at a step (1002), capturing a plurality of images of an area, and one or more objects of the area. In an embodiment, a capture module (124) is configured to capture a plurality of images of an area, and one or more objects of the area.
[00127] At step (1004), storing, in a database (108), data related to a user, pre-defined standards of one or more objects, images of a plurality of objects, sub-assemblies of said plurality of objects, and pre-defined quality and/or assembly requirements of said plurality of objects.
[00128] At step (1006), processing the captured images. In an embodiment, a processing module (126) is configured to process the captured images.
[00129] At step (1008), inspecting one or more attributes of the processed images. In an embodiment, an inspection module (128) is configured to inspect one or more attributes of the processed images.
[00130] At step (1010), comparing the inspected attributes with the stored data. In an embodiment, a comparator (130) is configured to compare the inspected attributes with the stored data.
[00131] At step (1012), generating compared data based on the comparison of the inspected attributes and the stored data. In an embodiment, the comparator (130) is configured to generate the compared data based on the comparison of the inspected attributes and the stored data.
[00132] At step (1014), determining one or more parts of the objects, based on the compared data. In an embodiment, an external output module (132) is configured to determine one or more parts of the objects based on the compared data.
[00133] At step (1016), generating an output based on the determined parts. In an embodiment, an external output module (132) is configured to generate an output based on the determined parts.
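The overall flow of steps (1002) to (1016) can be summarized as a simple orchestration sketch. The stage callables below are hypothetical stand-ins for the capture module (124), processing module (126), inspection module (128), comparator (130) and external output module (132):

```python
def run_inspection(capture, process, inspect_attrs, compare, decide):
    """Chain the inspection stages in the order of the flowchart (1000)."""
    image = capture()                 # step (1002): capture images of the area
    processed = process(image)        # step (1006): process the captured images
    attrs = inspect_attrs(processed)  # step (1008): inspect attributes
    compared = compare(attrs)         # steps (1010)-(1012): compare and generate compared data
    return decide(compared)           # steps (1014)-(1016): determine parts, generate output
```

Each stage is kept as an injected callable so the same pipeline shape covers both the single-image flow (Figures 8A and 8B) and the live-stream flow (Figures 9A and 9B).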
[00134] It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all examples recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor(s) to furthering the art and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
CLAIMS:
1. A method for inspecting a plurality of objects in an area connected with a computing environment having a memory (110) and a processor (112), said method comprising:
capturing a plurality of images of said area, and one or more objects of said area;
storing, in a database (108), data related to a user, pre-defined standards of one or more objects, images of a plurality of objects, sub-assemblies of said plurality of objects, and pre-defined quality and/or assembly requirements of said plurality of objects;
processing said captured images;
inspecting one or more attributes of said processed images;
comparing said inspected attributes with said stored data;
generating compared data based on said comparing of inspected attributes and said stored data;
determining one or more parts of said objects, based on said compared data, and
generating an output based on said determined parts.
2. The method as claimed in claim 1, wherein said captured images are used to recognize said one or more parts of said objects.
3. The method as claimed in claim 1 or 2, wherein said step of inspecting includes checking said one or more parts of said objects in said processed images.
4. The method as claimed in claim 1, wherein said step of inspecting includes identifying one or more region of interest (RoI) of said processed images, based on said pre-defined quality and/or assembly requirements.
5. The method as claimed in claim 1 or 4, wherein said method includes identifying whether said region of interest is acceptable according to said stored standards, based on said compared data.
6. The method as claimed in claim 1, wherein said method includes transmitting said output to a computing device (106).
7. The method as claimed in claim 1, wherein said step of inspecting includes verifying and maintaining a level of quality in said objects, wherein the level of quality is pre-defined and stored in the database (108).
8. The method as claimed in claim 1, wherein said method includes capturing a plurality of images of an empty area, one or more parts of said objects without child parts of said one or more parts, and one or more parts of said objects with child parts of said one or more parts.
9. The method as claimed in claim 1, wherein said step of inspecting includes computing a minimum bounding template for inspecting said attributes of said objects from said processed images.
10. The method as claimed in claim 1 or 9, wherein said method includes identifying an image having difference between at least one structural similarity index of said one or more parts and said child parts.
11. The method as claimed in claim 10, wherein said step of identifying said image includes binarizing said identified image and detecting contours in said binarized image with a region larger than a threshold value.
12. The method as claimed in claim 4, wherein said location of said one or more region of interest (RoI) in said processed images is identified by using a template location and RoI properties.
13. An automated inspection system (102) comprising:
at least one computing device (106a, 106b, 106c, 106d) associated with a user, said computing device (106a, 106b, 106c, 106d) including:
a capture module (124) configured to capture a plurality of images of an area, and one or more objects of said area; and
a control engine (116) configured to cooperate with said computing device (106a, 106b, 106c, 106d), said control engine (116) comprising:
a memory (110) configured to store pre-defined rules;
a processor (112) configured to cooperate with said memory (110) to receive said pre-defined rules, said processor (112) configured to generate system processing commands;
a processing module (126) configured to process said captured images;
an inspection module (128) configured to cooperate with said processing module (126) to receive said processed images, said inspection module (128) configured to inspect one or more attributes of said processed images;
a database (108) configured to store data related to a user, pre-defined standards of one or more objects, images of a plurality of objects, sub-assemblies of said plurality of objects, and pre-defined quality and/or assembly requirements of said plurality of objects;
a comparator (130) configured to cooperate with said inspection module (128) and said database (108) to receive said inspected attributes and stored data, said comparator (130) configured to compare said inspected attributes with said stored data, and generate compared data; and
an external output module (132) configured to cooperate with said comparator (130) to receive said compared data, said external output module (132) configured to determine one or more parts of said objects based on said compared data, and generate an output.
14. The system (102) as claimed in claim 13, wherein said inspection module (128) includes a checking module (202) configured to check said one or more parts of said objects in said processed images.
15. The system (102) as claimed in claim 13, wherein said inspection module (128) includes an identifier (206), said identifier (206) is configured to identify one or more region of interest (RoI) of said processed images, based on said pre-defined quality and/or assembly requirements.
16. The system (102) as claimed in claim 13 or 15, wherein said identifier (206) is configured to identify whether said region of interest is acceptable according to said stored standards, based on said compared data.
17. The system (102) as claimed in claim 13, wherein said control engine (116) includes a transmitter (118), said transmitter (118) is configured to cooperate with said external output module (132), and further configured to transmit said output to said computing device (106).
18. The system (102) as claimed in claim 13, wherein said inspection module (128) is configured to verify and maintain a level of quality in said objects by inspecting said objects, wherein the level of quality is pre-defined and stored in the database (108).
19. The system (102) as claimed in claim 13, wherein said capture module (124) is configured to capture a plurality of images of an empty area, one or more parts of said objects without child parts of said one or more parts, and one or more parts of said objects with child parts of said one or more parts.
20. The system (102) as claimed in claim 13, wherein said inspection module (128) includes a computation module (204), said computation module (204) is configured to compute a minimum bounding template for inspecting said attributes of said objects from said processed images.
21. The system (102) as claimed in claim 20, wherein said computation module (204) is configured to identify an image having a difference between at least one structural similarity index of said one or more parts and said child parts, and further configured to binarize said identified image and detect contours in said binarized image with a region larger than a threshold value.
22. The system (102) as claimed in claim 15, wherein said identifier (206) is configured to identify said location of said one or more region of interest (RoI) in said processed images by using a template location and RoI properties.
23. The system (102) as claimed in claim 13, wherein said control engine (116) includes a cooling module (134), said cooling module (134) is configured to perform cooling in said system (102).
24. The system (102) as claimed in claim 23, wherein said cooling module (134) includes a heat sink and a fan.
25. The system (102) as claimed in claim 24, wherein said heat sink is configured to cooperate with said processor (112), and is made of thermally conductive metal to carry heat away from said processor (112) into fins.
26. The system (102) as claimed in claim 24, wherein said fan is controlled by said processing module (126), and is turned ON if the processor (112) temperature rises beyond an operating threshold.
27. The system (102) as claimed in claims 24 and 26, wherein said fan is configured to release the hot air around the heat sink outside the system (102).
Dated this 18th day of January, 2019
| # | Name | Date |
|---|---|---|
| 1 | 201821002155-PROVISIONAL SPECIFICATION [18-01-2018(online)].pdf | 2018-01-18 |
| 2 | 201821002155-FORM 1 [18-01-2018(online)].pdf | 2018-01-18 |
| 3 | 201821002155-DRAWINGS [18-01-2018(online)].pdf | 2018-01-18 |
| 4 | 201821002155-Proof of Right (MANDATORY) [13-03-2018(online)].pdf | 2018-03-13 |
| 5 | 201821002155-FORM-26 [13-03-2018(online)].pdf | 2018-03-13 |
| 6 | 201821002155-ORIGINAL UNDER RULE 6 (1A)-FORM 1-210318.pdf | 2018-08-11 |
| 7 | 201821002155-ORIGINAL UNDER RULE 6 (1A)-FORM 26-210318.pdf | 2018-08-11 |
| 8 | 201821002155-COMPLETE SPECIFICATION [18-01-2019(online)].pdf | 2019-01-18 |
| 9 | 201821002155-DRAWING [18-01-2019(online)].pdf | 2019-01-18 |
| 10 | 201821002155-ENDORSEMENT BY INVENTORS [18-01-2019(online)].pdf | 2019-01-18 |
| 11 | 201821002155-FORM 3 [18-01-2019(online)].pdf | 2019-01-18 |
| 12 | Abstract1.jpg | 2019-06-08 |