Abstract: A system and method for diagnosis of identified defects in a fabric using machine learning is disclosed. The system (100) comprises a receiving module (120) to acquire a video featuring a fabric and retrieve data from the video, and a detector module (135) to identify and classify defects of the fabric. The system comprises a prediction module (140) to forecast a report illustrating the reasoning behind the defects using machine learning. The system comprises a recommendation module (145) configured to propose actions to resolve the defects and collect feedback from an operator of the loom machine (130) in real-time. The recommendation module is operatively coupled to an active learning engine (150) trained with a training data set using machine learning. The system (100) comprises an alerting module (155) to broadcast an alert upon identifying the defects and transmit signals to pause the loom machine (130) via a controlling module (160). FIG. 1
Description: FIELD OF INVENTION
[0001] Embodiments of the present disclosure relate to the field of quality control, and more particularly, a system and a method for diagnosis of identified defects in a fabric using machine learning.
BACKGROUND
[0002] Quality is essential for any manufacturing or service industry to guarantee sufficient market share and meet customer satisfaction, thereby winning customer loyalty. In today’s competitive market, quality is the main factor in determining the success or failure of an organization. It is a known fact that a profitable industry produces the highest quality goods in the shortest amount of time. Further, quality control is a multi-step procedure that includes testing of materials, analysis and corrective actions. Quality control in manufacturing typically involves inspecting manufactured products to detect defects.
[0003] Specifically, with the rapid development of the textile industry, the quality control of fabric is becoming strict, as fabric defects are usually the key factors affecting the quality of fabrics. Specifically, fabric quality comprises two components, namely fabric properties and fabric faults (or fabric defects). The major causes of fabric defects are machine or process malfunctions, faulty yarns and machine spoils. Each cause has different effects and impacts the sale and serviceability of the textiles. Although fabric manufacturers try to make the fabric fault-free, different types of flaws are found in fabrics. Fabric defects are responsible for nearly 85% of the defects found in the textile industry. Manufacturers recover only 45%-65% of their profits from seconds or off-quality goods. Therefore, detecting, identifying and preventing defects from reoccurring is mandatory for customer satisfaction and profitability. Further, the detection of fabric defects is necessary before the fabric is put on the market or further processed.
[0004] The fabric defects may develop during the process of weaving or after the fabric is produced; for example, slubs, wrong warps and wefts, uneven knitting, oil stains and the like can exist. Weaving is the most popular way of fabric manufacturing. It is primarily done by interlacing two (or more than two) sets (warp and weft) of yarns in a regular and recurring pattern. Weaving involves repeating in sequence the operations of shed building, weft insertion and reed beat-up. All these processes are typically carried out by a loom (also referred to herein as a ‘loom machine’). At times, fabric defects may be caused by problems with the machine; such defects can be generated continuously, so that the machine needs to be stopped in time for maintenance. One of the most common fabric defects is the presence of oil stains, which are defined as spots or patches of differing color. Oil stains can occur at any time during or after production (for instance during spinning, weaving or finishing) if the fabric is not well protected. The sources of oil stains include dirt from the factory floor, oil from machinery and dyes. Another common fabric defect is a folding mark in the fabric. The folding mark is a kind of large fabric defect that appears where creases are caused by fabric folds in the finishing process.
[0005] Traditionally, fabric defect detection methods are mostly based on manual measurement and human-eye observation. These methods have great limitations, such as poor consistency of test results. Further, because cloth defect detection is accomplished through manual visual observation and the concerned staff are required to possess professional ability, manual detection hardly satisfies objectivity, reliability and uniformity.
[0006] Hence, there is a need for an improved system and method for diagnosis of identified defects in a fabric using machine learning which addresses the aforementioned issue(s).
BRIEF DESCRIPTION
[0007] In accordance with an embodiment of the present disclosure, a system for diagnosis of identified defects in a fabric using machine learning is provided. The system includes a processing subsystem hosted on a server. The processing subsystem is configured to execute on a network to control bidirectional communications among a plurality of modules. The processing subsystem includes a receiving module configured to acquire a video content captured by a machine vision camera, wherein the video content features at least one section of an area of a fabric woven in a loom machine. The video content includes a plurality of frames. The receiving module is also configured to retrieve data in different formats from the plurality of frames. The data includes image data, raw material data, details of an operator of the loom machine, machine data, and program details corresponding to the at least one section of the area of the fabric and the loom machine. The processing subsystem also includes a detector module operatively coupled to the receiving module. The detector module is configured to identify one or more defects induced along the length and breadth of the at least one section of the fabric based on a set of standard features corresponding to the at least one section of the fabric, upon receiving the image data. The detector module is also configured to classify the one or more defects based on the intensity of the one or more defects. The processing subsystem also includes a prediction module, wherein the prediction module is configured to forecast a report for the one or more defects based on a plurality of parameters using machine learning. The report illustrates the reasoning of the occurrence of the one or more defects. Further, the processing subsystem includes a recommendation module operatively coupled to the prediction module, wherein the recommendation module is configured to propose one or more actions to be taken for resolving the one or more defects.
The recommendation module is also configured to collect feedback from an operator of the loom machine in real-time.
[0008] In accordance with another embodiment of the present disclosure, a method for diagnosis of identified defects in a fabric using machine learning is provided. The method includes acquiring a video content by a receiving module, wherein the video content features at least one section of a fabric woven in a loom machine. The video content includes a plurality of frames. The method also includes retrieving, by the receiving module, data in different formats from the plurality of frames. The data includes image data, raw material data, details of an operator of the loom machine, machine data, and program details corresponding to the fabric and the loom machine. The method includes identifying, by a detector module, one or more defects induced along the length and breadth of the fabric based on a set of standard features corresponding to the fabric, upon receiving the image data. The method includes classifying, by the detector module, the one or more defects based on the intensity of the one or more defects. The method includes predicting, by a prediction module, a report for the one or more defects based on a plurality of parameters using machine learning. The report illustrates the reasoning of the occurrence of the one or more defects. Further, the method includes proposing, by a recommendation module, one or more actions that are to be taken to resolve the one or more defects. Furthermore, the method includes collecting, by the recommendation module, feedback from an operator of the loom machine in real-time. Moreover, the method includes training, by an active learning engine, with a training data set using machine learning. The training data set comprises the one or more actions corresponding to the one or more defects and the feedback from the operator.
[0009] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[0011] FIG. 1 is a block diagram representation of a system for diagnosis of identified defects in a fabric using machine learning in accordance with an embodiment of the present disclosure;
[0012] FIG. 2 is a block diagram of a computer or a server in accordance with an embodiment of the present disclosure;
[0013] FIG. 3 (a) illustrates a flow chart representing the steps involved in a method for diagnosis of identified defects in a fabric using machine learning in accordance with an embodiment of the present disclosure; and
[0014] FIG. 3 (b) illustrates continued steps of the method of FIG. 3 (a) in accordance with an embodiment of the present disclosure.
[0015] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0016] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[0017] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or subsystems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, sub-systems, elements, structures, components, additional devices, additional sub-systems, additional elements, additional structures or additional components. Appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but not necessarily do, all refer to the same embodiment.
[0018] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[0019] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[0020] Embodiments of the present disclosure relate to a system and method for diagnosis of identified defects in a fabric using machine learning. The system includes a processing subsystem hosted on a server. The processing subsystem is configured to execute on a network to control bidirectional communications among a plurality of modules. The processing subsystem includes a receiving module configured to acquire a video content captured by a machine vision camera, wherein the video content features at least one section of an area of a fabric woven in a loom machine. The video content includes a plurality of frames. The receiving module is also configured to retrieve data in different formats from the plurality of frames. The data includes image data, raw material data, details of an operator of the loom machine, machine data, and program details corresponding to the at least one section of the area of the fabric and the loom machine. The processing subsystem also includes a detector module operatively coupled to the receiving module. The detector module is configured to identify one or more defects induced along the length and breadth of the at least one section of the fabric based on a set of standard features corresponding to the at least one section of the fabric, upon receiving the image data. The detector module is also configured to classify the one or more defects based on the intensity of the one or more defects. The processing subsystem also includes a prediction module, wherein the prediction module is configured to forecast a report for the one or more defects based on a plurality of parameters using machine learning. The report illustrates the reasoning of the occurrence of the one or more defects. Further, the processing subsystem includes a recommendation module operatively coupled to the prediction module, wherein the recommendation module is configured to propose one or more actions to be taken for resolving the one or more defects.
The recommendation module is also configured to collect feedback from an operator of the loom machine in real-time.
[0021] FIG. 1 is a block diagram representation of a system (100) for diagnosis of identified defects in a fabric using machine learning in accordance with an embodiment of the present disclosure.
[0022] The system (100) includes a processing subsystem (105) hosted on a server (108). In one embodiment, the server (108) may include a cloud-based server. In another embodiment, parts of the server (108) may be a local server coupled to a computing device (165). The processing subsystem (105) is configured to execute on a network (115) to control bidirectional communications among a plurality of modules. In one example, the network (115) may be a private or public local area network (LAN) or Wide Area Network (WAN), such as the Internet. In another embodiment, the network (115) may include both wired and wireless communications according to one or more standards and/or via one or more transport mediums. In one example, the network (115) may include wireless communications according to one of the 802.11 or Bluetooth specification sets, or another standard or proprietary wireless communication protocol. In yet another embodiment, the network (115) may also include communications over a terrestrial cellular network, including, a global system for mobile communications (GSM), code division multiple access (CDMA), and/or enhanced data for global evolution (EDGE) network.
[0023] The system (100) includes a receiving module (120) configured to receive a video content captured by a machine vision camera (125). In one embodiment, the video content is captured in real-time. The video content features at least one section of an area of a fabric woven in a loom machine (130). The loom machine (130) is a machine used to produce woven fabric. The machine vision camera (125) is typically one or more cameras used to inspect and analyze products in an industrial or production environment. Specifically, the machine vision camera (125) captures the video content from the front and back sides of the at least one section of the area of the fabric. It must be noted that the system (100) may receive the video content from any other suitable camera (which suits the requirement of the present disclosure), that is, a combination of a sensing element for measuring electromagnetic radiation and a lens for bending electromagnetic radiation, and is not limited to the said machine vision camera (125). Specifically, the camera must be capable of capturing high resolution images of the weaving area in the fabric. In one embodiment, the camera may be an analog or digital still image camera, a video camera, an optical camera, a laser camera, a 3D image scanner, a full spectrum camera, an infrared illuminated camera, an infrared camera, a thermal imaging camera, a stereoscopic camera, and/or any other combination of cameras or different types of cameras.
[0024] Further, the receiving module (120) converts the video content into a plurality of frames wherein each frame includes a plurality of pixels. Subsequently, the plurality of frames is refined based on size and contrast.
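The refinement of frames by size and contrast described above can be pictured with a minimal sketch. It assumes frames arrive as NumPy grayscale arrays; the function name and thresholds are hypothetical, and contrast is approximated by the standard deviation of pixel intensities (a simple, common proxy).

```python
import numpy as np

def refine_frames(frames, min_shape=(64, 64), min_contrast=10.0):
    """Keep only frames that meet minimum size and contrast thresholds.

    Contrast is approximated as the standard deviation of grayscale
    pixel intensities; thresholds here are illustrative assumptions.
    """
    refined = []
    for frame in frames:
        h, w = frame.shape[:2]
        if h < min_shape[0] or w < min_shape[1]:
            continue  # too small to analyze reliably
        if frame.std() < min_contrast:
            continue  # too flat: likely under- or over-exposed
        refined.append(frame)
    return refined

# One usable frame, one zero-contrast frame, one undersized frame
rng = np.random.default_rng(0)
good = rng.integers(0, 256, size=(128, 128)).astype(np.uint8)
flat = np.full((128, 128), 128, dtype=np.uint8)
small = rng.integers(0, 256, size=(16, 16)).astype(np.uint8)

kept = refine_frames([good, flat, small])  # only `good` survives
```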
[0025] The receiving module (120) is also configured to retrieve data, wherein the data is produced in different formats from the plurality of frames. Examples of the data include, but are not limited to, image data, raw material data, details of an operator (175) of the loom machine (130), machine data, and program details corresponding to the at least one section of the area of the fabric and the loom machine (130). The machine data is obtained by using Internet of Things (IoT) sensors. Examples of the data include, but are not limited to:
a) Machine data: vibration data from different parts of the loom machine (130), load cell data, Inertial Measurement Unit (IMU) sensor data, temperature sensor data at moving parts and power consumption data from ACS sensors.
b) Raw material data: Source of raw material, history of supplier, history of quality and material property.
c) Operator details: Skill rating, age, gender, height, relevant experience, employment history.
d) Program details: Type of job, design, speed, needles, number of warps and wefts, number of cones loaded and run time.
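The heterogeneous, non-image data listed above can be modelled as a simple container for downstream modules. The sketch below is illustrative only; every field and key name is an assumption rather than part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class LoomSnapshot:
    """One bundle of non-image data retrieved from the plurality of frames.

    All field and key names are illustrative assumptions.
    """
    machine_data: dict   # vibration, load cell, IMU, temperature, power
    raw_material: dict   # supplier, quality history, material property
    operator: dict       # skill rating, experience, employment history
    program: dict        # job type, design, speed, warp/weft counts

snapshot = LoomSnapshot(
    machine_data={"vibration_rms": 0.42, "temp_c": 61.5, "power_w": 980},
    raw_material={"supplier": "S-104", "quality_score": 0.93},
    operator={"skill_rating": 4, "experience_years": 6},
    program={"job": "plain weave", "speed_rpm": 550, "warps": 2400},
)
```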
[0026] The image data that is received is transmitted to a detector module (135) whereas the rest of the data (raw material data, details of an operator (175) of the loom machine (130), machine data, program details corresponding to the at least one section of the area of the fabric and the loom machine (130)) is transmitted to a prediction module (140).
[0027] As discussed earlier, the detector module (135) receives the image data in the form of a plurality of frames from the machine vision camera (125). The plurality of frames is then analyzed using a convolutional neural network (CNN). It must be noted that analysis of the plurality of frames may be performed by any other suitable artificial neural network used in image recognition and classification. Typically, the detector module (135) is operable to identify one or more defects induced along the length and breadth of the at least one section of the fabric based on a set of standard features corresponding to the at least one section of the fabric, upon receiving the image data. In other words, the detector module (135) is configured to analyze the plurality of pixels of the plurality of frames based on the consistency of colour, thread density, thickness of thread and the like of the at least one section of the fabric, using a machine learning technique. The plurality of pixels is compared with data corresponding to a non-defective section of the fabric to identify the one or more defects. Subsequently, the detector module (135) is also operable to classify the defects upon analysis and identification. Typically, the defects are classified as minor defects, major defects and critical defects. The identified and classified defects are then fed to a prediction module (140).
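The identification step described above can be illustrated with a deliberately simplified sketch. In place of a trained CNN, a plain per-pixel comparison against a non-defective reference section flags suspect regions; the function name and the threshold are hypothetical, chosen only to make the comparison concrete.

```python
import numpy as np

def find_defect_mask(frame, reference, threshold=40):
    """Flag pixels that deviate from a non-defective reference section.

    A trained CNN would learn richer features; this sketch substitutes
    a per-pixel comparison purely to illustrate the identification step.
    """
    diff = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
    return diff > threshold  # boolean mask of suspect pixels

reference = np.full((8, 8), 120, dtype=np.uint8)  # uniform "good" fabric
frame = reference.copy()
frame[2:4, 2:4] = 230                             # simulated oil stain

mask = find_defect_mask(frame, reference)
defect_found = bool(mask.any())
```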
[0028] The prediction module (140) is configured to forecast a report for the one or more defects based on a plurality of parameters using machine learning. The report illustrates the reasoning of the occurrence of the one or more defects. Typically, the prediction module (140) operates upon a combination of data, wherein the data includes the rest of the data, the identified and classified defects and a plurality of weighted variables from an Active Learning Engine (150). Subsequently, the report is forwarded to a recommendation module (145), where the possible reasons for the corresponding defects are generated.
[0029] Further, the one or more defects and corresponding one or more actions that are prescribed to resolve the one or more defects are stored in a database (170) for future analysis.
[0030] The recommendation module (145) is operatively coupled to the prediction module (140) and is configured to propose one or more actions to be taken for resolving the one or more defects. In one embodiment, the one or more actions may be performed manually or automatically. Further, the recommendation module (145) collects feedback from an operator (175) of the loom machine (130) in real-time.
[0031] The recommendation module (145) is operatively coupled to the Active Learning Engine (150), wherein the Active Learning Engine (150) is trained with training data that includes the one or more actions to be taken for the one or more defects and the feedback on the defects and the corresponding actions to resolve the defects. The Active Learning Engine (150) produces weighted variables that are consequently supplied as an input to the prediction module (140).
[0032] In a specific embodiment, the video content may be stored in a database (170). In such an embodiment, the database may be hosted on the cloud server. Further, the database (170) is operable to store the analysis of the one or more defects and corresponding one or more actions to resolve the one or more defects.
[0033] The system (100) also includes an alerting module (155) operatively coupled to the recommendation module (145). The alerting module (155) is operable to broadcast an alert via a computing device (165) upon identifying the one or more defects. In one embodiment, the alert is an audio-visual alert. Examples of the computing device (165) include, but are not limited to, a mobile phone, a desktop computer, a portable digital assistant (PDA), a smart phone, a tablet, an ultra-book, a netbook, a laptop, a multi-processor system, a microprocessor-based or programmable consumer electronic system, or any other communication device that an operator (175) may use. In some embodiments, the system may comprise a display module (not shown) to display information (for example, in the form of user interfaces). In further embodiments, the system may comprise one or more of touch screens, accelerometers, gyroscopes, cameras, microphones, global positioning system (GPS) devices, and so forth.
[0034] The system (100) also includes a controlling module (160) operatively coupled to the alerting module (155) wherein the controlling module (160) is configured to transmit one or more signals to pause or stop the loom machine (130) from weaving the at least one section of the fabric.
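The alerting and controlling behaviour above (broadcast an alert on every detected defect, and signal the loom to pause on severe ones) can be sketched as follows. The function names and the rule for which severities trigger a pause are assumptions for illustration, not part of the disclosure.

```python
def handle_defect(severity, send_alert, pause_loom):
    """Broadcast an alert for every defect; pause the loom on severe ones.

    Which severities trigger a pause is an illustrative assumption.
    """
    send_alert(f"Defect detected: {severity}")
    if severity in ("major", "critical"):
        pause_loom()
        return "paused"
    return "running"

# Record the alert and the pause signal in order, for demonstration
events = []
state = handle_defect(
    "critical",
    send_alert=lambda msg: events.append(msg),
    pause_loom=lambda: events.append("pause-signal"),
)
```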
[0035] In one embodiment, the various functional components of the system may reside on a single computer, or they may be distributed across several computers in various arrangements. The various components of the system may, furthermore, access one or more databases, and each of the various components of the system may be in communication with one another. Further, while the components of FIG. 1 are discussed in the singular sense, it will be appreciated that in other embodiments multiple instances of the components may be employed.
[0036] FIG. 2 is a block diagram of a computer or a server in accordance with an embodiment of the present disclosure. The server (200) includes processor(s) (230), and memory (210) operatively coupled to the bus (220). The processor(s) (230), as used herein, means any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing microprocessor, a reduced instruction set computing microprocessor, a very long instruction word microprocessor, an explicitly parallel instruction computing microprocessor, a digital signal processor, or any other type of processing circuit, or a combination thereof.
[0037] The memory (210) includes several subsystems stored in the form of executable programs which instruct the processor (230) to perform the method steps illustrated in FIG. 3. The memory (210) includes the processing subsystem (105) of FIG. 1. The processing subsystem (105) further has the following modules: a receiving module (120), a detector module (135), a prediction module (140), a recommendation module (145), an active learning engine (150), an alerting module (155) and a controlling module (160).
[0039] The bus (220) as used herein refers to internal memory channels or a computer network that is used to connect computer components and transfer data between them. The bus (220) includes a serial bus or a parallel bus, wherein the serial bus transmits data in bit-serial format and the parallel bus transmits data across multiple wires. The bus (220) as used herein may include, but is not limited to, a system bus, an internal bus, an external bus, an expansion bus, a frontside bus, a backside bus and the like.
[0040] FIG. 3 (a) illustrates a flow chart representing the steps involved in a method for diagnosis of identified defects in a fabric using machine learning in accordance with an embodiment of the present disclosure and FIG. 3 (b) illustrates continued steps of the method of FIG. 3 (a) in accordance with an embodiment of the present disclosure.
[0041] The method includes acquiring a video content featuring at least one section of a weaving area in a loom by a receiving module in step (210). The video content is captured by a machine vision camera wherein the video content includes a plurality of frames.
[0042] The method includes retrieving data in different formats from the plurality of frames of the video content by the receiving module in step (220). The data includes image data, raw material data, details of an operator of the loom machine, machine data, program details corresponding to the fabric and the loom machine.
[0043] The method includes identifying one or more defects induced along the length and breadth of the fabric based on a set of standard features corresponding to the fabric by a detector module in step (230). Typically, a defect is defined as an area of fabric that does not conform to the expected features of the fabric or fulfill the requirements, which increases the dissatisfaction of the customer.
[0044] The quality of woven fabric depends upon the number and size of defects left in the fabric after the manufacturing process. Further, various faults occurring in the weaving area during manufacture may cause defects in the finished fabric. These include slubs, holes, missing yarns, yarn variation, end out, soiled yarns, wrong yarn faults, oil spots, loom-stop marks, start marks, thin place, smash marks, open reed, mixed filling, kinky filling, mixed end, knots, jerk-in, dropped picks, broken picks, double picks, double ends, drawbacks, burl marks and the like. It should be noted that the listed defects are exemplary in nature and should not limit the scope of the invention. In one embodiment, a root cause for a defect may be determined by a machine learning model trained to identify root causes of specific fabric defects.
[0045] The identification of defects in the fabric is performed by analyzing the captured area of the fabric. The plurality of frames within the captured area is analyzed based on the consistency of the colour, thread density, thickness of the thread and the like using machine learning. Subsequently, the plurality of frames is compared with data corresponding to a non-defective section of the fabric. As a result, occurrence of one or more defects is identified in the fabric.
[0046] Upon identifying defects in the fabric, an alert or a flag is raised wherein the alert may be audio-visual. Subsequently, the operator of the loom is aware of a discrepancy. In one embodiment, one or more signals may be transmitted to pause the loom machine from weaving.
[0047] The method includes classifying the one or more defects identified based on the intensity of the one or more defects by the detector module in step (240).
[0048] The defects may be broadly classified as minor defects, major defects and critical defects. The minor defects include small faults that do not influence the purchase of the product. The major defects are those which when exposed, are likely to affect the purchase of the product and are hence categorized as seconds. The critical defects cause an entire roll to be rated as a second or worse.
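The three-way classification above can be sketched as a simple threshold mapping. The intensity scale and the cut-off values below are illustrative assumptions; in practice they would be tuned against inspection standards for the fabric grade.

```python
def classify_defect(intensity):
    """Map a defect intensity score in [0, 1] to a severity class.

    The cut-off values are illustrative assumptions, not values
    taken from the disclosure.
    """
    if intensity < 0.3:
        return "minor"     # small fault, does not influence purchase
    if intensity < 0.7:
        return "major"     # likely to affect purchase (seconds)
    return "critical"      # may downgrade the entire roll

labels = [classify_defect(x) for x in (0.1, 0.5, 0.9)]
```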
[0049] The method includes predicting a report for the one or more defects based on a plurality of parameters using machine learning, wherein the report illustrates the reasoning of the occurrence of the one or more defects by a prediction module in step (250). In one embodiment, the report may include images, graphical representations, numbers, text and the like.
[0050] It is to be noted that step (250) is performed by an active learning engine using artificial intelligence to accurately detect the defects in the fabric. Examples of the artificial intelligence algorithm include, but are not limited to, a Deep Neural Network (DNN), a Convolutional Neural Network (CNN), a Restricted Boltzmann Machine (RBM), a Deep Belief Network (DBN) and Deep Q-Networks. Specifically, step (250) is performed using a CNN.
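The convolution operation at the heart of a CNN-based detector can be shown with a toy example: a hand-crafted vertical-edge kernel responds strongly where a yarn is missing. In the disclosed system the kernels would be learned during training rather than fixed; this fixed-kernel example is an assumption used only to illustrate the mechanism.

```python
# Minimal sketch of a CNN's convolution step on a fabric patch.

def conv2d(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1)."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            acc = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
            row.append(acc)
        out.append(row)
    return out

# Vertical-edge kernel (Sobel-like, simplified).
kernel = [[-1, 0, 1],
          [-1, 0, 1],
          [-1, 0, 1]]

# Fabric patch where column 2 has low intensity (a missing yarn).
patch = [[9, 9, 0, 9, 9] for _ in range(5)]

response = conv2d(patch, kernel)
# The filter response is nonzero only around the missing-yarn column,
# which a downstream layer could threshold into a defect map.
```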
[0051] The input fed to the active learning engine comprises the predicted report of the one or more defects for the possible failures, along with the corresponding actions to be taken for the diagnosed defects. The output of the active learning engine comprises weighted variables that are fed back into the prediction module as an input.
[0052] The method includes proposing one or more actions to be taken for resolving the one or more defects by a recommendation module in step (260). In one embodiment, the one or more actions may be an automated process or a manual process.
[0053] Typically, the one or more actions are alerted to the operator of the loom machine via a display device. In one embodiment, flags may be raised as an indication of the presence of one or more defects in the fabric.
[0054] Further, one or more signals are transmitted to stop the loom machine or otherwise adjust the loom machine settings in response to the identification of defects in the fabric.
[0055] In one embodiment, the one or more actions for resolving the one or more corresponding defects is stored in a database for further training purposes.
[0056] The method includes collecting feedback from an operator of the loom machine in real-time by the recommendation module in step (270).
[0057] The machine learning model is refined based on the feedback. Examples of the feedback include, but are not limited to, voting scores (for instance, operator-provided scores, percentages of operators who found the insight helpful and so on), textual feedback (for instance, “unclear,” “unhelpful,” “unrelated,” “trivial,” “wrong,” “duplicate,” “helpful,” “other,” and so on), or a combination of both.
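Turning such feedback into a numeric training signal can be sketched as follows. The label-to-score mapping and the equal weighting of the two channels are assumptions; the disclosure only lists the example feedback types.

```python
# Sketch: combine voting scores and textual labels into one training signal.
# The score values and weighting are illustrative assumptions.

TEXT_SCORES = {
    "helpful": 1.0, "trivial": 0.2, "unclear": -0.3,
    "unhelpful": -0.5, "unrelated": -0.7, "duplicate": -0.2,
    "wrong": -1.0, "other": 0.0,
}

def feedback_signal(votes, text_labels):
    """Combine vote fractions (0..1) and textual labels into one score."""
    vote_part = sum(votes) / len(votes) if votes else 0.0
    text_part = (sum(TEXT_SCORES.get(t, 0.0) for t in text_labels)
                 / len(text_labels)) if text_labels else 0.0
    # Equal weighting of the two channels is an arbitrary choice here.
    return 0.5 * vote_part + 0.5 * text_part

score = feedback_signal([0.8, 0.9], ["helpful", "unclear"])
print(score)  # 0.6
```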
[0058] The method includes training with a training data set using machine learning wherein the training data set comprises the one or more actions corresponding to the one or more defects and the feedback from the operator by an active learning engine operatively coupled to the recommendation module of the processing sub-system in step (280).
[0059] The machine learning model may be trained (or updated) in real-time in response to the feedback and the identified fabric defects, or may be updated continuously, for example, at predetermined time intervals.
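The two update policies described above, retraining immediately on each feedback event or batching feedback for retraining at predetermined intervals, can be sketched in one small class. The class and method names, and the placeholder retraining step, are illustrative assumptions.

```python
# Sketch of the real-time vs. interval-based model update policies.
import time

class ActiveLearningEngine:
    def __init__(self, interval_s=None):
        self.interval_s = interval_s      # None => update in real time
        self.buffer = []                  # buffered (defect, action, feedback)
        self.last_update = time.monotonic()
        self.model_version = 0

    def on_feedback(self, example):
        self.buffer.append(example)
        now = time.monotonic()
        if self.interval_s is None or now - self.last_update >= self.interval_s:
            self._retrain()
            self.last_update = now

    def _retrain(self):
        # Placeholder for fitting the model on the buffered training set;
        # here we only bump a version counter and clear the buffer.
        self.model_version += 1
        self.buffer.clear()

engine = ActiveLearningEngine(interval_s=None)   # real-time policy
engine.on_feedback({"defect": "slub", "action": "replace yarn", "score": 0.6})
print(engine.model_version)  # 1
```

With `interval_s` set to, say, 3600, the same call would only buffer the example until an hour had elapsed, matching the interval-based variant described above.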
[0060] Various embodiments of the system and method for diagnosing identified defects in a fabric using machine learning, as described above, impart an accuracy of more than 95% on average in detecting and classifying defects. Further, the method disclosed herein provides a cost-effective and competitive detection and classification of defects.
[0061] Although the present disclosure describes a system and method to diagnose defects in fabric, it will be appreciated by those skilled in the art that the said system and method can be applied to any other suitable application area (for example, but not limited to, spinning, weaving, batching, dyeing and printing) wherein defects may exist in any suitable raw material.
[0062] The techniques described in this disclosure may be implemented, at least in part, in hardware, software, firmware, or any combination thereof. For example, various aspects of the described techniques may be implemented within one or more processors, including one or more microprocessors, digital signal processors (DSPs), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components. The term “processor” or “processing subsystem” may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry. A control unit including hardware may also perform one or more of the techniques of this disclosure.
[0063] Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various techniques described in this disclosure. In addition, any of the described units, modules, or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware, firmware, or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware, firmware, or software components, or integrated within common or separate hardware, firmware, or software components.
[0064] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the disclosure and are not intended to be restrictive thereof.
[0065] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[0066] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of processes described herein may be changed and is not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.
Claims:
1. A system (100) for diagnosis of identified defects in a fabric using machine learning comprising:
a processing subsystem (105) hosted on a server (108), wherein the processing subsystem (105) is configured to execute on a network (115) to control bidirectional communications among a plurality of modules comprising:
a receiving module (120) operatively coupled to the processing subsystem (105) wherein the receiving module (120) is configured to:
acquire a video content captured by a machine vision camera (125), wherein the video content features at least one section of an area of a fabric woven in a loom machine (130) and wherein the video content comprises a plurality of frames; and
retrieve data in different formats from the plurality of frames wherein the data comprises image data, raw material data, details of an operator of the loom machine (130), machine data, program details corresponding to the at least one section of the area of the fabric and the loom machine (130);
a detector module (135) operatively coupled to the receiving module (120) wherein the detector module (135) is configured to:
identify one or more defects induced along the length and breadth of the at least one section of the fabric based on a set of standard features corresponding to the at least one section of the fabric by using machine learning;
classify the one or more defects based on the intensity of the one or more defects by using machine learning;
a prediction module (140) operatively coupled to the detector module (135) wherein the prediction module (140) is configured to analyze the one or more defects and forecast a report for the one or more defects based on a plurality of parameters using machine learning, wherein the report illustrates the reasoning of the occurrence of the one or more defects; and
a recommendation module (145) operatively coupled to the prediction module (140) wherein the recommendation module (145) is configured to:
propose one or more actions to be taken for resolving the one or more defects; and
collect feedback from an operator (175) of the loom machine (130) in real-time.
2. The system (100) as claimed in claim 1 comprises an active learning engine (150) operatively coupled to the recommendation module (145) and prediction module (140) wherein the active learning engine (150) is configured to train based on a training data set using machine learning wherein the training data set comprises the one or more actions corresponding to the one or more defects and the feedback from the operator (175) fed as an input from the recommendation module (145).
3. The system (100) as claimed in claim 1 wherein the plurality of parameters comprises machine data using one or more sensors, raw material data, operator details and program details, wherein the one or more sensors is configured to capture data of the loom machine (130).
4. The system (100) as claimed in claim 1 wherein the receiving module (120) is configured to refine each of the plurality of frames based on size and contrast.
5. The system (100) as claimed in claim 1 wherein the detector module (135) is configured to:
analyze a plurality of pixels of the plurality of frames based on consistency of the at least one section of the fabric colour, thread density and thickness of thread, using machine learning; and
compare the plurality of pixels with data of a corresponding non-defective section of the fabric to identify the one or more defects.
6. The system (100) as claimed in claim 1 wherein the one or more defects is classified as a minor defect, critical defect and a major defect.
7. The system (100) as claimed in claim 1 wherein the one or more actions is performed by one of an automated process and a manual process.
8. The system (100) as claimed in claim 1 comprising:
an alerting module (155) operatively coupled to the recommendation module (145) wherein the alerting module (155) is operable to perform at least one of broadcast an alert via a computing device (165) upon identifying the one or more defects wherein the alert is an audio-visual alert; and
a controlling module (160) operatively coupled to the alerting module (155) wherein the controlling module (160) is configured to transmit one or more signals to pause the loom machine (130) from weaving the at least one section of the fabric.
9. The system (100) as claimed in claim 1 wherein the one or more defects and corresponding one or more actions to resolve the one or more defects are stored in a database (170) for future analysis.
10. A method (200) for diagnosis of identified defects in a fabric using machine learning comprising:
acquiring, by a machine vision camera, a video content wherein the video content features at least one section of a fabric woven in a loom machine and wherein the video content comprises a plurality of frames; (210)
retrieving, by a retrieving module of a processing sub-system operatively coupled to the machine vision camera, data in different formats from the plurality of frames wherein the data comprises image data, raw material data, details of an operator of the loom machine, machine data, program details corresponding to the fabric and the loom machine; (220)
identifying, by a detector module of the processing sub-system, one or more defects induced along the length and breadth of the fabric based on a set of standard features corresponding to the fabric, upon receiving the image data; (230)
classifying, by the detector module of the processing sub-system, the one or more defects based on the intensity of the one or more defects; (240)
predicting, by a prediction module of the processing sub-system, a report for the one or more defects based on a plurality of parameters using machine learning, wherein the report illustrates the reasoning of the occurrence of the one or more defects; (250)
proposing, by a recommendation module of the processing sub-system, one or more actions to be taken for resolving the one or more defects; (260)
collecting, by the recommendation module of the processing sub-system, feedback from an operator of the loom machine in real-time; (270) and
training, by an active learning engine operatively coupled to the recommendation module of the processing sub-system, with a training data set using machine learning wherein the training data set comprises the one or more actions corresponding to the one or more defects and the feedback from the operator. (280)
Dated this 13th day of October 2022
Signature
Jinsu Abraham
Patent Agent (IN/PA-3267)
Agent for the Applicant
| # | Name | Date |
|---|---|---|
| 1 | 202241058665-FORM-24 [28-05-2024(online)].pdf | 2024-05-28 |
| 1 | 202241058665-STATEMENT OF UNDERTAKING (FORM 3) [13-10-2022(online)].pdf | 2022-10-13 |
| 2 | 202241058665-REQUEST FOR EARLY PUBLICATION(FORM-9) [13-10-2022(online)].pdf | 2022-10-13 |
| 2 | 202241058665-Written submissions and relevant documents [09-06-2023(online)].pdf | 2023-06-09 |
| 3 | 202241058665-PROOF OF RIGHT [13-10-2022(online)].pdf | 2022-10-13 |
| 3 | 202241058665-Correspondence to notify the Controller [19-05-2023(online)].pdf | 2023-05-19 |
| 4 | 202241058665-US(14)-HearingNotice-(HearingDate-25-05-2023).pdf | 2023-04-28 |
| 4 | 202241058665-FORM-9 [13-10-2022(online)].pdf | 2022-10-13 |
| 5 | 202241058665-FORM FOR SMALL ENTITY(FORM-28) [13-10-2022(online)].pdf | 2022-10-13 |
| 5 | 202241058665-DRAWING [26-04-2023(online)].pdf | 2023-04-26 |
| 6 | 202241058665-FORM FOR SMALL ENTITY [13-10-2022(online)].pdf | 2022-10-13 |
| 6 | 202241058665-ENDORSEMENT BY INVENTORS [26-04-2023(online)].pdf | 2023-04-26 |
| 7 | 202241058665-FORM 1 [13-10-2022(online)].pdf | 2022-10-13 |
| 7 | 202241058665-FER_SER_REPLY [26-04-2023(online)].pdf | 2023-04-26 |
| 8 | 202241058665-FORM 3 [26-04-2023(online)].pdf | 2023-04-26 |
| 8 | 202241058665-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [13-10-2022(online)].pdf | 2022-10-13 |
| 9 | 202241058665-EVIDENCE FOR REGISTRATION UNDER SSI [13-10-2022(online)].pdf | 2022-10-13 |
| 9 | 202241058665-OTHERS [26-04-2023(online)].pdf | 2023-04-26 |
| 10 | 202241058665-DRAWINGS [13-10-2022(online)].pdf | 2022-10-13 |
| 10 | 202241058665-ENDORSEMENT BY INVENTORS [27-03-2023(online)].pdf | 2023-03-27 |
| 11 | 202241058665-DECLARATION OF INVENTORSHIP (FORM 5) [13-10-2022(online)].pdf | 2022-10-13 |
| 11 | 202241058665-RELEVANT DOCUMENTS [27-03-2023(online)].pdf | 2023-03-27 |
| 12 | 202241058665-COMPLETE SPECIFICATION [13-10-2022(online)].pdf | 2022-10-13 |
| 12 | 202241058665-FORM-8 [13-03-2023(online)].pdf | 2023-03-13 |
| 13 | 202241058665-MSME CERTIFICATE [14-10-2022(online)].pdf | 2022-10-14 |
| 13 | 202241058665-Proof of Right [13-03-2023(online)].pdf | 2023-03-13 |
| 14 | 202241058665-FORM-26 [22-11-2022(online)].pdf | 2022-11-22 |
| 14 | 202241058665-FORM28 [14-10-2022(online)].pdf | 2022-10-14 |
| 15 | 202241058665-FER.pdf | 2022-11-17 |
| 15 | 202241058665-FORM 18A [14-10-2022(online)].pdf | 2022-10-14 |
| 31 | 202241058665-ReviewPetition-HearingNotice-(HearingDate-16-12-2025).pdf | 2025-11-11 |
| 1 | 202241058665_searchE_16-11-2022.pdf | |