Abstract: The present disclosure provides a system (100) and a method for machine degradation estimation. The system (100) includes image acquisition units configured at predetermined (fixed) distances from a machine for detecting stereo image(s) of the machine. The system (100) estimates machine degradation by estimating the depth(s) of the machine part(s) constituting the machine from the stereo images. The system (100) detects machine parts in the stereo image(s), estimates the depth(s) via an Artificial Intelligence (AI) based method and a stereoscopic method based on the detected machine part(s), and combines the depths estimated by the two methods. The system (100) estimates the degradation of the machine part(s) by comparing the depth(s) with reference depth(s) indicating no degradation and maximum degradation, and sends an alert to users.
Description:
TECHNICAL FIELD
[0001] The present disclosure relates to machine degradation estimation. In particular, the present disclosure relates to a system and a method for machine degradation estimation.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present disclosure. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed disclosure, or that any publication specifically or implicitly referenced is prior art.
[0003] Industrial machines need to be kept in working condition and any unexpected down time may result in loss of revenue and supply-chain issues. Continuous operation of industrial machines leads to their wear and tear (degradation). The machine parts that constitute the machine may need replacement due to the degradation.
[0004] Industrial machines need timely housekeeping and maintenance. The maintenance activity involves evaluation of the machine parts and replacement of any machine part constituting the machine that has degraded below a threshold. Most often, maintenance of a machine and replacement of any machine part require that the machine be stopped or brought to a non-working condition. However, stoppage of a machine might lead to a pause in production and loss of revenue. Moreover, manual evaluation of machines for checking the degradation is inaccurate, time consuming, and expensive.
[0005] Therefore, there is a need for a solution for machine degradation (wear and tear) estimation to overcome the abovementioned drawbacks.
OBJECTS OF THE PRESENT DISCLOSURE
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are as listed below.
[0007] The principal object of the present disclosure is to provide a solution to accurately estimate the machine degradation while the machine is running.
[0008] Another object of the present disclosure is to provide a solution to estimate the machine degradation in real time.
[0009] Yet another object of the present disclosure is to provide a solution to estimate the machine degradation and provide timely alerts.
SUMMARY
[0010] The present disclosure relates to machine degradation estimation. In particular, the present disclosure relates to a system and a method for machine degradation estimation.
[0011] In an aspect, the present disclosure relates to a system for machine degradation estimation. The system includes a plurality of image acquisition units configured at predetermined distances from a machine for detecting one or more stereo images of the machine. The system includes at least one processor and a memory operatively coupled to the at least one processor. The memory stores executable instructions which, when executed by the at least one processor, cause the at least one processor to perform the following steps. The at least one processor detects the one or more stereo images associated with the machine via the plurality of image acquisition units at predetermined periods. The at least one processor detects one or more images associated with the one or more machine parts among the one or more stereo images, where the one or more machine parts are configured with the machine. The at least one processor estimates one or more depths associated with the one or more machine parts using an Artificial Intelligence (AI) based method, based on the one or more images. The at least one processor estimates the one or more depths associated with the one or more machine parts using a stereoscopic method based on the one or more images. The at least one processor combines the one or more depths estimated using the AI based method and the stereoscopic method into one or more final depths of the one or more machine parts. The at least one processor estimates one or more degradation levels of the one or more machine parts based on the one or more final depths.
[0012] In an embodiment, the AI based method for estimating the one or more depths associated with the one or more machine parts may include the at least one processor estimating the one or more depths using a machine learning (ML) model based on the one or more images associated with the one or more machine parts.
[0013] In an embodiment, the stereoscopic method for estimating the one or more depths associated with the one or more machine parts may include the following steps. The at least one processor may match one or more features of the one or more machine parts among the one or more stereo images. The at least one processor may estimate the one or more depths associated with the one or more machine parts based on one or more displacements of the one or more features associated with the one or more machine parts.
[0014] In an embodiment, the at least one processor may transmit an alert when the one or more final depths of the one or more machine parts are within a pre-defined range, indicating a minimum degradation and a maximum degradation associated with the one or more machine parts.
[0015] In an embodiment, the one or more degradation levels associated with the one or more machine parts are encoded in the alert via one or more colors.
[0016] In an aspect, the present disclosure relates to a method for machine degradation estimation. The method includes the following steps. Detecting, by at least one processor associated with a system, one or more stereo images of a machine, via a plurality of image acquisition units, at predetermined periods, where the plurality of image acquisition units are configured at predetermined distances from the machine for detecting the one or more stereo images of the machine. Detecting, by the at least one processor, one or more images associated with the one or more machine parts among the one or more stereo images, where the one or more machine parts are configured with the machine. Estimating, by the at least one processor, one or more depths associated with the one or more machine parts, using an AI based method, based on the one or more images. Estimating, by the at least one processor, the one or more depths associated with the one or more machine parts, using a stereoscopic method for depth estimation, based on the one or more images. Combining, by the at least one processor, the one or more depths estimated using the AI based method and the stereoscopic method, into one or more final depths of the one or more machine parts. Estimating, by the at least one processor, one or more degradation levels of the one or more machine parts, based on the one or more final depths.
[0017] In an embodiment, the AI based method for estimating the one or more depths associated with the one or more machine parts may include estimating, by the at least one processor, the one or more depths, using an ML model, based on the one or more images associated with the one or more machine parts.
[0018] In an embodiment, the stereoscopic method for estimating the one or more depths associated with the one or more machine parts may include the following steps. Matching, by the at least one processor, one or more features of the one or more machine parts among the one or more stereo images. Estimating, by the at least one processor, the one or more depths associated with the one or more machine parts based on one or more displacements of the one or more features associated with the one or more machine parts.
[0019] In an embodiment, the at least one processor may transmit an alert when the one or more final depths of the one or more machine parts are within a pre-defined range, indicating a minimum degradation and a maximum degradation associated with the one or more machine parts.
[0020] In an embodiment, the one or more degradation levels associated with the one or more machine parts may be encoded in the alert via one or more colors.
BRIEF DESCRIPTION OF DRAWINGS
[0021] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in, and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure, and together with the description, serve to explain the principles of the present disclosure.
[0022] In the figures, similar components, and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0023] FIG. 1 illustrates the steps performed by an example system (100) for machine degradation estimation, in accordance with an embodiment of the present disclosure.
[0024] FIG. 2 illustrates an example system (200) for machine degradation estimation with structural and functional components, in accordance with an embodiment of the present disclosure.
[0025] FIG. 3 illustrates an example method (300) for machine degradation estimation, in accordance with an embodiment of the current disclosure.
DETAILED DESCRIPTION
[0026] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0027] The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth.
[0028] Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
[0029] Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but it could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
[0030] The word “exemplary” and/or “demonstrative” is used herein to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” and/or “demonstrative” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, such terms are intended to be inclusive—in a manner similar to the term “comprising” as an open transition word—without precluding any additional or other elements.
[0031] Reference throughout this specification to “one embodiment” or “an embodiment” or “an instance” or “one instance” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0032] Various aspects of the present disclosure are described with respect to FIG. 1 to FIG. 3.
[0033] The present disclosure relates to machine degradation estimation. In particular, the present disclosure relates to a system and a method for machine degradation estimation.
[0034] In an aspect, the present disclosure relates to a system for machine degradation (wear and tear) estimation. The system may include a plurality of image acquisition units configured at predetermined (fixed) distances from a machine for detecting one or more stereo images of the machine. A stereo image is a pair of images detected simultaneously (at the same time instant) by a pair of image acquisition units calibrated for stereo image acquisition, enabling depth perception akin to human vision. The calibration of the image acquisition units is performed by detecting stereo image(s) of a known pattern (for example, a checkerboard pattern) and estimating the intrinsic parameters (parameters specific to each image acquisition unit) and the extrinsic parameters (the configuration of one image acquisition unit with respect to the other image acquisition unit).
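By way of non-limiting illustration, the calibration step may be sketched in Python using OpenCV as below. The checkerboard geometry (9x6 inner corners), the square size, and the image file names are assumptions made for the example only and are not prescribed by the present disclosure.

```python
# Illustrative sketch of stereo calibration with a checkerboard (OpenCV).
# Board geometry, square size, and file names are placeholders.
import glob
import cv2
import numpy as np

board_size = (9, 6)      # inner corners per checkerboard row/column (assumed)
square_size_m = 0.025    # side of one checkerboard square in metres (assumed)

# 3D coordinates of the checkerboard corners in the board's own frame.
objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2)
objp *= square_size_m

obj_points, left_points, right_points = [], [], []
for lf, rf in zip(sorted(glob.glob("left_*.png")), sorted(glob.glob("right_*.png"))):
    gray_l = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gray_r = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    ok_l, corners_l = cv2.findChessboardCorners(gray_l, board_size)
    ok_r, corners_r = cv2.findChessboardCorners(gray_r, board_size)
    if ok_l and ok_r:
        obj_points.append(objp)
        left_points.append(corners_l)
        right_points.append(corners_r)

image_size = gray_l.shape[::-1]
# Intrinsic parameters: estimated separately for each image acquisition unit.
_, K1, d1, _, _ = cv2.calibrateCamera(obj_points, left_points, image_size, None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj_points, right_points, image_size, None, None)
# Extrinsic parameters: rotation R and translation T of one unit w.r.t. the other.
_, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
    obj_points, left_points, right_points, K1, d1, K2, d2, image_size,
    flags=cv2.CALIB_FIX_INTRINSIC)
print("estimated baseline (m):", float(np.linalg.norm(T)))
```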
[0035] FIG. 1 illustrates the steps performed by an example system (100) for machine degradation estimation, in accordance with an embodiment of the present disclosure. In an embodiment, the system (100) detects the one or more stereo images associated with the machine via the plurality of image acquisition units at predetermined periods (temporal domain), as shown at step 102. In an embodiment, the system (100) may detect one or more images associated with the one or more machine parts among the one or more stereo images, where the one or more machine parts are configured with the machine, as shown at step 104. As the machine parts constituting a machine degrade due to wear and tear, their material depth (thickness) decreases due to material loss, and consequently their depths as perceived in the images (distances from the image acquisition units) increase.
[0036] In an embodiment, as shown at step 106, the system (100) may estimate one or more depths associated with the one or more machine parts using an Artificial Intelligence (AI) based method based on the one or more stereo images, where the one or more machine parts are configured with the machine. In an embodiment, the AI based method (106) for estimating the one or more depths associated with the one or more machine parts may include estimation of the one or more depths using a machine learning (ML) model based on the one or more images associated with the one or more machine parts. In an exemplary embodiment, the ML model may be a deep neural network for Single Shot Detection (SSD), which performs detection of the machine parts and estimation/prediction of depth based on the detected machine parts in a single pass through the stereo images. For example, the neural network may be a YOLO (You Only Look Once) based deep learning model. In an exemplary embodiment, the deep learning model may be trained on a substantial dataset of stereo images annotated with machine part bounding boxes and depth information. In an exemplary embodiment, the training process of the deep neural network may be tuned by optimizing the hyperparameters that control it, so as to maximize the accuracy of machine part detection and of depth estimation, the latter being a regression problem. The hyperparameters may include the learning rate, the batch size, and the number of epochs. The learning rate determines the size of the steps taken when the weights are updated during training. The batch size is the number of training examples the deep learning network sees before updating its weights. An epoch is one complete pass through the entire training data. The number of epochs is the number of such passes made before training is stopped. In an exemplary embodiment, the trained machine learning model may then be deployed for depth estimation of the machine parts while the machine is functioning.
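By way of non-limiting illustration, a minimal single-pass network of this kind may be sketched in Python (PyTorch) as below. The network name PartDepthNet, the stacked six-channel stereo input, and all layer sizes are hypothetical simplifications; an actual embodiment would more likely adapt a full YOLO/SSD detector with an additional depth-regression output.

```python
# Illustrative toy network (not the disclosed YOLO/SSD model): one forward pass
# yields a bounding box (x, y, w, h) and a depth estimate for each machine part.
import torch
import torch.nn as nn

class PartDepthNet(nn.Module):  # hypothetical name
    def __init__(self, num_parts: int):
        super().__init__()
        # Small convolutional backbone over a stereo pair stacked channel-wise (3+3=6).
        self.backbone = nn.Sequential(
            nn.Conv2d(6, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.box_head = nn.Linear(64, num_parts * 4)  # bounding-box regression
        self.depth_head = nn.Linear(64, num_parts)    # depth regression per part

    def forward(self, stereo_pair: torch.Tensor):
        features = self.backbone(stereo_pair)
        return self.box_head(features), self.depth_head(features)

model = PartDepthNet(num_parts=3)
stereo_pair = torch.randn(1, 6, 256, 256)   # left + right image, stacked
boxes, depths = model(stereo_pair)
print(boxes.shape, depths.shape)            # torch.Size([1, 12]) torch.Size([1, 3])

# Hyperparameters such as the learning rate and batch size enter through the
# optimizer and data loader, e.g. torch.optim.Adam(model.parameters(), lr=1e-3).
```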
[0037] In an embodiment, the system (100) estimates the one or more depths associated with the one or more machine parts using a stereoscopic method based on the one or more images, as shown at step 108. In an embodiment, the stereoscopic method for estimating the one or more depths associated with the one or more machine parts may include the following steps. The system (100) may match one or more features of the one or more machine parts among the one or more stereo images, and may estimate the one or more depths associated with the one or more machine parts based on one or more displacements of the one or more features associated with the one or more machine parts. The one or more features of the one or more machine parts are the same physical points on the machine parts that map onto different pixels in the two images constituting a stereo image. In an exemplary embodiment, the feature matching may be done using SIFT (Scale-Invariant Feature Transform) feature vectors extracted from the one or more images. The two image acquisition units and a physical point (a 3-dimensional or 3D point) on a machine part form a triangle. Feature matching (key-point feature matching) is performed to locate the image pixels corresponding to the same physical point(s) across the stereo image (the set of two images). The depth is then estimated from the shift (displacement) between the image pixels corresponding to the same 3D point through a triangulation process, in which the 3D point is estimated from the locations of the two image acquisition units and the matched image pixels. In an exemplary embodiment, the stereoscopic method for depth estimation may involve estimating the depth of a machine part based on the breadth and the length of the bounding box of the detected images of the machine part in the stereo image(s).
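By way of non-limiting illustration, the feature-matching and triangulation route may be sketched with OpenCV as below, assuming a rectified stereo pair so that the disparity is purely horizontal and the depth follows the standard relation Z = f·B/d. The file names, focal length, and baseline are placeholders, not values taken from the present disclosure.

```python
# Illustrative sketch: SIFT key-point matching across a rectified stereo pair,
# followed by disparity-to-depth conversion (Z = f * B / d).
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # placeholder file names
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp_l, des_l = sift.detectAndCompute(left, None)
kp_r, des_r = sift.detectAndCompute(right, None)

# Match descriptors and keep only unambiguous matches (Lowe's ratio test).
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des_l, des_r, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

focal_px = 1200.0    # focal length in pixels, from calibration (assumed value)
baseline_m = 0.10    # distance between the two units in metres (assumed value)

depths = []
for m in good:
    # Horizontal shift of the pixels corresponding to the same physical 3D point.
    disparity = kp_l[m.queryIdx].pt[0] - kp_r[m.trainIdx].pt[0]
    if disparity > 0:
        depths.append(focal_px * baseline_m / disparity)

print(f"median depth over {len(depths)} matched features: {np.median(depths):.3f} m")
```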
[0038] In an embodiment, the system (100) may combine the one or more depths estimated using the AI based method and the stereoscopic method into one or more final depths of the one or more machine parts, as shown at step 110. In an exemplary embodiment, the final depth for a machine part may be the average of the depths estimated by the two methods for that machine part. In an exemplary embodiment, the depth estimated by either of the two methods may be used directly in case the two estimated depths are proximate within a certain margin. In an embodiment, the system (100) may estimate one or more degradation levels of the one or more machine parts based on the one or more final depths, as shown at step 112. In an exemplary embodiment, the degradation of a machine part may be estimated from its depth through comparison with reference depths corresponding to no degradation and to maximum degradation.
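By way of non-limiting illustration, the combination and degradation-estimation logic may be sketched as below; the agreement margin and the reference depths are assumed values, and fuse_depths and degradation_percent are hypothetical helper names.

```python
# Illustrative sketch of depth fusion and degradation estimation.
def fuse_depths(depth_ai: float, depth_stereo: float, margin: float = 0.005) -> float:
    """If the two estimates are proximate within the margin, use one directly;
    otherwise, return their average as the final depth."""
    if abs(depth_ai - depth_stereo) <= margin:
        return depth_ai
    return (depth_ai + depth_stereo) / 2.0

def degradation_percent(final_depth: float, depth_no_wear: float, depth_max_wear: float) -> float:
    """Map the final depth onto 0% (no degradation) to 100% (maximum degradation)
    by comparison with the two reference depths."""
    level = (final_depth - depth_no_wear) / (depth_max_wear - depth_no_wear) * 100.0
    return max(0.0, min(100.0, level))

final = fuse_depths(depth_ai=1.512, depth_stereo=1.520)   # assumed readings (m)
print(degradation_percent(final, depth_no_wear=1.500, depth_max_wear=1.560))  # ~26.7
```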
[0039] In an embodiment, the system (100) may transmit an alert when the one or more final depths of the one or more machine parts are within a pre-defined range, indicating a minimum (no) degradation and a maximum degradation (the machine part needs a replacement) associated with the one or more machine parts. In an embodiment, the one or more degradation levels associated with the one or more machine parts are encoded in the alert via one or more colors. In an exemplary embodiment, the alert may be sent in real time to the users via display units, as color coded regions overlaid on the stereo images. In an exemplary embodiment, the degradation levels may be displayed when the degradation is above 10%. In an exemplary embodiment, the variation of the estimated depth with respect to the reference depth may be calculated, and the percentage of depth variation/change is used for alarm indication/transmission. In an exemplary embodiment, for a percentage change within 20%, a yellow color may be shown around an object (machine part); for a percentage change between 20% and 90%, an orange color may be shown around the machine part; and for a percentage change in depth of more than 90%, a red color may be shown around the detected machine part.
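By way of non-limiting illustration, the color bands of this exemplary embodiment (display only above 10%; yellow up to 20%, orange from 20% to 90%, red above 90%) may be sketched as below; the function name alert_color is hypothetical.

```python
# Illustrative sketch of the color-coded alert bands of this exemplary embodiment.
from typing import Optional

def alert_color(depth_change_pct: float) -> Optional[str]:
    """Return the overlay color for a machine part, or None below the 10% display floor."""
    if depth_change_pct <= 10.0:
        return None           # degradation not yet displayed
    if depth_change_pct <= 20.0:
        return "yellow"
    if depth_change_pct <= 90.0:
        return "orange"
    return "red"

for pct in (5.0, 15.0, 55.0, 95.0):
    print(f"{pct}% depth change -> {alert_color(pct)}")
```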
[0040] FIG. 2 illustrates an example system (200) for machine degradation estimation with structural and functional components, in accordance with an embodiment of the present disclosure. The system (200) referred to in FIG. 2 is similar to the system (100) shown in FIG. 1. Referring to FIG. 2, the system (100) may include at least one processor (202) that may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that process data based on operational instructions. Among other capabilities, the processor(s) (202) may be configured to fetch and execute computer-readable instructions stored in a memory (204) of the system (100). The memory (204) may be configured to store one or more computer-readable instructions or routines in a non-transitory computer readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory (204) may comprise any non-transitory storage device including, for example, volatile memory such as random-access memory (RAM), or non-volatile memory such as erasable programmable read only memory (EPROM), flash memory, and the like.
[0041] In an embodiment, the system (100) may include an interface(s) (206). The interface(s) (206) may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) (206) may facilitate communication to/from the system (100). The interface(s) (206) may also provide a communication pathway for one or more components of the system (100). Examples of such components include, but are not limited to, a processing unit/engine(s) (208) and a local database (218). In an embodiment, the local database (218) may be separate from the system (100).
[0042] In an embodiment, the processing engine(s) (208) may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) (208). In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) (208) may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) (208) may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) (208). In such examples, the system (100) may include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the system (100) and the processing resource. In other examples, the processing engine(s) (208) may be implemented by electronic circuitry.
[0043] In an embodiment, the processing engine (208) may include a data processing engine (210), a machine learning engine (212), a stereoscopic depth estimation module (214), and other modules (216). The processor(s) (202) via the data processing engine (210) may detect the one or more stereo images associated with the machine via the plurality of image acquisition units at predetermined periods (temporal domain). The data processing engine (210) may store the one or more stereo images in the local database (218). The processor(s) (202) via the machine learning engine (212) may retrieve the one or more stereo images from the local database (218) and may detect one or more images associated with the one or more machine parts among the one or more stereo images, where the one or more machine parts are configured with the machine. The processor(s) (202) via the machine learning engine (212), using an AI based method, may estimate one or more depths associated with the one or more machine parts based on the one or more images. The processor(s) (202) via the stereoscopic depth estimation module (214) may estimate the one or more depths associated with the one or more machine parts using a stereoscopic method based on the one or more images. The processor(s) (202) may combine the one or more depths estimated using the AI based method and the stereoscopic method into one or more final depths of the one or more machine parts. The processor(s) (202) may estimate one or more degradation levels of the one or more machine parts based on the one or more final depths. The other modules (216) may implement the other functionalities of the system (100).
[0044] In an aspect, the present disclosure relates to a method for machine degradation estimation. FIG. 3 illustrates an example method (300) for machine degradation estimation, in accordance with an embodiment of the present disclosure. The system referred to in the method (300) is similar to the system (100) shown in FIG. 1. The method may include the following steps. At step 302, a system (100) may detect one or more stereo images of a machine, via a plurality of image acquisition units, at predetermined periods (temporal domain), where the plurality of image acquisition units are configured at predetermined (fixed) distances from the machine for detecting the one or more stereo images of the machine. At step 304, the system (100) may detect one or more images associated with the one or more machine parts among the one or more stereo images, where the one or more machine parts are configured with the machine. At step 306, the system (100) may estimate one or more depths associated with the one or more machine parts, using an AI based method, based on the one or more images. At step 308, the system (100) may estimate the one or more depths associated with the one or more machine parts, using a stereoscopic method for depth estimation, based on the one or more images. At step 310, the system (100) may combine the one or more depths estimated using the AI based method and the stereoscopic method into one or more final depths of the one or more machine parts. At step 312, the system (100) may estimate one or more degradation levels of the one or more machine parts, based on the one or more final depths.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0045] The present disclosure provides a system and a method for machine degradation estimation while the machine is running.
[0046] The present disclosure provides a system and a method for machine degradation estimation in real time.
[0047] The present disclosure provides a system and a method for machine degradation estimation that provide timely alerts.
Claims:
1. A system (100) for machine degradation estimation, the system comprising:
a plurality of image acquisition units configured at predetermined distances from a machine for detecting one or more stereo images of the machine;
at least one processor (202); and
a memory (204) operatively coupled to the at least one processor (202), wherein said memory (204) stores executable instructions which when executed by the at least one processor (202), causes the at least one processor (202) to:
detect the one or more stereo images associated with the machine, via the plurality of image acquisition units, at predetermined periods;
detect one or more images associated with the one or more machine parts among the one or more stereo images, wherein the one or more machine parts are configured with the machine;
estimate one or more depths associated with the one or more machine parts using an Artificial Intelligence (AI) based method, based on the one or more images;
estimate the one or more depths associated with the one or more machine parts using a stereoscopic method, based on the one or more images;
combine the one or more depths estimated using the AI based method and the stereoscopic method into one or more final depths of the one or more machine parts; and
estimate one or more degradation levels of the one or more machine parts, based on the one or more final depths.
2. The system (100) as claimed in claim 1, wherein the AI based method for estimating the one or more depths associated with the one or more machine parts comprises estimating, by the at least one processor (202), the one or more depths using a machine learning (ML) model, based on the one or more images associated with the one or more machine parts.
3. The system (100) as claimed in claim 1, wherein the stereoscopic method for estimating the one or more depths associated with the one or more machine parts comprises:
matching, by the at least one processor (202), one or more features of the one or more machine parts among the one or more stereo images; and
estimating, by the at least one processor (202), the one or more depths associated with the one or more machine parts, based on one or more displacements of the one or more features associated with the one or more machine parts.
4. The system (100) as claimed in claim 1, wherein the at least one processor (202) transmits an alert when the one or more final depths of the one or more machine parts are within a pre-defined range, indicating a minimum degradation and a maximum degradation associated with the one or more machine parts.
5. The system (100) as claimed in claim 4, wherein the one or more degradation levels associated with the one or more machine parts are encoded in the alert via one or more colors.
6. A method (300) for machine degradation estimation, the method (300) comprising:
detecting (302), by at least one processor (202) associated with a system (100), one or more stereo images of a machine, via a plurality of image acquisition units, at predetermined periods, wherein the plurality of image acquisition units are configured at predetermined distances from the machine for detecting the one or more stereo images of the machine;
detecting (304), by the at least one processor (202), one or more images associated with the one or more machine parts among the one or more stereo images, wherein the one or more machine parts are configured with the machine;
estimating (306), by the at least one processor (202), one or more depths associated with one or more machine parts, using an AI based method, based on the one or more images, wherein the one or more machine parts are configured with the machine;
estimating (308), by the at least one processor (202), the one or more depths associated with the one or more machine parts, using a stereoscopic method for depth estimation, based on the one or more images;
combining (310), by the at least one processor (202), the one or more depths estimated using the AI based method and the stereoscopic method, into one or more final depths of the one or more machine parts; and
estimating (312), by the at least one processor (202), one or more degradation levels of the one or more machine parts, based on the one or more final depths.
7. The method (300) as claimed in claim 6, wherein the AI based method for estimating the one or more depths associated with the one or more machine parts comprises estimating, by the at least one processor (202), the one or more depths using an ML model, based on the one or more images associated with the one or more machine parts.
8. The method (300) as claimed in claim 6, wherein the stereoscopic method for estimating the one or more depths associated with the one or more machine parts comprises:
matching, by the at least one processor (202), one or more features of the one or more machine parts among the one or more stereo images; and
estimating, by the at least one processor (202), the one or more depths associated with the one or more machine parts based on one or more displacements of the one or more features associated with the one or more machine parts.
9. The method (300) as claimed in claim 6, wherein the at least one processor (202) transmits an alert when the one or more final depths of the one or more machine parts are within a pre-defined range, indicating a minimum degradation and a maximum degradation associated with the one or more machine parts.
10. The method (300) as claimed in claim 9, wherein the one or more degradation levels associated with the one or more machine parts are encoded in the alert via one or more colors.