
Method And System For Identifying Anomalies In A Target Object

Abstract: The invention relates to a method (400) and a system (500) for identifying anomalies in a target object. The method (400) includes capturing (401) one or more laser images and one or more hyperspectral images of the target object in real time; generating (402) a 3D laser augmented hyperspectral image of the target object from the one or more laser images and the one or more hyperspectral images, based on a common reference image point, using an image overlaying algorithm; identifying (403) one or more anomalies in the 3D laser augmented hyperspectral image based on a reference 3D laser augmented hyperspectral image of an ideal target object; and classifying (404) the target object into one of a plurality of categories based on the one or more anomalies.


Patent Information

Application #
Filing Date
11 January 2022
Publication Number
02/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
docketing@inventip.in
Parent Application
Patent Number
Legal Status
Grant Date
2025-01-21
Renewal Date

Applicants

HCL Technologies Limited
806, Siddharth, 96, Nehru Place, New Delhi-110019, INDIA

Inventors

1. Navin Saini
HCL Technologies Limited Technology Hub, SEZ Plot No. 3A, Sector 126. Noida Uttar Pradesh India 201304
2. Yogesh Gupta
HCL Technologies Limited Technology Hub, SEZ Plot No. 3A, Sector 126. Noida Uttar Pradesh India 201304

Specification

Generally, the invention relates to imaging systems. More specifically, the invention relates to a method and system for identifying anomalies in a target object.
BACKGROUND
[002] In typical industrial setups, products such as machine parts, medicines, and the like are manufactured. These industrial setups always include a section where all the products are brought in for inspection. Most of the time, the inspection section includes a conveyor belt, and an inspection team is responsible for picking up finished or semi-finished products from the conveyor belt, which may be inspected further based on a defined sample size. Thus, current industrial setups are manual and prone to human error, expensive because labor cost is involved, time consuming because manual inspection and measurement take time, and reliant on sampling because each part may not be inspected. In the case of sampling, there may also be a few misses in the inspection process.
[003] Further, various systems with automation are available for identifying anomalies in objects. These systems install multiple cameras to identify various anomalies: for example, a first camera to identify anomalies in size, a second camera to identify contour anomalies, and a third camera to identify anomalies in composition. The number of installed cameras may depend on the requirement. However, these cameras only capture images or videos of the products. Further, based on the captured images or videos, alerts may be triggered. Moreover, an inspection team may still be needed for inspecting measurement anomalies, incorrect contours, and composition anomalies.
[004] Therefore, there is a need to develop a system that may automatically identify anomalies in measurement, contour, as well as composition of any object or product without human intervention.
SUMMARY
[005] In one embodiment, a method for identifying anomalies in a target object is disclosed. The method may include capturing one or more laser images of the target object and one or more hyperspectral images of the target object in real time. The one or more laser images may represent a three-dimensional (3D) structural profile of the target object and the one or more hyperspectral images may represent a two-dimensional (2D) composition profile of the target object. The method may further include generating a 3D laser augmented hyperspectral image of the target object, based on the one or more laser images and the one or more hyperspectral images, based on a common reference image point, using an image overlaying algorithm. The method may further include identifying one or more anomalies in the 3D laser augmented hyperspectral image based on a reference 3D laser augmented hyperspectral image of an ideal target object. The method may further include classifying the target object into one of a plurality of categories based on the one or more anomalies.
[006] In another embodiment, a system for identifying anomalies in a target object is disclosed. The system may include a processor and a memory communicatively coupled to the processor. The memory may store processor-executable instructions, which, on execution, may cause the processor to capture one or more laser images of the target object and one or more hyperspectral images of the target object, in real time. The one or more laser images may represent a three-dimensional (3D) structural profile of the target object, and the one or more hyperspectral images may represent a two-dimensional (2D) composition profile of the target object. The processor-executable instructions, on execution, may further cause the processor to generate a 3D laser augmented hyperspectral image of the target object, based on the one or more laser images and the one or more hyperspectral images, based on a common reference image point, using an image overlaying algorithm. The processor-executable instructions, on execution, may further cause the processor to identify one or more anomalies in the 3D laser augmented hyperspectral image based on a reference 3D laser augmented hyperspectral image of an ideal target object. The processor-executable instructions, on execution, may further cause the processor to classify the target object into one of a plurality of categories based on the one or more anomalies.
[007] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.

BRIEF DESCRIPTION OF THE DRAWINGS
[008] The present application can be best understood by reference to the following description taken in conjunction with the accompanying drawing figures, in which like parts may be referred to by like numerals.
[009] FIG. 1 illustrates a hyperspectral laser (HSL) imager device configured for identifying anomalies in a target object, in accordance with some embodiments of the present disclosure.
[010] FIG. 2 illustrates an exemplary system for generating laser images using a laser imager of the HSL imager device, in accordance with some embodiments of the present disclosure.
[011] FIGS. 3A, 3B, and 3C illustrate a laser image, a hyperspectral image, and a 3D laser augmented hyperspectral image of an exemplary object, respectively, in accordance with some embodiments of the present disclosure.
[012] FIG. 4 illustrates a flow diagram of an exemplary process for identifying anomalies in a target object, in accordance with some embodiments of the present disclosure.
[013] FIG. 5 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.

DETAILED DESCRIPTION OF THE DRAWINGS
[014] The following description is presented to enable a person of ordinary skill in the art to make and use the invention and is provided in the context of particular applications and their requirements. Various modifications to the embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Moreover, in the following description, numerous details are set forth for the purpose of explanation. However, one of ordinary skill in the art will realize that the invention might be practiced without the use of these specific details. In other instances, well-known structures and devices are shown in block diagram form in order not to obscure the description of the invention with unnecessary detail. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
[015] While the invention is described in terms of particular examples and illustrative figures, those of ordinary skill in the art will recognize that the invention is not limited to the examples or figures described. Those skilled in the art will recognize that the operations of the various embodiments may be implemented using hardware, software, firmware, or combinations thereof, as appropriate. For example, some processes can be carried out using processors or other digital circuitry under the control of software, firmware, or hard-wired logic. (The term “logic” herein refers to fixed hardware, programmable logic and/or an appropriate combination thereof, as would be recognized by one skilled in the art to carry out the recited functions.) Software and firmware can be stored on computer-readable storage media. Some other processes can be implemented using analog circuitry, as is well known to one of ordinary skill in the art. Additionally, memory or other storage, as well as communication components, may be employed in embodiments of the invention.
[016] Referring now to FIG. 1, a hyperspectral laser (HSL) imager device 100 configured for identifying anomalies in a target object 101 is illustrated, in accordance with some embodiments of the present disclosure. In some embodiments, the HSL imager device 100 may include a laser imager 102, a hyperspectral imager 103, an image merging module 104, an anomaly identification module 105, and a classification module 106. Further, the HSL imager device 100 may also include a data store (not shown in FIG. 1) in order to store intermediate results generated by the modules 102-106.
[017] The laser imager 102 may be configured to capture one or more laser images of the target object 101. Further, in some embodiments, the laser imager 102 may capture images from different angles at different intervals of time, as required. It should be noted that a laser profile of the target object 101 may be captured by capturing the one or more laser images. The target object 101 may be any type of product. For example, in some embodiments, the target object 101 may be a medicine strip. Also, it should be noted that the one or more laser images represent a three-dimensional (3D) structural profile of the target object 101. The 3D structural profile of an exemplary target object is further explained in conjunction with FIGS. 3 and 4. The one or more laser images may be required for identifying the shape, size, and contour of the target object 101.
[018] The hyperspectral imager 103 may be configured to capture one or more hyperspectral images of the target object 101. Further, the one or more hyperspectral images may be used to obtain a spectrum for each pixel in an image of the target object 101. The hyperspectral imager 103 may divide the spectrum into various bands that may be useful for providing additional information about the target object 101. It should be noted that the one or more hyperspectral images represent a two-dimensional (2D) composition profile of the target object 101. The laser imager 102 and the hyperspectral imager 103 may be communicatively coupled to the image merging module 104 in order to process the one or more laser images and the one or more hyperspectral images.
[019] The image merging module 104 may be configured to generate a 3D laser augmented hyperspectral image of the target object. It should be noted that the 3D laser augmented hyperspectral image may be generated based on the one or more laser images and the one or more hyperspectral images, using an overlaying algorithm. Further, a common reference point may be considered for generating the 3D laser augmented hyperspectral image. The image merging module 104 may pass the 3D laser augmented hyperspectral image to the operatively connected anomaly identification module 105.
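By way of a non-limiting illustration, the overlay step described above may be sketched as follows. This is a minimal sketch assuming a single-channel laser height map, a hyperspectral cube, and one shared reference pixel known in both images; the function name, array shapes, and alignment-by-shift approach are assumptions for illustration, not the disclosed algorithm:

```python
import numpy as np

def merge_laser_hyperspectral(height_map, hsi_cube, ref_laser, ref_hsi):
    """Overlay a 2D hyperspectral cube onto a 3D laser height map.

    height_map : (H, W) array of Z values from the laser imager.
    hsi_cube   : (H, W, B) array of per-pixel spectra.
    ref_laser, ref_hsi : (row, col) of the same physical point in each
    image, used as the common reference point for alignment.
    """
    # Shift the hyperspectral cube so the two reference points coincide.
    dr = ref_laser[0] - ref_hsi[0]
    dc = ref_laser[1] - ref_hsi[1]
    aligned = np.roll(hsi_cube, shift=(dr, dc), axis=(0, 1))
    # Augmented image: the Z channel stacked with the spectral bands,
    # giving an (H, W, 1 + B) "3D laser augmented hyperspectral" array.
    return np.dstack([height_map[..., None], aligned])
```

In practice the two imagers would also need geometric calibration (scale and rotation), which is omitted here for brevity.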
[020] The anomaly identification module 105 may be configured to identify one or more anomalies in the target object 101. In detail, the anomaly identification module 105 may consider a reference 3D laser augmented hyperspectral image of an ideal target object and, based on it, identify the one or more anomalies in the target object 101. In some embodiments, for anomaly identification, information associated with the shape, size, orientation, contour, and composition of the target object 101 may be extracted from the 3D laser augmented hyperspectral image. Further, in some embodiments, the 3D laser augmented hyperspectral image of the target object 101 may be compared with the reference 3D laser augmented hyperspectral image of the ideal target object with respect to their dimensionalities and compositions. In some other embodiments, for identifying the one or more anomalies, a time-series analysis may be performed on the 3D laser augmented hyperspectral image of the target object. Further, the anomaly identification module 105 may be communicatively coupled to the classification module 106.
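The comparison against the reference image may be sketched as follows. The tolerances, channel layout (channel 0 = laser Z profile, remaining channels = spectra), and mean-absolute-deviation metric are illustrative assumptions, not the disclosed method:

```python
import numpy as np

def identify_anomalies(target, reference, dim_tol=0.05, comp_tol=0.05):
    """Compare a 3D laser augmented hyperspectral image of the target
    object against the reference image of an ideal object.

    Both arrays are (H, W, 1 + B): channel 0 is the laser height
    profile, channels 1..B are the hyperspectral bands. Returns a list
    of (anomaly_type, severity) tuples.
    """
    anomalies = []
    # Structural check: deviation of the height profile (shape/size).
    dim_dev = float(np.mean(np.abs(target[..., 0] - reference[..., 0])))
    if dim_dev > dim_tol:
        anomalies.append(("shape/size", dim_dev))
    # Composition check: deviation of the spectral channels.
    comp_dev = float(np.mean(np.abs(target[..., 1:] - reference[..., 1:])))
    if comp_dev > comp_tol:
        anomalies.append(("composition", comp_dev))
    return anomalies
```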
[021] The classification module 106 may be configured to classify the target object 101 into one of a plurality of categories. In some embodiments, the plurality of categories may correspond to a plurality of types of the target object 101. It should be noted that the classification may be performed by the classification module 106 based on the one or more anomalies. The plurality of categories may include an exact match category, a failure category, and a manual inspection category. The exact match category may be a category for an object matching a baselined model and meeting quality standards of shape, size, as well as composition. Further, the failure category may be a category for an object not matching pre-specified standards. Moreover, in the manual inspection category, the HSL imager device 100 may be unable to make a decision; in this situation, manual intervention for the inspection may be needed.
[022] Additionally, in some embodiments, a boundary (for example, a circle or a square) may be created around the target object 101 in the 3D laser augmented hyperspectral image of the target object 101. The boundary may be created with any pre-defined color (such as red, blue, green, and the like). It should be noted that the color of the boundary may depend on the category of the target object 101. By way of an example, a green color may be used for the exact match category of the target object 101, a red color for the failure category, and an orange color for the manual inspection category.
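The category and boundary-color mapping described in the two paragraphs above may be sketched as follows. The severity-based rule for routing borderline cases to manual inspection, and the threshold value, are illustrative assumptions only:

```python
def classify(anomalies, fail_threshold=0.2):
    """Map a list of (anomaly_type, severity) tuples to one of the
    three categories described above, together with the boundary color
    used to annotate the object in the augmented image.
    """
    if not anomalies:
        # Object matches the baselined model: exact match, green boundary.
        return "exact match", "green"
    worst = max(severity for _, severity in anomalies)
    if worst >= fail_threshold:
        # Clear deviation from pre-specified standards: failure, red.
        return "failure", "red"
    # Mild, borderline deviation: the device cannot decide on its own,
    # so the object is routed to manual inspection, orange boundary.
    return "manual inspection", "orange"
```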
[023] By way of an example, the HSL imager device 100 may be set up over a conveyor belt to inspect various products. The HSL imager device 100 may capture information such as images and texture at a fast pace so that the conveyor belt may keep moving. Apart from capturing images, the HSL imager device 100 may use the image merging module 104 to merge multiple images recorded by the different modules 102 and 103 of the HSL imager device 100. Further, the classification module 106 may classify the products into three different categories after anomaly identification by the anomaly identification module 105. The categories may include a certified and approved category (corresponding to the exact match category), in which no issue is found with the product; a failed category, in which an issue is identified and the product needs to be discarded; and a manual inspection category, in which the HSL imager device 100 is unable to classify the product and manual inspection is required.
[024] Referring now to FIG. 2, an exemplary system 200 for generating laser images using the HSL imager device 100 is illustrated, in accordance with some embodiments of the present disclosure. The system 200 includes a laser imager 201 and an object 202. The laser imager 201 may be similar to the laser imager 102. The laser imager 201 may be configured to capture a 3D structural profile 203 of the object 202. Further, the laser imager 201 may provide its output as multiple cross-sectional images. It should be noted that multiple such images may be captured at very short distances and combined to form a 3D image of the object 202. In other words, to capture a laser profile of an object under consideration (such as the object 202), the laser imager 201 may output a 3D profile of the object 202, which may be further used to identify any anomaly in shape and size.
[025] As illustrated in FIG. 2, a laser stripe 204 may be projected on the object 202 and imaged on an image sensor. Hence, X, Y, and Z dimensional information associated with the object 202 may be obtained using the laser imager 201. The laser imager 201 may identify measurements based on ‘XYZ’ coordinates. Also, contour variations (which may be represented using different color saturations) may be identified by the laser imager 201.
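The depth recovery underlying the laser-stripe arrangement above may be sketched with a simple triangulation model. The geometry assumed here (camera at the origin, laser source offset by a baseline along X and angled toward the optical axis) is an illustration; real systems are calibrated rather than relying on this closed form directly:

```python
import math

def stripe_depth(d_px, baseline, focal_px, theta_deg):
    """Depth Z of a surface point from laser-stripe triangulation.

    With the camera at the origin looking along Z and the laser offset
    by `baseline` along X, projecting at `theta_deg` toward the optical
    axis, the stripe appears at pixel offset `d_px` on the sensor.
    Intersecting the laser plane (x = b - z*tan(theta)) with the camera
    ray (x = d*z/f) gives:

        Z = b * f / (d + f * tan(theta))
    """
    return baseline * focal_px / (
        d_px + focal_px * math.tan(math.radians(theta_deg)))
```

Scanning the stripe across the object and stacking the recovered profiles yields the cross-sectional images that are combined into the 3D structural profile 203.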
[026] Referring now to FIGS. 3A, 3B, and 3C, a laser image 300A, a hyperspectral image 300B, and a 3D laser augmented hyperspectral image 300C of an exemplary object are illustrated, in accordance with some embodiments of the present disclosure. FIG. 3A represents a 3D laser image which may be captured using a laser imager (similar to the laser imagers 102 and 201). Capturing of laser images using a laser imager has already been explained in detail in conjunction with FIG. 1 and FIG. 2. Further, the hyperspectral image 300B of the same object may be captured using a hyperspectral imager (same as the hyperspectral imager 103) of the HSL imager device 100. The hyperspectral image 300B may further be used to identify anomalies in composition, for example, the percentage of raw material used or the percentage mix of iron and aluminum in an alloy.
[027] By way of an example, in some situations, such as with medicine packages, a user may be unable to identify an anomaly visually. However, the hyperspectral image 300B may show a composition defect or a missing-tablet defect. Further, it may be noted that the hyperspectral image 300B represents a 2D composition profile of the object.
[028] With regard to the composition profile, the hyperspectral image 300B may cover the composition information of the object. In particular, the hyperspectral image 300B may be captured to obtain a spectrum for each pixel in an image of the object, with the purpose of identifying the materials used in manufacturing the object. Generally, a human eye is capable of seeing only visible light in three bands (i.e., red, green, and blue). However, a hyperspectral image deals with many more bands, which may not be visible to the human eye. By way of an example, each element (such as plastic, rubber, iron, soil, etc.) has a unique spectral signature. Therefore, a hyperspectral image may be captured to identify different components in an image with the help of different colors. Similarly, in the hyperspectral image 300B, different shaded areas are used to represent different colors, and each shaded area may represent an element of the object.
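The per-pixel material identification described above may be sketched as a nearest-neighbour match against a library of reference spectra. The material names, two-band spectra, and Euclidean-distance criterion are illustrative assumptions; practical systems use many bands and more robust spectral similarity measures:

```python
import numpy as np

def label_pixels(hsi_cube, signatures):
    """Assign each pixel of an (H, W, B) hyperspectral cube to the
    material whose reference spectrum it matches most closely.

    `signatures` maps material name -> (B,) reference spectrum.
    Returns an (H, W) array of material names.
    """
    names = list(signatures)
    refs = np.stack([signatures[n] for n in names])        # (M, B)
    H, W, B = hsi_cube.shape
    flat = hsi_cube.reshape(-1, B)                         # (H*W, B)
    # Euclidean distance of every pixel spectrum to every signature.
    dists = np.linalg.norm(flat[:, None, :] - refs[None, :, :], axis=2)
    best = dists.argmin(axis=1).reshape(H, W)
    return np.array(names, dtype=object)[best]
```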
[029] Further, the 3D laser augmented hyperspectral image 300C of the object may be generated based on the laser image 300A and the hyperspectral image 300B. The laser image 300A and the hyperspectral image 300B may be merged together using a common reference point in order to generate the 3D laser augmented hyperspectral image 300C. Once both images are merged, a 3D model of the object in color may be obtained, as illustrated in FIG. 3C. Also, in the 3D laser augmented hyperspectral image 300C, different shaded areas represent different colors. This representation may further be used as a baseline for anomaly detection or classification.
[030] The HSL imager device 100 may be useful in various industries to automate the manual inspection process. The HSL imager device 100 follows a non-intrusive approach and may be used with existing setups without many changes. Further, the HSL imager device 100 may have various applications across industries, such as mining (mineralogy), geology, earth observation (for water, rock, soil, and plant analysis), agriculture (for crop quality, pesticide spread, and water need), food processing (food quality), civil engineering (for crack identification), ecology, and surveillance and inspection.
[031] In short, the HSL imager device 100 may identify an anomaly in shape, size, and composition (for example, excess humidity in an alloy used to make parts or medicine, a changing composition of soil, or an oil leakage).
[032] It should be noted that the HSL imager device 100 may be implemented in programmable hardware devices such as programmable gate arrays, programmable array logic, programmable logic devices, or the like. Alternatively, the HSL imager device 100 may be implemented in software for execution by various types of processors. An identified engine/module of executable code may, for instance, include one or more physical or logical blocks of computer instructions which may, for instance, be organized as a component, module, procedure, function, or other construct. Nevertheless, the executables of an identified engine/module need not be physically located together but may include disparate instructions stored in different locations which, when joined logically together, comprise the identified engine/module and achieve the stated purpose of the identified engine/module. Indeed, an engine or a module of executable code may be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices.
[033] As will be appreciated by one skilled in the art, a variety of processes may be employed for identifying anomalies in a target object. For example, the exemplary HSL imager device 100 may identify anomalies in the target object, by the process discussed herein. In particular, as will be appreciated by those of ordinary skill in the art, control logic and/or automated routines for performing the techniques and steps described herein may be implemented by the HSL imager device 100 either by hardware, software, or combinations of hardware and software. For example, suitable code may be accessed and executed by the one or more processors to perform some or all of the techniques described herein. Similarly, application specific integrated circuits (ASICs) configured to perform some or all the processes described herein may be included in the one or more processors of the HSL imager device 100.
[034] Referring now to FIG. 4, an exemplary process 400 for identifying anomalies in a target object is depicted via a flow diagram, in accordance with some embodiments of the present disclosure. Each step of the process 400 may be performed by an HSL imager device (similar to the HSL imager device 100). FIG. 4 is explained in conjunction with FIG. 1.
[035] At step 401, one or more laser images (such as the laser images 102a) of the target object and one or more hyperspectral images (such as the hyperspectral images 103a) of the target object may be captured in real time. To capture the one or more laser images and the one or more hyperspectral images, the HSL imager device 100 may include at least one laser imager and one hyperspectral imager (similar to the laser imager 102 and the hyperspectral imager 103). It should be noted that the one or more laser images represent a three-dimensional (3D) structural profile of the target object, and the one or more hyperspectral images represent a two-dimensional (2D) composition profile of the target object.
[036] At step 402, a 3D laser augmented hyperspectral image of the target object may be generated based on the one or more laser images and the one or more hyperspectral images. An image merging module (same as the image merging module 104) may be employed to perform this operation. In particular, an image overlaying algorithm may be used for generating the 3D laser augmented hyperspectral image of the target object. Also, it should be noted that a common reference image point may be considered to generate the 3D laser augmented hyperspectral image of the target object.
[037] Further, at step 403, one or more anomalies may be identified in the 3D laser augmented hyperspectral image with the help of an anomaly identification module (similar to the anomaly identification module 105). Further, a reference 3D laser augmented hyperspectral image of an ideal target object may be considered to identify the one or more anomalies. In other words, the 3D laser augmented hyperspectral image of the target object may be compared with the reference 3D laser augmented hyperspectral image of the ideal target object with respect to their dimensionalities and compositions. In some embodiments, to identify the one or more anomalies, information may be extracted from the 3D laser augmented hyperspectral image. The information may be associated with shape, size, orientation, contour, and composition of the target object. Further, in some embodiments, a time-series analysis for the 3D laser augmented hyperspectral image of the target object may be performed.
[038] Thereafter, at step 404, the target object may be classified into one of a plurality of categories based on the one or more anomalies. A classification module (analogous to the classification module 106) may be used for the classification. The plurality of categories may include, but is not limited to, an exact match category, a failure category, and a manual inspection category. In some embodiments, a boundary may be created around the target object in the 3D laser augmented hyperspectral image of the target object. It should be noted that the boundaries are created based on the identification of the one or more anomalies. Also, a plurality of colors may be used for creating the boundaries, to differentiate the plurality of categories.
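The boundary annotation in step 404 may be sketched as follows, a minimal illustration assuming an RGB rendering of the augmented image and a binary mask of object pixels; the one-pixel rectangular frame and the function name are assumptions for illustration:

```python
import numpy as np

def draw_boundary(rgb_image, mask, color):
    """Paint a one-pixel rectangular boundary around the object pixels
    given by `mask` onto an (H, W, 3) RGB image.

    `color` is an (R, G, B) triple whose value encodes the category,
    e.g. green for exact match, red for failure, orange for manual
    inspection.
    """
    rows, cols = np.nonzero(mask)
    r0, r1 = rows.min(), rows.max()
    c0, c1 = cols.min(), cols.max()
    out = rgb_image.copy()
    # Top, bottom, left, and right edges of the bounding rectangle.
    out[r0, c0:c1 + 1] = color
    out[r1, c0:c1 + 1] = color
    out[r0:r1 + 1, c0] = color
    out[r0:r1 + 1, c1] = color
    return out
```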
[039] The disclosed methods and systems may be implemented on a conventional or a general-purpose computer system, such as a personal computer (PC) or server computer. Referring now to FIG. 5, an exemplary computing system 500 that may be employed to implement processing functionality for various embodiments (e.g., as a SIMD device, client device, server device, one or more processors, or the like) is illustrated. Those skilled in the relevant art will also recognize how to implement the invention using other computer systems or architectures. The computing system 500 may represent, for example, a user device such as a desktop, a laptop, a mobile phone, a personal entertainment device, a DVR, and so on, or any other type of special or general-purpose computing device as may be desirable or appropriate for a given application or environment. The computing system 500 may include one or more processors, such as a processor 501 that may be implemented using a general or special purpose processing engine such as, for example, a microprocessor, microcontroller, or other control logic. In this example, the processor 501 is connected to a bus 502 or other communication medium. In some embodiments, the processor 501 may be an Artificial Intelligence (AI) processor, which may be implemented as a Tensor Processing Unit (TPU), a graphics processing unit (GPU), or a custom programmable solution such as a Field-Programmable Gate Array (FPGA).
[040] The computing system 500 may also include a memory 503 (main memory), for example, Random Access Memory (RAM) or other dynamic memory, for storing information and instructions to be executed by the processor 501. The memory 503 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 501. The computing system 500 may likewise include a read only memory (“ROM”) or other static storage device coupled to bus 502 for storing static information and instructions for the processor 501.
[041] The computing system 500 may also include a storage device 504, which may include, for example, a media drive 505 and a removable storage interface. The media drive 505 may include a drive or other mechanism to support fixed or removable storage media, such as a hard disk drive, a floppy disk drive, a magnetic tape drive, an SD card port, a USB port, a micro USB port, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive. A storage media 506 may include, for example, a hard disk, magnetic tape, flash drive, or other fixed or removable medium that is read by and written to by the media drive 505. As these examples illustrate, the storage media 506 may include a computer-readable storage medium having stored therein particular computer software or data.
[042] In alternative embodiments, the storage devices 504 may include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into the computing system 500. Such instrumentalities may include, for example, a removable storage unit 507 and a storage unit interface 508, such as a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, and other removable storage units and interfaces that allow software and data to be transferred from the removable storage unit 507 to the computing system 500.
[043] The computing system 500 may also include a communications interface 509. The communications interface 509 may be used to allow software and data to be transferred between the computing system 500 and external devices. Examples of the communications interface 509 may include a network interface (such as an Ethernet or other NIC card), a communications port (such as, for example, a USB port or a micro USB port), Near Field Communication (NFC), etc. Software and data transferred via the communications interface 509 are in the form of signals which may be electronic, electromagnetic, optical, or other signals capable of being received by the communications interface 509. These signals are provided to the communications interface 509 via a channel 510. The channel 510 may carry signals and may be implemented using a wireless medium, wire or cable, fiber optics, or other communications medium. Some examples of the channel 510 may include a phone line, a cellular phone link, an RF link, a Bluetooth link, a network interface, a local or wide area network, and other communications channels.
[044] The computing system 500 may further include Input/Output (I/O) devices 511. Examples may include, but are not limited to a display, keypad, microphone, audio speakers, vibrating motor, LED lights, etc. The I/O devices 511 may receive input from a user and also display an output of the computation performed by the processor 501. In this document, the terms “computer program product” and “computer-readable medium” may be used generally to refer to media such as, for example, the memory 503, the storage devices 504, the removable storage unit 507, or signal(s) on the channel 510. These and other forms of computer-readable media may be involved in providing one or more sequences of one or more instructions to the processor 501 for execution. Such instructions, generally referred to as “computer program code” (which may be grouped in the form of computer programs or other groupings), when executed, enable the computing system 500 to perform features or functions of embodiments of the present invention.
[045] In an embodiment where the elements are implemented using software, the software may be stored in a computer-readable medium and loaded into the computing system 500 using, for example, the removable storage unit 507, the media drive 505 or the communications interface 509. The control logic (in this example, software instructions or computer program code), when executed by the processor 501, causes the processor 501 to perform the functions of the invention as described herein.
[046] Thus, the present disclosure may overcome the drawbacks of traditional systems discussed above. The method and system disclosed herein may help in identifying defects in shape, size, measurement, orientation, as well as composition. Therefore, the disclosed system may be employed in manufacturing units across industrial, medical, and automation domains. Further, the disclosure provides increased production capacity by identifying defects in real time; hence, the time usually taken for inspection may be reduced. Further, the disclosure may provide the advantage of cost reduction: as the system works automatically and no manual intervention is needed, labor cost may be reduced to some extent. Additionally, the disclosure provides other advantages including non-intrusive inspection, no sampling, and zero defect leakage from the inspection system. Also, the disclosed HSL imager device processes data in real time and helps in classification of products at runtime, thereby reducing effort and time in any manufacturing process.
[047] Further, the HSL imager device may provide a contactless measurement check by measuring the shape and size of products without touching them and flagging defects in real time.
[048] Further, the HSL imager device may precisely check contours, identifying the contours in a product with high accuracy by measuring angles and depths as well.
[049] It will be appreciated that, for clarity purposes, the above description has described embodiments of the invention with reference to different functional units and processors. However, it will be apparent that any suitable distribution of functionality between different functional units, processors or domains may be used without detracting from the invention. For example, functionality illustrated to be performed by separate processors or controllers may be performed by the same processor or controller. Hence, references to specific functional units are only to be seen as references to suitable means for providing the described functionality, rather than indicative of a strict logical or physical structure or organization.
[050] Although the present invention has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Rather, the scope of the present invention is limited only by the claims. Additionally, although a feature may appear to be described in connection with particular embodiments, one skilled in the art would recognize that various features of the described embodiments may be combined in accordance with the invention.
[051] Furthermore, although individually listed, a plurality of means, elements or process steps may be implemented by, for example, a single unit or processor. Additionally, although individual features may be included in different claims, these may possibly be advantageously combined, and the inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous. Also, the inclusion of a feature in one category of claims does not imply a limitation to this category, but rather the feature may be equally applicable to other claim categories, as appropriate.

CLAIMS

We Claim:

1. A method (400) for identifying anomalies in a target object, the method (400) comprising:
capturing (401), by a hyperspectral laser (HSL) imager device (100), one or more laser images of the target object and one or more hyperspectral images of the target object, in real time, wherein the one or more laser images represent a three-dimensional (3D) structural profile of the target object, and wherein the one or more hyperspectral images represent a two-dimensional (2D) composition profile of the target object;
generating (402), by the HSL imager device (100), a 3D laser augmented hyperspectral image of the target object, based on the one or more laser images and the one or more hyperspectral images, based on a common reference image point, using an image overlaying algorithm;
identifying (403), by the HSL imager device (100), one or more anomalies in the 3D laser augmented hyperspectral image based on a reference 3D laser augmented hyperspectral image of an ideal target object; and
classifying (404), by the HSL imager device (100), the target object into one of a plurality of categories based on the one or more anomalies.

2. The method (400) of claim 1, wherein identifying (403) one or more anomalies comprises extracting information associated with shape, size, orientation, contour, and composition of the target object from the 3D laser augmented hyperspectral image.

3. The method (400) of claim 1, wherein classifying (404) the target object comprises creating a boundary around the target object in the 3D laser augmented hyperspectral image of the target object, wherein boundaries are created based on identification of the one or more anomalies, and wherein a plurality of colours differentiating the plurality of categories are used for creating the boundaries.

4. The method (400) of claim 1, wherein the plurality of categories comprises an exact match category, a failure category, and a manual inspection category.

5. The method (400) of claim 1, wherein identifying (403) the one or more anomalies comprises comparing the 3D laser augmented hyperspectral image of the target object with the reference 3D laser augmented hyperspectral image of the ideal target object with respect to their dimensionalities and compositions.

6. The method (400) of claim 1, wherein identifying (403) the one or more anomalies comprises performing a time-series analysis for the 3D laser augmented hyperspectral image of the target object.

7. A system (500) for identifying anomalies in a target object, the system (500) comprising:
a processor (501); and
a memory (503) communicatively coupled to the processor (501), wherein the memory (503) stores processor-executable instructions, which, on execution, cause the processor (501) to:
capture (401) one or more laser images of the target object and one or more hyperspectral images of the target object, in real time, wherein the one or more laser images represent a three-dimensional (3D) structural profile of the target object, and wherein the one or more hyperspectral images represent a two-dimensional (2D) composition profile of the target object;
generate (402) a 3D laser augmented hyperspectral image of the target object, based on the one or more laser images and the one or more hyperspectral images, based on a common reference image point, using an image overlaying algorithm;
identify (403) one or more anomalies in the 3D laser augmented hyperspectral image based on a reference 3D laser augmented hyperspectral image of an ideal target object; and
classify (404) the target object into one of a plurality of categories based on the one or more anomalies.

8. The system (500) of claim 7, wherein the processor-executable instructions further cause the processor (501) to identify the one or more anomalies by:
extracting information associated with shape, size, orientation, contour, and composition of the target object from the 3D laser augmented hyperspectral image; and
comparing the 3D laser augmented hyperspectral image of the target object with the reference 3D laser augmented hyperspectral image of the ideal target object with respect to their dimensionalities and compositions.

9. The system (500) of claim 7, wherein the processor-executable instructions further cause the processor (501) to classify the target object by creating a boundary around the target object in the 3D laser augmented hyperspectral image of the target object, wherein boundaries are created based on identification of the one or more anomalies.

10. The system (500) of claim 7, wherein the processor-executable instructions further cause the processor (501) to perform a time-series analysis for the 3D laser augmented hyperspectral image of the target object.
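For illustration only (not part of the claims or the claimed implementation), the claimed pipeline — capture, overlay at a common reference image point, anomaly identification against an ideal-object reference, and classification into the three claimed categories — can be sketched in Python. The sketch assumes stand-in data representations: a laser image as a 2D depth map, a hyperspectral image as an H×W×bands cube, the overlay as channel-wise stacking after alignment at a common reference pixel, and simple threshold rules for categorization. The function names (`overlay`, `find_anomalies`, `classify`) and parameters (`tol`, `fail_frac`) are hypothetical.

```python
import numpy as np

def overlay(depth, cube, ref=(0, 0)):
    """Fuse a depth map (3D structural profile) with a spectral cube
    (2D composition profile) into a '3D laser augmented hyperspectral
    image', aligned at a common reference image point `ref` by
    cropping both inputs to their overlap from that point."""
    h = min(depth.shape[0], cube.shape[0]) - ref[0]
    w = min(depth.shape[1], cube.shape[1]) - ref[1]
    d = depth[ref[0]:ref[0] + h, ref[1]:ref[1] + w, None]
    c = cube[ref[0]:ref[0] + h, ref[1]:ref[1] + w, :]
    return np.concatenate([d, c], axis=2)

def find_anomalies(fused, reference, tol=0.1):
    """Flag pixels whose fused depth/spectral values deviate from the
    ideal-object reference image by more than `tol` in any channel."""
    return np.abs(fused - reference).max(axis=2) > tol

def classify(anomaly_mask, fail_frac=0.05):
    """Map anomaly density to one of the claimed categories:
    exact match, failure, or manual inspection."""
    frac = anomaly_mask.mean()
    if frac == 0.0:
        return "exact match"
    if frac > fail_frac:
        return "failure"
    return "manual inspection"
```

As a usage sketch, an object whose fused image matches the reference everywhere would classify as "exact match"; a small deviating region would route to "manual inspection" or "failure" depending on the `fail_frac` threshold chosen for the deployment.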

Documents

Application Documents

# Name Date
1 202211001617-IntimationOfGrant21-01-2025.pdf 2025-01-21
2 202211001617-PatentCertificate21-01-2025.pdf 2025-01-21
3 202211001617-Written submissions and relevant documents [15-07-2024(online)].pdf 2024-07-15
4 202211001617-FORM-26 [01-07-2024(online)].pdf 2024-07-01
5 202211001617-Correspondence to notify the Controller [27-06-2024(online)].pdf 2024-06-27
6 202211001617-FORM-26 [27-06-2024(online)].pdf 2024-06-27
7 202211001617-US(14)-HearingNotice-(HearingDate-01-07-2024).pdf 2024-05-15
8 202211001617-DRDO REPLY.pdf 2023-07-12
9 202211001617-CLAIMS [16-03-2023(online)].pdf 2023-03-16
10 202211001617-CORRESPONDENCE [16-03-2023(online)].pdf 2023-03-16
11 202211001617-DRAWING [16-03-2023(online)].pdf 2023-03-16
12 202211001617-FER_SER_REPLY [16-03-2023(online)].pdf 2023-03-16
13 202211001617-OTHERS [16-03-2023(online)].pdf 2023-03-16
14 202211001617-FER.pdf 2022-09-16
15 202211001617-Defence-14-09-2022.pdf 2022-09-14
16 202211001617-STATEMENT OF UNDERTAKING (FORM 3) [11-01-2022(online)].pdf 2022-01-11
17 202211001617-REQUEST FOR EXAMINATION (FORM-18) [11-01-2022(online)].pdf 2022-01-11
18 202211001617-REQUEST FOR EARLY PUBLICATION(FORM-9) [11-01-2022(online)].pdf 2022-01-11
19 202211001617-PROOF OF RIGHT [11-01-2022(online)].pdf 2022-01-11
20 202211001617-POWER OF AUTHORITY [11-01-2022(online)].pdf 2022-01-11
21 202211001617-FORM-9 [11-01-2022(online)].pdf 2022-01-11
22 202211001617-FORM 18 [11-01-2022(online)].pdf 2022-01-11
23 202211001617-FORM 1 [11-01-2022(online)].pdf 2022-01-11
24 202211001617-FIGURE OF ABSTRACT [11-01-2022(online)].jpg 2022-01-11
25 202211001617-DRAWINGS [11-01-2022(online)].pdf 2022-01-11
26 202211001617-DECLARATION OF INVENTORSHIP (FORM 5) [11-01-2022(online)].pdf 2022-01-11
27 202211001617-COMPLETE SPECIFICATION [11-01-2022(online)].pdf 2022-01-11

Search Strategy

1 202211001617E_16-09-2022.pdf

ERegister / Renewals

3rd: 03 Apr 2025

From 11/01/2024 - To 11/01/2025

4th: 03 Apr 2025

From 11/01/2025 - To 11/01/2026