
Detecting Threat Object During X Ray Baggage Scanning

Abstract: A method for detecting a pointed threat object during X-ray baggage scanning, comprising: converting an X-ray image from RGB format to HSV format; extracting a binary image of blue coloured pixels from the HSV converted X-ray image; extracting outer contours of the binary image; computing the areas of the extracted outer contours; comparing the computed areas with an area threshold value; selecting, if the computed area of the extracted outer contour is greater than or equal to the area threshold value, regions having dark shades of blue colored pixels in the extracted outer contours of the binary image and updating the binary image with the dark shaded regions; separating connected objects in the updated binary image; computing contours of the updated binary image including the dark shaded regions and the separated objects therein; removing contours having length shorter than a predefined contour length from the computed contours of the updated binary image; and detecting, from the remaining contours, the contour matching a contour of the pointed threat object and generating an alert upon successful detection.


Patent Information

Application #
Filing Date
29 March 2019
Publication Number
40/2020
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
info@krishnaandsaurastri.com
Parent Application
Patent Number
Legal Status
Grant Date
2024-09-26
Renewal Date

Applicants

Bharat Electronics Limited
Outer Ring Road, Nagavara, Bangalore- 560045, Karnataka

Inventors

1. Divya Nagaraja Reddy
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O., Bangalore- 560013, Karnataka, India
2. Indu Soloman
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O., Bangalore- 560013, Karnataka, India

Specification

FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003

COMPLETE SPECIFICATION
(SEE SECTION 10, RULE 13)

DETECTING THREAT OBJECT
DURING X-RAY BAGGAGE SCANNING

BHARAT ELECTRONICS LIMITED
WITH ADDRESS:
OUTER RING ROAD, NAGAVARA, BANGALORE 560045, KARNATAKA, INDIA

THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.

TECHNICAL FIELD
[0001] The present invention relates generally to threat detection during X-ray baggage scanning. The invention, more particularly, relates to detection of pointed threat object(s) during X-ray baggage scanning.
BACKGROUND
[0002] Security checking at sensitive establishments, such as airports, metro rail stations, etc., includes baggage scanning, where baggage is conveyed on a conveyor belt through an X-ray baggage scanner. X-rays are fired by an X-ray source, and X-ray scanned images of the baggage are captured by detector cards/devices and sent to a computer (PC) for manual scrutiny by security personnel. X-ray baggage scanning internationally follows a three-colour scheme for material/object detection, namely, organic materials coloured in orange, low metals in green and high metals in blue.
[0003] The examination of the X-ray scanned images of the baggage is done manually by experienced/trained security personnel. Even so, the fatigue of continuously observing scanned images while sitting for long hours, together with the cluttered nature of passenger baggage, at times leads to missed detection of threat objects.
[0004] There have been certain endeavours towards threat object detection during X-ray baggage scanning. A paper titled “Visual detection of knives in security applications using active appearance models” by Andrzej Glowacz et al. describes the use of computer vision algorithms to locate an object in the analysed image using interest points typical of knives, in order to identify whether or not a knife exists in the image. It uses a technique called active appearance models (AAMs), which was earlier used for medical image interpretation. An AAM can be described as a statistical model of the shape and pixel intensities across the object. In the training phase, objects of interest are manually labelled in images from the training set with landmark points to define their shape. Principal Component Analysis (PCA) is then used to model the variability between objects in images from the training set. The tip of the knife is detected using the Harris corner detection algorithm.
[0005] US Patent number US7302083B2 titled “Method of and system for sharp object detection using computed tomography images” explains a method of and a system for processing computed tomography (CT) data to automatically identify sharp objects, such as knives, in passenger baggage. A three-dimensional CT image is generated by scanning the baggage, and a set of voxels associated with an object is identified. Eigenanalysis is performed on the voxels corresponding to an identified object to yield eigenvectors. An eigenprojection is generated by projecting the voxels of the identified object onto a plane perpendicular to the eigenvector corresponding to the minimum eigenvalue. Two features are computed from the eigenprojection of an identified object: one feature is an axial concavity ratio, and the other is a pointness measurement. The axial concavity gives the straightness of an object, and the pointness measurement is a function of the sharpness of the detected object.
[0006] UK patent number GB2501026 titled “X-ray tomographic inspection systems for the identification of specific target items” provides a method for detecting a pointed object or a knife by identifying one or more predefined features in the tomographic X-ray image and configuring one or more decision trees by analysing the X-ray image. The method comprises the following steps: (a) a first parameter extractor to detect one or more protruding points in the X-ray image, (b) a second parameter extractor to identify one or more blades having a predefined length-to-width aspect ratio and (c) a third parameter extractor to identify folded blades having a repeating structure of at least two air gaps and three material fills. The decision tree is configured for correlating the parameters identified during parameter extraction and transferring the correlated data for identification of the pointed object or knife by mapping against a database.
[0007] Indian Patent Application number 1761/CHE/2014 titled “A Method for Processing of X-Ray Baggage Scan Images” provides a method for discrimination of different scanned materials by computing the effective atomic number based on the ratio of low-energy detector data to high-energy detector data. Based on the effective atomic number of the material, a three-colour system is proposed: orange for organic materials, green for inorganic materials and blue for metals. A Threat Image Projection (TIP) feature (superimposing a random TIP object selected from a database onto the detector data) is also incorporated into the system to check the efficiency of the operator.
[0008] Indian Patent Application number 1676/CHE/2015 titled “A Method of Geometry/Shape Correction of X-Ray Baggage Scan Images” provides a method to correct the geometric distortion introduced by the X-ray source, L-shaped detectors and object position. From the known physical lengths, distance factors are computed, and all the distance factors are normalized with respect to the distance factor of one detector card. A lookup table is then generated by mapping each line of the distorted image to the corresponding line of the corrected image data. The image is finally downscaled to the original size to retrieve the corrected image.
[0009] However, none of the existing approaches is capable of detecting pointed threat objects from X-ray scanned images of baggage. There is therefore a need for an invention which provides detection of pointed threat objects during X-ray baggage scanning, where the pointed threat objects of interest are sharp in nature and may also be overlapped with other objects.
SUMMARY
[0010] This summary is provided to introduce concepts related to the detection of pointed threat object(s) during X-ray baggage scanning, as disclosed herein. This summary is neither intended to identify essential features of the present invention nor is it intended for use in determining or limiting the scope of the present invention.
[0011] In accordance with an exemplary implementation of the present invention, there is provided a computer implemented method for detecting a pointed threat object during X-ray baggage scanning. The method comprises: receiving an X-ray image of a baggage in RGB (Red Green Blue) format; converting the X-ray image from RGB (Red Green Blue) format to HSV (Hue Saturation Value) format; extracting a binary image of blue coloured pixels from the HSV converted X-ray image; extracting outer contours of the binary image; computing the areas of the extracted outer contours; comparing the computed areas of the extracted outer contours with an area threshold value; selecting, if the computed area of the extracted outer contours is greater than or equal to the area threshold value, one or more regions having dark shades of blue colored pixels in the extracted outer contours of the binary image and updating the binary image with the dark shaded regions; separating two or more connected objects in the updated binary image; computing contours of the updated binary image including the dark shaded regions and the separated objects therein; removing contours having length shorter than a predefined contour length from the computed contours of the updated binary image; and detecting, from the remaining contours, the contour matching a contour of the pointed threat object and generating an alert upon successful detection.
[0012] In an implementation, the step of separating two or more connected objects includes: extracting contours of the connected objects; removing contours having length shorter than the predefined contour length from the extracted contours of the connected objects; enclosing each remaining contour in a rectangle and determining the length and breadth of each rectangle; computing the area of each rectangle; determining the area of each remaining contour by counting the number of pixels in each remaining contour; computing the difference between the area of each rectangle and the area of each remaining contour, and comparing the difference area with a predefined dynamic area threshold value; identifying the contours with two or more connected objects if the difference between the area of a rectangle and the area of a contour is greater than the predefined dynamic area threshold value; detecting a plurality of corner points in that contour with the difference area greater than the predefined dynamic area threshold value; determining actual distances between all pairs of corner points from the plurality of corner points; determining minimum distances as a minimum of peripheral distances between all pairs of corner points from the plurality of corner points when traversed along the contour; determining the ratio of the minimum distance to the actual distance of each pair of corner points and selecting the pair of corner points having the highest ratio; and drawing a separating line between the selected pair of corner points to separate the connected objects.
[0013] In an implementation, the step of detecting the contour matching a contour of the pointed threat object includes: detecting the corners of the remaining contours of the updated binary image by a corner detection technique such as the Harris corner detection technique; detecting a plurality of points common to the contours and their corners; computing a dynamic displacement value for a contour; selecting point pairs whose distances to a common corner point along the contour are in multiples of the dynamic displacement value; calculating shortest distances between the points in each point pair and computing the maximum of the shortest distances; comparing the computed maximum of the shortest distances with the breadth of the rectangle enclosing the contour (the computed maximum of the shortest distances must be less than the breadth of the rectangle); discarding the point pairs which do not meet a distance criterion; determining the cosine of angles between the points of each point pair with reference to the common corner point; counting the number of point pairs having cosine of angle greater than a first angle threshold; comparing the number of point pairs with a point pair threshold value; and classifying the contour as the pointed threat object if the number of point pairs is greater than the point pair threshold value.
[0014] In an implementation, the method, if the number of point pairs is less than the point pair threshold value, includes: counting the number of irregular point pairs; comparing the number of irregular point pairs with a first point pair threshold value; counting the maximum number of consecutive point pairs having cosine of angle greater than a second angle threshold, if the number of irregular point pairs is less than the first point pair threshold value; comparing the number of consecutive point pairs with a second point pair threshold value; and classifying the contour as the pointed threat object if the number of consecutive point pairs is greater than the second point pair threshold value.
[0015] In accordance with another exemplary implementation of the present invention, there is provided a system for detecting a pointed threat object during X-ray baggage scanning. The system comprises a computing unit in communication with an X-ray baggage scanner, wherein the computing unit comprises at least a memory configured to store a set of instructions, and at least a processor cooperating with the memory to execute the instructions and configured to: receive an X-ray image of a baggage in RGB (Red Green Blue) format; convert the X-ray image from RGB (Red Green Blue) format to HSV (Hue Saturation Value) format; extract a binary image of blue coloured pixels from the HSV converted X-ray image; extract outer contours of the binary image; compute the areas of the extracted outer contours; compare the computed areas of the extracted outer contours with an area threshold value; select one or more regions having dark shades of blue colored pixels in the extracted outer contours of the binary image and update the binary image with the dark shaded regions, in the event that the computed area of the extracted outer contour is greater than or equal to the area threshold value; separate two or more connected objects in the updated binary image; compute contours of the updated binary image including the dark shaded regions and the separated objects therein; remove contours having length shorter than a predefined contour length from the computed contours of the updated binary image; and detect, from the remaining contours, the contour matching a contour of the pointed threat object and generate an alert upon successful detection.

BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0016] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.
[0017] Figure 1 illustrates a flow chart detailing the steps involved in detecting a pointed threat object in an X-ray scanned image of a baggage, according to an exemplary implementation of the present invention.
[0018] Figure 2 illustrates a flow chart detailing the steps involved in separating two overlapping objects in an X-ray scanned image of a baggage, according to another exemplary implementation of the present invention.
[0019] Figure 3 illustrates a flow chart detailing the steps involved in detecting a contour of the pointed threat object such as a knife, in an X-ray scanned image of a baggage, according to another exemplary implementation of the present invention.
[0020] Figure 4 illustrates a schematic diagram depicting the separation of two overlapping objects in the X-ray scanned image, according to the exemplary implementation detailed in Figure 2.
[0021] Figure 5 illustrates a schematic diagram depicting a pointed/sharp threat object such as a knife and the parameters for detecting the contour of the same, according to the exemplary implementation detailed in Figure 3.
[0022] Figure 6 illustrates a block diagram depicting a system for detecting a pointed threat object in an X-ray scanned image of a baggage, according to another exemplary implementation of the present invention.
[0023] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative methods embodying the principles of the present invention. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.

DETAILED DESCRIPTION
[0024] The various embodiments of the present invention describe the detection of pointed threat object(s) during X-ray baggage scanning. The embodiments particularly describe a computer implemented method and system for detecting pointed threat object(s) from X-ray scanned images obtained during X-ray baggage scanning. The pointed threat object(s) referred to are, typically, metallic objects which have sharp pointed tips, such as knives, which could be hidden in cluttered passenger baggage and which can cause harm to fellow passengers.
[0025] The method and system as disclosed herein are adaptable to the internationally followed three-colour scheme for material/object detection in X-ray scanned images of baggage, wherein metallic threat objects appear blue in colour. The cluttered nature of passenger baggage causes threat objects to become overlapped with other metallic and non-metallic (organic) innocuous materials. This invention describes a methodology in which the sharpness of the object is estimated. Depending on the sharpness parameters and size constraints, a decision is taken and the object is classified as a metallic pointed threat object. The sharpness of the object is estimated by a novel technique of measuring the angle of inclination between the edges of the pointed threat object and also the angular changes within the object. This invention also describes a novel methodology for separation of overlapped objects, which is mandatory for classification and alerting of overlapped pointed threat objects. An alert signal is generated on detection of pointed threat objects satisfying the size restrictions.
[0026] The detection of metallic pointed threat objects from X-ray baggage scan images is achieved by a colour-based segmentation technique which separates the metallic objects, which appear in blue shades in the three-colour-scheme X-ray baggage scan images. The segmented image undergoes further processing for contour and corner detection. The angle of inclination is estimated using the boundaries of the threat object, and the variation in the angle of inclination within the threat object is also computed. These parameters, along with the size constraints, are used for classification of a metallic threat object as a sharp threat object.
[0027] In the following description, for purpose of explanation, specific details are set forth in order to provide an understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these details. One skilled in the art will recognize that embodiments of the present invention, some of which are described below, may be incorporated into a number of systems.
[0028] However, the method and system are not limited to the specific embodiments described herein. Further, structures and devices shown in the figures are illustrative of exemplary embodiments of the present invention and are meant to avoid obscuring of the present invention.
[0029] It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all examples recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0030] In one of the embodiments, a computer implemented method for detecting a pointed threat object during X-ray baggage scanning is disclosed. The baggage is scanned by an X-ray baggage scanner typically having at least an X-ray source to fire X-rays whenever a baggage is scanned, one or more detectors to capture attenuated X-rays after passing through the baggage, a conveyor belt to carry the baggage for scanning and a processor for threat detection. Typically, a sharp metallic object such as a knife, is detected by implementation of the method.
[0031] In another embodiment, the method includes separating two or more overlapping objects from a single object, based on extracted contours. Two corner points are selected for drawing a line of separation using a distance ratio, on the condition that the minimum peripheral distance between the selected corner points when traversed along the contour is not too small. The distance ratio is the ratio of the minimum peripheral distance between the corner points when traversed along the contour to the shortest distance between them.
[0032] In another embodiment, the method includes detecting weak corners based on the contour information. The angle a point makes with two of its neighbouring pixels located on either side of the contour is used for detecting the weaker corners.
[0033] In another embodiment, the method includes estimating angle of inclination from the contour map of the object wherein the tip of the object is first detected by corner point detection, and declaring the object as a pointed threat object when the object satisfies the following constraints:
(a) angle of inclination criteria with respect to the detected tip;
(b) distance criteria with respect to the detected tip; and
(c) size constraints on length and breadth of rectangle enclosing the object's contour with least possible area.
[0034] The flow chart illustrated in Figure 1 details the steps involved in the computer implemented method for detecting the pointed threat object in an X-ray scanned image of the baggage during X-ray baggage scanning, according to an exemplary implementation of the present invention. The pointed threat object referred to is, typically, a metallic object with sharp pointed tips, such as a knife. In the X-ray scanned image of the baggage, knives, being metals, will be blue in colour as per the three colour scheme. Therefore, the blue coloured pixels are processed for knife detection. The HSV (Hue, Saturation & Value) colour format gives better results for colour based segmentation compared to the RGB (Red, Green & Blue) colour format. Hence, the X-ray scanned image of the baggage input to the computer in RGB format is converted to HSV format 1. A binary image (ITh) (blue/non-blue coloured pixels) is obtained by extracting the blue coloured pixels by thresholding the HSV values 2. The outer contours of the binary image (ITh) are extracted 3. When a knife is placed under a large thin metal sheet, the entire metal sheet including the threat object is detected as one single object. Therefore, the contour of the knife under the metal sheet cannot be obtained. Detection in such cases is carried out by identifying the contours with the contour area 4 greater than an area threshold value (AreaTh) 5. In such cases, only dark shades of blue regions are selected for further processing 6. It is also possible that a knife and some other metal object are overlapping, thus changing the contour of the knife and preventing its detection. Therefore, the connected objects (if any) are separated and the binary image (ITh) is updated 7. The contours of the updated binary image (ITh) are computed 8. A knife's contour cannot be too small and therefore, shorter contours are removed 9 by thresholding the contour length. The remaining contours matching that of a knife (refer Figure 3) in the updated binary image (ITh) are detected and an alert is generated using a red coloured box 10.
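For illustration only, the colour-based segmentation and contour filtering described above can be sketched in Python with OpenCV as follows. This is a hedged minimal example, not the patented implementation: the blue hue range, the area threshold AREA_TH (standing in for AreaTh), the dark-shade Value cutoff and the minimum contour length MIN_CONTOUR_LEN are all assumed values chosen purely for demonstration, and the OpenCV 4.x return signature of findContours is assumed.

    # Illustrative sketch of the segmentation pipeline (assumed parameters).
    import cv2
    import numpy as np

    AREA_TH = 5000          # assumed stand-in for the area threshold AreaTh (pixels)
    MIN_CONTOUR_LEN = 100   # assumed minimum contour length (pixels)

    def segment_blue_objects(bgr_image):
        # Convert the scanner image to HSV; colour segmentation works better in HSV than RGB.
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        # Binary image ITh: blue (metallic) pixels only (assumed hue range for blue).
        i_th = cv2.inRange(hsv, (100, 50, 50), (130, 255, 255))
        # Outer contours of the binary image.
        contours, _ = cv2.findContours(i_th, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        for cnt in contours:
            if cv2.contourArea(cnt) >= AREA_TH:
                # Large blob (e.g. a thin metal sheet): keep only dark shades of blue
                # (low Value channel) inside this contour, so a knife underneath survives.
                mask = np.zeros(i_th.shape, np.uint8)
                cv2.drawContours(mask, [cnt], -1, 255, cv2.FILLED)
                dark_blue = cv2.inRange(hsv, (100, 50, 0), (130, 255, 120))
                i_th = cv2.bitwise_and(i_th, cv2.bitwise_not(mask))
                i_th = cv2.bitwise_or(i_th, cv2.bitwise_and(dark_blue, mask))
        # Drop contours that are too short to belong to a knife.
        contours, _ = cv2.findContours(i_th, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        kept = [c for c in contours if cv2.arcLength(c, True) >= MIN_CONTOUR_LEN]
        return i_th, kept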
[0035] Most of the time, the passenger bags scanned by an X-ray baggage scanner are cluttered/unorganized. This could result in the threat object being overlapped by other metallic or non-metallic objects. When the threat object is overlapped with other objects, its size and shape change, making the detection process difficult. Hence the separation of the threat object from other objects is important.
[0036] The flow chart illustrated in Figure 2 details the steps involved in separating two overlapping objects in an X-ray scanned image of a baggage, according to another exemplary implementation of the present invention; and the schematic diagram illustrated in Figure 4 depicts the separation of two overlapping objects in the X-ray scanned image, according to the exemplary implementation detailed in Figure 2. The updated binary image (ITh) is obtained as input 11. The contours of the objects in the updated binary image (ITh) are extracted 12. Smaller contours are removed 13 and only longer contours are processed further. The length and breadth of the rectangle enclosing each remaining contour with the least possible area are computed as Length and Breadth respectively 14. From the computed Length and Breadth, the area of the rectangle (ARect) is computed as Length*Breadth 15. The contour area (ACont), which refers to the number of pixels inside the remaining contour, is computed 16. It is to be noted that the rectangle area (ARect) and the contour area (ACont) are different. When there are two separable overlapping objects, the difference between the actual contour area and the area of the rectangle enclosing the contour (with the least possible area) is high. The possibility of drawing a separating line between the objects 17 is analysed only when this difference is greater than a predefined dynamic area threshold value (ADy). The point of contact of two different objects on a contour is a corner point. Therefore, corner points on the contour are computed for finding the end points of the separating line for overlapping objects. As conventional corner detection approaches find only strong corner points, they do not yield good results for obtaining the separating line between the objects. Hence, the corner points are computed using the contour information, thus enabling the detection of weaker corner points also. A point on the contour is detected as a corner point 18 if the angle it makes with two of its neighbouring pixels, when given a small displacement of say 10 pixels along either side of the contour, is less than a predefined threshold. The threshold set for the angle helps to obtain the weaker corner points also. Once the corner points are found, the best pair of corner points to draw the separating line needs to be found. For this, two parameters are crucial: Dist1 and Dist2. The terminology used here can be well understood by referring to Figure 4. The corner points (C1) and (C2) form a corner point pair as shown in Figure 4 (a). Dist1 measures the actual distance between the corner points (C1) and (C2) 19. Dist2 measures the minimum peripheral distance (D1, D2) between the corner points (C1 and C2) when traversed along the contour 20, i.e. Dist2 = min(D1, D2). The best corner point pair is the one for which the (Dist2 / Dist1) ratio is maximum. In addition, it should also satisfy the condition that min(D1, D2) > Thresh*(D1+D2). The most suitable corner point pair satisfying the above mentioned conditions is selected 21. A separating line is drawn between the points in the selected corner point pair 22 as shown in Figure 4. In Figure 4, (b) is the actual image wherein three objects are overlapped; (c) is the image after the first cut and (d) is the image after the second cut. In this particular case, three objects are separated after two cuts. The separation of overlapping objects continues until no further separation of objects is possible in (ITh).
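A minimal sketch of this separation step is given below for illustration only. The corner angle threshold, the 10-pixel displacement and the constant THRESH are assumed placeholder values, and the peripheral distances D1 and D2 are approximated by point counts along the contour (which is reasonable when contours are extracted with CHAIN_APPROX_NONE). In practice the caller would repeat separate_once, recomputing contours after each cut, until it returns False.

    # Illustrative sketch of overlap separation via weak corner points (assumed thresholds).
    import cv2
    import numpy as np

    CORNER_ANGLE_TH = np.deg2rad(120)  # a point is a corner if its turn angle is sharper than this
    THRESH = 0.2                       # fraction in the condition min(D1, D2) > THRESH * (D1 + D2)

    def weak_corners(contour, disp=10):
        # Detect weak corner points from contour geometry: compare each point with its
        # neighbours located 'disp' pixels away on either side along the contour.
        pts = contour.reshape(-1, 2).astype(float)
        n = len(pts)
        corners = []
        for i in range(n):
            a, b, c = pts[(i - disp) % n], pts[i], pts[(i + disp) % n]
            v1, v2 = a - b, c - b
            cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
            if np.arccos(np.clip(cos_t, -1.0, 1.0)) < CORNER_ANGLE_TH:
                corners.append(i)
        return pts, corners

    def separate_once(i_th, contour):
        # Find the corner point pair maximising Dist2/Dist1 and draw one separating cut.
        pts, corners = weak_corners(contour)
        n, best, best_ratio = len(pts), None, 0.0
        for ii, i in enumerate(corners):
            for j in corners[ii + 1:]:
                dist1 = np.linalg.norm(pts[i] - pts[j])   # straight-line distance (Dist1)
                d1, d2 = (j - i) % n, (i - j) % n         # the two peripheral path lengths D1, D2
                dist2 = min(d1, d2)                       # Dist2 = min(D1, D2)
                if dist1 < 1 or dist2 <= THRESH * (d1 + d2):
                    continue
                ratio = dist2 / dist1
                if ratio > best_ratio:
                    best_ratio, best = ratio, (i, j)
        if best is not None:
            p, q = pts[best[0]], pts[best[1]]
            # Cut the binary mask by drawing a background-coloured line between the pair.
            cv2.line(i_th, (int(p[0]), int(p[1])), (int(q[0]), int(q[1])), 0, 2)
        return best is not None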
[0037] The flowchart illustrated in Figure 3 details the steps involved in detecting a contour of the pointed threat object, such as a knife, in an X-ray scanned image of a baggage, according to another exemplary implementation of the present invention; and the schematic diagram illustrated in Figure 5 depicts a pointed/sharp threat object such as a knife and the parameters for detecting its contour, according to the exemplary implementation detailed in Figure 3. The contour is detected by measuring the angle of inclination within the contour of the suspicious object, the angular changes within the contour, size constraints, etc. The processed updated binary image (ITh) is obtained as input. As shown in Figure 5, any knife will have a sharp common corner point such as point (C). To detect such points, the Harris corner detection technique is first used for detecting the corners of the remaining contours in the updated binary image (ITh) 23. Then the corner points which lie on the outer contours of the objects are detected 24. Further processing uses these corner points as references. For a contour, the dynamic displacement value d 25 is determined using the length of the contour, i.e. d = (Length of the contour/(2N+1)), where N represents the number of point pairs used in this technique. The point pairs are chosen from the displacement value (d) 26 as shown in Figure 5. From the corner point, the point P11 is obtained by moving a distance of (d) along one side of the contour and the point P12 is obtained by moving a distance of (d) along the other side of the contour. The points P11 and P12 make a point pair. Likewise, a distance of 2d is moved to obtain a point pair comprising the points P21 and P22. In other words, point pairs are the points whose distances to the common corner point (C) along the contour are multiples of the displacement value (d).
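As a hedged illustration of how such point pairs might be constructed around a detected tip C, the sketch below assumes the contour is available as an ordered array of points and treats N (the number of point pairs) as a free parameter; it is not the patented implementation.

    # Illustrative construction of point pairs (P11, P12), (P21, P22), ... around a tip.
    import numpy as np

    def point_pairs(contour_pts, tip_idx, n_pairs=8):
        # contour_pts: (M, 2) array of ordered contour points; tip_idx: index of corner C.
        m = len(contour_pts)
        d = max(1, m // (2 * n_pairs + 1))   # dynamic displacement d = contour length / (2N + 1)
        pairs = []
        for k in range(1, n_pairs + 1):
            p_k1 = contour_pts[(tip_idx + k * d) % m]   # k*d along one side of the contour
            p_k2 = contour_pts[(tip_idx - k * d) % m]   # k*d along the other side
            pairs.append((p_k1, p_k2))
        return d, pairs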
[0038] The shortest distances between the points of the point pairs are calculated as d1, d2, d3, ..., dN, i.e. dn is the distance between the points Pn1 and Pn2. Then the maximum of those distances, Distmax, is computed as max{d1, d2, d3, ..., dN} 27. If the condition Distmax < Th*BreadthRect (Breadth of the rectangle enclosing the contour with the least possible area) is not met, then the contour is not detected as a threat and is not processed further. If the condition is met 28, the contour is checked for the other constraints. The distances between the point C and the points Pnk (k = 1, 2) are computed as dn,k. A point pair PPn (Pn1, Pn2) has to satisfy a distance criterion, i.e. dn,1 > dn-1,1 and dn,2 > dn-1,2. The point pairs which do not satisfy this distance criterion are discarded 29.
[0039] The cosines of the angles between the points of each of the point pairs are determined with reference to the point C 30. The angle between the points of a point pair (PPn) is the angle between the points Pn1 and Pn2 with respect to point C. The number of point pairs (NPPvalid) whose cosine of angle is greater than a first angle threshold is counted 31. If NPPvalid is greater than a point pair threshold value (NTh) 32 and the size constraints are also satisfied 37, then the contour is classified as that of a pointed threat. If the condition is not met, the number of irregular point pairs (NPPIrr) is counted 33. Ideally, for a sharp threat object, the cosine of the angle between the points of a point pair (PPn) should be greater than that of the previous point pair (PPn-1) and less than that of the subsequent point pair (PPn+1). Irregular point pairs are those which do not satisfy this condition. If the number of irregular point pairs (NPPIrr) is greater than a first point pair threshold value (NPPTh1), then the contour is not classified as a threat 34. The maximum number of consecutive point pairs (NPPConsecutive) for which the cosine of angle is greater than a second angle threshold is counted 35. If (NPPConsecutive) is greater than a second point pair threshold value (NPPTh2) 36 and the size constraints are also satisfied 37, then the contour is classified as that of a pointed threat.
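A compact sketch of this classification logic is given below for illustration. Here pairs is the list produced by the point_pairs() sketch above and breadth_rect is the breadth of the minimum-area rectangle enclosing the contour (obtainable, for example, from cv2.minAreaRect). All numeric thresholds (TH_BREADTH standing in for Th, COS_TH1 and COS_TH2 for the two angle thresholds, N_TH, NPP_TH1 and NPP_TH2 for the point pair thresholds) are assumed values, since the specification does not disclose specific numbers, and the final size-constraint check is omitted for brevity.

    # Illustrative classification of a candidate tip as pointed/not pointed (assumed thresholds).
    import numpy as np

    TH_BREADTH, COS_TH1, COS_TH2 = 0.8, 0.9, 0.8
    N_TH, NPP_TH1, NPP_TH2 = 5, 2, 4

    def classify_tip(tip, pairs, breadth_rect):
        tip = np.asarray(tip, float)
        # Distmax: the widest separation between the two sides of the candidate tip.
        dmax = max(np.linalg.norm(np.subtract(p1, p2)) for p1, p2 in pairs)
        if dmax >= TH_BREADTH * breadth_rect:
            return False                                  # too wide to be a pointed tip
        # Distance criterion: each pair must lie farther from C than the previous one.
        kept, prev1, prev2 = [], 0.0, 0.0
        for p1, p2 in pairs:
            d1, d2 = np.linalg.norm(p1 - tip), np.linalg.norm(p2 - tip)
            if d1 > prev1 and d2 > prev2:
                kept.append((p1, p2))
                prev1, prev2 = d1, d2
        # Cosine of the angle Pn1 - C - Pn2 for each surviving pair.
        cosines = []
        for p1, p2 in kept:
            v1, v2 = p1 - tip, p2 - tip
            cosines.append(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9))
        if sum(c > COS_TH1 for c in cosines) > N_TH:      # NPPvalid > NTh
            return True
        # Irregular pairs: cosines that do not increase monotonically away from the tip.
        irregular = sum(1 for i in range(1, len(cosines)) if cosines[i] <= cosines[i - 1])
        if irregular > NPP_TH1:
            return False
        # Longest run of consecutive pairs whose cosine exceeds the second threshold.
        run = best_run = 0
        for c in cosines:
            run = run + 1 if c > COS_TH2 else 0
            best_run = max(best_run, run)
        return best_run > NPP_TH2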
[0040] Thus, based on the flow charts of Figures 1-3 and their description herein above, the computer implemented method for detecting a pointed threat object during X-Ray baggage scanning comprises the following steps: receiving an X-ray image of a baggage in RGB (Red Green Blue) format; converting the X-ray image from RGB (Red Green Blue) format to HSV (Hue Saturation Value) format; extracting a binary image (ITh) of blue coloured pixels from the HSV converted X-ray image; extracting outer contours of the binary image (ITh); computing the areas of the extracted outer contours; comparing the computed areas of the extracted outer contours with an area threshold value (AreaTh); selecting, if the computed area of the extracted outer contour is greater than or equal to the area threshold value (AreaTh), one or more regions having dark shades of blue colored pixels in the extracted outer contours of the binary image (ITh) and updating the binary image (ITh) with the dark shaded regions; separating two or more connected objects in the updated binary image; computing contours of the updated binary image (ITh) including the dark shaded regions and the separated objects therein; removing contours having length shorter than a predefined contour length from the computed contours of the updated binary image; and detecting, from the remaining contours, the contours matching a contour of the pointed threat object and generating an alert upon successful detection.
[0041] In an embodiment, after the step of comparing the computed area of the extracted outer contour with the area threshold value (AreaTh), the method comprises the step of separating two or more connected objects in the updated binary image (ITh), if the computed area of the extracted outer contour is less than the area threshold value (AreaTh).
[0042] Further, the step of separating two or more connected objects includes: extracting contours of the connected objects; removing contours having length shorter than the predefined contour length from the extracted contours of the connected objects; enclosing each remaining contour in a rectangle and determining the length and breadth of each rectangle; computing the area of each rectangle (ARect); determining the area of each remaining contour (ACont) by counting the number of pixels in each remaining contour; computing the difference between the area of each rectangle (ARect) and the area of each remaining contour (ACont), and comparing the difference area with a predefined dynamic area threshold value (ADy); identifying the contours with two or more connected objects if the difference between the area of a rectangle (ARect) and the area of a contour (ACont) is greater than the predefined dynamic area threshold value (ADy); detecting a plurality of corner points (C1, C2) in that contour with the difference area greater than the predefined dynamic area threshold value (ADy); determining actual distances (Dist1) between all pairs of corner points from the plurality of corner points; determining minimum distances (Dist2) as a minimum of peripheral distances (min(D1, D2)) between all pairs of corner points from the plurality of corner points when traversed along the contour; determining the ratio of the minimum distance (Dist2) to the actual distance (Dist1) of each pair of corner points and selecting the pair of corner points (C1, C2) having the highest ratio; and drawing a separating line between the selected pair of corner points (C1, C2) to separate the connected objects.
[0043] In an embodiment, the method includes stopping the separating of objects if the difference area is less than the predefined dynamic area threshold value (ADy).
[0044] In accordance with the method, each corner point is detected based on an angle that the corner point makes with two neighbouring pixels on the contour.
[0045] In accordance with the method, the pair of corner points (C1, C2) are selected when the minimum of the peripheral distances (min(D1, D2)) is greater than multiplication of a predefined threshold (Thresh) with the sum of the two peripheral distances (D1, D2) [min(D1, D2) > Thresh*(D1+D2)].
[0046] Further, the step of detecting the contour matching a contour of the pointed threat object includes: detecting the corners of the remaining contours of the updated binary image (ITh) by a corner detection technique such as the Harris corner detection technique; detecting a plurality of points common to the contours and their corners; computing a dynamic displacement value (d) for a contour; selecting point pairs (P11, P12; P21, P22; P31, P32; ...) whose distances to a common corner point (C) along the contour are in multiples of the dynamic displacement value (d); calculating shortest distances (d1, d2, d3, ...) between the points in each point pair (P11, P12; P21, P22; P31, P32; ...) and computing the maximum of the shortest distances (Distmax = max(d1, d2, d3, ...)); comparing the computed maximum of the shortest distances (Distmax) with the breadth of the rectangle (BreadthRect) enclosing the contour (i.e. Distmax < Th*BreadthRect); discarding the point pairs which do not meet a distance criterion (dn,1 > dn-1,1 and dn,2 > dn-1,2); determining the cosine of angles between the points of each point pair with reference to the common corner point (C); counting the number of point pairs (NPPvalid) having cosine of angle greater than a first angle threshold; comparing the number of point pairs (NPPvalid) with a point pair threshold value (NTh); and classifying the contour as the pointed threat object if the number of point pairs (NPPvalid) is greater than the point pair threshold value (NTh).
[0047] In an embodiment, the method includes designating the contour as a non-threat object if the computed maximum of the shortest distances is greater than the breadth of the rectangle.
[0048] Furthermore, the method, if the number of point pairs (NPPvalid) is less than the point pair threshold value (NTh), includes: counting the number of irregular point pairs (NPPIrr); comparing the number of irregular point pairs (NPPIrr) with a first point pair threshold value (NPPTh1); counting the maximum number of consecutive point pairs (NPPConsecutive) having cosine of angle greater than a second angle threshold, if the number of irregular point pairs (NPPIrr) is less than the first point pair threshold value (NPPTh1); comparing the number of consecutive point pairs (NPPConsecutive) with a second point pair threshold value (NPPTh2); and classifying the contour as the pointed threat object if the number of consecutive point pairs (NPPConsecutive) is greater than the second point pair threshold value (NPPTh2).
[0049] In an embodiment, the method includes designating the contour as a non-threat object if the computed maximum of the shortest distances (Distmax) is greater than the breadth of the rectangle, or if the number of irregular point pairs (NPPIrr) is greater than the first point pair threshold value (NPPTh1), or if the number of consecutive point pairs (NPPConsecutive) is less than the second point pair threshold value (NPPTh2).
[0050] In accordance with the method, a point pair (PPn) is an irregular point pair if it does not meet the condition that the cosine of the angle between the points of the point pair (PPn) is greater than the cosine of the angle between the points of the previous point pair (PPn-1) and less than the cosine of the angle between the points of the subsequent point pair (PPn+1).
[0051] Referring to Figure 6, the above described method for detecting a pointed threat object during X-ray baggage scanning is implemented by a system (10) comprising a computing unit (100) in communication with an X-ray baggage scanner (50), wherein the computing unit comprises at least a memory (102) configured to store a set of instructions and at least a processor (101) cooperating with the memory (102) to execute the instructions and configured to: receive an X-ray image of a baggage in RGB (Red Green Blue) format; convert the X-ray image from RGB (Red Green Blue) format to HSV (Hue Saturation Value) format; extract a binary image (ITh) of blue coloured pixels from the HSV converted X-ray image; extract outer contours of the binary image (ITh); compute the areas of the extracted outer contours; compare the computed areas of the extracted outer contours with an area threshold value(AreaTh); select one or more regions having dark shades of blue colored pixels in the extracted outer contours of the binary image (ITh) and update the binary image with the dark shaded regions, in the event that the computed area of the extracted outer contour is greater than or equal to the area threshold value (AreaTh); separate two or more connected objects in the updated binary image; compute contours of the updated binary image (ITh) including the dark shaded regions and the separated objects therein; remove contours having length shorter than a predefined contour length from the computed contours of the updated binary image; and detect, from the remaining contours, the contour matching a contour of the pointed threat object and generate an alert upon successful detection.
[0052] At least some of the technical advantages provided by the presently disclosed method and system for detecting pointed threat object(s) from X-ray scanned images obtained during X-ray baggage scanning, are as under:
colour based segmenting of the standard tricolour X-ray baggage images to separate the metallic objects in blue shades;
segmenting the threat objects hidden under thin large metal sheet using stringent conditions;
separating two or more overlapped objects using contours and corners information; and
looking for the features of a sharp object such as tip of the object, angular inclination & variation within the contour with respect to the tip, size, etc.
[0053] In an advantageous aspect, the method is generic and can be applied for detecting any type of threat, not necessarily metallic, and to any type of images, not necessarily X-ray baggage scanned images, by the use of a suitable segmentation technique. The method can also be applied to detect any pointed sharp object, not necessarily a knife.
[0054] The methods as disclosed herein can be implemented to provide an automatic detection and alerting facility in a baggage scanning system. The automatic detection and alert generation by the system greatly aids the security personnel and also speeds up the scanning process at busy airports, metro stations and other establishments.
[0055] The foregoing description of the invention has been set out merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the invention.
CLAIMS
We Claim:

1. A computer implemented method for detecting a pointed threat object during X-ray baggage scanning, the method comprising:
receiving an X-ray image of a baggage in RGB (Red Green Blue) format;
converting the X-ray image from RGB (Red Green Blue) format to HSV (Hue Saturation Value) format;
extracting a binary image (ITh) of blue coloured pixels from the HSV converted X-ray image;
extracting outer contours of the binary image (ITh);
computing the areas of the extracted outer contours;
comparing the computed areas of the extracted outer contours with an area threshold value (AreaTh);
selecting, if the computed area of the extracted outer contour is greater than or equal to the area threshold value (AreaTh), one or more regions having dark shades of blue colored pixels in the extracted outer contour of the binary image (ITh) and updating the binary image (ITh) with the dark shaded regions;
separating two or more connected objects in the updated binary image;
computing contours of the updated binary image (ITh) including the dark shaded regions and the separated objects therein;
removing contours having length shorter than a predefined contour length from the computed contours of the updated binary image; and
detecting, from the remaining contours, the contour matching a contour of the pointed threat object and generating an alert upon successful detection.

2. The method as claimed in claim 1, wherein, the step of separating two or more connected objects includes:
extracting contours of the objects;
removing contours having length shorter than the predefined contour length from the extracted contours of the objects;
enclosing each remaining contour in a rectangle and determining the length and breadth of each rectangle;
computing the area of each rectangle (ARect);
determining the area of each remaining contour (ACont) by counting the number of pixels in each remaining contour;
computing the difference between the area of each rectangle (ARect) and the area of each remaining contour (ACont), and comparing the difference area with a predefined dynamic area threshold value (ADy);
identifying the contours with two or more connected objects if the difference between the area of a rectangle (ARect) and the area of a contour (ACont) is greater than the predefined dynamic area threshold value (ADy);
detecting a plurality of corner points (C1, C2) in that contour (ACont) with the difference area greater than the predefined dynamic area threshold value (ADy);
determining actual distances (Dist1) between all pairs of corner points from the plurality of corner points;
determining minimum distances (Dist2) as a minimum of peripheral distances (min(D1, D2)) between all pairs of corner points from the plurality of corner points when traversed along the contour;
determining the ratio of the minimum distance (Dist2) to the actual distance (Dist1) of each pair of corner points and selecting the pair of corner points (C1,C2) having the highest ratio; and
drawing a separating line between the selected pair of corner points (C1,C2) to separate the connected objects.

3. The method as claimed in claim 2, wherein each corner point is detected based on an angle that the corner point makes with two neighbouring pixels on the contour.

4. The method as claimed in claims 2 to 3, wherein the pair of corner points (C1, C2) are selected when the minimum of the peripheral distances (min(D1, D2)) is greater than multiplication of a predefined threshold (Thresh) with the sum of the peripheral distances (D1, D2) [min(D1, D2) > Thresh*(D1+D2)].

5. The method as claimed in claim 1, wherein the step of detecting, from the remaining contours, the contour matching a contour of the pointed threat object, includes:
detecting the corners of the remaining contours of the updated binary image (ITh) by a corner detection technique;
detecting a plurality of points common to the contours and their corners;
computing a dynamic displacement value (d) for a contour;
selecting point pairs (P11, P12; P21, P22; P31, P32;...) whose distances to a common corner point (C) along the contour are in multiples of the dynamic displacement value (d);
calculating shortest distances (d1, d2, d3) between the points in each point pair (P11, P12; P21, P22; P31, P32;...) and computing maximum of the shortest distances (Distmax = max (d1, d2, d3,...));
comparing the computed maximum of the shortest distances (Distmax) with the breadth of the rectangle (BreadthRect) enclosing the contour (Distmax < Th*BreadthRect);
discarding the point pairs which do not meet a distance criteria (dn,1>dn-1,1 and dn,2>dn-1,2);
determining the cosine of angles between the points of each point pair with reference to the common corner point (C);
counting the number of point pairs (NPPvalid) having cosine of angle greater than a first angle threshold;
comparing the number of point pairs (NPPvalid) with a point pair threshold value (NTh); and
classifying the contour as the pointed threat object if the number of point pairs (NPPvalid) are greater than the point pair threshold value (NTh).

6. The method as claimed in claim 5, wherein the method, if the number of point pairs (NPPvalid) are less than the point pair threshold value (NTh), includes:
counting the number of irregular point pairs (NPPIrr);
comparing the number of irregular point pairs (NPPIrr) with a first point pair threshold value (NPPTh1);
counting the maximum number of consecutive point pairs (NPPConsecutive) having cosine of angle greater than a second angle threshold, if the number of irregular point pairs (NPPIrr) is less than the first point pair threshold value (NPPTh1);
comparing the number of consecutive point pairs (NPPConsecutive) with a second point pair threshold value (NPPTh2); and
classifying the contour as the pointed threat object if the number of consecutive point pairs (NPPConsecutive) is greater than the second point pair threshold value (NPPTh2).

7. The method as claimed in claim 6, wherein the method includes designating the contour as a non-threat object,
if the computed maximum of the shortest distances (Distmax) is greater than the breadth of the rectangle, or
if the number of irregular point pairs (NPPIrr) is greater than the first point pair threshold value (NPPTh1), or
if the number of consecutive point pairs (NPPConsecutive) is less than the second point pair threshold value (NPPTh2).

8. The method as claimed in claims 5 to 7, wherein a point pair (PPn) which does not meet a condition that,
cosine of angle between points of the point pair (PPn) is greater than cosine of angle between points of a previous point pair (PPn-1) and less than cosine of angle between points of a subsequent point pair (PPn+1),
is an irregular point pair (NPPIrr).

9. A system (10) for detecting a pointed threat object during X-Ray baggage scanning, said system comprising:
a computing unit (100) in communication with an X-ray baggage scanner (50), wherein said computing unit (100) comprises at least a memory (102) configured to store a set of instructions and at least a processor (101) cooperating with said memory (102) to execute the instructions, said processor (101) configured to:
receive an X-ray image of a baggage in RGB (Red Green Blue) format;
convert the X-ray image from RGB (Red Green Blue) format to HSV (Hue Saturation Value) format;
extract a binary image (ITh) of blue coloured pixels from the HSV converted X-ray image;
extract outer contour of the binary image (ITh);
compute the area of the extracted outer contour;
compare the computed area of the extracted outer contour with an area threshold value (AreaTh);
select one or more regions having dark shades of blue colored pixels in the extracted outer contour of the binary image (ITh) and update the binary image with the dark shaded regions, in the event that the computed area of the extracted outer contour is greater than or equal to the area threshold value (AreaTh);
separate two or more connected objects in the updated binary image;
compute contours of the updated binary image (ITh) including the dark shaded regions and the separated objects therein;
remove contours having length shorter than a predefined contour length from the computed contours of the updated binary image; and
detect, from the remaining contours, the contour matching a contour of the pointed threat object and generate an alert upon successful detection.

Dated this 29th day of March, 2019

FOR BHARAT ELECTRONICS LIMITED
(By their Agent)

D. MANOJ KUMAR (IN/PA-2110)
KRISHNA & SAURASTRI ASSOCIATES LLP

Documents

Application Documents

# Name Date
1 201941012393-PROVISIONAL SPECIFICATION [29-03-2019(online)].pdf 2019-03-29
2 201941012393-FORM 1 [29-03-2019(online)].pdf 2019-03-29
3 201941012393-DRAWINGS [29-03-2019(online)].pdf 2019-03-29
4 201941012393-FORM-26 [18-06-2019(online)].pdf 2019-06-18
5 Correspondence by Agent _Power of Attorney_28-06-2019.pdf 2019-06-28
6 201941012393-Proof of Right (MANDATORY) [27-09-2019(online)].pdf 2019-09-27
7 Correspondence by Agent_Form1_04-10-2019.pdf 2019-10-04
8 201941012393-FORM 3 [02-01-2020(online)].pdf 2020-01-02
9 201941012393-ENDORSEMENT BY INVENTORS [02-01-2020(online)].pdf 2020-01-02
10 201941012393-DRAWING [02-01-2020(online)].pdf 2020-01-02
11 201941012393-CORRESPONDENCE-OTHERS [02-01-2020(online)].pdf 2020-01-02
12 201941012393-COMPLETE SPECIFICATION [02-01-2020(online)].pdf 2020-01-02
13 201941012393-FORM 18 [12-11-2020(online)].pdf 2020-11-12
14 201941012393 Correspondence by Office_Defence_22-12-2021.pdf 2021-12-22
15 201941012393-FER.pdf 2022-01-04
16 201941012393-Reply From Defence.pdf 2022-05-04
17 201941012393-ABSTRACT [30-06-2022(online)].pdf 2022-06-30
18 201941012393-CLAIMS [30-06-2022(online)].pdf 2022-06-30
19 201941012393-COMPLETE SPECIFICATION [30-06-2022(online)].pdf 2022-06-30
20 201941012393-DRAWING [30-06-2022(online)].pdf 2022-06-30
21 201941012393-FER_SER_REPLY [30-06-2022(online)].pdf 2022-06-30
22 201941012393-US(14)-HearingNotice-(HearingDate-12-08-2024).pdf 2024-07-10
23 201941012393-FORM-26 [09-08-2024(online)].pdf 2024-08-09
24 201941012393-Correspondence to notify the Controller [09-08-2024(online)].pdf 2024-08-09
25 201941012393-Written submissions and relevant documents [27-08-2024(online)].pdf 2024-08-27
26 201941012393-IntimationOfGrant26-09-2024.pdf 2024-09-26
27 201941012393-PatentCertificate26-09-2024.pdf 2024-09-26
28 201941012393-PROOF OF ALTERATION [04-10-2024(online)].pdf 2024-10-04
29 201941012393-Response to office action [01-11-2024(online)].pdf 2024-11-01

Search Strategy

1 201941012393E_24-11-2021.pdf

ERegister / Renewals

3rd: 20 Dec 2024 (from 29/03/2021 to 29/03/2022)
4th: 20 Dec 2024 (from 29/03/2022 to 29/03/2023)
5th: 20 Dec 2024 (from 29/03/2023 to 29/03/2024)
6th: 20 Dec 2024 (from 29/03/2024 to 29/03/2025)
7th: 21 Mar 2025 (from 29/03/2025 to 29/03/2026)