
Method And Apparatus For Detection Of Presence And/Or Absence Of An Object From An Image

Abstract: An apparatus for detection of presence and/or absence of an object from an image of a new item, said apparatus comprises database creation means adapted to create a database including correct Regions of Interest of a correct image from a correct item; database adapted to store said created Regions of Interest as templates; image capturing means adapted to capture images of a new item with respect to said Regions of Interest; first, second, third, fourth, fifth computation means, including corresponding comparator means, adapted to compare and compute respective first, second, third, fourth, fifth fuzzy factor scores indicating the degree of similarity of said image of a new item with respect to said corresponding template using Histogram Similarity Technique, Shape Features Technique, Average Absolute Difference Technique, Image Fidelity Technique, Structural Content Technique; and final score calculation means adapted to calculate a final score in relation to detecting the presence and/or absence of an object, using said first fuzzy factor score, said second fuzzy factor score, said third fuzzy factor score, said fourth fuzzy factor score, and said fifth fuzzy factor score.


Patent Information

Application #
Filing Date
21 October 2008
Publication Number
32/2010
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application
Patent Number
Legal Status
Grant Date
2018-06-20
Renewal Date

Applicants

TATA CONSULTANCY SERVICES LIMITED
NIRMAL BUILDING, 9TH FLOOR, NARIMAN POINT, MUMBAI-400021, MAHARASHTRA, INDIA.

Inventors

1. CHATTOPADHYAY TANUSHYAM
TATA CONSULTANCY SERVICES BENGAL INTELLIGENT PARK, BUILDING-D, PLOT NO. A2 M2 & N2, BLOCK-EP, SALT LAKE ELECTRONIC COMPLEX, SECTOR-V, KOLKATA-700001, WEST BENGAL, INDIA.
2. CHAKI AYAN
TATA CONSULTANCY SERVICES BENGAL INTELLIGENT PARK, BUILDING-D, PLOT NO. A2 M2 & N2, BLOCK-EP, SALT LAKE ELECTRONIC COMPLEX, SECTOR-V, KOLKATA-700001, WEST BENGAL, INDIA.

Specification

FORM -2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENTS RULES, 2003
PROVISIONAL
Specification
(See Section 10 and rule 13)


A METHOD AND APPARATUS FOR DETECTION OF PRESENCE AND/OR ABSENCE OF AN OBJECT FROM AN IMAGE
TATA CONSULTANCY SERVICES LTD.,
an Indian Company of Nirmal Building, 9th Floor, Nariman Point, Mumbai 400 021, Maharashtra, India
THE FOLLOWING SPECIFICATION DESCRIBES THE INVENTION


This invention relates to a method and apparatus for detection of presence and/or absence of an object from an image.
This invention envisages the above method and apparatus by comparing two regions of interest (ROI) in a single image or in multiple images.
In particular, this invention envisages a novel way of matching two objects using multiple parameters.
In particular, this invention envisages a novel way of using Histogram Similarity, Shape features, Average Absolute Difference of two images, Image Fidelity, and Structural Content for comparing two ROIs in a single image or in multiple images.
In particular, this invention envisages a method which relates to automated visual inspection of the presence and/or absence of an object using the above mentioned parameters.
In particular, this invention provides a non-threshold based technique for automated visual inspection of the presence and/or absence of an object using the above mentioned parameters.
In particular, this invention provides a fuzzy multifactor based approach for automated visual inspection of the presence and/or absence of an object using the above mentioned parameters.

Background of the Invention:
Artificial vision applies digital image processing and analysis to tackle real problems in industrial production, mainly of standardized products, under real-time conditions. Manual inspection or monitoring of any continuous process is commonly agreed to be inefficient, especially because of its repetitive and tedious nature. People are employed for this activity almost exclusively for their vision capability. In glass plants and on bottling lines, platoons of people are used to peer at each bottle to ensure the absence of defects or foreign material in the bottle. But due to human error it may happen that a bottle with a crack or with some foreign particle inside passes through the human vision system. This can lead to a cleanup problem, perhaps in someone's shopping cart, or it could break and jam the filling machine, causing expensive downtime. A small glass chip lying undetected in the bottom of the bottle could be ingested by the consumer, possibly leading to injury and a product liability suit. To avoid this problem, a more reliable technology with repeatable accuracy is needed. Automatic Visual Inspection has its impact in many fields including the textile and printing industry and the electronics and manufacturing industry.
A basic problem with the visual inspection system is that, for a wide range of inspection tasks, the detection rate is assumed to start off at 80-90% and to deteriorate rapidly after half an hour; additionally, the performance of a single inspector degrades rapidly with the number of possible defect types. Machine Vision is becoming popular in automotive industries for inspection of components. The vision industry needs a real-time pattern matching technique that can handle non-square pixel data and non-linear changes in brightness without losing accuracy. Irrespective of non-linear changes in brightness, the object boundaries are always visible and only the shapes of objects remain the same.
Meliones et al. proposed a distributed vision network to tackle industrial packaging inspection. The system consists of independent networked inspection stations able to efficiently address parallel inspection tasks such as product identification, character verification, tag inspection, and content and packaging quality control at a high production speed. Zhen et al. perform the gradient operation on the texture image in four channels and form an edge map from it. Over the edge map, texture features are specified by the density of micro-edge pixels of each channel. These features are computed in a sliding window with a size of W×W pixels. The feature data in each channel approximately follow a normal distribution. The mean and standard deviation of each channel form a feature vector for that texture. Classification is then performed by thresholding using the normal distribution confidence level.
One of the major challenges in object detection applications is handling non-linear changes in brightness, perspective distortion, process variations such as multilayer buildup in wafer production, and blurring related issues. Contrast can change non-linearly and unpredictably during the inspection. Irrespective of non-linear changes in brightness, the object boundaries are always visible and, basically, only the shapes of objects remain the same. As the target contains many parts to be inspected and each part has its own characteristics, the pattern matching algorithm needs to be fast enough to inspect in real time while maintaining high accuracy. A challenge in most machine vision problems is to take care of the varying contrast and to enhance (or reduce) it based on need.

Vinay G. Vaidya et al. proposed a technique for night vision enhancement using the Wigner distribution. They calculated the Wigner distribution with an exponential kernel and a fixed-size window, and the DC-frequency coefficient is raised to a constant power, which gives an enhanced version of the image.
In accordance with this invention a system is envisaged where an automated decision is taken for detecting the presence and/or absence of an object in an image by comparing two ROIs in single or multiple images.
Further, in accordance with this invention a system is designed to detect in real time whether any object in an image is misplaced or missing.
This invention uses a fuzzy multifactor based approach based on the following features for comparing two ROIs in single or multiple images:
Histogram Similarity,
Shape features,
Average Absolute Difference of two images,
Image Fidelity,
Structural Content.
The method comprises the following steps:
(a) calculating a fuzzy factor indicating the degree of similarity of two image segments using Histogram Similarity
(b) calculating a fuzzy factor indicating the degree of similarity of two image segments using Shape features

(c) calculating a fuzzy factor indicating the degree of similarity of two image segments using Average Absolute Difference of two images
(d) calculating a fuzzy factor indicating the degree of similarity of two image segments using Image Fidelity; and
(e) calculating a fuzzy factor indicating the degree of similarity of two image segments using Structural Content;
(f) detecting the presence and/or absence of an object by evaluating the parameters calculated in steps (a), (b), (c), (d) and (e).
The method of the invention further comprises the steps of assigning a single fuzzy factor to each image segment based on the fuzzy factors described in (a), (b), (c), (d), and (e).
The method further comprises the steps of assigning the same or different weights to each of the factors described in (a), (b), (c), (d), and (e) for assigning a single fuzzy factor to each image segment.
The method further comprises the steps of computing a weight for each of the factors described in (a), (b), (c), (d), and (e) using a machine learning algorithm such as an Artificial Neural Network (ANN), Support Vector Machine (SVM), or Hidden Markov Model (HMM).

Brief Description of the Accompanying Drawings:
The invention will now be described with reference to the accompanying drawings, wherein
Figure 1 gives an overview of the Inspection System, and
Figure 2 gives the flowchart for online inspection of instrument cases.
Detailed Description of the Invention:
The high-level overview of the method and apparatus for detection of missing and misplaced instruments from an instrument case is given in Figure 1. The work is conducted in three phases: (i) offline database creation by manually marking the location of each instrument within an instrument case (referred to as the case from this point onward); these regions will be considered as regions of interest (ROI) from this point onward in this text; (ii) online feature extraction for pattern matching; and (iii) a decision making system based on fuzzy multifactorial analysis.
(i) Offline database creation:
In this phase the relative positions of all instruments in each instrument case are stored in a database. The pseudo code for this step is:
Find the bounding box coordinates for each ROI in the reference instrument case (RIC).
Calculate the normalized coordinates of the top left (x'0, y'0) and bottom right (x'1, y'1) positions of the bounding box for each ROI based on the image resolution using the formula:

x'k = (xk - xcase0) / widthcase, y'k = (yk - ycase0) / heightcase, for k = 0, 1

where xcase0 and ycase0 represent the top left coordinate of the RIC, the absolute coordinates are represented as (x0, y0) and (x1, y1) respectively, and widthcase and heightcase represent the width and height of the RIC respectively.
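The offset-and-scale normalization described above can be sketched as follows; the function name and argument order are illustrative, not from the specification:

```python
def normalize_roi(x0, y0, x1, y1, x_case0, y_case0, width_case, height_case):
    """Normalize absolute ROI bounding-box corners (x0, y0)-(x1, y1) to the
    reference instrument case (RIC): subtract the case origin, then scale
    by the case dimensions, so results are independent of image resolution."""
    nx0 = (x0 - x_case0) / width_case
    ny0 = (y0 - y_case0) / height_case
    nx1 = (x1 - x_case0) / width_case
    ny1 = (y1 - y_case0) / height_case
    return (nx0, ny0), (nx1, ny1)
```

A case occupying (100, 100) to (300, 300) with an ROI at (150, 120)-(250, 220) yields normalized corners (0.25, 0.1) and (0.75, 0.6), which can be re-projected onto any capture resolution.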
(ii) Online feature extraction
The following assumptions are taken into account while designing the proposed system:
1. Most of the instruments have a linear shape
2. Image of the case will have a distinct contrast difference with the background
3. Prior knowledge of the different type of instruments to be detected is available
4. Instrument case will have a fixed position with reference to the camera
5. Every instrument will have a fixed position inside the case
6. Both the reference case and the case under inspection will be under similar illumination
The overview of this step is described below:
• Get the Case id from RFID reader
• Verify whether the case id is present in the existing case database or not

• If the input case is a valid one, search for the instruments within it. As the ROI for each instrument within a case is stored in the database, this task is simply reduced to the task of searching the database.
• Extract the features from each ROI. The method for obtaining the factor values is described below.
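The bullet steps above can be sketched as a small control flow; `case_db` and `extract_features` are hypothetical names for the stored ROI database and the per-ROI feature routine:

```python
def inspect_case(case_id, case_db, extract_features):
    """Online inspection flow: validate the RFID case id against the
    existing case database, then extract features for each stored ROI.
    `case_db` maps case id -> list of normalized ROIs (hypothetical layout)."""
    if case_id not in case_db:
        return None  # unknown case: reject before any image processing
    return [extract_features(roi) for roi in case_db[case_id]]
```

Returning `None` for an unknown case id mirrors the verification step; a production system would raise an alarm instead.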
Assigning factor values for each feature:
In the approach in accordance with this invention a membership is assigned to each ROI based on five factors. Each factor is obtained on the basis of some feature values for that particular ROI. We have used (i) Shape Feature (SF), (ii) Histogram Similarity (HS), (iii) Image Fidelity (IF), (iv) Average Absolute Difference (AAD), and (v) Structural Content (SC). These factors are described in serial order below.
Shape Feature: This feature is based on assumption 1. To get the degree of linearity of the object boundary, the Hough transform is applied to the boundary of each component.
Let H_i be the number of Hough lines for the i-th ROI in the reference image and H'_i be the number of Hough lines for the same ROI in the case under inspection (CUI). We compute the shape factor f_SF,i from H_i and H'_i, where n is the total number of ROIs in the case.
The factor f_SF,i will have values lying in the closed interval [0, 1]. When the CUI contains all the instruments in their proper places, f_SF,i should typically be 1. Similarly, it is observed that when there is a certain change in illumination conditions but the CUI holds all the instruments in their proper places, the factor value tends to nearly 1 but may not be exactly one. f_SF,i tends to 0 if the instrument under the ROI is not in its proper place.
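The defining formula for f_SF,i did not survive extraction; a plausible sketch follows, assuming the factor is the ratio of the smaller to the larger Hough-line count — an assumption chosen only to satisfy the stated properties (1 on a perfect match, toward 0 when the ROI loses its linear structure), not the patent's actual formula:

```python
def shape_factor(h_ref, h_cui):
    """Hypothetical shape factor f_SF,i from Hough-line counts of the
    reference ROI (h_ref) and the ROI under inspection (h_cui):
    min/max ratio, equal to 1 when the counts match and approaching 0
    when one ROI has far fewer detected lines than the other."""
    if max(h_ref, h_cui) == 0:
        return 1.0  # no lines detected in either ROI: treat as identical
    return min(h_ref, h_cui) / max(h_ref, h_cui)
```

In practice the Hough-line counts would come from a line detector such as a probabilistic Hough transform run on the ROI's edge map.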
Histogram Similarity:
To get the degree of similarity of cumulative property of luminance and
chrominance feature as a whole within the ROI, Histogram Similarity feature is
used.
Histogram Similarity (HS) is computed from the relative frequencies f1(c) of each intensity level c in a 255-level image.
To get the factor value for this feature, we compute HS (HS_i) for each ROI of the reference and the CUI. Now the factor (f_HS,i) for the i-th ROI in the CUI is defined in terms of HS_max, where
HS_max = max(HS_i, HIGH_VALUE) ∀ i ∈ {1, 2, ..., n}
and HIGH_VALUE is a predefined high value obtained by computing HS for two distinct images. This HIGH_VALUE is used to remove the possibility of a divide-by-zero error.
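The HS formula itself did not survive extraction; a minimal sketch follows, assuming HS is the sum of absolute differences between the relative level frequencies of the two ROIs (any histogram distance plugs into the factor the same way), and normalizing by HS_max as the text describes:

```python
import numpy as np

def histogram_similarity(img_a, img_b, levels=256):
    """Hypothetical HS measure: sum of absolute differences between the
    relative frequencies f1(c), f2(c) of each level c. 0 for identical
    histograms, larger for dissimilar ones (a distance, per the text's
    note that HS is high for two distinct images)."""
    h_a = np.bincount(img_a.ravel(), minlength=levels) / img_a.size
    h_b = np.bincount(img_b.ravel(), minlength=levels) / img_b.size
    return float(np.abs(h_a - h_b).sum())

def hs_factor(hs_values, high_value):
    """Per-ROI factor f_HS,i: normalize each HS_i by
    HS_max = max(max_i HS_i, HIGH_VALUE) and invert, so identical
    histograms give a factor of 1; HIGH_VALUE guards the division."""
    hs_max = max(max(hs_values), high_value)
    return [1.0 - hs / hs_max for hs in hs_values]
```

The inversion (1 − HS_i/HS_max) is an assumption that turns the distance into a similarity-style factor in (0, 1].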
Image Fidelity:
This feature is used to get the fidelity of the image content. Image Fidelity (IF) is computed as one minus the squared pixel-wise difference between the two ROIs, normalized by the energy of the reference ROI.
To get the factor value for this feature, we compute IF (IF_i) for each ROI of the reference and the CUI. Now the factor (f_IF,i) for the i-th ROI in the CUI is defined in terms of IF_max, where
IF_max = max(IF_i, HIGH_VALUE) ∀ i ∈ {1, 2, ..., n}
and HIGH_VALUE is a predefined high value obtained by computing IF for two distinct images. This HIGH_VALUE is used to remove the possibility of a divide-by-zero error.
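The IF formula image did not extract; the sketch below uses the standard Image Fidelity measure from the image-quality literature, IF = 1 − Σ(ref − test)² / Σref², on the assumption that this is the intended definition:

```python
import numpy as np

def image_fidelity(ref, test):
    """Standard Image Fidelity (assumed definition): 1 minus the squared
    pixel-wise error normalized by the reference ROI's energy.
    Identical ROIs give IF = 1; large deviations push IF toward 0."""
    ref = ref.astype(np.float64)
    test = test.astype(np.float64)
    return float(1.0 - ((ref - test) ** 2).sum() / (ref ** 2).sum())
```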
Average Absolute Difference
This feature is based on assumption 6. To get the degree of similarity of the individual pixels under the ROI, the AAD feature is used. AAD is the average of the absolute pixel-wise differences between the two ROIs.
To get the factor value for this feature, we compute AAD (AAD_i) for each ROI of the reference and the CUI. Now the factor (f_AAD,i) for the i-th ROI in the CUI is defined in terms of AAD_max, where
AAD_max = max(AAD_i, HIGH_VALUE) ∀ i ∈ {1, 2, ..., n}
and HIGH_VALUE is a predefined high value obtained by computing AAD for two distinct images. This HIGH_VALUE is used to remove the possibility of a divide-by-zero error.
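The AAD formula image did not extract; a minimal sketch using the standard definition — the mean absolute pixel difference over the ROI:

```python
import numpy as np

def average_absolute_difference(ref, test):
    """Standard AAD: mean absolute pixel-wise difference between the
    reference ROI and the ROI under inspection. 0 for identical ROIs."""
    return float(np.abs(ref.astype(np.float64) - test.astype(np.float64)).mean())
```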
Structural Content:
This feature is used to get the structural content. Structural Content (SC) is the ratio of the summed squared pixel values of the reference ROI to those of the ROI under inspection; SC is 1 when the two ROIs are identical.
To get the factor value for this feature, we compute SC (SC_i) for each ROI of the reference and the CUI. Now the factor (f_SC,i) for the i-th ROI in the CUI is defined in terms of SC_max, where
SC_max = max(|1 - SC_i|, HIGH_VALUE) ∀ i ∈ {1, 2, ..., n}
and HIGH_VALUE is a predefined high value obtained by computing SC for two distinct images. This HIGH_VALUE is used to remove the possibility of a divide-by-zero error.
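The SC formula image did not extract; the sketch below uses the standard Structural Content measure, SC = Σref² / Σtest², assumed to be the intended definition — note SC = 1 for identical ROIs, which is why the factor normalization above works with |1 − SC_i|:

```python
import numpy as np

def structural_content(ref, test):
    """Standard Structural Content (assumed definition): ratio of the
    reference ROI's energy to the inspected ROI's energy. Equals 1 for
    identical ROIs; |1 - SC| grows as the ROI content diverges."""
    ref = ref.astype(np.float64)
    test = test.astype(np.float64)
    return float((ref ** 2).sum() / (test ** 2).sum())
```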
(iii) Decision making system based on fuzzy multifactorial analysis:
Thus, a 5 × n matrix (D) is obtained. Now, a mapping of D to a 1 × n decision matrix F is defined with elements f_i, where f_i represents the fuzzy decision for the i-th ROI in the CUI. As each d_ij ∈ (0, 1), each f_i ∈ (0, 1).
If f_i is greater than a statistically obtained threshold value, then the object is assumed to be present in the ROI.
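The mapping from D to F did not survive extraction; a minimal sketch follows, assuming a weighted average of the five factors per ROI. The equal weights and the 0.5 threshold are placeholders — the specification obtains the threshold statistically and allows the weights to be learned (e.g. by ANN, SVM, or HMM):

```python
import numpy as np

def fuzzy_decision(D, weights=None, threshold=0.5):
    """Map the 5 x n factor matrix D to a 1 x n decision vector F as a
    weighted average of the five factors per ROI, then threshold each f_i.
    Since every d_ij lies in (0, 1), every f_i also lies in (0, 1)."""
    D = np.asarray(D, dtype=np.float64)           # shape (5, n)
    if weights is None:
        weights = np.full(D.shape[0], 1.0 / D.shape[0])  # equal weights
    f = weights @ D                                # shape (n,): fuzzy decisions
    present = f > threshold                        # object present per ROI
    return f, present
```

Swapping in learned weights only changes the `weights` vector; the thresholding step is unchanged.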
Although the invention has been described in terms of particular embodiments and applications, one of ordinary skill in the art, in light of this teaching, can generate additional embodiments and modifications without departing from the spirit of or exceeding the scope of the claimed invention. Accordingly, it is to be understood that the drawings and descriptions herein are offered by way of example to facilitate comprehension of the invention and should not be construed to limit the scope thereof.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 2268-MUM-2008-FORM 5(21-10-2009).pdf 2009-10-21
2 2268-MUM-2008-FORM 2(TITLE PAGE)-(21-10-2009).pdf 2009-10-21
3 2268-mum-2008-form 2(21-10-2009).pdf 2009-10-21
4 2268-MUM-2008-DRAWING(21-10-2009).pdf 2009-10-21
5 2268-MUM-2008-DESCRIPTION(COMPLETE)-(21-10-2009).pdf 2009-10-21
6 2268-MUM-2008-CORRESPONDENCE(21-10-2009).pdf 2009-10-21
7 2268-MUM-2008-CLAIMS(21-10-2009).pdf 2009-10-21
8 2268-MUM-2008-ABSTRACT(21-10-2009).pdf 2009-10-21
9 2268-MUM-2008-FORM 18(18-11-2010).pdf 2010-11-18
10 2268-MUM-2008-CORRESPONDENCE(18-11-2010).pdf 2010-11-18
11 Other Patent Document [05-10-2016(online)].pdf 2016-10-05
12 2268-MUM-2008-FER_SER_REPLY [03-01-2018(online)].pdf 2018-01-03
13 2268-MUM-2008-DRAWING [03-01-2018(online)].pdf 2018-01-03
14 2268-MUM-2008-COMPLETE SPECIFICATION [03-01-2018(online)].pdf 2018-01-03
15 2268-MUM-2008-CLAIMS [03-01-2018(online)].pdf 2018-01-03
16 2268-MUM-2008-ABSTRACT [03-01-2018(online)].pdf 2018-01-03
17 2268-MUM-2008-FORM-26 [15-05-2018(online)].pdf 2018-05-15
18 2268-MUM-2008-FORM-26 [22-05-2018(online)].pdf 2018-05-22
19 2268-MUM-2008-Written submissions and relevant documents (MANDATORY) [05-06-2018(online)].pdf 2018-06-05
20 2268-MUM-2008-PatentCertificate20-06-2018.pdf 2018-06-20
21 2268-MUM-2008-IntimationOfGrant20-06-2018.pdf 2018-06-20
22 2268-MUM-2008-CORRESPONDENCE(7-11-2008).pdf 2018-08-09
23 2268-mum-2008-correspondence.pdf 2018-08-09
24 2268-mum-2008-description(provisional).pdf 2018-08-09
25 2268-mum-2008-drawing.pdf 2018-08-09
26 2268-MUM-2008-FER.pdf 2018-08-09
27 2268-MUM-2008-FORM 1(7-11-2008).pdf 2018-08-09
28 2268-mum-2008-form 1.pdf 2018-08-09
29 2268-mum-2008-form 2(title page).pdf 2018-08-09
30 2268-mum-2008-form 2.pdf 2018-08-09
31 2268-mum-2008-form 26.pdf 2018-08-09
32 2268-mum-2008-form 3.pdf 2018-08-09
33 2268-MUM-2008-HearingNoticeLetter.pdf 2018-08-09
34 abstract1.jpg 2018-08-09
35 2268-MUM-2008-OTHERS (ORIGINAL UR 6( 1A) FORM 26)-240518.pdf 2018-08-21
36 2268-MUM-2008-OTHERS ( ORIGINAL UR 6( 1A) FORM 26)-240518.pdf 2018-08-30
37 2268-MUM-2008-RELEVANT DOCUMENTS [23-03-2019(online)].pdf 2019-03-23
38 2268-MUM-2008-RELEVANT DOCUMENTS [29-03-2020(online)].pdf 2020-03-29
39 2268-MUM-2008-RELEVANT DOCUMENTS [29-09-2021(online)].pdf 2021-09-29
40 2268-MUM-2008-RELEVANT DOCUMENTS [26-09-2022(online)].pdf 2022-09-26
41 2268-MUM-2008-RELEVANT DOCUMENTS [28-09-2023(online)].pdf 2023-09-28

Search Strategy

1 searchstartegy_14-07-2017.pdf

E-Register / Renewals

3rd: 09 Aug 2018

From 21/10/2010 - To 21/10/2011

4th: 09 Aug 2018

From 21/10/2011 - To 21/10/2012

5th: 09 Aug 2018

From 21/10/2012 - To 21/10/2013

6th: 09 Aug 2018

From 21/10/2013 - To 21/10/2014

7th: 09 Aug 2018

From 21/10/2014 - To 21/10/2015

8th: 09 Aug 2018

From 21/10/2015 - To 21/10/2016

9th: 09 Aug 2018

From 21/10/2016 - To 21/10/2017

10th: 09 Aug 2018

From 21/10/2017 - To 21/10/2018

11th: 09 Aug 2018

From 21/10/2018 - To 21/10/2019

12th: 26 Sep 2019

From 21/10/2019 - To 21/10/2020

13th: 20 Oct 2020

From 21/10/2020 - To 21/10/2021

14th: 21 Sep 2021

From 21/10/2021 - To 21/10/2022

15th: 06 Oct 2022

From 21/10/2022 - To 21/10/2023

16th: 12 Oct 2023

From 21/10/2023 - To 21/10/2024

17th: 30 Sep 2024

From 21/10/2024 - To 21/10/2025

18th: 14 Oct 2025

From 21/10/2025 - To 21/10/2026