THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
“A method for detection of Point of Importance (POI) in a binary object image”
By
BHARAT ELECTRONICS LIMITED
Nationality: Indian
OUTER RING ROAD, NAGAVARA, BANGALORE- 560045,
KARNATAKA, INDIA
The following specification particularly describes the invention and the manner in which it is to be performed.
Field of the invention
The present invention mainly relates to the Point of Importance (POI) in an object, and more particularly to a method for detection of the Point of Importance in a binary object.
Background of the invention
The point of importance is the most strategically important location in an object. Usually, every geometric object is characterized by its centroid, which is the geometric average of all the points lying either inside or on the object. The point of importance, by contrast, is guaranteed to lie inside the object. The conventional method places the POI at the location of the centroid of the object, but it is a fact that the centroid of a non-convex object does not always lie inside the object.
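This shortcoming of the centroid can be illustrated with a short numerical sketch (using NumPy; the shape and values below are an illustrative example, not part of the specification). For a C-shaped object, the geometric average of the object pixels lands on a background pixel:

```python
import numpy as np

# A C-shaped (non-convex) binary object: 1 = object, 0 = background.
img = np.array([
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [1, 1, 1, 1, 1],
])

ys, xs = np.nonzero(img)          # coordinates of all object pixels
cy, cx = ys.mean(), xs.mean()     # centroid = geometric average of the points

# The nearest pixel to the centroid is background, i.e. the centroid
# falls outside the object.
inside = img[int(round(cy)), int(round(cx))] == 1
```

Here the centroid is (row 2.0, column ≈ 1.54), which rounds to a hole of the "C", so a centroid-based POI would not lie on the object.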
For example, document US 4499597 A describes small-object location utilizing centroid accumulation. In the disclosed image-analysis scheme, a processing window of M by N pixels is successively scanned in single-pixel steps over a sensed image, and the centroid of the image data contained in each window position is determined. When a pixel-by-pixel tabulation is made of the number of times each pixel has been determined to be the windowed-data centroid, the pixels with the higher tabulated centroid counts tend to be the intra-image locations of any objects smaller than about M/2 by N/2 pixels.
Another document, US 3513318 A, describes an object-locating system using image centroid measurement. The signals representing location are sampled at the leading edge of an object-representing signal and again at the trailing edge of the same signal. The two sets of location-representing signals are averaged to provide a representation of the location of the centroid of the object. When used in combination with an imaging sensor, the accuracy of the location-representing signals is improved by correcting for non-orthogonality of the deflecting circuits and for misalignment of the optical axis relative to a deflection null axis.
Further, document CN 103049921 B describes a method for determining the image centroid of a small irregular celestial body for deep-space autonomous navigation. Based on a three-dimensional shape model generated on the ground, images of the body are produced at different orientations, the seven classes of Hu's normalized central-moment invariants are computed for each orientation, and a corresponding centroid correction factor is stored to form a model base. For the image captured at the current time, the contour of the small celestial body is extracted and the moments of the binary contour image are computed to obtain a centroid estimate; taking that centroid as the center, the first and second classes of Hu's normalized second-order central-moment invariants are calculated for matching, and if the match is unsuccessful, the third- to seventh-class third-order central moments are additionally computed and matched. The calculated central moments are matched against the central moments stored in the model base, and when the similarity satisfies a set threshold condition, the matching is complete and the current centroid is corrected using the stored correction factor.
Therefore, there is a need in the art for a method for detection of the point of importance in a binary object that overcomes the above-mentioned limitations.
Objective of the invention
The main objective of the present invention is to locate a point inside a binary object in a strategic way and to ensure consistency of the output across different scanning directions.
Summary of the Invention
An aspect of the present invention is to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below.
Accordingly, an aspect of the present invention is to provide a method for detection of Point of Importance (POI) in a binary object image. The method includes: converting a colored image into a gray-scale image; converting the gray-scale image into a binary image by optimal thresholding, wherein in the binary image all 1s belong to the object and all 0s belong to the background; scanning the image inside the object column-wise and row-wise, wherein all k consecutive object pixels, along with their positions (x, y), are stored into a temporary buffer of size k, and the center location of the temporary buffer containing all object pixels in each column and each row is stored in memory; and iterating over the memory once the image is scanned and marking the center location of the memory as the POI (Xp, Yp).
Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
Brief description of the drawings
The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings in which:
Figure 1 shows the general flow-chart describing the method for locating the POI inside the object according to one embodiment of the present invention.
Persons skilled in the art will appreciate that elements in the figures are illustrated for simplicity and clarity and may not have been drawn to scale. For example, the dimensions of some of the elements in the figure may be exaggerated relative to other elements to help improve understanding of various exemplary embodiments of the present disclosure.
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
Detailed description of the invention
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to their bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
Figures discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way that would limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged communications system. The terms used to describe various embodiments are exemplary. It should be understood that these are provided to merely aid the understanding of the description, and that their use and definitions in no way limit the scope of the invention. Terms first, second, and the like are used to differentiate between objects having the same terminology and are in no way intended to represent a chronological order, unless where explicitly stated otherwise. A set is defined as a non-empty set including at least one element.
The present invention relates to a method for locating the point of importance inside a binary object in a strategic way. The conventional way of calculating the POI is based on the centroid of the object. The shortcomings of the centroid-based POI are overcome by scanning the points inside the object and iterating over them to select one of them, in a strategic manner, as the POI. The POI herein is proposed for a single object in the process window, but the idea can also be generalized to a multiple-object scenario inside the process window by assigning different labels to different objects and then calculating the POI for each labeled object as proposed in the present invention.
Further, the present invention is concerned with picking a point related to the object other than the centroid, based on predefined criteria decided by the field of application. This point is defined as the point of importance.
Figure 1 shows the general flow-chart describing the method for locating the POI inside the object according to one embodiment of the present invention.
The figure shows the general flow-chart 100 describing the method for locating the POI inside the object. The method orders the points in the object and selects the strategically important point; the selection criteria can be defined in various ways. The present method ensures that the point selected as the POI always lies either inside the object or on its surface boundary. Further, the present invention strategically selects the points of the object based on various approaches, each of which yields a distinct definition of the POI location.
Initially, a process window of size M x N is constructed over the image. In general, the algorithm for locating the POI is divided into five steps as follows:
Step 1: The input image, as depicted in 1.1, can generally be acquired by either a CCD or a TI camera with a specific resolution as per the camera specification. If the input image is a colored image, it is converted to a gray-scale image before being given as input to Step 2.
Step 2: An optimum process window of size M x N is selected over the image so that it contains the entire object. This process window is then converted into a binary image by an optimal thresholding method so that all 1s belong to the object and all 0s belong to the background.
Step 3: All the pixels inside the object are scanned successively, and a set of important samples is picked out of these based on the method as mentioned in the claims.
Step 4: Once all pixels inside the object are scanned and the set of important samples is formed, the set is iterated over to pick the POI out of all the important samples.
Step 5: The final output of Step 4 provides the POI for the given object.
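Steps 1 and 2 can be sketched as follows (an illustrative NumPy implementation; the specification does not name a particular "optimal thresholding" method, so Otsu's method is used here as one common choice, and the luminance weights are a standard assumption):

```python
import numpy as np

def to_grayscale(rgb):
    """Step 1: luminance-weighted grayscale conversion (standard weights)."""
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

def otsu_threshold(gray):
    """Step 2: one common 'optimal thresholding' choice (Otsu's method).
    Picks the threshold maximizing between-class variance of the histogram."""
    hist, _ = np.histogram(gray.ravel(), bins=256, range=(0, 256))
    total = gray.size
    sum_all = float(np.dot(np.arange(256), hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0.0
    for t in range(256):
        w0 += hist[t]                    # background pixel count so far
        if w0 == 0:
            continue
        w1 = total - w0                  # foreground pixel count
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                   # background mean
        m1 = (sum_all - sum0) / w1       # foreground mean
        var = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def binarize(gray):
    """Convert the process window to binary: 1 = object, 0 = background."""
    return (gray > otsu_threshold(gray)).astype(np.uint8)
```

Steps 3 to 5 (scanning, forming the set of important samples, and iterating it to pick the POI) then operate on the binary window produced by `binarize`.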
In one embodiment, the present invention relates to a method for detection of Point of Importance (POI) in a binary object image, the method comprising: converting a colored image into a gray-scale image 110; converting the gray-scale image into a binary image by optimal thresholding 120, wherein in the binary image all 1s belong to the object and all 0s belong to the background; scanning the image inside the object column-wise and row-wise, wherein all k consecutive object pixels, along with their positions (x, y), are stored into a temporary buffer of size k 130, and the center location of the temporary buffer containing all object pixels in each column and each row is stored in memory; and iterating over the memory once the image is scanned and marking the center location of the memory as the POI (Xp, Yp) 140.
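One possible reading of this scanning step can be sketched as follows (an assumption: the k-pixel buffer is taken to hold each maximal run of consecutive object pixels, its center kept per run, and the POI is the middle entry of the accumulated memory; the specification leaves these details open):

```python
import numpy as np

def run_centers(line, fixed, scanning_rows):
    """Centers of maximal runs of consecutive object pixels along one scan line.

    `line` is a 1-D binary array; `fixed` is the row (or column) index of the
    line; `scanning_rows` selects how the (x, y) pair is assembled, with
    x = column and y = row.
    """
    centers = []
    run_start = None
    for i, v in enumerate(np.append(line, 0)):   # sentinel 0 closes a final run
        if v and run_start is None:
            run_start = i                        # a run of object pixels begins
        elif not v and run_start is not None:
            mid = (run_start + i - 1) // 2       # center of the k-pixel buffer
            centers.append((mid, fixed) if scanning_rows else (fixed, mid))
            run_start = None
    return centers

def locate_poi(binary):
    """Scan row- and column-wise, store run centers in memory, then mark the
    center location of the memory as the POI (Xp, Yp)."""
    memory = []
    for r in range(binary.shape[0]):             # row-wise scan
        memory += run_centers(binary[r, :], r, scanning_rows=True)
    for c in range(binary.shape[1]):             # column-wise scan
        memory += run_centers(binary[:, c], c, scanning_rows=False)
    return memory[len(memory) // 2]              # center location of memory
```

Because every stored entry is the center of a run of object pixels, the returned point always lies on the object, unlike a centroid.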
The method for locating the POI in the row where the object has its maximum vertical projection includes a few steps: scanning all object pixels in each row and updating the total pixel count for each row in a buffer of size M; iterating over the buffer to find the row index (rmax) having the maximum pixel count and marking this row index as the horizontal coordinate of the POI (xp); and iterating over the object pixels along row (rmax) to find the vertical coordinate of the POI (yp), picking up the center location of the temporary buffer as yp and storing the same in memory.
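A minimal sketch of this row-projection variant (an illustrative implementation; ties between rows are broken here by taking the first maximal row, and the coordinates are reported as x = column, y = row, since the claim's coordinate naming is ambiguous):

```python
import numpy as np

def poi_max_row(binary):
    """POI in the row holding the maximum number of object pixels."""
    counts = binary.sum(axis=1)             # buffer of size M: count per row
    rmax = int(np.argmax(counts))           # row index with the maximum count
    cols = np.nonzero(binary[rmax, :])[0]   # object pixels along row rmax
    xp = int(cols[len(cols) // 2])          # center of the buffered pixels
    yp = rmax
    return xp, yp
```

The column-projection variant described next is the transpose of this procedure, with `axis=0` and the roles of rows and columns exchanged.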
The method for locating the POI in the column (cmax) where the object has its maximum horizontal projection includes a few steps: scanning all object pixels in each column and updating the total pixel count for each column in a buffer of size N; iterating over the buffer to find the column index (cmax) having the maximum pixel count and marking this column index as the horizontal coordinate of the POI (xp); and iterating over the object pixels along column (cmax) to find the vertical coordinate of the POI (yp).
The method for locating the POI over the horizontal edges of the object has a few steps: the object is scanned column-wise, all the horizontal corners of the object are stored into a temporary buffer of size k, and the center location of the buffer is updated into memory.
The method for locating the POI over the vertical edges of the object has a few steps: the object is scanned row-wise, all the vertical corners of the object are stored into a buffer of size k, and the center location of the temporary buffer is updated into memory.
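These two edge-based variants can be sketched as follows. Interpreting the claimed "corners" as the first and last pixel of each run of object pixels along a scan line is an assumption; the specification does not define them further:

```python
import numpy as np

def edge_transitions(line):
    """Indices where a run of object pixels begins or ends on one scan line
    (assumed reading of the claimed 'corners')."""
    padded = np.append(np.append(0, line), 0)
    starts = np.nonzero((padded[1:-1] == 1) & (padded[:-2] == 0))[0]
    ends = np.nonzero((padded[1:-1] == 1) & (padded[2:] == 0))[0]
    return sorted(set(starts) | set(ends))

def poi_on_edges(binary, row_wise=True):
    """Collect run endpoints from every row (or column) scan into memory and
    keep the center entry of that memory as the POI (x, y)."""
    memory = []
    if row_wise:
        for r in range(binary.shape[0]):
            memory += [(c, r) for c in edge_transitions(binary[r, :])]
    else:
        for c in range(binary.shape[1]):
            memory += [(c, r) for r in edge_transitions(binary[:, c])]
    return memory[len(memory) // 2]
```

Every stored entry is a boundary pixel of the object, so the resulting POI lies on the object's surface boundary.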
Figures are merely representational and are not drawn to scale. Certain portions thereof may be exaggerated, while others may be minimized. Figures illustrate various embodiments of the invention that can be understood and appropriately carried out by those of ordinary skill in the art.
In the foregoing detailed description of embodiments of the invention, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments of the invention require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description of embodiments of the invention, with each claim standing on its own as a separate embodiment.
It is understood that the above description is intended to be illustrative, and not restrictive. It is intended to cover all alternatives, modifications and equivalents as may be included within the spirit and scope of the invention as defined in the appended claims. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein,” respectively.
We claim:
1. A method for detection of Point of Importance (POI) in a binary object image, the method comprising:
converting a colored image into a gray-scale image;
converting the gray-scale image into a binary image by optimal thresholding, wherein in the binary image all 1s belong to the object and all 0s belong to the background;
scanning the image inside the object column-wise and row-wise, wherein all k consecutive object pixels along with their positions (x, y) are stored into a temporary buffer of size k, and wherein the center location of the temporary buffer containing all object pixels in each column and each row is stored in memory; and
iterating over the memory once the image is scanned and marking the center location of the memory as the POI (Xp, Yp).
2. The method as claimed in claim 1, for locating the POI in the row where the object has its maximum vertical projection, comprising:
scanning all object pixels in each row and updating the total pixel count for each row in a buffer of size M;
iterating over the buffer to find the row index (rmax) having the maximum pixel count and marking this row index as the horizontal coordinate of the POI (xp); and
iterating over the object pixels along row (rmax) to find the vertical coordinate of the POI (yp).
3. The method as claimed in claim 2, wherein all object pixels along row (rmax) are iterated over, the center location of the temporary buffer is picked up as yp, and the same is stored in memory.
4. The method as claimed in claim 1, for locating the POI in the column (cmax) where the object has its maximum horizontal projection, comprising:
scanning all object pixels in each column and updating the total pixel count for each column in a buffer of size N;
iterating over the buffer to find the column index (cmax) having the maximum pixel count and marking this column index as the horizontal coordinate of the POI (xp); and
iterating over the object pixels along column (cmax) to find the vertical coordinate of the POI (yp).
5. The method as claimed in claim 1, for locating the POI over the horizontal edges of the object, wherein the object is scanned column-wise, all the horizontal corners of the object are stored into a temporary buffer of size k, and the center location of the buffer is updated into memory.
6. The method as claimed in claim 1, for locating the POI over the vertical edges of the object, wherein the object is scanned row-wise, all the vertical corners of the object are stored into a buffer of size k, and the center location of the temporary buffer is updated into memory.
Abstract
The invention relates to a method for detection of the Point of Importance in a binary object. In one embodiment this is accomplished by converting a colored image into a gray-scale image, converting the gray-scale image into a binary image by optimal thresholding, scanning the image inside the object column-wise and row-wise with all k consecutive object pixels along with their positions (x, y) stored into a temporary buffer of size k, and iterating over the memory once the image is scanned and marking the center location of the memory as the POI (Xp, Yp).
Figure 1 (for publication)