
System And Method To Determine True Facial Feature Measurements Of A Face In A 2 D Image

Abstract: A face detection system comprises a communication device, a main server, and a third-party server. The communication device is controlled by a first processor and comprises a camera, which is focused on the face of a user to click an image of the face of the user. The main server is controlled by a second processor that is in communication with the communication device via a communication network, which receives the captured image via the communication device. The third-party server is controlled by a third processor and is in communication with the main server, which receives the captured image and identifies the landmarks on the face of the user as a measurable value. The identified measurable value is transferred back to the main server to determine the diameter of both irises of the user’s eyes and the distance between center of left iris and center of right iris.


Patent Information

Application #
202011036580
Filing Date
25 August 2020
Publication Number
09/2022
Publication Type
INA
Invention Field
ELECTRICAL
Status
Email
archana@anandandanand.com
Parent Application

Applicants

LENSKART SOLUTIONS PVT. LTD.
W-123, Greater Kailash, Part-2, New Delhi - 110048, India

Inventors

1. PEYUSH BANSAL
904, RR1, Eros Royale Retreat, Plot 2, Charmwood Village, Surajkund Road, Lakarpur, Sector 39, Faridabad, Haryana 121009, India

Specification

FIELD OF THE INVENTION
The present invention relates to detection of true facial feature measurements. More specifically, the present invention relates to a face detection system, and a method associated with the system, that determines true facial measurements for using optical products that require the appropriate size. For example, for eyeglasses or sunglasses, the facial measurement helps in determining the size of the product of interest.
BACKGROUND OF THE INVENTION
In the current scenario, online purchase of accessories and clothing has become a matter of great interest, especially considering that most web portals offer technologies that help the user to, for example, click/scan/upload a picture of himself/herself, through which the website can identify the suitable measurements for the accessory of his/her choice. However, when it comes to choosing a pair of spectacles, measurements of the face have to be carefully taken to determine the right pair for the user. In an example, Nike Fit uses augmented reality that is available in iPhone X (iPhone X™ by Apple Inc.™) grade phones to determine the size. This is possible because the iPhone X has an infrared camera that helps determine the depth of an object in an image. Unfortunately, this technology is limited to phones of only those grades.
Another method is the use of reference objects in the picture. In order to find the measurement of the eyeglass or sunglass frame that would fit correctly on a face, a few companies use a magnetic strip card (ATM card or credit card) as a reference object. These cards have the same length and width even across international standards. If a user places the card flat over his/her forehead and takes a picture, then using image processing techniques, the card is detected and hence the mm/pixel scale is derived (since the length and width of the card are fixed and hence known).
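The reference-card approach described above can be sketched as follows. This is an illustrative sketch, not the claimed method; the only external fact assumed is that ISO/IEC 7810 ID-1 cards (ATM/credit cards) are 85.60 mm wide, and the detected card width in pixels is taken as a given input.

```python
# Sketch of the reference-card scaling technique (illustration only).
# ISO/IEC 7810 ID-1 cards (ATM/credit cards) are 85.60 mm wide, so a
# detected card width in pixels yields a millimetre-per-pixel scale.
CARD_WIDTH_MM = 85.60

def mm_per_pixel(card_width_px: float) -> float:
    """Scale factor: real-world millimetres represented by one image pixel."""
    return CARD_WIDTH_MM / card_width_px

def measure_mm(length_px: float, card_width_px: float) -> float:
    """Convert any pixel length in the same image to millimetres."""
    return length_px * mm_per_pixel(card_width_px)

# e.g. a card spanning 428 px gives ≈ 0.2 mm/pixel, so a 310 px span ≈ 62 mm
```

Once the scale is known, any other pixel distance in the same image (face width, pupillary distance) converts to millimetres by a single multiplication.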
As mentioned above, most of these techniques use width, length, and depth measurements based on 3D imaging systems. Therefore, there is a need for a method or system that detects correct facial measurements for selecting the right product of interest, for example, spectacles, without using augmented reality and 3D depth analysis.
These and other advantages of the present invention will become readily apparent from the following detailed description read in conjunction with the accompanying drawings.
SUMMARY OF THE INVENTION
The following presents a simplified summary of the subject matter in order to provide a basic understanding of some of the aspects of subject matter embodiments. This summary is not an extensive overview of the subject matter. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the subject matter. Its sole purpose is to present some concepts of the subject matter in a simplified form as a prelude to the more detailed description that is presented later.
The face detection system disclosed here addresses the above-mentioned need to detect correct facial measurements for selecting the right product of interest, for example, spectacles, without using augmented reality and 3D depth analysis. The face detection system comprises a communication device, a main server, and a third-party server. The communication device is controlled by a first processor and comprises a camera, where the camera is focused on the face of a user to click an image of the face of the user. The main server is controlled by a second processor that is in communication with the communication device via a communication network, where the main server receives the captured image via the communication device. The third-party server is controlled by a third processor and is in communication with the main server, wherein the third-party server receives the captured image and identifies the landmarks on the face of the user as a measurable value. The identified measurable value is transferred back to the main server to determine the diameter of both irises of the user's eyes and the distance between the center of the left iris and the center of the right iris.
In an embodiment, the diameter of both the irises of the user's eyes and the distance between the center of the left iris and the center of the right iris are measured in pixels. In an embodiment, the main server in communication with the second processor determines a ratio of an average of the diameter of both the irises in pixels by the distance between the center of the left iris and the center of the right iris in pixels. In an embodiment, the communication device receives processed data that comprises the ratio from the main server, where the user is enabled to choose a product of interest by accessing a set of tables that is derived based on the determined ratio.
In an embodiment, the face detection system further comprises an image acquisition module of the communication device that is controlled by the first processor and is in communication with the camera to capture and receive the image using the camera. In an embodiment, the face detection system further comprises a face detection module of the third-party server that detects the landmarks of the image of the face in pixels using image processing techniques. In an embodiment, the face detection system further comprises a facial features quantifying module of the main server that detects the diameter of both the irises in pixels and the distance between the center of the left iris and the center of the right iris in pixels.
In an embodiment, the face detection system further comprises a unit determination module of the main server that receives data that includes the diameter and the distance retrieved from the facial features quantifying module. The unit determination module determines the ratio of the average of the diameter of both the irises in pixels by the distance between the center of the left iris and the center of the right iris in pixels. In an embodiment, the ratio is within a range of 0.11 to 0.31.
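The ratio described above is straightforward to compute once the landmark detector has returned pixel measurements. The sketch below is a minimal illustration; the specific landmark coordinates and diameters are hypothetical inputs, not values from the specification.

```python
import math

def iris_ratio(left_center, right_center, left_diam_px, right_diam_px):
    """Ratio of the average iris diameter to the inter-iris distance.

    All inputs are in pixels, so the result is dimensionless. Per the
    summary above, it is expected to fall roughly between 0.11 and 0.31
    for a frontal face image.
    """
    inter_iris_px = math.dist(left_center, right_center)
    avg_diam_px = (left_diam_px + right_diam_px) / 2.0
    return avg_diam_px / inter_iris_px

# Hypothetical landmark values: iris centers at (220, 310) and (412, 308),
# iris diameters of 34 px and 33 px.
ratio = iris_ratio((220, 310), (412, 308), 34.0, 33.0)
```

Because both numerator and denominator are measured in pixels of the same image, the ratio is independent of how far the user held the camera, which is what makes it usable without depth information.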
Other aspects, advantages, and salient features of the present invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention. Reference will now be made to the accompanying diagrams, which illustrate, by way of example and not by way of limitation, one possible embodiment of the invention.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The following drawings are illustrative of particular examples for enabling systems and methods of the present invention, are descriptive of some of the methods and mechanisms, and are not intended to limit the scope of the invention. The drawings are not to scale (unless so stated) and are intended for use in conjunction with the explanations in the following detailed description.
Figure 1 is a schematic diagram showing the face detection system that determines true facial feature measurements in a 2D image.
Figure 2 is a front view of a reference human face, showing the different dimensions that are retrieved from the face of the user to determine the facial feature measurements.
Figure 3 is a block diagram showing the different modules of the face detection system that determine facial feature measurements from a 2D image of the face of a user.

Figure 4 is a method flow diagram showing the process involved in the face detection system that determines facial feature measurements from a 2D image of the face of a user.
Persons skilled in the art will appreciate that elements in the figures are illustrated for simplicity and clarity and may represent both hardware and software components of the face detection system. Further, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various exemplary embodiments of the present disclosure.
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
DETAILED DESCRIPTION OF THE INVENTION
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, persons skilled in the art will recognize that various changes and modifications to the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
The terms and words used in the following description are not limited to their bibliographical meanings but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to the person skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only.
It is to be understood that the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise.

The present invention relates to a face detection system that determines true facial measurements for using optical products that require the appropriate size. For example, for eyeglasses or sunglasses, the facial measurement helps in determining the size of the product of interest.
Referring to Figures 1 and 3, figure 1 is a schematic diagram showing the face detection system 100 that determines true facial feature measurements of a face in a 2D image. The face detection system 100 includes a communication device 102, a communication network 104, a main server 106, and a third-party server 108. The communication device 102 is, for example, a mobile device, a desktop, a laptop, or any device that includes a software application and an image capturing feature capable of communicating with a server through a network.
Referring to Figures 1-4, the communication device 102 is controlled by a first processor 310 and comprises a camera 110, which is focused 401 on the face of a user to click an image of the face 200 of the user. The main server 106 is controlled by a second processor 312 that is in communication with the communication device 102 via a communication network 104, which receives 402 the captured image via the communication device 102. The third-party server 108 is controlled by a third processor 314 and is in communication with the main server 106, which receives 403 the captured image and identifies the landmarks on the face 200 of the user as a measurable value. The identified measurable value is transferred back to the main server 106 to determine 404 the diameter 202 of both irises of the user's eyes and the distance 204 between the center of the left iris and the center of the right iris, as shown in Figure 2.
In other words, in order to detect the facial measurements, the user has to access the software application installed in the communication device 102 that is controlled by the first processor 310, access the camera 110 included in the communication device 102 through the option available on the software application, position the communication device 102 at a distance from his/her face, and capture (or click) the image of the face 200 using the camera 110. The captured image is transferred to the main server 106 via the communication network 104. The main server 106 transfers the image to the third-party server 108, where the landmarks of the face 200 of the user are identified in measurable values, such as pixels, using image processing techniques.
This data identified using the third-party server 108, which includes the landmarks of the face 200, is transferred back to the main server 106 for further processing. This processing includes, for example, determining the diameter 202 of both the irises in pixels and the distance 204 between the center of the left iris and the center of the right iris in pixels, and determining a ratio of an average of the diameter 202 of both the irises in pixels by the distance 204 between the center of the left iris and the center of the right iris in pixels. The communication device 102 finally receives processed data that comprises the ratio from the main server 106, so that the user is enabled to choose the product of interest, for example, a pair of spectacles, by accessing a set of tables derived from the above determined value, or in other words, the determined ratio.
Figure 2 is a front view of a reference human face 200, showing the different dimensions that are retrieved from the face 200 of the user to determine the facial feature measurements. Here, the reference numeral 202 represents the diameter of both the irises in pixels and the reference numeral 204 represents the distance between the center of the left iris and the center of the right iris in pixels.
Figure 3 is a block diagram showing the different modules of the face detection system 100 that determine true facial feature measurements from a 2D image of the face 200 of a user. An image acquisition module 302 of the communication device 102 that is in communication with the camera 110 captures and receives the image of the face 200 of the user from the camera 110. The data of the captured image of the user's face 200 is transferred to the main server 106 and then to the third-party server 108 via the communication network 104. A face detection module 304 of the third-party server 108 detects the landmarks of the image of the face 200 in pixels using image processing techniques. The detected data that includes the landmarks of the image of the face 200 is fed back from the third-party server 108 to the main server 106 via the communication network 104. A facial features quantifying module 306 of the main server 106 determines the diameter 202 of both the irises in pixels and the distance 204 between the center of the left iris and the center of the right iris in pixels.
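The facial features quantifying step can be sketched as below, assuming (as an illustration only; the specification does not name a particular landmark format) that the third-party detector returns a list of iris boundary points in pixel coordinates for each eye.

```python
import math

# Hedged sketch of the facial-features quantifying module 306: the input
# format (a list of (x, y) iris boundary points per eye) is an assumption,
# not a specific third-party API.
def iris_diameter_px(boundary_points):
    """Diameter as the largest pairwise distance among iris boundary points."""
    return max(math.dist(p, q) for p in boundary_points for q in boundary_points)

def iris_center_px(boundary_points):
    """Centroid of the iris boundary points, used as the iris center."""
    xs = [p[0] for p in boundary_points]
    ys = [p[1] for p in boundary_points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

The two centers feed the inter-iris distance 204, and the two diameters feed the average diameter 202, both in pixels, exactly as the module descriptions above require.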
A unit determination module 308 of the main server 106 receives the data, including the diameter 202 and the distance 204, retrieved from the facial features quantifying module 306. The unit determination module 308 determines a ratio of an average of the diameter 202 of both the irises in pixels by the distance 204 between the center of the left iris and the center of the right iris in pixels. This ratio will range between, for example, 0.11 and 0.31.
The unit determination module 308 also derives, based on the ratio, the real measurement of the distance between the center of the left iris and the center of the right iris in millimeters (mm), which is defined as the pupillary distance. In an embodiment, the unit determination module 308 further determines the measurements of the rest of the facial features, such as the eyes, nose, lips, ears, and forehead, using the pupillary distance as a reference scale. In order to enable a user to choose the product of interest, for example, a pair of spectacles with a custom frame size, a table is provided for the user's reference to derive the final facial measurements. For example:
a. Male, 0.15 corresponds to a pupillary distance of 72
b. Male, 0.16 corresponds to a pupillary distance of 70
c. Male, 0.17 corresponds to a pupillary distance of 68
d. Male, 0.18 corresponds to a pupillary distance of 66
e. Male, 0.19 corresponds to a pupillary distance of 64
f. Male, 0.20 corresponds to a pupillary distance of 62
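The table lookup and the subsequent use of the pupillary distance as a reference scale can be sketched as follows. Only the six (gender, ratio) → pupillary-distance pairs listed above come from the specification; rounding the computed ratio to two decimals before lookup, and the function names, are illustrative assumptions.

```python
# Example lookup table from the specification: (gender, ratio) -> pupillary
# distance in mm. Rounding to two decimals before lookup is an assumption.
PD_TABLE_MM = {
    ("male", 0.15): 72,
    ("male", 0.16): 70,
    ("male", 0.17): 68,
    ("male", 0.18): 66,
    ("male", 0.19): 64,
    ("male", 0.20): 62,
}

def pupillary_distance_mm(gender: str, ratio: float) -> int:
    """Look up the real pupillary distance for a computed iris ratio."""
    return PD_TABLE_MM[(gender, round(ratio, 2))]

def pixel_scale_mm(pd_mm: float, inter_iris_px: float) -> float:
    """Once the pupillary distance in mm is known, derive a mm/pixel scale
    for measuring the remaining facial features in the same image."""
    return pd_mm / inter_iris_px
```

With the mm/pixel scale in hand, any other landmark distance in the image (nose width, temple-to-temple span) converts to millimeters by multiplication, which is how the pupillary distance serves as a reference scale.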

As will be appreciated by one of skill in the art, the present disclosure may be embodied as a method and system. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, a software embodiment or an embodiment combining software and hardware aspects.
It will be understood that the functions of any of the units as described above can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts performed by any of the units as described above.
Instructions may also be stored in a computer- readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act performed by any of the units as described above.
Instructions may also be loaded onto a computer or other programmable data processing apparatus like a scanner/check scanner to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts performed by any of the units as described above.
In this specification, exemplary embodiments of the invention have been disclosed. Although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation of the scope of the invention.


We Claim:
1. A face detection system comprising:
a communication device controlled by a first processor and comprising a camera, wherein the camera is focused on the face of a user to click an image of the face of the user;
a main server controlled by a second processor, in communication with the communication device via a communication network, wherein the main server receives the captured image via the communication device;
a third-party server controlled by a third processor, in communication with the main server, wherein the third-party server receives the captured image and identifies the landmarks on the face of the user as a measurable value, wherein the identified measurable value is transferred back to the main server to determine the diameter of both irises of the user's eyes and the distance between center of left iris and center of right iris.
2. The face detection system as claimed in claim 1, wherein the diameter of both the irises of the user's eyes and the distance between the center of the left iris and the center of the right iris are measured in pixels.
3. The face detection system as claimed in claim 2, wherein the main server in communication with the second processor determines a ratio of an average of the diameter of both the irises in pixels by the distance between the center of the left iris and the center of the right iris in pixels.
4. The face detection system as claimed in claim 3, wherein the communication device receives processed data that comprises the ratio from the main server, wherein the user is enabled to choose a product of interest by accessing a set of tables that is derived based on the determined ratio.
5. The face detection system as claimed in claim 1, further comprises an image acquisition module of the communication device that is controlled by the first processor and is in communication with the camera to capture and receive the image using the camera.
6. The face detection system as claimed in claim 1, further comprises a face detection module of the third-party server that detects the landmarks of the image of the face in pixels using image processing techniques.
7. The face detection system as claimed in claim 1, further comprises a facial features quantifying module of the main server that detects the diameter of both the irises in pixels and the distance between the center of the left iris and the center of the right iris in pixels.
8. The face detection system as claimed in claim 1, further comprises a unit determination module of the main server that receives data that includes the diameter and the distance that is retrieved from the facial features quantifying module, wherein the unit determination module determines the ratio of the average of the diameter of both the irises in pixels by the distance between the center of the left iris and the center of the right iris in pixels.
9. The face detection system as claimed in claim 8, wherein the ratio is within a range of 0.11 to 0.31.
10. A method of face detection comprising:
focusing a camera of a communication device on the face of a user to click an image of the face of the user, wherein the communication device is controlled by a first processor;
receiving the captured image in a main server that is controlled by a second processor, wherein the main server is in communication with the communication device via a communication network;
receiving the captured image at a third-party server that is controlled by a third processor and identifying the landmarks on the face of the user as a measurable value, wherein the third-party server is in communication with the main server; and

determining the diameter of both irises of the user's eyes and the distance between center of left iris and center of right iris after transferring the identified measurable value to the main server.

Documents

Application Documents

# Name Date
1 202011036580-STATEMENT OF UNDERTAKING (FORM 3) [25-08-2020(online)].pdf 2020-08-25
2 202011036580-PROVISIONAL SPECIFICATION [25-08-2020(online)].pdf 2020-08-25
3 202011036580-POWER OF AUTHORITY [25-08-2020(online)].pdf 2020-08-25
4 202011036580-FORM 1 [25-08-2020(online)].pdf 2020-08-25
5 202011036580-DRAWINGS [25-08-2020(online)].pdf 2020-08-25
6 202011036580-DRAWING [02-11-2020(online)].pdf 2020-11-02
7 202011036580-CORRESPONDENCE-OTHERS [02-11-2020(online)].pdf 2020-11-02
8 202011036580-COMPLETE SPECIFICATION [02-11-2020(online)].pdf 2020-11-02
9 202011036580-Covering Letter [27-08-2021(online)].pdf 2021-08-27
10 202011036580-Form 1 (Submitted on date of filing) [27-08-2021(online)].pdf 2021-08-27
11 202011036580-Request Letter-Correspondence [27-08-2021(online)].pdf 2021-08-27
12 202011036580-AMMENDED DOCUMENTS [08-10-2021(online)].pdf 2021-10-08
13 202011036580-FORM 13 [08-10-2021(online)].pdf 2021-10-08
14 202011036580-Proof of Right [08-10-2021(online)].pdf 2021-10-08
15 202011036580-RELEVANT DOCUMENTS [08-10-2021(online)].pdf 2021-10-08