
System And Method For Virtual Try On Of Eye Glasses

Abstract: A technique for virtual try-on of eye glasses is disclosed. In an embodiment, a frame including a user’s image and a three-dimensional (3D) mesh of the eye glasses are received. Further, a face and eyes of the user are identified in the image. Furthermore, image control points (IMPs) are defined around the eyes of the user. In addition, mesh control points are selected in the 3D mesh corresponding to the IMPs. Moreover, an aspect ratio of the eye glasses is determined based on the length and breadth of the eye glasses in the 3D mesh. Also, the IMPs are updated based on the aspect ratio. Further, rotation and translation angles are estimated to transform the 3D mesh from a world camera system to a camera co-ordinate system based on the updated IMPs and mesh control points. The 3D mesh is then projected on the frame based on the estimated rotation and translation angles.


Patent Information

Application #: 201621018553
Filing Date: 30 May 2016
Publication Number: 48/2017
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status: Granted
Email: ip@legasis.in
Grant Date: 2023-07-28

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai - 400021, Maharashtra, India

Inventors

1. MALLIK, Apurbaa
Tata Consultancy Services Limited, Globalaxis H Block, 7th Floor, ODC No. 4, Gopalan Enterprises Pvt Ltd (Global Axis) SEZ "H" Block, No. 152 (Sy No. 147, 157 & 158), Hoody Village, EPIP Zone (II Stage), Whitefield, K.R. Puram Hobli, Bangalore - 560066, Karnataka
2. BHOWMICK, Brojeshwar
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP, Kolkata, West Bengal - 700156, India
3. SINHA, Aniruddha
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP, Kolkata, West Bengal - 700156, India
4. DASGUPTA, Ranjan
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP, Kolkata, West Bengal - 700156, India

Specification

Claims:

1. A method comprising:
receiving a frame comprising a user’s image and a three-dimensional (3D) mesh of eye-glasses;
identifying a face and eyes of the user in the image;
defining image control points around the eyes of the user;
selecting mesh control points in the 3D mesh of the eye-glasses corresponding to the image control points;
determining an aspect ratio of the eye-glasses based on length and breadth of the eye-glasses in the 3D mesh;
updating the image control points based on the aspect ratio of the eye-glasses;
estimating a rotation angle and a translation angle to transform the 3D mesh from a world camera system to a camera co-ordinate system based on the updated image control points and the mesh control points; and
projecting the transformed 3D mesh of the eye-glasses on the frame based on the estimated rotation angle and translation angle.

2. The method as claimed in claim 1, wherein defining the image control points around the eyes of the user comprises:
estimating a set of points defining a region of interest around the eyes of the user; and
defining the image control points based on the estimated set of points.

3. The method as claimed in claim 1, further comprising:
identifying two pupils of the user in the frame;
marking mesh vertices projecting on the pupils of the user;
transferring the mesh vertices corresponding to the pupils to the camera co-ordinate system;
calculating a Euclidean distance between an x-axis and y-axis component of the mesh vertices in the camera co-ordinate system;
calculating a Euclidean distance between two pupils of the user; and
determining an inter-pupillary distance based on the Euclidean distance between the x-axis and y-axis component of the mesh vertices and the Euclidean distance between the two pupils of the user.

4. The method as claimed in claim 1, further comprising:
changing an orientation of temples in the 3D mesh of the eye-glasses to fit the face of the user.

5. The method as claimed in claim 4, wherein the orientation of temples in the 3D mesh of the eye-glasses is changed based on an intrinsic matrix and a rotation of the temples.

6. The method as claimed in claim 1, further comprising:
extracting color information of the eye-glasses using a material property defined in the 3D mesh; and
modifying the color information according to a direction of the user.

7. A system comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor, wherein the memory comprises a virtual try-on module to:
receive a frame comprising a user’s image and a three-dimensional (3D) mesh of eye-glasses;
identify a face and eyes of the user in the image;
define image control points around the eyes of the user;
select mesh control points in the 3D mesh of the eye-glasses corresponding to the image control points;
determine an aspect ratio of the eye-glasses based on length and breadth of the eye-glasses in the 3D mesh;
update the image control points based on the aspect ratio of the eye-glasses;
estimate a rotation angle and a translation angle to transform the 3D mesh from a world camera system to a camera co-ordinate system based on the updated image control points and the mesh control points; and
project the transformed 3D mesh of the eye-glasses on the frame based on the estimated rotation angle and translation angle.

8. The system as claimed in claim 7, wherein the virtual try-on module is configured to:
estimate a set of points defining a region of interest around the eyes of the user; and
define the image control points based on the estimated set of points.

9. The system as claimed in claim 7, wherein the virtual try-on module is further configured to:
identify two pupils of the user in the frame;
mark mesh vertices projecting on the pupils of the user;
transfer the mesh vertices corresponding to the pupils to the camera co-ordinate system;
calculate a Euclidean distance between an x-axis and y-axis component of the mesh vertices in the camera co-ordinate system;
calculate a Euclidean distance between two pupils of the user; and
determine an inter-pupillary distance based on the Euclidean distance between the x-axis and y-axis component of the mesh vertices and the Euclidean distance between the two pupils of the user.

10. The system as claimed in claim 7, wherein the virtual try-on module is further configured to:
change an orientation of temples in the 3D mesh of the eye-glasses to fit the face of the user.

11. The system as claimed in claim 10, wherein the orientation of temples in the 3D mesh of the eye-glasses is changed based on an intrinsic matrix and a rotation of the temples.

12. The system as claimed in claim 7, wherein the virtual try-on module is further configured to:
extract color information of the eye-glasses using a material property defined in the 3D mesh; and
modify the color information according to a direction of the user.
Description:

FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION
(See Section 10 and Rule 13)

Title of invention:
SYSTEM AND METHOD FOR VIRTUAL TRY-ON OF EYE GLASSES

Applicant:
Tata Consultancy Services Limited
A company incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India

The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
The embodiments herein generally relate to eye glasses and, more particularly, to virtual try-on of the eye glasses.
BACKGROUND
The Internet is a worldwide data communication network of interconnected computers and computer networks, and it is rapidly evolving to the point where it combines elements of telecommunications, computing, broadcasting, publishing, commerce, and information services into a revolutionary business infrastructure. Commerce on the Internet is growing into every aspect of life, and a wide range of businesses, including stock trading and the ordering of commodities, products, retail goods, and services, are now conducted via the Internet.
The growth of Internet-based electronic commerce, however, is experiencing some obstacles when it comes to certain types of services and goods. For example, it may be very difficult for a business to promote wearable goods, such as eye glasses, online. Unless it is a pair of generic sunglasses, very few users or consumers order personalized glasses, such as near-sighted glasses, over the Internet, because a consumer likes to try on a chosen pair of glasses and see in a mirror how he or she looks with it. Hence, the market for eyeglasses is primarily limited to local retailing, because the current Internet-based commerce platform lacks a “try-on” experience.

SUMMARY
The following presents a simplified summary of some embodiments of the disclosure in order to provide a basic understanding of the embodiments. This summary is not an extensive overview of the embodiments. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the embodiments. Its sole purpose is to present some embodiments in a simplified form as a prelude to the more detailed description that is presented below. In view of the foregoing, an embodiment herein provides a system and method for virtual try-on of eye glasses.
In one aspect, a method for virtual try-on of eye glasses is disclosed. In this aspect, a frame including a user’s image and a three-dimensional (3D) mesh of eye glasses are received. Further, a face and eyes of the user are identified in the image. Furthermore, image control points are defined around the eyes of the user. In addition, mesh control points are selected in the 3D mesh of the eye glasses corresponding to the image control points. Moreover, an aspect ratio of the eye glasses is determined based on the length and breadth of the eye glasses in the 3D mesh. Also, the image control points are updated based on the aspect ratio of the eye glasses. Further, a rotation angle and a translation angle are estimated to transform the 3D mesh from a world camera system to a camera co-ordinate system based on the updated image control points and the mesh control points. The transformed 3D mesh of the eye glasses is then projected on the frame based on the estimated rotation angle and translation angle.
In another aspect, a system for virtual try-on of eye glasses is disclosed. In an example, the system includes one or more processors and a memory communicatively coupled to the processors. Further, the memory includes a virtual try-on module. In an example implementation, the virtual try-on module receives a frame including a user’s image and a three-dimensional (3D) mesh of eye glasses. Further, the virtual try-on module identifies a face and eyes of the user in the image. Furthermore, the virtual try-on module defines image control points around the eyes of the user. In addition, the virtual try-on module selects mesh control points in the 3D mesh of the eye glasses corresponding to the image control points. Moreover, the virtual try-on module determines an aspect ratio of the eye glasses based on the length and breadth of the eye glasses in the 3D mesh. Also, the virtual try-on module updates the image control points based on the aspect ratio of the eye glasses. Further, the virtual try-on module estimates a rotation angle and a translation angle to transform the 3D mesh from a world camera system to a camera co-ordinate system based on the updated image control points and the mesh control points. The virtual try-on module then projects the transformed 3D mesh of the eye glasses on the frame based on the estimated rotation angle and translation angle.
It should be appreciated by those skilled in the art that any block diagram herein represents a conceptual view of an illustrative system embodying the principles of the present subject matter. Similarly, it is appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in a computer readable medium and so executed by a computing device or processor, whether or not such computing device or processor is explicitly shown.

BRIEF DESCRIPTION OF THE DRAWINGS
The embodiments herein are better understood from the following detailed description with reference to the drawings, in which:
FIG. 1 illustrates a block diagram of a system for virtual try-on of eye glasses, according to an embodiment of the present disclosure;
FIG. 2 illustrates various parts of eye glasses, according to an embodiment of the present disclosure; and
FIG. 3 is a flow chart illustrating a method for virtual try-on of eye glasses, according to an embodiment of the present disclosure.

DETAILED DESCRIPTION
The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
The terms “eye glasses” and “eye-glasses” are used interchangeably throughout the document.
In an example embodiment, the present technique provides marker-less augmentation of eye-glasses with proper fitment on the video feed of a webcam. The present technique enables the user to virtually try on the eye-glasses using the webcam, so that the user can visualize and assess how he or she would look in real life with those eye-glasses. Such a system can be useful in online eye-glass retail stores. The proposed technique also uses the unit of the mesh to calculate the inter-pupillary distance, an important metric in the eye-glass industry, with no additional set-up, special hardware, or markers.
By nature, the face structure of humans varies from person to person. Therefore, the proposed technique also modifies the eye-glass temples to fit the input user's face and to present a realistic rendering to the user. At the same time, the lens shape of the eye-glasses can vary according to the shape of the frame. Therefore, for proper fitment, the aspect ratio (that is, the ratio of the length to the width of the lens) is estimated and processed according to the face size, so that the eye-glass shape remains intact. Thus, the present technique is independent of the structure of the eye-glasses. Furthermore, to enhance the user experience, color is added to the eye-glass projection on the image according to the illumination or material model, taking into account the light and the user's viewing direction.
FIG. 1 illustrates a block diagram of a system 100, according to an embodiment of the present disclosure. As shown in FIG. 1, the system 100 includes one or more processor(s) 102 and a memory 104 communicatively coupled to each other. The system 100 also includes interface(s) 106. Further, the memory 104 includes a virtual try-on module 108 and other modules. Although FIG. 1 shows example components of the system 100, in other implementations, the system 100 may contain fewer components, additional components, different components, or differently arranged components than depicted in FIG. 1.
The processor(s) 102 and the memory 104 may be communicatively coupled by a system bus. The processor(s) 102 may include circuitry implementing, among others, audio and logic functions associated with the communication. The processor(s) 102 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor(s) 102. The processor(s) 102 can be a single processing unit or a number of units, all of which may include multiple computing units. The processor(s) 102 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 102 is configured to fetch and execute computer-readable instructions and data stored in the memory 104.
The functions of the various elements shown in the figure, including any functional blocks labeled as “processor(s)”, may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage. Other hardware, conventional, and/or custom, may also be included.
The interface(s) 106 may include a variety of software and hardware interfaces, for example, interfaces for peripheral device(s), such as a keyboard, a mouse, an external memory, and a printer. The interface(s) 106 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, local area network (LAN), cable, etc., and wireless networks, such as Wireless LAN (WLAN), cellular, or satellite. For this purpose, the interface(s) 106 may include one or more ports for connecting the system 100 to other devices.
The memory 104 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 104 may store any number of pieces of information and data used by the system 100 to implement the functions of the system 100. The memory 104 may be configured to store information, data, applications, instructions or the like for enabling the system 100 to carry out various functions in accordance with various example embodiments. Additionally or alternatively, the memory 104 may be configured to store instructions which, when executed by the processor(s) 102, cause the system 100 to behave in a manner as described in various embodiments. The memory 104 includes the virtual try-on module 108 and other modules. The module 108 and other modules include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. The other modules may include programs or coded instructions that supplement applications and functions of the system 100.
In operation, the virtual try-on module 108 receives frames containing the user's face from a webcam (i.e., communicatively coupled to the system 100 and not shown in FIG. 1) for processing. For each frame, the virtual try-on module 108 identifies the face and the eyes of the user. In an example implementation, the virtual try-on module 108 uses a Haar feature-based cascade classifier for face and eye detection. For example, the Haar feature-based cascade classifier is a machine learning based approach in which a cascade function is trained from a plurality of positive and negative images. The function is then used to detect objects in images. In this example, the function needs a plurality of positive images (images of faces) and negative images (images without faces) to train the classifier. Haar features of the faces are then extracted using the classifier. Each feature is a single value obtained by subtracting the sum of pixels under a white rectangle from the sum of pixels under a black rectangle.
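A minimal sketch of this detection step, assuming OpenCV's stock Haar cascade files (the specification does not name a particular implementation; the file names and parameter values here are illustrative):

```python
# Hedged sketch: Haar feature-based cascade detection of the face and eyes.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_face_and_eyes(frame):
    """Return the face box and two eye boxes, or None when detection fails."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None          # the module reports an error and skips the frame
    x, y, w, h = faces[0]
    # Eye boxes are returned relative to the face region of interest.
    eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
    if len(eyes) < 2:
        return None
    return (x, y, w, h), eyes[:2]
```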
In a scenario where the virtual try-on module 108 fails to detect the face or the eyes of the user in the frame, the virtual try-on module 108 prompts an error message and moves to the next frame. For each frame, the virtual try-on module 108 estimates a set of points defining the region of interest around the two eyes of the user. Using the estimated points, the virtual try-on module 108 defines a set of two-dimensional image control points (IMPs) for each frame. In an example implementation, each detected eye region is represented by a minimum rectangular bounding box consisting of three parameters: a) the top leftmost point p = (px, py), b) the length (L), and c) the width (W) of the bounding box. Using the estimated bounding box parameters, a set of 2D image control points Ic = (ix, iy) is estimated.
From the input 3D mesh of the eye-glasses, the virtual try-on module 108 selects three-dimensional mesh control points corresponding to the image control points. The virtual try-on module 108 may mark the mesh control points from the eye-glass mesh (i.e., 3D mesh) once for each input mesh. Thereafter, the mesh control points can be used for any set of estimated image control points.
Further, the virtual try-on module 108 determines an aspect ratio of the eye-glasses based on the length and breadth of the eye-glasses in the 3D mesh. For example, each pair of eye-glasses includes two demo lenses (as shown in the eye-glasses 200 of FIG. 2), and the ratio between the length and breadth of each demo lens is referred to as the aspect ratio. The aspect ratio can vary for different eye-glasses. Also, the ratio of the length to the breadth of the estimated eye region may not match the eye-glass aspect ratio. Therefore, the length and breadth of the detected eye region need to be modified to match the aspect ratio of the eye-glasses. This ensures that the structure of the eye-glasses projected on the frame does not fluctuate and looks realistic.
In an example implementation, from the input eye-glasses mesh, the virtual try-on module 108 marks the length and breadth of the eye-glasses and calculates the aspect ratio (am). Further, the length, breadth and top left point of each detected eye region (i.e., the bounding box parameters) are represented as W, L and p = (px, py), respectively, and the aspect ratio of the bounding box is ai = W/L. The bounding box parameters are then updated as follows:
$$a_i = \begin{cases} \dfrac{a_m W}{L} & \text{if } a_i < a_m \\[6pt] \dfrac{W}{a_m L} & \text{if } a_i > a_m \end{cases} \qquad d_W = \begin{cases} 1 & \text{if } a_i < a_m \\ 0 & \text{if } a_i > a_m \end{cases} \qquad d_L = \begin{cases} 1 & \text{if } a_i < a_m \\ 0 & \text{if } a_i > a_m \end{cases}$$

$$L \leftarrow a_i^{d_L} L \qquad W \leftarrow a_i^{d_W} W \qquad p_x \leftarrow p_x + \frac{(1 - a_i)\, L\, d_L}{2} \qquad p_y \leftarrow p_y + \frac{(1 - a_i)\, W\, d_W}{2}$$
Therefore, a new set of L, W and p = (px, py) using the aspect ratio of the eye-glasses is determined, and the image control points (Ic) are updated accordingly.
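A minimal sketch of the adjustment above, under the assumed reading that dL and dW act as 0/1 exponents selecting which dimensions are rescaled; variable names mirror the text:

```python
# Hedged sketch: match the detected eye region's aspect ratio (a_i = W/L)
# to the eye-glass aspect ratio (a_m) and recentre the bounding box.
def match_aspect_ratio(px, py, L, W, a_m):
    a_i = W / L
    if a_i < a_m:
        a_i = (a_m * W) / L
        d_L, d_W = 1, 1
    else:
        a_i = W / (a_m * L)
        d_L, d_W = 0, 0
    L_new = (a_i ** d_L) * L
    W_new = (a_i ** d_W) * W
    # Shift the top-left point so that the resized box stays centred.
    px_new = px + (1 - a_i) * L * d_L / 2.0
    py_new = py + (1 - a_i) * W * d_W / 2.0
    return px_new, py_new, L_new, W_new
```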
Now, to project the eye-glasses correctly on the image, the virtual try-on module 108 transfers the 3D mesh from a world co-ordinate system (i.e., a world camera system) to a camera co-ordinate system by estimating a rotation angle and a translation angle. In an example, the intrinsic camera parameter (K) consists of a focal length (f) and a principal point. In this example, the principal point is taken as (0, 0), as it lies at the center of the camera co-ordinate system. The extrinsic camera parameters, consisting of a rotation angle (R) and a translation angle (t), are required to transform the eye-glass mesh from the world coordinate system to the camera coordinate system. Using the estimated rotation angle (R) and translation angle (t), all the mesh vertices of the eye-glasses can be projected on the frame. For example, the rotation angle (R) and the translation angle (t) to transform the 3D mesh from the world camera system to the camera coordinate system can be estimated using equation (2) below.
$$I_c = K[R \mid t]\, M \qquad (2)$$

where

$$K = \begin{bmatrix} f & 0 & 0 \\ 0 & f & 0 \\ 0 & 0 & 1 \end{bmatrix} \text{ is the camera internal matrix,} \qquad R = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix}, \qquad t = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix},$$

Ic are the image control points, and M are the mesh control points.
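A minimal sketch of this pose-estimation step, with OpenCV's standard PnP solver standing in for the solution of equation (2) (the specification does not prescribe a particular solver); following the text, the principal point is taken as (0, 0), so image points are assumed to be expressed relative to it:

```python
# Hedged sketch: estimate R and t from 2D image control points and 3D mesh
# control points, then project mesh vertices using x = K [R | t] M.
import cv2
import numpy as np

def estimate_pose(mesh_control_points, image_control_points, f):
    # Intrinsic matrix K with focal length f and principal point (0, 0).
    K = np.array([[f, 0, 0],
                  [0, f, 0],
                  [0, 0, 1]], dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(mesh_control_points, dtype=np.float64),   # Nx3, world
        np.asarray(image_control_points, dtype=np.float64),  # Nx2, image
        K, None)
    R, _ = cv2.Rodrigues(rvec)     # rotation matrix from the rotation vector
    return K, R, tvec

def project_mesh(vertices, K, R, t):
    # Apply [R | t], then K, then divide by depth to get image coordinates.
    cam = R @ vertices.T + t.reshape(3, 1)
    img = K @ cam
    return (img[:2] / img[2]).T
```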
In some embodiments, the virtual try-on module 108 determines an inter-pupillary distance from a unit of the mesh. The inter-pupillary distance is defined as the distance between the centers of the two pupils of the eyes. The inter-pupillary distance ensures proper eye-glass alignment with the user's face. In these embodiments, the virtual try-on module 108 identifies the two pupils of the user in the frame and marks the mesh vertices which project on the pupils of the user. In an example implementation, the virtual try-on module 108 uses the rectangular bounding box of each eye region to calculate its corresponding centroid. The value of the centroid serves as an initial position for determining the actual pupil center. The virtual try-on module 108 then determines the corresponding 3D position of the centroid by applying the camera parameters K, R and t from equation (2). This estimated 3D position may not be equivalent to any 3D point present in the eye-glass mesh list. Therefore, taking the Euclidean distance as the criterion, the virtual try-on module 108 determines the nearest 3D point of the eye-glass mesh (vertex vi) corresponding to the estimated 3D position of the bounding box centroid. In the mesh, vi is connected via edges to other 3D points vj. This set of neighboring vertices forms the one-ring neighborhood Ni of the vertex vi. Therefore, the virtual try-on module 108 projects the set of mesh 3D points Sm = {Ni ∪ vi} on the image using equation (2).
After projection, the virtual try-on module 108 generates the rectangular bounding box Bp enclosing the projected area for each eye in the image. Generally, Bp comprises the sclera, which is the white region of the eye, and the cornea along with the iris, which is mostly dark in color. The dark region of Bp is segmented by applying the Otsu thresholding scheme on it. Then, the virtual try-on module 108 estimates the pupil center of the left eye il = (xl, yl) and the right eye ir = (xr, yr) by taking the 2D median position of the segmented black pixels in the left and right Bp, respectively. The median operator is used because it is resistant to outliers and removes any noise introduced while thresholding.
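A minimal sketch of this segmentation step, assuming a grayscale crop of Bp as input (OpenCV's Otsu mode is used here; the specification does not name a library):

```python
# Hedged sketch: segment the dark pupil region of Bp with Otsu thresholding
# and take the 2D median of the dark pixels as the pupil centre.
import cv2
import numpy as np

def pupil_center(gray_bp):
    """gray_bp: 8-bit grayscale crop of one eye's bounding box Bp."""
    _, dark = cv2.threshold(gray_bp, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    ys, xs = np.nonzero(dark)                  # pixels classified as dark
    if xs.size == 0:
        return None
    # The median resists outliers and noise introduced by thresholding.
    return (int(np.median(xs)), int(np.median(ys)))
```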
The virtual try-on module 108 then projects the 2D pupil center positions il and ir into the 3D coordinate system by applying the camera parameters of equation (2). For example, let the two mesh vertices corresponding to the pupils be A1 = [x1, y1, z1] and A2 = [x2, y2, z2]. Then, the virtual try-on module 108 calculates the Euclidean distance between the x and y components of A1 and A2 using the equation below.
$$d_m = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$$
Similarly, the virtual try-on module 108 calculates the Euclidean distance (di) between the two pupils of the user (e.g., the output is in pixels). Further, the virtual try-on module 108 calculates a scaling factor as the ratio of dm and di, i.e., s = dm/di. The value of s remains constant, since a single mesh structure is used and the depth is not considered while determining the value of dm. Therefore, when the virtual try-on module 108 calculates the Euclidean distance between the two pupils of another user, it uses the value of s to determine the value of dm. Using the mapping between physical dimensions and each unit of the mesh, the physical dimension of dm can be calculated. Thus, the inter-pupillary distance is determined without using any markers.
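A minimal sketch of this measurement; the mm_per_mesh_unit mapping is an assumed input (the specification says only that a mapping between physical dimensions and mesh units exists):

```python
# Hedged sketch: inter-pupillary distance from two mesh vertices projected on
# the pupils (camera coordinates) and the two 2D pupil centres (pixels).
import numpy as np

def inter_pupillary_distance(A1, A2, il, ir, mm_per_mesh_unit):
    d_m = np.hypot(A2[0] - A1[0], A2[1] - A1[1])   # x/y components only
    d_i = np.hypot(ir[0] - il[0], ir[1] - il[1])   # pixel distance
    s = d_m / d_i          # scaling factor; constant for a given mesh
    return d_m * mm_per_mesh_unit, s               # physical IPD, plus s
```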
As there is a huge diversity of human face shapes in the world, a rigid eye-glasses mesh is insufficient to yield a naturalistic view. In real life, the temples of the eye-glasses stretch according to the user's face structure for proper fitment. Therefore, a vital requirement in the virtual eye-glasses try-on is changing the orientation of the temples (as shown in the eye-glasses 200 of FIG. 2) in the eye-glasses mesh to fit the user's face. For example, let T be a temple end point in the mesh, and let F(x, y) be the pixel point on the frame where T should be projected. The point T is manually selected once for every pair of eye-glasses, and the point F is automatically estimated from each frame by the virtual try-on module 108. Further, the virtual try-on module 108 bounds the detected facial region by a minimum bounding box B and finds its centroid C = (xc, yc). For example, the centroid lies on the nose region of the user's face. Then, the virtual try-on module 108 can define an ellipse taking C as the center, the major axis a as half of the length of the bounding box B, and the minor axis b as half of the breadth of the bounding box B. Therefore, the virtual try-on module 108 defines the ellipse as follows:
$$\frac{(x - x_c)^2}{a^2} + \frac{(y - y_c)^2}{b^2} = 1 \qquad (3)$$
Also, the virtual try-on module 108 can find minimum bounding boxes around the two eyes of the user. For the left eye, for example, let the line between the leftmost top point and the rightmost top point be y − y_c = m(x − x_c) + c. Similarly, the virtual try-on module 108 can define such a line for the right eye. Further, the virtual try-on module 108 substitutes y − y_c = m(x − x_c) + c in equation (3) to obtain equation (4) below.
$$(a^2 m^2 + b^2)(x - x_c)^2 + 2 a^2 m c\,(x - x_c) + a^2 (c^2 - b^2) = 0 \qquad (4)$$
From the above, it is understood that equation (4) is a quadratic equation, so the virtual try-on module 108 obtains two values of (x − x_c). For the left eye, the virtual try-on module 108 takes the negative root, and for the right eye, the positive root. The value of x_c is known a priori, so the estimated point F can be calculated as F = (x_e + x_c, y), where (x − x_c) = x_e.
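A minimal sketch of this estimation of F by intersecting the eye line with the ellipse of equation (3) through the quadratic (4); the sign convention for the left and right roots follows the text:

```python
# Hedged sketch: solve (a^2 m^2 + b^2) x_e^2 + 2 a^2 m c x_e + a^2 (c^2 - b^2) = 0
# for x_e = (x - x_c) and recover F = (x_e + x_c, y).
import numpy as np

def temple_target_point(xc, yc, a, b, m, c, left=True):
    A = a * a * m * m + b * b
    B = 2 * a * a * m * c
    C = a * a * (c * c - b * b)
    disc = B * B - 4 * A * C
    if disc < 0:
        return None                      # line misses the ellipse
    root = np.sqrt(disc)
    # Negative root for the left eye, positive root for the right eye.
    x_e = (-B - root) / (2 * A) if left else (-B + root) / (2 * A)
    y = yc + m * x_e + c                 # from y - yc = m (x - xc) + c
    return (x_e + xc, y)
```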
So, the virtual try-on module 108 can define the minimization function to project the mesh point T on the point F on the frame as

$$\min_{\theta}\; \left( F - \pi\!\left( K \left( R_y R\, T + t \right) \right) \right)^{.2} \qquad (5)$$

where $(\cdot)^{.2}$ denotes the point-wise square of each element of a matrix, $\pi(\cdot)$ represents the projection function which projects 3D to 2D,

$$R_y = \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix}$$

is the rotation of the eye-glass temple to fit the face,

$$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}, \qquad K = \begin{bmatrix} f_1 & 0 & 0 \\ 0 & f_2 & 0 \\ 0 & 0 & -1 \end{bmatrix} \text{ is the intrinsic matrix,}$$

$$T = \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}, \qquad t = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}, \qquad F = \begin{bmatrix} X_I \\ Y_I \end{bmatrix}.$$

Let

$$\begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \begin{bmatrix} X_{MR} \\ Y_{MR} \\ Z_{MR} \end{bmatrix}.$$

The projection of $K(R_y R\, T + t)$ on the image, i.e., $\pi(K(R_y R\, T + t))$, is then

$$\pi\!\left( K(R_y R\, T + t) \right) = \begin{bmatrix} \dfrac{f_1 \left( X_{MR} \cos\theta + Z_{MR} \sin\theta + t_x \right)}{-1\left( -X_{MR} \sin\theta + Z_{MR} \cos\theta + t_z \right)} \\[10pt] \dfrac{f_2 \left( Y_{MR} + t_y \right)}{-1\left( -X_{MR} \sin\theta + Z_{MR} \cos\theta + t_z \right)} \end{bmatrix}.$$

Thus, equation (5) can be expanded and expressed as

$$\left( X_I - \frac{f_1 \left( X_{MR} \cos\theta + Z_{MR} \sin\theta + t_x \right)}{-1\left( -X_{MR} \sin\theta + Z_{MR} \cos\theta + t_z \right)} \right)^2 = 0 \quad \text{and} \quad \left( Y_I - \frac{f_2 \left( Y_{MR} + t_y \right)}{-1\left( -X_{MR} \sin\theta + Z_{MR} \cos\theta + t_z \right)} \right)^2 = 0,$$

which hold when

$$X_I - \frac{f_1 \left( X_{MR} \cos\theta + Z_{MR} \sin\theta + t_x \right)}{-1\left( -X_{MR} \sin\theta + Z_{MR} \cos\theta + t_z \right)} = 0 \quad \text{and} \quad Y_I - \frac{f_2 \left( Y_{MR} + t_y \right)}{-1\left( -X_{MR} \sin\theta + Z_{MR} \cos\theta + t_z \right)} = 0.$$

Rearranging gives

$$\begin{bmatrix} -X_I Z_{MR} - f_1 X_{MR} & X_I X_{MR} - f_1 Z_{MR} \\ -Y_I Z_{MR} & Y_I X_{MR} \end{bmatrix} \begin{bmatrix} \cos\theta \\ \sin\theta \end{bmatrix} = \begin{bmatrix} X_I t_z + f_1 t_x \\ Y_I t_z + f_2 Y_{MR} + f_2 t_y \end{bmatrix} \qquad (6)$$

Equation (6) can be solved to find the value of θ. Then, the virtual try-on module 108 projects the mesh vertices of the temple region of the eye-glasses on the frame using $K[R_y R \mid t]\, T_m$, where $T_m$ are the temple mesh vertices of the eye-glasses.
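A minimal sketch of this step: equation (6) is linear in (cos θ, sin θ), so a direct least-squares solve followed by atan2 recovers θ (the specification says only that (6) is optimized; this is one way to do it):

```python
# Hedged sketch: solve the 2x2 system of equation (6) for (cos t, sin t)
# and recover the temple rotation angle theta.
import numpy as np

def temple_rotation(XI, YI, XMR, YMR, ZMR, f1, f2, tx, ty, tz):
    A = np.array([[-XI * ZMR - f1 * XMR, XI * XMR - f1 * ZMR],
                  [-YI * ZMR,            YI * XMR]])
    rhs = np.array([XI * tz + f1 * tx,
                    YI * tz + f2 * YMR + f2 * ty])
    cos_t, sin_t = np.linalg.lstsq(A, rhs, rcond=None)[0]
    # atan2 normalises the pair, since cos^2 + sin^2 = 1 is not enforced.
    return np.arctan2(sin_t, cos_t)
```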
In yet other embodiments, the virtual try-on module 108 uses a material property defined in the mesh to extract the color information and modifies the color information according to the user's direction. This gives a life-like representation of the eye-glasses in the projected frame. The virtual try-on module 108 then projects the 3D mesh on the frame using OpenGL routines. The Open Graphics Library (OpenGL) is a cross-language, cross-platform application programming interface (API) for rendering 2D and 3D vector graphics. The API is typically used to interact with a graphics processing unit (GPU) to achieve hardware-accelerated rendering. The OpenGL routines are used for projecting the mesh onto the image such that a realistic view is generated, thus improving the visualization and producing a realistic output of a user wearing the eye-glasses (e.g., during online shopping and so on).
In an example, given the specular (i_{m,s}), diffuse (i_{m,d}) and ambient (i_a) components of each light source, the intensity mesh_p of each mesh point of the eye-glasses is calculated as follows:

$$\mathrm{mesh}_p = k_a i_a + \sum_{m \,\in\, \mathrm{lights}} \left( k_d \left( L_m \cdot N \right) i_{m,d} + k_s \left( R_m \cdot V \right)^{\alpha} i_{m,s} \right)$$

where k_s, k_d, k_a and α are constants, L_m is the direction vector of light m, V is the user's viewing direction, N denotes the normal at a given 3D point of the mesh, and R_m is the direction vector of the reflection of L_m about the surface normal N.
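A minimal sketch of this shading model, a Phong-style sum over the light sources; the shininess exponent alpha and the unit-length direction vectors are assumptions consistent with the formula above:

```python
# Hedged sketch: per-point intensity with ambient, diffuse and specular terms.
import numpy as np

def shade_point(ka, kd, ks, alpha, ia, lights, N, V):
    """lights: iterable of (L, i_d, i_s), with L the unit direction towards
    the light; N: unit surface normal; V: unit direction towards the viewer."""
    intensity = ka * ia
    for L, i_d, i_s in lights:
        R = 2.0 * np.dot(L, N) * N - L           # reflection of L about N
        intensity += kd * max(np.dot(L, N), 0.0) * i_d
        intensity += ks * max(np.dot(R, V), 0.0) ** alpha * i_s
    return intensity
```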

FIG. 3 is a flow chart illustrating a method 300 for virtual try-on of eye glasses, according to an embodiment of the present disclosure. At block 302, a frame including a user’s image and a three-dimensional (3D) mesh of the eye-glasses are received. At block 304, a face and eyes of the user are identified in the image. At block 306, image control points are defined around the eyes of the user. In an example implementation, a set of points defining a region of interest around the eyes of the user is estimated, and the image control points are then defined based on the estimated set of points.
At block 308, mesh control points corresponding to the image control points are selected in the 3D mesh of the eye-glasses. At block 310, an aspect ratio of the eye-glasses is determined based on the length and breadth of the eye-glasses in the 3D mesh. At block 312, the image control points are updated based on the aspect ratio of the eye-glasses. At block 314, a rotation angle and a translation angle are estimated to transform the 3D mesh from a world camera system to a camera co-ordinate system based on the updated image control points and the mesh control points. At block 316, the transformed 3D mesh of the eye-glasses is projected on the frame based on the estimated rotation angle and translation angle, thus enabling the user to virtually try on the eye-glasses.
In some embodiments, two pupils of the user in the frame are identified. Further, mesh vertices projecting on the pupils of the user are marked. Furthermore, the mesh vertices corresponding to the pupils are transferred to the camera co-ordinate system. In addition, a Euclidean distance between an x-axis and y-axis component of the mesh vertices in the camera co-ordinate system is calculated. Also, a Euclidean distance between the two pupils of the user is calculated. Moreover, an inter-pupillary distance is determined based on the Euclidean distance between the x-axis and y-axis component of the mesh vertices and the Euclidean distance between the two pupils of the user.
In other embodiments, an orientation of temples in the 3D mesh of the eye-glasses is changed to fit the face of the user. For example, the orientation of the temples in the 3D mesh of the eye-glasses is changed based on an intrinsic matrix and a rotation of the temples. In yet other embodiments, color information of the eye-glasses is extracted using a material property defined in the 3D mesh. The color information is modified according to a direction of the user. This is explained in more detail with reference to FIG. 1.
The order in which the method(s) are described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the above method, or an alternative method. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the above method can be implemented in any suitable hardware, software, firmware, or combination thereof.
In an implementation, one or more of the method(s) described herein may be implemented at least in part as instructions embodied in a non-transitory computer-readable storage medium and executable by one or more computing devices. In general, a processor (for example, a microprocessor) receives instructions from a non-transitory computer-readable medium, for example, a memory, and executes those instructions, thereby performing one or more method(s), including one or more of the method(s) described herein. Such instructions may be stored and/or transmitted using any of a variety of known computer-readable media.
It is, however, to be understood that the scope of the protection extends to such a program and, in addition, to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for the implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device.
The preceding description has been presented with reference to various embodiments. Persons having ordinary skill in the art and technology to which this application pertains appreciate that alterations and changes in the described structures and methods of operation can be practiced without meaningfully departing from the principle, spirit and scope.

Documents

Application Documents

# Name Date
1 Form 3 [30-05-2016(online)].pdf 2016-05-30
2 Form 20 [30-05-2016(online)].jpg 2016-05-30
3 Form 18 [30-05-2016(online)].pdf_115.pdf 2016-05-30
4 Form 18 [30-05-2016(online)].pdf 2016-05-30
5 Drawing [30-05-2016(online)].pdf 2016-05-30
6 Description(Complete) [30-05-2016(online)].pdf 2016-05-30
7 Form 26 [21-07-2016(online)].pdf_22.pdf 2016-07-21
8 Form 26 [21-07-2016(online)].pdf 2016-07-21
9 Other Patent Document [04-08-2016(online)].pdf 2016-08-04
10 abstract1.jpg 2018-08-11
11 201621018553-Power of Attorney-250716.pdf 2018-08-11
12 201621018553-Form 1-100816.pdf 2018-08-11
13 201621018553-Correspondence-250716.pdf 2018-08-11
14 201621018553-Correspondence-100816.pdf 2018-08-11
15 201621018553-OTHERS [24-02-2021(online)].pdf 2021-02-24
16 201621018553-FER_SER_REPLY [24-02-2021(online)].pdf 2021-02-24
17 201621018553-COMPLETE SPECIFICATION [24-02-2021(online)].pdf 2021-02-24
18 201621018553-CLAIMS [24-02-2021(online)].pdf 2021-02-24
19 201621018553-FER.pdf 2021-10-18
20 201621018553-US(14)-HearingNotice-(HearingDate-11-04-2023).pdf 2023-03-20
21 201621018553-FORM-26 [29-03-2023(online)].pdf 2023-03-29
22 201621018553-FORM-26 [29-03-2023(online)]-1.pdf 2023-03-29
23 201621018553-Correspondence to notify the Controller [29-03-2023(online)].pdf 2023-03-29
24 201621018553-FORM-26 [11-04-2023(online)].pdf 2023-04-11
25 201621018553-RELEVANT DOCUMENTS [21-04-2023(online)].pdf 2023-04-21
26 201621018553-PETITION UNDER RULE 137 [21-04-2023(online)].pdf 2023-04-21
27 201621018553-Written submissions and relevant documents [24-04-2023(online)].pdf 2023-04-24
28 201621018553-PatentCertificate28-07-2023.pdf 2023-07-28
29 201621018553-IntimationOfGrant28-07-2023.pdf 2023-07-28

Search Strategy

1 2021-03-1213-20-32AE_12-03-2021.pdf
2 2020-07-2321-54-16E_23-07-2020.pdf

ERegister / Renewals

3rd: 27 Oct 2023 (from 30/05/2018 to 30/05/2019)
4th: 27 Oct 2023 (from 30/05/2019 to 30/05/2020)
5th: 27 Oct 2023 (from 30/05/2020 to 30/05/2021)
6th: 27 Oct 2023 (from 30/05/2021 to 30/05/2022)
7th: 27 Oct 2023 (from 30/05/2022 to 30/05/2023)
8th: 27 Oct 2023 (from 30/05/2023 to 30/05/2024)
9th: 29 May 2024 (from 30/05/2024 to 30/05/2025)
10th: 23 May 2025 (from 30/05/2025 to 30/05/2026)