Abstract: SYSTEM AND PROCESS FOR REAL TIME DETECTION OF POSITION OF WRIST AND FINGERS AND SUPERIMPOSITION OF IMAGE THEREON A system and a method are disclosed for detecting the position of a hand 106 of a user. A processor unit is configured to receive and analyze an image 104 of a hand 106 captured by a camera device 102. The image 104 is analyzed to detect multiple rotations of the hand 106 with respect to each axis in a three-dimensional (3D) space and subsequently to detect the position of a first reference point and a second reference point on the image 104 of the hand 106. An article image is thereafter superimposed with the image 104 of the hand 106 to provide a 3D simulation experience to the user, wherein the article image is superimposed with the image 104 of the hand 106 based on the detected position of the first and the second reference point. REFER FIGURE 1
TECHNICAL FIELD
[0001] The present subject matter in general relates to detection of the position of wrist and fingers in an image of a human hand, and particularly relates to a system and process for simulated application of an article, such as an ornamental or utilitarian article on the wrist or finger so determined.
BACKGROUND
[0002] The conventional process of visiting a physical store and wearing and removing multiple products for try-on is inconvenient and brings several disadvantages for the user. Firstly, the physical application and removal of each item takes a significant amount of time, so the user is only able to try on a very limited inventory before making a purchase decision. Secondly, the limited inventory available at a given store location further restricts the number of items that the user can try. Thirdly, physical application of products may require users to bring multiple metallic items into contact with their skin, which can, in some users, cause reactions such as allergies and can sometimes even be painful.
[0003] Further, virtual or online methods of trying on wearable products or articles on wrists and fingers do not provide accuracy or an appropriate user experience. Users who purchase products via online stores are unable to try on the product before making a purchase decision.
[0004] Some of the technologies known in the art use stickers/QR codes to determine the position of a hand; however, these technologies are unable to determine the accurate pose of the hand. The stickers/QR codes are used as a standard pattern to locate the hand, but they present a huge adoption barrier for the user. Moreover, the approach based on QR codes and stickers does not work well under difficult lighting conditions or occlusions, since such markers can only be tracked when all parts of them are clearly visible.
[0005] Another approach known in the art for tracking hands uses wristbands to determine the pose of the wrist, but this approach is not able to track fingers unless a user is willing to wear bands and trackers all over the hand.
[0006] Currently, the technology for full 3D virtual try-on of watches, rings and bracelets, along with applications in a variety of other industries, is very limited due to the technology limitations mentioned above.
[0007] Hand tracking poses a variety of challenges that have made this technology difficult to implement. Virtual try-on for hands has been a long-standing challenge in the fashion/ornaments industry, since hands have no characteristic gradients or features that would allow them to be tracked easily using conventional computer vision algorithms.
[0008] Technologies relating to detection of the position of the wrist or fingers are difficult to implement since hands can be easily mistaken for other body parts because they are 'deformable': people can close their hands, move their fingers individually, position them at different angles in the image, move them around very fast, and so on. This is significantly different from face detection and tracking, which relies heavily on the eye-nose-eye bridge gradient to accurately find faces in an image, as that pattern is easy to detect and is otherwise uncommon. Similarly, humans cannot individually move their nose, mouth, ears or eyes, as the face structure is for the most part not deformable. The ability to determine the structure, position and angle of hands is thus a crucial element in improving the user experience for virtual product try-on in augmented reality.
[0009] Therefore, there is a well felt need for a system and method that provides detection and tracking of hand movements, estimates the pose of the hand from just a single image, and achieves real-time performance on mobile phones without requiring depth data.
SUMMARY
[0010] The present subject matter provides a holistic solution to the above-mentioned limitations.
[0011] An object of the present subject matter is to detect and track landmarks of a hand from an image, and also to estimate the pose of the hand from just a single image.
[0012] Another object of the present subject matter is to perform the various calculations in real-time without requiring depth data.
[0013] According to an embodiment of the present subject matter, there is provided a system for detecting the position of a hand of a user. The system comprises a camera device for capturing an image of a hand, the image including a finger part and a wrist part; a display device for displaying the image to the user; and a processor unit configured to: receive and analyze the captured image to detect a plurality of landmark points of the hand, track, in real time, multiple rotations of the hand in the image with respect to each axis in a three-dimensional (3D) space and subsequently detect the position of a first reference point and a second reference point on the image of the hand, and superimpose an article image with the image of the hand to provide a 3D simulation experience to the user; wherein the article image is superimposed with the image of the hand based on the detected position of the first and the second reference point.
[0014] According to an embodiment of the present subject matter, the plurality of landmark points includes multiple points on each finger of a human hand, and one point on the wrist.
[0015] According to an embodiment of the present subject matter, the first reference point
and the second reference point are located respectively on the finger part and the wrist part
on the image of the hand.
[0016] According to an embodiment of the present subject matter, the processor unit is further configured to calculate a plurality of offset distances based on the plurality of landmark points, the first reference point and the second reference point.
[0017] According to an embodiment of the present subject matter, in an event of the hand being moved around the y-axis at multiple angles, all other rotations are made constant, and a first offset distance is applied to x-coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
[0018] According to an embodiment of the present subject matter, in an event of the hand being moved around the z-axis at multiple angles, all other rotations are made constant, and a second offset distance is applied to y-coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
[0019] According to an embodiment of the present subject matter, in an event of the hand being moved around the x-axis at multiple angles, all other rotations are made constant, and a third offset distance is applied to z-coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
[0020] According to an embodiment of the present subject matter, each of the plurality of offset distances is calibrated with respect to the size of the hand.
[0021] According to an embodiment of the present subject matter, the article image is a 3D model of any hand-wearable article, including ornaments and watches.
[0022] According to an embodiment of the present subject matter, a method is provided for detecting the position of a hand of a user. The method comprises: configuring a camera device for capturing an image of a hand, the image including a finger part and a wrist part; configuring a display device for displaying the image to the user; and configuring a processor unit for: receiving and analyzing the captured image to detect a plurality of landmark points of the hand, tracking, in real time, multiple rotations of the hand in the image with respect to each axis in a three-dimensional (3D) space and subsequently detecting the position of a first reference point and a second reference point on the image of the hand, and superimposing an article image with the image of the hand to provide a 3D simulation experience to the user; wherein the article image is superimposed with the image of the hand based on the detected position of the first and the second reference point.
[0023] The afore-mentioned objectives and additional aspects of the embodiments herein
will be better understood when read in conjunction with the following description and
accompanying drawings. It should be understood, however, that the following descriptions,
while indicating preferred embodiments and numerous specific details thereof, are given by
way of illustration and not of limitation. This section is intended only to introduce certain objects and aspects of the present invention, and is therefore, not intended to define key features or scope of the subject matter of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The figures mentioned in this section are intended to disclose exemplary
embodiments of the claimed system and method. Further, the components/modules and steps
of a process are assigned reference numerals that are used throughout the description to
indicate the respective components and steps. Other objects, features, and advantages of the
present invention will be apparent from the following description when read with reference to
the accompanying drawings:
[0025] Figure 1 illustrates a camera device capturing an image of hand of a user, according
to an exemplary embodiment of the present subject matter.
[0026] Figure 2 illustrates a plurality of landmark points, according to an exemplary
embodiment of the present subject matter.
[0027] Figure 3 illustrates angle of the hand in 3D space, according to an exemplary
embodiment of the present subject matter.
[0028] Figure 4a, Figure 4b and Figure 4c illustrate application of a corresponding offset distance to respective coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
[0029] Like reference numerals refer to like parts throughout the description of several views
of the drawings.
DETAILED DESCRIPTION
[0030] The following presents a detailed description of various embodiments of the present
subject matter with reference to the accompanying drawings.
[0031] The embodiments of the present subject matter are described in detail with reference
to the accompanying drawings. However, the present subject matter is not limited to these
embodiments which are only provided to explain more clearly the present subject matter to a
person skilled in the art of the present disclosure. In the accompanying drawings, like
reference numerals are used to indicate like components.
[0032] The specification may refer to "an", "one", "different" or "some" embodiment(s) in
several locations. This does not necessarily imply that each such reference is to the same
embodiment(s), or that the feature only applies to a single embodiment. Single features of
different embodiments may also be combined to provide other embodiments.
[0033] As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "attached" or "connected" or "coupled" or "mounted" to another element, it can be directly attached or connected or coupled to the other element or intervening elements may be present. As used herein, the term "and/or" includes any and all combinations and arrangements of one or more of the associated listed items.
[0034] The figures depict a simplified structure only showing some elements and functional entities, all being logical units whose implementation may differ from what is shown.
[0035] The present subject matter teaches a system and a process for detection of position of a body part, particularly a hand, a wrist, palm, fingers and thumb, based on an image of a human hand in varying orientations. The subject matter as disclosed herein, may be presented to the users as a mobile application, a web application or a television application to facilitate the users to superimpose an image of an ornamental article in a moving image of any human hand.
[0036] Figure 1 illustrates a camera device 102 capturing an image 104 of a hand 106 of a user, according to an exemplary embodiment of the present subject matter. In a preferred embodiment of the present subject matter, a system for detecting the position of a hand 106 of a user using a moving image 104 of the hand 106 is provided. The system comprises a camera device 102 for capturing an image 104 of a hand 106. The image 104 of the hand 106 includes two parts, namely a finger part and a wrist part. The finger part shows all fingers of the hand 106, including the thumb. A display device, in communication with the camera device 102, may be provided for displaying the image 104 to the user. The display device includes a monitor or a screen that may be a separate device or built into the camera device 102, such as a mobile phone camera or a digital camera device 102. A processor unit in communication with the camera device 102 and the display device is also provided. The processor unit is capable of executing program instructions stored in a memory. The processor unit is capable of communicating with the camera device 102, the display device and the memory, and is configured to execute program instructions to receive and analyze the captured image 104 to detect a plurality of landmark points of the hand 106.
[0037] Figure 2 illustrates the plurality of landmark points, according to an exemplary embodiment of the present subject matter. Hand landmarking and tracking algorithms are used to detect the plurality of landmark points. The plurality of landmark points includes multiple points on each finger of a human hand 106, and one point on the wrist. As shown in the figure, the plurality of landmark points includes twenty-one key points in the x, y and z axes, which can be categorized into four points on each of the five fingers, namely the thumb, the forefinger, the middle finger, the ring finger and the pinky finger or the little finger. One point of the plurality of landmark points is marked on the wrist part.
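The twenty-one-point layout described above can be sketched as follows. The index numbering and group names are illustrative assumptions (not taken from the specification), following a convention common to hand-landmark models; Python is used here purely for illustration.

```python
# Hypothetical sketch of the twenty-one landmark layout: one wrist point
# plus four points on each of the five fingers. Index values are an
# assumed convention, not part of the specification.
WRIST = 0  # the single point marked on the wrist part

# Four points per finger, e.g. knuckle, two intermediate joints, tip.
FINGERS = {
    "thumb":  [1, 2, 3, 4],
    "index":  [5, 6, 7, 8],
    "middle": [9, 10, 11, 12],
    "ring":   [13, 14, 15, 16],
    "pinky":  [17, 18, 19, 20],
}

def total_landmarks() -> int:
    """One wrist point plus four points on each of the five fingers."""
    return 1 + sum(len(points) for points in FINGERS.values())
```

Counting the wrist point together with the finger points recovers the twenty-one key points stated above.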
[0038] Figure 3 illustrates angle of the hand 106 in 3D space, according to an exemplary embodiment of the present subject matter. According to the embodiments of the present invention, a moving image 104 of the hand 106 of the user is captured by the camera device 102. The processor unit performs tracking of movement of hand 106 by analyzing the image 104. The tracking is performed, in real time, to detect multiple rotations of the hand 106 in the image 104 with respect to each axis in a three-dimensional (3D) space as shown in Figure 3. Subsequently, position of a first reference point and a second reference point on the image 104 of the hand 106 are detected. Thereafter, image 104 of an article is superimposed with the image 104 of the hand 106 to provide a 3D or three dimensional simulation experience to the user; wherein the article image is superimposed with the image 104 of the hand 106 based on the detected position of the first and the second reference point. The first reference point and the second reference point are located respectively on the finger part and the wrist part on the image 104 of the hand 106. The article image is a 3D model of any hand 106 wearable article including ornaments and watches.
[0039] Figure 4a, Figure 4b and Figure 4c illustrate application of a corresponding offset distance to respective coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
[0040] According to an embodiment of the present subject matter, the processor unit is further configured to calculate a plurality of offset distances 406a, 406b, 406c, 408a, 408b, 408c, based on the plurality of landmark points, the first reference point and the second reference point in the 3D space having x-y-z axes 402. In an event of the hand 106 being moved around the y-axis at multiple angles 402a, all other rotations are made constant, and a first offset distance 406a or 408a is applied to x-coordinates such that any point calculated thereby falls on the center of the finger part 404a or the wrist part 410a. Further, in an event of the hand 106 being moved around the z-axis at multiple angles, all other rotations are made constant, and a second offset distance 406b or 408b is applied to y-coordinates such that any point calculated thereby falls on the center of the finger part 404b or the wrist part 410b. Furthermore, in an event of the hand 106 being moved around the x-axis at multiple angles, all other rotations are made constant, and a third offset distance 406c or 408c is applied to z-coordinates such that any point calculated thereby falls on the center of the finger part 404c or the wrist part 410c.
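The per-axis offset correction described above can be sketched as follows. The constants `a`, `b`, `c` and the `hand_scale` factor are assumptions for illustration; in the described system they are obtained by calibration over multiple hands and angles, and the offsets are calibrated with respect to the size of the hand.

```python
# Minimal sketch of the offset correction: each rotation axis contributes
# an angle-proportional offset to its paired coordinate, so the computed
# point falls on the center of the finger or wrist.
#   rotation about y -> first offset on x
#   rotation about z -> second offset on y
#   rotation about x -> third offset on z
def apply_rotation_offsets(point, rot_x, rot_y, rot_z,
                           a=0.1, b=0.1, c=0.1, hand_scale=1.0):
    """Return the point shifted by the three rotation-dependent offsets.

    `a`, `b`, `c` stand in for the calibrated constants A, B, C; the
    linear model offset = constant * angle follows the description above.
    """
    x, y, z = point
    x += a * rot_y * hand_scale  # first offset distance
    y += b * rot_z * hand_scale  # second offset distance
    z += c * rot_x * hand_scale  # third offset distance
    return (x, y, z)
```

With all rotations at zero (the calibration pose), the point is unchanged; a single non-zero rotation shifts only its paired coordinate.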
[0041] According to an embodiment of the present subject matter, each of the plurality of offset distances is calibrated with respect to size of the hand 106.
[0042] The subject matter according to the present invention determines the pose and angle of a hand 106 across the x, y and z axes to provide the ability to use this technology for the virtual try-on of products. The subject matter according to the present invention performs the following main functions, namely:
Palm Detection: This is performed to determine whether the image 104 captured by the camera device 102 is an image 104 of a hand 106 or a palm of a user. In order to perform palm detection, the hand 106 of the user is brought in front of the camera within a predefined range of distance between the camera device 102 and the object.
Hand Landmarking and Tracking: This is performed to estimate the plurality of landmark points on the image 104 of the hand 106. Preferably, twenty-one key points are marked in the x, y and z axes, which can be categorized into four points on each of the five fingers, and one point on the wrist.
Pose Estimation: This is performed to use the plurality of landmark points from the step above to determine the angle of the hand 106 in a given 3D space.
[0043] The subject matter according to the present invention first crops the part of the input image 104 that encompasses the hand 106 using the detection process. It thereafter determines reference hand landmarking points falling on the fingers and wrist of the hand 106 and the rotation of the hand 106 in the image 104. The next step is correction of the rotation of the hand 106 in the image 104. Thereafter, the subject matter according to the present invention determines the vertical extent/height of the hand 106, corrects the offset distance from the reference hand landmarking points for the height of the hand 106, and adds the offset distance to the reference hand landmarking points for determining the position of the ring finger and wrist edges on the image 104 of the human hand 106.
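The height-based correction mentioned above can be sketched as follows. Using the wrist-to-middle-fingertip distance as a proxy for hand size, and scaling offsets linearly against a reference height, are both assumptions for illustration; the specification states only that the offsets are corrected for the height of the hand.

```python
import math

# Sketch of the height-based offset correction: offsets are scaled by the
# observed size of the hand in the image so the overlay stays centered
# as the hand moves nearer to or farther from the camera.
def hand_height(wrist, middle_tip):
    """Euclidean distance from the wrist point to the middle fingertip,
    used here as a proxy for the size of the hand in the image."""
    return math.dist(wrist, middle_tip)

def scale_offset(offset, wrist, middle_tip, reference_height=1.0):
    """Scale a calibrated offset distance by the observed hand height
    relative to a hypothetical calibration-time reference height."""
    return offset * hand_height(wrist, middle_tip) / reference_height
```

A hand appearing twice as tall as at calibration time would thus double every applied offset.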
[0044] The subject matter according to the present invention is capable of achieving the desired result without deducing the entire shape of the hand 106. The subject matter according to the present invention finds the coordinates of the key points of the hand 106 and the middle point between two finger points. By taking these points as reference, it develops a geometrical relationship to obtain the desired coordinates, and the system is also provided with an in-built manual error-handling option.
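The middle point between two finger points mentioned above is a simple geometric construction; a minimal sketch (the coordinate convention is an assumption):

```python
# Sketch of the middle point between two finger landmarks, used as a
# reference coordinate in the geometrical relationship described above.
# Works for 2D or 3D points alike.
def midpoint(p1, p2):
    """Component-wise average of two landmark coordinates."""
    return tuple((a + b) / 2 for a, b in zip(p1, p2))
```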
[0045] According to an embodiment of the present subject matter, a method is provided for detecting position of a hand 106 of a user. The method comprises the step of configuring the camera device 102 for capturing the hand 106 image 104 having a finger part and a wrist part. The image 104 is displayed on the display device to the user. The processor unit is configured for: receiving and analyzing the captured image 104 to detect a plurality of landmark points of the hand 106. The rotation movement of the hand 106 image 104 is tracked, in real time, wherein multiple rotations of the hand 106 in the image 104 with respect to each axis in a three-dimensional (3D) space are analyzed. Subsequently, position of a first reference point
and a second reference point on the image 104 of the hand 106 are detected. The image 104 of an article, which the user intends to purchase, is superimposed with the image 104 of the hand 106 to provide a 3D simulation experience to the user. The article image is superimposed with the image 104 of the hand 106 based on the detected position of the first and the second reference point. The method steps are described below:
1. The hand 106 of a person or a user in the image 104 is detected using a hand 106 detection algorithm that provides the bounding box where the hand 106 has been detected in the image 104.
2. A cropped image 104 of the hand 106 is taken as an input for a shape detection algorithm. The shape detection uses a training set to identify a plurality of landmark points of the hand 106 or the twenty-one unique features on the hand 106 of the person.
3. Once the plurality of landmark points have been identified, the points closest to the ring position are used as reference points.
4. In the next step, rotation of the hand 106 across all three axes is estimated.
5. During calibration, an image 104 of the user, where the user is straight in front of the camera, is used. In this position, all rotations are assumed to be at zero degrees.
6. The offset distance to the reference finger or referenced wrist points is added to position new markers at the position of the ring.
7. Keeping all other rotations constant, the hand 106 is moved around the y axis at multiple angles. An offset of (constant A) * (y rotation) is applied to the x coordinates of the marker such that the calculated point falls on the center of the finger/wrist.
8. Keeping all other rotations constant, the hand 106 is moved around the z axis at multiple angles. An offset of (constant B) * (z rotation) is applied to the y coordinates of the marker such that the calculated point falls on the center of the finger/wrist.
9. Keeping all other rotations constant, the hand 106 is moved around the x axis at multiple angles. An offset of (constant C) * (x rotation) is applied to the z coordinates of the marker such that the calculated point falls on the center of the finger or wrist.
10. These tests are done for multiple hands at multiple angles to get an average value of the constants A, B, C.
11. All offsets are calibrated with respect to the size of the hand 106. This is done such that when a hand 106 moves closer or further away from the camera, the image 104 overlaid on the finger points or the wrist points gets resized.
12. Once the positioning and location of the ring finger or wrist has been determined in accordance with the aforesaid steps, based on the rotation about the x, y, z axis, 3D models of the ornaments/watches are rotated. This rotated and resized 3D model of ornaments/watches are then placed on the location of the finger/wrist points, so as to provide a virtual try-on experience to the user.
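Steps 7 to 11 above can be sketched as follows. Each calibration sample pairs an observed rotation angle with the offset that was needed to land the marker on the finger/wrist center; averaging the per-sample ratios across multiple hands and angles yields one constant (A, B or C). The sample values below are illustrative only.

```python
# Sketch of estimating a calibration constant from samples, following the
# linear model offset = constant * angle stated in the steps above.
def fit_constant(samples):
    """Average the ratio offset/angle over (angle, offset) samples taken
    from multiple hands at multiple angles, skipping zero angles."""
    ratios = [offset / angle for angle, offset in samples if angle != 0]
    return sum(ratios) / len(ratios)

# Hypothetical y-rotation samples: (angle in degrees, required x-offset).
samples_y = [(10, 1.0), (20, 2.2), (30, 2.9)]
A = fit_constant(samples_y)  # averaged constant for the y-axis case
```

The same procedure would be repeated for the z-axis and x-axis samples to obtain B and C.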
[0046] The subject matter of the present invention has several advantages over the technologies known in the art. Using advancements in hand 106 detection and the real-time tracking of twenty-one landmarks, the subject matter of the present invention can determine the ring position on the finger or the position of a bracelet or a watch on the wrist. Along with this, the subject matter of the present invention uses the twenty-one landmarks to determine the angle of the hand 106, which can be used to simulate a 3D experience.
[0047] This is not achievable with conventional tracking methods based on stickers or QR codes since as soon as the hand 106 is flipped or the sticker is occluded, the detection, tracking and pose estimation stops working.
[0048] Further, using machine learning based landmark detection on the hand 106 enables the present subject matter to track even occluded parts of the hand 106, which in turn allows a much higher range of angles to be tracked.
[0049] The present subject matter has several advantages over the conventional processes of application and removal of an ornamental article on a user's hand 106. For instance, the process according to the present subject matter does not require visual hand-tracking stickers/markers. Further, a complete 360° angle estimation of the hand 106 can be obtained, and the process works even when the hand 106 is occluded by itself. Moreover, the process according to the present subject matter can be used to determine the width of fingers or the wrist. It can be implemented in real-time augmented reality try-on of ornaments such as watches, rings and bracelets. Moreover, the process according to the present subject matter enables the user to simulate the application of a large inventory of articles, particularly ornamental articles, without visiting a store.
[0050] While the preferred embodiments of the present invention have been described hereinabove, it should be understood that various changes, adaptations, and modifications may be made therein without departing from the spirit of the invention and the scope of the appended claims. It will be obvious to a person skilled in the art that the present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive.
We Claim
1. A system for detecting position of a hand 106 of a user, the system comprising:
a camera device 102 for capturing an image 104 of a hand 106, the image 104
including a finger part and a wrist part;
a display device for displaying the image 104 to the user; and
a processor unit configured to:
receive and analyze the captured image 104 to detect a plurality of landmark
points of the hand 106, track, in real time, multiple rotations of the hand 106 in the image 104 with respect to each axis in a three-dimensional (3D) space and subsequently detect position of a first reference point and a second reference point on the image 104 of the hand 106, and superimpose an article image with the image 104 of the hand 106 to provide a 3D simulation experience to the user; the article image being superimposed with the image 104 of the hand 106 based on the detected position of the first and the second reference point.
2. The system as claimed in claim 1, wherein the plurality of landmark points includes multiple points on each finger of a human hand 106, and one point on wrist.
3. The system as claimed in claim 1, wherein the first reference point and the second reference point are located respectively on the finger part and the wrist part on the image 104 of the hand 106.
4. The system as claimed in claim 1, wherein the processor unit is further configured to calculate a plurality of offset distances based on the plurality of landmark points, the first reference point and the second reference point.
5. The system as claimed in claim 4, wherein in an event of the hand 106 being moved around the y-axis at multiple angles, all other rotations are made constant, and a first offset distance is applied to x-coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
6. The system as claimed in claim 4, wherein in an event of the hand 106 being moved around the z-axis at multiple angles, all other rotations are made constant, and a second offset distance is applied to y-coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
7. The system as claimed in claim 4, wherein in an event of the hand 106 being moved around the x-axis at multiple angles, all other rotations are made constant, and a third offset distance is applied to z-coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
8. The system as claimed in claim 4, wherein each of the plurality of offset distances is calibrated with respect to size of the hand 106.
9. The system as claimed in claim 1, wherein the article image is a 3D model of any hand-wearable article including ornaments and watches.
10. A method for detecting position of a hand 106 of a user, the method comprising:
configuring a camera device 102 for capturing an image 104 of a hand 106, the image 104 including a finger part and a wrist part; configuring a display device for displaying the image 104 to the user; and configuring a processor unit for:
receiving and analyzing the captured image 104 to detect a plurality of landmark points of the hand 106, tracking, in real time, multiple rotations of the hand 106 in the image 104 with respect to each axis in a three-dimensional (3D) space and subsequently detecting the position of a first reference point and a second reference point on the image 104 of the hand 106, and superimposing an article image with the image 104 of the hand 106 to provide a 3D simulation experience to the user; the article image being superimposed with the image 104 of the hand 106 based on the detected position of the first and the second reference point.
11. The method of claim 10, wherein the plurality of landmark points includes multiple points on each finger of a human hand 106, and one point on wrist.
12. The method of claim 10, wherein the first reference point and the second reference point are located respectively on the finger part and the wrist part on the image 104 of the hand 106.
13. The method of claim 10, further comprising calculating a plurality of offset distances based on the plurality of landmark points, the first reference point and the second reference point.
14. The method of claim 13, wherein in an event of the hand 106 being moved around the y-axis at multiple angles, all other rotations are made constant, and a first offset distance is applied to x-coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
15. The method of claim 13, wherein in an event of the hand 106 being moved around z-axis at multiple angles, all other rotations are made constant, and a second offset distance is applied to y-coordinates such that any point being calculated thereof, falls on center of the finger part or the wrist part.
16. The method of claim 13, wherein in an event of the hand 106 being moved around the x-axis at multiple angles, all other rotations are made constant, and a third offset distance is applied to z-coordinates such that any point calculated thereby falls on the center of the finger part or the wrist part.
17. The method of claim 13, wherein each of the plurality of offset distances is calibrated with respect to size of the hand 106.
18. The method of claim 10, wherein the article image is a 3D model of any hand 106 wearable article including ornaments and watches.