Abstract: METHOD OF IDENTIFYING AN OBJECT HELD IN A USER’S HAND USING A WEARABLE DEVICE AND PROVIDING RECOMMENDATIONS THEREOF The present invention describes a method of identifying an object held in a user’s hand using a wearable device and providing recommendations thereof. According to one embodiment, the wearable device receives sensor data from a sensor unit embedded in the wearable device. The sensor data corresponds to the weight of the handheld object. A camera unit present in the wearable device is triggered in response to the sensor data and captures one or more pictures of the handheld object. The image data provides the dimension and other properties of the object. Both the image data and the sensor data are then retrieved by a recommendation unit present in a communication device and compared with pre-stored data. Based on the comparison, one or more recommendations are provided to the user regarding the object held in the user’s hand, wherein the one or more recommendations enable the user to choose the object suitable for the user. Figure 1
Claims: We claim:
1. A method of identifying an object held in a user’s hand using a wearable device and providing recommendations, the method comprising:
receiving, by a wearable device, a sensor data from a sensor unit embedded in the wearable device;
capturing, by the wearable device, one or more pictures of the object held in the user’s hand in response to the sensor data received from the sensor unit;
retrieving, by a recommendation unit, image data associated with the one or more captured pictures of the object and the sensor data from the wearable device;
comparing the image data and sensor data retrieved from the wearable device with data stored in a database of a communication device;
providing one or more recommendations to the user regarding the object held in the user’s hand, wherein the one or more recommendations enable the user to choose the object suitable for the user.
2. The method as claimed in claim 1, wherein the sensor data is received due to generation of an electrical signal in one or more hand muscles due to muscle contraction when the object is held in the user’s hand.
3. The method as claimed in claim 1, wherein the sensor data corresponds to weight of the object.
4. The method as claimed in claim 1, further comprising:
processing the captured one or more pictures of the handheld object using an image processing technique for obtaining dimension and type of the object.
5. The method as claimed in claim 4, wherein the dimension corresponds to shape of the object and type of the object corresponds to property of the object.
6. The method as claimed in claim 5, wherein the property of the object comprises at least one of the following:
solid,
liquid,
semi-solid,
sediments,
ingredients, and
calorie values.
7. The method as claimed in claim 1, wherein the data stored in the communication device comprises review information provided by a plurality of users on the object, purchase history and health data of the user.
8. The method as claimed in claim 1, wherein providing one or more recommendations to the user comprises:
delivering audio announcements on the communication device.
9. The method as claimed in claim 1, wherein providing one or more recommendations to the user comprises:
displaying notifications on a display of the communication device.
10. A system for providing recommendations regarding a handheld object, comprising:
a wearable device adapted for:
receiving a sensor data from a sensor unit embedded in the wearable device;
capturing one or more pictures of the handheld object in response to the sensor data received from the sensor unit; and
a recommendation unit adapted for:
retrieving image data associated with the one or more captured pictures of the object and the sensor data from the wearable device;
comparing the image data and sensor data retrieved from the wearable device with pre-stored data in a communication device; and
providing one or more recommendations to the user regarding the object held in the user’s hand, wherein the one or more recommendations enable the user to choose the object suitable for the user.
11. The system as claimed in claim 10, wherein the sensor data is received due to generation of an electrical signal in one or more hand muscles due to muscle contraction when the object is held in the user’s hand.
12. The system as claimed in claim 10, wherein the sensor data corresponds to weight of the object.
13. The system as claimed in claim 10, wherein the camera unit is further adapted to
process the captured one or more pictures of the handheld object using an image processing technique for obtaining dimension and type of the object.
14. The system as claimed in claim 13, wherein the dimension corresponds to shape of the object and type of the object corresponds to property of the object.
15. The system as claimed in claim 14, wherein the property of the object comprises at least one of the following:
solid,
liquid,
semi-solid,
sediments,
ingredients, and
calorie values.
16. The system as claimed in claim 10, wherein the pre-stored data comprises review information provided by a plurality of users on the object, purchase history and health data of the user.
17. The system as claimed in claim 10, wherein in providing the one or more recommendations to the user, the recommendation unit is adapted to:
deliver audio announcements and display notifications on a display on the communication device.
Dated this the 14th day of December 2015
Signature
KEERTHI J S
Patent agent
Agent for the applicant
Description:
FORM 2
THE PATENTS ACT, 1970
[39 of 1970]
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(Section 10; Rule 13)
METHOD OF IDENTIFYING AN OBJECT HELD IN A USER’S HAND USING A WEARABLE DEVICE AND PROVIDING RECOMMENDATIONS THEREOF
SAMSUNG R&D INSTITUTE INDIA – BANGALORE Pvt. Ltd.
# 2870, ORION Building, Bagmane Constellation Business Park,
Outer Ring Road, Doddanakundi Circle,
Marathahalli Post,
Bangalore -560037, Karnataka, India
Indian Company
The following specification particularly describes the invention and the manner in which it is to be performed
FIELD OF THE INVENTION
The present invention in general relates to wearable devices, and more particularly relates to a method of identifying an object held in a user’s hand using a wearable device and providing recommendations thereof.
BACKGROUND OF THE INVENTION
In recent times, recommendation systems have been used in many places such as shopping malls, restaurants and grocery shops, with the intent of assisting customers with the items they pick. Some of the existing recommendation systems use a device attached to a shopping cart, which scans an RFID tag associated with an item when the item is placed inside the shopping cart. The item is then identified by an external system by reading the RFID tag associated with the item, and recommendations regarding the item are provided on a display of the user’s device such as a mobile phone, PDA etc. Therefore, it is necessary for a user to use a shopping cart to know the details of the items picked by the user.
Some recommendation systems use a bar code system, which allows even blind people to shop by themselves. The user is provided with a bar code reader device to read the bar code information associated with the objects picked by the user. The bar code information is first scanned, and the output is converted into voice data and then provided to the blind user’s wireless communication device. Hence, the user has to carry the bar code reader device every time he visits a shopping mall.
Therefore, there is a need for a novel method and device for identifying an object held in a user’s hand and providing recommendations associated with the object to the user.
SUMMARY OF THE INVENTION
Various embodiments herein describe a method of identifying an object held in a user’s hand using a wearable device and providing recommendations thereof. According to one embodiment, the method of providing recommendations regarding a handheld object using a wearable device comprises receiving, by a wearable device, sensor data from a sensor unit embedded in the wearable device; capturing, by the wearable device, one or more pictures of the object held in the user’s hand in response to the sensor data received from the sensor unit; retrieving, by a recommendation unit, image data associated with the one or more captured pictures of the object and the sensor data from the wearable device; comparing the image data and sensor data retrieved from the wearable device with pre-stored data in a communication device; and providing one or more recommendations to the user regarding the object held in the user’s hand, wherein the one or more recommendations enable the user to choose the object suitable for the user.
According to one embodiment, the sensor data is received due to generation of an electrical signal in one or more hand muscles due to muscle contraction when the object is held in the user’s hand.
According to one embodiment, the sensor data corresponds to weight of the object.
According to one embodiment, the method further comprises processing the captured one or more pictures of the handheld object using an image processing technique for obtaining the dimension and type of the object.
According to one embodiment, the dimension corresponds to shape of the object and type of the object corresponds to property of the object.
According to one embodiment, the property of the object comprises at least one of the following: solid, liquid, semi-solid, sediments, ingredients, and calorie values.
According to one embodiment, the pre-stored data comprises review information provided by a plurality of users on the object, purchase history and health data of the user.
According to one embodiment, in providing one or more recommendations to the user, the method comprises delivering audio announcements on the communication device and displaying notifications on a display of the communication device.
Various embodiments herein further describe a system for providing recommendations regarding a handheld object, the system comprising a wearable device adapted for: receiving sensor data from a sensor unit embedded in the wearable device, and capturing one or more pictures of the handheld object in response to the sensor data received from the sensor unit; and a recommendation unit adapted for: retrieving image data associated with the one or more captured pictures of the object and the sensor data from the wearable device, comparing the image data and sensor data retrieved from the wearable device with data stored in a communication device, and providing one or more recommendations to the user regarding the object held in the user’s hand, wherein the one or more recommendations enable the user to choose the object suitable for the user.
The foregoing has outlined, in general, the various aspects of the invention and is to serve as an aid to better understanding the more complete detailed description which is to follow. In reference to such, there is to be a clear understanding that the present invention is not limited to the method or application of use described and illustrated herein. It is intended that any other advantages and objects of the present invention that become apparent or obvious from the detailed description or illustrations contained herein are within the scope of the present invention.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The other objects, features and advantages will occur to those skilled in the art from the following description of the preferred embodiment and the accompanying drawings in which:
Figure 1 illustrates an exemplary embodiment of identifying an object held in a user’s hand using a wearable device and providing recommendation to the user, according to one embodiment.
Figure 2 is a schematic diagram illustrating one or more functional modules of a recommendation unit for providing one or more recommendations regarding a handheld object to a user, according to one embodiment.
Figure 3 is a flow chart diagram illustrating an exemplary method of providing one or more recommendations regarding an object held in a user’s hand using a wearable device, according to one embodiment.
Figure 4 is a schematic diagram illustrating one or more recommendations regarding an object held in a user’s hand on a display of a communication device, according to one embodiment.
Although specific features of the present invention are shown in some drawings and not in others, this is done for convenience only as each feature may be combined with any or all of the other features in accordance with the present invention.
DETAILED DESCRIPTION OF THE INVENTION
The present invention provides a method of identifying an object held in a user’s hand using a wearable device and providing recommendations to the user. In the following detailed description of the embodiments of the invention, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
The specification may refer to “an”, “one” or “some” embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes”, “comprises”, “including” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Figure 1 illustrates an exemplary embodiment of identifying an object held in a user’s hand using a wearable device and providing a recommendation to the user, according to one embodiment. In this embodiment, the wearable device 104 may be a smartwatch which is embedded with a camera and a sensor unit for identifying an object 102 held in a user’s hand. As shown in Figure 1, the wearable device 104 receives sensor data from a sensor unit embedded in the wearable device 104. The sensor data is received as a result of the generation of an electrical signal due to muscle contraction in one or more muscles when the object 102 is held in the user’s hand. The sensor unit senses the generation of the electrical signal and outputs a value corresponding to the weight of the handheld object. Simultaneously, the sensor unit triggers a camera application present in the wearable device 104 such that the camera view 106 focuses on the object 102 held in the user’s hand. Thereafter, one or more pictures of the handheld object 102 are captured automatically. The captured one or more pictures are further processed by an image processing technique to obtain the dimension, type and other information associated with the object 102, wherein the dimension of the object corresponds to the shape of the object and the type of the object corresponds to a property of the object. The property of the object comprises at least one of solid, liquid, semi-solid, sediments, ingredients, and calorie values.
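The sensor-triggered capture described above can be sketched as follows. This is a minimal illustrative sketch only: the specification discloses no concrete thresholds, calibration constants or APIs, so every name and value below (`EMG_THRESHOLD_MV`, the linear calibration factor, `SensorData`) is a hypothetical assumption.

```python
from dataclasses import dataclass

# Assumed activation threshold (mV) above which an object is deemed "held".
EMG_THRESHOLD_MV = 0.5


@dataclass
class SensorData:
    emg_mv: float        # measured surface-EMG amplitude
    weight_grams: float  # weight value derived from the EMG signal


def on_emg_sample(emg_mv: float, calibration_g_per_mv: float = 400.0):
    """Return sensor data when muscle contraction indicates a held object.

    Below the threshold nothing is held, so the camera stays idle (None).
    Above it, a simple linear mapping stands in for the weight estimation,
    and the caller would trigger the camera application next.
    """
    if emg_mv < EMG_THRESHOLD_MV:
        return None
    weight = emg_mv * calibration_g_per_mv  # hypothetical linear calibration
    return SensorData(emg_mv=emg_mv, weight_grams=weight)
```

In a real device the calibration would come from per-user training rather than a fixed constant; the linear mapping is only a placeholder for the sensor unit's output.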
The image data associated with the captured one or more pictures of the object and the sensor data are retrieved by a recommendation unit present in a communication device 110 for providing one or more recommendations regarding the handheld object 102 to the user. The one or more recommendations enable the user to wisely choose the object suitable for the user.
Figure 2 is a schematic diagram illustrating one or more functional modules of a recommendation unit for providing one or more recommendations regarding a handheld object to a user, according to one embodiment. As shown in Figure 2, the recommendation unit 200 comprises a data retrieving module 202, a third party data retrieving module 204, a database 206, a comparison module 208 and a notification module 210. The modules function in conjunction with one another to provide recommendations to the user. The data retrieving module 202 is configured to retrieve image data and sensor data from the wearable device. The image data comprises information regarding the dimension and type of the object. The dimension of the object corresponds to the shape, height and color information of the handheld object. The type of the object corresponds to a property of the object, wherein the property of the object comprises at least one of solid, liquid, semi-solid, sediments, ingredients, and calorie values. The sensor data provides information about the weight of the object held in the user’s hand. The recommendation unit 200 retrieves the image data and sensor data from the wearable device and transfers them to the third party data retrieving module 204. The third party data retrieving module 204 is configured to retrieve additional data relating to the object from third party websites/servers based on the image data and sensor data received from the wearable device. For example, if the object held in the user’s hand is a cool drinks bottle, the image data provides information regarding the bottle shape, the color of the bottle, the content of the bottle and any sediments/ingredients, and the sensor data provides information regarding the weight of the cool drinks bottle. If the third party data retrieving module identifies from the received image data that the cool drinks bottle contains contaminants, then additional information associated with the cool drinks bottle is retrieved using the Internet or third party servers.
The additional information may include, but is not limited to, one or more medical conditions resulting from consuming the contaminated cool drinks, suggestions to book an appointment at a clinic or with doctors in nearby locations to treat the one or more medical conditions, the availability of good quality cool drinks in nearby locations, reviews provided by a plurality of users on the object, purchase history and the like. The data retrieved from the third party servers is then stored in the database 206.
The database 206 also comprises health information associated with the user. In one embodiment, the health data of the user can be retrieved from a third party server and stored in the database 206. The comparison module 208 is configured to compare the image and sensor data retrieved from the wearable device with the data stored in the database 206. Based on the health data stored in the database 206, the recommendation unit 200 provides one or more recommendations to the user via the notification module 210. The notification module 210 is configured to provide audio announcements via micro speakers of the communication device. In one embodiment, the recommendation unit 200 displays notifications regarding the object held in the user’s hand on a display of the communication device. For the contaminated cool drinks, the recommendation unit notifies the user of the availability of contaminant-free cool drinks bottles in nearby shops and displays the location names in the notification area. The recommendation unit 200 also alerts the user to one or more disorders, such as diarrhea or food poisoning, that may result if the contaminated cool drinks are consumed. Further, the recommendation unit 200 also suggests doctors in nearby locations for treating the disorders. An exemplary notification is shown in Figure 4.
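The behaviour of the comparison module 208 and notification module 210 can be illustrated with a small sketch. All field names and decision rules here are hypothetical placeholders; the specification does not define concrete data formats or matching rules.

```python
def recommend(object_info: dict, health_data: dict) -> list:
    """Compare object properties with stored health data and build alerts.

    object_info and health_data use invented keys ("contaminants",
    "sugar_g", "diabetic") purely for illustration.
    """
    notes = []
    # Contamination check, mirroring the cool drinks example in the text.
    if object_info.get("contaminants"):
        notes.append("Contaminants detected; consumption may cause disorders "
                     "such as diarrhea or food poisoning.")
        notes.append("Contaminant-free alternatives available in nearby shops.")
    # Example of a health-data-based rule (assumed threshold).
    if health_data.get("diabetic") and object_info.get("sugar_g", 0) > 10:
        notes.append("High sugar content; not recommended for your health profile.")
    return notes
```

The notification module would then route such notes either to the micro speakers as audio announcements or to the display's notification area.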
Figure 3 is a flow chart diagram illustrating an exemplary method of providing one or more recommendations regarding an object held in a user’s hand using a wearable device, according to one embodiment. At step 302, sensor data from a sensor unit embedded in the wearable device is received. The sensor data is received due to the generation of an electrical signal in one or more muscles of the hand when the object is held in the user’s hand. The generation of the electrical signal is detected by the sensor unit embedded in the wearable device. In one embodiment, the sensor unit embedded in the wearable device is a surface electromyography (EMG) sensor. The EMG sensor measures the generated electrical signal and outputs a value corresponding to the weight of the object. The sensor data also triggers a camera application and, at step 304, one or more pictures of the object held in the user’s hand are captured. The captured one or more pictures of the object are processed using an image processing technique to obtain further details of the object. In one embodiment, the wearable device uses a self-learning algorithm to obtain further details of the object comprising the dimension, whether the object is liquid, solid or semi-solid, whether the object contains any sediments, the ingredients associated with the object, etc.
Then, at step 306, both the sensor data and the image data are retrieved by a recommendation unit present in a communication device. In one embodiment, both the sensor data and the image data are transferred to the communication device using a short range communication. The short range communication is selected from a group comprising Bluetooth, Near Field Communication, Infrared communication, and Zigbee. At step 308, the retrieved sensor data and image data are compared with data stored in a database. The database comprises health information associated with the user and additional data on the object retrieved from third party websites. The health information comprises information about vital organs, the sugar level in blood, blood pressure and the like. In one embodiment, the health information associated with the user may be stored in a third party server and can be retrieved when needed. Based on the health information associated with the user, at step 310, one or more recommendations are provided to the user, wherein the one or more recommendations enable the user to wisely select the object. The recommendations can be provided to the user either as audio announcements or as notifications on a display of the communication device associated with the user.
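The five steps of Figure 3 (302 through 310) can be wired together as in the following sketch. Each callable argument is a hypothetical stand-in for the hardware, short-range radio and database interfaces that the specification leaves unspecified.

```python
def recommendation_pipeline(read_sensor, capture, send, compare, notify):
    """Compose steps 302-310 of Figure 3; all arguments are injected stand-ins."""
    sensor_data = read_sensor()              # step 302: receive EMG-derived sensor data
    image_data = capture(sensor_data)        # step 304: sensor-triggered image capture
    payload = send(sensor_data, image_data)  # step 306: short-range transfer to device
    recommendations = compare(payload)       # step 308: compare with stored database
    notify(recommendations)                  # step 310: audio/display notification
    return recommendations
```

For instance, `send` could serialize the data over Bluetooth or NFC, and `compare` could query the database of the communication device; both are placeholders here.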
The present invention is widely applicable in shopping malls, restaurants and the like, where even visually impaired people are able to shop without requiring any help from others. For example, when a visually impaired person holds an object, for example a water bottle, in his hand, a surface electromyography (EMG) sensor embedded in the wearable device senses an electrical signal emanating from one or more muscles of the user’s hand. The electrical signal is generated due to the force and torque exerted on the water bottle via the muscles. The EMG sensor converts the electrical signal and provides an output value corresponding to the weight of the water bottle. Simultaneously, the EMG sensor also triggers a camera application such that the field of view focuses on the water bottle held in the user’s hand. Once the focusing is done, the camera application automatically captures one or more pictures of the water bottle. The captured one or more pictures are further processed using an image processing technique to obtain further details of the water bottle such as the manufacturer name, any sediments, the weight of the bottle, the availability of the water bottle in nearby locations, etc. Once the above data is collected, a short range communication with a communication device is triggered to transfer the data. The communication device may comprise at least one of a mobile phone, a tablet, a personal digital assistant, a portable device and a laptop. A recommendation unit present in the communication device receives the water bottle details and weight data and compares the received data with the data stored in the database. Based on the comparison, one or more recommendations or alerts are notified to the user using the communication device.
Figure 4 is a schematic diagram illustrating one or more recommendations regarding an object held in a user’s hand on a display of a communication device, according to one embodiment. As shown in Figure 4, the one or more recommendations are displayed on a display of the communication device. The one or more recommendations correspond to the quality and quantity of the object held in the user’s hand. The recommendations may also include reviews provided by a plurality of users on the object, purchase history, the availability of the object in nearby locations, any sediments/contaminants, one or more medical conditions that may result from consuming the object, and suggestions of doctors in nearby locations for treating the one or more medical conditions. Using the one or more recommendations, the user selects the object.
In one embodiment, the communication device may provide the recommendations in the form of audio announcements. The audio announcements enable a visually impaired person to intelligently select the object based on the recommendations. The communication device may also display notifications in the notification area. Thus, the various embodiments of the present invention are adapted to provide a simple way of identifying an object held in the user’s hand and providing recommendations regarding the object without requiring any additional device other than a smartwatch and a smartphone.
Although the invention of the method and system has been described in connection with the embodiments of the present invention illustrated in the accompanying drawings, it is not limited thereto. It will be apparent to those skilled in the art that various substitutions, modifications and changes may be made thereto without departing from the scope and spirit of the invention.
| # | Document | Date |
|---|---|---|
| 1 | Power of Attorney [15-12-2015(online)].pdf | 2015-12-15 |
| 2 | Form 5 [15-12-2015(online)].pdf | 2015-12-15 |
| 3 | Form 18 [15-12-2015(online)].pdf | 2015-12-15 |
| 4 | Drawing [15-12-2015(online)].pdf | 2015-12-15 |
| 5 | Description(Complete) [15-12-2015(online)].pdf | 2015-12-15 |
| 6 | abstract 6724-CHE-2015.jpg | 2016-08-18 |
| 7 | 6724-CHE-2015-Power of Attorney-090816.pdf | 2016-08-22 |
| 8 | 6724-CHE-2015-Form 1-090816.pdf | 2016-08-22 |
| 9 | 6724-CHE-2015-Correspondence-F1-PA-090816.pdf | 2016-08-22 |
| 10 | 2019-05-2417-25-39_27-05-2019.pdf | |
| 11 | 6724-CHE-2015-FER.pdf | 2019-05-29 |
| 12 | 6724-CHE-2015-RELEVANT DOCUMENTS [17-07-2019(online)].pdf | 2019-07-17 |
| 13 | 6724-CHE-2015-FORM 13 [17-07-2019(online)].pdf | 2019-07-17 |
| 14 | 6724-CHE-2015-AMENDED DOCUMENTS [17-07-2019(online)].pdf | 2019-07-17 |
| 15 | 6724-CHE-2015-ABSTRACT [11-11-2019(online)].pdf | 2019-11-11 |
| 16 | 6724-CHE-2015-CLAIMS [11-11-2019(online)].pdf | 2019-11-11 |
| 17 | 6724-CHE-2015-DRAWING [11-11-2019(online)].pdf | 2019-11-11 |
| 18 | 6724-CHE-2015-FER_SER_REPLY [11-11-2019(online)].pdf | 2019-11-11 |
| 19 | 6724-CHE-2015-OTHERS [11-11-2019(online)].pdf | 2019-11-11 |
| 20 | 6724-CHE-2015-RELEVANT DOCUMENTS [11-11-2019(online)].pdf | 2019-11-11 |
| 21 | 6724-CHE-2015-Correspondence to notify the Controller [02-09-2021(online)].pdf | 2021-09-02 |
| 22 | 6724-CHE-2015-FORM-26 [02-09-2021(online)].pdf | 2021-09-02 |
| 23 | 6724-CHE-2015-Written submissions and relevant documents [17-09-2021(online)].pdf | 2021-09-17 |
| 24 | 6724-CHE-2015-US(14)-HearingNotice-(HearingDate-03-09-2021).pdf | 2021-10-17 |
| 25 | 6724-CHE-2015-IntimationOfGrant12-12-2021.pdf | 2021-12-12 |
| 26 | 6724-CHE-2015-PatentCertificate12-12-2021.pdf | 2021-12-12 |
| 27 | 6724-CHE-2015-RELEVANT DOCUMENTS [28-09-2023(online)].pdf | 2023-09-28 |