Abstract: METHOD OF RENDERING USER INTERACTION ON A DUAL SCREEN DEVICE. The present invention describes a method of rendering user interaction on a dual screen device. According to one embodiment, the method includes detecting at least one gesture provided by a user on the dual screen device having a first screen and a second screen, determining if a focus of the user is on the first screen or the second screen, and performing one or more actions associated with the at least one detected gesture on the screen focused by the user. The at least one gesture comprises a folding action, a flapping action, a tilting action, or a combination of these gestures for controlling one or more ongoing tasks on the dual screen device. The present invention also enables a touch enabled screen strip at the back of the dual screen device to perform one or more actions based on the gesture provided by the user. Figure 2
Claims: We claim:
1. A method of rendering user interaction on a dual screen device, the method comprising:
detecting at least one gesture provided by a user on the dual screen device having a first screen and a second screen;
determining if a focus of the user is on the first screen or the second screen; and
performing one or more actions associated with the at least one detected gesture on the screen focused by the user.
2. The method as claimed in claim 1, wherein the one or more actions comprise:
copying an entity present on the first screen to the second screen;
saving a work in progress on the screen;
changing orientation of the screen; and
closing an application performed on the screen.
3. The method as claimed in claim 1, wherein the at least one gesture comprises one of a folding action, a flapping action, a tilting action, or a combination of these gestures for controlling one or more ongoing tasks on the dual screen device.
4. The method as claimed in claim 3, wherein the folding gesture comprises an align-fold-glue interaction of the first screen and the second screen, wherein content present on the first screen is copied to the second screen.
5. The method as claimed in claim 3, wherein the folding gesture comprises a shut-and-save interaction, wherein a work in progress on the first screen is saved in response to the shut action.
6. The method as claimed in claim 3, wherein the tilt gesture is configured to change the orientation of the first screen and the second screen relative to each other.
7. The method as claimed in claim 3, wherein the flap gesture is configured to close an application/task performed on the first screen or the second screen.
8. The method as claimed in claim 1, wherein the focus of the user on at least one of the first screen and the second screen is determined based on:
eye gaze or head direction of the user relative to the dual screen device; and
comparative tilt angle of the first screen and the second screen relative to the base position.
9. The method as claimed in claim 1, further comprising:
detecting at least one of a folding, tilting, and flapping gesture on the dual screen device to identify the screen;
detecting one or more gestures on a touch enabled screen strip at the back of the dual screen device to perform one or more actions on the identified screen; and
performing one or more actions on the identified screen based on the one or more user interactions received on the touch enabled screen strip at the back of the dual screen device, wherein the identified screen includes one of the first screen and the second screen.
10. The method as claimed in claim 9, wherein the one or more user interactions comprise:
a left-to-right swipe gesture for performing a horizontal scrolling action on the first or second screen; and
an up-down swipe gesture for performing a vertical scrolling action on the first or second screen.
11. A dual screen device comprising:
a dual screen;
a touch enabled screen strip connecting a first screen and a second screen of the dual screen device;
a plurality of synchronized gyroscopes positioned on the first screen and the second screen;
a plurality of magnetic strips positioned on the first screen and the second screen;
a plurality of electromagnetic induction coils positioned on the first screen and the second screen;
wherein the dual screen is adapted for:
detecting at least one gesture provided by a user on the dual screen;
determining if a focus of the user is on the first screen or the second screen; and
performing one or more actions associated with the at least one detected gesture on the screen focused by the user.
12. The dual screen device as claimed in claim 11, wherein the one or more actions comprise:
copying an entity present on the first screen to the second screen;
saving a work in progress on the screen;
changing orientation of the screen; and
closing an application performed on the screen.
13. The dual screen device as claimed in claim 11, wherein the at least one gesture comprises one of a folding action, a flapping action, a tilting action, or a combination of these gestures for controlling one or more ongoing tasks on the dual screen device.
14. The dual screen device as claimed in claim 13, wherein the folding gesture comprises an align-fold-glue interaction on the first screen and the second screen, wherein content present on the first screen is copied to the second screen.
15. The dual screen device as claimed in claim 13, wherein the folding gesture comprises a shut-and-save interaction, wherein a work in progress on the first screen is saved in response to the shut action.
16. The dual screen device as claimed in claim 13, wherein the tilt gesture is configured to change the orientation of the first screen and the second screen relative to each other.
17. The dual screen device as claimed in claim 13, wherein the flap gesture is configured to close an application/task performed on the first screen or the second screen.
18. The dual screen device as claimed in claim 11, wherein the focus of the user on at least one of the first screen and the second screen is determined based on:
eye gaze or head direction of the user relative to the dual screen device; and
comparative tilt angle of the first screen and the second screen relative to the base position.
Dated this the 27th day of October 2015
Signature
KEERTHI J S
Patent agent
Agent for the applicant
Description: FORM 2
THE PATENTS ACT, 1970
[39 of 1970]
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(Section 10; Rule 13)
METHOD OF RENDERING USER INTERACTION ON A DUAL SCREEN DEVICE
SAMSUNG R&D INSTITUTE INDIA – BANGALORE PRIVATE LIMITED
# 2870, ORION Building, Bagmane Constellation Business Park,
Outer Ring Road, Doddanakundi Circle,
Marathahalli Post, Bangalore-560 037
An Indian Company
The following Specification particularly describes the invention and the manner in which it is to be performed
FIELD OF THE INVENTION
The present invention generally relates to dual screen devices, and more particularly relates to a method of rendering user interaction on a dual screen device.
BACKGROUND OF INVENTION
Mobile technology is getting smarter by the day and is constantly undergoing improvements in various aspects including speed, interface, design, size, functions, flexibility and durability. Present mobile phones have a single screen, and with it users cannot access two different applications at a time. For example, consider that the user is performing a navigation operation on the screen and also wants to browse. As both tasks need the screen, the user has to minimize/close the navigation operation and activate a browser application. This is one of the reasons behind the development of dual screen devices: with dual screens, the user can efficiently perform multitasking across tasks that require information to be displayed on the screen.
Currently, user interface elements on the phone screen lend themselves to interaction by the user through touch, tap, swipe or other basic gestures and the like. However, interacting with dual screens simultaneously using the basic gestures poses some discomfort to the user. Some of the challenges of using dual screens are:
• Holding or gripping the dual screen mobile phone is difficult while simultaneously performing touch/tap/swipe interactions on the screen.
• Single-handed use of dual screen mobile phones is limited. Though dual screen mobile phones are flexible, one cannot perform many interactions using one hand.
Therefore, there is a need for dual screen devices to have new interactions that can be performed without compromising handling stability. Also, the flexibility of these devices gives rise to new ways of physically interacting with them.
SUMMARY OF THE INVENTION
Various embodiments herein describe a method of rendering user interaction on a dual screen device. According to one embodiment, the method comprises detecting at least one gesture provided by a user on the dual screen device having a first screen and a second screen, determining if a focus of the user is on the first screen or the second screen, and performing one or more actions associated with the at least one detected gesture on the screen focused by the user.
According to one embodiment, the one or more actions comprise copying an entity present on the first screen to the second screen, saving a work in progress on the screen, changing the orientation of the screen, and closing an application performed on the screen.
According to one embodiment, the gesture comprises one of a folding action, a flapping action, a tilting action, or a combination of these gestures for controlling one or more ongoing tasks on the dual screen device.
According to one embodiment, the folding gesture comprises an align-fold-glue interaction of the first screen and the second screen, wherein content present on the first screen is copied to the second screen.
According to one embodiment, the folding gesture comprises a shut-and-save interaction, wherein a work in progress on the first screen is saved in response to the shut action.
According to one embodiment, the tilt gesture is configured to change the orientation of the first screen and the second screen relative to each other.
According to one embodiment, the flap gesture is configured to close an application/task performed on the first screen or the second screen.
According to one embodiment, the focus of the user on at least one of the first screen and the second screen is determined based on: the eye gaze or head direction of the user relative to the dual screen device, and a comparative tilt angle of the first screen and the second screen relative to the base position.
According to one embodiment, the method further comprises detecting at least one of a folding, tilting, and flapping gesture on the dual screen device to identify the screen, detecting one or more gestures on a touch enabled screen strip to perform one or more actions on the identified screen, and performing one or more actions on the identified screen based on the one or more user interactions received on the touch enabled screen strip, wherein the identified screen includes one of the first screen and the second screen.
According to one embodiment, the one or more user interactions comprise a left-to-right swipe gesture for performing a horizontal scrolling action on the first or second screen and an up-down swipe gesture for performing a vertical scrolling action on the first or second screen.
According to another embodiment, a device for rendering one or more user interactions on a dual screen device is provided. The dual screen device comprises: a dual screen; a touch enabled screen strip connecting a first screen and a second screen of the dual screen device; a plurality of synchronized gyroscopes positioned on the first screen and the second screen; a plurality of magnetic strips positioned on the first screen and the second screen; and a plurality of electromagnetic induction coils positioned on the first screen and the second screen; wherein the dual screen is adapted for: detecting at least one gesture provided by a user on the dual screen, determining if a focus of the user is on the first screen or the second screen, and performing one or more actions associated with the at least one detected gesture on the screen focused by the user.
The foregoing has outlined, in general, the various aspects of the invention and is to serve as an aid to better understanding the more complete detailed description which is to follow. In reference to such, there is to be a clear understanding that the present invention is not limited to the method or application of use described and illustrated herein. It is intended that any other advantages and objects of the present invention that become apparent or obvious from the detailed description or illustrations contained herein are within the scope of the present invention.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The other objects, features and advantages will occur to those skilled in the art from the following description of the preferred embodiment and the accompanying drawings in which:
Figures 1A, 1B and 1C are schematic diagrams illustrating exemplary gestures being rendered on a dual screen device, according to one embodiment.
Figure 2 illustrates a dual screen device with one or more user interface elements placed on the dual screen for performing one or more user interactions, according to one embodiment.
Figure 3 is a flowchart illustrating an exemplary method of rendering user interactions on a dual screen device, according to one embodiment.
Figure 4 is a schematic diagram illustrating an exemplary align-fold-glue folding interaction on a dual screen device, according to one embodiment.
Figure 5 is a schematic diagram illustrating an exemplary shut-save folding interaction on a dual screen device, according to one embodiment.
Figure 6 is a schematic diagram illustrating one or more gestures being performed on a touch enabled screen strip at the back of a dual screen device, according to one embodiment.
Although specific features of the present invention are shown in some drawings and not in others, this is done for convenience only, as each feature may be combined with any or all of the other features in accordance with the present invention.
DETAILED DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The present invention provides a method of rendering one or more user interactions on a dual screen device. In the following detailed description of the embodiments of the invention, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
The specification may refer to “an”, “one” or “some” embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes”, “comprises”, “including” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The present invention provides a method of rendering one or more user interactions on a dual screen device. In addition to the basic gestures, the present invention defines new gestures for interacting efficiently with the dual screen device. The one or more new gestures comprise a folding gesture, a tilting gesture, a flapping gesture, and combinations thereof. The method first detects at least one gesture on the dual screen device having a first screen and a second screen. After detecting the gesture, the method determines whether the focus of the user is on the first screen or the second screen. Then, one or more actions associated with the at least one detected gesture are performed on the screen focused by the user. In one embodiment, the dual screen device may comprise flexible screens for rendering one or more user interactions performed on the dual screen device.
Figures 1A, 1B and 1C are schematic diagrams illustrating exemplary gestures rendered on a dual screen device, according to one embodiment. Initially, the dual screen device is in a normal position 102A. Upon receiving a tilt gesture, the dual screen device changes the orientation of the screen to a tilted position 102B. In the tilted position 102B, content on the screen is changed from a portrait view to a landscape view, and vice versa. In one embodiment, the dual screen device detects the tilt gesture based on the relative positions of the two screens with respect to the joint through which the screens are attached. The same is illustrated in Figure 1A.
Similarly, Figures 1B and 1C illustrate a folding gesture and a flapping gesture provided on the dual screen device in three-dimensional (3D) form. The folding gesture is defined when a user closes the device either from the left to the right or from the right to the left. Upon receiving a folding gesture, the dual screen device changes from its normal position 104A to a folded position 104B, as illustrated in Figure 1B. The folding gesture allows a user to copy/move an entity such as an application, graphics or content present on one screen to the other screen. This type of screen folding interaction is defined as the align-fold-glue interaction. Also, using the fold gesture, the user is allowed to save/close an active entity present on the screen. This type of screen folding interaction is defined as the shut-save interaction. A detailed description of both screen folding interactions is given in Figures 4 and 5. As shown in Figure 1C, the flap gesture is defined when a user repeatedly performs an action of bringing both screens closer together and then moving them apart. For example, consider that the user is reading a book on one screen and wants to move to the next page. The user can simply flap the dual screen once to go to the next page. Thus, the user is taken to the next page using the flap gesture without losing handling stability of the device. Similarly, the user can provide the flapping gesture multiple times to trigger one or more functions on the screen, as sketched below.
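By way of illustration only, the mapping from counted flap gestures to functions could be expressed as follows; the flap-count detector and the action names are hypothetical and not prescribed by this embodiment:

    # Illustrative sketch: map the number of flap gestures to a function.
    # The action names and bindings are assumptions for illustration.
    FLAP_ACTIONS = {
        1: "next_page",      # a single flap turns to the next page
        2: "previous_page",  # further counts may be bound to other functions
    }

    def handle_flap(flap_count):
        # Returns the function bound to the given flap count, if any.
        return FLAP_ACTIONS.get(flap_count)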
These gestures are recognized using one or more sensing units placed on the screens of the dual screen device. Figure 2 illustrates a dual screen device 200 with one or more user interface elements placed on the dual screen for performing one or more user interactions, according to one embodiment. As shown in Figure 2, paired and synced gyrometers 202, one placed on each screen of the dual screen device, are adapted to identify the tilt gesture. A set of magnetic strips 204, one on each screen pane, is adapted to identify the folding gesture, and paired electromagnetic induction coils 206 across the screen panes are adapted to identify flap gestures. In one embodiment, the one or more gestures are recognized using the following formula:
Gesture = f(tilt) + f(fold) + f(flap)
        = Δ(G_Lt − G_Rt) + Δ(MS_Lt − MS_Rt) + Δ(EC_Lt − EC_Rt)
where,
G – differential gyrometer output,
MS – differential magnetic strip output,
EC – differential electromagnetic induction coil output,
and the subscripts L and R denote the sensors on the left and right screens respectively, sampled at time t.
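A minimal sketch of this recognition scheme is given below, assuming hypothetical read_*() helpers that return the paired left/right sensor samples, and illustrative thresholds; the actual sensor interfaces and calibration are not specified by this embodiment:

    # Sketch of differential-sensor gesture recognition.
    TILT_THRESHOLD = 15.0  # degrees; assumed value
    FOLD_THRESHOLD = 0.5   # normalized magnetic-strip proximity; assumed
    FLAP_THRESHOLD = 0.3   # normalized induction-coil coupling; assumed

    def recognize_gesture(read_gyro, read_magnetic, read_coil):
        """Classify a gesture from the left/right sensor differentials."""
        g_l, g_r = read_gyro()        # paired gyrometer outputs (G_Lt, G_Rt)
        ms_l, ms_r = read_magnetic()  # magnetic strip outputs (MS_Lt, MS_Rt)
        ec_l, ec_r = read_coil()      # induction coil outputs (EC_Lt, EC_Rt)

        if abs(ms_l - ms_r) > FOLD_THRESHOLD:  # Δ(MS_Lt − MS_Rt)
            return "fold"
        if abs(ec_l - ec_r) > FLAP_THRESHOLD:  # Δ(EC_Lt − EC_Rt)
            return "flap"
        if abs(g_l - g_r) > TILT_THRESHOLD:    # Δ(G_Lt − G_Rt)
            return "tilt"
        return None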
Figure 3 is a flowchart illustrating an exemplary method of rendering user interactions on a dual screen device, according to one embodiment. At step 302, at least one gesture provided by a user on a dual screen device having a first screen and a second screen is detected. The gesture comprises one of a folding action, a flapping action, a tilting action, or a combination of these gestures for controlling one or more ongoing tasks on the dual screen device. After detecting the gesture, at step 304, it is determined whether the focus of the user is on the first screen or the second screen.
For example, consider that the user is currently accessing a gaming application on the right screen and a browser application on the left screen. If the user wishes to use the browser application, the user can simply tilt the left screen and type a query on it. This is possible because the method monitors the focus made by the user on a particular screen. In one embodiment, the focus of the user is determined based on the eye movement and head direction of the user relative to the respective screen. In one embodiment, a comparative tilt angle of the two display screens is monitored to detect the focus made by the user on the respective screen.
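A minimal sketch of this focus determination is shown below, assuming illustrative inputs (a horizontal gaze estimate and per-screen tilt angles relative to the base position); the names and thresholds are assumptions:

    # Sketch: determine the focused screen from gaze and comparative tilt.
    def determine_focus(gaze_x, left_tilt, right_tilt, tilt_margin=5.0):
        # A screen tilted noticeably more than its pair suggests focus on it.
        if left_tilt - right_tilt > tilt_margin:
            return "left"
        if right_tilt - left_tilt > tilt_margin:
            return "right"
        # Otherwise fall back to the eye-gaze / head-direction estimate,
        # where gaze_x < 0 points toward the left screen.
        return "left" if gaze_x < 0 else "right"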
At step 306, one or more actions associated with the at least one detected gesture are performed on the screen focused by the user. The one or more actions comprise copying an entity present on the first screen to the second screen, saving a work in progress on the screen, changing the orientation of the screen, and closing an application performed on the screen.
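Steps 302 to 306 can be summarized by the following sketch; the action table is illustrative and the concrete bindings depend on the embodiment:

    # Sketch of the Figure 3 flow: detect a gesture, determine the
    # focused screen, then perform the associated action on it.
    ACTIONS = {
        "fold": "copy_or_save",       # align-fold-glue or shut-save
        "tilt": "change_orientation",
        "flap": "close_application",
    }

    def render_interaction(gesture, focused_screen):
        action = ACTIONS.get(gesture)
        if action is not None:
            print(f"performing {action} on the {focused_screen} screen")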
Figure 4 is a schematic diagram illustrating an exemplary align-fold-glue folding interaction on a dual screen device, according to one embodiment. The align-fold-glue interaction allows a user to copy an entity present on one screen to the other screen. As shown in Figure 4, consider that there are two independent applications running on the two screens: a navigation map application on the left screen and a browser application on the right screen. The user is viewing a landmark address using the browser application on the right screen. Now, the user wants to locate the landmark address on the navigation map application running on the left screen. For this, the user can simply perform the align-fold-glue interaction without removing the hands from the device. On receipt of this align-fold-glue interaction, the dual screen device copies the landmark address to the map application and renders the output to the user on the left screen.
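In code form, the align-fold-glue behavior amounts to handing the selected entity of the source screen to the application on the other screen; the screen and application objects below are hypothetical:

    # Sketch of the align-fold-glue copy on a fold gesture.
    def align_fold_glue(source_screen, target_screen):
        entity = source_screen.selected_entity()  # e.g. a landmark address
        target_screen.active_app.receive(entity)  # e.g. locate it on the map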
Figure 5 is a schematic diagram illustrating an exemplary shut-save folding interaction on a dual screen device, according to one embodiment. The shut-save folding interaction is used in a scenario where two independent applications are running on the two screens and the user wishes to save and close the content/application present on one of the screens. To achieve this, the user performs the shut-save interaction, which is based on the base gesture of folding. If the user folds the left screen over the right screen, then the content/application on the right screen is saved, while if the user folds the right screen over the left screen, then the content/application on the left screen is saved. At the same time, the content/application on the folding screen is closed: if the user folds the left screen over the right screen, then the content/application on the left screen is closed, and if the user folds the right screen over the left screen, the content/application on the right screen is closed.
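Under this reading, the shut-save logic can be sketched as follows; the screen objects and method names are assumptions for illustration:

    # Sketch: the screen folded over is closed, while the screen
    # underneath has its work saved.
    def shut_save(fold_direction, left_screen, right_screen):
        if fold_direction == "left_over_right":
            left_screen.close_active()   # folding screen: close
            right_screen.save_active()   # underlying screen: save
        elif fold_direction == "right_over_left":
            right_screen.close_active()
            left_screen.save_active()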
Figure 6 is a schematic diagram illustrating one or more gestures being performed on a touch enabled screen strip at the back of a dual screen device, according to one embodiment. The present invention allows the touch enabled screen strip at the back of the dual screen device to perform one or more actions based on the one or more user interactions. At first, at least one gesture comprising a folding action, a tilting action or a flapping action is detected on the dual screen device to identify the screen that has received the gesture. After the screen is identified, the touch enabled screen strip at the back of the dual screen device detects one or more gestures to perform one or more actions on the identified screen. In response to the detected gesture, one or more actions are performed on the identified screen based on the one or more user interactions received on the touch enabled screen strip, where the identified screen includes one of the first screen and the second screen. The one or more user interactions received on the touch enabled screen strip comprise at least one of a left-to-right swipe gesture for performing a horizontal scrolling action on the first or second screen and an up-down swipe gesture for performing a vertical scrolling action on the first or second screen.
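A minimal sketch of the back-strip mapping is given below; the direction names and the screen scrolling API are assumptions:

    # Sketch: map back-strip swipes to scrolling on the identified screen.
    def handle_strip_swipe(direction, identified_screen):
        if direction in ("left", "right"):
            identified_screen.scroll_horizontal(direction)  # left-right swipe
        elif direction in ("up", "down"):
            identified_screen.scroll_vertical(direction)    # up-down swipe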
In an exemplary operation, consider that the user is accessing a browser application on both screens, wherein on the left screen the user checks for train availability and on the right screen the user checks for flight availability. If the user wants to book a flight ticket, the user can simply tilt the right screen and perform a swipe gesture on the touch enabled screen strip at the back of the dual screen device. In turn, the touch enabled screen strip performs a selection action to select a particular flight. Thus, the user can perform one or more actions on the respective screen without removing his hands from the dual screen device. Similarly, a scrolling action can be performed via the touch enabled screen strip upon receiving a scrolling gesture.
In another embodiment, the user provides voice based commands for interacting with the dual screen device. For example, the user may provide a voice command to the voice assistant, say, “take a screenshot and save it in my drive”. The dual screen device identifies the screen the user is focused on and performs the desired action on the identified screen.
Although the method and system of the invention have been described in connection with the embodiments of the present invention illustrated in the accompanying drawings, the invention is not limited thereto. It will be apparent to those skilled in the art that various substitutions, modifications and changes may be made thereto without departing from the scope and spirit of the invention.