Abstract: The present invention relates to a method and a device for contextually invoking an application. Accordingly, the method comprises: receiving (101) a single user-input on the electronic device; identifying (102) fingerprint information and an application from the single user-input; determining (103) existence of a predefined mapping between the fingerprint information and the application; identifying (104) a functionality of the application associated with the fingerprint information and the application in case of positive determination, the functionality being predefined for the application; and invoking (105) the functionality of the application based on said identification.
Description:
TECHNICAL FIELD
The present invention relates to a method for processing an input on an electronic device and the electronic device thereof. More particularly, the present invention relates to a method for contextually invoking an application on the electronic device in accordance with the input.
BACKGROUND
With the advent of technology, biometric identification has gained increasing popularity as a security mechanism in various electronic devices, and especially in smart phones. Of the various forms of biometric identification now available, fingerprint identification has gained wide acceptance. Consequently, smart phones and other electronic devices such as laptops, notebooks, tablets, and personal digital assistants (PDAs) are increasingly being manufactured with fingerprint sensors. Thus, fingerprint identification is used to unlock smart phones and allow access to them.
Presently, in addition to providing access to smart phones, i.e. providing device access, fingerprint information is being used for providing access to applications available on the smart phones. In one solution, a user may configure various applications to be hidden or revealed upon swiping of a correspondingly enrolled finger. Additionally, if an application is accessible from a secondary menu, or by pressing designated hotkeys, such an application may also be hidden or revealed by swiping the correspondingly enrolled finger. In one example, a particular message can be hidden or viewed in a messaging application upon swiping a right little finger. In another example, a particular email can be hidden or viewed in an email application upon swiping a left index finger. In addition, the user may configure different operations of the smart phone upon swiping of a correspondingly enrolled finger.
In another solution, a user can select fingerprint-based shortcut keys for one or more functions predefined in a function list. The function list may include, for example, an unlock function, an application execution function, an application sorting function, a SIM switching function, a user account change function, an Operating System (OS) change function, an object insertion function, a personal authentication function, a multimedia control function, an input interface change function, and a mode change function.
In yet another solution, a user can select a stationary-finger mode for performing activities that do not involve sensitive information, or non-financial activities. Examples of such non-financial activities include launching a game application, operating a game application, launching an application, performing an operation in an application, performing an operating system function, and unlocking the device. Similarly, the user can select a moving-finger mode for performing activities that involve sensitive information, or financial activities. An example of such a financial activity is completing a wireless payment transaction.
However, these solutions are directed towards associating different fingers with different applications or tasks selected from a predefined list, which is global for all applications or tasks available on the smart phone. Therefore, these solutions are inflexible and non-expandable.
Further, each application has multiple functionalities that the user might use frequently. Such functionalities can be accessed from menu items only by performing redundant steps. However, the present solutions do not cater to defining application-specific functionalities for corresponding fingers.
Thus, there exists a need for a solution that can be configured for a specific application and yet can be easily expanded to other applications.
SUMMARY OF THE INVENTION
In accordance with the purposes of the invention, the present invention as embodied and broadly described herein enables contextual invoking of an application.
Accordingly, a user-input corresponding to defining a contextual association for an application is received from within the application. Thereafter, a selection of a functionality from amongst a plurality of functionalities predefined for the application is received, along with fingerprint information. Upon receiving, the fingerprint information is associated with the selected functionality, such that the functionality associated with the fingerprint information is invoked on the application upon receiving a single user-input, said single user-input corresponding to the fingerprint information and a selection of the application. To this end, upon receiving the single user-input, fingerprint information and an application are identified from the single user-input. Upon identifying, the existence of a predefined mapping between the fingerprint information and the application is determined. In case of positive determination, the functionality of the application associated with the fingerprint information is identified, and correspondingly the application is invoked in accordance with the identified functionality.
The advantages of the invention include, but are not limited to, defining a fingerprint association to accommodate application-specific context for invoking any application according to the different functionalities available within the application. In addition, the process of accessing various functionalities is made easier, as a user can associate frequently used functionalities with different fingerprint information. Further, the user can associate the same fingerprint information with the same or different functions of different applications, thereby providing a flexible and easily expandable solution and a better user-experience.
These aspects and advantages will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings and claims.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS:
To further clarify the advantages and aspects of the invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings, which are listed below for quick reference.
Figures 1(a) to 1(c) illustrate an exemplary method for contextually invoking an application on an electronic device, in accordance with an embodiment of the present invention.
Figure 2 illustrates an exemplary method for defining a contextual association for an application, in accordance with an embodiment of the present invention.
Figure 3 illustrates an exemplary electronic device implementing the methods described in Figures 1 and 2, in accordance with an embodiment of the present invention.
Figures 4(a) to 4(e) illustrate screenshots of an electronic device implementing the method as illustrated in Figure 2, in accordance with one aspect of an embodiment of the invention.
Figure 5 illustrates screenshots of an electronic device implementing the method as illustrated in Figure 2, in accordance with another aspect of an embodiment of the invention.
Figures 6(a) and 6(b) illustrate screenshots of an electronic device implementing the method as illustrated in Figures 1(a) to 1(c), in accordance with one aspect of an embodiment of the invention.
Figure 7 illustrates a screenshot of an electronic device implementing the method as illustrated in Figures 1(a) to 1(c), in accordance with another aspect of an embodiment of the invention.
Figures 8(a) and 8(b) illustrate screenshots corresponding to a first exemplary manifestation depicting the implementation of the invention.
Figures 9(a) and 9(b) illustrate screenshots corresponding to a second exemplary manifestation depicting the implementation of the invention.
Figures 10(a) and 10(b) illustrate screenshots corresponding to a third exemplary manifestation depicting the implementation of the invention.
Figures 11(a) and 11(b) illustrate screenshots corresponding to a fourth exemplary manifestation depicting the implementation of the invention.
Figure 12 illustrates a typical hardware configuration of an electronic device, which is representative of a hardware environment for practicing the present invention.
It may be noted that to the extent possible, like reference numerals have been used to represent like elements in the drawings. Further, those of ordinary skill in the art will appreciate that elements in the drawings are illustrated for simplicity and may not have been necessarily drawn to scale. For example, the dimensions of some of the elements in the drawings may be exaggerated relative to other elements to help to improve understanding of aspects of the invention. Furthermore, the one or more elements may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.
DETAILED DESCRIPTION
It should be understood at the outset that although illustrative implementations of the embodiments of the present disclosure are illustrated below, the present invention may be implemented using any number of techniques, whether currently known or in existence. The present disclosure should in no way be limited to the illustrative implementations, drawings, and techniques illustrated below, including the exemplary design and implementation illustrated and described herein, but may be modified within the scope of the appended claims along with their full scope of equivalents.
The term “some” as used herein is defined as “none, or one, or more than one, or all.” Accordingly, the terms “none,” “one,” “more than one,” “more than one, but not all” or “all” would all fall under the definition of “some.” The term “some embodiments” may refer to no embodiments or to one embodiment or to several embodiments or to all embodiments. Accordingly, the term “some embodiments” is defined as meaning “no embodiment, or one embodiment, or more than one embodiment, or all embodiments.”
The terminology and structure employed herein is for describing, teaching and illuminating some embodiments and their specific features and elements and does not limit, restrict or reduce the spirit and scope of the claims or their equivalents.
More specifically, any terms used herein such as but not limited to “includes,” “comprises,” “has,” “consists,” and grammatical variants thereof do NOT specify an exact limitation or restriction and certainly do NOT exclude the possible addition of one or more features or elements, unless otherwise stated, and furthermore must NOT be taken to exclude the possible removal of one or more of the listed features and elements, unless otherwise stated with the limiting language “MUST comprise” or “NEEDS TO include.”
Whether or not a certain feature or element was limited to being used only once, either way it may still be referred to as “one or more features” or “one or more elements” or “at least one feature” or “at least one element.” Furthermore, the use of the terms “one or more” or “at least one” feature or element do NOT preclude there being none of that feature or element, unless otherwise specified by limiting language such as “there NEEDS to be one or more . . . ” or “one or more element is REQUIRED.”
Unless otherwise defined, all terms, and especially any technical and/or scientific terms, used herein may be taken to have the same meaning as commonly understood by one having an ordinary skill in the art.
Reference is made herein to some “embodiments.” It should be understood that an embodiment is an example of a possible implementation of any features and/or elements presented in the attached claims. Some embodiments have been described for the purpose of illuminating one or more of the potential ways in which the specific features and/or elements of the attached claims fulfil the requirements of uniqueness, utility and non-obviousness.
Use of the phrases and/or terms such as but not limited to “a first embodiment,” “a further embodiment,” “an alternate embodiment,” “one embodiment,” “an embodiment,” “multiple embodiments,” “some embodiments,” “other embodiments,” “further embodiment”, “furthermore embodiment”, “additional embodiment” or variants thereof do NOT necessarily refer to the same embodiments. Unless otherwise specified, one or more particular features and/or elements described in connection with one or more embodiments may be found in one embodiment, or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments. Although one or more features and/or elements may be described herein in the context of only a single embodiment, or alternatively in the context of more than one embodiment, or further alternatively in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all. Conversely, any features and/or elements described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.
Any particular and all details set forth herein are used in the context of some embodiments and therefore should NOT be necessarily taken as limiting factors to the attached claims. The attached claims and their legal equivalents can be realized in the context of embodiments other than the ones used as illustrative examples in the description below.
Figures 1(a) to 1(c) illustrate an exemplary method (100) for contextually invoking an application on an electronic device, in accordance with an embodiment of the present invention. Referring to Figure 1(a), in said embodiment, the method (100) comprises: receiving (101) a single user-input on the electronic device; identifying (102) fingerprint information and an application from the single user-input; determining (103) existence of a predefined mapping between the fingerprint information and the application; identifying (104) a functionality of said application associated with the fingerprint information and the application, in case of positive determination, the functionality being predefined for the application; and invoking (105) the functionality of said application based on said identification.
Further, in one aspect of the invention, the single user-input is a touch based input received via a biometric sensing unit available in the electronic device.
Further, in another aspect of the invention, the single user-input is a combination of touch based input received on the electronic device corresponding to a selection of the application and an input received from an external electronic device coupled with the electronic device. Furthermore, the external electronic device includes a wearable device and an input device.
Further, the fingerprint information includes a combination of one of a human hand and one or more digits of said human hand.
Referring to Figure 1(b), the method (100) further comprises: determining (106) whether security is enabled for the application, in case of negative determination; and preventing (107) invoking of the application.
Referring to Figure 1(c), the method (100) further comprises: determining (108) whether security is disabled for the application, in case of negative determination; and invoking (109) a default functionality associated with the application.
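By way of illustration only, the following Python sketch models the control flow of steps (101) to (109) across Figures 1(a) to 1(c). All function names and data structures here are hypothetical assumptions made for this sketch; the specification prescribes no particular implementation.

```python
# Minimal sketch of method (100); names and structures are illustrative only.

def invoke_contextually(single_user_input, mappings, security_info):
    # Steps 101-102: receive the single user-input and identify from it the
    # fingerprint information and the selected application.
    fingerprint = single_user_input["fingerprint"]   # e.g. ("right", 3)
    app = single_user_input["application"]           # e.g. "browser"

    # Step 103: determine existence of a predefined mapping between the
    # fingerprint information and the application.
    functionality = mappings.get((app, fingerprint))

    if functionality is not None:
        # Steps 104-105: positive determination; invoke the functionality
        # predefined for the application.
        return f"invoking {app} with functionality '{functionality}'"

    # Steps 106-109: negative determination; consult the security
    # information stored for the application.
    if security_info.get(app, False):
        return f"security enabled: invocation of {app} prevented"  # step 107
    return f"invoking default functionality of {app}"              # step 109


# Example: right-hand middle finger (digit 3) mapped to the browser.
mappings = {("browser", ("right", 3)): "open in privacy mode"}
print(invoke_contextually(
    {"fingerprint": ("right", 3), "application": "browser"},
    mappings, security_info={"mail": True}))
```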
Figure 2 illustrates an exemplary method (200) for defining a contextual association to invoke an application on an electronic device, according to one embodiment. In said embodiment, the method (200) comprises: receiving (201), from within an application, a user-input corresponding to defining a contextual association for the application; receiving (202) a selection of a functionality from amongst a plurality of functionalities predefined for the application; receiving (203) fingerprint information; and associating (204) the fingerprint information with the selected functionality, such that the functionality associated with the fingerprint information is invoked on the application upon receiving a single user-input, said single user-input corresponding to the fingerprint information and a selection of the application.
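A corresponding sketch of steps (201) to (204), under the same illustrative assumptions as the previous sketch, might register the association as follows; the validation against the predefined list reflects step (202).

```python
# Minimal sketch of method (200): associating fingerprint information with a
# functionality selected from the plurality predefined for the application.

def define_contextual_association(app, predefined, functionality,
                                  fingerprint, mappings):
    # Step 202: the selection must come from the functionalities
    # predefined for the application.
    if functionality not in predefined[app]:
        raise ValueError("functionality not predefined for this application")
    # Steps 203-204: receive the fingerprint information and associate it
    # with the selected functionality.
    mappings[(app, fingerprint)] = functionality


mappings = {}
predefined = {"music": ["auto-start", "auto-start with shuffling"]}
define_contextual_association("music", predefined,
                              "auto-start with shuffling",
                              ("left", 2), mappings)
```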
Figure 3 illustrates an exemplary electronic device (300) implementing the aforesaid methods as illustrated in Figures 1 and 2, in accordance with an embodiment of the present invention. Examples of the electronic device (300) include a smart phone, laptop, tablet, smart or Internet Protocol television (IPTV), printer, and Personal Digital Assistant (PDA). In such embodiment, the electronic device (300) includes a display unit (301) adapted to depict various elements such as images, text, and video. Examples include, but are not limited to, depicting a list of applications available on the electronic device (300), depicting a user-interface corresponding to each of the applications available on the electronic device (300), and depicting various features of the electronic device (300). Examples of the display unit (301) include a liquid crystal display (LCD), an organic light emitting diode (OLED) display, a flat panel display, and others as known in the art or later developed.
According to the present invention, the electronic device (300) implements a method, as described in Figure 1 above, for contextually invoking an application available in the electronic device (300). As such, the electronic device (300) further includes a receiving unit (302) adapted to receive a single user-input on the electronic device (300). In one aspect of the invention, the receiving unit (302) includes a biometric sensing unit (303) adapted to receive the single user-input via a touch-based gesture input, such as dragging an icon of the application depicted on the display unit (301). In another aspect of the invention, the receiving unit (302) includes a touch-input receiving unit (304) and a device-input receiving unit (305). The touch-input receiving unit (304) recognizes a touch-based input received from a finger or an input device, such as a fingerprint-aware stylus coupled to the electronic device (300). Similarly, the device-input receiving unit (305) recognizes an input received from an external device coupled to the electronic device (300). Examples of such an external device include, but are not limited to, a fingerprint-aware stylus and a wearable device. Thus, in said aspect, the single user-input is received as a combination of inputs received by the touch-input receiving unit (304) and the device-input receiving unit (305). An example of such a single user-input includes, but is not limited to, tapping on an icon of the application depicted on the display unit (301) by using a fingerprint-aware stylus.
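Although the specification prescribes no particular software structure, the two input paths described above can be pictured with the following hypothetical sketch, in which the single user-input arrives either from the biometric sensing unit (303) alone or as a combination of the touch-input receiving unit (304) and the device-input receiving unit (305). All names here are assumptions of the sketch.

```python
# Hypothetical model of the receiving unit (302) and its two input paths.

class ReceivingUnit:
    def from_biometric_sensor(self, scan, dragged_app):
        # First aspect: a touch-based gesture (e.g. dragging an icon onto
        # the sensor) yields fingerprint and application in one gesture.
        return {"fingerprint": scan, "application": dragged_app}

    def from_touch_and_device(self, touch_event, device_report):
        # Second aspect: the tap selects the application, while the coupled
        # external device (stylus or wearable) supplies the fingerprint.
        return {"fingerprint": device_report["fingerprint"],
                "application": touch_event["application"]}
```

Either path yields the same single user-input structure consumed by the analysing unit described below.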
Further, the electronic device (300) includes an analysing unit (306) coupled to the receiving unit (302). Upon receiving the single user-input, the analysing unit (306) is adapted to identify fingerprint information and an application from the single user-input. The analysing unit (306) is further adapted to determine the existence of a predefined mapping between the fingerprint information and the application. The analysing unit (306) is further adapted to identify a functionality of said application associated with the fingerprint information and the application, in case of positive determination, the functionality being predefined for the application.
Further, the electronic device (300) includes an application launching unit (307) coupled to the analysing unit (306) to invoke the functionality of said application based on said identification.
Further, the electronic device (300) includes a memory (308) coupled to the analysing unit (306). The memory (308) includes data corresponding to the predefined mapping (309). The memory (308) further includes data corresponding to security information (310) and other data (311). The data corresponding to security information (310) may be stored when security and/or authentication features of the applications are enabled. Further, the other data (311) may include data such as device information.
Further, the electronic device (300) includes a processing unit (312) adapted to perform necessary functions of the electronic device (300) and to control the functions of the above-mentioned units of the electronic device (300).
Further, the electronic device (300) implements a method for defining contextual association, as described in figure 2 above, such that an application can be contextually invoked, as described in figure 1 above. Thus, the contextual association defines a mapping between fingerprint information and an application. Accordingly, the receiving unit (302) is further adapted to receive, from within an application, a user-input corresponding to defining contextual association for the application. In one aspect of the invention, the touch-input receiving unit (304) receives the user-input. Examples of the user-input include touch based gesture input, non-touch based gesture input, and input received from the input device.
Upon receiving the user-input, the receiving unit (302) is further adapted to receive a selection of functionality from amongst a plurality of functionalities predefined for the application. Further, the receiving unit (302) is adapted to receive fingerprint information. In one aspect of the invention, the touch-input receiving unit (304) receives the fingerprint information. In another aspect of the invention, the biometric sensing unit (303) receives the fingerprint information.
Upon receiving the above mentioned information, the analysing unit (306) is further adapted to associate the fingerprint information with the selected functionality. Upon associating, the analysing unit (306) saves the information as the data corresponding to the predefined mapping (309) in the memory (308). Thus, upon receiving a single user-input corresponding to a fingerprint information and a selection of the application, the application is invoked in accordance with a functionality associated with the fingerprint information and the application.
It would be understood that the electronic device (300), the display unit (301), the receiving unit (302), and the processing unit (312) may include various hardware modules/units/components or software modules or a combination of hardware and software modules as necessary for implementing the invention.
Further, the analysing unit (306) and the application launching unit (307) can be implemented as hardware modules or software modules or a combination of hardware and software modules. In one aspect of the invention, the analysing unit (306) and the application launching unit (307) can be implemented as different entities, as depicted in the figure. In another aspect of the invention, the analysing unit (306) and the application launching unit (307) can be implemented as a single entity performing the functions of both the analysing unit (306) and the application launching unit (307). Further, in one aspect of the invention, the analysing unit (306) and the application launching unit (307) can be implemented as forming a part of the processing unit (312). In another aspect of the invention, the analysing unit (306) and the application launching unit (307) can be implemented as forming a part of the memory (308).
For ease of understanding, the forthcoming descriptions of Figures 4-6 illustrate defining a contextual association between fingerprint information and an application, as described in Figures 2 and 3 above, such that the application can be contextually invoked. The application is designed to provide various services/functionalities to a user, with or without accessing data via a network. Examples of the applications include, but are not limited to, music applications, chat applications, mail applications, browser applications, messaging applications, e-commerce applications, social media applications, data based media applications, location-based service (LBS) applications, print/scan/fax applications, and search applications. Such applications can be either downloaded onto the electronic device (300) or preloaded in the electronic device (300).
Figures 4(a) to 4(e) illustrate screenshots of an electronic device (400) implementing the method as illustrated in Figure 2, in accordance with one aspect of an embodiment of the invention. In such aspect, the application supports defining of a contextual association. The electronic device (400) includes units/components (not shown in the figure) as described in reference to Figure 3 above.
Referring to Figure 4(a), the display unit (301) displays a user-interface (401) corresponding to the application. Further, the user-interface (401) provides an icon (402) for defining a contextual association. For ease of understanding, the application is a music application and the user-interface (401) displays a list of playlists previously created and stored by the user on the electronic device (400). As would be understood, the display unit (301) displays the user-interface (401) upon invoking the application or performing an action within the application.
Accordingly, the receiving unit (302) receives a user-input corresponding to selection of the icon (402). In an example, the user-input is a touch-based input provided by tapping on the icon (402) using a finger or an input device such as a stylus. In such example, the touch-input receiving unit (304) receives the user-input.
Upon receiving the user-input, the analysing unit (306) provides a user-interface (403) for defining the contextual association for the application. The user-interface (403) provides one or more user-actionable items (404) for providing fingerprint information and a selection of a functionality. Accordingly, the user-actionable item (404-1) receives information about the human hand, i.e. left hand or right hand. Further, the user-actionable item (404-2) receives information about the finger or digit of the human hand, i.e. a number ranging from 1 to 5. In one example, 1 represents the thumb, 2 represents the index finger, 3 represents the middle finger, 4 represents the ring finger, and 5 represents the little finger. As such, the user-actionable items (404-1) and (404-2) provide predefined lists for receiving the corresponding information about the hand and digits of the human hand. Furthermore, the user-actionable item (404-3) receives a selection of a functionality. As such, the user-actionable item (404-3) provides a predefined list of functionalities associated with the application for receiving the selection. In the exemplary music application, the predefined list of functionalities for each playlist can include, but is not limited to, auto-start and auto-start with shuffling.
Accordingly, the receiving unit (302) receives the fingerprint information and the selected functionality. In one aspect of the invention, the biometric sensing unit (303) receives the fingerprint information when the user scans a finger on the biometric sensing unit (303). In such aspect, the user-actionable items (404-1) and (404-2) are automatically selected based on the scanning. In another aspect of the invention, the touch-input receiving unit (304) receives the fingerprint information when the user selects a hand and a digit of the selected hand at user-actionable items (404-1) and (404-2) by using a finger or an input device such as a stylus.
Upon receiving the fingerprint information and the selected functionality, the analysing unit (306) associates the fingerprint information with the selected functionality of the application. Further, the analysing unit (306) stores a mapping of the fingerprint information associated with the selected functionality in the memory (308) as data corresponding to the predefined mapping (309). In the present aspect of the invention, the analysing unit (306) stores the predefined mapping as metadata associated with the application. In such aspect, the application receives the association information and saves it as metadata in the form of a table.
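For concreteness only, such a per-application metadata table could take the following shape, with the hand and digit encoded as in the user-actionable items (404-1) and (404-2); the field names are assumptions of this sketch and are not prescribed by the specification.

```python
# Illustrative shape of the metadata table holding the predefined mapping
# (309) for the exemplary music application. Digits: 1 = thumb ... 5 = little.

music_mapping = [
    {"hand": "left", "digit": 2, "functionality": "auto-start"},
    {"hand": "left", "digit": 3, "functionality": "auto-start with shuffling"},
]

def lookup(table, hand, digit):
    # Return the functionality associated with the scanned finger, if any.
    for row in table:
        if row["hand"] == hand and row["digit"] == digit:
            return row["functionality"]
    return None

print(lookup(music_mapping, "left", 3))  # auto-start with shuffling
```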
Thus, different functionalities of the application can be associated with each finger of a human hand. Accordingly, Figure 4(b) illustrates a human hand (405) such that each finger is associated with a different functionality of an application. Functionality App F1 is associated with the thumb, functionality App F2 is associated with the index finger, functionality App F3 is associated with the middle finger, functionality App F4 is associated with the ring finger, and functionality App F5 is associated with the little finger. Further, different functionalities of the application can be associated with combinations of fingers. Accordingly, Figure 4(c) illustrates a fingerprint-aware stylus (406) being held with three fingers (407), which are associated with functionality App F6 of the application. Similarly, Figure 4(d) illustrates the fingerprint-aware stylus (406) being held with four fingers (408), which are associated with functionality App F7 of the application. Similarly, Figure 4(e) illustrates the fingerprint-aware stylus (406) being held with two fingers (409), which are associated with functionality App F7 of the application.
Figure 5 illustrates screenshots of an electronic device (500) implementing the method as illustrated in Figure 2, in accordance with another aspect of an embodiment of the invention. In such aspect, the application does not support defining of a contextual association. The electronic device (500) includes units/components (not shown in the figure) as described in reference to Figure 3 above.
Accordingly, the display unit (301) displays a user-interface (501) corresponding to the application. The display unit (301) further displays a menu (502) comprising a predefined list of functionalities associated with the application. For ease of understanding, the application is a browser application and the user-interface (501) displays a web page accessed via the browser application. Similarly, the menu (502) displays predefined functionalities associated with the browser application. Examples of the predefined functionalities include, but are not limited to, opening a new tab, opening a new incognito tab, bookmarking the web page, printing, and finding in the web page. As would be understood, the display unit (301) displays the user-interface or the menu upon invoking the application or performing an action within the application.
Further, the receiving unit (302) receives a user-input (503) on a functionality selected from the menu (502). The user-input (503) corresponds to defining of a contextual association for the application. In an example, the user-input is a touch-based gesture input provided by a finger or an input device such as a stylus. In such example, the touch-input receiving unit (304) receives the user-input. Examples of the touch-based gesture input include, but are not limited to, a long press and a circle draw. In the exemplary browser application, the user-input (503) is received on the functionality 'open a new incognito tab'.
Upon receiving the user-input (503), the analysing unit (306) provides a user-interface (504) for defining the contextual association for the application. The user-interface (504) provides one or more user-actionable items (505) for providing fingerprint information. As would be understood from above, the user-input (503) is received on the selected functionality, thereby providing the information corresponding to the selected functionality to the analysing unit (306). Accordingly, the user-actionable item (505-1) receives information about the human hand, i.e. left hand or right hand. Further, the user-actionable item (505-2) receives information about the finger or digit of the human hand, i.e. a number ranging from 1 to 5. In one example, 1 represents the thumb, 2 represents the index finger, 3 represents the middle finger, 4 represents the ring finger, and 5 represents the little finger. As such, the user-actionable items (505-1) and (505-2) provide predefined lists for receiving the corresponding information about the hand and digits of the human hand.
Accordingly, the receiving unit (302) receives the fingerprint information. In one aspect of the invention, the biometric sensing unit (303) receives the fingerprint information when the user scans a finger on the biometric sensing unit (303). In such aspect, the user-actionable items (505-1) and (505-2) are automatically selected based on the scanning. In another aspect of the invention, the touch-input receiving unit (304) receives the fingerprint information when the user selects a hand and a digit of the selected hand at the user-actionable items (505-1) and (505-2) by using a finger or an input device such as a stylus.
Upon receiving the fingerprint information, the analysing unit (306) associates the fingerprint information with the selected functionality of the application. Further, the analysing unit (306) stores a mapping of the fingerprint information associated with the selected functionality in the memory (308) as data corresponding to the predefined mapping (309). In another example, a functionality of 'include signature before sending an email' can be associated with the 'send' button available on an email composing user-interface. Thus, the functionality 'include signature before sending an email' will be invoked when the 'send' button is selected, and the email will be sent after including the signature of the user, thereby authenticating the sender of the email.
Thus, the analysing unit (306), as described in Figures 4 and 5, defines and saves the contextual association between the fingerprint information and the selected functionality of the application as a predefined mapping. In one aspect, the predefined mapping is stored as metadata associated with the application, in the form of a first table. In another aspect, the predefined mapping is stored in the form of a second table, separate from the metadata associated with the application. In addition, a priority can be assigned to the first table and the second table, such that the priority of the second table is higher than the priority of the first table.
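The prioritised lookup over the two tables can be sketched as follows, again under illustrative assumptions: because the second (separately stored) table carries the higher priority, its entries shadow those of the first (metadata) table.

```python
# Illustrative prioritised lookup across the first and second tables, each
# keyed here by (application, fingerprint information).

def resolve_functionality(app, fingerprint, first_table, second_table):
    for table in (second_table, first_table):   # higher priority first
        functionality = table.get((app, fingerprint))
        if functionality is not None:
            return functionality
    return None  # no predefined mapping exists (negative determination)
```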
For ease of understanding, the forthcoming descriptions of Figures 6 and 7 illustrate screenshots of an electronic device implementing the method as illustrated in Figures 1(a) to 1(c).
Figures 6(a) and 6(b) illustrate screenshots (600) of an electronic device (601) implementing the method as illustrated in Figures 1(a) to 1(c), in accordance with one aspect of an embodiment of the invention. The electronic device (601) includes units/components (not shown in the figure) as described in reference to Figure 3 above.
For the present aspect, as illustrated in Figure 6(a), the electronic device (601) includes the biometric sensing unit (303) such that the biometric sensing unit (303) is located at a bottom panel (602) of the electronic device (601). Further, in the present aspect, the electronic device (601) may or may not include the touch-input receiving unit (304) and the device-input receiving unit (305). Further, the display unit (301) displays one or more applications installed in the electronic device (601). For ease of reference and sake of brevity, nine applications are depicted as A1, A2, A3, A4, A5, A6, A7, A8, and A9. It would be understood that more than nine applications can be depicted on the display unit (301).
According to the present invention, a user provides a single user-input for contextually invoking an application. In the present aspect, the biometric sensing unit (303) receives the single user-input as a touch-based gesture input, such as dragging an icon using a finger. As illustrated in Figure 6(a), the user selects an icon of an application A4 (represented using a dashed circle) and drags (603) the icon using a finger to the biometric sensing unit (303).
Consequently, as illustrated in Figure 6(b), the biometric sensing unit (303) scans the finger and the analysing unit (306) determines the fingerprint information using methods as known in the art (illustrated using reference numeral 604). Simultaneously, the analysing unit (306) identifies the application being dragged (illustrated using reference numeral 605). In an example, the analysing unit (306) provides a bucket user-interface (606) which catches the application being dragged towards the biometric sensing unit (303). Though Figures 6(a) and 6(b) illustrate the use of a single finger, it is to be understood that more than one finger can be used.
Thus, from a single user-input, i.e., dragging the application towards the biometric sensing unit (303), the analysing unit (306) identifies the fingerprint information and the application. Upon identifying, the analysing unit (306) determines the existence of a predefined mapping between the fingerprint information and the application. Accordingly, the analysing unit (306) analyses the first table and the second table, as described earlier in reference to Figures 4 and 5, based on a predefined priority associated with the first and second tables. In case of positive determination, the analysing unit (306) identifies a functionality of the application associated with the fingerprint information and invokes the application according to the identified functionality. In another example, as described earlier, upon composing an email, the user drags the 'send' button towards the biometric sensing unit (303). Accordingly, the functionality of 'include signature before sending an email' will be invoked and the email will be sent after including the signature of the user, thereby authenticating the sender of the email.
In case of negative determination, the analysing unit (306) determines whether security is enabled for the application. Accordingly, the analysing unit (306) fetches the data corresponding to the security information (310) for the application from the memory (308). If the security is enabled, the invoking of the application is prevented. In another example, as described earlier, upon composing an email, the user drags the 'send' button towards the biometric sensing unit (303). Accordingly, the functionality of 'include signature before sending an email' will not be invoked and the email will not be sent. In addition, a notification message may be displayed to the user on the display unit regarding the security feature. If the security is disabled, a default functionality associated with the application is invoked.
Figure 7 illustrates a screenshot (700) of an electronic device (701) implementing the method as illustrated in Figures 1(a) to 1(c), in accordance with another aspect of an embodiment of the invention. The electronic device (701) includes units/components (not shown in the figure) as described in reference to Figure 3 above.
For the present aspect, the electronic device (701) excludes the biometric sensing unit (303) and includes the touch-input receiving unit (304) and the device-input receiving unit (305). Further, the electronic device (701) is communicatively coupled with one or more external devices (702) via a network (703). Examples of the network (703) include a wireless network and a wired network. The external device (702) includes a biometric sensing unit (not shown in the figure) capable of scanning a fingerprint. The external devices (702) include, but are not limited to, a wearable device (702-1) and an input device (702-2). An example of such an input device (702-2) is a fingerprint-aware stylus. Examples of such a wearable device (702-1) include a smart watch, smart wristband, and smart armband. Furthermore, the display unit (301) displays one or more applications installed in the electronic device (701). For ease of reference and sake of brevity, nine applications are depicted as A1, A2, A3, A4, A5, A6, A7, A8, and A9. It would be understood that more than nine applications can be depicted on the display unit (301).
According to the present invention, a user provides a single user-input for contextually invoking an application. In the present aspect, the single user-input is a combination of a touch-based input received on the electronic device (701) corresponding to a selection of the application and an input received from the external electronic device (702). Thus, the touch-input receiving unit (304) and the device-input receiving unit (305), together, receive the single user-input. As illustrated in Figure 7, the user selects an icon of an application A6 by tapping (704) (represented as a dashed circle) on the icon using either one or more fingers or the external device (702). Consequently, the touch-input receiving unit (304) receives the touch input corresponding to the tapping (704). Simultaneously, if the user is wearing the wearable device (702-1) at the time of tapping (704) the icon using one or more fingers, the wearable device (702-1) detects electrical activity during the tapping and communicates the information corresponding to the detected electrical activity to the electronic device (701) via the network (703). Similarly, if the user is tapping (704) the icon using the input device (702-2), the input device (702-2) scans the one or more fingers holding the input device (702-2) and communicates the information to the electronic device (701) via the network (703). Consequently, the device-input receiving unit (305) receives the information from the external device (702).
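Assuming hypothetical message formats, the pairing of the two halves of this single user-input might be modelled as below; the timestamp tolerance is an assumption introduced only to express that the tap and the external device's scan belong to the same gesture.

```python
# Hypothetical pairing of the tap received by the touch-input receiving unit
# (304) with the fingerprint report received from the external device (702)
# over the network (703).

def pair_inputs(tap, device_report, max_skew_ms=200):
    # Treat the two events as one single user-input only if the external
    # device's scan coincides (approximately) with the tap.
    if abs(tap["timestamp_ms"] - device_report["timestamp_ms"]) > max_skew_ms:
        return None
    return {"application": tap["application"],
            "fingerprint": device_report["fingerprint"]}
```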
Upon receiving the input from the device-input receiving unit (305), the analysing unit (306) determines the fingerprint information using methods as known in the art. Simultaneously, the analysing unit (306) identifies the application from the input received by the touch-input receiving unit (304). Upon identifying, the analysing unit (306) determines the existence of a predefined mapping between the fingerprint information and the application. Accordingly, the analysing unit (306) analyses the first table and the second table, as described earlier in reference to Figures 4 and 5, based on a predefined priority associated with the first and second tables. In case of positive determination, the analysing unit (306) identifies a functionality of the application associated with the fingerprint information and invokes the application according to the identified functionality. In another example, as described earlier, upon composing an email, the user taps on a 'send' button. Accordingly, the functionality of 'include signature before sending an email' will be invoked and the email will be sent after including the signature of the user, thereby authenticating the sender of the email.
In case of negative determination, the analysing unit (306) determines whether security is enabled for the application. Accordingly, the analysing unit (306) fetches the data corresponding to the security information (310) for the application from the memory (308). If the security is enabled, the invoking of the application is prevented. In another example, as described earlier, upon composing an email, the user taps on a 'send' button. Accordingly, the functionality of 'include signature before sending an email' will not be invoked and the email will not be sent. In addition, a notification message may be displayed to the user on the display unit regarding the security feature. If the security is disabled, a default functionality associated with the application is invoked.
EXEMPLARY IMPLEMENTATIONS
Figures 8-11 illustrate example manifestations depicting the implementation of the present invention, as described above. However, it is to be strictly understood that the forthcoming examples shall not be construed as limitations of the present invention, and the present invention may be extended to cover analogous manifestations through other types of like mechanisms.
Figures 8(a) and 8(b) illustrate screenshots (800) of an electronic device (801) corresponding to a first exemplary manifestation depicting the implementation of the invention. In the exemplary manifestation, a browser application (802) is contextually invoked. As illustrated in Figure 8(a), upon tapping the browser application (802) with an index finger (803), the electronic device (801) identifies the fingerprint information, i.e., the index finger, and the application; and determines the functionality associated with the fingerprint information and the application. Accordingly, the electronic device (801) invokes the 'open the browser application in normal mode' functionality associated with the browser application (802) and the index finger. As such, the browser application (802) opens in normal mode (804).
On the other hand, as illustrated in Figure 8(b), upon tapping the browser application (802) with a middle finger (805), the electronic device (801) identifies the fingerprint information, i.e., the middle finger, and the application; and determines the functionality associated with the fingerprint information and the application. Accordingly, the electronic device (801) invokes the 'open the browser application in privacy mode' functionality associated with the browser application (802) and the middle finger. As such, the browser application (802) opens in privacy mode (806).
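Expressed in the terms of the earlier sketches, the mapping behind Figures 8(a) and 8(b) reduces to two entries; the right hand is assumed here purely for concreteness, as the figures do not specify it.

```python
# First exemplary manifestation as mapping entries: index finger (digit 2)
# opens the browser normally, middle finger (digit 3) opens it in privacy mode.
browser_mapping = {
    ("browser", ("right", 2)): "open the browser application in normal mode",
    ("browser", ("right", 3)): "open the browser application in privacy mode",
}
```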
Figures 9(a) and 9(b) illustrate screenshots (900) of an electronic device (901) corresponding to a second exemplary manifestation depicting the implementation of the invention. In the exemplary manifestation, an image capturing application (902) is contextually invoked. As illustrated in Figure 9(a), upon tapping the image capturing application (902) with an index finger (903), the electronic device (901) identifies the fingerprint information, i.e., the index finger, and the application; and determines the functionality associated with the fingerprint information and the application. Accordingly, the electronic device (901) invokes the 'image capturing mode' functionality associated with the image capturing application (902) and the index finger. As such, the image capturing application (902) opens in image capturing mode (904), enabling the electronic device (901) to capture one or more images.
On the other hand, as illustrated in Figure 9(b), upon tapping the image capturing application (902) with a middle finger (905), the electronic device (901) identifies the fingerprint information, i.e., the middle finger, and the application; and determines the functionality associated with the fingerprint information and the application. Accordingly, the electronic device (901) invokes the 'video capturing mode' functionality associated with the image capturing application (902) and the middle finger. As such, the image capturing application (902) opens in video capturing mode (904), enabling the electronic device (901) to capture one or more videos.
Figures 10(a) and 10(b) illustrate screenshots (1000) of an electronic device (1001) corresponding to a third exemplary manifestation depicting the implementation of the invention. In the exemplary manifestation, a document application (1002) is contextually invoked. As illustrated in Figure 10(a), upon tapping the document application (1002) with an index finger (1003), the electronic device (1001) identifies the fingerprint information, i.e., the index finger, and the application; and determines the functionality associated with the fingerprint information and the application. Accordingly, the electronic device (1001) invokes the 'text input mode' functionality associated with the document application (1002) and the index finger. As such, the document application (1002) opens in text mode (1004), enabling the electronic device (1001) to receive text data via a keyboard.
On the other hand, as illustrated in Figure 10(b), upon tapping the document application (1002) with a fingerprint-aware stylus (1005), the electronic device (1001) identifies the fingerprint information, i.e., the finger(s) holding the fingerprint-aware stylus (1005), and the application; and determines the functionality associated with the fingerprint information and the application. Accordingly, the electronic device (1001) invokes the 'scribble input mode' functionality associated with the document application (1002) and the finger(s) holding the fingerprint-aware stylus (1005). As such, the document application (1002) opens in scribble mode (1004), enabling the electronic device (1001) to perform various functions such as drawing, erasing, and graphic insertion.
Figures 11(a) and 11(b) illustrate screenshots (1100) of an application (1101) being accessed on an electronic device (not shown in the figure) corresponding to a fourth exemplary manifestation depicting the implementation of the invention. In the exemplary manifestation, a music application (1101) is contextually invoked. As illustrated in Figure 11(a), upon tapping the music application (1101) with an index finger (1102), the electronic device identifies the fingerprint information, i.e., the index finger, and the application; and determines the functionality associated with the fingerprint information and the application. Accordingly, the electronic device invokes the 'auto-play mode' functionality associated with the music application (1101) and the index finger for a specific playlist. As such, the music application (1101) auto-plays playlist 1.
On the other hand, as illustrated in Figure 11(b), upon tapping the music application (1101) with a middle finger (1103), the electronic device identifies the fingerprint information, i.e., the middle finger, and the application; and determines the functionality associated with the fingerprint information and the application. Accordingly, the electronic device invokes the 'auto-play mode' functionality associated with the music application (1101) and the middle finger for a specific playlist. As such, the music application (1101) auto-plays playlist 2.
EXEMPLARY HARDWARE CONFIGURATION
Figure 12 illustrates a typical hardware configuration of an electronic device (1200), which is representative of a hardware environment for practicing the present invention. As would be understood, the electronic device, as described above, includes the hardware configuration described below.
In a networked deployment, the electronic device (1200) may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The electronic device (1200) can also be implemented as or incorporated into various devices, such as, a tablet, a personal digital assistant (PDA), a palmtop computer, a laptop, a smart phone, a notebook, and a communication device.
The electronic device (1200) may include a processor (1201), e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both. The processor (1201) may be a component in a variety of systems. For example, the processor (1201) may be part of a standard personal computer or a workstation. The processor (1201) may be one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analysing and processing data. The processor (1201) may implement a software program, such as code generated manually (i.e., programmed).
The electronic device (1200) may include a memory (1202) communicating with the processor (1201) via a bus (1203). The memory (1202) may be a main memory, a static memory, or a dynamic memory. The memory (1202) may include, but is not limited to, computer readable storage media such as various types of volatile and non-volatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media and the like. The memory (1202) may also be an external storage device or database for storing data. Examples include a hard drive, compact disc ("CD"), digital video disc ("DVD"), memory card, memory stick, floppy disc, universal serial bus ("USB") memory device, or any other device operative to store data. The memory (1202) is operable to store instructions executable by the processor (1201). The functions, acts or tasks illustrated in the figures or described herein may be performed by the programmed processor (1201) executing the instructions stored in the memory (1202). The functions, acts or tasks are independent of the particular type of instruction set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro-code and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing and the like.
The electronic device (1200) may further include a display unit (1204), such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, a cathode ray tube (CRT), or other now known or later developed display device for outputting determined information.
Additionally, the electronic device (1200) may include an input device (1205) configured to allow a user to interact with any of the components of the system (1200). The input device (1205) may be a number pad, a keyboard, a stylus, an electronic pen, or a cursor control device such as a mouse, a joystick, a touch screen display, a remote control, or any other device operative to interact with the electronic device (1200).
The electronic device (1200) may also include a disk or optical drive unit (1206). The drive unit (1206) may include a computer-readable medium (1207) in which one or more sets of instructions (1208), e.g. software, can be embedded. In addition, the instructions (1208) may be separately stored in the processor (1201) and the memory (1202).
The electronic device (1200) may further be in communication with other devices over a network (1209) to communicate voice, video, audio, images, or any other data over the network (1209). Further, the data and/or the instructions (1208) may be transmitted or received over the network (1209) via a communication port or interface (1210) or using the bus (1203). The communication port or interface (1210) may be a part of the processor (1201) or may be a separate component. The communication port (1210) may be created in software or may be a physical connection in hardware. The communication port (1210) may be configured to connect with the network (1209), external media, the display (1204), or any other components in the system (1200), or combinations thereof. The connection with the network (1209) may be a physical connection, such as a wired Ethernet connection, or may be established wirelessly as discussed later. Likewise, the additional connections with other components of the system (1200) may be physical connections or may be established wirelessly. The network (1209) may alternatively be directly connected to the bus (1203).
The network (1209) may include wired networks, wireless networks, Ethernet AVB networks, or combinations thereof. The wireless network may be a cellular telephone network, an 802.11, 802.16, 802.20, 802.1Q or WiMax network. Further, the network (1209) may be a public network, such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize a variety of networking protocols now available or later developed including, but not limited to, TCP/IP based networking protocols.
In an alternative example, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement various parts of the computing system (1200).
Applications that may include the systems can broadly include a variety of electronic and computer systems. One or more examples described may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.
The electronic device (1200) may be implemented by software programs executable by the processor (1201). Further, in a non-limiting example, implementations can include distributed processing, component/object distributed processing, and parallel processing. Alternatively, virtual computer system processing can be constructed to implement various parts of the system.
The electronic device (1200) is not limited to operation with any particular standards and protocols. For example, standards for Internet and other packet switched network transmission (e.g., TCP/IP, UDP/IP, HTML, HTTP) may be used. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same or similar functions as those disclosed are considered equivalents thereof.
The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.
While certain present preferred embodiments of the invention have been illustrated and described herein, it is to be understood that the invention is not limited thereto. Clearly, the invention may be otherwise variously embodied, and practiced within the scope of the following claims.
Claims:We Claim:
1. A method for invoking an application on an electronic device (300), the method comprising:
- receiving (101) a single user-input on the electronic device;
- identifying (102) a fingerprint information and an application from the single user-input;
- determining (103) existence of a predefined mapping between the fingerprint information and the application;
- identifying (104) a functionality of said application associated with the fingerprint information and the application, in case of positive determination, the functionality being predefined for the application; and
- invoking (105) the functionality of said application based on said identification.
2. The method as claimed in claim 1, wherein the single user-input is a touch-based input received via a biometric sensing unit (303) available in the electronic device (300).
3. The method as claimed in claim 1, wherein the single user-input is a combination of a touch-based input, received on the electronic device (300) and corresponding to a selection of the application, and an input received from an external computing device coupled with the electronic device (300).
4. The method as claimed in claim 3, wherein the external computing device includes a wearable device and an input device.
5. The method as claimed in claim 1, wherein the fingerprint information includes a combination of one of a human hand and one or more digits of said human hand.
6. The method as claimed in claim 1, further comprising:
- determining (106), in case of negative determination, if security is enabled for the application; and
- preventing (107) invocation of the application.
7. The method as claimed in claim 1, further comprising:
- determining (108), in case of negative determination, if security is disabled for the application; and
- invoking (109) a default functionality associated with the application.
8. A method for defining contextual association to invoke an application, the method comprising:
- receiving (201), from within an application, a user-input corresponding to defining contextual association for the application;
- receiving (202) a selection of a functionality from amongst a plurality of functionalities predefined for the application;
- receiving (203) a fingerprint information; and
- associating (204) the fingerprint information with the selected functionality, such that a functionality associated with a fingerprint information is invoked on an application upon receiving a single user-input, said single user-input corresponding to the fingerprint information and a selection of the application.
9. The method as claimed in claim 8, further comprising:
- storing a mapping of the fingerprint information associated with the selected functionality in a metadata associated with the application.
10. The method as claimed in claim 8, further comprising:
- storing a mapping of the fingerprint information associated with the selected functionality in a database.
11. The method as claimed in claim 8, wherein the user-input is received via one of: a touch-based gesture input and an input device.
12. An electronic device (300) for invoking an application, the electronic device (300) comprising:
- a receiving unit (302) to receive a single user-input on the electronic device (300);
- an analysing unit (306) coupled to the receiving unit (302) to:
- identify a fingerprint information and an application from the single user-input;
- determine existence of a predefined mapping between the fingerprint information and the application; and
- identify a functionality of said application associated with the fingerprint information and the application, in case of positive determination, the functionality being predefined for the application; and
- an application launching unit (307) coupled to the analysing unit (306) to invoke the functionality of said application based on said identification.
13. An electronic device (300) for defining contextual association, the electronic device (300) comprising:
- a receiving unit (302) to:
- receive, from within an application, a user-input corresponding to defining contextual association for the application;
- receive a selection of a functionality from amongst a plurality of functionalities predefined for the application; and
- receive a fingerprint information; and
- an analysing unit (306) coupled to the receiving unit (302) to associate the fingerprint information with the selected functionality, such that a functionality associated with a fingerprint information is invoked on an application upon receiving a single user-input, said single user-input corresponding to the fingerprint information and a selection of the application.
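For illustration, the association flow of claims 8 to 11 (steps 201-204) can be sketched in a few lines of code. The following is a minimal Python sketch only, not the claimed implementation: the names FingerprintMapper, register_mapping, and lookup are invented for this example, and an in-memory dictionary stands in for the application metadata of claim 9 or the database of claim 10.

```python
# Minimal sketch of the association method (steps 201-204). All names here
# are hypothetical; a real device would persist the mapping in application
# metadata (claim 9) or a database (claim 10) rather than an in-memory dict.

class FingerprintMapper:
    def __init__(self):
        # Key: (fingerprint information, application); value: functionality.
        self._mappings = {}

    def register_mapping(self, fingerprint_info, application, functionality):
        """Associate fingerprint information with a functionality selected
        from those predefined for the application (step 204)."""
        self._mappings[(fingerprint_info, application)] = functionality

    def lookup(self, fingerprint_info, application):
        """Return the mapped functionality if a predefined mapping exists,
        else None (the determination of step 103)."""
        return self._mappings.get((fingerprint_info, application))
```

Keying the mapping on the (fingerprint information, application) pair rather than on the fingerprint alone is what makes the invocation contextual: the same finger can trigger different functionalities in different applications.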
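The invocation flow of claims 1, 6, and 7 (steps 101-109) then reduces to a single dispatch over that mapping. Again a hedged sketch under the same assumptions: handle_single_input and invoke are hypothetical helpers, and the per-application security setting of claims 6 and 7 is modeled as a plain boolean.

```python
# Hypothetical dispatch for the invocation method (steps 101-109);
# builds on the FingerprintMapper sketch above.

def invoke(application, functionality):
    # Placeholder for launching the application into the identified
    # functionality (step 105 / step 109).
    return f"{application}:{functionality}"

def handle_single_input(mapper, fingerprint_info, application,
                        security_enabled, default_functionality):
    # Steps 101-102: the single user-input yields both the fingerprint
    # information and the selected application.
    functionality = mapper.lookup(fingerprint_info, application)  # step 103
    if functionality is not None:
        # Steps 104-105: positive determination; invoke the functionality
        # predefined for the application.
        return invoke(application, functionality)
    if security_enabled:
        # Steps 106-107: negative determination with security enabled;
        # prevent invocation of the application.
        return None
    # Steps 108-109: negative determination with security disabled;
    # invoke the default functionality associated with the application.
    return invoke(application, default_functionality)
```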
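A short usage run shows how the units of claims 12 and 13 would cooperate: the receiving unit supplies the single user-input, the analysing unit performs the identification and determination via the mapper, and the application launching unit carries out the invocation. The fingerprint labels and application names below are invented for the example.

```python
# Illustrative usage of the two sketches above; values are hypothetical.
mapper = FingerprintMapper()
mapper.register_mapping("right_thumb", "camera", "front_camera_mode")

print(handle_single_input(mapper, "right_thumb", "camera",
                          security_enabled=True,
                          default_functionality="rear_camera_mode"))
# camera:front_camera_mode  (predefined mapping found and invoked)

print(handle_single_input(mapper, "left_index", "camera",
                          security_enabled=True,
                          default_functionality="rear_camera_mode"))
# None  (no mapping and security enabled: invocation prevented, claim 6)
```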
| # | Name | Date |
|---|---|---|
| 1 | Power of Attorney [06-10-2015(online)].pdf | 2015-10-06 |
| 2 | Form 5 [06-10-2015(online)].pdf | 2015-10-06 |
| 3 | Form 3 [06-10-2015(online)].pdf | 2015-10-06 |
| 4 | Form 18 [06-10-2015(online)].pdf | 2015-10-06 |
| 5 | Drawing [06-10-2015(online)].pdf | 2015-10-06 |
| 6 | Description(Complete) [06-10-2015(online)].pdf | 2015-10-06 |
| 7 | 3214-del-2015-Form-1-(26-10-2015).pdf | 2015-10-26 |
| 8 | 3214-del-2015-Correspondence Others-(26-10-2015).pdf | 2015-10-26 |
| 9 | 2018-12-20_20-12-2018.pdf | 2018-12-20 |
| 10 | 3214-DEL-2015-FER.pdf | 2018-12-24 |
| 11 | 3214-DEL-2015-CLAIMS [08-05-2019(online)].pdf | 2019-05-08 |
| 12 | 3214-DEL-2015-COMPLETE SPECIFICATION [08-05-2019(online)].pdf | 2019-05-08 |
| 13 | 3214-DEL-2015-DRAWING [08-05-2019(online)].pdf | 2019-05-08 |
| 14 | 3214-DEL-2015-FER_SER_REPLY [08-05-2019(online)].pdf | 2019-05-08 |
| 15 | 3214-DEL-2015-OTHERS [08-05-2019(online)].pdf | 2019-05-08 |
| 16 | 3214-DEL-2015-8(i)-Substitution-Change Of Applicant - Form 6 [19-09-2019(online)].pdf | 2019-09-19 |
| 17 | 3214-DEL-2015-ASSIGNMENT DOCUMENTS [19-09-2019(online)].pdf | 2019-09-19 |
| 18 | 3214-DEL-2015-PA [19-09-2019(online)].pdf | 2019-09-19 |
| 19 | 3214-DEL-2015-Correspondence-101019.pdf | 2019-10-14 |
| 20 | 3214-DEL-2015-OTHERS-101019.pdf | 2019-10-14 |
| 21 | 3214-DEL-2015-IntimationOfGrant26-03-2021.pdf | 2021-03-26 |
| 22 | 3214-DEL-2015-PatentCertificate26-03-2021.pdf | 2021-03-26 |
| 23 | 3214-DEL-2015-RELEVANT DOCUMENTS [01-09-2022(online)].pdf | 2022-09-01 |
| 24 | 3214-DEL-2015-RELEVANT DOCUMENTS [09-09-2023(online)].pdf | 2023-09-09 |