
Method And Device For Quick Access Of Applications

Abstract: The invention relates to quick access of applications. In one embodiment, a method implementable on an electronic device for quick access of applications comprises: receiving a first set of input events on a vacant area in a current screen; determining a number of input events in the first set of input events; determining an application associated with the number of input events in the first set of input events; and displaying an icon of the application in the vacant area in the current screen. Fig. 1


Patent Information

Application #
Filing Date
31 October 2014
Publication Number
19/2016
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
mail@lexorbis.com
Parent Application
Patent Number
Legal Status
Grant Date
2023-08-29
Renewal Date

Applicants

Samsung India Electronics Pvt. Ltd.
Logix Cyber Park, Plot No. C 28-29, Tower D - Ground to 10th Floor, Tower C - 7th to 10th Floor, Sector-62, Noida – 201301, Uttar Pradesh, India

Inventors

1. SHARMA, Ankur
Ganga Mandir Colony, RoopWas, Bharatput, Rajasthan, India
2. GUPTA, Sachin Kumar
House No 114, Bank Colony, Premier Nagar, Aligarh, Uttar Pradesh – 202001, India

Specification

CLAIMS

We Claim:
1. A method implementable on an electronic device for quick access of applications, the method comprising:
receiving a first set of input events on a vacant area in a current screen;
determining a number of input events in the first set of input events;
determining an application associated with the number of input events in the first set of input events; and
displaying an icon of the application in the vacant area in the current screen.

2. The method as claimed in claim 1, wherein the input events are touch events, gesture based events, or graphical user interface (GUI) based events.

3. The method as claimed in claim 1 further comprising:
associating in advance the application with the number of input events in the first set of input events.

4. The method as claimed in claim 3, wherein the associating is done automatically by the electronic device based on application usage timings or application usage frequency.

5. The method as claimed in claim 3, wherein the associating is done based on one or more user inputs.

6. The method as claimed in claim 1 further comprising:
accepting the first set of input events if a subsequent input event is provided within a predefined time period from a previous input event.

7. The method as claimed in claim 1 further comprising:
moving the icon from an original screen of the icon to the vacant area in the current screen.

8. The method as claimed in claim 1 further comprising:
receiving a second set of input events in a current vacant area in the current screen, a number of input events in the second set of input events being equal to the number of input events in the first set of input events.

9. The method as claimed in claim 8 further comprising:
removing, upon receiving the second set of input events, the icon from the current screen.

10. The method as claimed in claim 9, wherein removing the icon comprises moving back the icon from the current screen to an original screen.

11. A method implementable on an electronic device for quick access of applications, the method comprising:
receiving a first set of input events on a vacant area in a current screen;
determining a number of input events in the first set of input events;
comparing the number of input events in the first set of input events with a maximum threshold value; and
displaying, if the number of input events in the first set of input events is greater than or equal to the maximum threshold value, a user interface having an icon for each of a plurality of applications.

12. The method as claimed in claim 11, wherein the input events are touch events, gesture based events, or graphical user interface (GUI) based events.

13. The method as claimed in claim 11, wherein the plurality of applications comprises most recently used applications, most frequently used applications, recommended applications, predetermined applications, or combinations thereof.

14. The method as claimed in claim 11 further comprising:
associating in advance the maximum threshold value with the user interface.

15. The method as claimed in claim 11 further comprising:
accepting the first set of input events if a subsequent input event is provided within a predefined time period from a previous input event.

16. The method as claimed in claim 11 further comprising:
moving the icon for each of the plurality of applications from an original screen of the icon into the user interface.

17. The method as claimed in claim 11, wherein displaying the user interface comprises displaying the user interface on the current screen.

18. The method as claimed in claim 11, wherein displaying the user interface comprises displaying the user interface on a screen other than the current screen when the current screen has a vacant space less than a space required for displaying the user interface.

19. The method as claimed in claim 11 further comprising:
receiving, at a screen displaying the user interface, a second set of input events having a number of input events greater than or equal to the maximum threshold value.

20. The method as claimed in claim 19 further comprising:
removing, upon receiving the second set of input events, the user interface from the screen displaying the user interface.

21. An electronic device having a plurality of applications installed therein, the electronic device comprising:
an input unit comprising an input-sensing mechanism configured to detect input events; and
a controller coupled with the input unit, wherein the controller is configured to:
receive a first set of input events on a vacant area in a current screen;
determine a number of input events in the first set of input events;
determine an application associated with the number of input events in the first set of input events; and
display an icon of the application in the vacant area in the current screen.

22. The electronic device as claimed in claim 21, wherein the input events are touch events, gesture based events, or graphical user interface (GUI) based events.

23. The electronic device as claimed in claim 21, wherein the controller is further configured to:
associate in advance the application with the number of input events in the first set of input events.

24. The electronic device as claimed in claim 23, wherein the controller automatically associates based on application usage timings or application usage frequency.

25. The electronic device as claimed in claim 23, wherein the controller associates based on one or more user inputs.

26. The electronic device as claimed in claim 21, wherein the controller is further configured to:
accept the first set of input events if a subsequent input event is provided within a predefined time period from a previous input event.

27. The electronic device as claimed in claim 21, wherein the controller is further configured to:
move the icon from an original screen of the icon to the vacant area in the current screen.

28. The electronic device as claimed in claim 21, wherein the controller is further configured to:
receive a second set of input events in a current vacant area in the current screen, a number of input events in the second set of input events being equal to the number of input events in the first set of input events.

29. The electronic device as claimed in claim 28, wherein the controller is further configured to:
remove, upon receiving the second set of input events, the icon from the current screen.

30. The electronic device as claimed in claim 29, wherein the controller, in order to remove the icon, moves back the icon from the current screen to an original screen.

31. An electronic device having a plurality of applications installed therein, the electronic device comprising:
an input unit comprising an input-sensing mechanism configured to detect input events; and
a controller coupled with the input unit, wherein the controller is configured to:
receive a first set of input events on a vacant area in a current screen;
determine a number of input events in the first set of input events;
compare the number of input events in the first set of input events with a maximum threshold value; and
display, if the number of input events in the first set of input events is greater than or equal to the maximum threshold value, a user interface having an icon for each of a plurality of applications.

32. The electronic device as claimed in claim 31, wherein the input events are touch events, gesture based events, or graphical user interface (GUI) based events.

33. The electronic device as claimed in claim 31, wherein the plurality of applications comprises most recently used applications, most frequently used applications, recommended applications, predetermined applications, or combinations thereof.

34. The electronic device as claimed in claim 31, wherein the controller is further configured to:
associate in advance the maximum threshold value with the user interface.

35. The electronic device as claimed in claim 31, wherein the controller is further configured to:
accept the first set of input events if a subsequent input event is provided within a predefined time period from a previous input event.
36. The electronic device as claimed in claim 31, wherein the controller is further configured to:
move the icon for each of the plurality of applications from an original screen of the icon into the user interface.

37. The electronic device as claimed in claim 31, wherein the controller is further configured to display the user interface on the current screen.

38. The electronic device as claimed in claim 31, wherein the controller is further configured to display the user interface on a screen other than the current screen when the current screen has a vacant space less than a space required for displaying the user interface.

39. The electronic device as claimed in claim 31, wherein the controller is further configured to:
receive, at a screen displaying the user interface, a second set of input events having a number of input events greater than or equal to the maximum threshold value.

40. The electronic device as claimed in claim 39, wherein the controller is further configured to:
remove, upon receiving the second set of input events, the user interface from the screen displaying the user interface.
DESCRIPTION
TECHNICAL FIELD
The invention relates to quick access of applications on electronic devices. More particularly, the invention relates to quick access of applications on an electronic device using consecutive input events.

BACKGROUND
In the current scenario, the usage of electronic devices, such as smartphones, tablets, laptops, etc., is growing in demand every year. Such electronic devices provide users with options for working with the large number of applications present thereon. These applications are usually downloaded and installed by users on the electronic devices. With growing demand and the availability of cheaper memory, users tend to install more and more applications. The applications depend entirely on a user's requirements and may range from gaming applications to social networking, information-based, news, and map and navigation related applications.
At the same time, it is becoming cumbersome for users to manage these applications in their devices for quick access. In existing solutions, users either need to remember the location of a desired application or have to perform a search function by typing the name of the desired application to find it. In some electronic devices, the user is provided with options to collectively store the desired applications in separate folders. However, if the number of applications is large, this activity of classifying applications based on requirements and storing them in folders requires considerable effort and time from users.
A few other solutions known in this art are listed in this paragraph. For instance, the prior art teaches mapping active regions of a touch screen to functions of at least one application. In another instance, the prior art teaches separate signals assigned to the operating system for single and double taps. In yet another instance, the prior art teaches detection of taps on a headset, wherein a tap detector is used to identify taps on the right and left headsets. Despite the aforesaid teachings, there is still a need for improvements in this area of technology.

OBJECT OF THE INVENTION
The object of the invention is to make access of applications more effective and easy for the user, at least for the most favoured, most frequently used, or most recently used applications on the device.

SUMMARY OF THE INVENTION
In accordance with the purposes of the invention, the present invention as embodied and broadly described herein provides a solution to easily access the applications installed in electronic devices, so that the user needs to perform minimum activity to access a desired application. For this purpose, users may be allowed to pre-configure favourite or most recently used applications with a "touch count". This "touch count" is a positive integer value greater than 1. Theoretically, there is no upper limit on this integer value; however, a maximum threshold value may be set for practical reasons. In this way, a user may provide a double click, through touch, gesture, or GUI events, on a vacant area of a current screen to bring an icon of the most favourite application onto the current screen. Similarly, a triple click brings the second favourite application, a quadruple click the third favourite application, and so on. In case a quintuple click is set as the maximum threshold value of clicks, a user interface having a predefined number of favourite, recent, or predefined applications may be shown to the user.
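The "touch count" mapping described above can be sketched as follows. This is a minimal illustration, not the claimed implementation; the threshold value, application names, and function name are assumptions for demonstration only.

```python
# Hypothetical sketch of the "touch count" mapping: counts start at 2
# (a double click) and map to applications in order of preference; a
# count at or above the maximum threshold opens a multi-application UI.
MAX_THRESHOLD = 5  # assumed quintuple-click threshold, per the example above

touch_count_map = {
    2: "Messenger",  # most favourite application (illustrative names)
    3: "Camera",     # second favourite
    4: "Maps",       # third favourite
}

def resolve_touch_count(count):
    """Return the application for a touch count, or the multi-app UI marker."""
    if count >= MAX_THRESHOLD:
        return "FAVOURITES_UI"  # user interface with several app icons
    return touch_count_map.get(count)  # None if no application is associated
```

A double click would thus resolve to the most favourite application, while five or more consecutive clicks resolve to the favourites interface.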
According to another aspect of the invention, the user has an option to send back the previously presented application icon by again performing the respective consecutive “input events” ‘n’ number of times, where ‘n’ was the “touch count” of the presented application. For this purpose, the state, such as active, pause, or stop, of the presented application may be monitored.
According to another aspect of the invention, the user may be provided with an “Automate” option, which can automatically configure or suggest the “touch count” for applications according to their priority index. The higher the priority index of the application, the lower will be the touch count associated with the application. The determination of “priority index” can be based on knowledge gathering of user activities with respect to applications used and duration over which the applications remain active.
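The "Automate" option above can be sketched as a ranking step: a higher priority index yields a lower touch count. The weighting of launch frequency against active duration is an assumption here; the specification leaves the exact determination of the priority index open.

```python
# Hypothetical sketch of the "Automate" option: rank applications by a
# priority index and assign ascending touch counts, starting from 2,
# so the highest-priority application gets the lowest touch count.
def suggest_touch_counts(usage_stats):
    """usage_stats: {app_name: (launch_count, active_seconds)}.
    Returns {app_name: suggested_touch_count}."""
    def priority(item):
        launches, seconds = item[1]
        # Illustrative weighted score: launches plus active minutes.
        return launches * 1.0 + seconds / 60.0
    ranked = sorted(usage_stats.items(), key=priority, reverse=True)
    # Touch counts begin at 2 (a double click), per the summary above.
    return {app: count for count, (app, _) in enumerate(ranked, start=2)}
```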
Accordingly, users will just have to perform consecutive touch events at a particular location on the touch screen to be presented with the desired application on the home screen or the current active screen. In this way, the invention makes the process of accessing applications far easier and automatic for the user. The user no longer has to bother about the stored location of the application, remember the application by its icon, or perform any separate search for the application. The present invention also helps the user do away with the process of storing applications on the home screen and will help in de-cluttering the home screen of the device, thereby improving the performance of the device.
These and other features and advantages will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings and claims.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS:
To further clarify advantages and features of the invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which is illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings in which:
Figure 1 illustrates an exemplary method implementable on an electronic device for quick access of applications, in accordance with an embodiment of the invention.
Figure 2 illustrates another exemplary method implementable on an electronic device for quick access of applications, in accordance with an embodiment of the invention.
Figure 3 illustrates an exemplary touchscreen electronic device for implementing a method for quick access of applications, in accordance with an embodiment of the invention.
Figure 4 illustrates exemplary units for implementing a method for quick access of applications, in accordance with an embodiment of the invention.
Figure 5 illustrates functioning of Unit 1, in accordance with an embodiment of the invention.
Figure 6 illustrates an exemplary touchscreen panel for implementing a method for quick access of applications, in accordance with an embodiment of the invention.
Figure 7 illustrates calculation logic implemented by Unit 1, in accordance with an embodiment of the invention.
Figure 8 illustrates functioning of Unit 2, in accordance with an embodiment of the invention.
Figure 9 illustrates control logic implemented by Unit 2, in accordance with an embodiment of the invention.
Figure 10 illustrates functioning of Unit 3, in accordance with an embodiment of the invention.
Figure 11 illustrates control logic implemented by Unit 3, in accordance with an embodiment of the invention.
Figure 12 illustrates an exemplary use case for quick access of an application, in accordance with an embodiment of the invention.
Figure 13 illustrates an exemplary use case for moving back an application icon, in accordance with an embodiment of the invention.
Figure 14 illustrates an exemplary use case for quick access of multiple applications in a user interface, in accordance with an embodiment of the invention.
Figure 15 illustrates an exemplary use case for moving back said user interface, in accordance with an embodiment of the invention.
It may be noted that to the extent possible, like reference numerals have been used to represent like elements in the drawings. Further, those of ordinary skill in the art will appreciate that elements in the drawings are illustrated for simplicity and may not have been necessarily drawn to scale. For example, the dimensions of some of the elements in the drawings may be exaggerated relative to other elements to help to improve understanding of aspects of the invention. Furthermore, the one or more elements may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.

DETAILED DESCRIPTION
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof. Throughout this patent specification, the convention employed is that, in the appended drawings, like numerals denote like components.
Reference throughout this specification to “an embodiment”, “another embodiment” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Thus, appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures preceded by "comprises" does not, without more constraints, preclude the existence of other devices or other sub-systems.
Various embodiments of the invention will be described below in detail with reference to the accompanying drawings.
Figure 1 illustrates an exemplary method (100) implementable on an electronic device for quick access of applications, in accordance with an embodiment of the invention. In said embodiment, the method (100) comprises: receiving (101) a first set of input events on a vacant area in a current screen; determining (102) a number of input events in the first set of input events; determining (103) an application associated with the number of input events in the first set of input events; and displaying (104) an icon of the application in the vacant area in the current screen.
In a further embodiment, the input events may be touch events, gesture based events, or graphical user interface (GUI) based events.
In a further embodiment, the method (100) comprises: associating (105) in advance the application with the number of input events in the first set of input events.
In a further embodiment, the associating is done automatically by the electronic device based on application usage timings or application usage frequency.
In a further embodiment, the associating is done based on one or more user inputs.
In a further embodiment, the method (100) comprises: accepting (106) the first set of input events if a subsequent input event is provided within a predefined time period from a previous input event.
In a further embodiment, the method (100) comprises: moving (107) the icon from an original screen of the icon to the vacant area in the current screen.
In a further embodiment, the method (100) comprises: receiving (108) a second set of input events in a current vacant area in the current screen, a number of input events in the second set of input events being equal to the number of input events in the first set of input events.
In a further embodiment, the method (100) comprises: removing (109), upon receiving the second set of input events, the icon from the current screen.
In a further embodiment, removing the icon comprises moving back the icon from the current screen to an original screen.
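The toggle behaviour described in the embodiments above, where a second set of input events with an equal count moves the icon back, can be sketched as follows. The class and method names are illustrative assumptions, not part of the specification.

```python
# Hypothetical sketch of steps 101-109: the first matching set of input
# events shows the icon on the current screen; a second set with an equal
# touch count removes it and moves it back to its original screen.
class QuickAccess:
    def __init__(self, associations):
        # associations: {touch_count: (app_name, original_screen)}
        self.associations = associations
        self.shown = set()  # applications currently shown on the screen

    def handle(self, count, current_screen):
        entry = self.associations.get(count)
        if entry is None:
            return None  # no application associated with this touch count
        app, original = entry
        if app in self.shown:
            # Second set of input events: remove the icon and restore it.
            self.shown.discard(app)
            return ("restore", app, original)
        self.shown.add(app)
        return ("show", app, current_screen)
```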
Figure 2 illustrates another exemplary method (200) implementable on an electronic device for quick access of applications, in accordance with an embodiment of the invention. In said embodiment, the method (200) comprises: receiving (201) a first set of input events on a vacant area in a current screen; determining (202) a number of input events in the first set of input events; comparing (203) the number of input events in the first set of input events with a maximum threshold value; and displaying (204), if the number of input events in the first set of input events is greater than or equal to the maximum threshold value, a user interface having an icon for each of a plurality of applications.
In a further embodiment, the input events may be touch events, gesture based events, or graphical user interface (GUI) based events.
In a further embodiment, the plurality of applications comprises most recently used applications, most frequently used applications, recommended applications, predetermined applications, or combinations thereof.
In a further embodiment, the method (200) comprises: associating (205) in advance the maximum threshold value with the user interface.
In a further embodiment, the method (200) comprises: accepting (206) the first set of input events if a subsequent input event is provided within a predefined time period from a previous input event.
In a further embodiment, the method (200) comprises: moving (207) the icon for each of the plurality of applications from an original screen of the icon into the user interface.
In a further embodiment, displaying the user interface comprises displaying the user interface on the current screen.
In a further embodiment, displaying the user interface comprises displaying the user interface on a screen other than the current screen when the current screen has a vacant space less than a space required for displaying the user interface.
In a further embodiment, the method (200) comprises: receiving (208), at a screen displaying the user interface, a second set of input events having a number of input events greater than or equal to the maximum threshold value.
In a further embodiment, the method (200) comprises: removing (209), upon receiving the second set of input events, the user interface from the screen displaying the user interface.
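The threshold branch of method (200) can be sketched as follows. The selection and ordering of the application lists are assumptions for illustration; the specification only requires that the plurality may comprise recent, frequent, recommended, or predetermined applications.

```python
# Hypothetical sketch of method 200: when the touch count reaches the
# maximum threshold, build a user interface from several application
# lists; below the threshold, defer to the single-icon flow of method 100.
def build_quick_access_ui(count, max_threshold, recent, frequent, recommended):
    if count < max_threshold:
        return None  # handled by the single-icon flow instead
    # De-duplicate while preserving order across the source lists.
    seen, icons = set(), []
    for app in recent + frequent + recommended:
        if app not in seen:
            seen.add(app)
            icons.append(app)
    return icons
```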
Figure 3 illustrates an exemplary electronic device (300), hereinafter referred to as the device (300), for implementing a method (100, 200) for quick access of applications, in accordance with an embodiment of the invention. The device (300) may be any device having a display screen and a plurality of applications installed thereon. Examples of the device (300) include, but are not limited to, a mobile phone, smartphone, tablet, phablet, laptop, desktop computer, gaming console, and the like. The device (300) generally has a large number of applications installed therein. The device (300) comprises: an input unit (301a), an output unit (301b), or a combination (301) thereof, such as a touchscreen display (301) serving the purpose of an input-cum-output unit (301). In addition, the device comprises a controller (302), such as a CPU, microprocessor, or microcontroller, and a memory (303) to store data.
In one embodiment, the device (300) comprises an input unit (301a) comprising an input-sensing mechanism configured to detect input events; and a controller (302) coupled with the input unit (301a), wherein the controller (302) is configured to: receive a first set of input events on a vacant area in a current screen; determine a number of input events in the first set of input events; determine an application associated with the number of input events in the first set of input events; and display an icon of the application in the vacant area in the current screen.
In a further embodiment, the input events may be touch events, gesture based events, or graphical user interface (GUI) based events.
In a further embodiment, the controller (302) may be configured to: associate in advance the application with the number of input events in the first set of input events.
In a further embodiment, the controller (302) automatically associates based on application usage timings or application usage frequency.
In a further embodiment, the controller (302) associates based on one or more user inputs.
In a further embodiment, the controller (302) may be configured to: accept the first set of input events if a subsequent input event is provided within a predefined time period from a previous input event.
In a further embodiment, the controller (302) may be configured to: move the icon from an original screen of the icon to the vacant area in the current screen.
In a further embodiment, the controller (302) may be configured to: receive a second set of input events in a current vacant area in the current screen, a number of input events in the second set of input events being equal to the number of input events in the first set of input events.
In a further embodiment, the controller (302) may be configured to: remove, upon receiving the second set of input events, the icon from the current screen.
In a further embodiment, the controller (302), in order to remove the icon, moves back the icon from the current screen to an original screen.
In another embodiment, the device (300) comprises: an input unit (301a) comprising an input-sensing mechanism configured to detect input events; and a controller (302) coupled with the input unit (301a), wherein the controller (302) is configured to: receive a first set of input events on a vacant area in a current screen; determine a number of input events in the first set of input events; compare the number of input events in the first set of input events with a maximum threshold value; and display, if the number of input events in the first set of input events is greater than or equal to the maximum threshold value, a user interface having an icon for each of a plurality of applications.
In a further embodiment, the input events may be touch events, gesture based events, or graphical user interface (GUI) based events.
In a further embodiment, the plurality of applications comprises most recently used applications, most frequently used applications, recommended applications, predetermined applications, or combinations thereof.
In a further embodiment, the controller (302) may be configured to: associate in advance the maximum threshold value with the user interface.
In a further embodiment, the controller (302) may be configured to: accept the first set of input events if a subsequent input event is provided within a predefined time period from a previous input event.
In a further embodiment, the controller (302) may be configured to: move the icon for each of the plurality of applications from an original screen of the icon into the user interface.
In a further embodiment, the controller (302) may be configured to: display the user interface on the current screen.
In a further embodiment, the controller (302) may be configured to: display the user interface on a screen other than the current screen when the current screen has a vacant space less than a space required for displaying the user interface.
In a further embodiment, the controller (302) may be configured to: receive, at a screen displaying the user interface, a second set of input events having a number of input events greater than or equal to the maximum threshold value.
In a further embodiment, the controller (302) may be configured to: remove, upon receiving the second set of input events, the user interface from the screen displaying the user interface.
Figure 4 illustrates a schematic representation of various units for implementing the present invention. The operation of the system is controlled by a memory/database (401) and four units, more particularly Unit 1, Unit 2, Unit 3, and Unit 4 shown in the figure. A detailed description of the individual units and their operational logic is given in the subsequent paragraphs. Unit 1 is responsible for configuration of the initial parameters and detection of the consecutive "touch events" performed by the user on the touch screen. Unit 2 is responsible for processing the signal from Unit 1 and presenting the application icon to the user on the home screen or the current active screen. Unit 3 is responsible for monitoring the current state of the application, processing the signal input from Unit 1, and returning the respective application icon from the presented location back to the original location. Unit 4 is responsible for managing the "Automate" option, gathering intelligence, and determining the "priority index" of the applications installed in the device.

DESCRIPTION OF UNIT 1
Figure 5 illustrates a flowchart of the functioning of Unit 1. The user directly interacts with the device (300), which comprises a hardware touch panel or screen for receiving touch input from the user (step 501) as well as displaying information. A touch decoder module (not shown) determines the touch coordinates (step 502) and passes them on to an analyser module (not shown) of Unit 1. The analyser module is responsible for (step 503) determining the number of consecutive "touch events" performed by the user and generating a signal with the "touch count" to be sent to Unit 2.
The initial configuration of the device (300) involves configuration of the parameter “Tstd”, which is the standard time interval between two consecutive touch events by the user. A standard value of Tstd may be maintained for all users. The initial configuration of the device (300) also involves configuration of the parameter “Rstd”, which is the standard value of the radius of the circular cross-sectional area of a normal human finger. A standard value of Rstd may also be maintained for all users.
Figure 6 illustrates a touchscreen display (601); the parameter “Rstd” (602), i.e., the standard value of the radius of circular cross-section of human finger; the point (603) of contact by user that represents the first touch coordinates (x,y); and the circular cross-section (604) around the point of first touch coordinates.
Figure 7 illustrates the calculation logic (700) implemented by Unit 1. Unit 1 receives a touch input through a touch panel (701) and a timer signal from a timer module (702), and otherwise waits (step 703) for the same. Upon receiving the touch or timer signal, Unit 1 determines (step 704) whether it is a touch event or a timer event. Accordingly, Unit 1 may start the timer in case of a touch event (step 705), or stop the timer in case of a timer event (step 706). Subsequently, the analyser module of Unit 1 checks (step 707) the touch coordinates at time ‘t’ and decides whether to wait for new touch coordinates or to generate a signal output with the number of touch counts detected. After that, Unit 1 again enters the wait state (step 703).
The calculation of the “touch count” is described in detail in this paragraph. The “touch count” represents a positive integral value of consecutive touch events performed by the user in and around the coordinates of the first touch event. The user provides the first touch at coordinates P(a,b) at a measured time instant ‘t1’, and the “touch count” is incremented accordingly. The second touch event can be performed by the user in and around P(a,b), inside the circular region of radius “Rstd” with point of origin P(a,b). The valid coordinates (x,y) must therefore satisfy the following condition: (x-a)^2 + (y-b)^2 <= (Rstd)^2. If the second touch coordinates P(a1,b1) satisfy the above condition, the touch is considered a valid touch event; otherwise it is not. Let the time instant of this second touch be ‘t2’. The valid touch coordinates P(a1,b1) pertain to a valid consecutive touch event only if they also satisfy the following condition: (t2 – t1) <= Tstd. If the coordinates P(a1,b1) satisfy both conditions, the touch is considered a valid consecutive touch event and the “touch count” is incremented accordingly. The timer “TIMER 1” is reset at this point of time. Unit 1 checks that “TIMER 1” does not expire before the detection of the next consecutive touch event. In this way, Unit 1 determines the maximum number of consecutive touch events performed by the user and updates the value of the “touch count” accordingly. When the user stops performing consecutive touch events, the timer “TIMER 1” expires and Unit 1 finalizes the value of the “touch count” and passes it to Unit 2.
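The touch-count logic described above can be sketched in code as follows. This is a minimal illustrative implementation, not the patented code: the class and method names, the concrete TSTD and RSTD values, and the units (seconds and pixels) are all assumptions for demonstration.

```python
# Assumed standard values; the specification leaves Tstd and Rstd
# device-configurable. Seconds and pixels are illustrative units.
TSTD = 0.4   # max interval between two consecutive touch events
RSTD = 40.0  # radius of the circular cross-section of a finger

class TouchCounter:
    """Sketch of Unit 1's consecutive-touch counter."""

    def __init__(self):
        self.origin = None     # P(a, b) of the first touch
        self.last_time = None  # time instant of the last valid touch
        self.count = 0

    def on_touch(self, x, y, t):
        """Process a touch at (x, y) at time t.

        Returns the finalized touch count if this touch ended the
        previous sequence, otherwise None while counting continues.
        """
        if self.origin is None:
            self.origin, self.last_time, self.count = (x, y), t, 1
            return None
        a, b = self.origin
        inside = (x - a) ** 2 + (y - b) ** 2 <= RSTD ** 2  # spatial check
        in_time = (t - self.last_time) <= TSTD             # temporal check
        if inside and in_time:
            self.count += 1        # valid consecutive touch; "TIMER 1" reset
            self.last_time = t
            return None
        # Invalid touch ends the current sequence; finalize and restart.
        final = self.count
        self.origin, self.last_time, self.count = (x, y), t, 1
        return final

    def on_timer_expiry(self, t):
        """'TIMER 1' expired without a new touch: finalize the count."""
        if self.origin is not None and (t - self.last_time) > TSTD:
            final = self.count
            self.origin, self.last_time, self.count = None, None, 0
            return final
        return None
```

As a usage example, two touches 0.2 s apart within the Rstd circle yield a touch count of 2 once the timer expires.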

DESCRIPTION OF UNIT 2
Unit 2 is responsible for parsing the signal obtained from Unit 1 and mapping the provided “touch count” with the values stored in the memory/database (401). The unit is also responsible for the subsequent identification of the application and its display to the user. For identification of the application, the “touch count” is first mapped with Table 1 given below. For this purpose, the “touch count” acts as a primary key from which the corresponding application identifier can be determined, say ‘APP1’.
TOUCH COUNT          APPLICATION IDENTIFIER
2                    APP1
3                    APP2
4 to n+1             APP3 to APP ‘n’ respectively
Maximum threshold    Multiple applications (Favorite, Recently used, or Predefined)
Table 1
This application identifier ‘APP1’ is then mapped with its location and screen coordinates in Table 2 given below.
APPLICATION IDENTIFIER   ORIGINAL LOCATION   NEW LOCATION   ORIGINAL SCREEN   NEW SCREEN
APP1                     X=9, Y=14           X=a, Y=b       4                 1
APP2                     X=19, Y=39          .......        ....              ....
APP3 ... APP ‘n’         ......              .......        ....              ....
Table 2
This application identifier ‘APP1’ is also used to fetch the application icon from Table 3, which is then displayed to the user.
APPLICATION IDENTIFIER   APPLICATION ICON
APP1                     ICON1
APP2                     ICON2
APP3 ... APP ‘n’         ........
Table 3
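The lookups through Tables 1 to 3 can be sketched as a simple chain of dictionary accesses. All table contents, the MAX_THRESHOLD value, and the function name below are illustrative assumptions mirroring the specification's examples.

```python
MAX_THRESHOLD = 10  # hypothetical maximum threshold value

# In-memory stand-ins for Tables 1-3.
TABLE1 = {2: "APP1", 3: "APP2", 4: "APP3"}                 # touch count -> app id
TABLE2 = {"APP1": {"orig_xy": (9, 14), "orig_screen": 4}}  # app id -> location
TABLE3 = {"APP1": "ICON1", "APP2": "ICON2"}                # app id -> icon

def resolve(touch_count):
    """Map a finalized touch count to (app id, location, icon).

    Counts at or above the maximum threshold select multiple
    applications (favorite / recently used / predefined).
    """
    if touch_count >= MAX_THRESHOLD:
        return "MULTIPLE"
    app_id = TABLE1.get(touch_count)  # touch count is the primary key
    if app_id is None:
        return None  # no mapping exists; Unit 2 notifies the user
    return app_id, TABLE2.get(app_id), TABLE3.get(app_id)
```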
The process of determining space availability is described in this paragraph. When a user has performed consecutive touch events at location P(a,b), the system needs to determine the location at which the application icon will be displayed to the user. For this purpose, the current active screen is checked for available space. In one implementation, the screen may be required to have at least a threshold percentage, say 25%, of its active area as available space so that the display of the application icon is clear to the user. If the current active screen does not satisfy this threshold percentage criterion, the system applies the same criterion to a new screen until a suitable screen is determined. If no such screen is available, the system may prompt the user to clear some space on the available screens in the user interface of the device (300). Further, new screen coordinates P'(a1,b1) can then be randomly allocated from the list of available empty coordinates on the selected screen.
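The space-availability check can be sketched as below. The screen representation (a free-area fraction plus a list of empty grid coordinates) and the function name are assumptions for illustration; only the 25% threshold and the random allocation come from the specification.

```python
import random

def find_display_slot(screens, threshold=0.25):
    """Return (screen index, random empty coordinates) for the first
    screen with at least `threshold` of its active area vacant, or
    None if no screen qualifies (the user is then prompted to clear
    space). `screens` is an assumed list of dicts.
    """
    for index, screen in enumerate(screens):
        if screen["free_fraction"] >= threshold and screen["empty_cells"]:
            # Randomly allocate coordinates from the empty cells.
            return index, random.choice(screen["empty_cells"])
    return None
```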
Figure 8 illustrates the functioning of Unit 2, which is responsible for processing the signal from Unit 3 and presenting the application icon to the user on the home screen or the current active screen. The system includes a display manager unit (801) to manage the display of icons or notifications, a parser module (802) to parse incoming touch signals, a fetch module (803) to fetch applications from their original location, and a space availability check module (804) to check space availability on the current screen, or on any other screen when the current screen is not free.
In continuation of the previous figure, Figure 9 illustrates the functioning of Unit 2. Firstly, incoming touch signals are parsed (step 901) by the parser module (802) and stored in the memory/database (401). Then, Unit 2 checks (step 902) whether any application identifier corresponding to the parsed signal exists in Table 1. If no such application identifier exists, a notification to that effect is displayed (step 903) to the user by the display manager unit (801). However, if such an application identifier exists, the corresponding application is fetched (step 904) from the memory/database (401) by the fetch module (803). Then, space availability for displaying the fetched application is checked (step 905) by the space availability check module (804). If sufficient space is not available, a corresponding notification is displayed (step 903) to the user by the display manager unit (801). If sufficient space is available, the display manager unit (801) displays (step 906) the application icon and updates the new coordinates in the memory/database (401).

DESCRIPTION OF UNIT 3
Unit 3 is responsible for monitoring the instructions for currently active applications. The currently active applications are those applications which have been presented to the user by Unit 2 but have not yet been sent back to their original location by the user. The already active applications referred to above may be in the ACTIVE or the SUSPENDED state. Unit 3 comes into action before Unit 2: the signal sent by Unit 1 is first received by Unit 3, which, after determining whether the signal is for “sending back” an already presented application or a call by the user for presentation of a new application, passes the signal to Unit 2 in the latter case.
Monitoring Logic – Unit 3 receives incoming touch signals from Unit 1 and therefore polls continuously for signal reception from Unit 1. From the value of the “touch count”, the value of the application identifier is extracted from Table 1. The value of the application identifier is then matched with the list of currently “presented” applications, which is maintained by Unit 2. If the application identifier does not match any item in the above list, the control signal is passed on to Unit 2, as it is a call for a new application by the user. However, if the application identifier matches an item in the above list, it is a call by the user for “sending back” the presented application icon to its original location.
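The monitoring logic reduces to a routing decision, sketched below with illustrative names. `table1` stands in for Table 1 and `presented` for the list of currently “presented” applications maintained by Unit 2; neither name appears in the specification.

```python
def route_signal(touch_count, table1, presented):
    """Decide whether a touch-count signal calls for a new application
    or requests sending a presented application back.

    Returns a (decision, app_id) pair: "new_app" passes control to
    Unit 2, "send_back" is handled by Unit 3 itself.
    """
    app_id = table1.get(touch_count)   # Table 1 lookup
    if app_id is None:
        return ("unknown", None)       # no mapping for this touch count
    if app_id in presented:
        return ("send_back", app_id)   # icon must return to its origin
    return ("new_app", app_id)         # control signal goes to Unit 2
```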
Working Logic – Unit 3 matches the application identifier with the list of currently “presented” applications. For this purpose, the current state of the application is obtained from the operating system. If the current state is the PAUSE, SUSPENDED, or STOP state, the original coordinates of the application are fetched from Table 2, along with the original screen position. The application is made to STOP by the system. Further, the application icon is displayed back at its original coordinates by calling the display manager unit (801) of the operating system. It is assumed that the application cannot be in the ACTIVE state, as in that case the user would not be able to make consecutive touch events. Once the “sending back” of the application is performed, Table 2 and the list of “presented” applications are updated accordingly by Unit 3.
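The send-back step can be sketched as follows. `get_state` and `display_manager` stand in for operating-system services, and all parameter names are assumptions; only the state check and the Table 2 lookup follow the specification.

```python
def send_back(app_id, get_state, table2, presented, display_manager):
    """Sketch of Unit 3's working logic for returning a presented icon.

    Returns True once the icon has been restored, False when the
    application's state does not permit it (the user is then prompted
    to close the application first).
    """
    state = get_state(app_id)              # current state from the OS
    if state not in ("PAUSE", "SUSPENDED", "STOP"):
        return False
    loc = table2[app_id]                   # original coordinates and screen
    display_manager(app_id, loc["orig_xy"], loc["orig_screen"])
    presented.remove(app_id)               # update the "presented" list
    return True
```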
Figure 10 illustrates the functioning of Unit 3, which is responsible for monitoring the current state of the application, processing the signal input from Unit 1, and returning the respective application icon from its presented location to its original location. The system includes the display manager unit (801) to move back icons or close user interfaces, a receiver/monitoring module (110) to receive incoming touch signals, a fetch module (803) to fetch applications from their current location, and an analyser module (111) to analyse the signals for processing.
In continuation of the previous figure, Figure 11 illustrates the functioning of Unit 3. Firstly, incoming touch signals are monitored (step 111) by the receiver/monitoring module (110) and stored in the memory/database (401). Then, Unit 3 fetches (step 112) the application identifier from Table 1. Unit 3 checks (step 113) whether the incoming touch signals are for a new application or for moving back a presented application. If the request is for a new application, control is handed over to Unit 2 for further processing. However, if the request is for moving back the presented application, the original location and screen coordinates are extracted (step 115) from Table 2. After that, Unit 3 checks (step 116) the state of the presented application. If the state is not a SUSPENDED, SLEEP, or STOP state, the user is prompted (step 117) to close the presented application. However, if the state is the SUSPENDED, SLEEP, or STOP state, the application is stopped (step 118) and moved back in communication (step 119) with the display manager unit (801).

DESCRIPTION OF UNIT 4
Unit 4 may be implemented as a standalone unit in itself. It is responsible for providing the “automate” option to the user. The “automate” option enables the device (300) to monitor the activities of the user, gather information, and develop intelligence to define or associate a particular “priority index” with applications in the device (300). The “priority index” may be used to configure the respective “touch counts” for the applications: the higher the “priority index”, the higher the “touch count” value associated with it. The device (300) may define an application with “priority index” 1 as the highest priority application, so the “touch count” 2 can be mapped with this value, as shown in Table 4. The higher the value of the “priority index”, the lower the priority of the application in the device (300) from the usage perspective of the user.
PRIORITY INDEX   TOUCH COUNT   APPLICATION IDENTIFIER
1                2             APP N1
2                3             APP N2
3 ... n          4 ... n+1     APP N3 ... APP Nn
Table 4
The calculation of the “priority index” is based on knowledge gathering and intelligence creation. The default configuration is extracted from Table 1, which is updated by Unit 1 during initial configuration. The device (300) collects the data of the “number of executions” and the “time period of execution” of the applications in the device (300) on a per-day basis. This information is assumed to be known and is collected from the underlying operating system in the device (300). Further, the device may create Table 5 on a per-day basis over every 3 days, with Table 5 being refreshed and updated every 3 days.
The priority index is decided on two factors: (1) the number of executions, represented by N_exe; and (2) the time period of execution, represented by T_exe, for an application APP X. Applications with a larger value of N_exe are given a higher priority. Further, in the case of one or more applications with equal values of N_exe, the application with the larger value of T_exe is given a higher priority. In the case of one or more applications with equal N_exe and T_exe, the selection of the application is random. On the basis of Table 5, the priority index is decided and Table 4 is created by Unit 4. Those skilled in the art will appreciate that the data used in the various tables is for demonstration purposes only.
APPLICATION IDENTIFIER   N_EXE   T_EXE   LATEST USED TIMESTAMP
APP1                     7       45      12:03:04
APP2                     6       76      12:04:08
APP ... N                ....    ....    ........
Table 5
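The derivation of Table 4 from Table 5 can be sketched as a two-key sort. The function name and the dictionary representation of the usage data are assumptions; the ordering rules (higher N_exe first, then higher T_exe, random on exact ties, touch count = priority index + 1) follow the specification.

```python
import random

def assign_priority(usage):
    """Derive Table 4 priorities from Table 5 usage statistics.

    `usage` maps application id -> (N_exe, T_exe). Applications are
    sorted by higher N_exe, then higher T_exe; exact ties are broken
    randomly via a pre-shuffle before the stable sort.
    """
    apps = list(usage.items())
    random.shuffle(apps)  # random order resolves exact N_exe/T_exe ties
    apps.sort(key=lambda kv: (-kv[1][0], -kv[1][1]))
    return {app: {"priority_index": i + 1, "touch_count": i + 2}
            for i, (app, _) in enumerate(apps)}
```

For example, with the Table 5 data above plus a hypothetical APP3 executed 7 times for 50 minutes, APP3 outranks APP1 on T_exe and both outrank APP2 on N_exe.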
Figure 12 illustrates an exemplary use case (120), in accordance with an embodiment of the invention. The user may perform multiple touch events in the vacant area of the current home screen (step 121). At this time, Unit 2 takes over control, matches the “touch count” with Table 1, and collects the location information for the selected application (step 122). Say the selected application App 3 is found on Screen 3 (step 123); it is then moved to the current home screen (step 124).
In continuation of the previous figure, Figure 13 illustrates another use case (130) where App 3 has been moved from its original location to the current home screen (step 131). In this case, Unit 3 keeps continuously monitoring the process state of App 3 and waits for a control signal from Unit 1 with the same “touch count” to send App 3 back to its original location (step 132). When the user performs multiple touch events anywhere in the vacant area in the current screen of App 3 (step 133), App 3 is moved back to its original location on Screen 3 (step 134). At the same time, the home screen returns to its original state (step 135), where the user can repeat the above steps to call another application.
Figure 14 illustrates an exemplary use case (140), in accordance with an embodiment of the invention. The user may perform multiple touch events in the vacant area of the current home screen (step 141). At this time, Unit 2 takes over control, matches the “touch count” with Table 1, and collects the location information for a plurality of applications when the touch count is greater than or equal to a maximum threshold value (step 142). The plurality of applications (App 2, 3, 5, and 9) are located on their respective screens or on a common screen, say Screen 3 (step 143). After that, the plurality of applications are moved to a user interface in the current screen or any other screen having sufficient space for displaying the user interface (step 144). In one implementation, the user interface may be displayed as an overlay window over the current screen, for example, when there is insufficient free space on the current screen.
In continuation of the previous figure, Figure 15 illustrates another use case (150) where multiple applications (App 2, 3, 5, and 9) have been moved from their original locations to said user interface (step 151). In this case, Unit 3 keeps continuously monitoring the process state of these multiple applications and waits for a control signal from Unit 1 with a “touch count” greater than or equal to the maximum threshold value, to send these applications back to their respective original locations (step 152). When the user performs multiple touch events anywhere in the vacant area in the current screen of said user interface (step 153), the applications displayed in the user interface are moved back to their respective original locations (step 154). At the same time, the home screen returns to its original state (step 155), where the user can repeat the above steps to call another application or said user interface.
In existing solutions, the user typically searches by application name, manually browses through the list of applications, or arranges the applications in separate folders. This involves considerable user effort when the applications are large in number. In contrast, the present invention provides the user with easy options to access the applications he/she wants to use. The user does not mandatorily have to remember the application name or the previously stored application location to access the application. As more applications are installed in the device (300), it becomes increasingly difficult to manage them. With the present invention, the user merely has to remember the “touch count”; if the “automate” option is enabled, the user does not even need to remember the “touch count”. Moreover, the proposed solution is easy to use: its aim is simply to present the application for access by the user in the simplest way. With the present invention, the user can also do away with the practice of keeping important and frequently used applications on the home screen, which in turn makes the home screen less cluttered and hence improves the performance of the device (300), as less RAM is utilized by the home screen.
While certain present preferred embodiments of the invention have been illustrated and described herein, it is to be understood that the invention is not limited thereto, but may be otherwise variously embodied and practiced within the scope of the following claims.

Documents


Application Documents

# Name Date
1 Specifications.pdf 2014-11-14
2 FORM 5.pdf 2014-11-14
3 FORM 3.pdf 2014-11-14
4 Form 26.pdf 2014-11-14
5 Drawingss.pdf 2014-11-14
6 3141-del-2014-Form-1-(17-11-2014).pdf 2014-11-17
7 3141-DEL-2014-Correspondence-171114.pdf 2014-12-04
8 3141-DEL-2014-PA [18-09-2019(online)].pdf 2019-09-18
9 3141-DEL-2014-ASSIGNMENT DOCUMENTS [18-09-2019(online)].pdf 2019-09-18
10 3141-DEL-2014-8(i)-Substitution-Change Of Applicant - Form 6 [18-09-2019(online)].pdf 2019-09-18
11 3141-DEL-2014-Correspondence-101019.pdf 2019-10-14
12 3141-DEL-2014-OTHERS-101019.pdf 2019-10-16
13 3141-DEL-2014-FER.pdf 2019-11-18
14 3141-DEL-2014-FER_SER_REPLY [15-05-2020(online)].pdf 2020-05-15
15 3141-DEL-2014-COMPLETE SPECIFICATION [15-05-2020(online)].pdf 2020-05-15
16 3141-DEL-2014-CLAIMS [15-05-2020(online)].pdf 2020-05-15
17 3141-DEL-2014-OTHERS [15-05-2020(online)].pdf 2020-05-15
18 3141-DEL-2014-US(14)-HearingNotice-(HearingDate-17-05-2023).pdf 2023-05-01
19 3141-DEL-2014-Correspondence to notify the Controller [15-05-2023(online)].pdf 2023-05-15
20 3141-DEL-2014-FORM-26 [16-05-2023(online)].pdf 2023-05-16
21 3141-DEL-2014-Written submissions and relevant documents [25-05-2023(online)].pdf 2023-05-25
22 3141-DEL-2014-PatentCertificate29-08-2023.pdf 2023-08-29
23 3141-DEL-2014-IntimationOfGrant29-08-2023.pdf 2023-08-29

Search Strategy

1 Search3141_02-11-2019.pdf

ERegister / Renewals

3rd: 28 Oct 2023 (from 31/10/2016 to 31/10/2017)
4th: 28 Oct 2023 (from 31/10/2017 to 31/10/2018)
5th: 28 Oct 2023 (from 31/10/2018 to 31/10/2019)
6th: 28 Oct 2023 (from 31/10/2019 to 31/10/2020)
7th: 28 Oct 2023 (from 31/10/2020 to 31/10/2021)
8th: 28 Oct 2023 (from 31/10/2021 to 31/10/2022)
9th: 28 Oct 2023 (from 31/10/2022 to 31/10/2023)
10th: 28 Oct 2023 (from 31/10/2023 to 31/10/2024)
11th: 28 Oct 2024 (from 31/10/2024 to 31/10/2025)
12th: 12 Sep 2025 (from 31/10/2025 to 31/10/2026)