
A Method And System For Capturing User Intent To Read Screen Content And Provide Option(s)

Abstract: A method and an electronic device for assisting an active application of an electronic device are described. The method comprises detecting a user consent gesture corresponding to the active application. Further, the method comprises identifying at least one app-helper action in response to the user consent gesture by analyzing context of the active application. Further, the method comprises causing to display the at least one app-helper action, so identified for the active application. Furthermore, the method comprises executing an app-helper action selected from the at least one app-helper action displayed for the active application when a user selects the app-helper action from the at least one app-helper action displayed for the active application. FIG. 2


Patent Information

Application #
Filing Date
30 October 2015
Publication Number
37/2017
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application
Patent Number
Legal Status
Grant Date
25 January 2024
Renewal Date

Applicants

SAMSUNG R&D Institute India - Bangalore Private Limited
# 2870, Orion Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bangalore - 560 037, India

Inventors

1. Debayan Mukherjee
Samsung R&D Institute India – Bangalore, #2870, Bagmane Constellation Business Park, Doddanekundi, Marathahalli, Bangalore - 560037
2. Swadha Jaiswal
Samsung R&D Institute India – Bangalore, #2870, Bagmane Constellation Business Park, Doddanekundi, Marathahalli, Bangalore - 560037
3. Sowmya Radhakrishnan Iyer
Samsung R&D Institute India – Bangalore, #2870, Bagmane Constellation Business Park, Doddanekundi, Marathahalli, Bangalore - 560037
4. Mannu Amrit
Samsung R&D Institute India – Bangalore, #2870, Bagmane Constellation Business Park, Doddanekundi, Marathahalli, Bangalore - 560037
5. Preksha Shukla
Samsung R&D Institute India – Bangalore, #2870, Bagmane Constellation Business Park, Doddanekundi, Marathahalli, Bangalore - 560037
6. Karthikeyan Subramani
Samsung R&D Institute India – Bangalore, #2870, Bagmane Constellation Business Park, Doddanekundi, Marathahalli, Bangalore - 560037
7. Saumitri Choudhury
Samsung R&D Institute India – Bangalore, #2870, Bagmane Constellation Business Park, Doddanekundi, Marathahalli, Bangalore - 560037
8. Shailee Advani
Samsung R&D Institute India – Bangalore, #2870, Bagmane Constellation Business Park, Doddanekundi, Marathahalli, Bangalore - 560037
9. Dipesh Amritlal Shah
Samsung R&D Institute India – Bangalore, #2870, Bagmane Constellation Business Park, Doddanekundi, Marathahalli, Bangalore - 560037

Specification

DESC:
The following specification particularly describes and ascertains the nature of this invention and the manner in which it is to be performed:-


TECHNICAL FIELD
[001] The embodiments herein generally relate to the field of user experience on an electronic device and particularly relates to enhancing the user experience for an active application on the electronic device.

BACKGROUND
[002] Advances in device technology, supported by round-the-clock internet access, enable a user to perform a multitude of tasks on his/her electronic device. Generally, applications on the electronic device, such as messaging apps, a photo gallery, a camera, web browsers and the like, enable the user to perform one or more tasks corresponding to the application. An application can provide tasks within the application so as to enable the user to perform functions such as share, attach and share, call and so on. However, most applications, typically third party applications, do not provide an open Application Programming Interface (API) or Intent to an operating framework (native platform). Thus, the native platform is unable to access any data, activity or user action being performed inside the application. The native platform can only be aware of the launching and closing of the application activated by the user. This black box behavior of the applications prevents the native platform from providing any useful, native, contextual, intelligent options within the application to the user.
[003] A few existing methods address this; for example, the ‘Now on tap’ feature assisting the Google Now application reads a page on detecting a tap gesture and further analyzes the page content for Google Search results. However, the feature is limited to providing searches based on the page content read. Thus, existing methods may provide only related applications and are limited to content suggestions.

OBJECT OF INVENTION
[004] The principal object of the embodiments herein is to provide a method and system (electronic device) for assisting an application active on the electronic device (active application) to enhance user experience by providing one or more app-helper actions for the active application, wherein the one or more app-helper actions are add-on external applications that assist the active application.
[005] Another object of the invention is to provide a method that identifies one or more app-helper actions by analyzing context of the active application by mining a current page content of the active application and analyzing history of user usage pattern corresponding to the active application, wherein the app-helper actions are identified using an app-assist framework and app-action framework of a native platform of the electronic device.
[006] Another object of the embodiments herein is to provide a user consent gesture that indicates user consent to access the current page content for identifying and providing one or more app-helper actions to the user.

SUMMARY
[007] In view of the foregoing, an embodiment herein provides a method for assisting an active application of an electronic device. The method comprises detecting a user consent gesture corresponding to the active application. Further, the method comprises identifying at least one app-helper action in response to the user consent gesture by analyzing context of the active application. Further, the method comprises causing to display the at least one app-helper action, so identified for the active application. Furthermore, the method comprises executing an app-helper action selected from the at least one app-helper action displayed for the active application when a user selects the app-helper action from the at least one app-helper action displayed for the active application.
[008] Embodiments further disclose an electronic device for assisting an active application of the electronic device. The electronic device comprises an application assist module configured to detect a user consent gesture corresponding to the active application. Further, the application assist module can be configured to identify at least one app-helper action in response to the user consent gesture by analyzing context of the active application. Further, the application assist module can be configured to cause to display the at least one app-helper action, so identified for the active application. Furthermore, the application assist module is configured to execute an app-helper action selected from the at least one app-helper action displayed for the active application when a user selects the app-helper action from the at least one app-helper action displayed for the active application.
[009] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.

BRIEF DESCRIPTION OF FIGURES
[0010] The embodiments of this invention are illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[0011] FIG. 1 illustrates a plurality of components of an electronic device for assisting an application active on the electronic device (active application) to enhance user experience by providing one or more app-helper actions for the active application, according to embodiments as disclosed herein;
[0012] FIG. 2 is a flow diagram illustrating a method for assisting the active application to enhance the user experience by providing one or more app-helper actions for the active application, according to embodiments as disclosed herein;
[0013] FIG. 3 illustrates a functional block diagram depicting the data flow among a plurality of modules of an application assist module of the electronic device that provides one or more app-helper actions for the active application, according to embodiments as disclosed herein;
[0014] FIG. 4 is an example illustrating screen shots of user interfaces of the electronic device that provides enhanced user experience by providing one or more app-helper actions for an active chat application, according to embodiments as disclosed herein;
[0015] FIG. 5 is an example illustrating screen shots of user interfaces of the electronic device that provides enhanced user experience by providing one or more app-helper actions for an active browser application, according to embodiments as disclosed herein;
[0016] FIG. 6 is an example illustrating screen shots of user interfaces of the electronic device that provides enhanced user experience by providing one or more app-helper actions for an active camera application, according to embodiments as disclosed herein;
[0017] FIG. 7 is an example illustrating screen shots of user interfaces of the electronic device that provides enhanced user experience by providing one or more app-helper actions for an active music application, according to embodiments as disclosed herein;
[0018] FIG. 8 is an example illustrating screen shots of user interfaces of the electronic device that provides enhanced user experience by providing one or more app-helper actions for an active email application, according to embodiments as disclosed herein;
[0019] FIG. 9 is an example illustrating screen shots of user interfaces of the electronic device that provides enhanced user experience by providing one or more app-helper actions for an active map application, according to embodiments as disclosed herein;
[0020] FIG. 10 is an example illustrating screen shots of user interfaces of the electronic device that provides enhanced user experience by providing one or more app-helper actions for an active photo gallery application, according to embodiments as disclosed herein; and
[0021] FIG. 11 illustrates a computing environment implementing the method for assisting the active application to enhance user experience by automatically providing one or more app-helper actions for the active application, according to embodiments as disclosed herein.


DETAILED DESCRIPTION
[0022] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0023] The embodiments herein provide a method and system (electronic device) for assisting an application active on the electronic device (active application) to enhance the user experience by providing one or more app-helper actions for the active application. One or more app-helper actions include add-on external applications that can assist the active application. These external applications can include independent applications that are not provided to a user within the active application. The proposed method automatically identifies one or more app-helper actions by analyzing the context of the active application. The context is analyzed by mining the current page content of the active application along with analysis of the history of the user usage pattern corresponding to the active application. This analysis enables identification of user intent in the context of the active application and enhances user experience by providing the user-intended additional applications through app-helper actions. The analysis includes identifying the app-helper actions using an app-assist framework and an app-action framework provided by a native platform of the electronic device. For example, an operating platform such as the Android platform is the native platform of the electronic device. The method includes providing a user consent gesture that indicates user consent to access the current page content for identifying and providing one or more app-helper actions. Thus, the method includes providing one or more app-helper actions after detecting the user consent through the user consent gesture performed on the electronic device when an application is currently active on the electronic device.
[0024] One or more app-helper actions identified for the current active application can be displayed to the user on the screen of the electronic device. Thus, the method enables diverting traffic back from third party applications to the native applications provided by the native platform. Once the user selects an app-helper action from the one or more app-helper actions displayed, the method includes executing the selected app-helper action without exiting from the active application. For executing the app-helper action, the method includes automatically providing the current page content corresponding to the active application to the app-helper action selected by the user. Thus, the proposed method provides an easy way for the user to utilize add-on applications, where information from the active application is automatically fed to the add-on application selected by the user. Further, since the method includes executing the selected app-helper action without exiting from the active application, both the active application and the selected app-helper action can be simultaneously displayed to the user. For example, the selected app-helper action can be displayed in a cross view for a multitasking user.
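The combined content-and-history analysis described above can be illustrated with a minimal scoring sketch in Python. The function name, the keyword-overlap scoring scheme and all action names below are hypothetical illustrations, not part of the disclosed framework:

```python
# Hypothetical sketch: rank candidate app-helper actions by combining
# keywords mined from the active application's current page with the
# user's historical usage counts for each candidate action.

def rank_app_helper_actions(page_keywords, usage_history, candidates):
    """Return candidate actions ordered by a simple combined score."""
    scored = []
    for action, triggers in candidates.items():
        # Content score: how many trigger keywords appear on the page.
        content_score = len(set(triggers) & set(page_keywords))
        if content_score == 0:
            continue  # drop actions with no relevance to the page
        # History score: how often the user has picked this action before.
        history_score = usage_history.get(action, 0)
        scored.append((content_score * 10 + history_score, action))
    # Highest combined score first.
    return [action for score, action in sorted(scored, reverse=True)]

# Example: a chat page mentioning a contact and a phone number.
page = ["contact", "phone", "message"]
history = {"carrier_call": 7, "email": 2, "translate": 0}
candidates = {
    "carrier_call": ["contact", "phone"],
    "email": ["contact", "message"],
    "translate": ["foreign_text"],
}
print(rank_app_helper_actions(page, history, candidates))
# → ['carrier_call', 'email']
```

In this toy scheme a content match outweighs any realistic history count, so actions with no relevance to the current page are never surfaced regardless of past usage.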
[0025] In an embodiment, the electronic device is a mobile phone, a tablet, a personal digital assistant, a laptop, a wearable device or any other electronic device.
[0026] Referring now to the drawings, and more particularly to FIGS. 1 through 11, where similar reference characters denote corresponding features consistently throughout the figures, there are shown embodiments.
[0027] FIG. 1 illustrates a plurality of components of an electronic device 100 for assisting the active application to enhance the user experience by providing one or more app-helper actions for the active application, according to embodiments as disclosed herein.
[0028] Referring to figure 1, the electronic device 100 is illustrated in accordance with an embodiment of the present subject matter. In an embodiment, the electronic device 100 may include at least one processor 102, an input/output (I/O) interface 104 (herein a configurable user interface), a memory 106. The at least one processor 102 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 102 is configured to fetch and execute computer-readable instructions stored in the memory 106.
[0029] The I/O interface 104 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface and the like. The I/O interface 104 may allow the electronic device 100 to communicate with other devices. The I/O interface 104 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, Local Area Network (LAN), cable, and the like, and wireless networks, such as Wireless LAN, cellular, Device to Device (D2D) communication networks, Wi-Fi networks and so on. Modules 108 in the memory 106 include routines, programs, objects, components, data structures, and so on, which perform particular tasks or functions or implement particular abstract data types. The modules 108 may include programs or coded instructions that supplement applications and functions of the electronic device 100. Data 112, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 108 and an application assist module 110. In one implementation, the application assist module 110 further may include an app-assist framework 112 and an app-action framework 114 that identify one or more app-helper actions in context to the active application of the electronic device 100. In an embodiment, third party applications may be allowed to build features over the app-assist framework 112 and the app-action framework 114. The application assist module 110 can be configured to enhance user experience by automatically providing one or more app-helper actions for the active application on the electronic device 100. One or more app-helper actions include add-on external applications assisting the active application. These external applications can include independent applications that are not provided to the user within the active application.
For example, an add-on carrier call application (app-helper action) may be provided to an active WhatsApp chat application that itself provides only WhatsApp (Voice over Internet Protocol (VoIP)) internet calls.
[0030] The application assist module 110 can be configured to detect the user consent gesture performed on the electronic device corresponding to the active application. Further, the application assist module 110 can be configured to identify one or more app-helper actions in response to the user consent gesture. The identification of one or more app-helper actions is performed by analyzing context of the active application. Analyzing the context of the active application includes mining the current page content of the active application and analyzing history of the user usage pattern corresponding to the active application by the app-assist framework 112 and the app-action framework 114 provided by the native platform of the electronic device. Functions performed by the app-assist framework 112 and the app-action framework 114 are explained in conjunction with FIG. 3. Further, the application assist module 110 can be configured to cause displaying of one or more app-helper actions, so identified for the active application.
[0031] Further, the application assist module 110 can be configured to execute an app-helper action selected from the one or more app-helper actions displayed for the active application. The selected app-helper action can be executed without exiting from the active application. The app-assist module 110 can be configured to automatically provide the current page content corresponding to the active application for executing the app-helper action. The selected app-helper action can be displayed in the cross view of the graphical user interface of the I/O interface 104 for the multitasking user. A plurality of use case examples is explained in FIGS. 4 through 10 to aid understanding of the system in a real-time scenario.
[0032] The names of the other components and modules of the electronic device 100 are illustrative and need not be construed as a limitation.
[0033] FIG. 2 is a flow diagram illustrating a method 200 for assisting the active application to enhance the user experience by providing one or more app-helper actions for the active application, according to embodiments as disclosed herein. The method 200 includes allowing the app assist module 110 to provide the user consent gesture that indicates user consent to access the current page content of the active application for identifying and providing one or more app-helper actions to the user.
[0034] At step 202, the method 200 includes allowing the application assist module 110 to detect the user consent gesture performed on the electronic device corresponding to the active application. For example, the user consent gesture can be pulling of an application handle provided on the electronic device. The application handle is an existing user gesture tool available with edge-type electronic devices. However, any new gesture may be defined for indicating the user consent. In an embodiment, if no application is active on the electronic device 100 when the user consent gesture (pulling of the application handle) is detected, the method 200 allows the app-assist module to display one or more default applications predefined for the electronic device. However, when the user consent gesture is performed while an application is active, the identified app-helper actions in the context of the active application may be displayed to the user as an overlay on the active application. At step 204, the method 200 includes allowing the application assist module 110 to identify one or more app-helper actions in response to the user consent gesture. The identification of one or more app-helper actions is performed by analyzing the context of the active application. Analyzing the context of the active application includes mining the current page content of the active application and analyzing the history of the user usage pattern corresponding to the active application by the app-assist framework 112 and the app-action framework 114 provided by the native platform of the electronic device. The app-assist framework 112 and the app-action framework 114 are explained in conjunction with FIG. 3. At step 206, the method 200 includes allowing the application assist module 110 to cause displaying of one or more app-helper actions, so identified for the active application.
At step 208, the method 200 includes allowing the application assist module 110 to execute the app-helper action selected from the one or more app-helper actions displayed for the active application. Further, the selected app-helper action can be executed without exiting from the active application. The method 200 includes allowing the app-assist module to automatically provide the current page content corresponding to the active application for executing the app-helper action. The relevant required information from the active application can be auto-fed to the selected app-helper action, as explained in conjunction with the use case examples in FIGS. 4 through 10. In an embodiment, the selected app-helper action can be displayed in the cross view of the graphical user interface, partially overlapping the active application. The various actions in method 200 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions listed in FIG. 2 may be omitted.
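The four steps of method 200 can be summarized as a small dispatch sketch in Python. The class shape, the callback signatures and the example action names here are assumptions for illustration only, not the actual framework API:

```python
# Illustrative sketch of method 200: detect the consent gesture (202),
# identify app-helper actions from context (204), display them (206),
# and execute the user's selection with page content auto-fed (208);
# all without leaving the active application.

class ApplicationAssistModule:
    def __init__(self, identify_actions, display, default_apps):
        self.identify_actions = identify_actions  # page content -> actions
        self.display = display                    # shows overlay to user
        self.default_apps = default_apps          # used when nothing is active

    def on_consent_gesture(self, active_app):
        # Step 202: the gesture itself is the user's consent to read
        # the current page content of the active application.
        if active_app is None:
            # No active application: show predefined default apps.
            return self.display(self.default_apps)
        # Step 204: identify actions from the current page context.
        actions = self.identify_actions(active_app.page_content)
        # Step 206: display the identified actions as an overlay.
        return self.display(actions)

    def on_action_selected(self, active_app, action):
        # Step 208: auto-feed the current page content to the selected
        # helper; the active application stays open (cross view).
        return action(active_app.page_content)

# Minimal demo with stub collaborators:
class ActiveApp:
    page_content = "chat with Deb"

shown = []
module = ApplicationAssistModule(
    identify_actions=lambda content: ["carrier_call", "email"],
    display=lambda actions: shown.extend(actions) or shown,
    default_apps=["clock", "calendar"],
)
module.on_consent_gesture(ActiveApp())
print(shown)  # → ['carrier_call', 'email']
```

Here `identify_actions` and `display` stand in for the app-assist/app-action frameworks and the overlay UI respectively; in the disclosed system these are framework components, not injected callbacks.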
[0035] FIG. 3 illustrates a functional block diagram depicting the data flow among a plurality of modules of the application assist module 110 of the electronic device 100 that provides one or more app-helper actions for the active application, according to embodiments as disclosed herein. In an embodiment, a view system 304 of the app-assist framework 112, within the application assist module 110, interacts with an active application 302. The view system 304 is the basic building block for user interface components. A view of the view system 304 occupies a rectangular area on a screen of the electronic device 100 and can be configured to perform drawing and event handling based on screen reading 316. The view system 304 is assisted by an assist manager 308 of the app-assist framework 112. The assist manager 308 can be configured to interact with a voice interaction/service module 314, a screen manager 306 and a window manager 310 of the app-assist framework 112. The assist manager 308 can be configured to control and initiate the action of reading view information and to look for the active application 302 based on settings configured by a settings app/provider 312. The complete cycle of suggesting app-helper actions for the active application 302 is managed by the assist manager 308. The screen manager 306 can be configured to read screen contents 318 through the view system 304, arrange the screen content and provide input to the assist manager 308. The window manager 310 can be configured to create a window for the active application 302, interact with low level modules and send a screen buffer for display on the active application 302.
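The data flow just described (screen content read through the view system, arranged by the screen manager, and handed to the assist manager that drives the suggestion cycle) can be sketched as follows. All class and method names are hypothetical stand-ins for the numbered modules of FIG. 3:

```python
# Hypothetical sketch of the FIG. 3 data flow inside the app-assist
# framework: the screen manager reads raw screen content through the
# view system, arranges it, and hands it to the assist manager, which
# orchestrates the suggestion cycle for the active application.

class ViewSystem:
    def __init__(self, screen_text):
        self.screen_text = screen_text

    def read(self):
        # Screen reading (316/318): return the raw visible content.
        return self.screen_text

class ScreenManager:
    def __init__(self, view_system):
        self.view_system = view_system

    def arranged_content(self):
        # Arrange raw content into a simple structured form.
        lines = self.view_system.read().splitlines()
        return [line.strip() for line in lines if line.strip()]

class AssistManager:
    def __init__(self, screen_manager):
        self.screen_manager = screen_manager

    def run_cycle(self):
        # Control the cycle: pull arranged content and pass it on for
        # action derivation (handled by the app-action framework).
        return self.screen_manager.arranged_content()

view = ViewSystem("Deb\n  +91 9000000000  \n\nSee you at 5pm")
assist = AssistManager(ScreenManager(view))
print(assist.run_cycle())  # → ['Deb', '+91 9000000000', 'See you at 5pm']
```

The voice interaction/service module 314, settings app/provider 312 and window manager 310 are omitted from this sketch for brevity.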
[0036] Further, a Natural Language Processing (NLP) Engine 322 of the app-action framework 114 can be configured to follow a set of pre-defined rules defined by the application assist module 110 and utilize a Natural Language Processing algorithm for learning procedures (machine learning), automatically focusing on the most common cases. The NLP engine 322 can be configured to receive and process input from a data provider 320 to derive different actions for the active application 302. These actions are provided to an action relation manager 324. The action relation manager 324 can be configured to arrange the actions suggested by the NLP engine 322, categorize the actions and prepare the app-helper actions specific to the active application 302. The app-helper actions are provided to an app-helper action provider 326. The app-helper action provider 326 can be configured to provide services in the application assist module 110. Any application on the electronic device 100 registers with these services. The app-helper actions are prepared as per an application specific format and propagated to the registered application for rendering on the screen of the electronic device 100, providing an enhanced user experience. An Application Programming Interface (API) Layer 328 functions as a Software Development Kit (SDK) layer for applications on the electronic device 100.

FIG. 4 is an example illustrating screen shots of user interfaces of the electronic device 100 that provides enhanced user experience by intelligently providing one or more app-helper actions for an active chat application, according to embodiments as disclosed herein. In the example, a chat screen 404 of WhatsApp (active chat application) is provided with an application handle 402. When the user performs the user consent gesture by pulling the application handle 402, the user permits the app-assist module 110 to read the screen content and decipher that it is the chat screen corresponding to a contact Deb 406.
Based on the current context of the active WhatsApp application, the app-assist module 110 pops out one or more app-helper actions 408. The app-helper actions correspond to applications the currently active WhatsApp application does not provide within the application. For example, WhatsApp does not support a Carrier Call 410 to the contact Deb 406, but this is provided by the add-on application corresponding to the app-helper action popped on the display screen, along with applications such as email and Facebook Messenger. If the user selects the Carrier Call 410, the contact details of the contact Deb 406 are automatically provided to the carrier call app-helper action 410 and a call is initiated. The carrier call application 410 is launched in a cross view without exiting from the currently active WhatsApp application.
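The app-action framework pipeline of FIG. 3, in which the NLP engine derives actions from content, the action relation manager categorizes them, and the app-helper action provider propagates them to registered applications, can be sketched as below. The rule set, category names and registration API are illustrative assumptions; a real NLP engine would use learned rules rather than a few regular expressions:

```python
# Hypothetical sketch of the app-action framework pipeline of FIG. 3.
# A toy rule set stands in for the NLP engine's learned rules; the
# action relation manager step groups derived actions by category, and
# the provider propagates them to registered applications.

import re

def nlp_derive_actions(content):
    """Derive (category, action) pairs from screen content (NLP engine 322)."""
    actions = []
    if re.search(r"\+?\d[\d\s-]{7,}\d", content):  # phone-number-like run
        actions.append(("communicate", "carrier_call"))
    if re.search(r"\S+@\S+\.\S+", content):  # email-address-like token
        actions.append(("communicate", "email"))
    if re.search(r"\b\d{1,2}\s?(am|pm)\b", content, re.IGNORECASE):  # time of day
        actions.append(("schedule", "calendar"))
    return actions

class AppHelperActionProvider:
    """Provider 326: applications register, prepared actions are propagated."""
    def __init__(self):
        self.registered = []

    def register(self, callback):
        self.registered.append(callback)

    def propagate(self, actions):
        # Action relation manager 324 step: categorize the actions.
        by_category = {}
        for category, action in actions:
            by_category.setdefault(category, []).append(action)
        for callback in self.registered:
            callback(by_category)

received = []
provider = AppHelperActionProvider()
provider.register(received.append)
provider.propagate(nlp_derive_actions("Meet Deb at 5pm, call +91 900000 0000"))
print(received[0])
# → {'communicate': ['carrier_call'], 'schedule': ['calendar']}
```

A registered application would receive the categorized actions in its own application-specific format; plain dictionaries are used here for simplicity.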
[0037] FIG. 5 is an example illustrating screen shots of user interfaces of the electronic device 100 that provides enhanced user experience by automatically providing one or more app-helper actions for a browser application, according to embodiments as disclosed herein. When the user performs the user consent gesture by pulling the application handle 504, the user permits the app-assist module 110 to read a current page 502 of the active browser. The browser current page 502 displays an article in the German language. When the application handle 504 is pulled, the current page content is read and analyzed by the app-assist module 110 to be German. Thus, in the context of the current page contents and the user history indicating that the user mostly reads English articles, the app assist module 110 can be configured to pop app-helper actions 506 that may include a Translate app along with other commonly used sharing applications preferred by the user. For example, the user history may indicate that the user prefers Facebook and Twitter for sharing information on social networking sites. On selecting a suggested app-helper action, such as the Translate app, the page content is read, automatically fed to the Translate app and translated to English for an English-speaking user. In such use cases a cross view of the app-helper action is not required, since the current active application itself is opened again with translated contents.
[0038] FIG. 6 is an example illustrating screen shots of user interfaces of the electronic device 100 that provides enhanced user experience by automatically providing one or more app-helper actions for a camera application, according to embodiments as disclosed herein. On detecting pulling of an application handle 602 when the camera application is currently active on the electronic device 100, a photo 602 of a business card captured by the camera application is read to determine the app-helper actions to be suggested. In the example, suggestions of the app-helper actions 606 include add-contact and so on. Further, when the user selects the add-contact app-helper action, a cross-view of an add-contact UI 608 is launched with the page context of the camera app (business card information) auto-filled in the add-contact UI 608. Further, the user can quickly edit the auto-filled information and save it to contacts.
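The business-card flow of FIG. 6 depends on parsing recognized card text into contact fields before auto-filling the add-contact UI. A minimal parsing sketch is given below; it assumes OCR text is already available and uses deliberately simple heuristics (real card layout analysis is far more involved):

```python
# Hypothetical sketch of the FIG. 6 business-card flow: text recognized
# from the captured photo (OCR output, assumed already available) is
# parsed into contact fields ready to auto-fill an add-contact form.

import re

def parse_business_card(text):
    """Extract name, phone and email from recognized card text."""
    fields = {}
    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", text)
    if email:
        fields["email"] = email.group()
    phone = re.search(r"\+?\d[\d\s-]{7,}\d", text)
    if phone:
        fields["phone"] = phone.group()
    # Assume the first non-empty line that is neither the phone nor the
    # email is the name; a simplification of real card layout analysis.
    for line in text.splitlines():
        line = line.strip()
        if line and line not in fields.values():
            fields["name"] = line
            break
    return fields

card_text = "Debayan Mukherjee\n+91 80 1234 5678\ndeb@example.com"
print(parse_business_card(card_text))
# → {'email': 'deb@example.com', 'phone': '+91 80 1234 5678', 'name': 'Debayan Mukherjee'}
```

Real deployments would rely on an OCR engine and a contact-schema mapping; the regular expressions above are only meant to make the auto-fill step concrete.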
[0039] FIG. 7 is an example illustrating screen shots of user interfaces of the electronic device 100 that provides enhanced user experience by automatically providing one or more app-helper actions for a music application, according to embodiments as disclosed herein. In the example, the currently active music application is depicted playing a music track of artist X 702. On detection of pulling of an application handle 704, the music track being played is read and an app-helper action 706 such as the YouTube app is suggested. On the user selecting the YouTube app-helper action, a YouTube app 708 is launched in cross-view as an overlay, with search results presented for the same track that was played in the active music application. On selecting one of the search results in the YouTube cross-view 708, the corresponding video starts playing in cross-view 710.
[0040] FIG. 8 is an example illustrating screen shots of user interfaces of the electronic device that provides enhanced user experience by providing one or more app-helper actions for an active email application, according to embodiments as disclosed herein. The figure depicts the active application with an email being composed to a contact, with schedule information in the email body. On pulling of an application handle (user consent gesture) as described in FIGS. 4 to 7, app-helper actions such as calendar and memo are suggested, which may help the user add the schedule content detected after screen reading. On the user selecting an S Planner suggestion, an S Planner cross-view appears with the original context read from the background app content already added, ready to be edited and saved by the user.
[0041] FIG. 9 is an example illustrating screen shots of user interfaces of the electronic device that provides an enhanced user experience by providing one or more app-helper actions for an active map application, according to embodiments as disclosed herein. The figure depicts the active map application with a map location selected. On pulling of an application handle (the user consent gesture) as described in FIGS. 4 to 7, the selected map location view is read and app-helper action suggestions, such as sharing the location via various applications, are displayed. When the user selects the WhatsApp sharing option, the WhatsApp cross-view is launched with the contact-picker context. On selecting a contact/group in the contact picker in WhatsApp, the context from the map application is auto-filled in the WhatsApp cross-view.
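For the map-sharing flow, the context handed to the picked application can be as simple as a geo URI built from the selected coordinates. A small illustrative helper, assuming the common geo: URI scheme (RFC 5870); the function itself is not part of the specification:

```python
def build_geo_share(lat, lng, label=None):
    """Compose a geo: URI plus a human-readable share message."""
    uri = f"geo:{lat:.6f},{lng:.6f}"
    text = f"{label}: {uri}" if label else uri
    return {"uri": uri, "text": text}
```

The "text" field is what would be auto-filled into the WhatsApp cross-view once a contact or group is picked.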
[0042] FIG. 10 is an example illustrating screen shots of user interfaces of the electronic device that provides an enhanced user experience by providing one or more app-helper actions for an active photo gallery application, according to embodiments as disclosed herein. When an application handle (the user consent gesture) is pulled, the page content is read and analyzed to be a photo. Further, app-helper actions such as photo editing apps like Instagram and Pic Collage are suggested. These applications can be opened with the photo auto-fed into them so that editing can start right away. On selecting one of the suggested app-helper actions, such as Instagram, the same photo is opened in edit mode in Instagram in a cross-view for the user's multitasking convenience.
[0043] In another example, when a gallery album view of the photo gallery application is open and the user pulls the handle, the event in the gallery album is recognized and app-helper actions such as Facebook and WhatsApp sharing are suggested. On selecting the Facebook sharing action, the Facebook posting cross-view appears, where the user can make quick edits and share immediately, without a change of the original context or additional effort.
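Across FIGS. 6 to 10 the common pattern is a lookup from the detected page-content type to a set of candidate helper actions, filtered to what is available on the device. One way to sketch that dispatch; the mapping table simply mirrors the examples in the figures, and the concrete structure is an assumption, not the claimed app-assist framework:

```python
# Assumed mapping from detected content type to candidate helper actions,
# mirroring the examples in FIGS. 6-10. A real implementation would also
# weigh the user's usage history when ranking the candidates.
SUGGESTIONS = {
    "business_card": ["add-contact"],
    "music_track": ["YouTube"],
    "schedule_text": ["S Planner", "Memo"],
    "map_location": ["WhatsApp", "Email"],
    "photo": ["Instagram", "Pic Collage"],
    "album": ["Facebook", "WhatsApp"],
}

def suggest_actions(content_type, installed):
    """Return helper actions for the content type, limited to installed apps."""
    return [a for a in SUGGESTIONS.get(content_type, []) if a in installed]
```

On the user consent gesture, the detected content type would be fed through this lookup and the surviving actions rendered as the suggestion strip.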
[0044] FIG. 11 illustrates a computing environment implementing the method for assisting the active application to enhance user experience by automatically providing one or more app-helper actions for the active application, according to embodiments as disclosed herein. As depicted, the computing environment 1102 comprises at least one processing unit 1104 that is equipped with a control unit 1106 and an Arithmetic Logic Unit (ALU) 1108, a memory 1110, a storage unit 1112, a plurality of networking devices 1114, and a plurality of input/output (I/O) devices 1116. The processing unit 1104 is responsible for processing the instructions of the algorithm. The processing unit 1104 receives commands from the control unit 1106 in order to perform its processing. Further, any logical and arithmetic operations involved in the execution of the instructions are computed with the help of the ALU 1108.
[0045] The overall computing environment 1102 can be composed of multiple homogeneous and/or heterogeneous cores, multiple CPUs of different kinds, special media, and other accelerators. Further, the plurality of processing units 1104 may be located on a single chip or over multiple chips.
[0046] The instructions and code required for the implementation of the algorithm are stored in the memory unit 1110, the storage 1112, or both. At the time of execution, the instructions may be fetched from the corresponding memory 1110 and/or storage 1112 and executed by the processing unit 1104. In the case of any hardware implementation, various networking devices 1114 or external I/O devices 1116 may be connected to the computing environment to support the implementation through the networking unit and the I/O device unit. The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in FIG. 1 through FIG. 11 include blocks which can be at least one of a hardware device, or a combination of a hardware device and a software module.
[0047] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
STATEMENT OF CLAIMS
We claim:
1. A method for assisting an active application of an electronic device, the method comprising:
detecting, by an electronic device, a user consent gesture corresponding to the active application;
identifying, by the electronic device, at least one app-helper action in response to the user consent gesture by analyzing context of the active application; and
causing to display, by the electronic device, the at least one app-helper action so identified for the active application.
2. The method as claimed in claim 1, wherein said method comprises executing an app-helper action selected from the at least one app-helper action displayed for the active application when a user selects the app-helper action from the at least one app-helper action displayed for the active application.
3. The method as claimed in claim 2, wherein the app-helper action selected from the at least one app-helper action is executed without exiting from the active application, wherein information of the current page content corresponding to the active application is provided to the app-helper action for executing the app-helper action.
4. The method as claimed in claim 1, wherein analyzing the context of the active application comprises mining a current page content of the active application and analyzing history of user usage pattern corresponding to the active application by an app-assist framework and an app-action framework provided by a native platform of the electronic device.
5. The method as claimed in claim 1, wherein the at least one app-helper action, so identified, is an add-on external application provided to assist the active application.
6. An electronic device for assisting an active application of the electronic device, the electronic device comprising an application assist module, wherein the application assist module is configured to:
detect a user consent gesture corresponding to the active application;
identify at least one app-helper action in response to the user consent gesture by analyzing context of the active application; and
cause to display the at least one app-helper action, so identified for the active application.
7. The electronic device as claimed in claim 6, wherein the application assist module is configured to execute an app-helper action selected from the at least one app-helper action displayed for the active application when a user selects the app-helper action from the at least one app-helper action displayed for the active application.
8. The electronic device as claimed in claim 7, wherein the application assist module is configured to execute the app-helper action selected from the at least one app-helper action without exiting from the active application, wherein information of the current page content corresponding to the active application is provided to the app-helper action for executing the app-helper action.
9. The electronic device as claimed in claim 6, wherein the application assist module is configured to analyze the context of the active application by mining a current page content of the active application and analyzing history of user usage pattern corresponding to the active application by an app-assist framework and an app-action framework provided by a native platform of the electronic device.
10. The electronic device as claimed in claim 6, wherein the at least one app-helper action, so identified, is an add-on external application provided to assist the active application.

Dated this 9th of March, 2016
Signature:
Name of the Signatory: Dr. Kalyan Chakravarthy

Documents

Application Documents

# Name Date
1 5878-CHE-2015-IntimationOfGrant25-01-2024.pdf 2024-01-25
2 Form 5 [30-10-2015(online)].pdf 2015-10-30
3 5878-CHE-2015-PatentCertificate25-01-2024.pdf 2024-01-25
4 Form 3 [30-10-2015(online)].pdf 2015-10-30
5 Drawing [30-10-2015(online)].pdf 2015-10-30
6 5878-CHE-2015-Annexure [24-01-2024(online)].pdf 2024-01-24
7 Description(Provisional) [30-10-2015(online)].pdf 2015-10-30
8 5878-CHE-2015-Written submissions and relevant documents [24-01-2024(online)].pdf 2024-01-24
9 Drawing [09-03-2016(online)].pdf 2016-03-09
10 5878-CHE-2015-Annexure [04-01-2024(online)].pdf 2024-01-04
11 Description(Complete) [09-03-2016(online)].pdf 2016-03-09
12 5878-CHE-2015-Correspondence to notify the Controller [04-01-2024(online)].pdf 2024-01-04
13 5878-CHE-2015-Power of Attorney-210416.pdf 2016-07-13
14 5878-CHE-2015-FORM-26 [04-01-2024(online)].pdf 2024-01-04
15 5878-CHE-2015-US(14)-HearingNotice-(HearingDate-09-01-2024).pdf 2023-12-05
16 5878-CHE-2015-Correspondence-PA-210416.pdf 2016-07-13
17 5878-CHE-2015-ABSTRACT [01-07-2020(online)].pdf 2020-07-01
18 Form-18(Online).pdf 2016-09-26
19 5878-CHE-2015-CLAIMS [01-07-2020(online)].pdf 2020-07-01
20 5878-CHE-2015-FORM-26 [15-03-2018(online)].pdf 2018-03-15
21 5878-CHE-2015-COMPLETE SPECIFICATION [01-07-2020(online)].pdf 2020-07-01
22 5878-CHE-2015-FORM-26 [16-03-2018(online)].pdf 2018-03-16
23 5878-CHE-2015-CORRESPONDENCE [01-07-2020(online)].pdf 2020-07-01
24 5878-CHE-2015-FER.pdf 2020-01-21
25 5878-CHE-2015-FER_SER_REPLY [01-07-2020(online)].pdf 2020-07-01
26 5878-CHE-2015-OTHERS [01-07-2020(online)].pdf 2020-07-01

Search Strategy

1 search_20-01-2020.pdf

ERegister / Renewals

3rd: 25 Apr 2024 (30/10/2017 to 30/10/2018)
4th: 25 Apr 2024 (30/10/2018 to 30/10/2019)
5th: 25 Apr 2024 (30/10/2019 to 30/10/2020)
6th: 25 Apr 2024 (30/10/2020 to 30/10/2021)
7th: 25 Apr 2024 (30/10/2021 to 30/10/2022)
8th: 25 Apr 2024 (30/10/2022 to 30/10/2023)
9th: 25 Apr 2024 (30/10/2023 to 30/10/2024)
10th: 25 Apr 2024 (30/10/2024 to 30/10/2025)
11th: 30 Oct 2025 (30/10/2025 to 30/10/2026)