Abstract: METHOD AND SYSTEM FOR MANAGING APPLICATIONS RUNNING ON SMART DEVICE USING A WEARABLE DEVICE. The present invention describes a method and system for managing applications running on one or more smart devices. The method comprises displaying a plurality of application icons on a wearable device, wherein each icon from the plurality of icons represents an active application on the smart device connected to the wearable device; receiving a touch gesture on one or more application icons from the plurality of icons; and triggering the smart device to perform an event comprising an interaction between the active applications represented by the one or more application icons in response to the touch gesture. The system 100 comprises a wearable device and one or more smart devices. The wearable device comprises an application module, a SAP Gesture handler server, and accessory protocols. The smart device comprises an application handler daemon, a SAP gesture handler client, and accessory protocols. Figures 1 & 2
CLAIMS:
We claim:
1. A method for managing applications running on one or more smart devices, the method comprising:
displaying a plurality of application icons on a wearable device, wherein each icon from the plurality of icons represents an active application on the smart device connected to the wearable device;
receiving a touch gesture on one or more application icons from the plurality of application icons; and
triggering the smart device to perform an event comprising an interaction between the active applications represented by the one or more application icons in response to the touch gesture.
2. The method as claimed in claim 1, wherein the smart device is one of a smart phone, a smart TV, and a tablet.
3. The method as claimed in claim 1, wherein triggering the smart device to perform an event comprises switching a control feature between an application mode and a priority mode on receiving the touch gesture.
4. The method as claimed in claim 3, wherein the application mode is used to send one or more of the active applications to one of a foreground and background of the smart device.
5. The method as claimed in claim 3, wherein the priority mode is used to assign user priority to the one or more active applications based on predefined instruction.
6. The method as claimed in claim 5, wherein hardware and software resources of the smart device are shared among the active applications based on the priority assigned by the user, wherein the top priority application gets more resources compared to a relatively lower priority application.
7. The method as claimed in claim 1, wherein the touch gesture for triggering the smart device to perform an event comprises changing a priority of one or more incoming calls to pick up one call and keep another call on hold.
8. The method as claimed in claim 1, wherein the touch gesture comprises at least one of: swapping and tapping; pinching and bringing icons together; pinching and zooming an application icon; tapping twice on an application icon; and pressing an application icon for a predefined period and dragging it from one priority quadrant to another priority quadrant.
9. The method as claimed in claim 1, wherein the touch gesture for triggering the smart device to perform an event comprises pinch zooming an application icon for performing one of:
closing an application associated with a priority quadrant which is receiving the pinch zooming gesture, and
closing the remaining one or more active applications on the smart device.
10. The method as claimed in claim 1, wherein the touch gesture, for triggering the smart device to perform an event, comprises at least one of
pinching and bringing icons of a first incoming call and a second incoming call together to merge both the calls into a conference call;
pressing an icon for a predefined period and dragging to a first priority quadrant of a display of the wearable device to split screen of the smart device;
pinching and bringing icons of two browsers displayed in two priority quadrants together into one priority quadrant to open all the tabs in one browser on the smart device and close the other browser based on a predefined instruction;
pinching and bringing icons of a memo application and an email application together to send the memo as an attachment in the e-mail;
pinching and zooming an application icon to terminate one or more remaining active applications in the smart device;
pinching and bringing icons of a music application and a web browser together to search details of a currently playing song on the web browser;
tapping twice on a music icon displayed in one of the priority quadrants for changing the music tracks playing on the smart device;
dragging a program icon to the first priority quadrant to change the running program on a smart TV;
pressing a program icon for a predefined period and dragging it to the first priority quadrant to split the display screen of a smart TV; and
tapping twice on an icon of a TV channel for opening channel settings, thereby allowing a user to change the channel settings,
wherein a display of the wearable device is divided into at least four priority quadrants representing the first priority quadrant, a second priority quadrant, a third priority quadrant, and a fourth priority quadrant.
11. The method as claimed in claim 1, wherein two or more active applications on the smart device are merged on receiving a predefined gesture on the wearable device to perform one or more functions on the smart device based on a predefined configuration.
12. A wearable device, comprising:
a memory that is configured to store computer-executable instructions; and
one or more processors communicatively coupled to the memory, wherein the one or more processors are configured to execute the computer-executable instructions stored in the memory to:
display a plurality of application icons on the wearable device, wherein each icon from the plurality of icons represents an active application on a smart device connected to the wearable device;
receive a touch gesture on one or more application icons from the plurality of application icons; and
send an instruction to the smart device to perform an event comprising an interaction between the active applications represented by the one or more application icons in response to the touch gesture.
Dated this the 17th day of July 2015
Signature
KEERTHI J S
Patent agent
Agent for the applicant
FORM 2
THE PATENTS ACT, 1970
[39 of 1970]
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(Section 10; Rule 13)
METHOD AND SYSTEM FOR MANAGING APPLICATIONS RUNNING ON SMART DEVICE USING A WEARABLE DEVICE
SAMSUNG R&D INSTITUTE INDIA – BANGALORE Pvt. Ltd.
# 2870, ORION Building, Bagmane Constellation Business Park,
Outer Ring Road, Doddanakundi Circle,
Marathahalli Post,
Bangalore -560037, Karnataka, India
Indian Company
The following Specification particularly describes the invention and the manner in which it is to be performed
FIELD OF THE INVENTION
The present invention generally relates to wearable devices and more particularly to a method and system for managing applications running on a smart device using a wearable device.
BACKGROUND OF THE INVENTION
A wearable device such as a smartwatch is a computerized wristwatch having enhanced functions beyond timekeeping, whereas existing smartwatches perform basic functions such as calculations, translations, and game playing. Users are now surrounded by a number of smart devices, and managing these devices individually is a cumbersome process. However, controlling smart devices with wearable devices is known only for limited functions.
The present state of the art does not enable a user to prioritize one or more applications and/or handle multiple applications running on a smart device through a wearable device by interacting with the wearable device. Smart devices include, but are not limited to, a smartphone, a tablet, and a smart TV. A smart device, such as a smartphone, runs various applications such as Social Network Services (SNSs), emails, and Instant Messaging (IM) applications.
Additionally, there is no system that provides an interactive user experience (UX) to control and manage multiple programs simultaneously on smart devices.
Therefore, there is a need for a method and system for managing multiple smart devices by controlling the programs or applications running on a smart device using a wearable device.
SUMMARY
An embodiment of the present invention describes a method for managing applications running on one or more smart devices. The method comprises displaying a plurality of application icons on a wearable device, wherein each icon from the plurality of icons represents an active application on the smart device connected to the wearable device, receiving a touch gesture on one or more application icons from the plurality of icons, and triggering the smart device to perform an event comprising an interaction between the active applications represented by the one or more application icons in response to the touch gesture.
Another embodiment of the present invention describes a wearable device, which comprises a memory that is configured to store computer-executable instructions, and one or more processors communicatively coupled to the memory. The one or more processors are configured to execute the computer-executable instructions stored in the memory to display a plurality of application icons on the wearable device, wherein each icon from the plurality of application icons represents an active application on a smart device connected to the wearable device, receive a touch gesture on one or more application icons from the plurality of icons, and send an instruction to the smart device to perform an event comprising an interaction between the active applications represented by the one or more application icons in response to the touch gesture.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The aforementioned aspects and other features of the present invention will be explained in the following description, taken in conjunction with the accompanying drawings, wherein:
Figure 1 illustrates a system for managing communication between a smart device and a wearable device according to an exemplary embodiment of the present invention.
Figure 2 illustrates a scenario of switching between an application mode and a priority mode of the control (UX) application running on the wearable device on receiving a predefined gesture, according to an exemplary embodiment of the present invention.
Figure 3 illustrates a scenario of handling an incoming call on the smart device on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 4 illustrates a scenario of merging two or more incoming calls and converting into a conference call on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 5 illustrates a scenario of sharing a smartphone screen between two applications on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 6 illustrates a scenario of merging multiple browsers in a smart device such as a tablet on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 7 illustrates a scenario of merging multiple browsers in a smart device such as a smartphone on receiving a predefined gesture on a wearable device, according to another exemplary embodiment of the present invention.
Figure 8 illustrates a scenario of sending a memo as an email attachment on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 9 illustrates a scenario of closing one or more applications on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 10 illustrates a scenario of performing content based searching in a smart device on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 11 illustrates a scenario of controlling a key feature of an application in a smart device on receiving a predefined gesture on a wearable device, according to an embodiment of the present invention.
Figure 12 illustrates a scenario of swapping two programs in Smart TV on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 13 illustrates a scenario of sharing a display screen among multiple TV channels on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 14 illustrates a scenario of defining a specific setting for each channel using a wearable device, according to an exemplary embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
The embodiments of the present invention will now be described in detail with reference to the accompanying drawings. However, the present invention is not limited to the embodiments. The present invention can be modified in various forms. Thus, the embodiments of the present invention are only provided to explain more clearly the present invention to the ordinarily skilled in the art of the present invention. In the accompanying drawings, like reference numerals are used to indicate like components.
The specification may refer to “an”, “one” or “some” embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes”, “comprises”, “including” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein. Any name or term (which is registered trademark/copyright) used in the specification is only for the purpose of explaining the invention and not for any commercial gain.
Figure 1 illustrates a system 100 for managing communication between a smart device and a wearable device according to an exemplary embodiment of the present invention. The system 100 comprises a wearable device 101 and one or more smart devices 102. The smart device 102 includes, but is not limited to, a smart phone, a tablet, a smart TV, etc. The wearable device 101 comprises an application module 101a, a SAP Gesture handler server 101b, and accessory protocols 101c. The smart device 102 comprises an application handler daemon 102a, a SAP gesture handler client 102b, and accessory protocols 102c.
The connection between the wearable device 101 and the smart device 102 is established through the SAP (Samsung Accessory Protocol) protocol (or any wireless link with a communication protocol). When an application is launched or closed on the smart device 102, the application ID and the application data (if any) are sent to the wearable device 101, where the SAP Gesture Handler server 101b handles the data and notifies the application module 101a of the wearable device 101. The data communicated from the smart device 102 to the wearable device 101 includes, but is not restricted to, the following (a sketch of such a payload follows the list):
i. Application ID;
ii. Application icon details;
iii. Event type (Launched / Closed / Background / Foreground / Priority change, etc.);
iv. Event details;
v. In the case of a television, the channel details (Icon + Number + Category such as News, Sports, Movies, etc.).
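By way of illustration only, the payload above could be modeled as follows; this is a minimal sketch in which the type and field names are assumptions, not part of the specification.

```kotlin
// Hypothetical model of the smart-device-to-wearable notification described above.
enum class AppEvent { LAUNCHED, CLOSED, BACKGROUND, FOREGROUND, PRIORITY_CHANGE }

data class ChannelInfo(val icon: ByteArray?, val number: Int, val category: String)

data class AppNotification(
    val appId: String,               // i. Application ID
    val iconDetails: ByteArray?,     // ii. Application icon details
    val event: AppEvent,             // iii. Event type
    val eventDetails: String = "",   // iv. Event details
    val channel: ChannelInfo? = null // v. Channel details, in the case of a television
)
```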
In one embodiment, the wearable device 101 comprises a memory (not shown in Figure 1) and a processor (not shown in Figure 1). When the application on the wearable device 101 detects a gesture, the processor of the wearable device 101 processes the gesture. Subsequently, the wearable device 101 sends instructions to the smart device 102 for implementing the gesture. The gesture includes, but is not limited to, a swap, a pinch, a double tap, a long press, etc. The data transmitted by the wearable device 101 includes, but is not limited to, the following (a sketch of such an instruction follows the list):
I. Application ID / IDs;
II. Event type (Priority change, Foreground, Background, Close, Merge, Split screen, etc.);
III. Event details;
IV. In the case of a television, a subset of the above-mentioned events would hold good, and the event details would be settings such as contrast and brightness, the channel number, and others.
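Again as a minimal sketch, with names that are assumptions (KEY_FEATURE stands in for one of the unnamed "etc." event types), the instruction could be modeled as:

```kotlin
// Hypothetical model of the wearable-to-smart-device instruction described above.
enum class GestureEvent {
    PRIORITY_CHANGE, FOREGROUND, BACKGROUND, CLOSE, MERGE, SPLIT_SCREEN, KEY_FEATURE
}

data class GestureCommand(
    val appIds: List<String>,                          // I. Application ID / IDs
    val event: GestureEvent,                           // II. Event type
    val eventDetails: Map<String, String> = emptyMap() // III/IV. e.g. "brightness" to "70"
)
```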
Figure 2 illustrates a scenario of switching between an application mode and a priority mode of the control (UX) application running on the wearable device on receiving a predefined gesture, according to an exemplary embodiment of the present invention.
In this embodiment, the user interface (UI) is designed in such a way that, with a simple UI touch gesture, the user can switch between the application mode (as shown in 101d) and the priority mode (as shown in 101e). In the application mode, the user can perform the following activities (a dispatch sketch follows the list):
1) applications can be sent to the foreground/background by a predefined gesture, such as swiping the application icons;
2) two applications can be merged, depending on the predefined configuration (such as context), on receiving a predefined gesture (such as pinch zoom in, using two fingers) to merge the applications;
3) the screen of the smart device can be virtually split and shared between two applications on receiving a predefined gesture, such as a long press on an application icon followed by moving it on top of another icon;
4) the settings of the smart TV can be changed on receiving a predefined gesture. The settings include, but are not limited to, brightness, volume, contrast, a child security feature, or any other feature provided in the smart TV. In another case, the channels can be changed by providing a predefined gesture such as swapping;
5) a key feature of the application can be controlled by providing a predefined gesture such as a double tap gesture; and
6) one or more applications can be closed by providing a predefined gesture such as pinch zoom out.
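Purely as an illustration of the application-mode activities above, a recognized gesture could be dispatched to the GestureCommand sketch introduced earlier; the gesture labels are assumptions based on the examples given.

```kotlin
// Hypothetical gesture-to-command dispatch for the application-mode activities above.
fun commandFor(gesture: String, appIds: List<String>): GestureCommand? = when (gesture) {
    "swipe"           -> GestureCommand(appIds, GestureEvent.BACKGROUND)   // activity 1
    "pinch-zoom-in"   -> GestureCommand(appIds, GestureEvent.MERGE)        // activity 2
    "long-press-drag" -> GestureCommand(appIds, GestureEvent.SPLIT_SCREEN) // activity 3
    "double-tap"      -> GestureCommand(appIds, GestureEvent.KEY_FEATURE)  // activity 5
    "pinch-zoom-out"  -> GestureCommand(appIds, GestureEvent.CLOSE)        // activity 6
    else              -> null // unrecognized gestures are ignored
}
```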
In the priority mode, the user is allowed to change the priority of the one or more applications. The change of priority enhances the user experience by allowing the user to define their own priorities for the applications, rather than the OS (operating system) managing the priorities.
For example, a user may want to give the highest priority to the camera application when the battery is low. Using the present method, it is easy and convenient for the user to change the priority of the required application by just a predefined gesture on the wearable device.
In one embodiment, the priority of the applications decreases clockwise from the top left quadrant to the bottom left quadrant. The application in the top left quadrant (which is the fourth quadrant of the display screen) has the highest priority. The top left quadrant of the display screen is the first (highest) priority quadrant. The top right quadrant of the display screen is the second priority quadrant. The bottom right quadrant of the display screen is the third priority quadrant. The bottom left quadrant of the display screen is the fourth priority quadrant.
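A minimal sketch of mapping a touch point on the wearable display to the priority quadrant layout above; the coordinate convention (origin at the top left) is an assumption.

```kotlin
// Returns the priority (1 = highest) of the quadrant containing the point (x, y).
fun priorityAt(x: Float, y: Float, width: Float, height: Float): Int {
    val right = x >= width / 2
    val bottom = y >= height / 2
    return when {
        !right && !bottom -> 1 // top left: first (highest) priority quadrant
        right && !bottom  -> 2 // top right: second priority quadrant
        right && bottom   -> 3 // bottom right: third priority quadrant
        else              -> 4 // bottom left: fourth priority quadrant
    }
}
```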
Figure 3 illustrates a scenario of handling an incoming call on the smart device on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention. In this exemplary embodiment, a music player application is in the first priority quadrant of the wearable device 101d and therefore has the highest priority. When the user picks up an incoming call at step 301 using any of the available methods, such as swiping to answer the incoming call on the smart device, receiving it through hands-free, or answering via the wearable device, the call application takes the highest priority and its icon moves to the first priority quadrant (or the fourth quadrant of the screen) on the screen of the wearable device (as shown in 101e).
During the first call, if another incoming call arrives, the second call's icon occupies the second priority quadrant (i.e., the top right quadrant of the screen) to indicate that another call is waiting (as shown in 101f). Any further incoming call would be placed in the next lower priority quadrant. The user can switch between the calls by using a predefined gesture, such as dragging the second call's icon to the first priority quadrant (as shown in 101g), which automatically places the first call on hold and moves its icon to the second priority quadrant (as shown in 101h).
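A minimal sketch of that reordering, assuming the wearable keeps the call icons in a list ordered by priority quadrant (index 0 being the first priority quadrant):

```kotlin
// Dragging a waiting call's icon to the first priority quadrant: the dragged call
// becomes active, and the previously active call shifts one quadrant down (on hold).
fun switchCalls(callIcons: MutableList<String>, draggedIndex: Int) {
    val dragged = callIcons.removeAt(draggedIndex)
    callIcons.add(0, dragged)
}
```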
Figure 4 illustrates a scenario of merging two or more incoming calls and converting into a conference call on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 4a depicts a pictorial representation of a scenario in which one or more further incoming calls come during the first incoming call and the user converts these calls into a conference by applying a predefined gesture, such as pinching and bringing both call icons together. This converts the existing ongoing call into a conference call and changes the icon to a conference call icon, which is placed in the first priority quadrant.
Figure 4b depicts a flow diagram of merging two or more incoming calls and converting them into a conference call on receiving a predefined gesture on a wearable device 101. At step 401, the wearable device 101 connects to the smart device 102 (such as a smart phone or tablet) through SAP. At step 402, the smart device 102 sends a list of applications running on it. At step 403, the smart device 102 receives an incoming call. At step 404, the smart device 102 sends a call received notification along with the call details to the wearable device 101. At step 405, the wearable device 101 updates the icons on its user interface (UI). At step 406, the smart device 102 receives another incoming call. At step 407, the smart device 102 sends another call received notification along with the second call details to the wearable device 101. At step 408, the wearable device 101 updates the icons on its UI. At step 409, the wearable device 101 performs gesture polling to check for a gesture. The wearable device 101 interprets a gesture received from the user and performs the corresponding function, in this particular case changing the icon to the conference call icon. Here, polling is a procedure in which one process waits for inputs from another; in this case, after receiving the call details, the wearable device waits for the user's gestures. At step 410, the wearable device 101 sends the data to the smart device 102 for merging and converting the two or more calls into a conference call. The data includes, but is not limited to, the notification type (i.e., merge calls) and the MSISDN (Mobile Station International Subscriber Directory Number) numbers of the two calls. At step 411, the conference call is established between the two or more callers.
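A minimal sketch of the step-410 message, with field names that are assumptions:

```kotlin
// Hypothetical merge-calls message sent from the wearable to the smart device.
data class MergeCallsRequest(
    val notificationType: String = "MERGE_CALLS", // i.e., merge calls
    val msisdns: List<String>                     // MSISDN numbers of the two calls
)

// Pinching the two call icons together yields one merge request.
fun onPinchTogether(firstMsisdn: String, secondMsisdn: String) =
    MergeCallsRequest(msisdns = listOf(firstMsisdn, secondMsisdn))
```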
Figure 5 illustrates a scenario of sharing a smartphone screen between two applications on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention. This exemplary embodiment explains how the user can virtually split the screen and place two different applications on one (single) screen.
In this exemplary embodiment, an icon of the music application (i.e., the primary application, which is in the foreground of the smart device) occupies the first priority quadrant and an icon of the map application occupies the second priority quadrant of the screen of the wearable device 101. At step 501, the user provides a predefined gesture on the wearable device 101 to virtually split the screen of the smart device 102. At step 502, the wearable device 101 sends an instruction to the smart device to virtually split its screen and enable the user to access both applications together. This updates the icons on the wearable device 101 as well. In this particular case, the predefined gesture is a long press on the icon of the second application (the new application whose icon needs to be placed on the smart device screen) followed by a drag to the first priority quadrant.
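Reusing the GestureCommand sketch from above, the step-502 instruction might look like the following; the application IDs are hypothetical.

```kotlin
// Hypothetical split-screen instruction: the music application keeps the foreground
// half while the map application takes the other half of the split screen.
val splitScreen = GestureCommand(
    appIds = listOf("com.example.music", "com.example.maps"),
    event = GestureEvent.SPLIT_SCREEN,
    eventDetails = mapOf("primary" to "com.example.music")
)
```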
Figure 6 illustrates a scenario of merging multiple browsers in a smart device such as a tablet, on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention. This embodiment describes how two applications can be merged contextually. Contextual merging of applications is a method of using data from one application in another application. The data can be anything that is useful to the other application. There can be a predefined or default behavior when the applications are merged contextually, or the user can be allowed to configure how the applications should respond when they are merged contextually.
In one exemplary embodiment, a few tabs are open in the Chrome browser and another set of tabs is open in Internet Explorer. When both these applications are merged contextually (by providing a predefined gesture, such as pinching and bringing the two browsers together), all the tabs present in one browser (Internet Explorer here, because it has a lower priority than Chrome owing to its placement in the UI of the wearable device) are opened in the other browser (Chrome here), and the former is closed.
Figure 7 illustrates a scenario of merging multiple browsers in a smart device such as a smartphone on receiving a predefined gesture on a wearable device, according to another exemplary embodiment of the present invention. This embodiment also describes contextual merging of two applications, similar to the embodiment described in Figure 6, but in this embodiment the smart device is a smart phone. In the smart device 102d, two tabs are open in one browser. In the smart device 102e, one tab is open in another browser. When the UI of the wearable device receives a predefined gesture (such as pinching and bringing the two browsers together), the wearable device 101 processes the received gesture and sends the instruction to the smart device 102. The smart device (i.e., smart phone) 102 opens all the tabs in one browser and closes the other browser, as shown in 102f.
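A minimal sketch of the merge rule in both embodiments, assuming an illustrative Browser type with a numeric priority (1 = highest):

```kotlin
// The lower-priority browser's tabs open in the higher-priority browser; the
// lower-priority browser is then closed.
data class Browser(val name: String, val priority: Int, val tabs: MutableList<String>)

fun mergeBrowsers(a: Browser, b: Browser): Browser {
    val (keep, close) = if (a.priority <= b.priority) a to b else b to a
    keep.tabs.addAll(close.tabs) // open the other browser's tabs in the kept browser
    close.tabs.clear()           // the other browser is closed
    return keep
}
```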
Figure 8 illustrates a scenario of sending a memo as an email attachment on receiving a predefined gesture on a wearable device, according to yet another exemplary embodiment of the present invention. This is yet another exemplary embodiment of contextual merging of two different applications.
Figure 8a shows that the memo is opened in the first priority quadrant and an email is opened in the second priority quadrant. A user provides a predefined gesture, such as pinching and bringing the memo icon and the email icon together, to send the memo as an attachment in the email. The memo is thus attached to an email by just a pinch gesture.
Figure 8b depicts a flow diagram of a method of sending a memo as an email attachment on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention. At step 801, the wearable device 101 connects to the smart device 102 (such as a smart phone or tablet) through SAP. Once the connection is established, the smart device 102 sends all the open application details to the wearable device 101 at step 802. At step 803, the UI of the wearable device 101 receives a predefined gesture. Subsequently, the wearable device 101 processes the gesture and provides the details to the smart device 102 at step 804. The details include, but are not limited to, the application IDs of Memo and Mail, and the memo ID. At step 805, on receiving the details, the smart device 102 attaches the memo as an attachment in a new e-mail.
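A minimal sketch of the step-804 details; the field names are assumptions:

```kotlin
// Hypothetical details sent from the wearable for the memo-to-email merge.
data class AttachMemoRequest(
    val memoAppId: String, // application ID of the Memo application
    val mailAppId: String, // application ID of the Mail application
    val memoId: String     // ID of the memo to attach in a new e-mail
)
```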
Figure 9 illustrates a scenario of closing one or more applications on receiving a predefined gesture on a wearable device 101d, according to an exemplary embodiment of the present invention. This exemplary embodiment describes how the user can close either a particular application or all other applications open on the smart device except that application, by pinch zooming on the particular application's icon shown on the wearable device 101d. When the user provides the gesture on the icon of a particular application (Facebook in this particular example) displayed on the wearable device 101d, all the applications are closed except the Facebook application, as shown on the wearable device 101e.
Figure 10 illustrates a scenario of performing content-based searching in a smart device 102 on receiving a predefined gesture on a wearable device 101, according to an exemplary embodiment of the present invention. This is a further exemplary embodiment of contextual merging of two different applications running on the smart device 102. The icon of the music player (assuming some music is currently being played) and the icon of the browser application can be pinched and brought together to merge them contextually, which results in (see the sketch after the list):
a) extracting the metadata from the music file; and
b) using some of the fields in the metadata as a search input to the browser.
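A minimal sketch of that merge, in which the metadata fields and the search URL are assumptions:

```kotlin
// Builds a browser search URL from the currently playing track's metadata.
data class TrackMetadata(val title: String, val artist: String)

fun searchUrlFor(track: TrackMetadata): String {
    val query = java.net.URLEncoder.encode("${track.artist} ${track.title}", "UTF-8")
    return "https://www.example.com/search?q=$query" // hypothetical search endpoint
}
```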
Figure 11 illustrates a scenario of controlling a key feature of an application in a smart device on receiving a predefined gesture on a wearable device, according to an embodiment of the present invention. This exemplary embodiment describes how a basic feature of any application running on the smart device can be controlled by a predefined gesture (such as a double tap gesture) on its icon on the UI of the wearable device 101.
The following are a few examples of controlling a basic feature of an application based on either a user configuration or a predefined configuration (a dispatch sketch follows the list):
a) Double tapping on the music application icon switches the running music track to the next track.
b) Double tapping on an email / social application icon triggers the sync.
c) Double tapping on the calendar application icon displays the next appointment.
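As an illustrative dispatch only, with assumed category labels and action names:

```kotlin
// Maps the double-tapped application's category to the key feature it controls.
fun keyFeatureFor(appCategory: String): String? = when (appCategory) {
    "music"           -> "next-track"            // a) switch to the next track
    "email", "social" -> "sync"                  // b) trigger the sync
    "calendar"        -> "show-next-appointment" // c) display the next appointment
    else              -> null                    // no predefined key feature
}
```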
Figure 12 illustrates a scenario of swapping two programs in a Smart TV 102 on receiving a predefined gesture on a wearable device 101, according to an exemplary embodiment of the present invention. In this exemplary embodiment, the wearable device wirelessly connects to the smart TV 102. The smart TV 102 shares the channel details and the settings of each channel with the wearable device 101. The screen of the wearable device 101 shows four channels, one in each quadrant. The channel whose icon is shown in the first priority quadrant (the fourth quadrant of the screen) is the one displayed on the smart TV 102. The user can change the displayed channel by using a predefined gesture, such as dragging another channel's icon to the first priority quadrant. Once the UI receives the gesture, the wearable device 101 processes the gesture and sends the details to the smart TV 102. The smart TV 102 then processes the details and changes the displayed channel.
Figure 13 illustrates a scenario of sharing a display screen among multiple TV channels on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention.
Figure 13a describes how to virtually split the TV screen to display two different channels on the same screen. The smart TV 102a displays only one channel. When the user provides a predefined gesture on the wearable device 101a, the screen of the smart TV 102 is virtually split and displays two channels together on the same screen.
Figure 13b describes a flow diagram of a method of sharing a display screen among multiple TV channels on receiving a predefined gesture on a wearable device, according to an exemplary embodiment of the present invention. At step 1301, the wearable device connects to the smart TV through SAP. Once the connection is established, the smart TV 102 sends the channel details to the wearable device 101 at step 1302. At step 1303, the wearable device 101 polls for a gesture and receives a predefined gesture provided by the user on the UI; subsequently, the wearable device 101 processes the received gesture (polling is as described above with reference to Figure 4b: after receiving the channel details, the wearable device waits for the user's gestures). At step 1304, the wearable device 101 sends the instruction along with the details to the smart TV 102 to virtually split the display screen. The details include, but are not limited to, the channel IDs of the two channels which share the screen and the positioning details of the two channels (such as left or right). At step 1305, the display screen is virtually split and the two channels are displayed simultaneously.
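A minimal sketch of the step-1304 details; field names are assumptions:

```kotlin
// Hypothetical split-screen details sent from the wearable to the smart TV.
data class TvSplitRequest(
    val leftChannelId: String, // channel positioned on the left half of the screen
    val rightChannelId: String // channel positioned on the right half of the screen
)
```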
Figure 14 illustrates a scenario of defining a specific setting for each channel using a wearable device 101, according to an exemplary embodiment of the present invention. This exemplary embodiment describes how the settings of the smart TV 102 can be changed using one or more predefined gestures on the wearable device 101. For instance, whenever the user double taps the icon of a channel, the settings screen opens, wherein the user can configure settings such as volume, brightness, contrast, color, sharpness, and screen dimensions for that particular channel alone. Once done, these settings are pushed to the smart TV 102. Until further changes, whenever this channel is played, the user-configured settings are used on the smart TV 102.
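A minimal sketch of a per-channel settings store matching that example; the setting names mirror those listed, while the types are assumptions:

```kotlin
// User-configured settings for one channel, pushed to the smart TV.
data class ChannelSettings(
    val volume: Int, val brightness: Int, val contrast: Int,
    val color: Int, val sharpness: Int, val screenDimensions: String
)

// Settings keyed by channel ID; applied whenever that channel is played.
val perChannelSettings = mutableMapOf<String, ChannelSettings>()
```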
Although the method and system of the invention have been described in connection with the embodiments illustrated in the accompanying drawings, the invention is not limited thereto. It will be apparent to those skilled in the art that various substitutions, modifications, and changes may be made thereto without departing from the scope and spirit of the invention.