
System And Method For Displaying Content On A Wearable Device Connected To Multiple Electronic Devices

Abstract: Embodiments herein provide a method and apparatus for displaying content. The method includes dynamically displaying, by a first electronic device, a display portion associated with one or more second electronic devices based on a predefined criterion. Further, the method includes performing a first predefined operation on the display portion associated with the second electronic device in response to an input. Furthermore, the method includes remotely performing a second predefined operation on the second electronic device after performing the first predefined operation. FIG. 1


Patent Information

Filing Date: 07 May 2014
Publication Number: 02/2016
Publication Type: INA
Invention Field: ELECTRONICS
Grant Date: 2022-06-29

Applicants

Samsung R&D Institute India - Bangalore Pvt Ltd
# 2870, Orion Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bangalore - 560037

Inventors

1. Dipin Kollencheri Puthenveettil
Samsung R&D Institute India - Bangalore, # 2870, Orion Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bangalore - 560037

Specification

FIELD OF INVENTION
[001] The embodiments herein relate to wearable devices, and more particularly to enhancing user interactions on a wearable device connected to multiple electronic devices simultaneously. The present application is based on, and claims priority from, an Indian Application Number 2284/CHE/2014 filed on 7th May 2014, the disclosure of which is hereby incorporated by reference herein.
BACKGROUND OF THE INVENTION
[002] Electronic data and communication devices continue to become smaller, even as their information processing capacity continues to increase. In traditional systems, portable communication devices primarily use touch screen based user interfaces, which allow the devices to be controlled with user gestures. Many of the user interfaces are optimized for pocket-sized devices, such as cell phones, that have larger screens, typically greater than 3 or 4 inches diagonal. Due to their relatively large form factors, one or more mechanical buttons are typically provided to support operation of these devices.
[003] In another traditional system, the user can pre-configure the settings in the electronic device to constantly display dynamic application updates on the display of the wearable device, as the user can easily access the wearable device while travelling. For example, if the user is driving a car, it is inconvenient to access his/her mobile phone or laptop to view the dynamic updates at regular intervals, as this diverts his/her eyesight and can lead to accidents. The user can pre-configure the settings in his/her mobile phone or can perform a tap gesture on the dashboard of the car to display the dynamic updates received from the mobile phone.
[004] In yet another traditional method, many users carry multiple electronic devices with them on the move, typically a tablet, a laptop, multiple mobile devices, and the like, or interact with in-car electronics while travelling. The advent of wearable devices makes it easy to interact with a particular electronic device on the move. However, the wearable electronic device cannot be used for switching between multiple electronic devices that are connected to it at the same time. For example, a user cannot wear two pairs of spectacles at a time, and it is difficult to wear two or three watches on one hand. Moreover, the capability and utility of one category of wearable device is different from that of another; for example, the capability and utility of a watch-type wearable device is different from that of a spectacles-type wearable device, so one cannot replace the other.
[005] Thus, there remains a need for a robust system and method to enable switching among display portions of multiple electronic devices connected to a wearable device simultaneously, thereby providing a good user experience for the end user while interacting with the different electronic devices through the wearable device.
[006] The above information is presented as background information only to help the reader to understand the present invention. Applicants have made no determination and make no assertion as to whether any of the above might be applicable as Prior Art with regard to the present application.
OBJECT OF INVENTION
[007] The principal object of the embodiments herein is to provide a method for dynamically displaying one or more Home Screen (HS) display portions by a first electronic device corresponding to one or more second electronic devices.
[008] Another object of the embodiments herein is to provide a method to enable switching between one or more display portions corresponding to one or more second electronic devices connected to a first electronic device.
[009] Another object of the embodiments herein is to provide a HS on a first electronic device for hosting application icons of applications residing in different second electronic devices.
[0010] Another object of the embodiments herein is to provide a method to view and customize application icons of different second electronic devices on a HS of a first electronic device using one or more grids and a button on a display screen of the first electronic device.

SUMMARY

[0011] Accordingly the embodiments herein provide a method of displaying content. The method includes dynamically displaying, by a first electronic device, a display portion associated with one or more second electronic devices based on a predefined criterion. Further, the method includes performing a first predefined operation on the display portion associated with the one or more second electronic devices in response to an input. Furthermore, the method includes remotely performing a second predefined operation on the one or more second electronic devices in response to performing the first predefined operation.
[0012] Accordingly the embodiments herein provide an apparatus for displaying content. The apparatus includes a processor and a memory coupled to the processor including instructions executable by the processor, the processor being operable when executing the instructions to dynamically display, by the apparatus, a display portion associated with one or more electronic devices based on a predefined criterion. Further, the processor is operable when executing the instructions to perform a first predefined operation on the display portion associated with the one or more electronic devices in response to an input. Furthermore, the processor is operable when executing the instructions to remotely perform a second predefined operation on the one or more second electronic devices in response to performing the first predefined operation.
[0013] Accordingly the embodiments herein provide a computer program product including computer executable program code recorded on a computer readable non-transitory storage medium, the computer executable program code when executed causing the actions including dynamically displaying, by a first electronic device, a display portion associated with one or more second electronic devices based on a predefined criterion. Further, the computer executable program code when executed causing the actions including performing a first predefined operation on the display portion associated with the one or more second electronic devices in response to an input. Furthermore, the computer executable program code when executed causing the actions including remotely performing a second predefined operation on the one or more second electronic devices in response to performing the first predefined operation.
[0014] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF FIGURES
[0015] This invention is illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[0016] FIG. 1 illustrates generally, among other things, a high level overview of a system for dynamically displaying display portion of one or more second electronic devices by a first electronic device, according to embodiments as disclosed herein;
[0017] FIG. 2 illustrates various modules of a first electronic device for dynamically displaying a display portion of one or more second electronic devices, according to embodiments as disclosed herein;
[0018] FIG. 3 illustrates various modules of a second electronic device in proximity to a first electronic device, according to embodiments as disclosed herein;
[0019] FIG. 4a illustrates an example high level architecture of a first electronic device, according to embodiments as disclosed herein;
[0020] FIG. 4b illustrates an example widget framework for implementing a Home-screen (HS) framework in a second electronic device, according to the embodiments as disclosed herein;
[0021] FIG. 5 is a flow diagram illustrating a method for dynamically displaying a display portion of one or more second electronic devices, according to embodiments as disclosed herein;
[0022] FIGS. 6a-6d illustrate a display interface of a first electronic device divided into a plurality of grids for viewing display portions corresponding to one or more second electronic devices, according to embodiments as disclosed herein;
[0023] FIG. 7 illustrates an example scenario for dynamically displaying content corresponding to a Mobile 1, Mobile 2, and a Laptop, according to embodiments as disclosed herein;
[0024] FIG. 8 illustrates an example HS framework on a wearable device to dynamically enable adding, displaying, or removing the HS of a Mobile and a Laptop, according to embodiments as disclosed herein;
[0025] FIGS. 9a-9d illustrate an example scenario for transferring data between a “Mobile” and a “Laptop” connected to a “Watch”, according to embodiments as disclosed herein;
[0026] FIG. 10 illustrates an example scenario for providing a way to launch applications on a “Mobile” and a “Laptop” from a single HS on the “Watch”, according to embodiments as disclosed herein;
[0027] FIGS. 11a and 11b illustrate an example scenario of performing “Tap” gesture by a user on a dynamically added HS on a “Watch” to view different application icons, according to embodiments as disclosed herein;
[0028] FIGS. 12a-12d illustrate an example scenario for customizing a grid of a dynamic HS on a “Watch”, according to embodiments as disclosed herein;
[0029] FIGS. 13a-13c illustrate an example scenario for customizing multiple grids of a HS on a “Watch” selected by a user, according to embodiments as disclosed herein;
[0030] FIGS. 14a-14d illustrate an example scenario for customizing a persistent HS for displaying application icons from different connected devices simultaneously, according to embodiments as disclosed herein; and
[0031] FIG. 15 illustrates a computing environment implementing the method and system for dynamically displaying a display portion of one or more second electronic devices, according to embodiments as disclosed herein.
DETAILED DESCRIPTION OF INVENTION
[0032] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term “or” as used herein, refers to a non-exclusive or, unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those skilled in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0033] The embodiments herein achieve a method and apparatus for dynamically displaying content. The method includes dynamically displaying, by a first electronic device, a display portion associated with one or more second electronic devices based on a predefined criterion. In an embodiment, the predefined criterion can be, for example, but not limited to, adding screens based on connection, removing screens based on disconnection, proximity, location, trust list, machine learning, internet activity, calling activity, and the like. Further, the method includes performing a first predefined operation on the display portion associated with the one or more second electronic devices in response to an input. In an embodiment, the first predefined operation can be, for example, but not limited to, scrolling between screens, transferring data between screens, customizing screens, and the like. Further, the method includes remotely performing a second predefined operation on the one or more second electronic devices in response to performing the first predefined operation. In an embodiment, the second predefined operation can be, for example, but not limited to, controlling corresponding devices, remotely switching between devices, remotely transferring data between the devices, and the like. The display portion can be controlled based on one or more inputs or gestures performed by a user on a display interface of the first electronic device. In an embodiment, the inputs or gestures can be, for example, but not limited to, a Tap gesture, Swipe gesture, Drag gesture, and the like.
[0034] In an embodiment, the display portion includes one or more Home-screen (HS) elements of respective second electronic devices, and the display of each respective HS element can be controlled based on the inputs provided by the user on the display interface of the first electronic device. The display interface of the first electronic device can be divided into a plurality of grids connected to a controlling element. In an embodiment, the controlling element can be, for example, but not limited to, a button at the center of the display interface. Each grid corresponds to the display portion of the one or more second electronic devices.
[0035] Further, the method includes dynamically removing the display portion on the display interface of the first electronic device after determining that the corresponding second electronic device is disconnected from the first electronic device. The method includes dynamically displaying the display portion on the display interface of the first electronic device after determining that the second electronic device is connected with the first electronic device. Unlike conventional systems, the first electronic device interacts with the one or more second electronic devices by switching the display interface (i.e., home screen) on the first electronic device. The first electronic device includes a HS framework to dynamically add, display, remove, or the like, the display portions (i.e., home screen elements associated with the second electronic devices) when any one of the second electronic devices is newly connected to or removed from the first electronic device. Furthermore, the method includes switching the display portion of the second electronic devices based on a switch event received from a switch member of the first electronic device. In an embodiment, the first electronic device provides a switch member on the periphery of the first electronic device to enable the user to easily interact with another second electronic device by switching the Home-screens (HSs).
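A short sketch may help make the add/remove behaviour concrete. The following Python snippet is a minimal, hypothetical model of such a HS framework, assuming a manually maintained trust list as the predefined criterion; the class and method names are illustrative only and do not appear in the embodiments.

```python
class HomeScreenFramework:
    """Minimal sketch of the first device's HS framework (illustrative only)."""

    def __init__(self, trusted_devices):
        self.trusted_devices = set(trusted_devices)  # predefined criterion: trust list
        self.display_portions = {}                   # device_id -> list of HS elements

    def on_device_connected(self, device_id, home_screen_elements):
        # Dynamically add the display portion only for trusted second devices.
        if device_id in self.trusted_devices:
            self.display_portions[device_id] = list(home_screen_elements)

    def on_device_disconnected(self, device_id):
        # Dynamically remove the display portion when the device disconnects.
        self.display_portions.pop(device_id, None)

    def switch_event(self, current_device_id):
        # Cycle to the next connected device's display portion (switch member behaviour).
        devices = list(self.display_portions)
        if not devices:
            return None
        if current_device_id not in devices:
            return devices[0]
        idx = (devices.index(current_device_id) + 1) % len(devices)
        return devices[idx]


# Example usage
hs = HomeScreenFramework(trusted_devices=["Mobile-1", "Laptop"])
hs.on_device_connected("Mobile-1", ["Email", "SMS", "Calculator", "SNS"])
hs.on_device_connected("Laptop", ["Email", "Browser", "Notepad", "Folder"])
print(hs.switch_event("Mobile-1"))   # -> "Laptop"
hs.on_device_disconnected("Laptop")
print(list(hs.display_portions))     # -> ["Mobile-1"]
```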
[0036] The method and system described herein is simple and robust for dynamically displaying one or more display portions corresponding to a plurality of second electronic devices simultaneously.
[0037] In conventional systems, the user will pre-configure the settings in the electronic device to constantly display the dynamic application updates on the wearable device display. The wearable device cannot be used for switching between home screens of the multiple electronic devices simultaneously.
[0038] Unlike conventional systems, the method enables interaction with multiple electronic devices connected to a wearable device simultaneously by providing an efficient mechanism to switch between HSs of the electronic devices. Here, the wearable device includes a dedicated HS to interact with each of the electronic devices connected to it. The user can interact with each of the connected electronic devices by switching to the specific HS of the connected electronic device on the wearable device. The user can launch applications residing on different electronic devices effortlessly from a single HS on the wearable device.
[0039] The labels such as “first”, and “second,” are used merely to describe the embodiments, and do not limit the scope of the invention.
[0040] Throughout the description the terms “display portion” and “Home-screen” are used interchangeably.
[0041] Throughout the description the terms “application icons” and “Home-screen elements” are used interchangeably.
[0042] Throughout the description the terms “display interface” and “display” are used interchangeably.
[0043] Referring now to the drawings, and more particularly to FIGS. 1 through 15, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[0044] FIG. 1 illustrates generally, among other things, a high level overview of a system 100 for dynamically displaying display portions of one or more second electronic devices 1041-N (hereafter referred to as second electronic device 104) by a first electronic device 102, according to embodiments as disclosed herein. The first electronic device 102 described herein can be a slave device, for example, but not limited to, a Smart-Watch, Smart Glass, smart band, Eyewear, car infotainment display unit, Smart washing machine, and the like. In an embodiment, the first electronic device 102 can be selected from a group of eligible devices along with the other devices acting in the role of slave or dependent devices by getting paired with the eligible device acting in the role of master or primary device. The second electronic device 104 described herein can be another device such as, for example, but not limited to, a laptop, desktop computer, mobile phone, smart phone, Personal Digital Assistant (PDA), tablet, phablet, consumer electronic device, server, or any other electronic device.
[0045] The first electronic device 102 can include appropriate interfaces to directly or indirectly communicate with the second electronic device 104 and with various other devices over the network. The network described herein can be for example, but not limited to, wireless network, wire line network, public network such as the Internet, private network, global system for mobile communication network (GSM) network, general packet radio network (GPRS), local area network (LAN), wide area network (WAN), metropolitan area network (MAN), cellular network, public switched telephone network (PSTN), personal area network, Bluetooth, Wi-Fi Direct, Near Field communication, Ultra Wide band, a combination thereof, or any other network.
[0046] The first electronic device 102 can be configured to dynamically add, display, or remove Home-screens (HSs) of the second electronic devices 104 connected to the first electronic device 102. The first electronic device 102 can be configured to include a HS framework for dynamically adding, displaying, or removing the HSs of the second electronic devices 104. In an embodiment, if each of the second electronic devices 104 is brought in close proximity to the first electronic device 102 as shown in the FIG. 1, then the active HSs of the second electronic devices 104 are dynamically added to the HS framework of the first electronic device 102. Each HS of the second electronic devices 104 is dynamically displayed by the first electronic device 102. In an embodiment, the HS associated with each of the second electronic devices 104 can be switched by the user on performing a swipe gesture or by providing the input on the display interface of the first electronic device 102. The details of the swipe gestures performed by the user for switching the HSs of the second electronic devices 104 are explained in conjunction with the FIGS. 6a-6d. In another embodiment, the HSs corresponding to the second electronic devices 104 can be switched based on a switch event received from a switch member of the first electronic device 102.
[0047] The display interface of the first electronic device 102 can be divided into a plurality of grids connected to a controlling element. In an embodiment, the controlling element can be a button at the center of the display interface of the first electronic device 102. Also, each grid corresponds to the display portion of the second electronic device 104. In an embodiment, the display portion includes one or more HS elements of respective second electronic devices 104. Each of the HS elements can be controlled based on one or more inputs provided by the user on the first electronic device 102. The details of the grids and the controlling element associated with the first electronic device 102 are explained in conjunction with the FIGS. 6a-6d.
[0048] For example, consider a scenario where a first electronic device 102 such as car infotainment display unit is connected to a “Laptop” and a “Mobile”. The display portions of the “Laptop” and the “Mobile” are dynamically displayed on the infotainment display unit. The display portion associated with the “Laptop” or “Mobile” includes 4 applications which are displayed in the 4 grids of the infotainment display unit. On performing the swipe gesture or switch event on the switch member, the display portions of the “Laptop” and the “Mobile” are dynamically displayed. In another example, consider a scenario where a first electronic device 102 such as car infotainment display unit is connected to a “Laptop” and a “Mobile”. The display portions of the “Laptop” and the “Mobile” are dynamically displayed on the infotainment display unit. The display portion associated with the “Laptop” includes 2 applications which are displayed in the 2 grids of the infotainment display unit. Also, the display portion associated with the “Mobile” includes 2 applications which are displayed in the 2 grids of the infotainment display unit. The infotainment display unit displays 2 applications residing on the “Laptop” and 2 applications residing on the “Mobile” in the 4 grids simultaneously; thereby, displaying different applications from different devices on the single HS of the infotainment display unit.
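For the infotainment example above, the grid assignment can be modelled with a few lines of Python. This is only an illustrative sketch; the grid count, device names, and application names are taken from the example, while the function itself is an assumption.

```python
# Minimal sketch of grid assignment for the infotainment example (names are illustrative).
GRID_COUNT = 4

def assign_grids(device_apps):
    """Map up to GRID_COUNT application icons, possibly from several devices, to grids.

    device_apps: list of (device_name, app_name) tuples in display order.
    Returns a dict grid_index -> (device_name, app_name).
    """
    return {i: entry for i, entry in enumerate(device_apps[:GRID_COUNT])}

# Scenario 1: all 4 grids assigned to the "Laptop"
laptop_only = assign_grids([("Laptop", a) for a in ["Email", "SNS 1", "Browser", "Notepad"]])

# Scenario 2: 2 grids for the "Laptop" and 2 grids for the "Mobile" on one HS
mixed = assign_grids([("Laptop", "SNS 1"), ("Laptop", "Email"),
                      ("Mobile", "Calculator"), ("Mobile", "SMS")])
print(mixed[2])   # -> ('Mobile', 'Calculator')
```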
[0049] The first electronic device 102 can be configured to include a HS sync component for synching the HS of the first electronic device 102 with the current active HS on the second electronic device 104. For example, if any two second electronic devices 104 are already paired with the first electronic device 102 then the connection with the first electronic device 102 is automatically enabled and the active HS of the connected second electronic devices 104 is shared with the first electronic device 102 automatically.
[0050] If any of the second electronic devices 104 are configured to dynamically modify the HS based on a location of the user, then the modified HS on the second electronic devices 104 are automatically updated in the first electronic device 102. The first electronic device 102 can be configured to maintain a list of trusted second electronic devices 104 that can be manually configured by the user. The first electronic device 102 can be configured to allow dynamic addition of the active HS of the second electronic devices 104 that are maintained in the trust list.
[0051] In another embodiment, the HS of the second electronic devices 104 can be dynamically added or customized on the first electronic device 102 based on the data activity or status of the second electronic devices 104. For example, consider a scenario where a wearable device “Watch” is connected to a “Mobile 1” and a “Laptop”. The wearable device “Watch” displays the HS of the “Mobile 1” and the “Laptop”. Further, if there is an email notification or a Social Networking Service (SNS) image marked as important on another “Mobile 2” then the HS of the “Mobile 2” is dynamically added to the HS framework of the “Watch”. In another example, consider a scenario where the user receives a phone call on his/her “Mobile 1” then the HS of the “Mobile 1” is dynamically added to the wearable device “Watch” on receiving the phone call.
[0052] Further, the first electronic device 102 can be configured to customize the HS to dynamically display frequently accessed applications by the user on different second electronic devices 104 together. In an embodiment, the first electronic device 102 can be configured to learn over a period of time about the applications (or Apps) residing on different second electronic devices 104 that the user frequently launches from the first electronic device 102. The first electronic device 102 displays the application icons residing on the different second electronic devices 104 together on the single HS displayed on the first electronic device 102. In another embodiment, the user can manually specify the applications that should be displayed on the HS of the first electronic device 102 from each of the second electronic device 104. The first electronic device 102 can be configured to dynamically remove the HS of the second electronic device 104 from the HS framework of the first electronic device 102 when the second electronic device 104 is disconnected.
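The frequency-based customization described above can be sketched as follows. This Python fragment is a hypothetical illustration of how launch counts per connected device might be accumulated and used to fill the HS grids; it is not the claimed learning mechanism itself.

```python
from collections import Counter

class LaunchHistory:
    """Illustrative sketch: learn frequently launched apps across connected devices."""

    def __init__(self, grid_count=4):
        self.grid_count = grid_count
        self.launch_counts = Counter()   # (device, app) -> number of launches

    def record_launch(self, device, app):
        self.launch_counts[(device, app)] += 1

    def frequent_home_screen(self):
        # Pick the most frequently launched apps across all devices for the HS grids.
        return [entry for entry, _ in self.launch_counts.most_common(self.grid_count)]

history = LaunchHistory()
for _ in range(5):
    history.record_launch("Laptop", "Email")
for _ in range(3):
    history.record_launch("Mobile", "SMS")
history.record_launch("Mobile", "Calculator")
print(history.frequent_home_screen())
# -> [('Laptop', 'Email'), ('Mobile', 'SMS'), ('Mobile', 'Calculator')]
```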
[0053] Unlike conventional systems, multiple HSs associated with a plurality of connected external devices are displayed on the display interface of the portable device, thus eliminating the need to use a different portable device for controlling each of the user's external devices. For example, consider a scenario where a plurality of master devices such as a “Mobile” and a “Laptop” are connected to a wearable device such as a “Watch”. The display portions of the “Laptop” and the “Mobile” are dynamically displayed on the “Watch”; thereby, eliminating the need of using separate wearable devices for controlling the “Mobile” and the “Laptop”.
[0054] The FIG. 1 shows a limited overview of the system 100, but it is to be understood that other embodiments are not limited thereto. Further, the system 100 can include any number of electronic devices along with other hardware or software components communicating with each other. For example, the component can be, but is not limited to, a process running in the controller or processor, an object, an executable process, a thread of execution, a program, or a computer. By way of illustration, both an application running on a device and the device itself can be a component.
[0055] FIG. 2 illustrates various modules of the first electronic device 102 for dynamically displaying a display portion of the second electronic devices 104, according to embodiments as disclosed herein. In an embodiment, the first electronic device 102 includes a Home-screen (HS) framework 202. The HS framework 202 includes a grid management module 204, a grid selection module 206, a grid gesture processing module 208, a grid customization module 210, an event communication module 212, a communication module 214, a display module 216, and a storage module 218.
[0056] The grid management module 204 can be configured to assign the particular grid or the group of grids to the particular second electronic device 104. In an embodiment, the grid management module 204 can be configured to assign all the grids to the particular second electronic device 104 where the HS elements associated with the second electronic device 104 are dynamically displayed. In another embodiment, the grid management module 204 can be configured to assign each grid to the different second electronic device 104 where each HS element associated with different second electronic devices 104 are displayed.
[0057] On receiving the input or gesture from the user, the grid selection module 206 can be configured to select the grid and the HS element of the second electronic device 104 to be displayed on the grid. For example, the user selects the email application associated with the second electronic device 104 to be displayed on the particular grid using the grid selection module 206. The grid gesture processing module 208 is connected (e.g., communicatively coupled) to the grid selection module 206 and receives user gesture information sent by the grid selection module 206. The grid gesture processing module 208 can be configured to convert the user gesture input information into corresponding gesture recognition information and can send the gesture recognition information to the grid customization module 210.
[0058] On receiving the gesture recognition information, the grid customization module 210 can be configured to initiate the grid customization process for replacing the HS element displayed on the grid. The grid customization module 210 can be configured to terminate the grid customization process on replacing the HS element with other HS element associated with the second electronic device 104. The details of the grid customization process are explained in conjunction with the FIGS. 14a-14d.
[0059] The event communication module 212 can be configured to send the first predefined operation to the HS event handling module (not shown) in the second electronic device 104 based on the inputs provided by the user on the first electronic device 102. The communication module 214 receives the display portions of the second electronic devices 104. On receiving the display portions, the communication module 214 sends the display portions to the display module 216. The communication module 214 can be used to transfer the data selected by the user from one electronic device to another electronic device based on the inputs provided by the user on the display module 216. For example, the user can provide simple inputs (i.e., copy and paste) to transfer data from one device to another device seamlessly using the communication module 214. The display module 216 displays the display portions corresponding to each of the second electronic devices 104. The display interface of the display module 216 can be divided into the plurality of grids along with the controlling element at the center of the display. The details of the grids and the controlling element are explained in conjunction with the FIGS. 6a-6d.
[0060] Further, the storage module 218 stores the priority of the frequently used display portions over the less frequently used display portions. Based on the identified priority, the dynamically added display portions are stored in the storage module 218 facilitating easy user interactions. The storage module 218 stores the control instructions and operations which are used to perform various operations described herein.
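As an informal illustration of the gesture path through the HS framework 202, the following Python sketch maps raw gestures to recognition information and applies a grid customization; the event names and functions are assumptions made for illustration only.

```python
# Illustrative sketch: grid selection forwards raw gesture input, the grid gesture
# processing module converts it to gesture recognition information, and the grid
# customization module acts on it. Names are assumptions, not the patented API.
def process_gesture(raw_gesture):
    """Grid gesture processing module 208: raw input -> recognition information."""
    mapping = {"tap": "VIEW_NEXT_ICONS", "tap_drag": "SELECT_GRID", "swipe": "SWITCH_HS"}
    return mapping.get(raw_gesture, "UNKNOWN")

def customize_grid(grids, grid_index, new_icon):
    """Grid customization module 210: replace the HS element shown on one grid."""
    grids = dict(grids)
    grids[grid_index] = new_icon
    return grids

grids = {0: "SNS", 1: "Music", 2: "Email", 3: "Call"}
if process_gesture("tap_drag") == "SELECT_GRID":
    grids = customize_grid(grids, 0, "Calculator")
print(grids)   # -> {0: 'Calculator', 1: 'Music', 2: 'Email', 3: 'Call'}
```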
[0061] The FIG. 2 shows a limited overview of the modules of the first electronic device 102, but it is to be understood that other embodiments are not limited thereto. The labels or names of the modules are used only for illustrative purposes and do not limit the scope of the invention. Further, in real-time the function of one or more modules can be combined or separately executed by the same or other modules without departing from the scope of the invention. Further, the electronic device can include various other modules along with other hardware or software components, communicating locally or remotely to display and perform various actions or operations described herein. For example, the component can be, but is not limited to, a process running in the controller or processor, an object, an executable process, a thread of execution, a program, or a computer. By way of illustration, both an application running on an electronic device and the electronic device itself can be a component.
[0062] FIG. 3 illustrates various modules of the second electronic devices 104 in proximity to the first electronic device 102, according to embodiments as disclosed herein. In an embodiment, each of the second electronic devices 104 includes a HS event handling module 302, an application icon navigation module 304, a HS framework 306, and a storage module 308.
[0063] The HS event handling module 302 can be configured to receive the first predefined operation from the event communication module 212 in the first electronic device 102 based on the inputs provided by the user. The HS event handling module 302 sends the first predefined operation received from the event communication module 212 to the application icon navigation module 304. Based on the received first predefined operation, the application icon navigation module 304 can be configured to remotely perform the second predefined operation on the second electronic devices 104 to send the user selected application icons to the communication module 214 for dynamically displaying the application icons on the display of the first electronic device 102.
[0064] The details of the HS framework 306 in each of the second electronic device 104 are explained in conjunction with the FIG. 4b. The storage module 308 stores the control instructions and operations which are used to perform various operations described herein.
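On the second electronic device 104 side, the interplay of the HS event handling module 302 and the application icon navigation module 304 can be sketched as below. The class, the operation name, and the paging behaviour are illustrative assumptions.

```python
# Hypothetical sketch: the HS event handling module receives the first predefined
# operation from the watch, and the application icon navigation module returns the
# next page of icons to be displayed remotely on the first device.
class SecondDeviceHomeScreen:
    def __init__(self, app_icons, page_size=4):
        self.app_icons = app_icons
        self.page_size = page_size
        self.page = 0

    def handle_event(self, operation):
        """HS event handling module: dispatch the received operation."""
        if operation == "VIEW_NEXT_ICONS":
            return self.next_icon_page()
        return []

    def next_icon_page(self):
        """Application icon navigation module: advance to the next icon page."""
        total_pages = max(1, -(-len(self.app_icons) // self.page_size))
        self.page = (self.page + 1) % total_pages
        start = self.page * self.page_size
        return self.app_icons[start:start + self.page_size]

laptop = SecondDeviceHomeScreen(["Email", "Browser", "Notepad", "Folder",
                                 "Music", "SNS 1", "Calculator", "Chat"])
print(laptop.handle_event("VIEW_NEXT_ICONS"))   # -> ['Music', 'SNS 1', 'Calculator', 'Chat']
```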
[0065] The FIG. 3 shows a limited overview of the modules of the second electronic device 104, but it is to be understood that other embodiments are not limited thereto. The labels or names of the modules are used only for illustrative purposes and do not limit the scope of the invention. Further, in real-time the function of one or more modules can be combined or separately executed by the same or other modules without departing from the scope of the invention. Further, the electronic device can include various other modules along with other hardware or software components, communicating locally or remotely to display and perform various actions or operations described herein. For example, the component can be, but is not limited to, a process running in the controller or processor, an object, an executable process, a thread of execution, a program, or a computer. By way of illustration, both an application running on an electronic device and the electronic device itself can be a component.
[0066] FIG. 4a illustrates an example high level architecture of the first electronic device 102, according to embodiments as disclosed herein. In an embodiment, the architecture of the first electronic device 102 includes a light weight browser based User Interface (UI) that supports wearable Device Access Management (DAM) framework, a Web Runtime (WRT), a Web kit or Web core, and a Connectivity module.
[0067] The first electronic device 102 is built on web browser technology using widgets over the WRT. By using the web technology, an easy development environment can be provided on the first electronic device 102, which is limited by its UI capability, and multiple second electronic device 104 platforms can be easily supported.
[0068] FIG. 4b illustrates an example widget framework for implementing a HS framework in the second electronic devices 104, according to the embodiments as disclosed herein. In an embodiment, the widget application is required on the first electronic device 102 to handle each and every state of the running application. As shown in the FIG. 4b, the application framework on the first electronic device 102 controls the applications running on it and shares the application states with the second electronic devices 104; thereby, managing the first electronic device 102 environment. The HS framework is part of the application manager. Every second electronic device 104 connected to the first electronic device 102 installs a specific HS widget application on the first electronic device 102 so as to interact with the specific applications and functionalities of the second electronic device 104.
[0069] Further, the HS widget application is installed on the first electronic device 102 when it is connected for the first time with the second electronic device 104. In an embodiment, the HS widget application installed on the first electronic device 102 can be a special widget built based on a standard set of specifications set by the HS framework for it to be considered as the HS application by the framework on the first electronic device 102. The HS displayed on the first electronic device 102 also gets updated depending on the active HS on the second electronic device 104. Unlike conventional systems, it is easy for the user to switch back and forth between the HSs associated with different second electronic devices 104, for example, by tapping on the HS icon or tapping a switch element on the periphery of the first electronic device 102 to switch from one HS of a second electronic device 104 to another HS of another second electronic device 104.
[0070] FIG. 5 is a flow chart illustrating a method 500 for dynamically displaying a display portion of the second electronic devices, according to embodiments as disclosed herein. At step 502, the method 500 includes pairing with each of the second electronic devices which are in close proximity with the first electronic device. The method 500 allows the communication module 214 to pair with the second electronic devices. At step 504, the method 500 includes dynamically displaying the display portion associated with different second electronic devices. The method 500 allows the display module 216 to dynamically display the display portion associated with different second electronic devices. The display portion can be controlled based on one or more inputs provided on the display of the first electronic device by the user. In an embodiment, the inputs provided by the user can be, for example, but not limited to, Tap gestures, Swipe gestures, Drag gestures, or the like.
[0071] At step 506, the method 500 includes performing the first predefined operation on the display portion associated with the second electronic devices based on the input provided by the user. At step 508, the method includes triggering the first predefined operation in the first electronic device by a controlling element based on the input provided on the controlling element by the user. In an embodiment, the controlling element can be a button at the center of the display of the first electronic device. At step 510, the method 500 includes remotely performing the second predefined operation in the second electronic device based on the first predefined operation performed on the first electronic device. At step 512, the method 500 includes triggering the second predefined operation in the second electronic device based on the first predefined operation performed on the first electronic device.
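The steps of method 500 can be summarized in a compact, self-contained sketch. The Python below walks through pairing (502), displaying (504), the first predefined operation (506/508), and the remotely triggered second predefined operation (510/512); all names and the fixed page size of 4 icons are assumptions for illustration.

```python
def method_500(nearby_devices, user_gesture):
    """Illustrative walk-through of method 500 (names and page size are assumptions)."""
    paired = {name: icons for name, icons in nearby_devices.items()}      # step 502
    displayed = {name: icons[:4] for name, icons in paired.items()}       # step 504
    first_op = "VIEW_NEXT_ICONS" if user_gesture == "tap" else "NONE"     # steps 506/508
    second_op_results = {}
    if first_op == "VIEW_NEXT_ICONS":
        for name, icons in paired.items():                                # steps 510/512
            second_op_results[name] = icons[4:8]   # device returns its next icon page
    return displayed, second_op_results

displayed, remote = method_500(
    {"Mobile": ["SMS", "Call", "SNS", "Email", "Music", "Chat"],
     "Laptop": ["Email", "Browser", "Notepad", "Folder", "Calculator"]},
    "tap")
print(displayed["Laptop"])   # -> ['Email', 'Browser', 'Notepad', 'Folder']
print(remote["Mobile"])      # -> ['Music', 'Chat']
```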
[0072] The various actions, acts, blocks, steps, and the like in method 500 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions, acts, blocks, steps, and the like may be omitted, added, modified, skipped, and the like without departing from the scope of the invention.
[0073] FIGS. 6a-6d illustrate the display interface of the first electronic device 102 divided into a plurality of grids for viewing display portions corresponding to the second electronic devices 104, according to embodiments as disclosed herein. For convenience, the first electronic device 102 is shown as a wearable device “Watch” and should not be construed as limiting the invention in any way. As shown in the FIG. 6a, the HS (i.e., the display portion) on the wearable device “Watch” can be created using 4 grids and the button at the center of the HS. The notation “602” indicates the button at the center for the user to provide inputs (i.e., gestures) to view and customize the grids. The notation “604” indicates the 4 grids of the HS on the wearable device “Watch”. The notation “606” indicates a switch element for switching the display portions (i.e., HS elements) corresponding to different second electronic devices based on a switch event received from the switch member of the wearable device “Watch”.
[0074] Further, the button at the center of the HS can be used either to view other application icons (i.e., the HS elements corresponding to the second electronic device 104) or to customize the HS grid with the user desired application icons. Here, different gestures can be performed by the user on the button to view the application icons and to customize the grids. The user can perform a “Tap” gesture on the button to view other application icons that are not present in the current HS of the wearable device “Watch”. Also, the user can perform the “Tap” gesture and a “Drag” gesture to select the grid or a group of grids for customizing the application icons in the selected grids. Once the grids are selected, the user can again perform the “Tap” gesture to select the desired application icons and lock the application icon whichever needs to be displayed on the grid.
[0075] In an embodiment, each grid can be assigned to the desired second electronic device 104. In another embodiment, each grid can be assigned to a different second electronic device 104. The application icons on the grid can be customized in the same way using the button at the center of the HS. Further, each of the grids includes an application icon, and a maximum of “4” application icons associated with the second electronic devices 104 can be displayed on the HS of the wearable device “Watch”. As shown in the FIG. 6b, the structure of the HS on the wearable device “Watch” shows the “Tap” gesture performed by the user to view and customize different application icons of the second electronic devices 104.
[0076] As shown in the FIG. 6c, the “Tap” and “Drag” gesture can be performed by the user for selecting the particular application to customize the grid with the user required application icon. Once the grid is selected, the user can perform the “Tap” gesture on the button to select the desired application icon to be displayed on the selected grid. The user can select multiple grids on the HS for customizing multiple grids with the desired application icons as shown in the FIG. 6d.
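The centre-button interaction described in FIGS. 6a-6d can be viewed as a small state machine: a “Tap” pages through icons, a “Tap” and “Drag” selects grids, and a further “Tap” locks the chosen icon into the selected grids. The following Python sketch models that flow under illustrative assumptions; the states and names are not taken from the embodiments.

```python
class CenterButton:
    """Illustrative state machine for the centre button of the 4-grid HS."""

    def __init__(self):
        self.state = "VIEWING"
        self.selected_grids = []

    def tap(self):
        if self.state == "VIEWING":
            return "show next page of application icons"
        if self.state == "GRID_SELECTED":
            self.state = "VIEWING"
            return f"lock chosen icon into grids {self.selected_grids}"
        return "ignored"

    def tap_and_drag(self, grid_indices):
        self.state = "GRID_SELECTED"
        self.selected_grids = list(grid_indices)
        return f"grids {self.selected_grids} selected for customization"

button = CenterButton()
print(button.tap())                    # browse other icons
print(button.tap_and_drag([0, 2]))     # select two grids
print(button.tap())                    # lock the icon and finish customization
```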
[0077] FIG. 7 illustrates an example scenario for dynamically displaying content corresponding to a Mobile 1, Mobile 2, and a Laptop, according to embodiments as disclosed herein. The “Watch” is connected to the Mobile 1, Mobile 2, and Laptop. The content (i.e., application icons) corresponding to the Mobile 1, Mobile 2, and Laptop are dynamically displayed on the display (i.e., the HS) of the “Watch”. The switch element on the periphery of the “Watch” can be accessed by the user for switching the content of the Mobile 1, Mobile 2, and Laptop.
[0078] FIG. 8 illustrates an example HS framework on a wearable device “Watch” to dynamically enable adding, displaying, or removing the HS of a Mobile and a Laptop, according to embodiments as disclosed herein. A common notification bar on the display screen (i.e., the HS) of the “Watch” is shared by the connected devices “Mobile” and “Laptop” as shown in the FIG. 8. The user can perform a horizontal swipe gesture on the HS of the “Mobile” to switch to the next HS of the “Laptop”. As the “Mobile” and the “Laptop” are connected to the “Watch”, 3 HSs are formed on the “Watch”. In an embodiment, the three HSs can be represented by 3 graphical dots on the display of the “Watch”. The notation “802” indicates the graphical dot corresponding to the HS-1 for accessing the display screen of the “Watch”. The notation “804” indicates the two graphical dots corresponding to the dynamically added HSs (i.e., HS-2 of the Mobile and HS-3 of the Laptop) for interacting with the “Mobile” and the “Laptop” connected to the “Watch”. In another embodiment, the switch element on the periphery of the “Watch” can be used by the user to switch the HSs for easy access to the “Mobile” and the “Laptop” connected to the “Watch”.
[0079] FIGS. 9a-9d illustrate an example scenario for transferring data between a “Mobile” and a “Laptop” connected to a “Watch”, according to embodiments as disclosed herein. As the “Mobile” and the “Laptop” are connected to the “Watch”, 3 HSs are formed on the “Watch”. Consider a scenario where the “Watch” displays the HS of the “Mobile” as shown in the FIG. 9a. The user is able to interact with the “Mobile” and the “Laptop” through the “Watch”. The user can access the application on the HS associated with the “Mobile” to trigger the “Copy” functionality from the menu launched on performing a long press on the selected data present in the file. In an embodiment, on triggering the “Copy” functionality, the data transfer is automatically initiated from the “Mobile” to the “Watch”. The user can perform the switch event on the switch element for switching to the HS of the “Laptop” to paste the data copied from the “Mobile” as shown in the FIG. 9c.
[0080] The user can navigate to the appropriate location in the “Laptop” through the “Watch” to trigger the paste functionality. The user selects a specific new folder in the “Laptop” and the data transfer from the “Watch” to the specific new folder location in the “Laptop” initiates to paste the copied data from the “Mobile” as shown in the FIG. 9d. In another embodiment, only a small meta-data is passed through the “Watch” enabling the origin and destination to establish the connection between the “Mobile” and the “Laptop” for providing the details of the data, like name, type, location etc. and the actual data transfer is initiated directly between the “Mobile” and the “Laptop” automatically the moment the paste functionality is triggered on the specific location of the “Laptop” by the user.
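The metadata-only variant of the copy/paste flow in FIGS. 9a-9d can be sketched as follows; the field names and functions are invented for illustration and only show that the watch relays lightweight metadata while the actual transfer occurs directly between the “Mobile” and the “Laptop”.

```python
# Hedged sketch of the copy/paste flow: on "copy", only a small metadata record
# passes through the watch; on "paste", the origin and destination use that
# metadata to transfer the actual file directly. Field names are illustrative.
def copy_on_device(device, path):
    """Triggered from the device's HS on the watch; returns lightweight metadata."""
    return {"origin": device, "path": path, "name": path.split("/")[-1], "type": "file"}

def paste_on_device(metadata, destination, target_folder):
    """Destination uses the metadata to pull the data directly from the origin."""
    return (f"transfer '{metadata['name']}' from {metadata['origin']} "
            f"to {destination}:{target_folder}")

clip = copy_on_device("Mobile", "/sdcard/photos/trip.jpg")       # copy on the Mobile's HS
print(paste_on_device(clip, "Laptop", "/home/user/NewFolder"))   # paste on the Laptop's HS
```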
[0081] FIG. 10 illustrates an example scenario for providing a way to launch applications on a “Mobile” and a “Laptop” from a single HS on the “Watch”, according to embodiments as disclosed herein. The persistent HS 1002 of the “Watch” as shown in the FIG. 10 is customized to include the application icons from the “Mobile” and the “Laptop”. The user can launch the applications corresponding to the “Mobile” or the “Laptop” effortlessly from the HS 1002 on the “Watch”. The top 2 grids on the HS of the “Watch” are assigned to a SNS 1 application and an Email application corresponding to the “Laptop”. The bottom 2 grids on the HS of the “Watch” are assigned to a Calculator application and a SMS application corresponding to the “Mobile” as shown in the FIG. 10. For example, the user can access the SNS 1 application or the Email application corresponding to the “Laptop” from the HS 1002 on the “Watch”. In another example, the user can access the Calculator application or the SMS application corresponding to the “Mobile” from the HS 1002 on the “Watch”. In an embodiment, the “Mobile” and the “Laptop” may include a method to mark the application icons that should be added in the persistent HS 1002 of the “Watch” when connected. The applications from the “Mobile” and the “Laptop” are displayed in one single HS 1002 on the “Watch”.
[0082] FIGS. 11a and 11b illustrate an example scenario of performing a “Tap” gesture by a user on a dynamically added HS on a “Watch” to view different application icons, according to embodiments as disclosed herein. The 4 grids of the HS on the “Watch” display the initial set of application icons present in the “Laptop” as shown in the FIG. 11a, which shows the preset application icons on the HS of the “Watch”. The user can perform the “Tap” gesture on the center button to view another set of 4 application icons from the “Laptop” connected with the “Watch” as shown in the FIG. 11b.
[0083] FIGS. 12a-12d illustrate an example scenario for customizing a grid of a dynamic HS on a “Watch”, according to embodiments as disclosed herein. The customization includes selecting the grid and the application icon on the HS of the “Watch”. The HS on the “Watch” corresponds to a single connected device. For example, the single connected device can be a “Mobile”. The user may perform the “Tap” and “Drag” gesture to select the SNS application as shown in the FIG. 12a. The user may perform one or more “Tap” gestures to browse through the application icons corresponding to the connected device as shown in the FIG. 12b.
[0084] Finally, the user may perform a “Tap” gesture on the calculator application icon as shown in the FIG. 12c. The calculator application icon will be set and is displayed on the selected grid by the user and the grid customization procedure is terminated as shown in the FIG. 12d.
[0085] FIGS. 13a-13c illustrate an example scenario for customizing multiple grids of a HS on a “Watch” selected by a user, according to embodiments as disclosed herein. The customization includes selecting multiple grids and the application icon associated with the grids on the HS of the “Watch”. The HS on the “Watch” corresponds to the single connected device. For example, the single connected device can be a “Mobile”. The user may perform the “Tap” and “Drag” gesture to select the grids associated with a SNS application, a Music application, and an Email application as shown in the FIG. 13a. The user may perform one or more “Tap” gestures to browse through the application icons corresponding to the connected device as shown in the FIG. 13b.
[0086] Finally, the user may perform a “Tap” gesture on a Notepad application icon, a Folder application icon, and a Calculator application icon on the grids selected by the user. The Notepad application icon, the Folder application icon, and the Calculator application icon are set and displayed on the selected grids and the grid customization procedure is terminated. As shown in the FIG. 13c, the 3 selected grids are replaced by another set of application icons while the Call application icon on the HS is not replaced.
[0087] FIGS. 14a-14d illustrate an example scenario for customizing a persistent HS for displaying application icons from different connected devices simultaneously, according to embodiments as disclosed herein. The persistent HS allows mapping its individual grids to different connected devices. The notation “A” indicates the grid assigned to an Email application icon corresponding to the “Laptop”. The notation “B” indicates the grid assigned to a Music application icon corresponding to the “Car stereo”. Also, the notation “C” indicates multiple grids assigned to a Call application icon and a Social Networking Service (SNS) application icon corresponding to the “Mobile” as shown in the FIG. 14a. The user may perform the “Tap” and “Drag” gesture for selecting the required grid as shown in the FIG. 14b. Further, the user may perform one or more “Tap” gestures for selecting the required application icon for customizing the selected grid from the respective connected device as shown in the FIG. 14c. The SNS application icon present on the grid assigned to the “Mobile” is replaced with a Chat application icon corresponding to the “Mobile” on the HS of the “Watch” as shown in the FIG. 14d.
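The persistent HS of FIGS. 14a-14d, in which each grid is mapped to a (device, application icon) pair and a selected grid's icon is replaced from the same device, can be modelled with a short Python sketch; the grid labels mirror the notations A, B, and C above, while the data structure itself is an assumption.

```python
# Illustrative sketch of the persistent HS: each grid maps to a (device, icon) pair,
# and a selected grid's icon is replaced while the grid keeps its device assignment.
persistent_hs = {
    "A": ("Laptop", "Email"),
    "B": ("Car stereo", "Music"),
    "C1": ("Mobile", "Call"),
    "C2": ("Mobile", "SNS"),
}

def replace_icon(hs, grid, new_icon):
    device, _ = hs[grid]
    hs[grid] = (device, new_icon)    # the grid keeps its device assignment
    return hs

replace_icon(persistent_hs, "C2", "Chat")
print(persistent_hs["C2"])   # -> ('Mobile', 'Chat')
```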
[0088] FIG. 15 illustrates a computing environment implementing the method and system for dynamically displaying display portions of the second electronic devices 104, according to embodiments as disclosed herein. As depicted in the figure, the computing environment 1502 comprises at least one processing unit 1508 that is equipped with a control unit 1504 and an Arithmetic Logic Unit (ALU) 1506, a memory 1510, a storage unit 1512, a plurality of networking devices 1516, and a plurality of Input Output (I/O) devices 1514. The processing unit 1508 is responsible for processing the instructions of the algorithm. The processing unit 1508 receives commands from the control unit 1504 in order to perform its processing. Further, any logical and arithmetic operations involved in the execution of the instructions are computed with the help of the ALU 1506.
[0089] The overall computing environment 1502 can be composed of multiple homogeneous and/or heterogeneous cores, multiple CPUs of different kinds, special media and other accelerators. The processing unit 1508 is responsible for processing the instructions of the algorithm. Further, the plurality of processing units 1508 may be located on a single chip or over multiple chips.
[0090] The algorithm, comprising the instructions and code required for the implementation, is stored in either the memory unit 1510 or the storage 1512 or both. At the time of execution, the instructions may be fetched from the corresponding memory 1510 and/or storage 1512, and executed by the processing unit 1508.
[0091] In case of any hardware implementation, various networking devices 1516 or external I/O devices 1514 may be connected to the computing environment to support the implementation through the networking unit and the I/O device unit.
[0092] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the elements. The elements shown in FIGS. 1, 2, 3, 4a, 4b, 6a-6d, 7, 8, 9a-9d, 10, 11a, 11b, 12a-12d, 13a-13c, 14a-14d, and 15 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
[0093] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
CLAIMS
We claim:
1. A method of displaying content, the method comprising:
dynamically displaying, by a first electronic device, a display portion associated with one or more second electronic devices based on a predefined criterion;
performing a first predefined operation on said display portion associated with said one or more second electronic devices in response to an input; and
remotely performing a second predefined operation on said one or more second electronic devices in response to performing said first predefined operation.
2. The method of claim 1, wherein said display portion comprises at least one home screen element of respective said one or more second electronic devices, wherein display of each said at least one home screen element is controlled based on said input.
3. The method of claim 1, wherein said display portion is dynamically removed on said display of said first electronic device in response to disconnecting corresponding said one or more second electronic devices.
4. The method of claim 1, wherein said display portion is dynamically displayed on said display of said first electronic device in response to connecting said one or more second electronic devices.
5. The method of claim 1, wherein said display of said first electronic device is divided into a plurality of grids connected to a controlling element, wherein each said grid corresponds to said display portion of said one or more second electronic devices.
6. The method of claim 5, wherein said controlling element triggers said first predefined operation in said first electronic device based on said input on said controlling element.
7. The method of claim 5, wherein said controlling element triggers said second predefined operation in said one or more second electronic devices based on said first predefined operation.
8. The method of claim 1, wherein said one or more second electronic devices share a notification area of said first electronic device, wherein said notification area displays at least one notification message associated with corresponding at least one said home screen element.
9. The method of claim 1, wherein said display portion corresponding to at least one said second electronic device from said one or more second electronic devices is switched based on a switch event received from a switch member of said first electronic device.
10. An apparatus for displaying content, the apparatus comprising:
a processor; and
a memory coupled to said processor comprising instructions executable by said processor, the processor being operable when executing the instructions to:
dynamically display, by the apparatus, a display portion associated with one or more electronic devices based on a predefined criterion;
perform a first predefined operation on said display portion associated with said one or more electronic devices in response to an input; and
remotely perform a second predefined operation on said one or more electronic devices in response to performing said first predefined operation.
11. The apparatus of claim 10, wherein said apparatus comprises:
a device body comprising said processor, said memory, and said display;
a switch member about said display; and
a detector configured to detect a switch event from said switch member, wherein said display portion corresponding to at least one said electronic device from said one or more electronic devices is switched based on said switch event.
12. The apparatus of claim 10, wherein said display portion comprises at least one home screen element of respective said electronic device, wherein display of each said at least one home screen element is controlled based on said input.
13. The apparatus of claim 10, wherein said display portion is dynamically removed on said display of said apparatus in response to disconnecting corresponding said one or more electronic devices.
14. The apparatus of claim 10, wherein said display portion is dynamically displayed on said display of said apparatus in response to connecting said one or more electronic devices.
15. The apparatus of claim 10, wherein said display of said apparatus is divided into a plurality of grids connected to a controlling element, wherein each said grid corresponds to said display portion of said one or more electronic devices.
16. The apparatus of claim 15, wherein said controlling element triggers said first predefined operation in said apparatus based on said input on said controlling element.
17. The apparatus of claim 15, wherein said controlling element triggers said second predefined operation in said one or more electronic devices based on said first predefined operation.
18. The apparatus of claim 10, wherein said one or more electronic devices share a notification area of said apparatus, wherein said notification area displays at least one notification message associated with corresponding at least one said home screen element.
19. The apparatus of claim 10, wherein said display portion corresponding to at least one said electronic device from said one or more electronic devices is switched based on a switch event received from a switch member of said apparatus.
20. A computer program product comprising computer executable program code recorded on a computer readable non-transitory storage medium, said computer executable program code, when executed, causing actions including:
dynamically displaying, by a first electronic device, a display portion associated with one or more second electronic devices based on a predefined criterion;
performing a first predefined operation on said display portion associated with said one or more second electronic devices in response to an input; and
remotely performing a second predefined operation on said one or more second electronic devices in response to performing said first predefined operation.
21. The computer program product of claim 20, wherein said display portion comprises at least one home screen element of one or more second electronic devices, wherein display of each said at least one home screen element is controlled based on said input.
22. The computer program product of claim 20, wherein said display portion is dynamically removed on said display of said first electronic device in response to disconnecting corresponding said one or more second electronic devices.
23. The computer program product of claim 20, wherein said display portion is dynamically displayed on said display of said first electronic device in response to connecting said one or more second electronic devices.
24. The computer program product of claim 20, wherein said display of said first electronic device is divided into a plurality of grids connected to a controlling element, wherein each said grid corresponds to said display portion of said one or more second electronic devices.
25. The computer program product of claim 24, wherein said controlling element triggers said first predefined operation in said first electronic device based on said input on said controlling element.
26. The computer program product of claim 24, wherein said controlling element triggers said second predefined operation in said one or more second electronic devices based on said first predefined operation.
27. The computer program product of claim 20, wherein said one or more second electronic devices share a notification area of said first electronic device, wherein said notification area displays at least one notification message associated with corresponding at least one said home screen element.
28. The computer program product of claim 20, wherein said display portion corresponding to at least one said second electronic device from said one or more second electronic devices is switched based on a switch event received from a switch member of said first electronic device.
Dated: 17th Day of November, 2014 Signature
Arun Kishore Narasani Patent Agent

Documents

Application Documents

# Name Date
1 Samsung_SRIB-20140124-004_Spec.pdf 2014-05-13
2 Form 5.pdf 2014-05-13
3 Form 3.pdf 2014-05-13
4 Drawings.pdf 2014-05-13
5 Samsung_20140124-004_Specification.pdf 2014-11-24
6 Samsung - 20140124-004-Drawings.pdf 2014-11-24
7 abstract 2284-CHE-2014.jpg 2015-01-23
8 2284CHE2014 FORM-13 06-04-2015.pdf 2015-04-06
9 Samsung POA IPM NW ADDRSS.pdf 2015-04-08
10 FORM 13-change of POA - Attroney.pdf 2015-04-08
11 2284-CHE-2014 POWER OF ATTORNEY 21-04-2015.pdf 2015-04-21
12 2284-CHE-2014 FORM-1 21-04-2015.pdf 2015-04-21
13 2284-CHE-2014 CORRESPONDENCE OTHERS 21-04-2015.pdf 2015-04-21
14 2284-CHE-2014-FER.pdf 2018-11-26
15 2284-CHE-2014-PETITION UNDER RULE 137 [15-05-2019(online)].pdf 2019-05-15
16 2284-CHE-2014-FER_SER_REPLY [15-05-2019(online)].pdf 2019-05-15
17 2284-CHE-2014-US(14)-HearingNotice-(HearingDate-13-06-2022).pdf 2022-05-25
18 2284-CHE-2014-POA [13-06-2022(online)].pdf 2022-06-13
19 2284-CHE-2014-FORM-26 [13-06-2022(online)].pdf 2022-06-13
20 2284-CHE-2014-FORM 13 [13-06-2022(online)].pdf 2022-06-13
21 2284-CHE-2014-Correspondence to notify the Controller [13-06-2022(online)].pdf 2022-06-13
22 2284-CHE-2014-AMENDED DOCUMENTS [13-06-2022(online)].pdf 2022-06-13
23 2284-CHE-2014-Written submissions and relevant documents [28-06-2022(online)].pdf 2022-06-28
24 2284-CHE-2014-Response to office action [28-06-2022(online)].pdf 2022-06-28
25 2284-CHE-2014-Proof of Right [28-06-2022(online)].pdf 2022-06-28
26 2284-CHE-2014-PETITION UNDER RULE 137 [28-06-2022(online)].pdf 2022-06-28
27 2284-CHE-2014-Annexure [28-06-2022(online)].pdf 2022-06-28
28 2284-CHE-2014-PatentCertificate29-06-2022.pdf 2022-06-29
29 2284-CHE-2014-IntimationOfGrant29-06-2022.pdf 2022-06-29

Search Strategy

1 SEARCHSTRATEGY_20-07-2018.pdf
2 SearchHistory(2284CHE2014)AE_27-10-2021.pdf

ERegister / Renewals

3rd: 13 Sep 2022 (From 07/05/2016 To 07/05/2017)
4th: 13 Sep 2022 (From 07/05/2017 To 07/05/2018)
5th: 13 Sep 2022 (From 07/05/2018 To 07/05/2019)
6th: 13 Sep 2022 (From 07/05/2019 To 07/05/2020)
7th: 13 Sep 2022 (From 07/05/2020 To 07/05/2021)
8th: 13 Sep 2022 (From 07/05/2021 To 07/05/2022)
9th: 13 Sep 2022 (From 07/05/2022 To 07/05/2023)
10th: 13 Sep 2022 (From 07/05/2023 To 07/05/2024)
11th: 07 May 2024 (From 07/05/2024 To 07/05/2025)
12th: 30 Apr 2025 (From 07/05/2025 To 07/05/2026)