
An Interaction System And Method For Configuring An Input Device

Abstract: The method for configuring the input device comprises establishing, by a processor, a communication link between the input device and a user device. An input profile of the input device is generated based on input device parameters. Thereafter, the input profile is mapped to the input device, and the mapped input profile is updated at an application server such that, in response to establishing a connection between the input device and at least one of the user device and one or more user-associated devices, the generated input profile is retrieved from the server and applied to the input device.


Patent Information

Filing Date: 29 December 2021
Publication Number: 26/2023
Publication Type: INA
Invention Field: ELECTRONICS
Email: chinthan.japhet@klaw.in

Applicants

Tesseract Imaging Limited
5 TTC Industrial Area, Reliance Corporate IT Park, Thane Belapur Road, Ghansoli, Navi Mumbai, Maharashtra – 400 701, India

Inventors

1. Devesh Ramcharan Jain
5 TTC Industrial Area, Reliance Corporate IT Park, Thane Belapur Road, Ghansoli, Navi Mumbai, Maharashtra – 400 701, India
2. Bhumit Sharad Dave
5 TTC Industrial Area, Reliance Corporate IT Park, Thane Belapur Road, Ghansoli, Navi Mumbai, Maharashtra – 400 701, India
3. Vinit Vinayak Ingle
5 TTC Industrial Area, Reliance Corporate IT Park, Thane Belapur Road, Ghansoli, Navi Mumbai, Maharashtra – 400 701, India
4. Kshitij Marwah
5 TTC Industrial Area, Reliance Corporate IT Park, Thane Belapur Road, Ghansoli, Navi Mumbai, Maharashtra – 400 701, India

Specification

Claims:

We Claim:
1. A method of configuring an input device, the method comprising the steps of:
establishing, by a processor, a communication link between the input device and a user device;
generating, by the processor, an input profile of the input device based on a plurality of input device parameters;
mapping, by the processor, the input profile to the input device irrespective of a configuration of the input device, type of the input device, and a protocol compatibility of the input device with the user device, thereby ensuring scalability and real time compatibility; and
updating, by the processor, the mapped input profile of the input device at an application server such that in response to establishing a connection between the input device and at least one of, the user device and one or more user associated devices, the generated input profile is retrieved from the application server and applied to the input device, thereby ensuring flexibility and upgradability of the input device in real time.

2. The method as claimed in claim 1, wherein the input profile comprises a set of rules to enable the input device to interact with the user device in response to receiving an input from the user via the user device, the plurality of input device parameters are one of type of the input device, plurality of keys of the input device, a state of each of the plurality of keys and one or more interaction elements of at least one application for which the keys are to be utilized.

3. The method as claimed in claim 1, further comprising:
retrieving, by the processor, the input profile of the input device from the application server;
customizing, by the processor, the input profile by customizing at least one rule of the plurality of rules of the input profile based on user preferences; and
updating, by the processor, the customized input profile pertaining to the input device at the server such that in response to establishing the connection between the input device and at least one of, the user device and the one or more user associated devices, the customized input profile is retrieved from the application server and applied to the input device, thereby ensuring flexibility and upgradability of the input device in real time.

4. The method as claimed in claim 1, wherein the input profile generated is dedicated for each of the at least one application of a plurality of applications of the user device, thereby in response to establishing the connection between the user device and the input device and subsequent to launching the at least one application, the processor in real time retrieves the input profile comprising the set of rules from the application server and the retrieved input profile is applied to the input device pertaining to the launched application.

5. The method as claimed in claim 1, wherein the input device is at least one of an electronic actuator, a mechanical actuator, a human gesture-based input device, a keyboard, a mouse, a multi-sensor input device, a human voice-based input device, and a combination thereof.

6. The method as claimed in claim 1, wherein the set of rules to enable the input device to interact with the user device is at least one of,
the state of each of the plurality of keys and a corresponding at least one output response to be performed by each of the plurality of keys in response to interaction of each of the plurality of keys with the one or more interaction elements.

7. The method as claimed in claim 6, wherein the set of rules further comprises a focusing means to facilitate the user to focus the at least one output response on the one or more interaction elements displayed on the user device.

8. The method as claimed in claim 1, wherein the user device is one of a laptop, a head mounted display, a smart watch, a smart phone, and gaming devices.

9. An interaction system for configuring an input device, the system comprising:
a memory including executable instructions; and
at least one processor in communication with the memory and configured to receive the executable instructions and configured to:
establish, a connection between the input device and a user device;
generate, an input profile of the input device based on a plurality of input device parameters, the input profile comprises a set of rules to enable the input device to interact with the user device in response to receiving an input from a user via the user device, the plurality of input device parameters is at least one of type of the input device, plurality of keys of the input device, a state of each of the plurality of keys and one or more interaction elements of at least one application for which the keys are to be utilized;
map, the input profile to the input device irrespective of a configuration of the input device, type of the input device, and a protocol compatibility of the input device with the user device, thereby ensuring scalability and real time compatibility; and
update, the generated input profile of the input device at an application server such that in response to establishing a connection between the input device and at least one of, the user device and one or more user associated devices, the generated input profile is retrieved from the application server and applied to the input device, thereby ensuring flexibility and upgradability of the input device in real time.

10. The system as claimed in claim 9, wherein in the event a new input device is connected to the user device, the processor is configured to:
identify, the type of the new input device in response to establishing the connection between the one or more user devices and the new input device;
locate, the input profile based on conducting a match search at the server to identify the input profile based on the type of the new input device;
suggest, in real time, the identified input profile to the user; and
update, the identified input profile to the new input device in response to consent received from the user.

Description:

FORM 2
THE PATENTS ACT, 1970
(39 OF 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[SEE SECTION 10, RULE 13]

AN INTERACTION SYSTEM AND METHOD FOR CONFIGURING AN INPUT DEVICE;

TESSERACT IMAGING LIMITED, A CORPORATION ORGANISED AND EXISTING UNDER THE LAWS OF INDIA, WHOSE ADDRESS IS-5 TTC INDUSTRIAL AREA, RELIANCE CORPORATE IT PARK, THANE BELAPUR ROAD, GHANSOLI, NAVI MUMBAI, MAHARASHTRA – 400 701, INDIA

THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.

FIELD OF THE INVENTION
[0001] The present invention relates to input devices, and more particularly relates to interaction systems and methods for configuring input devices.
BACKGROUND OF THE INVENTION
[0002] In the present age, interaction systems are advancing at a very fast pace. Interaction systems include interaction devices and apparatuses such as, but not limited to, computers, laptops, smart phones, tablets, and wearable devices such as smart watches and smart glasses. These interaction systems enable users to perform numerous activities, such as connecting with social and professional networks, performing work activities, gaming, learning, and monitoring day-to-day activities. Especially owing to the COVID-19 pandemic, advanced interaction systems have gained utmost significance and have become the need of the hour, since most interactions are now virtual. Accordingly, advanced interaction systems are being developed at a very fast pace.
[0003] Some of the many advanced types of interaction systems include the mixed, virtual and augmented reality systems, hereinafter referred to as reality interaction systems. These reality interaction systems are a form of spatial computing that allow users to interact with virtual objects in their surroundings. This requires the definition of a unique interaction arrangement that is consistent and intuitive for the user. The various devices associated with these reality systems often also rely on multi-modal interactions that require multiple input devices to co-exist and function simultaneously as well as separately.
[0004] Further, since the reality interaction systems are evolving at a very fast pace, new types of input devices are being integrated into these systems. During this process, compatibility issues may arise, since the new types of input devices may have been developed by a particular developer while the reality interaction systems may be developed by a separate developer. In these situations, the configuration file required to operate the new input devices may not be available instantly. Therefore, the user is required to contact the developer of the new input device and request the configuration file, which would typically be chargeable, and the developer is required to provide the configuration file each time a new input device is added. Thereafter, the user is required to download the configuration file on the user device in order to use the new input device. Therefore, integrating newly developed input devices into the reality interaction systems may prove to be a complex and time-consuming process.
[0005] Further, the objective of developing these reality interaction systems is to provide an immersive experience to the user. An immersive experience is basically an experience which is similar to the real world; in other words, it makes the user feel that he or she is part of the immersive world. In these situations, the user, based on his/her behavior in the real world, may intend to have the same experience while using the reality interaction systems. This user-specific behavior can only be expressed by the user himself or herself. Therefore, the user should be able to customize the reality interaction systems in order to have the same experience. Customizing is basically changing the manner in which multiple input devices interact with the reality interaction systems. However, since multiple input devices are involved and new input devices are added, the input devices may not be able to provide the same interaction experience as intended by the user. In these situations, the entities which developed the input devices may have to call back these devices, configure them based on user preferences, and send them back to the user, which is practically not feasible for the developer and the user due to monetary and time constraints.
[0006] Further, these reality interaction systems are being used for multiple applications such as, but not limited to, calling, watching videos, gaming and working. Each of these applications, such as gaming, may have multiple sub-applications, for instance for multiple games to be played using these reality interaction systems. Each of these sub-applications, such as an application for a particular game, may have different pre-defined rules for various types of input devices, pre-configured at the time of manufacture. Thus, the reality interaction system, whose main objective is to provide an immersive experience, may not be able to provide a customized experience; and even where possible, as indicated above, it would be a complex and time-consuming process which may not be feasible for the user and the developer.
[0007] In view of the above, there is a dire need for efficient interaction systems to integrate multiple input devices therein which facilitate in providing the user with an immersive experience.
SUMMARY OF THE INVENTION
[0008] One or more embodiments of the present invention, provide an interaction system and method for configuring an input device.
[0009] In one aspect of the invention, an interaction system to configure an input device is provided. The system comprises a memory including executable instructions and at least one processor in communication with the memory and configured to receive the executable instructions and configured to: establish, a connection between the input device and a user device, generate, an input profile of the input device based on a plurality of input device parameters, the input profile comprises a set of rules to enable the input device to interact with the user device in response to receiving an input from user via the user device, the plurality of input device parameters is at least one of type of the input device, plurality of keys of the input device, a state of each of the plurality of keys and one or more interaction elements of at least one application for which the keys are to be utilized; map, the input profile to the input device irrespective of a configuration of the input device, type of the input device, and a protocol compatibility of the input device with the user device, thereby ensuring scalability and real time compatibility; and update, the generated input profile of the input device at an application server such that in response to establishing a connection between the input device and at least one of, the user device and one or more user associated devices, the generated input profile is retrieved from the server and applied to the input device, thereby ensuring flexibility and upgradability of the input device in real time.
[0010] In yet another aspect of the invention, a method of configuring an input device is provided. The method comprises the steps of: establishing, by a processor, a communication link between the input device and a user device; generating, by the processor, an input profile of the input device based on a plurality of input device parameters; mapping, by the processor, the input profile to the input device irrespective of a configuration of the input device, type of the input device, and a protocol compatibility of the input device with the user device, thereby ensuring scalability and real time compatibility; and updating, by the processor, the mapped input profile of the input device at an application server such that in response to establishing a connection between the input device and at least one of, the user device and one or more user associated devices, the generated input profile is retrieved from the server and applied to the input device, thereby ensuring flexibility and upgradability of the input device in real time.
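The four claimed steps can be illustrated with a minimal sketch. This is not part of the specification: all function names, field names, and the in-memory stand-in for the application server are hypothetical, chosen only to make the establish/generate/map/update flow concrete.

```python
# Illustrative sketch of the claimed method; all names are hypothetical.
INPUT_PROFILE_STORE = {}  # stands in for the application server's storage unit


def establish_link(input_device, user_device):
    """Step 1: establish a communication link between the two devices."""
    return {"input_device": input_device, "user_device": user_device}


def generate_profile(device_params):
    """Step 2: generate an input profile from the input device parameters."""
    return {"rules": {}, "params": device_params}


def map_profile(profile, input_device):
    """Step 3: map the profile to the device, irrespective of its type."""
    profile["device_id"] = input_device["id"]
    return profile


def update_profile_at_server(profile):
    """Step 4: update the mapped profile at the application server."""
    INPUT_PROFILE_STORE[profile["device_id"]] = profile


def retrieve_and_apply(input_device):
    """On a later connection, retrieve the stored profile for the device."""
    return INPUT_PROFILE_STORE.get(input_device["id"])
```

On a subsequent connection from the same device (or a user-associated device), `retrieve_and_apply` returns the previously stored profile, which mirrors the "flexibility and upgradability in real time" asserted by the method.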
[0011] Other features and aspects of this invention will be apparent from the following description and the accompanying drawings. The features and advantages described in this summary and in the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the relevant art, in view of the drawings, specification, and claims hereof. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter, resort to the claims being necessary to determine such inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] Reference will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. The accompanying figures, which are incorporated in and constitute a part of the specification, are illustrative of one or more embodiments of the disclosed subject matter and together with the description explain various embodiments of the disclosed subject matter and are intended to be illustrative. Further, the accompanying figures have not necessarily been drawn to scale, and any values or dimensions in the accompanying figures are for illustration purposes only and may or may not represent actual or preferred values or dimensions. Although the invention is generally described in the context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
[0013] FIG. 1 is an environment for an interaction system, according to one or more embodiments of the present invention;
[0014] FIG. 2 is a block diagram of an interaction system to configure an input device, according to one or more embodiments of the present invention;
[0015] FIG. 3 illustrates interaction of a processor with a plurality of modules of an interaction system to configure an input device, according to one or more embodiments of the present invention;
[0016] FIG. 4 illustrates an exemplary embodiment of scanning to identify plurality of input device parameters, according to one or more embodiments of the present invention;
[0017] FIG. 5 illustrates an exemplary embodiment of scanning to identify one or more interaction elements, according to one or more embodiments of the present invention;
[0018] FIG. 6 illustrates an exemplary embodiment to set one or more rules of an input profile, according to one or more embodiments of the present invention;
[0019] FIG. 7 illustrates an exemplary embodiment to retrieve an input profile from a storage unit, according to one or more embodiments of the present invention;
[0020] FIG. 8 illustrates an exemplary embodiment to configure a new input device, in accordance with one or more embodiments of the present invention; and
[0021] FIG. 9 illustrates a flowchart of a method for configuring an input device, according to one or more embodiments of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0022] Reference will now be made in detail to specific embodiments or features, examples of which are illustrated in the accompanying drawings. Wherever possible, corresponding or similar reference numbers will be used throughout the drawings to refer to the same or corresponding parts. References to various elements described herein are made collectively or individually when there may be more than one element of the same type. However, such references are merely exemplary in nature. It may be noted that any reference to elements in the singular may also be construed to relate to the plural and vice-versa without limiting the scope of the invention to the exact number or type of such elements unless set forth explicitly in the appended claims. Moreover, relational terms such as first and second, and the like, may be used to distinguish one entity from the other, without necessarily implying any actual relationship or order between such entities.
[0023] Various embodiments of the invention provide an interaction system to configure an input device and a method thereof. The present invention is configured to provide an efficient interaction system to integrate multiple input devices therein which facilitates in providing the user with an immersive experience. The present invention can be utilized in fields such as, but not limited to, social and professional network interaction, music and/or telecom, virtual reality, mixed reality and augmented reality.
[0024] Fig. 1 illustrates an example environment for an interaction system, according to one or more embodiments of the present invention. The environment includes an interaction system 100, one or more user device 110, at least one input device 120 and an application server 130. The interaction system 100, the one or more user device 110, the at least one input device 120 and the application server 130 communicate with each other over a communications network.
[0025] The communications network can be one of, but not limited to, LAN, cable, WLAN, cellular, or satellite.
[0026] In a preferred embodiment, the user device 110 includes a display 112 and the interaction system 100.
[0027] In an alternate embodiment, the interaction system 100 may be located as an independent entity and in communication with the user device 110, the at least one input device 120 and the application server 130.
[0028] In yet another alternate embodiment, the interaction system 100 may be embedded within the application server 130.
[0029] Further, the at least one input device 120 may be connected via wired and/or wireless connection with the user device 110.
[0030] Further, the application server 130 includes a communication transceiver 132, a processor 134, a memory 136 and a storage unit 138 present within the application server 130 or located remotely outside the application server 130.
[0031] In an embodiment, the application server 130 is configured to host one or more applications.
[0032] In an alternate embodiment, various other types of servers may be used instead of the application server 130.
[0033] As shown in Fig. 2, the interaction system 100 includes a user interface module 102, a memory 104, a transceiver 106, a plurality of functionality modules and a processor 108. The processor 108 controls the operation of the display 112 of the user device 110, the user interface module 102, the memory 104, the transceiver 106 and the plurality of functionality modules. The user interface module 102 is also configured to display and facilitate the user device 110 to input and/or view data. The functionality modules will be explained in detail below with respect to the interaction system 100.
[0034] The processor 108 of the interaction system 100 and the processor 134 of the application server 130 explained hereinafter, are the processors that may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor is configured to fetch and execute computer-readable instructions stored in the memory.
[0035] The memory 104 referred to hereinafter, and in general any other storage means and/or units, may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0036] In an embodiment, the user device 110 is at least one of, but not limited to, a desktop, a laptop, a tablet, a smart phone and wearable devices, such as, but not limited to, smart watch and smart glass.
[0037] Further, in an embodiment, the at least one input device 120 may include types such as, but not limited to, an electronic actuator, a mechanical actuator, human gesture-based input devices, a keyboard, a mouse, a multi-sensory input device, a human voice-based input device, and a combination thereof. In this regard, the at least one input device 120 is one of, but not limited to, a mouse, a trackpad, a joystick, a light pen, a trackball and a microphone.
[0038] At the outset, the user is required to register with the interaction system 100 and create an account. For registration, the user is required to install the user interface module 102. During the process of installing the user interface module 102, the user is required to provide user identity information such as, but not limited to, name, age, contact number, residential address, email IDs and a preferred password. Once the user identity information is input by the user, the processor 108 of the interaction system 100 checks to verify whether the details provided are in order. In case there are any errors, the same is indicated to the user for rectification. Thereafter, a verification code is sent to either the user's email address or the contact number of the user. The user is required to input the verification code at the user device 110. Once this task is completed, the user is said to be registered. The credentials received from the user are stored in the memory 104 of the user device 110.
[0039] In a preferred embodiment, the user is the end user of the user device 110.
[0040] In an alternate embodiment, the user may include the developer of the interaction system 100 and/or any associated entities thereof.
[0041] Once the user interface module 102 is installed and the user is registered, a communication link 140 is established between the user device 110, the interaction system 100 and the application server 130.
[0042] In an embodiment, the communication link 140 is a communications channel that connects two or more devices over the communications network, herein mainly the user device 110, the interaction system 100 and the application server 130, for the purpose of data transmission. The communication link 140 may be a dedicated physical link or a virtual circuit that uses one or more physical links or shares a physical link with other telecommunications links.
[0043] Pursuant to the establishment of the communication link 140, the memory 104 of the interaction system 100 as well as the application server 130 receive a user identifier from the user device 110. The user identifier may be an automatically generated identifier of the user, or it may include the user identity information.
[0044] Once the user identifier information is received at the application server 130, a user profile is generated at the storage unit 138.
[0045] In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, written in a programming language, such as, for example, Java, C, or assembly. One or more software instructions in the modules may be embedded in firmware, such as an EPROM. It will be appreciated that modules may comprise connected logic units, such as gates and flip-flops, and may comprise programmable units, such as programmable gate arrays or processors. The modules described herein may be implemented as software and/or hardware modules and may be stored in any type of computer-readable medium or other computer storage device.
[0046] Further, while one or more operations have been described as being performed by or otherwise related to certain modules, devices or entities, the operations may be performed by or otherwise related to any module, device or entity. As such, any function or operation that has been described as being performed by a module could alternatively be performed by a different server, by the cloud computing platform, or a combination thereof.
[0047] Once the user profile is generated at the application server 130, the processor 108 of the interaction system 100 may indicate via the display 112 of the user device 110 that the user may proceed with connecting the input device to the user device 110.
[0048] In an embodiment, the user may connect the input device to the user device 110 via wired and/or wireless connection using techniques well known in the art.
[0049] Once the at least one input device is connected to the user device 110, the processor 108 checks if the connection is valid. In case the connection is not valid, the processor 108 may indicate to the user to reconnect the input device again to the user device 110 via the display 112.
[0050] With reference to Fig. 3, the interaction of the processor 108 with a plurality of modules of the interaction system 100 to configure the input device is illustrated, according to an embodiment of the present invention. As shown in Fig. 3, the processor 108 includes a data provider module 302, a data mapper module 304, a data pointer module 306 and a data updating module 308. These modules as shown in Fig. 3 are controlled by the processor 108.
[0051] In an embodiment, the data provider module 302, the data mapper module 304, the data pointer module 306 and the data updating module 308 are in communication with the processor 108.
[0052] In an alternate embodiment, the data provider module 302, the data mapper module 304, the data pointer module 306 and the data updating module 308 are present within the processor 108 itself.
[0053] Once the connection between the user device 110 and the input device 120 is successful, the processor 108 automatically scans the input device 120 by retrieving a configuration file at the data provider module 302 from the input device 120 and identifying the input device parameters of the input device 120.
[0054] In an alternate embodiment, the processor 108 transmits a configuration file request to the input device 120 and thereafter receives the configuration file upon the approval of the configuration file request from the input device 120.
[0055] In an embodiment, the configuration file is a file used to configure the input device 120.
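The two alternative flows described in paragraphs [0053] and [0054] — direct retrieval of the configuration file, versus a request that is fulfilled only on approval — can be sketched as follows. This is an assumption-laden illustration, not the specification's implementation; the device is modeled as a plain dictionary.

```python
# Hypothetical sketch of how the configuration file may reach the
# data provider module 302; the device dictionary fields are assumptions.


def retrieve_config_file(input_device):
    """Direct retrieval on a valid connection (cf. paragraph [0053])."""
    return input_device.get("config_file")


def request_config_file(input_device):
    """Alternate flow (cf. paragraph [0054]): transmit a configuration
    file request; the file is received only if the request is approved."""
    if input_device.get("approves_request"):
        return input_device["config_file"]
    return None
```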
[0056] In an embodiment, once the configuration file is received at the data provider module 302 by the processor 108, the same is transmitted to the data mapper module 304.
[0057] At the data mapper module 304, the processor 108 firstly scans the data retrieved from the data provider module 302. In an embodiment, the data provider module 302 includes data pertaining to the connected input device 120.
[0058] In an embodiment, at the data mapper module 304, the processor 108 scans the data of the input device 120 based on a plurality of input device parameters.
[0059] In an embodiment, the plurality of input device parameters are parameters which are essential in order to operate the input device 120 in response to connecting with the user device 110.
[0060] In an embodiment, the input device parameters include at least one of, but not limited to, type of the input device, plurality of keys of the input device, a state of each of the plurality of keys and one or more interaction elements of at least one application for which the keys are to be utilized.
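Taken together, the parameters above suggest an input profile shaped as one rule slot per (key, state) pair. The sketch below is an assumption about that shape — the specification does not prescribe a data structure — with the output responses left empty to be filled in later from user input.

```python
# Hypothetical data-structure sketch of an input profile built from the
# scanned input device parameters; all field names are assumptions.


def generate_input_profile(device_type, keys, states, interaction_elements):
    """Build an input profile with one rule slot per (key, state) pair."""
    rules = {}
    for key in keys:
        for state in states:
            # The output response for each key state is defined later,
            # from rules received from the user at the data mapper module.
            rules[(key, state)] = None
    return {
        "device_type": device_type,
        "interaction_elements": interaction_elements,
        "rules": rules,
    }
```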
[0061] As discussed above, the processor 108, taking into consideration the plurality of input device parameters, scans for the type of the input device. The type of the input device may be an electronic actuator, a mechanical actuator, a human gesture-based input device, a multi-sensory input device, a human voice-based input device, or a combination thereof.
[0062] Thereafter, the plurality of keys associated with the input device is scanned. As is well known in the art, the input device includes the plurality of keys. The plurality of keys may operate on electronic, mechanical, human gesture or multi-sensory mechanisms. For instance, for electronic or mechanical actuator-based input devices, the keys are physical in nature. Input devices including keys which are physical in nature may be one of, but not limited to, the keyboard, mouse, joystick, trackpad, light pen and trackball.
[0063] Similarly, human voice-based keys may include the microphone and the multi-sensory or human gesture-based keys may include those virtual keys which detect different movements of various parts of the human body as inputs.
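The input device parameters described above can be illustrated with a minimal data-structure sketch. All names and values here are hypothetical, chosen only to mirror the handheld-remote example used in this disclosure; the actual representation used by the interaction system 100 is not specified.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the scanned input device parameters:
# device type, keys, key states, and application interaction elements.
@dataclass
class InputDeviceParameters:
    device_type: str                      # e.g. "mechanical_actuator", "voice", "gesture"
    keys: list = field(default_factory=list)                 # e.g. ["left", "right", "center"]
    key_states: dict = field(default_factory=dict)           # e.g. {"left": ["pressed", "held"]}
    interaction_elements: list = field(default_factory=list) # e.g. ["search_bar"]

# Parameters as they might be scanned for the handheld remote of the example.
remote = InputDeviceParameters(
    device_type="mechanical_actuator",
    keys=["left", "right", "top", "bottom", "center"],
    key_states={"center": ["pressed", "scrolled"]},
    interaction_elements=["search_bar"],
)
```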
[0064] Once the plurality of keys is identified based on scanning, the processor 108 identifies the application the user is currently signed into on the user device 110.
[0065] In an embodiment, the application is a specific portal displayed on the display 112 of the user device 110 to enable the user to carry out operations in order to produce user-desired outputs. For example, let us consider that the user is signed into a video viewing application “A”.
[0066] Based on identifying the type of the application as the video viewing application, the processor 108 scans and identifies the states of each of the plurality of keys.
[0067] In an embodiment, the state of each of the plurality of keys is a state of the key for which an output response is required to be defined.
[0068] For example, considering the application identified as the video viewing application which is displayed on the display 112 of the user device 110, the processor 108 scans the data mapper module 304 to identify at least one state for each key of the plurality of keys of the input device 120. For instance, the particular video viewing application that the user is signed into is the video viewing application “A” as shown in Fig. 4. Based on the scan of the type of the input device, the input device identified is a handheld remote. Further, five keys of the handheld remote are identified, namely a left key, a right key, a top key, a bottom key and a center key.
[0069] Once the input device 120 and the corresponding application are scanned based on the input device parameters, one or more rules for each state of each key of the plurality of keys, together with the corresponding output response required/intended by the user, are received from the user by the processor 108 at the data mapper module 304.
[0070] In an embodiment, the processor 108 may receive the one or more rules for each state of each key of the plurality of keys and the corresponding output response from the user via the user interface module 102.
[0071] In an alternate embodiment, the processor 108 may receive the one or more rules for each state of each key of the plurality of keys and the corresponding output response from a different electronic device including the interaction system 100 which is in turn connected to the user device 110.
[0072] Fig. 6 illustrates an exemplary embodiment to set one or more rules of an input profile. The user interface module 102 generates a user interface 610 which allows the user to set the one or more rules for the input device 120 in response to connecting the input device 120 to the user device 110.
[0073] In an embodiment, the set of rules is defined by the user, wherein the user may be the end user using the user device 110, or a developer or associated entity thereof at the time of manufacturing the input device 120. For example, as shown in Fig. 6, the user may intend that pressing the left key of the handheld remote, wherein pressing the left key is the state, generates an output response of rewinding the video played on the user device 110, this mapping being a rule. Holding the pressed left key for a period, which is another state of the left key, may be required by the user to generate another output response of fast-rewinding the video at a faster pace as per the user's preference. Similarly, pressing the right key, wherein pressing the right key is the state, may be required to generate the output response of forwarding the video played on the user device 110, and holding the pressed right key for a period, which is another state of the right key, may be required to generate another output response of fast-forwarding the video at a faster pace as per the user's preference. Further, the top key when pressed, the pressed top key being the state, may be required to generate the output response of increasing the volume, and the bottom key when pressed, which is a state of the bottom key, may be required to generate the output response of decreasing the volume. Further, the center key when pressed may be required to generate the output response of pausing the video which is played, playing the video from a paused state, or beginning a new video altogether. The center key may also be partially rotatable about its axis, which is another state of the center key, and may be required to generate the output response of focusing, by means of a pointer, on the video being played on the user device 110.
In this regard, in response to the center key being scrolled about its axis, the pointer movable on the display 112 of the user device 110 may be moved corresponding to the direction of scrolling the center key. For example, if the center key is scrolled to the right, the pointer displayed on the display 112 may move to the right, etc.
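The per-key rules of the example above amount to a lookup from a (key, state) pair to an output response. The following is a minimal sketch of such a rule table, assuming hypothetical key, state and response names that mirror Fig. 6; it is illustrative only, not the interaction system's actual data format.

```python
from typing import Optional

# Hypothetical rule table for the handheld-remote / video-application example:
# each (key, state) pair maps to the user's intended output response.
RULES = {
    ("left", "pressed"):          "rewind",
    ("left", "held"):             "fast_rewind",
    ("right", "pressed"):         "forward",
    ("right", "held"):            "fast_forward",
    ("top", "pressed"):           "volume_up",
    ("bottom", "pressed"):        "volume_down",
    ("center", "pressed"):        "play_pause",
    ("center", "scrolled_right"): "move_pointer_right",
}

def output_response(key: str, state: str) -> Optional[str]:
    """Return the output response defined for a key/state pair, if any."""
    return RULES.get((key, state))
```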
[0074] Further, the one or more interaction elements which may be part of the application are identified by the processor 108. In the current example as shown in Fig. 5, the interaction elements forming part of the video application “A” are identified based on the scan.
[0075] In an embodiment, the one or more interaction elements are one or more specific defined areas of interest displayed on the user device 110 pertaining to the application. The one or more interaction elements allow the user to focus the output response on the specific defined areas of interest in response to the input received via the input device 120. As an example, shown in Fig. 5, one interaction element is a search bar. The user, with the help of a focusing means such as, but not limited to, a pointer, may focus on the search bar. Another interaction element is a video selection option which may be accessed by scrolling between a first end and a second end of the screen displayed on the user device 110. In the current example, the user may set the rule that providing the input by scrolling, using the center key, between the first end and the second end of the video selection option performs the output response of selecting the video of choice from the multiple videos displayed. Further, as shown in the present example, the search bar, which is another interaction element, is displayed on the user device 110 pertaining to the application “A”. The user may set the rule of accessing the search bar by scrolling using the center key in order to generate the output response of focusing a pointer provided on the user device 110 onto the search bar. Once the pointer is focused onto the search bar, the video of choice may be searched by providing its title using either a virtual keypad generated on the user device 110 for the application or a physical keypad provided additionally on the input device, herein the remote.
[0076] In an embodiment, the interface 610 to allow the user to set the rules of the input device as shown in Fig. 6 is merely exemplary in nature and should nowhere be construed as limiting the scope of the present disclosure.
[0077] In an embodiment, once the set of rules are defined by the user via the user interface module 102, the processor 108 generates the input profile of the input device 120. Thereafter, the input profile is mapped onto the input device 120.
[0078] In an embodiment, the input profile which is generated is mapped onto the input device 120 at the data mapper module 304. Further, in the event the user intends to set a rule to focus the output response on at least one interaction element, the processor 108, using the data pointer module 306, integrates a focusing means, such as but not limited to a pointer, to facilitate the user in focusing the at least one output response on the one or more interaction elements displayed on the user device 110. While mapping the input profile, the processor 108 ensures that each state of each key of the plurality of keys is mapped onto its respective one or more rules as defined by the user, as shown in the example of Fig. 6. Further, the processor 108 ensures that the input profile is mapped to the input device 120 irrespective of a configuration of the input device, the type of the input device, and a protocol compatibility of the input device with the user device 110, thereby ensuring scalability and real time compatibility. For instance, while mapping the input profile to the input device, if the input device 120 is of a different configuration, is a new type of input device, or is not compatible with the protocol of the user device 110, the processor 108 of the interaction system 100 determines the compatibility of the input device 120 with the user device 110 by first ensuring that the configuration file is present with the input device 120.
[0079] Thereafter, the processor 108 checks whether the format of the configuration file of the input device 120 is compatible/valid with respect to the configuration file formats acceptable to the user device 110. This step is performed by the processor 108 either by comparing against pre-stored acceptable formats for the user device 110 stored at the user device memory or storage unit, or by requesting the pre-stored acceptable formats from the application server 130, which may have a database of the same stored at the storage unit 138. If the format of the configuration file is not compatible/valid with the user device 110, the processor 108 standardizes the format of the configuration file to ensure the input device 120 is compatible with the user device 110 with respect to configuration and protocol.
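The format check and standardization step described in paragraph [0079] can be sketched as follows. The set of acceptable formats and the canonical fallback format are assumptions for illustration; the disclosure does not fix particular file formats.

```python
# Illustrative sketch of the configuration-file compatibility check:
# verify the input device's configuration-file format against the formats
# the user device accepts, standardizing it when it is not accepted.
ACCEPTED_FORMATS = {"json", "xml"}   # assumed pre-stored acceptable formats

def ensure_compatible(config_format: str, standard_format: str = "json") -> str:
    """Return a format accepted by the user device, standardizing if needed."""
    if config_format in ACCEPTED_FORMATS:
        return config_format
    # Not acceptable: standardize to the user device's canonical format.
    return standard_format
```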
[0080] In an embodiment, once the input profile is mapped onto the input device 120, the data of the input device 120 pertaining to the scanned input device parameters along with the mapped input profile is transmitted to the application server 130 as shown in Fig. 1 from the interaction system 100 and stored at the storage unit 138. Therefore, the storage unit 138 includes the input profile for each application for each of the input device 120 connected to the user device 110.
[0081] Therefore, in the future, in the event the input device 120 which is mapped with the input profile for each application stored at the storage unit 138 is connected to the user device 110, the processor 108 retrieves the respective input profile from the application server 130 in response to the user signing into the application, and the input profile is applied to the input device 120. This advantageously ensures flexibility and upgradability of the input device 120 in real time. The input profile being retrieved from the application server 130 and applied to the input device 120 is explained further with an example as shown in Fig. 7.
[0082] In an embodiment, the application server 130 transmits the input profile via the communication transceiver 132 to the interaction system 100, which may be present within the user device 110.
[0083] As shown in Fig. 7, the storage unit 138 of the application server 130 includes a catalogue of input profiles of each input device 120 for each application. As illustrated above, each of the input profiles includes the set of rules mapped onto the input device 120 based on the user preferences for a particular application. As seen in Fig. 7, the input device 1 includes the input profile 1(A), which indicates that the input profile 1(A) is related to the input device 1 pertaining to the application “A”. Similarly, the input device 2 includes the input profile 2(A), which indicates that the input profile 2(A) is related to the input device 2 pertaining to the application “A”.
[0084] Further, as seen in Fig. 7, the input device 1 includes the input profile 1(B), which indicates that the input profile 1(B) is related to the input device 1 pertaining to the application “B”. Similarly, the input device 2 includes the input profile 2(B), which indicates that the input profile 2(B) is related to the input device 2 pertaining to the application “B”.
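The catalogue of Fig. 7 can be sketched as a lookup keyed by (input device, application), so that connecting a mapped device while signed into an application selects exactly one profile. Device and profile identifiers below are hypothetical stand-ins for those in the figure.

```python
# Sketch of the Fig. 7 catalogue at the application server's storage unit:
# one input profile per (input device, application) pair.
CATALOGUE = {
    ("device_1", "A"): "input_profile_1A",
    ("device_2", "A"): "input_profile_2A",
    ("device_1", "B"): "input_profile_1B",
    ("device_2", "B"): "input_profile_2B",
}

def retrieve_profile(device_id: str, application_id: str) -> str:
    """Retrieve the input profile for a device/application pair, as the
    processor would request it from the application server."""
    return CATALOGUE[(device_id, application_id)]
```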
[0085] Therefore, as seen in Fig. 7, the various input profiles for each input device 120 are stored at the storage unit 138 for each application. In the event any of the input devices 120 which have been mapped is connected to the user device 110, the processor 108 in real time retrieves the respective input profile for the respective application from the application server 130 and applies the same to the input device 120. To be more precise, in the event the input device 1 is connected to the user device 110 and the user signs into the application “A”, the processor 108 in real time transmits a request to the application server 130 via the transceiver 106 to retrieve the input profile 1(A). Similarly, if the input device 2 is connected to the user device 110 for the same application “A”, the processor 108 transmits a request to the application server 130 to retrieve the input profile 2(A) in real time. In response to the request by the processor 108, the server transmits the input profiles 1(A) and 2(A) to the processor 108. The processor 108, subsequent to receiving the respective input profiles, applies the same to the input devices 1 and 2 respectively, advantageously ensuring flexibility and upgradability of the input device in real time. Advantageously, multiple input devices may be connected to the user device 110 for the same application, thereby providing an immersive experience to the user in the case of virtual/mixed/augmented reality interaction systems, since each of the input devices 120 functions as per the user's preferences and also functions in collaboration with the others in case the application requires multiple input devices. Especially in virtual, mixed or augmented reality environments, the user will be able to have an immersive experience which may replicate how the user interacts with the real world, due to each of the multiple input devices functioning as per the user's preferences.
[0086] Further, in the event one or more user associated devices registered to the user are connected to the one or more input devices which are mapped with the input profile, the processor 108 ensures that the one or more user associated devices are also updated with the input profile of the respective input devices at the data updating module 308. For example, if the user has registered with the user device “X” and subsequently connects another user associated device “Y”, which also belongs to the user, with the input device 1, the processor 108 retrieves the respective input profile from the application server 130 and applies the same to the connected input device 1. This advantageously ensures that the user can use multiple user devices and avail the same experience as the user would get while using the original user device 110 through which the input profile was mapped.
[0087] As indicated above, the user may be the end user using the input device onto which the input profile has been mapped based on the user's preference. Further, the user may be the developer or any associated entity using the input device onto which the input profile has been mapped at the time of manufacturing the input device. In both situations, whether the user is the end user or the developer, the input profile may be customized in the future by the processor 108 retrieving the input profile and thereafter customizing at least one rule of the plurality of rules of the input profile based on user preferences. In an embodiment, the user may provide the input for customizing the at least one rule of the input profile via the user interface module 102. The feature to customize the input profile even at a later stage, when the input device 120 is deployed to the end users, advantageously provides flexibility and upgradability of the input device 120 in real time. Further, the customizing feature prevents the requirement to call back the input device in order to be customized by the developer of the input device, thereby saving cost and time for the developer (manufacturer) and the end user.
[0088] The customizing feature is explained with an example with reference to Fig. 6. For instance, let us assume that the input device has been mapped for the application “A”, wherein the input device includes five keys, namely a left key, a right key, a top key, a bottom key and a center key. The one or more rules set by the user for each key of the input device are as follows: the left key pressed is for “rewind the video”, the left key pressed and held for a time period is for “fast-rewinding the video”, the right key pressed and held for a time period is for “fast-forwarding the video”, the top key pressed is for “increasing volume”, the bottom key pressed is for “decreasing volume”, the center key pressed is for “video pause and play”, etc.
[0089] In view of the above set rules for each key of the input device, in the event the user intends to customize at least one rule for any one key, a rule such as the left key being for “rewind the video” may be customized to “forward the video” instead, based on the input received from the user via the user interface module.
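The customization of paragraph [0089] is, in effect, replacing one (key, state) rule in a retrieved profile before re-saving it to the server. A minimal sketch, with hypothetical names, might look like this:

```python
# Hypothetical sketch of customizing a single rule of a retrieved input
# profile: the left key's response is changed from "rewind" to "forward".
def customize_rule(profile: dict, key: str, state: str, new_response: str) -> dict:
    """Return a copy of the profile with one (key, state) rule replaced."""
    updated = dict(profile)
    updated[(key, state)] = new_response
    return updated

# Retrieved profile (stand-in), then the customization from the example above.
profile = {("left", "pressed"): "rewind", ("right", "pressed"): "forward"}
profile = customize_rule(profile, "left", "pressed", "forward")
```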
[0090] Further, in the event a new input device 810 which has not been mapped with an input profile in the past is connected to the user device 110, the processor 108 first identifies the type of the new input device 810. Thereafter, the input profile is located by conducting a match search at the application server 130 to identify the input profile based on the type of the input device. Thereafter, the processor 108 suggests, in real time, the identified input profile to the user and updates the identified input profile pertaining to the new input device 810 in response to consent received from the user. This feature is explained with an example as shown in Fig. 8. As per Fig. 8, the new input device 810 is connected to the user device 110. In response, the processor 108 identifies the type of the input device. Once the type of the input device is identified based on the scan, the processor 108 compares the type of the new input device 810 with the data of all the mapped available input devices stored at the application server 130. Based on the comparison, as per the present example, if the input device 1 is found to be at least partially relevant to the new input device 810, the processor 108 retrieves all the relevant input profiles pertaining to the input device 1, which in the present example are Input Profile 1 (A) and Input Profile 1 (B), and suggests the same to the user. Finally, these input profiles are updated to the new input device 810 once the user provides consent to the suggestion by the processor 108.
[0091] In an alternate embodiment, in response to the user signing into a particular application, herein let us consider application “A”, then the processor 108 only retrieves the relevant input profile pertaining to the application “A”, which in the present example is Input Profile 1 (A). Advantageously, this way any new input device 810 may also be mapped in real time with already existing input profiles, thereby saving on the additional development effort and time to implement the new input device 810.
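The match search of paragraphs [0090] and [0091] can be sketched as comparing the new device's type against the mapped devices recorded at the server and collecting the corresponding profiles, optionally filtered to the application the user is signed into. All identifiers below are hypothetical.

```python
from typing import Optional

# Stand-ins for server-side records: mapped devices and their profiles.
MAPPED_DEVICES = {"device_1": "handheld_remote", "device_2": "joystick"}
PROFILES = {
    ("device_1", "A"): "input_profile_1A",
    ("device_1", "B"): "input_profile_1B",
    ("device_2", "A"): "input_profile_2A",
}

def suggest_profiles(new_device_type: str, application: Optional[str] = None) -> list:
    """Match a new device's type against mapped devices and suggest their
    profiles; filter to one application when the user is signed in."""
    matches = [d for d, t in MAPPED_DEVICES.items() if t == new_device_type]
    return [
        p for (d, app), p in PROFILES.items()
        if d in matches and (application is None or app == application)
    ]
```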
[0092] A few of the technical advantages of the present invention are listed below:
-Ease of design and development of the interaction system 100, especially for virtual/mixed reality applications, with an input device agnostic architecture.
-The interaction system 100 allows addition of new input device 810 without any additional development effort.
-The interaction system 100 ensures consistency for end-users by allowing all input devices to be mapped to the same or different output responses.
-The interaction system 100 allows updating the data mapper module 304 from the application server 130, which further reduces re-development effort while still ensuring consistency of interaction for end-users.
[0093] FIG. 9 shows a flowchart of a computer implemented method for configuring an input device. For the purpose of description, the method is described with the embodiments as illustrated in Fig. 1 to Fig. 8. The method comprises the steps as indicated below:
[0094] At step 905, establishing, by a processor 108, a communication link 140 between the input device and a user device.
[0095] At step 910, generating, by the processor 108, an input profile of the input device based on a plurality of input device parameters.
[0096] At step 915, mapping, by the processor 108, the input profile to the input device irrespective of a configuration of the input device, type of the input device, and a protocol compatibility of the input device with the user device, thereby ensuring scalability and real time compatibility.
[0097] At step 920, updating, by the processor 108, the mapped input profile of the input device at a server such that in response to establishing a connection between the input device and at least one of, the user device and one or more user associated devices, the generated input profile is retrieved from the server and applied to the input device, thereby ensuring flexibility and upgradability of the input device in real time.
[0098] In an embodiment, the method 900 further comprises:
[0099] Retrieving, by the processor 108, the input profile of the input device from the server.
[00100] Customizing, by the processor 108, the input profile based on customizing at least one rule of the plurality of rules of the input profile based on user preferences.
[00101] Updating, by the processor 108, the customized input profile pertaining to the input device at the server such that in response to establishing the connection between the input device and the one or more user associated devices, the customized input profile is retrieved from the server and applied to the input device, thereby ensuring flexibility and upgradability of the input device in real time.
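The method steps 905 to 920 above can be sketched end to end as follows. This is a minimal illustration under assumed names (the class, its members, and the dictionary standing in for the application server are all hypothetical), not the claimed implementation.

```python
class InputDeviceConfigurator:
    """Sketch of method steps 905-920 of Fig. 9 (all names are assumptions)."""

    def __init__(self):
        self.server = {}                  # stands in for the application server

    def configure(self, device_id: str, user_device_id: str, parameters: dict) -> None:
        link = (device_id, user_device_id)           # step 905: communication link
        profile = {"parameters": parameters,         # step 910: generate profile
                   "rules": parameters.get("rules", {})}
        mapped = {"device": device_id, "profile": profile}  # step 915: map profile
        self.server[device_id] = mapped              # step 920: update at server

    def on_connect(self, device_id: str) -> dict:
        # On a later connection, the stored profile is retrieved and applied.
        return self.server[device_id]["profile"]
```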
[00102] While aspects of the present invention have been particularly shown and described with reference to the embodiments above, it will be understood by those skilled in the art that various additional embodiments may be contemplated by the modification of the disclosed machines, systems and methods without departing from the scope of what is disclosed. Such embodiments should be understood to fall within the scope of the present invention as determined based upon the claims and any equivalents thereof.

Documents

Application Documents

# Name Date
1 202121061488-STATEMENT OF UNDERTAKING (FORM 3) [29-12-2021(online)].pdf 2021-12-29
2 202121061488-POWER OF AUTHORITY [29-12-2021(online)].pdf 2021-12-29
3 202121061488-FORM 1 [29-12-2021(online)].pdf 2021-12-29
4 202121061488-FIGURE OF ABSTRACT [29-12-2021(online)].jpg 2021-12-29
5 202121061488-DRAWINGS [29-12-2021(online)].pdf 2021-12-29
6 202121061488-DECLARATION OF INVENTORSHIP (FORM 5) [29-12-2021(online)].pdf 2021-12-29
7 202121061488-COMPLETE SPECIFICATION [29-12-2021(online)].pdf 2021-12-29
8 202121061488-Proof of Right [20-01-2022(online)].pdf 2022-01-20
9 Abstract1.jpg 2022-03-22
10 202121061488-Request Letter-Correspondence [28-12-2022(online)].pdf 2022-12-28
11 202121061488-Power of Attorney [28-12-2022(online)].pdf 2022-12-28
12 202121061488-Form 1 (Submitted on date of filing) [28-12-2022(online)].pdf 2022-12-28
13 202121061488-Covering Letter [28-12-2022(online)].pdf 2022-12-28
14 202121061488-CERTIFIED COPIES TRANSMISSION TO IB [28-12-2022(online)].pdf 2022-12-28
15 202121061488-FORM 3 [16-01-2023(online)].pdf 2023-01-16
16 202121061488-FORM 18 [05-03-2025(online)].pdf 2025-03-05