
Method And System For Detecting Position Of An Input Device

Abstract: Method and system for determining the position of an input device with respect to a User Equipment (UE). The input device is equipped with a light source that can be configured to emit light rays of a pre-configured wavelength. The UE captures the light emitted by the input device, using an array of light receptors, and by processing the light rays further, identifies the position of the input device with respect to the position of the UE. The input device can be configured to provide at least an option to vary the wavelength of the light being emitted. The UE triggers pre-configured action(s) that match the determined position of the input device and the wavelength of the light received. FIG. 3


Patent Information

Application #
201641004092
Filing Date
04 February 2016
Publication Number
32/2017
Publication Type
INA
Invention Field
PHYSICS
Status
Parent Application
Patent Number
Legal Status
Grant Date
2022-05-17
Renewal Date

Applicants

SAMSUNG R&D Institute India - Bangalore Private Limited
#2870, Phoenix Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bengaluru, Karnataka 560037

Inventors

1. Vivek Vilas Galatage
Samsung R&D Institute, Phoenix Building, 2870 Bagmane Const.Biz.Pk, ORR Doddanekkundi, Marathahalli Post, Bangalore Karnataka 560037 INDIA
2. Ravi Phaneendra Kasibhatla
Samsung R&D Institute, Phoenix Building, 2870 Bagmane Const.Biz.Pk, ORR Doddanekkundi, Marathahalli Post, Bangalore Karnataka 560037
3. Amogh Jawahar Bihani
Samsung R&D Institute, Phoenix Building, 2870 Bagmane Const.Biz.Pk, ORR Doddanekkundi, Marathahalli Post, Bangalore Karnataka 560037
4. Mani Shyam Patro Behara
Samsung R&D Institute, Phoenix Building, 2870 Bagmane Const.Biz.Pk, ORR Doddanekkundi, Marathahalli Post, Bangalore Karnataka 560037

Specification

Claims:
What is claimed is:
1. A method for determining position of an input device with respect to position of a User equipment (UE), said method comprising:
receiving a plurality of light rays from a light source associated with said input device, by a plurality of light receptors of said UE;
identifying an isosceles triangle with smallest base, of a plurality of isosceles triangles formed by said plurality of light rays with respect to said plurality of light receptors, by a position detector of said UE;
determining value of an angle (θ) being formed by sides of said isosceles triangle, by said position detector;
determining value of a distance (d) as distance between bases of any two consecutive triangles of said plurality of isosceles triangles, by said position detector;
determining value of a horizontal component (x), based on values of said angle (θ) and said distance (d), by said position detector;
determining value of a vertical component (y), by said position detector; and
determining position of said input device, based on said horizontal component (x) and said vertical component (y), by said position detector, wherein said horizontal and vertical components are determined with respect to position of said UE.
2. The method as claimed in claim 1, wherein value of said horizontal component (x) is determined as product of said distance (d) and cotangent of said angle (θ).
3. The method as claimed in claim 1, wherein value of said vertical component (y) is determined as equal to distance between edge of an array of light receptors formed by said plurality of light receptors and centre of said array of light receptors.
4. A system for determining position of an input device with respect to position of a User equipment (UE), said system configured for:
receiving a plurality of light rays from a light source associated with said input device, by a plurality of light receptors of said UE;
identifying an isosceles triangle with smallest base, of a plurality of isosceles triangles formed by said plurality of light rays with respect to said plurality of light receptors, by a position detector of said UE;
determining value of an angle (θ) being formed by sides of said isosceles triangle, by said position detector;
determining value of a distance (d) as distance between bases of any two consecutive triangles of said plurality of isosceles triangles, by said position detector;
determining value of a horizontal component (x), based on values of said angle (θ) and said distance (d), by said position detector;
determining value of a vertical component (y), by said position detector; and
determining position of said input device, based on said horizontal component (x) and said vertical component (y), by said position detector, wherein said horizontal and vertical components are determined with respect to position of said UE.
5. The system as claimed in claim 4, wherein said UE is configured to determine value of said horizontal component (x) as product of said distance (d) and cotangent of said angle (θ).


6. The system as claimed in claim 4, wherein said UE is configured to determine value of said vertical component (y) as equal to distance between edge of an array of light receptors formed by said plurality of light receptors and centre of said array of light receptors.

Dated this 4th February, 2016

Description:

TECHNICAL FIELD
[001] The embodiments herein relate to digital User Equipments and, more particularly, to detecting the position of an input device with respect to a digital User Equipment.

BACKGROUND
[002] Digital devices are constantly evolving, not just in terms of technology, but in terms of size and shape. Devices such as smartphones and tablet PCs are popular for the types of services they offer. Wearable devices are another category, introduced in the recent past, that is gaining popularity. In an attempt to make devices visually appealing and easy to carry around, most device manufacturers are opting for compact designs for the devices being developed of late.
[003] There are different mechanisms to allow a user to interact with a digital device. While some devices stick to traditional keypads, most of the latest devices opt for a touch screen or a similarly advanced user interface mechanism to allow user interaction with the device. However, while compact designs meet requirements in terms of styling and portability, compactness can take a toll on the ease with which a user can interact with such a device. This is because, as the device becomes compact, the area on the screen of the device with which the user can interact reduces. In order to overcome this disadvantage, certain input instruments such as, but not limited to, a stylus pen are used, which help interaction with the device. However, with devices becoming even more compact (for example, a smart watch with a smaller touch screen), even such instruments are ineffective, as the surface area of the touch screen reduces further.

OBJECT OF INVENTION
[004] An object of the embodiments herein is to detect position of an input device with respect to a digital user device.

SUMMARY
[005] In view of the foregoing, an embodiment herein provides a method for determining position of an input device with respect to position of a User equipment (UE). The UE receives a plurality of light rays from a light source associated with said input device, by a plurality of light receptors of said UE. Further, an isosceles triangle with smallest base is identified, of a plurality of isosceles triangles formed by said plurality of light rays with respect to said plurality of light receptors, by a position detector of said UE. Value of an angle (θ) being formed by sides of said isosceles triangle, and value of a distance (d) as distance between bases of any two consecutive triangles of said plurality of isosceles triangles, are determined by said position detector. Further, based on the values of said angle (θ) and said distance (d), value of a horizontal component (x) and value of a vertical component (y) are determined by said position detector, and the position of said input device is determined based on said horizontal component (x) and said vertical component (y), by said position detector, wherein said horizontal and vertical components are determined with respect to position of said UE.
[006] Embodiments further disclose a system for determining position of an input device with respect to position of a User equipment (UE). The UE receives a plurality of light rays from a light source associated with said input device, by a plurality of light receptors of said UE. Further, an isosceles triangle with smallest base is identified, of a plurality of isosceles triangles formed by said plurality of light rays with respect to said plurality of light receptors, by a position detector of said UE. Value of an angle (θ) being formed by sides of said isosceles triangle, and value of a distance (d) as distance between bases of any two consecutive triangles of said plurality of isosceles triangles, are determined by said position detector. Further, based on the values of said angle (θ) and said distance (d), value of a horizontal component (x) and value of a vertical component (y) are determined by said position detector, and the position of said input device is determined based on said horizontal component (x) and said vertical component (y), by said position detector, wherein said horizontal and vertical components are determined with respect to position of said UE.
[007] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings.

BRIEF DESCRIPTION OF THE FIGURES
[008] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[009] FIG. 1 illustrates a block diagram of the position tracking system, as disclosed in the embodiments herein;
[0010] FIG. 2 is a block diagram that depicts components of the User Equipment (UE), as disclosed in the embodiments herein;
[0011] FIG. 3 is a block diagram that depicts components of the input device, as disclosed in the embodiments herein;
[0012] FIG. 4 is a flow diagram that depicts steps involved in the process of determining position of the input device, using the position tracking system, as disclosed in the embodiments herein; and
[0013] FIGS. 5a-5c illustrate example implementations of the position tracking system, as disclosed in the embodiments herein.
DETAILED DESCRIPTION OF EMBODIMENTS
[0014] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0015] The embodiments herein disclose a mechanism for determining position of an input device with respect to position of a User Equipment (UE). Referring now to the drawings, and more particularly to FIGS. 1 through 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown embodiments.
[0016] FIG. 1 illustrates a block diagram of the position tracking system, as disclosed in the embodiments herein. The position tracking system 100 comprises a User Equipment (UE) 101 and an Input device 102. It is to be noted that the number of UEs 101 and input devices 102 can vary depending on user requirements and implementation standards, and the figures provided herein are not intended to add any limitation/restriction. The UE 101 can be any device that is capable of receiving input from at least one input device 102 in the form of light rays, processing the received input, identifying at least one action to be triggered in response to the received input, and then triggering the identified action. For example, the UE 101 can be a mobile phone, a tablet computer, a smart watch, and/or any such device with the aforementioned capabilities.
[0017] The input device 102 is an equipment that can be configured to host a light source that can emit light rays of pre-determined wavelength. The input device 102 can be configured to provide at least one option to vary wavelength of the light being emitted, to at least one other pre-configured wavelength.
[0018] FIG. 2 is a block diagram that depicts components of the User Equipment (UE), as disclosed in the embodiments herein. The UE 101 comprises at least one light receptor 201, a position detector 202, and an action detector 203.
[0019] The light receptor 201 can be configured to receive/collect light rays emitted from the input device 102. In an embodiment, for the UE 101 to determine position of the input device 102, a plurality of light receptors are used, such that the plurality of light receptors form a stack of light receptors, of a specific shape. For example, the light receptor stack may be of spherical shape. The light receptors 201 in the light receptor stack are positioned such that the light rays emitted from the light source of the input device 102 are captured efficiently.
[0020] The position detector 202 can be configured to detect the position of the input device 102, based on the light rays collected by the light receptor 201. Light rays form different angles with the light receptor 201 that receives the corresponding light ray, and two rays can form an isosceles triangle with respect to the position of the light source of the input device 102 and the light receptors 201, with the corresponding light receptors 201 forming the base of the triangle. The position detector 202 identifies, from all the isosceles triangles formed by the light rays, the isosceles triangle with the smallest base. In an embodiment, identifying the isosceles triangle with the smallest base involves measuring the distance between all light receptors 201 that are lit. By measuring the distance between the light receptors 201 that are lit, the position detector 202 identifies the length of the base of each triangle identified, and determines the triangle with the smallest base.
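The smallest-base identification step described above can be sketched as follows. This is a minimal illustration that assumes the lit receptors can be reduced to one-dimensional coordinates along the receptor array; the function and parameter names are hypothetical, not from the specification.

```python
def smallest_base(lit_receptor_positions):
    """Return the smallest distance between consecutive lit receptors,
    i.e. the base of the smallest isosceles triangle formed by the rays.

    lit_receptor_positions: 1-D coordinates of the receptors that
    received light, in any order.
    """
    coords = sorted(lit_receptor_positions)
    if len(coords) < 2:
        raise ValueError("at least two lit receptors are needed")
    # The base of each triangle is the gap between two consecutive lit
    # receptors; the smallest gap identifies the triangle of interest.
    return min(b - a for a, b in zip(coords, coords[1:]))
```

For example, with lit receptors at positions 0.0, 2.0, 3.5 and 7.0, the smallest gap (and hence the smallest base) is 1.5.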
[0021] The position detector 202 then collects value of an angle (θ), as the angle between the light rays that form the triangle with the smallest base. In an embodiment, the angle (θ) between the light rays is an implementation-specific characteristic of the input device 102, and hence is a fixed value. The position detector 202 can be further configured to measure/determine value of a distance (d) as the distance between bases of any two consecutive triangles of said plurality of isosceles triangles, i.e. the distance between two lit light receptors 201. The position detector 202 further determines, based on the values of 'd' and 'θ', a horizontal parameter (x) and a vertical parameter (y) which together represent the position of the input device 102 as (x, y). The position detector 202 can be further configured to provide information pertaining to the determined position of the input device 102 to the action detector 203. The position detector 202 can be further configured to determine the wavelength of the received light, and provide the wavelength information as input to the action detector 203, along with the position information.
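The (x, y) computation just described, taking x as the product of d and the cotangent of θ (per claim 2) and y as the edge-to-centre distance of the receptor array (per claim 3), can be sketched as below; function and parameter names are illustrative only.

```python
import math

def estimate_position(theta_deg, d, edge_to_centre):
    """Estimate the input device's position (x, y) relative to the UE.

    theta_deg     : fixed apex angle (degrees) of the smallest-base
                    triangle, an implementation-specific constant of
                    the input device
    d             : measured distance between two consecutive lit receptors
    edge_to_centre: distance from the edge of the receptor array to
                    its centre
    """
    # Claim 2: x = d * cot(theta)
    x = d / math.tan(math.radians(theta_deg))
    # Claim 3: y equals the edge-to-centre distance of the receptor array
    y = edge_to_centre
    return x, y
```

With θ = 45° (whose cotangent is 1), d = 2.0 and an edge-to-centre distance of 5.0, this returns approximately (2.0, 5.0).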
[0022] The action detector 203 can be configured to collect the position of the input device and the wavelength data as real-time input, and determine at least one action that matches the position of the input device and the wavelength of the light, with respect to the current context of an application that is running in the foreground of the UE 101. In an embodiment, the information pertaining to action(s) to be triggered, corresponding to a combination of parameters such as, but not limited to, the position of the input device and the wavelength of the light rays, with respect to different contexts of different applications, is pre-determined and saved in a memory module associated with the UE 101, and is made available to the action detector 203 as reference data, when required. The action detector 203 can be configured to compare the real-time input with the reference data, and identify the action(s) to be triggered. The action detector 203 can be further configured to trigger the identified action(s).
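The reference-data lookup performed by the action detector can be modelled as a simple table keyed on position region, wavelength, and application context. All keys and action names below are hypothetical, chosen only to illustrate the matching step.

```python
# Hypothetical pre-configured reference data: a (position-region,
# wavelength-in-nm, application-context) triple maps to an action.
ACTION_TABLE = {
    ("key_q", 650, "messaging"): "type_letter_q",
    ("key_w", 650, "messaging"): "type_letter_w",
    ("key_q", 450, "messaging"): "press_send_button",
}

def detect_action(region, wavelength_nm, context):
    """Compare the real-time input against the reference data and return
    the matching action, or None when nothing is configured for the
    triple."""
    return ACTION_TABLE.get((region, wavelength_nm, context))
```

So the same position can trigger different actions depending on the wavelength in use, which is how the send-button example in the later figures works.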
[0023] FIG. 3 is a block diagram that depicts components of the input device, as disclosed in the embodiments herein. The input device 102 primarily comprises a light source 301, and a wavelength adjustment module 302. The light source 301 can be configured to emit light rays of at least one specific wavelength. The light source 301 can be positioned in the input device 102 such that the emitted light rays can be directed to the UE 101, at any specific angle. The light source 301 can be further configured to emit light rays of at least one other wavelength, as pre-configured.
[0024] The wavelength adjustment module 302 can be configured to provide at least one option for a user to pre-configure one or more wavelengths of the light to be emitted. The wavelength adjustment module 302 can be further configured to provide at least one interface for a user to vary wavelength of the light. For example, the interface can be a button such that when the user presses the button, the wavelength of the light emitted changes to another pre-configured value. If more than two wavelengths have been configured, then the user can toggle between different wavelengths by pressing the button or any similar interface provided.
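The button-driven toggling between pre-configured wavelengths can be sketched as follows; the class and method names are illustrative, not from the specification.

```python
class WavelengthAdjuster:
    """Minimal sketch of the wavelength adjustment module 302: each
    button press cycles the emitted light to the next pre-configured
    wavelength, wrapping around when the end of the list is reached."""

    def __init__(self, wavelengths_nm):
        if not wavelengths_nm:
            raise ValueError("at least one wavelength must be configured")
        self._wavelengths = list(wavelengths_nm)
        self._index = 0

    @property
    def current_nm(self):
        return self._wavelengths[self._index]

    def press_button(self):
        # Toggle to the next pre-configured wavelength.
        self._index = (self._index + 1) % len(self._wavelengths)
        return self.current_nm
```

With more than two configured wavelengths, repeated presses cycle through all of them, matching the toggling behaviour described above.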
[0025] FIG. 4 is a flow diagram that depicts steps involved in the process of determining the position of the input device, using the position tracking system, as disclosed in the embodiments herein. A user provides input to the UE 101 using the input device 102. The UE 101 receives (402) the light rays generated by the input device 102, and determines (404) the position of the input device 102. The UE 101 further determines the wavelength of the light received from the input device 102. The process by which the UE 101 determines the position of the input device 102 is explained below:
[0026] The UE 101 collects the light rays from the input device 102 using suitable light receptors 201. The light rays form different angles with the light receptor 201 that receives the corresponding light ray, and two rays can form an isosceles triangle with respect to the position of the light source of the input device 102 and the light receptors 201, with the corresponding light receptors 201 forming the base of the triangle. A position detector 202 of the UE 101 then identifies, from all the isosceles triangles formed by the light rays, the isosceles triangle with the smallest base. In an embodiment, identifying the isosceles triangle with the smallest base involves measuring the distance between all light receptors 201 that are lit. By measuring the distance between the light receptors 201 that are lit, the position detector 202 identifies the length of the base of each triangle identified, and determines the triangle with the smallest base.
[0027] The position detector 202 then collects value of an angle (θ), as the angle between the light rays that form the triangle with the smallest base. In an embodiment, the angle (θ) between the light rays is an implementation-specific characteristic of the input device 102, and hence is a fixed value. The position detector 202 can be further configured to measure/determine value of a distance (d) as the distance between bases of any two consecutive triangles of said plurality of isosceles triangles, i.e. the distance between two lit light receptors 201.
[0028] The position detector 202 further determines, based on the value of ‘d’ and ‘?’, a horizontal parameter (x) and a vertical parameter (y) which together represent position of the input device 102 as (x,y). The position detector 202 can be further configured to provide information pertaining to the determined position of the input device 102, to the action detector 203 of the UE 101. This process is depicted in Fig. 5c. The position detector 202 can be further configured to determine wavelength of the received light, and provide the wavelength information as input to the action detector 203, along with the position information.
[0029] Further, the action detector 203 identifies and triggers (406) at least one action that matches the detected position of the input device 102, wavelength of the light ray received, and the current context of the application that is open on the UE 101.
[0030] The various actions in method 400 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions listed in FIG. 4 may be omitted.
[0031] FIGS. 5a-5c illustrate example implementations of the position tracking system, as disclosed in the embodiments herein. As depicted in Fig. 5a, the light receptor array can be of spherical shape, and can be configured to collect the light rays from the light source of the input device 102. Assuming that a messaging application is running in the foreground of the UE 101, the user can interact with the virtual keyboard using the input device 102. When the user moves the input device 102, the UE 101 detects the position change, and accordingly identifies the letters selected/typed by the user. The user can change the wavelength of the light ray by clicking the button provided; the UE 101 detects the change in wavelength and the corresponding action is triggered. For example, in this particular scenario, one such action can be clicking a 'send' button on the virtual screen to send the message that has been typed.
[0032] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in Fig. 1 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
[0033] The embodiments disclosed herein specify a mechanism for detecting the position of an input device with respect to the position of a User Equipment (UE). The mechanism allows detection of the position of the input device using light rays, providing a system thereof. Therefore, it is understood that the scope of protection is extended to such a system and, by extension, to a computer readable means having a message therein, said computer readable means containing a program code for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The method is implemented in a preferred embodiment using the system together with a software program written in, for ex., Very high speed integrated circuit Hardware Description Language (VHDL), another programming language, or implemented by one or more VHDL or several software modules being executed on at least one hardware device. The hardware device can be any kind of device which can be programmed including, for ex., any kind of a computer like a server or a personal computer, or the like, or any combination thereof, for ex., one processor and two FPGAs. The device may also include means which could be, for ex., hardware means like an ASIC, or a combination of hardware and software means, an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means are at least one hardware means or at least one hardware-cum-software means. The method embodiments described herein could be implemented in pure hardware, or partly in hardware and partly in software. Alternatively, the embodiment may be implemented on different hardware devices, for ex., using a plurality of CPUs.
[0034] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the claims as described herein.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 Form 5 [04-02-2016(online)].pdf 2016-02-04
2 Form 3 [04-02-2016(online)].pdf 2016-02-04
3 Form 18 [04-02-2016(online)].pdf 2016-02-04
4 Drawing [04-02-2016(online)].pdf 2016-02-04
5 Description(Complete) [04-02-2016(online)].pdf 2016-02-04
6 201641004092-Power of Attorney-060616.pdf 2016-07-22
7 201641004092-Form 5-060616.pdf 2016-07-22
8 201641004092-Form 1-060616.pdf 2016-07-22
9 201641004092-Correspondence-F 1&5-PA-060616.pdf 2016-07-22
10 201641004092-Power of Attorney-140716.pdf 2016-07-29
11 201641004092-Form 5-140716.pdf 2016-07-29
12 201641004092-Form 1-140716.pdf 2016-07-29
13 201641004092-Correspondence-F1-F5-PA-140716.pdf 2016-07-29
14 abstract 201641004092.jpg 2016-08-11
15 201641004092-FORM-26 [16-03-2018(online)].pdf 2018-03-16
16 201641004092-FORM-26 [16-03-2018(online)]_82.pdf 2018-03-16
17 201641004092-FER.pdf 2019-01-23
18 201641004092-CLAIMS [08-03-2019(online)].pdf 2019-03-08
19 201641004092-CORRESPONDENCE [08-03-2019(online)].pdf 2019-03-08
20 201641004092-DRAWING [08-03-2019(online)].pdf 2019-03-08
21 201641004092-FER_SER_REPLY [08-03-2019(online)].pdf 2019-03-08
22 201641004092-OTHERS [08-03-2019(online)].pdf 2019-03-08
23 201641004092-US(14)-HearingNotice-(HearingDate-08-07-2020).pdf 2020-06-03
24 201641004092-Correspondence to notify the Controller [30-06-2020(online)].pdf 2020-06-30
25 201641004092-Annexure [30-06-2020(online)].pdf 2020-06-30
26 201641004092-FORM-26 [03-07-2020(online)].pdf 2020-07-03
27 201641004092-Written submissions and relevant documents [24-07-2020(online)].pdf 2020-07-24
28 201641004092-Annexure [24-07-2020(online)].pdf 2020-07-24
29 201641004092-Response to office action [09-12-2020(online)].pdf 2020-12-09
30 201641004092-PatentCertificate17-05-2022.pdf 2022-05-17
31 201641004092-IntimationOfGrant17-05-2022.pdf 2022-05-17
32 201641004092-FORM-27 [30-09-2024(online)].pdf 2024-09-30

Search Strategy

1 Searchstrategy201641004092_18-04-2018.pdf

ERegister / Renewals

3rd: 10 Jun 2022

From 04/02/2018 - To 04/02/2019

4th: 10 Jun 2022

From 04/02/2019 - To 04/02/2020

5th: 10 Jun 2022

From 04/02/2020 - To 04/02/2021

6th: 10 Jun 2022

From 04/02/2021 - To 04/02/2022

7th: 10 Jun 2022

From 04/02/2022 - To 04/02/2023

8th: 24 Jan 2023

From 04/02/2023 - To 04/02/2024

9th: 02 Feb 2024

From 04/02/2024 - To 04/02/2025