
Method And Computing Unit For Facilitating Interactions Of A Group Of Users With Gesture Based Application

Abstract: Embodiments of the present disclosure provide a method for facilitating interactions of a group of users with a gesture-based application. The method comprises identifying, by a computing unit of an interactive device, the group based on information received from sensors associated with the interactive device. Then, an interaction intensity value associated with each of the users is determined. The interaction intensity value is indicative of the level of activity of each of the users. Next, at least one active user among the group of users is identified based on an order of the interaction intensity values. Lastly, gestures of the at least one active user towards the gesture-based application are tracked for facilitating interactions with the gesture-based application. FIGURE 6
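The claimed flow — compute a per-user interaction intensity, rank users in descending order of intensity, and treat the top-ranked user(s) as active — can be illustrated with a minimal sketch. This is not the patented implementation; the `User` class, the motion-energy samples used as the activity signal, and all function names are illustrative assumptions.

```python
# Illustrative sketch of the claimed method: rank users of a group by an
# interaction intensity value and select the most active one(s), whose
# gestures would then be tracked. All names and signals are hypothetical.
from dataclasses import dataclass, field


@dataclass
class User:
    user_id: str
    # Hypothetical activity signal, e.g. per-frame motion energy derived
    # from depth/skeleton data supplied by the device's sensors.
    motion_samples: list = field(default_factory=list)


def interaction_intensity(user: User) -> float:
    """Value indicative of the user's level of activity (here: mean motion energy)."""
    if not user.motion_samples:
        return 0.0
    return sum(user.motion_samples) / len(user.motion_samples)


def identify_active_users(group: list, top_n: int = 1) -> list:
    """Order the group by descending interaction intensity and return the top N."""
    ranked = sorted(group, key=interaction_intensity, reverse=True)
    return ranked[:top_n]


group = [
    User("u1", [0.2, 0.3]),
    User("u2", [0.9, 0.8]),
    User("u3", [0.1]),
]
active = identify_active_users(group)
print([u.user_id for u in active])  # most active user first
```

The descending-order ranking mirrors claim 4; a real system would recompute intensities continuously and switch the active user when the ordering changes (claim 8).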


Patent Information

Application #
Filing Date
25 April 2014
Publication Number
18/2014
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ipo@knspartners.com
Parent Application
Patent Number
Legal Status
Grant Date
2023-10-26
Renewal Date

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. AMIT KUMAR
Sabalpur, Sonepur, Patna – 841101, Bihar, India.
2. SHEEBA S RAJ
157 Krishna Kunj, 3rd Cross, 4th main Nrupathunga Nagar, JP Nagar 8th Phase, Bangalore – 560076, Karnataka, India

Specification

CLAIMS:
We claim:
1. A computer implemented method for facilitating interactions, with a gesture-based application, of a group of users, comprising:
identifying, by a computing unit of an interactive device, the group based on information received from sensors associated with the interactive device;
determining, by the computing unit, an interaction intensity value associated with each of the users, wherein the interaction intensity value is indicative of the level of activity of the each of the users;
identifying, by the computing unit, at least one active user among the group of users based on an order of the interaction intensity values; and
tracking, by the computing unit, gestures of the at least one active user towards the gesture-based application for facilitating interactions with the gesture-based application.

2. The method as claimed in claim 1 further comprising:
assigning a unique identifier to the at least one active user; and
creating a unique session for the at least one active user for facilitating interactions with the gesture-based application.
3. The method as claimed in claim 1 further comprising generating a notification for indicating the identified at least one active user, wherein the notification comprises at least one of a visual alert, an audio alert and an audio-visual alert.

4. The method as claimed in claim 1, wherein the identifying of the at least one active user among the group of users is based on a descending order of the interaction intensity values.

5. The method as claimed in claim 1 further comprising storing one or more actions performed towards the gesture-based application by the at least one active user in a memory of the computing unit.

6. The method as claimed in claim 1, wherein the information comprises at least one of color image, depth image and body matrix of each of the users.

7. The method as claimed in claim 1 further comprising identifying a change in the group of users by identifying at least one of addition and deletion of at least one user in the group.

8. The method as claimed in claim 1 further comprising identifying a change of the at least one active user in the group based on a change in the interaction intensity values.

9. A computing unit of an interactive device for facilitating interactions, with a gesture-based application, of a group of users, said computing unit comprising:
a processor;
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which, on execution, cause the processor to:
identify the group based on information received from sensors associated with the interactive device;
determine an interaction intensity value associated with each of the users, wherein the interaction intensity value is indicative of the level of activity of the each of the users;
identify at least one active user among the group of users based on an order of the interaction intensity values; and
track gestures of the at least one active user towards the gesture-based application for facilitating interactions with the gesture-based application.

10. The computing unit as claimed in claim 9, wherein the sensors are selected from at least one of a camera, an infrared (IR) sensor, a Red-Green-Blue (RGB) sensor, a Sonar sensor, a laser sensor and a Radio Frequency (RF) sensor.

11. The computing unit as claimed in claim 9, wherein the interactive device is associated with a display unit to display a notification for indicating the identified at least one active user, wherein the notification comprises at least one of a visual alert, an audio alert and an audio-visual alert.

12. The computing unit as claimed in claim 9, wherein the memory stores one or more actions performed towards the gesture-based application by the at least one active user.
13. A non-transitory computer readable medium including instructions stored thereon that when processed by a processor cause a computing unit to perform acts of:
identifying the group based on information received from sensors associated with an interactive device;
determining an interaction intensity value associated with each of the users, wherein the interaction intensity value is indicative of the level of activity of the each of the users;
identifying at least one active user among the group of users based on an order of the interaction intensity values; and
tracking gestures of the at least one active user towards the gesture-based application for facilitating interactions with the gesture-based application.

Dated this 25th day of April, 2014

MADHUSUDAN S.T.
OF K & S PARTNERS
ATTORNEY FOR THE APPLICANT
TECHNICAL FIELD
The present disclosure relates to gesture-based interactions. In particular, embodiments of the present disclosure include a method and a computing unit of an interactive device for facilitating interactions of a group of users with a gesture-based application.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 IP26797-spec.pdf 2014-04-25
2 IP26797-fig.pdf 2014-04-25
3 FORM 5.pdf 2014-04-25
4 FORM 3.pdf 2014-04-25
5 2096-CHE-2014-Request For Certified Copy-Online(25-04-2014).pdf 2014-04-25
6 Form-9(Online).pdf 2014-04-28
7 2096CHE2014_CertifiedRequest.pdf 2014-04-28
8 abstract2096-CHE-2014.jpg 2014-04-29
9 2096-CHE-2014 POWER OF ATTORNEY 10-06-2014.pdf 2014-06-10
10 2096-CHE-2014 FORM-1 10-06-2014.pdf 2014-06-10
11 2096-CHE-2014 CORRESPONDENCE OTHERS 10-06-2014.pdf 2014-06-10
12 2096-CHE-2014-FER.pdf 2019-07-12
13 2096-CHE-2014-FORM 3 [09-01-2020(online)].pdf 2020-01-09
14 2096-CHE-2014-PETITION UNDER RULE 137 [09-01-2020(online)].pdf 2020-01-09
15 2096-CHE-2014-ABSTRACT [13-01-2020(online)].pdf 2020-01-13
16 2096-CHE-2014-CLAIMS [13-01-2020(online)].pdf 2020-01-13
17 2096-CHE-2014-FER_SER_REPLY [13-01-2020(online)].pdf 2020-01-13
18 2096-CHE-2014-US(14)-HearingNotice-(HearingDate-29-09-2023).pdf 2023-09-18
19 2096-CHE-2014-AMENDED DOCUMENTS [25-09-2023(online)].pdf 2023-09-25
20 2096-CHE-2014-Correspondence to notify the Controller [25-09-2023(online)].pdf 2023-09-25
21 2096-CHE-2014-FORM 13 [25-09-2023(online)].pdf 2023-09-25
22 2096-CHE-2014-POA [25-09-2023(online)].pdf 2023-09-25
23 2096-CHE-2014-FORM 3 [13-10-2023(online)].pdf 2023-10-13
24 2096-CHE-2014-Written submissions and relevant documents [13-10-2023(online)].pdf 2023-10-13
25 2096-CHE-2014-PatentCertificate26-10-2023.pdf 2023-10-26
26 2096-CHE-2014-IntimationOfGrant26-10-2023.pdf 2023-10-26

Search Strategy

1 searchstartegy2096_04-07-2019.pdf

ERegister / Renewals

3rd: 18 Jan 2024

From 25/04/2016 - To 25/04/2017

4th: 18 Jan 2024

From 25/04/2017 - To 25/04/2018

5th: 18 Jan 2024

From 25/04/2018 - To 25/04/2019

6th: 18 Jan 2024

From 25/04/2019 - To 25/04/2020

7th: 18 Jan 2024

From 25/04/2020 - To 25/04/2021

8th: 18 Jan 2024

From 25/04/2021 - To 25/04/2022

9th: 18 Jan 2024

From 25/04/2022 - To 25/04/2023

10th: 18 Jan 2024

From 25/04/2023 - To 25/04/2024

11th: 24 Apr 2024

From 25/04/2024 - To 25/04/2025

12th: 25 Apr 2025

From 25/04/2025 - To 25/04/2026