
Interactive System And Method For Enhancing Adaptability Of An Interactive Surface Environment

Abstract: The present disclosure relates to a method for enhancing adaptability of an interactive surface environment having a plurality of objects. The method comprises receiving at least one user gesture performed on a target object from the plurality of objects. The method further comprises identifying a context for each of the at least one user gesture performed on the target object based on at least one of the at least one user gesture, target object parameters, and object parameters. The method further comprises aggregating the at least one user gesture performed on the target object and the context to obtain a decision dataset. The method further comprises identifying an impact, to be rendered, by comparing the decision dataset with predefined datasets. The method further comprises rendering the impact on one or more objects from the plurality of objects in the interactive surface environment. Figure 6


Patent Information

Application #
Filing Date
31 July 2015
Publication Number
33/2015
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ipr@akshipassociates.com
Parent Application
Patent Number
Legal Status
Grant Date
2022-08-23
Renewal Date

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. SANGITA GANESH
Old no. 25 New no. 4, Venkatachalam Street, West Mambalam, Chennai 600033, Tamil Nadu
2. MANOJ MADHUSUDHANAN
#980, 1st Cross, 2nd Phase, 5th Stage, BEML Layout, Rajarajeshwari Nagar, Bangalore – 560098, Karnataka, India.

Specification

Claims:
We claim:
1. A method for enhancing adaptability of an interactive surface environment having a plurality of objects, the method comprising:
receiving, by an interactive system, at least one user gesture performed on a target object from the plurality of objects;
identifying, by the interactive system, a context for each of the at least one user gesture performed on the target object based on at least one of the at least one user gesture, target object parameters, and object parameters;
aggregating, by the interactive system, the at least one user gesture performed on the target object and the context to obtain a decision dataset;
identifying, by the interactive system, an impact, to be rendered, by comparing the decision dataset with predefined datasets; and
rendering, by the interactive system, the impact on one or more objects from the plurality of objects in the interactive surface environment.
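Read as a data flow, the claimed steps (receive gesture → identify context → aggregate into a decision dataset → compare with predefined datasets → render impact) can be sketched in Python. This is a minimal illustration only: every name below (`Gesture`, `identify_context`, `decision_db`, the parameter keys) is an assumption for the sketch and is not taken from the specification.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Gesture:
    kind: str      # e.g. "pinch", "swipe" -- hypothetical gesture labels
    target: str    # identifier of the target object the gesture acts on

def identify_context(gesture, objects):
    """Derive a context from the target object's parameters and the
    parameters of the remaining objects (position, shape, size, ...)."""
    target_params = objects[gesture.target]
    other_params = {k: v for k, v in objects.items() if k != gesture.target}
    return {"target": target_params, "others": other_params}

def aggregate(gesture, context):
    """Combine the gesture and its context into a hashable decision dataset."""
    return (gesture.kind, gesture.target, context["target"]["shape"])

def identify_impact(decision, decision_db):
    """Compare the decision dataset with the predefined datasets and
    return the matching impact, if any."""
    return decision_db.get(decision)

# Illustrative run: a pinch on a circular object halves its size.
objects = {"obj1": {"shape": "circle", "size": 10},
           "obj2": {"shape": "square", "size": 20}}
decision_db = {("pinch", "obj1", "circle"): {"change": "size", "factor": 0.5}}

g = Gesture("pinch", "obj1")
ctx = identify_context(g, objects)
impact = identify_impact(aggregate(g, ctx), decision_db)
print(impact)  # {'change': 'size', 'factor': 0.5}
```

The rendering step (applying the returned impact to one or more objects on the surface) is left out, as the specification does not constrain how it is performed.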

2. The method as claimed in claim 1, wherein each of the plurality of objects is one of a physical object and a virtual object.

3. The method as claimed in claim 1, wherein the target object parameters comprise at least one of a position, a shape, a sound, a size, or a color of the target object, and the object parameters comprise at least one of a position, a shape, a sound, a size, or a color of objects from the plurality of objects except the target object.

4. The method as claimed in claim 1, wherein the at least one user gesture comprises movement of feet, movement of arms, movement of legs, movement of fingers, movement of hands, movement of head, and combinations thereof.

5. The method as claimed in claim 1, wherein the impact comprises at least one of a change in size, change in shape, change in color, change in sound, or change in position of the plurality of objects in the interactive surface environment.

6. The method as claimed in claim 1, wherein the predefined datasets are received from a decision database, and each of the predefined datasets comprises the at least one user gesture input associated with one or more objects from the plurality of objects, and the context.
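Claim 6 describes each predefined dataset as a gesture input associated with one or more objects, plus a context. As a purely illustrative sketch (all field names and values below are assumptions, not taken from the specification), one such record in the decision database might look like:

```python
# Hypothetical record in the decision database: a predefined dataset pairs a
# gesture input and its associated object(s) with a context, and maps them to
# the impact to render. Every key and value here is illustrative.
predefined_dataset = {
    "gesture": "swipe-left",                            # the user gesture input
    "objects": ["obj3"],                                # associated object(s)
    "context": {"position": (4, 2), "color": "blue"},   # identified context
    "impact": {"change": "position", "delta": (-1, 0)}, # impact to render
}
print(sorted(predefined_dataset))  # ['context', 'gesture', 'impact', 'objects']
```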

7. An interactive system for enhancing adaptability of an interactive surface environment having a plurality of objects comprising:
a processor;
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which, on execution, cause the processor to:
receive at least one user gesture performed on a target object from the plurality of objects;
identify a context for each of the at least one user gesture performed on the target object based on at least one of the at least one user gesture, target object parameters, and object parameters;
aggregate the at least one user gesture performed on the target object and the context to obtain a decision dataset;
identify an impact, to be rendered, by comparing the decision dataset with predefined datasets; and
render the impact on one or more objects from the plurality of objects in the interactive surface environment.

8. The interactive system as claimed in claim 7, wherein each of the plurality of objects is one of a physical object and a virtual object.

9. The interactive system as claimed in claim 7, wherein the target object parameters comprise at least one of a position, a shape, a sound, a size, or a color of the target object, and the object parameters comprise at least one of a position, a shape, a sound, a size, or a color of objects from the plurality of objects except the target object.

10. The interactive system as claimed in claim 7, wherein the at least one user gesture comprises movement of feet, movement of arms, movement of legs, movement of fingers, movement of hands, movement of head, and combinations thereof.

11. The interactive system as claimed in claim 7, wherein the impact comprises at least one of a change in size, change in shape, change in color, change in sound, or change in position of the plurality of objects in the interactive surface environment.

12. The interactive system as claimed in claim 7, wherein the predefined datasets are received from a decision database, and each of the predefined datasets comprises the at least one user gesture input associated with one or more objects from the plurality of objects, and the context.

13. A non-transitory computer readable medium including instructions stored thereon that, when processed by a processor, cause an interactive system to enhance adaptability of an interactive surface environment having a plurality of objects by performing acts of:
receiving at least one user gesture performed on a target object from the plurality of objects;
identifying a context for each of the at least one user gesture performed on the target object based on at least one of the at least one user gesture, target object parameters, and object parameters;
aggregating the at least one user gesture performed on the target object and the context to obtain a decision dataset;
identifying an impact, to be rendered, by comparing the decision dataset with predefined datasets; and
rendering the impact on one or more objects from the plurality of objects in the interactive surface environment.

Dated this day of July, 2015

SHWETHA A CHIMALGI
OF K & S PARTNERS
AGENT FOR THE APPLICANT
Description:
FIELD OF THE DISCLOSURE
The present subject matter is related, in general, to interactive environments, and more particularly, but not exclusively, to an interactive system and method for enhancing adaptability of an interactive surface environment having a plurality of objects.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 Form 9 [31-07-2015(online)].pdf 2015-07-31
2 Form 5 [31-07-2015(online)].pdf 2015-07-31
3 Form 3 [31-07-2015(online)].pdf 2015-07-31
4 Form 18 [31-07-2015(online)].pdf 2015-07-31
5 Drawing [31-07-2015(online)].pdf 2015-07-31
6 Description(Complete) [31-07-2015(online)].pdf 2015-07-31
7 REQUEST FOR CERTIFIED COPY [04-08-2015(online)].pdf 2015-08-04
8 abstract 3967-CHE-2015.jpg 2015-08-04
9 3967-CHE-2015 POWER OF ATTORNEY 130116.pdf 2016-06-20
10 3967-CHE-2015 FORM-1 130116.pdf 2016-06-20
11 3967-CHE-2015 CORRESPONDENCE-F1 13012016.pdf 2016-06-20
12 3967-CHE-2015-FER.pdf 2020-01-17
13 3967-CHE-2015-FORM 3 [03-07-2020(online)].pdf 2020-07-03
14 3967-CHE-2015-FER_SER_REPLY [03-07-2020(online)].pdf 2020-07-03
15 3967-CHE-2015-US(14)-HearingNotice-(HearingDate-11-05-2022).pdf 2022-04-18
16 3967-CHE-2015-POA [26-04-2022(online)].pdf 2022-04-26
17 3967-CHE-2015-FORM 13 [26-04-2022(online)].pdf 2022-04-26
18 3967-CHE-2015-Correspondence to notify the Controller [26-04-2022(online)].pdf 2022-04-26
19 3967-CHE-2015-AMENDED DOCUMENTS [26-04-2022(online)].pdf 2022-04-26
20 3967-CHE-2015-Written submissions and relevant documents [26-05-2022(online)].pdf 2022-05-26
21 3967-CHE-2015-PatentCertificate23-08-2022.pdf 2022-08-23
22 3967-CHE-2015-IntimationOfGrant23-08-2022.pdf 2022-08-23

Search Strategy

1 2020-01-0915-48-51_09-01-2020.pdf
2 2020-01-0915-49-06_09-01-2020.pdf
3 2020-01-0915-49-52_09-01-2020.pdf
4 2020-01-0915-50-01_09-01-2020.pdf

ERegister / Renewals

3rd: 08 Nov 2022

From 31/07/2017 - To 31/07/2018

4th: 08 Nov 2022

From 31/07/2018 - To 31/07/2019

5th: 08 Nov 2022

From 31/07/2019 - To 31/07/2020

6th: 08 Nov 2022

From 31/07/2020 - To 31/07/2021

7th: 08 Nov 2022

From 31/07/2021 - To 31/07/2022

8th: 08 Nov 2022

From 31/07/2022 - To 31/07/2023

9th: 10 Jul 2023

From 31/07/2023 - To 31/07/2024

10th: 17 Jul 2024

From 31/07/2024 - To 31/07/2025

11th: 25 Jul 2025

From 31/07/2025 - To 31/07/2026