
Method And System For Predicting A Time Instant For Providing Promotions To A User

Abstract: A method and a system are described for providing one or more promotions to a user. The method includes detecting, by a monitoring device, a movement event associated with an object using one or more sensors. The method includes initializing, by the monitoring device, capturing of audio data of a user in contact with the object on detection of the movement event. The method further includes determining, by the monitoring device, an emotion, associated with the object, of the user based on the audio data. The method further includes predicting in real time, by the monitoring device, a time instant at which one or more promotions are provided to the user based on the determined emotion and the audio data. FIG.3


Patent Information

Application #:
Filing Date: 23 December 2016
Publication Number: 26/2018
Publication Type: INA
Invention Field: BIO-MEDICAL ENGINEERING
Status:
Email:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2023-05-31
Renewal Date:

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. RAJAGOPAL APPAKUTTY
1180, 19th street, G Block, Annanagar, Chennai 600040, Tamil Nadu, India

Specification

Claims:
WE CLAIM
1. A method for predicting a time instant for providing one or more promotions to a user, the method comprising:
detecting, by a monitoring device, a movement event associated with an object using one or more sensors;
initializing, by the monitoring device, capturing of audio data of a user in contact with the object on detection of the movement event;
determining, by the monitoring device, an emotion, associated with the object, of the user based on the audio data; and
predicting in real time, by the monitoring device, a time instant at which one or more promotions are provided to the user based on the determined emotion and the audio data.
2. The method of claim 1, wherein the monitoring device is attached to the object.
3. The method of claim 1, wherein the one or more sensors comprise at least one of an accelerometer, a location sensor, a proximity sensor, a pressure sensor, or a light sensor.
4. The method of claim 3, further comprising determining a time duration for which the audio data is captured based on at least one of accelerometer data obtained from the accelerometer, location sensor data obtained from the location sensor, and proximity sensor data obtained from the proximity sensor, wherein the one or more promotions are provided to the user based on the determined time duration.
5. The method of claim 1, wherein the capturing of the audio data of the user in contact with the object is performed until the object is detected to be idle for a predefined period of time based on data from the one or more sensors.
6. The method of claim 1, wherein the movement event is detected based on one of:
determining accelerometer data, associated with the object, is greater than a pre-defined threshold, or
determining a change in a location of the object based on location data.
7. The method of claim 1, wherein the one or more promotions are displayed to the user on a display screen of the monitoring device.
8. The method of claim 1, further comprising selecting the one or more promotions to be provided to the user based on at least one of the determined emotion, one or more pre-defined locations, a pre-defined sale target, and historical behavior, associated with the object, of a plurality of users.
9. The method of claim 1, wherein predicting the time instant of providing the one or more promotions is based on the determined emotion of the user at a location.
10. A monitoring device to predict a time instant to provide one or more promotions to a user, the monitoring device comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor instructions, which, on execution, cause the processor to:
detect a movement event associated with an object using one or more sensors;
initialize capturing of audio data of a user in contact with the object on detection of the movement event;
determine an emotion, associated with the object, of the user based on the audio data; and
predict in real time a time instant at which one or more promotions are provided to the user based on the determined emotion and the audio data.
11. The monitoring device of claim 10, wherein the monitoring device is attached to the object.
12. The monitoring device of claim 10, wherein the one or more sensors comprise at least one of an accelerometer, a location sensor, a proximity sensor, a pressure sensor, or a light sensor.
13. The monitoring device of claim 12, wherein the processor is further configured to determine a time duration for which the audio data is captured based on at least one of accelerometer data obtained from the accelerometer, location sensor data obtained from the location sensor, and proximity sensor data obtained from the proximity sensor, wherein the one or more promotions are provided to the user based on the determined time duration.
14. The monitoring device of claim 10, wherein the capturing of the audio data of the user in contact with the object is performed until the object is detected to be idle for a predefined period of time.
15. The monitoring device of claim 10, wherein the movement event is detected based on one of:
determining accelerometer data, associated with the object, is greater than a pre-defined threshold, or
determining a change in a location of the object based on location data.
16. The monitoring device of claim 10, wherein the one or more promotions are displayed to the user on a display screen of the monitoring device.
17. The monitoring device of claim 10, wherein the processor is further configured to select the one or more promotions to be provided to the user based on at least one of the determined emotion, one or more pre-defined locations, a pre-defined sale target, and historical behavior, associated with the object, of a plurality of users.
18. The monitoring device of claim 10, wherein predicting the time instant of providing the one or more promotions is based on the determined emotion of the user at a location.
19. A non-transitory computer-readable storage medium having stored thereon, a set of computer-executable instructions for causing a computer comprising one or more processors to perform steps comprising:
detecting a movement event associated with an object using one or more sensors;
initializing capturing of audio data of a user in contact with the object on detection of the movement event;
determining an emotion, associated with the object, of the user based on the audio data; and
predicting in real time, a time instant at which one or more promotions are provided to the user based on the determined emotion and the audio data.
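The claimed method is, in essence, a sensing pipeline: detect a movement event (claim 6: accelerometer reading above a pre-defined threshold, or a change of location), capture audio until the object is idle for a predefined period (claim 5), classify the user's emotion from that audio, and pick a time instant for the promotion (claim 1). The sketch below is one illustrative reading of those claims, not the applicant's implementation; the `Sample` structure, the numeric thresholds, and the toy keyword-based emotion classifier are all hypothetical stand-ins for sensor hardware and a real speech-emotion model.

```python
from dataclasses import dataclass
from typing import List, Optional

ACCEL_THRESHOLD = 1.5   # hypothetical pre-defined threshold (claim 6)
IDLE_PERIOD = 3         # samples of stillness before capture stops (claim 5)

@dataclass
class Sample:
    accel: float          # accelerometer magnitude reading
    location: tuple       # (x, y) location reading
    audio_chunk: str      # placeholder for a captured audio frame

def movement_event(prev: Sample, cur: Sample) -> bool:
    """Claim 6: a movement event is detected when accelerometer data
    exceeds a pre-defined threshold or the object's location changes."""
    return cur.accel > ACCEL_THRESHOLD or cur.location != prev.location

def classify_emotion(audio: List[str]) -> str:
    """Toy stand-in for the emotion determination of claim 1; a real
    device would run speech-emotion recognition on the audio data."""
    text = " ".join(audio).lower()
    return "positive" if "great" in text or "nice" in text else "neutral"

def run_pipeline(samples: List[Sample]) -> Optional[dict]:
    """End-to-end flow of claims 1, 5, and 6: detect movement, capture
    audio until the object is idle, classify emotion, pick an instant."""
    audio: List[str] = []
    capturing, idle = False, 0
    for i in range(1, len(samples)):
        if not capturing and movement_event(samples[i - 1], samples[i]):
            capturing = True                      # start audio capture
        if capturing:
            audio.append(samples[i].audio_chunk)
            idle = idle + 1 if samples[i].accel <= ACCEL_THRESHOLD else 0
            if idle >= IDLE_PERIOD:               # object idle: stop capture
                emotion = classify_emotion(audio)
                # "Time instant" is modelled as the next sample index;
                # the claims leave the prediction model unspecified.
                return {"emotion": emotion, "promotion_at": i + 1}
    return None
```

Note that the claims do not constrain how the time instant is computed from the emotion and audio; the sketch simply emits the earliest instant after capture ends, which is only one of many readings.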

Dated this 23rd day of December, 2016

Swetha SN
Of K&S Partner
Agent for the Applicant
Description:

TECHNICAL FIELD

The present subject matter is related, in general, to retail product monitoring systems and, more particularly but not exclusively, to a method and a system for predicting a time instant for providing promotions to a user.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 201641044055-PatentCertificate31-05-2023.pdf 2023-05-31
2 201641044055-Written submissions and relevant documents [30-05-2023(online)].pdf 2023-05-30
3 201641044055-AMENDED DOCUMENTS [16-05-2023(online)].pdf 2023-05-16
4 201641044055-Correspondence to notify the Controller [16-05-2023(online)].pdf 2023-05-16
5 201641044055-FORM 13 [16-05-2023(online)].pdf 2023-05-16
6 201641044055-POA [16-05-2023(online)].pdf 2023-05-16
7 201641044055-US(14)-HearingNotice-(HearingDate-18-05-2023).pdf 2023-04-26
8 201641044055-FER.pdf 2021-10-17
9 201641044055-CLAIMS [08-09-2021(online)].pdf 2021-09-08
10 201641044055-COMPLETE SPECIFICATION [08-09-2021(online)].pdf 2021-09-08
11 201641044055-DRAWING [08-09-2021(online)].pdf 2021-09-08
12 201641044055-FER_SER_REPLY [08-09-2021(online)].pdf 2021-09-08
13 201641044055-PETITION UNDER RULE 137 [08-09-2021(online)].pdf 2021-09-08
14 201641044055-FORM 3 [07-09-2021(online)].pdf 2021-09-07
15 Correspondence by Agent_Form1_27-04-2017.pdf 2017-04-27
16 Other Patent Document [25-04-2017(online)].pdf 2017-04-25
17 REQUEST FOR CERTIFIED COPY [02-02-2017(online)].pdf 2017-02-02
18 Request For Certified Copy-Online.pdf 2016-12-30
19 abstract 201641044055.jpg 2016-12-29
20 Correspondence By Agent_Request For Certified Copy_26-12-2016.pdf 2016-12-26
21 Form26_General Power of Attorney_26-12-2016.pdf 2016-12-26
22 Abstract_As Filed_23-12-2016.pdf 2016-12-23
23 Claims_As Filed_23-12-2016.pdf 2016-12-23
24 Description Complete_As Filed_23-12-2016.pdf 2016-12-23
25 Drawing_As Filed_23-12-2016.pdf 2016-12-23
26 Form18_Express Request_23-12-2016.pdf 2016-12-23
27 Form2 Title Page_Complete_23-12-2016.pdf 2016-12-23
28 Form26_General Power of Attorney_23-12-2016.pdf 2016-12-23
29 Form3_As Filed_23-12-2016.pdf 2016-12-23
30 Form5_As Filed_23-12-2016.pdf 2016-12-23

Search Strategy

1 2021-03-0213-43-50E_02-03-2021.pdf

ERegister / Renewals

3rd: 12 Aug 2023 (23/12/2018 - 23/12/2019)
4th: 12 Aug 2023 (23/12/2019 - 23/12/2020)
5th: 12 Aug 2023 (23/12/2020 - 23/12/2021)
6th: 12 Aug 2023 (23/12/2021 - 23/12/2022)
7th: 12 Aug 2023 (23/12/2022 - 23/12/2023)
8th: 19 Dec 2023 (23/12/2023 - 23/12/2024)
9th: 18 Dec 2024 (23/12/2024 - 23/12/2025)