
System And Method Of Video Processing For Analyzing Classroom Behaviour For Improving Classroom Performance

Abstract: A system and method of video processing for analyzing classroom behaviour to improve classroom performance. A system (100) for video processing is provided. The system (100) includes one or more processors configured to capture video of individuals, wherein each video frame covers multiple individuals. The processors process the captured video to identify the emotion exhibited by each of the individuals and classify the emotion exhibited by each of the individuals into multiple classifications. The processors further determine the number of individuals assigned to each of the classifications and categorize each of the individuals as exhibiting one of a recommended or a non-recommended emotion, based on the classification of the emotion exhibited by the individual and the number of individuals assigned to each of the classifications.


Patent Information

Application #
Filing Date
20 August 2020
Publication Number
03/2021
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ipo@invntree.com
Parent Application

Applicants

EDINNO TECH LABS PRIVATE LIMITED
B-1003, 10th Floor, Advant Navis Business Park, Sec-142, Noida -

Inventors

1. Deepti Lamba
B-1003, 10th Floor, Advant Navis Business Park, Sec-142, Noida - 201305
2. Anand Prakash
B-1003, 10th Floor, Advant Navis Business Park, Sec-142, Noida - 201305

Specification

[001] Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
Field
[002] The subject matter in general relates to video analysis. More particularly,
but not exclusively, the subject matter is directed to capturing video of a
classroom to analyse student behaviour.
Discussion of related art
[003] Video analysis is a powerful tool that may be used for interpretation of a situation captured by a video capturing device. Video analysis is commonly used in classrooms, hospitals, public places and so on. Video analysis of students in a classroom may help in evaluating the classroom behaviour of the students, which in turn may help teachers in improving the academic performance of the students.
[004] In conventional methods, video and audio of students and teachers may be captured using an audio-video capturing device, and student behaviour may be studied by analysing the expressions, gestures and body language exhibited by the students and the teachers. The expressions, gestures and body language of students and teachers may be analysed using various face recognition and motion recognition techniques. Emotions and gestures such as happiness or raising a hand may indicate that a student is concentrating in class, whereas drowsiness or operating a mobile phone may indicate that the student is not paying attention in class. By analysing the classroom behaviour of students over a period of time, a student report may be generated. The student report may help teachers and parents in guiding the students with their academic performance. Further, the student report may help teachers and parents in recognizing the improvement areas of each student.
[005] In conventional video analysis of students in a classroom, the emotions of the students may not be identified with respect to the subject taught in the classroom or the classroom environment. Such an analysis may result in an inaccurate interpretation of a student's emotion. As an example, the emotion expected of a student during a discussion of 'wars and their consequences' may be a sad emotion. If the student is exhibiting a happy emotion, then conventional video analysis methods may inaccurately interpret the happy emotion of the student as the expected emotion.
[006] In light of the foregoing discussion, there may be a need for an improved
technique of video processing for analyzing classroom behaviour for improving
classroom performance.
SUMMARY
[007] In one aspect, a system is provided for video processing. The system includes one or more processors configured to capture video of individuals, wherein each video frame covers multiple individuals. The processors process the captured video to identify the emotion exhibited by each of the individuals and classify the emotion exhibited by each of the individuals into multiple classifications. The processors further determine the number of individuals assigned to each of the classifications and categorize each of the individuals as exhibiting one of a recommended or a non-recommended emotion, based on the classification of the emotion exhibited by the individual and the number of individuals assigned to each of the classifications.
[008] In another aspect, a method is provided for video processing. The method may be carried out by one or more processors. The method comprises the steps of capturing video of individuals, wherein each video frame covers multiple individuals, and processing the captured video to identify the emotion exhibited by each of the individuals. Further, the method comprises classifying the emotion exhibited by each of the individuals into multiple classifications and determining the number of individuals assigned to each of the classifications. The method also comprises categorizing each of the individuals as exhibiting one of a recommended or a non-recommended emotion, based on the classification of the emotion exhibited by the individual and the number of individuals assigned to each of the classifications.
BRIEF DESCRIPTION OF DIAGRAMS
[009] This disclosure is illustrated by way of example and not limitation in the accompanying figures, in which like references indicate similar elements and in which elements are not necessarily drawn to scale:
[0010] FIG. 1 is an exemplary block diagram illustrating software modules of a system 100 for identifying and categorizing emotions of a plurality of students, in accordance with an embodiment;
[0011] FIG. 2 is a flowchart 200 illustrating the steps involved in identifying the emotion of a plurality of individuals, in accordance with an embodiment;
[0012] FIGS. 3A-3B are a flowchart 300 illustrating the steps involved in determining a predominant emotion, in accordance with an embodiment; and
[0013] FIG. 4 is a block diagram illustrating hardware elements of the system 100 of FIG. 1, in accordance with an embodiment.
DETAILED DESCRIPTION
[0014] The following detailed description includes references to the accompanying drawings, which form part of the detailed description. The drawings show illustrations in accordance with example embodiments. These example embodiments are described in enough detail to enable those skilled in the art to practise the present subject matter. However, it may be apparent to one with ordinary skill in the art that the present invention may be practised without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to unnecessarily obscure aspects of the embodiments. The embodiments can be combined, other embodiments can be utilized, or structural and logical changes can be made without departing from the scope of the invention. The following detailed description is, therefore, not to be taken in a limiting sense.
[0015] In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one. In this document, the term “or” is used to refer to a non-exclusive “or”, such that “A or B” includes “A but not B”, “B but not A”, and “A and B”, unless otherwise indicated.
[0016] It should be understood that the capabilities of the invention described in the present disclosure and elements shown in the figures may be implemented in various forms of hardware, firmware, software, recordable medium or combinations thereof.
[0017] Referring to the figures, and more particularly to FIG. 1, a system and method for video processing is provided. The system 100 is configured to capture videos of a plurality of individuals in a classroom and identify the emotions exhibited by each of the individuals. The emotions of the individuals in the classroom may be identified by calculating a predominant emotion of the classroom. The individuals in the classroom may be students and teachers. The system 100 may further determine whether the identified emotion of a student is an expected emotion. The emotion exhibited by the students in the classroom may depend on the subject taught in the class and/or the teacher teaching a specific subject. As an example, when a subject on 'wars' is taught in the classroom, the expected emotion of the students may be a sad emotion.
[0018] In an embodiment, for capturing the video of a plurality of students and categorizing the emotions, the system 100 may comprise a video capturing module 102, a data repository 104, an image analysis module 106, a classification module 108, a summation module 110, an analysis module 112 and an emotion determination module 114.
[0019] In an embodiment, the video capturing module 102 may be configured to capture the video of a plurality of individuals in the classroom. The video capturing module 102 may comprise a video and image capturing device, wherein the video and image capturing device may be configured to capture a plurality of activities performed by the individuals in the classroom. The activities performed by the students in the classroom may be sleeping, raising hands, looking forward, and talking to each other, among many others. The activities performed by the teachers may be teaching in class, sitting down, walking around, and talking to students, among many others. The video and image capturing device may further capture the facial expressions exhibited by the students and the teachers. The facial expressions exhibited by the students may be happy, sad, angry, drowsy and so on.
[0020] In an embodiment, the data repository 104 may be configured to store a plurality of information with respect to each of the students and the teachers of an institution. The information may be an individual's name, face identity and so on. The data repository 104 may further comprise information regarding each individual's behaviour for a plurality of subjects taught in class over a predetermined duration. As an example, a student A may concentrate during a subject A1, may exhibit sleepy behaviour during a subject B1 and so on. This behavioural information, for each of the plurality of subjects corresponding to the student A, may be stored in the data repository 104.
[0021] In an embodiment, the image analysis module 106 may be configured to analyse the videos and images captured by the video and image capturing device. The image analysis module 106 may extract in-depth feature vectors of a facial expression for recognizing the expression exhibited by each of the individuals. Further, the image analysis module 106 may extract in-depth feature vectors of an individual's body motion to recognize the activities performed by each of the individuals in the classroom. The activities performed and the expressions exhibited by the plurality of individuals may be recognized using various recognition techniques such as FACENET, DLIB, YOLO and CNN-based techniques. The above-mentioned algorithms for extracting and recognizing facial expressions and behavioural characteristics are well known in the art and hence are not described in the current disclosure. On analysing the videos and the images captured from the video, the image analysis module 106 may identify the emotions exhibited and activities performed by each of the individuals. As an example, the expressions exhibited by the students and the teachers may be angry, sad, happy and neutral (no expression), among many others. The activities performed by the students and teachers may be looking forward, sleeping, talking and raising a hand, among many others.
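As an illustration of the per-frame analysis described above, the following minimal Python sketch shows how the image analysis module 106 might loop over detected faces and label each individual's emotion and activity. The detector and model objects, their method names (detect, identify, predict) and the face.crop/face.body attributes are hypothetical stand-ins, not part of the disclosure, which leaves the choice of FACENET, DLIB, YOLO or CNN-based models to the implementer.

```python
# Minimal sketch of the per-frame analysis loop of [0021].
# All detector/model objects below are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Observation:
    student_id: str   # identity resolved against the data repository 104
    emotion: str      # e.g. "happy", "sad", "angry", "neutral"
    activity: str     # e.g. "raising hand", "sleeping"

def analyse_frame(frame, face_recognizer, emotion_model, activity_model):
    """Identify every individual in one frame and label emotion and activity."""
    observations = []
    for face in face_recognizer.detect(frame):       # hypothetical detector API
        student_id = face_recognizer.identify(face)  # lookup by face identity
        emotion = emotion_model.predict(face.crop)   # facial-expression label
        activity = activity_model.predict(face.body) # body-motion label
        observations.append(Observation(student_id, emotion, activity))
    return observations
```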
[0022] In an embodiment, the classification module 108 may be configured to classify each of the expressions exhibited and the activities performed by the individuals in the classroom. A classification algorithm may be trained to classify the expressions and the activities of the students and the teachers. The expressions and the activities may be classified into a plurality of groups or classifications. The groups may be a first class, a second class and a third class.
[0023] A first class may be assigned emotions that may be positive in nature. As an example, happiness, excitement, surprise, joy and so on may be assigned to the first class.
[0024] A second class may be assigned emotions that may be negative in nature. As an example, sadness, anger, fear, disgust and so on may be assigned to the second class.
[0025] A third class may be assigned emotions that may be neutral in nature.
That is to say, when a student or a teacher is not expressing any emotion, then the
emotion of the student or the teacher may be assigned to the third class.
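Once a recognizer has produced an emotion label, the three-way grouping of paragraphs [0023]-[0025] can be realized as a simple lookup table. The sketch below is illustrative only; the class names and the emotion vocabulary are taken from the examples above, and how to handle an emotion outside the table is an implementation choice.

```python
# Hypothetical emotion-to-class lookup for the classification module 108.
FIRST_CLASS, SECOND_CLASS, THIRD_CLASS = "positive", "negative", "neutral"

EMOTION_CLASS = {
    "happy": FIRST_CLASS, "excited": FIRST_CLASS,
    "surprise": FIRST_CLASS, "joy": FIRST_CLASS,
    "sad": SECOND_CLASS, "angry": SECOND_CLASS,
    "fear": SECOND_CLASS, "disgust": SECOND_CLASS,
    "neutral": THIRD_CLASS,
}

def classify_emotion(emotion: str) -> str:
    """Assign an identified emotion to the first, second or third class."""
    return EMOTION_CLASS[emotion.lower()]
```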
[0026] In an embodiment, the summation module 110 may be configured to determine the number of individuals assigned to each of the classification groups. That is to say, the summation module 110 may calculate the percentage of the students assigned to the first class, the second class and the third class. For determining the percentage of students assigned to each group, the total number of individuals assigned to each group may be calculated using the following equation:
E_e = ∑_{i=1}^{n} (S_e)_i ; .......................................... (1)
wherein E_e is the total number of individuals assigned to a particular group;
i = 1, 2, 3, ..., n; and
S_e = individuals exhibiting the particular emotion.
[0027] In an embodiment, the total number of individuals assigned to the first class may be determined from the following equation:
E_p = ∑_{i=1}^{n} (S_p)_i ; .......................................... (1a)
wherein E_p is the total number of individuals assigned to the first class;
i = 1, 2, 3, ..., n; and
S_p = individuals exhibiting the emotions assigned to the first class.
[0028] In an embodiment, the total number of individuals assigned to the second class may be determined from the following equation:
E_n = ∑_{i=1}^{n} (S_n)_i ; .......................................... (1b)
wherein E_n is the total number of individuals assigned to the second class;
i = 1, 2, 3, ..., n; and
S_n = individuals exhibiting the emotions assigned to the second class.
[0029] In an embodiment, the total number of individuals assigned to the third class may be determined from the following equation:
E_{ne} = ∑_{i=1}^{n} (S_{ne})_i ; .......................................... (1c)
wherein E_{ne} is the total number of individuals assigned to the third class;
i = 1, 2, 3, ..., n; and
S_{ne} = individuals exhibiting the emotions assigned to the third class.
[0030] In an embodiment, the summation module 110 may further determine the percentage of individuals assigned to each classification group from the total number of individuals assigned to each group. The percentage of individuals assigned to each group may be calculated using the following equation:
E^p_e = (E_e / N) × 100% ; .......................................... (2)
wherein E^p_e is the percentage of individuals assigned to the particular group;
E_e is the total number of individuals assigned to the particular group; and
N = total number of individuals.
[0031] In an embodiment, the percentage of individuals assigned to the first class may be determined from the following equation:
E^p_p = (E_p / N) × 100% ; .......................................... (2a)
wherein E^p_p is the percentage of individuals assigned to the first class;
E_p is the total number of individuals assigned to the first class; and
N = total number of individuals.
[0032] In an embodiment, the percentage of individuals assigned to the second class may be determined from the following equation:
E^p_n = (E_n / N) × 100% ; .......................................... (2b)
wherein E^p_n is the percentage of individuals assigned to the second class;
E_n is the total number of individuals assigned to the second class; and
N = total number of individuals.
[0033] In an embodiment, the percentage of individuals assigned to the third class may be determined from the following equation:
E^p_{ne} = (E_{ne} / N) × 100% ; .......................................... (2c)
wherein E^p_{ne} is the percentage of individuals assigned to the third class;
E_{ne} is the total number of individuals assigned to the third class; and
N = total number of individuals.
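Equations (1) and (2) reduce to a tally over the per-student class labels followed by a division by N. A minimal sketch, assuming the class names used in the earlier snippet:

```python
from collections import Counter

def class_percentages(class_labels):
    """Equations (1) and (2): count individuals per class, then convert to %.

    class_labels: one class name per individual, e.g.
                  ["positive", "neutral", "negative", ...]
    Returns (counts, percentages) keyed by class name.
    """
    n = len(class_labels)                    # N, total number of individuals
    counts = Counter(class_labels)           # E_p, E_n, E_ne (equation 1)
    percentages = {cls: 100.0 * count / n    # E^p_* (equation 2)
                   for cls, count in counts.items()}
    return counts, percentages
```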
[0034] In an embodiment, the analysis module 112 may determine the predominant emotion of the classroom and a predominant emotion exhibited by the teacher in the classroom. The predominant emotion of the classroom is the emotion exhibited by a majority of students in the classroom, and the predominant emotion of the teacher is the emotion exhibited by the teacher the maximum number of times. In some scenarios, the predominant emotion of the classroom may be the positive emotion (first class). In some other scenarios, the predominant emotion of the classroom may be the negative emotion (second class). In yet another scenario, the predominant emotion of the classroom may be the neutral emotion (third class).
[0035] The analysis module 112 may identify the positive emotion as the predominant emotion if any one of the following conditions is satisfied:
E^p_p > E^p_n ≥ E^p_{ne} ; or
E^p_p > E^p_{ne} ≥ E^p_n ; or
E^p_p = E^p_n = E^p_{ne} and E_t = e'_p ; or
E^p_p = E^p_n = E^p_{ne} and E_T = e'_p .......................................... (3a)
wherein E_t is a preset expected emotion set, as an example, by the teacher or on behalf of the teacher, for each subject taught in class;
E_T is the predominant emotion of the teacher; and
e'_p is the emotion assigned to the first class (positive emotion).
[0036] The analysis module 112 may identify the negative emotion as the predominant emotion if any one of the following conditions is satisfied:
E^p_n > E^p_p ≥ E^p_{ne} ; or
E^p_n > E^p_{ne} ≥ E^p_p ; or
E^p_p = E^p_n = E^p_{ne} and E_t = e'_n ; or
E^p_p = E^p_n = E^p_{ne} and E_T = e'_n .......................................... (3b)
wherein e'_n is the emotion assigned to the second class (negative emotion).
[0037] The analysis module 112 may identify the neutral emotion as the predominant emotion if any one of the following conditions is satisfied:
E^p_{ne} > E^p_p ≥ E^p_n ; or
E^p_{ne} > E^p_n ≥ E^p_p ; or
E^p_p = E^p_n = E^p_{ne} and E_t = e'_{ne} ; or
E^p_p = E^p_n = E^p_{ne} and E_T = e'_{ne} .......................................... (3c)
wherein e'_{ne} is the emotion assigned to the third class (neutral emotion).
[0038] In an embodiment, the predominant emotion E_T of the teacher may be determined by the analysis module 112 by identifying the emotion exhibited by the teacher the maximum number of times. As an example, the teacher may exhibit the positive emotion 'A' times and may exhibit the negative or the neutral emotion 'B' times, wherein A > B. Then the analysis module 112 may determine the positive emotion as the predominant emotion of the teacher (E_T).
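Taken together, conditions (3a)-(3c) select the class with the strictly greatest percentage and, when all three percentages are equal, fall back first on the preset emotion E_t and then on the teacher's predominant emotion E_T. A sketch of that decision, assuming the percentage dictionary produced by the earlier snippet; note that a two-way tie at the top is not expressly covered by the conditions, so the fallback used for it here is an assumption:

```python
def predominant_emotion(percentages, preset=None, teacher_predominant=None):
    """Conditions (3a)-(3c): choose the classroom's predominant emotion class.

    percentages: e.g. {"positive": 45.0, "negative": 25.0, "neutral": 30.0}
    preset: the preset expected emotion E_t, if one has been set
    teacher_predominant: the teacher's predominant emotion E_T (final tie-break)
    """
    top = max(percentages.values())
    leaders = [cls for cls, pct in percentages.items() if pct == top]
    if len(leaders) == 1:
        return leaders[0]          # strict winner: first two conditions
    if len(leaders) == len(percentages):
        # All classes equal: tie-break with E_t, then E_T.
        return preset if preset is not None else teacher_predominant
    # A two-way tie at the top is not addressed by (3a)-(3c); this sketch
    # reuses the same tie-breakers as an assumed policy.
    return preset if preset is not None else teacher_predominant
```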
[0039] In an embodiment, the emotion determination module 114 may be configured to categorize the emotion of each of the individuals based on the predominant emotion of the classroom. The emotion determination module 114 may determine whether the emotion exhibited by each of the students is a recommended emotion or a non-recommended emotion. The recommended emotion of a student is the emotion that may be expected of the student as per the situation of the classroom. The non-recommended emotion of a student is the emotion which may not be expected of the student as per the situation of the classroom. As an example, if the subject taught in the classroom is 'world war and the consequences', the recommended behaviour of the classroom may be a negative emotion such as sadness. In such a scenario, referring to table 1 provided below, a student 1 (s1) expressing a happy emotion may be exhibiting a non-recommended emotion and a student 10 (s10) expressing a sad emotion may be exhibiting a recommended emotion. The emotion determination module 114 may determine whether the emotion of each of the plurality of students is the recommended or the non-recommended emotion based on the predominant emotion of the classroom. When the predominant emotion of the classroom is identified as the first class (positive emotion), then all the emotions assigned to the first class are determined as recommended emotions and all the emotions assigned to the second class and the third class may be determined as non-recommended emotions. When the predominant emotion of the classroom is identified as the second class (negative emotion), then all the emotions assigned to the second class are determined as recommended emotions and all the emotions assigned to the first class and the third class may be determined as non-recommended emotions. When the predominant emotion of the classroom is identified as the third class (neutral emotion), then all the emotions assigned to the third class are determined as recommended emotions and all the emotions assigned to the first class and the second class may be determined as non-recommended emotions.
Student Emotion Identified by AI Emotion Type
s1 Happy Positive
s2 Surprise Positive
s3 Happy Positive
s4 Surprise Positive
s5 Neutral No emotion
s6 Neutral No emotion
s7 Happy Positive
s8 Neutral No emotion
s9 Happy Positive
s10 Sad Negative
s11 Angry Negative
s12 Angry Negative
s13 Neutral No emotion
s14 Sad Negative
s15 Neutral No emotion
s16 Surprise Positive
s17 Happy Positive
s18 Sad Negative
s19 Neutral No emotion
s20 Happy Positive
TABLE. 1
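Given the predominant class, the categorization performed by the emotion determination module 114 is a single comparison. A minimal sketch, reusing the class names assumed earlier:

```python
def categorize(student_class: str, predominant_class: str) -> str:
    """Emotion determination module 114: recommended vs non-recommended."""
    if student_class == predominant_class:
        return "recommended"
    return "non-recommended"

# e.g. with a negative predominant emotion ('world war and the consequences'):
# categorize("positive", "negative") -> "non-recommended"  (s1, happy)
# categorize("negative", "negative") -> "recommended"      (s10, sad)
```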
[0040] Having discussed the various modules involved in determining the emotion exhibited by the individuals, a flowchart describing the steps involved in identifying the emotions of a plurality of students based on the predominant behaviour of the classroom is discussed hereunder.
[0041] Referring to FIG. 2, at step 202, video of a plurality of individuals in the classroom may be captured by the video and image capturing device. The video and image capturing device may capture the facial expressions and activities of each student. The captured video and images may be communicated to the image analysis module 106 for analysing and identifying the emotions exhibited by the plurality of students.
[0042] At step 204, on receiving the video from the video capturing module 102, the image analysis module 106 may identify the emotions exhibited by the students. That is to say, referring to table 1, the image analysis module 106 may identify the emotion exhibited by student 1 (s1) to be a happy emotion, that of student 5 (s5) to be a neutral emotion, that of student 10 (s10) to be a sad emotion and so on. The image analysis module 106 may identify the emotions exhibited by each of the students in the classroom.
[0043] On identifying the emotions of the students, at step 206, the classification module 108 may classify each of the identified emotions. The classification module 108 may classify each of the identified emotions into the first class, the second class or the third class. The emotions assigned to the first class may be the positive emotions (e'_p), the emotions assigned to the second class may be the negative emotions (e'_n) and the emotions assigned to the third class may be the neutral emotions (e'_{ne}). That is to say, the classification module may assign student 1 (s1) to the first class, student 5 (s5) to the third class, student 10 (s10) to the second class and so on. Similarly, all the students in the class may be assigned to the plurality of classification groups.
[0044] At step 208, the analysis module 112 may determine the predominant emotion of the classroom, as described in detail below.
[0045] At step 210, the emotion determination module 114 may determine whether the emotion exhibited by each of the students is the recommended emotion or the non-recommended emotion based on the predominant emotion of the classroom. As an example, if the predominant emotion of the classroom is the positive emotion, then the emotion exhibited by student 1 (s1) and student 2 (s2) may be determined as the recommended emotion, and the emotion exhibited by student 5 (s5), student 6 (s6), student 10 (s10) and student 11 (s11) may be determined as the non-recommended emotion.
[0046] On determining the emotion of each of the students, at step 212, the data (identified emotion) may be communicated to the data repository 104, wherein the information regarding each student may be stored under the respective student details. That is to say, the data repository 104 may store the emotion data of student 1 under the name and facial details of student 1, the emotion data of student 2 under the name and facial details of student 2 and so on.
[0047] FIGS. 3A-3B are a flowchart illustrating the steps involved in determining the predominant emotion of the classroom. At step 302, the summation module 110 may determine the total number of students in each classification group. That is to say, the summation module 110 may determine the total number of students in the first class, the second class and the third class. The total number of students in the first class may be determined from equation 1a. As an example, referring to table 1, from equation 1a, it may be determined that E_p = 9. The total number of students in the second class may be determined from equation 1b. As an example, referring to table 1, from equation 1b, it may be determined that E_n = 5. The total number of students in the third class may be determined from equation 1c. As an example, referring to table 1, from equation 1c, it may be determined that E_{ne} = 6.
[0048] On determining the total number of students in each classification group, at step 304, the summation module may determine the percentage of students in each classification group. The percentage of students in each classification group may be determined from equation 2. The percentage of students in the first class may be determined from equation 2a. As an example, referring to table 1, from equation 2a, it may be determined that E^p_p = 45%, wherein N = 20 and E_p = 9. The percentage of students in the second class may be determined from equation 2b. As an example, referring to table 1, from equation 2b, it may be determined that E^p_n = 25%, wherein N = 20 and E_n = 5. The percentage of students in the third class may be determined from equation 2c. As an example, referring to table 1, from equation 2c, it may be determined that E^p_{ne} = 30%, wherein N = 20 and E_{ne} = 6.
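Running the class_percentages and predominant_emotion sketches given earlier on the Table 1 labels reproduces these worked numbers; the check below is illustrative and assumes those two functions are in scope:

```python
# Table 1 emotion types, s1..s20, using the class names assumed earlier.
table_1 = ["positive", "positive", "positive", "positive",   # s1-s4
           "neutral", "neutral",                              # s5-s6
           "positive", "neutral", "positive",                 # s7-s9
           "negative", "negative", "negative",                # s10-s12
           "neutral", "negative", "neutral",                  # s13-s15
           "positive", "positive", "negative",                # s16-s18
           "neutral", "positive"]                             # s19-s20

counts, percentages = class_percentages(table_1)
assert counts == {"positive": 9, "negative": 5, "neutral": 6}        # E_p, E_n, E_ne
assert percentages == {"positive": 45.0, "negative": 25.0, "neutral": 30.0}
assert predominant_emotion(percentages) == "positive"                # step 306
```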
[0049] On determining the percentage of students in each of the classification groups, at step 306, the analysis module 112 identifies the predominant emotion of the classroom. The predominant emotion may be the positive emotion (first class), the negative emotion (second class) or the neutral emotion (third class). The predominant emotion may be determined by the analysis module 112 by analysing the percentage of students in the classification groups. In certain scenarios, the percentage of students in one of the classification groups may be greater than the percentage of students in the other classification groups. In such cases, the emotion assigned to the classification group with the greatest percentage may be considered as the predominant emotion. In certain other scenarios, the percentage of students in every classification group may be the same (E^p_p = E^p_n = E^p_{ne}). In such scenarios, the preset emotion may be considered, and the classification group which is the same as the preset emotion is identified as the predominant emotion. As an example, if the preset emotion is positive, then the first class (positive) emotion is considered as the predominant emotion. The preset emotion may be set by the teacher depending upon the subject taught in the class.
[0050] In certain other scenarios, wherein the percentage of students in every classification group may be the same (E^p_p = E^p_n = E^p_{ne}), the teacher may not have set the preset emotion. In such scenarios, the analysis module 112 may determine the predominant emotion of the teacher. The classification group which is the same as the predominant emotion of the teacher may be identified as the predominant emotion of the classroom.
[0051] At step 308a, the analysis module 112 may identify the emotion assigned to the first class as the predominant emotion when the percentage of students exhibiting the positive emotion is greater than the percentage of students exhibiting the negative or the neutral emotion. That is to say, if E^p_p > E^p_n ≥ E^p_{ne} or E^p_p > E^p_{ne} ≥ E^p_n, then the positive emotion is identified as the predominant emotion of the classroom. As an example, referring to table 1 and the equations provided above, since E^p_p > E^p_{ne} > E^p_n (45% > 30% > 25%), the positive emotion is identified as the predominant emotion. If the percentage of students exhibiting the positive emotion is equal to the percentage of students exhibiting the negative and the neutral emotion (E^p_p = E^p_n = E^p_{ne}), then the analysis module 112 may check whether the teacher teaching the subject has set the preset emotion (E_t). If the preset emotion is positive (E_t = e'_p), then the emotion assigned to the first class may be identified as the predominant emotion. If the teacher has not set the preset emotion, then the analysis module 112 may determine the predominant emotion of the teacher. If the predominant emotion of the teacher is the positive emotion (E_T = e'_p), then the analysis module 112 may identify the positive emotion as the predominant emotion.
[0052] At step 308b, the positive emotions such as happy, excited and so on are determined as the recommended emotions of the classroom, and the negative and the neutral emotions such as sad and angry may be considered as the non-recommended emotions.
[0053] At step 310a, the analysis module 112 may identify the emotion assigned to the second class as the predominant emotion when the percentage of students exhibiting the negative emotion is greater than the percentage of students exhibiting the positive or the neutral emotion. That is to say, if E^p_n > E^p_p ≥ E^p_{ne} or E^p_n > E^p_{ne} ≥ E^p_p, then the negative emotion may be identified as the predominant emotion of the classroom. If the percentage of students exhibiting the negative emotion is equal to the percentage of students exhibiting the positive and the neutral emotion (E^p_n = E^p_p = E^p_{ne}), then the analysis module 112 may check whether the teacher teaching the subject has set the preset emotion (E_t). If the preset emotion is the negative emotion (E_t = e'_n), then the emotion assigned to the second class may be identified as the predominant emotion. If the teacher has not set the preset emotion, then the analysis module 112 may determine the predominant emotion of the teacher. If the predominant emotion of the teacher is the negative emotion (E_T = e'_n), then the analysis module 112 may identify the negative emotion as the predominant emotion.
[0054] At step 310b, the negative emotions such as sad, angry and so on are determined as the recommended emotions of the classroom, and the positive and the neutral emotions such as happy, excited and so on may be considered as the non-recommended emotions.
[0055] At step 312a, the analysis module 112 may identify the emotion assigned to the third class as the predominant emotion when the percentage of students exhibiting the neutral emotion is greater than the percentage of students exhibiting the negative or the positive emotion. That is to say, if E^p_{ne} > E^p_n ≥ E^p_p or E^p_{ne} > E^p_p ≥ E^p_n, then the neutral emotion is identified as the predominant emotion of the classroom. If the percentage of students exhibiting the neutral emotion is equal to the percentage of students exhibiting the negative and the positive emotion (E^p_{ne} = E^p_n = E^p_p), then the analysis module 112 may check whether the teacher teaching the subject has set the preset emotion (E_t). If the preset emotion is the neutral emotion (E_t = e'_{ne}), then the emotion assigned to the third class may be identified as the predominant emotion. If the teacher has not set the preset emotion, then the analysis module 112 may determine the predominant emotion of the teacher. If the predominant emotion of the teacher is the neutral emotion (E_T = e'_{ne}), then the analysis module 112 may identify the emotion assigned to the third class as the predominant emotion.
[0056] At step 312b, the neutral emotion is determined as the recommended emotion of the classroom, and the negative and the positive emotions such as happy, sad and angry may be considered as the non-recommended emotions.
[0057] In another embodiment, the classification module 108 may classify a behaviour of the plurality of individuals in the classroom. The behaviour of each student may be obtained by combining the emotion and the activity of each student.
[0058] In the embodiment, the classification module 108 may classify each of the activities performed by the students into a plurality of activity groups. The groups may be a first activity group and a second activity group.
[0059] A first activity group may be assigned activities that may be positive in nature. Referring to table 2 provided below, the activities that may be positive in nature are sitting upright, standing up to answer and looking forward, among many others.
Activity Activity Type
Sitting upright Positive
Standing up to answer Positive
Tilting or bending Negative
Slouching Negative
Sitting with legs up Negative
Sleeping Negative
Raising hand to answer Positive
Looking at teacher Positive
Looking into own notebook Positive
Colouring, writing, answering Positive
Jumping, throwing Negative
Eating Negative
Table. 2
[0060] A second activity group may be assigned activities that may be negative in nature. Referring to table 2, the activities that may be negative in nature are slouching, sleeping and bending, among many others.
[0061] In the embodiment, the classification module 108 may further classify the behaviour of each student in the classroom into a plurality of classification groups. The groups may be a first class and a second class.
[0062] A first class may be assigned behaviour that may be positive in nature. The classification module 108 may analyse the emotion exhibited and the classification of the activity of each student to determine the behaviour of each student. If both the emotion and the activity are positive in nature, then the behaviour of the student may be assigned to the first class. Happiness, excitement, joy and so on may be the emotions that may be positive in nature. As an example, referring to table 3 provided below, student 1 (s1) may be exhibiting a positive emotion (happy) and performing a positive activity. Then the classification module 108 may assign the behaviour of the student to the first class.
Student Emotion Activity Behaviour Type
s1 Happy Positive Positive
s2 Surprise Positive Positive
s3 Happy Positive Positive
s4 Surprise Negative Negative
s5 Neutral Positive Positive
s6 Neutral Negative Negative
s7 Happy Negative Negative
s8 Neutral Positive Positive
s9 Happy Positive Positive
s10 Sad Positive Negative
s11 Angry Negative Negative
s12 Angry Positive Negative
s13 Neutral Negative Negative
s14 Sad Positive Negative
s15 Neutral Negative Negative
s16 Surprise Positive Positive
s17 Happy Negative Negative
s18 Sad Positive Negative
s19 Neutral Positive Positive
s20 Happy Positive Positive
Table. 3
[0063] A second class may be assigned behaviour that may be negative in nature. If either the emotion or the activity is negative in nature, then the behaviour of the student may be assigned to the second class. Sadness, anger, disgust and so on may be the emotions that may be negative in nature. As an example, referring to table 3, student 10 (s10) may be exhibiting a negative emotion (sad) and performing a positive activity. Then the classification module 108 may assign the behaviour of the student to the second class. Further, if both the activity and the emotion are negative, then the classification module 108 may assign the behaviour of the student to the second class.
[0064] Further, when the emotion exhibited by the student is neutral, then the classification module 108 may assign the behaviour of the student to the first class if the activity of the student is positive. As an example, referring to table 3, the behaviour of student 5 (s5), exhibiting the neutral emotion and performing the positive activity, may be assigned to the first class.
[0065] In the embodiment, when the emotion exhibited by the student is neutral, then the classification module 108 may assign the behaviour of the student to the second class if the activity of the student is negative. As an example, referring to table 3, the behaviour of student 6 (s6), exhibiting the neutral emotion and performing the negative activity, may be assigned to the second class. The determination of the predominant behaviour (emotion) and the categorization of the behaviour of each student as recommended or non-recommended is the same as explained above.
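The combination rules of paragraphs [0062]-[0065] collapse to a single test: the behaviour is positive only when the activity is positive and the emotion is not negative; otherwise it is negative. A minimal sketch, assuming the same label vocabulary as the earlier snippets:

```python
def behaviour_class(emotion_type: str, activity_type: str) -> str:
    """Combine emotion and activity ([0062]-[0065]) into a behaviour class.

    emotion_type: "positive", "negative" or "neutral"
    activity_type: "positive" or "negative"
    """
    if activity_type == "positive" and emotion_type != "negative":
        return "positive"   # first class: e.g. table 3 rows s1, s5, s19
    return "negative"       # second class: any negative emotion or activity

assert behaviour_class("neutral", "positive") == "positive"   # s5
assert behaviour_class("negative", "positive") == "negative"  # s10 (sad)
assert behaviour_class("neutral", "negative") == "negative"   # s6
```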
[0066] In an embodiment, facial emotion and bodily gestures may be collectively referred to as emotion.
21
[0067] FIG.4 is a block diagram illustrating hardware elements of the system
100 of FIG. 1, in accordance with an embodiment. The system 100 may be
implemented using one or more servers, which may be referred to as server. The
system 100 may include a processing module 402, a memory module 404, an
5 input/output module 408, a display module 410, a communication interface 412
and a bus 414 interconnecting all the modules of the system 100.
[0068] The processing module 402 is implemented in the form of one or more processors and may be implemented as appropriate in hardware, computer-executable instructions, firmware, or combinations thereof. Computer-executable instruction or firmware implementations of the processing module 402 may include computer-executable or machine-executable instructions written in any suitable programming language to perform the various functions described.
[0069] The memory module 404 may include a permanent memory such as a hard disk drive, and may be configured to store data and executable program instructions that are implemented by the processing module 402. The memory module 404 may be implemented in the form of a primary and a secondary memory. The memory module 404 may store additional data and program instructions that are loadable and executable on the processing module 402, as well as data generated during the execution of these programs. Further, the memory module 404 may be a volatile memory, such as a random access memory and/or a disk drive, or a non-volatile memory. The memory module 404 may comprise removable memory such as a Compact Flash card, Memory Stick, Smart Media, Multimedia Card, Secure Digital memory, or any other memory storage that exists currently or may exist in the future. The memory module 404 may store the plurality of algorithms useful for classifying the emotions of the plurality of individuals. The plurality of algorithms may be a training algorithm and a classification algorithm, among others. The memory module 404 may further store the details of the plurality of students.
[0070] The input/output module 408 may provide an interface for input devices such as computing devices, keypad, touch screen, mouse, and stylus, among other input devices; and output devices such as speakers, printer, and additional displays, among others. The input/output module 408 may be used to receive data or send data through the communication interface 412.
[0071] The display module 410 may be implemented using displays such as Liquid Crystal Displays (LCD) or any other type of display currently existing or which may exist in the future.
[0072] The communication interface 412 may include a modem, a network interface card (such as an Ethernet card), a communication port, and a Personal Computer Memory Card International Association (PCMCIA) slot, among others. The communication interface 412 may include devices supporting both wired and wireless protocols. Data in the form of electronic, electromagnetic and optical signals, among others, may be transferred via the communication interface 412.
[0073] It should be understood that the capabilities of the invention described in the present disclosure and elements shown in the figures may be implemented in various forms of hardware, firmware, software, recordable medium or combinations thereof.
[0074] Although embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the system and method described herein. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
[0075] Many alterations and modifications of the present invention will no doubt become apparent to a person of ordinary skill in the art after having read the foregoing description. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. It is to be understood that the description above contains many specifics; these should not be construed as limiting the scope of the invention but as merely providing illustrations of some of the presently preferred embodiments of this invention. Thus, the scope of the invention should be determined by the appended claims and their legal equivalents rather than by the examples given.

CLAIMS
We claim:
1. A system (100) of video processing, the system comprising one or more processors configured to:
capture video of individuals, wherein each video frame covers multiple individuals;
process the captured video to identify emotion exhibited by each of the individuals;
classify, into multiple classifications, the emotion exhibited by each of the individuals;
determine the number of individuals assigned to each of the classifications; and
categorize each of the individuals as exhibiting one of recommended or non-recommended emotion based on the classification of the emotion exhibited by the individual and the number of individuals assigned to each of the classifications.
2. The system (100) of claim 1, wherein the classifications comprise a first class, a second class and a third class, wherein a first set of emotions, a second set of emotions and a third set of emotions are assigned to the first class, the second class and the third class, respectively, and wherein the processor classifies, into one of the three classes, the emotion exhibited by each of the individuals based on the emotion exhibited by the individual and the assignment of the exhibited emotion to one of the first class, the second class or the third class.
3. The system (100) of claim 1, wherein the processor is configured to:
categorize the individual as exhibiting the recommended emotion if the classification of the emotion exhibited by the individual is one of the classifications that is predominant among the individuals captured in the video; and
categorize the individual as exhibiting the non-recommended emotion if the classification of the emotion exhibited by the individual is not one of the classifications that is predominant among the individuals captured in the video.
4. The system (100) of claim 1, wherein the processor is configured to assign one of the classifications as a recommended category, wherein if none of the classifications is predominant among the individuals captured in the video, then the processor is configured to:
categorize the individual as exhibiting the recommended emotion if the classification of the emotion exhibited by the individual is the same as the classification assigned to the recommended category; and
categorize the individual as exhibiting the non-recommended emotion if the classification of the emotion exhibited by the individual is not the same as the classification assigned to the recommended category.
5. The system (100) of claim 1, wherein the processor is configured to identify the emotion based on facial emotion and bodily gesture.
6. A method of video processing, the method carried out by one or more processors, the method comprising:
capturing video of individuals, wherein each video frame covers multiple individuals;
processing the captured video to identify emotion exhibited by each of the individuals;
classifying, into multiple classifications, the emotion exhibited by each of the individuals;
determining the number of individuals assigned to each of the classifications; and
categorizing each of the individuals as exhibiting one of recommended or non-recommended emotion based on the classification of the emotion exhibited by the individual and the number of individuals assigned to each of the classifications.
7. The method of claim 6, wherein the classifications comprise a first class, a second class and a third class, wherein a first set of emotions, a second set of emotions and a third set of emotions are assigned to the first class, the second class and the third class, respectively, and wherein the method comprises the processor classifying, into one of the three classes, the emotion exhibited by each of the individuals based on the emotion exhibited by the individual and the assignment of the exhibited emotion to one of the first class, the second class or the third class.
8. The method of claim 6, wherein the method comprises, the processor:
categorizing the individual as exhibiting the recommended emotion if the classification of the emotion exhibited by the individual is one of the classifications that is predominant among the individuals captured in the video; and
categorizing the individual as exhibiting the non-recommended emotion if the classification of the emotion exhibited by the individual is not one of the classifications that is predominant among the individuals captured in the video.
9. The method of claim 6, wherein the method comprises, the processor assigning one of the classifications as a recommended category, wherein if none of the classifications is predominant among the individuals captured in the video, then the method comprises the processor:
categorizing the individual as exhibiting the recommended emotion if the classification of the emotion exhibited by the individual is the same as the classification assigned to the recommended category; and
categorizing the individual as exhibiting the non-recommended emotion if the classification of the emotion exhibited by the individual is not the same as the classification assigned to the recommended category.
10. The method of claim 6, wherein the method comprises, the processor identifying the emotion based on facial emotion and bodily gesture.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 202011035815-COMPLETE SPECIFICATION [20-08-2020(online)].pdf 2020-08-20
2 202011035815-DECLARATION OF INVENTORSHIP (FORM 5) [20-08-2020(online)].pdf 2020-08-20
3 202011035815-DRAWINGS [20-08-2020(online)].pdf 2020-08-20
4 202011035815-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [20-08-2020(online)].pdf 2020-08-20
5 202011035815-FIGURE OF ABSTRACT [20-08-2020(online)].jpg 2020-08-20
6 202011035815-FORM 1 [20-08-2020(online)].pdf 2020-08-20
7 202011035815-FORM 18 [20-08-2020(online)].pdf 2020-08-20
8 202011035815-FORM FOR SMALL ENTITY(FORM-28) [20-08-2020(online)].pdf 2020-08-20
9 202011035815-FORM FOR STARTUP [20-08-2020(online)].pdf 2020-08-20
10 202011035815-OTHERS [20-08-2020(online)].pdf 2020-08-20
11 202011035815-POWER OF AUTHORITY [20-08-2020(online)].pdf 2020-08-20
12 202011035815-PROOF OF RIGHT [20-08-2020(online)].pdf 2020-08-20
13 202011035815-REQUEST FOR EXAMINATION (FORM-18) [20-08-2020(online)].pdf 2020-08-20
14 202011035815-STATEMENT OF UNDERTAKING (FORM 3) [20-08-2020(online)].pdf 2020-08-20
15 202011035815-EVIDENCE FOR REGISTRATION UNDER SSI [28-09-2020(online)].pdf 2020-09-28
16 202011035815-FORM FOR STARTUP [28-09-2020(online)].pdf 2020-09-28
17 202011035815-FORM 18A [07-01-2021(online)].pdf 2021-01-07
18 202011035815-FORM-9 [07-01-2021(online)].pdf 2021-01-07
19 202011035815-FORM28 [07-01-2021(online)].pdf 2021-01-07
20 202011035815-STARTUP [07-01-2021(online)].pdf 2021-01-07
21 202011035815-ABSTRACT [01-07-2021(online)].pdf 2021-07-01
22 202011035815-AMMENDED DOCUMENTS [01-07-2021(online)].pdf 2021-07-01
23 202011035815-CLAIMS [01-07-2021(online)].pdf 2021-07-01
24 202011035815-COMPLETE SPECIFICATION [01-07-2021(online)].pdf 2021-07-01
25 202011035815-DRAWING [01-07-2021(online)].pdf 2021-07-01
26 202011035815-FER_SER_REPLY [01-07-2021(online)].pdf 2021-07-01
27 202011035815-FORM 13 [01-07-2021(online)].pdf 2021-07-01
28 202011035815-MARKED COPIES OF AMENDEMENTS [01-07-2021(online)].pdf 2021-07-01
29 202011035815-Correspondence to notify the Controller [12-08-2021(online)].pdf 2021-08-12
30 202011035815-FORM-26 [12-08-2021(online)].pdf 2021-08-12
31 202011035815-Response to office action [17-08-2021(online)].pdf 2021-08-17
32 202011035815-Written submissions and relevant documents [01-09-2021(online)].pdf 2021-09-01
33 202011035815-Correspondence-040920-.pdf 2021-10-19
34 202011035815-Correspondence-040920.pdf 2021-10-19
35 202011035815-Correspondence-230821.pdf 2021-10-19
36 202011035815-FER.pdf 2021-10-19
37 202011035815-OTHERS-040920.pdf 2021-10-19
38 202011035815-Power of Attorney-040920.pdf 2021-10-19
39 202011035815-Power of Attorney-230821.pdf 2021-10-19
40 202011035815-US(14)-HearingNotice-(HearingDate-24-08-2021).pdf 2021-10-19

Search Strategy

1 2021-02-0912-09-20E_09-02-2021.pdf
2 US20160300135A1E_09-02-2021.pdf