
Imparting Education In A Smart Classroom Environment

Abstract: Systems and methods for imparting education in a smart classroom environment are described. In one implementation, the smart education system (102), for imparting education in a smart classroom environment, comprises a processor (152-1) and a data synchronization module (110), coupled to the processor (152-1), to receive a synchronization request from a smart device (104) to synchronize data. The smart education system (102) further includes a projection module (166), coupled to the processor (152-1), to transform the received data based on at least one of a calibration of the smart device (104) and a relative position of the smart device (104) with respect to a main projected display surface (106). Thereafter the projection module (166) projects the transformed data on the main projected display surface (106).


Patent Information

Application #:
Filing Date: 01 August 2013
Publication Number: 06/2015
Publication Type: INA
Invention Field: ELECTRONICS
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2022-04-18
Renewal Date:

Applicants

SAMSUNG INDIA ELECTRONICS PVT. LTD.
Logix Cyber Park Plot No. C- 28 & 29 Tower D 2nd Floor Sector – 62 Noida Uttar Pradesh 201301

Inventors

1. JHA, Ashish Kumar
H No- 544 / III / 12, Rangatand Railway Colony, Dhanbad (826001), Jharkhand,

Specification

TECHNICAL FIELD
[0001] The present subject matter relates to educational systems in general, and
particularly, but not exclusively, to methods and systems for imparting education in a
smart classroom environment.
BACKGROUND
[0002] With developments in the education industry, students have moved away
from the traditional blackboard-chalk method of imparting education. Nowadays,
students usually enroll in courses or institutes which provide current, relevant,
interesting and engaging course materials. Further, the courses are designed and
imparted by teachers, counselors and advisors, collectively referred to as instructors,
who are aware of each student's educational and professional path and goals based on a
map of the progress of the course and the degree of which the course is a part.
[0003] With the development of technology, computing devices, including smart
devices, such as smartphones and Personal Digital Assistants (PDAs), are often present
in smart classrooms. Instructors may use the students' familiarity with these smart
devices to implement various curricula and/or lesson plans in a technology-driven
manner, and make lessons interesting and exciting for students.
[0004] Current approaches to using such technology in educational settings may
involve the use of smart devices as voting and/or answering devices. For example, a
student can send one or more answers to questions, asked by the instructor, to their
instructor via a message. Other approaches may allow students to record images and/or
videos in various situations, such as videotaping laboratory experiments and capturing
images of lecture notes.
BRIEF DESCRIPTION OF DRAWINGS
[0005] The detailed description is described with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the figure in
which the reference number first appears. The same numbers are used throughout the
figures to reference like features and components:
[0006] Figure 1a schematically illustrates the components of a smart classroom
environment, according to an example of the present subject matter.
[0007] Figure 1b schematically illustrates the components of a smart education
system and a smart device in a network environment, for imparting education in a smart
classroom environment, according to an example of the present subject matter.
[0008] Figure 1c schematically illustrates the scaling up and scaling down
operation performed by the smart education system and the smart device in a smart
classroom environment, according to an example of the present subject matter.
[0009] Figure 2 illustrates a method for calibrating a display of a smart device
for imparting education in a smart classroom environment, according to an example of
the present subject matter.
[0010] Figure 3 illustrates a method for synchronizing a smart device and a
smart education system imparting education in a smart classroom environment,
according to an example of the present subject matter.
[0011] Figure 4 illustrates a method for updating the display of the smart device
and smart education system, based on an input by a student, in a smart classroom
environment, according to an example of the present subject matter.
[0012] Figure 5 illustrates a method for updating the display of the smart device
and smart education system, based on an input by an instructor, in a smart classroom
environment, according to an example of the present subject matter.
[0013] Figure 6 illustrates a method for updating display in a smart classroom
environment, according to an example of the present subject matter.
[0014] Figure 7 illustrates a method for updating display in a smart classroom
environment, according to another example of the present subject matter.
DETAILED DESCRIPTION
[0015] The present subject matter relates to systems and methods for imparting
education in a smart classroom environment. The methods and systems as described
herein may be implemented in any computing system, including smart devices, capable
of transferring data over a communication network. Examples of such smart devices may
include mobile phones, smart phones, tablets, laptops, phablets and Personal Digital
Assistants (PDAs).
[0016] In recent times, there has been an exponential increase in usage of smart
devices by students. This has led to the usage of smart devices in the education industry
as well. Students regularly use the smart devices to receive lecture notes, submit
assignments and download and/or view reference materials related to a course. This has
changed the way the education industry operates.
[0017] Nowadays, the education industry has introduced smart classrooms.
Generally, smart classrooms are equipped with computer(s) and audio-visual
equipment(s), such as projectors and sound systems, which allow the instructors to
teach the students using a wide variety of media, such as videos and interactive
animations.
[0018] Most smart classroom environments include applications which enable
the instructor to share files, quizzes and assignments with the students and/or provide
students with documents to work on. Further, the applications facilitate automatic
submission of assignments and quizzes on completion of the time allocated to the
students. This helps the instructors to focus on imparting of education and reduces time
spent on distribution and collection of quizzes or assignments.
[0019] However, most smart classroom applications do not provide facilities for
the student to interact with the instructor during the lecture session. Certain smart
classroom applications which facilitate such interaction usually require each student to
have a computer with an application installed on it to facilitate interaction and sharing of
files. However, the computer-based smart classroom applications are difficult to adopt
for both the instructor and the students, as they involve the students remaining stationary
in front of a computer to interact with other students and the instructor. In many cases,
the instructor and the students have difficulty in adapting to said smart classrooms, as
said smart classroom environments are very different from the traditional blackboard-chalk
classrooms, where students can move freely, talk to other students, illustrate their
points with hand gestures and eye contact, and illustrate their ideas conveniently by
writing on their notebooks or on the blackboard. Further, the infrastructure costs of the
smart classroom also increase, as each student has to be provided with his/her own
computer.
[0020] Certain smart classroom applications have tried to capitalize on the fact
that most students have their own smart devices. However, the smart devices of the
students vary greatly in terms of their features, technical specifications, manufacturers
and model numbers. It is challenging to accommodate students with smart devices
which have varied technical specifications and network capabilities. Most smart
classroom applications have rigid requirements of smart devices, especially in terms of
processing power and network capabilities. Students who have smart devices which do
not match the requirements of the smart classroom applications either are not allowed to
join a session or do not receive acceptable service quality. On the other hand, students with
smart devices which have additional features are not able to fully take advantage of the
extra capabilities of their smart devices. Thus, the penetration of smart devices in smart
classrooms has been quite slow.
[0021] The present subject matter describes systems and methods for imparting
education in a smart classroom environment. In one example, the methods and systems
impart education in a smart classroom environment using a smart education system,
operated by an instructor, and one or more smart devices, wherein each smart device is
operated by a student. The smart devices may be implemented as various computing
systems, such as mobile phones, smart phones, tablets, laptops, phablets and Personal
Digital Assistants (PDAs). The smart education system may be implemented as various
computing systems, such as mobile phones, smart phones, tablets, laptops, phablets,
Personal Digital Assistants (PDAs), desktops, workstations and servers.
[0022] In one example, for the initial setup, some of the students' smart devices
and the instructor's smart education system include a projector, which produces a
projected display, and an image capturing device, such as a camera, that detects gestures
made by the students or the instructor with respect to the projected display or with respect to
the smart education system or the smart devices. In one example, the smart devices and
the smart education system may be synchronized with each other. The smart devices and
the smart education system may also be connected through communication links that
allow each smart device and the smart education system to share its display with its
synchronized smart devices. Post synchronization, the displays of
the synchronized smart devices and the smart education system shall be the same.
[0023] In one example, the smart education system has its projection on a large
projected display, visible to the whole classroom. The projected display corresponding to the
smart education system is henceforth referred to as the main projected display surface.
Similarly, some of the smart devices used by the students may have their own projected
display. The projected display surface associated with each student's smart device is
henceforth referred to as the private projected display surface.
[0024] In operation, after synchronization between the instructor’s smart
education system and the student’s smart device, a unified interactive projected surface
may be created comprising the private projected display surface(s) of one or more smart
device(s) and the main projected display surface created by the smart education system.
Post synchronization, any student and the instructor may interact dynamically and
simultaneously with either of the private projected display surface and the main
projected display surface, at any instance of time, forming the unified interactive
projected surface. The operations performed by the smart device(s) or the smart
education system as a result of the interaction may be visible to the whole classroom via the
main projected display surface. Further, the main projected display surface may also be
replicated on the private projected display surface of each synchronized smart device.
The synchronization of the displays of the smart devices and the smart education system
helps in generating a smart graphical interaction in the classroom, where the students
and the instructors may collaborate, discuss and interact, leading to inclusive and smart
education.
[0025] In one example, each smart device may determine the relative position of
the smart device with respect to its private projected display surface. Each of the smart
devices may also compute the ratio of the dimensions of a projected image on its private
projected display surface to the dimensions of the display screen of the smart device. Based
on the relative position and the ratio, the display of each of the smart devices may be
calibrated with respect to its private projected display surface. In one example, each
smart device may also determine the relative position of the smart device with respect
to the main projected display surface. The smart device may also compute the ratio of the
dimensions of a projected image on the main projected display surface to the dimensions of
the display screen of the smart device. Based on the relative position and the ratio, the
display of each of the smart devices may be calibrated with respect to the main projected
display surface. The calibration of the smart devices facilitates scaling up or scaling
down and rotation of the plane of a gesture made with respect to either the private projected
display surface or the main projected display surface, such that the gesture is fed as an
input to the smart device. The instructor and the students may make gestures with
respect to the main projected display surface, with respect to the private projected display
surface, or with respect to the smart education system and the smart devices, respectively.
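The calibration described above can be pictured as a simple coordinate mapping. The following is an illustrative sketch, not taken from the specification: the function names, the linear scaling model and the sample dimensions are all assumptions, intended only to show how a ratio of projected-image dimensions to screen dimensions can scale a gesture point back into device-screen coordinates.

```python
# Hypothetical sketch of the calibration step: compute the ratio of the
# projected-image dimensions to the device-screen dimensions, then use it,
# together with the relative position (origin) of the projected surface,
# to map a gesture point on the surface back to screen coordinates.

def calibrate_ratio(projected_size, screen_size):
    """Ratio of projected-image dimensions to display-screen dimensions."""
    pw, ph = projected_size
    sw, sh = screen_size
    return (pw / sw, ph / sh)

def map_gesture_point(point, ratio, surface_origin):
    """Scale a gesture point on the projected surface down to screen coordinates."""
    rx, ry = ratio
    ox, oy = surface_origin  # relative position of the surface w.r.t. the device
    x, y = point
    return ((x - ox) / rx, (y - oy) / ry)

# A projected image three times the size of the device screen:
ratio = calibrate_ratio((1200, 900), (400, 300))
print(map_gesture_point((650, 475), ratio, (50, 25)))  # (200.0, 150.0)
```

A real implementation would also handle rotation of the plane of the gesture, which this sketch omits for brevity.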
[0026] In one example, any kind of touch gesture or motion, on the main
projected display surface or private projected display surface, by the instructor or the
student, may be detected and interpreted as touch gesture or motion on their
corresponding smart education system and smart device. The gesture made with respect
to either of the main projected display surface and private projected display surface
may be mapped to the corresponding smart education system and smart device based on
the relative position of the corresponding smart education system and smart device with
respect to the projected display surface on which the gesture was made.
[0027] The students may also make three-dimensional (3D) gestures with respect
to either of the projected display surfaces. In one example, the smart device may
determine the direction, motion and shape of the 3D gesture based on the location of
the projected display surface with respect to the smart device. The smart device may determine
the direction, motion and shape of the 3D gesture based on the orientation and size of
the projected display surface with respect to the size of the display panel or screen of the
smart device. In one example, the smart education system may process 3D gestures
made by the instructor in a similar manner, as described above.
[0028] In one example, if a student wishes to share something with other
students for the purpose of interactive discussion, then the student’s smart device may
send a synchronization request to the smart education system, for synchronizing the
display of the smart device and the smart education system, so as to project the display
of the student’s smart device on the main projected display surface. The smart education
system may accept or deny the request based on the instructor’s inputs. If the
synchronization request is accepted by the smart education system based on the instructor's
inputs, the smart education system may project the display of the student’s smart device
onto the main projected display surface for being displayed to the whole classroom.
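The handshake in the paragraph above can be summarized in a few lines of code. This is a hedged, minimal sketch: the class and method names are illustrative assumptions, not part of the specification, and the instructor's input is modeled as a simple boolean.

```python
# Illustrative sketch of the synchronization handshake: a student's smart
# device sends a synchronization request, and the smart education system
# accepts or denies it based on the instructor's input. Only while a device
# is synchronized is its display projected on the main surface.

class SmartEducationSystem:
    def __init__(self):
        self.synchronized = set()

    def handle_sync_request(self, device_id, instructor_accepts):
        """Accept or deny a smart device's synchronization request."""
        if not instructor_accepts:
            return "denied"
        self.synchronized.add(device_id)
        return "synchronized"

    def project(self, device_id, display_data):
        """Project a synchronized device's display on the main surface."""
        if device_id in self.synchronized:
            return f"main surface <- {display_data}"
        return None

system = SmartEducationSystem()
print(system.handle_sync_request("device-104-1", instructor_accepts=True))
print(system.project("device-104-1", "cone sketch"))
```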
[0029] In said example, if, during the period of synchronization, the student
interacts with his/her private projected display surface via gestures, the smart device
may update the private projected display surface based on the gestures and the
operations or the actions associated with the gestures. The smart device may also
synchronize the updated private projected display surface with the smart education
system for being displayed on the main projected display surface. Thus, in said example,
during the synchronization period, any update on the private projected display surface,
due to gestures performed by the student with respect to his/her private projected display
surface, is reflected dynamically on the main projected display surface, for the whole
classroom to observe. This helps the student to interactively convey and discuss
his/her ideas with the instructor and other students. In one example, the instructor may
interact with the display on main projected display surface via gestures with respect to
the main projected display surface. The result of interaction by the instructor may
update the display of the main projected display surface which will be synchronized
with the smart device of the student, so as to update the private projected display
surface. This helps in interactive discussion of problems and solutions between the
instructor and the students during a lecture class.
[0030] In one example, the smart classroom may be equipped with a 3D
projector, which may receive input from the smart education system regarding the
shape and the location of the projection. In one example, the instructor may
select images, visible on the main projected display surface, for the purpose of 3D
projection by 3D projector. The smart education system may transmit data associated
with the shape and location of the 3D object to the 3D projector for being displayed on
the main projected display surface. The instructor may interact with the 3D projection
via gestures, and the operations or actions associated with the identified gestures
may be performed. As a result of the gestures, the smart education system may update
the data associated with the shape and location of the 3D object and transmit the same to
the 3D projector for updating the 3D projection.
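The data flow between the smart education system and the 3D projector described above can be sketched as follows. This is an assumption-laden illustration: the data shapes, function names and the string "transmitted" to the projector are all hypothetical.

```python
# Hedged sketch of the 3D-projection flow: the smart education system holds
# shape and location data for a selected 3D object, transmits it to the 3D
# projector, updates the data when a gesture is made, and re-transmits it.

def make_3d_object(shape, location):
    return {"shape": shape, "location": list(location)}

def apply_gesture(obj, dx=0, dy=0, dz=0):
    """Update the 3D object's location as a result of a gesture."""
    x, y, z = obj["location"]
    obj["location"] = [x + dx, y + dy, z + dz]
    return obj

def transmit(obj):
    """Stand-in for sending shape/location data to the 3D projector."""
    return f"3D projector <- {obj['shape']} at {tuple(obj['location'])}"

cone = make_3d_object("cone", (0, 0, 0))
print(transmit(cone))                         # initial projection
print(transmit(apply_gesture(cone, dx=2, dy=1)))  # after a gesture
```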
[0031] In another example, the student, whose smart device is synchronized with
the smart education system, may also interact with 3D projection on the main projected
display surface via gestures made with respect to the main projected display surface.
The gestures made by the student may be detected by the smart device of the student
and transmitted to the smart education system. The smart education system may then
transmit the received data to the 3D projector for updating the 3D projection as a result
of student’s gestures.
[0032] In one example, the student may take notes, using electronic accessories,
such as an S Pen or a stylus, on his/her private projected display surface. In said example,
the writing or drawing actions made by the student on his/her private projected display
surface may be recognized using handwriting recognition tools and transformed into
text or a drawing. The text or the drawing may be displayed on the private projected
display surface. Thus, the student may perceive that he/she is actually writing or drawing
on his/her private projected display surface. The private projected display surface is
dynamically updated on the student making any writing or drawing action. In one
example, the student may share the text or drawings generated as a result of writing or
drawing actions with other students in various file formats, such as an image, a text file,
a document and a video. Thus, the student may take notes in a lecture, using his/her
smart device, by making writing or drawing actions.
[0033] In one example, the instructor or the student may perform gestures to
have his/her notes printed on a printer with which the smart education system or the smart
device is connected. The smart education system or the smart device may also upload
the notes on any web portal or File Transfer Protocol (FTP) server with which the smart
education system or the smart device is connected.
[0034] In one example, while rendering the main projected display surface, the
instructor may perform a gesture on a particular view, visible on the main projected
display surface, whose size is to be scaled. This gesture may be identified by the smart
education system as a 'User Interface (UI) scaling gesture'. As a result of this, an option
may be displayed on the main projected display surface to scale the particular view. The
scaling operation may be performed by the instructor by providing a 2D input or a 3D
gesture with respect to the main projected display surface. In another example, the view in the
projected display surface may be scaled up or scaled down, while retaining the original
size in the smart education system.
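The UI scaling behaviour above can be illustrated with a short sketch. All names and the dictionary representation of a view are assumptions for illustration; the point is only that the projected copy of a view can be scaled while the original size on the smart education system is retained.

```python
# Hypothetical sketch of the 'UI scaling gesture' result: scale a selected
# view for the projected display surface while optionally retaining the
# original size on the smart education system.

def scale_view(view, factor, retain_original=True):
    """Return (system_view, projected_view) after applying a scale factor."""
    scaled = {
        "name": view["name"],
        "width": view["width"] * factor,
        "height": view["height"] * factor,
    }
    # If the original is retained, only the projected copy is scaled.
    return (view, scaled) if retain_original else (scaled, scaled)

original, projected = scale_view({"name": "cone", "width": 120, "height": 90}, 2.0)
print(original["width"], projected["width"])  # 120 240.0
```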
[0035] Thus, the systems and methods for imparting education in a smart
classroom environment, as described above, enable students to collaborate or discuss
during a lecture by using gestures. The aforementioned systems and methods also
enable the students or the instructor to conveniently illustrate their ideas and share the
same with the whole classroom. The systems and methods also facilitate sharing of
notes and other materials amongst the students and the instructor. The systems and
methods also facilitate interactive discussion of problems and solutions between the
instructor and the students during a lecture class.
[0036] The above systems and methods are further described in conjunction
with the following figures. It should be noted that the description and figures merely
illustrate the principles of the present subject matter. It will thus be appreciated that
those skilled in the art will be able to devise various arrangements that, although not
explicitly described or shown herein, embody the principles of the present subject matter
and are included within its spirit and scope.
[0037] The manner in which the systems and methods for imparting education in
a smart classroom environment are implemented shall be explained in detail with
respect to Figures 1a, 1b, 2, 3, 4, 5, 6 and 7. While the aspects of described systems and
methods for imparting education in a smart classroom environment can be implemented
in any number of different computing systems, environments, and/or configurations, the
examples and implementations are described in the context of the following system(s).
[0038] Figure 1a schematically illustrates the components of a smart classroom
environment 100, according to an example of the present subject matter. In one
example, the smart classroom environment 100 includes a smart education system 102,
operated by an instructor, and a plurality of smart devices 104-1, 104-2, 104-3 and 104-
4, wherein each of the smart devices 104 is operated by a student. In one example, the
smart education system 102 may be implemented as various computing systems, such as
mobile phones, smart phones, tablets, laptops, phablets, Personal Digital Assistants
(PDAs), desktops, workstations and servers. In one example, the smart devices 104 may
be implemented as various computing systems, such as mobile phones, smart phones,
tablets, laptops, phablets and Personal Digital Assistants (PDAs).
[0039] In one example, the smart education system 102 may include a projector
105. The projector 105 may be integrated with the smart education system 102. In one
example, the projector 105 may be an accessory which may be communicatively
coupled with the smart education system 102. In one example, the smart education
system 102 has its projection on a large projected display surface 106, visible to the whole
classroom. The projected display surface 106 corresponding to the smart education system
102 is henceforth referred to as the main projected display surface 106.
[0040] In one example, the smart education system 102 may be connected with
the smart devices 104 over a communication network 108. The smart devices 104 and
the smart education system 102 may synchronize data over the communication network
108. In one example, the communication network 108 may include Global System for
Mobile Communication (GSM) network, Universal Mobile Telecommunications
System (UMTS) network, Long Term Evolution (LTE) network, Personal
Communications Service (PCS) network, Time Division Multiple Access (TDMA)
network, Code Division Multiple Access (CDMA) network, Next Generation Network
(NGN), Public Switched Telephone Network (PSTN), and Integrated Services Digital
Network (ISDN). Other examples of the communication network 108 may include cloud
networks, short range networks, long range networks, machine to machine (M2M)
communication networks, Gigabit Wi-Fi networks, and so on. The communication
network 108 may also include any other network that uses any of the commonly used protocols,
for example, Hypertext Transfer Protocol (HTTP), Transmission Control
Protocol/Internet Protocol (TCP/IP), near field communication (NFC) and Bluetooth
protocols. In one example, the smart education system includes a data synchronization
module 110 for synchronizing data with one or more smart device(s) 104 over the
communication network 108.
[0041] In one example, the smart education system may project or render any
object, such as a cone 112, on the main projected display surface 106. In said example,
each smart device 104 and the smart education system 102 may share with its
synchronized smart devices 104 the display of each smart device 104. Post
synchronization, the displays of the synchronized smart devices 104 and the smart
education system 102 shall be the same.
[0042] In one example, the smart education system 102 has its projection, such
as the cone 112, on the main projected display surface 106 for being visible to the whole
classroom. Similarly, some of the smart devices 104 used by the students may have their
own projected display, i.e., a private projected display on a private projected display
surface.
[0043] In operation, after synchronization between the instructor’s smart
education system 102 and the student’s smart device(s) 104, a unified interactive
projected surface may be created comprising the private projected display surface(s) of
one or more smart device(s) 104 and the main projected display surface created by the
smart education system 102. The operations of the smart education system 102 and the
smart device(s) 104 are described in the context of a geometry lecture in which
the various spatial properties of the cone 112, such as total area, lateral surface area and
volume of the cone 112, are being explained.
[0044] Post synchronization, any student and the instructor may interact
dynamically and simultaneously, with the display on the main projected display surface
106, at any instance of time, forming the unified interactive projected surface. For
example, the instructor and the students may discuss the properties of the cone 112 in an
interactive manner. For the same, the students may use their smart devices 104 to
project their private projected display on the main projected display surface 106. For
example, a first student may use the smart device 104-1 to display the cone 114-1 on the
main projected display surface 106 and a second student may use the smart device 104-2
to display the cone 114-2 on the main projected display surface 106. The operations
performed by the smart device(s) 104 or the smart education system 102 as a result of
the interaction may be visible to the whole classroom via the main projected display surface
106. Further, the display on the main projected display surface 106 may also be
replicated on the private projected display surface of each synchronized smart device
104. The synchronization of the displays of the smart devices and the smart education
system helps in generating a smart graphical interaction in the classroom, where the
students and the instructors may collaborate, discuss and interact, leading to inclusive
and smart education. The operations of the smart device 104 and the smart education
system 102 are explained in greater detail in context of Figure 1b.
[0045] Figure 1b schematically illustrates the components of the smart
education system 102 and the smart device 104 in a network environment 150, for
imparting education in the smart classroom environment 100, according to an example
of the present subject matter.
[0046] In one example, the smart education system 102 includes a processor
152-1, and a memory 154-1 connected to the processor 152-1. In one example, the smart
device 104 includes a processor 152-2 and a memory 154-2 connected to the processor
152-2. The processors 152-1 and 152-2 may include microprocessors, microcontrollers,
and logic circuitries. Among other capabilities, the processors 152-1 and 152-2 may
fetch and execute computer-readable instructions, stored in the memory 154-1 and 154-
2. The memory 154-1 and 154-2, communicatively coupled to the processors 152-1 and
152-2 respectively, can include any non-transitory computer-readable medium known in
the art including, volatile memory and non-volatile memory, such as Read Only
Memory (ROM), flash memories, hard disks, optical disks, and magnetic tapes.
[0047] Further, the smart education system 102 includes interfaces 156. The
interfaces 156 may include a variety of commercially available interfaces, for example,
interfaces for peripheral device(s), such as data input/output devices, referred to as I/O
devices, storage devices and network devices. The I/O device(s) may include wireless
interfaces, wireless antennas, Universal Serial Bus (USB) ports, Ethernet ports, host bus
adaptors, and their corresponding device drivers.
[0048] Further, the smart education system 102 and the smart devices 104 may
include modules 158-1 and 158-2 respectively. The modules 158-1 and 158-2 may be
coupled to the processors 152-1 and 152-2 respectively. The modules 158-1 and 158-2,
amongst other things, include routines, programs, objects, components, and data
structures, which perform particular tasks or implement particular abstract data types.
The modules 158-1 and 158-2 may also be implemented as logic circuitries and/or any
other device or components that manipulate signals based on computer-readable
instructions.
[0049] In one example, the modules 158-1 of the smart education system 102
include an authentication module 162, the data synchronization module 110, a gesture
processing module 164, a projection module 166 and other module(s) (not shown in
figure). The other module(s) may include computer-readable instructions that
supplement applications or functions performed by the smart education system 102.
[0050] Further, the smart education system 102 may also, include data 160. In
one implementation, the data 160 includes content data 170 and other data (not shown in
figure). The other data may include data generated and saved by the modules 158-1
for providing various functionalities of the smart education system 102.
[0051] In one example, the modules 158-2 of the smart devices 104 include a
student data synchronization module 174, a student projection module 176, a student
gesture detection module 178 and other module(s) (not shown in figure). The other
module(s) may include computer-readable instructions that supplement applications or
functions performed by the smart device 104. In one example, the smart device 104
further includes an image capturing device, such as a camera 172.
[0052] In operation, a student may interact with the smart devices 104 using user
inputs, in the form of a gesture, touch or motion, on the screen or above the screen of the
smart devices 104. In one example, the gesture, touch or motion may be performed on
the private projected display surface or above the private projected display surface of the
smart devices 104. The student gesture detection module 178 may recognize touch,
motion, two dimensional (2D) gestures and 3D gestures made by the student with respect to
the screen of the smart device 104 or with respect to the private projected display surface,
and perform a function associated with the user input. Similarly, the gesture processing
module 164 of the smart education system may recognize user inputs, such as touch,
motion, two dimensional (2D) gestures and 3D gestures, made by the instructor with respect
to the screen of the smart education system 102, and perform a function associated with the
user input. In one example, the gesture, touch or motion may be performed on the
main projected display surface 106 or above the main projected display surface 106.
[0053] In one example, based on user input, the student data synchronization
module 174 may request the smart education system 102 for synchronization. In
operation, the student data synchronization module 174 may transmit authentication data
associated with the smart device 104. The authentication module 162 of the smart
education system 102 may determine the authenticity of the smart device 104. The
authentication details may be a username password combination, an enrollment number
and Personal Identification Number (PIN) combination, a Media Access Control (MAC)
address, an Internet Protocol (IP) address, and so on. The authentication details indicate
that the smart device 104 is of a student who is enrolled in a course or is otherwise
eligible to attend the lecture. Based on the received authentication details, the
authentication module 162 determines whether the smart device 104 is authenticated. In
one example, the authentication module 162 may determine whether the received
authentication details from the smart device 104 match with a pre-defined list of
authenticated smart devices 104, stored in the smart education system 102.
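The matching against a pre-defined list described above can be sketched as a simple set-membership check. The device identifiers, the PIN format, and the function name below are illustrative assumptions, not details from this specification:

```python
# Hypothetical sketch of the authentication module's check: received
# authentication details are matched against a pre-defined list of
# authenticated smart devices. All identifiers below are made up.

AUTHENTICATED_DEVICES = {
    ("EN2013-0042", "4821"),        # enrollment number and PIN combination
    ("aa:bb:cc:dd:ee:ff", None),    # MAC address entry with no secret
}

def is_authenticated(identifier, secret=None):
    """Return True if the (identifier, secret) pair is in the pre-defined list."""
    return (identifier, secret) in AUTHENTICATED_DEVICES

print(is_authenticated("EN2013-0042", "4821"))   # True
print(is_authenticated("EN9999-0001", "0000"))   # False
```

In practice the pre-defined list would be stored in the data 160 of the smart education system 102 rather than hard-coded.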
[0054] In one example, the instructor may also have the option of granting or
denying the request of the smart device 104. In one example, the data synchronization
module 110 determines whether the instructor has denied the request of the smart device
104. If the instructor has not denied the request of the smart device 104, then the data
synchronization module 110 provides the access permissions to the smart device 104. The access
permissions may include right of uploading data to the smart education system 102 or
right of both uploading data to and downloading data from the smart education system
102. Thereafter, the data synchronization module 110 receives the data from the student
synchronization module 174 of the smart device 104. The data synchronization module
110 may then, process the received data and transmit it to the projection module 166 for
being rendered on the main projected display surface 106.
[0055] Thereafter, if the student interacts with his/her private projected display
or with his/her smart device 104, the display of the smart device 104 may change or get
updated. In one example, the student data synchronization module 174 transmits the
updated data associated with the display of the smart device to the data synchronization
module 110 of the smart education system 102. The projection module 166 thereafter
projects or renders the updated display on the main projected display surface 106.
[0056] In one example, the student data synchronization module 174 of the
smart device 104 may send a synchronization request to the smart education system 102
to display on its screen and on the main projected display surface 106, the same display
as visible on the smart device 104 at the time of sending the request. In one example, the
data synchronization module 110 of the smart education system 102 may run one of its
ports on a synchronization request listen mode to intercept and receive synchronization
requests from the smart device 104.
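The listen mode described above may be sketched as binding a port and waiting for incoming connections. The choice of TCP as the transport is an assumption for illustration; the specification does not fix a transport:

```python
import socket

# Assumed sketch of running a port in a "synchronization request listen mode":
# the smart education system binds a TCP port and listens for incoming
# synchronization requests from smart devices.

def open_sync_listener(host="127.0.0.1", port=0):
    """Bind a listening socket; port=0 asks the OS for any free port."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((host, port))
    server.listen()
    # server.accept() would block here until a smart device connects.
    return server

listener = open_sync_listener()
print(listener.getsockname()[1] > 0)  # True: a real port was assigned
listener.close()
```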
[0057] Thus, on acceptance of the synchronization request by the smart
education system 102, the smart education system 102 will synchronize with the student
data synchronization module 174 and display, on the main projected display surface
106, the same view as displayed on the screen of the smart device 104. During the
period of synchronization, if the display of the main projected display surface 106, due
to gestures performed by the instructor, is updated, the private projected display surface
of the smart device 104 will also be updated with the same display. Similarly, any
update in the display of the main projected display surface 106, due to gestures
performed by the student will also result in the same update in the private projected
display surface of the smart device 104. This facilitates the student in interactively
discussing his/her ideas with other students and the instructor.
[0058] In one example, the synchronization request may comprise the
authentication details of the smart device 104, the activity which is to be run on the
smart education system 102 and projected on the main projected display surface 106,
views associated with the activity, and data associated with the views. This ensures that
the running status of components of an application on the smart education system 102
and the smart device 104 are the same. In one example, the synchronization request may also
include the resolution and size of the screen of the smart device 104 to facilitate
calibration and conversion of coordinates of gestures based on the calibration.
[0059] In one example, after sending the synchronization request, the smart
device 104 saves the state of all components related to the activity. If the smart device
104 changes the state of any component related to the activity in the period between
sending of the synchronization request to the smart education system 102 and
acceptance of the request by the smart education system 102, then on acceptance of the
synchronization request, the state of all components is reverted back to the state of the
components at the time of sending the synchronization request.
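The save-and-revert behaviour of paragraph [0059] can be sketched as a snapshot taken at request time and restored on acceptance. The `Activity` class and its component fields are illustrative, not taken from the specification:

```python
import copy

# Illustrative sketch: the smart device snapshots the state of all components
# related to the activity when the synchronization request is sent, and
# reverts to that snapshot when the request is accepted.

class Activity:
    def __init__(self, components):
        self.components = components
        self._snapshot = None

    def send_sync_request(self):
        # Save the state of all components at the time of sending the request.
        self._snapshot = copy.deepcopy(self.components)

    def on_request_accepted(self):
        # Revert any changes made while waiting for acceptance.
        if self._snapshot is not None:
            self.components = copy.deepcopy(self._snapshot)

activity = Activity({"slide": 3, "zoom": 1.0})
activity.send_sync_request()
activity.components["slide"] = 7          # change made while waiting
activity.on_request_accepted()
print(activity.components["slide"])       # 3: reverted to the saved state
```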
[0060] In one example, on completion of the synchronization and the initial
setup of the foreground window, its views, and their states and data, the data
synchronization module 110 and the student data synchronization module 174 may run
one of the ports of the smart education system 102 and the smart device 104
respectively, in a synchronized device input listener mode to facilitate the student or the
instructor to interact with either of the displays of the smart education system 102 and
the smart device 104, by performing gestures with respect to the smart education system
102 and the smart device 104, during the period of synchronization. During the period
of synchronization, any gesture performed by the student or the instructor with respect
to the smart education system 102 or the smart device 104 is processed by the gesture
processing module 164 and the student gesture detection module 178, respectively. The
gesture processing module 164 and the student gesture detection module 178 convert
the gesture into spatial coordinates with respect to smart education system 102 or the
smart device 104, respectively. These spatial coordinates are then, transferred between
the smart education system 102 and the smart device 104 during the period of
synchronization. In one example, the gesture may be performed by the student or the
instructor with respect to the private projected display surface or the main projected
display surface 106.
[0061] Therefore, during the period of synchronization, as a result of a gesture
with respect to any of the smart education system 102 and the smart device 104, both the
smart education system 102 and the smart device 104 update their displays based on the
gesture. In another example, the gesture may also be made with respect to the private
projected display surface or the main projected display surface 106. Thus, a student or
an instructor may perform a gesture, either with respect to his/her private projected
display surface or the main projected display surface 106, and both the projected display
surfaces may update and have similar views.
[0062] In one example, if the instructor interacts with the display on the main
projected display surface 106, the gesture processing module 164 detects the interaction.
The gesture processing module 164 identifies an action associated with the gesture made
by the instructor and performs the action. The projection module 166 then, updates the
display on the main projection display surface 106 based on the action. The data
synchronization module 110 transmits the data associated with the updated display to
the smart device 104. Based on the received data, the student projection module 176
updates the private projected display of the smart device 104. Thus, as a result of the
instructor’s interaction with the display on the main projected display surface 106, the
display of the synchronized smart devices 104 is also updated. In one example, the data
transmitted during synchronization is associated with at least one of the coordinates of
the gesture with respect to the smart device 104, the size and resolution of a screen of
the smart device 104, and the state of running components of an application running on
the foreground of the smart device 104.
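A hypothetical shape for the synchronization data enumerated above (gesture coordinates, screen size and resolution, and foreground component state) might look like the following; every field name and value is an assumption for illustration:

```python
import json

# Illustrative payload for the data transmitted during synchronization.
# The specification lists the kinds of data exchanged but not their format.

sync_payload = {
    "gesture": {"x": 120, "y": 340, "depth": 0.0},
    "screen": {"width_px": 1280, "height_px": 800, "diagonal_in": 10.1},
    "foreground_state": {"activity": "geometry_lesson", "view": "cone_3d"},
}

# Serialize for transmission and parse on the receiving side.
encoded = json.dumps(sync_payload)
decoded = json.loads(encoded)
print(decoded["gesture"]["x"])  # 120
```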
[0063] In one example, the gesture processing module 164 and the student
gesture detection module 178 may be structurally and functionally similar. The gesture
processing module 164 and the student gesture detection module 178 detect and
process the gestures made by the instructor and the student, respectively. In one
example, the gesture processing module 164 and the student gesture detection module
178 may identify the gestures based on the coordinates of the gestures, such as the
planar coordinates of the gesture and the depth range of the gesture. In one example, the
coordinates of the gesture may be determined based on a pre-defined reference point on
the main projected display surface 106.
[0064] In one example, the smart device 104 and the smart education system 102
may have to be calibrated so as to be able to transform gestures performed, with respect
to the private projected display surface or the main projected display surface 106, into
actions. For example, moving the index finger and the thumb apart may be a gesture
which indicates zooming in. The calibration helps in identifying the percentage of the
zoom-in factor for a movement of the index finger and the thumb ‘x’ units apart at a
distance of ‘y’ units from the private projected display surface or the main projected
display surface 106. The calibration also helps in determining the plane of a gesture and
the shape of a gesture with respect to the plane of the smart device 104 and/or the plane
of the smart education system 102.
[0065] In one example, the distance between the centre of the main projected
display surface 106 and the centre of the screen of the smart device 104 is detected by the camera
172 of the smart device 104. In one example, the camera 172 obtains a background
image in the direction of the main projected display surface 106. In one example, the
student projection module 176 processes the input from the camera 172, to determine the
depth of the background. In one example, the student projection module 176 does
background measurement, which is the measurement of the received signal when no
signal is being transmitted by a proximity sensor of the smart device 104. In another
example, the background measurement is done by segmenting out the unwanted objects
from the background image, and thereafter, by comparing the obtained background image
with the reference static image of the main projected display surface 106. The student
projection module 176 may also determine the angular position of a plane of the main
projected display surface 106 with respect to a plane of the screen of the smart device
104. In one example, the student projection module 176 may calibrate the display of the
smart device 104 based on the ratio of the dimensions of the main projected display
surface 106 and the dimensions of the screen of the smart device 104.
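The ratio-based calibration described above reduces, in a minimal sketch, to computing per-axis scale factors between the two surfaces; the dimension values are illustrative:

```python
# Sketch (assumed) of calibrating the smart device display by the ratio of
# the main projected display surface's dimensions to the screen's dimensions.

def calibration_ratio(main_w, main_h, screen_w, screen_h):
    """Return per-axis scale factors mapping screen units to surface units."""
    return (main_w / screen_w, main_h / screen_h)

# A 300x200 cm projected surface and a 15x10 cm device screen (illustrative).
sx, sy = calibration_ratio(main_w=300.0, main_h=200.0,
                           screen_w=15.0, screen_h=10.0)
print(sx, sy)  # 20.0 20.0
```

With these factors, a coordinate measured on the device screen can be mapped to the corresponding point on the main projected display surface 106.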
[0066] In one example, the gestures made by the instructor and the student(s) are
processed by the gesture processing module 164 and the student gesture detection
module 178, to account or compensate for the angular position of the smart device 104,
with respect to the plane of the main projected display surface 106. In one example, the
student gesture detection module 178 may transform gestures into user inputs from the
student based on the direction of the main projected display surface 106 with respect to
the screen of the smart device 104, the ratio of the dimensions of the main projected
display surface 106 to the dimensions of the screen of the smart device 104, and the
spatial information of the main projected display surface 106 with respect to the smart device 104.
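Compensating for the angular position can be sketched, under the simplifying assumption that the tilt reduces to a single in-plane rotation angle, as a coordinate rotation; a full implementation would use the 3D orientation determined from the camera 172:

```python
import math

# Minimal sketch (not from the specification) of compensating a gesture's
# planar coordinates for the angular position of the smart device's screen
# relative to the plane of the main projected display surface 106.

def compensate_angle(x, y, angle_deg):
    """Rotate gesture coordinates by -angle_deg to undo the device's tilt."""
    a = math.radians(-angle_deg)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

x2, y2 = compensate_angle(1.0, 0.0, 90.0)
print(round(x2, 6), round(y2, 6))  # 0.0 -1.0
```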
[0067] In one example, the student gesture detection module 178 may classify a
user input as either an input directly on the private projected display surface of the smart
device 104 or a spatial gesture over the private projected display surface of the smart
device 104. If the student has performed any motion on the private projected display
surface of the smart device 104, then the student gesture detection module 178 processes
such motion as a user touch or motion input at the corresponding coordinates of the
screen of the smart device 104. If the user has performed any gesture or action over the
private projected display surface of the smart device 104, then the student gesture
detection module 178 processes such a gesture as a 3D spatial gesture with a
corresponding shape, motion, speed and location with respect to the private projected
display surface of the smart device 104. The student gesture detection module 178 may
also determine the shape, location, direction and speed of a user input with respect to
the smart device 104 to process the user input.
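The two-way classification of paragraph [0067] can be sketched with a depth threshold separating on-surface motion from spatial gestures; the threshold value and label names are assumptions:

```python
# Illustrative classifier for the two input types described above: motion on
# the private projected display surface is treated as a touch input, while
# motion above it is treated as a 3D spatial gesture.

TOUCH_DEPTH_THRESHOLD = 0.01  # metres above the surface; illustrative value

def classify_input(depth_above_surface):
    """Classify a user input by its measured height above the surface."""
    if depth_above_surface <= TOUCH_DEPTH_THRESHOLD:
        return "touch"
    return "3d_gesture"

print(classify_input(0.0))    # touch
print(classify_input(0.15))   # 3d_gesture
```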
[0068] In one example, the smart classroom environment 100 may include a 3D
projector which may be communicatively coupled with the smart education system 102.
Thus, the student gesture detection module 178 may detect certain gestures as inputs for
formation of a virtual 3D object which may be projected by the 3D projector on the main
projected display surface 106. The instructor and the students may perform gestures
which may be processed by the gesture processing module 164 and the student gesture
detection module 178 to facilitate manipulation of 3D objects. In one example, the
manipulation of the 3D virtual object may result in formation of a new 3D virtual object,
or may result in changing of direction, rotating, scaling of displayed 3D virtual object.
[0069] In one example, the student gesture detection module 178 may provide
the option of scaling up or scaling down an object displayed on either the main projected
display surface 106 or the private projected display surface. For example, if a user
interface unit, such as an on-screen keyboard, as displayed on either the main projected
display surface 106 or the private projected display surface, is too large or too small,
then the student may find it inconvenient to interact with the object by providing user
inputs by touch or hand gestures with respect to the main projected display surface 106
or the private projected display surface. In such cases, the student gesture detection
module 178 may detect the student performing a gesture associated with UI scaling, i.e.,
the user interface (UI) scaling gesture, and perform the action associated with the UI
scaling gesture to scale up or scale down the user interface unit. In one example, the
student may perform gestures to associate or tag particular views with the UI scaling
gesture to scale up or scale down the particular view. In another example, the particular
views that may be scaled up or scaled down, i.e. have scaling capabilities, may be
associated with a scaling tag. The association of the views having scaling capabilities
with the scaling tag facilitates easy identification of views which may be scaled on either
the main projected display surface 106 or the private projected display surface. In
another example, the view on either the main projected display surface 106 or the private
projected display surface may be scaled up or scaled down while retaining the original
size of the view on the screen of the smart education system 102. The UI scaling gesture
is explained in greater detail with respect to Figure 1c.
[0070] In one example, the student gesture detection module 178 may provide
the students with the option of selecting either his/her private projected display surface
or the main projected display surface 106 as the reference with which the student will
perform gestures. If the student selects the private projected display surface as the
reference for gestures, then the gestures will be detected by sensors installed in the smart
device 104. If the student selects the main projected display surface 106, the gestures
will likewise be detected by sensors installed in the smart device 104. However, in this case,
the smart device 104 may also determine its location with respect to the main projected
display surface 106 using the camera 172.
[0071] In one example, the student may also perform writing and drawing actions
on a projected surface which is horizontally above the smart device 104. The student
performs writing or drawing actions on the projected surface, which are detected and
processed by the student gesture detection module 178. The processed data is
forwarded to the student projection module 176 to update the private projected display
based on the user input. In one example, the student may take notes, using electronic
accessories, such as an S Pen or a stylus, or directly using his/her hand on the private
projected display surface. The student gesture detection module 178 may detect all
writing or drawing actions made by student on the private projected display surface and
forward the same to the student projection module 176 for being displayed on the private
projected display.
[0072] In one example, the drawing actions or the writing actions of the student
may be provided as a video feed by the student data synchronization module 174 to the
smart devices 104 of other students. This is useful, for example, in demonstrating a
complicated diagram in Engineering Graphics, where it is very difficult to understand
the drawing steps of the diagram from the final diagram alone.
[0073] In one example, the instructor or the student may perform a gesture to
share notes with other students. The shared note may also be in the form of a video,
illustrating the steps of formation of the notes. Also, students or instructors may perform
gestures to get their notes printed from a printer to which the smart devices 104 or the
smart education system 102 is connected.
[0074] Thus, the smart devices 104 and the smart education system 102 facilitate
creation of a unified interactive surface comprising private projected display surfaces of
students and the main projected display surface 106 of the classroom. The smart devices
104 and the smart education system 102 facilitate the students and the instructor in
interacting dynamically and simultaneously with either of the projected display surfaces. Thus, this
arrangement creates a smart classroom, where the students and the instructor can
collaborate, discuss and interact, leading to inclusive and smart education.
[0075] Figure 1c schematically illustrates the scaling up and scaling down
operation performed by the smart education system 102 and the smart device 104 in a
smart classroom environment, according to an example of the present subject matter. In
one example, the cone 112 is displayed on the main projected display surface 106. In
order to facilitate interaction by the students or the instructors, a user interface unit, such
as an on-screen keyboard 190, may be displayed on the main projected display surface
106.
[0076] In operation, on detecting a UI scaling gesture performed by the student
or the instructor, the projection module 166 may display a UI scale 192 which indicates
the extent to which a view associated with the UI scaling gesture is zoomed. In one
example, the UI scale 192 may include an indicator 194 which indicates a current level
of zoom or scale of the associated view. In the UI scale 192, the point 196 indicates the
maximum level to which the associated view may be scaled down and the point 198 may
indicate the maximum extent to which the associated view may be scaled up. In one
example, the views that may be scaled up or down may be associated with a scaling tag. The
scaling tag facilitates easy identification of the views that may be scaled up or scaled
down. In other words, the location at which the student or the instructor performs the
scaling gesture may correspond to multiple views. The association of the scaling tag with
a view facilitates associating the view from amongst the multiple views with the UI
scaling gesture. In one example, the maximum extents of scaling down and scaling up
may be expressed as percentage of the actual size of the associated view. For example,
the maximum extent of scaling down may be 5% of the actual size of the view and the
maximum extent of scaling up may be 400% of the actual size of the associated view.
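Using the example limits given above (5% and 400% of the actual size), the clamping applied to a UI scaling gesture may be sketched as follows; the function name is illustrative:

```python
# Sketch of clamping a UI scaling gesture to the limits described above: the
# associated view may be scaled down to 5% and up to 400% of its actual size.

MIN_SCALE, MAX_SCALE = 0.05, 4.0  # 5% and 400% of the actual size

def apply_ui_scale(requested_scale):
    """Clamp the requested scale factor to the permitted range."""
    return max(MIN_SCALE, min(MAX_SCALE, requested_scale))

print(apply_ui_scale(10.0))   # 4.0  (capped at the scale-up limit)
print(apply_ui_scale(0.01))   # 0.05 (capped at the scale-down limit)
print(apply_ui_scale(1.5))    # 1.5  (within range, unchanged)
```

The indicator 194 on the UI scale 192 would then display the clamped value between the points 196 and 198.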
[0077] In one example, the student or the instructor may perform pre-defined
gesture to associate a portion of the view displayed on the main projected display surface
106 with the UI scaling gesture. For example, the student or the instructor may select the
on-screen keyboard 190 for scaling up as the current size of the on-screen keyboard 190
may make it difficult for the student or the instructor to interact with the on-screen
keyboard 190. On associating the on-screen keyboard 190 with the UI scaling gesture,
the student or the instructor may scale up the on-screen keyboard 190 by performing the
UI scaling gesture associated with scaling up. The indicator 194 may dynamically
display the extent of scaling up of the on-screen keyboard 190.
[0078] On completion of the UI scaling gesture, the student or the instructor may
restore the on-screen keyboard 190 to its original size. The scaling up and scaling down
not only facilitate interaction by the instructor or the students with the views of the main
projected display surface 106 but also help highlight the finer points during a lecture. For
example, the instructor may scale up a structure of a hydrocarbon, to highlight to the
students, a bond which was formed due to reaction with a free radical.
[0079] Figure 2 illustrates a method 200 for calibrating a display of the smart
device 104 for imparting education in a smart classroom environment 100, according to
an example of the present subject matter. Figure 3 illustrates a method 300 for
synchronizing the smart device 104 and the smart education system 102 for imparting
education in a smart classroom environment 100, according to an example of the present
subject matter. Figure 4 illustrates a method 400 for updating the display of the smart
device 104 and the smart education system 102, based on an input by a student, in a
smart classroom environment 100, according to an example of the present subject
matter. Figure 5 illustrates a method 500 for updating the display of the smart device
104 and the smart education system 102, based on an input by an instructor, in a smart
classroom environment 100, according to an example of the present subject matter.
Figure 6 illustrates a method 600 for updating display in the smart classroom environment
100, according to an example of the present subject matter. Figure 7 illustrates a method
700 for updating display in a smart classroom environment 100, according to another
example of the present subject matter.
[0080] The order in which the methods 200, 300, 400, 500, 600 and 700 are
described is not intended to be construed as a limitation, and any number of the
described method blocks can be combined in any order to implement methods 200, 300,
400, 500, 600 and 700, or an alternative method. Additionally, individual blocks may be
deleted from the methods 200, 300, 400, 500, 600 and 700 without departing from the
spirit and scope of the subject matter described herein. Furthermore, the methods 200,
300, 400, 500, 600 and 700 may be implemented in any suitable hardware, machine
readable instructions or combination thereof.
[0081] In one example, the steps of the methods 200, 300, 400, 500, 600 and
700 can be performed by programmed computers. Herein, some examples are also
intended to cover program storage devices, for example, digital data storage media,
which are machine or computer readable and encode machine-executable or
computer-executable programs of instructions, where said instructions perform some or all of the
steps of the described methods 200, 300, 400, 500, 600 and 700. The program storage
devices may be, for example, digital memories, magnetic storage media such as
magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage
media.
[0082] With reference to method 200 as depicted in Figure 2, as illustrated in
block 202, a request for calibration of display of a smart device 104 is received. In one
implementation, the student gesture detection module 178 receives the request for
calibration of display of the smart device 104 from the student.
[0083] As depicted in block 204, the relative position of the smart device 104
with respect to a main projected display surface 106 is determined. In one example, the
student projection module 176 determines the relative position of the smart device
104 with respect to a main projected display surface 106.
[0084] As shown in block 206, a ratio of the dimensions of the main projected
display surface to the dimensions of a screen of the smart device 104 is determined. In
one example, the student projection module 176 determines the ratio of the dimensions
of the main projected display surface to the dimensions of a screen of the smart device
104.
[0085] At block 208, the display of the smart device 104 is calibrated based on
the ratio and the relative position. In one example, the student projection module 176
calibrates the display of the smart device 104 based on the ratio and the relative
position.
[0086] With reference to method 300 as depicted in Figure 3, as illustrated in
block 302, a request to synchronize, i.e., a synchronization request, is received from a
smart device 104. In one example, the authentication module 162 of the smart education
system 102 receives the request to synchronize from the student synchronization module
174 of the smart device 104.
[0087] At block 304, authentication details from the smart device 104 are
received. In one example, the authentication module 162 of the smart education system
102 receives the authentication details from the student synchronization module 174 of
the smart device 104. The authentication details may be a username password
combination, an enrollment number and personal identification number (PIN)
combination, a media access control (MAC) address, an internet protocol (IP) address,
and so on. The authentication details indicate that the smart device 104 is of a student
who is enrolled in a course or is otherwise eligible to attend the lecture.
[0088] As shown in block 306, it is determined whether the smart device 104 is
authenticated. In one example, the authentication module 162 determines whether the
smart device 104 is authenticated based on the received authentication details. For
example, the authentication module 162 may determine whether the received
authentication details from the smart device 104 match with a pre-defined list of
authenticated smart devices 104 stored in the smart education system 102.
[0089] If at block 306, the smart device 104 is determined to be authenticated,
then as shown in block 308, it is determined whether an instructor has denied the request
of the smart device 104. In one example, the data synchronization module 110
determines whether the instructor has denied the request of the smart device 104. In one
example, on seeing a synchronization request from a smart device 104, the instructor
may choose to accept or deny the request by performing a gesture. The gesture
processing module 164 determines an action associated with the gesture. Based on the
associated action, the data synchronization module 110 determines whether the
instructor has denied the request of the smart device 104.
[0090] If at block 308, it is determined that the instructor has not denied the
request of the smart device 104, then as illustrated in block 310, access permissions are
provided to the smart device 104. In one example, the data synchronization module 110
provides the access permissions to the smart device 104. The access permissions may
include right of uploading data to the smart education system 102 or right of both
uploading data to and downloading data from the smart education system 102.
[0091] As depicted in block 312, data is received from the smart device 104. In
one example, the data synchronization module 110 receives the data from the student
synchronization module 174 of the smart device 104. In one example, the data
associated with the display of the smart device 104 may be received from the smart
device 104. The data synchronization module 110 may then process the received data
and transmit it to the projection module 166 for being rendered on the main projected
display surface 106.
[0092] If at block 306, the smart device 104 is determined not to be
authenticated or if at block 308, it is determined that the instructor has denied the
request of the smart device 104, then as shown in block 314, a notification is generated
for the smart device 104. In one example, the authentication module 162 of the smart
education system 102 may generate a notification indicating error of authentication or
denial of permission by the instructor and transmit the same to the smart device 104 for
being displayed to the user.
[0093] With reference to method 400 as depicted in Figure 4, as illustrated in
block 402, a gesture is received from a student. In one example, the student gesture
detection module 178 receives the gesture from the student. In said example, the student
gesture detection module 178 may obtain data from an image capturing device, such as a
camera, integrated with the smart device 104 and process the obtained data to identify
the gesture.
[0094] As shown in block 404, the coordinates associated with the gesture are
determined. In one example, the student gesture detection module 178 determines the
coordinates of the gesture with respect to a pre-defined reference point on the screen of
the smart device 104.
[0095] As depicted in block 406, an action associated with the gesture is
ascertained. In one example, the student gesture detection module 178 ascertains an
action associated with the gesture. In one example, the student gesture detection module
178 determines the gesture and runs a query on a database which stores a mapping of
gestures with their corresponding actions.
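The gesture-to-action lookup of block 406 may be sketched as a dictionary query; the gesture and action names below are illustrative assumptions, not values from the specification:

```python
# Hypothetical sketch of the stored mapping of gestures to their
# corresponding actions, queried for each detected gesture.

GESTURE_ACTIONS = {
    "pinch_out": "zoom_in",
    "pinch_in": "zoom_out",
    "swipe_left": "next_slide",
}

def action_for(gesture):
    """Look up the action for a gesture; unknown gestures map to no action."""
    return GESTURE_ACTIONS.get(gesture, "no_op")

print(action_for("pinch_out"))  # zoom_in
print(action_for("wave"))       # no_op
```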
[0096] At block 408, the coordinates are transformed based on the calibration of
the smart device 104 associated with the student. In one example, the student projection
module 176 transforms the coordinates based on the calibration of the smart device 104
associated with the student.
[0097] As illustrated in block 410, the action associated with the gesture is
performed on the smart device and at least one of the main projected display surface 106
and the private projected display surface is updated based on the same. In one example,
the student projection module 176 performs the action associated with the gesture. The
student projection module 176 may further update the private projected display of the
smart device 104. In one example, the student projection module 176 may further update
the main projected display through the smart education system 102. In one example, the
smart device 104 may transmit state of and data associated with all running components
of an application, running on the foreground of the smart device 104, to the smart
education system 102 for updating the display on the main projected display surface
106.
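The transmission described in paragraph [0097], of the state of and data associated with all running components of the foreground application, can be sketched as a serialized payload; the payload layout and the name `build_sync_payload` are assumptions, since the specification does not define a wire format.

```python
import json

# Illustrative sketch of paragraph [0097]: the smart device packages the
# state of all running components of the foreground application for
# transmission to the smart education system, which uses it to update the
# display on the main projected display surface.
def build_sync_payload(app_name, components):
    """Serialize foreground-application component states for synchronization."""
    return json.dumps({
        "application": app_name,
        "components": [
            {"id": c_id, "state": state} for c_id, state in components.items()
        ],
    })
```

A receiver would deserialize the payload and redraw the main projected display from the reported component states.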
[0098] With reference to method 500 as depicted in Figure 5, as illustrated in
block 502, a gesture is received from an instructor. In one example, the gesture
processing module 164 receives the gesture from the instructor. In said example, the
gesture processing module 164 may obtain data from an image capturing device, such as
a camera, integrated with the smart education system 102 and process the obtained data
to identify the gesture.
[0099] As depicted in block 504, the coordinates associated with the gesture,
with respect to the main projected display surface, are determined. In one example, the
gesture processing module 164 determines the coordinates of the gesture with respect to
a pre-defined reference point on the screen of the smart education system 102 or a
pre-defined reference point on the main projected display surface 106.
[00100] As depicted in block 506, an action associated with the gesture is
ascertained. In one example, the gesture processing module 164 ascertains an action
associated with the gesture. In one example, the gesture processing module 164
determines the gesture and runs a query on a database which stores a mapping of
gestures with their corresponding actions.
[00101] At block 508, the coordinates are transformed based on the calibration of
one or more smart device(s) 104 associated with the students. In one example, the
gesture processing module 164 transforms the coordinates based on the calibration of
each of the smart device 104 associated with the students with respect to the main
projected display surface 106.
[00102] As illustrated in block 510, the action associated with the gesture is
performed, and the display of each of the one or more smart devices is updated based on
the action and the calibration. In one example, the gesture processing module 164
transmits the transformed coordinates to the data synchronization module 110 for being
relayed to each of the one or more smart devices. The student data synchronization
module 174 receives the transformed coordinates and forwards the same to the student
projection module 176 for updating the private projected display. In one example, the
student data synchronization module 174 may also receive state of and data associated
with all running components of an application running on the foreground of the smart
education system 102 for updating the display on the private projected display surface.
[00103] With reference to method 600 as depicted in Figure 6, as illustrated in
block 602, a gesture from a student with respect to a main projected display surface is
received. In one example, the student gesture detection module 178 receives the gesture
from the student. In said example, the student gesture detection module 178 may obtain
data from an image capturing device, such as a camera, integrated with the smart device
104 and process the obtained data to identify the gesture.
[00104] As depicted in block 604, the coordinates associated with the gesture,
with respect to the main projected display surface, are determined. In one example, the
student gesture detection module 178 determines the coordinates of the gesture with
respect to a pre-defined reference point on the screen of the smart device 104.
[00105] As depicted in block 606, an action associated with the gesture is
ascertained. In one example, the student gesture detection module 178 ascertains an
action associated with the gesture. In one example, the student gesture detection module
178 determines the gesture and runs a query on a database which stores a mapping of
gestures with their corresponding actions.
[00106] At block 608, the coordinates are transformed based on the calibration of
the smart device 104 associated with the student. In one example, the student projection
module 176 transforms the coordinates based on the calibration of the smart device 104
associated with the student.
[00107] As illustrated in block 610, the action associated with the gesture is
performed, and the display on the main projected display surface 106 is updated based on the
action. In one example, the student projection module 176 performs the action
associated with the gesture. The student projection module 176 may further update the
private projected display of the smart device 104. The data associated with the updated display may further be transmitted to the smart education system 102 for updating the display on the main projected display surface 106.

CLAIMS:
1. A smart education system (102), for imparting education in a smart classroom environment, comprising:
a processor (152-1); and
a data synchronization module (110), coupled to the processor (152-1), to:
receive a synchronization request from a smart device (104) to synchronize data;
a projection module (166), coupled to the processor (152-1), to
transform the received data based on at least one of a calibration of the smart device (104) and a relative position of the smart device (104) with respect to a main projected display surface (106); and
project the transformed data on the main projected display surface (106).

2. The smart education system (102) as claimed in claim 1 further comprising:
a gesture processing module (164), coupled to the processor (152-1), to:
receive, from an image capturing device, data associated with a gesture performed by an instructor;
identify an action associated with the gesture based on a mapping of gestures with their corresponding actions; and
wherein the projection module (166) further:
performs the identified action; and
updates the display on the main projected display surface (106) based on the identified action.

3. The smart education system (102) as claimed in claim 2, wherein the data synchronization module (110) further transmits the data associated with the updated display so as to update the display of the smart device (104), wherein the data is associated with at least one of the coordinates of the gesture with respect to the smart education system (102), size and resolution of a screen of the smart education system (102), and state of running components of an application running on foreground of the smart education system (102).

4. The smart education system (102) as claimed in claim 1, further comprising an authentication module (162), coupled with the processor (152-1), to:
receive authentication details from the smart device (104), wherein the authentication details comprise at least one of a username password combination of a student associated with the smart device (104), an enrollment number of the student associated with the smart device (104) and personal identification number (PIN) combination, a media access control (MAC) address of the smart device (104), and an internet protocol (IP) address of the smart device (104); and
authenticate the smart device (104) to synchronize with the smart education system (102) based on comparing the received authentication details with a pre-defined list of authenticated smart devices (104).

5. The smart education system (102) as claimed in claim 4, wherein the data synchronization module (110) further transmits data, associated with the display on the main projected display surface (106), which has been updated by a gesture made by at least one of the student and an instructor, to the smart device (104) to update the display of the smart device (104).

6. A smart device (104), for imparting education in a smart classroom environment, comprising:
a processor (152-2);
a student gesture detection module (178), coupled to the processor (152-2), to:
receive, from a camera (172), data associated with a gesture performed by a student with respect to at least one of a main projected display surface (106) and a private projected display surface;
identify an action associated with the gesture based on a mapping of gestures with their corresponding actions; and
a student projection module (176), coupled to the processor (152-2), to:
perform the identified action; and
update the display on the private projected display surface based on the identified action.

7. The smart device (104) as claimed in claim 6, wherein the student projection module (176) further:
determines the relative position of the smart device (104) with respect to at least one of the main projected display surface (106) and the private projected display surface; and
wherein the student gesture detection module (178) further:
receives a request for calibration of display of the smart device (104) from the student;
determines the ratio of the dimensions of at least one of the main projected display surface and the private projected display surface to the dimensions of a screen of the smart device (104); and
calibrates the display of the smart device (104) based on the ratio and the relative position.

8. The smart device (104) as claimed in claim 6, wherein the student gesture detection module (178) further:
obtains data from the camera (172) to identify a gesture;
identifies a location on the screen of the smart device (104) at which the gesture was performed;
determines three-dimensional (3D) spatial data associated with a 3D object placed on the identified location, wherein the 3D object is at least one of a portion and a whole view displayed in at least one of the main projected display surface (106) and the private projected display surface;
ascertains an action associated with the gesture based on a mapping of gestures with their corresponding actions;
performs the ascertained action on the 3D object;
determines the updated 3D spatial data associated with the 3D object; and
wherein the smart device (104) further comprises a student data synchronization module (174), coupled to the processor (152-2), to transmit the updated 3D spatial data to a smart education system (102) for being displayed on the main projected display surface (106).

9. The smart device (104) as claimed in claim 8, wherein the student gesture detection module (178) further:
detects at least one of writing actions and drawing actions made by the student;
converts the at least one of writing actions and drawing actions to text and diagrams based on handwriting recognition technique; and
records the at least one of writing actions and drawing actions as a multimedia file.

10. The smart device (104) as claimed in claim 9, wherein the student data synchronization module (174) shares the multimedia file, associated with the recording of the at least one of writing actions and drawing actions, with at least one of a smart education system (102) and other smart device (104).

11. The smart device (104) as claimed in claim 9, wherein the student data synchronization module (174) shares notes, generated as a result of the at least one of writing actions and drawing actions, with at least one of a smart education system (102) and other smart device (104).

12. The smart device (104) as claimed in claim 9, wherein the student data synchronization module (174) shares notes, generated as a result of the at least one of writing actions and drawing actions, with at least one of a smart education system (102) and other smart device (104).

13. The smart device (104) as claimed in claim 9, wherein the student data synchronization module (174) further transmits data to a smart education system (102) to update the display on the main projected display surface (106) based on the identified action, on the smart device (104) being synchronized with the smart education system (102).

14. The smart device (104) as claimed in claim 8, wherein the student gesture detection module (178) further:
identifies the gesture as a user interface (UI) scaling gesture;
ascertains an action associated with the UI scaling gesture;
associates one of a whole and a portion of a view displayed on at least one of the main projected display surface (106) and the private projected display surface, based on a scaling tag associated with the view; and
performs one of scaling up and scaling down of the associated view on at least one of the main projected display surface (106) and the private projected display surface, based on the ascertained action.

15. The smart device (104) as claimed in claim 14, wherein the student gesture detection module (178) further retains the original size of the associated view on a screen of the smart device (104), while performing one of scaling up and scaling down of the associated view on at least one of the main projected display surface (106) and the private projected display surface.

16. The smart device (104) as claimed in claim 8, wherein the student gesture detection module (178) further:
determines whether the gesture is performed with respect to one of a screen of the smart device (104) and the private projected display surface; and
processes the gesture as one of a two dimensional (2D) gesture and a three dimensional (3D) gesture, based on a shape and a plane of the gesture, on determining the gesture to be performed with respect to the private projected display surface.

17. The smart device (104) as claimed in claim 8, wherein the student gesture detection module (178) further receives an input from the student, wherein the input is indicative of selection of a projected display surface from amongst the main projected display surface (106) and the private projected display surface, wherein the student performs gesture with respect to the selected projected display surface.

18. The smart device (104) as claimed in claim 8, wherein the student data synchronization module (174) further transmits the data associated with the updated display so as to update the display of a smart education system (102), wherein the data is associated with at least one of the coordinates of the gesture with respect to the smart device (104), size and resolution of a screen of the smart device (104), and state of running components of an application running on foreground of the smart device (104).

19. The smart device (104) as claimed in claim 8, wherein the student data synchronization module (174) further transmits data, associated with the display on the private projected display surface, which has been updated by a gesture made by at least one of the student and an instructor, to the smart education system (102) to update the display of the smart education system (102).

20. The smart device (104) as claimed in claim 8, wherein the student data synchronization module (174) further:
sends a synchronization request to the smart education system (102);
stores data associated with state of running components of an application running on foreground of the smart device (104) at a time of sending the synchronization request;
determines, on acceptance of the synchronization request by the smart education system (102), whether the state of at least one of the running components has changed in a time period between the sending of the synchronization request and acceptance of the synchronization request; and
reverts a current state of the at least one of the running components to the state of the at least one of the running components at the time of sending the synchronization request, based on the determining.
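The snapshot-and-revert behaviour recited in claim 20 can be sketched as follows; the class `SyncSession` and its methods are hypothetical names for illustration, and the component states are represented as a simple dictionary.

```python
import copy

# Illustrative sketch of claim 20: the smart device snapshots the state of
# the foreground application's running components when the synchronization
# request is sent, and on acceptance reverts any component whose state
# changed in the interval between sending and acceptance.
class SyncSession:
    def __init__(self, components):
        self.components = components  # live component states
        self.snapshot = None          # states captured at request time

    def send_request(self):
        """Store component states at the time of sending the sync request."""
        self.snapshot = copy.deepcopy(self.components)

    def on_accept(self):
        """Revert components that changed between sending and acceptance."""
        for c_id, saved_state in self.snapshot.items():
            if self.components.get(c_id) != saved_state:
                self.components[c_id] = saved_state
```

Under this sketch, a component that changed state while the request was pending is restored to its state at request time, so the synchronized displays start from a consistent baseline.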

Documents

Application Documents

# Name Date
1 SPEC IN.pdf 2013-08-05
2 GPOA.pdf 2013-08-05
3 FORM 5.pdf 2013-08-05
4 FORM 3.pdf 2013-08-05
5 FIGURES IN.pdf 2013-08-05
6 2317-DEL-2013-Correspondence-Others-(20-08-2013).pdf 2013-08-20
7 2317-DEL-2013-RELEVANT DOCUMENTS [08-05-2018(online)].pdf 2018-05-08
8 2317-DEL-2013-Changing Name-Nationality-Address For Service [08-05-2018(online)].pdf 2018-05-08
9 2317-DEL-2013-AMENDED DOCUMENTS [08-05-2018(online)].pdf 2018-05-08
10 2317-DEL-2013-FER.pdf 2019-01-21
11 2317-DEL-2013-OTHERS [20-06-2019(online)].pdf 2019-06-20
12 2317-DEL-2013-FER_SER_REPLY [20-06-2019(online)].pdf 2019-06-20
13 2317-DEL-2013-DRAWING [20-06-2019(online)].pdf 2019-06-20
14 2317-DEL-2013-COMPLETE SPECIFICATION [20-06-2019(online)].pdf 2019-06-20
15 2317-DEL-2013-CLAIMS [20-06-2019(online)].pdf 2019-06-20
16 2317-DEL-2013-PA [19-09-2019(online)].pdf 2019-09-19
17 2317-DEL-2013-ASSIGNMENT DOCUMENTS [19-09-2019(online)].pdf 2019-09-19
18 2317-DEL-2013-8(i)-Substitution-Change Of Applicant - Form 6 [19-09-2019(online)].pdf 2019-09-19
19 2317-DEL-2013-OTHERS-101019.pdf 2019-10-14
20 2317-DEL-2013-Correspondence-101019.pdf 2019-10-14
21 2317-DEL-2013-US(14)-HearingNotice-(HearingDate-02-02-2022).pdf 2022-01-11
22 2317-DEL-2013-FORM-26 [01-02-2022(online)].pdf 2022-02-01
23 2317-DEL-2013-Correspondence to notify the Controller [01-02-2022(online)].pdf 2022-02-01
24 2317-DEL-2013-Written submissions and relevant documents [17-02-2022(online)].pdf 2022-02-17
25 2317-DEL-2013-PatentCertificate18-04-2022.pdf 2022-04-18
26 2317-DEL-2013-IntimationOfGrant18-04-2022.pdf 2022-04-18

Search Strategy

1 search_21-01-2019.pdf

ERegister / Renewals

3rd: 07 Jun 2022

From 01/08/2015 - To 01/08/2016

4th: 07 Jun 2022

From 01/08/2016 - To 01/08/2017

5th: 07 Jun 2022

From 01/08/2017 - To 01/08/2018

6th: 07 Jun 2022

From 01/08/2018 - To 01/08/2019

7th: 07 Jun 2022

From 01/08/2019 - To 01/08/2020

8th: 07 Jun 2022

From 01/08/2020 - To 01/08/2021

9th: 07 Jun 2022

From 01/08/2021 - To 01/08/2022

10th: 07 Jun 2022

From 01/08/2022 - To 01/08/2023

11th: 29 Jul 2023

From 01/08/2023 - To 01/08/2024

12th: 30 Jul 2024

From 01/08/2024 - To 01/08/2025

13th: 30 Jul 2025

From 01/08/2025 - To 01/08/2026