
Virtual Office Environment

Abstract: Systems and methods for implementing a virtual environment system (102) are disclosed herein. A virtualization module (110) of the virtual environment system (102) receives a change in position of an avatar in a virtual environment from a client device (104) associated with the virtual environment system (102). The received change is based on a grid based architecture of the virtual environment. Further, the virtualization module (110) renders the position of the avatar in the virtual environment, based on the received change from the client device (104).


Patent Information

Filing Date: 18 October 2010
Publication Number: 08/2014
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Grant Date: 2020-11-20

Applicants

TATA CONSULTANCY SERVICES LIMITED
NIRMAL BUILDING, 9TH FLOOR, NARIMAN POINT, MUMBAI-400021, MAHARASHTRA, INDIA

Inventors

1. SHROFF, GAUTAM
23 HOSPITAL ROAD, JANGPURA A. NEW DELHI 110 014, INDIA
2. SHARMA, GEETIKA
C9/9740 VASANT KUNJ, NEW DELHI 110 070, INDIA

Specification

FORM 2
THE PATENTS ACT, 1970 (39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
1. Title of the invention:
VIRTUAL OFFICE ENVIRONMENT
2. Applicant(s)
NAME: TATA CONSULTANCY SERVICES LIMITED
NATIONALITY: Indian
ADDRESS: Nirmal Building, 9th Floor, Nariman Point, Mumbai-400021, Maharashtra, India

3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.

TECHNICAL FIELD
[0001] The present subject matter relates, in general, to a virtual environment and, in
particular, to a virtual office environment.
BACKGROUND
[0002] With the evolution of technology, communication via telephones, voice chat,
emails, and video chat has become popular. Such communication media are leveraged in a number of offices to facilitate employee-to-employee interaction. The employees of two remotely located offices either travel to each other's locations or communicate with each other through voice chat, video conferencing, etc. Such communication, however, is not always economically feasible and, in certain cases, such as important discussions or meetings, may be ineffective.
[0003] Another conventional approach for enabling communication is to provide 3D
virtual environments. The 3D virtual environments offer real and life like environments by representing real life objects, for example human beings, in the 3D virtual environments. Since the 3D virtual environments are imbued with the characteristics of the physical world, (e.g., utilizing avatars having the same face/body model of the people they represent), a user or group of users may desire to interact in the virtual environment in much the same way as is done in the real world. Further, such 3D virtual environments are usually implemented on a server interacting with a plurality of client devices over a network. The effectiveness of the 3D virtual environments is evaluated based on their imitation of the real world.
SUMMARY
[0004] This summary is provided to introduce concepts related to an implementation
of a 3D virtual office environment. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[0005] In one implementation, a virtual environment system receives a change in a
position of an avatar in a virtual environment from a client device associated with the virtual environment system. The change in position of the avatar is based on a grid based architecture

of the virtual environment. The change in position of the avatar is reflected in the virtual environment based on the changes received from the client device.

BRIEF DESCRIPTION OF THE DRAWINGS
[0006] The detailed description is described with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the figure in
which the reference number first appears. The same numbers are used throughout the
drawings to reference like features and components.
[0007] Fig. 1 illustrates an exemplary network environment for implementation of a
system and/or method for implementing a 3D virtual office environment, in accordance with
an embodiment of the present subject matter.
[0008] Fig. 2 shows an exemplary client device implementing the 3D virtual office
environment as described in Fig. 1.
[0009] Fig. 3 illustrates an exemplary virtual environment system, in accordance with
an embodiment of the present subject matter.
[0010] Fig. 4 illustrates an exemplary method for implementing a 3D virtual office
environment, in accordance with an embodiment of the present subject matter.
[0011] Fig. 5 illustrates an exemplary method for implementing navigation of an
avatar in the 3D virtual office environment, in accordance with an embodiment of the present
subject matter.
DETAILED DESCRIPTION OF THE DRAWINGS
[0012] The present subject matter relates to systems and methods for implementing a
virtual office environment. Virtual environments have become very popular. Recent growth in
computational resources, which can be deployed on computer-based systems, has led to
development of more realistic virtual environments. Through virtual environments, various
real-life interactions can be simulated. For example, virtual environments find use in gaming,
role-playing, business collaboration, social interaction, training, education, etc.
[0013] Within a virtual environment, each of the participating individuals can be
represented through a virtual digital representation. Such virtual digital representations, also
commonly referred to as avatars, can then interact or communicate with each other.

Furthermore, the avatars can be chosen or designed by the respective users depending on their preferences or the available choices.
[0014] The virtual environment, within which the avatars interact, can be designed so
as to include elements, such as cities, buildings, and such, which aim to replicate the real
world. The interactions can be for entertainment, such as in games, or can be such that are
intended to impart knowledge in relation to specific topics. The virtual environments can be
three-dimensional (3D) so as to provide the participants with a more realistic experience.
[0015] The virtual environment may be implemented over one or more networks
where people can interact with each other and perform other activities, such as shop for virtual objects, deal in virtual property, travel to different locations in the virtual environment, play in virtual playgrounds, participate in virtual university/educational programs, etc. Within such virtual environments, the participants can navigate and interact or communicate with other participants, i.e., with their respective avatars. The communication may be between one or more avatars, such as a dialogue or a discussion between the avatars, or can also be a one-sided communication.
[0016] Conventional systems providing a virtual environment allow avatars to initiate
communication, which can be text-based or voice-based, by selecting the individuals with
whom the communication is to be initiated. The individuals can be selected from a predefined
list. In such a case, avatars who wish to communicate with each other can only initiate
communication by selecting one or more avatars from the predefined list. This hampers the
feeling of realism within the virtual environment.
[0017] Furthermore, the virtual environments implemented on systems are very
resource intensive. For example, the virtual environments when implemented may utilize additional system resources for the 3D rendering of the virtual environments. Furthermore, the systems may also utilize enormous amounts of network based resources, such as bandwidth, to allow the client devices to synchronize with other client devices or with central systems that are hosting the virtual environments.
[0018] To this end, systems and methods for implementing a virtual office
environment are described herein. In one implementation, the system can be implemented by using a client-server architecture. For example, a central device, such as a server, can host the

virtual office environment, which in turn can be accessed by one or more client devices. The
virtual environments can be utilized for simulating multiple themes or other environments
depending on the requirement. Examples of such environments include, but are not limited to,
laboratories, libraries, universities, etc. Although the present description is described with
reference to a virtual office environment, it would be appreciated by a person skilled in the
art, that the subject matter is also applicable to other types of virtual environments.
[0019] The system implements or facilitates communication between one or more
avatars based on their relative distance within the virtual environment. In one implementation,
the representation of a physical space within the virtual office environment may be based on
grid architecture. In such a case, the virtual representation of the physical space, within the
virtual environment, can be divided into multiple cells. Each of the cells can be representative
of a specific area of the space within the virtual office environment. Some of the cells can
represent empty office space, such as spaces which are used by employees for getting around
the office. Other cells can represent spaces that accommodate employee seats. Within the
virtual environment, the position of the avatars, or other components within the virtual
environment, can be indicated by the respective cells that the avatars occupy.
[0020] The office employees, i.e., their avatars, logged in to the virtual office
environment can be placed in any one of the cells. The relative displacement of an avatar from other avatars can be based on the relative distance between the cells occupied by them. In one implementation, communication between the avatars can be based on the relative distance between the respective cells. For example, avatars that are in close proximity to each other can have a voice-based communication or chat with each other. Furthermore, a threshold distance can be defined by one or more users, which allows the conversation between one or more avatars to be available, such as audible, to other avatars that are within the threshold distance. For example, for a threshold distance extending up to one cell surrounding the cell within which an avatar A is present, a conversation between avatars A and B would be audible or available to other avatars only if those avatars occupy or pass through the surrounding cells.
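The cell placement and relative-displacement logic described above can be sketched as follows. This is a minimal illustration, not the specification's implementation: the `Avatar` class, the (row, col) cell coordinates, and the choice of Chebyshev distance are all assumptions made so that a one-cell threshold covers the eight cells surrounding an avatar.

```python
from dataclasses import dataclass

@dataclass
class Avatar:
    """An avatar occupying one logical cell of the office grid."""
    name: str
    cell: tuple  # (row, col) index of the occupied cell

def cell_distance(cell_a, cell_b):
    """Relative displacement between two avatars, measured as the
    distance between the cells they occupy. Chebyshev distance is
    assumed here so that a one-cell threshold covers all eight
    cells surrounding an avatar's cell."""
    (r1, c1), (r2, c2) = cell_a, cell_b
    return max(abs(r1 - r2), abs(c1 - c2))

# Avatars A and B occupy adjacent cells; C is across the floor.
a = Avatar("A", (5, 5))
b = Avatar("B", (5, 6))
c = Avatar("C", (9, 2))
```

Under this metric, B falls within A's one-cell threshold and their conversation would be audible to B, while C does not.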
[0021] In one implementation, the navigation of the avatars can be based on the grid
architecture of the virtual office environment. For example, the navigation of the avatars can

be effected by tracing a permissible navigation path from the initial cell to the destination cell. Once the path is traced, the avatar can traverse the path to reach the final destination cell.
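One way to trace a permissible path between an initial and a destination cell is a breadth-first search over the grid. The occupancy encoding below (0 for walkable cells, 1 for cells blocked by walls or seats) is an illustrative assumption; the specification does not fix a particular path-finding algorithm.

```python
from collections import deque

def trace_path(grid, start, goal):
    """Breadth-first search for a permissible path over the office grid.

    grid: 2D list where 0 marks empty (walkable) cells and 1 marks
          blocked cells (walls, employee seats, other obstacles).
    Returns the list of cells from start to goal, or None when no
    permissible path exists.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Reconstruct the traced path back to the initial cell.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no permissible path

office = [
    [0, 0, 0],
    [1, 1, 0],  # a wall blocks the direct route
    [0, 0, 0],
]
path = trace_path(office, (0, 0), (2, 0))
```

The avatar can then be rendered traversing the returned cells in order, which routes it around the wall rather than through it.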
[0022] In another implementation, the system may implement interfaces that allow
different platforms to host the virtual office environment. The interfaces can be further configured to allow interaction of avatars from different platforms with one another. In another implementation, the systems may further implement graphical interfaces that allow one or more avatars to share files, such as documents, spreadsheets, etc., amongst themselves.
[0023] In one implementation, the virtual office environment can be implemented on a
general client-server architecture wherein the server hosts a virtual office environment, and
multiple client devices access the virtual office application by connecting to the server through
a computer network. In another implementation, the system for implementing the virtual
office environment comprises a local server for the local office employees and a global
server for the remotely located employees. In such a case, efficient simulation of the office
environment is based on the synchronization of the local and the global server.
[0024] The manner in which the virtual office environment can be implemented shall
be explained in detail with respect to the Figs. 1-5. While aspects of systems and methods can be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system architecture(s).
[0025] Fig. 1 illustrates an exemplary system 100 for implementing a virtual office
environment in accordance with the present subject matter. In one implementation, the system 100 includes a virtual environment system 102. The virtual environment system 102 is further connected to a plurality of client devices 104-1, 2, ..., n (collectively referred to as client devices 104) through a network 106. The network 106 may be a wireless or a wired network, or a combination thereof. The network 106 can also be a collection of individual networks, interconnected with each other and functioning as a single large network (e.g., the internet or an intranet). Examples of such individual networks include, but are not limited to, Local Area Networks (LANs), Wide Area Networks (WANs), and Metropolitan Area Networks (MANs).

[0026] Any one or more of the client devices 104 can be implemented as any of a
variety of conventional computing devices, including, for example, a server, a desktop PC, a notebook or a portable computer, a workstation, a mainframe computer, a mobile computing device, etc. For example, in case the virtual environment system 102 hosts a virtual office environment, the client devices 104 can represent one or more employees who are present within an organization. The present subject matter has been described in relation with the virtual environment system 102 implementing a virtual office environment; however other virtual environments are also possible albeit with a few modifications as will be understood by a person skilled in the art. It would be appreciated that the virtual environment implemented by the virtual environment system 102 can be chosen based on specific needs or requirements of the users of the virtual environments. As indicated previously, examples of such virtual environments include, but are not limited to, libraries, museums, university campuses, office environments and so on.
[0027] The client devices 104 render the virtual representations of the physical space
of an office. The virtual representations can be rendered in a manner so as to give the notion of a 3-dimensional (3D) environment. In one implementation, the client devices 104 render the virtual representation based on pre-stored rendering data. For example, the client devices 104 utilize stored mesh data to depict, in 3D, the physical space of an office. Furthermore, the client devices 104 can also render the virtual representation of the one or more employees that access the virtual environment system 102 to participate and interact within the hosted virtual environment. The virtual representation of the employees, also referred to as avatars, is deployed within the virtual office environment. In one implementation, the representation of the avatars within the virtual office environment can be based on user preferences, or can be based on predefined formats or content.
[0028] Within the virtual office environment, each of the employees working within
an organization can be associated with an avatar. By accessing the virtual environment system 102, the employees can control their associated avatars to perform numerous functions such as navigation, text-based chat, voice chat, document sharing, and so on. In one implementation, the avatars can either be modeled based on say the appearance of the associated employee or can be selected from a variety of predefined avatars.

[0029] In one implementation, each of the client devices 104 can further include a
controller module, such as controller module 108-1, 108-2,.., 108-n (collectively referred to as
controller module 108). The controller module 108 monitors the input commands provided by
the employees. Consider the following scenario in which an employee accesses the virtual
environment system 102 using any one of the client devices 104, such as client device 104-1.
In said implementation, the employee may further control the movement or the functionality
of its respective avatar by providing various commands, say navigation commands. On
receiving the commands, the controller module 108 can affect the necessary changes, say
change in position, within the virtual environment. For example, the controller module 108
may, on determining the pressing of the forward arrow key, render the virtual representation
so as to indicate an advance in the position of the avatar. In one implementation, the controller
module 108 can further send the changes or updates to the virtual environment system 102.
[0030] In one implementation, the virtual representation of the physical office can be
represented based on grid architecture. In such a case, the entire virtual office space can be realized using a plurality of adjoining cells. However, the division of the office space is based on a logical grid system that is independent of the actual physical location or distances in the real-world office. Each of the cells can be representative of a specific area of the space within the virtual office environment. Some of the cells can represent empty office space, i.e., areas that are used by employees for getting around the office. Other cells can represent spaces that accommodate employee seats. The size of the virtual office space can be represented by cells of predefined size. For example, the cells can be chosen to be equivalent to the cubicle space of an employee, or can be chosen so as to represent a larger area such as an office floor.
[0031] In one implementation, the controller module 108 renders navigation of the
avatars based on the cell in which one or more avatars may be present. For example, for navigation, a user at the client device 104-1 may specify an initial destination and a final destination. The controller module 108, on receiving such information evaluates all permissible paths between the initial and the final destination. The permissible paths can be evaluated based on numerous factors, such as presence of obstacles or walls between the initial and the final destination, whether the initial and the final destination are on the same or

different floors, and such. Once the permissible path is obtained, the avatar can be moved from the initial to the final destination along the permissible path. In this manner, the real life and day-to-day functions can be simulated virtually.
[0032] The position of the avatars within the virtual office environment is based on
the relative position of the employees of the organization within the physical space of their
office environment. The relative position of the employees within the physical office space
can be determined by various methods known in the art. Examples of such methods include,
but are not limited to, detection based on RFID, detection by CCTV cameras based on facial
recognition, etc. Once the relative position of the employee is obtained, the controller module
108 evaluates the corresponding cell position. Based on the evaluated cell position, the client
devices 104 render the associated avatars within the virtual office environment.
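The evaluation of a cell position from a detected physical position (e.g., via RFID or CCTV-based facial recognition, as mentioned above) might look like the following. The metre-based coordinates and the 2 m cell size are illustrative assumptions; the specification leaves the mapping unspecified.

```python
def position_to_cell(x, y, cell_size=2.0):
    """Map a detected physical office position (in metres, an assumed
    unit) to a logical grid cell. The cell size is an assumed
    parameter; the logical grid itself is independent of actual
    physical distances in the office."""
    return (int(y // cell_size), int(x // cell_size))

# An employee detected at (5.0 m, 3.1 m) maps to a single grid cell,
# which the client devices then use to render the associated avatar.
cell = position_to_cell(5.0, 3.1)
```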
[0033] As indicated previously, the virtual environment system 102 hosts the virtual
office environment. Furthermore, a plurality of client devices 104 is connected to the virtual
environment system 102. The employees that are associated with each of the client devices
104 further interact with each other, i.e., with each other's avatars, through the virtual
environment system 102. In one implementation, the virtual environment system 102 further
includes a virtualization module 110. The interaction between one or more avatars accessing
the virtual environment system 102 is based on the exchange of data between the
virtualization module 110 and the controller module 108. For example, information associated
with the change in position of an avatar of a team lead is communicated by the controller
module 108 to the virtualization module 110. The virtualization module 110 then makes the
same information available to the other client devices 104, say those of the team members.
The other client devices 104 may then render the virtual office environment depicting the
change in position of the avatar in question, such that other avatars can virtually perceive the
change in motion of the avatar of the team lead. It will be appreciated that the manner in
which the information is exchanged is based on techniques and mechanisms known in the art.
In one implementation, the exchange of data between the controller module 108 and the
virtualization module 110 is based on the Hyper-Text Transfer Protocol (HTTP).
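A sketch of this exchange, in which the controller module (108) reports a position change and the virtualization module (110) makes the same information available to the other client devices (104), is given below. The JSON message shape and the queue-per-client fan-out are illustrative assumptions; the specification only fixes HTTP as the transport.

```python
import json

def make_position_update(avatar_id, cell):
    """Build the HTTP message body a controller module (108) might
    send to the virtualization module (110). The JSON shape is an
    illustrative assumption."""
    return json.dumps({"avatar": avatar_id, "cell": list(cell)})

def fan_out(update_body, client_queues, sender):
    """Server-side sketch: the virtualization module makes the same
    information available to every other client device (104)."""
    update = json.loads(update_body)
    for client_id, queue in client_queues.items():
        if client_id != sender:  # the sender has already rendered the change
            queue.append(update)
    return update

# The team lead at client 104-1 moves; clients 104-2 and 104-3
# receive the update and re-render the avatar's position.
clients = {"104-1": [], "104-2": [], "104-3": []}
body = make_position_update("team-lead", (4, 7))
fan_out(body, clients, sender="104-1")
```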
[0034] In another implementation, communication between one or more avatars can
also be based on the cell location. For example, an avatar walking past a group of two or more

avatars that are conversing, would be able to hear their conversation if present within a threshold limit from the group. In such a case, the controller module 108 can determine the relative distance between the cell of the group and the cell of the nearby avatar. In case the distance is less than the threshold limit, the controller module 108 can allow the conversation to be made available (or audible) to the nearby avatar. The nearby avatar on hearing the conversation can either pass by, or in case the conversation is of a matter of interest, may join in the conversation. This allows for spontaneous collaboration amongst members of an organization, and further contributes to the realism of the virtual office environment. In another implementation, the users may also configure the virtual environment system 102 so as to prevent making the communication available to the passing avatars. This can be implemented so as to address any privacy related concerns.
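The passer-by audibility check described in this paragraph can be sketched as follows. The `private` flag stands in for the privacy configuration mentioned above, and the Chebyshev distance metric is an assumption; neither name nor metric is fixed by the specification.

```python
def conversation_audible(group_cell, passer_cell, threshold, private=False):
    """Decide whether a nearby avatar can hear a group conversation.

    group_cell, passer_cell: (row, col) grid cell positions.
    threshold: user-defined distance limit, in cells.
    private: when set, the conversation is never made available to
             passing avatars (the privacy configuration described
             above; the flag name is an assumption).
    """
    if private:
        return False
    (r1, c1), (r2, c2) = group_cell, passer_cell
    distance = max(abs(r1 - r2), abs(c1 - c2))  # distance in cells
    return distance <= threshold
```

An avatar walking through an adjacent cell hears the conversation and may join in; one several cells away, or any avatar when the group has opted for privacy, does not.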
[0035] In another implementation, a unilateral communication can also be
implemented. For example, an avatar of a speaker may wish to address a group of other individuals, i.e., their respective avatars. The speaker may define a threshold distance. Based on the threshold distance, the controller module 108 determines all the avatars that are present in cells within the threshold distance from the speaker. The controller module 108 then can further enable a communication, such as a presentation, to be audible to avatars within the threshold distance. Therefore, whatever the speaker conveys, or shouts, would be available, or audible, to one or more avatars that are present in cells within the threshold distance. In yet another implementation, the client devices 104 are also configured to allow text-based communication between the avatars.
[0036] As indicated previously, the virtual environment system 102 as described can
be implemented within various types of networking environments. In one implementation, the virtual environment system 102, when deployed within a distributed computing environment having an internal network and an external network, can be implemented on at least two computing devices, say virtual environment systems 102-1 and 102-2. The virtual environment system 102-1 can be implemented within the internal network and the virtual environment system 102-2 can be implemented for the external network for efficient load handling. In such a case, the virtual environment system 102-1 can be configured to handle all requests that are local to the internal network, whereas external requests can be handled by the virtual

environment system 102-2. In one implementation, the virtual environment system 102-1 and
102-2 may further include a monitoring module (not indicated in the figure) for synchronizing
the virtual environment system 102-1 and virtual environment system 102-2.
[0037] The working of the systems and devices as introduced in Fig. 1 are further
described with reference to Figs. 2 and 3. Fig. 2 illustrates exemplary components of the client device 104. In an embodiment, the client device 104 includes processor(s) 202, I/O interface(s) 204, and memory 206. The processor(s) 202 are configured to fetch and execute computer-readable instructions stored in the memory 206. In one implementation, the memory 206 includes modules 208 and data 210. The data 210 includes data that is either generated or utilized by one or more of the modules 208. In one implementation, the modules 208 further include a controller module 108, a rendering module 212, a monitoring module 214, and other modules 216. The data 210 includes design data 218, user input 220, application data 222 and other data 224.
[0038] Fig. 3 illustrates exemplary components of the virtual environment system 102.
In an embodiment, the virtual environment system 102 includes processor(s) 302, I/O
interface(s) 304, and memory 306. The processor(s) 302 may be implemented as one or more
microprocessors, microcomputers, microcontrollers, digital signal processors, central
processing units, state machines, logic circuitries, and/or any devices that manipulate signals
based on operational instructions. Among other capabilities, the processor(s) 302 are
configured to fetch and execute computer-readable instructions stored in the memory 306. In
one implementation, the memory 306 includes module(s) 308 and program data 310. The
program data 310 includes data that is either generated or utilized by one or more of the
module(s) 308. The module(s) 308 further include the virtualization module 110, an
application interface module 312, and other module(s) 314. The program data 310 includes
message data 316, static information 318, location data 320, and other data 322.
[0039] The memory 206 and 306 may include a computer-readable medium known in
the art including, for example, volatile memory, such as SRAMs, DRAMs, etc., and/or non-volatile memory, such as EPROMs, flash memories, etc.
[0040] The virtual office environment is implemented locally on each of the client
devices 104 in the network 106. The controller module 108 receives the user inputs regarding

various applications that a user wants to run/execute in a virtual environment. For example, the user might wish to navigate to another location within the virtual office environment, or might wish to communicate with another employee in the virtual office, or share a file over the network. Such functions can be initiated by the users of the client devices 104 through one or more input commands. The controller module 108 then determines the changes that have to be effected on one or more avatars within the virtual office environment of the virtual environment system 102. Once the changes are determined, the rendering module 212 renders the avatars and the virtual office environment based on the changes. For example, if the controller module 108 determines a change in position, the rendering module 212 renders the avatars based on the updated positions. The user inputs for performing an update in the virtual environment, such as navigating, communicating, or sharing a file, are reflected by the rendering module 212 locally on the client device 104. The user inputs are stored in the memory 206 as user input 220.
[0041] In one implementation, further changes can be communicated to the
monitoring module 214. For example, the change in position of an avatar can be communicated by the controller module 108 to the virtualization module 110 in the virtual environment system 102. Based on the changes received, the virtualization module 110 communicates the changes or updates to other client devices 104. These changes can then be rendered by the rendering module 212 of the other client devices 104.
[0042] In one implementation, the monitoring module 214 of the client device 104 can
communicate the changes to the virtualization module 110 in the virtual environment system 102 by either polling or pushing. For example, during a low activity status of the network, the monitoring module 214 is in 'push' mode. In this case, whenever an update is received in the virtual environment at any of the client devices 104, the virtual environment system 102 pushes the received updates to the monitoring module 214 of the various client devices connected to the network 106.
[0043] Further, in the case of a high activity status of the network 106, the monitoring
module 214 switches from the 'push' mode to a 'polling' mode. In such a case, the monitoring module 214 keeps polling (i.e., asking the virtual environment system 102) for

any updates in the virtual environment from any client device 104. In this manner, the
capacity of the virtual environment system 102 and the client device 104 are used optimally.
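The mode switch between pushing and polling, driven by the network activity status, could be sketched as below. The updates-per-second metric and its threshold are illustrative assumptions; the specification describes only the switch itself, not how activity is measured.

```python
class MonitoringModule:
    """Sketch of the monitoring module (214) switching between
    'push' and 'polling' modes based on network activity. The
    updates-per-second metric and its threshold are assumptions."""

    def __init__(self, activity_threshold=50):
        self.activity_threshold = activity_threshold
        self.mode = "push"  # low-activity default

    def observe_activity(self, updates_per_second):
        """Switch to polling under high activity and back to push
        under low activity, so that the capacities of the server
        and the client device are used optimally."""
        if updates_per_second > self.activity_threshold:
            self.mode = "polling"  # client asks the server for updates
        else:
            self.mode = "push"     # server pushes updates to clients
        return self.mode

monitor = MonitoringModule()
monitor.observe_activity(10)   # low activity: stays in push mode
monitor.observe_activity(200)  # high activity: switches to polling
```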
[0044] While all of the above steps are performed on a single server implementing a
virtual environment system 102, these may also be implemented on an internal or local virtual
environment system 102-1, and an external virtual environment system 102-2. For example,
the virtual environment system 102-1 may store all the information including the design data
of the virtual environment for a local office including a number of client devices 104, and the
virtual environment system 102-2 may store information about all the other virtual
environment systems 102-1. In such a case, the virtual environment system 102-1 may receive
the user inputs from the client devices, and communicate them to the virtual environment
system 102-2, which subsequently communicates them to the various other virtual environment
systems 102-1 across the network and to the various client devices 104. In one
implementation, the central server may be a global server such as a Google cloud server.
[0045] The various functions that might be performed in the virtual environment are
further explained in combination with the above-mentioned description of Fig. 2 and Fig. 3.

NAVIGATION
[0046] As described above, the virtual office environment is based on a grid
architecture divided into logical cells. Each of the cells is representative of a specific area and independent of the actual physical distances in a real-world office space. As explained briefly, navigation may be described with respect to the cells within the virtual environment. In order to navigate from one place to another, either locally in the same office premises or to a different office located externally, the user may provide one or more input commands for effecting a change in position of the avatar. In one implementation, the input commands are obtained by the controller module 108 and stored in the user input 220. For example, the user may provide instructions indicating an initial and a destination cell location within the virtual office environment. Based on the input commands received from the user of one or more of the client devices 104, the controller module 108 evaluates a permissible path between the initial and the destination cells. Once the permissible path is evaluated, the rendering module 212 renders the avatar with the updated position. In one implementation, the rendering module 212 renders the avatar such that it appears to be traversing the permissible path as obtained by

the controller module 108. In one implementation, the rendering module 212 utilizes design
data 218 for rendering the components of the virtual office environment, the avatars, etc.
[0047] In another implementation, the controller module 108 further communicates,
for example, updates in the position of one or more avatars to the virtualization module 110. The virtualization module 110 then further communicates these updates to the respective monitoring modules 214 of other client devices 104 across the network. Based on the updates communicated to the other client devices 104, the respective virtual office environments can be updated accordingly. For example, based on the navigation command provided at the client device 104-1, the position of the avatar associated with the client device 104-1 would be updated on the client device 104-1, as well as on another client device 104-2 connected with the virtual environment system 102.
[0048] In one implementation, the location, i.e., both the initial and the destination
cell, of an avatar is stored in the virtual environment system 102 in the location data 320.
COMMUNICATION
[0049] As indicated previously, the virtual environment system 102 also implements
communication between one or more avatars based on their relative position within the virtual office environment. The communication can be private, i.e., intended for a limited number of avatars, or can be public, wherein the communication could be intended for an open audience. In one implementation, the virtual environment allows the employees to interact and communicate with other employees in the office area either through text or 3D visual mode. The employees may choose from a number of options, such as a 'whisper' option or a 'shout' option, for communicating in the virtual environment. The communication may be between one or more avatars, such as a dialogue or a discussion between the avatars, or can also be a unidirectional communication.
[0050] In one implementation, the users may interact with each other through the
textual mode, where the cells in the proximity of the avatar are shown in the 2D proximity view on the client device 104. Further, the client device also displays a user interface for text-based communication, in order to share text messages with the people in their respective proximities. It would be appreciated that, as and when the avatars move from one cell to another, their relative proximity would also change. In a further implementation, the user may

select from the options of 'whisper' or 'shout' for communicating in the virtual environment. For example, in the 'whisper' mode, the user may speak to an individual or a group of individuals as desired, and in the 'shout' mode, the user may communicate with the entire office.
[0051] In another implementation, the text-based messages can be stored in the message
data 316. In yet another implementation, the user of one or more of the client devices 104 can define a threshold limit. As discussed, each of the cells of the grid architecture would be surrounded by a plurality of cells. For example, a cell having a square shape would be surrounded by eight identical cells. It may be the case that a cell is presently occupied by an avatar A. If the threshold limit extends up to the immediately surrounding cells, then any avatar present within any of the immediately surrounding cells would be privy to the communication initiated by avatar A. In such a case, the controller module 108 determines the cell which, say, avatar B occupies. If the cell of avatar B is any one of the immediately surrounding cells, then the controller module 108 allows the communication by avatar A to be available to avatar B.
[0052] Furthermore, any avatar that is passing through any one or more of the
immediately surrounding cells would be able to receive the communication between avatar A and, say, avatar B. In such a case, the passing avatar may either choose to stop and contribute to or share in the topic of discussion between avatars A and B, or may choose to continue towards its destination. In one implementation, the locations of the avatars within the virtual office environment, and their respective immediately surrounding cells, are stored in the location data 320.
[0053] In case avatar A wishes to initiate a unilateral communication, for example, a
lecture, a larger threshold limit can be specified. In such a case, all other avatars that wish to attend the lecture can occupy cell positions within the threshold limit and be able to hear the contents of the lecture.
[0054] Communications intended for other avatars associated with other client devices 104
are routed through the virtualization module 110 of the virtual environment system 102. In one implementation, the controller module 108 receives the communication from one or more of the client devices 104, say the client device 104-1. The communication can either be text-based

or voice-based. The controller module 108 then forwards the communication to the virtualization module 110, which then passes the same to one or more of the other client devices, such as the client device 104-2. In another implementation, the virtualization module 110 further communicates these updates to the monitoring modules 214 of the various client devices 104.
SHARING FILES AND OTHER DATA
[0055] The virtualization module 110 also implements sharing of files and other data
amongst one or more of the client devices 104. In one implementation, an application
interface module 312 at the virtual environment system 102 provides interfaces for different
applications at the various client devices 104 across the network. The interfaces allow for the
transfer of files between one or more client devices 104. In another implementation, the
application interface module 312 also allows one of the client devices 104, say the client device
104-1, to visually display contents of one or more files to other client devices, such as the client
devices 104-2, 104-3, ..., 104-9. For example, the client device 104-1, wishing to share a presentation
with the other client devices 104-2, 104-3, ..., 104-9, can share the contents of the presentation through these interfaces.
[0056] In one implementation, the controller module 108 determines the content that is
required to be shared with one or more client devices 104. For example, the content can be stored in the application data 222. Once the content is available, the controller module 108 transmits the same to the application interface module 312 of the virtual environment system 102. The application interface module 312 may then transfer the contents to be shared to the other client devices 104. Once the content is received by the client devices 104, the rendering module 212 displays the contents on a display device associated with the client devices 104. Examples of content include, but are not limited to, audio/video content, textual content, presentations, etc.
[0057] Fig. 4 illustrates an exemplary method for implementing the virtual office
environment in accordance with an embodiment of the present subject matter. These exemplary methods may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types. The methods may

also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communication network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0058] The order in which the method is described is not intended to be construed as a
limitation, and any number of the described method blocks can be combined in any order to implement the method, or an alternative method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein.
[0059] At block 402, the virtual office environment is created. In one implementation,
the virtual office environment is created by using the real office architecture design in order to replicate the real-world office in the virtual environment. The office floor area is divided into logical grids or cells that are independent of the physical distances in the office. The virtual environment further includes avatars which represent humans in the real world. The seating arrangement in the virtual environment is the same as in the real-world office. All the surrounding cells of a particular cell constitute the aura of the cell. In the virtual world, an employee may be able to communicate with all the people whose cells are within the threshold limit of the employee's cell.
[0060] At block 404, user inputs indicating a change in a component of the virtual
environment are received. In one implementation, the user inputs are received by the controller module 108 of the client device. For example, the change in a component may be navigation of an avatar from one place to another, communication with other participants in the virtual environment, or sharing of files or data with other participants in the virtual environment, etc.
[0061] Further, at block 406, the changes in the components of the virtual environment
are effected in the virtual environment based on the user inputs. In one implementation, the rendering module 212 of the client device 104 effects the changes in the virtual environment in both the textual and 3D views.
[0062] At block 408, the received changes in the components of the virtual
environment are stored at the client device 104 locally. In one implementation, the rendering

module 212 in combination with the controller module 108 stores the received changes in a local database.
[0063] At block 410, the received changes in the components of the virtual
environment are communicated to the virtual environment system 102. In one
implementation, the changes are received by the virtualization module 110 of the virtual
environment system 102 from the controller module 108 of the client device 104. The virtual
environment system 102 further communicates the received changes to the other client
devices 104 across the network 106. The virtualization module 110 further stores these
updates in the central database located at the virtual environment system 102.
[0064] At block 412, the changes in the components of the virtual environment are
received at the various client devices 104 across the network 106 from the virtual environment system 102. In one implementation, the monitoring module 214 of the client device 104 receives the changes from the virtualization module 110 of the virtual environment system 102.
[0065] In a further implementation, the monitoring module 214 remains in a 'push'
mode whenever a low activity status is detected in the virtual environment. The virtualization module 110 pushes the changes, as soon as they are received, to the monitoring module 214 of the client devices 104. As soon as the activity status of the virtual environment increases, the monitoring module 214 switches from the 'push' mode to the 'polling' mode. In the 'polling' mode, the monitoring module 214 keeps polling for the updates from the virtual environment system 102.
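The push/poll mode switching described above can be sketched as follows. The class and attribute names, the numeric activity threshold, and the callback-based polling are illustrative assumptions; the specification only requires that pushed updates be accepted in a low-activity state and that the module poll the virtual environment system 102 when activity rises:

```python
class MonitoringModule:
    """Sketch of the monitoring module's push/poll mode switching."""

    def __init__(self, fetch_updates, activity_threshold=10):
        self.fetch_updates = fetch_updates          # callable that polls the server
        self.activity_threshold = activity_threshold
        self.mode = "push"                          # low activity: server pushes
        self.updates = []

    def on_push(self, update, activity_level):
        """Accept an update pushed by the virtualization module; switch
        to polling mode when the activity level rises above threshold."""
        self.updates.append(update)
        if activity_level > self.activity_threshold:
            self.mode = "poll"

    def poll_once(self, activity_level):
        """One polling cycle; drop back to push mode when activity falls."""
        self.updates.extend(self.fetch_updates())
        if activity_level <= self.activity_threshold:
            self.mode = "push"
```

The collected updates would, in turn, be handed to the rendering module 212 so the changes are reflected in the local virtual environment.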
[0066] At block 414, the changes in the components of the virtual environment
received at the monitoring module 214, are reflected in the virtual environment of the corresponding client device 104. In one implementation, the monitoring module 214 communicates the received changes in the components of the virtual environment, to the rendering module 212 of the client device 104 which further updates the virtual environment based on the received changes.
[0067] Fig. 5 illustrates an exemplary method for navigating in the virtual
environment as described in accordance with the present subject matter. The order in which the method is described is not intended to be construed as a limitation, and any number of the

described method blocks can be combined in any order to implement the method, or an
alternative method. Additionally, individual blocks may be deleted from the method without
departing from the spirit and scope of the subject matter described herein.
[0068] At block 502, user inputs are received in the form of navigation instructions,
for a change in position of an avatar from one cell to another in the grid architecture of the virtual environment. In one implementation, the user inputs are received by the controller module 108 of the client device 104. For example, the user inputs may include the source and the destination information, etc.
[0069] At block 504, a permissible path for navigating from the source to the
destination is evaluated based on the received user inputs. In one implementation of the present subject matter, the permissible path is evaluated based on the grid division of the office space. The movement of the avatar may be based on the cells of the grid that are selected as the avatar is transferred from one cell to another.
[0070] At block 506, the changes to the position of the avatar are effected based on
the user inputs and the evaluated permissible path for navigation. In one implementation, the position of the avatar is updated by the rendering module 212 of the client device 104. The rendering module 212 receives the user inputs and the evaluated permissible path from the controller module 108 of the client device 104.
[0071] At block 508, the changes in the position of the avatar are communicated to all
the other client devices 104 across the network 106. In one implementation, the changes are received by the virtualization module 110 of the virtual environment system 102 and are further communicated to the monitoring module 214 of the various client devices 104 connected across the network.
[0072] At block 510, the changes are updated in the virtual environment at every
client device 104 across the network. In one implementation, the changes are received from the virtual environment system 102 at the monitoring module 214 of the client device 104. The monitoring module 214 remains in a 'push' mode in the low activity state of the virtual environment. Whenever there is a change in the position of the avatar, this change is pushed to the monitoring module 214 of the client device 104. On receiving an update in the position of the avatar, the monitoring module 214 switches from the 'push' mode to the 'polling' mode

and keeps polling for further updates. Furthermore, the monitoring module 214 communicates these changes to the rendering module 212 of the client devices in order to reflect the changes in the position of the avatar.
[0073] Although implementations for a virtual office environment have been
described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations for the virtual office environment.

I/We Claim:
1. A virtual environment system (102) comprising:
a processor (302);
a memory (306) coupled to the processor (302), the memory (306) comprising a virtualization module (110) configured to:
receive commands indicating a change in position of an avatar in a virtual environment from a client device (104) associated with the virtual environment system (102), wherein the change in the position is based on a grid based architecture of the virtual environment; and
change, within the virtual environment, the position of the avatar based on the commands received from the client device (104).
2. The virtual environment system (102) as claimed in claim 1, wherein the virtualization module (110) is further configured to receive a communication command from one avatar to another avatar in the virtual environment.
3. The virtual environment system (102) as claimed in claim 1, wherein the virtualization module (110) is further configured to initiate communication between one or more avatars based on a voice chat and a video chat request from one avatar to another avatar in the virtual environment.
4. The virtual environment system (102) as claimed in claim 1, wherein the virtual environment system (102) provides an integrated textual and 3D visual representation of the virtual environment.
5. The virtual environment system (102) as claimed in claim 1, wherein the virtual environment system (102) further comprises an application interface module (312) configured to receive a file sharing request from the client device (104) in the virtual environment.

6. The virtual environment system (102) as claimed in claim 1, wherein the virtualization module (110) is further configured to communicate the received changes in the virtual environment to a monitoring module (214) of other client devices (104) associated with the virtual environment system (102).
7. A method for navigating in a virtual environment, the method comprising: receiving a change in position of an avatar in the virtual environment, wherein the change in position is based on a cell location of the avatar in a grid based architecture of the virtual environment;
evaluating a permissible path for navigation of the avatar based on the grid based architecture of the virtual environment; and
reflecting the change in position of the avatar in the virtual environment based on the evaluated permissible path.
8. The method as claimed in claim 7 further comprising communicating the change in position of the avatar in the virtual environment across the network.
9. A client device (104) comprising:
a processor (202);
a memory (206) coupled to the processor (202), the memory (206) comprising:
a controller module (108) configured to receive instructions from a user indicating a change in
a position component of an avatar in a virtual environment, wherein the change is based on a
cell location of the avatar in a grid based architecture of the virtual environment; and
a rendering module (212) configured to reflect the received change in position of the avatar in
the virtual environment, based on the received instructions.
10. The client device as claimed in claim 9, wherein the controller module (108) is further
configured to communicate the received change in position of the avatar to a virtualization
module (110) of a virtual environment system (102) associated with the client device (104).

11. The client device as claimed in claim 9, wherein the system further comprises a monitoring module (214) configured to receive a change in position of another avatar in the virtual environment from the virtualization module (110) of the virtual environment system (102) associated with the client device (104).
12. The client device as claimed in claim 11, wherein the monitoring module (214) switches from a push mode to a polling mode in a high activity state of the virtual environment for managing traffic in the network.
13. The client device as claimed in claim 9, wherein the rendering module (212) is further configured to create the virtual environment based on information gathered from data capture objects selected from a group consisting of surveillance cameras, webcams, and RFID tags.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 2890-MUM-2010-FORM 26(18-11-2010).pdf 2010-11-18
2 2890-MUM-2010-RELEVANT DOCUMENTS [26-09-2023(online)].pdf 2023-09-26
3 2890-MUM-2010-FORM 1(18-11-2010).pdf 2010-11-18
4 2890-MUM-2010-RELEVANT DOCUMENTS [27-09-2022(online)].pdf 2022-09-27
5 2890-MUM-2010-IntimationOfGrant20-11-2020.pdf 2020-11-20
6 2890-MUM-2010-CORRESPONDENCE(18-11-2010).pdf 2010-11-18
7 2890-MUM-2010-PatentCertificate20-11-2020.pdf 2020-11-20
8 2890-MUM-2010-Information under section 8(2) (MANDATORY) [14-03-2018(online)].pdf 2018-03-14
9 2890-MUM-2010-Written submissions and relevant documents [15-09-2020(online)].pdf 2020-09-15
10 2890-MUM-2010-FORM 3 [14-03-2018(online)].pdf 2018-03-14
11 2890-MUM-2010-PETITION UNDER RULE 137 [09-09-2020(online)].pdf 2020-09-09
12 2890-MUM-2010-OTHERS [16-03-2018(online)].pdf 2018-03-16
13 2890-MUM-2010-FORM-26 [31-08-2020(online)].pdf 2020-08-31
14 2890-MUM-2010-FER_SER_REPLY [16-03-2018(online)].pdf 2018-03-16
15 2890-MUM-2010-CORRESPONDENCE [16-03-2018(online)].pdf 2018-03-16
16 2890-MUM-2010-Correspondence to notify the Controller [26-08-2020(online)].pdf 2020-08-26
17 2890-MUM-2010-COMPLETE SPECIFICATION [16-03-2018(online)].pdf 2018-03-16
18 2890-MUM-2010-US(14)-HearingNotice-(HearingDate-02-09-2020).pdf 2020-07-30
19 2890-MUM-2010-CLAIMS [16-03-2018(online)].pdf 2018-03-16
20 2890-mum-2010-abstract.pdf 2018-08-10
21 abstract1.jpg 2018-08-10
22 2890-mum-2010-form 5.pdf 2018-08-10
23 2890-mum-2010-claims.pdf 2018-08-10
24 2890-mum-2010-form 3.pdf 2018-08-10
25 2890-MUM-2010-CORRESPONDENCE(18-8-2011).pdf 2018-08-10
26 2890-mum-2010-form 2.pdf 2018-08-10
27 2890-mum-2010-correspondence.pdf 2018-08-10
28 2890-mum-2010-description(complete).pdf 2018-08-10
29 2890-mum-2010-form 2(title page).pdf 2018-08-10
30 2890-mum-2010-drawing.pdf 2018-08-10
31 2890-MUM-2010-FORM 18(18-8-2011).pdf 2018-08-10
32 2890-mum-2010-form 1.pdf 2018-08-10
33 2890-MUM-2010-FER.pdf 2018-08-10

Search Strategy

1 search_18-09-2017.pdf

ERegister / Renewals

3rd: 23 Nov 2020

From 18/10/2012 - To 18/10/2013

4th: 23 Nov 2020

From 18/10/2013 - To 18/10/2014

5th: 23 Nov 2020

From 18/10/2014 - To 18/10/2015

6th: 23 Nov 2020

From 18/10/2015 - To 18/10/2016

7th: 23 Nov 2020

From 18/10/2016 - To 18/10/2017

8th: 23 Nov 2020

From 18/10/2017 - To 18/10/2018

9th: 23 Nov 2020

From 18/10/2018 - To 18/10/2019

10th: 23 Nov 2020

From 18/10/2019 - To 18/10/2020

11th: 23 Nov 2020

From 18/10/2020 - To 18/10/2021

12th: 28 Sep 2021

From 18/10/2021 - To 18/10/2022

13th: 12 Oct 2022

From 18/10/2022 - To 18/10/2023

14th: 16 Oct 2023

From 18/10/2023 - To 18/10/2024

15th: 10 Oct 2024

From 18/10/2024 - To 18/10/2025

16th: 14 Oct 2025

From 18/10/2025 - To 18/10/2026