
Multi-Axis Navigation

Abstract: Multi-axis navigation techniques are described. In implementations, a user interface is output by a computing device; the user interface includes a first axis and a second axis that include parameters that are navigable via one or more gestures. One or more items are chosen by the computing device for concurrent display with the first and second axes that correspond to a first one of the parameters of the first axis and a second one of the parameters of the second axis.


Patent Information

Application #: 7451/CHENP/2012
Filing Date: 28 August 2012
Publication Number: 52/2013
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status: Granted
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2021-05-24
Renewal Date:

Applicants

MICROSOFT CORPORATION
One Microsoft Way, Redmond, Washington 98052-6399

Inventors

1. BARNETT Donald A.
c/o Microsoft Corporation, LCA International Patents, One Microsoft Way, Redmond, Washington 98052-6399
2. LAW Veronica Y.
c/o Microsoft Corporation, LCA International Patents, One Microsoft Way, Redmond, Washington 98052-6399

Specification

Multi-axis Navigation
BACKGROUND
[0001] The amount of content that is encountered by users in a typical day is ever increasing. For example, due to the inclusion of cameras on mobile phones, a user may have access to hundreds of pictures, both taken by the user and received from other users. Further, the user may also be confronted with thousands of other items of content, such as other media (e.g., videos), documents, emails, text messages, and so on.
[0002] Consequently, it may be difficult for a user to locate content of interest using traditional techniques, such as manually navigating through a hierarchy of folders used to organize the content. Further, it may be even more difficult for a user to locate related content using traditional techniques. Accordingly, traditional techniques may lead to user frustration and may even cause users to forgo functionality due to the complexities involved with relatively large amounts of content.
SUMMARY
[0003] Multi-axis navigation techniques are described. In implementations, a user interface is output by a computing device; the user interface includes a first axis and a second axis that include parameters that are navigable via one or more gestures. One or more items are chosen by the computing device for concurrent display with the first and second axes that correspond to a first one of the parameters of the first axis and a second one of the parameters of the second axis.
[0004] In implementations, one or more inputs are recognized as selecting a first
one of a plurality of parameters in a first axis and a second one of a plurality of
parameters in a second axis in a user interface output by a computing device using
one or more gestures. The first axis is arranged in the user interface as generally
perpendicular to the second axis. One or more items are output in the user interface
that correspond to the first and second parameters.
[0005] In implementations, a computing device includes a housing, a display
device disposed on the housing, and one or more modules disposed within the
housing. The one or more modules are configured to display a user interface on the
display device, the user interface including a first axis and a second axis that
include parameters that are navigable via one or more gestures detected via
touchscreen functionality of the computing device, that are positioned as generally perpendicular, one to another, and that are selectable via the gestures to cause output of items in the user interface that correspond to the parameters.
[0006] This Summary is provided to introduce a selection of concepts in a
simplified form that are further described below in the Detailed Description. This
Summary is not intended to identify key features or essential features of the
claimed subject matter, nor is it intended to be used as an aid in determining the
scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The detailed description is described with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the
figure in which the reference number first appears. The use of the same reference
numbers in different instances in the description and the figures may indicate
similar or identical items.
[0008] FIG. 1 is an illustration of an environment in an example implementation
that is operable to employ multi-axis navigation techniques described herein.
[0009] FIG. 2 is an illustration of an example system showing a multi-axis
navigation module of FIG. 1 as being implemented in an environment where
multiple devices are interconnected through a central computing device.
[0010] FIG. 3 is an illustration of a system in an example implementation in which multi-axis navigation techniques are described through a first and second axis that may be used to specify a particular point in time.
[0011] FIG. 4 illustrates a system in another example implementation in which multi-axis navigation techniques are described through a first and second axis that may be used to specify a particular point in time.
[0012] FIG. 5 illustrates a system in another example implementation in which multi-axis navigation techniques are described through a first and second axis that are used to specify a particular point in time as a day.
[0013] FIG. 6 depicts a procedure in an example implementation in which multi-axis navigation techniques are used to output content that corresponds to included parameters.
[0014] FIG. 7 illustrates various components of an example device that can be
implemented as any type of portable and/or computer device as described with
reference to FIGS. 1-5 to implement embodiments of the multi-axis navigation
techniques described herein.
DETAILED DESCRIPTION
Overview
[0015] Traditional techniques that were used to organize and locate content may become inefficient when confronted with a large amount of content. For example, it may become cumbersome to locate a particular item of content within a hierarchical arrangement of menus. It may be even more difficult to locate related items of content using traditional techniques, thus leading to potential user frustration.
[0016] Multi-axis navigation techniques are described. In implementations, a user
interface includes a first axis and a second axis that is arranged generally
perpendicular to the first axis. Each axis may be navigated using a gesture to select
a particular parameter included in the axis. For example, the first axis may describe
particular months and the second axis may describe particular years. Selection of
the month and year may then serve as a basis to locate content for output in the user
interface, such as images that were captured at that point in time. The multi-axis
navigation may also be leveraged to locate related content, such as to navigate
through Halloween photos across multiple years for the month of October. In this
way, a user may readily navigate through a large amount of content to locate
content of interest as well as to navigate through related content. Further
discussion of the multi-axis navigation techniques may be found in relation to the
following sections.
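For illustration, the month/year selection just described might be modeled as in the following minimal sketch. The names used here (`ContentItem`, `AxisSelection`, `findItems`) are hypothetical; the specification does not prescribe any implementation.

```typescript
// A minimal sketch of two-axis selection over content, assuming a
// month/year pair of axes; illustrative names, not from the patent.

interface ContentItem {
  title: string;
  created: Date; // the point in time the item corresponds to
}

interface AxisSelection {
  month: number; // 0-11, the parameter selected on the first axis
  year: number;  // the parameter selected on the second axis
}

// Choose items for concurrent display with the axes: those whose
// creation time matches the selected parameter on both axes.
function findItems(items: ContentItem[], sel: AxisSelection): ContentItem[] {
  return items.filter(
    (item) =>
      item.created.getMonth() === sel.month &&
      item.created.getFullYear() === sel.year
  );
}

// E.g., Halloween photos: fix the month to October, vary the year axis.
const octoberOf = (items: ContentItem[], year: number) =>
  findItems(items, { month: 9, year });
```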
[0017] In the following discussion, an example environment is first described that
is operable to employ the multi-axis navigation techniques described herein.
Example illustrations of the techniques and procedures are then described, which
may be employed in the example environment as well as in other environments.
Accordingly, the example environment is not limited to performing the example
techniques and procedures. Likewise, the example techniques and procedures are
not limited to implementation in the example environment.
Example Environment
[0018] FIG. 1 is an illustration of an environment 100 in an example
implementation that is operable to employ multi-axis navigation techniques. The
illustrated environment 100 includes an example of a computing device 102 that
may be configured in a variety of ways. For example, the computing device 102
may be configured as a traditional computer (e.g., a desktop personal computer,
laptop computer, and so on), a mobile station, an entertainment appliance, a set-top
box communicatively coupled to a television, a wireless phone, a netbook, a game
console, and so forth as further described in relation to FIG. 2. Thus, the
computing device 102 may range from full-resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to low-resource devices with limited memory and/or processing resources (e.g., traditional set-top boxes, hand-held game consoles). The computing device 102
may also relate to software that causes the computing device 102 to perform one or
more operations.
[0019] The computing device 102 is illustrated as including an input module 104.
The input module 104 is representative of functionality relating to inputs of the
computing device 102. For example, the input module 104 may be configured to
receive inputs from a keyboard or mouse, to identify gestures and cause operations to
be performed that correspond to the gestures, and so on. The inputs may be
identified by the input module 104 in a variety of different ways.
[0020] For example, the input module 104 may be configured to recognize an
input received via touchscreen functionality of a display device 106, such as a
finger of a user's hand 108 detected as proximal to the display device 106 of the computing
device 102, from a stylus 110, and so on. The input may take a variety of different
forms, such as to recognize movement of the stylus 110 and/or a finger of the user's
hand 108 across the display device 106, such as a tap, drawing of a line, and so on.
In implementations, these inputs may be recognized as gestures.
[0021] A variety of different types of gestures may be recognized, such as gestures
that are recognized from a single type of input (e.g., touch gestures) as well as
gestures involving multiple types of inputs. For example, the computing device
102 may be configured to detect and differentiate between a touch input (e.g.,
provided by one or more fingers of the user's hand 108) and a stylus input (e.g.,
provided by a stylus 110). Thus, the input module 104 may support a variety of
different gesture techniques by recognizing and leveraging a division between
stylus and touch inputs.
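As a hedged illustration of such a division, the sketch below uses the standard web PointerEvent API, which distinguishes touch from pen input. A browser runtime and the element id "display" are assumptions for this example; the patent is not tied to this or any particular platform API.

```typescript
// A sketch of recognizing and leveraging a division between stylus and
// touch inputs, assuming a browser runtime; illustrative only.

function classifyInput(event: PointerEvent): "touch" | "stylus" | "other" {
  switch (event.pointerType) {
    case "touch":
      return "touch";  // e.g., a finger of the user's hand 108
    case "pen":
      return "stylus"; // e.g., input provided by the stylus 110
    default:
      return "other";  // mouse, camera-based NUI input, and so on
  }
}

// Hypothetical wiring: route each input type to its own gesture logic.
const display = document.getElementById("display")!;
display.addEventListener("pointerdown", (event) => {
  const kind = classifyInput(event);
  console.log(`${kind} input at (${event.clientX}, ${event.clientY})`);
});
```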
[0022] Additionally, although the following discussion may describe specific
examples of touch and stylus inputs, in instances the types of inputs may be
switched (e.g., touch may be used to replace stylus and vice versa) and even
removed (e.g., both inputs may be provided using touch or a stylus) without
departing from the spirit and scope thereof. Further, although in instances in the
following discussion the gestures are illustrated as being input using touchscreen
functionality, the gestures may be input using a variety of different techniques by a
variety of different devices, such as to be captured by a camera for use as part of a
natural user interface (NUI).
[0023] The computing device 102 is further illustrated as including a multi-axis
navigation module 112. The multi-axis navigation module 112 is representative of
functionality of the computing device 102 to configure a user interface for multi-axis
navigation. As illustrated on the display device 106, an example user interface
is shown having a first axis 114 and a second axis 116 that is arranged as generally
perpendicular to the first axis 114, although other arrangements are also
contemplated, e.g., parallel.
[0024] Each of the axes includes parameters that are selectable to identify
corresponding content to be output in the user interface. For instance, the first axis
114 is illustrated as referencing particular months and the second axis 116 is
illustrated as representing particular years. Selection of the parameters from each
of the axes may serve as a basis for outputting content that corresponds to the
parameters. Thus, in this way a user may readily select parameters to locate
content of interest and also readily navigate through related content, further
discussion of which may be found in relation to the implementation examples
below.
[0025] FIG. 2 illustrates an example system 200 that includes the computing
device 102 as described with reference to FIG. 1. The example system 200 enables
ubiquitous environments for a seamless user experience when running applications
on a personal computer (PC), a television device, and/or a mobile device. Services
and applications run substantially similarly in all three environments for a common
user experience when transitioning from one device to the next while utilizing an
application, playing a video game, watching a video, and so on.
[0026] In the example system 200, multiple devices are interconnected through a
central computing device. The central computing device may be local to the
multiple devices or may be located remotely from the multiple devices. In one
embodiment, the central computing device may be a cloud of one or more server
computers that are connected to the multiple devices through a network, the
Internet, or other data communication link. In one embodiment, this
interconnection architecture enables functionality to be delivered across multiple
devices to provide a common and seamless experience to a user of the multiple
devices. Each of the multiple devices may have different physical requirements
and capabilities, and the central computing device uses a platform to enable the
delivery of an experience to the device that is both tailored to the device and yet
common to all devices. In one embodiment, a class of target devices is created and
experiences are tailored to the generic class of devices. A class of devices may be
defined by physical features, types of usage, or other common characteristics of the
devices.
[0027] In various implementations, the computing device 102 may assume a variety of
different configurations, such as for computer 202, mobile 204, and television 206
uses. Each of these configurations includes devices that may have generally
different constructs and capabilities, and thus the computing device 102 may be
configured according to one or more of the different device classes. For instance,
the computing device 102 may be implemented as the computer 202 class of a
device that includes a personal computer, desktop computer, a multi-screen
computer, laptop computer, netbook, and so on.
[0028] The computing device 102 may also be implemented as the mobile 204
class of device that includes mobile devices, such as a mobile phone, portable
music player, portable gaming device, a tablet computer, a multi-screen computer,
and so on. The computing device 102 may also be implemented as the television
206 class of device that includes devices having or connected to generally larger
screens in casual viewing environments. These devices include televisions, set-top
boxes, gaming consoles, and so on. The multi-axis navigation techniques described
herein may be supported by these various configurations of the computing device 102 and are not limited to the specific examples of multi-axis navigation techniques described herein.
[0029] The cloud 208 includes and/or is representative of a platform 210 for
content services 212. The platform 210 abstracts underlying functionality of
hardware (e.g., servers) and software resources of the cloud 208. The content
services 212 may include applications and/or data that can be utilized while
computer processing is executed on servers that are remote from the computing device
102. Content services 212 can be provided as a service over the Internet and/or
through a subscriber network, such as a cellular or WiFi network.
[0030] The platform 210 may abstract resources and functions to connect the
computing device 102 with other computing devices. The platform 210 may also
serve to abstract scaling of resources to provide a corresponding level of scale to
encountered demand for the content services 212 that are implemented via the
platform 210. Accordingly, in an interconnected device embodiment,
implementation of functionality of the multi-axis navigation module 112 may be
distributed throughout the system 200. For example, the multi-axis navigation
module 112 may be implemented in part on the computing device 102 as well as
via the platform 210 that abstracts the functionality of the cloud 208.
[0031] Generally, any of the functions described herein can be implemented using
software, firmware, hardware (e.g., fixed logic circuitry), or a combination of these
implementations. The terms "module," "functionality," and "logic" as used herein
generally represent software, firmware, hardware, or a combination thereof. In the
case of a software implementation, the module, functionality, or logic represents
program code that performs specified tasks when executed on a processor (e.g.,
CPU or CPUs). The program code can be stored in one or more computer readable
memory devices. The features of the multi-axis navigation techniques described
below are platform-independent, meaning that the techniques may be implemented
on a variety of commercial computing platforms having a variety of processors.
Multi-axis Navigation Implementation Example
[0032] FIG. 3 illustrates a system 300 in an example implementation in which multi-axis navigation techniques are described through a first and second axis that may be used to specify a particular point in time. The system 300 is illustrated using
first and second stages 302, 304. At the first stage 302, the computing device 102
is illustrated as outputting a user interface configured for multi-axis navigation
having first and second axes 114, 116.
[0033] As before, the first axis 114 is configured to include parameters relating to months and the second axis 116 is configured to include parameters relating to years. Therefore, parameters may be selected using the first and second axes to specify a particular month, an example of which is illustrated as "May 2008" by giving focus to the selected parameters in the first and second axes 114, 116. Content that corresponds to the selected parameters is illustrated as output in conjunction with the first and second axes 114, 116, which in this example are images captured at the particular point in time specified, e.g., May 2008.
[0034] To select different parameters, and therefore specify content for output that
corresponds to the different parameters, the user may interact with the first and/or
second axes 114, 116 in a variety of ways. For example, a drag gesture may be
used which involves selecting a point along an axis and subsequent movement in a
direction to change which parameter of the axis is highlighted. The illustrated
example is shown as a selection made using a finger of the user's hand 108 of a
parameter "July" and subsequent movement (illustrated through use of a phantom
arrow) along the first axis 114 toward a point of the first axis that is given focus.
[0035] A result of the drag gesture is illustrated at the second stage 304. The
month "July" is illustrated as being selected through the use of focus in this
example. Therefore, at this point the parameter "July" is selected for the first axis 114 while the parameter "2008" is still selected for the second axis 116, thereby referencing a particular point in time, e.g., July 2008. Accordingly, the multi-axis navigation module 112 may cause output of content (e.g., images) that corresponds
to the particular point in time. In an implementation, this output may be performed
in real time as one or more axes are scrolled. Naturally, other implementations are
also contemplated, such as to update the user interface when the drag gesture is
completed. Thus, in this example a user may quickly navigate through months in a
year using the first axis 114. The user may also interact with the second axis to
navigate through years, further discussion of which may be found in relation to the
following figure.
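One way the drag-to-scroll behavior might be realized is sketched below. `PARAMETER_EXTENT`, the `Axis` class, and the handler names are assumptions for illustration; the real-time refresh mirrors the implementation choice noted in paragraph [0035].

```typescript
// Illustrative sketch of a drag gesture scrolling an axis, with the
// focused parameter (and hence the displayed items) updated in real time.

const PARAMETER_EXTENT = 80; // assumed px of drag to advance one parameter

class Axis {
  constructor(public parameters: string[], public focused = 0) {}

  // Shift focus by a number of parameters, clamped to the axis bounds.
  scrollBy(steps: number): void {
    this.focused = Math.max(
      0,
      Math.min(this.parameters.length - 1, this.focused + steps)
    );
  }
}

const monthAxis = new Axis([
  "January", "February", "March", "April", "May", "June",
  "July", "August", "September", "October", "November", "December",
]);

let dragAnchor: number | null = null;

function onDragStart(x: number): void {
  dragAnchor = x; // the point along the axis where the drag began
}

function onDragMove(x: number): void {
  if (dragAnchor === null) return;
  const steps = Math.round((x - dragAnchor) / PARAMETER_EXTENT);
  if (steps !== 0) {
    monthAxis.scrollBy(steps);
    dragAnchor = x;
    // Refresh the displayed items for the newly focused parameter while
    // the drag is still in progress (the real-time variant of [0035]).
    console.log(`Focus: ${monthAxis.parameters[monthAxis.focused]}`);
  }
}

function onDragEnd(): void {
  dragAnchor = null;
}
```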
[0036] FIG. 4 illustrates a system 400 in another example implementation in which multi-axis navigation techniques are described through a first and second axis that may be used to specify a particular point in time. As before, system 400 is
illustrated using first and second stages 402, 404. At the first stage 402, the
computing device 102 is again illustrated as outputting a user interface configured
for multi-axis navigation having first and second axes 114, 116.
[0037] In this example, a drag gesture is illustrated as being input in conjunction
with the second axis 116. The finger of the user's hand 108 is shown as selecting a
particular parameter (e.g., 2010) and subsequently moved (e.g., illustrated through
use of a phantom line) to complete the gesture, such as by moving toward an area
having focus. A result of this gesture is illustrated in the second stage 404 in which
the parameter for a year "2010" is shown as selected. Accordingly, the multi-axis
navigation module 112 may output the content that corresponds to the month
"May" in the first axis 114 and the year "2010" in the second axis 116.
[0038] By scrolling through the parameters in the axes, the user may navigate
through corresponding content that is related. For example, the user may select the
month "December" in the first axis 114 and then navigate through different years in
the second axis 116 to view images taken during the holiday season. Although the
examples of FIGS. 3 and 4 described parameters such as "month" and "year" to
specify a particular point in time, a variety of different parameters may be utilized
to specify a particular point in time, another example of which may be found in
relation to the following figure.
[0039] FIG. 5 illustrates a system 500 in another example implementation in which multi-axis navigation techniques are described through a first and second axis that may be used to specify a particular point in time as a day. In this example, the first axis 114 is again shown as including parameters that reference particular months. The second axis 116, however, is shown as referencing particular days. Therefore, in this example the day "May 19" is selected using the first and second axes 114, 116 and the multi-axis navigation module 112 may output content that corresponds to the particular day. Thus, it should be readily apparent that a wide variety of points in time may be specified using the multi-axis techniques described herein, such as particular days, weeks, months, and so on.
[0040] A wide variety of other parameters may also be specified using the multi-axis
techniques described herein to locate and determine relatedness of a wide
variety of different content. For example, the multi-axis techniques may specify a
particular type of document and author to locate documents, a particular type and
title to locate media, a particular sender or recipient along with a parameter relating
to time to locate emails, text messages, voicemails, or other communications, and
so on. Additionally, in an implementation a user interface may be output to specify
parameters to be used by the first and/or second axes 114, 116 as well as to specify
a type of content that is to be the subject of a search using the parameters. Further,
more than two axes may be employed without departing from the spirit and scope
thereof. A variety of other examples are also contemplated as further discussed
below.
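The generalization to non-temporal parameters could be modeled with predicate-carrying axes, as in the hedged sketch below. `GenericAxis`, `Email`, and the sample addresses are hypothetical, one possible reading rather than the patent's prescribed structure; note that more than two axes drop out of the same filtering loop.

```typescript
// A sketch of axes over arbitrary parameters (here sender and year for
// emails), generalizing the month/year example; illustrative only.

type Predicate<T> = (item: T) => boolean;

interface GenericAxis<T> {
  label: string;
  parameters: { name: string; matches: Predicate<T> }[];
  selected: number; // index of the parameter currently given focus
}

interface Email {
  sender: string;
  received: Date;
  subject: string;
}

const senderAxis: GenericAxis<Email> = {
  label: "Sender",
  parameters: ["alice@example.com", "bob@example.com"].map((s) => ({
    name: s,
    matches: (e: Email) => e.sender === s,
  })),
  selected: 0,
};

const yearAxis: GenericAxis<Email> = {
  label: "Year",
  parameters: [2008, 2009, 2010].map((y) => ({
    name: String(y),
    matches: (e: Email) => e.received.getFullYear() === y,
  })),
  selected: 0,
};

// Items shown are those matching the selected parameter on every axis.
function visibleItems<T>(items: T[], axes: GenericAxis<T>[]): T[] {
  return items.filter((item) =>
    axes.every((axis) => axis.parameters[axis.selected].matches(item))
  );
}
```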
Example Procedure
[0041] The following discussion describes multi-axis navigation techniques that
may be implemented utilizing the previously described systems and devices.
Aspects of each of the procedures may be implemented in hardware, firmware,
software, or a combination thereof. The procedures are shown as a set of blocks
that specify operations performed by one or more devices and are not necessarily
limited to the orders shown for performing the operations by the respective blocks.
In portions of the following discussion, reference will be made to the environment
100 of FIG. 1 and the systems 300-500 of FIGS. 3-5.
[0042] FIG. 6 depicts a procedure 600 in an example implementation in which
multi-axis navigation techniques are used to output content that corresponds to
included parameters. A user interface is output by a computing device, the user
interface including a first axis and a second axis that include parameters that are
navigable via one or more gestures (block 602). The first and second axes may be
arranged in a variety of ways, such as generally perpendicular as shown in FIGS. 3-
5, as generally parallel, at an angle to each other, as partially or substantially
overlapping, and so on.
[0043] Additionally, a variety of different parameters may be included on the
axes. For example, the axes may be used to describe a particular point in time
through inclusion of parameters that specify a time of day, day (e.g., a day of a
week, a day of a year), week (e.g., in a month, year, and so on), year, and so forth.
A variety of other parameters may also be leveraged by the first and/or second
axes, such as author, genre, composer, musician, group, type (e.g., email, instant
message, document, media, video, blog, micro-blog, etc.), source (e.g., local,
remote, streaming), topic, and so on. For instance, a user may configure the
parameters manually, as well as what is being searched (e.g., documents, images, or another type as described above), through interaction with a user interface.
A variety of other examples are also contemplated.
[0044] One or more inputs are recognized as selecting a first one of a plurality of
parameters in the first axis and a second one of a plurality of parameters in a second
axis in a user interface, the first axis arranged in the user interface as generally
perpendicular to the second axis (block 604). For example, a drag or other gesture
may be utilized to scroll one or more of the axes to select particular parameters
included in the axes. A variety of other techniques may also be employed, such as
a "tap" gesture to select a parameter, use of a cursor control device, and so forth.
[0045] One or more items are chosen by the computing device for concurrent
display with the first and second axes that correspond to a first parameter in the
first axis and a second parameter in the second axis (block 606). For example, a
gesture may be received to select a parameter "July" in the first axis 114 and the
parameter "2008" that is already referenced in the second axis 116. Accordingly,
the multi-axis navigation module 112 may output content (e.g., images) that
corresponds to the first and second parameters (block 608), such as to view
representations of the content (e.g., icons of documents or music files, thumbnails
of videos or images, etc.) and/or the content itself, e.g., images in their entirety.
[0046] A selection is received of one or more of the items in the user interface
(block 610). Continuing with the previous example, the content shown with the
first and second axes may be selectable to initiate output of the content, such as to
show an image full screen, begin rendering of a video, music or other media, output
of a document for editing, and so on. Thus, a user may navigate to content of
interest using the multi-axis navigation techniques and then initiate output of the content
through interaction with the user interface. Naturally, a variety of other examples
are also contemplated, such as to initiate output automatically.
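Tying the blocks of procedure 600 together, a minimal end-to-end sketch might look as follows. The `Item` type, the hard-coded selections, and the full-screen behavior are assumptions for illustration; block numbers from FIG. 6 are noted in the comments.

```typescript
// An end-to-end sketch of procedure 600; block numbers refer to FIG. 6.

interface Item {
  title: string;
  created: Date;
}

function runProcedure(items: Item[]): void {
  // Block 602: output a user interface whose two axes carry navigable
  // parameters (here months on the first axis, years on the second).
  // Block 604: inputs such as drag or tap gestures select one parameter
  // per axis, e.g., "July" and "2008".
  const selectedMonth = 6; // July (0-based)
  const selectedYear = 2008;

  // Blocks 606/608: choose and output the items corresponding to both
  // selected parameters, e.g., as thumbnails alongside the axes.
  const chosen = items.filter(
    (i) =>
      i.created.getMonth() === selectedMonth &&
      i.created.getFullYear() === selectedYear
  );
  chosen.forEach((i) => console.log(`Thumbnail: ${i.title}`));

  // Block 610: a selection of a displayed item initiates its output,
  // such as showing an image full screen.
  if (chosen.length > 0) {
    console.log(`Full screen: ${chosen[0].title}`);
  }
}
```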
Example Device
[0047] FIG. 7 illustrates various components of an example device 700 that can be
implemented as any type of portable and/or computer device as described with
reference to FIGS. 1 and 2 to implement embodiments of the gesture techniques
described herein. Device 700 includes communication devices 702 that enable
wired and/or wireless communication of device data 704 (e.g., received data, data
that is being received, data scheduled for broadcast, data packets of the data, etc.).
The device data 704 or other device content can include configuration settings of
the device, media content stored on the device, and/or information associated with a
user of the device. Media content stored on device 700 can include any type of
audio, video, and/or image data. Device 700 includes one or more data inputs 706
via which any type of data, media content, and/or inputs can be received, such as
user-selectable inputs, messages, music, television media content, recorded video
content, and any other type of audio, video, and/or image data received from any
content and/or data source.
[0048] Device 700 also includes communication interfaces 708 that can be
implemented as any one or more of a serial and/or parallel interface, a wireless
interface, any type of network interface, a modem, and as any other type of
communication interface. The communication interfaces 708 provide a connection
and/or communication links between device 700 and a communication network by
which other electronic, computing, and communication devices communicate data
with device 700.
[0049] Device 700 includes one or more processors 710 (e.g., any of
microprocessors, controllers, and the like) which process various computer-executable
instructions to control the operation of device 700 and to implement
embodiments of the gesture techniques described herein. Alternatively or in addition, device 700
can be implemented with any one or combination of hardware, firmware, or fixed
logic circuitry that is implemented in connection with processing and control
circuits which are generally identified at 712. Although not shown, device 700 can
include a system bus or data transfer system that couples the various components
within the device. A system bus can include any one or combination of different
bus structures, such as a memory bus or memory controller, a peripheral bus, a
universal serial bus, and/or a processor or local bus that utilizes any of a variety of
bus architectures.
[0050] Device 700 also includes computer-readable media 714, such as one or
more memory components, examples of which include random access memory
(RAM), non-volatile memory (e.g., any one or more of a read-only memory
(ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A
disk storage device may be implemented as any type of magnetic or optical storage
device, such as a hard disk drive, a recordable and/or rewriteable compact disc
(CD), any type of a digital versatile disc (DVD), and the like. Device 700 can also
include a mass storage media device 716.
[0051] Computer-readable media 714 provides data storage mechanisms to store
the device data 704, as well as various device applications 718 and any other types
of information and/or data related to operational aspects of device 700. For
example, an operating system 720 can be maintained as a computer application
with the computer-readable media 714 and executed on processors 710. The device
applications 718 can include a device manager (e.g., a control application, software
application, signal processing and control module, code that is native to a particular
device, a hardware abstraction layer for a particular device, etc.). The device
applications 718 also include any system components or modules to implement
embodiments of the gesture techniques described herein. In this example, the
device applications 718 include an interface application 722 and an input module
724 (which may be the same as or different from the input module 104) that are shown as
software modules and/or computer applications. The input module 724 is
representative of software that is used to provide an interface with a device
configured to capture inputs, such as a touchscreen, track pad, camera, and so on.
Alternatively or in addition, the interface application 722 and the input module 724
can be implemented as hardware, software, firmware, or any combination thereof.
Additionally, the input module 724 may be configured to support multiple input
devices, such as separate devices to capture touch and stylus inputs, respectively.
For example, the device may be configured to include dual display devices, in
which one of the display devices is configured to capture touch inputs while the other captures stylus inputs.
[0052] Device 700 also includes an audio and/or video input-output system 726
that provides audio data to an audio system 728 and/or provides video data to a
display system 730. The audio system 728 and/or the display system 730 can
include any devices that process, display, and/or otherwise render audio, video, and
image data. Video signals and audio signals can be communicated from device 700
to an audio device and/or to a display device via an RF (radio frequency) link, S-video
link, composite video link, component video link, DVI (digital video
interface), analog audio connection, or other similar communication link. In an
embodiment, the audio system 728 and/or the display system 730 are implemented
as external components to device 700. Alternatively, the audio system 728 and/or
the display system 730 are implemented as integrated components of example
device 700.
Conclusion
[0053] Although the invention has been described in language specific to
structural features and/or methodological acts, it is to be understood that the
invention defined in the appended claims is not necessarily limited to the specific
features or acts described. Rather, the specific features and acts are disclosed as
example forms of implementing the claimed invention.
CLAIMS
What is claimed is:
1. A method comprising:
outputting a user interface by a computing device, the user interface
including a first axis and a second axis that include parameters that are navigable
via one or more gestures; and
choosing one or more items by the computing device for concurrent display
with the first and second axes that correspond to a first said parameter of the first
axis and a second said parameter of the second axis.
2. A method as described in claim 1, wherein the first axis is arranged in
the user interface as generally perpendicular to the second axis.
3. A method as described in claim 1, wherein the first and second said
parameters are selected via the one or more gestures.
4. A method as described in claim 3, wherein the one or more gestures
include a drag gesture.
5. A method as described in claim 1, wherein the first said parameter
and the second said parameter reference a point in time and the one or more items
correspond to the point in time.
6. A method as described in claim 5, wherein the point in time is a
particular day, week, or month.
7. A method as described in claim 5, wherein the one or more items
correspond to the point in time by being created at that point in time.
8. A method comprising:
recognizing one or more inputs as selecting a first one of a plurality of
parameters in a first axis and a second one of a plurality of parameters in a second
axis in a user interface output by a computing device using one or more gestures,
the first axis arranged in the user interface as generally perpendicular to the second
axis; and
outputting one or more items in the user interface that correspond to the first
and second parameters.
9. A method as described in claim 8, wherein parameters in the first axis
specify months and parameters in the second axis specify years.
10. A method as described in claim 8, wherein at least one of the items is
selectable to cause an output of the item.

Documents

Application Documents

# Name Date
1 7451-CHENP-2012 POWER OF ATTORNEY 28-08-2012.pdf 2012-08-28
2 7451-CHENP-2012 PCT PUBLICATION 28-08-2012.pdf 2012-08-28
3 7451-CHENP-2012 FORM-5 28-08-2012.pdf 2012-08-28
4 7451-CHENP-2012 FORM-3 28-08-2012.pdf 2012-08-28
5 7451-CHENP-2012 FORM-2 FIRST PAGE 28-08-2012.pdf 2012-08-28
6 7451-CHENP-2012 FORM-1 28-08-2012.pdf 2012-08-28
7 7451-CHENP-2012 DRAWINGS 28-08-2012.pdf 2012-08-28
8 7451-CHENP-2012 DESCRIPTION (COMPLETE) 28-08-2012.pdf 2012-08-28
9 7451-CHENP-2012 CORREPONDENCE OTHERS 28-08-2012.pdf 2012-08-28
10 7451-CHENP-2012 CLAIMS SIGNATURE LAST PAGE 28-08-2012.pdf 2012-08-28
11 7451-CHENP-2012 CLAIMS 28-08-2012.pdf 2012-08-28
12 7451-CHENP-2012.pdf 2012-08-29
13 7451-CHENP-2012 FORM-3 18-02-2013.pdf 2013-02-18
14 7451-CHENP-2012 CORRESPONDENCE OTHERS 18-02-2013.pdf 2013-02-18
15 abstract7451-CHENP-2012.jpg 2013-10-25
16 7451-CHENP-2012 FORM-6 26-02-2015.pdf 2015-02-26
17 MTL-GPOA - JAYA.pdf 2015-03-13
18 MS to MTL Assignment.pdf 2015-03-13
19 FORM-6-1801-1900(JAYA).14.pdf 2015-03-13
20 Power of Attorney [04-04-2017(online)].pdf 2017-04-04
21 Form 6 [04-04-2017(online)].pdf 2017-04-04
22 Assignment [04-04-2017(online)].pdf 2017-04-04
23 7451-CHENP-2012-FER.pdf 2019-09-26
24 7451-CHENP-2012-OTHERS [31-12-2019(online)].pdf 2019-12-31
25 7451-CHENP-2012-FER_SER_REPLY [31-12-2019(online)].pdf 2019-12-31
26 7451-CHENP-2012-DRAWING [31-12-2019(online)].pdf 2019-12-31
27 7451-CHENP-2012-COMPLETE SPECIFICATION [31-12-2019(online)].pdf 2019-12-31
28 7451-CHENP-2012-CLAIMS [31-12-2019(online)].pdf 2019-12-31
29 7451-CHENP-2012-ABSTRACT [31-12-2019(online)].pdf 2019-12-31
30 7451-CHENP-2012-Information under section 8(2) (MANDATORY) [08-01-2020(online)].pdf 2020-01-08
31 7451-CHENP-2012-FORM 3 [08-01-2020(online)].pdf 2020-01-08
32 7451-CHENP-2012-Correspondence to notify the Controller [08-12-2020(online)].pdf 2020-12-08
33 7451-CHENP-2012-Written submissions and relevant documents [30-12-2020(online)].pdf 2020-12-30
34 7451-CHENP-2012-PatentCertificate24-05-2021.pdf 2021-05-24
35 7451-CHENP-2012-IntimationOfGrant24-05-2021.pdf 2021-05-24
36 7451-CHENP-2012-US(14)-HearingNotice-(HearingDate-17-12-2020).pdf 2021-10-17
37 7451-CHENP-2012-RELEVANT DOCUMENTS [25-09-2023(online)].pdf 2023-09-25

Search Strategy

1 7451CHENP2012_18-09-2019.pdf

ERegister / Renewals

3rd: 26 May 2021 (from 15/03/2013 to 15/03/2014)
4th: 26 May 2021 (from 15/03/2014 to 15/03/2015)
5th: 26 May 2021 (from 15/03/2015 to 15/03/2016)
6th: 26 May 2021 (from 15/03/2016 to 15/03/2017)
7th: 26 May 2021 (from 15/03/2017 to 15/03/2018)
8th: 26 May 2021 (from 15/03/2018 to 15/03/2019)
9th: 26 May 2021 (from 15/03/2019 to 15/03/2020)
10th: 26 May 2021 (from 15/03/2020 to 15/03/2021)
11th: 26 May 2021 (from 15/03/2021 to 15/03/2022)
12th: 05 Mar 2022 (from 15/03/2022 to 15/03/2023)
13th: 10 Mar 2023 (from 15/03/2023 to 15/03/2024)
14th: 14 Mar 2024 (from 15/03/2024 to 15/03/2025)
15th: 12 Mar 2025 (from 15/03/2025 to 15/03/2026)