Abstract: A method and system (200, 400) for capturing and retrieving information in a mobile device are described. The system is configured to detect occurrence of a user-specific condition, trigger a monitoring of at least one operational event within said mobile device, access at least one element related to said at least one operational event, and create a log of said mobile device’s events with respect to said user-specific condition, registering at least said accessed element at a designated position within the log. Further, the system is configured to ascertain a log of operational events based on a user input, identify at least one element present within said log at least based on said input parameter, fetch content related to said at least one identified element in said log, and display at least a portion of said fetched content according to an orientation of said identified element within said log.
FIELD OF THE INVENTION:
The present invention relates generally to managing information in a mobile device and in
particular relates to logging and searching information in the mobile device.
BACKGROUND OF THE INVENTION:
Usage of mobile devices such as smartphones, tablets, and palmtops has surged in the last
decade, and so has the usage of a diversity of mobile applications (or mobile-apps) to assist almost
every day-to-day task of a user, ranging from health-check applications to booking a movie ticket.
Owing to such diversity, the variety of data as generated is dispersed across various applications.
Typically, when different mobile-apps are used successively in a particular situation, e.g. a user
downloading a movie through a video app, messaging one of his friends through a message app about
the movie download, and updating his social networking status immediately after having completed
the movie download through a social networking mobile-app, the data gets stored under different
heads. In other words, the mobile device breaks the activity content related to a particular situation
based on application types and stores it in respect of different mobile-apps.
There are indeed some mobile-apps within existing mobile devices (e.g. Gallery) that
tend to collate different types of stored data to provide a centralized mechanism of data access.
These apps store multimedia content based on time, events, location, execution of third-party
mobile-apps, etc. Accordingly, such an app may include a number of categories for stored multimedia
content such as events, timeline, third-party apps, Bluetooth, downloads, etc. However, not
only are the categories of stored content limited in number, but a major portion of the content
of the mobile device is also non-locatable through such mobile-apps. Even in respect of the
stored content accessible through such mobile-apps, the categories of stored content are substantially broad
(e.g. Photos, Videos, Downloads) such that they include a huge amount of data associated thereto and
accordingly require repeated scrolling by the user to traverse the entire category of data.
As a result, considering a scenario, wherein the user forgot the name or number of a friend
whom he messaged while downloading the movie, the user has limited options to ascertain said
details. Either, the user goes through the message log and manually searches. Said search may only
be successful in case the user remembers the date and/or time of the download. Another way may be to
somehow recall the movie name, go into the movie download log, and ascertain the details (e.g.
date and time) of said movie download. Based upon such details, the user has to again go back to
the message log and look out for a correspondingly sent message based on the ascertained movie
details. As may be observed, both scenarios prove substantially cumbersome. In other words, since
content is stored under separate applications, the user has to access log of each application
separately to retrieve information. Yet, even upon having made a substantial search operation and
having spent a substantial duration of time, an accurate result may not be guaranteed. The
probability of finding the accurate results further worsens when a substantial time has elapsed since
the occurrence of the particular situation and associated operations upon the mobile-device, as the
user may only be remembering vague details about his activity or executed communications
through the mobile device.
Although there have been certain mechanisms in mobile devices wherein automatic tags
like Time, Data Type, and Location are associated with the content to provide ease of search to
the user, such mechanisms rely upon a continuous indexing of all of the content without
any discretion in the mobile device, thereby rendering the processor perpetually over-occupied and
draining energy resources such as the battery. Moreover, the search for information related to a
particular content requires a specific and complex string for pulling out information and
accordingly requires a specific skill on the part of the user, thereby prohibiting a mass reach of
such mechanisms.
Another type of content-location mechanism in the mobile device includes reporting all the
mobile-phone enabled activities (images captured, websites browsed, phone calls, etc.) and all
extraneous activities (distance run, number of walking steps, number of burnt calories) for
particular day(s) in a week. However, such a mechanism again executes on its own and does not
adopt any discretion while capturing information, thereby exhibiting a substantial number of
reported results and thus still requiring ample user-conducted navigation to arrive at
precise information. Moreover, since almost every activity is captured for a particular day, the
mechanism again suffers from the problem of excessive utilization of the mobile device’s
resources.
There exists a need for a mechanism in a mobile device that not only enables a time-efficient
and user-friendly retrieval of information within the mobile device, but also proves substantially
less burdensome in respect of the utilization of the mobile device’s resources.
SUMMARY OF THE INVENTION:
This summary is provided to introduce a selection of concepts in a simplified format that
are further described in the detailed description of the invention. This summary is not intended to
identify key or essential inventive concepts of the claimed subject matter, nor is it intended for
determining the scope of the claimed subject matter.
Accordingly, in accordance with the purposes of the invention, the present invention as
embodied and broadly described herein provides a method and system for capturing and retrieving
information in a mobile-device. The system operates at least in accordance with the method that
comprises: ascertaining at least one log of operational events of said mobile device, based at least
on one input parameter as received from a user; identifying at least one element present within said
log at least based on said input parameter; fetching content at least related to said at least one
identified element in said log; and displaying at least a portion of said fetched content in
accordance with an orientation of said identified element within said log.
In another embodiment, the present invention provides a method and system, wherein the
system operates at least in accordance with the method comprising: detecting occurrence of a user-specific
condition; triggering, based on said detection, a monitoring of at least one operational
event within said mobile device; accessing at least one element related to said at least one
operational event; and creating a log of said mobile device’s events with respect to said user-specific
condition and registering at least said accessed element at a designated position within the
log.
The advantages of the present invention include, but are not limited to, enabling the user
to search and locate relevant content within the mobile device on the principle of associative
memory and exhibiting a result set that otherwise may not have been possible to extract
through the provided search instruction. Thus, the user is able to access even such elements that are
independent of the search query and yet may be the content targeted by the user. The user is
able to associate with such targeted content by exploring the content(s) linked to the particular
content that is directly related to the search instruction.
In addition, the claimed subject matter requires a user condition for triggering the
information-capture process, thereby capturing the information discretely (non-continuously) and
utilizing the mobile device’s resources, like the processor and battery, as and when required. In addition,
the claimed subject matter does not replicate data while registering information related to mobile-device
operational events and instead uses the generated data references, thereby occupying non-substantial
memory space.
To further clarify advantages and features of the present invention, a more particular
description of the invention will be rendered by reference to specific embodiments thereof, which
are illustrated in the appended drawings. It is appreciated that these drawings depict only typical
embodiments of the invention and are therefore not to be considered limiting of its scope. The
invention will be described and explained with additional specificity and detail with the
accompanying drawings.
BRIEF DESCRIPTION OF FIGURES:
These and other features, aspects, and advantages of the present invention will become better
understood when the following detailed description is read with reference to the accompanying
drawings in which like characters represent like parts throughout the drawings, wherein:
Figure 1 illustrates a flow chart corresponding to a first embodiment of the invention;
Figure 2 illustrates a system in accordance with first embodiment of the present invention;
Figure 3 illustrates a flow chart corresponding to a second embodiment of the invention;
Figure 4 illustrates a system in accordance with second embodiment of the present invention;
Figure 5 illustrates a flow chart corresponding to a third embodiment of the invention;
Figure 6 illustrates an exemplary operation in respect of the first embodiment;
Figure 7 illustrates a specific type of operation in accordance with Fig. 6 through a user-interface
application;
Figure 8 depicts a pictorial representation of the operation in respect of Fig. 7;
Figure 9 illustrates another type of operation in accordance with Fig. 6 through a user-interface
application;
Figure 10 represents the operation of the first embodiment in terms of its relevant entities;
Figure 11 illustrates an exemplary operation in respect of the second and third embodiments;
Figure 12 illustrates the exemplary operation associated with Fig. 11 through a user-interface;
Figure 13 illustrates another type of exemplary operation associated with Fig. 11 through a
user-interface;
Figure 14 represents the operation of the second and third embodiments in terms of their
relevant stages;
Figure 15 illustrates a detailed architecture of the system illustrated in Figure 2;
Figure 16 illustrates a detailed architecture of the system illustrated in Figure 4; and
Figure 17 illustrates an implementation of the system illustrated in Figure 2 and Figure 4 in a
computing-environment.
Further, skilled artisans will appreciate that elements in the drawings are illustrated for
simplicity and may not necessarily have been drawn to scale. For example, the flow charts
illustrate the method in terms of the most prominent steps involved to help to improve
understanding of aspects of the present invention. Furthermore, in terms of the construction of the
device, one or more components of the device may have been represented in the drawings by
conventional symbols, and the drawings may show only those specific details that are pertinent to
understanding the embodiments of the present invention so as not to obscure the drawings with
details that will be readily apparent to those of ordinary skill in the art having benefit of the
description herein.
DETAILED DESCRIPTION:
For the purpose of promoting an understanding of the principles of the invention, reference
will now be made to the embodiment illustrated in the drawings and specific language will be used
to describe the same. It will nevertheless be understood that no limitation of the scope of the
invention is thereby intended, such alterations and further modifications in the illustrated system,
and such further applications of the principles of the invention as illustrated therein being
contemplated as would normally occur to one skilled in the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description and
the following detailed description are exemplary and explanatory of the invention and are not
intended to be restrictive thereof.
Reference throughout this specification to “an aspect”, “another aspect” or similar language
means that a particular feature, structure, or characteristic described in connection with the
embodiment is included in at least one embodiment of the present invention. Thus, appearances of
the phrase “in an embodiment”, “in another embodiment” and similar language throughout this
specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover
a non-exclusive inclusion, such that a process or method that comprises a list of steps does not
include only those steps but may include other steps not expressly listed or inherent to such process
or method. Similarly, one or more devices or sub-systems or elements or structures or components
preceded by "comprises... a" does not, without more constraints, preclude the existence of other
devices or other sub-systems or other elements or other structures or other components or
additional devices or additional sub-systems or additional elements or additional structures or
additional components.
Unless otherwise defined, all technical and scientific terms used herein have the same
meaning as commonly understood by one of ordinary skill in the art to which this invention
belongs. The system, methods, and examples provided herein are illustrative only and not intended
to be limiting.
Embodiments of the present invention will be described below in detail with reference to the
accompanying drawings.
Now referring to Figure 1, it can be seen that the present invention provides a method 100 for
capturing information in a mobile device. In accordance with a first embodiment, the method
comprises detecting (step 102) occurrence of a user-specific condition, wherein such condition may
be a user input provided to the mobile device to capture information related to events within the
mobile device, e.g. events occurring in a given time-frame or at a given location, or including a user-defined
keyword. Based on said detection, monitoring of information related to the
corresponding one or more events within the mobile device is triggered (step 104). Thereafter,
elements related to occurrence of said at least one operational event are accessed (step 106), and a
log of the mobile-device events related to the user-provided input is created (step 108), and said
accessed elements are registered therein. Specifically, the accessed elements are allocated a
designated position within the log for the purposes of further depiction.
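Purely for illustration, the four steps above may be sketched as follows; every class, field, and function name here is a hypothetical example and does not correspond to any implementation disclosed in the embodiments:

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """A reference to content produced by an operational event."""
    event_type: str   # e.g. "message", "song", "browse"
    reference: str    # pointer to the actual content, not a copy

@dataclass
class RecapLog:
    """A log of events captured under one user-specific condition."""
    condition: str
    elements: list = field(default_factory=list)

    def register(self, element):
        # Step 108: each element is appended at a designated position,
        # so the order of elements reflects the order of occurrence.
        self.elements.append(element)

def capture(condition, events):
    """Steps 102-108: given a detected user-specific condition, monitor
    events, access their elements, and create a log with each element
    registered at a designated position."""
    log = RecapLog(condition)
    for event in events:                                # step 104: monitor
        element = Element(event["type"], event["ref"])  # step 106: access
        log.register(element)                           # step 108: register
    return log
```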
Referring to Figure 2, the present invention also provides an apparatus 200 for capturing
information in a mobile device. The apparatus 200 comprises a memory 202, an input device 204
for receiving a user-specific condition, and a processor 206 to perform the aforesaid steps 104 to
108, based on said received user condition.
The input device 204 may in turn render a graphical user interface (as explained later) to
receive said user input based on the user-defined condition. Based on the received user input, the
processor 206 begins the monitoring of one or more events within the mobile device in terms of
their details, outputted results, etc. In another scenario, the processor 206 monitors events that have
already happened and their associated details/results. Clearly, both of such scenarios are based on the
type of received input. Accordingly, a log specific to the provided user input gets generated and the
findings of the monitoring are placed at a designated position within the log.
In addition, the system 200 also includes other miscellaneous components that enable an
operational interconnection between the input device 204 and the processor 206 for execution of
their respective functionalities.
Now referring to Figure 3, it can be seen that the present invention in accordance with a
second embodiment provides a method 300 for retrieving information in a mobile-device. The
method comprises ascertaining (step 302) at least one log of operational events of said mobile
device, based on an input parameter as received from a user. Such log is ascertained out of one or
more logs generated within the mobile device as a result of Fig. 1 or Fig. 2. Thereafter, one or more
elements as present within the log are identified (step 304) at least based on said parameter.
However, as will be explained later, other types of elements may also be identified based on a
criterion that is different from the input parameter as provided by the user. Accordingly, based on
said one or more identified elements, content related to the identified elements is fetched (step 306)
from a main memory location of the device. Finally, either a portion or the entire fetched content is
displayed (step 308) as an end-result, wherein said display follows a pattern that is based on the
position of the identified elements within the corresponding stored log.
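The retrieval steps above may likewise be sketched, purely for illustration; the log structure, field names, and matching rule are assumptions for the example and not limitations of the method:

```python
def retrieve(logs, parameter):
    """Steps 302-308: ascertain the matching log(s), identify elements by
    the input parameter, fetch content via stored references, and order
    the result by each element's position within the log."""
    results = []
    for log in logs:
        # Step 302: shortlist logs relevant to the input parameter.
        if parameter.lower() not in log["condition"].lower():
            continue
        for position, element in enumerate(log["elements"]):
            # Step 304: identify elements matching the parameter.
            if parameter.lower() in element["text"].lower():
                # Step 306: fetch content through the stored reference.
                content = element["ref"]
                # Step 308: retain the position to drive display order.
                results.append((position, content))
    return [content for _, content in sorted(results)]
```

The sort on positions mirrors the requirement that the display follows the orientation of the identified elements within the log.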
Referring to Figure 4, the present invention also provides a system 400 for retrieving
information in a mobile-device. The apparatus 400 comprises an input device 402 for receiving at
least one input parameter from a user, wherein said input parameter may be a search query to
search information within one or more pre-generated logs. A processor 404 is configured to
shortlist one or more logs of operational events of said mobile device based on said input parameter,
wherein at least one element present within said log is identified at least based on said input
parameter. Other elements may also be identified based on a different criterion, as will be
explained later in the description.
Thereafter, the processor 404 fetches content related to said at least one identified element
from a main memory location. Finally, a display device 406 displays at least a portion of said
fetched content as final result. Accordingly, the content is displayed in accordance with the
positions of the identified element inside said log.
In addition, the system 400 also includes other miscellaneous components that enable an
operational interconnection between the input device 402, the processor 404, and the display device
406 for execution of their respective functionalities.
Now referring to Figure 5, it can be seen that the present invention in accordance with a
third embodiment provides another method 500 for retrieving information in a mobile-device. The
method 500 comprises ascertaining (step 502) at least one log of operational events of said mobile
device based on user-provided input parameter. At least two elements present within said log are
identified (step 504), such that identity of one of the elements is based on said input parameter, and
accordingly content related to said identified elements is fetched (step 506). Further, at least a
portion of said fetched content is displayed (step 508), wherein such display is rendered in
accordance with the respective positions of said identified elements in said log.
The steps as outlined in the present Fig. 5 may be executed by the input device 402, the
processor 404, and the display device 406 as present in Fig. 4.
Figure 6 depicts an exemplary operation in accordance with the method depicted in the aforesaid
embodiment of Fig. 1. As may be seen from the figure, the exemplary operation is in the form of a
sequence of steps, wherein each exemplary step in the present figure merely exemplifies the
method steps as recited in Fig. 1 and accordingly shall not be construed as limiting the scope of
aforesaid steps. Further, the log of information as described above may be interchangeably referred
to as ‘recaps’ or ‘sachets’.
At step 602, a user input is received to create a recap or in other words to trigger creation of
a recap. In the present example, the recaps have been depicted as being time based or location
based. More specifically, the time-based recap request leads to capturing of information related to
operational events within the mobile device that occur within a definite time-interval, or at a
particular location of the mobile device. Other criteria of instructing capturing of information have
been illustrated later within the description.
The operational events may include an incoming call, an outgoing call, an incoming
message, an outgoing message, browsing of internet through mobile device, and a user-performed
operation over the mobile device with or without the network assistance.
In another example, the present step 602 also corresponds to a situation wherein the state of
the user of the mobile device is considered as the input, without any manually provided
input being received from the user. The state of the user may be a jogging state or a driving state. Accordingly,
detecting denotes either automatically sensing said user-specific condition or receiving a user input
that corresponds to said user-specific condition.
The present step 602 accordingly corresponds to the step 102 of Fig. 1.
The receipt or occurrence of the user input causes a control transfer to step
604, wherein a ‘recap on-demand capture trigger’ module triggers capturing of
information. The present step 604 corresponds to the step 104 of Fig. 1, while the capture module
is executed by the processor 206 of Fig. 2. In case of the sensed user condition, the capturing
proceeds instantly. In case of the user-provided input, the capturing commences at a time when the
condition present within the user-input gets satisfied.
At step 606, the monitoring of the data takes place as a result of triggering in step 604. A
data scan module may be authorized to carry on such monitoring. As later depicted in step 608, the
actual content stored in the designated memory or database of the mobile device is scanned as a
part of said monitoring. Said content to be scanned may be an already generated content associated
with already happened operational events within the mobile device. In other scenario, the content
to be scanned may also include content that is being generated in run-time. The content to be
scanned is selected depending upon the type of user input as have been received in step 602.
In another example, in spite of having been triggered, the monitoring is postponed to a
later time, e.g. in case the mobile device is heavily occupied or under-charged. Accordingly, in such
a scenario, the monitoring gets automatically scheduled to trigger when the mobile device assumes
a charging state, an idle state, or an under-utilized state.
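The deferral decision described above may be sketched as a simple predicate; the specific thresholds below are illustrative assumptions only, not values taught by the embodiments:

```python
def should_monitor_now(battery_level, cpu_load, is_charging, is_idle):
    """Decide whether to run the capture scan immediately or defer it
    until the device is charging, idle, or under-utilized."""
    if is_charging or is_idle:
        return True                  # device resources are free
    if battery_level < 0.2 or cpu_load > 0.8:
        return False                 # under-charged or heavily occupied
    return True
```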
The present step 606 also refers to the step 104 of the Fig. 1, while the data scan module is
also executed by the processor of Fig. 2.
At step 610, data references related to the content as scanned in the preceding steps are
generated. In another embodiment, pre-generated data references (if any) in respect of the scanned
content may be considered. As understood by the person skilled in the art, the data references to the
content denote pointers to the memory location of the generated content. A module such as a raw
data-reference generator module discharges the present step 610 and is executed by the processor
of Fig. 2.
At step 612, the data references as generated in the preceding steps are grouped together
under different groups, wherein each group corresponds to a specific category of data references that
point to analogous content. Each of the grouped references denotes a single element. For
example, the data references pointing to photos, videos, songs, etc. may be clubbed together to form
corresponding different elements. The grouping action may be discharged by a reference data
group generator module that is executed by the processor of Fig. 2.
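The grouping of step 612 may be sketched, purely for illustration, as clustering references by category; the dictionary shape and field names are assumptions for the example:

```python
from collections import defaultdict

def group_references(references):
    """Step 612: group raw data references by category; each group of
    analogous references becomes a single element in the recap."""
    groups = defaultdict(list)
    for ref in references:
        # Only the pointer is retained, never the referenced content.
        groups[ref["category"]].append(ref["pointer"])
    return dict(groups)
```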
At step 614, the elements created in the previous step are linked with one-another based
upon the specific user-condition as had been received in the step 602. The data references that
could not be grouped in the previous step are also linked as individual-elements with the group
based elements. Further, elements as linked are tagged by a descriptor-identifier such as a tag. For
example, elements that point to the birthday-related message may be tagged with a birthday cake
based identifier. The present step 614 is dischargeable by a Recap linking and auto-tagging
module, wherein said module is executed by the processor in Fig. 2
The steps 610 till 614 correspond to the step 106 of Fig. 1.
At step 616, the log of information, or recap, is generated as a result of the linked elements, such
that the linkage of elements with one another leads to specific positions of the elements within the log.
In other words, each element gets oriented at a unique position within a chain resulting from the
linkage.
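The linking of steps 614-616 may be sketched as ordering elements by time of occurrence and chaining each to the next, so that every element receives a unique position; the timestamps and field names below are hypothetical:

```python
def link_elements(elements):
    """Steps 614-616: order elements by time of occurrence and link each
    element to the next, giving every element a unique position within
    the resulting chain (the recap)."""
    ordered = sorted(elements, key=lambda e: e["time"])
    for position, element in enumerate(ordered):
        element["position"] = position
        # Last element in the chain has no successor.
        element["next"] = (ordered[position + 1]["name"]
                           if position + 1 < len(ordered) else None)
    return ordered
```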
Figure 7 illustrates a specific type of operation in accordance with Fig. 6 through a user-interface
application. More specifically, Fig. 7 (a to c) depicts a ‘time-based recap’ creation
through a user interface as depicted in the figure. Fig. 7a depicts options to create different
recaps, among which the time-based recap is one option. Upon having selected that option, the
user may further select a time-interval (say 2:30 PM till 6:30 PM) as a part of Fig. 7b.
Accordingly, as a result, the system 200 operates in accordance with the description in previous
figures, and forms the time-based recap 700 as depicted in Fig. 7c. The recap 700 as shown within
Figure 7c illustrates notifications or data related to operational events in the aforesaid time-interval,
wherein examples of such events include messaging 702, played songs 704, details of
browsed websites 706, etc. In addition, the songs 704 as depicted within the recap denote a group
of 10 songs, i.e. a group of related events.
In addition, each of the elements as depicted in recap 700 bears a specific position by virtue
of its time of occurrence. For example, from the log 700 it is clear that the songs 704 have been
accessed within the mobile device after having browsed internet and before messaging another
subscriber. Accordingly, without going into specific timing details, the visual depiction of linkage
of the elements in the recap 700 indicates an order of occurrence of mobile-device’s operational
events.
In addition, another section 708 within the recap 700 depicts a summary of the events by
denoting symbols related to messaging, songs, and internet browsing, each symbol having been
superscripted by numerals denoting the number of notifications therein. For example, the symbol of
a song superscripted by 10 denotes a group of 10 songs under the head ‘songs’.
Figure 8 depicts a pictorial representation of the operation in respect of Fig. 7, wherein a
time-based recap is instructed to be constructed over the last 3 hours. More specifically, Fig. 8 is
essentially a timing diagram that depicts initiation of the recap construction at a point that is 3
hours earlier than present time. In other words, a time window of last 3 hours has been selected to
capture information and create log. As may be seen from the figure, data related to all the events
such as messaging, photo capturing, phone calls, video recording etc may be captured and
corresponding elements may be positioned in the recap.
In addition, as may be seen from Figure 8, various tags may be automatically associated
with the elements in the recap. For example, presence of a birthday-cake related keyword within
any of the elements will lead to an automatic incorporation of “birthday cake”, “gift”, “party”, etc.
related tags upon the specific element in the recap or upon the recap itself.
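The auto-tagging behaviour may be sketched as a keyword-to-tag lookup; the rule table below is a hypothetical example and not a mapping disclosed by the embodiments:

```python
# Hypothetical keyword-to-tag table; the actual mapping would be
# implementation-specific.
TAG_RULES = {
    "birthday": ["birthday cake", "gift", "party"],
    "bill": ["bill"],
}

def auto_tag(element_text):
    """Step 614 (tagging): attach descriptor-identifier tags when a
    known keyword appears within an element's content."""
    tags = []
    text = element_text.lower()
    for keyword, keyword_tags in TAG_RULES.items():
        if keyword in text:
            tags.extend(keyword_tags)
    return tags
```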
Along the lines of Fig. 7, Figure 9 (a to c) illustrates another type of operation in accordance
with Fig. 6, wherein the recap is instructed for creation at a future time interval, 4:30 PM till 6 PM,
as depicted by Fig. 9(b). Accordingly, Fig. 9(c) depicts an ongoing time-based recap construction 900
when the clock strikes 4:30 PM. Beneath said ongoing recap construction, the already created
recaps 902 may be seen and accessed if instructed by the user.
Figure 10 represents the operation of the first embodiment in terms of its relevant entities.
As shown in the figure, the first entity deals with “Mobile applications” 1002, wherein
the mobile applications (native or 3rd party) are selected based on the user-provided
conditions for gathering the information. Likewise, applications running on devices (e.g. smart
watches, appliances, etc.) connected to the mobile device are also considered for creating the recap,
based upon the type of user-specific condition or a user’s demand. Other examples of applications
have been provided within the portion of Fig. 10 referred to by 1002.
The next entity relates to “Activities” 1004 that denote the user-operations upon the selected
application, e.g. downloading a song, making a call, messaging, wireless interaction with another
device etc. The type of activity and its outcome is tracked to form the input for the next entity
1006.
A next entity relates to the “Data logger” 1006, which is a combination of the raw data-reference
generation module, the reference grouping module, and the data group linking and tagging module. The
information captured from performance of operations over different applications is registered as
elements at a designated log, based on their timeline of occurrence of events. The tags may be
associated with the elements automatically or may be manually added by the user when the log is
displayed. The users may also add extra tags to the elements or log for reference. In case of the
connected device such as a smart watch, the log may be stored in a split condition, i.e. the log gets
bifurcated between the mobile device and the connected device to save memory within the mobile
device. Nevertheless, as aforesaid, the data references or groups of data references are registered in
the log as elements and accordingly, there is no duplication of data.
A further entity relates to the “Database” 1008 as maintained by the memory of the mobile
device that maintains the system 200. In an example, in Android OS driven mobile devices, an SQLite
database is used for storing the actual content. As far as a designated database of the system 200 is
concerned, it stores only relevant data, i.e. reference data groups/elements or ungrouped data
references/individual elements that are present in the linked form within the log. For example, for
information related to a call in a time-based recap, only the caller/callee name and number is stored in the
designated database of the system 200, while the complete details about the call are stored in the default
call-logs database maintained by the database 1008.
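The reference-only storage described above may be sketched as extracting only the minimal fields from a full record; the field names are illustrative assumptions, not part of the disclosed database schema:

```python
def register_call_element(full_call_record):
    """Store only the reference-relevant fields in the recap's designated
    database; the complete record stays in the default call-logs
    database, so no data is duplicated."""
    return {
        "name": full_call_record["name"],
        "number": full_call_record["number"],
        "ref": full_call_record["id"],  # pointer into the call-logs DB
    }
```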
While the aforesaid description has depicted time based recaps or logs, there may be
location based recaps wherein all the elements are linked with each other based on a common
location. For example, a recap based on New Delhi railway station will create a log of all
operational events within the mobile device that happened while the user was within the
geographical coordinates of the New Delhi railway station. As soon as the user moves to some
other geographical location, the recap stops being created and is automatically stored.
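The location based recap can be sketched as a simple geofence check. The bounding-box coordinates and event names below are made up for illustration; capture continues while positions fall inside the box and the recap is closed on the first position outside it:

```python
# Illustrative geofence gate for a location based recap (coordinates are
# hypothetical, not the actual bounds of New Delhi railway station).
def inside(box, lat, lon):
    (lat1, lon1), (lat2, lon2) = box
    return lat1 <= lat <= lat2 and lon1 <= lon <= lon2

station = ((28.640, 77.215), (28.645, 77.225))   # assumed bounding box
track = [(28.642, 77.220, "call"), (28.643, 77.221, "photo"),
         (28.700, 77.300, "msg")]                 # (lat, lon, event)

recap, closed = [], False
for lat, lon, event in track:
    if inside(station, lat, lon):
        recap.append(event)          # still at the location: keep capturing
    else:
        closed = True                # user left: stop and store the recap
        break
print(recap, closed)   # ['call', 'photo'] True
```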
Yet another recap may be a user-physical-condition based recap, wherein all the mobile
device operational events will be registered as elements within the recap as long as the user
remains within a particular physical condition, for example, a jogging condition recap, a driving
condition recap, etc. Once such a recap is instructed to be constructed, the system will
automatically sense the user's physical condition and capture the information as and when the
physical condition is prevalent.
Another example of a recap as instructed by the user may be a recap that captures all those
operational events that involve the presence of a keyword. For example, the user can define “Bill”
as a keyword, and any content in the device (Messages, Email, Contacts etc.) which has the
keyword “bill” will be pushed to the Bill recap.
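The keyword recap amounts to a case-insensitive filter over content items. A minimal sketch, with made-up item records:

```python
# Sketch of a keyword recap: any content item whose text contains the
# user-defined keyword (case-insensitively) is pushed to that recap.
def keyword_recap(keyword, items):
    return [item for item in items if keyword.lower() in item["text"].lower()]

items = [
    {"app": "Messages", "text": "Electricity bill due Friday"},
    {"app": "Email",    "text": "Team lunch at noon"},
    {"app": "Email",    "text": "Your phone Bill is ready"},
]
print([i["app"] for i in keyword_recap("Bill", items)])  # ['Messages', 'Email']
```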
Yet another example of a recap instructed by the user may be a recap based on usage or a
type of operation performed over one or more pre-defined mobile device applications. For
example, actions performed over an App, such as sharing or liking a specific content, or a call
from an unknown number, can lead to that specific instance of data being captured. Accordingly,
there may be recaps such as a sharing recap, a liking recap, an unknown call number recap, or a
self-captured photograph recap or selfie recap. Such recaps will have various elements that have
resulted owing to the particular nature of actions. For example, a selfie recap will only have
self-captured photographs.
Yet another example of a recap as instructed by the user may be a recap based on interaction
of the mobile device with another device. For example, in case a video stored on the mobile device
is being screened on any connected device, the corresponding recap as constructed is a collection
of elements that denote occurrence of the streaming activity or interaction with the connected
device. Accordingly, such a recap, when seen by the user, can not only notify the user about the
interaction with other devices, but also lead the user to access the actual streamed content from the
main memory.
Yet another example of a recap may be a user's self-constructed recap, wherein the user
would have manually instructed any action or activity performed over the mobile device to be
reported in the user-constructed recap. For example, after having had an important chat with an
unknown caller, the user may simply flag the details to be registered in a user-defined recap. In
other words, such a customized recap is formed by direct selection of content by the user in
relation to one or more operational events within the mobile device.
Further, the elements within the log, or the log itself, may be marked with tags or identifiers
automatically. For example, the elements of the log or the log itself may be automatically tagged
with day/night tags based on the time of day at which they were created. Moreover, the system
200 can also read into the elements within the log and associate tags with the elements or the log.
For example, if the log has a message inside it which mentions the word “birthday”, then Gift
Box and Birthday Cake stickers get automatically associated with the log or the recap. Likewise,
while a location based recap is being constructed, the user at the particular location may be
downloading a menu card of a restaurant present within that location. Accordingly, a ‘fork and
knife’ based tag is affixed to the location recap or to a corresponding element within the location
based log. Nonetheless, while the user is observing a recap being constructed or a pre-generated
recap, the tags may be manually associated by the user with the logs or the elements within the log.
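The automatic tagging described above can be sketched as a small rule table. The keyword-to-sticker mapping and the day/night boundary hours below are illustrative assumptions:

```python
from datetime import datetime

# Sketch of automatic tagging: elements get a day/night tag from their
# timestamp, and keywords in the content (e.g. "birthday", "menu") attach
# sticker tags to the element or log. The rule table is an assumption.
STICKERS = {"birthday": ["gift-box", "birthday-cake"],
            "menu": ["fork-and-knife"]}

def auto_tags(text, ts):
    tags = ["day" if 6 <= ts.hour < 18 else "night"]   # assumed day window
    for word, stickers in STICKERS.items():
        if word in text.lower():
            tags.extend(stickers)
    return tags

print(auto_tags("Happy birthday!", datetime(2017, 1, 17, 14, 0)))
# ['day', 'gift-box', 'birthday-cake']
```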
The forthcoming description will now illustrate an exemplary operation as per the second
and third embodiments to describe retrieval of a particular log and depiction of specific
information therein, out of a number of pre-generated logs. The user instruction to perform such a
search operation includes a search query that may comprise one or more of a keyword, a tag, a
special character, or any other parameter such as a voice command or a touch-gesture drawn
pattern.
Figure 11 depicts an exemplary operation in accordance with the method depicted in the
aforesaid embodiments of Fig. 3 and Fig. 5. As may be seen from the figure, the exemplary
operation is in the form of a sequence of steps, wherein each exemplary step in the present figure
merely exemplifies method steps as recited in Fig. 3 and Fig. 5 and accordingly shall not be
construed as limiting the scope of the aforesaid steps.
At step 1102, the input device 402 of the system 400 receives an input parameter through
the user input. The input parameter is received through a user-interface and comprises a user-typed
or selected parameter which may be a user text such as a keyword, a pre-defined identifier such as
a tag, a numeric character, an alphanumeric character, or a special character. The tag may be
provided by the user through selection of an image, a sign, a special character, etc. In other
examples, the input parameter may be a voice based command or a pattern drawn through a
touch-gesture. In yet another example, the input parameter may be a picture or image captured by
a mobile device camera or any other type of imaging device, based on which a particular log is
retrieved and specific information is depicted. The user input as received in step 1102 may be a
search query to retrieve one or more relevant logs and observe relevant information. The step 1102
corresponds to steps 302 and 502 of Fig. 3 and Fig. 5.
At step 1104, the input parameter as has been received is analysed by a recap user-input
analyser that is driven by the processor of the system 400. In case of a mobile device camera
captured photograph/image/picture acting as the input parameter, the same is parsed as a part of
the analysis. Accordingly, the parsing leads to an automatic generation of intermediate keyword(s)
that is/are in turn analysed. The step 1104 also corresponds to steps 302 and 502 of Fig. 3 and
Fig. 5.
At step 1106, the input parameter as analysed in the previous step is used to determine a
pivot-information. The pivot information is a category of logs, such as a time based log, a location
log, a user-activity log, a user action log, or any other log category that has been illustrated in the
description so far. Accordingly, the pivot information denotes an overall context associated with
said received input parameter, which may be a combination of keywords and tags. The pivot
information may be obtained from the database of the system 200, or in other words a recap
database, through the action of a recap pivot matcher module, which is driven by the processor
402 of the system 400. The step 1106 also corresponds to steps 302 and 502 of Fig. 3 and Fig. 5.
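The pivot determination of step 1106 can be sketched as scoring each log category against the tags/keywords of the analysed input. The category-to-keyword table below is a hypothetical stand-in for whatever the recap pivot matcher actually consults:

```python
# Hypothetical recap pivot matcher: the pivot (log category) is the category
# whose keyword set best overlaps the analysed input tags/keywords.
PIVOTS = {
    "time":     {"calendar", "day", "night", "date"},
    "location": {"place", "station", "city"},
    "activity": {"jogging", "driving"},
}

def match_pivot(input_tags):
    scores = {p: len(keys & input_tags) for p, keys in PIVOTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None   # None: no context found

print(match_pivot({"calendar", "day", "message"}))  # time
```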
At step 1108, the analysed input parameter is used to locate a specific log ID or recap ID
from the recap database, wherein the search of such a recap is restricted to the logs associated
with the pivot determined in step 1106. In an example, said recap ID includes the tags that are
equivalent to the tags provided within the user-input. Accordingly, the present step 1108 may be
executed by a Recap Tag Matcher module that is driven by the processor 402. The step 1108 also
corresponds to steps 302 and 502 of Fig. 3 and Fig. 5.
At step 1110, one or more elements are identified within the log as denoted by the recap ID
which has been located in step 1108. While one of the identified elements corresponds to the
analysed input parameter, the other identified elements in the log are independent of the analysed
input parameter and may still be identified from the log ID based on proximity of linkage to the
identified element that directly corresponds with the input parameter. Accordingly, in an example,
there may be three to four identified elements that may be relevant for display as a preferable
output within the log.
The present step 1110 is performed by a reference data group matcher module acting upon a
reference-data group database, wherein said module may be driven by the processor 402. As
repeatedly pointed out previously, the identified elements may either be a group of similar data
references or an ungrouped individual data reference. In addition, the present step 1110
corresponds to steps 304 and 504 of Fig. 3 and Fig. 5.
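The proximity-of-linkage identification of step 1110 can be sketched as follows: the element that directly matches the input parameter is found first, and its neighbours in the log's linkage sequence are pulled in as well. The window size of one neighbour on each side is an illustrative assumption:

```python
# Sketch of proximity-based element identification: besides the directly
# matching element, neighbouring elements in the log sequence are identified
# too (window size is an assumed parameter).
def identify(log_elements, matches, window=1):
    out = []
    for i, elem in enumerate(log_elements):
        if matches(elem):
            lo, hi = max(0, i - window), min(len(log_elements), i + window + 1)
            for j in range(lo, hi):
                if log_elements[j] not in out:
                    out.append(log_elements[j])
    return out

log = ["call:mom", "msg:9847", "photo:IMG_01", "email:promo"]
print(identify(log, lambda e: "9847" in e))
# ['call:mom', 'msg:9847', 'photo:IMG_01']
```

This is also the mechanism behind the "associative search" illustrated later in Fig. 12, where photos are returned without appearing in the query.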
At step 1112, the data references pertaining to each of the identified elements in step 1110
are retrieved. In addition, as will be understood later, the data references pertaining to
non-identified elements present within said at least one log are also retrieved from a raw-data
reference database. The present step 1112 is executed by a data reference matcher module acting
upon the raw data reference database and corresponds to steps 306 and 506 of Fig. 3 and Fig. 5.
At step 1114, the actual content pertaining to the data references is fetched from a main
memory of the mobile device. The content comprises a first type of content pertaining to the
identified element that directly pertains to the input parameter. A second type of content pertains
to the other identified elements that do not pertain to the input parameter. In addition, the content
pertaining to the non-identified elements within the log is also retrieved. The present step 1114 is
executed by a data fetcher module acting upon the main memory of the mobile device, and
corresponds to steps 306 and 506 of Fig. 3 and Fig. 5.
At step 1116, the data reference matcher module draws a mapping among the recap ID,
identified elements and actual content (as fetched in the previous step 1114).
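The mapping of step 1116 ties the recap ID to its identified elements and each element to the content fetched for it, preserving the log's sequence for the display step. The dictionary layout below is an assumption for illustration:

```python
# Sketch of the step-1116 mapping among recap ID, identified elements, and
# fetched content (the structure is hypothetical, not the patented format).
def build_mapping(recap_id, elements, fetched):
    return {
        "recap_id": recap_id,
        "elements": [
            # Elements stay in log order so the display can honour the
            # orientation/linkage/sequence of the log.
            {"element": e, "content": fetched.get(e, "<not fetched>")}
            for e in elements
        ],
    }

fetched = {"msg:9847": "See you at 5", "photo:IMG_01": "<jpeg bytes>"}
m = build_mapping("recap-17jan", ["msg:9847", "photo:IMG_01"], fetched)
print(m["recap_id"], len(m["elements"]))  # recap-17jan 2
```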
At step 1118, a graphical representation of the log is depicted based on cached or
pre-defined details pertaining to the log ID retrieved in the previous steps. Further, the fetched
content is displayed at least partly by depicting only the first and second types of content as the
preferred output within the graphical representation of the log. Moreover, the position of the first
and second types of content with respect to each other is maintained in line with the
orientation/linkage/sequence as depicted in the log.
Specifically, when such mapping as described in step 1116 has been implemented, the data
reference matcher module is executed by the processor 204 and a display is rendered at the display
device 406. Such mapping cum rendering of the display corresponds to the steps 308 and 508 of
Fig. 3 and Fig. 5, respectively. In addition, as a further extension, the input device 402 may receive
a user input to access content other than the aforesaid contents displayed as the preferred output.
Based on said user input, the fetched content related to non-identified elements is depicted in
accordance with the orientation of said non-identified elements in said log. In other words, the user
has the discretion to observe the entire information residing in the recap ID instead of only the
preferred output.
The depiction of the first and second types of content within said graphical representation
includes a symbolic representation (e.g. image or thumbnail expressions) with respect to each of
said identified elements, and metadata pertaining to said identified elements. The symbolic
representations are executable in nature and operable by a user to access the detailed data
pertaining to said identified elements within the mobile device. For example, a symbolic
representation of a message depicted as the output may be clicked upon to access the actual
message and its details (e.g. sender/recipient contact details).
In other examples, graphical representations of more than one log may be depicted and
accordingly two sets of preferred output get depicted as per the corresponding log IDs.
Figure 12 illustrates the exemplary operation associated with Fig. 11 through a
user-interface. As shown in the figure, the step 1202 represents a user interface that depicts a
collection of pre-generated logs. A search field (i.e. a text box field) has been provided to receive a
search query for locating one or more specific logs. In an exemplary scenario, the user needs to
search for photos taken on 17th Jan while he was messaging a phone number starting with 9847.
The user now wishes to recreate such a scenario in the form of a search query. Accordingly, the
user clicks upon a control (as encircled) to draw the search scenario query in terms of selection of
the tags.
As shown in step 1204, the user selects calendar, day-time and message based tags to
recreate the desired scenario.
As shown in step 1206, the user enters the number “9847” within the text field.
Accordingly, the log as depicted in the step 1206 is obtained, which not only displays the content
directly connected to the tags and text, but also content that, although not connected to the text,
has still been displayed on account of being a part of the relevant log (which matches the input
tags) and on account of being proximate to the content directly related to the input text.
Accordingly, a graphical representation of the log and the content forming a part of the preferable
output gets depicted. In addition, the displayed content may also be identified by metadata (i.e.
17th Jan) pertaining to the displayed content.
In other words, within the step 1206, the recap finds the message sent to number “9847”
and also depicts an ‘associated’ activity such as photos & videos which were taken during that
operation. This allows the user to use an ‘Associative search functionality’ to get photos without
explicitly specifying photo activity anywhere in the search query.
Further in step 1208, the recap in the step 1206 has been actuated by clicking upon the
encircled portion in the step 1206 to view its contents. Accordingly, the user can open recap details
and check the sequence of events. In addition, by clicking upon an element of the recap (say
photos), the corresponding element gets isolated and may be operated upon individually, for
example to see or delete the photos.
Fig. 13 depicts another type of exemplary operation connected with Fig. 11, wherein the
output displays more than one log and accordingly a greater amount of preferable output. This
may be attributed to the fact that, for the purposes of conducting a search across the pre-generated
logs in the step 1302, the search query as provided in the step 1304 only includes tags as a part of
the search query. Accordingly, since the intended search is slightly broad in nature, the step 1304
depicts two sets of relevant logs or log IDs, both of them having the presence of the tags as
otherwise inputted by the user as a search query.
Likewise, the examples provided in Fig. 12 and Fig. 13 may also be visualized to receive a
mobile device camera captured picture as part of the search query. Such a picture may be inserted
within the search field by the user through various means known in the art. In another example,
while operating upon the search field, the user may simultaneously capture the picture and
incorporate it within the search field as the search query.
Figure 14 represents the operation of the second and third embodiment in terms of its
relevant stages.
As shown in the figure, the first stage 1402 is “Data Representation” that may be
corresponded with the state of the user interfaces depicted in the steps 1202 and 1302 of the
preceding figures. Accordingly, data representation depicts a collection of pre-stored logs, such as
time logs and location logs.
The second stage “Query Handling” 1404 denotes a search field as represented in steps
1204 and 1304 to receive the search query or search input parameter from the user. The stage 1404
corresponds with the step 1102 of Fig. 11.
The third stage “Data Mining” 1406 denotes analysing the search query or search-input
parameter, extracting one or more relevant recap IDs, and displaying relevant contents as the
preferable output as a part of said logs. Accordingly, the stage collectively corresponds with the
steps 1104-1114 of Fig. 11.
Another stage is “Data Filtering” 1408, which assists the previous stage “Data Mining”
1406 in its operation. More specifically, the data filtering as a process is executed over the recap
database to ignore redundant data while the data-mining stage is in progress. In other scenarios,
the data filtering stage may also operate on its own on a periodical basis over the recap database to
weed out redundant data from the logs.
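The data-filtering stage amounts to weeding duplicate data references out of a recap while preserving the order of first occurrence. A minimal sketch, assuming references are comparable strings:

```python
# Sketch of the data-filtering stage: duplicate data references are removed
# from a recap, keeping the log's original order of first occurrence. The
# same routine can run during data mining or periodically over the database.
def filter_redundant(references):
    seen, kept = set(), []
    for ref in references:
        if ref not in seen:
            seen.add(ref)
            kept.append(ref)
    return kept

print(filter_redundant(["call:1", "msg:2", "call:1", "photo:3", "msg:2"]))
# ['call:1', 'msg:2', 'photo:3']
```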
Fig. 15 illustrates architecture 1500 of the system 200 depicted in Fig. 2. As shown in the
figure, the system 200 includes a recap module 1502 that is used to create the recap or the log
based on the user-specific condition as provided. Various user specific conditions as defined by the
step 102 of Fig. 1 have been collectively represented by the numeral 1502a. The recap module
1502 is executed by the processor 206 and includes a combination of sub-modules such as Recap
On-Demand Capture Trigger module 1504 to perform the step 202 and a Data Scan Module 1506
that on being triggered by the capture module 1504 performs the step 104.
More specifically, the Data Scan Module 1506 is used to scan the device for content
generated or received during the occurrence of the operational events in the mobile device. In an
example, such content includes events/data such as a phone call, email, message, playing music,
taking a photo, taking a video, etc. Accordingly, the scan module 1506 interacts with a main
memory of the mobile device for all content such as Contacts, Messages, Videos, Images, etc. It
can also scan an SD card or any other storage medium for content. Said scanned items have been
collectively depicted by the reference numeral 1506a.
Further, the system 200 includes the Recap Data Reference Generator Module 1508, the
Recap Reference Data Grouping Module 1510, and the Recap Linking & Auto Tagging Module
1512, that are executed by processor 202 to perform the steps 206 and 208. The separate
functionalities of each module have been already depicted in the steps 610, 612 and 614,
respectively, in Fig. 6.
Further, the system 200 comprises a raw data reference database 1514 to store the data
references related to the captured data, a data reference group database 1516 to store the groups of
similar data references and a recap database 1518 to store the generated recaps.
In addition, the system 200 further includes a precious recap module 1520 that helps the
user in manually selecting content to be constituted in a log. Accordingly, the precious recap
module 1520 comprises a receiving module that receives a user-selection of various types of
mobile-device operational events to be included in a recap. The events have been collectively
represented by 1520a. Accordingly, such a recap may be known as a precious recap, being a
user-constructed recap.
Although the precious recap construction may not involve the use of the data scan module
1506, being a user-contributed recap creation task, the precious-recap construction still requires
the role of the Recap Data Reference Generator Module 1508, the Recap Reference Data Grouping
Module 1510, and the Recap Linking & Auto Tagging Module 1512.
Further, an edit module 1522 may be provided to enable the user to edit all the generated
recaps and store them in an updated form. Further, while selecting content for precious recap
construction, the user may edit the selected content through the edit module 1522 before the
precious recap is finally constructed.
Fig. 16 illustrates architecture 1600 of the system 400 depicted in Fig. 4. The system 400
includes a query handling module 1602 which comprises sub-modules: a query
handler/analyser/parser 1604 as a first sub-module, a second sub-module 1606 and a third
sub-module 1608. While the first sub-module's functionality has been depicted in step 1104 of
Fig. 11, the second sub-module 1606 is a combination of the recap pivot matcher module, the
recap auto tag matcher module, the reference-data group matcher module, the data-reference
matcher module, and the data fetching module as illustrated between steps 1106 and 1114 of Fig.
11. Accordingly, the second sub-module 1606 exhibits the functionalities depicted from steps
1106 till 1114. The third sub-module 1608 exhibits the functionality as depicted in step 1116 and
corresponds to the data-reference matcher module.
The display device 406 exhibits the display function as depicted in step 108 of Fig. 1 or the
step 1118 of Fig. 11.
Overall, the second sub-module 1606 generates various types of references, i.e. pivot
information, a recap ID, an element, and a data reference, and for such purpose interacts with the
recap database 1518 and the recap data grouping database 1516. The third sub-module 1608
finally takes charge of consolidating each such reference by drawing a mapping among the same
through the aid of relational databases, and finally attaches the content as fetched to the drawn
mapping, thereby causing the display device 406 to display the recap and the specific content
therein. Accordingly, the third sub-module 1608 interacts with the second sub-module 1606 and
the raw data reference database 1514.
While pivot information and recap id may be extracted from the recap database 1518, the
element related information and the data reference related information are extracted from the data
reference group database 1516 and raw data reference database 1514, respectively. Finally, the
actual content is fetched from the main memory of the mobile device.
Figure 17 illustrates an implementation of the system illustrated in Fig. 2 and Fig. 4 in a
computing environment. The present figure essentially illustrates the hardware configuration of
the system 200, 400 in the form of a computer system 1700. The computer system 1700 can
include a set of instructions that can be executed to cause the computer system 1700 to perform
any one or more of the methods disclosed. The computer system 1700 may operate as a
standalone device or may be connected, e.g., using a network, to other computer systems or
peripheral devices.
In a networked deployment, the computer system 1700 may operate in the capacity of a
server or as a client user computer in a server-client user network environment, or as a peer
computer system in a peer-to-peer (or distributed) network environment. The computer system
1700 can also be implemented as or incorporated into various devices, such as a personal computer
(PC), a tablet PC, a personal digital assistant (PDA), a mobile device, a palmtop computer, a laptop
computer, a desktop computer, a communications device, a wireless telephone, a land-line
telephone, a web appliance, a network router, switch or bridge, or any other machine capable of
executing a set of instructions (sequential or otherwise) that specify actions to be taken by that
machine. Further, while a single computer system 1700 is illustrated, the term "system" shall also
be taken to include any collection of systems or sub-systems that individually or jointly execute a
set, or multiple sets, of instructions to perform one or more computer functions.
The computer system 1700 may include a processor 1702 (e.g., a central processing unit
(CPU), a graphics processing unit (GPU), or both). The processor 1702 may be a component in a
variety of systems. For example, the processor 1702 may be part of a standard personal computer
or a workstation. The processor 1702 may be one or more general processors, digital signal
processors, application specific integrated circuits, field programmable gate arrays, servers,
networks, digital circuits, analog circuits, combinations thereof, or other now known or later
developed devices for analysing and processing data. The processor 1702 may implement a
software program, such as code generated manually (i.e., programmed).
The computer system 1700 may include a memory 1704 that can communicate via a bus
1708. The memory 1704 may be a main memory, a static memory, or a dynamic memory. The
memory 1704 may include, but is not limited to, computer readable storage media such as various
types of volatile and non-volatile storage media, including but not limited to
random access memory, read-only memory, programmable read-only memory, electrically
programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic
tape or disk, optical media and the like. In one example, the memory 1704 includes a cache or
random access memory for the processor 1702. In alternative examples, the memory 1704 is
separate from the processor 1702, such as a cache memory of a processor, the system memory, or
other memory. The memory 1704 may be an external storage device or database for storing data.
Examples include a hard drive, compact disc ("CD"), digital video disc ("DVD"), memory card,
memory stick, floppy disc, universal serial bus ("USB") memory device, or any other device
operative to store data. The memory 1704 is operable to store instructions executable by the
processor 1702. The functions, acts or tasks illustrated in the figures or described may be
performed by the programmed processor 1702 executing the instructions stored in the memory
1704. The functions, acts or tasks are independent of the particular type of instructions set, storage
media, processor or processing strategy and may be performed by software, hardware, integrated
circuits, firm-ware, micro-code and the like, operating alone or in combination. Likewise,
processing strategies may include multiprocessing, multitasking, parallel processing and the like.
As shown, the computer system 1700 may or may not further include a display unit 1710,
such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display,
a solid state display, a cathode ray tube (CRT), a projector, a printer or other now known or later
developed display device for outputting determined information. The display 1710 may act as an
interface for the user to see the functioning of the processor 1702, or specifically as an interface
with the software stored in the memory 1704 or in the drive unit 1716.
Additionally, the computer system 1700 may include an input device 1712 configured to
allow a user to interact with any of the components of system 1700. The input device 1712 may be
a number pad, a keyboard, or a cursor control device, such as a mouse, or a joystick, touch screen
display, remote control or any other device operative to interact with the computer system 1700.
The computer system 1700 may also include a disk or optical drive unit 1716. The disk drive
unit 1716 may include a computer-readable medium 1722 in which one or more sets of instructions
1724, e.g. software, can be embedded. Further, the instructions 1724 may embody one or more of
the methods or logic as described. In a particular example, the instructions 1724 may reside
completely, or at least partially, within the memory 1704 or within the processor 1702 during
execution by the computer system 1700. The memory 1704 and the processor 1702 also may
include computer-readable media as discussed above.
The present invention contemplates a computer-readable medium that includes instructions
1724 or receives and executes instructions 1724 responsive to a propagated signal so that a device
connected to a network 1726 can communicate voice, video, audio, images or any other data over
the network 1726. Further, the instructions 1724 may be transmitted or received over the network
1726 via a communication port or interface 1720 or using a bus 1708. The communication port or
interface 1720 may be a part of the processor 1702 or may be a separate component. The
communication port 1720 may be created in software or may be a physical connection in hardware.
The communication port 1720 may be configured to connect with a network 1726, external media,
the display 1710, or any other components in system 1700 or combinations thereof. The connection
with the network 1726 may be a physical connection, such as a wired Ethernet connection or may
be established wirelessly as discussed later. Likewise, the additional connections with other
components of the system 1700 may be physical connections or may be established wirelessly. The
network 1726 may alternatively be directly connected to the bus 1708.
The network 1726 may include wired networks, wireless networks, Ethernet AVB networks,
or combinations thereof. The wireless network may be a cellular telephone network, an 802.11,
802.16, 802.20, 802.1Q or WiMax network. Further, the network 1726 may be a public network,
such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize
a variety of networking protocols now available or later developed including, but not limited to
TCP/IP based networking protocols.
In an alternative example, dedicated hardware implementations, such as application specific
integrated circuits, programmable logic arrays and other hardware devices, can be constructed to
implement various parts of the system 1700.
Applications that may include the systems can broadly include a variety of electronic and
computer systems. One or more examples described may implement functions using two or more
specific interconnected hardware modules or devices with related control and data signals that can
be communicated between and through the modules, or as portions of an application-specific
integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware
implementations.
The system described may be implemented by software programs executable by a computer
system. Further, in a non-limiting example, implementations can include distributed processing,
component/object distributed processing, and parallel processing. Alternatively, virtual computer
system processing can be constructed to implement various parts of the system.
The system is not limited to operation with any particular standards and protocols. For
example, standards for Internet and other packet switched network transmission (e.g., TCP/IP,
UDP/IP, HTML, HTTP) may be used. Such standards are periodically superseded by faster or more
efficient equivalents having essentially the same functions. Accordingly, replacement standards
and protocols having the same or similar functions as those disclosed are considered equivalents
thereof.
In view of the aforesaid description, the present subject matter is able to separate out content
in the mobile device based on pre-set conditions like user state, actions upon a mobile app,
user-activities upon the mobile device, connected device interactions, etc. By virtue of using
reference links instead of data duplication and processing content only upon receiving a
user-provided demand, the systems 200 and 400 consume low memory. No background index
service is required for retrieving the information, as the index is created on an on-demand basis.
In addition, the system 200, 400 consumes low power by initiating the recap construction only
when demanded by the user. Even in terms of constructing the recap, the system is further able to
schedule power-draining processing activities only when the device is connected to an external
power source or is in an idle/less-occupied state.
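The power-aware scheduling above can be sketched as a simple gate on the recap-construction task. The power-state fields and the idle threshold are illustrative assumptions:

```python
# Illustrative scheduling gate: a power-draining recap-construction task runs
# only when the device is on external power or in an idle/less-occupied state.
# The CPU-load threshold defining "less occupied" is an assumed value.
def may_run_heavy_task(on_external_power, cpu_load):
    IDLE_THRESHOLD = 0.2
    return on_external_power or cpu_load < IDLE_THRESHOLD

print(may_run_heavy_task(on_external_power=False, cpu_load=0.9))  # False
print(may_run_heavy_task(on_external_power=True,  cpu_load=0.9))  # True
print(may_run_heavy_task(on_external_power=False, cpu_load=0.1))  # True
```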
As far as retrieving the information within the mobile device is concerned, the present subject
matter prescribes a method of searching and locating information that works on the principle of
'associative memory'. Hence, even such search results are obtained which otherwise could not be
found through tags or keywords. For the end user, the information retrieval method as
prescribed by the present subject matter facilitates natural recall, as the retrieval
mechanism resembles a human being's mental model of searching and locating a physical object in
the real world. This is in contrast to the conventional search-string based search, which searches content by
looking for exact matches to the search strings and weighing the search results statistically. On the
other hand, the retrieval mechanism described by the present subject matter forms relationships and
associations between search results and hence is able to fetch such a result for which a search key
cannot be formed easily or has been altogether forgotten by the user.
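The associative retrieval principle can be sketched as follows. This is an illustrative model under assumed data structures (a log represented as a dictionary of elements with keywords and links); it is not the specification's own implementation:

```python
def associative_search(log, key, hops=1):
    """log: dict element_id -> (keywords, linked_ids).
    Returns elements whose keywords match `key`, plus elements reached
    within `hops` links -- i.e. results found purely through linkage
    proximity, for which no search key was ever supplied."""
    matched = {eid for eid, (kws, _) in log.items() if key in kws}
    result = set(matched)
    frontier = set(matched)
    for _ in range(hops):
        frontier = {n for eid in frontier for n in log[eid][1]} - result
        result |= frontier
    return result

# A toy log of three linked operational events:
log = {
    "call":   ({"movie"}, ["msg"]),
    "msg":    ({"movie", "download"}, ["call", "status"]),
    "status": (set(), ["msg"]),   # no keyword: reachable via linkage only
}
```

Here `"status"` carries no keyword at all, yet a search for `"movie"` retrieves it through its link to `"msg"`, mirroring how a forgotten item is recalled by association.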
The recap as contemplated by the present subject matter records the natural sequence of
event occurrences with relevant, inherent metadata, and grows it further by forming and weaving
relationships among the information in a meaningful way. Accordingly, prior user tagging of the recap is
not required to make it searchable.
With the proposed database design in the present subject matter, the associations between
different fragmented activities are created without actually duplicating the content, thereby using
minimal additional space on the device. Thus, even though the user might not recall what he/she
actually wants to search for, the present database design supports recalling through these
associations.
Overall, the information retrieval method as described by the present subject matter not
only uses the keywords/tags of content, but also utilizes the rich relationships between elements in
the created log or recaps.
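The reference-link database design described above can be illustrated with a short sketch. All names here (`ContentStore`, `Log`, the `msg://` URI scheme) are hypothetical stand-ins chosen for illustration, not part of the specification:

```python
class ContentStore:
    """Stands in for content already stored per-app on the device."""
    def __init__(self):
        self._items = {}

    def put(self, uri, data):
        self._items[uri] = data

    def get(self, uri):
        return self._items[uri]


class Log:
    """A log holds only references to content, never copies of it,
    so associations cost minimal additional space."""
    def __init__(self, condition):
        self.condition = condition   # the user-specific condition
        self.entries = []            # (element_id, content_uri) pairs

    def register(self, element_id, content_uri):
        self.entries.append((element_id, content_uri))

    def fetch(self, store):
        # content is resolved on demand through the stored references
        return [(eid, store.get(uri)) for eid, uri in self.entries]
```

Registering an element stores only a URI-sized reference; the single copy of the content stays where the originating application put it and is dereferenced only when the user asks for the recap.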
While specific language has been used to describe the disclosure, any limitations arising on
account of the same are not intended. As would be apparent to a person skilled in the art, various working
modifications may be made to the method in order to implement the inventive concept as taught
herein.
The drawings and the foregoing description give examples of embodiments. Those skilled in
the art will appreciate that one or more of the described elements may well be combined into a
single functional element. Alternatively, certain elements may be split into multiple functional
elements. Elements from one embodiment may be added to another embodiment. For example,
orders of processes described herein may be changed and are not limited to the manner described
herein. Moreover, the actions of any flow diagram need not be implemented in the order shown;
nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on
other acts may be performed in parallel with the other acts. The scope of embodiments is by no
means limited by these specific examples. Numerous variations, whether explicitly given in the
specification or not, such as differences in structure, dimension, and use of material, are possible.
The scope of embodiments is at least as broad as given by the following claims.
Benefits, other advantages, and solutions to problems have been described above with
regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any
component(s) that may cause any benefit, advantage, or solution to occur or become more
pronounced are not to be construed as a critical, required, or essential feature or component of any
or all the claims.
We Claim:
1. A method of retrieving information in a mobile device, said method comprising:
ascertaining (step 302) at least one log of operational events of said mobile device, based at
least on one input parameter as received from a user;
identifying (step 304) at least one element present within said log at least based on said
input parameter;
fetching (step 306) content at least related to said at least one identified element in said log;
and
displaying (step 308) at least a portion of said fetched content in accordance with an
orientation of said identified element within said log.
2. A method of retrieving information in a mobile device, said method comprising:
ascertaining (step 502) at least one log of operational events of said mobile device, based at
least on one input parameter as received from a user;
identifying (step 504) at least two elements present within said log, wherein identification
of at least one element is based on said input parameter;
fetching (step 506) content at least related to said identified elements in said log; and
displaying (step 508) at least a portion of said fetched content at least in accordance with
orientation of said identified elements in said log.
3. The method as claimed in claim 2, wherein ascertaining at least one log comprises determining
one or more relevant logs out of a plurality of pre-generated logs of information, said determination
being based on at least one of:
an overall context associated with said received input parameter; and
at least one pre-defined identifier present within said input parameter, said identifier
corresponding to at least one of a tag, an image, a sign, and a special character.
4. The method as claimed in claims 2 and 3, wherein each of said pre-generated logs of information
comprises said one or more elements denoting occurrences of said operational events within the
mobile device, said elements linked in a pre-defined order within said logs based on at least one of:
a chronological sequence of occurrence of said events;
a location of occurrence of said events;
presence of one or more identical keywords across the elements;
a sequence of interaction of said mobile device with the other device;
one or more pre-defined user-actions over the mobile device, while said user operates upon
said mobile-device; and
a sequence of user-activities over one or more mobile-device applications, while said user
operates upon said one or more mobile-device applications.
5. The method as claimed in claims 2 and 4, wherein each of said elements in said log denotes a
type of user-activity done through the mobile-device, said user-activity being either a group of
identical user-activities or an individual activity.
6. The method as claimed in claim 2, wherein said input parameter is received through a user-interface
and comprises at least one of:
a user-typed parameter comprising at least one of: a user text, a pre-defined identifier, a pre-defined
tag, a numeric character, an alphanumeric character, and a special character;
an image captured by an imaging device;
a voice based command; and
a touch-gesture.
7. The method as claimed in claims 2 and 4, wherein said identification of said at least two
elements comprises:
searching at least one element within said log based on said input parameter; and
searching at least one other element within said log while not considering said input
parameter, wherein said other element is selected based on a proximity of linkage of said other
element to said at least one element.
8. The method as claimed in claim 2, wherein said fetching is preceded by:
retrieving data-references pertaining to each of said identified elements; and
optionally retrieving data–references pertaining to non-identified elements present within
said at least one log.
9. The method as claimed in claim 8, wherein said fetching comprises extracting content from a
pre-defined memory location in said mobile device through said data references, said content
comprising at least one of:
a first type of content pertaining to said input parameter and said identified elements; and
a second type of content not pertaining to said input parameter but pertaining to said
identified elements.
10. The method as claimed in claims 2 and 9, wherein said displaying comprises:
depicting a graphical representation of said log, said representation having been retrieved
based on cached details pertaining to said log; and
displaying said fetched content partly by depicting said first and second type of contents
within said graphical representation, said first and second type of contents being oriented with
respect to each other based on the positions of said identified elements in said log.
11. The method as claimed in claim 10, wherein said depiction of first and second type
of contents within said graphical representation comprises:
a symbolic representation of each of said identified elements, said symbolic representations
being oriented to link with each other according to the positions of the corresponding elements in
said log; and
metadata pertaining to said identified elements;
wherein said symbolic representations are operable by a user to access an actual data
pertaining to said identified elements within the mobile device.
12. The method as claimed in claim 2, wherein said fetching further comprises fetching content
related to non-identified elements of said log.
13. The method as claimed in claims 2 and 12, further comprising:
receiving a user input to access content other than the displayed contents; and
based on said user input, depicting fetched content related to non-identified elements in
accordance with orientation of said non-identified elements in said log.
14. A system (400) for retrieving information in a mobile device, said system comprising:
an input device (402) for receiving at least one input parameter from a user;
a processor (404) configured for:
ascertaining at least one log of operational events of said mobile device based on said
input parameter;
identifying at least one element present within said log at least based on said input
parameter;
fetching content at least related to said at least one identified element in said log;
and
a display device (406) for displaying at least a portion of said fetched content in accordance
with an orientation of the identified element within said log.
15. A system (400) for retrieving information in a mobile device, said system comprising:
an input device (402) for receiving at least one input parameter from a user;
a processor (404) configured for:
ascertaining at least one log of operational events of said mobile device, based at least on
said input parameter;
identifying at least two elements present within said log, wherein identification of at
least one element is based on said input parameter;
fetching content at least related to said identified elements in said log;
and
a display device (406) for displaying at least a portion of said fetched content at least in
accordance with orientation of said identified elements in said log.
16. A method of capturing information in a mobile device, said method comprising:
detecting (step 102) occurrence of a user-specific condition;
triggering (step 104), based on said detection, a monitoring of at least one operational event
within said mobile device;
accessing (step 106) at least one element related to said at least one operational event; and
creating (step 108) a log of said mobile device’s events with respect to said user-specific
condition and registering at least said accessed element at a designated position within the log.
17. The method as claimed in claim 16, wherein said detecting comprises either automatically
sensing said user-specific condition or receiving a user input that corresponds to said user-specific
condition.
18. The method as claimed in claim 16, wherein said user-specific condition is at least one of:
a user-inputted condition based on at least one of a time frame and a location;
a user-state;
a keyword;
a usage of the mobile device through one or more pre-defined applications; and
an interaction of the mobile device with another device.
19. The method as claimed in claim 16, wherein said user-specific condition is defined by receipt
of one or more directly selected contents by the user, said directly selected contents associated with
one or more operational events within the mobile device.
20. The method as claimed in claim 16, wherein said monitoring comprises scanning data stored in
a memory, said data having been generated as a result of said at least one operational event.
21. The method as claimed in claim 16, wherein said operational event comprises an incoming call,
an outgoing call, an incoming message, an outgoing message, browsing of the internet through the
mobile device, and a user-performed operation over the mobile device with or without network
assistance.
22. The method as claimed in preceding claims, wherein said accessing of elements comprises:
generating at least one reference data with respect to either said scanned data present in the
memory of the mobile device or said user-selected content.
23. The method as claimed in claim 22, wherein said accessing further comprises grouping a
plurality of generated reference data that refer to data having similar characteristics, to create one or
more elements.
24. The method as claimed in claim 16, wherein, prior to said registering of the elements, the
method comprises linking said at least one accessed element with another accessed element that
corresponds to the same user-specific condition, said linking being based on at least one of:
a chronological sequence, in case the user-specific condition is based on a time frame or a
state of the user;
a sequence of content selection as directly done by the user, in case the user-specific
condition pertains to the direct selection of content by the user in relation to one or more
operational events within the mobile device;
a location, in case the user-specific condition is based on a geographical location of the
mobile device;
presence of a keyword, in case the user-specific condition pertains to said keyword;
a sequence of usage of one or more pre-defined mobile device applications, in case the
user-specific condition pertains to usage of the mobile device through the one or more pre-defined
mobile device applications; and
a sequence of interaction of said mobile device with the other device, in case said
user-specific condition pertains to interaction of said mobile device with the other device.
25. The method as claimed in claims 16 and 24, wherein said designated position of the at least one
accessed element within said log is based at least on its linkage with the other accessed element.
26. The method as claimed in claim 16, wherein the at least one accessed element depicted within
the log denotes a group of sub-elements having similar characteristics or an individual element.
27. The method as claimed in claim 16, wherein the method further comprises:
tagging said created log or the accessed elements therein with one or more tags, automatically
or based on receipt of a user-input, wherein said tag denotes a characteristic of the accessed
elements or an interaction of the mobile device with another device with respect to a particular
accessed element.
28. The method as claimed in preceding claims, further comprising receiving a user input to edit
said log.
29. The method as claimed in claim 16, wherein said monitoring is triggered based on said
detection and a particular state of the mobile device, said particular state comprising at least one of:
a charging state, an idle state, and an under-utilized state of the mobile device.
30. A system (200) for capturing information in a mobile device, said system comprising:
a memory (202);
an input device (204) for receiving a user-specific condition;
a processor (206) in operational interconnection with said memory and input device
configured to:
trigger, based on detection of said received user-specific condition, a monitoring of
at least one operational event within said mobile device;
access at least one element related to said at least one operational event from the
memory;
create a log of said mobile device’s events with respect to said user-specific
condition; and
register at least said accessed element at a designated position within the log.
31. The system as claimed in claim 30, wherein said processor, as part of said monitoring, is
configured to scan data stored in the memory, wherein said data is generated as a result of said at
least one operational event.
32. The system as claimed in claim 31, wherein said processor, while accessing elements, is
configured to:
generate a plurality of reference data related to data generated as a result of said at least one
operational event;
group similar reference data out of said plurality of reference data to create one or more
elements; and
manage one or more elements based on said grouped or non-grouped reference data.
33. The system as claimed in claim 31, wherein said processor, while creating said log, is configured
to:
create said log by linking said at least one accessed element with another accessed element
that corresponds to the same user-specific condition; and
tag said created log or the accessed elements therein with one or more identifiers, automatically
or based on receipt of a user-input, wherein said identifier denotes a characteristic of the accessed
elements or an interaction of the mobile device with another device with respect to a particular
accessed element.
34. The system as claimed in claim 32, wherein the input device is further configured to:
receive one or more directly selected contents by the user, wherein said directly selected
contents are associated with one or more operational events within the mobile device; and
permit the processor to access said received one or more directly selected contents for
reference data generation.
35. The system as claimed in claim 30, wherein the input device further comprises an edit module
for editing the log based on a received user input.
| # | Name | Date |
|---|---|---|
| 1 | Power of Attorney [06-01-2016(online)].pdf | 2016-01-06 |
| 2 | Form 5 [06-01-2016(online)].pdf | 2016-01-06 |
| 3 | Form 3 [06-01-2016(online)].pdf | 2016-01-06 |
| 4 | Form 18 [06-01-2016(online)].pdf | 2016-01-06 |
| 5 | Drawing [06-01-2016(online)].pdf | 2016-01-06 |
| 6 | Description(Complete) [06-01-2016(online)].pdf | 2016-01-06 |
| 7 | 201611000525-Form-1-(21-01-2016).pdf | 2016-01-21 |
| 8 | 201611000525-Correspondence Others-(21-01-2016).pdf | 2016-01-21 |
| 9 | REQUEST FOR CERTIFIED COPY [25-03-2016(online)].pdf | 2016-03-25 |
| 10 | abstract.jpg | 2016-07-10 |
| 11 | 201611000525-FER.pdf | 2018-12-27 |
| 12 | 201611000525-FER_SER_REPLY [07-03-2019(online)].pdf | 2019-03-07 |
| 13 | 201611000525-DRAWING [07-03-2019(online)].pdf | 2019-03-07 |
| 14 | 201611000525-COMPLETE SPECIFICATION [07-03-2019(online)].pdf | 2019-03-07 |
| 15 | 201611000525-CLAIMS [07-03-2019(online)].pdf | 2019-03-07 |
| 16 | 201611000525-PA [22-11-2019(online)].pdf | 2019-11-22 |
| 17 | 201611000525-ASSIGNMENT DOCUMENTS [22-11-2019(online)].pdf | 2019-11-22 |
| 18 | 201611000525-8(i)-Substitution-Change Of Applicant - Form 6 [22-11-2019(online)].pdf | 2019-11-22 |
| 19 | 201611000525-Correspondence to notify the Controller [25-06-2021(online)].pdf | 2021-06-25 |
| 20 | 201611000525-FORM-26 [01-07-2021(online)].pdf | 2021-07-01 |
| 21 | 201611000525-Correspondence to notify the Controller [06-07-2021(online)].pdf | 2021-07-06 |
| 22 | 201611000525-Correspondence to notify the Controller [16-07-2021(online)].pdf | 2021-07-16 |
| 23 | 201611000525-Written submissions and relevant documents [03-08-2021(online)].pdf | 2021-08-03 |
| 24 | 201611000525-US(14)-HearingNotice-(HearingDate-02-07-2021).pdf | 2021-10-17 |
| 25 | 201611000525-US(14)-ExtendedHearingNotice-(HearingDate-20-07-2021).pdf | 2021-10-17 |
| 26 | 201611000525-US(14)-ExtendedHearingNotice-(HearingDate-07-07-2021).pdf | 2021-10-17 |
| 27 | 201611000525-PatentCertificate18-04-2022.pdf | 2022-04-18 |
| 28 | 201611000525-IntimationOfGrant18-04-2022.pdf | 2022-04-18 |
| 1 | 2016110000525search_18-12-2018.pdf | |