
Method And System For Generating A Visual Representation Of Object Timelines In A Multimedia User Interface

Abstract: The present invention provides a method and apparatus for generating a visual representation of object timelines in a multimedia user interface by showing the time information associated with a moving object directly over its motion path, by assigning and displaying color values on its timeline and displaying the corresponding colors on the motion path. The method comprises the steps of presenting an object via a display operatively coupled with an electronic device, presenting a first visual indicator that relates time information associated with a motion of the object with a motion path of the object, and presenting a timeline associated with the time information. Here, a visual property of the first visual indicator matches a visual property of a second visual indicator in relation with the time information. FIGURE 6


Patent Information

Application #:
Filing Date: 27 June 2014
Publication Number: 06/2016
Publication Type: INA
Invention Field: ELECTRICAL
Status:
Email: bangalore@knspartners.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-01-31
Renewal Date:

Applicants

SAMSUNG R&D INSTITUTE INDIA – BANGALORE PRIVATE LIMITED
# 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India

Inventors

1. NEYYAN, Biju Mathew
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India
2. VANKA, Jaya Prakash
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India
3. KRISHNAN, Praveen
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India
4. THARAYIL, Ranjith
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India
5. GANAPATI BANNE, Abhinandan
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore -560037, Karnataka, India

Specification

DESC:
FORM 2
THE PATENTS ACT, 1970
[39 of 1970]
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(Section 10; Rule 13)

METHOD AND APPARATUS FOR GENERATING A VISUAL REPRESENTATION OF OBJECT TIMELINES IN A MULTIMEDIA USER INTERFACE

SAMSUNG R&D INSTITUTE INDIA – BANGALORE Pvt. Ltd.
# 2870, ORION Building, Bagmane Constellation Business Park,
Outer Ring Road, Doddanakundi Circle,
Marathahalli Post,
Bangalore -560037, Karnataka, India
Indian Company

The following specification particularly describes the invention and the manner in which it is to be performed.

RELATED APPLICATION
Benefit is claimed to Indian Provisional Application No. 3134/CHE/2014 titled “METHOD AND SYSTEM FOR GENERATING A VISUAL REPRESENTATION OF OBJECT TIMELINES IN A MULTIMEDIA USER INTERFACE” filed on 27th June 2014, which is herein incorporated in its entirety by reference for all purposes.

FIELD OF THE INVENTION
The present invention relates to user interfaces for multimedia data presentation and particularly relates to a method and system for generating a visual representation of object motion path and timeline on a multimedia user interface.

BACKGROUND OF THE INVENTION
Multimedia refers to the integration of text, images, audio and video in a variety of application environments. Media editing applications allow users to create composite multimedia presentations (e.g., movies) from several multimedia clips, such as audio and video clips, and often display a visual representation which the user controls through certain actions, such as selecting buttons on a remote or moving a controller in a certain manner.

The visual representation is a computer representation that typically takes the form of a two-dimensional (2D) or three-dimensional (3D) model in various applications, such as computer games, video games, chats, forums, communities, instant messaging services, and the like. However, this data can be heavily time-dependent, such as video and audio in a motion picture, and can require time-ordered presentation during use. Oftentimes, the time-based information is assembled into a data presentation through the use of a processing system to edit the information. For example, a video may be edited and combined with audio, text, effects and/or graphics to create a visual representation. A visual representation is any changed version of an original time-based stream of information or a modified copy of the original information.

With the progress in technology, the processing power, resolution and screen dimensions of handheld devices have increased tremendously. A large number of users now use handheld devices to create content, including animations, rather than just for content consumption. While creating animations, it is necessary to depict the motion of the objects in space-time. Usually this is accomplished by defining motion paths (to define a path in 2D or 3D space) and representing time using a timeline with a time segment for each object in the animation. However, this consumes a lot of screen space, as it requires dedicated screen space for the timelines. On devices with a smaller display area, such as a handheld tablet computer, screen space is critical for image and animation related content creation. Further, a problem with current user interfaces is that the compounded timelines complicate the editing process for the user. In addition, the display of multiple timelines wastes screen space that could be better used to display other useful editing tools.

In light of the shortcomings of the various currently available systems, there is a need to enable a user to view the position, movement path, direction and time information at a single glance.

The above-mentioned shortcomings, disadvantages and problems are addressed herein, and will be understood by reading and studying the following specification.

SUMMARY OF THE INVENTION
The various embodiments herein disclose a method and apparatus for generating a visual representation of object timelines in a multimedia user interface. According to an embodiment herein, the method comprises presenting an object via a display operatively coupled with an electronic device, presenting a first visual indicator that relates time information associated with a motion of the object with a motion path of the object, and presenting a timeline associated with the time information.

According to an embodiment herein, presenting the first visual indicator comprises receiving a selection of a timespan on the timeline, wherein the first visual indicator corresponds to the timespan, and the receiving comprises presenting on the timespan a second visual indicator corresponding to the first visual indicator.

According to an embodiment herein, a visual property of the first visual indicator matches a visual property of the second visual indicator in relation with the time information.

According to an embodiment herein, the presenting the object comprises presenting the object at a position on the motion path corresponding to the time information of a selected position on the timeline, moving the object along the motion path based on user input and moving another object, based on the moving of the first object, along a motion path of the another object, in an amount corresponding to time associated with the moving of the first object.

According to an embodiment herein, presenting the object comprises presenting the object at a specific position by user input, and presenting the first visual indicator comprises presenting the motion path of the object and the first visual indicator based on the specific position.

According to an embodiment herein, the first visual indicator comprises at least one of a color indicator, an alpha numeric indicator or a gray scale indicator. Also, the first visual indicator comprises representing the time information by using different colors, either in a gradient or as separate bands. Further, the motion path represents motion in a three-dimensional space. The first visual indicator may be presented in relation with a multimedia object and with a tracking application with a map function.

In another aspect of the present invention, an apparatus for generating a visual representation of object timelines in a multimedia user interface is disclosed. The apparatus comprises a processor, and a memory storing instructions that, when executed by the processor, cause the apparatus to perform operations comprising presenting an object via a display operatively coupled with the apparatus, presenting a first visual indicator that relates time information associated with a motion of the object with a motion path of the object, and presenting a timeline associated with the time information.

These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.

BRIEF DESCRIPTION OF THE DRAWINGS
The other objects, features and advantages will occur to those skilled in the art from the following description of the preferred embodiment and the accompanying drawings in which:

Figure 1 is a block diagram illustrating a system for generating a visual representation of object timelines in a multimedia user interface of a user device, according to an embodiment of the present invention.

Figure 2 is a block diagram illustrating a system for generating a visual representation of object timelines in a multimedia user interface of a user device, according to another embodiment of the present invention.

Figure 3 is a block diagram of a user interface module as that shown in Figure 2, according to an embodiment of the present invention.

Figure 4 is a block diagram of an object design module as that shown in Figure 2, according to an embodiment of the present invention.

Figure 5 is a snapshot of a multimedia user interface indicating various parameters involved in generating visual representation of the object timelines, according to an embodiment of the present invention.

Figure 6 is a schematic representation of the multimedia user interface indicating a time-range selection and representation of timeline on motion paths using colors, according to an embodiment of the present invention.

Figure 7 illustrates the mapping of the time range selected on timeline to the motion path of an object, according to an embodiment of the present invention.

Figure 8 illustrates the process of editing time by dragging edge of color spectrum on the motion path without changing the object’s motion path, according to an embodiment of the present invention.

Figure 9 illustrates the process of shifting time by dragging the color-spectrum on the motion path, according to an embodiment of the present invention.

Figure 10 illustrates the process of shifting object position by dragging the motion path directly, according to an embodiment of the present invention.

Figure 11 illustrates the process of displaying visual information (alphanumeric) along with or instead of colors to represent time and position on the motion path, according to an embodiment of the present invention.

Figure 12 illustrates the process of representing the depth information of the objects using a grayscale indicator on the motion path, according to an embodiment of the present invention.

Figure 13 illustrates the process of depicting time information by using color-spectrum on a three dimensional motion path, according to an embodiment of the present invention.

Figure 14 illustrates the color representation with gradients vs color bands, according to an embodiment of the present invention.

Figure 15 is a snapshot illustrating a chat application which enables users to send and receive animated messages, according to an exemplary embodiment of the present invention.

Figure 16 is a snapshot illustrating a tracking application which marks the position of people on the map as per time, according to an exemplary embodiment of the present invention.

Although specific features of the present invention are shown in some drawings and not in others, this is done for convenience only as each feature may be combined with any or all of the other features in accordance with the present invention.

DETAILED DESCRIPTION OF THE INVENTION
In the following detailed description of the embodiments of the invention, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.

The present invention provides a method and apparatus for generating a visual representation of object timelines in a multimedia user interface. The method comprises the steps of tracking time information associated with a movement of an object of a multimedia presentation on the user interface and representing the time information of the object on the user interface using a visual indicator. Here, the time information associated with the movement of the object is represented by assigning predefined values to the visual indicator along a motion path of the object and displaying the visual indicators corresponding to the assigned values on the object motion path. The visual indicator comprises one of a color indicator, an alpha numeric indicator or a gray scale indicator.
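
For illustration only, the following is a minimal sketch of this mapping, assuming a motion path stored as timestamped samples; the names (PathSample, colorize_path and so on) are made up for the example and are not taken from the specification.

```python
# Illustrative sketch (assumed names): assign each sample on a motion path the
# color that its timestamp has on the timeline, so the path carries the timing.

from dataclasses import dataclass
from typing import List, Tuple

Color = Tuple[int, int, int]  # RGB, 0-255

@dataclass
class PathSample:
    x: float
    y: float
    t: float  # time at which the object reaches this point, in seconds

def lerp_color(start: Color, end: Color, f: float) -> Color:
    """Linearly interpolate between two colors; f in [0, 1]."""
    return tuple(round(s + (e - s) * f) for s, e in zip(start, end))

def colorize_path(path: List[PathSample],
                  t0: float, tn: float,
                  color0: Color, color_n: Color) -> List[Color]:
    """Assign each path sample the color of the same instant on the timeline."""
    span = max(tn - t0, 1e-9)
    colors = []
    for p in path:
        f = min(max((p.t - t0) / span, 0.0), 1.0)  # normalised time in [0, 1]
        colors.append(lerp_color(color0, color_n, f))
    return colors

# Example: a straight path traversed over 4 seconds, blue at t0 and red at tn.
path = [PathSample(x=i * 10.0, y=0.0, t=float(i)) for i in range(5)]
print(colorize_path(path, t0=0.0, tn=4.0, color0=(0, 0, 255), color_n=(255, 0, 0)))
```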

According to an embodiment of the present invention, the method for generating a visual representation of object timelines in a multimedia user interface further comprises translating the time information to a motion representation corresponding to the object movements and controlling the motion path of the objects based on at least one of a position, magnitude and direction of the movement of the object, an object facing direction and the time information. The motion path and timeline of the object are controlled using touch gestures.

The motion paths of the objects are controlled by at least one of changing a time range marker, shifting a time range, changing a frame marker, editing time on the motion path, visual time synchronization and visual position synchronization. Further, the time information associated with the object can be changed by varying the visual indicator values associated with the visual indicators on the timeline, with the modification updated on the motion path in real time. Varying the visual indicator values comprises at least one of dragging a selected area on the motion path to shift the timeline of the object, dragging at least one end of the timeline to increase or decrease the object movement speed, stretching the visual indicator range to regulate the speed of movement of the object, increasing or decreasing an intensity of the visual indicator to regulate the object movement at a particular location, moving an object position marker to shift the object's position, and moving the motion path to reposition and time-synchronize the object based on the visual indicators. The color indicator is displayed as at least one of a smooth gradient, colored bands on the object motion path, an interpolation of colors, or a visual indicator of discrete values.

Here, the object comprises media content including animation, audio, video, graphics and text.

According to an embodiment herein, the visual representation of the multimedia object can be created on multiple user interfaces, i.e., having the timeline on the display of a first device and the motion path on another display; for example, timeline controls on a mobile phone and the editing interface on a tablet. The visual representation of the multimedia object on multiple user interfaces comprises tracking the time information associated with the movement of the object of the multimedia presentation on a user interface corresponding to a first device and representing the time information of the object on a user interface corresponding to a second device using the visual indicator.

Further, the time information associated with a moving object that needs to be displayed can be shown directly over its motion path by assigning and displaying color values on its timeline and displaying the corresponding colors on the motion path. The colors are displayed either as a smooth gradient or as colored bands. Colors are assigned only to a selected portion of the timeline, and the selection is controlled using range markers. The selection of the range is modified by dragging the selected portion; when the selected portion is dragged, the corresponding changes are updated on the motion path in real time. The information displayed can be enriched by adding other layers of information (such as depth in 3D space) using numbers or greyscale values.
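
A short sketch of the two rendering modes follows, under the assumption that the colors come from a small ramp of key colors; the function names are illustrative only and not part of the specification.

```python
# Illustrative sketch (assumed names): color for a given time, rendered either
# as a smooth gradient or as discrete bands; times outside the range markers
# get no color (drawn dotted on the motion path).

from typing import List, Optional, Tuple

Color = Tuple[int, int, int]

def sample_ramp(f: float, ramp: List[Color]) -> Color:
    """Smooth gradient: interpolate between adjacent ramp colors."""
    f = min(max(f, 0.0), 1.0)
    pos = f * (len(ramp) - 1)
    i = min(int(pos), len(ramp) - 2)
    frac = pos - i
    a, b = ramp[i], ramp[i + 1]
    return tuple(round(x + (y - x) * frac) for x, y in zip(a, b))

def sample_bands(f: float, ramp: List[Color]) -> Color:
    """Color bands: snap to the nearest band instead of blending."""
    f = min(max(f, 0.0), 1.0)
    return ramp[min(int(f * len(ramp)), len(ramp) - 1)]

def color_at_time(t: float, in_marker: float, out_marker: float,
                  ramp: List[Color], bands: bool) -> Optional[Color]:
    """Color for time t, or None when t lies outside the selected range."""
    if not (in_marker <= t <= out_marker):
        return None
    f = (t - in_marker) / max(out_marker - in_marker, 1e-9)
    return sample_bands(f, ramp) if bands else sample_ramp(f, ramp)

ramp = [(0, 0, 255), (0, 255, 0), (255, 0, 0)]          # blue -> green -> red
print(color_at_time(2.0, 1.0, 5.0, ramp, bands=False))  # blended color
print(color_at_time(2.0, 1.0, 5.0, ramp, bands=True))   # band color
print(color_at_time(6.0, 1.0, 5.0, ramp, bands=True))   # None: out of range
```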

The time information of an object being animated is edited by manipulating the corresponding color values directly on the motion path. The colored area on the motion path is dragged to shift the object's time information at the current position. Dragging either end of the colored line increases or decreases the speed of the animation. Further, the colors are pinched in or out to make the animation faster or slower at that particular location. The object position marker is moved to shift the object's position at the current time. The motion path is moved to reposition and time-synchronize the animation with reference to the colors. Further, the method of the present invention provides for zooming into the editing area for more detail and precision.
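
These drag edits can be viewed as simple transformations of the object's keyframe times while the spatial path stays fixed; the sketch below illustrates shifting and rescaling under assumed names, not any actual implementation.

```python
# Illustrative sketch (assumed names): dragging the whole colored span shifts
# the keyframe times; dragging one end rescales them, changing the speed while
# leaving the motion path itself unchanged.

from typing import List

def shift_times(times: List[float], delta: float) -> List[float]:
    """Drag the colored area: the whole animation starts earlier or later."""
    return [t + delta for t in times]

def rescale_times(times: List[float], new_start: float, new_end: float) -> List[float]:
    """Drag one end of the colored span: same path, traversed faster or slower."""
    old_start, old_end = times[0], times[-1]
    span = max(old_end - old_start, 1e-9)
    return [new_start + (t - old_start) / span * (new_end - new_start) for t in times]

keyframe_times = [0.0, 1.0, 2.0, 4.0]
print(shift_times(keyframe_times, 0.5))          # [0.5, 1.5, 2.5, 4.5]
print(rescale_times(keyframe_times, 0.0, 2.0))   # same path, twice as fast
```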

Figure 1 is a block diagram illustrating a system for generating a visual representation of object timelines in a multimedia user interface of a user device, according to an embodiment of the present invention. The system comprises a plurality of electronic devices or user devices 102, 104, and 106 coupled over a network 110. Further, a central server 112 is also connected to the network 110. The electronic device 102 comprises an object design module 114, a processor 116, a memory 118, a user interface module 120, a display 122 and a communication interface 124, which are all connected to a bus 126. The memory 118 stores the instructions which, when executed by the processor 116, cause the apparatus to perform operations comprising presenting an object via a display operatively coupled with the apparatus, presenting a first visual indicator that relates time information associated with a motion of the object with a motion path of the object and presenting a timeline associated with the time information.

According to an embodiment herein, the user interface module 120, or a part of its elements or functions, could be merged, split or included in another module (e.g., the memory, the display, etc.). Similarly, the object design module, or a part of it, can be placed in any element of the electronic device, and any element or function of the module could be merged, split or placed in another module.

Figure 2 is a block diagram illustrating a system for generating a visual representation of object timelines in a multimedia user interface of a user device, according to another embodiment of the present invention. The electronic devices 202 and 206 can also be coupled over a network 210. Further, the electronic device 202 is also connected to a server 212 over the network 210. The electronic device 202 comprises a processor 214, a memory 216, a user interface module 218 and a communication interface 220, which are all connected to a bus 222. The memory 216 includes an object design module 224 and the user interface module 218 includes a display 226.

Further, the display can be a part of the user interface module. Also, the display does not necessarily need to be included in the electronic device (i.e., the objects and the UI could be presented on the display of an external device).

Figure 3 is a block diagram of a user interface module as that shown in Figure 2, according to an embodiment of the present invention. The user interface module 310 includes a motion path indicator 312, a motion range marker 314 and a timeline marker 316. The motion path indicator 312 indicates whether or not the motion path of the object is within a selected time range. The motion range marker 314 is adapted to control the selection of a portion of the timeline for which colors are to be assigned. Further, the selected portion for assigning the colors can be modified by dragging the selected portion using the motion range markers 314. The timeline marker 316 provides for selecting a time range by adjusting the "In Marker" and the "Out Marker". The selected time range is overlaid with a color spectrum (either continuous or as a sequence of color segments), and the corresponding time segments on the motion path are represented with the corresponding colors. Further, dragging either of the time range selection markers changes the length of the selected time range; the color spectrum size is readjusted to fill the newly selected range in real time, and the corresponding changes are reflected in the time representation on the motion path.

Figure 4 is a block diagram of an object design module as that shown in Figure 2, according to an embodiment of the present invention. The object design module 400 includes an object identifier module 402, a time adjuster module 404, a motion path adjuster 406, a time and motion synchronizer 408, a range detector 410 and a position detector 412. The object identifier module 402 is adapted to identify one or more objects on the motion path on the user interface. The time adjuster module 404 is adapted to adjust the selected time range by adjusting the positions of the In marker and the Out marker. The motion path adjuster 406 is adapted to adjust the motion path in response to a change in the time range. The time and motion synchronizer 408 is adapted to move the motion path to re-position and time-synchronize the object by shifting the time directly on the motion path. The range detector 410 is adapted to regulate the movement of the object within the preset range. The position detector 412 is adapted to determine the position of the object on the motion path and the corresponding timeline.
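
One hypothetical way to group these responsibilities in code is sketched below; the class, field and method names are assumptions made for illustration and do not describe an actual implementation.

```python
# Hypothetical sketch of the object design module's responsibilities.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TimeRange:
    in_marker: float   # start of the selected range, seconds
    out_marker: float  # end of the selected range, seconds

@dataclass
class AnimatedObject:
    name: str
    keyframe_times: List[float]                  # seconds
    keyframe_points: List[Tuple[float, float]]   # (x, y) positions

class ObjectDesignModule:
    def __init__(self, objects: List[AnimatedObject], selection: TimeRange):
        self.objects = objects
        self.selection = selection

    def identify(self, name: str) -> AnimatedObject:
        """Object identifier: find an object shown on the motion path."""
        return next(o for o in self.objects if o.name == name)

    def adjust_time_range(self, in_marker: float, out_marker: float) -> None:
        """Time adjuster: reposition the In and Out markers."""
        self.selection = TimeRange(in_marker, out_marker)

    def in_selected_range(self, obj: AnimatedObject) -> bool:
        """Range detector: does the object move within the selected range?"""
        lo, hi = self.selection.in_marker, self.selection.out_marker
        return any(lo <= t <= hi for t in obj.keyframe_times)

# Example usage
module = ObjectDesignModule(
    [AnimatedObject("ball", [0.0, 2.0], [(0.0, 0.0), (5.0, 5.0)])],
    TimeRange(0.0, 1.0),
)
print(module.in_selected_range(module.identify("ball")))  # True
```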

Figure 5 is a snapshot of a multimedia user interface indicating various parameters involved in generating the visual representation of the object timelines, according to an embodiment of the present invention. The present invention enables displaying and editing the position, movement path, direction and time information at a single glance on the user interface of the electronic device associated with the user. The user interface comprises an editing area 502 in which motion paths are displayed and a single timeline 504 for defining the time for the scene being edited. The object is displayed on the user interface of an electronic device, where a first visual indicator is presented that relates the time information associated with a motion of the object to the motion path of the object, and a timeline is presented that is associated with the time information of the object.

Figure 6 is a schematic representation of the multimedia user interface indicating a time-range selection and representation of timeline on motion paths using colors, according to an embodiment of the present invention. According to the present invention, a Current Time Marker 610 on the timeline is used to mark the position-in-time to be displayed on the editing area. The corresponding position in time is represented as Object Position Marker 612 which indicates the object position at current time on the motion path. Further, dragging the Current Time Marker 610 on the timeline results in movement of the object along the motion path and the new object position is indicated by the Object Position Marker 612.
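
The link between the Current Time Marker and the Object Position Marker can be illustrated by interpolating between the keyframes that bracket the current time; the sketch below assumes a 2D polyline motion path and uses illustrative names only.

```python
# Illustrative sketch (assumed names): dragging the Current Time Marker on the
# timeline moves the Object Position Marker along the motion path; the position
# is interpolated between the keyframes that bracket the current time.

from bisect import bisect_right
from typing import List, Tuple

Point = Tuple[float, float]

def object_position(keyframe_times: List[float],
                    keyframe_points: List[Point],
                    current_time: float) -> Point:
    """Position of the object on its motion path at `current_time`."""
    if current_time <= keyframe_times[0]:
        return keyframe_points[0]
    if current_time >= keyframe_times[-1]:
        return keyframe_points[-1]
    i = bisect_right(keyframe_times, current_time) - 1
    t0, t1 = keyframe_times[i], keyframe_times[i + 1]
    (x0, y0), (x1, y1) = keyframe_points[i], keyframe_points[i + 1]
    f = (current_time - t0) / (t1 - t0)
    return (x0 + (x1 - x0) * f, y0 + (y1 - y0) * f)

times = [0.0, 2.0, 4.0]
points = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]
print(object_position(times, points, 1.0))  # (5.0, 0.0): halfway along the first leg
print(object_position(times, points, 3.0))  # (10.0, 5.0)
```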

The Time Range 606 is selected by adjusting the In-Marker 602 and the Out-Marker 604. The selected time range 606 is overlaid with a color spectrum (either as a continuous gradient or as discrete color bands). The position of each color on the motion path corresponds to the position of the same color on the timeline. Further, when the time range 606 is moved by dragging the colored area on the timeline to a different time range, the corresponding changes are reflected on the motion path. When either of the time range selection markers is dragged, the In-Marker 602 or the Out-Marker 604, the length of the selected time range 606 changes accordingly. The color spectrum size is readjusted to fill the newly selected time range in real time, and the corresponding changes are reflected in the time representation on the motion path as well. The dotted line 608 indicates the portion of the motion path that lies outside the selected time range.

Figure 7 illustrates the mapping of the time range selected on the timeline to the motion path of an object, according to an embodiment of the present invention. The length of the time range is selected using the time range selection markers, i.e., the In Marker and the Out Marker. This determines the color spectrum size, which is reflected on the motion path of the object.

Figure 8 illustrates the process of editing time by dragging edge of color spectrum on the motion path without changing the object’s motion path, according to an embodiment of the present invention.

Figure 9 illustrates the process of shifting time by dragging the color-spectrum on the motion path, according to an embodiment of the present invention. The color spectrum length is readjusted to fill the edited length in real time. The animations of individual objects are synced by shifting time directly on the motion path. The current time marker on the timeline also moves along with the associated color to the new position.

Figure 10 illustrates the process of shifting object position by dragging the motion path directly, according to an embodiment of the present invention. The animations of individual objects are synchronized by changing their positions directly in the editing area, where the motion path is dragged directly to shift the object path. The color spectrum and the current time marker remain unchanged relative to the motion path.

Figure 11 illustrates the process of displaying visual information (alphanumeric) along with or instead of colors to represent time and position on the motion path, according to an embodiment of the present invention. Methods other than mapping a color spectrum can be used to represent extra information on timelines, for example numbering the timeline and displaying the corresponding number information on the motion path.

Figure 12 illustrates the process of representing the depth information of the objects using a grayscale indicator on the motion path, according to an embodiment of the present invention. The embodiments herein provide for representing the depth information of the objects on the screen by adding an extra line with the grayscale indicator, in which the darkest region represents the farthest position and the lightest region represents the closest position of the object.
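
A minimal sketch of such a depth-to-grayscale mapping is given below, assuming larger depth values mean farther from the viewer; the function name is illustrative and not taken from the specification.

```python
# Illustrative sketch (assumed name): encode depth along the motion path as a
# grayscale value, darkest for the farthest point and lightest for the nearest.

from typing import List

def depth_to_grayscale(depths: List[float]) -> List[int]:
    """Map each depth to a 0-255 gray level; larger depth (farther) -> darker."""
    near, far = min(depths), max(depths)
    span = max(far - near, 1e-9)
    return [round(255 * (1.0 - (d - near) / span)) for d in depths]

print(depth_to_grayscale([1.0, 2.5, 4.0]))  # [255, 128, 0]
```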

Figure 13 illustrates the process of depicting time information by using color-spectrum on a three dimensional motion path, according to an embodiment of the present invention. According to the present invention, the color spectrum is displayed on a three dimensional motion path in the editing area.

Figure 14 illustrates the color representation with gradients vs color bands, according to an embodiment of the present invention. The selected time range is overlaid with a color spectrum rendered either as a continuous color gradient or as discrete color bands.

Figure 15 is a snapshot illustrating a chat application which enables users to send and receive animated messages, according to an exemplary embodiment of the present invention. The messaging platform enables users to send and receive animated messages which are created using the platform according to the disclosed invention. The clip-arts from an existing library of images are used to animate and convey a message visually.

Figure 16 is a snapshot illustrating a tracking application which marks the position of people on the map as per time, according to an exemplary embodiment of the present invention. The location tracking application marks the position of the people on the map as per time.

The present invention discloses an advanced method of displaying and editing the timeline and related information of moving objects, especially for animation content creation. The method maps the time information to colors or other visual indicators represented directly on the motion path. Accordingly, the user selects a part (or the whole) of the main timeline provided on the user interface. The selected portion is then assigned colors such that time “t0” to “tn” is represented by colors ranging from “Color-0” to “Color-n”. These colors are then mapped onto the motion path represented on the display.

The present embodiments have been described with reference to specific example embodiments; it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the various embodiments. Furthermore, the various devices, modules, and the like described herein may be enabled and operated using hardware circuitry, for example, complementary metal oxide semiconductor based logic circuitry, firmware, software and/or any combination of hardware, firmware, and/or software embodied in a machine readable medium.

Although the embodiments herein are described with various specific embodiments, it will be obvious for a person skilled in the art to practice the invention with modifications. However, all such modifications are deemed to be within the scope of the claims. It is also to be understood that the following claims are intended to cover all of the generic and specific features of the embodiments described herein and all the statements of the scope of the embodiments which as a matter of language might be said to fall there between.

CLAIMS:

We Claim:

1. A method comprising:
presenting an object via a display operatively coupled with an electronic device;
presenting a first visual indicator that relates time information associated with a motion of the object with a motion path of the object; and
presenting a timeline associated with the time information.

2. The method of claim 1, wherein:
the presenting the first visual indicator comprises receiving a selection of a timespan on the timeline; and
the first visual indicator corresponds to the timespan.

3. The method of claim 2, wherein the receiving comprises presenting on the timespan a second visual indicator corresponding to the first visual indicator.

4. The method of claim 3, wherein a visual property of the first visual indicator matches a visual property of the second visual indicator in relation with the time information.

5. The method of claim 1, wherein the presenting the object comprises:
presenting the object at a position on the motion path corresponding to the time information of a selected position on the timeline.

6. The method of claim 1, wherein the presenting the object comprises:
moving the object along the motion path based on user input; and
moving another object, based on the moving of the first object, along a motion path of the another object, in an amount corresponding to time associated with the moving of the first object.

7. The method of claim 1, wherein:
the presenting the object comprises presenting the object at a specific position by user input; and
the presenting the first visual indicator comprises presenting the motion path of the object and the first visual indicator based on the specific position.

8. The method of claim 1, wherein the first visual indicator comprises at least one of a color indicator, an alpha numeric indicator or a gray scale indicator.

9. The method of claim 1, wherein the first visual indicator comprises representing the time information by using different colors in gradient or separate.

10. The method of claim 1, wherein the motion path represents motion in a three-dimensional space.

11. The method of claim 1, wherein the first visual indicator is presented in relation with a multimedia object.

12. The method of claim 1, wherein the first visual indicator is presented in relation with a tracking application with a map function.

13. An apparatus comprising:
a processor; and
a memory storing instructions that, when executed by the processor, cause the apparatus to perform operations comprising:
presenting an object via a display operatively coupled with the apparatus;
presenting a first visual indicator that relates time information associated with a motion of the object with a motion path of the object; and
presenting a timeline associated with the time information.

14. The apparatus of claim 13, wherein:
the presenting the first visual indicator comprises receiving a selection of a timespan on the timeline; and
the first visual indicator corresponds to the timespan.

15. The apparatus of claim 14, wherein the receiving comprises presenting on the timespan a second visual indicator corresponding to the first visual indicator.

16. The apparatus of claim 15, wherein a visual property of the first visual indicator matches a visual property of the second visual indicator in relation with the time information.

17. The apparatus of claim 13, wherein the presenting the object comprises:
presenting the object at a position on the motion path corresponding to the time information of a selected position on the timeline.

18. The apparatus of claim 13, wherein the presenting the object comprises:
moving the object along the motion path based on user input; and
moving another object, based on the moving of the first object, along a motion path of the another object, in an amount corresponding to time associated with the moving of the first object.

19. The apparatus of claim 13, wherein:
the presenting the object comprises presenting the object at a specific position by user input; and
the presenting the first visual indicator comprises presenting the motion path of the object and the first visual indicator based on the specific position.

20. The apparatus of claim 13, wherein the first visual indicator comprises at least one of a color indicator, an alpha numeric indicator or a gray scale indicator.

21. The apparatus of claim 13, wherein the first visual indicator comprises representing the time information by using different colors in gradient or separate.

22. The apparatus of claim 13, wherein the motion path represents motion in a three-dimensional space.

23. The apparatus of claim 13, wherein the first visual indicator is presented in relation with a multimedia object.

24. The apparatus of claim 13, wherein the first visual indicator is presented in relation with a tracking application with a map function.

25. A non-transitory computer-readable medium storing computer-readable instructions that, when executed, instruct an apparatus to execute operations comprising:
presenting an object via a display operatively coupled with the apparatus;
presenting a first visual indicator that relates time information associated with a motion of the object with a motion path of the object; and
presenting a timeline associated with the time information.

Dated this the 21st day of January 2015

Signature

KEERTHI J S
Patent agent
Agent for the applicant

Documents

Application Documents

# Name Date
1 POA_Samsung R&D Institute India-new.pdf 2014-06-27
2 2012_ASCG_1215_Provisional Specification_preliminary draft.pdf 2014-06-27
3 2012_ASCG_1215_Drawings.pdf 2014-06-27
4 abstract-3134-CHE-2014.jpg 2015-02-05
5 2012_ASCG_1215_Complete Specification_21th Jan, 2015.pdf 2015-03-12
6 2012_ASCG_1215_Drawings_as filed on 21th Jan, 2015.pdf 2015-03-12
7 Request for Certified Copy of 3134CHE2014_PS.pdf 2015-04-21
8 Request for Certified Copy of 3134CHE2014_CS.pdf 2015-04-21
9 3134-CHE-2014-Request For Certified Copy-Online(21-04-2015).pdf 2015-04-21
10 3134-CHE-2014-FORM-26 [06-08-2019(online)].pdf 2019-08-06
11 3134-CHE-2014-FORM 13 [06-08-2019(online)].pdf 2019-08-06
12 3134-CHE-2014-RELEVANT DOCUMENTS [06-08-2019(online)].pdf 2019-08-06
13 3134-CHE-2014-FER.pdf 2019-09-23
14 3134-CHE-2014-CLAIMS [22-03-2020(online)].pdf 2020-03-22
15 3134-CHE-2014-COMPLETE SPECIFICATION [22-03-2020(online)].pdf 2020-03-22
16 3134-CHE-2014-CORRESPONDENCE [22-03-2020(online)].pdf 2020-03-22
17 3134-CHE-2014-DRAWING [22-03-2020(online)].pdf 2020-03-22
18 3134-CHE-2014-FER_SER_REPLY [22-03-2020(online)].pdf 2020-03-22
19 3134-CHE-2014-FORM 3 [22-03-2020(online)].pdf 2020-03-22
20 3134-CHE-2014-OTHERS [22-03-2020(online)].pdf 2020-03-22
21 3134-CHE-2014-PETITION UNDER RULE 137 [22-03-2020(online)].pdf 2020-03-22
22 3134-CHE-2014-IntimationOfGrant31-01-2024.pdf 2024-01-31
23 3134-CHE-2014-PatentCertificate31-01-2024.pdf 2024-01-31

Search Strategy

1 2019-09-1117-01-00_12-09-2019.pdf
2 2020-08-0815-27-28AE_10-08-2020.pdf

ERegister / Renewals

3rd: 04 Apr 2024 (from 27/06/2016 to 27/06/2017)
4th: 04 Apr 2024 (from 27/06/2017 to 27/06/2018)
5th: 04 Apr 2024 (from 27/06/2018 to 27/06/2019)
6th: 04 Apr 2024 (from 27/06/2019 to 27/06/2020)
7th: 04 Apr 2024 (from 27/06/2020 to 27/06/2021)
8th: 04 Apr 2024 (from 27/06/2021 to 27/06/2022)
9th: 04 Apr 2024 (from 27/06/2022 to 27/06/2023)
10th: 04 Apr 2024 (from 27/06/2023 to 27/06/2024)
11th: 04 Apr 2024 (from 27/06/2024 to 27/06/2025)
12th: 12 May 2025 (from 27/06/2025 to 27/06/2026)