Abstract: A technique for converting static graphical data received from a first user device into animated graphical data at a second device. The techniques disclosed herein include mapping the static graphical data to a plurality of pre-determined/pre-stored animated graphical data using associated metadata. Said techniques further disclose identifying, among the plurality of pre-determined/pre-stored animated graphical data, at least one animated graphic that closely matches the static graphical data, and converting the static graphical data into the animated graphical data using the identified animated graphic.
[0001] The present disclosure generally relates to a method for converting a static graphic into an animated graphic. In particular, the present disclosure relates to a method for converting a static graphic received from one device into an animated graphic at another device.
BACKGROUND
[0002] This section is intended to provide information relating to the field of the invention, and thus any approach/functionality described below should not be assumed to qualify as prior art merely by its inclusion in this section.
[0003] Growth in the field of wireless communication technology has provided users with a plethora of choices to make communication with each other hassle free. In particular, there now exist many multimedia/messaging platforms that not only allow users to interact with each other by means of audio/video calls but also allow the users to share their views and experiences in the form of emoji/sticker/avatar/Animoji/gif etc. However, these platforms only allow a user to share either static or animated graphic data with the other user. Thus, if animated graphic data is to be shared with another user, it not only consumes more bandwidth in comparison to static graphic data, but the user devices may also find it challenging to send/receive such animated graphical data when the network signal is weak. Thus, sharing animated graphical data with a user may not always be a pleasant experience.
[0004] Therefore, there is a need for a technology whereby a user may share static graphical data with another user and the other user may be able to convert such data into animated graphical data.
SUMMARY OF THE INVENTION
[0005] One or more shortcomings of the prior art are overcome, and additional advantages are provided by the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the disclosure.
[0006] In a main aspect, the present disclosure provides a method for converting a static graphic into an animated graphic. To achieve this, the method discloses receiving a static graphic along with associated metadata from a first user device. The method further discloses mapping the static graphic to a plurality of pre-determined animated graphics using the associated metadata. The method further discloses identifying, among the plurality of pre-determined animated graphics, at least one animated graphic that closely matches the static graphic, and converting the static graphic into the animated graphic using the identified animated graphic. The method finally discloses transmitting the animated graphic along with a sticker ID to one or more other devices.
[0007] In yet another aspect, the present disclosure recites that the associated metadata includes at least one of sticker ID, necessary tags and colour information.
[0008] In still another aspect, the present disclosure recites that the step of mapping further comprises comparing the metadata of the static graphic with the metadata of the plurality of animated graphics.
[0009] In yet another aspect, the present disclosure recites that the method is further configured for identifying at least one tag associated with the static graphic that represents information related to one or more emotions, and adding the corresponding emotions from the identified animated graphic to the static graphic, while converting the static graphic into an animated graphic on the basis of mapping.
[0010] In still another aspect, the present disclosure recites that the method is further configured for identifying at least one tag associated with the static graphic that represents information related to one or more objects, and adding one or more corresponding objects to the static graphic while converting the static graphic into an animated graphic on the basis of mapping.
[0011] In yet another aspect, the present disclosure recites that the method is further configured for generating one or more animated graphics from the static graphic based on mapping, and ranking the one or more generated animated graphics on the basis of users' likes or dislikes.
[0012] In another main aspect, the present disclosure provides a device for converting a static graphic into an animated graphic. To achieve this, the device comprises a receiver unit configured to receive a static graphic along with associated metadata from a first user device, and a processing unit operatively coupled to the receiver unit. The processing unit further comprises a mapping unit configured to map the static graphic to a plurality of pre-determined animated graphics using the associated metadata. The processing unit further comprises an identification unit configured to identify, among the plurality of pre-determined animated graphics, at least one animated graphic that closely matches the static graphic, and a conversion unit configured to convert the static graphic into the animated graphic using the identified animated graphic. The device further includes a transmitter unit operatively coupled to the processing unit, said transmitter unit configured to transmit the animated graphic along with a sticker ID to one or more other devices.
[0013] In still another aspect, the present disclosure recites that the associated metadata includes at least one of sticker ID, necessary tags and colour information.
[0014] In yet another aspect, the present disclosure recites that the mapping unit is
further configured to compare the metadata of the static graphic with the metadata of
the plurality of animated graphics.
[0015] In still another aspect, the present disclosure recites that the processing unit is further configured to identify at least one tag associated with the static graphic that represents information related to one or more emotions, and add the corresponding emotions from the identified animated graphic to the static graphic, while converting the static graphic into an animated graphic on the basis of mapping.
[0016] In yet another aspect, the present disclosure recites that the processing unit is further configured to identify at least one tag associated with the static graphic that represents information related to one or more objects, and add one or more corresponding objects to the static graphic while converting the static graphic into an animated graphic on the basis of mapping.
[0017] In still another aspect, the present disclosure recites that the processing unit is further configured to generate one or more animated graphics from the static graphic based on mapping, and rank the one or more generated animated graphics on the basis of users' likes or dislikes.
[0018] In the above paragraphs, the most important features of the invention have been outlined, in order that the detailed description thereof that follows may be better understood and in order that the present contribution to the art may be better appreciated. There are, of course, additional features of the invention that will be described hereinafter and which will form the subject of the claims appended hereto. Those skilled in the art will appreciate that the conception upon which this disclosure is based may readily be utilized as a basis for the designing of other structures for carrying out the several purposes of the invention. It is important, therefore, that the claims be regarded as including such equivalent constructions as do not depart from the spirit and scope of the invention.
OBJECT OF THE INVENTION
[0019] The object of the present disclosure is to provide a method that converts static graphic data received from a user device into animated graphic data at the other device.
[0020] Another object of the present disclosure is to provide a system that converts received static graphical data into animated graphical data.
[0021] Yet another object of the present disclosure is to provide a method for delivering static graphic data as is, while adding value to its representation on the receiver side, i.e. without the receiver having to search for animated graphical data.
[0022] Still another object of the present disclosure is to provide a method by which static graphical data may be shared with the user, conveying the essence of animated graphical data, even in a low-bandwidth environment.
BRIEF DESCRIPTION OF DRAWINGS
[0023] Further aspects and advantages of the present invention will be readily understood from the following detailed description with reference to the accompanying drawings, where like reference numerals refer to identical or functionally similar elements throughout the separate views. The figures, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate the aspects and explain various principles and advantages in accordance with the present invention, wherein:
[0024] Fig. 1 illustrates an exemplary environment 100 including a plurality of computing devices for converting static graphical data into animated graphical data, in accordance with an embodiment of the present disclosure.
[0025] Fig. 2 is a block diagram 200 illustrating an exemplary computing device that converts received static graphical data into animated graphical data, in accordance with an embodiment of the present disclosure.
[0026] Fig. 3 is a block diagram illustrating a server 300 that converts static graphical data received from one device into animated graphical data for another device, in accordance with an embodiment of the present disclosure.
[0027] Fig. 4 illustrates a flowchart of a method for converting static graphical data received from one device into animated graphical data for another device, in accordance with an embodiment of the present disclosure.
[0028] A person skilled in the art will appreciate that elements in the drawings are illustrated for simplicity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the drawings may be exaggerated relative to other elements to help improve understanding of aspects of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0029] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0030] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0031] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup or device that comprises a list of components does not include only those components but may include other components not expressly listed or inherent to such setup or device. In other words, one or more elements in a system or apparatus preceded by "comprises… a" does not, without more constraints, preclude the existence of other or additional elements in the system, apparatus or device. Further, the terms graphic data, graphical data and emoji/sticker/Animoji mean the same in respect of the present invention and thus may be used interchangeably throughout the present disclosure.
[0032] Disclosed herein is a technique for converting static graphical data (i.e. a static emoji/Animoji/sticker/avatar) received from a first user device into animated graphical data (i.e. an animated emoji/Animoji/sticker/avatar) at a second device. The techniques disclosed herein include mapping the static emoji/Animoji/sticker/avatar to a plurality of pre-determined/pre-stored animated emoji/Animoji/sticker/avatars using associated metadata. Said techniques further disclose identifying, among the plurality of pre-determined/pre-stored animated emoji/Animoji/sticker/avatars, at least one animated graphic that closely matches the static emoji/Animoji/sticker/avatar, and converting the static emoji/Animoji/sticker/avatar into the animated emoji/Animoji/sticker/avatar using the identified animated graphic. The technique therefore provides an interactive and entertaining means of sharing an animated emoji/Animoji/sticker/avatar in the form of a static emoji/Animoji/sticker/avatar.
[0033] Fig. 1 illustrates an environment 100 for converting static graphical data (i.e. a static emoji/Animoji/sticker/avatar) received from a first user device into animated graphical data (i.e. an animated emoji/Animoji/sticker/avatar) at a second device, in accordance with an embodiment of the present disclosure. The environment 100 includes one or more user devices 102a-102e (interchangeably referred to as "the user device 102" or "device 102") and a server 106 communicably coupled to the user device(s) 102a-102e via a network 104. Examples of the network 104 may include, but are not limited to, the Internet, a Wide Area Network (WAN), a Local Area Network (LAN), or any combination thereof.
[0034] The user device 102 may be configured to provide a user interface to a user to enable the user to interact within the environment 100 to convert static graphical data (i.e. a static emoji/Animoji/sticker/avatar) received from another user device into animated graphical data (i.e. an animated emoji/Animoji/sticker/avatar), in accordance with an embodiment of the present disclosure. The user device 102 may include any mobile computing or communication device, such as, but not limited to, a notebook computer, a personal digital assistant (PDA), a mobile phone, a smartphone, a laptop, a tablet or any similar class of mobile computing device with sufficient processing, communication, audio/video recording and sticker/Animoji/emoji generation and sharing capabilities. In an embodiment, any of the user devices 102a-102e may be configured to convert a static emoji/avatar/Animoji/sticker received from another into an animated emoji/avatar/Animoji/sticker. The functional aspect of the user devices 102a-102e will be explained in detail in figure 2 below.
[0035] Figure 2 discloses an exemplary user device 200 that is configured to convert static graphical data, herein referred to as a static emoji/Animoji/sticker/avatar, received from another device (not shown) into animated graphical data, herein referred to as an animated emoji/Animoji/sticker/avatar. In an exemplary embodiment, the user device 200 illustrated in figure 2 may be any of the user devices 102a-102e shown in figure 1. Further, as illustrated in figure 2, the user device 200 includes a transceiver 208 configured to receive a static emoji/Animoji/sticker/avatar along with associated metadata from any of the other user devices 102a-102e. In an exemplary embodiment, the user device 200 that receives the static emoji/Animoji/sticker/avatar and the user device (not shown) that shares the static emoji/Animoji/sticker/avatar with the user device 200 are two different devices from the environment 100. Further, the metadata associated with the static emoji/Animoji/sticker/avatar may include at least one of sticker ID, necessary tags and color information.
[0036] In an aspect, the tags associated with the static emoji/Animoji/sticker/avatar may be helpful in identifying the emotions, objects, sentiments and like features contained within the static emoji/Animoji/sticker/avatar, which may be useful while converting said static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar. In another aspect, the sticker ID may be helpful in distinguishing each static emoji/Animoji/sticker/avatar from the others. In an aspect, each static emoji/Animoji/sticker/avatar has a unique sticker ID associated with it and no two static emoji/Animoji/sticker/avatars may have the same sticker ID. Furthermore, in an aspect, the color information may be helpful in retaining the color information of the objects present in the static emoji/Animoji/sticker/avatar while converting the static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar.
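The metadata described above can be pictured as a small record travelling with each graphic. Below is a minimal sketch in Python, assuming only the three fields this paragraph names (sticker ID, tags, colour information); the field names and types are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GraphicMetadata:
    """Metadata accompanying a static or animated graphic.

    Mirrors the three items named in the disclosure: a unique
    sticker ID, descriptive tags (emotions, objects, sentiments)
    and colour information for the depicted objects.
    """
    sticker_id: str                  # unique per graphic
    tags: frozenset = frozenset()    # e.g. frozenset({"anger", "cricket"})
    colors: tuple = ()               # e.g. ("red", "white")
```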
[0037] Further, as shown in figure 2, the device 200 may include a processing unit 204. Said processing unit 204 may remain connected to the transceiver unit 208 by one or more wired or wireless means. In an aspect, the processing unit 204 may be configured to map the static emoji/Animoji/sticker/avatar to a plurality of pre-determined/pre-stored animated emoji/Animoji/sticker/avatars using the associated metadata. In an exemplary embodiment, in order to achieve the above aspect, the device 200 may include a database/memory 206, wherein said database/memory 206 may include templates of millions of animated emoji/Animoji/sticker/avatars along with their associated metadata such as sticker ID, information tags and color information. Thus, in order to map the static emoji/Animoji/sticker/avatar to the plurality of pre-determined/pre-stored animated emoji/Animoji/sticker/avatars, the processing unit 204 may be configured to access a lookup table resident inside the database/memory 206 that stores such information.
[0038] In an aspect, to map the received static emoji/Animoji/sticker/avatar to the plurality of pre-determined/pre-stored animated emoji/Animoji/sticker/avatars, the processing unit 204 may be configured to compare at least one item of the metadata information (i.e. sticker ID, tags, color information) of the static emoji/Animoji/sticker/avatar with the metadata of the plurality of animated emoji/Animoji/sticker/avatars using the look-up table (not shown) stored in the database/memory 206. The processing unit 204 is further configured to identify, among the plurality of pre-determined/pre-stored animated emoji/Animoji/sticker/avatars, at least one animated emoji/Animoji/sticker/avatar that closely matches the static emoji/Animoji/sticker/avatar. In view of the disclosure so far, it is to be appreciated that said matching is done on the basis of mapping of metadata, specifically the tags and color information, associated with both the static emoji/Animoji/sticker/avatars and the animated emoji/Animoji/sticker/avatars.
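One plausible way to realise this comparison is to score every pre-stored animated candidate by tag overlap and shared colour information and keep the best scorer. The sketch below (building on the GraphicMetadata sketch above) is an illustrative assumption: the disclosure does not fix a scoring rule, and the 2:1 weighting of tags over colours is an arbitrary choice.

```python
def match_score(static_md: GraphicMetadata, anim_md: GraphicMetadata) -> float:
    """Similarity between a static graphic and one animated candidate,
    using only the metadata fields named for mapping (tags + colours)."""
    tag_overlap = len(static_md.tags & anim_md.tags)
    color_overlap = len(set(static_md.colors) & set(anim_md.colors))
    return 2.0 * tag_overlap + 1.0 * color_overlap  # weighting is illustrative

def closest_animated(static_md, lookup_table):
    """Scan the look-up table of pre-stored animated metadata and return
    the entry that most closely matches the static graphic, or None if
    nothing shares a tag or colour with it."""
    best = max(lookup_table, default=None,
               key=lambda anim: match_score(static_md, anim))
    if best is None or match_score(static_md, best) == 0:
        return None
    return best
```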
[0039] Once the closest related animated emoji/Animoji/sticker/avatar is identified, the processing unit 204 is configured to convert the static emoji/Animoji/sticker/avatar into the animated emoji/Animoji/sticker/avatar using at least one of the tags and color information associated with the identified animated emoji/Animoji/sticker/avatar. In an exemplary embodiment, the processing unit 204 may be configured to generate one or more variants of the animated emoji/Animoji/sticker/avatar for one static emoji/Animoji/sticker/avatar. The processing unit 204 is further configured to generate a sticker ID for each one of the generated animated emoji/Animoji/sticker/avatars. Said sticker ID helps in distinguishing the thus-created animated emoji/Animoji/sticker/avatars from other animated emoji/Animoji/sticker/avatars pre-stored in the database/memory 206.
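The disclosure requires each generated animated graphic to carry a fresh sticker ID that no pre-stored graphic shares. A minimal sketch, assuming a UUID-based scheme; the disclosure does not specify the ID format, so both the prefix and the use of UUIDs are illustrative:

```python
import uuid

def new_sticker_id(prefix: str = "anim") -> str:
    """Generate a sticker ID that is unique with overwhelming probability.

    The prefix marks graphics created by conversion, distinguishing them
    from pre-stored ones; both prefix and UUID scheme are assumptions.
    """
    return f"{prefix}-{uuid.uuid4().hex}"
```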
[0040] In an exemplary embodiment, if the processing unit 204 identifies that at least one tag associated with the static emoji/Animoji/sticker/avatar represents information related to one or more emotions, the processing unit 204 is configured to add the corresponding emotions from the identified animated emoji/Animoji/sticker/avatar to the static emoji/Animoji/sticker/avatar, while converting the static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar on the basis of mapping. In an example, if the processing unit 204 identifies one or more tags associated with the static emoji/Animoji/sticker/avatar representing anger and the associated color information as red, the processing unit 204 may be configured to animate the static emoji/Animoji/sticker/avatar by adding emotions and color from the identified animated emoji/Animoji/sticker/avatar representing anger, such as a "red hot steaming face with wide eyes staring with anger", into the static emoji/Animoji/sticker/avatar to convert it into an animated emoji/Animoji/sticker/avatar.
[0041] In another exemplary embodiment, if the processing unit 204 identifies that at least one tag associated with the static emoji/Animoji/sticker/avatar represents information related to one or more objects contained in/relevant to the static emoji/Animoji/sticker/avatar, the processing unit 204 may be configured to add the corresponding object from the identified animated emoji/Animoji/sticker/avatar to the static emoji/Animoji/sticker/avatar, while converting the static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar on the basis of mapping. In an example, if the processing unit 204 identifies one or more tags associated with the static emoji/Animoji/sticker/avatar as the cricket world cup 2011 lifted by Sachin Tendulkar, the processing unit 204 may be configured to animate the static emoji/Animoji/sticker/avatar by adding objects from the identified animated emoji/Animoji/sticker/avatar representing India winning the world cup, such as a "waving Indian Flag", into the background of the static emoji/Animoji/sticker/avatar.
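The two embodiments above differ only in what a matched tag selects: an emotion animation applied to the graphic itself, or an object animation composited into its background. The dispatch could look like the sketch below; the tag vocabularies and the two rendering helpers are hypothetical stand-ins for steps the disclosure leaves unspecified.

```python
# Hypothetical tag vocabularies; the disclosure does not enumerate them.
EMOTION_TAGS = {"anger", "happy", "thrilled", "proud", "sad"}
OBJECT_TAGS = {"indian flag", "world cup", "trophy"}

def apply_emotion(graphic, anim_source, tag):
    # Placeholder: real code would copy the animation frames for `tag`
    # from the matched animated graphic onto the static graphic.
    return {"base": graphic, "emotion": tag, "frames_from": anim_source}

def add_background_object(graphic, anim_source, tag):
    # Placeholder: real code would composite the animated object for
    # `tag` into the background of the static graphic.
    return {"base": graphic, "object": tag, "frames_from": anim_source}

def convert(static_graphic, static_md, anim_graphic, anim_md):
    """Convert a static graphic using the closest-matching animated one:
    emotion tags pull emotions onto the graphic, object tags pull objects
    into its background, as in paragraphs [0040] and [0041]."""
    result = static_graphic
    for tag in static_md.tags & anim_md.tags:
        if tag in EMOTION_TAGS:
            result = apply_emotion(result, anim_graphic, tag)
        elif tag in OBJECT_TAGS:
            result = add_background_object(result, anim_graphic, tag)
    return result
```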
[0042] Further, it is to be appreciated that in such an embodiment, to achieve the objectives of the present invention, the device 200 may remain connected to the server 106 shown in figure 1 by means of an application (not shown) resident on the user device 200 platform, via the network 104. In one exemplary aspect, the server 106 may be configured to receive the static emoji/Animoji/sticker/avatar from the user device 200 for conversion into an animated emoji/Animoji/sticker/avatar. In such an aspect, the server 106 may be configured to convert the static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar and share it with the device 200 along with the sticker ID. The detailed process for the same is disclosed in figure 3 below. Furthermore, in another aspect, the server may be configured to provide all the necessary information to the user device 200 to convert the static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar.
[0043] Further, in another exemplary aspect, the server 106 may be configured to intercept the static emoji/Animoji/sticker/avatar shared by any of the user devices 102a-102e shown in figure 1 before it reaches the destined user device 200. In such an aspect, the server 106 may be configured to convert the static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar and share it with the device 200 along with the sticker ID, as described in detail in figure 3 below.
[0044] The embodiment described herein can be understood with an example, wherein a user sends a static sticker representing "Sachin Tendulkar lifting the world cup" to another user through his user device. The user device 200 may receive such a sticker and may be configured to convert said static sticker into an animated sticker. It is to be appreciated that in such a scenario the user device 200 may remain connected to the server 106 through the network 104. The server 106, upon receipt of such a static sticker, may be configured to map said static sticker with multiple animated stickers pre-stored in the database/memory, using one or more tags and color information. In the present example, the server 106 may identify one or more tags associated with said sticker as thrilled, happy, proud etc. Based on said tags, the server 106 may identify one or more animated stickers with similar tags (i.e. stickers representing similar emotions). Further, the server 106 may animate said static sticker by adding emotions, such as Sachin crying with happiness and the Indian flag waving in the background, by using the identified animated sticker.
[0045] The embodiment illustrated above is exemplary in nature, and the device 200 and/or any element of the device 200 may include any number of additional components required to perform the desired operation of the device 200. In an exemplary embodiment, the user device 200 may also include an Input/Output unit 202 (also referred to as IO unit 202) for taking user inputs. Further, in an exemplary embodiment, the user device 200 may include other essential elements that may be required for carrying out one or more functionalities of the user device 200; however, the same are not explained for the sake of brevity.
[0046] Coming back to figure 2, the IO unit 202 may be configured to receive one or more user inputs. In an embodiment, the IO unit 202 comprises input/output devices such as, but not limited to, an audio recording device, an image capturing device, a touch display, a keypad, one or more sensors, speakers and so forth. In an embodiment, the audio recording device may be configured to enable the user to create an emoji/sticker/Animoji/avatar or select an emoji/Animoji/sticker/avatar from those stored in the memory. The image capturing device (not shown) may be configured to capture an image and a video, in combination with the audio recording device, of the user. The touch display or the keypad may enable a user to provide one or more manual inputs to the user device 200.
[0047] Further, in an aspect, the memory unit 206 of the user device 200 may be configured to store data and/or instructions required for processing by the processing unit 204. In some embodiments, the memory unit 206 may include memory storage devices such as, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), flash disk, and so forth. The memory unit 206 may store the one or more inputs received from the user.
[0048] In some embodiments, the processing unit 204 may execute a set of instructions stored in the memory unit 206 to provide a user interface to the user. The user interface may allow the user to interact within the environment 100 (shown in Fig. 1). The processing unit 204 may process the one or more user inputs and transmit the processed user inputs to the server 106. In an exemplary embodiment, the processing unit 204 may be configured to identify the metadata of the static emoji/Animoji/sticker/avatar and share said information with the server 106, when required. The user device 200 may further include a display unit (not shown) configured to display the received animated sticker to the user.
[0049] Further, in an embodiment, the user device 200 may be implemented with an Artificial Intelligence (AI) model. The AI model may be configured to extract information such as the frequency of usage of the created animated emoji/Animoji/sticker/avatars by the user to rank the thus-created animated emoji/Animoji/sticker/avatars. Further, the AI model may be configured to identify the more frequently associated/linked tags used in converting static emoji/Animoji/sticker/avatars into animated emoji/Animoji/sticker/avatars to analyze the behavioral pattern of a user. The AI model may also be helpful in determining information including the usage pattern, degree of resemblance and like aspects of said animated emoji/Animoji/sticker/avatars with respect to other animated emoji/Animoji/sticker/avatars created in the past having similar metadata, so as to provide feedback to the processing unit 204 of the user device 200 to make self-improvements. In another embodiment, the device 200 may use one or more known machine learning algorithms to analyze the usage pattern of the animated emoji/Animoji/sticker/avatars using the determined information. This may help the device 200 in making constant improvements in its system to create more and more interactive animated emoji/Animoji/sticker/avatars from static emoji/Animoji/sticker/avatars in the future.
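The frequency-of-usage ranking described above can be approximated without any learned model: count how often each created animated graphic is used or liked and order the catalogue by that count. A minimal sketch, in which a plain counter stands in for the signal the AI model is said to extract:

```python
from collections import Counter

class UsageRanker:
    """Rank created animated graphics by how often users use/like them;
    a stand-in for the AI model's frequency-of-usage signal."""

    def __init__(self):
        self._score = Counter()

    def record_use(self, sticker_id: str, liked: bool = True) -> None:
        # A like raises the rank; a dislike lowers it.
        self._score[sticker_id] += 1 if liked else -1

    def ranked(self, sticker_ids):
        """Return sticker IDs ordered from most- to least-liked."""
        return sorted(sticker_ids, key=lambda s: self._score[s], reverse=True)
```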
[0050] The embodiments illustrated above are exemplary in nature, and the user device 200 may include any other additional components required to perform the desired functionality of the user device 200.
[0051] Fig. 3 is a block diagram illustrating the server 300 for converting static graphical data, herein referred to as a static emoji/Animoji/sticker/avatar, received from another device (not shown) into animated graphical data, herein referred to as an animated emoji/Animoji/sticker/avatar. In an exemplary embodiment, the server 300 is similar to the server 106 disclosed in figure 1. The server 300 includes a receiver unit 302, a processing unit 304, a mapping unit 306, an identification unit 308, a conversion unit 310, a transmission unit 312, and one or more memory units/databases 314. Each of the receiver unit 302, the processing unit 304, the mapping unit 306, the identification unit 308, the conversion unit 310, the transmission unit 312, and the one or more memory units/databases 314 may be operatively coupled to each other.
[0052] The receiver unit 302 may be configured to establish communication with the one or more user devices 102a-102e, via the network 104, to receive a static emoji/Animoji/sticker/avatar. The receiver unit 302 may be configured to receive said static emoji/Animoji/sticker/avatar along with associated metadata from a user device (not shown). The associated metadata may include at least one of sticker ID, necessary tags and color information associated with the static emoji/Animoji/sticker/avatar. In one exemplary embodiment, the receiver unit 302 may be configured to receive the static emoji/Animoji/sticker/avatar along with the associated metadata from the first user device. In another exemplary embodiment, the receiver unit 302 may be configured to receive the information pertaining to the static emoji/Animoji/sticker/avatar from a second user device to which it has been sent by the first user device.
[0053] The server 300 further comprises a processing unit 304 configured to further process the static emoji/Animoji/sticker/avatar along with the associated metadata from the first user device or the second user device. In an aspect, the processing unit 304 includes a mapping unit 306 operatively coupled to the receiver unit 302. The mapping unit 306 may be configured to access the static emoji/Animoji/sticker/avatar and the associated metadata information from the receiver unit 302 and map the static graphic to a plurality of pre-determined/pre-stored animated graphics using the associated metadata. In an embodiment, in order to achieve the above aspect, the server 300 may include a database/memory unit 314, wherein said database/memory unit 314 may include templates of millions of animated emoji/Animoji/sticker/avatars along with their associated metadata such as sticker ID, information tags and color information. Thus, in order to map the static emoji/Animoji/sticker/avatar to the plurality of pre-determined/pre-stored animated emoji/Animoji/sticker/avatars, the mapping unit 306 may be configured to access a lookup table resident inside the database/memory unit 314 that stores such information.
[0054] In an aspect, to map the received static emoji/Animoji/sticker/avatar to the plurality of pre-determined/pre-stored animated emoji/Animoji/sticker/avatars, the mapping unit 306 may be configured to compare at least one item of the metadata information (i.e. sticker ID, tags, color information) of the static emoji/Animoji/sticker/avatar with the metadata of the plurality of animated emoji/Animoji/sticker/avatars using the look-up table (not shown) stored in the database/memory unit 314. In one exemplary aspect, the mapping unit 306 may use one or more known machine learning algorithms to map the metadata of the static emoji/Animoji/sticker/avatar to the metadata of the plurality of animated emoji/Animoji/sticker/avatars using the look-up table, making the process of mapping much simpler and faster.
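The disclosure leaves the "known machine learning algorithms" unnamed. As one concrete stand-in, a k-nearest-neighbour search over the look-up table using Jaccard similarity of tag sets would realise the same mapping; both choices are assumptions made for illustration:

```python
def jaccard(a: frozenset, b: frozenset) -> float:
    """Jaccard similarity of two tag sets, in [0, 1]."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

def map_via_lookup(static_md, lookup_table, k: int = 5):
    """k-nearest-neighbour search over the look-up table: return the k
    pre-stored animated entries whose tag metadata is nearest to that of
    the static graphic. Jaccard-over-tags is an illustrative choice."""
    return sorted(lookup_table, reverse=True,
                  key=lambda anim: jaccard(static_md.tags, anim.tags))[:k]
```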
[0055] Further, the processing unit 304 of the server 300 may include an identification unit 308 operatively connected to the mapping unit 306. The identification unit 308 may be further configured to identify, among the plurality of pre-determined/pre-stored animated emoji/Animoji/sticker/avatars, at least one animated emoji/Animoji/sticker/avatar that closely matches the static emoji/Animoji/sticker/avatar. In view of the disclosure so far, it is to be appreciated that said matching is done on the basis of mapping of metadata, specifically the tags and color information, associated with both the static emoji/Animoji/sticker/avatars and the animated emoji/Animoji/sticker/avatars.
[0056] The processing unit 304 of the server 300 further includes a conversion unit 310 operatively coupled to the identification unit 308. Once the closest related animated emoji/Animoji/sticker/avatar is identified, the conversion unit 310 is configured to retrieve the necessary information related to the identified animated emoji/Animoji/sticker/avatar from the identification unit 308 and convert the static emoji/Animoji/sticker/avatar into the animated emoji/Animoji/sticker/avatar. In an exemplary embodiment, the necessary information may include at least one of the information associated with tags (identifying emotions, sentiments, objects and like information conveyed by the emoji/Animoji/sticker/avatar) and the color information associated with the identified animated emoji/Animoji/sticker/avatar. In an exemplary embodiment, the conversion unit 310 may be configured to generate one or more variants of the animated emoji/Animoji/sticker/avatar for one static emoji/Animoji/sticker/avatar.
[0057] The processing unit 304 of the server 300 is further configured to generate a sticker ID for the created animated emoji/Animoji/sticker/avatar. Said sticker ID helps in distinguishing the thus-created animated emoji/Animoji/sticker/avatar from other animated emoji/Animoji/sticker/avatars pre-stored in the database/memory unit 314. In one example, the created animated emoji/Animoji/sticker/avatar along with the sticker ID is stored in the database/memory unit 314 for future reference. In another
example, the created emoji/Animoji/sticker/avatars along with the sticker ID is sent to
the second user device for the other user to use.
[0058] To understand the functionality of the identification unit 308 and the conversion unit 310 of the server 300, the embodiments below may be referred to.
[0059] In an exemplary embodiment, if the identification unit 308 identifies that at least one tag associated with the static emoji/Animoji/sticker/avatar represents information related to one or more emotions, the conversion unit 310 is configured to add the corresponding emotions from the identified animated emoji/Animoji/sticker/avatar to the static emoji/Animoji/sticker/avatar, while converting the static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar on the basis of mapping. For example, if the identification unit 308 identifies one or more tags associated with the static emoji/Animoji/sticker/avatar representing happiness and the associated color information as pink, the conversion unit 310 may be configured to animate the static emoji/Animoji/sticker/avatar by adding emotions and color from the identified animated emoji/Animoji/sticker/avatar representing happiness, such as a "happy face with pink color in the background", into the static emoji/Animoji/sticker/avatar to convert it into an animated emoji/Animoji/sticker/avatar.
[0060] In another exemplary embodiment, if the identification unit 308 identifies that at least one tag associated with the static emoji/Animoji/sticker/avatar represents information related to one or more objects contained in/relevant to the static emoji/Animoji/sticker/avatar, the conversion unit 310 may be configured to add the corresponding object from the identified animated emoji/Animoji/sticker/avatar to the static emoji/Animoji/sticker/avatar, while converting the static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar on the basis of mapping. In an example, if the identification unit 308 identifies one or more tags associated with the static emoji/Animoji/sticker/avatar as the cricket world cup 2011 lifted by Sachin Tendulkar, the conversion unit 310 may be configured to animate the static emoji/Animoji/sticker/avatar by adding objects from the identified animated emoji/Animoji/sticker/avatar representing India winning the world cup, such as a "waving Indian Flag", into the background of the static emoji/Animoji/sticker/avatar to convert it into an animated emoji/Animoji/sticker/avatar.
[0061] Further, in an aspect, the database/memory unit 314 of the server 300 may be configured to store data and/or instructions required by the processing unit 304 to perform the desired function of the server 300. In some embodiments, the memory unit 314 may include memory storage devices such as, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), flash disk, and so forth. The memory unit 314 may store the one or more inputs received from the user.
[0062] In some embodiments, the processing unit 304 may execute a set of instructions stored in the memory unit 314 to achieve the desired objectives of the present invention. The processing unit 304 may process the one or more static emoji/Animoji/sticker/avatars received from the user and transmit the animated emoji/Animoji/sticker/avatars to the second user device for further use. In an exemplary embodiment, the mapping unit 306, the identification unit 308 and the conversion unit 310 may be a part of the processing unit 304 and may be placed inside the processing unit 304. In another exemplary embodiment, the mapping unit 306, the identification unit 308 and the conversion unit 310 may be separate hardware entities placed external to the processing unit and may remain connected to the processing unit 304 by one or more known wired or wireless means.
[0063] Further, in an embodiment, the server 300 may be implemented with an Artificial Intelligence (AI) model. The AI model may be configured to extract information such as the frequency of usage of the created animated emoji/Animoji/sticker/avatars by the user to rank the thus-created animated emoji/Animoji/sticker/avatars. Further, the AI model may be configured to identify the more frequently associated/linked tags used in converting static emoji/Animoji/sticker/avatars into animated emoji/Animoji/sticker/avatars to analyze the behavioral pattern of a user. The AI model may also be helpful in determining information including the usage pattern, degree of resemblance and like aspects of said animated emoji/Animoji/sticker/avatars with respect to other animated emoji/Animoji/sticker/avatars created in the past having similar metadata, so as to provide feedback to the processing unit 304 of the server 300 to make self-improvements. In another embodiment, the server 300 may use one or more known machine learning algorithms to analyze the usage pattern of the animated emoji/Animoji/sticker/avatars using the determined information. This may help the server 300 in making constant improvements in its system to create more and more interactive animated emoji/Animoji/sticker/avatars from static emoji/Animoji/sticker/avatars in the future.
[0064] The embodiments illustrated above are exemplary in nature, and the server 300 may include any additional unit required to perform the desired operation of the server 300. Further, embodiments of the present disclosure cover or intend to cover any one or more operations as illustrated above being performed at the user devices 102a-102e. The user devices 102a-102e may include any of the corresponding units required to perform said operations of the server 300.
[0065] Fig. 4 is a flowchart of an exemplary method 400 for converting static graphical data received from one device into animated graphical data for another device. This flowchart is provided for illustration purposes, and embodiments are intended to include or otherwise cover any methods or procedures for generating such animation. Fig. 4 is described with reference to Figs. 1-3.
[0066] At block 402, the server 106 may be configured to receive a static graphic, referred to herein as a static emoji/Animoji/sticker/avatar, along with associated metadata from a first user device via the receiver unit 302. In an embodiment, the associated metadata includes at least one of sticker ID, necessary tags and color information.
[0067] At step 404, the method 400 discloses mapping the static emoji/Animoji/sticker/avatar to a plurality of pre-determined animated emoji/Animoji/sticker/avatars using the associated metadata. In an exemplary embodiment, though not explicitly disclosed, the step of mapping includes comparing the metadata of the static emoji/Animoji/sticker/avatar with the metadata of the plurality of animated emoji/Animoji/sticker/avatars.
[0068] At step 406, the method 400 discloses identifying, from among the plurality of pre-determined animated emoji/Animoji/sticker/avatars, at least one animated emoji/Animoji/sticker/avatar that closely matches the static graphic. In particular, the method 400 at step 406 discloses, in one embodiment, identifying at least one tag associated with the static graphic that represents information related to one or more emotions. Further, the method 400 at step 406 discloses, in another embodiment, identifying at least one tag associated with the static graphic that represents information related to one or more objects.
[0069] At step 408, the method 400 further discloses converting the static emoji/Animoji/sticker/avatar into the animated emoji/Animoji/sticker/avatar using the identified animated emoji/Animoji/sticker/avatar. In one exemplary embodiment, the method 400, at step 408, discloses adding the corresponding emotions from the identified animated emoji/Animoji/sticker/avatar to the static emoji/Animoji/sticker/avatar, while converting the static graphic into an animated graphic on the basis of mapping. In another exemplary embodiment, the method 400, at step 408, discloses adding one or more corresponding objects to the static emoji/Animoji/sticker/avatar while converting the static emoji/Animoji/sticker/avatar into an animated emoji/Animoji/sticker/avatar on the basis of mapping.
[0070] At step 410, the method 400 discloses transmitting the animated emoji/Animoji/sticker/avatar along with a sticker ID to one or more other devices for the users' use.
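Read together, blocks 402-410 amount to a short pipeline. The sketch below strings the earlier illustrative helpers (closest_animated, convert, new_sticker_id) into one server-side routine; the animated_store mapping from sticker ID to stored template and the transmit callable are placeholders for mechanisms the disclosure leaves to the transmission unit 312, so the routine is an assumption-laden illustration rather than the claimed method itself.

```python
def method_400(static_graphic, static_md, lookup_table, animated_store,
               recipients, transmit):
    """Illustrative end-to-end flow of Fig. 4 (blocks 402-410).

    402: the static graphic and its metadata arrive as arguments.
    404/406: map against the pre-stored animated metadata (lookup_table)
             and identify the closest match.
    408: convert the static graphic using the identified match.
    410: transmit the result, with a fresh sticker ID, onward.
    """
    anim_md = closest_animated(static_md, lookup_table)          # 404 + 406
    if anim_md is None:
        return None  # nothing in the catalogue matches
    anim_graphic = animated_store[anim_md.sticker_id]  # fetch matched template
    animated = convert(static_graphic, static_md, anim_graphic, anim_md)  # 408
    sticker_id = new_sticker_id()            # unique ID for the new graphic
    for device in recipients:                                     # 410
        transmit(device, animated, sticker_id)
    return animated, sticker_id
```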
[0071] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, unit, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
[0072] The foregoing description of the various embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein, and instead the claims should be accorded the widest scope consistent with the principles and novel features disclosed herein.
[0073] While the invention has been described with reference to a preferred embodiment, it is apparent that variations and modifications will occur without departing from the spirit and scope of the invention. It is therefore contemplated that the present disclosure covers any and all modifications, variations or equivalents that fall within the scope of the basic underlying principles disclosed above.
[0074] REFERENCE NUMERALS:
Reference Numerals Components
100 Environment for working of said invention
102a-102e User devices
104 Network
106 Server
200 User device
202 Input/output unit of the user device
204 Processing unit of the user device
206 Database/memory unit of the user device
208 Transceiver unit of the user device
300 Server
302 Receiver unit of the server
304 Processing unit of the server
306 Mapping unit of the server
308 Identification unit of the server
310 Conversion unit of the server
312 Transmission unit of the server
314 Memory unit of the server
400 Method flowchart
402-410 Method steps
We Claim:
1. A method for converting a static graphic into an animated graphic, said method comprising:
receiving a static graphic along with associated metadata from a first user device;
mapping the static graphic to a plurality of pre-determined animated graphics using the associated metadata;
identifying, among the plurality of pre-determined animated graphics, at least one animated graphic that closely matches the static graphic;
converting the static graphic into the animated graphic using the identified animated graphic; and
transmitting the animated graphic along with a sticker ID to one or more other devices.
2. The method as claimed in claim 1, wherein the associated metadata includes at
least one of sticker ID, necessary tags and colour information.
3. The method as claimed in claim 1, wherein the step of mapping further comprises comparing the metadata of the static graphic with the metadata of the plurality of animated graphics.
4. The method as claimed in claim 1, further comprising:
identifying at least one tag associated with the static graphic that represents information related to one or more emotions; and
adding the corresponding emotions from the identified animated graphic to the static graphic, while converting the static graphic into an animated graphic on the basis of mapping.
5. The method as claimed in claim 1, further comprising:
identifying at least one tag associated with the static graphic that represents information related to one or more objects; and
adding one or more corresponding objects to the static graphic while converting the static graphic into an animated graphic on the basis of mapping.
6. The method as claimed in claim 1, further comprising:
generating one or more animated graphics from the static graphic, based on mapping; and
ranking the one or more generated animated graphics on the basis of users' likes or dislikes.
7. A device for converting a static graphic into an animated graphic, said device comprising:
a receiver unit configured to receive a static graphic along with associated metadata from a first user device;
a processing unit operatively coupled to the receiver unit, wherein the processing unit further comprises:
a mapping unit configured to map the static graphic to a plurality of pre-determined animated graphics using the associated metadata;
an identification unit configured to identify, among the plurality of pre-determined animated graphics, at least one animated graphic that closely matches the static graphic;
a conversion unit configured to convert the static graphic into the animated graphic using the identified animated graphic; and
a transmitter unit operatively coupled to the processing unit, said transmitter unit configured to transmit the animated graphic along with a sticker ID to one or more other devices.
8. The device as claimed in claim 7, wherein the associated metadata includes at
least one of sticker ID, necessary tags and colour information.
9. The device as claimed in claim 7, wherein the mapping unit is further
configured to compare the metadata of the static graphic with the metadata of
the plurality of animated graphics.
10. The device as claimed in claim 7, wherein the processing unit is further configured to:
identify at least one tag associated with the static graphic that represents information related to one or more emotions; and
add the corresponding emotions from the identified animated graphic to the static graphic, while converting the static graphic into an animated graphic on the basis of mapping.
11. The device as claimed in claim 7, wherein the processing unit is further configured to:
identify at least one tag associated with the static graphic that represents information related to one or more objects; and
add one or more corresponding objects to the static graphic while converting the static graphic into an animated graphic on the basis of mapping.
12. The device as claimed in claim 7, wherein the processing unit is further configured to:
generate one or more animated graphics from the static graphic, based on mapping; and
rank the one or more generated animated graphics on the basis of users' likes or dislikes.