Abstract: Embodiments of the present invention may relate to a system [100] and a method [200] for a live camera frame modification, including the steps of: receiving a session request to share the live camera frame from a first social networking application stored on a first electronic device [102]; sending an acknowledgement from a second social networking application to the first social networking application in response to the session request, wherein the second social networking application is stored on a second electronic device [108]; receiving and displaying said live camera frame within an interface of the second social networking application at the second electronic device [108]; receiving a selection of at least one artefact for the displayed live camera frame; and communicating, in real-time, at least one information associated with the selected at least one artefact from the second social networking application to the first social networking application, to modify said live camera frame. FIG.1
TECHNICAL FIELD
Embodiments of the present invention generally relate to the field of social networking. More particularly, the present invention relates to a method and a system for live camera frame modification.
BACKGROUND
This section is intended to provide information relating to the general state of the art, and thus any approach/functionality described below should not be assumed to qualify as prior art merely by its inclusion in this section.
Currently, it is well known in the art that when a user creates a document, the user may share the document with other users, authorizing the other users to make edits in the document simultaneously. Each edit made by any of the other users is clearly seen at the time the edit is made and afterwards. These edits may include, but not limited to, changing font, text or lines, adding additional information to the document, inserting figures, or a combination thereof.
With the advancement in technology, it has also been made possible for multiple users to share and edit live camera frames simultaneously. When a first user is about to capture a camera frame, and before finally capturing and saving the camera frame in a memory of a mobile device, the first user may invite a second user and share the live camera frame with the second user. Sharing the live camera frame with the second user further allows the second user to make changes or edits in the live camera frame of the first user.
However, the existing systems and methods for live camera frame sharing and editing pose certain drawbacks. Firstly, when the second user makes edits in the live frame, the edits so made get applied to the live camera frame at the time such edits are made. Then, the entire live camera frame with the applied edits is transmitted back to the first user. Secondly, in such systems
and methods for sharing the live camera frame, the first user is unable to undo edits applied by the second user to the shared live frames. As a result, the actual live camera frame gets lost or damaged as the edits or changes are applied on the live camera frame images and the first user is unable to undo such edits. Thirdly, if the first user wants to make some further edits or changes in the live camera frame, the first user has to again start sharing the live camera frames. Moreover, the sharing of the entire camera frame along with the applied changes/edits from the second user to the first user consumes more processing power, data bandwidth and time.
Therefore, in view of the above drawbacks and limitations of the existing systems and methods, a method and a system are required for sharing a live camera frame to apply edits or changes, to overcome the above-mentioned problems.
SUMMARY
This section is provided to introduce certain embodiments and aspects of the present disclosure in a simplified form that are further described below in the detailed description. This summary is not intended to identify the key features or the scope of the claimed subject matter.
Embodiments of the present invention may relate to a method for a live camera frame modification, the method comprising: receiving a session request to share the live camera frame from a first social networking application stored on a first electronic device; sending an acknowledgement from a second social networking application to the first social networking application in response to the session request, wherein the second social networking application is stored on a second electronic device; based on said acknowledgement, receiving and displaying said live camera frame within an interface of the second social networking application at the second electronic device; receiving a selection of at least one artefact, at the interface of the second social networking application, for the displayed live camera frame; and communicating, in real-time, at least one information associated with the selected at least one artefact from the second social networking application to the first social networking application, to modify said live camera frame.
Embodiments of the present invention may further relate to a system for a live camera frame modification, the system comprising: a memory; a processor coupled to the memory; a communication module configured to: receive a session request to share the live camera frame from a first social networking application stored on a first electronic device, and send an acknowledgement from a second social networking application to the first social networking application in response to the session request, wherein the second social networking application is stored on a second electronic device; and an interface unit coupled with the communication module, configured to: based on said acknowledgement, receive and display said live camera frame within an interface of the second social networking application at the second electronic device, and receive a selection of at least one artefact, at the interface of the second social networking application, for the displayed live camera frame; wherein the communication module is further configured to communicate, in real-time, at least one information associated with the selected at least one artefact from the second social networking application to the first social networking application, to modify said live camera frame.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will
be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components or circuitry commonly used to implement such components.
FIG.1 illustrates an exemplary system architecture [100] for live camera frame modification, in accordance with an embodiment of the present invention.
FIG.2 illustrates a method flow diagram [200] for live camera frame modification, in accordance with an embodiment of the present invention.
FIG.3 illustrates a signaling flow diagram [300] for live camera frame modification, in accordance with a first embodiment of the present invention.
FIG.4 illustrates a signaling flow diagram [400] for live camera frame modification, in accordance with a second embodiment of the present invention.
DETAILED DESCRIPTION
In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent, however, that embodiments of the present invention may be practiced without these specific details or with additional details that may be obvious to a person skilled in the art. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address any of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein. Example embodiments of the present invention are described below, as illustrated in various drawings.
Embodiments of the present invention relate to systems and methods for live camera frame modification of a social networking application present on an electronic device.
As used herein, the social networking application may refer to a software/mobile application present on the electronic device and having a list of one or more users connected with each other through the social networking application.
As used herein, the electronic device may include, but is not limited to, a mobile phone, a tablet, a phablet, a laptop, a desktop computer, a personal digital assistant (PDA), a wearable device, a plain old telephone service device and any such device obvious to a person skilled in the art.
As illustrated in FIG.1, an exemplary system architecture [100] for live camera frame modification is provided, in accordance with an embodiment of the present invention. The system architecture [100] includes, but not limited to, a first user [User A] associated with a first electronic device [102], a second user [User B] associated with a second electronic device [108], the first electronic device [102] and the second electronic device [108] connected with a server [106] through a network [104]. Any communication or exchange of information between the first electronic device [102], the second electronic device [108] and the server [106] takes place through the network [104]. Further, the first electronic device [102] includes, but not limited to, a communication module [102A], an interface unit [102B], a camera unit [102C], one or more applications [102D], a memory [102E], and a processor [102F]. Similarly, the second electronic device [108] includes, but not limited to, a communication module [108A], an interface unit [108B], a camera unit [108C], one or more applications [108D], a memory [108E], and a processor [108F].
The first electronic device [102] is configured to execute, using the processor [102F], one or more applications [102D]. Such applications include, but not
limited to, a social networking application, a mail application, a gaming
application and any such application that is obvious to a person skilled in the art.
In a preferred embodiment, the first electronic device [102] is configured to
execute a first social networking application stored in the memory [102E] of the
first electronic device [102]. Further, the first user [User A] is associated with the
first electronic device [102] and operates the first social networking application present on the first electronic device [102].
The second electronic device [108] is configured to execute, using the processor [108F], one or more applications [108D]. Such applications include, but not
limited to, a social networking application, a mail application, a gaming
application and any such application that is obvious to a person skilled in the art. In a preferred embodiment, the second electronic device [108] is configured to execute a second social networking application stored in the memory [108E] of the second electronic device [108]. Further, the second user [User B] is
associated with the second electronic device [108] and operates the second
social networking application present on the second electronic device [108].
The first user [User A] accesses the first social networking application present on the first electronic device [102] and then opens up a customized camera associated with the first social networking application through the camera unit [102C] of the first electronic device [102]. Subsequently, the first user [User A]
sends a session request using the first social networking application, to the second user [User B] for sharing a live camera frame with the second user [User B], wherein the session request is transmitted through the communication module [102A] of the first electronic device [102] and the live camera frame is
associated with the first social networking application. The session request may include, but not limited to, a camera frame sharing session request, a video call session request, and any such request obvious to a person skilled in the art.
The second social networking application present on the second electronic
device [108] receives the session request from the first social networking
application present on the first electronic device [102], wherein the session
request is received at the second electronic device [108] through the
communication module [108A] of the second electronic device [108]. Once the
second social networking application receives the session request from the first
social networking application, the second user [User B] either accepts or rejects
the session request and an acknowledgment is sent, by the second social
networking application to the first social networking application, indicating an
acceptance or a rejection of the session request.
In the event the second user [User B] rejects the session request, the session is terminated. In the event the second user [User B] accepts the session request,
the live camera frame associated with the first social networking application
present at the first electronic device [102] is transmitted from the first social
networking application present at the first electronic device [102] to the second
social networking application present at the second electronic device [108]. Further, the live camera frame associated with the first social networking application is transmitted through the communication module [102A] of the first electronic device [102].
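By way of a non-limiting illustration only, the following Kotlin sketch models the session set-up exchange described above; the message types, field names and the accept/reject handling shown here are assumptions made for illustration and are not prescribed by the present description.

```kotlin
// Hypothetical message types for the session set-up; none of these names or the
// wire format are prescribed by the description.
enum class SessionType { CAMERA_FRAME_SHARING, VIDEO_CALL }

data class SessionRequest(val fromUser: String, val toUser: String, val type: SessionType)

data class Acknowledgement(val accepted: Boolean)

// Handling at the second social networking application: the second user either
// accepts or rejects the incoming request, and an acknowledgement indicating that
// decision is returned to the first social networking application. On acceptance
// the first application starts transmitting the live camera frame; on rejection
// the session is terminated.
fun handleSessionRequest(request: SessionRequest, userAccepts: Boolean): Acknowledgement {
    println("Incoming ${request.type} request from ${request.fromUser} to ${request.toUser}")
    return Acknowledgement(accepted = userAccepts)
}
```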
Subsequently, the live camera frame associated with the first social networking
application is received by the second social networking application through the communication module [108A] of the second electronic device [108]. Further, the live camera frame associated with the first social networking application is displayed within an interface of the second social networking application,
wherein the live camera frame is displayed through the interface unit [108B] of
the second electronic device [108].
Once the live camera frame associated with the first social networking application is displayed within the interface of the second social networking
application, the second user [User B] selects at least one artefact at the interface of the second social networking application for the live camera frame associated with the first social networking application. Such selection of the at least one artefact is then received by the second social networking application.
The at least one artefact may include, but not limited to, a sticker, a filter, an
image, a color and any such artefact obvious to a person skilled in the art. In an
embodiment, the at least one artefact is a pre-stored or pre-defined artefact by
the second social networking application and the at least one artefact has a
unique resource identifier. In another embodiment, the at least one artefact is
any other artefact which is not defined by the second social networking
application. Further, the at least one artefact is either stored in the memory [108E] of the second electronic device [108] or stored at a remote/cloud storage.
After the second social networking application receives the selection of the at least one artefact, the second social networking application communicates, in
real-time, at least one information associated with the selected at least one artefact to the first social networking application, to modify said live camera frame. The at least one information may include, but not limited to, the unique resource identifier, a coordinate information, and an RGB colour value associated with the at least one artefact. In the event the selected at least one artefact is also a pre-stored or pre-defined artefact by the first social networking application, the unique resource identifier and the coordinate information associated with the selected at least one artefact are communicated to the first social networking application. In the event the selected at least one artefact is not a pre-stored or pre-defined artefact by the first social networking application, the RGB colour value and the coordinate information associated with the selected at least one artefact are communicated to the first social networking application. Further, the at least one information is communicated from the second social networking application through the communication module [108A]
of the second electronic device [108] to the first social networking application through the communication module [102A] of the first electronic device [102].
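By way of a non-limiting illustration only, the following Kotlin sketch models the sender-side rule described above, using hypothetical type and field names (Artefact, ArtefactInfo, buildArtefactInfo): when the selected artefact is also pre-defined at the first social networking application, only its unique resource identifier and coordinate information are placed in the message; otherwise the RGB colour value and coordinate information are used.

```kotlin
// Hypothetical representation of an artefact selected at the second application.
data class Artefact(val resourceId: String?, val rgb: Triple<Int, Int, Int>?)

// The "at least one information" communicated for the selected artefact.
data class ArtefactInfo(
    val resourceId: String?,         // set when the artefact is pre-defined at the first application
    val rgb: Triple<Int, Int, Int>?, // set when it is not
    val x: Int,                      // coordinate information on the live camera frame
    val y: Int
)

fun buildArtefactInfo(artefact: Artefact, x: Int, y: Int, knownToFirstApp: Boolean): ArtefactInfo =
    if (knownToFirstApp && artefact.resourceId != null)
        ArtefactInfo(resourceId = artefact.resourceId, rgb = null, x = x, y = y)
    else
        ArtefactInfo(resourceId = null, rgb = artefact.rgb, x = x, y = y)

// One possible JSON rendering of the at least one information; the description
// equally allows XML, protocol buffer, binary or custom formats.
fun ArtefactInfo.toJson(): String = buildString {
    append("{")
    resourceId?.let { append("\"resourceId\":\"$it\",") }
    rgb?.let { append("\"rgb\":[${it.first},${it.second},${it.third}],") }
    append("\"x\":$x,\"y\":$y}")
}
```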
Once the first social networking application receives the at least one information
associated with the selected at least one artefact, the first social networking
application parses the received at least one information and displays, through
the interface unit [102B], the selected at least one artefact on the live camera
frame based on the received at least one information. Further, the at least one
information is communicated in one of an XML format, a JSON format, a custom
data format, a binary data format, a protocol buffer format and any such format
that is obvious to a person skilled in the art. Moreover, the at least one artefact
is suggested only during a display time of the live camera frame i.e. the time during which the live camera frames are shared with other users.
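Continuing the illustration, the following sketch shows a hypothetical receiver side: assuming the communicated information has already been deserialized into the ArtefactInfo type from the previous sketch, the first social networking application looks the artefact up by its unique resource identifier when it is pre-defined locally, and otherwise reconstructs a simple coloured overlay from the RGB colour value, in both cases drawing it at the received coordinates without altering the underlying live camera frame.

```kotlin
// Hypothetical receiver at the first social networking application; the map of
// pre-defined artefacts stands in for its local sticker/filter catalogue.
class SuggestionRenderer(private val predefinedArtefacts: Map<String, String>) {

    // Applies the received information as a suggestion overlaid on the live
    // camera frame; the frame itself is never replaced, so the first user can
    // still accept or discard the suggestion.
    fun applyToLiveFrame(info: ArtefactInfo) {
        val id = info.resourceId
        val rgb = info.rgb
        val overlay = when {
            id != null -> predefinedArtefacts[id] ?: "unknown artefact $id"
            rgb != null -> "colour overlay rgb(${rgb.first},${rgb.second},${rgb.third})"
            else -> return
        }
        println("suggest '$overlay' at (${info.x}, ${info.y}) on the live camera frame")
    }
}
```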
On displaying the selected at least one artefact on the live camera frame, the
first user [User A] either accepts or discards, in real-time, the at least one artefact suggested by the second user [User B] to the first user [User A]. Alternatively, the first user [User A] saves such suggestions (i.e. the at least one artefact) provided by the second user [User B] in the memory [102E] of the first electronic device [102], for later review and implementation of such suggestions (i.e. in non-real-time).
For instance, when the second social networking application receives a selection of a “crown” sticker from the second user [User B], the unique resource identifier associated with the “crown” sticker is communicated to the first social networking application, if the first social networking application already has the “crown” sticker defined. If the “crown” sticker is not defined at the first social networking application, then the RGB color value associated with the “crown” sticker is communicated to the first social networking application. Then, the first social networking application displays the “crown” sticker for the live camera
frame based on the unique resource identifier or the RGB color value, and the first user [User A] accepts or discards the suggestion of the “crown” sticker.
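The “crown” example may be traced through the hypothetical helpers of the two previous sketches as follows; the coordinates and the resource identifier string are illustrative assumptions.

```kotlin
fun main() {
    val crown = Artefact(resourceId = "sticker/crown", rgb = Triple(255, 215, 0))
    val firstApp = SuggestionRenderer(predefinedArtefacts = mapOf("sticker/crown" to "crown sticker"))

    // Case 1: the "crown" sticker is already defined at the first application, so only
    // the unique resource identifier and the coordinates travel over the network.
    val known = buildArtefactInfo(crown, x = 120, y = 80, knownToFirstApp = true)
    println(known.toJson())   // {"resourceId":"sticker/crown","x":120,"y":80}
    firstApp.applyToLiveFrame(known)

    // Case 2: the sticker is not defined at the first application, so the RGB colour
    // value and the coordinates are communicated instead.
    val unknown = buildArtefactInfo(crown, x = 120, y = 80, knownToFirstApp = false)
    println(unknown.toJson()) // {"rgb":[255,215,0],"x":120,"y":80}
    firstApp.applyToLiveFrame(unknown)
}
```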
FIG.2 illustrates a method flow diagram [200] for live camera frame modification,
in accordance with an embodiment of the present invention, wherein the
method initiates at step 202 and said method [200] is performed by the
second social networking application.
Step 204 includes receiving, by the second social networking application present on the second electronic device [108], the session request from the first social networking application present on the first electronic device [102], wherein the session request is received at the second electronic device [108] through the communication module [108A] of the second electronic device [108]. The session request is received by the second social networking application when the first user [User A] accesses the first social networking application present on the first electronic device [102] and then opens up a customized camera associated with the first social networking application through the camera unit [102C] of the first
electronic device [102]. Subsequently, the first user [User A] sends a session request using the first social networking application, to the second user [User B] for sharing a live camera frame with the second user [User B], wherein the session request is transmitted through the communication module [102A] of the
first electronic device [102] and the live camera frame is associated with the first social networking application. The session request may include, but not limited to, a camera frame sharing session request, a video call session request, and any such request obvious to a person skilled in the art.
Next, step 206 includes sending the acknowledgment by the second social
networking application to the first social networking application, indicating an
acceptance or a rejection of the session request after the second social
networking application receives the session request from the first social
networking application. Such acceptance or rejection of the session request is provided by the second user [User B].
Step 208 includes receiving the live camera frame associated with the first social
networking application by the second social networking application through the
communication module [108A] of the second electronic device [108]. Further,
the method [200] displays the live camera frame associated with the first social
networking application within an interface of the second social networking
application, wherein the live camera frame is displayed through the interface
unit [108B] of the second electronic device [108]. In the event the second user [User B] accepts the session request, the live camera frame associated with the
first social networking application present at the first electronic device [102] is transmitted from the first social networking application present at the first electronic device [102] to the second social networking application present at the second electronic device [108].
At step 210, once the live camera frame associated with the first social
networking application is displayed within the interface of the second social networking application, the second user [User B] selects at least one artefact at the interface of the second social networking application for the live camera frame associated with the first social networking application. The step 210
includes receiving, by the second social networking application, such selection of
the at least one artefact.
Step 212 includes communicating, by the second social networking application, in real-time, at least one information associated with the selected at least one artefact to the first social networking application, to modify said live camera frame.
The at least one information may include, but not limited to, the unique resource identifier, a coordinate information, and an RGB colour value associated with the at least one artefact. In the event the selected at least one artefact is also a pre-stored or pre-defined artefact by the first social networking application, the
unique resource identifier and the coordinate information associated with the
selected at least one artefact are communicated to the first social networking application. In the event the selected at least one artefact is not a pre-stored or
pre-defined artefact by the first social networking application, the RGB colour
value and the coordinate information associated with the selected at least one
artefact are communicated to the first social networking application.
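A compact, purely illustrative sketch of the method [200] from the point of view of the second social networking application is given below; it reuses the hypothetical types from the earlier sketches, and the receiveFrame, displayFrame, awaitArtefactSelection and send parameters stand in for the communication module [108A] and the interface unit [108B].

```kotlin
fun runMethod200(
    request: SessionRequest,
    userAccepts: Boolean,
    receiveFrame: () -> ByteArray,
    displayFrame: (ByteArray) -> Unit,
    awaitArtefactSelection: () -> Pair<Artefact, Pair<Int, Int>>,
    send: (String) -> Unit,
    knownToFirstApp: (Artefact) -> Boolean
) {
    // Steps 204 and 206: receive the session request and return an acknowledgement
    // indicating acceptance or rejection by the second user.
    val ack = handleSessionRequest(request, userAccepts)
    if (!ack.accepted) return  // the session is terminated on rejection

    // Step 208: receive the live camera frame and display it within the interface
    // of the second social networking application.
    val frame = receiveFrame()
    displayFrame(frame)

    // Step 210: receive the second user's selection of at least one artefact
    // for the displayed live camera frame.
    val (artefact, position) = awaitArtefactSelection()

    // Step 212: communicate only the information associated with the artefact,
    // in real time, so the first application can modify its live camera frame.
    val info = buildArtefactInfo(artefact, position.first, position.second, knownToFirstApp(artefact))
    send(info.toJson())
}
```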
FIG.3 illustrates a signaling flow diagram [300] for live camera frame modification, in accordance with a first embodiment of the present invention. The signaling flow diagram [300] includes the following steps:
At step 302, the first user [User A] sends the session request using the first social
networking application, to the second user [User B] for sharing a live camera frame with the second user [User B], wherein the session request is transmitted through the communication module [102A] of the first electronic device [102] and the live camera frame is associated with the first social networking
application. The session request is transmitted when the first user [User A] accesses the first social networking application present on the first electronic device [102] and then opens up a customized camera associated with the first social networking application through the camera unit [102C] of the first electronic device [102].
20 At step 304, the second social networking application present on the second
electronic device [108] receives the session request from the first social networking application present on the first electronic device [102], wherein the session request is received at the second electronic device [108] through the communication module [108A] of the second electronic device [108]. Once the
second social networking application receives the session request from the first
social networking application, the second user [User B] either accepts or rejects the session request and an acknowledgment is sent, by the second social
networking application to the first social networking application, indicating an acceptance or a rejection of the session request.
At step 306, in the event the second user [User B] accepts the session request,
the live camera frame associated with the first social networking application
present at the first electronic device [102] is transmitted from the first social
networking application present at the first electronic device [102] to the second
social networking application present at the second electronic device [108].
Further, the live camera frame associated with the first social networking
application is transmitted through the communication module [102A] of the first
electronic device [102].
At step 308, the live camera frame associated with the first social networking
application is received by the second social networking application through the
communication module [108A] of the second electronic device [108]. Further,
the live camera frame associated with the first social networking application is
displayed within an interface of the second social networking application,
wherein the live camera frame is displayed through the interface unit [108B] of the second electronic device [108].
At step 310, once the live camera frame associated with the first social
networking application is displayed within the interface of the second social
networking application, the second user [User B] selects at least one artefact at
the interface of the second social networking application for the live camera frame associated with the first social networking application. Such selection of the at least one artefact is then received by the second social networking application.
At step 312, after the second social networking application receives the selection
of the at least one artefact, the second social networking application communicates, in real-time, at least one information associated with the
selected at least one artefact to the first social networking application, to modify said live camera frame.
At step 314, once the first social networking application receives the at least one
information associated with the selected at least one artefact, the first social
networking application parses the received at least one information and displays,
through the interface unit [102B], the selected at least one artefact on the live
camera frame based on the received at least one information. Further, the at
least one information is communicated in one of an XML format, a JSON format,
a custom data format, a binary data format, a protocol buffer format and any
such format that is obvious to a person skilled in the art. Moreover, the at least
one artefact is suggested only during a display time of the live camera frame i.e. the time during which the live camera frames are shared with other users.
At step 316, on displaying the selected at least one artefact on the live camera
frame, the first user [User A] either accepts or discards, in real-time, the at least one artefact suggested by the second user [User B] to the first user [User A].
Alternatively, the first user [User A] saves such suggestions (i.e. the at least one artefact) provided by the second user [User B] in the memory [102E] of the first electronic device [102], for later review and implementation of such suggestions (i.e. in non-real-time).
FIG.4 illustrates a signaling flow diagram [400] for live camera frame
modification, in accordance with a second embodiment of the present invention. The signaling flow diagram [400] includes the following steps:
At step 402, the first user [User A] sends a video session request using the first
social networking application, to the second user [User B] for sharing a live
camera frame with the second user [User B], wherein the video session request
is transmitted through the communication module [102A] of the first electronic device [102] and the live camera frame is of the first user associated with the
first social networking application. The video session request is transmitted when
the first user [User A] accesses the first social networking application present on
the first electronic device [102] and then opens up a customized camera
associated with the first social networking application through the camera unit
[102C] of the first electronic device [102].
At step 404, the second social networking application present on the second electronic device [108] receives the video session request from the first social networking application present on the first electronic device [102], wherein the video session request is received at the second electronic device [108] through
the communication module [108A] of the second electronic device [108]. Once
the second social networking application receives the video session request from the first social networking application, the second user [User B] either accepts or rejects the video session request and an acknowledgment is sent, by the second social networking application to the first social networking application, indicating
an acceptance or a rejection of the video session request.
At step 406, in the event the second user [User B] accepts the video session
request, the live camera frame associated with the first social networking
application present at the first electronic device [102] is transmitted from the
first social networking application present at the first electronic device [102] to
the second social networking application present at the second electronic device
[108]. Further, the live camera frame associated with the first social networking application is transmitted through the communication module [102A] of the first electronic device [102].
At step 408, in the event the second user [User B] accepts the video session request, a live camera frame associated with the second social networking
application present at the second electronic device [108] is transmitted from the second social networking application present at the second electronic device [108] to the first social networking application present at the first electronic
device [102]. Further, the live camera frame associated with the second social networking application is transmitted through the communication module [108A] of the second electronic device [108].
At step 410A, the live camera frame of the first user associated with the first
social networking application is received by the second social networking
application through the communication module [108A] of the second electronic device [108]. Further, the live camera frame associated with the first social networking application is displayed within a second interface of the second social networking application, wherein the live camera frame is displayed through the
interface unit [108B] of the second electronic device [108]. At step 410B, the live
camera frame of the second user associated with the second social networking application is received by the first social networking application through the communication module [102A] of the first electronic device [102]. Further, the live camera frame associated with the second social networking application is
displayed within a first interface of the first social networking application,
wherein the live camera frame is displayed through the interface unit [102B] of the first electronic device [102].
At step 412A, once the live camera frame of the second user associated with the second social networking application is displayed within the first interface of the
first social networking application, the first user [User A] selects at least one
artefact at the first interface of the first social networking application for the live camera frame of the second user associated with the second social networking application. Such selection of the at least one artefact is then received by the first social networking application. At step 412B, once the live camera frame
associated with the first social networking application is displayed within the
interface of the second social networking application, the second user [User B] selects at least one artefact at the interface of the second social networking application for the live camera frame associated with the first social networking
application. Such selection of the at least one artefact is then received by the second social networking application.
At step 414, after the second social networking application receives the selection
of the at least one artefact, the second social networking application
communicates, in real-time, at least one information associated with the
selected at least one artefact to the first social networking application, to modify
said live camera frame. Also, after the first social networking application
receives the selection of the at least one artefact, the first social networking
application communicates, in real-time, at least one information associated with
the selected at least one artefact to the second social networking application, to modify said live camera frame.
At step 416A, once the first social networking application receives the at least one information associated with the selected at least one artefact, the first social networking application parses the received at least one information and displays,
through the interface unit [102B] of the first electronic device [102], the selected
at least one artefact on the live camera frame based on the received at least one information. At step 416B, once the second social networking application receives the at least one information associated with the selected at least one artefact, the second social networking application parses the received at least
one information and displays, through the interface unit [108B] of the second
electronic device [108], the selected at least one artefact on the live camera frame based on the received at least one information.
At step 418A, on displaying the selected at least one artefact on the live camera
frame, the first user [User A] either accepts or discards, in real-time, the at least one artefact suggested by the second user [User B] to the first user [User A].
Alternatively, the first user [User A] saves such suggestions (i.e. the at least one artefact) provided by the second user [User B] in the memory [102E] of the first electronic device [102], for later review and implementation of such suggestions
(i.e. in non-real-time). At step 418B, on displaying the selected at least one
artefact on the live camera frame, the second user [User B] either accepts or discards, in real-time, the at least one artefact suggested by the first user [User A]
to the second user [User B]. Alternatively, the second user [User B] saves such
suggestions (i.e. the at least one artefact) provided by the first user [User A] in
the memory [108E] of the second electronic device [108], for later review and implementation of such suggestions (i.e. in non-real-time).
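As a brief illustration of the symmetry of this second embodiment, both applications may run the same suggestion-applying logic sketched earlier; the helper below is an assumption for illustration only, reusing the hypothetical SuggestionRenderer and ArtefactInfo types.

```kotlin
// Both devices play both roles in the video call embodiment: each transmits its
// own live camera frame and each applies the artefact information received from
// the other (steps 416A and 416B).
fun exchangeSuggestions(
    firstApp: SuggestionRenderer,   // running on the first electronic device [102]
    secondApp: SuggestionRenderer,  // running on the second electronic device [108]
    fromFirstUser: ArtefactInfo,    // User A's suggestion for User B's frame
    fromSecondUser: ArtefactInfo    // User B's suggestion for User A's frame
) {
    secondApp.applyToLiveFrame(fromFirstUser)   // step 416B
    firstApp.applyToLiveFrame(fromSecondUser)   // step 416A
}
```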
The present invention has the following technical advantages over the existing live camera frame modification systems and methods: (1) the present invention
saves battery of the electronic device as only the at least one information
associated with the at least one artefact is transmitted rather than the whole camera frame; (2) the present invention consumes less bandwidth; (3) the present invention requires less processing power; and (4) the present invention takes less time for communication of the at least one information associated
with the at least one artefact.
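As a purely illustrative, back-of-the-envelope check of the bandwidth advantage, the figures used below (the size of one compressed frame and one serialized artefact message) are assumptions and not values taken from the present description.

```kotlin
// Illustrative comparison only: assumed sizes, not figures from the description.
fun approximateSavingsRatio(): Int {
    val compressedFrameBytes = 200_000  // assumed size of one compressed shared frame
    val artefactInfoBytes = """{"resourceId":"sticker/crown","x":120,"y":80}""".length
    // With these assumptions the ratio is roughly a few thousand to one, which is
    // the basis of the bandwidth, processing and time savings noted above.
    return compressedFrameBytes / artefactInfoBytes
}
```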
The units, modules, databases and/or components discussed may be present in
the form of hardware, software, or a hardware-software combination for performing functions and/or operations, as described herein. The connections and/or links between each module, database and/or component shown in the figures are exemplary and may be connected in any possible way that is obvious to a person skilled in the art. The connections and/or links between each module, database and/or component may be physical (such as wired or wireless connections/links) or logical (such as implemented in a semiconductor device).
According to one embodiment of the present disclosure, the techniques
described herein are implemented by one or more special-purpose computing
devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays
(FPGAs) that are persistently programmed to perform the techniques, or may
include one or more general purpose hardware processors programmed to
perform the techniques pursuant to program instructions in firmware, memory,
other storage, or a combination. Such special-purpose computing devices may
also combine custom hard-wired logic, ASICs, or FPGAs with custom
programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hard-wired and/or program logic to implement the techniques.
The system [100] may include a bus or other communication mechanism for
communicating information, and a processor [102F, 108F] coupled with the bus for processing information. The hardware processor [102F, 108F] may be, for example, a general-purpose microprocessor. The system [100] may also include a main memory [102E, 108E], such as a random-access memory (RAM) or other
dynamic storage device, coupled to the bus for storing information and
instructions to be executed by the processor. The main memory also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor [102F, 108F]. Such instructions, when stored in non-transitory storage media accessible to the
processor, render the computer system into a special-purpose machine that is
customized to perform the operations specified in the instructions.
The system [100] further includes a read only memory (ROM) or other static
storage device coupled to the bus for storing static information and instructions
for the processor. A storage device, such as a magnetic disk, optical disk, or
solid-state drive is provided and coupled to the bus for storing information and
instructions. According to one embodiment, the techniques herein are
performed by the computing device in response to the processor executing one or more sequences of one or more instructions contained in the main memory.
Such instructions may be read into the main memory from another storage medium, such as the storage device. Execution of the sequences of instructions contained in the main memory causes the processor to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
The terms "memory" [102E, 108E] as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as the storage device. Volatile media may include dynamic memory, such as the main memory. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, and EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
Various forms of storage media may be involved in carrying one or more sequences of one or more instructions to the processor for execution. For example, the instructions may initially be carried on a magnetic disk or solid-state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. The system [100] also includes a communication interface coupled to the bus.
The present invention describes the first social networking application stored on the first electronic device [102] and the second social networking application stored on the second electronic device [108], wherein the first social networking application and the second social networking application have a same bundle identifier and share similar features and functionalities.
Although the invention has been described with respect to two users [User A, User B] suggesting/selecting at least one artefact for each other, it is obvious to a person skilled in the art that multiple users (i.e. more than two users) can simultaneously suggest/select one or more artefacts for each other in one of the camera frame sharing session and the video call session.
Though a limited number of the users [User A, User B], the electronic devices [102, 108], the communication modules [102A, 108A], the interface units [102B, 108B], the camera units [102C, 108C], the one or more applications [102D, 108D], the memories [102E, 108E], the processors [102F, 108F], the server [106], the network [104], and the link/connection/requests/interfaces have been shown in the figures, it will be appreciated by those skilled in the art that the exemplary system of FIG.1 of the present invention encompasses any number and varied types of the entities/elements such as users [User A, User B], the electronic devices [102, 108], the communication modules [102A, 108A], the interface units [102B, 108B], the camera units [102C, 108C], the one or more applications [102D, 108D], the memories [102E, 108E], the processors [102F, 108F], the server [106], and the link/connection/requests/interfaces, that can be configured to perform the method of the invention as described herein.
While considerable emphasis has been placed herein on the disclosed embodiments, it will be appreciated that many embodiments can be made and that many changes can be made to the embodiments without departing from the principles of the present disclosure. These and other changes in the embodiments of the present disclosure will be apparent to those skilled in the art, whereby it is to be understood that the foregoing descriptive matter to be implemented is illustrative and non-limiting.
We claim:
1. A method [200] for a live camera frame modification, the method comprising:
receiving a session request to share the live camera frame from a first social networking application stored on a first electronic device [102];
sending an acknowledgement from a second social networking application to the first social networking application in response to the session request, wherein the second social networking application is stored on a second electronic device [108];
based on said acknowledgement, receiving and displaying said live camera frame within an interface of the second social networking application at the second electronic device [108];
receiving a selection of at least one artefact, at the interface of the second social networking application, for the displayed live camera frame; and
communicating, in real-time, at least one information associated with the selected at least one artefact from the second social networking application to the first social networking application, to modify said live camera frame.
2. The method as claimed in claim 1, further comprising transmitting a second live camera frame from the second social networking application to the first social networking application.
3. The method as claimed in claim 1, wherein the session request comprises one of a camera frame sharing session request and a video call session request.
4. The method as claimed in claim 1, wherein the at least one information includes at least one of a unique resource identifier, a coordinate information, and an RGB colour value.
5. The method as claimed in claim 1, wherein the at least one artefact may be accepted by the first user [User A] at the first electronic device [102] in one of a real-time and a non-real-time.
6. The method as claimed in claim 1, wherein the at least one artefact may be discarded by the first user [User A] at the first electronic device [102] in one of a real-time and a non-real-time.
7. The method as claimed in claim 1, wherein the at least one artefact may be suggested during a display time of the live camera frame.
8. A system [100] for a live camera frame modification, the system comprising:
a memory [108E];
a processor [108F] coupled to the memory [108E];
a communication module [108A] configured to:
receive a session request to share the live camera frame from a first social networking application stored on a first electronic device [102], and
send an acknowledgement from a second social networking application to the first social networking application in response to the session request, wherein the second social networking application is stored on a second electronic device [108]; and
an interface unit [108B] coupled with the communication module [108A], configured to:
based on said acknowledgement, receive and display said live camera frame within an interface of the second social networking application at the second electronic device [108], and
receive a selection of at least one artefact, at the interface of the second social networking application, for the displayed live camera frame;
wherein the communication module [108A] is further configured to communicate, in real-time, at least one information associated with the selected at least one artefact from the second social networking application to the first social networking application, to modify said live camera frame.