Abstract: Method and system for generating a collaborative story. The system monitors user activities on a User Equipment (UE) and identifies a story trigger. Upon identifying the story trigger, the system identifies all users who are participating in the story, and allows the users to initiate collaboration by providing a suitable interface. When the collaboration is initiated, the system collects data that matches the story from all participating devices and then generates the collaborative story using the collected data. FIG. 3
DESC:The following specification particularly describes and ascertains the nature of this invention and the manner in which it is to be performed:-
TECHNICAL FIELD
[001] The embodiments herein relate to communication devices and, more particularly, to collaborative story generation in communication devices.
BACKGROUND
[002] With the growth of technology, there are plenty of ways people can communicate with each other. Modern-day ‘smart’ devices support a number of applications that facilitate communication in text, audio, and video modes over the Internet, in addition to normal voice calls over a communication network. Many such applications have become extremely popular and help people stay connected.
[003] There are various factors that attract a customer to an application. While technology is a major factor, other factors such as, but not limited to, convenience and the extent to which an application can be customized also play an important role. Many users are also attracted by the look and feel, especially the interface an application provides. A user may be involved in one or more communication sessions with different users using one or more applications, and it may be possible to create a story based on the user's interactions with one or more other users. Some existing systems support content collaboration to generate a story. However, none of the existing content collaboration systems is proactive.
OBJECT OF INVENTION
[004] An object of the embodiments herein is to automatically detect story trigger(s) in a User Equipment (UE).
[005] Another object of the embodiments herein is to automatically identify users participating in a story.
[006] Another object of the embodiments herein is to automatically collect data related to an identified story from UEs of all participating users.
[007] Another object of the embodiments herein is to automatically generate a collaborative story based on the data collected from UEs of participating users.
SUMMARY
[008] In view of the foregoing, an embodiment herein provides a method for generating a collaborative story. Initially, at least one story trigger is identified in a User Equipment (UE), wherein the story trigger relates to at least one story. Further, at least one other user participating in the at least one story is identified. Further, data pertaining to at least one activity related to the identified story is collected from the UEs of users participating in the identified story, and the collaborative story is then created by collecting matching data from the UEs of all users who are participating in the at least one story.
[009] Embodiments further disclose a collaborative story generation system for generating a collaborative story. The collaborative story generation system includes a hardware processor and a non-volatile memory comprising instructions. The instructions are configured to cause the hardware processor to identify, by a collaborative story generation server of the collaborative story generation system, at least one story trigger in a User Equipment (UE), wherein the story trigger relates to at least one story. Further, at least one other user participating in the at least one story is identified by the collaborative story generation server. Further, data pertaining to at least one activity related to the identified story is collected from the UEs of users participating in the identified story, by the collaborative story generation server, and the collaborative story is generated by collecting matching data from the UEs of all users who are participating in the at least one story.
[0010] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings.
BRIEF DESCRIPTION OF THE FIGURES
[0011] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0012] FIG. 1 illustrates a block diagram of a collaborative story generation system, as disclosed in the embodiments herein;
[0013] FIG. 2 is a flow diagram that depicts steps involved in the process of generating a collaborative story using the collaborative story generation system, as disclosed in the embodiments herein;
FIGS. 3a and 3b are example illustrations depicting different stages of the collaborative story generation process, as disclosed in the embodiments herein; and
[0014] FIGS. 4a to 4m are example illustrations depicting different stages of creation, access, customization, and presentation of the collaborative story, as disclosed in the embodiments herein.
DETAILED DESCRIPTION OF EMBODIMENTS
[0015] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0016] The embodiments herein disclose a mechanism for generating collaborative story. Referring now to the drawings, and more particularly to FIGS. 1 through 4, where similar reference characters denote corresponding features consistently throughout the figures, there are shown embodiments.
[0017] FIG. 1 illustrates a block diagram of a collaborative story generation system, as disclosed in the embodiments herein. The collaborative story generation system 100 includes a collaborative story server 101. The collaborative story server 101 is configured to communicate with at least one User Equipment (UE) 102 through at least one suitable interface. The UE 102 can be any communication device that supports communication between users in the form of text, audio, video, and image, using suitable applications. For example, the UE 102 can be a smartphone that supports exchange of data in any of the aforementioned formats.
[0018] The collaborative story server 101 is configured to monitor user activities on the UE 102. The monitoring involves interpreting and processing of data from various applications in the UE 102, wherein the applications can be running in the foreground or background of the UE 102. For example, the application referred to here can be a chatting application that allows communication with at least one other user through text, image, audio, and/or video format. In an embodiment, the collaborative story server 101 can have complete or limited access to data from various applications in each UE 102, and the data access from each UE 102 is regulated based on an access permission specific to the UE 102. In various embodiments, the access permission can be pre-configured or dynamically configured as per user inputs.
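By way of a non-limiting illustration, the per-UE access permission described above can be sketched as a policy that filters which fields of application data the server may collect. All names here (AccessPolicy, collect_app_data, the permission levels) are hypothetical and not part of the claimed embodiments:

```python
# Sketch: regulating data collection from a UE based on an access permission
# specific to that UE. Names and permission levels are hypothetical.
from dataclasses import dataclass, field

@dataclass
class AccessPolicy:
    # Maps application name -> permission level: "full", "limited", or "none"
    app_permissions: dict = field(default_factory=dict)

    def allowed_fields(self, app: str) -> set:
        """Return the data fields the server may read for a given application."""
        level = self.app_permissions.get(app, "none")
        if level == "full":
            return {"text", "image", "audio", "video", "metadata"}
        if level == "limited":
            return {"text", "metadata"}
        return set()

def collect_app_data(policy: AccessPolicy, app: str, records: list) -> list:
    """Return each record with only the fields the UE's policy permits."""
    fields = policy.allowed_fields(app)
    return [{k: v for k, v in r.items() if k in fields} for r in records]
```

A policy marked "limited" for a chat application would thus yield only text and metadata from that application, while an unlisted application yields nothing.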
[0019] The collaborative story server 101 can be further configured to process the data being monitored, and dynamically check for at least one keyword in the data being interpreted. The collaborative story server 101 can be further configured to check whether any of the identified keywords is a story trigger, based on data in an associated database. The collaborative story server 101 can use any suitable pre-configured criteria, such as but not limited to pre-loaded language model processing and key map model processing, to identify the keywords and then the story triggers. Further, upon identifying the story trigger, the collaborative story server 101 identifies all users who are participating in the story, and notifies, using a suitable indicator, the participating users that a story is available. The collaborative story server 101 is further configured to provide at least an option for the user(s) to approve or reject collaboration. The collaborative story server 101, upon receiving the user approval, identifies data (interaction data) that is related to the identified story as matching data, from all applications that are running on the UEs 102 participating in the story. For example, assume that the users are planning a birthday celebration as depicted in FIG. 3a in a group chat window. The users, during the communication, use relevant text, audio, video, and so on. The users may also search for related information using a web browser. For example, the users may search for the nearest restaurants, cake shops, and so on.
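The keyword-to-trigger check described in the preceding paragraph can be sketched minimally as a lookup against a trigger database. The database contents and function names below are hypothetical stand-ins for the "associated database" and its pre-configured criteria:

```python
# Sketch: keyword extraction and story-trigger identification.
# TRIGGER_DB stands in for the associated database; its contents are hypothetical.
import re

TRIGGER_DB = {
    "birthday": "celebration",
    "trip": "travel",
    "island": "travel",
}

def extract_keywords(message: str) -> list:
    """Lowercase the message and split it into candidate keywords."""
    return re.findall(r"[a-z']+", message.lower())

def find_story_trigger(messages: list):
    """Return (keyword, story_category) for the first trigger found, else None."""
    for msg in messages:
        for word in extract_keywords(msg):
            if word in TRIGGER_DB:
                return word, TRIGGER_DB[word]
    return None
```

In the birthday-planning example of FIG. 3a, a chat message containing "birthday" would resolve to the "celebration" story category.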
[0020] In one embodiment, the collaborative story server 101 identifies all related contents from all participating UEs 102, and associates the identified contents with a story signature, wherein the story signature is a number, a string of alphabets, special characters, or a combination thereof, which is unique to each story being created. The collaborative story server 101 is further configured to process the data associated with the story signature and create a collaborative story.
[0021] In another embodiment, the collaborative story server 101 identifies all related contents from one UE 102, and associates the identified contents with a story signature. Further, upon identifying the same story signature in at least one other UE 102, the collaborative story server 101 prompts the users of those UEs to collaborate. When the users agree to collaborate, the collaborative story server 101 is further configured to process the data associated with the story signature and create a collaborative story.
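One possible way to realize a story signature that is unique per story yet matches across UEs is to derive it deterministically from the story category and the participant set. This hashing scheme, and the helper names, are illustrative assumptions rather than the claimed implementation:

```python
# Sketch: deriving a story signature and grouping tagged content across UEs.
# The SHA-1-based scheme is one hypothetical choice of unique signature.
import hashlib

def story_signature(category: str, participants: list) -> str:
    """Deterministic signature: the same story on two UEs yields the same value,
    regardless of the order in which participants are listed."""
    key = category + "|" + ",".join(sorted(participants))
    return hashlib.sha1(key.encode()).hexdigest()[:12]

def group_by_signature(tagged_items: list) -> dict:
    """Group (signature, content) pairs collected from several UEs into stories."""
    stories = {}
    for sig, content in tagged_items:
        stories.setdefault(sig, []).append(content)
    return stories
```

Because the signature is deterministic, content tagged independently on two UEs for the same story lands in the same group, which is what lets the server prompt those users to collaborate.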
[0022] In an alternate embodiment, the collaborative story can be generated locally, by the UE 102. An application or an agent in the UE 102 can monitor user activities in the UE 102, and process data related to the activities to identify one or more story triggers. In one implementation, when more than two UEs 102 are communicating, at least one UE 102 that identifies a story trigger can generate the collaborative story.
[0023] In another embodiment, the collaborative story is generated based on user activity in a single device, i.e., even when the UE 102 is not in communication with any other UE 102. In this scenario, the data in the UE 102 is considered by the collaborative story generation server 101 and/or by the application/agent in the UE 102 to trigger the collaborative story generation. Further, the collaborative story generated at the UE 102 can be uploaded to the collaborative story generation server 101 automatically or in response to a user input/instruction.
[0024] In an embodiment, the collaborative story server 101 randomly arranges the collected data while generating the collaborative story. In another embodiment, the collaborative story server 101 arranges the collected data in order of priority while generating the collaborative story, wherein the priority can be defined based on at least one parameter that is pre-configured or dynamically configured. For example, the parameter that defines priority can be the type of content, such as but not limited to audio, video, text, and image. In another example, the parameter can be the user/UE from whom the data is collected, i.e., the contributor. Consider the scenario depicted in FIGS. 3a and 3b. As four users are identified as the users participating in the story, the data collected from these users may be arranged based on contributor priority. If the contributor priority at one instance is Bro -> Cousin Bro -> Sis, then the video and image (as in 3a) from Bro are given priority while generating the collaborative story, followed by the video from Cousin Bro, and the text from Sis. This priority and order can be dynamically changed as per preference.
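The contributor-priority arrangement in the Bro -> Cousin Bro -> Sis example above amounts to a stable sort of the collected items by contributor rank. A minimal sketch, assuming items are (contributor, content) pairs (a hypothetical representation):

```python
# Sketch: arranging collected story content by contributor priority.
# The (contributor, content) pair representation is a hypothetical assumption.

def arrange_story(items: list, contributor_priority: list) -> list:
    """Sort collected items so higher-priority contributors appear first.
    Unknown contributors are placed last; the sort is stable, so each
    contributor's own items keep their collection order."""
    rank = {name: i for i, name in enumerate(contributor_priority)}
    return sorted(items, key=lambda item: rank.get(item[0], len(rank)))
```

Re-running the same function with a different priority list realizes the dynamic reordering mentioned above.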
[0025] The collaborative story server 101 is further configured to provide suitable options for the user to access a created collaborative story. Upon receiving a story access request from a user, the collaborative story server 101 displays the requested story using a suitable interface. If multiple stories are stored in a user account associated with the user requesting the story, all stories are displayed to the user with an option to select the story to be viewed. In an embodiment, the collaborative story server 101 classifies the stories based on category, and allows the user to search and view stories based on the classification.
[0026] The collaborative story server 101 can also provide a user mapping feature that allows viewing of contents from a selected user that form part of a story being viewed. The collaborative story server 101 may also provide a content mapping feature that allows a user to select a participating user and view all contents from that user, whether or not specific to a particular story. The collaborative story server 101 can further allow a user to dynamically edit priorities, which in turn changes the order in which contents are arranged in a collaborative story.
[0027] FIG. 2 is a flow diagram that depicts steps involved in the process of generating a collaborative story using the collaborative story generation system, as disclosed in the embodiments herein. Upon identifying (202) a story trigger in at least one UE 102 being monitored, the collaborative story server 101 identifies the scope of at least one story, and collects (204) interaction data from the UEs of all participating users, wherein the ‘interaction data’ refers to data that is relevant to the identified story. The collected data is then tagged (206) with a story signature unique to that story. Further, the participating users are notified that collaboration is possible, and are provided with an option to initiate the collaboration. When the users confirm their willingness to collaborate, relevant data is collected from the respective UEs 102, and a collaborative story is created (208) as per set preferences. The collaborative story is then displayed to the users with suitable viewing and customizing options, upon receiving an access request from the user(s). The various actions in method 200 may be performed in the order presented, in a different order, or simultaneously. Further, in some embodiments, some actions listed in FIG. 2 may be omitted.
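Steps 202 through 208 of method 200 can be sketched end to end as follows. Everything here (the per-user message dictionary, the word-level trigger matching, the signature format) is a hypothetical simplification for illustration, not the claimed implementation:

```python
# Sketch of method 200: identify trigger (202), collect interaction data (204),
# tag with a story signature (206), and create the story (208). Hypothetical names.

def generate_collaborative_story(ues: dict, trigger_db: dict) -> dict:
    """ues maps each user to a list of messages; trigger_db maps trigger
    keywords to story categories. Returns a story dict, or {} if no trigger."""
    # Step 202: identify a story trigger in any monitored UE
    category = None
    for msgs in ues.values():
        for m in msgs:
            for word in m.lower().split():
                if word in trigger_db:
                    category = trigger_db[word]
    if category is None:
        return {}
    # Step 204: collect interaction data relevant to the identified story
    content = [
        (user, m)
        for user, msgs in ues.items()
        for m in msgs
        if any(trigger_db.get(w) == category for w in m.lower().split())
    ]
    # Step 206: tag with a story signature unique to this story
    signature = category + ":" + "-".join(sorted(ues))
    # Step 208: create the collaborative story as per set preferences
    return {"signature": signature, "participants": sorted(ues), "content": content}
```

As the specification notes, these steps may run in a different order or partly on the UE itself; the sketch only fixes one possible sequence.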
[0028] FIGS. 4a to 4m are example illustrations depicting different stages of creation, access, customization, and presentation of the collaborative story, as disclosed in the embodiments herein. In 4a, four friends are discussing, in a chat window, their travel plan to a place named ‘Devbagh Island’. The collaborative story generation server 101, by monitoring and processing the contents being discussed in the chat window, identifies a story trigger, and notifies the users along with an option to trigger collaborative story generation as in 4b. Once the user input to generate the collaborative story is received, the collaborative story generation server 101 generates the collaborative story as in 4c, and provides options for the users to contribute related activities to build the story. In this example, the collaborative story includes any device content such as but not limited to webpage, video, text, image, and audio. Once the story is generated, the users are provided with options to separately access contents that constitute the collaborative story as in 4e. Further, when the user accesses a particular content (i.e., a webpage in this example) as in 4f, the corresponding contents are opened in a separate window as in 4g.
[0029] The users are also given the option to view and access contents contributed by each user by touching the respective user icons as in 4h. When the user touches the icon of user B, for example, the contents in the collaborative story that are from user B’s UE are highlighted or displayed as in 4h. The user can also customize the view by changing the order in which the contents are displayed as in 4i. In 4i, the user selects a user icon and moves it ahead of the icons of other contributors, resulting in the contents contributed by the selected user being prioritized and moved ahead in the display.
[0030] Further, the users can attach the generated collaborative story to one or more activities being performed in the UE 102. For example, while saving a bookmark pertaining to a particular webpage, one or more collaborative stories can be attached with the bookmark as in 4j. Similarly, the collaborative story can be associated with any device content such as but not limited to calendar events, a favorite image, a favorite contact, and selected videos.
[0031] When multiple collaborative stories are created, they can be organized and displayed for user access, as in 4k. The users may also be provided options to search for stories using the story name, participant name, and so on. Further, when a story is accessed by a user, the messages in the story can be displayed in a separate window as in 4l. Further, the stories can be organized and displayed in different categories as in 4m.
[0032] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in Fig. 1 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
[0033] The embodiments disclosed herein specify a mechanism for collaborative story creation. The mechanism allows automatic identification of a story trigger and generation of a collaborative story, and provides a system therefor. Therefore, it is understood that the scope of protection is extended to such a system and, by extension, to a computer readable means having a message therein, said computer readable means containing program code for implementation of one or more steps of the method, when the program runs on a server, a mobile device, or any suitable programmable device. The method is implemented in a preferred embodiment using the system together with a software program written in, for example, Very high speed integrated circuit Hardware Description Language (VHDL) or another programming language, or implemented by one or more VHDL or software modules being executed on at least one hardware device. The hardware device can be any kind of device that can be programmed, including, for example, any kind of computer such as a server or a personal computer, or any combination thereof, for example, one processor and two FPGAs. The device may also include means which could be, for example, hardware means such as an ASIC, or a combination of hardware and software means, such as an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means are at least one hardware means or at least one hardware-cum-software means. The method embodiments described herein could be implemented in pure hardware, or partly in hardware and partly in software. Alternatively, the embodiments may be implemented on different hardware devices, for example, using a plurality of CPUs.
[0034] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the claims as described herein.
CLAIMS
What is claimed is:
1) A method for generating a Collaborative Story, said method comprising:
identifying at least one story trigger in a User Equipment (UE), wherein said story trigger relates to at least one story;
identifying at least one other user participating in said at least one story;
collecting data pertaining to at least one activity related to the identified story, from UE of users participating in the identified story; and
creating the collaborative story by collecting matching data from UE of all users who are participating in the at least one story.
2) The method as claimed in claim 1, wherein identifying the story trigger further comprises:
monitoring user activity on the UE;
processing data associated with the user activity being monitored; and
identifying at least one story associated with the user activity.
3) The method as claimed in claim 1, wherein the collaborative story comprises at least one of identified story points, identified users, and tracked user activities from respective UEs.
4) The method as claimed in claim 1, wherein auxiliary data is stored along with the collaborative story, wherein the auxiliary data is a device content that includes at least one of a bookmark, calendar event, image, contact, and video.
5) The method as claimed in claim 1, wherein collecting the data from the UE is regulated based on an access permission, wherein the access permission is specific to the UE.
6) A collaborative story generation system for generating a Collaborative Story, said system comprising:
a hardware processor;
a non-volatile memory comprising instructions, said instructions configured to cause said hardware processor to:
identify, by a collaborative story generation server of the collaborative story generation system, at least one story trigger in a User Equipment (UE), wherein said story trigger relates to at least one story;
identify at least one other user participating in said at least one story, by the collaborative story generation server;
collect data pertaining to at least one activity related to the identified story, from UE of users participating in the identified story, by the collaborative story generation server; and
create the collaborative story by collecting matching data from UE of all users who are participating in the at least one story, by the collaborative story generation server.
7) The collaborative story generation system as claimed in claim 6, wherein the collaborative story generation server is configured to identify the story trigger by:
monitoring user activity on the UE;
processing data associated with the user activity being monitored; and
identifying the at least one story associated with the user activity.
8) The collaborative story generation system as claimed in claim 6, wherein the collaborative story generation server is configured to include, in the collaborative story, at least one of identified story points, identified users, and tracked user activities from respective UEs.
9) The collaborative story generation system as claimed in claim 6, wherein the collaborative story generation server is configured to store a device content as auxiliary data along with the collaborative story, wherein the device content is at least one of a bookmark, calendar event, image, contact, and video.
10) The collaborative story generation system as claimed in claim 6, wherein the collaborative story generation server collects the data from the UE based on an access permission specific to the UE.
Dated this 20th of July 2016 Signature:
Name of the signatory: Dr. Kalyan Chakravarthy