System And Method For Creating Personalized Story

Abstract: Exemplary embodiments of the present disclosure are directed towards a system for creating a personalized story, comprising: a first end-user device 102 and a second end-user device 104 configured to establish two-way wireless communications with a server 103 over a network 110, the server 103 comprising a story engine 106 comprising computer-executable instructions that, when executed, instruct the first end-user device 102 and the second end-user device 104 to carry out the immersive travel experience conducted within the computer-simulated travel environment; at least one central database 112 comprising a plurality of story templates 105 and one or more media contents 105; the story engine 106 configured to carry out the travel experience and create a media content associated with the story within the computer-simulated travel environment for at least one end-user using the plurality of story templates 105 and the one or more media contents 105 of the at least one central database 112. FIG. 1

Patent Information

Application #
201841037803
Filing Date
05 October 2018
Publication Number
41/2018
Publication Type
INA
Invention Field
CHEMICAL
Status
Granted
Email
naresh@prometheusip.com
Parent Application
Patent Number
320625
Legal Status
Grant Date
2019-09-17
Renewal Date

Applicants

QUAQUA EXPERIENCES PVT. LTD
Vamsiram's Jubilee Casa, 1st Floor (Level-2), Plot-1246, Road No: 62, Jubilee Hills, Hyderabad-500033, Telangana, India.

Inventors

1. PURAV SHAH
Villa # 36, Villa Scapes, Gandipet, Hyderabad-500075, Telangana, India.
2. MAHESH GADHVI
Vamsiram's Jubilee Casa, 1st Floor (Level-2), Plot-1246, Road No: 62, Jubilee Hills, Hyderabad-500033, Telangana, India.
3. DALJEET SINGH
Flat No. 404, Gautami Enclave, Kondapur, Hyderabad-500081, Telangana, India.
4. VEERA RAGHAVAN
L&T Serene County, Magnolia, Flat No. 007, Ground Floor, Gachibowli, Hyderabad-500032, Telangana, India.
5. AKSHAY AVASTHI
Block 6, Flat 803, My Home Vihanga, Gachibowli, Hyderabad, Telangana, India.

Specification

Claims:

As claimed in:
1. A system for creating a personalized story, comprising:

a first end-user device 102 and a second end-user device 104 configured to establish two-way wireless communications with a server 103 over a network 110, wherein the server 103 comprises:

a story engine 106 comprising computer-executable instructions that, when executed, instruct the first end-user device 102 and the second end-user device 104 to carry out the immersive travel experience conducted within the computer-simulated travel environment;

at least one central database 112 comprising a plurality of story templates 105 and one or more media contents 105, the story engine 106 configured to carry out the travel experience and create a media content associated with the story within the computer-simulated travel environment for at least one end-user using the plurality of story templates 105 and the one or more media contents 105 of the at least one central database 112.

2. The system as claimed in 1, wherein the at least one central database 112 is configured to store the profile information, media content, and metadata attributes related to the end-user.

3. The system as claimed in 1, wherein the story engine 106 causes the first end-user device 102 and the second end-user device 104 to retrieve the travel content and the curated content from the at least one central database 112.

4. The system as claimed in 3, wherein the end-user experiences the immersive travel story of the point of interest with narration, virtual assistant, graphics, animation, and tripometer through the travel content and the curated content.

5. The system as claimed in 1, wherein the story engine 106 is configured to define several travel story scripts for the end-user.

6. The system as claimed in 1, wherein the story engine 106 further comprises a source module 202, a content curation module 204, a filtration module 206, a communication module 208, a content enhancement module 210, a metadata module 212, and a content decoding module 214.

7. The system as claimed in 6, wherein the content curation module 204 is configured to check the proof of identity, platform compatibility, and matching with the travel content, and to collect the available travel content from the multiple information sources.

8. The system as claimed in 6, wherein the source module 202 is configured to assemble the travel content from the multiple information sources.

9. The system as claimed in 6, wherein the filtration module 206 is configured to check the quality of the travel content.

10. The system as claimed in 6, wherein the content enhancement module 210 is configured to enhance the travel content retrieved from the central database 112.

11. The system as claimed in 6, wherein the metadata module 212 is configured to add the metadata attributes to the travel content to create the immersive travel story.

12. The system as claimed in 6, wherein the content decoding module 214 is configured to allow the end-user to describe their own travel story in real time and to decode that travel story into the relevant travel content and context to create a personalized immersive travel story for the end-user.

13. A method for managing the content management module to create an immersive travel story comprising:
collecting the media content from the internet web or the central database;

filtering the collected media content by the filtration module;

breaking the entire media content into multiple scenes, sections, points of interest, and narrative types by the filtration module;

defining several story scripts for each individual based on the profile information of the end-user;

adding the metadata attributes to the media content by the metadata module;

storing the metadata attributes in the central database, wherein the stored metadata attributes are dynamically updated for every new media content that gets regularly uploaded on the web by other end-users;

creating the immersive travel story by obtaining the profile information and travel content of the end-user.

14. A method for creating a personalized immersive travel story comprising:

setting the profile information by the end-user in an end-user device;

defining the end-user’s own travel story at the end-user device;

decoding the defined travel story into relevant content and context by the content management module;

collecting the travel content, profile information and metadata attributes stored in the central database 112 and the internet web;

creating the personalized immersive travel story for the end-user by the content creating module.

Description:

TECHNICAL FIELD

[001] The disclosed subject matter relates generally to the field of customized media content generation in an end-user device. More particularly, the present disclosure relates to a system and method for creating a personalized story using the media content in an end-user device.

BACKGROUND

[002] In today’s world, millions of videos on the internet are easily available and accessible to watch on the smartphones of end-users. Every individual spends at least 15-20 minutes a day watching videos and on activities such as sharing, liking, and posting. It is difficult for an end-user to select and watch a perfect video from the millions of videos available on the internet. There is always something missing, such as story, facts, information, or authenticity. The majority of video content is not relevant to the end-user, and it is difficult to satisfy the different interests of individuals with the same video content.

[003] Generally, travel tourism is one of the most in-demand businesses and among the fastest-growing industries in the world. Travel tourism organizations provide relevant videos, virtual experiences, and storytelling to travellers or end-users through travel experience systems. However, such organizations provide only a few relevant videos to all travellers regardless of their tastes and preferences, and travellers or end-users do not want to watch all of those videos due to time concerns. Current travel experience systems do not provide genuine and relevant data about locations, and current storytelling systems do not adapt a travel story to the interests of the individual traveller. For example, an adventure traveller seeks a different experience of a place, and a meticulous business tourist seeks a different experience of a city. Hence, there is a need for creating an immersive personalized story or video content for the individual.

[004] In light of the aforementioned discussion, there exists a need for a system with novel methodologies that would overcome the above-mentioned disadvantages.

SUMMARY

[005] The following presents a simplified summary of the disclosure in order to provide a basic understanding of the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.

[006] An objective of the present disclosure is directed towards satisfying the different interests of individuals using a single video content.

[007] Another objective of the present disclosure is directed towards a single video content with a built-in story that can satisfy many customers with different ways of experiencing the travel story.

[008] Another objective of the present disclosure is directed towards creating a dynamic and real-time travel story for the end-user to experience a personalized immersive travel experience.

[009] Another objective of the present disclosure is directed towards experiencing a personalized story in the chosen language with emotions.

[0010] Another objective of the present disclosure is directed towards obtaining an end-to-end travel journey experience without travelling.

[0011] Another objective of the present disclosure is directed towards creating a personalized immersive story around the end-user’s own travel content or enabling the end-user to create a travel story of their own.

[0012] Exemplary embodiments of the present disclosure are directed towards a system and method for creating a personalized story.

[0013] According to an exemplary aspect, the system comprises a first end-user device and a second end-user device configured to establish two-way wireless communications with a server over a network.

[0014] According to another exemplary aspect, the server comprises a story engine comprising computer-executable instructions that, when executed, instruct the first end-user device and the second end-user device to carry out the immersive travel experience conducted within the computer-simulated travel environment.

[0015] According to another exemplary aspect, the system further comprises a central database comprising a plurality of story templates and one or more media contents.

[0016] According to another exemplary aspect, the story engine is configured to carry out the travel experience and create a media content associated with the story within the computer-simulated travel environment for at least one end-user using the plurality of story templates and the one or more media contents of the at least one central database.

BRIEF DESCRIPTION OF THE DRAWINGS

[0017] FIG. 1 is a diagram depicting a schematic representation of a personalized story creating environment, in accordance with one or more embodiments.

[0018] FIG. 2 is a diagram depicting an exemplary embodiment of story engine 106 shown in FIG. 1, in accordance with one or more exemplary embodiments.

[0019] FIG. 3 is a flowchart depicting an exemplary method for creating a personalized story, in accordance with one or more embodiments.

[0020] FIG. 4 is a flowchart depicting an exemplary method for creating a personalized story based on the end-user’s defined story, in accordance with one or more embodiments.

[0021] FIG. 5 is a block diagram illustrating the details of a digital processing system in which various aspects of the present disclosure are operative by execution of appropriate software instructions.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

[0022] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.

[0023] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of terms “first”, “second”, and “third”, and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.

[0024] Referring to FIG. 1, which is a diagram 100 depicting a schematic representation of a personalized story creating environment, in accordance with one or more embodiments. The environment 100 depicts a first end-user device 102, a second end-user device 104, a server 103, story templates 105, a media content 108, a story engine 106, a network 110, and a central database 112. The central database 112 comprises the media content 108, custom scene templates, and rule sets, where the scene templates and rule sets are combined to form the story templates 105. The scene templates act as an input for the story engine 106 to produce the story templates 105. The central database 112 comprises industry-standard tools configured to create the scene templates, and the created scene templates act as an input to the story engine 106. The story engine 106 may be configured to create a media content output associated with a personalized story for the end-user based on the media content 108 and the story templates 105 from the central database 112. The media content 108 may be presented to the story engine 106 along with metadata about the media content 108 or the subject matter of the media content 108. The subject matter of the media content 108 may include, but is not limited to, theme characters, guests, and the like.
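
The relationship described above between scene templates, rule sets, story templates 105, and media content 108 can be pictured with a minimal data-model sketch. The Python fragment below is purely illustrative; the class names, fields, and functions are assumptions introduced for explanation and do not appear in the disclosure.

    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class SceneTemplate:
        scene_id: str
        point_of_interest: str
        narrative_type: str            # e.g. "adventure", "family", "business"

    @dataclass
    class RuleSet:
        rules: Dict[str, str]          # e.g. {"pace": "fast", "language": "en"}

    @dataclass
    class StoryTemplate:
        scenes: List[SceneTemplate]
        rule_set: RuleSet

    def build_story_template(scenes: List[SceneTemplate],
                             rule_set: RuleSet) -> StoryTemplate:
        # Combine custom scene templates and a rule set into a story template 105.
        return StoryTemplate(scenes=scenes, rule_set=rule_set)

    def create_personalized_story(template: StoryTemplate,
                                  media_by_scene: Dict[str, str]) -> List[str]:
        # Pick the media clip registered for each scene of the template, in order.
        return [media_by_scene[s.scene_id]
                for s in template.scenes if s.scene_id in media_by_scene]

In this sketch, the story engine's output is simply the ordered list of media clips matching the template's scenes; the actual engine, as described below, also applies metadata, context, and stitching.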

[0025] The story creating environment 100 provides an immersive story script (a travel script, for example) to the end-user through the media content output associated with the personalized story by the server 103. The server 103 may be referred to as a web-based server, a remote server, and so forth. The end-user may be able to experience a dynamic and real-time story of the immersive travel experience. The immersive travel experience may include, but is not limited to, a virtual reality experience, an augmented reality experience, a mixed reality experience, and the like. The first end-user device 102 and the second end-user device 104 may include, but are not limited to, a personal digital assistant, smart phones, personal computers, a mobile station, computing tablets, a handheld device, an internet enabled calling device, internet enabled calling software, a telephone, a mobile phone, a digital processing system, and the like. The first end-user device 102 and the second end-user device 104 may be operated by the end-user. The end-user may include, but is not limited to, a traveller, an explorer, a voyager, a tourist, an adventurer, a vacationer, a character, a hero, a central character, an experience seeker, a story creator, and the like. The network 110 may include, but is not limited to, an Internet of Things (IoT) network of devices, an Ethernet, a wireless local area network (WLAN), a wide area network (WAN), a Bluetooth low energy network, a ZigBee network, a WIFI communication network (e.g., wireless high-speed internet), a combination of networks, a cellular service such as a 4G (e.g., LTE, mobile WiMAX) or 5G cellular data service, an RFID module, an NFC module, wired cables, and the like, without limiting the scope of the present disclosure.

[0026] The story engine 106 may be a web-based application or a mobile-based application in the first end-user device 102 and the second end-user device 104. The story engine 106 may be configured to facilitate the end-user to create customized content from any location in the world without having to be physically present. For example, the end-user can create an immersive travel story using the story engine 106 from any location in the world without having to be physically present. The story engine 106 may be configured to create the customized immersive story by curating the available media content from the first end-user device 102 and the second end-user device 104. The story engine 106 may be accessed by the end-user after providing identity credentials on the first end-user device 102 and the second end-user device 104. The identity credentials may include, but are not limited to, a login ID, a user ID, a password, a PIN number, and the like.

[0027] The story engine 106 may be configured to enable the first end-user device 102 and the second end-user device 104 to retrieve the media content 108 from the central database 112. The media content 108 may include travel videos (360-degree videos, 2D videos, for example), destination, city, scenes, points of interest, images (360-degree images), infographics, weather information, types of trails, text, and so forth. The story engine 106 may be configured to create the personalized media content associated with the story using the media content 108 from the first end-user device 102 and the second end-user device 104. The media content 108 may include an immersive travel story related to the travel, and the like. The story engine 106 may also be configured to define the customized media content associated with several travel story scripts for the end-user. The travel story scripts may include, but are not limited to, scripts for an adventure traveller, a family traveller, a business traveller, a weekend getaway traveller, a backpack traveller, and the like. The story engine 106 may allow the end-user to experience the personalized media content in the chosen language with emotions. The central database 112 may include the metadata for the story engine 106 to perform actions using each module. Metadata such as images, videos, metadata attributes (weather, trails, travel essentials, landmarks, routes, maps, etc.), and recorded files may be stored in the central database 112. The central database 112 may include information such as the filtered content, the curated content, the travel content, the metadata attributes of end-users, and so forth. The central database 112 may also be configured to store the information shared by the end-user. The central database 112 may also store the profile information of the registered end-users. The profile information may include the current location, the previous history, age, gender, general interests, and so forth.
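
As one way to picture how the stored profile information could drive the choice among the travel story scripts listed above, the following sketch maps a hypothetical profile record to a script name. The field names and the selection rules are assumptions made only for illustration; they are not the claimed logic.

    def select_story_script(profile: dict) -> str:
        # Hypothetical mapping from profile information to a travel story script.
        interests = set(profile.get("general_interests", []))
        if "adventure" in interests:
            return "adventure_traveller"
        if profile.get("travelling_with_family"):
            return "family_traveller"
        if "business" in interests:
            return "business_traveller"
        if profile.get("trip_length_days", 0) <= 2:
            return "weekend_getaway_traveller"
        return "backpack_traveller"

    profile = {"age": 29, "general_interests": ["adventure", "photography"]}
    print(select_story_script(profile))     # -> adventure_traveller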

[0028] Referring to FIG. 2, which is a diagram 200 depicting an exemplary embodiment of the story engine 106 shown in FIG. 1, in accordance with one or more exemplary embodiments. The story engine 106 may include a source module 202, a media content curation module 204, a filtration module 206, a communication module 208, a media content enhancement module 210, a metadata attributes module 212, a context creation module 214, and a feature stitching module 216. The term “module” is used broadly herein and refers generally to a program resident in the memory of the first end-user device 102 and the second end-user device 104. The source module 202 may assemble the media content associated with the travel content from the multiple information sources. The information sources may include, but are not limited to, the web, the internet, an entity, a user, and the like.

[0029] The filtration module 206 may check the quality of the uploaded content. The travel content quality may include, but is not limited to, blurriness, parallax error, resolution, narration, copyright/IP, fair use policy, and the like. The content curation module 204 may collect the available travel content from the multiple information sources. The content curation module 204 may check the proof of identity, platform compatibility, and matching with the travel content. The content curation module 204 may also be configured to retrieve the context and media content from the central database 112. The filtration module 206 may decompose the curated travel content by breaking the travel content into multiple scenes, sections, points of interest, narrative types, etc. The content enhancement module 210 may enhance the media content retrieved from the central database 112 and other information sources. The enhancement of the travel content may include editing the video, motion graphics and trails, mastering, and so forth. The metadata attributes module 212 may add the metadata attributes to the enhanced media content to create the immersive travel story. The context creation module 214 may be configured to create the context of the media content. The context may include, but is not limited to, narration, structure, location, other features, and the like. The context creation module 214 may also be configured to create several story scripts for different end-users such as an adventure traveller, a family traveller, a business traveller, a weekend getaway traveller, a backpack traveller, and the like. The story engine 106 may finally correlate and create a very specific story using machine intelligence application programming interface product tools. The story engine 106 may select the right context, media content, and scenes and dynamically stitch a personalized experience with content features onto the available content platform by the feature stitching module 216. The available content platform may include, for example, maps, hotspots, selfie, tripometer, virtual assistant, and the like. The story engine 106 may also be configured to provide rating, review, and recommendation to the end-user based on the created story. The end-user may define their own personal story of a travel journey in real time and provide the defined story to the story engine 106, whereby the story engine 106 may decode the represented story into relevant content and context and create a new story for the end-user.
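
A compact way to see how the modules of FIG. 2 could hand travel content along is the pipeline sketch below. The function bodies, the platform check, and the resolution threshold are assumptions made only to keep the example runnable; they are not the patented implementation.

    def source(info_sources):                         # source module 202
        return [clip for src in info_sources for clip in src]

    def curate(clips, platform="mobile"):             # content curation module 204
        return [c for c in clips if c.get("platform") == platform]

    def filter_quality(clips, min_resolution=1080):   # filtration module 206
        return [c for c in clips if c.get("resolution", 0) >= min_resolution]

    def enhance(clips):                               # content enhancement module 210
        return [{**c, "mastered": True} for c in clips]

    def add_metadata(clips, attributes):              # metadata attributes module 212
        return [{**c, **attributes} for c in clips]

    def create_context(clips, script):                # context creation module 214
        return {"script": script, "scenes": clips}

    def stitch(context, features=("maps", "tripometer", "virtual assistant")):
        return {**context, "features": list(features)}  # feature stitching module 216

    clips = source([[{"platform": "mobile", "resolution": 2160, "poi": "Charminar"}]])
    clips = add_metadata(enhance(filter_quality(curate(clips))),
                         {"weather": "sunny", "trail": "heritage walk"})
    story = stitch(create_context(clips, script="adventure_traveller"))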

[0030] Referring to FIG. 3, which is a flowchart 300 depicting an exemplary method for creating a personalized story, in accordance with one or more embodiments. As an option, the method 300 may be carried out in the context of the details of FIG. 1 and FIG. 2. However, the method 300 may be carried out in any desired environment. Further, the aforementioned definitions apply equally to the description below.

[0031] The exemplary method 300 commences at step 302, curating the available content from the internet web or a central database and other information sources by filtering on logics such as travel media contents, destination, city, and type of trails to obtain the filtered information. Thereafter, at step 304, breaking the curated content into multiple contents, such as scenes, sections, points of interest, narrative types, and the like. Thereafter, at step 306, defining several story scripts for each end-user. Here, the end-user may include an adventure traveller, a family traveller, a business traveller, a weekend getaway traveller, a backpack traveller, and the like. Thereafter, at step 308, adding the metadata attributes linked to the multiple contents such as scenes, points of interest, images, infographics, weather information, and the like. Thereafter, at step 310, storing the metadata in the central database and updating the metadata in the central database for every new content that gets uploaded on the internet web or other information sources. Thereafter, at step 312, dynamically obtaining the profile information (current location, previous history, age, gender, general interests, etc.) of the end-user who wants to experience a dynamic and real-time story of a travel journey. Thereafter, at step 314, executing the story engine to correlate and create a very specific story using machine intelligence application programming interface tools. Thereafter, at step 316, selecting the right story script, media content, and scenes and dynamically stitching the defined travel story scripts with the metadata attributes by the story engine. Thereafter, at step 318, determining whether the end-user creates their own story script of the travel experience in real time. If the answer to step 318 is YES, decoding the end-user’s story into relevant media content and context and creating a new media content associated with the new story for the end-user at step 320. If the answer to step 318 is NO, experiencing the media content associated with the personalized story by the end-user in the chosen language with emotions at step 322.
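
The branch at step 318 can be sketched in a few lines. The data shapes, the function name, and the simple keyword-based decoding standing in for step 320 are illustrative assumptions only, not the method as claimed.

    def finish_story(stitched_story: dict, chosen_language: str,
                     user_script: str = "") -> dict:
        if user_script:                                # step 318 -> YES
            # step 320: decode the end-user's own script into content and context
            words = [w.strip(".,").lower() for w in user_script.split()]
            return {"content": [w for w in words if w.isalpha()],
                    "context": "user_defined",
                    "language": chosen_language}
        # step 318 -> NO; step 322: deliver the personalized story as stitched
        return {**stitched_story, "language": chosen_language}

    print(finish_story({"scenes": ["Charminar"], "context": "adventure"},
                       chosen_language="en"))
    print(finish_story({}, chosen_language="en",
                       user_script="A monsoon trek across the Western Ghats"))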

[0032] Referring to FIG. 4, which is a flowchart 400 depicting an exemplary method for creating a personalized story based on the end-user’s defined story, in accordance with one or more embodiments. As an option, the method 400 may be carried out in the context of the details of FIG. 1, FIG. 2, and FIG. 3. However, the method 400 may be carried out in any desired environment. Further, the aforementioned definitions apply equally to the description below.

[0033] The method commences at step 402, providing the identity credentials by the end-user to log in to the story engine. Thereafter, at step 404, setting the profile information by the end-user in the end-user device. Thereafter, at step 406, defining the end-user’s own story script (a travel story script, for example) to the story engine by the end-user in the end-user device. Thereafter, at step 408, decoding the defined story script into relevant content and context by the story engine. Thereafter, at step 410, creating the new media content associated with the new story based on the end-user’s defined story script.
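
Reading steps 402-410 as a single call sequence, a minimal sketch might look like the following. The in-memory credential store, the keyword-based decoding, and all names are assumptions introduced for illustration, not the disclosed implementation.

    REGISTERED_USERS = {"traveller01": "1234"}         # assumed credential store

    def login(user_id: str, pin: str) -> bool:         # step 402
        return REGISTERED_USERS.get(user_id) == pin

    def create_story_from_user_script(user_id: str, pin: str,
                                      profile: dict, script: str) -> dict:
        if not login(user_id, pin):
            raise PermissionError("invalid identity credentials")
        # step 408: decode the defined story script into content and context
        keywords = [w.strip(".,").lower() for w in script.split()]
        content = [w for w in keywords if len(w) > 3]
        context = {"language": profile.get("language", "en"),   # from step 404
                   "interests": profile.get("general_interests", [])}
        return {"scenes": content, "context": context}          # step 410

    story = create_story_from_user_script(
        "traveller01", "1234",
        {"language": "en", "general_interests": ["heritage"]},
        "Walking through the old city of Hyderabad at dawn.")   # step 406
    print(story)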

[0034] Referring to FIG. 5, which is a block diagram 500 illustrating the details of a digital processing system 500 in which various aspects of the present disclosure are operative by execution of appropriate software instructions. The digital processing system 500 may correspond to the end-user device 102 or 104 (or any other system in which the various features disclosed above can be implemented).

[0035] Digital processing system 500 may contain one or more processors such as a central processing unit (CPU) 510, random access memory (RAM) 520, secondary memory 530, graphics controller 560, display unit 570, network interface 580, and an input interface 590. All the components except display unit 570 may communicate with each other over communication path 550, which may contain several buses as is well known in the relevant arts. The components of Figure 5 are described below in further detail.

[0036] CPU 510 may execute instructions stored in RAM 520 to provide several features of the present disclosure. CPU 510 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 510 may contain only a single general-purpose processing unit or can be a part of a cloud processing unit.

[0037] RAM 520 may receive instructions from secondary memory 530 using communication path 550. RAM 520 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 525 and/or user programs 526. Shared environment 525 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 526.

[0038] Graphics controller 560 generates display signals (e.g., in RGB format) to display unit 570 based on data/instructions received from CPU 510. Display unit 570 contains a display screen to display the images defined by the display signals. Input interface 590 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs. Network interface 580 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in Figure 1, network 110) connected to the network.

[0039] Secondary memory 530 may contain hard drive 535, flash memory 536, and removable storage drive 537. Secondary memory 530 may store the data and software instructions (e.g., for performing the actions noted above with respect to the figures), which enable digital processing system 500 to provide several features in accordance with the present disclosure.

[0040] Some or all of the data and instructions may be provided on removable storage unit 540, and the data and instructions may be read and provided by removable storage drive 537 to CPU 510. A floppy drive, magnetic tape drive, CD-ROM drive, DVD drive, flash memory, and removable memory chip (PCMCIA card, EEPROM) are examples of such a removable storage drive 537.

[0041] The removable storage unit 540 may be implemented using a medium and storage format compatible with removable storage drive 537, or a cloud storage, such that removable storage drive 537 can read the data and instructions. Thus, the removable storage unit 540 includes a computer-readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).

[0042] In this document, the term "computer program product" is used to generally refer to the removable storage unit 540 or hard disk installed in hard drive 535. These computer program products are means for providing instructions to digital processing system 500. CPU 510 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.

[0043] The term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 530. Volatile media includes dynamic memory, such as RAM 520. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, an NVRAM, and any other memory chip or cartridge.

[0044] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 550. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

[0045] More illustrative information will now be set forth regarding various optional architectures and uses in which the foregoing method may or may not be implemented, as per the desires of the system/user. It should be strongly noted that the following information is set forth for illustrative purposes and should not be construed as limiting in any manner. Any of the following features may be optionally incorporated with or without the exclusion of other features described.

[0046] Furthermore, the described features, structures, or characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. In the above description, numerous specific details are provided such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, etc., to provide a thorough understanding of embodiments of the disclosure.

[0047] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described herein above as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.

Documents

Application Documents

# Name Date
1 201841037803-STATEMENT OF UNDERTAKING (FORM 3) [05-10-2018(online)].pdf 2018-10-05
2 201841037803-REQUEST FOR EARLY PUBLICATION(FORM-9) [05-10-2018(online)].pdf 2018-10-05
3 201841037803-POWER OF AUTHORITY [05-10-2018(online)].pdf 2018-10-05
4 201841037803-FORM-9 [05-10-2018(online)].pdf 2018-10-05
5 201841037803-FORM FOR STARTUP [05-10-2018(online)].pdf 2018-10-05
6 201841037803-FORM FOR SMALL ENTITY(FORM-28) [05-10-2018(online)].pdf 2018-10-05
7 201841037803-FORM 1 [05-10-2018(online)].pdf 2018-10-05
8 201841037803-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [05-10-2018(online)].pdf 2018-10-05
9 201841037803-EVIDENCE FOR REGISTRATION UNDER SSI [05-10-2018(online)].pdf 2018-10-05
10 201841037803-DRAWINGS [05-10-2018(online)].pdf 2018-10-05
11 201841037803-DECLARATION OF INVENTORSHIP (FORM 5) [05-10-2018(online)].pdf 2018-10-05
12 201841037803-COMPLETE SPECIFICATION [05-10-2018(online)].pdf 2018-10-05
13 201841037803-CLAIMS UNDER RULE 1 (PROVISIO) OF RULE 20 [05-10-2018(online)].pdf 2018-10-05
14 Correspondence by Agent_Submission Application Forms,Form26_12-10-2018.pdf 2018-10-12
15 201841037803-FORM 18A [27-10-2018(online)].pdf 2018-10-27
16 201841037803-FER.pdf 2018-11-30
17 201841037803-PETITION UNDER RULE 137 [28-05-2019(online)].pdf 2019-05-28
18 201841037803-OTHERS [28-05-2019(online)].pdf 2019-05-28
19 201841037803-FORM 3 [28-05-2019(online)].pdf 2019-05-28
20 201841037803-FER_SER_REPLY [28-05-2019(online)].pdf 2019-05-28
21 201841037803-DRAWING [28-05-2019(online)].pdf 2019-05-28
22 201841037803-CORRESPONDENCE [28-05-2019(online)].pdf 2019-05-28
23 201841037803-COMPLETE SPECIFICATION [28-05-2019(online)].pdf 2019-05-28
24 201841037803-CLAIMS [28-05-2019(online)].pdf 2019-05-28
25 201841037803-ABSTRACT [28-05-2019(online)].pdf 2019-05-28
26 201841037803-FORM-26 [02-07-2019(online)].pdf 2019-07-02
27 201841037803-Correspondence to notify the Controller (Mandatory) [02-07-2019(online)].pdf 2019-07-02
28 201841037803-Correspondence to notify the Controller (Mandatory) [08-07-2019(online)].pdf 2019-07-08
29 Correspondence by Agent_Form26_09-07-2019.pdf 2019-07-09
30 201841037803-HearingNoticeLetter09-07-2019.pdf 2019-07-09
31 201841037803-Written submissions and relevant documents (MANDATORY) [22-07-2019(online)].pdf 2019-07-22
32 201841037803-Proof of Right (MANDATORY) [22-07-2019(online)].pdf 2019-07-22
33 201841037803-Annexure (Optional) [22-07-2019(online)].pdf 2019-07-22
34 Marked up Claims_Granted 320625_17-09-2019.pdf 2019-09-17
35 Drawings_Granted 320625_17-09-2019.pdf 2019-09-17
36 Description_Granted 320625_17-09-2019.pdf 2019-09-17
37 Claims_Granted 320625_17-09-2019.pdf 2019-09-17
38 Abstract_Granted 320625_17-09-2019.pdf 2019-09-17
39 201841037803-PatentCertificate17-09-2019.pdf 2019-09-17
40 201841037803-IntimationOfGrant17-09-2019.pdf 2019-09-17
41 201841037803-FORM 3 [30-04-2021(online)].pdf 2021-04-30

Search Strategy

1 search_26-11-2018.pdf

ERegister / Renewals

3rd: 10 Nov 2021 (From 05/10/2020 to 05/10/2021)
4th: 10 Nov 2021 (From 05/10/2021 to 05/10/2022)
5th: 10 Nov 2021 (From 05/10/2022 to 05/10/2023)
6th: 10 Nov 2021 (From 05/10/2023 to 05/10/2024)
7th: 10 Nov 2021 (From 05/10/2024 to 05/10/2025)
8th: 10 Nov 2021 (From 05/10/2025 to 05/10/2026)
9th: 10 Nov 2021 (From 05/10/2026 to 05/10/2027)
10th: 10 Nov 2021 (From 05/10/2027 to 05/10/2028)