
Computer Implemented Methods And System Configured For Unified Travel Experience

Abstract: Exemplary embodiments of the present disclosure are directed towards a system for a unified travel experience in a computer simulated travel environment, comprising: an end-user device 104 configured to establish two-way wireless communications with a unified travel experience system 106 over a network 108, the unified travel experience system 106 comprising: a unified travel management module 110 comprising computer-executable instructions that, when executed, instruct the end-user device 104 to carry out the unified travel experience conducted within the computer simulated travel environment, wherein the unified travel experience brings a plurality of parameters into one paradigm of a 360° mixed reality experience, the plurality of parameters comprising content, metadata, booking, and community, and the unified travel management module 110 causes the end-user device 104 to perform real-time actions within the computer simulated travel environment; and a database 112 comprising computer data related to the parameters, such as filtered content, curated content, metadata, booking services data, and community data, the unified travel management module 110 being configured to deliver the unified travel experience and the real-time actions within the computer simulated travel environment to the end-user 102 using the computer data of the database 112. FIG. 1


Patent Information

Application #
Filing Date
05 October 2018
Publication Number
41/2018
Publication Type
INA
Invention Field
GENERAL ENGINEERING
Status
Email
naresh@prometheusip.com
Parent Application
Patent Number
Legal Status
Grant Date
2021-05-20
Renewal Date

Applicants

QUAQUA EXPERIENCES PVT. LTD
Vamsiram's Jubilee Casa, 1st Floor (Level -2), Plot-1246, Road No: 62, Jubilee Hills, Hyderabad-500033, Telangana, India.

Inventors

1. PURAV SHAH
Villa # 36, Villa Scapes, Gandipet, Hyderabad-500075, Telangana, India.
2. MAHESH GADHVI
Vamsiram's Jubilee Casa, 1st Floor (Level -2), Plot-1246, Road No: 62, Jubilee Hills, Hyderabad-500033, Telangana, India.
3. DALJEET SINGH
Flat No. 404, Gautami Enclave, Kondapur, Hyderabad-500081, Telangana, India.
4. VEERA RAGHAVAN
L&T Serene County , Magnolia Flat No. 007 Ground Floor, Gachibowli, Hyderabad-500032, Telangana, India.
5. AKSHAY AVASTHI
Block 6 , Flat 803, My Home Vihanga, Gachibowli, Hyderabad, Telangana, India.

Specification

Claims:
1. A system for a unified travel experience in a computer simulated travel environment, comprising:

at least one end-user device 104 configured to establish two-way wireless communications with a unified travel experience system 106 over a network 108, wherein the unified travel experience system 106 comprises:

a unified travel management module 110 comprising computer-executable instructions that, when executed, instruct the at least one end-user device 104 to carry out the unified travel experience conducted within the computer simulated travel environment, wherein the unified travel experience brings a plurality of parameters into one paradigm of a 360° mixed reality experience, the plurality of parameters comprising content, metadata, booking, and community, and the unified travel management module 110 causes the at least one end-user device 104 to perform a plurality of real-time actions within the computer simulated travel environment; and

at least one database 112 comprising computer data related to the plurality of parameters, such as filtered content, curated content, metadata, booking services data, and community data, the unified travel management module 110 being configured to deliver the unified travel experience and the plurality of real-time actions within the computer simulated travel environment to at least one end-user 102 using the computer data of the at least one database 112.

2. The system as claimed in 1, wherein the at least one database 112 is configured to store information related to the end-user 102.

3. The system as claimed in 1, wherein the unified travel management module 110 causes the at least one end-user device 104 to retrieve the filtered content and the curated content from the at least one database 112.

4. The system as claimed in 3, wherein the end-user 102 experiences the immersive story of the point of interest with narration, a virtual assistant, graphics, animation, and a tripometer through the filtered content and the curated content.

5. The system as claimed in 1, wherein the unified travel management module 110 is configured to load the metadata onto a plurality of maps on the at least one end-user device 104.

6. The system as claimed in 5, wherein the end-user device 104 enables the end-user 102 to move to other locations using the plurality of maps within the computer simulated travel environment.

7. The system as claimed in 1, wherein the unified travel management module 110 is configured to provide a plurality of booking services in real time within the computer simulated travel environment.

8. The system as claimed in 1, wherein the unified travel management module 110 is configured to connect the end-user 102 to the community and to other users in real time.

9. A method for providing a unified travel experience in a computer simulated travel environment, comprising:

providing a plurality of identity credentials by an end-user 102 to access a unified travel management module 110, whereby the unified travel management module 110 is configured to deliver filtered and curated content to an end-user device 104 after selection of prominent information by the at least one end-user 102;

loading the metadata onto a plurality of maps within the computer simulated travel environment on the end-user device 104 using the unified travel management module 110 based on the selected prominent information on the end-user device 104, whereby the plurality of maps comprise a plurality of suitable routes between an origin and a destination within the computer simulated travel environment;

performing a plurality of real-time actions within the computer simulated travel environment without breaking the end-user's 102 interest or leading the end-user 102 to another website; and

connecting the end-user 102 to a community and to other users in real time by the unified travel management module 110.

10. The method as claimed in 9, further comprising a step of recording the plurality of real-time actions in a database 112 using the unified travel management module 110.

11. The method as claimed in 9, further comprising a step of personalizing every frame of the experience and making recommendations based on the end-user's 102 preferences and behavior by the unified travel management module 110.

12. A computer program product comprising a non-transitory computer-readable medium having a computer-readable program code embodied therein to be executed by one or more processors, the program code including instructions to:

provide a plurality of identity credentials by an end-user 102 to access a unified travel management module 110, whereby the unified travel management module 110 is configured to deliver filtered and curated content to an end-user device 104 after selection of prominent information by at least one end-user 102;

load the metadata onto a plurality of maps within the computer simulated travel environment on the end-user device 104 using the unified travel management module 110 based on the selected prominent information on the end-user device 104, whereby the plurality of maps comprise a plurality of suitable routes between an origin and a destination within the computer simulated travel environment;

perform a plurality of real-time actions within the computer simulated travel environment without breaking the end-user's 102 interest or leading the end-user 102 to another website; and

connect the end-user 102 to a community and to other users in real time by the unified travel management module 110.

13. The computer program product as claimed in 12, wherein the unified travel management module 110 comprises at least one content retrieving module 202 configured to provide the filtered content and the curated content from a database 112 to the end-user 102.

14. The computer program product as claimed in 12, wherein the unified travel management module 110 comprises at least one metadata module 204 configured to overlay the metadata on the plurality of maps within the computer simulated travel environment.

15. The computer program product as claimed in 12, wherein the unified travel management module 110 comprises at least one booking module 206 configured to display the booking services to the end-user 102.

16. The computer program product as claimed in 12, wherein the unified travel management module 110 further comprises at least one community module 208 configured to connect the end-user 102 with the community within the computer simulated travel environment.

17. The computer program product as claimed in 12, wherein the unified travel management module 110 further comprises at least one computer simulated travel environment management module 210 configured to manage the content 301 within the computer simulated travel environment.

Description:

TECHNICAL FIELD

[001] The disclosed subject matter relates generally to the field of travel and tourism experiences. More particularly, the present disclosure relates to computer implemented methods and a system configured for a unified travel experience and for managing real-time actions within virtual reality, augmented reality, and mixed reality environments.

BACKGROUND

[002] Generally, travel and tourism is one of the fastest growing industries in the world. Travelling typically involves physically or virtually travelling to the traveller's desired destinations. Many people would love to travel far more than they actually do, but they cannot afford the money, time, and/or energy. Physical abilities may also limit travel to extreme locations, and busy individuals cannot find the time to reach destinations that take one or more days to travel to.

[003] For example, virtual travelling involves viewing or travelling to destinations or locations by means of images and videos using virtual travelling experience systems. These virtual travelling experiences are not real-time and are therefore not always up-to-date. Current virtual travelling options can replicate visual data from real visual environments or an imaginary scene. In current systems, the virtual travelling experience is non-personalized and has a poor user interface. Such systems provide the same content to all travellers regardless of their unique characteristics, interests, and preferences. Hence, there is a need for travelling experience systems that incorporate contextual information and details about the desires of a user to provide a fully integrated reality experience that utilizes the ever-expanding corpus of available data. Travel experience systems also need to provide a real-time demonstration of a geographical location according to real-time commands of the end-user, and to integrate virtual, augmented, and mixed reality in a single platform for a better travelling experience.

[004] In the light of the aforementioned discussion, there exists a need for a certain system with novel methodologies that would overcome the above-mentioned disadvantages.

SUMMARY

[005] The following presents a simplified summary of the disclosure in order to provide a basic understanding of the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.

[006] An objective of the present disclosure is directed towards improving the integrated experience of travelling to multiple locations using a virtual reality travelling experience system.

[007] Another objective of the present disclosure is directed towards integrating multiple data sources in a single web-based application or mobile-based application.

[008] Another objective of the present disclosure is directed towards a system that can integrate other data sources with context-specific data related to the virtual reality environment and the interactions from the end-user.

[009] Another objective of the present disclosure is directed towards experiencing travel with a 360° reality experience.

[0010] Exemplary embodiments disclosed herein provide a system and method for a unified travel experience in a computer simulated travel environment.

[0011] According to an exemplary aspect, the system comprises at least one end-user device configured to establish two-way wireless communications with a unified travel experience system over a network, the unified travel experience system comprising: a unified travel management module comprising computer-executable instructions that, when executed, instruct the at least one end-user device to carry out the unified travel experience conducted within the computer simulated travel environment, wherein the unified travel experience brings a plurality of parameters into one paradigm of a 360° mixed reality experience, the plurality of parameters comprising content, metadata, booking, and community, and the unified travel management module causes the at least one end-user device to perform a plurality of real-time actions within the computer simulated travel environment.

[0012] According to another exemplary aspect, the system further comprises a database comprising computer data related to the plurality of parameters, such as filtered content, curated content, metadata, booking services data, and community data, the unified travel management module being configured to deliver the unified travel experience and the plurality of real-time actions within the computer simulated travel environment to at least one end-user using the computer data of the at least one database.

BRIEF DESCRIPTION OF THE DRAWINGS

[0013] FIG. 1 is a diagram depicting a schematic representation of a unified travel experience environment, in accordance with one or more embodiments.

[0014] FIG. 2 is a diagram depicting a unified travel management module 110 as shown in FIG. 1, in accordance with one or more exemplary embodiments.

[0015] FIG. 3A is a diagram depicting exemplary embodiments of the front view of the unified travel experience with the 0-90° view, in accordance with one or more exemplary embodiments.

[0016] FIG. 3B is a diagram depicting exemplary embodiments of the right view of the unified travel experience with the 90°-180° view, in accordance with one or more exemplary embodiments.

[0017] FIG. 3C is a diagram depicting exemplary embodiments of the back view of the unified travel experience with the 180°-270° view, in accordance with one or more exemplary embodiments.

[0018] FIG. 3D is a diagram depicting exemplary embodiments of the left view of the unified travel experience with the 270°-360° view, in accordance with one or more exemplary embodiments.

[0019] FIG. 4 is a flowchart depicting an exemplary method for unified travelling experience, in accordance with one or more embodiments.

[0020] FIG. 5 is a flowchart 500 depicting an exemplary method for planning the real-time tour, in accordance with one or more embodiments.

[0021] FIG. 6 is a flowchart 600 depicting an exemplary method for traveling the end-user within the computer simulated travel environment, in accordance with one or more embodiments.

[0022] FIG. 7 is a block diagram illustrating the details of a digital processing system in which various aspects of the present disclosure are operative by execution of appropriate software instructions.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

[0023] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.

[0024] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of terms “first”, “second”, and “third”, and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.

[0025] Referring to FIG. 1, a block diagram 100 depicts an example environment in which aspects of the present disclosure can be implemented. Specifically, FIG. 1 depicts a schematic representation of a unified travel experience environment, in accordance with one or more embodiments. The unified travel experience environment 100 provides a unified travel experience and manages real-time actions within the computer simulated travel environment. The unified travel experience may include, but is not limited to, a virtual reality experience, an augmented reality experience, a mixed reality experience, and so forth. Managing real-time actions may include bookings, interactions, and so forth. The environment 100 depicts an end-user 102, an end-user device 104, a unified travel experience system 106, and a network 108. The end-user 102 may be the protagonist of the unified travel experience system 106. The end-user 102 may include, but is not limited to, a traveler, an explorer, a voyager, a tourist, an adventurer, a vacationer, a character, a hero, a central character, an experience seeker, and so forth. The end-user 102 may be allowed to perform unified travel experience tasks through the unified travel experience system 106. The unified travel experience tasks may include, but are not limited to, experiencing the place (e.g., a country, a city, etc.), researching in depth, anchoring to the map, planning and generating the itinerary, booking the experience, sharing the memories, and connecting to other individuals. The individuals may include, but are not limited to, people, visitors, celebrities, locals, partners, companions, and so forth. The unified travel experience tasks may be prioritized based on the preferences of the end-user 102 and may be performed by the end-user 102 to obtain the unified travel experience.
The unified travel experience brings four or more parameters into one paradigm of a 360° mixed reality experience, the parameters comprising content, metadata, booking, and community; the unified travel management module causes at least one end-user device to perform real-time actions within the computer simulated travel environment. In an example, the 360-degree frame may have six segment views, each spread over a 60-degree view angle. The 360-degree frame may include, but is not limited to: experience (video of the point of interest), explore (maps showcasing other points related to the experienced point of interest), enable (videos about the facilities around the point of interest, such as food, stay, and transport), educate (informational videos about the point of interest providing facts and figures), execute (all booking experience videos related to the point of interest), and engage (socializing with others and receiving feedback from various personas such as celebrities, friends, and locals). Together, these may be represented as the 6E magic cube.
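The six-segment split of the 360-degree frame described above can be sketched in code. This is an illustrative, non-limiting example: the segment ordering and function name are assumptions, as the disclosure does not fix an angular order for the 6E views.

```python
# Hypothetical sketch: mapping a viewer's yaw angle to one of the six
# 60-degree segments of the "6E magic cube". The ordering below follows
# the order in which the disclosure lists the segments, which is an
# assumption, not a stated angular layout.

SEGMENTS = ["experience", "explore", "enable", "educate", "execute", "engage"]

def segment_for_yaw(yaw_degrees: float) -> str:
    """Return the 6E segment facing the viewer for a given yaw angle."""
    yaw = yaw_degrees % 360.0        # normalize into [0, 360)
    index = int(yaw // 60.0)         # each segment spans 60 degrees
    return SEGMENTS[index]

print(segment_for_yaw(45))    # experience
print(segment_for_yaw(125))   # enable
```

A renderer could call such a function on every head-tracking update to decide which segment's content to prioritize.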

[0026] The end-user device 104 may include, but is not limited to, a personal digital assistant, a smart phone, a personal computer, a mobile station, a computing tablet, a handheld device, an internet-enabled calling device, internet-enabled calling software, a telephone, a mobile phone, a digital processing system, and so forth. The network 108 may include, but is not limited to, an Internet of Things (IoT) network, Ethernet, a wireless local area network (WLAN), a wide area network (WAN), a Bluetooth Low Energy network, a ZigBee network, a Wi-Fi communication network (e.g., wireless high-speed internet), a combination of networks, a cellular service such as a 4G (e.g., LTE, mobile WiMAX) or 5G cellular data service, an RFID module, an NFC module, wired cables, and the like, without limiting the scope of the present disclosure.

[0027] The unified travel experience environment is enabled to establish communication between the end-user device 104 and the unified travel experience system 106. The unified travel management module 110 may be accessed by the end-user 102 providing identity credentials. The identity credentials may include, but are not limited to, a login ID, a user ID, a password, a PIN, and the like. The unified travel experience system 106 may refer to web-based servers, remote servers, and so forth. The unified travel management module 110 may be configured to be used as a web-based application or mobile-based application on the end-user device 104. The unified travel management module 110 may facilitate the end-user 102 to explore any location in the world without having to be physically present. The unified travel management module 110 may cause the end-user device 104 to retrieve the filtered content and the curated content from the database 112. The database 112 may include the essentials for the unified travel management module 110 to perform actions using each module. Information such as images, videos, metadata (weather, trails, travel essentials, landmarks, routes, maps, etc.), and recorded files may be stored in the database 112. The database 112 may include information such as the filtered content, the curated content, the metadata, booking services, community data of community users, and so forth. The database 112 may also be configured to store information shared by the end-user 102.
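The kinds of records attributed to the database 112 can be illustrated with a small, hypothetical schema. The table layout below is an assumption for illustration only; the disclosure names the data categories (filtered content, curated content, metadata, and so forth) but not any particular storage format.

```python
# Illustrative-only sketch of data categories the database 112 is said to
# hold, using an in-memory SQLite database. Table and column names are
# hypothetical, not part of the disclosure.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE content (
        poi   TEXT,   -- point of interest
        kind  TEXT,   -- 'filtered' or 'curated'
        media TEXT    -- image/video reference
    )""")
conn.execute("""
    CREATE TABLE metadata (
        poi     TEXT,
        layer   TEXT,  -- weather, trails, landmarks, routes, ...
        payload TEXT
    )""")
conn.execute("INSERT INTO content VALUES ('Charminar', 'curated', 'tour.mp4')")
rows = conn.execute("SELECT kind FROM content").fetchall()
print(rows)  # [('curated',)]
```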

[0028] Referring to FIG. 2, a block diagram 200 depicts the unified travel management module 110 shown in FIG. 1, in accordance with one or more exemplary embodiments. The unified travel management module 110 may include a content retrieving module 202, a metadata module 204, a booking module 206, and a community module 208. The content retrieving module 202 may be configured to retrieve and provide the immersive story of the point of interest with narration to the end-user 102 from the database 112. The immersive experience may be provided by a virtual assistant, graphics, animation, a trip-meter, and so forth. The content retrieving module 202 may provide the filtered content and the curated content to the end-user 102. The filtered content and the curated content may include, but are not limited to, information about the place, relevant tourism data, local conditions, and so forth. The term "module" is used broadly herein and refers generally to a program resident in memory of the end-user device 104.

[0029] The metadata module 204 may be configured to overlay the metadata on maps. The maps may include multiple layers of metadata (e.g., six layers). The metadata may include, but is not limited to, weather, trails, travel essentials, landmarks, routes, curated itineraries, local guides, time and forecasting, and so forth. The maps may provide an interactive and experience-driven interface to the end-user 102. A computer simulated travel environment management module 210 provides integration of virtual objects for presentation to the end-user 102 of the end-user device 104 via an interface to provide a virtual reality or augmented reality experience. The booking module 206 may be configured to display the booking services to the end-user 102. The booking experience may be provided to the end-user 102 by loading the booking services from the unified travel management module 110. The booking services may include, but are not limited to, flights, buses, local travel, hotels, local activities, merchants, websites, packages, shops, travel service purchases, and the like. The community module 208 may be configured to connect the end-user 102 with the community within the computer simulated travel environment. The community may include, but is not limited to, visitors, celebrities, locals, authentic local groups, cultural centers, tourism boards, local vendors, tour operators' guides, contributors, promoters, and so forth. The community module 208 may also be configured to allow the end-user 102 to view other users who experienced the unified virtual travel experience earlier. Each parameter (e.g., content, metadata, bookings, community) may be dynamically uploaded and configured in the virtual environment and augmented reality world.
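The division of responsibilities among the sub-modules 202-208 described above can be sketched as follows. All class names, method names, and return values are illustrative placeholders under the assumption of a simple object-oriented decomposition; they are not the actual implementation.

```python
# Hypothetical sketch of the module composition: a unified travel
# management module delegating to content, metadata, booking, and
# community sub-modules, mirroring FIG. 2.

class ContentRetrievingModule:               # cf. module 202
    def retrieve(self, point_of_interest: str) -> dict:
        # In the disclosure this pulls filtered/curated content from the database.
        return {"poi": point_of_interest, "story": "immersive narration"}

class MetadataModule:                        # cf. module 204
    def overlay(self, map_id: str) -> list:
        # Overlays metadata layers (weather, trails, landmarks, routes) on a map.
        return ["weather", "trails", "landmarks", "routes"]

class BookingModule:                         # cf. module 206
    def list_services(self) -> list:
        return ["flight", "hotel", "local activities"]

class CommunityModule:                       # cf. module 208
    def connect(self, user: str) -> str:
        return f"{user} connected to community"

class UnifiedTravelManagementModule:         # cf. module 110
    def __init__(self):
        self.content = ContentRetrievingModule()
        self.metadata = MetadataModule()
        self.booking = BookingModule()
        self.community = CommunityModule()

utm = UnifiedTravelManagementModule()
print(utm.metadata.overlay("hyderabad"))
```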

[0030] Referring to FIGS. 3A, 3B, 3C, and 3D, diagrams 300a, 300b, 300c, and 300d of end-user screens depict exemplary embodiments of the unified travel experience with a 360° view, in accordance with one or more exemplary embodiments. For example, the unified travel experience with a 360° view may include four views: a front view, a right view, a back view, and a left view. The end-user screen 300a depicts the front view of the unified travel experience with a 0-90° view, also referred to as the first frame. The end-user screen 300a may provide the filtered content and the curated content by the content retrieving module 202 in a virtual reality or augmented reality world. The end-user screen 300a further depicts the integration of content 301 in a virtual reality or augmented reality world by the computer simulated travel environment management module 210. In an example, the end-user screen 300a presents the virtual content 302, wherein the end-user 102 can experience the immersive story of the point of interest with narration, a virtual assistant, graphics, animation, a tripometer, and so forth. The computer simulated travel environment management module 210 may be configured to manage the content 301 within the computer simulated travel environment.

[0031] The end-user screen 300b depicts the right view of the unified travel experience with a 90°-180° view, also referred to as the second frame. The end-user screen 300b further depicts a landmark icon 302, a route map 304, a weather forecasting option 306, and a travel essentials option 308, loaded as metadata from the metadata module 204. The landmark icon 302 depicts a landmark either by a three-dimensional graphical object or an icon, depending on the location of the landmark. Landmarks include prominent buildings, heritage sites, historical natural monuments, and so forth. The route map 304 may be configured to provide a suitable route between an origin and a destination. The weather forecasting option 306 may be configured to predict the weather conditions at a particular location after the end-user 102 selects the weather forecasting option 306. If the end-user 102 selects the travel essentials option 308, it may provide a list of essentials to the end-user 102 in the virtual reality or augmented reality world.

[0032] The end-user screen 300c depicts the rear view of the unified travel experience with a 180°-270° view, also referred to as the third frame. The end-user screen 300c includes a booking layer configured to provide bookings with the booking services by the booking module 206. The end-user screen 300c further depicts a transport booking option 310, a shopping option 312, a hotel booking option 314, and a services option 316. The transport booking option 310 enables the end-user 102 to query and book transport tickets, and the shopping option 312 provides a number of products to the end-user 102. The shopping option 312 may present offers on products for purchase through the unified travel management module 110. The hotel booking option 314 enables the end-user 102 to query and make hotel reservations, and the services option 316 provides a number of services to the end-user 102.

[0033] The end-user screen 300d depicts the left view of the unified travel experience with a 270°-360° view, also referred to as the fourth frame. The end-user screen 300d further depicts visitors 318, celebrities 320, and locals 322. The community layer may be loaded on the end-user screen 300d to experience interaction with the community by the community module 208. The visitors 318 may include users who experienced the unified virtual travel experience earlier or users who are ready to experience it.
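The four frames described in FIGS. 3A-3D partition the 360° view into 90° ranges, each loading a different layer. A minimal, non-limiting sketch of that partition follows; the layer names are taken from the description, while the function name is hypothetical.

```python
# Hypothetical sketch: which layer a viewer sees for a given yaw angle,
# per the four 90-degree frames of FIGS. 3A-3D.

FRAMES = [
    (0, 90, "content"),       # 300a: filtered/curated content layer
    (90, 180, "metadata"),    # 300b: landmarks, routes, weather, essentials
    (180, 270, "booking"),    # 300c: transport, shopping, hotel, services
    (270, 360, "community"),  # 300d: visitors, celebrities, locals
]

def layer_for_yaw(yaw_degrees: float) -> str:
    """Return the layer shown in the frame facing the viewer."""
    yaw = yaw_degrees % 360.0
    for low, high, layer in FRAMES:
        if low <= yaw < high:
            return layer
    return "content"  # unreachable fallback after normalization

print(layer_for_yaw(200))  # booking
```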

[0034] Referring to FIG. 4, a flowchart 400 depicts an exemplary method for providing a unified travel experience and managing real-time actions, in accordance with one or more embodiments. As an option, the method 400 may be carried out in the context of the details of FIG. 1, FIG. 2, and FIGS. 3A-3D. However, the method 400 may be carried out in any desired environment. Further, the aforementioned definitions apply equally to the description below.

[0035] The exemplary method 400 commences at step 402, wherein identity credentials may be provided by the end-user to access the unified travel management module. Once the unified travel management module is accessed, the end-user may set the profile details at step 404. The profile details may include a name, a change password option, a profile picture, a time zone, a greeting message, and so forth. Thereafter, at step 406, the end-user may select the prominent information to get the unified travel experience. The prominent information may include, but is not limited to, location, date, time, language, schedule, local guides, tourism authority, and so forth. Thereafter, at step 408, the unified travel management module may deliver filtered and curated content to the end-user device after the at least one end-user selects the prominent information. Thereafter, at step 410, the metadata may be loaded onto maps within the computer simulated travel environment on the end-user device using the unified travel management module based on the selected prominent information. Thereafter, at step 412, the end-user may be enabled to move to other locations using the maps within the computer simulated travel environment. Thereafter, at step 414, real-time actions may be performed within the computer simulated travel environment without breaking the end-user's interest or leading the end-user to another website. Thereafter, at step 416, the end-user may be connected to the community and to other users in real time by the unified travel management module.
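Steps 402-416 above form a linear flow, which may be sketched as a simple pipeline. The step labels below mirror the flowchart; the function and its stub behavior are illustrative assumptions, not the patented method.

```python
# Hypothetical pipeline sketch of FIG. 4: each entry corresponds to one
# flowchart step; real implementations would replace the labels with work.

def unified_travel_flow(credentials: dict, prominent_info: dict) -> list:
    """Return the ordered steps performed for one session (stubbed)."""
    log = []
    log.append("authenticate")        # step 402: identity credentials
    log.append("set_profile")         # step 404: profile details
    log.append("select_info")         # step 406: prominent information
    log.append("load_content")        # step 408: filtered/curated content
    log.append("load_metadata_maps")  # step 410: metadata onto maps
    log.append("navigate_locations")  # step 412: move between locations
    log.append("real_time_actions")   # step 414: bookings/interactions in-world
    log.append("connect_community")   # step 416: connect to other users
    return log

steps = unified_travel_flow({"user": "alice"}, {"location": "Hyderabad"})
print(len(steps))  # 8
```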

[0036] Referring to FIG. 5, a flowchart 500 depicts an exemplary method for planning the real-time tour, in accordance with one or more embodiments. As an option, the method 500 may be carried out in the context of the details of FIG. 1, FIG. 2, and FIGS. 3A-3D. However, the method 500 may be carried out in any desired environment. Further, the aforementioned definitions apply equally to the description below.

[0037] The exemplary method 500 commences at step 502, wherein the filtered content and the curated content may be obtained on the end-user device. Thereafter, at step 504, the end-user may identify the filtered content and the curated content. Thereafter, at step 506, the end-user may analyze the identified filtered content and curated content to plan the real-time tour. Thereafter, at step 508, the real-time actions may be performed within the computer simulated travel environment at the end-user device.

[0038] Referring to FIG. 6, FIG. 6 is a flowchart 600 depicting an exemplary method for enabling the end-user to travel within the computer simulated travel environment, in accordance with one or more embodiments. As an option, the method 600 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, and FIG. 5. However, the method 600 may be carried out in any desired environment. Further, the aforementioned definitions apply equally to the description below.

[0039] The exemplary method 600 commences at step 602, wherein the end-user may travel within the computer simulated travel environment from one location to another. Thereafter, at step 604, the metadata may be loaded onto the maps within the computer simulated travel environment on the end-user device by the unified travel management module. Thereafter, at step 606, it is determined whether the end-user identifies a map to travel to another location within the computer simulated environment on the end-user device. If the answer at step 606 is YES, the method continues at step 608, wherein the required map may be selected by the end-user to travel to the other location within the computer simulated environment. If the answer at step 606 is NO, the previous location remains unchanged.
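The step-606 decision can be sketched as a simple branch; the map dictionary and function name below are illustrative assumptions only.

```python
# Hypothetical sketch of the step-606 decision in method 600.

def travel_to(current_location, destination, maps):
    # Step 606: determine whether the end-user identifies a map for the
    # destination within the computer simulated environment.
    if destination in maps:
        # Step 608 (YES branch): select the required map and travel there.
        return destination
    # NO branch: the previous location remains unchanged.
    return current_location

maps = {"Hyderabad": "map-1", "Goa": "map-2"}
```

For example, `travel_to("Hyderabad", "Goa", maps)` moves the end-user to Goa, while an unmapped destination leaves the end-user at the previous location.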

[0040] Referring to FIG. 7, FIG. 7 is a block diagram illustrating the details of digital processing system 700 in which various aspects of the present disclosure are operative by execution of appropriate software instructions. Digital processing system 700 may correspond to the end-user device 104 (or any other system in which the various features disclosed above can be implemented).

[0041] Digital processing system 700 may contain one or more processors such as a central processing unit (CPU) 710, random access memory (RAM) 720, secondary memory 730, graphics controller 760, display unit 770, network interface 780, and input interface 790. All the components except display unit 770 may communicate with each other over communication path 750, which may contain several buses as is well known in the relevant arts. The components of FIG. 7 are described below in further detail.

[0042] CPU 710 may execute instructions stored in RAM 720 to provide several features of the present disclosure. CPU 710 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 710 may contain only a single general-purpose processing unit, or may be part of a cloud processing unit.

[0043] RAM 720 may receive instructions from secondary memory 730 using communication path 750. RAM 720 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 725 and/or user programs 726. Shared environment 725 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 726.

[0044] Graphics controller 760 generates display signals (e.g., in RGB format) to display unit 770 based on data/instructions received from CPU 710. Display unit 770 contains a display screen to display the images defined by the display signals. Input interface 790 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs. Network interface 780 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in Figure 1, network 108) connected to the network.

[0045] Secondary memory 730 may contain hard drive 735, flash memory 736, and removable storage drive 737. Secondary memory 730 may store the data and software instructions (e.g., for performing the actions noted above with respect to the figures), which enable digital processing system 700 to provide several features in accordance with the present disclosure.

[0046] Some or all of the data and instructions may be provided on removable storage unit 740, and the data and instructions may be read and provided by removable storage drive 737 to CPU 710. A floppy drive, magnetic tape drive, CD-ROM drive, DVD drive, flash memory, and removable memory chip (PCMCIA card, EEPROM) are examples of such a removable storage drive 737.

[0047] The removable storage unit 740 may be implemented using a medium and storage format compatible with removable storage drive 737, or a cloud storage, such that removable storage drive 737 can read the data and instructions. Thus, the removable storage unit 740 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).

[0048] In this document, the term "computer program product" is used to generally refer to the removable storage unit 740 or hard disk installed in hard drive 735. These computer program products are means for providing instructions to digital processing system 700. CPU 710 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.

[0049] The term "storage media/medium" as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media include, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 730. Volatile media include dynamic memory, such as RAM 720. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, an NVRAM, and any other memory chip or cartridge.

[0050] Storage media are distinct from, but may be used in conjunction with, transmission media. Transmission media participate in transferring information between storage media. For example, transmission media include coaxial cables, copper wire, and fiber optics, including the wires that comprise communication path 750. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

[0051] More illustrative information will now be set forth regarding various optional architectures and uses in which the foregoing method may or may not be implemented, as per the desires of the system/user. It should be noted that the following information is set forth for illustrative purposes and should not be construed as limiting in any manner. Any of the following features may be optionally incorporated with or without the exclusion of other features described.

[0052] Furthermore, the described features, structures, or characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. In the above description, numerous specific details are provided such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, etc., to provide a thorough understanding of embodiments of the disclosure.

[0053] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described herein above as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.

Documents

Orders

Section Controller Decision Date
U/S 15 CHIRANJIT SARKAR 2019-09-19
U/S 77(1)(f) CHIRANJIT SARKAR 2021-05-20
U/S 77(1)(f) CHIRANJIT SARKAR 2021-05-20

Application Documents

# Name Date
1 201841037801-STATEMENT OF UNDERTAKING (FORM 3) [05-10-2018(online)].pdf 2018-10-05
2 201841037801-REQUEST FOR EARLY PUBLICATION(FORM-9) [05-10-2018(online)].pdf 2018-10-05
3 201841037801-POWER OF AUTHORITY [05-10-2018(online)].pdf 2018-10-05
4 201841037801-FORM-9 [05-10-2018(online)].pdf 2018-10-05
5 201841037801-FORM FOR STARTUP [05-10-2018(online)].pdf 2018-10-05
6 201841037801-FORM FOR SMALL ENTITY(FORM-28) [05-10-2018(online)].pdf 2018-10-05
7 201841037801-FORM 1 [05-10-2018(online)].pdf 2018-10-05
8 201841037801-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [05-10-2018(online)].pdf 2018-10-05
9 201841037801-EVIDENCE FOR REGISTRATION UNDER SSI [05-10-2018(online)].pdf 2018-10-05
10 201841037801-DRAWINGS [05-10-2018(online)].pdf 2018-10-05
11 201841037801-DECLARATION OF INVENTORSHIP (FORM 5) [05-10-2018(online)].pdf 2018-10-05
12 201841037801-COMPLETE SPECIFICATION [05-10-2018(online)].pdf 2018-10-05
13 201841037801-CLAIMS UNDER RULE 1 (PROVISIO) OF RULE 20 [05-10-2018(online)].pdf 2018-10-05
14 Correspondence by Agent_Submission Applications,Form26_12-10-2018.pdf 2018-10-12
15 201841037801-FORM 18A [27-10-2018(online)].pdf 2018-10-27
16 201841037801-FER.pdf 2018-11-28
17 201841037801-OTHERS [20-05-2019(online)].pdf 2019-05-20
18 201841037801-FER_SER_REPLY [20-05-2019(online)].pdf 2019-05-20
19 201841037801-DRAWING [20-05-2019(online)].pdf 2019-05-20
20 201841037801-CORRESPONDENCE [20-05-2019(online)].pdf 2019-05-20
21 201841037801-COMPLETE SPECIFICATION [20-05-2019(online)].pdf 2019-05-20
22 201841037801-CLAIMS [20-05-2019(online)].pdf 2019-05-20
23 201841037801-ABSTRACT [20-05-2019(online)].pdf 2019-05-20
24 201841037801-Correspondence to notify the Controller (Mandatory) [24-06-2019(online)].pdf 2019-06-24
25 201841037801-Correspondence to notify the Controller (Mandatory) [26-06-2019(online)].pdf 2019-06-26
26 201841037801-FORM-26 [02-07-2019(online)].pdf 2019-07-02
27 201841037801-HearingNoticeLetter03-07-2019.pdf 2019-07-03
28 Correspondence by Agent_Power of Attrney_08-07-2019.pdf 2019-07-08
29 201841037801-HearingNoticeLetter09-07-2019.pdf 2019-07-09
30 201841037801-Written submissions and relevant documents (MANDATORY) [15-07-2019(online)].pdf 2019-07-15
31 201841037801-Annexure (Optional) [15-07-2019(online)].pdf 2019-07-15
32 201841037801-RELEVANT DOCUMENTS [17-10-2019(online)].pdf 2019-10-17
33 201841037801-FORM-24 [17-10-2019(online)].pdf 2019-10-17
34 201841037801-FORM-26 [11-03-2020(online)].pdf 2020-03-11
35 201841037801-Form26_Power of Attorney_20-03-2020.pdf 2020-03-20
36 201841037801-Correspondence-20-03-2020.pdf 2020-03-20
37 201841037801-Retyped Pages under Rule 14(1) [24-03-2020(online)].pdf 2020-03-24
38 201841037801-2. Marked Copy under Rule 14(2) [24-03-2020(online)].pdf 2020-03-24
39 201841037801-Correspondence to notify the Controller [20-01-2021(online)].pdf 2021-01-20
40 201841037801-Written submissions and relevant documents [11-02-2021(online)].pdf 2021-02-11
41 201841037801-Annexure [11-02-2021(online)].pdf 2021-02-11
42 201841037801-FORM 3 [30-04-2021(online)].pdf 2021-04-30
43 201841037801-PatentCertificate20-05-2021.pdf 2021-05-20
44 201841037801-IntimationOfGrant20-05-2021.pdf 2021-05-20
45 201841037801-ReviewPetition-HearingNotice-(HearingDate-28-01-2021).pdf 2021-10-17

Search Strategy

1 searchstrategy_28-11-2018.pdf

ERegister / Renewals

3rd: 10 Nov 2021

From 05/10/2020 - To 05/10/2021

4th: 10 Nov 2021

From 05/10/2021 - To 05/10/2022

5th: 10 Nov 2021

From 05/10/2022 - To 05/10/2023

6th: 10 Nov 2021

From 05/10/2023 - To 05/10/2024

7th: 10 Nov 2021

From 05/10/2024 - To 05/10/2025

8th: 10 Nov 2021

From 05/10/2025 - To 05/10/2026

9th: 10 Nov 2021

From 05/10/2026 - To 05/10/2027

10th: 10 Nov 2021

From 05/10/2027 - To 05/10/2028