DRIVER ASSISTANCE SYSTEM AND A METHOD THEREOF

ABSTRACT

The present disclosure discloses an interactive driver assistance system and a method for informing, training and guiding a driver of a vehicle about the surroundings in a virtual space using a user interface. The driver assistance system is configured to receive vehicle surroundings data from one or more sensors installed in the vehicle; and guide a driver of the vehicle to traverse a path/course based on the vehicle surroundings data. The vehicle surroundings data comprises at least one of vehicle density around the vehicle, navigational details, critical parameters comprising nearest and closest vehicles, congested traffic, presence of large obstacles, and a course of response to a particular driving situation.

Figure 1
TECHNICAL FIELD
The present disclosure relates to guidance systems. In particular, but not exclusively, the present disclosure relates to a method and a system for assisting a driver.
BACKGROUND
Driver assistance systems are common in the current scenario. Various systems are built to provide safety and comfort to a driver while driving. A few systems concentrate on informing a driver about the surroundings, whereas a few other systems assist the driver in maneuvering through traffic and difficult paths.
The above systems relate only to traffic situations and path maneuvering. However, a driver may require assistance for various activities. For example, a driver may need information about the availability of parking in a particular building. In another instance, the driver may require guidance while driving in hilly terrain. In the present scenario, the driver may have to install a dedicated system for each type of assistance needed. In one instance, the user may choose not to use individual assistance systems, as installing individual systems may seem expensive. In another instance, the driver may install various systems for receiving assistance. However, while driving, the driver may lose concentration when interacting with different assistance systems. Thus, installing various systems and interacting with them while driving may prove dangerous to the driver.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
SUMMARY
In an embodiment, the present disclosure relates to an interactive driver assistance system for informing, training and guiding a driver of a vehicle about the surroundings in a virtual space using a user interface. The driver assistance system is configured to receive vehicle surroundings data from one or more sensors installed in the vehicle; and guide a driver of the vehicle to traverse a path/course based on the vehicle surroundings data. The vehicle surroundings data comprises at least one of vehicle density around the vehicle, navigational details, critical parameters comprising nearest and closest vehicles, congested traffic, presence of large obstacles, and a course of response to a particular driving situation.
In an embodiment, the present disclosure discloses a method for informing, training and guiding a driver of a vehicle about the surroundings in a virtual space using a user interface. The method comprises receiving vehicle surroundings data from one or more sensors installed in the vehicle and guiding a driver of the vehicle to traverse a path/course based on the vehicle surroundings data. The vehicle surroundings data comprises at least one of vehicle density around the vehicle, navigational details, critical parameters comprising nearest and closest vehicles, congested traffic, presence of large obstacles, and a course of response to a particular driving situation.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The novel features and characteristics of the disclosure are set forth in the appended claims. The disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying figures. One or more embodiments are now described, by way of example only, with reference to the accompanying figures wherein like reference numerals represent like elements and in which:
Figure 1 shows an exemplary block diagram of a vehicle comprising a driver assistance system, in accordance with some embodiments of the present disclosure;
Figure 2 shows an exemplary block diagram illustrating internal architecture of a driver assistance system, in accordance with some embodiments of the present disclosure;
Figure 3 shows an exemplary flowchart illustrating method steps for assisting a driver of a vehicle, in accordance with some embodiments of the present disclosure;
Figure 4 shows an example illustrating communication between vehicles and communication between vehicles and infrastructures, in accordance with some embodiments of the present disclosure;
Figure 5 shows an example illustrating guidance provided by the driver assistance system to manoeuvre traffic, in accordance with some embodiments of the present disclosure; and
Figure 6 is an exemplary block diagram of a general-purpose computer system.
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or apparatus.
Embodiments of the present disclosure relate to an interactive driver assistance system. The driver assistance system receives inputs or requests for assisting the driver. Further, the driver assistance system analyses the inputs or requests and assists the driver suitably.
Figure 1 shows an exemplary block diagram of a vehicle (100). The vehicle (100) comprises sensors (101), actuators (102), Global Positioning System (GPS (103)), an Electronic Control Unit (ECU (104)), a driver assistance system (105), and a user interface (106). A driver (108) is associated with the vehicle (100). The driver assistance system (105) may be connected to a network (107A) and a network (107B). The sensors (101) may include, but are not limited to, an acceleration sensor, braking sensor, clutch pedal depression sensor, vehicle (100) proximity detection sensor, rain sensor, parking sensor, wheel speed sensor, temperature sensor, heating ventilation and air conditioning sensor, oxygen sensor, pressure sensor, and any other sensor that may be present in a vehicle (100). The actuators (102) may include, but are not limited to, an Anti-lock Braking System (ABS), Traction Control System (TCS), Electronic Brake Distribution (EBD), air bags, wipers, headlamps, air conditioner, navigation system, audio system, video system, and any other actuator present in the vehicle (100). The ECU (104) receives one or more inputs from the sensors (101) and provides instructions to the actuators (102) to perform an action. The driver assistance system (105) receives inputs or requests for assisting the driver (108). Upon receiving such requests, the driver assistance system (105) analyses the requests and accordingly provides a solution to assist the driver (108). Further, the driver assistance system (105) may interact with the ECU (104) to assist the driver (108).
In an embodiment, the requests may relate to traffic manoeuvring, path manoeuvring, traffic data, terrain data, Internet services, parking assistance, and the like.
The driver assistance system (105) may be connected to the network (107B). Further, the driver assistance system (105) may provide Internet services when connected to the network (107B), which may or may not be provided by an Internet service provider. The ECU (104) may or may not access the network (107B) for internal requirements. The network (107A) is responsible for networking/connection between the internal hardware and the database server, which may be available offline or online. The network (107B) may or may not be utilized by both the ECU (104) and the driver assistance system (105).
The driver assistance system (105) interacts with the driver (108) using the user interface (106). The user interface (106) may include, but is not limited to, a keyboard, a microphone, a touchscreen, and the like.
In an embodiment, the driver assistance system (105) may communicate with the network (107A) and the network (107B) using connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), transmission control protocol/Internet protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. The network (107A) and the network (107B) may include, without limitation, a direct interconnection, wired connection, e-commerce network, a peer to peer (P2P) network, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol (WAP)), the Internet, Wireless Fidelity (Wi-Fi), etc.
Figure 2 illustrates the internal architecture of the driver assistance system (105) in accordance with some embodiments of the present disclosure. The driver assistance system (105) may include at least one Central Processing Unit (“CPU” or “processor”) (203) and a memory (202) storing instructions executable by the at least one processor (203). The processor (203) may comprise at least one data processor for executing program components for executing user or system-generated requests. The memory (202) is communicatively coupled to the processor (203). The driver assistance system (105) further comprises an Input/Output (I/O) interface (201). The I/O interface (201) is coupled with the processor (203) through which an input signal or/and an output signal is communicated.
In an embodiment, data (204) may be stored within the memory (202). The data (204) may include, for example, driver personality data (205), driver conversation data (206), driver behavior data (207), driver historical decisions (208), driver authentication data (209) and other data (210).
In an embodiment, the driver personality data (205) may include, but is not limited to, data collected from social media of the driver (108), personal details of the driver (108), personality traits, attributes, characteristics of the driver (108), habits, and the like.
In an embodiment, the driver conversation data (206) may include, but is not limited to, conversation between the driver (108) and the driver assistance system (105). Here, the conversation may include verbal conversation, touch interactions, visual interactions (like actions or gestures made by the driver (108)), etc.
In an embodiment, the driver behavior data (207) may include the driving pattern of the driver (108). Here, the driving pattern may include, but is not limited to, an indication of switching between the accelerator pedal, brake pedal and clutch pedal by the driver (108), steering wheel movements, lane change frequency, and the like.
In an embodiment, the driver historical decisions (208) may include, but are not limited to, decisions taken by the driver (108) when one or more solutions are provided to the driver (108), and responses from the driver (108). The decisions taken by the driver (108) may be recorded and stored.
In an embodiment, the driver authentication data (209) may include voice samples of the driver (108), visual samples of the driver (108), passwords provided by the driver (108), and the like to authorize the driver assistance system (105) to perform one or more actions.
In an embodiment, the data (204) in the memory (202) is processed by modules (211) of the driver assistance system (105). As used herein, the term module refers to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a Field-Programmable Gate Array (FPGA), a Programmable System-on-Chip (PSoC), a combinational logic circuit, and/or other suitable components that provide the described functionality. The modules (211), when configured with the functionality defined in the present disclosure, will result in novel hardware.
In one implementation, the modules (211) may include, for example, a communication module (212), a driver recognition module (213), a self-assessment module (214), an analyzing module (215), a solution identifying module (216), a prioritization module (217), a vocabulary module (218), an animator module (219), a comprehending module (220), an action module (221), a training module (222) and other modules (223). It will be appreciated that such aforementioned modules (211) may be represented as a single module or a combination of different modules (211).
In an embodiment, the communication module (212) receives the inputs or requests from the driver (108). Also, the communication module (212) communicates with the ECU (104). Here, the communication module (212) may receive data (204) from the ECU (104). Also, the communication module (212) may provide instructions to the ECU (104). The communication module (212) may also connect to the network (107A) or the network (107B) via a wired interface or a wireless interface. In an embodiment, the communication module (212) may support vehicle (100) to vehicle (100) (V2V) communication and vehicle (100) to infrastructure (V2I) communication.
In an embodiment, the driver recognition module (213) recognizes the driver (108) and authenticates the driver (108). The driver recognition module (213) may receive keypad inputs, touch inputs, voice inputs or visual inputs from the driver (108) as user credentials for authentication. The driver recognition module (213) may recognize the driver (108) by comparing the user credential details with predetermined credentials. The predetermined credentials may be stored in the memory (202) and the user credentials may be received from the driver (108) through the user interface (106).
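The credential comparison described above can be sketched as a minimal illustration; the hashing scheme, the stored-credential format and all names below are assumptions for illustration, not part of the disclosure.

```python
import hashlib

# Hypothetical predetermined credentials held in the memory (202):
# username mapped to a SHA-256 digest of the password.
PREDETERMINED_CREDENTIALS = {
    "alice": hashlib.sha256(b"open-sesame").hexdigest(),
}

def recognize_driver(username: str, password: str) -> bool:
    """Compare user credentials with the predetermined credentials."""
    stored = PREDETERMINED_CREDENTIALS.get(username)
    if stored is None:
        return False
    return hashlib.sha256(password.encode()).hexdigest() == stored

print(recognize_driver("alice", "open-sesame"))  # True
print(recognize_driver("alice", "wrong"))        # False
```

In practice the same comparison could accept voice or visual samples instead of a password, as the disclosure contemplates.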
In an embodiment, the self-assessment module (214) may assess a requirement to assist the driver (108) based on inputs received from the ECU (104). Here, the self-assessment module (214) determines a need to assist the driver (108), without driver (108) intervention. For example, while driving, when the driver (108) falls asleep and the vehicle (100) begins to drift away from a path, the self-assessment module (214) may detect the drifting of the vehicle (100) and may instruct the ECU (104) to correct the course of the vehicle (100).
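The drift detection in the example above might be realized with a simple threshold test over lateral-offset samples; the sensor quantity, the threshold value and the function names are hypothetical.

```python
def needs_course_correction(lateral_offset_m: float, threshold_m: float = 0.5) -> bool:
    """Flag a course correction when the vehicle strays beyond a lane threshold."""
    return abs(lateral_offset_m) > threshold_m

def self_assess(offsets) -> bool:
    """Scan a stream of lateral-offset samples; report whether to alert the ECU."""
    return any(needs_course_correction(o) for o in offsets)

# A drowsy driver slowly drifting out of the lane triggers an alert.
print(self_assess([0.1, 0.2, 0.4, 0.7]))  # True
print(self_assess([0.0, 0.1, -0.2]))      # False
```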
In an embodiment, the analyzing module (215) analyses the inputs or requests. Here, the analyzing module (215) may use driver conversation data (206), driver personality data (205) and driver behavior data (207) to analyse the inputs or requests. Here, analysis may include understanding nature of the inputs or requests, resources needed to handle the inputs or requests, and the like.
In an embodiment, the solution identifying module (216) identifies one or more solutions to assist the driver (108) based on the inputs or requests. Here, the solutions may be identified from a plurality of solutions stored in a database (not shown in figure). The solution identifying module (216) may use driver conversation data (206), driver personality data (205), driver behavior data (207), and driver historical decisions (208) for identifying one or more solutions.
In an embodiment, the prioritization module (217) prioritizes the one or more solutions based on severity and criticality of the inputs or requests. For example, when the driver (108) is drowsy while driving and the vehicle (100) is drifting off course, the one or more solutions may include waking the driver (108) and preparing for collision consequences. Here, the prioritization module (217) may prioritize the solution of suggesting that the driver (108) steer the vehicle (100) back on course rather than suggesting braking of the vehicle (100). In an embodiment, during a less critical situation, the prioritization module (217) may prioritize the one or more solutions and further request the driver’s (108) preference.
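The severity-and-criticality ordering performed by the prioritization module can be sketched as a sort; the numeric scales and the `Solution` structure are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Solution:
    description: str
    severity: int     # assumed scale: 0 (informational) to 10 (life-threatening)
    criticality: int  # assumed tie-breaker: urgency of acting right now

def prioritize(solutions):
    """Order candidate solutions so the most severe and critical come first."""
    return sorted(solutions, key=lambda s: (s.severity, s.criticality), reverse=True)

ranked = prioritize([
    Solution("read out sports feed", 1, 1),
    Solution("suggest braking", 9, 7),
    Solution("steer back on course", 9, 9),
])
print(ranked[0].description)  # steer back on course
```

In a less critical situation, the system could present the ranked list and ask for the driver's preference instead of acting on the top entry.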
In an embodiment, the vocabulary module (218) converts the prioritized solution to a language understandable by the driver (108). For example, the prioritized solution may be in a format understandable by the driver assistance system (105). The vocabulary module (218) may convert the prioritized solution to an English format, which may be understandable by the driver (108). The vocabulary module (218) narrates the prioritized solution to the driver (108). The vocabulary module (218) may use the driver personality data (205) for determining the language to be used for the driver (108), vocabulary, slang, and accent.
In an embodiment, animator module (219) provides the prioritized solutions to the driver (108) in an animated interface. The animator module (219) may include audio units, video units, tactile induced units, or other units which facilitate interactive user interactions.
In an embodiment, the comprehending module (220) may be used to understand requests arriving from the driver (108) or the self-assessment module (214). Further, the comprehending module (220) may communicate the requests to the analyzing module (215). Primary function of the comprehending module (220) may be to comprehend the requests and communicate the comprehended requests to various modules (211) of the driver assistance system (105).
In an embodiment, the action module (221) performs an action only when an output/options module (not shown) demands an action. The action module (221) may or may not utilize private information of the driver (108). In an embodiment, the action module (221) may suggest an action comprising at least one of making online bookings, making online transactions, conducting online searches, etc. This might require the driver’s personal data for authentication and for successful online transactions, bookings and searches, etc. These actions are executed by the action module (221). In an embodiment, the action module (221) is further configured to guide the driver (108) when the driver (108) is not capable of performing the action.
In an embodiment, the training module (222) triggers a training exercise for the driver (108). The training comprises a training session based on inputs from the driver (108) and current driving skill of the driver (108). The training scenarios may be real-time exercises and may be in a path selected by the driver (108). In an embodiment, the path may be selected by the training module (222) based on driving skill level of the driver (108). In an embodiment, the training may gradually increase difficulty level based on progress by the driver (108). The training module (222) may use inputs from the driver historical decisions (208) and the driver personality data (205).
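The gradual increase in training difficulty mentioned above might follow a simple rule; the success-rate thresholds and the 1–10 difficulty scale are assumptions for illustration only.

```python
def next_difficulty(current_level: int, success_rate: float) -> int:
    """Step the training difficulty up or down based on the driver's progress."""
    if success_rate > 0.8:                 # driver is comfortable: harder exercise
        return min(current_level + 1, 10)
    if success_rate < 0.4:                 # driver is struggling: ease off
        return max(current_level - 1, 1)
    return current_level                   # otherwise hold the current level

print(next_difficulty(5, 0.9))  # 6
print(next_difficulty(5, 0.3))  # 4
```

The driver historical decisions (208) and driver personality data (205) could feed into the success-rate estimate.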
In an embodiment, the driver assistance system (105) may implement Artificial Intelligence algorithms to assist the driver (108).
In an embodiment, the other modules (223) may include, but is not limited to, an output/ options module. The output/ options module may provide a mode in which the one or more solutions is to be conveyed to the driver (108).
Figure 3 shows a flow chart illustrating a method for assisting a driver, in accordance with some embodiments of the present disclosure.
As illustrated in Figure 3, the method (300) may comprise one or more steps for assisting a driver, in accordance with some embodiments of the present disclosure. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
The order in which the method (300) is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
At step 301, the communication module (212) may receive vehicle surroundings data from one or more sensors (101) installed in the vehicle (100), based on self-assessment of the system or on the driver’s requests. In an embodiment, the communication module (212) may also receive user requests through the user interface (106). The vehicle surroundings data may include, but is not limited to, vehicle density around the vehicle (100), navigation details, critical parameters comprising nearest and closest vehicles, traffic congestion, presence of large objects and a course of response to a particular situation. Further, the self-assessment module (214) may assess a requirement to assist the driver (108) based on the inputs/user requests. Here, the self-assessment module (214) determines a need to assist the driver (108), without driver (108) intervention. Thereafter, the analyzing module (215) analyses the inputs or user requests. Here, the analyzing module (215) may use the driver conversation data (206), driver personality data (205) and driver behavior data (207) to analyse the inputs or requests. The analysis may include understanding the nature of the inputs or requests, the resources needed to handle the inputs or requests, and the like.
At step 302, the solution identifying module (216) identifies one or more solutions to guide the driver (108) based on the inputs or requests. Here, the solutions may be identified from a plurality of solutions stored in a database (not shown in figure). The solution identifying module (216) may use driver conversation data (206), driver personality data (205), driver behavior data (207), and driver historical decisions (208) for identifying one or more solutions. Further, the prioritization module (217) prioritizes the one or more solutions based on severity and criticality of the inputs or requests. In an embodiment, during a less critical situation, the prioritization module (217) may prioritize the one or more solutions and further request for driver (108) preference. Thereafter, the vocabulary module (218) converts the prioritized solution to a language understandable by the driver (108). The vocabulary module (218) narrates the prioritized solution to the driver (108). The vocabulary module (218) may use the driver personality data (205) for determining language to be used for the driver (108), vocabulary, slang, and accent.
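The two method steps above can be sketched end to end as a small pipeline; the function names, the sensor readings and the string-based solution format are illustrative assumptions, not part of the disclosure.

```python
def receive_surroundings_data(sensors):
    """Step 301: collect vehicle surroundings data from the sensors."""
    return {name: read() for name, read in sensors.items()}

def identify_solutions(data):
    """Step 302 (part 1): derive candidate (solution, severity) pairs from the data."""
    solutions = []
    if data.get("gap_to_nearest_m", 100) < 5:
        solutions.append(("keep distance from the nearest vehicle", 8))
    if data.get("traffic_density", 0) > 0.7:
        solutions.append(("suggest an alternate route", 4))
    return solutions

def prioritize_and_narrate(solutions):
    """Step 302 (part 2): prioritize by severity and narrate the top solution."""
    if not solutions:
        return "No guidance needed."
    top = max(solutions, key=lambda s: s[1])
    return f"Suggestion: {top[0]}."

# Hypothetical sensor readings: a close vehicle ahead, dense traffic.
sensors = {"gap_to_nearest_m": lambda: 3.2, "traffic_density": lambda: 0.9}
data = receive_surroundings_data(sensors)
print(prioritize_and_narrate(identify_solutions(data)))
```

A real system would draw the candidate solutions from the database of stored solutions and pass the narration through the vocabulary module.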
In an embodiment, the animator module (219) provides the prioritized solutions to the driver (108) in an animated interface. In an embodiment, the action module (221) performs an action only when an output/options module (not shown) demands an action. The action module (221) may or may not utilize private information of the driver (108). In an embodiment, the action module (221) may suggest an action comprising at least one of making online bookings, making online transactions, conducting online searches, etc. This might require the driver’s personal data for authentication and for successful online transactions, bookings and searches, etc. These actions are executed by the action module (221). In an embodiment, the action module (221) is further configured to guide the driver (108) when the driver (108) is not capable of performing the action.
In an embodiment, the guidance is provided using a user interface (106) comprising at least one of audio, video, animation, or a combination of audio/video/animation. The user interface (106) is configured to illustrate manoeuvring steps to the driver (108), while monitoring his/her actions. The guidance comprises relevant and critical information at any given moment of time. The guidance comprises at least one of information on the closest vehicle in the proximity of the vehicle (100), information on traffic, the degree of depression of at least one of an accelerator, a brake pedal and a clutch, and rotation of the steering wheel. The guidance is further based on at least one of driver (108) personality, driver (108) history, safety priority and driver (108) needs.
In an embodiment, the training module (222) provides training to the driver (108) based on the skill of the driver (108) analyzed during real-time driving and training phases. The training module (222) is configured to provide real-time training to the driver (108) based on at least one of driver (108) history, driver (108) needs and the current skill level of the driver (108).
In an embodiment, the interactive driver assistance system (105) is further configured to assist with user requests, where the assistance comprises at least one of performing online bookings, performing online transactions, and conducting online searches, which may or may not require user authentication. These actions are executed by the action module (221).
Figure 4 shows an example of providing assistance to the driver (108). From Figure 4, it can be observed that the vehicle (100) interacts with satellites to determine the location of the vehicle (100) and navigate home on the quickest path.
Consider another scenario where the driver (108) is driving the vehicle (100) on a narrow hilly road. Another vehicle from the opposite lane arrives in front of the vehicle (100). Thus, there is a need for the driver (108) to drive the vehicle (100) in reverse. Here, the driver (108) may request assistance for manoeuvring the narrow hilly road. The driver assistance system (105) may determine the surroundings of the vehicle (100) and also determine the terrain condition. Further, the driver assistance system (105) may intelligently assist the driver (108) to drive in reverse without damaging the vehicle (100). Particularly, the driver assistance system (105) may monitor the steering wheel and the depression of the clutch pedal, brake pedal and accelerator pedal. Based on the monitoring, the driver assistance system (105) may suggest that the driver (108) manage the steering wheel. The suggestions may include switching between the clutch pedal, brake pedal and the accelerator pedal for efficiently manoeuvring the path. As shown in Figure 4, two vehicles can communicate with each other using V2V communication to manoeuvre a path. The V2V communication is very helpful while overtaking and while navigating in hilly areas, but is not limited to such scenarios.
Consider a scenario where the driver (108) would like to book tickets for a movie. The driver (108) requests the driver assistance system (105) for available shows within a close radius of the driver (108). The driver assistance system (105) may determine the vehicle (100) location and identify cinemas near the vehicle (100) location. Further, the driver assistance system (105) may suggest available movies. Also, the driver assistance system (105) may suggest movies based on traffic and the Estimated Time of Arrival (ETA) of the driver (108) at the cinemas.
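The ETA-based suggestion in this scenario can be sketched as a filter over candidate shows; the show structure, the field names and the safety buffer are hypothetical.

```python
def feasible_shows(shows, eta_minutes, buffer_minutes=10):
    """Keep only shows starting after the driver's estimated arrival plus a buffer."""
    return [s for s in shows if s["starts_in_min"] >= eta_minutes + buffer_minutes]

shows = [
    {"title": "Show A", "starts_in_min": 15},
    {"title": "Show B", "starts_in_min": 60},
]
# With a 20-minute ETA, only shows starting in 30+ minutes remain reachable.
print([s["title"] for s in feasible_shows(shows, eta_minutes=20)])  # ['Show B']
```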
Consider another scenario of a two-way road without a divider. Let us assume the vehicle (100) is surrounded by four four-wheelers and four two-wheelers as shown in Figure 5. A conversation between the driver assistance system (105) and the driver (108) is given below:
Driver assistance system (105): Driver (108), there are eight vehicles around the vehicle (100), and the two-wheeler in front is the closest. As it is an incline, I suggest keeping distance from the two-wheeler in front. Also, I strongly suggest not overtaking the two-wheeler. Traffic on the opposite lane is choked and will remain the same for a while. I can find an alternate route. The route is longer, but we will reach sooner than by pursuing the current path.
Driver (108): So, you are suggesting we take a U-turn and continue until next exit. I doubt it will help us as the route seems very long.
Driver assistance system (105): Yes. It is longer than the current route. However, the time for reaching the destination through the alternate route is 25 minutes, compared to 46 minutes on the current route. Meanwhile, I can update you on the sports feed. Would you like to hear it?
Driver (108): Okay. No need for rerouting. However, please give me the latest sports update.
Driver assistance system (105): Okay.
In an embodiment, the driver assistance system (105) provides real-time guidance to the driver (108).
In an embodiment, the driver assistance system (105) guides the driver (108) to manoeuvre a path safely.
In an embodiment, the driver assistance system (105) guides and trains the driver (108) in real-time driving situations and a virtual training environment, as per need and driver (108) requests.
In an embodiment, the driver assistance system (105) provides guidance to the driver (108) only when it is absolutely essential and needed. The combination of suggestion media (audio, video, tactile, etc.) is specific to a particular driving situation. Hence, the system does not overload the driver (108) with numerous suggestions and actions. This ensures user-friendly and effective interaction between the driver assistance system (105) and the driver (108).
In an embodiment, the driver assistance system (105) takes the personality of the driver (108) and the conversation history into account to eliminate redundancy of suggestions and increase precision.
COMPUTER SYSTEM
Figure 6 illustrates a block diagram of an exemplary computer system (600) for implementing embodiments consistent with the present disclosure. In an embodiment, the computer system (600) is used to implement the method for assisting a driver of a vehicle. The computer system (600) may comprise a central processing unit (“CPU” or “processor”) (602). The processor (602) may comprise at least one data processor for executing program components for dynamic resource allocation at run time. The processor (602) may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
The processor (602) may be disposed in communication with one or more input/output (I/O) devices (not shown) via I/O interface (601). The I/O interface (601) may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc.
Using the I/O interface (601), the computer system (600) may communicate with one or more I/O devices. For example, the input device (610) may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, stylus, scanner, storage device, transceiver, video device/source, etc. The output device (611) may be a printer, fax machine, video display (e.g., cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED), plasma, plasma display panel (PDP), organic light-emitting diode (OLED) display, or the like), audio speaker, etc.
In some embodiments, the computer system (600) is connected to the service operator through a communication network (609). The processor (602) may be disposed in communication with the communication network (609) via a network interface (603). The network interface (603) may communicate with the communication network (609). The network interface (603) may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), transmission control protocol/Internet protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. The communication network (609) may include, without limitation, a direct interconnection, e-commerce network, a peer to peer (P2P) network, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, Wi-Fi, etc. Using the network interface (603) and the communication network (609), the computer system (600) may communicate with the one or more service operators.
In some embodiments, the processor (602) may be disposed in communication with a memory (605) (e.g., RAM, ROM, etc., not shown in Figure 6) via a storage interface (604). The storage interface (604) may connect to memory (605) including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as serial advanced technology attachment (SATA), Integrated Drive Electronics (IDE), IEEE-1394, Universal Serial Bus (USB), fibre channel, Small Computer Systems Interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
The memory (605) may store a collection of program or database components, including, without limitation, a user interface (606), an operating system (607), a web server (608), etc. In some embodiments, the computer system (600) may store user/application data, such as the data (204), variables, records, etc., as described in this disclosure. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
The operating system (607) may facilitate resource management and operation of the computer system (600). Examples of operating systems include, without limitation, Apple Macintosh OS X, Unix, Unix-like system distributions (e.g., Berkeley Software Distribution (BSD), FreeBSD, NetBSD, OpenBSD, etc.), Linux distributions (e.g., Red Hat, Ubuntu, Kubuntu, etc.), IBM OS/2, Microsoft Windows (XP, Vista/7/8, 10 etc.), Apple iOS, Google Android, Blackberry OS, or the like.
In some embodiments, the computer system (600) may implement a web browser (608) stored program component. The web browser (608) may be a hypertext viewing application, such as Microsoft Internet Explorer, Google Chrome, Mozilla Firefox, Apple Safari, etc. Secure web browsing may be provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS), etc. Web browsers (608) may utilize facilities such as AJAX, DHTML, Adobe Flash, JavaScript, Java, Application Programming Interfaces (APIs), etc. In some embodiments, the computer system (600) may implement a mail server stored program component. The mail server may be an Internet mail server such as Microsoft Exchange, or the like. The mail server may utilize facilities such as ASP, ActiveX, ANSI C++/C#, Microsoft .NET, CGI scripts, Java, JavaScript, PERL, PHP, Python, WebObjects, etc. The mail server may utilize communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), Microsoft Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. In some embodiments, the computer system (600) may implement a mail client stored program component. The mail client may be a mail viewing application, such as Apple Mail, Microsoft Entourage, Microsoft Outlook, Mozilla Thunderbird, etc.
In an embodiment, the computer system (600) may be connected to surrounding vehicles or an infrastructure (612) through the communication network (609).
The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.
The terms "including", "comprising", “having” and variations thereof mean "including but not limited to", unless expressly specified otherwise.
The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise. The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
The illustrated operations of Figure 3 show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified or removed. Moreover, steps may be added to the above described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
REFERRAL NUMERALS:
Description Reference number
Vehicle 100
Sensors 101
Actuators 102
GPS 103
ECU 104
Driver assistance system 105
U/I 106
Network 107A
Network 107B
Driver 108
I/O interface 201
Memory 202
Processor 203
Data 204
Driver personality data 205
Driver conversation data 206
Driver behavior data 207
Driver historical decisions 208
Driver authentication data 209
Other data 210
Modules 211
Communication module 212
Driver recognition module 213
Self-assessment module 214
Analyzing module 215
Solution identifying module 216
Prioritization module 217
Vocabulary module 218
Animator module 219
Comprehending module 220
Action module 221
Training module 222
Other modules 223
Computer system 600
I/O interface 601
Processor 602
Network interface 603
Storage interface 604
Memory 605
User interface 606
Operating system 607
Web browser 608
Communication network 609
Input device 610
Output device 611
Surrounding vehicle and infrastructure 612
CLAIMS:
We claim:
1. An interactive driver assistance system for informing, training and guiding a driver of a vehicle about surroundings in a virtual space using user interface, the system configured to:
receive vehicle surroundings data from one or more sensors installed in the vehicle; and
guide a driver of the vehicle to traverse a path/course based on the vehicle surroundings data, wherein the vehicle surroundings data comprises at least one of vehicle density around the vehicle, navigational details, critical parameters comprising nearest and closest vehicles, congested traffic, presence of large obstacles and a course of response to a particular driving situation.
2. The interactive driver assistance system as claimed in claim 1, configured to analyze criticality of the driving situation, safety and capability of the driver to perform an action, and, if needed, suggest an action comprising at least one of: to manoeuvre obstacles, to avoid traffic, to control the vehicle, to negotiate proximate vehicles and to park the vehicle.
3. The interactive driver assistance system as claimed in claim 2, further configured to guide the driver when the driver is not capable of performing the action.
4. The interactive driver assistance system as claimed in claim 1, wherein the guidance is provided using a user interface comprising at least one of an audio, a video, an animation, and a combination of audio/video/animation, wherein the user interface is configured to illustrate manoeuvring steps to the driver, while monitoring his/her actions.
5. The interactive driver assistance system as claimed in claim 1, wherein the guidance comprises relevant and critical information at any given moment of time.
6. The interactive driver assistance system as claimed in claim 5, wherein the guidance comprises at least one of: information of the closest vehicle in the proximity of the vehicle, information on traffic, degree of depression of at least one of an accelerator, a brake pedal and a clutch, and rotation of the steering wheel.
7. The interactive driver assistance system as claimed in claim 6, wherein the guidance is based on at least one of driver personality, driver history, safety priority and driver needs.
8. The interactive driver assistance system as claimed in claim 1, wherein the training is provided to the driver based on the skill of the driver analyzed during a real-time driving and training phase.
9. The interactive driver assistance system as claimed in claim 1, configured to provide real-time training to the driver based on at least one of driver history, driver needs, and current skill level of the driver.
10. The interactive driver assistance system as claimed in claim 1, further configured to assist user requests, wherein assistance comprises at least one of performing online bookings, performing online transactions, and conducting online searches, which may or may not require user authentication.
11. A method for informing, training and guiding a driver of a vehicle about surroundings in a virtual space using user interface, the method comprising:
receiving, by an interactive driver assistance system, vehicle surroundings data from one or more sensors installed in the vehicle; and
guiding, by an interactive driver assistance system, a driver of the vehicle to traverse a path/course based on the vehicle surroundings data, wherein the vehicle surroundings data comprises at least one of vehicle density around the vehicle, navigational details, critical parameters comprising nearest and closest vehicles, congested traffic, presence of large obstacles and a course of response to a particular driving situation.
12. The method as claimed in claim 11, wherein guiding comprises analyzing criticality of the driving situation, safety and capability of the driver to perform an action, and, if needed, suggesting an action comprising at least one of: to manoeuvre obstacles, to avoid traffic, to control the vehicle, to negotiate proximate vehicles and to park the vehicle.
13. The method as claimed in claim 11, wherein the guidance is provided when the driver is not capable of performing the action.
14. The method as claimed in claim 11, wherein the guidance is provided using a user interface comprising at least one of an audio, a video, an animation, and a combination of audio/video/animation, wherein the user interface is configured to illustrate manoeuvring steps to the driver, while monitoring his/her actions.
15. The method as claimed in claim 11, wherein the guidance comprises relevant and critical information at any given moment of time.
16. The method as claimed in claim 15, wherein the guidance comprises at least one of: information of the closest vehicle in the proximity of the vehicle, information on traffic, degree of depression of at least one of an accelerator, a brake pedal and a clutch, and rotation of the steering wheel.
17. The method as claimed in claim 16, wherein the guidance is based on at least one of driver personality, driver history, safety priority and driver needs.
18. The method as claimed in claim 11, wherein the training is provided to the driver based on the skill of the driver analyzed during a real-time driving and training phase.
19. The method as claimed in claim 11, wherein the training is provided in real-time to the driver based on at least one of driver history, driver needs, and current skill level of the driver.
20. The method as claimed in claim 11, further comprising assisting user requests, wherein assistance comprises at least one of performing online bookings, performing online transactions, and conducting online searches, which may or may not require user authentication.
Dated this 30th of March 2018
R. RAMYA RAO
IN/PA – 1607
AGENT FOR THE APPLICANT
OF K&S PARTNERS
| # | Name | Date |
|---|---|---|
| 1 | 201721011653-Annexure [24-01-2025(online)].pdf | 2025-01-24 |
| 2 | Form 5 [31-03-2017(online)].pdf | 2017-03-31 |
| 3 | 201721011653-Response to office action [24-01-2025(online)].pdf | 2025-01-24 |
| 4 | Form 3 [31-03-2017(online)].pdf | 2017-03-31 |
| 5 | Form 1 [31-03-2017(online)].pdf | 2017-03-31 |
| 6 | 201721011653-8(i)-Substitution-Change Of Applicant - Form 6 [21-01-2025(online)].pdf | 2025-01-21 |
| 7 | Drawing [31-03-2017(online)].pdf | 2017-03-31 |
| 8 | 201721011653-ASSIGNMENT DOCUMENTS [21-01-2025(online)].pdf | 2025-01-21 |
| 9 | Description(Provisional) [31-03-2017(online)].pdf | 2017-03-31 |
| 10 | 201721011653-PA [21-01-2025(online)].pdf | 2025-01-21 |
| 11 | PROOF OF RIGHT [28-06-2017(online)].pdf | 2017-06-28 |
| 12 | 201721011653-Written submissions and relevant documents [17-03-2024(online)].pdf | 2024-03-17 |
| 13 | 201721011653-PETITION UNDER RULE 138 [17-02-2024(online)].pdf | 2024-02-17 |
| 14 | 201721011653-ORIGINAL UNDER RULE 6 (1A)-30-06-2017.pdf | 2017-06-30 |
| 15 | 201721011653-FORM-26 [09-10-2017(online)].pdf | 2017-10-09 |
| 16 | 201721011653-Correspondence to notify the Controller [01-02-2024(online)].pdf | 2024-02-01 |
| 17 | 201721011653-DRAWING [31-03-2018(online)].pdf | 2018-03-31 |
| 18 | 201721011653-FORM-26 [01-02-2024(online)]-1.pdf | 2024-02-01 |
| 19 | 201721011653-COMPLETE SPECIFICATION [31-03-2018(online)].pdf | 2018-03-31 |
| 20 | 201721011653-FORM-26 [01-02-2024(online)].pdf | 2024-02-01 |
| 21 | 201721011653-FORM-8 [03-04-2018(online)].pdf | 2018-04-03 |
| 22 | 201721011653-US(14)-ExtendedHearingNotice-(HearingDate-02-02-2024).pdf | 2024-01-12 |
| 23 | 201721011653-FORM 18 [03-04-2018(online)].pdf | 2018-04-03 |
| 24 | 201721011653-Response to office action [01-08-2023(online)].pdf | 2023-08-01 |
| 25 | 201721011653-Correspondence to notify the Controller [20-07-2023(online)].pdf | 2023-07-20 |
| 26 | 201721011653-FORM 18 [03-04-2018(online)]-1.pdf | 2018-04-03 |
| 27 | 201721011653-ORIGINAL UNDER RULE 6 (1A)-FORM 26-111017.pdf | 2018-08-11 |
| 28 | 201721011653-US(14)-ExtendedHearingNotice-(HearingDate-24-07-2023).pdf | 2023-07-13 |
| 29 | 201721011653-Response to office action [30-06-2023(online)].pdf | 2023-06-30 |
| 30 | Abstract.jpg | 2019-05-02 |
| 31 | 201721011653-Correspondence to notify the Controller [12-06-2023(online)].pdf | 2023-06-12 |
| 32 | 201721011653-FER_SER_REPLY [05-08-2021(online)].pdf | 2021-08-05 |
| 33 | 201721011653-FORM-26 [12-06-2023(online)].pdf | 2023-06-12 |
| 34 | 201721011653-FER.pdf | 2021-10-18 |
| 35 | 201721011653-US(14)-HearingNotice-(HearingDate-13-06-2023).pdf | 2023-05-26 |
| 36 | 201721011653-Response to office action [11-08-2025(online)].pdf | 2025-08-11 |
| 1 | 2021-02-0113-25-24E_01-02-2021.pdf | |
| 2 | 201721011653AE_12-11-2021.pdf | |