
System For Controlling An Autonomous Vehicle

Abstract: The present disclosure relates to system(s) and method(s) for controlling an autonomous vehicle. In one embodiment, vehicle data may be received from a network device and a set of autonomous vehicles. Further, real-time vehicle data of the set of autonomous vehicles may be collected. Furthermore, cyborg data associated with a human being within a vicinity of the autonomous vehicle may be obtained. Based on the vehicle data, the real-time vehicle data and the cyborg data, a predicted behaviour of the autonomous vehicle may be determined. The autonomous vehicle may be further controlled based on the predicted behaviour and a current behaviour of the autonomous vehicle.


Patent Information

Application #
201911000116
Filing Date
02 January 2019
Publication Number
07/2019
Publication Type
INA
Invention Field
ELECTRICAL
Status
Email
ip@legasis.in
Parent Application
Patent Number
Legal Status
Grant Date
2024-10-16
Renewal Date

Applicants

HCL Technologies Limited
A-9, Sector - 3, Noida 201 301, Uttar Pradesh, India

Inventors

1. SUNDARARAJ, Jayaramakrishnan
HCL Technologies Limited, Hub-1, Karle Town Center SEZ, Nagavara, Bangalore - 560045, Karnataka, India
2. DEY, Sourav
HCL Technologies Limited, Surya Sapphire, PLOT NO 3, 1st Phase, Electronic City, Bangalore - 560100, Karnataka, India

Specification

[001] The present application does not claim priority from any patent application.
TECHNICAL FIELD
[002] The present disclosure in general relates to the field of autonomous vehicles. More particularly, the present invention relates to a system and method for controlling an autonomous vehicle.
BACKGROUND
[003] Currently, the world is moving towards human-less interaction with vehicles. Some autonomous vehicles already exist in the market. These vehicles drive autonomously on roads without any human interference. At times, however, an autonomous vehicle may not drive accurately: it may take a sudden turn without indicating to a vehicle behind, or it may fail to detect an object in its path. The autonomous vehicle thus takes wrong actions, which may result in accidents or collisions of vehicles. The occurrence of accidents or collisions disturbs other vehicles on the road. Also, it may sometimes not be possible to control the autonomous vehicle. Hence, there is a need to predict the behaviour of the autonomous vehicle and then control the autonomous vehicle based on the predictions.
SUMMARY
[004] Before the present systems and methods for controlling an autonomous vehicle are described, it is to be understood that this application is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce concepts related to systems and methods for controlling the autonomous vehicle. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[005] In one implementation, a system for predicting behaviour of an autonomous vehicle is illustrated. The system comprises a memory and a processor coupled to the memory, wherein the processor is configured to execute programmed instructions stored in the memory. In one embodiment, the processor may execute programmed instructions stored in the memory for receiving vehicle data from a set of sensors, mounted at a variety of locations on a road, and from a set of autonomous vehicles within a vicinity of the autonomous vehicle. The processor may further execute programmed instructions stored in the memory for collecting real-time vehicle data from the set of autonomous vehicles. The real-time vehicle data comprises a current location of the vehicle, a current behaviour of the vehicle, a destination, current information of adjacent autonomous vehicles, and emergency information. Further, the processor may execute programmed instructions stored in the memory for obtaining cyborg data associated with a human being within the vicinity of the autonomous vehicle. The cyborg data may be obtained by using a cyborg device mounted to each autonomous vehicle of the set of autonomous vehicles. Furthermore, the processor may execute the programmed instructions stored in the memory for determining a predicted behaviour of the autonomous vehicle based on analysis of the vehicle data, the real-time vehicle data and the cyborg data.
[006] In another implementation, a method for predicting behaviour of an autonomous vehicle is illustrated. In one embodiment, the method may comprise receiving vehicle data from a set of sensors, mounted at a variety of locations on a road, and from a set of autonomous vehicles within a vicinity of the autonomous vehicle. The method may further comprise collecting real-time vehicle data from the set of autonomous vehicles. The real-time vehicle data comprises a current location of the vehicle, a current behaviour of the vehicle, a destination, current information of adjacent autonomous vehicles, and emergency information. Further, the method may comprise obtaining cyborg data associated with a human being within the vicinity of the autonomous vehicle. The cyborg data may be obtained by using a cyborg device mounted to each autonomous vehicle of the set of autonomous vehicles. Furthermore, the method may comprise determining a predicted behaviour of the autonomous vehicle based on analysis of the vehicle data, the real-time vehicle data and the cyborg data.
[007] In yet another implementation, a computer program product having embodied thereon a computer program for predicting behaviour of an autonomous vehicle is disclosed. In one embodiment, the program may comprise a program code for receiving vehicle data from a set of sensors, mounted at a variety of locations on a road, and from a set of autonomous vehicles within a vicinity of the autonomous vehicle. The program may further comprise a program code for collecting real-time vehicle data of the set of autonomous vehicles. The real-time vehicle data may comprise a current location of the vehicle, a current behaviour of the vehicle, a destination, current information of adjacent autonomous vehicles, and emergency information. Further, the program may comprise a program code for obtaining cyborg data associated with a human being within the vicinity of the autonomous vehicle. The cyborg data may be obtained by using a cyborg device mounted to each autonomous vehicle of the set of autonomous vehicles. Furthermore, the program may comprise a program code for determining a predicted behaviour of the autonomous vehicle based on analysis of the vehicle data, the real-time vehicle data and the cyborg data.
BRIEF DESCRIPTION OF DRAWINGS
[008] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[009] Figure 1 illustrates a network implementation of a system for controlling an
autonomous vehicle, in accordance with an embodiment of the present subject matter.
[0010] Figure 2 illustrates the system for predicting behaviour of the autonomous vehicle, in accordance with an embodiment of the present subject matter.
[0011] Figure 3(A) illustrates a system of an autonomous vehicle, in accordance with an embodiment of the present subject matter.
[0012] Figure 3(B) illustrates an autonomous vehicle device for controlling the autonomous vehicle, in accordance with an embodiment of the present subject matter.
[0013] Figure 4 illustrates a method for controlling the autonomous vehicle, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0014] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. The words "receiving", "obtaining", "determining", "collecting", "analysing", "generating" and other forms thereof are intended to be equivalent in meaning and be open ended, in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that, as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present disclosure, exemplary systems and methods for controlling an autonomous vehicle are now described. The disclosed embodiments of the system and method for controlling an autonomous vehicle are merely exemplary of the disclosure, which may be embodied in various forms.
[0015] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure for controlling an autonomous vehicle is not intended to be limited to the embodiments illustrated, but is to be accorded the widest scope consistent with the principles and features described herein.
[0016] The present subject matter relates to controlling an autonomous vehicle. In one embodiment, vehicle data may be received from a set of sensors and a set of autonomous vehicles. The set of sensors may be mounted at a variety of locations on the roadside. The set of autonomous vehicles may be within a vicinity of the autonomous vehicle. Further, real-time vehicle data of a set of vehicles within a vicinity of the autonomous vehicle may be received. Further, cyborg data associated with a human being within the vicinity of the autonomous vehicle may be obtained from a cyborg device. The cyborg device may be attached to each autonomous vehicle of the set of autonomous vehicles. The cyborg data, the vehicle data, and the real-time vehicle data may be analysed to determine a predicted behaviour of the autonomous vehicle.
[0017] Further, an autonomous vehicle device may determine a current behaviour of the autonomous vehicle based on analysis of the real-time vehicle data and the cyborg data. The current behaviour of the autonomous vehicle may be compared with the predicted behaviour of the autonomous vehicle to generate one or more suggestions. The one or more suggestions may be further used to control the autonomous vehicle.
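By way of illustration only, the predict-compare-control loop described in the two paragraphs above might be sketched as follows. This is a minimal Python sketch under assumed names (control_loop, predict_behaviour, determine_current_behaviour, generate_suggestions, apply_suggestions); the disclosure does not prescribe any particular implementation.

```python
# Minimal sketch of the predict-compare-control loop; all names are
# illustrative assumptions, not part of the disclosure.
def control_loop(vehicle_data, real_time_data, cyborg_data,
                 predict_behaviour, determine_current_behaviour,
                 generate_suggestions, apply_suggestions):
    # Network-centric system: predict behaviour from all three data feeds.
    predicted = predict_behaviour(vehicle_data, real_time_data, cyborg_data)
    # On-vehicle device: derive the current behaviour from onboard feeds.
    current = determine_current_behaviour(real_time_data, cyborg_data)
    # Any divergence between the two yields corrective suggestions.
    if predicted != current:
        suggestions = generate_suggestions(predicted, current)
        apply_suggestions(suggestions)  # e.g. change lane, stop vehicle
```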

[0018] Referring now to Figure 1, a network implementation 100 of a system 102 for
controlling an autonomous vehicle is disclosed. Although the present subject matter is explained considering that the system 102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. In one implementation, the system 102 may be implemented over a cloud network. The system 102 may be referred to as a network centric intelligent system. Further, it will be understood that the system 102 may be connected to multiple network connected devices such as devices 104-1, 104-2…104-N, collectively referred to as network connected devices 104 hereinafter. Examples of the network connected devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The network connected devices 104 may be communicatively coupled to the system 102 through a network 106.
[0019] In one implementation, the network 106 may be a wireless network, a wired
network or a combination thereof. The network 106 may be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0020] In one embodiment, the system 102 may receive vehicle data from the network
connected devices 104 and a set of autonomous vehicles. The network connected devices 104 may be mounted at a variety of locations. The set of autonomous vehicles may include an autonomous vehicle 103, neighbour autonomous vehicles 105-A and 105-B, and the like. In one aspect, the vehicle data may comprise information of adjacent vehicles, location, behaviour of the autonomous vehicle, driving information of the autonomous vehicle, and image processor data.
[0021] The system 102 may further collect real-time vehicle data of the autonomous
vehicle 103 and the neighbour autonomous vehicles 105-A and 105-B. The real-time vehicle data may comprise a current location of the vehicle, a current behaviour of the vehicle, a destination, emergency information and the like. Further, the system 102 may obtain cyborg data from a cyborg device. The cyborg data may be associated with a human being within the vicinity of the autonomous vehicle. The cyborg device may be attached to each autonomous vehicle of the set of autonomous vehicles. The cyborg data may be obtained based on analysis of mind information of the human being using a cortical learning technique.
[0022] Further, the system 102 may determine a predicted behaviour of the autonomous vehicle based on analysis of the vehicle data, the real-time vehicle data and the cyborg data. The vehicle data, the real-time vehicle data and the cyborg data may be analysed using a machine learning technique. The system 102 may further transmit the predicted behaviour of the autonomous vehicle to an autonomous vehicle device 107. The system 102 may communicate with the autonomous vehicle device 107 using wireless communication. The autonomous vehicle device 107 may be mounted in each autonomous vehicle of the set of autonomous vehicles.
[0023] In one embodiment, the autonomous vehicle 103 may comprise an autonomous
vehicle device 107 (not shown in Figure 1). The autonomous vehicle device 107 may collect the real-time vehicle data of the set of autonomous vehicles. The real-time vehicle data may comprise a current location of the vehicle, a current behaviour of the vehicle, a destination, emergency information and the like. Further, the autonomous vehicle device 107 may obtain the cyborg data from the cyborg device. The cyborg data may be associated with the human being within the vicinity of the autonomous vehicle. The cyborg device may be attached to each autonomous vehicle of the set of autonomous vehicles. The cyborg data may be obtained based on analysis of mind information of the human being using a cortical learning technique.
[0024] The autonomous vehicle device 107 may further determine a current behaviour
of the autonomous vehicle. The current behaviour may be determined based on analysis of the real-time vehicle data and the cyborg data. The autonomous vehicle device 107 may further compare the current behaviour and the predicted behaviour to generate one or more recommendations. The autonomous vehicle device 107 may further control the autonomous vehicle based on the one or more recommendations.
[0025] Referring now to figure 2, the system 102 for predicting behaviour of an
autonomous vehicle is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, at least one processor 202 may be configured to fetch and execute computer-readable instructions stored in the memory 206.
[0026] The I/O interface 204 may include a variety of software and hardware interfaces,
for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with the user directly or through the user device 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0027] The memory 206 may include any computer-readable medium known in the art
including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[0028] The modules 208 may include routines, programs, objects, components, data
structures, and the like, which perform particular tasks, functions or implement particular abstract data types. In one implementation, the modules 208 may include a receiving module 212, a collection module 214, an obtaining module 216, a prediction module 218, and other modules 222. The other modules 222 may include programs or coded instructions that supplement applications and functions of the system 102.
[0029] The data 210, amongst other things, serves as a repository for storing data
processed, received, and generated by one or more of the modules 208. The data 210 may
also include a repository 224, and other data 226. In one embodiment, the other data 226 may include data generated as a result of the execution of one or more modules in the other modules 222.
[0030] In one implementation, a user may access the system 102 via the I/O interface
204. The user may be registered using the I/O interface 204 in order to use the system 102. In one aspect, the user may access the I/O interface 204 of the system 102 for obtaining information, providing input information or configuring the system 102.
[0031] In one embodiment, the receiving module 212 may receive vehicle data from a
network device and a set of autonomous vehicles. The vehicle data may comprise information of adjacent vehicles, location, behaviour of the autonomous vehicle, driving information of the autonomous vehicle, and image processor data. In one aspect, the network device may be present at a variety of locations. The network device may be an IoT (Internet of Things) device, a mobile device, a public safety device, a GPS enabled device, a sensor and the like. Further, the set of autonomous vehicles may be within a vicinity of the autonomous vehicle. The vicinity of the autonomous vehicle may be predefined.
[0032] In one aspect, the vehicle data may be analysed using a deep learning algorithm
to detect an anomaly. The anomaly may correspond to an object in the way of the autonomous vehicle. In one example, the image processor data may be analysed to detect the anomaly.
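As one hedged example of this anomaly detection step, the sketch below flags an object in the vehicle's path when an incoming frame of image processor data deviates strongly from an expected road background. A trained deep learning model would replace the simple deviation test; the function name and threshold are assumptions for illustration.

```python
import numpy as np

def detect_anomaly(frame: np.ndarray, background: np.ndarray,
                   threshold: float = 25.0) -> bool:
    """Flag a possible object in the vehicle's path when the frame
    deviates strongly from the expected road background."""
    # Mean absolute pixel deviation as a crude stand-in for a deep model.
    error = np.abs(frame.astype(float) - background.astype(float)).mean()
    return error > threshold
```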
[0033] In one embodiment, the autonomous vehicle may be an industrial heavy vehicle,
a naval ship, an airport internal transport vehicle, a road transport vehicle, a campus battery operated vehicle, an intercity river transport system, an agriculture-based vehicle and the like.
[0034] Further, the collection module 214 may collect real-time vehicle data of the set
of autonomous vehicles. The real-time vehicle data may comprise a current location of the vehicle, a current behaviour of the vehicle, a destination, current information of adjacent autonomous vehicles, emergency information and the like.
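One possible in-memory shape for this real-time vehicle data record is sketched below; the field names and types are assumptions, as the disclosure does not fix a schema.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RealTimeVehicleData:
    current_location: tuple            # e.g. (latitude, longitude)
    current_behaviour: str             # e.g. "lane_keeping", "turning"
    destination: tuple                 # target (latitude, longitude)
    adjacent_vehicles: list = field(default_factory=list)  # neighbour vehicle ids
    emergency_information: Optional[str] = None  # set when an emergency is active
```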
[0035] Upon receiving the real-time vehicle data, the obtaining module 216 may obtain
cyborg data associated with a human being within the vicinity of the autonomous vehicle. The human being in the vicinity of the autonomous vehicle may include passengers in the autonomous vehicle, a traffic police officer at a traffic signal where the autonomous vehicle is present, and passengers in other autonomous vehicles adjacent to the autonomous vehicle. The cyborg data may be obtained using a cyborg device attached to each autonomous vehicle of the set of autonomous vehicles. In one aspect, the cyborg device may be attached to the body of the human being. In one example, the cyborg device, attached to the autonomous vehicle, may prompt the human being in the autonomous vehicle to wear the cyborg device.
[0036] In one embodiment, the cyborg device may receive mind information of the
human being within the vicinity of the autonomous vehicle. The mind information may be received via Bluetooth, Wi-Fi, a functional MRI technique and the like. Once the mind information is received, a neural behaviour of the human being may be identified based on analysis of the mind information. In one aspect, the mind information may be analysed using a cortical learning algorithm. Further, the neural behaviour may be compared with historical neural behaviour stored in a repository. Based on the comparison, emotions of the human being may be identified. The emotions may be further classified into positive emotions and negative emotions using a classifier algorithm. In one aspect, the positive emotions may reflect fine driving of the autonomous vehicle. Further, the negative emotions may reflect a problem in the driving of the autonomous vehicle. Thus, the cyborg device may determine the cyborg data based on the emotions of the human being.
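The classification step might look like the stand-in below, where the mind information has already been reduced to a neural-behaviour feature vector and is compared against stored history; a nearest-centroid rule substitutes for the cortical learning algorithm and classifier, and all names are hypothetical.

```python
import numpy as np

def classify_emotion(neural_behaviour: np.ndarray,
                     historical_positive: np.ndarray,
                     historical_negative: np.ndarray) -> str:
    """Nearest-centroid stand-in for the emotion classifier."""
    d_pos = np.linalg.norm(neural_behaviour - historical_positive)
    d_neg = np.linalg.norm(neural_behaviour - historical_negative)
    # Positive emotions reflect fine driving; negative emotions flag a problem.
    return "positive" if d_pos < d_neg else "negative"
```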
[0037] Further, the prediction module 218 may analyse the vehicle data, the real-time
vehicle data and the cyborg data. The vehicle data, the real-time vehicle data and the cyborg data may be analysed using a machine learning algorithm. Based on the analysis, the prediction module 218 may determine a predicted behaviour of the autonomous vehicle. In one embodiment, the predicted behaviour may correspond to a change in direction of the vehicle, the occurrence of an object in the way of the vehicle, a collision of vehicles and the like. In one example, the predicted behaviour may be the behaviour of the autonomous vehicle in the next predefined time window.
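A toy stand-in for this machine learning step is sketched below: features drawn from the vehicle data, the real-time vehicle data and the cyborg data feed a classifier that outputs one of the behaviour labels named above. The feature layout, labels and training rows are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy rows of [speed_kmh, distance_to_object_m, negative_emotion_ratio].
X_train = np.array([[30, 50, 0.1], [60, 5, 0.8], [45, 2, 0.9], [40, 80, 0.0]])
y_train = ["nominal", "object_in_path", "collision", "nominal"]

model = RandomForestClassifier(n_estimators=10, random_state=0).fit(X_train, y_train)

def predict_behaviour(features: list) -> str:
    """Predicted behaviour of the autonomous vehicle in the next time window."""
    return model.predict([features])[0]
```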
[0038] Once the predicted behaviour is determined, the prediction module 218 may
transmit the predicted behaviour to the autonomous vehicle. In one embodiment, the prediction module 218 may transmit the predicted behaviour to an autonomous vehicle device attached to the autonomous vehicle.
[0039] Referring now to figure 3(A), a system of the autonomous vehicle 103 is
illustrated in accordance with an embodiment of the present subject matter. Referring now
to figure 3(B), the autonomous vehicle device 107 to control an autonomous vehicle 103 is illustrated in accordance with an embodiment of the present subject matter. Further, the autonomous vehicle device 107 controlling the autonomous vehicle 103 is illustrated with figure 3(A) and figure 3(B). In one embodiment, the system of the autonomous vehicle 103 may comprise an autonomous vehicle device 107, an intelligent cyborg system 307, a feedback system communicator 308, a propulsion system 301, sensor systems 302, control systems 303, a navigation system 304, a collision avoidance system 305, a multiple interface communicator 306 and a Cognitive Data Analyser (CDA) 309.
[0040] In one embodiment, the autonomous vehicle device 107 may include at least
one processor 310, an input/output (I/O) interface 312, and a memory 314. The at least one processor 310 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 310 may be configured to fetch and execute computer-readable instructions stored in the memory 314.
[0041] The I/O interface 312 may include a variety of software and hardware interfaces,
for example, a web interface, a graphical user interface, and the like. The I/O interface 312 may allow the autonomous vehicle device 107 to interact with the user. Further, the I/O interface 312 may enable the autonomous vehicle device 107 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 312 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 312 may include one or more ports for connecting a number of devices to one another or to another server.
[0042] The memory 314 may include any computer-readable medium known in the art
including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 314 may include modules 316 and data 315.
[0043] The modules 316 may include routines, programs, objects, components, data
structures, and the like, which perform particular tasks, functions or implement particular
abstract data types. In one implementation, the module 316 may include a collection module 318, an obtaining module 320, a determination module 322, a generation module 324, a control module 326, and other modules 328. The other modules 328 may include programs or coded instructions that supplement applications and functions of the autonomous vehicle device 107.
[0044] The data 315, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 316. The data 315 may also include a repository 330, and other data 332. In one embodiment, the other data 332 may include data generated as a result of the execution of one or more modules in the other modules 328.
[0045] In one implementation, a user may access the autonomous vehicle device 107
via the I/O interface 312. The user may be registered using the I/O interface 312 in order to use the autonomous vehicle device 107. In one aspect, the user may access the I/O interface 312 of the autonomous vehicle device 107 for obtaining information, providing input information or configuring the autonomous vehicle device 107.
[0046] In one embodiment, the collection module 318 may collect the real-time vehicle data of the set of autonomous vehicles. The real-time vehicle data may be received from the propulsion system 301, the sensor systems 302, the control systems 303, the navigation system 304 and the collision avoidance system 305. In one aspect, the propulsion system 301 may provide driving information of the set of autonomous vehicles. The sensor systems 302 may comprise an ultrasonic directional sensor to detect an object in front of each autonomous vehicle. The control systems 303 may provide the behaviour of the autonomous vehicle. The navigation system 304 may provide the destination and the route to the destination of each autonomous vehicle. The collision avoidance system 305 may provide emergency information of each autonomous vehicle. In one embodiment, the real-time vehicle data may comprise a current location of the vehicle, a current behaviour of the vehicle, the destination, current information of adjacent autonomous vehicles, the emergency information and the like. In one aspect, the collection module 318 may receive information of autonomous vehicles adjacent to the autonomous vehicle from the feedback system communicator 308.
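The collection step might aggregate these subsystem feeds into a single record as sketched below; the subsystem objects and their accessor methods are assumptions standing in for the systems 301-305.

```python
def collect_real_time_data(propulsion, sensors, control,
                           navigation, collision_avoidance) -> dict:
    """Aggregate one snapshot of real-time vehicle data (hypothetical accessors)."""
    return {
        "driving_information": propulsion.driving_info(),       # propulsion system 301
        "object_ahead": sensors.ultrasonic_front(),              # sensor systems 302
        "current_behaviour": control.behaviour(),                # control systems 303
        "destination": navigation.destination(),                 # navigation system 304
        "route": navigation.route(),
        "emergency_information": collision_avoidance.status(),   # collision avoidance 305
    }
```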
[0047] Further, the collection module 318 may receive the predicted behaviour of the autonomous vehicle. The predicted behaviour may be received from the system 102. The predicted behaviour may be received via the multiple interface communicator 306.
[0048] Upon receiving the real-time vehicle data, the obtaining module 320 may obtain the cyborg data associated with a human being within the vicinity of the autonomous vehicle. The cyborg data may be obtained using a cyborg device referred to as the intelligent cyborg system 307. In one aspect, the intelligent cyborg system 307 may be attached to each autonomous vehicle of the set of autonomous vehicles. In another aspect, the intelligent cyborg system 307 may be attached to the body of the human being.
[0049] In one embodiment, the intelligent cyborg system 307 may receive mind
information of the human being within the vicinity of the autonomous vehicle. The mind information may be received via Bluetooth, Wi-Fi, a functional MRI technique and the like. Once the mind information is received, a neural behaviour of the human being may be identified based on analysis of the mind information. In one aspect, the mind information may be analysed using a cortical learning algorithm. Further, the neural behaviour may be compared with historical neural behaviour stored in a repository. Based on the comparison, emotions of the human being may be identified. The emotions may be further classified into positive emotions and negative emotions using a classifier algorithm. In one aspect, the positive emotions may reflect fine driving of the autonomous vehicle. Further, the negative emotions may reflect a problem in the driving of the autonomous vehicle. Thus, the intelligent cyborg system 307 may determine the cyborg data based on the emotions of the human being.
[0050] Once the real-time vehicle data and the cyborg data are received, the determination module 322 may determine a current behaviour of the autonomous vehicle. The current behaviour may be determined based on analysis of the real-time vehicle data and the cyborg data. In one aspect, the CDA 309 may analyse the real-time vehicle data and the cyborg data. The real-time vehicle data and the cyborg data may be analysed using a machine learning algorithm.
[0051] Further, the generation module 324 may compare the predicted behaviour and the current behaviour of the autonomous vehicle. Based on the comparison, the generation module 324 may generate one or more recommendations for the autonomous vehicle. The one or more recommendations may correspond to changing the direction of the vehicle, changing the lane, stopping the vehicle, and the like.
[0052] Once the one or more recommendations are generated, the control module 326 may control the autonomous vehicle based on the one or more recommendations. The control module 326 may take actions to change the behaviour of the autonomous vehicle. The control module 326 may modify the driving pattern of the autonomous vehicle. Based on the one or more recommendations, the control module 326 may avoid accidents, collisions and the like. In one embodiment, the control module 326 may identify the best route based on the one or more recommendations.
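Taken together, the generation and control steps of the two paragraphs above might be sketched as below; the divergence-to-recommendation mapping and the actuation hook are illustrative assumptions.

```python
# Illustrative mapping from a predicted hazard to corrective recommendations.
RECOMMENDATIONS = {
    "collision": ["stop vehicle"],
    "object_in_path": ["change lane", "reduce speed"],
    "change_in_direction": ["change direction"],
}

def generate_recommendations(predicted: str, current: str) -> list:
    """Generation module: recommend corrections only when behaviours diverge."""
    if predicted == current:
        return []
    return RECOMMENDATIONS.get(predicted, ["reduce speed"])

def control(recommendations: list, actuate) -> None:
    """Control module: turn each recommendation into an actuation command."""
    for action in recommendations:
        actuate(action)  # e.g. forwarded to propulsion/steering subsystems
```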
[0053] Exemplary embodiments discussed above may provide certain advantages.
Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[0054] Some embodiments of the system and the method are configured to control an autonomous vehicle.
[0055] Some embodiments of the system and the method are configured to provide real-time recommendations to the autonomous vehicle.
[0056] Referring now to figure 4, a method 400 for controlling an autonomous vehicle,
is disclosed in accordance with an embodiment of the present subject matter. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, and the like, that perform particular functions or implement particular abstract data types. The method 400 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0057] The order in which the method 400 is described is not intended to be construed
as a limitation, and any number of the described method blocks can be combined in any order to implement the method 400 or alternate methods. Additionally, individual blocks may be deleted from the method 400 without departing from the spirit and scope of the
subject matter described herein. Furthermore, the method 400 can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 400 may be considered to be implemented in the above described system 102.
[0058] At block 402, vehicle data may be received. In one implementation, the
receiving module 212 may receive the vehicle data. The vehicle data may be received from a network device, present at a variety of locations, and from a set of autonomous vehicles within a vicinity of the autonomous vehicle. The vehicle data may comprise information of adjacent vehicles, location, behaviour of the autonomous vehicle, driving information of the autonomous vehicle, and image processor data.
[0059] At block 404, real-time vehicle data of the set of autonomous vehicles may be
collected. In one implementation, the collection module 214 may collect the real-time vehicle data. The real-time vehicle data may comprise a current location of the vehicle, a current behaviour of the vehicle, a destination, current information of adjacent autonomous vehicles, emergency information and the like.
[0060] At block 406, cyborg data associated with a human being may be obtained. In
one implementation, the obtaining module 216 may obtain the cyborg data associated with the human being within the vicinity of the autonomous vehicle. The cyborg data may be obtained from a cyborg device attached to each autonomous vehicle of the set of autonomous vehicles.
[0061] At block 408, a predicted behaviour of the autonomous vehicle may be
determined. In one implementation, the prediction module 218 may determine the predicted behaviour of the autonomous vehicle. The predicted behaviour may be determined based on analysis of the vehicle data, the real-time vehicle data and the cyborg data.
[0062] At block 410, the predicted behavior may be transmitted to an autonomous
vehicle device. In one implementation, the prediction module 218 may transmit the predicted behaviour. The autonomous vehicle device may further compare a current behaviour of the autonomous vehicle with the predicted behaviour. Based on the comparison, one or more recommendations may be generated. Further, the autonomous vehicle may be controlled based on the one or more recommendations.
[0063] Although implementations for systems and methods for controlling an autonomous vehicle have been described, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for controlling the autonomous vehicle.

WE CLAIM:

1. A system for predicting behaviour of an autonomous vehicle, the system comprising:
a memory;
a processor coupled to the memory, wherein the processor is configured to execute programmed instructions stored in the memory to:
receive vehicle data from a network device, present at a variety of locations, and from a set of autonomous vehicles within a vicinity of the autonomous vehicle;
collect real-time vehicle data of the set of autonomous vehicles, wherein the real-time vehicle data comprises a current location of the vehicle, a current behaviour of the vehicle, a destination, current information of adjacent autonomous vehicles, and emergency information;
obtain cyborg data associated with a human being within the vicinity of the autonomous vehicle, wherein the cyborg data is obtained from a cyborg device attached to each autonomous vehicle of the set of autonomous vehicles; and
determine a predicted behaviour of the autonomous vehicle based on analysis of the vehicle data, the real-time vehicle data and the cyborg data.
2. The system as claimed in claim 1, wherein the vehicle data comprises information of adjacent vehicles, location, behaviour of the autonomous vehicle, driving information of the autonomous vehicle, and image processor data.
3. The system as claimed in claim 1, further configured to transmit the predicted behaviour of the autonomous vehicle to an autonomous vehicle device.
4. The autonomous vehicle device to control the autonomous vehicle, the autonomous vehicle device comprising:
a memory;
a processor coupled to the memory, wherein the processor is configured to execute programmed instructions stored in the memory to:
collect the real-time vehicle data of the set of autonomous vehicles, wherein the real-time vehicle data comprises a current location of the vehicle, a current behaviour of the vehicle, a destination, and information of adjacent autonomous vehicles;
obtain the cyborg data associated with a human being within the vicinity of the autonomous vehicle, wherein the cyborg data is obtained by using a cyborg device attached to each autonomous vehicle of the set of autonomous vehicles;
determine a current behavior of the autonomous vehicle based on analysis of the real-time vehicle data and the cyborg data;
receive the predicted behaviour of the autonomous vehicle;
generate one or more recommendations based on comparison of the current behavior and the predicted behavior of the autonomous vehicle; and
control the autonomous vehicle based on the one or more recommendations.
5. The autonomous vehicle device as claimed in claim 4, wherein the one or more recommendations correspond to a change in direction, a change in lane, and a destination.
6. The system as claimed in claim 1, wherein the cyborg data is obtained based on:
receiving mind information of the human being within the vicinity of the autonomous vehicle using a functional MRI technique;
applying a cortical learning technique on the mind information to generate a mind-map of the human being;
comparing the mind-map of the human being with a historical mind-map stored in a repository; and
obtaining the cyborg data based on the comparison of the mind-map of the human being and the historical mind-map.
7. A method for predicting behaviour of an autonomous vehicle, the method comprising the steps of:
receiving, by a processor, vehicle data from a network device, present at a variety of locations, and from a set of autonomous vehicles within a vicinity of the autonomous vehicle;
collecting, by the processor, real-time vehicle data of the set of autonomous vehicles, wherein the real-time vehicle data comprises a current location of the vehicle, a current behaviour of the vehicle, a destination, current information of adjacent autonomous vehicles, and emergency information;
obtaining, by the processor, cyborg data associated with a human being within the vicinity of the autonomous vehicle, wherein the cyborg data is obtained from a cyborg device attached to each autonomous vehicle of the set of autonomous vehicles; and
determining, by the processor, a predicted behaviour of the autonomous vehicle based on analysis of the vehicle data, the real-time vehicle data and the cyborg data.
8. The method as claimed in claim 7, wherein the vehicle data comprises information of adjacent vehicles, location, behaviour of the autonomous vehicle, driving information of the autonomous vehicle, and image processor data.
9. The method as claimed in claim 7, further comprising transmitting the predicted behaviour of the autonomous vehicle to an autonomous vehicle device.
10. The method as claimed in claim 7, wherein the cyborg data is obtained based on:
receiving mind information of the human being within the vicinity of the autonomous vehicle using a functional MRI technique;
applying a cortical learning technique on the mind information to generate a mind-map of the human being;
comparing the mind-map of the human being with a historical mind-map stored in a repository; and
obtaining the cyborg data based on the comparison of the mind-map of the human being and the historical mind-map.
11. A computer program product having embodied thereon a computer program for predicting behaviour of an autonomous vehicle, the computer program product comprising:
a program code for receiving vehicle data from a network device, present at a variety of locations, and from a set of autonomous vehicles within a vicinity of the autonomous vehicle;
a program code for collecting real-time vehicle data of the set of autonomous vehicles, wherein the real-time vehicle data comprises a current location of the vehicle, a current behaviour of the vehicle, a destination, current information of adjacent autonomous vehicles, and emergency information;
a program code for obtaining cyborg data associated with a human being within the vicinity of the autonomous vehicle, wherein the cyborg data is obtained from a cyborg device attached to each autonomous vehicle of the set of autonomous vehicles; and
a program code for determining a predicted behaviour of the autonomous vehicle based on analysis of the vehicle data, the real-time vehicle data and the cyborg data.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 201911000116-IntimationOfGrant16-10-2024.pdf 2024-10-16
2 201911000116-PatentCertificate16-10-2024.pdf 2024-10-16
3 201911000116-Written submissions and relevant documents [28-08-2024(online)].pdf 2024-08-28
4 201911000116-Correspondence to notify the Controller [09-08-2024(online)].pdf 2024-08-09
5 201911000116-FORM-26 [09-08-2024(online)].pdf 2024-08-09
6 201911000116-US(14)-HearingNotice-(HearingDate-13-08-2024).pdf 2024-07-19
7 201911000116-CLAIMS [17-12-2021(online)].pdf 2021-12-17
8 201911000116-COMPLETE SPECIFICATION [17-12-2021(online)].pdf 2021-12-17
9 201911000116-CORRESPONDENCE [17-12-2021(online)].pdf 2021-12-17
10 201911000116-FER_SER_REPLY [17-12-2021(online)].pdf 2021-12-17
11 201911000116-Proof of Right [30-11-2021(online)].pdf 2021-11-30
12 201911000116-FER.pdf 2021-10-18
13 201911000116-FORM 13 [09-07-2021(online)].pdf 2021-07-09
14 201911000116-POA [09-07-2021(online)].pdf 2021-07-09
15 201911000116-Correspondence-090719.pdf 2019-07-13
16 201911000116-OTHERS-090719.pdf 2019-07-13
17 201911000116-Proof of Right (MANDATORY) [02-07-2019(online)].pdf 2019-07-02
18 abstract.jpg 2019-02-18
19 201911000116-COMPLETE SPECIFICATION [02-01-2019(online)].pdf 2019-01-02
20 201911000116-DRAWINGS [02-01-2019(online)].pdf 2019-01-02
21 201911000116-FIGURE OF ABSTRACT [02-01-2019(online)].jpg 2019-01-02
22 201911000116-FORM 1 [02-01-2019(online)].pdf 2019-01-02
23 201911000116-FORM 18 [02-01-2019(online)].pdf 2019-01-02
24 201911000116-FORM-9 [02-01-2019(online)].pdf 2019-01-02
25 201911000116-POWER OF AUTHORITY [02-01-2019(online)].pdf 2019-01-02
26 201911000116-REQUEST FOR EARLY PUBLICATION(FORM-9) [02-01-2019(online)].pdf 2019-01-02
27 201911000116-REQUEST FOR EXAMINATION (FORM-18) [02-01-2019(online)].pdf 2019-01-02
28 201911000116-STATEMENT OF UNDERTAKING (FORM 3) [02-01-2019(online)].pdf 2019-01-02

Search Strategy

1 2021-06-2916-49-52E_30-06-2021.pdf
2 201911000116searchAE_07-02-2022.pdf

ERegister / Renewals

3rd: 09 Jan 2025 (From 02/01/2021 to 02/01/2022)
4th: 09 Jan 2025 (From 02/01/2022 to 02/01/2023)
5th: 09 Jan 2025 (From 02/01/2023 to 02/01/2024)
6th: 09 Jan 2025 (From 02/01/2024 to 02/01/2025)
7th: 09 Jan 2025 (From 02/01/2025 to 02/01/2026)