
Vehicle Control Method And Apparatus

Abstract: The present disclosure relates to a control system (1) for controlling operation of a host vehicle (2). The control system (1) includes a controller (4) having a processor (5) and a memory (6). The processor (5) is operable to receive a signal (S1) from sensing means (7) and to process the signal (S1) to identify a target vehicle (3). The processor (5) determines a vertical offset (ΔV) between the host vehicle (2) and the target vehicle (3). A target follow distance (D1) may be set based on the determined vertical offset (ΔV). The present disclosure also relates to a vehicle (1) incorporating the control system (1); a related method of controlling a vehicle (2); and a non-transitory computer-readable medium. [FIGURE 4]


Patent Information

Application #:
Filing Date: 01 March 2018
Publication Number: 36/2019
Publication Type: INA
Invention Field: PHYSICS
Status:
Email: patents@lls.in
Parent Application:

Applicants

JAGUAR LAND ROVER LIMITED
Abbey Road Whitley Coventry Warwickshire CV3 4LF, United Kingdom

Inventors

1. SHAMSHIRI, Navid
Jaguar Land Rover, Patents Department W/1/073 Abbey Road, Whitley Coventry Warwickshire CV3 4LF, United Kingdom
2. RAVEENDRAN, Arun
Jaguar Land Rover, Patents Department W/1/073 Abbey Road, Whitley Coventry Warwickshire CV3 4LF, United Kingdom
3. BOYD, Robin
Jaguar Land Rover, Patents Department W/1/073 Abbey Road, Whitley Coventry Warwickshire CV3 4LF, United Kingdom

Specification

The present disclosure relates to a vehicle control method and apparatus. In particular, but not exclusively, the present disclosure relates to a control apparatus and method for a host vehicle. The control apparatus and method identify a target vehicle and determine a vertical offset between the host vehicle and the identified target vehicle. The host vehicle may be controlled in dependence on the determined vertical offset.
BACKGROUND
It is known to provide a host vehicle with a control system to maintain a distance to another vehicle on a road, for example in a traffic jam or at higher speeds. The known systems generally operate by determining a distance to the other (target) vehicle, measured either as a length or as a function of time based on the current road speed. In an off-road scenario, inclined slopes and gradients are often encountered. As the surface of a slope may be variable or offer low traction, it is bad practice to follow a target vehicle onto a hill before it has completed the ascent or descent. If a host vehicle is engaged in a follow mode and the target vehicle begins an ascent or descent, it would be desirable for the host vehicle not to follow the target vehicle. However, existing control systems which seek to maintain a defined distance to the target vehicle are unable to identify when these scenarios arise.
SUMMARY OF THE INVENTION
Aspects of the present invention relate to a control system; a vehicle; a method; and a non-transitory computer-readable medium as claimed in the appended claims.
According to a further aspect of the present invention there is provided a control system for controlling operation of a host vehicle, the host vehicle being a land vehicle; wherein the control system comprises: a processor for receiving a signal from sensing means, the processor being operable to process the signal to identify a target vehicle which is a land vehicle; and to determine a vertical offset between the host vehicle and the target vehicle. The target vehicle may be in front of the host vehicle. The target vehicle may, for example, be the vehicle in front of the host vehicle in a convoy or may be a lead vehicle in a convoy. The host vehicle may be a following vehicle (i.e. a vehicle which is following the target vehicle). By tracking the vertical offset between the host vehicle and the target vehicle, the processor may determine a rate of ascent/descent of the target vehicle. The rate of ascent/descent may be indicative of the gradient. In certain embodiments, the processor may establish a location where the slope begins. The processor may implement an appropriate control strategy for the host vehicle. The control strategy may include, for example, continuing to follow the target vehicle (for example if a relatively shallow gradient is detected), or waiting at the base or crest of the gradient for a driver confirmation to proceed. It may also be possible in some cases to establish whether the target vehicle has cleared the crest or base of the hill, and it may therefore be appropriate to hold the host vehicle until this has been established before automatically resuming to follow.
The control system has particular application in an off-road scenario when the host vehicle is travelling on an unmetalled surface. However, it should be understood that aspects of the present invention may also be relevant to on-road scenarios, for example when the host vehicle is travelling on a highway or other metalled surface. The controller may be operable to control operation of the host vehicle in dependence on the determined vertical offset. By way of example, the controller may be operable to adjust a target follow distance in dependence on the determined vertical offset. The target follow distance represents a target distance to be maintained between the host vehicle and the detected target vehicle.
The sensing means may be operable to establish a detection region in front of the host vehicle. The sensing means could be remote from the host vehicle, for example on a drone vehicle. Alternatively, the sensing means may be provided on the host vehicle. The sensing means may be arranged in a forward-facing orientation on said host vehicle. The sensing means may comprise at least one sensor. The sensing means may comprise at least one optical sensor.
The processor may be operable to determine a rate of change of the determined vertical offset. The processor may be operable to identify a base of an incline and/or a crest of an incline in dependence on the rate of change of the determined vertical offset. A change in the rate of change of the determined vertical offset may indicate that the target vehicle has started to ascend or descend an incline; or may indicate that the target vehicle has completed the ascent or descent of the incline. The processor may be operable to determine when the rate of change of the vertical offset decreases below a first threshold value; and/or to determine when the rate of change of the vertical offset increases above a second threshold value.
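The threshold logic above can be sketched as follows. This is an illustrative assumption, not taken from the specification: the threshold values, units, and positional bookkeeping are hypothetical.

```python
class InclineDetector:
    """Sketch: track the rate of change of the vertical offset between host
    and target vehicle, and record commencement/conclusion positions when
    the rate crosses the two threshold values."""

    def __init__(self, start_threshold=0.15, end_threshold=0.05):
        self.start_threshold = start_threshold  # m/s, "second threshold" (assumed value)
        self.end_threshold = end_threshold      # m/s, "first threshold" (assumed value)
        self.on_incline = False
        self.events = []  # list of (event_name, target_position)

    def update(self, offset_rate, target_position):
        """offset_rate: d(vertical offset)/dt in m/s; target_position: (x, y)."""
        rate = abs(offset_rate)
        if not self.on_incline and rate > self.start_threshold:
            # Target has started to ascend or descend an incline.
            self.on_incline = True
            self.events.append(("commencement", target_position))
        elif self.on_incline and rate < self.end_threshold:
            # Target has levelled off at the crest or base.
            self.on_incline = False
            self.events.append(("conclusion", target_position))
```

A caller would feed in successive rate samples; the recorded positions then correspond to the commencement and conclusion positions described above.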
The processor may be operable to record an ascent conclusion position of the target vehicle when the rate of change of the vertical offset decreases below said first threshold value. The ascent conclusion position may correspond to a crest of an incline or the base of an incline. The processor may be operable to record an ascent/descent commencement position when the rate of change of the vertical offset increases above said second threshold value. The ascent/descent commencement position may correspond to a base of an incline or the top of an incline.
The processor may be operable to set a target follow distance between the host vehicle and the target vehicle. The target follow distance may be set in dependence on the determined vertical offset between the host vehicle and the target vehicle.
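As a minimal sketch of this step, one possible dependence is a linear one; the base distance and gain below are hypothetical values, not taken from the specification:

```python
def target_follow_distance(base_distance_m, vertical_offset_m, gain=2.0):
    """Sketch: grow the target follow distance D1 as the vertical offset
    between host and target increases, so the host hangs back on a gradient.
    base_distance_m and gain are assumed, illustrative parameters."""
    return base_distance_m + gain * abs(vertical_offset_m)
```

On level ground the follow distance equals the base distance; on a gradient it increases with the offset.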
The processor may be operable to track an ascent/descent route of the target vehicle. The processor may translate the ascent/descent route onto a map, for example using GPS positioning technology. The processor may determine the elevation of the target vehicle along said ascent/descent route. The processor may be operable to assess one or more terrain characteristics in dependence on the elevation of the target vehicle and/or the determined vertical offset.
The processor may be operable to calculate a speed of the target vehicle along said ascent/descent route. The processor may optionally assess one or more terrain characteristics in dependence on the determined speed of the target vehicle. The processor may be operable to generate a location-based history of the speed of the target vehicle.
The one or more terrain characteristics may include an incline angle. The incline angle may be determined based on changes in the vertical offset. The processor may determine the incline angle in dependence on the tracked position of the target vehicle and/or a change in the vertical offset. The incline angle may be determined based on the rate of change of the vertical offset.
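A sketch of the incline-angle estimate, under the assumed geometry that the change in vertical offset is observed over a known horizontal distance travelled by the target between two samples:

```python
import math

def incline_angle_deg(delta_vertical_m, horizontal_travel_m):
    """Sketch: estimate the incline angle from the change in vertical offset
    (delta_vertical_m) over the target's horizontal travel between samples.
    Assumes both quantities are available from tracking; names are illustrative."""
    return math.degrees(math.atan2(delta_vertical_m, horizontal_travel_m))
```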
The signal received by the processor may be generated by one or more image sensors. The signal may comprise image data. The processor may be operable to process said image data to identify the target vehicle. The signal could be received from a remote sensor, for example installed in a drone or targeting system. The processor may be operable to process said image data to determine a pitch angle and/or a roll angle and/or a yaw angle of the target vehicle.
The processor may be operable to determine the vertical offset between the host vehicle and the target vehicle in dependence on a pitch angle and/or a roll angle of the host vehicle. The processor may compensate for the pitch angle and/or the roll angle of the host vehicle when determining the vertical offset.
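One simple compensation model, offered as an assumption rather than the disclosed method: if the host pitches nose-up, a target at a given range appears lower in the sensor frame by roughly the range times the sine of the pitch angle, so that term is added back.

```python
import math

def compensated_vertical_offset(measured_offset_m, range_m, pitch_rad):
    """Sketch: correct a sensor-measured vertical offset for host-vehicle
    pitch. Small-angle, single-axis model; a full implementation would also
    account for roll and sensor mounting geometry."""
    return measured_offset_m + range_m * math.sin(pitch_rad)
```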

The processor may be operable to output a host vehicle control signal in dependence on the determined vertical offset. The host vehicle control signal may comprise a signal to halt the host vehicle; and/or to set a target speed of the host vehicle. The host vehicle control signal may be operable to bring the host vehicle to a halt at a determined location. The location may be a determined ascent or descent commencement position determined by the processor in dependence on a rate of change in the vertical offset.
According to a further aspect of the present invention there is provided a control system for controlling operation of a host vehicle, the host vehicle being a land vehicle; wherein the control system comprises:
a controller comprising a processor and a memory device, the processor having an input for receiving a signal from at least one sensor, the processor being operable to execute a set of instructions stored on the memory device;
wherein, when executed, said instructions cause the processor to:
process the signal from said at least one sensor to identify a target vehicle which is a land vehicle; and
determine a vertical offset between the host vehicle and the target vehicle.
According to a further aspect of the present invention there is provided a vehicle comprising a control system as claimed in any one of the preceding claims. The vehicle may comprise a sensing means for generating said signal. The sensing means may comprise one or more optical sensors mounted to the vehicle.
According to a further aspect of the present invention for which protection is sought, there is provided a method of controlling operation of a host vehicle, the host vehicle being a land vehicle; wherein the method comprises:
receiving a signal from sensing means;
processing the signal to identify a target vehicle which is a land vehicle; and
determining a vertical offset between the host vehicle and the target vehicle.
The target vehicle may be in front of the host vehicle. The target vehicle may, for example, be the vehicle in front of the host vehicle in a convoy or may be a lead vehicle in a convoy. The host vehicle may be a following vehicle (i.e. a vehicle which is following the target vehicle). By determining the vertical offset between the host vehicle and the target vehicle, a rate of ascent/descent of the target vehicle may be determined.
The sensing means may be provided on the host vehicle.

The method may comprise determining a rate of change of the determined vertical offset. The method may comprise identifying a base of an incline and/or a crest of an incline in dependence on the rate of change of the determined vertical offset.
The method may comprise determining when the rate of change of the vertical offset decreases below a first threshold value; and/or determining when the rate of change of the vertical offset increases above a second threshold value. The method may comprise recording an ascent or descent commencement position.
The method may comprise setting a target follow distance between the host vehicle and the target vehicle in dependence on the determined vertical offset between the host vehicle and the target vehicle.
The method may comprise tracking an ascent/descent route of the target vehicle. The ascent/descent route may be recorded.
The method may comprise assessing one or more terrain characteristics in dependence on the determined vertical offset and the ascent/descent route of the target vehicle. The one or more terrain characteristics may include an incline angle. The method may comprise determining the incline angle based on a rate of change of the vertical offset.
The method may comprise estimating a speed of the target vehicle along said ascent/descent route; and optionally assessing one or more terrain characteristics in dependence on the determined speed of the target vehicle. The method may comprise generating a location-based speed history of the target vehicle.
The sensing means may comprise an image sensor. The signal may comprise image data. The method may comprise processing said image data to identify the target vehicle. The method may comprise processing said image data to determine a pitch angle and/or a roll angle and/or a yaw angle of the target vehicle.
The method may comprise determining the vertical offset between the host vehicle and the target vehicle in dependence on a pitch angle and/or a roll angle of the host vehicle.
The method may comprise setting a speed of the host vehicle in dependence on the determined vertical offset between the host vehicle and the target vehicle. The method may comprise outputting a host vehicle control signal in dependence on the determined vertical offset. The host vehicle control signal may comprise a signal to halt the host vehicle; and/or to adjust a speed of the host vehicle.
According to a further aspect of the present invention there is provided a target object tracking system for a vehicle, the target object tracking system comprising:
a processor for receiving image data captured by one or more sensors disposed on the vehicle, wherein the processor is configured to:
analyse the image data to identify image components;
determine a movement vector of each image component, the movement vectors each comprising a magnitude and a direction;
classify at least one of the image components as a target image component relating to the target object and at least one of the remaining image components as a non-target image component;
modify the movement vector of the at least one target image component in dependence on the movement vector of the or each non-target image component; and
track the target object in dependence on the modified movement vector of the at least one target image component. The non-target image component may correspond to a static or stationary feature. The target object tracking system modifies the movement vector of the at least one target image component in dependence on the movement vectors of the non-target image components. At least in certain embodiments, this modification may at least partially correct for changes in the position and/or orientation of the sensing means, for example as a result of movements of the vehicle. Applying this correction to any potential target image components may improve the object detection system, for example over a rough surface. The modified movement vector may provide more accurate positioning information of the target object relative to the vehicle.
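The correction described above can be sketched as follows. The specification does not fix a particular algorithm; treating the mean background (non-target) vector as the ego-motion estimate is an assumption:

```python
def ego_compensate(target_vec, background_vecs):
    """Sketch: subtract the mean movement vector of the non-target
    (static/background) image components from the target component's
    movement vector, partially correcting for sensor/vehicle motion.

    target_vec: (dx, dy) of the target image component.
    background_vecs: list of (dx, dy) vectors of non-target components.
    """
    n = len(background_vecs)
    mean_dx = sum(v[0] for v in background_vecs) / n
    mean_dy = sum(v[1] for v in background_vecs) / n
    # Residual motion attributable to the target object itself.
    return (target_vec[0] - mean_dx, target_vec[1] - mean_dy)
```

If the camera shakes over a rough surface, the background components share that apparent motion; subtracting it leaves the target's own motion relative to the scene.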
The processor may be configured to form at least a first set of said non-target image components. The first set may comprise a plurality of said non-target image components identified as having movement vectors in a first direction. The processor may form said first set by comparing the movement vectors of the image components and identifying at least one image component having a first movement vector comprising a first direction and/or a first magnitude. The processor may be configured to compare a rate of change of the movement vectors of the image components. For example, the processor may compare the rate of change of the magnitude and/or the direction of the movement vectors. The processor may be configured to identify at least one image component having a first movement vector comprising a first direction changing at a first rate and/or a first magnitude changing at a first rate. Thus, the first set may be formed of non-target image components having at least substantially the same direction.
The processor may be configured to compare the magnitude of the movement vectors of the non-target image components. The non-target image components in the first set may have substantially the same magnitude. Thus, the first set may be formed of non-target image components having at least substantially the same magnitude.
The processor may be configured to determine a correction factor in dependence on the movement vector of the non-target image components in said first set. Alternatively, or in addition, the processor may be configured to modify the movement vector of the at least one target image component by subtracting the movement vector of the non-target image components in said first set.
The processor may be configured to identify image components which are spatially separated from each other. For example, the processor may be configured to identify image components that are distal from each other within the image.
The image data may be video image data captured by one or more image sensors disposed on the vehicle. The processor may be configured to identify the or each image component as a persistent image component. A persistent image component is an image component which may be identified for a predetermined period of time, for example over successive frames of the video image.
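The persistence test can be sketched as simple per-component bookkeeping; the minimum frame count below is an assumed parameter, not specified in the disclosure:

```python
class PersistenceFilter:
    """Sketch: an image component is treated as persistent once it has been
    matched in a minimum number of successive video frames."""

    def __init__(self, min_frames=3):
        self.min_frames = min_frames  # assumed persistence window
        self.counts = {}              # component id -> consecutive-frame count

    def update(self, component_ids):
        """component_ids: ids of components detected in the current frame.
        Returns the set of ids currently considered persistent."""
        # Components missing from this frame lose their streak entirely.
        self.counts = {cid: self.counts.get(cid, 0) + 1
                       for cid in component_ids}
        return {cid for cid, n in self.counts.items()
                if n >= self.min_frames}
```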
The target object tracking system may be configured to track a moving target object. The target object may be a pedestrian or cyclist, for example. Alternatively, the target object may be a target vehicle. The target vehicle may be a wheeled vehicle, such as an automobile.
According to a further aspect of the present invention there is provided a vehicle comprising a target object acquisition system as described herein. The vehicle may comprise sensing means for generating the image data. The sensing means may comprise one or more image sensors, such as a camera. The vehicle may be a wheeled vehicle, such as an automobile.

According to a further aspect of the present invention there is provided a method of tracking a target object from a vehicle in dependence on image data captured by one or more sensors disposed on the vehicle; wherein the method comprises:
analysing the image data to identify image components;
determining a movement vector of each image component, the movement vectors each comprising a magnitude and a direction;
classifying at least one of the image components as a target image component relating to the target object and at least one of the remaining image components as a non-target image component;
modifying the movement vector of the at least one target image component in dependence on the movement vector of the or each non-target image component; and
tracking the target object in dependence on the modified movement vector of the at least one target image component.
The non-target image component may correspond to a static or stationary feature. The method may comprise forming at least a first set of said non-target image components. The first set may comprise a plurality of said non-target image components identified as having movement vectors in a first direction. The method may comprise forming said first set by comparing the movement vectors of the image components. The method may comprise identifying at least one image component having a first movement vector comprising a first direction and/or a first magnitude. The method may comprise forming said first set by comparing the rate of change of the movement vectors of the image components. For example, the method may comprise comparing the rate of change of the magnitude and/or the direction of the movement vectors. The method may comprise identifying at least one image component having a first movement vector comprising a first direction changing at a first rate and/or a first magnitude changing at a first rate.
The method may comprise comparing the magnitude of the movement vectors of the non-target image components. The non-target image components in the first set may have substantially the same magnitude.
The method may comprise modifying the movement vector of the at least one target image component by subtracting the movement vector of the non-target image components in said first set.
The method may comprise identifying image components in the image data which are spatially separated from each other.

The image data may be video image data captured by one or more image sensors disposed on the vehicle; and the or each image component is a persistent image component. A persistent image component is an image component which may be identified for a predetermined period of time, for example over successive frames of the video image.
The method may comprise tracking a moving target object. The target object may be a pedestrian or cyclist, for example. Alternatively, the target object may be a target vehicle. The target vehicle may be a wheeled vehicle, such as an automobile.
According to a further aspect of the present invention there is provided a non-transitory computer-readable medium having a set of instructions stored therein which, when executed, cause a processor to perform the method(s) described herein.
According to a further aspect of the present invention there is provided a target object acquisition system for a vehicle, the target object acquisition system comprising:
a processor for receiving image data captured by one or more sensors disposed on the vehicle, wherein the processor is configured to:
analyse the image data to identify image components;
determine a movement vector of each identified image component, the movement vectors each having a magnitude and a direction;
form a first set comprising a plurality of said image components having a first movement vector, and classify the image components in said first set as non-target image components;
form a second set comprising an image component having a second movement vector, the second movement vector being different from the first movement vector, and classify the or each image component in said second set as a target image component relating to the target object; and
acquire the target object in dependence on the target image component in said second set.
The non-target image component may correspond to a static or stationary feature. The first set may comprise a plurality of image components; and the second set may consist of a single image component.
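A sketch of this set-forming step, under the assumption that the first (non-target) set is the group of components sharing the most common movement vector, and the second (target) set is whatever remains:

```python
from collections import Counter

def split_sets(vectors):
    """Sketch: partition image components into a non-target set (components
    whose movement vector matches the most common vector, i.e. the static
    background) and a target set (components moving differently).

    vectors: dict mapping component id -> (dx, dy) movement vector.
    Returns (non_target_ids, target_ids) as sets.
    """
    # The dominant shared vector is taken to be the background motion.
    common, _ = Counter(vectors.values()).most_common(1)[0]
    non_target = {cid for cid, v in vectors.items() if v == common}
    target = set(vectors) - non_target
    return non_target, target
```

A real implementation would cluster with a tolerance rather than require exact equality; exact matching keeps the sketch short.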
The processor may form said first set by comparing the movement vectors of the image components and identifying at least one image component having a first movement vector comprising a first direction and/or a first magnitude. The processor may be configured to compare a rate of change of the movement vectors of the image components. For example, the processor may compare the rate of change of the magnitude and/or the direction of the movement vectors. The processor may be configured to identify at least one image component having a first movement vector comprising a first direction changing at a first rate and/or a first magnitude changing at a first rate.
The processor may form said second set by comparing the movement vectors of the image components and identifying at least one image component having a second movement vector comprising a second direction and/or a second magnitude. The processor may be configured to compare a rate of change of the movement vectors of the image components. For example, the processor may compare the rate of change of the magnitude and/or the direction of the movement vectors. The processor may be configured to identify at least one image component having a second movement vector comprising a second direction changing at a second rate and/or a second magnitude changing at a second rate.
The first direction and the second direction may be different from each other; and/or the first magnitude and the second magnitude may be different from each other.
The image components identified in the image data may be spatially separated from each other. For example, the processor may be configured to identify image components that are distal from each other within the image.
The techniques described herein for correcting the movement vector of the at least one target image component are applicable to the target object acquisition system. The processor may be configured to modify the movement vector of the at least one target image component in dependence on the movement vector of the or each non-target image component.
The image data may be video image data captured by one or more image sensors disposed on the vehicle. The or each image component may be a persistent image component. A persistent image component is an image component which may be identified for a predetermined period of time, for example over successive frames of the video image.
The processor may be configured to acquire a moving target object. The target object may be a pedestrian or cyclist, for example. Alternatively, the target object may be a target vehicle. The target vehicle may be a wheeled vehicle, such as an automobile.

According to a further aspect of the present invention there is provided a vehicle comprising a target object tracking system as described herein. The vehicle may comprise sensing means for generating the image data. The sensing means may comprise one or more image sensors, such as a camera. The vehicle may be a wheeled vehicle, such as an automobile.
According to a further aspect of the present invention there is provided a method of acquiring a target object from a vehicle in dependence on image data captured by one or more sensors disposed on the vehicle; wherein the method comprises:
analysing the image data to identify image components;
determining a movement vector of each identified image component, the movement vectors each having a magnitude and a direction;
forming a first set comprising a plurality of said image components having a first movement vector, and classifying the image components in said first set as non-target image components;
forming a second set comprising an image component having a second movement vector, the second movement vector being different from the first movement vector, and classifying the or each image component in said second set as a target image component relating to the target object; and
acquiring the target object in dependence on the target image component in said second set.
The non-target image component may correspond to a static or stationary feature. The first set may comprise a plurality of image components. The second set may consist of a single image component.
The method may comprise forming said first set by comparing the movement vectors of the image components. The method may comprise identifying at least one image component having a first movement vector comprising a first direction and/or a first magnitude. The method may comprise forming said first set by comparing the rate of change of the movement vectors of the image components. For example, the method may comprise comparing the rate of change of the magnitude and/or the direction of the movement vectors. The method may comprise identifying at least one image component having a first movement vector comprising a first direction changing at a first rate and/or a first magnitude changing at a first rate.

The method may comprise forming said second set by comparing the movement vectors of the image components. The method may comprise identifying at least one image component having a second movement vector comprising a second direction and/or a second magnitude. The method may comprise forming said second set by comparing the rate of change of the movement vectors of the image components. For example, the method may comprise comparing the rate of change of the magnitude and/or the direction of the movement vectors. The method may comprise identifying at least one image component having a second movement vector comprising a second direction changing at a second rate and/or a second magnitude changing at a second rate.
The first direction and the second direction may be different from each other. The first magnitude and the second magnitude may be different from each other.
The method may comprise identifying image components in the image data which are spatially separated from each other.
The method may comprise modifying the movement vector of the at least one target image component in dependence on the movement vector of the or each non-target image component.
The image data may be video image data captured by one or more image sensors disposed on the vehicle. The or each image component is a persistent image component.
According to a further aspect of the present invention there is provided a control system for controlling operation of a host vehicle, the host vehicle being a land vehicle; wherein the control system comprises:
a processor operable to track a target vehicle and generate a target vehicle trace;
correlate a host vehicle location and/or a host vehicle trajectory with the target vehicle trace; and
output a deviation signal for controlling a vehicle control module upon determination that the correlation between the host vehicle location and/or the host vehicle trajectory and the target vehicle trace exceeds a predefined deviation threshold.
The target vehicle may be in front of the host vehicle. The target vehicle may, for example, be the vehicle in front of the host vehicle in a convoy or may be a lead vehicle in a convoy. The host vehicle may be a following vehicle (i.e. a vehicle which is following the target vehicle). By correlating the host vehicle location and/or the host vehicle trajectory with the target vehicle trace, the processor may determine when the host vehicle is no longer following the target vehicle. When the processor determines that the host vehicle is no longer following the target vehicle, the deviation signal may be output to the vehicle control module to inhibit or to cancel operation of the vehicle control module in dependence on the target vehicle trace. The target vehicle trace may comprise a (geospatial) route followed by the target vehicle. The correlation may be performed as the host vehicle travels along the same route.
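The deviation check can be sketched as follows, under the assumption that the trace is a list of 2-D waypoints and that "deviation" means the host position is further from every trace point than the threshold. The threshold value is hypothetical:

```python
import math

def deviates_from_trace(host_pos, trace, threshold_m=5.0):
    """Sketch: return True if the host vehicle location has left the target
    vehicle trace, i.e. its distance to the nearest trace point exceeds the
    predefined deviation threshold (assumed value).

    host_pos: (x, y) host location; trace: list of (x, y) trace points.
    """
    hx, hy = host_pos
    min_dist = min(math.hypot(hx - x, hy - y) for x, y in trace)
    return min_dist > threshold_m
```

A production system would measure distance to the trace polyline segments (and consider trajectory heading), but the nearest-point test shows the thresholding idea.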
At least in certain embodiments, the host vehicle and the target vehicle are both land vehicles. The host vehicle and the target vehicle may be wheeled vehicles.
At least in certain embodiments, the processor may determine if a target vehicle identified in
10 front of the host vehicle represents a valid lead vehicle or an invalid lead vehicle. The
processor may thereby determine if operation of the vehicle control module in dependence
on the target vehicle trace is a valid control strategy or an invalid control strategy. The
processor may be configured to output a correlation signal for controlling the vehicle control
module upon determination that the correlation between the host vehicle location and/or the
15 host vehicle trajectory and the target vehicle trace is within the predefined deviation
threshold. The correlation signal may provide a positive indication that the host vehicle location and/or the host vehicle trajectory are within the predefined deviation threshold.
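By way of illustration, the deviation/correlation decision described above may be sketched as follows. The sketch assumes the target vehicle trace is reduced to a list of recorded (x, y) locations; the function and variable names are illustrative, not part of the disclosure.

```python
import math

def correlation_signal(trace, host_location, deviation_threshold):
    """Return 'deviation' if the host has left the target vehicle's route,
    'correlation' otherwise.  `trace` is a list of (x, y) locations
    previously recorded for the target vehicle."""
    # Distance from the host location to the nearest recorded trace point.
    nearest = min(math.dist(host_location, p) for p in trace)
    return "deviation" if nearest > deviation_threshold else "correlation"
```

The output could then be used to inhibit or cancel a follow-mode controller, as described in the text.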
According to a further aspect of the present invention there is provided a control system for
controlling operation of a host vehicle, the host vehicle being a land vehicle; wherein the
control system comprises:
a processor operable to track a target vehicle and generate a target vehicle trace; correlate a host vehicle location and/or a host vehicle trajectory with the target vehicle trace; and
output a correlation signal for controlling a vehicle control module upon
determination that the correlation between the host vehicle location and/or the host vehicle trajectory and the target vehicle trace is within a predefined deviation threshold. At least in certain embodiments, the processor may determine if a target vehicle identified in front of the host vehicle represents a valid lead vehicle or an invalid lead vehicle. The processor may
thereby determine if operation of the vehicle control module in dependence on the target
vehicle trace is a valid control strategy or an invalid control strategy. If the target vehicle is a valid lead vehicle, the processor may activate the vehicle control module to control the host vehicle in dependence on the target vehicle trace. If the target vehicle is an invalid lead vehicle, the processor may de-activate the vehicle control module or inhibit operation of the
vehicle control module in dependence on the target vehicle trace.
The vehicle control module may be configured to control dynamic operation of the host
vehicle. For example, the vehicle control module may control one or more of the following
set: a steering angle; a traction torque; a braking torque. The vehicle control module may, for
example, comprise a cruise control module for the host vehicle. At least in certain
embodiments, the vehicle control module is operable in dependence on the target vehicle
trace. For example, the vehicle control module may operate at least substantially to match dynamic operation of the host vehicle to the target vehicle trace.
The target vehicle trace may comprise one or more of the following: a target vehicle location;
a target vehicle speed; and a target vehicle trajectory. The processor may correlate the
respective location and/or speed and/or trajectory of the host vehicle and the target vehicle.
The target vehicle trace may comprise (geospatial) location data for the target vehicle, for
example defining the target vehicle route. The target vehicle route may comprise a target
route for the following vehicle.
The target vehicle trace may comprise target vehicle speed and/or target vehicle trajectory
along the target vehicle route. The target vehicle speed may comprise a target speed for the
host vehicle. The target vehicle trajectory may comprise a target trajectory for the host
vehicle.
The target vehicle may be tracked continuously or periodically, for example at predetermined
temporal or spatial intervals. The target vehicle location and/or the target vehicle speed
and/or target vehicle trajectory may be determined continuously. Alternatively, the target
vehicle location and/or the target vehicle speed and/or target vehicle trajectory may be
determined at a plurality of intervals.
The deviation threshold may comprise a deviation distance threshold. The processor may be
operable to determine when the distance between a current location of the host vehicle and
the route taken by the target vehicle exceeds the deviation distance threshold.
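A minimal sketch of the deviation distance test, assuming the route taken by the target vehicle is stored as a polyline of (x, y) locations (the representation and all names are assumptions of this sketch):

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the line segment a-b (2-D tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping to its end points.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def distance_to_route(host_location, route):
    """Shortest distance from the current host location to the target
    vehicle route, treated as a polyline through the recorded locations."""
    return min(point_segment_distance(host_location, a, b)
               for a, b in zip(route, route[1:]))
```

The deviation distance threshold would then be compared against `distance_to_route(...)`.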
The host vehicle trajectory may be determined in dependence on a steering angle of the
host vehicle. Alternatively, or in addition, the host vehicle trajectory may be determined in
dependence on an inertial signal, for example generated by one or more accelerometers
and/or gyroscopes disposed in said host vehicle.
The target vehicle trace may optionally also comprise a trajectory of the target vehicle as it travels along said route. The predefined deviation threshold may comprise a divergence angle between a trajectory of the host vehicle and a trajectory of the target vehicle defined in the target vehicle trace.
The target vehicle route may be used to define a target route to be taken by the host vehicle.
The processor may be operable to control a steering angle of the host vehicle to cause the
host vehicle to follow the target route. The processor may, for example, control an electric
power assist steering (EPAS) system. At least in certain embodiments, the control system
described herein may provide autonomous or semi-autonomous steering control of the host
vehicle.
Alternatively, or in addition, the target vehicle trace may comprise a speed of the target
vehicle as it travels along the route. The predefined deviation threshold may comprise a
speed threshold between a speed of the host vehicle and a speed of the target vehicle.
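The two deviation measures described above (divergence angle and speed difference) might be combined in a single check; a sketch, in which the threshold values are invented for illustration:

```python
def divergence_angle(host_heading_deg, target_heading_deg):
    """Smallest angle (degrees) between host and target trajectories."""
    d = abs(host_heading_deg - target_heading_deg) % 360.0
    return min(d, 360.0 - d)

def within_thresholds(host_heading, target_heading, host_speed, target_speed,
                      angle_threshold_deg=20.0, speed_threshold=5.0):
    """True while both deviation measures stay inside the predefined
    thresholds (the default values are illustrative, not from the source)."""
    return (divergence_angle(host_heading, target_heading) <= angle_threshold_deg
            and abs(host_speed - target_speed) <= speed_threshold)
```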
The processor may be operable to receive a signal from at least one sensor provided on the
host vehicle. The processor may be operable to process said signal to generate the target
vehicle trace. The signal may comprise image data captured by one or more image sensors.
The processor may be operable to process said image data to generate the target vehicle
trace. Alternatively, or in addition, the at least one sensor may comprise one or more of the following set: a LIDAR sensor, a RADAR sensor, an ultrasonic sensor.
According to a further aspect of the present invention there is provided a control system for
controlling operation of a host vehicle, the host vehicle being a land vehicle; wherein the
control system comprises:
a controller comprising a processor and a memory device, the processor being configured to execute a set of instructions stored on the memory device;
wherein, when executed, said instructions cause the processor to:
30 track a target vehicle and generate a target vehicle trace;
correlate a host vehicle location and/or a host vehicle trajectory with the target vehicle trace; and
output a deviation signal for controlling a vehicle control module if the
correlation between the host vehicle location and/or the host vehicle trajectory and
the target vehicle trace exceeds a predefined deviation threshold.
According to a further aspect of the present invention there is provided a vehicle comprising a control system as described herein.
According to a further aspect of the present invention there is provided a method of
controlling operation of a host vehicle, the host vehicle being a land vehicle; wherein the
method comprises:
tracking a target vehicle and generating a target vehicle trace; and
correlating a location and/or a trajectory of the host vehicle with the target vehicle trace; and
outputting a deviation signal if the correlation between the host vehicle location
and/or the host vehicle trajectory and the target vehicle trace exceeds a predefined deviation threshold. The deviation signal may be suitable for controlling operation of a vehicle control module. For example, the vehicle control module may revert to a set target speed in dependence on said deviation signal.
The method may comprise outputting a correlation signal for controlling the vehicle control module upon determination that the correlation between the host vehicle location and/or the host vehicle trajectory and the target vehicle trace is within the predefined deviation threshold.
According to a further aspect of the present invention there is provided a method of controlling operation of a host vehicle, the host vehicle being a land vehicle; wherein the method comprises:
tracking a target vehicle and generating a target vehicle trace; and
correlating a location and/or a trajectory of the host vehicle with the target vehicle
trace; and
outputting a correlation signal for controlling the vehicle control module upon determination that the correlation between the host vehicle location and/or the host vehicle trajectory and the target vehicle trace is within the predefined deviation threshold. At least in
certain embodiments, the method may enable determination of whether a target vehicle
identified in front of the host vehicle represents a valid lead vehicle or an invalid lead vehicle. If the target vehicle is a valid lead vehicle, the method may comprise controlling the host vehicle in dependence on the target vehicle trace. If the target vehicle is an invalid lead vehicle, the method may comprise inhibiting control of the host vehicle in dependence on the
target vehicle trace.
The method may comprise determining the host vehicle trajectory in dependence on a steering angle of the host vehicle.
The predefined deviation threshold may comprise a divergence angle between a trajectory of
the host vehicle and a trajectory of the target vehicle defined in the target vehicle trace.
The predefined deviation threshold may comprise a distance between a current location of the host vehicle and a location of the target vehicle defined in the target vehicle trace.
The method may comprise receiving a signal from at least one sensor provided on the host vehicle. The method may comprise processing said signal to generate the target vehicle trace. The signal may comprise image data captured by one or more image sensors. The image data may be processed to generate the target vehicle trace. Alternatively, or in addition, the at least one sensor may comprise one or more of the following set: a LIDAR sensor, a RADAR sensor, an ultrasonic sensor.
According to a further aspect of the present invention there is provided a non-transitory computer-readable medium having a set of instructions stored therein which, when executed, cause a processor to perform the method(s) described herein.
The host vehicle may be a land vehicle. The target vehicle may be a land vehicle. The term “land vehicle” is used herein to refer to a vehicle configured to apply steering and drive (traction) forces against the ground. The vehicle may, for example, be a wheeled vehicle or a tracked vehicle.
The term “location” is used herein to refer to the relative position of an object on the surface of the earth. Unless indicated to the contrary, either explicitly or implied by the context, references herein to the location of an object refer to the geospatial location of that object.
It is to be understood that by the term 'type of terrain' is meant the material comprised by the
terrain over which the vehicle is driving such as asphalt, grass, gravel, snow, mud, rock and/or sand. By 'off-road' is meant a surface traditionally classified as off-road, being surfaces other than asphalt, concrete or the like. For example, off-road surfaces may be relatively compliant surfaces such as mud, sand, grass, earth, gravel or the like. Alternatively
or in addition off-road surfaces may be relatively rough, for example stony, rocky, rutted or
the like. Accordingly in some arrangements an off-road surface may be classified as a
surface that has a relatively high roughness and/or compliance compared with a substantially flat, smooth asphalt or concrete road surface.
Any control unit or controller described herein may suitably comprise a computational device
having one or more electronic processors. The system may comprise a single control unit or
electronic controller or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein the term “controller” or “control unit” will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control
functionality. To configure a controller or control unit, a suitable set of instructions may be
provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memory devices associated with said controller
to be executed on said computational device. The control unit or controller may be
implemented in software run on one or more processors. One or more other control units or controllers may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all
embodiments and/or features of any embodiment can be combined in any way and/or
combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the present invention will now be described, by way of
example only, with reference to the accompanying Figures, in which:
Figure 1 shows a plan view of a host vehicle incorporating a control system in
accordance with an embodiment of the present invention;
Figure 2 shows a side elevation of the host vehicle shown in Figure 1 incorporating the control system in accordance with an embodiment of the present invention;
Figure 3 shows a schematic representation of the control system incorporated into the host vehicle shown in Figures 1 and 2;
Figure 4 illustrates the operation of the control system to set a target follow distance
for a vehicle control module of the host vehicle;
Figure 5 illustrates the operation of the control system to calculate a target vehicle
trace;
Figure 6 illustrates an instrument cluster generating a driver notification;
Figure 7 shows a schematic representation of the control system in accordance with
a further embodiment of the present invention;
Figure 8 shows a comparison of the path of the host vehicle to the tracked
movements of the target vehicle; and
Figure 9 shows determination of a deviation between the path of the host vehicle and the tracked movements of the target vehicle.
DETAILED DESCRIPTION
A control system 1 for controlling operation of a host vehicle 2 in accordance with an embodiment of the present invention will now be described with reference to the accompanying figures.
The control system 1 is described herein with reference to a host vehicle reference frame
comprising a longitudinal axis X, a transverse axis Y and a vertical axis Z. The control system 1 is operative to assist in the control of the host vehicle 2 particularly, but not exclusively, in an off-road driving scenario. As shown in Figures 1 and 2, the control system 1 is operable to track a target vehicle 3 in front of the host vehicle 2. The target vehicle 3
may, for example, be a lead vehicle or the vehicle in front of the host vehicle 2 in a convoy.
The host vehicle 2 and the target vehicle 3 are both land vehicles (i.e. vehicles configured to apply steering and drive (traction) forces against the ground).
The host vehicle 2 is a wheeled vehicle having four wheels W1-4. A torque is transmitted to
the wheels W1-4 to apply a tractive force to propel the host vehicle 2. The torque is
generated by one or more torque generating machines, such as an internal combustion
engine or an electric traction machine, and transmitted to the driven wheels W1-4 via a
vehicle powertrain. The host vehicle 2 in the present embodiment has four-wheel drive and,
in use, torque is transmitted selectively to each of said wheels W1-4. It will be understood
that the control system 1 could also be installed in a host vehicle 2 having two-wheel drive.
The host vehicle 2 in the present embodiment is an automobile having off-road driving capabilities. For example, the host vehicle 2 may be capable of driving on an unmetalled
road, such as a dirt road or track. The host vehicle 2 may, for example, be a sports utility vehicle (SUV) or a utility vehicle, but it will be understood that the control system 1 may be installed in other types of vehicle. The control system 1 may be installed in other types of wheeled vehicles, such as light, medium or heavy trucks. The target vehicle 3 is a wheeled vehicle, such as an automobile or an off-road vehicle. The target vehicle 3 may have the same configuration as the host vehicle 2 or may have a different configuration.
A schematic representation of the control system 1 installed in the host vehicle 2 is shown in Figure 3. The control system 1 comprises a controller 4 having at least one electronic
processor 5 and a memory 6. The processor 5 is operable to receive a data signal S1 from a sensing means 7. As described herein, the processor 5 is operable to process the data signal S1 to track the target vehicle 3. The processor 5 is operable to control operation of the host vehicle 2 in dependence on the relative location of the target vehicle 3. As described herein, the processor 5 is operable to control a target follow distance D1 in dependence on a
determined vertical offset ΔV between the host vehicle 2 and the target vehicle 3. The target follow distance D1 is a target distance to be maintained between the host vehicle 2 and the target vehicle 3. The processor 5 is operable to output a target follow distance signal SD1 to a vehicle control module. In the present embodiment, the vehicle control module comprises a cruise control module 8. The cruise control module 8 is operable in a follow mode to
control the host vehicle 2 to follow the target vehicle 3. When the follow mode is activated, the cruise control module 8 controls the host vehicle target speed to maintain the target follow distance D1 between the host vehicle 2 and the target vehicle 3. The cruise control module 8 may output a target speed signal SV1 to an engine control module 9 which controls the output torque transmitted to the wheels W1-4. The cruise control module 8 may
also generate a brake control signal for controlling a braking torque applied to said wheels W1-4.
The processor 5 is operable to receive a reference speed signal VRef for indicating a current host vehicle reference speed. The reference speed signal VRef may, for example, be
generated by one or more wheel speed sensors 10. The processor 5 is operable to receive a vehicle attitude signal VAtt for indicating a current attitude of the host vehicle 2 about at least one axis. The vehicle attitude signal VAtt comprises one or more of the following: a vehicle pitch signal Sα indicating a vehicle pitch angle α; and a vehicle roll signal Sβ indicating a vehicle roll angle β. The vehicle attitude signal VAtt is generated by a vehicle attitude sensor
module 11 comprising one or more accelerometers and/or gyroscopes. The vehicle pitch angle α and the vehicle roll angle β define the current angular orientation of the longitudinal
axis X and the transverse axis Y respectively to a horizontal plane. The vehicle attitude signal VAtt may also comprise a vehicle yaw signal Sγ indicating a vehicle yaw angle γ.
As shown in Figure 3, the sensing means 7 is mounted in a forward-facing orientation to establish a detection region in front of the host vehicle 2. The sensing means 7 comprises at least one optical sensor 12 mounted to the host vehicle 2. The sensing means 7 may comprise a single camera. Alternatively, the sensing means 7 may comprise a stereoscopic camera. The at least one optical sensor 12 may be mounted at the front of the vehicle, for example incorporated into a front bumper or engine bay grille; or may be mounted within the
vehicle cabin, for example in front of a rear-view mirror. The at least one optical sensor 12 has a field of view FOV having a central optical axis VX extending substantially parallel to a longitudinal axis X of the host vehicle 2. The field of view FOV is generally conical in shape and extends in horizontal and vertical directions. The at least one optical sensor 12 comprises a digital imaging sensor for capturing image data. The at least one optical sensor
12 captures image data substantially in real-time, for example at 30 frames per second. The at least one optical sensor 12 in the present embodiment is operable to detect light in the visible spectrum of light. The sensing means 7 comprises optics (not shown) for directing the incident light onto an imaging sensor, such as a charge-coupled device (CCD), operable to generate image data for transmission in the data signal S1. Alternatively, or in addition, the
sensing means 7 may be operable to detect light outside of the visible light spectrum, for example in the infra-red range to generate a thermographic image. Alternatively, or in addition, the sensing means 7 may comprise a Lidar sensor for projecting a laser light in front of the host vehicle 2. Other types of sensor are also contemplated.
The sensing means 7 is connected to the controller 4 over a communication bus 13 provided in the host vehicle 2. The data signal S1 is published to the communication bus 13 by the sensing means 7. In the present embodiment, the connection between the sensing means 7 and the controller 4 comprises a wired connection. In alternative embodiments, the connection between the sensing means 7 and the controller 4 may comprise a wireless
connection, for example to enable remote positioning of the sensing means 7. By way of example, the sensing means 7 may be provided in a remote targeting system, such as a drone vehicle. The processor 5 is operable to read the data signal S1 from the communication bus 13. The processor 5 extracts the image data from the data signal S1 and implements an image processing algorithm to identify the target vehicle 3 within the image
data. The image processing algorithm may recognise a shape or profile of the target vehicle 3, for example using pattern matching techniques. Alternatively, or in addition, the image processing algorithm may identify the target vehicle 3 based on relative movement of the
target vehicle 3 between frames of the image data. The image processing algorithm may
optionally use a known vehicle colour and/or vehicle type to identify the target vehicle 3 in
the image data. By applying the image processing algorithm to successive frames of the
image data, the processor 5 tracks the movement of the target vehicle 3. The processor 5 may also calculate a distance (range) to the target vehicle 3 from the host vehicle 2. The
processor 5 may compare the size of the target vehicle 3 in the image data to a reference image taken at a known distance between the host vehicle 2 and the target vehicle 3. The reference image could, for example, be captured during a calibration phase.
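The size-comparison ranging step lends itself to a one-line model: under a pinhole-camera assumption, apparent size scales inversely with distance. A sketch (names and the pinhole simplification are assumptions of this illustration):

```python
def estimate_range(pixel_width, ref_pixel_width, ref_distance):
    """Estimate range to the target vehicle from its apparent width in the
    image, using a reference image captured at a known distance during a
    calibration phase.  Under a pinhole-camera model, apparent size is
    inversely proportional to distance:
        range = ref_distance * ref_pixel_width / pixel_width"""
    return ref_distance * ref_pixel_width / pixel_width
```

For example, a target appearing half as wide as in a reference image taken at 10 m would be estimated at 20 m.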
The processor 5 may optionally calculate an attitude of the target vehicle 3, for example to
calculate a pitch angle and/or a roll angle and/or a yaw angle of the target vehicle 3. The processor 5 may, for example, analyse the image data to identify one or more edges and/or one or more sides of the target vehicle 3. The processor 5 may compare the edges and/or sides identified in the image data to a virtual model of a vehicle to determine the attitude of
the target vehicle 3. The virtual model could be generated based on one or more images of
the target vehicle 3 captured during a calibration phase.
As shown in Figures 2 and 4, the processor 5 is operable to determine the vertical offset ΔV (relative elevation) of the target vehicle 3 in relation to the host vehicle 2. The vertical offset
ΔV is calculated from a predetermined reference point on the host vehicle 2 and an
identifiable reference point on the target vehicle 3, such as a base of the body or a wheel centre of the target vehicle 3. The processor 5 receives the vehicle pitch signal Sα and the vehicle roll signal Sβ indicating the vehicle pitch angle α and the vehicle roll angle β of the host vehicle 2. The processor 5 uses the vehicle pitch signal Sα to determine the orientation
of the central optical axis VX in a vertical plane and thereby determines the orientation of the
field of view FOV1 in three-dimensional space. The processor 5 identifies the target vehicle 3 in the image data and determines an elevation angle α1 (positive or negative) relative to the central optical axis VX. The processor 5 sums the elevation angle α1 and the known pitch angle α and the total is used to calculate the vertical offset ΔV of the target vehicle 3 in
dependence on the distance to the target vehicle 3. A positive value for the vertical offset ΔV
indicates that the target vehicle 3 is at a higher elevation (i.e. above the height of) the host vehicle 2; and a negative value for the vertical offset ΔV indicates that the target vehicle 3 is at a lower elevation (i.e. below the height of) the host vehicle 2. It will be understood that the vehicle roll signal Sβ may alter the apparent vertical offset ΔV depending on the position of
the target vehicle 3 within the image. The processor 5 may also apply a correction based on
the known vehicle roll angle β of the host vehicle 2.
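The geometry described above may be sketched as follows; the roll correction is omitted for brevity and all names are illustrative:

```python
import math

def vertical_offset(range_m, elevation_angle_deg, pitch_deg):
    """Vertical offset ΔV of the target vehicle relative to the host.
    The elevation angle of the target relative to the camera's central
    optical axis is summed with the host pitch angle to obtain the angle
    relative to the horizontal; the offset then follows from the
    measured range.  A positive result means the target is above the
    host; a negative result means it is below."""
    total = math.radians(elevation_angle_deg + pitch_deg)
    return range_m * math.sin(total)
```

For instance, a target 100 m away seen 20° above the optical axis while the host is pitched up 10° lies roughly 50 m higher than the host.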
The processor 5 may track the target vehicle 3 to determine a target vehicle route, i.e. the
route taken by the target vehicle 3. The processor 5 may optionally analyse the data signal
S1 to calculate a target vehicle speed and/or a target vehicle trajectory as the target vehicle
3 travels along said target vehicle route. The speed and/or trajectory may be calculated at
known time or spatial intervals along said target vehicle route. The processor 5 may store
the calculated target vehicle speed to build a speed profile for the target vehicle 3 as it travels along a route. The calculated speed and/or trajectory (direction of travel) of the target vehicle 3 may optionally be stored along with an instant location of the target vehicle 3 (either known or calculated). The calculated speed and/or trajectory at any given location
may be defined as a movement vector Vn having a magnitude (representing the target
vehicle speed) and direction (representing the trajectory of the target vehicle 3). As illustrated in Figure 5, multiple movement vectors Vn can be calculated for the target vehicle 3 to generate a target vehicle trace 14. The target vehicle trace 14 represents the (geospatial) target vehicle route as well as the target vehicle speed along that route. The
target vehicle trace 14 may optionally also comprise the trajectory of the target vehicle 3
along the target vehicle route. The position of each movement vector Vn indicates a calculated position of the target vehicle 3. The length of the movement vectors Vn represents a calculated target vehicle speed at that location; and the direction of the movement vectors Vn represents a trajectory of the target vehicle 3 at that location. The
processor 5 stores each movement vector Vn in conjunction with the calculated location of
the target vehicle 3 to build the target vehicle trace 14. The movement vectors Vn may be calculated at predetermined time intervals, for example one calculation per second. The movement vectors Vn may optionally be time-stamped to indicate the time at which they were generated. The location of the target vehicle 3 may be calculated relative to the host
vehicle 2 (for example a separation distance and/or orientation). Alternatively, the location of
the target vehicle 3 may comprise a real-world location, for example determined with reference to positional data (for example GPS data) and/or map data. The movement vectors Vn can be stored as a sequence along a target vehicle route. The target vehicle trace 14 may subsequently be used by the processor 5 as an input for controlling the host
vehicle 2, for example to match a host vehicle target speed to a calculated target vehicle
speed at a corresponding location on a route. The target vehicle trace 14 illustrated in Figure 5 is defined in two dimensions corresponding to a plan view of the target vehicle route. However, it will be appreciated that the target vehicle trace 14 may be generated in three dimensions, for example to represent the elevation of the target vehicle 3.
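One possible encoding of the movement vectors Vn, assuming the target vehicle locations are sampled at known timestamps (the data layout and names are assumptions of this sketch, not prescribed by the disclosure):

```python
import math
from dataclasses import dataclass

@dataclass
class MovementVector:
    """One sample of the target vehicle trace: location, speed, trajectory."""
    timestamp: float
    location: tuple      # (x, y) position, relative or geospatial
    speed: float         # magnitude of the movement vector
    heading_deg: float   # direction of the movement vector (trajectory)

def build_trace(samples):
    """Derive movement vectors from successive (timestamp, (x, y))
    location samples, e.g. one sample per second."""
    trace = []
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        dx, dy = p1[0] - p0[0], p1[1] - p0[1]
        dt = t1 - t0
        trace.append(MovementVector(
            timestamp=t1,
            location=p1,
            speed=math.hypot(dx, dy) / dt,
            heading_deg=math.degrees(math.atan2(dy, dx))))
    return trace
```

The stored sequence can then serve as the target route, speed profile and trajectory record described in the text.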
The speed at which the target vehicle 3 is travelling may be calculated in dependence on a detected movement of the target vehicle 3 between frames of the image data. The processor
5 may compensate for movements of the host vehicle 2 when estimating the speed at which
the target vehicle 3 is travelling. In particular, the processor 5 may take account of the
host vehicle reference speed (which is known from the reference speed signal VRef)
and/or angular movements of the host vehicle 2 (which are known from the vehicle roll signal
Sβ and/or the vehicle pitch signal Sα and/or the vehicle yaw signal Sγ). The movements
the target vehicle 3 within the image data may thereby be adjusted to compensate for movements of the host vehicle 2 (which affect the orientation of the sensing means 7).
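When the two vehicles are roughly in line, the compensation reduces to a simple range-rate model (angular corrections omitted; all names are illustrative):

```python
def target_speed_estimate(host_speed, prev_range, curr_range, dt):
    """Estimate the target vehicle speed from the host's own reference
    speed and the change in measured range between frames: if the gap
    is opening, the target is travelling faster than the host."""
    range_rate = (curr_range - prev_range) / dt
    return host_speed + range_rate
```

For example, a host travelling at 20 m/s that sees the range grow from 30 m to 31 m over one second would estimate the target speed at 21 m/s.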
The control system 1 is operative to assist in the dynamic control of the host vehicle 2. The
processor 5 outputs the target follow distance signal SD1 to the communication bus 13 to control
dynamic operation of the host vehicle 2 in dependence on the determined vertical offset ΔV between the host vehicle 2 and the target vehicle 3. In the present embodiment, the control system 1 is operative to set the target follow distance D1 between the host vehicle 2 and the target vehicle 3. The target follow distance D1 is the distance to be maintained between the
host vehicle 2 and the target vehicle 3. In the present embodiment, the control system 1 sets
the target follow distance D1 in dependence on the determined vertical offset ΔV. The target follow distance D1 could be calculated dynamically in dependence on the determined vertical offset ΔV. The target follow distance D1 may be calculated in dependence on one or more of the following factors: the determined vertical offset ΔV; the host vehicle reference speed; the calculated target vehicle speed; the pitch angle α of the host vehicle 2; and
the pitch angle of the target vehicle 3. In the present embodiment, the processor 5 is operable to access a lookup table T1 stored in the memory 6 to determine the target follow distance D1. As shown in Figure 4, the target follow distance D1 is defined in the lookup table T1 with reference to the determined vertical offset ΔV. The processor 5 reads the
target follow distance D1 for the vertical offset ΔV from the lookup table T1 and generates
the target follow distance signal SD1 to set the target follow distance D1 for the cruise control module 8.
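The lookup could be implemented with linear interpolation between breakpoints; the breakpoint values below are invented for illustration and do not come from the patent:

```python
import bisect

# Illustrative lookup table T1: vertical offset ΔV (m) -> follow distance D1 (m).
OFFSETS = [-5.0, 0.0, 5.0, 10.0]
FOLLOW_DISTANCES = [15.0, 10.0, 20.0, 35.0]

def target_follow_distance(vertical_offset):
    """Read the target follow distance D1 from the lookup table, linearly
    interpolating between breakpoints and clamping at the table limits."""
    if vertical_offset <= OFFSETS[0]:
        return FOLLOW_DISTANCES[0]
    if vertical_offset >= OFFSETS[-1]:
        return FOLLOW_DISTANCES[-1]
    i = bisect.bisect_right(OFFSETS, vertical_offset)
    t = (vertical_offset - OFFSETS[i - 1]) / (OFFSETS[i] - OFFSETS[i - 1])
    return (FOLLOW_DISTANCES[i - 1]
            + t * (FOLLOW_DISTANCES[i] - FOLLOW_DISTANCES[i - 1]))
```

A table of this shape would lengthen the follow distance as the elevation difference grows, as the text describes.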
The processor 5 may be operable to process the vertical offset ΔV to identify terrain features
and characteristics. For example, the processor 5 may determine a rate of change of the
vertical offset ΔV. The rate of change of the vertical offset ΔV may provide an indication of a
change in the gradient of a slope on which the target vehicle 3 is travelling. Identifying when
the rate of change of the vertical offset ΔV changes from a substantially steady state
condition may indicate that the target vehicle 3 is beginning to ascend a slope. Conversely,
identifying when the rate of change of the vertical offset ΔV changes to a substantially
steady state condition may indicate that the target vehicle 3 is cresting a slope. As shown in Figure 6, a notification may be output to a display screen 15 provided in the host vehicle 2,
for example provided in an instrument cluster 16, to notify the driver of the host vehicle 2 that a slope has been detected. A prompt may be output requesting that the driver press a resume button (not shown) to continue. It will be understood that, when determining the rate of change of the vertical offset ΔV, the processor 5 may allow for changes resulting from the movements of the host vehicle 2, for example to compensate for changes in the elevation of the host vehicle 2.
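The slope-detection logic described above, based on transitions of the rate of change of the vertical offset ΔV into and out of a substantially steady-state condition, might be sketched as follows. The dead-band value eps and the event labels are illustrative assumptions.

```python
def classify_slope_events(dv_rates, eps=0.05):
    """Classify transitions in a series of dΔV/dt samples (m/s).

    Leaving a substantially steady state (|rate| within the eps dead-band)
    suggests the target vehicle is beginning to ascend or descend a slope;
    returning to a steady state suggests it is cresting the slope.
    """
    events = []
    for prev, curr in zip(dv_rates, dv_rates[1:]):
        prev_steady = abs(prev) <= eps
        curr_steady = abs(curr) <= eps
        if prev_steady and not curr_steady:
            events.append("slope_start")
        elif not prev_steady and curr_steady:
            events.append("slope_crest")
        else:
            events.append("steady" if curr_steady else "on_slope")
    return events
```

A "slope_start" event could trigger the Figure 6 notification and resume prompt described above.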
There may be circumstances when it is no longer appropriate for the cruise control module 8 to operate in the follow mode described herein. The processor 5 may optionally be operable to deactivate the cruise control module 8 in dependence on the vertical offset ΔV between the host vehicle 2 and the target vehicle 3. The processor 5 may, for example, be operable to deactivate the cruise control module 8 when the vertical offset ΔV exceeds a predetermined first offset threshold. The first offset threshold may, for example, be stored in the memory 6. The processor 5 is operable to monitor the vertical offset ΔV and, upon determining that the vertical offset ΔV is greater than the predetermined first offset threshold, to deactivate the cruise control module 8. Alternatively, or in addition, the processor 5 may be operable to increase the target follow distance D1 if the determined vertical offset ΔV exceeds the first offset threshold. The processor 5 may optionally deactivate the cruise control module 8 if the determined vertical offset ΔV exceeds a second offset threshold which is larger than the first offset threshold. Alternatively, or in addition, the processor 5 may be operable to bring the host vehicle 2 to a halt when the determined vertical offset ΔV exceeds the predefined first offset threshold. Alternatively, or in addition, the processor 5 may be operable to output an alert or notification to a driver of the host vehicle 2 when the determined vertical offset ΔV exceeds the predefined first offset threshold.
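A minimal sketch of one possible two-threshold strategy described above, with assumed threshold values and illustrative action labels (the specification also permits halting the vehicle or alerting the driver at the first threshold):

```python
def offset_action(vertical_offset, first_threshold=5.0, second_threshold=10.0):
    """Decide a cruise-control action from the determined vertical offset ΔV.

    Threshold values are assumed calibrations. Below the first threshold the
    follow mode continues unchanged; between the two thresholds the target
    follow distance D1 is increased; above the second, larger threshold the
    cruise control module is deactivated.
    """
    dv = abs(vertical_offset)
    if dv <= first_threshold:
        return "follow"
    if dv <= second_threshold:
        return "increase_follow_distance"
    return "deactivate_cruise_control"
```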
The present invention has been described in relation to determining a vertical offset ΔV between the host vehicle 2 and the target vehicle 3. In a variant, the control system 1 in accordance with the present invention may be operable to determine an absolute elevation of the target vehicle 3, for example with reference to sea level. The elevation of the host vehicle 2 may be known, for example with reference to GPS data. The absolute elevation of the target vehicle 3 may be calculated in dependence on the elevation of the host vehicle 2. The absolute elevation of the target vehicle 3 may be used to calculate the location of the target vehicle 3, for example in dependence on map data stored in the memory 6 or a separate data storage device.
The operation of the control system 1 in conjunction with the cruise control module 8 will now be described. The cruise control module 8 is activated and dynamically controls the host
vehicle target speed in order to maintain a target follow distance D1 between the host vehicle 2 and the target vehicle 3. The control system 1 is operable to adjust the target follow distance D1 in dependence on the determined vertical offset ΔV between the host vehicle 2 and the target vehicle 3. The at least one optical sensor 12 is forward-facing and has a field of view FOV which captures a scene in front of the host vehicle 2. In use, the at least one optical sensor 12 captures image data which is transmitted to the processor 5 in the data signal S1. The processor 5 processes the image data to identify the target vehicle 3 within the captured scene. The processor 5 analyses the image data to determine the vertical offset ΔV between the host vehicle 2 and the target vehicle 3. The processor 5 then accesses the lookup table T1 stored in the memory 6 and reads a target follow distance D1 in dependence on the determined vertical offset ΔV. The processor 5 then outputs a target follow distance signal SD1 to the cruise control module 8. The cruise control module 8 controls the host vehicle target speed to maintain the target follow distance D1 determined by the processor 5 in dependence on the determined vertical offset ΔV.
The processor 5 may be operable to generate a host vehicle trace 18. The host vehicle trace 18 may represent the (geospatial) route taken by the host vehicle 2 as well as the speed of the host vehicle 2 along that route. The host vehicle trace 18 may optionally also comprise the trajectory of the host vehicle 2 along the target vehicle route. The host vehicle trace 18 may optionally comprise vehicle attitude data generated by the vehicle attitude sensor module 11. For example, the host vehicle trace 18 may comprise one or more of the following: the pitch angle α, the roll angle β, and the yaw angle γ. The vehicle attitude data may be stored along said host vehicle trace 18, for example by defining the orientation of the longitudinal axis X, a transverse axis Y and a vertical axis Z making up the reference frame of the host vehicle 2. The vehicle attitude data may optionally also comprise acceleration data about one or more of said axes.
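The host vehicle trace 18 described above can be represented as an ordered sequence of samples combining position, speed and attitude data. The following sketch uses assumed field names and units:

```python
from dataclasses import dataclass

@dataclass
class TraceSample:
    """One sample of a vehicle trace: geospatial position, speed, and
    optional attitude data (pitch α, roll β, yaw γ). Field names and
    units are illustrative assumptions."""
    latitude: float
    longitude: float
    speed: float            # m/s along the route
    pitch: float = 0.0      # α, degrees
    roll: float = 0.0       # β, degrees
    yaw: float = 0.0        # γ, degrees

# A trace is an ordered list of samples recorded at regular intervals.
host_vehicle_trace = [
    TraceSample(52.380, -1.520, 8.3, pitch=2.1),
    TraceSample(52.381, -1.521, 8.5, pitch=4.0),
]
```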
In use, tracking of the target vehicle 3 may be lost or interrupted, for example due to the presence of obstacles or terrain features between the host vehicle 2 and the target vehicle 3 which partially or completely obscure the target vehicle 3. The target vehicle trace 14 may be used to predict the location of the target vehicle 3 after tracking is lost or interrupted. For example, the processor 5 may be operable to predict movement of the target vehicle 3 in dependence on one or more of the following: an identified location of the target vehicle 3 when tracking was interrupted; a trajectory of the target vehicle 3 when tracking was interrupted; a speed of travel of the target vehicle 3 when tracking was interrupted; and map (geographic) data showing possible routes from a last-known location of the target vehicle 3.
By predicting a location of the target vehicle 3, at least in certain embodiments the processor 5 may be operable to re-establish tracking of the target vehicle 3 more quickly.
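A simple dead-reckoning sketch of the prediction described above, using only the last-known location, trajectory and speed (a flat-earth approximation is assumed for brevity; map data could further constrain the prediction to plausible routes):

```python
import math

def predict_location(last_lat, last_lon, heading_deg, speed_mps, dt_s):
    """Dead-reckon a possible target vehicle location dt_s seconds after
    tracking was lost. Heading is measured clockwise from north. The
    conversion of metres to degrees uses a flat-earth approximation
    (about 111,320 m per degree of latitude), an assumed simplification."""
    distance = speed_mps * dt_s
    dlat = distance * math.cos(math.radians(heading_deg)) / 111_320.0
    dlon = (distance * math.sin(math.radians(heading_deg))
            / (111_320.0 * math.cos(math.radians(last_lat))))
    return last_lat + dlat, last_lon + dlon
```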
As outlined above, the processor 5 may track the target vehicle route. At least in certain embodiments, the processor 5 may calculate the speed and/or the trajectory of the target vehicle 3. The calculated speed and/or trajectory at a given location may be defined as a movement vector Vn having a magnitude (representing the target vehicle speed) and direction (representing the trajectory of the target vehicle 3). By repeating these calculations (for example at predetermined temporal or spatial intervals), the processor 5 builds the target vehicle trace 14 for the target vehicle 3, as illustrated in Figure 5. A modified embodiment of the control system 1 described herein with reference to Figures 1 to 6 may utilise the resulting target vehicle trace 14 for controlling the host vehicle 2. The modified embodiment of the control system 1 will now be described with reference to Figures 7, 8 and 9. Like reference numerals are used for like components and the description of this embodiment will focus on the modifications.
The control system 1 according to the modified embodiment is operable to compare the movements of the host vehicle 2 to the tracked movements of the target vehicle 3. As described herein, the processor 5 is operable to build the target vehicle trace 14. As shown in Figure 7, the processor 5 is configured to receive a location signal SGPS from a GPS module 17. The location signal SGPS comprises location data specifying a current location of the host vehicle 2. The processor 5 uses the location data from the GPS module 17 to compare a current route of the host vehicle 2 to the (historic) tracked target vehicle route. The processor 5 in the present embodiment is operable to compare the location data from the GPS module 17 with the target vehicle trace 14 stored in the memory 6. The processor 5 correlates the movements of the host vehicle 2 to the recorded movements of the target vehicle 3. As illustrated in Figure 8, the processor 5 determines a deviation δ between the current route of the host vehicle 2 and the tracked route of the target vehicle 3. The deviation δ in the present embodiment comprises a distance between the current location of the host vehicle 2 and the nearest point on the tracked route of the target vehicle 3.
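The deviation δ described above, the distance from the current host vehicle location to the nearest point on the tracked target vehicle route, might be computed as follows. Using a local planar frame in metres and approximating the route by its recorded trace points are assumed simplifications:

```python
import math

def deviation_from_trace(host_xy, trace_xy):
    """Compute the deviation δ: the distance from the host vehicle's
    current location to the nearest recorded point on the target vehicle
    trace 14. Positions are (x, y) pairs in metres in a local planar frame."""
    hx, hy = host_xy
    return min(math.hypot(hx - tx, hy - ty) for tx, ty in trace_xy)
```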
If the processor 5 determines that the deviation δ exceeds a predetermined deviation threshold, a deviation signal SDEV may be generated. In the present embodiment, the processor 5 determines that the host vehicle 2 is deviating from the target vehicle route if the distance between the location of the host vehicle 2 and the target vehicle trace 14 exceeds a predefined distance threshold. The deviation signal SDEV is published to the communication bus 13 and read by the cruise control module 8. The cruise control module 8 is configured to
deactivate a currently active follow mode in dependence on said deviation signal SDEV. Thus, if the processor 5 determines that the host vehicle 2 is deviating from the target vehicle route, the deviation signal SDEV is generated to deactivate the follow mode of the cruise control module 8.
If the processor 5 determines that the deviation δ is within the predetermined deviation threshold, a correlation signal SCOR may be generated. In the present embodiment, the processor 5 determines that the path of the host vehicle 2 at least substantially corresponds to that of the target vehicle route if the distance between the location of the host vehicle 2 and the target vehicle trace 14 remains within a predefined distance threshold. The correlation signal SCOR is published to the communication bus 13 and read by the cruise control module 8. The cruise control module 8 is configured to activate a follow mode in dependence on the correlation signal SCOR. Thus, if the processor 5 determines that the host vehicle 2 is following the target vehicle route, the correlation signal SCOR is generated to activate the follow mode of the cruise control module 8. It will be understood that this control strategy may be used to detect a target vehicle 3 and to automatically or semi-automatically activate the vehicle control module.
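The generation of the deviation signal SDEV and the correlation signal SCOR from the deviation δ reduces to a single threshold test; the threshold value below is an assumed calibration:

```python
def route_follow_signal(deviation, deviation_threshold=5.0):
    """Select the signal published to the communication bus from the
    deviation δ (metres) between the host vehicle route and the target
    vehicle trace 14. SDEV deactivates the follow mode; SCOR activates it."""
    return "SDEV" if deviation > deviation_threshold else "SCOR"
```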
Alternatively, or in addition, the processor 5 may compare the trajectory of the host vehicle 2 to the trajectory of the target vehicle 3 along the target vehicle route. The processor 5 may determine that the host vehicle 2 has deviated from the target vehicle route if the difference between the trajectory of the host vehicle 2 and that of the target vehicle 3 at a corresponding location exceeds a predefined angular threshold. This operating mode is illustrated in Figure 9. The route taken by the host vehicle 2 initially follows the tracked route of the target vehicle 3, which may be determined with reference to the target vehicle trace 14. When the trajectory of the host vehicle 2 exceeds the predefined angular threshold, the processor 5 outputs the deviation signal SDEV to deactivate the follow mode of the cruise control module 8. In the illustrated arrangement, the processor 5 determines that the host vehicle 2 is no longer following the tracked route of the target vehicle 3 at a deviation point 19 (denoted by a star indicia in Figure 9). When the trajectory of the host vehicle 2 is within the predefined angular threshold, the processor 5 outputs the correlation signal SCOR to activate the follow mode of the cruise control module 8.
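The angular comparison described above might be sketched as follows, allowing for heading wrap-around at 360°; the threshold value is an assumed calibration:

```python
def trajectory_deviates(host_heading_deg, target_heading_deg,
                        angular_threshold_deg=20.0):
    """Return True if the host vehicle trajectory differs from the target
    vehicle trajectory at a corresponding location by more than the
    predefined angular threshold. Headings are in degrees; the smallest
    angle between the two headings is used."""
    diff = abs(host_heading_deg - target_heading_deg) % 360.0
    diff = min(diff, 360.0 - diff)
    return diff > angular_threshold_deg
```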
Alternatively, or in addition, the processor 5 may consider a steering angle θ of the host vehicle 2 when comparing the current route of the host vehicle 2 and the tracked route of the target vehicle 3. The steering angle θ is published to the communication bus 13, for
example by a steering angle sensor 20 associated with a steering wheel (not shown), and is read by the processor 5. A comparison of the trajectory of the target vehicle 3 and the steering angle θ provides an indication of whether the host vehicle 2 is following the tracked route of the target vehicle 3. The processor 5 may calculate the divergence between the route taken by the host vehicle 2 and the target vehicle route. At least in certain embodiments, the processor 5 may determine if the driver of the host vehicle 2 is intentionally steering away from the target vehicle route. The processor 5 may determine when the intended path of the host vehicle 2 differs from the target vehicle route. By measuring any such deviation, the processor 5 may determine that the host vehicle 2 is no longer following the target vehicle 3. The processor 5 may, for example, be operable to
The processor 5 may be operable to determine the trajectory of the target vehicle 3 in dependence on the target vehicle trace 14. This may be used to provide an element of prediction of a possible location of the target vehicle 3 when tracking is interrupted. For example, knowledge of the steering angle θ of the host vehicle 2 and/or odometry data can be used to establish if the host vehicle 2 is taking a similar path to the target vehicle 3. In this situation, action can be taken by the host vehicle 2 to improve confidence, for example holding the current target (follow) speed or the speed at which the target vehicle 3 was travelling when tracking was interrupted. The target speed of the host vehicle 2 can be held for a time or distance to allow re-establishment of tracking of the target vehicle 3, for example when the target vehicle 3 reappears from around a corner or from behind an obstacle.
The processor 5 could be modified to generate a host vehicle trace 18 for comparison with the target vehicle trace 14. The location data generated by the GPS module 17 to identify the geospatial location of the host vehicle 2 could be stored in the memory 6. The processor 5 may also store dynamic operating parameters of the host vehicle 2, such as the reference speed (derived from the GPS module 17 or other on-board sensors, such as the wheel speed sensors 10). The host vehicle trace 18 and the target vehicle trace 14 could be compared using a pattern matching algorithm to assess the correlation between the traces. The comparison may, for example, compare the relative positions of the host vehicle 2 and the target vehicle 3; and/or the trajectory of the host vehicle 2 and the target vehicle 3. If the deviation δ between the host vehicle trace 18 and the target vehicle trace 14 exceeds a predefined deviation threshold δ”, the processor 5 is configured to output the deviation signal SDEV. If the deviation δ between the host vehicle trace 18 and the target vehicle trace 14 is
within a predefined deviation threshold δ”, the processor 5 is configured to output the correlation signal SCOR.
It will be appreciated that various modifications may be made to the embodiment(s) described herein without departing from the scope of the appended claims. The target vehicle 3 may comprise an active or passive marker to facilitate identification by the processor 5. A passive marker may, for example, comprise a visible target or reflector (either an optical or radio wave reflector) which may be detected by appropriate detection means provided on the host vehicle 2. An active marker may comprise an active source, such as a wireless radio frequency (RF) transmitter or a light source (for emitting visible or infrared light), to facilitate identification by the processor 5. The host vehicle 2 may comprise detection means.
As outlined herein, the processor 5 may process the image data generated by the at least one optical sensor 12 to calculate the attitude of the target vehicle 3. The processor 5 may, for example, determine an orientation of a reference frame of the target vehicle 3. The processor 5 may analyse the image data captured by the at least one optical sensor 12 to calculate the orientation of a longitudinal axis X1, a transverse axis Y1 and a vertical axis Z1 of the target vehicle 3. The processor 5 may be operable to store the resulting target vehicle attitude data in conjunction with the location data. For example, the processor 5 may store the target vehicle attitude data with said target vehicle trace 14. The processor 5 may be operable to control the host vehicle target speed and/or the target follow distance D1 in dependence on the angular attitude of the target vehicle 3 about one or more of said axes. For example, the processor 5 may reduce the following target speed and/or increase the target follow distance D1 if the angular orientation of the target vehicle 3 is greater than or equal to a predefined angular threshold. Alternatively, or in addition, when comparing the routes taken by the host vehicle 2 and the target vehicle 3, the processor 5 may be configured to compare the attitude of the host vehicle 2 and the target vehicle 3.
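The attitude-dependent adjustment described above might be sketched as follows; the angular threshold and the scaling factors applied to the target speed and the target follow distance D1 are assumed values:

```python
def adjust_for_target_attitude(pitch_deg, roll_deg, base_speed,
                               base_follow_distance,
                               angular_threshold_deg=15.0):
    """Reduce the following target speed and increase the target follow
    distance D1 when the target vehicle's angular orientation about any
    monitored axis meets or exceeds the predefined angular threshold.
    The 0.8 and 1.5 scaling factors are illustrative assumptions."""
    if max(abs(pitch_deg), abs(roll_deg)) >= angular_threshold_deg:
        return base_speed * 0.8, base_follow_distance * 1.5
    return base_speed, base_follow_distance
```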
The target vehicle trace 14 and/or the host vehicle trace 18 may be stored indefinitely. The historic data may be referenced at a later time, for example when repeating the target vehicle route. The target vehicle trace 14 and/or the host vehicle trace 18 could optionally be shared, for example transferred to another vehicle intending to follow the same route.
The control system 1 described herein is operable to selectively activate and/or deactivate the cruise control module 8. Alternatively, or in addition, the control system 1 may be operable to selectively activate and/or deactivate a steering control module (not shown)
operable to control a steering angle θ of the host vehicle 2. The steering control module may, for example, be incorporated into the vehicle control module described herein. The steering control module may provide autonomous or semi-autonomous control of the steering of the host vehicle 2, for example by controlling an electric power assist steering (EPAS) system. The target vehicle route may be used to define a target route to be taken by the host vehicle 2. The EPAS system may be operable to control the host vehicle 2 to follow the target route. The control system 1 described herein may selectively activate and/or deactivate the steering control module to enable and/or disable autonomous or semi-autonomous control of the steering. When the trajectory of the host vehicle 2 is within the predefined angular threshold, the processor 5 outputs the correlation signal SCOR to activate the steering control module. When the trajectory of the host vehicle 2 exceeds the predefined angular threshold, the processor 5 outputs the deviation signal SDEV to deactivate the steering control module. It is envisaged that the control functions described herein in relation to the cruise control module 8 may be applied equally to the steering control module.
The processor 5 has been described as generating a target vehicle trace 14 by storing the movement vectors along a calculated route of the target vehicle 3. Alternatively, or in addition, the processor 5 may be operable to store data relating to a determined attitude of the target vehicle 3 along the calculated route. For example, the processor 5 may store a calculated pitch angle and/or roll angle and/or yaw angle of the target vehicle 3 as it travels along the calculated route. The processor 5 may be operable also to evaluate the terrain over which the target vehicle 3 is travelling. Alternatively, or in addition, the processor 5 may be operable to store terrain data generated as a result of this evaluation process along the calculated route travelled by the target vehicle 3. The processor 5 may utilise one or more of the stored datasets when planning a route for the host vehicle 2.

WE CLAIM:
1. A control system for controlling operation of a host vehicle, the host vehicle being a
land vehicle; wherein the control system comprises:
a processor for receiving a signal from sensing means, the processor being operable to process the signal to identify a target vehicle which is a land vehicle; and to determine a vertical offset between the host vehicle and the target vehicle.
2. A control system as claimed in claim 1, wherein the processor is operable to set a target follow distance between the host vehicle and the target vehicle in dependence on the determined vertical offset between the host vehicle and the target vehicle.
3. A control system as claimed in claim 1 or claim 2, wherein the processor is operable to determine a rate of change of the determined vertical offset.
4. A control system as claimed in claim 3, wherein the processor is operable to identify a base of an incline and/or a crest of an incline in dependence on the rate of change of the determined vertical offset.
5. A control system as claimed in any one of the preceding claims, wherein the processor is operable to track an ascent/descent route of the target vehicle.
6. A control system as claimed in claim 5, wherein the processor is operable to assess one or more terrain characteristics in dependence on the determined vertical offset and the ascent/descent route of the target vehicle; optionally wherein the processor is operable to calculate a speed of the target vehicle along said ascent/descent route; and optionally to assess one or more terrain characteristics in dependence on the determined speed of the target vehicle.
7. A control system as claimed in any one of the preceding claims, wherein the signal comprises image data; and the processor is operable to process said image data to identify the target vehicle; optionally wherein the processor is operable to process said image data to determine a pitch angle and/or a roll angle and/or a yaw angle of the target vehicle.
8. A control system as claimed in any one of the preceding claims, wherein the processor is operable to determine the vertical offset between the host vehicle and the target vehicle in dependence on a pitch angle and/or a roll angle of the host vehicle.

9. A method of controlling operation of a host vehicle, the host vehicle being a land
vehicle; the method comprising:
receiving a signal from sensing means;
processing the signal to identify a target vehicle which is a land vehicle; and
determining a vertical offset between the host vehicle and the target vehicle.
10. A method as claimed in claim 9 comprising setting a target follow distance between
the host vehicle and the target vehicle in dependence on the determined vertical offset
between the host vehicle and the target vehicle.

Documents

Application Documents

# Name Date
1 201811007656-Covering Letter [15-12-2020(online)].pdf 2020-12-15
2 201811007656-STATEMENT OF UNDERTAKING (FORM 3) [01-03-2018(online)].pdf 2018-03-01
3 201811007656-FORM 1 [01-03-2018(online)].pdf 2018-03-01
4 201811007656-Form 1 (Submitted on date of filing) [15-12-2020(online)].pdf 2020-12-15
5 201811007656-Request Letter-Correspondence [15-12-2020(online)].pdf 2020-12-15
6 201811007656-FIGURE OF ABSTRACT [01-03-2018(online)].pdf 2018-03-01
7 201811007656-DRAWINGS [01-03-2018(online)].pdf 2018-03-01
8 201811007656-REQUEST FOR CERTIFIED COPY [28-05-2018(online)].pdf 2018-05-28
9 201811007656-DECLARATION OF INVENTORSHIP (FORM 5) [01-03-2018(online)].pdf 2018-03-01
10 201811007656-Correspondence-210318.pdf 2018-04-02
11 201811007656-Power of Attorney-210318.pdf 2018-04-02
12 201811007656-COMPLETE SPECIFICATION [01-03-2018(online)].pdf 2018-03-01
13 abstract.jpg 2018-03-28
14 201811007656-FORM-26 [19-03-2018(online)].pdf 2018-03-19