ABSTRACT
PREDICTION OF TRAVEL TIME OF VEHICLES
Methods and systems for prediction of travel time of vehicles are described. A prediction model is formulated that can be used for predicting travel times of a vehicle at a plurality of sections on a route. The prediction model indicates dependence of travel time of a vehicle at a section on travel time of the vehicle on a second number of sections immediately before the section on the route. The second number of sections of one section may be different from that of another section.
[[To be published with Fig. 3]]
FORM 2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENTS RULES, 2003
COMPLETE SPECIFICATION (See section 10, rule 13) 1. Title of the invention: PREDICTION OF TRAVEL TIME OF VEHICLES
2. Applicant(s)
NAME                                      NATIONALITY    ADDRESS
TATA CONSULTANCY SERVICES                 Indian         Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, India
INDIAN INSTITUTE OF TECHNOLOGY MADRAS     Indian         Indian Institute of Technology Madras, IIT P.O., Chennai 600 036, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.
TECHNICAL FIELD
[0001] The present subject matter relates to prediction of travel time
of vehicles, such as public transport buses.
BACKGROUND
[0002] Due to dynamically varying traffic conditions, accurate
prediction of travel time of vehicles plying on roads may not be possible. For instance, it may not be possible to accurately predict the time at which a public transport bus will arrive at a particular bus stop on its route if variations in the traffic conditions are very high.
BRIEF DESCRIPTION OF DRAWINGS
[0003] The features, aspects, and advantages of the present subject
matter will be better understood with regard to the following description, and
accompanying figures. The use of the same reference number in different
figures indicates similar or identical features and components.
[0004] Fig. 1 illustrates a system for predicting travel time of a vehicle,
in accordance with an implementation of the present subject matter.
[0005] Fig. 2 illustrates a method for choosing sectional order of
dependence for sections on a route, in accordance with an implementation of the present subject matter.
[0006] Fig. 3 illustrates a method for developing a prediction model usable in predicting travel time of a vehicle, in accordance with an implementation of the present subject matter.
[0007] Fig. 4 illustrates a method for predicting travel time of a vehicle,
in accordance with an implementation of the present subject matter.
[0008] Fig. 5 illustrates the predicted and measured travel times for a
particular time period, in accordance with an implementation of the present subject matter.
[0009] Fig. 6 illustrates the results of comparison of the performance
of the present subject matter with and without Kalman filtering for a sample
test day, in accordance with an implementation of the present subject
matter.
[0010] Fig. 7 illustrates the results of comparison of the performance
of the present subject matter with conventional methods for several days, in
accordance with an implementation of the present subject matter.
[0011] Fig. 8(a) illustrates reduction of Mean Absolute Percentage
Error (MAPE) provided by the present subject matter compared to a Historical
Average (HA) method for each section, in accordance with an implementation of the
present subject matter.
[0012] Fig. 8(b) illustrates reduction of MAPE provided by the present subject matter
compared to a Space Discretization (SD) method for each section, in
accordance with an implementation of the present subject matter.
[0013] Fig. 8(c) illustrates reduction of MAPE provided by the present subject matter
compared to an Artificial Neural Network (ANN) method for each section, in
accordance with an implementation of the present subject matter.
[0014] Fig. 9 illustrates a comparison of the present subject matter with
the SD and ANN methods in terms of MAPE for all trips that happened on a
sample day, in accordance with an implementation of the present subject
matter.
[0015] Fig. 10 illustrates the superior performance of the present subject
matter in terms of MAPE over the HA method at the trip level, in accordance
with an implementation of the present subject matter.
[0016] Fig. 11(a) illustrates the comparison of Mean Absolute Error
(MAE) values for arrival time predicted for bus stop A, in accordance with
an implementation of the present subject matter.
[0017] Fig. 11(b) illustrates the comparison of MAE values for arrival
time predicted for bus stop B, in accordance with an implementation of the
present subject matter.
DETAILED DESCRIPTION
[0018] Accurate prediction of the time at which a vehicle may arrive at
a particular location and the amount of time a vehicle may take to cross a
section of a transport route, also referred to as a route, may not be possible
due to dynamically varying traffic conditions. For instance, public transport
buses have uncertainties associated with their arrival and travel times at
bus stops on their routes due to several factors, such as traffic lights, dwell
times at bus stops, seasonal variations, fluctuating travel demands, and the
like. Further, factors like excess vehicles, diverse modes of transport, and
lack of lane discipline make prediction of travel time even more inaccurate.
[0019] The present subject matter relates to predicting travel time of
vehicles. With the implementations of the present subject matter, travel and arrival time of vehicles, such as public transport buses, can be accurately predicted in real-time.
[0020] In an implementation of the present subject matter, a prediction
model is developed, using which travel times of vehicles at various sections along a route can be accurately predicted. To develop the prediction model, first, a time series is formulated using a history dataset that includes historical data of travel times at each of a plurality of sections for a plurality of trips on the route for at least one vehicle. Based on the time series, an overall order of dependence is determined for the plurality of sections. The overall order of dependence is a first number of sections immediately before a section on the route that are likely to influence the travel time of the vehicle at the section. For instance, if the first number of sections is determined to be five, it is likely that the travel time of a vehicle at a section is influenced by its travel time in five sections immediately preceding the section. The overall order of dependence may be a common value for all sections of the route.
[0021] Subsequently, a prediction model is formulated. The prediction
model indicates dependence of travel time of the vehicle at each section on travel time at the first number of sections immediately before the section.
For instance, the prediction model may represent the travel time of the vehicle at a section as a function of the travel times at each section of the first number of sections immediately preceding the section. The prediction model may include variables and coefficients corresponding to the variables.
[0022] Further, for each section of the plurality of sections, a sectional
order of dependence is determined based on the time series. The sectional
order of dependence for a section is a second number of sections
immediately before the section on the route that influence the travel time of
the vehicle at the section. While the overall order of dependence
is common for all sections on the route, the sectional order of dependence
is a value specific to a particular section. The second number of sections is
lesser than or equal to the first number of sections. Subsequently, the
coefficients of the prediction model may be estimated based on the
sectional order of dependence by fitting the prediction model to the history
dataset. In this manner, the formulated prediction model is fine-tuned for
each section on a route. In an example, the prediction model may include a
temporal component, which indicates dependence of travel time at a section
in a trip on travel time at the section in a previous trip.
[0023] The prediction model, updated with the coefficients, may then
be used for predicting travel times of the vehicle. For instance, the travel time at a particular section, i.e., the time taken to cross the particular section, can be predicted using real-time data of travel times of the vehicle at the second number of sections corresponding to the section. In an example, to make the prediction feasible, the prediction model may be rewritten in a state-space form, using which a predictive filtering technique, such as a Kalman filter technique (KFT), can be applied. The state-space form may also be referred to as a state-space model.
[0024] With the systems and methods of the present subject matter,
arrival times and travel times of vehicles can be accurately predicted. The prediction can be used even for public transport buses, for which the travel
time is highly variable. The present subject matter can be used for predicting
travel time even in conditions where lane discipline is not strictly followed
and traffic is non-homogenous, i.e., the modes of transport can range from
bicycles and two-wheelers to heavy vehicles, such as buses and trucks. The
non-homogenous traffic may also be referred to as mixed traffic.
[0025] The above and other features, aspects, and advantages of the
subject matter will be better explained with regard to the following
description, and accompanying figures. It should be noted that the
description and figures merely illustrate the principles of the present subject
matter along with examples described herein and, should not be construed
as a limitation to the present subject matter. It is thus understood that
various arrangements may be devised that, although not explicitly described
or shown herein, embody the principles of the present disclosure. Moreover,
all statements herein reciting principles, aspects, and examples thereof, are
intended to encompass equivalents thereof. Further, for the sake of
simplicity, and without limitation, the same numbers are used throughout
the drawings to reference like features and components.
[0026] Fig. 1 illustrates a system 100 for predicting travel time of a
vehicle, in accordance with an implementation of the present subject matter.
[0027] The system 100 can include a processor 102 to run at least one
operating system and other applications and services. The system 100 can also include an interface 104 and a memory 106. Further, the system 100 can include modules 108 and data 112.
[0028] The processor 102, amongst other capabilities, may be
configured to fetch and execute computer-readable instructions stored in the memory 106. The processor 102 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. The functions of the various elements shown in the figure,
including any functional blocks labelled as "processor", may be provided
through the use of dedicated hardware as well as hardware capable of executing machine readable instructions.
[0029] When provided by a processor, the functions may be provided
by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term "processor" should not be construed to refer exclusively to hardware capable of executing machine readable instructions, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing machine readable instructions, random access memory (RAM), non-volatile storage. Other hardware, conventional and/or custom, may also be included.
[0030] The interface 104 may include a variety of machine readable
instructions-based interfaces and hardware interfaces that allow the system 100 to interact with different entities, such as the processor 102, the modules 108, and the data 112. Further, the interface 104 may enable the components of the system 100 to communicate with computing devices, web servers, and external repositories. The interface 104 may facilitate multiple communications within a wide variety of networks and protocol types, including wireless networks, wireless Local Area Network (WLAN), RAN, satellite-based network, and the like.
[0031] The memory 106 may be coupled to the processor 102 and
may, among other capabilities, provide data and instructions for generating different requests. The memory 106 can include any computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0032] The modules 108 may include routines, programs, objects,
components, data structures, and the like, which perform particular tasks or implement particular abstract data types. The modules 108 may further include modules that supplement applications on the system 100, for example, modules of an operating system. Further, the modules 108 may be implemented in hardware, instructions executed by a processing unit, or by a combination thereof.
[0033] In an implementation, the modules 108 may be machine-
readable instructions which, when executed by the processor 102, perform any of the described functionalities. The machine-readable instructions may be stored on an electronic memory device, hard disk, optical disk or other machine-readable storage medium or non-transitory medium. In one implementation, the machine-readable instructions can also be downloaded to the storage medium via a network connection.
[0034] The modules 108 include one or more modules which may
perform different functionalities. The one or more modules include a prediction module 110. The functions of the prediction module 110 are explained in later paragraphs.
[0035] The data 112 serves, amongst other things, as a repository for
storing data that may be fetched, processed, received, or generated by one or more of the modules 108.
[0036] In operation, the prediction module 110 develops a prediction
model that captures the dependence of the travel time of a vehicle at a section on a transport route of the vehicle on the travel times at preceding sections on the transport route. Such a dependence may be referred to as spatial correlations between sections on the transport route of the vehicle. The transport route may be referred to simply as the route, and a section on the route may be referred to simply as a section. A section may be, for example, 500 m in length. In an example, the prediction model may be an auto-regressive (AR) model. The AR model may be a linear non-stationary AR model.
[0037] In an example, the prediction model is developed such that the
travel time at a section is represented as a function of the travel times at a
first number of sections immediately before the section on the route. For
instance, consider that the route has a total of 50 sections and that the first
number of sections is five. Now, the travel time at the seventh section from
the origin of the route is represented as a function of the travel time at each
of the second section from the origin, third section from the origin, fourth
section from the origin, fifth section from the origin, and the sixth section
from the origin. This is because these are the five sections immediately
preceding the seventh section. The first number of sections is referred to as
an overall order of dependence for the route and is represented by Q. The
overall order of dependence may be common for all sections on the route.
The determination of the overall order of dependence will be explained later.
[0038] Denoting the travel time at a section k by z_k and considering
that z_k is a linear combination of the travel times in the Q previous sections with some additive noise e(k), z_k may be represented as below:

z_k = b_{k0} + Σ_{i=1}^{Q} b_{ki} z_{k-i} + e(k)    (1)

where e(k) may have a zero mean and a variance of σ²_w(k). Further, b_{ki} refers to the ith regression coefficient, as will be explained later. Equation (1) is part of the prediction model. Here, b_{ki} denotes the coefficients of the prediction model and z_{k-i} denotes the variables of the prediction model. As will be understood, the coefficients and the variables vary from one section to another. Accordingly, the prediction model may include the coefficients and variables corresponding to each of the plurality of sections.
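By way of illustration only, the evaluation of equation (1) may be sketched as below. The sketch assumes Python with NumPy; the function name, the coefficient values, and the section travel times shown are hypothetical and serve merely to illustrate the linear combination of the Q preceding section travel times.

```python
import numpy as np

def ar_travel_time(b_k, z_prev, noise_std=0.0):
    """Illustrative evaluation of equation (1): travel time at section k as a
    linear combination of the Q preceding section travel times.

    b_k    : array [b_k0, b_k1, ..., b_kQ] -- bias followed by Q coefficients
    z_prev : array [z_{k-1}, z_{k-2}, ..., z_{k-Q}] -- preceding travel times
    """
    z_k = b_k[0] + np.dot(b_k[1:], z_prev)
    if noise_std > 0.0:
        z_k += np.random.normal(0.0, noise_std)  # additive noise e(k)
    return z_k

# Hypothetical example with Q = 5 (travel times in seconds)
b_k = np.array([4.0, 0.35, 0.25, 0.18, 0.12, 0.05])
z_prev = np.array([62.0, 58.0, 71.0, 65.0, 60.0])
print(ar_travel_time(b_k, z_prev))
```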
[0039] Fig. 2 illustrates a method 200 for determining the overall
order of dependence, a sectional order of dependence for each
section, and the regression coefficients of the prediction model, in
accordance with an implementation of the present subject matter. The
method 200 may be performed by the prediction module 110.
[0040] At block 202, travel time vectors for each section on a route for a
plurality of trips along the route are concatenated. Here, travel time for a section refers to the time taken for a vehicle to cross the section. The travel time for a section may also be referred to as the travel time vector at the section or the section travel time. The travel time vectors for the sections for the plurality of trips may be obtained from a history dataset, which may be part of the data 112. In an example, the travel times within a specific time slot, such as a one-hour time slot, may be concatenated. To facilitate concatenation of travel times from a specific time slot, prior to the concatenation, the section travel times may be arranged in the order of their start time and date. The arranged travel times may be represented as follows:

z = (z_1, z_2, ..., z_N),

where z may be referred to as historical data, z_i is a travel time vector (d × 1) at the ith section, d is the number of trips across all days in a time slot, and N is the number of sections.
[0041] The concatenation may be such that all rows (d of them) of z
are concatenated into one long sequence or a preliminary time series S.
The concatenation may induce some seasonal trends with a period N. To
eliminate the seasonal trends, at block 204, a seasonal difference is
performed. The seasonal difference may be performed based on an Auto-
Correlation Function (ACF) of the preliminary time series. Autocorrelation,
also known as serial correlation, is the correlation of a signal with a delayed
copy of itself as a function of delay. Informally, it is the similarity between
observations as a function of the time lag between them. Autocorrelation is
generally done to check the correlations and repeated patterns in the data.
[0042] The seasonal difference may be performed if the ACF values at
multiples of Ns display a slow decay. Here, Ns is a multiple of N and is used to check the seasonality that may range from 1 to N. The output of the seasonal difference may be referred to as a time series and is represented as S'.
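By way of illustration only, the concatenation at block 202 and the ACF-based seasonal difference at block 204 may be sketched as below. The sketch assumes Python with NumPy and the statsmodels library; the function name and the "slow decay" threshold of 0.5 are illustrative assumptions and not part of the specification.

```python
import numpy as np
from statsmodels.tsa.stattools import acf

def build_time_series(z, N):
    """Concatenate the d trip rows of z (a d x N travel-time matrix) into one
    preliminary series S, then apply a seasonal difference of period N if the
    ACF shows a slow decay at multiples of N."""
    S = np.asarray(z).reshape(-1)           # row-wise concatenation: one long sequence
    acf_vals = acf(S, nlags=3 * N, fft=True)
    seasonal_lags = acf_vals[N::N]          # ACF at lags N, 2N, 3N
    if np.all(seasonal_lags > 0.5):         # illustrative "slow decay" check
        S_prime = S[N:] - S[:-N]            # seasonal difference of period N
    else:
        S_prime = S
    return S_prime
```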
[0043] On the time series, at block 206, a partial ACF (PACF) may be
computed. This is because, for a stationary time series, i.e., a time series for which mean, variance and autocorrelation structure do not change over time, an order of auto-regression can be determined based on the decay of associated PACF value with respect to lag. In this case, the lag may be the number of previous sections. The variation of the PACF values with respect to the lag may be plotted in the form of a PACF plot, as will be understood by a person skilled in the art.
[0044] Further, at block 208, Q is read off based on a standard
statistical threshold such that the PACF value is greater than a cut-off PACF
value. For instance, the largest lag for which the PACF value is above the cut-off
PACF value may be selected as Q, i.e., the overall order of dependence.
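By way of illustration only, the PACF computation at block 206 and the reading-off of Q at block 208 may be sketched as below, assuming the commonly used 95% significance band of approximately 1.96/√n as the cut-off PACF value; the function name and the cut-off choice are illustrative assumptions.

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

def overall_order_of_dependence(S_prime, max_lag=15):
    """Pick Q as the largest lag whose PACF value exceeds the ~95%
    significance cut-off of 1.96/sqrt(n)."""
    pacf_vals = pacf(S_prime, nlags=max_lag)
    cutoff = 1.96 / np.sqrt(len(S_prime))
    significant = [lag for lag in range(1, max_lag + 1) if abs(pacf_vals[lag]) > cutoff]
    return max(significant) if significant else 1
```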
[0045] As mentioned earlier, the overall order of dependence is
common for all sections on the route. However, in some cases, there may
be minor variations in the actual order of dependence among the sections.
For instance, consider that the overall order of dependence is determined
as 5. In such a case, while the actual order of dependence may be 5 for one
section, for another section, the actual order of dependence may be 4. In
such cases, to make the prediction model more accurate, at block 210, a
sectional order of dependence is determined for each section. The sectional
order of dependence for a section may be a second number of sections
immediately preceding the section and may be lesser than or equal to the
first number of sections. For instance, consider that the first number of
sections is 5, indicating that the travel time at a section is likely to be
influenced by travel times at the 5 sections immediately preceding that
section. Now, if the second number of sections is 4, a determination may be
performed that the travel time at the section is influenced by the travel times
at the previous 4 sections, and not the fifth section before the section.
[0046] To determine the second number of sections, the first number
of sections may be taken as a starting point. Subsequently, for each section, it may be determined which of the first number of
immediately preceding sections influence the travel time at that section. For instance, if the first number of sections is 5, for the seventh section from the origin, it may be determined which of the sixth section, fifth section, fourth section, third section, and second section influence the travel time at the seventh section. To determine the second number of sections, a forward regression technique and a backward regression technique may be utilized. As will be understood, using forward and backward regression techniques, a subset of independent variables that impact a dependent variable may be selected from a set of independent variables. In this case, the dependent variable may be the travel time at a section and the independent variables may be the travel times at sections before the section on the route. Accordingly, the forward and backward regression techniques may utilize the section travel times in the time series.
[0047] In an example, to determine the second number of sections, a
plurality of iterations may be performed. Each iteration may include a forward regression and a backward regression. Each iteration may be of an order higher than that of a previous iteration. For instance, at the first iteration, a first order, i.e., the immediately previous section, is considered, and at the second iteration, a second order, i.e., the two immediately preceding sections, is considered for the regression. The order can be increased up to the overall order of dependence, i.e., the first number of sections. Accordingly, the number of iterations is lesser than or equal to the first number of sections. Each iteration includes estimation of an error residual from the forward regression and estimation of an error residual from the backward regression. The error residual from the forward regression may be referred to as the first error residual and the error residual from the backward regression may be referred to as the second error residual. A significance of correlation between the first error residual and the second error residual is checked at each iteration.
[0048] Upon performing the plurality of iterations, the highest order at
which the significance of correlation between the first error residual and the
second error residual is lesser than a significance threshold, is selected as the sectional order of dependence for the section. For instance, consider that the overall order of dependence is 5 and that 4 iterations - first iteration with the first order of dependence, second iteration with the second order of dependence, third iteration with the third order of dependence, and fourth iteration with the fourth order of dependence - have been performed. Consider also that the significance of correlation between the first error residual and the second error residual is less than the significance threshold for the first order, for the second order, and for the third order, but is greater than the significance threshold for the fourth order. In such a case, the sectional order of dependence may be selected as 3, as the highest order for which the significance is lesser than the significance threshold is 3. As will be understood, the highest order of dependence is lesser than or equal to the overall order of dependence.
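By way of illustration only, one possible reading of the determination of the sectional order of dependence at block 210 is sketched below in Python. Here, the "significance of correlation" is interpreted as the p-value of a correlation test between the forward-regression and backward-regression residual series; this interpretation, the function name, and the 5% threshold are illustrative assumptions and not part of the specification.

```python
import numpy as np
from scipy import stats

def sectional_order(series, Q, alpha=0.05):
    """For each candidate order p <= Q, regress z_k on the p-1 intermediate lags
    (forward) and z_{k-p} on the same intermediates (backward), then test whether
    the two residual series are significantly correlated; keep the highest p
    passing the test."""
    series = np.asarray(series)
    chosen = 1
    for p in range(1, Q + 1):
        n = len(series) - p
        # lagged design: columns z_{k-1}, ..., z_{k-p}
        X = np.column_stack([series[p - j - 1 : p - j - 1 + n] for j in range(p)])
        y_fwd = series[p : p + n]           # z_k
        y_bwd = X[:, -1]                    # z_{k-p}
        inter = X[:, :-1]                   # intermediate lags z_{k-1}..z_{k-p+1}
        if inter.shape[1] == 0:
            r_fwd, r_bwd = y_fwd - y_fwd.mean(), y_bwd - y_bwd.mean()
        else:
            A = np.column_stack([np.ones(n), inter])
            r_fwd = y_fwd - A @ np.linalg.lstsq(A, y_fwd, rcond=None)[0]
            r_bwd = y_bwd - A @ np.linalg.lstsq(A, y_bwd, rcond=None)[0]
        _, p_value = stats.pearsonr(r_fwd, r_bwd)
        if p_value < alpha:                 # correlation significant at this order
            chosen = p
    return chosen
```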
[0049] Next, at block 212, coefficients are computed for each section
by fitting the prediction model to the travel times in the history dataset. For the fitting, a linear regression technique may be utilized. Accordingly, the coefficients may be referred to as regression coefficients. The coefficients may be computed based on the overall order of dependence and the sectional order of dependence. This will be explained with reference to equation (1), which is reproduced below:
z_k = b_{k0} + Σ_{i=1}^{Q} b_{ki} z_{k-i} + e(k)    (1)

Here, for a section, the sectional order of dependence may be lesser than the overall order of dependence, say Q-f, where f may be zero or a positive integer. In such a case, the coefficients b_{k,Q-f+1}, b_{k,Q-f+2}, ..., b_{kQ} may be assigned a value of zero, as the section travel time for the section does not depend on the travel times at these sections. Further, the other coefficients, i.e., b_{k1}, b_{k2}, ..., b_{k,Q-f}, are determined. For the determination, the actual travel time at the section and the actual travel times at each of the Q-f sections are retrieved from the history dataset and applied in the above equation.
Using these values, by linear regression, the values of b_{k1}, b_{k2}, ..., b_{k,Q-f} can be determined.
[0050] Further, the resulting residual variances are stored for each
section. The coefficients may be referred to as b_n, where n denotes the nth
section. The residual error variance may be represented as σ²_w(n). b_n may
be represented as below:

b_n = [b_{n0}, b_{n1}, b_{n2}, ..., b_{nQ}]

where b_{n0} is the bias. As explained earlier, here, b_{n,Q-f+1}, b_{n,Q-f+2}, ..., b_{nQ} may have a value of zero.
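By way of illustration only, the coefficient estimation at block 212 may be sketched as below: an ordinary least squares fit over the q_k preceding sections of a section, with the remaining coefficients fixed at zero. The function name and the layout of the history matrix are illustrative assumptions.

```python
import numpy as np

def fit_section_coefficients(history, k, Q, q_k):
    """Fit b_k0 ... b_kQ of equation (1) for section k (k must have at least q_k
    preceding sections) by ordinary least squares, using only the q_k preceding
    sections; the remaining Q - q_k coefficients are fixed at zero.

    history : (d x N) matrix of travel times, one row per trip."""
    y = history[:, k]                               # travel time at section k
    X = history[:, k - q_k : k][:, ::-1]            # z_{k-1}, ..., z_{k-q_k}
    A = np.column_stack([np.ones(len(y)), X])
    theta, *_ = np.linalg.lstsq(A, y, rcond=None)
    residuals = y - A @ theta
    sigma2_w = residuals.var(ddof=A.shape[1])       # residual error variance
    b_k = np.zeros(Q + 1)
    b_k[0] = theta[0]                               # bias b_k0
    b_k[1 : q_k + 1] = theta[1:]                    # b_k1 ... b_k,q_k
    return b_k, sigma2_w
```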
[0051] The determination of the sectional order of dependence for
each section and the corresponding determination of regression coefficients for the sections ensure that the prediction model can provide accurate predictions for each section on the route. Further, as explained, the coefficients may be computed based on travel time data obtained for a particular time slot. For instance, the steps 206-212 of the method 200 may be repeated for each section on the route for each time slot. Accordingly, the regression coefficients computed for a section may vary from one time slot to another time slot. Therefore, the prediction model may include coefficients for a plurality of sections for a plurality of time slots. This enables providing accurate travel time predictions for all sections for all time slots.
[0052] In an example, the prediction model may include a temporal
component that indicates dependence of travel time at a section in a trip on travel time at the section in a previous trip. The previous trip may have been undertaken by a previous vehicle, i.e., a vehicle that traversed the sections at an earlier instant of time. For instance, consider that a first public transport bus travelling on a route has crossed 40 sections on the route. Consider also that a second public transport bus has just started from the origin on the route. In such a case, the travel time for the first public transport bus at the third section from the origin may be used to predict the travel time for the second public transport bus at the third section. The inclusion of the
temporal dependence in the prediction model will be mathematically
represented below:
[0053] If z^pv_k and z_k denote the travel time at the kth section of the
previous vehicle and the current vehicle, respectively, then z^pv_k may be
represented as follows:

z^pv_k = z_k + n(k)    (2)

where n(k) is additive noise with a possibly non-zero mean c_0(k) and
variance σ²_v(k). The dependence of the travel time of the current vehicle on
the travel time of the previous vehicle may be referred to as a temporal
dependency. The below equations describe how the temporal dependencies are
learnt:

z_n = {z_{nkj} : k indexes all days, j = 1, ..., Tr(k)},    (3)

where Tr(k) is the number of trips on the kth day, z_n is the travel time observation
vector (d × 1) at the nth section, and d is the number of trips across all D
days in a time slot, such as a one-hour slot.

OP = [z_{n11}, ..., z_{n1(Tr(1)-1)}, z_{n21}, ..., z_{n2(Tr(2)-1)}, ..., z_{nD1}, ..., z_{nD(Tr(D)-1)}]    (4)

IP = [z_{n12}, ..., z_{n1Tr(1)}, z_{n22}, ..., z_{n2Tr(2)}, ..., z_{nD2}, ..., z_{nDTr(D)}]    (5)
The significance of the equations (4) and (5) may be explained as below:
[0054] Consider a data series z_{11}, z_{12}, z_{13}, ..., z_{1Tr(1)} for a first day
with Tr(1) trips on the first day, z_{21}, z_{22}, z_{23}, ..., z_{2Tr(2)} for a second day with Tr(2) trips on the second day, and so on for multiple trips on multiple days. Here, OP is a series formed as the 1st trip of the first day to the last-but-one trip of the first day, the 1st trip of the second day to the last-but-one trip of the second day, and so on. Similarly, IP is a series formed as the 2nd trip of the first day to the last trip of the first day, the 2nd trip of the second day to the last trip of the second day, and so on. Therefore, the difference between OP and IP, i.e., OP - IP, is the difference between the immediately preceding trip and the current trip on the same day. Therefore, the difference OP - IP is computed, and subsequently, the sample mean (c_0(n)) and variance (σ²_v(n)) of OP - IP may be computed.
[0055] Now, the utilization of the prediction model to predict the section
travel times will be explained. In an implementation, the prediction model
may be utilized by a predictive filtering technique for performing the
prediction. An example, of a predictive filter is a Kalman-Filter (KF).
Accordingly, the predictive filtering technique may be referred to as Kalman-
Filter technique (KFT). To enable utilization by the predictive filter, the
prediction module 110 may formulate a state-space based linear dynamical
system (LDS) model based on the prediction model. The state-space-based
LDS model may be referred to as a state-space model. Since, at each
section, an overall order of dependence (Q) on the previous sections is
allowed, a one-dimensional state-space model cannot be used.
Accordingly, a state-space model is built using the current vehicle's
(observed) section travel times up to section m and the previous vehicle's
section travel times beyond section m. The state-space model, usable by
the Kalman filter, is represented in the below equations:
X(k) = A(k-1) X(k-1) + A0(k-1) + w(k) (6)
y(k) = C(k) X(k) + C0(k) + v(k) (7)
where X(k) is a state variable, also referred to as the state vector, that describes the system behavior, and y(k) is a measurement variable. X(k) captures the travel time in the current section along with the travel times in the preceding sections, and X(k-1) is the corresponding state at the previous step. Further, w(k) represents process noise, i.e., noise in the state equation, and v(k) represents measurement noise, i.e., noise in the observation equation.
[0056] To develop the state-space model, the following computations
are performed by the prediction module 110:
General state vector X(k) = [z_{m+k}, z_{m+k-1}, ..., z_{m+k-Q+1}]^T    (8)
y(k) = z^pv_{m+k}    (9)
A(0) = [ b_{(m+1)1}  b_{(m+1)2}  ...  b_{(m+1)Q} ; I_{Q-1}  0_{(Q-1)×1} ]   (Q × Q)    (10)
A(k) = [ b_{(m+1+k)1}  b_{(m+1+k)2}  ...  b_{(m+1+k)Q} ; I_{Q-1}  0_{(Q-1)×1} ]   (Q × Q)    (11)
C(k) = [1, 0, ..., 0]   (1 × Q)    (12)
C_0(k) = [c_0(m+k)]    (13)
A_0(0) = [b_{(m+1)0}, 0, ..., 0]^T   (Q × 1)    (14)
A_0(k) = [b_{(m+1+k)0}, 0, ..., 0]^T   (Q × 1)    (15)
w(k) = [w(k), 0, ..., 0]^T   (Q × 1), w(k) ~ N(0, σ²_w(m+k)); v(k) ~ N(0, σ²_v(m+k))    (16)
R_w(k) = E[w(k) w(k)^T]   (Q × Q), R_v(k) = σ²_v(m+k)    (17)
In the equations (10) and (11), the first row holds the regression coefficients and the remaining Q-1 rows form the downshift matrix [I_{Q-1}  0_{(Q-1)×1}].
[0057] The above computations are part of the Kalman filtering
technique. For the equations (8) - (17), b_n and σ²_w(n) are obtained as inputs
from the block 212 of the method 200, as explained earlier. Further, c_0(n) and
σ²_v(n), which were explained with reference to equation (2), are obtained as
inputs from the learnt temporal dependencies. Further, w is the associated
process disturbance and R is the corresponding covariance.
[0058] It is to be noted that the dimension of X(k), the state vector, here
is Q. X(k) captures the travel times of Q consecutive sections ending at section m+k. Here, the linear regression parameters learnt in the method 200 form the first row of A(k). The remaining Q-1 rows of A(k) facilitate a 1-step downshift of the first Q-1 components of X(k). The bias terms of the linear regression are captured in A_0(k). The non-zero mean of the observation noise is captured in C_0(k). Since the travel time of the previous vehicle is an additive noise-corrupted version of that of the current vehicle (as represented by the equation (2)), the first component of C(k) is always 1. The residual error variances are captured by the first component of w(k) and by v(k).
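By way of illustration only, the assembly of the matrices of equations (8)-(17) for a given step k may be sketched as below; the function name and the containers b and c0 (holding the learnt regression coefficients and temporal-noise means, respectively) are illustrative assumptions.

```python
import numpy as np

def build_lds_matrices(b, c0, m, k, Q):
    """Assemble A(k), A0(k), C(k), C0(k) of equations (10)-(15), assuming
    b[n] = [b_n0, b_n1, ..., b_nQ] are the regression coefficients of section n
    and c0[n] is the learnt temporal-noise mean for section n."""
    n = m + k + 1                        # section whose regression drives step k+1
    A = np.zeros((Q, Q))
    A[0, :] = b[n][1:]                   # first row: regression coefficients
    A[1:, :-1] = np.eye(Q - 1)           # remaining rows: 1-step downshift
    A0 = np.zeros(Q)
    A0[0] = b[n][0]                      # bias term
    C = np.zeros(Q)
    C[0] = 1.0                           # observation picks the first state component
    C0 = c0[m + k]
    return A, A0, C, C0
```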
[0059] The state-space model, as explained above, may be utilized to
perform travel time prediction for a vehicle at a section based on the travel
time of the vehicle at the second number of sections immediately preceding
the section on the route. For the prediction, the KFT may be used. The
prediction performed is explained with the help of below equations:
Initial state vector X(0) = [z_m, z_{m-1}, ..., z_{m-Q+1}]^T,  D(0) = X(0)    (18)
X'(0|0) = 0_{Q×1}    (19)
P(0|0) = 0_{Q×Q}    (20)
Observation vector [y(1), ..., y(K)] = [z^pv_{m+1}, ..., z^pv_{m+K}]    (21)
for k ← 1 to K:
D(k) = A(k-1) D(k-1) + A_0(k-1)    (22)
X'(k) = X(k) - D(k)    (23)
E(k) = C(k) D(k) + C_0(k)    (24)
y'(k) = y(k) - E(k)    (25)
X'(k|k-1) = A(k-1) X'(k-1|k-1)    (26)
P(k|k-1) = A(k-1) P(k-1|k-1) A^T(k-1) + R_w(k)    (27)
KG(k) = P(k|k-1) C^T(k) [C(k) P(k|k-1) C^T(k) + R_v(k)]^{-1}    (28)
X'(k|k) = X'(k|k-1) + KG(k) [y'(k) - C(k) X'(k|k-1)]    (29)
P(k|k) = [I - KG(k) C(k)] P(k|k-1)    (30)
X(k|k) = X'(k|k) + D(k)    (31)
z_{m+k} = X(k|k)(1)    (32)
[0060] The equations (18) - (32) correspond to the steps in the KFT.
Equations (18) - (27) are referred to as time update equations and use the model and system inputs to compute an a priori state estimate. Equations (28) - (32) are referred to as measurement update equations, which use the output measurements to obtain an a posteriori estimate.
[0061] Overall, the above equations may be run recursively, so
that new measurements can be processed when they are obtained. The
running of the above equations recursively may be referred to as recursive use of the KFT. To run the above equations, only the current instant state estimate, current input, and output measurements are utilized to calculate the next instant's state estimate. As will be understood, the current input may include the real-time data of section travel times in the first number of sections and the section travel times in the previous trip along the route. [0062] As will be understood, A(k), A0(k), C(k), and C0(k), obtained from equations (11), (15), (12), and (13), respectively, are utilized in the above equations. Further, in the above equations, z1,z2,...,zm represent current travel time data of current vehicle at its first m sections and
z^pv_{m+1}, z^pv_{m+2}, ... represent travel time data of the previous vehicle beyond the mth section. Further, z_{m+k} represents the travel time prediction for the kth section ahead of the mth section. Here, k may be any positive integer. Thus, using the present subject matter, section travel times can be predicted not only for the immediately next section (the (m+1)th section) that a vehicle is to arrive at, but also for sections after the immediately next section, such as the (m+2)th section, the (m+3)th section, and the like. Accordingly, the present subject matter can be used for accurate prediction of section travel times at several sections that a vehicle is yet to cross.
[0063] The state-space model can be viewed as a zero-input model
with non-zero time-varying means for w(k) and v(k) due to the presence of
A0(k) and C0(k), respectively. In other words, the noise mean parameter of
the state-space model is non-zero. However, generally, KFT can be applied
for models with zero-mean noise. To make the state-space model usable
for KFT-based prediction, in an implementation, the current state variable
and measurement variable of the state-space model, i.e., (X(k); y(k)), are
linearly transformed into another set (X'(k); y'(k)) such that the noise mean
parameter is zero. For instance, the new state-space model is of the zero-
input and zero-mean noise form with the same A(k) and C(k) matrices.
[0064] The two are related by X'(k) = X(k) - D(k) (as represented by
equation (23)) and y'(k) = y(k) - E(k) (as represented by equation (25)). D(k) may be referred to as a first linear term and E(k) may be referred to as a second linear term. D(k)
and E(k) may be recursively computed as given by equations (22) and (24), respectively. The computation of the first linear term and the second linear term may be performed during each recursive use of the KFT. Equations (26) - (32) show the standard KF updates on the zero-mean noise LDS (X'(k), y'(k)). Hence, y'(k) from equation (25) is fed to the KF updates of equations (26) - (32). To obtain the prediction of the original variables X(k), D(k) is added in equation (31) to offset the subtraction of the first linear term from the state variable. Thus, the linearly transformed value X'(k|k) from equation (29) is un-transformed at equation (31) for outputting the predicted section travel time. The travel time prediction at the kth section ahead is the first component of X(k|k).
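By way of illustration only, the recursion of equations (18)-(32), including the linear transformation by D(k) and E(k) and the un-transformation at equation (31), may be sketched as below. The function name and the representation of A(k), A0(k), C(k), C0(k), Rw(k), and Rv(k) as callables are illustrative assumptions.

```python
import numpy as np

def predict_travel_times(X0, y_obs, A, A0, C, C0, Rw, Rv):
    """Sketch of equations (18)-(32): Kalman filtering on the linearly
    transformed (zero-mean noise) model, then un-transforming to recover the
    predicted section travel times.

    A, A0, C, C0, Rw, Rv : callables returning the step-k quantities
    y_obs : previous vehicle's travel times z^pv_{m+1..m+K}"""
    Q = len(X0)
    D = X0.copy()                                       # D(0) = X(0)
    Xp = np.zeros(Q)                                    # X'(0|0)
    P = np.zeros((Q, Q))                                # P(0|0)
    predictions = []
    for k, y in enumerate(y_obs, start=1):
        D = A(k - 1) @ D + A0(k - 1)                    # eq (22)
        E = C(k) @ D + C0(k)                            # eq (24)
        y_t = y - E                                     # eq (25)
        Xp = A(k - 1) @ Xp                              # eq (26) time update
        P = A(k - 1) @ P @ A(k - 1).T + Rw(k)           # eq (27)
        S = C(k) @ P @ C(k) + Rv(k)                     # innovation variance (scalar)
        KG = P @ C(k) / S                               # eq (28)
        Xp = Xp + KG * (y_t - C(k) @ Xp)                # eq (29) measurement update
        P = (np.eye(Q) - np.outer(KG, C(k))) @ P        # eq (30)
        X_post = Xp + D                                 # eq (31) un-transform
        predictions.append(X_post[0])                   # eq (32): predicted z_{m+k}
    return predictions
```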
[0065] Using the models of the present subject matter, the prediction
of the travel time is optimal if the associated distributions are Gaussian. However, in some cases, the distribution of the travel times (which are always positive) in the history dataset at any particular section and time slot may be right-skewed, for example, lognormal. In such cases, a log transformation may be applied on each travel time observation in the history dataset before applying the method 200 and before performing the computations of equations (2) - (32). The log transformation may be performed while collating travel times as part of the history dataset. This makes the marginal distributions approximately Gaussian, so that the predictions made are close to optimal. Subsequently, the predictions output from equation (32), i.e., the predicted section travel times, are exponentiated to obtain the final predictions.
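By way of illustration only, the log transformation and the final exponentiation may be sketched as below; the numerical values shown are hypothetical.

```python
import numpy as np

# Illustrative handling of right-skewed (lognormal-like) travel times:
# log-transform before model fitting and filtering, exponentiate the predictions.
travel_times = np.array([55.0, 61.0, 140.0, 58.0, 64.0])  # hypothetical section travel times (s)
log_times = np.log(travel_times)           # used for fitting and for the KF recursion

predicted_log = 4.12                       # hypothetical output of equation (32) in log space
predicted_seconds = np.exp(predicted_log)  # final prediction, back in seconds
```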
[0066] Fig. 3 illustrates a method 300 for developing a prediction model
usable in prediction of travel time of vehicles at sections along a route that has a plurality of sections, in accordance with an implementation of the present subject matter. The travel time at a section may refer to the time taken to cross the section.
[0067] The order in which the method 300 is described is not intended
to be construed as a limitation, and any number of the described method
blocks may be combined in any order to implement the method 300, or an
alternative method. Furthermore, the method 300 may be implemented by
processor(s) or computing device(s) through any suitable hardware, non-
transitory machine-readable instructions, or a combination thereof.
[0068] It may be understood that steps of the method 300 may be
performed by programmed computing devices and may be executed based on instructions stored in a non-transitory computer readable medium. The non-transitory computer readable medium may include, for example, digital memories, magnetic storage media, such as one or more magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media. Further, although the method 300 may be implemented in a variety of systems, the method 300 is described in relation to the aforementioned system 100, for ease of explanation. The steps of the method 300 may be performed by the prediction module 110.
[0069] At block 302, a time series is formulated using a history dataset. The history dataset includes historical data of travel times at each of the plurality of sections for a plurality of trips on the route for at least one vehicle. The time series may be, for example, the time series S', which was generated at block 204.
[0070] In an example, for developing the history dataset, the method
300 includes partitioning the route into the plurality of sections, collecting the travel time in the plurality of sections for a plurality of trips on the route for at least one vehicle, and collating the travel time as the history dataset. [0071] At block 304, an overall order of dependence for the plurality of sections is determined based on the time series. The overall order of dependence is a first number of sections immediately before a section on the route that influence the travel time of the vehicle at the section. The overall order of dependence may be, for example, Q, as explained with
reference to Fig. 2. The overall order of dependence may be a common value for all sections on the route.
[0072] At block 306, a prediction model is formulated. The prediction
model indicates dependence of travel time of the vehicle at each section on travel time at the first number of sections immediately before the section. The prediction model comprises variables and coefficients corresponding to the variables for each section of the plurality of sections. The prediction model may be, for example, the prediction model represented by the equation (1).
[0073] At block 308, for each section of the plurality of sections, a
sectional order of dependence is determined based on the time series. The sectional order of dependence for a section is a second number of sections immediately before the section on the route that influence the travel time of the vehicle at the section. The second number of sections may be lesser than or equal to the first number of sections. The second number of sections may be, for example, Q-f.
[0074] At block 310, the coefficients for each section of the plurality of
sections are estimated. The estimation may be based on the sectional order of dependence and the overall order of dependence, and may be performed by fitting the prediction model to the travel times in the history dataset. The estimation of the coefficients may be performed in the manner as explained with reference to block 212.
[0075] Fig. 4 illustrates a method 400 for predicting travel time of
vehicles at sections along a route that has a plurality of sections, in accordance with an implementation of the present subject matter. The route may have an origin and a destination.
[0076] The order in which the method 400 is described is not intended
to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method 400, or an alternative method. Furthermore, the method 400 may be implemented by
processor(s) or computing device(s) through any suitable hardware, non-
transitory machine-readable instructions, or a combination thereof.
[0077] It may be understood that steps of the method 400 may be
performed by programmed computing devices and may be executed based on instructions stored in a non-transitory computer readable medium. The non-transitory computer readable medium may include, for example, digital memories, magnetic storage media, such as one or more magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media. Further, although the method 400 may be implemented in a variety of systems, the method 400 is described in relation to the aforementioned system 100, for ease of explanation. The steps of the method 400 may be performed by the prediction module 110.
[0078] At block 402, real-time data of travel time of a vehicle
at each section of a first plurality of sections travelled by the vehicle in a current trip is received. The first plurality of sections is a subset of a plurality of sections on a route. For instance, if the route has a total of 50 sections, and the vehicle has crossed 10 sections so far, the travel time of the vehicle at each of the 10 sections may be received.
[0079] At block 404, travel time at a second plurality of sections on the
route from a previous trip on the route is received. The second plurality of
sections may be a superset of the first plurality of sections. A previous trip
on the route may have been undertaken by another vehicle. For instance,
while the vehicle may have crossed 10 sections starting from the origin of
the route, the other vehicle may have crossed 45 sections on the route.
[0080] At block 406, a state-space model is received. The state-space
model is generated from a prediction model.
[0081] The prediction model indicates dependence of travel time of a
vehicle at a section on travel time of the vehicle on a second number of sections immediately before the section on the route. The second number of sections of one section may be different from that of another section. The prediction model may be the prediction model generated from the method
300, which may include the variables and coefficients corresponding to each section. The state-space model may be, for example, the model explained with reference to equations (6) and (7). The second number of sections may be, for example, Q-f. The prediction model may also indicate dependence of travel time at a section in one trip on the route on travel time at the section in a previous trip on the route, as explained with reference to equations (2)-(5).
[0082] At block 408, travel time of the vehicle for at least one section
to be traversed in the current trip is predicted using the prediction model and the real-time data. For instance, as explained with reference to equation (31), the section travel time at any kth section ahead of the current section traversed by the vehicle can be predicted using the state-space model. The prediction of the travel time may involve recursive use of the Kalman filtering technique (KFT), as explained earlier.
[0083] The accuracy of the travel times predicted by the present
subject matter is illustrated with the help of a few examples below.
EXAMPLES
[0084] The proposed methodology was implemented on a bus route in
a city in India, and data was collected across 34 days. All the plying buses were GPS enabled, with their position information logged every 5 seconds. On the bus route, there are 20 bus stops and 13 intersections. The bus route represents heterogeneous and lane-less traffic conditions and includes varying geometric characteristics, volume levels, and land use characteristics. The GPS data were from 6 AM to 10 PM on the selected route. The collected GPS data included the ID of the GPS unit, the time stamp, and the latitude and longitude of the location at which the entry was made. During the processing, the distance between any two consecutive entries was calculated using the Haversine formula. The bus route was segmented into sections each of 500 m length, and the time taken to cover each section was calculated by linear interpolation from the high-frequency GPS data. The
data across all sections and trips over 34 days was used as input for training
and testing.
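By way of illustration only, the computation of the Haversine distance between consecutive GPS entries and the linear interpolation of 500 m section travel times may be sketched as below; the function names and the input layout are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (Haversine formula)."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def section_travel_times(fixes, section_len=500.0):
    """fixes: list of (timestamp_s, lat, lon) in trip order. Returns the time taken
    to cover each 500 m section, with section-boundary crossing times obtained by
    linear interpolation between consecutive GPS fixes."""
    cum = [0.0]
    for (_, la0, lo0), (_, la1, lo1) in zip(fixes, fixes[1:]):
        cum.append(cum[-1] + haversine_m(la0, lo0, la1, lo1))
    times = [f[0] for f in fixes]
    boundary_times, b, i = [times[0]], section_len, 1
    while b <= cum[-1]:
        while cum[i] < b:
            i += 1
        frac = (b - cum[i - 1]) / (cum[i] - cum[i - 1])
        boundary_times.append(times[i - 1] + frac * (times[i] - times[i - 1]))
        b += section_len
    return [t1 - t0 for t0, t1 in zip(boundary_times, boundary_times[1:])]
```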
[0085] For learning, the trips from the 27 training days were grouped into
hourly slots and all trips with start times falling in a given one-hour slot were
assumed to follow a specific LDS model.
[0086] First, a sample plot of the predicted travel times is compared
with measured travel times for a peak period (8 AM - 11AM and 3 PM - 8
PM) and for an off-peak period (midnight - 8 AM, 11 AM - 3 PM, and 8 PM
- midnight) trip on all sections.
[0087] Fig. 5 illustrates the predicted and measured travel times for a
peak period, in accordance with an implementation of the present subject
matter. From Fig. 5, it is clear that the predicted travel times closely follow
the actual travel times.
[0088] In addition, the performance of the present subject matter was
compared with and without (just using the non-stationary AR predictions)
applying the Kalman filtering.
[0089] Fig. 6 illustrates the results of comparison of the performance
of the present subject matter with and without Kalman filtering for a sample
test day, in accordance with an implementation of the present subject
matter. The results clearly vindicate the utility of filtering. It is noted that KF
reduces the Mean Absolute Percentage Error (MAPE) by about 6.3% in the
worst case.
[0090] Next, the results of the present subject matter were compared
with three conventional methods: (a) historical average (HA) of the training
data which will serve as a baseline comparison, (b) Space discretization
(SD) (a conventional method which uses a data-based model and employs
model calibration), and (c) an Artificial Neural Network (ANN) method, which
learns (using historical data) temporal dependencies from previous trips to
predict travel at each section. The prediction was carried out for a test period
of one week in the selected route and the prediction accuracy was
calculated in terms of MAPE and Mean Absolute Error (MAE). Percentage
error is the absolute error divided by the true (measured) value, expressed as a
percentage. The prediction accuracy of a technique may also be referred to
as the performance of the technique.
[0091] Fig. 7 illustrates the results of comparison of the performance
of the present subject matter with the aforesaid three methods for several
days, in accordance with an implementation of the present subject matter.
It can be observed that the proposed method consistently outperforms the
existing methods by reducing MAPE up to 16%, 4% and 4.6% as compared
to historical average, space discretization, and ANN approaches,
respectively.
[0092] Next, comparisons were performed at a finer level, i.e., at a
section level and trip level.
[0093] Fig. 8(a) illustrates reduction of MAPE provided by the present
subject matter compared to the HA method for each section, in accordance with an
implementation of the present subject matter.
[0094] Fig. 8(b) illustrates reduction of MAPE provided by the present
subject matter compared to the SD method for each section, in accordance with an
implementation of the present subject matter.
[0095] Fig. 8(c) illustrates reduction of MAPE provided by the present
subject matter compared to the ANN method for each section, in accordance with an
implementation of the present subject matter.
[0096] From Figs. 8(a) and (b), it can be observed that the present
subject matter outperforms HA and SD methods across all sections with a
worst-case improvement of up to 16% and 14%, respectively, in terms of
MAPE. Further, from Fig. 8(c), it is clear that the present subject matter
provides a worst-case improvement of up to 8% in terms of MAPE
compared to the ANN method.
[0097] Fig. 9 illustrates a comparison of the present subject matter with
the SD and ANN methods in terms of MAPE for all trips that happened on a
sample day, in accordance with an implementation of the present subject
matter. From Fig 9, it can be observed that the MAPE varied from 14% to
25% for the present subject matter, 14% to 37% for the SD method, and
17% to 44% for the ANN method. Further, the present subject matter
provides a superior performance in terms of MAPE over the HA method
even at the trip level.
[0098] Fig. 10 illustrates the superior performance in terms of MAPE
over the HA method at the trip level, in accordance with an implementation
of the present subject matter. As can be seen, a worst-case improvement
of up to 26% is obtained. The improvements over the SD method are also
uniform as before, with a worst-case improvement of 12%.
[0099] When compared with the ANN method, the improvement at a
trip level of the present subject matter is far more uniform than observed at
a section level (in Fig. 8). In particular, the improvements in terms of MAPE
are pronounced during peak trips (right half of Fig. 9) and a worst-case
improvement of up to 10% in terms of MAPE can be observed.
[00100] As explained earlier, an application of the present subject
matter is to provide the arrival information of a public transport vehicle, such
as a bus, to passengers waiting at bus stops at which the public transport vehicle
is yet to arrive. Therefore, the performance of the present
subject matter was evaluated and compared by checking the deviation of
the predicted travel time from the measured travel time for each bus stop.
Further, the errors were expressed in user-understandable clock time
difference.
[00101] Fig. 11(a) illustrates the comparison of MAE values for the
arrival time predicted for bus stop A, in accordance with an implementation
of the present subject matter. The bus stop A is 7.52 km from the origin of the
bus route.
[00102] Fig. 11(b) illustrates the comparison of MAE values for the
arrival time predicted for bus stop B, in accordance with an implementation
of the present subject matter. The bus stop B is 11.03 km from the origin of the
bus route.
[00103] Here, the arrival time is predicted for the bus stops A and B when a bus is 1, 3, 5, and 10 sections away from the respective bus stop. The arrival times may be predicted using the travel time at each section. From Figs. 11(a) and (b), it can be seen that the present subject matter outperforms the HA and SD methods in all the cases. [00104] The present subject matter provides an effective travel time prediction method even under mixed traffic conditions and when there is a lack of lane discipline. The present subject matter provided an improvement in terms of error in the predicted travel time by up to 26% over baseline methods and by up to 14% over the existing state of the art. The present subject matter provides a superior performance both under (i) a single-section setting (when the vehicle is one section behind a location for which travel time is predicted) and (ii) a multi-step setting.
[00105] The present subject matter provides a data-based statistical model which fully exploits the inherent spatial dependencies while also factoring in the temporal dependency in a minimal way. For instance, the present subject matter provides a heuristic method to compute the extent (or order) of spatial dependence a given section might experience from its previous sections. Also, the historical data are utilized to learn the non-stationary (a) spatial dependencies between travel times of adjacent sections based on the above computed order and (b) temporal dependency between successive trips. Further, the learnt model is reformulated as a linear dynamical system (LDS) in a state-space form, which enables application of Kalman Filtering (KF) for prediction.
[00106] Although the present subject matter has been described with reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the subject matter, will become apparent to persons skilled in the art upon reference to the description of the subject matter.
I/We Claim:
1. A method for predicting travel time of a vehicle at a section in a route
comprising a plurality of sections, the method comprising:
formulating a time series using a history dataset, wherein the history dataset comprises historical data of travel times at each of the plurality of sections for a plurality of trips on the route;
determining an overall order of dependence for the plurality of sections based on the time series, wherein the overall order of dependence is a first number of sections immediately before a section on the route that are likely to influence the travel time of the vehicle at the section;
formulating a prediction model indicating dependence of travel time of the vehicle at each section on travel time at the first number of sections immediately before the section, wherein the prediction model comprises variables and coefficients corresponding to the variables for each section of the plurality of sections;
determining, for each section of the plurality of sections, a sectional order of dependence based on the time series, wherein the sectional order of dependence for a section is a second number of sections immediately before the section on the route that influence the travel time of the vehicle at the section, wherein the second number of sections is lesser than or equal to the first number of sections; and
estimating the coefficients for each section of the plurality of sections based on the sectional order of dependence and the overall order of dependence, by fitting the prediction model to the travel times in the history dataset, wherein the prediction model is usable to predict the travel time at each of the plurality of sections.
2. The method as claimed in claim 1, comprising:
partitioning the route into the plurality of sections;
collecting the travel time in the plurality of sections for a plurality of trips on the route; and
collating the travel time as the history dataset.
3. The method as claimed in claim 2, wherein distribution of travel time in the history dataset is lognormal, and collating the travel time as the history dataset comprises performing a log transformation on the collected travel times.
4. The method as claimed in claim 2, wherein formulating the time series comprises:
concatenating the travel time in the plurality of sections to form a preliminary time series; and
eliminating seasonal trends in the preliminary time series using an autocorrelation function (ACF) of the preliminary time series to form the time series.
5. The method as claimed in claim 1, wherein determining the overall order of dependence comprises determining a partial autocorrelation function (PACF) value of the time series and identifying the largest lag for which the PACF value is above a cut-off PACF value as the overall order of dependence.
6. The method as claimed in claim 1, wherein determining the sectional order of dependence for each section comprises:
performing a plurality of iterations of forward regression and backward regression, each iteration of forward regression and backward regression being performed with an order higher than that of a previous iteration, wherein a number of iterations performed is lesser than or equal to the overall order of dependence and wherein each iteration of forward regression and backward regression comprises:
estimating an error residual from the forward regression;
estimating an error residual from the backward regression; and
checking significance of correlation between the error residual from the forward regression and the error residual from the backward regression; and
selecting the highest order at which the significance of correlation is lesser than a significance threshold, as the sectional order of dependence for the section, wherein the highest order is lesser than or equal to the overall order of dependence.
7. The method as claimed in claim 1, comprising:
including a temporal component in the prediction model, the temporal component indicating dependence of travel time at a section in a trip on travel time at the section in a previous trip on the route.
8. The method as claimed in claim 1, comprising formulating a state-space model based on the prediction model.
9. The method as claimed in claim 8, comprising predicting travel time of a vehicle at a section based on travel time of the vehicle at the second number of sections immediately before the section on the route using the state-space model.
10. A method for predicting travel time of a vehicle in real time at a section in a current trip on a route comprising:
receiving real-time data of travel time of a vehicle at each section of a first plurality of sections travelled by the vehicle in a current trip, the first plurality of sections being a subset of a plurality of sections on a route;
receiving travel time at a second plurality of sections on the route from a previous trip on the route, the second plurality of sections being a superset of the first plurality of sections;
receiving a state-space model generated from a prediction model, wherein the prediction model indicates:
dependence of travel time of a vehicle at a section on travel time of the vehicle on a second number of sections immediately before the section on the route; and
dependence of travel time at a section in one trip on the route on travel time at the section in a previous trip on the route; and
predicting travel time of a vehicle for at least one section to be traversed in the current trip using the prediction model and the real-time data.
11. The method as claimed in claim 10, wherein predicting travel time of the vehicle comprises recursive use of Kalman filtering technique (KFT).
12. The method as claimed in claim 11, wherein a noise mean parameter of the state-space model is non-zero, and the recursive use of the KFT comprises:
linearly transforming a state variable and a measurement variable of the state-space model such that the noise mean parameter is zero, wherein the linear transformation comprises subtraction of a first linear term from the state variable; and
adding the first linear term to the predicted state variable to offset the subtraction.
13. A system for predicting travel time of a vehicle at a section in a route comprising a plurality of sections, the system comprising:
a processor; and
a prediction module coupled to the processor, wherein the prediction module is executable by the processor to:
formulate a time series using a history dataset, wherein the history dataset comprises historical data of travel times at each of the plurality of sections for a plurality of trips on the route for at least one vehicle;
determine an overall order of dependence for the plurality of sections based on the time series, wherein the overall order of dependence is a first number of sections immediately before a section on the route that influence the travel time of the vehicle at the section;
formulate a prediction model indicating dependence of travel time of the vehicle at each section on travel time at the first number of sections immediately before the section, wherein the prediction model comprises variables and coefficients corresponding to the variables for each section of the plurality of sections;
determine, for each section of the plurality of sections, a sectional order of dependence based on the time series, wherein the sectional order of dependence for a section is a second number of sections immediately before the section on the route that influence the travel time of the vehicle at the section, wherein the second number of sections is lesser than or equal to the first number of sections; and
estimate the coefficients for each section of the plurality of sections based on the sectional order of dependence and the overall order of dependence, by fitting the prediction model to the travel times in the history dataset, wherein the prediction model is usable to predict the travel time at each of the plurality of sections.
14. The system as claimed in claim 13, wherein the prediction module is executable to:
formulate a state-space model based on the prediction model; and
predict travel time of a vehicle at a section based on travel time of the vehicle at the second number of sections immediately before the section on the route using the state-space model.
15. The system as claimed in claim 13, wherein, to determine the sectional order of dependence for each section, the prediction module is executable to:
perform a plurality of iterations of forward regression and backward regression, each iteration of forward regression and backward regression being performed with an order higher than that of a previous iteration, wherein a number of iterations performed is lesser than or equal to the overall order of dependence and wherein, to perform each iteration of forward regression and backward regression, the prediction module is executable to:
estimate an error residual from the forward regression;
estimate an error residual from the backward regression; and
check significance of correlation between the error residual from the forward regression and the error residual from the backward regression; and
select the highest order at which the significance of correlation is lesser than a significance threshold, as the sectional order of dependence for the section, wherein the highest order is lesser than or equal to the overall order of dependence.
16. The system as claimed in claim 13, wherein the prediction module is executable to:
introduce a temporal component in the prediction model, the temporal component indicating dependence of travel time at a section in a trip on travel time at the section in a previous trip.
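The following illustrative, non-limiting sketch corresponds to one possible reading of the order-selection steps recited in claims 5, 6 and 15: an overall order taken as the largest lag whose partial autocorrelation exceeds a cut-off, and a sectional order taken as the highest order at which the forward- and backward-regression error residuals remain significantly correlated. The regression-based PACF estimate, the 1.96/sqrt(N) cut-off, and the use of a Pearson p-value as the significance measure are assumptions made for this sketch.

```python
# Illustrative order selection from a section-travel-time series.
# The cut-off, the regression-based PACF estimate and the Pearson p-value
# are assumptions made for this sketch.
import numpy as np
from scipy.stats import pearsonr

def _ols(y, X):
    """Ordinary least squares with an intercept; returns (coefficients, residuals)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta, y - X1 @ beta

def overall_order(series, max_lag=15):
    """Largest lag whose regression-based PACF exceeds 1.96/sqrt(N)."""
    N, cutoff, best = len(series), 1.96 / np.sqrt(len(series)), 1
    for k in range(1, max_lag + 1):
        X = np.column_stack([series[k - j - 1:N - j - 1] for j in range(k)])
        beta, _ = _ols(series[k:], X)
        if abs(beta[-1]) > cutoff:           # beta[-1] estimates the PACF at lag k
            best = k
    return best

def sectional_order(series, max_order, alpha=0.05):
    """Highest order (capped at max_order) at which the residuals of the
    forward and backward regressions are significantly correlated."""
    N, chosen = len(series), 1
    for p in range(1, max_order + 1):
        y = series[p:N - p]
        X_fwd = np.column_stack([series[p - j - 1:N - p - j - 1] for j in range(p)])
        X_bwd = np.column_stack([series[p + j + 1:N - p + j + 1] for j in range(p)])
        _, res_fwd = _ols(y, X_fwd)
        _, res_bwd = _ols(y, X_bwd)
        _, p_value = pearsonr(res_fwd, res_bwd)
        if p_value < alpha:                  # correlation still significant
            chosen = p
    return chosen

# Toy usage on a synthetic second-order autoregressive series.
rng = np.random.default_rng(1)
s = np.zeros(500)
for t in range(2, 500):
    s[t] = 0.5 * s[t - 1] + 0.3 * s[t - 2] + rng.standard_normal()
P = overall_order(s)
print(P, sectional_order(s, P))
```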
| # | Name | Date |
|---|---|---|
| 1 | 201821022194-STATEMENT OF UNDERTAKING (FORM 3) [13-06-2018(online)].pdf | 2018-06-13 |
| 2 | 201821022194-PROVISIONAL SPECIFICATION [13-06-2018(online)].pdf | 2018-06-13 |
| 3 | 201821022194-FORM 1 [13-06-2018(online)].pdf | 2018-06-13 |
| 4 | 201821022194-DRAWINGS [13-06-2018(online)].pdf | 2018-06-13 |
| 5 | 201821022194-Proof of Right (MANDATORY) [18-07-2018(online)].pdf | 2018-07-18 |
| 6 | 201821022194-FORM-26 [18-07-2018(online)].pdf | 2018-07-18 |
| 7 | 201821022194-ORIGINAL UR 6(1A) FORM 1 & FORM 26-240718.pdf | 2018-10-26 |
| 8 | 201821022194-DRAWING [13-06-2019(online)].pdf | 2019-06-13 |
| 9 | 201821022194-CORRESPONDENCE-OTHERS [13-06-2019(online)].pdf | 2019-06-13 |
| 10 | 201821022194-COMPLETE SPECIFICATION [13-06-2019(online)].pdf | 2019-06-13 |
| 11 | 201821022194-FORM 18 [14-06-2019(online)].pdf | 2019-06-14 |
| 12 | 201821022194-FORM-26 [20-06-2019(online)].pdf | 2019-06-20 |
| 13 | 201821022194-ORIGINAL UR 6(1A) FORM 26-260619.pdf | 2019-07-22 |
| 14 | Abstract1.jpg | 2019-08-09 |
| 15 | 201821022194-FORM 3 [08-03-2021(online)].pdf | 2021-03-08 |
| 16 | 201821022194-OTHERS [09-03-2021(online)].pdf | 2021-03-09 |
| 17 | 201821022194-FER_SER_REPLY [09-03-2021(online)].pdf | 2021-03-09 |
| 18 | 201821022194-CLAIMS [09-03-2021(online)].pdf | 2021-03-09 |
| 19 | 201821022194-FER.pdf | 2021-10-18 |
| 20 | 201821022194-FORM-8 [17-12-2021(online)].pdf | 2021-12-17 |
| 21 | 201821022194-US(14)-HearingNotice-(HearingDate-11-01-2024).pdf | 2023-12-12 |
| 22 | 201821022194-Correspondence to notify the Controller [18-12-2023(online)].pdf | 2023-12-18 |
| 23 | 201821022194-Correspondence to notify the Controller [22-12-2023(online)].pdf | 2023-12-22 |
| 24 | 201821022194-FORM-26 [09-01-2024(online)].pdf | 2024-01-09 |
| 25 | 201821022194-Written submissions and relevant documents [25-01-2024(online)].pdf | 2024-01-25 |
| 26 | 201821022194-PatentCertificate13-02-2024.pdf | 2024-02-13 |
| 27 | 201821022194-IntimationOfGrant13-02-2024.pdf | 2024-02-13 |