Predict method for TVGEV objects.
An object with S3 class "TVGEV".
A vector with class "Date" or that can be coerced to this class.
Numeric vector of periods expressed as multiples of the block
duration. Usually the block duration is one year, so a value of 100
will correspond to a 100-year return level, i.e. to the GEV quantile
with probability 0.99. The default value allows the construction of
an acceptable RL plot, but if a smoother RL curve and/or Confidence
Band is needed, more values should be used (see the sketch after
these argument descriptions).
Numeric vector of confidence level(s).
The method used to compute the confidence intervals. See Details.
Type of output.
Logical, used only when confintMethod is "boot". When TRUE, the RL
named "Quantile" is computed as the average of the RLs obtained with
the bootstrapped parameters, and it will differ from the RL computed
with the estimated parameters in object.
Integer level of verbosity.
Arguments to be passed to bs.
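As an illustration of the period argument described above, a denser
grid of periods gives a smoother RL curve. This is a minimal sketch
only: it assumes the fitted model res2 from the examples below, and
the formal name 'period' is inferred from the description above.

## Hedged sketch: a period of 100 blocks corresponds to the GEV
## quantile with probability 1 - 1/100 = 0.99. The name 'period' is
## an assumption; 'res2' is the fitted model from the examples below.
pSmooth <- predict(res2,
                   period = c(2, 5, 10, 20, 50, 100, 200, 500, 1000),
                   confintMethod = "delta")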
A data frame or an array with the RL and confidence limits for each
combination of date, period and confidence level. When confintMethod
is not "none", the three values Quant, L and U are given for each
combination (the first does not depend on the confidence level). The
data frame is in 'long' format, with different rows for multiple
confidence levels.
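For instance, the 'long' output can be inspected as follows. This is
a hedged sketch: it assumes res2 from the examples below and that the
default output is a data frame; the column name 'Period' used in the
subset call is an assumption.

## Hedged sketch: inspect the 'long' output.
p <- predict(res2, newdate = "2020-01-01", confintMethod = "delta")
head(p)                   ## one row per date x period x level
subset(p, Period == 100)  ## 100-year RL with its lower/upper limits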
Compute the Return Levels (RLs) for different periods and at
different time blocks (dates), as well as Confidence Intervals (CIs)
for these. The results are not predictions in the usual sense of the
word. The CIs can be obtained by the usual delta method, provided
that the approximate covariance for the estimated parameters is found
in object. They can also be obtained by bootstrap, using by default
the bootstrap distribution embedded in object if any, or by computing
it otherwise. The bootstrap can be parametric or non-parametric, see
bs.TVGEV. Finally, the profile-likelihood method can be used: the
confidence limits for a RL are obtained by maximising and minimising
this RL under the constraint that the log-likelihood remains greater
than a suitable value. This method avoids the usual
re-parameterisation of the model, which can be tedious for the
general form of model allowed in TVGEV.
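For a given date, the delta and profile-likelihood intervals can be
compared directly. A minimal sketch, assuming the fitted model res2
from the examples below:

## Hedged sketch: delta intervals are symmetric around the estimated
## RL, whereas profile-likelihood intervals are usually asymmetric.
pDelta <- predict(res2, newdate = "2020-01-01", confintMethod = "delta")
pProf <- predict(res2, newdate = "2020-01-01", confintMethod = "proflik")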
Despite its positive-sounding name, the "bias correction" can have a
negative impact with some Extreme Value models as used here,
especially for non-parametric bootstrap. In future versions, the dots
... might be used to control the profile-likelihood as well as the
bootstrap.
When confintMethod is set to "loglik", the required computing time
can be huge because several constrained optimisations are needed.
Since the marginal distribution, and hence the RL, varies only slowly
in practice, an unnecessary computing burden will result when many
values of newdate are used.
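A coarse grid of dates is therefore usually enough. A hedged sketch,
assuming res2 from the examples below:

## Hedged sketch: one date every ten years keeps the number of
## constrained optimisations small.
coarseDates <- as.Date(sprintf("%4d-01-01", seq(1950, 2020, by = 10)))
pCoarse <- predict(res2, newdate = coarseDates, confintMethod = "proflik")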
The bootstrap method bs.TVGEV for TVGEV objects, the formal arguments
of which can be used here when confintMethod is chosen as "boot".
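For instance, bootstrap settings could be passed through the dots. A
hedged sketch only: the argument names 'R' (number of bootstrap
replicates) and 'type' (kind of bootstrap) are assumptions and must
be checked against the formals of bs.TVGEV.

## Hedged sketch: 'R' and 'type' are assumed names, see ?bs.TVGEV.
pBoot <- predict(res2, confintMethod = "boot", R = 500, type = "param")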
example(TVGEV)
#>
#> TVGEV> ## transform a numeric year into a date
#> TVGEV> df <- within(TXMax_Dijon, Date <- as.Date(sprintf("%4d-01-01", Year)))
#>
#> TVGEV> df0 <- subset(df, !is.na(TXMax))
#>
#> TVGEV> ## fit a TVGEV model. Only the location parameter is TV.
#> TVGEV> t1 <- system.time(
#> TVGEV+ res1 <- TVGEV(data = df, response = "TXMax", date = "Date",
#> TVGEV+ design = breaksX(date = Date, breaks = "1970-01-01", degree = 1),
#> TVGEV+ loc = ~ t1 + t1_1970))
#>
#> TVGEV> ## The same using "nloptr" optimisation.
#> TVGEV> t2 <- system.time(
#> TVGEV+ res2 <- TVGEV(data = df, response = "TXMax", date = "Date",
#> TVGEV+ design = breaksX(date = Date, breaks = "1970-01-01", degree = 1),
#> TVGEV+ loc = ~ t1 + t1_1970,
#> TVGEV+ estim = "nloptr",
#> TVGEV+ parTrack = TRUE))
#>
#> TVGEV> ## to use extRemes::fevd, the required variables need to be added to the
#> TVGEV> ## data frame passed as the 'data' argument
#> TVGEV> t0 <- system.time({
#> TVGEV+ df0.evd <- cbind(df0, breaksX(date = df0$Date, breaks = "1970-01-01",
#> TVGEV+ degree = 1));
#> TVGEV+ res0 <- fevd(x = df0.evd$TXMax, data = df0.evd, loc = ~ t1 + t1_1970)
#> TVGEV+ })
#>
#> TVGEV> ## compare estimate and negative log-liks
#> TVGEV> cbind("fevd" = res0$results$par,
#> TVGEV+ "TVGEV_optim" = res1$estimate,
#> TVGEV+ "TVGEV_nloptr" = res2$estimate)
#> fevd TVGEV_optim TVGEV_nloptr
#> mu0 32.06678895 32.06638460 32.06679233
#> mu1 -0.02391857 -0.02392656 -0.02391860
#> mu2 0.07727041 0.07728411 0.07727031
#> scale 1.75585289 1.75541862 1.75585346
#> shape -0.18130928 -0.18112018 -0.18130938
#>
#> TVGEV> cbind("fevd" = res0$results$value,
#> TVGEV+ "VGEV_optim" = res1$negLogLik,
#> TVGEV+ "TVGEV_nloptr" = res2$negLogLik)
#> fevd TVGEV_optim TVGEV_nloptr
#> [1,] 177.2014 177.2014 177.2014
#>
#> TVGEV> ## ====================================================================
#> TVGEV> ## use a loop on plausible break years. The fitted models
#> TVGEV> ## are stored within a list
#> TVGEV> ## ====================================================================
#> TVGEV>
#> TVGEV> ## Not run:
#> TVGEV> ##D
#> TVGEV> ##D yearBreaks <- c(1940, 1950, 1955, 1960:2000, 2005, 2010)
#> TVGEV> ##D res <- list()
#> TVGEV> ##D
#> TVGEV> ##D for (ib in seq_along(yearBreaks)) {
#> TVGEV> ##D d <- sprintf("%4d-01-01", yearBreaks[[ib]])
#> TVGEV> ##D floc <- as.formula(sprintf("~ t1 + t1_%4d", yearBreaks[[ib]]))
#> TVGEV> ##D res[[d]] <- TVGEV(data = df, response = "TXMax", date = "Date",
#> TVGEV> ##D design = breaksX(date = Date, breaks = d, degree = 1),
#> TVGEV> ##D loc = floc)
#> TVGEV> ##D }
#> TVGEV> ##D
#> TVGEV> ##D ## [continuing...] find the model with maximum likelihood, and plot
#> TVGEV> ##D ## something like a profile likelihood for the break date considered
#> TVGEV> ##D ## as a new parameter. However, the model is not differentiable w.r.t.
#> TVGEV> ##D ## the break!
#> TVGEV> ##D
#> TVGEV> ##D ll <- sapply(res, logLik)
#> TVGEV> ##D plot(yearBreaks, ll, type = "o", pch = 21, col = "orangered",
#> TVGEV> ##D lwd = 2, bg = "gold", xlab = "break", ylab = "log-lik")
#> TVGEV> ##D grid()
#> TVGEV> ##D iMax <- which.max(ll)
#> TVGEV> ##D abline(v = yearBreaks[iMax])
#> TVGEV> ##D abline(h = ll[iMax] - c(0, qchisq(0.95, df = 1) /2),
#> TVGEV> ##D col = "SpringGreen3", lwd = 2)
#> TVGEV> ##D
#> TVGEV> ## End(Not run)
#> TVGEV>
#> TVGEV>
#> TVGEV>
p1 <- predict(res2, confintMethod = "none")
#> Since 'object' is really time-varying, the Return Levels
#> depend on the date. A default choice of dates is made here.
#> Use the 'newdate' formal to change this.
p2 <- predict(res2, confintMethod = "delta")
#> Since 'object' is really time-varying, the Return Levels
#> depend on the date. A default choice of dates is made here.
#> Use the 'newdate' formal to change this.
p3 <- predict(res2, confintMethod = "boot")
#> Since 'object' is really time-varying, the Return Levels
#> depend on the date. A default choice of dates is made here.
#> Use the 'newdate' formal to change this.
#> 'object' does not embed a bootstrap distribution. Using `bs` to compute this.
p4 <- predict(res2, newdate = "2020-01-01", confintMethod = "proflik")
#>
#> o Finding CI for Return Levels
#>
#> **************
#> Lower bounds
#> **************
#>
#> Confidence Level: 0.95
#>
#> o Date: 2020-01-01
#>
#> - Period: 1000
#> Optimisation status: 3
#> Iterations: 63
#> Objective: 39.81
#> Constraint check -0.0000411
#> Gradient directions: 0.0028
#>
#> - Period: 500
#> Optimisation status: 3
#> Iterations: 13
#> Objective: 39.58
#> Constraint check -0.0000018
#> Gradient directions: 0.0009
#>
#> - Period: 200
#> Optimisation status: 3
#> Iterations: 20
#> Objective: 39.18
#> Constraint check -0.0000085
#> Gradient directions: 0.0081
#>
#> - Period: 150
#> Optimisation status: 3
#> Iterations: 12
#> Objective: 39.04
#> Constraint check -0.0000323
#> Gradient directions: 0.0054
#>
#> - Period: 100
#> Optimisation status: 3
#> Iterations: 27
#> Objective: 38.80
#> Constraint check -0.0000317
#> Gradient directions: 0.0034
#>
#> - Period: 50
#> Optimisation status: 3
#> Iterations: 19
#> Objective: 38.34
#> Constraint check -0.0000093
#> Gradient directions: 0.0038
#>
#> - Period: 20
#> Optimisation status: 3
#> Iterations: 18
#> Objective: 37.57
#> Constraint check -0.0000263
#> Gradient directions: 0.0090
#>
#> - Period: 10
#> Optimisation status: 3
#> Iterations: 43
#> Objective: 36.83
#> Constraint check -0.0000798
#> Gradient directions: 0.0032
#>
#> - Period: 5
#> Optimisation status: 3
#> Iterations: 18
#> Objective: 35.91
#> Constraint check -0.0000109
#> Gradient directions: 0.0062
#>
#> **************
#> Upper bounds
#> **************
#>
#> Confidence Level: 0.95
#>
#> o Date: 2020-01-01
#>
#> - Period: 1000
#> Optimisation status: 3
#> Iterations: 71
#> Objective: -45.70
#> Constraint check -0.0000209
#> Gradient directions: 0.0384
#>
#> - Period: 500
#> Optimisation status: 4
#> Iterations: 9
#> Objective: -44.74
#> Constraint check -0.0000081
#> Gradient directions: 0.0066
#>
#> - Period: 200
#> Optimisation status: 3
#> Iterations: 29
#> Objective: -43.46
#> Constraint check -0.0000052
#> Gradient directions: 0.0055
#>
#> - Period: 150
#> Optimisation status: 3
#> Iterations: 9
#> Objective: -43.05
#> Constraint check -0.0000114
#> Gradient directions: 0.0078
#>
#> - Period: 100
#> Optimisation status: 3
#> Iterations: 38
#> Objective: -42.47
#> Constraint check -0.0000110
#> Gradient directions: 0.0057
#>
#> - Period: 50
#> Optimisation status: 3
#> Iterations: 33
#> Objective: -41.49
#> Constraint check -0.0000008
#> Gradient directions: 0.0027
#>
#> - Period: 20
#> Optimisation status: 3
#> Iterations: 52
#> Objective: -40.21
#> Constraint check -0.0002247
#> Gradient directions: 0.0104
#>
#> - Period: 10
#> Optimisation status: 3
#> Iterations: 19
#> Objective: -39.24
#> Constraint check -0.0000272
#> Gradient directions: 0.0019
#>
#> - Period: 5
#> Optimisation status: 3
#> Iterations: 17
#> Objective: -38.22
#> Constraint check -0.0000060
#> Gradient directions: 0.0015
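A possible follow-up, assuming the objects computed above are data
frames (the default output):

## Hedged follow-up: 'p2' (delta), 'p3' (boot) and 'p4' (profile
## likelihood) are in 'long' format and can be compared directly.
head(p2)
p4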