testing quality of forecast #1

@turgeonmaxime

Description

If we are to put a forecasting tool in the hands of non-technical users, we need some way of automatically assessing the quality of the forecasts and telling the user when they need to reach out for support. This can be tricky: we want to be accurate, but we don't want too many false positives...

Here are two tests that we could implement:

  • Anomaly detection via the Generalized ESD test. See, for example, Twitter's AnomalyDetection package.

  • Ljung-Box test for autocorrelation in the residuals.
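As a starting point for the first bullet, here is a sketch of the Generalized ESD test (Rosner, 1983) applied to forecast residuals. This is a plain textbook implementation, not Twitter's package; the function name and `max_outliers` parameter are illustrative choices.

```python
import numpy as np
from scipy import stats

def generalized_esd(x, max_outliers, alpha=0.05):
    """Generalized ESD test: return indices of detected outliers in x.

    Tests for up to max_outliers outliers by repeatedly removing the
    point farthest from the mean and comparing the test statistic R_i
    to a critical value lambda_i from the t distribution.
    """
    x = np.asarray(x, dtype=float)
    idx = np.arange(len(x))
    candidates = []
    n_found = 0
    for i in range(1, max_outliers + 1):
        n = len(x)
        mean, std = x.mean(), x.std(ddof=1)
        # Test statistic: largest absolute deviation from the mean.
        j = int(np.argmax(np.abs(x - mean)))
        r = abs(x[j] - mean) / std
        # Critical value lambda_i at level alpha.
        p = 1 - alpha / (2 * n)
        t = stats.t.ppf(p, n - 2)
        lam = (n - 1) * t / np.sqrt((n - 2 + t**2) * n)
        candidates.append(idx[j])
        if r > lam:
            n_found = i  # number of outliers is the largest i with R_i > lambda_i
        # Remove the most extreme point and repeat.
        x = np.delete(x, j)
        idx = np.delete(idx, j)
    return candidates[:n_found]

# Hypothetical residual series with one injected outlier at index 100.
rng = np.random.default_rng(0)
residuals = np.concatenate([rng.normal(0, 1, 100), [10.0]])
flagged = generalized_esd(residuals, max_outliers=5)
print(flagged)
```

A nonempty result on the forecast residuals could be one trigger for the "reach out for support" message.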
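For the second bullet, the Ljung-Box test is available in statsmodels. A sketch of how it might be used, with synthetic residuals standing in for a real model's (white noise should pass; an AR(1) series, which has leftover autocorrelation, should fail):

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(42)

# White-noise "residuals": no autocorrelation, so the test should not reject.
white = rng.normal(size=200)

# AR(1) "residuals": strongly autocorrelated, so the test should reject.
ar = np.zeros(200)
for t in range(1, 200):
    ar[t] = 0.8 * ar[t - 1] + rng.normal()

lb_white = acorr_ljungbox(white, lags=[10])
lb_ar = acorr_ljungbox(ar, lags=[10])
print(lb_white["lb_pvalue"].iloc[0], lb_ar["lb_pvalue"].iloc[0])
```

A small p-value (say, below 0.05) on the model's residuals would suggest the model missed structure in the data, which again could prompt the user to seek help. The lag choice of 10 here is arbitrary and would need tuning to the data's frequency.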

Another approach would be to fit a couple of simple models and compare their accuracy with that of the overall model; if the simple models give better forecasts than prophet, that would be a good signal to contact a statistician.
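The baseline-comparison idea could be sketched as follows. The function names, the naive/seasonal-naive choice of baselines, and MAE as the accuracy metric are all illustrative; `model_forecast` stands in for whatever the main model (e.g. prophet) produced on a holdout window.

```python
import numpy as np

def mae(actual, predicted):
    """Mean absolute error between two equal-length series."""
    return float(np.mean(np.abs(np.asarray(actual) - np.asarray(predicted))))

def beats_baselines(history, actual, model_forecast, season=7):
    """Compare a model's holdout forecast against two trivial baselines.

    history: training series; actual: held-out values; model_forecast:
    the main model's forecast for the holdout window.
    """
    history = np.asarray(history, dtype=float)
    h = len(actual)
    # Naive baseline: repeat the last observed value.
    naive = np.repeat(history[-1], h)
    # Seasonal-naive baseline: repeat the last full season.
    seasonal = np.tile(history[-season:], h // season + 1)[:h]
    model_mae = mae(actual, model_forecast)
    return {
        "model_mae": model_mae,
        "naive_mae": mae(actual, naive),
        "seasonal_naive_mae": mae(actual, seasonal),
        # Flag for review if either trivial baseline beats the model.
        "flag_for_review": model_mae >= min(mae(actual, naive),
                                            mae(actual, seasonal)),
    }

# Toy check: a perfect forecast should not be flagged.
history = np.arange(10.0)
actual = [10.0, 11.0, 12.0]
result = beats_baselines(history, actual, model_forecast=actual)
print(result)
```

The threshold here (any baseline winning outright) is the strictest possible choice; in practice one might flag only when a baseline wins by some margin, to keep false positives down.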
