algorithm for detecting a limit of time-series numbers

  • geekguy
    New Member
    • Apr 2019
    • 1

    algorithm for detecting a limit of time-series numbers

    Is there a well-known, proven algorithm for finding the limit of a set of points that are time-based metrics? I'm looking for an existing implementation so that I don't reinvent something that already exists.

    This is what I have in mind:

    Run over the numeric values of the series:
    1. Calculate L, which is the top limit, and E, where 2*E is the width of the convergence stripe around L.
    - E is configurable
    - Most points, a percentage P (excluding anomalies/noise), are below the top line L+E.
    - P is configurable, typically 90%, 95%, or 99%.
    - A linear regression line fitted over the last X points has a small slope within +/-S.
    - X and S are configurable

    The algorithm keeps a stored state holding the previous maximum limit, max-L.
    - If the L calculated in step 1 is lower than max-L, it is considered a local maximum and discarded.
    - If L is greater than max-L, it is considered a new maximum limit and stored as the new max-L (a rough sketch of the whole idea follows below).
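
    To make the idea concrete, here is a minimal Python sketch (assuming NumPy; the function name detect_limit, its parameters, and the default values are illustrative placeholders, not an existing library):

    import numpy as np

    def detect_limit(values, p=0.95, window=50, max_slope=0.01, prev_max_l=None):
        """Sketch of the limit-detection idea described above.

        values     : 1-D sequence of time-series samples
        p          : configurable percentage P, as a fraction (0.95 = 95%)
        window     : number of trailing points X used for the slope check
        max_slope  : allowed absolute slope S of the trailing regression line
        prev_max_l : previously stored max-L, or None on the first run
        """
        values = np.asarray(values, dtype=float)

        # L: the level below which P% of the points fall (a high quantile).
        l = np.quantile(values, p)

        # Fit a degree-1 polynomial (linear regression) to the last `window`
        # points; polyfit returns the slope as the first coefficient.
        tail = values[-window:]
        slope = np.polyfit(np.arange(len(tail)), tail, 1)[0]
        converged = abs(slope) <= max_slope

        # Keep only the largest L seen so far (max-L); a smaller L is a
        # local maximum and is discarded.
        if prev_max_l is not None and l <= prev_max_l:
            return prev_max_l, converged
        return l, converged

    Feeding consecutive windows of the series through this and carrying max-L between calls would give the stored-state behaviour described above.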

  • swelteringwellies
    New Member
    • Feb 2026
    • 1

    #2
    Yes, what you're describing is not a new problem. It's basically a combination of well-known techniques:
    1. Percentile / quantile estimation
      Your “L where P% of points are below L+E” is essentially a high percentile (e.g. 95th, 99th percentile).
      Instead of inventing L, you can compute:

    L = quantile(data, P)

    This is standard and implemented everywhere (NumPy, SciPy, pandas, Apache Commons Math, etc.).
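
    For example, with NumPy (the sample data below is made up just to show the call; 0.95 stands for the configurable P):

    import numpy as np

    # Placeholder data standing in for the real time-series values.
    data = np.random.default_rng(0).normal(loc=50.0, scale=5.0, size=1000)

    # L is the level below which roughly 95% of the points fall.
    L = np.quantile(data, 0.95)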
