Stochastic Processes Glossary
A
- Autoregressive moving average model
- In statistics, autoregressive moving average (ARMA) models, sometimes called Box-Jenkins models after George Box and Gwilym Jenkins, are typically applied to time series data.
B
- Bernoulli process
- In probability and statistics, a Bernoulli process is a discrete-time stochastic process consisting of a finite or infinite sequence of independent random variables X1, X2, X3,..., such that for each i, the value of Xi is either 0 or 1 and for all values of i, the probability that Xi = 1 is the same number p.
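A Bernoulli process is simple enough to simulate in a few lines; this sketch (the values of p, n and the seed are arbitrary choices for the illustration) checks that the long-run frequency of ones approaches p:

```python
import random

def bernoulli_process(p, n, seed=None):
    """n steps of a Bernoulli process: i.i.d. 0/1 draws with P(X_i = 1) = p."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

xs = bernoulli_process(p=0.3, n=100_000, seed=1)
print(sum(xs) / len(xs))  # long-run frequency of ones is close to p = 0.3
```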
- Bertrand's ballot theorem
- In combinatorics, Bertrand's ballot theorem is the solution to the question: "In an election where one candidate receives p votes and the other q votes with p > q, what is the probability that the first candidate will be strictly ahead of the second candidate throughout the count?" The answer is (p-q)/(p+q).
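For small vote counts the formula can be verified by brute force; this sketch enumerates every distinct counting order and returns the exact fraction that keep the first candidate strictly ahead:

```python
from fractions import Fraction
from itertools import permutations

def ballot_probability(p, q):
    """Exact probability that the first candidate stays strictly ahead
    throughout the count, by enumerating every distinct counting order
    (feasible only for small p and q)."""
    orders = set(permutations("A" * p + "B" * q))
    ahead = 0
    for order in orders:
        lead = 0
        for ballot in order:
            lead += 1 if ballot == "A" else -1
            if lead <= 0:
                break
        else:
            ahead += 1  # candidate A led strictly at every step
    return Fraction(ahead, len(orders))

print(ballot_probability(5, 3))  # theorem: (5-3)/(5+3) = 1/4
```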
- Biased random walk (biochemistry)
- In cell biology, a biased random walk enables bacteria to search for food and flee from harm.
- Birth-death process
- A birth-death process is an example of a Markov process (a stochastic process) in which transitions are limited to the nearest neighbors only.
- Branching process
- In probability theory, a branching process is a Markov process that models a population in which each individual in generation n produces some random number of individuals in generation n + 1, according to a fixed probability distribution that does not vary from individual to individual.
- Brownian motion
- The term Brownian motion (in honor of the botanist Robert Brown) refers to either the physical phenomenon that minute particles immersed in a fluid move about randomly; or the mathematical models used to describe those random movements.
- Brownian tree
- A Brownian tree, whose name is derived from Robert Brown via Brownian motion, is a form of computer art that was briefly popular in the 1990s, when home computers started to have sufficient power to simulate Brownian motion.
C
- Chapman-Kolmogorov equation
- In mathematics, specifically in probability theory, and yet more specifically in the theory of stochastic processes, the Chapman-Kolmogorov equation (also known as the master equation in physics) is an identity relating the joint probability distributions of different sets of coordinates on a stochastic process.
- Compound Poisson process
- In probability theory, a compound Poisson process is a continuous-time stochastic process with jumps, in which the jumps arrive according to a Poisson process and the jump sizes are independent, identically distributed random variables.
- Continuous-time Markov chain
- In probability theory, a continuous-time Markov chain is a stochastic process { X(t) : t ≥ 0 } that enjoys the Markov property and takes values from amongst the elements of a discrete set called the state space.
E
- Examples of Markov chains
- A game of Monopoly, snakes and ladders or any other game whose moves are determined entirely by dice is a Markov chain.
F
- Filtration (abstract algebra)
- In mathematics, a filtration is an indexed set Si of subobjects of a given algebraic structure S, with an index set I that is a totally ordered set, subject only to the condition that if i ≤ j in I then Si is contained in Sj. In the theory of stochastic processes, a filtration is an increasing family of σ-algebras, representing the information available at each time.
- Fokker-Planck equation
- The Fokker-Planck equation (also known as the Kolmogorov forward equation) describes the time evolution of the probability density function of position and velocity of a particle.
G
- Galton-Watson process
- The Galton-Watson process is a stochastic process arising from Francis Galton's statistical investigation of the extinction of surnames.
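A minimal simulation sketch: the offspring distribution below, P(0) = 1/4, P(1) = 1/4, P(2) = 1/2, is an illustrative choice and not part of the definition. For it, the classical fixed-point equation q = E[q^Z] reads q = 1/4 + q/4 + q²/2, whose smallest root is q = 1/2, so about half of all lines should die out:

```python
import random

def sample_offspring(rng):
    """Offspring count with P(0)=1/4, P(1)=1/4, P(2)=1/2 (illustrative
    distribution, not from the text); mean 1.25, extinction probability 1/2."""
    u = rng.random()
    return 0 if u < 0.25 else (1 if u < 0.5 else 2)

def goes_extinct(rng, max_gen=50, cap=200):
    """One Galton-Watson line from a single ancestor. A population that
    reaches `cap` is treated as surviving, since a supercritical line
    that large almost never dies out afterwards."""
    pop = 1
    for _ in range(max_gen):
        if pop == 0:
            return True
        if pop >= cap:
            return False
        pop = sum(sample_offspring(rng) for _ in range(pop))
    return pop == 0

rng = random.Random(7)
runs = 2000
frac = sum(goes_extinct(rng) for _ in range(runs)) / runs
print(frac)  # near 1/2, the smallest root of q = 1/4 + q/4 + q^2/2
```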
- Gauss-Markov process
- As one would expect, Gauss-Markov stochastic processes (named after Carl Friedrich Gauss and Andrey Markov) are stochastic processes that satisfy the requirements for both Gaussian processes and Markov processes.
- Gaussian process
- A Gaussian process is a stochastic process {Xt}t ∈T such that every finite linear combination of the Xt (or, more generally, any linear functional of the sample function Xt) is normally distributed.
- Geometric Brownian motion
- A geometric Brownian motion (GBM) (occasionally, exponential Brownian motion) is a continuous-time stochastic process in which the logarithm of the randomly varying quantity follows a Brownian motion, or, perhaps more precisely, a Wiener process.
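GBM has the closed form S_t = S_0 exp((μ − σ²/2)t + σW_t), so it can be sampled exactly; this sketch (the drift, volatility and sample size are arbitrary illustrative values) checks the known mean E[S_t] = S_0 e^{μt} by Monte Carlo:

```python
import math, random

def gbm_terminal(s0, mu, sigma, t, rng):
    """Exact sample of geometric Brownian motion at time t:
    S_t = s0 * exp((mu - sigma^2/2) * t + sigma * W_t), with W_t ~ N(0, t)."""
    w_t = rng.gauss(0.0, math.sqrt(t))
    return s0 * math.exp((mu - 0.5 * sigma**2) * t + sigma * w_t)

rng = random.Random(0)
samples = [gbm_terminal(100.0, 0.05, 0.2, 1.0, rng) for _ in range(200_000)]
mean = sum(samples) / len(samples)
print(mean)  # close to E[S_1] = 100 * exp(0.05), about 105.13
```

Because the logarithm, not the process itself, is Brownian, every sample stays strictly positive, which is why GBM is popular for modelling prices.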
- Girsanov's theorem
- In probability theory, Girsanov's theorem tells how stochastic processes change under changes in measure.
I
- Ito calculus
- Ito calculus, named after Kiyoshi Itô, treats mathematical operations on stochastic processes. Its most important concept is the Itô stochastic integral.
- Ito's lemma
- In mathematics, Ito's lemma is used in stochastic calculus to find the differential of a function of a particular type of stochastic process. It is therefore to stochastic calculus what the chain rule is to ordinary calculus. The lemma is widely employed in mathematical finance.
K
- Karhunen-Loève theorem
- In the theory of stochastic processes, the Karhunen-Loève theorem represents a stochastic process as an infinite linear combination of orthogonal functions, analogous to a Fourier series representation of a function on a bounded interval.
L
- Lag operator
- In time series analysis, the lag operator or backshift operator operates on an element of a time series to produce the previous element.
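In code the backshift operator is just an index shift; this sketch uses None as a placeholder where no predecessor exists (a convention chosen for the example):

```python
def lag(series, k=1):
    """Backshift operator B^k: (B^k x)_t = x_{t-k}.
    The first k slots have no predecessor, so they hold None here."""
    if k == 0:
        return list(series)
    return [None] * k + list(series[:-k])

x = [3, 1, 4, 1, 5]
print(lag(x))     # [None, 3, 1, 4, 1]
print(lag(x, 2))  # [None, None, 3, 1, 4]
```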
- Law of the iterated logarithm
- In probability theory, the law of the iterated logarithm is the name given to several theorems which describe the magnitude of the fluctuations of a random walk.
- Loop-erased random walk
- In mathematics, loop-erased random walk is a model for a random simple path with important applications in combinatorics and, in physics, quantum field theory. It is intimately connected to the uniform spanning tree, a model for a random tree.
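The chronological loop-erasure step can be sketched as a pure function on lattice paths: whenever the walk revisits a site, the loop created since the first visit is deleted. The walk length and seed below are arbitrary:

```python
import random

def erase_loops(path):
    """Chronologically erase loops: on a revisit, cut the path back to the
    first occurrence of the revisited site."""
    out = []
    index = {}  # site -> position of its first occurrence in out
    for site in path:
        if site in index:
            out = out[: index[site] + 1]
            index = {s: i for i, s in enumerate(out)}
        else:
            index[site] = len(out)
            out.append(site)
    return out

# deterministic check: the walk below contains the loop (1,0)->(1,1)->(1,0)
print(erase_loops([(0, 0), (1, 0), (1, 1), (1, 0), (2, 0)]))
# -> [(0, 0), (1, 0), (2, 0)]

# loop-erased simple random walk on Z^2
rng = random.Random(3)
pos, walk = (0, 0), [(0, 0)]
for _ in range(200):
    dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    pos = (pos[0] + dx, pos[1] + dy)
    walk.append(pos)
lerw = erase_loops(walk)
assert len(lerw) == len(set(lerw))  # the result is a simple (self-avoiding) path
```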
- Lévy flight
- A Lévy flight, named after the French mathematician Paul Pierre Lévy, is a type of random walk in which the increments are distributed according to a "heavy tail" distribution.
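A minimal sketch: step lengths drawn from a heavy-tailed Pareto(α) law via inverse-CDF sampling, with isotropic directions; α = 1.5 is an arbitrary illustrative choice:

```python
import math, random

def levy_flight(n, alpha, rng):
    """2-D random walk whose step lengths follow a Pareto law with tail
    exponent alpha: L = U^(-1/alpha) >= 1 by inverse-CDF sampling."""
    x = y = 0.0
    path = [(x, y)]
    for _ in range(n):
        length = rng.random() ** (-1.0 / alpha)  # Pareto(alpha), minimum 1
        theta = rng.uniform(0.0, 2.0 * math.pi)  # isotropic direction
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        path.append((x, y))
    return path

path = levy_flight(1000, alpha=1.5, rng=random.Random(42))
steps = [math.dist(path[i], path[i + 1]) for i in range(len(path) - 1)]
print(max(steps))  # occasional very long jumps, the signature of a Lévy flight
```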
- Lévy process
- In probability theory, a Lévy process, named after the French mathematician Paul Lévy, is any continuous-time stochastic process that has "stationary independent increments". The most well-known examples are the Wiener process and the Poisson process.
M
- Malliavin calculus
- The Malliavin calculus, named after Paul Malliavin, is a theory of variational stochastic calculus; in other words, it provides the mechanics to compute derivatives of random variables.
- Markov chain
- In mathematics, a (discrete-time) Markov chain, named after Andrei Markov, is a discrete-time stochastic process with the Markov property. In such a process, the past is irrelevant for predicting the future given knowledge of the present.
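A two-state chain illustrates the definition; the transition probabilities below are invented for the sketch. Long-run visit frequencies approach the chain's stationary distribution, here (5/6, 1/6):

```python
import random

# a two-state chain (illustrative numbers): state 0 = "dry", state 1 = "wet"
P = [[0.9, 0.1],
     [0.5, 0.5]]

def visit_frequencies(P, steps, rng):
    """Run the chain; each move depends only on the current state
    (the Markov property), via the current state's row of P."""
    state, visits = 0, [0, 0]
    for _ in range(steps):
        state = rng.choices([0, 1], weights=P[state])[0]
        visits[state] += 1
    return [v / steps for v in visits]

freq = visit_frequencies(P, 200_000, random.Random(5))
print(freq)  # close to the stationary distribution (5/6, 1/6)
```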
- Markov chain geostatistics
- Markov chain geostatistics applies Markov chains in geostatistics for conditional simulation on sparse observed data; see Li et al. (Soil Sci. Soc. Am. J., 2004), Zhang and Li (GIScience and Remote Sensing, 2005) and Elfeki and Dekking (Mathematical Geology, 2001).
- Markov process
- In probability theory, a Markov process is a stochastic process with the Markov property: conditional on the present state, the future evolution of the process is independent of its past. Markov chains, in discrete or continuous time, are the most familiar examples.
- Markov property
- In probability theory, a stochastic process has the Markov property if the conditional probability distribution of future states of the process, given the present state, depends only upon the current state, i.e. it is conditionally independent of the past states (the path of the process) given the present state. A process with the Markov property is usually called a Markov process, and may be described as Markovian.
- Martingale
- In probability theory, a (discrete-time) martingale is a discrete-time stochastic process (i.e., a sequence of random variables) X1, X2, X3, ... that satisfies the identity E(Xn+1 | X1,…,Xn) = Xn, i.e., the conditional expected value of the next observation, given all of the past observations, is equal to the last observation. As is frequent in probability theory, the term was adopted from the language of gambling.
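The defining identity can be checked numerically on the simplest example, a symmetric ±1 random walk (the simulation sizes and seed are arbitrary): grouping outcomes by the walk's value after 5 steps, the conditional mean of the next value matches the current one:

```python
import random
from collections import defaultdict

rng = random.Random(11)
buckets = defaultdict(list)
for _ in range(200_000):
    x5 = sum(rng.choice([-1, 1]) for _ in range(5))  # X_5 of a fair +/-1 walk
    x6 = x5 + rng.choice([-1, 1])                    # one more fair step
    buckets[x5].append(x6)

for value, nexts in sorted(buckets.items()):
    print(value, sum(nexts) / len(nexts))  # each conditional mean is close to value
```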
N
- Nonlinear autoregressive exogenous model
- In time series modeling, a nonlinear autoregressive exogenous model (NARX) is a nonlinear autoregressive model which has exogenous inputs.
O
- Ornstein-Uhlenbeck process
- In mathematics, the Ornstein-Uhlenbeck process, also known as the mean-reverting process, is a stochastic process given by the following stochastic differential equation drt = θ(μ - rt)dt + σ dWt, where θ, μ and σ are parameters.
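A minimal Euler-Maruyama sketch of the mean-reverting form dr_t = θ(μ - r_t)dt + σ dW_t (all parameter values are arbitrary illustrations); a long time average of the path settles near μ:

```python
import math, random

def ou_path(theta, mu, sigma, r0, dt, steps, rng):
    """Euler-Maruyama discretisation of dr_t = theta*(mu - r_t) dt + sigma dW_t."""
    r, path = r0, [r0]
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment over dt
        r += theta * (mu - r) * dt + sigma * dW
        path.append(r)
    return path

path = ou_path(theta=1.0, mu=2.0, sigma=0.3, r0=0.0, dt=0.01,
               steps=100_000, rng=random.Random(2))
tail = path[10_000:]          # discard the burn-in toward the mean
print(sum(tail) / len(tail))  # reverts to the long-run mean mu = 2.0
```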
P
- Poisson process
- A Poisson process, one of a variety of things named after the French mathematician Siméon-Denis Poisson (1781 - 1840), is a stochastic process which is defined in terms of the occurrences of events in some space.
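On the half-line, a Poisson process of rate λ can be built from exponential(λ) inter-arrival times, a standard construction; the rate and horizon below are arbitrary, and the count of events in [0, T] has mean λT:

```python
import random

def poisson_arrivals(rate, horizon, rng):
    """Arrival times on [0, horizon] built from exponential(rate)
    inter-arrival times, the standard construction of a Poisson process."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(9)
counts = [len(poisson_arrivals(3.0, 2.0, rng)) for _ in range(20_000)]
print(sum(counts) / len(counts))  # close to rate * horizon = 6
```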
- Population process
- In applied probability, a population process is a Markov chain in which the state of the chain is analogous to the number of individuals in a population (0, 1, 2, etc.), and changes to the state are analogous to the addition or removal of individuals from the population.
Q
- Queueing theory
- Queueing theory (sometimes spelled queuing theory, though that spelling forfeits the distinction of containing the only English word with five consecutive vowels) is the mathematical study of waiting lines (or queues).
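For the single-server M/M/1 queue, Lindley's recursion gives a compact simulation of customer waiting times; the arrival and service rates below are illustrative, chosen so the theoretical mean wait in queue is Wq = λ/(μ(μ − λ)) = 0.5:

```python
import random

def mm1_mean_wait(lam, mu, customers, rng):
    """Lindley's recursion for the M/M/1 queue:
    W_{n+1} = max(0, W_n + S_n - A_{n+1}), with S_n ~ Exp(mu) service
    times and A_n ~ Exp(lam) inter-arrival gaps."""
    w, total = 0.0, 0.0
    for _ in range(customers):
        total += w
        s = rng.expovariate(mu)   # service time of this customer
        a = rng.expovariate(lam)  # gap until the next arrival
        w = max(0.0, w + s - a)
    return total / customers

wq = mm1_mean_wait(lam=1.0, mu=2.0, customers=300_000, rng=random.Random(4))
print(wq)  # theory: Wq = lam / (mu * (mu - lam)) = 0.5
```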
R
- Random walk
- In mathematics and physics, a random walk is a formalization of the intuitive idea of taking successive steps, each in a random direction. A random walk is a simple stochastic process.
S
- Semi-Markov process
- A semi-Markov process is one that, when it enters state i, spends a random time having distribution Hi and mean μi in that state before making a transition.
- Stationary process
- In the mathematical sciences, a stationary process (or strict(ly) stationary process) is a stochastic process whose joint probability distribution does not change when shifted in time or space. As a result, parameters such as the mean and variance also do not change over time or position.
- Stochastic calculus
- Stochastic calculus is a branch of mathematics that operates on stochastic processes. The operations include integration and differentiation that involve both deterministic and random (i.e. stochastic) variables. It is used to model systems that behave randomly.
- Stochastic process
- In the mathematics of probability, a stochastic process can be thought of as a random function.
- Stopping rule
- In decision theory, a stopping rule is a mechanism for deciding whether to continue or stop a process on the basis of the present position and past events, and which will almost always lead to a decision to stop at some time, known as a stopping time.
- Stratonovich integral
- In probability theory, a branch of mathematics, the Stratonovich integral is a stochastic integral, the commonest alternative to the Ito integral.
- Strong mixing
- In mathematics, strong mixing is a concept applied in ergodic theory, i.e. the study of dynamical systems at the level of measure theory. It can be applied to stochastic processes.
- Substitution model
- A substitution model describes the process by which a sequence of characters of a fixed size from some alphabet changes into another sequence of characters.
T
- Time series
- In statistics and signal processing, a time series is a sequence of data points, measured typically at successive times, spaced apart at uniform time intervals.
W
- White noise
- White noise is a random signal (or process) with a flat power spectral density. In other words, the signal's power spectral density has equal power in any band, at any centre frequency, having a given bandwidth.
- Wiener equation
- A simple mathematical representation of Brownian motion, the Wiener equation, named after Norbert Wiener, assumes the current velocity of a fluid particle fluctuates randomly: [...]
- Wiener filter
- Unlike typical filter design for a desired frequency response, the Wiener filter approaches filtering from a statistical angle: assuming knowledge of the spectral properties of the signal and the noise, it seeks the linear time-invariant filter whose output comes as close as possible to the original signal in the mean-square sense.
- Wiener process
- In mathematics, the Wiener process, so named in honor of Norbert Wiener, is a continuous-time Gaussian stochastic process with independent increments used in modelling Brownian motion and some random phenomena observed in finance. It is one of the best-known Lévy processes.
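A sketch of sampling a Wiener path directly from its defining property, independent Gaussian increments with variance equal to the time step, then checking Var(W_1) = 1 by Monte Carlo (the discretisation and sample counts are arbitrary):

```python
import math, random

def wiener_path(t, n, rng):
    """Sample W on [0, t] from its defining property:
    independent increments W_{s+dt} - W_s ~ N(0, dt)."""
    dt = t / n
    w, path = 0.0, [0.0]
    for _ in range(n):
        w += rng.gauss(0.0, math.sqrt(dt))
        path.append(w)
    return path

rng = random.Random(8)
finals = [wiener_path(1.0, 50, rng)[-1] for _ in range(20_000)]
print(sum(x * x for x in finals) / len(finals))  # close to Var(W_1) = t = 1
```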