Continuity in probability

In probability theory, a stochastic process is said to be continuous in probability or stochastically continuous if its values converge in probability whenever the corresponding values in the index set converge.[1][2]

Definition

Let $X = (X_t)_{t \in T}$ be a stochastic process in $\mathbb{R}^d$. The process $X$ is continuous in probability when $X_s$ converges in probability to $X_t$ whenever $s$ converges to $t$.[2]
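
Equivalently, writing out convergence in probability explicitly (a restatement of the definition above rather than a formula taken from the reference): for every $t \in T$ and every $\varepsilon > 0$,

\[
  \lim_{s \to t} \mathbb{P}\bigl(\lVert X_s - X_t \rVert > \varepsilon\bigr) = 0 .
\]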

Examples and Applications

Feller processes are continuous in probability at $t = 0$. Continuity in probability is sometimes used as one of the defining properties of a Lévy process.[1] Any process that is continuous in probability and has independent increments has a version that is càdlàg.[2] As a result, some authors define a Lévy process directly as a càdlàg process with independent increments.[3]
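
As an illustration (a standard example, not drawn from the cited references), consider a Poisson process $(N_t)_{t \geq 0}$ with rate $\lambda > 0$. Almost every sample path has jumps, yet for $s > t$ the increment $N_s - N_t$ is Poisson distributed with mean $\lambda(s - t)$, so for any $0 < \varepsilon < 1$ (larger $\varepsilon$ only makes the probability smaller)

\[
  \mathbb{P}\bigl(|N_s - N_t| > \varepsilon\bigr)
  = \mathbb{P}\bigl(N_s - N_t \geq 1\bigr)
  = 1 - e^{-\lambda (s - t)} \longrightarrow 0 \quad \text{as } s \to t .
\]

Hence the Poisson process is continuous in probability even though it is not continuous in the sample-path sense.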

References

  1. Applebaum, D. "Lectures on Lévy Processes and Stochastic Calculus, Braunschweig; Lecture 2: Lévy Processes" (PDF). University of Sheffield. pp. 37–53.
  2. Kallenberg, Olav (2002). Foundations of Modern Probability (2nd ed.). New York: Springer. p. 286.
  3. Kallenberg, Olav (2002). Foundations of Modern Probability (2nd ed.). New York: Springer. p. 290.