{{subpages}}

'''Almost sure convergence''' is one of the four main modes of [[stochastic convergence]]. It may be viewed as a notion of convergence for random variables that is similar to, but not the same as, the notion of [[pointwise convergence]] for real functions.
<!--==Examples==-->
<!--===Basic example===-->
<!--===Intermediate example===-->
==Definition==
In this section, a formal definition of almost sure convergence is given for complex vector-valued random variables; a more general definition can also be given for random variables taking values in more abstract [[topological space|topological spaces]]. To this end, let <math>(\Omega,\mathcal{F},P)</math> be a [[measure space|probability space]] (in particular, <math>(\Omega,\mathcal{F})</math> is a [[measurable space]]). A (<math>\mathbb{C}^n</math>-valued) '''random variable''' is defined to be any [[measurable function]] <math>X:(\Omega,\mathcal{F})\rightarrow (\mathbb{C}^n,\mathcal{B}(\mathbb{C}^n))</math>, where <math>\mathcal{B}(\mathbb{C}^n)</math> is the [[sigma algebra]] of [[Borel set|Borel sets]] of <math>\mathbb{C}^n</math>. A formal definition of almost sure convergence can be stated as follows:

A sequence <math>X_1,X_2,\ldots,X_k,\ldots</math> of random variables is said to '''converge almost surely''' to a random variable <math>Y</math> if <math>\mathop{\lim}_{k \rightarrow \infty}X_k(\omega)=Y(\omega)</math> for all <math>\omega \in \Lambda</math>, where <math>\Lambda \subset \Omega</math> is some measurable set satisfying <math>P(\Lambda)=1</math>. An equivalent definition is that the sequence <math>X_1,X_2,\ldots,X_k,\ldots</math> converges almost surely to <math>Y</math> if <math>\mathop{\lim}_{k \rightarrow \infty}X_k(\omega)=Y(\omega)</math> for all <math>\omega \in \Omega \setminus \Lambda'</math>, where <math>\Lambda'</math> is some measurable set with <math>P(\Lambda')=0</math>. This convergence is often expressed as:

<math>\mathop{\lim}_{k \rightarrow \infty} X_k = Y \,\,P{\rm -a.s.},</math>

or

<math>\mathop{\lim}_{k \rightarrow \infty} X_k = Y \,\,{\rm a.s.}</math>
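As a simple illustration of the definition (with real-valued rather than complex vector-valued random variables), take <math>\Omega=[0,1]</math>, let <math>\mathcal{F}</math> be the Borel sets of <math>[0,1]</math>, let <math>P</math> be the Lebesgue measure, and define <math>X_k(\omega)=\omega^k</math>. For every <math>\omega \in [0,1)</math> one has <math>\mathop{\lim}_{k \rightarrow \infty}X_k(\omega)=0</math>, and the only point at which the sequence fails to converge to <math>0</math> is <math>\omega=1</math>, a set of probability zero. Hence <math>X_k</math> converges almost surely to <math>Y=0</math>, even though the convergence is not pointwise on all of <math>\Omega</math>.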


==Important cases of almost sure convergence==
If we flip a fair coin <math>n</math> times and record the percentage of times it comes up heads, the result will almost surely approach 50% as <math>n \rightarrow \infty</math>.

This is an example of the [[strong law of large numbers]].
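A minimal numerical sketch of this behaviour, written in Python and assuming the NumPy library is available, shows the running proportion of heads settling near 0.5:

 import numpy as np
 
 rng = np.random.default_rng(seed=0)   # fixed seed so the run is reproducible
 n = 100000                            # number of simulated fair-coin flips
 flips = rng.integers(0, 2, size=n)    # 1 = heads, 0 = tails
 
 # Running proportion of heads after 1, 2, ..., n flips.
 running_proportion = np.cumsum(flips) / np.arange(1, n + 1)
 
 # The proportion settles near 0.5 as the number of flips grows.
 print(running_proportion[[99, 9999, n - 1]])

Each run produces a different sample path, but by the strong law of large numbers the running proportion converges to 0.5 with probability one.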
