Probability distribution

From Citizendium

Revision as of 15:24, 25 June 2007

A probability distribution is a mathematical way of quantifying uncertainty: it specifies how probability is assigned to the possible outcomes of an experiment.


There are two main classes of probability distributions: discrete and continuous. Discrete distributions describe variables that take on discrete values only (typically the non-negative integers), while continuous distributions describe variables that can take on arbitrary values in a continuum (typically the real numbers).

In more advanced studies, one also comes across hybrid distributions.


Introduction to discrete distributions

Faced with a list of mutually exclusive propositions or possible outcomes, people intuitively put "weights" on the different alternatives.

For instance, consider the following two propositions:

  • During next week there will be rain in London.
  • During next week there will be no rain in London.

Based on available information about the record of past weather in England, people tend intuitively to put more "weight" on the first possibility than the second.

If all the "weights" are real numbers ranging from 0 to 1 and their sum is exactly 1, we have a discrete probability distribution, and the "weights" are called probabilities.


As a simple illustration as to how the individual probabilities may be obtained in practice, consider the expected results for a coin toss experiment. While we don't know the results for any individual toss of the coin, we can expect the results to average out to be heads half the time and tails half the time (assuming a fair coin).
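This averaging-out can be checked empirically; the following is a minimal simulation sketch in Python (the toss count and the fixed seed are arbitrary choices for the example):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

tosses = 100_000
heads = sum(random.random() < 0.5 for _ in range(tosses))

# With a fair coin, the empirical frequency of heads should be close to 0.5
frequency = heads / tosses
```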


Formal definition

Given a countable set S = {s0, ..., sn, ...} of mutually exclusive propositions (or possible outcomes of an experiment), let A = [0, 1], a proper subset of the real numbers R. A discrete probability distribution is then a subset T = {(s0, t0), ..., (sn, tn), ...} of the Cartesian product S × A such that the weights ti sum to exactly 1.
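In code, such a distribution can be represented as a mapping from propositions to weights; a small sketch, where the propositions and the weights 0.8 and 0.2 are made up for the example:

```python
# A discrete probability distribution represented as a mapping
# from mutually exclusive propositions to their weights
distribution = {
    "rain in London next week": 0.8,
    "no rain in London next week": 0.2,
}

def is_probability_distribution(weights):
    # Every weight must lie in [0, 1] and the weights must sum to 1
    # (up to floating-point tolerance)
    values = list(weights.values())
    in_range = all(0.0 <= t <= 1.0 for t in values)
    return in_range and abs(sum(values) - 1.0) < 1e-9

ok = is_probability_distribution(distribution)
```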


Important examples

Bernoulli distribution - Each experiment is either a 1 ("success") with probability p or a 0 ("failure") with probability 1-p. An example would be tossing a coin. If the coin is fair, your probability for "success" will be exactly 50%.

An experiment where the outcome follows the Bernoulli distribution is called a Bernoulli trial.

Binomial distribution - Each experiment consists of a series of identical Bernoulli trials, e.g. tossing a coin n times and counting the number of successes.

Uniform distribution - Each experiment has a finite number of possible outcomes, each with the same probability. Throwing a fair die, for instance, has six possible outcomes, each with the same probability. The Bernoulli distribution with p=0.5 is another example.

Poisson distribution - Suppose events occur at random and the expected remaining waiting time for the next event is independent of how long we have already waited. Then the number of events per unit time is a Poisson-distributed variable.

Geometric distribution - The number of identical Bernoulli trials needed to obtain the first success, e.g. the number of coin tosses up to and including the first head.

Negative Binomial distribution - The number of identical Bernoulli trials needed to obtain a fixed number of successes; the geometric distribution is the special case where that number is 1.
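The link between Bernoulli trials and the binomial distribution described above can be sketched by simulation; n = 10 trials, p = 0.5, the sample count, and the fixed seed are arbitrary choices:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

def bernoulli_trial(p):
    # One Bernoulli experiment: 1 ("success") with probability p, else 0 ("failure")
    return 1 if random.random() < p else 0

def binomial_sample(n, p):
    # A binomial variable counts the successes in n identical Bernoulli trials
    return sum(bernoulli_trial(p) for _ in range(n))

# The average of many binomial samples should be close to n * p
samples = [binomial_sample(10, 0.5) for _ in range(10_000)]
mean = sum(samples) / len(samples)
```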



Introduction to continuous distributions

The set of possible outcomes does not have to be finite and discrete as in the examples above. For instance, consider the following infinite set of propositions:

The height of the next individual we'll meet is 0.0 meters.
...
The height of the next individual we'll meet is 1.0 meters.
...

Now the height of an individual is a real number that in principle may take on any value.

However, based on information about the heights of people, it's very natural to put more "weight" on the possibility that the height lies between 1.5 and 2.0 meters than on the possibility that it lies between, say, 2.5 and 3.0 meters.


Formal definition

A continuous probability distribution is a function that, when integrated over a set representing an event, gives the probability of that event. In other words,

<math>p(A) = \int_A f\,d\mu</math>.

In practice, the "event" A corresponds to a range of values, say <math>a \le x \le b</math>, so the above integral takes the more familiar form

<math>p(a \le x \le b) = \int_a^b f\,dx</math>.
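As a numerical illustration, the probability of such a range event can be approximated by integrating the density directly; a sketch using the exponential density with rate 1 (the interval [0.5, 2.0] and the step count are arbitrary choices):

```python
import math

def density(x, lam=1.0):
    # Exponential probability density f(x) = lam * exp(-lam * x) for x >= 0
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def prob_interval(a, b, f=density, steps=100_000):
    # Approximate p(a <= x <= b), the integral of f over [a, b],
    # with the trapezoid rule
    h = (b - a) / steps
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, steps))
    return total * h

# For this density, p(a <= x <= b) = exp(-a) - exp(-b) exactly,
# so the numerical answer can be checked against the closed form
approx = prob_interval(0.5, 2.0)
exact = math.exp(-0.5) - math.exp(-2.0)
```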


Important examples

Gaussian distribution - Also known as the normal distribution.

Uniform continuous distribution - The probability density is constant on a bounded interval and zero outside it.

Exponential distribution - Given a sequence of events where the waiting time until the next event is independent of how long we have already waited, the time between two consecutive events follows the exponential distribution.

Gamma distribution - A generalization of the exponential distribution; for an integer shape parameter it describes the total waiting time for a given number of events in a Poisson process.

Rayleigh distribution - The distribution of the magnitude of a two-dimensional vector whose components are independent, zero-mean Gaussian variables with equal variance.

Cauchy distribution - A heavy-tailed distribution for which the mean is undefined; the ratio of two independent zero-mean Gaussian variables is Cauchy-distributed.

Laplacian distribution - Also known as the double exponential distribution; its density decays exponentially with distance from the median.
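As an illustration of the exponential distribution listed above, samples can be drawn by inverse-transform sampling, a standard technique; in this sketch the rate λ = 2.0, the sample count, and the fixed seed are arbitrary choices:

```python
import math
import random

random.seed(2)  # fixed seed so the run is reproducible

def exponential_sample(lam):
    # Inverse-transform sampling: if U is uniform on (0, 1],
    # then -ln(U) / lam follows the exponential distribution with rate lam
    u = 1.0 - random.random()  # random() is in [0, 1), so u is in (0, 1]
    return -math.log(u) / lam

lam = 2.0
samples = [exponential_sample(lam) for _ in range(100_000)]

# The theoretical mean of the exponential distribution is 1 / lam
mean = sum(samples) / len(samples)
```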


References

See also

External links