Probability

{{subpages}}
{{TOC|right}}
A '''probability''' is a number representing the likelihood of a random event or an uncertain proposition occurring, ranging from 1 representing certainty down to 0 for impossibility.


Probability is the topic of [[probability theory]], a branch of [[mathematics]] concerned with analysis of random phenomena.  Like algebra, geometry and other parts of mathematics,  probability theory has its origins in the natural world. Humans routinely deal with incomplete and/or uncertain information in daily life: in decisions such as crossing the road ("Will this approaching car respect the red light?"), eating food ("Am I certain this food is not contaminated?"), and so on.  Probability theory is a mathematical tool intended to formalize this ubiquitous mental process. The probability concept is a part of this theory, and is intended to formalize uncertainty.


There are three basic ways to think about the probability concept:


* Bayesian probability.
* Frequentist  probability.
* Axiomatic probability.


== Bayesian probability ==
{{main|Bayes Theorem|Statistical significance}}
In the Bayesian interpretation, probability is taken as a measure of how reasonable a belief is in light of prior observations and theoretical considerations; [[Bayes Theorem]] describes how such a belief is updated as new evidence arrives.
 
=== Example of the Bayesian viewpoint ===
 
A Bayesian may assign a probability of 1/2 to the proposition that there was life on [[Mars]] a billion years ago. A frequentist would not do that, since one cannot say that the event that there was life on Mars a billion years ago happens in half of all cases; there are no such cases.
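
The updating process behind this viewpoint can be sketched numerically. The following minimal example (not part of the original article; the prior of 0.5 and the likelihood values 0.8 and 0.4 are purely illustrative assumptions) applies Bayes' theorem repeatedly as hypothetical evidence arrives:

<syntaxhighlight lang="python">
# Minimal sketch of Bayesian updating via Bayes' theorem.
# The prior (0.5) and the likelihoods (0.8, 0.4) are hypothetical numbers,
# chosen only to show how a degree of belief changes as evidence accumulates.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(H | E) from P(H), P(E | H) and P(E | not H)."""
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1.0 - prior)
    return p_evidence_if_true * prior / p_evidence

belief = 0.5  # start from an even prior on the hypothesis H
for _ in range(3):
    # each observation is assumed twice as likely if H is true than if it is false
    belief = posterior(belief, p_evidence_if_true=0.8, p_evidence_if_false=0.4)
    print(round(belief, 3))  # prints 0.667, 0.8, 0.889 -- the belief strengthens
</syntaxhighlight>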
 
== Frequentist probability ==
In this approach one views probabilities as the relative frequencies of the various outcomes in the limit where the experiment is repeated an unbounded number of times.




=== Example of the frequentist viewpoint ===
The probability of "heads" when flipping a fair coin is 50%, because in an arbitrarily long run of flips the relative frequency of heads approaches one half.
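
This limiting behaviour can be illustrated with a simple simulation. The sketch below (not part of the original article; the flip counts and the seed are arbitrary choices) estimates the probability of heads by its observed relative frequency:

<syntaxhighlight lang="python">
# Monte Carlo illustration of the frequentist view: the relative frequency of
# "heads" in repeated flips of a fair coin settles near 0.5 as the number of
# flips grows.
import random

random.seed(0)  # arbitrary seed, for reproducibility only
for flips in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(flips))
    print(flips, heads / flips)  # the relative frequency approaches 0.5
</syntaxhighlight>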




== The axiomatic approach ==
 
Neither the Bayesian nor the frequentist approach really gives us a satisfactory [[formal structure]] for the development of a rigorous theory.
 
The axiomatic approach takes a different tack. Instead of focusing on the question "What is probability?" we step back and ask "How does probability ''work''?"
 
We then focus on abstract mathematical concepts such as [[set|sets]], [[measure (mathematics)|measure]], [[sigma algebra|sigma algebras]] and a set of rules we expect probability to follow,  known as [[Kolmogorov's axioms]].
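
Stated briefly (see [[Kolmogorov's axioms]] for a full treatment), a probability measure <math>P</math> on a sigma algebra of events must satisfy

:<math>P(A) \ge 0, \qquad P(\Omega) = 1, \qquad P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i),</math>

for every event <math>A</math>, the whole sample space <math>\Omega</math>, and every sequence <math>A_1, A_2, \ldots</math> of pairwise disjoint events.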
 
But the ultimate justification for this approach rests on experience, too. Fortunately, the results derived by applying these axioms accord very nicely with experience. We can also derive Bayes' theorem as a consequence, and we can show that in a large number of trials the frequencies will give us a good estimate of the probabilities, or at least they will do so ''on average''.
 
=== Example of the axiomatic approach ===
Given a standard deck of cards, we want to draw a card at random.
 
This experiment can be modeled by a set of 52 possible outcomes. Any set with 52 elements has exactly <math>2^{52}</math> subsets. To each of those subsets we may assign a certain number (a "probability"), so that certain axioms are satisfied. We choose to assign probability 0.25 to the subset consisting of all 13 hearts. We also assign 0.25 to the subset consisting of all 13 spades.


According to the axioms, we must then assign probability 0.5 to the subset consisting of every spade and heart in the deck.


In non-axiomatic informal terms,  we would describe this result thus:


If the probability that a card (drawn at random from a standard deck of cards) is a heart is 0.25, and the probability that it is a spade is also 0.25, and if I know that being a heart and being a spade are mutually exclusive possibilities (i.e., a card cannot be both), then the probability that it is a heart ''or'' a spade is 0.5 = 0.25 + 0.25.
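
The same bookkeeping can be written out explicitly. The sketch below (not part of the original article) models the deck as a set of 52 outcomes and, under the additional assumption that every card carries equal weight 1/52, confirms that the probabilities of the two disjoint events add as the axioms require:

<syntaxhighlight lang="python">
# Sketch of the deck example: the sample space has 52 outcomes, and each event
# (subset) is assigned probability |event| / 52.  Equal weight per card is an
# assumption; the article itself only fixes the hearts and spades subsets at 0.25.
suits = ("hearts", "spades", "diamonds", "clubs")
ranks = range(1, 14)
deck = {(suit, rank) for suit in suits for rank in ranks}  # 52 outcomes

def prob(event):
    """Probability of an event (a subset of the deck) under equal weights."""
    return len(event) / len(deck)

hearts = {card for card in deck if card[0] == "hearts"}
spades = {card for card in deck if card[0] == "spades"}

print(prob(hearts))           # 0.25
print(prob(spades))           # 0.25
print(prob(hearts | spades))  # 0.5 = 0.25 + 0.25, since the events are disjoint
</syntaxhighlight>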


== More technical information ==
* [[Bayes Theorem]]
* [[principle of maximum entropy]]
* [[Probability distributions]]
* [[Kolmogorov's axioms]]
* [[probability theory]]
* [[probability space]]


== See also ==
* [[Statistics]]
* [[Statistical significance]]


== External links ==
* Intros
** http://www.mathgoodies.com/lessons/vol6/intro_probability.html
** http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/book.html


* More advanced
** http://bayes.wustl.edu/
** http://plato.stanford.edu/entries/probability-interpret/

[[Category:Suggestion Bot Tag]]
