Probability Theory
Historical note about Probability
Probability theory, like many other branches of mathematics, evolved out of practical considerations. It had its origin in the 16th century, when the Italian physician and mathematician Jerome Cardan (1501-1576) wrote the first book on the subject, ''Book on Games of Chance'' (Liber de Ludo Aleae), published in 1663 after his death. In 1654, a gambler, the Chevalier de Méré, approached the well-known French philosopher and mathematician Blaise Pascal (1623-1662) with certain dice problems. Pascal became interested in these problems and discussed them with the famous French mathematician Pierre de Fermat (1601-1665). Both Pascal and Fermat solved the problems independently. Besides Pascal and Fermat, outstanding contributions to probability theory were made by Christiaan Huygens (1629-1695), a Dutchman; Jacob Bernoulli (1654-1705); Abraham de Moivre (1667-1754) and Pierre-Simon Laplace (1749-1827), both Frenchmen; and the Russians P.L. Chebyshev (1821-1894), A.A. Markov (1856-1922) and A.N. Kolmogorov (1903-1987). Kolmogorov is credited with the axiomatic theory of probability: his book 'Foundations of the Theory of Probability', published in 1933, introduces probability as a set function and is considered a classic.
Development of the theory of probability
The Italian mathematician Galileo (1564-1642) was the first person to develop a quantitative measurement of probability. He connected the theory with problems of gambling and gave it serious consideration. Many other mathematicians also thought over problems of chance, problems of heads and tails, and so on.
During the 17th century, the French mathematicians Pascal and Fermat provided a sound, scientifically based theory of probability. During the 18th and 19th centuries, mathematicians such as Laplace, Gauss, De Moivre and Daniel Bernoulli continued to contribute to its development and growth. During the 20th century, Karl Pearson and Fisher based their theory of sampling on the theory of probability.
Probability theory is a branch of mathematics that deals with uncertainty and randomness. It provides a formal framework for quantifying and analyzing uncertainty, allowing us to make predictions and decisions and to model real-world phenomena. It is used in a wide range of fields, including statistics, science, engineering and finance.
Basic Terms of Probability
1. Experiment
2. Random Experiment
3. Sample Space
4. Event
5. Probability of occurrence of an event
Probability
According to Laplace, "Probability is the ratio of favourable events to the total number of equally likely events."
In practice, the value of 'p' always lies between 0 and 1.
If the happening of an event is certain, its probability is 1, i.e. P(A) = 1; if its occurrence is impossible, its probability is 0, i.e. P(A) = 0.
On the basis of this definition, the probability of the happening of a particular event is found as follows:
Probability Formula
P(A) = (Number of favourable cases)/(Total number of equally likely cases)
Example: A bag contains 4 Red and 5 Green balls. A ball is drawn at random. What is the probability that it is red in colour?
Solution:
Let event A be drawing a red ball.
Total number of balls = 4 + 5 = 9
Number of favourable (red) balls = 4
P(A) = (Number of favourable balls)/(Total number of balls) = 4/9
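As a quick check, this classical probability can be computed directly in code; the following minimal Python sketch hard-codes the bag contents from the example above.

```python
from fractions import Fraction

# Bag from the example: 4 red and 5 green balls
red, green = 4, 5
total = red + green

# Classical probability: favourable cases / equally likely cases
p_red = Fraction(red, total)
print(p_red)         # 4/9
print(float(p_red))  # 0.444...
```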
Concepts Relating To Probability
Random Experiment: Probability is calculated through experiments conducted under uncertainty. 'Random' means that the outcome depends purely on chance, without any personal bias or desire of the investigator. The meaning of the term 'experiment' here is much broader than in the physical sciences. For example, if a die is rolled, there are 6 possible outcomes (1, 2, 3, 4, 5, 6): sometimes 2, sometimes 4, and sometimes some other number may be the outcome.
Sample Space: The set of all possible outcomes of a random experiment, denoted by 'S' (or Ω). It represents the universe of possible outcomes, and the individual outcomes are also called 'sample points'. Example: if a coin is tossed once, its possible outcomes are Head (H) and Tail (T), so the sample space is S = {H, T} and n(S) = 2. Similarly, for a six-sided die the sample space is {1, 2, 3, 4, 5, 6}.
If two coins are tossed simultaneously, the sample space will be as follows:
S = {HH, HT, TH, TT} and n(S) = 4
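Such sample spaces can also be enumerated mechanically; the short Python sketch below lists the sample space for two coin tosses (using 'H' and 'T' as outcome labels).

```python
from itertools import product

# Sample space for tossing two coins simultaneously
S = list(product("HT", repeat=2))
print(S)       # [('H','H'), ('H','T'), ('T','H'), ('T','T')]
print(len(S))  # n(S) = 4
```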
Events
An event is an element or subset of the sample space of a random experiment. For example, if a die is rolled, the total number of outcomes is 6 (1, 2, 3, 4, 5, 6); any outcome fulfilling a specified description is known as an event.
a) Simple, composite and compound events:
When the probability of the occurrence of a single event is calculated, it is called a simple event. Example: the probability of getting a king in drawing a card from a pack of cards, or of drawing a red ball from a bag containing 4 red and 6 white balls.
If the occurrence of more than a single outcome in a single draw is calculated, it is a composite event. Example: getting an even number in rolling a single die, or getting a king or a queen in drawing a card.
The occurrence of two or more events at the same time is called a compound event. Example: rolling two dice, tossing three coins, tossing a coin three times, or drawing four cards.
b) Independent events- Events are said to be independent if the occurrence or non-occurrence of one event does not affect, and is not affected by, the events in succeeding trials. Example: if a die is rolled twice, the number coming up each time is independent of the other.
c) Dependent events- If the occurrence of one event affects the happening of other events, the succeeding events are said to be dependent. For example, if a bag contains 3 red and 5 white balls and two balls are drawn one by one without replacement, the outcome of the second draw depends on the first.
d) Mutually exclusive events- Two or more events are said to be mutually exclusive when the happening of any one of them excludes the happening of all the others in the same experiment. For example, if a single card is drawn from a pack, getting a king and getting a queen are mutually exclusive events: the card cannot be both.
e) Equally likely events- Two or more events are said to be equally likely when the chance of each happening is the same. Example: head and tail are equally likely events in tossing a coin, as are the numbers 1 to 6 in rolling a die.
f) Complementary events- If two events are mutually exclusive and exhaustive, then one of them will definitely occur, and each is the complement of the other. For example, if we want a head in tossing a coin, the occurrence of a head is the event and its non-occurrence (a tail) is the complementary event.
g) Sure and impossible events- An event covered by the whole sample space is a sure event; an event not contained in the sample space is an impossible event. Example: in rolling a die, the occurrence of one of the numbers 1, 2, 3, 4, 5, 6 is a sure event, while the occurrence of 7 or any other number is an impossible event.
h) Exhaustive events- The total number of possible outcomes of a trial or experiment is called the set of exhaustive events. For example, in rolling a die there are 6 exhaustive events, i.e., 1, 2, 3, 4, 5 and 6.
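To make these definitions concrete, here is a minimal Python sketch that models a few of these event types for a single die roll; the specific events ('an even number' and its complement) are chosen purely for illustration.

```python
# Sample space for one roll of a die
S = {1, 2, 3, 4, 5, 6}

simple_event    = {2}                  # a single outcome
composite_event = {2, 4, 6}            # "an even number": several outcomes
complement      = S - composite_event  # complementary event: "an odd number"

# Mutually exclusive events have no outcome in common
print(composite_event & complement)    # set() -> mutually exclusive

# Classical probabilities of the events
print(len(simple_event) / len(S))      # 1/6 ≈ 0.1667
print(len(composite_event) / len(S))   # 0.5
```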
Probability is a mathematical measurement, which can be expressed in different forms of equal mathematical value: as a ratio, a fraction or a percentage. Probability always satisfies 0 ≤ P(event) ≤ 1.
Probability Distribution
Probability distributions are very important in statistics, and they play a vital role in business decision-making and in experiments where the collection of actual data is not possible. A probability distribution describes how probabilities are assigned to the different events or outcomes of a random experiment. It can be represented in various ways, such as a probability mass function (for discrete outcomes) or a probability density function (for continuous outcomes).
The main uses of probability distributions:
1. Decision-making: Probability distributions are useful in making wise decisions and solving business problems under uncertainty and risk.
2. Useful where actual data cannot be collected: Sometimes it is not possible to collect actual data due to lack of time or funds. In such situations, the expected frequency distribution is very helpful in making a wise decision.
3. Forecasting: Probability distributions are also very helpful in forecasting. On the basis of a theoretical distribution, a businessman can anticipate the expected demand for his product in the market in the near future.
4. Solving business and other problems: Expected probability distributions are useful in solving many business and other problems. For example, by using probability distributions a quality controller can know whether a particular production technique is running smoothly or not.
Conditions for Applying Probability Distributions
Among the common probability distributions, the binomial and Poisson distributions are discrete in nature, while the normal distribution is continuous. They can be applied under the following conditions:
1. The events must be mutually exclusive. If two events A and B are mutually exclusive, then the probability of the occurrence of either A or B is P(A ∪ B) = P(A) + P(B). These distributions apply in situations where the events in each trial are mutually exclusive, with two options only, namely success and failure.
2. Each trial is independent. It means that the outcomes of any trial do not affect the outcomes of any subsequent trials.
3. If we denote success by 'p' and failure by 'q', then the sum of the success and failure probabilities is always equal to unity: p + q = 1.
4. The size of the sample must be finite or fixed.
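Under these conditions the binomial distribution applies; the following is a minimal Python sketch of its probability mass function using only the standard library (the values n = 10 and p = 0.4 are invented for illustration).

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent trials, success probability p)."""
    q = 1 - p  # p + q = 1
    return comb(n, k) * p**k * q**(n - k)

# Illustrative values: 10 trials, success probability 0.4
n, p = 10, 0.4
probs = [binomial_pmf(k, n, p) for k in range(n + 1)]
print(round(probs[3], 4))    # P(exactly 3 successes) ≈ 0.215
print(round(sum(probs), 4))  # 1.0 -> the probabilities sum to unity
```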
Random Variables: A random variable is a variable that takes on different values based on the outcomes of a random experiment. It is often denoted as X or Y. There are two types of random variables: discrete (taking on a countable set of values) and continuous (taking on a continuous range of values).
Probability Mass Function (PMF) and Probability Density Function (PDF): These functions describe how probabilities are distributed across the values of a random variable. For discrete random variables, the PMF provides the probability associated with each possible value. For continuous random variables, the PDF describes the likelihood of the variable falling within a certain range.
Cumulative Distribution Function (CDF): The CDF of a random variable X gives the probability that X takes on a value less than or equal to a specified value. It provides a complete picture of the random variable's distribution.
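As a concrete illustration of a PMF and its CDF, the sketch below builds both for a fair six-sided die, a simple discrete random variable; exact fractions are used so the totals come out exactly.

```python
from fractions import Fraction

# PMF of a fair die: each face has probability 1/6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# CDF: F(x) = P(X <= x), accumulated from the PMF
def cdf(x):
    return sum(p for value, p in pmf.items() if value <= x)

print(pmf[3])  # 1/6 -> P(X = 3)
print(cdf(3))  # 1/2 -> P(X <= 3)
print(cdf(6))  # 1   -> total probability is unity
```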
Joint Probability: When dealing with multiple random variables, joint probability describes the probability associated with specific combinations of values for these variables. This is often used in the context of events that involve more than one random variable.
Conditional Probability: Conditional probability quantifies the likelihood of an event occurring given that another event has already occurred. It is denoted as P(E | F), where E is the event of interest and F is the conditioning event.
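Conditional probability can be computed by counting within the sample space, since P(E | F) = P(E and F) / P(F). In the sketch below, E = 'an even number' and F = 'a number greater than 3' are events chosen purely for illustration.

```python
# One roll of a fair die
S = {1, 2, 3, 4, 5, 6}
E = {2, 4, 6}  # event of interest: an even number
F = {4, 5, 6}  # conditioning event: a number greater than 3

# For equally likely outcomes, P(E | F) = |E ∩ F| / |F|
p_E_given_F = len(E & F) / len(F)
print(p_E_given_F)  # 2/3
```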
Independence: Two events or random variables are considered independent if the occurrence (or value) of one does not affect the occurrence (or value) of the other. Independence is an essential concept in probability theory.
Bayes' Theorem: Bayes' theorem is a fundamental theorem in probability theory that allows for the calculation of conditional probabilities. It plays a critical role in Bayesian statistics and machine learning.
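Bayes' theorem states that P(F | E) = P(E | F) P(F) / P(E). The sketch below applies it to a hypothetical screening test; all the numbers (1% prevalence, 95% sensitivity, 10% false-positive rate) are invented for illustration.

```python
# Hypothetical screening test
p_disease = 0.01             # P(F): prior probability of disease
p_pos_given_disease = 0.95   # P(E | F): sensitivity
p_pos_given_healthy = 0.10   # false-positive rate

# Total probability of a positive test, P(E)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: probability of disease given a positive test
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # ≈ 0.088
```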
Probability theory provides a powerful framework for dealing with uncertainty and making informed decisions in various fields. It forms the basis for statistical inference, risk assessment, and decision-making under uncertainty. Additionally, it is a fundamental component of many advanced mathematical and computational techniques used in modern science and technology.
Theory of Sets
A set is an aggregate or collection of objects. Sets are usually designated by capital letters: A, B, C and so on. The members of a set A are called the elements of A. In general, when x is an element of A we write x ∈ A (x belongs to A), and if x is not an element of A we write x ∉ A.
Example:
The set whose elements are the integers 5,6,7,8 is a finite set with four elements. We could denote this by
A = {5, 6, 7, 8}, where 5 ∈ A and 9 ∉ A.
Universal Set
The universal set is the set of all objects under consideration, and it is generally denoted by U.
We consider some operations on sets.
a) Complement Set
b) Intersection Set
c) Union Set
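These three operations map directly onto Python's built-in set type; in the sketch below, the universal set U and the set B are chosen arbitrarily for illustration.

```python
# Universal set chosen for this illustration
U = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}
A = {5, 6, 7, 8}
B = {2, 4, 6, 8}

print(U - A)  # complement of A with respect to U
print(A & B)  # intersection: elements in both A and B -> {6, 8}
print(A | B)  # union: elements in A or B (or both)
```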