Probability distribution
A probability distribution is a table or a function that gives a numerical value (a probability) for each outcome of a statistical (random) experiment. With a probability distribution, one can model the behavior of a random variable. For this reason, a probability distribution is also called a function of a random variable.
Variable and Random Variable
Variable is a term you have been hearing. A variable is a symbol that can take more than one true value. Variables are denoted by letters of the English alphabet such as X, Y, Z, …
For example, suppose you are in a class together with other students, say Ram, Sita, Hari, Gopal, …
Now, if I say that X denotes the number of members in your family, then
X = number of members in the family.
In this case,
what value does X take for you? Tell me!
For Ram, X takes the value 5,
because when I asked Ram why, he told me that there are 5 members in his family.
Here, X is a variable because the value of X can be different for each student in the class: Ram, Sita, Hari, Gopal, …
A symbol that represents more than one value in this way is called a variable.
Random Variable
Random variable is a term we have been hearing. A random variable is a kind of function: it maps (pairs) each element of the sample space with a number. A random variable is denoted by a capital letter of the English alphabet, such as X, Y, Z, …, and the value of a random variable is denoted by a small letter of the English alphabet, as in X = x.
For example,
suppose we toss two coins. Let H stand for heads and T for tails. Then there are four possible outcomes in our toss: HH, HT, TH and TT. Each of these outcomes can be assigned a numerical value.
Let us count number of heads in each outcome, i.e.;
X= number of heads
Then, we can assign numerical value to each sample space as below
Outcome  HH  HT  TH  TT 
X  2  1  1  0 
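This mapping from outcomes to numbers can be written down directly. A minimal Python sketch (Python is assumed here; the document itself uses no particular language):

```python
from itertools import product

# Enumerate the outcomes of tossing two coins and map each to X = number of heads.
outcomes = ["".join(p) for p in product("HT", repeat=2)]
X = {outcome: outcome.count("H") for outcome in outcomes}
print(X)  # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}
```

Each outcome is paired with exactly one number, which is what makes X a function on the sample space.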
NOTE
A random variable is also called a chance variable. If we have two or more random variables, we use letters X, Y, Z, …,
and a random variable can be of any form: univariate, bivariate or multivariate.
Types of random variable (Discrete and Continuous)
Random variables are of two types: discrete and continuous.
Discrete Random Variable
A random variable X is called discrete if it can assume only a finite (or countably infinite) number of values, and the values it can take are separated by gaps.
For example
 number of accidents per month
 number of telephone calls in a day
 number of defective bulbs
If two bulbs are selected from a certain lot, and we represent
X = number of defective bulbs
then the value of X may be 0, 1 or 2, but it cannot be 0.1 or 1.5, etc.
Here, X can take only specific values, namely 0, 1 and 2.
Therefore, in this case, X is a discrete random variable. Similarly, if two dice are rolled, and
X = total sum of numbers on two dice
then value of X will be 2, 3, …, 12.
In this case, “X = total sum of numbers on two dice” is a discrete random variable because there are only 11 possible values for X, namely 2, 3, …, 12.
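The 11 possible sums and their probabilities can be tallied by brute force. A short Python sketch (an illustration, not part of the original text):

```python
from collections import Counter

# X = sum of two dice; tally the 36 equally likely rolls by their sum.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
print(sorted(counts))   # the 11 possible values 2, 3, ..., 12
print(counts[7] / 36)   # P(X = 7) = 6/36
```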
Continuous Random Variable
A random variable X is called continuous if it can take all values in some range. For such a random variable there are no gaps between its possible values: the values always form an interval, and the interval may be very small. For example:
 height of a student, it could be any real number between certain extreme limits.
Suppose,
X = height of students in grade X
which lies between 100 cm and 150 cm. Then the height can take any value between 100 and 150: it may be 135 cm or 135.5 cm, or it may take any value between 135 cm and 136 cm.
Therefore, in this case, X is a continuous random variable.
 Temperature: the temperature recorded at a specific time of day can take any value within a range, even though we typically report it in discrete units like Celsius or Fahrenheit.
 Reaction Time: The time it takes for a person to react to a stimulus is a continuous variable since it can theoretically take any positive value, although it's often measured in milliseconds.
 Distance Traveled: the distance traveled by a moving object over a given time interval can vary continuously, depending on factors like speed and time, so it is a continuous random variable.
Probability Distribution
A probability distribution is a function of a random variable.
It describes how the probabilities are distributed over the values of the random variable.
Since there are two types of random variables, we have two types of probability distributions: the discrete probability distribution and the continuous probability distribution.
Discrete probability distribution
Let \( X \) be a discrete random variable. Then the probability distribution of \( X \) is denoted by \( f(x) \) and defined by
\( f( x )=P( X=x )\)
where
 \(f( x )\ge 0\)
 \(\displaystyle \sum_X f( x )=1\)
 For example, if we toss a pair of coins, and
X = number of heads obtained
then the probability distribution of X is given in the table (or histogram) below.

S        TT                 HT, TH             HH
x = X    0                  1                  2
f(X=x)   \( \frac{1}{4}\)   \( \frac{2}{4}\)   \( \frac{1}{4}\)
The table (or histogram) above is called the probability distribution of X,
where every f(x) is nonnegative (here \( \frac{1}{4}\), \( \frac{2}{4}\), \( \frac{1}{4}\) are all nonnegative),
 and the sum of all the f(x) is 1 (here \( \frac{1}{4}+\frac{2}{4}+\frac{1}{4}=1\)).
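These two defining properties can be checked mechanically. A Python sketch (illustrative only) for the coin-toss distribution above:

```python
# Distribution of X = number of heads in two coin tosses.
f = {0: 1/4, 1: 2/4, 2: 1/4}

# The two defining properties of a discrete probability distribution:
nonnegative = all(p >= 0 for p in f.values())
total = sum(f.values())
print(nonnegative, total)  # True 1.0
```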
NOTE
A discrete probability function is a function that can take a discrete number of values (not necessarily finite). Each of the discrete values has a certain probability of occurrence that is between zero and one. That is, a discrete function that allows negative values or values greater than one is not a probability function.
Exercise
 A fair coin is tossed twice. Let X be the number of heads that are observed.
 Construct the probability distribution of X
 Find the probability that at least one head is observed.
The possible values that X can take are 0, 1, and 2.
Each of these numbers corresponds to an event in the sample space S={TT,TH,HT,HH} of equally likely outcomes for this experiment:
X = 0 corresponds to {TT}, X = 1 to {TH, HT}, and X = 2 to {HH}.
 Construct the probability distribution of X
The probability of each of these events, and hence of the corresponding value of X, can be found simply by counting:

S        TT     HT, TH   HH
x = X    0      1        2
f(X=x)   0.25   0.5      0.25
A histogram that graphically illustrates the probability distribution is given below.
 Find the probability that at least one head is observed.
“At least one head” is the event X ≥ 1, which is the union of the mutually exclusive events X = 1 and X = 2. Thus
P(X≥1)=P(1)+P(2)=0.50+0.25=0.75
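The same sum over the qualifying values of X can be written directly in code. A Python sketch (illustrative only):

```python
# Probability distribution of X = number of heads in two tosses.
f = {0: 0.25, 1: 0.50, 2: 0.25}

# "At least one head" is the union of the disjoint events X = 1 and X = 2.
p = sum(prob for x, prob in f.items() if x >= 1)
print(p)  # 0.75
```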
 A fair coin is tossed three times. Let X be the number of heads that are observed.
 Construct the probability distribution of X
 Find the probability that at least two heads are observed.
Continuous probability distribution
Let \( X \) be a continuous random variable. Then the probability distribution of \( X \) is described by a probability density function, denoted by \( f(x) \), with
\(P( a\le X\le b )=\displaystyle \int_a^b f( x )\,dx\)
where
 \(f( x )\ge 0\)
 \(\displaystyle \int\limits_{-\infty }^{\infty }{f( x )}\,dx=1\)
 For example, if X is a continuous random variable with probability function
\(f( x )=\frac{x^2}{21} \) for \(1 \le x \le 4 \)
then probability is
\(P(2 <x <3)\)
or \( \int_2^3\frac{x^2}{21} dx \)
or \( \int_2^3 f(x) dx \)
or \( \frac{1}{21} \int_2^3 x^2 dx \)
or \( \frac{1}{21} \left [\frac{x^3}{3} \right ]_2^3 \)
or \( \frac{1}{21} \left [\frac{3^3-2^3}{3} \right ] \)
or \( \frac{1}{21} \left [\frac{19}{3} \right ] \)
or \( \frac{19}{63} \approx 0.3 \)
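The integral can be checked numerically. A Python sketch using a midpoint Riemann sum (no external libraries assumed):

```python
# Midpoint Riemann sum for P(2 < X < 3) with f(x) = x^2/21 on [1, 4];
# the exact value is 19/63, roughly 0.3.
def f(x):
    return x ** 2 / 21

n = 100_000
a, b = 2, 3
h = (b - a) / n
p = sum(f(a + (i + 0.5) * h) for i in range(n)) * h
print(round(p, 4))  # 0.3016
```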
The probability distribution of continuous random variable is also called probability density (or probability density function ) and it can only be shown by a graph (as below).
The probability distribution of a continuous random variable can only be shown by a graph; it cannot be shown by a table, because for a continuous random variable the probability at each individual point is zero.
For the probabilities of a continuous random variable,
\( P(a \le X \le b)= P(a<X \le b)= P(a<X<b)= P(a \le X<b)\)
all have the same value.
For a continuous random variable, the probability at a single point is zero:
\( \displaystyle \int_a^a f(x) dx =0\)
NOTE
Since continuous probability functions are defined for an infinite number of points over a continuous interval, the probability at a single point is always zero. Probabilities are measured over intervals, not single points. That is, the area under the curve between two distinct points defines the probability for that interval.
NOTE
Discrete probability functions are referred to as probability mass functions and continuous probability functions are referred to as probability density functions. The term probability function covers both discrete and continuous distributions.
Exercise
 Given \(f( x )=cx^2 \) for \(0 \le x \le 3 \), find the value of c and \(P( 1 \le x \le 2) \). [Ans: 1/9, 7/27]
Joint probability distribution
In science and in real life, we are often interested in two (or more) random variables at the same time. For example, the IQ and birth weight of children, or the number of Facebook friends and the age of Facebook members.
In such situations, the random variables have a joint distribution that allows us to compute probabilities of events involving both variables and understand the relationship between the variables. This is simplest when the variables are independent. When they are not, we use covariance and correlation as measures of the nature of the dependence between them.
In summary, the combined probability distribution of two or more random variables is called a joint probability distribution.
Definition
Let \( X \) and \( Y\) be two discrete random variables. Then the joint probability distribution of \( X \) and \( Y\) is denoted by \( f(x,y) \) and defined by
\(f( x,y )=P( X=x,Y=y )\)
where \(f( x,y )\ge 0\)
 \( \displaystyle \sum_Y \sum_X f( x,y )=1\)
Here, X and Y are two discrete random variables.
For example, if
X takes values \(x_1,x_2,\dots ,x_n\) and
Y takes values \(y_1,y_2,\dots ,y_n\),
then the ordered pair (X, Y) takes values in the product \((x_1,y_1),(x_1,y_2),\dots ,(x_n,y_n)\).
The function \(f(x,y)\) is also called the joint probability function (or joint probability mass function) of X and Y.
Example: The joint probability distribution of X and Y is \(f( x,y )=kxy\) for \(x=1,2,3\) and \(y=1,2,3\).
Determine value of \(k\) and probability \(P( X=2,Y=1 )\).
Solution
Given joint probability distribution of two discrete random variables X and Y, is
\(f( x,y )=kxy\) for \(x=1,2,3\) and \(y=1,2,3\)
The joint probability distribution is given in the table below.

f(x,y)   X=1   X=2   X=3
Y=1      k     2k    3k
Y=2      2k    4k    6k
Y=3      3k    6k    9k

We know that the total probability is 1.
Therefore,
\( \displaystyle \sum_Y \sum_Xf( x,y )=1\)
or \(36k=1\)
or \(k=\frac1{36}\)
Hence, the value of k is
\(k=\frac1{36}\).
Next, the probability \(P( X=2,Y=1 )\) is
\( P( X=2,Y=1 )\)
or \( 2k\)
or \( \frac{2}{36}\)
or \( \frac{1}{18}\)
This completes the solution.
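The normalization step can be verified by summing the table. A Python sketch (illustrative only):

```python
# f(x, y) = k*x*y on x, y in {1, 2, 3}; the total probability 1 fixes k.
coeff = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))
k = 1 / coeff
print(coeff)       # 36, so k = 1/36
print(k * 2 * 1)   # P(X = 2, Y = 1) = 2/36 = 1/18
```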
 Two caplets are selected at random from a bottle containing three aspirin, two sedative and four laxative caplets. If X and Y are respectively the number of aspirin and sedative caplets included among the two caplets drawn from the bottle, find the probabilities associated with all possible pairs of values of X and Y.
Solution
Given that two caplets are selected at random from a bottle containing three aspirin, two sedative and four laxative caplets. Also given that
X= number of aspirin and
Y= number of sedative
Therefore, the possible pairs of values of X and Y are
\(( 0,0 ),( 0,1 ),( 1,0 ),( 1,1 ),( 0,2 )\) and \(( 2,0 )\)
Here, the pair \(( 0,0 )\) represents the selection of 0 aspirin and 0 sedative caplets, and this can be done in
\( \begin{pmatrix} 3 \\ 0 \end{pmatrix} \begin{pmatrix} 2 \\ 0 \end{pmatrix} \begin{pmatrix} 4 \\ 2 \end{pmatrix} \) ways
Thus, the probability of selecting 0 Aspirin and 0 Sedative caplets is
\( \dfrac{\begin{pmatrix} 3 \\ 0 \end{pmatrix} \begin{pmatrix} 2 \\ 0 \end{pmatrix} \begin{pmatrix} 4 \\ 2 \end{pmatrix}}{\begin{pmatrix} 9 \\ 2 \end{pmatrix}} =\frac{6}{36}=\frac{1}{6} \)
That is,
\( f( 0,0 )=\frac{1}{6}\)
Continuing this way, the probabilities of all possible pairs of values of X and Y are given by
\( f(x,y)=\dfrac{\begin{pmatrix} 3 \\ x \end{pmatrix} \begin{pmatrix} 2 \\ y \end{pmatrix} \begin{pmatrix} 4 \\ 2-x-y \end{pmatrix}}{\begin{pmatrix} 9 \\ 2 \end{pmatrix}} \) for \( x=0,1,2,\; y=0,1,2,\; x+y \le 2 \)
Thus, the probability distribution of X and Y is

f(x,y)   X=0    X=1    X=2
Y=0      1/6    1/3    1/12
Y=1      2/9    1/6    0
Y=2      1/36   0      0

This completes the solution.
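The counting formula can be tabulated and checked in a few lines. A Python sketch (illustrative only):

```python
from math import comb

# f(x, y) = C(3, x) C(2, y) C(4, 2-x-y) / C(9, 2): draw 2 caplets from
# 3 aspirin, 2 sedative and 4 laxative; x aspirin and y sedative are chosen.
def f(x, y):
    return comb(3, x) * comb(2, y) * comb(4, 2 - x - y) / comb(9, 2)

table = {(x, y): f(x, y) for x in range(3) for y in range(3) if x + y <= 2}
print(table[(0, 0)])        # 1/6
print(sum(table.values()))  # all probabilities sum to 1
```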
Exercise
 Given \(f( x,y )=\frac{x^2+y}{32}\) for \( x=0,1,2,3 \) and \( y=0,1\), find \(P(X \ge 2,Y=1)\).
 Roll a fair die twice. Let X be the face shown on the first roll, and let Y be the face shown on the second roll, then find the distribution of f(x,y).
 Let the random experiment be to roll a fair die twice, let us define the random variables X = the maximum of the two rolls, and Y = the sum of the two rolls, then find the f(x,y).
 Given \(f( x,y,z )=\frac{xyz}{108}\) for \(x=1,2,3,y=1,2,3,z=1,2\), find \(P(X=1,Y=1,Z=2)\)
Joint probability density
Let \( X \) and \( Y\) be two continuous random variables. Then the joint probability density of \( X \) and \( Y\) is denoted by \( f(x,y) \) and defined by
\(P( a\le X \le b,c\le Y\le d )=\displaystyle \int_{y=c}^{y=d} \int_{x=a}^{x=b}f( x,y )\,dx\,dy\)
where \(f( x,y )\ge 0\) for all \( x \) and \( y \)
 \(\displaystyle \int_{-\infty}^{\infty} \int_{-\infty}^{\infty}f( x,y )\,dx\,dy=1\)
The continuous case is essentially the same as the discrete case: we just replace sums by integrals.
For example
If X takes values in [a, b] and Y takes values in [c, d], then the pair (X, Y) takes values in the product [a, b] × [c, d]. The joint probability density function (joint pdf) of X and Y is a function f(x, y) giving the probability density in a small rectangle of width dx and height dy.
Example: The joint probability function of two continuous random variables X and Y is
\( f(x,y) = cxy \) for \( 0\leq x \leq 4,1 \leq y \leq 5 \)
Determine value of \(c\) and find \(P(1\le X\le 2,2\le Y\le 3)\)
Solution
Given the joint probability function of two continuous random variables X and Y, the total probability is 1.
Thus,
\( \int_{-\infty}^{\infty} \int_{-\infty}^{\infty}f( x,y )\,dx\,dy=1\)
or \( \int_{y=1}^5 \int_{x=0}^4 cxy dxdy=1\)
or \(c \int_{y=1}^5 \{ \int_{x=0}^4 x dx \} y dy=1\)
or \(c \int_{y=1}^5 8 y dy=1\)
or \(8c \int_{y=1}^5 y dy =1\)
or \(8c \cdot 12 =1\)
or \(96c=1\)
or \( c=\frac1{96}\)
Next, using value of c,
\(P( 1\le X\le 2,2\le Y\le 3 )\)
or \(\displaystyle \int_{y=2}^{3} \int_{x=1}^{2}\frac{xy}{96}\,dx\,dy\)
or \(\frac1{96} \int_{x=1}^{2} x \bigg\{ \int_{y=2}^{3} y\,dy \bigg\}dx\)
or \(\frac1{96}\cdot\frac{5}{2} \int_{x=1}^{2}x\,dx \)
or \(\frac{5}{128}\)
This completes the solution.
Example: The joint probability function of two continuous random variables X and Y is
\( f(x,y) = \frac{3x(x+y)}{5} \) for \( 0 \le x \le 1,0 \le y \le 2\)
Find the probability \(P[ ( x,y )\in A ]\) where \(A=\{(x,y ):0 \le x \le \frac{1}{2},1 \le y \le 2\}\).
Solution
Given the joint probability function of two continuous random variables X and Y, probability for the region A is
\(P[ ( x,y )\in A ]\)
or \(P( 0 < x < \frac{1}{2},1 < y < 2 )\)
or \( \int_1^2 \int_0^{\frac{1}{2}} \frac{3}{5}x( x+y )dx dy\)
or \(\frac{3}{5} \int_1^2 \bigg \{ \int_0^{\frac{1}{2}} x( x+y )dx \bigg \}dy\)
or \(\frac{3}{5} \int_1^2 \bigg \{ \int_0^{\frac{1}{2}}( x^2+xy )dx \bigg \} dy\)
or \(\frac{3}{5} \int_1^2 \bigg( \frac{x^3}{3}+\frac{x^2y}{2} \bigg )_{x=0}^{x=1/2} dy\)
or \(\frac{3}{5} \int_1^2 \bigg ( \frac{( 1/2 )^3}{3}+\frac{( 1/2 )^2y}{2} \bigg ) dy \)
or \(\frac{3}{5} \int_1^2 \bigg( \frac{1+3y}{24} \bigg )dy\)
or \(\frac{3}{5}.\frac{1}{24} \int_1^2( 1+3y )dy\)
or \(\frac{3}{120}\bigg( y+\frac{3y^2}{2} \bigg)_1^2 \)
or \(\frac{3}{120}\bigg( 2+\frac{3\cdot 2^2}{2}-1-\frac{3\cdot 1^2}{2} \bigg)\)
or \(\frac{11}{80}\)
This completes the solution.
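The double integral above can be sanity-checked numerically. A Python sketch using a two-dimensional midpoint rule (illustrative only):

```python
# Midpoint-rule check of P(0 < X < 1/2, 1 < Y < 2) for f(x, y) = 3x(x + y)/5;
# the exact answer worked out above is 11/80 = 0.1375.
def f(x, y):
    return 3 * x * (x + y) / 5

n = 400
hx, hy = 0.5 / n, 1.0 / n
p = sum(f((i + 0.5) * hx, 1 + (j + 0.5) * hy)
        for i in range(n) for j in range(n)) * hx * hy
print(round(p, 4))  # 0.1375
```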
Exercise
 Given \(f( x,y )=kx( x-y )\) for \( 0<x<1,\,-x<y<x\), find the value of \(k\)
 Given \(f( x,y )=k( x+y^2)\) for \( 0 < x < 1, 0 <y < 1\), find value of \(k\)
 Given \(f( x,y )=2\) for \(x>0,y>0,x+y<1\). Find probability for followings.
 \(P( X\le 2,Y\le 2)\)
 \(P( X+Y> 1)\)
 \(P(X>2Y)\)
 Given \( f( x,y )=2xy \) for \( 0 < x <1,0 < y < 1 \). Find \( F(\frac{1}{2},1)\)
 Given \(f( x,y,z )=( x+y )e^{-z}\) for \(0<x<1,0<y<1,z>0\). Find the probability \(P[ ( x,y,z )\in A ]\) where \(A=\{(x,y,z ):0<x<\frac{1}{2},\frac{1}{2}<y<1,0<z<1\}\)
Marginal Probability Distribution
If the joint probability distribution of two random variables X and Y is known, then we can obtain the probability distributions of X and Y separately.
These separate probability distributions are called marginal probability distributions.
Let \( X \) and \( Y\) be two random variables with joint probability function \(f(x,y) \). Then the marginal probability function of \( X \) is denoted by
\( g(x) \) or \( f_X(x) \).
If X is discrete, then marginal probability function of \( X \) defined by
\( g(x)=\sum_Y f( x,y )\)
If X is continuous, then marginal probability function of \( X \) defined by
\( g(x)=\displaystyle\int_{-\infty}^{\infty} f( x,y )\,dy\)
Similarly,
the marginal probability function of \( Y\) is denoted by \( h ( y ) \) or \( f_Y(y)\)
If Y is discrete, then marginal probability function of \( Y \) defined by
\( h ( y )=\sum_X f( x,y )\)
If Y is continuous, then marginal probability function of \( Y \) defined by
\( h ( y )=\displaystyle\int_{-\infty}^{\infty} f( x,y )\,dx\)
Example: Given the joint pdf \(f( x,y )=\frac{xy}{36}\) for \( x=1,2,3,\,y=1,2,3\), find the marginal probability function of X.
Solution

f(x,y)   X=1                X=2                X=3
Y=1      \(\frac{1}{36}\)   \(\frac{2}{36}\)   \(\frac{3}{36}\)
Y=2      \(\frac{2}{36}\)   \(\frac{4}{36}\)   \(\frac{6}{36}\)
Y=3      \(\frac{3}{36}\)   \(\frac{6}{36}\)   \(\frac{9}{36}\)
g(x)     \(\frac{1}{6}\)    \(\frac{2}{6}\)    \(\frac{3}{6}\)
\(g( x )=\displaystyle \sum_{y=1}^{3}f( x,y )\)
or \(g( x )=\displaystyle \sum_{y=1}^{3}\frac{xy}{36}\)
or \(g( x )=\frac{x.1}{36}+\frac{x.2}{36}+\frac{x.3}{36}\)
or \(g( x )=\frac{x}{6}\)
Thus, marginal probability function of X is
\(g( x )=\frac{x}{6}\) for \( x=1,2,3\).
Example: Given the joint pdf \(f( x,y )=\frac{xy}{36}\) for \( x=1,2,3,\,y=1,2,3\), find the marginal probability function of Y.
Solution

f(x,y)   X=1                X=2                X=3                h(y)
Y=1      \(\frac{1}{36}\)   \(\frac{2}{36}\)   \(\frac{3}{36}\)   \(\frac{1}{6}\)
Y=2      \(\frac{2}{36}\)   \(\frac{4}{36}\)   \(\frac{6}{36}\)   \(\frac{2}{6}\)
Y=3      \(\frac{3}{36}\)   \(\frac{6}{36}\)   \(\frac{9}{36}\)   \(\frac{3}{6}\)
\(h( y )=\displaystyle \sum_{x=1}^{3}f( x,y )\)
or \(h( y )=\displaystyle \sum_{x=1}^{3}\frac{xy}{36}\)
or \(h( y )=\frac{1.y}{36}+\frac{2.y}{36}+\frac{3.y}{36}\)
or \(h( y )=\frac{y}{6}\)
Thus, marginal probability function of Y is
\(h( y )=\frac{y}{6}\) for \( y=1,2,3\).
Example: Given the joint pdf \(f( x,y )=\frac{2(x+2y)}{3} \) for \( 0\le x\le 1,\,0\le y\le 1 \), find the marginal probability functions of X and Y.
Solution
Given the joint probability function of two continuous random variables X and Y, the marginal probability function of X is
\(g( x )= \displaystyle\int_{-\infty}^{\infty}f( x,y )\,dy\)
or \(g(x)= \int_{y=0}^{1}\frac{2}{3}( x+2y )dy\)
or \(g(x)=\frac{2}{3}\Big( xy+\frac{2y^2}{2} \Big)_{y=0}^{1}\)
or \( g(x)=\frac{2}{3}( x+1 )\)
Thus, marginal probability function of X is
\( g(x)=\frac{2( x+1 )}{3}\) for \( 0\le x\le 1\)
Next, the marginal probability function of Y is
\(h( y )= \displaystyle\int_{-\infty}^{\infty}f( x,y )\,dx\)
or \(h( y)= \int_{x=0}^{1}\frac{2}{3}( x+2y )\,dx\)
or \(h ( y)=\frac{2}{3}( \frac{x^2}{2}+2xy )_0^1\)
or \(h ( y)=\frac{2}{3}( \frac{1}{2}+2y )\)
or \(h ( y)=\frac{2}{3}( \frac{1+4y}{2} )\)
or \(h ( y)=\frac{1}{3}( 1+4y )\)
Thus, marginal probability function of Y is
\( h ( y)=\frac{1}{3}( 1+4y )\) for \( 0\le y\le 1\)
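Integrating out one variable can be checked at a couple of sample points. A Python sketch with a midpoint rule on [0, 1] (the test points x0, y0 are arbitrary choices, not from the text):

```python
# Integrate out one variable of f(x, y) = 2(x + 2y)/3 on the unit square and
# compare with the marginals derived above: g(x) = 2(x+1)/3, h(y) = (1+4y)/3.
def f(x, y):
    return 2 * (x + 2 * y) / 3

def integrate01(fun, n=2000):  # midpoint rule on [0, 1]
    h = 1 / n
    return sum(fun((i + 0.5) * h) for i in range(n)) * h

x0, y0 = 0.3, 0.7
g_num = integrate01(lambda y: f(x0, y))   # numeric g(x0)
h_num = integrate01(lambda x: f(x, y0))   # numeric h(y0)
print(round(g_num, 6), round(2 * (x0 + 1) / 3, 6))
print(round(h_num, 6), round((1 + 4 * y0) / 3, 6))
```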
Exercise
 Given joint pdf \(f( x,y )=\frac{xy}{36}\) for \(x=1,2,3,y=1,2,3\). Find marginal probability function of Y
 Given joint pdf \( f( x,y )=\frac{x^2+y}{32}\) for \(x=0,1,2,3 \) and \(y=0,1\), Find marginal probability function of X and Y
 Given joint pdf \( f( x,y,z )=\frac{xyz}{108} \) for \(x=1,2,3,y=1,2,3,z=1,2\). Find (a) Marginal probability function of X (b) joint marginal probability function of \( X \) and \( Y\).
 Given \(f( x,y )=\frac{6}{5}( x+y^2)\) for \( 0 < x < 1, 0 <y < 1\), Find marginal probability function of X and Y.
 Given joint pdf \( f( x,y )=2xy\) for \(0 <x <1,0 <y < 1\). Find marginal probability function of X and Y
 Given joint pdf \(f( x,y )=\frac{3x( x+y )}{5}\) for \( 0 < x < 1, 0 <y < 2\). Find marginal probability function of X and Y
 Given joint pdf \( f( x,y,z )=( x+y )e^{-z}\) for \(0 < x < 1,0 < y< 1,z > 0\). Find (a) the joint marginal function of X and Z (b) the marginal density of X alone.
 Given joint pdf \( f( x,y )= 8xy\) for \( 0 \le x < y \le 1\). Find marginal probability function of X.
 Given joint pdf \(f( x,y )=2\) for \( 0 < x< 1,0 < y < x\). Find marginal probability function of \( X\)
 If the joint probability function of two continuous random variables X and Y is \( f( x,y )=2x\) for \(0 < x< 1,0 < y < 1\). Find marginal probability function of X and Y.
 Let the random experiment be to roll a fair die twice, let us define the random variables X = the maximum of the two rolls, and Y = the sum of the two rolls, Find marginal probability function of X and Y.
Conditional Probability Distribution
A conditional probability distribution describes the distribution of one random variable when the other is fixed at a particular value.
If \( X \) and \( Y\) are two random variables with joint probability \( f(x,y) \), and \( g( x )\) is the marginal probability function of \( X \) and \(h( y )\) the marginal probability function of \( Y\), then
 the conditional probability function of \( X \) given \( Y=y\) is denoted by \(w( x|y )\) or \( f_{X|Y}(x|y) \) and defined by
\( w( x|y )=\frac{f( x,y )}{h( y )};\quad h( y )\ne 0\)
 the conditional probability function of \( Y\) given \( X=x\) is denoted by \( w( y|x )\) or \( f_{Y|X}(y|x) \) and defined by
\( w( y|x )=\frac{f( x,y )}{g( x )};\quad g( x )\ne 0\)
 Given joint pdf \(f( x,y )=\frac{xy}{36}\) for \( x=1,2,3,y=1,2,3\). Find conditional probability function of X given Y=2
Solution

f(x,y)   X=1                X=2                X=3
Y=1      \(\frac{1}{36}\)   \(\frac{2}{36}\)   \(\frac{3}{36}\)
Y=2      \(\frac{2}{36}\)   \(\frac{4}{36}\)   \(\frac{6}{36}\)
Y=3      \(\frac{3}{36}\)   \(\frac{6}{36}\)   \(\frac{9}{36}\)

The marginal probability function of Y is
\(h( y )=\frac{y}{6}\) for \( y=1,2,3\).
Now, the conditional probability of X given Y = y is
\( w( x|y )\)
or \( \frac{f( x,y )}{h(y )}\)
or \( \frac{xy}{36} . \frac{6}{y}\)
or \( \frac{x}{6} \)
Hence, conditional probability of X given Y=2 is
\( w( x|2 )=\frac{x}{6} \) for \( x=1,2,3\)
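The division of the joint table by a marginal can be done directly. A Python sketch (illustrative only):

```python
# Conditional distribution of X given Y = 2 from the joint table f(x, y) = xy/36.
f = {(x, y): x * y / 36 for x in (1, 2, 3) for y in (1, 2, 3)}
h2 = sum(f[(x, 2)] for x in (1, 2, 3))      # marginal h(2) = 2/6
w = {x: f[(x, 2)] / h2 for x in (1, 2, 3)}  # w(x|2) = f(x, 2)/h(2)
print(w)  # x/6 for x = 1, 2, 3
```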
 Given \( f( x,y )=\frac{x^2+y}{32} \) for \(x=0,1,2,3,y=0,1\), find conditional probability of \( Y\) given \(X=1\).
Solution
Given joint probability function of X and Y, the marginal probability function of X is
\(g( x )=\frac{2x^2+1}{32}\) for \(x=0,1,2,3\)
Now, the conditional probability of Y given X = x is
\( w( y|x )\)
or \( \frac{f( x,y )}{g( x )}\)
or \(\frac{x^2+y}{32}.\frac{32}{2x^2+1} \)
or \(\frac{x^2+y}{2x^2+1} \)
At x=1, the conditional probability function is
\(\frac{1^2+y}{2.1^2+1} \)
or \(\frac{1+y}{3} \) for \(y=0,1\)
This completes the solution.
Example: Given \( f( x,y )=2-x-y\) for \(0 < x < 1,\,0 < y < 1\). Find the conditional probability function of X given Y = 1/2.
Solution
Given joint probability density of two continuous random variables X and Y, the marginal probability function of Y is
\(h( y )=\displaystyle \int_{x=0}^{1}( 2-x-y )\,dx\)
or \(h( y )=\frac{3-2y}{2}\) for \(0 < y < 1\)
Now, the conditional probability function of X given Y=y is
\(w( x|y )\)
or \(\frac{f( x,y )}{h( y )}\)
or \(\frac{2-x-y}{\frac{3-2y}{2}}\)
or \(\frac{4-2x-2y}{3-2y}\)
Thus, conditional probability function of X given Y=1/2 is
\(\frac{4-2x-2\cdot \frac{1}{2}}{3-2\cdot \frac{1}{2}}\)
or \(\frac{3-2x}{2}\) for \(0 < x < 1\)
Exercise
 Given \(f( x,y )=\frac{xy}{36}\)for \( x=1,2,3,y=1,2,3\). Find the conditional probability function of X given Y=y
 Given \( f( x,y )=\frac{x^2+y}{32}\) for \( x=0,1,2,3, y=0,1\). Find conditional probability function of Y given X=1
 Given \( f( x,y,z )=\frac{xyz}{108}\) for \( x=1,2,3,y=1,2,3,z=1,2\). Find conditional probability function of X given Y=1.
 Given \( f( x,y )=\frac{2(x+2y )}{3}\) for \( 0 < x < 1,0 < y < 1\). Find conditional probability function of X given Y=y
 Given \(f( x,y )=\frac{3x( x+y )}{5}\) for \( 0 < x < 1,0 < y < 2\) . Find conditional probability function of Y given X=1.
 Given \(f( x,y )=2\) for \( 0 < x < 1,0 < y < x\). Find the conditional probability function of X given Y=y.
Moment Generating Function
A moment is a descriptive measure of a distribution, computed relative to a reference point such as the origin or the mean. Moments can be used to calculate descriptive statistics like the mean, variance, skewness and kurtosis.
If X is random variable with probability function \( f(x) \) , and \( r\in Z^{+}\) then
the rth moment of X about the origin is denoted by \( \mu _r'\) or \( E(X^r)\) and defined by
\(E[X^r]= \sum_X x^r.f( x ) \) if X is discrete
\(E[X^r]= \displaystyle\int_{-\infty}^{\infty} x^r.f( x )\,dx\) if X is continuous
In particular, \( \mu_1'=E( X ) \): the 1st moment about the origin is the mean
 \( \mu _2'=E(X^2) \)
 \( \mu _r'=E(X^r) \)
\( r^{th}\) moment about mean is denoted by \( \mu_r \) and defined by
\( \mu_r = E[(X-E(X))^r]= \sum_X ( x-E(X))^r. f( x ) \) if X is discrete
\( \mu_r = E[(X-E(X))^r]= \displaystyle\int_{-\infty}^{\infty} ( x-E(X))^r. f( x )\,dx \) if X is continuous
Here,
 \( \mu_2=E[(X-E(X))^2] \): the 2nd moment about the mean is the variance
Moment Generating Function
Let X be a random variable with probability function \(f( x )\), then expected value of \( e^{tX} \) is
\( E[e^{tX} ]= \displaystyle\int_{-\infty}^{\infty}e^{tx}.f( x )\,dx\) if X is continuous
\( E[e^{tX} ]= \sum_X e^{tx}.f( x ) \) if X is discrete
We carry out the derivation assuming X is continuous; if X is discrete, the integration is simply replaced by summation.
Here,
\( E[ e^{tX} ]= \int_{-\infty}^{\infty}e^{tx}.f( x )\,dx\)
or \( E[ e^{tX} ]= \int_{-\infty}^{\infty} [1+tx+\frac{t^2}{2!}x^2+\ldots +\frac{t^r}{r!}x^r+\ldots ].f( x )\,dx\)
or \( E[ e^{tX} ]= \int_{-\infty}^{\infty} f( x )\,dx+\int_{-\infty}^{\infty} tx\, f( x )\,dx+\int_{-\infty}^{\infty}\frac{t^2}{2!}x^2f( x )\,dx+\ldots +\int_{-\infty}^{\infty}\frac{t^r}{r!}x^rf( x )\,dx+\ldots \)
or \( E[ e^{tX} ]= \int_{-\infty}^{\infty} f( x )\,dx+t\int_{-\infty}^{\infty} x\, f( x )\,dx+\frac{t^2}{2!}\int_{-\infty}^{\infty} x^2f( x )\,dx+\ldots +\frac{t^r}{r!}\int_{-\infty}^{\infty} x^rf( x )\,dx+\ldots \)
or \( E[ e^{tX} ]=1+t \mu_1' +\frac{t^2}{2!}\mu_2'+\ldots +\frac{t^r}{r!}\mu_r'+\ldots \)
Here, \( E[ e^{tX} ]\) generates moments about origin.
Thus
\( E[ e^{tX} ]\) is called moment generating function.
It is denoted by \( M_X( t )\) and defined by
\( M_X( t )=E[ e^{tX} ] \)
where
\( M_X( t )=1+t \mu_1' +\frac{t^2}{2!}\mu_2'+\ldots +\frac{t^r}{r!}\mu_r'+\ldots \)
Theorem
Show that \( \bigg[\frac{d^r} {dt^r} M_X(t)\bigg]_{t=0}=\mu _r'\)
Proof Let X be a random variable with probability function f(x), then moment generating function of X is
\( M_X( t )=1+t \mu_1' +\frac{t^2}{2!}\mu_2'+\ldots +\frac{t^r}{r!}\mu_r'+\ldots\) (A)
Differentiating (A) with respect to t, we get
\( \frac{d}{dt}M_X( t )=\mu_1'+t\mu_2'+\ldots +\frac{t^{r-1}}{(r-1)!}\mu_r'+\frac{t^r}{r!}\mu_{r+1}'+\ldots \) (1)
Putting \( t=0 \), in (1) we get
\(\bigg[\frac{d}{dt}M_X(t)\bigg]_{t=0}=\mu_1' \)
Again, differentiating (1) with respect to t, we get
\(\frac{d^2}{dt^2}M_X(t)=\mu _2'+\ldots +\frac{t^{r-2}}{( r-2 )!}\mu _r'+\frac{t^{r-1}}{( r-1 )!}\mu _{r+1}'+\ldots \) (2)
Putting \( t=0 \), in (2) we get
\( \bigg[\frac{d^2}{dt^2}M_X( t )\bigg]_{t=0}=\mu_2'\)
Similarly, continuing the differentiation with respect to t up to the rth order, we get
\( \bigg[\frac{d^r}{dt^r}M_X( t )\bigg]_{t=0}=\mu_r'\)
This completes the proof.
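The theorem can be illustrated numerically: the slope of M_X(t) at t = 0 should equal E[X]. A Python sketch using a central difference (the test distribution is the one from the example below, f(x) = (1/8)C(3, x)):

```python
from math import comb, exp

# Check numerically that the first derivative of M_X(t) at t = 0 gives the mean,
# using the distribution f(x) = (1/8) C(3, x) for x = 0, 1, 2, 3 (mean 1.5).
f = {x: comb(3, x) / 8 for x in range(4)}

def M(t):
    return sum(exp(t * x) * p for x, p in f.items())

eps = 1e-6
mu1 = (M(eps) - M(-eps)) / (2 * eps)  # central difference at t = 0
print(round(mu1, 6))  # 1.5
```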
Example
Given \( f( x )=\frac{1}{8} \begin{pmatrix} 3 \\ x \end{pmatrix} \) for \(x=0,1,2,3\). Find the moment generating function of X.
Solution
Given X is a random variable with probability function f(x), now moment generating function of X is
\( M_X( t )=E[ e^{tX} ] \)
or \( M_X( t )=\displaystyle \sum_Xe^{tx}.f( x )\)
or \( M_X( t )=\displaystyle \sum_Xe^{tx}.\frac{1}{8} \begin{pmatrix} 3 \\ x \\ \end{pmatrix} \)
or \( M_X( t )=\displaystyle \sum_{x=0}^3 e^{tx}.\frac{1}{8} \begin{pmatrix} 3 \\ x \\ \end{pmatrix} \)
or \( M_X( t )=\frac{1}{8} [\displaystyle e^{t.0}. \begin{pmatrix} 3 \\ 0 \end{pmatrix} + e^{t.1}. \begin{pmatrix} 3 \\ 1 \end{pmatrix}+ e^{t.2}. \begin{pmatrix} 3 \\ 2 \end{pmatrix}+ e^{t.3}. \begin{pmatrix} 3 \\ 3 \end{pmatrix} ] \)
or \( M_X( t )=\frac{1}{8} [1+3e^{t}+3e^{2t}+e^{3t}] \)
or \( M_X( t )=\frac{1}{8} [1+3\cdot 1^2\cdot e^t+3\cdot 1\cdot (e^t)^2+(e^t)^3] \)
or \( M_X( t )=\frac{1}{8} (1+e^t)^3 \)
Exercise: Given \(f( x )= e^{-x} \) for \( x > 0 \). Find the moment generating function of X and use it to determine \( \mu \) and \( \sigma^2 \).