Probability Calculus 2019/2020 Problem set 14

1. Consider a time-homogeneous Markov chain $(X_n)_{n \ge 0}$, such that

$$P = \begin{pmatrix}
0 & \tfrac{1}{2} & \tfrac{1}{2} & 0 \\
\tfrac{1}{4} & \tfrac{1}{2} & 0 & \tfrac{1}{4} \\
\tfrac{1}{2} & 0 & 0 & \tfrac{1}{2} \\
0 & \tfrac{1}{2} & \tfrac{1}{2} & 0
\end{pmatrix}.$$

a) Calculate $p_{12}(2)$.

b) Assuming $X_0 = 1$ (with probability 1), find the probability that the chain will reach state 2 before it reaches state 4.

c) Find $m_{32}$.

d) Is the chain periodic? Irreducible?

e) Find the stationary distribution.

f) Approximate the probability that $X_{10000} = 1$.

g) Find the mean recurrence time for state 1.
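
A minimal NumPy sketch for cross-checking the computational parts numerically (the matrix is copied from the statement above; solving $\pi P = \pi$ with a least-squares call is just one convenient way to obtain the stationary distribution, not the intended pen-and-paper method):

```python
import numpy as np

# Transition matrix from Problem 1
P = np.array([
    [0,   1/2, 1/2, 0  ],
    [1/4, 1/2, 0,   1/4],
    [1/2, 0,   0,   1/2],
    [0,   1/2, 1/2, 0  ],
])

# a) two-step transition probability p_12(2) is the (1,2) entry of P^2
print("p_12(2) =", (P @ P)[0, 1])

# e) stationary distribution: solve pi P = pi together with sum(pi) = 1
A = np.vstack([P.T - np.eye(4), np.ones(4)])
b = np.concatenate([np.zeros(4), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("stationary distribution pi =", pi)

# f) if the chain is irreducible and aperiodic, P(X_10000 = 1) is close to pi[0]
# g) for a positive recurrent chain, the mean recurrence time of state i is 1 / pi_i
print("mean recurrence time of state 1 =", 1 / pi[0])
```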

2. We roll a die until we obtain 16 or 66. What is the probability that we will obtain 16 first?
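
A quick Monte Carlo sketch for this problem, under the (assumed) reading that "16" means a roll of 1 immediately followed by a roll of 6, and "66" means two consecutive sixes:

```python
import random

def sixteen_first() -> bool:
    """Roll a fair die until the last two rolls form 16 or 66; True if 16 came first."""
    prev = random.randint(1, 6)
    while True:
        cur = random.randint(1, 6)
        if prev == 1 and cur == 6:
            return True
        if prev == 6 and cur == 6:
            return False
        prev = cur

trials = 200_000
print("estimated P(16 before 66):", sum(sixteen_first() for _ in range(trials)) / trials)
```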

3. A frog jumps from stone to stone. There are five stones, forming a regular pentagon (ABCDE).

Once on a stone, the frog chooses randomly (independently of previous choices) either the stone to the left or the stone to the right, each with probability $\tfrac{1}{2}$. The frog starts from stone A.

a) Calculate the probability that the frog will return to stone A before it reaches stone C.

b) Calculate the mean recurrence time for a stone and compare with the stationary distribution.
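
A small simulation sketch for both parts, modelling the stones as positions 0–4 on a cycle with A at position 0 and C at position 2 (this labelling follows the order A, B, C, D, E around the pentagon; by symmetry the choice of direction does not matter):

```python
import random

def step(pos: int) -> int:
    """One jump to a neighbouring stone on the 5-cycle, left or right with probability 1/2."""
    return (pos + random.choice([-1, 1])) % 5

def returns_to_A_before_C() -> bool:
    pos = step(0)                 # first jump away from A (position 0)
    while pos not in (0, 2):      # C is position 2
        pos = step(pos)
    return pos == 0

def recurrence_time_of_A() -> int:
    pos, steps = step(0), 1
    while pos != 0:
        pos, steps = step(pos), steps + 1
    return steps

trials = 100_000
print("P(return to A before C) ≈", sum(returns_to_A_before_C() for _ in range(trials)) / trials)
print("mean recurrence time of A ≈", sum(recurrence_time_of_A() for _ in range(trials)) / trials)
```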

4. We toss a coin until we obtain three heads in a row. Find the mean number of tosses.
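
A minimal simulation sketch for estimating this mean (the exact value is what the problem asks for; the sketch only produces a numerical estimate to compare against):

```python
import random

def tosses_until_HHH() -> int:
    """Toss a fair coin until three heads appear in a row; return the number of tosses used."""
    run, tosses = 0, 0
    while run < 3:
        tosses += 1
        run = run + 1 if random.random() < 0.5 else 0
    return tosses

trials = 100_000
print("estimated mean number of tosses:", sum(tosses_until_HHH() for _ in range(trials)) / trials)
```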

5. Smith is in jail and has 1 dollar; he can get out on bail if he has 4 dollars. A guard agrees to make a series of bets with him: if Smith bets A dollars, he wins A dollars with probability 0.4 and loses A dollars with probability 0.6. Which is the better strategy to get out on bail (i.e., to reach 4 dollars before losing all of his money):

a) timid strategy: to bet 1 dollar each time?

b) bold strategy: to bet as much as possible each time?
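
A simulation sketch comparing the two strategies; here "bold" is read as betting $\min(x, 4 - x)$, i.e. as much as possible without overshooting the 4-dollar target. This reading is an assumption, since the statement says only "as much as possible".

```python
import random

def gets_out(bet_size) -> bool:
    """Play until Smith reaches 4 dollars (True) or goes broke (False); win prob. 0.4 per bet."""
    x = 1
    while 0 < x < 4:
        bet = bet_size(x)
        x += bet if random.random() < 0.4 else -bet
    return x >= 4

trials = 200_000
timid = sum(gets_out(lambda x: 1) for _ in range(trials)) / trials
bold = sum(gets_out(lambda x: min(x, 4 - x)) for _ in range(trials)) / trials
print("timid strategy:", timid, " bold strategy:", bold)
```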


Some additional simple problems you should be able to solve on your own:

A. A time-homogeneous Markov chain $(X_n)_{n \ge 0}$ over the space $E = \{1, 2, 3\}$ has the following transition matrix:

$$P = \begin{pmatrix}
0 & \tfrac{1}{2} & \tfrac{1}{2} \\
\tfrac{1}{3} & \tfrac{1}{2} & \tfrac{1}{6} \\
0 & 1 & 0
\end{pmatrix}.$$

a) What is the probability that starting from state 2, after two steps the Markov chain will again be in state 2?

b) Assuming $X_0 = 1$ a.s., calculate the probability that the Markov chain will return to state 1 before it reaches state 3.

c) Assuming $X_0 = 3$ a.s., calculate the expected number of steps needed to reach state 1.

d) Find the stationary distribution. Is the Markov chain irreducible? Periodic?

e) Approximate the probability that $X_{10000} = 1$.

f) Calculate the mean recurrence times for the states and compare with the stationary distribution.
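
The same kind of numerical check as in the Problem 1 sketch applies here; in addition, part c) can be cross-checked by solving the standard first-step equations for the expected hitting time of state 1. A minimal sketch (states 1–3 are indexed 0–2 in the code):

```python
import numpy as np

# Transition matrix from Problem A
P = np.array([
    [0,   1/2, 1/2],
    [1/3, 1/2, 1/6],
    [0,   1,   0  ],
])

# a) probability of being in state 2 again two steps after starting in state 2
print("p_22(2) =", (P @ P)[1, 1])

# c) expected number of steps to reach state 1 from state 3: solve (I - Q) h = 1,
#    where Q is P restricted to the non-target states {2, 3}
Q = P[1:, 1:]
h = np.linalg.solve(np.eye(2) - Q, np.ones(2))
print("mean number of steps from state 3 to state 1:", h[1])

# d), e), f) stationary distribution and mean recurrence times, as in the Problem 1 sketch
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.concatenate([np.zeros(3), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("stationary distribution pi =", pi, " mean recurrence times =", 1 / pi)
```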
