Markov chain

nipperess · New member · Joined: Apr 22, 2006 · Messages: 4
I have a Markov chain problem that I have been having trouble with. The question is:

The Acme Company is testing its new Gimble-Making Machine. The machine creates Gimbles one at a time, and each one may be either usable or defective. After some testing, it is determined that the chance of a defective Gimble being produced is 5% provided the preceding Gimble produced was not defective; however, if the preceding Gimble was defective, the chance of the current Gimble being defective jumps to 55%. Moreover, you notice that these chances do not change regardless of the previous history of defective Gimbles produced. As such, you decide to model whether a Gimble is defective or not using a Markov chain; that is, you define Yt = 0 if the tth Gimble produced is usable and Yt = 1 if the tth Gimble produced is defective, and model the stochastic process {Yt}, t ∈ {1, 2, 3, ...}, as a Markov chain with transition matrix:

[0.95 0.05]
[0.45 0.55]

Finally, you do some further testing and determine that when the machine is turned on each morning, the chance that its first Gimble is defective is 10%.

a) Calculate E(Yt) for any time t.
b) For any s < t, calculate Cov(Ys, Yt).

I have just started learning about Markov chains but I don't understand how to do these questions. Any help would be much appreciated. Thanks
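
For anyone who wants to sanity-check the numbers, here is a rough Python/NumPy sketch (the function names are just for illustration, not part of the question). It propagates the starting distribution through the transition matrix to get P(Yt = 1), which equals E(Yt) because Yt only takes the values 0 and 1, and it uses E(YsYt) = P(Ys = 1, Yt = 1) = P(Ys = 1) * P(Yt = 1 | Ys = 1) for the covariance, where the conditional term is an entry of P^(t-s).

```python
import numpy as np

# Transition matrix from the question (rows = current state, columns = next state):
# state 0 = usable, state 1 = defective.
P = np.array([[0.95, 0.05],
              [0.45, 0.55]])

# Initial distribution: the first Gimble of the day is defective with probability 0.10.
pi1 = np.array([0.90, 0.10])

def marginal(t):
    """Distribution of Y_t, i.e. pi1 * P^(t-1)."""
    return pi1 @ np.linalg.matrix_power(P, t - 1)

def expect_Y(t):
    """E(Y_t) = P(Y_t = 1), since Y_t is a 0/1 indicator."""
    return marginal(t)[1]

def cov_Y(s, t):
    """Cov(Y_s, Y_t) for s < t.

    E(Y_s Y_t) = P(Y_s = 1, Y_t = 1) = P(Y_s = 1) * P(Y_t = 1 | Y_s = 1),
    and the conditional factor is the (1, 1) entry of P^(t-s).
    """
    ps = expect_Y(s)
    pt = expect_Y(t)
    return ps * np.linalg.matrix_power(P, t - s)[1, 1] - ps * pt

if __name__ == "__main__":
    print([round(expect_Y(t), 4) for t in range(1, 6)])      # E(Y_t) for t = 1..5
    print([round(cov_Y(1, 1 + k), 6) for k in range(1, 5)])  # Cov(Y_1, Y_{1+k})
```

Running this suggests E(Yt) stays at 0.10 for every t, the same as the 10% starting chance; that is the kind of pattern part a) is asking you to justify algebraically.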
 
Thanks for your help, Galactus. However, I am not quite following what you did. What was your expected value for Yt? I will try looking through my notes to see if I can work it out.
 
In most recent treatments of Markov chains, the rows sum to one, not the columns. This seems to be a specialized question. I am not at all sure of its terms. A statistician such as Roy Haas may be of help here.
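
For what it's worth, the matrix in the question already follows the row convention: each row sums to one. A quick check (assuming NumPy again):

```python
import numpy as np

P = np.array([[0.95, 0.05],
              [0.45, 0.55]])

# Under the row-stochastic convention, each row is a probability distribution
# over the next state, so every row should sum to 1.
print(P.sum(axis=1))  # -> [1. 1.]
```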
 
Can anyone help me please? I have tried doing it again, but I just don't know what I am doing.
 