Question
Your friend is a student in Commerce school; his area of concentration is finance. He wishes to understand the price movements of a financial asset. The market for the asset is open every day, and the asset's price is measured at the close of the market each day. He defines the state of the market at time t (denoted Xt) as state 1 if the asset's price at the close of the market on day t is higher than its price at the beginning of day t (which is the price at the end of day t-1), and as state 2 if the asset's price at the close of the market on day t is not higher than its price at the beginning of day t. He determines the following information about price movements:

- If the asset price increased yesterday and increased today, it will increase tomorrow with probability 0.9.
- If the asset price failed to increase yesterday but increased today, it will increase tomorrow with probability 0.6.
- If the asset price increased yesterday but failed to increase today, it will increase tomorrow with probability 0.5.
- If the asset price failed to increase yesterday and failed to increase today, it will increase tomorrow with probability 0.3.

(a) Using the definitions above, is the process {Xt} a Markov chain? Explain your answer.

(b) If the answer in part (a) is no, can you modify the definition of the states so that the resulting process is a Markov chain? If so, please define the states of the process and the associated transition matrix P.

Explanation / Answer
A Markov chain, named after Andrey Markov, is a mathematical system that transitions from one state to another among a finite or countable number of possible states. It is a random process usually characterized as memoryless: the next state depends only on the current state, not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property. Markov chains have many applications as statistical models of real-world processes.

A famous Markov chain is the so-called "drunkard's walk", a random walk on the number line where, at each step, the position changes by +1 or -1 with equal probability. From any position there are two possible transitions, to the next or previous integer, and the transition probabilities depend only on the current position, not on the manner in which the position was reached. For example, the transition probabilities from 5 to 4 and from 5 to 6 are both 0.5, and all other transition probabilities from 5 are 0. These probabilities are the same whether the system was previously in 4 or in 6. (A minimal simulation sketch of this walk appears after the answer below.)

Another example is the dietary habits of a creature that eats only grapes, cheese, or lettuce, and whose dietary habits conform to the following rules: it eats exactly once a day; if it ate cheese today, tomorrow it will eat lettuce or grapes with equal probability; if it ate grapes today, tomorrow it will eat grapes with probability 1/10, cheese with probability 4/10, and lettuce with probability 5/10; if it ate lettuce today, it will not eat lettuce again tomorrow but will eat grapes with probability 4/10 or cheese with probability 6/10. This creature's eating habits can be modeled with a Markov chain, since its choice tomorrow depends solely on what it ate today, not on what it ate yesterday or even farther in the past. One statistical property that can be calculated is the expected percentage, over a long period, of the days on which the creature eats grapes (a short computation of this statistic appears at the end of this answer).

A series of independent events (for example, a series of coin flips) satisfies the formal definition of a Markov chain, although the theory is usually applied only when the probability distribution of the next step depends non-trivially on the current state. Many other examples of Markov chains exist.

Applying the Markov property to the question: (a) No, {Xt} as defined is not a Markov chain. With only two states (state 1 = the price increased on day t, state 2 = it did not), the distribution of tomorrow's state depends on both today's and yesterday's movements: given an increase today, the probability of an increase tomorrow is 0.9 if the price also increased yesterday but only 0.6 if it did not. Since the next state depends on more than the current state, the Markov property fails. (The simulation sketch right after this answer illustrates the point.)

(b) Yes. Expand the state so that it records the last two days. Define Yt = (Xt-1, Xt), giving four states: state 1 = (increase, increase), state 2 = (increase, no increase), state 3 = (no increase, increase), and state 4 = (no increase, no increase). Knowing Yt pins down both of the days that the given probabilities condition on, so the next state depends only on the current one, and {Yt} is a Markov chain. Reading the four probabilities from the question, the transition matrix P is:

            to 1   to 2   to 3   to 4
    from 1   0.9    0.1    0      0
    from 2   0      0      0.5    0.5
    from 3   0.6    0.4    0      0
    from 4   0      0      0.3    0.7

For example, from state 3 (no increase yesterday, increase today) the price increases tomorrow with probability 0.6, which moves the pair to (increase, increase) = state 1; otherwise, with probability 0.4, it moves to (increase, no increase) = state 2.
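To make part (a) concrete, here is a minimal simulation sketch in Python (not part of the original question; the variable names and the 200,000-day horizon are illustrative choices). It generates a price-movement sequence from the four given probabilities and then estimates the chance of an increase tomorrow given an increase today, split by what happened yesterday. The two estimates differ, which is exactly why {Xt} alone cannot be Markov.

```python
import random

# P(price increases tomorrow | yesterday, today), with True = "increased".
# The four values are the probabilities given in the question.
P_UP = {(True, True): 0.9, (False, True): 0.6,
        (True, False): 0.5, (False, False): 0.3}

random.seed(0)
days = [True, True]  # arbitrary two-day starting history
for _ in range(200_000):
    days.append(random.random() < P_UP[(days[-2], days[-1])])

def estimate_up_given_up_today(yesterday_up):
    """Empirical P(up tomorrow | up today, yesterday as specified)."""
    outcomes = [days[t + 1] for t in range(1, len(days) - 1)
                if days[t] and days[t - 1] == yesterday_up]
    return sum(outcomes) / len(outcomes)

print(estimate_up_given_up_today(True))   # ~0.9
print(estimate_up_given_up_today(False))  # ~0.6
# The conditional distribution of tomorrow's state changes with yesterday's,
# so the two-state process {Xt} violates the Markov property.
```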
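Returning to the drunkard's walk example above, here is the promised sketch (the seed and the 20-step horizon are illustrative). Each step adds +1 or -1 with equal probability, so the next position depends only on the current one.

```python
import random

# Drunkard's walk: from any integer position, step +1 or -1 with equal
# probability; the transition distribution depends only on the current position.
random.seed(1)
position = 0
path = [position]
for _ in range(20):
    position += random.choice([-1, 1])
    path.append(position)
print(path)  # one 20-step sample path starting from 0
```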
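Finally, for the creature's diet chain, here is a short sketch (assuming NumPy; the state ordering and the power of 100 are illustrative choices) that computes the long-run fraction of days spent on each food by raising the transition matrix to a high power. For this chain the stationary distribution is uniform, so the creature eats grapes about one third of the days.

```python
import numpy as np

# Rows/columns ordered grapes, cheese, lettuce; T[i, j] = P(j tomorrow | i today),
# with the entries taken from the rules stated above.
T = np.array([
    [0.1, 0.4, 0.5],   # ate grapes today
    [0.5, 0.0, 0.5],   # ate cheese today
    [0.4, 0.6, 0.0],   # ate lettuce today
])

# For this irreducible, aperiodic chain, every row of T^n converges to the
# stationary distribution: the long-run fraction of days spent on each food.
pi = np.linalg.matrix_power(T, 100)[0]
print(dict(zip(["grapes", "cheese", "lettuce"], pi.round(4))))
# -> 1/3 each: over a long period, the creature eats grapes about 33% of days.
```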