



Continuation of Examples 1. If we wish to model a flip of a biased coin we alter the probabilities. For example, suppose we know that heads is three times as likely as tails. Then we take Ω = {H, T} with P{H} = 3/4 and P{T} = 1/4. Clarity might demand that we distinguish different probability measures notationally from each other. Another important point is that it is perfectly valid to assign a probability of zero to a nonempty event. Let the experiment consist of a roll of a pair of dice, as in the games of Monopoly or craps. We assume that the dice can be distinguished from each other, for example that one of them is blue and the other one is red. Here (a, b) is a so-called ordered pair, which means that the outcome (3, 5) is distinct from the outcome (5, 3). We flip a fair coin three times.


Let us encode the outcomes of the flips as 0 for heads and 1 for tails. We review simple counting techniques in Appendix C. In such cases Cartesian product spaces arise naturally as sample spaces. If A₁, A₂, …, Aₙ are sets, their Cartesian product consists of all n-tuples (a₁, …, aₙ) with each aᵢ ∈ Aᵢ. In terms of product notation, the sample space of Example 1. is {0, 1} × {0, 1} × {0, 1} = {0, 1}³. Random sampling. Sampling is choosing objects randomly from a given set. It can involve repeated choices or a choice of more than one object at a time. Dealing cards from a deck is an example of sampling. There are different ways of setting up such experiments which lead to different probability models. In this section we discuss three sampling mechanisms that lead to equally likely outcomes. This allows us to compute probabilities by counting. The required counting methods are developed systematically in Appendix C. Before proceeding to sampling, let us record a basic fact about experiments with equally likely outcomes.


Suppose the sample space Ω is a finite set and let #Ω denote the total number of possible outcomes. If all outcomes are equally likely, then for any event A we have P(A) = #A/#Ω. In this case probabilities of events can be found by counting. Terminology. It should be clear by now that random outcomes do not have to be equally likely. Look at Example 1. An ordered sample is built by choosing objects one at a time and by keeping track of the order in which these objects were chosen. After each choice we either replace (put back) or discard the just chosen object before choosing the next one. This distinction leads to sampling with replacement and sampling without replacement. An unordered sample is one where only the identity of the objects matters and not the order in which they came. We discuss the sampling mechanisms in terms of an urn with numbered balls.


An urn is a traditional device in probability (see Figure 1. ). You cannot see the contents of the urn. You reach in and retrieve one ball at a time without looking. We assume that the choice is uniformly random among the balls in the urn. Figure 1. shows three traditional mechanisms for creating experiments with random outcomes: an urn with balls, a six-sided die, and a coin. Sampling with replacement, order matters. Suppose the urn contains n balls numbered 1, 2, …, n. We retrieve a ball from the urn, record its number, and put the ball back into the urn. Putting the ball back into the urn is the replacement step. We carry out this procedure k times.


The outcome is the ordered k-tuple (s₁, s₂, …, sₖ) of numbers that we read off the sampled balls. Each sⱼ can be chosen in n different ways, so by Fact C. there are nᵏ possible outcomes. Let us illustrate this with a numerical example. Suppose our urn contains 5 balls labeled 1, 2, 3, 4, 5. Sample 3 balls with replacement and produce an ordered list of the numbers drawn. At each step we have the same 5 choices, so there are 5³ = 125 possible outcomes. Repeated coin flips and die rolls are also of this type: in these cases we are sampling from the set {H, T} or {1, 2, 3, 4, 5, 6}. Check that Examples 1. fit this framework. Sampling without replacement, order matters. Consider again the urn with n balls numbered 1, 2, …, n. We retrieve a ball from the urn, record its number, and put the ball aside, in other words not back into the urn. This is the without replacement feature. We repeat this procedure k times. Because of this, we clearly cannot have k larger than n. Consider again the urn with 5 balls labeled 1, 2, 3, 4, 5. Sample 3 balls without replacement and produce an ordered list of the numbers drawn.


The first ball can be chosen in 5 ways, the second ball in 4 ways, and the third ball in 3 ways, so by equation 1. there are 5 · 4 · 3 = 60 possible outcomes. This is a restatement of the familiar fact that a set of n elements can be ordered in n! different ways. That is, outcomes (1, 2, 5) and (2, 1, 5) were regarded as distinct. Next we suppose that we do not care about order, but only about the set {1, 2, 5} of elements sampled. This kind of sampling without replacement can happen when cards are dealt from a deck or when winning numbers are drawn in a state lottery.
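To sanity-check these two counting rules, here is a minimal Python sketch (not from the text; it enumerates the 5-ball urn of the examples above):

```python
from itertools import product, permutations

n, k = 5, 3
balls = range(1, n + 1)

# Ordered samples with replacement: n**k outcomes.
with_repl = list(product(balls, repeat=k))
assert len(with_repl) == n ** k            # 5**3 = 125

# Ordered samples without replacement: n(n-1)...(n-k+1) outcomes.
without_repl = list(permutations(balls, k))
assert len(without_repl) == 5 * 4 * 3      # 60

print(len(with_repl), len(without_repl))   # 125 60
```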


Since order does not matter, we can also imagine choosing the entire set of k objects at once instead of one element at a time. Notation is important here. The ordered triple (1, 2, 5) and the set {1, 2, 5} must not be confused with each other. Consequently in this context we must not mix up the notations ( ) and { }. As above, imagine the urn with n balls numbered 1, 2, …, n, and let Ω be the collection of k-element subsets of {1, 2, …, n}. (Do not be confused by the fact that an outcome ω is itself now a set of numbers.) The number of elements of Ω is given by the binomial coefficient (see Fact C. ). Another way to produce an unordered sample of k balls without repetitions would be to execute the following three steps: (i) randomly order all n balls, (ii) take the first k balls, and (iii) ignore their order. Let us verify that the probability of obtaining a particular selection {s₁, …, sₖ} is the same as before. The number of possible orderings in step (i) is n!.


The number of favorable orderings is k!(n − k)!: the chosen k balls can come first in any of k! orders, followed by the remaining n − k balls in any of (n − k)! orders. Then from the ratio of favorable to all outcomes, P(the selection is {s₁, …, sₖ}) = k!(n − k)!/n!, which is 1 over the binomial coefficient (n choose k). The description above contains a couple of lessons. (i) There can be more than one way to build a probability model to solve a given problem. But a warning is in order: once an approach has been chosen, it must be followed consistently. Mixing up different representations will surely lead to an incorrect answer. (ii) It may pay to introduce additional structure into the problem. The second approach introduced order into the calculation even though in the end we wanted an outcome without order. Suppose our urn contains 5 balls labeled 1, 2, 3, 4, 5. Sample 3 balls without replacement and produce an unordered set of 3 numbers as the outcome. Note that sampling with replacement does not fit this unordered setup: the outcome {2, 2, 3} does not make sense as a set of three numbers because of the repetition. This scenario will appear naturally in Example 6. Further examples. The next example contrasts all three sampling mechanisms.


Suppose we have a class of 24 children. We consider three different scenarios that each involve choosing three children. (a) Every day a random student is chosen to lead the class to lunch, without regard to previous choices. What is the probability that Cassidy was chosen on Monday and Wednesday, and Aaron on Tuesday? This is sampling with replacement to produce an ordered sample. Over a period of three days the total number of different choices is 24³. (b) Three students are chosen randomly to be class president, vice president, and treasurer.


No student can hold more than one office. What is the probability that Mary is president, Cory is vice president, and Matt treasurer? Imagine that we first choose the president, then the vice president, and then the treasurer. This is sampling without replacement to produce an ordered sample. We apply formula 1. (c) Three students are chosen randomly to form a team. What is the probability that the team consists of Shane, Heather and Laura? A team here means simply a set of three students. Thus we are sampling without replacement to produce a sample without order. There are (23 choose 2) teams that include Mary, since there are that many ways to choose the other two team members from the remaining 23 students.
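The three answers can be checked with a short sketch; the class size of 24 and the counting formulas are the ones above, and everything else is illustrative:

```python
from math import comb, perm

n = 24  # children in the class

# (a) ordered sample with replacement over three days
p_a = 1 / n**3                 # Cassidy, Aaron, Cassidy in that order

# (b) ordered sample without replacement (president, VP, treasurer)
p_b = 1 / perm(n, 3)           # 1/(24*23*22)

# (c) unordered sample without replacement (a team of three)
p_c = 1 / comb(n, 3)           # 1/C(24,3)

# teams that include Mary: choose her 2 teammates from the other 23
print(comb(23, 2), p_a, p_b, p_c)
```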


The next two examples illustrate this idea. Our urn contains 10 marbles numbered 1 to 10. We sample 2 marbles without replacement. What is the probability that our sample contains the marble labeled 1? Let A be the event that this happens. However we choose to count, the final answer P(A) will come from formula 1. Solution with order: sample the 2 marbles in order, as in 1. Solution without order: now the outcomes are subsets of size 2 from the set {1, 2, …, 10}. Both approaches are correct and of course they give the same answer. Example 1. Rodney packs 3 shirts for a trip. It is early morning so he just grabs 3 shirts randomly from his closet.


The closet contains 10 shirts: 5 striped, 3 plaid, and 2 solid colored ones. What is the probability that he chose 2 striped and 1 plaid shirt? To use the counting methods introduced above, the shirts need to be distinguished from each other. This way the outcomes are equally likely. So let us assume that the shirts are labeled, with the striped shirts labeled 1, 2, 3, 4, 5, the plaid ones 6, 7, 8, and the solid colored ones 9, 10. Since we are only interested in the set of chosen shirts, we can solve this problem with or without order. Without order, the favorable count comes from the number of ways of choosing 2 out of 5 striped shirts (shirts labeled 1, 2, 3, 4, 5) times the number of ways of choosing 1 out of 3 plaid shirts (numbered 6, 7, 8). With order, the event of interest, A, consists of those triples that have two striped shirts and one plaid shirt. The elements of A can be found using the following procedure: (i) choose the plaid shirt (3 choices), (ii) choose the position of the plaid shirt in the ordering (3 choices), (iii) choose the first striped shirt and place it in the first available position (5 choices), (iv) choose the second striped shirt (4 choices).
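A brief sketch comparing the unordered formula with a brute-force enumeration (the shirt labels are the ones assumed above):

```python
from math import comb
from itertools import combinations

# Unordered count: C(5,2) ways to pick the striped pair times C(3,1) plaids.
p_formula = comb(5, 2) * comb(3, 1) / comb(10, 3)

# Brute-force check over all C(10,3) equally likely triples.
striped, plaid = set(range(1, 6)), {6, 7, 8}
favorable = sum(
    1
    for s in combinations(range(1, 11), 3)
    if len(set(s) & striped) == 2 and len(set(s) & plaid) == 1
)
p_brute = favorable / comb(10, 3)
print(p_formula, p_brute)   # both 0.25
```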


Infinitely many outcomes. The next step in our development is to look at examples with infinitely many possible outcomes. Flip a fair coin until the first tails comes up. Record the number of flips required as the outcome of the experiment. What is the space Ω of possible outcomes? The number of flips needed can be any positive integer, hence Ω must contain all positive integers. We can also imagine the scenario where tails never comes up. This outcome is represented by ∞ (infinity). The outcome is k if and only if the first k − 1 flips are heads and the kth flip is tails, which gives P{k} = 2⁻ᵏ. As in Example 1. , these probabilities sum to 1 by the geometric series; if you forgot how to do that, turn to Appendix D. Notice that the example showed us something that agrees with our intuition, but is still quite nontrivial: the probability that we never see tails in repeated flips of a fair coin is zero. This phenomenon gets a different treatment in Example 1.
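A few lines of Python illustrate the geometric series numerically; the probabilities P{k} = 2⁻ᵏ are the ones just derived:

```python
# P(first tails on flip k) = 2**-k for a fair coin; these sum to 1,
# so the outcome "tails never appears" must carry probability 0.
partial = sum(2.0 ** -k for k in range(1, 60))
print(partial)            # 0.999999..., approaching 1
print(1 - partial)        # the mass left over for "never tails" -> 0
```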


We pick a real number uniformly at random from the closed unit interval [0, 1]. Let X denote the number chosen. What is the probability that X lies in a smaller interval [a, b] ⊂ [0, 1]? The natural assignment is P(a ≤ X ≤ b) = b − a, the length of the interval. We meet it again in Section 3. Consider a dartboard in the shape of a disk with a radius of 9 inches. The bullseye is a disk of diameter inch in the middle of the board. What is the probability that a dart randomly thrown at the board hits the bullseye? Let us assume that the dart hits the board at a uniformly chosen random location, that is, the dart is equally likely to hit anywhere on the board.


The sample space is a disk of radius 9. There is a significant difference between the sample space of Example 1. and the discrete spaces encountered earlier. A countably infinite sample space works just like a finite sample space: probabilities are computed by the sum P(A) = Σ_{ω∈A} P{ω}. Finite and countably infinite sample spaces are both called discrete sample spaces. By contrast, the unit interval, and any nontrivial subinterval of the real line, is uncountable. No integer labeling can cover all its elements. This is not trivial to prove and we shall not pursue it here. But we can see that it is impossible to define the probability measure of Example 1. by assigning probabilities to individual points: if every point carried the same positive probability, countably many points would already give infinite total probability, while if every point carried probability zero, summation could never produce the positive probabilities of intervals. This leads immediately to absurdity. The rules of probability have been violated. Examples 1. treated uniformly chosen random points. Later we develop tools for building models where the random point is not uniform. Note that the axiom requires additivity only for a sequence of pairwise disjoint events, and not for an uncountable collection of events. The interval [0, 1] is the union of all the singletons {x} over x ∈ [0, 1].


So there is no conceivable way in which the probability of [0, 1] comes by adding together probabilities of points. Consequences of the rules of probability. We record some consequences of the axioms in Definition 1. The discussion relies on basic set operations reviewed in Appendix B. Decomposing an event. The most obviously useful property of probabilities is the additivity property: if A₁, A₂, A₃, … are pairwise disjoint events, then P(A₁ ∪ A₂ ∪ A₃ ∪ ⋯) = P(A₁) + P(A₂) + P(A₃) + ⋯. Calculation of the probability of a complicated event A almost always involves decomposing A into smaller disjoint pieces whose probabilities are easier to find. The next two examples illustrate both finite and infinite decompositions. An urn contains 30 red, 20 green and 10 yellow balls. Draw two without replacement. What is the probability that the sample contains exactly one red or exactly one yellow?


To clarify the question, it means the probability that the sample contains exactly one red, or exactly one yellow, or both (inclusive or). This interpretation of or is consistent with unions of events. We approach the problem as we did in Example 1. We distinguish between the 60 balls, for example by numbering them, though the actual labels on the balls are not important. This way we can consider an experiment with equally likely outcomes. Having exactly one red or exactly one yellow ball in our sample of two means that we have one of the following color combinations: red-green, yellow-green or red-yellow. These are disjoint events, and their union is the event we are interested in. We used unordered samples, but we can get the answer also by using ordered samples.
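As a check, here is a sketch that counts the three disjoint color combinations (the 30-20-10 urn is the one in the example):

```python
from math import comb

red, green, yellow = 30, 20, 10
total = red + green + yellow          # 60 balls, draw 2 without replacement

# Disjoint color combinations giving "exactly one red or exactly one yellow":
rg = comb(red, 1) * comb(green, 1)    # red-green
yg = comb(yellow, 1) * comb(green, 1) # yellow-green
ry = comb(red, 1) * comb(yellow, 1)   # red-yellow

p = (rg + yg + ry) / comb(total, 2)
print(p)   # 1100/1770 = 110/177 ≈ 0.6215
```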


Peter and Mary take turns rolling a fair die. If Peter rolls 1 or 2 he wins and the game stops. If Mary rolls 3, 4, 5, or 6, she wins and the game stops. They keep rolling in turn until one of them wins. Suppose Peter rolls first. (a) What is the probability that Peter wins and rolls at most 4 times? To say that Peter wins and rolls at most 4 times is the same as saying that either he wins on his first roll, or he wins on his second roll, or he wins on his third roll, or he wins on his fourth roll. These alternatives are mutually exclusive.


This is a fairly obvious way to decompose the event. Peter wins on his kth roll if first both Peter and Mary fail k − 1 times and then Peter succeeds. Each roll has 6 possible outcomes. (b) What is the probability that Mary wins? If Mary wins, then either she wins on her first roll, or she wins on her second roll, or she wins on her third roll, and so on. There is no a priori bound on how long the game can last. Hence we have to consider all the infinitely many possibilities. This will be typical going forward. Once we have understood the general principles of building probability models, it is usually not necessary to define explicitly the sample space in order to do calculations. An event and its complement are both available for a calculation; sometimes one is much easier to compute than the other.
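Here is a sketch of both computations under the rules stated above (Peter wins on 1 or 2, Mary on 3-6, Peter rolls first); the closed form for (b) comes from summing the geometric series:

```python
p_peter, p_mary = 2/6, 4/6         # per-roll success probabilities
r = (1 - p_peter) * (1 - p_mary)   # a full round with no winner: (4/6)(2/6) = 2/9

# (a) Peter wins on his kth roll: k-1 empty rounds, then Peter succeeds.
p_a = sum(r ** (k - 1) * p_peter for k in range(1, 5))

# (b) Mary wins on her kth roll: Peter fails, Mary succeeds, after k-1
# empty rounds; the geometric series gives a closed form.
p_b = (1 - p_peter) * p_mary / (1 - r)     # (4/9)/(7/9) = 4/7

print(p_a, p_b)
```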


Roll a fair die 4 times. What is the probability that some number appears more than once? Any one of 1 through 6 can appear two, three or four times. But also two different numbers can appear twice, and we would have to be careful not to overcount. However, switching to the complement provides an easy way out: the complement event is that all four rolls show distinct numbers. This trick appears several times even in this section. There is an alternative way to write an intersection of sets: instead of A ∩ B we can write simply AB. Both will be used in the sequel. With this notation, identity 1. can be read off the Venn diagram representation of the two events A and B in Figure 1.
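The complement computation, together with a brute-force check over all 6⁴ outcomes, in a few lines (illustrative, not from the text):

```python
from itertools import product

# P(some number appears more than once in 4 rolls) via the complement:
p = 1 - (6 * 5 * 4 * 3) / 6 ** 4        # 1 - 360/1296 = 13/18

# Brute-force check over all 6**4 equally likely outcomes.
repeats = sum(1 for roll in product(range(1, 7), repeat=4) if len(set(roll)) < 4)
print(p, repeats / 6 ** 4)              # both 0.7222...
```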


Monotonicity of probability. Another intuitive and very useful fact is that a larger event must have larger probability: if A ⊂ B then P(A) ≤ P(B) (inequality 1. ; Figure 1. gives a Venn diagram representation of the events A ⊆ B). Here is another proof of the fact, first seen in Example 1. , that the probability of never seeing tails is zero. Let A be the event that we never see tails and Aₙ the event that the first n coin flips are all heads. Never seeing tails implies that the first n flips must be heads, so A ⊂ Aₙ and thus P(A) ≤ P(Aₙ) = 2⁻ⁿ for every n, which forces P(A) = 0. Inclusion-exclusion. We move to inclusion-exclusion rules that tell us how to compute the probability of a union when the events are not mutually exclusive. Look at the Venn diagram in Figure 1. Identity 1. states that P(A ∪ B) = P(A) + P(B) − P(A ∩ B). We solved this problem in Example 1. ; now apply inclusion-exclusion 1. instead. What is the probability that a randomly chosen individual from the town is not blond and does not have blue eyes?


We assume that each individual has the same probability to be chosen. In order to translate the information into the language of probability, we identify the sample space and relevant events. The sample space Ω is the entire population of the town. At this point we could forget the whole back story and work with the following problem: suppose the probabilities in 1. are given. Another way to get the same result is by applying 1. We can compute P(Aᶜ) and P(Bᶜ) from P(A) and P(B). Measurability. The notion of an event is actually significantly deeper than we let on in this chapter. But for discrete sample spaces there is no difficulty. This all seems very straightforward. But it turns out that there can be good reasons to use smaller collections ℱ of events. One reason is that a smaller ℱ can be useful for modeling purposes. This is illustrated in Example 1. A second reason is that technical problems with uncountable sample spaces prevent us from taking ℱ to be the power set.


We discuss this issue next. Recall from Example 1. the uniform probability measure on [0, 1]. However, the technical issues go beyond this. It turns out that it is impossible to define consistently the uniform probability measure for all subsets of [0, 1]. Hence ℱ must be something smaller than the power set of [0, 1], but what exactly? To put the theory on a sound footing, the axiomatic framework of Definition 1. requires the following of the collection ℱ: (i) Ω is in ℱ; (ii) if A is in ℱ, then so is its complement Aᶜ; (iii) if A₁, A₂, A₃, … is a sequence of events in ℱ, then their union is also in ℱ. Any collection ℱ of sets satisfying properties (i)-(iii) is called a σ-algebra or a σ-field.


The members of a σ-algebra are called measurable sets. The properties of a σ-algebra imply that countably many applications of the usual set operations to events are a safe way to produce new events. To create the σ-algebra for the probability model we desire, we typically start with some nice sets that we want in ℱ, and let the axioms determine which other sets must also be members of ℱ. When Ω is [0, 1], the real line, or some other interval of real numbers, the standard choice for ℱ is to take the smallest σ-algebra that contains all subintervals of Ω. The members of this ℱ are called Borel sets. It is impossible to give a simple description of all Borel sets.


But this procedure allows us to rule out pathological sets for which probabilities cannot be assigned. Construction of probability spaces belongs in a subject called measure theory and lies beyond the scope of this book. Similar issues arise with random variables. Not every function X from an uncountable Ω into ℝ can be a random variable. We have to require that {X ∈ B} is a member of ℱ whenever B is a Borel subset of ℝ. This defines the notion of a measurable function, and it is the measurable functions that can appear as random variables. In particular, this is implicit in Definition 1. Fortunately it turns out that all reasonable sets and functions encountered in practice are measurable.


Consequently this side of the theory can be completely ignored in an undergraduate probability course. Readers who wish to pursue the full story can turn to graduate textbooks on real analysis and probability, such as [Bil95], [Fol99], and [Dur10]. Collections of events as information. Another aspect of the collection ℱ of events is that it can represent information. This point becomes pronounced in advanced courses in probability. The next example gives a glimpse of the idea. Continuation of Example 1. Suppose the observer cannot tell the two dice apart. This means that we cannot separate outcomes (1, 3) and (3, 1) from each other. By a judicious choice of ℱ we can forbid the model from separating (1, 3) and (3, 1). The point of this example is that by restricting ℱ we can model the information available to the observer of the experiment, without changing Ω.


Section 1. Exercise 1. We roll a fair die twice. Describe a sample space Ω and a probability measure P to model this experiment. Let A be the event that the second roll is larger than the first. Find the probability P(A) that the event A occurs. Exercise 1. For breakfast Bob has three options: cereal, eggs or fruit. He has to choose exactly two items out of the three available. (a) Describe the sample space of this experiment. (b) Express A as a subset of the sample space. Exercise 1. (a) You flip a coin and roll a die. Describe the sample space of this experiment. (b) Now each of 10 people flips a coin and rolls a die. How many elements are in the sample space? (c) In the experiment of part (b), how many outcomes are in the event where nobody rolled a five?


How many outcomes are in the event where at least one person rolled a five? Exercise 1. Every day a kindergarten class chooses randomly one of the 50 state flags to hang on the wall, without regard to previous choices. We are interested in the flags that are chosen on Monday, Tuesday and Wednesday of next week. Example 2. A basket contains a litter of 6 kittens, 2 males and 4 females. A neighbor comes and picks 3 kittens randomly to take home with him. Let X be the number of male kittens in the group the neighbor chose. The probability mass function of X is computed in the sketch below. Contrast the experiment above with this one. A moment later she comes around again, picks up a random kitten, pets it, and puts it back in the basket with its siblings. She repeats this altogether three times. Let Y be the number of times she chose a male kitten to pet.
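A minimal sketch of both probability mass functions; that X is hypergeometric and Y binomial with success probability 2/6 follows from the sampling descriptions above:

```python
from math import comb

# X = number of males in a sample of 3 without replacement (hypergeometric).
pmf_X = {k: comb(2, k) * comb(4, 3 - k) / comb(6, 3) for k in range(3)}

# Y = number of males in 3 independent uniform picks with replacement,
# so Y is binomial with n = 3 and success probability 2/6.
p = 2 / 6
pmf_Y = {k: comb(3, k) * p**k * (1 - p) ** (3 - k) for k in range(4)}

print(pmf_X)   # {0: 0.2, 1: 0.6, 2: 0.2}
print(pmf_Y)
```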


She repeats this altogether three times. Let Y be the number of times she chose a male kitten to pet. You might even Example 2. The modest size required often comes as a surprise. In fact, the 67 2. Further topics on sampling and independence problem is sometimes called the birthday paradox, even though there is nothing paradoxical about it. To solve the problem we set up a probability model. We ignore leap years and assume that each of the possible birthdays are equally likely for each person. We assume that birthdays of different people are independent. Then we can restate the problem as follows. Take a random sample of size k with replacement from the set {1, 2,. Let pk be the probability that there is repetition in the sample. The complementary event that there is no repetition is easier to handle.


One way to think about it is the following: any particular pair of people shares a birthday with probability 1/365. Thus, if there are approximately 253 pairs of people, there should be a good probability of success. The following approximation will shed more light on why this guess is reasonable. Of course, the computation we just did is not fully rigorous because of the approximations. There are plenty of anecdotes related to the birthday problem. We recount two here. The first story is from Lady Luck: The Theory of Probability by Warren Weaver. Most of the people present thought it incredible that there was an even chance with only 22 or 23 persons.


Noticing that there were exactly 22 at the table, someone proposed a test. "We got all the way around the table without a duplicate birthday. But I am the 23rd person in the room, and my birthday is May 17, just like the general over there." The second story is about Johnny Carson, the former host of the Tonight Show. It is a good example of a common misinterpretation of the problem. However, instead of looking for a match anywhere in the crowd, he asked for the birthday of one particular guest and then checked if anybody in the audience shared the same birthday.


It turned out that there was no match even though the audience was sizable. Johnny took this as confirmation that the solution of the birthday problem is incorrect. Where did Johnny Carson go wrong? He performed the following, very different experiment. Fix a date (a number between 1 and 365) and let rₖ be the probability that at least one person in a group of k has a birthday on that fixed date; under the same assumptions, rₖ = 1 − (364/365)ᵏ, which grows much more slowly in k than pₖ. (The Weaver anecdote is from Warren Weaver, Lady Luck: The Theory of Probability, Dover Publications, New York.) This is one lesson from the Nobel Prize winning investigations into human decision-making by Daniel Kahneman and Amos Tversky [Kah11]. Below we describe two cases of faulty reasoning with serious real-world consequences. The Sally Clark case is a famous wrongful conviction in England.


She was charged with murder after her two infant sons died unexpectedly. At the trial an expert witness made the following calculation. Population statistics indicated that an unexplained infant death in a family like the Clarks is a rare event; treating the two deaths as independent and multiplying the chances together gave a probability of about 1 in 72 million. The jury convicted her. Much went wrong in the process and the reader is referred to [SC13] for the history. Let us consider these two points in turn. (i) The assumption of independence that led to the 1 in 72 million probability can readily fail due to unknown genetic or environmental factors. Here is a simple illustration of how that can happen. Suppose a disease appears in 0. of the population. Suppose further that this disease comes from a genetic mutation passed from father to son with probability 0. What is the probability that both sons of a particular family have the disease? If the disease strikes completely at random, the answer is 0. However, the illness of the first son implies that the father carries the mutation. Hence the conditional probability that the second son also falls ill is 0.


Thus the correct answer is 0. (ii) The second error is the interpretation of the 1 in 72 million figure. To put a different view on this point, consider a lottery. The probability that a particular ticket wins the jackpot is astronomically small, yet there are plenty of lottery winners, and we do not automatically suspect cheating just because of the low odds. In a large enough population even a low probability event is likely to happen to somebody. The second case of faulty reasoning comes from the financial crisis of 2008. The crisis originated with mortgages. We present a brief explanation to highlight an independence assumption that failed when circumstances changed. Mortgages are loans that banks make to people for buying homes. Investment banks buy up large numbers of mortgages and bundle them into financial assets called mortgage-backed securities.


These securities are sold to investors. As the original borrowers pay interest and principal on their loans, the owner of the security receives income. The investor might be, for example, a pension fund that invests contributions from current workers and pays out pensions to retirees. Some homeowners default, that is, fail to pay back their loans. That obviously hurts the owner of the security. But defaults were assumed to be random, unrelated, rare events. Under this assumption only a tiny proportion of mortgages default and the mortgage-backed security is a safe investment. Yet the danger in this assumption is evident: if a single event brings down many mortgages, these securities are much riskier than originally thought. Developments in the early 2000s undermined this assumption of safety. A housing bubble drove home values high and enticed homeowners to borrow money against their houses.


To increase business, banks lowered their standards for granting loans. Investors bought and sold mortgage-backed securities and other derivative securities related to mortgages without fully understanding the risks involved. Eventually home values fell and interest rates rose. Borrowers began to default on their loans. A cycle of contagion ensued. Banks loaded with bad mortgages lost money and could no longer extend credit to businesses. Businesses suffered and people lost jobs. People out of jobs could not pay back their loans. Hence more mortgages defaulted and more mortgage-backed securities declined in value. The assumption of independent and rare defaults was no longer valid because events in the overall economy were causing large numbers of mortgages to default.


Ultimately the crisis led to the worst global economic downturn since the 1930s. Many actors made mistakes in this process, both in industry and in government. Complex mathematical models were used to analyze mortgage-backed securities. Decision-makers are often not the people who understand the limitations of these models. Warnings from experts were not heeded because there were large profits to be made. As we saw in Section 1. , infinite sequences of trials require some care. Hence, in the precise version of Definition 2. , the independence requirement is imposed on finitely many trials at a time. Sample space for infinitely many trials. The n-fold Cartesian product space of 2. generalizes to a space of infinite sequences: a sample point ω is now an infinite sequence of 0s and 1s. Entry sᵢ of the sequence represents the outcome of the ith trial. In the finite n case we could define the probability measure P by simply giving formula 2.


For infinite sequences this will not work because each sequence ω has either infinitely many 0s or infinitely many 1s (or both), and hence the right-hand side of 2. would be an infinite product that vanishes. From this you can already surmise that the sequence space is an example of an uncountable space. Recall here the discussion on countably infinite and uncountable sample spaces at the end of Section 1. For fair coin flips there is an attractive alternative construction: pick a point uniformly at random from [0, 1] and read off its binary digits. The probability that this point lies in a specific subinterval of [0, 1] is the length of the interval. This is exactly as in Example 1. Furthermore, fixing the first n digits a₁, …, aₙ confines the point to an interval of length 2⁻ⁿ. This implies that the digits X₁, X₂, X₃, … behave exactly like independent fair coin flips. Exercises. We start with some warm-up exercises arranged by section. Section 2. Exercise 2. We roll two dice. Find the conditional probability that at least one of the numbers is even, given that the sum is 8. Exercise 2. A fair coin is flipped three times. What is the probability that the second flip is tails, given that there is at most one tails among the three flips?
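For the coin-flip exercise just stated, the conditional probability can be checked by enumerating the eight equally likely outcomes (an illustrative sketch, not the book's solution):

```python
from itertools import product

outcomes = list(product("HT", repeat=3))             # 8 equally likely outcomes

given = [w for w in outcomes if w.count("T") <= 1]   # at most one tails
event = [w for w in given if w[1] == "T"]            # second flip is tails

print(len(event) / len(given))   # 1/4
```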


Exercise 2. What is the probability that a randomly chosen number between 1 and is divisible by 3, given that the number has at least one digit equal to 5? Exercise 2. The first urn contains two balls labeled 1 and 2. The second urn contains three balls labeled 3, 4 and 5. We choose one of the urns at random with equal probability and then sample one ball uniformly at random from the chosen urn. What is the probability that we picked the ball labeled 5? Exercise 2. The first urn contains three balls labeled 1, 2 and 3. The second urn contains four balls labeled 2, 3, 4 and 5. Again we choose one of the urns at random with equal probability and then sample one ball uniformly at random from the chosen urn. What is the probability that we picked a ball labeled 2? Exercise 2. When Alice spends the day with the babysitter, there is a 0. probability that she turns on the TV.


Her little sister Betty cannot turn the TV on by herself. But once the TV is on, Betty watches with probability 0. Tomorrow the girls spend the day with the babysitter. (a) What is the probability that both Alice and Betty watch TV tomorrow? (b) What is the probability that Betty watches TV tomorrow? (c) What is the probability that only Alice watches TV tomorrow? Define events precisely and use the product rule and the law of total probability. Exercise 2. Find P(AᶜB). Exercise 2. We shuffle a deck of 52 cards and deal three cards without replacement. Find the probability that the first card is a queen, the second is a king and the third is an ace. Exercise 2. We return to the setup of Exercise 2. Suppose that ball 3 was chosen. What is the probability that it came from the second urn?
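Assuming the setup is the earlier two-urn exercise (urn 1 = {1, 2, 3}, urn 2 = {2, 3, 4, 5}, urn chosen uniformly), a Bayes' rule sketch:

```python
# Bayes' rule: P(urn | ball 3) ∝ P(urn) * P(ball 3 | urn).
prior = {1: 0.5, 2: 0.5}
likelihood_ball3 = {1: 1/3, 2: 1/4}    # P(ball 3 | urn)

total = sum(prior[u] * likelihood_ball3[u] for u in prior)
posterior = {u: prior[u] * likelihood_ball3[u] / total for u in prior}
print(posterior[2])   # 3/7 ≈ 0.4286
```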


Exercise 2. I have a bag with 3 fair dice. One is 4-sided, one is 6-sided, and one is sided. I reach into the bag, pick one die at random and roll it. The outcome of the roll is 4. What is the probability that I pulled out the 6-sided die? Exercise 2. The Acme Insurance company has two types of customers, careful and reckless. A careful customer has an accident during the year with probability 0. A reckless customer has an accident during the year with probability 0. Suppose a randomly chosen customer has an accident this year. What is the probability that this customer is one of the careful customers?


Exercise 2. We choose a number from the set {1, 2, 3, …, }. For each of the following choices decide whether the two events in question are independent or not. Note that 1 is not a prime number. Exercise 2. Decide whether A and B are independent or not. Exercise 2. Let A and B be two disjoint events. Under what condition are they independent? Exercise 2. Every morning Ramona misses her bus with probability 1/10, independently of other mornings. What is the probability that next week she catches her bus on Monday, Tuesday and Thursday, but misses her bus on Wednesday and Friday? Exercise 2. We flip a fair coin three times. Check whether the events A₁, A₂, A₃ are independent or not. Exercise 2. We choose a number from the set {10, 11, 12, …, 99}.


(a) Let X be the first digit and Y the second digit of the chosen number. Show that X and Y are independent random variables. (b) Let X be the first digit of the chosen number and Z the sum of the two digits. Show that X and Z are not independent. Exercise 2. We have an urn with balls labeled 1, …, . Two balls are drawn. Let X₁ be the number of the first ball drawn and X₂ the number of the second ball drawn. (a) The balls are drawn with replacement. (b) The balls are drawn without replacement. (c) Does the answer to either (a) or (b) prove something about the independence of the random variables X₁ and X₂? Exercise 2. A fair die is rolled repeatedly. Use precise notation of probabilities of events and random variables for the solutions to the questions below. (a) Write down a precise sum expression for the probability that the first five rolls give a three at most two times.


(b) Calculate the probability that the first three does not appear before the fifth roll. (c) Calculate the probability that the first three appears before the twentieth roll but not before the fifth roll. Exercise 2. Jane must get at least three of the four problems on the exam correct to get an A. She also assumes that the results on different problems are independent. (a) What is the probability she gets an A? (b) If she gets the first problem correct, what is the probability she gets an A? Exercise 2. Ann and Bill play rock-paper-scissors. Each has a strategy of choosing uniformly at random out of the three possibilities every round, independently of the other player and the previous choices. (a) What is the probability that Ann wins the first round? Remember that the round could end in a tie.
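For part (a), a quick Monte Carlo sketch under the stated strategy of uniform, independent choices (the win probability 1/3 can of course also be found by symmetry):

```python
import random

random.seed(0)
trials = 100_000
ann_wins = 0
beats = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

for _ in range(trials):
    a = random.choice(list(beats))
    b = random.choice(list(beats))
    if beats[a] == b:        # Ann's choice beats Bill's
        ann_wins += 1

print(ann_wins / trials)     # ≈ 1/3 (ties and losses make up the rest)
```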


Exercise 2. (a) Find the probability that there will be no accidents at this intersection during the next 7 days. (b) Find the probability that next September there will be exactly 2 days with accidents. (c) Today was accident free. Find the probability that there is no accident during the next 4 days, but there is at least one by the end of the 10th day. Exercise 2. A team of three is chosen randomly from an office with 2 men and 4 women. Let X be the number of women on the team. (a) Identify the probability distribution of X by name. (b) Give the probability mass function of X. Exercise 2. I have a bag with 3 fair dice: a 4-sided die, a 6-sided die, and a sided die. I reach into the bag, pick one die at random and roll it twice. The first roll is a 3 and the second roll is a 4.


The rolls of a given die are independent. Exercise 2. Suppose events A, B, C, D are mutually independent. Show that events AB and CD are independent. Justify each step from the definition of mutual independence. Exercise 2. Urn I has 1 green and 2 red balls. Urn II has 2 green and 1 yellow ball. (a) Pick an urn uniformly at random, and then sample one ball from this urn. What is the probability that the ball is green? (b) After sampling the ball in part (a) and recording its color, put it back into the same urn. Then repeat the entire experiment: choose one of the urns uniformly at random and sample one ball from this urn. What is the probability that we picked a green ball in both the first and the second experiment? (c) Pick an urn uniformly at random, and then sample two balls with replacement from this same urn. What is the probability that both picks are green?


(d) Sample one ball from each urn. Exercise 2. We play a card game where we receive 13 cards at the beginning out of the deck of 52. We play 50 games one evening. For each of the following random variables identify the name and the parameters of the distribution. (b) The number of games in which I receive at least one ace during the evening. (c) The number of games in which all my cards are from the same suit. (d) The number of spades I receive in the 5th game. Further exercises. Exercise 2. There is a softball game at the company picnic, and Uncle Bob comes to bat. His probability of hitting a single is 0. Once on base, his probability of scoring after hitting a single is 0. What is the probability that Uncle Bob will be able to score in this turn?


Exercise 2. Assume that 1/3 of all twins are identical twins. You learn that Miranda is expecting twins, but you have no other information. (a) Find the probability that Miranda will have two girls. (b) You learn that Miranda gave birth to two girls. What is the probability that the girls are identical twins? Explain any assumptions you make. Exercise 2. Suppose a family has two children of different ages. We assume that all combinations of boys and girls are equally likely. (a) Formulate precisely the sample space and probability measure that describes the genders of the two children in the order in which they are born. (b) Suppose we learn that there is a girl in the family. (Precisely: we learn that there is at least one girl.) What is the probability that the other child is a boy? (c) Suppose we see the parents with a girl, and the parents tell us that this is their younger child. What is the probability that the older child we have not yet seen is a boy?


Exercise 2. Suppose a family has three children of different ages. (a) Formulate precisely the sample space and probability measure that describes the genders of the three children in the order in which they are born. (b) Suppose we see the parents with two girls. Assuming we have no other information beyond that at least two of the children are girls, what is the probability that the child we have not yet seen is a boy? What is the probability that the oldest child we have not yet seen is a boy? Exercise 2. We choose an urn randomly and then draw a ball from it. (a) What is the probability that we draw a red ball?


(b) Find the conditional probability that we chose urn k, given that we drew a red ball. Exercise 2. You play the following game against your friend. You have two urns and three balls. One of the balls is marked. You get to place the balls in the two urns any way you please, including leaving one urn empty. Your friend will choose one urn at random and then draw a ball from that urn. (If he chose an empty urn, there is no ball.) His goal is to draw the marked ball. (a) How would you arrange the balls in the urns to minimize his chances of drawing the marked ball? (b) How would your friend arrange the balls in the urns to maximize his chances of drawing the marked ball? (c) Repeat (a) and (b) for the case of n balls with one marked ball. Exercise 2. We shuffle a deck of cards and deal two cards. Find the probability that the first card is a queen and the second is a spade. Exercise 2. A bag contains three kinds of dice: seven 4-sided dice, three 6-sided dice, and two sided dice. A die is drawn from the bag and then rolled, producing a number.


For example, the sided die could be chosen and rolled, producing the number. Assume that each die is equally likely to be drawn from the bag. (a) What is the probability that the roll gave a six? (b) What is the probability that a 6-sided die was chosen, given that the roll gave a six? Exercise 2. An urn contains one 6-sided die, two 8-sided dice, three 10-sided dice, and four 12-sided dice. One die is chosen at random and then rolled. (a) What is the probability that the roll gave a five? (b) What is the probability that the die rolled was 12-sided, given that the outcome of the roll was seven? Exercise 2. We choose one of the words in the following sentence uniformly at random and then choose one of the letters of that word, again uniformly at random: SOME DOGS ARE BROWN. (a) Find the probability that the chosen letter is R. (b) Let X denote the length of the chosen word. Determine the probability mass function of X.


The decomposition idea works just as well for conditional probabilities: if B₁, …, Bₙ are pairwise disjoint events whose union is Ω, then P(A | C) = Σₖ P(ABₖ | C). (e) Given that the chosen letter is R, what is the probability that the chosen word was BROWN? Exercise 2. We choose one of the words in the following sentence uniformly at random and then choose one of the letters of that word, again uniformly at random: THE QUICK BROWN FOX JUMPED OVER THE GATE. (a) Find the probability that the chosen letter is O. Exercise 2. Incoming students at a certain school take a mathematics placement exam. The possible scores are 1, 2, 3, and 4. (a) What is the probability that a randomly selected student from the incoming class will become a mathematics major? (b) Suppose a randomly selected student from the incoming class turns out to be a mathematics major. What is the probability that he or she scored a 4 on the placement exam?
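For the SOME DOGS ARE BROWN exercise above, the law of total probability and Bayes' rule can be sketched directly (illustrative code, not the book's solution):

```python
words = ["SOME", "DOGS", "ARE", "BROWN"]

# P(letter = R) by conditioning on the chosen word (law of total probability).
p_R = sum((1 / len(words)) * w.count("R") / len(w) for w in words)

# P(word = BROWN | letter = R) by Bayes' rule.
p_R_and_brown = (1 / len(words)) * "BROWN".count("R") / len("BROWN")
print(p_R, p_R_and_brown / p_R)   # 2/15 ≈ 0.1333 and 3/8
```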


a What is the probability that a randomly selected student from the incoming class will become a mathematics major? b Suppose a randomly selected student from the incoming class turns out to be a mathematics major. What is the probability that he or she scored a 4 on the placement exam? Two factories I and II produce phones for brand ABC. You purchase a brand ABC phone, and assume this phone is randomly chosen from all ABC phones. Suppose the phone is not defective. What is the probability that it came from factory II? Urn A contains 2 red and 4 white balls, and urn B contains 1 red and 1 white ball. A ball is randomly chosen from urn A and put into urn B, and a ball is then chosen from urn B. What is the conditional probability that the transferred ball was white given that a white ball is selected from urn B?


Exercise 2. We have an urn with 3 green balls and 2 yellow balls. We pick a sample of two without replacement and put these two balls in a second urn that was previously empty. Next we sample two balls from the second urn with replacement. (a) What is the probability that the first sample had two balls of the same color? (b) What is the probability that the second sample had two balls of the same color? (c) Given that the two balls chosen from the second urn have the same color, what is the probability that the second urn contains two balls of the same color? Exercise 2. We have two bins. The first bin has 6 blue marbles and 4 yellow marbles. The second bin has 3 blue marbles and 4 yellow marbles. We choose a bin at random and then draw a marble from that bin. (a) If the marble we select is yellow, what is the probability that we chose the first bin?


(b) Now suppose we put the yellow marble from (a) back in the bin it was drawn from and then draw a marble from the same bin. This marble is also yellow. What is the probability that we chose the first bin? Exercise 2. The Packers and the Bears are professional football teams. Suppose that the probability of a 7-year-old fan of the Chicago Bears going to a game in a given year is 0. A 7-year-old is selected randomly from Madison. (a) What is the probability that the selected 7-year-old goes to a professional football game next year? (b) If the selected 7-year-old does go to a game in the coming year, what is the probability that this 7-year-old was a fan of the Packers?
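Returning to the two-bin marble exercise, parts (a) and (b) amount to updating the posterior once and then again; a sketch under the stated counts:

```python
# Two bins: bin 1 has 6 blue + 4 yellow, bin 2 has 3 blue + 4 yellow.
p_yellow = {1: 4 / 10, 2: 4 / 7}
posterior = {1: 0.5, 2: 0.5}          # uniform prior over bins

for _ in range(2):                    # observe yellow twice from the same bin
    total = sum(posterior[b] * p_yellow[b] for b in posterior)
    posterior = {b: posterior[b] * p_yellow[b] / total for b in posterior}
    print(posterior[1])

# After one yellow: 7/17 ≈ 0.4118; after the second: 49/149 ≈ 0.3289
```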


Exercise 2. Consider three boxes with numbered balls in them. Box A contains six balls numbered 1, …, 6. Box B contains twelve balls numbered 1, …, 12. Finally, box C contains four balls numbered 1, …, 4. One ball is selected from each box uniformly at random. (b) What is the probability that the ball chosen from box B is 12, if the arithmetic mean of the three balls selected is exactly 7? Exercise 2. A medical trial of 80 patients is testing a new drug to treat a certain condition. This drug is expected to be effective for each patient with probability p, independently of the other patients. You personally have two friends in this trial. Given that the trial is a success for 55 patients, what is the probability that it was successful for both of your two friends? Exercise 2. A crime has been committed in a town of , inhabitants. The police are looking for a single perpetrator, believed to live in town.


DNA evidence is found at the crime scene. Before the DNA evidence, Kevin was no more likely to be the guilty person than any other person in town. What is the probability that Kevin is guilty after the DNA evidence appeared? Reason as in Example 2. Exercise 2. Let X be a discrete random variable with possible values {0, 1, 2, …}. (a) Verify that the above is a probability mass function. Exercise 2. The king has chosen one of three prisoners uniformly at random to be pardoned tomorrow, while the two unlucky ones head for the gallows. The guard already knows who is to be pardoned. Prisoner A begs the guard to name someone other than A himself who will be executed. (a) After receiving this information, does A still have probability 1/3 of being pardoned? (b) Prisoner A whispers his new information to prisoner C. Prisoner C learned conditional probability before turning to a life of crime and is now hopeful.


What is his new probability of being pardoned? Exercise 2. Decide whether A, B, and C are mutually independent. Exercise 2. Suppose that P(A | B) and P(A | Bᶜ) are known. (a) Is it possible to calculate P(A) from this information? Either declare that it is not possible, or find the value of P(A). (b) Are A and B independent, not independent, or is it impossible to determine? Exercise 2. Peter and Mary take turns throwing one dart at the dartboard. Peter hits the bullseye with probability p and Mary hits the bullseye with probability r. Whoever hits the bullseye first wins. Suppose Peter throws the first dart. (a) What is the probability that Mary wins?


(b) Let X be the number of times Mary throws a dart in the game. Consider a trait with a dominant gene d and a recessive gene r. The pure dominant (dd) and the hybrid (rd) are alike in appearance. Children receive one gene from each parent. If, with respect to a particular trait, two hybrid parents have a total of four children, what is the probability that exactly three of the four children have the outward appearance of the dominant gene? Solution: If we assume that each child is equally likely to inherit either of two genes from each parent, the probabilities that the child of two hybrid parents will have dd, rr, or rd pairs of genes are, respectively, 1/4, 1/4, 1/2. Hence, because an offspring will have the outward appearance of the dominant gene if its gene pair is either dd or rd, it follows that the number of such children is binomially distributed with parameters (4, 3/4). If we let X be the number of trials required until the first success, then X is said to be a geometric random variable with parameter p. An important property of the Poisson random variable is that it may be used to approximate a binomial random variable when the binomial parameter n is large and p is small.


Then, writing λ = np, P(X = k) = n!/((n − k)! k!) · pᵏ(1 − p)ⁿ⁻ᵏ ≈ e^(−λ) λᵏ/k!. Calculate the probability that there is at least one error on this page. If we know from past experience that, on the average, 3. Solution: If we think of the gram of radioactive material as consisting of a large number n of atoms, each of which has probability 3. Let X be such a random variable. In words, Equation 2. A somewhat more intuitive interpretation of the density function may be obtained from Equation 2. : the probability that X will be contained in an interval of length ε around the point a is approximately εf(a). From this, we see that f(a) is a measure of how likely it is that the random variable will be near a. There are several important continuous random variables that appear frequently in probability theory.
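A short numeric comparison of binomial probabilities with their Poisson approximation (the parameters n = 500, p = 0.01 are illustrative, not from the text):

```python
from math import comb, exp, factorial

n, p = 500, 0.01          # many trials, small success probability
lam = n * p               # λ = np = 5

for k in range(4):
    binom = comb(n, k) * p**k * (1 - p) ** (n - k)
    poisson = exp(-lam) * lam**k / factorial(k)
    print(k, round(binom, 5), round(poisson, 5))
```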


The remainder of this section is devoted to a study of certain of these random variables. Similarly, we shall denote the density of Z by f_Z(·). Corollary 2. follows by Proposition 2. Example 2. Find Var(X). Solution: Recalling (see Example 2. ) the value of E[X], we compute the variance from E[X²] − (E[X])². Solution: As previously noted in Example 2. However, we are often interested in probability statements concerning two or more random variables. A variation of Proposition 2. : the same result holds in the discrete case and, combined with the corollary in Section 2. , gives the corresponding result to Equation 2. Solution: Let X denote the sum obtained. The hats are mixed up and each man randomly selects one. Find the expected number of men who select their own hats. Compute the expected number of different types that are contained in a set of 10 coupons. Solution: Let X denote the number of different types in the set of 10 coupons.


That Equation 2. holds in general can be shown similarly. An important result concerning independence is the following. Proposition 2. Suppose that X and Y are jointly continuous. Let us consider now the special case where X and Y are indicator variables for whether or not the events A and B occur. In general it can be shown that a positive value of Cov(X, Y) is an indication that Y tends to increase as X does, whereas a negative value indicates that Y tends to decrease as X increases. (b) Find Cov(X, Y). Properties of Covariance. For any random variables X, Y, Z and constant c: 1. Cov(X, X) = Var(X), 2. Cov(X, Y) = Cov(Y, X), 3. Cov(cX, Y) = c Cov(X, Y), 4. Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z). The following proposition shows that the covariance between the sample mean and a deviation from that sample mean is zero.


It will be needed in Section 2. We are interested in estimating p, the fraction of the population that is for the proposition, by randomly choosing and then determining the positions of n members of the population. In such situations as described in the preceding, it is common to use the fraction of the sampled population that is in favor of the proposition as an estimator of p. Let us now compute its mean and variance. Identify a person who favors the proposition with a white ball and one against with a black ball. Suppose first that X and Y are continuous, X having probability density f and Y having probability density g. By differentiating Equation 2. we obtain the density of the sum. Solution: From Equation 2.


In general, consider n random variables X₁, X₂, …, Xₙ. If we let X₍ᵢ₎ denote the ith smallest of these random variables, then X₍₁₎, …, X₍ₙ₎ are called the order statistics. To obtain the distribution of X₍ᵢ₎, note that X₍ᵢ₎ will be less than or equal to x if and only if at least i of the n random variables X₁, …, Xₙ are less than or equal to x. Therefore, since there are n!/(k!(n − k)!) ways to choose which k of the variables fall at or below x, summing over k ≥ i yields the distribution of X₍ᵢ₎. It is sometimes necessary to obtain the joint distribution of the random variables Y₁ and Y₂ that arise as functions of X₁ and X₂. Assume that the functions g₁ and g₂ satisfy the following conditions: 1. A proof of Equation 2. proceeds by differentiation; that the result of this differentiation will be equal to the right-hand side of Equation 2. can be verified directly. If the conditional distribution of X₁, … An individual whose level of exposure to a certain pathogen is x will contract the disease caused by this pathogen with probability P(x). If the exposure level of a randomly chosen member of the population has probability density function f, determine the conditional probability density of the exposure level of that member given that he or she (a) has the disease,


(b) does not have the disease. (c) Show that when P(x) increases in x, then the ratio of the density of part (a) to that of part (b) also increases in x. Consider Example 3. Let N denote the total number of doors selected before the miner reaches safety. Again let X denote the time when the miner reaches safety. (a) Give an identity that relates X to N and the Tᵢ. (b) What is E[N]? (c) What is E[T_N]? (d) Using the preceding, what is E[X]? Suppose that independent trials, each of which is equally likely to have any of m possible outcomes, are performed until the same outcome occurs k consecutive times. That is, they believe that these digits have all the appearance of being independent choices from a distribution that is equally likely to be any of the digits from 0 through 9.


Possible evidence against this hypothesis is the fact that starting around the 24 millionth digit there is a run of nine successive 7s. Is this information consistent with the hypothesis of a uniform distribution? However, it can be shown that under the uniformity assumption the standard deviation of N will be approximately equal to the mean. As a result, the observed value is approximately 0. A coin having probability p of coming up heads is successively flipped until two of the most recent three flips are heads. Let N denote the number of flips. Find E[N]. A coin, having probability p of landing heads, is continually flipped until at least one head and one tail have been flipped. (a) Find the expected number of flips needed. (b) Find the expected number of flips that land on heads. (c) Find the expected number of flips that land on tails. (d) Repeat part (a) in the case where flipping is continued until a total of at least two heads and one tail have been flipped.
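For part (a) of the head-and-tail exercise, a Monte Carlo estimate can be compared against the first-step-analysis answer (the value of p below is illustrative):

```python
import random

def flips_until_both(p: float, rng: random.Random) -> int:
    """Flip a p-coin until at least one head and one tail have appeared."""
    seen = set()
    n = 0
    while len(seen) < 2:
        seen.add(rng.random() < p)   # True = heads, False = tails
        n += 1
    return n

rng = random.Random(1)
p = 0.3
est = sum(flips_until_both(p, rng) for _ in range(100_000)) / 100_000
# Exact value by conditioning on the first flip: after a head we wait a
# geometric(1-p) time for a tail, and vice versa.
print(est, 1 + p / (1 - p) + (1 - p) / p)
```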


(b) Find the expected number of trials needed until both outcome 1 and outcome 2 have occurred. You have two opponents with whom you alternate play. If your objective is to minimize the expected number of games you need to play to win two in a row, should you start with A or with B? Hint: Let E[Nᵢ] denote the mean number of games needed if you initially play i. Derive an expression for E[N_A] that involves E[N_B]; write down the equivalent expression for E[N_B] and then subtract. A coin that comes up heads with probability p is continually flipped until the pattern T, T, H appears; that is, you stop flipping when the most recent flip lands heads and the two immediately preceding it land tails. Let X denote the number of flips made, and find E[X]. At each stage a ball is randomly selected from the urn and is then returned along with m other balls of the same color.


Let Xₖ be the number of red balls drawn in the first k selections. (a) Find E[X₁]. (b) Find E[X₂]. (c) Find E[X₃]. (d) Conjecture the value of E[Xₖ], and then verify your conjecture by a conditioning argument. (e) Give an intuitive proof for your conjecture. Now suppose that whenever a red ball is chosen it is returned along with m others of the same type, and similarly whenever a blue ball is chosen it is returned along with m others of the same type. Now, use a symmetry argument to determine the probability that any given selection is red. Shooting ends when two consecutive shots hit the target. (a) Find μ₁ and μ₂. A maximal subsequence of consecutive values having identical outcomes is called a run. For instance, if the outcome sequence is 1, 1, 0, 1, 1, 1, 0, the first run is of length 2, the second is of length 1, and the third is of length 3.


(a) Find the expected length of the first run. (b) Find the expected length of the second run. Independent trials, each resulting in success with probability p, are performed. (a) Find the expected number of trials needed for there to have been both at least n successes and at least m failures. (b) Find the expected number of trials needed for there to have been either at least n successes or at least m failures.


Let N denote the number of throws needed. b Let X i denote the number of dice rolled on the ith throw. Let m n denote the mean number of cycles. Conjecture a general formula for m n. Prove your formula by induction on n. Express the number of cycles in terms of these X i. Use the representation in part e to determine m n. Are the random variables X 1 ,. Find the variance of the number of cycles. A prisoner is trapped in a cell containing three doors. The first door leads to a tunnel that returns him to his cell after two days of travel. The second leads to a tunnel that returns him to his cell after three days of travel. The third door leads immediately to freedom. a Assuming that the prisoner will always select doors 1, 2, and 3 with probabilities 0. b Assuming that the prisoner is always equally likely to choose among those doors that he has not used, what is the expected number of days until he reaches freedom?


In this version, for instance, if the prisoner initially tries door 1, then when he returns to the cell, he will now select only from doors 2 and 3. (c) For parts (a) and (b) find the variance of the number of days until the prisoner reaches freedom. Workers 1, …, n. Suppose that each worker, independently, has probability p of being eligible for a job, and that a job is equally likely to be assigned to any of the workers that are eligible for it (if none are eligible, the job is rejected). Find the probability that the next job is assigned to worker 1. Determine, by differentiating its moment generating function, its expected value and variance.


Show, by computing its moment generating function, that W is a noncentral chi-squared random variable with parameters n and θ. (e) Find the expected value and variance of a noncentral chi-squared random variable with parameters n and θ. The amount of money spent by a customer is uniformly distributed over (0, ). Find the mean and variance of the amount of money that the store takes in on a given day. An individual traveling on the real line is trying to reach the origin. However, the larger the desired step, the greater is the variance in the result of that step. Specifically, whenever the person is at location x, he next moves to a location having mean 0 and variance βx². Let Xₙ denote the position of the individual after having taken n steps. Suppose that we want to predict the value of a random variable X by using one of the predictors Y₁, …, Yₙ.


Hint: compute Var(Y_i) by using the conditional variance formula. A and B play a series of games, with A winning each game with probability p. The overall winner is the first player to have won two more games than the other. (a) Find the probability that A is the overall winner. (b) Find the expected number of games played. There are three coins in a barrel. These coins, when flipped, will come up heads with respective probabilities 0.…. A coin is randomly selected from among these three and is then flipped ten times. Let N be the number of heads obtained on the ten flips. Does N have a binomial distribution? If X is geometric with parameter p, find the probability that X is even. Suppose that X and Y are independent random variables with probability density functions f_X and f_Y. The number of storms in the upcoming rainy season is Poisson distributed, but with a parameter value that is uniformly distributed over (0, 5).
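For the lead-by-two series, grouping the games into independent two-game blocks makes both parts mechanical: a block is decisive with probability p^2 + q^2, A takes a decisive block with probability p^2/(p^2 + q^2), and the number of blocks played is geometric. A sketch:

    def series_stats(p):
        # Returns (P(A is overall winner), E[number of games played]).
        q = 1 - p
        d = p * p + q * q            # P(a two-game block is decisive)
        return p * p / d, 2 / d

    print(series_stats(0.6))         # about (0.692, 3.846)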


Find the probability that there are at least three storms this season. Further suppose that Y is a gamma random variable with parameters (r, λ), where r is a positive integer. Find Var(N). Suppose each new coupon collected is, independently of the past, a type i coupon with probability p_i. A total of n coupons is to be collected. Let A_i be the event that there is at least one type i coupon in this set. Two players alternate flipping a coin that comes up heads with probability p. The first one to obtain a head is declared the winner. We are interested in the probability that the first player to flip is the winner. Before determining this probability, which we will call f(p), answer the following questions. (a) Do you think that f(p) is a monotone function of p? If so, is it increasing or decreasing? (d) Find f(p). Suppose in Exercise 29 that the shooting ends when the target has been hit twice. (b) Find P_1 and P_2. For the remainder of the problem, assume that player 1 shoots first. (c) Find the probability that the final hit was by player 1.
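For the alternating-flips question, conditioning on the first flip answers (a) and (d) together: f = p + (1 - p)(1 - f), so f(p) = 1/(2 - p), which increases from 1/2 (as p tends to 0) to 1 (at p = 1). A sketch with a simulation check:

    import random

    def f(p):
        # P(first flipper wins): f = p + (1-p)*(1-f)  =>  f = 1/(2-p)
        return 1 / (2 - p)

    def simulate(p, trials=100_000):
        wins = 0
        for _ in range(trials):
            turn = 0                      # 0 = the player who flips first
            while random.random() >= p:   # flip until a head appears
                turn ^= 1
            wins += (turn == 0)
        return wins / trials

    print(f(0.3), simulate(0.3))          # both near 0.588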


(d) Find the probability that both hits were by player 1. (e) Find the probability that both hits were by player 2. (f) Find the mean number of shots taken. A, B, and C are evenly matched tennis players. Initially A and B play a set, and the winner then plays C. This continues, with the winner always playing the waiting player, until one of the players has won two sets in a row. That player is then declared the overall winner. Find the probability that A is the overall winner. Suppose there are n types of coupons, and that the type of each new coupon obtained is independent of past selections and is equally likely to be any of the n types.
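The rotation game for A, B, and C is easy to simulate. By symmetry A and B should come out equal, and the estimates can be compared against the classical values 5/14 for A and B and 2/7 for C; treat those numbers as targets to verify rather than as given in the text:

    import random

    def overall_winner():
        # A and B play first; the winner keeps playing the waiting player
        # until someone wins two sets in a row. Sets are fair coin flips.
        if random.random() < 0.5:
            winner, loser = 'A', 'B'
        else:
            winner, loser = 'B', 'A'
        waiting = 'C'
        while True:
            challenger, waiting = waiting, loser
            if random.random() < 0.5:     # current winner wins again
                return winner             # two sets in a row
            winner, loser = challenger, winner

    trials = 200_000
    wins_A = sum(overall_winner() == 'A' for _ in range(trials))
    print(wins_A / trials)                # near 5/14 = 0.357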


Suppose one continues collecting until a complete set of at least one of each type is obtained. (a) Find the probability that there is exactly one type i coupon in the final collection. Hint: condition on T, the number of types that are collected before the first type i appears. (b) Find the expected number of types that appear exactly once in the final collection. A and B roll a pair of dice in turn, with A rolling first. The game ends when either player reaches his or her objective, and that player is declared the winner. (a) Find the probability that A is the winner. (b) Find the expected number of rolls of the dice. (c) Find the variance of the number of rolls of the dice. The number of red balls in an urn that contains n balls is a random variable that is equally likely to be any of the values 0, 1, …, n.


(d) Verify your answer to part (c) by a backwards induction argument. The opponents of soccer team A are of two types: either they are a class 1 or a class 2 team. This weekend the team has two games against teams they are not very familiar with. Assuming that the first team they play is a class 1 team with probability 0.…, find (b) the probability that team A will score a total of five goals. A coin having probability p of coming up heads is continually flipped. Let P_j(n) denote the probability that a run of j successive heads occurs within the first n flips. In a knockout tennis tournament of 2^n contestants, the players are paired and play a match.


This continues for n rounds, after which a single player remains unbeaten and is declared the winner. Suppose that the contestants are numbered 1 through 2^n, and that whenever two players contest a match, the lower-numbered one wins with probability p. Also suppose that the pairings of the remaining players are always done at random, so that all possible pairings for that round are equally likely. (a) What is the probability that player 1 wins the tournament? (b) What is the probability that player 2 wins the tournament? Hint: imagine that the random pairings are done in advance of the tournament. Now condition on the round in which players 1 and 2 are scheduled to meet. In the match problem, say that (i, j), i < j, is a pair if i chooses j's hat and j chooses i's hat.


(c) Find the probability that persons 1, 2, … (d) Find the probability that 1, 2, … Use Equation (3.…). Hint: first multiply both sides of Equation (3.…). In Example 3.…, suppose that we continually roll a die until the sum of all throws exceeds …. What is the most likely value of this total when you stop? There are five components. These components form a system as shown in Figure 3.…. The system is said to work if a signal originating at the left end of the diagram can reach the right end, where it can pass through a component only if that component is working. For instance, if components 1 and 4 both work, then the system also works. What is the probability that the system works? This problem will present another proof of the ballot problem of Example 3.…. Explain this correspondence. These are the probabilities if the bet is that a roulette wheel will land on a specified color.
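Since the figure is missing from this excerpt, the sketch below assumes the usual five-component bridge layout, which is consistent with the remark that components 1 and 4 alone suffice. It computes the system reliability by enumerating all 2^5 component states, with a common (illustrative) working probability p:

    from itertools import product

    # Assumed bridge layout (the exercise's figure is missing here):
    # the signal crosses when every component on some path works.
    paths = [{1, 4}, {2, 5}, {1, 3, 5}, {2, 3, 4}]

    def system_works(up):          # up = set of currently working components
        return any(path <= up for path in paths)

    p = 0.5                        # illustrative common component reliability
    prob = 0.0
    for bits in product([0, 1], repeat=5):
        up = {i + 1 for i, b in enumerate(bits) if b}
        if system_works(up):
            prob += p ** len(up) * (1 - p) ** (5 - len(up))
    print(prob)                    # 0.5 when p = 0.5 (the bridge is self-dual)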


The gambler will quit either when he or she is winning a total of 5 or after … plays. What is the probability that he or she plays exactly 15 times? In the ballot problem (Example 3.…): an urn contains n white and m black balls that are removed one at a time. Explain why this probability is equal to the probability that the set of withdrawn balls always contains more white than black balls. A coin that comes up heads with probability p is flipped n consecutive times. What is the probability that, starting with the first flip, there are always more heads than tails? (e) Use part (d) to obtain f(x). (d) Conclude that X_{N_k} has distribution F. Suppose in Example 3.…: (a) If A is currently serving, what is the probability that A wins the next point? (b) Explain how to obtain the final-score probabilities. In the list problem, when the P_i are known, show that the best ordering (best in the sense of minimizing the expected position of the element requested) is to place the elements in decreasing order of their probabilities.
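The always-ahead probability here is the Bertrand ballot number (n - m)/(n + m), and for small cases that claim can be confirmed by exact enumeration. A sketch:

    from itertools import combinations

    def prob_always_ahead(n, m):
        # Exact P(white strictly ahead of black throughout the removal),
        # by enumerating all orderings; compare with (n - m)/(n + m).
        total = wins = 0
        for white_pos in combinations(range(n + m), n):
            total += 1
            pos, lead, ok = set(white_pos), 0, True
            for i in range(n + m):
                lead += 1 if i in pos else -1
                if lead <= 0:
                    ok = False
                    break
            wins += ok
        return wins / total

    print(prob_always_ahead(5, 3), (5 - 3) / (5 + 3))    # both 0.25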


Consider the random graph of Section 3.…. (a) From the results of Section 3.…. Prove this directly. (c) For the Bose-Einstein distribution, compute the probability that exactly k of the X_i are equal to 0. In Section 3.…, the number of accidents in each period is a Poisson random variable with mean 5. Find the expected number of flips of a coin, which comes up heads with probability p, that are necessary to obtain the pattern h, t, h, h, t, h, t, h. The number of coins that Josh spots when walking to work is a Poisson random variable with mean 6. Each coin is equally likely to be a penny, a nickel, a dime, or a quarter.
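Expected waiting times for patterns are easy to check by simulation. For a fair coin, the standard overlap (renewal) argument applied to h, t, h, h, t, h, t, h sums 1/P(s) over the prefixes of the pattern that are also suffixes (lengths 1, 3, and 8 here), giving 2 + 8 + 256 = 266; treat that number as a claim to verify:

    import random

    def flips_until(pattern, p):
        # Flip a p-coin until `pattern` (a string of 'h'/'t') appears.
        buf, n = '', 0
        while True:
            n += 1
            buf += 'h' if random.random() < p else 't'
            buf = buf[-len(pattern):]
            if buf == pattern:
                return n

    trials = 50_000
    est = sum(flips_until('hthhthth', 0.5) for _ in range(trials)) / trials
    print(est)    # the overlap argument predicts 266 for a fair coin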


Josh ignores the pennies but picks up the other coins. (a) Find the expected amount of money that Josh picks up on his way to work. (b) Find the variance of the amount of money that Josh picks up on his way to work. (c) Find the probability that Josh picks up exactly 25 cents on his way to work. Consider a sequence of independent trials, each of which is equally likely to result in any of the outcomes 0, 1, …, m. Say that a round begins with the first trial, and that a new round begins each time outcome 0 occurs. Let N denote the number of trials that it takes until all of the outcomes 1, …, m have occurred. Also, let T_j denote the number of trials that it takes until j distinct outcomes have occurred, and let I_j denote the jth distinct outcome to occur. Therefore, outcome I_j first occurs at trial T_j.
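Josh's take is a compound Poisson amount: ignoring pennies thins the Poisson(6) count to Poisson(6 * 3/4) = Poisson(4.5), and each kept coin is worth 5, 10, or 25 cents with probability 1/3 each, so E = 4.5 * (40/3) = 60 cents and Var = 4.5 * E[V^2] = 4.5 * 250 = 1125. The sketch below also gets part (c) by conditioning on the number of coins picked up:

    from math import exp, factorial

    lam = 6 * 0.75                       # thinned Poisson: non-penny coins
    values, probs = [5, 10, 25], [1/3, 1/3, 1/3]

    mean = lam * sum(v * q for v, q in zip(values, probs))      # 60.0 cents
    var = lam * sum(v * v * q for v, q in zip(values, probs))   # 1125.0

    def p_sum_equals(target, k):
        # P(V1 + ... + Vk = target) for i.i.d. coin values, by DP.
        dist = {0: 1.0}
        for _ in range(k):
            new = {}
            for s, pr in dist.items():
                for v, q in zip(values, probs):
                    if s + v <= target:
                        new[s + v] = new.get(s + v, 0.0) + pr * q
            dist = new
        return dist.get(target, 0.0)

    p25 = sum(exp(-lam) * lam ** k / factorial(k) * p_sum_equals(25, k)
              for k in range(1, 6))      # at most 25/5 = 5 coins can fit
    print(mean, var, p25)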


(a) Argue that the random vectors (I_1, …) and (T_1, …) are independent. Hint: see Exercise 42 of Chapter 2. (d) Find E[N]. Let N be a hypergeometric random variable having the distribution of the number of white balls in a random sample of size r from a set of w white and b blue balls, where we use the convention that …. (a) With M defined as in Section 3.…. (c) Use the recursion of (b) to find P_{w,r}(2). For the left-skip-free random walk of Section 3.…. Let X_n denote its value in time period n, and suppose we want to make a probability model for the sequence of successive values X_0, X_1, X_2, ….


The simplest model would probably be to assume that the X_n are independent random variables, but often such an assumption is clearly unjustified. For instance, starting at some time, suppose that X_n represents the price of one share of some security, such as Google, at the end of n additional trading days. Such an assumption defines a Markov chain, a type of stochastic process that will be studied in this chapter, and which we now formally define. Unless otherwise mentioned, this set of possible values of the process will be denoted by the set of nonnegative integers {0, 1, 2, …}. We suppose that whenever the process is in state i, there is a fixed probability P_ij that it will next be in state j. Such a stochastic process is known as a Markov chain; formally (Equation 4.…), P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, …, X_0 = i_0) = P_ij for all states i_0, …, i_{n-1}, i, j and all n ≥ 0. The value P_ij represents the probability that the process will, when in state i, next make a transition into state j.


Example 4.…. Suppose also that if it rains today, then it will rain tomorrow with probability α, and if it does not rain today, then it will rain tomorrow with probability β. Each digit transmitted must pass through several stages, at each of which there is a probability p that the digit entered will be unchanged when it leaves. If he is cheerful today, then he will be C, S, or G tomorrow with respective probabilities 0.…. If he is feeling so-so today, then he will be C, S, or G tomorrow with probabilities 0.…. If he is glum today, then he will be C, S, or G tomorrow with probabilities 0.…. Specifically, suppose that if it has rained for the past two days, then it will rain tomorrow with probability 0.….
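The rain model is a two-state chain whose transition matrix is determined by α and β; the numbers below are illustrative stand-ins. As a bonus, solving π = πP gives the long-run fraction of rainy days, β/(1 - α + β):

    import numpy as np

    alpha, beta = 0.7, 0.4                 # illustrative values for α, β
    P = np.array([[alpha, 1 - alpha],      # state 0: rain today
                  [beta,  1 - beta]])      # state 1: no rain today

    assert np.allclose(P.sum(axis=1), 1)   # rows of a transition matrix sum to 1
    print(beta / (1 - alpha + beta))       # long-run P(rain); 4/7 here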


If we let the state at time n depend only on whether or not it is raining at time n, then the preceding model is not a Markov chain (why not?). However, we can transform this model into a Markov chain by saying that the state at any time is determined by the weather conditions during both that day and the previous day. In other words, we can say that the process is in state 0 if it rained both today and yesterday, state 1 if it rained today but not yesterday, state 2 if it rained yesterday but not today, and state 3 if it did not rain either yesterday or today.


Note that the preceding is a finite-state random walk with absorbing barriers (states 0 and N). Each policyholder is given a positive integer-valued state, and the annual premium is a function of this state (along, of course, with the type of car being insured and the level of insurance). Thus no claims is good, and typically results in a decreased premium, while claims are bad and typically result in a higher premium. For a given Bonus Malus system, let s_i(k) denote the next state of a policyholder who was in state i in the previous year and who made a total of k claims in that year. Consider a policyholder whose annual number of claims is a Poisson random variable with parameter λ. We now define the n-step transition probabilities P^n_ij to be the probability that a process in state i will be in state j after n additional transitions. The Chapman-Kolmogorov equations provide a method for computing these n-step transition probabilities: P^{n+m}_ij = Σ_k P^n_ik P^m_kj for all n, m ≥ 0 and all states i, j.
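In matrix form the Chapman-Kolmogorov equations say precisely that the n-step transition matrix is the nth power of P, since P^(n+m) = P^n * P^m. A sketch, with an illustrative two-state chain:

    import numpy as np

    def n_step(P, n):
        # Entry (i, j) of P**n is the n-step probability P^n_ij.
        return np.linalg.matrix_power(np.asarray(P), n)

    P = [[0.7, 0.3],
         [0.4, 0.6]]                       # illustrative chain
    print(n_step(P, 4)[0, 0])              # P(X_4 = 0 | X_0 = 0)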


and the desired probability P_00^… follows. Example 4.…. Given that it rained on Monday and Tuesday, what is the probability that it will rain on Thursday? Ball colors are red and blue. At each stage a ball is randomly chosen and then replaced by a new ball, which with probability 0.… …. If initially both balls are red, find the probability that the fifth ball selected is red. Solution: To find the desired probability we first define an appropriate Markov chain. This can be accomplished by noting that the probability that a selection is red is determined by the composition of the urn at the time of the selection. So let us define X_n to be the number of red balls in the urn after the nth selection and subsequent replacement. Now, to go from 1 red ball in the urn to 0 red balls, the ball chosen must be red, which occurs with probability 0.….
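The Monday-Tuesday question is a two-step computation in the four-state weather chain defined above. The transition values in this sketch are assumptions (the excerpt truncates them; 0.7, 0.5, 0.4, 0.2 are the values traditionally used with this example), and rain on Thursday means ending in state 0 or state 1:

    import numpy as np

    # States: 0 rain today & yesterday, 1 rain today only,
    #         2 rain yesterday only,   3 no rain either day.
    P = np.array([[0.7, 0.0, 0.3, 0.0],
                  [0.5, 0.0, 0.5, 0.0],
                  [0.0, 0.4, 0.0, 0.6],
                  [0.0, 0.2, 0.0, 0.8]])   # assumed illustrative values

    P2 = np.linalg.matrix_power(P, 2)      # Tuesday -> Thursday is 2 steps
    print(P2[0, 0] + P2[0, 1])             # rain on Thursday: 0.61 here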


What is the probability that there will be exactly 3 nonempty urns after 9 balls have been distributed? However, because we only require the probability, starting with a single occupied urn, that there are 3 occupied urns after an additional 8 balls have been distributed, we can make use of the fact that the state of the Markov chain cannot decrease to collapse all states 4, 5, … into one. Let A be a set of states, and suppose we are interested in the probability that the Markov chain ever enters any of the states in A by time m. Once the {W_n} Markov chain enters a state in A, it remains there forever.


The new Markov chain is defined as follows. Because there would be a run of three consecutive …. To compute this probability, all we need to know are the transition probabilities P_11, P_12, P_21, P_22. For instance, P^n_ij is the probability that the state at time n is j given that the initial state at time 0 is i. If the unconditional distribution of the state at time n is desired, it is necessary to specify the probability distribution of the initial state. Note that this implies that state j is accessible from state i if and only if, starting in i, it is possible that the process will ever enter state j.
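The device described here (freeze the chain once it enters A) can be coded directly: make every state of A absorbing, take the mth power of the modified matrix, and read off the absorbed mass. A sketch:

    import numpy as np

    def prob_enter_A_by_m(P, A, i, m):
        # P(the chain started at i enters the set A within m steps),
        # via the frozen chain W_n: states in A are made absorbing.
        Q = np.array(P, dtype=float)
        for a in A:
            Q[a, :] = 0.0
            Q[a, a] = 1.0
        return np.linalg.matrix_power(Q, m)[i, list(A)].sum()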


(ii) If state i communicates with state j, then state j communicates with state i. (iii) If state i communicates with state j, and state j communicates with state k, then state i communicates with state k. Properties (i) and (ii) follow immediately from the definition of communication. To prove (iii), suppose that i communicates with j and j communicates with k. Similarly, we can show that state i is accessible from state k. Hence, states i and k communicate. Two states that communicate are said to be in the same class. It is an easy consequence of (i), (ii), and (iii) that any two classes of states are either identical or disjoint. In other words, the concept of communication divides the state space up into a number of separate classes. The Markov chain is said to be irreducible if there is only one class, that is, if all states communicate with each other.
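Accessibility and communication are reachability statements about the directed graph that has an edge i -> j whenever P_ij > 0, so both can be checked mechanically; grouping states by mutual reachability recovers the communication classes, and the chain is irreducible when a single class remains. A sketch:

    def accessible(P, i, j):
        # True if j can be reached from i in the graph with an edge
        # u -> v whenever P[u][v] > 0, i.e. j is accessible from i.
        stack, seen = [i], {i}
        while stack:
            u = stack.pop()
            if u == j:
                return True
            for v, q in enumerate(P[u]):
                if q > 0 and v not in seen:
                    seen.add(v)
                    stack.append(v)
        return False

    def communicate(P, i, j):
        # States communicate when each is accessible from the other.
        return accessible(P, i, j) and accessible(P, j, i)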


Note that while state 0 or 1 is accessible from state 2, the reverse is not true. Proposition 4.…. This leads to the conclusion that in a finite-state Markov chain not all states can be transient. To see this, suppose the states are 0, 1, …, M, and suppose that they are all transient. Then after a finite amount of time (say, after time T_0) state 0 will never be visited; after a time (say, T_1) state 1 will never be visited; after a time (say, T_2) state 2 will never be visited; and so on. But the process must be in some state after time T = max(T_0, T_1, …, T_M), so we arrive at a contradiction, which shows that at least one of the states must be recurrent.


Another use of Proposition 4.… is Corollary 4.…: if state i is recurrent, and state i communicates with state j, then state j is recurrent. Proof: …. Thus, by Proposition 4.…. For if state i is transient and communicates with state j, then state j must also be transient; for if j were recurrent, then by Corollary 4.… state i would be recurrent as well. (ii) Corollary 4.…. Solution: It is a simple matter to check that all states communicate and hence, since this is a finite chain, all states must be recurrent.





Introduction to Probability is a classroom-tested introduction to probability theory, with the right balance between mathematical precision, probabilistic intuition, and concrete applications. It covers the material precisely, while avoiding excessive technical details. After introducing the basic vocabulary of randomness, including events, probabilities, and random variables, the text offers the reader a first glimpse of the major theorems of the subject: the law of large numbers and the central limit theorem. The important probability distributions are introduced organically as they arise from applications.


The discrete and continuous sides of probability are treated together to emphasize their similarities. Intended for students with a calculus background, the text teaches not only the nuts and bolts of probability theory and how to solve specific problems, but also why the methods of solution work. David F. Anderson is a Professor of Mathematics at the University of Wisconsin-Madison. His research focuses on probability theory and stochastic processes, with applications in the biosciences. He is the author of over thirty research articles and a graduate textbook on the stochastic models utilized in cellular biology. He was awarded the inaugural Institute for Mathematics and its Applications (IMA) Prize in Mathematics and was named a Vilas Associate by the University of Wisconsin-Madison. Timo Seppäläinen is the John and Abigail Van Vleck Chair of Mathematics at the University of Wisconsin-Madison.


He is the author of over seventy research papers in probability theory and a graduate textbook on large deviation theory. He is an elected Fellow of the Institute of Mathematical Statistics. He was an IMS Medallion Lecturer, an invited speaker at the International Congress of Mathematicians, and a Simons Fellow. Benedek Valkó is a Professor of Mathematics at the University of Wisconsin-Madison. His research focuses on probability theory, in particular the study of random matrices and interacting stochastic systems. He has published over thirty research papers. He has won a National Science Foundation (NSF) CAREER award, and he was also a Simons Fellow. Cambridge Mathematical Textbooks is a program of undergraduate and beginning graduate level textbooks for core courses, new courses, and interdisciplinary courses in pure and applied mathematics. These texts provide motivation with plenty of exercises of varying difficulty, interesting examples, modern applications, and unique approaches to the material.






It is intended for classroom use as well as for independent learners and readers. The mathematics is covered as precisely and faithfully as is reasonable and valuable, while avoiding excessive technical details. Two examples of this are as follows. Random variables are defined precisely as functions on the sample space. This is important to avoid the feeling that a random variable is a vague notion. Once absorbed, this point is not needed for doing calculations. Short, illuminating proofs are given for many statements but are not emphasized. The main focus of the book is on applying the mathematics to model simple settings with random outcomes and on calculating probabilities and expectations.


Introductory probability is a blend of mathematical abstraction and hands-on computation where the mathematical concepts and examples have concrete real-world meaning. The principles that have guided us in the organization of the book include the following. (i) We found that the traditional initial segment of a probability course devoted to counting techniques is not the most auspicious beginning. Hence we start with the probability model itself, and counting comes in conjunction with sampling. A systematic treatment of counting techniques is given in an appendix. The instructor can present this in class or assign it to the students. (ii) Most events are naturally expressed in terms of random variables. Hence we bring the language of random variables into the discussion as quickly as possible. (iii) One of our goals was an early introduction of the major results of the subject, namely the central limit theorem and the law of large numbers. These are covered for independent Bernoulli random variables in Chapter 4.


Preparation for this influenced the selection of topics of the earlier chapters. (iv) As a unifying feature, we derive the most basic probability distributions from independent trials, either directly or via a limit. This covers the binomial, geometric, normal, Poisson, and exponential distributions. Many students reading this text will have already been introduced to parts of the material. They might be tempted to solve some of the problems using computational tricks picked up elsewhere. We warn against doing so. The purpose of this text is not just to teach the nuts and bolts of probability theory and how to solve specific problems, but also to teach you why the methods of solution work. The sections marked with a diamond are optional topics that can be included in an introductory probability course as time permits, depending on the interests of the instructor and the audience. They can be omitted without loss of continuity.


At the end of most chapters is a section titled Finer points, on mathematical issues that are usually beyond the scope of an introductory probability book. In particular, we do not mention measure-theoretic issues in the main text, but explain some of these in the Finer points sections. Other topics in the Finer points sections include the lack of uniqueness of a density function, the Berry-Esséen error bounds for normal approximation, the weak versus the strong law of large numbers, and the use of matrices in multivariate normal densities. These sections are intended for the interested reader as starting points for further exploration. They can also be helpful to the instructor who does not possess an advanced probability background. A marker symbol is used to signal the end of numbered examples, the end of remarks, and the end of proofs.


There is an exercise section at the end of each chapter. The exercises begin with a small number of warm-up exercises explicitly organized by sections of the chapter. Their purpose is to offer the reader immediate and basic practice after a section has been covered. The subsequent exercises under the heading Further exercises contain problems of varying levels of difficulty, including routine ones, but some of these exercises use material from more than one section. Under the heading Challenging problems towards the end of the exercise section we have collected problems that may require some creativity or lengthier calculations. The concrete mathematical prerequisites for reading this book consist of basic set theory and some calculus, namely, a solid foundation in single variable calculus, including sequences and series, and multivariable integration.


Appendix A gives a short list of the particular calculus topics used in the text. Appendix B reviews set theory, and Appendix D reviews some infinite series.











