## More Bayesian Probability
*June 25, 2007*

*Posted by Peter in Exam 1/P, Exam 4/C.*


We know 10% of all proteins are membrane proteins. There are three types of amino acid: hydrophobic (H), polar (P), and charged (C). In globular (non-membrane) proteins, the three types occur in equal proportions: 1/3 each. In membrane proteins the percentages are H 50%, P 25%, C 25%. Now we have an unidentified sequence: HHHPH. What is the probability that it is a membrane protein?

**Solution:** Let M be the event that the protein is a membrane protein, and G be the event that the protein is globular (non-membrane). Then

$$P(M) = 0.1, \qquad P(G) = 1 - P(M) = 0.9,$$

if we assume that all proteins are classified as belonging to either type M or type G. Now, we are also given that

$$P(H \mid G) = P(P \mid G) = P(C \mid G) = \tfrac{1}{3};$$

that is, the probability that a selected amino acid is hydrophobic, polar, or charged, given that it belongs to a globular protein, is 1/3 each. We also have

$$P(H \mid M) = \tfrac{1}{2}, \qquad P(P \mid M) = P(C \mid M) = \tfrac{1}{4}.$$

Our prior hypothesis is that there is a 0.1 probability that the protein is a membrane protein. Now, the likelihood of observing the amino acid sequence HHHPH given that the protein is membrane is

$$P(\text{HHHPH} \mid M) = \left(\tfrac{1}{2}\right)^4 \tfrac{1}{4} = \tfrac{1}{64}.$$

This assumes that amino acid types are independent of each other within a given protein. Similarly, the likelihood of observing the same sequence given that the protein is globular is

$$P(\text{HHHPH} \mid G) = \left(\tfrac{1}{3}\right)^5 = \tfrac{1}{243}.$$

The joint probabilities are then

$$P(\text{HHHPH} \cap M) = P(\text{HHHPH} \mid M)\,P(M) = \tfrac{1}{64}(0.1) = \tfrac{1}{640},$$

$$P(\text{HHHPH} \cap G) = P(\text{HHHPH} \mid G)\,P(G) = \tfrac{1}{243}(0.9) = \tfrac{1}{270},$$

and therefore the unconditional probability of observing the sequence HHHPH is, by the law of total probability,

$$P(\text{HHHPH}) = \tfrac{1}{640} + \tfrac{1}{270} = \tfrac{91}{17280}.$$

Hence by Bayes’ theorem, the posterior probability of the protein being membrane, given that we observed the particular amino acid sequence HHHPH, is

$$P(M \mid \text{HHHPH}) = \frac{P(\text{HHHPH} \cap M)}{P(\text{HHHPH})} = \frac{1/640}{91/17280} = \frac{27}{91},$$

which is approximately 29.67%. This answer makes sense, because in the absence of any information, we can only conclude there is a 10% probability of selecting a membrane protein. However, once we observed the sequence HHHPH, the posterior probability is significantly greater, since it is far more likely to observe such a sequence if the protein were membrane than if it were globular—indeed, the likelihood was 1/64 versus 1/243. However, because the overall distribution of proteins is such that 90% are globular, the posterior probability is not vastly greater—only 30%.
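The arithmetic above is easy to verify in a few lines of Python; this is just a sketch, and the function name and dictionary encoding are my own, not from any standard library:

```python
# Posterior probability that a protein is a membrane protein,
# given an observed amino acid sequence (H/P/C), via Bayes' theorem.

PRIOR_M = 0.1                                # prior P(M) for membrane proteins
LIK_M = {"H": 0.50, "P": 0.25, "C": 0.25}    # per-residue P(type | membrane)
LIK_G = {"H": 1/3, "P": 1/3, "C": 1/3}       # per-residue P(type | globular)

def posterior_membrane(seq):
    """P(M | seq), assuming residue types are independent within a protein."""
    like_m = 1.0
    like_g = 1.0
    for aa in seq:
        like_m *= LIK_M[aa]
        like_g *= LIK_G[aa]
    joint_m = PRIOR_M * like_m          # P(seq and M)
    joint_g = (1 - PRIOR_M) * like_g    # P(seq and G)
    return joint_m / (joint_m + joint_g)

print(posterior_membrane("HHHPH"))   # ≈ 0.2967, i.e. 27/91
```

Running the same function on a longer sequence is a quick way to check your answer to the exercise below.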

**Exercise:** Suppose you observed the sequence HHPCHHPHHHCH. What is the posterior probability of the protein being membrane? Why do we get a different result here? Why do we have to observe far longer sequences before we can have a high posterior probability that the sequence belongs to a membrane protein, compared to a similar degree of confidence that the sequence belongs to a globular protein?

## The moment calculation shortcut
*June 20, 2007*

*Posted by Peter in Exam 1/P, Exam 3/MLC, Exam 4/C.*


Suppose X is a continuous random variable that takes on nonnegative values. Then we have the following definition:

$$E[X] = \int_0^\infty x\,f(x)\,dx,$$

where $f(x)$ is the probability density function of X. Indeed, in the general case, let g be a function on the support of X. Then

$$E[g(X)] = \int_0^\infty g(x)\,f(x)\,dx.$$

So when $g(x) = x^k$ for some positive integer k, we obtain the formula for the **k-th raw moment** of X. Let’s work through an example.

Show that the expected value (first raw moment) of a Pareto distribution with parameters α and θ is equal to θ/(α−1). Recall that the density of the Pareto distribution is

$$f(x) = \frac{\alpha\,\theta^\alpha}{(x+\theta)^{\alpha+1}}, \qquad x > 0.$$

**Solution:** We compute

$$E[X] = \int_0^\infty x \cdot \frac{\alpha\,\theta^\alpha}{(x+\theta)^{\alpha+1}}\,dx.$$

Then make the substitution $u = x + \theta$, $du = dx$, to obtain

$$E[X] = \alpha\,\theta^\alpha \int_\theta^\infty \frac{u - \theta}{u^{\alpha+1}}\,du = \alpha\,\theta^\alpha \int_\theta^\infty \left( u^{-\alpha} - \theta\,u^{-\alpha-1} \right) du.$$

For $\alpha > 1$, the integrals converge, giving

$$E[X] = \alpha\,\theta^\alpha \left( \frac{\theta^{1-\alpha}}{\alpha - 1} - \theta \cdot \frac{\theta^{-\alpha}}{\alpha} \right) = \frac{\alpha\theta}{\alpha - 1} - \theta = \frac{\theta}{\alpha - 1},$$

which proves the desired result. However, the computation is quite tedious, and there is often an easier approach. We will now show that instead of using the density of X, we can use the survival function of X when computing moments. Recall that

$$S(x) = P(X > x) = \int_x^\infty f(t)\,dt = 1 - F(x);$$

that is, the survival function is the probability that X exceeds x, which is the integral of the density on the interval $(x, \infty)$, or the complement of the cumulative distribution function F(x). With this in mind, let’s try integration by parts on the definition of the expected value of g(X), with the choices $u = g(x)$, $dv = f(x)\,dx$; $du = g'(x)\,dx$, $v = -S(x)$:

$$E[g(X)] = \int_0^\infty g(x)\,f(x)\,dx = \Big[-g(x)\,S(x)\Big]_0^\infty + \int_0^\infty g'(x)\,S(x)\,dx = g(0) + \int_0^\infty g'(x)\,S(x)\,dx,$$

where the last equality holds because of two assumptions: first, that $\lim_{x \to \infty} g(x)\,S(x) = 0$ and $g(0)$ is finite; and second, that the resulting integral of $g'(x)\,S(x)$ is convergent. Note that the individual terms of the integration by parts are not necessarily convergent on their own, but taken together they are; thus, a fully rigorous proof requires a more formal treatment than what is furnished here.

A consequence of this result is that for positive integers k,

$$E[X^k] = \int_0^\infty k\,x^{k-1}\,S(x)\,dx.$$

This formula is easier to work with in some instances, compared to the original definition. For instance, we know that the Pareto survival function is

$$S(x) = \left( \frac{\theta}{x + \theta} \right)^{\alpha},$$

so we find

$$E[X] = \int_0^\infty \left( \frac{\theta}{x + \theta} \right)^{\alpha} dx = \theta^\alpha \left[ \frac{(x+\theta)^{1-\alpha}}{1 - \alpha} \right]_0^\infty = \frac{\theta}{\alpha - 1},$$

which we can immediately see results in a simpler integrand. This result also proves the life contingencies relationship

$$\mathring{e}_x = \int_0^\infty {}_t p_x \,dt,$$

since the complete expectation of life $\mathring{e}_x$ is simply the expected value E[T(x)] of the future lifetime variable T(x), and ${}_t p_x$ is the survival function of T(x). In life contingencies notation, the definition of expected value would then look like this:

$$\mathring{e}_x = \int_0^\infty t \,\, {}_t p_x \,\mu_{x+t}\,dt,$$

which is usually more cumbersome than the formula using only the survival function.
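The survival-function shortcut is easy to check numerically. Here is a small Python sketch (the parameter values α = 3, θ = 2 and the truncation point are arbitrary choices of mine) that integrates the Pareto survival function with a simple trapezoidal rule and compares the result to θ/(α − 1):

```python
# Numerically verify E[X] = ∫₀^∞ S(x) dx for a Pareto distribution
# with survival function S(x) = (θ/(x+θ))^α, using the trapezoidal rule.

alpha, theta = 3.0, 2.0          # example parameters; need α > 1 for the mean

def survival(x):
    return (theta / (x + theta)) ** alpha

def trapezoid(f, a, b, n):
    """Trapezoidal rule for ∫_a^b f(x) dx with n subintervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    total += sum(f(a + i * h) for i in range(1, n))
    return total * h

approx = trapezoid(survival, 0.0, 500.0, 200_000)  # truncate the light tail
exact = theta / (alpha - 1)                        # θ/(α−1) = 1 here
print(approx, exact)
```

With these parameters the truncated tail contributes only about $10^{-5}$, so the two numbers agree to three decimal places.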

## Order Statistics of Exponential RVs
*June 5, 2007*

*Posted by Peter in Exam 1/P, Exam 3/MLC, Exam 4/C.*


Here’s a question I read from the AoPS forum, and answered therein:

In analyzing the risk of a catastrophic event, an insurer uses the exponential distribution with mean $\theta$ as the distribution of the time until the event occurs. The insured has *n* independent catastrophe policies of this type. Find the expected time until the insured will have the first catastrophe claim.

The sum of *n* independent and identically distributed exponential random variables is gamma distributed. Specifically, if $X_1, \dots, X_n$ are exponential with mean $\theta$, then $S = X_1 + \cdots + X_n$ is gamma with mean $n\theta$ and variance $n\theta^2$.

It’s noteworthy that the sum *S* is a random variable that describes the time until the *n*-th claim if claims followed a Poisson process (whose **interarrival** times are exponentially distributed).

However, according to the model you specified, the events are not interarrival times, but rather they run concurrently. So the time until the *k*-th event is NOT gamma; rather, it is the *k*-th order statistic $Y_k$. Fortunately, the first such order statistic is exponential, which we show by recalling that $Y_k$ has PDF

$$f_{Y_k}(x) = \frac{n!}{(k-1)!\,(n-k)!}\,F(x)^{k-1}\,[1 - F(x)]^{n-k}\,f(x).$$

If *k* = 1, we immediately obtain

$$f_{Y_1}(x) = n\,[1 - F(x)]^{n-1}\,f(x) = n\,e^{-(n-1)x/\theta} \cdot \frac{1}{\theta}\,e^{-x/\theta} = \frac{n}{\theta}\,e^{-nx/\theta},$$

which is exponential with mean $\theta/n$. Note that this answer makes sense because as the number of policies *n* held by the insured increases, the expected waiting time until the first claim **decreases**. This would not be the case if one and only one policy at a time were in force until the last policy claim; such a scenario would correspond to the gamma distribution previously mentioned.
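A quick Monte Carlo check in Python supports this: simulate the minimum of *n* exponential lifetimes many times and compare the sample mean to θ/n. The parameter values, seed, and trial count below are arbitrary choices of mine.

```python
# Simulate the first order statistic of n i.i.d. exponential variables
# with mean theta; its sample mean should be close to theta / n.
import random

random.seed(0)
n, theta = 5, 10.0
trials = 200_000

total = 0.0
for _ in range(trials):
    # expovariate takes the rate lambda = 1/mean
    total += min(random.expovariate(1 / theta) for _ in range(n))
sample_mean = total / trials

print(sample_mean, theta / n)   # sample mean ≈ 2.0
```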

To check your understanding, here are some exercises:

- What is the PDF of the second order statistic $Y_2$, and what does it represent?
- Which of the $Y_k$ belong to the gamma or exponential family of distributions?
- Prove that

## Question 12, Spring 2007 Exam 4/C
*May 27, 2007*

*Posted by Peter in Exam 4/C.*


**12.** For 200 auto accident claims you are given:

- Claims are submitted *t* months after the accident occurs, *t* = 0, 1, 2, ….
- There are no censored observations.
- $\hat{S}_n(t)$ is calculated using the Kaplan-Meier product limit estimator.
- $\widehat{\mathrm{Var}}[\hat{S}_n(t)]$ is calculated using Greenwood’s approximation.
- .

Determine the number of claims that were submitted to the company 10 months after an accident occurred.

**Solution.** There are two key observations we need to make. The first is that we are given the risk set at time *t* = 0, namely $r_0 = 200$. The second observation is that because no observations are censored, the Kaplan-Meier estimator of the survival function takes on a particularly simple form. This is because in the absence of censoring, $r_{j+1} = r_j - d_j$; that is, the risk set at time $t_{j+1}$ is simply the risk set at time $t_j$ minus those who died in the meantime. Therefore the product telescopes:

$$\hat{S}_n(t) = \prod_{j=0}^{t} \frac{r_j - d_j}{r_j} = \frac{r_1}{r_0} \cdot \frac{r_2}{r_1} \cdots \frac{r_{t+1}}{r_t} = \frac{r_{t+1}}{r_0}.$$

So with this in mind, we have $\hat{S}_n(10) = r_{11}/200$. Recalling Greenwood’s approximation,

$$\widehat{\mathrm{Var}}[\hat{S}_n(t)] = \hat{S}_n(t)^2 \sum_{j=0}^{t} \frac{d_j}{r_j\,(r_j - d_j)},$$

so in the absence of censoring,

$$\sum_{j=0}^{t} \frac{d_j}{r_j\,(r_j - d_j)} = \sum_{j=0}^{t} \left( \frac{1}{r_{j+1}} - \frac{1}{r_j} \right) = \frac{1}{r_{t+1}} - \frac{1}{r_0},$$

and hence

$$\widehat{\mathrm{Var}}[\hat{S}_n(t)] = \hat{S}_n(t)^2 \left( \frac{1}{r_{t+1}} - \frac{1}{r_0} \right) = \frac{\hat{S}_n(t)\,\bigl(1 - \hat{S}_n(t)\bigr)}{r_0}.$$

Substituting the given values into this expression and solving, we obtain the required number of claims.
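The no-censoring simplification is easy to check in code. Here is a Python sketch on a small made-up sequence of death counts (the data are hypothetical, not the exam's; the function name is my own):

```python
# Kaplan-Meier estimate and Greenwood's approximation for uncensored data,
# checking that Greenwood reduces to S(1-S)/n when nothing is censored.

def km_greenwood(n, deaths):
    """Return (S_hat, greenwood_var) after the last death time.

    n      -- initial risk set size r_0
    deaths -- death counts d_j at successive death times t_0 < t_1 < ...
    """
    r = n
    s_hat = 1.0
    gw_sum = 0.0
    for d in deaths:
        s_hat *= (r - d) / r           # product-limit factor
        gw_sum += d / (r * (r - d))    # Greenwood summand
        r -= d                         # uncensored: r_{j+1} = r_j - d_j
    return s_hat, s_hat**2 * gw_sum

n = 10
s, var_g = km_greenwood(n, [3, 2, 1])        # hypothetical death counts
print(s, var_g)                              # ≈ 0.4 and ≈ 0.024
assert abs(var_g - s * (1 - s) / n) < 1e-12  # matches S(1-S)/n
```

Here $\hat S = \tfrac{7}{10}\cdot\tfrac{5}{7}\cdot\tfrac{4}{5} = 0.4$, and both variance formulas give 0.024, as the derivation above predicts.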