# The PMF and CDF Relationship

### The Histogram, PMF, and PDF

SAS software provides the PDF, CDF, QUANTILE, and RAND functions for many common distributions, and PROC IML can be used to define functions for others, such as a PMF for the beta-binomial distribution. The interpretation of the CDF is the same whether we have a discrete or a continuous variable (read PMF or PDF), but the definition differs slightly: for a continuous variable, the PDF is the derivative of the CDF, while for a discrete variable, the probability mass function (PMF) describes how probability is assigned to each individual value.
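The derivative relationship can be checked numerically. Here is a minimal Python sketch (not SAS) for the standard normal distribution, using only the standard library: a central difference on the CDF should closely match the PDF.

```python
import math

def norm_cdf(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    """Standard normal PDF."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

# The PDF is the derivative of the CDF: a central difference
# on the CDF approximates the PDF at any point x.
x, h = 0.7, 1e-5
numeric_pdf = (norm_cdf(x + h) - norm_cdf(x - h)) / (2.0 * h)
print(numeric_pdf, norm_pdf(x))
```

The point `x = 0.7` and step `h` are arbitrary choices for illustration; the agreement holds at any point where the PDF is smooth.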

A PMF applies to discrete random variables; a PDF applies to continuous random variables. The CDF can be used to determine the probability that a random variable falls within any measurable subset of its range. Here is an example: we might compute the probability of a score falling between 90 and some upper bound. Both terms have been used often in this article, so it is worth spelling out what they really mean. A discrete random variable takes only a countable number of distinct values, such as 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, and so on.
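For a discrete variable, the CDF is just a running sum of the PMF. A small sketch using a fair six-sided die (a made-up example, with exact fractions to avoid rounding):

```python
from fractions import Fraction

# PMF of a fair six-sided die: each face has probability 1/6.
pmf = {face: Fraction(1, 6) for face in range(1, 7)}

def cdf(x):
    """Discrete CDF: P(X <= x) is the sum of pmf values up to x."""
    return sum(p for face, p in pmf.items() if face <= x)

print(cdf(3))           # P(X <= 3)
print(cdf(4) - cdf(1))  # P(1 < X <= 4), a probability over a subset
```

Differencing the CDF at two points gives the probability of landing in the interval between them, which is exactly how "the probability of a score between two bounds" is computed.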

Other examples of discrete random variables: the number of children in a family, or the number of people watching the Friday late-night matinee show. Suffice it to say, the probability distribution of a discrete random variable is a list of probabilities associated with each of its possible values. By contrast, the term continuous is applied to a random variable that can assume all of the possible values within a given range.

## Connecting the CDF and the PDF

Examples of continuous random variables include the temperature in Florida for the month of December. For a digitized signal, the histogram displays how many samples in the signal have each of the possible values.

Figure (b) shows the histogram for the samples in (a). For example, 2 samples have one particular value, 8 samples have another, 0 samples have a third, and so on. We will represent the histogram by Hi, where i is an index that runs from 0 to M−1, and M is the number of possible values that each sample can take on. For instance, H50 is the number of samples that have a value of 50. Figure (c) shows the histogram of the signal using the full data set, all k points.
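Building such a histogram requires only indexing and incrementing. A minimal Python sketch, with a made-up integer signal whose samples lie in the range 0 to M−1:

```python
# Illustrative values only: M possible sample values, short signal.
M = 8
signal = [3, 5, 3, 0, 7, 3, 5, 1, 0, 3]

# H[i] counts how many samples are equal to i.
H = [0] * M
for sample in signal:
    H[sample] += 1   # one index-and-increment per sample

print(H)
print(sum(H) == len(signal))  # histogram bins sum to the signal length
```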

As can be seen, the larger number of samples results in a much smoother appearance. Just as with the mean, the statistical noise (roughness) of the histogram is inversely proportional to the square root of the number of samples used. From the way it is defined, the sum of all of the values in the histogram must equal the number of points in the signal:

N = H0 + H1 + H2 + ⋯ + H(M−1)

The histogram can be used to efficiently calculate the mean and standard deviation of very large data sets.

This is especially important for images, which can contain millions of samples. The histogram groups samples together that have the same value. This allows the statistics to be calculated by working with a few groups, rather than a large number of individual samples.


Using this approach, the mean and standard deviation are calculated from the histogram by the equations:

μ = (1/N) Σ i·Hi    and    σ² = (1/(N−1)) Σ (i − μ)²·Hi

where each sum runs from i = 0 to M−1. The accompanying table contains a program for calculating the histogram, mean, and standard deviation using these equations.
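A minimal Python sketch of the histogram-based calculation, with made-up bin counts, cross-checked against the direct per-sample formula:

```python
import math

# Hypothetical histogram: H[i] counts the samples equal to i.
H = [2, 1, 0, 4, 0, 2, 0, 1]
N = sum(H)                     # total number of samples

# Mean and variance computed from the few histogram bins
# rather than from every individual sample.
mean = sum(i * H[i] for i in range(len(H))) / N
var = sum((i - mean) ** 2 * H[i] for i in range(len(H))) / (N - 1)
std = math.sqrt(var)

# Cross-check: expand the histogram back into samples and
# compute the mean directly.
signal = [i for i in range(len(H)) for _ in range(H[i])]
direct_mean = sum(signal) / len(signal)
print(mean, std)
```

The slow additions and multiplications touch only the M histogram bins, not the N samples, which is the source of the speedup described below.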

Calculation of the histogram is very fast, since it only requires indexing and incrementing. In comparison, calculating the mean and standard deviation requires the time-consuming operations of addition and multiplication.

The strategy of this algorithm is to use these slow operations only on the few numbers in the histogram, not the many samples in the signal. This makes the algorithm much faster than the previously described methods; think roughly a factor of ten for very long signals, with the calculations being performed on a general-purpose computer.

The notion that the acquired signal is a noisy version of the underlying process is very important; so important that some of the concepts are given different names. The histogram is what is formed from an acquired signal. The corresponding curve for the underlying process is called the probability mass function (pmf). A histogram is always calculated using a finite number of samples, while the pmf is what would be obtained with an infinite number of samples. The pmf can be estimated (inferred) from the histogram, or it may be deduced by some mathematical technique, such as in the coin-flipping example.

The figure shows an example pmf, and one of the possible histograms that could be associated with it. The key to understanding these concepts rests in the units of the vertical axis. As previously described, the vertical axis of the histogram is the number of times that a particular value occurs in the signal.

The vertical axis of the pmf contains similar information, except expressed on a fractional basis. In other words, each value in the histogram is divided by the total number of samples to approximate the pmf.
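This normalization is a one-line operation. A sketch with hypothetical bin counts:

```python
# Hypothetical histogram bin counts for a small signal.
H = [2, 1, 0, 4, 0, 2, 0, 1]
N = sum(H)

# Estimate the pmf by dividing each bin count by the
# total number of samples.
pmf = [h / N for h in H]

# Every pmf value lies in [0, 1], and the values sum to one.
print(all(0.0 <= p <= 1.0 for p in pmf))
print(abs(sum(pmf) - 1.0) < 1e-12)
```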


This means that each value in the pmf must be between zero and one, and that the sum of all of the values in the pmf must equal one. The pmf is important because it describes the probability that a certain value will be generated. For example, imagine a signal generated by the process described by the figure. What is the probability that a sample taken from this signal will have a particular value?

Figure (b) provides the answer. What is the probability that a randomly chosen sample will have a value greater than some threshold?

Adding up the relevant values in the pmf gives the answer: the signal would be expected to exceed that value on average once every 82 points. What is the probability that any one sample will fall somewhere in the full range of values? Summing all of the values in the pmf produces a probability of 1. The histogram and pmf can only be used with discrete data, such as a digitized signal residing in a computer.
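The "once every 82 points" figure follows from the tail probability: if the pmf assigns total probability p to the values above a threshold, the signal exceeds that threshold on average once every 1/p samples. A sketch with hypothetical tail values chosen to sum to about 0.0122 (roughly 1 in 82):

```python
# Hypothetical pmf values for the bins above the threshold.
pmf_tail = [0.008, 0.003, 0.0012]

# Probability of exceeding the threshold on any one sample.
p_exceed = sum(pmf_tail)

# Expected number of samples between exceedances.
expected_spacing = 1.0 / p_exceed
print(p_exceed, expected_spacing)
```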

A similar concept applies to continuous signals, such as voltages appearing in analog electronics. The probability density function (pdf), also called the probability distribution function, is to continuous signals what the probability mass function is to discrete signals.