
    Statistics 1-1

    • Introduction to Statistics
      • 1.1 Importance and Applications of Statistics
      • 1.2 Types of Data
      • 1.3 Classification of Statistics
    • Descriptive Statistics
      • 2.1 Measures of Central Tendency
      • 2.2 Measures of Dispersion
    • Probability
      • 3.1 Basic Probability Concepts
      • 3.2 Conditional Probability
      • 3.3 Theories of Probability
    • Probability Distribution
      • 4.1 Probability Mass Function & Probability Density Function
      • 4.2 Special Distributions: Binomial, Poisson & Normal Distributions
      • 4.3 Central Limit Theorem
    • Sampling and Sampling Methods
      • 5.1 Concept of Sampling
      • 5.2 Different Sampling Techniques
    • Estimation and Hypothesis Testing
      • 6.1 Point and Interval Estimation
      • 6.2 Fundamentals of Hypothesis Testing
      • 6.3 Type I and II Errors
    • Comparison of Two Populations
      • 7.1 Independent Samples
      • 7.2 Paired Samples
    • Analysis of Variance (ANOVA)
      • 8.1 One-way ANOVA
      • 8.2 Two-way ANOVA
    • Regression Analysis
      • 9.1 Simple Regression
      • 9.2 Multiple Regression
    • Correlation
      • 10.1 Concept of Correlation
      • 10.2 Types of Correlation
    • Nonparametric Statistics
      • 11.1 Chi-Square Test
      • 11.2 Mann-Whitney U Test
      • 11.3 The Kruskal-Wallis Test
    • Statistical Applications in Quality and Productivity
      • 12.1 Use of Statistics in Quality Control
      • 12.2 Use of Statistics in Productivity
    • Software Application in Statistics
      • 13.1 Introduction to Statistical Software
      • 13.2 Statistical Analysis using Software

    Probability Distribution

    Understanding Probability Mass Function and Probability Density Function

    Function describing a discrete probability distribution by stating the probability of each value.

    Introduction

    In the world of statistics, understanding how the values of a random variable are distributed is crucial. This is where the concepts of Probability Mass Function (PMF) and Probability Density Function (PDF) come into play. These two functions provide a mathematical description of the random variable's distribution.

    Probability Mass Function (PMF)

    The Probability Mass Function is a function that gives the probability that a discrete random variable is exactly equal to some value. The probability mass function is often the primary means of defining a discrete probability distribution, and such functions exist for either scalar or multivariate random variables whose domain is discrete.

    A PMF must satisfy two conditions:

    1. The probability for each value must be between 0 and 1, inclusive.
    2. The sum of all probabilities must equal 1.

    For example, consider a fair six-sided die. The PMF of the outcome of a single roll is 1/6 for each of the faces (1, 2, 3, 4, 5, 6), and the sum of these probabilities is 1.
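The die example and the two PMF conditions can be checked directly. The sketch below is a minimal illustration, using exact fractions so the sum-to-1 condition holds without rounding error:

```python
from fractions import Fraction

# PMF of a single roll of a fair six-sided die: each face has probability 1/6.
pmf = {face: Fraction(1, 6) for face in range(1, 7)}

# Condition 1: each probability lies between 0 and 1, inclusive.
assert all(0 <= p <= 1 for p in pmf.values())

# Condition 2: the probabilities sum to exactly 1.
assert sum(pmf.values()) == 1

# The PMF answers "what is the probability that X equals this value?"
print(pmf[3])  # probability of rolling a 3 → 1/6
```

Using `Fraction` rather than floats keeps the check exact; with floats, six copies of 1/6 would sum to 1 only up to rounding.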

    Probability Density Function (PDF)

    The Probability Density Function is a function whose value at any given sample in the sample space can be interpreted as the relative likelihood that the random variable takes a value near that sample. Because a continuous random variable has infinitely many possible values, the probability that it equals any one particular value is exactly 0. The PDF is therefore used comparatively: the ratio of the PDF at two different samples tells us how much more likely the random variable is to fall near one sample than near the other in any particular draw.

    A PDF must satisfy two conditions:

    1. The probability for each value must be non-negative.
    2. The integral over the entire space must equal 1.

    For example, the height of adult males in the U.S. is approximately normally distributed with a mean of about 70 inches. The PDF peaks at 70 inches, indicating that heights near the mean are the most likely; the probability of a height falling in any interval, say between 69 and 71 inches, is the area under the PDF over that interval.
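The two PDF conditions can be verified numerically for this example. The sketch below codes the normal density from its formula and checks that it is non-negative and integrates to 1; the standard deviation of 3 inches is an assumed value for illustration, not a figure from the text:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution at x, from the standard formula."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

mu, sigma = 70.0, 3.0  # mean from the example; sigma = 3 is an assumption

# Condition 1: the density is non-negative everywhere (exp(...) > 0).
peak = normal_pdf(70.0, mu, sigma)  # the density is largest at the mean

# Condition 2: the density integrates to 1 over the real line, approximated
# here by a Riemann sum over [40, 100], which is 10 standard deviations wide.
step = 0.01
total = sum(normal_pdf(40.0 + i * step, mu, sigma) * step
            for i in range(int(60 / step)))
print(round(peak, 4), round(total, 4))
```

Note that `peak` is a density, not a probability: it can exceed 1 for a sharply concentrated distribution, which is why condition 1 for a PDF only requires non-negativity rather than a value between 0 and 1.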

    Differences between PMF and PDF

    While both PMF and PDF provide a description of the distribution of a random variable, they are used in different contexts. PMF is used for discrete random variables, for which the outcomes are countable. On the other hand, PDF is used for continuous random variables, which can take on an infinite number of outcomes.
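The contrast can be made concrete with the two running examples. In the sketch below (a hypothetical illustration, again assuming a standard deviation of 3 inches for heights), the PMF answers a "probability of exactly this value" question directly, while for the PDF only an interval carries probability, obtained as the area under the curve:

```python
import math

def normal_pdf(x, mu=70.0, sigma=3.0):  # sigma = 3 is an assumed value
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Discrete: the PMF of a fair die gives P(X = 3) directly.
p_discrete = 1 / 6

# Continuous: P(X = 70) is 0 for a height exactly equal to 70 inches.
# A question with a nonzero answer asks about an interval, e.g.
# P(69.5 <= X <= 70.5), approximated by a Riemann sum of the PDF.
step = 0.001
p_interval = sum(normal_pdf(69.5 + i * step) * step for i in range(int(1 / step)))

print(round(p_discrete, 4), round(p_interval, 4))
```

The same question ("probability of a specific outcome") is thus answered by a lookup for a discrete variable but by an integral over an interval for a continuous one.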

    In conclusion, understanding PMF and PDF is fundamental to understanding statistics and probability. These functions provide a way to describe the distribution of random variables, which is crucial in making predictions, inferences, and decisions based on data.
