Inferential statistics involves making generalizations and drawing conclusions about a larger population based on sample data. It focuses on two main activities: parameter estimation and hypothesis testing. It uses probability theory to quantify the uncertainty in these inferences, since a sample is only a part of the population.
Statistical models provide a framework for making inferences. For instance, a linear regression model can be used to estimate parameters (such as regression coefficients), which are then generalized to the broader population.
- Tools for Hypothesis Testing: Statistical models are used for formulating and testing hypotheses. For example, a model can be used to test whether there is a significant relationship between two variables.
- Parameter Estimation: Models assist in estimating population parameters, which is a central aspect of inferential statistics; for example, the mean or variance of a population.
- Analysis of Uncertainty: Inferential statistics, through statistical models, quantifies the uncertainty associated with estimates. This is done by calculating confidence intervals, p-values, and similar measures (see the sketch after this list).
- Predictions and Generalizations: Statistical models are used to make predictions about new data. In inferential statistics, these predictions are extended to the general population.
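A minimal sketch of these ideas, assuming simulated data and the `statsmodels` library (the true coefficients and sample size below are illustrative): we estimate regression coefficients from a sample, then report confidence intervals and p-values to quantify the uncertainty of those estimates.

```python
# Sketch: parameter estimation and uncertainty for a linear regression model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Simulate a sample from a population with true intercept 2.0 and slope 0.5
n = 100
x = rng.uniform(0, 10, size=n)
y = 2.0 + 0.5 * x + rng.normal(0, 1, size=n)

# Fit the model: point estimates of the population parameters
X = sm.add_constant(x)                 # design matrix with an intercept column
results = sm.OLS(y, X).fit()

print(results.params)                  # estimated intercept and slope
print(results.conf_int(alpha=0.05))    # 95% confidence intervals (uncertainty)
print(results.pvalues)                 # p-values for testing H0: coefficient = 0
```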
In statistical inference there is uncertainty in any estimate. The methods we are going to see rely on repeated sampling from a given probability model, an approach known as the parametric bootstrap: we simulate the stochastic process assumed to have generated the data, repeatedly drawing samples under identical conditions.
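A minimal parametric bootstrap sketch, assuming a normal model for the data (the sample, the number of replicates `B`, and the estimator of interest are all illustrative choices):

```python
# Parametric bootstrap: fit a model, resample from it, and study the estimator.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=50)   # observed sample (simulated here)

# Step 1: fit the probability model to the data
mu_hat, sigma_hat = data.mean(), data.std(ddof=1)

# Step 2: repeatedly simulate samples from the fitted model under identical conditions
B = 2000
boot_means = np.empty(B)
for b in range(B):
    sim = rng.normal(loc=mu_hat, scale=sigma_hat, size=len(data))
    boot_means[b] = sim.mean()                   # recompute the estimator each time

# Step 3: summarize the uncertainty of the estimate
se = boot_means.std(ddof=1)                      # bootstrap standard error
ci = np.percentile(boot_means, [2.5, 97.5])      # percentile 95% interval
print(f"mean = {mu_hat:.3f}, SE = {se:.3f}, 95% CI = {ci}")
```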
# Models
A statistical model is a mathematical representation of the relationship between variables in a dataset. These models are used to describe, explain, and predict phenomena, as well as to make inferences about the populations from which the data are collected. Statistical models are fundamental in fields such as economics, biology, engineering, and sociology.
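As a concrete illustration (with notation chosen here for the example), a simple linear regression model specifies a systematic part and a random part:

$$
y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2), \qquad i = 1, \dots, n,
$$

where $\beta_0$, $\beta_1$, and $\sigma^2$ are unknown population parameters to be estimated from the sample.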