1. Applying conditional probability
Welcome back! In the previous lesson, we explored how multivariate distributions help us analyze interactions between business variables through joint probability, the likelihood of two or more events occurring together. Now, we'll take this a step further by introducing conditional probability.
2. Understanding conditional probability
Conditional probability, written P(B|A), is the probability of an event B occurring given that another event A has already happened. In business contexts, this is incredibly valuable when analyzing customer behavior, market trends, and risk assessments.
For example, if we know that a customer has viewed a product page,
3. Understanding conditional probability
what is the probability that they will make a purchase? By conditioning on known information, we can make more precise decisions and improve strategic planning.
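In notation, this is P(B|A) = P(A and B) / P(A). As a minimal sketch, here is how that calculation might look in Python for the product-page example; the probabilities used are illustrative assumptions, not real data.

```python
# Conditional probability: P(purchase | viewed) = P(viewed and purchased) / P(viewed)
# All numbers below are illustrative assumptions.

p_viewed = 0.40                # P(A): customer viewed the product page
p_viewed_and_purchased = 0.06  # P(A and B): viewed the page AND purchased

p_purchase_given_viewed = p_viewed_and_purchased / p_viewed
print(f"P(purchase | viewed) = {p_purchase_given_viewed:.2f}")  # 0.15
```

Conditioning restricts attention to the customers who viewed the page; within that group, the purchase rate is 15%, a much more actionable number than the 6% joint probability across all customers.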
4. Real-world applications of conditional probability
Conditional probability comes up all the time in real business situations.
For example, in customer retention analysis, a subscription service might want to know: what's the chance a customer will renew their subscription given that they've used premium features recently?
In fraud detection, banks often ask: how likely is a transaction to be fraudulent given unusual spending patterns?
And in supply chain management, companies may want to estimate the chance of a delay given certain conditions, like bad weather or a supplier's track record.
5. Revising probabilities with Bayes' theorem
One of the most important applications of conditional probability is Bayes' Theorem, which provides a systematic way to revise probabilities in light of new information or evidence.
Bayes' Theorem combines prior probabilities, which represent initial beliefs or assumptions about an event, with the likelihood, which is the probability of observing the new data under those assumptions.
6. Revising probabilities with Bayes' theorem
This results in posterior probabilities, which are updated beliefs that take both the prior and the new evidence into account.
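In symbols, Bayes' Theorem states that P(H|E) = P(E|H) × P(H) / P(E), where P(E) = P(E|H) × P(H) + P(E|not H) × (1 − P(H)). Below is a minimal sketch of one such update in Python; the hypothesis H and all of the numbers are made up for illustration.

```python
# A minimal Bayes' Theorem update: posterior = likelihood * prior / evidence.
# The hypothesis H and all numbers below are illustrative assumptions.

prior = 0.30            # P(H): initial belief that H is true
p_e_given_h = 0.80      # likelihood P(E | H): chance of evidence E if H is true
p_e_given_not_h = 0.20  # P(E | not H): chance of evidence E if H is false

evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)  # P(E)
posterior = p_e_given_h * prior / evidence                      # P(H | E)
print(f"posterior P(H | E) = {posterior:.3f}")  # about 0.632
```

Here the evidence is more likely under H than under its alternative, so the update pushes the belief upward from the 30% prior to about 63%.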
7. Bayesian modeling and decision-making
Bayes' Theorem serves as the foundation for Bayesian modeling, where probabilities are treated as assumptions that can be updated continuously as new data becomes available. This makes Bayesian modeling particularly useful when dealing with uncertainty and evolving information.
In machine learning and artificial intelligence, Bayesian models are used to enhance predictive accuracy and to manage uncertainty and incomplete data. They power spam filters, recommendation systems, and natural language processing by continuously refining probability estimates based on user interactions.
By dynamically updating probabilities, Bayesian modeling allows systems to adapt and improve their decision-making over time.
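As a rough sketch of that dynamic, the loop below applies the same Bayes update repeatedly, feeding each posterior back in as the next prior; the hypothesis and the rates are invented for illustration.

```python
# Sequential Bayesian updating: each posterior becomes the prior for the
# next observation. All numbers are illustrative assumptions.

def update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H | E) after observing one piece of evidence E."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / evidence

belief = 0.10  # initial prior belief in hypothesis H
for observation in range(1, 4):  # three observations of the same evidence
    belief = update(belief, p_e_given_h=0.7, p_e_given_not_h=0.3)
    print(f"belief after observation {observation}: {belief:.3f}")
# belief climbs from 0.10 to roughly 0.21, 0.38, then 0.59
```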
8. Example: fraud detection
A bank implements a Bayesian model to assess whether a credit card transaction is fraudulent. Based on industry data, the bank starts with a prior probability that only 2% of transactions are fraudulent.
When a new transaction occurs, the system looks for unusual spending behavior, which is much more common in fraudulent transactions. The data shows that 85% of fraudulent transactions exhibit unusual spending behavior. Only 5% of legitimate transactions show the same behavior.
Using Bayes' Theorem, the model updates its probability of fraud given this new evidence. As a result, the fraud probability for this transaction jumps from 2% to 25.8%, an increase of over 12 times!
This means the bank should flag this transaction for review, as it is now much more likely to be fraudulent. By continuously updating fraud probabilities with new transaction data, the system improves over time, reducing false positives while catching more real fraud cases.
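As a check on those figures, the snippet below reproduces the calculation using the rates quoted in the example.

```python
# Fraud example from the lesson: update P(fraud) after seeing unusual spending.

p_fraud = 0.02                 # prior: 2% of all transactions are fraudulent
p_unusual_given_fraud = 0.85   # 85% of fraudulent transactions look unusual
p_unusual_given_legit = 0.05   # 5% of legitimate transactions look unusual

# Total probability of observing unusual spending, P(unusual)
p_unusual = (p_unusual_given_fraud * p_fraud
             + p_unusual_given_legit * (1 - p_fraud))

# Bayes' Theorem: P(fraud | unusual)
p_fraud_given_unusual = p_unusual_given_fraud * p_fraud / p_unusual
print(f"P(fraud | unusual spending) = {p_fraud_given_unusual:.3f}")  # about 0.258
```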
9. Let's practice!
Let's first put conditional probability into practice with some exercises!