The Analytics Power Hour – Episode 285
Title: Our Prior Is That Many Analysts Are Confounded by Bayesian Statistics
Date: November 25, 2025
Hosts: Michael Helbling, Moe Kiss, Tim Wilson
Guest: Michael Kaminsky (Co-CEO of Recast)
Overview
In this lively episode, the team dives deep into Bayesian statistics, exploring its growing influence in analytics and confronting the challenge many analysts face when approaching Bayesian methods versus classical (frequentist) frameworks. Special guest Michael Kaminsky returns to demystify Bayesian thinking, debunk common misconceptions, and illuminate how Bayesian approaches can lead to more nuanced, practical business decisions. Expect baseball metaphors, Enigma machine stories, and an honest exploration of real-world business analytics.
Key Discussion Points & Insights
1. What is Bayesian Statistics? (02:19–05:42)
- Historical Context:
- Many perceive Bayesian statistics as a "new" approach. However, Michael Kaminsky clarifies it's actually the original form of probability theory, predating frequentist methods popularized by R.A. Fisher in the early 20th century.
- Bayesian statistics’ recent resurgence is tied to advances in computational power.
- Kaminsky’s Explanation:
- “Bayesians tend to think from simulations... build a model that describes the world... and compare the implications of that simulated world with actual data... and then try to learn about some parameters of interest.” – Michael Kaminsky [03:21]
- Practical Framing:
- Moe invokes Cassie Kozyrkov’s description of Bayesian inference as “just a best guess,” but Kaminsky offers a refinement: Bayes is about rigorously combining all available evidence to reach the best estimate, not just blind guessing. [05:42–08:17]
2. Bayesian vs. Frequentist: Why the Confusion? (08:17–16:19)
- Frequentist World:
- Frequentist inference excels in contexts like agriculture (where random sampling is feasible) and A/B testing.
- Core frequentist assumptions, such as representative random samples, can break down in business scenarios (e.g., when you have all the available data or non-random samples).
- Kaminsky:
- “Frequentist statistics work really, really well when you are in that scenario, when you are taking random samples from a large population... There's a lot of baked-in assumptions... But there's a lot of other questions... [where] they don't really fit naturally into that framework.” [10:01–13:22]
3. Sampling, Priors, and Stakeholder Involvement (13:54–24:38)
- Challenges with Sampling:
- Moe and Tim probe challenges when you can’t draw random, representative samples – a common business scenario.
- Kaminsky: Frequentist methods struggle in these cases, but Bayesian modeling simply requires you to define how your data arose—enabling analysis even with biased or incomplete sampling.
- Stakeholder Knowledge as Priors:
- Incorporating domain expertise is central in Bayesian work: “As a scientist... we want to start by talking to the domain experts...” [21:59]
- The iterative, reality-check process—“tweaking the model” until results make sense—is, as Kaminsky puts it, “a Bayesian process (done badly).” [23:00]
4. Intuitive Examples: Baseball & Poker (16:19–26:26)
- Baseball Analogy:
- Kaminsky's Favorite: Predicting a player's batting average after one good game combines observed data (4/4 batting) with broad knowledge that most MLB players’ averages are between .200 and .350.
- Tim’s guess of ".300" is, in fact, Bayesian reasoning: combining new evidence with prior beliefs (a small worked sketch appears after this list). [18:01–20:35]
- “Humans do Bayesian reasoning constantly... your mind is doing this all the time.” – Michael Kaminsky [19:58]
- Poker and Expertise:
- Knowing the probabilities in Texas Hold ‘em and updating beliefs as cards are revealed is a quintessential Bayesian thought process.
- The more domain knowledge you have, the stronger (and better calibrated) your priors.
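The batting-average reasoning above maps onto a textbook Beta-Binomial update. A minimal sketch, assuming an illustrative prior chosen to roughly match the “.200 to .350” range Kaminsky describes (the specific prior parameters and numbers here are not from the episode):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative prior: Beta(54, 146) has mean ~0.27 and puts most of its mass
# roughly between .200 and .350, matching "typical MLB batting averages".
prior_alpha, prior_beta = 54, 146

# Observed data: 4 hits in 4 at-bats from one good game.
hits, at_bats = 4, 4

# Conjugate Bayesian update: posterior is Beta(alpha + hits, beta + misses).
post_alpha = prior_alpha + hits
post_beta = prior_beta + (at_bats - hits)

posterior = rng.beta(post_alpha, post_beta, size=100_000)
print(f"Naive estimate from data alone: {hits / at_bats:.3f}")    # 1.000
print(f"Posterior mean (prior + data):  {posterior.mean():.3f}")  # ~0.28
print(f"95% credible interval: {np.percentile(posterior, [2.5, 97.5]).round(3)}")
```

Even after a perfect game, the posterior mean sits near .280 rather than 1.000, which is essentially Tim’s “.300” intuition expressed as a calculation.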
5. Simulation: Why Bayes is Powerful Now (26:26–29:26)
- Simulation as Analysis:
- Bayesian analysis often means running lots of simulations (“simulate and count”) to find answers where analytical solutions are tough or impossible.
- Growth in computation makes this feasible: “What I like about Bayesian statistics is that you don’t need to memorize all the different rules... we just simulate...” (a small simulate-and-count sketch follows below) [27:12–29:26]
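As a sketch of the “simulate and count” idea, here is a toy question answered by simulation instead of a remembered formula. The scenario and numbers are invented for illustration, not taken from the episode:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical question: if the true conversion rate is 3%, how likely is it
# to see 40 or more conversions among 1,000 visitors?
# Instead of recalling the right tail-probability rule, we simulate and count.
n_sims, visitors, true_rate = 200_000, 1_000, 0.03

conversions = rng.binomial(n=visitors, p=true_rate, size=n_sims)
prob = (conversions >= 40).mean()
print(f"P(>= 40 conversions | rate = 3%) ~= {prob:.3f}")
```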
6. Bayesian Analysis in History: The Enigma Machine (29:26–34:15)
- World War II Story:
- Moe describes how breaking the Enigma code involved updating beliefs about letter patterns based on partial knowledge and trying out “best guesses” (cribs)—a real-world application of Bayesian reasoning.
- Kaminsky: “We're just going to look for patterns and... throw all of the compute at the problem... That's a very Bayesian feel...” [31:44–33:24]
7. Bias, Transparency, and Analyst Influence (34:15–40:36)
- Does Bayes Let You “Rig” Results?
- Both frequentist and Bayesian approaches allow analyst bias; transparency about priors and about analytical choices is critical.
- “There is no procedure that will prevent that. If you think we have to figure out the procedure that's going to prevent [manipulation], throw it out—never gonna happen.” – Michael Kaminsky [34:52–38:24]
- The best analysts, regardless of method, are up front about their assumptions.
8. Convergence, Credible Intervals, and Limits of Each Method (40:36–43:00)
- Convergence:
- When rigorously used, both Bayesian and frequentist frameworks usually provide similar answers.
- The real difference: Bayes can answer more types of questions, especially where simulations or structural models are needed (see the comparison sketch after this list).
- Notable quote: “Almost always you can get to the same frequentist results with a Bayesian approach... but you can't always go the other way.” – Michael Kaminsky [42:03]
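A small sketch of the convergence point, using a made-up conversion-rate example: with a flat prior and a reasonable sample size, the Bayesian credible interval and the frequentist confidence interval land in nearly the same place.

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up data: 120 conversions out of 1,000 visitors.
conversions, visitors = 120, 1_000
p_hat = conversions / visitors

# Frequentist: normal-approximation 95% confidence interval.
se = np.sqrt(p_hat * (1 - p_hat) / visitors)
freq_ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: flat Beta(1, 1) prior, posterior Beta(1 + 120, 1 + 880),
# 95% credible interval taken from posterior draws.
posterior = rng.beta(1 + conversions, 1 + visitors - conversions, size=100_000)
bayes_ci = np.percentile(posterior, [2.5, 97.5])

print(f"Frequentist 95% CI:            ({freq_ci[0]:.3f}, {freq_ci[1]:.3f})")
print(f"Bayesian 95% credible interval: ({bayes_ci[0]:.3f}, {bayes_ci[1]:.3f})")
```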
9. Examples Where Bayesian Shines (43:00–46:50)
- Multilevel Models:
- Bayesian modeling handles hierarchical (e.g., school/class/student) models much more flexibly; a small prior-simulation sketch follows this list.
- “The minimum sample size of a Bayesian analysis is zero samples... We’re just going to simulate and see what happens...” [43:00–46:50]
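A minimal sketch of the “zero samples” point, assuming an invented school/class/student hierarchy: with no observed data at all, you can still simulate from the priors (a prior predictive simulation) and sanity-check whether the implied world looks plausible. The structure and prior choices below are illustrative, not from the episode.

```python
import numpy as np

rng = np.random.default_rng(3)

# Prior predictive simulation for a hypothetical school -> class -> student
# hierarchy of test scores, with zero observed data points.
n_schools, classes_per_school, students_per_class = 5, 4, 25

# Illustrative priors for each level of the hierarchy.
school_means = rng.normal(loc=70, scale=5, size=n_schools)
class_means = rng.normal(loc=school_means.repeat(classes_per_school), scale=3)
scores = rng.normal(loc=class_means.repeat(students_per_class), scale=8)

# With no data, we can still ask what the priors imply and whether it is plausible.
print(f"Simulated mean score: {scores.mean():.1f}")
print(f"Simulated score range: {scores.min():.1f} to {scores.max():.1f}")
```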
10. Real-world Business Application & Decision Making (46:50–54:29)
- Incorporating Domain Knowledge:
- Tim: Businesses often err by assuming “the data will just give me the answer,” ignoring the value of stakeholder knowledge. [46:50]
- Kaminsky: Good analytics starts with domain expertise—executives intuitively use Bayesian reasoning to poke holes in analyses that don’t fit their experience. [48:09]
- A/B Testing and Decision-Making:
- Bayesians think a lot about uncertainty and regret—helping businesses design tests that match their true goals.
- The Bayesian framework allows flexible decision rules, including for phenomena like the “winner’s curse” in experimentation (a decision-rule sketch follows this list). [51:12–54:29]
- “We don’t have to have the perfect AB test... We can do a bunch of other stuff that isn’t that and still have a very reasonable business outcome...” – Michael Kaminsky [53:11]
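A sketch of the decision-rule idea, using made-up A/B test results: instead of asking only “is the result significant?”, a Bayesian decision rule can compare the expected regret (expected loss) of shipping each variant and ship whichever has less. The numbers and thresholds here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# Made-up test results: conversions / visitors for each variant.
a_conv, a_n = 110, 2_000
b_conv, b_n = 128, 2_000

# Posterior draws for each variant's conversion rate (flat Beta(1, 1) priors).
a_post = rng.beta(1 + a_conv, 1 + a_n - a_conv, size=200_000)
b_post = rng.beta(1 + b_conv, 1 + b_n - b_conv, size=200_000)

# Probability B beats A, and the expected regret of each shipping decision.
prob_b_better = (b_post > a_post).mean()
regret_ship_a = np.maximum(b_post - a_post, 0).mean()  # loss if A ships but B was better
regret_ship_b = np.maximum(a_post - b_post, 0).mean()  # loss if B ships but A was better

print(f"P(B > A) ~= {prob_b_better:.2f}")
print(f"Expected regret if we ship A: {regret_ship_a:.5f}")
print(f"Expected regret if we ship B: {regret_ship_b:.5f}")
```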
Notable Quotes
- On Bayesian Reasoning:
- “It’s updating your beliefs. Right. That is the most simple way I try and rationalize this.” – Moe Kiss [20:35]
- “Experts can have a lot better guesses about future outcomes and a lot stronger beliefs because of all of that expertise that they have.” – Michael Kaminsky [25:45]
- On Simulation:
- “If you can simulate it, you can generate a statistical analysis with it.” – Michael Kaminsky [42:41]
- On Bias and Transparency:
- “There is no way that we can design an analysis procedure that will prevent an analyst from shaping the results.” – Michael Kaminsky [34:52]
- “An analysis is an argument, not truth. And... it’s up to the reader... is it a good argument or a bad argument?” – Michael Kaminsky [38:14]
- On Business Application:
- “You should design your decision-making approach based on what you’re trying to achieve as a business. And this... is another place where I think Bayesians do really well. Bayesians care a lot about decision making.” – Michael Kaminsky [51:12]
- On the Limits of Methods:
- “The minimum sample size of a Bayesian analysis is zero samples. You can do a totally reasonable Bayesian analysis with zero samples.” – Michael Kaminsky [44:35]
Key Timestamps
| Time  | Segment/Insight                                          |
|-------|----------------------------------------------------------|
| 02:19 | Intro to Bayesian statistics and its history             |
| 05:42 | “Best guess” vs. rigorous evidence combination           |
| 10:01 | Where frequentist methods are strong/weak                |
| 13:54 | Issues with sampling in business contexts                |
| 16:19 | Baseball Bayesian reasoning example                      |
| 18:01 | Applying prior knowledge to observed data                |
| 20:35 | “Updating your beliefs” as the core of Bayesian reasoning |
| 21:59 | Involving stakeholders/domain experts in analysis        |
| 24:38 | Poker and domain expertise as priors                     |
| 26:26 | Simulation as modern Bayesian power                      |
| 31:44 | Enigma machine/WWII decoding as Bayesian process         |
| 34:15 | Where bias enters analytics and how transparency helps   |
| 40:36 | Convergence between Bayesian and frequentist methods     |
| 43:00 | Multilevel models uniquely suited for Bayesian analysis  |
| 46:50 | Business practice: bringing domain knowledge into play   |
| 51:12 | Redefining A/B testing and business decision making      |
Memorable Moments
- Moe’s Playground Approach: Moe openly learns and unpacks the Enigma machine story on air, demonstrating how even non-experts can connect with Bayesian reasoning.
- Chicken Banana (65:03): Post-show banter veers delightfully into the joys of silly children’s music, showing the hosts’ camaraderie and light-hearted approach.
Additional Resources Recommended
- Books:
  - The Theory That Would Not Die – A recommended entry point to Bayesian thought [08:17]
  - Statistical Rethinking by Richard McElreath – Top pick for deeper learning [43:00]
  - The Rose Code by Kate Quinn (historical fiction re: Enigma, for general interest) [57:59]
- Articles:
  - Statistics For People In a Hurry by Cassie Kozyrkov (with audio reading) [57:59]
  - A Letter to a Young Person Worrying about AI by Cedric Patrick Chin [60:32]
Summary
This episode demystifies Bayesian statistics, cutting through its intimidating reputation and showing, with clear examples, humor, and honest discussion, how Bayesian thinking is both natural and increasingly essential in business analytics. The key takeaway: the real power of the Bayesian approach lies in its flexibility, transparency, and ability to incorporate both data and expert knowledge into coherent, actionable insights, especially in complex, real-world contexts where classical approaches often stumble.
End of Summary
