An Actuarial View of Correlation and Causation—From Interpretation to Practice to Implications

Link: https://www.actuary.org/sites/default/files/2022-07/Correlation.IB_.6.22_final.pdf

Excerpt:

Examine the quality of the theory behind the correlated variables. Is there good
reason to believe, as validated by research, the variables would occur together? If such
validation does not exist, then the relationship may be spurious. For example, is there
any validation to the relationship between the number of driver deaths in railway
collisions by year (the horizontal axis), and the annual imports of Norwegian crude
oil by the U.S., as depicted below? This is an example of a spurious correlation. It is
not clear what a rational explanation would be for this relationship.
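
As a rough illustration of the point, the sketch below (in Python, using made-up numbers rather than the data from the brief) shows how two series that merely trend together over the same years can produce a near-perfect Pearson correlation with no causal connection between them.

```python
# Hypothetical illustration: two unrelated series that both drift downward
# over the same decade show a strong Pearson correlation anyway.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, unrelated downward trends (illustrative numbers only).
driver_deaths = 80 - 5 * np.arange(10) + rng.normal(0, 3, 10)
crude_imports = 120 - 7 * np.arange(10) + rng.normal(0, 5, 10)

r = np.corrcoef(driver_deaths, crude_imports)[0, 1]
print(f"Pearson r = {r:.2f}")  # typically close to 1, despite no causal link
```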

Author(s): Data Science and Analytics Committee

Publication Date: July 2022

Publication Site: American Academy of Actuaries

Leaders: Stop Confusing Correlation with Causation

Link: https://hbr.org/2021/11/leaders-stop-confusing-correlation-with-causation

Excerpt:

A 2020 Washington Post article examined the correlation between police spending and crime. It concluded that, “A review of spending on state and local police over the past 60 years…shows no correlation nationally between spending and crime rates.” This correlation is misleading. An important driver of police spending is the current level of crime, which creates a chicken and egg scenario. Causal research has, in fact, shown that more police lead to a reduction in crime.
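
A small simulation makes the chicken-and-egg problem concrete. In the hypothetical setup below (synthetic data and assumed parameters, not real figures), spending responds to underlying crime, so the raw correlation between spending and crime comes out positive even though the built-in causal effect of spending on crime is negative.

```python
# Simultaneity sketch (synthetic data, assumed parameters): policy reacts to
# crime, so naive correlation hides a negative causal effect of spending.
import numpy as np

rng = np.random.default_rng(1)
n_cities = 500

baseline_crime = rng.normal(100, 20, n_cities)                 # underlying crime pressure
spending = 0.5 * baseline_crime + rng.normal(0, 5, n_cities)   # spending responds to crime
true_effect = -0.8                                             # assumed causal effect of spending
observed_crime = baseline_crime + true_effect * spending + rng.normal(0, 5, n_cities)

r = np.corrcoef(spending, observed_crime)[0, 1]
print(f"Correlation of spending and crime: {r:.2f}")  # positive, despite true_effect < 0
```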

….

Yelp overcame a similar challenge in 2015. A consulting report found that companies that advertised on the platform ended up earning more business through Yelp than those that didn’t advertise on the platform. But here’s the problem: Companies that get more business through Yelp may be more likely to advertise. The former COO and I discussed this challenge and we decided to run a large-scale experiment that gave packages of advertisements to thousands of randomly selected businesses. The key to successfully executing this experiment was determining which factors were driving the correlation. We found that Yelp ads did have a positive effect on sales, and it provided Yelp with new insight into the effect of ads.
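
The sketch below is a minimal illustration of why that randomized design works, using hypothetical data and an assumed effect size rather than anything from the Yelp study: random assignment breaks the link between already getting business and choosing to advertise, so a simple difference in means recovers the effect of the ads.

```python
# Randomized-assignment sketch (hypothetical data, not Yelp's): with random
# assignment, a difference in means estimates the causal effect of the ads.
import numpy as np

rng = np.random.default_rng(2)
n_businesses = 10_000

baseline_sales = rng.lognormal(mean=3.0, sigma=0.5, size=n_businesses)
gets_ads = rng.random(n_businesses) < 0.5      # random assignment of the ad package
assumed_ad_effect = 5.0                        # hypothetical true lift from ads
sales = baseline_sales + assumed_ad_effect * gets_ads + rng.normal(0, 2, n_businesses)

estimate = sales[gets_ads].mean() - sales[~gets_ads].mean()
print(f"Estimated ad effect: {estimate:.2f}")  # close to the assumed 5.0
```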

Author(s): Michael Luca

Publication Date: 5 Nov 2021

Publication Site: Harvard Business Review

Causal design patterns for data analysts

Link: https://emilyriederer.netlify.app/post/causal-design-patterns/

Excerpt:

One antidote to this is true experimentation in which treatment is randomly assigned within the homogeneous target population. Experimentation, particularly A/B testing, has become a mainstay of industry data science, so why does observational causal inference matter?

Some situations you cannot test due to ethics or reputational risk

Even when you can experiment, understanding observational causal inference can help you better identify biases and design your experiments

Testing can be expensive. There are direct costs (e.g. testing a marketing promotion) of instituting a policy that might not be effective, implementation costs (e.g. having a tech team implement a new display), and opportunity costs (e.g. holding out a control group and not applying what you hope to be a profitable strategy as broadly as possible)

Randomized experimentation is harder than it sounds! Sometimes experiments may not go as planned, but treating the results as observational data may help salvage some information value

Data collection can take time. We may want to read long-term endpoints like customer retention or attrition after many years. When we wish we could read results from an experiment that wasn't launched three years ago, historical observational data can help us get a preliminary answer sooner

It’s not either-or but both-and. Due to the financial and temporal costs of experimentation, causal inference can also be a tool to help us better prioritize what experiments are worth running
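
For a concrete sense of what observational causal inference can look like with historical data, the sketch below runs a difference-in-differences comparison on synthetic data with an assumed effect size: the change over time in a treated group is compared against the change in an untreated group, which nets out both the shared trend and the pre-existing difference between groups.

```python
# Difference-in-differences sketch (synthetic data, assumed effect size):
# compare the before/after change in a treated group against the change in an
# untreated comparison group to estimate the effect without an experiment.
import numpy as np

rng = np.random.default_rng(3)
n = 1_000

treated = rng.random(n) < 0.5                        # e.g. regions where a policy rolled out
before = 50 + 10 * treated + rng.normal(0, 5, n)     # treated group starts higher (selection)
common_trend = 4.0                                   # shared drift affecting everyone
assumed_effect = 3.0                                 # hypothetical true policy effect
after = before + common_trend + assumed_effect * treated + rng.normal(0, 5, n)

did = (after[treated] - before[treated]).mean() - (after[~treated] - before[~treated]).mean()
print(f"Difference-in-differences estimate: {did:.2f}")  # close to the assumed 3.0
```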

Author: Emily Riederer

Publication Date: 30 January 2021

Publication Site: Emily Riederer