One of the key challenges in creating an analytical organization is getting quants and execs on the same page. More often than not, it is “Execs are from Venus, Quants are from Mars.”

So both groups need to approach problems from the other’s perspective in order to bridge the gap. Here’s how:


You have the right answer. I know that. If you’re any good, you have a business case driving to an NPV or IRR (if you don’t, let’s discuss that in another post). If you’re really good, you’ve done some sensitivity testing. Bottom line, your quantitative analysis probably points to a reality like, “From a profitability standpoint, we are getting killed on ear/nose/throat surgeries, but our orthopedics guys are making hay. Let’s do more ortho and scale back ENT!”
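For readers who want the mechanics behind “a business case driving to an NPV” with sensitivity testing, here is a minimal sketch. The cash flows and discount rates are hypothetical illustrations, not figures from any real service line.

```python
# Minimal NPV calculation with a simple sensitivity test across discount rates.
# All numbers below are hypothetical, for illustration only.

def npv(rate, cash_flows):
    """Net present value of yearly cash flows.
    cash_flows[0] is the upfront (year-0) amount, typically negative."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical five-year cash flows for expanding a service line
cash_flows = [-500_000, 150_000, 180_000, 200_000, 220_000, 240_000]

# Sensitivity test: how does the NPV move as the discount rate varies?
for rate in (0.06, 0.08, 0.10, 0.12):
    print(f"discount rate {rate:.0%}: NPV = {npv(rate, cash_flows):,.0f}")
```

Running the rate across a plausible range is the simplest form of the sensitivity testing mentioned above; the same loop works for any other input you want to stress.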

True or not, from an executive standpoint there may be any number of barriers to taking action, and you serve your purpose (if your purpose is to increase profitability, anyway) by getting out ahead of them. At worst, asking an exec about those barriers before jumping in with a recommendation will help your cause. At best, do that qualitative legwork in advance, or align yourself with someone who understands the barriers intuitively. Common sense, right? Yet we’ve all been in a conference room watching an exec’s eyes glaze over as we present clear and present facts. Sometimes it is because he knows certain medical procedures fit a broader strategy play and doesn’t want to have the dialog…and yes, that IS in fact your problem if you have an agenda and the numbers aren’t making the case.


You understand the business. I know that. If you’re any good, you understand the “real” consequences of what your quants are recommending, and you know that if you simply plow forward based on the numbers, things get missed, and there are unintended outcomes your instincts are honed to anticipate. The data model doesn’t always reflect reality exactly, and you’re not the only one who thinks so…all the more reason to govern with analytics as an input but seasoned experience as the final say.

Correct as that may be, these factors are all variables, and by capturing them better and at least trying to quantify them, we all get smarter. You don’t want to change vendors for a raw material input because your brand relies on a perception of quality? Fine, let’s quantify that.

Has marketing done a proper (and not back-of-the-envelope) analysis of your customers’ sensitivity to quality, and have they correlated that material input to quality? If you use a local vendor, do your customers even know? If you switch, what level of attrition is reasonable to expect? Is that built into the model? Over-analysis becomes a liability when it is used as an excuse to delay or avoid a decision, but you know better.
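To make “is that built into the model?” concrete, here is a hedged sketch of folding an estimated attrition rate into the vendor-switch decision. Every number (revenue, margin, savings, attrition rates) is a hypothetical placeholder for what marketing’s research would actually supply.

```python
# Hypothetical sketch: cost savings from a cheaper vendor, offset by margin
# lost to customers who defect over perceived quality. Numbers are illustrative.

def profit_impact_of_switch(revenue, margin, cost_savings, attrition_rate):
    """Annual profit change from switching vendors:
    input-cost savings minus the margin on revenue lost to attrition."""
    retained_revenue = revenue * (1 - attrition_rate)
    return retained_revenue * margin + cost_savings - revenue * margin

baseline_revenue = 10_000_000   # hypothetical annual revenue
gross_margin = 0.30
vendor_savings = 400_000        # hypothetical savings from the cheaper vendor

# Sensitivity: at what attrition rate does the switch stop paying off?
for attrition in (0.00, 0.05, 0.10, 0.15):
    delta = profit_impact_of_switch(baseline_revenue, gross_margin,
                                    vendor_savings, attrition)
    print(f"attrition {attrition:.0%}: profit impact = {delta:+,.0f}")
```

With these illustrative inputs, the switch breaks even somewhere between 10% and 15% attrition; the point is that an intuition (“our customers care about quality”) becomes a threshold the team can research and debate.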

The point is to develop a habit of getting your quants to model what you know intuitively, so that you’re bridging the gap. Rather than telling them (rightly, in many cases) that the models aren’t good enough, translate your experience into something that can be researched and confirmed.

Then both teams get smarter.