
So you’re a small business or a team within a larger business.  You have hiring responsibility for your team and some level of P&L responsibility.  You’ve heard for the past 5-10 years nothing but “data, data, data,” and “information overload,” and “competing with analytics,” but you don’t know where to start.

(Or, you know someone who is in the above situation and you forward this post to them so they can read it in the first person!)

To keep the blog post short, let’s keep to why, who, and how.

Why hire a data analyst?

Why would you hire a data analyst?  Because they have the power to make your life easier, your team smarter, and your business more effective by filling gaps in most information workers’ skillsets.  Data analysts are used to working in the trenches, resolving real information issues and getting you to the first horizon, the hallowed “One Version of the Truth.”  I cannot describe in words the power of everyone on the team working from the same set of assumptions, not just for historical “truths” but also for granular forecasts.

The right data analyst can then take you beyond those truths into predictive analytics, drawing relationships out of messy data by eliminating the noise.  They build better business cases, and they apply statistics actively to drive tangible business results.  If the "why" is still a question, comment below, and maybe we can tease out additional posts.

Who do I hire?

I am biased.  I think that your first hire should come out of a data shop within a well-respected consulting firm.  I cut my teeth at A.T. Kearney, and even though I am out of the game now, there are many who had similar training to me, and KPMG has a similar shop.  In fact, many consulting firms are running to catch up, and their high-flyers should all be in scope.  Find a Manager in the group, one who managed teams, managed projects, sold work, hired analysts, gave talks at schools.  These people are out there, and you can get them.

When you interview them, ask them to lay down a vision for how they would work with your team and which problems they would prioritize.  Tell them about your team and engage in a meaningful dialog.  You would be surprised how many leaders are out there, and you need one.

But make sure that the person with whom you are speaking has the soft side and the hard side covered.  Probe to make sure the person does in fact understand the business impact of the projects on which they worked and is not just running numbers for analysis' sake.

So then what (the How)?

Once you bring someone in, you have a simple deal to make.  You need them to demonstrate value in the immediate term by starting small and delivering something meaningful.  But your end of the bargain is executive support.  Your new data analyst will require your patience if business processes need changing.  You may need to tell one of your finance guys that they do in fact need to abandon an Excel-based forecasting system in favor of working with your new analyst to build the forecast assumptions into the new process that everyone can access.

It isn’t an easy transition to being a data-savvy organization, but your competition is trying, so you may as well get on it, and hopefully the brief advice above is a noble start.


I am having trouble explaining the magnitude of my excitement about the federal government's massive entrées into data transparency. In no time at all, Vivek Kundra has helped Obama and Co. rewrite the rules around the way the public sector looks at information and accountability.

Take www.usaspending.gov. For years, we consultants have worked with organizations public and private to address spending. "Strategic Sourcing," as the phrase was coined by A.T. Kearney veterans, runs into the same barrier 9 times out of 10. Someone in purchasing has a bad contract, and the client is paying 50% more than they should be for staples or laptops or asphalt for roads, but for one reason or another, they don't want to shake that contract. In the public sector, the problem is 10x worse because public sector employees are not as directly incentivized by the bottom line. They are, however, motivated by public opinion.

So by posting all these contracts online, we get a chance to see that the "Cheyenne Mountain Complex/Integrated Tactical Warning/Attack Assessment," a defense contract out to Lockheed Martin for $26.1M in 2009 spending (and it would be great to know the total contract value), has a contract variance of 167.84% and an average of 120 days late per milestone. I am not saying we don't need this project, but I am pleased that its variance to plan is online and that as more citizens clue into these variances, those responsible for their delivery will no doubt tighten the reins, where before they could proceed business-as-usual, the public none the wiser.
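For the curious, the arithmetic behind a figure like that is simple. Here is a minimal Python sketch of how a variance-to-plan percentage and an average milestone slippage might be computed; the inputs below are invented for illustration, not the actual usaspending.gov numbers.

```python
# Hypothetical illustration of variance-to-plan metrics. The figures are
# invented, not the real Cheyenne Mountain contract data.

def cost_variance_pct(planned: float, actual: float) -> float:
    """Percent by which actual spend exceeds the planned amount."""
    return (actual - planned) / planned * 100

def avg_days_late(milestones: list[tuple[int, int]]) -> float:
    """Average slippage across (planned_day, actual_day) milestone pairs."""
    return sum(actual - planned for planned, actual in milestones) / len(milestones)

print(cost_variance_pct(planned=10.0, actual=26.78))        # ~167.8% over plan
print(avg_days_late([(30, 150), (90, 210), (180, 300)]))    # 120 days late on average
```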

Add to this excitement my boundless energy for data.gov, a perhaps more ambitious project over which plenty of blog ink has been spilt, and by better bloggers than I at that. The core of the concept is taking what information is available and public…and making it available, public, and accessible. What really matters to me is that there are massive implications for management consultants, data analysts, and the quantitative community at large who are paying attention. Entire businesses can and will be built around taking this now-accessible information, digesting it, and making it useful for businesses. Our President and our Federal CIO have taken the first step, and now it is our task to take the next.

What an exciting time it becomes as a result.

The analytics community is a powerful but fragmented one, and that is both good and bad. Good in the sense that for all intents and purposes, every meaningful challenge in the business community has already been solved in one place or another. Bad in the sense that no one of us shows up Monday morning with any more than perhaps 30% of that collective toolset at the ready…at best.

So what do we “quant blue shirts” do when we encounter intractable business problems in our management consulting lives? We reach out to our limited network, and we cobble together the best solution we can given what’s been done before, periodically innovating in the margins and typically producing incremental value add. Tight project timelines don’t often allow for truly white space analytical thinking, but we get by alright. Thrillin’ and billin’, right?

These tight timelines, however, have always made management consultancies poor at collecting and leveraging intellectual capital. More devastatingly, there is a culture, particularly among top tier management consultants, of creating but not using intellectual capital. If rewards are given for creating the next great thing, scorn is cast on those poor plebs who, the next week, use what was discovered on another project. And so IC is cast into the abyss of some arcane, mystic IC tracker, like the warehouse from Raiders of the Lost Ark, never to be seen again, while next month's IC award goes to someone in the Milan office who unknowingly created from scratch a tool or approach that was solved years ago in the Tokyo office (or perhaps in the Milan office itself).

I would posit for the group that crowd-sourcing analytics is one answer. When a cagey problem arises, it should immediately and without scorn be thrown to the community, who can and will consume it like a swarm of locusts. The collective memory and capability of the community is as sinewy and agile as Ajax (the Homeric Greek, not the language, though perhaps both apply?). The rule of the road will be, "solve problems in your wheelhouse (yours and others') and throw the rest to the group."

I would take this a step further, however, and lay down the claim that the academic community needs to be closer to these problems. (Yes, I said it.) I think the "community" should include students. I remember, as a student, flying through Multivariable Calculus exams that today would make my head spin, and I know that my peers are nodding when it comes to other quantitative methods like econometrics or applied statistics. Even a shaky solution to a simple consumer polling challenge could be corrected with swift precision by a freshman undergrad who happens to be on that chapter.

And why would they log on in the first place? There’s no better way to understand how to apply their academic concepts than to apply them, and there is something appealing about seeing your own wizardry at work in the real world. Plus, students want jobs, and they want jobs they like. Log onto a site from time to time and interact with these blue shirts–you’ll see if it’s for you, if you like the work in the trenches and if you like the people, the pace, the culture.

For you skeptics, I imagine there are two major hurdles, so let's hash them out. First, everyone wants to hoard and silo IC, and we even have issues around postings of protected intellectual property. Clearly there needs to be an understanding around client confidentiality. Our clients don't want their analytical laundry aired, and consulting firms don't want their chief value distributed. Both of these, I believe, are manageable. Don't post anything confidential, and don't post anything protected. Most challenges are abstract enough that I feel we can cross that bridge when we get there. And if the problem originates in the community, the solution is one from the community. Information wants to be free anyway, and the resistance of the realists out there is eroding.

The second challenge is that if we blue shirts making hay are leveraging insight from the academic community as well as our peer community, there could emerge the perception that consultants are looking for free labor, getting paid for other people’s insight. I don’t think this is the point, though I acknowledge it is something for which to look out. The point is to create a symbiotic community. Students get closer to real world challenges and areas to apply their skills and talents, as well as a forum outside of recruiting for interacting with those people with whom they may later choose to work. There’s no better way to understand the work that consultants are doing than to dig in on some of their problems and interact with the personalities. Not everyone will want to get involved, but that’s true of anything, so let’s put it out there and see who picks it up…

One of the key challenges to creating an analytical organization is to get quants and execs on the same page. More often than not, it is “Execs are from Venus, Quants are from Mars.”

So both groups need to approach problems from the other’s perspective in order to bridge the gap. Here’s how:

Quants:

You have the right answer. I know that. If you're any good, you have a business case driving to an NPV or IRR (if you don't, let's discuss in another post). If you're really good, you've done some sensitivity testing. Bottom line, your quantitative analysis probably points to a reality like, "From a profitability standpoint, we are getting killed on ears/nose/throat surgeries, but our orthopedics guys are making hay. Let's do more ortho and scale back ENT!"
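For the concrete-minded, here is a minimal sketch (in Python, with invented numbers) of the kind of business case and sensitivity testing I mean; the ENT-to-ortho figures, horizon, and rates below are hypothetical placeholders, not anyone's real model.

```python
# A hypothetical business case: shift surgical capacity from ENT to ortho,
# then test how sensitive the NPV is to the discount rate and to the assumed
# incremental margin. All figures are invented for illustration.

def npv(rate: float, cash_flows: list[float]) -> float:
    """Discounted sum of cash flows; cash_flows[0] is the year-0 outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

upfront_cost = -500_000          # retooling, recruiting, marketing (year 0)
base_annual_margin = 220_000     # incremental margin from the ortho shift

for margin in (0.8 * base_annual_margin, base_annual_margin, 1.2 * base_annual_margin):
    for rate in (0.08, 0.10, 0.12):
        flows = [upfront_cost] + [margin] * 5   # five-year horizon
        print(f"margin={margin:>9,.0f}  rate={rate:.0%}  NPV={npv(rate, flows):>10,.0f}")
```

If the NPV stays positive across the whole grid, the recommendation is robust; if it flips sign, you know exactly which assumption to pressure-test with the execs.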

True or not, from an Executive standpoint, there may be any number of barriers to taking action, and you serve your purpose (if your purpose is to increase profitability, anyway) by getting out ahead of these factors. At worst, asking an exec about those barriers prior to jumping in with a recommendation would help your cause. At best, do that qualitative legwork in advance, or align yourself with someone who understands them intuitively. Common sense, right? Yet we’ve all been in a conference room watching an exec’s eyes glaze over as we present clear and present facts. Sometimes it is because he knows certain medical procedures fit a broader strategy play, and he doesn’t want to have the dialog…and yes, that IS in fact your problem if you have an agenda and the numbers aren’t making the case.

Execs:

You understand the business. I know that. If you’re any good, you understand the “real” consequences of what your quants are recommending, and you know that if you simply plow forward based on the numbers, things get missed, and there are unintended outcomes that your instincts are honed to anticipate. The data model doesn’t always exactly reflect reality, etc., and you’re not the only one who thinks so…all the more reason to govern with analytics as an input but seasoned experience as the final say.

Correct as that may be, these factors are all variables, and by capturing them better and at least trying to quantify them, we all get smarter. You don’t want to change vendors for a raw material input because your brand relies on a perception of quality?  Fine, let’s quantify that.

Has marketing done a proper (and not hand-to-mouth) analysis around your customers’ sensitivities to quality, and have they correlated that material input to quality? If you use a local vendor, do your customers even know? If you switch, what level of attrition is reasonable to expect? Is that built into the model? Now over-analysis is a liability if it is used as an excuse to delay or avoid a decision, but you know better.
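As a sketch of what "let's quantify that" can look like, here is a rough Python back-of-the-envelope for the vendor switch, with placeholder numbers standing in for what marketing's research would actually supply:

```python
# Hypothetical sketch: putting numbers on the "brand relies on perceived
# quality" objection. Switching raw-material vendors saves money, but may
# cost some customers. All inputs below are invented placeholders.

annual_savings_from_switch = 400_000                  # negotiated input-cost reduction
annual_revenue = 12_000_000
contribution_margin = 0.35
expected_attrition_rates = [0.00, 0.005, 0.01, 0.02]  # scenarios from customer research

for attrition in expected_attrition_rates:
    lost_margin = annual_revenue * attrition * contribution_margin
    net_benefit = annual_savings_from_switch - lost_margin
    print(f"attrition={attrition:.1%}  lost margin={lost_margin:>9,.0f}  net={net_benefit:>9,.0f}")

# The break-even attrition rate is often the most useful single number.
break_even = annual_savings_from_switch / (annual_revenue * contribution_margin)
print(f"switch destroys value above ~{break_even:.1%} customer attrition")
```

Now the debate is about whether attrition could plausibly reach that break-even level, which is a researchable question rather than a staring contest.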

The point is to develop a habit of trying to get your quants to model what you know intuitively, so that you’re bridging a gap. Rather than telling them (rightly, in many cases) that the models aren’t good enough, translate your experience into something that can be researched and confirmed.

Then both teams get smarter.

There has been some debate as of late around the progress and relevance of agile business intelligence (http://agileintelligence.blogspot.com/, http://exceptionalgeeks.com/bi-curious/2009/10/22/bi-release-management/, http://www.analyticbridge.com/profiles/blogs/agile-business-intelligence, http://herdingcats.typepad.com/my_weblog/2009/10/an-strawman-argument-for-agile-project-management.html, among others).

What makes the debate interesting to me is more in the weeds and less in the ether, particularly as it applies to smaller organizations rather than massive roll-outs: Can you start with nothing on Monday and show someone something USEFUL on Wednesday? (I am not even giving you until Friday.)

If the answer is, “Well, it depends,” then forget it. Take a different approach.

I am not advocating abandoning all planning in favor of piling up a list of reports to be run together. Rather, I would say that you need to start VERY small, VERY targeted, and show value immediately.

Monday:

Find your most important stakeholder. Sit her down, and ask about decisions she makes, whether tactical, operational, strategic, whatever. Fill a whiteboard digging into the decisions she makes. For each decision, try to understand what information she uses to make that decision, and what the availability of that data is.

If you’re any good at this (perhaps I should say “once you’re any good at this”), you’ll find a really important decision that requires data that “has to be somewhere” but that she doesn’t use currently. Perhaps she buys cases of wine for her restaurant every couple weeks based on a walk-through of the cellar, and she eyeballs it and just fills the shelf on gut. Fine.

Tuesday:

Spend Tuesday pulling data out of the available systems. If there's an inventory system, great. If there's POS data (there is always something), pull it down and dig through it. Use Excel. Use SQL Server. Use R. Beat that data up using whatever you're most comfortable with. Build out a picture of historical wine sales volume by day/week/month and by varietal and by region and by distributor and by whatever information is available. Use all dimensions. Spend the morning compiling all that information, and book time with your stakeholder mid-afternoon to discuss. Yes, it is possible—just get it done.
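If it helps to picture what "beat that data up" looks like, here is a minimal pandas sketch, assuming a hypothetical POS extract (pos_sales.csv) with columns like date, sku, varietal, region, distributor, and bottles_sold; your column names and source will differ.

```python
import pandas as pd

# Hypothetical POS extract; file and column names are assumptions for illustration.
sales = pd.read_csv("pos_sales.csv", parse_dates=["date"])

# Historical volume by week, varietal, region, and distributor -- use all dimensions.
weekly = (
    sales
    .assign(week=sales["date"].dt.to_period("W"))
    .groupby(["week", "varietal", "region", "distributor"], as_index=False)
    ["bottles_sold"].sum()
)

# A couple of quick cuts to bring to the mid-afternoon session.
by_varietal_month = (
    sales
    .assign(month=sales["date"].dt.to_period("M"))
    .groupby(["month", "varietal"])["bottles_sold"].sum()
    .unstack("varietal")
)

print(weekly.head())
print(by_varietal_month.tail(6))
```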

When you get that session, talk through all the data with her. Excel and a projector—keep asking, "What in this would help you make that decision better?" The key is to use her expertise and yours. Perhaps you know a little about (or have googled around about) economic order quantity and can apply a simple formula to generate a re-order quantity report by SKU. Perhaps she knows that vendors like to consolidate purchases at certain levels, and you build that adjustment into your metrics. Land on something absolutely useful, absolutely simple. When she says, "yeah, but it doesn't work that way," do NOT ignore her. Ask why. Drill. Incorporate some complexity, and ask her about how the process could change. That is what whiteboards are made for.
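For instance, here is a small Python sketch of that EOQ-style re-order quantity by SKU, including the kind of case-size adjustment she might suggest; every input below is hypothetical.

```python
import math

def eoq(annual_demand: float, order_cost: float, holding_cost: float) -> float:
    """Classic economic order quantity: sqrt(2DS/H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def order_quantity(annual_demand: float, order_cost: float, holding_cost: float,
                   case_size: int = 12) -> int:
    """EOQ rounded up to the vendor's case size -- the adjustment her
    knowledge of vendor behavior adds to the raw formula."""
    q = eoq(annual_demand, order_cost, holding_cost)
    return math.ceil(q / case_size) * case_size

# Hypothetical SKU inputs: (annual bottles sold, cost per order, holding cost per bottle per year)
skus = {
    "pinot_noir_oregon": (900, 15.0, 2.50),
    "malbec_mendoza": (400, 15.0, 1.75),
}

for sku, (demand, s, h) in skus.items():
    print(f"{sku}: re-order {order_quantity(demand, s, h)} bottles per order")
```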

Wednesday:

Build out the report. Structure the data so that your report can pull it quickly, easily, and correctly. I still prefer the ROLAP model, but I am not precious; the report is. Schedule that report into her inbox for the morning when she re-orders, and then build out the simplest, most straightforward ETL package to pull JUST the data needed for that report. Leverage a bus model so you can expand later, but keep it simple.
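As one possible shape for that, here is a deliberately tiny ETL sketch in Python against a hypothetical SQLite source; in practice this might be an SSIS package or whatever your shop runs, scheduling and email delivery come from your environment (SQL Agent, cron, and so on), and the table and column names are invented.

```python
import sqlite3

# Extract: pull ONLY the columns the re-order report needs, nothing more.
# Source table and column names are hypothetical stand-ins.
SOURCE_QUERY = """
    SELECT date, sku, bottles_sold
    FROM pos_sales
    WHERE date >= date('now', '-90 days')
"""

with sqlite3.connect("restaurant.db") as conn:
    rows = conn.execute(SOURCE_QUERY).fetchall()

    # Transform: trailing-90-day demand by SKU, the only input the report uses.
    demand = {}
    for _, sku, qty in rows:
        demand[sku] = demand.get(sku, 0) + qty

    # Load: overwrite a narrow reporting table that the emailed report reads from.
    conn.execute("DROP TABLE IF EXISTS reorder_report")
    conn.execute("CREATE TABLE reorder_report (sku TEXT PRIMARY KEY, demand_90d INTEGER)")
    conn.executemany("INSERT INTO reorder_report VALUES (?, ?)", demand.items())
```

Keeping the package this narrow is what lets you expand along the bus later without re-plumbing the first report.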

You did it right if she actually uses that report or metric to do her job better, faster, and smarter than before that report existed. If she does, you win. If not, you need to get better at this.

Why it matters:

It is easy to lose sight of the real goal when pulling together reports, metrics, dashboards, and everything else. It is most often the business stakeholder who just wants to see a massive, sexy, real-time dashboard, but trophy BI projects don’t make businesses smarter. Focus on decision points and information that make those decisions smarter, and build those out first. It is a LOT slower to build out your BI infrastructure one report at a time, but if your first 5 reports are the 5 most impactful reports you ever build, it won’t matter. 100 reports that no one uses to make decisions are a waste of time and money.

So Agile BI? You bet. It would be rude not to.

