Thursday, May 28, 2009

We're Moving

We have moved - the blog is now hosted at PureStone.Wordpress.com.


Friday, May 1, 2009

Customer Lifecycle Value

Depending upon how well you know your business, a great discussion to have somewhat regularly is whether the customer lifecycle value is increasing or decreasing. To answer that, we need to know a few things...
  • How much has the customer purchased from us?
  • How long are they likely to stay with us?
  • What does it cost us to serve them?
None of these are necessarily easy questions to answer, but that does not mean we should not talk about them. Worst case, you should at least be looking at the average revenue and cost per client and how those are changing. They are probably pretty good indicators of lifecycle value. If we look at the trends of our revenues, costs (COGS & SG&A), and profits per customer, this should certainly indicate whether we are doing better or worse.
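To make that "worst case" view concrete, here is a rough Python sketch of tracking average revenue, cost, and profit per customer period over period. The transaction records, field layout, and period labels are made up purely for illustration, not a prescription for any particular system.

from collections import defaultdict

# Hypothetical transactions: (period, customer_id, revenue, cost)
transactions = [
    ("2008-Q4", "C001", 1200.0, 700.0),
    ("2008-Q4", "C002", 800.0, 500.0),
    ("2009-Q1", "C001", 1400.0, 750.0),
    ("2009-Q1", "C003", 600.0, 450.0),
]

def per_customer_trend(rows):
    """Average revenue, cost, and profit per customer, by period."""
    by_period = defaultdict(lambda: {"revenue": 0.0, "cost": 0.0, "customers": set()})
    for period, customer, revenue, cost in rows:
        bucket = by_period[period]
        bucket["revenue"] += revenue
        bucket["cost"] += cost
        bucket["customers"].add(customer)

    trend = {}
    for period in sorted(by_period):
        bucket = by_period[period]
        n = len(bucket["customers"])
        trend[period] = {
            "avg_revenue": bucket["revenue"] / n,
            "avg_cost": bucket["cost"] / n,
            "avg_profit": (bucket["revenue"] - bucket["cost"]) / n,
        }
    return trend

for period, stats in per_customer_trend(transactions).items():
    print(period, stats)

If average profit per customer is trending down while average cost per customer is trending up, that is the conversation to have.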

While most of us do this to some degree, we probably also throw in a great many more variables and business rules and end up discussing various concepts. What about getting all the department heads together once a month or once a quarter to discuss progress on only these items?

Thursday, April 23, 2009

Setting Targets

Setting targets for Performance Indicators should be well thought through. This should not be an exercise in looking at the historical average (unless that is specifically relevant) and then applying a 10% desired increase. You will want to review history, but you need to understand the goal. It is also important to define the KPI clearly.

For example, let's use the retail market's practice of targeting sales against sales last year. Retail has traditionally looked at this on a daily basis, as well as rolled up to the week, month, quarter, and year. I have two primary concerns with this:
  1. If the weather was bad, we ran a promotion, or some other contributing factor was in play, we may not know it and are not really comparing apples to apples. Additionally, what if last year was really bad? Beating that number doesn't do much for us.
  2. If we are reviewing this on a daily basis, we lose institutional knowledge due to the repetition. What if we miss a day? Is there any repercussion? What if we miss three days in a row? What if we miss 10 days out of 14? Were there enough days of good performance in there to hide the fact that a trend is occurring?
What would make more sense to me would be to look at this number as a rolling average, or take the total sales for the last 365 days divided by 365, recalculated daily. Here we can very quickly identify a positive or negative trend, because we don't have to look at numbers that swing wildly by day of the week. Instead of talking about a couple of bad days, we understand that even though we had a couple of bad days, the overall trend is above the goal. We can also integrate our sales goal and show it relative to the trend line.
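A minimal sketch of that trailing-average calculation, assuming nothing more than a date-ordered list of daily sales totals (the sample numbers and the 7-day window are just for illustration; in practice the window would be 365):

def trailing_average(daily_sales, window=365):
    """Trailing average: total sales for the last `window` days divided by `window`.

    daily_sales must be ordered by date, one total per day. Returns one value
    per day once a full window of history is available.
    """
    averages = []
    running_total = 0.0
    for day, sales in enumerate(daily_sales):
        running_total += sales
        if day >= window:
            running_total -= daily_sales[day - window]  # drop the day that fell out of the window
        if day >= window - 1:
            averages.append(running_total / window)
    return averages

# Toy example with a 7-day window so the output is easy to inspect.
sample = [100, 90, 110, 95, 105, 300, 40, 120, 98, 102]
print(trailing_average(sample, window=7))

A single bad day barely moves the trailing average, but a sustained dip shows up as a clear downward trend against the goal line.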


Tuesday, April 21, 2009

The Value of Scorecarding

One of my first Scorecard exercises is one of my favorites.  It taught me a great deal about the power of scorecarding.  

I did what I suspect most people do.  I interviewed all the VPs and developed a long list of KPIs.  I then used an Excel spreadsheet to organize the KPIs, with the KPIs down the rows and the VPs across the columns.  Then, to help visualize the data, I placed "red" cells where a VP was directly impacted by a KPI and "yellow" cells where the VP was indirectly related.  I did not intend the colors to mean anything other than to call out the items relevant to each of the VPs.

By choosing "red" and "yellow," I had each of the VPs concerned that they were underperforming in each of those areas.  I had to explain the reason for the colors a number of times.
  • The first lesson was that by associating colors with performance, I clearly had the attention and focus of the executives on this team.  It sparked a number of very strong conversations about performance.
  • The second lesson was that communication is just as important.  By doing a less than stellar job of communicating the information (at least in a visual sense), I wasted a tremendous amount of time that should have been spent on more strategic discussion.
Scorecarding can be a very powerful tool, but it needs to be used appropriately.  

Tuesday, April 14, 2009

Business Intelligence vs Business Analytics


There is a growing debate over Business Intelligence vs. Business Analytics and what the future holds.  Clearly the Business Intelligence world has been shaken, with Hyperion, Business Objects, and Cognos all now smaller parts of bigger companies.  This has created a number of marketing opportunities for the likes of MicroStrategy and SAS.  The obvious marketing play was independence.  Now it is clear that SAS is taking a slightly different tack by claiming that Business Intelligence is dead and the future is Analytics.

Marketing messages aside, what we need to be focusing upon is how we use information and the management process around it.  Call it data, information, intelligence, analytics, or whatever we come up with next; it is all irrelevant if we don't understand how to use it.  A basement full of great tools doesn't mean the house stays maintained.
  • Do you have rules on when to use the specific tools in the BI suite?
  • Do your people have the analytical skills required?
  • Do you have a process where the information can be discussed and actions agreed upon?
We all agree that organizations need to make fact-based decisions.  The other thing we should all be working on is creating a common vernacular for each of the tools.  As analysts, consultants, pundits, and bloggers, we do little good if we don't teach the value of how to use each of the tools.  You don't need predictive analytics for an exception report.  You don't need sexy-looking reports that do little to explain the goal.  Organizations don't need real-time scorecards.

What organizations do need are ways to make people comfortable taking decisive action.  We also need these actions to align with company goals and strategy.  The tools we use need to be consistent enough for us to trust them, and the minds that analyze them need to be able to use the tools well enough to communicate only what matters in a digestible presentation.


Thursday, April 9, 2009

Scorecard or Business Fact Sheet

A common Scorecard design is to list a bunch of business facts - how many customers, total square feet, total employees, inputs, etc.  While these can be important business facts that executives need to know, they may not be manageable numbers.  Adding them to the scorecard takes up valuable real estate and misdirects focus.

As you are thinking through your scorecard design, take some time to consider if an item is a REAL KPI, or just a business fact.  Then design the scorecard to focus on objectives with potential links to business fact report(s).


Friday, April 3, 2009

Initiative Performance Indicators (IPIs)

I have made the argument that Key Performance Indicators and Key Risk Indicators are really the same thing, yet a nuance worth discussing is initiative management.  We launch new factories, new products, training programs, marketing material, etc. all the time, yet often do a sub-optimal job managing the projects.  And execution waters down further as we try to manage the portfolio.

Even though initiatives are different from performance indicators, we need to account for their management within the same framework.  We need to understand our objectives, priorities, resource constraints, milestones, etc. in order to manage the business more proactively and achieve our strategic goals.  We need to enhance our ability to discuss progress toward our goals (both annual and strategic) and how all the KPIs and initiatives are working together to achieve that end.