Monday, September 05, 2011

The Cloud Opportunity for Storage

Executive Summary

Strategic analysis suggests that in the maturing hard drive manufacturing business, there is a significant and strategically important opportunity for manufacturers to integrate forward into services for both the consumer and business markets, while simultaneously offering outsourced storage management and services to businesses that offer cloud storage.
A major advantage of offering a cloud-based backup service to consumers is the annuity-like nature of the revenue stream. And because hard drives can be bundled with an optional or prepaid service offering, the sales costs of the associated backup services are likely to be lower. A $100 hard drive with hard-won profitability can be converted into an upfront purchase plus annual service revenue that could run in the $10-40 range. Alternative revenue models might include an annual, service-based pricing model with an on-site hard drive as part of the offering; in other words, charge for the service and give away the hard drive.
Incremental revenues from hard drive restoration represent an additional opportunity.
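To make the economics concrete, here is a back-of-the-envelope sketch comparing the two revenue models described above. The $100 drive price and the $10-40 service range come from the text; the margins and the five-year service life are illustrative assumptions, not industry figures.

```python
# Rough comparison of a one-time hardware sale versus hardware plus a backup
# service annuity. The $100 drive price and $10-40 service range come from the
# text above; margins and the five-year service life are illustrative assumptions.
drive_price = 100.0
drive_margin = 0.05            # assumed thin hardware margin
service_fee_per_year = 25.0    # midpoint of the $10-40 range
service_margin = 0.40          # assumed service margin
service_life_years = 5         # assumed useful life of the drive

hardware_only_profit = drive_price * drive_margin
bundle_profit = hardware_only_profit + service_fee_per_year * service_margin * service_life_years

print(f"Hardware-only profit per drive: ${hardware_only_profit:.2f}")
print(f"Drive plus service profit over {service_life_years} years: ${bundle_profit:.2f}")
```

Under these assumptions, the service annuity contributes several times the profit of the hardware sale alone, which is the heart of the argument for forward integration.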

Introduction

One thing is obvious about storage. Most consumers can't tell the difference between storage offerings. In other words, it's a classic commodity business where the lowest-cost producer will win.
But as industries mature, companies need to reexamine their value proposition and ask whether the business model has changed or could change. What are the adjacent businesses that represent a low-risk opportunity to add revenues but, more importantly, are sufficiently complementary to create a differentiated advantage?
With consolidation having already occurred in the hard drive business and two dominant players remaining, Western Digital and Seagate, how might the future evolve?
One alternative would be to accept the carving up of the market. Competition between WDC and Seagate would be comparable to the classic end game in many markets where two dominant players squeeze out smaller competitors. Coke and Pepsi come to mind. Apple iOS and Google Android. Windows and Linux.
But there are alternative evolutionary models that could emerge. One traditional weakness of Silicon Valley companies is that they tend to be product oriented. They often are uncomfortable with services and solutions. But as products become more mainstream, businesses often wish to outsource, and consumers on average become less knowledgeable than the initial adopters. Another driving issue is that as markets become more competitive, vertical integration often presents advantages. For example, in the highly competitive computer business, the current success story, Apple, has vertically integrated to create an ecosystem that includes content ownership (Steve Jobs' ownership position in Disney as a result of the sale of Pixar), online and physical retailing, devices (Macs, iPads, iPhones, iPods) and services (MobileMe, cloud services). This approach is a far cry from the traditional horizontal model where the PC industry consisted of horizontal competitors (Intel in processors, Microsoft in operating systems and productivity software, WDC and Seagate in hard drives, SanDisk and Samsung in flash memory).
One way of looking at the problem is a task oriented approach. A hard drive vendor is not selling hard drives, rather it is selling a solution to a particular problem.
For example, most computer users have been faced with the problem of their system becoming unreliable and needing to be restored to a previous working state. While Windows provides the ability to roll back the operating system to a previous version, the roll back is less comprehensive than having a snapshot of the full hard drive that can be restored. Restoration is no trivial problem, particularly for heavy users of PCs who may have dozens of pieces of software to reinstall. Reinstalling software on a machine can take several days, a costly exercise for an individual and even more costly for organizations with multiple employees unless they have set up facilities for automating backup and restoration.
A second example is music. An avid music collector may have 20,000 pieces of music. Replacement might well cost $20,000. And while additional hard drives are inexpensive, backup tends to be unreliable and recent material gets missed. Mindless and consistently reliable backup has value in protecting that investment.
A third example is memorabilia. Digital photos can easily be lost with a high emotional cost to an individual.
A fourth example is record keeping. Even individuals today often find their financial records stored on unreliably backed up computers.

Cloud Storage

Clearly, cloud storage is an area that is being targeted by many companies. Amazon, Apple, Google, Microsoft and Adobe, to name a few of the more visible players, are offering free or relatively low-cost storage. The intent in most cases is lock-in of the customer relationship as part of an overarching product strategy or, in some cases, a services and advertising strategy.
It's easy to predict that price pressures in this market will be extreme, leading to efforts by cloud storage providers to reduce their costs. But as in all areas of IT, it is the total cost of ownership (TCO) that matters. One way of reducing TCO will be to outsource storage to the low-cost producer. And the most capable manufacturer of hard drives is likely to be well positioned to vertically integrate and offer outsourced storage to these large players. A secondary advantage of vertical integration for a manufacturer is the resulting increase in production volume.
If a hard drive manufacturer integrates forward into services, the additional volume will lower its manufacturing costs. Because the two dominant hard drive vendors operate with low margins, it takes a significant effort to increase volume and thereby improve or maintain margins. Vertical integration represents a significant opportunity with likely first mover advantages.
The outsourced storage option can be structured in several ways. For large cloud operators, the outsourcing might involve management of in-house storage. For other businesses and consumers, a more traditional cloud storage services model would work. There would also be opportunities for premium storage with additional security attached.

First Mover Advantage

Hard drive manufacturers are highly specialized organizations with value chains that normally ignore service opportunities. Adding the capabilities needed for forward vertical integration requires new focus, new skills and new management. But accepting the opportunity, and understanding that services will drive additional volume, means that these skills need to be added.
The first to adopt this strategy will quickly gain additional volume. Additional volume lowers costs and makes the forward vertical integration more profitable, or enables faster acquisition of downstream customers. As is normal in learning curve businesses, competitive advantage goes to those who move quickly. What is perhaps less obvious is that the slower the rate of cost improvement, the more necessary it is to seek innovative ways to increase the cost advantage.

Does Downside Exist?

At first thought, pursuing vertical integration might seem to run the risk of alienating large buyers who are already offering cloud services. But if the storage offering is positioned as being available to cloud services and lowering their costs in addition to being available as a consumer service, the cost advantage should be compelling. Owning and managing the storage at a company site also offers a way of capturing hardware volume and minimizing communications costs.
The company that probably represents the largest risk is Google. Its use of a proprietary file system and in-house assembly of servers means that it is likely to be the most resistant to outsourcing the management of its storage.

Tuesday, August 30, 2011

Resource Allocation and Innovation at HP


HP's recent decision to sell or spin off its low margin PC business represents a classic problem in strategy, to which there are different points of view.

Option 1: Sell the Dogs
In the early years of modern business strategy (the 60s and 70s), the conventional wisdom was to perform a portfolio analysis and examine the characteristics of your portfolio of opportunities. A common feature of such portfolio models, made popular by BCG and McKinsey, was the decision to sell off the “dogs” in the portfolio. The dogs were typically characterized as low or negative growth businesses with poor profitability or cash flow.

Option 2: Rejuvenate the Dogs
Over the years, portfolio analysis has leaned away from this first generation view of portfolios and encompassed the idea that mature businesses can be rejuvenated and transformed into more profitable businesses through innovation, increased value added, new business models, tackling untapped adjacent markets, or through expansion into adjacent businesses.
Many have criticized the HP decision to move more in the direction of a services business and imitate the move IBM made earlier away from its personal computer business to a services and solution orientation.

The Value of News About a Company
Others have criticized HP for the clumsy and value-destroying way in which it announced its decision to exit the PC, tablet and mobile business. As Regis McKenna, PR guru to Steve Jobs and Apple board member, has pointed out in his writings, news about a company is as much a part of a product and the perception of its brand and ecosystem value as product information is. This is clearly a lesson that HP has not understood as well as Apple.

But the decision to focus on corporate services, rather than having two lines of business, one focused on business customers and the other on consumer technology and services, shows the importance of thinking about the future, values and vision.

It’s pretty hard to imagine an HP that does not attempt to compete with IBM by offering services and solutions. The EDS acquisition was clearly part of that strategy.

Adjacent and Downstream Innovation
What surprised me personally was that HP would walk away from the products of the digital household. I have two reasons for being surprised:

First, increasingly product innovations for business are born in the consumer sector, where consumers often adopt technologies more quickly than businesses. Businesses are constrained by issues of standardization, support and security. By nature they are often more conservative in their adoption. CD-ROM drives, smartphones and Skype all represent examples of technologies adopted much earlier by consumers. Without a presence in the consumer end of the market, HP will miss many trends.

Second, the consumption of digital devices in the home is likely to be a large and growing area that will encompass many areas outside of HP's traditional strengths in printers and PCs. There are many dollars at stake. Yes, the barriers to entry are lower and there are many consumer electronics competitors that HP would have had to face, but the need to develop the design and user interface skills that Apple has demonstrated so ably would have been a significant benefit to HP in all its businesses. The ability to do deals to build ecosystems would also have been a core strength in a world where integration matters.

Centralized vs. Decentralized Innovation
Bob Sutton at Stanford’s engineering school has written about the success of Apple being tied to a centralized command structure and a small product line. 

Reference: http://bobsutton.typepad.com/my_weblog/2011/08/5-warning-signs-apple-has-started-to-slip.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+typepad%2FBobsutton%2Fmy_weblog+%28Bob+Sutton%29

HP's history is one of decentralized innovation, in a way a far more difficult culture to create and maintain. The loss of the PC business and all the downstream consumer innovations that it would inevitably have seeded may not be the most damaging consequence of this spin-off. Rather, it is the move away from strong decentralized innovation, and the impact on HP's culture, that may be the biggest loss.

What Motivated HP’s Decision?
Without being party to the discussions on this decision, it’s hard to know which of two factors was more important in the decision.

One factor may have been the capital cost of maintaining leadership in the business services market. HP's core profit center – printers and ink – is certainly under pressure from competition and from the substitution of devices and web servers for paper. If you can download to a computer or tablet, you print less. A networked world needs less paper.

A second factor may be the comfort zone of the new CEO. CEOs come to the boardroom table with areas of comfort and discomfort. The rejuvenation of a consumer business is no small task and requires many strong innovators. It would likely take time.


It would be a terrible shame if the decision to spin out the PC business were driven by the unrealistic time expectations of the stock market, and even worse if it were motivated by the way in which senior management was incented. 

Perhaps the best we can hope for is that HP's PC division will be spun out and its new management team, newly refocused, can pursue innovation, unbound by the trade-offs of portfolio analysis in a business with radically different business models.

Sunday, August 28, 2011

Strategies for Sharing

OK, so I use lots of different social networking and sharing tools. At some point, you have to ask yourself:

1. Am I using too many tools?
2. Am I spending too much time with them to the detriment of real work?
3. Has the world changed so that my personal brand deserves attention, much as a marketing budget and sales activities are required for larger businesses?

The Situation
I mainly use LinkedIn, Facebook, my blog (which you are reading), and my two web sites, one business and one personal (www.eclicktick.com and www.alistairdavidson.com).

For research purposes, I track interesting pages with Evernote. With multiple computers, I use Dropbox for exchanging information between machines and with third parties.

What are my decision rules?

First, I try to use new technologies to understand them. That was one of the reasons for putting up my web sites. A particular challenge on my web sites is navigation, so I have experimented with different navigation approaches for accessing photographs and poems on my personal web site. I use a graphical map, categorization by theme, and ratings of quality. It's not social networking, rather it is the classic role of the editor.

Currently, I am experimenting with Twitter, Klout, Tweettronics, Gist and Hootsuite.

Second, I try to distinguish between content with "reader appeal" and content that is narrowly of interest to me. General appeal content I will put up on Facebook. More obscure material that I may use as a future reference for a project or an area I work in, I will store in Evernote. Longer articles, e.g. from consulting firms, I download as PDF files and store in a directory for future sharing. Based upon the number of Likes and comments I receive, I suspect that I put too much up on Facebook.

Third, most of my own writings get put up on my business web site, www.eclicktick.com under Free White Papers.

Fourth, I put very little content up on LinkedIn. LinkedIn strikes me as a very salesy site. There are a lot of people showing their wares. I use Twitter, linked to LinkedIn, to highlight my blog posts, on the theory that if I bother to write something on my blog, I would like to think I have something to say that may be of interest to others. I also cross-post to Facebook.

While Twitter receives a lot of press in the mainstream media, what is most surprising about Twitter is how few people have many followers and how many people have almost no one following them. Using Tweettronics to look at followership, 40% of Twitter authors have fewer than 15 followers. And if you have several hundred people following you, you are likely in the top 10% of Twitter authors.

As to the question of how much time to spend, I suspect that, like all marketing problems, it boils down to short term and long term outcomes. It's surprising how an article or piece of content can generate business years down the road. This lag makes measurement of a personal brand and sales effectiveness difficult. But for the short and medium term, tracking followership, reactions from readers, and inquiries about business or, for example, job proposals on LinkedIn are the only measures of whether your personal content strategy is working.

So, if you are reading this post, what are your decision rules? Are you randomly posting and tweeting? Do you have a content or segment strategy for your personal promotion? What decision rules are you using for allocating your time to different tools and topics?


Saturday, August 27, 2011

The Importance of Execution in Music and Content Sales

I am a great admirer of Amazon.com. I spend far too much there. I use Amazon.com almost exclusively for buying music in spite of also having a subscription to Best Buy's Napster.

One of the reasons I enjoy Amazon (and enjoyment is something many MBAs forget in their 40,000 foot view of strategy) is that when I visit Amazon, which I do every day, they offer free music that I might otherwise not listen to. To be frank, the hit rate is low. Few of the free songs end up in my playlists or my favorites list, but Amazon is a little like college radio: lots of coal and the occasional diamond (which sounds a lot like the addictive behavior of a variable ratio schedule of reinforcement, for the followers of behaviorist B. F. Skinner).

This past week, Amazon offered a new album by Barbra Streisand as a promotional item at $3.99. An extended version of the album with more songs was priced at $16.99. I decided against the album, figuring I could always listen to it under my Napster subscription where it would cost me nothing.

But in a rare experience, listening to it on Napster made me fall in love with the album (appropriate given that the album is very sentimental). So I decided to buy the album on Amazon because I prefer the download experience with Amazon - songs get stored in the cloud and the integration with iTunes is better.

But strangely, the extended version of the album did not show up on Amazon, just the shorter version, now back at its regular price of $9.99. It's quite rare that I cannot find something on Amazon, so I bought the extended album at Napster for $16.99.

It's a small example, but it does show the importance, and, perhaps, the difficulty of perfect execution, particularly in a software based retailer.

What are the take-aways from this experience?

1. Amazon does not just sell products. It focuses upon the experience of shopping by offering bargains and, implicitly, recommendations of good, or at least new, music.

2. Search tools matter. My inability to find the album I was looking for led to a rare lost sale for Amazon.

3. User interface and integration matter. My preference for Amazon is, in no small part, due to its seamless integration of music downloads with iTunes. I find the same integration with my Kindle downloads and Audible. The strength of my preference is reinforced by the ability to store my music in the Amazon cloud (though, to be fair, it has taken several months for music stored in the cloud to be reliably and automatically downloaded to my machine).

4. One of the small annoyances with Napster is that when music is downloaded, it does not end up grouped into a directory under the artist and album. Someone at Napster is not on their toes. I would be relatively indifferent between buying from Amazon and Napster if their download integration were comparable.

5. While Napster is a cloud based music service and exceptionally good value, its marketing people are not keeping up. Offering a service where music can only be downloaded once is not comparable to the Amazon offering where a buyer has the security of knowing his/her music collection is backed up in the cloud or redownloadable in the case of Kindle books or Audible audio books.

It's tough doing e-commerce. You not only need to have a good business model, you need to consider your search engine capabilities, your pricing, the experience you create with customers, user interface and integration issues. It's no wonder that many fail and the best pull ahead.

Friday, August 19, 2011

HP's Decision to Spin off its PC Division


HP’s decision to spin off or sell its PC division strikes me as reflecting badly upon both the strategy and execution of the business.

Consider the following facts:
  • HP is the largest technology company in the world with the leading market share in personal computers. As such it has potential economies of scale, economies of scope and buying power, and should have a cost advantage over its competitors.
  • HP is one of the few full-service computer providers that can provide a wide range of hardware to business buyers, presumably allowing a lower cost of sales and service.
On the negative side, HP’s branding is confused. I have never been clear on what HP’s brands stand for at the overarching level (what is the difference between Compaq brand and HP brand?), or at the product line level – there is very little successful branding to communicate different value propositions to different types of buyers.

Perhaps the biggest sign of problems with the HP brand is its lack of innovation. Why, for example, was HP so late to the tablet market? Why was its offering reviewed as slow and buggy? How can major companies launch buggy products without anticipating the consequences? Even if you are late to market, it's a bad idea to create negative word of mouth from early adopters and reviewers. It's negative viral marketing.

Perhaps the biggest condemnation of HP's innovation is to ask why it cannot even design an equivalent to the Apple MacBook Air. It's pretty obvious to anyone who has travelled with a computer that less weight and a smaller form factor are desirable. It was a need obvious before the MacBook Air was launched. It was obvious when the first generation was launched. It is even more obvious today with the second generation.

There are other reasons to be critical too. If a company like HP decides to compete against a large and dominant competitor such as Apple in the tablet market and it is not prepared to develop a superior product offering, it must anticipate a long uphill battle. To launch a product and then give up on it quickly suggests either that the company lacks a long term plan or that it seriously underestimated its competition. Even worse, in the consumer market, competition is based upon ecosystems. Lacking an explicit ecosystem strategy to compete with a dominant competitor’s ecosystem is a recipe for disaster.

HP claims that profitability is a major reason for spinning off the PC division. But profitability is under the control of management. If Apple can command premium prices for ease of use, for design, for its integrated ecosystem, why can't HP? Lack of profitability represents a failure to seek superior value and to obtain the pricing that superior value can command.

Research on new product innovations suggests that the single largest predictor of new product success is offering a differentiated high value product. HP seems not to have understood this lesson.

Perhaps the fundamental reason for HP's lack of success is explained by research on product managers. Some researchers suggest that companies with strong and senior product managers, who are both accountable and given authority, produce better products and bring them to market faster. In organizations where the product manager is less experienced, has little authority and plays a more coordinating role, products take longer to get to market and are of lower quality.

Monday, August 08, 2011

Public Speaking and Presentations: Ten Thoughts

I attended a presentation this week where two people spoke. The first had no slides, and his presentation seemed to be an ill-thought-out musing on his personal enthusiasm for his current role. The second bored us to death with an ill-conceived and disorganized presentation on his attractive-sounding software.

By way of background, I have done many presentations in my career, some good, some bad. I have also received media training and done both radio and TV interviews with the benefit of a PR person to coach me. And I, in turn, have trained and coached salespeople and distributors, and consulted to firms that develop sales and sales training materials, helping them develop and deliver presentations, videos and sales pitches.

So, let me offer some simple rules about presentations for those who have not received training.

1. Prepare. There are very few people in the world who can stand up and interest a crowd. You are probably not one of them. If you were, you would probably be on the stand-up comedy circuit. I think it was Voltaire who apologized to a client for the extreme length of a book. He explained that shorter takes longer. It's also true of presentations.

2. Figure out in advance what message you would like to implant in the audience. Most people immediately forget what they have read or heard. So, you need to figure out what number you want them to remember, what picture you want them to be able to reproduce, and what story you want them to be able to tell to their spouse, their boss, or their colleagues.

3. Don't assume that your audience knows what you know. Start with the basics so that even those in the audience who are not experts can get something out of the presentation.

4. Quantify. People are generally more interested in the implications of something than in the thing itself. I recently did some work with a major communications equipment vendor in the area of fixed mobile convergence. One of the exercises I did was to calculate the net present value over ten years of having voice calls diverted from congested cell towers to a WiFi connected call which went over the fixed broadband connection in the home or office. It turned out to be worth as much as $3.2B per million customers. (A simplified sketch of this kind of NPV calculation appears after this list.) Translating a general, unquantified benefit into a number gets people's attention. It's not always easy to come up with a number, but more often than not, you can. Personal benefits matter too. For many managers, not getting fired for making a poor purchasing decision is an important issue. So, if you are a minor player and the client has committed to using a well established vendor, don't compete directly; pursue an indirect method of entry into the company.

5. Showing people software is generally extremely boring. My belief is that if you can't sell a piece of software without showing it, you are not very good at selling. Show the ideas behind the software so people can understand what is going on. If you are going to do a demo, automate it, practice it or use a canned slide show. Don't waste people's attention and time. If you do, you will lose them -- they will drift off and play with their tablet or phone.

6. Don't be confused. Most people don't buy software because it is better. They buy it because of momentum. So talk about momentum, your ecosystem, your backward compatibility and product evolution. Your software may well be the best thing since sliced bread, but people tend to make decisions on technology momentum. If a technology is better but unsupported and lacking a long term future, being better does not matter unless it is orders of magnitude better. And if it is magnificently better, then it is likely only going to appeal to a niche.

7. Ask the audience questions. It gives you a chance to both wake them up and keep them involved. It also allows you to gauge their level of knowledge.

8. Don't talk about yourself, your programs and how you are personally excited about everything your company is doing. Excitement should be the result of your presentation, not a personal description of your emotional state.

9. Make the presentation easy to retrieve. Provide a web address from which it can be retrieved.

10. If you do a question and answer session, repeat the question before replying, so that those in the audience who could not hear the question have some idea of what you are replying to.
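As promised in point 4, here is a minimal sketch of the kind of ten-year net present value calculation described there. The per-customer saving and the discount rate are illustrative assumptions, not the figures from the engagement mentioned above.

```python
# Minimal NPV sketch for the quantification advice in point 4. The annual
# per-customer saving and the discount rate are illustrative assumptions,
# not the actual figures from the fixed mobile convergence engagement.

def npv(annual_cash_flow, years, discount_rate):
    """Net present value of a constant annual cash flow, discounted yearly."""
    return sum(annual_cash_flow / (1 + discount_rate) ** t for t in range(1, years + 1))

customers = 1_000_000
saving_per_customer_per_year = 50.0   # assumed network-cost saving, dollars
discount_rate = 0.10                  # assumed cost of capital
years = 10

value = npv(customers * saving_per_customer_per_year, years, discount_rate)
print(f"NPV over {years} years: ${value / 1e9:.2f}B per million customers")
```

Whatever the inputs, the structure is the same: an annual benefit, a time horizon and a discount rate turn a vague claim into a number an audience can remember.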

There are, clearly, many more guidelines, but these ten seemed particularly obvious after the presentation this week.

Friday, July 29, 2011

Stupid America, clever America? I suggest clever is better.

My late father, as international a person as you could ever meet, used to say that the United States contains the best and the worst. Today, the worst seems terribly visible. Stupidity seems to abound. Notoriety and popularity seem more important than knowledge and truth. Misguided belief seems more important than the scientific method.

For a living, I am a strategist. In simple terms, this means, I try to figure out where the best bang for the buck is. If you have five products and one is really a huge opportunity, it is common sense that you should be putting more resources against the big opportunity and less on past products – ones that you might be investing in from tradition and perhaps inertia.

If iPods are declining and iPhones are increasing, then you should spend more management attention and advertising on iPhones. If offering a solution, something that solves a customer problem, is better value for the customer, it makes little sense to merely offer a product and be surprised that you have missed pieces of the puzzle.

Management matters. It means prioritizing the important. It also means dealing with the truth, because companies, like marriages, if built on lies, rarely last. Management of a country matters too. And so does political leadership.

The Economic Environment Matters

It should hardly be controversial to Americans that the economic environment matters. In the 1970s and 1980s, Japan’s success led to many discussions of industrial policy. Today, in the US, industrial policy is a term that is never used. What people seem to forget is that the sum of all the investment decisions, investment encouragement by government, and the regulatory environment create a de facto industrial policy. Denying that it exists means that individual policies may end up acting in opposite ways and making growth difficult. Believing in free markets and their power should not mean denying that government, laws, policies and regulations exist and have a cumulative impact upon the birth of businesses.
Stupid people deny government exists and has an effect. Smart people think about optimizing.

The Healthcare Example

Some parts of America today remind me of stupid companies I have worked with. As a foreigner, but one who did both his degrees in the US, I expect that writing this article will result in rude comments from many Americans. When I have commented to some Americans that I thought the healthcare system in the US was, on average, inferior to those of other developed countries, I have received comments that, in their polite version, suggested I return to those countries. But well known researchers like Michael Porter and Elizabeth Teisberg would agree, as would George Halvorson, CEO of Kaiser Permanente, one of the most successful healthcare organizations in the US, that research shows the US system is incredibly expensive, wasteful and inconsistent.

But as a strategist with an international background, I know that countries, like companies, which only see their competition as being American always get into trouble. Like any large and successful country, many Americans wear a set of filters that often causes them to miss how the US looks from outside. Even worse, stupid Americans today, seem to arrogantly assume that other countries have nothing to teach America rather than recognizing, like clever Americans, that America has been the beneficiary, a melting pot of the best from other countries.

In discussions of American healthcare, I am continually astounded by people who tell me that the US has the best healthcare system in the world. Their evidence is always anecdotal. Actual research says that the US has wildly inconsistent healthcare delivered at high cost. They try to tell me that Canadians are constantly coming down to the US because they can't get treatment in Canada. (The data says otherwise.) But what they miss, and one would hope for better in a capitalist country, is that the US spends roughly $2.5 trillion, or 18% of GDP, on healthcare and produces worse outcomes than other developed countries. Even if the outcomes were equivalent, that spending is about twice the share of GDP of other developed countries, which spend from 6-12% of GDP. What this means is that a more effective healthcare system, comparable to those of developed countries with universal healthcare, would cost around a trillion dollars a year less. It would be the equivalent of a trillion dollar tax cut per year.
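A rough back-of-the-envelope check of the trillion-dollar figure, using only the numbers cited above; the 10% peer benchmark is my own illustrative midpoint of the 6-12% range.

```python
# Back-of-the-envelope check of the savings claim using the figures cited in
# the text; the 10% peer benchmark is an assumed midpoint of the 6-12% range.
us_health_spending = 2.5e12                          # dollars per year
us_share_of_gdp = 0.18
implied_gdp = us_health_spending / us_share_of_gdp   # roughly $13.9 trillion
peer_share_of_gdp = 0.10                             # assumed benchmark
peer_equivalent_spending = implied_gdp * peer_share_of_gdp
annual_saving = us_health_spending - peer_equivalent_spending

print(f"Implied GDP: ${implied_gdp / 1e12:.1f} trillion")
print(f"Spending at a 10% share of GDP: ${peer_equivalent_spending / 1e12:.1f} trillion")
print(f"Implied annual saving: ${annual_saving / 1e12:.1f} trillion")
```

Under these assumptions the arithmetic lands at roughly $1.1 trillion a year, consistent with the figure above.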

When you see the numbers in this light, perhaps paying attention to more effective ways of delivering healthcare might be much more important than negotiating imaginary deficit reductions in Congress or debating the merits of universal healthcare (which is, according to most, a requirement for reducing total healthcare costs). And if you think the problem is bad today, consider demographics, the most predictable of sciences as a general rule. The bulge of the Baby Boom is getting older and more expensive to care for. The current method of delivery is projected to get even more expensive without dramatic change. But as things cannot go up forever, failing to reform healthcare means we will lose much of what we prize in healthcare.

What do the stupid in Congress focus on? Trying to eliminate universal healthcare and programs to improve healthcare performance. Stupid.

Infrastructure Investment

For many years after the Second World War, the US was the preeminent economic power in the world. Today, the US is still important, but its relative power is less. The European Union, for all its flaws, is roughly the same size as the US. And in many areas, the European Union is as successful as the US. The quality of life is high. Everyone has access to healthcare. The social safety net is strong. Things are not perfect: innovation is not as easy in Europe. Some European industries and countries are weak. But Europe does many things well and there are lessons to be learned there.

China has many problems, but it also has a high rate of growth. It is building cities for 300 million people over the next 10-15 years. Think about it. That's like building housing for the entire US population. China, which clearly has significant infrastructure deficits, is investing around $3 trillion in infrastructure. This new infrastructure, like the infrastructure built in Europe after the Second World War or in the developing five dragons of Asia (S. Korea, Hong Kong, Singapore, Taiwan, Japan), makes doing business more attractive in these countries.

But what do we see in the US? Bridges that fall down. Levees overcome by storms. Roads that are poorly maintained. Ports at capacity. Unrejuvenated water and sewage systems.

While American innovation may be praised for developing an iPhone app that identifies potholes in Boston when you drive over them, it would be better if no such app were required. Postponing maintenance on roads does not mean postponing the repair cost one year. Rather it means that when you do repairs later than you should, the cost may be five or ten times higher. In a word, stupid.

Education

I am a big believer in education. My parents sacrificed to put me in good schools. And I think it is fair to say that I have studied at some of the best schools and universities in three countries. I have taught strategy, international strategy, IT strategy and innovation internationally. But I never had to worry about wildly fluctuating school budgets and class sizes. Is there anyone in the US who really believes that you can run a school system with a wildly fluctuating budget?

If one were to design a revenue stream for schools, would it not be a good idea to make it predictable? “We are not going to teach advanced mathematics this year as it is too expensive.” does not strike me as a good educational policy. It’s stupid.

Market Forces

Look, I have an MBA from a well known East Coast business school. I think markets and competition matter. Communism did not work, but it strikes me that there are a number of important points that are missed by those with a poor understanding of capitalism.

First, economies go up and down. And while allowing companies to go out of business because they have been out-competed does make sense, allowing recessions to turn into depressions does not make sense. Micro policy is not the same as macroeconomic policy. In the same way, allowing frequent financial crises, a characteristic of the lack of regulation in the 19th century, also strikes me as a bad idea. I am in awe that so many people in the supposedly capitalist US seem to have no knowledge of economic history or no recognition of the cost of poor or nonexistent regulation. In a society that prizes the individual, failing to regulate (e.g. against contaminated food or unsafe transportation) means granting a right to kill, maim or cheat an individual. This right has a huge cost to the individual, and also a large societal cost. Regulation that prevents such events is a great buy for the person who would be killed or hurt and typically represents low cost insurance for everyone. The absence of trust in society is costly in many ways: the higher cost of police, more use of lawyers, security systems, deals that don't get done, economic activity that is too risky to pursue. Trust is advantageous. It makes people clever and enables clever people to start businesses more easily, which leads to job growth.

The recent recession has resulted in an economy that remains in trouble. Consumer demand, which represents the majority of the economy, roughly 72% of GDP historically, is down because people are poorer – their houses are worth less – and the resolution of the real estate bubble means that people will be, and will feel, poorer for a long time. Government spending is being cut back by ill guided policy decisions, causing the firing of government employees, particularly at the state level. Companies and individuals are reluctant to borrow because of uncertainty and lack of consumer demand. Exports are a relatively small percentage of the US economy. And real estate construction, a major contributor to the boom years of the first decade of the century, is likely to be depressed for some time. So where are economic growth and employment going to come from? Certainly not from the stressed and poor, not from real estate, and not from companies with factories operating below capacity or who manufacture offshore.

What the brilliant economist John Maynard Keynes pointed out is that economies can get stuck below their optimum level of output. If they do, tax revenues go down and deficits balloon. You need to use government spending to stimulate growth. It pays for itself. And if increasing economic growth and reducing deficits were not a sufficient argument, the historically illiterate might want to remember that while Franklin Delano Roosevelt was saving capitalism by introducing previously radical ideas such as counter-cyclical spending, infrastructure investment, bank regulation, and deposit insurance, fascism took over in Germany, Spain and Italy. Unemployment is never a good idea politically either.

So, what are we debating? Cutting spending, which will do nothing for employment. And if that were not bad enough, we are not attempting to prevent the excesses of unregulated financial markets. How many times do you have to shoot yourself in the foot to understand you should change your behavior?

It’s stupid.

Complexity

You don’t have to be a genius to notice that there are a lot more people on the planet. And with successful economic growth, more resources are being consumed and more pollution is being caused. So, there are three choices:

1. Slow population growth.
2. Hope that things work out.
3. Manage the growth.

So, what do we do in the United States? We discourage family planning, which has the side effect of reducing women's economic choice. (That's not only morally unfair, it deprives us of the brains of up to half of humanity.) The stupid seem to assume that things will work out magically. Without getting into the traditional debate between the Malthusian view that overpopulation will create starvation and the idea that technology can solve overpopulation problems, perhaps there lies some middle ground. Policies matter. Markets matter. Changing the cost of polluting will change behavior. It's stupid not to realize it.

People under-forecast the variability of the future. Management 101 teaches our difficulty in forecasting the future. We also have difficulty investing in projects to prevent future outcomes: you see only the costs, not the benefits. Even when the disaster occurs and is more expensive, solving the disaster demonstrates “leadership”. That's stupid. Post facto leadership is far more expensive. Consider the cost of New Orleans if you don't believe me, or the potential flooding disaster around Sacramento, California if the levees break.

In a complex world, you have to make more complex decisions. And complex decisions are not amenable to propagandistic debate. If you don’t believe that science is the most important tool for understanding systems, then on what basis do you want to make decisions? The Bible? The Koran? Dianetics? The purported saying of the Spaghetti Monster? The word of some lunatic who has eccentrically calculated when the world will end? How you felt this morning when you got up? Polling data? Being stupid does not prevent disasters. Learning from data, research and models reduces the probability of disasters.

And if you have not understood that politically powerful companies prefer not to pay for their effects on the environment, then perhaps it is time to go back and read Machiavelli. The current situation is always supported by the powerful who have political voice. The future has few supporters. I am consistently dismayed by those who argue that scientists are somehow not a reliable source of information when companies with billions of dollars on the line are investing politically to defend their position. The process of scientific review is about validating and reproducing data that independent scientists have reported. Learning is better than a fixed opinion. Fixed opinions are typically stupid if they don't take into account new information.
It’s stupid to think that unfixed climate change is not going to be the most costly event the human race could have avoided, bar nuclear or other apocalyptic warfare.

Energy Policy

The Industrial Revolution is the basis for the economic growth of the past 300 years. We have been living in an age of cheap energy with little recognition of the cost of polluting the environment. The US has built cities based upon cheap energy. It has, some have suggested, gone to war to preserve access to inexpensive energy.

So, where are we today? We are, as many have pointed out, facing the fact that developing countries want a higher standard of living. More meat. Better transportation. More use of energy. With historical technologies, this growth will inevitably result in more pollution, a hotter planet, more destructive weather and higher energy prices, in particular more expensive oil.
In the short term, nobody wants their standard of living to go down because the cost of energy goes up. They don't want to be unable to drive because gas prices are too high. So, we need to do something about what seems to be a pretty inevitable problem. It's cheaper to do it now than to do it later. If you don't remember the oil embargoes by OPEC in the 1970s and their impact, you are being stupid.

And yet many feel we have a god given right to drive big cars and have big homes far from work. So, here's the bad news. Changing infrastructure is really hard, particularly when the poor are affected or when a particular industry may have to change. It's probably a fifty year problem, but one that needs to be tackled aggressively. There's a lot of political resistance, and government is going to have to help manage the transition. It takes a long time to replace a fleet of cars. It takes even longer to change cities and transportation networks. Don't get me wrong. I like a house with a garden, but in a higher energy cost environment, urban planning needs to change. You need higher densities to support public transit. You need new infrastructure to support electric cars. And by the way, these are typically the kinds of things that don't happen by accident. They require government policy – regional urban planning in particular.

Denying this just means the cost of transportation will go up and the value of your house will go down even more than if incentive policies are put in place to manage the transition to a higher cost world.

It would be stupid to think otherwise.

The Debt Limit

The current debate over the debt limit demonstrates the inadequacies of most politicians. No other country has such a foolish law. If Congress has approved spending and tax policy, then it is a simple set of mathematics as to how much debt is needed. Approving the debt more than once, first through spending and revenue legislation and then a second time, is the ultimate example of overlapping and redundant legislation. That’s stupid.

Shooting yourself in the head from a financial perspective and risking an increase in the cost of borrowing is not something a normally intelligent person would do. It’s much more costly in absolute dollars if you are a country with trillions of dollars of debt.

If you want to ruin a country, think short term. Think only about the next election. Be stupid. There is no better way.

Thursday, June 09, 2011

Making Sense of Media and the Cloud

Apple's announcement of its iCloud service had long been predictable, ever since Apple purchased Lala.com in 2009 and promptly closed it down. Lala was a clever web site that allowed you to upload your music to the Internet. In an intelligent move, Lala saved you the trouble of uploading if it already had a copy of the specific piece of music on its site (after all, bits are bits). Many people may not understand the advantage of this approach, but in a word it boils down to asymmetry. Broadband connections are faster downloading than uploading. In my personal case, it would take several weeks to upload my digital music over a slow upload connection.
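A quick back-of-the-envelope calculation shows why; the library size and upload speed below are illustrative assumptions, not measurements of my own connection.

```python
# Why uploading a large music library is painful on an asymmetric connection.
# Library size and upload speed are illustrative assumptions.
library_gb = 150          # assumed size of a large digital music collection
upload_mbps = 0.5         # assumed slow residential upload speed
upload_seconds = library_gb * 8e9 / (upload_mbps * 1e6)
print(f"Upload time: {upload_seconds / 86400:.0f} days")
```

At those rates the upload takes roughly a month, which is why matching against an existing copy of each track, as Lala did, is such an attractive shortcut.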

Apple is, of course, not alone in using the cloud to sync content across multiple devices. Industry titans Amazon and Google have a slightly different strategy of using the cloud to store files, back up your digital music and play music from the cloud. Google's service, combined with its Android operating system, is attempting to compete with Apple's offering, but with less emphasis upon syncing. Amazon is attempting to compete with the iTunes store. And there are many more ways of storing your vital assets in the cloud. Evernote is popular for notes that you can access from different devices. Dropbox is useful for sharing files with clients or between different computers.

But the more general question is: “What is the future of digital media in the cloud?” Back in 2007, I wrote that a new framework for understanding the evolution of digital media would help predict the evolution of the market. Historically, when you bought a book, a magazine or a piece of music, the relationship of the buyer to the content was pretty well understood. If you owned a book, you could read it, keep it and lend it to friends. You could not make a copy of it with a copier and give it or sell it to other people. If you bought music, you could play it or lend it to friends. You could make a copy for personal use, but not redistribute it.

The digital revolution changed everything. All the legal rights attached to digital content could in theory be unbundled and repackaged in different ways. Lala.com offered the innovative approach of allowing you to buy the right to stream music from the Internet for 10 cents a song, much less than the then conventional price of roughly $1 a song. Legal subscription services such as Rhapsody.com or the reconstructed and now legal Napster offered programs where, for a monthly fee, you could stream music from a library of millions of pieces of music.

But these examples don't yet illustrate the richness of marketing approaches now available. Consider, for example, the difference between the original digital music model and Audible, the well known audiobook seller now owned by Amazon. Historically, prior to the emergence of cloud based music services, when you bought a piece of music, you typically had no right to re-download it if your hard drive crashed. Audible took a different approach. Its service was more like an online personal library. If you bought an audio book, you kept a copy in your online cloud library and you could download it any time you needed to. Amazon's Kindle books have the same feature. Unlike music, you could download a book you owned many times from your online Kindle library, as you now can with Amazon's cloud music service.

In the future, the purchase, rental or use of content will offer far more variety than we have been accustomed to in the past. Consider movies. Today, you can purchase a movie in various types of quality (DVD, Blu-Ray, 3D). You can download the movie for purchase or with varying time periods for usage. You can rent the movie from a store or kiosk. You can also subscribe to a service like Amazon Prime, Hulu Plus or Netflix and pay for a subscription to a library of movies. You can also use the on-demand feature on cable to see some movies whenever you want. Or you can watch movies with commercial interruptions on the Internet or cable.

In this new world of more complicated content marketing, content owners have more choice. They can choose some combination of five different ways of earning money from their content, what strategists call business models. They include:

1. Sale of content
2. Rental of content
3. Rental of subscriptions
4. Advertising supported content
5. Sponsored content

And for each of these business models, a content owner can choose to bundle different legal rights with the use of the content (a minimal sketch of such a bundle follows the list). Rights could, for example, include:

1. Ability to lend the content (e.g. Kindle book lending)
2. Ability to resell the content (common with physical media, less common with digital media today)
3. Ability to upgrade the quality of the content e.g. changing from a regular DVD to a Blu-Ray version.
4. Ability to re-download the content
5. Ability to play the content on multiple devices
6. Ability to share content with friends and family
7. The right to get paid a sales commission if the user triggers a sale to a third party
8. The right to reuse the content in a new work in exchange for defined payment.
9. Number of devices or type of devices on which the content may be used.
10. Number of simultaneous uses.
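To make the unbundling concrete, here is a minimal sketch of how a content offer might be modeled as a business model plus an explicit bundle of rights. The class and field names are hypothetical illustrations, not any store's actual schema.

```python
# Hypothetical model of an unbundled content offer: a business model plus an
# explicit bundle of rights. Names are illustrative, not any vendor's schema.
from dataclasses import dataclass, field

@dataclass
class RightsBundle:
    lendable: bool = False             # e.g. Kindle book lending
    resellable: bool = False           # common for physical media, rare for digital
    upgradeable: bool = False          # e.g. regular DVD to Blu-Ray
    redownloadable: bool = True        # e.g. Audible or Kindle libraries
    shareable_with_family: bool = False
    max_devices: int = 1
    max_simultaneous_uses: int = 1

@dataclass
class ContentOffer:
    title: str
    business_model: str                # "sale", "rental", "subscription", "ad-supported", "sponsored"
    price: float                       # 0.0 for ad-supported or sponsored offers
    rights: RightsBundle = field(default_factory=RightsBundle)

# The same movie packaged two different ways, differing only in model and rights.
purchase = ContentOffer("Example Movie", "sale", 14.99,
                        RightsBundle(redownloadable=True, upgradeable=True, max_devices=5))
rental = ContentOffer("Example Movie", "rental", 3.99,
                      RightsBundle(redownloadable=False))
print(purchase)
print(rental)
```

The point of the model is that every combination of business model and rights bundle is a distinct product that can be priced and marketed separately.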

The recent decline in DVD sales shows how important this new choice has become. Many consumers now realize that they don’t need to purchase movies that they may only watch once. Once you subscribe to an online music service, you quickly realize that most music is not to your taste and not worth buying. The big savings from a subscription service is avoiding purchasing music you don’t play. It’s only worth buying music you care about for the purpose of being able to play it more conveniently on more devices (assuming of course, you have not made provision for some type of cloud streaming or access).

Perhaps the biggest surprise for most people is the role of sponsorship. Honda recently sponsored the Wall Street Journal for a day, making the subscription portions of the Wall Street Journal available to non-subscribers. The New York Times cut a deal with Ford's Lincoln division to sponsor subscriptions for readers deemed likely to buy Lincolns. It's an example of how changing business models and unbundled legal rights are changing the rules of the game in content marketing.

Tuesday, August 31, 2010

Review of Liquid Planner (www.liquidplanner.com)

Review by Alistair Davidson, Partner, Eclicktick Consulting

Strategic planning is not project planning. But strategic planning often requires project planning and dealing with high levels of uncertainty. There is, as a result, no single good way of managing execution that will apply to everyone's situation. LiquidPlanner, an online project management package priced at $25 per user per month, is a simple tool that addresses an important element in project planning and project portfolio management – the difficulty of estimating how much work is required to complete a task.

Uncertainty about the length of time required to complete a task has numerous consequences. First, team members may pad their estimates of the work required to complete a task. Second, downstream tasks dependent upon the padded estimate will look even worse than they should. Put together too many padded estimates and projects may become impossible to sell internally. Restructuring of projects and resources becomes even more difficult than if better estimates were available. Even worse, arbitrary decisions about what is realistic and what is not may lead to top-down decisions by a senior manager or project manager, resulting in forecasts that are equally erroneous and arbitrary.

LiquidPlanner's solution is straightforward. Tasks are estimated as taking a range of time, e.g. from 2-4 hours or 1-4 days. The Gantt charts (horizontal bar charts) that lay out the steps in a project show the most likely completion time for the task, the earliest possible date and the worst case for delivery. Promised delivery dates now become easier to understand and more likely to be accurate.

LiquidPlanner is very much a child of 2010. The application comes with a portal where portions of the project can be shared with external participants (e.g. suppliers, customers, sub-contractors, partners, etc.). Messaging between participants can be exposed through the portal. And an excellent series of video tutorials gets users up to speed quickly.

Adding Monte Carlo or stochastic modeling to regular project management software is an option in many packages, but usage of more probabilistic project management is still quite rare. LiquidPlanner is an easy to use and attractive tool for managing projects and small portfolios of time-based activities. Unlike many more complicated tools for Monte Carlo modeling, it keeps things simple.
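To illustrate why ranged estimates are useful, here is a minimal Monte Carlo sketch in the spirit of the approach described above. It is not LiquidPlanner's actual algorithm; the task ranges and the use of triangular distributions are my own illustrative assumptions.

```python
# Minimal Monte Carlo sketch of ranged task estimates; not LiquidPlanner's
# actual algorithm. Each task gets (low, likely, high) hours and the project
# total is simulated with triangular draws.
import random
import statistics

tasks = {                  # illustrative (low, likely, high) estimates in hours
    "design": (2, 4, 8),
    "build":  (8, 16, 40),
    "test":   (4, 6, 16),
}

def simulate_project(tasks, runs=10_000, seed=42):
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        totals.append(sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks.values()))
    return sorted(totals)

totals = simulate_project(tasks)
print(f"Median total:    {statistics.median(totals):.1f} hours")
print(f"90th percentile: {totals[int(0.9 * len(totals))]:.1f} hours")
```

Even this toy version makes the key point: the 90th-percentile completion time can be substantially later than the sum of the "likely" estimates, which is exactly the uncertainty a single-point Gantt chart hides.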

Monday, August 30, 2010

Raising Notebooks From the Dead

Don’t get me wrong. While I ran a software company for many years, I am not someone who likes to spend time delving into the innards of computers. And that perhaps makes this article more significant.

I had a problem, one that many households and small businesses share -- lots of old notebooks sitting around that are essentially unusable, with driver problems, old operating systems, slow speed and too little RAM. I am a strategic and business development consultant who works with clients that always seem to be in crisis, so my time is more valuable than playing around with old computers.

But last week, during a quiet spell in August, I decided it was time to clean up. As an experiment, I downloaded Ubuntu from ubuntu.com, burned an installation disk and tried it out. There are four ways you can try out Ubuntu as an operating system:

1. Run it from the CD which makes no changes to the computer.
2. Install it so that when you boot your computer, you can choose which operating system to boot.
3. Install it as a window that runs within MS-Windows using for example the free VMPlayer from vmware.com.
4. Replace your old operating system with Ubuntu.

One of my notebooks would not let me dual-boot the system, but it is a machine that needs to have its Windows installation rebuilt anyway. I succeeded with practically no effort in setting up Ubuntu and VMPlayer on my desktop and latest notebook. And I converted an older notebook into an Ubuntu notebook that runs better and faster than it did with Windows, a fact that would be of no surprise to Unix/Linux enthusiasts.

Score 1 for easy installation for Ubuntu.
Score 2 for VMWare player.

But what about usability? The Macintosh (which is based upon UNIX) and Windows set the standard for ease of use. How does Ubuntu stack up? To my surprise, I would rate it acceptable verging on good for a normal, relatively unskilled user. Ubuntu comes with most of the applications a normal person needs in daily computer usage, e.g. Firefox for browsing, Evolution (an Outlook-like email client), OpenOffice (an open source equivalent of MS-Office), music and video playing software, and a convenient application center where you can download programs like GIMP, a Photoshop-like application. All of this comes with a reasonably easy-to-use windowing environment (GNOME) that takes little time to master.

Now, I have no personal interest in becoming a LINUX geek, but my “take away” from this small project is that Ubuntu and its supported software provide an inexpensive way of turning an older machine with the performance of a boat anchor, one with performance and driver problems, into something useful. Given that buying a decent netbook might cost you $400, Ubuntu can save you money and make a second or third computer usable for “commodity” tasks such as word processing, spreadsheets, presentations, small databases, music, video and photography.

The bottom line: Ubuntu does not require as much hardware as Windows, and the level of driver support and ease of installation make it usable by just about anyone who can burn a disk in Windows. In fact, on my desktop, Ubuntu recognized and used a sound card that Windows is currently failing to recognize -- pretty impressive.

And did I mention that all the software discussed -- the Ubuntu operating system, the applications and VMPlayer from VMware -- is free?

Wednesday, July 28, 2010

Why Economics Matters to Strategists

Recently, there has been a great deal of debate about the new iPhone 4, its ability to hold a signal and the quality of the AT&T network. In brief, the claim is that the antenna on the iPhone 4 is affected by being held and that AT&T has an inferior network to Verizon. And because iPhone users and detractors are often emotionally involved in their views, the framing of the problem is often not very useful to a strategist.

The economics of networks is not being considered in this debate. Since smart phone users are a new category of users and the iPhone was the first device to make data usage easy, it is no surprise that AT&T has run into usage economics first.

Consider the following:

Customer usage: if you allow customers to use as much of a service as they like, they will use more of it until other constraints, such as their willingness to use their phone all the time, become a barrier. Unlike mature markets, where saturation of usage has occurred (e.g. Netflix, which assumes you only have so much time to watch movies), in developing services usage has often not yet reached a natural upper limit. In a fixed capacity network such as 3G, adding more users and making usage easier leads (surprise, surprise) to more usage.

Service provider investment: if you do the calculation on revenues per byte, voice calls are more profitable than data sessions – about two orders of magnitude more profitable. So if you are a service provider, you face the problem of expanding your network to carry less profitable usage. It’s no surprise that investment has lagged.
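
As a rough illustration of that gap, here is a back-of-the-envelope calculation in Python. The prices and codec bit rate are assumed, circa-2010 ballpark figures rather than any carrier’s actual economics; the point is only the order of magnitude.

  # Illustrative, order-of-magnitude figures only (assumed, not carrier data)
  voice_price_per_minute = 0.10                            # $ per voice minute
  voice_kbps = 12.2                                        # typical narrowband voice codec bit rate
  voice_mb_per_minute = voice_kbps * 1000 / 8 * 60 / 1e6   # ~0.09 MB of traffic per minute
  voice_revenue_per_mb = voice_price_per_minute / voice_mb_per_minute   # ~$1.09 per MB

  data_revenue_per_mb = 25 / 2048                          # a $25 plan with a 2 GB cap: ~$0.012 per MB

  print(f"voice ~${voice_revenue_per_mb:.2f}/MB, data ~${data_revenue_per_mb:.3f}/MB, "
        f"ratio ~{voice_revenue_per_mb / data_revenue_per_mb:.0f}x")

With these assumptions the ratio comes out near 90x, in the neighborhood of the two orders of magnitude cited above.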

Next generation economics: as so often happens when people straight-line their conclusions about growth, they fail to consider alternative ways of solving problems. In the case of smart phones, the alternative to using capacity on the cell tower is “diversion”, or diverting traffic to alternate channels. These alternatives fall into three basic categories:

1. Public wi-fi sites
2. Home and business wi-fi sites
3. Femtocells

What the first two have in common is that smart phones today typically offer wi-fi capabilities as well as 3G, and the speed of wi-fi is typically greater than 3G. So if you want to download a big file, it’s a good idea to do it via wi-fi.

Femtocells are less well known. They are, in effect, small base stations that provide a better signal to a cell phone in a home or office. Rather than connecting to the mobile network via a public cell tower (and its backhaul), they link over the home or office fixed broadband connection to the backbone network, capacity that is pretty much always available at low cost and typically at zero incremental cost for usage.

Both wi-fi and femtocells divert traffic from overburdened cell towers and their backhaul onto an existing, under-utilized and low cost fixed infrastructure. Because data is charged at a lower rate than voice traffic, diverting data is the most profitable move from the perspective of return on capital. But voice can be diverted over wi-fi as well, freeing up capacity at cell towers.

Not surprisingly, diversion is a multi-billion dollar issue for mobile companies -- unlike the difficult and time-consuming process of putting up cell towers, wi-fi and femtocells require no regulatory input and are significantly cheaper (no bandwidth purchase needed, no rental fees, no backhaul costs).

Perhaps even more importantly, given that the majority of cell phone usage occurs in the office or home, highly granular improvements in the network can be focused upon exactly those areas. You can think of the use of wi-fi and femtocells as adding leaves to the branches of a tree.

Tuesday, November 24, 2009

Practical Thoughts on Forecasting, Planning and Budgeting
Copyright Alistair Davidson, 2009. All rights reserved. Alistair Davidson is a strategic consultant with extensive experience in developing budgeting, strategic planning and business intelligence systems. He is a contributing editor to Strategy and Leadership magazine and the author of three books on strategy and technology. His career includes developing numerous software planning systems and tools, in addition to facilitating business and IT strategies.
Contact information: alistair@eclicktick.com, +1-650-450-9011


Executive Summary
Developing a forecast or budget sounds like it should be a simple task.
It rarely is. A planning and budgeting process is often used for multiple purposes and has many stakeholders, each with his/her own interests, constraints, risk profiles, deadlines and commitment to the process. Often budgeting processes become a minimalist exercise that reflects the power of the CFO rather than a process that is useful to strategic thinking and strategic management in the organization. There is nothing worse than a budget developed six months prior to the beginning of the fiscal year that is out of date before the year has begun.

Developing a forecast or a budget is typically an iterative process. The number of iterations within a particular year will affect the perception of the value of the process (more iterations typically annoy people). Just as importantly, forecasts and budgets are exercises that typically repeat at least quarterly and annually. They will always evolve over time.


Developing a successful forecast or budget requires five elements for success:

1. Senior sponsorship is required to ensure that cooperation is obtained from all participants.

2. The level of work imposed or solicited needs to be perceived as reasonable and appropriate, often a difficult task in a changing environment without good supporting technology.

3. Senior management should avoid using the forecasting process to educate themselves, which imposes a significant workload on the numerous participants. This is best achieved by spending time up front to understand the workload caused by a forecast exercise.

4. A longer term perspective on the forecast should be addressed up front by analyzing the workload and number of pieces of data being developed for each forecast in the current and future periods. Perfect systems should be avoided and iteration anticipated.

5. The forecast should be strategic and focus upon critical issues rather than letting the apparent simplicity of spreadsheets drive linear extrapolations of past performance. Forecasts should explicitly address uncertainty and risk, capacity utilization and the process of forecast revision over the forecasting period.

Introduction: Why Is Forecasting Difficult?
Planning and budgeting (P&B) are basic tools in managing a business. Budgets are often used to control spending and set expectations. Forecasts are often thought of as periodic revisions to budgets, done to make sure that the budget is compared against more recent data. Forecasts often extend beyond the budget period as well. If you are doing a quarterly update, there is little point in a cycle of 9 month, 6 month and 3 month projections; it's often just as easy to do a rolling 12 month forecast.


Forecasting is often a problem for companies because it serves multiple purposes, and the importance of these purposes looks different to different stakeholders throughout the organization. In an ideal world, the designer of a P&B process would identify the key stakeholders (i.e. the people or groups that can make the project a success or block it) and identify their minimum requirements for the P&B project's success. Generally speaking, a stakeholder will have a threshold that must be delivered for satisfaction, plus other elements they want emphasized a great deal.


Let's examine the stakeholders for the process:

1. The Board of Directors typically expects a forecast so that they can assess the future performance of the company. Their bias is typically towards financial results and, to a lesser extent, risk management. Strategic issues are often downplayed, particularly in diversified companies with many different businesses.

2. Senior management in the company often uses the budgeting and forecasting process to force the organization to make choices about allocation of capital budgets, operating budgets, new product development processes, sales and marketing efforts. Annoyingly for the developer of forecasts and budgets, the projections are often used as a way of educating senior management about the business leading to multiple iterations.

3. Middle management and financial functions in the organization often use performance against the budget and forecast to revise their activities to meet their goals and performance metrics. Just as importantly, they use revisions to the budgets to prepare other stakeholders for surprises. Missing a budget is frequently less of a problem than surprising a boss with the miss. The budgeting and forecasting process is often seen as a chore rather than a productive use of time.


A very common problem for managers is the design and successful implementation of a planning and budgeting process. Several key constraints make the design process difficult.


First, most managers undertaking a P&B process have little experience in designing one and consistently underestimate the effort involved.


Second, P&B issues change constantly. If the CEO or a profit center owner decides something is important, it is likely to be dropped into the P&B process at the last moment, causing a significant workload for the implementers.


Third, planning systems often draw upon information in other systems, and formal processes for designing, testing, rolling out and training users are weak. Integration and business intelligence issues are surprisingly rarely addressed, even though the information used in a good P&B system is very valuable to many users.


Fourth, planning tools are often dictated by functional areas for their own benefit and don't consider other users.


Fifth, budgets for P&B systems are generally small. Custom work has to be shoehorned into short time periods without regard to the actual work required.


Sixth, year over year comparisons are often extremely difficult because the planning system varies from year to year. The consistent information is often not the most important information for operating managers.


Seventh, disagreement about the scope of planning is very common. Planning is a bit like the elephant in the apocryphal story about the six blind men and the elephant. It feels very different depending upon where you are relative to the elephant.


Diagnosing Your Forecasting Problems
Before you can improve your forecasting (and often before you can obtain cooperation from stakeholders), it's typically a good idea to seek to understand from stakeholders what past problems have occurred and what needs are going unmet.



Typical problems might include:

1. Poor accuracy in the budget or forecast.
2. Time-wasting revisions to the P&B data as stakeholder requests evolve.
3. Excessive manual work.
4. Irrelevance after the process has been completed, due to the length of time taken, iterative approvals or changes in the environment subsequent to development and approval.
5. Inappropriate levels of granularity, i.e. budgeting in too much or too little detail (which we might refer to as the Goldilocks problem).

For each of these five problems, different solutions are available. The resulting P&B efforts and focus are likely to look different.


The Multi- Period Problem
Perhaps the most important problem with P&B systems is that they tend to be focused upon financial requirements and not upon customers. Consider the situation of a customer that buys a pilot project which, if successful, could lead to 10-100X more purchase volume and significant profitability. However, if the subsequent profitable purchases fall outside the budget or forecast period, resource allocation decisions may cause underinvestment in developing pilot projects.


Customer profitability is rarely well represented by a single period view of a customer. P&B systems that only look at single year profitability may be missing the most important drivers of overall profitability. A software company that only looks at financial results can show revenue growth while it is actually in decline: if a healthy number of new customers is not being acquired, maintenance revenues may continue to increase even as the competitive position deteriorates.


If the purpose of forecasting is to be useful, then the information base for the forecasting should be useful, relevant and flexibly capable of being modeled and changed.


The Multi-Forecast Problem
A common problem with P&B systems is that budgets and forecasts get revised. In theory, it should be easy to compare a budget with a revised forecast that is updated every three months. However, it becomes much harder if the forecast is driven off detailed data that does not exist on a quarterly basis (e.g. a total market size number that is only revised annually by a tech consulting firm), or if the forecasts have been inconsistent in their use of bottom-up and top-down forecasting.


There are no easy solutions to this particular problem.


The Implicit Strategic Decision Problem
A common problem in forecasting processes is the hidden assumption that the forecasting process should force or enable divisional or head office managers to make resource allocation decisions while developing a forecast.

The challenge here is that budgeting and forecasting systems tend to be linear extrapolations of the past. Innovations, in contrast, are harder to forecast and carry higher uncertainty. As a result, plans and forecasts presented in a spreadsheet often seem to imply that each line of the forecast is equally certain and reliable.



The truth is typically otherwise.


And the more detailed the forecast, the more likely it is that each line will produce a variance in subsequent quarters. The solution is typically thought to be forecasting at a high enough level of summary that irrelevant or immaterial variances don't show up.


Infrastructure Issues

P&B systems typically cause the creation of spreadsheets and PowerPoint slideshows. More sophisticated systems have been developed for consolidating hierarchical budgets and forecasts. These typically fall into two categories - multidimensional spreadsheet or OLAP tools and relational databases. Because these tools are typically difficult to use, the reality is that most work is done in spreadsheets and then uploaded to consolidation tools.


However, it is not clear that these consolidation tools are helpful in the process of developing a forecast. Without going into the mechanics of building up a consolidation or reconciling a top down goal setting with a bottom up forecast, let's just say that it is messy and complicated. The result is that forecasts are often painful to develop and inaccurate.


Dealing with High Uncertainty - Scenarios vs. Business Success
In highly uncertain environments, there are four basic approaches to forecasting.


Multiple Forecasts
The approach used most often is to develop multiple forecasts. A high, medium and low forecast is the most common version. A more sophisticated approach is to develop a probability weighted set of forecasts, where each forecast is assigned a probability. Each forecast then produces an expected value that adjusts the revenue forecast by its probability. The probabilities add up to 100%, so the sum of the expected values is the best estimate of revenues. The unattractive side of this approach is that while it is useful for revenue forecasting, it is less useful for many expenses, as they will be adjusted to reflect the different revenue cases. Another weakness is that research suggests most managers underestimate variability and risk.
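
A minimal sketch of the arithmetic in Python, using hypothetical probabilities and revenue cases (nothing here comes from a real forecast):

  # Probability-weighted revenue forecast (illustrative numbers)
  cases = {
      "high":   {"probability": 0.25, "revenue": 12_000_000},
      "medium": {"probability": 0.50, "revenue": 9_000_000},
      "low":    {"probability": 0.25, "revenue": 6_000_000},
  }
  assert abs(sum(c["probability"] for c in cases.values()) - 1.0) < 1e-9   # must add up to 100%
  expected_revenue = sum(c["probability"] * c["revenue"] for c in cases.values())
  print(f"Expected revenue: ${expected_revenue:,.0f}")   # => Expected revenue: $9,000,000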


Monte Carlo or Stochastic Modeling
Spreadsheet based forecasts are typically deterministic. Monte Carlo modeling takes a more detailed approach to uncertainty. Each line in a forecast is mapped to a probability distribution (e.g. a normal or power-law distribution). The simulation is run hundreds or thousands of times, with each line item varying according to its distribution on each run, to see the overall distribution of outcomes. Because of the number of runs and the need to map a distribution to each item in the forecast, detailed consolidations are typically avoided in smaller companies and forecasts are done at a fairly high level of summary.
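
The sketch below shows the mechanics using Python's standard random module. The line items, distributions and dollar figures are invented for illustration; a real model would use distributions fitted to your own history.

  import random

  def simulate_once():
      # Each line item draws from its own assumed distribution
      revenue = random.normalvariate(10_000_000, 1_500_000)        # mean $10M, std dev $1.5M
      cogs = revenue * random.uniform(0.38, 0.45)                  # cost ratio varies per run
      opex = random.triangular(3_000_000, 4_500_000, 3_500_000)    # low, high, most likely
      return revenue - cogs - opex

  runs = sorted(simulate_once() for _ in range(10_000))
  print(f"Median operating profit:    ${runs[len(runs) // 2]:,.0f}")
  print(f"10th percentile (downside): ${runs[len(runs) // 10]:,.0f}")
  print(f"90th percentile (upside):   ${runs[9 * len(runs) // 10]:,.0f}")

The output is a distribution of outcomes rather than a single number, which is the point: you can read a downside case directly off the results instead of arguing about it.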


Scenarios
Multiple forecasts are often erroneously called scenarios. More properly, a scenario is a term reserved for naming and describing a future in which the organization might have to operate. Scenarios represent ways of stretching the thinking of the organization so that it anticipates the impact of a proposed strategy across a more diverse set of potential environments.


Resource Allocation Frameworks
Many organizations address resource allocation decisions separately from P&B processes. They may make resource allocation decisions within the P&B process, but they separate out earlier stages of resource allocation. Often these systems take the form of new product processes, productizing processes (in service organizations where services migrate from custom to standard) and capital budgeting processes, where a limited pool of capital is allocated to major investment projects via a formal approval process.


Thinking Outside the Box - What Are You Not Measuring and Not Forecasting?
It may seem strange to think about what you are not forecasting but well managed companies should look at what they are not addressing and the larger issue of how well an organization is doing in the overall environment.


There are several ways of thinking about this problem.


Some companies look at share of wallet or share of expenditures. They measure their forecasts as a percent of total spending by a group of customers. When forecasts are looked at in this light, companies can understand to what extent they are not obtaining revenues and profits from areas of the market where they have blind spots.


Another key measure worth looking at is your addressable market vs. the total market. A forecast can often look good in the context of your addressable market. However, if competitors are making in-roads into the total market, your business may be in decline without you realizing it.


Benchmarked performance. International companies often have difficulty comparing markets. Having a benchmark allows for comparisons of performance that are independent of pricing, cost inputs and currency fluctuations. The publisher, Harlequin, used to look at books sold per thousand women over 18 per year as a measure of market penetration and maturity.


Capacity Management and Forecasting
Many organizations do a good job of forecasting revenues and expenses, but a bad job of forecasting capacity and utilization. There are many reasons for this bias. It's difficult to forecast lagged variables, such as hiring a person, training them and making them effective, or processes whose cycle times are longer than the cycle times of marketing and sales. And developing people and capacity often needs to be done in advance of obtaining sales, leading to a perception of risk both in terms of internal evaluations of performance and financial outcomes.


The key task here is to make sure that assumptions are explicit rather than implicit and that the organization is committed to the marketing and sales objectives. With clarity and consensus, better decisions are likely to be made and delivery failures are less likely.


Explicit decisions on training part time people can also be pursued to manage peak loading and capacity problems.

Summary
Planning and budgeting systems are more complex to design, specify and obtain compliance with than most managers anticipate. Making sure that sponsorship, scoping, evolution and links to the rest of the organization are considered early in the process will increase the likelihood of success.

Tuesday, October 27, 2009

Restricting Senior Executive Pay

In the past week, there have been numerous comments about overreaching by the Federal Government in placing caps upon the pay of the top twenty-five executives in companies that have received major government investment.

Government determining the pay of executives is clearly an overreach, but I find it hard to criticize the modest restrictions.

Consider the situation if these institutions were not banks and bankruptcy were a realistic legal option. Under such circumstances, every employee in the organization would likely be facing firing, salary cuts and other reductions in remuneration.

Banks are difficult to put into bankruptcy because such events would trigger complex and cascading contract events, so the Fed, the Treasury and other regulators have been forced to take over these financial institutions in a "soft" bankruptcy.

When government involvement is looked at in this light, perhaps the complaint should be that more has not been done.

Friday, August 21, 2009

Chief Customer Officers and the Digital Living Room

The Digital Living Room still continues to fall between the cracks of the silos in organizations. Consider a recent experience with major vendors.

An evaluation HP Windows Home Server machine that I purchased recently for a project came with McAfee virus protection. The protection expired after the initial trial period of 7 months.

So I decided to use Norton, which I use on my other machines. However, Symantec does not make it clear whether their products work on Windows Home Server, nor did they reply to my support request. So I decided to upgrade the existing McAfee installation as the lazy man's approach.

After two support calls separated by three days, and after 50 minutes on hold, I determined on the second call that the product currently has installation problems and also does not upgrade its data files. That conclusion was not shared with me on the first support call, where a different answer had been suggested.

Now, as a past CEO of various software companies, I sympathize with the challenges of supporting continually changing software. And I am not particularly worried about this server, which is primarily used for file back up and music sharing.

But the whole experience of:

1. Having to determine whether a product works with a home server.
2. Inability to offer a clear and simple decision process on what to buy.
3. Confusing installation processes.
4. Difficult-to-use administrative software more appropriate for a small business than a home user.
5. Delivery of support via the small business support line, causing unnecessary phone calls and downloads of support software.

reflects an inside-out, siloed view of the customer.

Persuading consumers to tackle the Digital Living Room will require a more customer centric perspective. A Chief Customer Officer would help a company transform the customer experience so that successful customers would become ambassadors on behalf of products and services.

Wednesday, August 12, 2009

Getting Sued

I recently had an off-the-record conversation with a former CEO of a major firm who had spent much of his tenure as CEO dealing with more lawsuits than any business should have to face.

The takeaway from our discussion was that, with the full benefit of hindsight, the company's problem was that (1) it was small, (2) its patented technology was really valuable to very large customers, and (3) the company priced licensing of its technology based upon the value of the technology to licensees and their customers.

What the company forgot -- and this is a common mistake of small companies in the United States -- is that a large company looking at a small company always has the choice of litigating, using the law as a weapon to beat the small company to death.

So, when I work with small high tech companies and their business plans talk about barriers to entry including a patent, I will often grimace.

If the technology is unsuccessful, nobody will care.

If the technology is successful, then the chances of being sued go up.

If the technology is exceptionally successful or useful, it's pretty much a sure thing that you are going to get sued.

Without major reform to patent law in the US -- which is clearly needed -- there are no simple legal answers to this problem, except to use pricing as a tool.

Pricing can encourage licensing and make it more attractive than suing. What people forget is that there are many ways of pricing a product. Pricing creativity can reduce legal risk, accelerate revenues and in some cases increase total revenues.

It's worth thinking about before you get sued.

Thursday, August 06, 2009

The Digital Living Room - Miles to Go Before You Can Sleep

Yesterday, I attempted to watch a Blu-ray movie obtained from Netflix, produced by Sony studios, on a six month old Sony Vaio multi-media laptop hooked up to a 25.5 inch Samsung monitor as a secondary display. Blu-ray, of course, is the wonderful high-definition movie standard developed by Sony.

It took me two hours and two support calls to get the movie started.

Now it's pretty hard to argue that compatibility should have been an issue with Sony controlling all the technology. If I were not doing consulting to firms in the digital living room area, I would have broken something in frustration. I rarely get so annoyed by technology that I curse out loud, but it was one of those evenings.

So, you may ask, what happened?

The problem began with the InterVideo software shipped with the notebook. When I inserted the Blu-ray disk, it told me that I needed to renew a key in the Blu-ray viewing software. I was sent to a confusing page from the software vendor Corel, a relationship I was unaware I had. My immediate thought: is this some kind of virus problem? The key transaction then failed twice. So far, ten minutes wasted.
The courteous and knowledgeable support person in Costa Rica talked me through disabling user account control on my notebook and obtaining a key upgrade. I was fuming by this stage. Not only did I object to an unnecessary key renewal, the software did not even work well. So far, 60 minutes wasted.

But the problems did not stop there. I could not escape the previews on the disk. Now, I would like to think I am pretty knowledgeable about the digital living room. People hire me to look at their products and do competitive comparisons. But it was practically impossible to get to the movie. I think I saw the previews seven times. Unlike most people, I have two media computers with Blu-ray from different vendors. Same problem on both my desktop and my notebook. Total time wasted now at around 70 minutes.

On the second support call, we determined that a second Netflix Blu-ray disk went immediately to a main menu from which you could play the movie easily. Admittedly, the second disk did reveal that I had a bad setting on my desktop, causing the colors to be wrong, which I eventually fixed by letting the video application control the color settings. The conclusion from the second support call was that the first Blu-ray disk was defective.

Now, I am not a typical user. I am way more persistent. I eventually figured out a way of getting to the movie with some additional experimentation. Total time invested over the entire evening ended up at over two hours. But I would have to say that the experience was ridiculous. A media notebook that can't play media. A Blu-ray disk that won't let you get to the movie on it. Disappearing menus. Software that won't let you play your movie on your external monitor. Multimedia computers with non-working registration software that prevents usage.

Is Amazon the Most Interesting Media Company in the World Today?

The move to digital content was, to a large extent, initially spearheaded by Apple. Its initial focus was on simplicity in its business system. Pricing per track was set at 99 cents. Simple and understandable. A good pricing model for a new and complex technology.

But today, the world looks quite different. Digital music is now mainstream. And with mainstream businesses, traditional retail issues and innovation start to become more important.

Amazon is at the forefront of this new trend. It is, as a result, possibly the most interesting media company in the world.

Consider the following:

Amazon sells traditional books, electronic books on Kindle or iPhone/iTouch, traditional physically delivered music, downloadable MP3s, new and second hand DVDs, downloadable movie purchases and downloadable rentable movie viewing. Other than a subscription model, Amazon has most of the purchase options covered.

Even more interestingly, unlike Apple, Amazon is behaving like a smart retailer. It uses free songs and free Kindle copies of the first book in a series to create traffic, in a way analogous to a supermarket offering cheap milk to bring in customers or samples to get customers to try a new brand.

And Amazon is testing pricing. It offers daily specials priced anywhere from 99 cents to $2.99, with $1.99 as the most common price point, to encourage traffic, obtain sampling and give customers a reason to keep coming back to the web site every day. It's better than advertising, because revenues are produced by the very hook that pulls in the customer. And of course, the big problem for an ecommerce site is getting traffic. If customers visit, you have a chance of selling them something.

But the really interesting capability Amazon is building is a deep understanding of individual customer tastes and price elasticity, something that Apple has spurned. Amazon is learning about the tradeoffs that individual customers make on different types of purchase, lease or download of content. When will a customer own or rent? What do you need to do to create trial? When does it make sense to discount to trigger additional purchases of content from a writer or artist, or to hook a reader on a book series?

Great businesses don't freeze their strategy. They continually improve it. Amazon seems to be learning faster and more aggressively than other players. Kudos to them.

Monday, August 03, 2009

To Pareto or Not To: Changing the Profitability of Your Business
Copyright Alistair Davidson, August 2009 as an unpublished work. Alistair Davidson is a strategic consultant with turnaround experience who has been CEO of several companies and has helped companies improve their revenues and business development activities.

Contact:
alistair@eclicktick.com Phone: +1-650-450-9011

Certain key insights in strategy seem to be continually important. One example: flanking a competitor is often a better strategy than attacking it head on.

In many business and economic analyses, 20% or so of a market, customer group or product line seems to account for a disproportionate share of the result sought, often characterized as 80% of the revenues, profits, etc. This Pareto or 20:80 rule became very popular in the 80s when activity based costing exercises revealed that, for many companies, profitability was driven by a small number of customers. The less intuitive conclusion, one that frequently has to be explained to first time readers, is that if 20% of your customers account for in excess of 100% (say 150-200%) of your profits, then you are losing money on the other customers.
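
The arithmetic is worth making concrete. Here is a toy calculation in Python with invented numbers (1,000 customers, $1 million of total profit, the top 20% earning 150% of it):

  customers = 1000
  total_profit = 1_000_000
  top_share = 0.20                          # the profitable 20% of customers
  top_profit = 1.5 * total_profit           # they earn 150% of total profit
  rest_profit = total_profit - top_profit   # the other 80% must net out to -$500,000

  top_count = int(top_share * customers)
  rest_count = customers - top_count
  print(f"Top {top_count} customers:   ${top_profit / top_count:,.0f} profit each")
  print(f"Other {rest_count} customers: ${rest_profit / rest_count:,.0f} each (a loss)")
  # => Top 200 customers:   $7,500 profit each
  # => Other 800 customers: $-625 each (a loss)

Seen this way, the question of what to do about the 800 money-losing customers becomes unavoidable.
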
What is challenging about the Pareto insight is that it offers a universal rule of thumb, frequently and consistently important, but the prescription from the insight is often less obvious. And sometimes it is wrong. The Pareto insight leads to one of two conclusions:


1. The 20% of customers represent a unique group and lessons learned from them are not directly applicable to the rest of the market.


2. The 20% of customers represent a model for my future business.


Business Model Revision

The Pareto rule is often difficult to apply where a new business model is being introduced. It is never 100% clear whether a potentially disruptive technology at the low end of a market will change market requirements or provide a platform for an initially insignificant competitor to build upon. In a parallel way, the migration of high end features from premium products and services to the mass market is also difficult to predict in some markets.

The concept of Track and Trace (making parcels and envelopes trackable throughout the logistics process), offered by FedEx and UPS, was extremely threatening to the Canadian Post Office. They could see no way of matching the capability given the volume of mail and packages they delivered.

The internal debate revolved around whether they should take a Pareto approach and focus a Track and Trace capability only on parcels and high value-added packages, or whether this would become a long term capability for all logistics operations. Their eventual conclusion was that they needed more presence in the premium package and envelope business and that Track and Trace would largely be restricted to the high end of the market. To this end, they bought a courier company, Purolator.

Dropping Customers

One potential prescription from learning that 20% of your customers account for 150% or more of profits is to slim down the business and focus upon the profitable customers. However, this strategy is often emotionally very difficult for many managers and is often pursued too late. Managers have spent so much time investing in acquiring customers that giving up those customers is distressing.

Raising Prices

Raising prices for the less profitable customers seems like an obvious response to a 20:80 insight, but resistance from the sales force, inadequate systems for tracking discounting behavior and negative feedback from customers are likely barriers that will need to be overcome. Slightly more clever approaches change the basis of pricing in ways that are more palatable to customers. Leasing and usage based pricing are particularly attractive to capital constrained customers and change the nature of the evaluation process.

Reducing the Cost of Delivery for Unprofitable Customers

Financial services organizations have frequently used self service, with ATMs and online banking, to reduce the cost of serving less profitable services and less profitable customers. In contrast, wealth management services offer higher levels of service and advice.

Reducing Marketing Costs

A less obvious approach to making more customers profitable is to delight customers so that they become your sales agents. Strong word of mouth can significantly reduce the required marketing budget. Amazon uses daily specials in, for example, the MP3 download market to encourage daily visits to its site, a low cost way of generating traffic. Heavy users may tell their friends about hot music they have found a deal on.

Life Cycle Management

As with most costing decisions, it turns out that pricing is often a strategic decision. As a result, the time frame over which you measure customer profitability is critical as are the implications for organizational capacity.
  • Mature software companies like Oracle will often discount their software significantly to obtain sales (with the largest discounts occurring at quarter and year end, when sales reps are under pressure to meet their goals). Part of their willingness to do so is their knowledge that software is actually more a service than a capital expenditure, with maintenance revenues a critical part of the annuity relationship with a customer.
  • The success of the Apple iPhone is based in part upon the fact that a $600 cost of purchase by AT&T is resold to a subscriber for $200 in return for a two year contract that might add up to $2,400 of revenues and a strong probability of retention at the end of the contract (see the sketch after this list). These high ARPU (average revenue per user) clients are likely some of AT&T's most profitable.
  • The challenge for AT&T is how to grow this business. Are these customers atypical, i.e. a 20% that is unrepresentative, or do they represent a different business? Research suggests that there is a large gap between the number of subscribers who would like an iPhone-like service (i.e. with easy to use Internet access) and the actual number of data subscribers. One interpretation is that subscription rates would increase with lower data plan prices. A move in this direction would have significant implications for AT&T's mobile network capacity and architecture, since it would reduce the revenue per user from data plans and lower revenues per byte transmitted.
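
A quick sketch of the contract arithmetic in the iPhone example above, using only the round numbers already quoted and ignoring network, handling and service costs (which are not part of those figures):

  device_cost = 600        # what the carrier pays per handset (per the figures above)
  upfront_price = 200      # subsidized price paid by the subscriber
  contract_revenue = 2400  # roughly $100/month over a two year contract

  subsidy = device_cost - upfront_price               # $400 recovered over the contract
  revenue_net_of_subsidy = contract_revenue - subsidy
  print(f"Subsidy: ${subsidy}, contract revenue net of subsidy: ${revenue_net_of_subsidy:,}")
  # => Subsidy: $400, contract revenue net of subsidy: $2,000

At roughly $100 a month, the $400 subsidy is covered within the first few months of the contract.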

Increasing Value Propositions

Bundling is one example of changing value propositions. Companies like Hyperion (now Oracle) and Microsoft have used bundling to reduce the cost of individual applications, but create more value for customers. In telecom, triple and quad plays (combination of voice services, broadband, TV services and mobile services) are a common marketing approach.

In some ways, bundling can be slightly unintuitive. Most purchasers of, say, MS-Office probably don't use most of the features they purchase, but the incremental cost of having compatible features available has value. Most fixed rate pricing programs, e.g. Netflix, make the same tradeoff. Netflix may lose money on some customers who are heavy users of the service, but the reality is that most people are at or close to the limit of the time they can devote to viewing videos. Value perceived is not necessarily usage.

Changing the Basis of Competition and Cost of Delivery

The Pareto insight is one that many companies are facing in the current recession. When demand for a product category drops dramatically, downsizing decisions are often shaped by assumptions about demand and profitability distributions.

In the automobile industry, the politically unspoken "elephant in the room" is that gasoline prices will in the future be maintained at a far higher level than previously, for reasons of balance of trade, security and global warming. This increase in the total cost of ownership of a car will likely make small cars more popular and more expensive than they have been historically. It will also make driving more expensive, reducing total demand for cars. Auto companies are faced, as a result, with downsizing, a cyclical downturn of unusual size and a longer term secular shift in purchase patterns.

As with many markets, the automobile market is likely to become far less homogeneous. The market may evolve towards predominantly electric powered microcars for in-city driving, hybrids for trips requiring greater range, and larger hybrid capacity vehicles for transporting larger groups of people.

For automobile companies, their cost structure and traditional assumptions about profitability, scale and scope economies are all open to question.

In the same way, telecom service providers faced with demand for low priced unlimited data plans are having to rethink their network architecture to divert home and office data traffic away from cell towers to home fixed broadband connections.

Summary
The key take-away from any Pareto analysis is that it is a useful rule of thumb that inspires important questions about making money in your business. Making the right decision means not only looking at product and customer profitability, but also your delivery process and value chain from the perspective of both current and emerging usage patterns.