Tuesday, August 31, 2010

Review of LiquidPlanner (www.liquidplanner.com)

Review by Alistair Davidson, Partner, Eclicktick Consulting

Strategic planning is not project planning. But strategic planning often requires project planning and dealing with high levels of uncertainty. There is, as a result, no single good way of managing execution that will apply to everyone’s situation. LiquidPlanner, an online project management package priced at $25 per user per month, is a simple tool that addresses an important element of project planning and project portfolio management – the difficulty of estimating how much work a task will require.

Uncertainty about the length of time required to complete a task has numerous consequences. First, team members may pad their estimates of the work required to complete a task. Second, downstream tasks dependent upon the padded estimate will look even worse than they should. Put together too many padded estimates and projects may become impossible to sell internally. Restructuring of projects and resources becomes even more difficult than if better estimates were available. Even worse, arbitrary decisions about what is realistic and what is not may lead to top-down decisions by a senior manager or project manager—resulting in forecasts that are just as erroneous and arbitrary.

LiquidPlanner’s solution is straightforward. Tasks are estimated as taking a range of time, e.g. from 2-4 hours or 1-4 days. The Gantt charts (horizontal bar charts) that lay out the steps in a project show the most likely completion time for each task, the earliest possible date, and the worst case for delivery. Promised delivery dates become easier to understand and more likely to be accurate.
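To see how ranged estimates roll up into a schedule, here is a minimal sketch assuming independent, sequential tasks. The task names and numbers are illustrative, not drawn from LiquidPlanner itself:

```python
# Each task is estimated as a (low, high) range of hours.
# Task names and numbers are purely illustrative.
tasks = {
    "design": (2, 4),
    "build":  (8, 16),
    "test":   (4, 8),
}

best_case  = sum(low for low, _ in tasks.values())
worst_case = sum(high for _, high in tasks.values())
# Midpoint of each range, as a rough "expected" figure.
expected   = sum((low + high) / 2 for low, high in tasks.values())

print(f"best case:  {best_case} hours")   # 14 hours
print(f"expected:   {expected} hours")    # 21.0 hours
print(f"worst case: {worst_case} hours")  # 28 hours
```

Simply summing worst cases, as above, tends to overstate how late a project will actually finish, which is why probabilistic approaches are more informative.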

LiquidPlanner is very much a child of 2010. The application comes with a portal where portions of a project can be shared with external participants (e.g. suppliers, customers, sub-contractors, partners). Messaging between participants can also be exposed through the portal. And an excellent series of video tutorials gets users up to speed quickly.

Adding Monte Carlo or stochastic modeling to regular project management software is an option in many packages, but usage of probabilistic project management is still quite rare. LiquidPlanner is an easy-to-use and attractive tool for managing projects and small portfolios of time-based activities. Unlike many more complicated tools for Monte Carlo modeling, it keeps things simple.
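The probabilistic idea can be sketched in a few lines of Monte Carlo simulation, assuming each task's duration is drawn independently from a triangular distribution over its estimated range. The numbers are illustrative, and this is a sketch of the general technique, not of how LiquidPlanner computes schedules internally:

```python
import random

random.seed(42)

# (low, most likely, high) duration estimates in hours; illustrative numbers.
tasks = [(2, 3, 4), (8, 10, 16), (4, 5, 8)]

def simulate_project(tasks, trials=100_000):
    """Sample the total duration of sequential tasks, each triangular."""
    totals = []
    for _ in range(trials):
        # random.triangular takes (low, high, mode).
        totals.append(sum(random.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks))
    return sorted(totals)

totals = simulate_project(tasks)
median = totals[len(totals) // 2]
p90 = totals[int(len(totals) * 0.9)]
print(f"median finish: {median:.1f} h; 90% confident by: {p90:.1f} h")
```

In runs like this, the 90th-percentile finish lands well below the sum of the worst cases (28 hours here), which is exactly the insight that ranged estimates buy you.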

Monday, August 30, 2010

Raising Notebooks From the Dead

Don’t get me wrong. While I ran a software company for many years, I am not someone who likes to spend time delving into the innards of computers. And that perhaps makes this article more significant.

I had a problem, one that many households and small businesses share: lots of old notebooks sitting around that are essentially unusable – problems with drivers, old operating systems, slow speed, too little RAM. I am a strategic and business development consultant who works with clients that always seem to be in crisis, so my time is more valuable than playing around with old computers.

But last week, during a quiet stretch in August, I decided it was time to clean up. As an experiment, I downloaded Ubuntu from ubuntu.com, burned an installation disk, and tried it out. There are four ways you can try out Ubuntu as an operating system:

1. Run it from the CD which makes no changes to the computer.
2. Install it so that when you boot your computer, you can choose which operating system to boot.
3. Install it as a window that runs within MS-Windows using, for example, the free VMware Player from vmware.com.
4. Replace your old operating system with Ubuntu.

One of my notebooks would not let me dual boot, but it is a machine that needs its Windows reinstalled anyway. I succeeded with practically no effort in setting up Ubuntu and VMware Player on my desktop and latest notebook. And I converted an older notebook to an Ubuntu notebook that runs better and faster than it did with Windows, a fact that would be no surprise to Unix/Linux enthusiasts.

Score 1 for easy installation for Ubuntu.
Score 2 for VMware Player.

But what about usability? The Macintosh (which is based upon UNIX) and Windows set the standard for ease of use. How does Ubuntu stack up? To my surprise, I would rate it acceptable verging on good for a normal, relatively unskilled user. Ubuntu comes with most of the applications a normal person needs in daily computer usage: Firefox for browsing; Evolution, an Outlook-like email client; OpenOffice, an open-source equivalent of MS-Office; music and video playing software; and a convenient application center where you can download programs like GIMP, a Photoshop-like image editor. All of this comes in a windowing environment (GNOME) that is reasonably easy to use and master.

Now, I have no personal interest in becoming a Linux geek, but my take-away from this small project is that Ubuntu and its supported software provide an inexpensive way of turning an older machine with the performance of a boat anchor, one riddled with performance and driver problems, into something useful. Given that buying a decent netbook might cost you $400, Ubuntu can save you money and provide an inexpensive way of making a second or third computer usable for “commodity” tasks such as word processing, spreadsheets, presentations, small databases, music, video and photography.

The bottom line: Ubuntu does not require as much hardware as Windows, and its level of driver support and ease of installation make it usable by just about anyone who can burn a disk in Windows. In fact, on my desktop, Ubuntu recognized and used a sound card that Windows is currently failing to support -- pretty impressive.

And did I mention that all the software discussed here -- the Ubuntu operating system, the applications, and VMware Player -- is free?

Wednesday, July 28, 2010

Why economics matters to strategists

Recently, there has been a great deal of debate about the new iPhone 4, its ability to hold a signal and the quality of the AT&T network. In brief, the claim is that the antenna on the iPhone 4 is affected by being held and that AT&T has an inferior network to Verizon. And because iPhone users and detractors are often emotionally involved in their views, the framing of the problem is often not very useful to a strategist.

The economics of networks is not being considered in this debate. Since smart phone users are a new category of users and the iPhone was the first to make data usage easy, it is no surprise that AT&T has run into usage economics first.

Consider the following:

Customer usage: if you allow customers to use as much of a service as they like, they will use more of it up until the point where other constraints, such as their willingness to use their phone all the time, become a barrier. Unlike mature markets, where saturation of usage has occurred (e.g. Netflix, which assumes you only have so much time to watch movies), in developing services usage has often not yet reached a natural upper limit. In a fixed-capacity network such as 3G, adding more users and making usage easier leads (surprise, surprise) to more usage.

Service provider investment: if you do the calculation on revenue per byte, voice calls are more profitable than data sessions – about two orders of magnitude more profitable. So if you are a service provider, you face the problem of expanding your network to carry less profitable usage. It’s no surprise that investment has lagged.
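A back-of-the-envelope check makes the gap concrete. The prices and codec rate below are assumptions chosen for the arithmetic, not actual carrier figures:

```python
# Illustrative assumptions, not actual carrier pricing.
voice_price_per_min = 0.10               # $ per voice minute
voice_bytes_per_min = 12.2e3 / 8 * 60    # ~12.2 kbit/s voice codec -> bytes/min

data_price_per_gb = 10.0                 # $ for a 1 GB data plan, fully used
data_bytes = 1e9                         # bytes in that plan

voice_rev_per_byte = voice_price_per_min / voice_bytes_per_min
data_rev_per_byte = data_price_per_gb / data_bytes

ratio = voice_rev_per_byte / data_rev_per_byte
print(f"voice earns ~{ratio:.0f}x more revenue per byte than data")
```

With these assumed numbers the ratio comes out at roughly 100x, consistent with the "two orders of magnitude" figure above.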

Next generation economics: as so often happens when people straight line their conclusions about growth, they fail to consider alternative ways of solving problems. In the case of smart phones, the alternative to using capacity on the cell tower is “diversion” or diverting traffic to alternate channels. These alternatives fall into three basic categories:

1. Public wi-fi sites
2. Home and business wi-fi sites
3. Femtocells

What the first two have in common is that smart phones today typically offer wi-fi capabilities as well as 3G. And the speed of wi-fi is typically greater than 3G, so if you want to download a big file, it’s a good idea to do it via wi-fi.

Femtocells are less well known. They are, in effect, small base stations that provide a better signal to a cell phone in a home or office. But rather than connecting to the mobile network via a public cell tower (and its backhaul), they link through the home or office fixed broadband connection -- which is pretty much always available at low cost and typically at zero incremental cost for usage -- to the backbone or fixed network that connects the cell network.

Both wi-fi and femtocells divert traffic from overburdened cell towers and their backhaul and use an existing under-utilized and low cost fixed infrastructure. Because data is charged at a lower rate than voice traffic, diverting data is the most profitable from the perspective of return on capital. But voice can be diverted as well over wi-fi freeing up capacity at cell towers.

Not surprisingly, diversion is a multi-billion dollar issue for mobile companies -- unlike the difficult and time-consuming process of putting up cell towers, wi-fi and femtocells require no regulatory approval and are significantly cheaper (no bandwidth purchase needed, no rental fees, no backhaul costs).

Perhaps even more importantly, given that the majority of usage of cell phones occurs in the office or home, highly granular improvement in the network can be focused upon areas where most usage occurs. You can think of the use of wi-fi and femtocells as adding leaves to the branches of a tree.