Who needs another framework?

Carnegie Mellon – the brand behind the global CMM standard for software providers – or rather another part of Carnegie Mellon, called the ITSqc – has another model available specifically for e-sourcing: the eSCM-SP and eSCM-CL, or eSourcing Capability Model for Service Providers and for Client Organizations respectively. Wikipedia link here. ITSqc homepage here.

I’ve had a read of the eSCM-SP and am struggling to see what value it adds, certainly in terms of how I understand sourcing and eSourcing. Two things to bear in mind with this model:

  1. The acknowledgements list includes contributors from Satyam, IBM, HP, Accenture, Deloitte etc. No mention of an Ariba or a FreeMarkets (let alone anyone else in the space). No long list of CPOs from major organisations. No mention even of any organisations that track and analyse the space. Yet the ITSqc says in its description of the ITSqc research consortium that “Our members consist of international industry leaders in eSourcing on both the Client Organization and Service Provider sides of the relationship, including clients, service providers, advisors or consultants, and the standards community.”
  2. I’m dubious about the value of their definition of sourcing vs e-sourcing. You’ll have to download the documents yourself to see the graphic I’m referring to – in the meantime here are the definitions:
  • IT Sourcing contains Applications Development & Management, Desktop Maintenance, Application Service Provider, Data Center Support, Telecommunications Network Support
  • Task & Business Process Outsourcing contains everything from IT Sourcing and also includes Finance & Accounting, Engineering Services, Human Resources, Data Capture, Integration & Analysis, Call Center, Medical/Legal Transcription, Purchasing
  • eSourcing covers IT Sourcing and also Task & Business Process Outsourcing
  • Sourcing contains everything in IT Sourcing and Task & Business Process Outsourcing and also the likes of Janitorial Services, Linen Services

Clear? Like I said – you’ll need to look at the graphic in their documentation to get a better understanding. In the meantime here is my interpretation:

According to the model the core of sourcing is the sourcing of IT-related services, e.g. Desktop Maintenance, Applications Development, Data Center support.

The next level up in the sourcing definition brings in the sourcing of what has become known as BPO (Business Process Outsourcing), e.g. the sourcing of Accounting services, the sourcing of Legal Transcription services, the sourcing of HR services, putting together call centers.

Both of these levels are covered by the model’s eSourcing definition. The sourcing stuff that is outside of scope of the model is, for example, Janitorial Services and Linen Services.

There is a pattern in all of this: the model defines eSourcing as the stuff you can outsource to a 3rd party offshore provider. It excludes from scope the stuff that needs people onsite, or transportation of physical goods.

Now – if you look back at the list of companies contributing to the definition of the model you’ll see that, surprise surprise, they tend to be the providers of the outsourced services that can be provided offshore (e.g. legal transcription, application development services).

But when someone tells me that they are looking for eSourcing or IT-enabled sourcing, to me that means using IT to help make sourcing better. This can mean anything from using SAP to using Excel templates (or anything in between) and can certainly be used to source Janitorial Services better just as it can be used to source Desktop Maintenance better. The definitions used by the eSCM suggest that they see eSourcing as the procurement of services that can be provided remotely using the internet.

So is the model going to help you decide whether to go Ariba or SAP, or whether to outsource the whole of your sourcing function to China? Probably not. But will the model help you decide whether Accenture or Wipro will be best to run your 400 person call center? Possibly yes.

So tread carefully – and beware that just because people are using the same words doesn’t mean they are talking about the same thing.

While I’m on the subject of the eSCM here are a few more thoughts:

The eSCM shares the same brand as the CMMI that has become very popular with IT service providers over the past decade. But it doesn’t follow from the CMMI being a de facto standard in the IT industry that the eSCM will become a standard in the procurement space. In fact CMMI level 5 certification is not in itself a guarantee of a stable, quality provider: Satyam (coincidentally one of the contributors to the eSCM) is CMMI level 5 certified (check their awards page and scroll down to 2005-2006 for CMMI and pre-2001 for SEI-CMM, the predecessor of CMMI) and yet its leaders are at the centre of a fraud probe.

As far as 5-level maturity models go in the sourcing space I am quite taken with Hackett’s one. Incidentally my post on the subject is one of the most popular pages on this blog.

Till next time.

How you can be on time, on budget and on scope but still FAIL

OTOBOS – On time, on budget, on scope – is the mantra of software project managers everywhere. So many IT projects fail on one or more of these three factors that project management 101 takes great pains to ensure that aspiring project managers learn how to stick to all three (or at least two).

But being OTOBOS is only the first hurdle to cross. Fail at OTOBOS and you fail, for sure. But there is a much, much bigger picture than simply being on time, on budget and on scope. You can be OTOBOS and still, from a business perspective, deliver a big FAIL.

There are a host of reasons why this can happen. Here’s one fairly common one, in enterprise software at least.

The project team has forgotten the real objective behind the IT system

There is a subtle, but important, difference between a requirement and an objective.

Here’s how the scenario plays out: A senior individual makes a proposal for an IT system to be built/implemented in order to achieve a particular business objective. Once the proposal is approved a project team is convened to implement the IT system. The team assiduously collects requirements from stakeholders and carefully manages any changes to those requirements. Eventually a shiny new system is unveiled that meets all the requirements … but which fails to deliver the business objective.

What happened?

It could just be a simple breakdown of communication somewhere along the line from the business objective to the requirements to the developed code. Usually the project objective is hidden in a project initiation document or a project charter, never to be seen again by anyone until after the project has finished.

Or it could just be that business objectives have shifted since the project was initiated and no-one remembered to tell the project team. As projects get longer this risk becomes more real.

Here’s a story I heard a while ago – and I’m sure those of you in enterprise IT land won’t find it surprising. A software product was supposed to streamline a procurement process by empowering end users to place purchase requests themselves, thereby avoiding paper trails and manual approvals. The system was duly delivered, on time, on budget, on scope. It met all requirements. But it turned out that the system was too cumbersome for end users to use. They continued to place their orders on paper and have them approved on paper, and then it was someone else’s job to input the approved orders onto the new system. Far from streamlining the process you could argue that the system had ended up adding a new step and more time to the process.

How to avoid this issue?

Have a clear, simple business objective for your project that everyone, from the sponsor to the intern knows and uses every opportunity to repeat. This helps people remember what they are really trying to achieve. Team members can identify early any threats that might prevent delivery of the objective. And when business priorities change (as they surely will) these can be filtered to the project team early enough that you stand a chance of doing something sensible.

No Joke – this is what my Windows Live looked like this morning

I did a double take when I saw this. I had clicked on an uninteresting story and, while Vista was taking its time loading up Internet Explorer, lo and behold, this is what the Windows Live app showed. Eventually it refreshed with correct “news” but not before I was able to take this screenshot.

Looks like I was treated to a Windows Live wireframe. If so it’s entertaining to see the level of humour and cynicism amongst their designers.

Looks like a windows live wireframe

On Free, Open Source and VRM

Open source is pretty standard practice these days – whether on the desktop (OpenOffice), on the server (MySQL), in the enterprise (WSo2), even tools like CRM are now available open source in some flavour or another. There are two types of people who are interested in open source.

A. Open source is interesting because it’s open. The philosophy behind this one is that “given enough eyeballs, all bugs are shallow”, à la Eric S. Raymond (http://en.wikipedia.org/wiki/Eric_S._Raymond).

B. Open source is interesting because it’s free to use. The philosophy here is, duh, why spend money when I can have something for free?

Open source, because it’s open, is great for learning, great for practising and great for demonstrating your chops to your peers and to potential employers.

Open source, because it’s free to use, lets companies build more stuff better and faster and cheaper with open source tools than they would have been able to without. If you want you can even pay for your open source (even if only via add-on services for enterprise support).

There’s a big difference. And I suspect that for most people the free aspect of open source is more important than the open aspect.

Open Source and Free

For example I’ve been giving VRM a lot of thought recently (see the post in my other blog in which I was put right on some assumptions by Doc Searls and Graham Sadd), and I’ve been interested in some of the debates (e.g. here and here) about the role of open source in VRM.

On the one hand is a train of thought that goes something like “this needs to become commercial to be successful”. On the other hand is a train of thought that goes something like “the open source community is our best bet to solve something as massive as this.”

And then of course there are plenty of comments along the lines of “it is both commercial and open source”.

But open source as relates to VRM – is it really that important? Adriana Lukas’s MINE! project is open source but does it need to be editable by anyone (open)? Or does it need to be freely distributable to anyone (free)?

Of course the use of free software elements will be invaluable in building effective VRM, just as with building any software these days. But open source doesn’t need to be part of the philosophy of the movement itself. Identi.ca is open source. Twitter is not. But there are plenty of ways of getting at the data in Twitter in third party applications.

It’s at the data layer that I think the open-ness debate needs to happen. The importance with VRM is in a user controlling their own data and communicating that data easily to potential vendors, under the user’s terms. The code that enables this to happen is of secondary importance. And will probably come from a whole host of small, different pieces. Some parts may be open source, but other parts may be proprietary (but with good APIs). Perhaps I’m trivialising things but I wonder whether with VRM the secret of success is “open data” rather than “open source”?

Common sense procurement tools: Pro Purchaser

Worth checking out for procurement professionals is Pro Purchaser, a Canadian provider of market information. What I like about the service these guys provide is that it is all about common sense.

At first sight the site seems incredibly basic. It only has monthly updates of pricing information, for example. But the real value is in the combination of pricing information and “Negotiating Nuggets” – reports that Pro Purchaser produce on various aspects of negotiating.

For example, if you are wondering how useful monthly prices are going to be to you as a buyer, here’s a great Negotiating Nugget from Pro Purchaser called Direction matters more than accuracy.

Gartner Magic Quadrant for Sourcing Application Suites – A Reaction

See this link on Spend Matters for the story on Gartner’s 2008 Magic Quadrant for Sourcing Software: http://www.spendmatters.com/index.cfm/2008/7/15/A-Free-Look-at-a-Gartners-Sourcing-Magic-Quadrant. My reaction was a bit too lengthy for a comment on Jason’s post, so here it is, below:

First a disclosure: I lead the product development for TradingPartners. TradingPartners provides eAuction services. Very similar to what FreeMarkets pioneered all those years ago with their “Full Source” offering. We don’t sell software licenses, let alone software suites, so wouldn’t fall into Gartner’s analysis but we are considered competitors with a number of the companies mentioned in the Gartner Sourcing Magic Quadrant report. You can make up your own mind to what degree the comments below are self-serving or not.

I’m not keen on the name of the report. Despite all disclaimers the report is titled “Magic Quadrant” which implies that there is one “magic quadrant” that buyers should look at, i.e. the top right. And when I look at the vendors in the graphic the relative ranking seems fairly arbitrary. Certainly it’s not clear from the report why Quadrem should be better able to execute than BravoSolution.

The best part was the overall 10,000ft view of what is happening in the market. In particular:

  1. The distinction between strategic sourcing and tactical sourcing. “Organizations should expect to eventually deploy two separate sourcing solutions or two configurations of a single solution: one for tactical sourcing (for example, querying a contract fuel vendor for this week’s price per liter) and one for strategic sourcing (such as simultaneously negotiating rental car contracts across multiple vendors for service for the next three years and in 10 countries)”.
  2. The summary of the consolidation in the sourcing software space (gone are Freemarkets, B2eMarkets, Frictionless, Mindflow, Procuri, Verticalnet).
  3. The recognition that wrap-around services are of paramount importance in strategic sourcing initiatives. “[E]ffectively leveraging different auction/event types for the best results requires a knowledge that can be gained only by using strategic sourcing applications. Furthermore, enabling suppliers to register online and providing customer service to troubleshoot their issues requires a significant effort that a procurement group will not be able to support without advanced planning and incremental staffing”.
  4. The recognition that strategic sourcing tools don’t require ERP integration. “They function nicely as standalone tools, because the trigger to commence a strategic sourcing event is the initiation of a project, and prospective vendors do not need to be in the vendor master unless they win the bid. The output of an event tends to be a contract. The unstructured nature of strategic sourcing lends itself to solutions that are architected as project management and document repository tools”. Here Gartner calls strategic sourcing unstructured. I would prefer to call it BRP (in contrast to ERP).
  5. The recognition that, in reality, buyers are still sticking to Excel rather than fully automating the sourcing process. “Requirements should be specified in the sourcing tool at the line-item level to fully evaluate and document the resulting bids using the application; however, in practice, many companies simply attach the specifications and record the resulting proposals at the header level, and analyze the results offline.”

Some parts of the report I would take with a pinch of salt:

  1. Including forward auctions in the debate. They are a red herring. Sure from a technology point of view they are similar to reverse auctions but in practical business terms they are of little relevance to most buyers.
  2. Cautioning that some vendors’ software is buggy. Without any meaty supporting arguments I’d assume all the software is much the same in this regard.
  3. Come to think of it, a lot of the “strengths/weaknesses” seem cursory. E.g. Ariba is praised because it “offers varying scales of its sourcing product so customers can consume functionality as gradually as desired”. And Ariba is criticised because its “customers tend to use sourcing to solicit bids from local vendors”. 

Spreadsheet Worst Practices

I’ve long contended that, behind all the hype about Source To Pay systems and SRM packages and Flex interfaces and eAuction software, Excel remains one of the top 3 software tools for buyers. (The other 2 being Outlook and Google).

So I enjoyed reading this article on CFO.com all about spreadsheet worst practices and how to avoid them. Here’s how the article starts:

There’s little doubt that electronic spreadsheets are the most widely-used financial software application. But they are also the most-abused.

The article clearly struck a chord with CFO’s readership, as they published a follow up with readers’ views.

The CFO article is directed mainly at those who use Excel for number crunching, analysis and what-if planning. So the practices in the article will be of most interest to buyers who use Excel for analysis. But there are also some nuggets that you can pull out of the articles, even if you only ever use Excel for issuing RFQ templates.

The practices CFO highlights:

1. Poor segregation of data. Some people use Excel just as a super calculator. So if you look into a cell you might find the formula “=300000*1.50+158000*1.46+250000*1.20*0.95”. While it might make sense to the person doing the calculation at the time that we are looking at the total forecast spend for three different parts (300000 units at $1.50, 158000 at $1.46 and 250000 at $1.20 less a 5% discount), a formula as bare as this is not going to help explain the data 3 months down the line.
2. Poor documentation of assumptions. The last part in my example formula is 250000*1.20*0.95. You could read this as 250000 parts at $1.20 with a discount of 5%. But why the discount? Does the discount always apply? Or is it some volume discount based on ordering over 200000 items?
3. Poor documentation of constraints. Don’t put one complex formula in a cell. Remember in your maths exams when you were always told to show your workings? Same applies in Excel. Better to use multiple, intermediate calculations to show how you are getting to the final result.
4. Difficulties in making changes. If we decided that we wanted to change the forecast volume for part B to 180000 then it’s not immediately straightforward to know where to update the spreadsheet.
5. Now it’s here, Now it’s not. The ability to change one value in a spreadsheet and have all the relevant values re-calculated is very powerful. But it’s also easy to lose track of where you were before you started your what-if scenarios. CFO.com’s recommendation is to use different worksheets for different scenarios, with one master worksheet to summarise and compare the results of your different scenarios.
6. Presentation Ready. It’s not hard to set your spreadsheets up for printing – with headers, footers, page sizing, repeating columns and rows. But it’s often overlooked, to the annoyance of the people you are emailing your spreadsheet to.
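To make practices 1–4 concrete, here is a minimal sketch (in Python rather than Excel, purely for illustration) of the same forecast-spend calculation with the data, the assumptions and the intermediate workings separated out. The rationale given for part C’s 5% discount is a hypothetical assumption, just as it was left open in the example above:

```python
# The bare spreadsheet formula: =300000*1.50+158000*1.46+250000*1.20*0.95
# The same calculation, restructured so each number is labelled and each
# assumption is documented.

# Data: forecast volume, unit price and discount per part.
parts = {
    "part_a": {"volume": 300_000, "unit_price": 1.50, "discount": 0.00},
    "part_b": {"volume": 158_000, "unit_price": 1.46, "discount": 0.00},
    # Documented assumption (hypothetical): part C's 5% discount is a
    # negotiated rebate on this part only, not a general volume break.
    "part_c": {"volume": 250_000, "unit_price": 1.20, "discount": 0.05},
}

def line_spend(volume, unit_price, discount):
    """Intermediate working: spend for one part, with the discount applied explicitly."""
    return volume * unit_price * (1 - discount)

# Workings shown per part, then totalled. Changing part B's forecast
# volume to 180000 now means editing exactly one labelled value.
line_items = {name: line_spend(**p) for name, p in parts.items()}
total = sum(line_items.values())

for name, spend in line_items.items():
    print(f"{name}: {spend:,.2f}")
print(f"total forecast spend: {total:,.2f}")
```

The same separation works in Excel itself: one area (or worksheet) for input data, one for documented assumptions, and intermediate cells for the workings, with the final total referencing those cells rather than hard-coded numbers.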

Downgrading Vista

We’ve been with Vista since the middle/end of last year. Despite our road testing (which went well) and my own initial experiences of Vista (not that bad) we are seeing performance issues, especially with boot-up and crunching large files.

So yesterday we took the decision to offer our laptop users the option of downgrading to XP if they would prefer the performance gain. Especially given the recent coverage here (via Crunchgear). Desktops still seem to be working fine on Vista since they are more powerful machines. But even the maxed out laptops we are using can be pretty slow with Vista.

Procurement Solutions Exhibition – Things I might be buying

I was at Procurement Solutions yesterday as a speaker. But also walking the exhibition floor I came across two companies which looked pretty cool and which I will be checking out further:


Huddle is one of the UK’s rising stars in the Enterprise 2.0 space. They make very simple online collaboration software, in a similar vein to Basecamp. Why am I interested? I am looking at better ways for us to integrate with our clients during our projects. Trouble is, most of our clients aren’t very familiar with MS Project. So a non-MS Project project management/collaboration tool is going to be interesting, especially if the price tag compares favourably against MS Project. Huddle’s Head of Marcomms was on the stand and she was refreshingly candid. I asked her how the product compared to Basecamp and DreamFactory. She wasn’t familiar with DreamFactory but there and then googled them on her laptop, which was hooked up to the plasma display on her stand. She didn’t drill into their site, clearly – she just looked at the Google results. But it’s refreshing to see someone who can be open about their product and its merits compared to the rest of the marketplace – and doesn’t need to rely so much on spin. (Unless of course she was secretly spinning me in some subtle way I didn’t notice.)


Mimio make devices that attach to your whiteboard and capture the information on the whiteboard electronically. It’s all very automagic but the bits I played with and looked at certainly seemed to work. Now, I adore whiteboards. So something that can extract the scribbles I make on whiteboards – and at a sub $1000 price tag – is going to be interesting to check out further.

Another reason why enterprise software is generally so bad

One of the main reasons is that the individual buying the software is rarely the person using the software. The individual buying the software is doing so on behalf of one party (probably a CFO or similar) but then imposing it on some others (probably some junior staff). Two issues here: Number 1 – there are plenty of opportunities for misunderstanding what is genuinely needed and what really makes sense to use; Number 2 – this approach smacks of a planned economy in which a “ministry of enterprise software” decides what software works best for us, the users. We all know that planned economies tend to underperform market economies.

Another reason is because of the divergence between what Thingamy are calling “ERP” vs “BRP”, (also described well here) or “Easily Repeatable Processes” vs “Barely Repeatable Processes”.   The argument here being that enterprise software may be quite good at automating drudge tasks like accounts payable, but that it struggles with something more fluid like strategic sourcing.

But on reflection, postulating a fundamental difference between ERP and BRP is something of an assumption.

Take brewing a cup of tea. Is this an “easily repeatable process” or a “barely repeatable process”? You’d hope that it is easily repeatable. You’d be surprised. I certainly was. I was once using an auditorium that was also being used by some students to learn process mapping. They had put up posters of their process maps for making a cup of tea on the walls. Each process was completely different. And the lecturer told me that as part of the course the students then have to try to make a cup of tea in front of the whole class by following their process map. Needless to say, they usually fail.

If it’s so hard to even define the process for making a cup of tea clearly, it should come as little surprise that the extensive process definition exercises associated with major enterprise software implementations end up delivering something that the users find themselves struggling with.