Two things caught my eye this week on the ongoing love/hate relationship between IT and Procurement.
IT doesn’t like Procurement
An article in the 10 June 2008 print edition of Computer Weekly, under the title “Procurement teams fail to serve IT”, reports on a talk given by Andy Kyte of Gartner. Some quotes from the piece:
Andy Kyte said that some businesses spend as much as 90% of their IT budgets on third-party suppliers but fail to get good value for money. Kyte said that IT procurement teams often act as if the only stakeholder they work for is the finance department, and this is the reason why many deals do not add value and even lead to projects failing.
And, directly quoting Andy: “There is an obsession with cutting costs in IT procurement, but IT is about ensuring a better quality service to users.”
You’d expect Andy Kyte to know his stuff when it comes to procurement. After all, he’s featured here at AribaLive 2005.
Procurement doesn’t like IT
Which brings me to this posting on the Ariba blog about Software As A Service vs IT. Here’s one quote: “it seems clear that IT’s tight grip on all things digital is gone and end users are winning the war.”
Which war? Justin goes on to explain:
What business user can survive without access to SaaS applications like WebEx or Salesforce? After a long battle over letting those and other apps through the network gates, the end users have won. IT simply can’t hold their end users back, strangling their productivity as more and more critical applications move online.
So can we ditch the stereotypes?
The stereotypes are there in full effect: from IT’s perspective, procurement is only about cutting costs; from Procurement’s perspective, IT is a gatekeeper that prevents users from doing business effectively. Sure, there is some truth to both of those stereotypes. Yet at the same time there are CIOs who are looking at ways to break down information barriers within and between enterprises, just as there are CPOs who are looking to generate value beyond unit-price savings. From what I’ve seen in my professional life so far – which has touched both IT and Procurement – the similarities between the functions are greater than the differences.
The doctor’s comment from my last post leads me neatly onto posting about what is broken about today’s models of enterprise software in general (and enterprise e-sourcing software in particular).
Here’s a take on how your average enterprise software deployment fails to meet expectations:
1. Someone in “the business” figures that a software tool will help address a particular business objective. After all, they were at an industry conference recently at which a company spoke about the tremendous benefits it achieved by adopting solution XYZ.
2. They issue a list of requirements to potential software vendors …
3. … who all agree that their software is capable of meeting those requirements.
4. The business then identifies a consulting partner to manage the implementation.
5. Implementation of the software begins with everyone in high spirits.
6. Then everybody realises that the business processes are a lot more complex than had been assumed in step 1.
7. To compound this, it turns out that when the software vendor said their product was capable of meeting the requirements, they neglected to mention that this would only be at considerable customisation cost.
8. More money is thrown at the project.
9. Then the business decides to change the requirements. (Repeat steps 6 to 8 for as long as $$$ and enthusiasm allow. Remember that the consulting partner is charging day rates, so won’t be overly upset. Or, if you thought you had a fixed-price agreement, now is when the change orders start flying thick and fast.)
10. The project is going nowhere. Exhaustion sets in and finally everybody agrees to implement something approximating the original set of requirements. The software is barely used.
11. The system is declared “live” to great fanfare from the project team, and indifference from the people who are supposed to be using it.
12. Just when everyone thinks the project has finished, someone checks back to see how well the software system has delivered against its original objectives. No-one can remember the original objectives.
13. Various options now exist: scrap the system (or scale back the implementation scope), point fingers, hire more people to operate the new system, etc. Some or all of these options get played out.
14. Eventually the dust settles. Project sponsor, consulting partner and software vendor all agree that the project was a tremendous success. Press releases, glowing resumes and presentations at industry conferences follow.
15. …… time to start again with the next customer.
Has Software-as-a-Service / On Demand solved all this? Not necessarily, as you soon realise when you speak to some of the people who actually have to use this stuff.
This started as a bit of fun but it’s turned out to be a surprisingly useful model for me. With apologies to Maslow’s hierarchy of needs, this is Buxton’s hierarchy of technology needs.
Maslow’s hierarchy, from top to bottom, is:
- Self-actualisation (e.g. morality, spontaneity, creativity)
- Esteem (e.g. achievement, confidence)
- Love/Belonging (e.g. family, friendship)
- Safety (e.g. security, employment, health)
- Physiological (e.g. food, water, shelter)
The point being that when you are being threatened by a bear you are not going to be particularly concerned about writing poetry.
Buxton’s Hierarchy of Technology Needs, then, from top to bottom goes as follows:
- Game-Changing (e.g. development of a new generation of products)
- Esteem (e.g. delivering better processes, directly supporting the generation of revenue)
- Social (e.g. having core applications like everyone else does: Internet, Excel, etc.)
- Safety (e.g. having a reliable PC that doesn’t crash)
- Communication (e.g. just having a phone or fax that will allow communication)
The point for technology leaders being that if people’s PCs keep crashing then the new SAP system you’ve just implemented is not going to be very interesting.
Some technology leaders are happy working at just the bottom two or three levels. Some organisations consider that technology is only good for the bottom two or three levels. At these levels technology is about providing “fixes”.
However, as the technology leader, you have a great opportunity to invent solutions for problems that have yet to be identified. (Invention, after all, is the mother of necessity; see this earlier post.) When you do, just don’t forget the importance of the things at the bottom of the hierarchy.
Computer Weekly today has an article that starts with the following sentence.
Last month, an IT director of a multinational company asked … how he could get major IT suppliers to agree to global corporate deals.
The print version of the article does not once mention using procurement’s help. To its credit, one of the original posts on the online forum does mention talking to procurement (and one suggests a reverse auction), but it’s surprising in 2007 that it doesn’t occur to an IT Director to talk to a Purchasing Director about negotiating with and managing suppliers. Is it fear of loss of control? Is it ignorance of the role of procurement? Either way this massive blind spot is a problem, because Purchasing and IT have a lot to offer each other.
Article in current issue of Computing: Retailers Rethink IT Director Role.
It reports that “High Street giants Boots and House of Fraser are phasing out the IT director role” and that these companies are now “sticking with existing IT and giving responsibility to the chief financial officer”.
In other words, if IT is primarily about support then what do you need an IT director for?
It is notable that the report mentions only the “IT Director” job title. Does this mean that CIOs and CTOs are still going to be around for a while?
On the one hand, obviously yes: 2nd hand cell phones in Burkina Faso
On the other hand, who in the USA or Europe would argue about the business benefits of mobile phones – they are so important that they are invisible.
IT does matter, arguably it matters more the more you take it for granted and the more invisible it becomes. But do IT departments matter? Well, that’s another story.
Was fortunate enough to be able to go to a dinner hosted by CBR last night, entitled “Spanning the IT/Business Divide”. It was being paid for by an SOA company, so there was a lot of talk about whether SOA does or does not help span the IT/Business Divide (conclusion – the name doesn’t help, because an SOA does not provide a “Service” in the way business people would understand).
More interesting was that the most provocative questions, which sparked a full and frank exchange of views, came from a gentleman who used to be in IT but is now a buyer of software rather than a technologist. Sample question: “Why should I invest an unknown $$$ value into an SOA when I could instead, at a low, known cost, hire some bodies to type information from one system to another?”
If your immediate response to that question is to get riled and believe that the questioner is stuck somewhere in the dark ages, then you have fallen into the trap of perpetuating the IT/Business divide. Because, I bet you, “The Business” is asking the same question. It is a serious question. IT departments need to be able to step this far back from their day-to-day operations to challenge themselves and their futures with questions like this.
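The buyer’s trade-off can be made concrete with a back-of-envelope comparison. This is a minimal sketch under made-up assumptions: the clerk salary, build cost and upkeep figures below are hypothetical placeholders of mine, not numbers from the dinner.

```python
# Known, low, linear cost: people re-keying data between systems,
# versus a large up-front SOA build plus ongoing maintenance.
# All figures are hypothetical, for illustration only.

def manual_cost(years: float, clerks: int = 2, salary: float = 30_000) -> float:
    """Cumulative cost of clerks typing data from one system to another."""
    return years * clerks * salary

def soa_cost(years: float, build: float = 500_000, upkeep: float = 50_000) -> float:
    """Up-front integration build plus annual maintenance."""
    return build + years * upkeep

# With these figures the integration only breaks even once the
# cumulative manual cost finally overtakes it -- decades out.
for years in (1, 5, 10, 50):
    print(years, manual_cost(years), soa_cost(years))
```

With these particular placeholder numbers the two lines don’t cross for fifty years, which is exactly why the question deserves a serious answer rather than derision: the case for the architecture has to rest on something other than replacing two clerks.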
And then I was reminded of this post on Deal Architect I read a while back, in which Vinnie Mirchandani intimates that the most important skill needed in IT departments is vendor management. IT directors need to address this challenge and justify why they (assuming they are technologists) are even there in the first place. Only then does the IT/Business divide stand a chance of being spanned.
p.s. Thanks to Jason Stamper from CBR for running the event. And apologies for not having paid attention to the new layout of CBR magazine. Truth be told, I have been a big fan of the magazine, but come to think of it I haven’t seen a copy in a while.
Been on holiday for the past few weeks – the distance from work, and the clear blue seas, help give a new perspective on things.
One of these for me has been the common assumption techies have that users are fundamentally dumb. IT support people are well known for holding this viewpoint. But then again, so do many software developers. For example, all those mega ERP-style systems that implement rigid processes assume that users can’t be allowed to think for themselves and that instead the system must do as much of the thinking for them as possible.
But the reality is that people are all actually pretty smart. If only technologists would start working from this assumption – assume that their users, customers, etc. are smart. If they can’t operate the technology, then assume that it is the system that is dumb, not the users. This shift in mindset would not only help IT be seen as more of an asset to the business (by reducing the amount of “them and us”), but would also allow technology to deliver more real value to the real users. Which, after all, is the point of technology in the first place, isn’t it?
I am pretty surprised that people still do software development where costs are high. For example Google are advertising for engineers based in London, Zurich and Dublin. Seems nuts when there are very highly skilled developers around the rest of the world who can deliver the same value for a fraction of the price.
Having said that, the price differentials between countries are themselves pretty staggering, and can change rapidly (e.g. rates in India are now far above what you could get in 1999 for millennium-bug testers, though I am sure the level of skill has risen too).
In this fragmented market I expect to see the development of trans-border outsourcing companies that can deliver quality developers at $10 an hour irrespective of where they happen to be based. And as a customer I could buy the services of such a company even if that means my development is in India this year, Russia the next, then China, then Vietnam.
Over Christmas with my parents, cousins etc.
- My uncle talking about whether to buy a new computer now or to wait until Vista comes out (ok, to be fair, he called it “Some new program from Microsoft”, but even so)
- My dad talking about using eBay (hell, I hadn’t even used eBay until two days ago and I’m the one working in the online auction industry)
- My mum talking about her recent installation of broadband and Skype
- Debating with my aunt the merits or not of printing out your digital photos
Unthinkable even 2 years ago.
At the same time that technological evolution (not just Web 2.0 but even such prosaic things as Windows and USB cables) makes technology easier to use, more people use it and therefore their level of ability with it increases. A virtuous or vicious circle, depending on your standpoint.
The same goes in the workplace. Ten years ago people outside the IT department didn’t really want to know what the geeks did. Nowadays it’s often the people outside the IT department who have the coolest ideas about how to use new technology to do business better. I have a theory that as much, if not more, technical innovation comes via the CEO’s kids than out of the IT department.
It’s a real challenge for Technology Leaders if they are to avoid irrelevance over the next few years. And I haven’t heard anyone with the answer yet. Though, again, I have a theory that the following will help:
- The IT department must always be aware that what was right last year probably won’t be right this year
- It’s pointless trying to stem the tide of future innovation just because it is inelegant, or a potential security risk, or because you didn’t think of it