Monthly Archives: November 2010

Are we all the same?

You take an organisation. Say 1000 people. Let’s say they are a bank. Pretty straightforward.

Take the 1000 users and consider the IT requirements. How would you do it?

Are all the staff equal? Do they all have the same applications and device requirements? Do they all need the same level of MAPS – remember – Management, Availability, Performance and Security? Would you interview them or would it be easier to give them all the same experience?

Of course, from an ease-of-IT-management point of view it would be great to give them all the same apps and devices. I think we used to do that a long time ago, before people could profile themselves and do something called personalisation.

But what if you could sort the users into groups ( yes Paul, we do that ) and identify an ‘optimal user experience’ for each? Would it be the same?

So OK, it’s a bank and security is important, but take a look at my little table below.

[Image: MAPS requirements table by user group]

Now this is the $64 Million question. Even though there are different requirements for applications, and possibly for devices and service expectations, wouldn’t it be easier to take a server, put it into the vault and run IT from there? Of course it would – but what if that server was actually 40-odd servers and suffered from ‘poor’ service management?

It is possible that different ‘platforms’ could be used to minimise costs and reduce overheads, such as a Public Cloud, a Hosted Private Cloud or even an On-Premise Private Cloud. But the beauty of something as simple ( elegant ) as the MAPS table is that to someone in the bank – let’s say the COMPLIANCE OFFICER – there is one element that ranks well above all the others. SECURITY.

All decisions flow from that, and that is why you don’t see many Banks in the Public Cloud – yet. Of course, this little exercise works a lot better if you aren’t a Bank. Think it through.
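If it helps to see the idea written down, here is a tiny sketch of how such a MAPS table might be captured as data. The user groups and the scores below are made up purely for illustration – the only point is that a hard Security threshold trumps the other three letters for our COMPLIANCE OFFICER.

```python
# A tiny sketch of a MAPS table captured as data.
# The groups and the 1-5 scores are made up purely for illustration.
groups = {
    "Branch cashier":    {"Management": 4, "Availability": 5, "Performance": 3, "Security": 5},
    "Trader":            {"Management": 3, "Availability": 5, "Performance": 5, "Security": 5},
    "Back-office admin": {"Management": 4, "Availability": 3, "Performance": 2, "Security": 4},
}

# The compliance officer's rule: Security outranks everything else.
def acceptable_to_compliance(profile, minimum_security=5):
    return profile["Security"] >= minimum_security

for name, profile in groups.items():
    verdict = "fine" if acceptable_to_compliance(profile) else "needs review"
    print(f"{name}: Security={profile['Security']} -> {verdict}")
```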

Paul.


Missing Question

When you think about understanding what a customer needs in any selling situation, you will always come down to the simple question of “what does this person want?”.

Now if you are selling newspapers, ice-cream or Apple iPhones then the decision is pretty easy. The customer knows what he or she wants, and you just have to offer a fair price and a prompt service, and you have a happy customer :)

But in the world of selling technology, for some reason, this scenario is hugely complicated and fraught with uncertainty. Too often we work so hard but for little gain :(

In my view there is a MISSING QUESTION. Let me explain.

There is a pretty well-trodden methodology for determining whether the person opposite you – let’s say a CIO – has a need for your products and services. Well, two ways actually. The first is where the sales guy rocks up with a techie and does a demo or presents a brochure. Naturally this is awful, and I don’t recall many CIOs feeling enamoured with this approach. The second approach is where the sales person actually engages the CIO in a conversation about business need and uses that as a point to track back through a series of due diligence questions to determine whether there is, in effect, anything he can sell this guy. Call it Business Needs Analysis for want of a better phrase.

Now of course I’m teaching you to suck eggs! Ugh… can you imagine doing that :(

So let’s draw this out to make it very clear.

[Image: diagram linking the Business Driver to the Business Imperative, read right to left]

So, reading from right to left, the Business Imperative from the Board, Shareholders or Legislative Body is to Save Money. The Business Driver is therefore to Reduce the Cost of Running the IT Estate by 20% in 2011. Great. But what is the real pressure behind the Business Imperative to Save Money? Is it just financial prudence or is there something else?

In most cases there is something else. Emotion and IT are great bedfellows. IT is perhaps the most emotional subject in any organisation, large or small. Mention IT in the corridor and you will make people’s blood pressure go sky high.

So the missing question is simple.

WHAT IS THE EMOTIONAL FORCE behind the Business Imperative?

 

[Image: the same diagram extended with the Emotional Force behind the Business Imperative]

Now I have hugely oversimplified the process and there are many formal models for doing this. I quite like the model developed by the Cranfield Business School for example, but I think I have made the point.

The problem is that, for many people answering the question, there are grey areas. They might not actually know how important the business driver is. Perhaps they are not a board member, or they don’t play golf with the CEO. They may not be the CIO but the IT Manager, or they may simply not understand the question. In many cases there may not yet be a defined Business Driver. All of this is vital to the sales guy, because if he walks away with the thought that he just needs to produce a proposal to help the customer ‘save money’ then it’s game over for his forecast.

Asking the Emotional Force question can often shake the tree and reveal all sorts of sub-plots and nuances that help the sales guy help the customer decide on the Business Driver, and then on the best solution and service to meet both the Business and the Emotional Driver.

Paul.


Choosing IT projects that matter the most

[Image: quadrant grid plotting business impact against impact on the IT service]

Quite often you find yourself trying to make sense of which IT projects to work on and which ones not to. It should be blatantly obvious, but if you are a sales guy or a consultant, understanding where your ‘mission critical’ proposal sits is often fraught with disappointment when the deal doesn’t close.

Whilst you may say this is appallingly obvious, it is amazing how many people don’t even have a basic way of deciding on the value of an IT project. These people typically sign off a project without going through any due diligence.

So I drew the little picture above. Now the horizontal and vertical axes may be described differently by different people, but the truth of the matter is that any smart CIO or CFO will expect to build a picture like this in their head or on paper. They will only want to sign off the projects that matter the most: those projects that deliver the most business impact and the most impact on the IT service. It could equally be viewed as those projects that protect the business the most or save the business the most money. The key concept is that there will always be a table like this. Whether you actively push to see where your project fits, or proactively work with your customer to produce it, is up to you.

Every customer – well, conscious ones – will have a grid. The hard bit is knowing what that grid is, but as a question it isn’t a bad one to have in your bag. Of course, an unconscious customer won’t have a grid – they will just tell you they do and then make irrational buying decisions.
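To make the grid concrete, below is a rough sketch of how a CIO might place projects on it. The project names, scores and quadrant labels are hypothetical – the only point is that every project gets a position on the grid before it gets a signature.

```python
# A rough sketch of the project grid. Names and 1-10 scores are hypothetical.
projects = [
    # (project, business impact, impact on the IT service)
    ("Online banking revamp", 9, 6),
    ("Patch automation",      4, 9),
    ("Email archive refresh", 3, 8),
    ("Meeting room Wi-Fi",    2, 3),
]

def quadrant(business, service, threshold=5):
    """Place a project in one of the four boxes of the grid."""
    if business >= threshold and service >= threshold:
        return "sign off first"
    if business >= threshold:
        return "business-led"
    if service >= threshold:
        return "IT-led"
    return "ask why we are doing it at all"

for name, business, service in projects:
    print(f"{name}: {quadrant(business, service)}")
```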

 

Paul.


Making the clouds go faster

Recently I shared a flight with a seriously clever guy from Microsoft Research. He was a scientist. Yes, a scientist working for Microsoft. BTW I told him I worked for Starbucks!

The conversation got round to cloud computing, because it is a topic on which anyone can be an expert without revealing that they actually have no idea what it is about.

This chap got me talking about the issue of making clouds go faster, which was especially interesting as we were probably about 20,000 feet above Greenland at the time, ploughing through the cumulus variety.

His area of research was all about making applications run substantially faster, both across the internet and within cloud-powered data centers. He explained that the latency of the current protocols and switching environments falls well short, with slow page loads and lost connections being the root cause of end-user frustration and low levels of cloud adoption.

He told me about research Microsoft is doing that looks at new architectures and protocols to boost performance. He mentioned two things, which I scribbled on my napkin before he lost me completely and the food was served.

I learned that there are two areas of focus in this Cloud Faster initiative, called DCTCP ( Data Center TCP ) and Wide-Area TCP. Now I know enough about networking to recognise the TCP aspect and the issues he was talking about: transmission problems, packet loss, client retries, limitations with TCP sliding windows and so on.

When I got back I thought I would blog about this because it is implicit in all discussions around cloud and hosting, and it is quite interesting.

So DCTCP is a change to the algorithms that decreases an application’s latency inside a data center by decreasing queue lengths and packet loss whilst ensuring high throughput. Clever. Wide-Area TCP deals with the ‘last hop’ server, which to the layman is the last server a packet goes through before it ends up on your local machine and browser.
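I am not going to pretend I can reproduce DCTCP on a napkin, but a back-of-the-envelope sum shows why shorter switch queues matter so much. The link speed and queue depths below are illustrative assumptions, not Microsoft’s figures.

```python
# Back-of-envelope: queueing delay a packet sees at a single switch port.
# All numbers are illustrative assumptions.
LINK_RATE_BPS = 10_000_000_000   # 10 Gbps data center link
PACKET_BYTES = 1_500             # typical Ethernet MTU

def queueing_delay_us(packets_queued):
    """Time to drain the packets ahead of you, in microseconds."""
    bits_ahead = packets_queued * PACKET_BYTES * 8
    return bits_ahead / LINK_RATE_BPS * 1e6

for depth in (10, 100, 1000):
    print(f"{depth:>5} packets queued -> ~{queueing_delay_us(depth):.0f} microseconds per hop")

# Multiply by several hops per request, and many sequential requests per page,
# and long queues quickly show up as slow page loads at the browser.
```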

If you want to learn more then pop over to Microsoft Research or take a look at the video demos of what this is all about. http://research.microsoft.com/en-us/projects/cloudfaster/default.aspx

Very interesting.

 

Paul.


A few choice words

Ever thought that we use too many words and find that we never get anywhere? Is it you or the person in front of you?

Well, imagine such a person and read below. See if they apply. :)

 

[Image: a list of choice words]

 

I hope this has helped you. It does me, sometimes. Quite a lot, actually.

 

Paul.

Azure–Eye On Earth–seriously cool

If you live in Europe, as I do, then this is a really neat cloud computing initiative.

The video tells the story.

I hope you enjoy it and realise, like I did, that whilst there is a lot of hype around cloud, there are also a lot of real, world-changing deployments of cloud technology, and that Microsoft Azure is one of the very few vendors who can provide such a rich and versatile platform-as-a-service (PaaS) capability.

What I love about the Eye On Earth story is how Azure solved their scale and elasticity problem, and it struck me that the guys at the European Environment Agency are living the true characteristics of cloud computing. Well done. Thank God I am European. Ugh, did I just say that? I am British and proud!

If you want to have a look at where you live and experience the power of Azure at the same time then visit http://www.eyeonearth.eu/

Anyway off to find out what Google is up to before everyone accuses me of being a Microsoft luddite.

Paul.


Digitizing your life

It seems that Digitization is one of the big emerging trends for both individuals and corporates. Two snippets of evidence, plus the usual excitement around putting all things digital into the cloud.

Firstly, an initiative called MyLifeBits, a Microsoft Research project that has developed software designed to make annotation easy, including gang annotation on right click, voice annotation, and web browser integration. It includes tools to record web pages, IM transcripts, radio and television. Put simply, it allows you to digitize your daily life up into the cloud.

Alongside Microsoft Research one Gordon Bell has captured a lifetime’s worth of articles, books, cards, CDs, letters, memos, papers, photos, pictures, presentations, home movies, videotaped lectures, and voice recordings and stored them digitally. He is now paperless, and is beginning to capture phone calls, IM transcripts, television, and radio.

Read his book – Total Recall – quite fascinating.

[Image: Total Recall book cover]

Secondly, I noticed some of the Gartner CIO questions for 2011 aimed at large Enterprise corporate customers discussing Digitization. Here is an example of one of their questions.

[Image: one of the Gartner CIO questions on digitization]

So it made me wonder:

1. How much information about me as an individual is digitized today, and where is it?

2. How much of that information is manipulated, searched or amended by me, versus that held, manipulated and searched by others, e.g. Government, employer, colleagues?

3. How long would it take ( and would it be worthwhile ) to digitize my life?

On reflection, Point 1 is difficult, but I hazard a guess at about 1TB of data held in various locations, both in the cloud and on devices and hard drives.

Point 2 is impossible, although I remember EMC producing a paper around this topic in which they estimated that more was held about us by others. So on that basis, say another 1TB+.

Point 3 is probably impossible as well, but the MyLifeBits example, the approach they took and the ease of actually digitizing the everyday information you process do make it seem feasible that over the next 20 years or so this will be a reality. Oh, and for me it would take approximately that long – 20 years. Why? All those Birmingham City FC programmes will take ages to scan in!!!
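For what it is worth, here is the kind of back-of-the-envelope arithmetic behind that 20-year quip. Every number below is a guess.

```python
# Guesswork arithmetic for digitising a pile of paper, e.g. football programmes.
# All figures are made up for illustration.
programmes = 2_000          # a lifetime of Birmingham City FC programmes, say
pages_each = 40
seconds_per_page = 30       # flatbed scanning, cropping, naming the file
minutes_per_week = 60       # how long I would realistically spend on it

total_hours = programmes * pages_each * seconds_per_page / 3600
weeks = total_hours * 60 / minutes_per_week
print(f"~{total_hours:.0f} hours of scanning, or about {weeks / 52:.0f} years at an hour a week")
```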

Paul.


Go together with a partner

If you want to go quickly, go alone;

if you want to go far, go together.

 

Take this phrase and think about a customer, a consultant or anyone you work with. Is this someone who could benefit from working as a team?

Take a customer. They have an IT team and talk about its capability, skills and in-house knowledge. They have an impressive list of projects to complete and, no doubt, will have no need for any external help. They are excited that they can achieve so much and have very impressive project plans and goals.

Then you meet other customers who know they need help. These people may have similar confidence in their own ability, but realise they do need help from a partner they can trust. Why? Well, they realise that running and developing an IT infrastructure isn’t a game of speed. It is about delivering service not just today or tomorrow but forever. It is about optimising what they have and making sensible use of their budget. These guys need someone they can call in to help them with their IT decision-making, and to help them distil that knowledge into their own people.

The hard bit for these customers, though, is distinguishing between partners and resellers. A partner is someone who puts skin in the game; a partner who can demonstrate their endurance and commitment. This may be shown in value for money or in agility in responding to requests. It can also show in how they advise and consult.

And one of the biggest things they can show is how far they go together with their own partners. You see, having partners you can trust is important not just to a customer. We all need to go together if we want to go far. So if I were a customer, I would be looking not just at the company in front of me, but also at their choice of partners to support them.

 

Paul.

Who spoke about Cloud–first?

Have you ever wondered who it was that first said the word Cloud in relation to this paradigm shift that everyone in our industry talks about every day? I got to thinking that it would be neat to see if you could track it back to an individual and not an organization like Sun, Oracle, Microsoft or IBM. I also didn’t want to find someone who alluded to cloud but didn’t actually say cloud. I just want to find the person or persons we can thank for all that we talk about incessantly.

So I thought I would do some investigative work to see what I could find. I mean, after all, we are all assuming someone did actually say the word Cloud first, before anyone else? I limited myself to a browser and the following question – Who invented cloud computing? – and spent about 10 minutes.

So here we go.

According to WikiAnswers.com there is no sole inventor: cloud computing is a business model that utilizes many existing modern models, such as the internet, which in addition have no sole, proprietary inventor. Cloud computing is considered to be an internet-based computing model. However, the ideology of outsourcing computer hardware has existed since the 1960s, with John McCarthy theorizing about an eventual computing outsourcing model. So the concept of cloud may be attributed to Mr McCarthy, but because he didn’t actually use the word cloud he has to be ruled out. Sorry John.

Now a stronger contender, according to Wikipedia ( and other resources ), is a gentleman by the name of Ramnath Chellappa, who in a 1997 lecture defined the term cloud as a computing paradigm where the boundaries of computing will be determined by economic rationale rather than technical limits.

Sadly, Mr Chellappa was the only attributable individual I could find. However, I did find links to suggestions that Western Union, back in 1965, published a report that spoke of an ambitious plan to create “a nationwide information utility, which will enable subscribers to obtain, economically, efficiently, immediately, the required information flow to facilitate the conduct of business and other affairs.” But they didn’t explicitly mention the word cloud, so they can’t be included.

I then read something that said it was Eastern European criminal technologists, whose original private “clouds” were supposedly the first true cloud computing infrastructures seen in the wild. Even way back then, the claim goes, the criminal syndicates had developed “service oriented architectures” and federated ID systems, including advanced encryption. But I couldn’t find any recognition of an individual criminal, so I ruled this conspiracy theory out too.

But at the end of my ten minutes this was as far as I got, so my vote is Mr Chellappa, as he seems the most credible and referenced. Any other contenders?

Paul.


Be careful there is a Moose in the Hoose

MOOSE stands for Maintain and Operate the Organisation, Systems and Equipment ( according to Forrester ).

The concept is that for many organisations benchmarking the cost of IT is near impossible because of the complexities and fluctuations of IT spend, further complicated by cross charging, CAPEX, OPEX and so on.

So the MOOSE idea goes something like this. For several years, Forrester has been recommending that companies focus their benchmarking efforts primarily on the IT MOOSE portion of their IT budget, rather than the total IT budget.  Why? It lacks the yearly variation of the new project portfolio, which will rise or fall as strategic business priorities change and cash flow dictates.

Why is this important? Well, more and more people ( especially CFOs seeking value for money ) are looking at benchmarking as a way to help make budget decisions and to help identify and manage cost reduction initiatives. Doing this on new IT project spend is hard work. Each organisation will have a different approach to technological innovation and will not want to follow the pack too often. They will see this as their edge or USP. However, in the land of delivering the core IT service, there should be similarities within particular sectors that make for useful comparisons.

Delivering PCs, servers, backup and so on is such a commodity service that many CFOs will believe benchmarking is a totally valid exercise. Personally I share some of this view, but what CFOs can’t measure is the capability of the people, the strength of IT processes and the skill of the management of the IT function. However, trying to argue softer reasons to a CFO faced with a MOOSE budget that is significantly higher than the industry average is a tough deal.
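As a toy illustration of what a CFO is likely to do with the numbers, the sketch below splits a made-up IT budget into MOOSE and new-project spend and compares the MOOSE share against a hypothetical sector benchmark. All of the figures are invented.

```python
# Toy MOOSE benchmark comparison; all figures are invented for illustration.
it_budget = {
    "moose": 7_200_000,        # run-the-business: maintain and operate
    "new_projects": 2_800_000, # change-the-business: new initiatives
}
sector_moose_share = 0.66      # hypothetical peer-group average

total = sum(it_budget.values())
moose_share = it_budget["moose"] / total
gap = moose_share - sector_moose_share

print(f"MOOSE share of budget: {moose_share:.0%} (sector ~{sector_moose_share:.0%})")
if gap > 0:
    print(f"Roughly {gap * total:,.0f} above the benchmark, so expect questions from the CFO.")
else:
    print("At or below the benchmark, which makes for an easier conversation.")
```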

Focusing on MOOSE lets you look at tangible areas of delivering IT where you can make a difference, e.g. break-fix maintenance, sweating old PCs, storage, DR and so on. Furthermore, MOOSE is something you have to do, so it is not an optional expenditure no matter what emotional statements a CFO or CEO may make. You know what I mean – “that’s it, no more spending on IT”.

So in summary, making sure you are looking at the MOOSE in your HOOSE is a big deal, and looking at how this benchmarks in your sector may not be a bad thing, especially if you have a CFO who expects you to reduce the overall cost of IT. It may help you protect budget for the innovative IT projects you have in mind, or if you are a sales person selling MOOSE services it may help you protect your contracts.

Paul.
