Monthly Archives: September 2013

January 1st 2000 and April 8th 2014 – one date memorable, the other pivotal


One date prompted predictions of dramatic scenarios: perceived issues with the ticking internal clocks inside millions and millions of computers worldwide that, if not fixed, would lead to planes falling out of the sky, a global collapse of financial markets and anarchy across the technology world. The other predicts a perhaps less dramatic social impact than Y2K, but one that will lead to business disruption over time for one irreversible reason – staying on Windows XP signifies an acceptance that the business has fallen behind the pace of change, which in turn signifies something else – the marginalization of the IT department.

Strong words, so I will do my best to back up my rhetoric.

Y2K will have been some 14 years ago when the clock of expiry turns on the most popular operating system known to mankind – Windows XP. I am told that some 30% of large organizations have still not started, let alone completed, the migration. It wouldn’t surprise me if that were true.

Analysts galore have spent countless hours trying to predict the impact of April 8th 2014 and put forward compelling reasons why doing nothing is not an option. But as we know from Y2K, doing nothing was an option – as we all learned on Jan 1st 2000, when the planes didn’t fall from the sky, financial markets still traded, and the general perception was that a lot of money had been spent for nothing.

In 1999, playing chicken with the impending doom was something that large organizations either considered and discounted, or kept quiet about as the clock struck midnight. After all, no one wanted to be found out for not patching their stuff, especially when practically the whole world sat up all night waiting for Armageddon.

In 2014, whilst I can imagine there will be senior stakeholders out there with that slightly nagging feeling that playing chicken might work again, the impact is not going to be the same as Y2K. No one is likely to be cited in the national press for not migrating their infrastructure (yes – infrastructure) to the next supported platform. Of course word will leak out, and the computing press will go to work building a list of those shamed by staying put.

And why is playing chicken an option? Because the true cost of fixing the Windows XP problem is not attractive. For the first time ever, I believe the true cost and impact of changing the status quo is far reaching and not for the faint-hearted. And why is this so? Well, on one level it is probably just a hardware replacement programme with a newer operating system, but on another, all-important level, the decision to make the change rolls up into something so much more important.

This change? An evolutionary opportunity for the IT organization to catch up with the pace of change being dictated by the business. Yes, pace of change, because the decision to remain static on the business workplace – I repeat, business workplace – has created such a gulf between IT and the business that the shadow it casts has grown dangerously. Strong words, I know, but the Windows XP experience defines and frames the world of work, and it is so outdated that it depicts a complete disconnect from the business thought process.

Let me try to explain in three ways.

  1. We all know what the IT shadow is. It is that area of business operations that relies on IT services and productivity sourced outside the ‘recommended’ IT service catalog. Driven by employees exercising choice over devices and where they access them, the shadow started the marginalization of the IT organization. And it is continuing unabated.
  2. We all understand speed. The speed of change for a business has now far outpaced the speed of change available from the IT organization. Gone are the days when IT dictated the pace. That went south about 10 years ago and is unlikely to return unless IT moves lock, stock and barrel into the cloud. Unlikely, unreasonable and unrealistic.
  3. We all understand productivity. The concept of being ‘productive at work’ has turned 180 degrees as technology has enabled this everywhere-and-anywhere characteristic. The modern world of work now selects the most appropriate technology available during the working day. Whether it be applications, devices or outcomes, the average employee is now relaxed with choice in how they collaborate, communicate and participate.

Or, more simply put: the decision to replace Windows XP is causing significant headaches for the IT organization. Blast. It shouldn’t, but it is! It is because all the cookbook best practices used in the past may work for the ‘technical’ aspects of change – applications, data, migration, infrastructure and so on – but it is the ‘productivity’ aspects of change that IT is really struggling with. Yes, I know it costs a lot of money. Significant sums, in fact, but the ‘cost’ of understanding what the business needs is where the pain actually is.

And it is simple to see why. IT hasn’t really got stuck into the business discussions as the business drove towards different ways of working. Sure, IT has been out there getting stuck in, understanding specific projects or infrastructure needs. Sure, IT has been engaging more and more in business case discussions, working hard to make sure it understood the desired-state expectations for a project, but the ‘design of the new workplace’ is new. The skills simply aren’t there. The business appears to know exactly what it needs, but IT is falling behind. Whether it is BYOD, mobile working, cloud services, social networking or environmental change to buildings and travel, the business case goes way past the bits-and-bytes differences that IT is used to.

The next 5 years are going to be massive.

  • The young employees will be the mature employees.
  • The new entrants will be technologically independent.
  • Older employees will have either buried their heads in the sand or seen the light (or retired).
  • Technology will be so pervasive in every walk of life that the concept of the World of Work will be implicit in what we do both at work and at play.
  • Digitization will outpace the operating system decisions we are talking about today.
  • IT organizations will still be in situ (or have been left with the Run function) while others make world-of-work decisions.

Don’t believe me? Let’s make it 10 years to be on the safe side. No, let’s make it 14 years, because the period between 2014 and 2028 will no doubt witness so much more change than the Y2K to 2014 period.



Big data? When small data is often big enough

If you are English (or is it British?) there is a saying that if you look after the pennies, the pounds will look after themselves. Against this context, Big Data is the pound coin; Small Data is the penny.


Big Data is hot news. It appears regularly in TV programmes and films as the next big technological challenge facing the world, as the public get introduced to the reality that all the cameras dotted around cities, car parks and streets, the street furniture they walk past and the various government agencies are collecting data on them. Huge amounts of data.

Perhaps many people in the street haven’t realised that all this information gets stored somewhere, but it’s a slowly dawning truth, and Big Data is becoming an everyday phrase. A bit like Cloud Computing. As was Y2K. For the guy in the street these words were (and are) just words joined together, and the depth of knowledge behind the ‘tech’ is shallow. Of course it would be. Understanding the details and complexity is our job. Of course.

Business stakeholders sit there struggling with questions like “Do I have a big data problem?”, “Do I need one?”, “When will I know I have Big Data?” and “Who can help me?”. Technologists sit there with fantastically intriguing words like “association rule learning, data fusion, genetic algorithms, crowdsourcing, parallelism, anomaly detection, sentiment analysis, time series analysis and visualisation”.
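To make one of those intriguing terms slightly less abstract, here is a minimal sketch of anomaly detection. Everything in it – the function name, the threshold and the login counts – is invented for illustration, not taken from any particular product or toolkit; real pipelines do the same thing at vastly larger scale.

```python
# A toy anomaly detector: flag any reading that sits more than
# `threshold` sample standard deviations away from the mean.
from statistics import mean, stdev

def find_anomalies(readings, threshold=2.0):
    """Return the readings that look out of line with the rest."""
    mu = mean(readings)
    sigma = stdev(readings)
    return [x for x in readings if abs(x - mu) > threshold * sigma]

# Seven ordinary days and one suspicious spike (all numbers invented).
daily_logins = [102, 98, 105, 99, 101, 103, 97, 500]
print(find_anomalies(daily_logins))  # → [500]
```

Trivial, yes – but it is the same shape of question the stakeholders above are really asking: which of my millions of small readings actually matter?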

Similar to Cloud Computing, however, Big Data does run a risk of being undervalued or underappreciated because of its ‘size’. For sure there are organizations out there who undoubtedly have Big Data requirements. Governments, Security Agencies, Genetic Analysts, Meteorologists, Search Companies, Banks and so on all capture, store, analyse and share ridiculous, seemingly infinite amounts of data on all manner of subjects – and will continue to do so forever.

For an individual, the concept of Big Data is often very difficult to compute. The numbers used to describe it – words like quintillion – completely flabbergast us, as we still come to terms with the fact that we as individuals have more information stored about us digitally than content we create ourselves. All this makes no real sense to 99% of the people out there. For many, the idea of a USB stick holding the contents of their iTunes library or their family photographs is bad enough. The modern smartphone being able to record and store videos, with more storage than a laptop or PC had not that long ago, is mindboggling.

But there is also Small Data. The data we touch each day. Emails, texts, photos, documents. Some are totally random and throwaway. Others feature in a process at work. We start off with a document that goes to someone else, who does something to the document, and it goes off to some other process. Billions of small data transactions every day. I suspect the bulk of them are isolated and do not feature in any Big Data scenario, but to us they have a massive impact on our lives. The devices we use help us proliferate this small data. We possibly sync small data between devices so we can have access to favourite photos or documents. A lot of us blend our personal small data with our corporate small data. The average corporate laptop or smartphone will confirm this.

And of course, just like the pennies that become pounds, it is the thirst for knowledge and angles that pushes Big Data into mainstream conversation, because it is the gold nuggets in our small data that people are after. If we tweet something about a product or holiday, that data is mined to build a map of what is hot today, to help drive sales or product development. If we comment on a document or report, that comment is now collated to build a better picture of a partner or client engagement. When we do this, someone is using Big Data infrastructures and services to build that map.
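That pennies-into-pounds aggregation can be sketched in a few lines. The posts and keywords below are entirely made up; the point is only that counting mentions across many throwaway pieces of small data is enough to start building the trend map marketers are after.

```python
# Many individually unremarkable posts (small data), aggregated into
# a keyword trend map (the 'big data' view). All content is invented.
from collections import Counter

posts = [
    "loving my new phone, camera is great",
    "hotel wifi terrible, never again",
    "new phone battery lasts all day",
    "great beach holiday, hotel was lovely",
    "phone screen cracked on day one",
]

# Normalise each word and count only the keywords we care about;
# no single post matters, but the totals across all of them do.
keywords = {"phone", "hotel", "holiday", "camera", "battery"}
trend = Counter(
    w.strip(",.") for post in posts for w in post.split() if w.strip(",.") in keywords
)
print(trend.most_common(2))  # → [('phone', 3), ('hotel', 2)]
```

Scale this from five posts to five billion and you have exactly the infrastructure problem the Big Data people are solving.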

Against this context Big Data is the Pound coin; Small Data is the penny.

And before you argue that I have misunderstood what Big Data is, let me say this. The phrase Big Data, as two words together, is not very exciting. Two words stuck together that can only get Bigger and Biggest – just like Small Data, equally unexciting, which can only get Smaller and Smallest. Look at Cloud Computing. Two words stuck together.

Our industry loves doing this with words, doesn’t it? They are a sort of oxymoron, making no real sense or contradicting each other. But they are useful for us when we apply the Why question. Why does having Big Data matter? Ask a scientist at NASA. Ask a medical professional trying to cure disease. Ask military planners trying to save lives in combat. Ask city planners trying to build better and safer environments for us to live in.

After all Cloud Computing might sound better if it were called Knowledge Grid, and Big Data might sound better if it were called Information Search.

No, I argue that it’s the Why, How and What you do with Big Data that is important. How you analyse, share and execute on the information (intelligence) you obtain from the data is what matters. All the devices we use (and will continue to use) are pointed at Big Data. The operating systems on our iPhone, Android and Windows devices are honed to connect to Big Data. God help us when they don’t have a signal! We can’t find our Big Data!

Now perhaps Big Data is for the technologists. The guys who worry about storing the information their organization thirsts for to make better decisions and drive change.  Sheer mountainous volumes of data. Complex algorithms and smart underlying storage and networking subsystems that index, slice, dice, cache and present infinite amounts of data to crawlers and bots that bring the information to our devices at high speed.

Small Data is for the Marketing, Business Development and Operational people who worry about analysing the streams of information, looking for hooks and evidence to cement the strategies they are building. People who thirst to improve data flows to help drive processes that serve an aspect of business operation. Big Data is too big for these guys. They can’t think in Big Data terms. The human brain can’t work that way.

Or perhaps Big and Small Data is the landscape for the individual user confronted with all their devices and a way of controlling where they have stored all the information important to them.

Perhaps all of these groups.

But you know what? It doesn’t matter what we call it; it’s what we do with it. No, scratch that. It’s what we learn from having it.

(“There are no shortcuts to any place worth going.” – Colin Powell)


Confusion is nothing to be feared


Read this statement.

“Confusion in IT is nothing to be feared – the most successful IT organizations are the ones that embrace confusion and continue to deliver continuous service. Those organizations cope with confusion and always come out delivering an IT service that informs their customers and delivers a higher level of satisfaction than organizations that let confusion spiral into a bewildering, disorientating and perplexing experience.”

Agree or disagree? Let’s examine the two sides of this statement.

If you agree, then you are probably thinking:

Your business is operating in an ever-changing landscape, which means that decisions to enter new markets, launch new products, develop new partnerships and speculate with new ideas are moving targets that make every week (or day) a brand new challenge. Change is the norm and must be embraced.

You believe that the services supporting this change – IT, Marketing, HR, Finance and Operations – are all willing participants in this fast-moving vehicle, and that change, both planned and unplanned, must be taken fair and square and dealt with at all costs to ensure that the ‘enhanced service’ expected by the business is maintained. Tough, but not insurmountable with the right attitude and collaboration.

Confusion is sometimes a by-product of all this change. It’s natural. It’s expected. It is what it is.

Some people step up and lead through the mass of flowing messages, conflicting data collections and ensuing potential pitfalls to make clear and concise decisions. These people do not fear change. They appreciate that part of winning is about losing, losing again and then learning from mistakes. Confusion discovers true leaders and pragmatic ‘go to’ people. Confusion creates situations where ideas flourish and where innovation can take hold and prosper. Of course, not everyone can cope with confusion in a positive frame of mind. The structure of the IT organization can often be a clue to how it will cope with confusion. Perhaps this is not a bad question to ask when considering the opportunity to drive change into and through an IT organization you know. Food for thought?

Knowing that confusion is never far from the surface plays into the hands of an experienced consultant or coach. Using the risk of confusion as a way to simplify decision making and to build on lessons learned offers up a change in thought process, and encourages stakeholders to build resilience into IT services – which in turn drives culture and communication among the people left with the challenge of managing change no matter what happens, good or bad.

If you disagree, then you are probably thinking:

Confusion is a huge risk. Confusion causes people to make mistakes, cut corners and cause system downtime. Depending on the sector you operate in, confusion can cause accidents and worse. A lack of due diligence and process, brought on by the confused state of decision makers and of where IT is supposed to be going, is often cited by IT people as an influence outside their control, and as the main reason for project slippage and dissatisfied customers.

Confusion costs money, which creates a culture where IT becomes locked down, minimising any opportunity for confusion through prescriptive and orderly products and services. Admirable, but competing forces from the business demand a middle ground, where governance, risk and compliance sit more easily alongside flexible and reactive service demands.


Allowing confusion to fester and grow is a dangerous characteristic. Focusing only on those events that are easily controlled is a cop-out. Standing up and meeting the business head on as it introduces change is the more beneficial route. For sure it is hard work, because change really is hard work. But now IT has no choice. It no longer calls all the major shots and can’t afford to be marginalised. Competing services and platforms are readily available as businesses tire of the ‘yes, but’ attitude prevailing in certain IT departments.

Strong leadership is the key.

Being surrounded by like-minded colleagues and partners who cope with confusion and reach an informed state is vital.

IT organizations need to make sure they have such people and partners available to ‘coach’ them through key decision-making milestones, both planned and unplanned. Relying entirely on internal skills offers up both benefits and risks. Too often confusion generates more confusion, and the spiral of uncertainty can bring an IT service to its knees in no time at all. But opening up the landscape to trusted advisors complements internal expertise, and a smart IT leader knows how to maximise (exploit) such contacts to ensure that the journey through good (and bad) times is manageable and positive.

The stakes are high, of course. Higher than ever before. In a totally confused state, huge amounts of expenditure can be wasted and effectively poured down the drain. Look at the recent UK government news around ‘mismanaged’ IT projects and you can marvel at how much confusion there must have been to necessitate an effective ‘stop or no-go’ being initiated. Where does it all go wrong? After all, the technology is easier to understand and deploy, and we have all learned the lessons of scope creep, project governance and portfolio management.

Of course, the real answer is that confusion and calm should be embraced equally and dealt with, because both will happen today, tomorrow and probably in 5 and 10 years’ time. Why? Because despite all the attempts to create the ‘Standard’ platforms and services that the blueprints dictate for IT infrastructure and service design, there is another immovable force that undermines any position of stability and calls for a more fluid view. Fluid in how IT services will be consumed, managed and paid for. Cultures drive choice, and IT has to promote and deliver choice. Communicating change and its true benefits can reap huge rewards, and so can communicating how technology can improve the workforce.

Remember this simple yet far-reaching statement – it’s one I often pull out when faced with a conversation about technological change.