“I have a dream that one day this… ‘industry’… will rise up and live out the true meaning of its creed. (…)” …
… and that is where data, applications, BCDR and ease of management are part of a single, unified strategy and the normal way of operating a day-to-day datacenter, allowing teams to focus on the core applications crucial to the growth of an organization. Seems too good to be true, you would say, but I am still convinced today that all the pieces are available out there, and it’s up to us to put them together and crunch the challenges we live with day after day. It is up to us to define where the information lives.
Over the years, data centers have become largely fragmented, with numerous types of proprietary software siloed inside specialized hardware components. Virtualization helped absorb and minimize this challenge but created another one, much worse to a certain extent: server and application sprawl, and explosive data growth generating management complexity.
The sprawl of uncontrolled and poorly managed applications has led to application unavailability endangering organizational profitability, and the examples are growing all around us: remember the Amazon datacenter failure…. Let’s take it the other way: why is application availability so vital in today’s world? Where should I start…?? We must agree that in order to remain competitive, organizations must build or acquire new, mission-critical applications that automate and enhance business processes.
I always like to remember an article I had the opportunity to read, which says: “As business becomes increasingly dependent on technology and information, availability is a universal concern for every business, in every industry… And globalization means there are no more periods of ‘acceptable’ downtime. At any time of the day or night, somewhere in the world, customers and vendors need access to your corporate information. If they can’t get it, they’ll go elsewhere – creating an opportunity for your competition.” (David M. Fishman, Sun Microsystems, “Application Availability: An Approach to Measurement”) I was young, and I have to admit it touched me and created a feeling of urgency; this is maybe why I am so passionate about what I do.
…. “It is a dream deeply rooted in the IT industry dream (…)”: automation and ease of management are no longer a nice-to-have, they’re a MUST.
Too many apps, too many servers, siloed, endangering the profitability of an organization, with thin overall management capabilities… certainly NOT the conversation we wish to have with a C-level executive; at least I don’t. And we’ve been trying for many years to overcome the challenge from many angles; have we succeeded? Somewhat, but when I walk into datacenters today, too often it looks like a collection of bad past IT decisions, and all I see is static data and static applications, too often siloed, preventing an organization from reaching its full potential.
We’re all magicians somehow, as we were able to sustain the growth with cost control year over year, and this success leads to my next preferred subject: Big Data! Oh yes! That one is my favorite. Softchoice just launched an engagement with a large Canadian customer where the so-called “big data” had become uncontrollable. And why? For one simple reason: the consumerization of IT. It is known that unstructured data represents an average of 80% of the entire data center, leaving 20% for structured data. Are we hoping this will stop? Get ready, friends, the world is expecting an explosion to 130 exabytes… oh yes, E.x.a.b.y.t.e.s
According to the Cisco® Visual Networking Index (VNI) Global Mobile Data Traffic Forecast for 2011 to 2016, worldwide mobile data traffic will increase 18-fold over the next five years, reaching 10.8 exabytes per month — or an annual run rate of 130 exabytes — by 2016. Hold on a minute… 2016? That’s in… 3 years??
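Those two figures hang together: the “annual run rate” is just the monthly forecast times twelve, and the 18-fold rise over five years implies a very steep compound growth rate. A quick sanity check of the arithmetic (only the 10.8 EB/month, 18-fold and five-year figures come from the forecast quoted above):

```python
# Sanity-check the Cisco VNI mobile-data figures quoted above.
monthly_eb_2016 = 10.8                 # forecast exabytes per month in 2016
annual_run_rate = monthly_eb_2016 * 12
print(round(annual_run_rate, 1))       # 129.6, which the forecast rounds to ~130 EB

# An 18-fold increase over five years works out to roughly 78% growth per year.
growth_factor = 18
years = 5
cagr = growth_factor ** (1 / years) - 1
print(f"{cagr:.0%}")                   # ~78% per year
```

Seen as a compound annual rate, “18-fold in five years” makes it clearer just how aggressive that forecast was.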
The definition of Big Data, also called the “information explosion,” is nothing more than what we already know; we just haven’t quantified it yet. This rapid increase in the amount of published information or data, and the effects of this abundance, has started to hit medium and large organizations in the back. As the amount of available data grows, the problem of managing the information becomes more difficult, which can lead to information overload. Management, you said?? Yes I did! Again…
How much do we “fear” data will grow?? Gartner Director April Adams reported in 2012 that data capacity in enterprises grows, on average, at 40 to 60 percent year over year, much more than the 30% year over year we’re used to, don’t you think??
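To see why that gap matters, it helps to let the two rates compound for a few years. A minimal sketch, assuming a hypothetical 100 TB starting footprint (the starting figure is my illustration, not from the article) and comparing the “usual” 30% against the midpoint of Gartner’s 40–60% range:

```python
# Compare how a storage footprint compounds at 30% vs. 50% year over year.
# The 100 TB starting capacity is a hypothetical figure for illustration.
start_tb = 100
years = 5
for rate in (0.30, 0.50):
    capacity = start_tb * (1 + rate) ** years  # compound growth over five years
    print(f"{rate:.0%} YoY -> {capacity:.0f} TB after {years} years")
```

At 30% the footprint roughly triples in five years; at 50% it grows more than sevenfold, which is the difference between a budget line item and a crisis.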
Unstructured and structured data are the lifeblood of all organizations, and I have seen more and more customers turning to Softchoice, struggling with the increasing amount of data stored in their datacenter and with how to manage the exponential growth of the consumerization of IT. Because storage capacity has increased while costs were going down, IT has become more lax about what, where, when, how and how much data is stored in the datacenter.
Now here’s an interesting challenge, friends: while the ability to store increasing amounts of data empowers organizations, it also presents them with the challenge of managing all of that information. Really? Management again? Smells like a need for solutions.
I like to call this activity “balancing the datacenter,” and such success is, I believe, made of a precise combination of top leading technologies that align with a strong roadmap and integrate well with business processes. Bottom line: we want something that runs on its own, is capable of adapting to a changing environment, and is easy to manage through policies, while relying on the industry’s top manufacturers.
Wanna know more?? Interested in the magic? Interested in understanding what we do differently and how we do it? Reach out, ask the question, and get prepared with the people who see it every day and are continuously looking at it. We surely don’t want to get caught in the wave of data.