Be #Hybrid… #IoT will eat up the cloud

I’ve been surfing the HPE wave since 2015. And wave is a small word; call it a tsunami! HPE has deployed all its strengths to sharpen its focus. It’s no exaggeration to say that HPE has completely transformed itself and is looking forward. Think about it: #Aruba, #SimpliVity, #NimbleStorage, #OneView, #Synergy, #Niara, #CloudCruiser and so much more.

If that’s not being hybrid-ready and a cloud lover, I don’t know what is 🙂

That’s not to say cloud isn’t a destination; it has its place and fits well in many lines of business, but it can’t be the sole strategy of an agile, fast-growing IT journey.

Think differently before it’s too late

The biggest challenges we all face are growth and data efficiency. It would be blind not to admit that everything we do is about increasing our organizations’ profitability by bringing innovative solutions to business challenges. I recently came across a Gartner article (http://blogs.gartner.com/thomas_bittman/2017/03/06/the-edge-will-eat-the-cloud/) that is revealing at a time when all we hear about is the cloud. But is the cloud, attractive as it is, the end of the road? Hard to say, when applications keep growing, data keeps flowing in, and IoT keeps adding pressure. Tomorrow belongs to the fast; will cloud alone be the answer?

Shrinking

So if cloud is not the answer to everything, the right mix must take its place: a mix where data lives on prem and can flow to the cloud and back. With the size of today’s data sets, that is not an easy task; this is where data efficiency comes in.

Building the right base, the right foundation, is key. Can a 60TB dataset be sent to the cloud and back? No, you would say. Not necessarily, I would argue. We have all heard about compression and deduplication; they come up in every conversation. Why would I keep a large block of 0’s and 1’s when that exact same block is already present in my dataset? Better yet, as the cloud ingests data, why ingest the same 1’s and 0’s over and over again without any intelligence? That’s not working intelligently at the edge; that’s throwing hardware at the problem.
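To make that concrete, here is a minimal sketch of block-level deduplication, assuming fixed-size 4 KiB blocks and SHA-256 fingerprints; the names and sizes are illustrative, not any vendor’s implementation. Blocks whose fingerprint has already been seen are referenced rather than re-sent, which is exactly why the same 0’s and 1’s never need to travel twice.

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative fixed-size 4 KiB blocks


def dedupe_blocks(data, seen):
    """Split data into blocks and return (fingerprints, unique blocks to send).

    Only blocks whose SHA-256 fingerprint has not been seen before need to
    travel to the cloud; repeats are referenced by fingerprint alone.
    """
    fingerprints, to_send = [], []
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        fp = hashlib.sha256(block).hexdigest()
        fingerprints.append(fp)
        if fp not in seen:          # first time we see this content
            seen.add(fp)
            to_send.append(block)   # ship the bytes once
    return fingerprints, to_send


# Example: a dataset full of repeated zero blocks shrinks dramatically.
seen = set()
dataset = bytes(BLOCK_SIZE) * 1000 + b"unique payload"
fps, payload = dedupe_blocks(dataset, seen)
print(len(fps), "blocks referenced,", len(payload), "blocks actually sent")
```

Real products typically use variable-size chunks and keep the fingerprint index in memory or on flash, but the principle is the same: identify identical content once, then reference it everywhere else.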

Yes, what I’m talking about is called metadata: a set of data that describes and gives information about other data, a.k.a. data about data. Could that be the solution, then? If metadata comes in, what is needed to execute on that data and those algorithms?
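Continuing the hypothetical sketch above (names are illustrative only), the metadata can be as simple as an ordered list of block fingerprints: data about the data, small enough to move around freely, yet sufficient to rebuild the original bytes from a fingerprint-indexed block store.

```python
import hashlib


def store_blocks(data, store, block_size=4096):
    """Write blocks into a fingerprint-indexed store; return the metadata
    (an ordered list of fingerprints) that fully describes the data."""
    recipe = []
    for offset in range(0, len(data), block_size):
        block = data[offset:offset + block_size]
        fp = hashlib.sha256(block).hexdigest()
        store.setdefault(fp, block)   # identical blocks are stored once
        recipe.append(fp)             # metadata: data about data
    return recipe


def rebuild(recipe, store):
    """Rebuild the original bytes from metadata plus the block store."""
    return b"".join(store[fp] for fp in recipe)


store = {}
original = bytes(4096) * 3 + b"edge data"
recipe = store_blocks(original, store)
assert rebuild(recipe, store) == original
print(len(recipe), "fingerprints describe", len(original), "bytes;",
      len(store), "unique blocks kept")
```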

Before a house comes the foundation

If we expect to build such a big house of data, the foundation is key; laying a solid base that will sustain growth and the ”weight” of data is fundamental. Whether we think about IoT, ERP or any other need our organizations might have, data size is crucial for ”tomorrow”. The idea of a very small subset of data representing the largest workload is attractive, and while it has lived in storage solutions for a long time, it has now reached us, the software-defined architects of hyperconverged solutions. Software defined, yes, but hardware accelerated, please (https://www.simplivity.com/wp-content/uploads/DeepStorage-SimpliVity-Data-Protection.pdf). Why hardware accelerated? That’s a whole new conversation I’ll be sharing in the next post, but keep in mind that nothing, NOTHING, is faster than hardware acceleration for computing the most demanding data. Teaming up with an edge virtualization pioneer is definitely the most secure path through the convergence of the datacenter we are witnessing every day.

In the end, isn’t it about minimizing risk and working with top solutions? I believe so, and I wouldn’t put any of my customers at risk by not considering solid and reliable solutions.

Conclusion

When building tomorrow’s datacenter, data efficiency should be at the heart of the final decision. IoT and the other applications coming into our datacenters are eating up our options, and throwing more hardware at the problem is not the solution. Think smart, look out for innovative directions, and keep up with the manufacturers innovating day in and day out for a better tomorrow.


