What Open Source DevOps means for the future of Enterprise Infrastructure


A change, they say, is as good as a holiday. That might have been true in simpler times, but with change being the overarching constant in the IT world today, it very rarely seems like that. Change in today’s IT world is not only ever-present, it’s something that is essential to get to grips with if you want any hope of surviving – let alone excelling – in your respective field. One of the most significant transformations happening in the IT world today is the increasing shift away from on-premise infrastructure management to hybrid and cloud solutions. While it’s by no means unequivocal among IT professionals that on-premise enterprise infrastructure is in its twilight years, it’s hard to argue with the facts: cloud and hybrid infrastructure solutions are disrupting traditional infrastructure models in enterprises. In this blog, we’ll take a look at the contributing factors to this fundamental shift, including the role of open source DevOps and the increasingly common use of virtualisation.

Where are we on a timeline of the on- versus off-premise infrastructure debate?

Although the shift away from traditional infrastructures and the increasing feasibility of cloud architecture have been a long time coming, we've only now reached the tipping point of enterprise adoption. Until just two years ago, C-level executives were voting almost unanimously against the cloud's ability to replace on-premise applications, their main reason being the security and stability benefits of in-house infrastructure. With high-profile hacks seemingly escalating in regularity and intensity each year, it's understandable that security is a primary concern. However, no business today operates in isolation from the internet, and it's an unfortunate truth that until we have a true solution for locking down online security, some element of risk to sensitive data is unavoidable whether it is stored on-premise or in the cloud. The question of stability has also been addressed as virtual architectures and Infrastructure as a Service (IaaS) technology have matured alongside increasing broadband capabilities around the world. The levels of stability that can be achieved at scale and across multiple geographies are far beyond what was economically achievable with an on-premise model.

Granular virtualisation is breaking down the barriers between on- and off-premise infrastructure.

Before the turn of the millennium, virtualisation was something that only massive data centres were likely to have anything to do with, but after VMware released Workstation and later ESX, virtualisation became feasible for commercial and personal use. In 2006, the world's largest bookseller entered the cloud market with Amazon Web Services (AWS). Just ten years later, AWS is a $7 billion business servicing 5 million customers in the UK alone. A key enabler for this explosive growth of virtualisation and cloud has been infrastructure automation. Multiplying the size of your server estate multiplies the overhead of configuration management, and the ability to provision and de-provision quickly and consistently is critical to ensuring that cost is not also multiplied unnecessarily. Today, virtualisation technology has matured to the extent that it's possible to virtualise not only whole systems but also granular processes; in fact, two of the hottest trends in technology are microservices and containerisation. It remains to be seen whether containerisation will ultimately replace virtualisation, but there is a clear drive towards more granular application services and the infrastructure to support them. Infrastructure automation has made it possible to provide businesses with services that would otherwise be astronomically expensive in hardware terms. Combine infrastructure as code with the surge in popularity of Software Defined Networking (SDN) and the enterprise architecture of the near future will look quite different from that of today: decoupling network control from the hardware layer not only means less reliance on in-house hardware and fewer constraints on physical space, it also gives IT professionals an unprecedented level of control over their environment. Ultimately, this enables organisations to deliver faster without infrastructure bottlenecking the process.
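
To make the provision and de-provision point concrete, here is a minimal infrastructure-as-code sketch in Python using the open source boto3 library against AWS EC2. It assumes credentials are already configured, and the AMI ID, instance type and tag values are placeholder assumptions for illustration rather than recommendations; a real estate would typically drive calls like these through a tool such as Terraform, Ansible or CloudFormation.

```python
# Minimal infrastructure-as-code sketch: provision and de-provision a single
# EC2 instance programmatically. Assumes AWS credentials are configured and
# that the AMI ID below is replaced with one valid in your region.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")


def provision(ami_id="ami-0123456789abcdef0", instance_type="t2.micro"):
    """Launch one tagged instance and return its ID."""
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "purpose", "Value": "prototype"}],
        }],
    )
    return response["Instances"][0]["InstanceId"]


def deprovision(instance_id):
    """Terminate the instance so the cost stops when the experiment does."""
    ec2.terminate_instances(InstanceIds=[instance_id])


if __name__ == "__main__":
    instance_id = provision()
    print(f"Provisioned {instance_id}")
    # ... run the experiment, then tear the instance down again ...
    deprovision(instance_id)
```

Because the whole lifecycle is expressed as code, scaling from one instance to hundreds is a loop rather than a procurement exercise, which is precisely how automation keeps the cost of a larger estate from multiplying with it.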

Open Source DevOps tools are perfect for hacking out and experimenting with new infrastructure concepts.

DevOps is changing the face of enterprise architecture because it brings these game-changing technologies together under one roof. With open source DevOps tools, it's remarkably easy to create a completely new prototype or Minimum Viable Product (MVP) without disrupting the way things get done in your organisation: infrastructure automation lets you spin up virtualised systems that you can then tweak in whichever way you please without fear of failure. Open source DevOps software also requires little to no capital investment, so at worst you may end up wasting a few hours of staff time. Simply put, open source DevOps software lets your entire organisation come together and work out where and how your infrastructure and processes could be improved. It means you can try out new technologies or concepts as soon as you hear about them, and instantly roll back to your stable system should anything go awry. Keeping up with the latest infrastructure technology through open source DevOps allows you to keep tabs on the latest trends, while keeping enough distance to invest only in the ones that truly benefit your organisation.
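
As a sketch of that experiment-then-roll-back workflow, the example below uses the open source Docker SDK for Python to start a throwaway container, try something against it, and then destroy it, leaving the stable environment untouched. The image, port mapping and container name are arbitrary assumptions for illustration.

```python
# Spin up a disposable container to trial a new tool, then remove it without
# touching anything in the stable environment. Requires the open source
# "docker" Python package and a running Docker daemon.
import docker

client = docker.from_env()

# Start a throwaway nginx container; the image and port are example choices.
container = client.containers.run(
    "nginx:latest",
    detach=True,
    ports={"80/tcp": 8080},
    name="devops-prototype",
)

try:
    print(f"Prototype running as {container.name}; experiment freely.")
    # ... trial the new concept here ...
finally:
    # Rolling back is as simple as destroying the container.
    container.stop()
    container.remove()
```

Nothing here touches production systems, and if the idea doesn't pan out, the only cost is the few minutes it took to run the script.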

ECS Digital is a DevOps consultancy with 12 years’ experience in implementing DevOps in businesses of all kinds, all around the world. Our team has a combined wealth of knowledge on infrastructure automation and open source DevOps. To find out more about what DevOps could mean for your business, don’t hesitate to contact us.

Why learning for future innovation is an essential skill


There are few parts of our lives that haven’t been fundamentally changed by the growth of technology over the past few decades – and nobody knows this better than Information Technology (IT) professionals. In fact, if you work in IT there’s a good chance that your job didn’t even exist ten years ago. But technology isn’t only changing the IT world: it’s changing almost every facet of the way we live, work and interact.

How you approach this level of change on a daily basis can either be the catalyst for boundless innovation or a serious detriment to the success of your business. In this blog, we'll take a look at why being prepared to learn for future innovation can be the best defence against stagnation in an ever-changing market.

Learning for future innovation requires specific techniques and agility

Learning for future innovation is a very different process from learning something that already exists: learning an existing technology is more straightforward because the method you choose is already tried and tested. Learning for future innovation, by contrast, seems almost self-contradictory. While it's certainly no walk in the park, there are ways to make it easier, and at the rate that technology continues to drive our world forward, there will be an ever-increasing number of topics to cover. And, if the mounting evidence is to be believed, most of us have been taught to learn the 'wrong' way throughout our lives. For professionals who are serious about learning future technologies, it's vital to be able to adapt to a variety of working conditions, learning styles and environments in order to think outside the box and innovate more easily than the competition.

Everybody learns in their own way; no two learning styles are the same.

Every person has their preferred learning style, and what works for one person might be totally ineffective for the next. Here is a brief description of the most commonly used learning techniques:

  • Elaborative interrogation: Being able to explain why an explicitly stated fact or concept is true – in other words, repeatedly questioning the facts or pushing the concept to its limits.
  • Self-explanation: Explaining new concepts in the context of existing information, or explaining the necessary steps taken during problem solving.
  • Summarisation: Summarising information at varying lengths, to study from later.
  • Highlighting/underlining: Marking the pertinent sections of a text or piece of work to be revisited later.
  • Keyword mnemonic: Using keywords and mental imagery to remember verbal material.
  • Imagery for text: Forming a set of related mental images from text materials while reading or listening.
  • Rereading: Restudying text material again after an initial reading, often several times.
  • Practice testing: Self-testing or doing practice tests on the material that needs to be learned.
  • Distributed practice: Implementing a schedule of practice that spreads out study activities over time, with the objective of forming a long-term understanding.
  • Interleaved practice: A schedule of practice that mixes different kinds of problems, or a study programme that mixes different kinds of material within one single study session.

Having an understanding of the different learning techniques and how they differ from one another isn't only a good way to find out which works best for you, it's also a valuable tool for understanding how the other members of your team may prefer to learn. Ultimately, working as a team means that being able to translate new information into a format your colleagues can understand is as important as being able to understand it yourself.

With over 10 years' experience delivering courses, our trainers understand their audiences and are able to deliver the subject matter in ways that all attendees can digest. For more information on how to kick-start your learning journey towards future innovation, or to enquire about our DevOps consultation services, please contact us today.


The top tech trends of 2015, and what their future holds


Knowledge, undeniably, is power. As far back as we’re able to look into recorded history, superior information has been a defining factor behind the success of individuals, tribes, countries, and, in a far more general sense, the entire human race. This is particularly relevant to the information explosion we find ourselves in today – with future technologies emerging virtually overnight, organisations that can consistently capitalise on the latest technologies and use them to derive real competitive advantages will, naturally, be in a position of power. But the sheer amount of new technology that appears on a daily basis means it’s not necessarily practical to investigate every new gadget or each technological flavour of the week – and it’s a safe bet that many of the popular trends in technology won’t end up sticking around for the long-term. With that in mind, let’s take a look at some of the most promising future technologies, what they might look like in years to come, and why they’re worth learning about.

1.  The Internet of Things (IoT)


In the 2015 Gartner Hype Cycle Report for Emerging Technologies, which measures how close future technologies are to mainstream adoption, the Internet of Things has emerged as the most-hyped technology for two consecutive years. There are several good reasons for this. Firstly, it's great for showmanship: few things make it as clear that 'the future' has arrived as a virtual home assistant like Apple's HomeKit, Amazon's Echo or Google's Project Brillo. Most significantly, though, it plays into a trend in technology that Gartner calls 'digital humanism', in other words, technology that keeps people as its central focus. It's hard to deny that IoT looks like a significant turning point in the mainstream adoption of technology. It's also easy to see how IoT epitomises the concept of 'digital humanism', and the evidence for the power it has to bring real, positive change to people's lives is already well-documented.

Even though IoT is five to ten years away from maturity by Gartner's forecast, we're already seeing its influence creeping into our lives in many ways. IoT is also something of a blanket term, as it can be applied to a number of emerging technologies such as autonomous vehicles and smart homes. We've already seen the impact of the device revolution on our personal and private lives, and as more of our traditionally 'offline' world comes online, the pace of change will only accelerate. All things considered, it seems like a fairly safe bet that the Internet of Things is a future technology to keep an eye on, as it will likely bring new importance to the jobs of developers and designers in the world of the future.

2.  Machine Learning


Over the past few months, machine learning technology has been making its way into the mainstream in a variety of ways. Google's neural network 'dreams' made headlines around the internet not too long ago, and more recently, Apple is reported to be investing heavily in machine learning experts. It might be a long way from maturity, but machine learning is already making its presence felt in its foundational stages. In the coming years, especially in light of other hype cycle entrants like IoT, connected homes and Smart Advisors, an understanding of machine learning will become increasingly important, and eventually a prerequisite, for developers.

Machine learning also featured prominently in this year's hype cycle report, sitting slightly higher on the Peak of Inflated Expectations than IoT and autonomous vehicles, but with a projected time to mainstream adoption of two to five years. Machine learning is already being adopted into the mainstream to a certain extent, with examples such as Facebook's AI research and Amazon Web Services' Machine Learning service, and the technology is likely to evolve considerably in the coming years.
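
For developers wanting a feel for how low the barrier to entry already is, the sketch below trains a simple classifier with the open source scikit-learn library on its bundled handwritten-digits dataset. The choice of model and parameters is an arbitrary assumption for illustration, not a recommendation.

```python
# Minimal machine learning sketch using the open source scikit-learn library:
# train a classifier on the bundled digits dataset and report its accuracy
# on held-out data.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a small, well-known dataset that ships with the library.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=42
)

# The model and its parameters are arbitrary choices for this example.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

predictions = model.predict(X_test)
print(f"Held-out accuracy: {accuracy_score(y_test, predictions):.3f}")
```

The point is not the model itself but how little code stands between a developer and a working prototype, which is why familiarity with these libraries is becoming a baseline expectation.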

3.  Self-Service Analytics


As more businesses start to acknowledge the value of the ocean of data at our disposal, being able to manipulate and analyse that data will become increasingly important in crafting a competitive edge. Historically, this kind of data analysis could only be carried out by qualified data analysts, but with the proliferation of self-service analytics services in recent years, advanced analytical techniques are now available to people with no background or training in analytics. This doesn't necessarily mean the death of the data analyst; what it does mean is that the power of advanced data analytics is available to anyone who takes the time to familiarise themselves with the software.

In the context of IoT and machine learning, advanced analytics represent an integral part of the way that we will interact with technology in the future. Being able to understand the principles behind advanced analytics, and how to use them in a practical sense, will be a great advantage for developers and designers. There are already many powerful open source analytics tools available, and advanced analytics is likely to become increasingly important in many industries and markets.
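
As a simple illustration of how approachable self-service analysis has become, the sketch below uses the open source pandas library to answer a basic business question from a CSV export. The file name and column names ("date", "region", "revenue") are assumptions invented for the example.

```python
# Self-service analytics sketch with the open source pandas library:
# summarise revenue per region and calendar month from a CSV export.
import pandas as pd

# Load the raw export and parse the date column. The file and column names
# are placeholders for whatever your own export contains.
sales = pd.read_csv("sales_export.csv", parse_dates=["date"])

# Aggregate revenue by region and month.
monthly_revenue = (
    sales
    .assign(month=sales["date"].dt.to_period("M"))
    .groupby(["region", "month"])["revenue"]
    .sum()
    .reset_index()
)

print(monthly_revenue.head())
```

A few lines like these replace what would once have been a request queued behind a specialist team, which is exactly the shift the self-service trend describes.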

ECS Digital provides consulting and training services in DevOps and other future practices and technologies that can help transform your business into a truly future-facing organisation.

To find out more about what we offer, please don’t hesitate to contact us.
