Gartner Catalyst 2018: Realizing the Vision
by Gordon Van Huizen
Last week in San Diego I had the pleasure of attending Gartner’s Catalyst 2018 conference. And what a week it was! For those who may be unfamiliar with the event, it’s a gathering of technology practitioners hosted, presented and facilitated by analysts of the Gartner for Technology Professionals team. It’s a conference where the rubber hits the road. It’s one of the places where technology visions are brought down to earth and discussed in terms of methodology, design and implementation. This stands to reason, as a large segment of attendees could be categorized as architects, including application and solution architects, along with senior practitioners and technology leaders in a variety of disciplines, from data scientists to those with I&O responsibility. They are the people who focus on translating vision and goals into reality.
One of the things I focus on at such a gathering is where we are, from a software industry perspective, with respect to embracing new technologies and marrying them with solutions delivery. I’m of course keenly interested in technology advancements, of which there was much discussion. But I’m even more interested in how practitioners are leveraging the combination of architectural and methodology advancements to deliver value. In other words, how real and materially beneficial the advancements we all envision are becoming. The takeaway from this year’s Catalyst is that the things we’ve been talking about and exploring over the past decade are ready for prime time. And the path to successful execution is becoming increasingly clear. If 2017 was the year it all fit together, 2018 is the year the journey map emerged.
The Trinity of Microservices, CI/CD and DevOps Comes Together
The concepts of continuous integration, continuous delivery and establishing ongoing collaboration between development and ops have of course been around for quite some time. Some IT organizations have begun adopting them—at various rates and in various degrees—often to support a shift toward an Agile methodology. And then something interesting happened: over the past couple of years, forward-thinking IT practitioners began to explore the idea that the architectural principles of microservices could be of benefit to their organizations and their efforts.
Few were thinking of adopting the high-scale, extreme velocity of (the frequently cited) Netflix or other high-end Web properties. Rather, they realized how the architectural principles of microservices could increase the agility afforded by software systems and speed the delivery and evolution of software solutions.
The core beneficial distinction between microservices and other architectures is that microservices create autonomy: autonomy to change parts of a system without impacting other parts, and the ability to evolve (and possibly roll back) enhancements without disruption. Then came the recognition that realizing such a vision requires both having a solid CI/CD pipeline in place and embracing a DevOps mindset and execution model. At Catalyst, the focus was appropriately placed as much on people and process as on the enabling technologies, if not more. The question asked quite directly in one session was “Are you ready to organize for speed?”
In addition to discussing how to establish the technical foundation, sessions provided guidance and real-world examples of how to realign culture, roles and processes to do just that. With speed comes the need to deliver reliably, repeatedly and at scale. With freedom comes the need for accountability—a “build it to run it” mentality—as well as the need to understand the implications of a change. The ramifications are significant, and include adopting a “shift left” approach in which changes are tested for functionality, performance and security as they’re developed, not in a later “deploy and operate” phase. And of course, all of this requires automation everywhere to work.
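The “shift left” idea above can be sketched as a pipeline gate that runs the same functionality, performance and security checks on every change, rather than deferring them to a deploy-and-operate phase. This is a minimal, hypothetical sketch; the stage names and checks are illustrative placeholders, not from any specific tool discussed at the conference.

```python
# Hypothetical "shift left" pipeline gate: every change runs the same
# checks that would otherwise only happen late in a deploy phase.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Check:
    name: str
    run: Callable[[], bool]  # returns True when the change passes


def gate(checks: List[Check]) -> bool:
    """Run all checks in order, failing fast on the first failure."""
    for check in checks:
        passed = check.run()
        print(f"{check.name}: {'ok' if passed else 'FAILED'}")
        if not passed:
            return False
    return True


# Placeholder checks standing in for real unit tests, performance
# budgets and security scans in an automated pipeline.
checks = [
    Check("unit-tests", lambda: True),
    Check("performance-budget", lambda: True),
    Check("security-scan", lambda: True),
]

if __name__ == "__main__":
    print("deployable" if gate(checks) else "blocked")
```

In a real pipeline each lambda would invoke a test runner, benchmark, or scanner; the point is that the gate runs automatically on every change, which is where the “automation everywhere” requirement comes from.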
What a fast-forward from just a few years ago, when these concepts and approaches were being discussed and explored but not frequently implemented. Catalyst 2018 proved that they now are, that practices and disciplines are taking root, and that guidance and examples exist on (almost) all of the necessary fronts. And that these concerns are being thought of holistically.
Cloud is Now a Given, But Being Cloud-Native Isn’t
Every now and then one is reminded of how far we’ve come over the past several years with respect to cloud computing. As recently as 2012 or ’13, cloud was still an open question for many people. Its role, benefits and risks for software development and delivery were not broadly understood. SaaS had taken off, but not much else had. Now, cloud options exist for virtually every layer of abstraction within the software stack. Catalyst sessions, of course, addressed the range of options at all levels, with guidance around the tradeoffs.
In addition, the popular definition of “cloud-native” has shifted. Not long ago, cloud-native was more or less synonymous with architecting applications so that they could support elasticity (so that they could be easily scaled) and be operationally robust. Important things to be sure, but the implications of a cloud-native architecture are now core to IT goals more broadly. Cloud-native is no longer just about “cloudiness”, but about an ability to support the precepts discussed above.
The takeaway here, from my perspective, is three-fold:
- Cloud services and cloud-based platforms have quickly become the primary means of app delivery.
- Cloud platforms increasingly provide foundational support for the shift toward rapid, agile solution delivery.
- To support a transformation to rapid, agile delivery, application architectures should be based upon cloud-native principles whether apps are to be deployed to the cloud or not.
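One commonly cited cloud-native practice that applies whether or not an app ends up in the cloud is externalizing configuration to the environment, so the same build runs unchanged in any deployment target. A minimal sketch, with illustrative variable names and defaults of my own choosing:

```python
# Sketch of environment-based configuration, a widely cited cloud-native
# practice: settings come from the environment, not from the code, so the
# same artifact runs locally, on-premises, or in the cloud.
def load_config(env: dict) -> dict:
    """Read settings from environment variables, with local defaults."""
    return {
        "database_url": env.get("DATABASE_URL", "sqlite:///local.db"),
        "log_level": env.get("LOG_LEVEL", "INFO"),
        "port": int(env.get("PORT", "8080")),
    }


# The same code serves both cases; only the environment differs.
local = load_config({})
cloud = load_config({
    "DATABASE_URL": "postgres://db.example.com/prod",
    "PORT": "80",
})
```

In production code one would pass `os.environ` rather than a hand-built dict; the design point is that deployment differences live outside the application, which is what lets cloud-native apps move between targets without modification.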
Low-Code Hits the Mainstage…at a Technologist Conference
Perhaps the thing that took me most by surprise was the intensity of interest in low-code app development using high-productivity platforms. There were well-attended sessions that either focused on or featured the topic. A significant (and, in a good way, overwhelming) number of attendees wanted to have deep discussions about it, with many of them in the process of active exploration and evaluation.
And I’m told by analysts that they’re suddenly taking a significant number of client inquiries on the topic. Remember that this was a conference by and for technology professionals. Real technologists code, right? I mean, even most of the Gartner analysts at Catalyst can open their laptops on stage and show some code! So, why this new interest in low-code? The cynic in me might have thought that it was something being driven down from the top (executive leadership) or the side (line of business). But that assumption would have been wrong. Everyone I spoke with wanted to engage in a real conversation about low-code. Many appreciate how the fundamental precepts of abstraction and automation are brought together in low-code development. And many others realize that not everybody involved in app development initiatives needs to be a hardcore developer. In fact, one could easily argue that it’s beneficial to have some individuals that aren’t.
The reality is that the reasons so often cited for the adoption of low-code (broadening the talent pool, bringing business expertise directly into the effort, automating routine development tasks, supporting rapid ideation and experimentation, etc.) are ringing true…with technology professionals. We’ve been experiencing ever-broadening market adoption, but seeing this degree of resonance at a technology-focused conference was really something to behold.
What a Week
And that’s just scratching the surface. The conference went deep on each of these topics and explored many more. It was nothing short of a feast for technology professionals, with hours and hours of thought-provoking discussion. And the need for a healthy low-tech weekend following!