This issue is all about the processing of data – developing and implementing those all-important programs in support of mission-critical transaction processing. In particular, the languages, tools, methodologies, frameworks and much more that all play a role in creating the business logic necessary to process the data created by transactions. The two, data and processing, have been linked since the very first computer was built, and even today there are those in the NonStop community who can recall being recruited to work in “data processing.” Of course, in the intervening years so much has changed that, in order to remain competitive, businesses long ago abandoned five- and even three-year planning cycles, to the point where development and testing of business logic is now continuous.
In this issue of The Connection you will likely have read many articles focused on the theme of Application and Design Techniques in the NonStop world. What is important now is the shift of focus, once again, to a distributed model – no longer mainframes and minicomputers or even client/server computing, but rather the edge and the core/cloud. It was Peter Levine, a partner at Andreessen Horowitz, who proclaimed that with a “return to the edge,” we would see “the end of cloud computing.” Widely promoted at the time, Levine explained his observation by highlighting that, “As machine learning proliferates, the current model of cloud computing will become too slow (and) computation will move to the edge … In such an edge computing model, your device could federate a complex request out to other nearby devices that have spare processing capacity, pay for those compute cycles using Bitcoin, and receive a response without any request to a centralized cloud server. The cloud servers would still be around, but they would be responsible for doing offline computation across large data sets.”
Not so fast, responded Gartner, which firmly believes in the duopoly of cloud and edge. In its analysis (updated in January 2019), The Edge Completes the Cloud: A Gartner Trend Insight Report, Gartner points out that “Edge computing delivers the decentralized complement to today’s hyperscale cloud and legacy data centers. To maximize application potential and user experience, (enterprise architects) and technology innovation leaders need to plan distributed computing solutions along a continuum from the core to the edge.” In other words, while it will likely be true that compute power implemented at the edge – hence, the “intelligent edge” – will become powerful enough to support processing returning to the edge, that doesn’t discount in any way the presence of private clouds within traditional data centers together with public cloud offerings sourced from anywhere in the world.
All of which is to say, it’s a distributed world once again, and when it comes to application and design techniques, everyone involved in the process, from architects to coders, will need a distributed processing mindset. What is processed at the edge versus what is processed at the core isn’t a case of selecting one language, one database management system, one monitoring and management tool and, yes, one operating system and supporting utilities. While I am not actively promoting a return to best-of-breed and the integration of point products, to remain competitive there will indeed be a need for some rationalization. And where will NonStop fit into all of this?
We have come this far without any mention of the trend towards a DevOps model – a trend that has very strong advocates in the NonStop community. If you haven’t been following recent articles from Nexbridge Inc. Managing Director, Randall Becker, then you will have missed all the work being done in support of GitHub. I will leave any discussion of GitHub, BitBucket and GitLab – and yes, perhaps more relevant, NSGit, “you know, the git front-end for GUARDIAN” – to Randall, as he is likely to have contributed something on this topic in this issue of The Connection. What I will say is that so much of what we develop today in support of this return to distributed computing, following a DevOps model, is open source, which makes NSGit, with all of its controls, a big deal in this evolution of creating business logic. It’s important because, ultimately, the business logic being developed may end up out on an intelligent edge computer or back in the digital core on a cloud. It just doesn’t (and shouldn’t) matter anymore.
Predictions can always come horribly undone given the appearance of something attracting little initial interest. However, as I look at where the future of NonStop lies, there is more than a 50:50 chance that NonStop will be bringing its fault tolerant capabilities to the intelligent edge. In the online, real-time transaction processing world where NonStop has continued to play an important role, it’s almost a given that businesses will push NonStop deployments, virtualized of course, to the edge to improve response times as well as support scale-out as and when needed, and that’s a good thing. Transaction processing works best when the processing is close to the transaction – developers take note. You may not know at first where the business logic will live, but aren’t you pleased that NonStop can now scale down to a very small system on the edge just as it can scale out to being among the biggest applications running in a cloud at the core?
One of the biggest advantages of living through many generations of computing is that there are very few surprises in store for us – yes, we have seen it all. As the ancient text in the Book of Ecclesiastes observed, “The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun.” And with that observation, I will continue with my predictions concerning the edge as, after all, it’s about time we start calling NonStop systems “NonStop edge systems” – wouldn’t that be cool!