The IT Director’s Perspective: Maximizing the Value of HPE NonStop

by Thomas Gloerfeld

Introduction

“You are traveling through another dimension, a dimension not only of sight and sound but of mind. A journey into a wondrous land whose boundaries are that of imagination.”

Your next stop, the datacenter.

It is full of the latest and greatest technologies. Rows and rows of servers sit surrounded by flash disk arrays and enough fiber to reach the moon and back. This is the new home for your software, the lifeblood of your company. You reflect proudly on what you have helped to create, and then doubt begins to cloud this wonderful vision. Is my software worthy of running on all of this shiny new technology? Can my software take advantage of all of these technologies? Is my software ready to run in the cloud?

Folks who run NonStop might pause while they ponder the cloud and wonder whether their software fits into this new paradigm, but upon reflection they will realize that Tandem was one of the first cloud providers. Back in 1976, Tandem envisioned applications running on independent CPUs, with access to data hidden behind a messaging system. You could add new hardware without having to change the application; if you needed more of a given resource, you simply added it. That sounds an awful lot like the definition of cloud computing, doesn't it?

Unlocking data to turn it into knowledge
If you are reading this, you must be an IT professional. Information Technology, short and to the point, is two words: Information and Technology. The technology is the 'magic' that makes the information available to our users. Most users don't care about the technology, but they definitely care about the information. Information is the lifeblood of a company; without it, the company cannot operate. There isn't a single major corporation that could survive if it suddenly lost all of its data. Data is one of the most important assets a company owns!

Providing access to our corporation's data has never been easier, either. There are plenty of tools that allow our business partners to mine company data for the precious nuggets that provide real value to the company, unless that data is still stored in Enscribe files. It is 2017, and the most recent statistic about the NonStop customer base still puts roughly 50% of the data stored on NonStop in Enscribe files. Imagine the value IT could provide to the company simply by unlocking that data. Since IT organizations exist only to provide value to their company, this should be at the top of everyone's wish list.

Enscribe is not a database; it is a set of files, most likely designed badly in order to save a few bytes on disk. There are no tools to query it (no, Enform is not a user-friendly tool), to run what-if queries, or to access it dynamically using whatever tools the user prefers. Still, for some of us, Enscribe is all we have. There are a few ways to deal with this Enscribe data:

    1. Ignore it and pray it goes away
    2. Throw it over the wall to the person in the next office
    3. Nuke it all and start from scratch
    4. Pick the files off one by one

Option 1 is what most people seem to be doing; option 2 is great if you're not the person in the next office; and option 3 never works because the risk far outweighs the reward. That leaves us with option 4, where we pick each file off one by one, sometimes referred to as the Tony Romo method. Option 4 is the best way to approach the task, as you get immediate results with very little risk to your production environment.

Data becomes knowledge
There are hundreds of examples of 'random' bytes sitting on a computer's mass storage device being turned into real, actionable information for corporations. One that comes to mind as the United States enters hurricane season is the ability to visualize a company's supply chain to determine how best to deploy its assets to help the people impacted by a hurricane.

Imagine a map of the US with all of the company's assets shown on it, along with the path of the hurricane. The company had the data, but it wasn't accessible to the people who needed it while it was locked in an Enscribe file. By moving it to a SQL database, they were able to load it into a piece of software purchased from a third party. Once loaded, it became a valuable tool.

It is relatively easy to convert Enscribe files to a modern SQL database using a tool such as comforte’s Escort SQL; this incredible software product converts Enscribe applications and files to NonStop SQL without requiring any changes to existing programs (i.e., no source code changes). This allows an organization to focus on creating new applications that provide the business with new capabilities instead of spending time rewriting existing functionality.

Instead of spending their time writing reports, developers will be able to build new functions and features. Hiring will be easier, as most people don't graduate from university knowing Enscribe and probably have no interest in learning it. And, perhaps most importantly, other groups will be able to access the data using industry-standard techniques, which has immense value for the company: adding a web front-end will take days instead of months.

Once the SQL database has been architected and the application object files are ready to use the new SQL database, nothing else is required! From this point forward, your data is now housed in one of the best SQL databases in the industry.

Additionally, standard interfaces such as JDBC and ODBC and analytics tools such as Tableau and Qlik can be used by the entire organization to access the previously unavailable data.
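
To make this concrete, here is a minimal sketch, in plain Java with JDBC, of what such access might look like once an Enscribe file has become a SQL table. The table name, column names, and environment variables are illustrative assumptions, not part of any particular conversion; the JDBC URL and driver for NonStop SQL come from your installation's documentation.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class SupplyChainQuery {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details: substitute the JDBC URL and
            // credentials documented for your NonStop SQL installation.
            String url = System.getenv("NONSTOP_JDBC_URL");
            String user = System.getenv("NONSTOP_JDBC_USER");
            String password = System.getenv("NONSTOP_JDBC_PASSWORD");

            // asset_locations is a hypothetical table produced by converting an
            // Enscribe file; the column names are illustrative only.
            String sql = "SELECT asset_id, city, state_code, asset_type "
                       + "FROM asset_locations WHERE state_code = ?";

            try (Connection conn = DriverManager.getConnection(url, user, password);
                 PreparedStatement stmt = conn.prepareStatement(sql)) {
                stmt.setString(1, "FL"); // e.g. assets in the projected path of a storm
                try (ResultSet rs = stmt.executeQuery()) {
                    while (rs.next()) {
                        System.out.printf("%s %s, %s (%s)%n",
                                rs.getString("asset_id"), rs.getString("city"),
                                rs.getString("state_code"), rs.getString("asset_type"));
                    }
                }
            }
        }
    }

The same query could just as easily be issued from Tableau or Qlik via ODBC; once the data lives in a SQL table, any standard client can reach it.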

Once you have converted your files to tables, you have taken the first step on your journey towards the cloud. The next logical step is to start working on a strategy to move your applications and data into the cloud.

Aligning mature core applications with changing expectations
As businesses grow and become more sophisticated, most companies have a cloud strategy in place. All of the new features they produce are cloud-native, or at least cloud-accessible. They have multiple offerings that their customers can access via the Internet, which allows them to be more competitive. Additionally, they are exploring ways to move more of their computing power into the cloud so they can focus on providing more features to their business partners. It's a great time to be an IT professional!

As you sit back and ponder the journey you, and your applications, have taken over the last few years, you still get a bit uncomfortable when you think about your dreaded legacy application. It still provides lots of value to the company, but most of that value is hidden behind legacy interfaces. Go to an airport and watch the gate crew change a seat assignment, and you can't help but cringe as you notice them hitting the tab key ten times in a row to move to the seat you would like, using a terminal emulator that 'speaks' 3270.

comforte's CSL Studio generates these services directly from the existing application definitions. Creating a RESTful or SOAP service using CSL Studio requires only four steps:

    1. Import the application's DDL definitions
    2. Import the Pathway definitions
    3. Generate the REST wrappers and documentation, or the SOAP WSDL
    4. Deploy the service to the NonStop

All told, in about 15 minutes, a legacy Pathway server can be exposed as a running web service!
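
As a rough illustration of what consuming such a service looks like from the outside, the sketch below calls a hypothetical REST endpoint in front of a Pathway server using Java's standard HTTP client. The host, path, and JSON payload are invented for illustration; the real interface is whatever CSL Studio generates from your DDL and Pathway definitions.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class SeatAssignmentClient {
        public static void main(String[] args) throws Exception {
            // Hypothetical endpoint and payload; the real path and fields are
            // defined by the wrappers generated for the Pathway server.
            String body = "{\"flight\":\"AB123\",\"passengerId\":\"4711\",\"seat\":\"14C\"}";

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://nonstop.example.com/api/seat-assignments"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            // The legacy Pathway server does the real work; the caller only sees JSON.
            System.out.println(response.statusCode() + ": " + response.body());
        }
    }

Because the caller sees nothing but HTTPS and JSON, the same endpoint can be driven by a web front-end, a partner application, or an off-the-shelf load-testing tool.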

Web-enabling legacy applications isn't just about replacing green screens; any running Pathway service can be exposed in this manner. Once an enterprise's legacy services have been exposed, one can argue that they are no longer legacy. The ROI for a tool such as CSL is remarkable: company assets are no longer hidden behind a proprietary set of interfaces, and third-party applications can be purchased that access these services and combine them into new, more powerful services. Creating a new feature can be as simple as writing a small workflow that calls a few services in a totally new way. Testing can be automated using third-party tools, and running a stress test is as simple as pressing a button on an open-source tool designed to generate data and send it to a web service.

Once a company has exposed its back-end services, it should start thinking about using the new Virtual NonStop capabilities. Imagine a world where a complete service can be deployed to an end-point location to ensure that it is always available: if the remote location goes offline, it can continue to operate without any impact to its users! Once a company moves into the virtual world, the possibilities are virtually limitless. As a company's transaction rate increases, the system could automatically expand to handle it without any user intervention. Gone forever are the days of buying enough hardware to handle the peak season and then watching it sit unused for 11 months of the year.

By using the Escort product, an enterprise can make its data available via one of the best SQL engines in the industry, NonStop SQL. With its applications exposed using CSL, a company now has all of the building blocks to provide immense value for years to come. The last piece of the puzzle is to make sure that all IT assets are fully secured, as no discussion of modernization can ignore such an important topic.

Security as a first-class citizen
Defense in depth is not a fancy new security concept; it dates back to medieval times. Back then, fortifications were constructed to present as daunting a challenge to attackers as possible. They were surrounded by water or built on a hill, with a central enclosure of stone walls and watchtowers, and then one or more outer walls, also with towers. The height of the walls increased towards the middle, enabling the inner defenders to shoot over the defenders on the outer walls. Back then, when a breach occurred, people died.

Luckily, today people don't die when a breach occurs, but the reputation of the company, along with the trust it has built with its customers, does. To prevent a catastrophic event like a breach, IT organizations need to layer their security. Folks like to use an onion to describe good security: as you peel back one layer, there is another one protecting the center.

Typically, good security can be separated into seven layers:

    1. Well-defined policies and procedures that are understood by everyone in the organization: things like data classifications, password strength, code reviews, and usage policies
    2. Physical security such as locks, ID badges, walls, and guards
    3. Perimeters built with firewalls, denial-of-service prevention, network address translation, and message validation services
    4. Network protection using encryption and identification services
    5. Server and desktop protection via patching and malware detection
    6. Application authentication, authorization, auditing, and secure coding practices, including single sign-on
    7. Data protection for the sensitive data itself, through encryption and tokenization

Achieving data protection with the innermost layer
Layer seven, the innermost layer, protects the sensitive data that the 'bad guys' want, and there are two very good standards to draw on when deciding what to protect and how: the Payment Card Industry Data Security Standard (PCI DSS) and the Health Insurance Portability and Accountability Act (HIPAA). Both standards are constantly reviewed and enhanced to ensure that private data stays private.

There are a few techniques that every organization should employ to protect its data. First, ensure that all (not some) communication uses a secure protocol such as TLS. If attackers can read your communications, they can learn things that help them attack your infrastructure, such as a user name, a password, or the address of a service. It should go without saying that the use of telnet and FTP should be banned (good security practice says they should be removed from the system altogether). There is no reason for any communication to travel in the clear. Second, all of the data should be encrypted or tokenized. Most security professionals agree that an organization should tokenize the most sensitive data (PII) and encrypt data whenever it is in motion or at rest.

Adding TLS to an application is simple so there is no reason not to do it!

Of course, encryption has many flavors. It is recommended that encryption be turned on for all physical media, which ensures that if a disk or tape is stolen, the data on it is useless; all modern operating systems have the ability to encrypt a disk or tape built in. Encrypting all communications is also straightforward: every browser supports TLS 1.2, which should be used by anything with a web-like interface. comforte's SSL-AT provides transparent TLS security to any application via an intercept library, meaning that no application modification is required. Adding TLS to an application is simple, so there is no reason not to do it. For those who want total control over their application, comforte offers another product, SecurLib/SSL, which gives developers complete control of their TLS implementation via simple-to-use APIs.
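
comforte's SSL-AT and SecurLib/SSL have their own configuration and APIs, which are not reproduced here; purely as a neutral illustration of the underlying idea, this sketch uses only the Java standard library to refuse anything older than TLS 1.2 on an outbound connection. The URL is a placeholder.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.SSLParameters;

    public class TlsOnlyClient {
        public static void main(String[] args) throws Exception {
            // Accept nothing older than TLS 1.2 for this client.
            SSLParameters params = new SSLParameters();
            params.setProtocols(new String[] {"TLSv1.2", "TLSv1.3"});

            HttpClient client = HttpClient.newBuilder()
                    .sslContext(SSLContext.getDefault())
                    .sslParameters(params)
                    .build();

            // Placeholder internal service URL; note the https scheme. Plain
            // http, like telnet and FTP, has no place here.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://internal.example.com/health"))
                    .GET()
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
        }
    }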

Tokenization is the process of replacing sensitive data with unique identification symbols that retain all the essential information about the data.

Once you have enabled encryption on all of your communications and storage devices, it is time to encrypt or tokenize your sensitive data (Social Security numbers, driver's license numbers, credit card numbers, PINs, bank account information, etc.). The approach most companies take is to encrypt or tokenize the data as it is written to a storage device. With tokenization, the sensitive data is replaced with a token; with encryption, the data is encoded and locked with an encryption key. In either case, if the data is stolen, it has no exploitable value.
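
Production tokenization is handled by a dedicated product or service; purely to make the concept tangible, here is a toy, in-memory sketch of vault-style tokenization in Java, in which the sensitive value is swapped for a random token and only the vault can map it back. It does not reflect any particular vendor's implementation.

    import java.security.SecureRandom;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class ToyTokenVault {
        private final Map<String, String> tokenToValue = new ConcurrentHashMap<>();
        private final SecureRandom random = new SecureRandom();

        // Replace a sensitive value (e.g. a card number) with a random token of
        // the same length, so existing field sizes still fit.
        public String tokenize(String sensitiveValue) {
            StringBuilder token = new StringBuilder();
            for (int i = 0; i < sensitiveValue.length(); i++) {
                token.append(random.nextInt(10)); // digits only in this toy format
            }
            tokenToValue.put(token.toString(), sensitiveValue);
            return token.toString();
        }

        // Only callers with access to the vault can recover the original value.
        public String detokenize(String token) {
            return tokenToValue.get(token);
        }

        public static void main(String[] args) {
            ToyTokenVault vault = new ToyTokenVault();
            String token = vault.tokenize("4111111111111111");
            System.out.println("Stored in the database: " + token);
            System.out.println("Recovered by an authorized service: " + vault.detokenize(token));
        }
    }

If the database or a backup tape is stolen, the attacker gets only tokens; the mapping back to real values lives in the vault, behind its own layers of defense.
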
Data security is no longer optional; bad actors are working 24/7 to hack into your systems.

No matter how good you and your security team are, sooner or later one small mistake, such as a missed patch on an obscure server, will let them in. If your data is encrypted when in motion and tokenized or encrypted when at rest, you have far less to be concerned about.

Summary and Conclusion
As technologies advance and user expectations change, the NonStop platform must adapt. Reliably supporting some of the most critical business functions in organizations worldwide is no longer quite enough. Applications on the NonStop need to be seamlessly embedded into modern architectures and offer a modern user experience. At the same time, data security has to be treated as a first-class citizen, because opening up business functions and data on the NonStop to other applications also increases their exposure.

Therefore, there are three main elements to consider:

    • Unlocking data by converting Enscribe files to SQL
    • Unlocking business functions with web service enablement
    • Securing sensitive data with tokenization or encryption

This might seem like a daunting task, but with the right partner at your side, you can unlock the full potential of your NonStop while minimizing effort and risk.

comforte is a leading global provider of data security, connectivity, and application modernization solutions. comforte delivers best-in-class products and support for customers using HPE NonStop systems.
