Kevin Parker Archive

Start with a good foundation


Like most jobs in life, preparation is the key to success. After getting to know the Serena Deployment Automation technology by working with the free version (download it from the Serena website) for a few hours (see yesterday’s post), I decided it was time to try it for real.

My application was a Library Management System I developed a while ago for a public library in the United Kingdom. Like most developers I like to have something familiar to play with when I am learning a new technology.

So I started by defining my application to Serena Deployment Automation (SDA). The truth is that the help system (which is very helpful) suggested I define the Components first. I ignored that, partly because I like to try and test things to their limits and partly because I like to do the unexpected.

Create the Application

To start I clicked on Management, then Application and then Create New Application. I gave the application a name and a description and I was done. Easy. But was it too easy?

Once my application was created it dropped me into the Environment definition page. I was expecting this because I had been through the tutorials and samples when I first downloaded the Appliance. Here is where we define the target environments for the application. Every application lives somewhere. The environments are definitions of the locations you will be deploying to for development, testing and production. Each environment can comprise one or more targets.

I clicked on Add Environment and the drop-down menu invited me to pick from DEV, INT or QA. Those didn’t suit me, so I realized I needed to create my own Environments.

Create the Environments

So now my plan was off track and that made this whole thing even more fun.

I clicked on Environments and there were DEV, INT and QA. So I clicked on Create Environment and all I had to do was to give the Environment a name and description. Next I was shown the Environment Details page. It had no details of course because it had just been created.

The Application was yet to be associated with the Environment and it had no resources. It was then it dawned on me. My application comprises three parts: the database, which runs all the time; the programs, which run when invoked; and the scripts, which run once each time the application is refreshed. These resources, these components, could go to any of the servers in my environment. I needed to define these components so I could tell SDA which components go where.

So I should have followed the instructions after all and defined the Components first. Good to know the help system has my best interests at heart. Even though I went down the wrong path all the entries I made are going to be used when we get down to the deployment itself.

Create the Components

So I clicked on Components and then the Create Component button.

Here I am invited, as usual, to give my Components a name and a description and, in addition, details of their location and the repository type. SDA supports almost 20 different types of repository, including PVCS, Dimensions and Subversion.


My foundation is in place. Now I have to fill out a little more of the details and decide how the deployment should go. This whole process took no more than 5 minutes. In that time I had set up the Environments, the Application and the Components.

What I really want to do is deploy my Application and its Components through the sequence of Environments I have set up. To do that we need to define the process we want the deployment to follow. And that is what we’ll do tomorrow.

We are not reinventing the wheel here but we are perfecting it.

A couple of weeks ago I wrote about downloading the new Serena Deployment Automation Appliance. This is the free, community edition of the exceptionally advanced Deployment Automation technology we introduced last year. You can get your own copy, free forever, at the community edition website.

The story so far

Since that post I have been working with the Appliance, learning how to automate deployments. For about half an hour each day, for the past week, I have been pressing buttons, dragging and dropping and generally putting the technology through its paces, partly to improve my understanding of how it all works but mostly to see just how much better automation is than the manual processes I used to use. I have to say I’m impressed! Let me take you on my journey and share with you how I became an automation-maven in just a week. In order to make this digestible I am going to write it in 5 separate postings.

Today, we have naming of parts* (Monday)

Serena Deployment Automation divides deployment into three units:

  • Applications – this is the entirety of what you are deploying. It might consist of scripts, executables, images, configurations, in fact anything you need to upgrade your application from its current state to its upgraded state.
    • In my case my Application is the Llareggub (a fictional place in Wales) Public Library
  • Components – these are the distinct collections of items, often from specific repositories (Dimensions CM, PVCS/VM, VSS, Subversion, etc.), that are going to be deployed. Each collection will be of similar types of artifacts that need the same deployment process.
    • In my case I had three collections of components: database schema changes in the /SQL folder, new programs in the /CBL folder and set up scripts in the /BMS folder.
  • Environments – are where the deployments are going to go from and to. These can be single or multiple deployment targets, such as a single server or a virtualized shopfront like Amazon.
    • In my case I just followed the paradigm already in use in the Appliance: DEV, INT and QA – development, integration test and QA testing.
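
To make these three units concrete, here is a rough Python sketch. All the names are invented for illustration; this is how the pieces relate, not SDA’s actual object model:

```python
from dataclasses import dataclass, field

# Hypothetical names throughout -- an illustration of how Applications,
# Components and Environments relate, not SDA's real data model.

@dataclass
class Component:
    name: str
    source_folder: str       # the repository folder the artifacts come from

@dataclass
class Environment:
    name: str
    targets: list            # one or more deployment targets (servers)

@dataclass
class Application:
    name: str
    components: list
    environments: list = field(default_factory=list)

library = Application(
    name="Llareggub Public Library",
    components=[
        Component("database-schema", "/SQL"),
        Component("programs", "/CBL"),
        Component("setup-scripts", "/BMS"),
    ],
    environments=[
        Environment("DEV", ["dev-server-1"]),
        Environment("INT", ["int-server-1"]),
        Environment("QA",  ["qa-server-1", "qa-server-2"]),
    ],
)

# Every application lives somewhere: each environment lists its targets.
for env in library.environments:
    print(env.name, "->", ", ".join(env.targets))
```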

Each of these has associated attributes; the most important are:

  • Processes – these are associated with Applications and Components. Component processes allow you to create activities that differ for different classes of components. For example, a set of database DDL needs very different treatment to a bunch of DLLs, and how we apply DDL to an MS/SQL server is very different to how we apply it to an IBM/UDB server. Imagine these as micro-processes, as your toolkit for deploying this kind of component. Application processes are macro-processes made up of a collection of the micro-, the component-processes. Here you can create a deployment process that initially loads the application on to a new server or a process for putting out a security patch.
    • In my case I had processes for deploying the application from DEV to INT and from INT to QA, comprising the stop-backup-apply-DDL-restart of the database, the backup-deploy of the code and the backup-deploy-execute of the scripts.
  • Properties – describe the nature of the Applications, Components and Environments. They specify, for example, the source repository type, the identity of a deployment target, the approvers and many other attributes.
    • In my case I kept things simple and took defaults whenever I could.
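
The micro/macro idea can be sketched in a few lines of Python. The step and component names are invented for illustration; the point is simply that an application process composes the component processes:

```python
# Hypothetical sketch of micro- vs macro-processes: each component has its
# own small process (a list of steps), and the application process composes
# them. Step and component names are invented, mirroring the post's example.

component_processes = {
    "database-schema": ["stop", "backup", "apply-ddl", "restart"],
    "programs":        ["backup", "deploy"],
    "setup-scripts":   ["backup", "deploy", "execute"],
}

def application_process(order):
    """Compose component micro-processes into one macro-process."""
    steps = []
    for component in order:
        for step in component_processes[component]:
            steps.append((component, step))
    return steps

for component, step in application_process(
        ["database-schema", "programs", "setup-scripts"]):
    print(f"{component}: {step}")
```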

Tomorrow – setting up the Application and the Components and a deeper dive into Properties and Processes.

* apologies to Henry Reed, Naming of Parts, and E. V. Milner, Baking of Tarts

“Drinking our own champagne” is how we approach technology here at Serena. If we have our own tool that supports part of the application development lifecycle we use it for our own development efforts. In fact the Serena development teams deploy the beta versions of our solutions straight into their production environments because they want to exploit the cool-new-stuff just as much as you do!

When I sat down today to start writing about automated deployment in modern enterprises I thought I’d follow the Serena mantra and “drink our own champagne” too. So I jumped on the website and downloaded the completely free and completely pre-configured “appliance” and the virtual environment that runs it.

All appliance – no science

This is something truly amazing. The appliance comes fully installed, configured and ready to go. All you have to do is press the “deploy” button. It runs on top of VirtualBox, a virtualized environment that runs on most enterprise platforms. Now I will tell you that the download of the appliance took almost 20 minutes as it is over 5GB, but the VirtualBox download only took a few seconds.

While it was downloading I read through the easy to follow documentation that walks through deploying an application to Tomcat, one that deploys a database update and a deployment to WebLogic. I also took a look at the cool videos so I could get a sense of what was ahead. By the time I’d watched the last one the downloads were done.

Installation of VirtualBox took a couple of minutes and was easy as I just took all the defaults. Importing the Appliance also took a couple of minutes.

It takes a community

My environment is Windows 8.1 Pro on a Surface II computer. The moment I started the Appliance I got an error message.

So I popped over to the Serena Deployment Automation Community forum where my exact problem (Symptom: Appliance won’t start; Reason: pop-up blocker software on Windows 8.1) was described. I followed the advice and voilà!

Moments later I found myself looking at a logon screen for Serena Deployment Automation.

Mr. Impatient

Like most geeks I want to click buttons and links more than I want to read documentation or follow a script.

So I clicked on Application and got a list of applications to be deployed. Then I clicked on Tomcat Sample Application and got a list of deployment areas. Next to DEV it asked me to Request Process. Deploy Application was already selected so I just hit SUBMIT.

Seconds later the deployment results screen appeared and I could see the steps executing.

That was fast

So I installed and configured enterprise-class Deployment Automation technology. I executed my very first Automated Deployment. All in under an hour.

The software is free forever. It is unrestricted in functionality and supported by the community of users. The free version is limited to a generous 5 deployment end-points and I can buy more for under $1,500. I get a tee-shirt when I do.

How cool is that?

Follow the recipe

So now I am going to settle down and follow each of the guided tours and see what other miracles of technology await my discovery.

Next week I’ll share those experiences too.

In the meantime why not take an hour and try out Serena Deployment Automation for yourself?


Systems programmers on the mainframe have a pretty difficult time these days. More and more complexity; rampant growth of z/Linux, WebSphere and RD&T boxes; Draconian constraints, compliance and governance mandates to be applied. All with fewer and fewer resources. It is a common problem.

Serena is here to help. Our ChangeMan SSM technology is designed to be the SysProg’s best friend and unswerving ally.

Sitting quietly in the background monitoring system datasets and members like the APF authorized libraries, the LINKLIST datasets, console commands and any critical application datasets, ChangeMan SSM will send the SysProg an alert, in real time, when members of these datasets are changed. That message can be delivered to TSO or to email. Not only does ChangeMan SSM know who, how and when the change occurred, it also knows what changed and provides a critical audit trail. If that change was accidental (or malicious) the SysProg can ask ChangeMan SSM to back out the change instantly.
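
The monitor-and-alert idea can be sketched, very loosely, in Python. This is purely illustrative: ChangeMan SSM runs natively on z/OS against datasets and members, while this toy version just hashes files in a watched directory and reports what changed:

```python
import hashlib
import pathlib

# Illustrative only: a toy change detector in the spirit of what the post
# describes. It snapshots a directory by content hash, then diffs two
# snapshots to find what was added, removed or modified.

def snapshot(watch_dir):
    """Map each file in watch_dir to a hash of its contents."""
    return {
        p: hashlib.sha256(p.read_bytes()).hexdigest()
        for p in pathlib.Path(watch_dir).iterdir() if p.is_file()
    }

def detect_changes(before, after):
    """Return (added, removed, modified) file sets between two snapshots."""
    added    = set(after) - set(before)
    removed  = set(before) - set(after)
    modified = {p for p in set(before) & set(after) if before[p] != after[p]}
    return added, removed, modified
```

A real monitor would also record who made the change and when, and keep the prior copy so the change can be backed out — which is exactly the audit trail the product provides.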

Want to try it free for yourself? Go to the Serena website and download your free trial today or contact me and I’ll be happy to show you just how it works.

This is the first installment of an occasional series of posts in the run up to the launch of ChangeMan 8.1 later this year. As part of the routine pre-launch activities I’ll be chatting to the development team to learn about the cool features that have been added to this latest version.

ChangeMan ZMF is used by some of the largest mainframe development shops in the world. It is typical for these organizations to have hundreds of thousands of components managed and tens of thousands of components in motion. Keeping track of that requires the kind of sophistication and advanced capabilities that can only be found in ChangeMan ZMF.

Many of the largest customers in the world have the most complex and sophisticated release management problems and they make use of the Enterprise Release Option (ERO) of ChangeMan ZMF. Even though the mainframe has blisteringly fast performance, displaying a list of the 4,000 to 10,000 members of a release (typical for several ERO customers) could take a while.

In ChangeMan ZMF 8.1 the ERO process of selecting, searching and sorting lists has had a major overhaul. Where a list of a few thousand components might have taken several minutes in the past this has been reduced to mere seconds. This has been achieved through a number of thoughtful innovations in how we retrieve the information, how we cache it and how we present it. We have also optimized the access methods so only relevant items are retrieved in the first place.
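
The two ideas — retrieving only relevant items at the source, and caching repeated queries — can be sketched like this. The component names and data are invented; this is the principle, not ZMF’s implementation:

```python
import functools

# Toy illustration of the retrieval ideas described above: filter at the
# source so only relevant items come back, and cache each query so asking
# again is instant. Names and data are invented.

RELEASE_COMPONENTS = [f"MOD{i:05d}" for i in range(10_000)]

@functools.lru_cache(maxsize=64)
def component_list(prefix=""):
    """Return only the components matching the filter, caching each query."""
    return tuple(c for c in RELEASE_COMPONENTS if c.startswith(prefix))

print(len(component_list()))          # the full multi-thousand member list
print(len(component_list("MOD001")))  # a refined query returns far fewer
```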

New filtering capabilities also make it possible to really refine these lists to just what you need, making the delivery times even faster. As Dave Banovetz, lead engineer on ERO, said: “the more defined your query the faster you get results.”

Due to several z/OS operating system constraints, it was common for these multi-thousand component lists to push up against the storage limits. Again, some very clever programming allows us to capture all the data while keeping well short of these memory limitations.

To learn more about the new features and capabilities in ChangeMan ZMF version 8.1 please contact me.

Tags: Serena

(This is the conclusion of a 7-part series. Read part 1, part 2, part 3, part 4, part 5 and part 6.)

Reality: Deployment should be repeatable and predictable

In this series we’ve looked at a number of reasons why people don’t automate their deployments. I had a boss once who was fond of saying, “If you don’t have time to do it right, when will you have time to do it over?” He was right.

50% faster to create

Numerous customers tell the same story. With Serena Deployment Automation they typically spend only half the time they used to creating deployment scripts, because they are able to design them graphically and re-use standard parts that have been built for them and are included in the tool.

The customers build libraries of their own standard deployment (and back out) techniques and this brings consistency and repeatability across the organization. This saves time when building the scripts and saves even more time when a deployment fails.

90% faster to execute

The “nut loose on the keyboard” has always been the limiting factor in any computer system. When deployments are automated there is no pause between steps while human-1.0 searches the network to see if the server restarted, no scanning the log for a completion code before running the next script, no calling the supervisor for the password … all things that add delay. And automation means running steps in parallel, which human-1.0 doesn’t like to do.
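
Here is a small Python illustration of the point: independent steps run in parallel, and each next step starts the instant its prerequisite finishes, with no human polling in between. The step names and timings are invented:

```python
import concurrent.futures
import time

# Minimal sketch: three independent deployment steps run in parallel, so
# the whole batch takes about as long as the slowest step, not the sum.
# Step names and timings are invented stand-ins for real work.

def run_step(name, seconds):
    time.sleep(seconds)              # stand-in for the real deployment work
    return f"{name}: done"

independent_steps = [("copy-artifacts", 0.2), ("update-config", 0.2),
                     ("warm-cache", 0.2)]

start = time.perf_counter()
with concurrent.futures.ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda s: run_step(*s), independent_steps))
elapsed = time.perf_counter() - start

print(results)
print(f"elapsed ~{elapsed:.1f}s, not "
      f"{sum(t for _, t in independent_steps):.1f}s sequentially")
```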

25 billion deployments a day

The pace of deployment is the fastest accelerating aspect of our industry today. Fueled by smartphone applications, business users expect deployments to happen with a daily cadence where once they were happy to get a monthly refresh. By 2020 there will be 25 billion devices connected to the Internet (that’s more than three devices for every person on the planet!) all in need of frequent updates from you.

Today there are 1.3 billion smartphone users and every one of them is a release management expert. Each day they decide which apps to update and which to delete from their devices. They make these judgments based on a hundred factors that fuel their instinct for what’s good and what’s not, how much space they might save and how much battery they consume. Profound, technically complex decisions by users not trained or skilled in Information Technology but experienced in who delivers what they want with repeatable precision and without disruption.

Whether you are deploying to a mainframe or to a wearable device, on-premise or a cloud, you are part of a global distribution machine that thrives on speed. Fast response to consumer need turns an idea into an industry. But it is an unforgiving place too: expectations are for flawless execution every day because one failure can turn your “coolest” organization into the “coldest” has-been at the speed of a single tweet.

Delivering that consumer confidence only comes when you have confidence in your ability to deliver. Automation is how you establish confidence and deliver it repeatedly.

Systems are best at repetitive. Humans are best at creative.


In order to help you get started with your automation, Serena has made its latest version of Serena Deployment Automation available in a Community Edition format that lets you experience the most up-to-date deployment automation technology for free. Download it here today.

Myth: Errors happen: It’s software

Errors do occur. They occur for a reason. Often those reasons are out of our control. Someone changes an IP Address of a server. Someone changes the password to the back office system. Someone changes the name of a shared .DLL.

Of course in a well-managed and carefully controlled environment those kinds of things shouldn’t happen without the proper authentication, notification and approval. And the infamous “someone” is a responsible professional who calculates the impact of their changes and collaborates with everyone to minimize that impact. In a perfect world.

In the real world change is constant and calculating the consequences of change virtually impossible. Errors can occur and it is our job as release engineers to ensure that they don’t.

Every time we manually fix a problem we waste our effort and no one learns from the experience. The same errors occur repeatedly and we keep applying the same fixes.

When your deployment is automated there might still be errors that occur. However now, when you improve the automation, those errors are addressed once and for all. Each time you do this you save time and money for your organization and get closer to a comprehensive solution.
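
The “fix it once” pattern can be sketched like this. The exception types and recovery routine are invented examples, not anything SDA-specific:

```python
# Sketch of "fix it once": when a new failure mode appears, a handler is
# added to the automation, and from then on every deployment knows how to
# recover from it. Exception types and handlers here are invented.

class StaleDNSError(Exception):
    pass

HANDLERS = {}

def on_error(exc_type):
    """Register a recovery routine for one failure mode."""
    def register(fn):
        HANDLERS[exc_type] = fn
        return fn
    return register

@on_error(StaleDNSError)
def refresh_dns(exc):
    return "re-resolved server address, retrying"

def deploy(step):
    try:
        return step()
    except Exception as exc:
        handler = HANDLERS.get(type(exc))
        if handler is None:
            raise                # a genuinely new error: automate it next
        return handler(exc)

def flaky_step():
    raise StaleDNSError("server IP changed")

print(deploy(flaky_step))
```

The first time `StaleDNSError` appeared someone fixed it by hand; once the handler is registered, the fix is organizational knowledge rather than one engineer’s memory.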

Never send out another memo about a changed process or new exception. Implement them in the deployment automation directly so that they become the new organizational standard.

And never worry again about the changes to your tool chain when vendors update their tool integrations. With Serena solutions we keep an extensive library of deployment tool vendor integrations so you don’t need to.

Serena Deployment Automation supports toolchain integration by providing a flexible, robust, and extensible plug-in architecture.



Myth: Each deployment needs me

Helicopters have been described as “10,000 parts flying together in close formation. It is the mechanic’s job to keep that formation as tight as possible.”

Modern software applications comprise millions of parts when you consider the huge chunks of code we bind into our applications from the database, security, web server, communications, encryption and authentication vendors. Add to that the seemingly infinite number of dependencies on external web services and internal CRM and financial systems.

There are 100 million lines of code in the Ford Taurus

But, just like the helicopter’s mechanic, the software and release engineers can’t be there all the time the system is on the air (or in the air).

It would be prohibitively expensive to have engineers chaperoning their application 24×7. Yet, whenever there is a deployment, no matter how routine, release engineers “want to be there just in case.”

This is a laudable commitment to ensuring success but it belies a worrisome truth. Is the release engineer who hangs around the release “just in case” more capable than the one who doesn’t hang around but gets on with the next release automation task?

Deployment Automation maintains an inventory of every artifact deployed

Automation means never having to say you’re sorry

Release engineers who build automated deployments know that they can incorporate all the necessary logic to deal with the expected (and unexpected) consequences of their deployments. They know they can leave the automation to execute quietly and efficiently without human intervention.

Automation engineers also know that if something occurs that has not happened before they can a) handle it safely too and b) add further automation to deal with this new exception in a proper, predictable fashion each time it occurs in the future.

With Serena’s Deployment Automation technology, release engineers are freed from constant script development and modification. Now they can turn their attention to supporting the development teams and enabling their continuous improvement programs, their continuous integration processes and their continuous delivery goals.

Instead of firefighting every failed script and every broken deployment, release engineers can use the Serena Deployment Automation logging capabilities to do full root cause analysis of problems that arise. Then they get to address the problem at its source by improving the coverage and completeness of the automation, eliminating the possibility of future errors occurring.



Myth: Emergency fixes are different

“I don’t want to know why it happened: I just want you to fix it!” was what I was told early one morning by the Director of Sales. And she was right: getting the online store back online was the most important thing for the business. Blamestorming would come later.

There is a temptation at 3:00 am to just do whatever it takes to bring the system back on the air even if that means bypassing protocols and procedures designed to protect system integrity. Sales-and-Marketing and Audit-and-Compliance might not see eye-to-eye on this approach.

So why do emergency fixes have to be different? This myth is all about time. The time it takes to write the script. The time it takes to execute the script. The time it takes to get the system back on the air.

Approvals where you need them

Half the effort and a tenth of the time

With Serena’s Deployment Automation we can halve the time it takes to create a script and reduce actual deployment times by 90%. And we do not bypass the audit controls, the change reporting or the system integrity.

Whether you are changing one setting in a DNS configuration or every .DLL in the application the procedure for updating your application is the same. You should not rely on the skill, experience and knowledge of a tired release engineer who is working under the pressure of a very upset executive. It is at these times when you must rely upon proven and reliable solutions that deploy consistently, safely and quickly.



Myth: Every target is unique

When we started this series we talked about how release engineers have difficulty in keeping up with the rate of change in their environment. Every day a new security patch or software update is applied that changes the known topology of one or more deployment targets.

This is why, some release engineers insist, they have to hand-craft the deployment scripts each and every time.

In an ideal world every target environment would be standardized. But we don’t live in an ideal world. Whether it is our own on-premise platforms, or virtualized or cloud platforms, we know that their configurations are in a state of constant evolution. This makes it hard for release engineers, who must spend time determining the target topology before they write and execute the deployment script.

Any time we rely upon humans we introduce the possibility of error. What if we could detect the target topology before we deploy and then follow a script for that combination of configurations?

Serena’s Deployment Automation allows you to do just that. In fact it even allows you to stand up your virtualized and cloud-based platforms right there in your deployment thus ensuring that the deployment target is what you expect it to be.
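
The detect-then-branch idea looks roughly like this in Python. The probe logic and path names are invented; a real tool interrogates the live server rather than reading a dictionary:

```python
# Illustrative detect-then-branch sketch: probe the target first, then pick
# the deployment path that matches the discovered topology. All names and
# the probe itself are invented for this example.

def detect_topology(target):
    """Return a (platform, app_server) pair describing the target.

    Stand-in probe: a real implementation would query the OS and the
    installed services over SSH or an agent.
    """
    return target.get("platform", "linux"), target.get("app_server", "tomcat")

DEPLOY_PATHS = {
    ("linux", "tomcat"):   "deploy-war-to-tomcat",
    ("linux", "weblogic"): "deploy-ear-to-weblogic",
    ("windows", "iis"):    "deploy-site-to-iis",
}

def choose_path(target):
    topology = detect_topology(target)
    try:
        return DEPLOY_PATHS[topology]
    except KeyError:
        raise RuntimeError(f"no deployment path for topology {topology}")

print(choose_path({"platform": "linux", "app_server": "weblogic"}))
```

The same deployment request follows a different logical path on each target, which is exactly what the graphical process editor lets you model.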

All new process editor

Out of the box Serena Deployment Automation integrates with many third party technologies. Everything from your favorite code repository and build tool to your deployment and test technologies and it even integrates with your problem management system.

Using a graphical design environment, release engineers can construct their deployment automation quickly and design it so that different logical paths are followed depending on what target topology is encountered.

Now you get to build upon your proven automation and keep pace with the evolving landscape that surrounds you.


