Hello Automaters! It gives me a great deal of pleasure to introduce RES ONE Automation 10.1, part of the RES ONE Enterprise family. The product is Generally Available as of today, July 6th, and is downloadable from the RES Success Center. Feel free to skip to the detailed section from the quick menu below or keep reading to see everything we’ve added to this exciting new release of RES ONE Automation.

Licensing


Before we get into the exciting new stuff, first a reminder… A little while ago, we enhanced the license model to support the additional licenses used with the RES ONE Identity Director integration.

As a reminder, we:

  • Enhanced the license model in RES ONE Automation to support the new Modules (Access and Identity) in RES ONE Identity Director and RES ONE Security.
  • Offered you 12 complimentary license points when you enable the integration with RES ONE Identity Director.

This was introduced in the summer of 2016, but we still get queries about it, so I wanted to point it out again!

Parallel Processing


Probably the most hotly anticipated new feature, and certainly one of the biggest feature requests – we bring you Parallel Processing! When we brought you Agent+ in the version 10 release, I promised that we would be able to bring you a slew of new features and functionalities now that we have a new foundation for the Agent platform. And Parallel Processing is the first exciting feature of those new innovations!

With the Agent+, you are able to run multiple Jobs simultaneously. This parallel processing makes it possible to start another Job on the same machine without waiting for the previous Job to finish. This could be useful if you, for instance, want to back up the Oracle database (Job 1) and back up the security event log (Job 2) on Server A. Both Jobs could run in parallel.

This feature is only applicable when scheduled to an Agent+. The legacy Agent for Windows and Agents for Unix/Linux and Mac OS X do not support parallel processing.

  • At Jobs > Scheduling, when configuring a new Job, you can select the option Schedule in parallel with other jobs to allow this Job to be scheduled in parallel instead of serialized (default).
  • At Jobs > Scheduling, Activity and Job History, the column Parallel has been added to show whether the Job ran in parallel or not. It is also possible to schedule a Job in parallel via the command line option /parallel.

We have made this functionality available to you in both the Management Portal and the Rich Console.

Of course, you can also call the Parallel Processing function via the Command Line and the API! Keep in mind that:

  • The Tasks and Queries inside the Module, Project or Run Book will always run serialized in the order configured in that Job.
  • When multiple Jobs are scheduled, in parallel and serialized, the following applies:
    • When a serialized Job is scheduled to run on an Agent+, the Agent+ will wait for all other running Jobs (parallel or serialized) to finish before starting the serialized Job.
    • When a serialized Job is started on an Agent+, no other Jobs (parallel or serialized) will be started on that Agent until the serialized Job has finished.
    • For example, if you schedule a serialized Job at 9 AM, and two parallel Jobs at 10 AM, the parallel Jobs will only start when the serialized Job has finished.
  • Certain Tasks can interfere with other Tasks when running in parallel due to operational limitations on the machine. For example, Tasks that request some types of Resources (registry keys/values, files, folders, shares, etc.) from the same fileshare. This could result in a different outcome than expected.
  • The following Tasks are excluded, by default, from parallel processing:
    • Task Install Windows Package
    • Task Reboot Computer
    • Task Shutdown Computer

Use Cases


A couple of strong use cases for Parallel Processing are as follows:

IFTTT (If This Then That) Scenario

  • Continuously monitor the state of a Print Queue.
  • If failure occurs, stop the spooler service, clear the stuck job, then restart the service again.
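
Outside of RES ONE Automation, the logic behind that print-queue scenario boils down to a monitor-and-recover loop. The Python sketch below is purely illustrative (the polling interval is arbitrary, and the service name and spool folder are the Windows defaults); in Automation itself you would build this from Tasks and Queries rather than a script.

    import subprocess
    import time
    from pathlib import Path

    SPOOL_DIR = Path(r"C:\Windows\System32\spool\PRINTERS")  # default Windows spool folder

    def spooler_is_running() -> bool:
        # "sc query" reports the service state; RUNNING means the spooler is up.
        result = subprocess.run(["sc", "query", "Spooler"], capture_output=True, text=True)
        return "RUNNING" in result.stdout

    def recover_spooler() -> None:
        # Stop the service, clear any stuck jobs from the spool folder, then restart it.
        subprocess.run(["net", "stop", "Spooler"], check=False)
        for stuck_file in SPOOL_DIR.glob("*"):
            stuck_file.unlink(missing_ok=True)
        subprocess.run(["net", "start", "Spooler"], check=False)

    if __name__ == "__main__":
        while True:                  # continuously monitor the print queue
            if not spooler_is_running():
                recover_spooler()
            time.sleep(60)           # poll once a minute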

Faster Provisioning

  • Perform a lengthy file copy operation.
  • At the same time, be installing an application and carrying out system configuration tasks.
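
In plain scripting terms, the provisioning idea is simply to overlap a long-running copy with other work. Here is a minimal, hypothetical Python sketch, where copy_payload and configure_system stand in for your own provisioning steps and paths:

    import shutil
    from concurrent.futures import ThreadPoolExecutor, wait

    def copy_payload() -> None:
        # Stand-in for the lengthy file copy operation.
        shutil.copytree(r"\\fileserver\packages", r"C:\Staging", dirs_exist_ok=True)

    def configure_system() -> None:
        # Stand-in for installing an application and applying configuration.
        print("installing application and applying configuration...")

    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(copy_payload), pool.submit(configure_system)]
        wait(futures)   # both run at the same time; continue when both have finished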

Examples


Below are several examples of Job behavior, comparing serialized Jobs, Parallel Processing, and mixed scheduling, to help explain the options now available to you with 10.1.
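
To make the comparison concrete, here is a small, purely illustrative Python simulation of the rules from the previous section: a parallel Job may start while other parallel Jobs are running, but a serialized Job first waits for everything already running and then blocks anything scheduled after it. The Job names and durations below are made up.

    import threading
    import time

    class AgentPlusModel:
        """Tiny model of the Agent+ scheduling rules, not RES code."""

        def __init__(self):
            self._cond = threading.Condition()
            self._running_jobs = 0
            self._serialized_active = False

        def run_job(self, name, seconds, parallel):
            with self._cond:
                if parallel:
                    # A parallel Job must still wait for an active serialized Job.
                    self._cond.wait_for(lambda: not self._serialized_active)
                else:
                    # A serialized Job waits for all running Jobs, then blocks others.
                    self._cond.wait_for(lambda: self._running_jobs == 0)
                    self._serialized_active = True
                self._running_jobs += 1
            print(f"{time.strftime('%X')} starting {name}")
            time.sleep(seconds)                       # stand-in for the real Tasks
            with self._cond:
                self._running_jobs -= 1
                if not parallel:
                    self._serialized_active = False
                self._cond.notify_all()

    agent = AgentPlusModel()
    jobs = [
        ("serialized Job (09:00)", 3, False),
        ("parallel Job A (10:00)", 1, True),
        ("parallel Job B (10:00)", 1, True),
    ]
    threads = [threading.Thread(target=agent.run_job, args=job) for job in jobs]
    for t in threads:
        t.start()
        time.sleep(0.1)   # stagger the starts to mimic the 09:00 / 10:00 schedule
    for t in threads:
        t.join()          # the parallel Jobs overlap, but only after the serialized Job ends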

  

AES-256 Encryption

This has been a major piece of work for us, and in 10.1 you will see the first stage of a two-stage process brought out into the light of day.

With the security worries, scares, and threats that we are reading about daily, not only in the Computing Press but in the World News, the RES team has been hard at work making sure that you can use our products in the knowledge that you are safe and secure.

Previously, we used proprietary obfuscation in RES ONE Automation to make sure security-sensitive data was kept away from prying eyes.

By security-sensitive, I am referring to anywhere usernames or passwords are used, and anywhere datastore connection information or component (Agent, Dispatcher, Datastore) connectivity data is held.

These are stored in the registry, in the Automation datastore, and on the filesystem where the Components are installed.

Whilst our obfuscation was effective, and has served us well for many years, some customers still needed to do their own penetration testing to make sure they were happy with the way we do things. As we do more and more projects in the federal space, in local and central government, and in healthcare, where security demands are now on a par with federal, our move from proprietary obfuscation to AES-256 means that customers can omit a lot of their penetration testing, realize faster deployments, and take comfort in knowing that we have done the heavy lifting in the background.
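
For those curious what AES-256 looks like in practice, here is a generic illustration using the Python cryptography package. It is not the RES ONE Automation implementation (our key handling and storage are done for you behind the scenes), and the connection string is made up; it simply shows that data encrypted with a 256-bit AES key is unreadable without that key.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)    # a 256-bit (32-byte) AES key
    aesgcm = AESGCM(key)

    secret = b"Server=SQL01;Database=Automation;User Id=svc_res;Password=example"
    nonce = os.urandom(12)                       # AES-GCM needs a unique nonce per message

    ciphertext = aesgcm.encrypt(nonce, secret, None)
    print(ciphertext.hex())                      # meaningless without the key

    assert aesgcm.decrypt(nonce, ciphertext, None) == secret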

 

As I mentioned, for us this is a two-stage process. What you will see here in 10.1 of RES ONE Automation is AES-256 for NEW installations. If you do a fresh install, everything is set up and configured to use AES-256. You will see an indicator at Setup > Database to show you that the datastore is configured with AES-256.

We will introduce stage two soon, where we will offer customers with an existing installation the ability to convert their environment from our proprietary obfuscation to AES-256 encryption. So, whilst customers COULD create a building block on an existing environment and move it into a clean datastore in order to gain AES-256, my recommendation would be to wait for the next release and let us worry about it for you!

Agent Activity Logging


Some customers have asked us through UserVoice (our ideation platform) for better logging of Jobs and their status, and I am pleased to say this marks the second of the new innovations that we bring to Agent+.

Detailed Agent activity is now included in the Event Log for the Agent+. In the Event Log, which can be viewed in the Microsoft Event Viewer, new actions (or events) for the Agent+ have been added.

You will find all the Event Logging that pertains to RES ONE Automation under:

  • RESWAS
  • RESWCS
  • RESWDS

The naming is a proud nod to our past, when the product was named RES Wisdom. Some customers are already logging Service Start and Stop actions under these names, so we were conscious that we should be consistent here.

Now, you are able to see the name of the Job, its description, its GUID, and its status (success, or the exit code if it failed). There is a wealth of information in the Help and the Administration Guide, which explains the Event ID numbers and what they mean.

These new Event Logging capabilities are very useful for customers who would like to monitor Jobs using third-party tools like Splunk, Microsoft System Center Operations Manager, or other ITOM products.
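
If you want to feed these events into your own monitoring pipeline, the built-in Windows wevtutil tool is one simple route. The sketch below is a hypothetical example that assumes the events appear under an event log named RESWAS; check Event Viewer and the Administration Guide for the exact log names and Event IDs in your environment.

    import subprocess

    # Query the 20 most recent events from the RESWAS log as plain text, newest first.
    result = subprocess.run(
        ["wevtutil", "qe", "RESWAS", "/c:20", "/rd:true", "/f:text"],
        capture_output=True,
        text=True,
        check=True,
    )
    print(result.stdout)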

Ubuntu 16


RES ONE Automation 10.1 introduces support for another non-Windows Agent. This time, we are pleased to add Ubuntu 16 to the family. It is worth noting that there is only a 64-bit distribution of the operating system, so as you'd expect, we have built a 64-bit Agent.

Telemetry

From RES ONE Automation 10.1, RES is monitoring the functionality that is used within the product. This allows us to prioritize the areas that we focus our development efforts on.

That way, we have:

  • More visibility on how the product is used amongst the different customers
  • A better grip on the deployment of new Agents and the operating system versions in use
  • More accurate information on the systems used by our customers, helping us make informed decisions about the deprecation of features or functionality
  • Steady adoption rate of the Management Portal

Telemetry tasks include:

  • Collect and report back on:
    • How many Agents and Agents+ are deployed
    • Which Tasks are executed and how many times

Please refer to the RES ONE Automation Administration Guide for a complete list of the collected data and an illustration of the data provided to RES. Rest assured, no user- or customer-specific data is transmitted to us; this purely allows our developers to focus their efforts on the right areas of the product!

After installing RES ONE Automation 10.1, one Dispatcher will automatically perform the collection once every 30 days, just before sending the usage data. If no communication is possible (for example, due to no Internet access), the telemetry task will not be executed; after a restart of the Dispatcher, another attempt will be made. The telemetry action is logged in the Audit Trail.
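
As a purely illustrative model of that behavior (this is not RES code, and the figures are made up), the cadence boils down to something like this:

    import datetime
    import socket

    INTERVAL = datetime.timedelta(days=30)

    def online(timeout=3) -> bool:
        # Cheap connectivity probe used only for this illustration.
        try:
            socket.create_connection(("8.8.8.8", 53), timeout=timeout).close()
            return True
        except OSError:
            return False

    def maybe_send_telemetry(last_sent: datetime.datetime) -> datetime.datetime:
        """Collect and send usage data if 30 days have passed and we are online;
        otherwise skip, so a later attempt (e.g. after a restart) can retry."""
        now = datetime.datetime.now()
        if now - last_sent < INTERVAL or not online():
            return last_sent                        # nothing sent this time
        usage = {"agents": 250, "agents_plus": 40}  # made-up example figures
        print("sending", usage)
        return now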

Final Thoughts


For me, 10.1 has been a really important release – there are some exciting new functionalities that we’ve added, and it really feels like we are innovating at a great pace. As always, I’d like to use this as a platform to say thanks to such a dedicated team in development, who have pulled together and produced another awesome release to be proud of! I’m looking forward to hearing from our customers and partners about how you are leveraging Parallel Processing in particular!

Keep on Automating,

Grant Tiller (on behalf of the RES Product Management team)