The first is a tool that identifies every journal which has a remote journal defined against it and displays them in a list. Against each journal in the list you can run a number of options:
- Activate the remote journal link
- Deactivate the remote journal link
- Clean up the receivers attached to the local journal
- Display the remote journal details
- Work with the local journal attributes
- Remove the remote journal link
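For anyone who wants to see what such options involve under the covers, they correspond to standard IBM i commands. A rough sketch (journal, library, and RDB names are placeholders, not the tool's actual implementation):

```
/* Activate or deactivate the remote journal link */
CHGRMTJRN  RDB(REMOTEDB) SRCJRN(MYLIB/MYJRN) +
           RMTJRN(MYLIB/MYJRN) JRNSTATE(*ACTIVE)
CHGRMTJRN  RDB(REMOTEDB) SRCJRN(MYLIB/MYJRN) +
           RMTJRN(MYLIB/MYJRN) JRNSTATE(*INACTVD)

/* Display and work with the journal attributes and receivers */
WRKJRNA    JRN(MYLIB/MYJRN)

/* Remove the remote journal link */
RMVRMTJRN  RDB(REMOTEDB) SRCJRN(MYLIB/MYJRN) RMTJRN(MYLIB/MYJRN)
```

The value of the tool is that it finds the journals with remote journals attached for you and drives these actions from a single list.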
The next tool lets you clean up the receiver chain for any journal. It uses the same program as the option above, but does not rely on the journal being linked to a remote journal, so you can use it against a remote journal as well. The only caveat is to make sure you do not delete receivers that your HA process has not finished with (HA4i does its own clean-up, so it does not need this tool).
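Done by hand, the clean-up ultimately comes down to deleting detached receivers; a minimal sketch with placeholder names:

```
/* List the receiver chain for the journal first */
WRKJRNA    JRN(MYLIB/MYJRN)

/* Delete a detached receiver - never one your HA     */
/* or save process still needs                        */
DLTJRNRCV  JRNRCV(MYLIB/MYRCV0001) DLTOPT(*IGNINQMSG)
```

The tool walks the chain for you and only offers receivers that are safe to remove, which is where the manual approach usually goes wrong.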
A lot of people have trouble creating a complete journal environment, especially when it needs a remote journal attached. We have built a tool that creates both the local and the remote journal from a few passed-in parameters. We follow a naming convention, so it is important that you understand how the journal objects are defined. The tool pulls back the remote system information based on the RDBDIRE you pass in and uses it to connect to the remote system, build the remote journal objects, and attach them to the local journal. It requires that the utilities we provide are installed and running on the remote system for the remote journal build.
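For context, the steps the tool automates look roughly like the following when done manually. This is a hedged sketch with placeholder names, not our tool's exact parameter choices:

```
/* Local receiver and journal */
CRTJRNRCV  JRNRCV(MYLIB/MYRCV0001) THRESHOLD(1500000)
CRTJRN     JRN(MYLIB/MYJRN) JRNRCV(MYLIB/MYRCV0001) +
           MNGRCV(*SYSTEM) DLTRCV(*NO)

/* Register the remote journal over the RDB entry, */
/* then activate the link                          */
ADDRMTJRN  RDB(REMOTEDB) SRCJRN(MYLIB/MYJRN) +
           RMTJRN(MYLIB/MYJRN) RMTJRNTYPE(*TYPE2)
CHGRMTJRN  RDB(REMOTEDB) SRCJRN(MYLIB/MYJRN) +
           RMTJRN(MYLIB/MYJRN) JRNSTATE(*ACTIVE)
```

Doing this by hand for each environment is where the naming and sequencing mistakes creep in, which is exactly what the tool is meant to remove.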
The next thing you need to do is journal the objects to the journal, so we built a tool for that as well. The front-end command selects the objects to be journalled based on a number of parameters, and even the name can be generic, which adds a lot of flexibility. Running individual journal commands is hard work when you are trying to journal everything in a library; this tool takes the hassle out of it by running the required command for each object found in the listing.
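The individual commands being wrapped are the standard start-journaling ones; a sketch with placeholder names:

```
/* Journal a single physical file */
STRJRNPF   FILE(MYLIB/MYFILE) JRN(MYLIB/MYJRN) +
           IMAGES(*BOTH) OMTJRNE(*OPNCLO)

/* On current releases STRJRNLIB can cover a whole  */
/* library, including objects created later         */
STRJRNLIB  LIB(MYLIB) JRN(MYLIB/MYJRN)
```

Repeating the first command for hundreds of files is the "hard work" mentioned above; the tool runs the appropriate command per object type from its generated listing.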
The last tool to get the dust blown off was something we wrote many years ago, when journalling was in its infancy and everyone wanted to know how much traffic would flow between the systems in an HA set-up. The tool needed some tweaking, as it had filters and other capabilities built in that no longer make sense with the way remote journalling works today. (Remote journalling did not exist back then, so filtering was carried out on the source to reduce traffic and the overhead of sending the data. Today remote journalling is so fast and efficient that most people are happy to send everything.)
The data collected can be used to build graphs and reports about the load the remote journal is adding to the network, so issues can be identified before they bring the network to a halt. The following is a screen shot of some of the data we captured in our tests; the data goes back a few days even though we only started the collection today. The most we loaded onto the network in a one-hour period was just over 5GB, and the maximum rate was around 24Mb per second (megabits), which is an uncompressed figure, so the actual network load could be lower. These systems are connected over a 1Gb link, so no real stress was added to the network by nearly 12 million file updates.
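If you want a one-off estimate without a collection tool, the journal entries themselves can be dumped and summed. A hedged sketch with placeholder names:

```
/* Dump journal entries to an outfile; ENTDTALEN(*CALC)  */
/* keeps the full entry data length                      */
DSPJRN     JRN(MYLIB/MYJRN) OUTPUT(*OUTFILE) +
           OUTFILFMT(*TYPE4) OUTFILE(QTEMP/JRNENT) +
           ENTDTALEN(*CALC)

/* The JOENTL field in the outfile gives each entry's    */
/* length; summing it per hour approximates the traffic  */
```

This only approximates what crosses the wire (remote journaling adds its own framing), which is why a purpose-built collector gives more useful trend data.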
This completes the utilities collection for now; we will continue internal testing before we package them up ready for users to try out. If we think of any other tools we will add them if appropriate, or if you have a need, let us know and we will see if we can develop it for you.
If you would like to know more about the tools or see a demo let us know.