Multi-server Configurations


Caché or Ensemble


A single Deltanji instance can manage code located on more than one Caché or Ensemble node by using InterSystems' Enterprise Cache Protocol (ECP). Deltanji itself can be installed on one of the nodes where managed code is located, or on a dedicated node set up to host Deltanji. Either way, the node hosting Deltanji lies at the centre of the ECP network used by Deltanji. Outbound ECP connections from the Deltanji node link it to each of the other nodes where code is to be managed; in InterSystems terminology the Deltanji node is an ECP application server and the other nodes are ECP data servers. Each of those other nodes must also have an ECP connection back to the Deltanji node, i.e. each is also an ECP application server connecting to the Deltanji node as its ECP data server. The result is a 'hub-and-spoke' topology, with a pair of links on each spoke providing bidirectional connectivity.

Note: Some InterSystems license keys do not include ECP. On version 2010.1 or later the expression $SYSTEM.License.NetworkEnabled() returns 1 if ECP is licensed. Alternatively, consult your InterSystems account manager.
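For example, the following Terminal command writes 1 when ECP is licensed and 0 when it is not:

 w $SYSTEM.License.NetworkEnabled()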

The following instructions assume all the nodes are hosted on the same type of operating system (i.e. all Windows, or all UNIX/Linux, or all OpenVMS). Consult Deltanji Support for additional guidance if you need to deploy Deltanji in a heterogeneous environment.

1. Install Deltanji on its node, following the instructions in the other sections of this book.

  • When installing program files, if possible place them on a network share that all nodes in your Deltanji network can access using the same name. For example, if your nodes run on Windows, share the directory where you place the unpacked software kit, e.g. as \\deltanjihost\deltanjikit.
  • When entering the installation directory path as you run the installation script, specify the network share name rather than the local directory path. Note that if your host OS is Windows, the Caché or Ensemble service needs access to the share, i.e. it cannot run under the LocalSystem account.

2. Enable the %Service_ECP service on all nodes so that each can act as an ECP data server. Make sure the Deltanji node will be able to accept enough concurrent incoming ECP connections, and if necessary increase the parameter that governs this limit. For added security the ECP service can be configured to accept incoming connections only from specific IP addresses; if you do this, make sure that the connections detailed in the subsequent steps will still be permitted.
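If you prefer to script this step, a minimal sketch is shown below. It assumes the Security.Services class in the %SYS namespace; the same setting is available through the Services security pages in Management Portal.

 zn "%SYS"
 ; allow this instance to act as an ECP data server
 d ##class(Security.Services).Get("%Service_ECP",.props)
 s props("Enabled")=1
 d ##class(Security.Services).Modify("%Service_ECP",.props)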

3. On each node apart from the one hosting Deltanji, define the following using Management Portal:

  • An ECP connection to the Deltanji node. For consistency use the same connection name on each node. The suggested name is DELTANJI.
  • A remote database definition named DELTANJI that uses the ECP connection defined in the previous step to access the Deltanji database (normally named DELTANJI) on the Deltanji node.
  • A local database called DELTANJI-LOCAL.
  • A namespace called DELTANJI-LOCAL that uses the DELTANJI-LOCAL database as its default database for globals and routines. Clear the checkbox that would otherwise create a default web application for the namespace. On Ensemble 2017.1 or later, opt not to Ensemble-enable the namespace.
  • A namespace that uses the DELTANJI remote database as its default database for globals and the DELTANJI-LOCAL database as its default database for routines. The namespace name must match the corresponding namespace on the Deltanji node, normally DELTANJI. Clear the checkbox that would otherwise create a default web application for the namespace. On Ensemble 2017.1 or later, opt not to Ensemble-enable the namespace. When creating this namespace it is advisable to point it initially at the DELTANJI-LOCAL database for both globals and routines, then edit the namespace definition afterwards so that it uses the DELTANJI remote database for globals.
  • The following mappings on the %ALL pseudo-namespace (a scripted sketch follows this list):
    • Global %gjtask to the DELTANJI-LOCAL database.
    • Globals %vc.Stud* to the DELTANJI-LOCAL database.
    • Globals %vc* to the DELTANJI database (remote).
    • Routines %vc* to the DELTANJI-LOCAL database.
    • Package VCmStudio to the DELTANJI-LOCAL database.
    If you do not yet have a %ALL namespace, create it. After entering %ALL as the namespace name, tab off the field so that the options on the page can update (2016.1 or later). If you are required to pick a database, select CACHETEMP (this setting will never be used). If given the option, choose not to create a default web application for it.
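The mapping definitions above can also be scripted. The following is a minimal sketch, assuming the Config.MapGlobals, Config.MapRoutines and Config.MapPackages classes in the %SYS namespace; Management Portal achieves the same result and remains the recommended route.

 zn "%SYS"
 ; map %gjtask and the %vc.Stud* globals to the local database
 s p("Database")="DELTANJI-LOCAL"
 d ##class(Config.MapGlobals).Create("%ALL","%gjtask",.p)
 d ##class(Config.MapGlobals).Create("%ALL","%vc.Stud*",.p)
 ; map the remaining %vc* globals to the remote DELTANJI database
 s p("Database")="DELTANJI"
 d ##class(Config.MapGlobals).Create("%ALL","%vc*",.p)
 ; map the %vc* routines and the VCmStudio package to the local database
 s p("Database")="DELTANJI-LOCAL"
 d ##class(Config.MapRoutines).Create("%ALL","%vc*",.p)
 d ##class(Config.MapPackages).Create("%ALL","VCmStudio",.p)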

4. On the Deltanji node, export the routines %vc*.INT from the DELTANJI namespace, then import them into the newly created DELTANJI-LOCAL namespace on each of the other nodes. Compilation errors during import can be ignored.
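One way to move the routines, sketched below, is with the $system.OBJ export and load methods; the file path is illustrative only, and the export file can equally be transferred and loaded through Studio or Management Portal.

 ; on the Deltanji node, in the DELTANJI namespace
 d $system.OBJ.Export("%vc*.INT","/tmp/vcroutines.xml")

 ; on each of the other nodes, in the DELTANJI-LOCAL namespace
 d $system.OBJ.Load("/tmp/vcroutines.xml","ck")

As noted above, any compilation errors reported during the load can be ignored.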

5. In a Terminal session on each of those nodes, switch to the DELTANJI-LOCAL namespace and run the following command, confirming the loading of the classes, templates and add-ins when prompted: d Cache5^%vcins()
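For example (zn switches the Terminal session to the named namespace):

 zn "DELTANJI-LOCAL"
 d Cache5^%vcins()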

6. In the DELTANJI-LOCAL namespace, start the Deltanji task server process with the command d fast1^%vczn. Also add code to SYSTEM^%ZSTART in the %SYS namespace (or to the ZMIRROR routine if you are using InterSystems mirroring) so that fast1^%vczn is run in the DELTANJI-LOCAL namespace whenever the instance starts. For example:

%ZSTART ;
 q
SYSTEM ;
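 ; log any error to the console log and continue, so that instance startup is not blocked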
 n $et,$es
 s $et="i '$es d $zu(9,"""",""SYSTEM^%ZSTART error: ""_$ze) s $ec="""""
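 ; start the Deltanji task server as a background job in the DELTANJI-LOCAL namespace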
 j fast1^%vczn|"DELTANJI-LOCAL"|
 q

7. On the Deltanji node, define an ECP connection to each of the other nodes. If possible, name each connection to match the hostname of the computer where the target Caché or Ensemble instance runs. If your connections involve multiple instances of Caché or Ensemble running on the same host you will need to pick alternative names for your ECP connections. In such cases it is recommended that you also set the CliSysName parameter of each instance to match. For instance, if your Deltanji node needs to connect to two instances called QA and UAT that both run on a host called TESTING, you could name the two ECP connections QA and UAT, setting the CliSysName of the QA instance to 'QA' and that of the UAT instance to 'UAT'. Consult your InterSystems documentation for where to set this parameter; changes to CliSysName only take effect when the instance is restarted.
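If you want to script the connection definitions as well, a minimal sketch is shown below. It assumes the Config.ECPServers class in the %SYS namespace; the connection name, address and port are examples only, and the equivalent settings are on the ECP configuration pages in Management Portal.

 zn "%SYS"
 ; define an outbound ECP connection to the instance named QA on host TESTING
 s props("Address")="TESTING"
 s props("Port")=1972
 d ##class(Config.ECPServers).Create("QA",.props)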

GT.M


Multi-server configurations are not currently supported on GT.M.