Multi-server Configurations


Caché or Ensemble


A single Deltanji instance can manage remote code located on more than one Caché or Ensemble node by using InterSystems' Enterprise Cache Protocol (ECP). Deltanji itself can be installed on one of the nodes where managed code is located, or on a dedicated node set up to host Deltanji. Either way, the node hosting Deltanji lies at the centre of the ECP network used by Deltanji. Outbound ECP connections from the Deltanji node link to each of the other nodes where code is to be managed. In InterSystems terminology the Deltanji node is an ECP application server and the other nodes are ECP data servers. Each of those nodes must also have an ECP connection back to the Deltanji node, i.e. they are also ECP application servers connecting to the Deltanji node as their ECP data server. The result is a 'hub-and-spokes' topology, with a pair of links on each spoke providing bidirectional connectivity.

Note: Some InterSystems license keys do not include ECP. Check with your InterSystems account manager if you are unsure about the status of yours.

The following instructions assume all the nodes are hosted on the same type of operating system (i.e. all Windows, or all UNIX/Linux, or all OpenVMS). If you need to deploy Deltanji in a heterogeneous environment consult Deltanji Support for additional guidance.

1. Install Deltanji on its node by following the instructions in the other sections of this book.

  • When installing program files, place them if possible on a network share that all nodes in your Deltanji network can access using the same name. For example, if your nodes run on Windows, share the directory where you place the unpacked software kit, e.g. as \\deltanjihost\deltanjikit.
  • When entering the installation directory path as you run the installation script, specify the network share name rather than the local directory path. Note that if your host OS is Windows, the Caché or Ensemble service needs access to the share, i.e. it cannot run under the LocalSystem account.

2. Enable the %Service_ECP service on all nodes, so they can all act as ECP data servers. Make sure the Deltanji node will be able to accept enough concurrent incoming ECP connections; if necessary, increase the maximum number of ECP application servers it will accept (the MaxServers setting in its ECP configuration).
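
If you prefer to script this step rather than use the Management Portal, the following ObjectScript fragment is a minimal sketch. It assumes the standard Security.Services and Config.config classes in the %SYS namespace; the MaxServers value shown is only an illustration, so size it for your own network.

    ; Run in the %SYS namespace on each node.
    new props,sc
    ; Enable the ECP service so this node can act as an ECP data server
    set props("Enabled")=1
    set sc=##class(Security.Services).Modify("%Service_ECP",.props)
    if 'sc write "Could not enable %Service_ECP",!
    ; On the Deltanji node only: raise the limit on incoming ECP connections
    kill props
    set props("MaxServers")=8        ; illustrative value; a restart may be needed
    set sc=##class(Config.config).Modify(.props)
    if 'sc write "Could not update MaxServers",!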

3. On each node apart from the one hosting Deltanji, define the following using the Management Portal (a scripted alternative is sketched after this list):

  • An ECP connection to the Deltanji node. For consistency use the same connection name on each node. The suggested name is DELTANJI.
  • A local database called DELTANJI-LOCAL.
  • A namespace called DELTANJI-LOCAL that uses the DELTANJI-LOCAL database as its default database for globals and routines.
  • A remote database definition named DELTANJI giving access to the Deltanji database on the Deltanji node. That database is normally also named DELTANJI.
  • A namespace that uses the DELTANJI remote database as its default database for globals, and uses the DELTANJI-LOCAL database as its default for routines. The namespace name must match the one on the Deltanji node, which is normally called DELTANJI.
  • The following mappings on the %ALL pseudo-namespace:
    • Global %gjtask to the DELTANJI-LOCAL database.
    • Globals %vc* to the DELTANJI database (remote).
    • Globals %vc.Stud* to the DELTANJI-LOCAL database.
    • Routines %vc* to the DELTANJI-LOCAL database.
    • Package VCmStudio to the DELTANJI-LOCAL database.
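
As an alternative to the Management Portal, the definitions in step 3 can be scripted from a Terminal session in the %SYS namespace. The fragment below is a minimal sketch using the standard Config.* classes; the host name, port and directory paths are placeholders, it assumes the physical DELTANJI-LOCAL database file already exists (or is created separately), and error checking of the returned %Status values is omitted for brevity.

    ; Run in %SYS on a managed node (not the Deltanji node).
    new p,sc
    ; ECP connection to the Deltanji node (placeholder address/port)
    set p("Address")="deltanjihost",p("Port")=1972
    set sc=##class(Config.ECPServers).Create("DELTANJI",.p)
    ; Local database and namespace
    kill p
    set p("Directory")="c:\databases\deltanji-local\"
    set sc=##class(Config.Databases).Create("DELTANJI-LOCAL",.p)
    kill p
    set p("Globals")="DELTANJI-LOCAL",p("Routines")="DELTANJI-LOCAL"
    set sc=##class(Config.Namespaces).Create("DELTANJI-LOCAL",.p)
    ; Remote database on the Deltanji node, plus the DELTANJI namespace
    kill p
    set p("Server")="DELTANJI",p("Directory")="c:\databases\deltanji\"
    set sc=##class(Config.Databases).Create("DELTANJI",.p)
    kill p
    set p("Globals")="DELTANJI",p("Routines")="DELTANJI-LOCAL"
    set sc=##class(Config.Namespaces).Create("DELTANJI",.p)
    ; Mappings in the %ALL pseudo-namespace
    kill p
    set p("Database")="DELTANJI-LOCAL"
    set sc=##class(Config.MapGlobals).Create("%ALL","%gjtask",.p)
    set sc=##class(Config.MapGlobals).Create("%ALL","%vc.Stud*",.p)
    set sc=##class(Config.MapRoutines).Create("%ALL","%vc*",.p)
    set sc=##class(Config.MapPackages).Create("%ALL","VCmStudio",.p)
    kill p
    set p("Database")="DELTANJI"
    set sc=##class(Config.MapGlobals).Create("%ALL","%vc*",.p)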

4. On the Deltanji node, export the routines %vc*.INT from the DELTANJI namespace. Then import them into the just-created DELTANJI-LOCAL namespace on each of the other nodes.
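
The export and import can be done from Terminal sessions using the standard $system.OBJ calls, for example (the file path below is a placeholder):

    ; On the Deltanji node, in the DELTANJI namespace:
    do $system.OBJ.Export("%vc*.INT","c:\temp\vc-routines.xml")
    ; On each of the other nodes, in the DELTANJI-LOCAL namespace
    ; (copy the file to the node first, or use a network share):
    do $system.OBJ.Load("c:\temp\vc-routines.xml","ck")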

5. In a Terminal session on each of those nodes, switch to the DELTANJI-LOCAL namespace and run the command d Cache5^%vcins(). When prompted, confirm the loading of the classes, templates and add-ins.

6. In the DELTANJI-LOCAL namespace, start the Deltanji task server process with the command d fast1^%vczn. Also add code to SYSTEM^%ZSTART in the %SYS namespace so that fast1^%vczn runs in the DELTANJI-LOCAL namespace whenever the instance starts, as sketched below.
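
One way to arrange the automatic start is the SYSTEM entry point of a %ZSTART routine in the %SYS namespace. The sketch below assumes no %ZSTART routine exists yet; if one does, merge the namespace switch and the call to fast1^%vczn into its existing SYSTEM section.

%ZSTART ; user-defined instance startup hooks
SYSTEM  ; executed once each time the instance starts
        new $namespace
        set $namespace="DELTANJI-LOCAL"
        do fast1^%vczn        ; start the Deltanji task server
        quit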

7. On the Deltanji node, define an ECP connection to each of the other nodes. If possible, name these connections to match the hostname of the computer where the target Caché or Ensemble instance runs. If your connections involve multiple instances of Caché or Ensemble running on the same host you will need to pick alternative names for your ECP connections. In such cases it is recommended that you also set the CliSysName parameter of each instance to match. For instance, if your Deltanji node needs to connect to two instances called QA and UAT that both run on a host called TESTING, you could name the two ECP connections QA and UAT, setting the CliSysName of the QA instance to 'QA' and that of the UAT instance to 'UAT'. Consult your InterSystems documentation for where to set this parameter. Changes to CliSysName only take effect when the instance is restarted.
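
These definitions can also be scripted. The sketch below uses the TESTING example above; the address and ports are placeholders. The first fragment runs in %SYS on the Deltanji node. The second fragment runs on the QA instance itself (and similarly on UAT) and assumes CliSysName is exposed through the Config.config class; remember that it only takes effect after a restart.

    ; On the Deltanji node: one ECP connection per managed instance
    new p,sc
    set p("Address")="TESTING",p("Port")=1972
    set sc=##class(Config.ECPServers).Create("QA",.p)
    kill p
    set p("Address")="TESTING",p("Port")=1973
    set sc=##class(Config.ECPServers).Create("UAT",.p)

    ; On the QA instance (assumption: CliSysName is a Config.config property)
    kill p
    set p("CliSysName")="QA"
    set sc=##class(Config.config).Modify(.p)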

GT.M


Multi-server configurations are not currently supported on GT.M.