4U nodes supported in OUs (Operating Units) manage their own site lists. The 4U site list contains all sites that the node needs for PEPFAR reporting; it may also contain sites used for other business or reporting needs. PEPFAR subscribes to the OU node's list and assigns a unique PEPFAR ID that allows a 4U instance to map this metadata element to a site record, and to communicate that PEPFAR ID back to PEPFAR at reporting time.
When new site data is entered into a 4U site list, the system searches the list to determine whether that site already exists. If it does not, the 4U user can decide to add the site.
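The duplicate check described above could be sketched as a normalized fuzzy match. This is a minimal illustration only; the actual 4U matching rules, threshold, and field names are assumptions, not the agreed design:

```python
from difflib import SequenceMatcher

def normalize(name):
    """Lowercase, strip punctuation, and collapse whitespace in a site name."""
    return " ".join(
        "".join(c for c in name.lower() if c.isalnum() or c.isspace()).split()
    )

def find_possible_duplicates(new_site_name, existing_names, threshold=0.85):
    """Return (name, similarity) pairs for existing sites whose normalized
    form closely matches the new name, best match first.  The 0.85 threshold
    is an arbitrary placeholder for illustration."""
    target = normalize(new_site_name)
    matches = []
    for name in existing_names:
        ratio = SequenceMatcher(None, target, normalize(name)).ratio()
        if ratio >= threshold:
            matches.append((name, round(ratio, 2)))
    return sorted(matches, key=lambda m: -m[1])
```

For example, `find_possible_duplicates("St. Mary's Clinic", ["St Marys Clinic", "Hope Hospital"])` flags `St Marys Clinic` as a likely duplicate, so the user can be prompted before adding a new record.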
PEPFAR will subscribe to the site list of specific 4U nodes. When PEPFAR receives a new site from a 4U instance, they will:
What does PEPFAR do when there is an update to a site?
Note: For 4U instances that are not ready to physically support site management, the sites can continue to be managed by OU personnel in the PEPFAR repository.
Question- Are these the right steps?
1) How does the 4U site list for an OU get initially populated? By end users within the OU. Does the initial data need to come from Global? Probably not, since Global received its data from the OU in the first place. That said, if the initial load comes from PEPFAR, we already have code to do that. Sounds good.
If we are doing a bulk load from PEPFAR site data,
New site adding process: Screen mockups https://docs.google.com/document/d/1WuEOsYJhLw-dqk1FnadOFwqylsPlf90Y13huqy08dvQ/edit#heading=h.wy9rbj8467ht
Other DATIM site management functions are the same as the current DATIM process. This documentation will need to be updated for 4U: uml-sequence.png
A process has been developed for PEPFAR to subscribe to 4U site data. The process runs automatically; an administrator performs a one-time configuration to set up the exchange.
title DATIM Site Workflow
participant 4U-DHIS2
participant 4U-InfoMan
participant 4U-IL
participant P-IL
participant P-InfoMan
participant P-DHIS2
note over 4U-DHIS2, 4U-IL: DATIM4U
note over P-IL, P-DHIS2: PEPFAR
loop Automated job to populate 4U ILR Site document
    4U-DHIS2->4U-InfoMan: Automated cron job to extract 4U site data
end
loop Automated sync of subscribed sites from node to PEPFAR
    P-IL->P-InfoMan: Initiate request to update the subscription of sites from a 4U instance
    P-InfoMan->P-IL: Generate query to get site data from 4U
    P-IL->P-IL: Log request headers
    P-IL->4U-IL: Send the query to the specific 4U instance
    4U-IL->4U-IL: Log request headers
    4U-IL->4U-InfoMan: Send the query to 4U InfoMan
    4U-InfoMan->4U-IL: Return query results
    4U-IL->4U-IL: Log result headers
    4U-IL->P-IL: Send results of query to PEPFAR
    P-IL->P-IL: Log result headers
    P-IL->P-InfoMan: Publish results to PEPFAR InfoMan
end
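The first loop above (the cron job that populates the 4U ILR site document) is essentially an export of DHIS2 organisation units as CSD facility entries. A minimal sketch of that transformation follows; the `entityID` scheme and the flattened facility shape are assumptions for illustration, not the full CSD schema:

```python
from xml.sax.saxutils import escape

# CSD namespace from the IHE CSD profile
CSD_NS = "urn:ihe:iti:csd:2013"

def org_units_to_csd(org_units):
    """Render DHIS2 organisation units (dicts with 'id' and 'name', as
    returned by the DHIS2 Web API) as simplified CSD <facility> entries
    suitable for an ILR update request."""
    entries = []
    for ou in org_units:
        # Hypothetical entityID scheme: derive a URN from the DHIS2 UID.
        entity_id = "urn:dhis2:organisationUnit:%s" % ou["id"]
        entries.append(
            '<facility entityID="%s"><primaryName>%s</primaryName></facility>'
            % (escape(entity_id), escape(ou["name"]))
        )
    return '<facilityDirectory xmlns="%s">%s</facilityDirectory>' % (
        CSD_NS, "".join(entries)
    )
```

A cron job would fetch the organisation units from the DHIS2 Web API, run them through a transformation like this, and post the result to the InfoMan update endpoint.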
4U-<OU1> Site data from 1 of N 4U nodes managing their own sites
4U-<OU2> Site data from 1 of N 4U nodes managing their own sites
How is the data merged? Need screenshots and directions.
Modify the new app being developed by Jembi to
This is an automated process that is configured once, either by an administrator or by the 4U install, and then runs automatically.
title DATIM Site Workflow
participant 4U-DHIS2
participant 4U-InfoMan
participant 4U-IL
participant P-IL
participant P-InfoMan
participant P-DHIS2
note over 4U-DHIS2, 4U-IL: DATIM4U
note over P-IL, P-DHIS2: PEPFAR
loop Automated job to populate PEPFAR's ILR Site document
    P-DHIS2->P-InfoMan: Automated cron job to extract PEPFAR site data
end
loop Automated sync to move subscribed IDs from PEPFAR to 4U ILR
    4U-IL->4U-InfoMan: Initiate request for subscription update to site IDs
    4U-InfoMan->4U-IL: Generate query to get site IDs from PEPFAR
    4U-IL->4U-IL: Log request headers
    4U-IL->P-IL: Send the query to the PEPFAR instance
    P-IL->P-IL: Log request headers
    P-IL->P-InfoMan: Send the query to PEPFAR InfoMan
    P-InfoMan->P-IL: Return query results
    P-IL->P-IL: Log result headers
    P-IL->4U-IL: Send results of query to 4U
    4U-IL->4U-IL: Log result headers
    4U-IL->4U-InfoMan: Publish results to 4U InfoMan
end
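The second loop above ends with PEPFAR-assigned site IDs being written back into the 4U node. A sketch of that merge step, with the record field names (`id`, `pepfar_id`) and the mapping shape as assumptions; returning only the records that actually changed means a no-op sync run produces no writes:

```python
def apply_pepfar_ids(local_sites, pepfar_mapping):
    """Merge PEPFAR-assigned site IDs into local 4U site records.

    local_sites    -- list of site dicts, each with a local 'id'
    pepfar_mapping -- dict mapping local 4U site ID -> PEPFAR site ID
                      (hypothetical shape of the subscription result)

    Mutates the matching records and returns only those that changed,
    so an unchanged sync run generates no database writes.
    """
    changed = []
    for site in local_sites:
        new_id = pepfar_mapping.get(site["id"])
        if new_id and site.get("pepfar_id") != new_id:
            site["pepfar_id"] = new_id
            changed.append(site)
    return changed
```

In this sketch, a site that already carries the correct PEPFAR ID is skipped entirely, which matters for the loop-prevention concern discussed below.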
We may or may not need to do this step. It depends upon the strategy we use to import the desired mechanism data. If we only need the site name to import mechanism data, then we do not need this step. Jim will need to help us understand the options for keeping mechanism data aligned.
We also need to be careful that importing the IDs is not a change that will kick off the update cycle with Global again. We want to make sure we don't get into an endless loop of transmitting the same records.
An OpenHIM mediator has been designed to read any outgoing ADX message on the node and ensure that any 4U site IDs in the message are translated to PEPFAR site IDs.
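The core of that translation step could look like the following sketch. Namespaces and the message shape are simplified; the real mediator operates on full ADX payloads inside OpenHIM, and the ID mapping store is assumed:

```python
import xml.etree.ElementTree as ET

def translate_org_units(adx_xml, id_map):
    """Rewrite every orgUnit attribute in an outgoing ADX message from a
    4U site ID to its PEPFAR site ID.  Raises ValueError if a site has no
    mapping, since sending an untranslated ID upstream would be rejected."""
    root = ET.fromstring(adx_xml)
    for elem in root.iter():
        local_id = elem.get("orgUnit")
        if local_id is not None:
            try:
                elem.set("orgUnit", id_map[local_id])
            except KeyError:
                raise ValueError(
                    "no PEPFAR ID mapped for 4U site %r" % local_id
                )
    return ET.tostring(root, encoding="unicode")
```

Failing loudly on an unmapped site (rather than passing the 4U ID through) surfaces subscription gaps before the data reaches PEPFAR.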
Workflow - Export Aggregate Data