DRAFT

High-level scenario 

4U nodes supported in OUs (Operating Units) manage their own site list. The 4U site list will contain all sites that the node needs to use in PEPFAR reporting; it may also contain sites that are used for other business or reporting needs. PEPFAR subscribes to the OU node's list and provides a unique PEPFAR ID that allows a 4U instance to map this metadata element to a site record. This allows a 4U instance to communicate a PEPFAR ID back to PEPFAR at the time of reporting.
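As a concrete illustration only (the draft does not specify the representation), the PEPFAR ID could be carried on the 4U site record as a DHIS2 attribute value. The attribute UID and both IDs below are hypothetical placeholders.

```typescript
// Hedged sketch of a 4U site record carrying its assigned PEPFAR ID as a DHIS2
// attribute value. The attribute UID and both IDs are hypothetical placeholders.
interface AttributeValue {
  attribute: { id: string };
  value: string;
}

interface SiteRecord {
  id: string;                 // local 4U DHIS2 organisation unit UID
  name: string;
  attributeValues: AttributeValue[];
}

const exampleSite: SiteRecord = {
  id: "localSiteUid01",
  name: "Example Health Facility",
  attributeValues: [
    { attribute: { id: "pepfarIdAttr01" }, value: "PEPFAR-000123" },
  ],
};
```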

When new site data is entered into a 4U site list, the system searches to determine if that site already exists in the 4U site list. If the site does not exist, the 4U user can decide to add the site.
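A minimal sketch of what that duplicate check could look like against the DHIS2 Web API is below. The matching rule (a case-insensitive name filter) is an assumption, not the confirmed DATIM matching algorithm, and `baseUrl`/`auth` are placeholders.

```typescript
// Hedged sketch: query the 4U DHIS2 instance for organisation units with a similar
// name before adding a new site. Matching on name alone is an assumption.
async function findPossibleDuplicates(baseUrl: string, auth: string, siteName: string) {
  const url =
    `${baseUrl}/api/organisationUnits.json` +
    `?filter=name:ilike:${encodeURIComponent(siteName)}` +
    `&fields=id,name,parent[name]&paging=false`;
  const res = await fetch(url, { headers: { Authorization: `Basic ${auth}` } });
  if (!res.ok) throw new Error(`DHIS2 query failed: ${res.status}`);
  const body = (await res.json()) as { organisationUnits: { id: string; name: string }[] };
  return body.organisationUnits; // empty => no likely duplicate, the user may add the site
}
```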

PEPFAR will subscribe to the site list of specific 4U nodes. When PEPFAR receives a new site from a 4U instance, they will (see the sketch after this list):

  1.  Determine if the site is a PEPFAR reporting site
    1. No - Stop
    2. Yes - Determine if the site is already in the PEPFAR site list 
      1. No - Add site and PEPFAR site id 
      2. Yes - ???
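A sketch of this decision flow is below. The three helper callbacks are hypothetical, and the branch for a site that already exists is left as a TODO because it is still an open question in this draft.

```typescript
// Hedged sketch of the decision flow above; the helper callbacks are hypothetical.
interface IncomingSite {
  localId: string;
  name: string;
}

async function handleIncomingSite(
  site: IncomingSite,
  deps: {
    isPepfarReportingSite: (s: IncomingSite) => Promise<boolean>;
    findInPepfarSiteList: (s: IncomingSite) => Promise<string | null>; // PEPFAR ID or null
    addSiteWithPepfarId: (s: IncomingSite) => Promise<string>;         // returns new PEPFAR ID
  },
): Promise<void> {
  if (!(await deps.isPepfarReportingSite(site))) return;   // 1. No  -> stop
  const existingId = await deps.findInPepfarSiteList(site);
  if (existingId === null) {
    await deps.addSiteWithPepfarId(site);                  // 2.1 No  -> add site + PEPFAR site id
  } else {
    // 2.2 Yes -> behaviour not yet decided in this draft ("???")
  }
}
```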

What does PEPFAR do when there is an update to a site?  

Note:  For 4U instances that are not ready to physically support site management, the sites can continue to be managed by OU personnel in the PEPFAR repository.  

System Actors

DATIM4U 

DATIM Global 

 

Question- Are these the right steps?  

STEP 1 - Initialize Node with Site data? 

Questions:

1) How does the 4U site list for an OU get initially populated? By end users within the OU. Does the initial data come from Global? Not sure it matters, because Global received their data from the OU in the first place. If the initial load comes from PEPFAR, we already have code to do that. Sounds good.

If we are doing a bulk load from PEPFAR site data, 
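one hedged option is to push the PEPFAR-provided site list through the standard DHIS2 metadata import endpoint, as sketched below. The endpoint and required organisation unit fields follow the DHIS2 Web API; the credentials and the source of the `sites` list are assumptions.

```typescript
// Hedged sketch of a bulk load of PEPFAR-provided sites into a 4U DHIS2 instance
// via the standard metadata import endpoint. Credentials and data source are
// placeholders; error handling is minimal on purpose.
interface ImportSite {
  id: string;
  name: string;
  shortName: string;
  openingDate: string;       // DHIS2 requires an opening date for organisation units
  parent?: { id: string };
}

async function bulkLoadSites(baseUrl: string, auth: string, sites: ImportSite[]): Promise<void> {
  const res = await fetch(`${baseUrl}/api/metadata`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Basic ${auth}` },
    body: JSON.stringify({ organisationUnits: sites }),
  });
  if (!res.ok) throw new Error(`Metadata import failed: ${res.status}`);
}
```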

 

STEP 2 - Sites are managed at the 4U node using DHIS2 functionality and a DHIS2 app.

New site adding process:  Screen mockups https://docs.google.com/document/d/1WuEOsYJhLw-dqk1FnadOFwqylsPlf90Y13huqy08dvQ/edit#heading=h.wy9rbj8467ht

Other DATIM site management functions are the same as the current DATIM process. This documentation will need to be updated for 4U: uml-sequence.png

STEP 3 - PEPFAR subscribes to 4U site data

A process has been developed for PEPFAR to subscribe to 4U site data. The process runs automatically; an administrator does a one-time configuration to set up the exchange.

 

title DATIM Site Workflow
participant 4U-DHIS2
participant 4U-InfoMan
participant 4U-IL
participant P-IL
participant P-InfoMan
participant P-DHIS2
note over 4U-DHIS2, 4U-IL :DATIM4U
note over P-IL, P-DHIS2 :PEPFAR 
loop Automated job to populate 4U ILR Site document 
	4U-DHIS2->4U-InfoMan: [1] automated cron job to extract 4U site data 
end
loop Automated sync subscribed Sites from Node to PEPFAR 
    P-IL-> P-InfoMan: [2] Initiate request to update the site subscription for a 4U instance
	P-InfoMan-> P-IL: [3] Generate Query to get site data from 4U	
	P-IL-> P-IL:  [4] Log request headers
	P-IL->4U-IL: [5] Send the query to specific 4U instance
	4U-IL-> 4U-IL:  [6] Log request headers
	4U-IL->4U-InfoMan: [7] Send the query to 4U InfoMan
	4U-InfoMan->4U-IL: [8] Return query results
	4U-IL-> 4U-IL:  [9] Log result headers
	4U-IL->P-IL: [10] Send results of query to PEPFAR
	P-IL-> P-IL:  [11] Log result headers
	P-IL->P-InfoMan:  [12] Publish results to PEPFAR InfoMan
end
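A rough sketch of the automated sync loop (steps [2] through [12] above) is below, assuming the PEPFAR interoperability layer pulls site updates from a 4U node over HTTPS and publishes them to the PEPFAR InfoMan. The endpoint paths are placeholders rather than the actual OpenHIM/OpenInfoMan routes, and the `since` watermark handling is illustrative.

```typescript
// Hedged sketch of the PEPFAR-side sync: query a 4U node for site changes and
// publish the results to the PEPFAR InfoMan. All paths and parameters are placeholders.
async function syncSitesFromNode(nodeIlBase: string, pepfarInfomanBase: string, since: string) {
  // [5] send the query to the specific 4U instance via its interoperability layer
  const res = await fetch(`${nodeIlBase}/sites?updatedSince=${encodeURIComponent(since)}`);
  if (!res.ok) throw new Error(`4U node query failed: ${res.status}`);
  const sites = await res.json(); // [8]/[10] query results returned to PEPFAR

  // [12] publish the results to the PEPFAR InfoMan
  const publish = await fetch(`${pepfarInfomanBase}/sites`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(sites),
  });
  if (!publish.ok) throw new Error(`Publish to InfoMan failed: ${publish.status}`);
}
```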

STEP 4 - Prepare PEPFAR ILR data for import into PEPFAR DHIS2 


  

4U-<OU1>  Site data from 1 of N 4U nodes managing their own sites 

4U-<OU2>  Site data from 1 of N 4U nodes managing their own sites

How is the data merged?  Need screen shots and directions.  
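One possible shape for the merge, as an assumption rather than the documented process (screenshots and directions are still needed, as noted above): concatenate the per-OU documents and de-duplicate on PEPFAR ID when one has already been assigned, otherwise key each record by (OU, local ID).

```typescript
// Hedged sketch of merging per-OU site lists into a single "DATIM-Merged" list.
// The de-duplication keys and overwrite rule are assumptions.
interface OuSite {
  ou: string;        // which 4U node the record came from
  localId: string;   // the node's local site ID
  name: string;
  pepfarId?: string; // present once a PEPFAR ID has been assigned
}

function mergeOuSiteLists(lists: OuSite[][]): OuSite[] {
  const byKey = new Map<string, OuSite>();
  for (const list of lists) {
    for (const site of list) {
      const key = site.pepfarId ?? `${site.ou}:${site.localId}`;
      byKey.set(key, site); // later documents overwrite earlier ones for the same key
    }
  }
  return [...byKey.values()];
}
```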

STEP 5 - Import subscribed data into PEPFAR DHIS2 

Modify the new app being developed by Jembi to 

  1. Find sites in DATIM-Merged that have been added or updated since the last import.
  2. For each site, run the matching algorithm to make sure the site doesn't already exist (see the sketch after this list).
  3. If the site is a PEPFAR reporting site, add it and add the PEPFAR ID???
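A sketch of the matching step (2) above is below. The normalisation and exact-match rule are assumptions; the production matching algorithm may use fuzzier comparison and the parent hierarchy as well.

```typescript
// Hedged sketch of a name-based duplicate check for step 2. The normalisation and
// strict equality here are assumptions about the matching algorithm.
function normalise(name: string): string {
  return name.toLowerCase().replace(/[^a-z0-9]+/g, " ").trim();
}

function alreadyExists(incomingName: string, existingNames: string[]): boolean {
  const target = normalise(incomingName);
  return existingNames.some((n) => normalise(n) === target);
}
```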

 

 

Step 6 - 4U node subscribes to Site IDs added at DATIM Global

This is an automated process that is configured once by an administrator or by the 4U install and then runs automatically.  
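One way the recurring job could be scheduled after that one-time configuration is sketched below; node-cron is only an example, the actual install may rely on a system cron entry instead, and `syncSiteIdsFromPepfar` is a hypothetical wrapper around steps [2]-[12] of the diagram that follows.

```typescript
// Hedged sketch: schedule the subscription sync with node-cron. The schedule,
// library choice, and syncSiteIdsFromPepfar() body are all placeholders.
import cron from "node-cron";

async function syncSiteIdsFromPepfar(): Promise<void> {
  // placeholder for the query/publish round trip shown in the sequence diagram below
}

// run hourly; the real schedule is a deployment decision
cron.schedule("0 * * * *", () => {
  syncSiteIdsFromPepfar().catch((err) => console.error("Site ID sync failed:", err));
});
```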

title DATIM Site Workflow
participant 4U-DHIS2
participant 4U-InfoMan
participant 4U-IL
participant P-IL
participant P-InfoMan
participant P-DHIS2
note over 4U-DHIS2, 4U-IL :DATIM4U
note over P-IL, P-DHIS2 :PEPFAR 
loop Automated job to populate PEPFARs ILR Site document 
	P-DHIS2->P-InfoMan: [1] automated cron job to extract PEPFAR site data 
end
loop Automated sync to move subscribed IDs from PEPFAR to 4U ILR
    4U-IL-> 4U-InfoMan: [2] Initiate request for subscription update to Site IDs
	4U-InfoMan-> 4U-IL: [3] Generate Query to get site IDs from PEPFAR
	4U-IL-> 4U-IL:  [4] Log request headers
	4U-IL->P-IL: [5] Send the query to PEPFAR instance
	P-IL-> P-IL:  [6] Log request headers
	P-IL->P-InfoMan: [7] Send the query to PEPFAR InfoMan
	P-InfoMan->P-IL: [8] Return query results
	P-IL-> P-IL:  [9] Log result headers
	P-IL->4U-IL: [10] Send results of query to 4U
	4U-IL-> 4U-IL:  [11] Log result headers
	4U-IL->4U-InfoMan:  [12] Publish results to 4U InfoMan
end



 

Step 7 - Import the PEPFAR IDs into 4U DHIS2 

We may or may not need to do this step.  It depends upon the strategy we use to import the desired mechanism data.  If we only need the site name to import mechanism data, then we do not need this step.  Jim will need to help us understand the options for keeping mechanism data aligned.  

We also need to be careful that importing the IDs is not a change that will kick off the update cycle with Global again. We want to make sure we don't get into an endless loop of transmitting the same records.
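One hedged way to guard against that loop is to write the PEPFAR ID only when it actually changes, so a repeat import produces no update for the next outbound sync to pick up. The field names below are hypothetical.

```typescript
// Hedged sketch of a loop guard for the PEPFAR ID import: only sites whose stored
// PEPFAR ID differs from the newly assigned one are written back to 4U DHIS2.
interface LocalSite {
  localId: string;
  pepfarId?: string;
}

function sitesNeedingPepfarId(
  localSites: LocalSite[],
  assignedIds: Map<string, string>, // localId -> PEPFAR ID assigned at Global
): LocalSite[] {
  return localSites.filter((site) => {
    const assigned = assignedIds.get(site.localId);
    return assigned !== undefined && assigned !== site.pepfarId; // write only real changes
  });
}
```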

Step 8 - Translate 4U IDs in the ADX message to PEPFAR IDs

An OpenHIM mediator has been designed to read any outgoing ADX message on the node and ensure that any 4U site IDs in the message are translated to PEPFAR site IDs.
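A minimal sketch of that translation is below, assuming the 4U IDs appear in orgUnit attributes of the ADX XML and that a 4U-to-PEPFAR ID map is available to the mediator; unmapped IDs are left untouched here, though the real mediator may instead reject the message.

```typescript
// Hedged sketch: rewrite orgUnit attributes in an outgoing ADX payload from 4U site
// IDs to PEPFAR site IDs using a lookup map. IDs with no mapping are left as-is.
function translateAdxOrgUnits(adxXml: string, idMap: Map<string, string>): string {
  return adxXml.replace(/orgUnit="([^"]+)"/g, (match, localId: string) => {
    const pepfarId = idMap.get(localId);
    return pepfarId ? `orgUnit="${pepfarId}"` : match;
  });
}
```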

Workflow - Export Aggregate Data