This document details the steps and considerations needed when working with the DATIM4U/DHIS2 Import-Export app for collecting and submitting data into DATIM. An understanding of the general data architecture, dependencies and knowledge of how to revert changes made in the system are required before using this methodology.
The PEPFAR calendar must be consulted for determining correct dates for submission and windows for re-submission.
1. File Validation
Run the data file(s) against the R scripts as defined in the Data Import and Exchange guidance of the DATIM support page. Any issues detected must first be resolved before moving to the next step. The scripts should be run pointing to the applicable DATIM4U environment while running the following checks:
Validate duplicate rows
Validate data elements
Validate OUs (Organizational Units)
Data violation checks
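The authoritative checks are the R scripts referenced above; as an illustration only, the duplicate-row check can be sketched in a few lines of Python. The column names below follow the DHIS2 data value CSV convention and should be adjusted to match the headers of your actual file.

```python
import csv
from collections import Counter
from io import StringIO

# Key columns of a DHIS2-style data value CSV; a row is a duplicate when
# more than one row reports a value for the same combination of these fields.
KEY_COLUMNS = ("dataelement", "period", "orgunit",
               "categoryoptioncombo", "attributeoptioncombo")

def find_duplicate_rows(csv_text):
    """Return the key tuples that occur more than once in the file."""
    reader = csv.DictReader(StringIO(csv_text))
    counts = Counter(tuple(row[c] for c in KEY_COLUMNS) for row in reader)
    return [key for key, n in counts.items() if n > 1]

sample = """dataelement,period,orgunit,categoryoptioncombo,attributeoptioncombo,value
de1,2017Q1,ou1,coc1,aoc1,10
de1,2017Q1,ou1,coc1,aoc1,12
de2,2017Q1,ou1,coc1,aoc1,5
"""
find_duplicate_rows(sample)
# → [('de1', '2017Q1', 'ou1', 'coc1', 'aoc1')]
```

A file that passes this check with an empty result has no duplicated keys; the remaining checks (data elements, OUs, violations) require the reference lists that the R scripts carry.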
2. Importing into DATIM4U
In the DATIM4U environment, using the Import-Export app, you will need to:
Ensure that no data within the file has changed since the data checking process was conducted.
First import using a dry-run to ensure no issues are encountered.
Conduct a full import.
Verify the data is in DATIM4U as expected.
Note: to help ensure data quality, before importing your data, you should first verify that no data has unintentionally been added to the dataset.
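The Import-Export app wraps the DHIS2 Web API, so the dry-run-then-full-import sequence above corresponds to toggling the `dryRun` flag on the `dataValueSets` import endpoint. The sketch below only composes the URLs; the endpoint and parameter names are taken from the DHIS2 Web API and should be verified against the version running in your DATIM4U environment.

```python
from urllib.parse import urlencode

def import_url(base_url, dry_run=True, strategy="CREATE_AND_UPDATE"):
    """Compose the dataValueSets import endpoint with its query string."""
    params = {"dryRun": str(dry_run).lower(), "importStrategy": strategy}
    return f"{base_url}/api/dataValueSets?{urlencode(params)}"

# Dry run first; repeat with dry_run=False for the full import.
import_url("https://datim4u.example.org")
# → 'https://datim4u.example.org/api/dataValueSets?dryRun=true&importStrategy=CREATE_AND_UPDATE'
```

The host name above is a placeholder; point it at the applicable DATIM4U environment.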
3. Approval Process
Use the “Data Approval” app to set the mechanisms’ approval level to “Accepted by Interagency” for the given reporting cycle.
Note: The process for how a country conducts its approvals in DATIM4U may vary depending on in-country procedure. Note, however, that the data remains vulnerable to user manipulation until the approval level is set to Accepted by Interagency.
4. Data Deduplication
Once all mechanisms have been approved to the Interagency level, run the “Data Deduplication” app to deduplicate the reported numbers.
Note: It is imperative to complete the approval process, thereby “locking” the data, before commencing deduplication, as data changed afterward may invalidate the deduplication.
5. Submission to DATIM
When the data in DATIM4U is ready for submission, go into the “Data Approval” app and ensure that all mechanisms for all agencies in the reporting cycle have been approved to “Accepted by Interagency”.
PCO creates a ZenDesk ticket requesting to commence the submission. The ZenDesk ticket should contain the following:
Confirm that the data files were run through the R scripts without errors.
Attach the export from the R script data violation checks. If no violations were received, then state as such.
Request approval to commence submission from your SI Advisor.
Request S/GAC to prepare DATIM to receive your country’s data for the given reporting/target cycle.
The SI Advisor must respond to the ticket with their approval, or provide feedback explaining why approval was not given.
S/GAC will then respond to the ZenDesk ticket stating they have opened DATIM to receive the data.
Once the approvals have been received, you may commence submission using the “Data Export Log” app.
Note: Submissions should not be conducted between the hours of 19:00 and 21:30 EST. This is a temporary measure to avoid potential conflict with other processes running on DATIM.
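The blackout window in the note above is a simple clock check; as an illustration, a helper like the following (a sketch, not part of any DATIM tooling) could guard a scripted submission. It assumes the caller supplies the current time already converted to EST.

```python
from datetime import time

BLACKOUT_START = time(19, 0)   # 19:00 EST
BLACKOUT_END = time(21, 30)    # 21:30 EST

def submission_allowed(now_est):
    """True when the given EST time falls outside the nightly blackout window."""
    return not (BLACKOUT_START <= now_est <= BLACKOUT_END)

submission_allowed(time(20, 0))   # → False (inside the window)
submission_allowed(time(22, 0))   # → True
```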
Modifying Data - Before submission (to DATIM)
There are two ways to modify data in DATIM4U. The simplest is by directly editing the data through the web interface, though this may not be efficient when needing to modify large amounts of data. The second option is by re-importing a modified data file. This section will focus on the second option.
The OGAC-supported method to reimport modified data is not to import “over” the data (using the app’s Update Strategy), but rather to delete the old dataset from DATIM4U and then import the fresh dataset. This is done to ensure data integrity, as importing a file over existing data will not account for or remove any records that have been deleted. In addition, any deduplicated data will remain and cannot be adjusted to account for updates, so it must first be deleted from the system when adjusting applicable indicators. (see additional information regarding deduplication below - "Nightly Deduplication clean-up script")
Using the Import-Export app, select your OU, provide the applicable date range and select the applicable datasets for export. This will provide you an export of the current data in DATIM4U, including the deduplication data (if already conducted).
Once you have the newly exported file, ensure that the approval levels are at “pending”, to avoid encountering issues when importing.
Go back into the Import-Export app and import the exported file using the Delete Strategy, effectively expunging the enclosed data from DATIM4U.
Follow the Submission Workflow from the beginning using the new data file.
Note: If deduplication has not yet been conducted and you are certain that no data has been deleted in the updated data file, you could reimport using the update strategy, but this does not necessarily save much time and introduces a level of uncertainty to the data.
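Behind the Import-Export app, the export-then-delete workflow above maps onto two DHIS2 Web API calls: a `dataValueSets` CSV export for the current data, and a re-import of that same file with the delete strategy. The sketch below only builds the URLs; endpoint and parameter names come from the DHIS2 Web API and the host and dataset UIDs are placeholders.

```python
from urllib.parse import urlencode

BASE = "https://datim4u.example.org"   # placeholder host

def export_url(data_sets, org_unit, start, end):
    """dataValueSets CSV export for the given datasets, OU and date range."""
    params = [("dataSet", ds) for ds in data_sets]
    params += [("orgUnit", org_unit), ("startDate", start),
               ("endDate", end), ("children", "true")]
    return f"{BASE}/api/dataValueSets.csv?{urlencode(params)}"

def delete_import_url():
    """Re-import the exported file with the delete strategy to expunge it."""
    return f"{BASE}/api/dataValueSets?{urlencode({'importStrategy': 'DELETE'})}"

export_url(["ds1"], "ou1", "2017-01-01", "2017-03-31")
# → '.../api/dataValueSets.csv?dataSet=ds1&orgUnit=ou1&startDate=2017-01-01&endDate=2017-03-31&children=true'
```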
Modifying Data - Re-submission or Out-of-Cycle submission (to DATIM)
There may be situations when data needs to be adjusted after it has already been submitted to DATIM. This will require pre-approval and coordination from OGAC.
Note: DATIM4U's “Data Export Log” app used in the submission of data to DATIM only supports the submission of the current quarter and COP planning cycles. If re-submission of a past period is required, S/GAC must be contacted through ZenDesk to determine the best course of action, potentially requiring manual intervention.
Create a ZenDesk ticket stating the need to update data, requiring a "re-submission" into DATIM.
Explicitly request approval from the Country's SI Advisor.
Explicitly request approval from S/GAC DATIM team.
After receiving approval from both S/GAC and the Country SI Advisor, modification of the data in DATIM4U can begin according to the “Modifying Data - Before submission” section above.
When you are at the “Submission” stage, it is important to continue to use the same ZenDesk ticket from step 1 and to state that the country is ready for “re-submission”, asking S/GAC to remove the old data in preparation. Expect to wait one business day for S/GAC to prepare DATIM for the resubmission and proceed only once they have confirmed completion.
Additional considerations when editing data
Dependent environment variables
In order for result or target data to be manipulated, both the approval level and the dataset expiry date must fall within the correct timeframe, providing a window for editing the data. In addition, the deduplication app has an associated deduplication date specified in the “datastore”, which must be within the accepted date range for deduplications to be calculated and subsequently resolved. Steps to manage these aspects are defined in subsequent documentation.
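As an illustration of the datastore-driven date check described above, the sketch below validates that a given day falls inside a configured deduplication window. The DHIS2 datastore is read via `GET /api/dataStore/{namespace}/{key}`, but the namespace, key, and JSON shape used by the deduplication app are assumptions here, not documented schema; only the date comparison itself is shown.

```python
import json
from datetime import date

def dedup_window_open(datastore_json, today):
    """Check whether `today` falls inside the deduplication date range.
    The startDate/endDate keys are an assumed shape for the datastore
    entry, not the deduplication app's documented schema."""
    cfg = json.loads(datastore_json)
    start = date.fromisoformat(cfg["startDate"])
    end = date.fromisoformat(cfg["endDate"])
    return start <= today <= end

sample = '{"startDate": "2017-01-01", "endDate": "2017-03-31"}'
dedup_window_open(sample, date(2017, 2, 15))   # → True
```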
Fine-tuning update files
It should be noted that the exported data files can be fine-tuned so that only applicable data is “surgically” removed when importing using the delete strategy. It is imperative to include any deduplicated data, ensuring it is removed if it exists and is relevant to the indicator or facility being deleted. The deduplication data will appear in your export files with Mechanism ID “000000” and/or “000001”.
Use Case: Issues were noticed in the deduplicated data of the OVC Technical Area for COP16/FY17 targets. The simplest process to resolve this would be to delete only the deduplicated data for OVC and re-do the deduplication activity. Below are the general steps to conduct this process.
- Export the relevant COP16/FY17 data from D4U into a CSV file
- Delete all technical areas other than OVC from the CSV file
- Delete all Mechanisms other than the two deduplication Mechanisms (000000 & 000001) from the CSV file
- Now you have a CSV file that contains only the deduplicated data for OVC. Import this into the COP16/FY17 reporting period using the "delete strategy"
- Run deduplication app and commence deduplication
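The filtering in steps 2–3 of the use case can be done in a spreadsheet or scripted; a minimal Python sketch follows. The column names and the convention that OVC data elements start with “OVC” are assumptions — adjust them to match the headers of your actual D4U export before use.

```python
import csv
from io import StringIO

DEDUP_MECHANISMS = {"000000", "000001"}  # dedup mechanism IDs per the export files

def ovc_dedup_rows(csv_text, element_col="dataelement", mech_col="mechanism"):
    """Keep only rows that are OVC data reported under a dedup mechanism.
    Column names and the 'OVC' name prefix are assumed conventions."""
    reader = csv.DictReader(StringIO(csv_text))
    return [r for r in reader
            if r[mech_col] in DEDUP_MECHANISMS
            and r[element_col].startswith("OVC")]

sample = """dataelement,period,orgunit,mechanism,value
OVC_SERV,2017Q1,ou1,000000,-5
OVC_SERV,2017Q1,ou1,12345,20
TX_NEW,2017Q1,ou1,000000,-3
"""
# Only the first row survives both filters; write the surviving rows back
# out to CSV and import that file with the delete strategy.
```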
Nightly Deduplication clean-up script
DATIM and DATIM4U run a nightly script that compares the time-stamps (dates of creation) of the deduplication values (the negative values created by the deduplication app) against the values causing the duplication. If data for a site that is causing duplication is updated, its time-stamp will be newer than that of the deduplication value; the deduplication value is then no longer valid and will automatically be deleted from the system. In other words, this script helps clean out obsolete deduplication values when the contributing values have changed. After the script has run, the user can open the deduplication app and will be presented with a new set of deduplications to step through, representing the updated values. All deduplications that were not affected remain in the system unchanged.
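The clean-up rule can be sketched as a pure timestamp comparison: a deduplication adjustment is obsolete once any of its contributing values carries a newer timestamp. This is an illustrative sketch of the logic as described above, not the actual script, and the keying of values by site is an assumption.

```python
from datetime import datetime

def stale_dedup_values(dedup_values, source_values):
    """Return the keys of dedup adjustments that are older than their
    contributing values. Both arguments are (key, timestamp) pairs; a
    dedup value sharing a key with a later-updated source is obsolete."""
    latest_source = {}
    for key, ts in source_values:
        latest_source[key] = max(latest_source.get(key, ts), ts)
    return [key for key, ts in dedup_values
            if key in latest_source and latest_source[key] > ts]

dedups = [("site1", datetime(2017, 1, 10)), ("site2", datetime(2017, 1, 10))]
sources = [("site1", datetime(2017, 1, 15)), ("site2", datetime(2017, 1, 5))]
stale_dedup_values(dedups, sources)
# → ['site1']  (site1's source was updated after its dedup was calculated)
```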