Last Updated 2023/09/01

APPENDIX: DATA PIPELINE SERVICE OVERVIEW AND DEFINITIONS 

1: DATA PIPELINE IMPLEMENTATION PROCESS

  1. The Aclaimant Product Services and Customer Success teams will meet with Customer to kick off the data services aspects of the project. 
    1. During this call, the Product Services team will confirm the contract elements, validate the handling of soft and hard errors, discuss relationships with the Customer's current data providers, review workflows, and lay out the next steps in the implementation. 
    2. During this conversation, the data request letters that will be sent to data sources will be finalized. 
  2. The data exchange process(es) will be set up to ensure the secure exchange of data. 
    1. As each data source is received, it will be thoroughly reviewed, understood, and then mapped to specific Aclaimant data fields. 
    2. Each data source field is mapped to an existing or newly created Aclaimant field. The Product Services team designs and reviews the mapping and obtains the Customer's approval before proceeding.
    3. Once mappings are identified and configuration (if needed) is complete, the load programming can be finalized and thoroughly tested. Aclaimant utilizes a sophisticated data pipeline toolset that runs the entire data conversion process, including load setup, processing, balancing, notification and auditing. 
    4. Customer Commitments: All data files sent to Aclaimant must be encrypted and sent securely (via SFTP or API). The security of Customer data is extremely important to us. 
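For illustration only, the field-mapping step described above can be sketched as a simple declarative translation from source-file columns to Aclaimant fields. All field names below are hypothetical examples, not the actual Aclaimant schema or any specific data provider's layout.

```python
# Illustrative sketch of a source-to-Aclaimant field mapping.
# Every field name here is a hypothetical placeholder.

FIELD_MAPPING = {
    "CLM_NBR": "claim_number",
    "DT_OF_LOSS": "occurred_date",
    "DT_REPORTED": "report_date",
    "CLM_STATUS": "claim_status",
}

def map_record(source_record: dict) -> dict:
    """Translate one source row into Aclaimant field names."""
    return {
        target: source_record.get(source)
        for source, target in FIELD_MAPPING.items()
    }

# Example source row, as it might arrive in a data file:
row = {"CLM_NBR": "A-1001", "DT_OF_LOSS": "2023-01-15",
       "DT_REPORTED": "2023-01-20", "CLM_STATUS": "Open"}
mapped = map_record(row)
```

In practice, the mapping is designed and validated by the Product Services team with Customer approval, as described above; this sketch only shows the general shape of the transformation.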

2: IMPLEMENTATION PROCESS AND SUPPORT

  1. Aclaimant is responsible for project management of new implementations from the internal sales handoff. Estimated timelines are outlined below. Please note that timelines are subject to change based on factors outside of Aclaimant's control, including client responsiveness and requirement or scope changes.
  2. Implementation phases, from kickoff through the external client handoff to Customer Success (CS), include but are not limited to:
    1. Launch – Kicking off the project internally and with the client
    2. Discovery/Design – Determining the client's overall needs and current vs. future state. Designing a comprehensive, configurable solution within Aclaimant system functionality that meets all scoped aspects of the client contract
    3. Build/Validate – Working to fully build and implement the agreed-upon workflow design, with internal and external validations of all aspects approved by the client
    4. Deployment/Training – Building and compiling training materials, including detailed walkthroughs and videos, to meet contracted client needs
    5. Hypercare/Handoff – Overseeing the initial client go-live and providing continued troubleshooting throughout. Compiling all source material and conducting a detailed handoff with the assigned CS resource

3: ONGOING CUSTOMER SUPPORT AND SLAS

  1. Data Ingestion:
    1. The Aclaimant Data Pipeline engine includes a scheduler to automatically determine when a data file is expected. 
    2. Once a file is sent to a secure Aclaimant-hosted site, the automation engine will start to process the data file. 
      1. If a dataset is late, the engine will automatically notify the Aclaimant support team, who will notify the customer contact group via email of the missing dataset. 
      2. The customer contact group can be changed at any time.  
  2. Data Quality:
    1. SLA: Aclaimant will ensure that the ingested data is accurate and error-free.
    2. Measurement: Data quality will be assessed by periodic audits and validation checks against predefined accuracy benchmarks.
    3. Remedy: Should data quality fall below the SLA threshold, the provider will investigate the root cause and implement measures to improve accuracy.
    4. Once the file is received and initially ingested on our staging servers, the Aclaimant Data Pipeline engine confirms that the correct data set is present by checking the name and date embedded in the data file and schedules it for processing. Of note, the data being processed is never posted until all the data is processed and balanced. The system reviews the data load during the validation stage, which may surface hard or soft errors: hard errors stop the process, while soft errors allow the file to process and are populated on a customer data quality dashboard. If the load passes the validation tests, the data load proceeds. Examples of issues that will be caught by this functionality include:
      1. Examples: 
        1. Report Date before Occurred Date
        2. First transaction date before Occurred Date
        3. Policy period with abnormal effective and end dates (400 days apart, or more). These are typically 1 year apart.
        4. Closed Date before Report/Occurred Date
    5. Following a successful data load, an Aclaimant standard set of data quality widgets will populate the data quality dashboard, which is available for viewing by authorized users of the system. These reports identify data issues that are not considered critical in nature (i.e., "soft errors").
      1. Examples include:
        1. Claims that have Invalid Cause of Injury Coding
        2. Claims without a Reported Date
        3. Open Claim with Closed Date
        4. Closed Claim without a Closed Date
  3. Support Response Time:
    1. SLA: Aclaimant commits to responding to and notifying GAIG and the TPA of errors in the ingestion process within one (1) business day. Aclaimant will notify the Customer and TPA project teams via email if TPA/Carrier data is not received or if hard errors have occurred due to data discrepancies that would prevent a data load. 
    2. Remedy: If response times exceed the SLA, we will escalate the issue internally and provide an expedited resolution plan.
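As an illustrative sketch only, the date-consistency checks listed in Section 3 might be expressed as follows. The field names, the 400-day policy-period threshold, and the classification of each check as hard or soft follow the examples in this appendix where stated; everything else (including the function name and record layout) is an assumption for illustration, not contracted functionality.

```python
from datetime import date, timedelta

def validate_claim(claim: dict) -> tuple:
    """Return (hard_errors, soft_errors) for one claim record.

    Hard errors stop the load; soft errors let the file process
    and are surfaced on the customer data quality dashboard.
    Field names are illustrative placeholders.
    """
    hard, soft = [], []

    occurred = claim.get("occurred_date")
    reported = claim.get("report_date")
    closed = claim.get("closed_date")
    status = claim.get("claim_status")

    # Checks caught during the validation stage (Section 3 examples).
    if reported and occurred and reported < occurred:
        hard.append("Report Date before Occurred Date")
    first_txn = claim.get("first_transaction_date")
    if first_txn and occurred and first_txn < occurred:
        hard.append("First transaction date before Occurred Date")
    if closed and occurred and closed < occurred:
        hard.append("Closed Date before Occurred Date")
    eff, end = claim.get("policy_effective"), claim.get("policy_end")
    if eff and end and (end - eff) >= timedelta(days=400):
        hard.append("Policy period with abnormal effective/end dates")

    # Dashboard ("soft error") checks from the examples above.
    if reported is None:
        soft.append("Claim without a Reported Date")
    if status == "Open" and closed is not None:
        soft.append("Open Claim with Closed Date")
    if status == "Closed" and closed is None:
        soft.append("Closed Claim without a Closed Date")

    return hard, soft
```

A record with a report date earlier than its occurred date would return a hard error and halt the load, while an open claim carrying a closed date would process and appear on the data quality dashboard.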