- Slide count: 86
Effective Solutions to Data Integration: Adopting HUD-Supported CSV or XML to Merge and Migrate Data
Matthew D. Simmonds, Simtech Solutions Inc.
Steve Coretti, Safe Harbors (City of Seattle, King County, United Way of King County)
Kelli Lee, City of Chicago, Business and Information Services Department
September 18-19, 2006 – Denver, Colorado
Sponsored by the U.S. Department of Housing and Urban Development
Overview
• Learning Objectives
• Three Cases of Effective Integration (~30 mins.)
  – Commonwealth of Massachusetts
  – Seattle, King County
  – City of Chicago
• Data Integration: Excel Data Manipulation Tips & Techniques (~30 mins.)
• Understanding the HMIS Schema & Customizing It for Local Use (~20 mins.)
• City of Chicago Data Integration Policy, Procedures and Security (~20 mins.)
• Integration Roles and Responsibilities (~15 mins.)
• Questions and Discussion (~20 mins.)
Learning Objectives
• To explore the benefits of the national data integration standard and options for broader adoption and expansion.
• To review the basic structure of HUD's HMIS XML Schema and understand how it can be customized for a local community.
• To learn the main administrative and technical steps involved in an HMIS data integration project.
Data Integration: Commonwealth of Massachusetts
Matthew D. Simmonds, Simtech Solutions Inc., President
Data Integration: Commonwealth of Massachusetts
• Introduction
• Project Goal
• Process Flow
• Implementation Status
• Challenges
• Resources and Cost Summary
• Lessons Learned
Introduction
• Development Cycle
  – Start Date: February 2005
  – Live Date: August 13, 2005
• Project Scope
  – 21 Continuums of Care
  – 235 identified organizations
  – 4 commercial off-the-shelf applications in use
  – 5 proprietary applications
Project Goal
• Meet HUD's data collection and reporting requirements, including an unduplicated count.
• Gain access to better and more timely information on the needs of the homeless.
• Conduct analysis of usage patterns; identify gaps in services.
• Provide links to resources, referral information, and access to mainstream services such as Food Stamps.
• Streamline the intake process, resulting in improved referrals, case management, and service coordination.
• Manage operations.
Process Flow: HMIS Conversion and Integration Approach
• Commercial off-the-shelf applications send data via XML or CSV through SFED; custom applications send data via CSV or XML.
• Information is staged for automated de-duplication in a pre-conversion staging area.
• Data records are checked for validity; invalid records are sent back to the sender with return reasons.
• Good records are manipulated into a relational database format and loaded into the target system.
• Manual de-duplication based on Data Cleansing Reports yields a clean, unduplicated database.
Benefits of Using Standards
[Diagram: Web Applications X and Y, Current App Z, and Excel spreadsheets feed an integration platform; a reporting platform (possibly the same system) produces clean reports.]
• Accepting files in the HUD-approved standards allows for simpler aggregation of data.
• Agencies can more easily switch between providers if a better or more affordable solution is available.
• An integration platform may have data cleansing programs that can create reports database administrators use to improve the integrity of their own HMIS systems.
• Aggregated reporting over unduplicated information is made possible.
• Data backup.
• Improved response time and efficiency for pandemic and natural disaster remediation.
• Leverage sponsored systems alongside custom systems for best-of-breed at a lower cost.
Implementation Status as of 9/12/06
There are three types of HMIS users:
• HMIS users with no data conversion
• HMIS users with data conversion
• HMIS users by data upload
In Massachusetts there are 21 Continuums of Care:
• 9 Continuums of Care will be uploading; 12 will be HMIS users
• 235 organizations have been identified
• 105 organizations are active (80 users and 25 uploaders)
• 376 programs are active
• 81 targeted to deploy
Challenges
• Getting application service providers to cooperate.
• Creating a common integration framework that is technically feasible for low-end users without being overly manual.
• The politics of data ownership and security.
• Managing moving timelines.
Resources and Cost Summary
• A team of 20+ contract developers, testers, DBAs, and writers, along with a project manager, an application architect, a conversion lead, and a lawyer (roughly $1–2 million).
• Joint Architectural Design (JAD) sessions with local agencies (free).
• Internal staff members at DTA in varying roles and at varying participation levels (cost unknown).
Lessons Learned
• Rome was not built in a day.
• Developing applications in a political environment requires more meetings and legal work than a typical for-profit application.
• The work is more emotionally rewarding than a typical for-profit application.
• It can be done.
Safe Harbors Data Integration
Steve Coretti, Safe Harbors (City of Seattle, King County, United Way of King County)
Safe Harbors Data Integration
• Introduction
• What Issues Were Addressed
• Overall Process
• How Is It Working?
• Challenges and Solutions
• Cost/Resources
• Lessons Learned
Safe Harbors Data Integration
• Introduction
  – Safe Harbors database in use for close to 2 years
  – Several large agency databases existed prior to this project and had been in use for some time
  – Looking for a cost-effective way to move the data
Safe Harbors Data Integration
• What Issues Were Addressed
  – How to integrate large pre-existing databases with the HMIS
  – Need to get large bed counts represented
  – Need for a secure data transfer method
  – Has to be reusable and cost effective
Safe Harbors Data Integration
Overall Process
• 2 software clients (Windows and Java) created
• Web service set up to receive data via a secure gateway
• Data cleansing is the responsibility of the agency
• Weekly updates per agreement
Safe Harbors Data Integration
• How Is It Working?
  – As of June, 4 of the 6 agencies are actively sending data
  – No major technical issues
  – Overall agency satisfaction with the technical solution
Safe Harbors Data Integration
• Challenges
  – Differing technical abilities
  – Business process issues
  – Project management
Safe Harbors Data Integration
• Resources and Costs
  – Agencies did their own data extracts
  – Balance of project costs borne by the CoC
  – Web service and client software contracted out
  – Project cost approximately $150K
Safe Harbors Data Integration
• Lessons Learned
  – Be aware of hidden tasks and costs
  – Data extraction and mapping is more complex than it appears
  – Make sure business processes exist to collect all of the required data
Data Integration: City of Chicago
Kelli Lee, City of Chicago, Business and Information Services Department
Data Integration: City of Chicago
• Introduction
• Project Goal
• Overall Process
• How Is It Working?
• Challenges
• Resources and Costs Summary
• Lessons Learned
Data Integration: City of Chicago
• Introduction
  – Live Date: April 27, 2005
  – Chicago Continuum of Care: 97 total participating agencies in HMIS
  – 13 integrating agencies (non-direct users)
  – 8 data migration agencies (direct users)
Data Integration: City of Chicago
• Project Goal
  – Integrate data from multiple agencies into HMIS in a standard format using a secure transfer method.
Data Integration: City of Chicago
• Overall Process
  – Agencies: clean their data; transform their data into the City standard format and send the file to the City using the secured transmission method; own the data.
  – City of Chicago: validates the file; transforms it into the vendor-required format; sends it via a VPN tunnel to vendor servers.
Data Integration: City of Chicago
• Standard Format
  – XML format (ongoing interface)
• Standard Secured Transmission Method
  – HTTPS encrypted using SSL (manual, temporary)
  – Web service using SOAP protocol version 1.1 (automated, permanent)
Data Integration: City of Chicago
• Communication to Agencies
  – Project website posted with technical documentation: XSD (XML schema), data dictionary, required fields list, web service requirements, sample WSDL, transmission instructions
  – Meetings
  – Phone calls to individual agencies
  – Available support (onsite, email, phone, website)
Data Integration: City of Chicago
• How Is It Working?
  – Completed data migration for 8 agencies representing over 4,000 records
  – As of July 11, 2006: 2 agencies are sending data weekly using the web service (range of 20–200 records); 4 agencies are in various stages of web service or XML development
  – Opened HTTPS as a permanent solution
  – Conduct meetings with agencies for questions and follow up with agencies on progress/support
Data Integration: City of Chicago
• Challenges
  – Differing levels of business knowledge
  – Wide range of technical expertise/resources
  – Longer implementation periods
Data Integration: City of Chicago
• Resources and Costs Summary
  – Agencies responsible for their own data extraction (some used consultants)
  – City infrastructure development fees: $260,000.00 / 1,800 hours (City funded), covering transmission (web service and HTTPS), format (XSD), and analysis, QA, and requirements documentation
Data Integration: City of Chicago
• Lessons Learned
  – Technical industry standards do not fit the social service industry (simpler solutions are needed).
  – Implementation time was underestimated.
  – An increased level of technical support and follow-up was needed.
  – The project website was underutilized, even though emails announcing website updates were sent.
  – Large vs. small organizations differ in resources (people, technical infrastructure, and funding).
Data Integration: Excel Data Manipulation Tips & Techniques
Matthew D. Simmonds, Simtech Solutions Inc., President
Overview: Data Manipulation Techniques in Excel
• Database conversions are not a "flip of the switch" process; they require some work to get the data from the source database into the target database.
• This presentation outlines tips and techniques used in Microsoft Excel to prepare data for loading into a target database or for performing offline data cleansing.
Why use Excel?
• Microsoft Excel is technically the least common denominator and the most ubiquitous database tool around. Most database platforms can save files in comma-separated values (CSV) format, which can be easily imported into Excel and worked with.
General Rules
• Save your work in versions so you can revert if necessary.
• Any new data added to the system post-export may have to be re-cleansed, so you might want to freeze your database.
• Work with the Excel file only in a secure and controlled environment.
• Excel does have its limitations: 65,536 rows, to be exact.
• Audit your work.
Data manipulation that can be done in Excel:
I. Identify duplicate records.
II. Remove duplicate records.
III. Manipulate database columns to match a target format.
IV. Populate blank data quality codes.
V. Split up one field into several fields.
VI. Check for a middle initial.
VII. Strip out undesirable characters.
VIII. Combine data elements that are stored across multiple columns into one column.
IX. Translate values to be consistent with the target format.
X. Identify and remove erroneous service records.
I. Identify duplicate records.
To use this technique, perform the following:
• Copy the unique fields you wish to concatenate to blank columns at the beginning of the worksheet. These are considered your "key fields" and should be the key identifiers you use to find a record. (HUD recommends name, DOB, gender, and Social Security number.)
• Insert another blank column to the left of these columns.
• Enter the concatenate function (e.g. =CONCATENATE(C2,D2,E2)) in the first row of the column you just created. Copy this formula down the column to the last record.
• Click on Data | Sort and select the column containing the newly created key.
I. Identify duplicate records. (continued)
• Insert another column and check for duplicates using the EXACT function. If the new key field is in column A and column B is blank, enter the formula in the first row of column B as =EXACT(A2,A3), then copy the formula down to the remaining cells. The EXACT() function returns TRUE if the values it is given are identical, and FALSE otherwise.
• Since it may be difficult to locate all of the cells that contain TRUE, you may wish to highlight them using conditional formatting. Select the test column and click on Format | Conditional Formatting. Arrange the drop-down boxes to read "Cell Value Is" "Equal To" "TRUE". Select the Format button, then the Patterns tab, then choose the color to highlight cells where the condition is true.
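The two steps above — build a concatenated key, sort, then compare each row to its neighbor — can be sketched outside Excel as well. Here is a hedged Python equivalent; the sample records and field names are invented for illustration:

```python
# Hypothetical client records; the key fields mirror HUD's suggested
# identifiers (name and DOB here, for brevity).
records = [
    {"first": "Matthew", "last": "Simmonds", "dob": "1970-01-01"},
    {"first": "Jane", "last": "Doe", "dob": "1980-05-05"},
    {"first": "Matthew", "last": "Simmonds", "dob": "1970-01-01"},
]

def make_key(rec):
    # Equivalent of =CONCATENATE(C2,D2,E2) on the key fields.
    return (rec["first"] + rec["last"] + rec["dob"]).lower()

# Sort by the key, then flag any row whose key equals its neighbor's —
# the same test =EXACT(A2,A3) performs on the sorted key column.
records.sort(key=make_key)
for i, rec in enumerate(records):
    rec["exact_match"] = i > 0 and make_key(rec) == make_key(records[i - 1])

duplicate_count = sum(r["exact_match"] for r in records)
print(duplicate_count)  # 1
```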
II. Remove duplicate records.
After identifying redundant records you have to decide how to handle them.
NOTE: If you are working with one flat table and need to break it into separate tables, you MUST copy the client ID from the first record to the duplicated records so the tables stay consistent. Once you have performed this step you can copy the table over to the other worksheets, where it can be manipulated further as needed.
Scroll the worksheet for values of TRUE and compare the matching records to determine how to handle each one as follows:
A) If the data are purely redundant (elevator-button syndrome), delete any duplicate records with Exact Match? = TRUE.
B) If there are potentially valid data in multiple records, you may choose to merge the data into one record and delete the other(s).
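For the flat-table case the note above warns about, the same bookkeeping can be sketched in Python — deduplicate while recording which surviving client ID each removed duplicate maps to, so related tables can be re-keyed consistently (the IDs and keys are invented):

```python
# Rows as (client_id, concatenated_key).
rows = [(101, "janedoe1980"), (102, "johnsmith1975"), (103, "janedoe1980")]

seen = {}      # key -> surviving client ID
id_remap = {}  # removed client ID -> surviving client ID
kept = []
for client_id, key in rows:
    if key in seen:
        id_remap[client_id] = seen[key]  # duplicate: remember the merge
    else:
        seen[key] = client_id
        kept.append((client_id, key))

print(id_remap)  # {103: 101}
```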
IIIA. Manipulate the database columns as needed to match the desired target format.
Basic Manipulation
• Inserting/Adding Columns – Select the column to the right of the column you wish to add, right-click, and select "Insert".
• Deleting Columns – Select the column that is not needed, right-click, and select "Delete".
• Moving Columns – Select the column you wish to move, right-click, select "Cut", select the column to the right of where you want the column to go, right-click, and select "Paste".
IIIB. Manipulate the database columns as needed to match the desired target format.
Advanced Manipulation
• Create a separate worksheet in the same workbook that has the source data.
• Copy the target format field names into a header record (the first row) on the worksheet. You may have to use Copy | Paste Special and select Transpose in order to copy field names listed vertically so they run across the worksheet.
• Enter the equals sign (=) in the first column and first row under the newly copied file header row. Click over to the source worksheet, select the corresponding field in the first row under the source file's header row, and hit Enter.
IIIB. Manipulate the database columns as needed to match the desired target format. (continued)
• To properly map a text field, use =T(FieldID) so that the mapped value does not return a zero (0) if it is blank. Repeat this process until you have mapped one field for each of the values in your target database.
• Copy the row containing these new formulas down your spreadsheet to at least as many rows as you have records in your source file.
• Create a blank worksheet. Copy the entire worksheet using Select All (the box in the top left corner of the workbook) and use Paste Special, selecting the option to paste Values only.
• Save the newly created target file.
IV. Populate blank data quality codes.
Fields such as last permanent ZIP code and Social Security number require a data quality code. If your staff have not been answering this question, you can derive whether the code is partial or full from the length of the data entered, using the =LEN() function to count the number of characters. To use this technique, perform the following:
• Insert a column next to the field you want to create codes for.
• Enter =LEN(FieldYouWantToCheck) into row 2 of the new column and copy the formula down the column.
• Sort the database by this new length column.
• Use either Copy | Paste (Ctrl+C, Ctrl+V) or Find | Replace (Ctrl+F) to change values from lengths to HUD data quality code values.
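The length-to-code derivation can be expressed directly. A Python sketch of the same idea — note the code labels below are placeholders, not the official HUD data quality values:

```python
def ssn_quality_code(ssn):
    """Derive a data quality code from the number of digits entered,
    the same idea as =LEN() plus a find/replace of lengths with codes."""
    digits = ssn.replace("-", "")
    if len(digits) == 9:
        return "full"
    if digits:
        return "partial"
    return "missing"

print(ssn_quality_code("123-45-6789"))  # full
print(ssn_quality_code("1234"))         # partial
```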
V. Split up one field into several fields (using full name as an example).
To use this technique, perform the following:
• Right-click the column heading for the full name field and select Insert, adding one blank column for each additional field you are splitting into.
• Select the name field column, then click Data in the menu bar and select Text to Columns.
• Select the option for Delimited and click Next.
• In the Delimiters section click Space and/or Comma (depending on how your data is stored) and then click Finish. Note: if the data is stored as "Simmonds, Matthew D." you will have to insert three columns and use both comma and space as delimiters.
• When prompted, answer OK to replace the contents of the destination cells.
• Change the text in the column headers to accurately reflect the change (e.g. Last Name, First Name, Middle Name, Suffix).
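The Text to Columns split with comma and space delimiters corresponds to this small sketch:

```python
def split_full_name(full):
    # "Simmonds, Matthew D." -> (last, first, middle): split on the comma
    # first, then on spaces, mirroring the two delimiters in Excel.
    last, _, rest = full.partition(",")
    parts = rest.split()
    first = parts[0] if parts else ""
    middle = parts[1] if len(parts) > 1 else ""
    return last.strip(), first, middle

print(split_full_name("Simmonds, Matthew D."))  # ('Simmonds', 'Matthew', 'D.')
```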
VI. Check for a middle initial.
If you are breaking up one field, such as Full Name, into several, you may have to account for smaller fields such as a middle initial that will sometimes be incorrectly spliced into other fields. You can use the =LEN() formula to check this as well. To use this technique, perform the following:
• Enter =LEN(cell_you_want_to_check_for_the_middle_initial) in a blank column.
• Copy this formula down to the last data row.
• Sort the data by this newly created length field using Data | Sort.
• Copy and paste data values as necessary to separate middle initials from last name fields.
VII. Strip out undesirable characters.
For the sake of data consistency it is common to remove extra characters such as $, -, and /. If you are trying to link a Social Security number that has dashes with one that does not, you will never get a match. You can use Find | Replace to resolve this. To use this technique, perform the following:
• To ensure the Find | Replace works only on the column you wish to work with, copy the data column to a blank worksheet.
• Highlight the column you want to work with and hit Ctrl+F.
• To remove dashes, enter "-" in the field labeled "Find what:", click the Replace tab, leave the "Replace with:" field empty, and click Replace All.
• After performing the Find | Replace, copy the properly formatted data back into the original worksheet.
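One Find | Replace per character becomes a single translation table in a scripted version of the same cleanup:

```python
# Strip $, -, and / in one pass; equivalent to repeated Find | Replace
# with an empty "Replace with:" field.
strip_table = str.maketrans("", "", "$-/")

print("123-45-6789".translate(strip_table))  # 123456789
```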
VIII. Combine data elements stored across multiple columns into one column.
Example – Race stored as five TRUE/FALSE columns (B2:F2), merged with =CONCATENATE(B2,C2,D2,E2,F2):
  American Indian or Alaskan Native: FALSE
  Asian: FALSE
  Black or African American: FALSE
  Native Hawaiian or Other Pacific Islander: FALSE
  White: TRUE
To use this technique:
• Create a blank column and use the =CONCATENATE formula as shown above to merge the values.
• Copy this formula down to the last data row.
• Use Copy | Paste Special to copy the newly created values to another blank column.
• Use Find | Replace to change the merged values to valid values (e.g. replace FALSEFALSEFALSEFALSETRUE with "White").
• Repeat this process until all merged values are replaced with valid values.
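The concatenate-then-replace trick maps naturally onto a lookup keyed by the merged TRUE/FALSE string. A sketch (only two merged values are mapped here; a real file would need all the combinations that occur):

```python
def merge_race(flags):
    # flags: five booleans in the column order listed above.
    key = "".join("TRUE" if f else "FALSE" for f in flags)
    lookup = {
        "FALSEFALSEFALSEFALSETRUE": "White",
        "FALSEFALSETRUEFALSEFALSE": "Black or African American",
    }
    # Anything unmapped falls through to a catch-all value.
    return lookup.get(key, "Other/Multi-Racial")

print(merge_race([False, False, False, False, True]))  # White
```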
IX. Translate values to be consistent with a target format.
To use this technique:
• Create an Excel workbook with three worksheets. The first will contain the values from the source table. The second will be a Field Value Map table to match values used within the source and target databases. The third will contain the formulas needed to translate your table.
• Map the source values to the target values on the Field Value Map worksheet, similar to the table shown. NOTE: You MUST have your source values listed in ascending order (alphabetically or numerically) for this to work properly.
Field Value Map (Table: Client ID, Field: Race):
  Source Value   Target Value
  an             Alaskan Native
  ai             American Indian
  A              Asian
  B              Black/African American
  nh             Native Hawaiian
  pi             Other Pacific Islander
  W              White
  O              Other Multi-Racial
IX. Translate values to be consistent with a target format. (continued)
To use this technique:
• For any fields that need value translations, use the =VLOOKUP function in Excel to translate the source database values to the target values. For values that do not need to be translated you can simply use the = technique described in IIIB.
• Open a blank workbook. Use Copy | Paste Special and select to paste the Values from your target worksheet into the new workbook.
• Save the newly translated file.
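In a script, a dictionary plays the role of the Field Value Map plus =VLOOKUP (and, unlike VLOOKUP, needs no ascending sort order). The sample rows are invented:

```python
race_map = {
    "an": "Alaskan Native", "ai": "American Indian", "A": "Asian",
    "B": "Black/African American", "nh": "Native Hawaiian",
    "pi": "Other Pacific Islander", "W": "White", "O": "Other Multi-Racial",
}

source_rows = [{"client_id": 1, "race": "W"}, {"client_id": 2, "race": "B"}]
for row in source_rows:
    # Fall back to the original value if no mapping exists.
    row["race"] = race_map.get(row["race"], row["race"])

print([r["race"] for r in source_rows])  # ['White', 'Black/African American']
```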
X. Remove erroneous service records.
Quite often it is difficult to figure out when a client has left (especially for programs such as street outreach), so the data records for a recorded service end up having blanks for an exit date. Blanks can be valid if you are still serving the client. One way to check for erroneous blank conditions is to determine whether you have provided the same client with the same service since the record where the blank exit occurs. You can do this using the =AND function as shown. NOTE: Make sure the date fields are formatted as dates!
  CLIENT ID  SERVICE NAME  AIRS CODE  START DATE  END DATE
  108493     Bed           BH-180     10/14/2003  10/15/2003
  108493     Bed           BH-180     4/13/2005   5/23/2005
  108493     Bed           BH-180     5/25/2005
  108493     Bed           BH-180     5/25/2005   5/26/2005
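The test — a blank end date AND a later record of the same service for the same client — can be sketched as:

```python
from datetime import date

# Service rows as (client_id, airs_code, start, end); None marks a blank exit.
services = [
    (108493, "BH-180", date(2003, 10, 14), date(2003, 10, 15)),
    (108493, "BH-180", date(2005, 4, 13), date(2005, 5, 23)),
    (108493, "BH-180", date(2005, 5, 25), None),
    (108493, "BH-180", date(2005, 5, 25), date(2005, 5, 26)),
]

def erroneous_blank_rows(rows):
    """Indices of blank-exit rows followed by another record of the same
    service for the same client, so the blank is likely erroneous."""
    flagged = []
    for i, (cid, code, start, end) in enumerate(rows):
        if end is None and any(c == cid and a == code and s >= start
                               for c, a, s, _e in rows[i + 1:]):
            flagged.append(i)
    return flagged

print(erroneous_blank_rows(services))  # [2]
```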
Conclusion
• Excel is filled with powerful functions that can be used to help cleanse your HMIS database. Other features to look into include filtering out invalid records using AutoFilter (Data | Filter), auditing your data and creating reports using the COUNTIF function, and using data validation (Data | Validation) to ensure clean data entry. Performing these tasks and mastering the functionality available within Excel will help ensure the integrity of your HMIS database.
Understanding the HMIS Schema & Customizing it for Local Use
Steve Coretti, Safe Harbors (City of Seattle, King County, United Way of King County)
What is XML?
• Created so that structured documents could be sent over the web
• Provides a template for data validation
• Allows for use of multiple programming languages
• Provides standard ways for developers to incorporate, extend and customize using keywords
Why Use the Standard Schema?
• Represents HUD data elements as closely as possible
  – Universal data elements
  – Program-specific data elements
• Creates universal compatibility among system providers
• Provides a guide to allow consistent data responses
  – Helps with data quality and control
The HUD HMIS XML Schema
The HUD HMIS XML Schema
• The schema allows for:
  – Identification of data source
  – Sending current client data
  – Sending updates to existing client data
  – Hashing of certain data elements
  – Modification to include additional data elements
Structure Overview
• 'Required' elements
  – Very few elements 'require' an answer
  – Not the same as HUD's required elements
  – Additional elements can be made 'required' if desired by not specifying a "minOccurs" value (minOccurs defaults to 1)
Structure Overview • Required: –
Modifications to the HMIS Schema
• Elements may be added using existing schema elements as a guide
  – Example: Target Population (currently exists as a blank string)
    1 = SF – Single Female 18 and over
    2 = SM – Single Male 18 and over
    3 = SMF – Single Male and Female 18 and over
    4 = FC – Families with Children
    5 = YM – Unaccompanied young males under 18
    6 = YF – Unaccompanied young females under 18
    7 = YFM – Unaccompanied young females and males under 18
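One way such a coded element could be expressed is with the W3C XML Schema enumeration facet. This fragment is illustrative only — the element name, placement, and use of minOccurs are local assumptions, not part of the official HUD 2.7 schema:

```xml
<!-- Hypothetical local extension: restrict TargetPopulation to the codes above -->
<xsd:element name="TargetPopulation" minOccurs="0">
  <xsd:simpleType>
    <xsd:restriction base="xsd:string">
      <xsd:enumeration value="SF"/>  <!-- 1: Single Female 18 and over -->
      <xsd:enumeration value="SM"/>  <!-- 2: Single Male 18 and over -->
      <xsd:enumeration value="SMF"/> <!-- 3: Single Male and Female 18 and over -->
      <xsd:enumeration value="FC"/>  <!-- 4: Families with Children -->
      <xsd:enumeration value="YM"/>  <!-- 5: Unaccompanied young males under 18 -->
      <xsd:enumeration value="YF"/>  <!-- 6: Unaccompanied young females under 18 -->
      <xsd:enumeration value="YFM"/> <!-- 7: Unaccompanied young, mixed, under 18 -->
    </xsd:restriction>
  </xsd:simpleType>
</xsd:element>
```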
Adding Custom Elements in the 2.7 Schema
Reference Material
• Modified XML Schema (2.6.1) – used by Safe Harbors before 2.7 was released
• Current HUD XML Schema (2.7)
• HMIS Data Integration Cumulative Package Overview
• XML Schema by Eric van der Vlist
Data Integration: City of Chicago – Policy, Procedures and Security
Kelli Lee, City of Chicago, Business and Information Services Department
Data Integration: Policy, Procedures and Security
• Goals
• Policy Considerations
• City of Chicago Process/Outcome
• Security
• Forms
• Chicago Continuum of Care Standard Operating Procedures
• Challenges
Data Integration: Policy, Procedures and Security
• Goals
  – Develop policies that comply with local, state and federal laws.
  – Protect client data.
  – Standardize business practices and data collection.
Data Integration: Policy, Procedures and Security
• Policy Considerations
  – Who owns the data?
  – How to ensure privacy
  – How to prevent disclosure
  – What laws protect special populations (e.g. HIV/AIDS, mental health, DV) and how to apply them to HMIS
Data Integration: Policy, Procedures and Security
• City of Chicago Process/Outcome
  – Collaboration between City legal counsel, the CoC, agencies and their legal counsel on developing policies
  – DV agencies use the Illinois state system, InfoNet, not HMIS
  – Separate policy training class for all users
  – Integration agencies (same policy as direct users):
    • Own their data in HMIS; data are not shared
    • Have the same system access to their data in HMIS as non-integration agencies
    • Data can be in a de-identified format
    • Data can be extracted from HMIS and given back to an agency no longer participating
Data Integration: Policy, Procedures and Security
• Security – Chicago's HMIS Application
  – Public Key Infrastructure (PKI)
  – Role-based security (19 different levels)
  – Agency level – greater access to own agency client data such as private case notes and clinical assessments
  – HMIS Data Sharing Network (DSN) – group of HMIS direct-use agencies sharing certain aspects of client case records; does not apply to integration agencies
  – Enterprise Case Management Search (ECM) – if not in the DSN, allows search of clients based on primary client identifiers to prevent duplicate client records
  – Sample Information Security Protocol (handout p. 4) – an agency's procedures for security of HMIS, authorized machines/users, access and use of client data, and compliance with CoC Standard Operating Procedures
Data Integration: Policy, Procedures and Security
• Forms (Handouts) – Agency/User Responsibilities and Rights
  – User Policy, Responsibility and Statement of Ethics Agreement (p. 29)
  – Agency Partner Agreement (p. 14)
  – Interagency Network Data Sharing Agreement (p. 21) – does not apply to integration agencies
Data Integration: Policy, Procedures and Security
• Forms (continued) – Client Consent/Rights
  – Brief Explanation of ECM-HMIS (p. 18)
  – Notice of Uses and Disclosures (p. 26)
  – Consent for Network Data Sharing/Revocation (p. 7/11) – does not apply to integration agencies
  – Release of Information for Referrals/Revocation (p. 9/12) – does not apply to integration agencies
  – Consent to Enter Clinical Assessment Information into HMIS for HIV/AIDS, Mental Health and Substance Abuse/Revocation (p. 19)
Data Integration: Policy, Procedures and Security
• Chicago Continuum of Care Standard Operating Procedures (SOP) Document Review
  – Agency Responsibilities (p. 32)
  – Agency Technical Administrator Responsibilities (p. 33)
  – HMIS Participation Policy (p. 34)
  – Initial Participation Requirements (p. 37)
  – HMIS Agency Hardware, Connectivity and Security Requirements (p. 39)
  – HMIS Data Collection Requirements (p. 41)
  – Data Ownership Policies and Procedures (p. 43)
Data Integration: Policy, Procedures and Security
• Challenges
  – More forms to review with clients
  – Increased time in providing services to clients
  – Administrative complexity
  – Increased burden on agency staff resources
  – High agency staff turnover requires continual training
  – Language barriers for non-English-speaking clients or those with special needs (e.g. Braille)
  – A long, iterative process involving various advocacy groups, legal counsel and many meetings
Integration Roles and Responsibilities
Matthew D. Simmonds, Simtech Solutions Inc., President
Integration Roles: All Parties
• DEFINE DATA MANIPULATION RULES – There may be certain data conditions that require a decision on how they should be handled, for example:
  – No program exit date. Do you add the average length of stay to the entry date to create an exit date?
  – Services overlap. Should you delete the second record? Combine records?
  – Client exit date is before the entry date. Do you reject the record? Add the average length of stay to the entry date to create a new exit date?
• AUDIT THE CONVERTED DATA
  – Spot-check records at random for accuracy.
  – Run and compare reports on both the source and the target.
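One of those calls — imputing a missing or impossible exit date from the entry date plus an average length of stay — might be sketched like this; the 30-day average is an invented figure, not a rule from the source:

```python
from datetime import date, timedelta

AVG_LENGTH_OF_STAY = timedelta(days=30)  # assumed; use your program's actual average

def impute_exit(entry, exit_date):
    """Apply the agreed rule: replace a missing exit date, or one that
    precedes the entry date, with entry + average length of stay."""
    if exit_date is None or exit_date < entry:
        return entry + AVG_LENGTH_OF_STAY
    return exit_date

print(impute_exit(date(2006, 1, 1), None))  # 2006-01-31
```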
Optional Approach – Separate the Identifying Information
• It may be worth considering separating the client identifying information from the remainder of the HMIS data during the export process.
• Under this approach a source agency could send the client ID file and the source agency's host could send the remainder. The source host would need to publish the canned query used to generate the client ID file.
Integration Roles: The Source Homeless Agency
• SIGN AGREEMENTS – Review and sign any data sharing or release agreements with the target agency.
• MANUAL DATA INPUTS – Any data that were not part of the source database or the export format cannot be converted and must be entered manually. For example, "Barred Client Y/N" is not part of the HUD schema but may have a place in the target database.
• SET UP AGENCY INFORMATION – such as all program information, users, bed lists, and services that will be used within the agency. Some of this information is not for integration but is needed for conversion purposes.
Integration Roles: The Source Homeless Agency (continued)
• UPDATE ERRONEOUS DATA – through either the target system, if the data were allowed in, or their own application, if the data were rejected. If data were rejected, the file upload needs to be reinitiated or the record needs to be added manually on the target system.
Integration Roles: The Source Agency's Application Host
• Revise their export formats as needed to accommodate the target format.
• Create the extract files for each agency.
• Obtain release-of-information agreements from each agency to give data to the target agency.
• Submit data to each agency in the standardized format so the agency can release it to the target agency, OR, if data releases are signed, submit data directly to the target agency.
• Set up data translation values.
Integration Roles: The Target Host
• PRE-CONVERSION ANALYSIS – Determine the list of agencies to be converted and the formats you are converting from.
• AGENCY SET-UP – Each program and agency has to be defined manually by a representative from the host/target agency in order to initiate the conversion process. An administrator will need to be assigned and given user name and password information to maintain bed lists, services and other customizations as necessary.
• DEFINE CLEANSING RULES – Certain erroneous data conditions can be cleansed using an automated approach. There may be scenarios where the target agency needs to make a call on how to handle a data condition.
Integration Roles: The Target Host (continued)
• Data Mapping: Fields from the source database have to be directed into corresponding fields in the target database.
• Data Validation: Data are checked for validity on a field-by-field basis. For example, a person's gender could not be "Fred".
• Duplicate Checking: The database is scoured for duplicate data, which can throw off your totals and is inefficient to store.
• Data Translation: Data are reformatted into a standardized reporting format.
• Data Creation: Information is derived from the data so that it can be easily retrieved later instead of being created during the reporting process. For example, the length of stay for a client can be calculated and stored in the file.
Integration Roles: The Target Host (continued)
• CONVERSION STAGING
  – Environments need to be set up for each agency that will be converting data, with separate repositories for both the initial format and the converted format.
  – Processes are developed that enable both source and target environments to be easily refreshed and the conversion programs to be rerun.
QUESTIONS and DISCUSSION